
Research

 


In some way or other, all of my work is about how the formal and the normative make contact. Below are abstracts and drafts of some of my published and unpublished work.

published and forthcoming work

 

 

The Positive Argument for Impermissivism, Australasian Journal of Philosophy [penultimate draft]

Epistemic impermissivism is the view that there is never more than one doxastic attitude it is rational to have in response to one's total evidence. Epistemic permissivism is the denial of this claim. The debate between the permissivist and the impermissivist has proceeded, in large part, by way of "negative" arguments that highlight the unattractiveness of the opposing position. In light of the deadlock that has ensued, this paper has two aims. The first is to introduce the concept of a "positive" argument for impermissivism. The second is to show that this argument faces a dilemma, one that generalizes the problems that famously arise for formal constraints like the Principle of Indifference. The upshot is to strengthen the case against the impermissivist by showing that no positive argument for impermissivism is likely to succeed.
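To illustrate the kind of problem the paper generalizes (the illustration is mine, not the paper's), consider van Fraassen's cube factory. A factory produces cubes with side length between 0 and 2 cm, and we ask how confident to be that a given cube has side length at most 1 cm. Applying the Principle of Indifference to side length and to volume yields incompatible answers to the same question:

\[
P(\text{side} \le 1) = \frac{1-0}{2-0} = \frac{1}{2}, \qquad P(\text{volume} \le 1) = \frac{1-0}{8-0} = \frac{1}{8},
\]

even though a cube's side length is at most 1 cm just in case its volume is at most 1 cubic cm. Since the principle licenses both assignments, depending on how the possibilities are parameterized, it cannot by itself fix a unique rational credence.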

Conditionalization, The Blackwell Companion to Epistemology, 3rd Edition [penultimate draft]

Time-Slice Epistemology for Bayesians, Inquiry, forthcoming. [penultimate draft]

Recently some have challenged the idea that there are genuine norms of diachronic rationality. Part of this challenge has involved offering replacements for diachronic principles. Skeptics about diachronic rationality believe that we can provide an error theory for it by appealing to synchronic updating rules that mimic the behavior of diachronic norms. In this paper, I argue that the most promising attempts to develop this position within the Bayesian framework are unsuccessful. I defend a new synchronic surrogate of Conditionalization that draws upon some of the features of each of these earlier attempts. At the heart of this discussion is the question of what exactly it means to say that one norm is a surrogate for another. I suggest that surrogacy, in the given context, can be taken as a proxy for the degree to which formal and traditional epistemology can be made compatible.
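For readers unfamiliar with the contrast, here is a schematic version of it (a sketch in the spirit of this literature, not the paper's own formulation). Conditionalization is diachronic: it relates an agent's credences at two times. Its synchronic surrogates, such as ur-prior conditionalization, constrain her credences at a single time:

\[
\text{Conditionalization:} \quad C_{t_2}(\cdot) = C_{t_1}(\cdot \mid E), \text{ where } E \text{ is the evidence acquired between } t_1 \text{ and } t_2;
\]
\[
\text{Synchronic surrogate:} \quad C_{t}(\cdot) = P_{u}(\cdot \mid E_t), \text{ where } P_u \text{ is a hypothetical prior and } E_t \text{ is the agent's total evidence at } t.
\]

An agent who satisfies the surrogate at every moment will typically look as though she has conditionalized, even though no norm ever relates her earlier and later selves. The question of surrogacy is the question of when this kind of mimicry is good enough.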

Moral Encroachment and Positive Profiling, Erkenntnis, forthcoming. [penultimate draft]

Some claim that moral factors affect the epistemic status of our beliefs. Call this the moral encroachment thesis. It's been argued that the moral encroachment thesis can explain at least part of the wrongness of racial profiling. The thesis predicts that the high moral stakes in cases of racial profiling make it more difficult for these racist beliefs to be justified. This paper considers a class of racial generalizations that seem to do just the opposite of this. The high moral stakes of the beliefs that we infer from these generalizations make it easier rather than harder for these beliefs to be justified. I argue that the existence of this class of cases, cases of "positive profiling," gives us reason to expand our account of moral encroachment in a way that brings it closer to the idea of pragmatic encroachment that motivates it in the first place.

Bayesian Coherentism, Synthese 198 (10): 9563-9590. 2021. [penultimate draft]

This paper considers a problem for Bayesian epistemology and goes on to propose a solution to it. On the traditional Bayesian framework, an agent updates her beliefs by Bayesian conditioning, a rule that tells her how to revise her beliefs whenever she gets evidence that she holds with certainty. In order to extend the framework to a wider range of cases, Richard Jeffrey (1965) proposed a more liberal version of this rule that has Bayesian conditioning as a special case. Jeffrey conditioning is a rule that tells the agent how to revise her beliefs whenever she gets evidence that she holds with any degree of confidence. The problem? While Bayesian conditioning has a foundationalist structure, this foundationalism disappears once we move to Jeffrey conditioning. If Bayesian conditioning is a special case of Jeffrey conditioning then they should have the same normative structure. The solution? To reinterpret Bayesian updating as a form of diachronic coherentism.
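For reference, here are the two rules in their standard formulations (nothing here is specific to the paper). Where \(\{E_i\}\) is a partition of the possibilities over which experience directly shifts the agent's credences, Jeffrey conditioning says:

\[
P_{\text{new}}(H) = \sum_i P_{\text{old}}(H \mid E_i)\, P_{\text{new}}(E_i).
\]

Bayesian conditioning is the special case in which experience raises the probability of a single cell \(E\) to 1, so that \(P_{\text{new}}(H) = P_{\text{old}}(H \mid E)\). The certain evidence \(E\) is what gives the special case its foundationalist look; in the general case, no input plays that role.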

Commutativity, Normativity and Holism: Lange Revisited, Canadian Journal of Philosophy 50 (2): 159-173. 2020. [link]

Lange (2000) famously argues that although Jeffrey Conditionalization is non-commutative over evidence, it's not defective in virtue of this feature. Since reversing the order of the evidence in a sequence of updates that don't commute does not reverse the order of the experiences that underwrite these revisions, the conditions required to generate commutativity failure at the level of experience will fail to hold in cases where we get commutativity failure at the level of evidence. If our interest in commutativity is, fundamentally, an interest in the order-invariance of information, an updating sequence that does not violate such a principle at the more fundamental level of experiential information should not be deemed defective. This paper claims that Lange's argument fails as a general defense of the Jeffrey framework. Lange's argument entails that the inputs to the Jeffrey framework differ from those of classical Bayesian Conditionalization in a way that makes them defective. Therefore, either the Jeffrey framework is defective in virtue of not commuting its inputs, or else it is defective in virtue of commuting the wrong kind of inputs.
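A worked example of the non-commutativity at issue (the numbers are mine, chosen only for illustration). Let the prior assign \(P(E \wedge F) = 0.2\), \(P(E \wedge \neg F) = 0.3\), \(P(\neg E \wedge F) = 0.3\), \(P(\neg E \wedge \neg F) = 0.2\), and consider two Jeffrey updates: one shifting the probability of \(E\) to 0.8, the other shifting the probability of \(F\) to 0.9. The two orders deliver different posteriors:

\[
E\text{-update, then } F\text{-update:} \quad P(E) \approx 0.74, \; P(F) = 0.9;
\]
\[
F\text{-update, then } E\text{-update:} \quad P(E) = 0.8, \; P(F) \approx 0.87.
\]

The same two pieces of evidence, taken in different orders, leave the agent in different doxastic states. Lange's thought is that this needn't reflect any order-reversal in the underlying experiences; the paper argues that this defense comes at a cost elsewhere.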

Higher-Order Beliefs and the Undermining Problem for Bayesianism, Acta Analytica (2019) [link]

Jonathan Weisberg has argued that Bayesianism's rigid updating rules make Bayesian updating incompatible with undermining defeat. In this paper, I argue that when we attend to the higher-order beliefs we must ascribe to agents in the kinds of cases Weisberg considers, the problem he raises disappears. Once we acknowledge the importance of higher-order beliefs to the undermining story, we are led to a different understanding of how these cases arise. And on this different understanding of things, the rigid nature of Bayesianism's updating rules is no obstacle to its accommodating undermining defeat.
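The rigidity behind Weisberg's challenge can be stated compactly (this is the standard formulation, not anything specific to the paper). A Jeffrey update on a partition \(\{E_i\}\) holds the conditional probabilities on the cells fixed:

\[
P_{\text{new}}(H \mid E_i) = P_{\text{old}}(H \mid E_i) \quad \text{for every cell } E_i.
\]

Undermining defeat appears to require revising exactly these quantities, as when learning that the lighting was bad should lower one's confidence in a hypothesis conditional on one's perceptual evidence. Rigidity seems to leave no room for that, and this is the apparent incompatibility the paper addresses.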

in progress

The Retrospective Account of Bayesian Updating [email for a draft]

This paper argues for a new account of Bayesian updating by taking a Retrospective Approach to diachronic coherence. This approach says that an agent is diachronically coherent whenever the information on which she has revised her beliefs satisfies whatever constraint we would want our evidence to satisfy. This approach contrasts with a common way of thinking about the Bayesian framework, according to which it treats evidence as a black box. The aim of this paper is to provide a different interpretation of Bayesianism's main updating constraint by filling in this black box with a Bayesian account of evidence.

Normativity and Arbitrariness [email for a draft]

It's a widely held idea that logic is normative. But defending this idea is complicated. On a particularly strong view, to say that logic is normative is to claim that it can be analyzed in normative terms. One notable proponent of this position is MacFarlane (2004), who argues that we can gain insight into the nature of logical consequence by exploring its relation to certain coherence constraints. However, skepticism about the normativity of coherence makes this strategy seem less promising than it might first have appeared. This paper appeals to the recently popular idea that we can understand formal systems as modeling epistemic situations (by analogy with the sorts of models used in the sciences) to provide the sort of account of the normativity of logic described above. On the view that I propose, the normativity of logic is grounded not in the aims of modeling, nor in the target of a model, but in a choice we are compelled to make at the beginning of the modeling process. An essential property of our logics, on this story, is a form of arbitrariness they exhibit. Logic contributes this feature to the models of our norms.
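To give a sense of the coherence constraints at issue, here is a schematic bridge principle of the kind MacFarlane discusses (the formulation is illustrative, not a quotation):

\[
\text{If } A_1, \ldots, A_n \models C, \text{ then one ought to see to it that: if one believes each of } A_1, \ldots, A_n, \text{ one believes } C.
\]

Skepticism about the normativity of coherence is, in part, skepticism that principles of this shape carry genuine normative force, which is what puts pressure on the strong view described above.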
