Grad Research
Graduate students participate in a rich research environment, with regular department colloquia, productive small workshops, and opportunities to collaborate with faculty. Most students who envision going on to the PhD choose to write a Professional Paper, which usually begins as a successful paper from a graduate seminar and is later honed into a writing sample over the course of a term, working with their supervisors. Abstracts for recent professional papers are below, organized by year.
Incoming students are given a travel budget of $1200. They are encouraged to submit short papers to conferences and/or attend workshops in their areas of interest. Conference/workshop funding for MA students is a new addition to our program and to date students have had overwhelmingly positive experiences using their travel funds.
Professional Papers and Theses
2020
Don't Let Me Be Misunderstood: Sarah Paquette
Abstract: The standard interpretation of Hume on beliefs acquired through testimony reduces testimony to perceptual evidence, resulting in an account of belief formation and decision making about moral matters that fails to account for the passions and character of those involved, as well as for the influence of social relations and sympathy. I argue that this problem is absent from Hume's original work, as he draws a distinction between moral matters and matters of fact and presents a nuanced descriptive account.
Supervisor: Lisa Shapiro
On Rosen's Bridge-Law Non-Naturalism: Moral Laws and Weak Supervenience: Reza Abdolrahmani
Abstract
This paper examines Rosen’s bridge-law non-naturalism, the view that an adequate explanation of why a given moral property is instantiated contains (i) natural properties and (ii) a moral law that bridges these two sui generis properties. I will argue that Rosen’s account of moral laws faces three problems. (1) His account of moral laws could allow an implausible explanans to be cited in explaining why an entity has a given moral property (the problem of explanation-conduciveness). (2) Moral laws do not contain a required inner necessity between moral and non-moral properties (the problem of lack of inner necessity). (3) The bruteness of moral laws, per Rosen, makes them unknowable, while they presumably must be epistemically accessible in order to be action-guiding. I will show that the way in which Rosen addresses this issue is defective (the problem of epistemic access). As will become clear, these three problems together put Rosen’s account of non-naturalism under substantial pressure.
Supervised by Dr. Tom Donaldson
Provoking Change: Debunking the Provocation Defence for Murder: Lauren Perry
Abstract
Provocation is a partial defence which, if successful, reduces the charge of murder to manslaughter. The law treats a killing as provoked if the defendant killed because they lost control in anger caused by the victim's wrongful act or insult. The defence is controversial: many feminists argue the defence should be revised or abolished because it perpetuates undesirable masculine norms of violence and anger expression towards women. Others, while sympathetic to feminist arguments, defend provocation on the ground that it acts as a "concession to human frailty." This paper offers two debunking arguments against the "concession to human frailty." The first is historical: provocation emerged not as a concession to human frailty but as mitigation for aristocratic men who acted to preserve their honour. Second, I argue that the contemporary defence, despite a reconceptualization in terms of "loss of self-control," functions analogously to the historical defence. Provocation should be abolished: it is not a concession to human frailty, but mitigation for those who kill to preserve their social and moral status in an unjust system.
Supervised by Dr. Evan Tiffany
Knowing When a Gnomon Is Not Working: A Complementary Scientific Approach to Eratosthenes' Calculation of the Earth's Circumference: Cem Erkli
Abstract
According to the model-based epistemology of measurement, measurement is justified by calibration models that allow a user to go from an instrument indication to the measured value of a quantity (Tal, 2017). In this paper, I explore the ways in which the measuring instruments themselves, as opposed to calibration models, limit certain inferences about the value of a quantity. A specific case of this limitation is seen in measuring instruments that are physical models, since the similarity of the model to its target system can determine whether an inference is justified or not. To argue this point, I apply the contemporary literature on physical models (e.g. Weisberg 2012, Sterrett 2001) and measurement to the historical case of Eratosthenes' (276 - 194 BC) calculation of the earth's circumference. In the end, I suggest a broad framework for analyzing the epistemic role instruments play in science based on Nancy Cartwright's nomological machines.
Supervised by Dr. Holly Andersen
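For readers unfamiliar with the measurement at issue, the familiar textbook reconstruction of Eratosthenes' calculation runs as follows (the figures are the standard ones, not taken from the paper): at summer solstice noon the sun stood directly overhead at Syene but cast shadows at an angle of about 7.2° – one fiftieth of a circle – at Alexandria, and the two cities were taken to be 5,000 stadia apart, giving

```latex
\frac{7.2^\circ}{360^\circ} = \frac{5000\ \text{stadia}}{C}
\qquad\Longrightarrow\qquad
C = 50 \times 5000\ \text{stadia} = 250{,}000\ \text{stadia}.
```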
Credibility as a Distributive Good: Varsha Pai
Abstract
The credibility that one is given plays a central role in determining the degree to which one is allowed to participate in the production and distribution of social goods. Miranda Fricker (2007) sheds light on the injustice of credibility deficit in her conception of testimonial injustice. This conception, however, views credibility excess as irrelevant to the injustice in question. In this paper, I investigate features of credibility as a good and argue that it falls within the purview of distributive justice. In doing so, I demonstrate that excesses of credibility do create corresponding deficits, making them relevant to the question of testimonial injustice. Accordingly, I offer an understanding of testimonial injustice that accommodates the distributive nature of credibility while being consistent with Fricker's broader aims.
Supervised by Dr. Endre Begby
Can't Put It Into Words? Put It Here: A Framework for Modelling Inexpressible Information: Somayeh Tohidi
Abstract
In this paper, I introduce a framework called Syntactic Probabilistic Aumann Structures (SPAS), which is an adapted version of a framework widely used by economists, to provide a novel perspective for addressing some problems in social and formal epistemology. I am going to argue that this framework is superior to a mainstream framework like standard Bayesianism for modelling situations in which inexpressible information is involved. In single-agent scenarios, learning experiences that cannot be captured by a set of sentences may impede the process of updating. I will show how thinking about these situations in terms of SPAS enables an agent to update their credence function using inexpressible information. This possibility emerges from the resources that SPAS has for modelling pieces of information as non-linguistic entities. In multi-agent scenarios, on the other hand, information that is inexpressible in the common language of the agents can cause them to disagree on the credence that they assign to an event. The question is how each agent should update their credence function upon learning the other agent's credence, considering the fact that direct conditionalization on the credence of others is computationally very difficult. In SPAS, agents can indirectly exchange their private inexpressible information simply by announcing their credence towards a certain sentence to each other. So, they can update their credence function by conditionalizing on the information that they have inferred from an announced credence, instead of conditionalizing on the announced credence itself. This possibility stems from the resources that this framework has for distinguishing between the pieces of information that can be learned (with certainty) and other pieces of information. I will show the relevance of this result to the problem of 'peer disagreement' by showing that agents with asymmetric information can be considered as peers according to the recent, less controversial notion of peerhood.
Supervised by Dr. Nic Fillion
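For readers unfamiliar with Aumann-style structures, the following toy model (an illustrative sketch only, not the SPAS framework itself; all names and numbers are hypothetical) shows the kind of indirect updating the abstract describes: an agent conditionalizes not on another agent's announced credence itself, but on the information inferred from that announcement.

```python
# A toy partition model in the spirit of Aumann structures.
states = ["w1", "w2", "w3", "w4"]
prior = {w: 0.25 for w in states}  # shared uniform prior

# Each agent's private information is a partition of the state space.
partition_1 = [{"w1", "w2"}, {"w3", "w4"}]  # what agent 1 can distinguish
partition_2 = [{"w1", "w3"}, {"w2", "w4"}]  # what agent 2 can distinguish

event = {"w1"}  # the event E whose probability agent 1 announces

def cell(partition, w):
    """Return the cell of the partition containing state w."""
    return next(c for c in partition if w in c)

def cell_credence(c, E):
    """Posterior probability of E for an agent whose information is cell c."""
    return sum(prior[x] for x in c & E) / sum(prior[x] for x in c)

true_state = "w1"
announced = cell_credence(cell(partition_1, true_state), event)  # 0.5

# Agent 2 infers which of agent 1's cells would have produced the announced
# credence, and conditionalizes on that inferred information combined with
# their own private cell -- not on the announcement directly.
compatible = set().union(*[c for c in partition_1
                           if cell_credence(c, event) == announced])
evidence = compatible & cell(partition_2, true_state)
posterior = cell_credence(evidence, event)

print(f"announced by agent 1: {announced}")    # 0.5
print(f"agent 2 after inference: {posterior}") # 1.0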
On Regan’s Solution to the Moral Problem of Animal Predation: Lutfi Shoufi
Abstract
In my paper, I aim to critique Tom Regan's response to the moral issue of animal predation. Contrary to his ambition of showing that we do not have a duty to intervene in animal predation, Regan's response actually leaves us with such an obligation. In fact, Regan's response leads to what many would consider a radically counter-intuitive obligation: we ought to intervene on behalf of the predator in many predator-prey conflicts. As we will see, Regan's response to the moral issue of predation is importantly built on the putative competence of wild animals. But if my argument succeeds, I will have shown that the burden is on Regan either to supply a plausible conception of competence on which all wild animals count as competent in their conflicts (a conception I suggest he has not offered us) or to develop a new, non-competence-based strategy for resolving the moral issue of predation. At best, Regan's solution is incomplete. At worst, Regan's solution is self-undermining.
The Only Possible Interpretation of the Only Possible Argument: Schuyler Pringle
Abstract
In this paper, I outline and defend a widely underappreciated reading of Kant's The Only Possible Argument, according to which the existence of the simple, unique, and absolutely necessary being that Kant thought materially grounds all possibility follows naturally from Kant's definition of absolute necessity – that something's opposite, its nonexistence, should cancel all possibility whatsoever. Since, according to Kant, the internal possibilities of all things presuppose some existence or other, all possibility may therefore be said to presuppose some existence – the existence of one or many things – that itself actually instantiates the complete set of the most fundamental predicates. If all predicates of all possible things are grounded in some existence that actually instantiates the complete set of the most fundamental predicates, and so thereby grounds all possibility, then the cancellation of this existence would cancel all possibility whatsoever. According to Kant's definition of absolute necessity, such an existence is absolutely necessary. And, as I will explain later in the paper, if something is absolutely necessary, then it must also be both unique and simple. Thus, it simply follows from Kant's definition of absolute necessity that there exists a simple, unique, and absolutely necessary being which grounds all possibility. I then go on to explain what motivates Kant to adopt this particular definition of absolute necessity, and, in doing so, I demonstrate that a widespread assumption commonly held within the literature is in fact false: Kant did not accept something similar to the axiom of S5; Kant did not think that whatever is possible is necessarily possible.
Supervised by Dr. Dai Heide
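For reference, the S5 principle mentioned in the abstract's closing sentence – "whatever is possible is necessarily possible" – is standardly written as:

```latex
\Diamond p \rightarrow \Box \Diamond p
```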
An Ill-Natured Neologicism: Bob Hale on the Modal Epistemology of Mathematics: Paul Xu
Abstract
Neologicists believe that all truths of pure mathematics are "analytic", in the sense that they are logically entailed by definitions. Since definitions in mathematics are typically thought of as necessary truths, neologicists have an interest in discussing modal mathematical truths, truths like "2 is necessarily a number". Surprisingly, however, neologicists have said very little in this regard. In fact, the only neologicist who has discussed this issue at length is Bob Hale, who attempts to fill this lacuna by appealing to the essences or natures of mathematical objects. In this paper I will discuss how neologicists may make sense of modal mathematical knowledge by examining Hale's essentialist proposal. I aim to show that Hale's essentialism carries unacceptable epistemological consequences for neologicism, and I will recommend a promising alternative in trivialist platonism.
Supervised by Dr. Tom Donaldson
On the Syllogistic Character of the Aristotelian Enthymeme: Ali Taghavinasab
Abstract
For more than two thousand years, the Aristotelian enthymeme was partly defined as a kind of syllogism (sullogismos) in which one of the premises is usually suppressed. In the last century, however, this doctrine has been rejected as a complete misunderstanding of Aristotle, and one of little practical use. In this paper, I defend the syllogistic character of the enthymeme and the idea that the Aristotelian enthymeme should be understood within his logical theory. In the first part of the paper, I argue that only this doctrine sufficiently distinguishes Aristotle's conception of the enthymeme from that of the authors of rhetorical manuals before him, as well as non-specialist authors. To show this, I highlight two salient features of Aristotle's concept of the enthymeme and argue that only the syllogistic interpretation can account for these features. In the second part, I argue that the syllogistic interpretation of the enthymeme provides a straightforward explanation of the similarities and dissimilarities between rhetorical arguments, on the one hand, and demonstrative (apodeixis) and dialectical arguments on the other. I am not claiming that the syllogistic doctrine of the enthymeme provides a comprehensive account of the Aristotelian enthymeme – many of its rhetorical characteristics, such as the role of emotions, lie outside the scope of the paper. My claim is that every satisfactory account of the Aristotelian enthymeme should take into account its syllogistic structure. Aristotle's redefinition of the enthymeme in terms of his logical doctrine is part of his project to highlight the cognitive and rational sides of the art of rhetoric.
Supervisors: Dr. Evan Tiffany, Dr. David Mirhady (Humanities, SFU)
Second reader: Dr. Sarah Hogarth Rossiter
Paradigms, Poles, and Higher-Order Vagueness by David Rattray
Abstract
Central to any conception of vagueness in natural language is a conception of borderlineness. Usually, our reluctance to answer the question 'Does a count as F?' either positively or negatively is characteristic of a's being a borderline case of F. The traditionalist thinks that the existence of borderline cases is definitional of vagueness: a predicate is vague just in case it admits of borderline cases. This leads the traditionalist to think of vague predicates as resisting the existence of sharp boundaries. Because vague predicates permit borderline cases, they do not effect a sharp division between their positive and negative cases — there is a third class of borderline cases buffering the two. A threefold division, while helpful at first blush, runs into trouble when we furnish the theory with the means to express borderlineness. This is usually accomplished in one of two ways: we can add a formal surrogate of the notion of determinacy (clarity, definiteness) and identify borderline F cases as those that are not determinately F and not determinately not-F; or, we can introduce a formal equivalent of borderlineness directly. The problem is that just as a vague predicate admits of borderline cases — and thus resists sharp boundaries — we expect borderlineness to admit of borderline cases as well. That is, there shouldn't be a sharp division between the clear F cases and the borderline F cases, nor between the borderline F cases and the clear not-F cases. Should we effect a five-fold division by introducing "second-order" borderline cases, the above reasoning iterates: we expect "third-order" borderline cases to blur any new boundary, which in turn will require borderline cases of their own. Rinse and repeat. The traditionalist calls the requirement that a vague predicate's boundaries be blurred by borderline cases the phenomenon of higher-order vagueness, and expects it to be accommodated by any successful theory of vagueness. Following others, I call this 'hierarchical higher-order vagueness', as it forces us up through a vertiginous hierarchy of multiplying borderline cases.
Supervised by Tom Donaldson
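In the determinacy idiom the abstract describes, borderlineness is standardly defined from a determinacy operator D (a standard formulation, not quoted from the paper):

```latex
B\varphi \;:=\; \neg D\varphi \,\wedge\, \neg D\neg\varphi
```

A second-order borderline case is then a borderline case of determinacy or of borderlineness itself – e.g. B(Dφ) or B(Bφ) – and the hierarchy iterates without end.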
2019
Justifying the ‘Yuck Factor’: Disgust Revisited by Cody Brooks
Abstract:
Just how moral is moral disgust? In the past decade we appear to have crept steadily towards consensus on this question. According to a growing number of philosophers, disgust is irrelevant to our moral thinking, and relying on it risks leading us severely morally astray. In this paper, I subject this growing consensus to scrutiny by challenging an argument against moral disgust made by Daniel Kelly. In perhaps the most thorough take-down of the emotion currently available, Kelly argues that moral disgust is unreliable as an indicator of moral truth and is irrelevant to moral justification. In what follows, I suggest that even if disgust is an unreliable path to moral truth, some reliance on the emotion may still be justified – specifically if one is of low socio-economic standing.
Supervised by Evan Tiffany
Agency in the Natural World by Simon Pollon (PhD Thesis)
Abstract:
Human agency, like our other traits, is likely continuous with that of other organisms that have evolved on this planet. However, Modern Action Theory has focused almost exclusively on the agency of human beings, so it is not obvious how agency should be understood as a more deeply and broadly distributed, or more basic, biological type. The central aim of this thesis is to fill this gap by developing and defending an account of what I'll call "Biologically Basic Agency." In this work's first chapter, I establish a preliminary set of adequacy conditions drawn from broad consensus in Modern Action Theory and the needs of biological categorization. These adequacy conditions are then amended and supplemented over the course of the subsequent three chapters via critical discussion of two accounts of Biologically Basic Agency attempting to meet the adequacy conditions developed in Chapter 1. In Chapter 2, I show that Tyler Burge's (2010) account of Primitive Agency cannot be empirically refuted and is therefore trivial. In the third chapter, I argue that Kim Sterelny's (2003) account of the Detection System cannot serve as the evolutionary precursor to agency, because the kind of general evolutionary story Sterelny desires is empirically implausible. In the fourth chapter, I continue my discussion of Sterelny's Detection System, because his basic idea that the simplest adaptive behavioral systems are "feature (or signal) specific" is deeply intuitive and popular amongst philosophers and cognitive scientists. Focusing on the behavior of simple model organisms, I argue that, contra Sterelny and this intuition, these organisms move themselves through their environments toward a best overall place to be, relative to a number of (often competing) environmental features bearing on their biological needs, typically by utilizing sensory inputs corresponding to those features. I call such behavior 'Utopic Behavior.' Finally, in Chapter 5, I defend Utopic Behavior as an account of Biologically Basic Agency, as it both meets the various adequacy conditions I have established and demonstrates a clear continuity between human agency and that of other organisms that have also evolved on this planet.
Supervised by Evan Tiffany
The Worst of All Possible Words: Slurs, Reappropriation, and Bracketing by Kelsey Vicars
Abstract:
Slurs can be used to offend, to insult, to derogate, and to subordinate members of the groups they target. Certain slurs can also be used to empower those groups, and to signify a shared social identity. When a slur is reclaimed or reappropriated, the power that the term has to offend and insult is taken back by the members of the group the slur was originally intended to disparage. In this paper I will suggest that the complexity of this phenomenon poses a problem for existing philosophical accounts of slurring terms: common to these is that they incorporate the expression of a derogating or contemptuous perspective into all uses. After a brief discussion of the direction that standard accounts of slurs take, I will explain how reclaimed slurs present a challenge. I argue that the overwhelming focus on pejorative slurs has come at the unfortunate expense of exploring these terms in their reappropriated forms, and thus has largely ignored the important ethical and legal roles these terms play.
I then turn to a discussion of how reappropriated slurs have been treated in the literature. In general, these terms are either regarded as instances of non-literal language, attempts are made to explain them away by appealing to speaker identity, they are deemed insignificant or uncommon, or ignored completely. I argue that all of these strategies are misguided, and particularly, that to ignore or ‘bracket’ these terms is especially problematic. I then present an explanation of the ethical and legal significance of reappropriated terms. I conclude by suggesting what directions an account of slurring that accommodates both pejorative and reappropriated uses might take.
Supervised by Endre Begby
Developing a Fuller Picture of Moral Encroachment by Carolyn von Klemperer
Abstract:
Many hold that our beliefs can be epistemically faultless despite being morally problematic. Proponents of moral encroachment push back against this, arguing that moral considerations can impact the epistemic standing of our beliefs. In particular, the amount of evidence required for a belief to be epistemically justified can be influenced by whether the belief imposes a risk of harm upon the belief-target. In this paper, I point out two related gaps in the current picture of moral encroachment. I argue that the current picture should be expanded from its restricted focus on the belief-target to recognize risk of harm also to third parties and to the believer herself. Further, I argue that, as this first expansion helps to reveal, the theoretical framework of moral encroachment should allow the evidential threshold for justification to move not only up but also down. Finally, I consider the question of whether moral benefits, and not just moral harms, can play a role in moral encroachment. The purpose of these arguments is to paint a fuller picture of moral encroachment, one that better captures the complexity of our moral landscapes and, in doing so, helps moral encroachment to be more consistent with its aim.
Supervised by Endre Begby
The Role of Non-Causally Related Variables in Causal Models by Weixin Cai
Abstract:
Proponents of the structural equations approach construct causal models to discover causal relationships and represent the causal structure of a system. Many authors who agree with this way of studying causation have required variables in a causal model to be "fully distinct" so that any dependence relationship found in the model must be genuinely causal. In this paper, I argue that this requirement cannot be universally held, because non-causally related variables are needed as a general solution to distinguish between causal structures of late preemption and of overdetermination in a model. By incorporating non-causally related variables in a model, we can represent the distinctive causal feature of late preemption in a way different from the model we use to represent the distinctive causal feature of overdetermination. I also explore some alternative solutions that do not use non-causally related variables and argue that none of them can serve as a general solution to this problem. I then explain why allowing non-causally related variables into a causal model is consistent with a better way of understanding the claim that variables in a model must be "fully distinct": while the requirement of distinctness should be strictly followed in the context of causal discovery, we can freely use non-causally related variables in the context of causal representation to represent the distinctive causal feature of a causal structure.
Supervised by Holly Andersen
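A minimal structural-equations sketch of the two structures being contrasted, using the canonical rock-throwing illustration (a hypothetical example, not the paper's own models): in overdetermination each cause is individually sufficient and both operate, while in late preemption one cause acts first and preempts the other, yet in both cases the effect occurs.

```python
# Hypothetical structural-equation models for the two causal structures
# contrasted in the abstract. All variables are binary (0/1).

def overdetermination(suzy_throws=1, billy_throws=1):
    # Both rocks hit; either hit alone suffices for the bottle to shatter.
    suzy_hits = suzy_throws
    billy_hits = billy_throws
    shatters = max(suzy_hits, billy_hits)
    return suzy_hits, billy_hits, shatters

def late_preemption(suzy_throws=1, billy_throws=1):
    # Suzy's rock arrives first; Billy's rock hits only if Suzy's did not.
    suzy_hits = suzy_throws
    billy_hits = billy_throws * (1 - suzy_hits)
    shatters = max(suzy_hits, billy_hits)
    return suzy_hits, billy_hits, shatters

# In both structures the bottle actually shatters, and intervening to
# remove one throw still yields shatters = 1 -- which is why further
# structure is needed to tell the two models apart.
print(overdetermination())              # (1, 1, 1)
print(late_preemption())                # (1, 0, 1)
print(late_preemption(suzy_throws=0))   # (0, 1, 1)
```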
Wright versus Pearson: Causal Inference in the Early Twentieth Century by Zili Dong
Abstract:
Causation has not always been a welcome idea in the history of science. In the early twentieth century, statistics and biometry were under the dominance of Karl Pearson, who advocated correlation while dismissing causation. So when a method of causal inference called "path analysis" was invented by Sewall Wright during the 1920s, the method was largely ignored. From an integrated history and philosophy of science perspective, this paper argues that, contrary to what many might have thought, the marginalization of causal inference in the early twentieth century was to some extent inevitable.
I first depict a Kuhnian picture of twentieth-century statistics which takes Wright's invention of path analysis as a premature attempt at revolution against the paradigm of correlation analysis. Under this picture, I then examine the formal, conceptual, and methodological aspects of causal inference in the early twentieth century. For the formal aspect, I demonstrate that causal inference methods such as path analysis require unconventional formalisms like causal diagrams, whose power could hardly be appreciated in the early twentieth century. For the conceptual aspect, I argue that Pearson's positivist conception of causation was well-motivated given the background of science and philosophy during that time. For the methodological aspect, I argue that it was methodologically reasonable to eschew path analysis in empirical research, since the application of path analysis relies on assumptions that were often hard to meet.
Supervised by Holly Andersen
Process, not just the product: the case of network motifs analysis by Shimin Zhao
Abstract:
Real-world complex systems are increasingly represented and analyzed as complex networks. In this paper, I argue that the distinctiveness of this network approach to complex systems is better captured when we focus on the process of network motifs analysis, not only on its product, the resulting network explanations. In particular, by examining the process by which a network motif is identified and analyzed, I argue that network motifs analysis is distinctive because identifying a network motif relies both on analysis of the target network and on comparison of the target network to random networks: a network motif is not an internal property, but a doubly relational property of patterns, relative both to the target network and to the random networks. This doubly relational nature of network motifs cannot be captured by a narrow focus on explanations.
Supervised by Holly Andersen
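A toy version of the doubly relational procedure described above, using triangle counting as the candidate motif (the graph and parameters here are illustrative stand-ins, not the paper's own case study): whether a pattern counts as a motif depends both on its frequency in the target network and on its frequency in an ensemble of randomized comparison networks.

```python
# Motif identification as a comparison between a target network and an
# ensemble of random networks with matched size.
import networkx as nx

target = nx.karate_club_graph()  # stand-in for a real-world target network
n, m = target.number_of_nodes(), target.number_of_edges()

def triangle_count(g):
    # nx.triangles counts each triangle once at each of its three vertices.
    return sum(nx.triangles(g).values()) // 3

observed = triangle_count(target)

# Null model: random graphs with the same number of nodes and edges.
null_counts = [triangle_count(nx.gnm_random_graph(n, m, seed=i))
               for i in range(200)]
mean = sum(null_counts) / len(null_counts)
var = sum((c - mean) ** 2 for c in null_counts) / len(null_counts)
z = (observed - mean) / var ** 0.5

# A large z-score marks the triangle as a motif of THIS target network
# relative to THIS ensemble of random networks; neither fact alone decides.
print(f"observed={observed}, null mean={mean:.1f}, z={z:.1f}")
```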
God Can Do Otherwise: A Defense of Act Contingency in Leibniz’s Theodicy by Dylan Flint
Abstract:
In this paper I articulate and defend an account of contingency found in Leibniz’s Theodicy, which I am calling act contingency. The basic idea of act contingency is simple: While God is, on this view, morally bound to the creation of the best, he is nonetheless free to have done otherwise. This account thus introduces contingency into Leibniz’s thought through the fact that, in a certain sense, God could have created a sub-optimal world. While this account of contingency permeates Leibniz’s mature thought, it has not been well received by current scholarship. The biggest problem with the account is straightforward. It seems that if God has a perfect nature, then it is not possible for him to do anything but the best. This fact, it would seem, renders the creation of other sub-optimal options impossible. Instead of trying to find the root of contingency for Leibniz through the contingency of God’s actions, it has become common practice in recent scholarship to locate the needed contingency through the objects of God’s choice. This has been done through either arguing for the per se possibility of other worlds, or through an account of infinite analysis, which declares it is contingent which world is the best. This paper challenges this established practice. I argue Leibniz in fact had good reason to adopt act contingency and that it is itself a philosophically satisfying account of contingency and divine agency.
Supervised by Dai Heide
"Externalism in Philosophy of Perception and Argument(s) from Dreaming" by Dogan Erisen
Abstract: A recurrent pattern of debate between the proponents of internalism and externalism over mental phenomena is as follows: externalists pick a target mental phenomenon, say, visual perception, and argue that it has the characteristics it has because of a property that is not possessed internally. Internalists, in response, substitute an analogue mental phenomenon, one that putatively suits their position, to argue that it shows every characteristic that the original target phenomenon shows, and thus that the allegedly crucial external property plays no ineliminable role. Within these debates a particular analogue phenomenon frequently appears: dreaming. In what follows, I discuss the ways in which externalism comes under dispute through dream phenomena. I then investigate the scientific literature to evaluate whether the way dreaming is conceived by internalists is substantiated by the available body of evidence. I conclude that the current state of sleep science does not lend support to the internalists' conception of dreaming.
Keywords: dreaming; externalism; embodied cognition; perceptual experience; constitutive explanation
Motivating a Variant of Anscombe's Conventionalism: A Proposal and Defense by Cody Britson
Abstract:
In this essay, I motivate a variant of Anscombe’s metaethical conventionalism as an alternative to a prominent Neo-Kantian view of practical normativity. I accomplish this by first arguing that Darwall’s second-personal view of normativity, an exemplar Neo-Kantian view, has substantive metaphysical issues which arise from internal tensions. While this argument does not provide all things considered reasons to abandon this view and ones like it, it does suggest we should look for a more metaphysically sound alternative. Towards this end, I draw on Anscombe’s work on promising and authority to argue that voluntary participation in social conventions provides the best explanation of the source of practical normativity. I finish by defending this view against the charge of relativity.
"Against Primitivism in Metaphysical Explanation" by Navid Tarighati
Abstract: In this paper I will argue that metaphysical explanation cannot be cashed out in terms of the notions of ground and essence alone. First, I will explain each of the two notions of ground and essence. Then, I will argue that the notion of ground can be definitionally reduced to a special type of essence. After addressing the objections that this reduction might face, I will modify the reduction of ground to essence in order to meet those objections. I will argue that this new notion of essence, called alethic essence, is not sufficient for explaining that in virtue of which some facts hold. As such, neither ground nor essence individually, nor ground and essence together, can serve as the basis for a primitivist approach towards the class of metaphysical explanatory relations.
Cultural Diversity and the Demands of Non-Domination: An Internal Challenge to Philip Pettit's Neo-Roman Republicanism by Damien Chen
Abstract: Most modern liberal countries have a diversity of cultures and ethical values, and a commitment to ethical pluralism has become a common political consciousness. Traditional republicanism, however, is vested with thick sectarian ethical values. In his neo-Roman republicanism, Philip Pettit seeks to champion a republican understanding of freedom – non-domination – while divorcing it from the sectarian ethical values of traditional republicanism. This paper mobilizes the conceptual resources of non-domination to present an internal challenge to Pettit concerning his success in meeting the demands of non-domination and ethical pluralism: namely, that cultural minorities should be able to enjoy non-domination while holding on to their ethical values.
I argue that cultural minorities are potentially subject both to domination by the state and to domination by the Western cultural majority. Section 1 analyzes the demands of ethical pluralism from the perspective of non-domination and sets the stage by introducing the social background of cultural minorities' vulnerability. Section 2 argues that there is potentially state domination of some cultural minorities, because Pettit's electoral-cum-contestatory system, which is supposed to secure legitimacy for the state, requires citizens to have a competitive individualist character, which does not reflect the character of many cultural minorities or their ethical values. Hence, in this system, some cultural minorities do not properly control the state; and in Pettit's theory, this lack of control means that state interferences dominate these minorities. Section 3 concerns potential majoritarian domination. This domination stems from republicans' willingness to interfere in and end minorities' questionable practices. Since state policies have a heavy influence on public opinion, a liberating policy that depicts a culture negatively could assist the majority's domination of that cultural minority. I caution that a direct and aggressive approach risks assisting majoritarian domination of minorities by weakening their voices and stigmatizing them, especially the vulnerable sections within them. Instead, I argue for an approach that empowers minorities' discursive control.
Supervised by Evan Tiffany
"Do You Hear What I Hear? Individual Differences in Speech Phenomenology" by Dzintra Ullis
Abstract: Suppose that I, a native speaker of Indonesian, hear the question, "Anda mau beli apa?" How does my experience of what was heard differ from your experience, as a monolingual English speaker? One difference is semantic, i.e., I hear the question as meaningful. Setting semantics aside, surely there are other phenomenal differences between our speech experiences. This paper begins with an examination of Casey O'Callaghan's (2017) account of speech phenomenology. His account depends upon three levels of phenomenology (low-, mid-, high-), where phenomenal differences stem from the middle level of phenomenology, i.e., from having phoneme awareness of a language. However, given the unique and highly complicated signal-processing tasks the mammalian auditory system faces, and the (likely) unique physiological organization that has come about because of it, O'Callaghan's view is left in a precarious situation. I focus on two ways the auditory system resolves these complex tasks—through multimodality and life-long plasticity. I argue that there is no single capacity that distinguishes the speech phenomenologies of speakers of different languages. Even speakers of the same language will likely have distinct phenomenologies, given their histories of language-related experience and training. As it turns out, language speakers are like musicians. Years of training and practice produce musical expertise, impacting both how a musician plays and what she hears. The same is true of us: years of training and practice produce linguistic expertise, impacting both how we speak and what we hear.
Supervised by Kathleen Akins
"Lockean Forensic Persons & The Having of a Body" by Anthony Nguyen
Abstract: Some insist that Locke’s thesis of self-ownership is in tension with his theory of personal identity. On the one hand, political theorists read Locke’s self-ownership thesis as claiming that each individual has natural ownership rights over their body and labor. Self-ownership rights are said to be natural rights for all persons because they are intrinsic to personhood. On the other hand, many think that Locke’s theory of personal identity posits the person as an essentially disembodied psychological entity because having a body is not necessary for personhood. If, however, having a body is unnecessary for being a person, then why is it that all persons have natural ownership rights over their bodies? Several recent scholars respond to this puzzle by claiming that Locke’s views on personal identity are inconsistent with his theory of natural rights.
I argue against this charge of inconsistency. I argue that Lockean persons are not essentially disembodied psychological entities. Instead, the "person" that is the subject of natural rights is an intelligent agent with the "forensic" capacities needed for accountability and moral agency. I argue that in order for a Lockean forensic person to be non-arbitrarily accountable for voluntary bodily acts, that person must have a body, even though the theoretical conditions for being a forensic person do not require having one. Because forensic persons do have bodies in normal circumstances, Locke's account of personhood and personal identity is consistent with his theory of natural rights, particularly in terms of having natural ownership rights over one's body.
2018
Smart Shopping for Inductive Methods by Matthew Maxwell
Abstract: Conciliatory theories of disagreement state that in the face of disagreement one must reduce one's confidence in the disputed proposition. Adam Elga claims that this conciliatory principle falls to a self-undermining problem: when the disagreement is about whether or not one ought to be a conciliationist, we find that conciliationism recommends its own rejection. In order to protect the conciliatory principle from self-undermining, Elga insists that we must be dogmatic about inductive methods and principles. In this paper I contend that there is no need to do this. The conciliatory principle does not necessarily fall to the self-undermining argument and, even if it could, dogmatism would be unwarranted. Because dogmatism is unwarranted, we need a new way of defending conciliationism. Finally, I argue that the conciliatory principle is best defended empirically.
Supervised by Endre Begby
The Role of Audience in Mathematical Proof Development by Zoe Ashton
Abstract: The role of audiences in mathematical proof has largely been neglected, in part due to misconceptions, like those in Perelman & Olbrechts-Tyteca (1969), according to which mathematical proofs cannot reflect consideration of an audience. In this paper, I argue that mathematical proof is typically argumentation and that a mathematician develops a proof with his universal audience in mind. In so doing, he creates a proof which reflects the standards of reasonableness embodied in his universal audience. Given this framework, we can better understand the introduction of proof methods based on the mathematician's likely universal audience. I examine a case study from Alexander and Briggs's work on knot invariants to show that we can fruitfully reconstruct mathematical methods in terms of audiences.
Supervised by Nic Fillion
Kant on the Unity of Spontaneity by Haley Brennan
Abstract: Kant describes absolute spontaneity as an act of transcendental freedom. It is generally taken that this concept of spontaneity is meant to apply to acts of the autonomous will, not extending beyond practical philosophy. On this view then, when Kant uses the concept 'spontaneity' to refer to the understanding in his theoretical philosophy it must be a different (thinner or qualified) kind. I argue that this view is mistaken. Instead, spontaneity is a unified concept for Kant, one that is always equivalent to transcendental freedom in both practical and theoretical contexts. I defend this view against two possible objections: (1) that the understanding is causally determined and so cannot be genuinely free, and (2) that this violates Kant's claims about our epistemic limitations. In doing so, I articulate the way that the understanding is spontaneous, and how we are genuinely free in our rational activity.
Supervised by Dai Heide
2017
The Road to a Fair Standard of Negligence by Ulyana Omelchenko
Abstract: An agent commits a negligent act in the following circumstances: the agent should be aware of a substantial and unjustifiable risk that her action carries to another; she is not aware of the risk; she proceeds with the act and wrongs the other. When she is tried for negligence, the agent’s behaviour is compared to a standard that describes the appropriate conduct for the situation in question. Two competing intuitions influence the content of this standard: we do not want to over-punish the agent for committing a wrong she was not even aware of; however, we also do not want to under-punish her for wronging the victim in a situation where we feel that she should have taken the relevant risks into account.
In this paper I look at the different types of standards of conduct commonly proposed in the literature that try to balance these two intuitions; my final goal is to find the standard that fares best. I show that none of the standards succeeds in accommodating both intuitions fully, and I point out the two strongest accounts, each of which accommodates one of the intuitions fully and the other only partly. I conclude that there is conceptual space for a standard that would combine the benefits of these two best standards while solving their problems.
Supervised by Evan Tiffany
Not by Institutions Alone: Empowerment and Human Development by Wil Contreras
Abstract: In Why Nations Fail, Acemoglu and Robinson aim to explain wealth disparities between countries using their distinction between extractive and inclusive institutions. They contend that inclusive institutions promote prosperity, while extractive ones undermine it. They conclude that broad civic empowerment serves as the foundation for inclusive institutions, igniting the path towards prosperity. While a promising theory, this notion of empowerment is left underspecified: what is it exactly? How can it be leveraged to improve people's well-being? I believe better answers can be gleaned from Miranda Fricker's epistemic injustice framework, employing her conceptions of structural identity power and hermeneutical marginalization. Specifically, I build on the idea that power spans at least two dimensions, psychological and material; and that, while Acemoglu and Robinson's theory muddles these together, Fricker's framework helps delineate and understand the relationship between the two. Given this relationship, I propose that a marginalized group is psychologically disempowered when it has internalized detrimental beliefs that translate into detrimental behaviour patterns, patterns that keep the group materially disempowered. Thus, empowerment involves abandoning those beliefs, breaking those patterns, and the consequent disruption of the social order, towards greater equality. Ultimately, I conclude that epistemic change precedes social change: that political empowerment begins with psychological empowerment, demanding a more comprehensive approach to poverty relief.
Supervised by Endre Begby
Reflective Immunity: An Essay on Korsgaard's Conception of Action and Reasons by Anthony Moradi
Abstract: According to Christine Korsgaard, an obligation can be justified as such only by virtue of our own commitments and the practical identities that we occupy by virtue of those commitments. Insofar as this claim is accepted, an apparent problem presents itself, which Korsgaard herself identifies as "the paradox of self-constitution": a proponent of this view seems to be left with the potentially intractable burden of having to explain how our practical identities can be the ultimate normative source for our actions if those practical identities are themselves constituted by our actions. Korsgaard's own solution is to suggest that the adoption of a practical identity should not be conceived of as a particular mental event, or a particular goal that we can achieve, but rather as an inescapably ongoing and intrinsically reason-giving activity that defines action itself. In this paper, I argue that such a conception of action is inadequate to encompass the activity of human agency in its entirety, because it cannot, by definition, encompass our capacity to make norms the object of our reflective scrutiny. I argue further that insofar as this is the case, it will have to be granted that the activity of reflective scrutiny is immune from the normative authority of even our own prior commitments.
Supervised by Evan Tiffany
Sparse Coding Explanation in Computational Neuroscience by Imran Thobani
Abstract: This paper investigates the nature of explanations in computational neuroscience, a field that has received relatively little attention in philosophy of neuroscience. It does so by examining a sparse coding explanation, in which the researchers construct a computational system for encoding images using a technique known as sparse coding. By observing how the computational system behaves depending on which image is being represented, the researchers can account for similar behavior in the response properties of neurons in the human visual cortex. In Section 3, I argue that this explanation has two crucial features, and I explicate each in detail. One is an instantiation relationship between a set of coefficients in the brain, which are used to represent visual information, and a set of firing rates of simple cells in the visual cortex. The other feature is a relationship between the sparse coding system designed by the researchers and a representational system in the brain. This relationship consists partly in the fact that the two systems encode sufficiently similar visual information using a set of coefficients that behave in a sufficiently similar way between the two systems. Because of this relationship, the sparse coding system has receptive fields that have the same coarse-grained spatial properties as receptive fields in the brain. However, for the explanation to work as it does, it is also crucial that the two systems not be exactly alike. The fine-grained differences between the two systems allow the researchers to explore the space of possible representational systems in a way that isolates the relevant properties of the brain that account for response properties of simple cells.
Supervised by Holly Andersen
On Cauchy's Rigourization of Complex Analysis by Gabriel Lariviere
Abstract: In this paper, I look at an important development in the history of Western mathematics: Cauchy's early (1814-1825) rigourization of complex analysis. I argue that his work should not be understood as a step in improving the deductive methods of mathematics but as a clear, innovative and systematic stance about the semantics of mathematical languages. Cauchy's approach is contrasted with Laplace's use of the "notational inductions," predominantly used in the calculation of various definite integrals. I suggest that Laplace's techniques are partly explained by the influence of some of Condillac's ideas about the language of algebra and abstract quantities. Cauchy's opposition is then not to be seen as stemming from a comeback of geometric and synthetic methods, as Argand's theory might be. Instead, from within this algebraic tradition, Cauchy rejected the key Condillacian doctrines that algebra is about abstract quantities and that its grammar provides means of discovering new mathematical truths. He thereby reduced the gap between arithmetic and algebra and fruitfully extended his approach to imaginary numbers, a significant way in which he contributed to the arithmetization of calculus and the development of complex analysis like no one before him. I finish by discussing lessons we can draw about how mathematical rigour differs from rigour in other sciences.
Supervised by Nic Fillion
Why Two (or More) Belief-Dependent Peers are Better than One by Arianna Falbo
Abstract: The following principle is widely assumed in the literature surrounding the epistemology of peer disagreement: When S disagrees with a group of epistemic peers P1, P2, P3, …, Pn concerning the truth of p, if S has already rationally accounted for the dissent of P1, then S should not consider the dissent of P2, P3, …, Pn if the beliefs of these subsequent dissenters are not independent of P1's belief that p. I call this the 'Belief-Dependence Principle' (BDP) and argue that it is false. This principle overlooks the importance of a peer's epistemic perspective, which can itself provide valuable psychological evidence. I argue that the degree of psychological evidence offered by a group of dissenting peers is partly a function of how distinct their perspectives are from one's own and from those of other peers whose opinions one has already taken into account. In closing, I discuss how denying BDP brings into focus the importance of diverse perspectives to epistemic practices.
Supervised by Endre Begby
Testimony without Assertion by Mahan Esmaeilzadeh
Abstract: In this paper, I argue against closely associating testimony with only those speech-acts that convey high confidence, such as assertions. Given that one's justification in holding a belief can come in degrees, there seems to be no reason to restrict the range of beliefs that the speaker can testify to with the intent of informing her hearer to only those beliefs that pass a certain threshold of justificatory strength. In doing so, I emphasize the distinction between the speaker's expressed confidence and her actual evidential standing, arguing that while the former confers the right of complaint on the hearer, it is the latter that justifies the hearer's belief.
Supervised by Endre Begby
2016
On Signaling Games and their Models by Travis LaCroix
Abstract: To communicate meaningfully, members of a linguistic population must cooperate—i.e., they must agree upon some convention: one member should use a word to mean one thing if the other members use that word to mean the same thing. Conventional language-use in a population of speakers can be modeled using game-theoretic tools—this idea gives rise to the so-called signaling game. To explain how communication might evolve, contemporary scholars—following the work of Skyrms (1996)—have been using increasingly complex game-theoretic models which purport to capture readily observable linguistic phenomena in both human and non-human populations alike.
Shortly after the publication of Skyrms’ (1996) evolutionary account of social norms, and the introduction of evolutionary explanations of meaning, D’Arms, Batterman, and Gorny (1998) pointed out how surprisingly little critical attention this use of evolutionary models has received. Since 1998, a significant amount of work has been done applying the same evolutionary principles in the context of signaling games and the evolution of language. The models used in these various works have become more sophisticated and vastly more numerous; however, the practice of modeling signaling within an evolutionary framework has still received surprisingly little critical attention.
This paper applies the theoretical criteria laid out by D’Arms, et al. to various aspects of evolutionary models of signaling. The question that D’Arms, et al. seek to answer can be formulated as follows: Are the models that we use to explain the phenomena in question conceptually adequate? The conceptual adequacy question relates the formal aspects of the model to those aspects of the natural world that are supposed to be captured by the model. Moreover, this paper extends the analysis of D’Arms, et al. by asking the following additional question: Are the models that we use sufficient to explain the phenomena in question? The sufficiency question asks what formal resources are minimally required in order for the model to get the right results most of the time.
Supervised by Nic Fillion
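As a concrete point of reference, the simplest model in this family is the two-state Lewis signaling game with Roth–Erev reinforcement learning; the sketch below (an illustrative toy, not one of the models analyzed in the paper) shows a sender and receiver converging on a signaling system.

```python
# A minimal Lewis signaling game with Roth-Erev ("urn") reinforcement.
import random

random.seed(1)
STATES = SIGNALS = ACTS = [0, 1]

# Urn weights: the sender maps states to signals, the receiver maps
# signals to acts; all dispositions start out equally weighted.
sender = {s: {m: 1.0 for m in SIGNALS} for s in STATES}
receiver = {m: {a: 1.0 for a in ACTS} for m in SIGNALS}

def draw(weights):
    # Choose an option with probability proportional to its weight.
    return random.choices(list(weights), weights=list(weights.values()))[0]

for _ in range(10_000):
    state = random.choice(STATES)      # nature picks a state
    signal = draw(sender[state])       # sender signals
    act = draw(receiver[signal])       # receiver acts
    if act == state:                   # success: reinforce what was used
        sender[state][signal] += 1.0
        receiver[signal][act] += 1.0

# After learning, each state is (with high probability) mapped to a
# distinct signal and each signal to the matching act: a signaling system.
for s in STATES:
    total = sum(sender[s].values())
    probs = {m: round(w / total, 3) for m, w in sender[s].items()}
    print(f"state {s}: signal probabilities {probs}")
```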
Aristotle's Theory of Explanatory Entailment by Brad Zurcher
Abstract: Within the last sixty years, work by Jan Lukasiewicz (1957) and John Corcoran (1974) among others has reinvigorated interest in Aristotle's syllogistic logic by examining its relation to modern deductive systems. It is now generally agreed that Aristotle's syllogistic system possesses the rigor and precision necessary to permit formalization as a modern symbolic system. Though limited in the logical forms it can express, it is claimed that Aristotle was examining the same concept of logical consequence as modern logicians. However, this redemption of the syllogism as a first-class logical system only highlighted a previous perplexity: if Aristotle was a logician of such consummate skill and precision, why is it that the deductive system he developed is so expressively impoverished?
In this paper I provide an answer to this question by arguing, contrary to the current consensus, that Aristotle's syllogism was not intended to capture the concept of logical consequence simpliciter but instead a stricter notion of consequence – Explanatory Entailment – in which the premises of a deduction must not only necessitate, but also present an explanation for, the truth of their conclusion. I then provide textual evidence from the Posterior Analytics that Aristotle believed this requirement could only be satisfied by the syllogistic system. I conclude that the limited expressive power of the syllogism is not the result of an oversight on Aristotle's part, but is instead a consequence of his concept of explanatory entailment and his belief that this concept could only be captured by the categorical structure of the syllogism.
Supervised by Nic Fillion
From a Deflationist Theory of Truth to Normative Meta-Epistemology by Graham Moore
On the Experience of Passage and the Temporally Evolving Point of View by Hesham Kahil
Clear and Distinct Perception and Self-Knowledge in Descartes by Michaela Manson
Kantian Moral Character and the Problem of Moral Progress by Nikolas Hamm
Abstract: There appears to be a flaw in Kant's theory of moral change. More precisely, the duty of moral self-improvement seems to be at odds with fundamental aspects of Kant's metaphysics of freedom. Kant claims that the determination of moral character must be through a free moral choice, and that free choice is only possible insofar as it lies outside space and time. Thus moral progress poses a problem for Kant; change presupposes temporal extension, and yet Kant is adamant that temporal properties cannot apply to noumena.
This concern has been raised by a number of commentators as evidence against metaphysically laden interpretations of Kant's notions of agency and character. In what follows, however, I demonstrate that the problem Kant faces is merely illusory. Rather, a careful analysis of the nature of moral character and the process of moral conversion reveals that, due to the dual nature of humans - as both noumenally and phenomenally affected beings - an improvement in an agent's phenomenal incentives can suffice to explain moral conversion, provided that the motive for such change comes from the moral law itself.
Supervised by Owen Ware
Happiness as an Ideal of Non-Moral Choice in Kant by William Braedson
Abstract: I defend a reading of Kant’s account of non-moral choice that gives due weight to Kant’s views on happiness and prudential principles. The received or “intellectual” view on Kant emphasizes the role of objective reasons. On this view, the paradigm form of choice is that which is guided by principles or values that are universally valid; this makes non-moral choices merely erroneous or misinformed attempts at moral choice. The view I defend takes into account Kant’s remarks on sensible inclinations. As human beings with sensible needs, we have a “commission” to pursue happiness, i.e. the greatest possible satisfaction of one’s desires. However, since the concept of happiness is necessarily indeterminate, everyone is faced with the task of deliberating about what is constitutive of one’s own happiness. This deliberation opens a space for non-moral (hedonic) choice that does not structure itself according to the intellectualist commitment to universally valid reasons. Furthermore, since one brings a given sensible inclination under a hedonic maxim, my account is able to meet the requirements of intelligibility and spontaneity that motivate the intellectualist account.
Supervised by Owen Ware
Breaking Promises: Objective and Attitudinal Wrongs by Brittany French
Abstract: As a bipolar duty, a promissory obligation places the promisor in a position to wrong her promisee if she fails, without good excuse, to fulfill her obligation to the promisee. In the event of a failed promise, the promisee suffers an 'objective wrong': the wrong that results from failure to complete the actions necessary for fulfilling the content of the promise. I argue that there is another kind of wronging, which I call an 'attitudinal wrong', that can be incurred by the promisee in cases where the promisor's attitudes express a disregard for the promisee to whom she owes an obligation. I build on the concept of 'normative injury' discussed in Jay Wallace's forthcoming "Lectures" to give shape to the two kinds of wrongings. As a set-up to my thesis, I provide the architecture of defective promises, arguing that in only three of the five classes of failed promises does a promisee suffer some form of wrong.
Supervised by Evan Tiffany
2015
Machine Learning as Experiment by Kathleen Creel
Abstract: Scientific inquiry increasingly depends on massive data sets, such as the data produced by the Large Hadron Collider or by government agencies for the study of the social sciences. While the epistemic role and experimental status of computer simulation have recently been discussed, the relationship between these datasets, the computational techniques used to find patterns in them, and the phenomena the datasets describe requires more extensive philosophical analysis. I propose that the algorithms of machine learning, a branch of artificial intelligence research that focuses on algorithms designed to improve their performance at tasks over time, do not merely detect patterns in data, as has been suggested. Rather, two types of machine learning, genetic programming and deep learning, deserve the epistemic status of experimental methods. By analyzing patterns in data, genetic programming and deep learning enable us to discover a broader range of phenomena than traditional techniques of investigation can. These varieties of machine learning possess a characteristic cluster of the features of experimentation and contribute to scientific inquiry in their disciplines in ways similar to the contributions of experiments in other disciplines. Because of these two factors, which I suggest constitute a definition of experiment, scientific activities that include these types of machine learning have the epistemic status of experiment.
Supervised by Holly Andersen
What is rationality good for? A game theoretic perspective by Kino Zhao
Abstract: Game theory is sometimes defined as “the study of interactions among rational agents”. This perspective has proven successful in many disciplines, such as economics, behavioral psychology, and evolutionary biology. It also faces some challenges, many of which relate, in one form or another, to how rationality is defined and utilized. Why does game theory need rationality? Or does it? In this paper, I examine the roles rationality plays in three subfields of game theory: classical, epistemic, and evolutionary. In particular, I look for roles that seem to require rationality, and question whether they can be played by any alternatives. I later develop two examples to explore the possibility of using game theoretic models without assuming players to be rational. Using the examples, I argue for my two theses: 1) the rational reasoning process should not enjoy any “privilege” over heuristics when used to interpret player behavior, except when they differ in their predictive outcomes; 2) discrepancies between rational predictions and experimental evidence should be treated as a call for alternative perspectives, instead of as a reason to discredit the use of game theoretic modeling.
Supervised by Nic Fillion
The Relational Conception of Equality by Devon Cass
Abstract: A prevailing assumption in political philosophy is that equality essentially requires that individuals are entitled to a (pro tanto) equal distribution of goods. Critics claim that this view misses the point of equality and argue that its value should be understood as involving the quality of social relationships more broadly. This paper examines the distributive implications of this ‘relational’ conception of equality. I argue that the existing proposals of ‘relational equality’ offered by Samuel Scheffler and Elizabeth Anderson have overly ambiguous or implausible implications for distributive justice. In turn, I develop an account of relational equality that addresses these issues.
Supervised by Sam Black
Territorial Rights and Cultural Attachment by Joshua Ogden
Abstract: Sovereign states exercise territorial rights: to control jurisdiction, borders, and resources. While theorists have long discussed the justification of state rights over people, the justification of state rights over territory has, until recently, been largely taken for granted. In this essay, I argue that the state is not the ultimate holder of territorial rights; nor have territorial rights been transferred to the state from individuals. Rather, states exercise territorial rights on behalf of nations – a type of cultural group. I propose an account that differs from other such nationalist accounts, for one, by being entirely present-oriented in its justification – denying, for instance, the quasi-Lockean thesis that a nation’s history of ‘developing’ the land can, in itself, directly generate territorial rights. I argue instead, from a principle of treating people as equals, that territorial rights ought to be assigned to nations on the basis of present cultural attachment to their respective homelands.
Supervised by Sam Black
The Referent of ‘Reference’ and the Philosophy of Philosophy by Brent Stewart
Abstract: Naturalism suggests two types of constraints on inquiry. The first is that lower- and higher-level theories should be integrable: if one theory assumes something that another rules out, then something has to give. The second (foregrounded by Huw Price) is that the results of the human sciences should be used to calibrate the process of inquiry itself. The human being is the most basic instrument of inquiry, and what we learn about ourselves has direct relevance for the methodology of inquiry. Using the first constraint, I explore what impact the success of enactive cognitive science might have on theories of linguistic reference. I argue that the two main contenders – the descriptive and causal-historical theories – do not mesh well with the basic tenets of enactive cognitive science. In response, I propose a sketch of a new theory – attentional theory – that fares better. Next, using the second constraint, I look at how adopting attentional theory might change philosophical practice. The result, I argue, is a change in how philosophers can appeal to language in constructing arguments and a subsequent refocusing of what makes for a good philosophical target.
Supervised by Ray Jennings
The Phenomenology of Becoming a Moral Agent by Emily Hodges
Abstract: While we often assume that individuals are essentially capable of being moral agents, it is not the case that we are all automatically moral. In fact, many of us feel that being a moral agent is not just something that we ought to do, but something we ought to commit to even if we do not always get it right. Perhaps becoming a moral agent is a process that takes time, hard work, and commitment. In this paper I argue that becoming a moral agent involves a process comprising two critical moments. To understand this process, I explore the moments of becoming a moral agent as described by Kant and Levinas. In the Kantian moment, the individual becomes a moral agent when she realizes the objective authority of the moral law, which subjectively grounds moral action through the experience of respect. Yet the Kantian account does not seem able to grant others the full dignity they deserve as ends-in-themselves. I therefore explore Levinas’s account, wherein the individual experiences the other as the ground of moral obligation. Though such grounding grants the other dignity, the Levinasian account cannot give us a moral standard or self-respect. I conclude that the two moments can be synthesized into a single process of moral cultivation. Here, the moral law is created through the individual’s encounter with a “second-personal demand.” However, it is only upon conscious self-legislation of the moral law that the agent takes on the higher vocation of being a responsible interlocutor who answers the ethical demand of the other respectfully. As the self is thus transformed into a moral agent, the subjective motivation is self-respect as well as respect for the other. The moral self is connected intrinsically to the other, grounding morality in the dignity of both self and other.
Supervised by Evan Tiffany
Kant on Givenness and the Phenomenological non-Presence of Space and Time by Rosalind Chaplin
Abstract: A number of contemporary Kant scholars hold that Kant believes space and time are ‘given’ as phenomenologically present to the mind in sensibility alone. I argue that Kant in fact denies that either space or time is epistemically accessible to us in mere sensibility and that his considered position is that we depend on synthesis for awareness of the basic properties of space and time. However, this is not to say that space and time themselves depend on synthesis. As I argue, Kant believes that our epistemic access to space and time depends on synthesis, but this is compatible with the claim that space and time are ‘given’ in sensibility alone. In the context of his discussion of space and time, Kant takes ‘givenness’ to be a metaphysical notion rather than an epistemological one, and part of what is distinctive about Kant’s theory of space and time is his effort to pull metaphysical and epistemological priority apart.
Supervised by Dai Heide
A Lawful Freedom: Kant’s Practical Refutation of Noumenal Chance by Nicholas Dunn
Abstract: This paper asks how Kant’s mature theory of freedom handles an objection pertaining to chance. The question is significant given that Kant raises this criticism against libertarianism in his early writings on freedom, before coming to adopt a libertarian view of freedom in the Critical period. After motivating the problem of how Kant can hold that the free actions of human beings lack determining grounds while at the same time maintaining that they are not the result of ‘blind chance,’ I argue that Kant’s Critical doctrine of transcendental idealism, while creating the ‘conceptual space’ for libertarian freedom, is not intended to provide an answer to the problem of chance with respect to our free agency. I go on to show how the resources for a refutation of chance only come about in the practical philosophy. In the second Critique, Kant famously argues for the reality of freedom on the basis of our consciousness of the Moral Law as the law of a free will. However, Kant also comes to build into his account of the will a genuine power of choice, which involves the capacity to deviate from the Moral Law. I conclude by showing that this apparent tension can be resolved by turning to his argument for the impossibility of a diabolical will. This involves a consideration of the distinct kind of grounding relationship that practical laws have to the human will, as well as the way that transcendental idealism makes this possible.
Supervised by Dai Heide
Is Conciliationism Incoherent? by Mike Perry
Abstract: Conciliationism—according to which whenever you find yourself in disagreement with an epistemic peer about a proposition, you should significantly shift your credence in that proposition in their direction—is an epistemic rule that is both intuitively compelling and useful. Adam Elga has argued, however, that Conciliationism is incoherent because it is self-undermining (it sometimes says that you should reduce confidence in itself). After a brief discussion of epistemic rules in general, I explain and evaluate Elga’s argument. The argument, it turns out, threatens not only Conciliationism, but almost any epistemic rule that it’s sometimes rational to doubt. Thankfully, it is persuasive only if we ignore an important distinction between different senses in which it can be true that someone “should” do something. Conciliationism does not emerge unscathed, however. Although the argument doesn’t show that Conciliationism is incoherent, it does show that it’s not as useful as it might initially seem.
Supervised by Endre Begby
2014
Enactivism: Filling in the Gaps by Nicole Pernat
Abstract: Enactivism is an anti-representationalist version of embodied cognition that aims to explain visual perception and perceptual presence. It states that these are grounded in tight couplings between sensory input and motor output called “sensorimotor contingencies” (SMCs). I argue that enactivists Kevin O’Regan and Alva Noe are unclear on the nature of SMCs and their mastery. Explaining such theoretical constructs requires describing their physical mechanisms (Andersen, 2011; Machamer, Darden, & Craver, 2000), which in turn requires clarifying what mastery, in this case, is. I outline four possible interpretations and show why the first three do not serve the enactivists’ purpose. Grush (2007) provides the fourth interpretation: emulation. He provides necessary details on the nature of sensorimotor processes and convincingly argues that mastered SMCs are emulators – which are representational. Enactivists therefore cannot detail their view sufficiently to explain perceptual presence and remain anti-representationalist. I conclude, contra the enactivists, that mastered SMCs are most probably representational.
Supervised by Kathleen Akins
Doxastic Normativity Without Epistemic Justification by Syeda Komal Gilani
Abstract: This paper sketches a type of justification for beliefs that does not aim for the truth, namely phenomenal justification. The type of justification suggested is structured by experience alone, and not by truth, utility, perfection, or any other traditionally recognized source of doxastic normativity. In making a case for phenomenal justification, I explicate and motivate the problem of the criterion, and show how phenomenal justification has an advantage over epistemic justification in avoiding the problem.
Supervised by Phil Hanson
2013
On the ‘Evolutionary Contingency Thesis’ Debate by Tiernan Armstrong-Ingram
Abstract: John Beatty developed the ‘Evolutionary Contingency Thesis’ – that all outcomes of evolution are contingent – in support of the conclusion that there are no laws in biology. Contra Beatty, Elliot Sober has argued against the move from contingency to lawlessness. Lane Des Autels criticizes Sober’s reply to Beatty, claiming that Sober’s view admits too many laws. Here, I argue three points relevant to this tripartite debate. First, Sober’s attempt to circumvent the contingency problem strips away the causal element found in Beatty’s account of contingency. Second, Des Autels unfairly characterizes Sober as granting law-hood to any contingent generalization we might choose, when Sober is merely arguing that contingent generalizations should not be systematically excluded from consideration for law-hood. Third, Sober’s response to Beatty succeeds only if one also holds Sober’s conception of what constitutes a law, which Beatty does not. Fundamental to their disagreement is a difference in how Beatty and Sober conceive of what does or does not count as a ‘law’, an issue that extends well beyond biology and the philosophy thereof.
Supervised by Holly Andersen
The Argument From Perceptual Variation Reconsidered by Jane Chen
Abstract: Jonathan Cohen argues for relationalism about colour by means of what he calls ‘the argument from perceptual variation’. Cohen’s relationalism claims that whether a stimulus has a particular colour depends on who perceives the stimulus and on the viewing conditions under which it is perceived (including illumination, background, and so on). For example, if a single spectral light appears pure green to Subject A and bluish green to Subject B, then it is pure green to Subject A and bluish green to Subject B. The argument from perceptual variation holds that there is no principled reason to choose between the two distinct perceptual effects (usually called “phenomenal appearances”, “experiences”, or “visual states”) of the two subjects, and that anti-relationalism implies a selection between them and therefore involves stipulation, which should be avoided when possible. Against this argument, I contend that there is at least one coherent and plausible anti-relationalist account of colour that does not imply a selection between the two perceptual effects and thus also avoids stipulation. To make my point, I construct a hypothetical case called “the L box”, which exactly parallels the case of perceptual variation of colour, and argue that a relationalist conclusion about the L properties does not follow. I also anticipate three objections with regard to colour terms. My conclusion is that perceptual variation should not pose a problem for anti-relationalism in the case of colour.
Supervised by Kathleen Akins
A Comparison of William James and Nietzsche on Consciousness and Will by Vera Yuen (Note: this is a thesis)
Abstract: My thesis compares William James’ and Friedrich Nietzsche’s construals of consciousness and will, two core notions in both philosophy and psychology. I delineate the elements that are significant in their respective accounts of these notions and show that there are interesting and significant parallels in their views. Appreciating the affinities between James’ and Nietzsche’s accounts of consciousness and will illuminates their remarkably parallel contributions to both philosophy and psychology. It also brings out James as a philosopher with a rich background and expertise in psychology, and Nietzsche as an original and important philosopher-psychologist. Furthermore, the parallels I draw between their views provide materials that substantiate the construal of a strand in contemporary psychology that is philosophically informed, pragmatist, and embraces a radical version of empiricism.
Supervised by Holly Andersen
On Logical Revision: Field, Maddy, and a Hybrid Proposal by Matthew Harrop
Abstract: I begin this paper by exploring two approaches to logical revision – one broadly motivated by ‘rational’ considerations and the other by ‘empirical’ considerations. First, I consider Hartry Field’s proposal to adopt a paracomplete logic (FPL) in order to overcome the Liar and Curry Paradoxes. I then examine Penelope Maddy’s proposal, which takes a weak “rudimentary logic” to capture both the logical structure of the macro-world of our everyday experience and the logical structure of our cognition. After pointing out and attempting to overcome particular difficulties faced by these two accounts of logical revision, I sketch a conception of logic that involves a novel combination of what I take to be the plausible elements in Field’s and Maddy’s views – namely that the logical structure of the macro-world has influenced the logical structure of our cognition via natural selection and that our choice of which all-purpose logic to adopt is settled in large part by an evaluation of how well these logics allow us to achieve our epistemic goals. One of the upshots of this hybrid account is that it takes seriously the problem of modesty – a conceptual puzzle associated with the rational revision of logic – while still allowing for the rational revision of logic. Another upshot is that it allows the rational revision of logic to be motivated by either ‘rational’ or ‘empirical’ concerns. I conclude the paper by considering challenges to my proposal, specifically those stemming from its (in)ability to accommodate dialetheism.
Supervised by Phil Hanson
Disagreement: What’s It Really Good For? by Jesus Moreno
Abstract: In this essay I will argue that the practice of inquiry presupposes the possibility of seeing each other as disagreeing on normative matters. If inquiry is a constitutive feature of being the kind of beings that we are, that is, if inquiry is indispensable to us insofar as we cannot fail to engage in it, then the possibility of seeing each other as disagreeing on normative matters must be part of the picture presented by our theoretical accounts of normativity. I will base my argument on two distinctions: a) between first-person and third-person perspectives on cases of agreement/non-agreement, and b) between agreement/non-agreement (in either of the two forms just mentioned) and conflict. Conflict, which I shall characterize as incompatibility of preferences, is shown to be neither necessary nor sufficient for either form of disagreement. I will focus on cases of first- and third-person disagreement, arguing that they are neither necessary nor sufficient for one another: we can be disagreeing without knowing so, and we can think we are disagreeing without being in third-person disagreement. It is the first-person form of disagreement that, I argue, proves indispensable for inquiry. I will attempt to show that first-person disagreement on normative matters cannot be reduced to a conflict of preferences, for where such a reduction is attempted we would fail to see ourselves, and others, as disagreeing. If an account of normativity entails the dismissal of first-person disagreement on normative matters, then that account cannot be consistently held: either it leads us to employ, in the very practice of inquiring whether the account can be held, what is being dismissed (which is inconsistent), or it leads us to quietism.
Supervised by Phil Hanson
A Substantive-Argument Alternative to Arguing from Insufficient Evidence by Berman Chan
Abstract: The first section of this paper argues that arguing from insufficient evidence may be legitimate in arguments meant to encourage some practical outcome. I distinguish those cases from philosophical arguments, which are mostly concerned with evaluating arguments for their own sake, and for which I argue that the use of burden of proof is not legitimate. The second section discusses some arguments from insufficient evidence that superficially seem to use burden of proof. I give reasons for approving of some of these arguments, developing and defending a method that employs substantive arguments instead, thus avoiding the use of burden of proof.
Supervised by Martin Hahn
Fitting a Square Peg into a Eudaimonic Hole: LeBar, Virtue Ethics and the Second-Person Standpoint by Graham Robertson
Abstract: A fairly well-known objection to virtue ethics is that it is unable to provide the proper sort of reasons that a moral theory ought to: those commonly known as ‘victim-focused’, as opposed to those focusing on the flourishing of the agents themselves. In his paper ‘Virtue Ethics and Deontic Constraints’, LeBar attempts to meet this objection by incorporating Darwall’s second-person standpoint, a metaethical view of moral reasons, into the normative component of a eudaimonic virtue theory. In this paper I argue that LeBar fails to properly incorporate victim-focused reasons, because he is unable to account for second-person reasons possessing a normative force, relative to other reasons, that is based solely on the authority of second-personal claims.
Supervised by Evan Tiffany
Kant’s Transcendental Exposition of Space and the Principles of Intuitive Reasoning by Christopher Palmer
Abstract: The Transcendental Exposition of the Concept of Space occupies a peculiar place in Kant’s theory of geometry. Clearly, the Exposition tries to connect his thoughts on space and geometry; what is challenging is understanding the precise connection Kant is drawing. Philosophers have historically interpreted the Exposition as containing an argument about our representation of space that is premised upon the epistemic status of geometrical judgments: because synthetic a priori cognitions of geometry are possible, space must be a pure intuition, since no other representation is suitable for these judgments. Modern Kant scholars interpret the Transcendental Exposition as presenting Kant arguing in roughly the opposite direction: his account of space as a pure intuition is a premise of an argument that affirms the possibility of the synthetic a priori cognitions of geometry. From the conclusions about space in the Metaphysical Exposition, Kant (on this reading) already has the material necessary to prove how the kinds of judgments unique to geometry are conceivable. While I agree with the modern interpretation’s general point that Kant is arguing from a particular conception of space to a claim about geometry, I do not think that he is attempting to prove the possibility of a certain category of cognitions. Instead, I will argue that in the Transcendental Exposition Kant is demonstrating the possibility of a particular method or procedure by which cognizers attain synthetic a priori cognitions. The arguments of the Exposition establish a principle regarding the a priori determination of concepts, and this principle is indispensable for construction and intuitive reasoning, which is how Kant thinks cognizers attain such synthetic a priori cognitions.
Supervised by Dai Heide