Archive

Kelin Emmett (SFU) "Refusing to Will the Means"

Friday, July 28, WMC 3220

Many Kantians believe that hypothetical imperatives depend on categorical imperatives for their normativity. I argue that this view stems from a prevalent, but mistaken, understanding of hypothetical imperatives as primarily anti-akratic principles that prescribe the volition of the necessary known means to one who wills the end. On this view, all practical necessity is really categorical, depending on the categorical imperative’s endorsement of our ends as good. It thus collapses any normative difference between hypothetical and categorical imperatives. I argue instead that, for Kant, the volition of the necessary known means and the volition of the end are the very same volition, and so it does not turn out to be possible, on his account of practical reason, to will an end and fail to will the necessary known means to it. Based on Kant’s conception of willing an end then, I offer an alternative understanding of hypothetical imperatives, one which preserves the normative difference between categorical and hypothetical imperatives, and the correlative distinction between practical failures that are evil, and those that are merely stupid.

Chung-Fuk Lau (Chinese University of Hong Kong), "Kant's Concept of Cognition and the Key to the Whole Secret of Metaphysics"

Friday, June 30, WMC 3220

Kant’s Critique of Pure Reason had an enormous impact on the development of epistemology. However, certain core concepts in contemporary epistemology may have led to rather misguided interpretations of the Critique. A particularly tricky problem is Kant’s concept of cognition (Erkenntnis), which is different from the concept of propositional knowledge. Kant does not see the epistemological issues of truth and justification, which are otherwise central to the concept of knowledge, as his basic problem. What is essential to Kant’s concept of cognition is rather an appropriate relation to the object that connects us to the world and confers objective validity on intuitions, concepts, and judgments. The paper distinguishes between two senses of cognition, which can be characterized as the judgmental and the representational sense respectively, and argues that the possibility of representational relation depends on a set of necessary rules, by virtue of which objectively valid representations can be distinguished from invalid ones. The paper concludes with remarks on Kant’s thesis of the uncognizability of things in themselves and the possibility of extending the realm of cognizable objects beyond possible experience by the practical use of reason. In this sense, Kant’s concept of cognition can offer a foundation for a novel metaphysics that has both a theoretical and a practical dimension.

Kevin Zollman, CMU, "Epistemic Dilemmas"
March 10, 2017

Social dilemmas like the Prisoner's Dilemma and the Tragedy of the Commons are well-known problems for the social organization of economic life. In these complex social situations, individuals who pursue their own self-interest make the group as a whole worse off. While such dilemmas are well studied in the context of instrumental rationality, they have received less attention in epistemology. In this talk I will explore some epistemic social dilemmas and discuss how various solutions might work in the epistemic setting.
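
The structure at issue can be made concrete with the standard Prisoner's Dilemma. The sketch below (the payoff numbers and function names are illustrative assumptions, not taken from the talk) shows that defecting is each player's best response whatever the other does, yet mutual defection leaves both players worse off than mutual cooperation.

```python
# Illustrative sketch (not from the talk): the standard Prisoner's Dilemma
# payoff structure. Payoffs are (row player, column player).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def best_response(opponent_action):
    """Return the row player's payoff-maximizing action against a fixed opponent action."""
    return max(("cooperate", "defect"),
               key=lambda a: PAYOFFS[(a, opponent_action)][0])

# Defection is a best response no matter what the other player does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"

# ...yet mutual defection gives each player less than mutual cooperation.
assert PAYOFFS[("defect", "defect")][0] < PAYOFFS[("cooperate", "cooperate")][0]
```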

Kerry McKenzie, UCSD, "What’s wrong with ontic structuralism?"
February 10, 2017 ~ WMC 3510

Epistemic structural realism entered the philosophy of science in the 1980s as an antidote to the problem of theory change. By the late 90s, ontic structural realism had come on the scene as a thesis about the metaphysics of the fundamental. But the two forms of structuralism seem to sit awkwardly with one another, given that we still await a truly fundamental theory of physics and hence can expect our metaphysics of fundamental physics to change at some point in the future. The question then arises as to whether our metaphysics of the fundamental is open to notions of progress and continuity analogous to those the structuralist claims may be applied to scientific theories.

In this talk, I will argue that there are no such prospects. At the heart of that pessimism is the fact that metaphysical concepts, at least as canonically understood, seem in principle resistant to the notion of ‘approximation’, making it difficult to regard successive metaphysical theories as successive improvements of one another. While seemingly spelling bad news for ontic structural realism, the same observation might be of assistance in the project of demarcating metaphysics from science. Perhaps most valuably, it invites us to envisage the possibility of a metaphysics with more tolerance for the fact that we routinely get things wrong: that is, to envisage a metaphysics that is more like science.

Kristin Primus, Berkeley, "A New Model of Spinoza's Monism"
November 18, 2016 ~WMC 3510

According to Spinoza, God, the one and only necessarily-existing substance, is the immanent cause of the essence and existence of *all* things (E1p16, E1p18, E1p25).  But if this is what he thinks, how can Spinoza hold that substance is the (immanent) cause of the existence and essence of all things, while also holding that some things – finite things – *cannot* be caused by the infinite substance, but must be (transitively) caused by other finite things (E1p28)?

It’s not surprising that Leibniz was one of the first to draw attention to this apparent inconsistency in Spinoza’s system:  Leibniz had settled on his own solution to the question of how to balance immanent divine causation and transitive creaturely causation, a solution he thought would enable him to avoid the theologically problematic necessitarianism and monism of Spinoza’s system.  Of course, Leibniz’s case for his own system is that much stronger if he can point to a fatal internal inconsistency in Spinoza’s system, rather than just point out how that system cannot accommodate certain theological commitments – to creaturely free will, to an eminently good, transcendent God, to the contingency of this world’s creation – that a Spinozist is happy to give up.

There have been attempts to come to Spinoza’s defense, but these attempts require weakening Spinoza’s commitments to monism, necessitarianism, or the reality of finite things.  In this paper, I will offer a defense that does not require such weakening, although it does require that we read Spinoza as denying that we cognize the things of ordinary experience as inhering in the one eternal substance.  Given that Spinoza thinks it an uncommon cognitive achievement to see things as modes inhering in substance, this is a welcome result.

Marko Malink, New York University "Aristotle on Principles as Elements"
October 7, 2016 ~ WMC 3510

In his discussion of the well-known four causes, Aristotle makes a puzzling claim to the effect that 'the hypotheses are material causes of the conclusion' (Physics 2.3, Metaphysics 5.2). This is usually taken to mean that the premises of any valid deductive argument are material causes of the conclusion. By contrast, I argue that Aristotle's claim applies not to deductive arguments in general, but only to scientific demonstrations, i.e., to explanatory deductions from unproved first principles. For Aristotle, the theorems of a given science can be viewed as compounds consisting of the first principles from which they are demonstrated. Accordingly, the first principles of a science can be regarded as elements from which theorems are obtained by composition (synthesis), much like a syllable is obtained by composing letters. First principles are thus material causes of the theorems demonstrated from them.

Saul Kripke, Graduate Center, CUNY "Naming and Necessity Revisited"
September 16, 2016 ~ WMC 3260

The Simon Fraser Philosophy Department has requested that I give a talk on Naming and Necessity. In this talk, however, I will presuppose knowledge of the book and will take the opportunity to revisit some of its more controversial points. I will address some misinterpretations as well as discuss some statements and characterizations I made that do need modification, correction, or further elaboration. Some of the topics I hope to discuss are, for example, a footnote that appears to give a trivial endorsement of the description theory after everything I had said against it, Evans’s ‘Madagascar’ example, the view that Putnam’s linguistic division of labor could replace my own picture, and some of my own remarks on natural kinds. (I don’t know if I will be able to cover this all in a single talk, but I will do what I can.)

Academic Year 2015

Carla Fehr, University of Waterloo "Implicit Bias and Racial Health Disparities"
Friday, July 8, 3:30-5:00 pm.  Room:  WMC 3220

Abstract

There are significant racial and ethnic disparities in healthcare:  members of some racialized groups have higher death rates and worse health outcomes across a wide range of medical conditions. There is mounting evidence that some of these disparities are due to the role of unconscious, implicit racial biases among physicians. Given that physicians' implicit biases have been demonstrated to harm their patients, it is important to identify effective strategies for reducing their harmful effects.  In this paper, we evaluate the most frequently proposed solutions in the medical literature.  We argue that, unfortunately, much of what has been proposed simply does not work.  We further demonstrate that many of the strategies for ameliorating implicit bias that are effective in other contexts do not apply well in a medical context.  Finally, we maintain that the reason many of the proposed strategies do not work is that they focus at the level of the individual physician, rather than at the level of healthcare systems and institutions.

Brian Keeley, Pitzer College "Conspiratorial explanation, scientific method and credulity"
Friday, April 22. 3:30-5:00 p.m.  Room:  WMC 3510

Abstract

Where does entertaining (or even promoting) conspiracy theories stand with respect to rational inquiry? According to one view, conspiracy theorists are open-minded skeptics, being careful not to accept common wisdom uncritically, exploring alternative explanations of events, no matter how unlikely they might seem at first glance. Seen this way, they are akin to scientists attempting to explain the social world. On the other hand, they are also sometimes seen as overly credulous, believing everything they read on the internet, say. In addition to conspiracy theorists and scientists, another significant form of explanation of the events of the world can be found in religious contexts, such as when a disaster is explained as being an “act of God.”  By comparing conspiratorial thinking with scientific and religious forms of explanation, features of all three are brought into clearer focus. For example, both anomalies and background values are seen as important in motivating scientific research, although the details are less clear, as is the question of when anomalies and background values lead scientists astray. This paper uses conspiracy theories as a lens through which to investigate rational or scientific inquiry. In addition, a better understanding of the scientific method as it might be applied in the study of events of interest to conspiracy theorists can help us understand their epistemic virtues and vices.

Roy Sorensen, Washington U., St. Louis "Other-Centric Reasoning"
Friday, March 18, 3:30-5:00 pm.  Room:  WMC 3255

Abstract

Question-begging has an opposite fallacy.  Instead of relying on my beliefs (for my premises) when I should be using my adversary's beliefs, I rely on my adversary's beliefs when I should rely on my own.  Just as question-begging emerges from ego-centrism, its opposite emerges from other-centrism.  Stepping into the other guy's shoes is an effective strategy for understanding him.  But you must return to your own shoes when forming your beliefs.  Evidence is agent-centered.

Other-centric reasoning is most striking when both parties partake simultaneously.  We are then treated to the spectacle of each side using the other's premises to establish their conclusion.  These remarkable debates arise regularly when there is open disagreement about whether a right-conferring relationship has ended.  Those who contend the relationship is abrogated will be tempted to stand on the rights persistently credited to them by their adversary.

Ariel Zylberman, SFU, "The Relationship of Respect:  The Relational Source of Moral Normativity"
Friday, February 26, 3:30-5 p.m.  Room:  WMC 3255

Abstract

It is typically assumed that moral obligation has two essential features: moral obligation is objective – valid for all moral agents – and deontic – binding all moral agents in a non-discretionary and perhaps even unconditional way. Yet, by privileging one of these aspects over the other, philosophical accounts face well-known difficulties explaining both features of moral obligation. My aim in this article is to introduce and defend an alternative, relational account. The organizing thought is that the morally obligatory has its justificatory source in a basic relationship of respect among equally free persons. The principal advantage of a relational account is that it promises to accommodate the two essential features of moral obligation. Unlike accounts privileging the objectivity of moral norms, a relational account can offer a principled distinction between norms that simply recommend and those that command. And unlike accounts privileging the demandingness of morality, a relational account does not ground moral obligation in any actual or implicit command, thereby avoiding the threat of emptiness and arbitrariness faced by voluntarist accounts.

Rosemary Twomey, SFU "Sense-Organs and Sensory Integration in Plato and Aristotle"
Friday, February 5, 3:30-5:00 p.m.  Room:  WMC 3255

Abstract

We perceive a great variety of things and do so without any difficulty. Although we learn from especially chaotic perceptual conditions that perceptual input has the potential to confuse or overwhelm us, these conditions are not the norm. How are we able to integrate information so seamlessly in the normal situation, and what limitations do the individual senses (the “special-senses” of Aristotle) and their organs put on the range of explanations of the unity of perception? In answering these questions, Plato argues that there is a great deal of rational processing that completes the work the senses cannot do on their own. By contrast, Aristotle contends that the senses have the wherewithal to synthesize the data on their own. I show that the disagreement between the two philosophers is not a result of divergent conceptual views about the scope of perception, as is usually thought, but instead has an empirical basis in differing assumptions about the physical constitution of the organs. I conclude that contemporary philosophical discussions of these issues should also be sensitive to empirical considerations about the differences and similarities among the sense organs.

Bert Baumgaertner, University of Idaho "Opinion strength influences the spatial dynamics of opinion formation"
Friday, January 29, 3:30-5:00 p.m.  Room WMC 3253

Abstract

Opinions are rarely black and white; they can be held with different degrees of conviction and this expanded attitude spectrum has the potential to affect the influence one opinion has on others.  Our goal is to understand how different aspects of influence lead to recognizable spatio-temporal patterns of opinions and their strengths.  To do this, we introduce a stochastic spatial agent-based model of opinion dynamics that includes a spectrum of opinion strengths and various possible rules for how the opinion type and strength of one individual affect the influence that individual has on others.  Through simulations of the model, we find that even a small amount of amplification of opinion strength through interaction with like-minded neighbors can tip the scales in favor of polarization and deadlock.
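
A minimal sketch of the kind of model described is given below; the grid size, neighborhood, update rule, and amplification step are illustrative assumptions rather than the authors' actual specification.

```python
# A minimal sketch of a stochastic spatial agent-based opinion model with
# opinion strengths. All parameters and rules here are illustrative
# assumptions, not the specification used in the talk.
import random

SIZE, MAX_STRENGTH, STEPS = 20, 3, 50_000

# Each cell holds (opinion, strength): opinion is +1 or -1, strength is 1..MAX_STRENGTH.
grid = [[(random.choice((-1, 1)), 1) for _ in range(SIZE)] for _ in range(SIZE)]

def neighbor(i, j):
    """Pick a random von Neumann neighbor, with wraparound at the edges."""
    di, dj = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    return (i + di) % SIZE, (j + dj) % SIZE

for _ in range(STEPS):
    i, j = random.randrange(SIZE), random.randrange(SIZE)
    ni, nj = neighbor(i, j)
    (op, st), (n_op, n_st) = grid[i][j], grid[ni][nj]
    if op == n_op:
        # Interaction with a like-minded neighbor amplifies conviction.
        grid[i][j] = (op, min(st + 1, MAX_STRENGTH))
    elif n_st > st:
        # A more strongly convinced neighbor erodes, and eventually flips, the agent.
        grid[i][j] = (op, st - 1) if st > 1 else (n_op, 1)

net_balance = sum(op for row in grid for (op, _) in row)
print(f"net opinion balance after {STEPS} updates: {net_balance}")
```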

Eric Watkins, UCSD, "Kant on Cognition and Knowledge"
Friday, November 13, 3:30 - 5 p.m.  Room:  WMC 3510

Abstract

In this paper, we argue that by cognition (Erkenntnis) Kant has in mind a representational state that is distinct from knowledge (Wissen).  For cognition, taken in the sense that is most important to Kant's project in the Critique of Pure Reason, is an awareness of the existence and general features of an object, whereas knowledge is an act of assent based on objectively sufficient grounds.  Thus, my awareness of the object in front of me as a round ball is a cognition (whether the ball is in fact round or not), whereas my assenting to the claim that the ball is round on the basis of grounds that I take to be objectively sufficient for that claim is a case of knowledge.  Taking note of the difference between cognition and knowledge leads to a different conception of the nature of Kant's overall argument in the first Critique concerning synthetic a priori cognition.

Roberta Millstein, UC Davis, "Rethinking Aldo Leopold's Land Community Concept"
Friday, November 6, 3:30 - 5 p.m.  Room:  WMC 3510

Abstract

The “land community” (or “biotic community”) that features centrally in Aldo Leopold’s Land Ethic has typically been equated with the concept of “ecosystem.” Moreover, some have challenged this central Leopoldean concept given the multitude of meanings of the term “ecosystem” and the changes the term has undergone since Leopold’s time (see, e.g., Shrader-Frechette 1996). Even one of Leopold’s primary defenders, J. Baird Callicott, asserts that there are difficulties in identifying the boundaries of ecosystems and suggests that we recognize that their boundaries are determined by scientific questions ecologists pose (Callicott 2013). I argue that we need to rethink Leopold’s concept of land community in the following ways. First, we should recognize that Leopold’s views are not identical to those of his contemporaries (e.g., Clements, Elton), although they resemble those of some subsequent ecologists, including some of our contemporaries (e.g., O’Neill 2001, Post et al. 2007, Hastings and Gross 2012). Second, the land community concept does not map cleanly onto the concept of “ecosystem”; it also incorporates elements of the “community” concept in community ecology by emphasizing the interactions between organisms and not just the matter/energy flow of the ecosystem concept. Third, the boundary question can be illuminated by considering some of the recent literature on the nature of biological individuals (in particular, Odenbaugh 2007; Hamilton, Smith, and Haber 2009; Millstein 2009), focusing on concentrations of causal relations as determinative of the boundaries of the land community qua individual. There are challenges to be worked out, particularly when the interactions of community members do not map cleanly onto matter/energy flows, but I argue that these challenges can be resolved. The result is a defensible land community concept that is ontologically robust enough to be a locus of moral obligation while being consistent with contemporary ecological theory and practice.

Dana Nelkin, University of California San Diego "Accountability and Desert"
Friday, October 23, 3:30-5 p.m.

Abstract

In recent decades, participants in the debate about whether we are free and responsible agents have increasingly begun their papers or books by fixing the terms "free" and "responsible" in clear ways to avoid misunderstanding.  This is an admirable development, and while some misunderstandings have certainly been avoided, and positions better illuminated as a result, new and interesting questions also arise.  Two ways of fixing these terms and identifying the underlying concepts have emerged as especially influential, one that takes the freedom required for responsibility to be understood in terms of accountability and the other in terms of desert.  In this paper, I start by asking:  are theorists talking about the same things, or are they really participating in two different debates?  Are desert and accountability mutually entailing?  I then tentatively conclude that they are mutually entailing.  Coming to this conclusion requires making finer distinctions among various more specific and competing accounts of both accountability and desert.  Whether or not the thesis that the two notions come apart turns out to be true, exploring the more finely specified accounts can help us see in a more fine-grained way the things we care most about in the free will and responsibility debate.

David Christensen, Brown University "Disagreement, Drugs, etc.:  from Accuracy to Akrasia"
3:30 p.m. Friday September 25, 2015, WMC 3510.

Abstract

We often get evidence concerning the reliability of our own thinking about some particular matter. This can come from the disagreement of others, or from information about our being subject to the effects of drugs, fatigue, emotional ties, implicit biases, etc. Accounts that make rational belief sensitive to such evidence depend on reliability-assessments of the agent’s thinking about the matter in question. But specific proposals about how such assessments rationally constrain agents’ credences seem to yield unintuitive results—especially in cases where the agent’s first-order thinking does fall short rationally. This paper examines some pros and cons of two fairly general models for accommodating higher-order evidence. The one that currently seems most promising to me also turns out to have the consequence that epistemic akrasia should occur more frequently than is sometimes supposed. But it also helps us see why this might be OK after all.

SFU Research Masterclass Series presents

My Term Paper:  1962-2015, Nearly Complete:  A Conversation with Dr. Ray Jennings, Professor, Department of Philosophy

Thursday September 24, 2015, 11:30 a.m. - 12:20 p.m., Irmacs Presentation Studio ASB10900

About the Interviewee:  As an undergraduate at Queen's my studies traversed Classics, English Literature, and finally Philosophy as the only department where Logic was offered. I understood virtually nothing else anyone said. My M.A. thesis was about the formalisation of preference. From late September 1965 to early June 1967, I was a doctoral student at the University of London working under Bernard (later Sir Bernard) Williams, whom I understood very well; he left me to work on my own at understanding natural language quantifiers. I arrived at S.F.U. in September 1967 at the age of 25 knowing virtually nothing about Philosophy. Gilbert Ryle, whose journal had published much of my M.A. thesis, visited S.F.U. that autumn and invited me to spend a year at Oxford, where I worked up a readable manuscript from my doctoral thesis. I was dismayed to discover that I didn't understand what Oxford philosophers were talking about, and hung out at the Maths Institute, driving from time to time to Cambridge to spend time with Bernard Williams, by then the Provost of King's. But the more technical aspects of logic had taken hold and I spent the calendar year 1972-73 with the Logic Group at Victoria University, Wellington, N.Z.

The 70's and early 80's were years of collaboration with P.K. Schotch at Dalhousie, and during much of that period and later I was spending half of every year in the School of Maths and Physical Sciences at the University of Sussex. In the early 80's Risto Hilpinen (Turku, Finland) urged me to publish my thesis as an historical curiosity as linguists had developed an interest there. What had begun as a term paper in 1962 yielded The Genealogy Of Disjunction (OUP) in 1994. It soon occurred to me that what I had said about natural connectives represented a small part of a biological story that could be told about the whole of human language. Since then the Biology of Language and The Formal Sciences have taken up all of my research time, and have influenced everything I have taught.

About the Interviewer: Mr. Joe Thompson is a Ph.D. candidate in Psychology at SFU. His goal is to use big data to better understand complex skill learning. As our lives become more computer-mediated, digital records of performance become easier to access. Joe's research uses digital records from a highly competitive and internationally played video game, StarCraft 2. These records are left behind whenever someone plays a game of StarCraft, and contain a timestamped log of every action taken within it. This allows us to collect large samples of second-by-second performance for thousands of individuals in a complex task domain, to verify their skill electronically, and to address general questions regarding the nature of skill development in complex domains. Much of the theoretical foundation for Joe's work was formed during his Master's degree in Philosophy, where Joe studied under Ray Jennings.

***

The main idea of the IRMACS Centre's series SFU Research Masterclass is to have a group of prominent SFU researchers who will, instead of giving an academic lecture on their research topic, tell the story of their research path and the "best practices" and tips they learned along the way - how they came to be interested in the topic, how their research directions have changed over the years, any major shifts in direction, who their collaborators are and how they developed those collaborations.  We invite SFU graduate students and postdoctoral fellows, as well as SFU researchers across disciplines, SFU research personnel, and research grant facilitators to attend the SFU Research Masterclass sessions.  For more information contact Veselin Jungic at vjungic@irmacs.sfu.ca

Academic Year 2014

Nicolas Fillion, Simon Fraser University,  "The surprisingly old origins of modern decision and game theory"
3:30 p.m. Friday, July 17, 2015, WMC 3510

Abstract

The origins of modern game theory, with its emphasis on solution concepts that apply to arbitrary games, are typically associated with the works of von Neumann & Morgenstern in the 1930s and with the works of Nash in the 1950s. However, already in 1713, Part V of the second edition of Pierre Rémond de Montmort's Essay d'analyse sur les jeux de hazard contained correspondence on probability problems between Montmort and Nicolaus Bernoulli. It concludes with the first mixed-strategy solution of a game (called Le Her), which is attributed to their colleague Waldegrave. However, their work has been criticized and its significance downplayed by most historians. I challenge this view on the basis of an additional 44 unpublished letters between them (and Waldegrave) found in the archives in Basel. This correspondence addresses many confusing aspects of their earlier discussion of the concept of solution to strategic games. I will describe this body of correspondence as it relates to the discovery of the concept of mixed strategy equilibria and put it in its historical context. I will also argue that, far from falling short of the conceptual rigour found in modern analyses, their discussion anticipates refinements of Nash equilibria that only gained traction starting in the 1970s.

Robert Adams, Rutgers,  "Leibniz and Pantheism"
Friday, July 10, 2015 at 3:30 p.m.

Abstract

The thought (or charge) that Leibniz's philosophy really implies a pantheism similar to that of Spinoza has been mooted for more than a century.  It surfaced, with considerable impact on German philosophy, in the Pantheism Controversy of the 1780s.  This paper will examine arguments connected with Leibniz's (and other traditional) views on divine knowledge and divine causality.  In addition to Leibniz, Malebranche, Lessing, Mendelssohn, and Schleiermacher will figure in the discussion.

Language at the Interface Conference
24-26 April 2015

Friday @ 470 Wosk Centre

3-5:00 - Peter Carruthers (UMD) "The Causes and Contents of Inner Speech"

5-7:00 - Guillaume Beaulac (Yale) "The Roles of Language and the Architecture of Cognition"

Saturday @ 1600 Harbour Centre

10-12:00 - Wolfram Hinzen (ICREA/UB) "Language with No Interface"

1-3:00 - Anna Papafragou (UDel) "Cross-Linguistic Diversity and Human Cognition"

3:30-5:30 - Friederike Moltmann (CNRS/NYU) "Attitudinal Objects and the Semantics of Attitude Reports"

5:30-7:30 - Jacques Lamarche (UWO) "Language Interface under Combinational Mapping"

Sunday @ 1600 Harbour Centre

10-11:00 - Alexander Malt (Durham) "Grammar at the Edge of Chaos"

Dai Heide, SFU, "Kant on Cosmological Unity and the Unity of Space"
Friday, March 27, 2015, 3:30-5:00 p.m.  Room: WMC 3510

Abstract

From at least 1770 onwards, Kant holds that space is a singular unity. This means, in part, that any two places, or regions of space, are parts of a larger space that encompasses them. Kant often asserts this claim without obvious justification; commentators are accordingly divided over what justification, if any, Kant provides for this claim. One camp holds that the unity of space is simply given in a non-conceptual intuition of space. Another holds that the unity of space is the product of the subject’s conceptually guided synthesis of the sensory manifold. I argue that Kant has a distinct justification for his claim that space is a singular unity. Specifically, I argue that the unity of space is grounded in Kant’s commitment to an anti-Leibnizian cosmology according to which fundamental substances (“noumena” in Kant’s terminology) must stand in relations of mutual interaction in order to constitute a world. Since Kant holds that human subjects are, in themselves, noumena (and thus subject to the requirement that they stand in interactive relations with other noumenal subjects), and since he holds that space is the subject’s form of “receptivity” to other beings, the requirement that mutually interacting noumena constitute a single unified world entails that the subject’s form of receptivity to other existents reflects or “represents” this condition. Thus, for Kant, the unity of space is not fundamentally grounded in the subject’s intellectual activity in relation to the subject’s sensible manifold, nor is it merely a brute given; rather, the unity of the subject’s spatial manifold is grounded in a cosmological condition upon the unity of the created world at the most fundamental level. While this interpretation will be controversial, I defend it from several potential objections, including the objection that it violates Kant’s commitment to epistemic humility regarding things in themselves.

"Hurricane Sandy and Climate Change:  Predictions and Responses" by Adam Sobel

Abstract

Forecasts of Superstorm Sandy's landfall on the northeastern US coast in the days before the storm enabled many life-saving preparations.  How can science-based warnings about disasters to come in the less immediate future, like the coastal flooding predicted by human-induced climate change, spur investment in infrastructure before disaster strikes?

Michael Raven, University of Victoria "Expressing the truth, describing the world"
Friday, February 27, 2015, 3:30-5:00 p.m.  Room:  WMC 3510

Abstract:

Although many areas of inquiry share a general interest in what the world is like, metaphysics is often supposed to be distinctive for its special interest in what the world is fundamentally like.  My aim is to clarify an elusive dispute over how to conceive of this interest.  In a recent symposium on Ted Sider's influential Writing the Book of the World, Kit Fine distinguished two projects in the metaphysics of fundamentality.  The expressive project concerns expressing the truth in the most fundamental terms whereas the descriptive project concerns describing the world in the most fundamental terms.  To illustrate the difference:  given that P is true, 'P' describes the world as well as 'P or Q' even though 'P or Q' expresses something (disjunction) that 'P' does not.  Both Fine and Sider agree that the distinction is significant to the metaphysics of fundamentality; but Fine accepts it whereas Sider rejects it.  My aim is to help clarify this elusive distinction by exploring how the topical notion of ground can be applied to it.  Ground is supposed to be a distinctively metaphysical kind of explanation:  for example, the fact that there is political instability in the Middle East is grounded in (holds in virtue of, is determined by) facts about the activities and attitudes of various people.  I will explore an account of what a statement describes about the world that relies on ground and discuss how it might help clarify the elusive distinction between the two projects.

Becko Copenhaver, Lewis & Clark "Thomas Reid on Aesthetic, Moral and Pathological Perception"
Friday, February 6, 2015, 3:30-5:00 p.m.  Room:  WMC 3510

Abstract

Thomas Reid's theory of perception includes an account of what he calls original perception, by which typical perceivers see colors, feel shapes, hear sounds, smell odors, and taste flavors.  This original perception allows us to represent very basic features in the environment.  But Reid also presents a rich account of what he calls acquired perception.  According to Reid, we are capable of acquiring perceptual sensitivity to features not given in original perception.  For example, we acquire the ability to see distance, size, and shape, though those features are not original to vision.  We also acquire the ability to perceive higher-order properties -- properties like 'being a tomato', or 'being a Pinot Noir'.  When we examine Reid's account of aesthetic and moral perception, we find a similar developmental account of our perceptual capacities.  I examine Reid's theory and draw some general conclusions about what we can learn from it about cases of pathological seeing, for example, cases of implicit bias.

Holly Andersen, SFU "Patterns, Information, Causation"
Friday, November 28, 2014, 3:30-5 p.m.

Abstract: I will present the main elements of a pattern metaphysics for causation, along with some useful information-theoretic tools for analyzing causation that are a consequence of this account. I first elaborate several useful notions such as counterfactual robustness of causal relata and generalize it using the pattern ontology of Dennett (1991). I then defend the metaphysical claim that causation is the collection of information-theoretic relationships between patterns that are instantiated in the causal nexus. This pattern approach reconciles what is strongest and most appealing about physical process theories, with their emphasis on the actuality of causation, with counterfactual theories such as interventionism, with their flexibility with respect to changes in causal relata and relations, but without being a ‘middle way’ approach. Patterns are metaphysically fundamental, and adequate to characterize causation in any possible world, where the nature of the causal nexus may vary between worlds but where causation is always an information-theoretic relationship between patterns instantiated in whatever the nexus is. I then consider some ontological consequences of this view for causation within philosophy of science. This information-theoretic approach to causation as patterns allows us to represent causal relata as volumes in an appropriately defined phase space, over which we can define probability distributions between which we can precisely investigate a number of interesting relationships, such as mutual information, joint entropy, etc. The talk will be oriented for a general philosophical audience - the technical aspects will be kept to a minimum.
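
For reference, here are the standard textbook definitions of the information-theoretic quantities mentioned above, stated for discrete random variables X and Y; on the account described, the relevant probability distributions would be defined over volumes in phase space.

```latex
% Textbook definitions of entropy, joint entropy, and mutual information
% for discrete random variables X and Y.
\begin{align*}
  H(X)   &= -\sum_{x} p(x)\,\log p(x)            && \text{(entropy)} \\
  H(X,Y) &= -\sum_{x,y} p(x,y)\,\log p(x,y)      && \text{(joint entropy)} \\
  I(X;Y) &= \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}
          = H(X) + H(Y) - H(X,Y)                 && \text{(mutual information)}
\end{align*}
```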

 

Ashley Atkins, SFU "Modality as a Window into Cognition"
Friday, November 14, 2014, 3:30-5 p.m.  WMC 3510.

Abstract:

My talk concerns a cluster of natural language expressions that provide us with a window into cognition. These otherwise-eclectic expressions all give rise to modal interpretations—interpretations that concern how an event or state would develop in the absence of barriers to its continuation. Interestingly, however, they give rise to these modal interpretations despite the fact that they do not have linguistically-encoded modal meanings (unlike explicitly modal expressions such as 'must' and 'might'). An explanation of these interpretations, then, tells us about more than the meanings of these expressions: it tells us something about modal cognition, namely, how it comes to be engaged by these expressions and about what its interpretive output is once it is so engaged. We can say, in short, that these expressions reveal not the modal structure of language, but the modal structure of cognition.

Expressions with these characteristics are unfamiliar and the sort of explanation that they call for departs from standard models of explanation in both linguistics and the philosophy of language. After all, it is standardly assumed that the task of providing an analysis of the meaning of an expression and the task of providing an analysis of systematic aspects of its interpretation (particularly those that are context-independent) are one and the same. This raises some issues concerning the intelligibility of such explanations. I want to focus on these issues. In the course of my talk I'll be addressing questions like “Why has it taken so long to recognize the character of these expressions?” “Why should the interpretations to which they give rise be expected?” and “Where else might we find interpretations like these?” The answers to these questions reveal a great deal about the limitations of dominant frameworks for thinking about meaning and interpretation in both linguistics and philosophy as well as something about how they need to be revised.

Michael Anderson (Franklin & Marshall College) "Mining the brain for a new taxonomy of the mind"
Friday, October 24, 2014, 3:30-5 p.m.  Room:  WMC 3510.

Abstract

In the past few years, there have been a number of calls to rethink the categories we use in the psychological sciences (e.g. Anderson 2010; Barrett & Satpute 2013; Lenartowicz et al. 2010; Lindquist 2013; Lindquist & Barrett 2012; Poldrack 2010).  One motivation for this is the growing realization that it is very rare to find a neat 1:1 mapping between a given psychological category and specific tissue within the brain. In fact, several statistical analyses of experiments from large collections of neuroimaging results (Poldrack 2006; Anderson 2010; Anderson, Kinnison & Pessoa 2013) demonstrate that regions of the brain appear to be activated by multiple tasks across diverse task categories. Such findings have prompted questions about the validity of the functional ontology we are using to interpret the neuroscientific data. Thus, for instance, Lenartowicz et al. (2010) performed a multivariate analysis of a number of neuroimaging studies of cognitive control, and found that while some sub-processes of cognitive control (e.g. working memory, response selection, response inhibition) appeared to result in distinct patterns of brain activation, task switching did not. They suggest, in light of this finding, that perhaps “task switching” does not actually name a real psychological process. Other work (Gold et al. 2011; Poldrack, Halchenko & Hanson 2009; Yarkoni et al. 2010) has pointed toward other possible reforms, but there is as yet no consensus on what the right categories are, or what the criteria should be for judging validity. Should we look for categories that maximize local neural selectivity? Offer the best predictions of distributed patterns of activity? What kind of function-structure mapping do we need to do good cognitive neuroscience? More generally, this work raises with new force the old question of how much weight neuroscientific data should have when modeling psychological processes. To what degree is the validity of a psychological category beholden to neuroscientific data?  In this talk I will summarize the debate and consider some of its implications for both scientific and folk psychology.

Anna-Sara Malmgren (Stanford University) "Goodness, Availability, and Argument Structure"
Friday, October 10, 2014, 3:30-5 p.m.  Room:  WMC 3510

Abstract:

On a widely shared conception of inferential justification, an agent is inferentially justified in believing that p only if she has an antecedently justified belief in each non-redundant premise of a good argument for p.  In this talk I focus on a couple of under-explored questions that seriously complicate the application of this conception to cases and that, perhaps, show that we don't understand it very well at all.  First:  what is a good argument?  Second:  what counts as a relevantly complete representation of a given argument form?

Jonathan Quong (University of Southern California) "Rights Against Harm"
Friday September 19, 2014, 3:30-5 p.m.  Room:  WMC 3510

Abstract:

Some philosophers defend a fact-relative view of moral rights against harm:

If B had access to all the morally relevant facts about X-ing, and under these conditions, B's X-ing would constitute an infringement of A's right not to be harmed by B, then if B X's, B infringes A's moral rights regardless of whether B does have access to all the morally relevant facts.

Put differently, proponents of the fact-relative view believe that an agent's imperfect knowledge about what will occur if she performs some act is irrelevant to the question of whether she will infringe another person's moral rights in performing that act.

In this paper I argue that the fact-relative view of moral rights is mistaken.  I illustrate the importance of this conclusion with a discussion of the morality of defensive harm.

 

Academic Year 2013

Sara Protasi (Yale University) "The Varieties of Envy"
Friday, August 29 at 3:30 p.m. Room: Halpern Centre 114

Abstract: In this paper I present a novel taxonomy of envy, according to which there are four kinds of envy: emulative, inert, aggressive and spiteful envy. An inquiry into the varieties of envy is valuable not only to understand it as a psychological phenomenon, but also to shed light on the nature of its alleged viciousness. Envy is one of the so-called deadly sins, but it has been scarcely analyzed by contemporary moral philosophers. Developing a more precise and adequate knowledge of envy's anatomy allows the moralist to come up with the right diagnoses and remedies.

Here is an overview of the paper. The first section introduces the intuition that there is more than one kind of envy, together with the anecdotal and linguistic evidence that supports it. The second section proposes and explains in detail a definition of envy tout court. The third section presents a recurring distinction between behavioral tendencies of envy, which has been explained in two distinct ways, one mostly proposed by psychologists, the other discernible in the philosophical tradition. The fourth section argues that these models of explanation track two variables (focus of concern and obtainability of the good), whose interplay is responsible for the existence of the four envies. The fifth section illustrates four paradigmatic cases, and provides a detailed analysis of the phenomenology, motivational structure, and typical behavioral outputs of each. The paper ends with a brief discussion of the implications of the taxonomy for moral education.

Paul Pietroski (University of Maryland) "Mostly Framing"
Monday, June 23 at 1:30 p.m.  Room:  IRMACS Centre

Abstract:  I will discuss a series of studies that illustrate how experimental methods can help adjudicate between provably equivalent but procedurally distinct specifications of the semantic properties of quantificational/comparative expressions like 'most' and 'more'.  For example, while semanticists often paraphrase (1) with (2) or the formalization (2a),

(1)  Most of the dots are blue.

(2)  The number of blue dots exceeds the number of non-blue dots.

(2a) #{x: x is a blue dot}>#{x: x is a non-blue dot}

there is evidence that (3/3a) better represents how speakers of English understand (1).

(3)  The number of blue dots exceeds the number of dots minus the number of blue dots.

(3a)  #{x: x is a blue dot}>[#{x: x is a dot} - #{x: x is a blue dot}]

I will also argue that an independently motivated emphasis on procedural/algorithmic description in semantics - as opposed to focusing on description-neutral truth conditions, which can be represented in various formats - coheres with observations that logically equivalent sentences, involving 'most' and 'more', can trigger different "framing effects" in a roughly Kahneman/Tversky sense.  In short, linguistic meanings exhibit specific representational formats that are experimentally detectable.
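
The procedural contrast between (2a) and (3a) can be made vivid with a small sketch. The code below is a hypothetical illustration, not material from the talk; it simply shows two verification procedures for (1) that are provably equivalent in truth value while differing in what they count.

```python
# Hypothetical sketch: two provably equivalent but procedurally distinct
# ways of verifying "Most of the dots are blue."
def most_blue_2a(dots):
    """(2a): compare the number of blue dots with the number of non-blue dots."""
    blue = sum(1 for d in dots if d == "blue")
    non_blue = sum(1 for d in dots if d != "blue")
    return blue > non_blue

def most_blue_3a(dots):
    """(3a): compare the number of blue dots with the total minus the blue dots."""
    blue = sum(1 for d in dots if d == "blue")
    return blue > len(dots) - blue

sample = ["blue", "blue", "yellow", "blue", "yellow"]
assert most_blue_2a(sample) == most_blue_3a(sample) == True
```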

This talk is in collaboration with the Cognitive Science and Linguistics Departments.

Samantha Matherne (UC Santa Cruz) "Images and Kant's Theory of Perception"
Friday, May 16, 2014 at 3:30.  Room:  Halpern 114

Abstract:  Recent discussions about Kant's theory of perception have focused primarily on the topic of conceptualism, i.e., whether Kant requires concepts or conceptual capacities for perception.  One of the issues in this debate concerns his analysis of how we perceptually represent an object with multiple properties.  It is often assumed that in order to decide this issue, we must look to Kant's analysis of intuitions.  In this paper, however, I show that it is not intuitions, but rather images that Kant identifies as the relevant representations.  I argue, moreover, that Kant is a conceptualist about images.  However, I also suggest that this does not make Kant a thorough-going conceptualist about perception because he is still a non-conceptualist about intuitions.

Elaine Landry (UC Davis) "Category-Theoretic Mathematical Structuralism"
Friday, March 28, 2014 at 3:30 pm. Room:  WMC 3510

Abstract:  In this talk I will use Reck's (2003) account of Dedekind's mathematical structuralism, particularly his distinction between conceptual/methodological and semantic/metaphysical varieties, to argue for a more helpful distinction between algebraic/Hilbertian and assertory/Fregean mathematical structuralism.  I will then use this to underwrite an account of category-theoretic mathematical structuralism that stands up to the objections of Feferman (1977), Hellman (2003) and Shapiro (2005).

This talk is part of the History and Philosophy of Mathematics Colloquium Series

Elisabeth Camp (Rutgers) "Playing with Concepts:  Logic, Association and Imagination"
Friday, March 21, 2014 at 3:30 pm. Room:  WMC 3510

Abstract:  Recent theorizing about concepts has been dominated by two general models: crudely speaking, a philosophical one on which concepts are arbitrary, rule-governed, re-combinable bits, and a psychological one on which they are associative, holistic, gestalt networks.  I present arguments for thinking that thought operates in both these ways.  But where 'dual systems’ models of cognition emphasize conflicts between these two modes of cognition, I argue that the interplay between them is crucial for the distinctive richness and power of human imagination.

Jim Brown (Toronto)  "Mathematics: Pure vs. Applied"
Friday, March 14 at 3:30.  Room:  WMC 3510

Abstract:  Making a distinction between pure and applied mathematics is commonplace.  Yet, the distinction is very different in the minds of philosophers from the way mathematicians conceive it.  I will explore this difference and its significance for mathematical evidence, focussing on issues such as intuitions, visual reasoning, and attempts to use mathematics to explain things in the physical world.

Tim Caulfield (Alberta) "Who Controls Your Genes, Cells, and Tissue?"
Thursday, March 13, 2014, 6:30 pm. SFU Segal Graduate School of Business, 500 Granville St., Room 1300-1350.

Over the past few decades, billions have been invested in genetic research. The same can be said for stem cell research. We are constantly told that, rightly or not, we are in the midst of a biotechnology revolution – a time when the advances occurring in these exciting fields will revolutionize health care and energize our economy. Despite this massive investment and rhetoric, there is still a great deal of uncertainty regarding the control and ownership of genetic material and human tissue. Indeed, this has become one of the most contested and controversial areas of science policy. Recent advances, including inexpensive whole genome sequencing, seem likely to add further complications. This presentation will review the relevant law and policies and explore what the empirical evidence tells us about public perceptions.

This talk is part of the Institute for Values in Policy and Science (ViPS) Public Lecture Series

Chris Pincock (Ohio State) "Newton, Laplace and Salmon on Explaining the Tides"
Friday, February 28, 2014 at 3:30 pm.  Room:  WMC 3510

Abstract:  Salmon cites Newton's explanation of the tides in support of a causal account of scientific explanation.  In this paper I reconsider the details of how Newton and his successors actually succeeded in explaining several key features of the tides.  It turns out that these explanations depend on elements that are not easily interpreted in causal terms.  Often an explanation is obtained even though there is a considerable gap between what the explanation says and the underlying causes of the phenomenon being explained.  More work is needed to determine the admissible ways in which this gap can be filled.  I use the explanations offered after Newton to indicate two different ways that non-causal factors can be significant for scientific explanation.  In Newton's equilibrium explanation, only a few special features of the tides can be explained.  A later explanation deploys a kind of harmonic analysis to provide an informative classification of the tides at different locations.  I consider the options for making sense of these explanations.

Helen Nissenbaum (NYU) "Does Privacy Depend on Context? A Theory and Some Applications"
Monday, February 24, 2014. Vancouver Campus

This talk is part of the Institute for Values in Policy and Science (ViPS) Public Lecture Series

Abstract:  Most people care about the inappropriate sharing of information rather than the sharing of information itself.  Threats to privacy arise from inappropriate flows of information through the use of computing and information technology.  Living with our technologies involves clarifying what makes sharing appropriate and inappropriate.  Does context matter to what we judge to be private?  Does it matter what was shared?  And with whom?  This talk reveals how these questions have shaped the theory of privacy and illustrates key ideas with practical applications.

David Bellhouse (UWO) "The Roots of Actuarial Science in 18th-Century Britain"
Friday, February 21, 2014 at 3:30 pm. Room:  WMC 3510

This talk is part of the History and Philosophy of Mathematics Colloquium Series

Abstract:  The mathematics for the proper valuation of life annuities and other life-contingent contracts, as well as for the construction of life tables, found its flowering in eighteenth-century Britain. Somewhat paradoxically, social historians have noted a distinct lack of relationship between the actuarial work of the mathematicians in the eighteenth century and the life insurance and annuity industry as it developed at this time. I will examine what may have originally motivated the mathematical work on annuities among some of the leading mathematicians of the day: Abraham De Moivre, James Dodson, Edmond Halley, William Jones and Thomas Simpson. They were part of a circle of mathematicians active in the Royal Society up to about 1760. I will try to demonstrate that the major motivating factor for annuity valuations was connected to issues around the foundation of eighteenth-century British society – property. What may have finally motivated the insurance industry to embrace actuarial mathematics is the bursting of an annuity bubble in the 1770s and the subsequent work of the theologian and mathematician Richard Price, followed by his nephew, the actuary William Morgan.

Kathleen Akins (SFU) "Synaesthesia and learning: A critical review and novel theory"
Friday, December 6, 2013 at 3:30 pm. WMC 3510

Synaesthesia is a phenomenon in which ‘ordinary’ stimuli such as letters, numbers, weekdays and months evoke seemingly unconnected sensory responses—e.g. letters evoke specific colours or personalities, numbers appear in rigid spatial arrangements, and the names of months are arranged around the synaesthete in peri-personal space.

From 2008 to 2011, at Charles University in Prague, Czech Republic, and Simon Fraser University, Burnaby, Canada, we conducted the largest survey of synaesthetic tendencies ever undertaken (N = 11,664). This was the first cross-cultural and cross-linguistic survey of its kind and, given the hurdles, one that is unlikely to be reproduced. Our goal was to understand how—or rather, if—either language and/or literacy training affected the incidence and form of synaesthesia. We chose Czech not merely because a native informant was close to hand (thank you Martin!) but because while the ‘base letters’ of the Czech alphabet are largely identical to the Roman/English alphabet, more or less everything else about Czech is quite different: Czech is an almost perfectly transparent language, with one phoneme for each grapheme and vice versa, and as a result, has a quite different (and regimented) form of literacy training. It also has a system of diacritics, added to the ‘base’ letters of English, which systematically change the Czech phonemes of the relevant base letter. So, here, first, I will present some quite unexpected findings about the influence of language learning on the prevalence of synaesthesia.

That said, the real (aka philosophical) reason for the study was to investigate the role that colour plays in perception and cognition. Recent work in colour vision suggests that colour plays a far more crucial role in human visual perception than had previously been thought. Colour vision is—of course!—for seeing the colours, apples as red, grass as green and the sky as blue. But it is also crucial for discerning almost all of the other visual properties we perceive. At bottom, colour is for seeing more generally. Moreover, almost all forms of synaesthesia (including colour synaesthesia) involve the mapping of categorical perceptual properties—here categories of colours—onto learned categories of cultural artifacts (e.g. red onto the letter ‘A’). But if colour helps us to see, and synaesthetes gain ‘colours’ when they learn to read, then just maybe seeing letters as coloured is beneficial to the task. Perhaps it helps children to recognize and use those strange two-dimensional objects known as ‘letters’. Starting with a brief primer on what colour is ‘good for’, I will present our results, which show that synaesthetic colours do aid in literacy learning and that their role is dynamic, tied to the conceptual/perceptual demands of learning a specific orthography. For philosophers, the payoff is quite interesting: top-down conceptual activity primes bottom-up perceptual processing in a repetitive cycle. As literacy takes hold, this unconscious process maps colour similarity (in 3 dimensions) to various aspects of letter similarity. Thus the letters are ‘coloured’.

Audrey Yap (U Vic) "The History of Algebra's Impact on Philosophy of Mathematics"
Friday, November 29, 2013 at 3:30 pm. WMC 3255

Structuralism in the philosophy of mathematics encompasses a range of views that see structures, such as the entire system of natural numbers, as the proper objects of mathematics, rather than objects like individual natural numbers. This view is a relatively recent one in the history of analytic philosophy---many see a paper by Paul Benacerraf in 1965 as one of the earliest articulations of such a point of view. This paper will look at the way in which earlier mathematical contributions by Emmy Noether, extending work of Richard Dedekind, constitute a transition in mathematics that enables a structural point of view. The assumption is that mathematics itself needed to be conducive to a structural point of view by having the technical resources to treat structures as objects in their own right. Noether's work in algebra is one of the factors giving mathematics those resources. Better-known figures in early 20th century philosophy of math, such as Frege, Russell, and Goedel, are associated with a variety of philosophical viewpoints, but their contributions do not provide the basis for structuralism that Noether's work does.

Christopher Mole (UBC) "Attention to Unseen Objects"
Friday, November 1, 2013 at 3:30 pm. Room WMC 3250

Abstract: This talk will revisit a debate about the relationship between attention and consciousness.  I previously defended the claim that, in order for a person to pay attention to a thing, that person must be conscious of that thing. Recent work puts this claim under new pressure.  I'll be looking at that work, considering ways in which my claim might be modified to accommodate it, and exploring the consequences of abandoning that claim altogether.

Adam D. Moore (Washington) "Privacy, Security, and Surveillance: WikiLeaks, Big Data, and the New Accountability"
Wednesday, October 30, 2013 at 6:30 pm. Harbour Centre, Room 1420-1430.

Abstract: This presentation considers the "new accountability" forced upon corporations and governments by the "big data" movement and information-sharing sites like WikiLeaks. I argue that accessing and sharing sensitive information about individuals, corporations, and governments is, in the typical case, morally suspect: we simply do not owe each other the level of information access promised by "big data" or WikiLeaks. Arguably, at this time individuals are paying the cost of this information-sharing movement. Nevertheless, the wrongness of making the average citizen an information target, subject to the ever-watchful gaze of some big brother, is mitigated by two factors.


First, in democratic societies, and to the extent that the information published is important for accountability, we have a right to know much of the information published. Second, this sort of sharing is forcing a realignment of power. For decades corporations and governments have been able to collect, store, and share information about ordinary citizens while walling off access to their own information. Sharing sites like WikiLeaks are changing this balance of power. And while it is the case that "two wrongs don’t make a right," for this slogan to be true one has to acknowledge the first wrong. Perhaps once we are all information targets, equal in our inability to protect our own informational privacy, we will be in a position to adopt justified information-sharing practices and practices of accountability.

This talk is presented by ViPS, the Institute for Values in Policy and Science and is part of the 2013-2014 Public Lecture Series in Privacy.

David Owen (Arizona) "Reason Alone"
Thursday, October 24, 2013 at 12:30 pm in WMC 3250

Abstract: In all three books of the Treatise, Hume uses the notion of “reason alone” when he wishes to show how limited the powers of reason are. Hume summarizes 1.4.1, “Scepticism with regard to reason”, by saying “the understanding, when it acts alone… entirely subverts itself” (THU 1.4.7.7, SB 267). In 2.3.1, “Of the influencing motives of the will”, he says that “reason alone can never be a motive to any action of the will” (THU 2.3.1, SB 413), and he extends the point in a more accurate fashion a little later when he says “reason alone can never produce any action, or give rise to volition” (THU 2.3.1, SB 414). And in 3.1.1, this point is used as part of one of the arguments that show that moral distinctions are not derived from reason. Recognizing something as virtuous or vicious has “an influence on the actions and affections”. But “reason alone, as we have already prov’d, can never have any such influence… The rules of morality, therefore, are not the conclusions of our reason.” (THU 3.1.1, SB 457) It is even possible to describe the famous negative argument concerning probable reasoning, the problem of induction, in terms of “reason alone”, though as I remember Hume never uses the expression “reason alone” in that context. In this paper, I give a univocal account of what Hume means by “reason alone” in these contexts.

Marleen Rozemond (University of Toronto)  "Early modern consciousness and the mind-body problem"
Friday, October 4, 2013 at 3:30 pm. WMC 3250.

Early modern philosophers paid extensive attention to the relationship between the mental and the physical. In this paper I discuss arguments from Descartes, Leibniz, Cudworth and Samuel Clarke for the claim that bodies can’t think. Their approach is interestingly different from current approaches. The latter focus on the problem of consciousness as it is manifested especially in sensory states. But we will see that this is not so in any of these thinkers. The paper examines what features of the mental these thinkers focus on and on what grounds they argue that bodies can’t think.

Colin Marshall (University of Washington)  "Kant on Intuition and Impenetrability"
Friday, September 20, 2013 at  3:30 pm. WMC 3520.

Abstract:  It is well known that Kant claims that causal judgments, including judgments about forces, must have an a priori basis.  It is less well known that Kant himself claims that we can perceive the repulsive force of bodies (i.e. their impenetrability) through the sense of touch.  Together, these claims present an interpretive puzzle, since they appear to commit Kant to both affirming and denying that we can have intuitions of force.  My first aim is to show that both sides of the puzzle have deep roots in Kant's philosophy.  My second aim is to present what I take to be the most promising approach to resolving the puzzle.

Robert Batterman (Pittsburgh) "Minimal Model Explanations"
Friday, September 13, 2013 at 3:30 pm. Room WMC 3520

Abstract: This paper discusses minimal model explanations, which I argue are quite distinct from the various causal, mechanical, difference-making, etc., strategies prominent in the philosophical literature. I contend that what accounts for the explanatory power of these models is not that they have certain features in common with real systems. Rather, the models are explanatory because there is a detailed story about why a class of systems will all display the same large-scale behavior: the details that distinguish them are irrelevant. This story explains patterns across extremely diverse systems and shows how minimal models can be used to understand real systems.

Academic Year 2012

Kevin Scharp (Ohio State University) “Celsius and Kelvin”
Friday, August 9, 2013 at 3:30 pm in WMC 3255

Abstract: It might seem like the Celsius scale and the Kelvin scale of temperature measurement are similar enough that there is no significant difference between using one versus the other. However, they are distinct kinds of scale, and the move from Celsius to Kelvin is a good example of an important sort of conceptual change. This change can be seen as an increase in a certain kind of precision – a measurement theoretic precision. There are whole classes of sentences that are indeterminate when using a Celsius scale, but are determinate with respect to a more precise scale, like Kelvin. A cost of this increase in precision, and with it an increase in determinacy, is that Kelvin has a wider range of constitutive principles, for example that there is a minimal temperature. The transition from Celsius to Kelvin is one of turning empirical discoveries about the world (e.g. that -273.15 °C is the minimal temperature) into principles implicit in our concepts (e.g. that 0 K is the minimal temperature). The resulting concepts give us more determinate answers to our questions about the world, but they do so at the cost of being defective if our empirical discovery turns out to be wrong. The existence and prevalence of this kind of conceptual change is a counterexample to some entrenched ways of thinking about the relation between our concepts, their possessors, and the world.
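(For reference, the two scales differ only by a fixed offset, which is all the arithmetic the example above relies on:

\[ T_{\text{Kelvin}} = T_{\text{Celsius}} + 273.15, \]

so the empirical minimum of -273.15 °C and the constitutive zero point 0 K pick out one and the same temperature.)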

Alan Wertheimer, Senior research scholar, Department of Bioethics, U.S. National Institutes of Health and Professor of Political Science (emeritus), University of Vermont

Research Ethics and the Interaction Principle
Wednesday, June 19, 6:00-7:45 p.m.

SFU Harbour Centre (downtown Vancouver campus), Room 1600

ABSTRACT: Clinical research typically involves an interaction between an investigator and a subject. In some cases subjects participate because they expect to benefit from participation. Still, it is often suggested that the researcher’s use of the subject – the interaction – generates ethical obligations that go beyond those that inform the subject’s consent. Call this the “interaction principle.” Although much of the literature on research ethics appeals to this principle, it has some paradoxical implications. For example, it implies that it can be ethically worse to interact with someone to her benefit than not to interact at all. And it implies that one can have greater obligations to those whom one has already benefited than to others. This talk shows how this principle is important in research ethics and explores whether it can be defended.

SPEAKER BIO: Alan Wertheimer, PhD, is senior research scholar in the Department of Bioethics at the NIH. He is also professor emeritus of political science at the University of Vermont, where he taught from 1968 to 2005. Dr. Wertheimer is the author of four books (Coercion, Exploitation, Consent to Sexual Relations, and Rethinking the Ethics of Clinical Research). He has held fellowships at the Institute for Advanced Study at Princeton University and the Program in Ethics and the Professions at Harvard University. Prior to joining the NIH, Dr. Wertheimer’s work focused on political philosophy and philosophy of law. His work now focuses on issues in research ethics.

This lecture is sponsored by the UBC W. Maurice Young Centre for Applied Ethics, SFU Philosophy Department, SFU Faculty of Health Sciences, UBC Department of Philosophy, and the Social Science and Humanities Research Council of Canada.

Alexander Logvinenko (Glasgow Caledonian University) “What Colours do Colourblind People See?"
Wednesday, June 19, 2013 at 10:30 am in TASC1 9204W

Dichromatic colour vision is commonly believed to be a reduced form of trichromatic colour vision (referred to as the reductionist hypothesis). In particular, the colour palette of the dichromats is believed to be a part of the colour palette of the trichromats. As the light-colour palette differs from the object-colour palette, the dichromatic colour palettes have been derived separately for light-colours and object-colours in this report. As to light-colours, the results are in line with the widely accepted view that the dichromatic colour palettes contain only two hues. However, the dichromatic object-colour palettes have proved to contain the same six component colours which constitute the trichromatic object-colour palette (yellow, blue, red, green, black, and white). Moreover, all the binary and tertiary combinations of the six component colours present in the trichromatic object-colour palette also occur in the dichromatic object-colour palettes. Yet, only five of the six component colours are experienced by dichromats as unitary (unique) object-colours. The green unitary colour is absent in the dichromatic object-colour palettes. The difference between the dichromatic and trichromatic object-colour palettes arises from the fact that not every combination of the component-colour magnitudes occurs in the dichromatic object-colour palettes. For instance, in the dichromatic object-colour palettes there is no colour with a strong green component colour. Furthermore, each achromatic (black or white) component colour of a particular magnitude is combined with only one combination of the chromatic components. In other words, the achromatic component colours are bound with the chromatic component combinations in dichromats.

Bio:

Dr. Logvinenko received his Ph.D. from Moscow State University in 1984. He is Professor, Department of Life Sciences, School of Health and Life Sciences, Glasgow Caledonian University. He is well known for his extensive work on colour and lightness. During June, he is visiting Brian Funt's Computational Vision Lab in the School of Computing Science. His visit is funded by the Ebco/Eppich endowment.

Related Publications:

Logvinenko, "Object-Colour Manifold, International Journal of Computer Vision, 2013.
Logvinenko, "On the colours dichromats see," Color Research and Application, 2012.
Logvinenko, Funt, & Godau, "Metamer Mismatch Volumes," CGIV 2012.
Logvinenko & Tokunaga, “Colour constancy as measured by least dissimilar matching,” Seeing and Perceiving, 2011.
Logvinenko, "An Object-Colour Space," Journal of Vision, 2009.

Japa Pallikkathayil (Pittsburgh) "The Truth About Deception"
Friday, April 12, 2013 at 3:30 in WMC 2503

Abstract: Many people have held that lying is in some sense worse than other forms of deception.  The prohibition on lying is often thought to be very stringent.  Some have even been tempted to think that it is absolute.  In contrast, the prohibition on other forms of deception seems to be looser. In this paper, I am going to give an account of the moral objection to deception in general. I will explore what, if anything, distinguishes the moral objection to lying in particular. The nature of this objection will depend on how precisely we understand what transpires between a speaker and her audience. I will argue that my account of the moral objection to deception provides a helpful framework for situating competing conceptions of the speaker-audience relationship and identifying their moral implications.

Anthony Laden (Illinois – Chicago) "The Practice of Equality"
Friday, April 5, 2013 at 3:30 in WMC 2532

Abstract: Several decades ago, Catharine MacKinnon asked the wonderful question, “What is an equality question a question of?” In asking her question, she hoped to draw our attention to the fundamental difference between thinking of equality in terms of non-discrimination and equality in terms of non-domination. This paper takes up MacKinnon’s question by reflecting on a certain set of dialectics in discussions of equality in the last several decades, and in particular, by pivoting around some recent work of Rainer Forst. Forst links up a view of justice (and thus equality) as a matter of relationships (and thus of non-domination) with a view of justice in terms of what he calls the right to justification. I suggest that this approach makes clear a certain dialectical progression that points beyond Forst’s work towards what I call an intersubjective conception of justice, and I explore some of the contours of this further position. The position I invite you to consider has implications not only for how we think of equality, and thus how we answer MacKinnon’s question, but how we think of the practice of political philosophy more generally.