Sara Bernstein is an Assistant Professor of Philosophy at Duke University. She works mainly on the metaphysics of causation, and on topics at the intersection of causation and ethics. She also has interests in the metaphysics of time and time travel. She was the 2013-2014 Andrew W. Mellon Assistant Professor of Philosophy. She is also an at-large member of the Board of Officers of the American Philosophical Association. She received her Ph.D. in 2010 from the University of Arizona.

What Might Have Been: Causation and Possibility

Sara Bernstein

***

Thank you, Meena, for this excellent series and for inviting me to post.

For a long time I’ve been interested in “weird” types of causation—overdetermination and causation by omission, for example—and what they teach us about causation more generally. I’m also interested in how these topics intersect with ethics and moral responsibility.

***

Billy and Suzy are assassins. But Suzy is no ordinary assassin: having received a huge NSF grant for advanced weaponry, she’s recently developed a “Smart Bullet”[1] that will bring about the demise of its target should another bullet not do so first. (Lest you think this is a farfetched metaphysics example, check this out.) Consider the Smart Bullet in action:

Smart Bullet. Billy has a normal bullet while Suzy has a satellite-guided “smart” bullet poised to kill Victim if Billy’s bullet doesn’t do the job. Suppose that Billy shoots the gun containing his normal bullet and Suzy shoots the gun containing her “smart” bullet. Billy’s bullet kills Victim whereas Suzy’s bullet merely hovers near Victim, but is not called into action. Had Billy’s bullet not killed Victim, Suzy’s bullet would have killed Victim at exactly the same time and in exactly the same way.

There is an intuitive sense in which Billy’s bullet is the cause of Victim’s death and Suzy’s isn’t. But what, exactly, is the relationship between Suzy’s bullet and Victim’s death? A natural answer is that Suzy’s smart bullet makes no actual causal contribution to Victim’s death: it is a mere would-be or “possible” cause of Victim’s death, and thus, it is not a cause of Victim’s death at all.

My current research challenges this assumption—namely, the assumption that “actual causation” is the only kind of causation. I think there are good theoretical reasons to view the relationship between merely possible causes and outcomes—for example, the relationship between Suzy’s Smart Bullet and Victim’s death—as a type of causation.

In recent years, causation theorists have struggled in various ways to deal with merely possible causes. One goal of causal theories has been to cordon off actual from merely possible causes—in our case, to explain what makes the relationship between Billy’s bullet and Victim’s death different from that between Suzy’s (backup) bullet and Victim’s death.

Unfortunately, maintaining the distinction between actual and possible causes has proven supremely difficult. Cases of preemption like the one involving Suzy and Billy have plagued accounts of causation according to which the outcome depends on a particular cause for its occurrence. An event can raise the chance that an outcome will occur without causing the outcome, which has proven problematic for probability-raising accounts of causation. Energy transfer accounts of causation cannot countenance causation by omission involving would-be events, since there is no actual event to or from which energy can be transferred. These theories resort to complicated emendations to maintain or model the distinction between actual and possible causes.

Rather than complicate a theory of causation in order to distinguish between actual and possible causes, I think we should stop trying to distinguish between them altogether. We should broaden the category of causation to include actual and would-be or “possible” causes. Doing so bypasses the many well-known problems that have afflicted theories of causation.

My recent and current work develops the view that actual and possible causation are both species of a common genus of causation: what I call causal relevance. Consider an event c, such as Suzy’s shooting a Smart Bullet at Victim, and an effect e, such as Victim’s death. c is causally relevant to e if c brings about e in a nearby possible world. (For c to bring about e in a world W is for c to be counterfactually sufficient for e in W.) There is such a world: one in which Billy did not fire. As such, causal relevance is a continuum reflected by distance from actuality: the most causally relevant event is the cause of the outcome in the actual world (Billy’s firing), followed by events that, in increasingly distant possible worlds, bring about the outcome. Should we so choose, we can then draw the traditional distinction between actual and possible causes: the actual cause is simply the most causally relevant c. The rest are merely possible causes, but are causes nonetheless.
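The definition of causal relevance can be put a bit more formally. The notation below is an illustrative gloss of my own, not notation from the post: @ names the actual world, d(@, W) a world’s distance (dissimilarity) from actuality, and “c brings about e at W” abbreviates counterfactual sufficiency at W.

```latex
% Toy formalization of causal relevance (illustrative notation only).
% @ = the actual world; d(@, W) = distance of world W from actuality.

% c is causally relevant to e iff some nearby world W is one at which
% c brings about e (i.e., c is counterfactually sufficient for e at W):
\mathrm{CausallyRelevant}(c, e) \iff
  \exists W \,\big[\, \mathrm{Nearby}(W) \wedge \mathrm{BringsAbout}_W(c, e) \,\big]

% One way to represent the continuum: degree of causal relevance
% decreases with distance from actuality.
\mathrm{rel}(c, e) \;\propto\; \frac{1}{1 + d(@, W_c)}, \quad
  W_c = \text{the closest world at which } c \text{ brings about } e

% The "actual cause" is then just the most causally relevant candidate:
\mathrm{ActualCause}(e) \;=\; \arg\max_{c}\ \mathrm{rel}(c, e)
```

In Smart Bullet, Billy’s shot brings about Victim’s death at @ itself (distance zero), while Suzy’s shot brings it about at the nearby world where Billy does not fire, so both are causally relevant, with Billy’s shot maximally so. Note that the inverse-distance measure is just one way to encode the continuum; the view itself commits only to an ordering of worlds by distance, not to any particular metric.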

One techie upshot of this view is that cases normally thought to be preemption—in which one cause brings about the outcome whereas another “backup” cause does not—actually come out as special instances of overdetermination, or cases involving multiple sufficient causes. For if Smart Bullet is considered a cause of Victim’s death in addition to the normal bullet, then there are multiple sufficient causes of Victim’s death.

Now, you might still want to hold fast to a distinction between actual and possible causation. But even skeptics should be swayed by the famous “Thirsty Traveler” case:

Thirsty Traveler. Billy and Suzy are each assassins targeting Thirsty Traveler. Thirsty Traveler has a canteen full of water that she needs to drink for her survival. In an attempt to kill Thirsty Traveler, Billy fills the canteen with poison that kills by dehydration. Suzy, unaware of Billy’s assassination attempt, drains the canteen. Thirsty Traveler tries to drink water from her canteen, but the canteen is empty, and she dies of dehydration.

In this example, neither assassin’s individual actual causal contribution causes Thirsty Traveler’s demise. And yet each assassin’s individual causal contribution is counterfactually sufficient to bring about Thirsty Traveler’s death—a possible cause, but a cause nonetheless. Possible causation provides the best explanation of this case.

Removing a sharp distinction between actual and possible causation has more general consequences. In my paper “A Closer Look at Trumping” (2014), I argue that trumping preemption, often thought to be a distinctive kind of preemption, is actually a case of overdetermination. Here is one such case:

Trumping Preemption. It is a rule of a remote-controlled time-lock safe that at midnight it responds to the first lock command of the day. At 6am, Billy commands the safe to lock. At noon, Suzy commands the safe to lock. At midnight, the safe locks.

Consider another sort of familiar case taken from the free will literature:

Frankfurt Case. Billy is an assassin and Suzy is an evil neuroscientist. A chip Suzy has installed in Billy’s brain will make him shoot and kill Victim at time t if Billy doesn’t do it of his own volition. Billy shoots and kills Victim of his own volition at time t.

Smart Bullet, Trumping Preemption, and Frankfurt Case are united by a common structure: in each of these cases, there is a merely possible or “backup” cause that is sufficient to bring about the outcome in exactly the way that it occurs. I hold that these backup causes are causes simpliciter of each outcome, and that each case is a special kind of overdetermination. Moreover, Thirsty Traveler is also a special kind of overdetermination: overdetermination in which multiple causes are counterfactually sufficient to bring about an outcome.

Holding that possible causes are causes is a radical thesis. But there are several reasons to support this expansion of the category of causation. First, I don’t think that anything metaphysically significant undergirds the distinction between actual and possible causation. Some causal theorists hold that causation is an “oomph” or transfer of energy from cause to effect. According to this view, what distinguishes Billy’s bullet from Suzy’s Smart Bullet is that Billy’s bullet transfers some sort of physical force to Victim, whereas Suzy’s Smart Bullet does not. But this is an overly simplistic picture of causation—one that often fails to capture cases that are intuitively causal. For example, it is easy to imagine a magical spell-casting weapon that impacts its Victim without any physical forces. A less outré example involves particles subject to quantum entanglement, such that spinning one particle up at one end of the universe causes the other particle to spin down at the other end of the universe—an intuitive case of causation, but without transfer of physical force. Rejecting “oomph” or physical-force causation enables us to view actual and possible causation as on a metaphysical par.

Second, there are many cases in which possible causes play roles similar to those of actual causes in explanation, prediction, and moral evaluation. This is particularly so in cases of causation by omission. Consider the recent example of the Air France pilot who failed to increase the airplane’s speed. Intuitively, the pilot’s failure to increase the airplane’s speed caused the plane to go down. In my recent papers “Omissions as Possibilities” (2014) and “Omission Impossible” (forthcoming), I argue that the best way to understand such omissions is as possibilities: an omission is, partly, a merely possible event. The omitted increase in airspeed refers to an event at a world in which the pilot does increase his airspeed. With this idea in mind, consider the following pair of cases:

Button Pressing. If the airline pilot presses a particular button, the plane will malfunction and crash. The pilot presses the button, and the plane crashes.

Omitted Button Pressing. If the airline pilot fails to press a particular button, the plane will malfunction and crash. The pilot fails to press the button, and the plane crashes.

In these cases, the button-pressing and the failure to press the button respectively ensure that the plane crashes; both the button-pressing and the failure to press the button explain the plane’s crash; and the pilot is equally morally blameworthy for the outcome in both cases. These cases suggest that, in cases of causation by omission, would-be events are very cause-like. I hold that we should just go whole-hog and call them causes.

Accepting that possible causation is a type of causation has all kinds of interesting connections to other topics, including topics in ethics. Here are some that I have been thinking about.

* Does abandoning the distinction between actual and possible causation require a rejection of the existence of moral luck? Cases of moral luck are those in which luck makes a difference to an agent’s moral responsibility for an outcome. Paradigmatic examples of moral luck involve different outcomes: the shooter who hits Victim versus the shooter whose bullet was intercepted by a bird. This case is one of resultant luck, or luck involving how things turn out. In traditional cases of resultant luck, moral differentiation between agents rests on causal differentiation between agents: one agent causes a particular bad outcome, such as Victim’s death, and the other agent does not.

But holding that possible causes are causes seems to eliminate moral luck. For even the morally lucky agent—for example, Suzy in Smart Bullet—is causally responsible for Victim’s death. Assuming a relationship between causation and moral responsibility, Suzy is thus morally responsible for Victim’s death as well. (Note that she is not merely blameworthy for the attempt. She is morally responsible for the outcome itself.)

* Here is a puzzle brought out by the following pair of cases:[2]

Victim. Two independently employed assassins, unaware of each other, are dispatched to eliminate Victim. Being struck by one bullet is sufficient to kill Victim. Each assassin shoots, and Victim dies.

Colliding Bullets. Two independently employed assassins, each unaware of the other, are dispatched to eliminate Victim. Each assassin shoots. Discharged alone, either bullet would have been sufficient to kill Victim. But the bullets collide midstream and weaken their trajectory such that both bullets are necessary for Victim’s death. Victim dies from the wound.[3]

The cases differ causally insofar as Victim is a case of causal overdetermination, in which there are multiple sufficient causes for an outcome, whereas Colliding Bullets is normally viewed as a case of joint causation, in which there are multiple necessary causes for an outcome. Another way of formulating this difference is to say that the actual causal contributions of the assassins in (Victim) quantitatively differ from the actual causal contributions of the assassins in (Colliding Bullets).

But the Possible Causation view makes distinguishing between these cases tricky. For according to the Possible Causation view, (Colliding Bullets) is also a kind of causal overdetermination: overdetermination involving the counterfactual causal contribution of each assassin. If what we care about is the fact that each assassin in (Colliding Bullets) would have been individually sufficient to bring about Victim’s death were it not for the other assassin, then both assassins in that case bear full causal responsibility for the outcome—that is, responsibility equal to that of the assassins in (Victim).

The puzzle gets even trickier when we consider whether the assassins in each case differ in terms of moral responsibility. That is: does each assassin’s proportion of moral responsibility for Victim’s death quantitatively differ between Victim and Colliding Bullets? I discuss this and other puzzles at length in my “Proportionality Luck” (ms).

[1] This example is modified from Yablo’s “smart rock” case.

[2] Carolina Sartorio has been thinking about something similar. See here: http://peasoup.typepad.com/peasoup/2013/04/sartorios-intuition-pumps-about-responsibility.html

[3] This example comes from Carolina Sartorio, “Two Wrongs Don’t Make a Right,” Legal Theory 18(4): 473–490 (2012).