Abstract
Recent philosophical work on explanation explores the notion of “constraints” and the role they play in scientific explanation. An influential account of “explanation by constraint” is provided by Lange (2017), who considers these topics in the context of the physical sciences. Lange’s account contains two main features. First, he suggests that constraints explain by exhibiting a strong form of necessity that makes the explanatory target inevitable. Second, he claims that constraints provide a type of non-causal explanation, because they necessitate their outcomes in a way that is stronger than standard causal laws do. This non-causal claim is supported by other work in the field (Green and Jones 2016), and it has helped advance accounts of non-causal explanation. While Lange’s work focuses on constraints in physics, this talk explores constraints in a broader set of scientific fields, namely biology, neuroscience, and the social sciences. In these domains, scientists discuss developmental, anatomical, and structural constraints, respectively. I argue that these examples capture a type of causal constraint, which figures in a common type of causal explanation in science. I provide an analysis of (1) what it means for an explanatory factor to qualify as a constraint and (2) how we know whether such factors are causal or not. Although Lange’s account does not include causal constraints, I clarify how this work is motivated by and shares similarities with his account. In particular, this work suggests that causal constraints explain the restriction of an outcome, exhibit a strong form of explanatory influence, and figure in impossibility explanations. Work on explanatory constraints contributes to the philosophical literature in a variety of ways. First, it helps shed light on the diverse types of explanatory patterns that we find in science. This provides a more realistic picture of scientific explanation and the methods, strategies, and reasoning it involves. Second, appreciating that some explanatory constraints are causal has implications for attributing causal responsibility to parts of a system and for suggesting potential interventions that allow for control. In the context of the social sciences, for example, this has implications for holding social structural factors accountable for outcomes and for suggesting policy-level interventions that bring about desired change.