When Causes Constrain and Explain

Abstract
Recent philosophical work on explanation explores the notion of “constraints” and the role they play in scientific explanation. An influential account of “explanation by constraint” is provided by Lange (2017), who considers these topics in the context of the physical sciences. Lange’s account contains two main features. First, he suggests that constraints explain by exhibiting a strong form of necessity that makes the explanatory target inevitable. Second, he claims that constraints provide a type of non-causal explanation, because they necessitate their outcomes in a way that is stronger than standard causal laws. This non-causal claim is supported by other work in the field (Green and Jones 2016), and it has helped advance accounts of non-causal explanation. While Lange’s work focuses on constraints in physics, this talk explores constraints in a broader set of scientific fields, namely, biology, neuroscience, and the social sciences. In these domains, scientists discuss developmental, anatomical, and structural constraints, respectively. I argue that these examples capture a type of causal constraint, which figures in a common type of causal explanation in science. I provide an analysis of (1) what it means for an explanatory factor to qualify as a constraint and (2) how we know whether such factors are causal or not. Although Lange’s account does not include causal constraints, I clarify how this work is motivated by and shares similarities with his account. In particular, this work suggests that causal constraints explain the restriction of an outcome, exhibit a strong form of explanatory influence, and figure in impossibility explanations. Work on explanatory constraints contributes to the philosophical literature in a variety of ways. First, it helps shed light on the diverse types of explanatory patterns that we find in science, providing a more realistic picture of scientific explanation and the methods, strategies, and reasoning it involves. Second, appreciating that some explanatory constraints are causal has implications for attributing causal responsibility to parts of a system and for suggesting potential interventions that allow for control. In the context of the social sciences, for example, this has implications for holding social structural factors accountable for outcomes and for suggesting policy-level interventions that bring about desired change.
Abstract ID: PSA2022284
Submission Type
Topic 1
Associate Professor, UC Irvine

Abstracts With Same Type

Abstract ID | Abstract Topic | Submission Type | Primary Author
PSA2022514 | Philosophy of Biology - ecology | Contributed Papers | Dr. Katie Morrow
PSA2022405 | Philosophy of Cognitive Science | Contributed Papers | Vincenzo Crupi
PSA2022481 | Confirmation and Evidence | Contributed Papers | Dr. Matthew Joss
PSA2022440 | Confirmation and Evidence | Contributed Papers | Mr. Adrià Segarra
PSA2022410 | Explanation | Contributed Papers | Ms. Haomiao Yu
PSA2022504 | Formal Epistemology | Contributed Papers | Dr. Veronica Vieland
PSA2022450 | Decision Theory | Contributed Papers | Ms. Xin Hui Yong
PSA2022402 | Formal Epistemology | Contributed Papers | Peter Lewis