Birmingham
Nov 12, 2022, 09:00 AM - 11:45 AM (America/New_York)
Multiscale Modeling Across the Sciences: Tailoring Techniques to Particular Contexts

The aim of this symposium is to generate a more unified, yet pluralistic, framework for thinking about how similarities and differences in scientists' modeling goals across various modeling contexts influence which multiscale modeling techniques are justified in those contexts. To accomplish this, the symposium will bring together scholars at various stages of their careers to compare multiscale modeling approaches in physics, nanoscience, economics, and biology. What we find is that some of the modeling goals and practical constraints that influence multiscale modelers in these fields are common features of many modeling contexts; that is, there are some features that are stable across these cases. However, there are also several unique methodologies that are tailored to the specific pragmatic constraints and modeling goals of particular fields (or types of phenomena). This interdisciplinary analysis of multiscale modeling contexts will improve our understanding of where and why different multiscale modeling approaches are justified.

PSA 2022, Birmingham. Contact: office@philsci.org

Multiscale Modeling in Physics and Materials Science: Methodological Lessons from Representative Volume Elements
Contributed Papers 09:00 AM - 11:45 AM (America/New_York)
This talk will look at a ubiquitous methodology in condensed matter physics and materials science that aims to understand bulk behaviors of many-body systems. The focus is on finding and characterizing structures that exist at scales in between the so-called fundamental or atomic scale and that of the continuum. I will argue that such multiscale techniques provide justification and explanation for the continued use of effective theories in various theoretical contexts. My focus is on the role played by so-called “representative volume elements,” or RVEs, in homogenization theory. At everyday, continuum scales, a material like a steel beam looks reasonably homogeneous. However, if we zoom in, we will begin to see structures that are hidden at everyday, naked-eye length scales. In order to model the important features of the piece of steel at these shorter length scales, scientists employ RVEs. RVEs are statistically representative of features of a material at some particular spatial scale. Importantly, RVEs (1) are scale-relative, that is, the actual characteristic lengths of the structures in an RVE can vary considerably, and (2) are always considered to be continua. These features of RVEs lead to a unified methodological approach for modeling materials as varied as steel, wood, water, and gases, and they provide methodological constraints that guide modeling strategies. I illustrate these features of RVEs by looking at examples where one can determine effective values for material parameters describing bulk behaviors. These include parameters like Young's modulus for elastic materials and transport coefficients such as thermal and electrical conductivity. I will emphasize how little the values of these effective parameters depend on lowest-scale, fundamental features of the systems; in other words, how effective parameters succeed in being autonomous from the fundamental features of the systems.
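As a concrete caricature of the homogenization step described above (this example is mine, not the speaker's), the classic Voigt and Reuss bounds bracket the effective Young's modulus of a two-phase composite using only volume fractions and phase moduli. The phase moduli and volume fractions below are hypothetical illustrative values:

```python
# Illustrative sketch: Voigt and Reuss bounds on the effective Young's
# modulus of a two-phase composite, a simple stand-in for the kind of
# effective-parameter determination that an RVE makes precise.

def voigt_modulus(f1, E1, E2):
    """Upper bound: volume-weighted arithmetic mean (uniform-strain assumption)."""
    return f1 * E1 + (1 - f1) * E2

def reuss_modulus(f1, E1, E2):
    """Lower bound: volume-weighted harmonic mean (uniform-stress assumption)."""
    return 1.0 / (f1 / E1 + (1 - f1) / E2)

# Hypothetical two-phase microstructure: 70% of a stiff phase (210 GPa)
# and 30% of a compliant phase (70 GPa). The true effective modulus of
# the RVE lies somewhere between the two bounds.
E_upper = voigt_modulus(0.7, 210.0, 70.0)
E_lower = reuss_modulus(0.7, 210.0, 70.0)
```

Note that both bounds depend only on coarse, continuum-level descriptors (volume fractions and phase moduli), not on atomic-scale details, which is one face of the autonomy the talk emphasizes.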
Upper-scale phenomena of the sort I will consider in this talk often display a remarkable insensitivity to changes in lower-scale details. This is, of course, a hallmark of effective theories. Using these lessons from RVE modeling techniques, I will further discuss how reductive strategies, and those that emphasize the role of fundamentality in justifying the use of a multiscale modeling technique, ignore the autonomy of effective theories and why ignoring that autonomy inhibits multiscale modeling.
Presenters
Robert Batterman
Speaker, University of Pittsburgh
When Scale Separation Fails: Multiscale Modeling of Nanoscale Nucleation
Contributed Papers 09:00 AM - 11:45 AM (America/New_York)
This joint work between a philosopher and two chemists illustrates the practical scientific need for improved approaches to multiscale modeling. Nucleation models are essential to predicting the composition, structure, and growth rate of materials in nanoscale synthesis, and many models of nucleation exist. Classical nucleation models, for example, aim to predict the rate of nucleation based on the relationship between the thermodynamic properties of surface free energy and volume free energy. Such models employ concepts that describe system-wide phenomena (e.g., surface area and volume) and thus treat nucleation as a bulk, continuous, and system-wide process. Other models of nucleation aim to describe the patterns of formation of the seeds or “nuclei” of the newly formed phase, predicting and explaining how a particular nucleus will grow and whether a new nucleus is more likely to form or an existing nucleus is more likely to continue growing. One example is the LaMer model, which predicts the homogeneous nucleation of a colloid based on the concentration of precursor over time. At low precursor concentrations, formation of nuclei is unfavorable and does not occur until the precursor reaches a critical concentration, at which point a burst nucleation event occurs. After this event, monomer concentration is too low to continue nucleation and the nuclei enter the growth phase. These models employ concepts that describe the nuclei as individual solids, occasionally with internal structure of their own. Such models treat nucleation as an aggregate of individual microscale nucleation events. We argue that in bulk chemistry, physics, and materials science, the success of employing these different types of models jointly is due in part to the high degree of scale separation between the dynamics of the macroscale model and the dynamics of the microscale model.
We contrast this rationale for the success of a modeling strategy with rationales that appeal to one or the other model being the more “fundamental” model of the system and with rationales that aim to draw a relationship of emergence between the two types of models. Then, we use Simon and Millstone’s research program in the thermodynamics and chemical kinetics of nanoparticle formation to raise a modeling challenge: how should nanochemists adapt nucleation models to the nanoscale? Nucleation plays a central role in the formation of a class of nanomaterial known as colloidal metal nanoparticles (CMNPs). Synthesizing CMNPs faces a variety of practical challenges related to the need to keep the particles from growing above the nanoscale, as well as the need to create a group of particles that are all of the same size, shape, and crystal structure. Solving these problems requires the use of nucleation models, but at the nanoscale, there is no longer the same degree of scale separation between macroscale and microscale nucleation models. We conclude by discussing what strategies are available to nanochemists for rationalizing the use of both types of nucleation models.
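The LaMer picture described in the abstract can be caricatured in a few lines of code. This toy simulation is my illustration, not the authors' model; the supply rate, critical concentration, and rate constants are all made-up values chosen only to reproduce the qualitative shape: precursor concentration rises to a critical value, a burst of nucleation occurs, and growth of the existing nuclei then holds the concentration below the threshold.

```python
# Toy, discrete-time caricature of LaMer-style burst nucleation.
# All parameter values are hypothetical and purely illustrative.

def lamer_toy(steps=200, supply=1.0, c_crit=50.0, k_nuc=0.5, k_growth=0.02):
    c = 0.0          # monomer (precursor) concentration
    n_nuclei = 0.0   # cumulative number of nuclei formed
    history = []
    for _ in range(steps):
        c += supply                      # precursor is generated continuously
        if c > c_crit:                   # burst nucleation above the threshold
            burst = k_nuc * (c - c_crit)
            n_nuclei += burst
            c -= burst
        c -= k_growth * n_nuclei * c     # existing nuclei consume monomer (growth)
        c = max(c, 0.0)
        history.append((c, n_nuclei))
    return history

h = lamer_toy()
final_c, final_n = h[-1]
# Qualitative LaMer shape: an initial rise, a burst near c_crit, then a
# growth phase in which no new nuclei form.
```

The point of the toy is only the qualitative shape; at the nanoscale, the paper argues, the clean separation between the burst event and the growth dynamics is exactly what breaks down.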
Presenters
Julia Bursten
University of Kentucky
Zoe Simon
University of Pittsburgh
Jill Millstone
Participant, University of Pittsburgh
Bridging Across Spatial and Temporal Scales: Optimization and Homogenization in Conservation Ecology
Contributed Papers (Scientific Models / Modeling) 09:00 AM - 11:45 AM (America/New_York)
In this paper, I use a number of examples of multiscale modeling in biology to argue that the primary challenge facing these modelers is not how to metaphysically interpret their models, but rather how to use various idealizations to bring the available multiscale modeling techniques to bear on the phenomena of interest. This is particularly true when the aim of the biological modelers is to inform policy decisions concerning epidemics, climate change, and conservation, which involve a wide range of spatial and temporal scales. The ‘best-case scenario’ in these instances of multiscale modeling is when the dominant features of the system can be separated into distinct scales. When this occurs, scientists can effectively model the phenomenon by using modeling techniques designed for those particular scales (and types of processes). One example is the use of optimization techniques to model tradeoffs between the short-term and long-term adaptations of plants and animals to anthropogenic climate change. In these cases, biologists first model the adaptive strategies of individual plants and animals at shorter temporal scales (e.g., hours and days). They then model what would be adaptive for the overall population at longer temporal scales (e.g., generations). For example, when CO2 levels rise, in the short term plants reduce their water use by reducing their stomatal conductance. However, at longer time scales, the models predict increased water use due to increased photosynthetic capabilities, larger leaves, and deeper root depths. Finally, the optimal phenotypic trait (and the one we might want to use interventions to bring about) will be the one that best balances these short-term and long-term benefits. While scale separation is often useful for multiscale modelers, in many cases in the biological (and social) sciences such a clean separation of scales is not possible.
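The short-term/long-term balancing act described above can be sketched as a toy scalar optimization. The fitness functions below are invented for illustration (they are not from the talk): the short-term component favors a low trait value, the long-term component a high one, and the "optimal" trait is the one that best balances the two.

```python
# Toy optimization over a single phenotypic trait x (e.g., a hypothetical
# stomatal-conductance parameter scaled to [0, 1]). Both benefit
# functions are made-up quadratics with different single-scale optima.

def short_term_benefit(x):
    return -((x - 0.2) ** 2)   # short-term water savings favor low x

def long_term_benefit(x):
    return -((x - 0.8) ** 2)   # long-term photosynthetic gains favor high x

# Grid search for the trait value that best balances both time scales.
candidates = [i / 100 for i in range(101)]
best = max(candidates, key=lambda x: short_term_benefit(x) + long_term_benefit(x))
# "best" lands midway between the two single-scale optima (0.2 and 0.8)
```

The structure mirrors the abstract's case: model each time scale separately, then optimize a combined objective across them.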
In these cases, biological modelers have begun borrowing various modeling techniques first deployed in physics. In particular, modelers in spatial ecology have begun using homogenization techniques to model the migration patterns of plants and animals across heterogeneous landscapes. In these cases, various idealizations are introduced in order to model the system as a homogeneous medium at the largest scale (e.g., the whole ecosystem) while taking into account the influences of variations at smaller scales (e.g., slower movement through mountains than fields). This results in a macroscale equation that encodes key features from smaller scales into its parameters and constants. This highly idealized modeling technique enables biologists to incorporate features from across a wide range of scales in much more computationally efficient models that can more easily be used to inform policy decisions about the outcomes of specific interventions. What these two sets of cases show is that the relationships between scales, the available modeling approaches, and scientists' purposes for their models all influence which multiscale modeling technique is best suited to biologists' goals and which idealizations can be justifiably used in order to deploy those modeling techniques.
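To give a sense of what such a macroscale equation encodes, here is the simplest homogenization result (my example, not the paper's): for one-dimensional steady transport through equal-sized patches, the effective large-scale diffusivity is the harmonic mean of the local diffusivities, so slow patches dominate the large-scale rate. The landscape values are hypothetical.

```python
# Minimal sketch of 1-D homogenization: the effective diffusivity across
# equal-sized patches is the harmonic mean of the patch diffusivities.

def effective_diffusivity(local_D):
    """Harmonic mean over equal-sized patches: 1/D_eff = mean(1/D_i)."""
    return len(local_D) / sum(1.0 / d for d in local_D)

# Hypothetical landscape: fast movement through fields (D = 10.0),
# slow movement through mountains (D = 1.0), in equal proportion.
landscape = [10.0, 1.0, 10.0, 1.0]
D_eff = effective_diffusivity(landscape)   # harmonic mean, 20/11 ≈ 1.82
D_avg = sum(landscape) / len(landscape)    # arithmetic mean, 5.5
# The macroscale parameter sits much closer to the slow-patch value.
```

This is the sense in which the macroscale model "encodes key features from smaller scales into its parameters": the single coefficient D_eff carries the small-scale heterogeneity.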
Presenters
Collin Rice
Assistant Professor of Philosophy, Colorado State University
Multiscale Reasoning in Economics
Contributed Papers (Scientific Models / Modeling) 09:00 AM - 11:45 AM (America/New_York)
There is a substantial amount of discussion in the philosophical literature on multiscale modeling in physical and material contexts, but there is less such discussion when it comes to the social sciences. This paper earmarks economics as a promising area for multiscale exploration for a number of reasons. First, there is a sense in which what goes on at one scale does not reduce to what goes on at another, though goings-on at one scale may contribute to goings-on at another. Formal results such as the Sonnenschein-Mantel-Debreu theorem and the questionable success of the microfoundations program provide at least two prima facie reasons to be suspicious of reductionist attempts in economics. The natural next step is to consider how economists use multiple models together. Second, economists in actual practice, such as those at central banks, often really do use multiple models in order to achieve monetary policy aims. The literature is by now so saturated with commentary about the representational capacity of these (often idealized) models that the discussion is frequently divorced from economic practice, and it makes the possibility of a realist interpretation of economics look rather tenuous. This paper shifts its attention to the role of models not just as representational devices, but also as devices for the construction of performative narratives. Models, when deployed in policy analysis, are part of the effort to construct coherent narratives that help guide action. This is an enterprise that often requires different parties to coordinate their expertise and resources. This role requires that models be both explanatory (tools that people can use to ask what-if-things-had-been-different questions or why-questions) and performative (tools that economists can use to effectively enact changes in the world).
Doing so will highlight a number of features of economic models in particular that have escaped much of mainstream philosophical attention, for instance, why it might be that economic models furnish how-actually (as opposed to merely how-possibly) explanations. The multiscale modeling framework, I propose, is one promising way of grounding this conception of model usage. To this end, I examine two case studies. One is explicitly multiscale: integrated computable general equilibrium and microsimulation modeling strategies, with particular attention to income distribution effects. The second is more implicit: I examine the process by which the U.S. Federal Reserve offers policy recommendations via the construction of the Greenbook (now Tealbook) conditional forecasts, which are then distributed at Federal Open Market Committee meetings for discussion. Finally, I suggest that these considerations point to a kind of pragmatic realism as the right attitude to take toward many economic models.
Presenters
Jennifer Jhun
Reviewer, Duke