Simulations, Models, and Theories: Complex Physical Systems and Their Representations

Eric Winsberg†‡

University of South Florida

Using an example of a computer simulation of the convective structure of a red giant star, this paper argues that simulation is a rich inferential process, and not simply a "number crunching" technique. The scientific practice of simulation, moreover, poses some interesting and challenging epistemological and methodological issues for the philosophy of science. I will also argue that these challenges would be best addressed by a philosophy of science that places less emphasis on the representational capacity of theories (and ascribes that capacity instead to models) and more emphasis on the role of theory in guiding (rather than determining) the construction of models.

†Send requests for reprints to the author, Department of Philosophy, University of South Florida, 4202 East Fowler Avenue, FAO 226, Tampa, FL 33620.

‡Some of the ideas discussed here come from my dissertation, and I am grateful to members of my committee, especially Frederick Suppe, Michael Friedman, and Michael Dickson, for feedback. Earlier versions of this paper were presented at Northwestern University and in Vancouver, and many attendees made helpful comments, especially Arthur Fine, Mathias Frisch, Steven Kellert, and Mauricio Suarez. Some of the research for this project was supported by a postdoctoral fellowship at Northwestern University.

Philosophy of Science, 68 (Proceedings) pp. S442-S454. Copyright 2001 by the Philosophy of Science Association. All rights reserved.

1. Introduction. There are many complex phenomena in the physical world which, from an epistemological point of view, share a curious characteristic. These are systems for which the governing laws of physics are well understood but, due to the complexity of the interactions that develop, the implications of these laws are not. These systems, generally chaotic and nonlinear, are typically studied using the techniques of computer simulation. The term 'computer simulation' is used to describe a wide variety of different applications. This paper will focus on the particular techniques of simulation that are used to study the kinds of systems described above. In other words, I will be talking about the use of computers for modeling very complex physical phenomena for which there already exist good, well-understood theories of the processes underlying the phenomena in question. These computational techniques, often called simulations, have as their aim the understanding and representation of the complex phenomena they model.

Why should a scientific practice that already begins with good theoretical understanding be of interest to philosophers of science? Once we acquire a good theoretical understanding of a phenomenon, must not all of the scientific work that could be of philosophical interest be over? In this paper I will argue to the contrary. I will argue that the scientific practice of simulation poses some interesting and challenging epistemological and methodological issues for philosophers of science. I will also argue that these challenges would be best addressed by a philosophy of science that places less emphasis on the representational capacity of theories (and ascribes that capacity instead to models) and more emphasis on the role of theory in guiding, rather than determining, the construction of models.
First I give an overview of the techniques of simulation I am concerned with in this paper. I focus my overview of the methods of simulation on the role played by a series of different levels of models. In Section 3, I describe the work of a group of astrophysicists using simulation to study the convective structure of red giant stars. In Section 4, I discuss some of the distinct epistemological characteristics that are exhibited by simulation studies and argue that these characteristics are novel to the philosophy of science. I argue that the realization and sanctioning of simulation results relies on many resources that come from outside of the domain of theory, even when the system in question is theoretically well understood. In Section 5, I argue that we should take a fresh look, in the light of an investigation of simulation modeling, at what we take to be the nature of the relationship between scientific theories and their empirical consequences. In Section 6, I draw connections with some recent philosophical work on models. In the last section, I make some concluding remarks.

2. Simulation Techniques. Let me begin by giving a rough sketch of the kinds of techniques that are at issue. As an illustration, let us suppose that we are confronted with a physical system of which we would like to gain a better understanding: a severe storm, a gas jet, or the turbulent flow of water in a basin.¹ The system in question is made up of certain underlying components that behave according to a well-established set of physical laws. We begin with the assumptions that we know what these components are, and that we know the laws that govern them.

1. Compare Wilhelmson et al. 1990 on the simulation of a severe storm, Smarr 1985 and Kaufmann and Smarr 1993 on the simulation of intergalactic gas jets, and Moin and Kim 1997 on simulations of turbulence.

The assumptions we have made about our system allow us to write down a set of partial differential equations. These differential equations represent the exact determination of how the system will evolve through time, as given by the physical model and the laws that describe it. The problem is that when these underlying components of the system (i.e., solid particles, parcels of fluid, etc.) interact in the manner that we suppose they do in our physical system, the mathematical equations of our model have an unfortunate feature. In the types of systems that the simulation modeler is concerned with, it is mathematically impossible to find an analytic solution to these equations; the model is said to be nonintegrable. That is, it is impossible to write down closed form equations (i.e., equations given in terms of known mathematical functions) which represent an exact solution to the set of differential equations and will thereby tell us what the system will do over time.

The approach that the simulationist takes to this problem involves turning differential equations, which relate continuous rates of change over infinitesimal intervals, into difference equations, which relate rates of change over finite, or discrete, intervals. The values provided by these difference equations can then be calculated by a digital computer, from one discrete moment of time to the next. A point that is crucial to my argument in this paper, however, is that simulation is frequently not a simple matter of number crunching.
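Before turning to why, it may help to make the differential-to-difference step concrete. The following is a minimal sketch in Python of the naive version of that step for a simple one-dimensional diffusion equation; the equation, the grid, the periodic boundaries, and all parameter values are illustrative assumptions of mine, not drawn from any of the studies cited above.

```python
import numpy as np

# Differential equation: du/dt = k * d2u/dx2 (one-dimensional diffusion).
# Difference equation:   u_new[i] = u[i] + dt * k * (u[i+1] - 2*u[i] + u[i-1]) / dx**2
# The computer simply steps this rule forward from one discrete moment to the next.

k = 0.1                        # diffusion coefficient (illustrative)
nx, dx, dt = 100, 0.01, 4e-4   # dt chosen so that k*dt/dx**2 <= 0.5 (a stability limit)

u = np.zeros(nx)
u[nx // 2] = 1.0               # an initial "hot spot"

for _ in range(1000):
    # np.roll gives periodic boundaries; this is a discrete stand-in for d2u/dx2
    curvature = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    u = u + dt * k * curvature  # one finite time step replaces true integration
```

Notice that even this toy recipe already imports a consideration, the stability limit on the size of the time step, that comes from the numerical scheme rather than from the physics.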
The reason for this is quite simple. In the study of complex non-linear systems, the full set of differential equations determined by the kind of purely theoretical reasoning I have discussed so far is often both computationally and analytically intractable. In other words, there is usually insufficient time and computer power to crunch through these equations with enough accuracy to be useful. One of the most common reasons that this problem arises is that often, especially in systems involving complex fluid flows, there are motions that occur on very small time and length scales that are very important to the evolution of the system as a whole. Any reasonable attempt simply to crunch through the equations will end up leaving some of these important features behind. The upshot is that if a simulation study is to be successful at accurately representing the motions of a complex non-linear system, many "tricks of the trade" need to be employed. In the end, the process becomes much more complicated than simply a matter of substituting difference equations for differential equations, and numerical calculation for true integration.

Transforming the continuous differential equations of a dynamical model into discrete algebraic equations that can be cranked out by the computer is only the first step of the process. Even though this solves the problem of analytical intractability, the new model must also be made computationally tractable. Simulationists use ad hoc modeling assumptions to help make their computational models more tractable and manageable.

Ad hoc modeling assumptions include such techniques as simplifying assumptions, removal of degrees of freedom, and even substitution of simpler empirical relationships for more complex but also more theoretically founded laws. This model making can be eliminative or creative: the modeling can involve eliminating considerations from the dynamical model, or making up new ones. Sometimes simulationists will omit important factors or influences from their computational models because limitations of computational power make their inclusion impractical. This is what I refer to as eliminative ad hoc modeling. In this case, the simulationist has one of two options: either to assume that the effects of the neglected factor are negligible, or to make use of some sort of empirical "fudge factor" (creative ad hoc modeling) to make up for its absence.

Ad hoc models typically involve some kind of relatively simple mathematical relationship that is designed to approximately capture some physical effect in nature. When appropriately "coupled" to the more theoretical equations of a simulation, they allow the simulation to produce results that are more realistic than they could have been without some consideration of that physical effect.

3. A Model Star. A concrete example will help to clarify the remarks in the previous section. Consider the work conducted by David Porter, Steve Anderson, and Paul Woodward of the University of Minnesota's Laboratory for Computational Science & Engineering (LCSE). Their research focuses on the convective properties of red giant stars using techniques of simulation (Porter, Anderson, and Woodward 1998; Jacobs, Porter, and Woodward 1998). The group's interest in this species of star stems from the fact that these stars are convectively unstable almost all the way through to the core.
In younger stars, heat from nuclear fusion finds its way to the surface primarily through radiative diffusion. Only during the cooler last third of its journey is this heat transported by convective motions of stellar gas. But in a red giant, the complex and turbulent process of convection begins much nearer to the core, and so these stars exhibit particularly complex and unstable motions of fluid.

Modeling the convective movement of energy through this system is correspondingly difficult. Very small changes in temperature, pressure, and density in one part of the system may lead to turbulent vortices elsewhere; small surface eddies can lead to large convective flows. Thus, if the model is to capture large-scale effects with any degree of accuracy, it must take into account the effects that take place on small scales, without the computations overwhelming the computing system.

The basic equations that govern the model are the Euler equations for fluid dynamics. These are relatively simple forms of fluid dynamical equations. They are based on the laws of conservation of mass, momentum, and energy. They ignore viscosity but include the effects of compressibility (Porter and Woodward 1994).

The use of the inviscid Euler equations is an example of eliminative ad hoc modeling. Viscosity is clearly present in the flow of gas in a star and, in fact, contributes to the dynamics in crucial ways. It does so, however, at length scales which are far too small to be tracked by a reasonable computer program. The researchers deal with this problem using what I call creative ad hoc modeling: making up a factor to replace the one that has been removed. Here is how they describe the situation in their own model:

    Viscous effects, which act only on tiny scales unresolvable by the computational grid, were approximated by a carefully formulated numerical viscosity. This viscosity of the numerical scheme dissipates kinetic energy of fluid motion into heat, like the real viscosity of the gas, but on the much larger scales of the computational grid. This numerical viscosity was carefully designed to restrict its dissipative effects to the shortest length scales possible, consistent with accurate representation of the nearly inviscid flow on the longer length scales. (Porter, Anderson, and Woodward 1998)

The researchers made many other modeling assumptions. Since they were primarily interested in the star's envelope, and not the core, the core was treated as a simple heat source, without regard to its internal physics. The model core was also much larger (relative to the envelope) than a real core, in order to allow the core to be closer to spherical without altering the grid structure of the simulation.

The model also needs to track how heat moves through the system via conduction. The modelers assumed that the rate of thermal diffusivity depended only on the gas pressure, and that it could therefore be treated with relative simplicity. Finally, the model needs to account for how energy dissipates from the surface of the star. The physics of this process is in fact quite complicated, but the researchers were able to argue for a much simpler treatment of the problem. They simply used the standard formula for the radiation of a black body and applied it exclusively to those parcels of fluid that, based on their calculated pressures, were likely to be found close enough to the surface to be able to efficiently radiate heat.
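To convey the flavor of such a numerical viscosity, here is a minimal sketch of the same general idea applied to a toy one-dimensional inviscid flow (Burgers' equation). The scheme, the form of the dissipation term, and the coefficient C are illustrative assumptions of mine; they are far removed from the sophisticated piecewise-parabolic method the LCSE group actually uses.

```python
import numpy as np

# Toy inviscid flow: du/dt + u * du/dx = 0 (Burgers' equation, no real viscosity).
# A naive centered-difference scheme for this equation breaks down at steep
# gradients, so we add a made-up "numerical viscosity" in the spirit the
# researchers describe: dissipative only at the shortest resolvable scales.

nx = 200
dx, dt, C = 1.0 / nx, 2e-4, 1.0          # C is a hypothetical tuning coefficient
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = 0.5 + np.sin(2 * np.pi * x)          # initial velocity field (periodic domain)

for _ in range(2000):
    dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    # The artificial viscosity scales as dx**2 * |du/dx|, so it dissipates
    # kinetic energy only where the flow varies at the grid scale, leaving
    # the smooth, well-resolved flow nearly inviscid.
    nu_num = C * dx**2 * np.abs(dudx)
    d2udx2 = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    u = u + dt * (-u * dudx + nu_num * d2udx2)
```

The point of the construction is exactly the one made in the passage quoted above: the dissipation is not derived from the theory of the gas; it is a carefully designed artifact of the computational scheme.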
4. A Distinct Epistemology. Given the complexity of the methodological structure of simulation, it is not surprising that simulationists need to be concerned about how they can justify the conclusions reached in their studies. Even though simulation is fundamentally about replacing analytical solution with calculation (a mere mathematical transformation), the question of the reliability of the results of simulation modeling goes beyond mere worries about the reliability of the calculation, and reaches out to the entire simulation process and the conclusions scientists reach using these techniques.

Computer simulation has a distinct epistemology (Winsberg 1999). In other words, the techniques simulationists use to attempt to justify simulation are unlike anything that usually passes for epistemology in the philosophy of science literature. I would like to focus on three of the unusual features of this epistemology: it is downward, it is autonomous due to a scarcity of data, and it is motley.

The first point may be fairly clear by now, but it is worth making explicit. Typically, to a philosopher of science, epistemological issues arise when we try to justify high level theoretical claims based on low level data or specific observational reports. But simulation is about starting with theory and working your way down. This kind of epistemology is, to the philosopher of science, a curious beast. It is an epistemology that is concerned with justifying inferences from a theory to its application, an inference that most philosophy of science has assumed is deductive and consequently not in need of justification.

The second point is more subtle. Also typical of the notion of epistemology in philosophy of science is that it is founded on comparison. If you want to know if some representational structure is accurate or reliable, compare it to the thing it is meant to represent. But simulation techniques are used precisely when data on the phenomena to be represented are conspicuously sparse. Because of this I describe the epistemology of simulation with a term I borrow from Jeffrey Ramsey: autonomous (Ramsey 1992). Since simulations are used to generate representations of systems for which data are sparse, the transformations they make use of need to be justified internally; that is, the transformations need to be considered well motivated based on their own internal form, and not solely on the basis of what they produce. Simulation requires an epistemology that will guide us in evaluating the trustworthiness of an approximation qua technique, in advance of being able to compare our results with the broad range of the phenomena we might wish to study. In general, the inferential moves made in simulations are evaluated on a variety of fronts, and they can be justified based on considerations coming from theory, from empirical generalizations, from data, or from experience in modeling similar phenomena in other contexts.

The last but probably most important point is that the epistemology of simulation is motley. Even though all simulation modeling of the kind described in this paper fundamentally begins with theory, and even though we think of simulation as an attempt to "solve" the mathematical equations of this theoretical structure, our theoretical knowledge is just one of several ingredients that simulationists use to produce their results. All of these sources and their influences need to be considered when justifying simulation results.
If the influences and possible pitfalls of each element are not properly understood and managed by the simulationist, they may threaten the credibility of simulation results. Managing them, moreover, requires reliance upon an equally diverse range of sources of knowledge and skills. A great deal of this knowledge is not contained in the theoretical knowledge that formed the original basis for the simulation.

For example, in many simulations, it is very important for researchers to keep track of the movement of kinetic energy in the system being simulated. But in any simulation of a fluid system that employs a discrete space-time grid, there is always the need to consider the impact of motions that occur "inside" the cells of the grid. In order to avoid losing energy to these local motions, simulations often keep track of a variable called "sub-grid-scale kinetic energy" that is calculated for each cell. But theory does not dictate what mathematical function should be used to "couple" this variable to the rest of the system. Therefore, the particular choice of how to calculate the interaction between this variable and the others of the simulation needs to be justified on some other basis.
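A schematic sketch may make the point vivid. Below, a sub-grid-scale energy variable is carried for each cell of a toy one-dimensional grid; the production and dissipation terms that couple it to the resolved flow are pure assumptions of mine, chosen for illustration, and that is precisely the point: theory underdetermines this choice.

```python
import numpy as np

# Schematic per-cell bookkeeping of sub-grid-scale (SGS) kinetic energy.
# The coupling terms below (shear-driven production, e**1.5-type dissipation)
# are illustrative assumptions; as argued in the text, theory does not
# dictate them, and real codes must justify such choices on other grounds.

nx, dx, dt = 100, 0.01, 1e-3
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.sin(2 * np.pi * x)        # resolved velocity field (held fixed in this toy)
e_sgs = np.zeros(nx)             # SGS kinetic energy carried for each cell

c_prod, c_diss = 0.1, 1.0        # hypothetical coupling coefficients

for _ in range(500):
    shear = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    production = c_prod * dx**2 * np.abs(shear) ** 3   # energy drained from resolved motions
    dissipation = c_diss * e_sgs ** 1.5 / dx           # SGS energy converted to heat
    e_sgs = e_sgs + dt * (production - dissipation)    # per-cell update, step by step
```

Swap in a different production or dissipation law and the simulation still runs; which law deserves our trust is exactly the question the motley epistemology has to answer.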
5. Models, Theories, and Representations. Once we understand that simulation has its own epistemology, and that this epistemology is downward, autonomous, and motley, I argue that we should take a fresh look at our understanding of the nature of the relationship between scientific theories and their empirical consequences.

In most cases, the equations that form the theoretical basis of systems of interest to simulationists are analytically unsolvable. That is, there is no mathematically expressible function that is the solution to these equations. Therefore, representations of these systems cannot possibly come in the form of linguistic entities derivable from a linguistic theoretical structure. The syntactic view is clearly not going to do the work that we need here. Moreover, deductive inferences, by definition, confer certainty on their conclusions (provided that the premises are true!). The inferences that take place in simulation modeling confer no such thing. At their best, they confer reasonable warrant for believing the conclusions reached, and this only when painstaking steps are taken to ensure success.

It might be argued, however, that while simulation results are not actually deducible from the theory, they do bear a logical relation to the theoretical structure, namely, a relation of semantic containment. Before we consider this possibility, we need to look carefully at what we might be proposing. One thing that we would probably not want to propose is that our models of phenomena themselves actually stand in a logical relation to our theories. Our models are approximate, and they are often rich with detail that we know is not robust.

What one might propose, though, is that if all has gone well, the results of a particular simulation closely resemble an ideal structure that is embedded in the overall structure of the theory. Unfortunately, this ideal structure itself and the exact nature of the embedding relationship are inaccessible to us. We should, nevertheless, be confident that the structure is there and that we have successfully approximated it. The fact that we lack the certainty of deduction is merely a symptom of the fact that there is some chance that we are wrong and that our results do not bear any relation to the ideal structure. In other words, on this proposal, there is some structure that is a logical consequence of our theories, and what we have is something that we strongly believe approximates that structure. Is this a view that we should adopt?

All of this talk about abstract structures that stand in logical, but not strictly deductive, relations ought to make it pretty clear that we should turn to the resources of the semantic view of theories to get a good framework for making this question more precise. The semantic view has been articulated, with substantial variations, by Patrick Suppes (1962), Bas van Fraassen (1970, 1980), and Frederick Suppe (1974, 1989). According to a generalized semantic view, a theory is a cluster of trajectories in some state space, a cluster which specifies the set of allowable state transition trajectories. Thus, according to this view, a theory is a family of models. A model, in this sense, is an abstract mathematical entity; more specifically, such a model is a trajectory in a particular phase space. To specify a theory is to specify a cluster of geometrical objects in a phase space that, in turn, specifies the allowable state transition trajectories for any system under the domain of the theory.

If this is the prevailing philosophical conception of a scientific theory, it becomes natural to ask the following question about the practice of simulation: Do we contribute to our philosophical understanding of the practice of simulation by characterizing it as an attempt to extract information about the structure of the allowed state transition trajectories of a theory, where such trajectories are specified by the underlying laws of the simulation as articulated in their mechanical models? To break this question into manageable parts: Should we think of the laws, equations, and mechanical models which simulations take as their starting point as specifications of phase space trajectories? And should we think of simulation as merely the process of mathematically calculating the phase space trajectories that have already been specified?

I would argue that we should not. Given the rich complexity of the process of deriving warrant for simulation results, and the extent to which this process focuses on elements external to anything we would reasonably include as part of theory, it would be unrealistic to interpret this warranting process as being about the relationship of the results to some formal model. That is, if we think carefully about the epistemological steps that go into warranting the reliability of simulation results, we find that they have everything to do with ensuring that the results match up well with the real world, and little to do with ensuring that they resemble some ideal structure. If we think of simulations (of the kind I present) as attempts to determine which state transition trajectories are the ones that are determined by our laws and mechanical models, then we must conclude that our results are totally unwarrantable, an unsavory conclusion indeed. It is only if we view simulations as attempts to provide, directly, representations of real systems, and not abstract models, that the epistemology of simulation makes any sense. Only the former kind of inference can be warranted with the motley epistemology I have described.
Perhaps more importantly, the picture of simulation that the semantic view of theories proposes obscures one of the most interesting facts about the practice of simulation modeling: simulation is a practice in which having confidence in the models we construct depends on several factors being in place, none of which are guaranteed by our theoretical knowledge. It depends on facts we know about our computers and about our graphical techniques. It depends on the confidence we have in the various ad hoc models we use, confidence we derive from laboratory and observational experience. It depends on our ability to calibrate models against empirical results. And, finally, it depends on the confidence we have in the tacit observing abilities (often acquired in the role of skilled experimenters and observers, as well as in the role of skilled simulators) of simulationists to make judgments about the degrees of resemblance between different classes of images. In short, while we make use of laws, mechanical models, and equations in order to construct our representations of the motions of complex physical systems, these elements alone do not determine, or specify, these motions for us.

6. Mediating Models. It is perhaps worthwhile, at this point, to compare the conclusions we have thus far drawn with some remarks about the semantic view of theories recently made by a group of philosophers interested in the role of "mediating models" (cf. Morgan and Morrison 1999). These philosophers have argued for a rejection of the semantic view of theories based on their account of the role of mediating models in the application of theory to the world. Beginning with Nancy Cartwright's book How the Laws of Physics Lie (1983), this literature has identified the role played by models in making theoretical laws applicable to concrete physical situations.

Mediating models are models that stand between theory and the world. They function by making theory applicable to real-world situations that do not immediately fall under the theory's domain. In a recent paper, Margaret Morrison has identified two facets of mediating models that are crucial for our purposes (Morrison 1998; Suarez 1999). First, mediating models are not derivable from theory; that is, this practice of model making is not theory driven. The second facet Morrison identifies is that mediating models are not necessitated by data. They involve substantial theoretical and conceptual assumptions that have their origin in what we would loosely call theorizing, not data analysis. So, while mediating models are not theory driven, they are a form of theorizing rather than a means of organizing data. As Morrison writes,

    Although they are designed for a specific purpose these models have an autonomous role to play in supplying information, information that goes beyond what we are able to derive from the data/theory combination alone. (Morrison 1998, 67)

Because of these two characteristics (especially the first), several authors, including Cartwright (1994, 358-359), Morrison (1998, 67-69), and Suarez (1999, 171-173), have pointed out that the necessity of mediating models in science spells trouble for the semantic view of theories. If mediating models come from outside of theory and are required in the application of theory in many instances, then an account in which the distinction between model and theory collapses will be inadequate.
Suarez makes the point most succinctly:

    [In the] semantic conception of theories . . . the distinction between theory and model collapses as, according to the semantic view, theories are models; they are really nothing but collections of models. . . . So the contrast between theories and models disappears. (Suarez 1999, 172)

It occurs to me that the immediate move from this realization to a rejection of the semantic conception might seem to some as though it relies on a conflation of two separate notions of "model." It is true that "theories are families of models" has long been a slogan of the semantic conception. It is also true that in order to adequately understand scientific practice, we need to understand mediating models as being separate from theory. It is perfectly possible, however, that theories are families of models (i.e., clusters of trajectories in a particular phase space) whose specification requires models of a different type (what we might call mediating models) in order for a set of laws in a domain to successfully pick out the appropriate set of trajectories. It is meant to be one of the semantic view's strengths that it remains silent about the particular linguistic formulation of theories. If mechanical models are needed to help clarify what the laws are specifying, then so be it. This is what I had in mind when I asked, above: "Should we think of the laws, equations, and mechanical models, which simulations take as their starting point, as specifications of phase space trajectories?"

This rebuttal is perfectly sensible, but I believe it also misses a crucial point. Mediating models are constructed in order to extend theories into new domains of application. In practice, theories provide guidance on how these mediating models should be constructed but do not determine their final form. I believe that this is what Morrison means when she says that mediating models are neither derivable from theory, nor driven by data. The semantic view of theories is to be criticized because it inhibits us from seeing how a theory can guide its own application in an area previously not in its domain.

Quite similarly, I believe, a semantic conception of theories (and other theory driven reconstructions of scientific practice of its ilk) inhibits us from seeing how theory guides, but does not determine, how models of complex systems are constructed. What an examination of the epistemology of simulation shows is that the semantic theory lacks the resources to provide us with an understanding of how theory gets applied in practice, even in situations where the theory is not being extended into new domains of application. That is, even in situations where a phenomenon is theoretically well understood, traditional accounts of the nature of theories obscure the complex relationship that sometimes exists between theory on the one hand, and actual representation of phenomena on the other.

In dealing with very complex systems, we proceed from theoretical knowledge and move to new knowledge and representations of systems that are already theoretically well understood. When it comes to complex systems, we simply cannot bend our theories to our cognitive will; they will not yield results with any mechanical turn of a crank. The models that we need to construct in order to do our science need to be constructed delicately and from as many sources as are available.
Consequently, these models are no mere instantiations of our theoretical structures, though they are the results of a form of calculation; they are rich, physical constructs that mediate between our theories and the world.

7. Conclusion. What are the philosophical lessons to be learned here? I will focus on two which, broadly speaking, are perhaps lessons that are becoming familiar in recent philosophy of science. The first lesson is that we need to move towards a philosophy of science that focuses on concrete models, rather than abstract theories, as the loci of reliable representations of real systems. Understanding how these models get constructed, how they function in their representational capacity, and where they get their reliability is an important challenge facing the philosophy of science.

Furthermore, as Morrison notes, modeling practices in the physical sciences are diverse. "Models are used in a broad field of subjects each with its own particular techniques for model construction [and] even in a discipline like physics its history displays a diversity of models that cannot be encapsulated by one specific characterization" (1998, 81). But in addition to being diverse, modeling techniques are also layered; that is, they are applied one on top of the other.

Perhaps most importantly, we need to understand how theory can play a pivotal role in the construction and sanctioning of models without succumbing to a view in which models are merely subordinate to theory. I can perhaps best articulate this second lesson with reference to the following quotation from Manfred Stöckler's commentary on Nancy Cartwright's "anti-fundamentalism." Stöckler is responding to Cartwright's claim that the laws of nature are "each heavily constricted in the domain of features it rules and each separate and independent from the other" (Cartwright 1998, 23):

    I argue that from failures of constructing concrete models it does not follow much about the range of fundamental laws. Suppose these failures are caused by practical limits to writing down equations and solving them. Such practical problems do not allow us to restrict the set of possible models of quantum mechanics which is determined by the structure of the theory. (Stöckler 1998, 38; original emphasis)

The debate in which Stöckler and Cartwright are engaged is a metaphysical one, and the criteria for its resolution are not obvious at this time. But the second lesson of this paper is this: whatever metaphysical position we hold in that debate, there are important and challenging epistemological and methodological issues in scientific theorizing that will be overlooked by a philosophy of science that sees theories as fully articulated structures and treats calculational problems as merely the result of practical limitations.

REFERENCES

Cartwright, Nancy (1983), How the Laws of Physics Lie. Oxford: Oxford University Press.
(1994), "The Metaphysics of the Disunified World", in PSA 1994, vol. 2. East Lansing: Philosophy of Science Association, 357-364.
(1998), "How Theories Relate: Takeovers or Partnerships?", Philosophia Naturalis 54: 23-34.
Jacobs, Mark, David Porter, and Paul Woodward (1998), "3-D Numerical Models of Red Giant Stars", American Astronomical Society Meeting #193, abstract.
Kaufmann, William and Lawrence Smarr (1993), Supercomputing and the Transformation of Science. New York: Scientific American Library.
Moin, Parviz and John Kim (1997), "Tackling Turbulence with Supercomputers", Scientific American 276(1): 62-68.
Morgan, Mary and Margaret Morrison (1999), Models as Mediators. Cambridge: Cambridge University Press.
Morrison, Margaret (1998), "Modeling Nature: Between Physics and the Physical World", Philosophia Naturalis 54: 65-85.
Porter, David and Paul Woodward (1994), "High-Resolution Simulations of Compressible Convection Using the Piecewise-Parabolic Method", The Astrophysical Journal 93 (Supplement): 309-321.
Porter, David, Steven Anderson, and Paul Woodward (1998), "Simulating a Pulsating Red Giant Star", preprint.
Ramsey, Jeffrey (1992), "An Expanded Epistemology of Approximations", in PSA 1992, vol. 1. East Lansing: Philosophy of Science Association, 154-166.
Smarr, Lawrence (1985), "An Approach to Complexity", Science 228: 403-408.
Stöckler, Manfred (1998), "On the Unity of Physics in a Dappled World: Comment on Nancy Cartwright", Philosophia Naturalis 54: 35-39.
Suarez, Mauricio (1999), "The Role of Models in the Application of Scientific Theories: Epistemological Implications", in M. S. Morgan and M. C. Morrison (eds.), Models as Mediators. Cambridge: Cambridge University Press, 168-196.
Suppe, Frederick (ed.) (1974), The Structure of Scientific Theories. Urbana: University of Illinois Press.
(1989), The Semantic Conception of Theories and Scientific Realism. Urbana: University of Illinois Press.
Suppes, Patrick (1962), "Models of Data", in E. Nagel, P. Suppes, and A. Tarski (eds.), Logic, Methodology and Philosophy of Science. Stanford: Stanford University Press.
van Fraassen, Bas (1970), "On the Extension of Beth's Semantics of Physical Theories", Philosophy of Science 37: 325-338.
(1980), The Scientific Image. Oxford: Oxford University Press.
Wilhelmson, Robert, Brian Jewett, Crystal Shaw, Louis Wicker, Matthew Arrott, Colleen Bushell, Mark Bajuk, Jeffrey Thingvold, and Jerry Yost (1990), "A Study of the Evolution of a Numerically Modeled Severe Storm", The International Journal of Supercomputer Applications 4(2): 20-36.
Winsberg, Eric (1999), "Sanctioning Models: The Epistemology of Simulation", Science in Context 12(2): 275-293.