Springer Handbook of Model-Based Science


Magnani, Bertolotti (Editors)


Contents

List of Abbreviations ..... XXXVII

Part A Theoretical Issues in Models

1 The Ontology of Models
Axel Gelfert ..... 5
1.1 Kinds of Models: Examples from Scientific Practice ..... 6
1.2 The Nature and Function of Models ..... 8
1.3 Models as Analogies and Metaphors ..... 10
1.4 Models Versus the Received View: Sentences and Structures ..... 12
1.5 The Folk Ontology of Models ..... 16
1.6 Models and Fiction ..... 18
1.7 Mixed Ontologies: Models as Mediators and Epistemic Artifacts ..... 20
1.8 Summary ..... 21
References ..... 22

2 Models and Theories
Demetris Portides ..... 25
2.1 The Received View of Scientific Theories ..... 26
2.2 The Semantic View of Scientific Theories ..... 36
References ..... 47

3 Models and Representation
Roman Frigg, James Nguyen ..... 49
3.1 Problems Concerning Model-Representation ..... 51
3.2 General Griceanism and Stipulative Fiat ..... 55
3.3 The Similarity Conception ..... 57
3.4 The Structuralist Conception ..... 66
3.5 The Inferential Conception ..... 76
3.6 The Fiction View of Models ..... 83
3.7 Representation-as ..... 91
3.8 Envoi ..... 96
References ..... 96

4 Models and Explanation
Alisa Bokulich ..... 103
4.1 The Explanatory Function of Models ..... 104
4.2 Explanatory Fictions: Can Falsehoods Explain? ..... 108
4.3 Explanatory Models and Noncausal Explanations ..... 112
4.4 How-Possibly versus How-Actually Model Explanations ..... 114
4.5 Tradeoffs in Modeling: Explanation versus Other Functions for Models ..... 115
4.6 Conclusion ..... 116
References ..... 117


5 Models and Simulations
Nancy J. Nersessian, Miles MacLeod ..... 119
5.1 Theory-Based Simulation ..... 119
5.2 Simulation not Driven by Theory ..... 121
5.3 What is Philosophically Novel About Simulation? ..... 124
5.4 Computational Simulation and Human Cognition ..... 127
References ..... 130

Part B Theoretical and Cognitive Issues on Abduction and Scientific Inference

6 Reorienting the Logic of Abduction
John Woods ..... 137
6.1 Abduction ..... 138
6.2 Knowledge ..... 141
6.3 Logic ..... 148
References ..... 149

7 Patterns of Abductive Inference
Gerhard Schurz ..... 151
7.1 General Characterization of Abductive Reasoning and IBE ..... 152
7.2 Three Dimensions for Classifying Patterns of Abduction ..... 154
7.3 Factual Abduction ..... 155
7.4 Law Abduction ..... 158
7.5 Theoretical-Model Abduction ..... 159
7.6 Second-Order Existential Abduction ..... 161
7.7 Hypothetical (Common) Cause Abduction Continued ..... 162
7.8 Further Applications of Abductive Inference ..... 169
References ..... 171

8 Forms of Abduction and an Inferential Taxonomy
Gerhard Minnameier ..... 175
8.1 Abduction in the Overall Inferential Context ..... 177
8.2 The Logicality of Abduction, Deduction, and Induction ..... 183
8.3 Inverse Inferences ..... 185
8.4 Discussion of Two Important Distinctions Between Types of Abduction ..... 189
8.5 Conclusion ..... 193
References ..... 193

9 Magnani’s Manipulative Abduction
Woosuk Park ..... 197
9.1 Magnani’s Distinction Between Theoretical and Manipulative Abduction ..... 197
9.2 Manipulative Abduction in Diagrammatic Reasoning ..... 198
9.3 When Does Manipulative Abduction Take Place? ..... 203
9.4 Manipulative Abduction as a Form of Practical Reasoning ..... 204
9.5 The Ubiquity of Manipulative Abduction ..... 206
9.6 Concluding Remarks ..... 212
References ..... 212


Part C The Logic of Hypothetical Reasoning, Abduction, and Models

10 The Logic of Abduction: An Introduction
Atocha Aliseda ..... 219
10.1 Some History ..... 219
10.2 Logical Abduction ..... 222
10.3 Three Characterizations ..... 225
10.4 Conclusions ..... 228
References ..... 229

11 Qualitative Inductive Generalization and Confirmation
Mathieu Beirlaen ..... 231
11.1 Adaptive Logics for Inductive Generalization ..... 231
11.2 A First Logic for Inductive Generalization ..... 232
11.3 More Adaptive Logics for Inductive Generalization ..... 237
11.4 Qualitative Inductive Generalization and Confirmation ..... 240
11.5 Conclusions ..... 245
11.A Appendix: Blocking the Raven Paradox? ..... 246
References ..... 247

12 Modeling Hypothetical Reasoning by Formal Logics
Tjerk Gauderis ..... 249
12.1 The Feasibility of the Project ..... 249
12.2 Advantages and Drawbacks ..... 251
12.3 Four Patterns of Hypothetical Reasoning ..... 252
12.4 Abductive Reasoning and Adaptive Logics ..... 255
12.5 The Problem of Multiple Explanatory Hypotheses ..... 256
12.6 The Standard Format of Adaptive Logics ..... 256
12.7 LArs: A Logic for Practical Singular Fact Abduction ..... 258
12.8 MLAss: A Logic for Theoretical Singular Fact Abduction ..... 261
12.9 Conclusions ..... 265
12.A Appendix: Formal Presentations of the Logics LArs and MLAss ..... 265
References ..... 267

13 Abductive Reasoning in Dynamic Epistemic Logic
Angel Nepomuceno-Fernández, Fernando Soler-Toscano, Fernando R. Velázquez-Quesada ..... 269
13.1 Classical Abduction ..... 270
13.2 A Dynamic Epistemic Perspective ..... 272
13.3 Representing Knowledge and Beliefs ..... 275
13.4 Abductive Problem and Solution ..... 278
13.5 Selecting the Best Explanation ..... 281
13.6 Integrating the Best Solution ..... 284
13.7 Working with the Explanations ..... 287
13.8 A Brief Exploration to Nonideal Agents ..... 289
13.9 Conclusions ..... 290
References ..... 292

14 Argumentation and Abduction in Dialogical Logic
Cristina Barés Gómez, Matthieu Fontaine ..... 295
14.1 Reasoning as a Human Activity ..... 295


14.2 Logic and Argumentation: The Divorce ..... 297
14.3 Logic and Argumentation: A Reconciliation ..... 299
14.4 Beyond Deductive Inference: Abduction ..... 303
14.5 Abduction in Dialogical Logic ..... 306
14.6 Hypothesis: What Kind of Speech Act? ..... 310
14.7 Conclusions ..... 312
References ..... 312

15 Formal (In)consistency, Abduction and Modalities
Juliana Bueno-Soler, Walter Carnielli, Marcelo E. Coniglio, Abilio Rodrigues Filho ..... 315
15.1 Paraconsistency ..... 315
15.2 Logics of Formal Inconsistency ..... 316
15.3 Abduction ..... 322
15.4 Modality ..... 327
15.5 On Alternative Semantics for mbC ..... 331
15.6 Conclusions ..... 333
References ..... 334

Part D Model-Based Reasoning in Science and the History of Science

16 Metaphor and Model-Based Reasoning in Mathematical Physics
Ryan D. Tweney ..... 341
16.1 Cognitive Tools for Interpretive Understanding ..... 343
16.2 Maxwell’s Use of Mathematical Representation ..... 345
16.3 Unpacking the Model-Based Reasoning ..... 348
16.4 Cognition and Metaphor in Mathematical Physics ..... 350
16.5 Conclusions ..... 351
References ..... 352

17 Nancy Nersessian’s Cognitive-Historical Approach
Nora Alejandrina Schwartz ..... 355
17.1 Questions About the Creation of Scientific Concepts ..... 356
17.2 The Epistemic Virtues of Cognitive Historical Analysis ..... 359
17.3 Hypothesis About the Creation of Scientific Concepts ..... 363
17.4 Conclusions ..... 373
References ..... 373

18 Physically Similar Systems – A History of the Concept
Susan G. Sterrett ..... 377
18.1 Similar Systems, the Twentieth Century Concept ..... 379
18.2 Newton and Galileo ..... 380
18.3 Late Nineteenth and Early Twentieth Century ..... 383
18.4 1914: The Year of Physically Similar Systems ..... 397
18.5 Physically Similar Systems: The Path in Retrospect ..... 408
References ..... 409

19 Hypothetical Models in Social Science
Alessandra Basso, Chiara Lisciandra, Caterina Marchionni ..... 413
19.1 Hypothetical Modeling as a Style of Reasoning ..... 413


19.2 Models Versus Experiments: Representation, Isolation and Resemblance ..... 416

19.3 Models and Simulations: Complexity, Tractability and Transparency ..... 420
19.4 Epistemology of Models ..... 423
19.5 Conclusions ..... 428
19.A Appendix: J.H. von Thünen’s Model of Agricultural Land Use in the Isolated State ..... 429
19.B Appendix: T. Schelling’s Agent-Based Model of Segregation in Metropolitan Areas ..... 430
References ..... 431

20 Model-Based Diagnosis
Antoni Ligęza, Bartłomiej Górny ..... 435
20.1 A Basic Model for Diagnosis ..... 437
20.2 A Review and Taxonomy of Knowledge Engineering Methods for Diagnosis ..... 438
20.3 Model-Based Diagnostic Reasoning ..... 440
20.4 A Motivation Example ..... 440
20.5 Theory of Model-Based Diagnosis ..... 442
20.6 Causal Graphs ..... 444
20.7 Potential Conflict Structures ..... 446
20.8 Example Revisited. A Complete Diagnostic Procedure ..... 448
20.9 Refinement: Qualitative Diagnoses ..... 450
20.10 Dynamic Systems Diagnosis: The Three-Tank Case ..... 454
20.11 Incremental Diagnosis ..... 456
20.12 Practical Example and Tools ..... 458
20.13 Concluding Remarks ..... 459
References ..... 460

21 Thought Experiments in Model-Based Reasoning
Margherita Arcangeli ..... 463
21.1 Overview ..... 464
21.2 Historical Background ..... 467
21.3 What Is a Thought Experiment? ..... 469
21.4 What Is the Function of Thought Experiments? ..... 475
21.5 How Do Thought Experiments Achieve Their Function? ..... 484
References ..... 487

Part E Models in Mathematics

22 Diagrammatic Reasoning in Mathematics
Valeria Giardino ..... 499
22.1 Diagrams as Cognitive Tools ..... 499
22.2 Diagrams and (the Philosophy of) Mathematical Practice ..... 501
22.3 The Euclidean Diagram ..... 503
22.4 The Productive Ambiguity of Diagrams ..... 509
22.5 Diagrams in Contemporary Mathematics ..... 510
22.6 Computational Approaches ..... 515
22.7 Mathematical Thinking: Beyond Binary Classifications ..... 518
22.8 Conclusions ..... 520
References ..... 521


23 Deduction, Diagrams and Model-Based Reasoning
John Mumma ..... 523
23.1 Euclid’s Systematic Use of Geometric Diagrams ..... 524
23.2 Formalizing Euclid’s Diagrammatic Proof Method ..... 525
23.3 Formal Geometric Diagrams as Models ..... 532
References ..... 534

24 Model-Based Reasoning in Mathematical Practice
Joachim Frans, Isar Goyvaerts, Bart Van Kerkhove ..... 537
24.1 Preliminaries ..... 537
24.2 Model-Based Reasoning: Examples ..... 538
24.3 The Power of Heuristics and Plausible Reasoning ..... 540
24.4 Mathematical Fruits of Model-Based Reasoning ..... 542
24.5 Conclusion ..... 546
24.A Appendix ..... 546
References ..... 548

25 Abduction and the Emergence of Necessary Mathematical Knowledge
Ferdinand Rivera ..... 551
25.1 An Example from the Classroom ..... 551
25.2 Inference Types ..... 555
25.3 Abduction in Math and Science Education ..... 561
25.4 Enacting Abductive Action in Mathematical Contexts ..... 564
References ..... 566

Part F Model-Based Reasoning in Cognitive Science

26 Vision, Thinking, and Model-Based Inferences
Athanassios Raftopoulos ..... 573
26.1 Inference and Its Modes ..... 576
26.2 Theories of Vision ..... 577
26.3 Stages of Visual Processing ..... 585
26.4 Cognitive Penetrability of Perception and the Relation Between Early Vision and Thinking ..... 588
26.5 Late Vision, Inferences, and Thinking ..... 591
26.6 Concluding Discussion ..... 596
26.A Appendix: Forms of Inferences ..... 597
26.B Appendix: Constructivism ..... 598
26.C Appendix: Bayes’ Theorem and Some of Its Epistemological Aspects ..... 600
26.D Appendix: Modal and Amodal Completion or Perception ..... 600
26.E Appendix: Operational Constraints in Visual Processing ..... 601
References ..... 602

27 Diagrammatic Reasoning
William Bechtel ..... 605
27.1 Cognitive Affordances of Diagrams and Visual Images ..... 606
27.2 Reasoning with Data Graphs ..... 608
27.3 Reasoning with Mechanism Diagrams ..... 613
27.4 Conclusions and Future Tasks ..... 616
References ..... 617


28 Embodied Mental Imagery in Cognitive Robots
Alessandro Di Nuovo, Davide Marocco, Santo Di Nuovo, Angelo Cangelosi ..... 619
28.1 Mental Imagery Research Background ..... 620
28.2 Models and Approaches Based on Mental Imagery in Cognitive Systems and Robotics ..... 622
28.3 Experiments ..... 624
28.4 Conclusion ..... 635
References ..... 635

29 Dynamical Models of Cognition
Mary Ann Metzger ..... 639
29.1 Dynamics ..... 639
29.2 Data-Oriented Models ..... 641
29.3 Cognition and Action Distinct ..... 644
29.4 Cognition and Action Intrinsically Linked ..... 648
29.5 Conclusion ..... 653
References ..... 655

30 Complex versus Complicated Models of Cognition
Ruud J.R. Den Hartigh, Ralf F.A. Cox, Paul L.C. Van Geert ..... 657
30.1 Current Views on Cognition ..... 658
30.2 Explaining Cognition ..... 660
30.3 Is Cognition Best Explained by a Complicated or Complex Model? ..... 662
30.4 Conclusion ..... 666
References ..... 666

31 From Neural Circuitry to Mechanistic Model-Based Reasoning
Jonathan Waskan ..... 671
31.1 Mechanistic Reasoning in Science ..... 672
31.2 The Psychology of Model-Based Reasoning ..... 673
31.3 Mental Models in the Brain: Attempts at Psycho-Neural Reduction ..... 675
31.4 Realization Story Applied ..... 686
31.5 Mechanistic Explanation Revisited ..... 687
31.6 Conclusion ..... 690
References ..... 690

Part G Modelling and Computational Issues

32 Computational Aspects of Model-Based Reasoning
Gordana Dodig-Crnkovic, Antonio Cicchetti ..... 695
32.1 Computational Turn Seen from Different Perspectives ..... 695
32.2 Models of Computation ..... 697
32.3 Computation Versus Information ..... 700
32.4 The Difference Between Mathematical and Computational (Executable) Models ..... 702
32.5 Computation in the Wild ..... 703
32.6 Cognition: Knowledge Generation by Computation of New Information ..... 706
32.7 Model-Based Reasoning and Computational Automation of Reasoning ..... 709


32.8 Model Transformations and Semantics: Separation Between Semantics and Ontology ..... 712
References ..... 715

33 Computational Scientific Discovery
Peter D. Sozou, Peter C.R. Lane, Mark Addis, Fernand Gobet ..... 719
33.1 The Roots of Human Scientific Discovery ..... 720
33.2 The Nature of Scientific Discovery ..... 721
33.3 The Psychology of Human Scientific Discovery ..... 722
33.4 Computational Discovery in Mathematics ..... 723
33.5 Methods and Applications in Computational Scientific Discovery ..... 725
33.6 Discussion ..... 730
References ..... 731

34 Computer Simulations and Computational Models in Science
Cyrille Imbert ..... 735
34.1 Computer Simulations in Perspective ..... 736
34.2 The Variety of Computer Simulations and Computational Models ..... 739
34.3 Epistemology of Computational Models and Computer Simulations ..... 743
34.4 Computer Simulations, Explanation, and Understanding ..... 750
34.5 Comparing: Computer Simulations, Experiments and Thought Experiments ..... 758
34.6 The Definition of Computational Models and Simulations ..... 767
34.7 Conclusion: Human-Centered, but no Longer Human-Tailored Science ..... 773
References ..... 775

35 Simulation of Complex Systems
Paul Davidsson, Franziska Klügl, Harko Verhagen ..... 783
35.1 Complex Systems ..... 783
35.2 Modeling Complex Systems ..... 785
35.3 Agent-Based Simulation of Complex Systems ..... 789
35.4 Summing Up and Future Trends ..... 795
References ..... 796

36 Models and Experiments in Robotics
Francesco Amigoni, Viola Schiaffonati ..... 799
36.1 A Conceptual Premise ..... 799
36.2 Experimental Issues in Robotics ..... 801
36.3 From Experimental Computer Science to Good Experimental Methodologies in Autonomous Robotics ..... 802
36.4 Simulation ..... 804
36.5 Benchmarking and Standards ..... 807
36.6 Competitions and Challenges ..... 809
36.7 Conclusions ..... 812
References ..... 812

37 Biorobotics
Edoardo Datteri ..... 817
37.1 Robots as Models of Living Systems ..... 817
37.2 A Short History of Biorobotics ..... 825


37.3 Methodological Issues ..... 826
37.4 Conclusions ..... 833
References ..... 834

Part H Models in Physics, Chemistry and Life Sciences

38 Comparing Symmetries in Models and Simulations
Giuseppe Longo, Maël Montévil ..... 843
38.1 Approximation ..... 844
38.2 What Do Equations and Computations Do? ..... 845
38.3 Randomness in Biology ..... 848
38.4 Symmetries and Information in Physics and Biology ..... 849
38.5 Theoretical Symmetries and Randomness ..... 852
References ..... 854

39 Experimentation on Analogue Models
Susan G. Sterrett ..... 857
39.1 Analogue Models: Terminology and Role ..... 858
39.2 Analogue Models in Physics ..... 868
39.3 Comparing Fundamental Bases for Physical Analogue Models ..... 873
39.4 Conclusion ..... 876
References ..... 877

40 Models of Chemical Structure
William Goodwin ..... 879
40.1 Models, Theory, and Explanations in Structural Organic Chemistry ..... 881
40.2 Structures in the Applications of Chemistry ..... 883
40.3 The Dynamics of Structure ..... 885
40.4 Conclusion ..... 889
References ..... 889

41 Models in Geosciences
Alisa Bokulich, Naomi Oreskes ..... 891
41.1 What Are Geosciences? ..... 891
41.2 Conceptual Models in the Geosciences ..... 892
41.3 Physical Models in the Geosciences ..... 893
41.4 Numerical Models in the Geosciences ..... 895
41.5 Bringing the Social Sciences Into Geoscience Modeling ..... 897
41.6 Testing Models: From Calibration to Validation ..... 898
41.7 Inverse Problem Modeling ..... 902
41.8 Uncertainty in Geoscience Modeling ..... 903
41.9 Multimodel Approaches in Geosciences ..... 907
41.10 Conclusions ..... 908
References ..... 908

42 Models in the Biological Sciences
Elisabeth A. Lloyd ..... 913
42.1 Evolutionary Theory ..... 913
42.2 Confirmation in Evolutionary Biology ..... 922
42.3 Models in Behavioral Evolution and Ecology ..... 925
References ..... 927


43 Models and Mechanisms in Cognitive Science
Massimo Marraffa, Alfredo Paternoster ..... 929
43.1 What is a Model in Cognitive Science? ..... 929
43.2 Open Problems in Computational Modeling ..... 940
43.3 Conclusions ..... 948
References ..... 949

44 Model-Based Reasoning in the Social Sciences
Federica Russo ..... 953
44.1 Modeling Practices in the Social Sciences ..... 954
44.2 Concepts of Model ..... 958
44.3 Models and Reality ..... 962
44.4 Models and Neighboring Concepts ..... 963
44.5 Conclusion ..... 967
References ..... 968

Part I Models in Engineering, Architecture, and Economical and Human Sciences

45 Models in Architectural Design
Pieter Pauwels ............................. 975
45.1 Architectural Design Thinking ............................. 976
45.2 BIM Models and Parametric Models ............................. 981
45.3 Implementing and Using ICT for Design and Construction ............................. 984
References ............................. 987

46 Representational and Experimental Modeling in Archaeology
Alison Wylie ............................. 989
46.1 Philosophical Resources and Archaeological Parallels ............................. 990
46.2 The Challenges of Archaeological Modeling ............................. 991
46.3 A Taxonomy of Archaeological Models ............................. 992
46.4 Conclusions ............................. 1000
References ............................. 1000

47 Models and Ideology in Design
Cameron Shelley ............................. 1003
47.1 Design and Ideology ............................. 1003
47.2 Models and Ideology ............................. 1004
47.3 Revivalism: Looking to the Past ............................. 1005
47.4 Modernism: Transcending History ............................. 1006
47.5 Industrial Design: The Shape of Things to Come ............................. 1009
47.6 Biomimicry ............................. 1011
47.7 Conclusion ............................. 1013
References ............................. 1013

48 Restructuring Incomplete Models in Innovators Marketplace on Data Jackets
Yukio Ohsawa, Teruaki Hayashi, Hiroyuki Kido ............................. 1015
48.1 Chance Discovery as a Trigger to Innovation ............................. 1016
48.2 Chance Discovery from Data and Communication ............................. 1016


48.3 IM for Externalizing and Connecting Requirements and Solutions ............................. 1020
48.4 Innovators Marketplace on Data Jackets ............................. 1022
48.5 IMDJ as Place for Reasoning on Incomplete Models ............................. 1023
48.6 Conclusions ............................. 1029
References ............................. 1029

49 Models in Pedagogy and Education
Flavia Santoianni ............................. 1033
49.1 Pluralism ............................. 1034
49.2 Dialecticity ............................. 1039
49.3 Applied Models ............................. 1042
49.4 Conclusions ............................. 1048
References ............................. 1048

50 Model-Based Reasoning in Crime Prevention
Charlotte Gerritsen, Tibor Bosse ............................. 1051
50.1 Ambient Intelligence ............................. 1053
50.2 Methodology ............................. 1054
50.3 Domain Model ............................. 1055
50.4 Analysis Model ............................. 1058
50.5 Support Model ............................. 1060
50.6 Results ............................. 1060
50.7 Discussion ............................. 1062
References ............................. 1062

51 Modeling in the Macroeconomics of Financial Markets
Giovanna Magnani ............................. 1065
51.1 The Intrinsic Instability of Financial Markets ............................. 1066
51.2 The Financial Theory of Investment ............................. 1071
51.3 The Financial Instability Hypothesis Versus the Efficient Markets Hypothesis ............................. 1074
51.4 Irving Fisher’s Debt-Deflation Model ............................. 1074
51.5 Policy Implications and the Shareholder Maximization Value Model ............................. 1079
51.6 Integrating the Minskyian Model with New Marxists and Social Structure of Accumulation (SSA) Theories ............................. 1085
51.7 Risk and Uncertainty ............................. 1086
References ............................. 1098

52 Application of Models from Social Science to Social Policy
Eleonora Montuschi ............................. 1103
52.1 Unrealistic Assumptions ............................. 1105
52.2 Real Experiments, Not Models Please! ............................. 1110
52.3 Conclusions ............................. 1115
References ............................. 1116

53 Models and Moral Deliberation
Cameron Shelley ............................. 1117
53.1 Rules ............................. 1118
53.2 Mental Models ............................. 1119
53.3 Schemata ............................. 1121


34. Computer Simulations and Computational Models in Science

Cyrille Imbert

Computational science and computer simulations have significantly changed the face of science in recent times, even though attempts to extend our computational capacities are by no means new and computer simulations are more or less accepted across scientific fields as legitimate ways of reaching results (Sect. 34.1). Also, a great variety of computational models and computer simulations can be met across science, in terms of the types of computers, computations, computational models, or physical models involved, and they can be used for various types of inquiries and in different scientific contexts (Sect. 34.2). For this reason, epistemological analyses of computer simulations are for a great part contextual. Still, computer simulations raise general questions regarding how their results are justified, how computational models are selected, which type of knowledge is thereby produced (Sect. 34.3), or how computational accounts of phenomena partly challenge traditional expectations regarding the explanation and understanding of natural systems (Sect. 34.4). Computer simulations also share various epistemological features with experiments and thought experiments; hence the need for transversal analyses of these activities (Sect. 34.5). Finally, providing a satisfactory and fruitful definition of computer simulations turns out to be more difficult than expected, partly because this notion is at the crossroads of difficult questions like the nature of representation and computation or the success of scientific inquiries (Sect. 34.6). Overall, a pointed analysis of computer simulations requires, in parallel, developing insights about the evolving place of human capacities and humans within (computational) science (Sect. 34.7).

34.1 Computer Simulations in Perspective ............................. 736
34.1.1 The Recent Philosophy of Scientific Models and Computer Simulations ............................. 736
34.1.2 Numerical Methods and Computational Science: An Old Tradition ............................. 737
34.1.3 A More or Less Recent Adoption Across Scientific Fields ............................. 738
34.1.4 Methodological Caveat ............................. 738
34.2 The Variety of Computer Simulations and Computational Models ............................. 739
34.2.1 Working Characterization ............................. 739
34.2.2 Analog Simulations and Their Specificities ............................. 740
34.2.3 Digital Machines, Numerical Physics, and Types of Equivalence ............................. 741
34.2.4 Non-Numerical Digital Models ............................. 741
34.2.5 Nondeterministic Simulations ............................. 742
34.2.6 Other Types of Computer Simulations ............................. 742
34.3 Epistemology of Computational Models and Computer Simulations ............................. 743
34.3.1 Computer Simulations and Their Scientific Roles ............................. 743
34.3.2 Aspects of the Epistemological Analysis of Computer Simulations ............................. 744
34.3.3 Selecting Computational Models and Practices ............................. 746
34.3.4 The Production of ‘New’ Knowledge: In What Sense? ............................. 748
34.4 Computer Simulations, Explanation, and Understanding ............................. 750
34.4.1 Traditional Accounts of Explanation ............................. 751
34.4.2 Computer Simulations: Intrinsically Unexplanatory? ............................. 751
34.4.3 Computer Simulations: More Frequently Unexplanatory? ............................. 752
34.4.4 Too Replete to Be Explanatory? The Era of Lurking Suspicion ............................. 754
34.4.5 Bypassing the Opacity of Simulations ............................. 757
34.4.6 Understanding and Disciplinary Norms ............................. 758
34.5 Comparing: Computer Simulations, Experiments and Thought Experiments ............................. 758
34.5.1 Computational Mathematics and the Experimental Stance ............................. 759
34.5.2 Common Basal Features ............................. 759
34.5.3 Are Computer Simulations Experiments? ............................. 762
34.5.4 Knowledge Production, Superiority Claims, and Empiricism ............................. 765
34.5.5 The Epistemological Challenge of Hybrid Methods ............................. 767


Part G Modelling and Computational Issues

34.6 The Definition of Computational Models and Simulations ............................. 767
34.6.1 Existing Definitions of Simulations ............................. 768
34.6.2 Pending Issues ............................. 770
34.6.3 When Epistemology Cross-Cuts Ontology ............................. 773
34.7 Conclusion: Human-Centered, but No Longer Human-Tailored Science ............................. 773
34.7.1 The Partial Mutation of Scientific Practices ............................. 774
34.7.2 The New Place of Humans in Science ............................. 774
34.7.3 Analyzing Computational Practices for Their Own Sake ............................. 774
34.7.4 The Epistemological Treatment of New Issues ............................. 775
References ............................. 775

For several decades, much of science has been computational, that is, scientific activity in which computers play a central and essential role. Still, computational science is larger than the set of activities resorting to computer simulations. For example, experimental science, from vast experiments in nuclear physics at the European Organization for Nuclear Research (CERN) to computational genomics, relies heavily on computers and computational models for data acquisition and treatment, but does not seem to involve computer simulations proper. In any case, there is a great and still proliferating variety of types of computer simulations, which are used for different types of inquiries and in different types of theoretical contexts. For this reason, one should be careful when describing the philosophy of computer simulations, and unjustified generalizations should be avoided. At the same time, how much the development of computer simulations has been changing science is a legitimate question. Computer simulations raise questions about the traditional conceptualization of science in terms of experiments, theories, and models; about the ways that usual scientific activities like predicting, theorizing, controlling, or explaining are carried out with the help of these new tools; and, more generally, about how the production of scientific knowledge by human creatures is modified by computer simulations. Importantly, while the specific philosophical analysis of computer simulations is recent (even if it was preceded by the development of the philosophical study of scientific models) and computational science is a few decades old, the development of computational tools and mathematical techniques aimed at bypassing the complexity of problems belongs to a much older tradition. This means that claims about how much computer simulations change science, and how much closer attention to computer simulations should change our picture of scientific activity, are to be treated with circumspection.

34.1 Computer Simulations in Perspective

When discussing philosophical and epistemological issues related to computational models and computer simulations, different chronologies should be kept in mind. The blossoming of the philosophy of models and simulations within the philosophy of science is recent (Sect. 34.1.1). The development of techniques aimed at extending our inferential and computational powers corresponds to a longer trend, even if the recent invention of powerful digital machines has changed the face of computational science (Sect. 34.1.2). Finally, the acceptance of computer simulations as legitimate scientific tools proceeds at various paces across the different fields (Sect. 34.1.3). This means that, even if computer simulations do change the face of science, much care is needed when analyzing which aspects of science are actually changed, and how we should modify our picture of science when we adopt a computer simulation-based perspective (Sect. 34.1.4).

34.1.1 The Recent Philosophy of Scientific Models and Computer Simulations

While the use of computer simulations in the empirical sciences, in particular physics, developed after the construction of the ENIAC computer during World War II [34.1], and started changing how the empirical sciences were practiced, for decades computer-related discussions among philosophers were primarily focused on the development of artificial intelligence and the analysis of human cognition. Particularly active were debates in the philosophy of mind regarding the computational theory of the mind, that is, whether the mind can be likened to a digital computer, and in particular to a classical machine employing rules and symbolic representations [34.2–6]. However, within mainstream philosophy of science, continued interest in computational science, computational models, and digital simulations of empirical systems as such did not really start until the early 1990s, with articles by Humphreys [34.7, 8], Rohrlich [34.9] or Hartmann [34.10]. (Such a description of the field is necessarily unfair to earlier works about the use of computer simulations in the empirical sciences. Particular mention should be given to the works of Bunge [34.11] and Simon [34.12].) An article by Hughes about investigations of the Ising model [34.13], a special issue of Science in Context edited by Sismondo [34.14], and works by Winsberg [34.15–17], who completed his Ph.D. in 1999 on computer simulations, also contributed to the development of this field. Finally, in 2006, the Models and Simulations Conference took place, the first of what was to become a still active conference series, which has contributed to making computational science one of the fields of philosophy of science.

Philosophical works about scientific models, a very close field, were not significantly older. The importance of the notion of a set-theoretic model had been emphasized by partisans of the model-theoretic view of theories in the 1970s, but, if one puts aside works by pioneers like Black [34.18] or Hesse [34.19], this did not launch investigations of scientific models proper. Overall, the intense epistemological study of models did not start until the 1980s, with in particular a seminal article by Redhead about scientific models in physics [34.20]. Members of the Stanford School also argued against the view that science is unified and that theories play a dominant role in scientific activities such as the selection and construction of models [34.21], and conversely emphasized the autonomy of experimental and modeling practices. This context was appropriate for an independent investigation of the role of models in science, which bloomed at the end of the 1990s [34.22] and was further fed by a renewal of interest in the question of scientific representation [34.23–25]. These investigations of models paved the way for new studies focused neither on theories nor on experiments. However, while the difficulty of exploring a model was already acknowledged in works by Redhead and Cartwright, interest in the actual modes of its exploration, in particular by computer simulations, was not yet triggered. Indeed, the focus remained on the effects of the complexity of the inquiry on scientific representations, with studies about simplifications, approximations, or idealizations (even Laymon’s 1990 paper [34.26], in spite of its apparent focus on computer simulation, mainly deals with the nature of approximation and what it is to accept or believe a theory), or on how to articulate the model-theoretic view of theories with the uses of models and representations in actual scientific practices, by taking into account scientific users, qua intentional cognitive creatures [34.27, 28], and their cognitively constrained ways of handling models by means of inferences, graphs, pictures or diagrams (Kulvicki [34.29]; Giardino Chap. 22, this volume; Bechtel Chap. 27, this volume). Overall, in spite of the close connection within scientific practice between the uses of models and their computational explorations, the issue of computational models and computer simulations was not clearly seen as a fruitful field of inquiry in its own right, this trend of thought being explicitly and vividly brought to the fore in 2008 in a deliberately provocative paper by Frigg and Reiss [34.30].

34.1.2 Numerical Methods and Computational Science: An Old Tradition

The second relevant chronology is that of the advancement of attempts to solve complex mathematical problems by developing computing machines and mathematical methods. Importantly, while the development of digital computers in the mid-twentieth century changed the face of scientific computation, humans did not wait for this decisive breakthrough to extend their mathematical and computational powers. Further, as Mahoney wrote, “the computer is not one thing, but many different things, and the same holds true of computing” [34.31], and it is only in the twentieth century that different historical strands related to logic, mathematics, or technologies came together. On the one hand, early mathematical procedures, like Newton’s method to find the roots of real-valued functions or Euler’s method to solve ordinary differential equations, were developed to provide numerical approximations for problems in numerical analysis. This field was already important for investigating physical systems but, with the advent of digital computers, it became a crucial part of (computational) science. On the other hand, mechanical calculating tools, such as abacuses or slide rules, were used from antiquity through the centuries. The invention by Pascal of a device (the Pascaline) to perform additions and subtractions, and the conceptualization by Babbage of mechanical computing systems fed by punched cards, were important additional steps. Human computers were also used. For example, in 1758, Clairaut predicted the return of Halley’s comet by dividing the computational work with other colleagues [34.32]. Gaspard de Prony produced logarithm and trigonometric tables in the early nineteenth century by dividing the computational tasks into elementary operations, which were carried out by unemployed hairdressers with little education. Human computers were used during World War I to compute artillery tables and during World War II to help with the Manhattan project [34.33, 34]. Mechanical analog computers were developed for scientific purposes by engineers and scientists like Thomson or Kelvin in the late nineteenth century, Vannevar Bush between the two World Wars, and Enrico Fermi in 1947, and such computers were used till the 1960s. Finally, even in the digital era, new technological change can have a large impact. For decades, access to computational resources was difficult and only possible in the framework of big projects. Typically, Schelling’s first simulations of residential segregation [34.35] were made by hand. An important recent step has been the development of personal computers, which has brought more flexibility and may have triggered the development of new modeling practices [34.36].
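Procedures of this kind predate any machine. As a minimal sketch (the example function, starting point, and tolerance are illustrative choices, not from the chapter), Newton's method iterates x_{n+1} = x_n − f(x_n)/f′(x_n) to approximate a root of a real-valued function:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Approximate a root of f by Newton's method.

    f: the function, df: its derivative, x0: initial guess.
    Stops when the last correction step is smaller than tol.
    """
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Root of f(x) = x^2 - 2, i.e., an approximation of sqrt(2):
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

Run by hand, as eighteenth-century human computers did, each iteration is one elementary arithmetic task; run on a digital machine, the same procedure becomes a routine step of numerical analysis.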

34.1.3 A More or Less Recent Adoption Across Scientific Fields

The development of computational science and the use of computational models and simulation methods vary from one field to another. From the 1940s onward, computer simulations have been used in physics, and computers were also used in artificial intelligence as early as the late 1950s. However, some fields have resisted such methods, and still do, as far as commonly accepted mainstream methods are concerned. Typically, the development of computational models and computer simulations in the human and social sciences, with the possibility of analyzing diachronic interactions between agents (versus static models describing equilibria), is much more recent. As emphasized earlier, Schelling’s initial dynamic model of segregation was first run manually in 1969. Attempts to use computational science to predict social and economic behavior were globally met with suspicion in the 1960s and 1970s, all the more since these studies were often carried out by scholars who did not belong to well-entrenched traditions in these fields (such as scientists studying complexity, including human behavior, in institutions like the Santa Fe Institute). Overall, in economics, computer simulations are still not accepted [34.37]. Similarly, the development of a specific (and still somewhat distinct) subfield using computational methods to analyze social phenomena is recent, as witnessed by the edition by Hegselmann et al. of the volume Modelling and Simulation in the Social Sciences from the Philosophy of Science Point of View [34.38], the need felt in 1998 to create the Journal of Artificial Societies and Social Simulation, and the publication in 2005 of the handbook Simulation for the Social Scientist by Gilbert and Troitzsch [34.39].
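Schelling's hand-run model is simple enough to reproduce in a few lines. The sketch below is my own minimal rendering, not Schelling's original setup: grid size, vacancy rate, and satisfaction threshold are illustrative assumptions. Any agent whose fraction of like-group neighbors falls below the threshold relocates to a random empty cell:

```python
import random

def schelling_step(grid, empty, threshold=0.3):
    """One sweep: every agent whose fraction of like-group neighbors
    is below `threshold` relocates to a randomly chosen empty cell.
    `grid` maps (row, col) -> group label; `empty` lists free cells.
    Returns the number of agents that moved."""
    moved = 0
    for cell, group in list(grid.items()):
        r, c = cell
        neighbors = [grid[(r + dr, c + dc)]
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0) and (r + dr, c + dc) in grid]
        if neighbors and sum(g == group for g in neighbors) / len(neighbors) < threshold:
            target = empty.pop(random.randrange(len(empty)))
            del grid[cell]
            grid[target] = group
            empty.append(cell)
            moved += 1
    return moved

# A 10x10 grid with 80 agents of two groups and 20 vacancies (illustrative values).
random.seed(0)
cells = [(r, c) for r in range(10) for c in range(10)]
random.shuffle(cells)
grid = {cell: ("A" if i % 2 == 0 else "B") for i, cell in enumerate(cells[:80])}
empty = cells[80:]
for _ in range(100):  # cap the number of sweeps; convergence is typical, not guaranteed
    if schelling_step(grid, empty) == 0:
        break
```

Even with a mild individual preference (here 30% like neighbors), repeated sweeps typically produce visibly segregated clusters, which is the diachronic, agent-level effect that static equilibrium models cannot display.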

34.1.4 Methodological Caveat

These different chronological perspectives call for the following comments.

First, philosophers should be careful when developing an epistemology of computational models and computer simulations. Modeling and simulating practices have been developed in various epistemic contexts, in scientific fields in which well-entrenched theories are more or less present and which have different methodological and scientific norms. Thus, the role of computer simulations and their epistemological assessment can significantly differ from one case to another, and bold generalizations should be carefully justified or avoided. As just mentioned, the use of computer simulations is central and accepted in fields like climate science (even if it raises important problems) but is still regarded with great suspicion in fields like economics [34.37, 40].

Second, how much computational models and computer simulations correspond to epistemologically different practices, which should be described in terms of some computational turn, cannot be assumed, but should be investigated on a case-by-case basis with regard to all potentially relevant aspects. This can be illustrated with the question of the tractability of scientific models. Humphreys, in his 2004 book Extending Ourselves, proposes the following two principles to analyze science: “It is the invention and deployment of tractable mathematics that drives much progress in the physical sciences”; and its converse version: “most scientific models are specifically tailored to fit, and hence are constrained by, the available mathematics” [34.41, pp. 55–56]. These two principles suggest both a continuist and a discontinuist reading of the development of science. First, students of science need to assess which precise aspects of scientific practices have been changed by the development of computers and whether these changes should be seen as a scientific revolution, or simply as an extension of existing modes of reasoning [34.42]. In this perspective, questions about the tractability and complexity of models can no longer be ignored, and may be crucial to an understanding of how new branches of modeling and computational practices can develop and of how the dynamics of science can be qualitatively different [34.43]. At the same time, scientific practices were also constrained by the available mathematics before the advent of computers, and new findings in mathematics already paved the way for the development of new scientific practices. For example, Lakatos emphasizes that [34.44, p. 137]

“the greatness of the Newtonian programme comes partly from the development – by Newtonians – of classical infinitesimal analysis, which was a crucial precondition for its success.”

From this point of view, a continuist reading is also required.

Third, the computational perspective may require partly revising the philosophical treatment of questions about science, and scientific representation in particular. Computer simulations are actual activities of investigation of scientific models, and, for this reason, the tractability and computational constraints that they face can hardly be ignored when analyzing them. They force us to adopt an in-practice perspective, where what matters is not the logical content of representations (that is, the information which agents can access in principle, with unlimited resources), but the results and conclusions which agents can in practice reach with the inferential resources they have [34.41, §5.5]. By contrast, traditional analyses of scientific models adopt an in-principle stance: the question of their exploration and of the tractability of the methods used to explore them is one question among others, and is implicitly idealized away when discussing other issues. This implies surreptitiously smuggling in the unjustified claim that the distinction between what is possible in principle and what is possible in practice can be ignored for the investigation of these other issues, which may sometimes be controversial.

At the same time, philosophers of science draw their examples from the scientific literature, which, by definition, presents successful investigations of models that must have been found, one way or another, tractable enough for the inquiries pursued. In brief, discussions about the scientific models found in scientific practice ipso facto concern computationally tractable models, or models having computationally tractable versions.

How much these remarks imply that existing analyses of scientific models have been discreetly skewed, or on the contrary that the constraints of tractability have already been taken into account, needs to be ascertained, and the answer may differ depending on the question investigated. For example, for decades the question of the relations between fields was mainly treated in terms of relations between theories. While this perspective is in part legitimate, recent investigations suggest that tractable models may also be a relevant unit for analyzing theoretical, methodological or practical transfers between fields [34.41, §3.3], [34.45, 46]. In any case, when discussing questions related to scientific representation, explanation, or confirmation, philosophers of science must keep in mind that answers may sometimes differ for the models that scientists work with daily (and which more and more require computers to be investigated), and for simple analytically solvable models, which philosophers more naturally focus upon, and which may have a specific scientific status regarding the construction of knowledge and the development of families of models in each field.

34.2 The Variety of Computer Simulations and Computational Models

Computer simulations involve the use of computers to represent and investigate the behavior of physical systems (Sect. 34.2.1). Beyond this simple characterization, various types of computer simulations can be met in science, each with its specificities, and it is important to distinguish them to avoid undue extrapolations. Differences can be met at various levels of description. Computing machines can be digital or analog (Sect. 34.2.2). Digital computers are usually used to carry out numerical computations (Sect. 34.2.3), even if not all computer simulations involve operations on numbers (Sect. 34.2.4). In both cases, computations may be deterministic or nondeterministic (Sect. 34.2.5). Finally, various types of mathematical and physical computational models can be met across science, such as equation-based models, agent-based models, coupled models or multiscale models, but not all important computational methods or mathematical frameworks are used to carry out computer simulations (Sect. 34.2.6).
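The deterministic/nondeterministic contrast can be made concrete with a minimal sketch (my own illustration, not an example from the chapter): a Monte Carlo estimate of π, a stochastic computation whose individual runs differ while their statistics converge. Seeding the pseudorandom generator, as below, makes a given run reproducible:

```python
import random

def monte_carlo_pi(n_samples, seed=42):
    """Estimate pi by sampling points uniformly in the unit square
    and counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

estimate = monte_carlo_pi(100_000)
```

The statistical error shrinks roughly as the inverse square root of the number of samples, which is why such nondeterministic simulations trade exactness for applicability to problems with no tractable deterministic treatment.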

My purpose in this section is to present and characterize different important types of simulations, which are used in scientific practice and will regularly be referred to in the following sections, and to highlight some specific epistemological questions related to them.

34.2.1 Working Characterization

In science, computer simulations are based on the use of computers. A computer is a physical apparatus which can reliably be used to carry out logical and mathematical operations. A computer simulation corresponds to the actual use of a computer to investigate a physical system S, by computationally generating the description of some of the states of one of the potential trajectories of S in the state space of a computational model of S (working characterization).

This characterization is not meant as a full-blown definition (Sect. 34.6) but as a synthetic presentation of important features of computer simulations.

First, it emphasizes that the development of computers is a central step in the recent evolution of science, made possible by steady conceptual and technical progress in the twentieth century. It can therefore be expected that computational aspects are often, though not necessarily always, central to the epistemological analysis of computational science and computer simulations (Sect. 34.3). Second, the working definition is meant to emphasize that not all uses of computers in science can be seen as computer simulations. Typically, the use of computers to analyze big data is not considered a computer simulation, since the dynamics of the target system is not represented. Third, the characterization remains neutral regarding the question of whether there are simulations in science that are not based on the use of computers (whatever these could be). It is not incompatible with the claim that computer simulations are some sort of experimental activity, even if people willing to endorse such claims need to explain and justify in which sense the corresponding uses of computers can be considered experimental (Sect. 34.5). Finally, since different types of computers exist, computer simulations may correspond to various types of objects. The working definition emphasizes that, in order to analyze actual science, the emphasis should be primarily on models of computation that can have an actual physical realization, and on physical systems that can be used in practice for scientific purposes – even if investigations about potential machines, and how some physical systems could instantiate them, may be relevant for foundational issues.
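The working characterization can be given a toy instance (my own sketch; the oscillator model, integration scheme, and step size are illustrative assumptions): the computer generates descriptions of successive states of one trajectory of a computational model of S in its (position, velocity) state space.

```python
def simulate_oscillator(x0, v0, omega=1.0, dt=0.01, steps=1000):
    """Generate states (x, v) along one trajectory of a harmonic
    oscillator model, using the semi-implicit Euler scheme."""
    x, v = x0, v0
    trajectory = [(x, v)]
    for _ in range(steps):
        v -= omega ** 2 * x * dt   # update velocity from the acceleration
        x += v * dt                # then position from the new velocity
        trajectory.append((x, v))
    return trajectory

# One discretized trajectory through state space, from the state (1, 0):
states = simulate_oscillator(1.0, 0.0)
```

What the machine delivers is exactly what the characterization says: not the system S itself, but a finite list of described states of one potential trajectory of a model of S.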

I now turn to the description of important types of computer simulations that have been, or still are, used in science and that figure in epistemological discussions about computer simulations.

34.2.2 Analog Simulations and Their Specificities

Analog computers were important tools for scientific computing until the late 1960s, when handbooks of analog computation were still being written [34.47, 48], and attempts were made in the early 1970s to link analog and digital computers. Analog simulations and physical analog systems are still occasionally used to investigate physical systems.

An analog computer is a physical machine that is able to carry out algebraic and integrodifferential operations upon continuous physical signals. Thus, operations that would be difficult to program on a digital computer are immediately possible on an analog machine. The specificity of analog machines is that they contain physical elements whose dynamics decisively contribute to the dynamic instantiation of these mathematical operations. For a machine to be used as an analog computer, its physical dynamics must be explicitly known and completely under control, so that there is no uncertainty about the operations which are carried out. While systems like wind tunnels cannot be made to compute several different dynamics, mechanical analog computers like the differential analyzer and electrical analog computers can be used as general-purpose computational tools.

Even if analog computers and analog simulations are seldom used nowadays, understanding them is epistemologically important. For instance, while quantum computation is an extension of classical digital computation, quantum analog computing, which involves no binary encoding, may prove useful for the purpose of the quantum simulation of physical systems [34.49]. Analog computers are considered to be potentially more powerful than digital machines and to be actually instantiated by physical systems, even if we are unable to use them to the full extent of their capacities because of analog noise or the impossibility of precisely extracting the information they process. The analysis of analog computability is also important for foundational studies aimed at determining which types of actual computing devices could be used for the purposes of computer simulations, how many resources we may need to simulate physical systems, or what natural systems can compute [34.50–52]. For example, the General Purpose Analog Computer was introduced by Shannon as a model of the differential analyzer, which was used from the 1930s to the 1960s.

Finally, understanding how analog computers work is important for understanding analog simulations and how they differ from digital simulations. As Arnold rightly emphasizes [34.53, p. 52], the failure to distinguish properly between digital computer simulations and analog simulations can be (and has recently been) a major source of error in philosophical discussions of computer simulations. Analog computers physically instantiate the mathematical dynamics which they are used to investigate. Therefore, the analog computational model that is analyzed is instantiated both by the physical computer and by the target system that is being simulated. Thus, the simulating and simulated processes share a common structure and are isomorphic [34.54], which need not be the case for digital simulations (Sect. 34.5.3). Importantly, this common structure is purely mathematical, and involves dimensionless quantities [34.55, Chap. 8]. While the need to describe systems in terms of dimensionless quantities is a general one in the empirical sciences [34.56–58], and is also crucial for digital simulations, here it is specifically important for understanding the type of reasoning involved in analog simulations. Indeed, the physical descriptions of the simulating and simulated systems matter only insofar as one needs to justify that they instantiate a common dimensionless dynamical structure. In brief, such analogical reasoning does not involve any direct comparison between the physical material properties of the simulating and simulated systems: the mathematical structure mediates the comparison. In other words, even with analog simulations, an analysis of the similarities of the two systems is irrelevant once one knows which analog computation is being carried out by both systems.
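To make the shared-structure point concrete, here is a standard textbook illustration (not drawn from this chapter): a mass–spring system and an LC electrical circuit instantiate the same dimensionless oscillator dynamics, so either physical system could in principle serve as an analog computer for the other.

```latex
% Dimensional equations of motion of the two systems:
m\,\ddot{x} + k\,x = 0, \qquad L\,\ddot{q} + \frac{1}{C}\,q = 0 .
% Rescale time by each system's natural frequency:
\tau = \omega t, \qquad
\omega_{\mathrm{mech}} = \sqrt{k/m}, \qquad
\omega_{\mathrm{elec}} = 1/\sqrt{LC} .
% Both equations then reduce to the same dimensionless dynamics:
\frac{\mathrm{d}^2 u}{\mathrm{d}\tau^2} + u = 0 .
```

The comparison between the two systems is mediated entirely by the common dimensionless equation; their material properties (masses, inductances) enter only through the rescaling.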

34.2.3 Digital Machines, Numerical Physics, and Types of Equivalence

In digital machines, information is processed discretely, coded in binary digits (1 or 0), and stored in transistors. Computations involve transitions between computational states. These transitions are described in terms of logical rules between the states. If these rules can be described in a general form, they may be described in terms of equations involving variables. Digital computers can have various types of architecture with different computational performances. Traditionally, software was written for sequential computation, in which one instruction is executed at a time. In contrast, modern supercomputers are designed to solve tasks in parallel, and parallelism can be supported at different levels of architecture, which often implies the need to adapt algorithms, if not models, to parallel computation [34.59].

Digital machines can be used to develop different types of computer simulations. Much computational science is numerical: binary sequences code for numbers, and computers carry out numerical operations on these numbers by processing the binary strings. Since computers can only devote limited memory to representing numbers (e.g., with floating-point representation), numerical science usually involves numerical approximations. In other words, computer simulations do not provide exact solutions to equations – even if the notion of an exact solution is not as straightforward as philosophers usually take it to be [34.60].

Different types of equivalence between computations, and, by extension, computer simulations, should be distinguished beyond equivalence at the bit level [34.61]. Logical and mathematical expressions and algorithms can be mathematically equivalent when they refer to, or compute, the same mathematical object or some of its properties. Because of floating-point representation, round-off errors cannot be avoided in simulations. When algorithms result in small cumulative errors, they are stable, and two such stable algorithms may be considered numerically equivalent – although they need not be computationally equivalent in terms of their computational efficiency. Finally, based on the type of inquiry pursued, wider notions of representational equivalence can be defined at the computational-model or computer-simulation level. Typically, two computations yielding the same average quantity, or describing the same topology of a trajectory, may be considered equivalent. Overall, this shows that analyses of the failures and predictive or explanatory successes of computer simulations must often be rooted in the technical details of computational practices [34.62]. From this point of view, an important part of computational science can be seen as the continuation of the numerical analysis tradition presented in Sect. 34.1.2.
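The gap between mathematical and numerical equivalence can be illustrated with a minimal Python sketch (my example, not the chapter's): a sum that is exact in real arithmetic accumulates round-off in floating point, while an error-compensated summation algorithm recovers the exact result.

```python
# Two mathematically equivalent computations need not be numerically
# equivalent in floating-point arithmetic.
import math

# Mathematically, 0.1 summed ten times equals 1.0 exactly.
naive = sum([0.1] * 10)              # accumulates binary round-off errors
compensated = math.fsum([0.1] * 10)  # error-compensated (exact) summation

print(naive == 1.0)        # False: round-off has accumulated
print(compensated == 1.0)  # True: fsum returns the correctly rounded sum
```

Both computations refer to the same mathematical object, yet only one of them is numerically exact; in the chapter's terms, they are mathematically but not bitwise equivalent.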

34.2.4 Non-Numerical Digital Models

A large part of science gives a central role to scientific theories couched in terms of differential equations relating continuous functions with their derivatives. For this reason, much of computational science is based on finite-difference equations aimed at finding approximate numerical solutions to differential equations. However, this theory- and equation-oriented picture does not exhaust actual practices in computational science. First, computer simulations can be carried out in the absence of theories – which turns out to be a problem when it comes to the explanatory value of computer simulations (Sect. 34.4). Second, even when equation-based theories exist, computational models are not necessarily completely determined by these theories and by mathematical results describing how to discretize equations appropriately (Sect. 34.3.2). Finally, even when well-entrenched, equation-based theories exist, digital but non-numerical computer simulations can be developed. This perspective was advocated in the 1980s by computer scientists like Fredkin, Toffoli, or Margolus. Building on the idea, previously expressed by Feynman, that maybe "nature, at some extremely microscopic scale, operates exactly like discrete computer logic" [34.63], they wanted to develop "a less roundabout way to make nature model itself" [34.64, p. 121] than the use of computers to obtain approximate numerical solutions of equations. The idea was to represent physical processes more directly by means of physically minded models, with interactions on a spatial lattice providing an emulation "of the spatial locality of physical law" [34.65], and to use exact models obeying discrete symbolic dynamics to dispense with numerical approximations. In practice, this resulted in the renewed development of cellular automata (hereafter CA) studies and their use for empirical investigations. A CA involves cells in a specific geometry; each cell can be in one of a finite set of states and evolves following a synchronous local update rule based on the states of the neighboring cells. The field of CA was pioneered in the 1940s by Ulam's work on lattice networks and von Neumann's work on self-replicating automata. It was shown over the decades that such models, though apparently over-simplistic, can be successfully used not only in fields as different as the social sciences [34.66] and artificial life [34.67], but also in physics, where they were shown in the late 1970s and 1980s to provide mesoscopic alternatives to the macroscopic Navier–Stokes equations [34.68].
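The CA scheme just described (finite states, synchronous local update on a lattice) can be sketched in a few lines of Python. The choice of one of Wolfram's elementary rules and of the boundary conditions is an illustrative assumption, not taken from the chapter.

```python
# Elementary cellular automaton: a row of binary cells, each updated
# synchronously from the states of its two neighbors and itself.

def step(cells, rule=110):
    """One synchronous update of a row of 0/1 cells, periodic boundaries.

    The 8-bit `rule` number encodes the new state for each of the
    8 possible (left, center, right) neighborhoods.
    """
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value in 0..7
        out.append((rule >> neighborhood) & 1)              # look up rule bit
    return out

row = [0] * 16 + [1] + [0] * 16   # single live cell in the middle
for _ in range(5):
    row = step(row)
print(sum(row))  # number of live cells after five exact, symbolic updates
```

Note that every update is exact symbolic dynamics: no floating-point numbers, and hence no numerical approximation, are involved.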

34.2.5 Nondeterministic Simulations

Another important distinction is between deterministic and nondeterministic algorithms. From the outset, computers were used to execute nondeterministic algorithms, which may behave differently on different runs.

Nondeterministic simulations involve using random number generators, which can be based on random signals produced by random physical processes, or on algorithms producing pseudorandom numbers with good randomness properties. Overall, the treatment of randomness in computer simulations is a tricky issue, since generating truly random signals, with no spurious regularities which may spoil the results by introducing unwanted patterns, turns out to be difficult.

Monte Carlo methods, also called Monte Carlo experiments, are a widely used type of nondeterministic simulation. They were central to the Manhattan project, which led to the production of the first nuclear bombs and contributed heavily to the development of computer simulations. They can be used for various purposes, such as the calculation of mathematical quantities like Pi or the assessment of average quantities in statistical physics by appropriately sampling some interval or region of a state space. These practices are hard to classify and, depending on the case, seem to correspond to computational methods, experiments, or full-blown computer simulations. Metropolis and Ulam [34.69] is a seminal work; Galison [34.70, 71] provides historical studies; Humphreys [34.8] and Beisbart and Norton [34.72] offer epistemological analyses.
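The sampling idea behind the calculation of quantities like Pi can be illustrated with the classic textbook sketch below (my example, not the chapter's): points are drawn uniformly in the unit square, and the fraction falling inside the quarter disk estimates π/4.

```python
# Monte Carlo estimation of Pi by uniform sampling of the unit square.
import random

def estimate_pi(n_samples, seed=0):
    # A seeded pseudorandom generator makes the run reproducible,
    # illustrating the use of "good" pseudorandomness discussed above.
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:   # point falls inside the quarter disk
            hits += 1
    return 4.0 * hits / n_samples  # area ratio scaled up to Pi

print(estimate_pi(100_000))  # close to 3.14159, up to sampling error
```

The estimate converges slowly (the error shrinks roughly as the inverse square root of the sample size), which is why the quality of the underlying random number generator matters for large runs.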

34.2.6 Other Types of Computer Simulations

It is difficult to do justice to all the kinds of simulations that are seen in scientific practice. New computational methods are regularly invented, and these often challenge previous attempts to provide rational typologies. Further, the features presented in the previous sections are often mixed in complex ways. For example, CA-based methods in fluid dynamics, which were not originally supposed to involve numbers or offer exact computations, were finally turned into lattice Boltzmann methods, which involve making local averages [34.73]. Here, I shall merely present types of computer simulations that are widely discussed in the philosophical literature.

Agent-Based Methods
Agent-based methods involve the microlevel description of agents and their local interactions (in contrast to global descriptions like balance or equilibrium equations), and provide tools to analyze the microscopic generation of phenomena. They are often opposed to equation-based approaches, but the distinction is not completely sharp, since equations do not need to describe global behaviors and, when discretized, often yield local update rules. Agent-based models and simulations are used across fields to analyze artificial, social, biological, etc., agents. CA models like the Schelling model of segregation can be seen as agent-based models, even though most such agent-based models also involve numbers in the descriptions of local interactions. Because they rely on microscopic descriptions, agent-based simulations are often at the core of debates about issues such as emergence [34.74], explanation [34.75], or methodological individualism in science [34.76].
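A minimal sketch of a Schelling-style segregation model follows; the grid size, tolerance threshold, and relocation scheme are illustrative assumptions of mine, not parameters taken from the chapter or from Schelling's original model.

```python
# Schelling-style segregation sketch: agents of two types on a toroidal
# grid relocate to a random empty cell when too few neighbors share
# their type. Microlevel rules only; no global equation is solved.
import random

def unhappy(grid, i, j, size, threshold=0.3):
    """True if the fraction of like-typed neighbors falls below threshold."""
    me = grid[i][j]
    like = total = 0
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == dj == 0:
                continue
            nb = grid[(i + di) % size][(j + dj) % size]
            if nb is not None:          # ignore empty cells
                total += 1
                like += (nb == me)
    return total > 0 and like / total < threshold

def step(grid, size, rng):
    """Move every unhappy agent to a randomly chosen empty cell."""
    empties = [(i, j) for i in range(size) for j in range(size)
               if grid[i][j] is None]
    for i in range(size):
        for j in range(size):
            if grid[i][j] is not None and unhappy(grid, i, j, size):
                dest = rng.choice(empties)
                grid[dest[0]][dest[1]] = grid[i][j]
                empties.remove(dest)
                empties.append((i, j))
                grid[i][j] = None

rng = random.Random(1)
size = 10
cells = [0] * 45 + [1] * 45 + [None] * 10   # two types plus empty cells
rng.shuffle(cells)
grid = [cells[r * size:(r + 1) * size] for r in range(size)]
for _ in range(20):
    step(grid, size, rng)
```

Even with a mild tolerance threshold, repeated local moves typically produce macroscopically segregated clusters, which is why the model figures so prominently in debates about emergence and microlevel explanation.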

Coupled and Multiscale Models
Extremely elaborate computational models, developed and studied by large numbers of scientists, are increasingly used to investigate complex systems such as Earth's atmosphere, be it for the purpose of precise predictions and weather forecasting or for the analysis of the larger, less precise trends of climate studies. While in fluid dynamics it is sometimes possible to carry out direct simulations, in which the whole range of spatial and temporal scales from the dissipation scale to the integral scale is represented [34.77, Chap. 9], such methods are too costly for atmosphere simulations, in which subgrid models of turbulence or cloud formation need to be included (see Edwards [34.78] and Heymann [34.79] for accessible and clear introductions). Also, different models sometimes need to be coupled, as in the case of global coupled ocean-atmosphere general circulation models.

These complex computer simulations raise a number of epistemological issues. First, in the case of multiscale or coupled models, the physical and computational compatibility of the different models can be a tricky issue, and one must be careful that it does not create spurious behavior in the computer simulation (see Winsberg [34.16, 80] and Humphreys [34.41, 81] for more analyses of such models). Second, since there are various ways of taking into account subgrid phenomena, pluralism in the modeling approaches cannot be avoided [34.82]. Importantly, the existence of different incompatible models need not be seen as a problem, and scientists can try to learn by comparing their results or elaborate ensemble methods to try to deal with uncertainties [34.83]. The development of investigations of such large-scale phenomena requires collective work, both within and between research teams. Typically, not only the interpretation of the models, their justification, and the numerical codes [34.84], but also the standards of results [34.78, 85] must be shared by members of the corresponding communities. An important but still unexplored question is how much the collective dimension of these activities influences, epistemologically, how they are carried out. From this point of view, the epistemology and philosophy of computational models and computer simulations can be seen as another chapter of the analysis of the collective dimension of science.

Computational Methods versus Computer Simulations

Not all major families of mathematical and computational methods are used to produce computational models or computer simulations of empirical systems. Evolutionary algorithms are used for the investigation of artificial worlds, or of foundational issues about evolution, and they have important applications in the field of optimization methods. Artificial neural networks are used in the field of machine learning and data analysis, and to predict the behavior of physical systems out of large databases. Bayesian networks are helpful to model knowledge, develop reasoning methods, or treat data. Overall, all these computational methods have clear practical applications. They can be used for scientific tasks, sometimes concurrently with computer simulations in the case of predictive tasks. However, no genuine representations of physical systems and their dynamics seem to be attached to their use – even if, as the development of CA-based simulations has shown, novel formal settings may eventually have unexpected applications for modeling purposes in the empirical sciences.

34.3 Epistemology of Computational Models and Computer Simulations

Epistemologists analyze whether and how much knowledge claims are justified. In the present case, this requires analyzing the specific roles played by computer simulations in the production and generation of items of knowledge (Sect. 34.3.1). Different levels of description and analysis can be relevant when investigating the epistemology of computer simulations, in addition to that of the computational model and how it is theoretically or experimentally justified (Sect. 34.3.2). Importantly, how computer simulations are justified, and why specific computational models are used by scientists, are overlapping (though not identical) questions. For example, field- or inquiry-specific explanations of the use of computer simulations fail to account for cross-disciplinary recurrences in the use of computational models, which may have more general mathematical or computational explanations (Sect. 34.3.3). Overall, computer simulations are one of the main sources of knowledge and data in contemporary science, even if the sense in which they produce new data and knowledge is often misunderstood (Sect. 34.3.4).

34.3.1 Computer Simulations and Their Scientific Roles

Science, as an institution, aims to reach epistemic goals of various sorts, both propositional (like reaching some epistemic states, typically justified true beliefs) and practical (like being able to reliably manipulate some physical systems). Epistemologically analyzing science requires the study of the reliability and efficiency of scientific procedures to reach these goals. Accordingly, to develop the epistemology of computer simulations, one first needs to single out their different scientific goals.

Even if they also serve pedagogical or expository purposes, most computer simulations can be described as activities aimed at developing knowledge. There exist various types of scientific knowledge (see Humphreys [34.81] for an overview), which raise specific problems, and, conversely, various types of knowledge can be produced by computer simulations.

Typically, items of knowledge may differ in how they are justified (theoretically, experimentally, inductively, etc.), and whether they were reached by a priori or a posteriori investigations. They may also differ regarding the activities needed to produce them and the type of information that they provide. For example, predictive or explanatory knowledge, or knowledge about how systems behave and can be controlled, are of different types. Some scientific roles can be general (predicting) and others are very specific. For example, computer simulations are used to develop evidential standards in physics by simulating detection procedures and identifying patterns of data (signatures) [34.86]. Overall, developing a coherent and fine-grained epistemology of computer simulations would require drawing a map of their various roles, to see how much their epistemological features are general or contextual and role specific.

Let us now be more specific. In the twentieth century, the role of experiments, as sources of empirical evidence about nature and guides in the selection of theories, was repeatedly, if not exclusively, emphasized by empiricist philosophers of science. Conversely, activities which did not provide such evidence were mainly seen as serving theoretical purposes. Typically, models were first seen as being primarily of a theoretical nature [34.20, §5]. In this perspective, Models as Mediators, in 1999, represented a significant advance. Morgan and Morrison, by presenting a more precise "account of what [models] do" in science [34.20, p. 18], offered a more nuanced epistemological picture, in which models were shown to have functions as diverse as investigating theories and the world, intervening, helping with measurement purposes, etc. Since an important role of computer simulations is to demonstrate the content of models [34.13] or unfold well-defined scenarios [34.87], computer simulations can be expected to have, or contribute to, roles similar to those described by Morgan and Morrison, and to potentially share these roles with other demonstrative activities like argumentation or mental simulations.

Importantly, such a description of science, where items or activities as diverse as theories, models, computer simulations, thought experiments, or experiments may serve partly overlapping purposes, remains compatible with empiricism, provided that experiments are seen as, in the architecture of knowledge, the only ultimate source of primary evidence about the nature of physical systems. It is also compatible with the claim that secondary, derived sources of knowledge, like theories, models, or simulations, can sometimes be more reliable than experiments in providing information about how systems behave, in particular in cases in which experimental evidence is hard to come by (Sect. 34.5.4).

Overall, it is unlikely that there is such a thing as the epistemology of computational models and simulations. If the various roles of computer simulations are specific cases of general functions, like demonstrating or unfolding, there may be such a thing as a general, but incomplete, epistemology of computer simulations, corresponding to the general epistemological problems raised by such general functions. In any case, to complete the picture, one needs to go deeper into the analysis of the roles that computer simulations serve within scientific practices and how they fulfill these roles in various types of contexts. This program is not incompatible with the philosophical perspectives of some of the advocates of the so-called practice turn in science [34.88], and in particular of authors who put contextually described scientific activities at the core of their description of science [34.89, 90].

34.3.2 Aspects of the Epistemological Analysis of Computer Simulations

A Multilayered Epistemology
Epistemology analyzes the types of justifications that we have for entertaining knowledge claims, and investigates how and why we epistemically fail or succeed. In the case of computer simulations, failure may take place at various levels, from the material implementation of the computation to the physical model that is at the core of the inquiry, and at all the intermediate semantic levels of interpretation that are needed to use computers for the investigation of models (see Barberousse et al. [34.91] for a general description and Grim et al. [34.92] for a discussion of some specific failures found in computer simulations). Overall, the epistemology of computer simulations involves discussing the reasons that we have for claiming that:

1. The computers that we use work correctly.
2. The programs or algorithms do what we want them to do.
3. The computer simulations, qua physical representations, correctly depict what we want them to represent.

Steps 1 and 2 correspond to questions related to engineering and computer science. I shall not discuss these at length but will simply illustrate them to show how serious they are in this context. For example, at the architectural level, parallel computing requires coordinating the different cores of computers so that all potential write-conflicts in the memory are solved. At the program level, when trying to solve a problem P with an unknown solution S, scientists need to prove the correctness of the algorithms they use and to verify that the programs written do indeed execute these algorithms. Many such verification problems are undecidable, which means that no general versatile procedure can be found to carry out this verification for all cases. However, this does not imply that proofs of the correctness of the algorithm cannot sometimes be provided for specific problems. Overall, scientists in this field still actively investigate and debate how much algorithms can be verified (see Fetzer [34.93], Asperti et al. [34.94], and Oberkampf and Roy [34.95] for discussions). At a higher, mathematical level, as we saw earlier, many computational methods provide numerical methods for approximately solving problems, and the stability of algorithms can be a source of concern, which means that analyzing computational errors is part of the epistemology of simulations [34.62].

Finally, one needs to assess whether the approximations in the solution, as well as the representational inadequacies of the model, are acceptable regarding the physical inquiry pursued. At this interpretational level, because of the variety of theoretical contexts in which computer simulations are carried out, there is no single and general way in which the reliability of the results they provide can be analyzed. The credentials of computer simulations will be different depending on whether a sound theory is being used, how much uncertainty there is about the initial conditions, how complex the target system is, whether drastic simplifying assumptions have been made, etc. Also, depending on what the simulation is used for, and what type of knowledge it is meant to provide, the justificatory requirements will be more or less stringent. It takes different arguments to justify that, based on a simulation, one knows how to represent, control, predict, explain, or understand the behavior of the system (see Sect. 34.4 for a discussion of the last two cases, and [34.96] for similar analyses). Similarly, precise quantitative spatio-temporal predictions are in need of much more pointed justifications than computer simulations aimed at studying average quantities or qualitative behaviors of systems. Importantly, this discussion of the reliability of computer simulations overlaps significantly with that of the epistemology of physical models, and with how the results issued from approximate, idealized, coarse-grained, or simply partly false models can still be scientifically valuable (see Portides, Chap. 2, this volume; Frigg and Nguyen, Chap. 3, this volume). However, in the present context, it is important to emphasize that, even if the content of models obviously constrains the reliability of the information that can be extracted from them, models do not by themselves produce results – only procedures which investigate them do. In this perspective, the epistemology of computer simulations is a reminder that reliability primarily characterizes practices or activities that produce knowledge, and that models, taken alone, are not such practices. In other words, epistemological discussions about the reliability of models as knowledge providers make sense only by explicitly reintroducing such practices, or when it can be assumed that reliably extracting all their content is possible – an assumption that, in the framework of computational science, is often not plausible.

From Theoretical to Empirical Justifications
Computer simulations have often been viewed as ways of exploring theories by hypothetico-deductive methods. This characterization captures a part of the truth, since existing theories are often a starting point for the construction of computer simulations. In simple cases, computer simulations can be mainly determined by theories, as in the case of direct simulations [34.77] in fluid dynamics, which derive from the Navier–Stokes equations, and in which all relevant scales are simulated and no turbulence model is involved.

However, taken as a general description, this view misrepresents how computer simulations are often produced and how their validity is justified. As emphasized by Lenhard [34.97], even when theoretical equations are in the background, computer simulations often result from some cooperation between theory and experiments. For example, in 1955, when Norman Phillips managed to reproduce atmospheric wind and pressure relations with a six-equation model, it was uncertain which arrangement of equations could lead to an adequate model of the global atmosphere, and experimental validation was essential to confirm his speculative modeling assumptions. Overall, the role of empirical inputs in simulation studies is usually crucial to develop phenomenological modules of models, parameterize simulations, or investigate their reliability based on their empirical successes [34.15, 17].

At the same time, since computer simulations are used precisely in cases where empirical data are absent, sparse, or unreliable [34.16], sufficient data to build up and empirically validate a computational model may be missing. In brief, in some cases, computer simulations are sufficiently constrained neither by theories nor by data, and are somewhat autonomous. From an epistemological point of view, this potential situation of theoretical and experimental under-determination is not something to be hailed, since it undermines the scientific value of their results (see also Sect. 34.4.2).

The Epistemology of Complex Systems
Because computer simulations are generally used to analyze complex systems, their epistemology partly overlaps with that of complex systems and their modeling. It involves the analysis of simplification procedures at the representational or demonstration levels, and of how various theoretical or experimental justifications are often used concomitantly. Overall, when it comes to investigating complex systems, obtaining reliable knowledge is difficult. Thus, any trick or procedure that works is welcome, and the result is often what Winsberg has labeled a motley epistemology [34.16].

At the same time, sweeping generalizations should be avoided. Philosophers studying computer simulations have too often cashed out their epistemology in terms of that of the most complex cases, such as computer simulations in climate science, which are characterized by extreme uncertainties and the complexity of their dynamics. But computer simulations are used to investigate systems that have various types and degrees of complexity, and whose investigation meets different sorts of difficulties. It is completely legitimate, and politically important, that philosophers epistemologically analyze computational models and computer simulations in climate science (see, e.g., [34.98] for an early influential article). However, to obtain a finer-grained and more disentangled picture of the epistemology of computer simulations, and so as not to put everything in the same boat, a more analytic methodology should be applied. More specifically, one should first analyze how results are justified in simpler cases of computer simulations, where specific scientific difficulties are met independently. In a second step, it can be analyzed how adding up scientific difficulties changes justificatory strategies, and when exactly more holistic epistemological analyses are appropriate [34.99]. In this perspective, much remains to be done.

Epistemic Opacity
Epistemic opacity is present in computer simulations to various degrees and has various origins.

Models are often said to be white, gray, or black boxes depending on how they represent their target system. White-box models describe the fundamental laws or causal mechanisms of systems, whereas black-box models simply connect different aspects of their behavior correctly. This distinction partly overlaps with that between theoretical and phenomenological models (see Barberousse and Imbert [34.100, §3.2] for sharper distinctions about these last notions). Computer simulations can be based on all types of such models, which may affect the understanding that they yield [34.101] (see also Sect. 34.4).

Opacity can also be present at the level of the computational model or the computational process. Global epistemic opacity may arise from the complexity of the computation, when it is not possible for an agent to inspect and verify all steps of the process [34.41, §3.7], [34.102]. It is in part contingent, since it is rooted in our limitations as epistemic creatures, but it may also be in part intrinsic, in the sense that the complexity of the computation may be irreducible (see Sect. 34.4.3). Importantly, it is compatible with local epistemic transparency, when any step of the process can be inspected by a human mind, which may prove useful in cases in which problems can be located by testing parts of the process and applying a dichotomy procedure. Local transparency requires that all details of the physical models and computational algorithms used be transparent, which may be more or less the case. Usually, computer simulations make heavy use of mathematical resource libraries: code lines, routines, functions, algorithms, etc. In applied science, more or less opaque computational software may be offered to simulate various types of systems, for example, in computational fluid dynamics [34.91, p. 567]. This raises epistemological problems, since black-box software is built on physical models with limited domains of physical validity, and results will usually be returned even when users unknowingly apply such software outside these domains.

Another form of epistemic opacity for individual scientists arises from the fact that investigating natural systems by computer simulations may require different types of experts, from both the empirical and mathematical sciences. As a result, no single scientist has a thorough understanding of all the details of the computational model and computational dynamics. This type of opacity is not specific to computer simulations, since it is a consequence of the epistemic dependence between scientists within collaborations [34.103].

34.3.3 Selecting Computational Models and Practices

How do individual scientists decide to pursue specific theories, and, in particular, what types of sociological, psychological, or epistemic factors play a role in such processes? Conversely, do selected theories share specific features or properties? Mutatis mutandis, similar questions can be asked about other elements of science, such as research programs, experiments, models, practices, and, in the present case, computational models and computer simulations. Philosophers have mainly analyzed these questions by focusing on the explicit scientific justifications of individual practices and on the content of the representations involved. As we shall see, this is only part of the story.

Explanation of Uses versus Justification of Uses

A helpful distinction is that between the explanation (and context) of use of a practice and its scientific justification within a scientific inquiry aimed at reaching specific purposes. To use words close to Reichenbach's [34.104, pp. 36–37], while the latter deals with the objective relation that scientists consciously try to establish between these given <activities> (simulations, experiments, etc.) and the conclusions that are obtained from them, other aspects of a material, computational, cognitive, or social nature, potentially unknown to the scientific agents involved in the inquiry, may play a role in explaining why these activities were actually carried out. For example, in the case of the Millennium Run (a costly simulation in astrophysics), the results were made publicly accessible. Scientists who were not involved in the process leading to the decision to carry out this simulation could try to make the best of it, since it was already there, and milk it as much as possible for different purposes. Or, some scientists may decide to study biological entities like proteins or membranes by means of Monte Carlo simulations because members of their teams happen to be familiar with these tools. However, once they have decided to do so, they must still justify how their computer simulations support their conclusions.

From the perspective of explaining actual scientific uses, one also needs to distinguish between explanations that account for specific uses (e.g., why was the Millennium simulation carried out in 2005 by the Virgo consortium?) and those that explain more general patterns, corresponding to the use of practices of a given type, within or across several fields of science (e.g., why are Monte Carlo simulations regularly used in this area of physics? why are they regularly used in science?). Importantly, since different instantiations of a pattern may have different explanations, the aggregated frequency of a scientific practice, like the use of the Ising model across science, may be the combined effect of general transversal factors and of inquiry- or field-specific features [34.105].

Field-Specific versus Cross-Disciplinary Explanations

A tempting answer has often been that scientific choices are primarily, if not completely, theory driven, and are therefore field specific. After all, theories guide scientists in their predictive and explanatory activities by fueling the content of their representations of natural systems. However, one reason to look for additional elements of explanation is that the spectrum of actual modeling and computational practices is smaller than our scientific knowledge and goals would allow [34.21, 41, 106]. For example, why do the harmonic oscillator, the Ising model, the Lotka–Volterra model, Monte Carlo simulations, etc., play such prominent roles throughout science?

As highlighted by Barberousse and Imbert [34.105], a variety of significantly different explanations of the greater or lesser use of models of a given type, and of scientific practices, can be found, beyond the straightforward suggestion that there are regularities in nature which are mirrored by modeling and computational practices.

Local Factors
The explanation may be rooted in the specificities of modeling and computational activities. In particular, if explanation is better achieved by limiting the number of (types of) (computational) models [34.21, pp. 144–5] or explanatory argument patterns [34.107], it is no surprise that the same computational models and practices are often met. Also, scientists may feel the need to avoid dispersing their efforts in cases when research programs need to be pursued for a long time before good results can be reached, and it is more profitable to exploit a local mine than to go digging somewhere else [34.106, Chaps. 4 and 5], [34.21, pp. 143–4]. More generally, the recurrence of computational practices may be viewed as another example of the benefits of adopting scientific standards [34.108]. One may also, in the Kuhnian tradition, put the emphasis on the education of scientists, who are taught to see new problems through the lens of specific problems or exemplars [34.106, p. 189], and emphasize that this education has a massive influence on the lineages of models or practices that are later developed. This story can have a more or less contingentist version, depending on why the original models or practices at the lineage seeds were adopted in the first place, and why these uses are perpetuated and scientists do not emancipate themselves from them after schooling.

Theories may also play an indirect role in the selection of computational models. For example, models naturally couched in the standard formalism of a theory may be easier to use, even if the same physics can also be put to work by using other models. Barberousse and Imbert [34.100] analyze in depth the case of lattice methods for fluid simulations, which, though significantly different from approaches based on Navier–Stokes differential equations, can be used for the same purposes, even if this requires spending time learning and harnessing new methods and formalisms, which physicists may be reluctant to do.

Computational and Mathematical Explanations

As seen in Sect. 34.1.4, Humphreys [34.41, 81] suggests that most scientific models are tailored to fit the available mathematics, hence the importance in scientific practice of tractable models (see Humphreys's notion of computational template [34.41, §3.4], and further analyses in [34.45]). Even if one grants the potential importance of such mathematical and computational factors, cashing out the corresponding explanation in detail is not straightforward. Barberousse and Imbert [34.105] emphasize that there are various computational explanations. The objective computational landscape (how intrinsically difficult problems are, how frequent easy problems are) probably influences how science develops, even if knowing exactly what it looks like and how it constrains scientific activity is of the utmost difficulty. However, the epistemic computational landscape (scientists' beliefs about the objective computational landscape) may be just as important, since it frames the modeling choices made by scientists.

Other potentially influential factors include how difficult it is to explore the objective landscape (and the corresponding beliefs regarding the easiness of this exploration), how prone scientists, who try to avoid failure, are to resorting to tractable models, and which techniques are used to select such tractable models (since some specific techniques, like polynomial approximations, may repeatedly select the same models within the pool of tractable models). Finally, modeling conservativeness may also stem from the computational and result pressure experienced by scientists, that is, how scarce computational resources are in their scientific environment and how much scientists need to publish results regularly.

Universality, Minimality, and Multiple Realizability

Other explanations may be offered in terms of how weak the hypotheses are that a model or a distribution must satisfy. For example, the Poisson distribution is often met because various types of stochastic processes satisfy the few conditions that are required to derive it [34.41, pp. 88–89]. Relationships between models, and how models approximate each other, may also be important. Typically, the Gaussian distribution is the limit of various other distributions (see, however, Lyon [34.109] for a more refined analysis and the claim that in nature Gaussian distributions are common, but not pervasive). More generally, models that capture universal features of physical systems and are rooted in basic properties, such as their topology, can be expected to be met more often. Therefore, for reasons having to do with the mathematics of coarse-grained descriptions and the explanation of multiple realizability, many systems fall into the same class and have similar descriptions [34.110–112] when minimal, macro-level, or simply qualitative models are built and explored.
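These limit claims can be checked numerically. The following sketch (an illustration of mine, not an example from the chapter; all parameter values are arbitrary choices) compares a binomial law of many rare events with its Poisson limit, and checks that sums of uniform variables spread out like a Gaussian:

```python
import math
import random

def binomial_pmf(k, n, p):
    """Probability of k successes in n independent trials of probability p."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson probability of k events with rate lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Poisson as a limit: many rare independent events (n large, p small, n*p fixed)
lam = 3.0
for n in (10, 100, 10_000):
    p = lam / n
    print(n, abs(binomial_pmf(2, n, p) - poisson_pmf(2, lam)))  # shrinks with n

# Gaussian as a limit: sums of 50 uniform variables, standardized
random.seed(0)
sums = [sum(random.random() for _ in range(50)) for _ in range(100_000)]
mean, sd = 50 * 0.5, (50 / 12) ** 0.5
frac_within_1sd = sum(abs((s - mean) / sd) < 1 for s in sums) / len(sums)
print(round(frac_within_1sd, 2))  # ≈ 0.68, the Gaussian one-sigma mass
```

The point of the sketch is that neither limit depends on the details of the underlying process, which is precisely why the same distributions are met so often.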

Importantly, the above explanations are not mutually exclusive. Typically, the emphasis on tractability may be a general one, in the sense that models always need to be tractable if they are to be used by scientists.

34.3.4 The Production of ‘New’ Knowledge: In What Sense?

Be Careful of Red Herrings!
It is commonly agreed that computer simulations produce new knowledge, new data, new results, or new information about physical systems (Humphreys [34.41], Winsberg [34.113, pp. 578–579], Norton and Suppe [34.114, p. 88], Barberousse et al. [34.91, p. 557], Beisbart [34.115]). This can be considered a factual statement, since contemporary science, which is considered to produce knowledge, relies more and more heavily on computer simulations.

At the same time, the notion of knowledge should not become a red herring. It is commonly considered that experiments, inferences, thought experiments, representations, or models can bring knowledge, which then generates the puzzle that widely different activities have similar powers. The puzzle may be seen as somewhat artificial, since knowledge, especially scientific knowledge, can be of different types [34.81], and when new knowledge is produced, the novelty can also be of different types. From this perspective, it may be that what is produced by each of these activities falls under the same general concept but is significantly different. From this point of view, the real question concerning computer simulations is not whether they produce knowledge, but in which particular sense they produce knowledge, what kind of knowledge they produce, what is specific to the knowledge produced by computer simulations, and what type of novelty is involved.

A comparison can be made with thought experiments, for which the question of how they can produce new knowledge has also been debated. Both activities correspond to the exploration of virtual situations and do not involve direct interactions with the systems investigated. From this point of view, computer simulations and thought experiments can be seen as Platonic investigations of ideas, with the difference that, for computer simulations, the mind is assisted by computers [34.41, pp. 115–116]. Overall, computer simulations have been claimed to sometimes play the same unfolding role as thought experiments [34.87], have sometimes been equated with some types of thought experiments [34.116], and it has been suggested that computational modeling might bring about the end of thought experiments [34.117]. Importantly, even if thought experiments are perhaps less used in science than formerly, this latter claim seems implausible. The reason is that there are different kinds of thought experiments, and many reveal conceptual possibilities that have little to do with computational explorations. Arguably, the possibility of setting up computer simulations would have added nothing to famous thought experiments such as those made by Galileo, Einstein, Podolsky and Rosen, or Schrödinger. (I am grateful to Paul Humphreys for emphasizing this point.) In any case, a satisfactory account of these activities should account for both similarities and differences in how they work epistemologically and how they are used.

In any case, the question of how and what we can learn about reality by using these methods arises, even if the sources of puzzlement do not touch exactly the same points in each case. Indeed, how mental thought experiments work is more opaque than how computer simulations do. For this reason, their rational reconstruction as logical arguments [34.118, p. 354] is more controversial than that of computer simulations [34.115], and it is less clear whether their positive or negative epistemic credentials are those of the corresponding reconstructed argument [34.119]. (For example, if certain thought experiments are reliable because mental reasoning capacities about physical situations have been molded by evolution, development, or daily experience, it is not clear that their logical reconstruction will make more vivid why they are reliable.) The situation is clearer for computer simulations, since the process is externalized and based on more transparent mechanisms (see, however, Sect. 34.3.2). Then, if computer simulations are nothing other than (computationally assisted) thinking corresponding to the application of formal rules, and their output is somehow contained in the description of the computational model, how knowledge is generated is clearer, but the charge of lack of novelty is heavier.

The Need for an Adequate Notion of Content
Suppose that a physical system S is in a state s at time t and obeys a deterministic dynamics D. Then the description of D and s characterizes a mathematical structure M, which is the trajectory of S in its phase space and is known as such. If a computer simulation unfolds this trajectory, then it explicitly shows which states S will be in. At the same time, any joint description of one of these states and of the dynamics denotes the same structure M, which is known to characterize the evolution of S. So, from a logical point of view, no new content has been unraveled by the computer simulation, which can at best be seen as a means of producing new descriptions of identical contents. In brief, if knowledge is equated with knowledge of logical content, computer simulations do not seem to necessarily produce new knowledge. We may even be tempted to describe computer simulations as somewhat infertile, and thereby perpetuate a tradition according to which formal or mechanical procedures for drawing inferences, and rules of logic in particular, are sterile as far as discovery is concerned, and can at best be used to present pieces of

knowledge that have already been found – a position defended by Descartes in 1637 in the Discours de la Méthode [34.120]. This kind of puzzle, though particularly acute for computer simulations, is not specific to them and is nothing new for philosophers of language – Frege and Russell already analyzed similar ones. However, this shows that, pace the neglect of linguistic issues in present philosophy of science, without an adequate theory of reference and notion of content that would make clear what exactly we know and do not know when we make a scientific statement, we are ill-equipped to precisely analyze the knowledge generated by computer simulations [34.41, 121].
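The unfolding point can be made concrete with a minimal sketch (an illustration of mine, not an example from the chapter): the pair (s, D) logically fixes the whole trajectory M, yet its intermediate states only become explicit when the dynamics is stepped through.

```python
# The description (s, D) denotes the trajectory M in its entirety,
# but the states M contains are only made explicit by unfolding D
# step by step - which is all a simulation does.

def unfold(s, D, n):
    """Return the first n+1 states of the trajectory of s under D."""
    trajectory = [s]
    for _ in range(n):
        s = D(s)
        trajectory.append(s)
    return trajectory

D = lambda x: 3.7 * x * (1.0 - x)  # a deterministic dynamics (logistic map)
M = unfold(0.2, D, 5)
print(M)  # new descriptions of a content already fixed by (0.2, D)
```

Logically, `M` adds nothing to the pair `(0.2, D)`; practically, the explicit list of states is exactly what the unfolding delivers.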

Computational science may also remain somewhat mysterious if one reasons with the idealizations usually made by philosophers of science. As emphasized in Sect. 34.1.4, idealizing away the practical constraints faced by users is characteristic of much traditional philosophy of science and theories of rationality. In the present case, it is true that "in principle, there is nothing in a simulation that could not be worked out without computers" [34.122, p. 368]. Nevertheless, adopting this in-principle position is unlikely to be fruitful here, since, when it comes to actual computational science, which scientific content can be reached in practice is a crucial issue if one wants to understand how computational knowledge develops and pushes back the boundaries of science (see Humphreys [34.41, p. 154] and Imbert [34.102, §6]).

Overall, it is clear that present computational procedures and computer simulations do contribute to the development of scientific knowledge. Thus, it is incumbent on epistemologists and philosophers of science to develop conceptual frameworks for understanding how and in what sense computer simulations extend our science, and what type of novelty is involved.

Computer Simulations and Conceptual Emergence

Computer simulations unfold the content of computational models. How should we characterize the novelty of the knowledge that they bring us? Since the notion of novelty is also involved in discussions about emergence, the literature about the latter notion can profitably be put to work here.

Just as emergence may concern property instances and not types [34.123, 124, p. 589], the notion of novelty needed here should apply to tokens of properties instantiated in distinctive systems and circumstances, or to specific regularities whose scope covers such tokens and circumstances. For example, the apparition of vortices in fluids is in a sense nothing new, since the behavior of fluids is covered by existing theories in fluid dynamics, no new concept is involved, and other phenomena of this type are already understood for some well-studied configurations. At the same time, finding out that patterns of vortices emerge in configurations of a new type is a scientific achievement and the discovery of a new piece of knowledge.

Importantly, as emphasized by Barberousse and Vorms [34.125, p. 41], the notion of novelty should be separated from that of surprise. When the exact value of a variable is precisely learnt and lies within the range permitted by some physical hypothesis or principle, we have a kind of unsurprising novelty. Barberousse and Vorms give an example from experimental science, but computer simulations may also provide exact values for quantities that agree with general laws (e.g., the laws of thermodynamics) and are therefore partly expected.

In addition, computer simulations can provide cases of surprising novelty concerning behaviors that are covered by existing theories, like chaotic behavior in classical mechanics. Indeed, the Lorenz attractor and behaviors of a similar type were discovered by means of computer simulations of a simplified mathematical model initially designed to analyze atmospheric convection, and this stimulated the development of chaos theory [34.125, p. 42].
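As an illustrative sketch (the integration scheme, step size, and the standard parameter values σ = 10, ρ = 28, β = 8/3 are my choices, not details given in the text), a few lines suffice to reproduce the sensitivity to initial conditions that made the Lorenz model surprising, even though the governing equations were fully known:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz convection model."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

# Two trajectories whose initial states differ by one part in a million
a, b = (1.0, 1.0, 1.0), (1.000001, 1.0, 1.0)
for _ in range(3000):
    a, b = lorenz_step(a), lorenz_step(b)

separation = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
print(separation)  # the 1e-6 initial difference has been amplified enormously
```

Nothing here goes beyond classical mechanics; the surprise lies in the behavior the unfolding reveals, not in the equations themselves.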

This leads us to a type of novelty related to what Humphreys calls conceptual emergence. Something is conceptually emergent relative to a theoretical framework F when it requires a conceptual apparatus that is not in F to be effectively represented [34.41, p. 131], [34.123, p. 585]. The conceptual apparatus may require new predicates, new laws, and sometimes the introduction of a new theory. Importantly, conceptual emergence is not merely an epistemic notion: it does not depend on the concepts we already possess, and the conceptual irreducibility holds between two conceptual frameworks. Further, even if instances of the target pattern can be described at the microlevel without the conceptually emergent concepts, the description of the pattern itself, if it is made without these novel concepts, is bound to be a massive disjunction of microproperties, which misrepresents the macro-pattern qua pattern. Also, the same conceptually emergent phenomenon may arise in different situations, and its description may therefore require an independent conceptual framework, just as the regularities of the special sciences require new concepts, unless one is prepared to describe their content in terms of a massive disjunction of all the cases they cover [34.126].

Interestingly, various phenomena investigated by computational science are conceptually emergent. Even if computer simulations are sufficient to generate them, identifying, presenting, and understanding them may require further analyses of the simulated data, redescriptions at higher scales, and the development of new theoretical tools. For example, stop-and-go patterns in CA models of car traffic, emergent phenomena in agent-based simulations, and much of the knowledge acquired in classical fluid dynamics seem to correspond to the identification and analysis of conceptually emergent phenomena. Effectively, it is by conceptually representing these phenomena in different frameworks that one manages to gain novel information about these systems, above and beyond our blind knowledge of the microdynamics that generates them.
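A toy example may help (it is mine, not the chapter's): in a Rule-184-style traffic cellular automaton, the update rule mentions only a cell and its nearest neighbors, while "jam" and "free flow" are macro-level descriptions couched in an independent vocabulary:

```python
# Minimal Rule-184-style traffic CA on a ring road. The microrule only
# says: a car (1) advances iff the cell ahead is empty. "Jam" and
# "free flow" are higher-level concepts not present in the microrule.

def step(road):
    n = len(road)
    new = [0] * n
    for i in range(n):
        if road[i] == 1 and road[(i + 1) % n] == 1:
            new[i] = 1                       # blocked car stays put
        elif road[(i - 1) % n] == 1 and road[i] == 0:
            new[i] = 1                       # the car behind moves in
    return new

road = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0]  # an initial jam on the left
for _ in range(4):
    print(''.join('#' if c else '.' for c in road))
    road = step(road)
```

Each printed line is fully determined by the microrule, but describing what the lines show (a jam dissolving from its downstream edge) already requires the macro-level vocabulary.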

It is important to emphasize that the different types of novelty described above are also met in experiments exploring the behavior of systems for which the fundamental physics is known. In other words, the potential novelty of experimental results should not be overemphasized. Even if only experiments can confound us [34.127, pp. 220–221], through results that are not covered by our theories or models, many of the new empirical data that these experiments provide us with are no more novel than those produced by computer simulations. The statements describing these results are not strongly referential, in the sense that no unknown aspects of the deep nature of the corresponding systems would be unveiled by a radically new act of reference [34.87, pp. 3463–3464]. These statements derive from what we already know about the physical systems investigated, and the experimental systems unravel them for us. In this sense, they are merely weakly referential.

34.4 Computer Simulations, Explanation, and Understanding

Can scientists provide explanations by simulating phenomena? If the answer is based on the explanatory requirements corresponding to existing accounts of explanation, it is hard to see why some computer simulations could not be explanatory (Sect. 34.4.1). Why the specificities of computer simulations should necessarily deprive them of their explanatory potential is also unclear (Sect. 34.4.2), and this is compatible with the claim that computer simulations are used for inquiries whose results are, on average, less explanatory (Sect. 34.4.3). Be this as it may, because they rely heavily on informational and computational resources, computer simulations challenge our intuitions about explanatoriness, in particular the expectation that good explanations should enable scientists to enjoy first-person objective understanding of the systems they investigate (Sect. 34.4.4). Even if computer simulations fail to meet these expectations because of their epistemic opacity, understanding may sometimes be regained by appropriately visualizing the results or by studying phenomena at a coarser level (Sect. 34.4.5). In any case, scientific judgments about such issues are influenced by disciplinary norms, which may themselves evolve with the development of computational science (Sect. 34.4.6).

34.4.1 Traditional Accounts of Explanation

Philosophers of science have discussed the issue of scientific explanation intensively over the last decades. The seminal works of Hempel were published in the 1940s, when computational science started to develop. However, until recently, discussions about computer simulations and about explanation did not interfere with each other – which could suggest that, for theorists of explanation, how explanations are produced does not in fact matter. While it is true that many of the examples of explanatory inquiries analyzed in the literature are simple and, at least in their most elementary versions, do not belong to computational science, it is hard to see why computer simulations could not in some cases satisfy the requirements corresponding to major accounts of explanation. According to the deductive-nomological (hereafter DN) model, one explains a phenomenon when a sentence describing it is logically deduced from true premises essentially containing a scientific law [34.128, pp. 247–248]. For example, the explanation of the trajectory of a comet by means of a computer simulation of its trajectory, based on the laws of classical (or relativistic) mechanics together with the initial positions of all bodies significantly influencing it, seems to qualify as a perfect example of DN explanation – provided that computer simulations can be seen as deductions [34.91, 115].
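A toy version of the comet example can make the DN structure visible (the one-body setup, units, and numerical values are illustrative assumptions of mine, not details from the chapter): the simulation deduces later states step by step from Newton's law of gravitation plus initial conditions, which is structurally what the DN model requires.

```python
import math

G_M = 4 * math.pi ** 2   # gravitational parameter in AU^3/yr^2 (Sun-like, illustrative)

def step(pos, vel, dt):
    """One symplectic-Euler step: law (inverse-square gravity) + state in, state out."""
    r = math.hypot(*pos)
    ax, ay = -G_M * pos[0] / r ** 3, -G_M * pos[1] / r ** 3
    vel = (vel[0] + dt * ax, vel[1] + dt * ay)
    pos = (pos[0] + dt * vel[0], pos[1] + dt * vel[1])
    return pos, vel

# Premises: initial conditions for a body at 1 AU with circular orbital speed
pos, vel = (1.0, 0.0), (0.0, 2 * math.pi)
for _ in range(1000):            # one year in steps of 0.001 yr
    pos, vel = step(pos, vel, 0.001)

print(math.hypot(*pos))  # the deduced position stays near 1 AU, as the law dictates
```

Whether each such run counts as a genuine logical deduction, rather than a physical process that mimics one, is precisely what the proviso at the end of the paragraph leaves open.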

Analogous statements can be made concerning the causal and unificationist models of explanation. The computer simulation of the comet's trajectory is a way to trace the corresponding causal processes, described in terms of mark transmission [34.129] or of conserved quantities such as energy and momentum [34.130]. Other causal theorists of explanation, like Railton, have claimed that explanatory progress is made by detailing the various causal mechanisms of the world and all the nomological information relevant to the investigated phenomenon; the corresponding "ideal explanatory text" is thereby slowly unveiled [34.131]. One should note, however, that because such ideal explanatory texts are necessarily complex, their investigation is almost inevitably carried out by computational means.

Similarly, computer simulations can sometimes be instantiations of argument patterns that are part of what Kitcher describes as the explanatory store unifying our beliefs [34.107]. For example, the computation of the comet's trajectory can be seen as an instantiation of "the Newtonian schema for giving explanations of motions in terms of underlying forces" [34.132, pp. 121, 179].

Be this as it may, computer simulations have often been claimed, by both scientists and philosophers, to be somewhat problematic as regards explanatoriness, and to lack some of the features that are expected to go with the fulfillment of explanatory requirements. This reproach of unexplanatoriness can be understood in several senses.

34.4.2 Computer Simulations: Intrinsically Unexplanatory?

One may first claim that computer simulations in general, or some specific types of them, do not meet one's favorite explanatory requirements. For example, agent-based simulations may be described as not usually involving covering laws or providing explanatory causal mechanisms or histories [34.75, 133]. However, one should not direct at computer simulations reproaches that should be made to the field itself. If a field does not offer well-entrenched causal laws, and one is convinced that explanations should be based on such laws, then the computer simulations made in that field are not explanatory, but this has nothing to do with computer simulations in general. Also, some computer simulations are built with scientific material like phenomenological regularities, which potentially makes them unexplanatory, but this material could also be used in explanatory inquiries involving arguments or closed-form solutions to models. Thus, the problem comes from the use of this material and not from reliance on one or another mode of demonstration – claiming that computer simulations are unexplanatory is like blaming the hammer for the hardness of the rock.

For this reproach to be meaningful (and specific to computer simulations), it should be the case that other inquiries based on the same material are indeed explanatory, but that the corresponding explanations based on computer simulations are not, because of specific features of computer simulations or of some types of them. It is not completely clear how this can be so. Computer simulations are simply means of exploring scientific models and hypotheses by implementing algorithms, which provide information about tractable versions of these models or hypotheses. Therefore, their explanatory peculiarity, if any, should be an effect of specific features like the use of algorithms, coding languages, or external computational processes.

There is no denying that the need to format scientific models and hypotheses into representations suitable for computational treatment comes with constraints. For example, a recent challenge has been to adapt coupled circulation models and their algorithms to the architecture of modern massively parallel supercomputers. Similarly, when one uses CA models for fluid dynamics, the physical hypotheses must be expressed in the straitjacket of update rules between neighboring cells on a grid. Beyond these genuine constraints on computational practices, one should remember that computational languages, provided they are rich enough, are content neutral, in the sense that any content that can be expressed in some language can also be expressed in them. Similarly, computational devices like the computers we use daily are universal machines, in the sense that any solution to a computational problem (or inference) that can be produced by other machines can also be produced by them. For these reasons, it is hard to see why, in principle, computer simulations should be explanatorily limited, since the theoretical content and inferences related to other means of inquiry can also be processed by them.

The case of the CA models mentioned above nicely exemplifies this point. For several decades, CA models have been used under various names in various fields: from Schelling's investigations of spatial segregation in neighborhoods, the analysis of shock waves in models of car traffic, models of galaxies, and investigations of the Ising model, to fluid dynamics (see Ilachinski [34.134] for a survey). Because existing theories and scientific laws are not expressed in terms of CA, some philosophers have claimed that CA-based simulations were merely phenomenological [34.135, pp. 208–209], [34.9, p. 516]. Nevertheless, Barberousse and Imbert [34.100] have argued that such bold general statements do not resist close scrutiny. They present the case of lattice gas models of fluids and argue that, beyond their unusual logical nature, from a physical point of view, such mesoscopic models and computer simulations make use of the same underlying physics of conserved quantities as more classical models, and can be seen as no less theoretical than concurrent computer simulations of fluids based on macroscopic Navier–Stokes equations. Therefore, there is no reason why such computer simulations could not be usable for similar explanatory purposes.
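To give a concrete flavor of the point that CA-style local rules can encode the same conserved-quantity physics as continuum models, here is a minimal sketch (a deliberately simple one-dimensional toy, not any of the published lattice gas models) in which a purely neighbor-to-neighbor update rule nevertheless conserves total mass, just as a discretized diffusion equation would:

```python
import numpy as np

def step(density, d=0.25):
    """One synchronous update: each cell sends a fraction d of its
    content to each neighbor (periodic boundaries). The rule is purely
    local, yet it conserves the total quantity exactly."""
    left = np.roll(density, 1)
    right = np.roll(density, -1)
    return density - 2 * d * density + d * (left + right)

density = np.zeros(100)
density[50] = 1.0            # all mass initially concentrated in one cell
total0 = density.sum()
for _ in range(200):
    density = step(density)

# The update rule never mentions conservation explicitly, but the total
# is conserved: the CA encodes the same physics as a diffusion equation.
```

Nothing in the local rule names a conservation law, yet one holds globally, which is the sense in which such models can be as theoretically grounded as their continuum rivals.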

Overall, there is no denying that some (and possibly many) computer simulations are not explanatory. Providing various examples of unexplanatory computer simulations is scientifically valuable, but it establishes nothing about a general lack of explanatory power, unless one shows why unexplanatoriness stems from specific features of (some types of) computer simulations qua simulations. In the absence of such conceptual analyses, one can simply conclude that some scientific uses of computer simulations, or some computational practices, turn out to be unexplanatory.

34.4.3 Computer Simulations: More Frequently Unexplanatory?

A different claim is that, given the current uses of computer simulations in science, they are more often unexplanatory than other scientific items or activities, even if this is partly a contingent matter of fact. The explanatoriness of computer simulations can be threatened in various ways. Computational models may be built on false descriptions of target systems or may lack theoretical support and simply encapsulate phenomenological regularities; they may have been spoiled by the approximations, idealizations, and modeling tricks used to simplify models and make them tractable; they may depart from the well-entrenched explanatory norms in a field or may not correspond to accepted explanatory methods. Clearly, none of these features is specific to computer simulations. However, it may be the case that, because of their current uses in science, computer simulations more frequently instantiate them.

The Lure of Computational Explorations

Because they are powerful heuristic tools, and because other means of exploration are often not available, computer simulations are more often used to toy and tinker with hypotheses, models, or mechanisms and, more generally, to experiment on theories [34.135, 136]. This may especially be the case in fields where there is no well-established theory to justify (or invalidate) the construction of models, or where collected evidence is not sufficient to check that the simulated mechanisms correspond to actual mechanisms in target systems. For example, in cognitive science, competing theories of the mind and its architecture coexisted for decades, and even modern imaging techniques like fMRI (functional magnetic resonance imaging), though empirically informative, do not provide sufficient evidence to determine precisely how the brain works in terms of causal mechanisms. Accordingly, in this field, developing a model that is able to simulate the cognitive performances of an agent does not imply that one has understood and explained how her brain works, and more refined strategies that constrain the functional architecture must be developed if one wants to make explanatory claims [34.4, Chap. 5]. The issue is all the more complex in this specific field since the inquiry may also involve determining (versus assuming) whether neural processes are computations [34.137]. Similarly, in the social sciences, empirically validating a simulation is far from straightforward and, as a result, the epistemic and, in particular, explanatory value of computer simulations is often questionable [34.138].

Overall, since computer simulations offer powerful tools to investigate hypotheses and match phenomena, it is tempting for scientists to take a step further and claim that their computer simulations have explanatory value. In brief, computer simulations offer a somewhat natural environment for such undue explanatory claims.

The Worries of Under-Determination

In the case of computer simulations, the higher frequency of inappropriate explanatory claims may be reinforced by the combination of several factors.

When toying with hypotheses, scientists are often interested in trying to reproduce some target phenomenology, so they often do not tinker in a neutral way. The specific problem with computer simulations is that, in many cases, getting the phenomenology right is somewhat too easy, and the general problem of the under-determination of theoretical claims by the evidence is particularly acute.

First, computer simulations are often used in cases where data are scarce, incomplete, or of low quality (see, e.g., [34.78, Chap. 10] for the case of climate data and how making data global was a long and difficult process). The scarcity of data can also be a primary motivation for using computer simulations to inquire about a system for which experiments are difficult or impossible to carry out, as in astrophysics [34.139]. Furthermore, knowledge of the initial and boundary conditions with which the computer simulations should be fed may also be incomplete, which leaves more latitude for scientists to fill in the blanks and possibly match data. As a result, confidence in the results of computer simulations like the Millennium Run, and in their representational and explanatory success, is in part undermined [34.139].

Second, computer simulations usually involve more variables and parameters than theories. For example, for a 10 × 10 grid with cells characterized by three variables, the total number of variables is already 300. This raises the legitimate suspicion that, by tuning variables in an appropriate way, there is always a means to obtain the right phenomenology. (Ad hoc tuning is of course not completely straightforward, since the many variables involved in a computer simulation are usually jointly constrained. Typically, in a fluid simulation, all cells of the grid obey the same update rule and are correlated.) This possibility of tuning variables and parameters is indeed used in fields like machine learning, which can be based, for example, on the use of artificial neurons. In such fields, one first combines a limited number of elementary mathematical functions (e.g., artificial neurons) that, when adequately parameterized, reproduce potentially complex behaviors found in databases (the learning phase). In a second step, one uses the parameterized functions (e.g., the trained neural network) on new cases in the hope that extrapolation and prediction are possible. In such cases, even if the right phenomenology is reproduced, and extrapolation partly works, it is clear that the trained neural network and the corresponding mathematical functions do not explain the phenomena. Overall, this means that the ability to reproduce some potentially complex phenomenon is far from sufficient to claim that the corresponding computer simulation has explanatory power (see also [34.140] on the over-fitting of computer simulations to data).
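The two-step learning-then-extrapolation pattern can be sketched in a toy example (with synthetic data; a cubic polynomial stands in here for the parameterized family of functions, e.g., a small trained network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Learning phase: fit a small parameterized family of functions to
# noisy observations of an unknown process (here, secretly a sine).
x_train = np.linspace(0.5, 5.5, 50)
y_train = np.sin(x_train) + rng.normal(0.0, 0.05, size=50)
coeffs = np.polyfit(x_train, y_train, deg=3)

# Second step: apply the fitted parameters to new cases in the hope
# that interpolation and extrapolation work.
x_new = np.array([1.0, 2.0, 3.0])
y_pred = np.polyval(coeffs, x_new)

# The fit reproduces the phenomenology reasonably well, but the four
# fitted coefficients do not explain why the data are sinusoidal:
# reproducing behavior is not yet explaining it.
fit_error = np.max(np.abs(np.polyval(coeffs, x_train) - y_train))
```

The fitted coefficients carry no information about the mechanism that generated the data, which is precisely the sense in which a good match is not yet an explanation.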

Third, when scientists do succeed, they may be subject, like other human creatures, to confirmation biases, overweigh their success, and tend to ignore the fact that various mechanisms or laws can produce the same data (or that other aspects of their computer simulations do not fit). While such biases are not specific to computational inquiries, they are all the more epistemologically dangerous since matching phenomena is easy.

Complex Systems Resist Explanation

Because they are very powerful tools, computer simulations are specifically used for difficult investigations, which usually have features that may spoil their explanatory character [34.141, 142]. Typically, in the natural sciences, computer simulations and computational methods are centrally used for the study of so-called complex systems [34.143, 144], see also Chap. 35. Realistically investigating complex systems would imply taking into account many interrelated nonlinear aspects of their dynamics, including long-distance interactions, and, in spite of the power of modern computers, the corresponding models are usually intractable. Therefore, drastic simplifications need to be made in both the construction of the model and its mathematical treatment, which often threatens the epistemic value of the results.

Importantly, for the above reasons, the problem of the explanatory value of computer simulations can arise even in fields like fluid dynamics where the underlying theories are well known. It is no surprise that this problem is more acute in fields, such as the human and social sciences, in which no such theories are available, the investigated objects are even more complex, sound data are more difficult to collect and interpret, and the very nature of what counts as a sound explanation and genuine understanding is more debated [34.145, 146], especially in relation to computer simulations [34.133]. For these reasons, even if there are good arguments for claiming that computer simulations do not fare worse than other methods like analytic models or experiments (see [34.40] for the case of economics), it is not surprising that their potential explanatory power is undervalued.

Overall, it is plausible that computer simulations often have less explanatory power than other methods, and that this does not stem from their nature but from the type of uses they usually have in science. If this is the case, the question of the explanatory power of computer simulations is to be treated on a case-by-case basis, by using the same criteria as for assessing the explanatory power of other scientific activities, pace the distrust that shrouds the use of computer simulations.

34.4.4 Too Replete to Be Explanatory? The Era of Lurking Suspicion

Theories of explanation should capture our intuitions about what is explanatory. From this point of view, it is interesting to see whether computer simulations meet these intuitions, especially when they fulfill the explanatory requirements described by theories of explanation.

Computer Simulations and Explanatory Relevance

Good explanations should not include explanatorily irrelevant material. While determining whether some piece of information is explanatorily relevant to explaining some target fact is a scientific task, finding a satisfactory notion of explanatory relevance is a task for philosophers. Despite progress on this problem, current accounts of explanation still fall short of capturing this notion [34.147, 148]. At the same time, existing results are sufficient to understand why computer simulations raise concerns regarding explanatory relevance.

Scientific information, in particular causal laws, accounts for the behavior of phenomena. Thus, it is legitimate, when trying to explain some phenomenon, to show that its occurrence can be derived from a scientific description of the corresponding system. Nevertheless, even then, one may fall short of satisfying the requirement of explanatory relevance. This is clearly explained by Salmon in his 1989 review of theories of explanation, where he asks "Why are irrelevancies harmless to arguments but fatal to explanations?" and further states that "irrelevant premises are pointless, but they have no effect whatever on the validity of the argument" [34.149, p. 102]. While philosophers have mainly focused on the discussion of irrelevant unscientific premises, the problem actually lies deeper. Parts of the content of laws or mechanisms, essentially involved in explanatory arguments, can be irrelevant to the explanation of aspects of phenomena that are covered by these laws or mechanisms [34.148]. So the problem is not simply to discard inessential (unscientific or scientific) premises, but also to determine, within the content of the scientific premises that are essentially used in explanatory derivations, what is relevant and what is not [34.102, 148, 150].

This problem is especially acute for computer simulations. Take a computer simulation that unfolds the detailed evolution of a system based on the description of its initial state and the laws governing it. Then all aspects of the computational model are actually used in the computational derivation. At the same time, not all such aspects are necessarily explanatorily relevant with respect to all facets of the computed behavior. Typically, some aspects of the computed behavior may simply depend on the topology of the system, on symmetries in its dynamics or initial conditions, on the fact that some initial quantity is above some threshold, etc.

Accordingly, the following methodological maxim may be proposed: the more an explanation (resp. an argument) contains independent pieces of scientific information, the more we are entitled to suspect that it contains irrelevancies (regarding the target behavior).

At the same time, one should remain aware that explaining some target phenomenon may sometimes irreducibly require that all the massive gory details involved in the simulation of the corresponding system be included. For example, as chaos theory shows, explaining the emergence and evolution of a hurricane may essentially require describing the flapping of a butterfly's wings weeks earlier.
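This chaos-theoretic point admits a one-screen numerical illustration (the logistic map below is a standard stand-in for sensitive dependence, not a weather model): a "butterfly-sized" difference of 10⁻¹⁰ in the initial condition is amplified to an order-one difference within a few dozen steps, so no detail of the initial state can safely be discarded from an explanation of the later state.

```python
def trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map x -> r * x * (1 - x) from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.2)
b = trajectory(0.2 + 1e-10)          # a tiny perturbation of the start
gap = [abs(u - v) for u, v in zip(a, b)]
# gap[0] is negligible; within a few dozen iterations the two
# trajectories have completely decorrelated.
```

At r = 4 the map is chaotic with Lyapunov exponent ln 2, so the initial gap roughly doubles at every step until it saturates at order one.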

An additional problem is that there is no general scientific method to tell whether a premise, or some part of the information it conveys, is relevant. Contrary to what the hexed salt example [34.151] may perhaps suggest, irrelevant pieces of information within an explanation do not wear their irrelevance on their sleeves and are by no means easy to identify. This is the problem of the lack of transparency, or of opacity, of irrelevant information.

Overall, since they are based on informationally replete descriptions of their target systems, computer simulations legitimately raise the suspicion of being computational arguments that contain many irrelevancies, and therefore of being poor explanations – even when they are not.

Computer Simulations, Understanding, and Inferential Immediacy

Mutatis mutandis, similar conclusions can be reached regarding the issue of computational resources. Since this issue is closely related to the question of how much computer simulations can bring about understanding, things shall be presented through the lens of this latter notion.

It is usually expected that explanations bring understanding. Theorists of understanding, while disagreeing on the precise nature of this notion, have explored its various dimensions, which provides a good toolkit to analyze how computer simulations fare on this issue.

Hempel cashes out the notion of understanding in terms of nomic expectability. From this point of view, taken as explanatory arguments, computer simulations seem able to provide understanding since, like other scientific representations, they can rely on nomological regularities. Further, in contrast to sketchy explanations, they make the nomic dependence of events explicit. Consider the explanation analyzed by Scriven that "the impact of my knee on the desk caused the tipping over of the inkwell" [34.152]. The hidden strategy described by Woodward [34.153] is to claim that the value of this latter nonnomological explanation is to be measured against an ideal explanation, which is fully deductive and nomological and describes the detailed succession of events that led to the stain on the carpet, even if this complete explanation is often inaccessible. From this point of view, a computer simulation can offer a way to approach such an ideal explanation, by providing an explicit deduction of the lawful succession of events that brought about the explanandum. However, an epistemic problem is that, once such a computer simulation has been carried out (and properly stored), it is possible to explicitly highlight any part of it, but not to scrutinize all parts, because there are too many of them. This is one of the reasons why computer simulations are intrinsically opaque to human minds [34.41, §5.3], see also Sect. 34.3.2.

Be this as it may, causal theorists of explanation should agree that computer simulations often contribute significantly to developing our understanding by reducing uncertainty about the content of causal ideal explanatory texts, as requested in [34.131].

Computer simulations also seem to be able to provide unificatory understanding. For unificationists like Kitcher, understanding is a matter of "deriving descriptions of many phenomena using the same pattern of derivation again and again" [34.107, p. 423]. Since computer simulations offer more ways of deriving phenomena, by providing new patterns of derivation or instantiating existing patterns in more complex cases, at least some of them contribute to unification.

Things are less straightforward with Woodward's account of explanation and understanding. Woodward argues that a good explanation provides "understanding by exhibiting a pattern of counterfactual dependence between explanans and explanandum" [34.154, p. 13]. From this point of view, computer simulations fare well since, if one does not go beyond their domain of validity, they provide general patterns of counterfactual dependence between their inputs I and outputs O, which are obtained by applying t times their update algorithms (UA), that is, more formally, O(t, I) = UA^t(I).

Is there a philosophical catch? Woodward also requires that the pattern of counterfactual dependence be described in terms of a functional relation. But what is to count as a function in this context? Functions can be defined explicitly (by means of algorithms) or implicitly (by means of equations). The advantage of computer simulations is that they provide algorithmic formulations, based on elementary operations, of how the explanandum varies with the explanans. From this point of view, computer simulations are more explicit than models, which simply provide equations linking the explanans and the explanandum. However, the problem is that with computer simulations any kind of functional immediacy is lost, since it is computationally costly to carry out the algorithm. Indeed, Woodward usually describes straightforward examples of functional dependence like Y = 3X1 + 4X2. With such functions, we may feel that the description of the counterfactual dependence is just there since, by simply instantiating the variables and carrying out the few operations involved, specific numerical relations are accessible. In such simple cases, a human mind can do the work by itself and answer the corresponding what-if-things-had-been-different (what-if) questions. In contrast, with a computer simulation, computing the output takes much computational power. So the tentative conclusion is that computer simulations provide understanding in Woodward's sense, but this understanding is not immediately accessible, the degree of (non-)immediacy being described by the computational resources it takes to answer each what-if question. Importantly, an equation-based model may give the illusion of immediacy, since the equation presents a short description of how the variables are correlated. However, one should keep in mind that short equations can be unsolvable, and short descriptions of algorithms (like O(t, I) = UA^t(I)) with simple inputs can yield complex behaviors that are computationally costly to predict [34.155].
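The contrast can be made concrete with a toy sketch (the particular update rule below is illustrative, not drawn from any actual model): the closed-form function answers a what-if question with a couple of arithmetic operations, while the algorithmically defined function O(t, I) = UA^t(I) must be re-run step by step for every counterfactual input.

```python
def closed_form(x1, x2):
    """Woodward-style explicit dependence: Y = 3*X1 + 4*X2."""
    return 3 * x1 + 4 * x2

def update_algorithm(state):
    """One elementary step of a hypothetical simulation (illustrative)."""
    return (1.01 * state + 0.5) % 10.0

def simulate(initial_input, t):
    """O(t, I) = UA^t(I): the output is defined only via t-fold iteration."""
    state = initial_input
    for _ in range(t):       # cost grows with t: no inferential immediacy
        state = update_algorithm(state)
    return state

# What if X1 had been 2 instead of 1? One subtraction answers it:
delta = closed_form(2, 5) - closed_form(1, 5)
# What if the initial state had been 1.001 instead of 1.0?
# No shortcut: the whole computation must be carried out again.
out_a = simulate(1.0, 10_000)
out_b = simulate(1.001, 10_000)
```

The asymmetry is the point: each what-if question about the iterated system costs another full run, which is the loss of functional immediacy discussed above.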

Similar conclusions can be reached if one focuses on analyses of understanding proper. De Regt and Dieks propose to analyze understanding in terms of intelligibility, where this latter notion implies the ability to recognize qualitative characteristic consequences without performing exact calculations [34.156]. In this sense, understanding seems to be a matter of immediacy, as was already suggested by Feynman, who described it as the ability to foresee the behavior of a system, at least qualitatively, or the consequences of a theory, without solving exactly the equations or performing exact calculations [34.157, Vol. 2, 2–1].

Depending on the case, foreseeing consequences requires logical and cognitive operations to a greater or lesser extent. Thus, the above ideas may be rephrased in a more gradualist way, by saying that the fewer inferential or computational steps one needs to go through to foresee the behavior of a system or the consequences of a theory, the better we understand it. In this perspective, computer simulations fare terribly badly, since they involve going through many gory computational steps and, even once these have been carried out, scientists usually end up with no simple picture of the results and no inferential shortcuts that could exempt them from this computational stodginess in future similar investigations.

Understanding: What Do We Lose with Computer Simulations?

Before the advent of computational science, explanatory advances in science were always the direct product of human minds and pen-and-rubber methods. Therefore, any actual scientific explanation that satisfied the requirements for explanatoriness was also human sized, and the epistemic benefits logically contained within such explanations could actually be enjoyed by competent and informed epistemic agents. In [34.158, p. 299], Hempel states that an explanatory argument shows that "the occurrence [of an event] was to be expected" and he adds "in a purely logical sense." This addition emphasizes that expectation should not be understood as a psychological notion, nor refer to the psychological aspects of the activity of explaining. In the case of computer simulations, this addition is somewhat superfluous. Nomic expectability remains for scientists since, based on computer simulations, they may know that they can entertain the belief that an event should happen. However, this belief is completely cold. Since the activity of reasoning is externalized in computers, it is no longer part of the proper cognition of scientists and does not come with the psychological side-effects associated with first-person epistemic activities, such as emotions or feelings of expectation, impressions of certainty and clarity, or the oft-mentioned aha or eureka feeling that usually comes with first-person experiences of understanding. In other words, with computer simulations, the mind is no longer the carrier of the activity of explanation, and simply records what it should believe. Unfortunately, the epistemic benefits associated with the individual ability to carry out this activity are also lost. Since the explanatory argument can no longer be surveyed by a human mind, the details of the relations between the premises of the explanatory argument and its conclusion are opaque. Therefore, scientists are no longer able to encompass uno intuitu all aspects of the explanation and how they are related, or to develop expectations about counterfactual situations (in which similar hypotheses are met), and the unificatory knowledge that only global insights can provide is also lost. Overall, with computer simulations, the objective intelligibility that is enclosed in explanations and can be accessed by first-person epistemic appropriation of the explanatory arguments can no longer be completely enjoyed by scientists (see also [34.159] for further analyses of epistemic opacity in this context). In this perspective, the problem of computer simulations is not that they have less explanatory value but that we cannot have epistemic access to this explanatory value. In brief, this problem would not pertain to the logic of computer-simulation-based explanations but to their epistemology.

New Standards for Understanding?

The gradualist description regarding the need for cognitive and logical operations to foresee consequences (see Sect. 34.4.4, Computer Simulations, Understanding, and Inferential Immediacy) suggests that the boundary between cases where intelligibility is present and cases where it is lost is not completely sharp. Importantly, the ability to foresee consequences depends on various factors, such as the knowledge of physical or mathematical theorems that facilitate deductions, the knowledge of powerful formalisms that facilitate inferences, how much the intuition of scientists has been trained to anticipate consequences of a certain type and has somewhat internalized inferential routines, etc. [34.102, §6.4]. In other words, at least in some cases, the frontiers of what has a computational explanation but remains unintelligible to a human mind can be pushed back to some extent.

This raises the question of how much the frontiers of intelligibility can be extended and whether the ideal of inferential or computational briefness for explanations should be considered a normative standard. Two positions are possible. One may claim that genuine explanations should always yield the possibility for human subjects to access the corresponding understanding. Or one may claim that, as shown by computational science, we have gone beyond human-sized science, that not all good explanations can be comprehended by human minds, and that this is not a defect of our science, even if it is clearly an epistemic inconvenience.

A motivation for endorsing the former claim is that the lack of intelligibility of explanations often stems from epistemic flaws of the agents producing them and can be corrected. Typically, in science, results are often laboriously proved and, with the advance of scientific understanding, shorter and clearer proofs, or quicker algorithms, are found.

Overall, it seems sound to adopt the following methodological maxim: the more resources we need to produce (or check) an explanation (resp. an argument, a proof), the more we are entitled to suspect, in the absence of contrary evidence, that the explanation is unduly complex. From this point of view, computer simulations do not seem flawless, since they make abundant use of computational and inferential resources. Accordingly, it is legitimate to suspect computer simulations of providing unduly complex explanations, which have simpler versions yielding the expected accessible understanding.

Nevertheless, this philosophical stance may be inappropriate in many cases. There is a strong suspicion that explaining phenomena often requires using an irreducible amount of resources. This idea of computational irreducibility has been vocally advanced, though not clearly defined, by Wolfram [34.155], and philosophers have toyed with close intuitions in recent discussions about emergence [34.74, 123, 124, 160–162]. Capturing the idea in a clear, robust, and fruitful definition is a difficult ongoing task [34.163]. However, there seems to be an agreement that this intuitive notion is not empty, which is what matters for the purpose of the present discussion. Overall, this means that in all such cases, asking for computationally simple explanations does not make sense, since such explanations do not exist. In this perspective, tailoring our explanatory ideals to our human capacities is wishful thinking, since in many cases, the inaccessibility of the usual epistemic benefits of explanations does not stem from our epistemic shortcomings.
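Wolfram's stock example can be made concrete: elementary cellular automaton rule 30, whose center column is conjectured to be computationally irreducible, in the sense that, as far as anyone knows, there is no essentially quicker way to obtain its values than running the rule itself. A minimal sketch:

```python
def rule30_center_column(steps):
    """Run elementary CA rule 30 from a single black cell and return
    the center-column values; no known shortcut computes them faster
    than iterating the rule itself."""
    width = 2 * steps + 3           # wide enough that the edges never matter
    row = [0] * width
    row[width // 2] = 1
    column = [1]
    for _ in range(steps):
        new_row = [0] * width
        for i in range(1, width - 1):
            left, center, right = row[i - 1], row[i], row[i + 1]
            new_row[i] = left ^ (center | right)   # rule 30 in Boolean form
        row = new_row
        column.append(row[width // 2])
    return column

column = rule30_center_column(20)   # begins 1, 1, 0, 1, ...
```

The rule fits on one line, yet predicting the nth center-column bit appears to cost on the order of running all n steps, which is the intuition behind computational irreducibility.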

This suggests that we may have to bite the bullet and say that, sometimes, computer simulations do bring full-fledged explanation and objective understanding, even if, because of our limited cognitive capacities, we cannot enjoy this understanding and the epistemic benefits harbored by such explanations. In other words, both of the above philosophical options are correct, though in different cases.

Ideally, one would like to be able to know when each of these two options should be adopted. Unfortunately, determining whether a computational process can be shortcut, or a computational problem solved by quicker algorithms, seems to be opaque in practice (the problem of the lack of transparency of the optimality of the computational process). This means that in most cases, when facing a computational explanation of a phenomenon, one does not know whether there are computationally or inferentially shorter versions of this explanation (and we are to be epistemically blamed for being so explanatorily laborious), or whether one cannot do better (and the process is intrinsically complex).

Overall, because determining whether explanations are informationally minimal (regarding the use of relevant information) and whether arguments or computations are optimal is opaque, computer simulations are doomed to remain shrouded in suspicion about their explanatoriness, even in cases in which there is no better (that is, shorter or less informationally replete) explanation. In brief, the era of suspicion regarding the explanatoriness of computer simulations will not end soon.

34.4.5 Bypassing the Opacity of Simulations

Even when computer simulations are epistemically opaque, some strategies can be tried to regain predictive power, control, and potentially understanding regarding the corresponding inquiries.

Understanding, Control, and Higher-Level Patterns

As emphasized by Lenhard [34.159], by manipulating computational models and observing which behavior patterns are obtained, scientists can try to control the processes involved and develop "a feeling for the consequences." Lenhard suggests that this understanding by control, which is oriented toward design rules and predictions, corresponds to a pragmatic account of understanding, which is also involved in the building of reliable technological artifacts.

Other authors have emphasized that, even if the details of computer simulations cannot be followed by human minds, one may sometimes still obtain valuable insights by building coarse-grained representations of the corresponding target systems and analyzing whether macro-dynamics emerge when microinformation is thrown away [34.164]. Surprisingly, the existence of coarse-grained dynamics seems to be compatible with complex, potentially computationally irreducible, dynamics at the microlevel [34.165, 166], even if this by no means warrants that control or understanding can always be regained at the macro-level. Thus, the question arises as to when and to what extent epistemic virtues like predictive power, control, and potentially understanding, which are somewhat lost at the microlevel, can be partly recovered at the macro-level, and how the corresponding patterns can be detected. The treatment of such questions requires the analysis of logical and mathematical relations between descriptions of systems at different scales and, for this reason, it should gain from ongoing debates and research in the philosophical and scientific literature about the emergence of simple behavior in complex systems.
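The contrast between intractable micro-dynamics and tame coarse-grained behavior can be illustrated with a toy sketch. The choice of elementary cellular automaton rule 30 (often cited as a candidate for computational irreducibility) and of cell density as the macro-variable are my own illustrative assumptions, not examples taken from the text:

```python
# Micro-level: rule 30, whose cell-by-cell evolution is hard to predict.
# Macro-level: the density of live cells, a coarse-grained variable that
# quickly settles near a stable value once microinformation is discarded.

def rule30_step(cells):
    """One update of elementary cellular automaton rule 30 (periodic boundaries)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

def density(cells):
    """Coarse-grained macro-variable: fraction of live cells."""
    return sum(cells) / len(cells)

def run(width=201, steps=200):
    cells = [0] * width
    cells[width // 2] = 1            # single seed at the center
    densities = []
    for _ in range(steps):
        cells = rule30_step(cells)
        densities.append(density(cells))
    return densities

densities = run()
# The micro-state stays unpredictable, but the macro-variable is comparatively tame:
print("late-time mean density:", round(sum(densities[-50:]) / 50, 3))
```

Throwing away the micro-configuration and keeping only the density is exactly the kind of coarse-graining move at issue; whether such a macro-variable supports prediction or understanding in a given case is, as the text notes, an open question.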

Visualization and Understanding

Another important issue is how to exploit macro-level patterns that are present in computer simulations to restore partial cognitive grasp of the simulated systems by humans. Given the type of creatures that we are, and in particular the high visual performance of the brain, using visual interfaces can be part of the answer. Indeed, the format of scientific representations partly determines what scientists can do with them – whereas, as emphasized by [34.41, p. 95], philosophers have often considered the logical content of a representation to be the only important element to analyze. To go further into these issues, sharp analyses of representational systems and their properties are required. Tools and concepts developed in the Goodmanian tradition prove to be extremely useful here [34.167]. For example, Kulvicki [34.29] highlights how much more immediately graphs and images can present information than symbolic representations can. This notion of immediacy is cashed out in terms of semantic salience, syntactic salience, or extractability. Vorms further shows how crucial it is to take formats of representation into account in the analysis of scientific reasoning, since inferences have different cognitive costs depending on the format of representation [34.168]. Jebeile [34.169] applies similar concepts to computational models and argues that visualization tools can have a specific explanatory role, since they do not merely present computational data in more accessible ways, but also suggest interpretations that are not contained in the original data, highlight relations between these data, and thereby point at elements of answers to what-if questions.

Overall, the issue of how much visualization can convey objective understanding remains debated. For example, Kuorikoski [34.164] acknowledges that visual representations are cognitive aids but emphasizes that they often merely bring about a feeling and illusion of understanding. So there is a need for epistemological analyses that would make clear in which cases, and how, visual representations can be reliable guides and self-certifying vectors of knowledge, which partly enable their users to determine whether and how much they should trust them.

34.4.6 Understanding and Disciplinary Norms

All the above discussion has been based on general arguments about explanations and understanding. However, as already emphasized, explanatory norms sometimes differ from one field to another, economics being, at least in its mainstream branches, a paradigmatic case of a field in which simulation methods are shunned [34.37]. Similarly, the explanatory status of computer simulations and computational models varies across fields like cognitive science, artificial intelligence [34.137], artificial life [34.170], or within fields themselves (see, e.g., [34.171] for the case of computational chemistry and [34.79] for that of climate science).

This is not the place to discuss whether these variations regarding explanatory norms are deep, or whether they result from differences in theoretical contexts, in the degrees of complexity of the systems investigated, in the difficulties of collecting evidence about them, in the scientific maturity and empirical success of these fields, etc. Such questions cannot be answered on the basis of armchair investigations. Field-specific studies of the explanatoriness of computer simulations, made by scholars who are at the same time acutely aware of present discussions about scientific explanation, are needed.

34.5 Comparing: Computer Simulations, Experiments and Thought Experiments

Computer simulations, experiments, and, to a lesser extent, thought experiments share various similarities, which calls for an explanation. Indeed, similarities between experimental activities and computational science are even found in mathematics, where some methods are claimed to be experimental (Sect. 34.5.1). Computer simulations, experiments and thought experiments can sometimes be seen as ways of carrying out similar activities, or activities having similar constraints (Sect. 34.5.2). Should an additional step be made, and computer simulations be considered as experiments?

A close scrutiny of the existing arguments in favor of this claim shows that it meets insuperable difficulties, regarding both the analysis of computer simulations and that of experiments. Further, the claim does not even seem necessary to account for the importance of the material aspects of simulations (Sect. 34.5.3). Finally, even if computer simulations can yield knowledge, which can sometimes be more reliable than that produced by experiments, unless a strong case against empiricism is properly made, computer simulations do not seem to seriously threaten the unique foundational role of experiments as the source of primary evidence upon which science is built (Sect. 34.5.4). In any case, discussions about the relationships between experiments and computer simulations should remain compatible with the actual existence of hybrid (both computational and experimental) methods (Sect. 34.5.5).

When in the 1990s philosophers of science started investigating computer simulations, they soon realized that the object of their inquiry cross-cut traditional categories like those of theories, models, experiments or thought experiments. Similarities with experiments were particularly striking, since, among other things, computer simulations involved the treatment of massive data and statistical reasoning, required robustness analysis, and were claimed to yield new knowledge. As a result, computer simulations were suggestively dubbed by various authors as computer experiments, numerical experimentation or in-silico thought experiments, even though it was not always conceptually clear what exactly these potentially metaphorical characterizations meant.

All such similarities are worth analyzing and potentially call for explanations. They may be the sign that (some of) these activities have an identical nature, or common essential features, or they may just be shallow or fortuitous. Clarifying this issue is also a way to analyze these activities more acutely by singling out what is specific to each or common to them, and to determine to what extent epistemological insights can be transferred between them.

34.5.1 Computational Mathematics and the Experimental Stance

Experimental Proofs in Mathematics

Since aspects related to the representation of material systems are absent from mathematics, a comparison with this field can be hoped to be fruitful for analyzing what exactly is experimental in computational science.

The mathematical legitimacy of computers for the production of proofs has been discussed for several decades. Computational proofs like that of the four-color theorem by Appel et al. [34.172, 173] were rapidly labeled quasi-empirical, and discussions raged about how they should be interpreted [34.174, p. 244]. Such computational proofs can actually be seen as having roots in the older tradition of quasi-empirical mathematics, practiced for example by mathematicians like Euler, and philosophically defended by authors like Lakatos [34.175] or Putnam [34.176]. Interestingly, even in these contexts, the labels empirical or experimental were used to refer to various aspects of the activity of proving results.

Like experiments, computational proofs involve external processes, which are fallible. Their reliability can then be seen as being partly of a probable nature and needs to be assessed a posteriori by running these external processes several times and checking that the apparatus involved worked correctly. By contrast, proofs which can be actively and directly produced by human minds can provide a priori knowledge, the validity of which is assessed by (mentally) inspecting the proof itself, qua mathematical entity. Further, computational proofs, like experiments and empirical methods in mathematics, usually provide particular numerical results: as the computational physicist Keith Roberts writes, “each individual calculation is [. . . ] analogous to a single experiment or observation and provides only numerical or graphical results” (quoted in [34.70, p. 137]). Therefore, to obtain more general statements (and possibly theories), probabilistic inductive steps are needed. Overall, such debates illustrate the need to clarify the use in this context of labels like experimental or empirical.
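A toy sketch can make Roberts's point concrete. Checking Goldbach's conjecture numerically (my own stock example of quasi-empirical mathematics, not one discussed in the text) yields only particular results, case by case; any move to the general claim would require an inductive step:

```python
# Each check below is "a single experiment or observation": it settles
# one even number, nothing more. The general conjecture is untouched.

def is_prime(n):
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_witness(n):
    """Return a pair of primes summing to the even number n, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Particular numerical results for all even numbers up to a bound:
checked = all(goldbach_witness(n) is not None for n in range(4, 1000, 2))
print(checked)  # True for this bound -- which proves nothing about all even numbers
```

The gap between the printed `True` and the universal statement is precisely where the probabilistic inductive step, and the debate about its legitimacy, comes in.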

The Experimental Stance

The case of computational mathematics also makes clear how scientists can adopt an experimental stance for inquiries where no physical process is investigated, and the nature of the object which is experimented upon is completely known.

Experimenting involves being able to trigger changes, or to intervene on material or symbolic dynamical processes, and to record how they vary accordingly. As noted by Dowling [34.136, p. 265] and Jebeile [34.169, II, §7.2], processes whose dynamics is known can also work as black boxes, since the opacity of the process may stem either from our lack of knowledge about its dynamics, or from the mathematical unpredictability (or epistemic inaccessibility) of its known dynamics. From this perspective, contrary to Guala [34.177], being a black box is not a specific feature of experiments.

Finally, when experimenting on a material or formal object, it is better that interactions with the object be made easy and the results be easily accessible to the experimenters (e.g., by means of visual interfaces), so that tinkering is made possible [34.136] and intuitions, familiarity, and possibly some form of understanding [34.159, 169, III] can be developed.

34.5.2 Common Basal Features

Some similarities of computer simulations and experiments (and thought experiments) may be accounted for by highlighting common basal features of these activities, which in turn account for the existence of their common epistemological features, such as the shared concerns of practitioners of experiments and computer simulations for “error tracking, locality, replicability, and stability” [34.70, p. 142]. In this perspective, one should characterize the nature and status of these common basal features.

Role or Functional Substitutability

Though computer simulations, thought experiments and experiments are activities of different types, they can sometimes be claimed to play identical roles. Typically, computer simulations are used to gain knowledge about how physical systems behave (hereafter behavioral knowledge) when experiments are unreliable, or making them is politically or ethically unacceptable [34.41, p. 107]. Importantly, acknowledging that computer simulations can sometimes be used as substitutes for experiments by no means implies that they can play all the roles of experiments (Sect. 34.5.4). Further, one should be aware that, at a high level of abstraction, all activities may be described as doing similar things; therefore, these shared roles should in addition be shown to have nontrivial epistemological implications. For example, one may argue that providing knowledge or producing data are roles that are endorsed by computer simulations, thought experiments, or experiments. However, this may be seen as partially sterile hand-waving: it points at too abstract a similarity if these activities produce items of knowledge of totally different types, and nothing epistemologically valuable can be inferred from this shared characterization (see [34.81] for a presentation of the different types of knowledge involved in science).

El Skaf and Imbert [34.87] make an additional step when they claim that these activities can in certain cases be functionally substitutable, that is, that we can sometimes use one instead of the other for the purpose of a common inquiry – which remains compatible with the fact that these activities do not play the roles in question in the same way, that they come with different epistemic credentials, provide different benefits, and therefore, as role holders, are not epistemologically substitutable. El Skaf and Imbert, in particular, claim that computer simulations, experiments, and thought experiments are sometimes used for the purpose of unfolding scenarios (see also Hughes’ notion of demonstration in Sect. 34.6.1) and argue that investigations concerning the possibility of a physical Maxwellian demon were indeed pursued by experimental, computational and thought-experimental means. The existence of such common roles then provides grounds for analyzing similarities in the epistemological structure of the corresponding inquiries.

Morrison [34.178] goes even further, since she argues that some computer simulations are used as measuring instruments and therefore that they have the same epistemic status as experimental measurements. She first claims that models can serve as measuring instruments, and then shows that this role can be fulfilled in connection with both computer simulations and experiments, which are similarly model shaped. An important part of her strategy is to relax the conditions for something to count as an experiment, by discreetly giving primacy, in the definitions of scientific activities, to the roles which are played (here measuring) and by downplaying the importance of physical interactions with the investigated target systems in the definition of experiments (which are simply seen as a way to perform this measuring role). Giere’s rejoinder denies the acceptability of this strategy, and follows the empiricist tradition, when he claims that “a substitute for a measurement is not a measurement, which traditionally requires causal interaction with the target system” [34.179, p. 60]. Indeed, the potential additional pay-offs of experiments, as primary sources of radically new evidence, come from these causal interactions. Accordingly, their specificity is not due to their roles, qua information sources (since thought experiments, models, or theories are also information sources), but to the type of epistemological credentials that come with the corresponding information and ground our ultimate scientific beliefs. A different, nonempiricist epistemology might be developed, but the bait must then be swallowed explicitly, and it must be explained why such an epistemology, in which activities are exclusively individuated on the basis of their function and the importance of other differences is downplayed, should be preferred. In any case, an account of how to individuate these functions would be needed, since at a high level of abstraction, various activities can be seen as performing the same function.

Beyond Anthropocentric Empiricism

To practice science, humans need to collect observations and make inferences. Since human capacities are limited, various instruments have been developed to extend them, and these instruments have been partly computational for decades. These parallel developments of observational and inferential capacities come with common epistemological features. In both cases, restricted empiricism, which gives a large and central role to human sensorial or inferential capacities in the description of how scientific activities are carried out, is no longer an appropriate paradigm for understanding scientific practices. Indeed, the place of human capacities within modern science needs to be reconsidered [34.8, 41, 180]. Further, the externalization of observations and inferences comes at the price of some epistemic opacity and passivity for the practitioner, since, as humans, we no longer consciously carry out these activities. Instead, we simply state the results of experimental or computational apparatus. However, this also comes with gains in objectivity, since observational and informational procedures are now carried out by external, transparent, and controlled apparatus, which have no hidden psychological biases and commit no fallacies.

The development of computational instruments and computer simulations also raises similar epistemological problems. For example, the apparently innocuous notion of data seems to raise new issues in the context of computational science. Computer simulations, like models, have been claimed to be useful for probing physical systems and to be used as measuring instruments [34.178]. Whatever the interpretation of such statements (Sect. 34.5.3), it is a fact that both computer simulations and computational instruments provide us with data, which raises transversal questions.

A datum is simply the value of a variable. It can be taken to describe a property of any object. In this simple sense, data coming from experiments and computer simulations can play a similar role by standing for the properties of some target system within some representational inquiries. Furthermore, in both cases, their interpretation usually involves heavy computational treatments. In particular, mathematical transforms of various types serve to separate information from noise, remove artifacts, or recover information about a system property out of intertwined causal signals, as in computed tomography imaging techniques [34.121]. From this point of view, as emphasized by Humphreys [34.181], one here departs from a principle frequently used by traditional empiricists, according to which “the closer we stay to the raw data, the more likely those data are to be reliable sources of evidence.”
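A deliberately simple sketch can illustrate the kind of mathematical treatment at issue. Here a discrete Fourier transform separates a low-frequency signal from high-frequency noise; the example (sine wave, Gaussian noise, naive DFT) is my own stand-in for the far more sophisticated transforms used in, e.g., computed tomography, but the epistemological point is the same: the reported data are already heavily processed, not raw:

```python
import cmath, math, random

def dft(x):
    """Naive discrete Fourier transform (O(n^2), fine for a demo)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning real parts."""
    n = len(X)
    return [(sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n).real
            for t in range(n)]

def lowpass(x, keep=4):
    """Zero out all but the `keep` lowest frequency components (and their mirrors)."""
    X = dft(x)
    n = len(X)
    for k in range(n):
        if min(k, n - k) > keep:
            X[k] = 0
    return idft(X)

random.seed(0)
n = 64
signal = [math.sin(2 * math.pi * 2 * t / n) for t in range(n)]   # slow oscillation
raw = [s + random.gauss(0, 0.5) for s in signal]                 # noisy "measurement"
cleaned = lowpass(raw)

rmse = lambda a, b: math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)) / len(a))
print(rmse(cleaned, signal) < rmse(raw, signal))  # filtering brings the data closer to the signal
```

Since the low-pass filter projects the noise onto a small subspace while leaving the slow signal intact, the cleaned data are provably no farther from the signal than the raw data, which is exactly why such transformed outputs, rather than the raw record, are what get reported.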

At the same time, there are different types of epistemological data, and the need for their common study should not introduce confusion into their understanding. In science, one seeks to determine how reliably data stand for their target, and which properties exactly they refer to. Humphreys’s remark about the computational treatment of data, reproduced above, highlights the fact that causal information concerning the source is crucial for treating and interpreting data and for determining what empirical content they bring about this source (this is the inverse inference problem), given that data do not wear the details of how they were produced on their sleeves. From this point of view, experimental and computational data have utterly different causal histories – so what gives its sense to the computational treatment is potentially of a different nature [34.91, 121]. Overall, more pointed comparative analyses of data obtained by computer simulations and computational instruments are still to be carried out, to understand their semantics and epistemology and to highlight both their nonaccidental similarities and specific differences (see [34.182] for the case of computational instruments).

Computational science must also face the challenge of data management. While the steps of traditional mathematical proofs and arguments, once produced, can be verified by scientists, things are usually different for computer solutions, even if they are merely executions of computational programs [34.91], or arguments [34.72]. Details of computer simulations are in general not stored, since this would require too large amounts of memory (even if, in some cases like the Millennium Run, scientists may decide to keep track of the evolution of the computer simulation). In other words, like experimental science, computational science involves choosing which data to keep track of, developing powerful devices to store them, finding appropriate ways to organize them, providing efficient interfaces to visualize, search, and process them, and, more generally, developing new methods to produce knowledge from them. This also raises questions about how these data can or should be accessed by the scientific community, and which economic model is appropriate for them [34.183]. In brief, the epistemology of computer simulations here meets that of big data [34.184, 185], even if it cannot be assumed that ongoing debates and analyses about the latter, because they are mostly focused on questions raised by empirically collected data, will naturally apply to, or be insightful for, the corresponding problems raised by computer simulations.

Different Activities, Similar Patterns of Reasoning

As noted by Parker [34.186], strategies developed to build confidence in experimental results, and described in particular by Allan Franklin, seem to have close analogs for the justification of results generated by computer simulations. Indeed, the interpretation of the results of computer simulations as evidence for hypotheses about physical systems can sometimes be made through an error-statistical perspective [34.187], as in the case of experiments [34.188].

Similar patterns of reasoning are also used to argue in favor of the existence of specific mechanisms or entities on the basis of patterns within data, modes of visualization of these patterns, or our ability to manipulate the actual or represented systems and find pattern regularities in their behavior (see [34.71] for a description of the homomorphic tradition, in which visual forms are given much importance, in contrast to the homologic tradition, which is based more on logical relationships). More generally, visualization techniques, aimed at facilitating reasoning about results present in large databases, are crucial in the case of both experiments and computer simulations (Sect. 34.4.4).

Importantly, these similarities may have different explanations. For example, they may simply stem from the need to treat massive amounts of data by efficient standard procedures, be a consequence of features shared by experimental and computational data independently of their quantity, like the presence of noise, or correspond to the application of general types of evidential or explanatory arguments to data having different natures.

The Reproducibility of Results

Reproducibility is a typical requirement for experiments, though one that is sometimes difficult to achieve because of the tacit knowledge involved in the carrying out of experiments [34.189]. Similar problems may arise with computer simulations. Even if the latter are nothing more than computations and are in principle reproducible, in practice reproducibility may sometimes be difficult, especially in the context of big science. For example, computer simulations may be too big to be reproduced (all the more so since scientists have in general little incentive to reproduce results). Numerical codes may not be public (because they are not published or shared), and many of the computational details may be left tacit. Finally, computer simulations involving stochastic processes may not be exactly reproducible, because the random numbers came from external physical signals or because the details of the pseudorandom number generator are not made public.
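The last point can be made concrete with a minimal sketch: a stochastic simulation is exactly reproducible only when its source of randomness is a pseudorandom generator whose algorithm and seed are disclosed. The random-walk "simulation" here is my own stand-in example, not one discussed in the text:

```python
import random

def random_walk(n_steps, seed=None):
    """Final position of a simple +/-1 random walk driven by a seedable PRNG."""
    rng = random.Random(seed)        # Python's Mersenne Twister generator
    position = 0
    for _ in range(n_steps):
        position += rng.choice((-1, 1))
    return position

# With a published seed, anyone can re-run the exact same trajectory:
print(random_walk(10_000, seed=42) == random_walk(10_000, seed=42))  # True

# Without the seed (or with randomness drawn from a physical signal),
# two runs will almost surely diverge:
print(random_walk(10_000) == random_walk(10_000))  # almost always False
```

This is why, for exact reproducibility, publishing the code is not enough: the generator and its seeding must be part of the published record as well.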

Experimenters’ and Simulationists’ Regresses

Good scientific results are usually expected to be robust against various changes [34.190], in particular those related to implementation or material details, and this is why a failure of exact reproducibility should not be a worry.

Still, when one faces an inability to reproduce a result, the problem may arise from a lack of robustness or a flaw in the original experiment or computer simulation, or from a failure to reproduce it correctly. Accordingly, as emphasized by Gelfert [34.191], computer simulations are affected by a problem similar to the experimenter’s regress [34.192], which is met when scientists have no criterion for determining whether an experimental apparatus is working properly other than the fact that it produces the expected results. As noted by Godin and Gingras [34.193], regresses like that highlighted by Collins are instances of well-known types of arguments already analyzed in the framework of ancient skepticism (more specifically, regresses or circular relations regarding justification). As such, they are specific neither to experiments nor to computer simulations – even if solutions to these problems, such as those described by Godin and Gingras or Franklin [34.194], may be partly activity specific. In any case, adopting a general comparative perspective provides a way to analyze more acutely what is epistemologically specific or common to scientific activities.

34.5.3 Are Computer Simulations Experiments?

Some authors go as far as claiming that, at least in some cases, what we call computer simulations are in fact experiments. In this perspective, Monte Carlo methods, sometimes labeled Monte Carlo experiments or Monte Carlo simulations, seem to be a philosophical test case (like analog simulations, Sect. 34.2.2). Such methods are used to compute numbers (e.g., pi), sample target distributions, or produce dynamical trajectories with adequate average properties. They rely crucially on the use of randomness [34.8, 72]. They may look closer to experiments because they sometimes use physical systems, like a Geiger counter, to generate random events.

Still, Beisbart and Norton claim that Monte Carlo methods are not experiments, since randomizers can be replaced by computer codes of pseudorandomizers [34.72, p. 412]. This shows that these computer simulations do not require contact with the randomizer as an external object; therefore, no direct empirical discovery about the nature of physical systems can be made by them, and they should not be seen as having an experimental nature. In brief, in Monte Carlo simulations, the physical systems involved are simply used as computers to generate mathematically random sequences.
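A standard Monte Carlo estimation of pi, sketched below, makes Beisbart and Norton's point vivid: the computation is indifferent to whether its random numbers come from a Geiger counter or from a pseudorandom algorithm, since any source with the right statistics can be plugged in (the code and its parameters are my own illustration, not drawn from the text):

```python
import random

def estimate_pi(n_samples, rng=random.random):
    """Estimate pi from the fraction of uniform points in the unit quarter-circle."""
    inside = 0
    for _ in range(n_samples):
        x, y = rng(), rng()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4 * inside / n_samples

# A pseudorandomizer with a fixed seed plays the randomizer's role just as well
# as a physical source of random events would:
seeded = random.Random(123).random
print(estimate_pi(100_000, rng=seeded))
```

The `rng` parameter is the philosophical hinge: swapping a physical randomizer for a pseudorandom one leaves the method intact, which is why nothing empirical about the randomizer's physics is being discovered.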

Beyond the analysis of specific cases, some authors have defended the bolder claim that all computer simulations are experiments (what Winsberg calls the identity thesis [34.195, §5]). While this goes against inherited scientific common sense (computations are not experiments!), the claim should be carefully examined. Indeed, in principle there is no impossibility here: while computations, logically defined, are not experiments, we need physical machines to carry them out. Therefore, in the end, computers, instruments and experimental systems are physical systems that we use for the purpose of doing science – and it all boils down to how we conceptualize these external worldly activities in a coherent and fruitful way. In brief, perhaps, after all, we would be better off revising our epistemological notions so that computer simulations are seen as genuine examples of experiments – a revisionary position with regard to the empiricist tradition, since it ignores the specificity of experiments as primary evidential sources of knowledge.

In what follows, I review existing arguments in favor of the claim that computer simulations are experiments, and how these arguments have been criticized. Overall, as we shall see, in contrast to what is claimed in [34.195, §5], it is very dubious that discussions about the identity thesis are simply a matter of perspective and where the emphasis is placed. A minute, conceptually rigorous, and sharp treatment of this question can be found in [34.53, 72, 91, 196] and [34.169, Chap. 7].

Problems with Analyses in Terms of Common Physical Structures

Some authors analyze computer simulations as manipulations of physical systems (the computers), which instantiate or realize models that are also instantiated or realized by the investigated physical systems.

Norton and Suppe [34.114] are good representatives of this tradition. They first try to describe formal relations between what they call a lumped model, the structure of the target system, and the programmed computer, which is supposed to embed the lumped model. They further argue that these relations account for the experiment-like character of computer simulations: instead of experimenting on real systems, computer simulations are used as physical stand-ins or analogs to probe real-world phenomena, and one thereby learns things about the represented systems. This suggestive position has charmed various authors. It also has similarities with accounts of scientific representation made in terms of similarity [34.28], isomorphism, or weaker relationships between the representation and the target system [34.197, 198], even if the authors who defend the above view have not so far adopted this line of argument.

However, in the case of computer simulations, this view does not seem to resist close scrutiny, for reasons specific to computational activities. While in the case of analog simulations both the represented system and the analog computer instantiate a common mathematical structure (Sect. 34.2.2), such a claim cannot be made for digital computers. The general idea is that the steps of computational processes are multiply realizable and that, conversely, how the physical states of computers are to be interpreted is contextual and partly arbitrary [34.4]. It is true that for every step of a computation to be carried out in practice, one needs to use a physical machine that can be seen as instantiating the corresponding transition rule. However, physically different machines can be used to carry out different parts of a computation (for example, when the computation is distributed). Furthermore, even if a single machine is used, different runs of the program will correspond to different physical processes, since the computer may process several tasks at the same time and contextually decide how its memories are organized; and even within the same computation, a single part of the memory may be used at different steps to code for different physical variables [34.91, pp. 564–566], [34.196, pp. 81–84]. Overall, in the general case, the relation between the physical states of the represented target system and the physical states of the computer(s) that may be used to simulate its behavior is a many-many one, and the idea that the phenomenon is recreated in the machine “is fundamentally flawed for it contradicts basic principles of computer architecture” [34.196, p. 84]: in the case of a successful computer simulation, one can simply say that every step of the computation has been carried out by some appropriate physical mechanism, but there is no such thing as a computer instantiating the structure of the model investigated. (Note that the argument based on multiple realizability is in the spirit of those originally developed by Fodor [34.126] in his discussion of the reduction of the special sciences.)

Problems with Common Analyses in Terms of Intervention or Observation

Computer simulations have also been claimed to qualify as experiments “in which the system intervened on is a programmed digital computer” [34.199, p. 488], or to involve observations of the computer as a material system [34.114, p. 88]. Winsberg even goes as far as to claim that [34.195]

“nothing but a debate about nomenclature [. . . ] would prevent us from saying that the epistemic target of a storm simulation is the computer, and that the storm is merely the epistemic motivation for studying the computer.”

Such claims can be answered along the same lines as the previous argument. There is of course no denying that when one runs a computer simulation, one interacts with the interface of the computer, which triggers some physical change in the computer so that the right computation is carried out. Similarly, once the computation is finished, the physical state of the memory in which the result is stored triggers a causal mechanism that produces changes in the interface so that the result can be read by the user. However, the definition of an intervention at the model level does not determine a specific intervention at the physical level of the computer. The reason is that, as emphasized above, even within the same computational process, the way that the intervened model variable is physically represented in the computer may vary, and how the computer, qua physical system, precisely evolves may depend on various parameters, such as the other tasks that it carries out at the same time. In brief, the idea that actual computer simulations, defined at the model level, could be seen as the investigation of the computer, qua physical machine, which is used to carry them out, seems to be riddled with insuperable difficulties.

Finally, one should mention that epistemic access to the physical states of the computer corresponding to the successive steps of a computation is usually not possible in practice [34.196, p. 81].

Problems with General Analyses in Terms of Epistemological and Representational Structure

Some authors have also argued that computer simulations and experiments share an epistemological structure, or epistemological aspects, and have used this claim to justify the identity thesis.

For example, it has been claimed that in both cases one interacts with a system to gain knowledge about a target system, and the internal and external validity of the processes needs to be checked. This type of analysis stems from a 2002 paper by Guala [34.177] in which he presents a laboratory experiment in economics aimed at investigating behavioral decision making by giving decisional tasks to real human subjects in the laboratory. In this case, a hypothesis about how agents behave in the laboratory is investigated (the internal validity hypothesis); then, based on similarities between the experimental situation and the real-life situation, an external hypothesis is made about the behavior of agents in real-life situations (the external validity hypothesis). The notion of internal validity comes from social science and corresponds to the (approximate) truth of inferences about causal relationships regarding the system that is experimented on. External validity corresponds to the (approximate) truth of the generalization of causal inferences from an initial system, for which internal validity has been demonstrated, to a larger class of systems. Guala further claims that both computer simulations and experiments fit this epistemological description in terms of internal and external validity arguments, but cautiously concludes that their “difference must lie elsewhere” [34.177]. According to him, computer simulations and experiments are different, since in the latter case there is a material similarity between the object and the target, whereas, in the former case, there is a formal similarity between the simulating and the simulated systems (a claim which seems to fall under the above criticism directed at Norton and Suppe and their followers).

Guala’s conceptual description is endorsed by most authors who try to picture computer simulations as some sort of experiment. For example, Winsberg accepts the description, but claims that the difference between experiments and computer simulations lies in the type of background knowledge that researchers use to justify the external validity hypothesis [34.113, p. 587], a position which is again revisionary with regard to the empiricist tradition if this is the only specificity ascribed to experiments.

A serious worry is that describing the investigation of the computational model in terms of internal validity is problematic and artificial, since, as can be seen above, computer simulations cannot be considered as investigations of the causal behavior of the computer, qua physical system. For the same reason, the use of the notion of external validity is inappropriate, since for computer simulations inferences about the target system do not involve the generalization of causal relations taking place in the computer to other systems by comparing their material properties but involve the representational validity of the computational model.

A final problem is that the characterization of the methodology of experimental studies in terms of internal and external validity, though useful in the social sciences, is not a general one. Using it as an accepted general framework to compare experiments and computer simulations looks like a hasty extrapolation of the case of laboratory experiments in experimental economics, not to mention the fact that economics may be seen as a bold pick to build a general conceptual framework for experimental studies.

It is true that in experiments, the measured properties are often not the ones that we are primarily interested in, and the former are used as evidence about these latter target properties. Typically, vorticity in turbulent flows is difficult to measure directly, and is often assessed by measuring velocity, based on imaging techniques. In more complex cases, the properties measured can be seen as a way to observe different and potentially remote target systems, as is vividly analyzed by Shapere with his case study of the observation of the core of the Sun by the counting of 37Ar atoms in a tank within a mine on Earth [34.180]. Importantly, in all such cases, the measuring apparatus, the directly measured property, and the indirectly probed target system are related by causal processes. The uses of the collected empirical information then vary with the type of inquiry pursued. The evidence may be informational about the physics of a particular system, like the Sun. Or, it may be used to confirm or falsify theories (as in the case of the 1919 Eddington experiment and relativity theory). In some cases, though by no means all, it may be used to draw inferences about the nature or behavior of a larger class of similar systems – which are not related to the measured system by causal relationships. If this latter case of reasoning about external validity is taken as paradigmatic for experiments, and the causal processes between the target experimented systems (the source) and the measuring apparatus (the receptors), which are present in all experiments, are considered as a secondary feature, experimental activities are misrepresented. As Peschard nicely puts it, “the idea that the experiments conducted in the laboratory are aimed at understanding some system that is outside the laboratory is a source of confusion” [34.200]. General conceptual frameworks that do not introduce such confusion are however possible. For example, Peschard proposes [34.200]

“to make a distinction between the target system, that is manipulated in the experiment or represented in the computer simulation, and the epistemic motivation, which in both cases may be different from the target system”

(see also the distinction between the result of the unfolding of a scenario and the final result of the inquiry in [34.87]).

Overall, the common description provided by Guala, and heavily relied upon in [34.113, 199] to support versions of the identity thesis, can be defended only by squeezing experiments and computer simulations into a straitjacket which misrepresents these activities, is not specifically fruitful, and meets insuperable difficulties.

Materiality Matters

Clearly, for both experiments and computer simulations, materiality is crucial. However, it does matter differently, and one does not need to endorse a version of the identity thesis to acknowledge the importance of materiality when claiming, for example, that, to understand computational science, the emphasis should be on computer simulations which can in practice, and therefore materially, be carried out by actual systems [34.41, 91].

For experiments, material details are relevant throughout the whole inquiry when producing, discussing and interpreting results, their validity and their scope (especially if one tries to extrapolate from the investigated system to a larger class of materially similar ones). By contrast, for computer simulations, material details are important to establish the reliability of the computation, but not beyond: only the mathematical and physical details of the investigation matter when discussing and interpreting the results of the computer simulation and the reliability of the inquiry.

34.5.4 Knowledge Production, Superiority Claims, and Empiricism

The question of the epistemic superiority of experiments over simulations has also been discussed. Parke [34.201] takes it for granted that “experiments are commonly thought to have epistemic privilege over simulations” and claims that this is in fact a context-sensitive issue. As we shall see, if one puts aside the question of the specific role of experiments as the source of primary evidence about nature, it is not clear whether the general version of the superiority claim has actually been defended, or whether a straw man is attacked.

Computer Simulations, Experiments and the Production of Radically New Evidence

Let us try to specify what the general superiority claim could be and how it has really been defended.

The obvious sense in which experiments may be superior is that they can provide scientists with primary evidence about physical systems, which originates in interactions with these systems and cannot be the product of our present theoretical beliefs. It is unlikely that computer simulations can play this role. As Simon pithily puts it, “a simulation is no better than the assumptions built into it, and a computer can do only what it is programmed to do” [34.12, p. 14]. From this perspective, experiments have the potential to surprise us in a unique way, in the sense that they can provide results contradictory to our most entrenched theories, whereas a computer simulation cannot be more fertile than the scientific model used to build it (even if computer simulations can surprise us and bring about novel results, see Sect. 34.3.4). This is what Morgan seems to have in mind when she emphasizes that “[N]ew behaviour patterns, ones that surprise and at first confound the profession, are only possible if experimental subjects are given the freedom to behave other than expected,” whereas “however unexpected the model outcomes, they can be traced back to, and re-explained in terms of, the model” [34.202, pp. 324–5]. In brief, experiments are superior in the sense that, in the empirical sciences, they can serve a function which computer simulations cannot.

Roush [34.203] has highlighted another aspect regarding which experiments can be superior to simulations. She first insists that we should compare the two methods other things being equal, especially in terms of what is known about the target situation. Then, in any case in which there are elements in the experimenter’s study system that affect the results and are unknown, we may still run the experiment and learn how the target system behaves; by contrast, in the same epistemic situation, the simulationist cannot build a reliable computer simulation that yields the same knowledge. However, when all the physical elements that affect the result are known, a simulation may be as good as an experiment, and it is a practical issue to determine which one can in practice be carried out in the most reliable way.


Thus, for a quantitative comparison to be meaningful, it should be related to roles which can be shared by experiments and computer simulations, such as the production of behavioral knowledge about physical systems, the relevant dynamics of which is known (Sect. 34.5.2).

Grounds for Comparative Claims

Scientists and philosophers have emphasized over the last decades that computer simulations are often mere simulations [34.177], the results of which should be taken with caution. As seen above, economists shun simulation; similarly, Peck states that evolutionary biologists view simulations with suspicion and even contempt [34.204, p. 530]. Nevertheless, however well advised these judgments may be, they cannot by themselves support a general and comparative claim of superiority in favor of other methods, but at most the claim that, in fields where other methods are successful and computer simulations have little epistemic warrant or face serious problems, these other methods will usually or on average be more reliable (exceptions remaining possible).

Some authors have discussed the comparative claim by analyzing the power of the types of inferences made to justify knowledge claims in each case. In [34.199], Parker adopts Guala’s description of experiments (resp. computer simulations) as having material (resp. formal) similarities with their target systems (see the discussion in Sect. 34.5.3) and studies the claim that inferences made on the basis of material similarities would have an epistemic privilege. (Guala does not seem to endorse a comparative claim. He argues that material similarities are a specific feature of experiments, implying that the prior knowledge needed to develop simulations is different from that needed to develop experiments.) Again, the common description in terms of internal and external validity regarding the inferences from one physical system to another gives the semblance of a new problem. However, if, as suggested above, the material properties of computers matter only in so far as they enable scientists to make logically sound computations, and no similarity between systems is involved, the grounds and rationale for this comparison between the properties of the computer and those of the target system collapse. A way to save the argument is to claim that the aforementioned formal similarities are simply those between the computational model and the target system, but then the question boils down to the much more familiar comparison between model-based knowledge (here extracted by computational means) and some type of experiment-based knowledge.

On what grounds could the general privilege of experiment-based behavioral knowledge then be defended? Since experiments and computer simulations are different activities, which are faced with specific difficulties, it is hard to see why computer simulations should always fare worse. Why could simulations based on reliable models not sometimes provide more reliable information than hazardous experiments? Indeed, it is commonly agreed that, when experiments cannot be carried out, are unreliable, or are ethically unacceptable, computer simulations may be a preferable way to gain information [34.41, p. 107].

Justified Contextual Superiority Claims

Interestingly, superiority claims can sometimes be made in specific contexts. Morgan presents cases in economics in which a precise and contextual version of the superiority claim may be legitimate [34.202].

Like Guala, Morgan discusses laboratory experiments in economics, that is, purified, controlled, and constrained versions of real-world systems, which are studied in artificial laboratory environments (in contrast with field experiments, which “follow economic behavior in the wild” [34.202, p. 325]) and are aimed at investigating what is or would be the case in actual (nonsimplified) economic situations. Mathematical models can also be used for such inquiries and, in each case, scientists run the risk of describing artificial behaviors. Morgan then makes the contextual claim that “any comparison with the model experiment is still very much to the real experiment’s advantage here” [34.202, p. 321] (my emphasis) on the grounds that, in this case, the problem of making ampliative analog inferences from laboratory systems to real-world systems is nothing compared with the problem of the realism of assumptions for models exploring artificial models [34.202, pp. 321–322]. She does not justify this point further, but a plausible interpretation is that, in such cases, mathematical models necessarily abstract away essential parts of the dynamics of decision making, which arguably are preserved in experiments because of the material similarity between the laboratory and real agents. In brief, while material similarity plays a role in her argument, she does not make the general claim in the core of her paper that material similarity will always provide more reliable grounds for external validity claims than other methods (even if her formulation is less cautious in her conclusion).

Overall, such sound contextual comparative judgments require two premises: first, that in some context computer simulations are not reliable (or have reliability r), and second, that in the same context material similarities provide reasonably reliable inferences (or have reliability s > r). (Indeed, analogical reasoning based on material similarities, in which one reasons based on systems that are representative of or for larger classes of systems [34.127], can sometimes be a powerful way to make sound – though not infallible! – contextual inferences. As emphasized by Harré and Morgan, “shared ontology [. . . ] has epistemological implications” [34.202, p. 323], since “the apparatus is a version of the naturally occurring phenomenon and the material setup in which it occurs” [34.205, pp. 27–8]. After all, different samples of the same substance obey the same laws, even if contextual influences may change how they behave and not every extrapolation is possible.)

34.5.5 The Epistemological Challenge of Hybrid Methods

Whether computer simulations and experiments are ontologically, conceptually, and epistemologically distinct activities or not, it is a fact that jointly experimental and computational mixed activities have been developed by scientists. Their study was pioneered by Morgan, who presents various types of hybrid cases in economics [34.206] and biomechanics [34.127]. For example, she reports different mixed studies aimed at investigating the strength of bones and carried out by cutting slices of bone samples, photographing them, creating digital 3-D images, and applying the laws of mechanics to these experiment-based representations. Morgan further attempts to provide a principled typology of these activities. This proves difficult because “modern science is busy multiplying the number of hybrids on our epistemological map” and because the qualities of hybrids “run along several dimensions” [34.127, p. 233]. Overall, the sciences illustrate “how difficult it is to cut cleanly, in any practical way, between the philosopher’s categories of theory, experiment and evidence” [34.127, p. 232], and, we may add, computer simulations or thought experiments.

Should these hybrid methods lead philosophers to reconsider the conceptual frontiers between experiments and computer simulations? We can first note that their existence may be seen as a confirmation that the traditional picture of science, in which theoretical, representational or inferential methods on the one hand and experimental activities on the other play completely different but complementary roles, is not satisfactory (Sect. 34.5.2). Then, if one grants that activities like experiments, thought experiments and computer simulations can sometimes play identical roles, it is no surprise that they can also be jointly used to fulfill them. Similarly, a group of four online players of queen of spades sometimes involves virtual players – but most people will be reluctant to see this as sufficient grounds for claiming that bots are human creatures.

In any case, these hybrid activities raise epistemological questions. What, if anything, distinguishes a computer simulation that makes heavy use of empirical data from a measurement involving the computational refinement of such data [34.53, 121]? How much should the results of these methods be considered as empirical? Overall, what type of knowledge and data is thereby generated (see [34.53] for incipient answers)?

34.6 The Definition of Computational Models and Simulations

The main definitions of computer simulations are critically presented: Humphreys’s 1994 definition in terms of computer-implemented methods, Hartmann’s 1996 definition in terms of the imitation of one process by another process, Hughes’s DDI (denotation, demonstration, interpretation) account of theoretical representation, and finally Humphreys’s 2004 definition, with its emphasis on the notion of a computational template (Sect. 34.6.1). The questions that a satisfactory definition should answer are then discussed, in particular which notions should be primitive in the definition, whether computer simulations should be defined as logical or physical entities, whether they correspond to success terms, how the definition should accommodate the possibility of scientific failure and the pursuit of partly open inquiries, or to what extent computer simulations are social, intentional, or natural entities (Sect. 34.6.2).

I come back finally to the issue of the definition of computer simulations. Providing a definition may look at first sight to be easy, since what computers are is well known and clear cases of computer simulations are well identified. However, a sound definition should also be helpful to analyze less straightforward cases and be fruitful regarding epistemological issues related to computer simulations, not least by forcing philosophers to clarify the various intuitions which are entertained across scientific fields about these methods.

It is not difficult to present definitions that accommodate some types of computer simulations or some particular (or field-specific) uses of computer simulations. Nevertheless, failing to distinguish between what is typical of computer simulations in general and what is specific to particular cases can lead (and has led) to heedless generalizations (Sect. 34.5.3). Things are all the more tricky as the very same types of computer simulations, qua formal tools (e.g., agent-based models, CA models, equation-based simulations, etc.) can be used in different epistemic contexts for different purposes, and require totally different epistemological analyses. The case of CA-based computer simulations exemplifies the risk of too quick essentialist characterizations. While it was believed that these models were appropriate for phenomenological simulations only [34.9, 135], their use in fluid dynamics has shown that they could supply theoretical models based on the same underlying physics as traditional methods [34.100].
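To fix ideas about the CA formalism as such, here is a minimal sketch (my illustration, not from the chapter, and far simpler than the lattice-gas automata used in fluid dynamics; the name `ca_step` and the choice of rule 90 are arbitrary). The point is that the same formal tool can be run regardless of the epistemic purpose to which it is put:

```python
# One update step of an elementary (1-D, two-state) cellular automaton,
# using Wolfram's rule numbering and periodic boundary conditions.

def ca_step(cells, rule=90):
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value in 0..7
        out.append((rule >> neighborhood) & 1)              # rule bit for that neighborhood
    return out

row = [0, 0, 0, 1, 0, 0, 0]   # a single "on" cell
for _ in range(2):
    row = ca_step(row)
```

Whether such a rule is read as a phenomenological toy or as a theoretical model of a fluid is not decided by the code; it depends on the epistemic context, which is the chapter's point.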

The following section is organized as follows. Existing definitions and the problems they raise are presented first, and then issues that a good definition of computer simulations should clarify are emphasized.

34.6.1 Existing Definitions of Simulations

Computer-Implemented Methods

As emphasized by Humphreys, a crucial feature of simulations is that they enable scientists to go beyond what is possible for humans to do with their native inferential abilities and pen-and-paper methods. Accordingly, he offered in 1991 the following working definition [34.7]:

“A computer simulation is any computer-implemented method for exploring the properties of mathematical models where analytic methods are unavailable.”

This definition requires that we possess a clear definition of what counts as an analytic method, which is not a straightforward issue [34.60]. Further, as noted by Hartmann et al. [34.10, pp. 83–84], it is possible to simulate processes for which available models are analytically solvable. Finally, as acknowledged by Humphreys, the definition covers areas of computer-assisted science that one may be reluctant to call computer simulations. Indeed, this distinction does sometimes matter in scientific practice. Typically, economists are not reluctant to use computers to analyze models but shun computer simulations [34.37]. Since both computational methods and computer simulations involve computational processes, their difference must lie in the different types (or uses) of computations involved, whether at the mathematical and/or the representational level.

One Process Imitating Another Process

Hartmann proposes the following characterization, which gives primacy to the representation of the temporal evolution of systems [34.10, p. 83]:

“A model is called dynamic, if it [. . . ] includes assumptions about the time-evolution of the system. [. . . ] Simulations are closely related to dynamic models. More concretely, a simulation results when the equations of the underlying dynamic model are solved. This model is designed to imitate the time-evolution of a real system. To put it another way, a simulation imitates one process by another process. In this definition, the term process refers solely to some object or system whose state changes in time. If the simulation is run on a computer, it is called a computer simulation.”
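Hartmann's idea that a simulation results when the equations of a dynamic model are solved can be given a minimal sketch (my example, not his; the decay model, the step size, and the name `simulate_decay` are all invented for illustration):

```python
# Explicit Euler solution of a toy dynamic model, dx/dt = -k * x.
# The succession of computed states imitates the time-evolution of the target.

def simulate_decay(x0, k, dt, n_steps):
    states = [x0]
    x = x0
    for _ in range(n_steps):
        x = x + dt * (-k * x)   # one Euler step of the model's equation
        states.append(x)
    return states

trajectory = simulate_decay(x0=1.0, k=0.5, dt=0.1, n_steps=10)
```

On Hartmann's reading, the list of states produced step by step is the simulation: one temporal process (the computation) imitating another (the decay).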

This definition has been criticized along the following lines. First, as noted by Hughes [34.13, p. 130], the definition rules out computer simulations that do not represent the time evolution of systems, whereas arguably one can simulate how the properties of models or systems vary in their phase space with other parameters, such as temperature. Accordingly, a justification for the privilege granted to the representation of temporal trajectories should be found, or the definition should be refined, for example, by saying that computer simulations represent successive aspects or states of a well-defined trajectory of a system along a physical variable through its state space. Second, the idea that a specific trajectory is meant to be represented may also have to be abandoned. For example, in Monte Carlo simulations, we learn something about average values of quantities along sets of target trajectories by generating a potential representative of these trajectories, but the computer simulations are not aimed at representing any trajectory in particular. One may also want a computer simulation to be simply informative about structural aspects of a system. Overall, the temporal dynamics of the simulating computer is a crucial aspect of computer simulations since it “enables us to draw conclusions about the behavior of the model” [34.13, p. 130] by unfolding these conclusions in the temporal dimension of our world, but the temporal dynamics of the target system may not have to be represented for something to count as a computer simulation.
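The Monte Carlo point can be made concrete with a toy sketch (mine, not the chapter's; the random-walk setting, sample sizes, and the name `mean_squared_endpoint` are invented): none of the sampled trajectories is itself the target of the representation; only an ensemble average is extracted.

```python
# Monte Carlo estimate of an ensemble average over random-walk trajectories.
# No individual sampled trajectory is meant to represent any real trajectory.
import random

def mean_squared_endpoint(n_steps, n_samples, seed=0):
    rng = random.Random(seed)       # fixed seed for reproducibility
    total = 0.0
    for _ in range(n_samples):
        position = 0
        for _ in range(n_steps):    # generate one sample trajectory...
            position += rng.choice((-1, 1))
        total += position ** 2      # ...but keep only the endpoint statistic
    return total / n_samples

estimate = mean_squared_endpoint(n_steps=100, n_samples=2000)
# for a simple symmetric random walk, theory gives E[X_n^2] = n
```

Each generated walk is discarded once its endpoint is tallied, which illustrates why a definition tying simulation to the representation of a particular trajectory fits this practice poorly.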

Third, the definition is probably too centered on models and their solutions [34.207], since it equates computer simulations with the solving of a dynamic model that represents the target system. This is tantamount to ignoring the fact that describing computer simulations as mathematical solutions of dynamic models is not completely satisfactory. What is being solved is a computational model (as in Humphreys’s definition [34.41], see below), which can be significantly different from, and somewhat independent of, the initial dynamic model of the system, which usually derives from existing theories. Effectively, different layers of models, often justified empirically, can be needed in-between [34.13, 97, 208]. For this reason, the representational relation between the initial dynamic model and the target system and that between the computational system and the target system are epistemologically distinct.

Finally, the definition may be reproached for entertaining a recurrent confusion about the role of materiality in computer simulations (Sect. 34.5.3), by describing the representational relation as holding between two physical processes, and not between the computational model and the succession of mathematical states which unfold it (in whatever way they are physically implemented and computed), on the one hand, and the target system, on the other.

Computer Simulations as Demonstrations

Hughes does not propose a specific definition of computer simulations since he believes that computer simulations naturally fit in the DDI account of scientific representation that he otherwise defends [34.13, p. 132]. According to the DDI account, which involves denotation, demonstration, and interpretation as components [34.13, p. 125]:

“Elements of the subject of the model (a physical system evincing a particular kind of behavior, like ferromagnetism) are denoted by elements of the model; the internal dynamic of the model then allows conclusions (answers to specific questions) to be demonstrated within the model; these conclusions can then be interpreted in terms of the subject of the model.”

The demonstration can be carried out by a physical model (in the case of analog simulations) or by a logical or mathematical deduction, such as a traditional mathematical proof or a computer simulation. Further, according to Hughes, in contrast to Hartmann’s account, “the DDI account allows for more than one layer of representation” [34.209, p. 79]. Overall, a virtue of this account is that it emphasizes the common epistemological structures of different activities by pointing at a similar demonstrative step, which excavates the epistemic content and resources of the model (see also [34.210] for refinements, [34.87] for an analysis which extends the idea of demonstration, or unfolding, to thought experiments and some types of experiments, and [34.72] for the related idea that computer simulations are arguments). While, as a definition of computer simulation, Hughes’s sketchy proposal has somewhat been neglected (see however [34.208]), it is a legitimate contender, and it remains to be seen how much a more developed version of it would provide a fruitful framework for philosophical discussions about computer simulations.

Computer Simulations as the Concrete Production of Solutions to Computational Models

In order to answer problems with the previous definitions, Humphreys proposed in 2004 another definition of computer simulations, which is built along the following lines [34.41]. He defines the notion of a theoretical template, which is implicitly defined as a general relation between quantities characterizing a physical system, like Newton’s second law, Schrödinger’s equation, or Maxwell’s equations. A theoretical template can be made less general by specifying some of its variables. When the result is computationally tractable, we end up with a computational template. (Thus, what qualifies as a computational template seems to depend on our computational capacities at a given time.) When a computational template is given (among other things) an interpretation, construction assumptions, and an initial justification, it becomes a computational model. Finally, Humphreys offers the following characterization [34.41, pp. 110–111]:

“System S provides a core simulation of an object or process B just in case S is a concrete computational device that produces, via a temporal process, solutions to a computational model [. . . ] that correctly represents B, either dynamically or statically. If in addition the computational model used by S correctly represents the structure of the real system R, then S provides a core simulation of system R with respect to B.”

Another important distinction lies between the computer simulation of the behavior of a system and that of its dynamics [34.41, p. 111] since, even when the computational model initially represents the structure and dynamics of the system, the way its solutions are computed may not follow the corresponding causal processes. Indeed, in a computer simulation, the purpose is not that the computational procedure exactly mimics the causal processes, but that it efficiently yields the target information from which an appropriate dynamic representation of the target causal processes can finally be built for the user. For reasons of computational efficiency, the representation may be temporally and spatially dismembered at the computational level (e.g., by computing the successive states in a different order), as may happen with the use of parallel processing, or of any procedure aimed at partially shortcutting the actual physical dynamics.
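The possibility of a temporally dismembered computation can be sketched as follows (my toy example, not Humphreys's; the closed-form free-fall model and all names are invented). Because each state here depends only on the initial conditions, the time steps can be computed in any order and reassembled afterward:

```python
# States of a falling body have a closed form, so they can be computed
# out of causal (temporal) order — as a parallel scheduler might — and
# only afterward presented to the user in temporal order.

def height_at(t, h0=100.0, g=9.8):
    return h0 - 0.5 * g * t ** 2          # closed-form state of the target process

times = [0.0, 0.5, 1.0, 1.5, 2.0]
# compute the states in reverse temporal order...
results = {t: height_at(t) for t in reversed(times)}
# ...then reassemble them in causal order for the dynamic representation
trajectory = [results[t] for t in times]
```

The order of the computation thus need not mirror the order of the represented causal process, which is exactly the distinction between simulating a system's behavior and simulating its dynamics.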

The space here is insufficient to analyze all the aspects of the above definition and to do justice to their justification, all the more so since further complications may be required to accommodate even more complex cases [34.208]. Suffice it to say that this elaborate definition, which is aimed at providing a synthetic answer to the problems raised by previous definitions, is one of the most regularly referred to in the literature.

34.6.2 Pending Issues

Simulating or Computing

Giving a definition of computer simulations implies choosing which notions should be regarded as primitive and how to order them logically. Some authors first define the notion of simulation and present computer simulations as a specific type of simulation. For example, Bunge first defines the notion of analogy, then that of simulation, and finally that of representation, as a subrelation of simulation. For him, an object x simulates another object y when (among other things) (1) there is a suitable analogy between x and y and (2) the analogy is valuable to x, or to another party z that controls x (see [34.11, p. 20] for more details).

A potential benefit of this strategy is that it becomes possible to unify within the same general framework various types of analogical relations between systems: organism versus society, organism versus automaton, a scale ship versus its model, computer simulations of both molecular and biological evolution, etc. Similarly, Winsberg [34.195, §1.3] suggests that the hydraulic scale model of the San Francisco Bay should be viewed as a case of simulation (see [34.211] for a recent presentation and philosophical discussion of this example in the context of modeling). While scale models can obey the same dimensionless equations as their target systems and be used to provide analog simulations of them, Winsberg's claim is not uncontroversial and may require an extension of the notion of simulation. Indeed, the model and the Bay itself do not exactly obey the same mathematical equations. For example, distortions between the vertical and horizontal scales in the model increase the hydraulic efficiency, which requires adding copper strips and empirical calibration. Therefore, this is not exactly a case of a bona fide analog simulation (Sect. 34.2.2) but of a complex dynamical representation between closely analogous systems. In any case, if one adopts such positions, it is then a small step to describe other cases of analogical reasoning between material systems (and possibly cases of experimental economics, in which the dynamics of the analogous target system is not precisely known and external validity is to be assessed by comparing the material systems involved) as cases of simulations (Sect. 34.5.3).

At the same time, unification is welcome only if it is really fruitful (and is, of course, not misleading). As seen above, the problem with such analyses is that they tend to describe computer simulations as involving a representational relationship between two material systems and to misconstrue how computers work (see again Sect. 34.5.3). They thereby tend to misrepresent the epistemological role of the physical properties of computers and the fact that computational science involves two distinct steps: one in which computer scientists warrant that the computer is reliable, and another in which scientists use computations and do not need to know anything about computers qua physical systems. A way out of this deadlock may be to use a flexible notion of simulation, which can be applied to relations between physical or logical–mathematical simulating processes and the target simulated physical processes. Then, the question remains as to what exactly is gained (and lost) from an epistemological point of view by putting in the same category modes of reasoning of such different types, at least if one puts aside the emphasis on the obvious similarities with analog simulations, which are a very specific type of computer simulation (Sect. 34.2.2). Overall, it is currently far from clear whether this unificatory move should be philosophically praised.

Abstract Entities or Physical Processes

Arguably, computations are logical entities that can be carried out by physical computers. The question then arises whether computer simulations should also be seen as abstract logical entities, or as material processes instantiating abstract computations. Hartmann's definitions present computer simulations as processes, whereas Humphreys's definition is more careful in the sense that the computing systems simply produce the solutions or provide the computer simulation. Clearly, to analyze computational science, it is paramount to take into account material and practical constraints, since a computer simulation is not really a part of our science and we have no access to its content unless a material system carries it out for us. At the same time, just as the identity of a text does not lie at the material level, the identity of a computer simulation (and the corresponding equivalence relationship between runs of the same computer simulation) is defined at the logical (if not the mathematical) level, and the physical computer simply presents a token of the computer simulation. From this point of view, the material existence of computer simulations and the in principle/in practice distinction emphasized by Humphreys [34.41] have epistemological, not ontological, significance; that is, they pertain to what we may learn by interacting with actual tokens of computer simulations [34.91, p. 573] but not to the nature of computer simulations. Similarly, the identity of a proof seems to lie at the logical level, even if a proof has neither existence nor use for us unless some mathematician provides some token of it.
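The type/token point can be illustrated with a minimal sketch of my own (the function names are hypothetical): a computation's identity lies at the logical level, so two procedurally and physically different realizations count as tokens of the same abstract computation because they stand in the same input-output relation.

```python
# Two very different realizations of the same abstract computation,
# the sum 1 + 2 + ... + n. Their runs are tokens of one computation
# because they satisfy the same input-output relation.

def realization_a(n: int) -> int:
    """Iterative summation, as a step-by-step serial machine might run it."""
    total = 0
    for k in range(1, n + 1):
        total += k
    return total

def realization_b(n: int) -> int:
    """Closed-form evaluation, a physically very different process."""
    return n * (n + 1) // 2

# The equivalence relationship between runs holds at the logical level:
agree = all(realization_a(n) == realization_b(n) for n in range(200))
```

Nothing at the material level needs to be shared between the two realizations; the equivalence is fixed entirely by the logical relation they both instantiate.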

Success, Failure, and the Definition of Computer Simulations

A computer simulation is something that reproduces the behavior or dynamics of a target system. The problem with characterizations of this type is that they make computer simulation a success term: if a computer simulation misreproduces the target behavior, it is no longer a computer simulation. This problem is a general one for representations, but it is especially acute for scientific representations (Frigg and Nguyen Chap. 3, this volume). Indeed, while anything in art can be used to represent anything else, scientific representations are meant to be informative about the natural systems they represent. This is part of their essential specificities and, arguably, a definition according to which any process could be described as a scientific computer simulation of any other process is not satisfactory. At the same time, one does not want something to be a computer simulation, or a scientific representation, based on whether it is scientifically successful and exactly mirrors its target (remember that, for some scientific inquiries, representational faithfulness is not a goal and may even impede the success of the investigation; see [34.212] and [34.23, Chaps. 1 and 3]).

An option is to say that something is a scientific representation if it correctly depicts what its user wants it to represent. However, this may raise a problem for computer simulations that were carried out and had subsequent unintended uses, like the Millennium simulation. It may also raise a problem for fictions, which strictly speaking seem to represent nothing [34.25, p. 770].

Finally, failed representations, which do not represent what their producers believe them to depict, are also a problem. Representational inquiries can fail in many ways, and failures are present on a daily basis in scientific activity, from theories and experiments to models and simulations. For this reason, descriptions of scientific activities should be compatible with failure, especially if they are to account for scientific progress and the development of more successful inquiries. Indeed, it would be odd to claim that many of the computer simulations that scientists perform and publish about are actually not computer simulations. Further, whether a genuine computer simulation is carried out should in general be transparent to the practitioner, and this cannot be the case if computer simulation is defined as a success term and scientific failure is frequent (see also [34.213, pp. 57–58]).

Overall, a question is to determine where the frontier should lie between unsuccessful or failed computer simulations and potential cases in which something that was believed by scientists to be a computer simulation actually is not. This in turn requires knowing how computer simulations can fail [34.92] and which failures are specific to them. In brief, one needs to be able to decide on a justified basis which failures disqualify something from being a computer simulation and which ones simply alter its scientific, epistemic, or semantic value. This analysis may also have to be coherent with analyses of how other types of scientific activities, such as experiments and thought experiments, can fail [34.214], especially when these activities play similar or identical roles.

An option to consider is that something is a computer simulation based on criteria that do not involve empirical success, and that it qualifies as an empirical success depending on additional semantic properties and on whether it correctly represents the relevant aspects of its (real or fictional) target system(s). This option is potentially encompassing enough (the scientifically short-sighted student can be said to perform a computer simulation), while discriminating between good and bad computer simulations remains possible. It is compatible with the fact that research inquiries are often open and scientists need not know in advance what in their results will have representational value in the end. Finally, it is also compatible with a different treatment of representational and implementation failures. Indeed, the possibility of being unsuccessful at the representational level is consubstantial with empirical inquiries and is in this sense normal. By contrast, an implementation failure is simply something that should be fixed. It corresponds to a case in which we did not manage to carry out the intended computation, whereas computing is not supposed to be a scientific obstacle, and we learn nothing by fixing the failure.

Natural, Intentional, or Social Entities?

A similar but distinct issue is to determine which type of objects computer simulations are, qua token physical processes carried out by computing devices, a question which is close to that of the nature of physical computers and is also related to that of the ontology of models (Gelfert Chap. 1, this volume).

Arguably, they are not simply natural objects, defined by some set of physical properties and existing independently of the agents using them. Indeed, because computations can be multiply realized, and some runs of computations are built by patching together different bits of computation on physically different machines, it is unlikely that all computations can be described in terms of natural kind predicates (massively disjunctive descriptions not being allowed here) [34.126].

Further, for both computations and computer simulations, pragmatic conditions of use seem to matter. To quote Guala, commenting on the anthropomorphism of Bunge's definition (see above) [34.177, p. 61]:

“it makes no sense to say that a natural geyser simulates a volcano, as no one controls the simulating process and the process itself is not useful to anyone in particular.”

Indeed, even if any physical system can be seen as computing some (potentially trivial) functions (see below), not every physical object can be used as a (general) computer, and we may have to endorse a position along the lines of Searle's notion of social objects [34.215], or of any analysis doing the same work: a physical object X counts as Y in virtue of certain cognitive acts or states through which it acquires certain sorts of functions (here, computing), given that these objects need to demonstrate appropriate physical properties so that they may serve these functions for us. A specificity of computer simulations is that, unlike entrenched social objects such as cars or wedding rings, a small group of users may actually be enough for a physical system to be seen as carrying out a computer simulation. Thus, the evolution of a physical system (like a fluid) may count for some users as an analog computer, which performs a computer simulation, and for other users as an experiment, even if experiments and computer simulations are in general objects of different types, and this case is unlikely to be met in practice (Sect. 34.5.3).

In any case, what is needed for something to be used as a computer or a computer simulation is not completely clear. The physical process must clearly be recognized as instantiating a computer model. Control is useful but not necessarily mandatory (e.g., we may use the geyser to simulate a similar physical system, even if the geyser would not count as a controlled, versatile analog computer). The possibility of extracting the computed information is clearly useful, an issue that matters for discussions about analog and quantum computer simulations, and of course cryptography.

An alternative position is not to mention users in the definition and to claim that, pace the peculiar case of man-made computations (which may make heavy use of the possibilities offered by multiple realizability, see Sect. 34.5.3), physical processes are the one-piece physical instantiations of running computer models (resp. computer simulations) and, as such, are computations (even if, sometimes, trivial ones). See [34.216] for a sober assessment of this pancomputationalist position. In this perspective, one may say that it is a practical problem to create artificial, human-friendly computers which can in addition be controlled and the information of which can be extracted. While such positions may be palatable for those, like Konrad Zuse, Edward Fredkin, and their followers [34.64, 217, 218], who want to see nature as a computer, it is not clear that such pancomputationalist theses, whatever their intrinsic merits for discussing foundational issues like the computational power of nature or which types of computers are physically possible, serve the purpose of understanding science as it is actually practiced.

An important distinct question is whether intentional or pragmatic analyses should also be endorsed regarding computational models and computer simulations, qua representational mathematical entities; that is, how much the intentions of users, and the conditions detailing how their use by scientists is possible, should be part and parcel of their definitions. Arguably, a scientific model is not simply a piece of syntax or an entity which inherently and by itself represents, completely or partially, a target system in virtue of the mathematical similarities it intrinsically possesses with this system. In order to understand how scientific representations and computer simulations work and actually play their scientific role, their description may have to include captions, legends, argumentative contexts, intentions of users, etc., since these elements are part of what makes them scientifically meaningful units. Indeed, how one and the same mathematical model represents varies significantly depending on the inquiry, subject matter, and knowledge of the modelers. This is particularly clear in the case of computational templates, which are used across fields of research for different representational and epistemological purposes [34.41, §3.7], and which are scientific units at the level of which different types of theoretical and conceptual exchanges take place within and across disciplines [34.45]. Overall, this issue is not specific to computer simulations but can be raised for other scientific representations [34.23, 168, 219–221]. Thus, this point shall not be developed further.

Computer Simulations and Computational Inquiries

How should computer simulations be delineated? Computer simulations do not wear on their sleeves how they were built, how they contribute to scientific inquiries, how they should be interpreted, and how their results should be analyzed. Accordingly, authors like Frigg and Reiss distinguish between computer simulations in the narrow sense (corresponding to the use of the computer) and in the broad sense (corresponding to the “entire process of constructing, using and justifying a model that involves analytically intractable mathematics” [34.30, p. 596]). See also the distinction between the unfolding of a scenario and the computational inquiry involving this unfolding at its core [34.87], or the description of how the demonstration activity is encapsulated in other activities in the DDI account of representation [34.13].

Whatever choice is made, there is a tension here. As underlined above, an analysis of the identity of scientific representations cannot rest on the logical and mathematical properties of scientific models and their similarities with their physical targets, and indications about how these representations are to be interpreted cannot be discarded as irrelevant to the analysis of their nature and uses. At the same time, a computer simulation, qua computational process, and the arguments that are developed by humans about it are activities of different natures and play different roles. Therefore, an encompassing definition should not blur the specificities of the different components of computational inquiries (just as a good account of thought experiments should not blur the fact that they crucially involve mental activities at their core and are part of inquiries also involving scientific arguments).

34.6.3 When Epistemology Cross-Cuts Ontology

Whatever the exact definition of computer simulations, it is clear that they are of a computational nature, that they involve representations of their target systems, and that their dynamics is aimed at investigating the content of these representations.

Importantly, whereas the investigation of scientific representations is traditionally associated with the production of theoretical knowledge, the nature of computer simulations does not seem to determine the type of knowledge they produce.

Clearly, computer simulations can yield theoretical knowledge when they are used to investigate theoretical models. At the same time, even if computer simulations are not experiments (Sect. 34.5.3), they produce knowledge which may qualify as empirical in different and important senses. As we have seen, computer simulations provide information about natural systems, the validity of which may be justified by empirical credentials rooted in interactions with physical systems, for aspects as varied as the origin of their inputs, the flesh of their representations of systems (see in Sect. 34.5.5 the examples by Morgan about studies of the strength of bones), the calibration or choice of their parameters, or their global validation by comparison with experiments (Sect. 34.3.2). However, information about the dynamics represented cannot be completely of empirical origin, since it involves the description of general relations between physical states, and general relations cannot be observed.

From this point of view, computer simulations may be seen as a mathematical mode of demonstrating the content of scientific representations that is in a sense neutral regarding the type of content that is processed: empirically (resp. theoretically) justified representations in, empirically (resp. theoretically) justified information (or knowledge) out. This suggests that, when analyzing and classifying types of scientific data and knowledge, the ways in which they are produced and processed (experimentally or computationally) and where their reliability comes from (e.g., theoretical credentials or experimental warrants) are, at least in part, independent questions.

34.7 Conclusion: Human-Centered, but No Longer Human-Tailored Science

Computer simulations and computational science keep developing and are partly changing scientific practices (Sect. 34.7.1). Human capacities no longer play the role they had in traditional science, hence the need to analyze the articulation of computational and mental activities within computational science (Sect. 34.7.2). This requires in particular studying computational science for its own sake, which however should not be taken to imply that computer simulations always correspond to scientific activities of radically new types (Sect. 34.7.3). In any case, whatever the exact relations between computer simulations and traditional activities like theorizing, experimenting, or modeling, it is a fact that recent investigations about computer simulations have shed light on epistemological issues which were de facto not treated in the framework of previous philosophical studies of science (Sect. 34.7.4).

Before the development of computers, humans were involved at every step of scientific inquiries. Various types of devices, tools, or instruments were invented to assist human senses and inferential abilities, and they were tailored to fit human capacities and organs. In brief, science was for the most part anthropocentric, that is, to paraphrase Humphreys [34.41, §1.1], “science by the people for the people,” and analysts of science, from Locke, Descartes, and Kant to Kuhn or Quine, offered a human-centered epistemology [34.123, 124, p. 616]. Similarly, theories and models needed to be couched in formalisms which made their symbolic manipulation possible for humans (hence the success of differential calculus), problems were selected in such a way that they could be solved by humans, results were retrieved in ways such that humans could survey or browse them, etc.

34.7.1 The Partial Mutation of Scientific Practices

The use of computers within representational inquiries has modified, and keeps modifying, scientific practices. Theorizing is easier and therefore less academically risky, even in the absence of well-entrenched backing-up theories; solutions to new problems become tractable, and how scientific problems are selected evolves; the models which are investigated no longer need to be easily manipulable by human minds (e.g., CA are well adapted for computations but ill-suited for carrying out mental inferences [34.43]); the exploration of models is primarily done by computers, making mental explorations and traditional activities like thought experiments somewhat more dispensable [34.117] and [34.41, pp. 115–116]; the treatment of computational results, as well as their verification, is made by computational procedures; the storage of data, but also their exploration by expected or additional inquirers, are also computer based. Finally, the human, material, and social structure of science is also modified by computers, with a different organization of scientific labor, the emergence in the empirical sciences of computer-oriented scientists, like numerical physicists and computational biologists or chemists, and the development of big computational pieces of equipment and centers, access to which is scientifically controlled by the scientific community (as with big experimental equipment).

34.7.2 The New Place of Humans in Science

Overall, the place and role of humans in science have been modified by computational science. Arguably, human minds are still at the center of (computational) science, like spiders in their webs or pilots in their spacecraft, since science is still led, controlled, and used by people. Thus, we are in a hybrid scenario in which we face what Humphreys calls the anthropocentric predicament of how we, as humans, can “understand and evaluate computationally based scientific methods that transcend our own abilities” [34.42, p. 134]. In other words, interfaces and interplays between humans and computers are the core loci from which computational science is controlled and its results skimmed by its human beneficiaries. More concretely, scientific problems still need to be selected; computational models, even if designed for computers, need to be scientifically chosen (e.g., CA-based models of fluids were first demonstrated to yield the right Navier–Stokes-like behavior by means of traditional analytic methods [34.43]); results of computer simulations, even if produced and processed by computers, need to be analyzed relative to the goals of our inquiries; and ultimately, scientific human-sized understanding needs to be developed for new fundamental or applied scientific orientations to be taken.

34.7.3 Analyzing Computational Practices for Their Own Sake

Over the last three decades, philosophers of science have emphasized that in most cases computer simulations cannot simply be viewed as extensions of our theoretical activities. However, as discussed above, the assimilation of computer simulations to experimental studies is still not satisfactory. A temptation has been to describe the situation as one in which computer studies lie in between theories and experiments. While this description captures the inadequacy of traditional characterizations based on a sharp and exclusive dichotomy between scientific activities, it is at best a metaphor. Further, this one-dimensional picture does little justice to, let alone helps one understand, the intricate and multidimensional web of complex and context-sensitive relations between these activities.

An alternative is to analyze computational models, computer simulations, and computational science for their own sake. Indeed, computer simulations clearly provide a variety of new types of scientific practices, the analysis of which is a problem in its own right. Importantly, this by no means implies that these practices require a radically new or autonomous epistemology or methodology. Similarly, mathematical and scientific problems can be genuinely independent, even when in the end they can be reduced by complex procedures to a set of known or solved problems. Indeed, the epistemology of computer simulations often overlaps piecewise with that of existing activities like theorizing, experimenting, or thought experimenting. Disentangling these threads, clarifying similarities, highlighting specific features of computational methods, and analyzing how the results of computer simulations are justified in actual cases is an independent task for naturalistic philosophers, even if one believes that, in principle, computer simulations boil down to specific mixes of already existing, more basic activities.

34.7.4 The Epistemological Treatment of New Issues

In practice, the analysis of computer simulations has raised philosophical issues which were not treated by philosophers before computational studies were taken as an independent object of inquiry, either because they were ignored or unnoticed in the framework of previous descriptions of science, or because they are genuinely novel [34.96, 124, 207]. This a posteriori justifies making the epistemological analysis of computational models and computer simulations a specific field of the philosophy of science. How much computer simulations will keep modifying scientific practices, and how much their philosophical analysis will finally bring about further changes in the treatment of important issues like realism, empiricism, confirmation, explanation, or emergence, to quote just a few, remains an open question.

Acknowledgments. I have tried to present a critical survey of the literature with the aim of clarifying discussions. I thank the editors for providing me with the opportunity to write this article and for being so generous with space. I also thank T. Boyer-Kassem, J.M. Durán, E. Arnold, and especially P. Humphreys for feedback or help concerning this review article. I am also grateful to A. Barberousse, J.P. Delahaye, J. Dubucs, R. El Skaf, R. Frigg, S. Hartmann, J. Jebeile, M. Morrison, M. Vorms, and H. Zwirn for stimulating exchanges over the last couple of years about the issue of models and simulations and related questions. All remaining shortcomings are mine.

Various valuable review articles have recently been written about the issue of computer simulations, such as J.M. Durán: A brief overview of the philosophical study of computer simulations, Am. Philos. Assoc. Newslett. Philos. Comput. 13(1), 38–46 (2013), and W.S. Parker: Computer simulation. In: The Routledge Companion to Philosophy of Science, 2nd edn., ed. by S. Psillos, M. Curd (Routledge, London 2013). P. Humphreys: Computational science. In: Oxford Bibliographies Online (2012), doi:10.1093/OBO/9780195396577-0100, presents and discusses important references and may be used as a short but insightful research guide.

References

34.1 M. Mahoney: The histories of computing(s), Inter-discip. Sci. Rev. 30(2), 119–135 (2005)

34.2 A.M. Turing: Computing machinery and intelli-gence, Mind 59, 433–460 (1950)

34.3 A. Newell, A.S. Herbert: Computer science as em-pirical inquiry: Symbols and search, Commun.ACM 19(3), 113–126 (1976)

34.4 Z.W. Pylyshyn: Computation and Cognition: To-ward a Foundation for Cognitive Science (MITPress, Cambridge 1984)

34.5 H. Putnam: Brains and behavior. In: Analyti-cal Philosophy: Second Series, ed. by R.J. Butler(Blackwell, Oxford 1963)

34.6 J.A. Fodor: The Language of Thought (Crowell,New York 1975)

34.7 P. Humphreys: Computer simulations, Proceed-ings of the Biennial Meeting of the Philosophyof Science Association, Vol. 2, ed. by A. Fine,M. Forbes, L. Wessels (Univ. Chicago Press, Chicago1990) pp. 497–506

34.8 P. Humphreys: Numerical experimentation. In:Philosophy of Physics, Theory Structure andMeasurement Theory, Patrick Suppes: ScientificPhilosopher, Vol. 2, ed. by P. Humphreys (Kluwer,Dordrecht 1994)

34.9 F. Rohrlich: Computer simulations in the physi-cal sciences, Proceedings of the Biennial Meetingof the Philosophy of Science Association, ed. byA. Fine, M. Forbes, L. Wessels (Univ. Chicago Press,Chicago 1991) pp. 507–518

34.10 S. Hartmann: The world as a process: Simu-lations in the natural and social sciences. In:Modelling and Simulation in the Social Sciencesfrom the Philosophy of Science Point of View, The-ory and Decision Library, ed. by R. Hegselmann,U. Mueller, K.G. Troitzsch (Kluwer, Dordrecht 1996)pp. 77–100

34.11 M. Bunge: Analogy, simulation, representation,Rev. Int. Philos. 87, 16–33 (1969)

34.12 H.A. Simon: The Sciences of the Artificial (MITPress, Boston 1969)

34.13 R.I.G. Hughes: The Ising model, computer simu-lation, and universal physics. In: Models as Medi-ators: Perspectives on Natural and Social Science,ed. by M.S. Morgan, M. Morrison (Cambridge Univ.Press, Cambridge 1999) pp. 97–145

34.14 S. Sismondo: Models, simulations, and their ob-jects, Sci. Context 12(2), 247–260 (1999)

34.15 E. Winsberg: Sanctioning models: The episte-mology of simulation, Sci. Context 12(2), 275–292(1999)

34.16 E. Winsberg: Simulations, models, and theories:Complex physical systems and their representa-tions, Philos. Sci. 68, S442–S454 (2001)

34.17 E. Winsberg: Simulated experiments: Methodol-ogy for a virtual world, Philos. Sci. 70(1), 105–125(2003)

34.18 M. Black: Models and Metaphors: Studies in Lan-guage and Philosophy (Cornell Univ. Press, NewYork 1968)

Page 54: Springer Handbook of Model-Based Science


776 Part G Modelling and Computational Issues

34.19 M. Hesse: Models and Analogies in Science (Sheed & Ward, London 1963)

34.20 M. Redhead: Models in physics, Br. J. Philos. Sci. 31, 145–163 (1980)

34.21 N. Cartwright: How the Laws of Physics Lie (Clarendon, Oxford 1983)

34.22 M. Morgan, M. Morrison: Models as Mediators (Cambridge Univ. Press, Cambridge 1999)

34.23 B. Van Fraassen: Scientific Representation: Paradoxes of Perspective (Clarendon, Oxford 2008)

34.24 R. Frigg: Scientific representation and the semantic view of theories, Theoria 55, 49–65 (2006)

34.25 M. Suárez: An inferential conception of scientific representation, Philos. Sci. 71(5), 767–779 (2004)

34.26 R. Laymon: Computer simulations, idealizations and approximations, Proceedings of the Biennial Meeting of the Philosophy of Science Association (Univ. Chicago Press, Chicago 1990) pp. 519–534

34.27 R.N. Giere: Understanding Scientific Reasoning (Holt, Rinehart and Winston, New York 1984)

34.28 R.N. Giere: Explaining Science: A Cognitive Approach (Univ. Chicago Press, Chicago 1988)

34.29 J. Kulvicki: Knowing with images: Medium and message, Philos. Sci. 77(2), 295–313 (2010)

34.30 R. Frigg, J. Reiss: The philosophy of simulation: Hot new issues or same old stew?, Synthese 169(3), 593–613 (2008)

34.31 M. Mahoney: The history of computing in the history of technology, Ann. Hist. Comput. 10(2), 113–125 (1988)

34.32 D.A. Grier: Human computers: The first pioneers of the information age, Endeavour 25(1), 28–32 (2001)

34.33 L. Daston: Enlightenment calculations, Crit. Inq. 21(1), 182–202 (1994)

34.34 I. Grattan-Guinness: Work for the hairdressers: The production of de Prony's logarithmic and trigonometric tables, Ann. Hist. Comput. 12(3), 177–185 (1990)

34.35 T. Schelling: Models of segregation, Am. Econ. Rev. 59(2), 488–493 (1969)

34.36 A. Johnson, J. Lenhard: Towards a new culture of prediction: Computational modeling in the era of desktop computing. In: Science Transformed?: Debating Claims of an Epochal Break, ed. by A. Nordmann, H. Radder, G. Schiemann (Univ. Pittsburgh Press, Pittsburgh 2011)

34.37 A. Lehtinen, J. Kuorikoski: Computing the perfect model: Why do economists shun simulation?, Philos. Sci. 74(3), 304–329 (2007)

34.38 R. Hegselmann, U. Mueller, K.G. Troitzsch: Modelling and Simulation in the Social Sciences from the Philosophy of Science Point of View (Springer, Dordrecht 1996)

34.39 G.N. Gilbert, K.G. Troitzsch: Simulation for the Social Scientist (Open Univ. Press, Berkshire 2005)

34.40 J. Reiss: A plea for (good) simulations: Nudging economics toward an experimental science, Simul. Gaming 42(2), 243–264 (2011)

34.41 P. Humphreys: Extending Ourselves: Computational Science, Empiricism, and Scientific Method (Oxford Univ. Press, Oxford 2004)

34.42 P. Humphreys: Computational science and its effects. In: Science in the Context of Application, Boston Studies in the Philosophy of Science, Vol. 274, ed. by M. Carrier, A. Nordmann (Springer, New York 2011) pp. 131–142, Chap. 9

34.43 A. Barberousse, C. Imbert: Le tournant computationnel et l'innovation théorique. In: Précis de Philosophie de la Physique, ed. by S. Le Bihan (Vuibert, Paris 2013), in French

34.44 I. Lakatos: Falsification and the methodology of scientific research programmes. In: Criticism and the Growth of Knowledge, ed. by I. Lakatos, A. Musgrave (Cambridge Univ. Press, Cambridge 1970) pp. 91–195

34.45 T. Knuuttila, A. Loettgers: Magnets, spins, and neurons: The dissemination of model templates across disciplines, The Monist 97(3), 280–300 (2014)

34.46 T. Knuuttila, A. Loettgers: The productive tension: Mechanisms vs. templates in modeling the phenomena. In: Models, Simulations, and Representations, ed. by P. Humphreys, C. Imbert (Routledge, New York 2012) pp. 3–24

34.47 A. Carlson, T. Carey, P. Holsberg (Eds.): Handbook of Analog Computation, 2nd edn. (Electronic Associates, Princeton 1967)

34.48 M.C. Gilliland: Handbook of Analog Computation: Including Application of Digital Control Logic (Systron-Donner Corp, Concord 1967)

34.49 V.M. Kendon, K. Nemoto, W.J. Munro: Quantum analogue computing, Philos. Trans. R. Soc. A 368(1924), 3609–3620 (2010)

34.50 C. Shannon: The mathematical theory of communication, Bell Syst. Tech. J. 27, 379–423 (1948)

34.51 M.B. Pour-El: Abstract computability and its relation to the general purpose analog computer (Some connections between logic, differential equations and analog computers), Trans. Am. Math. Soc. 199, 1–28 (1974)

34.52 M. Pour-El, I. Richards: Computability in Analysis and in Physics, Perspectives in Mathematical Logic (Springer, Berlin, Heidelberg 1988)

34.53 E. Arnold: Experiments and simulations: Do they fuse? In: Computer Simulations and the Changing Face of Scientific Experimentation, ed. by J.M. Durán, E. Arnold (Cambridge Scholars Publishing, Newcastle upon Tyne 2013)

34.54 R. Trenholme: Analog simulation, Philos. Sci. 61(1), 115–131 (1994)

34.55 P.K. Kundu, I.M. Cohen, H.H. Hu: Fluid Mechanics, 3rd edn. (Elsevier, Amsterdam 2004)

34.56 S.G. Sterrett: Models of machines and models of phenomena, Int. Stud. Philos. Sci. 20, 69–80 (2006)

34.57 S.G. Sterrett: Similarity and dimensional analysis. In: Philosophy of Technology and Engineering Sciences, ed. by A. Meijers (Elsevier, Amsterdam 2009)

34.58 G.I. Barenblatt: Scaling, Self-Similarity, and Intermediate Asymptotics, Cambridge Texts in Applied Mathematics, Vol. 14 (Cambridge Univ. Press, Cambridge 1996)

Computer Simulations and Computational Models in Science References 777

34.59 R.W. Shonkwiler, L. Lefton: An Introduction to Parallel and Vector Scientific Computing (Cambridge Univ. Press, Cambridge 2006)

34.60 J.M. Borwein, R.E. Crandall: Closed forms: What they are and why we care, Not. Am. Math. Soc. 60(1), 50 (2013)

34.61 N. Fillion, S. Bangu: Numerical methods, complexity, and epistemic hierarchies, Philos. Sci. 82(5), 941–955 (2015)

34.62 N. Fillion, R.M. Corless: On the epistemological analysis of modeling and computational error in the mathematical sciences, Synthese 191(7), 1451–1467 (2014)

34.63 R. Feynman: Simulating physics with computers, Int. J. Theor. Phys. 21(6/7), 467–488 (1982)

34.64 T. Toffoli: Cellular automata as an alternative to (rather than an approximation of) differential equations in modeling physics, Physica D 10, 117–127 (1984)

34.65 N. Margolus: Crystalline computation. In: Feynman and Computation: Exploring the Limits of Computers, ed. by A. Hey (Westview, Boulder 2002)

34.66 R. Hegselmann: Understanding social dynamics: The cellular automata approach. In: Social Science Microsimulation, ed. by K.G. Troitzsch, U. Mueller, G.N. Gilbert, J. Doran (Springer, London 1996) pp. 282–306

34.67 C.G. Langton: Studying artificial life with cellular automata, Physica D 22, 120–149 (1986)

34.68 B. Hasslacher: Discrete fluids, Los Alamos Sci. 15 (Special Issue), 175–217 (1987)

34.69 N. Metropolis, S. Ulam: The Monte Carlo method, J. Am. Stat. Assoc. 44(247), 335–341 (1949)

34.70 P. Galison: Computer simulations and the trading zone. In: The Disunity of Science: Boundaries, Contexts, and Power, ed. by P. Galison, D. Stump (Stanford Univ. Press, Stanford 1996) pp. 118–157

34.71 P. Galison: Image and Logic: A Material Culture of Microphysics (Univ. Chicago Press, Chicago 1997)

34.72 C. Beisbart, J. Norton: Why Monte Carlo simulations are inferences and not experiments. In: International Studies in Philosophy of Science, Vol. 26, ed. by J.W. McAllister (Routledge, Abingdon 2012) pp. 403–422

34.73 S. Succi: The Lattice Boltzmann Equation for Fluid Dynamics and Beyond (Clarendon, Oxford 2001)

34.74 M.A. Bedau: Weak emergence, Philos. Perspect. 11, 375–399 (1997)

34.75 T. Grüne-Yanoff: The explanatory potential of artificial societies, Synthese 169(3), 539–555 (2009)

34.76 B. Epstein: Agent-based modeling and the fallacies of individualism. In: Models, Simulations, and Representations, ed. by P. Humphreys, C. Imbert (Routledge, London 2011) pp. 115–144

34.77 S.B. Pope: Turbulent Flows (Cambridge Univ. Press, Cambridge 2000)

34.78 P.N. Edwards: A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming (MIT Press, Cambridge 2010)

34.79 M. Heymann: Understanding and misunderstanding computer simulation: The case of atmospheric and climate science – An introduction, Stud. Hist. Philos. Sci. Part B 41(3), 193–200 (2010), Special Issue: Modelling and Simulation in the Atmospheric and Climate Sciences

34.80 E. Winsberg: Handshaking your way to the top: Inconsistency and falsification in intertheoretic reduction, Philos. Sci. 73, 582–594 (2006)

34.81 P. Humphreys: Scientific knowledge. In: Handbook of Epistemology, ed. by I. Niiniluoto, M. Sintonen, J. Woleński (Springer, Dordrecht 2004)

34.82 W.S. Parker: Understanding pluralism in climate modeling, Found. Sci. 11(4), 349–368 (2006)

34.83 W.S. Parker: Ensemble modeling, uncertainty and robust predictions, Wiley Interdiscip. Rev. Clim. Change 4(3), 213–223 (2013)

34.84 M. Sundberg: Cultures of simulations vs. cultures of calculations? The development of simulation practices in meteorology and astrophysics, Stud. Hist. Philos. Sci. Part B 41, 273–281 (2010), Special Issue: Modelling and Simulation in the Atmospheric and Climate Sciences

34.85 M. Sundberg: The dynamics of coordinated comparisons: How simulationists in astrophysics, oceanography and meteorology create standards for results, Soc. Stud. Sci. 41(1), 107–125 (2011)

34.86 E. Tal: From data to phenomena and back again: Computer-simulated signatures, Synthese 182(1), 117–129 (2011)

34.87 R. El Skaf, C. Imbert: Unfolding in the empirical sciences: Experiments, thought experiments and computer simulations, Synthese 190(16), 3451–3474 (2013)

34.88 L. Soler, S. Zwart, M. Lynch, V. Israel-Jost: Science After the Practice Turn in the Philosophy, History, and Social Studies of Science (Routledge, London 2014)

34.89 H. Chang: The philosophical grammar of scientific practice, Int. Stud. Philos. Sci. 25(3), 205–221 (2011)

34.90 H. Chang: Epistemic activities and systems of practice: Units of analysis in philosophy of science after the practice turn. In: Science After the Practice Turn in the Philosophy, History and Social Studies of Science, ed. by L. Soler, S. Zwart, M. Lynch, V. Israel-Jost (Routledge, London 2014) pp. 67–79

34.91 A. Barberousse, S. Franceschelli, C. Imbert: Computer simulations as experiments, Synthese 169(3), 557–574 (2009)

34.92 P. Grim, R. Rosenberger, A. Rosenfeld, B. Anderson, R.E. Eason: How simulations fail, Synthese 190(12), 2367–2390 (2013)

34.93 J.H. Fetzer: Program verification: The very idea, Commun. ACM 31(9), 1048–1063 (1988)

34.94 A. Asperti, H. Geuvers, R. Natarajan: Social processes, program verification and all that, Math. Struct. Comput. Sci. 19(5), 877–896 (2009)

34.95 W.L. Oberkampf, C.J. Roy: Verification and Validation in Scientific Computing (Cambridge Univ. Press, Cambridge 2010)

34.96 W.S. Parker: Computer simulation. In: The Routledge Companion to Philosophy of Science, ed. by S. Psillos, M. Curd (Routledge, London 2013)


34.97 J. Lenhard: Computer simulation: The cooperation between experimenting and modeling, Philos. Sci. 74(2), 176–194 (2007)

34.98 N. Oreskes, K. Shrader-Frechette, K. Belitz: Verification, validation, and confirmation of numerical models in the earth sciences, Science 263(5147), 641–646 (1994)

34.99 J. Lenhard, E. Winsberg: Holism, entrenchment, and the future of climate model pluralism, Stud. Hist. Philos. Sci. 41(3), 253–262 (2010)

34.100 A. Barberousse, C. Imbert: New mathematics for old physics: The case of lattice fluids, Stud. Hist. Philos. Sci. Part B 44(3), 231–241 (2013)

34.101 J.M. Boumans: Understanding in economics: Gray-box models. In: Scientific Understanding: Philosophical Perspectives, ed. by H.W. de Regt, S. Leonelli, K. Eigner (Univ. Pittsburgh Press, Pittsburgh 2009)

34.102 C. Imbert: L'opacité intrinsèque de la nature: Théories connues, phénomènes difficiles à expliquer et limites de la science, Ph.D. Thesis (Atelier national de Reproduction des Thèses, Lille 2008), in French, http://www.theses.fr/2008PA010703

34.103 J. Hardwig: The role of trust in knowledge, J. Philos. 88(12), 693–708 (1991)

34.104 H. Reichenbach: On probability and induction, Philos. Sci. 5(1), 21–45 (1938), reprinted in S. Sarkar (Ed.): Logic, Probability and Induction (Garland, New York 1996)

34.105 A. Barberousse, C. Imbert: Recurring models and sensitivity to computational constraints, The Monist 97(3), 259–279 (2014)

34.106 T. Kuhn: The Structure of Scientific Revolutions, 3rd edn. (Univ. Chicago Press, Chicago 1996)

34.107 P. Kitcher: Explanatory unification and the causal structure of the world. In: Scientific Explanation, ed. by P. Kitcher, W. Salmon (Univ. Minnesota Press, Minneapolis 1989)

34.108 R. De Langhe: A unified model of the division of cognitive labor, Philos. Sci. 81(3), 444–459 (2014)

34.109 A. Lyon: Why are normal distributions normal?, Br. J. Philos. Sci. (2013), doi:10.1093/bjps/axs046

34.110 R. Batterman: Why equilibrium statistical mechanics works: Universality and the renormalization group, Philos. Sci. 65, 183–208 (1998)

34.111 R. Batterman: Multiple realizability and universality, Br. J. Philos. Sci. 51, 115–145 (2000)

34.112 R. Batterman: Asymptotics and the role of minimal models, Br. J. Philos. Sci. 53, 21–38 (2002)

34.113 E. Winsberg: A tale of two methods, Synthese 169(3), 575–592 (2009)

34.114 S.D. Norton, F. Suppe: Why atmospheric modeling is good science. In: Changing the Atmosphere: Expert Knowledge and Environmental Governance, ed. by P. Edwards, C. Miller (MIT Press, Cambridge 2001)

34.115 C. Beisbart: How can computer simulations produce new knowledge?, Eur. J. Philos. Sci. 2, 395–434 (2012)

34.116 E.A. Di Paolo, J. Noble, S. Bullock: Simulation models as opaque thought experiments, Proc. 7th Int. Conf. Artif. Life, ed. by M.A. Bedau, J.S. McCaskill, N. Packard, S. Rasmussen (MIT Press, Cambridge 2000) pp. 497–506

34.117 S. Chandrasekharan, N.J. Nersessian, V. Subramanian: Computational modeling: Is this the end of thought experimenting in science? In: Thought Experiments in Philosophy, Science and the Arts, ed. by J. Brown, M. Frappier, L. Meynell (Routledge, London 2012) pp. 239–260

34.118 J.D. Norton: Are thought experiments just what you thought?, Can. J. Philos. 26, 333–366 (1996)

34.119 J.D. Norton: On thought experiments: Is there more to the argument?, Philos. Sci. 71, 1139–1151 (2004)

34.120 R. Descartes: Discours de la méthode. In: Oeuvres de Descartes, Vol. 6, ed. by C. Adam, P. Tannery (J. Vrin, Paris 1996), first published in 1637

34.121 P. Humphreys: What are data about? In: Computer Simulations and the Changing Face of Experimentation, ed. by E. Arnold, J. Durán (Cambridge Scholars Publishing, Cambridge 2013)

34.122 M. Stöckler: On modeling and simulations as instruments for the study of complex systems. In: Science at Century's End: Philosophical Questions on the Progress and Limits of Science, ed. by M. Carrier, G. Massey, L. Ruetsche (Univ. Pittsburgh Press, Pittsburgh 2000) pp. 355–373

34.123 P. Humphreys: Computational and conceptual emergence, Philos. Sci. 75(5), 584–594 (2008)

34.124 P. Humphreys: The philosophical novelty of computer simulation methods, Synthese 169(3), 615–626 (2008)

34.125 A. Barberousse, M. Vorms: Computer simulations and empirical data. In: Computer Simulations and the Changing Face of Scientific Experimentation, ed. by J.M. Durán, E. Arnold (Cambridge Scholars Publishing, Newcastle upon Tyne 2013)

34.126 J.A. Fodor: Special sciences (or: The disunity of science as a working hypothesis), Synthese 28(2), 97–115 (1974)

34.127 M.S. Morgan: Experiments without material intervention: Model experiments, virtual experiments and virtually experiments. In: The Philosophy of Scientific Experimentation, ed. by H. Radder (Univ. Pittsburgh Press, Pittsburgh 2003) pp. 216–235

34.128 C. Hempel: Aspects of Scientific Explanation and Other Essays in the Philosophy of Science (Free Press, New York 1965)

34.129 W. Salmon: Scientific Explanation and the Causal Structure of the World (Princeton Univ. Press, Princeton 1984)

34.130 W. Salmon: Causality without counterfactuals, Philos. Sci. 61, 297–312 (1994)

34.131 P. Railton: Probability, explanation, information, Synthese 48, 233–256 (1981)

34.132 P. Kitcher: The Advancement of Science: Science Without Legend, Objectivity Without Illusions (Oxford Univ. Press, New York 1993)

34.133 T. Grüne-Yanoff, P. Weirich: The philosophy and epistemology of simulation: A review, Simul. Gaming 41(1), 20–50 (2010)


34.134 A. Ilachinski: Cellular Automata: A Discrete Universe (World Scientific, Singapore 2001)

34.135 E.F. Keller: Models, simulation and computer experiments. In: The Philosophy of Scientific Experimentation, ed. by H. Radder (Univ. Pittsburgh Press, Pittsburgh 2003) pp. 198–215

34.136 D. Dowling: Experimenting on theories, Sci. Context 12(2), 261–273 (1999)

34.137 G. Piccinini: Computational explanation and mechanistic explanation of mind. In: Cartographies of the Mind, ed. by M. Marraffa, M. De Caro, F. Ferretti (Springer, Dordrecht 2007) pp. 23–36

34.138 E. Arnold: What's wrong with social simulations?, The Monist 97(3), 359–377 (2014)

34.139 S. Ruphy: Limits to modeling: Balancing ambition and outcome in astrophysics and cosmology, Simul. Gaming 42(2), 177–194 (2011)

34.140 B. Epstein, P. Forber: The perils of tweaking: How to use macrodata to set parameters in complex simulation models, Synthese 190(2), 203–218 (2012)

34.141 W. Bechtel, R.C. Richardson: Discovering Complexity: Decomposition and Localization as Strategies in Scientific Research (MIT Press, Cambridge 1993)

34.142 H. Zwirn: Les systèmes complexes (Odile Jacob, Paris 2006), in French

34.143 Y. Bar-Yam: Dynamics of Complex Systems (Westview, Boulder 1997)

34.144 R. Badii, A. Politi: Complexity: Hierarchical Structures and Scaling in Physics (Cambridge Univ. Press, Cambridge 1999)

34.145 D. Little: Varieties of Social Explanation: An Introduction to the Philosophy of Social Science (Westview, Boulder 1990)

34.146 H. Kincaid: Philosophical Foundations of the Social Sciences: Analyzing Controversies in Social Research (Cambridge Univ. Press, Cambridge 1996)

34.147 C. Hitchcock: Discussion: Salmon on explanatory relevance, Philos. Sci. 62, 304–320 (1995)

34.148 C. Imbert: Relevance, not invariance, explanatoriness, not manipulability: Discussion of Woodward's views on explanatory relevance, Philos. Sci. 80(5), 625–636 (2013)

34.149 W.C. Salmon: Four Decades of Scientific Explanation (Univ. Pittsburgh Press, Pittsburgh 2006)

34.150 G. Schurz: Relevant deduction, Erkenntnis 35(1–3), 391–437 (1991)

34.151 H.E. Kyburg: Comment, Philos. Sci. 32, 147–151 (1965)

34.152 M. Scriven: Explanations, predictions, and laws. In: Scientific Explanation, Space, and Time, Vol. 3, ed. by H. Feigl, G. Maxwell (Univ. Minnesota Press, Minneapolis 1962) pp. 170–230

34.153 J. Woodward: Scientific explanation. In: The Stanford Encyclopedia of Philosophy, ed. by E.N. Zalta (Stanford Univ., Stanford 2014), http://plato.stanford.edu/archives/win2014/entries/scientific-explanation/

34.154 J. Woodward: Making Things Happen (Oxford Univ. Press, Oxford 2003)

34.155 S. Wolfram: A New Kind of Science (Wolfram Media, Champaign 2002)

34.156 H.W. de Regt, D. Dieks: A contextual approach to scientific understanding, Synthese 144(1), 137–170 (2005)

34.157 R.P. Feynman, R.B. Leighton, M.L. Sands: The Feynman Lectures on Physics, Vol. 3 (Addison-Wesley, Reading 1963)

34.158 C. Hempel: Reasons and covering laws in historical explanation. In: The Philosophy of C.G. Hempel: Studies in Science, Explanation, and Rationality, ed. by J.H. Fetzer (Oxford Univ. Press, Oxford 2000), first published in 1963

34.159 J. Lenhard: Surprised by a nanowire: Simulation, control, and understanding, Philos. Sci. 73(5), 605–616 (2006)

34.160 M. Bedau: Downward causation and the autonomy of weak emergence, Principia 6, 5–50 (2003)

34.161 P. Huneman: Determinism, predictability and open-ended evolution: Lessons from computational emergence, Synthese 185(2), 195–214 (2012)

34.162 C. Imbert: Why diachronically emergent properties must also be salient. In: World Views, Science, and Us: Philosophy and Complexity, ed. by C. Gershenson, D. Aerts, B. Edmonds (World Scientific, Singapore 2007) pp. 99–116

34.163 H. Zwirn, J.P. Delahaye: Unpredictability and computational irreducibility. In: Irreducibility and Computational Equivalence, Emergence, Complexity and Computation, Vol. 2, ed. by H. Zenil (Springer, Berlin, Heidelberg 2013) pp. 273–295

34.164 J. Kuorikoski: Simulation and the sense of understanding. In: Models, Simulations, and Representations, ed. by P. Humphreys, C. Imbert (Routledge, London 2012)

34.165 C.R. Shalizi, C. Moore: What is a macrostate? Subjective observations and objective dynamics (2003), arXiv:cond-mat/0303625

34.166 N. Israeli, N. Goldenfeld: Computational irreducibility and the predictability of complex physical systems, Phys. Rev. Lett. 92(7), 074105 (2004)

34.167 N. Goodman: Languages of Art (Hackett, Indianapolis 1976)

34.168 M. Vorms: Formats of representation in scientific theorizing. In: Models, Simulations, and Representations, ed. by P. Humphreys, C. Imbert (Routledge, London 2012) pp. 250–273

34.169 J. Jebeile: Explication et compréhension dans les sciences empiriques. Les modèles scientifiques et le tournant computationnel, Ph.D. Thesis (Université Paris, Paris 2013), in French

34.170 S. Bullock: Levins and the lure of artificial worlds, The Monist 97(3), 301–320 (2014)

34.171 J. Lenhard: Autonomy and automation: Computational modeling, reduction, and explanation in quantum chemistry, The Monist 97(3), 339–358 (2014)

34.172 K. Appel, W. Haken: Every planar map is four colorable. I. Discharging, Ill. J. Math. 21(3), 429–490 (1977)

34.173 K. Appel, W. Haken, J. Koch: Every planar map is four colorable. II. Reducibility, Ill. J. Math. 21(3), 491–567 (1977)

34.174 T. Tymoczko: New Directions in the Philosophy of Mathematics: An Anthology (Princeton Univ. Press, Princeton 1998)

34.175 I. Lakatos: Proofs and Refutations (Cambridge Univ. Press, Cambridge 1976)

34.176 H. Putnam: What is mathematical truth? In: Mathematics, Matter and Method, Vol. 1 (Cambridge Univ. Press, Cambridge 1975) pp. 60–78

34.177 F. Guala: Models, simulations, and experiments. In: Model-Based Reasoning, ed. by L. Magnani, N.J. Nersessian (Springer, New York 2002) pp. 59–74

34.178 M. Morrison: Models, measurement and computer simulation: The changing face of experimentation, Philos. Stud. 143(1), 33–57 (2009)

34.179 R.N. Giere: Is computer simulation changing the face of experimentation?, Philos. Stud. 143(1), 59–62 (2009)

34.180 D. Shapere: The concept of observation in science and philosophy, Philos. Sci. 49(4), 485–525 (1982)

34.181 P. Humphreys: X-ray data and empirical content. In: Logic, Methodology and Philosophy of Science, Proc. 14th Int. Congr. (Nancy), ed. by P. Schroeder-Heister, W. Hodges, G. Heinzmann, P.E. Bour (College Publications, London 2014) pp. 219–234

34.182 V. Israel-Jost: The impact of modern imaging techniques on the concept of observation: A philosophical analysis, Ph.D. Thesis (Université de Paris, Panthéon-Sorbonne 2011)

34.183 D. Resnik: Some recent challenges to openness and freedom in scientific publication. In: Ethics for Life Scientists, Vol. 5 (Springer, Dordrecht 2005) pp. 85–99

34.184 M. Frické: Big data and its epistemology, J. Assoc. Inf. Sci. Technol. 66(4), 651–661 (2014)

34.185 S. Leonelli: What difference does quantity make? On the epistemology of big data in biology, Big Data Soc. (2014), doi:10.1177/2053951714534395

34.186 W.S. Parker: Franklin, Holmes, and the epistemology of computer simulation, Int. Stud. Philos. Sci. 22(2), 165–183 (2008)

34.187 W.S. Parker: Computer simulation through an error-statistical lens, Synthese 163, 371–384 (2008)

34.188 D.G. Mayo: Error and the Growth of Experimental Knowledge (Univ. Chicago Press, Chicago 1996)

34.189 H.M. Collins: Tacit and Explicit Knowledge (Univ. Chicago Press, Chicago 2010)

34.190 L. Soler, E. Trizio, T. Nickles, W.C. Wimsatt: Characterizing the Robustness of Science: After the Practice Turn in Philosophy of Science (Springer, Dordrecht 2012)

34.191 A. Gelfert: Scientific models, simulation, and the experimenter's regress. In: Models, Simulations, and Representations, ed. by P. Humphreys, C. Imbert (Routledge, London 2011) pp. 145–167

34.192 H.M. Collins: Changing Order: Replication and Induction in Scientific Practice (Sage, London 1985)

34.193 B. Godin, Y. Gingras: The experimenters' regress: From skepticism to argumentation, Stud. Hist. Philos. Sci. Part A 33(1), 133–148 (2002)

34.194 A. Franklin: How to avoid the experimenters' regress, Stud. Hist. Philos. Sci. 25, 97–121 (1994)

34.195 E. Winsberg: Computer simulations in science. In: The Stanford Encyclopedia of Philosophy, ed. by E.N. Zalta (Stanford Univ., Stanford 2014), http://plato.stanford.edu/archives/fall2014/entries/simulations-science/

34.196 J.M. Durán: The use of the materiality argument in the literature for computer simulations. In: Computer Simulations and the Changing Face of Scientific Experimentation, ed. by J.M. Durán, E. Arnold (Cambridge Scholars, Newcastle upon Tyne 2013)

34.197 B. Mundy: On the general theory of meaningful representation, Synthese 67, 391–437 (1986)

34.198 O. Bueno: Empirical adequacy: A partial structures approach, Stud. Hist. Philos. Sci. 28, 585–610 (1997)

34.199 W.S. Parker: Does matter really matter? Computer simulations, experiments, and materiality, Synthese 169(3), 483–496 (2009)

34.200 I. Peschard: Computer simulation as substitute for experimentation? In: Simulations and Networks, ed. by S. Vaienti (Hermann, Paris), forthcoming, http://philsci-archive.pitt.edu/9010/1/Is_simulation_an_epistemic__substitute.pdf

34.201 E.C. Parke: Experiments, simulations, and epistemic privilege, Philos. Sci. 81(4), 516–536 (2014)

34.202 M.S. Morgan: Experiments versus models: New phenomena, inference and surprise, J. Econ. Methodol. 12(2), 317–329 (2005)

34.203 S. Roush: The epistemic superiority of experiment to simulation, Proc. PSA 2014 Conf., Chicago, to be published

34.204 S.L. Peck: Simulation as experiment: A philosophical reassessment for biological modeling, Trends Ecol. Evol. 19(10), 530–534 (2004)

34.205 R. Harré: The materiality of instruments in a metaphysics for experiments. In: The Philosophy of Scientific Experimentation, ed. by H. Radder (Univ. Pittsburgh Press, Pittsburgh 2003) pp. 19–38

34.206 M.S. Morgan: Model experiments and models in experiments. In: Model-Based Reasoning: Science, Technology, Values, ed. by L. Magnani, N.J. Nersessian (Springer, New York 2001)

34.207 J.M. Durán: A brief overview of the philosophical study of computer simulations, Am. Philos. Assoc. Newslett. Philos. Comput. 13(1), 38–46 (2013)

34.208 T. Boyer-Kassem: Layers of models in computer simulations, Int. Stud. Philos. Sci. 28(4), 417–436 (2014)

34.209 R.I.G. Hughes: The Theoretical Practices of Physics: Philosophical Essays (Oxford Univ. Press, Oxford 2010)

34.210 O. Bueno: Computer simulations: An inferential conception, The Monist 97(3), 378–398 (2014)

34.211 M. Weisberg: Simulation and Similarity: Using Models to Understand the World (Oxford Univ. Press, Oxford 2013)

34.212 R. Batterman: The Devil in the Details: Asymptotic Reasoning in Explanation, Reduction, and Emergence (Oxford Univ. Press, Oxford 2002)

34.213 E. Winsberg: Science in the Age of Computer Simulation (Univ. Chicago Press, Chicago 2010)

34.214 A.I. Janis: Can thought experiments fail? In: Thought Experiments in Science and Philosophy, ed. by T. Horowitz, G. Massey (Rowman & Littlefield, Lanham 1991) pp. 113–118

34.215 J.R. Searle: The Construction of Social Reality (Free Press, London 1996)

34.216 G. Piccinini: Computation in physical systems. In: The Stanford Encyclopedia of Philosophy, ed. by E.N. Zalta (Fall 2012 Edition), http://plato.stanford.edu/archives/fall2012/entries/computation-physicalsystems/

34.217 K. Zuse: The computing universe, Int. J. Theor. Phys. 21, 589–600 (1982)

34.218 E. Fredkin: Digital mechanics: An informational process based on reversible universal cellular automata, Physica D 45, 1–3 (1990)

34.219 R.N. Giere: How models are used to represent reality, Philos. Sci. 71, 742–752 (2004)

34.220 U. Mäki: Models and the locus of their truth, Synthese 180(1), 47–63 (2011)

34.221 R. Giere: An agent-based conception of models and scientific representation, Synthese 172(2), 269–281 (2010)