Weisbuch, Gérard - Solomon, Sorin - Tackling Complexity in Science - General Integration of the Application of Complexity in Science


  • 8/20/2019 Weisbuch, Gerard - Solomon, Sorin - Tackling Complexity in Science - General Integration of the Application of Co…

    1/60

TACKLING COMPLEXITY IN SCIENCE

General Integration of the Application of Complexity in Science

PROJECT REPORT

A NEST PATHFINDER INITIATIVE


Interested in European research? Research*eu is our monthly magazine keeping you in touch with main developments (results, programmes, events, etc.). It is available in English, French, German and Spanish. A free sample copy or free subscription can be obtained from:

European Commission
Directorate-General for Research
Communication Unit
B-1049 Brussels
Fax (32-2) 29-58220
E-mail: [email protected]
http://ec.europa.eu/research/research-eu

    Cover photo courtesy of Walter Tross, STARFLAG Project (CNR-INFM)

    EUROPEAN COMMISSION

Directorate-General for Research
Directorate S - Implementation of the 'Ideas' Programme

E-mail: [email protected]
http://cordis.europa.eu/nest

Contact: Alejandro Martin Hobdey
E-mail: [email protected]

European Commission
Office MADO 5/10
B-1210 Brussels
Tel. (32-2) 29-94 589


Directorate-General for Research
Directorate S - Implementation of the 'Ideas' Programme
2007

    EUROPEAN COMMISSION

Tackling Complexity in Science
General Integration of the Application of Complexity in Science

edited by
Gérard Weisbuch
and
Sorin Solomon


There is no commonly accepted definition of "complexity". Nevertheless, it is clear that widely differing systems, composed of many interacting units, generate under certain conditions a characteristic common phenomenology, the most prominent signature of which is the "emergence" of new patterns transcending the characteristics of the individual units.

The promise of the science of complexity is to provide, if not a completely unified approach, at least common tools for tackling complex problems arising in a wide range of scientific domains.

NEST, New and Emerging Science and Technology, is an initiative under the European Commission's Sixth Framework Programme to promote highly interdisciplinary, high-risk research. NEST launched a number of focussed "PATHFINDER" initiatives, including "Tackling Complexity in Science".

The "Tackling Complexity" initiative aims at encouraging the development, and the transfer from one area of science to another, of solutions and approaches to concrete complex real-world problems. Such solutions need, at the same time, to hold the promise of being more generalisable.

The initiative undertook two calls for proposals (2004 and 2006), which resulted in the funding of 23 focussed STREP research projects and one Coordination Action, GIACS, amounting to approximately EUR 35 million worth of new research funding.

The goal of GIACS (General Integration of the Application of Complexity in Science) is to serve as a forum bringing the research of these projects together and helping to connect them with ongoing research in the field of complexity in Europe and the rest of the world.

The current report is one of the deliverables of GIACS and speculates about the future directions and opportunities of complexity research in the long term. It is the result of the interaction of a large number of outstanding scientists involved in GIACS and other NEST-Complexity projects. It is intended to be speculative and thought-provoking, while at the same time serving as an inspiration for future research directions in the field.


    TABLE OF CONTENTS

EXECUTIVE REPORT .......................................................... 5

1 INTRODUCTION ............................................................ 7
1.1 From components to global properties .................................. 7
1.2 A historical perspective .............................................. 9
1.3 Concepts and methods ................................................. 12
1.3.1 Concepts ........................................................... 12
1.3.2 Methods ............................................................ 13
1.4 Caveat: to be transversal or not ..................................... 15
1.5 Domains of application ............................................... 15

2 SOCIAL SCIENCES ........................................................ 18
2.1 Social systems ....................................................... 18
2.2 Economics ............................................................ 21
2.2.1 Success stories .................................................... 22
2.2.2 Socio-economic networks ............................................ 24

3 BIOLOGY ................................................................ 26
3.1 Theoretical biology .................................................. 26
3.2 Systems Biology ...................................................... 27
3.3 Cognitive Science .................................................... 31
3.4 Engineering emergence in self-assembling systems ..................... 32
3.4.1 Control and programming of self-assembly ........................... 32
3.5 Epidemic modeling and complex realities .............................. 34
3.6 Bio-inspired strategies for new computing architectures and algorithms paradigms ... 37

4 INFORMATION AND COMMUNICATION TECHNOLOGIES ............................. 40
4.1 Optimization and Complexity .......................................... 40
4.2 Networks, especially computer ........................................ 41

5 COMPLEX SYSTEMS, BUSINESS AND SOCIETY .................................. 44
5.1 Econophysics and Finance as a success story .......................... 44
5.2 Present challenges ................................................... 44
5.3 Supply Networks Management ........................................... 45
5.4 Credit networks and Systemic Risk .................................... 46
5.5 Collective Evaluation and Trust-based Networks ....................... 47
5.5.1 Overview of emerging technologies .................................. 47
5.5.2 Potential Impact ................................................... 47
5.5.3 Trust-based Networks ............................................... 48
5.6 Penetration of CS science in the business world ...................... 49

6 LIST OF CONTRIBUTORS AND EXPERTS ....................................... 50
6.1 Editor ............................................................... 50
6.2 Contributors to specific parts of the document ....................... 50
6.3 Participants to think tank meetings .................................. 50
6.3.1 Jerusalem, 26 May 2006 ............................................. 50
6.3.2 Paris, 6 and 8 June 2006 ........................................... 50
6.3.3 Torino, Villa Gualino, 25-26 July 2006 ............................. 51
6.4 Acknowledgment ....................................................... 51

7 REFERENCES ............................................................. 52


    1 INTRODUCTION

    1.1 From components to global properties

    The challenge

Let us exemplify the challenge of complex system research with the mind/brain issue. The brain is composed of 10^11 interacting neurons, which can be considered as electric cables (axons) with chemical connections (synapses). How can we deduce the properties of the mind (cognition) from the structure of the brain? More generally, how can we explain the functional organization of biological systems, at the level of organs for instance, from the properties of their components (from neurons to mind, or from lymphocytes to the immune response)?

Similar questions arise in the social sciences: can we understand social organizations (institutions) and collective behaviour from the properties of the human agents which compose them? Such a research program is called methodological individualism in the social sciences.

    Can we define complex systems?

The answer is yes, but some caution is needed: one should keep in mind the difference between real-world complex systems and their models.

Let us start with models: complex systems are composed of a very large number of different elements with non-linear interactions; furthermore the interaction structure, a network, comprises many entangled loops.

The purpose of complex system research is to propose and solve models such that we can deduce the functional organization from the interaction of the components. But to handle such large systems, the description of their elementary components is generally oversimplified with respect to real systems: hence the issue of comparing the predictions of the model with the real system. We will see that complex system theory has often addressed this issue by restricting predictions to generic properties. (A recent shift in interests occurred with the availability of large data sets in biology and the social sciences.)

Before addressing the issue of generic properties, which can be rather technical, let us discuss one of the caricatures of complex systems: that complex systems science is a theory of the whole, in other words so general that it has no practical use. Let us use at this stage some comparisons.

Thermodynamics is not only a theory about work and heat: it applies to all physical systems, whether gases, liquids or solids, with their electric, magnetic or elastic properties. In fact statistical mechanics, the microscopic theory behind thermodynamics, is probably the closest theoretical analogue to complex systems theory: its purpose is to deduce the macroscopic properties of matter from those of its individual components (atoms, molecules, polymers). But not all properties of matter rest on thermodynamics: optical properties can often be deduced by adding up the properties of individual atoms. The fact that some answers can be obtained from complex systems theory does not mean that complex systems theory exhausts all fields of research. There are many other approaches to real systems with little direct connection to complex systems methodology: experiments, empirical data collection, monographs, etc. Even if we consider complex systems theory as particularly well adapted to the formalization of problems in the biological and social sciences, this does not imply that it should be the only approach to logical or mathematical formulations.

    The projection perspective is probably an appropriate metaphor (see figure 1).


    1.2 A historical perspective

Complex systems perspectives and methods are roughly half a century old. This reminder is important in the present context: we have by now acquired technical tools to deal with real-world complex systems. "Complex systems" is not a buzzword, and the interest in complex systems does not boil down to a bandwagon effect. This implies that practitioners in the different fields of application can avoid re-inventing the wheel, either by learning the theory, concepts and methods, or by collaborating with complex systems specialists.

One could argue that the first computation in the spirit of the complex systems approach is Clausius' virial theorem (1857), relating pressure to the velocities of individual molecules. Of course the names of Gibbs and Boltzmann, who later set the basis of statistical mechanics in the 1860s, are more familiar. Let us rather start with the first challenges in the applications to biological problems, in the 1940s.

The two seminal papers in complex systems are sixty years old. They address such issues as:

• What are the logical units necessary to build a self-reproducing automaton? (Von Neumann)

    • What can you do with an assembly of formal neurons? (McCulloch and Pitts)

In other words, rather than discussing "wet biology", they approached the issue from an abstract perspective. Both results are expressed in terms of universality, independent of any physical implementation. Assemblies of formal neurons can achieve the computation of any recursive function (i.e. they are equivalent to universal Turing machines). Von Neumann proposed a cellular automaton capable of self-reproduction and of universal computation. Both papers were based on the concept of networks of automata and their dynamical properties (threshold automata for McCulloch and Pitts, Boolean automata for Von Neumann).

The perceptron is a simple architecture and algorithm allowing a machine (a simple threshold automaton) to learn from examples and classify patterns. It is the first example of a device based on complex systems ideas. One of its variants, the adaline, is still in use to automatically trim modems.
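As an illustration, here is a minimal sketch of the perceptron learning rule on a toy linearly separable problem; the function names and the AND-gate data are ours, chosen only for illustration:

```python
def perceptron_train(samples, labels, epochs=50, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) matches the +1/-1 labels."""
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if activation >= 0 else -1
            if pred != y:  # the perceptron rule: update only on mistakes
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
                errors += 1
        if errors == 0:  # converged: the data are linearly separable
            break
    return w, b

def perceptron_predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Toy linearly separable problem: the logical AND, with -1/+1 labels.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, -1, -1, 1]
w, b = perceptron_train(X, y)
```

For separable data such as this, the classical perceptron convergence theorem guarantees that the loop terminates with all examples classified correctly.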

Around 1970, three independent contributions stressed the importance of genericity and universality classes. Thom's catastrophe theory, and its variant bifurcation theory, is applicable to macroscopic systems, while Wilson's and Kauffman's ideas are applicable to microscopic systems. The three contributions established the importance of generic properties and universality classes, a very new concept at that time. Some properties are common to classes of systems, independently of the details of the model. Thom's and Wilson's concepts apply in the vicinity of transitions (singularities for Thom). In a few words, Kauffman's model (the dynamics of random Boolean nets) suggests that organization phenomena observed in biology (in his case cell differentiation) are common to the large majority of random systems built on some simple prescription. In other words, biological organization might not be the result of a strong selection principle: a radical departure from the conventional biological wisdom of the 1960s.

During the early eighties, physicists started invading biology, cognitive sciences, etc. (Before that time, their efforts mostly concerned the discovery and application of complex systems ideas in phase transitions and the physics of disordered systems.) Hopfield's model of associative memories was influential in bringing physicists to the cognitive sciences. Kirkpatrick's approach to optimization problems through the simulated annealing algorithm, inspired by statistical mechanics, concerned the applied sciences (e.g. chip placement and wiring in computer design). Parisi's replica solution to the spin glass problem is an example of bold simplification of the original problem, associated with a sophisticated formalism, to obtain formal solutions of very difficult problems. The early eighties are also the time when computers became inexpensive and widely available for simulations.


Easier numerical simulations helped to bridge the gap between observations and experiments on the one hand and simplified theoretical models on the other.

The development of cognitive sciences since the mid-1980s probably exemplifies a general scheme followed by several other disciplines. Two major factors were critical in the shift from psychology to cognitive sciences: a fast development of experimental techniques (intra-cellular recordings, psycho-physics and, later, imagery) and the development of theory (neural nets).

An analogous development, also based on the availability of empirical and experimental data, occurred in the study of financial markets (in the 1990s), internet studies (2000s), systems biology (biochips, in the 2000s), etc.

Applied science domains have also flourished since the 1980s, involving large communities of scientists: signal processing (artificial vision, speech recognition) and combinatorial optimization.


    Figure 2: A short historical perspective in complex systems


    1.3 Concepts and methods

Is there a complexity science? The question is often asked, but the answer depends largely on what is meant by complexity. If we restrict the question to complex systems, the answer is definitely yes: we have built a theory of complex systems. We will review very briefly in this section a set of concepts and methods which give us a better insight into real complex systems.

    1.3.1 Concepts

Among the most important concepts are the notions of:

• Scaling: Scaling laws can be considered as a first level in modeling; they relate two quantities by rules such as:

Q ∝ L^α (1)

where L is a parameter (e.g. a length or a number of agents), Q a dynamical quantity (e.g. a period, some degree of complexity) and α some real number. The concept is pervasive in all sciences, but its importance in complex systems is due to two reasons:

– since understanding complex systems is so difficult, knowing how things depend on each other is already an important result;

– scaling laws are often generic; genericity is itself an important concept, which states that some results are robust and do not depend upon details.
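To make the notion concrete, here is a small sketch (function names and the synthetic data are ours, for illustration only) estimating the exponent α of equation (1) by least squares on log-log coordinates, the standard first-pass method:

```python
import math

def fit_power_law(L, Q):
    """Estimate alpha and c in Q ~ c * L**alpha by linear regression on log-log data."""
    xs = [math.log(l) for l in L]
    ys = [math.log(q) for q in Q]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope of the log-log regression line is the scaling exponent alpha
    alpha = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    log_c = my - alpha * mx
    return alpha, math.exp(log_c)

# Synthetic data obeying Q = 2 * L**1.5 exactly: the fit should recover both constants.
L = [1, 2, 4, 8, 16, 32]
Q = [2 * l ** 1.5 for l in L]
alpha, c = fit_power_law(L, Q)
```

On real data the points scatter around the regression line, and the quality of the straight-line fit on log-log axes is itself the first test of whether a scaling law holds.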

• Universality, genericity: Because the full description of real complex systems is very complicated (in other words, of large algorithmic complexity, necessitating a long description), a modeler cannot a priori be assured that the necessary simplifications of his model will not change the behaviour predicted by the model. Rather than trying to reproduce all the details of a real system, one looks for classes (of universality) of systems which share the same set of properties: the generic properties.

• Attractors: Complex systems evolve in time towards a restricted set of configurations, called the attractors of the dynamics. The nature of the attractors (fixed points, limit cycles or chaos), their number, etc. characterise the dynamics. The notion of attractors is the answer to the "emergence paradox": the system displays some form of organization apparently unpredictable from the rules that specify it.
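A minimal numerical illustration of attractors, using the logistic map x → rx(1-x) (a standard textbook example, our choice rather than one cited in this report): depending on the parameter r, the long-run dynamics settle on a fixed point or on a period-2 limit cycle.

```python
def logistic_orbit(r, x0=0.2, transient=1000, keep=8):
    """Iterate x -> r*x*(1-x), discard the transient, and return the distinct
    values visited afterwards: the attractor of the dynamics."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 6))  # round so converged values compare equal
    return sorted(set(orbit))

fixed = logistic_orbit(2.8)   # r < 3: a single stable fixed point at 1 - 1/r
cycle = logistic_orbit(3.2)   # 3 < r < ~3.45: a period-2 limit cycle
```

Sweeping r further (towards 3.57 and beyond) produces period doubling and then chaos, which is exactly the kind of sharp change of dynamical regime discussed under "Robustness" below.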

• Robustness: The attractors remain qualitatively unchanged over a finite range of parameters, defining a dynamical regime; but sharp transitions in the space of parameters separate dynamical regimes. These properties, dynamical regimes and sharp phase transitions, are the generic properties of the dynamics. Because the conclusions of complex systems studies are about the attractors, they are generic: they are common to large classes of equivalent systems, which may differ in their details.

• Networks: The concept of an interaction network is central to complex systems. The study of the static structure of interaction networks was recently renewed by the ideas of "small worlds" (Watts and Strogatz) and of scale-free networks (Barabasi and Albert; see further section 4.2). Many empirical studies on real-world nets, whether biological (from food webs to gene regulatory networks) or


social (from small communities to the WWW), were done in recent years; new characterizations of networks have been proposed, and in some cases a bridge was re-established between network sociologists and complex system scientists, e.g. in the search for small motifs in relation to functional properties. Dynamics and functional organization studies are far more difficult to achieve; they also require larger data sets. The predictions of Vespignani et al. on virus infection and immunization strategies in scale-free networks are a good example of the importance of dynamical studies (see further section 3.5).
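The scale-free networks of Barabasi and Albert can be grown by preferential attachment; the following sketch (illustrative code, not taken from any project cited here) shows the mechanism and the resulting heavy-tailed degree distribution:

```python
import random

def preferential_attachment(n, m=2, seed=42):
    """Grow a Barabasi-Albert-style network: each new node attaches m edges,
    choosing targets with probability proportional to their current degree."""
    rng = random.Random(seed)
    # start from a small complete core of m+1 nodes
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    stubs = [v for e in edges for v in e]  # each node listed once per incident edge
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(stubs))  # degree-proportional sampling
        for t in targets:
            edges.append((new, t))
            stubs.extend((new, t))
    return edges

def degrees(edges):
    d = {}
    for u, v in edges:
        d[u] = d.get(u, 0) + 1
        d[v] = d.get(v, 0) + 1
    return d

net = preferential_attachment(2000)
deg = degrees(net)
```

Most nodes keep the minimum degree m while a few early nodes become hubs with degrees an order of magnitude larger: the signature of a scale-free degree distribution, in contrast to the narrow degree spread of a random graph.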

• Many other general concepts were introduced, such as percolation, fractals, self-organised criticality, order parameters and functions, etc.

These general concepts guide research on any particular complex system: they are part of a theory of complex systems, as the concepts of energy, temperature, entropy and state functions shape thermodynamics. The analogy was first stated by Stuart Kauffman ("Waiting for Carnot").

    1.3.2 Methods

In quantum mechanics, for example, computations are based on a series of techniques (group theory, eigenvalues and eigenfunctions, perturbation methods). The same is true for complex systems. Let us enumerate some formal techniques:

    • Discretization, of time and/or space, e.g. cellular automata.

• Computation of the partition function Z via the replica ansatz.

    • Renormalization group.

    • Transfer matrix.

    • Random matrices.

    • Damage spreading methods.

    • Langevin equations.
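The first technique in the list, discretization of time and space via cellular automata, can be illustrated with an elementary one-dimensional automaton. The sketch below uses rule 90 (a standard example, our choice), which grows a Sierpinski-like pattern from a single seed cell:

```python
def step_rule(state, rule=90):
    """One synchronous update of an elementary cellular automaton
    (periodic boundary conditions)."""
    n = len(state)
    out = []
    for i in range(n):
        left, centre, right = state[(i - 1) % n], state[i], state[(i + 1) % n]
        idx = (left << 2) | (centre << 1) | right  # neighbourhood as a 3-bit number
        out.append((rule >> idx) & 1)              # look up the rule-table bit
    return out

# A single seed under rule 90 (new cell = left XOR right) unfolds into
# Pascal's triangle mod 2, i.e. the Sierpinski pattern.
state = [0] * 16
state[8] = 1
history = [state]
for _ in range(4):
    state = step_rule(state)
    history.append(state)
```

Despite the trivially simple local rule, the global pattern is self-similar: a small-scale example of emergence from local interactions.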

When formal methods are not applicable, numerical simulations can be used. But complex system methods are not restricted to computer simulations, as is sometimes stated. The idea is rather to couple formal methods, applied to the simplest models, with computer simulations that check the genericity of their conclusions on more elaborate versions.

    Computer simulations are often based on methods inspired by the complex systemapproach to physical and biological phenomena:

• Probabilistic methods, such as simulated annealing, rather than exhaustive searches in NP-complete optimization problems.

• Learning methods, such as the perceptron algorithm and its generalizations, inspired by neural nets.

    • Genetic algorithms and their variants.

• Ant algorithms, etc.
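As a sketch of the first item, here is a generic simulated annealing loop applied to a toy number-partitioning problem (an NP-complete task). All names, the data and the cooling parameters are illustrative choices, not a prescription:

```python
import math
import random

def anneal(energy, neighbour, x0, t0=2.0, cooling=0.995, steps=4000, seed=1):
    """Generic simulated annealing: always accept downhill moves, and accept
    uphill moves with probability exp(-dE/T), where T slowly decreases."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best, best_e = x, e
    t = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        de = energy(y) - e
        if de <= 0 or rng.random() < math.exp(-de / t):
            x, e = y, e + de
            if e < best_e:
                best, best_e = x, e
        t *= cooling  # geometric cooling schedule
    return best, best_e

# Toy problem: split the values into two sets of near-equal sum.
values = [7, 11, 13, 17, 19, 23, 29, 31, 37, 41]

def energy(mask):
    s = sum(v if (mask >> i) & 1 else -v for i, v in enumerate(values))
    return abs(s)  # imbalance between the two sets

def neighbour(mask, rng):
    return mask ^ (1 << rng.randrange(len(values)))  # flip one item's side

best, best_e = anneal(energy, neighbour, x0=0)
```

The high-temperature phase lets the walk escape local minima that a greedy search would be trapped in; as T falls the dynamics freeze into a low-energy configuration.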


    Figure 3: Perspectives in complex systems


• The importance of networks is pervasive in all domains. One contribution, by Guido Caldarelli (see further section 4.2), summarizes the most recent progress in network characterization and dynamics.

• In several domains, individual movements are recorded for large ensembles (in the thousands) in order to track the elementary processes responsible for interaction:

– in the ethology of animal groups, such as schools of fish or swarms of insects or birds,

    – walkers in crowds,

    – in economics, traders on financial markets,

    – email exchange inside a given community,

– even in embryology, where individual cells of the zebrafish are recorded during ontogenetic development.

These huge recordings have necessitated new progress in tracking algorithms, not to mention the hardware needs.


    Figure 4: A plan of this brochure


    2 SOCIAL SCIENCES

    2.1 Social systems

Social systems are recognised as complex by most modern researchers in the social sciences, but sociology and political science are, for historical reasons, among the less penetrable fields of application for complex system scientists (economics is described later).

What can a complex system perspective bring to the social sciences? Since Max Weber (1864-1920), a dominant paradigm in the research program of the social sciences has been methodological individualism: how to explain social phenomena from individual rationality. This view is commonly shared in sociology and economics, but its connection to complex system research is accepted, and used, by a minority of social scientists, e.g.

Kirman in economics, Axelrod in political science (and the BACH group in Ann Arbor, Michigan), and Epstein, Axtell and P. Young (the Brookings Institution) in political philosophy and economics. In Europe, Gilbert and Conte, for example, are active in "simulating societies".

The problem equivalent to functional organization in biology is the problem of institutions in the social sciences. In a bounded rationality perspective, institutions allow societies to meet the challenges that they face. Social scientists, and not only structuralists, recognize that some social organization allows individuals to act in society. There are many kinds of institutions, from what we obviously recognise as institutions, such as schools, hospitals, the state, law and religions, to less obvious instances such as cultural traits, routines and etiquette. Even language can be considered as an institution.

Many questions have been asked about institutions, but a central issue is their origin: how do individuals in a society "choose", adopt and maintain institutions? How do institutions evolve? What makes people abide, or not, by social norms?

The complex system approach as developed by Axelrod, Epstein, Axtell and others interprets institutions as the attractors of the dynamics of interacting individuals. Axelrod, for instance, launched the research program on the emergence of cooperation in a prisoner's dilemma framework. Along this line of research, Epstein and Axtell discussed the emergence of a class system among social agents with a priori neutral labels. Luc Steels demonstrated the emergence of a common vocabulary among a set of communicating robots ("talking heads").
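Axelrod's setting can be sketched in a few lines: an iterated prisoner's dilemma with the standard payoff matrix, in which tit-for-tat sustains cooperation against itself yet cannot be exploited for long (illustrative code, not Axelrod's original tournament software):

```python
def play(strategy_a, strategy_b, rounds=20):
    """Iterated prisoner's dilemma with the standard payoffs
    (T=5, R=3, P=1, S=0); 'C' = cooperate, 'D' = defect."""
    payoff = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
              ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(hist_a, hist_b)
        b = strategy_b(hist_b, hist_a)
        pa, pb = payoff[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

def tit_for_tat(own, other):
    return 'C' if not other else other[-1]  # cooperate first, then copy opponent

def always_defect(own, other):
    return 'D'

mutual = play(tit_for_tat, tit_for_tat)       # sustained cooperation
exploited = play(tit_for_tat, always_defect)  # one loss, then mutual defection
```

Two tit-for-tat players lock into the cooperative attractor (3 points each per round), while against a permanent defector tit-for-tat loses only the first round: cooperation can thus emerge and persist without any central enforcement.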

Interpreting institutions as attractors of the social dynamics among agents explains their emergence, their stability and robustness, and the fact that the history of institutions follows punctuated equilibria. If we think of social regimes in societies, one gets an explanation of the long stases of organizations such as tribes, chiefdoms or modern states; transitions between such metastable regimes occur fast and bring very different organizational forms ("revolutions").

Another line of research, also following complex system methodology, is the search for scaling laws in social systems. Quantitative approaches to social systems started with scaling laws (Pareto, Levy, Zipf) reflecting the abundance of structures according to their size. Zipf's law, for instance, established that the distribution of city sizes roughly follows a 1/size law (1920s). The same kind of scaling was observed in the distributions of individual wealth, income and the frequency of words in large corpora, as well as in natural systems such as earthquakes or climatic events. Such laws imply that there does not exist, e.g., a city of "characteristic size", as a normal distribution would imply. The same scale-free behaviour is observed in financial time series of returns and in firm size distributions.

Since the pioneering works of Simon and Mandelbrot we have a simple explanation of the origin of these scaling laws: a simple random multiplicative mechanism such as "the rich get richer" suffices to yield power-law distributions.
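This mechanism can be sketched numerically: purely multiplicative random growth with a reflecting lower barrier (all parameter values below are illustrative, not calibrated to any data set) produces a heavy-tailed, power-law-like wealth distribution:

```python
import random

def rich_get_richer(n_agents=2000, steps=1500, floor=1.0, seed=7):
    """Multiplicative random growth with a reflecting lower barrier.
    Each period every agent's wealth is multiplied by a random factor;
    the floor (e.g. a subsistence level) makes the stationary
    distribution a power law rather than a lognormal spreading forever."""
    rng = random.Random(seed)
    w = [floor] * n_agents
    for _ in range(steps):
        for i in range(n_agents):
            w[i] = max(floor, w[i] * rng.choice((0.85, 1.15)))
    return w

wealth = rich_get_richer()
```

Although every agent follows the same rule, the outcome is extremely unequal: the richest agent ends up orders of magnitude above the median, with no "characteristic" wealth scale, exactly as in the Pareto and Zipf observations above.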

A complex system perspective offers many opportunities in the most fundamental aspects of most social sciences. A few examples:


- Language. One is looking for threads to answer questions about the origin and diversity of human language, not only in terms of vocabulary but also of grammar. Why do we have a limited number of systems of grammatical markers among the variety of human languages, and why those? The challenges of linguistics and its applications to translation, information retrieval, human/computer communication and communication protocols are of tremendous interest to the European Communities.

- Economic geography. Differences in economic activity, especially in Europe, are striking; this is a major challenge to European construction (of course the same can be said about cultures). In what respect is economic differentiation due to physical geography or to history? How much differentiation is "natural" or inevitable, because driven by economic growth itself and its vagaries (see further section 2.2.2)? Understanding cities (their growth, shape, interactions, evolution, etc.) in connection with their inhabitants' challenges, abilities and individual choices is a typical challenge of economic geography. Many problems call for a complex system approach, such as modeling land use, for which cellular automata approaches have been developed.

- Economics (see further section 2.2.2). More generally, economics calls for complex systems methods and ideas, as proposed by Nobel laureates in economics (K. Arrow) and physics (P.W. Anderson), together with D. Pines ("The Economy as an Evolving Complex System", 1987-88). With such "founding fathers" and their followers, this research program soon developed in the US (B. Arthur, SFI) and in Europe (the WEHIA conferences, involving e.g. M. Gallegati and A. Kirman). Another factor which has facilitated these developments is the prevalence of mathematical culture among a large fraction of economists. But some tension and competition with the physicists (the econophysicists) and their contributions remain.

- A second area which has been very successful is cognitive science, where complex systems methods helped to establish bridges between the (wet, biological) brain and cognition (see further section 3.3). The fast development of cognitive science in Europe since the mid-80s is due to the conjunction of two factors: the use of complex systems methods (especially neural nets) and new recording and imaging techniques (multi-recording, MRI, etc.).

Because of barriers to inter-disciplinary communication, progress has been much slower in areas such as sociology, political science, political philosophy and even management. Some interest in complex systems exists among scientists in these disciplines, but most often for "softer" versions.

- In the case of the social sciences in a wide sense (including e.g. political philosophy and political science), the chosen methodology is most often numerical simulation, with little interest in complex systems concepts. This often translates into bringing all available knowledge as input to the model, running the simulations (often with multi-agent ideologies and platforms), and observing temporal or spatio-temporal patterns. This line of research has been active for more than ten years in Europe ("Simulating Societies", with leaders such as Nigel Gilbert, Rosaria Conte and Andrzej Nowak). Applications to present social challenges, such as the environment, have been a strong incentive for these simulations. They are often developed in institutions close to applications, such as agriculture or fisheries science (halieutics), rather than in inner academic circles. By contrast, some physicists (Dietrich Stauffer, Serge Galam, Sorin Solomon, Gérard Weisbuch...) have applied complex systems methodology.

- In the case of history, political philosophy and even management, the softer (verbal rather than modelling) version of complexity is most often used: Edgar Morin in France is an example. These communities often relate to cybernetics.

Normative approaches and design

Many applications of CSR concern priority areas of European research defined by the Commission, such as health, governance, environment, etc. In all these concrete challenges, the picture involves on the one hand hard scientific knowledge, whether in medicine, climate or economics, and on the other hand choices by many inhomogeneous


agents with incomplete information. The agency, whether the EC or a national government, has to extract information from a huge set of data, define a policy acceptable to its constituents and implement it. Help at every step of political action is sorely needed: moving from data to knowledge, defining a policy that takes into account the conflicting requirements of production, the environment and social needs, and devising measures acceptable to the constituency and capable of bringing about the state of the world chosen at the political level.

All the tools and methods of complex systems research are needed to obtain an integrated assessment of the issues and possibilities. A major challenge for complex systems research is to be associated with the search for solutions to these societal issues. If we take, for instance, the case of global warming: the communities of climatologists and economists have developed sophisticated models of climate on the one hand and of the economy on the other. But the integration of these models to achieve a useful overall assessment has proved terribly hard and has so far been unsuccessful.

A challenge that is faced both by the scientific community and by funding agencies such as the EC is how to set up projects in the priority areas of European research which bring together complex systems scientists and specialists in the fields of application, whether agriculture, economics, medicine, etc.

Design

The scientists' vision of design is starting to take into account the fact that designed objects are to be used by humans (sellers, of course, understood this long ago, but for them users are only potential customers). The idea of the human usage of technical products makes even more sense in our present well-connected societies. In the connected society, the success of any communication device relies largely on human factors, which may outweigh technical strengths or weaknesses. More than ever, innovation means finding new uses for sets of technical components that are either already available or whose development can be programmed. The research programmes of the IST division are well aware of the importance of these human factors, and of the fact that a large part of innovation is driven by communication or based upon the development of ICT. The parallel assessment document for the IST division should be more explicit than ours about challenges and opportunities in IST.

Resource allocation problems are a familiar class of problems in Operational Research (see further section 4.1). But new challenges in our present society, such as local decision-making and the necessity of fast adaptation, call for new algorithms. Supply chain methods, for instance, have to be re-adapted because of the large variability of consumer demand: supplying local dealers with black Ford Model Ts in the 1920s is a different challenge from supplying the specific model (in terms of colour, motorization and other options) requested by a 21st-century customer. Scheduling cannot be worked out completely in advance and has to be adaptive (see further section 5.3). The "Federal Express" problem is a paradigm of many scheduling problems encountered today, from routing messages over the Internet to handling natural catastrophes: what are the best algorithms for allocating vehicles to collect letters or parcels in response to received calls? Simple local optimization algorithms have been invented whose parameters are adjusted according to the past history of calls. The research community, and especially computer scientists, is more aware of applications in distributed computing, but many other sectors of business administration call for local and adaptive approaches to resource allocation and scheduling.
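A minimal sketch of such a local, adaptive allocation rule (entirely illustrative; the function names, the greedy dispatch rule and the smoothing parameter below are our own assumptions, not a published algorithm):

```python
import math

def nearest_vehicle(vehicles, call):
    """Greedy local rule: dispatch the closest vehicle to the incoming call."""
    return min(range(len(vehicles)),
               key=lambda i: math.dist(vehicles[i], call))

def update_rates(rates, calls_this_period, alpha=0.2):
    """Exponentially smoothed per-zone demand estimate, adapted from the
    history of received calls; alpha is a tuning parameter that can be
    adjusted to past call statistics."""
    return [(1 - alpha) * r + alpha * c
            for r, c in zip(rates, calls_this_period)]

vehicles = [(0.0, 0.0), (5.0, 5.0)]
chosen = nearest_vehicle(vehicles, (4.0, 4.0))   # vehicle 1 is closer
rates = update_rates([1.0, 3.0], [2.0, 1.0])
```

The point of the sketch is the structure, not the specific rule: decisions are made locally (per call), and the only global state is a slowly adapting demand estimate that could be used, for instance, to pre-position idle vehicles.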

In a society where services represent the largest part of economic activity, better adaptive allocation is a real challenge which is seldom met (think, for instance, of the medical services crisis in Europe). One gets the impression that the new developments in the field do not percolate to business through consulting firms or business schools, which should be the normal channels (with, fortunately, some exceptions; see further section 5.6).


2.2 Economics
By Mauro Gallegati, Peter Richmond and Massimo Salzano

Everyone would agree that the economy is a complex system. Traditional economics is already a very formalised approach to the economy.

Adam Smith, in 1776, proposed that the economy is a complex self-organizing system governed by an 'invisible hand'. This idea of self-regulating order emerging from interactions between elemental components without it "being part of their intentions" became a guiding principle for the study of human behaviour. The approach was transformed in 1870 into a 'theory of General Equilibrium' (GE) by Léon Walras. He proposed a system with a configuration of prices and action plans such that all agents can carry out their chosen plans and markets clear. As this theory was generalized, modern neo-classical economists seeking to build micro-foundations for macroeconomics soon reverted to the refinement proposed in the 1950s by Arrow and Debreu. They showed that individual inter-temporal (infinite-horizon) optimization by agents also yields a GE, as soon as each representative agent (RA) that makes up this economy is equipped with perfect price foresight for each future state of nature and a complete set of Arrow-securities markets, all open at time zero and closed simultaneously. Whenever these conditions hold, GE is an allocation that maximizes a properly defined social welfare function or, in other terms, the equilibrium is Pareto-efficient (First Welfare Theorem).
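In standard notation (not written out in the original), the market-clearing condition defining such a Walrasian equilibrium price vector $p^\ast$ reads:

```latex
z_k(p^\ast) \;=\; \sum_{i} x_{ik}(p^\ast) \;-\; \sum_{i} \omega_{ik} \;=\; 0
\qquad \text{for every good } k,
```

where $x_{ik}(p)$ is agent $i$'s demand for good $k$ at prices $p$ and $\omega_{ik}$ is that agent's endowment. The classical proofs establish that such a $p^\ast$ exists, but they are non-constructive: they provide no procedure by which real markets could find it.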

The "gaps": the failure of the micro-foundation programme based on representative agents is one of them. The flaws in the solutions to this problem adopted by mainstream macro-economists are pervasive and well known:

    • GE is neither unique nor locally stable.

• The GE solution is not computable. Indeed, it is not even possible to approximate the system state without going through a full simulation of the actual exchanges occurring in the system. The same problem applies to the class of so-called Computable GE models.

• The GE model is one in which price formation precedes the process of exchange, instead of being the result of it. Real markets work the other way round and operate in real time. Thus the GE model cannot be considered to offer a realistic explanation of economic phenomena.

• It is widely recognized that integrating money into the theory of value represented by the GE model is at best problematic. No individual economic agent can decide to monetize alone; monetary trade should be the outcome of market interactions among agents. By the same token, since credit makes sense only if agents can sign contracts in which one side promises future delivery of goods or services to the other, equilibrium markets for debt are meaningless. Neither the information conditions nor the information-processing requirements are properly defined, and bankruptcy, for example, can be safely ignored.

• The only role assigned to time in a GE model is that of dating commodities. Products, technologies and preferences are exogenously given and fixed from the outset. At an empirical level, there is an increasing realization that the major economic theories, whether those of Keynes or of the monetarist school, have not had great success in accounting for economic behaviour.

The research methodology we endorse in seeking to move forward consists in discarding the Walrasian GE. Instead of trying to deductively prove the existence of an equilibrium price vector p*, the complexity approach aims at constructing it by means of rules and algorithms. Clearly, the act of constructing such a coordinated state requires a description of goal-directed economic agents and their interactions.


Agent-based computational economics (ACE), that is, the use of computer simulations to study evolving complex systems composed of many autonomous interacting agents, represents an implementation of such a research agenda. With modern computer technology, ACE allows the explicit modelling of identifiable, goal-directed, adapting agents, situated in an explicit space and interacting locally within it. In complex adaptive systems, local interactions between agents can lead to the spontaneous formation of macroscopic structure that is not evident from simple consideration of individual behaviours.

Complexity then provides the essential step that takes economics from its position as an axiomatic discipline to being a falsifiable science at the micro, meso and macro levels. It also deals systematically with economic interactions and the aggregation of economic variables.

    2.2.1 Success stories

Over 80 years ago, in 'Sozial Physik', Lammel showed how social and economic phenomena might be understood by applying simple physical concepts. In the last 20 years, the activity of a few has become a global endeavour, with the participation of many groups from across the world. A second motivation for this activity in the physics community has been the availability of an abundance of financial data that now covers activity down to the level of transactions occurring every few seconds. This has allowed empirically inclined physicists, together with some economists, to begin to gain new insights into these markets where trades take place at the level of seconds (Stanley, Farmer). The importance of this work cannot be overstated. If we can really understand what moves financial markets, then much of the financial pain and misery that affects economies, and ultimately the lives of citizens everywhere, could be reduced.

However, not all markets are of this kind. For example, Weisbuch, Kirman and Herreiner (1998) have studied the markets for perishable foods, as exemplified by the Marseille fish market. They showed how a model that involved customer loyalty or preference, together with a learning element, led to an understanding of price development. This model illustrates the emergence of order, in the sense of stable trading relationships, together with additional insights into the way fluctuations can break down that behaviour. Such models could have applications in the wider context of the consumer society, and could help traders and industrial businesses with purchasing and pricing strategies.
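A sketch in the spirit of this loyalty-plus-learning mechanism (our own minimal variant; the logit choice rule, parameter names and values are illustrative assumptions, not the specification of the 1998 paper): a buyer chooses among sellers with probabilities given by accumulated loyalties, and loyalty to the visited seller is reinforced by the realized profit.

```python
import math, random

def visit_probabilities(J, beta):
    """Logit choice: the buyer visits seller j with probability
    proportional to exp(beta * J[j]); large beta means strong
    discrimination between sellers."""
    w = [math.exp(beta * x) for x in J]
    s = sum(w)
    return [x / s for x in w]

def update_loyalty(J, visited, profit, gamma=0.3):
    """Reinforcement with forgetting: loyalty to the visited seller is
    boosted by the realized profit, while all loyalties decay."""
    return [(1 - gamma) * x + (gamma * profit if k == visited else 0.0)
            for k, x in enumerate(J)]

random.seed(0)
J = [0.0, 0.0]                      # one buyer, two sellers
for _ in range(300):
    p = visit_probabilities(J, beta=4.0)
    visited = 0 if random.random() < p[0] else 1
    J = update_loyalty(J, visited, profit=1.0)
```

In variants of this kind, a sufficiently large beta makes the symmetric (searching) state unstable and the buyer locks onto one seller, which is the emergence of a stable trading relationship; a small beta leaves the buyer searching among sellers.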

Other economic problems where we are seeing progress concern aspects of industrial organization, including the distribution and growth rates of firms (e.g. Stanley), and income and wealth distributions (e.g. Chatterjee et al., 2005). Ideas first developed by Lotka and Volterra to study population dynamics (Biham, Malcai, Richmond and Solomon, 2002) now vie with agent models first developed for and applied to molecules (Richmond). Such models are challenging the concept of a simple rational agent, introducing heterogeneous agents that act at the micro level and which, when aggregated, exhibit non-Gaussian behaviour with 'fat tails' and volatility clustering. With modern computers it is now possible to simulate increasingly complex models that incorporate the behavioural characteristics of the interacting agents. This offers new opportunities for physicists, economists and psychologists to cooperate on these challenging problems.

Incomes are now recorded with great precision and frequency in countries such as Japan. Unfortunately, in many Western countries the data are not always available in a form that supports either detailed analysis or model development. This is unfortunate, since it is self-evident that if the structure and dynamics of both the incomes of individuals and the profitability of businesses could be properly understood, governments everywhere would be in a stronger position to apply fair and effective taxation policies.

Emergence: a key concept inherent in all these problems, and one that taxes the theorist, is the question of 'emergence'.


Much more elusive is the notion of intermediate levels (as functional mesoscopic entities). Emergent phenomena are composed of persistent patterns. They emerge from the low-level interactions, and their emergence is recognized by the fact that they can be operated on by a separate set of rules. This is the sense in which emergent phenomena define "levels". Analogies with physics might be fruitful. Consider, say, interacting electrons in the solid state. In some materials the interactions lead to the formation of closely coupled pairs that give rise to superconductivity at the macroscopic level. Superconductivity is thus an emergent state with properties determined by the electron pairs. Emergence may then be associated with the persistence of some kind of 'quasi-agent' that is an aggregate of the underlying agents. This concept of functional mesoscopic entities has proven fruitful on several occasions under different designations, such as motifs of well-connected subgraphs in social networks or functional motifs in genetic regulatory networks.

    Economic policy issues

The RA hypothesis implies the existence of one equilibrium state given the initial conditions: choosing the "best" policy tool for obtaining a desired end would then be quite trivial. By contrast, the characteristic of "emergence" peculiar to complex systems implies the possibility of reaching one of many equilibria. This raises the question of whether the history and path of the state vector are important.

If economic policy operates on complex environments, ranging from telecommunication networks and financial markets to ecological and geophysical systems, the mainstream tools are, at best, misleading if not inappropriate. Methods developed for linear systems give rise to unintended consequences, since the implementation of policies can itself not only change the dynamics associated with heterogeneous agents, but also change the overall landscape within which these agents interact and evolve.

Though the science and technology for the analysis of complex adaptive systems have advanced in recent years, their application to economic policy problems seems to have lagged behind applications in the business and scientific worlds. (One noticeable exception was a simulation of the influence of tick size on the Nasdaq, which was not published; the financial sphere is more discreet than academia...)

Among the suggestions that have been made for controlling complex economic systems:

• It will be necessary to develop a new decision theory based on "robustness" rather than on the optimization principle (Lempert, 2002; Bankes and Lempert, 1996). J.-P. Aubin's viability theory (Birkhäuser, 1991) goes along these lines.

• Ideas of 'control' will have to give way to 'management' of complex economic systems (Rihani, 2004).

• New tools, such as those based on 'visual analysis tracing' (Shneiderman, 2004), will be necessary, along with the approaches of "Exploratory Modelling" (Bankes, 1993) and "Adaptive Strategies".

Equally, the strong dependence of complex systems on initial conditions suggests we shall have to move away from "high-energy control" towards "small-energy management" of the economy. However, all these ideas and methodologies are still in their infancy, and much more work is needed to bring them to maturity.

Specific areas that could be ripe for further analysis, and where such new ideas might be further developed at this time, include:

• The nature of income distributions and optimal approaches to redistribution within societies.

• The development of tools for identifying the best routes to ensure control of income dynamics during EC expansion and the integration of new member states.


• The characterization of financial instability within networks of firms, banks and other organizations (see section 5.4).

• Methods for the control and evolution of trade and income distributions across nations.

2.2.2 Socio-economic networks
By Gérard Weisbuch

Since socio-economic networks constitute an institutional structure allowing production and trade, their investigation lies at the interface between the social sciences and economics.

The concept of networks is pervasive in complex systems: after all, any interaction structure implies a network. Social networks as a vector for the propagation of information and decisions among individuals have been a topic in the social sciences since the 1940s. The field has been active ever since, and the Sunbelt conferences now have participation in the thousands.

The interest in socio-economic networks, where the vertices are firms, is more recent and may have arisen from the Asian credit crisis of the late nineties. It contrasts with standard economics, which:

• considers firms as independent entities interacting with customers only through the market (General Equilibrium Theory);

• or discusses strategic interactions among a small set of firms (Game Theory).

In both cases the approach, based on the full rationality of economic actors, is a search for equilibrium. In real life, firm dynamics are far from equilibrium, and their interaction structure is much more complex than the hypotheses of General Equilibrium or Game theories allow.

Firms interact via many possible channels, which constitute as many edges of firm networks:

• The most straightforward B2B interaction is through production: the output of one firm is used as an input by other firms.

    • Firms also have capital cross-participations or credit relations.

• Firms interact via their personnel: they may share directors on their boards; the exchange of professionals is also important in certain industries, such as innovation or the media.

Industries, and especially new industries, are shaped by these interactions; understanding the consequences of these interactions is of fundamental importance for producers, investors, banks, and national or international agencies.

Many recent studies have concerned the static interaction structure describing firm networks. Power-law scalings describing firm size and wealth, as well as their connection structure, have been observed empirically from public data obtained, for instance, from stock exchanges.

The scale-free structure of firm connectivity, namely the fact that some firms are far more connected than others, whether in terms of trading relationships, partial ownership, shared directors, etc., already gives a lot of information on strategic interactions and power relations.
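As a generic illustration (not claimed by the text above to be the mechanism behind the firm data), heavy-tailed, scale-free degree distributions of this kind arise from a preferential-attachment growth rule, which can be sketched as follows:

```python
import random

def preferential_attachment_degrees(n, m=2, seed=0):
    """Degree sequence of a growing network in which each new node
    attaches m edges to existing nodes chosen with probability
    proportional to their current degree (implemented by sampling
    the list of edge ends)."""
    rng = random.Random(seed)
    degree = [0] * n
    ends = []                     # each node appears once per incident edge
    for i in range(m + 1):        # small initial clique
        for j in range(i):
            ends += [i, j]
            degree[i] += 1
            degree[j] += 1
    for new in range(m + 1, n):
        targets = {rng.choice(ends) for _ in range(m)}
        for t in targets:
            ends += [new, t]
            degree[new] += 1
            degree[t] += 1
    return degree

deg = preferential_attachment_degrees(2000)
```

The resulting degree sequence has a small median but a handful of strongly connected hubs, the qualitative signature reported for firm networks.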

As in most network studies, the following issues are addressed in dynamical studies of economic networks:

• How "smooth" is the dynamics? Should one expect equilibrium, or some kind of laminar flow of production across the network? Or a richer structure with cycles,


    3 BIOLOGY

From the early days of complex systems research, in the 40s and 50s, biological systems were considered as an inspiration for the design of complex artificial systems. But of course, understanding the self-organization properties of biological systems was itself a major challenge.

We will not describe here in detail the bio-inspired applications, such as the many applications of neural nets research to signal processing: the reader may consult Mark Buchanan's document, or the standard literature on neural nets. We similarly skip DNA computation, immunocomputing, etc. Bio-inspired algorithms will be used in section 3.6.

We will rather concentrate on complex systems research motivated by the understanding of biological systems such as cognition, genome expression, the immune system and ecology. This basic research, often referred to as systems biology, poses fundamental scientific questions; furthermore, its application to health, medicine, drug design and environmental issues is of the highest importance. For instance, a better understanding of the interactions that control gene expression would considerably facilitate the design of new drugs, both in terms of inventing the drug and synthesizing it, and it would also help to check for possible negative side effects.

The year 2000 was a turning point in the application of complex systems research to biology.

3.1 Theoretical biology
By Gérard Weisbuch

Before 2000 the major effort went into understanding the functional properties of biological organs: for instance, how can neuron assemblies give rise to cognitive properties in the brain? Hopfield's 1982 PNAS paper on the associative memories of assemblies of formal neurons is a typical example. Hopfield demonstrated formally that sets of formal neurons trained by a simple delocalised algorithm, Hebb's rule, could later recognize the patterns they had been taught; this recognition does not even require a presentation of the original pattern: presenting only part of the pattern suffices.

Most of the research from the 40s to the year 2000 was along these lines:

• The cellular automata of Von Neumann were proposed as models of self-reproduction.

• S. Kauffman (1969) modelled phenotypic expression as based on interactions among gene expression;

• R. Thomas, M. Kaufmann, A. S. Perelson et al., from the 70s onwards, modelled the immune response as the result of idiotypic interactions among antibodies and cell receptors.

• A large fraction of the research interpreting the origin of life (Eigen, 1972; P. W. Anderson, 1983; S. Kauffman, 1989) as due to the emergence of a set of autocatalytic polymers belongs to this line of research.

• The gradualism/catastrophism debate and punctuated equilibria (genotype/phenotype/population dynamics).

• Population ecology (the Red Queen paradox, co-evolution, game theory and replicator dynamics, massive extinctions seen from the Self-Organised Criticality perspective).
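The Kauffman-style gene networks in this tradition can be sketched as random Boolean networks whose attractors play the role of stable expression patterns (the sizes and the k = 2 connectivity below are illustrative choices of ours):

```python
import random

def random_boolean_network(n, k=2, seed=0):
    """Kauffman-style random Boolean network: each of n genes reads
    k randomly chosen genes through a random Boolean function."""
    rng = random.Random(seed)
    inputs = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
    tables = [[rng.randrange(2) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update of all genes."""
    return tuple(tables[i][sum(state[g] << b for b, g in enumerate(inputs[i]))]
                 for i in range(len(state)))

def attractor_length(state, inputs, tables):
    """Iterate until a state repeats; the resulting cycle is an
    attractor, which Kauffman interpreted as a cell type."""
    seen, t = {}, 0
    while state not in seen:
        seen[state] = t
        state = step(state, inputs, tables)
        t += 1
    return t - seen[state]

inputs, tables = random_boolean_network(12)
cycle = attractor_length((0,) * 12, inputs, tables)
```

Because the state space is finite and the update deterministic, every trajectory ends on a cycle; different initial states may fall onto different attractors, in line with the idea that one genome supports several stable cell types.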


At that time, the complex systems approach was an alternative to the reductionist approach, which considered biological functions to be the result of finely tuned selection and advocated a major effort at specifying the details of all the components involved and of their interactions. Complex systems scientists established that functional properties could be the result of interactions among rather simple entities. Functional organization was shown to be a generic property of the attractors of the dynamics of the interacting system, rather than the result of highly tuned natural selection. These earlier efforts greatly improved our insight into biological organization.

3.2 Systems Biology
By François Képès

    Standard approaches and their limits

Molecular biology, a field initiated and developed through interactions between biologists, biochemists, mathematicians and physicists, has provided a wealth of detailed knowledge about the molecular mechanisms at work in organisms. More recently, the contributions of computer scientists have enabled breakthroughs in the acquisition and interpretation of genomic data. As yet, however, none of these critical achievements has allowed us to obtain an integrated understanding of the functional modules and regulatory networks involved in cell and organismal physiology.

The promises raised by the identification of genes involved in multifactorial diseases have not turned into efficient therapies, because most molecular biologists have focused on individual genes and proteins rather than on systems, and have not addressed questions such as:

• What is the functional architecture of these interaction networks and their modules, and what are their dynamic properties? How do different levels of regulation intermingle? What are the organizing principles at play?

    • How do robust behaviours emerge amidst stochastic molecular events?

• How is the expression of genes involved in the same physiological process coordinated?

    • How do cells tolerate, control or use noise in regulatory networks?

• How has evolution shaped these networks?
• How can results obtained in model systems be used to improve human therapy?

• How can regulatory networks be deciphered in a way that would allow the diagnosis and prognosis of multifactorial diseases and the design of new therapeutic strategies?

In order to tackle these questions, biologists, physicists, chemists, computer scientists and mathematicians are all needed. Together, they are paving the way to a new approach in the life sciences dubbed "systems biology". This renewal of integrative approaches in biology has its roots in the legacy of quantitative and theoretical biology, and in the conceptual developments on self-organizing systems of the 1980s. In the past 10 years, it has been fuelled by the availability of the huge datasets produced by the "-omics" methodologies. As these methodologies are the daughters of biochemistry and molecular biology, the datasets have a marked molecular grounding; hence the molecular slant of systems biology in comparison to its ancestors.


As an emerging discipline, systems biology cannot and should not be defined in a strict way. Rather, it should be defined in an epistemic way, as a set of successful approaches that, together, constitute the first landmarks on a virgin territory.

    Gaps

Several types of gaps must be considered in the case of systems biology.

The first one is cultural. Several scientific communities must closely interact for success, from bench-work to theoretical biology, the latter often being held in suspicion by bench biologists. For most molecular biologists, the roadblocks of modern biology are on the experimental side, while the vision from systems biology is how to build biological insight and transferable knowledge from a huge amount of experimental data. Still, to cut down on costly bench-work, it would probably be a good idea to organize and improve the synergy between conventional and computational experimentation.

The second gap is ontogenic. One of the major challenges ahead of us will be to make the best use of the massive amount of molecular data generated by the tools of post-genomics. It is clear that the abundance of such data has increased tremendously in the recent past. Yet a closer look reveals that, with the exception of genome sequencing in most cases, such data are not truly exhaustive, and their quality is generally poor, although slow improvements can be foreseen. Thus, for a long time to come, the lack of exhaustivity and the poor quality of the data will be a serious ontogenic obstacle to quantitative predictions. Other approaches may sometimes permit one to make useful predictions from qualitative or semi-quantitative results. Importantly, it often suffices to be able to give the evolutionary trend of the final parameter to determine the outcome. For example, if spontaneous cell death outweighs division in the tumour cell, the cancer will regress, which is the single most important fact. Besides, it would be wasteful to convert the few available quantitative data into qualitative results for the sake of format homogeneity. It will therefore be crucial to succeed in combining, in the same simulation, the use of quantitative and qualitative results.

The third gap is epistemological. The temptation to build a cell and an organism from their genomes and related molecular data implicitly relies on the conception that all the cellular information lies in the genome. This viewpoint is not warranted by the available facts, yet it is widespread, thus creating an epistemological obstacle. An obvious counterexample is that two human cells endowed with an identical genome may exhibit extremely different phenotypes (compare, for instance, a stomach cell and a neuron). It is thus clear that part of the cellular information is held in the genetic material, while another part is distributed elsewhere in the cell. The genetic information is easier to identify, as it is contained mostly in one DNA molecule and is relatively accessible to us in the form of the genome sequence. This may explain why the genetic part is so strongly emphasized.

    Success stories

As might be expected, there are few success stories in such a recent field of investigation. Yet three examples will illustrate a variety of potential approaches. First, however, let us sketch some landmark developments in this field.

The first surprise came when it was demonstrated that biological systems ranging from food webs in ecology to biochemical interactions in molecular biology could benefit greatly from being analyzed as networks. In particular, within the cell, the variety of interactions between genes, proteins and metabolites is well captured by network representations and associated measures, including degree distribution, node clustering, hierarchy and assortativity (Barabási and Albert, 1999; Ravasz et al., 2002; Watts and Strogatz, 1998). The global topology of these networks shows similarities among themselves as well as with sociological and technological networks (Jeong et al., 2000; Guelzim et al., 2002; Shen-Orr et al., 2002).

In a second step, the issue of local topology, subgraphs and motifs was addressed. A number of biological networks were found to contain network motifs: elementary interaction patterns between small groups of nodes (subgraphs) that occur substantially more often than would be expected in a random network of similar size and connectivity. Theoretical and experimental evidence indicates that at least some of these recurring elementary interaction patterns carry significant information about the given network's function and overall organization (Guelzim et al., 2002; Shen-Orr et al., 2002; Milo et al., 2002; 2004). Feedback loops, feed-forward loops and other motifs were thus uncovered in real-world networks and attributed a qualitative dynamics.
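A minimal illustration of motif detection (a sketch, not the actual algorithm of Milo et al.) counts feed-forward loops (edges A->B, B->C, A->C) in a small directed graph and compares the count with randomized graphs of the same size and edge count:

```python
import itertools
import random

def count_ffl(edges):
    """Count feed-forward loops (a->b, b->c, a->c) in a directed edge set."""
    es = set(edges)
    nodes = {n for e in es for n in e}
    return sum((a, b) in es and (b, c) in es and (a, c) in es
               for a, b, c in itertools.permutations(nodes, 3))

def randomized_count(edges, trials=200, seed=0):
    """Average FFL count over random graphs with the same nodes and edge count."""
    rng = random.Random(seed)
    nodes = sorted({n for e in edges for n in e})
    pairs = [(a, b) for a in nodes for b in nodes if a != b]
    return sum(count_ffl(rng.sample(pairs, len(edges)))
               for _ in range(trials)) / trials

edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")]
print(count_ffl(edges), randomized_count(edges))
```

A motif is declared over-represented when the observed count exceeds the randomized average by several standard deviations; here only the counting step is sketched.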

In a third step, such dynamics were shown to be effective by monitoring them in a synthetic network with the same topology, implemented in live cells (early attempts: e.g. Becskei and Serrano, 2000; Elowitz and Leibler, 2000). Conversely, network design could be automated by evolution in the computer towards a fixed goal, such as behaving like a toggle switch (François and Hakim, 2004).
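The dynamics of such a toggle switch can be sketched with two mutually repressing genes (illustrative equations and parameters, not those of the cited works); depending on the initial condition, simple Euler integration settles into one of two stable expression states:

```python
def toggle(u0, v0, alpha=4.0, n=2, dt=0.01, steps=20000):
    """Integrate du/dt = alpha/(1+v^n) - u, dv/dt = alpha/(1+u^n) - v."""
    u, v = u0, v0
    for _ in range(steps):
        du = alpha / (1 + v**n) - u
        dv = alpha / (1 + u**n) - v
        u, v = u + dt * du, v + dt * dv
    return u, v

high_u = toggle(3.0, 0.1)   # starts with gene u dominant
high_v = toggle(0.1, 3.0)   # starts with gene v dominant
print(high_u, high_v)
```

The two runs converge to mirror-image steady states (one gene high, the other low), which is the memory property that makes the circuit a switch.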

The following two examples illustrate successes of the discrete and differential approaches, respectively.

The first example testifies to the usefulness of a discrete logical approach relying on few available data and drawing from the corpus of formal methods of computer science. The bacterial population that opportunistically invades the lungs of patients afflicted with cystic fibrosis often switches, several years later, to producing mucus, thereby becoming more resistant to antibiotics and making breathing more difficult for the patient. Two hypotheses with different therapeutic prospects, one genetic and one epigenetic, have been proposed to explain this mucoidal switch. The generalized logical method proposed by René Thomas, coupled to temporal validation (Bernot et al., 2004), made it possible to validate plausible models and to formulate the most discriminatory experimental plan for choosing between the genetic and epigenetic hypotheses. The experimental plan has been carried out successfully, with useful feedback for the model.
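The flavour of a logical approach can be conveyed by a toy Boolean model (hypothetical rules, unrelated to the actual mucoidal-switch model): a gene locked in a positive feedback loop admits two steady states, the kind of bistability on which an epigenetic hypothesis typically rests.

```python
import itertools

def fixed_points(update, n):
    """Enumerate all Boolean states equal to their synchronous update."""
    states = itertools.product((0, 1), repeat=n)
    return [s for s in states if tuple(update(s)) == s]

# Toy rules: gene x activates itself (positive feedback); x activates y.
def update(state):
    x, y = state
    return (x, x)   # x' = x (self-activation), y' = x

print(fixed_points(update, 2))  # [(0, 0), (1, 1)]
```

Two fixed points with identical rules mean that the phenotype can switch, and stay switched, without any genetic change; the generalized logical method works with richer multi-valued rules, but the principle is the same.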

Modeling of chemotaxis in bacteria exemplifies differential or stochastic approaches applied to understanding a phenomenon that exceeds human intuition. Chemotaxis refers to the ability of a cell to swim or move towards a chemical attractant or away from a repellent. The mechanisms of chemotaxis in eubacteria (Shimizu et al., 2003) and archaebacteria (Nutsch et al., 2005) turned out to be fundamentally different, although both are based on a random walk biased by comparing the actual chemical concentration to that memorized at a previous time. In eubacteria, for instance, the model not only matches experimental observations, but shows an unexpected emergence of spatial patterns of methylation, the chemical modification involved in the memory effect (Shimizu et al., 2003).
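The underlying biased random walk can be sketched in one dimension (illustrative parameters): the walker tumbles less often when the attractant concentration has increased since the memorized measurement, and therefore drifts up the gradient.

```python
import random

def chemotaxis_walk(steps=20000, seed=1):
    """1-D run-and-tumble walk up a linear attractant gradient c(x) = x.

    The tumble probability is lowered when the current concentration
    exceeds the memorized one (a crude one-step memory)."""
    rng = random.Random(seed)
    x, direction, memory = 0.0, 1, 0.0
    for _ in range(steps):
        x += 0.1 * direction
        improving = x > memory          # compare to memorized concentration
        memory = x
        p_tumble = 0.1 if improving else 0.5
        if rng.random() < p_tumble:
            direction = -direction
    return x

print(chemotaxis_walk())  # ends well above the unbiased expectation of 0
```

Each individual step is random, yet the asymmetry in tumble rates produces a systematic drift towards the attractant, which is precisely the counter-intuitive feature the differential and stochastic models capture quantitatively.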

A last example shows how the coupling of bench and computer experimentation may provide answers to fundamental questions. Here, the issue is evolutionary optimization of fitness to the environment by adjusting a certain protein level. Bacteria were challenged over several generations with various external concentrations of a sugar, and the cellular production of the sugar-metabolizing protein was correspondingly monitored. A predictive cost-benefit model was considered, in which the growth burden of producing the specific protein is balanced by the growth advantage of utilizing the sugar when and as available. In a few hundred generations, cells evolved to reach the predicted optimal protein production level. Thus, protein production appears to be a solution to a cost-benefit optimization problem, rapidly tuned by evolution to function optimally in new environments (Dekel and Alon, 2005).
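The cost-benefit logic can be sketched with illustrative functional forms (not the actual model of Dekel and Alon): a benefit proportional to protein level and sugar availability, minus a cost growing faster than linearly, yields an interior optimum that shifts with the sugar concentration.

```python
def growth_rate(protein, sugar, benefit=1.0, cost=0.5):
    """Net growth: benefit of metabolizing sugar minus expression cost."""
    return benefit * sugar * protein - cost * protein**2

def optimal_protein(sugar, levels=None):
    """Brute-force the protein level maximizing net growth."""
    levels = levels or [i / 100 for i in range(0, 501)]
    return max(levels, key=lambda p: growth_rate(p, sugar))

print(optimal_protein(1.0), optimal_protein(2.0))  # optimum rises with sugar
```

With these forms the optimum is benefit * sugar / (2 * cost), so more sugar justifies more protein; the experiment cited above shows that evolution finds such optima within a few hundred generations.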

    Future challenges

Globally, one of the major challenges associated with systems biology is to understand the behaviour of biological networks of interactions, in particular their dynamic aspects and spatial developments, often with the goal of later controlling or even designing them. More immediate objectives include, for instance, dealing with their natural heterogeneity and size, including spatial aspects, and usefully re-composing the subnetwork motifs into a grand network. These challenges typically require the cross-disciplinary import of concepts, and crosstalk between experimentation at the bench and experimentation through modeling and simulation.


The success of this emerging discipline will rely upon new techniques and concepts. From the biological perspective, these techniques and concepts will come from areas including the following:

• multidimensional and quantitative imaging of molecular events;

• modeling, simulating and mimicking signal transduction pathways and regulatory networks, both for the conception of experiments and for the interpretation of data derived from these dynamic images and from high-throughput techniques;

• bio-inspired evolutionary algorithms, which reflect back into improved biological models (see also section 3.4);

• using micro- and nano-technologies to explore, manipulate and mimic the cellular nanoenvironment and to build new bio-sensors.

Big pharmaceutical companies in North America, unlike those in Europe, are embarking on systems biology, with goals that generally include several of the following:

    • Personalized healthcare

    • Cost-effective treatment (economic benefit)

    • Personalized intervention with a model of the patient (individual benefit).

    • Model-based intervention strategies

    • Combined drug targets (addressing cases where more than one target should behit at once to reverse the pathology)

    • Drug-drug interaction (assessing the combined effects of multiple drug therapies)

    • Chronotherapy (process-based time-dependent drug administration)

• Disease reclassification (based on molecular evidence)

• Streamlining therapeutic approval

    • Shortening clinical trials

    • Reducing animal testing

From a more mathematical point of view, several new techniques and concepts are likely to stem from the so-called "inverse" problem. The availability since the 1990s of huge datasets, such as transcriptomes obtained from biochips, multi-neuron recordings, and new imaging techniques, opens a new era in complex systems methods. Can we meet the challenge of this data-driven biology and extract sensible information from these very large data sets? To be more specific, can we for instance extract the specific sets of interactions among genes which control gene expression in the cell? In mathematics, such a problem is called an inverse problem: rather than finding a trajectory knowing the dynamics governing the system, we here have to get the dynamics from the observation of successive configurations of expressed genes.

Inverse problems are known to be much harder to solve than direct problems. Moreover, in systems biology one has to solve a problem with a very large number of unknown parameters. Even supposing that the set of interactions among an ensemble of N elements is sparse rather than complete, we obtain a number of parameters to be determined that scales as N. Furthermore, in general, the structure of the network is a priori unknown, and so is the nature of the interaction function.

Fortunately, inverse problems relate to a number of problems already treated in complex systems research. Learning by formal neural networks is an inverse problem: it simply means finding the set of interactions which associates output patterns to input patterns. Variants include pattern sequence generators. An exciting prospect is to use and generalise learning techniques to, for example, decode gene regulatory networks using transcriptomic data that provide time series of gene expression levels. Among the methods inspired by complex systems research that can be used to solve this problem, one of the most successful in past years relies on the Bayesian framework (e.g. Friedman et al., 2000). Alternative methods are likely to be evaluated in the coming years, such as genetic algorithms or the survey propagation algorithm developed for constraint satisfaction problems.
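A toy version of such an inverse problem (a deliberately minimal sketch, not a Bayesian network method) infers, from a time series of Boolean expression states, which regulator and which rule drive each gene, by exhaustive search over candidate rules:

```python
import itertools

def infer_rules(series):
    """Exhaustively find, for each gene, a single regulator and a Boolean
    function reproducing every observed transition (a toy inverse problem)."""
    n = len(series[0])
    transitions = list(zip(series, series[1:]))
    funcs = {"copy": lambda b: b, "not": lambda b: 1 - b}
    rules = []
    for gene in range(n):
        found = None
        for reg, (name, f) in itertools.product(range(n), funcs.items()):
            if all(nxt[gene] == f(cur[reg]) for cur, nxt in transitions):
                found = (reg, name)
                break
        rules.append(found)
    return rules

# Hidden dynamics: gene 0 copies gene 1, gene 1 negates gene 0.
series = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0), (0, 1)]
print(infer_rules(series))  # [(1, 'copy'), (0, 'not')]
```

Real transcriptomic data are noisy and the candidate-rule space is astronomically larger, which is why probabilistic (Bayesian) and heuristic search methods replace the brute-force loop above.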

Finally, it should be stressed that the development of systems biology will be guided by historical and epistemological analyses of the interactions between biology and other disciplines. It will also be important to recognize the ethical questions raised by the development of systems biology, and to answer these questions by challenging human creativity rather than by dictating laws.

3.3 Cognitive Science
By Jean-Pierre Nadal

Cognitive science is about understanding both natural and artificial intelligence, and requires interdisciplinary approaches. At the intersection of biology and the human sciences, it borrows techniques and concepts from many scientific domains: psychology, neuroscience, linguistics, philosophy, anthropology, ethology, social science, computer science, mathematics and theoretical physics. The RIKEN Brain Science Institute (http://www.brain.riken.go.jp/) in Japan is one of the best examples of the interdisciplinary institutes created in recent years all over the world, with departments covering several aspects of cognitive science: Understanding the Brain, Protecting the Brain, Creating the Brain (artificial intelligence, robotics), and Nurturing the Brain (brain and cognitive development). Interdisciplinary laboratories and institutes can also be found in Europe: one may mention the Cognitive Neuroscience Sector at SISSA, Trieste; the Gatsby Computational Neuroscience Unit at UCL, London; the EPFL's Brain and Mind Institute in Lausanne; the Cognitive Science Institute (ISC) in Lyon; and the Department of Cognitive Studies at École Normale Supérieure in Paris.

Several important topics in cognitive science fall within the field of complex systems. Here are a few examples.

• Modeling and simulating brain functions requires the mathematical analysis of large interconnected neural networks.

• The study of collective behaviour, in social animals and in human societies, requires similar tools.

• The analysis of data from brain imaging leads to difficult problems in artificial vision, data analysis, and modeling.

An important complex systems topic, relevant to cognitive science and many other fields, is that of learning. The term covers different but related aspects.

• On one side, the theoretical and experimental study of learning and adaptation at all possible scales: the cellular level, the network level, the agent level (analysis of human behaviour), and the collective level.

• On the other side, solving inverse problems requires the development of algorithms which can be understood as a system learning a rule (a function, a task) from examples.

Learning theory, statistical (Bayesian) inference, optimal control, learning in formal and natural neural networks, supervised or unsupervised learning, and reinforcement learning are all different aspects of this same general topic. On the computer science side, the name Machine Learning denotes this general approach of learning a task from examples, making use of any relevant algorithm. Machine Learning is proving an increasingly important tool in many fields, such as Machine Vision, Speech, Haptics, Brain Computer Interfaces, Bioinformatics, and Natural Language Processing. The EU has supported specific actions related to Machine Learning. For instance, the PASCAL network of excellence of the FP6 programme (see http://www.pascal-network.org) was created in order to build a Europe-wide distributed institute, whose goal is to pioneer principled methods of pattern analysis, statistical modeling and computational learning. Combinations of concepts and tools taken from learning theory and statistical physics have made it possible to address the modeling of dynamical networks, such as neural networks, genetic networks, metabolic networks, social networks and communication networks. In such domains, one has both to solve inverse problems (finding the network parameters from the empirical observation of the network activity) and to analyse and/or simulate the network dynamics, in particular when the network components (cells, proteins, human agents, etc.) are allowed to adapt to the environment and to the behaviour of the other components.
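The notion of a system learning a rule from examples can be illustrated by the classic perceptron rule (a minimal sketch, far simpler than the kernel methods studied within PASCAL):

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights and bias so that sign(w.x + b) matches the labels."""
    w, b = [0.0] * len(samples[0][0]), 0.0
    for _ in range(epochs):
        for x, label in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != label:              # update only on mistakes
                w = [wi + lr * label * xi for wi, xi in zip(w, x)]
                b += lr * label
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Learn logical OR from labeled examples.
data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # [-1, 1, 1, 1]
```

The "rule" here is the input-output association itself: the weights that reproduce it are found from examples, never written down explicitly, which is the common thread running through all the learning problems listed above.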

3.4 Engineering emergence in self-assembling systems
By Norman Packard

Self-assembly is the spontaneous organization of components into larger-scale structures. At the molecular level, this process is "automatic", in the sense that the components are driven by the laws of thermodynamics to organize themselves into a minimum free-energy state, which happens to be a complex structure. The molecular domain is particularly powerful and interesting for exploring self-assembly because forces between charged molecules and surrounding substances (e.g. water or other solvents) create the appropriate thermodynamic conditions for complex self-assembly.

Self-assembly is a bottom-up process, defined by local properties of the self-assembling components interacting with each other and their environment. These local properties in turn determine the form of the self-assembled structures, but the self-assembly process may be complex, in the sense that the detailed form of the self-assembled objects is not predictable or derivable from knowledge of the component properties. This bottom-up process is to be contrasted with top-down design processes that specify a target form in detail, and then use a highly controllable process (such as photo-lithography) to create it.
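A caricature of thermodynamically driven self-assembly (illustrative model and parameters): particles on a one-dimensional lattice with a nearest-neighbour attraction, evolved by Metropolis Monte Carlo, coalesce into a few clusters, a low-energy structure specified nowhere in the local rules.

```python
import math
import random

def assemble(n_sites=30, n_particles=10, temperature=0.2,
             moves=20000, seed=2):
    """Metropolis sketch: particles on a periodic 1-D lattice with a
    nearest-neighbour attraction (-1 per adjacent pair) coalesce."""
    rng = random.Random(seed)
    occ = [1] * n_particles + [0] * (n_sites - n_particles)
    rng.shuffle(occ)

    def energy(state):
        return -sum(state[i] * state[(i + 1) % n_sites]
                    for i in range(n_sites))

    for _ in range(moves):
        i, j = rng.randrange(n_sites), rng.randrange(n_sites)
        if occ[i] == occ[j]:
            continue
        new = occ[:]
        new[i], new[j] = new[j], new[i]
        dE = energy(new) - energy(occ)
        if dE <= 0 or rng.random() < math.exp(-dE / temperature):
            occ = new

    clusters = sum(occ[i] and not occ[i - 1] for i in range(n_sites))
    return occ, clusters

occ, clusters = assemble()
print(clusters)  # far fewer clusters than a typical random placement
```

No step of the dynamics mentions "clusters"; they emerge because the thermodynamic acceptance rule favours bond formation, the same logic that drives molecular components towards complex minimum free-energy structures.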

Self-assembly is now recognized as the primary path for the construction of nanodevices. Other approaches based on transporting traditional top-down engineering to the nano-scale, such as the programmable universal constructor envisioned by Drexler, appear difficult for various reasons, including the effects of thermal noise at the nano-scale.

    3.4.1 Control and programming of self-assembly

Given the bottom-up nature of self-assembly, the final product may be difficult, if not impossible, to control if the dynamics of the self-assembly process are complex. This is typically the case in the self-assembly of amphiphilic structures such as micelles or vesicles, and any time the self-assembly process includes nontrivial chemical reactions.

There are, however, some self-assembly processes that are not complex, and that are in fact controllable to the point that they are explicitly programmable, either by synthesizing polymers (e.g. DNA, cf. Rothemund (2006), Nature 440, p. 297) so that they will fold automatically into particular structures, or else by imposing strong boundary conditions on the self-assembly process that determine the final form (cf. D. Dendukuri, D.C. Pregibon, J. Collins, T.A. Hatton, P.S. Doyle (2006) "Continuous-flow lithography for high-throughput microparticle synthesis", Nature Materials, April). These processes have been very successful in producing nano-scale modules, but such modules have yet to be integrated into systems with full target functionality.

The problem of programming self-assembly becomes a true complex systems problem when the components are not strongly controllable, because the self-assembly process has multiple dynamic pathways with strong nonlinear interactions between components. In such cases, the final form of the self-assembled objects is not knowable in advance. Thus, traditional approaches of programming by explicitly planning the placement of modules cannot work. Instead, the space of self-assembled objects must be explored by sophisticated search and optimization algorithms to achieve a programming goal.
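Such a search can be sketched with a toy evolutionary algorithm (purely illustrative: the "assembled structure" is reduced to a bit string scored against a target form):

```python
import random

def evolve(target, pop_size=40, generations=200, mutation=0.05, seed=3):
    """Toy evolutionary search: evolve component specifications (bit
    strings) towards a target form, keeping the best half each generation."""
    rng = random.Random(seed)
    n = len(target)

    def fitness(g):
        return sum(a == b for a, b in zip(g, target))

    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = [[(1 - b) if rng.random() < mutation else b
                     for b in rng.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

target = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
best = evolve(target)
print(sum(a == b for a, b in zip(best, target)))  # typically the full 10
```

In a realistic setting the fitness evaluation would itself be a (costly) simulated or experimental self-assembly run, which is what makes sophisticated search strategies necessary.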

    Examples of complex self-assembly processes include:

• Amphiphilic self-assembly of fatty acids and other lipids into micelles, vesicles, tubes, perforated sheets, etc., depending on the constituent amphiphiles and their thermodynamic phase (e.g. work of Luisi, Walde, Szostak, and Hanczyc).

• Vesicle formation from novel inorganic substances (e.g. Tianbo Liu, Ekkehard Diemann, Huilin Li, Andreas W. M. Dress and Achim Mueller (2003) "Self-assembly in aqueous solution of wheel-shaped Mo154 oxide clusters into vesicles", Nature 426, pp. 59-62).

• Nanostructures from organic synthesis (e.g. work of G. von Kiedrowski, L. Cronin).

• Surface functionality, either surface energetics or surface patterning, in the context of nano-electronics (inorganic) and functionalized organic molecular layers.

• DNA scaffolding: an extensive library of structures formed by DNA, including barcodes, wires, and complex matrices for the attachment of reactants.

• Nano-gold attached to various structures (e.g. DNA, rubidium), usable for electronic functionality or local heating of a nano-structure.

• Molecular motors and muscles: components for moving molecules as well as aggregates have been developed at the component level.

    • Fused-polymer shapes.

Communities spanned: synthetic chemistry, MEMS (micro-electromechanical systems), molecular computing, physics, numerical computation (molecular dynamics, dissipative particle dynamics).

Potential research and application areas: molecular circuitry, computation, molecular robotics, biosensors, energy, artificial living cells.

Future challenges: specific research targets. These target areas could span the research and application areas listed above.

    • Extracting electrical energy from nano-objects.

    • Extracting mechanical energy from nano-objects (muscles and motors).

    • Functionalizing nano-objects by coupling to chemical reactions.

• Achieving multiple functionality with multiple co-existing assembly processes and chemical reactions.

    • Programmed interactions with living organisms.

  • 8/20/2019 Weisbuch, Gerard - Solomon, Sorin - Tackling Complexity in Science - General Integration of the Application of Co…

    36/60

3.5 Epidemic modeling and complex realities
By Alessandro Vespignani

A wide range of modeling approaches depend crucially upon the accurate and realistic description of social, infrastructural and behavioural data in spatially extended systems. This includes the movement of individuals at various levels, from the global scale of transportation flows to the local scale of the activities and contacts of individuals, interrelations among infrastructure networks (cyber/physical levels), and the complex feedback mechanisms regulating traffic and behaviour. These factors are determinant in a wide range of problems related to understanding and forecasting system behaviour in epidemiology, population biology, transportation and traffic analysis, infrastructure planning and social behaviour.

In this context, mathematical epidemiology has evolved from simple compartmental models into structured models in which the heterogeneities and details of the population and system under study become increasingly important features. In the case of spatially extended systems, modeling approaches have evolved into schemes which explicitly include spatial structures and consist of multiple sub-populations coupled by movement among them. This patch, or meta-population, modeling framework has then evolved into a multi-scale framework in which different granularities of the system (country, inter-city, intra-city) are considered through different approximations and coupled through interaction networks describing the flows of agents (people, animals, merchandise, parcels of information, technology). In addition, the introduction of agent-based models (ABM) has made it possible to stretch the usual modeling perspective even further, by simulating these systems at the level of the single individual.
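The step from a compartmental to a meta-population description can be sketched minimally: two patches, each running a deterministic SIR model under homogeneous mixing, coupled by a small symmetric travel rate (all parameters illustrative).

```python
def sir_metapop(days=200, beta=0.3, gamma=0.1, travel=0.01):
    """Two SIR patches coupled by travel; the epidemic seeded in
    patch 0 spills over into the initially uninfected patch 1."""
    S = [999.0, 1000.0]
    I = [1.0, 0.0]
    R = [0.0, 0.0]
    for _ in range(days):
        N = [S[k] + I[k] + R[k] for k in range(2)]
        new_inf = [beta * S[k] * I[k] / N[k] for k in range(2)]
        new_rec = [gamma * I[k] for k in range(2)]
        for k in range(2):
            S[k] -= new_inf[k]
            I[k] += new_inf[k] - new_rec[k]
            R[k] += new_rec[k]
        # symmetric exchange of infectious individuals between patches
        flow = travel * (I[0] - I[1])
        I[0] -= flow
        I[1] += flow
    return R

print(sir_metapop())  # both patches end with a large recovered fraction
```

Each patch is a compartmental model; the travel term is the meta-population coupling, and the multi-scale frameworks described above stack many such layers, with stochastic or agent-based dynamics replacing the deterministic update.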


Figure 5: Different scales of structure used in epidemic modeling. Circled symbols represent individuals, and the different shades of gray correspond to a specific stage of the disease. From left to right: homogeneous mixing, in which individuals are divided into compartments and assumed to interact homogeneously with each other at random; social structure, where people are classified according to demographic information (age, gender, etc.); contact network models, in which the detailed social interactions between individuals provide the possible virus propagation paths; multi-scale models, which consider sub-populations coupled by movements of individuals, while homogeneous mixing is assumed on the lower scale; and agent-based models, which recreate the movements and interactions of every single individual on a very detailed scale (a schematic representation of a city is shown).


The above modeling approaches are based on actual and detailed data on the activity of individuals, their interactions and movement, as well as the spatial structure of the environment, transportation infrastructures, traffic networks, etc. While for a long time this kind of data was limited and scant, recent years have witnessed tremendous progress in data gathering thanks to the development of new informatics tools and the increase in computational power. A huge amount of data, collected and meticulously catalogued, has finally become available for scientific analysis and study.

Networks which trace the activities and interactions of individuals, social patterns, transportation fluxes and population movements on local and global scales have been analyzed and found to exhibit complex features, encoded in large-scale heterogeneity, self-organization and other properties typical of complex systems. Along with the advances in our understanding and characterization of system complexity, the increased CPU power has made it possible to run multi-scale dynamical models using thousands of coupled stochastic equations, and allows agent-based approaches which include millions of individuals. This points to the definition of a modern computational epidemiology that integrates the mathematical and computational sciences and informatics tools, and aims at providing a new approach to forecasting and risk assessment for emerging diseases.

Key elements in the development of such an approach are the integration of information technology tools and techniques with the perspective coming from complex systems analysis. On the one side, this computational approach needs to integrate GIS techniques, on-line data analysis, petabyte data handling and simulations, with the creation of the necessary infrastructures. On the other side, such an approach has to deal with all the complexity features emerging in the analysis of large-scale systems made up of a massive number of elements/individuals. Non-linear behaviour, cooperative and emergent phenomena, complex networks, etc. cannot be neglected, and will be at the basis of the interpretation and understanding of results as well as of the formulation of the appropriate computational techniques.

Meeting this challenge, however, requires adapting and complementing the basic approaches developed in mathematical and statistical epidemiology to large-scale systems (10^4 t