
Paper Proposal for: Florida Political Science Association Conference

    March 26, 2011

Mr. Ari S. Litwin
Dept. of Political Science, UCF
Orlando, Florida 32816-1356
Tel (407) 823-6023
Email [email protected]

Mr. Jimmy A. Davis
Dwayne O. Andreas School of Law, BU
Orlando, Florida 32807
Tel (407) 873-8422
Email [email protected] 1

    Design of a Geopolitical Information System

This paper will detail the development of a geopolitical information system based on GIS technology. GIS is a growing field that combines the advantages of data analysis, geospatial information, and computational simulation. There is a wealth of geopolitical data that can be processed by such a system, including, but not limited to, population; trade; natural resources; military technology and manpower; and various cultural statistics. A properly developed GIS could apply comparative analysis methods to problems such as the comparison of foreign policy decisions; the testing of international relations theories; the evolution of more accurate geopolitical indices similar to the Composite Index of National Capability (CINC) or Comprehensive National Power (CNP); and the search for logical quantitative relationships.

Designing a GIS is a complex process and must be preceded by careful planning. The developer must decide which software tools to use; how to collect, process, and store data; how to develop the project while keeping integration and standardization in mind; and how to constrain any analysis within the limits of available technology. The GIS will require access to data that is detailed and coded accurately.

Methodologically, this paper presents an in-depth look at the project objectives, the options available for design, and the actual design decisions required to begin programming the system. The purpose is to provide a fundamental design for a geopolitical information system upon which implementation can thereafter begin. A thorough literature review will build on past efforts in the GIS field and in the areas the system is studying, culminating in a final product that will provide users with an international relations laboratory.

    Introduction

Even before Homo sapiens started our epic ascension throughout the planet, we have shown a heightened awareness of our surroundings.2 This sense of our surroundings came with a never-before-experienced urge to try and become aware of what the future held for us, which continues to manifest itself to this day.3 Alchemy, soothsaying, shamanism, witch doctors, psychics, and prophets are not in short supply, as one author commented.4

1 Ari S. Litwin is currently a graduate student in Political Science – International Studies at the University of Central Florida (UCF) in Orlando, Florida. He holds a Bachelor of Science in Physics and an MBA from Stetson University and a Bachelor of Science in Political Science from UCF. Jimmy A. Davis is currently a JD candidate at Barry University School of Law in Orlando, Florida. He holds a Bachelor of Science in Engineering Physics from Embry-Riddle Aeronautical University and a Master's in Physics from the University of Central Florida.

Irrational or not, we can thank the individuals who studied these fields for helping us evolve the more rational predictive methodologies employed today.5 In effect, science must evolve from naïveté to highly accurate predictive modeling; witch hunters and alchemists relying on the Malleus Maleficarum and the Turba Philosophorum undoubtedly stumbled upon intuitive concepts of psychology and chemistry.6 Naïve thoughts thereafter gave way to soft sciences, which must be tolerated for a short time to allow for maturation into a hard science.

The problem lies not with the practitioners of the soft science, for they are above all responsible for the groundwork required for the creation of a mathematically rigorous predictive science. Without them the hard sciences enjoyed today would not exist. The transition, however, is sometimes met with harsh resistance from contemporary practitioners. Luckily for the authors, the modern community is not apt to place us under house arrest (or worse) for forwarding our ideas.7

Collectively, today's scholars live in an age where the computational power of a battalion of mathematicians is at every person's command in the form of a computer. Consider the difficulties of scientists of old, like Newton or, more recently, even Einstein, who had the pleasure of toiling endlessly upon a blackboard surrounded by what appeared to be a plague of chalk dust. Newton, on paper, had to formulate classical physics and calculus; Einstein his theory of relativity, inter alia.8 The mental acuity required for such feats boggles the mind.

Yet, after the formulation of these sciences, we have stagnated in the creation of others. Witness the social sciences: psychology, political science, and so on. None to date has advanced much beyond statistical models, whose ultimate usefulness is questionable in light of empirically testable formulae.9 Sciences must eventually be able to step away from statistical models and provide predictions with a high degree of accuracy. In physics, for example, if a scientist launches a projectile from a cannon with certain known variables, the prediction of its point of impact is within a matter of inches, not a percentage figure. This is not to say that statistical methodology does not have a place; it really does.

2 See generally Frederick Lawrence Coolidge & Thomas Wynn, The Rise of Homo Sapiens: The Evolution of Modern Thinking (2009).

3 E. M. Berens, The Myths and Legends of Ancient Greece and Rome: A Handbook of Mythology 198 (2010); see also generally Robert Michael Place, The Tarot: History, Symbolism, and Divination (2005).

4 Donald A. Norman, Things That Make Us Smart: Defending Human Attributes in the Age of the Machine 185 (1993).

5 David Ritchie, Scientific Method: An Inquiry into the Character and Validity of Natural Laws 1-22 (2001).

6 The Malleus Maleficarum (also known as The Witch's Hammer) was a manual the Inquisition and others used in the persecution of witches, most of whom were likely innocent of the charges levied against them, though others were likely afflicted with some form of psychological distress. The Turba Philosophorum is a major alchemical text dealing with pre-chemistry arts. Neither book is a recognized authority today, and indeed both are abhorrent by modern standards, but they undoubtedly serve as a reference to how thought has evolved over the centuries.

7 See generally any discussion of Galileo.

8 Philip Steele, History of Science 48 (2007).

9 See the lament of Rein Taagepera, Making Social Sciences More Scientific: The Need for Predictive Models (2008).

However, the overuse of these techniques serves only to cripple any underlying study of natural laws.

This overuse appears to be the reason for the above-mentioned stagnation. Computers, being the miraculous workhorses they are, share the blame for this state of affairs. It is now too easy to place numbers into a spreadsheet, push them through a statistical meat grinder based on one model or another, and assume something significant has emerged from the exercise. Statistical methods can determine whether a relationship between certain variables exists; the ultimate goal, however, is to reduce that relationship into a useful law.

But the computer need not be a bane to the development of new sciences. Two relatively new fields in computer science, Geographic Information Systems (GIS) and Expert Systems (a less advanced form of Artificial Intelligence), can assist us in doing in years what took our most brilliant historical minds a lifetime. A properly configured GIS allows a never-before-available means of visualizing data. This visualization allows researchers to intuitively determine the existence of certain relationships before investing time on projects which may potentially dead-end. Couple this use of GIS with derived formulaic laws and suddenly statistical methods give way to predictive mathematical models.

This paper, then, serves two important purposes. The first is to show how GIS can be used in International Relations (IR) on a small scale, and yet to make the case that GIS can inaugurate large-scale changes in the field by providing nothing short of an IR laboratory. The second is to show how logical quantitative modeling can be used in IR to tease out the dynamics of the field, and how GIS can help in this effort.

What is a Geopolitical Information System (GPIS)?

A Geographic Information System (GIS) is, in the widest definition, a system for the management, analysis, and presentation of spatial data, that is, data based on location. GIS technology and methodology are utilized in many academic disciplines as well as in practical applications. To make a bold statement, GIS has the potential to revolutionize the field of IR. Why is this not simply one small step instead of a giant leap for IR?10 This is because GIS changes the rules of the game, and it does so across the board.

GIS provides a link between qualitative and quantitative research by providing centralized management of all data linked to location. For example, the name of a state, its location on a map, its Gross Domestic Product (GDP) per capita, a speech by its leader, and the leader's photograph with the head of state of a foreign power can all be linked within the GIS. Furthermore, the system can move beyond the current fixation with statistical modeling and onto the testing of logical quantitative relationships. This change, heralded by Rein Taagepera in Making Social Sciences More Scientific, would be a revolution in itself. Taagepera notes the sloppy use of statistical quantitative modeling in the social sciences, and argues that a more complex interplay between data and modeling, enhanced predictive capability, and logical models that describe more than the direction of an effect are needed.11 GIS is well suited to this task.

    10 To liberally plagiarize from Neil Armstrong.

    11 See supra note 9 at 236-240.
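To make that linkage concrete, the following is a minimal sketch (in Python, chosen here purely for illustration) of the kind of location-keyed record described above; the class name, fields, and values are hypothetical placeholders, not a schema the paper prescribes.

```python
# Minimal sketch of a location-keyed record; names, fields, and values are
# hypothetical placeholders, not part of the GPIS design itself.
from dataclasses import dataclass, field

@dataclass
class StateRecord:
    name: str
    latitude: float                                  # location on the map
    longitude: float
    gdp_per_capita: float                            # a quantitative attribute
    documents: list = field(default_factory=list)    # qualitative material: speeches, photographs, ...

# One record ties the quantitative and qualitative material to a single place.
example = StateRecord(name="Exampleland", latitude=0.0, longitude=0.0, gdp_per_capita=10_000.0)
example.documents.append("leader_speech.txt")        # placeholder file references
example.documents.append("summit_photograph.jpg")

print(example.name, (example.latitude, example.longitude),
      example.gdp_per_capita, example.documents)
```

In a real GPIS the same linkage would live inside the GIS database rather than in application code, but the principle is the same: every attribute hangs off a location key.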

Spatial analysis also allows for a more integrated combination of international relations and comparative politics. Data for phenomena both inside and outside of states can be combined within the GIS, and relationships among and between these data can be tested with the same ease. Models can be devised that test foreign policy assumptions and decisions. Indices, for instance those dealing with the search for national power calculations, can be developed based on the combination of larger quantities of data than those used in the past. These indices can be compared and tested against each other and against real-world events.

Finally, GIS could be used to test IR theories. Ultimately, a GIS international relations project would develop nothing short of a simulation of international politics. This model, running within a computer, would, if both spatial and temporal data are combined, allow the assumptions of the core IR theories to be tested for accuracy and improvement. Furthermore, new theories would certainly develop as a result of this effort. Effectively, this would be a geopolitical laboratory that would allow for a deeper field of experimental IR, and possibly lead to useful predictive capability. Thus, we call a GIS as applied to IR a Geopolitical Information System, or GPIS.

However, there are many obstacles to overcome before such a system can be put together. Designing a GPIS is a complex process and must be preceded by careful planning. The developer must decide which software tools to use; how to collect, process, and store data; how to develop the project while keeping integration and standardization in mind; and how to constrain any analysis within the limits of available technology. The GPIS will require access to data that is detailed and coded accurately.

By far the key obstacle is collecting detailed, accurate data. It is in this effort that the system must be planned methodically and that patience is most definitely a virtue. One must always keep the well-known computer science maxim in mind: garbage in, garbage out. If the data within the GPIS is inaccurate, so too will be any subsequent analysis. Supreme effort must be made to ensure that good data is collected, and the level of difficulty of such a task is great.

Other concerns, such as the selection of a software package and how best to store and standardize data, must also be surmounted. The following paper discusses how a GPIS could be developed. The potential uses of such a system, the obstacles to creating it, and two studies showing uses of a smaller-scale developed system will be discussed. While the ultimate objective is the IR laboratory model, these authors recommend an approach through which development and utility walk side by side and the system can be useful while being continuously expanded.

Designing the System: Obstacles and Choices

The design of a GPIS, especially one of great magnitude, is a complex process. The various design components must be broken down into sub-stages, which should consist of: choosing or designing the proper software tools; collecting or creating the proper data to be analyzed; designing a data storage and management system; and finally performing analysis on that data.

The GPIS developer is constrained in making these choices. Primarily, time and monetary resources force the developer to put limits on the expected utility of any project. Another key constraint is the limit of available technology. Finally, a host of issues arises in dealing with data collection and storage. Most importantly, data must be coded accurately and it must be available.

Before tackling these components, it is important to mention two others that are inter-related and go beyond the scope of this paper. The first is the design philosophy of the wider GPIS itself. The majority of research based around GIS uses the project model, with a clear beginning, middle, and end. The results are then formatted as a traditional research paper, book, or presentation. An alternative model is that of the IR laboratory or simulation, which represents a more dynamic and versatile system. This model represents a simulation of the real world, through which many questions can be asked, studied, and answered. A future expansion of such a model would be the inclusion of near real-time data through connection to the Internet. Such a system would require a substantial investment in resources. While the example studies in this paper represent the more traditional project model, the laboratory/simulation model is where the evolution of a GPIS should strive to reach.

The second, related component is that of the people involved in the GPIS. While hearing the term GIS brings to mind technology and investment in hardware and software, the people who design and create such systems, as well as the consumers of GIS analysis, truly represent the largest investment in such a system. The example study represents one that is designed by a single researcher, or a small team of researchers, who are the only users of the GPIS they create. In a larger system, such as the laboratory model, many more people would need to be brought into the system and their tasks would be differentiated. Some would be designers, some users, and some experts only in data collection, data creation, or database building, for instance. Such a GPIS would entail a host of the general management obstacles that are present in any large-scale undertaking. These issues, however, are beyond the scope of this current paper. Therefore, the obstacles in this study, and the choices of solutions to those obstacles, primarily deal with technical issues, and are overwhelmingly focused on data management.

Thus, for the initial development of the GPIS, two key issues must be overcome. One is the selection of a software package for use in managing and analyzing data; the other is the managing of the data itself. Selecting software (and the hardware on which it is to be run) could easily constitute a study of its own. However, there are a few core variables that stand out as being important to the analyst: standardization, functionality, and cost.12

GIS software packages require an investment in time and energy (in addition to money) if one is to learn to utilize the range of their analytical capabilities. With such a required investment, standardization and functionality are top priorities. The software should be capable of reading a wide range of data formats and work with a wide range of other standard software such as spreadsheet and database packages. The software should also be expandable, preferably allowing users to write their own needed functions. Two commercial packages fit these criteria: one is ArcGIS, produced by ESRI, and the other is AutoCAD MAP by AutoDesk. Both packages are full-service and both are backed by large corporations (with extensive user assistance available). Both are also costly to purchase.

12 Harmon, John E., and Steven J. Anderson. 2003. The Design and Implementation of Geographic Information Systems. New Jersey: John Wiley & Sons, Inc. 38-41, 186-189.

The final criterion in evaluating software is cost. While a detailed cost analysis is unnecessary, it is ultimately the desire of a project manager to balance the high costs of commercial software against the implementation difficulties of open-source software. Google Earth is one well-known piece of geographic software, though it lacks deep analysis functionality, while GRASS is a well-known open-source full-service package. Such software might be free of monetary charge, but it may still require a substantial investment of time to learn to use properly.

For the studies in this paper, ArcGIS was selected as the base software for the GPIS. This software is an industry standard, it was accessible to the authors, and one of the authors has experience using the software.

Finally, working with GIS data requires many decisions to be made across many criteria. Data can be produced by the user as well as obtained from a variety of sources (which may or may not distribute it free of charge). The quality of data, as well as the credibility of its source, is also highly variable. The analyst must first understand what data is needed. For the project model, this can all be planned in the design phase. For the laboratory model, data acquisition and usage are both ongoing concerns. Once data has been obtained or created, it must be placed into the GIS software, which requires proper formatting for use. The data must then be analyzed.13

While these steps may seem to follow a natural order, in reality all three tend to intertwine. The GIS software can help in organizing data and ongoing analyses, but the user needs to remember to pre-plan and keep the goals of his or her study in mind.
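As a small illustration of this obtain-format-analyze loop, the sketch below uses the open-source geopandas library as a stand-in (ArcGIS is the package actually used in this paper's studies); the file names and column names are hypothetical.

```python
# Illustrative sketch only: ArcGIS is the package used in this paper's studies;
# geopandas stands in here as a freely available substitute. File names and
# column names are hypothetical.
import geopandas as gpd
import pandas as pd

# Obtain: country boundary geometries and a separately collected attribute table.
countries = gpd.read_file("world_countries.shp")      # geometry keyed by an ISO code
attributes = pd.read_csv("country_attributes.csv")    # columns: iso_code, population, gdp_per_capita

# Format: join the attribute data to the geometries through a shared key.
layer = countries.merge(attributes, left_on="ISO_A3", right_on="iso_code", how="left")

# Analyze: filter on attributes of the joined, location-aware layer.
high_income = layer[layer["gdp_per_capita"] > 20_000]
print(high_income[["iso_code", "population", "gdp_per_capita"]].head())
```

Even in this toy form, the join step is where most of the planning effort lands: if the key fields or codings of the incoming data are inconsistent, the garbage-in, garbage-out problem noted above appears immediately.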

    Basic Quantitative Logical Relations

Understandably, there is little time or reason to spend hundreds of thousands of man-hours developing the groundwork to evolve the field of IR into a mathematically intense predictive science. Fortunately, pioneers in other hard sciences inadvertently paved the way to the rapid development of any field, so long as there are known variables.

Classically, physics is broken up into two types of systems: the large (basketballs, stars, galaxies) and the small (neutrons, protons, electrons, photons).14 There appears to be a similar breakup regarding the study of people: there are populations (the large) and individuals (the small). The social sciences study both types, and there is no reason to believe that groups of people and individuals do not each operate by a set of rules or laws. If people follow the same pattern as physics, large-group behavior is probably easier to predict than individual behavior. In this way, populations are more like classical physics while individuals are more like quantum mechanics. So, to begin coming up with a base model, we will begin working with populations of people and how they interact with each other.

    Base Units

Almost everybody is familiar with base units in the sciences, whether a scientist or not. When a car travels at a particular velocity, that velocity is commonly measured in kilometers per hour (kph).

    13 Decker, Drew. 2001. GIS Data Sources. United States of America: John Wiley & Sons, Inc. 96-101.

14 See generally Walter Thirring & E. M. Harrell, Classical Mathematical Physics: Dynamical Systems and Field Theories (2003) and Nouredine Zettili, Quantum Mechanics: Concepts and Applications (2009).

Kilometers per hour can be reduced to meters per second. Here, two common physical units are found: meters (length) and seconds (time). In physics there are six fundamental units: length, mass, time, temperature, dielectric constant, and magnetic permeability.15 Placing these basic units together in appropriate combinations describes physical properties. In the common example given above, velocity is some form of length over time.

Scientists use a form of dimensional analysis to determine whether or not a formula (or law) is valid. The classical (read: complex) method to do this can be broken down into vector mathematics, easily utilized to derive a potential law. As an example, let us determine the validity of Newton's famous law, f = ma.16 We shall keep this example short:

(1) Let a vector of base units be written in the following order: (length, mass, time, temperature, dielectric constant, magnetic permeability), where each base unit is noted by its power. Velocity, being length over time, would look like the following: (1, 0, -1, 0, 0, 0).

(2) So mass (m) = (0, 1, 0, 0, 0, 0) and acceleration (a) = (1, 0, -2, 0, 0, 0).

(3) The basic structure of force (f) is therefore a sum of both vectors, since in the vector form the order is reduced.17

f = (0, 1, 0, 0, 0, 0) + (1, 0, -2, 0, 0, 0) = (1, 1, -2, 0, 0, 0)

Force, coincidentally, has base units of mass times length over time to the second power. The addition worked, and Newton's laws remain thankfully intact. But there are few individuals willing to sit with more than forty such variables and determine how many combinations of them, raised to various powers, are unitarily valid. Thankfully computers exist, and are perfectly suited to such tasks.18
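A short sketch of this bookkeeping, written in Python purely for illustration (it is not the authors' Permutanomicon algorithm cited in note 18), using the unit ordering listed above:

```python
# Each physical quantity is a vector of base-unit exponents in the order
# (length, mass, time, temperature, dielectric constant, magnetic permeability).
# Illustrative sketch only; not the authors' Permutanomicon implementation.

def product_units(*quantities):
    """Multiplying quantities reduces to adding their exponent vectors."""
    return tuple(sum(exps) for exps in zip(*quantities))

mass         = (0, 1,  0, 0, 0, 0)   # m
acceleration = (1, 0, -2, 0, 0, 0)   # length / time^2
force        = (1, 1, -2, 0, 0, 0)   # mass * length / time^2

# f = ma is dimensionally valid because the exponent vectors add up correctly.
assert product_units(mass, acceleration) == force
print("f = ma checks out:", product_units(mass, acceleration))
```

A computer can repeat this same check over every combination of variables and powers, which is exactly the brute-force search a person would never sit through by hand.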

If there are similar laws in how populations deal with each other, it is imperative that the fields studying them discover the base units that apply to populations. Humans thankfully live in the physical universe, and so some of these base units will carry over easily. As an example, the authors posit that distance undoubtedly affects how populations interact with each other. This assumes that populations are treated as aggregate units. Imagine a population surrounded by four other

15 See any basic college physics text for a more in-depth explanation. Chris Vuille & Raymond A. Serway, College Physics (2009) comes to mind as an industry standard for undergraduates.

16 The authors will not attempt to replicate Newton's actual treatise for this example, but entreat you to take in Sir Isaac Newton, Philosophiæ Naturalis Principia Mathematica (1714) to glimpse the herculean task undertaken in this masterful work. Any method we use had better prove Newton's law is (ignoring all experimental evidence) at least physically valid, or our technique shall be short-lived.

17 By "order," the authors mean that division drops to subtraction, multiplication to addition, and powers to multiplication. a squared, for example, would be calculated as 2 × (1, 0, -2, 0, 0, 0) = (2, 0, -4, 0, 0, 0).

18 Jimmy Allen Davis, The Permutanomicon: The Design and Implementation of an Expert Algorithm to Rapidly Compute Simply Physical Relations (2006).

populations: one to the east, one to the north, one to the west, and one to the south (see Figure 1 below). Each population is different in size, and the only thing separating them is distance.

Figure 1: The population in the middle is surrounded by four other populations. The population to the south is quite a distance away, denoted by a dashed line. Population size is symbolized by actual size.

In this situation, it would be quite natural (in the absence of any other knowledge) to try and assess which population poses the greatest threat. In this diagram, one might logically determine that although the largest population (to the south) outnumbers the central population, it is too far away to be of immediate concern. The population to the west is almost the same size as the central one, but farther away than the population to the east, which is definitely larger and as close as or closer than the population to the north.

Gut instinct would label the eastern population as the most threatening. But why? The reason is the same as when two children play catch. When a ball is thrown, the thrower has instinctively calculated the amount of force, the angle of the throw, the initial velocity, how much angular momentum to impart on the ball, and so on. The catcher likewise calculates the incoming velocity, the angle of descent, and where to position body, arm, and hands, all while remembering to keep the fingers open to receive. All of this happens in split seconds, but if one were to model the math on a computer, pages of data and calculations would be needed.

In this example, the answer is intuitive, because the central population knows that basic threat (absent obscure variables) is somehow a factor of population size and distance. Some ratio of these two variables helps determine threat. But what is the ratio? How can one calculate threat without relying on internal hunches (which, like the ball, can be a hit-or-miss thing)? The ability to determine how much threat another population poses without gut instinct would be the basis of sound policy decisions. Whom do we attack? Whom do we prepare to defend against? Should we ally ourselves with the larger group, or try to collect the smaller groups to defend against the larger?

Like the children playing catch, the human mind may not grasp the abstract mathematics that went into a basic threat analysis, but it can do the analysis instinctively. The future task of the political scientist is to reduce this analysis to paper and eliminate any deficiency in the thought processes of the individuals responsible for such decisions. The question is how?

    Considering Population Differences

First, one must realize that all threat calculations are vector quantities, simply because there is a magnitude of population and a direction in which the threatening population exists.22 We can keep this quantity positive as calculated from the origin, and normalized to place the numbers into context, especially if calculated from the point of view of multiple populations. In other words, threat calculated between two populations will always yield the same number (not considering directional variance), yet the two populations will not view each other as equally threatening.

This leads us to conclude that the populations are directly proportional.23 This makes sense on an intuitive level. Both populations must consider each other's threat based on their respective advantage or disadvantage. So naturally the larger population will view the smaller as not so threatening, while the smaller will be more wary of the larger population's intentions. This leads us to a basic threat formula which, alone, makes little sense.

    (8) T = P1P2

There is no consideration of distance, and so the unitary value of threat here is population to the second power. Besides being uninteresting and boring, the reason this is wrong is that the threat calculated is the same no matter where the second population is located.24

    Considering Distance

The biggest question regarding distance is whether it should enter linearly or non-linearly. The second consideration is how it varies with respect to population. Variation can be determined by varying the distance between two populations and seeing what makes sense. We shall tackle the latter problem first, as it is easiest to answer.

If two equal populations are next door to each other, the base threat should be higher than if they are some distance apart. If distance and populations were directly proportional, this would not be true.25 It would make no sense to consider one of two equal populations a greater threat because it was ten times farther away. So logically the opposite must be true, and the form of the equation comes out as follows.
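As a rough numerical illustration of this reasoning, and nothing more, the sketch below assumes the simplest inverse form, T = (P1 × P2) / d. The exponent on the distance term is an assumption made here for demonstration only, since the choice between linear and non-linear forms is exactly the question under discussion; the numbers come from the scenario in note 24.

```python
# Illustrative sketch only: the inverse-distance form and its exponent are
# assumptions for demonstration, not the authors' final formula.

def base_threat(p1, p2):
    """Formula (8): base threat with no distance term."""
    return p1 * p2

def distance_adjusted_threat(p1, p2, d_km):
    """Hypothetical form T = (P1 * P2) / d."""
    return (p1 * p2) / d_km

# Scenario from note 24: P1 = 100, with neighbors P2 and P3 of 10,000 each,
# at 100 km and 1,000 km respectively.
p1, p2, p3 = 100, 10_000, 10_000
print(base_threat(p1, p2), base_threat(p1, p3))   # 1000000 1000000 -- identical, regardless of distance
print(distance_adjusted_threat(p1, p2, 100))      # 10000.0 -- the nearer neighbor
print(distance_adjusted_threat(p1, p3, 1_000))    # 1000.0  -- the farther neighbor ranks lower
```

Whatever exponent the distance term ultimately takes, the qualitative point survives: of two equally sized populations, the nearer one registers as the larger threat.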

    22 See supra note 11.

23 This is a fancy way of saying that if you multiply both populations together, you obtain a constant. So if population A = 100 and population B = 10, the product of A and B is equal to the product of B and A, which equals the population constant of the base threat formula.

24 Consider P1 = 100, with two neighboring populations, P2 and P3, both with 10,000. P2 is 100 kilometers away. P3 is 1,000 kilometers away. Using formula (8) in this scenario yields equal base threat from both P2 and P3, regardless of distance.

25 Consider P1 = P2 = 100. If they are 1 kilometer apart, a base threat calculation would yield 10,000. If they were 10 kilometers apart, their threat would nonsensically increase by some factor of 10.

A GPIS would not only allow for the calculation of threat, but also for the determination of other new and interesting IR laws. The development not only of the laws but simultaneously of the GPIS, plus a system to uncover formulae that theorists will undoubtedly miss, will go a long way toward predicting how multiple groups of people will interact with each other.

It should be noted that this type of theory will break down as the population size gets smaller.47 As one gets closer to the individual, the authors predict that this macro-population theory will deteriorate, as individuals will follow more closely whatever laws of psychology exist. This theory will not explain the unstable dictators or deranged assassins of the world. It will not predict individual action, simply because that is beyond its scope. Psychologists, like political scientists, will have the unenviable burden of reducing their statistical analyses into psychological theoretical rules to predict behavior. Until then, psychology, like quantum mechanics, will continue to introduce some minor inconsistencies into the calculations.

    Conclusion

Evolving political science from a soft to a hard science with predictive theories would be most easily achieved through the creation and maintenance of a highly sophisticated GPIS designed to monitor and track multiple variables within a particular population. It would best be developed by individuals with a background in the hard sciences, who are familiar and comfortable with building and manipulating complex mathematical models.

The GPIS envisioned would provide the developing group with enormous advantages beyond the standard statistical analysis methods. Predictions of future outcomes would be more certain, guesswork would be eliminated, errors would be isolated to bad data (bad intelligence), and decisions based on any such calculation would rely less on guessing and more on hard facts. Changing policy decisions to follow such theories would give policy makers the ability to make decisions on hard facts coupled with science, not on statistical inference.

Already, one theory provided in this paper can be tested against multiple historical events (the vast laboratory of the theoretical political scientist) for accuracy and replication. The authors predict that base threat will show a vast change just before or after the commencement of conflict. The need for this type of development is becoming ever more pressing in a smaller world whose interdependency grows more complex by the day. With this type of GPIS in place, optimized decision making is well within the grasp of the community. By the end of the decade, conflicts and other interactions should be predictable to within months of erupting. By the end of the century, within days.

The combination of GPIS and logical quantitative modeling could revolutionize IR theorizing and practice. The field would gain the laboratory needed to build a robust experimental side, while the theoretical side would gain the tools needed to test its assumptions. This is not to say that developing such tools is easy, but the journey of a thousand miles begins with the first step, and it is hoped that this paper serves to take just such a step.

47 If the authors' original thought that populations act like macro objects in physics is true, this should not be a surprise. In physics, as objects get smaller and smaller, Newtonian physics starts deteriorating.

Map 1: Testing Base Threat Levels before the Civil War

Map 2: Testing Base Threat Levels between Iran, Israel, and the US