Information Technology Support for Open Source Intelligence Analysis & Production

Stephen J. Andriole

Center for Multidisciplinary Information Systems Engineering
College of Information Studies
Drexel University
Philadelphia, PA 19104

1.0 Introduction

The design, development and use of computer-based problem-solving systems

will change dramatically as we approach the twenty-first century. Our

expectations about what these systems should do are rising as rapidly as the

requisite technology is evolving. By the year 2000, intelligence analysts will use

the new technology to deal with all sorts of simple and complex problems. They

will also benefit from systems capable of providing much more than database

support and low-level inference-making.

In fact, the very definitions that determine the nature and purpose of intelligence

analysis and production will give way to a much broader understanding of the

process. Where today intelligence is something governments collect, analyze and

"produce," tomorrow it will just as likely refer to the process by which public

and private sector competitors monitor, assess, warn and manage.

The future will also see the pluralization of data sources. While there has been an

increasing reliance upon open sources, the future will see that reliance grow even

more dramatically as more and more organizations collect and analyze data of


special importance to their military, political, economic and social missions. The

very nature of open source data will itself change as governments, companies,

and global, international, national, and local news organizations produce more

and more data -- in order to extract more and more descriptive, explanatory and

predictive-estimative information. The open <--> closed source continuum will

remain intact as the players along the continuum change. In the 1950s nearly all

of the "closed" players were governmental focusing on military capabilities and

intentions; by the 21st century national, multinational and global corporations

(and other economic "alliances") will find themselves all along the continuum.

"Private" data collections -- used for special analytical purposes -- will emerge as

the public and private news organizations grow in number and capability. The

other published media will continue to serve as reservoirs of information,

reservoirs that will yield intelligence wholes greater than the sum of their parts.

At the heart of the data, information and knowledge bases, and the analysis and

production of finished public and private sector intelligence, will be information

technology.

This paper examines the trend in open source analysis and production and the

role that information technology -- very broadly defined -- can play in the

intelligence analysis and production process. It assumes that information

technology has come of age and that opportunities now exist for relatively

conservative applications of extremely powerful methods, tools and techniques.

The paper deals with the range of information technology that might support the

analysis and production process as well as some especially promising tools and

techniques. It concludes with a set of recommendations for future investments in

the technology, methodology and analysis.


2.0 The Range of Information Technology

There are a variety of tools, methods, techniques, devices and architectures

available to the intelligence systems designer; many more will emerge as we

move toward the 21st century. The challenge -- as always -- lies in the extent to

which designers can match the right tool or method with the appropriate

problem. This section looks at a number of technology options now available to

the designer, options that will evolve quite dramatically during the next five to

ten years.

2.1 Emerging Models and Methods

Over the past few years the analytical methodology community has seen the

preeminence of knowledge-based tools and techniques, though the range of

problems to which heuristic solutions apply is much narrower than first assumed.

It is now generally recognized that artificial intelligence (AI) can provide

knowledge-based support to well-bounded problems where deductive inference is

required (Andriole, 1990). We now know that AI performs less impressively in

situations with characteristics (expressed in software as stimuli) that are

unpredictable. Unpredictable stimuli prevent designers from identifying sets of

responses, and therefore limit the applicability of "if-then" solutions. We now

know, for example, that so-called expert systems can solve low-level diagnostic

problems, but cannot predict Russian intentions toward Germany in 1999. While

there were many who felt from the outset that such problems were beyond the

applied potential of AI, there were just as many sanguine about the possibility of

complex inductive problem-solving.

The latest methodology to attract attention is neural network-based models of


inference-making and problem-solving. Neural networks are applicable to

problems with characteristics that are quite different from those best suited to AI.

Neural networks are -- according to Hecht-Nielsen (1988) -- "computing systems

made up of a number of simple, highly interconnected processing elements which

process information by their dynamic state response to external inputs." Neural

nets are non-sequential, non-deterministic processing systems with no separate

memory arrays. Neural networks, as stated by Hecht-Nielsen, comprise many

simple processors that take a weighted sum of all inputs. Neural nets do not

execute a series of instructions, but rather respond to sensed inputs. "Knowledge"

is stored in connections of processing elements and in the importance (or weight)

of each input to the processing elements. Neural networks are allegedly

non-deterministic, non-algorithmic, adaptive, self-organizing, naturally parallel,

and naturally fault tolerant. They are expected to be powerful additions to the

methodology arsenal, especially for data-rich, computationally intensive

problems. The "intelligence" in conventional expert systems is pre-programmed

from human expertise, while neural networks receive their "intelligence" via

training. Expert systems can respond to finite sets of event stimuli (with finite

sets of responses), while neural networks are expected to adapt to infinite sets of

stimuli (with infinite sets of responses). It is alleged that conventional expert

systems can never learn, while neural networks "learn" via processing.

Proponents of neural network research and development have identified the kinds

of problems to which their technology is best suited: computationally intensive;

non-deterministic; non-linear; abductive; intuitive; real-time;

unstructured/imprecise; and non-numeric (DARPA/MIT, 1988).
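To make the "processing element" notion concrete, here is a minimal sketch in Python (a contemporary illustration, not a system available to the designers discussed here). A single element takes a weighted sum of all its inputs and acquires its "intelligence" via training; the perceptron learning rule and the toy indicator data are illustrative assumptions, not methods this paper prescribes.

```python
# One "processing element": a weighted sum of all inputs behind a
# hard threshold, trained from examples rather than pre-programmed.

def respond(weights, bias, inputs):
    """Respond to sensed inputs via a weighted sum and a threshold."""
    s = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > 0 else 0

def train(samples, epochs=50, rate=0.1):
    """Perceptron rule (an assumed learning scheme): nudge each weight
    toward reducing the error on every training example."""
    n = len(samples[0][0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            error = target - respond(weights, bias, inputs)
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
            bias += rate * error
    return weights, bias

# Toy indicator data: warn (1) when at least two of three binary
# indicators are active -- a pattern a single element can learn.
samples = [((1, 1, 0), 1), ((1, 0, 1), 1), ((0, 1, 1), 1),
           ((1, 0, 0), 0), ((0, 1, 0), 0), ((0, 0, 1), 0), ((0, 0, 0), 0)]

weights, bias = train(samples)
print(respond(weights, bias, (1, 1, 1)))  # prints 1: warning
```

The "knowledge" here lives entirely in the learned weights and bias, which is precisely the contrast with pre-programmed expert systems drawn above.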

It remains to be seen if neural networks constitute the problem-solving panacea

that many believe they represent. The jury is still out on many aspects of the

technology. But like AI, it is likely that neural nets will make a measured

contribution to our inventory of models and methods.


What does the future hold? Where will the methodological leverage lie?

In spite of the over-selling of AI, the field still holds great promise. Natural

language processing systems -- systems that permit free-form English interaction

-- will enhance efficiency. When users are able to type or ask questions of their

information systems in much the same way they converse with human colleagues,

then the way systems are used will change forever.

Expert systems will also routinize many data sorting, information retrieval and

low-level inference-making processes. Rules about relationships among

indicators, the significance of technology activity, and the significance of a

leadership change will be embedded in expert intelligence analysis and production

systems. Problems that now have to be re-solved whenever a slight variation

appears will be solved autonomously as semi- and fully-automated procedures

are implemented.
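A minimal sketch of such embedded rules follows, written as a forward-chaining loop in Python; the indicator names and conclusions are invented placeholders, not doctrine from this paper.

```python
# Each rule: if all conditions are among the known facts, assert the
# conclusion. Rule contents are hypothetical examples.
RULES = [
    ({"leadership_change", "troop_movement"}, "political_instability"),
    ({"political_instability", "border_incident"}, "elevated_warning"),
    ({"new_test_facility"}, "technology_activity"),
]

def forward_chain(facts):
    """Fire rules repeatedly until no new conclusions appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"leadership_change", "troop_movement", "border_incident"}))
# The derived facts include "political_instability" and "elevated_warning".
```

Once such a rule base exists, the slight variations mentioned above become new fact sets run through the same rules, rather than problems solved from scratch.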

Smart database managers will develop necessary databases long before problems

are identified. Intelligence systems of the 1990s will be capable of adapting from

their interaction with specific users. They will be able to anticipate problem-

solving "style," and the problem-solving process most preferred by the analyst.

They will be adaptive in real-time, and capable of responding to changes in the

environment, like a shortage of time. They will sense, fuse, distill and present

data and information according to standing and adaptive instructions about the

significance of events and conditions throughout the public and private worlds.

The kinds of problems that will benefit the most from AI will be well-bounded,

deductive inference problems about which a great deal of accessible and articulate

problem-solving expertise exists. The intelligence community should abandon its

goals -- to the extent that they still exist -- of endowing computer programs with

true inductive or abductive capabilities, and the dollars saved should be plowed


back into so-called "low-level" AI.

Intelligence systems designers in the 1990s will also benefit from a growing

understanding of how humans assess situations and make inferences. The

cognitive sciences are amassing evidence about perception, biasing, option

generation, and a variety of additional phenomena directly related to intelligence

modeling and problem-solving. The world of technology will be informed by

new findings; resultant systems will be "cognitively compatible" with their users.

Next generation systems will also respond to the situational and psychophysiological

environment. They will alter their behavior if their user is making a lot of

mistakes, taking too long to respond to queries, and the like. They will slow

down or accelerate the pace, depending on this input and behavior. The field of

cognitive engineering -- which will inform situational and psychophysiological

system design strategies -- will become increasingly credible as we approach the

21st century. The traditional engineering developmental paradigm will give way

to a broader perspective that will define information management, inference- and

decision-making processes more from the vantage point of requirements and

users than computer chips and algorithms. Principles of cognitive engineering

will also inform the design of human-computer interfaces (see Section 2.4

below).

2.3 "Competitive" Models & Methods

Hybrid models and methods drawn from many disciplines and fields will emerge

as preferable to single model-based solutions largely because developers will

finally accept diverse intelligence requirements specifications. Methods and tools

drawn from the social, behavioral, mathematical, managerial, engineering, and


computer sciences will be combined into solutions driven by requirements and

not by methodological preferences or biases. This prediction is based in large

part upon the maturation of the larger design process, which today is far too

vulnerable to methodological fads. Hybrid modeling for design and development

also presumes the rise of multidisciplinary education and training, which is only

now beginning to receive serious attention in academia, industry and the

intelligence community.

The analysis and production process recognizes at least three activities:

monitoring, warning, and management. Monitoring requires organic and

electronic systems to scan the environment. While these tasks may seem

relatively simple, they are in fact quite complicated. Monitoring is based upon

situational and realtime input as well as intelligence data provided to analysts

from a variety of sources. This data can take the form of geographic data, data

about the likelihood of specific kinds of threats, and data acquired from humans

in specific locations or studying specific phenomena. It can also take the form of

data and information about the specific threats and the competence of the analysts

of, for example, science and technology. This information is part empirical and

part judgmental. Empirical information about threats may well exist (from photo

reconnaissance or human intelligence), but it is nearly always supplemented with

judgments about many critical attributes: the essence of intelligence is estimation,

not precision.

The monitoring process triggers an analytical chain reaction that is essential to

the development of any inference-making system. Input data is distilled and

interpreted against a set of intelligence data and assumptions that may or may not,

be accurate. The location and lethality of threats, for example, are estimated but

may be well off the mark when a mission or additional events unfold. The

monitoring process is thus -- to some extent at least -- "fixed" by intelligence


estimates (that may or may not prove useful).

Once a threat (or opportunity) has been located and identified, the

problems regarding lethality and options are often more definable. The same is

true during monitoring, warning, and management. If the situation is relatively

well understood, then it is possible to identify a set of responsive procedures.

This is especially true if there is a finite number of things that can happen. How

many different threats can theoretically exist? How many different forms might

these threats assume? Can they be anticipated? We would argue that for selected

geographic areas, the predictability of threats is relatively high; we would also

argue that in other areas, predictability is low.

The selection of analytical methods is thus anchored on the low <----> high

predictability continuum. The degree of confidence in intelligence data,

distillation (data fusion) procedures, and identification procedures, will determine

the optimal matching process. Monitoring assumes some structured methodology

for interpreting the diagnosticity of data and indicators. Monitoring and

diagnosis (for the purpose of assessing situations and issuing warnings) is classical

hypothesis testing, or, if one prefers to use artificial intelligence (AI) parlance,

deductive inference via backchaining.

Indicators are observed and relationships among them used to make inferences

about the likelihood of some event or condition. Intelligence analysis and

production assumes this monitoring and diagnosis process (Andriole, 1984). If

we can identify a suite of models and analytical techniques for monitoring,

diagnosis, assessment and warning, then we can develop prototypes that exploit

multiple modeling, redundant analysis, and the like.
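To make the backchaining notion concrete, the sketch below tests a warning hypothesis by working backward to the indicators that would support it; the hypothesis names and rules are hypothetical.

```python
# Backward chaining: a hypothesis holds if it is observed directly or
# if any one rule body (a conjunction of subgoals) can be established.
RULES = {
    "crisis_warning": [["mobilization", "hostile_rhetoric"]],
    "mobilization": [["reserve_callup"], ["supply_stockpiling"]],
}

def supported(hypothesis, observed):
    if hypothesis in observed:
        return True
    return any(all(supported(goal, observed) for goal in body)
               for body in RULES.get(hypothesis, []))

# Two observed indicators are enough to back-chain to the warning.
print(supported("crisis_warning", {"supply_stockpiling", "hostile_rhetoric"}))  # True
```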

For example, there are currently a variety of aids and support systems that


contribute to the intelligence analysis and production process. Some monitor the

international environment searching for crises and others try to anticipate what

the North Koreans, Syrians, or Libyans are likely to do next. Nearly all of these

systems employ but a single dominant analytical model. The global indications and warnings (I&W) systems usually rely upon statistical deviations from

"normal" behavior (Andriole, 1983; 1984); the tactical warning systems

frequently rely upon probabilistic inference-making procedures. We have an

opportunity to introduce and demonstrate a new approach to aiding based upon

the use of multiple methods and models. Given today's information and

computing technology and the state of our analytical methodology, it is possible

to design systems that conduct analyses from multiple analytical perspectives. We

should redefine the tasks/methods matching process to include the integration of multiple methods and models to increase "hits" and decrease "false alarms."

In at least one study (Andriole, 1979), it was discovered that given a single set of

data and indicators it was possible to generate substantially different crisis

probabilities from two different analytical methods (Bayesian updating and

pattern recognition). At the time, the notion of incarnating several methods in

software was never considered because of the expense and our uncertainty about

how alternative methods could be integrated. Today neither concern is nearly

as daunting, though few -- if any -- attempts have been made.
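The sketch below reproduces that divergence in miniature: the same indicator set scored by Bayesian updating and by a simple template match. The priors, likelihood ratios, and template are invented numbers for illustration only.

```python
observed = {"troop_movement": 1, "embassy_closure": 1, "press_censorship": 0}

# Method 1: Bayesian updating with assumed likelihood ratios.
likelihood_ratios = {"troop_movement": 4.0, "embassy_closure": 2.5,
                     "press_censorship": 3.0}
odds = 0.10 / 0.90  # assumed prior: 10% chance of crisis
for indicator, present in observed.items():
    if present:
        odds *= likelihood_ratios[indicator]
p_bayes = odds / (1 + odds)

# Method 2: pattern recognition -- similarity to a stored crisis template.
template = {"troop_movement": 1, "embassy_closure": 1, "press_censorship": 1}
p_pattern = sum(observed[k] == template[k] for k in template) / len(template)

print(f"Bayesian: {p_bayes:.2f}  pattern match: {p_pattern:.2f}")
# Prints roughly 0.53 versus 0.67 -- one data set, two crisis probabilities.
```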

Multiple model-based inference-making is a new concept directly pertinent to

reliability and accuracy. In the I&W world, multiple modeling could help reduce

false alarms and increase hits. In the tactical world, it could add breadth and

depth to intelligence estimates, especially as they pertain to military activity.

There are a variety of monitoring, diagnosis, assessment and warning methods

available. The challenge lies in the extent to which we can identify and integrate


the "right" (multiple) methods into the domains we select to aid. The basic

concept assumes that monitoring, diagnosis, and situation assessment tasks can be

addressed by multiple qualitative and quantitative models and methods:

Monitoring, Diagnosis, Situation Assessment & Warning --> Multiple (Quantitative & Qualitative) Models & Analytical Methods --> Advanced Information & User-Computer Interface Technology --> Systems

Our research suggests that the selection of methods for intelligence analysis and

production must be based upon the following assumptions and attributes:

1. The nature of the task (monitoring, warning, management);

2. The degree of certainty and predictability of the environment and threat(s);

3. The use of multiple hybrid analytical (qualitative and quantitative) models (consistent with processing capabilities);

4. The use of advanced information technology to display the results of the processing and analyst-system interaction; and

5. The use of a certainty/uncertainty control structure to determine the selection of methodological processes, a control structure that will adapt to situational constraints defined in terms of threat/decision time/awareness.

The proposed control structure linked to methodological processes is key to the

selection and advocacy of methods, tools and techniques. Initial hypotheses about

which methods might be useful can be refined according to the control structure.

But the control structure itself operates on two successive levels: the probability

that a given condition (especially threat) exists, and the probability that the

situation (defined in terms of threat/decision time/awareness) is accurate. In

production rule form, the control structure approach might work as follows: if

the probability of threat location/identification/assessment is high and the


probability of situation "A" (crisis) is high, then methods suite " " and task

allocation state "low" should be implemented (via display technique[s] "d").

The probability of the threat will predict the probability of the situation. If the

threat is defined -- with high confidence -- then the probability of short decision

time and high threat will be correspondingly high.
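In code, the two-level control structure might look like the following sketch; the thresholds, suite names, allocation states, and display codes are placeholders standing in for the blanks in the production rule above.

```python
def select_methods(p_threat, p_situation):
    """Map the two probabilities to a methods suite, a task-allocation
    state, and a display technique (all placeholder names)."""
    if p_threat >= 0.8 and p_situation >= 0.8:
        return ("fast_deterministic_suite", "allocation_low", "display_d")
    if p_threat >= 0.8:
        return ("mixed_suite", "allocation_medium", "display_b")
    return ("probabilistic_search_suite", "allocation_high", "display_a")

print(select_methods(0.9, 0.9))  # high threat + crisis: fast methods
```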

The probability of the threat, in turn, is dependent upon the intelligence and

surveillance data and estimates, and the methods used to distill incoming data

efficiently. Here the multiple modeling approach should pay dividends. Because

of the proposed methods architecture it is essential that threat data be as accurate

as possible. At the same time, if great uncertainty exists, the system will

implement a suite of methods best able to deal with the situation.

The methods architecture has implications for processing speed and, therefore,

the ability to recalculate, re-plan, and perform realtime sensitivity analyses. If

high certainty exists about the threat and the situation, then optimal methods can

be implemented. High threat/situational certainty permits the use of

preprogrammed methodological processes, especially in reasonably well-bounded

domains.

Note again that the selection of the methods, allocations and displays is a function

of the probability that the threat can be identified and profiled and that the

situation can be described and defined. This, however, is dependent upon initial

processing of surveillance and intelligence data, as well as standoff and local

realtime sensor and observed data. The essential idea calls for a great deal of

processing regarding the likelihood of a "situation" existing, or "threat analysis

pre-processing " This pre-processing is by nature probabilistic, calling for

estimates about the state of the threat and situation: the probability of a threat


existing and the resultant probability that a particular situation exists are inputs to

the warning and management process.

The methods architecture calls for a greater processing investment early in the

analytical process and then less required processing -- in many circumstances --

after initial probabilities are determined. For example, if the system were to be

able to generate a .90 probability of a specific threat (identification, location,

lethality, and the like) and a .90 probability of a crisis situation, then the .81 path

probability would, in turn, trigger the implementation of a set of methods to

complete the warning and management processes.

There is a set of control rules that would make the system work. They are

relational and conditional. There are sets of relationships among threats,

geographic terrain, capabilities, and missions. Data and knowledge bases

exist that define these relationships, relationships that can be programmed as

simple stimulus-response procedures, or as heuristic search structures.

Remember that the selection of a methods process is a function of (a) the

probability of threat and (b) the probability of the situation. The process has

implications for task allocation as well. When time is short and the situation is

"known," the "best" methods are those that are fast and accurate.

The architecture calls for multiple monitoring methods working in parallel.

These would include a suite of tools and techniques that crosscut the

epistemological spectrum. Procedural preprocessing would involve the use of

neural networks, fuzzy logic systems, genetic learning algorithms, conventional

(heuristic) expert systems, multivariate models, statistical pattern recognition, and

a variety of other probabilistic methods, tools and techniques, such as 1st order

Bayesian mechanisms and Markov chains. The key, as suggested above, is the

allocation of computational resources at the front end of the tactical process.


Qualitative and quantitative models, tools and techniques can be used to filter

incoming data; in effect, a processing "race" should be set up to distill the data to

its most diagnostic properties as quickly as possible. The significance of the

front-end architecture lies in its explicit multi-modeling orientation and the

analysis of incoming data from multiple epistemological perspectives. In a

completely integrated system the methods would "cooperate" (output-to-input /

input-to-output).
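A minimal sketch of such a race follows: several stand-in models score the same incoming data concurrently and their outputs are fused, here by simple averaging, one of many possible cooperation schemes. The model internals are trivial placeholders for the suite listed above.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-ins for a neural net, a fuzzy-logic system, and a statistical
# recognizer, each distilling the same input to a single score.
def neural_score(data):      return 0.7 * data["signal"]
def fuzzy_score(data):       return min(1.0, data["signal"] + 0.1)
def statistical_score(data): return 0.5 * data["signal"] + 0.2

MODELS = [neural_score, fuzzy_score, statistical_score]

def fused_assessment(data):
    """Run every model in parallel and average the scores."""
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(lambda model: model(data), MODELS))
    return sum(scores) / len(scores)

print(fused_assessment({"signal": 0.8}))  # one fused diagnostic score
```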

The assumptions we have made regarding the boundability of the intelligence

domain are closely related to the selection of methods during the front end. The

search routines embedded in the modeling processes would look for patterns of

events, conditions and indicators that were well understood, immediately

recognizable, or confidently anticipated. This search process could be done

qualitatively or quantitatively, but against sets of templates designed to be

"located" quickly. In parallel the system would also search more inductivelyfor

less likely patterns. When a match is made, then the probability of the threat and

the situation can be determined -- from quantitative and qualitative data and

knowledge bases. The entire architecture assumes that if a match is made

relatively early on in the monitoring process then the system can, in turn,

proceed more confidently toward the warning and management phases. In other

words, the system can shift its methodological orientation from probabilistic

methods, tools and techniques to those more inclined toward optimization.

Another way of conceiving of the architecture is to understand front-end

processes as data-driven and subsequent stages as parameter-driven. Front-end

methods can range from data-driven to judgment-driven to heuristic-driven, as

well as parameter-driven methods, where appropriate. The rules for method(s)

invocation are derived from P(T) P(M). When the probability is high, then the

system will invoke deterministic methods; when the probability is relatively low,


then the system will invoke more data-, judgment-, or heuristic-driven methods.
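A sketch of this invocation rule, reusing the .90 x .90 = .81 path-probability example from above; the 0.8 threshold is an assumption.

```python
def invoke(p_threat, p_situation, threshold=0.8):
    """Gate the method class on the path probability P(T) P(M)."""
    path_probability = p_threat * p_situation
    if path_probability >= threshold:
        return "deterministic_methods"           # high certainty: optimize
    return "data_judgment_heuristic_methods"     # low certainty: explore

print(invoke(0.9, 0.9))  # 0.81 path probability -> deterministic_methods
print(invoke(0.5, 0.6))  # 0.30 path probability -> data/judgment/heuristic
```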

2.4 Promising User-Computer Interface Technology

Twenty years ago no one paid much attention to user interface technology. Since

the revolution in microcomputing -- and the emerging one in workstation-based

computing -- software designers have had to devote more attention to the process

by which data, information and knowledge are exchanged between the system and

its operator. There are now millions of users who have absolutely no sense of

how a computer actually works, but rely upon its capabilities for their very

professional survival. A community of software vendors is sensitive to both the

size of this market and its relatively new need for unambiguous, self-paced,

flexible computing. It is safe to trace the evolution of well-designed

human-computer interfaces to some early work in places like the University of

Illinois, the Massachusetts Institute of Technology (in what was then the

Architecture Machine Group, now the Media Lab), Xerox's Palo Alto Research

Center (Xerox PARC), and, of course, Apple Computer, Inc. The "desk-top"

metaphor, icon-based navigational aids, direct manipulation interfaces, and user

guided/controlled interactive graphics -- among other innovations -- can all be

traced to these and other organizations.

Where did all these ideas come from? The field of cognitive science -- and now

"cognitive engineering" -- is justifiably taking credit for the progress in

UCI technology, since its proponents were the (only) ones asking why the

user-computer interaction process could not be modeled after some validated

models of cognitive information processing. UCI models were built and tested,

and concepts like "spatial database management" (from MIT's Architecture

Machine Group [Bolt, 1984]), hierarchical data storage, and hypertext were


developed. It is no accident that much UCI progress can be traced to findings in

behavioral psychology and cognitive science; it is indeed amazing that the

cross-fertilization took so long.

UCI progress has had a profound impact upon the design, development and use of

all kinds of interactive systems. Because many of the newer tools and techniques

are now affordable (because computing costs have dramatically declined

generally), it is now possible to satisfy complex UCI requirements. Early

data-oriented systems displayed rows and rows (and columns and columns) of

numbers to users; modern systems now project graphic relationships among data

in high resolution color. Designers are now capable of satisfying many more

substantive and interface requirements because of what we have learned about

cognitive information processing and the affordability of modern computing

technology.

The most recent progress in UCI technology is multimedia, or the ability to store,

display, manipulate and integrate sound, graphics, video and good old-fashioned

alphanumeric data (Ragland, 1989; Ambron and Hooper, 1988; Aiken, 1989). It

is now possible to display to users photographic, textual, numerical, and video

data on the same screen. It is possible to permit users to select (and de-select)

different displays of the same data. It is possible to animate and simulate in

real-time -- and cost-effectively. Many of these capabilities were just too

expensive a decade ago and much too computationally intensive for the hardware

architectures of the 1970s and early 1980s. Progress has been made in the design

and execution of applications software and in the use of storage devices (such as

videodisks and compact disks [CDs]). Apple Computer's HyperCard software

actually provides drivers for CD players through a common UCI (the now

famous "stack"). Designers can exploit this progress to fabric systems that are

consistent with the way their users think about problems. There is no question


that multimedia technology will affect the way future systems are designed and

used. The gap between the way humans "see" and structure problems and the way

systems present them will be narrowed considerably via the application of multimedia technology.

Direct manipulation interfaces (DMIs) such as trackballs, mice and touchscreens

have also matured in recent years and show every likelihood of playing important

roles in next generation UCI design and development. While there is some

growing evidence that use of the mouse can actually degrade human performance

in certain situations, there are countless others where the payoff is empirically

clear (Ramsey and Atwood, 1979; Ledgard, Singer and Whiteside, 1981; Bice and

Lewis, 1989). Touch screens are growing m popularity when keyboard entry is

inappropriate and for rapid template-based problem-solving (Smith and Mosier,

1984).

The use of graphical displays of all kinds will dominate future UCI applications.

Growing evidence in visual cognition research (Pinker, 1985) suggests how

powerful the visual mind is. It is interesting that many problem-solvers --

professionals who might otherwise use a system -- are trained graphically not

alphanumerically. Military planners receive map-based training; corporate

strategists use graphical trend data to extrapolate and devise graphic scenarios;

and a variety of educators have taken to using case studies laden with pictures,

icons, and graphics of all kinds. Complicated concepts are often easily

communicated graphically, and it is possible to convert complex problems from

alphanumeric to graphic form. There is no question that future systems will

exploit hypermedia, multimedia, and interactive graphics of all kinds.

Speech input and output should also emerge over the next five to ten years as a

viable UCI technology. While predictions about the arrival of "voice activated

text processors" have been optimistic to date, progress toward even continuous


speech input and output should be steady. Once the technology is perfected there

are a number of special purpose applications that will benefit greatly from

keyboard- and mouse-less interaction.

The use of advanced UCI technology will foster a wider distribution of

technology. Early interactive systems were used most productively by those

familiar with the method or model driving the system as well as interactive

computing itself. In other words, in order to exploit information technology one

had to have considerable computing expertise. Advanced UCI technology reduces

the level of necessary computing expertise. Evidence suggests that training costs

on the Apple Macintosh, for example, are lower because of the common user

interface. Pull-down and pop-up menus, windows, icons, and direct manipulation

via a mouse or trackball are all standard interface equipment regardless of the

application program (and vendor). If you know how to use one Macintosh

program, chances are you can use them all to some extent.

UCI technology will also permit the use of more methods and models, especially

those driven by complex -- yet often inexplicable -- analytical procedures. For

example, the concept of optimization as manifest in a simplex program is difficult

to communicate to the typical user. Advanced UCI technology can be used to

illustrate the optimization calculus graphically and permit users to understand the

relationships among variables in an optimization equation.
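For instance, the two-variable linear program below is the kind of problem whose feasible region and optimum a graphical interface could draw for the analyst; the objective and constraints are invented, and scipy's linprog stands in for a simplex routine.

```python
from scipy.optimize import linprog

# Maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x, y >= 0.
# linprog minimizes, so the objective is negated.
result = linprog(c=[-3, -2],
                 A_ub=[[1, 1], [1, 3]],
                 b_ub=[4, 6],
                 bounds=[(0, None), (0, None)])

print(result.x, -result.fun)  # optimal corner (4, 0) with value 12
```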

Similarly, probabilistic forecasting methods and models anchored in Bayes' Theorem of

conditional probabilities, while computationally quite simple, are conceptually

convoluted to some intelligence analysts. Log odds and other graphic charts can

be used to illustrate how new evidence impacts prior probabilities. In fact, a

creative cognitive engineer might use any number of impact metaphors (like

thermometers and graphical weights) to present the impact of new evidence on

the likelihood of events.
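A sketch of the log-odds arithmetic behind such a display: each piece of evidence adds its log likelihood ratio to a running total, so its impact can be drawn as movement along a single scale. The prior and the likelihood ratios are illustrative numbers.

```python
import math

def to_log_odds(p):
    return math.log(p / (1 - p))

def to_probability(log_odds):
    return 1 / (1 + math.exp(-log_odds))

log_odds = to_log_odds(0.2)  # assumed prior: 20% chance of the event
for likelihood_ratio in [3.0, 0.5, 4.0]:  # three pieces of evidence
    log_odds += math.log(likelihood_ratio)
    print(f"posterior now {to_probability(log_odds):.2f}")
# Prints 0.43, 0.27, then 0.60 as evidence accumulates.
```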


Finally, advanced UCI technology will also permit the range of intelligence

analysis and production support to expand. Anytime communications bandwidth

between system and user is increased, the range of applied opportunities grows.

UCI technology permits designers to attempt more complex system designs due to

the natural transparency of complexity that good UCI design fosters. Some argue

that the interface may actually become "the system" for analysts. The innards of

the system -- like the innards of the internal combustion engine -- will become

irrelevant to the operator. The UCI will orchestrate process, organize system

contents and capabilities, and otherwise shield analysts from unfriendly

interaction with complex data, knowledge, and algorithmic structures.

3.0 Summary Recommendations

The net effect is staggering. Intelligence analysis and production support in the

1990s -- and beyond -- will be enormously broad and powerful. It will be

distributed and networked. It will be "intelligent" and inexpensive. The effects

of this reality are difficult to predict precisely, though a number of the ideas

expressed above are in fact inevitable. Have we given enough thought to the

direction in which information technology is taking us? Have we assessed the

desirability of the direction? Have we determined the impact which next

generation systems will have upon the future? Will they define the future? Or

will the future suggest a role for the kinds of support described here?

The technology tour presented here suggests that intelligence systems designers

will leverage emerging technology to fundamentally change the nature of

intelligence analysis and production. While the recent past enjoyed data-oriented

analysis, next generation systems will provide analytical support of all kinds. Of

particular value will be the speed with which routine problems will be solved and


the advisory support that interactive, quasi- and fully-automated systems will

provide users in areas as complex as situation assessment and option generation.

Interface technology will play a pivotal role in the introduction of new and

hybrid models and methods -- and the hardware community will make sure that

more than enough computational power is available.

The whole concept of "intelligence analysis" will evolve to accommodate changes

in the larger corporate, governmental, and military information systems

structure. Networking and advanced communications technology will permit

linkages to databases and knowledge bases -- and the routines to exercise them.

Not only will distinctions among mainframe, mini- and microcomputing fade, but

distinctions among management information, executive information, and decision

support systems will also blur. Ironically, the concept of centralization may

re-appear, not with reference to central computing facilities but with regard to

enormous systems conceived functionally as hierarchies of capabilities. Users

may well find themselves within huge computing spaces capable of supporting all

kinds of analyses. Advanced communications technology will make all this

possible; users will be able to travel within what will feel like the world's largest

mainframe, which conceptually is precisely what a global network of data,

knowledge, and algorithms is.

The same users will be able to disengage the network and go off-line to solve

specific problems. This freedom will expand the realm of analytical computing

in much the same way microcomputing expanded the general user community.

Finally, all this technology will permit designers to fulfill user requirements in

some new and creative ways. Up until quite recently, technology was incapable

of satisfying a variety of user requirements simply because it was too immature

or too expensive. We have crossed the capability/cost threshold; now designers


can dig into a growing toolbag for just the right methods, models, and interfaces.

By the year 2000 this toolbag will have grown considerably. Talented systems

designers should be able to match the right tools with the right requirements to

produce systems that are analyst-oriented and cost-effective.

The future of intelligence analysis and production systems design, development

and use is bright. While some major changes in technology and application

concepts are in the wind, next generation systems will provide enormous

analytical support to their users. We can expect the range of decision support to

grow in concert with advances in information technology.

But can we easily get there from here? What specifically should we do to

encourage progress? Here are some thoughts:

* Reaffirm the need to allocate more resources than we have historically to analysis versus collection

* Concentrate on practical, sensible, cost-effective concepts, methods, models and tools, not upon unproven, expensive ones

* Retrain and recruit analytically literate designers and analysts

* Give serious consideration to the creation of an Open Source Intelligence Agency

* Exploit information technology via off-the-shelf applications

* Launch an intelligence analysis and production requirements analysis to baseline what we need and what is technologically feasible

* Develop metrics to monitor and measure the impact of new technology investments

* Model, invest in and influence the open source world


4.0 References & Bibliography

Aiken, PH (1989) A Hypermedia Workstation for Software Engineering. Fairfax, VA: George Mason University, School of Information Technology and Engineering.

Ambron, R and Hooper, C (1988) Interactive Multimedia: Visions of Multimedia for Developers, Educators and Information Providers. Redmond, WA: Microsoft Press.

Andriole, SJ (1990) Information System Design Principles for the 90s. Fairfax, VA: AFCEA International Press.

Andriole, SJ (1989) Decision Support Systems: A Handbook for Design and Development. Princeton, NJ: Petrocelli Books, Inc.

Bice, K and Lewis, C (1989) Wings for the Mind: Conference Proceedings: Computer Human Interaction. Reading, MA: Addison-Wesley Publishing Co.

Bolt, RA (1984) The Human Interface: Where People and Computers Meet. Belmont, CA: Lifetime Learning Publications.

Defense Advanced Research Projects Agency (DARPA)/Massachusetts Institute of Technology (MIT) (1988) DARPA Neural Network Study. Fairfax, VA: AFCEA International Press.

Hecht-Nielsen, R (1988) "Neurocomputing: Picking the Human Brain." IEEE Spectrum. 25. 3. March.

Hopple, GW (1986) "Decision Aiding Dangers: The Law of the Hammer and Other Maxims." IEEE Transactions on Systems, Man and Cybernetics. SMC-16. 6. November-December.

Ledgard, H, Singer, A and Whiteside, J (1981) Directions in Human Factors for Interactive Systems. New York, NY: Springer-Verlag.

North, RL (1988) "Neurocomputing: Its Impact on the Future of Defense Systems." Defense Computing. January-February.

Pinker, S, ed. (1985) Visual Cognition. Cambridge, MA: MIT/Bradford Books.

Ragland, C (1989) "Hypermedia: The Multiple Message." MacTech Quarterly. Spring.

Ramsey, HR and Atwood, ME (1979) Human Factors in Computer Systems: A Review of the Literature. Englewood, CO: Science Applications, Inc.

Sage, AP and Rouse, WB (1986) "Aiding the Decision-Maker Through the Knowledge-Based Sciences." IEEE Transactions on Systems, Man and Cybernetics. SMC-16. 4. July-August.

Shneiderman, B (1987) Designing the User Interface. New York, NY: Addison-Wesley.

Smith, SL and Mosier, JN (1984) Design Guidelines for the User Interface to Computer-Based Information Systems. Bedford, MA: The MITRE Corporation.


FIRST INTERNATIONAL SYMPOSIUM: NATIONAL SECURITY & NATIONAL

COMPETITIVENESS: OPEN SOURCE SOLUTIONS. Proceedings, Volume II.