DIRECT MANIPULATION INTERFACES
Many developers have yet to grasp what makes direct manipulation interfaces successful

User interface technology has made considerable progress during the past few years, particularly in the area of direct manipulation. User interfaces based on direct manipulation are characterized by their support for interaction organized around the use of a pointing device, such as a mouse or touchscreen. Other features commonly associated with this style of interface include graphics, windows, and advanced menu techniques.

Direct manipulation has been successfully used in a wide range of applications, including spreadsheets, desktop publishing, CAD/CAM, public information systems, software development environments, and expert systems.

LIMITING FACTORS
Despite its success, acceptance of direct manipulation has been less than universal for a number of reasons. First and perhaps foremost, direct manipulation interfaces involve considerably more overhead than teletype-based interfaces. Therefore, the machines associated with them have, until recently, been expensive.

Second, risk is incurred in software development; design and implementation of applications using direct manipulation often involve unconventional languages, unorthodox design techniques, and unfledged development tools. No consensus has been reached as to the best way to design software for direct manipulation interfaces. This lack of consensus is reflected in the tools emerging on the market.1,2

Third, many software developers have yet to fully grasp what makes these interfaces successful. This issue is the focus of this article, in particular the elements of direct manipulation and how they may be applied to knowledge-intensive interactive environments such as expert systems.

Prospective developers of direct manipulation interfaces are "all dressed up with nowhere to go." The technical capabilities needed to develop effective user interfaces exist, but often no one knows what to do with them. The process of creating an effective user interface remains very much an art, yet artistic inclinations among software developers remain an undervalued commodity. Support for developers in the form of interface guidelines tends to be geared to yesterday's technology, viewing the world as if it were a series of full-screen displays (if not 80-column punch cards). This is quite different from direct manipulation environments, which compress considerable power into a single display through the use of windows, pop-ups, icons, graphics, and other gadgetry.

The art of creating a direct manipulation interface involves more than piling windows knee-deep on the display, anticipating that the world will rejoice at the prospect of getting lost among them. Designs based on naive implementation of advanced techniques tend to result in inept, garish, and irrelevant interfaces causing user overload, disorientation, and frustration.3-5 Development should be undertaken within a framework of relevant user interface concepts. An effective user interface encourages its users to focus their energies on the substance of their work rather than on extraneous requirements imposed by the computer.

For expert systems, this can mean the difference between success and failure. This is not simply a matter of user acceptance. Even if people can be induced to use a system and the results are generally accepted as epistemologically sound, success may remain elusive. Ensuring that the problems posed by the user and solved by the system are one and the same requires close cooperation between user and system. Furthermore, if the user cannot be presumed capable of properly posing a question, the system must ascertain not only the facts concerning the case at hand but the questions worth asking. Otherwise, effectiveness is compromised, and this may go undetected by the unsuspecting user.

If expert systems come into common usage (and become as common as the shells we use to develop them), it will be in part because obstacles associated with the user interface have been surmounted. Direct manipulation represents the best solution available using today's technology.

The success of direct manipulation interfaces can be attributed not to any single technique but to a convergence of interrelated elements, of which direct manipulation is but one. Other elements include symbolic representation, choice constructs, and concurrent contexts. Each of these elements is essential to the interface as a whole: symbolic representation is important in modeling user interfaces on their problem domains, choice constructs guide and inform user decisions, and concurrent contexts let the user integrate the benefits of several interactive systems at the same time. For its own part, direct manipulation is the mechanism that permits users to navigate among and interact with these elements. Because of their significance to the effectiveness of direct manipulation, these elements will be discussed in some detail.

SYMBOLIC REPRESENTATION
Symbolic representation is a key element in modeling user interfaces on their domains. To the extent a user interface successfully models the system's domain, the process of interacting with the computer begins to assume the character of directly manipulating domain objects. This lets the user think about the problem being solved rather than the way it is being solved.

Symbolic representation is the relationship between symbols and domain objects. This relationship is one of association, resemblance, or convention, and is accomplished using pictorial, metaphoric, arbitrary, and composite depictions of the objects and relations comprising the problem domain. In direct manipulation interfaces, symbols are interactive. As output entities, they depict the problem domain in its current state. As input entities, they are the handles, knobs, and other gismos whereby the user manipulates the problem domain. Any changes effected in the domain are reflected symbolically.

The symbols populating an interface are not isolated entities. The interface itself consists of a collection of symbols, their spatial and temporal relations, and their dynamics. Spatial relations define the behavior of symbols sharing the display at the same time; temporal relations define the interactive scenarios possible in the system. These relations may be effective, effected, concomitant, or inconsequent; that is, they may affect other objects, be affected by other objects, belong to groups of mutually affected objects, or contrive to have no relationship whatsoever. The dynamics of a symbol define the set of behaviors possible in the symbol's life in the system.

Viewed from a rudimentary perspective, the process of designing a user interface consists in defining the collection of symbols, their relations, and their dynamics. Selecting and integrating symbols into a system requires a knack for graphically visualizing complex information, an intuitive grasp of user behavior, and sufficient domain knowledge for delineating the significant problems. The dimension of domain dependency suggests that, for expert systems, interface design is analogous to knowledge acquisition.4,6 Following this analogy, the human expert's role in interface design is comparable to his or her role in knowledge acquisition.
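To make this rudimentary perspective concrete, the following sketch (illustrative Python, not drawn from the article; all class, field, and method names are hypothetical) outlines symbols that act both as output depictions of the domain's current state and as input handles for manipulating it, collected together with their relations and dynamics:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

# Hypothetical sketch: a symbol is both an output entity (it depicts part of
# the problem domain in its current state) and an input entity (a handle
# through which the user manipulates that part of the domain).
@dataclass
class Symbol:
    name: str
    depiction: str                     # pictorial, metaphoric, arbitrary, or composite
    state: Dict[str, object] = field(default_factory=dict)
    behaviors: Dict[str, Callable[..., Dict[str, object]]] = field(default_factory=dict)

    def render(self) -> str:
        # Output side: depict the domain object as it currently stands.
        return f"{self.depiction}{self.state}"

    def manipulate(self, action: str, **args) -> None:
        # Input side: the user acts on the symbol; any change effected in the
        # domain is reflected back in the symbol's state.
        self.state.update(self.behaviors[action](**args))

# A rudimentary interface definition: the collection of symbols, their
# relations, and their dynamics.
@dataclass
class InterfaceModel:
    symbols: List[Symbol]
    relations: List[Tuple[str, str, str]]   # e.g. ("torch", "affects", "bead_profile")

torch = Symbol("torch", "torch-icon",
               state={"offset": 0.0},
               behaviors={"move": lambda dx: {"offset": dx}})
model = InterfaceModel(symbols=[torch],
                       relations=[("torch", "affects", "bead_profile")])
torch.manipulate("move", dx=-0.5)        # a domain change is mirrored by the symbol
print(torch.render())
```

In a real system the depictions would be graphical rather than strings; the point is only the shape of the model: symbols, relations, dynamics.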

In expert systems, effective use of symbolic representation can help contain complexity and make the system intuitive and credible. Because expert systems are knowledge intensive, this is not merely a matter of easing the mechanics of interaction or minimizing the memorization of commands, but also of exploiting the user's expectations as to how ideas are organized and expressed within the system's target domain. A number of factors contribute to accomplishing this.

Symbols and their relations. Symbols and their relations must be defined in terms of how knowledge is represented across the user interface. This includes the terminology, rhetoric, notations, depictions, and styles of interaction associated with the problem domain. Some aspects of the interface, such as diagramming techniques, are closely associated with the problem domain. Others, such as the presentation of general reasoning, tend to be shared across multiple domains or common to the user's culture.

Effective use of symbols and their relations requires understanding both the domain and the user. Data flow diagrams, for example, are appropriate to a system supporting software design, but are unlikely to inspire enthusiasm among many users of a guide to Chinese cooking, regardless of the quality of the recipes. Such an approach, however, might be useful if the users were software engineers with no prior experience in the kitchen.

Reasoning. Reasoning, as represented in relations among symbols, must be expressed in human rather than machine terms. Human reasoning is highly compressed, leaving many premises and intermediary inferences implicit. It is unnecessary and tedious to present all relevant premises and conclusions with mathematical rigor. Consider this example:

Any horse can outrun any dog; some greyhounds can outrun any rabbit; therefore, any horse can outrun any rabbit.

If we disregard the possibility that some horses are quite slow while some dogs are pretty fast, we can readily observe that two additional premises are needed to render the argument valid:

If X can outrun Y and Y can outrun Z, then X can outrun Z

All greyhounds are dogs

Even so, the argument is still not complete. To demonstrate from these premises that any horse can outrun any rabbit requires 27 steps of formal logic.7 Such reasoning is not human reasoning; it is the reasoning of computers (and logicians). In the user interface, the degree of compression advisable depends on the domain and the user's familiarity with the domain.3 Premises presumed by an expert may seem less than obvious to a novice.
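For illustration, one possible first-order formalization of the argument (a sketch, not taken from the article) makes the hidden machinery explicit; the conclusion follows from the four premises only through a long chain of purely formal steps:

```latex
\begin{align*}
P_1\colon\ & \forall x\,\forall y\,\bigl(\mathit{Horse}(x) \land \mathit{Dog}(y) \rightarrow \mathit{Outruns}(x,y)\bigr) \\
P_2\colon\ & \exists x\,\bigl(\mathit{Greyhound}(x) \land \forall y\,(\mathit{Rabbit}(y) \rightarrow \mathit{Outruns}(x,y))\bigr) \\
P_3\colon\ & \forall x\,\forall y\,\forall z\,\bigl(\mathit{Outruns}(x,y) \land \mathit{Outruns}(y,z) \rightarrow \mathit{Outruns}(x,z)\bigr) \\
P_4\colon\ & \forall x\,\bigl(\mathit{Greyhound}(x) \rightarrow \mathit{Dog}(x)\bigr) \\
\therefore\ & \forall x\,\forall y\,\bigl(\mathit{Horse}(x) \land \mathit{Rabbit}(y) \rightarrow \mathit{Outruns}(x,y)\bigr)
\end{align*}
```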

Explanations. Explanations should be explanatory rather than tracebacks of activated rules. The purpose of explanation is to support user orientation and system credibility. Explanations should provide the user with a view of overall strategy, the factors governing the strategy, and the user's options with respect to controlling the strategy. While rule-based representation is conducive to system manageability from the developer's perspective,8 its modularity and simplicity tend to subvert the user's ability to understand the overall structure of the problem domain.9

Explanations in expert systems are so important that they cannot afford to rely solely on textual discourse for their symbolic representation. Pictorial and metaphorical representations can usually be more easily and quickly grasped, especially when implemented as interactive entities capable of sustaining user exploration and manipulation. Explanations should, from the user's perspective, be presented as an integral part of the system, not a subsystem developed as an afterthought.
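As a rough sketch of what an explanation organized around strategy, rather than a rule traceback, might contain, consider the following illustrative Python outline; the structure and the welding-related strings echo Figure 1 but are otherwise hypothetical:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch: an explanation expressed in the user's terms rather
# than as a traceback of activated rules.
@dataclass
class Explanation:
    strategy: str                 # the overall strategy being pursued
    governing_factors: List[str]  # the factors governing that strategy
    user_options: List[str]       # how the user may redirect the strategy

expl = Explanation(
    strategy="Maintain constant fill level across the weld pass",
    governing_factors=["bead profile is offset to the right"],
    user_options=["override torch position", "request supporting evidence"],
)

# The same content could also drive a pictorial depiction (as in Figure 1)
# rather than textual discourse alone.
print(f"Strategy: {expl.strategy}")
for factor in expl.governing_factors:
    print(f"  because: {factor}")
for option in expl.user_options:
    print(f"  you may: {option}")
```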

Questioning. For interactive expert systems to do their work, they must solicit sufficient information from the user to form a basis for their recommendations. Consequently, it is not unusual for expert systems to subject users to lengthy interrogations. However, users quickly tire of an expert system that only wants to play Twenty Questions. Other means of soliciting user input should be considered.

Alternatives to the interrogative approach include model building and mixed-initiative interaction. Both of these are well-suited to direct manipulation. With the model-building approach, the system provides the user with a set of symbolic tools to depict the target problem graphically.

This depiction may be carried out cooperatively, with the user and system prompting one another as necessary until the problem is refined to their mutual satisfaction. Mixed-initiative interaction3 is a variation on the interrogative approach, except the user is able to interrupt the interrogation at any time to offer as-yet-unsolicited information. To the extent that the interrogative approach is necessary, it should be progressive rather than arbitrary; that is, sequences of questions should be intuitively organized rather than sequenced according to the firing of a randomly arranged rule base.
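A minimal sketch of the mixed-initiative idea, in illustrative Python (not from the article; the question list and the "!" convention for volunteered facts are hypothetical), might look like this: questions are asked in a progressive order, but the user can interrupt at any point to volunteer information, and questions already answered are skipped.

```python
# Hypothetical sketch of mixed-initiative questioning: the interrogation is
# progressive (intuitively ordered), and the user may interrupt at any time
# to volunteer facts that have not yet been asked for.
def mixed_initiative_session(questions, volunteer_prefix="!"):
    facts = {}
    pending = list(questions)                 # progressive, not rule-firing, order
    while pending:
        key, prompt = pending[0]
        reply = input(f"{prompt} ")
        if reply.startswith(volunteer_prefix):
            # User takes the initiative, e.g. "!travel_speed=15"
            name, _, value = reply[len(volunteer_prefix):].partition("=")
            facts[name.strip()] = value.strip()
        else:
            facts[key] = reply
            pending.pop(0)
        # Skip any pending question the volunteered information already answers.
        pending = [(k, p) for k, p in pending if k not in facts]
    return facts

if __name__ == "__main__":
    # Hypothetical question list for a welding advisor.
    answers = mixed_initiative_session([
        ("joint_type", "What type of joint is being welded?"),
        ("current", "What is the welding current (amps)?"),
        ("travel_speed", "What is the travel speed (ipm)?"),
    ])
    print(answers)
```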

While the analogy between interface design and knowledge acquisition illuminates the path toward improved expert system user interfaces, it also suggests that the knowledge acquisition bottleneck may be tighter than we realize. If expert resources are required, not only to construct the knowledge base but to design the interface, development time and expense will increase accordingly.

Before sounding the alarm, though, we may wish to consider the possibility that we are taking the analogy a bit too far. The demands for domain expertise in interface design, while very real, may not be as stringent as in knowledge acquisition. It is at least conceivable that in domains characterized by a considerable body of publicly accessible knowledge, developers experienced in user interface design could make considerable headway in prototyping the interface with only a modicum of expert assistance.

CHOICE CONSTRUCTS
Choice constructs constrain, guide, and inform user decisions. Menus are the most familiar type of choice construct; other types include icons, buttons, and text prompts. Menus support user decisions by presenting useful sets of alternatives. Icons and buttons support tightly constrained decisions, while text prompts, being essentially open-ended, usually must be augmented with error handling. Serialized text prompts are command languages. Menus cover the territory between buttons and text prompts.

The combination of direct manipulation with bit-mapped display technology has fostered the maturity of menus. Using bit-mapped displays, a host of menu types has arisen: pop-ups, multiple-item menus, and menu bars. Direct manipulation provides the mechanism for deftly summoning and dispatching these menus. For example, in Figure 1, the use of windows and menu bars lets the user select from a variety of contexts.

Pop-up or momentary menus are handy for transient, single-item decisions. Multiple-item menus express slightly more complex notions, such as selecting values for groups of parameters. Scrollable list boxes are provided for managing dynamically sized lists of selectable items, such as a list of files. Menus can be combined with other choice constructs to form dialogue boxes.

Menu bars are exemplary choice constructs. They consist of a row of mouse-sensitive symbols, each of which, when selected, presents a momentary menu (in this context, momentary menus are usually called pull-down menus). To enter a command, the user opens a pull-down, selects an item, and the pull-down disappears. Menu bars support continuous access to a repertoire of commands within a shallow hierarchy, fully accessible within a single screen. Consequently, they can be browsed rapidly without loss of orientation. The importance of this kind of choice construct is not that it momentarily overlaps a region of the display, but that the region overlapped is small and the overlap is momentary.
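As a concrete, purely illustrative reading of this structure, the Python sketch below models a menu bar as a shallow, fully visible hierarchy: each bar entry presents a momentary pull-down whose items map directly onto commands. The names are hypothetical.

```python
from typing import Callable, Dict

# Hypothetical sketch: a menu bar is a row of selectable symbols, each of
# which presents a momentary (pull-down) menu of commands.
class MenuBar:
    def __init__(self, menus: Dict[str, Dict[str, Callable[[], None]]]):
        self.menus = menus          # {"File": {"Open...": command, ...}, ...}

    def open_pulldown(self, title: str) -> Dict[str, Callable[[], None]]:
        # Selecting a bar entry momentarily presents its pull-down.
        return self.menus[title]

    def choose(self, title: str, item: str) -> None:
        # The user opens a pull-down, selects an item, the pull-down
        # disappears, and the command runs.
        self.open_pulldown(title)[item]()

bar = MenuBar({
    "File":    {"Open...": lambda: print("open"), "Save": lambda: print("save")},
    "Explain": {"Show strategy": lambda: print("strategy view")},
})
bar.choose("Explain", "Show strategy")   # shallow hierarchy, browsable at a glance
```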

Choice constructs are essentially structures of related interactive symbols, grouped and displayed to assist the user in conveying decisions to the system. The symbols comprising a choice construct may be pictorial, arbitrary, or composite. The palette of drawing tools used in Apple's MacDraw is a metaphoric menu.

FIGURE 1. Operator interface display for an expert welding control system. (The figure shows a menu bar, alert indicators, windows, a graphic representation of current system status with weld parameters such as current and travel speed, and the explanation subsystem with graphic and textual elements, e.g., "Bead profile is offset to right.")

For expert systems, the importance of choice constructs is their approach to the locus of interactive control. In recognizing this, it is helpful to note the fundamental differences between systems based on advanced choice constructs and conventional menu-driven systems.

Conventional menu-driven systems guide users through a series of decisions, traversing from one decision point to the next. Because the locus of control resides with the system, the user has no alternative but to respond to the current menu. Such systems can be very easy to use on a step-by-step basis, but for complex tasks, the associated menu trees tend to become prolix, disorienting, and confining.

Advanced choice constructs, however, are user-driven. They are passive until the user chooses to act upon them. By letting the user take the initiative, the system gives the user flexibility in deciding what to do and when to do it. Because the realm of possible actions is structured and bounded by menus, the user is largely relieved of responsibility for remembering what commands exist and in what contexts they make sense. Choice constructs simplify the mechanics of interaction and thus reduce the mental load required to use the system.

CONCURRENT CONTEXTS
Concurrent contexts let users interact with several systems, processes, domains, or virtual worlds at the same time. This is commonly realized using windows, where each window behaves as a virtual terminal, and the user can select a window for interaction using direct manipulation techniques.

Windows provide a fix for what is otherwise a shortcoming in user interface environments. This shortcoming is the narrowness of domain of computer applications, sometimes referred to as the tool metaphor because of the tendency for programs to handle individual, specialized tasks.10 Typical tools in a user support environment might be a word processor, drawing program, and spreadsheet. Without windows, these programs or tools operate for the most part as discrete, sequential entities. Any attempt to force them to cooperate in a joint interactive venture is cumbersome at best.

This difficulty can be vexing when attempting to integrate emerging technologies into an ongoing environment. For example, using today's technology, it is frequently feasible to develop expert systems to assist operators of monitoring and control applications in diagnosing problems. However, if the expert system executes independently and takes exclusive control over the user interface, forcing the operator to abandon the monitoring and control application, considerable inconvenience will be involved in consulting the expert system. This considerably diminishes the probability that the system will be consulted at all.

The windows solution is to let the user run and interact with more than one program at a time, with each program running independently in its own window. Several additional services are usually associated with window systems. These include commonality across domains, dynamic display control, freedom of navigation, and data transfer.

Commonality across domains means similar functions are implemented similarly from one application to the next; for instance, the procedure for deletion will be the same regardless of whether the user is deleting a paragraph from a document or erasing a rectangle from a drawing. Applications can exploit and reinforce the user's experience with other applications in rendering the interface easy to use.
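One way to read commonality across domains in implementation terms is as a shared command protocol. The sketch below (illustrative Python, not from the article) shows a single deletion procedure behaving identically whether the selection is a paragraph in a document or a rectangle in a drawing.

```python
from typing import Protocol

# Hypothetical sketch: the same user-level procedure ("delete the current
# selection") is implemented uniformly across applications.
class Selectable(Protocol):
    def delete(self) -> None: ...

class Paragraph:
    def __init__(self, text: str): self.text = text
    def delete(self) -> None: print(f"removed paragraph: {self.text[:20]}...")

class Rectangle:
    def __init__(self, w: int, h: int): self.w, self.h = w, h
    def delete(self) -> None: print(f"erased {self.w}x{self.h} rectangle")

def delete_selection(selection: Selectable) -> None:
    # Identical gesture and behavior, regardless of the application domain.
    selection.delete()

delete_selection(Paragraph("User interface technology has made..."))
delete_selection(Rectangle(40, 25))
```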

With dynamic display control, the user can orchestrate windows as desired. As the needs of the user shift among the domains occupying the display, the arrangement of windows can readily be altered. The ability to set aside a window (and in doing so, collapse it into its iconographic representation) without destroying it lets the user put it on hold and return to it later as needed by reopening it. In this fashion, the user can set aside collections of iconographic domains, thus tailoring a metaphorical menu of systems.

Freedom of navigation lets the user roam at large among domains. For example, a programmer might have a text editor containing source code in one window, a display of detailed design documentation in another, and a command processor in a third. The programmer might scroll through the design documentation in search of a bit of pseudocode and, upon finding it, shift attention to the text editor to key in an implementation of the pseudocode using a high-level programming language, all the while keeping the pseudocode visible on the display. This being accomplished, the programmer might instruct the editor to save the source code to disk, and then focus on the command processor, entering commands appropriate to compiling and linking the source code. If the compile results in error messages, the programmer can correct the source code without losing sight of the messages.

Data transfer is the ability to move data from one window to another. For example, a user composing a document can insert drawings from a draw program or tables from a spreadsheet directly into the text. In the previous example, the programmer might prefer to insert a copy of the design-level pseudocode directly into the text editor, where it could be revised to conform to the syntax of the high-level language.
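Data transfer of this kind is commonly mediated by a shared clipboard; the following is a minimal illustrative sketch in Python (the window and clipboard classes are hypothetical, not a description of any particular system).

```python
# Hypothetical sketch of inter-window data transfer through a shared clipboard.
class Clipboard:
    def __init__(self): self.content = None

class Window:
    def __init__(self, name: str, clipboard: Clipboard):
        self.name, self.clipboard, self.body = name, clipboard, []

    def copy(self, item: str) -> None:
        self.clipboard.content = item             # e.g. a table, drawing, or pseudocode

    def paste(self) -> None:
        self.body.append(self.clipboard.content)  # insert directly into this context

clip = Clipboard()
spreadsheet = Window("spreadsheet", clip)
document = Window("word processor", clip)

spreadsheet.copy("cost table")
document.paste()                                  # the table now appears in the text
print(document.body)
```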

While windowing is clearly a key element in direct manipulation interfaces, it is perhaps a concept whose full potential has yet to be realized. Several deficiencies are associated with windows:
• If poorly managed, windows can contribute to user disorientation, which is only rarely productive in the workplace. Windows containing vital information may appear and vanish under program control, giving the bemused user no clue as to whence they came nor whither they have gone.
• To the extent that windows represent isolated islands of activity, they can hamper the user's efforts. It is up to the user to create and maintain whatever organization and relationships might exist among the windows. Thus the desktop metaphor found in some systems can be realized with a disturbing degree of accuracy. Too many windows may be required on display, forcing the user to thrash through mounds of clutter whenever some bit of information is desired.
• The current state of the user's focus of attention may not be obvious. For example, users may think they are entering text into an editor window, only to discover that the system has been directing the input to a command processor.

Problems due to poor management of windows can be corrected in a straightforward manner. Rather than presume that the cheap thrill of a windowing environment will pass for user friendliness, developers must use windows as a means to better designs. This includes observing, within the context of direct manipulation, many conventional tenets of good user interface design, such as providing the user with a stable work area, continuous access to status information, and requiring confirmation for important actions.11

To some extent, the problems with windows result from the tool metaphor. That windows operate primarily as independent entities or "alienated windows," as R. Reichman12 calls them, is contrary to the way users use them. Windows are not arbitrarily selected for display and interaction. In current systems, the structure and relation of a given set of windows is entirely subjective to the user. There is little attempt on the part of the system to reinforce this structure. Particularly in window systems based on existing operating environments, applications originally developed for teletype interaction are forced to fit into the window scheme of things. As more applications are developed to take advantage of windows (as well as of other window applications), we may expect to see some improvement in integration among windowed processes.

Another possible basis for improvement is hypertext. Using concepts arising out of hypertext, clusters of applications can be dynamically linked together at access points specified by the user. In any case, for the present, the isolation of individual processes within windows is not nearly so complete as that of processes running in a conventional teletype environment.

EXPERT SYSTEM USER INTERFACE
E.L. Hutchins et al.13 note that for some tasks, high-level languages may be superior to direct manipulation. If so, interface designers need to be able to tell which type of interface they should use for a given application.

Because selecting the symbols appropriate to a system is a domain-dependent activity, the problem may not be susceptible to an a priori solution. Instead, decisions as to interface type must be made as part of an iterative design and prototype process. Tasks for which a suitable pictorial or metaphorical representation can be found may lend themselves readily to direct manipulation designs. Tasks intrinsically bound up in systems of arbitrary symbols may be more suited to high-level languages.

To suppose that the issue is this simple, however, is to sell direct manipulation short. It is the nature of direct manipulation to absorb other user interface techniques, as evidenced in the assimilation of Smalltalk, LISP, and some implementations of the UNIX shell into the direct manipulation scheme of things. In each of these environments, the user interacts with the system using a language based on arbitrary symbols. But this is accomplished in a manner integrated into and augmented by direct manipulation. Apparently the ability to select and manipulate symbols, combined with features such as commonality across domains, dynamic display control, freedom of navigation, and data transfer, is a paradigm that is sufficiently powerful to encompass the capabilities of earlier, more conventional technologies.

Expert systems help people solve complex, demanding problems. These problems may entail a higher level of accountability than conventional interactive software systems. Consequently, it is important that users be able to focus their energies on the substance of the problem domain rather than on extraneous requirements imposed by the computer.

The elements of direct manipulation combine to form an environment hospitable to the interactive demands of expert systems. Symbolic representation provides the basis for externally representing knowledge using artifacts intrinsic to the problem domain. Advanced choice constructs support user-driven interaction control strategies while explicitly limiting the range of possible initiatives. Concurrent contexts enable users to consult expert systems without having to do so in a vacuum; instead, they can be run alongside other systems, expert or otherwise, and consulted as needed. Direct manipulation provides the mechanism for navigating among symbols, choice constructs, and systems in a manner that is intuitive, transparent, and productive.

REFERENCES
1. Potter, A. "Software Development Under Windows," Computer Language 5(1): 36-44, Jan. 1988.
2. Stern, H.L. "Comparison of Window Systems," BYTE, Nov. 1987, pp. 265-272.
3. Berry, D.C., and D.E. Broadbent. "Expert Systems and the Man-Machine Interface," Expert Systems 4(1), Feb. 1987.
4. Potter, A. "Interfacing the Expert: Characteristics and Requirements for the User Interface in Expert Systems," in Proceedings of the Third Annual Conference on Artificial Intelligence for Space Applications, Nov. 2-3, 1987, pp. 327-331. Sponsored by NASA Marshall Space Flight Center and the University of Alabama in Huntsville.
5. Shneiderman, B. Designing the User Interface: Strategies for Effective Human-Computer Interaction. Reading, Mass.: Addison-Wesley, 1987.
6. Baroff, J., R. Simon, F. Gilman, and B. Shneiderman. "Direct Manipulation User Interfaces for Expert Systems," in Hendler, J.A. (ed.), Expert Systems: The User Interface. Norwood, N.J.: Ablex, 1988, pp. 99-125.
7. Copi, I.M. Symbolic Logic, third ed. London, U.K.: Macmillan, 1967.
8. Barr, A., and E.A. Feigenbaum (eds.). The Handbook of Artificial Intelligence, vol. II. Los Altos, Calif.: William Kaufmann, 1982.
9. Kidd, A.L., and M.B. Cooper. "Man-Machine Interface Issues in the Construction of an Expert System," International Journal of Man-Machine Studies 22: 91-102, 1985.
10. Laurel, B.K. "Interface as Mimesis," in Norman, D.A., and S.W. Draper (eds.), User-Centered System Design: New Perspectives on Human-Computer Interaction. Hillsdale, N.J.: Lawrence Erlbaum Associates, 1986, pp. 67-85.
11. Rubinstein, R., and H.M. Hersh. The Human Factor. Burlington, Mass.: Digital Press, 1984.
12. Reichman, R. "Communication Paradigms for a Window System," in Norman, D.A., and S.W. Draper (eds.), User-Centered System Design: New Perspectives on Human-Computer Interaction. Hillsdale, N.J.: Lawrence Erlbaum Associates, 1986, pp. 285-313.
13. Hutchins, E.L., J.D. Hollan, and D.A. Norman. "Direct Manipulation Interfaces," in Norman, D.A., and S.W. Draper (eds.), User-Centered System Design: New Perspectives on Human-Computer Interaction. Hillsdale, N.J.: Lawrence Erlbaum Associates, 1986, pp. 87-124.

Andrew Potter is a senior software engineer with General Digital Industries Inc., Huntsville, Ala.