
Volume VI Number 4 Winter, 1991

Library of Congress No. ISSN 0892-4996

Contents of This Issue:

3 Editor's Note

Meet Me in St. Louis, Louie, at the 1992 Visitor Studies Conference

4-11 Suggested Guidelines for Designing Interactive Exhibits by Stephen Bitgood

12-13 Evaluation of the Falling Feather Exhibit on Gravity by Stephen Bitgood

14-17 Bibliography: Hands-On, Participatory, and Interactive Exhibits by Stephen Bitgood

18 1992 Visitor Studies Conference Announcement and Call for Papers

19 Report of the AAM Visitor Research & Evaluation Committee by Minda Borun

20 What's Happening With the Visitor Studies Association? Message from the President by Harris Shettel

Editor: Steve Bitgood (Jacksonville State University)

Associate Editors: Harris Shettel (Rockville, MD); Pete Conroy (Anniston Museum of Natural History); Michael Pierce (Alabama State Museum); Don Patterson (Jacksonville State University)

Managing Editor: Arlene Benefield (Center for Social Design)

Contributing Editors: Marilyn Hood (Hood Associates); John Koran (University of Florida); Ross Loomis (Colorado State University); Donald Thompson (University of Wisconsin-Milwaukee); and Wilcomb Washburn (Smithsonian Institution).

Published Quarterly by the Center for Social Design in cooperation with the Visitor Studies Association, the Anniston Museum of Natural History, and Jacksonville State University.

Address: Visitor Behavior, Center for Social Design, P. O. Box 1111, Jacksonville, Alabama 36265. Phone: (205) 782-5640

Subscriptions: $12.00, individual; $20.00, institutional. Write or call for information on back issues. Single issues, $3.50 each.

Unsolicited contributions are welcome and should be sent to the editor at the above address.


VISITOR BEHAVIOR    Winter, 1991    Volume VI Number 4    Page 2

THE CENTER FOR SOCIAL DESIGN

The Center for Social Design is a nonprofit organization with several major areas of focus:

• Publications, including Visitor Behavior, collected papers from the Annual Visitor Studies Conference, and technical reports on visitor-related topics.
• Research and evaluation projects in visitor studies.
• Workshops in visitor studies and evaluation.

For information contact:
Center for Social Design
P.O. Box 1111
Jacksonville, Alabama 36265
(205) 782-5640

DESIGNgroup: Color for every budget!

Graphic Design, Illustration & Production for Interpretive Panels and Labeling Systems
• Information panels ready-to-order: Rainforest, Grasslands and Wetlands
• Planning workshops
• Standards manuals

Florence Bramley, Director
PO Box 070216 • Staten Island, NY 10307-0002
Phone (718) 317-9800 • FAX (718) 356-1937

The Center for Social Design is Looking For Institutions Who Would Like To Host Visitor Studies Workshops Conducted By Leading Professionals

Workshop Topics include:
• Evaluating Marketing/Public Relations
• Exhibit Evaluation
• Program Evaluation
• Survey Development
• Graphics/Label Development

Workshop Instructors include:
• Stephen Bitgood
• Florence Bramley
• Minda Borun
• Marilyn Hood
• Ross Loomis
• Beverly Serrell
• Harris Shettel

Contact: Center for Social Design, P. O. Box 1111, Jacksonville, AL 36265. Phone: (205) 782-5640


EDITOR'S NOTE

This issue is devoted to the design and evaluation of interactive exhibits. My intention was to summarize some of the important points found in the literature, to provide a bibliography that readers might use for their own exploration, and to provide short abstracts of evaluation studies which targeted interactive exhibits. As I began this task, several problems arose. First, there appeared to be confusion in the literature concerning the definition of "interactive." My solution was a definition that focused on "physical interaction" between the exhibit and the visitor (i.e., a visitor's response to the exhibit results in a physical change in the exhibit). In addition, I make a distinction among three types of response engagement (page 5): simple hands-on, participatory, and physical interactive.

Another problem that emerged was that my short, 2-3 page summary of the literature turned into several more pages than originally anticipated. It would be easy to expand the current article into a massive volume. It might be more valuable, however, to have a series of guidelines for different types of interactive exhibits: computer, other electronic, mechanical, and low tech. Perhaps some of you have suggestions on where to go next.

A third problem was collecting relevant articles and materials on interactive exhibits. Many of the sources are hard to find because they have not been published or have been published in difficult-to-obtain places.

Finally, there was the problem of time. There was not enough time to consider thoroughly all the information relevant to the design of interactive exhibits. The current issue is only a beginning to the consideration of the myriad of factors necessary for effective interactive exhibits. We hope that it is a useful starting point for readers.

Steve Bitgood, Editor

LETTER TO THE EDITOR

Perhaps as an item for the newsletter you might wish to mention the absence of any section on visitor surveys or evaluation in the publication Required Reading: The Professional's Bookshelf / The American Association of Museums Bookstore Catalogue for Museum Professionals (Autumn/Winter, 1991-1992). There is only one entry in the index under "evaluation," one titled Visitor Evaluation Duo, and one with the title Visitor Surveys: A User's Manual, all of which are listed on page 35 under the rubric of "Technical Information Service."

Perhaps you could call attention to the absence of any "Interest Areas" covering visitor surveys and the paucity of visitor studies on the "Professional's Bookshelf" despite the existence of a standing committee of the AAM dealing with this subject.

Wilcomb E. Washburn, Director
American Studies Program
Smithsonian Institution


Meet Me in St. Louis, Louie... at the 1992 Visitor Studies Conference

"Meet me in St. Louis, Louie," may be a great song, butnext year it will also be a great idea! The 1992 Visitor StudiesConference will be held in St. Louis on June 23-27 at theClarion Hotel in downtown St. Louis—just a few short blocksfrom the Arch. Sponsored by the St. Louis Zoo, the St. LouisScience Center, Missouri Botanical Garden, St. Louis ArtMuseum, and the History Museum (Missouri Historical So-ciety), the Conference will include two full days of work-shops held at all five cultural institutions as well as two anda half days of meetings at the Conference Hotel.

The cost for the 1992 Conference will be $150 for Visitor Studies Association members, $175 for non-members, and $75 for students if you register before May 22nd. This includes social events. Prices go up $25 for late registration. Spouses (or significant others) can attend all of the social events for $75.

Speaking of social events, we have a wonderful series of evening activities. On Tuesday, the 23rd, delegates will attend a reception at the Missouri Botanical Garden featuring a tram tour. The Garden has just finished renovating the Climatron and completed construction of a brand new education complex, and is a simply stunning place to visit in June. Wednesday will feature a tour of one of the world's largest breweries, Anheuser Busch, along with a reception and a tour of their exhibit gallery. Make sure you plan on attending our progressive dinner on Thursday, starting at the Historical Society, going on to the Art Museum, and finishing at the Zoo. All three institutions have undergone significant expansion and renovation in recent years, and this five-hour event promises to be one of the highlights of the convention. On Friday you'll tour the Science Center and be among the first visitors to see the completed expansion (opening in June, 1992). You'll also tour the Soviet Space exhibit (appearing in only four institutions in North America) and see a special OMNIMAX show. ALL THIS IS FREE TO DELEGATES! Remember, too, that Saturday is a home game for the Cardinals baseball team, and the stadium is located just across the street from the hotel. Also on Saturday afternoon, after the sessions are over, there is a bus trip planned to the Cahokia Mounds Historic Site and Interpretive Center, the largest prehistoric Indian city north of Mexico.

We have a conference coordinator, David Blum, with a special phone number. David can be reached at (314) 535-8235. We will also feature an exhibitor's area. Exhibit booths can be rented for two full days by calling David. They're available on a first-come basis.

Roger Miles from the Natural History Museum (London) will be our keynote speaker, and we expect to have most of the leading professionals on the program. We are still looking for quality proposals for individual papers and sessions. See the "Final Call for Papers" on page 18.


Suggested Guidelines for Designing Interactive Exhibits

Stephen Bitgood
Jacksonville State University

Well-designed interactive exhibits can be highly effective, but they may fail dismally if they are poorly designed. This article offers a review of the literature from the perspective of what has been learned from interactive devices. Many of the suggestions apply to all exhibits, but several are especially important when the exhibit involves interactive components.

Visitor input is of critical importance to the development of interactive exhibits. Visitor evaluation helps to answer a number of questions. For example, during the planning stage, evaluation can provide information about the knowledge, misconceptions, attitudes, and interests of the potential audience. During the preparation stage, evaluation can provide information about what does and doesn't work. After installation, evaluation can be used as a basis for making final adjustments to improve the effectiveness of the exhibits. The reader is strongly encouraged to consult publications such as Curator, ILVS Review, and past issues of Visitor Behavior for more in-depth considerations. In addition, an article by C. G. Screven (1990) and a book by R. Loomis (1987) provide excellent overviews of visitor evaluation.

Developers of interactive exhibits should also be familiar with several publications that discuss design issues in greater detail than is possible in this article. Kennedy's (1990) User Friendly: Hands-on Exhibits That Work and Levy's (1989) Cogs, Cranks, and Crates: Guidelines for Hands-On Traveling Exhibitions are both published by the Association of Science-Technology Centers (ASTC); they provide information, checklists, and suggestions that should be valuable to an exhibit development team. Norman's (1988) The Psychology of Everyday Things is a wealth of easy-to-read information on human factors—information that can be easily applied to exhibit design. Miles, Alt, Gosling, Lewis, and Tout's (1982) book, The Design of Educational Exhibits, has an excellent chapter on exhibit media, part of which discusses interactive exhibit devices. In addition, there are several articles that attempt to provide overviews and/or suggestions with respect to interactives (e.g., Flagg, 1991b; Screven, 1991; Wagner, 1991). For considerations in designing labels for instruction and interpretation, see Bitgood (1991a) and Serrell (1983).

I define an "interactive exhibit" as a device in which thevisitor's response to the exhibit produces a change in theexhibit. This definition is restricted to physical interactionwith a device; it does not include "mental interaction." Inter-

actives might include something as simple as pressing abutton which illuminates a light or something as complex asa sophisticated interactive computer system. The importantpoint is that there is a visitor-controlled change in the exhibit.Another way to say this is that "the message to be deliveredis, to one degree or another, under the physical control of thevisitor" (Shettel, 1991). This definition distinguishes amongother types of active response exhibits — "simple hands-on"and "participatory." "Simple hands-on" involves responsessuch as touching or climbing. Touching animal fur orclimbing on a gorilla sculpture are examples. "Participatory"involves making comparisons between the visitor's responseand some standard. Assembling a turtle skeleton or compar-ing your jumping distance with that of a cougar are examplesof "participatory." (The standard for the turtle skeleton isevery piece in its correct place). Examples of "interactive"exhibits might include lifting a flip panel to reveal text,pressing a button to change scenes from summer to winter, orholding a magnifying glass over an object to reveal some-thing previously unseen. There is a cause-effect relationshipbetween the visitor response and a change in the exhibit.Table 1 summarizes these differences.

The distinctions among these three types of exhibits are not generally made in the literature. In fact, the terms "hands-on," "participatory," and "interactive" are often used interchangeably. When distinctions have been made, they have not been consistent with the current perspective. However, I believe the distinctions made here are important for two reasons. First, the design guidelines are more complex for interactive exhibits since the visitor-exhibit interface (principles of human factors) must be considered. Control devices and response feedback mechanisms play a critical role in interactive exhibits, but not in simple hands-on and participatory exhibits. A second reason why the distinction is important is that the intended and actual impact of these exhibit types may be different. To minimize confusion, it is helpful to distinguish types of response engagement from the actual or potential outcome on visitors. The right-hand column of Table 1 ("Possible and/or Intended Impact") outlines the impact that these exhibits might have on visitors. As can be seen, these outcomes can be quite different. Simple hands-on and participatory activities may help to focus the learner's attention on the objects, may facilitate affective learning, and may communicate sensory-perceptual knowledge; but these forms of direct response exhibits are probably not as capable of creating cause-effect reasoning such as: "I remove the air from the tube and objects still fall due to gravity instead of floating like I thought they would."

Table 1

TYPE OF RESPONSE ENGAGEMENT: SIMPLE HANDS-ON (Exhibit prompts the visitor to touch, climb, etc.)
Examples:
1. Touching animal fur.
2. Climbing on a statue of an animal.
3. Dressing up in firemen's clothing.
Possible and/or Intended Impact:
1. Produce sensory and/or perceptual learning.
2. Focus visitor's attention on object.
3. Create an increase in interest, a change in attitudes, etc. (affective learning).

TYPE OF RESPONSE ENGAGEMENT: PARTICIPATORY (Exhibit prompts a response and the outcome is used to teach a point by comparing it with some other response or standard; goes beyond simple hands-on)
Examples:
1. Comparing jumping distance (or some other visitor response) with other animals.
2. Feeling several objects and comparing them on characteristics such as coolness, roughness, etc.
3. Assembling a turtle skeleton and comparing with a correct assembly.
Possible and/or Intended Impact:
1. Teach similarities and differences between objects or events.
2. Focus visitor's attention on object.
3. Produce an increase in interest, a change in attitudes, etc. (affective learning).

TYPE OF RESPONSE ENGAGEMENT: INTERACTIVE (Exhibit prompts a response which changes the state of the exhibit; the change is under the control of the visitor.)
LEVEL 1: Simple engagement (e.g., press a button, light turns on)
LEVEL 2: Prolonged engagement (e.g., interactive computer game)
Examples:
1. A label with a flip panel.
2. Devices with controls (buttons, levers, cranks, etc.) in which a response on the control makes a change in the exhibit (lighting, sound, object's position, etc.).
3. Interactive computer tutorials, self-testing devices, games, etc.
4. Magnifiers (magnifying glass, microscope) that when used correctly reveal what was previously unseen.
Possible and/or Intended Impact:
1. Teaching of cause-effect relationships (using either discovery learning or guided learning).
2. Teach similarities and differences between objects, events.
3. Focus visitor attention on object or event.
4. Affective learning (increase in interest, attitude change, etc.).
5. Self-testing of visitors.
6. Conceptual orientation of visitors.

This article attempts to provide guidelines that address two aspects of interactive exhibits: stages of evaluation, and design of the exhibit in terms of the physical device, labels for instruction and explanation, and the visitor-exhibit interface.

Stages of Visitor Evaluation

The Planning Stage

1. Prepare clear and explicit goals and objectives. What do you want to communicate? How will you know if you are successful? That is, what will the visitor be able to say, feel, or do if it works? Is an interactive device the best way to communicate your message? Too often interactive devices are chosen because they are considered in vogue rather than because they are the most effective medium for communicating the message.

Below is a partial list of possible goals one might have for interactive devices:

• Magnifying an image. The goal might be for the visitor to use a microscope or magnifying glass in order to see something that is usually unseen.

• Discovering a physical phenomenon. An example is seeing metal filings form a pattern at the poles of a magnet when the visitor places a magnet over a glass-covered tray of metal filings.

• Comparing objects. For example, the visitor might press a button that alters the scene from one visual image to another in order to compare some property of objects (e.g., Mount St. Helens before and after the volcano exploded).

• Demonstrating a physical action. Pressing a button might activate a vortex of vapor that mimics the air currents of a tornado (Oppenheimer, 1986).

• Demonstrating a concept. Borun (1990) used an interactive device to correct one of the common misconceptions about gravity: that a ball would float if it were in a vacuum. Another example is from Driscoll (1990), who evaluated an interactive computer tutorial that demonstrated phenomena of color and light.

• Focusing visitor attention. An interactive device such as a computer could be used to focus visitor attention on exhibit objects. Worts (1991a; 1991b) described interactive computers that instructed visitors to examine paintings more closely and offered possible interpretations of the meaning of the artworks.

• Visitor self-testing. Interactive computers are also used to allow visitors to self-test their knowledge (e.g., Screven, 1991).

• Describing how things function. For example, a visitor might operate controls that demonstrate how a motor works.

• Orientation to an exhibit area or to the museum. A menu-driven computer that explains exhibit themes serves this function (Morrissey, 1991). More ambitious is the computer system at the Franklin Institute of Science that can prepare individualized tours for visitors (Mintz, in press).

2. Define your audience. Is the exhibit going to be designed for children from ages 6 to 10 years? For all ages? Pinpointing the intended audience will make design and evaluation of the exhibit easier as well as increase its success.

Kennedy (1990) and Oppenheimer (1986) suggested that exhibits be designed for a diverse audience in terms of age, physical size, learning style, level of knowledge, etc. This includes people with physical impairments. Unfortunately, this advice is not always followed. For example, a recently opened exhibit in a major science museum allows the visitor to reach into a plexiglass enclosure to arrange blocks in a pattern similar to a task on popular intelligence tests. However, only small children can reach into the rear of this plexiglass enclosure because a larger person's forearms do not fit. Since this exhibit is appropriate for all ages, it should allow physically larger visitors to use the device.

3. Conduct a front-end evaluation study to determine the audience's pre-knowledge, interests, attitudes, and misconceptions. For example, Minda Borun (1990; 1991) found that visitors shared several misconceptions about gravity. Once identified, the museum was able (for most visitors) to correct these misconceptions with specially designed interactive devices. See Screven (1990) and Shettel (1989) for a more detailed discussion of front-end evaluation.

4. Consider how the interactive device will relate to other exhibits in the area. One must be very careful in designing exhibit spaces. As Melton (1935) concluded, every exhibit element competes with every other element for the visitors' attention. It is important that an interactive device receive its share of attention without dominating the exhibit area to the point that other exhibit displays are ignored. Ambient noise, crowding, and other disruptive stimuli should have minimal negative impact on the visitor's attention to exhibits.

It is comforting to know that computers, if properly used, can effectively direct attention to objects on display rather than compete with those objects (e.g., Worts, 1991a; 1991b). At the Art Gallery of Ontario, interactive computers are used to focus visitors' attention on artworks in the Group of Seven gallery. Worts' evaluation data convincingly demonstrated the effectiveness of this, as well as other, interactive devices.

5. Consider multiple stations. Devices such as interactive computers are popular, but accessibility to the device may be a problem under crowded conditions. When space and funds are available, it is wise to provide several interactive stations in order to allow for greater access (e.g., Kennedy, 1990).


The Preparation Stage

1. As it is being prepared, trial test the device with a sample of visitors. Changes can be made to improve its ability to teach; or, if the device doesn't communicate after several modifications, then a new exhibit concept may be designed. Trial testing is perhaps the most important guideline for interactive devices since it is difficult to anticipate how visitors will use a device unless you test it. Testing of this type is generally called "formative evaluation." McNamara (1990) reported that the majority of exhibits developed at the Virginia Science Museum are initially effective for 10 percent or less of respondents. This percentage rises dramatically with trial testing and revision. For a more detailed discussion of formative evaluation, see Screven (1990).

As part of formative evaluation, one must ensure that the device will attract and hold visitor attention. If visitors do not approach, stop, read, and interact, the exhibit is not likely to deliver its message. Wagner (1991) suggested using eye-catching display titles and color to pull visitors to the exhibit if attraction is likely to be a problem. Time at the exhibit is also dependent on several factors, including the nature of the interactive controls, distracting sights and sounds, the time it takes for the device to reveal the outcome of a visitor response, etc.
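The trial-test-and-revise cycle described above can be sketched as a simple pass/fail check on the proportion of sampled visitors who use a prototype successfully. The 70% criterion and the sample figures below are hypothetical illustrations, not numbers from McNamara's report:

```python
def success_rate(successes, n):
    """Proportion of the observed visitor sample that used the prototype correctly."""
    return successes / n

# Hypothetical formative-evaluation history for one device:
# (revision label, visitors observed, visitors who used it successfully).
trials = [("first prototype", 30, 3),      # roughly 10%, a typical starting point
          ("after revision 1", 30, 14),
          ("after revision 2", 30, 23)]

CRITERION = 0.70   # assumed target set by the exhibit team

for label, n, ok in trials:
    rate = success_rate(ok, n)
    verdict = "install" if rate >= CRITERION else "revise again"
    print(f"{label}: {rate:.0%} -> {verdict}")
```

A team might also track separate rates for attracting, holding, and correct use, since a device can draw visitors yet still fail to communicate.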

The Post-Installation Stage

After the final device has been installed, there is still work to be done. It is important to determine how the interactive device functions in relation to other exhibits in the area and to "fine tune" the device. Evaluation during this stage is necessary because it is difficult to predict the exhibit's effectiveness even if front-end and formative evaluations have been successfully used (Screven, 1990).

1. Even after installation, small, inexpensive changes may often be made to increase the exhibit's effectiveness. Post-installation changes should be conducted in a systematic manner using visitor feedback. "Remedial evaluation" is the term used to describe this type of evaluation (Screven, 1990). Trial testing and revision, similar to formative evaluation during the preparation stage, can make the difference between a resounding success and a disappointing failure. For example, a change in the control device (e.g., from a computer keyboard to a joystick) might improve effective usage of an interactive computer exhibit. While this type of adjustment is ideally implemented during the preparation stage, some problems may not be obvious until after the exhibit is installed.

2. It is important to conduct follow-up checks to determine if the device continues to operate properly. Maintenance over time can be, and usually is, a problem. At some point, a judgment must be made as to the useful life of an exhibit.

Unfortunately, at this time, there is no accepted standard for the rate of "down time" for interactive devices. In fact, to my knowledge, the Saint Louis Science Center is the only institution monitoring "down time" on their interactive devices (Bonner, 1991). This information is extremely important if we are to assess the cost-effectiveness of interactive devices. Eventually some standard of acceptable "down time" may be developed.

3. In addition to measuring time that a device is "out of order," computer-driven devices can be used to record data on the frequency of use, time of use, accuracy of answering questions, whether or not the visitor completed the sequence of operations programmed on the interactive device, and other such information. Armed with this information, more intelligent decisions can be made for redesigning hardware and software, and for future budget projections.
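The kind of automatic record-keeping described above can be sketched in a few lines of code. The field names and sample figures below are illustrative assumptions, not an established logging format:

```python
from dataclasses import dataclass, field

@dataclass
class InteractiveLog:
    """Illustrative usage log for a computer-driven interactive device."""
    sessions: list = field(default_factory=list)   # one record per visitor session
    down_minutes: float = 0.0                      # accumulated "out of order" time
    open_minutes: float = 0.0                      # total time on the exhibit floor

    def record_session(self, duration_s, steps_completed, steps_total, correct, asked):
        self.sessions.append({
            "duration_s": duration_s,
            # did the visitor finish the programmed sequence of operations?
            "completed": steps_completed == steps_total,
            # share of self-test questions answered correctly (None if none asked)
            "accuracy": correct / asked if asked else None,
        })

    def report(self):
        n = len(self.sessions)
        return {
            "uses": n,
            "mean_time_s": sum(s["duration_s"] for s in self.sessions) / n if n else 0,
            "completion_rate": sum(s["completed"] for s in self.sessions) / n if n else 0,
            "down_time_pct": 100 * self.down_minutes / self.open_minutes
                             if self.open_minutes else 0,
        }

log = InteractiveLog()
log.open_minutes = 480   # an 8-hour day, for illustration
log.down_minutes = 35    # repair time that day
log.record_session(95, 8, 8, correct=3, asked=4)
log.record_session(40, 5, 8, correct=1, asked=2)
print(log.report())
```

A summary like this gives exactly the figures the passage calls for: frequency and duration of use, completion rate, answer accuracy, and the down-time percentage needed for cost-effectiveness judgments.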

Exhibit Design Considerations

The Interactive Device

1. Provide implicit cues for responding. Norman (1988) described this as the principle of "visibility." Devices can often be designed so that their visual appearance makes correct usage obvious.

I recently worked with a science museum which had an exhibit called The Human Battery. It was not immediately clear to visitors where they should place their hands, even though there was a diagram showing the correct positioning of hands. Only about 30% of visitors placed their hands on the correct plates. The addition of a simple hand outline on the plates where the hands should be placed dramatically increased the percentage of visitors who were able to use the device correctly.

It is always best for the desired visitor response to be obvious independent of verbal clues, rather than dependent on instructions or illustrations (Kennedy, 1990; Miles et al., 1982; Norman, 1988). For example, Miles et al. (1982) point out that large colorful press-buttons and levers generally provide obvious cues to their function.

2. Effective mapping of controls also helps to make the appropriate response obvious, and it helps to minimize incorrect responses. Mapping refers to the relation between the movement and placement of controls and the effect that the response has on the device. For example, to control several lights one might arrange the controls in the same spatial pattern as the lights. Thus, the light on the right of the apparatus is controlled by the right-hand switch, the light on the left of the apparatus is controlled by the left-hand switch, and so on. The reader is encouraged to read Norman (1988) for positive and negative examples of mapping in common everyday devices.
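The spatial-mapping idea can be made concrete with a toy model (the three-switch layout below is hypothetical): when the left-to-right order of the switches matches the order of the lights they control, a visitor's naive guess is always right; scramble the wiring and every naive guess fails.

```python
# Three lights above three switches, both listed left to right.
lights   = ["left lamp", "center lamp", "right lamp"]
switches = ["left switch", "center switch", "right switch"]

# Good mapping: the spatial order of controls matches the spatial order of
# effects, so switch i operates light i and no instructions are needed.
good_mapping = {sw: lt for sw, lt in zip(switches, lights)}

# Poor mapping: same hardware, but the wiring scrambles the correspondence;
# the visitor must now discover it by trial and error (or read a label).
poor_mapping = {"left switch": "right lamp",
                "center switch": "left lamp",
                "right switch": "center lamp"}

def predicted_errors(mapping):
    """Count switches whose effect differs from the spatially obvious guess."""
    return sum(mapping[sw] != lt for sw, lt in zip(switches, lights))

print(predicted_errors(good_mapping))   # 0: every naive guess is correct
print(predicted_errors(poor_mapping))   # 3: every naive guess is wrong
```

The same check applies to any control panel: the fewer mismatches between layout and effect, the fewer instructions the label has to carry.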

3. Design for durability and ease of maintenance. Interactive devices almost always require more resources than non-interactives. This makes adequate budgeting essential. One of the most frequent mistakes is to fail to budget for trial testing during the preparation stage (see "The Preparation Stage"). In addition, failure to budget for "fine tuning" (see "The Post-Installation Stage") can also result in the failure of an exhibit.

Interactive devices that are "out of order" deliver the wrong message to the public. It is inevitable that interactive devices will be pounded on, hit, kicked, and abused in every other possible way. Thus, the materials used should be as durable as possible. In addition, since maintenance can be time consuming and frequent, the devices should be designed so that they can be easily repaired. For example, access panels can be conveniently placed to allow for maintenance (e.g., Wagner, 1991).

Low-tech devices are in some cases preferred to high-tech ones. For example, there is a low-tech self-quiz on gorillas at the San Francisco Zoo that allows visitors to slide a plastic marker to answer True or False for each question on the self-quiz. Visitors can then check the accuracy of their answers by looking on the rear of the display. The device appears to be highly effective. A high-tech version of a gorilla self-quiz was observed in another zoo; visitors pressed one electronic button for "True" and another for "False." The device then gave feedback on whether or not the response was correct. Unfortunately, the device needed considerable maintenance, and visitor efforts were frustrated when it was inoperative.

4. Plan for safety and comfort. Any device must be safe for visitors of all ages. Loose objects should not become flying missiles. Objects should not break into dangerous pieces. Avoid devices that have sharp edges and hinges that pinch fingers. Don't make the device tempting to climb and thus risk a fall.

You can plan for the comfort of users by providing seating where visitors are likely to spend more than a minute or two, by designing physical equipment so that it is comfortable to use (doesn't require bending, stretching, etc.), and by ensuring that bright lights are not shining in the visitors' eyes.

Instructions and Interpretive Messages

1. Chunking of text. If information is provided in small chunks rather than all at once, it is easier to attract visitor attention and easier for visitors to process the information. Bitgood, et al. (1986) found a substantial increase in label reading when a 150-word label was divided into three labels of 50 words each.

Screven (1986) recommends using interactive devices to "layer" information by dividing it into small chunks and making only a small portion available at one time. One low-tech approach is to use flip panels: major information is presented on the outside of the panel, and lifting the panel reveals secondary information. The high-tech approach, on the other hand, might use computer-layered copy in which the visitor can call up a variety of information on the computer screen. Computer layering allows the visitor to access a greater amount of information as well as providing the opportunity for branching menus. Using a computer only as an encyclopedia, however, should be avoided. One of the great advantages of computers is their tremendous capability of motivating the learner through user-machine interaction.

2. Keep instructional and explanatory labels and diagrams to a minimum. One strategy is to empirically determine the minimum required instructions. Diamond (1991) tested prototypes without labels and added only those instructions necessary to produce the correct response. Remember that the visitor must process a considerable amount of information as he/she approaches the device. If there is too much information to process, the visitor is likely to overlook some of it, resulting in failure to follow the instructions or understand the message. However, while instructions should be kept to a minimum, it is important to remember that if an unfamiliar interactive device is being used, the visitor should be told what the machine does (Miles, et al., 1982).

3. Instructions should be easily available when needed, not buried in text or presented only at the beginning of the sequence. In addition, it is important that instructions are not obscured by the visitor when operating the device (Miles, et al., 1982). Keeping the mental load of the visitor to a minimum is important.

4. Place the instructions where they will be read. Instructions should be placed where they will be noticed and proximal to the controls that must be operated. If they are placed too far away from the controls referred to in the instructions, they may be ignored, or it will be difficult for the visitor to conceptually connect the instructions with the controls.

5. Make the instructions easy to understand. Use simple terms and make sure that they are understandable to your audience. This means trial testing the instructions. Instructions should also be presented in the order that they are to be carried out.
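Screven's layering idea can be sketched in a few lines of Python. This is only an illustration: the topics, wording, and two-level structure below are invented, not taken from any actual exhibit.

```python
# A minimal sketch (assumed structure, not from the article) of "layered"
# label copy: each topic exposes one short primary chunk, and deeper detail
# appears only when the visitor asks for it. All topic text is invented.

LAYERS = {
    "Diet": {
        "primary": "Gorillas are mostly vegetarian.",
        "detail": "An adult male may eat roughly 18 kg of vegetation a day.",
    },
    "Habitat": {
        "primary": "Gorillas live in the forests of central Africa.",
        "detail": "Mountain gorillas range to high, misty elevations.",
    },
}

def menu():
    """Branching menu: list the topics a visitor can choose from."""
    return sorted(LAYERS)

def show(topic, want_detail=False):
    """Return only as much text as the visitor has requested."""
    entry = LAYERS[topic]
    text = entry["primary"]
    if want_detail:
        text += " " + entry["detail"]
    return text
```

Calling show("Diet") yields only the primary chunk; show("Diet", want_detail=True) appends the second layer, so a visitor who stops early still receives a complete, short message.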

6. Minimize the number of instructions (parsimony of instruction). Bitgood (1991b) found that fewer than one-half of the visitors followed the eight steps necessary to observe the demonstration of gravity in the Falling Feather exhibit. If there are too many instructions, visitors may become confused, may give up before all steps have been completed, or are more likely to perform the steps incorrectly. Minimizing the number of instructions does not mean that several simple steps should be presented as a single, more complex instruction.

7. Provide instructions for sensory impaired users. Captioned instructions for the hearing impaired and an audio track for the sight impaired are extremely desirable and may also help those with poor reading skills (Kennedy, 1990).

VISITOR BEHAVIOR, Winter 1991, Volume VI, Number 4, Page 9

Driscoll (1990), when evaluating an interactive computer exhibit at the New York Hall of Science, found that an optional audio narration, in addition to the written text on the monitor screen, was widely used. The redundancy aided poor readers and gave an alternative to those who preferred the hearing mode.

Visitor-Exhibit Interface

1. Anticipate how visitors might make errors and try to minimize these errors with physical or psychological constraints (Norman, 1988). For example, on a computer keyboard it is easy to press an adjacent incorrect key (e.g., "one" instead of "two"). The fact that these two keys are next to one another increases the chance that one will be pushed incorrectly in place of the other. If "one" turns the system on and "two" turns it off, an error can be very costly to the user. By using "one" and "zero" instead of "one" and "two," the possibility of pushing the wrong key is minimized, since the "one" and "zero" keys are far away from one another on the standard keyboard. Another way to reduce user errors is to make controls for different functions look different (Kennedy, 1990). For example, a green pushbutton may be used for starting a device and a red pushbutton for stopping (however, note the possible problem with color blindness).

2. Controls must provide feedback to the user. The user should be told if his/her response is registering in the device by some visual or auditory change, such as a change in the computer screen, feedback text on the screen, or a sound. Interactive devices work even better if redundant feedback is given to users (e.g., Diamond, 1991). Judy Diamond (1991) found that in the exhibit Radioactive Rock, visitors needed redundancy in order to see the effect of radiation. Redundancy included hearing clicks, seeing a red light, and reading a dial to indicate the strength of radiation.

3. Timing of events. How long does it take for the device to be activated once a response is made? Text and graphics should appear as quickly as possible. It is also desirable for visitors to be able to control the speed at which the display responds.

4. Sensitivity of controls. How sensitive are the controls? Are they oversensitive? Menninger (1991) reported that a common complaint in an evaluation of an interactive videodisc at the Getty Museum was an oversensitive touch screen.

5. Selection of controls. Controls may be either mechanical (e.g., wheels, handles, levers, cranks) or electrical (e.g., pushbuttons, trackballs, joysticks). The user's energy is directly transmitted to the exhibit when mechanical controls are used, while electrical controls let the device do the work. Touch screens are easy to master and overcome many of the problems associated with keyboards. Other devices have also proved useful. For example, Driscoll (1990) reported that a trackball device used as a computer control was easy for visitors to master. In addition, when several functions are involved in a task, controls for each function should look different (Kennedy, 1990).

6. Placement of controls. Kennedy (1990) argues that controls should be placed within 10 inches of the front of an exhibit. Trial testing should ensure proper placement.
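The error-prevention idea in guideline 1 (put "on" and "off" on well-separated controls, and constrain the costly action) can be sketched as follows. The key assignments and the confirmation step are hypothetical, not part of any cited exhibit.

```python
# Hypothetical sketch of Norman-style constraints: the two costly actions
# sit on physically distant keys ("1" and "0"), adjacent keys do nothing,
# and the destructive action requires an extra confirmation.

KEYMAP = {
    "1": "power_on",   # far left of the number row
    "0": "power_off",  # far right, so a slip of the finger cannot reach it
}

def handle_key(key, confirm=lambda: True):
    """Map a keypress to an action, ignoring keys with no assigned function."""
    action = KEYMAP.get(key)
    if action is None:
        return "ignored"            # mispressed adjacent keys are harmless
    if action == "power_off" and not confirm():
        return "cancelled"          # psychological constraint on shutdown
    return action
```

Here a mispress of "2" costs the visitor nothing, while turning the system off demands a deliberate second step, which is the kind of layered constraint Norman (1988) describes.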

7. Computer software navigation. It should be easy to navigate through the exhibit program. Ideally, the program should be at the beginning when the visitor approaches; alternatively, it should be obvious how to get to the beginning. Several evaluation studies reported a problem when the device is not reset before a new visitor attempts to use it (e.g., Menninger, 1991; Flagg, 1991a; Mintz, 1990). Flagg (1991b) asserts:

"The most successful interfaces between users and electronic exhibits make it immediately obvious how to navigate through the program. Interfaces that rely on introductory screens may not be as effective, because visitors typically begin a program where someone else left off." (p. 10)
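One common remedy for the reset problem, sketched here under assumed behavior (the timeout value and screen names are invented), is an idle timer that returns the program to its opening screen whenever no input has arrived for a while:

```python
import time

IDLE_LIMIT = 60.0  # seconds without input before resetting (arbitrary choice)

class Kiosk:
    """Toy model of an exhibit program that resets itself between visitors."""

    def __init__(self):
        self.screen = "start"
        self.last_input = time.monotonic()

    def on_input(self, target_screen):
        """Any visitor input advances the program and refreshes the timer."""
        self.last_input = time.monotonic()
        self.screen = target_screen

    def tick(self):
        """Called periodically; jumps back to the start screen when idle."""
        if time.monotonic() - self.last_input > IDLE_LIMIT:
            self.screen = "start"
```

The main loop would call tick() every second or so, so a visitor who walks away mid-program leaves the device out of position for at most IDLE_LIMIT seconds before the next visitor finds it at the beginning.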

8. Perceptual and physical limitations of users. Designers must be aware of the perceptual and physical limitations of the human body (Miles, et al., 1982). Controls and instructions should not be placed too high or too low, since this requires extra work and may interfere with the visitor's performance. See Kennedy (1990) for more detailed anthropometric guidelines relevant to designing interactive exhibits so that they accommodate a wide range of physical sizes of users.

9. Plan for multi-person use. Visitors often use interactive devices as a group. For example, Driscoll (1990) found that visitors tended to share the Color & Light exhibit computer as a group even though it was originally designed for one user at a time. If possible, exhibits should be designed to accommodate this inherent sociability factor. Duensing (1987) reports: "We have noticed at the Exploratorium that not only is it fun for people to do things together at an exhibit, it is also fun to watch others" (p. 141). Providing more than one seat at a station and enough space for others to observe the user should help accommodate group usage of the interactive.

10. Design for the physically disabled. Moveable seats are desirable so that wheelchair-bound visitors can use the exhibit unobstructed. Kennedy (1990) suggests specific dimensions for designing the exhibit table/counter for wheelchair access.

11. Required time of use. It is sometimes difficult to keep visitors at one exhibit for a prolonged period of time; other exhibits may draw visitors away after a minute or two. On the other hand, a successful device might have the opposite effect, resulting in one visitor dominating time on the exhibit. In this case, limiting time on the device may be necessary. For example, the Denver Museum of Natural History has a driving test device in which visitors use a coded plastic card. The device is therefore able to restrict visitors to a single use, enabling others to have their turn.

12. Select meaningful response requirements. Interactive devices can be effective in guiding meaningful outcomes, such as understanding a natural phenomenon or a concept. However, interactives are too often used in a meaningless way. For example, Borun (1977) found that "...pushbuttons are frequently only start buttons and don't allow real interaction with the display. They do not help visitors to perceive significant cause and effect relationships... We conclude from the above that pushbuttons seem to hinder rather than help the communication of scientific facts and principles" (p. 67).

13. The use of controls should be clear. A button is obviously for pushing, a round handle for turning, levers for pulling, etc. (Kennedy, 1990). If necessary, control labels should tell what to do (e.g., "press," "push," "pull"). If there is more than one control, is their sequence obvious? (Wagner, 1991).

Final Thoughts


The suggested guidelines in this article are no substitute for creative thinking. Designing an effective interactive device requires a combination of: well-conceived objectives, creative thinking, knowledge of the principles of visitor-exhibit interaction (human factors), competent engineering, visitor evaluation, and common sense. Any one of these elements is useless without the others. Keep in mind that communicating the message is the most important outcome of effective design. The most creative and clever device will not overcome the lack of appropriate learning objectives. Interactive devices must be used intelligently if they are to have their maximum effect in museums, zoos, aquariums, and other exhibition settings.

There is still much that we need to know about designing effective exhibits, whether they be of the interactive, participatory, simple hands-on, or hands-off type. We believe that the gap between what we need to know and what we currently know can be closed more quickly if visitor researchers, educators, and exhibit designers work together.

References

Bitgood, S. (1991a). The ABCs of Label Design. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Volume 3. Jacksonville, AL: Center for Social Design. Pp. 115-129.

Bitgood, S. (1991b). Evaluation of the Falling Feather Exhibit on Gravity. Visitor Behavior, 6(4), 12-13.

Bitgood, S., Nichols, G., Pierce, M., Conroy, M., & Patterson, D. (1986). Effects of Label Characteristics on Visitor Behavior. Technical Report No. 86-55. Jacksonville, AL: Center for Social Design.

Bonner, J. (1991). Personal Communication.

Borun, M. (1977). Measuring the Immeasurable: A Pilot Study of Museum Effectiveness. Washington, DC: Association of Science-Technology Centers.

Borun, M. (1990). Naive Notions and the Design of Science Museum Exhibits. ILVS Review, 1(2), 122-124.

Borun, M. (1991). Cognitive Science Research and Science Museum Exhibits. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Volume 3. Jacksonville, AL: Center for Social Design. Pp. 231-236.

Diamond, J. (1991). Prototyping Interactive Exhibits on Rocks and Minerals. Curator, 34(1), 5-17.

Driscoll, J. (1990). Exhibit-Link: A Computer Based Interpretation System at the New York Hall of Science. ILVS Review, 1(2), 118-120.

Duensing, S. (1987). Science Centres and Exploratories: A Look at Active Participation. In D. Evered & M. O'Connor (Eds.), Communicating Science to the Public. New York: John Wiley & Sons. Pp. 131-142.

Flagg, B. (1991a). Implementation and Formative Evaluation of Beyond Earth, A Space Adventure. Research Report No. 91-001. Bellport, NY: Multimedia Research.

Flagg, B. (1991b). Visitors in Front of the Small Screen. ASTC Newsletter, Nov-Dec, 9-10.

Hilke, D. D., Hennings, E. C., & Springuel, M. (1988). The Impact of Interactive Computer Software on Visitors' Experiences: A Case Study. ILVS Review, 1(1), 34-49.

Kennedy, J. (1990). User Friendly: Hands-on Exhibits That Work. Washington, DC: Association of Science-Technology Centers.

Levy, S. (1989). Cogs, Cranks, and Crates: Guidelines for Hands-On Traveling Exhibitions. Washington, DC: Association of Science-Technology Centers.

Loomis, R. J. (1987). Visitor Evaluation. Nashville, TN: American Association for State and Local History.

McNamara, P. A. (1990). Trying It Out. ILVS Review, 1(2), 132-134.

Melton, A. W. (1935). Problems of Installation in Museums of Art. New Series No. 14. Washington, DC: American Association of Museums.

Menninger, M. (1991). An Evaluation Study of the Getty Museum's Interactive Videodisc. AAM Program Sourcebook. Denver, CO: American Association of Museums. Pp. 111-120.

Miles, R., Alt, M., Gosling, D., Lewis, B., & Tout, A. (1982). The Design of Educational Exhibits. London: Allen & Unwin Publishers.

Mintz, A. (1990). The Interactive Computerized Information System: Annual Report. Philadelphia, PA: The Franklin Institute Science Museum.


Mintz, A. (in press). The Franklin Institute Computer Network. Spectra.

Morrissey, K. (1991). Visitor Behavior and Interactive Video. Curator, 34(2), 109-118.

Norman, D. A. (1988). The Psychology of Everyday Things. New York: Basic Books.

Oppenheimer, F. (1986). Working Prototypes: Exhibit Design at the Exploratorium. Washington, DC: Association of Science-Technology Centers.

Screven, C. G. (1986). Exhibitions and Information Centers: Some Principles and Approaches. Curator, 29(2), 109-137.

Screven, C. G. (1990). Uses of Evaluation Before, During, and After Exhibit Design. ILVS Review, 1(2), 36-66.

Screven, C. G. (1991). Computers in Exhibit Settings. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Volume 3. Jacksonville, AL: Center for Social Design. Pp. 130-138.

Serrell, B. (1983). Making Exhibit Labels: A Step-by-Step Guide. Nashville, TN: American Association for State and Local History.

Shettel, H. (1989). Front-End Evaluation: Another Useful Tool. AAZPA Annual Proceedings. Pittsburgh, PA: American Association of Zoological Parks and Aquariums.

Shettel, H. (1991). Personal Communication.

Wagner (1991). Some Thoughts on the Process of Creating Interactive Devices. ASTC Newsletter, July/August, 10-11.

Worts, D. (1991a). Enhancing Exhibitions: Experimenting with Visitor-Centered Experiences in the Art Gallery of Ontario. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Volume 3. Jacksonville, AL: Center for Social Design. Pp. 203-213.

Worts, D. (1991b). Technology in Exhibits: A Means or an End? AAM Program Sourcebook. Denver, CO: American Association of Museums. Pp. 339-350.

Note

Thanks to Harris Shettel and Don Patterson for reading an earlier draft of this article and making insightful suggestions.

DO YOU WANT THE 1993 VISITOR STUDIES CONFERENCE IN YOUR CITY??

IF SO, LET US KNOW NOW!

Send your ideas to:

Pat Shettel
14102 Arctic Ave
Rockville, MD 20853
Phone: (301) 871-5516
FAX: (301) 871-6453

Where to get more information about interactive exhibits

ILVS Review: A Journal of Visitor Behavior
This journal contains many articles relevant to interactive exhibits and is published by:

ILVS Publications
611 N. Broadway, Suite 600
Milwaukee, WI 53202
(414) 223-4266

Association of Science-Technology Centers
ASTC publishes a number of monographs dealing with interactive exhibits. For a complete list of ASTC publications write:

ASTC
1025 Vermont Ave, Suite 500
Washington, DC 20005

Spectra
This publication by the Museum Computer Network contains articles relevant to the use of interactive computers.

Museum Computer Network
5001 Baum Blvd
Pittsburgh, PA 15213-1851


Evaluation of the Falling Feather Exhibit on Gravity

Stephen Bitgood, Ph.D.
Jacksonville State University

The exhibit, located in a science museum at the time of this evaluation, attempted to demonstrate that, once air resistance is removed, both light and heavy objects fall at the same rate due to gravitational pull. The apparatus consisted of a large, clear plexiglass tube containing a feather and a small piece of metal shaped like a chicken. The tube was attached to a vacuum pump so that air could be pumped out. An electromagnet, when activated, was supposed to hold the metal chicken against the feather at one end of the tube, which was then rotated so that the objects fell straight down. Three colored buttons were placed to the right of the tube, and directions on how to use the exhibit were placed directly below the buttons.

The following 125-word interpretive/explanatory label was attached to the exhibit on the top left side:

"Galileo discovered that gravity pulls equally on heavy and light objects. The unexpected factor is air resistance.

When the feather and the chicken fall in the tube filled with air, the feather falls slower because it has a larger size relative to its weight and encounters more air resistance.

Remove the air from the tube and a vacuum is created. The feather falls as fast as the chicken. This shows that gravity works just as "hard" in holding giant planets or tiny comets in orbit around the sun.

To test Galileo, Astronaut Dave Scott, on Apollo 15, dropped a falcon feather and a hammer in the airless environment on the moon. They fell slower than they would on Earth — but they arrived at the same time."

Directions were provided which had to be followed in order to observe the feather and the metal weight falling under normal conditions with air and under vacuum conditions after the air was pumped out. To complete the demonstration, a total of eight manipulations had to be made by the visitor:

(1) Turn the tube so that the feather and metal object are on the bottom of the tube.

(2) Press and hold the red button to magnetize the metal chicken and hold down the feather.

(3) Rotate the tube so that the feather and metal chicken are at the top while still being held by the electromagnetic field.

(4) Press the green button to release the magnetic force and compare how fast the metal chicken and feather fall.

(5) Press the blue button to activate a pump to remove the air from the tube.

(6) Repeat step #2 (press and hold the red button).

(7) Rotate the tube so that the objects are at the top.

(8) Press the green button that releases the magnetic field and observe the objects falling in a vacuum.
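The physics the eight steps demonstrate can be checked with a short numerical sketch. The masses, drop height, and drag coefficient below are invented for illustration; only the qualitative pattern (identical fall times in vacuum, a much slower feather in air) reflects the exhibit's point.

```python
G = 9.81  # gravitational acceleration, m/s^2

def fall_time(height, mass, drag_coeff=0.0, dt=1e-4):
    """Seconds to fall `height` metres under gravity with simple linear drag.

    drag_coeff = 0 models the evacuated tube; in that case the result is
    independent of mass and matches t = sqrt(2 * height / G).
    """
    v = y = t = 0.0
    while y < height:
        a = G - (drag_coeff / mass) * v  # net acceleration with linear drag
        v += a * dt
        y += v * dt
        t += dt
    return t

vacuum_feather = fall_time(1.0, mass=0.005)                # 5 g feather
vacuum_chicken = fall_time(1.0, mass=0.2)                  # 200 g metal chicken
air_feather = fall_time(1.0, mass=0.005, drag_coeff=0.02)  # feather in air
```

With drag_coeff = 0 the two objects produce identical times (about 0.45 s for a 1 m drop), while the drag term slows the light feather far more than it would the heavy chicken — exactly the contrast the two halves of the demonstration are meant to show.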

Method

The exhibit was evaluated by direct observation of visitors as they approached and used the exhibit and by interviewing a sample of those observed. A total of 54 visitors were observed, including male (32) and female (4) adults as well as male children (18). During the four hours of recording, no female children were observed interacting with the exhibit. For each visitor observed, the following events were recorded:

(1) The chain of responses when the visitor attempted to observe the gravity phenomenon;

(2) The total time interacting with the exhibit;

(3) The gender and age of the visitor;

(4) Whether or not the demonstration was completed successfully (i.e., did the visitor go through the correct sequence of steps and did the device operate as intended?);

(5) The number of manipulations performed by the visitor (a total of 8 were necessary);

(6) Whether or not the visitor read the explanatory label.

Results

Only 14.8% (8 of 54) completed all eight steps in their proper sequence. Of these 8 individuals, 3 were not able to compare the falling rate of the objects because the feather was not held down by the magnetic force. Consequently, when the tube was rotated, the feather fell before the metal chicken.

Total times and percentage of visitors in each time range at the exhibit were distributed as follows:

0-30 sec (18.5%)
31-60 sec (22.2%)
61-90 sec (3.7%)
91-120 sec (14.8%)
>120 sec (44.4%)


Although a substantial percentage (44.4%) were at the exhibit for more than 120 seconds, only 6 of these 24 individuals were able to successfully observe the gravity demonstration. Only one visitor who stayed under 120 seconds was able to successfully manipulate the exhibit.

The number of total manipulations per visitor also varied. Table 1 shows that 59.2 percent of visitors made fewer than eight (the minimum required) manipulations. Fourteen of the 22 individuals who did make eight or more manipulations were still unable to observe the gravity phenomenon, either because they did not follow the directions correctly or because the electromagnet failed to pin down the feather.

Table 1

Number of        Percent
Manipulations    of Visitors
0-1                0
2-3               25.9
4-5               11.1
6-7               22.2
8-9               14.8
10-11             11.1
12-13              3.7
>13               11.1

(At least 8 manipulations were needed to complete the demonstration successfully.)

Reading of the interpretive label was extremely rare. Only 2 of the 54 visitors read the label, and only one of these readers appeared to read long enough to complete the entire label.

Interviews with visitors found almost no comprehension of the exhibit's message. Visitors who spent a long time at this exhibit appeared frustrated that the message was not clear.

Discussion

The evaluation revealed several problems with the Falling Feather exhibit. First, it showed that few visitors completed the necessary chain of manipulations in order to successfully observe the gravity phenomenon. Perhaps it is unrealistic to think that visitors will spend a minimum of two minutes to complete a complicated sequence of eight steps (see Bitgood, 1991b).

A second problem was that the apparatus did not always work as intended. The electromagnet did not always hold down the feather as required in order to compare the falling rates of the feather and the piece of metal. Trial testing this device during its development might have revealed this problem, and appropriate changes could have been made before final installation (Bitgood, 1991b).

A third problem was the placement and characteristics of the interpretive/explanatory label. It was placed on the far corner of the exhibit in a very unobtrusive location and was consequently overlooked by 52 of the 54 visitors. In addition, the fact that it contained 134 words may have served as a deterrent to potential readers (Bitgood, 1991a). Our research suggests that labels of more than 75 words are read less frequently than shorter labels.

A fourth problem was the proximity of the exhibit to a rocket engine demonstration. Every few minutes a loud rocket engine would fire and distract visitors attempting to interact with the exhibit.

A final problem with the exhibit was the visitors' confusion over the concepts of gravity and air resistance. In a study by Minda Borun (1990; 1991) at the Franklin Institute, it was found that many visitors believed that objects would float if you take away air resistance. It is not surprising, then, that after a few interviews we found that visitors were confused about the relationship between gravity and air resistance.

It was not necessary to observe a large number of visitors in this study, since the major problems were apparent after observing only a few people interacting with the exhibit. This evaluation underscores the importance of trial testing during the development of interactive science exhibits. Had the exhibit been trial tested with visitors while it was being developed, and changes made from this feedback, it is likely that a more effective exhibit could have been produced.

Following this evaluation, the Museum made the decision to remove the exhibit rather than attempt to make it work. However, had this study been conducted as part of a remedial evaluation, it would have been possible to improve the exhibit (Bitgood, 1991; Screven, 1990).

References

Bitgood, S. (1991a). The ABCs of Label Design. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Volume 3. Jacksonville, AL: Center for Social Design. Pp. 115-129.

Bitgood, S. (1991b). Suggested Guidelines for Designing Interactive Exhibits. Visitor Behavior, 6(4), 4-11.

Borun, M. (1990). Naive Notions and the Design of Science Museum Exhibits. ILVS Review, 1(2), 122-124.

Borun, M. (1991). Cognitive Science Research and Science Museum Exhibits. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Volume 3. Jacksonville, AL: Center for Social Design. Pp. 231-236.

Screven, C. G. (1990). Uses of Evaluation Before, During, and After Exhibit Design. ILVS Review, 1(2), 36-66.


Bibliography: Hands-On, Participatory, and Interactive Exhibits

Stephen Bitgood
Jacksonville State University

The following bibliography is by no means exhaustive. However, I hope that it is of use to those who wish to research the topic of interactive exhibits.

Ambron, S., & Hooper, K. (1988). Interactive Multimedia: Visions of Multimedia for Developers, Educators and Information Providers. Redmond, WA: Microsoft Press.

Berrin, K. (1978). Activating the Art Museum Experience. Museum News, 56(4), 42-45.

Birney, B. A. (1988). Brookfield Zoo's "Flying Walk" Exhibit: Formative Evaluation Aids in the Development of an Interactive Exhibit in an Informal Learning Setting. Environment and Behavior, 20(4), 416-434.

Bitgood, S. (1991). Evaluation of the Falling Feather Exhibit on Gravity. Visitor Behavior, 6(4), 12-13.

Bitgood, S. (1991). Suggested Guidelines for Designing Interactive Exhibits. Visitor Behavior, 6(4), 4-11.

Bitzer, D. L. (1968). The Computer: A Flexible Guide to an Art Museum. In E. Bowles (Ed.), Computers and Their Potential Applications in Museums. Proceedings of a Conference at the Metropolitan Museum of Art. New York: Arno Press. Pp. 349-357.

Bork, A. (1980). Interactive Learning. In R. P. Taylor (Ed.), The Computer in the School: Tutor, Tool, Tutee. New York: Teachers College Press. Pp. 53-66.

Borun, M. (1977). Measuring the Immeasurable: A Pilot Study of Museum Effectiveness. Washington, DC: Association of Science-Technology Centers.

Borun, M. (1979). Select-A-Label: A Model Computer Based Interpretive System for Science Museums. Philadelphia: Franklin Institute and Science Museum.

Borun, M. (1983). Enhancing Museum Education Through Computers. Journal of Museum Education: Roundtable Reports, 8(5), 5-7.

Borun, M. (1990). Naive Notions and the Design of Science Museum Exhibits. ILVS Review, 1(2), 122-124.

Borun, M. (1991). Cognitive Science Research and Science Museum Exhibits. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Volume 3. Jacksonville, AL: Center for Social Design. Pp. 231-236.

Borun, M., Flexer, B., Casey, A., & Baum, L. (1983). Planets and Pulleys: Studies of Class Visits to a Science Museum. Washington, DC: Association of Science-Technology Centers.

Bowles, E. (Ed.). (1968). Computers and Their Potential Applications in Museums. New York: Arno Press.

Braman, R. (1987). Exploratorium Cookbook I (Rev. ed.). San Francisco: The Exploratorium.

Bryan, D. (1985). Involvement. Exhibit Builder, 2(7), 18-25.

Bryan, D., & Englehardt, J. (1988). Interactive Exhibit Controls. Exhibit Builder, 6(1), 50-53.

Callison, D. J. (1983). A Simulation of Random Access Video Technology: Reaction to Premastered Multi-Media Interactive Programmed Instruction in a Children's Museum Free Inquiry Learning Environment. Indiana University, Ed.D. Dissertation.

Cary, S., Eason, L. P., & Friedman, A. J. (1979). Summative Evaluation of a Participatory Science Exhibit. Science Education, 63(1), 25-36.

Cash, J. (1985). Spinning Toward the Future: The Museum on Laser Videodisc. Museum News, 63(6), 19-35.

Chambers, M. (1990). Improving the Esthetic Experience for Art Novices: A New Paradigm for Interpretive Labels. In M. McDermott-Lewis (Ed.), The Denver Art Museum Interpretive Project. Denver: Denver Art Museum. Pp. 101-109.

Chambers, M., & Muir, H. (1990). Suitable for Framing: Making Value Judgments About Art. In M. McDermott (Ed.), The Denver Art Museum Interpretive Project. Denver: Denver Art Museum. Pp. 111-120.

Danilov, V. (1984). Early Childhood Exhibits at Science Centers. Curator, 27(3), 173-188.

Decrosse, A., Landry, J., & Natali, J-P. (1987). Explora: The Permanent Exhibition of the Centre for Science and Industry at La Villette, Paris. Museum, 155, 176-91.

Diamond, J. (1991). Prototyping Interactive Exhibits on Rocks and Minerals. Curator, 34(1), 5-17.

Diamond, J., Bond, A., & Hirumi, A. (1989). Desert Explorations - A Videodisc Exhibit Designed for Flexibility. Curator, 32(3), 161-173.

Driscoll, J. (1990). Exhibit-Link: A Computer Based Interpretation System at the New York Hall of Science. ILVS Review, 1(2), 118-120.

Dubose, R. (1973). Sensory Perception and the Museum Experience. Museum News, 52(2), 50-51.

Duensing, S. (1987). Science Centres and Exploratories: A Look at Active Participation. In D. Evered & M. O'Connor (Eds.), Communicating Science to the Public. New York: John Wiley & Sons. Pp. 131-142.


Eason, L. P., & Linn, M. C. (1976). Evaluation of the Effectiveness of Participatory Exhibits. Curator, 19(1), 45-62.

Ellis, G. (1991). Laserdisc: Better Video for Exhibits. Exhibit Builder, Nov-Dec, 28-34.

Erskine, D. J. (1964). Audio-Visual Materials in Interpretation in Yellowstone National Park. Ann Arbor: University of Michigan, M.A. Thesis.

Fazzini, D. (1972). The Museum as a Learning Environment: A Self-Motivating, Recycling, Learning System for the Museum Visitor. Milwaukee, WI: University of Wisconsin-Milwaukee, Ph.D. Dissertation.

Feher, E., & Rice, K. (1985). Development of Scientific Concepts Through the Use of Interactive Exhibits in a Museum. Curator, 28(1), 35-46.

Flagg, B. (1991). Visitors in Front of the Small Screen. ASTC Newsletter, Nov/Dec, 9-10.

Friedman, A., Eason, L., & Snelder, G. I. (1979). Star Games: A Participatory Astronomy Exhibit. Planetarium, 8(3), 3-7.

Gillies, P. (1981). Participatory Science Exhibits in Action: The Evaluation of the Visit of the Ontario "Science Circus" to the Science Museum, London. South Kensington: Science Museum.

Gillies, P., & Wilson, A. (1982). Participatory Exhibits: Is Fun Educational? Museums Journal, 82(3), 131-134.

Graber, J. (1987). Using Computer Stations to Survey Visitors. Current Trends in Audience Research. San Francisco: American Association of Museums Visitor Research and Evaluation Committee. Pp. 11-14.

Hendron, R. (1986). Socrates: An Inexpensive Computer Learning Guide. AAZPA 1986 Annual Proceedings. Wheeling, WV: American Association of Zoological Parks & Aquariums. Pp. 760-762.

Herbert, M. (1981). The Water Pushes It and the Wheel Turns It. Curator, 24(1), 5-18.

Hilke, D. D. (1986). Do I Want a Computer in My Exhibit Hall? Assessing the Impact of Interactive Computer Software on Visitors' Museum Experiences. Preliminary Findings: "The Laser at 25" Evaluation Study. Washington, DC: National Museum of American History.

Hilke, D. D. (1988). Computer Interactives: Beginning to Assess How They Affect Exhibition Behavior. Spectra, 15(4), 1-2.

Hilke, D. D., Hennings, E. C., & Springuel, M. (1988). The Impact of Interactive Computer Software on Visitors' Experiences: A Case Study. ILVS Review: A Journal of Visitor Behavior, 1(1), 34-49.

Hipschman, R. (1990). Exploratorium Cookbook II (4th ed.). San Francisco: The Exploratorium.

Jenkins, D. (1985). A Survey of Interactive Technologies. AAZPA 1985 Annual Proceedings. Wheeling, WV: American Association of Zoological Parks & Aquariums. Pp. 72-79.

Kahn, R. (1978). Computers and Science Museums: A Public Access Model. Parts I and II. People's Computers, July-Aug., p. 38; Sept-Oct., p. 21.

Kennedy, J. (1990). User Friendly: Hands-On Exhibits That Work. Washington, DC: Association of Science-Technology Centers.

Kerr, S. (1986). Effective Interaction in a Natural Science Exhibit. Curator, 29(4), 265-277.

Klevans, M.L. (1991). An Evaluation of an Interactive Microcomputer Exhibit in a Museum Setting. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Vol. 3. Jacksonville, AL: Center for Social Design. Pp. 237-255.

Koran, J.J., Jr., Koran, M.L., & Longino, S.J. (1986). The Relationship of Age, Sex, Attention, and Holding Power With Two Types of Science Exhibits. Curator, 29(3), 227-244.

Korn, R. (1985). The Computer as a Potential Evaluation Tool. Journal of Museum Education: Roundtable Reports, 10(2), 7-9.

Korn, R., & Vandiver, R. (1988). Interactive Labels: A Design Solution. ILVS Review, 1(1), 108-109.

Krulick, J., & Ritchie, M. (1990). Expanding the Novice Experience. In M. McDermott-Lewis (Ed.), The Denver Art Museum Interpretive Project. Denver: Denver Art Museum. Pp. 49-54.

Levy, S. (1989). Cogs, Cranks and Crates: Guidelines for Hands-On Traveling Exhibits. Washington, DC: Association of Science-Technology Centers.

Loomis, R., Goddard, J., D'Agostino, J., Birjulin, A., & Smith, J. (1990). Developing an Evaluation Program for the Denver Museum of Natural History Hall of Life. In Current Trends in Audience Research. Chicago, IL: American Association of Museums Visitor Research & Evaluation Committee. Pp. 23-28.

McManus, P.M. (1987). It's the Company You Keep... The Social Determination of Learning-Related Behaviour in a Science Museum. International Journal of Museum Management and Curatorship, 6, 263-270.

McManus, P.M. (1988). Good Companions: More on the Social Determination of Learning-Related Behavior in a Science Museum. International Journal of Museum Management and Curatorship, 7(1), 37-44.

McNamara, P.A. (1986). Computers Everywhere: But What Happened to the Research? Journal of Museum Education: Roundtable Reports, 11(1), 21-24.

McNamara, P.A. (1990). Trying It Out. ILVS Review, 1(2), 132-134.

Melton, A.W. (1936). Distribution of Attention in Galleries in a Museum of Science and Industry. Museum News, 14(3), 6-8.

Menninger, M. (1990). An Evaluation Study of the Getty Museum's Interactive Videodisc. Program Sourcebook. Denver, CO: American Association of Museums. Pp. 111-120.

VISITOR BEHAVIOR))) Winter, 1991

Miles, R., Alt, M., Gosling, D., Lewis, B., & Tout, A. (1982). The Design of Educational Exhibits. London: Allen & Unwin.

Mintz, A. (in press). The Franklin Institute Computer Network. Spectra.

Mintz, A., & Borun, M. (1990). Linking Visitors to Exhibits and to Science with an Interactive Information System. ILVS Review, 1(2), 117-118.

Morrissey, K. (1991). Visitor Behavior and Interactive Video. Curator, 34(2), 109-118.

Motiska, R. (1991). Technology Brings Health Education to Life. Exhibit Builder, Nov-Dec, 10-18.

Norman, D. (1988). The Psychology of Everyday Things. New York: Basic Books.

Oppenheimer, F. (1986). Working Prototypes: Exhibit Design at the Exploratorium. San Francisco: The Exploratorium.

Peers, B. (1991). Improving the Motivational Power of Museum Dioramas. Ottawa: Canadian Museum of Nature.

Phillips, D. (1988). Recipe for an Interactive Art Gallery. International Journal of Museum Management and Curatorship, 7(3), 243-252.

Pierotti, R. (1973). See... Touch... Respond. Museum News, 52(4), 43-48.

Pizzey, S. (1985). Museums and Galleries, Update 8: Interactive Displays. Architects' Journal, 182(33), 43-44.

Pizzey, S. (Ed.). (1987). Interactive Science and Technology Centers. London: Science Projects Publishing.

Rhees, D. (1981). Exhibits About Computers. Washington, DC: Association of Science-Technology Centers.

Rudy, L. (1991). Designing for the Future at the Franklin Institute Science Museum. Exhibit Builder, Nov-Dec, 6-8.

Schaedlich, P. (1990). Myths, Misconceptions and Models: Whale Watching Through Exhibits. AAZPA 1990 Annual Conference Proceedings. Wheeling, WV: American Association of Zoological Parks & Aquariums. Pp. 531-538.

Schatz, D.L., & Friedman, A.J. (1975). Discovering Astronomy: An Interactive Museum Exhibit. Bulletin of the 146th Meeting of the American Astronomical Society, 7(3), 447-448.

Screven, C.G. (1970). The Programming and Evaluation of an Exhibit Learning System. In K.J. Goldman (Ed.), Opportunities for Extending Museum Contributions to Pre-College Science Education. Washington, D.C.: Smithsonian Institution.

Screven, C.G. (1973). Public Access Learning: Experimental Studies in a Public Museum. In R. Ulrich, T. Stachnik, & J. Mabry (Eds.), The Control of Human Behavior, Vol. 3. Glenview, IL: Scott-Foresman. Pp. 226-233.

Screven, C.G. (1974). The Measurement and Facilitation of Learning in the Museum Environment: An Experimental Analysis. Washington, D.C.: Smithsonian Institution Press.

Screven, C.G. (1974). Learning and Exhibits: Instructional Design. Museum News, 52(5), 67-75.

Screven, C.G. (1986). The Design of Exhibitions and Information Centers: Some Principles and Approaches. Curator, 29(2), 109-137.

Screven, C.G. (1988). A "Self-Test" Computer System for Motivating Voluntary Learning from Exhibitions. ILVS Review, 1(2), 120-121.

Screven, C.G. (1991). Computers in Exhibit Settings. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research and Practice, Vol. 3. Jacksonville, AL: Center for Social Design. Pp. 130-138.

Searles, H. (1987). Interpreting and Evaluating with Micro-Computers. Visitor Behavior, 2(3), 5-6.

Semper, R., Diamond, J., & St. John, M. (1982). Use of Interactive Exhibits in College Physics Teaching. American Journal of Physics, 50(4), 425-430.

Serrell, B. (1991). Profile of an Exhibit: Evaluation Summary of "Darkened Waters: Profile of an Oil Spill." Unpublished report.

Sewell, B., & Raphling, B. (1991). Tool or Toy? An Interactive Computer Program Evaluation: "Otters and Oil." Shedd Aquarium. Unpublished report.

Sharp, E. (1983). Touch Screen Computers: An Experimental Orientation Device at the National Museum of American History. Washington, D.C.: Smithsonian Institution, National Museum of American History, Office of Public and Academic Programs.

Silberglitt, B.S. (1972). The Use of Audio and Prosthetic Devices to Improve and Evaluate Exhibit Effectiveness. Ph.D. dissertation, University of Wisconsin-Milwaukee.

Smith, F. (1982). Planet Ocean: Applying Disneyland Techniques at a Science Museum. Curator, 25(2), 121-130.

Sneider, C.I., Eason, L., & Friedman, A.J. (1979). Summative Evaluation of a Participatory Science Exhibit. Science Education, 63(1), 25-36.

Taylor, D. (1984). Computerizing Science Exhibits. Call-A.P.P.L.E., 7(1), 27-31.

Thier, H.D., & Linn, M.C. (1976). The Value of Interactive Learning Experiences in a Museum. Curator, 15(3), 248-254.

Van Rennes, E.C. (1981). Exhibits Enhanced by Stand-Alone Computers. Bloomfield Hills, Michigan: Cranbrook Institute of Science. [ERIC Document Reproduction Service No. IR 009607]

Van Rennes, E., & Mark, C. (1981). Bridging the Visitor-Exhibit Gap with Computers. Museum News, 60(1), 21-30.

Vanausdall, J. (1986). The Computer as Interpreter. Museum News, 64(3), 73-82.

Nominations for Officers and Board of the

VISITOR STUDIES ASSOCIATION

are currently being accepted!

Deadline: January 31, 1992

Send to: Minda Borun
Franklin Institute of Science
20th St & the Parkway
Philadelphia, PA 19151
Phone: (215) 448-1103
FAX: (215) 448-1364

FINAL CALL FOR PROPOSALS

1992 AAM Poster Session
Sponsored by the

Visitor Research & Evaluation Committee

Deadline for proposals: January 31, 1992.

Send proposals to: Stephen Bitgood
Center for Social Design
P.O. Box 1111
Jacksonville, Alabama 36265

Phone: (205) 782-5640

FAX: (205) 782-5640

CALL FOR PAPERS

Second Annual Museum Education Roundtable Research Colloquium


Wagner, C. (1991). Some Thoughts on the Process of Creating Interactive Devices. ASTC Newsletter, July/August, 10-11.

Wagner, W. (1981). Audio-Visual Controller Synchronizes Museum Display. Electronics, 25 (August), 142.

Webster, S. (1985). Interactive Exhibits at the Monterey Bay Aquarium. AAZPA 1985 Annual Proceedings. Wheeling, WV: American Association of Zoological Parks & Aquariums. Pp. 63-68.

White, H.E. (1967). The Design, Development and Testing of a Response Box, a New Component for Science Museum Exhibits. Berkeley: University of California. HEW Project No. 3148, Contract No. OE6-10-056.

White, J. (1986). More than Just Hands-On! Thoughtfully Developing Participatory Exhibits. AAZPA Annual Conference Proceedings. Wheeling, WV: American Association of Zoological Parks & Aquariums. Pp. 240-245.

White, J. (1987). Getting Involved in Participatory Exhibits. Interpreter, 18(1), 9-11.

Whitman, J. (1978). More than Buttons, Buzzers and Bells. Museum News, 57(1), 43-50.

Wilhelmi, L. (1983). Microcomputer Systems as Exhibit Controls. Curator, 26(2), 107-120.

Worts, D. (1990). The Computer as Catalyst: Experiences at the Art Gallery of Ontario. ILVS Review, 1(2), 91-108.

Worts, D. (1991a). Enhancing Exhibitions: Experimenting with Visitor-Centered Experiences at the Art Gallery of Ontario. In S. Bitgood, A. Benefield, & D. Patterson (Eds.), Visitor Studies: Theory, Research, and Practice, Vol. 3. Jacksonville, AL: Center for Social Design. Pp. 203-213.

Worts, D. (1991b). Technology in Exhibits: A Means or an End? AAM Program Sourcebook. Denver, CO: American Association of Museums. Pp. 339-350.

Zambrano, F. (1987). Computer Design of Spaces and Exhibits in a Museum. Spectra, 14(4), 11.

April 6, 1992
Dillon Ripley Center, Room 3037
Smithsonian Institution

Send abstracts (no more than 3 pages) to:
Annie V. F. Storr, c/o MER
P.O. Box 23664
Washington, DC 20026-3664
(301) 589-6058 or (202) 786-2873

VISITOR STUDIES ASSOCIATION Logo Contest
See "Message From the President" on page 20


VISITOR STUDIES CONFERENCE

June 23-27, 1992

St. Louis, Missouri
CLARION HOTEL

($59 per night)

Workshops: June 23-24 (Tuesday & Wednesday). (Note: there is a separate registration fee.)

Conference Sessions: June 25-27 (Thursday-Saturday)

Registration Fees: Before May 22: $150, VSA members; $175, non-members

After May 22: $175, members; $200, non-members

[Registration and housing forms will be sent to you in early February]

Mark your calendar and plan your budget!

Contact: David Blum. Phone: (314) 533-4849 or (314) 862-7500

FINAL CALL FOR PAPERS

1992 VISITOR STUDIES CONFERENCE

Deadline for proposals is January 31, 1992.

Send all proposals to:

Stephen Bitgood
Center for Social Design
P.O. Box 1111
Jacksonville, Alabama 36265

Phone: (205) 782-5640

FAX: (205) 782-5640


Shettel, from page 20

All of the thirteen committees of the Association have been staffed. I thank all of you who agreed to serve on these committees, with a special thanks to those of you who have multiple assignments. We will certainly try to avoid this in the future, when we have a larger membership pool to draw upon. I am preparing a draft charter for each of the committees and will ask them to be ready with a short report to the membership at the conference in June.

In the last issue of Visitor Behavior there was a "Call For Nominations" notice for the officers of the Association (President, Vice President, Secretary, Treasurer, and board members) with a December 15 deadline. As of now, December 23, we have received zero nominations, which may mean that there are none, or it may mean that the notice was not noticed. In case it is the latter, I am extending the deadline to January 31st. You may nominate yourself. (The full nominating procedure will be explained in a subsequent issue of Visitor Behavior.)

Speaking of nominations, the Site Selection Committee has asked me to request nominations for conference sites for 1993. If your museum would like to be considered as a host institution (ideally, along with others in your area), we need to know as soon as possible. By the time we meet in St. Louis in June we should have a site selected for 1993! See the notice on page 11 for information about where to send your ideas, suggestions, and proposals.

Somewhat less important, but a lot more fun, is the contest we are initiating for the design of a logo for the Association. Members are invited to send their ideas to me; these will then be reviewed by our Logo Selection Committee, consisting of the Officers (4) and Board Members (12). All entries (with names removed) will be ranked by the Committee, and the logo with the highest average ranking will be announced at the June meeting. The winner will be given a free registration to the conference (or the equivalent value if he or she cannot attend). Officers and Board Members are eligible to submit entries. The deadline for all entries is February 15. Put on your creative hat and send us your ideas, as many as you can come up with!

In closing, let me remind you that all VSA members should have received by now their renewal membership notice for 1992 (along with the membership directory). If you have not received it or have misplaced it, a copy is contained in this issue. I urge you to join NOW so that the expense of sending out second notices can be avoided. (Joining now will also ensure the continuation of your subscription to Visitor Behavior through 1992.) Growing membership and conference attendance are the two interlocking keys to our future health and well-being. Don't wait for the conference to join. We need the funds "up front" to support the conference. Are you convinced? Great! We await your check with eager anticipation.

AAM Visitor Research & Evaluation Committee Report

Good news for members of the VR&E (Visitor Research & Evaluation) Committee of the AAM! The program committee for the 1992 AAM meeting in Baltimore approved 8 of our 13 proposed sessions. Topics include: The African American Museum Visitor, Evaluation and Fiscal Responsibility, New Approaches to Visitor Learning Assessment, and Testing Interactive Media Exhibits.

In addition, we will be presenting our poster session and distributing the associated publication "Current Trends." All researchers, scholars, and visitor behavior specialists wishing to participate in this year's visitor studies poster session should contact Steve Bitgood at the Center for Social Design (address on page 1). This is a valuable opportunity to share the results of your work with the museum community.

Also at this year's AAM meeting, VR&E will be co-sponsoring an issues luncheon in collaboration with EdCom (the Education Committee). The luncheon, "Resources and Opportunities for Collaboration," grew out of an observation that educators were becoming increasingly concerned with learning research and might benefit by looking at work that has been done in museums in addition to classroom-based studies. Those attending the luncheon will be asked to look at selected research studies on museum-based learning, to consider to what extent this work is useful in answering the questions of museum educators, and to think about which questions, issues, and needs are not addressed by this work. In preparation for the lunchtime discussion, it is recommended that participants read Visitor Behavior, Volume 4, No. 2, Summer, 1989, a special issue on school field trips, and "What Research Says About Learning in Science Museums," available through ASTC (Association of Science-Technology Centers, 1025 Vermont Ave, Suite 500, Washington, DC). We're looking forward to a lively and informative discussion.

Preceding the AAM meetings will be a two-day workshop on evaluation sponsored by the VR&E Committee. It will cover front-end evaluation, critical appraisal, and formative and remedial evaluation. This is a first for an AAM meeting and will be an exciting experiment in offering professional development opportunities to AAM conference attendees. The workshop will take place on April 23-24, so mark your calendars and spread the word to those who might be interested in taking part. For more information, call me (215-448-1103), Harris Shettel (301-871-5516), or Steve Bitgood (205-782-5640).

Readers who wish to join the AAM Visitor Research and Evaluation Committee should drop a note and $10.00 dues to: Bea Taylor, Treasurer, 610 Somerset Rd, Apt #1, Baltimore, MD 21210. Phone: (410) 235-4027.

Minda Borun, Chair

VISITOR BEHAVIOR
Psychology Institute
P.O. Box 3090
Jacksonville, AL 36265



MESSAGE FROM THE PRESIDENT
Harris Shettel

Visitor Studies Association

The Visitor Studies Association is moving ahead on a number of important fronts, a few of which I will report on here. First, the planning for the 1992 conference in St. Louis on June 23-27 is moving along very nicely, thanks to the excellent team that has been brought together: Jeffrey Bonner (St. Louis Science Center) and Ellen Stokes (St. Louis Zoo), Co-chairs; Mary Beth Mobley (St. Louis Science Center), Secretary; and Tom Turner (St. Louis Science Center), Treasurer.

In addition, David Blum, a professional meeting planner, was brought on board to add his considerable expertise to the planning process. He will remain a part of the team until the conference is over and the final accounting has been completed. As one of his early and critical tasks, David has negotiated a very favorable room rate at the Clarion Hotel of $59.00 per night (plus tax of about 12%). Up to four people can share a room at that same rate. This is an excellent price for a very comfortable, centrally located, downtown hotel, and should allow those with limited budgets (e.g., students) to attend the conference, especially if they are willing to share a room with others. Hotel parking, by the way, is included in the room price. All conference panel sessions will be held at the Clarion Hotel.

Additional information about the cost of the conference (registration fees, etc.) can be found in this issue. Those of you who need to budget in advance (or who work for organizations that need to do so for the '92 budget year) can now make a very good estimate of the total cost of the conference.

Official registration and hotel forms will be sent out after the first of the year.

The very sizable task of putting together a program for the conference has also begun, with Steve Bitgood taking the lead as Chair of the Conference Program Committee. (The final "Call For Papers" can be found on page 18. Note the January 31 deadline!) Poster session proposals are also being solicited.

I should note that at this point we have a very limited number of proposals submitted for the June conference. June may seem like a long way off, but it takes time to review and approve proposals and to organize and publish the program. Please take a few moments to think about what you have done, or about your ideas and opinions on the field, that you would like to share with your colleagues.

Personally, I would like to see more public discussion about some of the issues that we often get excited about in small informal discussions over a beer, but tend to shy away from when asked to present them to a wider audience. Among the subjects that I have heard brought up in such discussions are: general methodological issues (e.g., are we really in the throes of a paradigm shift?); the variety of roles evaluation can play within an institution (e.g., part of, or independent from, the exhibit development process); the role of learning theory in the development and evaluation of exhibits and programs; the importance, and measurement, of the affective as contrasted with the cognitive domain in visitor studies; and the proper role, or roles, of hands-on, participative, and interactive exhibits. Perhaps a debate or public forum format would be an appropriate way to present such topics. But whatever the topic or format, it is essential that you send Steve your ideas ASAP!

See Shettel, page 19