
Evaluation of Academic Librarians' Instructional Performance: Report of a National Survey

by Patrick Ragains

Some types of library instruction easily allow student learning to be assessed (i.e., the self-paced workbook and credit-bearing library instruction), but there is no generally accepted means of measuring learning gained from the typical one-shot library instruction session. Instead, evaluation of library instruction tends to focus upon attendees' perceptions of the librarian's performance. This study reports the results of a survey of evaluation practices reported by library instruction coordinators at forty-four colleges and universities throughout the United States. Among the respondents' organizations, "reaction data" is most frequently used to assess the effectiveness of library instruction, although it provides little evidence concerning what students have learned. Nearly three-fourths of the library instruction coordinators who answered the survey indicated that evaluative feedback is solicited from students at their institutions. More than one-fourth of the respondents indicated that their libraries use student responses as a basis for performance appraisal, while slightly less than sixteen percent reported that evaluative responses solicited from faculty are used to assess performance. Peer observation of library instruction as an evaluative tool was also frequently reported. Survey results are analyzed and discussed, as are benefits and limitations of various types of evaluation. The author argues that subjective data alone are inadequate to measure student learning, guide programmatic improvements in library instruction, or be used as a basis for librarians' performance appraisals. It is suggested that, due to its

Patrick Ragains is Business and Government Information Librarian, University of Nevada, Reno, NV 89557. E-mail: ragains@unr.edu.

Research Strategies, vol. 15, no. 3, pp. 159-175. © 1997 by JAI Press Inc. All rights reserved.


limitations, course-related library instruction can render little useful evaluative information. Librarians need to identify and implement teaching strategies that increase the amount of meaningful instruction and allow student learning to be assessed.

Evaluation is acknowledged as a weak link in the provision of library instruction.1 This is especially true of "one-shot" instruction, wherein librarians are commonly allocated part or all of a single class period to tell students what they should know to find information, whether to meet an immediate need such as completing a course assignment or to conduct more wide-ranging literature and information searches in a given subject area. Librarians' search for suitable instructional methods, settings, and opportunities is well documented in the professional literature and on the Internet-based BI-L discussion group, and reflects the desire to serve our users, primarily students, in the best manner possible. At the same time, there are interests driving the evaluation of BI that have less to do with student learning than with sensitivity to library instruction as a highly visible forum for academic librarians, a forum that may be treated as a proving ground for individual librarians, their organizations, or even the profession as a whole. Due to these interests, library instruction may be evaluated for multiple purposes: as feedback for individual librarians, for purposes of evaluating program effectiveness, and as evidence to be used in individual performance appraisals.2

There is a large body of literature on the practice of library instruction, and a reasonable, if smaller, amount on evaluating its effectiveness. Much of the literature on evaluation tends either to be based in practice (i.e., "this is how we did it") or prescriptive (i.e., "this is what you can/should do").3 These snapshots of actual or idealized evaluation programs aim to convey methods for measuring student learning, student and faculty satisfaction with library instruction, and the overall effectiveness of an instructional program. Satisfying these criteria may be considered the primary goals of library instruction. A broad range of considerations for developing a program of evaluation is discussed in separate works by King and Ory, Adams, and Lancaster.4 Still, there appears to be a need for wider understanding of existing evaluation practices for library instruction in higher education. It may also be worthwhile to determine if the visibility of library instruction has in fact attracted special attention to it and, if so, to discuss its possible effect on academic librarians. To these ends, the present study of evaluation practices for library instruction was carried out.

THE SURVEY: CONDUCTED IN TWO PHASES

The basis of this study was a survey of library instruction practices, conducted in two stages by the author. The survey, which is reproduced in the appendix, comprised questions related to the evaluation of library instruction, student learning of library and information skills, the uses of evaluative data, the status of librarians at the respondent's institution, the volume of instructional activities, and the availability of relevant training and developmental opportunities for library instruction providers. The survey was first mailed in the fall of 1994 to library instruction coordinators at universities identified as peer institutions by administrators at Montana State University, Bozeman (MSU Bozeman), where I was employed at the time. The incumbent library instruction coordinator at MSU Bozeman also completed the survey. Most of the first round of surveys were completed and returned, but the results appeared to be of limited usefulness, mainly due to the small size of the group initially chosen for the survey. In the summer of 1995 I posted the survey on BI-L, the Internet discussion group for instructional librarians, and re-posted it in the early fall. The BI-L version of the survey included an additional question concerning total student enrollment at the respondent's campus, but was otherwise identical to the first questionnaire.

Figure 1
Number of survey responses by type of institution, compared to total number of institutions of higher education and branches in U.S. and outlying areas, by type and control of institution: 1994-95.12

Institution Type                                        Total    No. of        Percent of Total
                                                                 Respondents   Responding
Public university                                          94        20           21.27
Private university                                         62         1            1.61
Public college (4-yr., some with Master's degrees)        511        10            1.95
Private college (4-yr., some with Master's degrees)     1,548         7            0.45
Community college                                       1,036         6            0.58
Total                                                   3,251        44            1.35

I received a total of forty-four usable responses from libraries in the United States, both through surface and electronic mail.5 Most of the respondents were self-selected from among the subscribers to BI-L and, as such, the survey results cannot be considered a representative sample of library instruction evaluation practices. One can safely presume, however, that the respondents were substantially interested in the topic and perhaps also that major trends would be represented by their answers. Even allowing for the lack of a carefully selected sample of respondents, the results of such a survey seem to warrant the attention of library instruction coordinators, instructional librarians, and their administrators. The survey has identified several trends


in the practice of library instruction and raised issues that may lead to a more informed review of instruction programs in academic libraries.

Characteristics of the Respondents and Their Institutions

As shown in Figure 1, a plurality of completed surveys (twenty) came from public universities. One-half that number were returned from public colleges, slightly fewer from both private and community colleges, and one from a private university. This skewed response rate is partially attributable to differences in the way each phase of the survey was conducted, since the initial mailing went to ten public universities in the intermountain West, accounting for eight of the twenty responses in that category. Apart from this bulge among the respondents from public universities, there is no apparent explanation for the number of respondents in any category.

Respondents were asked to indicate the status of librarians at their institutions (Figure 2). This was a primary interest, since I wished to know if any differences in the evaluation of instructional activities could be attributed to the status of librarians as tenured or non-tenured faculty, or as non-tenured professional staff. As shown below, most of the survey respondents work at

Figure 2
Status of Librarians at Respondents' Institutions

Status                             No. of Respondents    Percentage
Non-tenured professional staff              9                20%
Tenure-track faculty                       26                60%
Non-tenured faculty                         9                20%


colleges or universities where librarians are tenure-track faculty. Some differences among these groups will be discussed below.

ANALYSIS OF THE SURVEY RESPONSES

The following discussion is based on the total survey results and various cross-tabulations, which examine the frequency of particular techniques of library instruction evaluation when used alone, in conjunction with other methods of performance appraisal, or in particular work environments. Selected data appear in Figures 3 and 4, covering several different methods of evaluating library instruction and the uses made of evaluative data, respectively. As previously stated, most respondents were self-selected, so the survey results are not presented as a representative nationwide sample.

Figure 3
Library Instruction Evaluation Methods Reported, Analyzed by Status of Librarians at Responding Institutions
[Chart: Library Instruction Evaluation Methods Reported by Survey Respondents]

Figure 4
Uses of Evaluative Data, Analyzed by Status of Librarians at Responding Institutions

                                               Status of librarians
Data used for:                         Non-tenured           Tenure-track     Non-tenure-track    Totals
                                       professional (n=9)    faculty (n=26)   faculty (n=9)
Feedback to librarian                         5                   11                2               18
Librarian's performance appraisal             2                    5                0                7
Assessment of instructional services          4                    8                2               14

Student Involvement in Evaluation

Because students are the primary audience for library instruction, their impressions of its effectiveness warrant our interest. The first questions on the survey concerned the solicitation of responses from students who had received library instruction. Gathering responses from students concerning their satisfaction with library instruction emerged as the most frequently used type of evaluation. In contrast, students are actually tested on their knowledge and application of library and information search concepts only about half as often as they are asked their general impressions of instructional presentations and materials provided by librarians. Approximately three-fourths of the responding librarians use student evaluations of this kind in their institutions. These methods of evaluating library instruction had been in use for up to thirteen


years, with two years as the median. Student evaluations serve as satisfaction surveys in most of the reported cases, with students often being asked to complete a response form at the end of the library instruction session, before the class is dismissed. Multiple responses were allowed for this question, and some librarians reported that student evaluations are solicited at different times at their institutions, depending on the type of instruction given or the discretion of the individual librarian. In slightly less than half the cases, students may complete and submit an evaluation form at a class meeting soon after the session. An equal number of responses indicated that student evaluations are solicited at or near the end of a course, after students have completed an assignment requiring library use. Timing evaluations to occur after students have had an opportunity to use sources or search techniques covered in a session may allow students to assess the benefits of library instruction more fairly, although students sometimes report that, after several weeks have passed, they cannot remember what was covered in a session. Collecting evaluative data soon after an instructional session may increase the number of responses, but tends to limit the information collected to comments about the style and organization of the librarian's presentation.

A number of copies of library instruction evaluation forms used with students and faculty were submitted by survey respondents and additional samples of such forms were obtained from the LOEX Clearinghouse. These forms typically solicit reactions concerning the appropriateness of a session’s content and the librarian’s manner of presentation (e.g., how skillfully the session was conducted, the librarian’s responsiveness to questions, etc.). The forms are similar in many ways to those given to students near the end of a course, on which students are asked to rate their course instructors.

The "trail" designated for student evaluations of library instruction (i.e., who may see the completed forms) was the subject of another question, since the path taken by evaluations may reveal a level of interest within an organization and the uses made of this data. In all cases the librarian who taught the instructional session is allowed access to student responses. Next in frequency, twenty-one respondents (66% of the group using student evaluations) indicated that a non-supervisory library instruction coordinator may see evaluation forms completed for other librarians' classes. The faculty member teaching the course for which the session was given may see student responses in about 53% of the reported cases, while a librarian's direct supervisor may do so approximately 47% of the time. Last in line are administrative librarians other than one's direct supervisor (such as an assistant dean or director), who are allowed access to this information in only five known instances, representing sixteen percent of the libraries that solicit evaluative responses from students.

As indicated by all but one of these respondents, data from the completed forms is used as feedback for the librarian who delivered a particular session. Student responses are used just as frequently to help judge the effectiveness of the library’s instructional services. In 37.5% of this group, evaluative data


from students is used as a component in individual performance appraisals, including annual, promotion, and tenure reviews. The data is used almost as often to develop profiles of individual librarians’ skill in delivering library instruction. Only one-third of this group asks students to complete evaluation forms at or near the end of the course for which library instruction was given, with over twice as many gathering student responses either immediately or very soon after the librarian’s presentation. Such responses are known to be limited in quality but, because the information is comparatively easy to obtain, appear to be used more frequently than any other means of assessing the impact of library instruction on students.

Teaching Faculty's Involvement in Evaluation of Library Instruction

In course-related BI, we generally rely on faculty to tell us their students' needs concerning library use and information seeking. It is a logical extension of this reliance to ask faculty if our efforts have been satisfactory or whether the results they wanted were achieved. A faculty reaction survey is used in forty-one percent of the respondents' institutions to gather such feedback. Almost three-fourths of this number also survey students concerning their satisfaction with library instruction sessions they have attended. Where used, some form of evaluation of library instruction by teaching faculty has been in place for a median of three years, and the data collected is used by supervisors in performance reviews about forty percent of the time. The paper trail for surveys completed by faculty resembles that of student-completed evaluations, with the librarian who delivered the instruction always having access to the information, and the library instruction coordinator and supervisor being privy to it in over half the applicable cases. Faculty evaluations of library instruction also appear to be more frequently used among responding institutions in which librarians are tenure-track faculty (although one cannot assume this is generally true, due to the small number of survey responses). Here, questions about the quality and nature of campus-wide faculty relationships arise, to which I will devote attention near the end of this paper.

Peer Observation and Other Means of Evaluating Library Instruction

Peer observation has received attention in library-related literature, and it emerged in this survey as a frequently used form of evaluation.6 Although how often it is used is unknown, peer observation is used to some degree in just over half of the respondents' libraries. Peer observation usually serves as a type of formative evaluation, which is an evaluation undertaken to improve a process or program before its conclusion. Such observation and feedback is often incidental to team teaching or other collaborative efforts in delivering library instruction, and less-experienced librarians can gain confidence by working with a colleague to prepare a lesson, by teaching in the presence of a mentor or peer, and by watching peers in action. Observation by the library instruction coordinator (usually a peer) was reported by forty-one percent of


the survey respondents. Similarly, feedback about one's teaching effectiveness is often received informally from faculty who develop close working relationships with one or more librarians. Closely related to such developmental activities are training by one's supervisor or an instruction coordinator and support to attend professional meetings and conferences on teaching, all of which were available at most of the respondents' libraries.

Less frequently used types of evaluation are observation by the supervisor (reported by thirty-six percent of the respondents), and the supervisor’s assessment of handouts prepared for sessions (thirty-two percent). Depending on the circumstances, these types of evaluation may either be formative or summative (i.e., characterizing the evaluation of a finished program).

Effectiveness in Library Instruction as a Factor in Performance Appraisal

Survey respondents were asked to indicate the importance of library instruction and other job responsibilities as factors in annual, promotion, and tenure-related reviews at their institutions. Other job factors covered by this question were reference assistance to users, online or Internet searching, research and publication, collection development, other library-related activities such as planning and analyzing library services, and service to one's institution, community, or profession. Librarians answered this question in a variety of ways, sometimes giving percentages for each category, and in other instances assigning a range of percentages, indicating that they and their colleagues are evaluated differently based on individual responsibilities. One respondent commented that, considering the effort required to plan and deliver library instruction, it was not given enough weight in performance appraisals in her library. A total of twenty-three librarians indicated a specific percentage assigned to library instruction in the appraisal process, ranging from zero to 50%, with a median of 20% (which was also the percentage most commonly reported). A 20% ranking for library instruction in comparison to other performance factors seems quite reasonable, although in some instances library instruction appeared to receive a relatively low valuation in the performance appraisal process due to greater importance being assigned to factors such as electronic information searching or scholarly research and publication.

Reframing the Inquiry: Are We Evaluating an Ineffective Instructional Medium?

It is appropriate for academic librarians to be responsive to their users, to accept accountability for their performance and, when necessary, to seek to improve it. It follows from this that librarians should be accountable for delivering high-quality group instruction. However, given the limitations of the one-hour stand, we should ask ourselves if any useful information is gained by trying to evaluate it.7 Course-related library instruction sessions occur mainly as a result of faculty requests, frequently in a tightly circumscribed environment which allows the librarian relatively little control over the instructional experience (except when it is ceded to the librarian by the course instructor). The expectations of faculty and students are often poorly formulated, as often indicated by users' oversimplified ideas about information sources (e.g., mistaking Hytelnet for a union catalog, or overemphasis of CARL Uncover in comparison to subject-based indexes) and the faulty expectation that library and information search skills can be taught adequately in one class session (or less). This survey indicates that librarians are very seldom able to assess what students learn from group library instruction. There is little basis in such a scenario for useful or fair appraisals of librarians' performance.

Soliciting evaluations from course instructors whose students receive library instruction is also problematic. When evaluated in this way, librarians are exposed to a level of examination not normally experienced by faculty outside the library. Also, it is arguable that librarians' instructional performance is sometimes more closely scrutinized than are other aspects of professional library work such as reference assistance, cataloging, or collection development. Group instructional sessions are a highly visible forum for academic librarians, and some libraries have instituted session-by-session evaluations primarily in order to emphasize librarians' teaching role and show they are accountable as faculty members. Such exercises may unintentionally place librarians and library services at a disadvantage in their campus environments. To request an evaluation for every classroom interaction does nothing to promote confidence in librarians, and may in fact suggest a presumption of incompetence in their ability to address basic instructional needs.8 Where such a Procrustean fit has been implemented, librarians are placed under greater scrutiny in comparison to their colleagues who teach credit-bearing courses, and consequently may be viewed as less deserving of promotion, tenure, and ultimately of faculty status.

While quality instruction can and often does occur during one-hour stands, I believe this type of library instruction is often a poor medium for teaching and learning. As such, there is little reason for the special efforts often made to evaluate it. Instead, better instructional settings are needed, which are planned and delivered by librarians who regularly assist users with their information needs. Credit-bearing library research and information literacy courses have addressed this need, although only a relatively small number of students typically take these classes.9 Librarian-led term paper clinics, usually limited to a single class session, have been initiated in many college and university libraries, yet these often have been too poorly attended to justify their continuation in libraries where staffing levels are either low or diminishing. A lesser-used but potentially effective model involves a hybrid between one-shot, course-related library instruction and the semester-long course in library or information literacy, aimed at students majoring in a given discipline rather than those in individual courses.10 Online and web-based instructional guides and tutorials provide more ways to reach students.11


These types of instruction allow either for longer periods of contact between students and librarians (thereby giving more opportunity to assess students’ learning, if so desired) or, in the case of an instructional web page, simply for more unmediated contact. Such efforts can be used as program components to supplement course-related library instruction and traditional reference desk service.

CONCLUSION AND RECOMMENDATIONS

The purpose of this survey and accompanying analysis was to examine existing practices for evaluating library instruction. Course-related library instruction often occurs in settings which are less than ideal and erode its potential value. In such an artificially prescribed setting, there is little to be gained by using the evaluation methods identified as most common in this national survey. Little useful feedback is available from satisfaction surveys completed by teaching faculty and their students. Indeed, when the data collected is used in performance appraisals, such practices may do harm. More useful in this setting are developmental activities such as team teaching, observation by and guidance from supervisors and peers, and simply becoming better acquainted with teaching faculty. These kinds of developmental activities are characterized by, or naturally encourage, formative evaluation and self-assessment. When library instruction is to be evaluated in a formal manner, the purposes for doing so must be clearly defined and understood by the librarians who provide the service, their supervisors, and administrators. Distinctions must be made between efforts to measure learning outcomes for students (normally determined by test results), the overall instructional program, or the performance of individual librarians. It is ill-advised to use satisfaction surveys as evidence for personnel evaluations, especially in the absence of other sources of information. Fair summative evaluation of librarians' instructional performance requires sustained opportunities to teach, which is not a common characteristic of the one-hour stand. Developing improved assessment of student learning from library instruction, although beyond the scope of the present study, is certainly a worthy undertaking. Any such recommendations or models could profitably draw upon the experience of librarians teaching for-credit courses, in addition to the more general body of literature on the evaluation of learning.

I believe our best opportunity to teach library and information skills resides in more aggressive, proactive planning and delivery of instruction, in contrast to library instruction that is offered only at the request of faculty. Library-initiated instruction, including credit-bearing information literacy courses, subject-based group instruction not tied to individual courses, and World Wide Web-based guides and tutorials, tends to free us from the artificial constraints of one-shot library instruction, which is often provided by default and in the absence of planning to deliver better and more consistent instructional programs. Once such programs are implemented and given a chance to


mature, evaluation of library instruction would begin to outgrow its “nuisance value,” and no longer tend to cast librarians in a position subordinate to their faculty colleagues. Additionally, data gathered for formative and summative evaluations would permit more meaningful assessment of student learning and the teaching effectiveness of librarians.

Appendix Library Instruction Evaluation Survey

1. Does your library currently use an evaluation instrument to measure students’ satisfaction with library instruction? (A copy of your library’s evaluation instrument would be welcome.)

   YES ___  NO ___

2. How long has your library’s basic instrument for measuring student satisfaction with library instruction been in use?

   ___ yrs.

3. When are students asked to complete the evaluation forms?

   ___ a. At the end of a library instruction session, before the class is dismissed.
   ___ b. At a subsequent class meeting, soon after the library instruction session.
   ___ c. At or near the end of the course, after the students have completed any assignments requiring library use.

4. Who has access to the responses of students on the completed forms? Check all which apply.

   ___ a. The librarian who gave the session.
   ___ b. The librarian’s direct supervisor.
   ___ c. Administrative librarians other than the librarian’s direct supervisor.
   ___ d. The library instruction coordinator (check only if this person does not supervise the librarians who deliver instructional sessions).
   ___ e. The faculty member teaching the course for which the session was given.
   ___ f. Others (describe under “Comments” below).




Comments:

5. How is the data on the completed student response forms used? Check all which apply.

   ___ a. As feedback for the librarian who delivered the instructional session.
   ___ b. By supervisors as a component for individual performance appraisals (i.e., annual, promotion and tenure reviews).
   ___ c. To assess the effectiveness of the library’s instructional services.

6. Are student responses compiled and/or analyzed to develop profiles of individual librarians’ effectiveness in delivering library instruction?

YES ___  NO ___

7. As part of your library’s instructional services, are students tested on their knowledge and application of library and information search concepts?

YES ___  NO ___  SOMETIMES ___

Comments:

8. Do you have an evaluation instrument to measure faculty satisfaction with library instruction? (Again, a copy would be welcome.)

YES ___  NO ___

9. How long has the basic instrument for measuring faculty satisfaction with library instruction been in use?

___ yrs.

10. Who has access to the responses of faculty on the completed forms? Check all which apply.

___ a. The librarian who gave the session.





___ b. The librarian’s direct supervisor.
___ c. Administrative librarians other than the librarian’s direct supervisor.
___ d. The library instruction coordinator (check only if this person does not supervise the librarians who deliver instructional sessions).
___ e. Others (describe under “Comments” below).

Comments:

11. How is the data on the completed faculty response forms used? Check all which apply.

   ___ a. As feedback for the librarian who delivered the instructional session.
   ___ b. By supervisors as a component for individual performance appraisals (i.e., annual, promotion & tenure reviews).
   ___ c. To assess the effectiveness of the library’s instructional services.

12. Are faculty responses compiled and/or analyzed to develop profiles of individual librarians’ effectiveness in delivering library instruction?

YES ___  NO ___

13. Below, indicate other means used at your institution to evaluate librarians’ instructional performance. Check all which apply.

___ a. Supervisor observation of sessions.
___ b. Supervisor’s assessment of handouts prepared for sessions.
___ c. Peer observation of sessions.
___ d. Observation by the library instruction coordinator.
___ e. Other (describe under “Comments” below).

14. Please indicate the status of librarians at your institution:

   ___ a. Non-tenured professional staff
   ___ b. Tenure-track faculty
   ___ c. Non-tenured faculty





15. Using percentages, please indicate the importance of the factors below in assessing the performance of reference librarians at your institution. (A range of percentages may be indicated if reference & instruction librarians are evaluated differently based on their particular responsibilities.)

___ % Assistance to users at reference desk or similar service points
___ % Online or Internet searching
___ % Library instruction
___ % Collection development
___ % Other library-related activities (e.g., planning/analyzing library services)
___ % Service to the campus, community & profession
___ % Research and publication

16. At your institution, how many librarians contribute to the library’s instructional services program by making presentations to classes and other groups?

17. a. Excluding general tours, approximately how many sessions are delivered each academic year to students enrolled at your institution?

no. of sessions

b. Are all of these sessions evaluated?

YES ___  NO ___

Comments:

18. a. Excluding tours, approximately how many sessions are delivered each academic year to clientele who are not your primary users (e.g., high school classes, non-campus organizations, business associations)?

no. of sessions

b. Are these sessions evaluated?

YES ___  NO ___  SOMETIMES ___





Comments:

19. What training or development opportunities are available to help librarians improve their teaching skills? Check all which have been available during the last two years.

___ a. Training by the library instruction coordinator or supervisor
___ b. Training by peers
___ c. Support to attend professional meetings and/or conferences on teaching
___ d. No training or development activities are regularly offered
___ e. Other (describe under “Comments” below)

Comments:

20. What was your institution’s enrollment in the most recent academic year?

Below, please provide your name, position title, address, telephone number and Internet address.

Name

Position

Institution

Address

Telephone no.

Internet address

Additional information and comments regarding your library’s instructional program are welcome.



NOTES AND REFERENCES

1. Caroline Rowe, “Modern Library Instruction: Levels, Media, Trends, and Problems,” Research Strategies 12 (Winter 1994): 14.

2. Mignon S. Adams, “Evaluation,” in Sourcebook for Bibliographic Instruction, eds. Katherine Branch et al. (Chicago: ACRL Bibliographic Instruction Section, 1993), p. 45.

3. Diana Shonrock, ed., Evaluating Library Instruction: Sample Questions, Forms, and Strategies for Practical Use (Chicago: American Library Association, 1996); two comprehensive reviews of the literature on evaluating library instruction are: Richard Hume Werking, “Evaluating Bibliographic Education: A Review and Critique,” Library Trends 29 (Summer 1980): 153-72; and Christopher Bober, Sonia Poulin, and Luigina Vileno, “Evaluating Library Instruction in Academic Libraries: A Critical Review of the Literature, 1980-1993,” The Reference Librarian, no. 51/52 (1995): 53-71.

4. David N. King and John C. Ory, “Effects of Library Instruction on Student Research: A Case Study,” College & Research Libraries 42 (1981): 31-41; Adams, “Evaluation,” pp. 45-57; F.W. Lancaster, If You Want to Evaluate Your Library... (Champaign, IL: University of Illinois, Graduate School of Library and Information Science, 1993): 220-256.

5. Two responses were received from Canadian libraries, although these were not used in analyzing the total results.

6. Lee-Allison Levene and Polly P. Frank, “Peer Coaching: Professional Growth and Development for Instruction Librarians,” Reference Services Review 21, no. 3 (1993): 35-42; Paul Burnam, “Fine-tuning Classroom Technique: A Peer Coaching Experience (at Ohio Wesleyan University),” Research Strategies 11 (Winter 1993): 42-46.

7. Jean Sheridan, “The Reflective Librarian: Some Observations on Bibliographic Instruction in the Academic Library,” Journal of Academic Librarianship 16 (1990): 22-26; Tom Eadie, “Immodest Proposals,” Library Journal 115 (October 1990): 43.

8. In consideration of this point, it is worthwhile to note that an ERIC search for literature on evaluating guest lecturers (not limited to librarians) yielded no relevant citations.

9. Examples of credit-based library instruction known to the author include two courses on government publications taught by librarians employed in the Documents and Maps Section of the Pennsylvania State University Libraries; general information literacy courses taught at Montana State University, Bozeman and the University of New Mexico Libraries; a course at SUNY Plattsburgh, reported in Carla List, “Branching Out: A Required Library Research Course Targets Disciplines and Programs,” The Reference Librarian, no. 51/52 (1995): 385-98; and a discipline-based course titled Agricultural Information Literacy at Montana State University, Bozeman.

10. This model provided the basis for a revamped library instruction program at Montana State University, Bozeman. Scheduled to be implemented in the 1996/97 academic year, it has been postponed, although the author has used the model’s basic framework for providing orientations to government information at the University of Nevada, Reno Libraries.

11. World Wide Web-based library instruction tutorials available at the time of this writing include sites at New Mexico State University <http://library.nmsu.edu/projects/tutorial/index.html>, Cornell University <http://urislib.library.cornell.edu/tutorial.html>, Ohio State University <http://www.lib.ohio-state.edu/Gateway/>, and Johns Hopkins University <http://www.welch.jhu.edu/Education/tutorials/practicum.html>.

12. Source for total numbers of institutions: “Institutions of Higher Education and Branches, by Type, Control of Institution, and State: 1994-95,” Digest of Education Statistics, 1995 (Washington: Government Printing Office, 1995): 248. The author also acknowledges Graduate Assistant Marisa Dhaneswongse for her assistance in creating Figures 1-4.