

Swiss Army Assessment

Diana Hartle, Amy Watts

University of Georgia

Assorted Tools for Evaluating Instruction

Measure twice, cut once

• Who or what is being assessed?

• Why is the assessment happening?

• What do you hope to do with your findings?

• Also remember (as if you could forget) your budget, time, and technology constraints.

Paper or Plastic?

How are you going to do your assessment?

• Methodology – quantitative or qualitative?

• What’s your yardstick?

• Collection – Paper? Computerized? Other technology?

For every job, there is a perfect tool

Don’t reinvent the wheel

• Do a literature search in the library literature.

• Web search for other libraries’ initiatives

• Trawl the listservs – if you’ve already searched the archives and nothing’s there, ask!

• Consult colleagues at your own library and other libraries.

Case Study 1: Peer Observation

Who is being assessed? Teaching librarians

Why is the assessment happening? Continuing professional development

What will be done with the findings? Individual librarians can make improvements and adjustments to their teaching methods

Case Study 1: Peer Observation (Continued)

Methodology: qualitative

Measurement: None. This is a descriptive process.

Collection method: Written

Case Study 1: Peer Observation (Continued)

Developing the tool: What have others done?

• Evolution of Peer Evaluation of Library Instruction at Oregon State University Libraries, Appendix A: IS Instruction & Training Checklist for Observations

• Data Gathering Tools

• Suggestions for Working with Your Peer Coach

• Classroom Observation Worksheet

• Peer Teaching: How to Get Started

Tailoring to our own needs

• Peer Observation Guidelines we developed

Case Study 2: WebCT Quiz

Who or what is being assessed? Information literacy and library skills of students in introductory-level classes; the librarians who teach those sessions

Why is the assessment happening? Directive from library administration

What do you hope to do with your findings? Demonstrate the effectiveness and importance of library instruction; improve teaching in these sessions

Case Study 2: WebCT Quiz (Continued)

Methodology: mix of qualitative and quantitative

Measurement: Percentage of questions answered correctly

Collection: Multiple-choice/short-answer online quiz delivered via course management software
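Since the score here is simply the percentage of questions answered correctly, a minimal sketch of tallying that score from quiz responses follows; the answer key, student IDs, and responses are hypothetical placeholders, and a real run would read the exported results from the course management system rather than hard-coded data.

```python
# Minimal sketch: score a quiz as "percentage of questions answered correctly".
# The answer key and student responses below are hypothetical examples, not
# actual WebCT data; in practice these would be loaded from the quiz export.

answer_key = {"q1": "b", "q2": "d", "q3": "catalog"}

student_responses = {
    "student_001": {"q1": "b", "q2": "a", "q3": "catalog"},
    "student_002": {"q1": "b", "q2": "d", "q3": "database"},
}

def percent_correct(responses: dict, key: dict) -> float:
    """Return the percentage of questions a student answered correctly."""
    correct = sum(1 for q, answer in key.items() if responses.get(q) == answer)
    return 100.0 * correct / len(key)

for student, responses in student_responses.items():
    print(f"{student}: {percent_correct(responses, answer_key):.1f}% correct")
```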

Case Study 2: WebCT Quiz (Continued)

Developing the tool

• Committee formed

• Literature review conducted

• Examined samples from other initiatives

• Consulted ACRL Standards for Information Literacy: http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.cfm

• Vetted by colleagues

What else have we put in our toolbox?

Instruction Assessment

• SurveyMonkey quiz for UNIV classes (Nadine Cohen): tinyurl.com/7kek7/ and tinyurl.com/qguh6/

• Half-sheet written evaluation for introductory science classes (Monica Pereira): www.libs.uga.edu/ref/evalform.pdf

• Clear/muddy cards (Laura Shedenhelm)

Our colleagues’ toolboxes

Other assessment initiatives at UGA Libraries

• LibQUAL+ – Survey developed by ARL to assess and improve service in the Libraries.

• Focus groups for website usability

• SLC use of building (focus groups & surveys)

• Science Library User Satisfaction Survey

• Collection Development faculty needs survey

• Library Faculty Job Satisfaction Survey

http://dataserv.libs.uga.edu/assessment/activities.html

Go get your fingernails dirty

Assessment doesn’t have to be painful. Assessment for assessment’s sake will never be fun. But if you go into it with a clear expectation of how you will use your results, everything else will fall into place.

Good luck, and remember: sharp tools are safe tools.
