
“But How Do We Know It’s Working?”

Designing Effective Evaluation of Web-Based Information Literacy Instruction

Overview of Presentation Topics

Part I: Literature Review
(1) Introduction
(2) Evaluation Methods
(3) Motivations for Assessment
(4) Observed Trends in Library Instruction Evaluation

Overview of Presentation Topics

Part II: Laurier’s Web-Based Information Literacy Instruction: Overview of Our Assessment Plan
(1) Overview of Instruction Provided
(2) Description of Assessment Plan: Goals, Scope, Design, Implementation, and Results
(3) Use of Assessment Results to Develop Good Practices and Enhance Future Web-Based Instruction and Assessment

Literature Review: Introduction

• Literature replete with articles on assessment of library instruction.

• Much written on assessment of the one-shot session; less on course-integrated information literacy.

• Many articles on assessment of in-class instruction but a marked lack of information on assessing library instruction in a Web-based environment.

• Motivations for assessment are many and strong in today’s world of higher education.

• A growing transition from the one-shot session to course-integrated, for-credit instruction has necessitated but also made possible a shifting emphasis in assessment methodology away from subjective towards objective evaluation.

Literature Review: Evaluation Methods

Definitions

• Formative/Subjective Evaluation: Measures the quality of instruction; typically focuses on user satisfaction. Focuses on reactive data in the form of written or verbal feedback. Doesn’t show what the student actually learned. The predominant form of assessment in academic libraries.

• Summative/Objective Evaluation: Aims to gauge what has been learned or the extent of improvement in competencies or skills.

Literature Review: Evaluation Methods

Assessment Instruments

• Formative/Subjective Evaluation. Some examples: surveys (user satisfaction surveys, attitude surveys); peer-evaluation forms; interviews; focus groups.

• Summative/Objective Evaluation. Some examples: in-class assignments or tests; the pre-test/post-test method.

Literature Review: Motivations for Assessment

• Education Reform Movement of the 1980s provided the impetus for change.
• Higher education accreditation bodies/standards.
• ACRL Standards & Task Force on Academic Library Outcomes Assessment.
• Shrinking library budgets.
• Push from campus administrators to demonstrate accountability.
• Gain faculty support for instructional efforts.
• Adds credibility to librarians’ work.
• Helps librarians gain a clearer focus on the reasons for and goals of instructional activities.

Literature Review: Library Instruction Evaluation: Observed Characteristics and Trends

• Not uncommon for no evaluation to be in place.
• Typically involves an informal approach (formative) rather than the application of rigorous scientific methodology; the realities of the work situation can mean that it is not always possible to fit subjects to an ideal experimental design.
• Often seen as elusive.
• Libraries have typically demonstrated greater strengths in gathering quantitative rather than qualitative data.
• Perceived/actual lack of definitive measures.
• Lack of resources/time: very often undertaken on the job by regular library staff.
• Lack of skills.

Literature Review: Library Instruction Evaluation: “Doing the Best You Can With What You Have”

Barclay, 1993

• Shouldn’t wait for the perfect time or the perfect instrument.

• Can be a modest study, as long as it is well designed and results in hard data.

Literature Review: Changing Instructional Environment: Implications for Assessment

The move toward for-credit, course-integrated information literacy has resulted in the following emerging trends:
• Makes summative assessment more possible and meaningful, i.e. fosters a learning-outcomes approach.
• Makes it easier to develop assessment that is ongoing, involving a series of linked activities.
• Introduces new challenges: information literacy stresses higher-order and critical thinking skills, which call for new measures.
• Effective assessment relies on extensive cooperation between faculty, librarians, and other parties; it is not just about the library!
• Information literacy increasingly involves multi-year, multi-tiered instruction, requiring evaluation that is more complex and ideally implemented at the program level.
• Arguably, current instructional trends make evaluation more important than ever before!

Laurier Library’s Assessment Plan

(1) Overview of Instruction Evaluated
(2) Description of Assessment Plan: Goals, Scope, Design, Implementation, and Results
(3) Use of Assessment Results to Develop Good Practices and Enhance Future Web-Based Instruction and Assessment

Overview of Instruction Evaluated

• Web-based instruction delivered via WebCT courseware (content modules, self-tests, and quizzes). The only in-class components were an introduction to the tutorial (including the presurvey and pre-test) and the evaluation survey.
• Comprised five modules: Defining Your Research Topic; Information Requirements; Looking for Information; Our Catalogue: TRELLIS; and the Internet.
• Generic as opposed to subject-specific.
• Targeted primarily at first-year students.

Overview of Instruction Evaluated (Cont’d)

• Focused on teaching basic information competencies, as defined in the objectives outlined at the beginning of each module.

• Integrated within three different course areas at the request of faculty: Communication Studies (1st-year class, approx. 320 students, and 2nd-year class, 62 students); Geography (1st-year class, approx. 520 students).

• Required component of the course and counted for marks (5% in Communication Studies; 16% in Geography).

Goals of Laurier Library’s Assessment Plan

Combine elements of objective/summative assessment and subjective/formative assessment with the following key goals:
1. Establish what students had learned.
2. Obtain insights into students’ perceptions of what and how much they had learned, and how useful/effective the tutorial was in helping improve their research skills.
3. Determine the effectiveness of librarians in responding to problems/queries.

Scope of Laurier Library’s Assessment Plan

• Though tutorial content was delivered entirely online, not all assessment tools were Web-based.

• Limited resources dictated a fairly modest approach; we aimed to do the best we could with what was available to us.

• Online delivery of instruction did mean certain assessment techniques had to be ruled out.

• Not introduced at the program level - just focused on one instructional tool targeting one specific audience.

Summative Assessment: Design, Implementation, Results

• Quizzes

• Pre-Test/Post-Test

Quizzes: Overview

• Multiple-choice questions.
• Used WebCT quizzes for the Communication Studies classes (one quiz per module = 5 quizzes), worth 5% of the grade. We liked WebCT’s automatic grading feature, its capability for random question generation (sketched below), its report/statistics features, and the ability for students to do quizzes on their own time from on- and off-campus locations.
• At the request of the professor, Geography students had an in-class multiple-choice test (comprising 40 questions, including the post-test questions), worth 16% of the grade.
• Quiz questions aimed to test whether students had grasped the knowledge/skills defined in the learning objectives for each module.
• Focus on testing knowledge of facts and concepts rather than evaluating students’ ability to apply skills.
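Two of the WebCT features mentioned above, automatic grading and random question generation, are worth unpacking. The sketch below shows the general idea in Python; it is a minimal illustration, not WebCT code, and the question pool, options, and scoring rule are invented for the example.

```python
import random

# Hypothetical question pool (the real tutorial kept one pool per module).
# Each entry: (question text, answer options, index of the correct option).
POOL = [
    ("Which Boolean operator broadens a search?", ["AND", "OR"], 1),
    ("What does a call number tell you?",
     ["Where an item is shelved", "How many copies the library owns"], 0),
    ("Which tool helps you find journal articles?", ["An index", "An atlas"], 0),
]

def draw_quiz(pool, n):
    """Randomly select n questions, so each student sees a different quiz."""
    return random.sample(pool, n)

def grade(quiz, answers):
    """Automatic grading: answers is the list of option indexes a student chose."""
    correct = sum(1 for (_, _, key), chosen in zip(quiz, answers) if chosen == key)
    return 100.0 * correct / len(quiz)

quiz = draw_quiz(POOL, 2)
print(grade(quiz, [1, 0]))  # the student's score out of 100
```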

Quiz Results: Overview

• Average Grade: CS100 - 73%; GG100 - 70%; CS202 - 76%
• Rate of Non-Participation: CS100 - 14%; GG100 - 6%; CS202 - 21%
• Comparison of individual quiz results (average across the three course areas): Quizzes 2 & 3 = >80%; Quiz 1 = 79%; Quiz 5 = 69%; Quiz 4 = 56%

Quiz Results: Further Breakdown

High Competency Areas (over 80%)

1. Defining a Research Topic (Communication Studies).

2. Information Requirements: Defining characteristics of most of the key information resources and basic search tools identified in the tutorial.

3. Search Strategies: Phrase searching; author/editor searching.

4. Understanding LC call numbers.

5. Internet: Broad definition of Internet (including limits); interpreting URL; identifying who produces Internet information.

Medium Competency Areas (50%-80%)

1. Defining a Research Topic (Geography).

2. Information Requirements: Location of some library materials, especially periodicals and government information.

3. Search Strategies: Boolean operators & truncation.

4. Catalogue: searching a book title or newspaper article citation; correctly identifying citations - chapters in books or journal articles.

5. Internet: specialized search engines; identifying the most appropriate Internet search tool for a given search from a supplied list; difference between resources a library offers and the free Internet.

Low Competency Areas (below 50%)

1. Information Requirements: Difference between peer-reviewed and popular resources.

2. Catalogue: Searching citations (journal articles, chapters in books) in the catalogue; difference between keyword and subject searching.

3. Internet: Selecting, from a list of resources, those most likely to be freely available on the Internet; definition of the Invisible Web.

Pre-Test/Post-Test

• Researched design/implementation of pre-tests/post-tests at other higher education institutions (journal literature, Web).

• Goal was to test the following core competencies through carefully selected multiple-choice questions:
• Organization of materials in an academic library (call numbers).
• Understanding of what a periodical is.
• Search tools: the definition/function of a library catalogue, a bibliography, and an index.
• Search strategies: the purpose of truncation and Boolean operators (see the sketch below).
• Ability to interpret citations for journal articles and chapters in books, in order to search for them accurately in a library catalogue.
• Search engines (capabilities and limitations).

• Some small changes were made to this test after the first term to rectify minor problem areas.
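Since truncation and Boolean operators recur throughout the results below, here is a minimal sketch of the logic behind those two strategies (the three-document “collection” is invented purely for illustration): Boolean operators combine result sets, while truncation matches every word sharing a stem.

```python
# Hypothetical mini-collection: document id -> the words it contains.
DOCS = {
    1: {"library", "catalogue"},
    2: {"library", "internet"},
    3: {"librarian", "internet"},
}

def search(term):
    """Return the set of document ids containing the exact term."""
    return {doc for doc, words in DOCS.items() if term in words}

def truncated(stem):
    """Truncation, e.g. librar* matches library, librarian, libraries."""
    return {doc for doc, words in DOCS.items()
            if any(w.startswith(stem) for w in words)}

print(search("library") & search("internet"))  # Boolean AND -> {2}
print(search("library") | search("internet"))  # Boolean OR  -> {1, 2, 3}
print(truncated("librar"))                     # librar*     -> {1, 2, 3}
```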

Pre-Test/Post-Test: Implementation

• Pre-test administered in class (captive audience) to ensure a high rate of participation. Students were asked to put their names on the test. Manually corrected in the first term; answer sheets graded by computer in the second term.
• Post-test was one of the WebCT quizzes. Counted for marks during the second term but not the first. 29% of CS students did not do the post-test during the first term. During the second term, only 6% of Geography students skipped these questions, but 36% of second-year CS students did not complete the post-test despite the fact that it was worth marks.

Pre-Test/Post-Test: Significant Increase in Overall Results

Communication Studies 100
• Pre-test (265 students): 43%; post-test (231 students): 63%; increase: 20 points
• 211 students (66%) did both tests
• 10 students saw no change in grade; 7 saw their grade fall
• Average increase in grade per student = 2.8, or 23%

Geography 100
• Pre-test (407 students): 58%; post-test (490 students): 73%; increase: 15 points
• Not possible to generate per-student statistics for the post-test

Communication Studies 202
• Pre-test (25 students): 53%; post-test (41 students): 77%; increase: 24 points
• 21 students (33%) did both tests; 2 saw their grade fall
• Average increase in grade per student = 2, or 25%
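As a worked illustration of how the per-student figures above can be computed from matched pre/post scores, here is a minimal sketch (the score lists are invented; the relative gain shown is one plausible reading of the “or 23%” style of figure, not necessarily the presenters’ exact calculation):

```python
# Hypothetical matched scores (percent) for students who wrote both tests.
pre  = [40, 50, 45, 35, 60]
post = [65, 70, 55, 50, 60]

mean_pre = sum(pre) / len(pre)
mean_post = sum(post) / len(post)
print(f"mean pre-test {mean_pre:.0f}%, mean post-test {mean_post:.0f}%, "
      f"increase {mean_post - mean_pre:.0f} points")

pairs = list(zip(pre, post))
unchanged = sum(1 for p, q in pairs if q == p)
fell = sum(1 for p, q in pairs if q < p)
print(f"{unchanged} student(s) unchanged, {fell} student(s) fell")

# Average per-student gain, absolute and relative to the pre-test mean.
gains = [q - p for p, q in pairs]
avg_gain = sum(gains) / len(gains)
print(f"average gain per student: {avg_gain:.1f} points "
      f"({100 * avg_gain / mean_pre:.0f}% relative to the pre-test mean)")
```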

Pre-Test/Post-Test Results: Further Breakdown

High -> High (>70% in both)
• Call numbers (GG100 & CS202) - Question 1
• How to identify books in the library - Question 2
• Definition of periodicals - Question 3

Medium (55%-69%) -> High (>70%)
• Call numbers (CS100) - Question 1
• Definition of a bibliography - Question 7
• Definition of an index (GG100) - Question 9
• Internet search engines - Question 10

Weak (<45%) -> High (>70%)
• Truncation - Question 8
• Definition of an index (CS100) - Question 9

Weak (<45%) -> Medium (55%-69%)
• Boolean operators (AND, OR) - Question 6
• Definition of an index (CS202) - Question 9

Weak (<45%) -> Weak (<45%)
• Searching a journal article citation in the catalogue - Question 4
• Searching for a chapter in a book using the catalogue - Question 5

Formative Evaluation

1. Presurvey (Library Usage/Experience)
2. Evaluation Survey (User Satisfaction)
3. Usability Testing/In-Depth Survey

Presurvey

• Administered alongside the pre-test during class time.
• Designed to gather information on students’ previous library/information resource usage and experience.

• Students asked five questions: (1) whether they had used an online library catalogue before; (2) how often they use libraries; (3) whether they had previously received library instruction; (4) if Internet access is available from home or residence room; (5) how often they use the Internet

• CS100 (265 responses); GG100 (407 responses); CS202 (25 responses)

Presurvey: Results

Have you used an online library catalogue before?
CS100: Yes - 64%; No - 36%
GG100: Yes - 84%; No - 16%
CS202: Yes - 76%; No - 24%

How often do you use libraries?
CS100: Frequently - 11%; Occasionally - 56%; Rarely - 30%; Never - 3%
GG100: Frequently - 9%; Occasionally - 55%; Rarely - 34%; Never - 2%
CS202: Frequently - 12%; Occasionally - 40%; Rarely - 44%; Never - 4%

Have you ever received instruction in the use of library and information resources?
CS100: Yes - 69%; No - 31%
GG100: Yes - 59%; No - 41%
CS202: Yes - 60%; No - 40%

Do you have access to the Internet at home or in your residence room?
CS100: Yes - 88%; No - 12%
GG100: Yes - 93%; No - 7%
CS202: Yes - 88%; No - 12%

How often do you use the Internet?
CS100: Frequently - 69%; Occasionally - 28%; Rarely - 2.7%; Never - 0.3%
GG100: Frequently - 80%; Occasionally - 16%; Rarely - 3.75%; Never - 0.25%
CS202: Frequently - 76%; Occasionally - 20%; Rarely - 4%

Evaluation Survey

• Combination of short answer/closed questions (7 in total) and open-ended questions (3 in total).

• Designed to determine student attitudes/perceptions in terms of usefulness/effectiveness of tutorial as a learning experience.

• Distributed by librarians, faculty, or TAs after students had completed the tutorial and the deadline for submission of quizzes had elapsed.

• Results processed manually with some student assistance.

• Response Rates: CS100 - 223 (69%); Geography - 399 (77%); CS202 - 36 (58%).

Evaluation Survey: Results

How much of the Web tutorial did you already know?
CS100: Most (10%); Some (83%); None (7%)
GG100: Most (24%); Some (69%); None (7%)
CS202: Most (22%); Some (61%); None (17%)

If you knew some of this already, please specify what. (Only asked in the second term.)
GG100: circa 25%: the Internet, especially how to search it, and searching TRELLIS, especially for authors and books; 13%: basic search skills, especially Boolean; 9%: journal articles
CS202: 28%: TRELLIS searching; 17%: the Internet; 11%: basic search strategies; 8%: location of materials

Feelings about the length of the tutorial:
CS100: Too long (89%); About right (11%)
GG100: Too long (54%); About right (46%)
CS202: Too long (76%); About right (24%)

Number of hours taken to complete the tutorial:
CS100: Average 5.3 hours
GG100: Average 4.2 hours
CS202: Average 3.8 hours

Evaluation Survey: Results

Did you experience technical difficulties when doing the tutorial?
CS100: Yes - 50%; No - 50%
GG100: Yes - 21%; No - 79%
CS202: Yes - 33%; No - 67%

If yes, what form did this take? (Responses: CS100 - 202 (91%); GG100 - 97 (24%); CS202 - 12 (33%))
CS100: Logging on (22%); Quiz grading (43%); Slow system (21%); Other (14%)
GG100: Logging on (13%); Slow system (60%); Other (27%)
CS202: Logging on (0%); Quiz grading (75%); Slow system (8%); Other (17%)

Do you think that this Web tutorial will help you conduct research more effectively?
CS100: Yes - 35%; No - 35%; Don’t know - 30%
GG100: Yes - 70%; No - 13%; Don’t know - 17%
CS202: Yes - 33%; No - 36%; Don’t know - 31%

In the event that a choice of instruction delivery mechanisms had been available, which of the following would you have preferred?
CS100: Prefer online: Yes - 57%; No - 41%; Don’t know - 2%
GG100: All online - 46%; Online & in-class - 41%; In-class - 13%
CS202: All online - 72%; Online & in-class - 17%; In-class - 11%

Did the librarians respond to your questions and problems promptly?
CS100: Yes - 90%; No - 8%; Don’t know - 2%
GG100: Yes - 11%; No - 3%; Doesn’t apply - 86%
CS202: Yes - 28%; No - 0%; Doesn’t apply - 72%

Evaluation Survey: Results

What was the most useful thing which you learned from the tutorial?

Most frequent answers:
• The Internet, especially how to search, do research, and evaluate information.
• How to search TRELLIS.
• Search strategies, especially Boolean logic.
• Journal indexes / how to find journal articles.
• Location of items / how call numbers work.
• How to do research / about the library / the scope of information available.
• How the TUG Libraries work.
• How to use the Library’s Web site.

Evaluation Survey: Results

What would you change about the tutorial content to improve it? / Other comments?

Most frequent answers:
• Make it shorter/more concise/less repetitive/less detailed.
• Nothing (esp. GG100 & CS202).
• Too common-sense/basic in places.
• Make it more interactive and exciting; less reading, more point form.
• Introduce more hands-on or applied activities/assignments.
• Should be optional.
• Not helpful for senior students (esp. CS202).
• Should relate more to the subject matter of the course.
• Introduce more in-class instruction (esp. CS100).
• Some module-specific comments.

Usability Testing/In-Depth Survey

• Conducted research on usability testing both in general (Nielsen, Krug) and in the library context. Found much on the testing of library Web sites, but nothing on the testing of Web-based instructional tutorials; we needed to adapt what we learned to develop our own testing.
• Used Nielsen’s principle of “discount usability”.
• Final instrument combined elements of usability testing with elements of an in-depth user survey.
• Target group: first- or second-year students (4 students in total).
• Offered a monetary incentive.
• Conducted as a series of several sessions for each user, totalling approximately four to six hours.
• Implemented in January of the second term, after the tutorial had run once, with a view to gathering user input on suspected problem areas and identifying other positive or negative attributes of the tutorial from a user’s perspective.

Usability Testing/In-Depth Survey

Included:

1. Background questions (to build profile of user).

2. One set of questions per module asking about: usability, length, and content (e.g. relevance, usefulness, too basic/too complex or confusing), writing style, self-tests, quizzes, whether presented effectively for learning, suggestions for improvements.

3. Final question set on tutorial as a whole: logical flow, repetition between modules, best modules, worst modules, suggestions for improvement of usability and content.

Usability Testing/In-Depth Survey: Results

Usability
• Generally very easy/intuitive to navigate and orient oneself within the tutorial.
• Problem areas: (1) Links sometimes confusing; too many links in places. (2) More cues needed to show students how far they have progressed through a module. (3) Scrolling problematic in a few places, especially Module 4. (4) WebCT quiz instructions could be clearer.

Length
• Seen as a problem in certain areas of the tutorial, especially Modules 3 & 4 and parts of Module 2. Suggested simplifying definitions and cutting down on the number of examples.

Content (including relevance, usefulness, clarity)
• In general, comments about tutorial content were quite favourable.
• Problem areas: (1) Too basic in parts, esp. Module 1. (2) Too much detail in parts of Module 2, as well as Modules 3 & 4. (3) Not explained well enough: Boolean, Citation Trial, some TRELLIS searches.

Usability Testing/In-Depth Survey: Results

Self-Tests/Practice Exercises
• Very effective in preparing for quizzes and should be introduced to a greater extent.
• Good where the practice exercise involves a high level of interactivity, e.g. TRELLIS.

Quizzes
• Generally perceived as reasonable, with a few exceptions: the TRELLIS quiz was too short and too hard; with longer modules there was too much to retain for the quiz; users would have liked feedback on individual post-test questions.

Presentation for Effective Learning
• Generally felt to be effectively presented for learning purposes.
• Suggested enhancements: (1) Break up into shorter chunks/pages, with bolded terms and sentences, more point form. (2) Introduce more interactivity.

Other Comments
• Did not comment/observe on repetition between modules.
• Best modules: Modules 4 & 5.

Using Assessment Results to Enhance Instruction & Evaluation: Overview

• Minor changes were put in place between the Fall and Winter terms of year one, while significant changes were implemented over the Spring and Summer of 2002 in preparation for the Fall term.

• Work was carried out in part by librarians on the Information Literacy Working Group, but also by a short-term contract librarian hired especially for this purpose and a Computer Science student hired during Summer 2002.

Using Assessment Results to Enhance Instruction & Evaluation: Enhancements to Date

• Change in technology/medium of delivery for tutorial content: WebCT -> “regular” WWW <http://www.wlu.ca/library/infolit/tutorial>
• Improved accessibility: Bobby AA approved and usable with screen readers.
• Significantly reduced length, especially Modules 2, 3, and 4.
• Modified presentation of content pages: less text, shorter pages, more bolded words/sentences, bulleted lists, tables, more colour.
• Reduced reliance on internal links to ensure all key content is contained within the content pages.
• Removed the need for scrolling by using smaller images/screen captures.
• Made the tutorial more interactive: introduced a virtual tour of the library (Module 2); maintained and slightly enhanced the TRELLIS interactivity and adopted a similar approach for the module on finding journal articles; developed interactive multiple-choice self-tests and practice exercises.

Using Assessment Results to Enhance Instruction & Evaluation: Enhancements to Date (Cont’d)

• Significant revisions to modules where content was identified as being of questionable relevance/usefulness or excessively detailed (Modules 2, 3, 4, & 5); Modules 2 & 5 changed the most.
• Introduced a new module on finding journal articles.
• Pre-Test/Post-Test: minor edits between terms; revamped one problem question, changed to one correct answer only, and made the post-test count for marks.
• Quizzes: addressed problematic questions between terms, then significantly changed the whole approach during Summer 2002; developed many questions that require students to apply the skills taught in the tutorial; improved quiz instructions.
• Presurvey: clarified the frequency scale and reworded one question (during Summer 2002).
• Student evaluations: minor edits between terms; added one question and reworded/slightly refocused some questions.

Using Assessment Results to Enhance Instruction & Evaluation: Looking to the Future

• Review the pre-test/post-test to ensure continued relevance; may shift the focus to test the ability to apply core skills.
• Investigate building on the presurvey and evaluation survey to include questions designed to determine students’ perceptions of their own skills in defined areas, both before and after they take the tutorial.
• Consider conducting an exit survey with participating faculty.
• Continue to run the tutorial and engage in assessment to ensure a continued process of renewal and improvement.
• Devote more time/resources to promoting awareness of the tutorial and the accompanying assessment plan among upper administrators and faculty, to win increased support/resources for information literacy instruction in this and other forms.