Crossing the Rubricon
Assessing the Instructor
Ned Fielden, Mira Foster
San Francisco State University
San Francisco, California, USA
Case Study: Assessment of Librarian Instructors

• Literature Review
• Theoretical Issues
• Rubric Design and Implementation
• Preliminary Review
Instructor Assessment: Several Methods

• Supervisor Review
• Peer Evaluation
• Surveys
• Performance Assessment (learning outcomes of students assessed)
Institutional Need for Instructor Assessment
• Retention of probationary candidates; tenure and promotion
• CSU, as a public institution, applies criteria-based review with strict rules about personnel evaluation
• Summative vs. formative assessment
Process

• Literature Review
• Identify Suitable Mechanism for Review
• Create Draft
• Consult with Library Education Committee
• Formally Adopted by Library Faculty
Rubrics
• Powerful, easy to use, standardized
• Considerable literature on rubric use for students/programs/outcomes
• Little on rubric use for assessing library instructors
Value of Rubrics
• Standardized
• Easy to use (minimal training)
• Ensures all review criteria are met
• Possibilities for quantitative data analysis and for introducing new values
• Can be employed for both summative and formative assessment
Rubric Basics
• A glorified “checklist,” annotated to establish criteria as distinct items

A. Preparation
1. Communicated with course instructor before the session to determine learning objectives and activities
2. Learned about course assignment(s) specifically related to library research
3. Customized instruction session plan to curriculum, specific course assignments, and/or faculty/student requests
Rubric Complexity

• May be designed to reflect highly nuanced categories

| Evaluation Criteria | Beginning | Developing | Exemplary | Student Learning Outcomes |
| --- | --- | --- | --- | --- |
| Articulates Criteria | 0 – Student does not address authority issues | 1 – Student addresses authority issues but does not use criteria terminology | 2 – Student addresses authority issues and uses criteria terminology such as author, authority, authorship, or sponsorship | LOBO 3.1.1: The student will articulate established evaluation criteria (ACRL 3.2 a) |

*Oakleaf, M.L., 2006. Assessing information literacy skills. Dissertation, University of North Carolina.
Types of Rubrics

• Analytic – specific criteria, isolated facets, capacity for highly granular scoring. Analytic rubrics “divide … a product or performance into essential traits or dimensions so that they can be judged separately…” *
• Holistic – big picture, fuzzier focus: an “overall, single judgment of quality” *

*Arter and McTighe, Scoring Rubrics in the Classroom, 2001.
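The analytic/holistic distinction can be sketched in code: an analytic rubric records a separate judgment for each trait, which also enables the quantitative analysis mentioned earlier, while a holistic rubric yields one overall verdict. The criterion names and score levels below are hypothetical examples, not taken from the deck's rubric.

```python
# Sketch: analytic vs. holistic rubric scoring.
# Criterion names and levels are hypothetical illustrations.

# Analytic rubric: each "essential trait or dimension" is judged
# separately on a 0-2 scale (0 = Beginning, 1 = Developing,
# 2 = Exemplary), so results can be analyzed per criterion.
analytic_scores = {
    "preparation": 2,
    "articulates_criteria": 1,
    "customization": 2,
}

# Per-criterion data supports simple quantitative aggregates.
analytic_total = sum(analytic_scores.values())

# Holistic rubric: one "overall, single judgment of quality".
holistic_score = "Developing"

print(analytic_scores)
print("total:", analytic_total)
print("overall:", holistic_score)
```

The trade-off the slides describe falls out directly: the analytic form keeps the granular per-trait scores, while the holistic form collapses everything into a single label.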
Rubric Design

• What criteria to include
• Opportunity to introduce specific program values
• Involvement of all constituents (evaluators and evaluatees)
Rubric Implementation

• Formative
  – Raw data given to candidate
  – Pre- and post-consultation
  – Candidate uses the data however desired
• Summative
  – Framework for the formal letter in the RTP (retention, tenure, and promotion) file
Summary
• Powerful, easy-to-use tool; levels the playing field; highly customizable
• Issues arise when mixing formative and summative functions
Further Study
• Explore different varieties of instructor assessment tools
• Test different rubrics
• Establish a balance point between depth of data and ease of use
• Evaluate outcomes
Crossing the Rubricon: Assessing the Instructor

• Bibliography
  – http://online.sfsu.edu/~fielden/rbib.html
• Sample Rubric
  – http://online.sfsu.edu/~fielden/rubrics.html
Bridge Photo with permission from robep http://www.flickr.com/photos/robep/