
WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment Institutional Research & Effectiveness Neil M. Patel, Ph.D. Juan Ramirez, Ph.D.

Upload: robyn-poole

Post on 29-Dec-2015


TRANSCRIPT

WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment

Institutional Research & Effectiveness
Neil M. Patel, Ph.D.
Juan Ramirez, Ph.D.

Meeting Roadmap

• The goals are to understand:
– Why assessment needs to take place
– Who should be involved in assessment
– What needs to be assessed
– How to assess the learning outcomes
– When assessment reports are due

Why does assessment need to take place?

• WASC recommendations
• “Nine colleges in search of a University”
• Landscape of education
• Why do we assess?
– To measure learning
– To identify challenges related to instruction, curriculum, or assignments
– To improve learning
• Methods must be in place to properly assess
• Information should be shared widely and used to inform decision-making

Who should be involved in assessment?

• The program
– Deans
– Faculty
– Curriculum committees
– Assessment committees
– Assessment specialists
– Preceptors

• Assessment & Program Review Committee
– Contains a representative from each college

• Institutional Research & Effectiveness
– Director
– Senior Assessment Analyst
– Assessment Analyst

What needs to be assessed?

INSTITUTIONAL LEARNING OUTCOMES

Phase 1 (2012-13): Evidence-based practice; Interpersonal communication skills

Phase 2 (2013-14): Critical thinking; Collaboration skills

Phase 3 (2014-15): Breadth and depth of knowledge in the discipline / Clinical competence; Ethical and moral decision-making skills

Phase 4 (2015-16): Life-long learning; Humanistic practice

What needs to be assessed? (cont.): We cannot assess everything!

• Direct assessment of signature assignments
– Signature assignments have the potential to help us know whether student learning reflects “the ways of thinking and doing of disciplinary experts”
– Course-embedded assessment
– Aligned with LOs
– Authentic in terms of process/content, “real world application”

• Indirect assessment, i.e., student perceptions
– First year survey
– Graduating survey
– Alumni surveys
– Student evaluation of course

ILO Assessment Template
Western University of Health Sciences

Assessment Template

• Timeline
– For programs
– For Assessment Committee

• Section I: Progress Report
• Section II: Learning Outcome Alignment
• Section III: Methodology, Goals & Participation
• Section IV: Results
• Section V: Discussion & Implications

Section I: Progress Report

• Instructions: Please list any programmatic actions that have taken place as a result of last year’s assessment addressing the same Institutional Learning Outcome.

• Goal: To document what occurred as a result of the assessment

Section II: Learning Outcome Alignment

• Instructions: Please list all program learning outcomes (PLO) that align with the institutional learning outcome.

Section III: Methodology, Goals & Participation

• Name of assignment
• Type of assessment (Direct; Indirect)
• Full description of assignment
– Narrative
• PLOs (from the aforementioned list) the assignment assesses
• Quantifiable assessment goal(s) for the assignment
• Type of scoring mechanism used
• Attachment of scoring tool highlighting what is being assessed
• Participation: List of titles and assessment roles for those who participated in the assessment process

Section III components

• PLOs (from the aforementioned list) the assignment assesses
– It is possible that not all PLOs will be assessed by the assignment
– Goal: To determine, over time, which PLOs are/are not being assessed

• Quantifiable assessment goal(s) for the assignment
– To determine how many students are achieving at a specific level/score
– To determine if differences in scores exist between two or more groups
– To determine if scores from one assignment predict scores of another assignment
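As a sketch of the third goal type, the predictive relationship between two assignments can be checked with a simple least-squares regression. The data below are hypothetical, not from the slides; only Python’s standard library is assumed.

```python
# Hypothetical scores for five students on two assignments.
midterm_scores = [70, 75, 80, 85, 90]   # predictor assignment
final_scores = [72, 78, 79, 88, 93]     # assignment being predicted

n = len(midterm_scores)
mean_x = sum(midterm_scores) / n
mean_y = sum(final_scores) / n

# Least-squares fit: slope = cov(x, y) / var(x).
cov_xy = sum((x - mean_x) * (y - mean_y)
             for x, y in zip(midterm_scores, final_scores))
var_x = sum((x - mean_x) ** 2 for x in midterm_scores)
slope = cov_xy / var_x
intercept = mean_y - slope * mean_x

print(f"predicted final = {intercept:.2f} + {slope:.2f} * midterm")
```

A positive slope here would suggest the first assignment’s scores track the second’s; in practice a program would also want the correlation and its p value before drawing conclusions.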

Section III components

• Type of scoring mechanism used
– Scoring guide, rubric, Scantron, professional judgment

• Attachment of scoring tool highlighting what is being assessed
– Example: Rubric

• Participation
– Faculty, faculty committee, program assessment committee, Deans, Institutional Research & Effectiveness
– Goal: To keep track of and demonstrate program participation

Section IV: Results

• Name of assignment
• Analytical approach
– Should align with the assessment goal!
– To determine how many students are achieving at a specific level/score: Frequency distribution
– To determine if differences in scores exist between two or more groups: Chi-square, t-test, or ANOVA
– To determine if scores from one assignment predict scores of another assignment: Regression

• Sample size
– Number of students assessed

• Statistical results
– Frequency table
– Central tendency
– Standard deviation
– Test statistic
– Degrees of freedom
– p value
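The “Statistical results” items above (frequency table, central tendency, standard deviation) can all be produced with Python’s standard library. The rubric scores below are hypothetical, for illustration only.

```python
from collections import Counter
from statistics import mean, median, stdev

# Hypothetical rubric scores (0-4 scale) for ten students.
scores = [2, 3, 3, 4, 2, 1, 3, 4, 4, 3]

# Frequency table: how many students reached each level.
freq = Counter(scores)
for level in sorted(freq):
    pct = 100 * freq[level] / len(scores)
    print(f"score {level}: n = {freq[level]} ({pct:.0f}%)")

# Central tendency and spread (sample standard deviation).
print(f"mean = {mean(scores):.2f}, median = {median(scores)}, SD = {stdev(scores):.2f}")
```

For a real report, the same summary would be run on the program’s own assignment data and pasted into the template alongside any test statistic, degrees of freedom, and p value.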

Section V: Discussion & Implications

• Name of assignment
• Restate assignment goal
• Was the goal reached (Yes/No)?
• How do the results relate back to the ILO?
– Narrative
• How are the results being used?
– Narrative

Example

Scenario: Following a discussion among the faculty, the Curriculum Committee, the Program Assessment Committee, and the Dean, it was decided that Evidence-Based Practice will be assessed using 4th year preceptor evaluations.

Question: What do we need in order to assess this assignment?

Example: 4th year preceptor evaluations to assess Evidence-Based Practice

• Things to consider:
– Which PLO does this assignment address?
– How is the assignment graded?
– Who has the data?
– What is/are the assessment goal(s)?
• Standards of success
– How do we analyze the data?

Example: 4th year preceptor evaluations to assess Evidence-Based Practice

• Assignment: The preceptor evaluation of students occurs during various time points within the 4th year rotations. For the purpose of assessment, the program has decided to use the students’ last preceptor evaluation. The preceptor is asked to indicate using a Yes/No format if a student has been observed demonstrating a list of certain skills or has been observed displaying certain knowledge elements; there are 20 total items in the evaluation form. The data is sent directly to the 4th year Director. To assess Evidence-Based Practice, a single item within the checklist is used: The student displays evidence-based practice.

Example: 4th year preceptor evaluations to assess Evidence-Based Practice

• Assessment Goal: 90% of students will demonstrate evidence-based practice skills.

• Why did we come up with 90%?
– For grading, students need to achieve a score of 70% or higher, and each evaluation of “Yes” = 1 point; thus 14 points out of 20 are required to pass.
– It is possible for all students to score 0 on the EBP item.
– For assessment purposes, we are striving for 90% of students to display EBP skills in their last rotation within the curriculum.
• Remember the signature assignment approach

Example: Data of 4th year preceptor evaluations to assess Evidence-Based Practice

EBP Score: 0 = no, 1 = yes; Gender: 1 = male, 2 = female

Student  EBP Score  Gender
1        1          2
2        1          2
3        1          1
4        0          1
5        1          2
6        1          1
7        0          1
8        1          1
9        1          1
10       0          1
11       0          1
12       0          1
13       1          1
14       0          2
15       1          2
16       0          1
17       0          2
18       0          1
19       1          2
20       1          2
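The frequency distribution the example calls for can be reproduced directly from this table with Python’s standard library; the list below just transcribes the EBP column for students 1 through 20.

```python
from collections import Counter

# EBP scores for all 20 students (0 = no, 1 = yes), read from the table.
ebp_scores = [1, 1, 1, 0, 1, 1, 0, 1, 1, 0,   # students 1-10
              0, 0, 1, 0, 1, 0, 0, 0, 1, 1]   # students 11-20

freq = Counter(ebp_scores)
n = len(ebp_scores)
print(f"No:  {freq[0]:2d} ({100 * freq[0] / n:.1f}%)")   # 9 (45.0%)
print(f"Yes: {freq[1]:2d} ({100 * freq[1] / n:.1f}%)")   # 11 (55.0%)
```

This is the calculation behind the results table that follows: 11 of 20 students (55%) demonstrated evidence-based practice, short of the 90% goal.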

Example

Name of assignment: 4th year preceptor evaluation

Type of assessment (Direct; Indirect): Direct

Provide a full description of the assignment: Preceptors indicate using a Yes/No format whether students are observed demonstrating a list of certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form.

Which PLO(s) from the list in Section II (above) will this assignment assess? (Please list): PLO 2

Please state the assessment goal(s) for the assignment. What is the quantifiable standard(s) of success for this assignment?: 90% of students will demonstrate evidence-based practice skills.

When does the assignment take place in the curriculum? (Year in program, Semester): The very last preceptor evaluation, during the 4th year Spring semester.

Type of scoring mechanism used: Yes/No scoring guide for the item “The student displays evidence-based practice.”

Participants: Faculty, Curriculum Committee, Assessment Committee, and Dean selected the assignment; 4th year preceptors evaluated students; the 4th year program director collected data; the Assessment Committee analyzed data.

Example: Results

Name of assignment: 4th year preceptor evaluation
Analytical approach: Frequency distribution
Sample size: N = 20

Statistical result:
       Frequency  Percent
No         9       45.0%
Yes       11       55.0%
Total     20      100.0%
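The same data can also illustrate the second goal type from Section IV, a group-difference test. Below is a minimal hand-computed 2x2 chi-square of EBP outcome by gender, with counts tallied from the data table above (12 males, 8 females); no external statistics package is assumed, and a real analysis would more likely use a standard package.

```python
# Observed counts tallied from the data table: (yes, no) per gender.
observed = {"male": (5, 7),
            "female": (6, 2)}

row_totals = {g: yes + no for g, (yes, no) in observed.items()}
col_totals = (sum(yes for yes, _ in observed.values()),
              sum(no for _, no in observed.values()))
n = sum(row_totals.values())

# Chi-square statistic with Yates continuity correction (common for 2x2 tables).
chi2 = 0.0
for g, (yes, no) in observed.items():
    for j, obs in enumerate((yes, no)):
        expected = row_totals[g] * col_totals[j] / n
        chi2 += (abs(obs - expected) - 0.5) ** 2 / expected

# The critical value for df = 1 at alpha = .05 is 3.84.
print(f"chi-square = {chi2:.2f}; significant at .05: {chi2 > 3.84}")
```

With a sample this small the test is underpowered, which is itself a point worth noting in Section V when interpreting subgroup comparisons.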

Example: Discussion & Implications

Assignment 1: 4th year preceptor evaluation

Please restate the assessment goal(s): 90% of students will demonstrate evidence-based practice skills.

Was the goal reached? (Yes/No): No; only 55% of students demonstrated evidence-based practice skills.

How do the results relate back to the ILO? Only a slight majority of students demonstrated evidence-based practice skills during the final phase of their education within the curriculum.

How are the findings being used? The program is determining: 1. whether preceptors know what to look for when evaluating students; 2. whether there are predictors of student success for this assignment; 3. whether previous 4th year evaluations lead to a different conclusion; 4. whether the assignment is rigorous enough.

GROUP WORK TIME!!!

Timeline for Programs

Distribute template: April 3, 2013

Sections I–III (Progress Report; Institutional Learning Outcome & Program Learning Outcome Alignment; Methodology, Assessment Goals, & Participation): May 3, 2013

Section IV (Results): June 7, 2013

Assessment Report due: July 31, 2013

Questions? Concerns?