
Graduate Program Assessment: A Pilot Study Using a Common Activity and Combined Rubric

Rana Khan, Ph.D., Director, Biotechnology Program
Datta Kaur Khalsa, Ph.D., Director of Assessment, Education Department
Kathryn Klose, Ph.D., Associate Chair & Director, Finance Management and Accounting
Yan Cooksey, Ph.D., Director, Learning Outcomes Assessment, Dean's Office

Sloan Conference Oct 11, 2012

UMUC'S LEVELS OF ASSESSMENT

Course Outcomes

Program Objectives

Undergraduate and Graduate School Goals

Institutional Outcomes


UMUC GRADUATE SCHOOL STUDENT LEARNING EXPECTATIONS (SLEs)

Written Communication (COMM): Produce writing that meets expectations for format, organization, content, purpose, and audience.

Information Literacy (INFO): Demonstrate the ability to use libraries and other information resources to effectively locate, select, and evaluate needed information.

Critical Thinking (THIN): Demonstrate the use of analytical skills and reflective processing of information.

Technology Fluency (TECH): Demonstrate an understanding of information technology broad enough to apply technology productively to academic studies, work, and everyday life.

Content/Discipline-Specific Knowledge (KNOW): Demonstrate knowledge and competencies specific to the program or major area of study.


CURRENT APPROACH: 3-3-3 MODEL

3 rounds, over 3 years, at 3 stages

5 SLEs: COMM, THIN, INFO, TECH, KNOW



ASSESSING THE ASSESSMENT

Strengths:
• Tested rubrics
• Reasonable collection points
• Larger samples, yielding more data for analysis

Weaknesses:
• Added faculty workload
• Lack of consistency in assignments
• Variability in applying scoring rubrics


COMBINED ACTIVITY/RUBRIC (C2) MODEL

Common activity
• Topic for all disciplines: "Challenges facing leaders"

Combined rubric
• 4 SLEs (all except KNOW)
• SLE criteria drawn from existing rubrics, with overlap eliminated
• 4-point scale (Exemplary, Competent, Marginal, Unsatisfactory)

Training raters and norming


3-3-3 VS COMBINED ACTIVITY/RUBRIC (C2) MODEL

Current 3-3-3 Model                             | Combined Activity/Rubric (C2) Model
Multiple rubrics: one for each of 4 SLEs        | Single rubric for all 4 SLEs
Multiple assignments across the graduate school | Single assignment across the graduate school
One to multiple courses / 4 SLEs                | Single course / 4 SLEs
Multiple raters for the same assignment/course  | Same raters per assignment/course
Untrained raters                                | Trained raters


DESIGN OF A PILOT STUDY

• Purpose:
  – To simplify the current assessment process
  – To increase the reliability and validity of the process

• Methods:
  – Courses were identified
  – Faculty were chosen as raters
  – Norming sessions were conducted
  – Papers were collected and assessed
  – Intra-class correlation coefficients (ICCs) were calculated


SPRING 2012 PILOT NORMING

Implementation Process of the Pilot Study

Week 1 – Norming Session 1: rater orientation covering the scoring process, activity, rubric, and timeline

Week 2 – Scoring Session 1: anchor paper grading; Norming Session 2: asynchronous comparative discussion

Week 3 – Norming Session 3: live conference discussing anchor results and rubric questions

Week 4 – Scoring Session 2: 10-day grading period for all student papers by raters

Week 5 – Norming Session 4: live conference on results, with feedback for improvement

Week 6 – Pilot student data processed and analyzed


PHASE I PILOT RESULTS

• Intra-class correlation coefficient (ICC)
• Estimates inter-rater reliability
• One-way random-effects ANOVA model

Interpretation benchmarks: >0.75, excellent; 0.40 to 0.75, fair to good/moderate; <0.40, poor

Source: Fleiss (1986) on ICC values in clinical and social science research

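The one-way random-effects ICC described above can be computed directly from the ANOVA mean squares: ICC(1) = (MSB − MSW) / (MSB + (k − 1)·MSW), where MSB is the between-papers mean square, MSW the within-papers mean square, and k the number of raters per paper. A minimal sketch follows; the function names and the sample scores are illustrative, not data from the study, and it assumes every paper is scored by the same number of raters.

```python
def icc_one_way(ratings):
    """ICC(1), one-way random-effects model.

    ratings: list of rows, one row per paper, one column per rater.
    Assumes every paper has the same number of raters.
    """
    n = len(ratings)           # number of papers (subjects)
    k = len(ratings[0])        # number of raters per paper
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    # Between-papers and within-papers mean squares (one-way ANOVA)
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def fleiss_band(icc):
    """Interpret an ICC value using the Fleiss (1986) benchmarks cited above."""
    if icc > 0.75:
        return "excellent"
    if icc >= 0.40:
        return "fair to good/moderate"
    return "poor"

# Hypothetical example: two raters scoring five papers on one rubric criterion
scores = [[3.5, 3.0],
          [2.5, 2.0],
          [4.0, 3.5],
          [1.5, 2.0],
          [3.0, 3.0]]
icc = icc_one_way(scores)
print(round(icc, 3), fleiss_band(icc))
```

Under this model a single reliability coefficient summarizes how much of the score variance reflects real differences between papers rather than disagreement between raters, which is why low per-criterion ICCs (e.g., Tech Mgmt. at 0.175) flagged criteria needing rubric refinement or more norming.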

PHASE I PILOT RESULTS

Item                                 Intra-class Correlation
THIN  Conceptualization              0.396
      Analysis                       0.493
      Synthesis                      0.509
      Conclusion                     0.390
      Implications                   0.201
INFO  Evaluation                     0.430
      Incorporation                  0.381
      Ethical Use                    0.335
COMM  Context/Purpose                0.316
      Content/Ideas/Support          0.475
      Organization                   0.444
      Grammar/Spelling/Punctuation   0.456
TECH  Tech Mgmt.                     0.175
      Info Retrieval                 0.512


PHASE I PILOT RESULTS

Overall Descriptive Statistics (N=91)

                                     Mean    Median  Mode   Std. Dev.  Min   Max
THIN  Conceptualization              3.247   3.350   3.5    .4823      1.8   4.0
      Analysis                       3.091   3.150   3.3    .5194      1.5   4.0
      Synthesis                      3.053   3.150   3.5    .5839      1.5   4.0
      Conclusion                     2.998   3.100   3.5    .6024      1.1   4.0
      Implication                    2.935   3.000   3.0    .6026      1.0   4.0
INFO  Evaluation                     3.258   3.250   3.3    .4961      2.0   4.0
      Incorporation                  3.158   3.250   3.3    .5191      1.5   4.0
      Ethical Use                    3.579   3.750   4.0    .5425      1.0   4.0
COMM  Context/Purpose                3.225   3.300   3.5    .4900      1.8   4.0
      Content/Ideas/Support          3.105   3.150   3.3    .5079      2.0   4.0
      Organization                   3.116   3.250   3.3    .5575      1.5   4.0
      Grammar/Spelling/Punctuation   3.115   3.250   3.5    .5593      1.8   4.0
TECH  Tech Mgmt.                     3.621   3.750   3.8    .4213      1.8   4.0
      Info Retrieval                 3.639   3.750   3.8    .4110      1.0   4.0
      Total                          45.093  45.800  45.2a  5.5541     29.5  55.4

a. Multiple modes exist; the smallest value is shown.
b. Scale: Exemplary 3.1-4.0; Competent 2.1-3.0; Marginal 1.1-2.0; Unsatisfactory 0-1.0

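The descriptive statistics and the scale in note b can be reproduced mechanically: compute mean, median, mode, and standard deviation per criterion, then band the mean score. A minimal sketch, assuming the per-paper criterion scores below, which are made up for illustration (the real N=91 data is not reproduced here); `scale_band` is a hypothetical helper implementing the slide's banding.

```python
import statistics

def scale_band(score):
    """Band a rubric score per the slide's scale:
    Exemplary 3.1-4.0, Competent 2.1-3.0, Marginal 1.1-2.0, Unsatisfactory 0-1.0."""
    if score > 3.0:
        return "Exemplary"
    if score > 2.0:
        return "Competent"
    if score > 1.0:
        return "Marginal"
    return "Unsatisfactory"

# Hypothetical per-paper scores on one criterion, averaged across raters
scores = [3.5, 3.0, 3.3, 2.8, 3.5, 3.3, 4.0, 2.5]
summary = {
    "mean": statistics.mean(scores),
    "median": statistics.median(scores),
    "mode": statistics.mode(scores),        # first mode if several exist
    "stdev": statistics.stdev(scores),      # sample standard deviation
    "min": min(scores),
    "max": max(scores),
}
print(summary, scale_band(summary["mean"]))
```

Note that, as in the table's footnote a, a criterion can have multiple modes; `statistics.mode` simply returns the first one encountered, so a real analysis would report the smallest, as the slide does.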

PILOT INTENTIONS

• Consistency in interpretation of the rubric
• Consistency in use of the rubric
• Address variability of data collection
• Limit extra load on faculty


LESSONS LEARNED

• Review alignment

• Consolidate rubric further

• Tech management criteria

• Norming practice


FUTURE DIRECTION: PHASE II

"Refined Rubric and Random Paper Grading Study"

• Same raters
• Same papers, but distributed randomly
• More norming practice with the refined rubric
• Increase evidence of combined rubric validity



REFERENCE

• Fleiss, J. L. (1986). Design and analysis of clinical experiments. New York, NY: John Wiley & Sons.


CONTACT

• Rana Khan: [email protected]

• Datta Kaur Khalsa: [email protected]

• Kathryn Klose: [email protected]

• Yan Cooksey: [email protected]


ACKNOWLEDGEMENTS

• John Aje
• Diane Bartoo
• Nancy Glenn
• Kathy Marconi
• Dan McCollum
• Garth McKenzie
• Pat Spencer
• Rudy Watson
• Bruce Katz
• Dawn Rodriguez
• Carol De'Arment
• Pat Miller
• Lisa Parsons
• Katie Crockett
• Anthony Cristillo
