
Page 1: Ubiquitous Core Skills Assessment

Ubiquitous Core Skills Assessment

David Eubanks, JCSU
Kaye Crook, Coker College

Slides at highered.blogspot.com

Page 2: Ubiquitous Core Skills Assessment

Hi…I’m the Assessment Director

Page 3: Ubiquitous Core Skills Assessment

The Theory

Page 4: Ubiquitous Core Skills Assessment

What Could Go Wrong?

Page 5: Ubiquitous Core Skills Assessment
Page 6: Ubiquitous Core Skills Assessment

(Example 1) Assessment Method:

The History capstone experience culminates in a senior thesis that requires the student demonstrate historical literacy, historiographical awareness, good communication skills, and the ability to conduct research.

The thesis is composed in close association with a History faculty member who continuously assesses the student's abilities in the core curriculum areas. The thesis advisor's evaluation is then shared with other faculty members in the department.

Page 7: Ubiquitous Core Skills Assessment

(Example 1) Assessment Results and Evidence:

Each student in the program who took HIS 391 and HIS 491 (the two-course capstone sequence) over the last academic year was assessed using the History Capstone Evaluation Form.

Data from the past year indicates that majors have improved in terms of their writing, oral expression, critical thinking, and research skills.

Page 8: Ubiquitous Core Skills Assessment

(Example 1) Actions and Improvements:

Continue to monitor History majors' critical thinking and writing skills using the History Capstone Evaluation Form [2814].

Page 9: Ubiquitous Core Skills Assessment

(Example 2)

Outcomes Statement: Assessment results will show that by end of their senior year, chemistry majors will be knowledgeable in each of the four major areas of chemistry, and will have successfully participated in independent or directed research projects and/or internships. Prior to graduation, they will have presented an acceptable senior seminar in chemistry.

Assessment Method: Simple measures of proficiency are the chemistry faculty's grading of homework assignments, exams, and laboratory reports. A more general method of evaluation is to use nationally-standardized tests as final exams for each of the program's course offerings. These are prepared and promulgated by the American Chemical Society (ACS). [No mention of senior seminar]

Assessment Results and Evidence: (none)

Actions and Improvements: (none)

Page 10: Ubiquitous Core Skills Assessment

Fixing What Went Wrong

Resurrected an IE Committee for assessment
Provided faculty training
Provided guidelines for reports
Adhered to established deadline
Reviewed the revised reports and provided feedback

Page 11: Ubiquitous Core Skills Assessment

Think Like a Reviewer!

1. Would you have a clear understanding of this program’s outcomes and assessment methods?

2. Is there documentation of assessment results?

3. Is it clear that results/data have been analyzed by program faculty?

4. Have actions been taken to improve the quality of the program based on these results?

5. Are there statements that seem to come out of nowhere?

Page 12: Ubiquitous Core Skills Assessment

Assessment Report Audit Form

IE Assessment Reports Audit

SACS Standard CS 3.3.1: The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in educational programs, to include student learning outcomes.

Program Area

1. Objectives^1 & Outcomes^2 Identified; Focus on Student Learning

2. Outcomes Align with Objective(s)

3. Assessment Methods^3 Identified

4. Assessment Methods Align with Outcomes

5. Results of Assessments Cited (Data)^4

6. Evidence of Comparative, On-going Data^4

7. Analysis and Use of Results (Data)^4

8. Evidence of Action Taken^5

9. SQUID Links to Rubrics & Cited Documents^6

10. Reports updated through 08-09

[Four columns follow, each headed "(program & year of report)", one per report reviewed.]

^1 Objectives state what is to be achieved specific to the goal each falls under. It will be theoretical and broad.
^2 Outcomes specify what is expected to be observed once the assessment has been completed. Student Learning Outcomes identify knowledge, skills, and dispositions that exemplify graduates of the program.
^3 Assessment Methods describe what will be done, observed, or measured to determine if the outcomes & objectives have been met. There should be some variety; use of both qualitative and quantitative methods is possible with small sample sizes.
^4 Assessment Results and Evidence should include findings from the listed assessment methods with documentation and evidence. Look for data (numbers, percentages, ratings) and evidence that results have been analyzed, shared, and discussed with conclusions reached.
^5 There should be clear evidence that actions have been taken to modify and improve the program based on analysis of results cited. Past tense should be used as much as possible; for example, “faculty met and decided to…”. It is OK to occasionally state “no improvements needed at this time”.
^6 Documents needed for the evaluator should be linked in SQUID by [xxxx] notation. Check all SQUID links to find evidence such as data organized in tables, charts, graphs, rubrics and scoring guides used in the assessments, minutes of meetings, etc.

Provide comments for each Program Area below and on the back of this page.

Page 13: Ubiquitous Core Skills Assessment

Improved Chemistry Outcomes

Assessment results will show that chemistry graduates have

1. a clear understanding of, and demonstrated proficiency in, each of the five major areas of chemistry: inorganic chemistry, organic chemistry, analytical chemistry, physical chemistry, and biochemistry.

2. the ability and technical expertise to successfully engage in independent or directed research projects and/or internships relevant to the chemical field.

Page 14: Ubiquitous Core Skills Assessment

Improved Chemistry Assessments

1. Assessment methods for knowledge proficiency

a. Knowledge proficiency of chemistry students will be assessed by using standardized tests that are prepared and distributed by the American Chemical Society (ACS). These tests are used nationwide to provide an external and uniform source of assessment. These tests are an excellent means of comparing knowledge proficiency of Coker students to the national average.

b. The performance of students in admission tests such as the Graduate Record Examination (GRE), Pharmacy College Admission Test (PCAT), and Medical College Admission Test (MCAT) will also be used to assess knowledge proficiency.

 c. Additionally, assessment criteria as specified in an internal Chemistry Content Knowledge Scoring Guide [4315] rubric will be used. These criteria will be applied to students in chemistry courses and the scores will be reported in assessment results.

2. The technical know-how and practical skills of students will be assessed based on their performance in laboratory sessions and using rubric [4315]. Students are required to participate in a research project under the direction of a faculty member. Their performance (measured in terms of poster presentations, publications, etc.) will be used to assess practical and research skills. Technical know-how will be assessed in a student’s senior capstone seminar using rubric [4316].

Page 15: Ubiquitous Core Skills Assessment

General Education Assessment

Page 16: Ubiquitous Core Skills Assessment

What goes wrong: Data Salad

Page 17: Ubiquitous Core Skills Assessment

Assessment Design

Page 18: Ubiquitous Core Skills Assessment

Choosing an Achievement Scale

Pick a scale that corresponds to the educational progress we expect to see.

Example:

0 = Remedial work

1 = Fresh / Soph level work

2 = Jr / Sr level work

3 = What we expect of our graduates
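
To make the idea concrete, here is a minimal sketch (not from the slides; all names are hypothetical) of encoding this scale once in Python so every later tabulation uses the same labels:

# Hypothetical encoding of the four-point achievement scale above.
ACHIEVEMENT_SCALE = {
    0: "Remedial work",
    1: "Fresh/Soph level work",
    2: "Jr/Sr level work",
    3: "What we expect of our graduates",
}

def label(rating: int) -> str:
    """Map a numeric rating to its scale description."""
    return ACHIEVEMENT_SCALE[rating]

print(label(2))  # Jr/Sr level work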

Page 19: Ubiquitous Core Skills Assessment

For Complex Outcomes:

Use authentic data that already exists
  Sacrifices reliability for validity

Create subjective ratings in context
  Sacrifices reliability for validity
  Don’t have to bribe students to do something extra

Assess what you observe, not just what you teach!
  Recovers some reliability

Page 20: Ubiquitous Core Skills Assessment

Faculty Assessment of Core Skills

Each syllabus has a simple rubric for at least one skill (not necessarily taught)

Ratings entered on a web form each term by course instructors
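
For concreteness, a term-end submission from one instructor might look roughly like this (a sketch only; the field names are hypothetical, not the actual web form):

# Hypothetical shape of one instructor's term-end rating entry.
rating_record = {
    "term": "Fall 2006",
    "course": "ENG 101",
    "skill": "Writing Effectiveness",  # the skill named in that syllabus's rubric
    "ratings": {                       # 0-3 achievement scale, per student
        "student_001": 1,
        "student_002": 2,
    },
}

print(rating_record["skill"], rating_record["ratings"])

Collecting every record in a shape like this is what makes the term-by-term comparisons on the following slides possible.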

Page 21: Ubiquitous Core Skills Assessment
Page 22: Ubiquitous Core Skills Assessment

Analysis and Reporting

Page 23: Ubiquitous Core Skills Assessment

Comparison of Writing vs. Speaking

Page 24: Ubiquitous Core Skills Assessment

Compare Demographics

[Chart: “Longitudinal Writing Effectiveness in Day”, average score (axis 0.00 to 2.50) by academic year (2003-4 through 2006-7), with four demographic series: AF F, AF M, W F, W M.]

Page 25: Ubiquitous Core Skills Assessment

Matthew Effect

Page 26: Ubiquitous Core Skills Assessment

Reading and Writing

[Chart: “Writing by Class and Library Use”, average writing score (axis 0 to 1.8) by start term (Fall 02 through Fall 05), comparing students with more (“> Library”) and less (“< Library”) library use.]

Page 27: Ubiquitous Core Skills Assessment

What can go wrong: Averages

Averages compress all the juice out of data
Without juice, we only have directional info
  How high should it be? How do we get there?

An average is like a shotgun. Use carefully.
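
A tiny illustration with made-up numbers: two cohorts can share an identical average while telling very different stories.

# Hypothetical cohorts rated on the 0-3 achievement scale.
cohort_a = [2, 2, 2, 2, 2]   # everyone doing Jr/Sr level work
cohort_b = [0, 1, 3, 3, 3]   # a remedial student hidden in the mix

def avg(xs):
    return sum(xs) / len(xs)

print(avg(cohort_a), avg(cohort_b))  # 2.0 2.0 -- identical averages
# The remedial student in cohort_b disappears in the average; the
# min/max view on the next slide keeps that information visible.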

Page 28: Ubiquitous Core Skills Assessment

A Solution: Min / max

Major: Art (4)   Class: Senior   Student ID: xxxxxxx   Name: Tatiana Tolstoy   Observations: 7

Skill                    Min       Max    Avg
Analytical Thinking      Remedial  Jr/Sr  1.00
Creative Thinking        Remedial  Fr/So  0.80
Writing Effectiveness    Fr/So     Jr/Sr  1.17
Speaking Effectiveness   Remedial  Jr/Sr  1.00
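
A minimal sketch of how such a summary could be computed, assuming each student's ratings per skill are collected as lists (the numbers are made up, loosely mirroring the table above):

from statistics import mean

# Hypothetical ratings one student received across observed courses,
# on the 0-3 scale (0 = Remedial, 1 = Fr/So, 2 = Jr/Sr, 3 = Graduate level).
ratings = {
    "Analytical Thinking":    [0, 1, 2],
    "Creative Thinking":      [0, 1, 1, 1, 1],
    "Writing Effectiveness":  [1, 1, 2, 1, 1, 1],
    "Speaking Effectiveness": [0, 1, 2],
}

LEVELS = {0: "Remedial", 1: "Fr/So", 2: "Jr/Sr", 3: "Graduate"}

for skill, scores in ratings.items():
    print(f"{skill}: min={LEVELS[min(scores)]} "
          f"max={LEVELS[max(scores)]} avg={mean(scores):.2f}")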

Page 29: Ubiquitous Core Skills Assessment

Assessment Results

Three of our five senior students attained higher scores in their Senior year compared to the scores in their First year.

Unfortunately, two of our students showed improvement during their Sophomore and Junior years, but the improvement was not sustained into their Senior year.

Page 30: Ubiquitous Core Skills Assessment

Action tied to results

Change our Capstone course to include the writing of a thesis in the style of a review article. We are hopeful that this assignment will improve at least the effective writing and analytical thinking components of the core skills.

A rubric has been developed to help assess these core skills in that writing assignment.

Page 31: Ubiquitous Core Skills Assessment

Solution: Use Proportions

[Chart: “Day Analytical Min Rating”, stacked proportions (0% to 100%) by academic year (2003-4 through 2006-7), split into Jr-Sr, Fr-Soph, and Remedial bands.]
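
Continuing the hypothetical sketch, the proportions come straight from each student's minimum rating in a given year:

from collections import Counter

LEVELS = {0: "Remedial", 1: "Fr-Soph", 2: "Jr-Sr"}

# Hypothetical minimum analytical ratings, one entry per student, by year.
min_ratings = {
    "2003-4": [0, 0, 1, 1, 1, 2],
    "2006-7": [0, 1, 1, 2, 2, 2],
}

for year, mins in min_ratings.items():
    counts = Counter(mins)
    shares = ", ".join(f"{LEVELS[k]}: {100 * counts[k] / len(mins):.0f}%"
                       for k in sorted(counts))
    print(f"{year}  {shares}")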

Page 32: Ubiquitous Core Skills Assessment

Some Effects

Institutional dashboard Provided evidence for QEP

Core skills rubric in every syllabus Creates a language and expectation

Example of program changes: We now videotape student presentations Writing assignments look for other core skills Intentional focus on analytic vs creative thinking

Page 33: Ubiquitous Core Skills Assessment

Last Requests?

highered.blogspot.com

Page 34: Ubiquitous Core Skills Assessment

“Through the eyes of an IE evaluator,” from Marila D. Palmer, SACS Summer Institute, July 2009

What does an Institutional Effectiveness Evaluator look for when reading an IE program report?

Assessments align with learning outcomes and goals; for example, indicate on a survey which questions align with which outcomes

Assortment of well-matched assessments: written and oral exams, standardized tests, presentations, capstone projects, portfolios, surveys, interviews, employer satisfaction studies, job/grad placement, case studies

Goals/outcomes probably don’t change much from year to year. Strategies for accomplishing these goals may and probably should change.

Outcomes should focus on student learning rather than curricular or program improvement

Documentation: numbers, percentages, comparative, longitudinal data
On-going documentation (year to year to year)
Results that have been analyzed, shared, discussed, and acted upon
Evidence of improvement based on analysis of results
Easy-to-find and/or highlighted sections that point to proof/evidence of improvement; tell the evaluator exactly where in the document the evidence/proof is

A little narrative goes a long way; get rid of extraneous information; think of it as a summary

Report needs to be organized; keep it short and simple (KISS)

Page 35: Ubiquitous Core Skills Assessment

How we reviewed the reports

IE Assessment Reports Audit

[The audit form shown here is the same IE Assessment Reports Audit reproduced on Page 12, including the SACS CS 3.3.1 standard, the ten program-area checks, and the six footnotes.]

Page 36: Ubiquitous Core Skills Assessment

Assessment Reporting Review Handout

I. My Plans “How-To’s”:

Goal: Student Learning Outcomes: The program identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement in student learning outcomes. (SACS 3.3.1, formerly 3.5.1)

Objective: A statement of what you are trying to achieve
  Focus on student learning
  What is to be achieved?
  Somewhat broad and theoretical

Outcomes Statement: What would you like to see as a result of the assessments you use?
  Is observable
  Is measurable (either quantitatively or qualitatively)

Assessment Method: Describe what will be done, observed, and measured to determine if outcomes and objectives have been met
  Need more than one method
  Can give both quantitative and qualitative results
  Alignment with outcomes should be clear
  Any forms used for assessment should have a SQUID link indicating which section(s) align to which outcomes
  May change based on results from the previous year

Page 37: Ubiquitous Core Skills Assessment

Assessment Reporting Review Handout (continued)

I. My Plans “How-To’s” (continued):

Assessment Results and Evidence: Summarize results of the assessments, describe the evidence you looked at, and provide an analysis of the results
  Include findings from listed assessment methods; include results from majors in all on- and off-site programs
  SQUID links to data/results organized in tables and charts
  Documentation/evidence includes numerical data, information from discussions & interviews, minutes of meetings, etc.
  When citing numerical data or ratings, include in the narrative what they mean (1 = excellent, etc.)
  Report AND analyze your findings
  Include results from actions you reported you would do this year
  Compare results from last year to this year or from one cohort to another
  Should not be the same from year to year

Actions and Improvements: What actions were taken to address issues discovered in your findings?
  Report what you did, not what you will do (or word it so that any future actions were discussed and agreed upon)
  If your evidence shows positive results, how can you build on it? If not, what do you try next?
  Actions taken to modify and improve the quality of your program should tie to the analysis of the evidence reported in the previous section
  Should NOT be the same from year to year; do not report “everything is fine and we will continue to monitor…”

Page 38: Ubiquitous Core Skills Assessment

Assessment Reporting Review Handout (continued)

II. Assessment Report “Shoulds”:

1. Should provide actionable information
2. Should have the purpose of improving the quality of the educational program you are offering
3. Should be focused on internal use rather than external use
4. Results should be valuable to you for feedback and improvement rather than solely for reporting purposes
   Ask “does this assessment method give me any feedback that we can actually use to improve our program?”
5. Should focus on student learning outcomes, not program outcomes or curricular improvement
6. Should not have all actions/improvements in future tense; show you actually took action rather than that you intend to do so
7. Should address and document “we will do” actions from the previous year: what was actually done this year and what were the results
8. Should be written so that documentation/evidence is easy to find; highlight/identify exactly what part of a document/table/survey/rubric/portfolio, etc. provides proof of results and improvement
9. Should address results from majors in all programs (day, evening, off-campus, distance learning, etc.)
10. Read your reports & think like a reviewer.

• Would you have a clear understanding of this program’s goals, outcomes, and assessment methods?

• Are there obvious examples and documentation of results of the assessment methods listed?

• Is it clear that results/data have been discussed and analyzed among faculty in the program?

• Have actions been taken (note past tense) to improve the quality of the program based on these results?

• Are there statements that seem to come out of nowhere (no linkage to objective, assessments, or data)?

Page 39: Ubiquitous Core Skills Assessment

Guidelines for reports

1. Have learning goals/outcomes been established?
2. Have methods of assessment been identified?
3. Are the assessments aligned with the goals and outcomes? Is it easy to see how?
4. Are results reported using data and other evidence? Is this evidence easy to find and identify? Have results been analyzed? Is there evidence of on-going assessment from year to year?

5. Is there evidence that results have been used to take actions and make modifications and improvements?

Page 40: Ubiquitous Core Skills Assessment

Questions?

Kaye Crook

Associate Professor of Mathematics

Coker College

Hartsville, SC

[email protected]

Thank you!