
AY 2012/13 Annual Assessment Report

Academic AffairsAssessment Office

January 20, 2015


Contents

I. Institutional Report  8
   A. Institutional Summary  8
   B. Institutional Review of Departmental Assessment  9
      1. What was the institution's major assessment accomplishment this year?  9
      2. How did this accomplishment impact the institutional assessment goals?  9
      3. How did this accomplishment impact the program assessment goals?  9
      4. How was this accomplishment related to last year's assessment?  9
      5. How will this accomplishment relate to next year's assessment?  9
      6. Reflecting upon program accomplishments, how would the institution improve the assessment process?  10
      7. What is working well and should not change?  10
      8. What was the institution's major assessment hurdle this year?  10
      9. How did this hurdle impact the department assessment goals?  10
      10. How did this hurdle impact the program assessment goals?  10
      11. How was this hurdle related to last year's assessment?  10
      12. How will this hurdle be addressed in next year's assessment?  10
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  10
      14. What would you change to provide the most impact on assessment in the institution? Explain and be as specific as you can.  11

II. Departmental Reports  12

Art Department  13
   A. Department Evaluation Report  13
      1. What was the department's major assessment accomplishment this year?  13
      2. How did this accomplishment impact the department assessment goals?  13
      3. How did this accomplishment impact the course assessment goals?  13
      4. How was this accomplishment related to last year's assessment?  13
      5. How will this accomplishment relate to next year's assessment?  13
      6. Reflecting upon your accomplishments, how would you improve the assessment process?  13
      7. What is working well and should not change?  13
      8. What was the department's major assessment hurdle this year?  13
      9. How did this hurdle impact the department assessment goals?  13
      10. How did this hurdle impact the course assessment goals?  14
      11. How was this hurdle related to last year's assessment?  14
      12. How will this hurdle be addressed in next year's assessment?  14
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  14
      14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.  14
   B. Department MFT Results  15

Business Administration  16
   A. Department Evaluation Report  16
      1. What was the department's major assessment accomplishment this year?  16
      2. How did this accomplishment impact the department assessment goals?  16
      3. How did this accomplishment impact the course assessment goals?  16
      4. How was this accomplishment related to last year's assessment?  16
      5. How will this accomplishment relate to next year's assessment?  16
      6. Reflecting upon your accomplishments, how would you improve the assessment process?  16
      7. What is working well and should not change?  16
      8. What was the department's major assessment hurdle this year?  16
      9. How did this hurdle impact the department assessment goals?  17
      10. How did this hurdle impact the course assessment goals?  17
      11. How was this hurdle related to last year's assessment?  17
      12. How will this hurdle be addressed in next year's assessment?  17
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  17
      14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.  17
   B. Department MFT Results  18

Computer and Mathematical Sciences  21
   A. Department Evaluation Report  21
      1. What was the department's major assessment accomplishment this year?  21
      2. How did this accomplishment impact the department assessment goals?  21
      3. How did this accomplishment impact the course assessment goals?  21
      4. How was this accomplishment related to last year's assessment?  22
      5. How will this accomplishment relate to next year's assessment?  25
      6. Reflecting upon your accomplishments, how would you improve the assessment process?  25
      7. What is working well and should not change?  25
      8. What was the department's major assessment hurdle this year?  25
      9. How did this hurdle impact the department assessment goals?  25
      10. How did this hurdle impact the course assessment goals?  25
      11. How was this hurdle related to last year's assessment?  25
      12. How will this hurdle be addressed in next year's assessment?  25
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  25
      14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.  26
   B. Department MFT Results  27

Criminal Justice and Human Services  30
   A. Department Evaluation Report  30
      1. What was the department's major assessment accomplishment this year?  30
      2. How did this accomplishment impact the department assessment goals?  30
      3. How did this accomplishment impact the course assessment goals?  30
      4. How was this accomplishment related to last year's assessment?  30
      5. How will this accomplishment relate to next year's assessment?  30
      6. Reflecting upon your accomplishments, how would you improve the assessment process?  30
      7. What is working well and should not change?  30
      8. What was the department's major assessment hurdle this year?  30
      9. How did this hurdle impact the department assessment goals?  31
      10. How did this hurdle impact the course assessment goals?  31
      11. How was this hurdle related to last year's assessment?  31
      12. How will this hurdle be addressed in next year's assessment?  31
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  31
      14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.  31
   B. Department MFT Results  32

Education  34
   A. Department Evaluation Report  34
      1. What was the department's major assessment accomplishment this year?  34
      2. How did this accomplishment impact the department assessment goals?  34
      3. How did this accomplishment impact the course assessment goals?  34
      4. How was this accomplishment related to last year's assessment?  34
      5. How will this accomplishment relate to next year's assessment?  35
      6. Reflecting upon your accomplishments, how would you improve the assessment process?  35
      7. What is working well and should not change?  35
      8. What was the department's major assessment hurdle this year?  35
      9. How did this hurdle impact the department assessment goals?  35
      10. How did this hurdle impact the course assessment goals?  35
      11. How was this hurdle related to last year's assessment?  35
      12. How will this hurdle be addressed in next year's assessment?  35
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  35
      14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.  36
   B. Department MFT Results  37

History and Political Sciences  38
   A. Department Evaluation Report  38
      1. What was the department's major assessment accomplishment this year?  38
      2. How did this accomplishment impact the department assessment goals?  38
      3. How did this accomplishment impact the course assessment goals?  38
      4. How was this accomplishment related to last year's assessment?  39
      5. How will this accomplishment relate to next year's assessment?  39
      6. Reflecting upon your accomplishments, how would you improve the assessment process?  39
      7. What is working well and should not change?  39
      8. What was the department's major assessment hurdle this year?  40
      9. How did this hurdle impact the department assessment goals?  40
      10. How did this hurdle impact the course assessment goals?  40
      11. How was this hurdle related to last year's assessment?  40
      12. How will this hurdle be addressed in next year's assessment?  41
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  41
      14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.  41
   B. Department MFT Results  42

Humanities  45
   A. Department Evaluation Report  45
      1. What was the department's major assessment accomplishment this year?  45
      2. How did this accomplishment impact the department assessment goals?  45
      3. How did this accomplishment impact the course assessment goals?  45
      4. How was this accomplishment related to last year's assessment?  45
      5. How will this accomplishment relate to next year's assessment?  45
      6. Reflecting upon your accomplishments, how would you improve the assessment process?  45
      7. What is working well and should not change?  46
      8. What was the department's major assessment hurdle this year?  46
      9. How did this hurdle impact the department assessment goals?  46
      10. How did this hurdle impact the course assessment goals?  46
      11. How was this hurdle related to last year's assessment?  46
      12. How will this hurdle be addressed in next year's assessment?  46
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  46
      14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.  46
   B. Department MFT Results  47

Nursing  50
   A. Department Evaluation Report  50
   B. Department MFT Results  56

Psychology and Sociology  57
   A. Department Evaluation Report  57
      1. What was the department's major assessment accomplishment this year?  57
      2. How did this accomplishment impact the department assessment goals?  57
      3. How did this accomplishment impact the course assessment goals?  57
      4. How was this accomplishment related to last year's assessment?  57
      5. How will this accomplishment relate to next year's assessment?  58
      6. Reflecting upon your accomplishments, how would you improve the assessment process?  58
      7. What is working well and should not change?  58
      8. What was the department's major assessment hurdle this year?  58
      9. How did this hurdle impact the department assessment goals?  58
      10. How did this hurdle impact the course assessment goals?  58
      11. How was this hurdle related to last year's assessment?  58
      12. How will this hurdle be addressed in next year's assessment?  58
      13. Reflecting upon the hurdles you faced, how would you improve your assessment process?  59
      14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.  59
   B. Department MFT Results  60
      1. Psychology  61
      2. Sociology  64

Physical and Biological Sciences  66
   A. Department Evaluation Report  66
      1. Biology  66
      2. Chemistry  67
      3. Environmental Science  68
   B. Department MFT Results  69
      1. Biology  69
      2. Chemistry  72

III. ETSPP Results  75

2013 ETSPP Report Summary  75
   A. Senior ETSPP Results  75
   B. ETSPP Day  75
   C. ETSPP Summary and Recommendations  76

ETSPP Summary of Scaled Scores  77

ETSPP Summary of Proficiency Classifications  79

ETSPP Scaled Score Distributions  81

IV. Summer 2014 Assessment Workshop Report  83

Overall conclusions  83

Topics covered by presentations  84

Help given to faculty  84

Examples of Types of Rubrics Developed in Workshop  84

Assessment Workshop 2014 Survey Summary  85

V. AY 2012/13 Annual Assessment Summary  96


Part I.

Institutional Report

A. Institutional Summary

Columbia College (CC) is in a period of transition with respect to assessment. In AY 2012/13 the Faculty Association approved a schedule of assessment tool development, implementation, and evaluation to ensure that full assessment cycles were completed and that the Higher Learning Commission's (HLC) new criteria for assessment were met. CC has a long record of associating course-level objectives and outcomes with each course through master syllabi that are used for every course taught at every venue. The HLC, however, now requires that each program have explicit measurable learning outcomes (MLOs) and demonstrate how each course develops those MLOs, from the introduction of new information and/or skills through to the level of mastery defined and observed by each department, so that every graduate attains a level that will support success in the field in which they were trained. During this academic year, academic departments and programs determined where they stood in the process of mapping out program outcomes and identifying expectations of graduates upon completion of their degrees. This process revealed that some programs were farther behind than others, primarily those lacking an outside accreditation body to provide such expectations. These findings are described in each department summary included in this report. To help departments with their assessment reporting, fourteen questions were provided to document assessment for the academic year. For consistency, the institutional report follows the same format, answering the fourteen questions in a slightly modified form.

Of note were the Education and Nursing programs. Each has an outside accreditation body with rigid requirements for licensure and certification. Because these departments have a history of assessing at both the class and program level, they have been a great resource to other departments in answering questions about the types of rubrics and assignments most appropriate for the different types of MLOs being assessed. During the summer assessment workshop, faculty across all departments spent a week together working on their respective program assessment outcomes, rubrics, and schedules. It was in this week that collaboration expanded across the campus and the assessment culture at CC blossomed. A summary of the workshop and its survey can be found in the workshop report section.

The goal for Columbia College in the next academic year is to complete outcomes and rubrics for each program, to be implemented on a three- or five-year rotation. This rotation allows full evaluation of results, ensures that programs identify strengths and weaknesses by demonstrating a complete assessment cycle, and then starts the next cycle on each department's next designated program outcomes, so that the assessment cycle continues and student learning improves across all courses and venues.


B. Institutional Review of Departmental Assessment

1. What was the institution’s major assessment accomplishment this year?

The Faculty Association approved the assessment plan proposed by the Assessment Committee. The full plan is explained on the web site http://web.ccis.edu/Offices/AcademicAssessment.aspx.

The plan includes institutional, program, and class assessment, with a timeline that the Faculty Association voted to push back by nine months.

The web site was updated to reflect the current committee membership and to serve as a repository for prior assessment reports at the program, department, and institutional levels. The adopted assessment process is clearly presented and available to the public, reflecting the institution's commitment to giving prospective students the information they may need to make an informed decision about where to pursue their higher education.

2. How did this accomplishment impact the institutional assessment goals?

The goal for the institution was to put in place a clear assessment plan, covering course, program, and institutional assessment, for the public to view. This year's accomplishment met that goal.

3. How did this accomplishment impact the program assessment goals?

Making and passing the assessment plan clarified the purpose of program assessment. However, the first task of creating assessment tools for courses caused confusion about how programs would be assessed. The plan made apparent the need for faculty to have opportunities to work on assessment in a more structured environment. The Faculty Association voted to extend the deadline for creating program curriculum maps, program assessment schedules, and one class assessment rubric to December 2013.

4. How was this accomplishment related to last year’s assessment?

The goal was to implement a complete assessment-cycle plan for each program. Due to lack of interest and lack of understanding among faculty, there was little change from the previous year in department assessment reports other than the form with which the information was collected. Many departments are still looking to standardized tests to evaluate their programs. The new information relates to the process of implementing the new assessment plan. Very little data has been collected at the program level that helps the faculty and instructors in a program know in which classes students acquire the skills and knowledge demonstrated on the standardized exams. With so many students transferring into CC across all the venues, there is no clear evidence of where that knowledge and competence is developed.

5. How will this accomplishment relate to next year’s assessment?

The goal is for rubrics to be created and implemented by the spring 2014 semester. Three programs (history, chemistry, and business) have created a multiple-choice exam to measure outcomes across introductory and upper-level courses. The exams contain 10 to 25 questions of varying levels of difficulty.


All departments will need more time to develop assessment tools that let them examine how students develop skills and competencies across their curriculum.

6. Reflecting upon program accomplishments, how would the institution improve theassessment process?

Provide more support for faculty to develop tools and create realistic plans that can be completed as described. An assessment office needs to be created, with resources that faculty and everyone else at the institution can use, to help identify where institutionally identified student learning outcomes can be measured, analyzed, and evaluated, and to modify and/or create courses that improve student learning across all venues and modalities at CC.

7. What is working well and should not change?

The Assessment Committee has been the driving force in keeping faculty on course to meet the assessment plan requirements.

8. What was the institution’s major assessment hurdle this year?

Moving the deadlines to December 2014.

9. How did this hurdle impact the department assessment goals?

It pushed data collection to spring 2014, which means the data cannot be adequately analyzed until spring 2015. The goal was for each program to have information that would help it evaluate student learning across the program's curriculum. It will be much longer before that type of analysis is possible.

10. How did this hurdle impact the program assessment goals?

It did not impact the program assessment goals because only the target dates were changed. The goals remain the same.

11. How was this hurdle related to last year’s assessment?

N/A. We now have a different form for annual reports.

12. How will this hurdle be addressed in next year’s assessment?

There will be more evidence demonstrating where programs have weaknesses in the assessment process and where the institution has opportunities to provide more support to help programs reach their assessment goals.

13. Reflecting upon the hurdles you faced, how would you improve your assessmentprocess?

There should be designated time each semester when faculty who are responsible for assessment in their programs can help each other meet their program goals. An assessment office can provide training and/or education and share tools and processes that have worked, or have not, so that a culture of assessment is cultivated and spread across programs and departments. Eventually all faculty and instructors will be involved in assessment at CC.

14. What would you change to provide the most impact on assessment in the insti-tution? Explain and be as specific as you can

Create an assessment office that meets regularly with faculty until each program has met the approved assessment plan for two complete cycles. After two complete cycles, faculty would be able to determine when they need to meet with the assessment office. Faculty have not yet seen much data that impacts their programs. Once they do, they will need substantial resources to help them interpret trends and the implications of the results.


Part II.

Departmental Reports


Art Department

A. Department Evaluation Report

1. What was the department’s major assessment accomplishment this year?

Under the leadership of Bo Bedilion, the Department of Visual Arts & Music has made a very good start on course-level assessment. This re-evaluation of our evaluation process has led us to begin looking in depth at course content, master syllabi, and program content and structure.

2. How did this accomplishment impact the department assessment goals?

We have confidence that we can implement the process smoothly.

3. How did this accomplishment impact the course assessment goals?

N/A

4. How was this accomplishment related to last year’s assessment?

The previous assessment evaluated the portfolio presentations of our graduating seniors as part of our ARTS 496 class. The new assessment will give us more in-depth information and guide program development and evaluation.

5. How will this accomplishment relate to next year’s assessment?

This was a beginning; next year we plan for more courses to be evaluated. It will also inform our review of course master syllabi and course outcomes.

6. Reflecting upon your accomplishments, how would you improve the assessmentprocess?

The process seems to be working. Our goal is complete implementation.

7. What is working well and should not change?

All is working well. See above.

8. What was the department’s major assessment hurdle this year?

As I mentioned above, Professor Bedilion was instrumental in the development of our assessment. If there is any large hurdle, it will be to reexamine all course outcomes and make sure that individual syllabi accurately reflect the goals.

9. How did this hurdle impact the department assessment goals?

N/A


10. How did this hurdle impact the course assessment goals?

N/A

11. How was this hurdle related to last year’s assessment?

As mentioned previously, we had only the ARTS 496 assessment last year. The new assessment is much more thorough.

12. How will this hurdle be addressed in next year’s assessment?

No hurdles to be addressed; just complete the process.

13. Reflecting upon the hurdles you faced, how would you improve your assessmentprocess?

N/A

14. What would you change to provide the most impact on assessment in your de-partment? Explain and be as specific as you can.

No changes needed.


B. Department MFT Results

Not Applicable


Business Administration

A. Department Evaluation Report

1. What was the department’s major assessment accomplishment this year?

Completion of the program/course matrices.

2. How did this accomplishment impact the department assessment goals?

Too soon to say. When we begin to examine the results of the matrices, we will know which department goals, if any, need adjustment.

3. How did this accomplishment impact the course assessment goals?

Again, once we see the results of the matrices, we will be in a better position to review course assessment goals.

4. How was this accomplishment related to last year’s assessment?

Previously, assessment focused on the results of major field tests (MFT) completed by students in MGMT 479, the business capstone course. The MFT is administered as students are completing their degree requirements. The matrices should allow us to monitor students' progress toward the desired outcomes established in the matrices as they move through the four-year business curriculum.

5. How will this accomplishment relate to next year’s assessment?

When we have results from the newly established matrices, we will be in a good position to determine what future assessment should entail.

6. Reflecting upon your accomplishments, how would you improve the assessmentprocess?

For years we relied heavily upon MFT results as an assessment instrument. With the new assessment instruments only now in place, we cannot yet tell whether accomplishments have actually occurred. Time will tell.

7. What is working well and should not change?

The MFT is well suited to our needs. All of our required core courses are well covered on the MFT. The results indicate how well we are imparting an appropriate level of knowledge to students in these fundamental areas of business.

8. What was the department’s major assessment hurdle this year?

Creating the program/course matrices required careful thought. Very tight time constraints made the assignment especially challenging.


9. How did this hurdle impact the department assessment goals?

Department assessment goals have not changed.

10. How did this hurdle impact the course assessment goals?

Course assessment goals have not changed.

11. How was this hurdle related to last year’s assessment?

Creation of the program/course matrices was especially challenging, not only because of time constraints but also because the concept was new and daunting. By contrast, last year's assessment was built upon MFT results, a long-established assessment tool with which we are thoroughly familiar. The MFT has served the department's assessment needs well.

12. How will this hurdle be addressed in next year’s assessment?

With the course/program matrices in place, we will be better positioned next year to recognize their importance to assessment. So much effort went into creating the matrices that their value was given little thought. Next year should be different!

13. Reflecting upon the hurdles you faced, how would you improve your assessmentprocess?

Focus more on assessment throughout the student's four-year study of business rather than relying entirely upon MFT results. I believe the new assessment instruments being introduced should remedy this situation.

14. What would you change to provide the most impact on assessment in your de-partment? Explain and be as specific as you can.

Assessment is becoming an extremely complex and time-consuming program at Columbia College. Dr. Hendrickson, the college's assessment coordinator, now devotes half her time to assessment, and a support staff member assists her. What would be of value at the department level is release time granted to a faculty member to coordinate assessment activities.


B. Department MFT Results


4/10/2014 Program Workshop

https://www.programworkshop.com/6.0.0.0/home.aspx?skin=MMI&sc=I004D0053443A3A86153F2B5128A60F5324396D4626E8 1/1

DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: Business
Form Code: 4GMF
Institution: Columbia College (MO)
Cohort: Combined
Closed on: Combined

TOTAL TEST

Scaled Score Range Number in Range Percent Below

200 0 100

195-199 0 100

190-194 1 100

185-189 7 99

180-184 12 98

175-179 18 96

170-174 30 92

165-169 58 85

160-164 68 77

155-159 135 61

150-154 125 47

145-149 112 33

140-144 117 20

135-139 83 10

130-134 52 4

125-129 27 1

120-124 6 0

Total Test Scaled Score: Mean 151, Standard Deviation 13

Students responding to less than 50% of the questions: 2
Students in frequency distribution: 851
Students tested: 853

ETS protects the confidentiality of all test data. Copyright © 2012 Educational Testing Service. All rights reserved.
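As a sanity check on score tables like the one above, the reported mean and standard deviation can be approximately reconstructed from the binned frequency distribution using bin midpoints. The sketch below is an illustration only, not ETS's actual scoring procedure:

```python
# Approximate the reported Business MFT summary statistics (Mean 151, SD 13)
# from the binned distribution above, using each bin's midpoint.
# This is an illustration, not an official ETS computation.

bins = [  # (low, high, count) transcribed from the "Scaled Score Range" table
    (120, 124, 6), (125, 129, 27), (130, 134, 52), (135, 139, 83),
    (140, 144, 117), (145, 149, 112), (150, 154, 125), (155, 159, 135),
    (160, 164, 68), (165, 169, 58), (170, 174, 30), (175, 179, 18),
    (180, 184, 12), (185, 189, 7), (190, 194, 1),
]

n = sum(c for _, _, c in bins)                                   # 851 students
mean = sum((lo + hi) / 2 * c for lo, hi, c in bins) / n          # ~150.9
var = sum(((lo + hi) / 2 - mean) ** 2 * c for lo, hi, c in bins) / n
sd = var ** 0.5                                                  # ~12.9

print(f"n={n}, mean={mean:.1f}, sd={sd:.1f}")
```

The midpoint approximation lands within rounding distance of the reported values, which suggests the table was transcribed correctly.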



DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: MBA
Form Code: 4FMF
Institution: Columbia College (MO)
Cohort: Combined
Closed on: Combined

TOTAL TEST

Scaled Score Range Number in Range Percent Below

300 0 100

295-299 0 100

290-294 0 100

285-289 0 100

280-284 2 98

275-279 3 95

270-274 3 92

265-269 6 85

260-264 8 77

255-259 11 66

250-254 15 50

245-249 11 39

240-244 9 29

235-239 6 23

230-234 10 13

225-229 9 3

220-224 3 0

Total Test Scaled Score: Mean 249, Standard Deviation 15

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 96
Students tested: 96



Computer and Mathematical Sciences

A. Department Evaluation Report

1. What was the department’s major assessment accomplishment this year?

One of our major accomplishments this year was creating program-level learning outcomes for each of our four degree programs: mathematics, computer science, computer information systems, and management information systems. A second accomplishment builds on the first: once these program-level learning outcomes were delineated, we were able to complete the program-level matrices, which show to what degree (not at all, introduce, reinforce, or emphasize) the courses in the programs correlate to these outcomes (i.e., a curriculum map).

A third accomplishment was the program-level assessment plans. These plans describe the tools currently used, or those we hope to use, to assess students' knowledge of the program-level learning outcomes. They include timelines for administering the tools, collecting the data, analyzing the data, and reporting ideas for improving the teaching and learning process.

A fourth accomplishment was the creation of some course-level assessment instruments. For CISS 170, a general education requirement, an online grading system (Skills Assessment Manager: SAM) was implemented in all sections across all in-seat venues to assess students' knowledge of Microsoft Office products. For MATH 201 (and MATH 215): Calculus 1 (and Calculus 1a), 10 multiple-choice questions were created to assess students' knowledge of differential calculus. In addition, a course-level assessment plan was created for these two mathematics courses.

2. How did this accomplishment impact the department assessment goals?

Initially, the Academic Assessment Committee had devised a plan, approved by faculty governance, under which departments would create program-level learning outcomes, curriculum maps, and program-level assessment plans. In addition, departments were asked to create one course-level assessment plan and tool for a course within one of their programs. Our department chose to concentrate on MATH 201 (and MATH 215), as this first calculus course is a turning point for both our math majors and our computer science majors. We created the course-level assessment plan and an assessment tool (the aforementioned 10 multiple-choice questions), but this instrument has not yet been administered. We did not administer it because the Academic Assessment Committee and Academic Assessment Office asked us to shift our focus to the program level after members of the committee and office attended a Higher Learning Commission (HLC) Assessment Academy. At the Academy, it was determined that course-level work should wait until program-level assessment was addressed. As the focus is now at the program level, our assessment goals currently include evaluating and creating program-level assessment tools. At present we have only summative assessment tools for our programs, and we need to create formative ones. We plan to create these instruments during the summer assessment workshop and to begin implementing them and collecting data during the summer session.

3. How did this accomplishment impact the course assessment goals?

As mentioned, course-level assessment was put on hold as we concentrated on assessment at the program level. However, we will pilot the MATH 201 tool during the summer session on the main campus. We will also work with Cengage to collect the data from the SAM program, analyze the data, and determine whether we can improve the teaching and learning in the CISS 170 course.

4. How was this accomplishment related to last year’s assessment?

During AY 2012-2013 and AY 2013-2014, the primary assessment tools used for the math, computer science, and computer information systems programs were the MFT and the DPAF. We use only the DPAF for the management information systems program. These are the same methods we have used to assess our programs for many years. The MFT results for computer science and computer information systems since 2005 are provided in Tables 1 and 2. These results show that for both the Day and AHE venues, students on average have for the most part scored within one standard deviation of the national mean. For a five-year period our Day student average was actually above the national mean, but for the last few years we have been below it. Average scores for the Day venue have been higher than those for AHE, and AHE has not had an average above the national mean. Table 3 shows the aggregated data for each of the assessment indicators for the combined Day and AHE venues. Again, the student average is within one standard deviation of the national mean for each indicator, although below it. Because the venues were combined for that information, it cannot be determined whether student performance on each indicator differs across venues. It should be noted that students completing the computer information systems degree take the same MFT exam as those earning a degree in computer science. As these degrees are significantly different from each other, another culminating experience exam needs to be found or created to align more appropriately with the outcomes of that program. In addition, the management information systems program does not have a culminating experience exam.

The MFT results for mathematics in the Day program (this program is not offered in AHE) since 2005 are provided in Table 4. These results show that students have consistently scored within one standard deviation of the national mean; in fact, in 2013 the students scored slightly above it. The aggregated data for each of the assessment indicators for math are shown in Table 5. The students scored above the mean on all indicators except nonroutine problems. We should look into whether we can add more of these types of problems to our courses.
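The "Standard Deviation From National Mean" figures reported in Tables 1-5 are z-scores: the distance of a cohort's mean from the national mean, expressed in national standard deviations. A minimal sketch of the arithmetic follows; the national norms used below are hypothetical placeholders, not ETS-published values:

```python
# "Standard deviations from the national mean," as used in Tables 1-5, is a
# z-score. The national mean/SD below are hypothetical placeholders chosen
# only to illustrate the arithmetic.

def z_score(dept_mean: float, national_mean: float, national_sd: float) -> float:
    """Distance of the department mean from the national mean, in SD units."""
    return (dept_mean - national_mean) / national_sd

# A department mean of 142 against hypothetical norms (mean 150, SD 15):
print(round(z_score(142, 150, 15), 2))  # -0.53 (about half an SD below)
```

A negative value means the cohort averaged below the national mean, as in most of the AHE rows of Table 2.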


TABLE 1: Computer Science MFT for the Day Program

Day Program

Year   Mean   Std. Dev. from National Mean   Sample Size

2013 142 -0.48 5

2012 140.8 -0.52 11

2011 162.9 0.86 7

2010 159.8 0.64 5

2009 166.0 1.04 3

2008 NA 0.60 4

2007 NA 1.00 2

2006 141.2 -0.48 5

2005 159.0 0.64 3

TABLE 2: Computer Science MFT for the AHE Program

AHE Program

Year   Mean   Std. Dev. from National Mean   Sample Size

2013 138 -0.73 18

2012 134.6 -0.90 51

2011 131.8 -1.07 11

2010 133.5 -1.00 11

2009 131.60 -1.09 16

2008 n/a -1.10 33

2007 n/a -1.0 79

2006 130.97 -1.16 61

2005 130.40 -1.11 19

TABLE 3: Computer Science 2012-2013

Assessment Indicator   Mean Percent Correct   National Mean   Std. Deviations from the Mean

Programming and Software Engineering 40 48 -0.65

Discrete Structures and Algorithms 29 38.9 -0.97

Systems: Architecture/Operating Systems/Networking/Database   31   38.7   -0.85


TABLE 4: Mathematics MFT for the Day Program

Day Program

Year   Mean   Std. Dev. from National Mean   Sample Size

2013 156.6 0.01 5

2012 150.5 -0.31 4

2011 146.6 -0.53 5

2010 169.4 0.74 5

2009 141.5 -0.80 4

2008 158.1 0.16 9

2007 147.3 -0.50 7

2006 158.1 -0.61 3

TABLE 5: Mathematics 2012-2013

Assessment Indicator   Mean Percent Correct   National Mean   Std. Deviations from the Mean

Calculus 33 31.4 0.19

Algebra 41 34 0.92

Routine 39 32.3 0.80

Nonroutine 22 26.7 -0.72

Applied 40 34.7 0.66


5. How will this accomplishment relate to next year’s assessment?

Next year, we hope the program outcomes and the aligned formative and summative assessment tools we create during the summer workshop will help us collect more data that, when analyzed, will give us an idea of what our students know at the beginning, middle, and end of their programs.

6. Reflecting upon your accomplishments, how would you improve the assessmentprocess?

Creating instruments, implementing them, and analyzing the data will be a long and time-consuming process. We will need resources and support to ensure this process can be completed and succeeds in helping us improve the teaching and learning process.

7. What is working well and should not change?

The MFT for both computer science and mathematics gives the department a national comparison. It allows us to see whether our students are learning what students in the same programs across the nation are learning. We hope to continue giving these exams, but on a three-year cycle.

8. What was the department’s major assessment hurdle this year?

The major hurdle this year was learning what it means to assess at the program level. The fact that we did not have program outcomes made it more difficult to even begin the process. The learning curve was steep, but I believe the majority of the department now understands what we need to do.

9. How did this hurdle impact the department assessment goals?

Our goals changed from assessing individual courses to assessing our programs.

10. How did this hurdle impact the course assessment goals?

We know now that some course-level assessment will also be part of program-level assessment. We will be doing some course-level assessment in CISS 170 and MATH 201/MATH 215 in the coming year.

11. How was this hurdle related to last year’s assessment?

Last year, we only used the MFT to assess students.

12. How will this hurdle be addressed in next year’s assessment?

We will use the summer workshop to develop new assessment plans.

13. Reflecting upon the hurdles you faced, how would you improve your assessmentprocess?

We need everyone in the department to know and understand the new assessment plan. The more we know, the better we will be able to create tools, collect and analyze data, and improve the teaching and learning process for our programs.


14. What would you change to provide the most impact on assessment in your de-partment? Explain and be as specific as you can.

I believe we need more time, resources, and financial support to learn about the assessment process, make plans for our programs, and carry them out.


B. Department MFT Results



DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: Computer Science
Form Code: 4HMF
Institution: Columbia College (MO)
Cohort: Combined
Closed on: Combined

TOTAL TEST

Scaled Score Range Number in Range Percent Below

200 0 100

195-199 0 100

190-194 0 100

185-189 0 100

180-184 0 100

175-179 0 100

170-174 0 100

165-169 0 100

160-164 2 91

155-159 0 91

150-154 0 91

145-149 3 78

140-144 6 52

135-139 4 35

130-134 4 17

125-129 3 4

120-124 1 0

Total Test Scaled Score: Mean 139, Standard Deviation 10

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 23
Students tested: 23




DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: Mathematics
Form Code: 4AMF
Institution: Columbia College (MO)
Cohort: Home Campus Mathematics In Seat 12/13AY
Closed on: July 09, 2013

TOTAL TEST

Scaled Score Range Number in Range Percent Below

200 0 100

195-199 0 100

190-194 0 100

185-189 0 100

180-184 0 100

175-179 0 100

170-174 1 80

165-169 0 80

160-164 1 60

155-159 2 20

150-154 0 20

145-149 0 20

140-144 0 20

135-139 1 0

130-134 0 0

125-129 0 0

120-124 0 0

Total Test Scaled Score: Mean 157, Standard Deviation 13

Students responding to less than 50% of the questions: 1
Students in frequency distribution: 5
Students tested: 6



Criminal Justice and Human Services

A. Department Evaluation Report

1. What was the department’s major assessment accomplishment this year?

Our Department is composed of multiple programs (Criminal Justice, Human Services, Forensic Science). Our major accomplishment was to complete the program and course matrices as directed.

2. How did this accomplishment impact the department assessment goals?

Discussion of the elements of the matrices helped us to clarify and focus our assessment goals. Once we examine the results from the matrices, we can determine if any goals need adjustment.

3. How did this accomplishment impact the course assessment goals?

Discussion of the elements of the matrices helped us to clarify and focus our course assessment goals. Once we examine the results from the matrices, we can determine if any goals need adjustment.

4. How was this accomplishment related to last year’s assessment?

This year's process differs from last year's and will become more comprehensive in future years. In past years, our assessment reports emphasized test results in culminating experience courses and the activities and responsiveness of our feedback loop. The program and course matrices will help us establish a more comprehensive level of assessment.

5. How will this accomplishment relate to next year’s assessment?

This is the first year of a new process. We will build on each year’s results as we move forward.

6. Reflecting upon your accomplishments, how would you improve the assessment process?

It is difficult to suggest improvements to process or substance until we see a few years of results. Once we see results, we can suggest improvements.

7. What is working well and should not change?

The MFT in Criminal Justice works very well for our needs and should not change. This has been our primary assessment instrument in our culminating experience course for over ten years.

8. What was the department’s major assessment hurdle this year?

Our major hurdle was creating the program and course matrices for Forensic Science. Our past assessment activities relied upon an unsuitable instrument. We succeeded in locating a new instrument, but the process was time consuming.


9. How did this hurdle impact the department assessment goals?

It helped us to clarify and focus the goals specific to Forensic Science and helped with the matrices.

10. How did this hurdle impact the course assessment goals?

Once we solved the dilemma, it helped us to clarify and focus the course assessment goals.

11. How was this hurdle related to last year’s assessment?

The hurdle was a persistent problem mentioned in previous assessment reports.

12. How will this hurdle be addressed in next year’s assessment?

We will have results from our new instrument which will be more relevant and suitable.

13. Reflecting upon the hurdles you faced, how would you improve your assessment process?

Assessment should be a more comprehensive process. The new process, instruments, and matrices should assist us in conducting assessment on a more comprehensive basis.

14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.

There should be release time for a faculty member to coordinate assessment activities. Also, there should be staff support for data collection and analysis as needed.


B. Department MFT Results



DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: Criminal Justice
Form Code: 4GMF
Institution: Columbia College (MO)
Cohort: Combined
Closed on: Combined

TOTAL TEST

Scaled Score Range Number in Range Percent Below

200 0 100

195-199 0 100

190-194 0 100

185-189 4 98

180-184 8 95

175-179 15 88

170-174 25 77

165-169 28 64

160-164 29 51

155-159 17 43

150-154 29 30

145-149 30 17

140-144 19 8

135-139 10 4

130-134 5 1

125-129 2 0

120-124 1 0

Total Test Scaled Score: Mean 158, Standard Deviation 14

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 222
Students tested: 222



Education

A. Department Evaluation Report

1. What was the department’s major assessment accomplishment this year?

1. We completely overhauled our MLOs for every teacher certification course in order to align with the new MoSPE teaching standards.

2. We developed a major mid-preparation assessment package.

2. How did this accomplishment impact the department assessment goals?

1. We were able to see that we have some gaps. The largest gap is in the area of instructional technology. Therefore, we have targeted increasing the use of instructional technology in courses as a key goal over the next year, and developed necessary action steps, such as:

(a) All faculty will observe teachers in our PDS, Ridgeway Elementary, as they use technology during a lesson.

(b) All faculty will attend three monthly trainings in Spring 14 with CC's instructional technology specialists. (This training may continue depending on need.)

2. We now have a measure to examine the quality of our program at the early preparation level. We are not solely focused on end-of-program outcomes.

3. How did this accomplishment impact the course assessment goals?

1. Students in the teaching methods courses must now demonstrate competency in planning, delivering, and evaluating instruction that integrates technology.

2. The early preparation course outcomes are aligned with the mid-preparation assessment. We have to ensure that we are providing the instruction that will lead to success on the mid-preparation assessment.

4. How was this accomplishment related to last year's assessment?

1. Last year’s assessment did not have a focus on program improvement. Prior to the newassessment initiative, we primarily looked at end of program outcomes PRAXIS pass ratesand job placement rates. But because we have high rates of passing the PRAXIS and gettinga job, this did not lead to a focus on improvement. We did have some idea from DESEs state-wide survey of first-year teachers that graduates were reporting a need for more technologyknowledge during their preparation. However, only a small percentage of graduates completethe survey, so the anecdotal evidence was not persuasive on its own.

2. We previously had a simple mid-preparation assessment that was driven by DESE and not by our curriculum. The new assessment will allow us to focus on program improvement in addition to meeting the DESE data reporting mandates.


5. How will this accomplishment relate to next year's assessment?

1. We will re-examine our courses in light of the changes we are making. We hope that the technology gap we noted on the assessment matrices will be corrected.

2. We will use the mid-preparation assessment data to analyze gaps in training at the course level and the field level.

6. Reflecting upon your accomplishments, how would you improve the assessment process?

Having a technology solution that will help us aggregate and disaggregate data is imperative.

7. What is working well and should not change?

It is a whole-department effort. I don't think it would work nearly as well if some faculty members were disengaged from the process.

8. What was the department’s major assessment hurdle this year?

Mostly, not having enough time to devote to the planning and implementation processes. Also, wrapping our minds around a very complex assessment structure has been challenging. And not having access to an electronic data store for analysis limited our baseline measurements.

9. How did this hurdle impact the department assessment goals?

It has taken much longer than we would have liked to get to where we are now.

10. How did this hurdle impact the course assessment goals?

Given the time limitations, we developed the new course assessment goals but have struggled somewhat with the implementation piece.

11. How was this hurdle related to last year’s assessment?

This is difficult to answer. We have devoted significantly more time to assessment this year than last, but more time is still needed.

12. How will this hurdle be addressed in next year’s assessment?

We are hoping that the college has a data solution for the next year.

13. Reflecting upon the hurdles you faced, how would you improve your assessment process?

We may not be the key to improving our department assessment process. College infrastructure and investment in resources (including time for faculty) are necessary to overcome our challenges of time and data analysis limitations. As for the hurdle of assessment complexity, we now have support through the assessment office to address this issue.


14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.

We need data storage and analytics. Period.


B. Department MFT Results

Not Applicable.


History and Political Sciences

A. Department Evaluation Report

1. What was the department’s major assessment accomplishment this year?

1. Development of Program Level Assessment Plans and Program Learning Outcomes Matrices for each of the department's programs.

2. Development of a course-level assessment for HIST 112. During autumn 2013, the History and Political Science department discussed and began preparing a pre- and post-test for HIST 112: World History since 1500. The format chosen is a 70-question objective-answer test, given in the first week and again in the last week of class. It was administered in Week 1 of both Day HIST 112 sections. In late February 2014, Dr. Kessel (Department Chair) and Dr. Karr (instructor of HIST 112 and developer of the pre/post-test) met with members of the Assessment Office and discussed ways to make the test more useful through question coding and use of D2L for delivery.

3. In order to better assess our program-level outcomes in the History program, we began administering the History Assessment Test (HAT) as a pre-test to all students enrolled in HIST 294 on the Day campus in Fall 2013. This will, over time, create a body of data that will allow us to gauge improvement over the course of the last two years of study in the History major.

4. In American Studies, the department worked to develop a standardized test for a program-level assessment of the American Studies degree. The department was awaiting final word on the selection of testing software before uploading the exam.
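Once both administrations of the HIST 112 pre/post-test are in hand, one simple summary statistic is the mean paired gain per student. The sketch below uses invented placeholder scores (number correct out of 70); it illustrates one possible analysis, not a method the department has adopted:

```python
# Mean paired gain between a week-1 pre-test and a final-week post-test.
# All scores are invented placeholders (number correct out of 70 questions).

def mean_gain(pre: list[int], post: list[int]) -> float:
    """Average per-student improvement across paired pre/post scores."""
    assert len(pre) == len(post), "scores must be paired by student"
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

pre_scores = [28, 35, 31, 40]    # hypothetical week-1 results
post_scores = [41, 52, 44, 55]   # hypothetical final-week results
print(mean_gain(pre_scores, post_scores))  # 14.5
```

Pairing scores by student, rather than comparing class averages, keeps the gain measure meaningful even when enrollment changes between the two administrations.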

2. How did this accomplishment impact the department assessment goals?

Development of departmental assessment plans and learning outcomes matrices helped focus our thinking on which courses, and at what level (introductory, reinforcing, or emphasizing), our department's learning goals are being addressed, and helped us envision a roadmap for strengthening our assessment in the future. The department is on track to have a useful assessment regime for one of its General Education courses, which will mean that the History program practices assessment of a gateway course as well as its capstone 494 course. In addition, by implementing the HAT pre-test we have created a process to make the HAT data more meaningful and to better assess student learning in history electives. Implementing the HAT as a pre-test in HIST 294 will help us to better assess student learning over time, especially in the last two years when students are taking upper-level elective courses rather than general education survey courses.

3. How did this accomplishment impact the course assessment goals?

HIST 112: While no formal course-level assessment had existed prior to AY 2013/14, the department feels that over time, the Pre/Post-test may allow faculty to detect areas where existing practices work well and where improvements are needed. The spring semester is the first semester for course-level assessment, however, and faculty have not yet determined a base-level expectation regarding class performance. Moreover, the department has yet to agree on how best to use an additional assessment tool within HIST 112 to assess the two formal essays produced during the course.

4. How was this accomplishment related to last year’s assessment?

HIST 112: There was no course-level assessment during AY 2012/13.
HIST 294: The HAT pre-test is a new form of assessment for the program.
HIST 494: While the existing HIST 494 assessment test (HAT) was useful, it was also limited by a lack of randomization within chronological and geographical content areas. Dr. Compton worked with Ashley Gosseen in Online administration to address that limitation.

5. How will this accomplishment relate to next year’s assessment?

The development of an assessment plan for each program will allow the department to roll out new assessment measures, as well as apply new goals for existing measures as capstone courses are taught. The department will need to develop a rubric to evaluate PADM/POSC 495 senior theses and presentations, for example.

We will continue to use each of the changes in History that were implemented in the 2013-14 academic year. With this foundation of assessment in place, we will be able to focus on assessment practices beyond quantitative data, such as the evaluation of essays in HIST 112 and senior theses in HIST 494. We will also work to implement the HIST 294 HAT pre-test in all venues.

6. Reflecting upon your accomplishments, how would you improve the assessment process?

HIST 112:

• Build assessment around existing assignments and exams.

• Do not produce tacked-on extra assessment tools.

• Quandary: that means standardized tests and assignments in order to assure data that is viable across venues. Standardization goes against the grain of professional college instruction, particularly in the liberal arts.

HIST 494:

• The senior thesis is an important assessment tool and is required of this course in all venues. Our challenge moving forward will be to ensure that theses produced in all venues meet the same criteria for success.

7. What is working well and should not change?

The department feels that the MFT is a useful, albeit imperfect, assessment tool for political science students. The test covers the major fields of political science and provides some basis to compare our students to those of other institutions. Our ability to draw conclusions from the data is limited by the small number of students that take the test. In 2013, only three students from the Day campus took the exam.

Assessment of student learning in all the department's programs through completion of the senior thesis is an important tool for the program.

8. What was the department's major assessment hurdle this year?

Developing reasonable achievement goals for assessment instruments was a hurdle for all disciplines, especially for instruments that have yet to be developed and implemented. For example, the goal for POSC students was set as "students score at or above the national mean on the MFT." This year, two out of three students scored above the national mean, but one low-scoring student pulled the average down. Perhaps we need to reformulate our goals in a way that recognizes our frequently small N size.

HIST 494: The HAT administrative process remains somewhat unreliable: Dr. Karr must ask AHE/Online administration to inform him of upcoming HIST 494 offerings across all venues; this resulted in an absence of any HAT results whatsoever during fall 2013. The instructors for those courses must each configure them as web-assisted; the HAT must be individually loaded into each course; Dr. Karr must request a test date from each instructor, then set up the HAT around that date, then supply each instructor with a test password. At the end, he must also go into each individual HIST 494 D2L site, retrieve test results, and format them for distribution. It is an inefficient process.

9. How did this hurdle impact the department assessment goals?

The MFT assessment goal for POSC was not met in 2013.

HIST 494: The HAT is an internal assessment tool, and unlike the MFT exams, there is no national norm by which to gauge student success. While we have worked to reconfigure the exam and make it more comparable to the MFT tests in its content and randomization, it is still an inadequate instrument.

Our goal is to see scores on the HAT be at least 10 percent higher on the post-test in HIST 494 than on the pre-test given in HIST 294. Since the pre-test was offered for the first time in fall 2013, it will take some time to build up data to assess improvement between the pre- and post-tests.
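As an illustration only, the pre-/post-test comparison described above reduces to a simple percent-gain calculation. The sketch below uses hypothetical percent-correct scores (not actual HAT data) and reads "at least 10 percent higher" as a relative gain in the cohort mean:

```python
def mean(scores):
    """Average of a list of test scores."""
    return sum(scores) / len(scores)

def relative_gain(pre_scores, post_scores):
    """Relative improvement of the post-test mean over the pre-test mean."""
    pre_mean = mean(pre_scores)
    post_mean = mean(post_scores)
    return (post_mean - pre_mean) / pre_mean

# Hypothetical HAT percent-correct scores for one cohort.
hist294_pre = [42.0, 48.5, 39.0, 51.0]   # pre-test, taken in HIST 294
hist494_post = [50.0, 55.5, 44.0, 58.0]  # post-test, taken in HIST 494

gain = relative_gain(hist294_pre, hist494_post)
print(f"Mean gain: {gain:.1%}")          # goal: at least +10%
print("Goal met" if gain >= 0.10 else "Goal not met")
```

Because the pre- and post-test cohorts will initially be different students, such a comparison only becomes meaningful once the same cohort has been tracked from HIST 294 through HIST 494.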

10. How did this hurdle impact the course assessment goals?

Developing reasonable achievement goals for the HIST 112 course assessment instrument was a challenge.

11. How was this hurdle related to last year’s assessment?

HIST 494: The problems associated with faithfully administering the HAT in all HIST 494 classes in all venues restricted our ability to collect full data for this assessment tool in the 2012-13 school year.

In 2012, students from the Day campus, two nationwide sites, and two sections of online were tested in HIST 494 Historical Research and Methods. The results from last year's HAT on the Day campus indicate that our students are scoring approximately 50% correct on this test. Average total scores ranged from 35.53% to 50.28% correct. The small N sizes make it difficult to draw conclusions, but results were fairly consistent across venues, although MO2A was noticeably weaker than the rest. Students across venues had the strongest performance in United States history.

12. How will this hurdle be addressed in next year’s assessment?

POSC will revise its goal for MFT performance to "60% of students will score at or above the national mean."

HIST 494: Dr. Karr continues to work with the AHE staff to develop a better information flow to facilitate this process.

13. Reflecting upon the hurdles you faced, how would you improve your assessment process?

HIST 112: In addition to the new process of assessment for HIST 112, we also plan an assessment tool related to writing and analysis through the use of an essay. Determining how to assess this component in all HIST 112 courses across venues will be a challenge, but it will also improve our overall assessment. Developing the logistics for applying the objective assessment across venues will also be a challenge. The Assessment Office can help the department determine an appropriate sample of sections to be assessed. A mechanism for informing the department of the location and timing of offered sections and for assessment implementation must be established. Lessons from this process can be applied to future departmental course-level assessments.

HIST 494: We will continue to work on making the HAT more effective in lieu of a nationally normed test like the MFT. We are able to generate question-level data every time this test is administered, which could be analyzed to improve both the test itself and the program. In order to accomplish such a task we will need significant help or input from an expert at using spreadsheets and filtering through the information.

In addition, we continue to grapple with the need to review the senior theses written in HIST 494 sections across all venues. An adequate review of these would require a substantial time commitment from the Day faculty, which may prove to be a stumbling block to this process.
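The question-level analysis mentioned above need not wait on specialist spreadsheet expertise; a short script can tabulate per-question difficulty from an exported results file. This is a sketch only: the CSV layout and column names (student_id, question_id, correct) are assumptions, not the actual D2L export format:

```python
import csv
from collections import defaultdict

def item_difficulty(csv_path):
    """Compute the proportion of students answering each question correctly.

    Assumes a hypothetical CSV export with one row per student response:
    student_id, question_id, correct (1 or 0).
    """
    totals = defaultdict(int)    # attempts per question
    corrects = defaultdict(int)  # correct answers per question
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            qid = row["question_id"]
            totals[qid] += 1
            corrects[qid] += int(row["correct"])
    # Difficulty index: fraction correct; low values flag hard or flawed items.
    return {qid: corrects[qid] / totals[qid] for qid in totals}
```

Sorting the result in ascending order would surface the questions, and, once question-coding is in place, the content areas, where students struggle most.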

14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.

Please see responses above.


B. Department MFT Results


DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES

Test: Political Science
Form Code: 4HMF
Institution: Columbia College (MO)
Cohort: Political Science Day EV DE & All AHE campuses 12
Closed on: July 22, 2013

TOTAL TEST

Scaled Score Range   Number in Range   Percent Below
200                  0                 100
195-199              0                 100
190-194              0                 100
185-189              0                 100
180-184              0                 100
175-179              0                 100
170-174              0                 100
165-169              0                 100
160-164              1                 67
155-159              1                 33
150-154              0                 33
145-149              0                 33
140-144              0                 33
135-139              0                 33
130-134              1                 0
125-129              0                 0
120-124              0                 0

SUBSCORES

Subscore 1: U.S. Government and Politics
Subscore 2: Comparative Government and Politics
Subscore 3: International Relations

Scaled Score   Subscore 1         Subscore 2         Subscore 3
Range          Number   % Below   Number   % Below   Number   % Below
100            0        100       0        100       0        100
95-99          0        100       0        100       0        100
90-94          0        100       0        100       0        100
85-89          0        100       0        100       0        100
80-84          0        100       0        100       0        100
75-79          0        100       0        100       0        100
70-74          0        100       0        100       0        100
65-69          0        100       0        100       0        100
60-64          2        33        0        100       0        100
55-59          0        33        0        100       1        67
50-54          0        33        1        67        1        33
45-49          0        33        1        33        0        33
40-44          0        33        0        33        1        0
35-39          0        33        0        33        0        0
30-34          0        33        1        0         0        0
25-29          1        0         0        0         0        0
20-24          0        0         0        0         0        0

                          Mean   Standard Deviation
Total Test Scaled Score   151    15
Subscore 1                50     20
Subscore 2                44     10
Subscore 3                51     7

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 3
Students tested: 3

ETS protects the confidentiality of all test data.
Copyright © 2012 Educational Testing Service. All rights reserved.


Humanities

A. Department Evaluation Report

1. What was the department’s major assessment accomplishment this year?

All culminating experience courses were reviewed to ensure measurement tools are properly assessing program goals. Also, assessment strategies and program goals for English, Communications, and EAP were all reevaluated and revised by faculty during the June assessment workshop. Communications faculty created additional course assessment rubrics to ensure program outcomes are met.

2. How did this accomplishment impact the department assessment goals?

The assessment workshop helped the department make significant progress; all departmental programs now have revised and appropriate assessment goals and measures. Humanities faculty are now making steady progress, and all appear to understand the need for and purpose of creating clear program outcome goals and an assessment program to measure success.

3. How did this accomplishment impact the course assessment goals?

Each Humanities program now has clear assessment goals established by full-time faculty. That was not the case earlier in the academic year.

4. How was this accomplishment related to last year’s assessment?

The department is now using tools to assess progress earlier in students' academic careers. In previous years, students were only assessed during their culminating experience course, primarily via the Major Field Test or a departmentally created test.

5. How will this accomplishment relate to next year’s assessment?

The department may now move forward by implementing the new assessment tools and collecting relevant information. This feedback may then be used to determine program strengths and weaknesses and which college venues are meeting outcome goals more effectively, and will ultimately result in program modifications where appropriate.

6. Reflecting upon your accomplishments, how would you improve the assessment process?

The Assessment Office must work to regularly remind full-time faculty of the importance and relevance of meaningful assessment. Faculty understand course-level assessment because we methodically assess learning in each course. However, faculty need to understand how this course-level assessment may be used to improve program-level assessment.


7. What is working well and should not change?

The pretest/posttest used by Communications, the MFT used by English, and the original theses required in all culminating experience courses. These are effective measures but are insufficient when used alone.

8. What was the department’s major assessment hurdle this year?

A lack of clear understanding of the importance of meaningful assessment. Also, few full-time faculty really understood that assessment throughout an academic program was important.

9. How did this hurdle impact the department assessment goals?

Relatively little was accomplished prior to the workshop.

10. How did this hurdle impact the course assessment goals?

Relatively little was accomplished prior to the workshop.

11. How was this hurdle related to last year’s assessment?

The department’s assessment measures were not significantly changed.

12. How will this hurdle be addressed in next year’s assessment?

The department now has clear outcome goals and more meaningful tools for measuring those goals.

13. Reflecting upon the hurdles you faced, how would you improve your assessment process?

Make assessment a regular item on department meeting agendas.

14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.

Make assessment a regular item on department meeting agendas.


B. Department MFT Results


DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES

Test: Literature in English
Form Code: 4HMF
Institution: Columbia College (MO)
Cohort: Combined
Closed on: Combined

TOTAL TEST

Scaled Score Range   Number in Range   Percent Below
200                  0                 100
195-199              0                 100
190-194              0                 100
185-189              0                 100
180-184              0                 100
175-179              1                 97
170-174              1                 94
165-169              2                 87
160-164              0                 87
155-159              5                 71
150-154              6                 52
145-149              2                 45
140-144              3                 35
135-139              8                 10
130-134              0                 10
125-129              3                 0
120-124              0                 0

SUBSCORES

Subscore 1: Literature 1900 and Earlier
Subscore 2: Literature 1901 and Later
Subscore 3: Literary Analysis
Subscore 4: Literary History and Identification

Scaled Score   Subscore 1         Subscore 2         Subscore 3         Subscore 4
Range          Number   % Below   Number   % Below   Number   % Below   Number   % Below
100            0        100       0        100       0        100       0        100
95-99          0        100       0        100       0        100       0        100
90-94          0        100       0        100       0        100       0        100
85-89          0        100       0        100       1        97        0        100
80-84          0        100       0        100       1        94        0        100
75-79          2        94        2        94        1        90        0        100
70-74          0        94        1        90        0        90        1        97
65-69          1        90        0        90        0        90        0        97
60-64          0        90        4        77        4        77        1        94
55-59          5        74        7        55        4        65        3        84
50-54          4        61        1        52        3        55        5        68
45-49          4        48        3        42        6        35        4        55
40-44          4        35        4        29        4        23        6        35
35-39          6        16        7        6         3        13        1        32
30-34          3        6         0        6         3        3         6        13
25-29          2        0         1        3         1        0         3        3
20-24          0        0         1        0         0        0         1        0

                          Mean   Standard Deviation
Total Test Scaled Score   147    13
Subscore 1                47     13
Subscore 2                49     15
Subscore 3                50     14
Subscore 4                42     12

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 31
Students tested: 31

ETS protects the confidentiality of all test data.
Copyright © 2012 Educational Testing Service. All rights reserved.


Nursing

A. Department Evaluation Report

Academic Year: 2012-2013
Form Completed by: Linda S. Claycomb, M.S.N., R.N., Nursing Program Director/Department Chair
Sources of Evaluative Information:

1. Student performance on licensure exam (NCLEX-RN®)

2. Program completion rates

3. Percent Generic and Bridge students

4. HESI® Exit and Content testing data

5. Student evaluations of program

6. Part-time clinical faculty MSN progress

Analysis of Information:

Licensing exam results: The National Council of State Boards of Nursing (NCSBN), in conjunction with the Missouri State Board of Nursing (MSBN), provides data on the national NCLEX-RN pass rates for associate degree students (first time, US educated). The MSBN annual pass rates of Columbia College ADN students on their first attempt at passing the NCLEX-RN are shown in the following table:


Nursing Program Assessment 2010-2011 Academic Year

Page 1

Annual first-time NCLEX-RN pass rates for Columbia College graduates compared to the national mean.

Time period*   NCSBN Mean   Columbia Campus   Lake Ozark Campus
2005-06        88.0%        89.1%             NA
2006-07        84.8%        80.0%             62.8%
2008           86.7%        81.6%             74.5%
2009           85.7%        77.05%            95.2%
2010           87.0%        90.0%             68.0%
2011           84.84%       88.0%             73.53%
2012                        98.7%             97.8%
2013**
  1st qtr.     84.45%       95.0%             100.0%
  2nd qtr.     90.35%       100.0%            100.0%
  3rd qtr.     83.00%       100.0%            90.0%
  YTD          84.29%       98.0% (50/51)     97.05% (33/34)

*Note – Prior to 2008, the time periods were not directly comparable between NCSBN and MSBN because NCSBN data reflects a calendar year and MSBN data was previously collected from July 1 to June 30 each year. A calendar-year rate is provided beginning in 2008.

**Note – 2012 NCSBN quarterly rates are received on Education Program Summary Reports. Not all students from each graduating class may have taken the exam.

Both the Columbia campus and Lake of the Ozark campus programs exceeded the MSBN minimum requirement of 80% for NCLEX success. Both campuses exceeded 95% annual passing rates in 2012 and 2013.

Admission standards.

The nursing program eliminated the minimum composite score of 150 points as a nursing application requirement with the implementation of the TEAS V, which was predicted to decrease admission scores by several points. The nurse faculty have continued to track admission composite scores. The table below reflects average composite scores by campus. Since the average composite scores are well above 150, it is anticipated that nurse faculty will set a minimum composite application score this year. This will provide students with information to gauge their progress toward success in obtaining entrance into the nursing program.

Average composite score by campus:

Campus     December 2012 class   May 2013 class   December 2013 class   May 2014 class
Columbia   164.43                168.96           170.32                168.96
Lake       168.87                166.56           No class              168.35

Completion rates.

The nurse faculty has set a goal of 80% completion for students who enter the program. This benchmark reflects our experience that some students are unable to complete the program because of academic preparation, financial issues, personal obligations, or health issues. Completion data for the period 2005-2013 appears below. The increased program rigor associated with HESI® testing is contributing to increased attrition rates, but also to increased NCLEX-RN pass rates. The nurse faculty are continuing to trend attrition and are looking at curriculum adaptations that may be more advantageous to the retention of students.

Annual program completion rates 2005-2013.*

Year   Columbia Campus   Lake Ozark Campus   All Students
2005   89.3%             NA                  89.3%
2006   85.0%             84.2%               84.7%
2007   90.0%             73.7%               83.2%
2008   88.3%             95.0%               92.1%
2009   81.3%             73.3%               77.3%
2010   65.4%             65.6%               66.0%
2011   69.6%             74.0%               71.1%
2012   73.0%             69.0%               71.0%
2013   66.12%            65.11%              65.7%

*December and May graduating classes for each academic year.

Generic students: The program began at Columbia College as an LPN to ADN Bridge program and began enrolling generic students in 2005. Generic students have now become the majority of the students. Since 2008-2009, LPNs have composed between 9%-12% of the accepted students.

Percentage of generic and bridge (LPN) students accepted, academic years 2005-2012.*

Year        Generic Percent (Number)   Bridge Percent (Number)
2005-2006   75% (64/85)                25% (21/85)
2006-2007   78% (76/97)                22% (21/24)
2007-2008   87% (105/121)              13% (16/121)
2008-2009   91% (75/82)                9% (17/82)
2009-2010   88% (86/98)                12% (12/98)
2010-2011   88.6% (86/97)              11.3% (11/97)
2011-2012   92.0% (104/113)            6.0% (7/113)
2012-2013   90.7% (98/108)             9.3% (10/108)

*December and May graduating classes for each academic year.


HESI® testing: The HESI® Total Testing Content and Exit testing package was added to the program to provide information comparing the performance of Columbia College students to students nationally. The testing process provides reports to students and faculty on individual and class performance. The minimal acceptable score for a graduating nursing student is 850. The following table reflects progress in meeting the acceptable level as set by HESI®. Success on NCLEX-RN is predicted to be 95% when a student scores 850 on the HESI® Exit test.

Comparison of Exit HESI® to NCLEX-RN® Pass Rates.

Class               Avg. Content HESI   Avg. Exit HESI   Pass Rate
Dec-08  Columbia    757                 745              88%
        Lake        No class
May-09  Columbia    773                 725              66%
Dec-09  Columbia    809                 795              84%
        Lake        787                 798              75%
May-10  Columbia    862                 911              100%
        Lake        789                 790              59%
Dec-10  Columbia    819                 847              84.5%
        Lake        764                 843              67%
May-11  Columbia    876                 855              100%
        Lake        824                 859              84.7%
Dec-11  Columbia    922                 950              100%
        Lake        869                 882              93%
May-12  Columbia    904                 893              100%
        Lake        891                 891              90%
Dec-12  Columbia    953                 953              94.7%
        Lake        936.64              968              100%
May-13  Columbia    972.01              975.77           100%
        Lake        994.54              991.25           92.86%

Placement. The nursing program tracks placement of our graduates by class. Placement is considered employment in a professional nurse position or enrollment toward a baccalaureate degree. The following data is dependent on the student reporting employment following graduation.

Class           Columbia Percent   Lake Ozark Percent
May 2009        84.38%             87.0%
December 2009   89.0%              83.0%
May 2010        88.0%              63.0%
December 2010   100.0%             50.0%
May 2011        100.0%             69.0%
December 2011   84.0%              93.0%
May 2012        83.33%             53.3%
December 2012   100%               79%
May 2013        95.65%             71%


Program satisfaction measures. Program graduates take a written survey near the end of their final course in the sequence. Both quantitative and qualitative information is solicited. Overall program satisfaction has increased from 'good' to 'excellent' for the second consecutive year. There continues to be some student dissatisfaction with CCNN (interactive audiovisual conferencing) and national HESI testing, which is impacting 'overall satisfaction' and 'recommendation of the program to others'.

Graduate Exit Survey comparison of annual May rates on standardized questions. Student responses answering Excellent or Good are reported:

Survey question                                     May 10   May 11   May 12   May 13
Overall satisfaction with program as a whole        24%      92%      62%      60%
Satisfaction with clinical sites                    33%      84%      95%      68%
Satisfaction with instructional methods             12.5%    11%      62%      56%
Satisfaction with instructors (theory & clinical)   13%      88%      81%      55%
Would recommend program to potential students       8%       84%      53%      67%

Clinical instructor evaluations. Full-time and part-time clinical instructors are evaluated by students in each clinical nursing course each session. Achieved ratings are consistently greater than 95 percent. Occasionally there may be an instructor whose ratings are lower. Some instructors teach two sections of the same course in the same term and may receive very different scores from the sections. Faculty evaluate all clinical instructor evaluations and review those falling below 90%. The faculty have determined that there were no scores of concern under 90%, and they will continue to monitor and trend performance for subsequent sessions. No full-time faculty received lower than the 95th percentile.

Part-time clinical faculty MSN progress. The table below indicates progress toward achieving >50% of part-time clinical faculty with an MSN degree. Currently the percentage of part-time clinical faculty with MSN degrees for both campuses combined is 54.5%.

MSN status for part-time clinical instructors, Columbia and Lake Ozark campuses (total 10):

MSN: 5 (50%)
BSN: 5; two (2) BSN instructors are currently enrolled in an MSN program and scheduled to complete in 2014.


Use of this Information by the Nursing Program: Information including student evaluations, current developments in health care, current research, local nursing practice trends, comments from the Missouri State Board of Nursing, and accreditation standards continues to be used by faculty to update curriculum and processes in the nursing program. Process changes implemented have included admission requirements, standardized testing, item analysis of test questions, more high-level critical analysis questions on exams, a modified grading strategy, and NCLEX review courses. A required individualized remediation program for students falling below the acceptable range on HESI Content and Exit testing, increased use of practice tests, and virtual clinical case studies have also positively impacted Columbia College licensure pass rates. Program data will continue to be analyzed in light of the increased rigor of the 2013 NCLEX-RN test plan and the number of nursing programs with falling NCLEX-RN pass rates. Fortunately, Columbia College is not one of those schools.

RN to BSN Program: The nurse faculty are developing an online RN to BSN degree program that will be submitted to CAP this fall. It is anticipated that the degree will be approved through the governance structure and implemented in the 2014-15 academic year.

Recommendations and goals for program improvement:

1. Continue to monitor and trend the nursing curriculum, teaching modalities, and program admission criteria that will increase retention in the nursing program.

2. Continue to evaluate the recruitment and retention of clinical instructors and full-time faculty for the nursing program.

3. Increase the proportion of clinical instructors with master's degrees.

4. Successfully implement an online RN to BSN program.


B. Department MFT Results

Not Applicable


Psychology and Sociology

A. Department Evaluation Report

1. What was the department’s major assessment accomplishment this year?

For AY 2013-14 the primary assessment tools used were the MFT, the DPAF, and department faculty reviews. After the Assessment Committee developed a new assessment plan, the department started the process of investigating and implementing assessment tools that would assess class-level goals. The first class proposed for course-level assessment was PSYC 325 Quantitative Research Methods. In this course students would be given a written form of assessment that examines their understanding of statistics, quantitative research methods, and the American Psychological Association (APA) writing guidelines.

After a February HLC assessment workshop, it was discovered that the assessment plan adopted by the Assessment Committee was flawed in that it proposed starting assessment at the course level instead of the program level. Starting at the program level was strongly recommended because of the spot visit HLC will be conducting in the spring of 2015 as a result of the HLC Review report Columbia College received in AY 2012.

In response, the psychology department will be developing assessment tools that will assess critical thinking through oral communication, writing skills, and quantitative reasoning in the summer of 2014 at the Columbia College Summer Assessment Workshop (June 16-21).

2. How did this accomplishment impact the department assessment goals?

Understanding the differences and similarities between assessing the program and assessing courses was a major accomplishment by the department this year. It has provided a way to implement an assessment plan at a pace that is attainable and sustainable. The Psychology Department will start assessing courses that are offered in AY 2014-15, assessing on a 3-year rotation those courses offered on a regular basis and on a 4-year rotation those courses offered on a 2-year basis. This rotation will allow the department to evaluate the information collected and make changes to improve course and program offerings.

Also, three of the faculty in the department (G. Hendrickson, A. Mauxion, A. Tabatabai) are active on the assessment committee, which helps the department understand the schedule and plan for appropriate resources to be allocated in the future to assess day, evening, and online programs so that rigor and offerings are aligned to ensure student success within and outside of the Psychology, Sociology, and Anthropology majors.

3. How did this accomplishment impact the course assessment goals?

It set a schedule for planning future assessment cycles.

4. How was this accomplishment related to last year’s assessment?

The Psychology and Sociology department's leadership in the assessment process at Columbia College demonstrates how seriously we take the goals for improved assessment.


5. How will this accomplishment relate to next year’s assessment?

This year's accomplishment will allow next year's assessment to become a useful tool that will help direct and build the future of the department's majors and possible minors. The P&S department's leadership in assessment will ensure that a useful and meaningful assessment process evolves.

6. Reflecting upon your accomplishments, how would you improve the assessment process?

We do not know enough yet to answer this question. We will have to assess this to answer substantively.

7. What is working well and should not change?

The department is very adept at communicating successes, weaknesses, and new approaches. This healthy communication, which the department recognizes and values, should continue to be fostered.

8. What was the department’s major assessment hurdle this year?

The major hurdle was the lack of time allocated by faculty to the process. Administrative support for faculty time commitments to assessment is lacking. Most faculty are focused on student assessment, and turning the focus to departmental goals requires time and attention that needs to be scheduled and compensated outside of instructional preparation and delivery timeframes. Faculty skepticism about the goals of assessment is a further hurdle.

9. How did this hurdle impact the department assessment goals?

Getting faculty buy-in is challenging because there are no clear signs that this process will accomplish something useful. There is also some skepticism that the assessment model will be used to design a plug-and-play CBE model that will move the liberal arts tradition toward a market-based delivery system, pulling Columbia College away from its traditional liberal arts college status and toward a skills-based instructional model more useful for applied programs.

Few faculty were involved in the process this year. Next year there may be suggestions, which may include the addition of questions or of tools to evaluate programs.

10. How did this hurdle impact the course assessment goals?

N/A

11. How was this hurdle related to last year’s assessment?

N/A

12. How will this hurdle be addressed in next year’s assessment?

N/A


13. Reflecting upon the hurdles you faced, how would you improve your assessment process?

More information and advance planning are needed to involve faculty in the process of creating program assessments.

14. What would you change to provide the most impact on assessment in your department? Explain and be as specific as you can.

Several things would help convince faculty:

1. Models that demonstrate how assessment has improved the overall quality of instruction at other institutions.

2. Good, detailed examples of assessment processes with demonstrated program progress outcomes.

3. Compensation for work outside of the instructional work of teaching and student assessment.

4. A view of the long-term goals of the institution and how this process will make CC a better place to work and a stronger liberal arts institution.


B. Department MFT Results


1. Psychology


DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: Psychology
Form Code: 4GMF
Institution: Columbia College (MO)
Cohort: Combined
Closed on: Combined

TOTAL TEST

Scaled Score Range   Number in Range   Percent Below
200                         0               100
195-199                     1                99
190-194                     0                99
185-189                     2                98
180-184                     3                96
175-179                     2                95
170-174                    15                87
165-169                     9                81
160-164                    13                74
155-159                    22                61
150-154                    19                50
145-149                    20                38
140-144                    22                25
135-139                    31                 7
130-134                     6                 4
125-129                     5                 1
120-124                     1                 0

SUBSCORES
Subscore 1: Learning, Cognition, Memory
Subscore 2: Perception, Sensation, Physiology
Subscore 3: Clinical, Abnormal, Personality
Subscore 4: Developmental and Social

Scaled          Subscore 1       Subscore 2       Subscore 3       Subscore 4
Score Range   Number  % Below  Number  % Below  Number  % Below  Number  % Below
100              0      100       0      100       0      100       0      100
95-99            1       99       1       99       0      100       1       99
90-94            3       98       0       99       1       99       0       99
85-89            0       98       2       98       2       98       3       98
80-84            1       97       0       98       6       95       2       96
75-79            4       95       3       96       1       94       9       91
70-74           11       88      20       85      11       88       3       89
65-69            7       84      13       77      25       73      15       81
60-64           11       78      13       70      15       64      17       71
55-59           26       63      25       55      17       54      25       56
50-54           14       54      10       49      29       37      26       41
45-49           31       36      27       33      16       28       9       36
40-44           36       15      19       22      26       13      24       22
35-39           14        7      25        8       9        8      25        7
30-34            9        2       8        3       9        2       9        2
25-29            3        0       4        1       1        2       2        1
20-24            0        0       1        0       3        0       1        0

                          Mean   Standard Deviation
Total Test Scaled Score   151    14
Subscore 1                 51    14
Subscore 2                 52    14
Subscore 3                 54    14
Subscore 4                 52    14

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 171
Students tested: 171

ETS protects the confidentiality of all test data.
Copyright © 2012 Educational Testing Service. All rights reserved.


2. Sociology


DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: Sociology
Form Code: 4IMF
Institution: Columbia College (MO)
Cohort: DE Sociology Students 12/13 AY
Closed on: July 29, 2013

TOTAL TEST

Scaled Score Range   Number in Range   Percent Below
200                         0               100
195-199                     0               100
190-194                     0               100
185-189                     0               100
180-184                     0               100
175-179                     0               100
170-174                     2                88
165-169                     0                88
160-164                     3                71
155-159                     2                59
150-154                     2                47
145-149                     4                24
140-144                     2                12
135-139                     1                 6
130-134                     0                 6
125-129                     1                 0
120-124                     0                 0

SUBSCORES
Subscore 1: Core Sociology
Subscore 2: Critical Thinking

Scaled          Subscore 1       Subscore 2
Score Range   Number  % Below  Number  % Below
100              0      100       0      100
95-99            0      100       0      100
90-94            0      100       0      100
85-89            0      100       0      100
80-84            0      100       0      100
75-79            0      100       0      100
70-74            0      100       1       94
65-69            2       88       1       88
60-64            1       82       2       76
55-59            1       76       2       65
50-54            4       53       4       41
45-49            4       29       2       29
40-44            2       18       4        6
35-39            0       18       0        6
30-34            2        6       0        6
25-29            0        6       1        0
20-24            1        0       0        0

                          Mean   Standard Deviation
Total Test Scaled Score   152    12
Subscore 1                 48    11
Subscore 2                 51    11

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 17
Students tested: 17

ETS protects the confidentiality of all test data.
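The "Percent Below" column in these departmental summaries can be reproduced from the frequency counts, and the scaled-score mean can be approximated from range midpoints. A minimal sketch using the Sociology totals above; the rounding convention is an assumption inferred from the published values, not documented ETS practice:

```python
# (lower bound of scaled-score range, number of students in range),
# taken from the Sociology TOTAL TEST table; zero-count ranges omitted.
sociology_total = [
    (170, 2), (160, 3), (155, 2), (150, 2),
    (145, 4), (140, 2), (135, 1), (125, 1),
]

n = sum(count for _, count in sociology_total)  # 17 students tested

def percent_below(lower_bound):
    """Percent of students scoring below the given range's lower bound."""
    below = sum(c for lo, c in sociology_total if lo < lower_bound)
    return round(100 * below / n)

# Matches the table: 88% below 170, 71% below 160, 59% below 155, ...
for lo, _ in sociology_total:
    print(f"{lo}: {percent_below(lo)}% below")

# A midpoint-weighted mean over the 5-point ranges lands on the
# reported scaled-score mean of 152.
approx_mean = sum((lo + 2) * c for lo, c in sociology_total) / n
print(round(approx_mean))  # 152
```

The same calculation applies to the Psychology, Biology, and Chemistry summaries, with their own counts and totals.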


Physical and Biological Sciences

A. Department Evaluation Report

1. Biology

These two degrees are considered together because most data is collected without distinguishing between the two programs. The degrees are primarily differentiated by the number of chemistry courses taken, with B.S. students having a significantly deeper exposure to chemistry.

Academic Year: 2012-2013
Form Completed by: Dr. Frank Somer, Science Department Chair
Sources of Evaluative Information:

1. DPAF completed by instructor of BIOL 490 Senior Seminar

2. Major Field Test results

Analysis of Information: In the DPAF, the instructor observed some of the MLOs, mainly mechanical skills involved in producing and documenting a written report. There was no great distinction between the most successful and least successful students in the class, as all were deemed fairly average and were lacking in important areas, especially in critical analysis of scientific data and ideas. The instructor saw the course itself as an appropriate culminating experience, but found that the students didn't apply themselves accordingly and hence did not get all that they should have out of the course.

The last couple of years of MFT scores have been historically strong, and there was a significant fall-back this year for Day [from an average of 159 (64th percentile nationally) to 150 (39th percentile nationally)] and a slight fall-back for Evening [from 154 (50th percentile) to 152 (45th percentile)]. While it is noteworthy that Evening students outperformed Day students this year, it seems that this mostly reflects a very lackluster performance on the part of the Day students. Motivation is always a question mark on an exam where there is no penalty for a poor performance, and it seems unlikely that this group of students overall (and especially in Day) gave their best efforts in preparing for, and taking, the MFT. Contrary to expectations based on the more concentrated curriculum of B.S. students relative to B.A. students, the latter actually outscored the former this year, 153 to 150.

The area subscores were overall in the low 40s in terms of national percentile, with Evening again slightly outperforming Day. The Assessment Indicators were missing from the MFT data I received, so I cannot comment on these more specific topical areas.

In summary, this year represented a step back from the student performances of the past couple of years. This isn't too surprising, given that the last couple of years of MFT scores were historically strong, but of course we will strive to make future performances more reminiscent of the previous two years than of this year's. One change that has been made that might help in this regard is the reformatting of BIOL 490 Senior Seminar (new format taught beginning in Fall 2013), which should eliminate most of the redundancy that led to student dissatisfaction with the previous version of the course and could have hurt their motivation to excel on the MFT.


2. Chemistry

Academic Year: 2012-2013
Form Completed by: Dr. Frank Somer, Science Department Chair
Sources of Evaluative Information:

1. DPAF completed by instructor of CHEM 490 Senior Seminar

2. Major Field Test results

Analysis of Information: Seven students completed CHEM 490 Senior Seminar and took the chemistry MFT in 2012-2013. Six of these students were Forensic Science (Chemistry Track) majors, and only one was a Chemistry major.

The DPAF form indicates that all MLOs were observed in most of the students, though there was some variation in the range of abilities. Integration of skills and use of literature, while better than in some previous semesters, were cited as areas for potential improvement. The top student in the class was observed to use the content of the major to solve problems, to effectively communicate scientific ideas to others, to interpret scientific literature at an appropriate level, and to extend scientific concepts into ideas for future research. The weakest student had not retained as much of the major content and was seen to struggle in the same areas where the top student excelled, though it was suggested that some of this weakness might be topic-dependent and not reflective of overall ability. The instructor did deem the course a useful culminating experience for the students, helping to model lifelong learning in science as well as addressing such practical matters as resume writing and job applications.

The average MFT score for the class was at the 28th percentile nationally, down from the 35th percentile last year and identical to the result of two years ago (AY 2010-11). However, as previously noted, six of the seven students tested were Forensic Science majors, so these results have little to say about the B.A. in Chemistry. The one Chemistry major who was tested made the lowest score in the class, in the 7th percentile nationally. This clearly illustrates a major problem with using an assessment tool that has nothing riding on it from the student's perspective: this one Chemistry major was clearly the best student in the class and (based on my own experience and that of the other Chemistry faculty) knows much more chemistry than any of the other students. As the instructor explained in the DPAF, however, she had another final exam later that day (which was going to count toward a course grade), and hence simply didn't try her best on the MFT. Given that the only student in the major being assessed didn't give an honest effort on the exam, no meaningful statements about the major can be made on the basis of the results.

It should be noted that, starting next year, we will have an additional Chemistry degree: a B.S. degree with substantially expanded requirements compared to the current B.A. degree. This should improve student assessment scores and, we hope, provide a more consistently robust cohort of students in the Senior Seminar course, so that the results can be more reliably analyzed to improve the Chemistry program.


3. Environmental Science

Academic Year: 2012-2013
Form Completed by: Dr. Frank Somer, Science Department Chair
Sources of Evaluative Information:

1. DPAFs completed by instructor of ENVS 490: Senior Seminar

2. Major Field Test results

Analysis of Information: Four students completed ENVS 490 Senior Seminar in 2012-2013: two in Fall and two in Spring.

The DPAF forms indicated that both students in Fall 2012 were proficient in all the skills and competencies expected in the major. The two students in Spring 2013 were observed to be proficient in most of the expected areas, but were found to be lacking in their ability to interpret scientific data. In each of the two sections of the course, one student was described as excellent, and the other as capable but lacking in perspective and/or motivation. The two instructors seemed relatively positive about the course as a culminating experience. One noted that the addition of the MFT will make the course more useful for assessing the major, and that the recent revisions to the course (which include cross-listing with the Biology Senior Seminar, to be implemented beginning in Fall 2013) will also be beneficial in this regard.

It should be noted that there is no MFT in Environmental Science. The content of the last two subscore topics of the Biology MFT (Organismal Biology, and Evolution and Ecology), however, aligns well with the knowledge expected of ENVS majors and hence is appropriate for assessing this program. The results of the exam are consistent with this view, as the ENVS students scored slightly below the Biology students on the two sections (Cell Biology, and Molecular Biology and Genetics) where the latter have significantly deeper coursework, but higher on the two sections that align with their major content. It is interesting to note that the performance of the ENVS students on the two sections that align with their major was sufficient to make their overall scores on the Biology MFT slightly higher than those of the Biology students, whose course requirements provide much more comprehensive coverage of the exam material. The performance of the ENVS students on the Evolution and Ecology section was especially strong, ranking in the 63rd percentile of individuals and the 83rd percentile of institutional averages nationally. While the initial MFT results for ENVS majors are encouraging, in terms of both the appropriateness of the chosen subset of the Biology test as an assessment tool and the performance of the students, we will obviously need another year or two of data to draw conclusions and make recommendations for improvement on this basis.

One note about the DPAF form: it would be useful to have a blank for the number of students taking the culminating-experience course in the particular term under review. Currently, the only way I know the course enrollment (which is obviously very important for assessment purposes) is by looking it up in Datatel.


B. Department MFT Results

1. Biology


DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: Biology
Form Code: 4GMF
Institution: Columbia College (MO)
Cohort: OC/EVE/CJ BIO Trac/ENVS Biology In Seat 12/13 AY
Closed on: July 10, 2013

TOTAL TEST

Scaled Score Range   Number in Range   Percent Below
200                         0               100
195-199                     0               100
190-194                     0               100
185-189                     0               100
180-184                     0               100
175-179                     0               100
170-174                     2                94
165-169                     0                94
160-164                     2                89
155-159                     9                63
150-154                     5                49
145-149                     9                23
140-144                     4                11
135-139                     3                 3
130-134                     1                 0
125-129                     0                 0
120-124                     0                 0

SUBSCORES
Subscore 1: Cell Biology
Subscore 2: Molecular Biology and Genetics
Subscore 3: Organismal Biology
Subscore 4: Population Biology, Evolution, and Ecology

Scaled          Subscore 1       Subscore 2       Subscore 3       Subscore 4
Score Range   Number  % Below  Number  % Below  Number  % Below  Number  % Below
100              0      100       0      100       0      100       0      100
95-99            0      100       0      100       0      100       0      100
90-94            0      100       0      100       0      100       0      100
85-89            0      100       0      100       0      100       0      100
80-84            0      100       0      100       0      100       0      100
75-79            0      100       0      100       0      100       0      100
70-74            1       97       1       97       1       97       1       97
65-69            2       91       2       91       1       94       1       94
60-64            1       89       6       74       3       86       5       80
55-59            6       71       6       57       5       71      10       51
50-54            7       51       5       43       8       49       3       43
45-49           10       23       6       26       6       31       6       26
40-44            5        9       5       11       3       23       3       17
35-39            2        3       2        6       5        9       3        9
30-34            1        0       2        0       1        6       3        0
25-29            0        0       0        0       2        0       0        0
20-24            0        0       0        0       0        0       0        0

                          Mean   Standard Deviation
Total Test Scaled Score   150     9
Subscore 1                 51     9
Subscore 2                 51    10
Subscore 3                 49    11
Subscore 4                 51    11

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 35
Students tested: 35

ETS protects the confidentiality of all test data.
Copyright © 2012 Educational Testing Service. All rights reserved.


2. Chemistry


DEPARTMENTAL SUMMARY OF TOTAL TEST AND SUBSCORES
Test: Chemistry
Form Code: 4HMF
Institution: Columbia College (MO)
Cohort: Chemistry-OC 12/13 AY
Closed on: July 09, 2013

TOTAL TEST

Scaled Score Range   Number in Range   Percent Below
200                         0               100
195-199                     0               100
190-194                     0               100
185-189                     0               100
180-184                     0               100
175-179                     0               100
170-174                     0               100
165-169                     0               100
160-164                     0               100
155-159                     0               100
150-154                     1                86
145-149                     0                86
140-144                     1                71
135-139                     3                29
130-134                     1                14
125-129                     1                 0
120-124                     0                 0

SUBSCORES
Subscore 1: Physical Chemistry
Subscore 2: Organic Chemistry
Subscore 3: Inorganic Chemistry
Subscore 4: Analytical Chemistry

Scaled          Subscore 1       Subscore 2       Subscore 3       Subscore 4
Score Range   Number  % Below  Number  % Below  Number  % Below  Number  % Below
100              0      100       0      100       0      100       0      100
95-99            0      100       0      100       0      100       0      100
90-94            0      100       0      100       0      100       0      100
85-89            0      100       0      100       0      100       0      100
80-84            0      100       0      100       0      100       0      100
75-79            0      100       0      100       0      100       0      100
70-74            0      100       0      100       0      100       0      100
65-69            0      100       0      100       0      100       0      100
60-64            0      100       0      100       0      100       0      100
55-59            0      100       0      100       2       71       1       86
50-54            2       71       0      100       0       71       0       86
45-49            0       71       1       86       0       71       0       86
40-44            2       43       1       71       2       43       1       71
35-39            1       29       3       29       0       43       1       57
30-34            1       14       2        0       3        0       3       14
25-29            1        0       0        0       0        0       1        0
20-24            0        0       0        0       0        0       0        0

                          Mean   Standard Deviation
Total Test Scaled Score   138     8
Subscore 1                 40     9
Subscore 2                 39     6
Subscore 3                 42    10
Subscore 4                 37    10

Students responding to less than 50% of the questions: 0
Students in frequency distribution: 7
Students tested: 7

ETS protects the confidentiality of all test data.
Copyright © 2012 Educational Testing Service. All rights reserved.


Part III.

ETSPP Results

2013 ETSPP Report Summary

Columbia College uses the ETSPP to assess learning outcomes (critical thinking, reading, writing, math, humanities, social sciences, and natural sciences) in the General Education curriculum (including general education courses taken in transfer).

The ETSPP was administered to a total of 211 seniors (134 Day, 24 Nationwide, and 53 Online). Day numbers are comparable to 2012 with respect to the Total Means. The Nationwide campus Total Mean is higher than in 2012 and at the average U.S. mean for 2013. The Online campus has a lower mean than both 2012 and the 2013 U.S. mean.

The 2011 ETSPP Report Summary for Columbia College noted that administering both the MFT and the ETSPP in capstone courses took up too much instructional time; therefore, the ETSPP was administered only in capstone courses of majors that do not use the MFT, and Nationwide enrollments in these classes were very small. The decision to continue this practice may explain the insufficient data for analysis of Nationwide and Online seniors.

A. Senior ETSPP Results

                     U.S.     Day (134)          Nationwide (24)    Online (53)
                              2012      2013     2012      2013     2012      2013
Total Mean           447.8    450.15    445.34   429.00    445.00   446.34    437.31
Critical Thinking    112.8    114.31    112.98   –         –        112.34    –
Reading              119.0    121.26    118.84   –         –        120.94    –
Writing              114.9    115.46    114.77   –         –        115.26    –
Mathematics          114.2    113.03    112.43   –         –        110.70    –
Humanities           115.7    118.17    117.57   –         –        118.74    –
Social Sciences      114.4    116.45    114.11   –         –        114.79    –
Natural Sciences     116.1    117.91    116.25   –         –        116.92    –
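For context, the three 2013 venue means above can be combined into a single enrollment-weighted institutional mean. This is an illustrative calculation, not a figure from the ETS report:

```python
# Venue: (number of seniors tested, 2013 Total Mean), from the table above.
cohorts_2013 = {
    "Day":        (134, 445.34),
    "Nationwide": ( 24, 445.00),
    "Online":     ( 53, 437.31),
}

n_total = sum(n for n, _ in cohorts_2013.values())  # 211 seniors
weighted_mean = sum(n * m for n, m in cohorts_2013.values()) / n_total

print(f"{weighted_mean:.1f}")  # 443.3
```

The weighted 2013 mean of roughly 443.3 sits below the 447.8 U.S. mean shown in the table, driven largely by the Online cohort.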

B. ETSPP Day

Day seniors scored above the national means in Critical Thinking, Humanities, and Natural Sciences. Scores were almost five points lower than the 2012 scores.


C. ETSPP Summary and Recommendations

The ETSPP results are difficult to interpret. Movements above and below the mean with respect to a given year, or apparent trends, could be artifacts of the low Ns relative to Columbia College enrollments (Day approx. 950, Nationwide approx. 10,000, and Online approx. 16,000; from: http://www.ccis.edu/day/about/factsheet.asp).

Administering the ETSPP at Nationwide campuses continues to be problematic. The Assessment Committee proposed a new assessment plan to be implemented over the 2013-14 academic year.

While math scores on the ETSPP appear to be low compared to all other skill and content areas, they are within one standard deviation of the mean. The Assessment Committee will need to analyze these results further with an eye to making recommendations about curricular change. This work is timely; the College's general education curriculum is due for review.
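The "within one standard deviation" observation can be checked against the combined skills-subscore statistics in the ETSPP Summary of Scaled Scores. A minimal sketch; the choice of baseline (the average of the other three skills subscores) is illustrative, not the committee's stated method:

```python
# Skills-subscore means from the Summary of Scaled Scores (combined cohort):
# critical thinking, reading, writing. Math mean and SD from the same table.
other_skill_means = [112.24, 118.53, 114.49]
math_mean = 111.45
math_sd = 5.80

gap = sum(other_skill_means) / len(other_skill_means) - math_mean
print(f"math trails the other skills by {gap:.2f} points "
      f"({gap / math_sd:.2f} standard deviations)")

# Consistent with the report's claim: the gap is under one SD.
assert gap / math_sd < 1.0
```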

The complete 2013 ETSPP data report is available in the Assessment Office.


ETSPP Summary of Scaled Scores



ETS® Proficiency Profile
Summary of Scaled Scores
To show the ability of the group taking the test

Columbia College (MO)
Cohort Name: Combined
Close Date: Combined
Student Level: All
Abbreviated Form
Test Description: Combined
Number of students tested: 177
Number of students included in these statistics: 172
Number of students excluded (see roster): 5

                    Possible     Mean     95% Confidence     Standard    25th     50th     75th
                    Range        Score    Limits* for Mean   Deviation   Pctile   Pctile   Pctile
Total Score         400 to 500   442.92   441 to 445         16.63       431      442      453

Skills Subscores:
Critical Thinking   100 to 130   112.24   111 to 113          6.06       107      112      117
Reading             100 to 130   118.53   117 to 120          7.11       113      119      124
Writing             100 to 130   114.49   113 to 116          4.33       113      114      118
Mathematics         100 to 130   111.45   110 to 113          5.80       108      110      114

Context-Based Subscores:
Humanities          100 to 130   116.95   116 to 118          6.44       112      118      123
Social Sciences     100 to 130   113.49   112 to 115          6.43       108      112      118
Natural Sciences    100 to 130   116.05   115 to 117          6.10       112      117      121

*The confidence limits are based on the assumption that the questions contributing to each scaled score are a sample from a much larger set of possible questions that could have been used to measure those same skills. If the group of students taking the test is a sample from some larger population of students eligible to be tested, the confidence limits include both sampling of students and sampling of questions as factors that could cause the mean score to vary. The confidence limits indicate the precision of the mean score of the students actually tested, as an estimate of the "true population mean" - the mean score that would result if all the students in the population could somehow be tested with all possible questions. These confidence limits were computed by a procedure that has a 95 percent probability of producing upper and lower limits that will surround the true population mean. The population size used in the calculation of the confidence limits for the mean scores in this report is 172.

Reports based on a sample of fewer than 50 test takers are representative of the performance of that sample only. Reports based on fewer than 50 test takers should not be considered representative of the larger group of like students, and inferences or generalizations about the larger population or subgroup should not be made based on such small samples.

ETS protects the confidentiality of all test data.
Copyright © 2012 Educational Testing Service. All rights reserved.
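The starred confidence limits fold in both student sampling and question sampling, so they cannot be reproduced exactly from the summary statistics alone. A naive student-sampling-only interval (mean ± 1.96·s/√n), however, lands close to the published 441-to-445 limits for the Total Score:

```python
import math

# Total Score statistics from the Summary of Scaled Scores above.
n, mean, sd = 172, 442.92, 16.63

# Naive 95% CI for the mean, accounting for student sampling only;
# ETS's published limits also include question-sampling variance.
half_width = 1.96 * sd / math.sqrt(n)
lower, upper = mean - half_width, mean + half_width

print(f"{lower:.1f} to {upper:.1f}")  # 440.4 to 445.4
```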


ETSPP Summary of Proficiency Classifications



ETS® Proficiency Profile
Summary of Proficiency Classifications
To show how many students are proficient at each level

Columbia College (MO)
Cohort Name: Combined
Close Date: Combined
Student Level: All
Abbreviated Form
Test Description: Combined
Number of students tested: 177
Number of students included in these statistics: 172
Number of students excluded (see roster): 5

Skill Dimension        Proficient   Marginal   Not Proficient
Reading, Level 1          66%         13%          22%
Reading, Level 2          31%         22%          48%
Critical Thinking          5%          8%          87%
Writing, Level 1          58%         31%          11%
Writing, Level 2          15%         31%          54%
Writing, Level 3           5%         19%          77%
Mathematics, Level 1      27%         32%          41%
Mathematics, Level 2      17%         15%          67%
Mathematics, Level 3       4%         11%          85%

The skills measured by the ETS® Proficiency Profile test are grouped into proficiency levels - three proficiency levels for writing, three for mathematics, and three for the combined set of skills involved in reading and critical thinking. The table and graph show the number and percentage of students who are proficient, marginal, and not proficient at each proficiency level in reading and critical thinking, writing, and mathematics. A student classified as marginal is one whose test results do not provide enough evidence to classify the student either as proficient or as not proficient. See the User's Guide for more information about these classifications, including a list of the specific skills associated with each proficiency level in each skill area.

ETS protects the confidentiality of all test data.
Copyright © 2012 Educational Testing Service. All rights reserved.


ETSPP Scaled Score Distributions


4/10/2014 Program Workshop

https://www.programworkshop.com/6.0.0.0/home.aspx?skin=MMI&sc=I005911624A414073094D3B5B2EAD15401845B3C0902D 1/2

ETS® Proficiency Profile

Scaled Score Distributions

Skills Subscores

Columbia College (MO)

Cohort Name: Combined

Close Date: Combined

Student Level: All

Test Description: Abbreviated, Combined

Number of students tested: 177

Number of students included in these statistics: 172

Number of students excluded (see roster): 5



Part IV.

Summer 2014 Assessment Workshop Report

Overall conclusions

The summer 2014 assessment workshop was a positive step toward building an assessment culture at Columbia College. Through hard work, concentrated workdays, and faculty openness to thinking about how their programs develop student outcomes, assessment tools were created that can be implemented in the fall of 2014.

An unexpected and welcome consequence of spending five days together was that faculty from other departments and programs had opportunities to discover and communicate common goals with respect to outcomes (e.g., writing/communication skills, capstone projects), common hurdles in assessing difficult outcomes (e.g., critical thinking, ethics), and rubrics that apply across student development. Through discussions of the different types of rubrics that were either in use or being developed, departments saw tools that could be applicable to them and were able to share tool forms. After five days of focused communication and shared frustration in developing measurable outcomes and rubrics that could assess student development across program curricula, communication channels were opened and renewed. The workshop presenter, Jennifer Fager, noted this communication and collegiality as remarkable, as it was directly related to the improvement of the assessment tools over the five days.

Concerns were expressed about the implementation of assessment tools across modalities. Faculty were not sure how assessment information would be collected or how it would be returned to the assessment office for organization. Dr. Cunningham was present at the workshop when this issue was discussed but could not address it at the time. He later submitted an e-mail message with the following information, which was read to the faculty the next day for clarification:

”I started to chime in at this morning's session, but the conversation went another direction. Please take every opportunity to allay the faculty's concern about disseminating and collecting assessment information from AHE (EV, NW & DE). Disseminating and collecting information from all corners of our division on a regular basis is what we do. Full-time faculty (FTF) do not need to worry about who is teaching which course where and how they are going to contact them. As long as the FTF deliver quality instruments, with clear instructions for implementation, and I know they will, all will be fine. As Jennifer and Dr. Dalrymple alluded, this is not that hard. FTF have lots to do, but worrying about how their work will get out to AHE should not consume their time or emotional energy.”

Faculty welcomed this information and proceeded with tool development to assess programs across all courses and modalities. Another concern was the lack of General Education assessment development at this workshop. A small group of faculty (Dr. Mauxion, Dr. Shlemper, and Dr. Felts) worked together to form a plan to address the lack of assessment tools for the General Education core, the General Studies program, and the Honors Program. They plan to follow up with other faculty and Dr. Smith as needed this summer to prepare for the fall faculty conference in August. Their progress and work will be documented in the assessment office report for AY 2013/14.


Topics covered by presentations

• Evidence of Student Learning: where to look within programs (coursework, projects, examinations, co-curriculars) and how to think about assessment across different modalities

• Rubrics: what they are, how to develop them, and how to implement them; using accrediting bodies, professional organizations, and other institutions to find formats that are usable within disciplines, and looking at other disciplines for unexpected insights into problems encountered while developing assessment rubrics

• Reasonable Assessment Timelines: not every student, every class, every time; look at spreading assessment out over outcome levels, modalities, and locations. Course assessment should come in the future, once program assessment analysis completes a few cycles and results indicate which areas and courses to assess.

All presentations can be accessed at this SharePoint site:

https://sharepoint.ccis.edu/Groups/AcademicAssessment/Shared%20Documents/Forms/AllItems.aspx

Help given to faculty

Every person attending the workshop was active in developing program assessment plans and creating assessment tools. During the first two days there was great variability in how far along programs were in planning their assessment and in thinking about how they would implement their assessment tools. This variability caused frustration for faculty who had worked continuously on their program assessment plans prior to the workshop and for those who had worked diligently during the December break to meet the old assessment timeline. In response, more time was allocated for one-on-one sessions between programs and the assessment coordinator or workshop presenter. By Wednesday, faculty were asking for more time to work within programs and less time sharing progress or insights. The schedules were adapted to meet these requests.

Also, the workshop presenter broke her foot and requested to leave early on Friday to get medical attention at home. This request was granted, and in return the workshop presenter offered to review all of the program plans and provide tailored feedback to each program. The plans will be sent on 6/27/14. As a result, the final Workshop Summary will be submitted to Dr. Smith upon receipt of the feedback.

Examples of Types of Rubrics Developed in Workshop

• Examples of rubrics to be administered fall 2014 (available in Appendix A)

– History

– Psychology

– Education

– Art


– Math

– Nursing

• Examples of Program Learning Outcomes Plans

– History

– Business

Assessment Workshop 2014 Survey Summary

A survey allowing faculty to provide feedback to the assessment office about the workshop was administered through Survey Monkey. Ten questions collected information about content, improvements, and suggestions for future assessment workshops, should they be offered.

The results of the faculty survey for all questions follow. Of the 22 responses (collected by 6/25), 18 (81%) indicated they would attend an assessment workshop next year (Question 9).


2014 Assessment Workshop Survey Results

It appears 52% of workshop attendees indicated they learned more about assessment at the workshop, while 33% already had a working knowledge of assessment in general; 14% of workshop participants indicated the workshop was of little to no value to them.


It appears 90% of workshop participants indicated the workshop was helpful to the college, 85% found it helpful to their program, 75% to their department, and 70% to them as faculty. Notably, 45% believe the assessment workshop will benefit their students. This question was not required; twenty of twenty-one responded.


It appears 71.4% of workshop participants indicated that the speaker was helpful to their completion of workshop tasks, while 28.6% indicated they would have accomplished the tasks without the speaker's involvement.


It appears 86% of participants see themselves as active in or vital to the assessment culture of Columbia College, while 14% indicated they are complying with new assessment tasks simply to keep HLC happy.


Most participants focused mainly on their program assessment plans, while 33% focused on outcomes and/or rubrics.


The feedback above indicates a majority of attendees wanted more examples and clearer expectations from this workshop. Many suggested a fixed schedule presented on the first day of the workshop and concrete examples from other institutions. Nineteen percent suggested compacting the workshop into fewer days, while 10% suggested another speaker; 10% indicated more faculty should have attended, and 14% wanted a room with a window and/or coffee offered all day for attendees.


It appears 80% of attendees feel confident or completely confident about the tools, plans, and timelines they created at the workshop; 5% still do not feel confident about what they created, and 14% are unsure.


It appears 43% of attendees suggested Data Analysis and Reporting should be emphasized at next year's workshop, while 14% would like an emphasis on Rubric Writing; 15% feel there should be focused attention on Graduate and General Education programs, and 10% are not sure there should be a conference at all.


Faculty overwhelmingly indicated they would attend a workshop next year.


Feedback varied from accommodations to length of workdays.


Part V.

AY 2012/13 Annual Assessment Summary

History
