
EXECUTIVE SUMMARY OF THE ASSESSMENT OF STUDENT LEARNING 2018-2019


Table of Contents

I. Introduction

II. Strategic Initiatives for 2018-19

III. Assessment Updates

IV. Summary Results of Assessment Performance 2018-19

1. Assessment Rubric

2. Assessment Reflection

3. Direct Student Learning Assessment Results

4. Direct Assessment Results: Undergraduate Learning Outcomes

5. Direct Assessment Results: Graduate Learning Outcomes

6. What do we do with Assessment Results?

7. Assessment of Student Learning Goals for 2019-20

Appendix A. Undergraduate Learning Outcomes Mapping to the Curriculum

Appendix B. Assessment Rubric

Appendix C. Direct Assessment Results for each QuEST Requirement


Executive Summary of the Assessment of Student Learning 2018-2019

Kate Oswald Wilkins, Director of Assessment

I. Introduction

The 2018-19 academic year marked a number of changes, improvements, and challenges in the assessment of student learning. This report summarizes the primary assessment efforts accomplished during the academic year, assessment performance in key areas, and goals for the 2019-20 academic year.

II. Strategic Initiatives for 2018-19

1. Revision to course learning objectives. Faculty reviewed and revised all course objectives for all undergraduate and graduate programs. After dean approval, all course objectives will be maintained within AEFIS and will automatically populate within the AEFIS tools tab of the Canvas course. Currently, approximately 49% of objectives have been entered into AEFIS.

2. Curriculum mapping. All department chairs and program directors received training on curriculum mapping, and deans determined a timeline for each department/program to complete curriculum maps for its majors/programs.

a. All QuEST courses are mapped to Undergraduate Learning Outcomes in AEFIS.

b. All courses (both graduate and undergraduate) containing assessment measures are mapped to their respective program learning outcomes and to either the Undergraduate Learning Outcomes or the Graduate Learning Outcomes in AEFIS.

3. Assignment linking for scheduled 2018-19 QuEST and academic assessment completed. For assignments graded within Canvas, the linking process ensures that AEFIS aggregates student scores on assessment measures and provides the aggregated scores by specific program or by institutional-level learning outcomes (ULOs/GLOs). Currently, approximately 66 percent of undergraduate programs (including QuEST) and approximately 58 percent of graduate programs (total of approximately 65 percent of all programs) are linked.

4. Aggregated institutional-level student learning data.

a. As a result of campus-wide assignment linking, for the first time we are able to aggregate institution-level assessment data. The results appear later in this report.

5. Program-specific student learning data for 2018-19 in AEFIS.

a. School deans, department chairs, program directors, and administrative assistants can view assessment data by program for 2018-2019 (via the main menu/Report Dashboard/Assessment reports). For specific directions on how to run reports, see the deans/chairs assessment manual available on the assessment website under resources.

6. AEFIS as the central repository for assessment activities.

a. Deans used the institutional assessment rubric to score academic departments’ assessment activities; this process occurred within the AEFIS tool.


b. Department chairs and program directors completed the annual assessment reflection survey within AEFIS.

c. Additionally, several academic departments used the practicum tool and the survey tool to collect student learning data.

III. Assessment Updates

1. AEFIS Training.

a. Department chairs, program directors, and administrative assistants. The assessment office continues to work with department chairs/program directors and their administrative assistants in 2019-20. The focus is on maintaining their curriculum mapping within AEFIS and continuing to link assessment assignments for 2019-20 into AEFIS.

b. Faculty. Beginning in the 2019-20 academic year, assessment personnel are training department faculty and QuEST faculty to do their own assignment linking for assessment reporting purposes. Faculty involvement is essential to the accuracy of assignment linkages and to faculty engagement in meaningful assessment. Continued dean support helps to ensure compliance with assessment expectations.

2. 2018-19 data analysis by student characteristics. We are able to filter student learning data and analyze results by student characteristics (e.g., SAT/ACT score, athletic status, sex, and ethnicity). The assessment office will finalize this analysis, distribute results to appropriate campus offices, and accept requests for custom reports.

3. Real-time results for Fall 2019. Results continue to populate as instructors enter student scores in linked assignments. Deans, chairs, directors, and administrative assistants can view results in process and begin the analysis of results once student scores are finalized for the fall semester. Directions on how to run reports are included in the deans/chairs assessment manual available on the assessment website under resources.

4. Assessment plans for majors/programs. Based on feedback from deans and chairs/program directors, for 2019-20 there will be a new process for maintaining and scoring assessment plans. Additional information will be provided to the deans for feedback. Benefits of the new process include:

a. Keeping all assessment activities within AEFIS, rather than in multiple files on the M drive.

b. Version control for assessment plans. The current assessment plan for every program will be within AEFIS for deans, chairs, directors, administrative assistants, and assessment personnel. This ensures we are all working with the most current version.

c. An easier assessment rubric scoring process than in prior years.

d. More effective reporting on program progress on assessment action items.


IV. Summary Results of Assessment Performance 2018-19

Summary results of assessment performance include the annual assessment survey, the annual assessment rubric scores, and direct assessment results.

1. Assessment Rubric.

a. Description. Deans and chairs/program directors co-score graduate and undergraduate program-specific assessment plans annually (before April 1) using our common assessment rubric. This rubric evaluates assessment plans and assessment processes on a four-point scale.

b. The assessment rubric includes the following categories:

1.) Process: Is the plan being implemented faithfully and revised as needed?

2.) Engagement: Are all relevant parties involved in the creation/revision, analysis, interpretation, and improvement processes associated with the plan?

3.) Student Learning Objectives: Are the student learning objectives clear, measurable, aligned with ULOs/GLOs, and representative of the range of learning for that major/program?

4.) Measures: Are the instruments used to assess learning relevant for the objective? Do measures yield information/data you can use to drive improvement?

5.) Timeline: Is the timeline for data collection manageable, with sufficient data points to effectively inform decision making and program review?

6.) Targets: Are the targets based on professional standards and/or experience with student work? Are targets challenging and achievable?

7.) Action Plans/Use of student learning data from prior year: Is the department using assessment data to revise curriculum and pedagogy to support student learning?

8.) Dissemination: Is the department communicating learning objectives, results, and improvements related to student learning to a wide audience?

c. Purpose. The assessment office and school deans use the annual assessment rubric scores to document individual major/program performance on assessment plans and processes over time. The institutional expectation is for every program to score at least a three on each element of the rubric to reflect proficient assessment performance. This report uses comparative data from the 2015-19 assessment rubric scores.


d. Summary comments on the assessment rubric data

1.) This year, averages for each criterion line of the rubric range from 2.9 (engagement) to 3.35 (objectives).

2.) The majority of rubric scores are three.

3.) Process, engagement, action plans, and dissemination appear as areas for improvement based on the number of scores recorded as one and two.

4.) Since plans need to score a three on the plan elements (program learning objectives, measures, targets, timeline) in order to make significant curricular changes, it makes sense that these areas tend to score higher.

5.) During the year, we have noticed calibration issues related to program learning outcomes. Majors/programs with concentrations should earn a three only if their assessment plan articulates specialized learning for each concentration. Otherwise, deans and chairs should score this area at a two.

e. Recommendations

1.) Focus on using data to improve student learning (action plans). As an institution, we have developed assessment plans, revised and improved program outcomes and course learning objectives, and mapped our curriculum to provide a clear understanding of where we deliver instruction to accomplish those program learning outcomes. We have the infrastructure to collect and aggregate student learning data in powerful ways. Our next step is to focus on using the data to improve student learning (action plans). The assessment office plans to develop assessment resources and offer Teaching and Learning sessions on how to use data to develop action plans.

2.) Dissemination of assessment results. With the availability of direct assessment reports, departments can articulate the areas where our students excel. Institutionally, we need to learn how to share these stories in ways that communicate value and excellence to our stakeholders. Spring 2020 or fall 2020 may be an appropriate time for Provost’s Cabinet to revisit how we can share student learning data on our websites in meaningful ways.


Graph: Summary Assessment Plan & Process Performance: Average Rubric Scores 2015-2019

This graph shows the average program score for each category on our institutional assessment rubric and compares that average score by academic year. Note: The deans recommended separating dissemination out of the engagement area, effective 2018-19; therefore, Dissemination has only one year of data reported.


These graphs display the percentage and number of each rubric score for 2018-19 scored rubrics.


2. Assessment Reflection Survey.

a. Description. This survey captures academic departments’ reflections on the assessment of learning progress at the end of each academic year.

b. Purpose. The survey provides a structure for chairs/program directors to lead and record meaningful, reflective discussions with their departments about their assessment plans and goals.

1.) The report summarizes key aggregate responses from the assessment survey.

2.) The assessment office reviews individual department responses with their respective school dean over the summer.

3.) Responses facilitate the prioritization of assessment improvements needed in the upcoming academic year.

4.) The assessment office develops resources for departments based on individual responses and emails those resources to the chairs and directors each fall, along with an offer of additional assistance.

c. Plans to Improve Learning due to Analysis of Assessment Results

1.) The assessment survey asked chairs and program directors to identify actions the department plans to take in order to improve student performance in instances where assessment results fell below targets. As displayed in the graph, actions that garnered the highest number of responses included “change/add an assignment” and “improve instruction on the topic in a required course.” They also recorded specific action plans on the survey. These action plans are located in the summary notes for each school in Provost/M/Assessment Plans by Program. “Other” responses written into the survey included plans/desires for increased collaboration with other departments/offices on campus, modification of courses and learning objectives to better suit PLOs and student needs, and selection of new assessment measures. Several comments discussed addressing potential weaknesses at earlier points in a given program, revising and refining rubrics, and lack of knowledge due to the newness of the major/program.

d. Assistance Requested of the Assessment Office

1.) “Other” responses written into the survey included numerous comments regarding AEFIS, thanks and appreciation to assessment office personnel, lack of prior knowledge of assessment in newly-established majors, major-specific information requests, and indications that no current needs require assistance.

2.) Help selecting/revising measures may appear as a high response this year because AEFIS usage is uncovering the wide variety of ways departments were collecting assessment data. While measures have always appeared on assessment plans, the assessment office previously had little knowledge about where student work was stored and how it was scored. Most departments are adapting well to the use of Canvas for score entry; however, some departments are finding this more challenging.


3.) Developing a manageable timeline was a surprising response, since few departments scored below 3 in this area. One possibility is that some chairs are not building the timeline into their academic calendar and attempt to “do it all” during May development week. We anticipate that as the institution becomes more adept with AEFIS, the perception of an unmanageable timeline will resolve.


Graph: Plans to Improve Learning due to Analysis of Assessment Results


Graph: Assistance Requested of the Assessment Office

3. Direct Student Learning Assessment Results.

a. Description.

1.) Majors/Programs. Each academic major or graduate program collects data on at least 1/3 of the assessment measures on its assessment plan each year.

2.) QuEST. Courses that fulfill the QuEST requirements collect student learning data related to one of the QuEST course objectives per year, rotating through the course objectives each year.

b. Purpose. Direct evidence of student learning helps us document the degree to which Messiah students are achieving institutional learning outcomes (and is a requirement for continued Middle States accreditation). Our evidence helps tell the story of Messiah’s distinctiveness to external stakeholders, and internally it helps us identify targeted areas needing improvement.

c. Results. This year, due to our use of AEFIS and the integration between AEFIS and Canvas, we were able to report aggregate, institution-level assessment results on student learning.


1. 90% of QuEST courses reported scores within AEFIS.

2. 66% of undergraduate majors reported scores within AEFIS.

3. 58% of graduate programs reported scores within AEFIS.

4. Institutionally, 65% of all programs reported scores within AEFIS.

It is important to note that integration and access issues between Canvas and AEFIS prevented many chairs and program directors from making assignment linkages. Therefore, the assessment office completed linkages during the summer, dependent upon the degree to which instructors scored assignments within Canvas and the degree to which assignment names and targets were clearly identified within each department’s assessment plan.

Graph: Direct Assessment Results - Undergraduate


Graph: Number of Direct Assessment Measures - Undergraduate


4. Direct Assessment Results: Undergraduate Learning Outcomes

a. Description. The data in the graphs represent aggregate student performance results from all assignment linkages made within academic majors/graduate programs as well as general education.

i. The first graph displays the percentage of each score range.

ii. The second displays the total number of assessments aggregated to each undergraduate learning outcome. Results are reported per term, i.e., fall and spring (including J-Term, the 12-week spring term, and May term).

iii. Each area sets its proficient range (yellow) in accordance with the target listed in the assessment plan. For instance, if the goal is for a particular percentage of students to achieve a B or higher on the assessment, B (83, or whatever constitutes a B) is set at the low end of the proficient range. Because proficiency ranges are a new feature available to us through AEFIS, educators are continuing to discuss where to set the basic, below basic, and advanced ranges. Generally, the advanced category represents A-range scores, and basic/below basic represents scoring poorly on the assessment (60-69) or failing the assessment (below 60).

b. General Education.

i. Instructors/chairs reported approximately 90 percent of scheduled general education data.

ii. The assessment plan for general education sets “proficient” at 70 or above for all areas, presumably due to the differing proficiency expected for students completing courses outside of their major.

iii. A breakdown of QuEST assessment results is available in Appendix C of this report. A comparison of performance across student groups by QuEST objective appears in Appendix D.

c. Academic majors.

i. Approximately 66 percent of academic majors reported some data via AEFIS (although the majority of programs still collected and discussed assessment data).

ii. Program-level assessment reporting for academic programs is available in AEFIS under the report dashboard.

d. Aggregate student performance data on the ULOs includes learning from every portion of the required curriculum (i.e., general education and majors).

i. Appendix A shows the mapping from program learning objectives in the major, QuEST, and Student Success and Engagement.

ii. Please note that Student Success and Engagement PLOs contributing to ULOs are listed to show where these ULOs are enhanced through SSE, but direct assessment data does not include SSE reporting at this time.


e. Reflection on Direct Assessment Results for the Undergraduate Learning Outcomes

i. A high percentage of assessments are collected for ULO 2, Breadth and Depth of Knowledge, due to the high portion of QuEST credits devoted to knowledge of the liberal arts (24 credits), as well as the high number of major program learning objectives aligned with knowledge of the discipline.

ii. Based on current program learning objectives, additional linkages could be made to social responsibility. We encourage, but do not require, academic majors to map to this ULO.

iii. We should see the number of collected assessments increase over time due to increased educator proficiency in assignment linking and departments’ need to copy linkages from the previous year’s data collection to report on action plans from the previous academic year.

iv. Results from QuEST make the degree of learning proficiency appear higher than it might otherwise be interpreted, due to the low target of 70 percent.

Graph: Direct Assessment Results – School of Graduate Studies


5. Direct Assessment Results: Graduate Learning Outcomes

Student performance data on the graduate learning outcomes (GLOs) aggregates from assignments linked to program learning objectives within graduate program assessment plans. Approximately 58% of programs are represented in the 2018-19 assessment data.

6. What do we do with Assessment Results?

a. Analyze, report, create action plans.

i. During May development week each year, academic departments analyze and report assessment results in accordance with their assessment plans.

ii. General education units will have the opportunity to view section-level and aggregate assessment results, discuss instruction and assessment strategies, and identify action plans to improve student performance. Reports go to the Assistant Dean of General Education for follow-up and execution the following academic year.

iii. Academic departments analyze assessment results, identify action plans to execute during the upcoming academic year, and report progress on the previous year’s action plans in AEFIS. Deans approve end-of-year reporting and monitor progress on action plans in the upcoming academic year.

b. Dissemination of assessment results. Stakeholders expect to see assessment results.

i. Institution-level. We will share aggregated institution-level results on the Messiah website.

ii. General Education. QuEST assessment results are posted on the QuEST website annually.

iii. Program-level. Academic departments should share results as appropriate via their websites and with faculty, students, alumni, and local employers.

7. Assessment of Student Learning Goals for 2019-20

a. Complete campus AEFIS training

b. Complete an AEFIS user guide

c. Update the assessment manual and resources for consistency with AEFIS

d. Work with Marketing & Communications on ways to share aggregate assessment results with internal and external stakeholders

e. Work with AEFIS to transition assessment plans to an electronic format accessible to deans, chairs, and the assessment office

f. Resource chair/program director needs regarding curriculum mapping, use of AEFIS, and methods of learning assessment


Appendix A. Undergraduate Learning Outcomes Mapping to the Curriculum

1. Foundations for Learning

a. Description. Students will develop skills common to the liberal arts and sciences: research, analysis, reflection, and communication.

b. Program learning objectives mapped to this ULO include:

QuEST. Abilities of the liberal arts: to think, read, write, and speak effectively
o First Year Seminar
o Created and Called for Community
o Oral Communication

Student Success and Engagement: Dig Deep.
o Common Chapel & Sixers
o Co-curricular Educational Programming
o Student Leadership Programming
o Semester-long programs

2. Breadth and Depth of Knowledge:

a. Description. Students will develop knowledge common to the liberal arts and sciences in the fields of arts, humanities, natural sciences, and social sciences. Students will also develop specialized knowledge and disciplinary expertise.

b. Program learning objectives mapped to this ULO include:

QuEST. Knowledge of the liberal arts: to promote students’ grasp of the larger picture

o Mathematical & Natural Sciences
o Languages & Culture
o Social Sciences & History
o Non-Western Studies
o Humanities
o Arts

Majors. Program-level learning objectives aligned with CWEO 4.1 (disciplinary knowledge)

3. Faith Knowledge and Application

a. Description. Students will develop informed and mature convictions about Christian faith and practice.

b. Program learning objectives mapped to this ULO include:

QuEST. Deepen faith: Christian faith encourages the development of an informed Christian conviction

o Knowledge of the Bible
o Christian Beliefs

Majors. Program-level learning objectives aligned with CWEO 4.5 (Christian faith and the discipline/vocation)

Student Success and Engagement: Be Rooted: formation of a maturing sense of self, identity, self-esteem, confidence, ethics, and integrity, and a maturing sense of relationship to God, resulting in spiritual practices, character building, reconciliation, service, and intentional growth.


4. Specialized Skills and Scholarship

a. Definition. Students will become proficient in the scholarship of their discipline and demonstrate specialized skills required for employment.

b. Program learning objectives mapped to this ULO include:

Major. Program-level learning objectives aligned with CWEO 4.2 (scholarship) and 4.3 (applied disciplinary skills)

5. Self-Awareness

a. Definition. Students will gain self-awareness of identity, character, and vocational calling.

b. Program learning objectives mapped to this ULO include:

QuEST. To inspire action: Social Responsibility spurs students to know self
o Created and Called for Community
o Wellness

Major. Program-level learning objectives aligned with CWEO 4.4 (vocational awareness).

Student Success and Engagement. Be Strong: gain realistic self-appraisal and self-understanding, set personal goals, become interdependent and collaborative, and work with others different from self
o Student Activities Board
o Career Coaching
o Martin & Flowers Program
o Recreational Sports
o Wellness Initiatives
o Intercollegiate Athletics
o Into the City
o Life Hacks

6. Social Responsibility

a. Definition. Students will demonstrate a commitment to service, reconciliation, and justice, and respond effectively and ethically to the complexities of an increasingly diverse and interdependent world.

b. Program learning objectives mapped to this ULO include:

QuEST.
o To inspire action: Social Responsibility spurs students to know good and do good.
   - Ethics
   - World Views
   - Pluralism
o Modern language objectives (a and b)
o Cross Cultural course objectives (b-d)

Majors. Encouraged but not required.

Student Success and Engagement:
o Be Cultivated: Understand, value, and appreciate human differences; develop cultural competency; understand and pursue reconciliation
   - Inclusivity Training
   - Off-campus programs
   - Intentional connections
   - Heritage Months
o Branch Out: Civic responsibility, commitment to service, effectiveness in leadership, commitment to living in community
   - Outreach Teams
   - Leadership Retreats
   - Service Day
   - MLK Day
   - ELI


Appendix B. Assessment Rubric

Each criterion below is scored on a four-point scale.

Process
Is the plan being implemented faithfully and revised as needed?
Score 1: Assessment plan is not implemented.
Score 2: Most aspects of the plan are being implemented, or all aspects are implemented to some degree.
Score 3: Assessment plan is fully implemented.
Score 4: Plan is faithfully executed and modified/evaluated as needed.
Explanations:

Engagement
Are all relevant parties meaningfully involved in the creation/revision, implementation, analysis, interpretation, and learning improvement process?
Score 1: Limited involvement beyond chair/director.
Score 2: All department faculty are aware of process and results.
Score 3: All department faculty participate in conversations regarding the use of assessment data to improve student learning.
Score 4: All relevant stakeholders (students, employers, alumni) are meaningfully involved in the creation/revision, implementation, analysis, interpretation, and/or improvement processes associated with this assessment plan.
Explanations:

Student Learning Objectives
Are the student learning objectives clear, measurable, aligned with ULOs/GLOs, and representative of the range of learning for that major/program?
Score 1: Objectives are problematic (vague, abstract, not aligned with ULOs/GLOs) or missing.
Score 2: Objectives are clear, mostly measurable, and partially aligned with ULOs/GLOs.
Score 3: Objectives are clear, measurable, aligned with ULOs/GLOs, and represent an overview of the knowledge, skills, beliefs, and values that are important for a graduate of this major/program, accounting for variations in learning outcomes due to tracks/concentrations.
Score 4: Objectives are clear, measurable, aligned with ULOs/GLOs, and representative of the range of learning that is important for this program. The learning objectives provide a comprehensive view of the knowledge, skills, beliefs, and values that are important for a graduate of this major/program, accounting for variations in learning outcomes due to tracks/concentrations.
Explanations:

Measures
Are the instruments used to assess learning relevant for the objective? Do measures yield information/data you can use to drive improvement?
Score 1: Not all objectives have a measure identified, OR measures do not directly connect to the objectives.
Score 2: All objectives have at least one direct measure, but measures connect to learning objectives superficially or tangentially and/or include learning other than stated objectives; the plan relies almost exclusively on the same form of assessment (survey, exam, project) or on data from a single source (course, program, activity).
Score 3: All objectives have at least one direct measure; some objectives have multiple measures; measures clearly connect to learning objectives; and the plan meets two of the following four criteria: objectives are measured at more than one point in time (formative); indirect measures are used strategically; the plan incorporates different forms of assessment (survey, exam, project); the plan incorporates data from a variety of sources (course, program, activity).
Score 4: Measures meet all of the following criteria: all objectives have at least one direct measure; some objectives have multiple measures; measures clearly connect to learning objectives; objectives are measured at more than one point in time (formative); indirect measures are used strategically; the plan incorporates different forms of assessment (survey, exam, project); the plan incorporates data from a variety of sources (course, program, activity).
Explanations:

Timeline
Is the timeline for data collection manageable, with sufficient data points to effectively inform decision making and program review?
Score 1: Timeline is not identified clearly for all measures.
Score 2: Timeline clearly states the semester/year for each objective/measure, but data analysis is delayed from data collection, and time between collection points may not facilitate informed decision making.
Score 3: Clearly stated and manageable schedule, with at least two data points for each objective per review cycle.
Score 4: Timeline for data collection is manageable and allows for continuous improvement, with timely and meaningful decision making even before program review.
Explanations:

Targets
Are the targets based on professional standards and/or experience with student work? Are targets challenging and achievable?
Score 1: Some targets are missing.
Score 2: Targets are arbitrarily chosen or reflect minimal expectations.
Score 3: Targets are challenging and achievable based on prior data, and reflect the level of performance a novice professional knows/can do.
Score 4: Targets are challenging and achievable; are based on professional standards and/or prior data and experience with student work; reflect the level of performance a novice professional knows/can do; and are set at a level to inspire program improvement.
Explanations:

Use of Student Learning Data from the Prior Academic Year
Is the department effectively examining and using assessment data to revise curriculum and pedagogy to support student learning?
Score 1: Assessment data not collected/analyzed/used for decisions, and/or results not documented in AEFIS.
Score 2: Data collected, documented, and discussed by the department. Department reviewed confidence in measures and data as sufficient indicators of student performance. If data indicated changes were needed, action plans were developed in consultation with the dean (e.g., improving outcomes, measures, targets, curriculum, or pedagogy).
Score 3: Data collected, documented, and discussed by the department. Department and dean confirmed confidence in measures and data as sufficient indicators of student performance. Action plans (e.g., improving outcomes, measures, targets, curriculum, or pedagogy) developed in consultation with the dean.
Score 4: If prior-year data warranted action plans, the department implemented the changes. The department collected and discussed follow-up data after the implementation of action plans in order to determine whether changes resulted in improvement or whether additional action is necessary, and/or data confirm effective curriculum and pedagogy for learning outcomes. (A score of 4 should be assigned only if objectives, measures, targets, and timeline all score a 4.)
Explanations:

Dissemination
Is the department communicating learning objectives, results, and improvements related to student learning to a wide audience?
Score 1: No record of assessment results and changes made as a result of assessment findings.
Score 2: The department/program retains records of assessment results and positive changes made as a result of assessment findings, and results are entered in the assessment software system.
Score 3: The department/program retains records of assessment results and changes made as a result of assessment findings; results are entered in the assessment software system; and assessment results and improvements are publicly posted.
Score 4: The department/program retains records of assessment results and changes made as a result of assessment findings, and results are entered in the assessment software system. Assessment results and improvements are publicly posted and shared proactively with faculty, prospective students, employers, and alumni in ways that facilitate their discussion.
Explanations:


Appendix C. Direct Assessment Results for each QuEST Requirement

The results below are taken from the 2018-19 charts for each QuEST requirement and show the percentage of students scoring Advanced, Proficient, and Basic/Below Basic on the assessed objective.

First Year Seminar
Objective D: Apply basic methods and skills of information literacy: accessing, evaluating, and using information effectively and ethically
Results: 40% Advanced, 33% Proficient, 27% Basic/Below Basic

Oral Communication
Objective C: Convey information and reasoned argument in spoken and visual presentation
Results: 83% Advanced, 11% Proficient, 6% Basic/Below Basic

Created and Called for Community
Objective D: Write critically, using effective prose for particular audiences
Results: 53% Advanced, 36% Proficient, 12% Basic/Below Basic

Mathematical Sciences
Objective A: Solve quantitative problems using mathematical techniques, statistical methods, or information technology
Results: 48% Advanced, 26% Proficient, 26% Basic/Below Basic

Laboratory Sciences
Objective C: The ability to conduct and analyze simple investigations in the natural sciences
Results: 61% Advanced, 23% Proficient, 16% Basic/Below Basic

Science, Technology, and the World
Objective B: Characterize ethical, social, historical, philosophical, aesthetic, or political aspects of science or technology
Results: 63% Advanced, 26% Proficient, 10% Basic/Below Basic

Social Sciences
Objective C: Analyze important variables contributing to one or more social problems/issues
Results: 71% Advanced, 18% Proficient, 11% Basic/Below Basic

European History
Objective B: Comprehend selected ideas, peoples, institutions, and events central to the formation of Western traditions
Results: 61% Advanced, 32% Proficient, 7% Basic/Below Basic

United States History
Objective B: Comprehend the selected ideas, peoples, institutions, and events central to American history
Results: 36% Advanced, 36% Proficient, 28% Basic/Below Basic

Literature
Objective C: Analyze significant works of literature
Results: 55% Advanced, 28% Proficient, 17% Basic/Below Basic

Religion
Objective C: Identify relationships between religion and culture at the local, national, and transnational levels
Results: 47% Advanced, 36% Proficient, 18% Basic/Below Basic

Philosophy
Objective C: Engage the work of significant thinkers
Results: 45% Advanced, 41% Proficient, 14% Basic/Below Basic

Arts
Objective C: Make or perform art, usually at an introductory level
Results: 52% Advanced, 38% Proficient, 9% Basic/Below Basic

Languages and Cultures
Objective C: Articulate knowledge of culture in that language
Results: 52% Advanced, 24% Proficient, 24% Basic/Below Basic

Cross Cultural Studies
Objective C: Discuss facets in which the host culture is similar to their own
Results: 65% Advanced, 30% Proficient, 5% Basic/Below Basic

Knowledge of the Bible
Objective C: Recognize the Bible’s variety of literary genres and discuss principles necessary for their interpretation
Results: 51% Advanced, 39% Proficient, 10% Basic/Below Basic

Christian Beliefs
Objective B: Articulate central beliefs of historic Christian faith about God, Jesus Christ, the Holy Spirit, salvation, and the church
Results: 60% Advanced, 27% Proficient, 13% Basic/Below Basic

Wellness
Objective A: Describe the relationship between habitual exercise and disease risk
Results: 52% Advanced, 35% Proficient, 13% Basic/Below Basic

Ethics in the Modern World
Objective C: Apply Christian ethical approaches to selected ethical problems or issues
Results: 60% Advanced, 33% Proficient, 7% Basic/Below Basic

Pluralism in Contemporary Society
Objective C: Explain some effects of inequality, prejudice, and discrimination
Results: 58% Advanced, 33% Proficient, 10% Basic/Below Basic

Non-Western Studies
Objective A: Articulate a basic understanding of a culture or people whose heritage and/or present life has been significantly shaped by customs, practices, and systems of thought outside the Western tradition
Results: 64% Advanced, 23% Proficient, 13% Basic/Below Basic