
THE “HOW” OF ASSESSMENT: UNDERSTANDING THE COMPONENTS OF THE ASSESSMENT CYCLE

Sharon A. Aiken-Wisniewski, PhD
University of Utah

2013 NACADA International Conference
Maastricht, Netherlands

Workshop Facilitator

Sharon
• AVP at U of U
• Administrator
• Practitioner
• Faculty
• NACADA
• Assessment
• Gardening, reading, blogging, hiking

University of Utah

Public research university with law, medical, and dental schools in Salt Lake City, UT
Around 33,000 students; 25,000 are undergraduates
Over 70 undergraduate academic departments
About 2,300 students live on campus

Success Elevated: Best Advising On Earth
2013 NACADA Annual Conference
Salt Lake City, Utah – Oct. 6-9

Academic Advising at the U of U

Delivered through University College Advising and Departments & Colleges Advising:
• General Education
• DARS & GPS administration
• University policy & procedure
• New students in transition
• Major exploration
• Academic performance
• Pre-professional school planning
• Major/minor curriculum
• Course selection
• Major declaration
• Program management
• Graduation advising
• Updating DARS & GPS

Learning Objectives

• Define assessment and its types
• Understand the elements of learning outcomes assessment
• Understand the cycle of assessment
• Understand the vision statement
• Understand the mission statement
• Understand the connection of goals, programmatic learning objectives, student learning outcomes, and process & delivery outcomes
• Understand mapping with assessment
• Understand measurement tools
• Review techniques for analysis
• Develop strategies to communicate and report assessment results to stakeholders
• Learn tools to organize and prioritize activities based on assessment data
• Introduce a theoretical model for change
• Appreciate that assessment is continuous (just like learning)

The Assessment Cycle

Identify Outcomes → Gather Evidence → Interpret Evidence → Implement Change

What is assessment?

A process that collects data from multiple sources for the purpose of understanding and change.

What might we understand through assessment?
• Satisfaction
• Need
• Student learning

Assessment – Definition

For academic advising…
Assessment is the process through which we gather evidence about the claims we are making with regard to student learning and the process/delivery of academic advising, in order to inform and support enhancement and improvement.
Campbell, S. (2008)

Basic Principles of Assessment

Assessment is . . .
• evidence-driven, relying on multiple measures
• formative rather than simply summative
• measurement of outcomes
• student learning-outcome based (in education)
• a complex process of comparison
• always a process of reasoning from evidence
• always, to some degree, imprecise

Assessment is not . . .
• episodic
• just about measurement
• about performance evaluation, although it can inform it
• solely an administrative process
• easy or quick

NACADA Assessment Institute & Assessment CD

Flowchart of Assessment

Values

Vision

Mission

Goals

Programmatic Objectives

Learning Outcomes

Process & Delivery Outcomes

Mapping

Measurement Tools

Analysis of Data

Change

Values, Vision, Mission, Goal, and Objectives

The Assessment Cycle

Values

A value reflects what your advising organization considers important.

An example

Academic advising at Juergen University reflects the university’s commitment to student learning, persistence, and success by

Demonstrating that each student is valued as a learner and member of the campus community.

Establishing inclusive spaces that are safe and comfortable for students who seek advising.

Valuing the educational choices students make while attending the campus.

Vision

An aspiration: the desired future of academic advising on a campus.
• Inspirational & long-term
• Ambitious yet realistic
• Generates enthusiasm

Example: To develop a coordinated and responsive academic advising program that has global recognition for excellence.

Mission

A statement that communicates the purpose of academic advising and provides direction for achieving vision and affirming values.
• Consistent with institutional mission
• Clear & concise
• Repeatable

Example: University College assists people, through academic advising, to achieve educational goals. We cultivate relationships that empower students to navigate the campus.

What do you think?

At Heerlen College, academic advising is an intentional, educational partnership between advisors and students. Grounded in teaching and learning and approached from a developmental perspective, this multidimensional process considers and respects students’ diverse backgrounds, interests, and abilities, and facilitates the identification and achievement of educational, career, and life goals.

Example questions:
• How does this identify purpose?
• Is it clear & concise?
• Who is being served?
• How are they being served?
• What would you change?

Goal

Goals are broad statements that are long-range, clear, and offer guidance for action, but are too broad to measure.

For example, the goals for academic advising are to design a program that is:
• Focused on engaging students in identifying curricular and extra-curricular programs that address their educational goals.
• Based on theories of teaching, learning, and identity development.
• Reflective of developmental and demographic profiles of the student population.
• Proactive in creating a safe environment that is focused on social justice issues.

Learning Objectives

Programmatic learning objectives clarify goals through language that is precise, detailed, and action oriented. They offer direction to actualize the goal and lead to learning outcome statements.
• Student can identify a career and major that incorporate his/her interests, values, and skills.
• Student can develop and implement an academic plan to complete a degree.
• Student can identify and use campus and community resources that complement major and career.

Learning Outcomes

The Assessment Cycle

Student Learning Outcomes

Student learning outcomes articulate what students are expected to know, do, or value based on advising, and relate to learning objectives.
These are measurable.
They focus on the three domains: cognitive, psychomotor, and affective.


Example:

Cognitive (Know)
• Student develops an awareness of his/her interests and abilities.
• Student understands his/her academic goal.

Psychomotor (Do)
• Student develops a course schedule.
• Student calculates his/her grade point average (GPA).
• Student communicates his/her interests.

Affective (Value and Appreciate)
• Student joins the alumni association as a way to support the university mission.
• Student appreciates the breadth of learning through general education.

Let’s write some learning outcomes.

Power Verbs

• Define
• Record
• Analyze
• Compare
• Solve
• Select
• Summarize
• Describe

Mapping

Identifying where the learning outcome occurs in the advising process

Steps (a sketch of one mapped outcome follows below):
1. Select a student learning outcome.
2. Identify activities in advising that introduce the concept in the learning outcome, allow practice and promote development, and demonstrate competency.
3. Determine how and when you will evaluate whether learning has occurred and the outcome has been met.
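To make the mapping concrete, here is a minimal Python sketch of how one outcome map could be recorded. The activity names and evaluation point are hypothetical examples, not from the workshop materials.

```python
# A sketch of one mapped outcome. The activity names and evaluation point
# below are hypothetical, not from the original workshop materials.
outcome_map = {
    "outcome": "Student builds an academic plan for a specific graduation point",
    "introduced": ["New-student orientation advising session"],
    "practiced": ["Second-semester planning appointment"],
    "demonstrated": ["Student submits a draft academic plan for advisor review"],
    "evaluation": "Rubric applied to the submitted plan each spring",
}

for stage in ("introduced", "practiced", "demonstrated"):
    print(f"{stage}: {', '.join(outcome_map[stage])}")
print("evaluated via:", outcome_map["evaluation"])
```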

The Assessment Cycle

Identify Outcomes → Gather Evidence → Interpret Evidence → Implement Change

Questions, Stretch, and Recharge

Gather Evidence

The Assessment Cycle

Gathering Evidence – Measurement of Student Learning

Direct measures:
• Written exams
• Portfolios
• Reflective essays (personal statements)
• Direct observation
• Performance on case study
• Pretest/posttest

Indirect measures:
• Surveys & questionnaires
• Rubrics
• Focus groups
• Interviews
• Retention & graduation rates

Gathering Evidence – Measurement of Student Learning

Quantitative:
• Measures amount of learning
• Structured
• Descriptive or causal
• Sample is large, random, representative
• Numbers
• Questionnaires, surveys, rubrics, experiments
• Data analysis summarizes
• Generalizable

Qualitative:
• Meaning and understanding
• Emerging
• Exploratory
• Sample is small and purposeful
• Words
• Observation, focus groups, case studies
• Data analysis interprets
• Not for generalizing

National Tools

Examples in the U.S. & Canada:
• NSSE – National Survey of Student Engagement
• Noel-Levitz Student Satisfaction Inventory

Why?
• Instrument and process are in place
• Assistance with administration and data processing
• Comparison to similar/peer institutions
• Third party involved in data for assessment & change

Issue: cost

Institutional Data

Utilize a campus resource: the Institutional Analysis or Institutional Research office.
This office may already be collecting information you can utilize (numbers of students in your college, age, number of credits taken per semester, etc.).
Often, it’s FREE!
A third party involved in collecting assessment data can be advantageous.

Survey, Rubric, and Focus Group Tools

Survey

A survey is a method for collecting quantitative information about the level of learning in a population.
• Closed-ended questions
• Controlled response (Likert scale)
• Statistics/percentages
• Benchmarks (see the sketch below)
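As a concrete illustration of turning controlled Likert responses into percentages and a benchmark, here is a minimal Python sketch. The responses, and the choice of "4 or higher" as the benchmark, are invented for illustration.

```python
from collections import Counter

# Hypothetical responses to one closed-ended item on a 5-point Likert scale
# (1 = strongly disagree ... 5 = strongly agree); the data are invented.
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 1, 5, 4, 4, 3]

counts = Counter(responses)
n = len(responses)
for rating in sorted(counts):
    print(f"rating {rating}: {counts[rating] / n:.0%}")

# One possible benchmark: the share of respondents answering 4 or higher.
print(f"agree or strongly agree: {sum(r >= 4 for r in responses) / n:.0%}")
```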

Survey Advantages

• An efficient way of collecting information from a large number of respondents.
• Surveys are flexible.
• Because they are standardized, they reduce error.
• They are relatively easy to administer.
• There is an economy in data collection due to the focus provided by standardized questions.

Survey Disadvantages

• Subjects may lack motivation to respond.
• Low validity when researching affective variables.
• Even with random sampling, errors due to non-response may exist.
• Fixed response choices may not capture a respondent’s “strength of choice.”

Methods used to increase response rates

• Brevity
• Financial incentives
• Non-monetary incentives
• Personalization of the request
• Claimed affiliation
• Emotional appeals
• Appeals to make a difference
• Guaranteed anonymity
• Legal compulsion (certain government-run surveys)

Survey Development

• What do we want to know?
• What response scale is best?
• Who will be surveyed? (the sample)
• How will you collect the data? (paper or electronic)
• Identify and use campus resources.

Writing the item
• Address the learning outcome
• Is it a leading question?

Responses
• Range of responses
• How response choices are developed
• Acceptable response rate – what is it?

Sample
• Participants
• Random sample
• Access to the survey based on delivery mode

Validity & Reliability

Reliability: the extent to which an experiment, test, or any measuring procedure yields the same results on repeated trials. The survey tool consistently identifies that learning occurs.

Validity: the indicator measures what it is intended to measure. The survey only measures student learning through the advising process. Purposeful.

Examples

Question One: If you were often anxious (not able to focus, not able to recall information) before your exams, which resource would you use?
a) The Campus Library
b) The Counseling and Advising Center
c) The Service Learning Center
d) The Student Union Activity Center
e) Not sure which resource would assist with anxiety

Question Two: You are applying for a scholarship. One requirement is a resume. Which campus resource would assist you in preparing a resume?
a) The Service Learning Center
b) The Women’s Resource Center
c) Career Services
d) Academic Advisor in major department
e) All of the above

Rubric – Definition

A rubric is a scoring scale utilized to measure a student’s or advisor’s performance against a predetermined set of criteria.

A rubric:
• divides a desired objective or outcome into its component parts
• identifies acceptable and unacceptable degrees of performance for a specific outcome

These component parts are criterion points and degrees of performance.

Basic Rubric

[Diagram: criterion points on the vertical axis; degrees of performance on the horizontal axis]

Criterion Points

At least two points on the vertical axis of the rubric. Criterion points identify knowledge, understanding, and behavior.

Objective: Student can develop and implement an academic plan.
SLOs:
• Student communicates courses needed for the degree.
• Student explains why certain courses are needed for a specific major.
• Student builds an academic plan for a specific graduation point.
• Student demonstrates registering for courses.

Degrees of Performance

At least two points on the horizontal axis. Points can include:
• Adequate/not adequate
• Excellent/competent/developing
• Numbers (from 1 to 5)

Descriptors in Performance

[Diagram: criterion points with descriptions for each, plus degrees of performance with descriptors for each]

Descriptors are also added for the degrees of performance for each criterion. For example, for the criterion Knowledge, the descriptor for Excellent may be “Can delineate all requirements for the major”; for Competent, “Can delineate most of the requirements for the major”; for Needs Work, “Can delineate some of the requirements for the major”; and for Not Aware, “Cannot delineate any requirements for the major.” These descriptors are captured in the sketch below.
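A rubric like this can also be written down as a simple data structure. The Python sketch below reuses the Knowledge descriptors quoted above; the remaining criteria would be filled in the same way.

```python
# The Knowledge criterion expressed as a dictionary, reusing the descriptor
# wording quoted above; other criteria would be added the same way.
rubric = {
    "Knowledge": {
        "Excellent": "Can delineate all requirements for the major",
        "Competent": "Can delineate most of the requirements for the major",
        "Needs Work": "Can delineate some of the requirements for the major",
        "Not Aware": "Cannot delineate any requirements for the major",
    },
}

for degree, descriptor in rubric["Knowledge"].items():
    print(f"{degree}: {descriptor}")
```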

Descriptors for Degrees of Performance

[Diagram: completed rubric showing criterion points, descriptions for criterion points, degrees of performance, and descriptors for degrees of performance]

Continue to Build the Rubric

You then continue to add descriptors for each degree of performance relative to each of the remaining criterion points for the Understanding and Behavior items.

Rubric – Academic Planning

[Example rubric showing degrees of performance, criteria, and descriptors for both]

Construction of a rubric requires:
• reflection on the overall objectives, goals, and outcomes of your program, and
• identification of what you hope will be accomplished via academic advising.

Rubrics are a good way to allow a quick evaluation of a performance or demonstration of knowledge.

Process or Satisfaction Data – Rubrics

One way to evaluate data from a rubric: the percentage of students that fit into each degree of performance for each criterion.

Another way: develop a score range and compute the numerical average for each criterion point.
• Excellent = 4
• Competent = 3
• Needs More Info = 2
• Not Aware = 1

This can be tracked longitudinally to establish a benchmark, as sketched below.
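Here is a minimal Python sketch of both evaluation approaches, using the score range from the slide. The counts of students per degree of performance are invented for illustration.

```python
# Hypothetical counts of students at each degree of performance for one
# criterion point; the numbers are invented for illustration.
scale = {"Excellent": 4, "Competent": 3, "Needs More Info": 2, "Not Aware": 1}
counts = {"Excellent": 12, "Competent": 18, "Needs More Info": 7, "Not Aware": 3}

total = sum(counts.values())

# First approach: percentage of students at each degree of performance.
for degree, n in counts.items():
    print(f"{degree}: {n / total:.0%}")

# Second approach: numerical average for the criterion using the score range.
average = sum(scale[d] * n for d, n in counts.items()) / total
print(f"average: {average:.2f} on a 4.0 scale")
```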

Rubric – Withdrawal (732), completed by the advisor after the appointment

What percentage of students falls into each degree of performance for the first criterion, explaining withdrawal? (4%, 14%, 34%, 49%)

What is the mean (average) degree of performance on “student communicates the withdrawal deadline”? (The average is 3.09 on a 4.0 scale.)

Focus Groups

Why are you selecting a focus group?

Types of evidence you want to gather

How will you find and select participants?

Question Protocol

Recording Data

Analysis


Focus Group Interview

Definition: an interview with a small group of people on a specific topic (Patton).
• Groups are 6 to 10 people
• One to two hours in length
• Questions are open-ended/semi-structured
• The results: exploring to describe a phenomenon or concept
• Emerging patterns
• Often answering “why” or “how”
• Planning is key

Focus Group Interviews

Traditional focus groups:
• In person
• Can read body language and ask follow-up questions
• Can really connect with the participants

Web-based focus groups:
• Online
• Can survey a larger number of people
• No need to transcribe; answers are already written

Process or Satisfaction Data – Survey or Focus Group

• “We have good advisors in X dept. They are very good at sending out reminders about class registration, tuition, etc. . . . We can drop in and get help . . . That’s easier than having an appt.”
• “They aren’t helpful in determining what I could handle, maybe taking 3000 level classes this semester is a little much or change it around. And no one suggested I should get tutoring.”
• “She helps me a lot. I had transfer credit from Germany, and that was difficult because no one seemed to really know what they were doing, but she constantly followed up on things for me and got me the credits I needed. I truly appreciate her help.”
• “I went to the advisor to ask questions about studying abroad. She suggested that I not do that because I would delay my graduation.”

What does this tell you? Is it complete? Do you need more?

Focus Group

Pros:
• Describes
• Can continue to probe
• Rich data
• Useful in developing response items for surveys

Cons:
• Resources
• Location
• Students to participate
• Analysis
• Interest

QUICK REVIEW: Gathering Evidence – Measurement of Student Learning

Direct measures: written exams; portfolios; reflective essays (personal statements); direct observation; performance on case study; pretest/posttest.

Indirect measures: surveys & questionnaires; rubrics; focus groups; interviews; retention & graduation rates.

Analysis of Data/Interpret Evidence

The Assessment Cycle

Key Questions

What are these data informing (SLOs, advising process, satisfaction, need, etc.)?

What are these data suggesting?
• Policies and practices are achieving desired outcomes.
• Policies and practices need improvement/enhancement to achieve desired outcomes.
• Policies and practices need to be retooled to achieve desired outcomes.

What policies and practices are feasible to continue, tweak, or retool within your campus culture?
• Resources?
• Authority?
• Institutional structure?

Reviewing Data - Who

Who will be involved in review?

Committee

Advisors

Campus Partners

Students

Reviewing Data – How Presented

How will data be presented to the review team?

Quantitative (numbers) by measurement tool

Survey, Rubric, Institutional Data

Qualitative (words) by measurement tool

Focus Group, Open-ended question on Survey

Handouts, PowerPoint slides

Reviewing Data – Reference Points

• Longitudinal data – trend lines (% of students who . . .), as sketched below
• Benchmark against peers within the institution or system, or nationally based on institutional type or advising model
• Benchmark against peers in national tools; compare data to anonymous peers
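As a small illustration of a trend line, here is a Python sketch that reports a yearly percentage against a baseline year. The figures are invented, not actual institutional data.

```python
# Invented longitudinal results for one outcome: the share of students who
# met it each year. The numbers are illustrative only.
trend = {2010: 0.62, 2011: 0.66, 2012: 0.71, 2013: 0.69}

base_year = min(trend)
for year in sorted(trend):
    change = trend[year] - trend[base_year]
    print(f"{year}: {trend[year]:.0%} ({change:+.1%} vs. {base_year})")
```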

Reporting Results – Who

• Administration: President, Provost, various committees – via annual reports, strategic plans, white papers, Web sites, etc.
• Faculty: all faculty, curricular committees, faculty advisors – via performance reviews, annual reports, strategic plans, Web sites, etc.
• Students: all students, student advisees, student senate, student groups – via newsletters, annual reports, Web sites, etc.
• Budgeting entities – via annual reports, budget requests, Web sites, etc.
• Accreditors – via self-studies, accreditation reports, Web sites, etc.
• Government officials – Ministry of Education

Analysis Tools

Multiple measures:
• Survey
• Rubric
• Portfolios
• Interviews
• Observation
• National tools (NSSE, SSI)
• Retention/graduation data

SPSS or SAS to “crunch the numbers.”
NUD*IST or ATLAS.ti to look for patterns in qualitative data (a toy version is sketched below).
Some data collection tools provide analysis (for example, Student Voice).
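Dedicated qualitative packages do far more, but a toy Python sketch can show the basic idea of counting coded patterns across comments. The comments and codes here are invented for illustration.

```python
from collections import Counter
import re

# Invented comments; a toy stand-in for the pattern-finding that packages
# such as ATLAS.ti support at much larger scale.
comments = [
    "The advisor helped me plan my schedule and explained the requirements.",
    "I could never get an appointment, but drop-in hours helped.",
    "She explained my transfer credits and followed up every week.",
]

# Count how often a few codes of interest appear across all comments.
codes = ["helped", "explained", "appointment", "followed up"]
text = " ".join(comments).lower()
tally = Counter({c: len(re.findall(re.escape(c), text)) for c in codes})
for code, n in tally.most_common():
    print(f"{code}: {n}")
```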

Analysis of DATA

Brainstorm

What are these data informing?

What are these data suggesting about practice?

What are these data suggesting about policy?

Be Creative and True to these Data

What is missing for future assessment processes?

Tool to Organize – 4 Quadrant Grid

• Assign a value to the X axis
• Assign a value to the Y axis
• Where do your activities fall?

Example of 4 Quadrant Grid

Axes: implementation time (less than 6 months vs. 6 months or more) and resources required (few vs. many). Activities placed on the grid included:
• The image of advising: a PR campaign to inform students what advisors do and share positive stories
• Campus-wide Advising Conference (annual event)
• Develop ways to be more purposeful in explaining various parts of the degree so that educational connections and purposes are apparent (less checklist orientation)
• Make sure that websites are up to date
• Collaboration between UAAC and ASUU to develop organized ways to reach out to students and increase understanding about advising (myths, DARS)
• Share results with the campus community
• A section in the Undergraduate Bulletin to clarify the role of the advisor, when to see an advisor, and referrals
• Share comments from students with colleges
• Develop a college-level sort to allow colleges to identify adjustments to advising in their areas

An effective way of communicating. A sketch of such a grid follows.
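For illustration, here is a Python sketch (using matplotlib) that plots activities on such a grid. The placements and coordinates are invented, not the workshop's actual quadrant assignments.

```python
import matplotlib.pyplot as plt

# Hypothetical placements of a few activities from the example above;
# the coordinates are invented (x: implementation time in months,
# y: resources required on a rough 0-10 scale).
activities = {
    "Update websites": (2, 2),
    "Bulletin section": (4, 3),
    "PR campaign": (9, 7),
    "Advising conference": (12, 8),
}

fig, ax = plt.subplots()
for name, (x, y) in activities.items():
    ax.scatter(x, y)
    ax.annotate(name, (x, y), textcoords="offset points", xytext=(5, 5))

# Quadrant boundaries: 6 months of implementation time, mid-level resources.
ax.axvline(6, color="gray", linestyle="--")
ax.axhline(5, color="gray", linestyle="--")
ax.set_xlabel("Implementation time (months)")
ax.set_ylabel("Resources required")
ax.set_title("4 Quadrant Grid (hypothetical placements)")
plt.show()
```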

Reporting Results – How?

• PowerPoint presentations
• Flyers
• Posters
• In future requests for assessment
• Facebook, Twitter, website

Implement Change

The Assessment Cycle

Tools for Change – Theories

John Kotter’s Eight-Stage Process of Change:
1. Establish a Sense of Urgency
2. Create a Guiding Coalition
3. Develop a Vision & Strategy
4. Communicate the Change Vision
5. Empower Others for Broad-Based Action
6. Generate Short-Term Wins
7. Consolidate Gains & Produce More Change
8. Anchor New Approaches in the Culture

At the end of the day, assessment of academic advising is all about…
• developing consensus around collective expectations about student learning that should occur in advising
• gathering evidence in order to understand student learning resulting from academic advising
• using this evidence to support improvements in academic advising that will contribute to improvements in student learning

The Assessment Cycle – Continues

• What has changed that you want to monitor?
• What data did you not have in the last cycle? Can you gather it this time?
• Timeline

The Assessment Process

• Sharing ideas
• Accepting ideas
• Cross-institutional teams
• Understanding outcomes
• Developing tools

Be Open to New Lenses as You Understand the Impact of Advising

Assessment Cycle: Learn, Grow, and Change to Impact Student Learning & Success

Involve everyone – Team Assessment
• Your staff and students
• Don’t forget to share and discuss with your colleagues who are here today.
• Be open – never fear exchanging lenses.
• Be aware of your resources.
• Read.
• Utilize the work of others (with permission).
• Don’t let perfection stop your process!

Key Resources in Advising Assessment

• Institution
• Local tools
• National tools
• Paper tools
• Electronic tools

NACADA Core Values
NACADA Concept of Academic Advising
Council for the Advancement of Standards (CAS) – Academic Advising
NACADA Academic Advising Assessment CD – 2nd edition
NACADA Assessment Institute

Assessment Gifts

The opportunity to work collectively to impact student success through an international forum.

Collect the name of a colleague not at your institution who will be your assessment buddy.

Reference list on assessment.

Good Luck!

Questions

Contact Information

Sharon A. Aiken-Wisniewski, PhD

E-mail: saiken@uc.utah.edu

LEARN FROM YOUR GLOBAL ASSESSMENT PARTNERS

Success Elevated: Best Advising On Earth
2013 NACADA Annual Conference
Salt Lake City, Utah – Oct. 6-9
