
Leadership for Meaningful Assessment

Presented at the

NEASC Annual Conference, Boston

December 9, 2015

By

Trudy W. Banta
Professor of Higher Education
and Senior Advisor to the Chancellor for Academic Planning and Evaluation

Indiana University-Purdue University Indianapolis
301 University Boulevard, Suite 4049
Indianapolis, Indiana 46202
tbanta@iupui.edu

http://www.planning.iupui.edu

Outcomes Assessment

Process of providing credible evidence of

• resources

• implementation processes

• outcomes

To IMPROVE effectiveness of

• instruction

• programs

• services in higher education

© TWBANTA-IUPUI

Organizational Levels for Assessment


National

Regional

State

Campus

College

Discipline

Classroom

Student

Interviews with Leaders

#1. Articulate vision to improve learning in strategic plans

Use data to document improvement and to recognize people in

• Annual reports

• Budget hearings

• Program reviews


Interviews with Leaders

Encourage collaboration to use data in decision-making

• Administrators (academic and non-academic)

• Faculty

• Student Affairs professionals


Interviews with Leaders

Offer support and incentives

• Professional development

• Central data resources

• Rewards for faculty and staff


Interviews with Leaders

Communicate

• Listen and learn

• Disseminate methods, findings

• Publicize responsive improvements


Stan Ikenberry, 2015

1. Use evidence of learning to improve graduation rates

2. Inventory assessment activity and strengthen what works

3. Move from compliance to use of evidence to improve


Wabash Center of Inquiry, 2011

1. Audit information sources

2. Set aside resources for response to findings

3. Communicate findings, discuss broadly

4. Identify 1 or 2 outcomes for improvement

5. Engage students in interpreting findings and suggesting responses


What is your most pressing question about leadership for outcomes assessment?


Principles of Effective Practice
~ Phases of Assessment ~

* Planning

* Implementing

* Improving & Sustaining


Planning Effective Assessment

• Engage stakeholders

• Establish purpose

• Develop a written plan

• Time assessment appropriately


Encouraging Faculty Interest

What works in . . .

● changing the curriculum?

● teaching core skills?

● capturing student interest and retention?

● using technology in instruction?


Some Evaluative Questions

If we undertake a new approach:

• Is instruction more effective?

• Are students learning more?

• Are students more satisfied?

• Are faculty more satisfied?

• Do outcomes justify costs?

Connect Assessment to Processes Faculty Value

• Strategic planning

• Program review

• Curriculum revision

• Scholarship of Teaching & Learning

• Faculty development

• Promotion & tenure


Engage Students

● Stress importance in

▪ recruiting

▪ orientation

● Ask students to collect data

▪ questionnaires

▪ focus groups

● Ask students to interpret data

● Disseminate USE of findings widely


Engage Student Affairs Staff

• To enhance student learning

• To improve programs/services

• To capture expertise


Engage Employers

• In developing learning outcomes

• In collecting data from students, graduates

• In supervising internships

• In evaluating papers, projects, presentations


Implementing Effective Assessment

• Provide leadership

• Select/design data collection

• Provide resources

• Educate faculty & staff

• Assess resources and processes too

• Share findings


Leadership for Assessment

* President, Provost, Trustees

* Campus Coordinator

* Unit Experts


Select or Design Assessment Methods

1. Match with goals

2. Use multiple methods

3. Combine direct and indirect measures

4. Combine qualitative and quantitative measures

5. Consider pre-post design to assess gains

6. Use built-in points of contact with students

An Impossible Dream:

Using standardized test scores

to compare institutions

▪ 1990

▪ 2007


2007

Voluntary System of Accountability

~ Assessment of Learning ~

defined as

critical thinking, written communication, analytic reasoning


VSA Recommendations (over my objections)

Collegiate Assessment of Academic Proficiency (CAAP)

Measuring Academic Proficiency & Progress (MAPP) (now Proficiency Profile)

Collegiate Learning Assessment (CLA)

(College BASE)


In TN We Learned

1) No test measured 30% of gen ed skills

2) Tests of generic skills measure primarily prior learning

3) Reliability of value added = .1

4) Test scores give few clues to guide improvement actions

5) Hard to motivate students to take these tests


An Inconvenient Truth

.9 = correlation between CLA and SAT/ACT means at the institutional level

thus

81% of the variance in institutions’ scores is due to students’ prior learning

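As a quick check of the arithmetic on this slide: the proportion of variance one variable explains in another is the square of their correlation, so

```latex
r = 0.9 \;\Longrightarrow\; r^2 = (0.9)^2 = 0.81
```

that is, 81% of the between-institution variance in CLA means is already predictable from SAT/ACT means.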

Sources of Variance in K-12 Student Achievement

• 20% in-school factors

(10% teachers)

• 60% out-of-school factors

• 20% unknown factors

D. Goldhaber (2002)

© TWBANTA-IUPUI

American Educational Research Association

Do not use value-added models based on students’ standardized test scores to judge teachers, schools, or teacher preparation programs.

November 2015

Employing currently available standardized tests of generic skills to compare the quality of institutions is not a valid use of those tests.


Start with Measures You Have

• Assignments in courses

• Course exams

• Work performance

• Records of progress through the curriculum

To assess ethical behavior

Defining Issues Test

(James Rest, 1979, 1986, 1993)

Measure of Moral Reasoning

(Respondents choose a course of action when facing a moral dilemma)

See Mental Measurements Yearbook


The Mental Measurements Yearbook

-- Buros Institute -- University of Nebraska–Lincoln

http://buros.org/mental-measurements-yearbook


Student Electronic Portfolio

• Students take responsibility for demonstrating core skills

• Unique individual skills and achievements can be emphasized

• Multi-media opportunities extend possibilities

• Metacognitive thinking is enhanced through reflection on contents

- Sharon J. Hamilton, IUPUI

VALUE Rubrics
http://www.aacu.org/value/rubrics/index.cfm

• Critical thinking

• Written communication

• Oral communication

• Information literacy

• Teamwork

• Intercultural knowledge

• Ethical reasoning

Assess

* Resources – faculty, students, facilities

* Processes – curriculum, teaching

* Outcomes – learning, program effectiveness


Sharing Findings

* Dashboard

* Highlights

* Full report

▪ Written, posted to website

▪ Delivered in person


Improving & Sustaining Assessment

• Obtain credible evidence

• Ensure findings are used

• Reexamine the assessment process


~ Credible Evidence ~

1. Are we measuring something important?

2. Are findings consistent over time?

3. Are artifact raters able to agree?

4. Do findings provide direction for action?

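Rater agreement (question 3 above) can be quantified rather than eyeballed. A minimal sketch, using invented rubric scores (not data from this presentation): two raters score the same ten artifacts on a 4-point rubric, and we compute raw percent agreement plus Cohen's kappa, which corrects for agreement expected by chance.

```python
# Sketch: quantifying inter-rater agreement on rubric-scored artifacts.
# The ratings below are hypothetical illustrative data.
from collections import Counter

def percent_agreement(a, b):
    """Fraction of artifacts on which the two raters gave the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for chance (Cohen's kappa)."""
    n = len(a)
    observed = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    # Chance agreement: probability both raters independently pick the
    # same category, given each rater's own category frequencies.
    expected = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

rater1 = [4, 3, 3, 2, 4, 1, 2, 3, 4, 2]
rater2 = [4, 3, 2, 2, 4, 1, 3, 3, 4, 2]

print(round(percent_agreement(rater1, rater2), 2))  # 0.8
print(round(cohens_kappa(rater1, rater2), 2))       # 0.72
```

A kappa near 0.7 or above is commonly read as substantial agreement; raw percent agreement alone overstates consistency when some rubric levels are used far more often than others.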

Build Follow-Up Into Your Plans

1. Identify sources of support for responsive improvements

Are there committed administrators?

2. Speculate about findings and potential responses

3. Plan for collaborators to meet


Plan for Follow-Up

4. Develop communications plan to encourage broad dialogue about findings

5. Engage students in analysis and responding to findings

6. Use conversations to identify 1 or 2 outcomes for improvement

- Wabash National Study, Blaich & Wise


Leadership for Meaningful Assessment:

Dartmouth’s Story

Alicia Betsinger, PhD
New England Association of Schools and Colleges (NEASC)

Annual Conference December 9-11, 2015

Arts & Sciences

• Teagle Foundation Grant (2010-2012): Faculty Development at the Next Level: Departments and Academic Programs

– Embedding faculty development and learning-centered practices into departmental structures. At both Dartmouth and Brown, teaching and learning centers will work with a small group of departments to develop and implement department-wide student learning outcomes. Each institution will draw on the expertise and influence of their teaching and learning centers. The institutions' project leaders and faculty participants will work together to discuss assessment data, share information, and report on projects.

• Neuroscience
• Astronomy
• Linguistics

New Leadership

• President (July 2013)

• Executive Vice President (September 2013)

• Associate Provost of Institutional Research (April 2014)

• Provost (July 2014)

• Dartmouth Center for the Advancement of Learning (DCAL) Director (January 2015)

• Postdoctoral Fellow, Assessment and Evaluation (August 2015)

NEASC Interim Report

• College-Wide Inquiry: Gateway Initiative

• School-Level Inquiry

• General Education: First-year writing & Advising 360: coordinated advising system serving students through first six terms

• Graduate and Professional Schools

• Departmental-Level Inquiry: Teagle Grant departments, Classics

• Student- and Course-Level Inquiry: Biology/Chemistry, Mathematics

• Learning Assessment within Faculty of Arts & Sciences

Arts & Sciences: Next Steps

• Phase One: Faculty Assessment Committee

• Representation from all undergraduate divisions: Arts & Humanities, Interdisciplinary Programs, Sciences, and Social Sciences.

• Two-year membership

• Define what we are trying to learn about learning at Dartmouth

• Identify existing faculty efforts and their essential aspects and features

• Identify existing resources for inquiry into student learning

Arts & Sciences: Next Steps (cont.)

• Phase Two: Articulate five-year strategy for broadly engaging Arts & Sciences in sustained inquiry into student learning in three key areas:

• The major: courses, curriculum, departments/programs

• General Education: distributives, writing requirement, and language requirement at the course, curriculum, and institution levels

• Graduate programs: courses, curriculum, programs

Final Thoughts

Leadership comes in many forms

Impacts of leadership transitions vary

Willingness to engage in tough discussions is impeded or facilitated by those in leadership positions

Activity

Scenario 1: You are a new President/Provost and your accreditation report is due in a year. What specific actions do you take to support the work?

Scenario 2: You are the new Accreditation Liaison Officer (ALO) and your accreditation report is due in a year. What specific actions do you take to gain the support needed to submit a successful report?

Leadership for Meaningful Assessment: The BHCC Perspective

Dr. Maria Puente
Dean Lori A. Catallozzi
Bunker Hill Community College

New England Association of Schools and Colleges (NEASC)

Annual Meeting December 9, 2015

Assessment can only be meaningful to us if it is:

• Mission-centric
• Outcomes-based
• Broad and inclusive
• Reflective, active and continuous

Making Assessment Institutionally Meaningful: The BHCC Experience

Our College has had meaningful assessment experiences because the leadership:

• values a mission-driven approach

• emphasizes and supports a culture of evidence

• creates opportunities and structures for sustaining broad-based, inclusive participation

• engages the campus community in cycles of collaborative reflection, planning & action

Leading an Institutionally Meaningful Assessment Process: The BHCC Experience

Our Mission Statement:

Bunker Hill Community College serves as an educational and economic asset for the Commonwealth of Massachusetts by offering associate degrees and certificate programs that prepare students for further education and fulfilling careers. Our students reflect our diverse local and global community, and the College integrates the strengths of many cultures, age groups, lifestyles and learning styles into the life of the institution. The College provides inclusive and affordable access to higher education, supports the success of all students, and forges vibrant partnerships and pathways with educational institutions, community organizations, and local businesses and industries.

The BHCC Experience: Assessment as Mission-Centric

The phrase “…supports the success of all students…” speaks to one of our institutional values, Inclusiveness and Equity.

But how do we know if these values are really being practiced? How can we assess equity in a meaningful way?

• Success of Pell students vs. non-Pell students

• Success of students of color vs. success of White students

• Why these measures? Narrowing of achievement gaps is an indicator of progress toward greater inclusiveness & equity

The BHCC Experience: An Example of Assessment as Mission-Centric

Outcomes can be both quantitative and qualitative.

Student Learning Outcomes Assessment Program (SLOAP) – an example of outcomes-based assessment

• Faculty-driven initiative intended to promote a College-wide culture of assessment; focus on projects that help close the feedback loop

SLOAP’s Writing Across the Curriculum initiative: a long-running, college-wide initiative that allows faculty to assess students’ writing and critical thinking abilities across disciplines

• Fall 2010 & Fall 2011 projects to improve students’ critical thinking

The BHCC Experience: Assessment as Outcomes-Based

The goal: Inquiry-based, evidence-informed decision-making across all programs and initiatives at all levels of the institution

• Learning Communities
• Developmental Education Reforms
• Civic and Community Engagement
• Learn & Earn
• LifeMap
• Annual Unit Planning & Program Review

The BHCC Experience: Assessment as Broad & Inclusive

How assessment of major initiatives & programs informs our strategic planning:

• Expansion of SLOAP projects and framework

• Scale-up of developmental education acceleration

• Engagement of faculty in professional development that fosters proven pedagogical practices

• Expansion of high-impact practices for students in the 30-60 credit range

• Integration of culturally inclusive curricula and practice

The BHCC Experience: Assessment as Reflective, Active & Continuous

• Learning Community impact on six key indicators of student success

• College-wide increases in fall-to-fall retention, overall course completion, developmental education and gateway course completion

• Narrowing and closure of racial/ethnic disparities in achievement

• Successful course completion and retention of Pell grant students

The BHCC Experience: Gains & Achievements

• Implement outcomes-based General Education Program

• Integrate LifeMap across the curriculum

• Integrate high-impact practices at the 30-credit mark

• Increase percentage of students who graduate within four years

• Increase percentage of students who earn a degree before transfer

• Continue to focus on closing achievement gaps

Each table will focus on one of four data handouts and should be prepared to briefly report out.

Questions for table discussion:

1. What does the data tell you? Are there overall trends? Disparities?

2. What conclusions can you draw from the data?

3. What additional research questions does the data raise?

Activity: Data Jigsaw


Figure 1. Year-to-Year Persistence of Students in the ATD Network for the 2004-2011 Cohorts (BHCC denoted as “My College”).

Source: 2014 Achieving the Dream Benchmarking for Success Report

Figure 2. Fall-to-fall Retention at BHCC by Race and Pell Status, 2007-2013. Source: BHCC Office of Institutional Effectiveness

Figure 3. Successful Course Completion of BHCC Students by Race/Ethnicity, Fall 2007 – Fall 2013. Source: BHCC Office of Institutional Effectiveness

Learning Community Seminars (n = 2652) vs. Comparison (n = 887); Learning Community Clusters (n = 1407) vs. Comparison (n = 662)

Figure 4. Learning Community Retention & Credits Earned Using Matched-Pair Analysis, 2008-2012. Source: DVP-PRAXIS LTD and OMG Center for Collaborative Learning, 2013

How has leadership for assessment influenced the outcomes assessment process at your institution?

Given what you have heard today, do you have any new questions about outcomes assessment?
