TRANSCRIPT
Gloria M. Rogers, Ph.D.
Institutional Research, Planning, and Assessment
Rose-Hulman Institute of Technology
Terre Haute, Indiana, USA

8th Improving Student Learning Symposium: Improving Student Learning Strategically
“Strategies for Harnessing Information Technology to Facilitate Institutional Assessment”
Overview
• Use of models to guide institutional strategies for improving student learning
• Assessing student learning
• Best practices for student assessment
• Brief history of the RHIT process
• Assessment model/taxonomy
• A case study - demonstration
• Benefits to teaching/learning
• Assessment method truisms
• Barriers to faculty involvement
• Advice from the field
Use of Principles of Best Practice for Assessment of Student Learning in guiding development of the assessment “system”
• Value of using models to guide practice
• Recognition of local constraints
[Diagram labels: INPUTS, OUTCOMES]
Rose-Hulman Institute of Technology
Terre Haute, Indiana, USA
• 1600+ undergraduate students
• B.S. degrees in engineering, science, and mathematics
• Median SAT scores 1350 (700M, 650V)
• 80%+ engineering students
Brief History
• Presidential Commission of faculty, staff, and students appointed in Spring 1996 to develop a plan for the assessment of student outcomes
• Provide for continuous quality improvement
• Meet outcomes-based accreditation standards
  – Regional (NCA)
  – Program (ABET)
Assessment for Continuous Improvement
[Cycle diagram: Institutional Mission and Accreditation Requirements, with input from Constituents and Accreditation, feed Educational Goals & Objectives → Program Outcomes → Measurable Performance Criteria → Educational Practices/Strategies → Assessment: Collection and Analysis of Evidence → Evaluation: Interpretation of Evidence → Feedback for Continuous Improvement, which loops back to the goals.]
Taxonomy of Approaches to Assessment (Terenzini, JHE Nov/Dec 1989)
[Matrix with three dimensions:
• Purpose of assessment (Why?): Learning/Teaching (Formative) vs. Accountability (Summative)
• Level of assessment (Who?): Individual vs. Group
• Object of assessment (What?): Knowledge, Skills, Attitudes & Values, Behavior
Representative approaches by cell:
• Individual, Formative: Competency-Based Instruction, Assessment-Based Curriculum, Individual Performance Tests; Placement: Advanced Placement Tests, Vocational Preference Tests, Other Diagnostic Tests
• Individual, Summative (“Gatekeeping”): Admissions Tests, Rising Junior Exams, Comprehensive Exams, Certification Exams
• Group, Summative (Campus and Program Evaluation): Program Reviews, Retention Studies, Alumni Studies, “Value-added” Studies
• Group, Formative: Program Enhancement
Note: Individual assessment results may be aggregated to serve program evaluation needs.]
Rose-Hulman’s Mission
To provide students with the world’s best undergraduate education in engineering, science, and mathematics in an environment of individual attention and support.
• Input: Recruit highly qualified students, faculty, and staff
• Quality: Provide an excellent learning environment
• Climate: Encourage the realization and recognition of the full potential of all campus community members
• Outcomes: Instill in our graduates skills appropriate to their professions and life-long learning
• Resources: Provide resource management & development that supports the academic mission
Outcomes: Instill in our graduates skills appropriate to their professions and life-long learning
• Ethics and professional responsibility
• Understanding of contemporary issues
• Role of professionals in the global society and ability to understand diverse cultural and humanistic traditions
• Teamwork
• Communication skills
• Skills and knowledge necessary for mathematical, scientific, and engineering practice
• Interpret graphical, numerical, and textual data
• Design and conduct experiments
• Design a product or process to satisfy a client's needs subject to constraints
Why portfolios?
• Authentic assessment
• Capture a wide variety of student work
• Involve students in their own assessment
• Professional development for faculty
Why “electronic” portfolios?
• Student-owned laptop computer program since 1995
• Classrooms, residence halls, common areas, library, fraternity houses all wired
• Access
• Efficient
• Cost effective
• Asynchronous assessment
RosE-Portfolio Structure
• Student: Submit, Review, Search, Dynamic Resume, Access Control
• Advisor: View advisee’s portfolio, Search advisee’s portfolio
• Admin: User Management, Group Management, System Configuration, Criteria Tree, Activity Management
• Employer: View, Search
• Rater: Inter-rater Reliability, Rating Sessions, Feedback, Rating Management
• Faculty: Submit, Review, Search, Curriculum Map, PTR Portfolio
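The structure above reads as a role-based access-control table. A minimal sketch in Python, using the role and capability names from the slide; the grouping of capabilities under roles is inferred from the slide layout, and the `can` helper is an illustration, not the actual RosE-Portfolio implementation:

```python
# Role -> capabilities, as listed on the RosE-Portfolio structure slide.
# The assignment of capability groups to roles is inferred from the layout.
PERMISSIONS = {
    "Student":  {"Submit", "Review", "Search", "Dynamic Resume", "Access Control"},
    "Advisor":  {"View Advisee's Portfolio", "Search Advisee's Portfolio"},
    "Admin":    {"User Management", "Group Management", "System Configuration",
                 "Criteria Tree", "Activity Management"},
    "Employer": {"View", "Search"},
    "Rater":    {"Inter-rater Reliability", "Rating Sessions", "Feedback",
                 "Rating Management"},
    "Faculty":  {"Submit", "Review", "Search", "Curriculum Map", "PTR Portfolio"},
}

def can(role: str, capability: str) -> bool:
    """True if the given role is granted the given capability."""
    return capability in PERMISSIONS.get(role, set())
```

For example, `can("Employer", "Submit")` is false: under this reading, employers may only view and search student work.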
Show Me!
Assessment of student material
• Faculty work in teams
• Each team assesses one learning objective
• Score holistically
• Emerging rubrics:
  – Does the reflective statement indicate an understanding of the criterion?
  – Does the reflective statement demonstrate or argue for the relevance of the submitted material to the criterion?
  – Does the submitted material meet the requirements of the criterion at a level appropriate to a graduating senior at R-HIT?
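Team scoring of this kind usually tracks inter-rater reliability (a capability the Rater role lists). A minimal sketch of Cohen's kappa for two raters answering one yes/no rubric question; the ratings shown are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of submissions the raters scored the same.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c]
                   for c in set(rater_a) | set(rater_b)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical yes/no ratings on six submissions
a = ["yes", "yes", "no", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes"]
```

Here the raters agree on 5 of 6 submissions, and kappa works out to 2/3, i.e. substantial but imperfect agreement beyond chance.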
Show Me!
Example of Results
1. Understand criterion?
2. Submission relevant to criterion?
3. Meet standards for R-HIT graduate?
Example of Results: Does submission meet the standards for a graduate of R-HIT?
[Bar chart: Communication Skills - All Criteria; percent meeting standard (0–100%) by criterion:
1. Appropriate for audience
2. Organization
3. Content factually correct
4. Test audience response
5. Grammatically correct]
Linking results to practice
• Development of Curriculum Map
• Linking curriculum content/pedagogy to knowledge, practice, and demonstration of learning outcomes

Show Me!
Curriculum Map Results: Fall 1999-2000 (181 courses/labs)
Communication Skills
[Bar chart: Explicit 69%, Competence 86%, Feedback 70%, Not Covered 7%]
Curriculum Map Results: Fall 1999-2000 (181 courses/labs)
Ethics
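Tallies like the 181-course results above can be produced mechanically once each course reports how it treats an outcome. A minimal sketch; the course codes and entries are hypothetical, not the actual RHIT map:

```python
# Hypothetical curriculum-map entries for one outcome (Communication Skills).
# Each course reports one or more categories, following the slide's labels:
# "Explicit" (explicitly taught), "Competence" (students must demonstrate it),
# "Feedback" (students receive feedback on it), or "Not Covered".
curriculum_map = {
    "RH330": {"Explicit", "Competence", "Feedback"},
    "ME430": {"Competence", "Feedback"},
    "MA111": {"Not Covered"},
    "CM201": {"Explicit", "Feedback"},
}

def coverage(cmap, category):
    """Percent of courses/labs reporting a given category for this outcome."""
    return 100 * sum(category in cats for cats in cmap.values()) / len(cmap)
```

With this toy map, `coverage(curriculum_map, "Explicit")` is 50.0 and `coverage(curriculum_map, "Feedback")` is 75.0; the real report aggregates the same way over all 181 courses/labs.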
Closing the loop
[Annual cycle diagram, JAN–DEC:]
• The Institute assessment committee prepares reports of the collected data (e.g., surveys, e-portfolio ratings) for submission to Department Heads.
• The Evaluation Committee receives and evaluates all data, makes a report, and refers recommendations to appropriate areas.
• The Institute acts on the recommendations of the Evaluation Committee.
• Reports of actions taken by the Institute and the targeted areas are returned to the Evaluation Committee for iterative evaluation.
Primary focus
It is not about electronic portfolios. It is about:
• supporting teaching and learning
• faculty and student development
• the transformation of the teaching/learning environment
Benefits to teaching
• Faculty are asked to reflect on learning outcomes in relation to practice
• Consider the value of stated outcomes
  – Right ones?
  – Right performance criteria?
  – Individual faculty role in creating the context for learning
• Develop a common language and understanding of program/institutional outcomes
• Explicit accountability
• Promotes interdisciplinary discussions/collaborations
Benefits to learning
• Students review their own progress as it relates to expected learning.
• Portfolios provide a way for students to make learning visible and become the basis for conversations and other interactions among students and faculty.
• Learning is viewed as an integrated activity, not as isolated courses.
• Students learn to value the contributions of out-of-class experiences.
• Student reflections are metacognitive as they appraise their own ways of knowing.
• Portfolios promote a sense of personal ownership over one’s accomplishments.
Assessment method truisms
• There will always be more than one way to measure any outcome.
• No single method is good for measuring a wide variety of different student abilities.
• There is a consistently inverse relationship between the quality of measurement methods and their expediency.
• Pilot testing matters: check whether a method works for your program (students & faculty).
Barriers to implementation
Faculty
• current workload
• lack of incentive to participate in the process (rewards)
• “what’s in it for me” (cost/benefits)
Institutional/program leadership
• Lack of vision for the program/institutional assessment process (no existing, efficient models)
• Cost/benefit unknown
• Difficulty of restructuring the reward system to facilitate faculty participation
Process deficiencies
• Lack of understanding of the dynamics of organizational change
• Absence of “tools” to facilitate collaborative work
Portfolio deficiencies
• Ill-defined purpose
• Lack of efficient ways to manage the portfolio process
• Systematic review of portfolio contents is ill-defined or non-existent
• Student and faculty roles not clear
• Portfolio process not integrated into the teaching/learning environment
Resource deficiencies
• Expertise in portfolio development
• Development of “authentic” portfolio
Advice from the field
• You cannot do it all - prioritize
• All assessment questions are not equal
• One size does not fit all
• It’s okay to ask directions
• Take advantage of local resources
• Don’t wait until you have a “perfect” plan
• Decouple from faculty evaluation
E=MC²
DEMO Site
http://www.rose-hulman.edu/ira/reps/