
PEER LEARNING THROUGH ASSESSMENT: A CASE STUDY IN ACTIVE LEARNING TO DEVELOP HIGHER ORDER THINKING

Hsueh-fen Chen, Ph.D., Assistant Professor, School of Public Health
Carol A. Kominski, Ph.D., Assessment Specialist, Center for Learning & Development

ABSTRACT

Higher order thinking (HOT), according to Bloom’s (1956) taxonomy, requires application, analysis, evaluation, and creation skills. However, assessing these skills efficiently, especially in large classes, frequently necessitates the use of multiple choice questions (MCQs). Although it is relatively easy to construct MCQs that assess lower order thinking skills such as recall, recognition, and understanding, constructing MCQs that assess HOT skills is time-consuming and difficult. This study turns the tables by using the PeerWise software system (Denny et al., 2008) to actively engage students in constructing MCQs, explaining the reasoning behind each question, answering other students’ questions, and evaluating the quality of each question using a HOT rubric.

INTRODUCTION

UNTHSC’s Quality Enhancement Plan (QEP) focuses on developing students’ higher order thinking (HOT) skills across health sciences curricula. HOT is defined according to Bloom’s taxonomy.

[Figure: Bloom’s taxonomy]

• In higher education, active learning strategies are increasingly used to develop students’ HOT skills. Active learning strategies require the learner to engage in activities such as structuring the learning environment, working cooperatively with other learners, solving authentic problems, reflecting on the learning experience, building on what one already knows, and connecting existing ideas in new ways.

• “PeerWise is a system in which students create multiple choice questions and answer those created by their peers” (Denny et al., 2008). Creating questions, providing explanations of correct answers, answering other students’ questions, and commenting on and evaluating those questions all require active engagement. The questions students create can also help them improve mastery in preparation for exams.

• HMAP 5328 is a Human Resources course for graduate students and one of ten courses participating in UNTHSC’s QEP. It is required for MHA students and an elective for MPH students.

• Details of participants:

• Course period: Spring semester 2013
• Enrollment: 20 students
• Program: 71% of enrolled students are in the MHA curriculum.
• Work experience: 53% have 3 or fewer years of full-time experience.
• Curricular workload: 71% are enrolled in more than 3 classes.


References

Denny, P., Hamer, J., Luxton-Reilly, A., & Purchase, H. (2008). PeerWise: Students sharing their multiple choice questions. In ICER ’08: Proceedings of the Fourth International Workshop on Computing Education Research (pp. 51–58). New York, NY: ACM.

Luxton-Reilly, A., & Denny, P. (2010). Constructive evaluation: A pedagogy of student-contributed assessment. Computer Science Education, 20(2), 145–167.

PURPOSE

The purpose was to explore the feasibility of using student construction of multiple choice questions (MCQs) to foster active learning and improve higher order thinking skills in a graduate public health course. Student performance statistics, an electronic survey, and focus groups were used to assess success in engaging students and developing HOT skills.

METHODOLOGY

Hands-on lab: PeerWise was introduced in a hands-on lab in which students practiced creating MCQs, answering questions created by other students, and rating questions.

HOT rating rubric: Students evaluated each question according to a five-point holistic rubric based on question clarity, completeness of the explanation, and the extent to which the question addressed HOT skills.

Statistics: Statistics collected within the PeerWise software system assessed level of student engagement and quality of student contributions.

• Number of questions created
• Number of questions answered
• Number of comments made
• Average question rating
• Student opinions from survey

Survey: A survey administered halfway through the course gauged students’ satisfaction with PeerWise, the extent to which students were actively engaged, and the extent to which they thought creating, answering, rating, and commenting contributed to development of HOT skills and course mastery.

RESULTS

INCENTIVES/REQUIREMENTS FOR PARTICIPATION

Topic Areas
Cultural Competence, Diversity and Disparity, Employee Benefits, Job Design & Analysis, Motivation, Performance Management, Retention, Wellness Program

Requirements
Question Creation: Each student must create one question per topic area.
Question Answering: Each student must answer at least five questions per topic area.
Scoring: Students earn 4 points for created questions and 4 points for answered questions if they meet each requirement. Students earn up to 4 additional points for higher-rated questions that they create (see the sketch after this section for a worked example).

Optional Student Activities
Each student is expected, but not explicitly required, to rate the questions they answer on a five-point scale based on the HOT rubric. Students can freely comment on and respond to other students’ comments, choose to follow other students, and rate questions as easy, medium, or hard.

Incentives
Highest-rated questions are selected for the midterm and final exams.
Percent of grade earned through PeerWise activities = 10%.
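To make the scoring arithmetic concrete, the following is a minimal sketch of one plausible reading of the rules above. The function name, data layout, and proportional bonus scaling are illustrative assumptions, not the course’s actual gradebook logic (the poster says only that up to 4 extra points are available for higher-rated questions).

```python
# Illustrative sketch only: one plausible reading of the scoring rules
# above, not the course's actual gradebook. The proportional bonus
# scaling is an assumption; the poster states only "up to 4" extra points.

def peerwise_topic_score(questions_created: int,
                         questions_answered: int,
                         avg_hot_rating: float) -> float:
    """Score for one topic area, out of a maximum of 12 points."""
    creation_points = 4 if questions_created >= 1 else 0    # met creation requirement
    answering_points = 4 if questions_answered >= 5 else 0  # met answering requirement
    # Assumed bonus: scale the 4 available points by the average HOT
    # rating of the student's created questions (rubric range 1.0-4.0).
    bonus_points = 4 * (avg_hot_rating / 4.0) if questions_created >= 1 else 0.0
    return creation_points + answering_points + bonus_points

# A student who wrote one question rated 3.5 and answered six questions
# would earn 4 + 4 + 3.5 = 11.5 points for that topic area.
print(peerwise_topic_score(1, 6, 3.5))  # 11.5
```

Under this reading, per-topic scores across the eight topic areas would then be scaled so that PeerWise activity contributes the stated 10% of the course grade.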

RUBRIC FOR RATING QUESTIONS

Required Participation

• 8 questions written
• 40 questions answered
• No minimum number of comments
• PeerWise participation scores ranged from 1,499 to 3,917 at the end of the assignment.

Question Ratings Based on HOT Rubric

• 4.0 = clearly written question & explanation requiring application, analysis, or evaluation
• 3.0 = clearly written question and explanation requiring understanding
• 2.0 = clearly written question and explanation requiring recall only
• 1.0 = clearly written question but poor or missing explanation
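For illustration, the holistic rubric amounts to a small lookup table. The encoding below is a sketch under that assumption, for demonstration only; neither the dictionary nor the helper is part of PeerWise itself.

```python
# Illustrative encoding of the HOT rating rubric above; the dictionary
# and helper are demonstration assumptions, not part of PeerWise.
HOT_RUBRIC = {
    4.0: "clearly written question & explanation requiring application, analysis, or evaluation",
    3.0: "clearly written question and explanation requiring understanding",
    2.0: "clearly written question and explanation requiring recall only",
    1.0: "clearly written question but poor or missing explanation",
}

def describe_rating(rating: float) -> str:
    """Map an averaged rating to the description of the nearest rubric level."""
    nearest = min(HOT_RUBRIC, key=lambda level: abs(level - rating))
    return HOT_RUBRIC[nearest]

print(describe_rating(3.8))  # nearest level is 4.0: application/analysis/evaluation
```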

Student recommendations for future classes

• 83% recommended that PeerWise definitely or probably be used for future HMAP 5328 classes.
• 69% would like to see other instructors use PeerWise in their courses.

Student opinion of PeerWise course requirements

• 88% said writing one question per week was an appropriate demand.
• 83% said answering five questions per week was an appropriate demand.
• 43% would like to see higher grades awarded to students who construct questions requiring higher order thinking skills.

Student opinion of feedback from other students

• 64% said that PeerWise was slightly or much better than other SPH classes at encouraging them to learn from their peers.
• Only 35% said that student ratings of their questions were somewhat fair, while 65% said the ratings were somewhat or very unfair.

STUDENT COMMENTS

Reasons for recommending that PeerWise be used in future HMAP courses and in other SPH courses

• Yes, PeerWise required me to think deeply into the articles. The scores themselves motivated me to continuously try to improve on my previous score.
• Encourages us to read before class. Sometimes we miss important points, when we answer the questions, we learn them.
• It improved my reading skills and developed an interest.
• Helps in midterm exam review. Allows class members to be participative with one another.
• I feel like PeerWise is another learning tool which helps students learn from one another and not just from the professor.
• Some students derive questions from topics that I personally might not have focused on.

LESSONS LEARNED: NEXT STEPS

Reasons for improving the way students rate questions and provide feedback

• Some people rated unfairly and were too strict.
• I prefer for the professor to determine if we are writing questions of higher order thinking as opposed to the other students.
• There was a discrepancy on the way students were formulating the questions and how students were evaluating them.

Next steps

• Retain one created and five answered questions per week as a requirement.
• Provide models of questions at all levels of thinking at the beginning of the course to demonstrate HOT questions.
• Weight instructor ratings more heavily in grading than student ratings, and provide instructor feedback to students on a regular basis.
• Consider team competitions for question creation.
• Model more substantive as well as supportive comments.
• Provide more rewards for creating HOT questions.

EXAMPLE OF STUDENT HOT QUESTION

As a public health specialist, you have developed the ability to work effectively within the cultural context of a community that shares a diverse cultural background. This concept is widely known as cultural competence and can be understood from Betancourt’s article. As implied in the article, among stakeholders in managed care, government, and academe, cultural competence is emerging as an important strategy to address health care disparities. However, why can’t cultural competence alone address health disparity?

A. Cultural competence is not well implemented in different health settings.
B. Discrimination against racial groups is a structural problem in the U.S. that cannot be solved with cultural competence techniques.
C. There are other issues, such as educational level and low socioeconomic status (SES), which contribute to health disparities and cannot be addressed by cultural competence.
D. It is not true; the National Institutes of Health (NIH) and the Agency for Healthcare Research and Quality (AHRQ) have solved the problem of health care disparities in the U.S.