

USET Participant: Kolby Sorenson, Mechanical Engineering
Faculty Mentor: Dr. Debra J. Mascaro, Mechanical Engineering

In their first semester of the Mechanical Engineering (ME) program, our students learn the basics of design methodology, mechanical hardware, physics, and modeling concepts during course lectures. These concepts are then applied in a team-based design project that culminates in an end-of-semester design competition. Based on our past observations, we felt that our students were deprived of timely, individualized feedback on their application of these design concepts. In engineering design, there is no single “right” answer. Instead, students must learn to use design objectives, constraints, target specifications, and metrics to reduce an infinite “design space” (all possible combinations of design solutions) to a smaller pool of candidate solutions. From this limited pool, students must practice using tools (decision matrices, pairwise comparison charts, etc.) to help them systematically select their “best” design option.
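The decision-matrix tool mentioned above lends itself to a short worked example. The sketch below implements a basic weighted decision matrix in Python; the criteria, weights, and concept names are hypothetical illustrations, not taken from the course materials.

```python
# Minimal sketch of a weighted decision matrix: score each design
# concept against weighted criteria, then pick the highest total.
# All criteria, weights, and scores here are hypothetical examples.

criteria_weights = {"cost": 0.3, "reliability": 0.5, "ease_of_build": 0.2}

# Each concept is scored 1-5 against every criterion.
concept_scores = {
    "gear_drive":   {"cost": 2, "reliability": 5, "ease_of_build": 3},
    "belt_drive":   {"cost": 4, "reliability": 3, "ease_of_build": 4},
    "direct_drive": {"cost": 3, "reliability": 4, "ease_of_build": 5},
}

def weighted_total(scores, weights):
    """Sum of score * weight over all criteria."""
    return sum(scores[c] * w for c, w in weights.items())

totals = {name: weighted_total(scores, criteria_weights)
          for name, scores in concept_scores.items()}

# The "best" concept is the one with the highest weighted total.
best = max(totals, key=totals.get)
```

In practice, the value of the exercise is less in the arithmetic than in forcing students to make their criteria and weights explicit and defensible before ranking concepts.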

[1] Amelink, C.T. and Creamer, E.G., “Gender Differences in Elements of the Undergraduate Experience that Influence Satisfaction with the Engineering Major and the Intent to Pursue Engineering as a Career,” Journal of Engineering Education, 2010, Vol. 99, pp. 81-92.
[2] Vogt, C.M., “Faculty as a Critical Juncture in Student Retention and Performance in Engineering Programs,” Journal of Engineering Education, 2008, Vol. 97, pp. 27-36.
[3] Campbell, T.A. and Campbell, D.E., “Faculty/Student Mentor Program: Effects on Academic Performance and Retention,” Research in Higher Education, 1997, Vol. 38, pp. 727-742.
[4] Ulbig, S.G. and Notman, F., “Is Class Appreciation Just a Click Away? Using Student Response System Technology to Enhance Shy Students' Introductory American Government Experience,” Journal of Political Science Education, 2012, Vol. 8(4), pp. 352-371.

BACKGROUND INFORMATION

PILOT PROGRAM STRATEGY

STUDENT SURVEY RESULTS

STUDENT PERFORMANCE/RETENTION CONT'D.

CONCLUSIONS

SOURCES

During the fall 2012 semester, we implemented a pilot program based on a two-part strategy: (1) increasing the time spent on project-related active learning activities during lecture, and (2) implementing an “Adopt-a-Lab” strategy in which additional ME faculty voluntarily joined individual lab sections to provide design feedback at critical points during the semester. The components that make up this two-part strategy are illustrated in the figure below.

Figure 1. Graphical summary of the overall four-step program strategy, showing the ideal timeline and process-flow relationships.

PROJECT OBJECTIVES

① Improve student comprehension of the engineering design process
② Improve the quality of design project-related assignments (DPAs) submitted by students
③ Improve student retention in the ME program

Table 1. Results of a student survey given after the implementation of the pilot program and strategy defined previously (n = 125).

Figure 2. This chart summarizes the results of a student survey question involving the number of in-class DPA-related activities. The majority of students felt that the course would be better with “more” or “the same number of” in-class DPA-related activities (those developed for the “Adopt-a-Lab” pilot program).

Table 2. Most common student feedback for open-response survey questions

Table 3. Average DPA grades from 2011 to 2012 – examination of improvement

Table 4. Average student final exam grades from 2011 to 2012 (for questions related to material covered in the “Adopt-a-Lab” pilot program) – examination of relative improvement

Table 5. Student retention in the ME program – examination of relative change from 2011 student surveys to 2012 surveys

Overall, this pilot program has shown a small but promising positive impact on student performance on assignments and exams. Using the detailed feedback in Tables 1-4 and Figure 2, we plan to refine our strategy, scope, and implementation plan and re-launch this program in the future. With these minor changes, we believe that future versions of this program and teaching strategy can achieve each of our primary objectives (outlined above) to a significant degree and improve the overall educational experience of our students.

DISCUSSION

Comparing quantitative student performance data from fall 2011 to that of fall 2012 (Tables 3 and 4), it is clear that the average student’s DPA and final exam grades improved with the implementation of this pilot program. This indicates that, in 2012, students (1) submitted higher quality and/or more complete work, and (2) showed improved comprehension of key learning objectives (those covered explicitly within the active learning/Adopt-a-Lab pilot program). From these results, we conclude that our pilot program achieved two of our primary objectives. Based on a survey of students’ plans for the following semester administered in fall 2011 and fall 2012 (Table 5), there seems to have been no significant change in student retention. Therefore, we conclude that the pilot version of this program did not achieve our goal of improving student retention in the ME program.

STUDENT PERFORMANCE/RETENTION RESULTS