Model-Eliciting Activities: Assessing Engineering Student Problem Solving and Skill Integration Processes*

TUBA PINAR YILDIRIM, LARRY SHUMAN and MARY BESTERFIELD-SACRE
University of Pittsburgh, Pittsburgh, PA 15260-0001, USA. E-mail: [email protected]
Int. J. Engng Ed. Vol. 26, No. 4, pp. 831–845, 2010.
* Accepted 15 October 2009.

A Model-Eliciting Activity (MEA) presents student teams with a thought-revealing, model-eliciting, open-ended, realistic, client-driven problem for resolution. Here we extend the original MEA construct developed by mathematics education researchers to upper-level engineering coursework and introduce an ethical component. Our extensions also require students to integrate previously learned concepts as they gain new understanding. We propose that MEAs offer engineering educators at least two potential benefits: improved conceptual understanding and a means for assessing the problem solving process. However, these benefits will only accrue if the MEAs are properly implemented. Consequently, relative to the first we propose certain strategies, learned through experience, for successful MEA implementation, recognizing that this is not a simple task. In addition, we suggest using MEAs as assessment tools and illustrate how they can help analyze students' problem solving processes. Our findings are based on experiments conducted in different learning environments over a two year period at the University of Pittsburgh's Swanson School of Engineering. This paper should serve as a resource for those engineering educators and researchers interested in implementing MEAs.

Keywords: Model-Eliciting Activity (MEA); ethical reasoning; teaching engineering concepts; assessment of problem solving

1. INTRODUCTION

A MODEL-ELICITING ACTIVITY (MEA) presents student teams with a thought-revealing, model-eliciting [1], open-ended, real-world, client-driven problem. MEAs are purported to improve conceptual learning and problem solving skills. Originally developed by mathematics educators, they were first introduced to engineering students, primarily at the freshman level, at Purdue University, seven years ago [2–4]. Since then, MEAs have slowly been finding their way into engineering classrooms at various levels, and have the potential to become a widely used engineering education tool. Besides their potential for improving learning, MEAs offer engineering educators a mechanism for assessing problem solving and engineering concepts. Although MEAs have been well published in the pre-college mathematics education literature, their promise and benefits are still relatively novel in engineering education. The purpose of this article is to better inform the engineering education community of MEAs and how our use of them in upper-level engineering courses is leading us to improve students' conceptual understanding, problem solving and ability to recognize and address ethical dilemmas. Further, through our modified design experiment approach and assessment tools for studying MEAs in the classroom, we provide early lessons for those seeking to add MEAs to their instructor's tool kit.

The paper is organized as follows: we provide a theoretical background supporting MEAs and MEA research. Next we describe our experiments to date and assessment tools, including MEA implementation strategies, the introduction of an ethical component and insights from using MEAs as an assessment tool. In our last section, we provide a summary of the findings and advice to faculty.

Specifically, we overview the literature that addresses three facets of MEAs' instructional benefits: improved conceptual understanding, problem solving, and teamwork skills, to which we now add a fourth: improved ability to recognize and resolve ethical dilemmas. However, here we stress an important lesson learned from our experience: these benefits will only be realized if the MEAs are correctly implemented. Factors that influence the success of MEA implementation include: the nature of the embedded concept and course, the MEA's purpose in the exercise, the time allocated for resolving the MEA, the extent of guided discovery provided by the instructor during the MEA solution session, the feedback given after completion, and the instructor's training relative to MEA use. We explore using MEAs for assessing important engineering outcomes. Based on our experiments, we illustrate how they can be used to analyze problem solving patterns, showing how personal digital assistants (PDAs) and reflection in particular can provide important information to the instructor.

2. BACKGROUND

MEAs were originally developed by mathematics education researchers [5, 6] to both promote problem solving by encouraging students to build mathematical models and provide a mechanism to understand students' thought processes. They used MEAs to observe the progress of student problem-solving competencies and the growth of mathematical cognition. Concomitantly, MEAs became a tool for both instructors and researchers to not only observe but also design situations that engage learners in productive mathematical thinking [5, 8, 9].

MEAs are constructed using six specific principles as shown in Table 1 [4, 7]. It is proposed that a well-designed MEA following these principles can contribute to students' understanding of engineering concepts, problem solving, communications and teamwork skills. Diefes-Dux et al. [4] emphasize that MEAs are not about the solution itself, but the process of problem solving. Specifically, it is the process that students follow while solving an MEA, combined with elements of model generation and reporting, that differentiates MEAs from other typical engineering problems, as well as from problem-based learning.

This emphasis on building, expressing, testing and revising conceptual models is also what differentiates MEAs from 'textbook' problem-solving activities. Other distinguishing characteristics include the length of time required for resolution, access to different information resources, the number of individuals involved in the problem-solving process, and the type of documentation required in resolving an MEA. A typical MEA is a team exercise; this (multidisciplinary) teamwork practice also reinforces the students' learning. We posit that MEAs can play three distinct roles in helping students learn an engineering concept:

• Integrate learning from previous courses with new information (integrator);

• Reinforce the concepts that are currently being covered (reinforcer); and

• Discover a concept that has yet to be formally introduced (discoverer).

Since MEAs often require students to apply mathematical or other structural interpretations to situations that cut across multiple disciplines, they may have to make new connections, combinations, manipulations, or predictions, or look at the problem in other ways in order to resolve the posed scenario. Hence, MEAs can require students to employ learning from previous courses, integrating it with new information.

Since MEAs have been introduced into the engineering classroom, a rich body of research has begun to emerge. We generalize this literature into three streams: (1) studies that provide examples for MEA implementation, (2) studies that suggest using MEAs to achieve specific learning goals, and (3) studies that investigate the MEA solution process. Our study contributes to all three of these streams.

Table 1. MEA Construction Principles

Model Construction: The student team must create a mathematical model (system) that addresses the needs of a given client. A mathematical model: a system used to describe another system, make sense of a system, explain a system, or to make predictions about a system.

Reality: The activity is set in a realistic, authentic engineering context and requires the development of a mathematical model for solution. A well-designed MEA requires students to make sense of the problem context by extending their existing knowledge and experience. The MEA should create the need for problem resolution, ideally making the student team behave like engineers working for the particular organization.

Self Assessment: As the model develops, students must perform self-evaluation of their work. The criterion for 'goodness of response' is partially embedded in the activity by providing a specific client with a clearly stated need. The criterion should also encourage students to test and revise their models by pushing beyond initial ways of thinking to create a more robust model that better meets the client's needs.

Model Documentation: The model must be documented; typically students write a memo to the client describing their model. The MEA is not only model-eliciting, but thought-revealing; i.e., the team's mathematical approach to the problem is revealed in the client deliverable. This process enables students to examine their progress, assess the evolution of the mathematical model, and reflect about the model. It provides a window into students' thinking, which can inform instruction.

Generalizability: The created model must be sharable, transferable, easily modifiable, and/or reusable in similar situations. It must be generally useful to the client and not just apply to the particular situation; i.e., it must be capable of being used by other students in similar situations, and robust enough to be used repeatedly as a tool for some purpose.

Effective Prototype: The solution to an MEA provides a useful prototype, or metaphor, for interpreting other situations. The activity needs to encourage the students to create simple models for complex situations. The underlying concepts must be important ideas. Students should be able to think back on a given MEA when they encounter other, structurally similar situations.


Diefes-Dux et al. provide an example of the studies within the first stream; they describe the use of MEAs with first-year engineering students, demonstrating that not only did the MEAs effectively introduce engineering concepts in context, but they also increased female students' interest in engineering [3]. Ahn and Leavitt [10] summarize their experiences with MEA implementations and recommend: (1) keeping the student team size to three or four, (2) arranging students with similar weaknesses in groups together, (3) making sure that there are enough instructors to guide students during the activity, (4) using activities that are relevant to students, (5) creating a prior engagement, and (6) recording data from the related activities for consequent MEAs. Although their suggestions are based on experiments with eighth graders, our experience with engineering students supports these findings, as explained in section 4.

Studies in the second stream suggest that the benefits of MEAs involve decreasing the educational gap between majority and underrepresented students, advancing students' creativity, and motivating them to use advanced engineering knowledge and techniques. As noted, Diefes-Dux et al. [3] documented that MEAs can advance the interest and persistence of women in engineering by providing a learning environment tailored to a more diverse population than traditional engineering course experiences. Chamberlin and Moon [11] used MEAs to develop creativity and identify creatively gifted students in mathematics, asserting that MEAs 'provide students with opportunities to develop creative and applied mathematical thinking; and analyze students' mathematical thinking . . . aiding in the identification of those students who are especially talented in domain-specific, mathematical creativity.' Moore et al. [12] discussed the opportunities for MEAs to motivate students to use upper level knowledge. Hallagan [13] used MEAs as exercises in analyzing the ways teachers think about and interpret their practice. Similarly, Berry [14] used MEAs to interpret the instructor's role in supporting and enhancing student teams' functioning. More recently, Hjalmarson et al. [15] used MEAs in analyzing the teacher actions that supported the design abilities of students. Self, Miller, Kean, Moore et al. [16] developed and used MEAs to help reveal the misconceptions of students in dynamics and thermal sciences courses. Most recently, Teran in Mexico has begun using MEAs to improve learning in upper division industrial engineering courses [17], spreading the use of engineering MEA applications into the international arena.

In the third and least developed stream, Moore et al. [18] investigated the impact of teamwork effectiveness on solution quality. Using self-ratings, she found a positive correlation between how effectively the team functioned and the goodness of its solution. Dark and Manigault [19] noted that two processes during an MEA solution make a difference: collaborative learning and model development. Working in groups develops critical thinking abilities and helps to reflect group thinking in the model. However, objective assessments of MEA effectiveness remain a gap in the literature. How students and teams of students navigate the problem solving process, and how MEAs impact student learning, still require further exploration.

3. RESEARCH METHODOLOGY AND ASSESSMENT TOOLS USED IN CONJUNCTION WITH MEAs

Our MEA research, to date, has focused on identifying the factors that influence implementation success and studying how the construct can be used to assess student learning. Our research methodology has involved four steps: MEA construction, implementation, assessment, and revision.

Pilot MEA construction began in early 2007, with initial implementations in summer 2007. Appendix A provides a summary of these MEAs, the environments in which the implementations occurred, and the concepts and skills targeted. In addition, we modified previously developed MEAs for use in upper level Industrial Engineering courses.

Table 2 provides a list of MEA implementations to date, giving the course where MEAs were introduced, student profile, concepts targeted, instructor training, the MEA's role in student learning, and how it was assigned. In general, an instructor chooses to implement a particular MEA that complements a specific engineering concept for a course. Depending on the instructor's personal teaching style, schedule, and course requirements, an implemented MEA may be used as an integrator, reinforcer or discoverer. Potential assessment tools (discussed below) are then determined to evaluate the potential student learning. Based on the evaluation of learning and feedback to the students, the MEA and assessment tools are modified for future use. As part of the engineering concept feedback, an engineering ethical component is also addressed with the students.

MEAs have been developed, and are being developed, in such diverse core engineering subjects as thermodynamics, materials, statistics, quality control, engineering economics, supply chain management, linear programming and engineering ethics. In our project alone, we have developed nearly 20 MEAs. Yet the engineering problem solving process and aspects of modeling differ from those for mathematics education. Math educators define problem solving as 'searching for a means (i.e., procedures, steps) to solve the problem, where the goal is to find a correct way to get from the given information to the goal(s) set forth' [20]. This is only one dimension of the engineering MEA, but not the primary objective. In contrast, engineering MEAs are created to elevate students' problem solving, modeling and decision making capabilities for the long term, even where the team is not developing an appropriate solution to the posed problem.

MEAs provide instructors with a medium for assessment; in response, tools have been developed to identify and analyze the steps students utilize while solving an engineering problem. Our experience suggests that these assessment tools, used in collaboration with an MEA, can provide valuable insights into the extent of the students' conceptual learning, and can enable the instructor to assess the actual teamwork process. We have used five different methods, initiated by engineering education researchers, for collecting student responses to MEAs: reflection tools, student reports, PDAs, Wikis, and test questions, as described below. (A sixth, pre- and post-concept tests, has been used successfully by Self, Miller et al. [16] and is now being added to our tool kit.)

Table 2. MEA Implementations at University of Pittsburgh

Course: Open Ended Problem Solving
Student Profile: Industrial Engineers / Juniors
MEAs Implemented: Supplier Development, Condo Pricing, Quality Improvement, Compressor Reliability, Volleyball, CD Compilation, Disaster Decision Modeling, Trees, Gown Manufacturing
Nature of MEA Implementation: Instructor trained; MEAs used as integrator and discoverer; in-class assignments
Assessment Data Collected: Reflections

Course: Engineering Ethics
Student Profile: Mixed Engineering Disciplines / Juniors and Seniors
MEAs Implemented: Trees
Nature of MEA Implementation: Instructor and TA trained; MEA used as integrator; take-home assignments
Assessment Data Collected: PDA data, Reports

Course: Statistics II
Student Profile: Mixed Engineering Disciplines / Sophomores
MEAs Implemented: SUV Rollover, Hazardous Material Transport
Nature of MEA Implementation: Instructor trained; MEA used as reinforcer (students apply fundamentals learned in the class); take-home assignments
Assessment Data Collected: PDA data, Reports

Course: EMPOWER Energy Sustainability
Student Profile: Mixed Engineering Disciplines / Juniors and Seniors
MEAs Implemented: Ethanol, Windmills
Nature of MEA Implementation: Instructor trained; MEA used as reinforcer (students had mixed understanding of the energy concepts); take-home assignments
Assessment Data Collected: Wikis

Course: Decision Models
Student Profile: Mixed Engineering Disciplines / Graduate students and Seniors
MEAs Implemented: Ethanol, FEMA Disaster Relief, Dam Construction
Nature of MEA Implementation: Instructor trained; MEA used as reinforcer; take-home assignments
Assessment Data Collected: No data collected

Course: Engineering Statistics I
Student Profile: Mixed Engineering Disciplines / Juniors and Seniors
MEAs Implemented: Tire Reliability, Defibrillator Lead, CNC Machine
Nature of MEA Implementation: Instructors not trained; MEA used as reinforcer; take-home assignments
Assessment Data Collected: PDA data, Reports, Test questions, Reflections

Course: Engineering Statistics I
Student Profile: Mixed Engineering Disciplines / Juniors and Seniors
MEAs Implemented: Tire Reliability, CNC Machine
Nature of MEA Implementation: Instructors not trained; MEA used as reinforcer; take-home assignments
Assessment Data Collected: Behavioral Observation, Reports, Test questions, Reflections

Course: Engineering Statistics II
Student Profile: Industrial Engineering / Sophomores
MEAs Implemented: Process Quality Control
Nature of MEA Implementation: Instructor trained; MEA used as reinforcer and discoverer; assigned as an in-class followed by take-home exercise
Assessment Data Collected: Reflections, Reports, Test questions

3.1 Reflection tools

Reflection tools were originally suggested by Lesh, Hamilton and their colleagues. Following an MEA activity, reflection tools help students recall and then record significant aspects about what they have done (e.g., strategies used, ways the team functioned, etc.) so that the instructor might use this information to discuss with students the effectiveness of various roles, strategies, and levels and types of engagement [21]. Reflection tools enable students to better develop their conceptual frameworks for thinking about learning and problem-solving by requiring them to reflect on aspects of the exercise or process just completed.

We have used surveys as reflection tools. We recently moved from paper to digital surveys to provide ease of data classification and collection. An example reflection survey is found in Appendix B. When implemented to assess the underlying problem solving process, reflection tools provide powerful information about three major identifiers of students' problem solving process:

• Whether or not students worked effectively as a team or relied primarily on a single individual,

• The extent that the team used an iterative problem solving approach, and

• The particular stages in the problem solving process on which the students focus.

An example of the summary response from a reflection tool is given in Fig. 1.

3.2 Student reports

Student reports, i.e., the actual assigned MEA reports (typically submitted in memorandum format to the 'client'), are another way to assess the success of MEA implementation. The team's level of understanding of the targeted concepts, whether the team used them, and whether they used them correctly can be determined from the report. Reasons that concepts may be used incorrectly include:

• Inappropriate background to understand the targeted concept,

• Insufficient guidance to students about expectations,

• Insufficient time to fully solve the problem (or students failing to allocate sufficient time and effort to properly solve the MEA), and

• Poorly written reports (students failing to clearly communicate the use of concepts).

A successful report should clearly describe the general model developed for resolving the type of problem presented by the client as well as the team's specific solution to the given problem. Assumptions should be clearly stated. If an ethical dilemma is embedded, the report should also address it and provide an appropriate resolution, in addition to pointing out other issues that might affect the recommended solution.

3.3 PDA recordings

We have used PDAs to collect data on problem solving patterns. Here, each student on the team is given a PDA. At specific time intervals while solving the MEA, students are asked to record their current task. To record the process, we utilize software enabling students to identify (1) the specific problem solving task being addressed, (2) the degree of progress at that point (not making progress, satisfactory, very good progress), and (3) whether the task is done as a group or individually. The PDAs were programmed to 'alert' every 10 minutes, at which point each student recorded the task, his/her progress and whether the work was conducted individually or in a team setting. The number of data points collected using the PDA depended on the total time each student devoted to the project. Figure 2 provides an example of one student's problem solving pattern.

In Fig. 2, the vertical axis shows the general problem solving tasks, where non-productive tasks are shown below the timeline to indicate no progress at these instances. Triangles denote that the team is working on the task together; squares indicate individual task work. The larger the triangle or square, the more engaged the student(s) is (are) and the more progress is being made on that particular task (and the project in general).

PDAs have allowed us to capture not only the problem solving steps for each team member, but also the combined process followed by the team.
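To make the recording scheme concrete, the sketch below shows one way such 10-minute PDA records could be represented and summarized. It is a minimal illustration only: the task labels and record format are our assumptions, not the actual software used in the study.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class PdaRecord:
    minute: int      # elapsed time when the 10-minute alert fired
    student: str
    task: str        # problem solving task being addressed (labels are illustrative)
    progress: str    # "none", "satisfactory", or "very good"
    mode: str        # "team" or "individual"

def summarize(records):
    """Estimate minutes spent per task and the share of work done as a team."""
    minutes_per_task = {t: 10 * n for t, n in Counter(r.task for r in records).items()}
    team_share = sum(r.mode == "team" for r in records) / len(records)
    return minutes_per_task, team_share

# Two alerts from one hypothetical student, as an example.
log = [PdaRecord(10, "s1", "problem formulation", "none", "team"),
       PdaRecord(20, "s1", "modeling", "satisfactory", "individual")]
print(summarize(log))   # ({'problem formulation': 10, 'modeling': 10}, 0.5)
```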

Fig. 1. Examples of Team Problem Solving Analyzed using Reflection Tools.

Fig. 2. An Example of PDA Output for one Student in a Group.


Figure 3 presents a suggested summary of different team problem solving patterns gleaned from our PDA data. As shown, tasks might be divided up among individuals, or solved by the full team. The team may solve each task sequentially or iterate among tasks. The team may devote a disproportionate amount of time to a particular task (e.g., problem formulation or implementation), or divide its time equitably among tasks. PDAs allow the instructor to determine how the team is proceeding and therefore provide more informed feedback upon completion of the exercise.

3.4 Wikis

Wikis are especially useful for teams that meet virtually (i.e., students are in different locations). When using Wikis, we asked students to upload their work in progress to a common website and converse on this site via the Wiki. In this manner we were able to observe (or recreate) the student work and final report as the group progressed, while viewing their text/chat conversations related to the MEA. This enabled a determination of how the students divided up tasks and worked on various aspects of the MEA. It also enabled us to see the various solution approaches attempted, their discussions concerning these approaches, and how the team traversed the problem solving process.

3.5 Test questions

Following the submission of the MEA and the instructor's feedback to the students, follow-up exam questions were asked to ascertain the extent to which the concepts were learned. Using well-crafted questions, the instructor determines the extent to which the students have mastered the concept, and whether misconceptions remain. Note that this will not enable the instructor to determine the extent to which the MEA process was helpful, but only whether the goal was achieved.

To assess the students' overall performance on the MEA, we developed an evaluation rubric based on four of the six MEA constructs: (1) Generalizability, (2) Self Assessment/Testing, (3) Model Documentation, and (4) Effective Prototype. Supporting elements have been delineated together with expectations for each solution-related principle. For example, for Effective Prototype, we have expanded the principle to include refinement and elegance of the solution. Each dimension is graded on a five-point scale, indicating the degree to which the solution achieves or executes the principle. The scores across the four dimensions can be averaged to obtain an overall score. Specifically:

• Generalizability: Assesses the degree to which the model is a working solution for the particular problem and future similar cases. Is the model robust, and can it be easily 'handed over' to others to apply in similar situations?

• Self-Assessment/Testing: Assesses the extent to which the solution has been tested and reflects thought and procedural revision. Have nuances or special conditions in the data or problem been uncovered and accounted for in the procedure?

• Model Documentation: Evaluates the level of detail and explicitness in the written procedure. Clarity of expression, correct grammar, and ease of reading are also assessed. Have the assumptions that were made been clearly stated? Has all information specifically requested by the client been included?

• Effective Prototype: Measures the refinement and elegance of the solution procedure. Is the procedure based on thorough application of engineering concepts and principles? Have appropriate engineering ideas been used? Is the solution accurate and of high quality?

Fig. 3. MEA Solution Approaches.

Fig. 4. Assessment Rubric Screenshot.

A score of '1' on any given dimension indicates that the principle was not achieved or executed in the solution. A score of '2' indicates some, but insufficient, achievement or execution. A '3' indicates a sufficient, or minimum, level of achievement and satisfaction of the base requirements. A score of '4' indicates that the solution embodies the principle for the most part and that the solution has gone beyond the requirements; the team has achieved more than expected and has generally done a good job. In order to achieve a '5' on a given dimension, the principle must be executed in an outstanding manner as delineated in the rubric. The ethics component is scored using the Pittsburgh-Mines Ethics Assessment Rubric (P-MEAR), previously developed and validated [22]. Figure 4 shows a partial view of our rubric scoring sheet.
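A minimal sketch of the averaging just described follows; the dimension names come from the rubric above, while the dictionary format and the simple unweighted mean are our own illustrative assumptions.

```python
# Illustrative four-dimension rubric average; equal weighting is assumed.
RUBRIC_DIMENSIONS = ("Generalizability", "Self-Assessment/Testing",
                     "Model Documentation", "Effective Prototype")

def overall_score(scores: dict) -> float:
    """Average the 1-5 scores across the four rubric dimensions."""
    if set(scores) != set(RUBRIC_DIMENSIONS):
        raise ValueError("score every rubric dimension exactly once")
    if not all(1 <= s <= 5 for s in scores.values()):
        raise ValueError("each dimension is scored on a 1-5 scale")
    return sum(scores.values()) / len(scores)

print(overall_score({"Generalizability": 4, "Self-Assessment/Testing": 3,
                     "Model Documentation": 5, "Effective Prototype": 4}))  # 4.0
```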

4. USE OF MEAS AS TEACHING TOOLS

In this section, we describe the factors that were found to play an important role in implementing MEAs successfully in engineering classrooms. We discuss (1) the factors that are crucial in implementing MEAs and (2) how adding an ethical reasoning domain to an MEA can increase its effectiveness as a teaching tool.

4.1 Contingencies to Successfully Implementing MEAs as Teaching Tools

We have learned from our experiments that MEA implementation must be a carefully planned process that follows several important steps. We list these factors and their impact on MEA implementation, particular to the upper-level engineering education context, next. See Fig. 5 for a summary representation of the contingencies that impact MEA implementation success. We determined these factors by carefully reviewing the artifacts from our experiments as described above. Of particular value has been data obtained from the reflection tools and MEA reports, and from combining the individual PDA results for each team and comparing them to the first two data sets. Clearly, an important determinant of the success of an MEA implementation was whether the students used the targeted concepts appropriately when solving the MEA. We also reviewed post-experiment student and instructor feedback. We analyzed the differences between those experiments for which favorable results (improved conceptual learning) were achieved and those in which the results were less than favorable. Where a significant portion of the student teams missed the targeted concept (which happened in several experiments), or used the appropriate concept but in the wrong manner, the implementation was considered to be unsuccessful, although such instances do present the instructor with an important teaching opportunity if appropriate feedback is provided to the students in an expeditious manner.

Certainly additional factors will be added to this list, particularly as we continue to extend the MEA construct to larger, more complex problems. However, the results reported here are based on data obtained from a relatively large number of students and instructors. Further, we have used pairs of trained coders to review the data, ensuring that they obtain a high degree of consistency before finalizing their scores.

4.1.1 Embedded MEA concepts and courses

The embedded concepts are a key factor in designing an MEA. Our MEAs have targeted measures of central tendency, the central limit theorem, hypothesis testing, design of experiments, quality control, supply chain, multi-criteria decision making, and economic analysis. Clearly, the success of the MEA may vary based on the difficulty in recognizing and understanding the embedded concept. More advanced engineering concepts require students to have the proper background and to receive appropriate guidance through the implementation process.

The course in which the MEA is used is another complicating factor. We have introduced and tested MEAs in introductory and intermediate engineering statistics, supply chain, quality control, decision analysis, operations research, human factors and engineering sustainability courses. For certain courses where students typically solve textbook problems (e.g., statistics and quality control), we found very positive student reactions to the MEAs. In particular, after the Quality Control MEA implementation, a follow-up reflection survey found that 90% of the students stated that they enjoyed working on the MEA more than on the textbook examples.

4.1.2 MEA's role in conceptual understanding

We have noted that MEAs can play three different roles in learning engineering concepts: integrator, reinforcer or discoverer. That is, the MEA can be developed and (1) introduced in a manner that requires students to integrate concepts from earlier courses; (2) implemented in a manner so that a concept recently introduced will be reinforced; or (3) implemented so that the students will discover a new concept (with some instructor guidance); combinations of these three roles are also possible. We have learned that the success of a given MEA can vary based on its particular role. For example, the CNC MEA (see Appendix A) has been implemented twice in a statistics course, in both a reinforcer role (i.e., the assignment was given after the concept, hypothesis testing, was introduced in lecture) and a discoverer role (i.e., where the concept hadn't been introduced). Under this latter scenario, the effectiveness of the MEA was significantly reduced; in fact, none of the student teams correctly applied the targeted concept in their solutions! (This latter experiment also highlighted the need for the instructor to provide guided discovery so that such situations do not occur.)

Fig. 5. Contingencies on MEA Implementation.

4.1.3 Team size and number

The size of the team addressing the MEA can also have an impact on its success. We have followed Ahn's and Leavitt's suggestion of three to four person teams, and have observed no substantial difficulties [10]. In addition, the number of teams in a classroom is another factor in a successful MEA implementation. As the number of teams increases, the effort or guidance that the instructor can provide per team decreases proportionally, negatively impacting MEA success.

4.1.4 Instructor's experience (past involvement) with MEAs

The instructor's past experience and involvement with MEAs was not initially predicted to be as important a factor in MEA implementation success. However, when investigating the factors that might have impacted student learning, student feedback indicated both higher involvement and satisfaction with the activity when the instructor provided sufficient guidance. Where instructors provided insufficient feedback, further examination revealed that these instructors were not sufficiently familiar with the MEA construct and the ways it could be used in the classroom; in some cases, too much was left to a teaching assistant, who also was not sufficiently prepared to administer an MEA scenario. Thus, instructor experience is included as one of the significant contributors to MEA implementation success.

We have been able to divide the faculty who have introduced MEAs as part of our experiments into three categories: (1) familiar with the MEA construct, prior research and implementation, (2) familiar only with MEA implementations, and (3) no familiarity with MEAs.

To reiterate, MEA implementations were most successful when the instructor was both familiar with prior MEA research and had been trained on how to implement MEAs. It became apparent that proper training in how to implement an MEA was one of the most significant factors in its success. When the instructor was fully aware of the learning benefits that can result from an MEA, he/she typically put more effort into its planning and execution.

4.1.5 Instructor guidance

As noted above, even when faculty had experience with MEA implementations but did not fully appreciate their benefits, we noticed that the guidance given to students during the execution and feedback phases was at best insufficient. If the instructor had no training on MEA implementation, the MEA was much less likely to produce its targeted results. Hence, we have concluded that unless the instructor has been formally trained on the purpose, philosophy, and implementation of an MEA, the results may be substantially less than desired.

Stated another way, an important factor in MEA success is the level of guidance provided during the solution process. Our MEA implementations have been either in-class or recitation exercises or solved at home, and occasionally a combination of both. We have learned that the success of the MEA depends on the guidance given to students while they work on the MEA. In-class exercises enable the instructor to quickly identify when and where students make mistakes; the instructor can then help students reflect and redirect by clearing up misunderstandings. However, the instructor must be careful not to give too much guidance. As noted, we have seen a 'reinforcer' MEA implemented in which the instructor did not interact with the students during the solution process. As a result, all of the teams missed the key concept, receiving no direction that might have provided the necessary hints to abandon their non-productive approaches. In short, it is important that the instructor monitors each team's progress, and provides some direction or hints when the team is floundering or missing the point. This is a potential risk when giving the MEA as an out-of-class activity. As we develop more complex MEAs (exercises that may require two or three weeks to resolve), the instructor will need to periodically utilize mechanisms for monitoring the solution process, such as requiring intermediate reports on the developing models and reflection tools (discussed in section 3.1).

4.1.6 Time allotted for solution

Our experiments also varied in the time students were given to complete the exercise. Time is often a function of exercise complexity and whether or not the MEA was assigned to be solved within the class or at home. We observed that the solutions (models and documentation) for MEAs solved outside the classroom had more depth, since students had access to a wider range of information and no time restrictions.


Doing the activity in class and doing it out of class both have benefits and potential risks. In the former, students are able to obtain guidance from the instructor on how to carry out certain steps in the MEA solution (i.e., guided discovery), but may not have sufficient time to appropriately complete the assignment. In the latter case, students can be provided with as much time as they need to carry out the tasks to complete the MEA, but may not receive appropriate guidance, and hence may continue in the wrong direction. Clearly, it is important to balance the benefits of each approach. To achieve this, we suggest that MEAs start in class, with students working on the exercise for sufficient time that the instructor is comfortable that each team understands the problem and is beginning to address the concepts embedded in the MEA. After that, the teams can be allowed to continue to work on the MEA outside of the classroom, giving them a chance to better deal with the broader subjects covered in the MEA.

4.1.7 Feedback

Hattie and Timperley [23] define feedback as 'the information provided by an agent (e.g., teacher, peer, book, parent, self, experience) regarding aspects of one's performance or understanding'. Feedback builds students' self-efficacy as learners and writers [24], and is essential for assessment processes since it (1) identifies strengths, (2) suggests strategies for improvement, and (3) reflects a view of the teacher-student relationship [25]. Moore and Smith [26] suggest that feedback is most useful when the instructor enters into a dialogue with students about the ideas they have expressed.

Providing feedback to students better ensures that they are likely to reach the desired level of expected understanding when the activity is completed. We have provided feedback to students in oral and written forms. Feedback that points out common mistakes and expected solution methods provides students with a perspective for solving similar problems. The effectiveness (or ineffectiveness) of feedback was clearly seen in the students' reflection surveys. Where feedback was limited or not provided, concepts were not reinforced, and in too many instances students missed fully grasping and using the key concept.

One way to provide feedback is to use two-part MEAs. The first part (typically done individually) serves to get the student thinking about the concept embedded in the MEA and the situation to be presented in the second part. The second part, which is the actual MEA, is then addressed by the full team. However, the first part needs to ensure that the students confront the concept. For example, in the 'Tire Reliability' MEA, students were asked questions that required them to conceptualize reliability, but were not asked specifically to define it. As a result, in the second part of the exercise, the student teams, having to use the concept of reliability for the first time, were not clear as to its meaning.

4.2 Extending the MEA concept to include the ethical reasoning domain

The use of context-based case studies provides ideal subject material for the development of modeling exercises. This is particularly true in the case of ethics-based models, which often require the synthesis of such intangible concepts as environmental justice, international policy, and resource conservation during solution. By adding an ethical reasoning domain, we created ethical MEAs, or E-MEAs [9]. Our objective was to encourage students to consider how the engineering decisions that they make potentially influence the public, the environment, other stakeholders, their firm and/or themselves. In addition, we were able to better understand the various strategies student teams use to resolve complex ethical dilemmas. Table 3 provides a description of the ethical issues embedded to date in our MEAs.

Table 3. Illustrative Ethical Issues Embedded in the MEAs Developed

SUV Rollover: Students were asked to address the sensitivity of the results they have just found. Specifically, what should they (the testing company) do with the results (which the carrier has asked them to keep confidential)? That is, they must consider issues related to non-disclosure versus the safety of the public; at what point does public welfare trump non-disclosure?

CNC Machine Purchase: In the second part of the MEA, students were asked to re-do their analysis in order to now show that the replacement is, in fact, better (assuming the team originally concluded that it was not), or provide more specific details about how it proved the replacement is better. That is, students were given the dilemma of whether or not to fudge data to please the boss.

Ethanol: Students were asked to address ethical issues of ethanol use and production.

Hazardous Materials: This E-MEA involves a decision with ethical implications concerning possible investment in countermeasures for reducing or preventing hazardous materials spills. The team must address unknown material costs and the values attached to accidents in which injuries and fatalities could occur.

Pilotless Airplane: This MEA involves a NASA-sponsored student competition in which the entrants must design and test a 'pilotless plane.' The team must propose a method for dealing with potential rule violations related to using last year's designs; i.e., does the design have to be original? This MEA was used in an introductory engineering statistics course.

Trees: Part 1 of this MEA concerns possible removal of old growth trees along a road through a public forest to reduce traffic accidents. Part 2 ups the stakes by making the trees redwoods.


An E-MEA should be created for a specific purpose, typically as a learning exercise to introduce or reinforce one or more concepts. Steps to develop an E-MEA include determining:

• The conceptual issue(s) requiring engineering ethical reasoning that will be presented,
• Other fundamental concepts required for resolution, and
• How the E-MEA will be used (e.g., within a lecture, a recitation exercise, or in a workshop).

Once these steps are decided, a storyline must be developed that describes a realistic situation in which the concept(s) will be embedded. We have developed a number of our storylines from incidents in the news, as well as from personal experience and the experience of colleagues in industry. After the storyline has been developed, the ethical dilemma can then be introduced. The dilemma may come from personal experience or be adapted from a text or case. It should not be a 'black or white' issue, but rather lie in a 'gray area.' Its resolution might require a creative 'win-win' approach. It should be written in a way that requires the student to carefully read the case in order to recognize it. We have learned to frame the case first, and then have the students address the dilemma once they have obtained their results.

Although we believe that adding an ethical dilemma into MEAs is a novel contribution to engineering education research, we list it here as a factor that elevates the effectiveness of teaching in the classroom, and refer interested readers to Shuman, Besterfield-Sacre and Yildirim [9], where we provide more detail.

5. MEAs AS MEDIA FOR ASSESSING ASPECTS OF LEARNING

We next illustrate how, using the assessment tools given in section 3, the instructor can learn about the student problem solving process. Figure 6 gives the PDA results for the three members of a team addressing the Defibrillator Lead MEA. This MEA, which was based on a series of incidents with implantable defibrillator lead malfunctions, required student teams to estimate how many samples they would need to collect in order to detect whether a batch of leads did not meet specifications. Since the calculated sample size was relatively small, the team would then have to determine if the central limit theorem would hold so that they could assume that the mean of the sampling distribution was approximately normally distributed. This MEA was designed to reinforce the primary concept (the central limit theorem) while also requiring the students to discover how the sample size could be estimated in order to achieve a desired confidence limit. Note that the triangle indicates that the team was working together while the square indicates that the team member was working individually. Again, the size of the symbol indicates the level of engagement, from not very engaged, to satisfied with their effort, to very good progress.

Figure 6 suggests that students two and three were working closely together for over three hours (time steps are every 10 minutes), although for much of this time very little progress occurred, including the first hour, which was devoted to problem formulation. Student one appears to have only participated for this first hour and then dropped out, leaving his two partners to complete the assignment. We have taken the students' individual reflection statements and combined them to obtain a fuller picture of the solution process:

The problem solving strategy our team used at the beginning was to derive our information from the text. We started by stating what we knew, but we were caught up in the fact that the distribution of the samples was uniform. However, we reached a point where the text did not help us, and on the contrary confused us. At 30 minutes we sorted out our confusion. At 50 minutes we were even more confident and were able to proceed more quickly. We were trying to figure out how to create confidence intervals on uniform distributions when we decided to research whether the means of samples [were] taken from a uniform distribution. When we found that this was true, we were able to use the information and statistics to construct intervals and proceed. For the whole problem, we continually questioned what we knew to make sure our reasoning was sound.

Fig. 6. Combined PDA and Reflection Data—Defibrillator Lead Example.
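For readers unfamiliar with the statistics embedded in this MEA, the standard sizing argument the team needed runs roughly as follows. This is a sketch only: the margin of error E, the confidence level, and the specification interval [a, b] are placeholders, not values from the MEA statement.

```latex
% Illustrative only: E, alpha, a and b are placeholders.
\[
  n \;\ge\; \left(\frac{z_{\alpha/2}\,\sigma}{E}\right)^{2},
  \qquad
  \sigma = \frac{b-a}{\sqrt{12}} \quad \text{for measurements uniform on } [a,b].
\]
% By the central limit theorem the sample mean is approximately
% N(mu, sigma^2/n) even though the individual measurements are uniform,
% which is the property the team had to verify before building a
% confidence interval from a small sample.
```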

Combining Fig. 6 with the combined reflection statement provides the instructor with a clear picture: a team that struggled for the entire in-class portion of the MEA. Two of the team members then worked together outside of class and eventually came up with a solution. They were able to develop the confidence interval (needed to estimate sample size), but missed the key concept concerning the central limit theorem. This strongly suggests that the instructor should have reviewed the MEA in the context of the central limit theorem during the next class period.

A second example is given by Fig. 7. Here the students were solving the CNC Machine MEA. This was designed to reinforce the concept of hypothesis testing (two samples). It also contained an ethical dilemma: the plant manager was 'highly suggesting' that the teams select one particular machine and, if necessary, manipulate the data to prove that point. The students' combined reflection statement provides insight into how they approached and resolved this MEA:

We began by trying to standardize the data and use the z table. We started the project by attempting to perform a simple z-standard test to see if the data was acceptable. At 15 minutes we decided to use a confidence interval instead since the z-standard test didn't really answer the question. At 20 minutes we realized that was incorrect; we almost started over with the process. At 45 minutes we thought of setting up confidence intervals and we took this thought further and came closer to a final answer before realizing we were incorrect. At 50 minutes we again saw that the confidence interval wasn't the best way to answer the question and decided hypothesis testing was the best solution strategy to follow. After more thinking and brainstorming we figured out that we should use hypothesis testing and we finished out the problem that way. These were our three main strategies.

It is interesting to note that while the team members felt they were working together, student two spent much of the time interpreting the results (analyzing the data) and student three was not fully engaged until the end of the project, and even spent part of the class time in a non-productive mode. The combined reflection does show that the team undertook a process of reviewing and revising their progress, as is required by a well-designed MEA.
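For reference, the two-sample test this MEA targets can be carried out in a few lines. The measurements below are fabricated for illustration; they are not the MEA's data set.

```python
# Illustrative two-sample hypothesis test of the kind the CNC Machine MEA
# targets; the part dimensions below are made-up numbers, not the MEA data.
from scipy import stats

old_machine = [10.02, 9.98, 10.05, 9.97, 10.01, 10.04]
new_machine = [10.00, 9.99, 10.01, 10.00, 9.98, 10.02]

# H0: both machines produce the same mean dimension; H1: the means differ.
t_stat, p_value = stats.ttest_ind(old_machine, new_machine, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Reject H0 at the 5% level: the machines differ.")
else:
    print("Fail to reject H0: no significant difference detected.")
```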

6. SUMMARY AND CONCLUSIONS FOR ENGINEERING EDUCATORS

We have addressed the educational benefits of MEAs relative to learning important engineering concepts and to evaluating students' problem solving and engineering ethical reasoning abilities. Further, we have provided several tools and methods by which MEAs and problem solving can be assessed. We have also discussed the factors for successful MEA implementation in the classroom.

Our experience points to several factors that impact the success of MEA implementation. An important factor influencing MEA success is the guidance from the instructor throughout MEA implementation. Corrective guidance can make sure that students are properly focused, addressing the correct problem and targeted concept, especially where the solution time is constrained. Such guidance is positively correlated with the instructor's training on MEA implementation. If the instructor appreciates the potential benefits that the students can receive from an MEA, he/she will more readily make the extra effort to properly guide students and provide necessary feedback; otherwise the positive effects of the MEA are limited.

Fig. 7. Combined PDA and Reflection Data—CNC Machine Example.

Feedback after completion of the MEA plays an important role in students' understanding of key concepts. Such feedback can reinforce student understanding as well as correct misconceptions. Dividing MEAs into several parts and providing feedback at points during the solution process also ensures that misconceptions are identified and corrected early, allowing student teams to redirect to achieve the desired result.

Our research and experience strongly suggest that MEAs can help educators assess their students' problem solving process through the use of PDAs, Wikis and reflection tools, as well as the actual student reports and well-designed examination questions. Using such tools allows educators to gain insight into the team's group processes, problem solving processes, degree of involvement, and process of iterating through the exercise. Further, such information can provide engineering educators with information about the quality of student learning.

This study contributes to the engineering and engineering education fields in several ways. As engineering schools focus on providing realistic learning environments for their students, improved learning and assessment tools are needed. We propose that MEAs can improve student learning. Further, we have suggested how educators might use these tools to better understand team processes.

One important issue needs to be resolved: the improvement of student learning. Specifically, although we have documented some means by which we can assess problem solving and conceptual understanding, research is needed to understand how MEAs might be used to improve students' problem solving abilities and conceptual understanding of critical engineering concepts. In addition, the effectiveness of MEAs should be compared to that of other learning tools and methodologies. (For example, we view MEAs as a complement to problem-based learning and believe that the potential benefits are comparable.) As our work with MEAs continues, we intend to address these outstanding questions.

Acknowledgements—This research is supported in part by the National Science Foundation through DUE 071780: 'Collaborative Research: Improving Engineering Students' Learning Strategies through Models and Modeling.' We acknowledge our partners in this four-year research project: Eric Hamilton (Pepperdine University, formerly at the USAFA); John Christ, U.S. Air Force Academy; Ronald L. Miller and Barbara Olds, Colorado School of Mines; Tamara Moore, University of Minnesota; Brian Self, California Polytechnic State University—San Luis Obispo; and Heidi Diefes-Dux, Purdue University.

REFERENCES

1. R. Lesh, M. Hoover, B. Hole, A. Kelly and T. Post, Principles for developing thought-revealing activities for students and teachers. In: A. Kelly and R. Lesh (eds), Handbook of Research Design in Mathematics and Science Education, Lawrence Erlbaum Associates, Mahwah, NJ, 2000, pp. 591–646.

2. H. Diefes-Dux, T. Moore, J. Zawojewski, P. K. Imbrie and D. Follman, A framework for posing open-ended engineering problems: model-eliciting activities. Frontiers in Education Conference, Savannah, Georgia, October 2004, pp. 20–23.

3. H. Diefes-Dux, D. Follman, P. K. Imbrie, J. Zawojewski, B. Capobianco and M. Hjalmarson, Model eliciting activities: an in-class approach to improving interest and persistence of women in engineering. Proceedings of the American Society for Engineering Education Annual Conference & Exposition, Salt Lake City, Utah, June 2004.

4. M. Hjalmarson, Engineering as bridge for undergraduate and K-12 curriculum. 9th Annual Conference on Research in Undergraduate Mathematics Education, Piscataway, NJ, 2006.

5. R. Lesh, The development of representational abilities in middle school mathematics: the development of students' representations during model eliciting activities. In: I. E. Sigel (ed.), Representations and Student Learning, Lawrence Erlbaum, Mahwah, NJ, 1998.

6. R. Lesh, Research design in mathematics education: focusing on design experiments. In: L. English (ed.), International Handbook of Research Design in Mathematics Education, Lawrence Erlbaum, Mahwah, NJ, 2002.

7. R. Lesh and H. Doerr, Foundations of a models and modelling perspective on mathematics teaching, learning and problem solving. In: R. Lesh and H. Doerr (eds), Beyond Constructivism: Models and Modeling Perspectives on Mathematics Problem Solving, Learning and Teaching, Lawrence Erlbaum, Mahwah, NJ, 2003.

8. R. Lesh and A. Kelly, Teachers' evolving conceptions of one-to-one tutoring: a three-tiered teaching experiment. Journal for Research in Mathematics Education, 28(4), 1997, pp. 398–430.

9. L. Shuman, M. Besterfield-Sacre and T. P. Yildirim, E-MEAs: introducing an ethical component to model eliciting activities. Proceedings of the ASEE Annual Conference, June 2009.

10. C. Ahn and D. Leavitt, Implementation strategies for Model Eliciting Activities: a teacher's guide, 2009, http://site.educ.indiana.edu/Portals/161/Public/Ahn%20&%20Leavitt.pdf, accessed 4/27/2009.

11. S. A. Chamberlin and S. M. Moon, Model-Eliciting Activities as a tool to develop and identify creatively gifted mathematicians. Journal of Secondary Gifted Education, 17(1), 2005, pp. 37–47.

12. T. Moore and H. Diefes-Dux, Developing model-eliciting activities for undergraduate students based on advanced engineering content. 34th ASEE/IEEE Frontiers in Education Conference, Savannah, Georgia, October 2004.

13. J. E. Hallagan, The case of Bruce: a teacher's model of his students' algebraic thinking about equivalent expressions. Mathematics Education Research Journal, 18(1), 2006, pp. 103–123.


14. S. E. Berry, An Investigation of a Team of Teachers' Shared Interpretations of the Teacher's Role in Supporting and Enhancing Group Functioning. University of Indiana electronic dissertation, 2007.

15. M. A. Hjalmarson and H. Diefes-Dux, Teacher as designer: a framework for teacher analysis of mathematical model-eliciting activities. Interdisciplinary Journal of Problem-based Learning, 2(1), 2008, Article 5.

16. B. P. Self, R. L. Miller, A. Kean, T. J. Moore, T. Ogletree and F. Schreiber, Important student misconceptions in mechanics and thermal science: identification using model-eliciting activities. 38th Annual Frontiers in Education Conference, Saratoga Springs, NY, October 2008.

17. A. Teran, Use of Model Eliciting Activities: lessons learned. Industrial Engineering Research Conference, Miami, FL, May 30–June 3, 2009.

18. T. Moore, H. Diefes-Dux and P. K. Imbrie, How team effectiveness impacts the quality of solutions to open-ended problems. 1st International Conference on Research in Engineering Education, Honolulu, HI, June 2007.

19. M. Dark and C. Manigault, Information assurance model-eliciting activities for diverse learners. CERIAS Tech Report 2007-92, Center for Education and Research in Information Assurance and Security, Purdue University, West Lafayette, IN, 2007.

20. J. G. Greeno, The structure of memory and the process of solving problems. In: R. Solso (ed.), Contemporary Issues in Cognitive Psychology: The Loyola Symposium, Winston, Washington, DC, 1973.

21. E. Hamilton, R. Lesh, F. Lester and C. Yoon, The use of reflection tools to build personal models of problem-solving. In: R. Lesh, E. Hamilton and J. Kaput (eds), Models & Modeling as Foundations for the Future in Mathematics Education, Lawrence Erlbaum Associates, Mahwah, NJ (in press).

22. L. J. Shuman, B. Olds, M. Besterfield-Sacre, H. Wolfe, M. Sindelar, R. Miller and R. Pinkus, Using rubrics to assess students' ability to resolve ethical dilemmas. Proceedings of the 2005 Industrial Engineering Research Conference, Atlanta, GA, May 2005.

23. J. Hattie and H. Timperley, The power of feedback. Review of Educational Research, 77(1), 2007, pp. 81–112.

24. T. J. Crooks, The impact of classroom evaluation practices on students. Review of Educational Research, 58, 1988, pp. 438–481.

25. J. Orrell, Feedback on learning achievement: rhetoric and reality. Teaching in Higher Education, 11(4), 2006, pp. 441–456.

26. B. S. Moore and R. Smith, Curriculum Leadership and Assessment and Reporting, South Australian College of Advanced Education, Adelaide, 1989.

APPENDIX A. University of Pittsburgh MEAs

Each MEA is listed below with the skills it targets and a description of the problem scenario.

Probability, Statistics and Data Analysis

SUV Rollover
Skills targeted: an experimental design with a cost constraint; statistical analysis using ANOVA.
Description: A major insurance carrier has noticed a relatively large number of claims involving SUVs that have rolled over after the tire tread has separated. The carrier contacts an engineering testing firm to design a series of potentially destructive tests on combinations of vehicles and tires to identify a potential problem with either a vehicle or tire model under various environmental conditions. Students are given costs for conducting the experiment and a budget, and are then asked to provide a design for the experiment, i.e., identify each combination of vehicle and tire to test. A simulator provides each team with a unique set of test results based on their design so that they can conduct a thorough statistical analysis. In their final report the team must address what they (i.e., the testing company) should do with the results (which the carrier has asked to keep confidential). They must consider at what point public welfare trumps non-disclosure.
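As an illustration of the kind of analysis this MEA elicits, the following minimal sketch runs a one-way ANOVA on hypothetical rollover-test scores for three tire models. The tire groups, sample sizes and numbers are invented for illustration; they are not the simulator's actual output, and a real solution would also document the budget-constrained test design.

# Illustrative sketch only: hypothetical rollover-resistance test scores
# (higher is better) for three tire models; not the MEA simulator's data.
from scipy import stats

tire_a = [7.1, 6.8, 7.4, 7.0, 6.9]
tire_b = [6.2, 6.5, 6.1, 6.4, 6.3]
tire_c = [7.0, 7.2, 6.9, 7.1, 7.3]

# One-way ANOVA: do the mean scores differ across tire models?
f_stat, p_value = stats.f_oneway(tire_a, tire_b, tire_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# A small p-value (e.g., < 0.05) suggests at least one tire model's mean
# differs; pairwise follow-up comparisons would identify which one.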

CNC Machine Purchase
Skills targeted: hypothesis testing to determine whether a CNC machine investment is justifiable; concepts of mean and variance; economic analysis.
Description: In Part 1, a manufacturing plant has an opportunity to replace an older CNC machine with a newer model. The plant manager views this as a significant opportunity, especially since the purchase would not come from his budget. He requests that the team prove that the new machine will outperform the current one as measured by unit production time, cost, and quality, in order to build the best case for purchase. In Part 2, the team is asked to redo its analysis in order to show that the replacement is, in fact, better (if the team had concluded that it was not), or to provide more specific details proving that the replacement is better.
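A minimal sketch of the hypothesis test at the core of this MEA is shown below, using a one-sided Welch t-test on hypothetical unit production times. The data values are invented, and the `alternative` argument assumes SciPy 1.6 or later; the MEA itself supplies its own machine data and also requires cost, quality and economic analyses.

# Illustrative sketch only: hypothetical unit production times (minutes)
# for the current and candidate CNC machines; not the MEA's data set.
from scipy import stats

old_machine = [12.4, 12.9, 13.1, 12.7, 12.8, 13.0, 12.6, 12.9]
new_machine = [12.1, 12.3, 11.9, 12.4, 12.0, 12.2, 12.5, 12.1]

# One-sided Welch t-test: H0: mean_new >= mean_old vs. H1: mean_new < mean_old
# (the `alternative` keyword requires SciPy >= 1.6).
t_stat, p_value = stats.ttest_ind(new_machine, old_machine,
                                  equal_var=False, alternative='less')
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Rejecting H0 supports the claim that the new machine is faster; a full
# analysis would repeat this for cost and quality and add the economic case.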

Hazardous Materials
Skills targeted: categorical data analysis.
Description: The team is asked to create a procedure for deciding whether a small, rural Pennsylvania county, with a major highway running through it and faced with a series of hazardous material spills, should invest $2 million in countermeasures that should lead to a reduction in such accidents. The team must address unknown material costs and the values attached to accidents in which injuries and fatalities could occur. The students are given a large accident database, although some data are missing.
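One way to illustrate the categorical analysis this MEA calls for is a chi-square test of independence on accident outcomes. The material classes and counts below are invented placeholders; the MEA's own accident database is far larger and contains missing values the teams must handle.

# Illustrative sketch only: hypothetical counts of hazmat-spill outcomes
# by material class; not the MEA's accident database.
from scipy import stats

#                 no injury, injury, fatality
counts = [[40, 12, 3],    # flammable liquids
          [25,  5, 1],    # corrosives
          [15, 10, 4]]    # compressed gases

# Chi-square test of independence: does outcome severity depend on material class?
chi2, p_value, dof, expected = stats.chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")

# A significant association would argue for targeting countermeasures at the
# higher-risk material classes when weighing the $2 million investment.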



Defibrillator Lead
Skills targeted: Central Limit Theorem.
Description: This MEA, based on a series of incidents involving implantable defibrillator lead malfunctions, requires student teams to estimate how many samples they would need to collect in order to detect whether a batch of leads does not meet specifications. Since the calculated sample size is relatively small, the team must then determine whether the central limit theorem holds so that they can assume the mean of the sampling distribution is approximately normally distributed.
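The sample-size calculation at the heart of this MEA can be sketched with the standard formula n = (z * sigma / E)^2 for estimating a mean. The standard deviation, margin of error and confidence level below are invented for illustration, not the MEA's specification values.

# Illustrative sketch only: textbook sample-size formula for estimating a mean,
# with placeholder values for the lead-parameter standard deviation and margin.
from math import ceil
from scipy import stats

sigma = 0.8        # assumed standard deviation of the measured lead parameter
margin = 0.5       # desired margin of error
confidence = 0.95

z = stats.norm.ppf(1 - (1 - confidence) / 2)   # two-sided critical value
n = ceil((z * sigma / margin) ** 2)
print(f"required sample size: n = {n}")

# If n comes out well below roughly 30, the normality of the sampling
# distribution cannot be taken for granted from the central limit theorem
# alone, which is exactly the judgment the MEA asks the teams to defend.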

Decision Analysis

Dam Construction
Skills targeted: multi-criteria decision making.
Description: A dam is to be constructed in Southeastern Anatolia. The Turkish government, for economic reasons, must reduce the dam's budget. Alternatives include making the dam less safe by decreasing its height, eliminating some material, or lengthening the project. Ethical issues include the dam's impact on its neighbors, harm to the local population, and the flooding of a historical region.
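A simple weighted-sum scoring model is one way to illustrate the multi-criteria decision making this MEA targets. The criteria, weights, alternatives and scores below are invented for illustration and are not part of the MEA materials.

# Illustrative sketch only: weighted-sum scoring of hypothetical
# budget-reduction alternatives; criteria, weights and scores are invented.
weights = {"cost_savings": 0.3, "safety": 0.4, "social_impact": 0.2, "schedule": 0.1}

# Scores on a 1-10 scale (10 = best) for each alternative on each criterion.
alternatives = {
    "reduce height":    {"cost_savings": 8, "safety": 4, "social_impact": 5, "schedule": 7},
    "cheaper material": {"cost_savings": 6, "safety": 5, "social_impact": 6, "schedule": 8},
    "lengthen project": {"cost_savings": 4, "safety": 8, "social_impact": 4, "schedule": 3},
}

for name, scores in alternatives.items():
    total = sum(weights[c] * scores[c] for c in weights)
    print(f"{name}: weighted score = {total:.2f}")

# Ethical considerations (harm to the local population, loss of a historical
# region) can enter either as criteria or as constraints that rule options out.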

Supply Chain

Ethanol Production
Skills targeted: determining whether a 'green,' socially-conscious agricultural company should become an ethanol producer or remain solely in grain production for food and livestock.
Description: In Part 1, the team is asked to create a procedure for determining whether a 'green,' socially-conscious Midwest agricultural company should become an ethanol producer or remain solely in grain production for food and livestock. The team also must evaluate various sites for an ethanol production facility, which could be fueled by any one of several feedstocks. In Part 2, the company has decided to move forward and locate its ethanol production facility in Ames, Iowa. The producer will need one or more distribution points for its ethanol. Should it pursue a centralized or decentralized distribution scheme, given a set of potential distribution center locations? The team must also address issues involving ethanol use and production.
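The centralized-versus-decentralized question in Part 2 can be illustrated with a crude transport-cost comparison. The markets, demands, distances and rates below are all invented, and the decentralized model (bulk shipment to a nearby depot, then local delivery) is deliberately simplified; the MEA supplies its own candidate locations and data.

# Illustrative sketch only: comparing total shipping cost for a centralized
# scheme (ship from Ames to every market) versus a simplified decentralized
# scheme (bulk to a regional depot, then local delivery); figures are invented.
rate = 0.12  # $ per ton-mile for direct/local truck delivery

# market: (annual demand in tons, miles from Ames)
markets = {"Des Moines": (5000, 35), "Omaha": (4000, 150), "Chicago": (9000, 330)}
centralized = sum(d * miles * rate for d, miles in markets.values())

# Decentralized: bulk shipment Ames -> depot at a lower rate, then short local legs.
bulk_rate = 0.08
depots = {"Des Moines": (5000, 35, 10), "Omaha": (4000, 150, 15), "Chicago": (9000, 330, 20)}
decentralized = sum(d * miles * bulk_rate + d * local * rate
                    for d, miles, local in depots.values())

print(f"centralized:   ${centralized:,.0f}")
print(f"decentralized: ${decentralized:,.0f}")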

Engineering Ethics

Trees
Skills targeted: recognizing and resolving an ethical dilemma; weighing the reduction of auto accidents against preserving old-growth trees.
Description: Part 1 concerns the possible removal of old-growth trees along a road through a public forest. There has been a series of accidents, although many may be due to excessive speed. The county's department of transportation has decided to remove the trees, but a citizen environmental group is protesting. The team is asked to resolve the dispute. Part 2 involves a similar scenario reset in a California state park that contains redwood trees. It is based on a case initially developed by Harris, Pritchard and Rabins.

Global Decision Making

Outsourcing Gown Manufacturing
Skills targeted: deciding whether or not to outsource; deciding where to outsource.
Description: A company is planning to outsource its gown manufacturing to one of three countries and will also sell to that country. Students must determine adjustments to the gowns according to the anthropometry measures of females in the selected country. Issues include the expected use of child labor and preventing the selected manufacturer from selling the company's designs to competitors.
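The anthropometric adjustment can be illustrated by sizing a gown to span a percentile range of female stature in the candidate country, assuming stature is roughly normally distributed. The mean and standard deviation below are placeholders, not published anthropometric data, and the MEA provides its own measures.

# Illustrative sketch only: design range covering the 5th-95th percentile of
# female stature, assuming normality; the mean and SD are placeholders.
from scipy import stats

mean_stature_cm = 158.0   # assumed mean female stature for the candidate country
sd_stature_cm = 6.0       # assumed standard deviation

p5 = stats.norm.ppf(0.05, loc=mean_stature_cm, scale=sd_stature_cm)
p95 = stats.norm.ppf(0.95, loc=mean_stature_cm, scale=sd_stature_cm)
print(f"design range for stature: {p5:.1f} cm to {p95:.1f} cm")

# Gown lengths and grading rules would then be adjusted so that the size run
# spans this range for the selected market.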

Quality Control

Quality Process Control
Skills targeted: understanding how to dynamically follow a quality control process; obtaining a manufacturing process that is capable.
Description: A car parts manufacturer is about to sign a contract with a Japanese car maker well known for quality. First, the parts manufacturer needs to make sure that it can produce parts within the desired quality level. The manufacturer asks a consulting firm to provide a report documenting that the parts are of the desired quality. In Part 1, students are introduced to the problem and requirements; they are given data to assess, using quality control charts, whether the manufactured parts are of the required quality. In Part 2, students are provided with longitudinal data showing changes in the process over time. In Part 3, students are asked to provide an overall report about whether they think the process is indeed in control and how to monitor it in the future. They must link their recommendations to quality control procedures.
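The control-chart and capability calculations this MEA exercises can be sketched as follows. The subgroup measurements and specification limits are invented; the chart constants for subgroups of size 5 (A2, D3, D4, d2) are the standard published values.

# Illustrative sketch only: X-bar and R control limits and a capability index
# for hypothetical part-diameter subgroups of size 5; specs are invented.
import statistics

subgroups = [
    [10.02, 10.01,  9.99, 10.00, 10.03],
    [ 9.98, 10.00, 10.01,  9.99, 10.02],
    [10.01, 10.03, 10.00,  9.98, 10.00],
    [10.00,  9.99, 10.02, 10.01,  9.97],
]

xbars = [statistics.mean(s) for s in subgroups]
ranges = [max(s) - min(s) for s in subgroups]
xbar_bar, r_bar = statistics.mean(xbars), statistics.mean(ranges)

A2, D3, D4, d2 = 0.577, 0.0, 2.114, 2.326   # standard constants for n = 5
print(f"X-bar chart: CL={xbar_bar:.3f}, UCL={xbar_bar + A2*r_bar:.3f}, LCL={xbar_bar - A2*r_bar:.3f}")
print(f"R chart:     CL={r_bar:.3f}, UCL={D4*r_bar:.3f}, LCL={D3*r_bar:.3f}")

# Process capability against assumed specs of 10.00 +/- 0.10:
usl, lsl = 10.10, 9.90
sigma_hat = r_bar / d2
cpk = min(usl - xbar_bar, xbar_bar - lsl) / (3 * sigma_hat)
print(f"Cpk = {cpk:.2f}")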

Sustainability

Windmills
Skills targeted: deciding where to build a windmill farm based on long-term planning; forecasting.
Description: A company is considering a windmill farm investment in one of several regions. The team must pick the most economical location, considering long-term demand for electricity and price estimates. In addition, the team must consider locating the farm in the ocean versus on land.
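A simple net-present-value comparison illustrates the long-term economic reasoning this MEA targets. The two candidate sites, their capital costs, energy yields, prices, the escalation rate and the discount rate are all invented for illustration, not values from the MEA.

# Illustrative sketch only: NPV comparison of hypothetical wind farm sites
# over a 20-year horizon; all figures are invented.
sites = {
    # name: (capital cost $, annual energy MWh, annual O&M $)
    "onshore plains": (95e6, 260_000, 3.0e6),
    "offshore":       (160e6, 340_000, 6.5e6),
}
price0, price_growth = 55.0, 0.02   # $/MWh and assumed annual escalation
discount, years = 0.07, 20

for name, (capex, energy, om) in sites.items():
    npv = -capex
    for t in range(1, years + 1):
        revenue = energy * price0 * (1 + price_growth) ** t
        npv += (revenue - om) / (1 + discount) ** t
    print(f"{name}: NPV = ${npv / 1e6:,.1f} million")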


APPENDIX B. Example Reflection Sheet

Name____________________________

Reflection Questions

Please answer all questions as best you can and turn this in to the Digital Dropbox on your Courseweb. This is to be completed individually by each person.

Due: Friday, October 24 by 5:00 PM.

1. Draw a simple graph with time on the x-axis and percent progress on the y-axis. Mark two points on the graph (early and later) where significant changes in your thinking occurred. See below for an example.

2. With respect to your graph, describe the problem solving strategy your team used at the start of the assignment and then how it changed at each of the two points. Were there any other strategy changes?

3. For each of the two critical transition points that you marked in your graph, did you change: (a) from being bored to being highly engaged? (b) from feeling lost to feeling in control of the situation? (c) from feeling frustrated or overwhelmed to feeling like things are going smoothly? (d) from being distracted by things unrelated to the problem to feeling well focused?

4. If any of the preceding changes occurred, did your ways of thinking about the problem also change? Did you recognize the importance of some new relationships, or patterns, or mathematical models?

5. What concepts from science, math or engineering did you use in order to obtain the solution? Do you feel that this exercise enabled you to better understand any of these concepts? (Explain.)

6. What alternatives did you discuss in resolving the issue: 'Please do not share these results with anyone else either inside or outside of SAFETY+.'?

Tuba Pinar Yildirim is a dual doctoral candidate in the Industrial Engineering and Marketing Departments at the Swanson School of Engineering and the Katz Graduate School of Business at the University of Pittsburgh. She received her MS degree in Industrial Engineering from the University of Pittsburgh, and her BS degrees in Industrial and Mechanical Engineering from Middle East Technical University in Turkey. Ms. Yildirim is interested in the assessment of learning improvement and problem solving patterns in Model-Eliciting Activities in the engineering education field. Her other research focuses on game-theoretic and analytical models.

Larry J. Shuman is Senior Associate Dean for Academics and Professor of Industrial Engineering at the University of Pittsburgh. His research focuses on improving the engineering educational experience, with an emphasis on the assessment of design and problem solving and the study of the ethical behavior of engineers and engineering managers; he is also studying responses to natural and man-made disasters. A former senior editor of the Journal of Engineering Education, Dr. Shuman is the founding editor of Advances in Engineering Education. He received his Ph.D. in Operations Research from The Johns Hopkins University and his BSEE from the University of Cincinnati.

Mary Besterfield-Sacre is an Associate Professor and Fulton C. Noss Faculty Fellow in Industrial Engineering at the University of Pittsburgh. Her principal research interests are in engineering education evaluation methodologies, planning and modeling the K-12 educational system, and empirical and cost modeling applications for quality improvement in manufacturing and service organizations. She received her B.S. in Engineering Management from the University of Missouri-Rolla, her M.S. in Industrial Engineering from Purdue University, and her Ph.D. in Industrial Engineering from the University of Pittsburgh. She is an associate editor for the Journal of Engineering Education.
