A Comprehensive ABET-focused Assessment Plan
Designed to Involve All Program Faculty
Olga Pierrakos and Heather Watson
Department of Engineering and Center for Innovation in Engineering Education
James Madison University
Harrisonburg, VA, USA
Abstract - In this paper, we present a comprehensive and
innovative assessment plan and continuous improvement process
used by one of the newest engineering programs in the United
States. The program was developed from the ground up to have a
strong culture of assessment in preparation for ABET. In
developing the assessment plan and continuous improvement
process, one design requirement was that the assessment plan
involve all faculty in the program in order to establish a strong
assessment culture. The assessment plan includes both direct and
indirect assessment measures, as well as quantitative and
qualitative evaluations of student outcome attainments. The
assessment plan targets not only program-level continuous
improvement, but also course-level continuous improvement.
Course-level continuous improvement involves Course
Evaluations and Course Assessment and Continuous Improvement
(CACI) Reports, which are prepared by the faculty and serve to
document direct assessments of course outcomes and student
outcomes. Program-level continuous improvement involves
evaluation of the collection of CACI Reports that feed into the
Student Outcome Summary Reports (SOSR), which are annually
prepared by the Assessment Committee members. Methods
developed as part of our assessment plan are generalizable and
included in the paper.
Keywords - ABET; Assessment; Continuous Improvement; Student Outcomes; Program Educational Objectives; Engineering
I. INTRODUCTION
"Accreditation may be defmed as a process, based on
professional judgment, for evaluating whether or not an educational institution or program meets specified standards of
educational quality" [1]. Continuous improvement is an
important concept in education because it defines the framework for assessment and evaluation, which is required
by accrediting agencies. There is one fundamental question driving the continuous improvement process. Can the program
demonstrate the degree to which students have attained the
anticipated student outcomes or program outcomes? The
assessment evidence of student learning is used to identify
student strengths and weaknesses related to each of the student
outcomes for the purpose of making decisions about how to improve the program teaching/learning processes. Assessment processes that focus on the continuous improvement of the
program produce results that can be systematically used by
faculty and administration in meaningful ways. Although there
are several challenges to developing an assessment plan and a
continuous improvement process, one major challenge is for
the assessment plan and continuous improvement process to
be owned by the entire faculty body and not just a select few
faculty who lead the assessment efforts [2]. Certain
engineering programs and departments have implemented
approaches to streamline the assessment process so that faculty are involved without a burdensome effect on their time [3-5]. One of the underlying factors in the development of a
sustainable assessment plan is active engagement of faculty in
the steps of the process so that this sense of ownership is
fostered. Faculty engagement often occurs at the course-level
using course assessment reports [5-7]. Some programs have
also developed online tools and automated processes to carry out this reporting [8-10]. Another important aspect of a sustainable assessment plan is the establishment of a
reasonable data collection and analysis schedule. The
assessment plan and continuous improvement process
developed in the new Department of Engineering at James
Madison University (JMU) reflects best practices and is created to promote a culture of assessment.
II. JAMES MADISON UNIVERSITY ASSESSMENT CULTURE
James Madison University is an institution with a strong
assessment culture, and the Engineering program was founded with clear goals and objectives and a commitment to ongoing assessment. The University's Center for Assessment and
Research Studies (CARS), a nationally renowned program that
offers the only Ph.D. in assessment and measurement,
supports the design, administration, and analysis of assessment
instruments administered at each stage. CARS staff members
work with each program to design and develop assessment processes and instruments in every academic major. JMU uses many diverse strategies and assessment prompts including
locally developed, regional, and national comprehensive
exams; on-line information-literacy/library skills assessments; portfolio assessment; performance assessments; essay/term
paper review; oral comprehensive exams; external on-site supervisor ratings; exit interviews, surveys and focus groups.
Annually scheduled Assessment Days (one preceding the fall
semester and one during the spring semester) at JMU establish
the importance of assessment as central to academic and student development. On the annual Spring Assessment Day
each February, the University (through CARS) collects general educational and developmental information from
sophomores and juniors who have completed 45 to 70 credit
hours. This day also provides an opportunity for programs to
assess their students. Some academic programs use the
February Assessment Day to administer assessment tests or surveys; others embed assessment activities within department
courses. The Engineering Program uses both approaches. On
an annual basis, the Engineering Assessment Committee
prepares an Assessment Progress Template (APT) report that
is submitted to CARS for rigorous evaluation and feedback.
Further, the University conducts reviews of all academic programs on a five-year cycle, and evaluating progress in
assessment is a key part of these comprehensive reviews.
III. CONSTITUENTS & THE CONTINUOUS IMPROVEMENT
PROCESS IN THE JMU ENGINEERING PROGRAM
From day one of the Engineering program, which welcomed the
inaugural class of students in August 2008, assessment was integral to the program development and a key facet to curriculum and programmatic development and continuous
improvement efforts. We started by identifying all the
constituencies of the JMU Engineering program:
• Industry and Employers of Program Graduates - Our graduates must be able to make significant and sustained
contributions to the success of their employers.
• Advisory Council - The JMU STEM Executive Advisory
Council (EAC) serves as an external constituency of the
Engineering program. The members of the EAC are
representatives from industry and academia, and are leaders in their respective fields.
• Alumni of the Engineering Program - Our graduates must be prepared with the knowledge and skills for successful
engineering careers or advanced studies.
• Current Students of the Engineering Program - Our program must provide an environment which fosters the success and accomplishment of our current students.
• Program Faculty - Our faculty play a critical role in
identifying the needs of students as well as employers and building those capabilities into the program. The
department faculty bridge and integrate all constituencies to assure the program accomplishes results.
Per the program assessment and evaluation plan, Table 1
summarizes how input is gathered from each of these
constituencies. Figure 1 represents the continuous improvement loop that guides the Engineering program's assessment activities, a six-step iterative process involving both internal
constituents (i.e. faculty and current students) and external constituents (i.e. alumni, employers, industry, and ABET).
The first three steps of the process involve defining and
refining the Program Educational Objectives (PEOs), Student Outcomes (SOs), and Performance Indicators (PIs). Performance Indicators are measurable learning outcomes and concrete actions students should be able to perform as a result
of participation in the program. These three steps are the
foundation of the assessment process and ever since the establishment of the Engineering Program we have iterated and refined the PEOs, SOs, and PIs. The ABET and
Assessment Committees are the two faculty groups that
oversee these initial steps of the continuous improvement
process, but all constituents have a role in these three steps.
The fourth step in the process is defining and refining the assessment metrics for the PEOs, SOs, and PIs and as a
program we have focused on both direct and indirect
assessment methods to evaluate attainment of PEOs, SOs, and
PIs. The Assessment Committee is the faculty group that oversees this fourth step of the continuous improvement loop,
but all constituents also have a role in this step. The fifth step in the process is collecting, analyzing, and assessing data to evaluate attainment of PEOs, SOs, and PIs. Once again, the
Assessment Committee is the faculty group that oversees this
step of the continuous improvement loop, but all constituents
have a role. The last step in the continuous improvement loop
is deciding on actions, implementing the actions, and reassessing. All the engineering faculty are integral to this step
given that at all levels (project, course, and program), the
faculty are the ones driving the continuous improvement
efforts and deciding on the actions.
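The division of responsibility across the loop can be summarized compactly. The sketch below is purely illustrative (the program does not prescribe any software); it simply encodes the six steps and the oversight assignments described above as a small Python data structure.

```python
# Illustrative encoding of the six-step continuous improvement loop described
# above; step names and oversight assignments follow the text, but the data
# structure itself is an assumption made for this sketch.
CONTINUOUS_IMPROVEMENT_LOOP = [
    ("Define/refine Program Educational Objectives (PEOs)", "ABET and Assessment Committees"),
    ("Define/refine Student Outcomes (SOs)", "ABET and Assessment Committees"),
    ("Define/refine Performance Indicators (PIs)", "ABET and Assessment Committees"),
    ("Define/refine assessment metrics for PEOs, SOs, and PIs", "Assessment Committee"),
    ("Collect, analyze, and interpret assessment data", "Assessment Committee"),
    ("Decide on actions, implement, and reassess", "All program faculty"),
]

for number, (step, oversight) in enumerate(CONTINUOUS_IMPROVEMENT_LOOP, start=1):
    print(f"Step {number}: {step} -- overseen by {oversight}")
```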
TABLE I. MAPPING OF CONSTITUENTS WITH ASSESSMENT METHODS

Industry and Employers of Program Graduates
• Assessment methods for PEOs: Employer Survey (every two years); Employer Interviews (as needed)
• Assessment methods for SOs: Employer Survey (every two years, indirect)

Advisory Council
• Assessment methods for PEOs: Periodic review of PEOs; Review of placement data and overarching assessment results (annually)
• Assessment methods for SOs: Review of overarching assessment results (annually)

Alumni
• Assessment methods for PEOs: Placement Data (annually); Alumni Survey (every two years)
• Assessment methods for SOs: Alumni Survey (every two years, indirect)

Current Students
• Assessment methods for PEOs: Senior Exit Survey (annually, indirect)
• Assessment methods for SOs: Course Evaluations (every semester, indirect); FE Exam Results (annually); Engineering Science Concept Inventories (annually); Capstone Project Assessment Survey (annually, indirect); Senior Exit Surveys and Interviews/Focus Groups (annually)

Program Faculty
• Assessment methods for PEOs: Periodic review of PEOs; Review of placement data and overarching assessment results (annually)
• Assessment methods for SOs: Student Work Assessment via Course Assessment and Continuous Improvement (CACI) Reports (every semester); Capstone Design Project Rubrics on Design Process, Writing, and Presentation (annually); Capstone Project Assessment Survey (annually, indirect); Review of overarching assessment results (annually)
To assure a quality assessment process and metrics, the Assessment Committee maintains an ongoing program that obtains multiple measures of student attainment of PEOs and SOs. The assessment methods are a mix of direct
measures, which are defined as quantified observations and
ratings of student performance/attainment, and indirect
measures, which are both qualitative and quantitative
evaluations of student achievements/attainments, such as
survey data.
[Figure 1 depicts the six-step loop: 1 - Define/Refine Program Educational Objectives (PEOs); 2 - Define/Refine Student Outcomes (SOs); 3 - Define/Refine Performance Indicators (PIs); 4 - Define/Refine Assessment Metrics for PEOs, SOs, and PIs; 5 - Collect, Analyze, and Interpret Assessment Data; 6 - Decide on Actions, Implement, and Reassess.]
Figure 1: JMU Engineering Continuous Improvement Loop.
The assessment of SOs is performed as much as possible with
direct measures, including evaluations of specific samples of
student work, targeted exam questions, and evaluation of
capstone projects. These direct measures are supplemented by
indirect measures, such as student surveys. For the assessment
of PEOs, indirect measures are more prominent, as the graduates and their employers are the best sources of
information about post-graduation success. Several quantitative measures, such as number of promotions/salary
increases, number of professional development activities, and
membership in professional organizations also are used to measure achievement of PEOs. The following list delimits the
direct and indirect measures for assessing PEOs and SOs.
Direct Measures of Program Educational Objectives:
• Alumni Survey (Specific questions)
• Senior Exit Survey (Placement Questions)
• Employer Survey
Indirect Measures of Program Educational Objectives:
• Alumni survey (Objective-focused questions)
• Admissions / Enrollment data
Direct Measures of Student Outcomes:
• Faculty Assessment of Student Work Samples via
Course Assessment & Continuous Improvement Reports
• Faculty Evaluation of Capstone Projects via Capstone Design Process Rubric, Capstone Report Rubric, and Capstone Presentation Rubric
• Fundamentals of Engineering Exam Results
• Engineering Concept Inventories
Indirect Measures of Student Outcomes:
• Senior Exit Survey (Outcome-specific questions)
• Course Evaluation Surveys (Outcome-specific questions)
• Faculty Survey on Capstone Project Assessment
• Student Survey on Capstone Project Assessment
• Alumni Survey (Outcome-specific questions)
Other Assessment for Continuous Improvement:
• Freshman Entrance Survey
• Senior Exit Survey (General questions)
• Senior Exit Interviews/Focus Groups
The JMU Assessment Plan demonstrates our overarching goal of establishing connections between course-level assessment
(where the key internal constituents are the program faculty
and the students) and program-level assessment of the SOs
(where the key internal constituents are the program faculty,
the Assessment Committee, and students) and the PEOs (where the key internal constituents are the program faculty,
the ABET Committee, the Assessment Committee, and students). External constituents (alumni, employers, industry,
and the advisory council), described previously, are also integral to the assessment efforts of the program. In this
assessment plan, course-level assessment from the students is attained via the Course Evaluations and from the faculty via
Course Assessment and Continuous Improvement (CACI)
Reports, which serve to document direct assessments of course
outcomes and Performance Indicators using student work, examinations, projects, etc. Program-level assessment, in turn, is based on the collection of CACI Reports that feed into the Student Outcome Summary Reports
(SOSR), which are annually prepared by the Assessment
Committee members. Each of these constituents has a role in
this plan and these responsibilities are described below. The
Assessment Committee is the principal faculty group responsible for coordination of assessment activities. The
Assessment Committee reviews assessment instruments
appropriate to specific PEOs and SOs, reviews the assessment process to identify areas for improvement, and provides
recommendations to the Academic Unit Head and program faculty. The Assessment Committee also organizes assessment-focused faculty meetings every semester.
The assessment process is designed to inform the faculty of the strengths and weaknesses of the program in a way that brings about continuous improvements in curriculum, teaching
pedagogy, advising, student services, and all other facets of the program. Annually, the Assessment Committee prepares
an Assessment Progress Template (APT) report that is submitted to CARS for rigorous evaluation and feedback. The
APT, which summarizes all assessment work that has been conducted during the academic year, is also sent to all
engineering faculty for solicited feedback. Any recommendations or issues identified in the report are
carefully considered and a course of action is developed. The
assessment reports often serve as triggers for requests for
additional specific assessment data and analyses to be used for
specific studies or improvement projects by faculty/sub-groups within Engineering.
IV. ASSESSMENT AND EVALUATION PLAN FOR ABET STUDENT OUTCOMES A TO K
The JMU Engineering program has developed an assessment and evaluation plan employing the best practices we have identified from several peer institutions, input from assessment advisors, CARS, and ABET resources [11]. In general, the assessment and evaluation plan ensures that the engineering
faculty create, maintain, and monitor performance related to
the SOs with advice and input from constituent
representatives. An essential element of assessment and
evaluation processes involves broad faculty involvement coupled with identification of specific responsibilities. This
section presents the duties of various committees and
individuals.
Assessment Committee
The Assessment Committee is the foundation for our assessment and evaluation processes and is the faculty team
ultimately responsible for organizing and executing key
assessment activities:
• Update assessment and evaluation plan as needed and
distribute to faculty.
• Plan and conduct assessment-focused faculty meetings.
• Administer engineering science concept inventories in
coordination with instructors.
• Administer the Senior Exit Survey, Alumni Survey,
Employer Survey, and Capstone Project Assessment Surveys (to both capstone advisors and senior students).
• Plan and conduct Senior Exit Interviews/Focus Groups
with graduating students.
Course Coordinators
Each course is assigned a course coordinator by the Assessment Committee and this faculty member plays a
critical role in assessment. For all program faculty to be
involved in these assessment processes, each program faculty member serves as course coordinator for at least one course
per year. The course coordinator duties are described below.
• Review syllabi from all sections and instructors to ensure that these syllabi are compatible with the master syllabus.
• Oversee the coordination of assessment activities across
all sections of a course.
• Prepare the course-specific questions on the Course
Evaluation Survey at the end of each semester and provide these to the instructors in all sections of the
course.
• Assemble and evaluate the collected assessment data.
• Prepare Course Assessment & Continuous Improvement
(CACI) Reports each semester a course is taught.
• Assemble/update the course binder.
• Prepare reports at assessment-focused faculty meetings.
Outcome Coordinators
Each member of the Assessment Committee serves as an outcome coordinator, which requires the integration and analysis of the individual course assessment materials related
to a specific Student Outcome. The responsibilities of the
outcome coordinators include:
• Reviewing relevant CACI reports, preparing the Student Outcome Summary Report (SOSR) at the end of each academic year, and submitting it to the Assessment Committee.
• Evaluating the overall degree of outcome attainment and
whether the information provided supports the outcome
accomplishment using the rubric in the Student Outcome
Summary Report (SOSR), shown in Figure 2.
• Preparing an oral presentation for an assessment-focused
faculty meeting and leading discussion related to the Student
Outcome.
Assessment Committee Chair
The Assessment Committee Chair has the additional responsibility of preparing the annual assessment report, also known as the Assessment Progress Template (APT), which was briefly described previously. The APT is a
summary of all program assessment activities and program assessment results for the academic year. The APT is submitted to the staff of the JMU Center for Assessment and
Research Studies (CARS) for feedback annually. To prepare
the APT, the Assessment Committee Chair reviews all CACI
and SOSR reports to select exemplar assessments of student
work (as the direct metrics) and indirect assessment results to provide an overarching story of program assessment and attainment of Student Outcomes. The APT is disseminated to
program faculty for feedback and discussion, which often
leads to the identification and actions for program continuous
improvements.
[Figure 2 shows the SOSR template. It records the academic year, the preparer, and the date; the Student Outcome (e.g., Outcome A: engineering graduates will have the ability to apply knowledge of mathematics, science, and engineering) and its Performance Indicators (A1, A2, ...); a table listing the courses in which the outcome is addressed, with brief descriptions; tables of exemplar direct assessments (i.e. student work evaluations from coursework) for each Performance Indicator, including a brief description of the student work, whether it is team-based or individual, and the target for measured performance assessment; and an overall rating of outcome achievement on a five-point scale: 5 - significantly above expectations (all measures exceed target levels); 4 - above expectations (most measures at or above target levels); 3 - outcome achieved (most measures at or near target levels); 2 - outcome not achieved (most measures below target levels); 1 - significantly below expectations (all measures below target levels).]
Figure 2: Student Outcome Summary Report (SOSR) template.
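As a concrete reading of the rating scale embedded in the SOSR template, the sketch below maps a set of (measured, target) attainment pairs to the five-point overall rating. The thresholds used for "most" and "at or near" are assumptions made for illustration; in practice the rating is a judgment made by the outcome coordinator.

```python
# Illustrative sketch of the SOSR overall-rating rubric shown in Figure 2.
# The cutoffs for "most" and "at or near" are interpretations made for this
# example, not the program's official definitions.
def sosr_overall_rating(measures):
    """measures: list of (measured_attainment, target_attainment) pairs, e.g. (0.82, 0.70)."""
    n = len(measures)
    exceed = sum(1 for m, t in measures if m > t)
    at_or_above = sum(1 for m, t in measures if m >= t)
    near = sum(1 for m, t in measures if m >= 0.9 * t)  # "at or near" target (assumed 90% of target)
    below = sum(1 for m, t in measures if m < t)

    if exceed == n:
        return 5  # significantly above expectations: all measures exceed target levels
    if below == n:
        return 1  # significantly below expectations: all measures below target levels
    if at_or_above > n / 2:
        return 4  # above expectations: most measures at or above target levels
    if near > n / 2:
        return 3  # outcome achieved: most measures at or near target levels
    return 2      # outcome not achieved: most measures below target levels

# Example: three direct measures against the 70% program-level target.
print(sosr_overall_rating([(0.82, 0.70), (0.74, 0.70), (0.66, 0.70)]))  # -> 4
```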
Our Student Outcome assessment and evaluation plan
stipulates the use of both direct and indirect methods to
evaluate attainment of each SO and these metrics are
described below. The following list describes the direct
assessment methods for SOs used in the JMU engineering program:
Faculty Assessment of Student Work - All Student Outcomes
and Performance Indicators are directly measured by faculty
assessing student work in specified courses. In some cases,
there are specific scoring or rating rubrics used, in other cases there are specific assignments evaluated based on grades received, and yet in other cases there is an overall faculty
evaluation on course work based on a variety of sources (e.g.
case studies, observations, assignments, course evaluations,
etc.). Targeted exam questions are also utilized and these are
selected by the instructors. Faculty capture results of these direct measures using the Course Assessment and Continuous
Improvement (CACI) template in Figure 3. In these CACI
reports, which are submitted by the course coordinators to the
Assessment Committee at the end of each semester, the
following items are captured: mapping of course outcomes to Student Outcomes, mapping of student work to Performance Indicators, description and characteristics (e.g. team-based or individual) of student work, description of assessment method,
metric for assessment, target attainment, measured attainment,
evaluation and uses of results, etc.
[Figure 3 shows the CACI Report template. It records the course number and section, the semester and year, and the instructor(s); reflections on the course (what to keep or update, how to improve delivery of content and pedagogical methods, and how feedback received from prior semesters was incorporated); the course outcomes ("Upon completion of this course, the student will be able to ..."); recommendations for a future run of the course; other general comments, remarks, and opinions regarding the course; and, for each assessed item of student work, the mapped program-level Student Outcomes (a-k) and the uses of results or anticipated improvements.]
Figure 3: Course Assessment and Continuous Improvement (CACI) Report template.
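To make the reporting structure concrete, the following sketch models the kind of record a CACI report captures, per the items listed above. The class and field names are invented for illustration and are not taken from the actual JMU template.

```python
# Hypothetical record of the fields a CACI report captures, per the description
# in the text; names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class StudentWorkAssessment:
    description: str            # description of the student work (e.g., targeted exam question)
    team_based: bool            # characteristic: team-based or individual work
    performance_indicator: str  # Performance Indicator the work maps to (e.g., "A1")
    student_outcome: str        # Student Outcome the work maps to (e.g., "A")
    method: str                 # assessment method (rubric, grade threshold, faculty evaluation)
    target_attainment: float    # e.g., 0.70 for the program-level target
    measured_attainment: float

@dataclass
class CACIReport:
    course: str                 # e.g., "ENGR 2xx" (placeholder course number)
    semester: str
    coordinator: str
    assessments: list[StudentWorkAssessment] = field(default_factory=list)
    evaluation_and_uses: str = ""  # evaluation of results and planned improvements

    def indicators_meeting_target(self):
        # Performance Indicators whose measured attainment meets or exceeds the target.
        return [a.performance_indicator for a in self.assessments
                if a.measured_attainment >= a.target_attainment]
```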
Seven Engineering Science Concept Inventories - National engineering science concept inventories (CIs) evaluate students' understanding of fundamental engineering
knowledge, so inclusion of such direct measures enables us to
assess program goals A and E, given that these goals pertain to
graduates having the ability to apply mathematics, science,
and engineering knowledge to solve problems. The seven engineering concept inventories, each administered in a corresponding engineering science course, are: (1) Statics Concept Inventory, (2) Dynamics Concept Inventory, (3) Materials Concept Inventory, (4) Fluid Mechanics via the
Thermal Transport Concept Inventory, (5) Thermodynamics
via the Thermal Transport Concept Inventory, (6) Heat Transfer via the Thermal Transport Concept Inventory, and (7)
Circuits Concept Inventory. Instructors in each of the courses
where the inventories are administered select a target set of
concept inventory questions to use for assessment purposes.
This is done to assure that concepts covered in class are the
ones that are indeed measured. For motivation purposes, students receive some type of course credit (e.g. final exam
points, homework credit, or extra credit points) as decided by
the course instructor.
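A minimal sketch of this scoring workflow follows, assuming a simple answer-key data layout; the function name and the per-student 70% threshold (mirroring the program-level direct-assessment target) are assumptions for illustration, not the program's actual scoring tool.

```python
# Sketch (assumed workflow) of scoring a concept inventory on only the
# instructor-selected questions, as described above.
def ci_attainment(responses, answer_key, selected_questions, threshold=0.70):
    """
    responses: dict mapping student id -> dict of question id -> chosen answer
    answer_key: dict of question id -> correct answer
    selected_questions: question ids the instructor chose because they match
                        concepts actually covered in class
    Returns the fraction of students scoring at or above `threshold` on the
    selected questions.
    """
    passing = 0
    for answers in responses.values():
        correct = sum(1 for q in selected_questions if answers.get(q) == answer_key[q])
        if correct / len(selected_questions) >= threshold:
            passing += 1
    return passing / len(responses)

# Example with two students and three selected Statics CI questions.
key = {"q1": "B", "q2": "D", "q3": "A"}
resp = {"s1": {"q1": "B", "q2": "D", "q3": "A"},
        "s2": {"q1": "B", "q2": "A", "q3": "A"}}
print(ci_attainment(resp, key, ["q1", "q2", "q3"]))  # -> 0.5 (one of two students at or above 70%)
```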
Fundamentals of Engineering (FE) Exam - Scores from the
national FE exam, taken by senior engineering students prior to beginning the practice of engineering, enable us to assess Student Outcomes A, E, and F, given that these outcomes pertain to
graduates having the ability to apply mathematics, science,
and engineering knowledge to solve problems. A pass rate of 70% is our target performance on the FE Exam.
Faculty Evaluations of Capstone Project Reports and Presentations - Capstone projects by definition represent the
culmination of students' educational experiences in the
program. As such, they are an outstanding opportunity to observe student achievement of many SOs. A previous publication details the JMU two-year capstone model [12].
Using the Capstone Design Process Rubric, Capstone Report
Writing Rubric, and Capstone Presentation Rubric developed by the engineering design instructors with feedback from the
Assessment Committee, faculty evaluate the design process
and quality of the senior capstone projects. The Capstone Design Process Rubric is used to evaluate the final capstone reports on twelve design process dimensions under four
overarching categories: (1) Planning and Information
Gathering, (2) Concept Generation, Evaluation, and Selection,
(3) Design Embodiment, and (4) Testing and Refinement [13]. Further, the Writing Rubric evaluates the quality of the
capstone reports on six writing-related dimensions and the
Presentation Rubric on three delivery-related dimensions as
well as the design process. The design instructors oversee the
evaluation of all the senior capstone reports and all program
faculty (with some external engineer practitioners as external evaluators) are involved in the evaluation of the senior
capstone presentations.
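A small sketch of how rubric ratings might be rolled up by category is given below; the four category names come from the text, while the dimension identifiers, rating scale, and aggregation by mean are assumptions made for illustration.

```python
# Illustrative aggregation of Capstone Design Process Rubric scores by
# category. Dimension names are placeholders, not the rubric's actual labels.
CATEGORIES = {
    "Planning and Information Gathering": ["dim1", "dim2", "dim3"],
    "Concept Generation, Evaluation, and Selection": ["dim4", "dim5", "dim6"],
    "Design Embodiment": ["dim7", "dim8", "dim9"],
    "Testing and Refinement": ["dim10", "dim11", "dim12"],
}

def category_means(scores):
    """scores: dict of dimension -> list of faculty ratings for one capstone report."""
    means = {}
    for category, dims in CATEGORIES.items():
        ratings = [r for d in dims for r in scores.get(d, [])]
        means[category] = sum(ratings) / len(ratings) if ratings else None
    return means

example = {"dim1": [3, 4], "dim2": [4], "dim3": [5, 4]}
print(category_means(example)["Planning and Information Gathering"])  # -> 4.0
```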
The following list describes the indirect assessment methods
for SOs used in the JMU engineering program:
Senior Exit Survey - The Senior Exit Survey administered as
an online survey is given to the seniors in April. This survey is
focused on indirectly assessing attainment of the SOs,
Performance Indicators, and PEOs, as well as collecting
placement information of our graduates. The placement questions in the survey pertain to career plans after graduation
including employer information, position titles, starting salary
information, graduate school plans, etc. The survey also includes a couple of open-ended questions to collect more in-depth feedback on the program and curriculum.
Faculty and Student Survey on Capstone Project Assessment - Assessing student learning as a result of participating on a
two-year capstone project [12] is a critical piece to indirectly
showing attainment of SOs. It is important to capture the
perspective of the capstone faculty advisors as well as the
student. These two sets of surveys (capstone advisor version and student version) are administered at the end of the
capstone experience as online surveys. Capstone faculty
advisors assess their capstone students' degree of learning
outcome attainment at the end of the two-year capstone design
experience. Similarly, students self-assess the degree of
learning outcome attainment at the end of the two-year capstone design experience. Capstone faculty advisors'
responses are compared to student responses and the results
enable us to assess the extent to which the capstone design experience has enabled students to meet Student Outcomes.
This survey, the National Engineering Students' Learning Outcomes Survey (NESLOS) [14-15], includes 55 ABET-derived learning outcomes that map to the eleven SOs and
some of the forty Performance Indicators. In this survey, we
also assess the degree to which the capstone project enabled
students to apply knowledge and skills from all required
science, math, and engineering courses in the curriculum. Such information enables us to better understand how effectively students apply and integrate knowledge from coursework and also helps us set standards during project solicitation and project selection.
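The comparison of advisor and student ratings can be illustrated with the sketch below, which averages NESLOS item responses per Student Outcome and reports the gap between the two groups. The item-to-outcome mapping, rating scale, and function name are hypothetical.

```python
# Sketch of the faculty-vs-student comparison described above: NESLOS item
# means for capstone advisors and students, grouped by Student Outcome.
def compare_neslos(faculty_ratings, student_ratings, item_to_outcome):
    """
    faculty_ratings / student_ratings: dict of item id -> list of ratings (e.g. 1-5)
    item_to_outcome: dict of item id -> Student Outcome letter ("a" .. "k")
    Returns dict of outcome -> (faculty mean, student mean, student-minus-faculty gap).
    """
    by_outcome = {}
    for item, outcome in item_to_outcome.items():
        group = by_outcome.setdefault(outcome, {"f": [], "s": []})
        group["f"].extend(faculty_ratings.get(item, []))
        group["s"].extend(student_ratings.get(item, []))

    results = {}
    for outcome, group in by_outcome.items():
        f_mean = sum(group["f"]) / len(group["f"]) if group["f"] else float("nan")
        s_mean = sum(group["s"]) / len(group["s"]) if group["s"] else float("nan")
        results[outcome] = (f_mean, s_mean, s_mean - f_mean)
    return results

# Example: one item mapped to Outcome "c", rated by two advisors and three students.
print(compare_neslos({"item1": [4, 3]}, {"item1": [5, 4, 5]}, {"item1": "c"}))
```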
Alumni Survey - The Alumni Survey collects demographic background information from alumni, appropriateness and attainment ratings
of the PEOs, appropriateness and attainment ratings of the
SOs, the ability of the SOs to support the PEOs, importance of
PEOs and SOs to the workplace, employment details,
employer contact information, and several open-ended questions. The Alumni Survey is administered to alumni as an
online survey. Contact information of our graduates is collected during the Senior Exit Survey and this contact
information is used in administering the Alumni Survey.
Course Evaluation Surveys - At the end of every course, instructors administer an online survey to students. The
Course Evaluation Surveys include standard questions devised
by the faculty as well as custom-designed questions by the
course instructor(s) for the purpose of continuous
improvement. The Course Evaluation Surveys include students' ratings of textbook usefulness, amount of work required in the course in comparison to other similar level courses, degree of challenge in the course, appropriateness of
examinations, ratings of the value as well as the difficulty of
specified course projects or assignments, ratings of students' achievement of the course outcomes, and open-ended questions. The survey employs a 5-pt Likert scale for most
questions. Our goal is to achieve 70% of survey responses at
the 4 or 5 level. Faculty are encouraged to include course
evaluation findings in CACI reports.
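A minimal sketch of the stated target follows: at least 70% of responses to an outcome-specific question at the 4 or 5 level on the 5-point Likert scale. The function name and example data are illustrative.

```python
# Minimal sketch of the course-evaluation target described above.
def meets_course_eval_target(responses, target=0.70):
    """responses: list of 1-5 Likert ratings for one outcome-specific question."""
    favorable = sum(1 for r in responses if r >= 4)  # responses at the 4 or 5 level
    return favorable / len(responses) >= target

print(meets_course_eval_target([5, 4, 4, 3, 5, 2, 4, 5, 4, 4]))  # 8/10 favorable -> True
```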
Senior Exit Interviews/Focus Groups - To accompany the more quantitative and structured Senior Exit Survey, all of our seniors are also offered the opportunity to participate in a
Senior Exit Interview or Focus Group to provide more general feedback on their satisfaction with the program at all levels -
curricular, extra-curricular, professional, etc. The questions
are open-ended and semi-structured. Seniors are offered
opportunities to participate in focus groups which are conducted by the Academic Unit Head and a member of the
Assessment Committee who is not an instructional faculty
member but rather the Student Coordinator in the program.
Data from the interviews/focus groups indirectly map to SOs
and PEOs.
Having described the metrics for assessing SOs, it is also important to describe the timeline and frequency of these metrics. A three-year cycle is used for assessing SOs, corresponding to three or four SOs each year. The annual groupings of SOs (i.e. Student Outcomes A, C, E, and H during year 1; Student Outcomes B, D, J, and K during year 2; and Student Outcomes F, G, and I during year 3) reflect the critical connections we recognize between SOs.
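The schedule can be expressed as a simple lookup, as sketched below; the groupings come from the text, but the cycle start year used in the arithmetic is an assumed value for illustration.

```python
# Sketch of the three-year Student Outcome assessment cycle described above.
SO_CYCLE = {
    1: ["A", "C", "E", "H"],
    2: ["B", "D", "J", "K"],
    3: ["F", "G", "I"],
}

def outcomes_for_year(academic_year, cycle_start=2010):
    """Return the Student Outcomes scheduled for assessment in a given year
    (cycle_start is an assumed first year of the cycle, not stated in the paper)."""
    position = (academic_year - cycle_start) % 3 + 1
    return SO_CYCLE[position]

print(outcomes_for_year(2012))  # -> ['F', 'G', 'I'] under the assumed 2010 start
```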
V. ASSESSMENT RESULTS AND ATTAINMENT OF
STUDENT OUTCOMES
In this section, we present a sample of assessment results corresponding to Student Outcome C. More specifically,
Figures 4 and 5 correspond to the summary of direct and
indirect assessment results for Student Outcome C. The tables
include details about the assessment methods, the target or
expected student attainment results, the measured student attainment results, as well as an evaluation of the results and
actions to be taken. Direct assessments of Student Outcomes
(i.e. Figure 4) are primarily focused on showing attainment of
the Performance Indicators by targeting exemplar student work across the curriculum. This mapping is based on the CACI reports, the SOSR and the Assessment Committee
working closely with the program faculty. The program-level target attainment for all direct assessments was set at 70%, although course coordinators could set their own target attainments for their courses. For indirect assessments of Student Outcomes (i.e. Figure 5), assessment results focused on data from the Senior Exit
Survey, the Faculty Capstone Project Assessment Survey, the
Student Capstone Project Assessment Survey, and the Alumni
Survey. The target attainment for all indirect assessments was set at 80%. Figures 4 and 5 are from the JMU Engineering ABET Self-Study.
[Figure 4 tabulates, for each Performance Indicator of Student Outcome C, the method(s) of assessment, the course, the target and measured attainment, and an evaluation of whether the indicator was adequately met. The direct assessments include a conceptual reactor design assignment (sizing continuously mixed flow reactors and batch reactors), a design and analysis project on the JMU bus system evaluating economic, environmental, societal, and technical characteristics and impacts, a system-level design analysis project, an infant jaundice monitor design challenge, and a kitchen cabinet accessibility design challenge for a wheelchair-bound client. Student attainment was generally based on the percentage of students achieving a score of 70% or higher, in most cases using report or project rubrics (4-pt scale), and the Performance Indicators were judged adequately met.]
Figure 4: Summary for direct assessments of Student Outcome C.
[Figure 5 tabulates the indirect assessments of Student Outcome C: the Senior Exit Survey, on which each student self-rated (5-pt scale) the extent to which the JMU engineering curriculum enhanced their abilities related to the outcome, with attainment based on the percentage of seniors agreeing or strongly agreeing; the Faculty Capstone Project Assessment Survey, on which capstone advisors rated the extent to which their students demonstrated the ability to achieve the learning outcomes aligning with Outcome C; and the Alumni Survey, which had been developed but not yet administered, with a pilot planned. The evaluation notes that the Student Outcome was adequately met, but that while students think they are adequately applying the engineering design process to a product, process, or system, faculty advisors believe students can be doing more; the design instructors therefore plan to use the design process scoring rubric both for formative feedback and to require students to self-assess their progress during the four-semester capstone design experience.]
Figure 5: Summary for indirect assessments of Student Outcome C.
VI. CONCLUSION
In this paper, we presented a thorough assessment plan being utilized in the Department of Engineering at James Madison University. The plan incorporates qualitative and quantitative assessment measures, involves all program faculty in the collection and analysis of data, and outlines a manageable structure for maintenance of the process. The assessment plan was successfully used for the department's first-time ABET accreditation review in 2012. The ABET program evaluation team described the assessment plan as "the most thorough assessment plan they had seen in their 20-plus years as ABET evaluators and a plan they would take back to share with their own engineering departments." The plan itself is also subject to a continuous improvement process, and future work will include implementing methods to organize the collection of assessment data in an online format. The approach presented in this paper is adaptable to other programs, and the hope is that this information can aid other engineering programs.
ACKNOWLEDGMENT
The authors would like to acknowledge the support of the National Science Foundation Awards #DUE-0837465 (NSF CCLI, "Design and Implementation of an Innovative Problem-based Learning Model and Assessment Tools in Undergraduate Engineering Education") and #EEC-0846468 (NSF CAREER, "Characterizing, Understanding, and Integrating Complex Problem Solving in Engineering Education"). The views expressed in this paper are those of the authors and do not necessarily represent those of the National Science Foundation. We would also like to thank the students and faculty who participated in this effort.
REFERENCES
[1] Prados, J.W., Peterson, G.D., and Lattuca, L., "Quality Assurance of Engineering Education through Accreditation: The Impact of Engineering Criteria 2000 and Its Global Influence," Journal of Engineering Education, Jan. 2005, pp. 165-184.
[2] Felder, R.M., and Brent, R., "Designing and Teaching Courses to Satisfy the ABET Engineering Criteria," Journal of Engineering Education, Jan. 2003, pp. 7-25.
[3] Helmick, M.T., and Gannod, G.C., "Streamlining and Integration of Miami Three-Tier Outcomes Assessment for Sustainability," Proceedings of the 39th ASEE/IEEE Frontiers in Education Conference, San Antonio, TX, October 18-21, 2009.
[4] Weisbrook, C.M., and Schonberg, W., "A Streamlined Approach to Developing and Assessing Program Educational Objectives and Program Outcomes," Proceedings of the ASEE Annual Conference and Exposition, Vancouver, BC, Canada, June 26-29, 2011.
[5] Schreiner, S., Cezeaux, J., and Testa, D., "Faculty-Friendly Assessment Systems for Biomedical Engineering Programs," Proceedings of the ASEE Annual Conference and Exposition, Honolulu, Hawaii, June 24-27, 2007.
[6] Slamovich, E.B., and Bowman, K.J., "All, Most or Some: Implementation of Tiered Objectives for ABET Assessment in an Engineering Program," Proceedings of the 39th ASEE/IEEE Frontiers in Education Conference, San Antonio, TX, October 18-21, 2009.
[7] Kurdahi, F., Shoemaker, J., and LaRue, J., "Design and Implementation of a Program Outcome Assessment Process for an ABET-Accredited Computer Engineering Program," Proceedings of the ASEE Annual Conference and Exposition, Honolulu, Hawaii, June 24-27, 2007.
[8] Christensen, K., Perez, R., Panta, P., and Bedarahally, P., "Unifying Program-Level ABET Assessment Data Collection, Analysis, and Presentation," Proceedings of the 41st ASEE/IEEE Frontiers in Education Conference, Rapid City, SD, October 12-15, 2011.
[9] Ozturk, H., and Raubenheimer, D., "PAT: A Program Assessment Tool for Engineering Programs," Proceedings of the ASEE Annual Conference and Exposition, Vancouver, BC, Canada, June 26-29, 2011.
[10] Whiteman, W., "Sustainable Assessment and Beyond," Proceedings of the ASEE Annual Conference and Exposition, Austin, TX, June 14-17, 2009.
[11] ABET Board of Directors, Criteria for Accrediting Engineering Technology Programs, Baltimore: ABET, 2011. Accreditation Board for Engineering and Technology, 15 Nov. 2011. Web. 4 Jan. 2013.
[12] Pierrakos, O., Barrella, E.M., Nagel, R.L., Nagel, J.K., Henriques, J.J., and Imholte, D.D., "An Innovative Two-Year Engineering Design Capstone Experience at James Madison University," 120th ASEE Annual Conference & Exposition, Atlanta, GA, June 2013.
[13] Nagel, R.L., Pierrakos, O., and Nagel, J.K., "A Versatile Guide and Rubric to Scaffold and Assess Engineering Design Projects," 120th ASEE Annual Conference & Exposition, Atlanta, GA, June 2013.
[14] Pierrakos, O., Lo, J., and Borrego, M., "Assessing Learning Outcomes of Senior Mechanical Engineers in a Capstone Design Experience," ASEE Annual Conference & Exposition, Honolulu, Hawaii, June 2007.
[15] Pierrakos, O., Borrego, M., and Lo, J., "Assessing Students' Learning Outcomes during Summer Undergraduate Research Experiences," ASEE Annual Conference & Exposition, Pittsburgh, PA, June 2008.