
APPENDIX C

PROGRAM QUALITY IMPROVEMENT PLAN

STRATEGIC PLAN FOR THE EDUCATIONAL UNIT

The strategic plan was conceived in Fall 2012. Industry leaders and faculty met to discuss emerging areas of leadership and the skills needed in the construction industry. The plan that emerged was to identify tracks in the curriculum that address the skills needed for the future of construction rather than those of the past. Courses would be developed, or adopted from Building Construction and other units, to support the tracks. All Building Construction students would be required to take a common Core Curriculum, and the tracks would consist of additional courses beyond the core. The Core Curriculum would fulfill all accreditation requirements without requiring content from the additional courses in the tracks.

The strategic planning process also concluded that communication skills are now far more critical for success in the industry.

Finally, the strategic planning process concluded that new technologies needed to be implemented, representing advanced construction techniques and tools for modern construction.

The plan concluded that there would be four tracks for undergraduates to select from. The four tracks identified were:

1. Virtual Design Construction Track
2. Sustainable Building Performance Track
3. Restricted Elective Track
4. Structural Design Track

The plan was to be implemented over the following five years and reviewed after the next accreditation visit, anticipated in Spring 2017. The strategic plan will be reviewed during the 2017-2018 academic year.

This plan was put into place and is now fully implemented; all graduates of the 2015-16 class have participated in it. As the plan emerged, a degree in Real Estate was created and approved, and a double major between Building Construction and Real Estate was formulated as one of the potential restricted elective tracks. Currently, ~36% of graduates take the double degree with Real Estate or other curricula from the restricted elective track, ~23% take the Virtual Design Construction track, ~31% take the Sustainable Building Performance track, and ~10% take the Structural Design track.

In support of the communication skills required, the Department of Building Construction now requires Communication I and II rather than English I and II. Also, the Department added a new required course titled Building Effective Construction Teams (BC 2104).

In support of the need for advanced construction techniques, the Department created a new lab called the Build Lab. This lab focuses on digital construction, automated information collection, and information technology.


ASSESSMENT PLAN FOR THE DEGREE PROGRAM

The mission of the degree program in Building Construction at Virginia Tech is to:

“Partner with industry in the co-evolution of the curriculum to meet future demands and needs of construction while remaining as current as feasible in technology, processes, and delivery methods”.

This mission statement was iteratively developed during faculty meetings and vetted through the department’s Industry Futures Committee. This committee is composed of industry leaders (e.g. CEOs, owners, presidents) who represent local, regional, and national construction companies. The underlying principle reflected in the mission statement is agility, because effective construction programs must reflect and respond to the dynamic nature of the construction industry in order to graduate students with relevant skill sets. As the industry changes, we expect the program to change accordingly such that graduates are prepared to make substantive contributions to the industry of today and tomorrow, not the industry of yesterday. Through our strong partnerships with industry (e.g. during bi-annual meetings of the Industry Affiliates Board), the mission statement has changed over time to reflect emergent needs.

To evaluate the effectiveness of the program in achieving the mission, faculty and members of the Industry Affiliates Board developed the following set of program goals and measurable program objectives to ensure continuous improvement. While the goals serve to guide programmatic decision making, the objectives serve as quantifiable indicators of program performance toward achieving our mission. For each assessment cycle, the Futures Committee meets and provides feedback to the department on areas of potential change. The feedback is discussed at faculty retreats and appropriate actions are determined. The degree program goals and objectives are as follows in Table C1:

Table C1. Mapping Degree Program Goals to Measurable Objectives

Goal 1: Provide opportunities for students to gain employment in the construction industry after graduation.
  Objective 1.1: Provide students with opportunities to participate in Career Fairs that specifically focus on employment in the construction industry.
  Objective 1.2: Expose students to a variety of construction and engineering companies during the Career Fairs.

Goal 2: Provide opportunities for students to establish an area of specialization in a construction discipline.
  Objective 2.1: Maintain enrollment in at least two specialization areas.

Goal 3: Place students in industry positions where they can achieve field and office leadership roles.
  Objective 3.1: Support job placement for students interested in entering the construction industry after graduation.


During each annual retreat, faculty review the twenty ACCE Program Learning Outcomes and develop operational definitions of the outcomes that can be directly measured through evaluation of student work products including technical documentation, narratives, examinations and presentations. Faculty identify work products from selected courses that can provide evidence of student achievement of the program learning outcomes. The result of this process is presented in Table C2 below.

Table C2. Definitions for Program Learning Outcomes

1. Create written communications appropriate to the construction discipline.
   Definition: Schedule narrative; prepare a response to a request for proposal (RFP); prepare a scope package.

2. Create oral presentations appropriate to the construction discipline.
   Definition: Team presentation structure; general presentation effectiveness.

3. Create a construction project safety plan.
   Definition: Risk assessment and integration plan; training and reward program.

4. Create construction project cost estimates.
   Definition: Components and organization of an estimate; quantity take-off.

5. Create construction project schedules.
   Definition: Components of a plan; logical network and presentation.

6. Analyze professional decisions based on ethical principles.
   Definition: Scenario-based decisions and discussion.

7. Analyze construction documents for planning management of construction processes.
   Definition: Review plans and specifications to determine what is required.

8. Analyze methods, materials, and equipment used to construct projects.
   Definition: Means and methods; site layout, site logistics, and site flow.

9. Apply construction management skills as a member of a multi-disciplinary team.
   Definition: Define roles and responsibilities as a member of a team whose members come from differing tracks and/or degrees.

10. Apply electronic-based technology to manage the construction process.
    Definition: Develop electronic schedules, model navigation, presentations, and estimates.

11. Apply basic surveying techniques for construction layout and control.
    Definition: Manually set horizontal and vertical control points from an established benchmark.

12. Understand different methods of project delivery and the roles and responsibilities of all constituencies involved in the design and construction process.
    Definition: Describe stakeholders, organizational structure, and contractual arrangements.

13. Understand construction risk management.
    Definition: Describe variables of risk, their mitigation, and consequences; understand risk areas in contracting and methods to reduce risk.

14. Understand construction accounting and cost control.
    Definition: Understand cash flow analysis; tracking actual versus budgeted costs; cost code setup.

15. Understand construction quality assurance and quality control.
    Definition: Understand the importance of a quality assurance plan and its ability to affect future cost.

16. Understand construction project control processes.
    Definition: Understand project controls, including plans, RFIs, submittals, change orders, earned value, and budget control.

17. Understand the legal implications of contract, common, and regulatory law to manage a construction project.
    Definition: Explain the requirements of a contract; understand the basic parameters of common law and the impact of regulatory agencies and law.

18. Understand the basic principles of sustainable construction.
    Definition: Understand the management of consumable resources and the reduction of waste in the building process and on a construction site.

19. Understand the basic principles of structural behavior.
    Definition: Understand load paths and soil bearing capacities.

20. Understand the basic principles of mechanical, electrical, and piping systems.
    Definition: Apply knowledge of MEP systems to project design.

A suite of assessment tools is used to measure the degree program objectives and learning outcomes. For the degree program objectives, the tools are based on programmatic data related to job placement, enrollment, and career fair attendance. For the learning outcomes, the tools capture assessments of student learning directly, through faculty and industry evaluation of student performance, and indirectly, through student self-assessments. These tools are described below.

Assessment Tools for Degree Program Objectives:

1. Graduating Senior Exit Interviews - The Department Head conducts and documents graduating senior interviews. Each year, exit interviews of graduating students are conducted as free-form discussions with small groups of five to six students. This process results in a compilation of student concerns about the faculty, curriculum, facility, and departmental student services. Student responses also highlight strengths of the program. During the interview, students are also asked to report on their post-graduation employment plans (e.g. whether they have accepted full-time employment in the construction industry). These data are compiled and stored in a spreadsheet for tracking trends over time in student concerns, programmatic strengths, and employment. A copy of the Graduating Senior Exit Interview survey protocol can be seen in Appendix D.

2. Career Fair Participation Statistics - Every Fall and Spring semester, the Myers-Lawson School of Construction hosts a construction industry career fair. The career fair draws companies from across sectors and throughout the construction supply chain. Data is collected, stored and analyzed based on participation of students and companies at the career fair. Program support staff collect the following data: 1) names and home academic units for all students who attend, and 2) name, type and sector of companies that attend.

3. Enrollment Data - Every semester, enrollment data for construction courses are generated and distributed to academic programs by the university registrar. These data are compiled, stored, and analyzed for enrollment trends over time. More specifically, enrollment in the Structures and Virtual Design cognate specialization areas is parsed and compared to enrollment patterns from previous years to determine whether students are enrolling in specialization areas.
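To make this check concrete, the following is a minimal sketch in Python (hypothetical code, not the department's actual spreadsheet workflow) showing how per-track enrollment counts could be converted to shares of total program enrollment and compared against a minimum-share criterion; the counts and the 20% threshold are illustrative assumptions chosen to reproduce the track distribution reported above.

```python
# Hypothetical sketch: compute each specialization's share of program enrollment
# and check a minimum-share criterion such as the one used for Objective 2.1.
# Counts below are illustrative, not actual registrar data.
enrollment = {
    "Virtual Design Construction": 46,
    "Sustainable Building Performance": 62,
    "Restricted Elective / Real Estate": 72,
    "Structural Design": 20,
}
threshold = 0.20  # assumed minimum share of total enrollment per specialization area

total = sum(enrollment.values())
areas_meeting_threshold = []
for track, count in enrollment.items():
    share = count / total
    status = "meets" if share >= threshold else "below"
    print(f"{track}: {share:.0%} ({status} the {threshold:.0%} criterion)")
    if share >= threshold:
        areas_meeting_threshold.append(track)

# Objective 2.1 asks that enrollment be maintained in at least two specialization areas.
print("Objective 2.1 satisfied:", len(areas_meeting_threshold) >= 2)
```

With these illustrative counts, the computed shares match the distribution reported earlier (roughly 23%, 31%, 36%, and 10%), and the Structural Design track falls below the threshold, mirroring the finding discussed later in Table 9.1.4.1.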

Assessment Tools for Student Learning Outcomes:

1. Graduating Senior Exit Surveys - At the end of their required Capstone Experience course (BC 4444), graduating students are asked to indicate how strongly they agree that the program has prepared them to achieve the student learning outcomes. Each question in the survey directly corresponds to each of the 20 program learning outcomes (e.g. Question 1 asks students how strongly they agree that the program has prepared them to create written communications appropriate to the construction discipline). A copy of the Graduating Senior Exit Survey can be seen in Appendix D.

2. Faculty Assessments of Student Learning - At the end of each semester, faculty evaluate student work products that correspond to each of the program learning outcomes. Evaluations are guided by a rubric in an effort to standardize evaluations between faculty and across semesters and to make students explicitly aware of instructor expectations. Work products include formal presentations, construction documentation, computer models, and examinations. Faculty assessments occur in the following classes:

a. BC 4444 – Construction Practice II (Capstone): Students work in teams to prepare a Response to RFP that is provided by an industry sponsor. Most of the “create”, “analyze” and “apply” SLOs (i.e. 1-11) are assessed with student performance during this class because students are required to apply their knowledge gained from previous classes to analyze requirements for a construction project and create plans, specifications, design and other documentation. Lecture is limited to refreshing previously learned concepts and instructors spend the bulk of the instruction time providing incremental feedback on students’ RFP Responses. The syllabus for the course can be seen in Appendix B.

b. BC 4434 – Construction Practice I: Students work individually on exploring how more advanced construction management concepts (e.g. contracts, law, cost controls) apply to the industry-sponsored Capstone project in BC 4444. Most of the “understand” SLOs (i.e. 12-20) are assessed with student performance on a series of exams in this course because students are not required to demonstrate higher-order mastery of the material for these SLOs. True/False, multiple-choice, and fill-in-the-blank questions are sufficient to determine whether students understand concepts, but not whether they can apply or analyze the concepts required to create appropriate construction design and planning materials. The syllabus for this course can be seen in Appendix B.

c. BC 2104 – Building Effective Construction Teams: In this course, students work collaboratively to analyze ethical decision-making based on industry-sponsored case scenarios. Students are organized into a panel and prepare a presentation on the ethical challenges contained within the scenarios. The industry sponsor(s) then attend the students’ final presentations, describe the real-world outcome of the case, and evaluate the students’ ethical solutions. Because of the central role that ethics plays in building effective construction teams, the summative panel and presentation are used to assess SLO 6: Analyze professional decisions based on ethical principles. The syllabus for this course can be seen in Appendix B.


3. Industry Assessments of Student Learning - At least twice each year, a jury of Industry Affiliates Board members directly assesses the performance of students on: 1) the final design/build project proposal presentation of the Capstone course (BC 4444), and 2) a student ethics panel during the Building Effective Construction Teams course (BC 2104), in which students are required to respond to authentic ethical scenarios posed by industry representatives. For these assessments, oral and written feedback is provided to students in an open forum attended by faculty, staff, and other students. Written evaluations by industry representatives are guided by a rubric in an effort to standardize assessments between representatives and across semesters. A copy of the rubric can be seen in Appendix D.

To facilitate continuous improvement, the faculty annually reviews the data collected through these assessment tools and develops strategies to reinforce strengths and address weaknesses. Gaps in student learning are identified through this process and curricular interventions are developed to ensure that students are achieving the program learning outcomes and that the degree program objectives are consistently met. The curriculum is reviewed to ensure that any changes in course structure, sequencing, or faculty specialization do not negatively impact student achievement of the learning outcomes. For instance, the curriculum review process ensures that as new classes are added and outdated classes are removed from the curriculum, students are still able to establish a 12-credit area of specialization within a construction sub-discipline. Table C3 below summarizes the assessment tools, frequency of use, and data collection procedures.

Table C3. Summary of data collection procedure for degree program objectives and learning outcomes

DEGREE PROGRAM OBJECTIVES

Graduating Senior Exit Interviews (every semester): Department head meets with student focus groups and records programmatic feedback, post-graduation intent and any accepted or pending construction job offers.

Career Fair Participation Statistics (every semester): Administrative staff collect data from companies through an online registration form. Students are required to swipe their identification cards at the registration table in order to enter the career fair, which records their name and home academic unit. The resulting data is collected and stored in a spreadsheet for analysis.

Enrollment Data (every semester): Enrollment data for construction courses is automatically generated by the Registrar’s office. This data is collected, stored and sorted based on construction cognate for tracking of enrollment patterns by specialization.

STUDENT LEARNING OUTCOMES

Graduating Senior Exit Surveys (every semester): Students anonymously complete the survey on the last day of class in their senior Capstone course. Responses are collected and stored in a spreadsheet for analysis.

Faculty Assessment of Student Learning (every semester): Faculty for selected courses assess student work products based on the operational definitions of the ACCE Program Learning Outcomes (see Table C2), including tests, presentations and assignments. More applied work products (e.g. a semester-long RFP response) are used to assess higher levels of mastery (e.g. “create”, “apply”) while tests are used to assess lower levels of mastery (e.g. “understand”). Resulting assessment data is entered into an online form and then exported in tabular format for analysis.

Industry Assessment of Student Learning (every semester): Industry board members assess senior Capstone presentations and review students’ final project documentation. Board members provide verbal feedback directly to students and complete an assessment rubric that is entered into an online form. Data collected through this form is exported in tabular format for analysis.

Program objectives are evaluated based on direct measures designed to capture the core components of our mission that relate to preparing students for employment as the construction industry evolves. The measures focus on the Program’s support of a diverse, bi-annual Construction Career Fair and specialization areas within the curriculum that reflect new skills required for employment in the construction industry. Table C4 maps Program Goals and Objectives to the measures used to determine whether the program has met the performance criteria.

Table C4. Mapping of assessment tools and performance criteria to program goals and objectives

Program Goal 1 - Provide opportunities for students to gain employment in the construction industry after graduation.

  Objective 1.1 - Provide students with opportunities to participate in Career Fairs that specifically focus on employment in the construction industry.
    Measure: Student participation in the Career Fair.
    Performance criterion: 75% of students will attend at least one career fair each year.

  Objective 1.2 - Expose students to a variety of construction and engineering companies during the Career Fairs.
    Measure: Company participation in the Career Fair.
    Performance criterion: Career fairs will be attended by a variety of companies.

Program Goal 2 - Provide opportunities for students to establish an area of specialization in a construction discipline.

  Objective 2.1 - Maintain enrollment in at least two specialization areas.
    Measure: Enrollment in specialization areas.
    Performance criterion: Specialization areas will maintain enrollment of at least 25% of the total number of students enrolled in the program.

Program Goal 3 - Place students in industry positions where they can achieve field and office leadership roles.

  Objective 3.1 - Support job placement for students interested in entering the construction industry after graduation.
    Measure: Post-graduation job placement.
    Performance criterion: 90% of students will gain employment in the construction industry 3 months after graduation.

Performance criteria for the Learning Outcomes are based on a combination of direct and indirect measures of student performance. The performance criterion was set, somewhat arbitrarily, at 80% because the faculty and participating industry board members believe that this level represents “above average” student performance. In the following section, we describe how we use trends in performance between semesters to trigger action plan development, even in cases where we are meeting our 80% target. In this way, we are able to identify problem areas for performance (e.g. indicated by a severe downward trend) before performance drops below our target.

Table C5. Mapping of assessment tools and performance criteria to student learning outcomes

1. Create written communications appropriate to the construction discipline.
   Measures: [DIRECT] Faculty grade for the executive summary contained within the RFP Response (BC 4444); [DIRECT] Industry board member assessment of the RFP Response (BC 4444); [INDIRECT] Student response to Exit Survey Question 1 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

2. Create oral presentations appropriate to the construction discipline.
   Measures: [DIRECT] Faculty grade for the RFP Response presentation (BC 4444); [DIRECT] Industry board member assessment of the RFP Response presentation (BC 4444); [INDIRECT] Student response to Exit Survey Question 2 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

3. Create a construction project safety plan.
   Measures: [DIRECT] Faculty grade for the safety plan contained in the RFP Response (BC 4444); [DIRECT] Industry board member assessment of the safety plan overview during the RFP Response presentation (BC 4444); [INDIRECT] Student response to Exit Survey Question 3 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

4. Create construction project cost estimates.
   Measures: [DIRECT] Faculty grade for the project cost estimate contained in the final RFP Response (BC 4444); [DIRECT] Industry board member assessment of the estimate overview during the RFP Response presentation (BC 4444); [INDIRECT] Student response to Exit Survey Question 4 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

5. Create construction project schedules.
   Measures: [DIRECT] Faculty grade for the project schedule contained in the RFP Response (BC 4444); [DIRECT] Industry board member assessment of the schedule overview during the RFP Response presentation (BC 4444); [INDIRECT] Student response to Exit Survey Question 5 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

6. Analyze professional decisions based on ethical principles.
   Measures: [DIRECT] Faculty grade for the ethics panel presentation (BC 2104); [DIRECT] Industry board member assessment of the ethics panel presentation (BC 2104); [INDIRECT] Student response to Exit Survey Question 6 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

7. Analyze construction documents for planning management of construction processes.
   Measures: [DIRECT] Faculty grade for the construction management plan contained in the final Capstone RFP Response (BC 4444); [INDIRECT] Student response to Exit Survey Question 7 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

8. Analyze methods, materials, and equipment used to construct projects.
   Measures: [DIRECT] Faculty grade for the major systems narrative and specifications contained in the RFP Response (BC 4444); [DIRECT] Industry board member assessment of the methods, materials and equipment overviewed during the RFP Response presentation (BC 4444); [INDIRECT] Student response to Exit Survey Question 8 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

9. Apply construction management skills as a member of a multi-disciplinary team.
   Measures: [DIRECT] Average assessments for BC (College of Architecture) and CEM (College of Engineering) student peer reviews of teamwork on RFP Response preparation (BC 4444); [DIRECT] Industry board member assessment of team coordination during the RFP Response presentation (BC 4444); [INDIRECT] Student response to Exit Survey Question 9 (BC 4444).
   Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

10. Apply electronic-based technology to manage the construction process.
    Measures: [DIRECT] Faculty grade for the BIM contained in the final Capstone RFP Response (BC 4444); [DIRECT] Industry board member assessment of the BIM overview during the final Capstone presentation (BC 4444); [INDIRECT] Student response to Exit Survey Question 10 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

11. Apply basic surveying techniques for construction layout and control.
    Measures: [DIRECT] Student grades on the Surveying Quiz (BC 1224); [INDIRECT] Student response to Exit Survey Question 11 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

12. Understand different methods of project delivery and the roles and responsibilities of all constituencies involved in the design and construction process.
    Measures: [DIRECT] Student grades on the Delivery Methods exam (BC 4434); [INDIRECT] Student response to Exit Survey Question 12 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

13. Understand construction risk management.
    Measures: [DIRECT] Average student grades on the Risk Management exam (BC 4434); [INDIRECT] Student response to Exit Survey Question 13 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

14. Understand construction accounting and cost control.
    Measures: [DIRECT] Average student grades on the Accounting and Cost Control exam (BC 4434); [INDIRECT] Student response to Exit Survey Question 14 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

15. Understand construction quality assurance and quality control.
    Measures: [DIRECT] Average student grades on the QA/QC exam (BC 4434); [INDIRECT] Student response to Exit Survey Question 15 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

16. Understand construction project control processes.
    Measures: [DIRECT] Average student grades on the Control Processes exam (BC 4434); [INDIRECT] Student response to Exit Survey Question 16 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

17. Understand the legal implications of contract, common, and regulatory law to manage a construction project.
    Measures: [DIRECT] Average student grades on the Construction Law exam (BC 4434); [INDIRECT] Student response to Exit Survey Question 17 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

18. Understand the basic principles of sustainable construction.
    Measures: [DIRECT] Faculty grade for the sustainability plan contained in the RFP Response (BC 4444); [INDIRECT] Student response to Exit Survey Question 18 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

19. Understand the basic principles of structural behavior.
    Measures: [DIRECT] Faculty grade for the Structural Systems Analysis assignment (BC 4444); [INDIRECT] Student response to Exit Survey Question 19 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

20. Understand the basic principles of mechanical, electrical, and piping systems.
    Measures: [DIRECT] Faculty grade for the major systems narrative and specifications contained in the RFP Response (BC 4444); [INDIRECT] Student response to Exit Survey Question 20 (BC 4444).
    Performance criterion: Average score of direct and indirect measures will be ≥ 80%.

The evaluation methodology is based on feedback from faculty, students, and industry representatives. The department conducts two retreats annually, one in the fall prior to the start of the semester and another in the spring at the close of the academic year. During these retreats, assessment results are presented to the faculty with a perspective of continuous improvement. An academic review of the year is conducted and strategies are put into place to improve the quality of the curriculum and the relevance of content, informed by student, industry, and faculty feedback. Identifiable weaknesses, strengths, and concerns are highlighted and an action plan is developed consistent with the program’s mission, objectives, and learning outcomes. After review of the data, action plans are developed to address any identified deficiencies based on: 1) not meeting the 80% target, and 2) any downward trends observed between semesters. The following guidelines distinguish degrees of downward trend and the resulting actions:

≤ 1% negative change - No action plan is required.

2-5% negative change - No action plan is required, but discussion at faculty retreat to identify root causes of the negative trend. For example, faculty may identify that a particular course module is outdated and may discuss strategies for ensuring that the course content remains relevant.

>5% negative change – An action plan is required, even in cases where the performance criteria target is met. Our intent with developing action plans when we meet targets but observe substantial negative trends is to offset the somewhat arbitrary nature of our targets and support continuous improvement.

Action plans are developed during the faculty retreats and are implemented over the following semester where possible. In some cases (e.g. new course development) the implementation phase may be longer.
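These decision rules reduce to a simple calculation that can be repeated each assessment cycle. The following is a minimal sketch in Python (hypothetical code, not the department's actual tooling) that averages an outcome's direct and indirect measures, checks the 80% criterion, and classifies the semester-over-semester trend according to the thresholds above; the function and variable names are illustrative assumptions.

```python
def evaluate_outcome(scores, previous_avg, target=0.80):
    """Classify an assessment result using the plan's decision rules.

    scores:       direct and indirect measure scores for the current cycle,
                  expressed as fractions of 1.0 (e.g. faculty grade,
                  industry rating, exit survey agreement).
    previous_avg: averaged score from the prior cycle, or None if not collected.
    target:       performance criterion (80% per the plan).
    """
    current_avg = sum(scores) / len(scores)
    target_met = current_avg >= target

    # Downward-trend rules: a <=1% drop needs no action, a 2-5% drop prompts
    # discussion at the faculty retreat, and a >5% drop requires an action
    # plan even when the target is met.
    if previous_avg is None:
        action = "no trend data (DNC); review at next retreat"
    else:
        change = (current_avg - previous_avg) * 100  # percentage-point change
        if change >= -1:
            action = "no action plan required"
        elif change >= -5:
            action = "no action plan required; discuss root causes at faculty retreat"
        else:
            action = "action plan required"

    if not target_met:
        action = "action plan required (target not met)"

    return current_avg, target_met, action


# Example: averaged measures of 0.86, 0.84 and 0.78 (mean of roughly 0.83)
# still meet the 80% target, but the drop from a prior-cycle average of 0.89
# exceeds 5 percentage points, so an action plan is triggered.
print(evaluate_outcome([0.86, 0.84, 0.78], previous_avg=0.89))
```

In the example call, the averaged score still meets the 80% target, but the drop of more than five percentage points relative to the prior cycle triggers an action plan, consistent with the intent described above.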

ASSESSMENT IMPLEMENTATION PLAN FOR THE DEGREE PROGRAM

Our data were collected in Fall 2015 and Spring 2016. The feedback was used in faculty retreats (for example, Spring 2016 retreat minutes are in Appendix D) and implementation proceeded as determined during the retreats. The Department will make timely changes as review of assessment occurs. At least every three years, the Department will conduct a comprehensive assessment analysis of student learning outcomes. Comprehensive assessment of degree Program Objectives and Learning Outcomes occurs through two vehicles: 1) Fall and Spring semester faculty retreats, and 2) Fall and Spring semester Building Construction committee meetings of the Myers-Lawson School of Construction’s Industry Affiliates Board. In this way, faculty and industry identify gaps in program and student performance and are provided with a mechanism through which to offer suggestions for improvement. Data in support of Program Objectives and Learning Outcomes are collected each semester and assessed the following semester. For instance, data currently being collected for Fall 2016 will be assessed in Spring 2017. Results will then be presented to the Building Construction Committee of the Industry Affiliates Board and at the faculty retreat at the end of the Spring 2017 semester. Action plans that emerge from those meetings will then be prepared during Summer 2017, with implementation planned for Fall 2017. This process of Program Objective assessment for Fall 2015 through Spring 2016 is provided in Table 9.1.4.1, and for Student Learning Outcomes in Table 9.1.4.2 below. Each table maps the Program Objective or Student Learning Outcome to: 1) performance measures, 2) performance criteria, 3) resulting action plans, and 4) a description of the effectiveness of the action plan. Both tables have been formatted as screenshots of the Excel spreadsheets we use to facilitate the assessment and evaluation process. We have also included a link to the original spreadsheet file, which contains hyperlinks to our data collection instruments, rubrics, exit surveys, industry surveys, examples of student work products, and assignments.


We strongly encourage the Program Evaluators to download the spreadsheet and follow the links to documents contained therein to more easily evaluate and provide feedback on our methodology. Based on our strategic plan, during academic year 2017-2018 a full review will be conducted and updates will be made as determined necessary.

Table 9.1.4.1. Assessment of Program Objectives for Fall 2015 – Spring 2016

Program Goal 1 - Provide opportunities for students to gain employment in the construction industry after graduation.

Objective 1.1 - Provide students with opportunities to participate in Career Fairs that specifically focus on employment in the construction industry.
  Assessment Tool: Career Fair Participation Statistics
  Target: 75% of students will attend at least one career fair each year.
  SP14-FA15: 76% of students attended at least one career fair.
  SP15-FA16: 86% of students attended at least one career fair.
  Trend: Career fair attendance increased by 10% from 2014 to 2016.
  Target met? Yes.
  Action Plan [Implementation in 15-16]: Since we met our target for 15-16, no action plan is required.
  Outcomes from Action Plan: n/a

Objective 1.2 - Expose students to a variety of construction and engineering companies during the Career Fairs.
  Assessment Tool: Career Fair Participation Statistics
  Target: Career fairs will be attended by a variety of companies.
  SP14-FA15: 181 companies attended the FA14 and SP15 career fairs, with 64% general contractors, 20% specialty contractors, 9% engineering firms, and 34% design-build firms.
  SP15-FA16: 191 companies attended the FA15 and SP16 career fairs, with 61% general contractors, 13% specialty contractors, 8% engineering firms, and 32% design-build firms.
  Trend: Overall attendance is slightly up and the diversity of companies has remained stable.
  Target met? Yes.
  Action Plan [Implementation in 15-16]: Our career fair remains a robust opportunity for students to be exposed to a large number of diverse companies, so no action plan is required.
  Outcomes from Action Plan: n/a

Program Goal 2 - Provide opportunities for students to establish an area of specialization in a construction discipline.

Objective 2.1 - Maintain enrollment in at least two specialization areas.
  Assessment Tool: Enrollment Data
  Target: Specialization areas (tracks) will maintain enrollment of at least 20% of the total number of students enrolled in the program.
  SP14-FA15: DNC
  SP15-FA16: 23% of students enrolled in the VDC track, 10% in the structures track, 31% in the sustainability track, and 36% in the real estate track.
  Trend: n/a
  Target met? No.
  Action Plan [Implementation in 15-16]: Prior to 15-16, we did not systematically track enrollment in specialization areas. After we started tracking enrollment in 15-16, we discovered that enrollment in the structures track was below our target. Discussion of whether we should continue to offer this specialization is scheduled for the FA16 faculty retreat.
  Outcomes from Action Plan: Discussion planned for FA16 faculty retreat.

Program Goal 3 - Place students in industry positions where they can achieve field and office leadership roles.

Objective 3.1 - Support job placement for students interested in entering the construction industry after graduation.
  Assessment Tool: Graduating Senior Exit Survey
  Target: 90% of students will gain employment in the construction industry 3 months after graduation.
  SP14-FA15: 100% of students who graduated in SP14 and FA15 were employed in the construction industry 3 months after graduation.
  SP15-FA16: 100% of students who graduated in SP15 and FA16 were employed in the construction industry 3 months after graduation.
  Trend: Stable.
  Target met? Yes.
  Action Plan [Implementation in 15-16]: We met our targets for SP14-FA15, so no action plan was necessary.
  Outcomes from Action Plan: n/a

(2014-2016 Assessment Cycle; 2015-2016 Evaluation Cycle)

Table 9.1.4.2. (Frame 1 of 3). Assessment of Student Learning Outcomes for Fall 2015-Spring 2016 (link to original .xlsx spreadsheet with hyperlinks to assignments, rubrics and examples for each SLO)

Table 9.1.4.2. (Frame 2 of 3). Assessment of Student Learning Outcomes for Fall 2015-Spring 2016 (link to original .xlsx spreadsheet with hyperlinks to assignments, rubrics and examples for each SLO)


Table 9.1.4.2. (Frame 3 of 3). Assessment of Student Learning Outcomes for Fall 2015-Spring 2016 (link to original .xlsx spreadsheet with hyperlinks to assignments, rubrics and examples for each SLO)

Notes:
[1] Action plans were developed based on whether the 80% target was met and on downward trends between semesters.
    Minimal downward trend (≤ 1%): no action required.
    Moderate downward trend (2-5%): no action plan required, but discussion at the faculty retreat to identify root causes.
    Severe downward trend (>5%): action plan required.
[2] n/a = data was not applicable to observation by the industry board evaluators.
[3] DNC = data was not collected this semester.


10. REVIEW LAST VISITING TEAM REPORT: WEAKNESSES AND CONCERNS

10.1 Previous Accreditation Actions

During the 2011 accreditation review, the following weaknesses and concerns were identified by the visiting team.

Weakness 1: Curriculum, Business and Management category. The curriculum is deficient in that it does not include the core subject matter Principles of Management in the Business and Management curriculum category.

Third Year Report Response: This was deemed Alleviated by the department in that three curriculum course requirements were adjusted to include a required course addressing the principles of management: MGT 3304 (Management Theory and Leadership Practice). The change became a requirement for all incoming freshmen starting with the 2011-2012 academic year.

Weakness 2: Department Head Position. The Department Head is on a nine-month, not a twelve-month, appointment. His services during the summer months are funded through summer tuition. The department would be better served by a twelve-month appointment.

Third Year Report Response: This is deemed In-Progress by the department in that the Department Head continues to be on a nine-month appointment plus one month for summer administration. The department recognizes the concern of the ACCE, and this issue will be reviewed and addressed after the current Department Head appointment period has ended.

Weakness 3: Academic Quality and Outcomes Assessment. While significant progress has been made since the last re-accreditation visit, the Academic Quality and Outcomes Assessment plan and process are still incomplete.

Third Year Report Response: This was deemed Alleviated by the department. The Quality Assessment Plan is being used to continually review and solicit input and feedback on the curriculum and strategic plan. Measurable objectives are compiled and discussed annually by faculty and staff. New curriculum within the department accommodates the continually expanding requirements of the construction profession.

Concern 1: Budget. Budget support is not adequate to enable the program to achieve its stated purposes. For FY 09-10, $89,047 (9.8%) of program expenditures were funded by endowment income and one-time donations.

Third Year Report Response: This is deemed In-Progress by the department. The department has raised a $1,500,000 endowment for support of the Department Head and the program. Funds from endowments continue to remain a critical element to support program needs due to a decade of state budget cuts. This is not specific to the Department of Building Construction at Virginia Tech, as this is the case with many comparable programs across the country.

Concern 2: Student Advising. While the advising process has been revised and re-implemented, a meeting with senior students indicated continued difficulty with this area. However, the Visiting Team did not address this issue with any newer students, precluding discovery of the effectiveness of the new advising approach. Pending confirmation of the effectiveness of the new documents and process, this area is cited again as a Concern.

Third Year Response: The department acknowledges this concern and considers it alleviated. BC Freshmen Orientation Advising evaluations ranked the highest in the college for the past two years. Recent graduate exit interviews confirm student satisfaction with department advisement. The Undergraduate Advising Manual and the online Student Guidebook are updated yearly to provide faculty and students with the most up-to-date information regarding the program, policies, and requirements.