

Annual Planning and Evaluation Report Instructional Programs: 2017-2018

Office of Institutional Effectiveness and Student Success


ANNUAL PLANNING AND EVALUATION REPORT Instructional Programs: 2017-2018

Office of Institutional Effectiveness and Student Success Northern Virginia Community College

May 2019


NORTHERN VIRGINIA COMMUNITY COLLEGE

OFFICE OF INSTITUTIONAL EFFECTIVENESS AND STUDENT SUCCESS

The purpose of the Office of Institutional Effectiveness and Student Success is to conduct analytical studies and provide information in support of institutional planning, policy formulation, and decision making. In addition, the office provides leadership and support in research-related activities to members of the NOVA community engaged in planning and evaluating the institution's success in accomplishing its mission.

When citing data from this report, the Northern Virginia Community College (NOVA) Office of Institutional Effectiveness and Student Success must be cited as the source.

4001 Wakefield Chapel Road, Annandale, VA 22003-3796
(703) 323-3129
www.nvcc.edu/oir


Annual Planning and Evaluation Report: 2017-2018 Instructional Programs

Introduction

Northern Virginia Community College (NOVA) conducts planning and evaluation of all instructional programs annually. This report presents yearly assessment results for degree-awarding programs and select certificates at NOVA. The Annual Planning and Evaluation Report has been published since the 2002-2003 academic year. Reports from previous years for instructional programs, campuses, and administrative units can be found on the website of the Office of Institutional Effectiveness and Student Success (OIESS): http://www.nvcc.edu/assessment/.

At the beginning of the planning and evaluation cycle, each instructional program determines the expected student learning outcomes (SLOs) and program goals to be assessed for the year and proposes the methods to assess the achievement of these outcomes. (Evaluation methods are not attached to this report; if there is a question about an evaluation method, please contact the instructional program or OIESS.) At the end of the planning and evaluation cycle, each instructional program documents the results of its assessment activities and details how those results have been or will be used to make continuous improvements in each cycle.

The assessment process for instructional programs is faculty-driven. Faculty members are directly involved in the development of student learning outcomes, the implementation of assessment activities, and the analysis and use of assessment results. As shown in Tables 1 and 2, the planning and evaluation process for instructional programs engages a large number of teaching faculty and academic deans. Reports have been prepared and submitted by a designated faculty member (the SLO Lead Faculty) from each degree-awarding program and from select certificates at the College. Table 1 details the SLO Lead Faculty and Coordinating Deans for 2017-2018, when the assessments were conducted. Table 2 provides information on the SLO Lead Faculty and Pathway Deans for 2018-2019, when the reports were submitted in September 2018. The assessment activities and resulting reports are facilitated by the Pathway Deans, each of whom is responsible for a cluster of programs. Such widespread faculty participation is not only in compliance with the SACSCOC Principles of Accreditation but is also integral to establishing a culture of assessment and promoting data-driven decision making.

This report presents results for the 2017-2018 academic year. Each instructional program is presented separately, and the programs are listed in alphabetical order.


Table 1. Coordinating Academic Deans and SLO Lead Faculty: 2017-2018

Program/Certificate | Coordinating Dean | SLO Lead Faculty
Accounting, A.A.S. | Barbara Canfield, LO | Rujuta Panchal, LO
Administration of Justice, A.A.S. | MaryAnn Schmitt, MA | Jo Ann Short, MA
Air Conditioning & Refrigeration, A.A.S. | Alison Thimblin, WO | Martin Kang, WO
Architecture Technology, A.A.S. | Mary Vander Maten, AN | Armen Simonian, AN
ASL-English Interpretation, A.A.S. | Jennifer Daniels, AN | Paula Debes, AN
Automotive Technology, A.A.S. | Ivy Beringer, AL; Diane Mucci, MA | Laura Garcia-Moreyra, AL
Biotechnology, A.A.S. | Diane Mucci, MA | Xin Zhou, MA
Business Administration, A.S. | Barbara Canfield, LO | Mike Brazie, LO
Business Management, A.A.S. | Barbara Canfield, LO | Mike Brazie, LO
Computer Science, A.S. | Stewart Edwards, AN | Larry Shannon, AN
Construction Management Technology, A.A.S. | Maggie Emblom-Callahan, AL | Siamak Ghorbanian, AL
Contract Management, A.A.S. | Christopher Arra, WO | Charles Taylor, WO
Cybersecurity, A.A.S. | Ivy Beringer, AL | Margret Leary, AL
Dental Assisting, A.A.S. | Marsha Atkins (Interim), ME | Lisbeth Shewmaker, ME
Dental Hygiene, A.A.S. | Marsha Atkins (Interim), ME | Marina McGraw, ME
Diagnostic Medical Sonography, A.A.S. | Marsha Atkins (Interim), ME | Leigh Giles-Brown, ME
Drivers Education Career Studies Certificate | MaryAnn Schmitt, MA | Nicole Mancini, MA
Early Childhood Development, A.A.S. | Ivy Beringer, AL | Susan Johnson, LO
Emergency Medical Services, A.A.S. | Marsha Atkins (Interim), ME | Gary Sargent, ME
Engineering Technology, A.A.S. | Maggie Emblom-Callahan, AL; Mary Vander Maten, AN | Rudy Napisa, AN
Engineering, A.S. | Mary Vander Maten, AN | Rudy Napisa, AN
Fine Arts, A.A., A.A.A. | David Epstein, WO | Fred Markham, WO
Fine Arts, A.A.A., Photography Specialization | David Epstein, WO | Gail Rebhan, AL
Fire Science Technology, A.A.S. | Robert Wade, AN | Gary Sargent, ME
General Studies, A.S. | Jennifer Daniels, AN |
Graphic Design, A.A.S. | Jimmie McClellan, AL | Elizabeth Hill, AL
Health Information Management, A.A.S. | Robert Wade, ME | Jacqueline Gibbons, ME
Horticulture Technology, A.A.S. | Barbara Canfield, LO | Anders Vidstrand, LO
Hospitality Management, A.A.S. | Stewart Edwards, AN | Jill Guindon-Nasir, AN
Information Systems Technology, A.A.S. | Ivy Beringer, AL | Moses Niwe, AL
Information Technology, A.S. | Ivy Beringer, AL | Moses Niwe, AL
Interior Design, A.A.S. | Katherine Hitchcock, LO | Kristine Winner, LO
Liberal Arts, A.A. | Hemchand Gossai, AN | Rima Gulshan, AN
Liberal Arts, English Specialization | Jennifer Daniels, AN | Chris Kervina, MA
Marketing, A.A.S. | Stewart Edwards, AN | Judy McNamee, AN
Massage Therapy Career Studies Certificate | Camisha Parker, WO | Jennifer Sovine, WO
Medical Laboratory Technology, A.A.S. | Marsha Atkins (Interim), ME | Maria Torres-Pillot, ME
Music Recording Technology Certificate | Katherine Hitchcock, LO | Sanjay Mishra, LO
Music, A.A., A.A.A., Specialization | Jimmie McClellan, AL | Lisa Eckstein, AL
Nursing, A.A.S. | Marsha Atkins, ME | Laura Dickson, ME; Brenda Clark, ME
Occupational Therapy, A.A.S. | Marsha Atkins (Interim), ME | Megan Cook, ME
Paralegal Studies, A.A.S. | Ivy Beringer, AL | Joyce McMillan, AL
Personal Training Career Studies Certificate | Maggie Emblom-Callahan, AL | Rick Steele, AL; Dahlia Henry-Tett, MA
Phlebotomy Career Studies Certificate | Marsha Atkins (Interim), ME | Maria Torres-Pillot, ME
Photography and Media, A.A.S. | David Epstein, WO | Aya Takashima, AL
Physical Therapist Assistant, A.A.S. | Marsha Atkins (Interim), ME | Jody Gundrum, ME
Professional Writing for Business, Government and Industry | Jennifer Daniels, AN | Jennifer Nardacci, AN
Public History & Historic Preservation Career Studies Certificate | Katherine Hitchcock, LO | Marc Dluger, LO
Radiography, A.A.S. | Marsha Atkins (Interim), ME | Jarice Risper, ME
Respiratory Therapy, A.A.S. | Marsha Atkins (Interim), ME | Donna Oliver-Freeman, ME
Science, A.S. | Abe Eftekhari, AN | Mary Vander Maten, AN
Science, Mathematics Specialization | Alison Thimblin | Martin Bredick, AL
Social Sciences, A.S. | MaryAnn Schmitt, MA |
Social Sciences, Political Science Specialization | Katherine Hitchcock, LO | Jack Lechelt, AL
Social Sciences, A.S. Geospatial Specialization | Katherine Hitchcock, LO | Michael Harman, LO
Social Sciences, A.S. Teacher Educ. Specialization | MaryAnn Schmitt, MA | Ashley Wilkins, MA
Substance Abuse Rehab. Counselor Certificate | Ivy Beringer, AL | Chandell Miller, AL
Veterinary Technology, A.A.S. | Barbara Canfield, LO | Tregel Cockburn, LO
Welding: Basic Techniques Career Studies Certificate | Diane Mucci, MA | Matthew Wayman, MA


Table 2. Pathway Deans and SLO Lead Faculty: 2018-2019
(Within each pathway, rows list Program/Certificate | SLO Lead Faculty.)

Business and Hospitality Management (Pathway Dean: Ivy Beringer, AL)
Accounting, A.A.S. | Pamela Parker, AL
Business Administration, A.S. | Mohammad (Kabir) Jamal, AL
Business Management, A.A.S. | Mohammad (Kabir) Jamal, AL
Contract Management, A.A.S. | Charles Taylor, WO
Hospitality Management, A.A.S. | Jill Guindon-Nasir, AN
Marketing, A.A.S. | Judy McNamee, AN

Visual, Performing and Media Arts (Pathway Dean: David Epstein, WO)
Graphic Design, A.A.S. | Dwayne Treadway, LO
Interior Design, A.A.S. | Kristine Winner, LO
Music, A.A., A.A.A. Specialization | Lisa Eckstein, AL
Music Recording Technology Certificate | Sanjay Mishra, LO
Photography and Media, A.A.S. | Aya Takashima, AL
Cinema, A.F.A. | Bryan Brown, WO
Liberal Arts, Theatre Studies Certificate | David Tyson, WO
Visual Art, A.F.A. | Fred Markham, AL

Liberal Arts and Communications (Pathway Dean: Jimmie McClellan, AL)
Liberal Arts, A.A. |
Liberal Arts, Art History Specialization | Sarah Liberatore, AL
Liberal Arts, Communication Studies | Tamara Warren-Chinyani, WO

Languages (Pathway Dean: Jennifer Daniels, AN)
Liberal Arts, English Specialization | Chris Kervina, MA
American Sign Language to Eng. Interpretation | Paula Reece, AN
Professional Writing Certificate | Jennifer Nardacci, AN

Social Sciences (Pathway Dean: Katherine Hitchcock, LO)
Social Sciences, A.S. |
Social Sciences, A.S. Geospatial Specialization | Michael Harman, LO
Social Sciences, A.S. Political Science Specialization | Jack Lechelt, AL
Public History & Historic Preservation Career Studies Certificate | Marc Dluger, LO

Education and Public Service (Pathway Dean: Evette Hyder-Davis, MA)
Social Sciences, A.S. Teacher Educ. Specialization | Ashley Wilkins, MA
Early Childhood Development, A.A.S. | Susan Johnson, LO
Drivers Education Career Studies Certificate | Nicole Mancini, MA
Administration of Justice, A.A.S. | Timothy Dickinson, AL
Paralegal Studies, A.A.S. | Joyce McMillan, AL
Substance Abuse Rehab. Counselor Certificate | Chandell Miller, AL

Technologies & Engineering Transfer (Pathway Dean: Abe Eftekhari, AN)
Engineering, A.S. | Rudy Napisa, AN
Air Conditioning & Refrigeration, A.A.S. | Martin Kang, WO
Automotive Technology, A.A.S. | Laura Garcia-Moreyra, AL
Architecture Technology, A.A.S. | Nazanin Saidi, AN
Construction Management Technology, A.A.S. | Siamak Ghorbanian, AL
Welding: Basic Techniques Career Studies Certificate | Matthew Wayman, MA

Mathematics and Computer Science (Pathway Dean: Alison Thimblin, WO)
Computer Science, A.S. | Larry Shannon, AN
Science, Mathematics Specialization | Martin Bredeck, AL

Physical Sciences (Pathway Dean: Barbara Canfield, LO)
Science, A.S. | Mitra Jahangeri, LO

Life Sciences (Pathway Dean: Diane Mucci, MA)
Biotechnology, A.A.S. | Xin Zhou, MA
Horticulture Technology, A.A.S. | Anders Vidstrand, LO

Nursing and Surgical Technologies (Pathway Dean: Marsha Atkins, ME)
Nursing, A.A.S. | Laura Dickson, ME

Allied Health (Pathway Dean: Shelly Powers, ME)
Dental Assisting, A.A.S. | Lisbeth Shewmaker, ME
Dental Hygiene, A.A.S. | Marina McGraw, ME
Diagnostic Medical Sonography, A.A.S. | Leigh Giles-Brown, ME
Emergency Medical Services, A.A.S. | Gary Sargent, ME
Fire Science Technology, A.A.S. | Gary Sargent, ME
Health Information Management, A.A.S. | Jacqueline Gibbons, ME
Massage Therapy Career Studies Certificate | Jennifer Sovine, WO
Medical Laboratory Technology, A.A.S. | Maria Torres-Pillot, ME
Phlebotomy Career Studies Certificate | Maria Torres-Pillot, ME
Occupational Therapy Assistant, A.A.S. | Kathi Skibek, ME
Physical Therapist Assistant, A.A.S. | Jody Gundrum, ME
Personal Training Career Studies Certificate | Rick Steele, AL; Dahlia Henry-Tett, MA
Radiography, A.A.S. | Jarice Risper, ME
Respiratory Therapy, A.A.S. | Donna Oliver-Freeman, ME
Veterinary Technology, A.A.S. | Kiana Adkisson-Selby, LO

Information and Engineering Technologies (Pathway Dean: Paula Ford (Interim), WO)
Cybersecurity, A.A.S. | Margret Leary, AL
Information Technology, A.S. | Moses Niwe, AL
Information Systems Technology, A.A.S. | Moses Niwe, AL
Engineering Technology, A.A.S. | John Sound, MA

General Studies, General Education, Global Studies (Pathway Dean: Barbara Hopkins, AN)
General Studies, A.S. |


Table of Contents

Introduction
Table 1. Coordinating Academic Deans and SLO Lead Faculty: 2017-2018 ... i
Table 2. Pathway Deans and SLO Lead Faculty: 2018-2019 ... iii
Accounting, A.A.S. ... 1
Administration of Justice, A.A.S. ... 7
Air Conditioning and Refrigeration, A.A.S. ... 13
American Sign Language to English Interpretation, A.A.S. ... 18
Architecture Technology, A.A.S. ... 25
Automotive Technology, A.A.S. and Emissions Specialization ... 32
Biotechnology, A.A.S. ... 43
Business Administration, A.S. ... 55
Business Management, A.A.S. ... 63
Computer Science, A.S. ... 71
Construction Management Technology, A.A.S. ... 75
Contract Management, A.A.S. ... 78
Cybersecurity, A.A.S. ... 89
Dental Assisting Program, Certificate ... 97
Dental Hygiene, A.A.S. ... 102
Diagnostic Medical Sonography, A.A.S. ... 107
Drivers Education Career Studies Certificate ... 118
Early Childhood Development, A.A.S. ... 123
Emergency Medical Services, A.A.S. ... 131
Engineering, A.S. ... 137
Engineering Technology, A.A.S. ... 146
Fine Arts, A.A./A.A.A. ... 152
Fine Arts: A.A.A., Photography Specialization ... 159
Fire Science Technology, A.A.S. ... 168
General Studies, A.S. ... 170
Graphic Design, A.A.S. ... 178
Health Information Management, A.A.S. ... 194
Horticulture Technology, A.A.S. ... 199
Hospitality Management, A.A.S. ... 205
Information Systems Technology, A.A.S. ... 213
Information Technology, A.S. ... 219
Interior Design, A.A.S. ... 225
Liberal Arts, A.A. ... 232
Liberal Arts: English Specialization, A.A. ... 240
Marketing, A.A.S. ... 248
Medical Laboratory Technology, A.A.S. ... 255
Music, A.A., A.A.A., and A.A.A. Jazz/Popular Music Specialization ... 262
Music Recording Technology Certificate ... 270
Nursing, A.A.S. ... 276
Occupational Therapy Assistant, A.A.S. ... 282
Paralegal Studies, A.A.S. ... 290
Personal Training Career Studies Certificate ... 296
Phlebotomy Career Studies Certificate ... 301
Photography and Media, A.A.S. ... 308
Physical Therapist Assistant, A.A.S. ... 319
Professional Writing Certificate ... 329
Public History & Historic Preservation Career Studies Certificate ... 335
Radiography, A.A.S. ... 343
Respiratory Therapy, A.A.S. ... 352
Science, A.S. ... 358
Science: Mathematics Specialization, A.S. ... 369
Social Sciences, A.S. ... 372
Social Science: Geospatial Specialization, A.S. ... 380
Social Sciences: Political Science Specialization, A.S. ... 387
Social Sciences: Teacher Education Specialization, A.S. ... 390
Substance Abuse Rehabilitation Counselor, Certificate ... 396
Veterinary Technology, A.A.S. ... 398
Welding: Basic Techniques Career Studies Certificate ... 426


Annual Planning and Evaluation Report: 2017-2018 Accounting, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: The curriculum is designed for persons who seek employment in the accounting field or for those presently in accounting who wish to increase their knowledge and update their skills. The occupational objectives include accounting trainee, accounting technician, junior accountant, and accountant.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Apply generally accepted accounting principles (GAAP)

Principles of Accounting I (ACC 211). Direct Measure: multiple-choice questions from academic test banks, embedded in assignments. Provided rubric criteria or question topics:

1 (initial investment)
2 (cash transaction)
3 (deferred expense)
4 (credit purchase)
5 (accounts receivable)
6 (dividends)
7 (cash payments)
8 (deferred revenue)
9 (adjusting entries - supplies)
10 (adjusting entries - prepaid)

Sample Size (Specify N/A where not offered):
Campus/Modality | Total # Sections Offered | # Sections Assessed | # Students Assessed
AL | 9 | 5 | 104
AN | 16 | 8 | 200
MA | 7 | 0 | 0
LO | 13 | 3 | 35
WO | 12 | 6 | 72
ME | N/A | N/A | N/A
ELI | 13 | 3 | 38
DE* | N/A | N/A | N/A
Total | 70 | 25 | 449

*Dual-enrollment

Semester/year data collected: Fall 2017
Target: 70% for individual questions and a 70% average SLO assessment score. The overall SLO assessment score is 72.5% this year.
Results by In-Class, ELI, and Dual Enrollment: previous year's data by campus is not available. (Specify N/A where not offered.)

Results by Campus/Modality (current assessment results, Fall 2017, percent > target):
AL | 70
AN | 73
MA | 0
LO | 72
ME | N/A
WO | 81
ELI | 65
DE | N/A
Total | 74.4%

Results by SLO Criteria:
Criteria/Question Topic | Fall 2017 (% of students > target) | Fall 2016 (% of students > target)
1 | 74 | 75
2 | 74 | 66
3 | 69 | 67
4 | 61 | 61
5 | 80 | 74
6 | 72 | 69
7 | 78 | 73
8 | 79 | 73
9 | 81 | 65
10 | 76 | 68
Total | 74.4% | 69.1%
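As a reading aid, the Fall 2017 total matches the simple, equally weighted mean of the ten per-question percentages; a quick consistency check under that assumption:

\[
\tfrac{1}{10}\,(74 + 74 + 69 + 61 + 80 + 72 + 78 + 79 + 81 + 76) = 74.4
\]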


Previous action(s) to improve SLO: The Cluster will review the results, revise the format of assessment questions 2, 3, 4, 6, 9, and 10 for clarity, and communicate to faculty the importance of the assessment process. The process will be finalized at the Spring 2018 Cluster meeting and put into action shortly thereafter.
Target Met: [x] Yes
Based on recent results, areas needing improvement: There was an error in uploading the answers to questions 3 and 4 on Blackboard, which caused incorrect marking for both questions. The error was recognized and corrected in October 2017.
Current actions to improve SLO based on the results: It is important for all campuses to participate in SLO dissemination in selected sections of classes, and results must be submitted to the SLO lead. The Accounting Cluster discussed these issues at the Spring 2019 meeting and is in the process of developing a centralized platform for submitting SLO results. The Accounting Cluster plans to implement this action in Fall 2019.


Mean Overall Score on SLO Assessment
Semester | Fall 2017 | Fall 2016
Mean Score | 72.5% | 72%

Current results improved: [X] Partially
Strengths by Criterion/Question/Topic: Results improved for questions 2, 5, 7, 8, 9, and 10.
Weaknesses by Criterion/Question/Topic: Results did not improve for questions 3 and 4.

Next assessment of this SLO: Fall 2019

Perform all steps in the accounting process, including production of basic financial statements.

Principles of Accounting I (ACC 211). Direct Measure: multiple-choice questions from academic test banks, embedded in class assignments and examinations. Provided rubric criteria or question topics:

Question (subcomponent):
1 (net assets)
2 (total liabilities)
3 (net income)
4 (total revenue)
5 (total expenses)
6 (stockholders equity)
7 (accounting equation)
8 (statement order)
9 (retained earnings)
10 (dividend treatment)

Sample Size (Specify N/A where not offered):
Campus/Modality | Total # Sections Offered | # Sections Assessed | # Students Assessed
AL | 9 | 5 | 105
AN | 16 | 8 | 205
MA | 7 | 0 | 0
LO | 13 | 3 | 34
ME | N/A | N/A | N/A
WO | 12 | 6 | 71
ELI | 13 | 3 | 34
DE* | N/A | N/A | N/A
Total | 70 | 25 | 449

*Dual-enrollment

Semester/year data collected: Fall 2017
Target: 70% for individual questions and 70% for the overall SLO assessment. The overall SLO assessment score is 74.3% this year.
Results by In-Class, ELI, and Dual Enrollment: previous year's data by campus is not available. (Specify N/A where not offered.)

Results by Campus/Modality (current assessment results, Fall 2017, percent > target):
AL | 63
AN | 79.5
MA | 0
ME | N/A
LO | 80.5
WO | 76
ELI | 68.5
DE | N/A
Total | 73.5%

Results by SLO Criteria:
Criteria/Question Topic | Fall 2017 (% of students > target) | Fall 2016 (% of students > target)
1 | 59 | 51
2 | 66.5 | 57
3 | 69 | 67
4 | 86 | 82
5 | 90 | 84
6 | 71 | 73
7 | 67 | 66
8 | 74 | 78
9 | 82 | 79
10 | 80 | 79
Total | 74.5% | 71.6%

Previous action(s) to improve SLO: The Cluster reviewed the results and, in late Fall 2018, revised the format of assessment questions 1, 2, 3, and 7 for clarity; results have improved compared to the previous year's data.
Target Met: [x] Yes
Based on recent results, areas needing improvement: Based on this year's results, the Cluster is working to improve the clarity and format of the assessment.
Current actions to improve SLO based on the results: It is important for all campuses to participate in SLO dissemination in selected sections of classes, and results must be submitted to the SLO lead. The Accounting Cluster discussed these issues at the Spring 2019 meeting and is in the process of developing a centralized platform for submitting SLO results. The Accounting Cluster plans to implement this action in Fall 2019.
Next assessment of this SLO: Fall 2019


Mean Overall Score on SLO Assessment
Semester | Fall 2017 | Fall 2016
Mean Score | 74.3% | 67%

Current results improved: [x] Partially
Strengths by Criterion/Question/Topic: Results improved for all questions except questions 6 and 8.
Weaknesses by Criterion/Question/Topic: Questions 1, 2, 3, and 7 require further improvement.

Identify generally accepted auditing standards (GAAS) and the additional requirements imposed by the Sarbanes-Oxley Act

Auditing I (ACC 241). Direct Measure: multiple-choice and matching questions from academic test banks, embedded in class assignments and examinations.

Campus/Modality | Total # Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 0 | 0
AN | 1 | 0 | 0
MA | N/A | N/A | N/A
LO | 1 | 0 | 0
WO | N/A | N/A | N/A
ME | N/A | N/A | N/A
ELI | 2 | 1 | 18
DE* | N/A | N/A | N/A
Total | 5 | 1 | 18

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 70% for individual questions and 70% for the overall SLO assessment. The overall SLO assessment score is 77% this year.
Results by In-Class, ELI, and Dual Enrollment: previous year's data is not available, as this SLO was not assessed in the last few years. (Specify N/A where not offered.)

Results by Campus/Modality (Spring 2018, percent > target):
AL | 0
AN | 0
MA | N/A
LO | 0
WO | N/A
ELI | 77
DE | N/A
Total | 77%

Results by SLO Criteria:
Criteria/Question Topic | Spring 2018 (% of students > target)
1 | 67
2 | 94
3 | 78
4 | 67
5 | 72
6 | 72
7 | 72
8 | 72
9 | 89
10 | 83
Total | 76.6%

Previous action(s) to improve SLO:
Target Met: [x] Yes
Based on recent results, areas needing improvement: ACC 241 is not offered regularly on all campuses, so collecting data for this SLO is challenging.
Current actions to improve SLO based on the results: It is extremely important for all campuses to participate in this SLO assessment when the course is offered and to collect data for analysis. At the Spring 2019 meeting, the cluster discussed improving SLO assessments for higher-level accounting classes. The cluster will provide details at the Fall 2019 meeting.
Next assessment of this SLO: Based on future course offerings.


Mean Overall Score on SLO Assessment
Semester | Spring 2018
Mean Score | 77%

Current results improved: [x] Yes. Based on the previous SLO assessment, results improved.
Strengths by Criterion/Question/Topic: Results for questions 2, 3, 9, and 10 are greater than the target.
Weaknesses by Criterion/Question/Topic: Questions 1, 4, 5, 6, 7, and 8 require improvement in clarity and format.

Core Learning Outcome | Evaluation Methods | Assessment Results | Use of Results

CLO: Describe and make distinctions between various accounting methods under U.S. GAAP and international financial reporting standards (IFRS) [ X ] CT

Intermediate Accounting II ACC 222

Semester/year data collected: Spring 2018
Target: 70% for individual questions and 70% for the overall SLO assessment. The overall SLO assessment score is 81% this year.
Results by In-Class, ELI, and Dual Enrollment: previous year's data by campus is not available. (Specify N/A where not offered.)

Results by Campus/Modality (Spring 2018, percent > target):
AL | 58
AN | 0
MA | N/A
ME | N/A
LO | 98
WO | N/A
ELI | 0
DE* | N/A
Total | 78%

*Dual-enrollment

Results by CLO Criteria:
Criteria/Question Topic | Spring 2018 (% of students > target) | Spring 2015 (% of students > target)
1 | 83 | 88
2 | 69 | 56
3 | 86 | 83
4 | 76 | 71
Total | 78.5% | 74.5%

Previous action(s) to improve CLO, if applicable: N/A
Target Met: [x] Yes
Based on recent results, areas needing improvement: Question 2 will be rephrased in the next assessment to improve clarity. The subcomponent (distinctions between IFRS and GAAP) is at the appropriate level of competency.
Current actions to improve CLO based on the results: It is extremely important for all campuses to participate in this CLO assessment when the course is offered and to collect data for analysis. At the Spring 2019 meeting, the cluster discussed improving CLO assessments for higher-level accounting classes. The cluster will provide details at the Fall 2019 meeting.


Mean Overall Score on SLO Assessment
Semester | Spring 2018 | Spring 2015
Mean Score | 81% | 74%

Current results improved: [x] Yes. The Spring 2018 score is significantly improved compared to Spring 2015.
Strengths by Criterion/Question/Topic: Topics covered in questions 1 and 3 are meeting the expected competency level.
Weaknesses by Criterion/Question/Topic: Topics covered in questions 2 and 4 need improvement.

Next assessment of this CLO: Spring 2020

Program Goals | Evaluation Methods | Assessment Results | Use of Results

Improve course success rates for Principles of Accounting I and II (ACC 211 and 212)

NOVA OIR Grade Distribution Reports by Course (the Fall 2017 report is not available).

Target: Improve course success rates for Principles of Accounting I and II.

Results for the past 5 years, ACC 211:
Fall | Success Rate (%) | Change (percentage points)
2017 | 64 | No change
2016 | 64 | No change
2015 | 64 | 8
2014 | 56 | -5
2013 | 61 | -3
Target Met (ACC 211): [x] No

Results for the past 5 years, ACC 212:
Fall | Success Rate (%) | Change (percentage points)
2017 | 82 | 1
2016 | 81 | 2
2015 | 79 | No change
2014 | 79 | 4
2013 | 75 | -1
Target Met (ACC 212): [x] Yes
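As a reading aid, the change column in these two tables appears to be expressed in percentage points (the simple difference between consecutive fall success rates) rather than as a relative percent change; for example, for ACC 211 in Fall 2015:

\[
64 - 56 = 8 \ \text{percentage points}
\]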

Previous action(s) to improve program goal:
Most recent results / results improved: Results are the same as in the last few years for ACC 211. There is a 1% increase in results for ACC 212, but the increase is not significant, as not all campuses participated in SLO assessments.
Assessed: Annually

Maintain course success rates for upper-level accounting courses

NOVA OIR Grade Distribution Reports by Course (the Fall 2017 report is not available).

ACC 219 Gov’t and Not-for-Profit Accounting ACC 220 Accounting for Small Business

Results for the past 5 years: Maintain course success rates for upper-level accounting courses
Academic Year | Number of Graduates | Percentage Increased
2017 | 103 | -22
2016 | 131 | 18
2015 | 111 | No change
2014 | 111 | No change
2013 | 111 | No change
Target Met: [x] No
Assessed: Annually



Annual Planning and Evaluation Report: 2017-2018 Administration of Justice, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver excellent in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: The Administration of Justice program offers academic opportunities to students desiring to enter various phases of the Criminal Justice System and security-related fields.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

CLO Critical Thinking (CT): Students will demonstrate the ability to evaluate evidence carefully and apply reasoning to decide what to believe and how to act. The CT proficiencies identified for the paper were the ability to:
2.1 discriminate among degrees of credibility, accuracy, and reliability of inferences drawn from given data;
2.2 recognize parallels, assumptions, or presuppositions in any given source of information;
2.3 evaluate the strengths and relevance of arguments on a particular question or issue;
2.4 weigh evidence and decide if generalizations or conclusions based on the given data are warranted.

Organized Crime (OC) ADJ 216

In Fall 2017, students were required to write a research paper incorporating critical thinking skills from CLO CT subsections 2.1, 2.2, 2.3, and 2.4. The subsections were correlated to a rubric that addressed eight attributes of organized crime (see Attachment 1 for the paper directions and rubric). Students analyzed each attribute using the CLO CT subsections, and professors graded according to the rubric provided as Attachment 1 to this report. The attributes of OC correlated to CT 2.1, 2.2, 2.3, and 2.4 were as follows:
1. OC has no political goals, seeking only money
2. OC is hierarchical
3. OC has limited or exclusive membership
4. OC constitutes a unique subculture
5. OC perpetuates itself
6. OC exhibits a willingness to use illegal violence
7. OC is monopolistic
8. OC is governed by explicit rules and regulations

Sample size: (Specify N/A where not offered)

Campus/Modality | # Sections Offered | # Sections Assessed | # Students Assessed
AL | N/A | N/A | N/A
AN | 1 | 1 | 13
MA | 1 | 1 | 14
ME | N/A | N/A | N/A
LO | N/A | N/A | N/A
WO | 1 | 0 | 0
DE* | N/A | N/A | N/A
ELI | N/A | N/A | N/A
Total | 3 | 2 | 27

*Dual-enrollment

Semester/year data collected: Fall 2017
Number of sections: 2
Campuses: AN, MA
Enrolled number of students: 29
Current results improved, if applicable: [ ] Yes [ ] No [ ] Partially; not applicable (see note below)
Target: 80% success rate for each subsection correlated with each attribute
Results: Overall 86% success rate; see individual results per attribute. Professors individually graded each attribute (see rubric in attachment). Overall success rates reflect averages across campuses, rather than results by individual campus.
Results by SLO Criteria:

Attribute | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8
Introduction | 86 | 86 | 86 | 76 | 86 | 86 | 86 | 86
Deduction | 86 | 86 | 86 | 76 | 86 | 86 | 86 | 86
Analysis | 86 | 86 | 86 | 76 | 86 | 86 | 86 | 86
Inference | 86 | 86 | 86 | 76 | 86 | 86 | 86 | 86
Evaluation | 86 | 86 | 86 | 76 | 86 | 86 | 86 | 86
Total | 86 | 86 | 86 | 76 | 86 | 86 | 86 | 86

This is the first time the CLO was piloted, so there is no clear comparison with previous reports, which used multiple-choice questions. This report will serve as a comparison when the CLO is reassessed by 2021. Overall, the critical thinking subsections correlated to each attribute were successfully measured, with 86% of students earning overall grades of A or B. Students who did not complete the assignment received an F, and those who minimally attended classes and/or minimally answered the essays (see rubric) received a D. Therefore, the results are not deemed substantively affected by anyone below a C grade.

Overall, the performance met expectations of the CLO. This was a pilot assessment for CLO CT. Research papers were used; however, essays might have been easier to assess. The ADJ faculty discipline believes that an essay specific to the CT subsections, with a limited set of the "attributes" used in this pilot study, might have provided an easier overall assessment of CT. Research papers are more comprehensive than essays and involve additional areas that factor into the overall grades, whereas an essay can be more closely tailored to the specific learning objectives within the CLO. The ADJ faculty discipline would revise the CT subsections of this CLO for papers written in the future, as they would correlate to different learning objectives for other courses. This course is not offered online, although several proposals have been sent to ELI; it has not yet been selected to be offered online. Once NOVA Online approves this course, a future assessment can be reviewed. This course is not offered in DE. This CLO will be re-assessed again by 2021.


A rubric (Attachment A) identified the completeness of answers to each attribute to assign a grade of A, B, C, D, or F. Survey score range assessments:
A: Excellent (90-100%)
B: Very Good (80-89%)
C: Average (70-79%)
D: Below Average (below 70%)
F: Failure

Grade distribution: A = 60%, B = 26%, C = 6%, D = 4%, F = 4%
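Read together with the results narrative above, which counts grades of A and B as success, the reported 86% overall success rate is consistent with this distribution; a quick check under that reading:

\[
60\% + 26\% = 86\%
\]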

Explain basic investigative concepts relating to crime.

Criminal Investigations ADJ 236

In Fall 2017, students were given a multiple-choice exam testing key basic investigative concepts relating to crime. ADJ Cluster members from participating campuses prepared input, and the questions are reflected in Attachment 2. The rubric was as indicated below (see Attachment B). Survey score range assessments:
Excellent: 90-100%
Very Good: 80-89%
Average: 70-79%
Below Average: below 70%

Sample size: (Specify N/A where not offered)

Campus/Modality | # Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 1 | 5
AN | 2 | 1 | 23
MA | 1 | 1 | 13
ME | N/A | N/A | N/A
LO | N/A | N/A | N/A
WO | 1 | 0 | 0
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 5 | 3 | 41

*Dual-enrollment

Semester/year data collected: Fall 2017
Number of sections: 3
Campuses: AL, AN, WO, ELI
Total number of students: 41
Current results improved, if applicable: [ ] Yes [ ] No [ ] Partially; not applicable, see notes below*
Target: 70% success rate for each question
Results: Overall 84% success rate; see individual results correct per question below.*
*Overall success rates reflect averages across campuses, rather than results by individual campus. Due to changes in the questions, individual questions cannot be compared with a previous assessment, but they will be in the next assessment. Considering that it is statistically highly improbable that every area will be mastered by every student, results are considered strong in each area required by this course.
Q1. 90% (evidence)
Q2. 90% (evidence)
Q3. 75% (evidence, forensics)
Q4. 75% (law)
Q5. 75% (trial procedure)
Q6. 90% (evidence, forensics)
Q7. 80% (evidence, forensics)
Q8. 90% (law, standard of proof)
Q9. 87% (evidence, forensics)
Q10. 90% (surveillance)
Q11. 75% (informants)
Q12. 85% (eyewitness id)
Q13. 85% (witness id)
Q14. 85% (undercover operations)
Q15. 85% (sources)

Overall, the performance met expectations of the SLO, with an overall average of 84%. The four weakest questions, averaging 75% each, were questions 3, 4, 5, and 11. Beginning in Fall 2018, faculty will emphasize these areas in pedagogy such as PowerPoint presentations, lecture, and hands-on exercises. A review showed that the one ELI section and the on-campus results were comparable; the results were merged for this report prior to receipt of the new checklist but will be addressed separately in future APERs. This course is not offered in DE. Overall, faculty believe the assessment questions were fair reflections of the material covered for this SLO. ADJ faculty assessed this SLO in 2016-17 with an overall 72% success rate. As stated in the previous assessment, changes were made in pedagogy by faculty on each campus, and the number of questions was increased from 5 to 15 relative to the previous assessment. The ADJ Dean did not participate in the Faculty Clusters nor provide any input to the changes. ADJ faculty believe the additional questions more comprehensively reflect the student learning objectives. The rate for this assessment increased, and faculty believe this was due to the changes made in instruction delivery and the emphasis in PowerPoints. Faculty also recognize that it is sometimes difficult to compare student audiences from one semester to another; this assessment did not include a demographic comparison of the students, and it is unknown whether other such factors contribute to differences in the target success rates. This SLO will be re-assessed by Fall 2021.


Name and describe the steps in the criminal adjudication process that include the police, courts and corrections.

Survey of Criminal Justice System, ADJ 100. In Spring 2018, students were given multiple-choice questions at the end of the semester that tested key investigative concepts relating to the steps in the criminal adjudication process involving the police, courts, and corrections. The concepts in this SLO are to be "introduced" (i.e., not "mastered") to the students as learning objectives. Cluster members from participating campuses provided input in creating the questions (see Attachment 3 to this report). The rubric was as indicated below (see Attachment C). Survey score range assessments:
Excellent: 90-100%
Very Good: 80-89%
Average: 70-79%
Below Average: below 70%

Sample size: (Specify N/A where not offered)

Campus/Modality | # Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 1 | 22
AN | 3 | 3 | 82
MA | 2 | 2 | 30
ME | N/A | N/A | N/A
LO | N/A | N/A | N/A
WO | 2 | 2 | 25
ELI | 2 | 0 | 0
DE* | 1 | 0 | 0
Total | 11 | 8 | 159

*Dual-enrollment

Semester/year data collected: Spring 2018
Number of sections: 8
Campuses: AL, AN, MA, WO
Number of students: 159
Current results improved, if applicable: [ ] Yes [ ] No [ ] Partially; not applicable, see notes below*
Target: 80% success rate for each question
Results: Overall 81% success rate; see percentage results correct per question below.*
*Overall success rates reflect averages across campuses, rather than results by individual campus. Due to changes in the questions, individual questions cannot be compared with a previous assessment, but they will be in the next assessment. Considering that it is statistically highly improbable that every area will be mastered by every student, results are considered strong in each area required by this course.
Q1. 80% (CJS Goals)
Q2. 85% (felonies)
Q3. 80% (Justice Fairness)
Q4. 75% (Federal Gov't powers)
Q5. 80% (Role of Homeland Security Agencies)
Q6. 75% (Adjudication)
Q7. 85% (Arrest)
Q8. 85% (Plea bargaining)
Q9. 80% (due process)
Q10. 80% (Probation)
Q11. 75% (Initial appearance)
Q12. 80% (CJS components)
Q13. 83% (Grand Jury)
Q14. 80% (Pleas)
Q15. 70% (Federalism)

Performance met the overall expectations of the SLO, and this SLO was considered successfully assessed. Questions falling below the 80% target (questions 4, 6, 11, and 15) were deemed appropriately worded. These areas will be emphasized more in pedagogy delivery (such as additional PowerPoints and classroom exercises) to raise their individual results to the 80% target. This SLO will be re-assessed by Fall 2020, and the weaker questions from this assessment will be included again for comparison. The questions were changed from the 2016-17 assessment by using the published test banks across campuses and are believed to represent a more comprehensive approach to assessing this SLO. Due to the dates when the questions were received, a pretest was not conducted for an additional comparison. ELI and DE sections were not included; those sections had already formatted their lesson plans and/or used different textbooks and test banks. Future assessments will be more timely and consistent with test banks to ensure all appropriate sections are included.

Describe how crime is measured by means of the Uniform Crime Report (UCR), National Crime Victimization Survey (NCVS), Self-Report data, and the National Incident-Based Reporting System (NIBRS).

Survey of Criminal Justice System ADJ 100


In Spring 2018, students were given multiple-choice questions at the end of the semester that tested how crime is measured by means of the UCR, NCVS, self-reporting surveys, and NIBRS. The concepts in this SLO are to be "introduced" to the students as learning objectives. Cluster members from participating campuses provided input in creating the questions (see the second part of Attachment 3). The rubric was as indicated below:
Excellent: 90-100%
Very Good: 80-89%
Average: 70-79%
Below Average: below 70%

Sample size: (Specify N/A where not offered)

Campus/Modality | # Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 1 | 22
AN | 3 | 3 | 82
MA | 2 | 2 | 30
ME | N/A | N/A | N/A
LO | N/A | N/A | N/A
WO | 2 | 2 | 46
ELI | 2 | 0 | 0
DE* | 1 | 0 | 0
Total | 11 | 8 | 180

*Dual-enrollment

Semester/year data collected: Spring 2018
Number of sections: 8
Campuses: AL, AN, MA, WO
Number of students: 180
Current results improved, if applicable: [ ] Yes [ ] No [ ] Partially; not applicable, see comments below*
Target: 80% success rate for each question
Results: Overall 81% success rate; see percentage results correct per question below.*
*Overall success rates reflect averages across campuses, rather than results by individual campus. Due to changes in the questions, individual questions cannot be compared to a previous assessment but will be in the next assessment as appropriate. Considering that it is statistically highly improbable that every area will be mastered by every student, results are considered strong in each area required by this course.
Q1. 90% (UCR general)
Q2. 90% (UCR general)
Q3. 75% (Self Reporting data)
Q4. 75% (Cohort study)
Q5. 75% (Victimization survey)
Q6. 90% (UCR Index crimes)
Q7. 80% (UCR agency)
Q8. 90% (UCR principles)
Q9. 87% (UCR general)
Q10. 90% (UCR sources)
Q11. 75% (NIBRS)
Q12. 85% (UCR Part II)
Q13. 85% (UCR application)
Q14. 85% (NIBRS application)
Q15. 85% (Self-report surveys)

Performance met expectations of this SLO, and the SLO was considered successfully assessed. ELI and DE were not included in the assessment because DE classes were not identified to some campuses in a timely fashion, and ELI sections were inadvertently not included in time to incorporate the assessment into their lesson plan formats. This SLO was reassessed from the 2016-17 APER. The changes in pedagogy recommended in that assessment were emphasized for previously identified weak questions. Pretests were not conducted in time for a fair comparison of change within the curriculum, but overall the SLO concepts were successfully assessed as learned. The discipline map requires these concepts to be "introduced," not mastered. The combined percentage success of all questions met the overall expectations of the SLO, as did each individual question (each assessed above 70%). Faculty may consider raising the target expectation to 80% for each question in a future assessment, expected by Spring 2020.

Program Goals | Evaluation Methods | Assessment Results | Use of Results

Goal: To increase the number of ADJ Certificates.

OIR Reports: 2013-2018 College Graduates by Curriculum and Award Type

ADJ Certificates awarded:
2017-18: 16
2016-17: 17
2015-16: 10
2014-15: 17
2013-14: 28

Target: to increase the total number by 5%. Target not met: the total decreased by roughly 5%.

Goal not met. While successful enrollments continue across approximately 18 different ADJ courses, with sections distributed among all campuses and ELI each semester, ADJ faculty believe the decline is comparable to the overall enrollment decline within the VCCS. The decline



may be due to economic trends outside the control of marketing efforts by the program and/or college. The ADJ program continues, as it has for the past decade, to market itself through outreach at Fairfax County high school college-awareness sessions. The ADJ Certificate is earned if the ADJ A.A.S. degree is earned. ADJ A.A.S. graduates for 2018 dropped from 81 to 55; the decline in graduates is greater than the 5% drop for this certificate, and faculty believe the certificate decline is insignificant in comparison to the enrollment drop. The ADJ A.S. has now been approved and implemented as of Spring 2018. It is possible that students enrolling in the A.S. degree pathway are focusing on the pathway courses that will transfer to the five major four-year transfer institutions (GMU, JMU, ODU, VCU, and Radford). Some of the courses in the A.A.S. degree (e.g., ADJ 212, ADJ 216) are not part of the guided pathways and may have an impact on student enrollment in courses needed for this certificate. The ADJ faculty discipline committee and pathway council may consider in 2018-19 aligning the certificate's courses with both the ADJ A.A.S. and ADJ A.S. degrees. Any changes will be properly routed through the NOVA offices having purview over ADJ academic and counseling services. This goal will be reassessed by Fall 2021.

Goal: To increase the number of National Security ADJ Career Study Certificates (CSC).

OIR Reports: 2013-2018 College Graduates by Curriculum and Award Type

ADJ National Security CSC awarded:
2017-18: 5
2016-17: 4
2015-16: 2
2014-15: 4
2013-14: 6

Target: to increase the total number by 5%. Target met.

This target was met. Generally, enrollments remain successful across approximately 18 different ADJ courses, with sections offered on all campuses and through ELI. The National



Security CSC is appropriate in the Washington, DC metropolitan area for obvious reasons. Despite some decline in overall ADJ graduates, as reflected in the goal above, even a 5% increase suggests students are still pursuing this CSC. The ADJ faculty discipline committee, steering committee, and pathway council may consider, during the 2019-2020 timeframe, proposals to offer at least some of these national security CSC courses through NOVA Online. If so, such proposals will be properly routed through the NOVA offices having purview over the ADJ discipline. As determined by the previous assessment, faculty marketed and will continue to market this CSC via liaison and fliers properly routed through NOVA protocol. Many of the courses in this CSC are identical to, and fully accepted as equivalent to, those taught at GMU. With the new pathway council, steering committee, and discipline committee, faculty can share and market the CSC by word of mouth or other methods chosen by those forums. This goal will be reassessed by Fall 2021.


Annual Planning and Evaluation Report: 2017-2018

Air Conditioning and Refrigeration, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: This curriculum is designed to prepare students for jobs in the air conditioning and refrigeration field. The second year provides students with skills that lead to leadership positions in the HVACR industry. Occupational objectives include industry licensing, advanced critical thinking skills, and state tradesman licenses in HVACR.

Student Learning Outcomes   Evaluation Methods   Assessment Results   Use of Results

Technical knowledge and capabilities: Design, install, maintain, and repair a basic residential air conditioning and heating system.

Psychrometrics and Heat Load Calculations AIR 207
Direct Measure: The assessment is administered as part of the final exam via an SLO questionnaire that students complete on the last day of class. Assessment scale is 0-100%. A summary of questions is attached. Questions: 1) Heat load calculations; 2) Oversized equipment; 3) Calculating R-value; 4) Calculating U-value; 5) Calculating HTM; 6) Heat gain; 7) Choosing outside design conditions; 8) Calculating heat gain HTM; 9) Calculating heat loss; 10) Sensible heat gain and latent heat gain (sensible heat gain causes the air temperature in the house to increase).

Sample size (N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
WO only           1                          1                     13
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             1                          1                     13

*Dual-enrollment

Semester/year data collected: Spring 2018; 1 of 1 sections (WO only), 13 students total. No DE or online sections. Target: students will score 80% or higher on each criterion as well as on the overall score.

AIR 207
Question   2015-2016   2016-2017   2017-2018
Q1         86          68          92
Q2         86          95          92
Q3         85          97          31
Q4         88          63          62
Q5         87          39          15
Q6         88          84          23
Q7         88          42          54
Q8         89          93          8
Q9         86          83          15
Q10        86          91          77
Overall    87%         75.5%       46.9%
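The overall row appears to be the simple mean of the ten per-question scores; for example, the 2017-2018 values above average to 46.9%. A minimal sketch of that calculation:

```python
# 2017-2018 per-question average scores for AIR 207, taken from the table above.
scores_2017_18 = [92, 92, 31, 62, 15, 23, 54, 8, 15, 77]

# Overall SLO score: unweighted mean of the per-question scores.
overall = sum(scores_2017_18) / len(scores_2017_18)
print(f"Overall 2017-2018 score: {overall:.1f}%")  # 46.9%
```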

Results: overall SLO score average of 46.9%; target SLO score average of 80%. Compared to previous results, the scores are very different from previous semesters. The target score was exceeded for questions 1 and 2 and was nearly met for question 10, but it was not met for questions 3 through 9. This may be due to the way the data were collected: the SLO quiz was given to students as an extra-credit assignment, so students had little incentive to finish it or to actually work the problems. In addition, the course was taught by a first-year, first-semester instructor, and that lack of teaching experience likely depressed these SLO results.

Target Met: No. Results of this assessment were discussed with AIR faculty, especially the instructor currently teaching this class. Action taken: Spring 2018. Due to a change in leadership in the AIR program and a new instructor for AIR 207, there was a learning curve for SLO assessment; as a result, the SLOs were not assessed the same way they had been in the past. Based on analysis of these results, the program plans to take the following actions: 1. Clearly convey the SLO standards to the instructor. 2. Consider a preliminary assessment to determine student learning or progression. 3. Continue to assess and compare results to determine what actions need to be taken in the future. Planned actions for Fall 2018 and Spring 2019: 1. Assess this SLO in the same manner during the 2018-19 academic year to compare results. 2. Partner with the Professional Advising Committee


to seek constructive feedback on the methodology and questions of the SLO assessment. 3. Continue to assess and compare results to determine what actions need to be taken in the future. Next assessment: a follow-up assessment will be administered in the Fall 2018 and Spring 2019 semesters.

Problem prevention and solutions: Analyze an HVACR system’s current operation, evaluate its ability to adequately provide for the equipment owner’s comfort requirements, and if necessary, formulate a strategy to correct any deficiencies.

Advanced Troubleshooting AIR 238
Direct Measure: Students were assessed on all 10 SLO points throughout this course, every semester and every class. The methodology is not a single assessment but a comprehensive set of ongoing assessments: written quizzes, hands-on lab tasks and exercises, midterm and final exams, and a final SLO questionnaire that students complete on the last day of classes. Assessment scale is 0-100%. A summary of questions is attached. Questions: 1) Calculating friction rate; 2) Calculating the size of a metal duct; 3) Calculating friction rate for a static pressure drop; 4) Calculating the size of a metal duct; 5) Calculating velocity; 6) Calculating static pressure in different systems; 7) Understanding standard static pressures; 8) Understanding recommended face velocity; 9) Choosing a supply register; 10) Understanding noise criteria curves.

Sample size (N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
WO only           2                          2                     23
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             2                          2                     23

*Dual-enrollment

Semester/year data collected: Fall 2017 (AIR 238, 1 of 1 sections, WO campus only, 13 students) and Spring 2018 (AIR 238, 1 of 1 sections, WO campus only, 10 students). Target: students will score 80% or higher on each criterion as well as on the overall score. Results by SLO criteria:

AIR 238
Question   2015-2016   2016-2017   2017-2018
Q1         86          78          100
Q2         86          50          96
Q3         89          56          96
Q4         89          39          91
Q5         87          72          83
Q6         87          39          87
Q7         86          56          100
Q8         86          83          74
Q9         85          83          96
Q10        86          56          78
Overall    86%         61.2%       90.1%

Results: overall SLO score average of 90.1%; target SLO score average of 80%. The target was met. Compared to previous results, the scores are very different from previous semesters: the target score was exceeded for the majority of questions, and questions 8 and 10 fell short but were close. Strengths: the instructor changed the instructional delivery and surpassed last year's results. Weakness: question 10 should be addressed by the instructor.

Target: Met. The results of this assessment were discussed with AIR faculty. Action taken: Spring 2018. There were several changes after last year: the instructor added more hands-on activities and lab tasks to supplement the lecture. Based on analysis of these results, the program plans to take the following actions: 1. Maintain a good ratio of sections assessed to total sections offered in each class. 2. Consider reassessing earlier questions (especially question 10) at the time of the final exam to determine whether learning outcomes meet target scores. 3. Continue to assess and compare results to determine what actions need to be taken in the future. Planned actions for Fall 2018 and Spring 2019: 1. Assess this SLO in the same manner during the 2018-19 academic year to compare results. 2. Partner with the Professional Advising Committee



to seek constructive feedback on the methodology and questions of the SLO assessment. 3. Continue to assess and compare results to determine what actions need to be taken in the future. Next assessment: a follow-up assessment will be administered in the Fall 2018 and Spring 2019 semesters.

Presentation, manners, reliability and safety: Demonstrate the ability to choose attire appropriate to the HVACR customers being served, utilize appropriate manners that will please a diverse population of possible customers, and demonstrate they may be relied upon to perform required duties properly, safely and on time.

Principles of Refrigeration AIR 121
Direct Measure: Students were assessed at the final exam. Assessment scale is 0-100%, graded pass or fail. There is no written test in this category; the criteria were assessed, pass or fail, in conjunction with several technical tasks on the final exam. Summary of the task: students must perform the 10 steps of hooking up the gauges with good presentation and appropriate manners, work safely according to EPA regulations, and demonstrate that they can be relied upon to perform the task.

Semester/year data collected: Fall 2017 (AIR 121, 4 of 4 sections, WO campus only, 68 students) and Spring 2018 (AIR 121, 2 of 2 sections, WO campus only, 27 students). There was a dual-enrollment section during the fall term. Target: students will score 90% or higher on each criterion as well as on the overall score.

AIR 121
Criterion      2017-2018
Presentation   97%
Manners        100%
Safety         100%
Reliability    93%
Overall        97.5%

Results: overall SLO score average of 97.5%. Strengths by criterion: every student understood the importance of safety, and every student who passed the course scored 100% on that criterion; students who complete the class leave as technicians ready for the job market. Weaknesses by criterion: not every student makes it through the class, and therefore through the program.

SLO data gathering changed after administrators asked instructors to assess this SLO by criterion rather than through a written SLO test. Target Met: Yes. The results of this assessment were discussed with AIR faculty. Areas needing improvement center on the dual-enrollment students: three students did not meet the standards and were not allowed back into the program. PWCS was contacted, and a screening method was implemented to support student success. Based on analysis of these results, the program plans to take the following actions: 1. Maintain a good ratio of sections assessed to total sections offered in each class. 2. Work with PWCS administration on the screening process implemented for student selection to support program success. 3. Continue to assess and compare results to determine


what actions need to be taken in the future. Next assessment: a follow-up assessment will be administered in the Fall 2018 and Spring 2019 semesters.

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Students will demonstrate quantitative reasoning skills by applying basic math skills. [ X ] QR

Principles of Refrigeration I AIR 121
Direct Measure: Students were assessed at the final exam. Assessment scale is 0-100% (pass or fail). Questions: 67) Calculating superheat; 68) Calculating subcooling.

Sample size (N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
WO only           6                          2                     22
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             6                          2                     22

*Dual-enrollment

Semester/year data collected: Fall 2017 (1 of 4 sections, WO campus only, 10 students who took the final exam) and Spring 2018 (1 of 2 sections, WO campus only, 12 students who took the final exam). Target: students will score 80% or higher on each criterion as well as on the overall score. Results by SLO:

AIR 121
Question   2017-2018
Q67        86.36%
Q68        81.81%
Overall    84.09%

Results: overall SLO score average of 84.09%; target SLO score average of 80%. The target score was met and was exceeded for both questions. Superheat and subcooling are extremely important subjects for students to master in the HVAC field. Strengths: the instructor made the effort to convey the importance of the topic, and students understood it and used quantitative reasoning to solve the questions. An improvement for next year will be to ask the same questions in fill-in-the-blank rather than multiple-choice format.
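For context, superheat and subcooling are simple temperature differences read from gauge and line measurements; the sketch below illustrates the kind of calculation these questions assess, using made-up readings rather than anything from the actual exam:

```python
# Superheat: measured suction-line vapor temperature minus the saturation temperature
# at the measured suction pressure (all values in degrees F; illustrative readings only).
suction_line_temp = 55.0
saturation_temp_at_suction_pressure = 45.0
superheat = suction_line_temp - saturation_temp_at_suction_pressure  # 10 F of superheat

# Subcooling: saturation temperature at the measured liquid-line pressure minus the
# measured liquid-line temperature.
saturation_temp_at_liquid_pressure = 110.0
liquid_line_temp = 100.0
subcooling = saturation_temp_at_liquid_pressure - liquid_line_temp  # 10 F of subcooling

print(superheat, subcooling)
```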

Previous action(s) to improve CLO, if applicable: N/A. Target Met: Yes. Current actions to improve CLO based on the results: the Program Head is collecting final exams to obtain a better data sample for next year's report. Next assessment of this CLO: a class better suited to assessing quantitative reasoning will be identified, and better data will be gathered for next year's report.

Program Goals Evaluation Methods Assessment Results Use of Results

Increase graduation totals.

Track annual awarding of degrees and certificates. NVCC Office of Institutional Research reports show the number of graduates. (Using Table 2.6 from the Fact Book.)

AIR program’s Degrees and Certificates have more graduates than the previous year.

AIR Program    2017/18   2016/17   2015/16   2014/15   2013/14
A.A.S.         23        18        21        33        28
Certificate    17        19        19        18        24
C.S.C.         28        17        21        21        23

Awards are partially down because enrollment has been decreasing. We are working closely to increase enrollment and bring these numbers up. An unemployment center and Veterans Affairs have been contacted to help



enroll individuals who are trying to change careers. Assessed: annually.

Increase the number of students in the AIR program.

Track overall number of students by both enrollment in courses and by head count. Data from Enrollment Report by Division in SIS.

AIR degree student population by course enrollment and by head count.

AIR Program, Fall Semester         2018   2017   2016   2015   2014
Total Registered (Unduplicated)    176    196    228    209    235
Total Enrollments (Duplicated)     382    342    403    373    420

Due to economic conditions, enrollment is down college-wide and especially in the HVAC-R program. Another major factor is the loss of 50% of full-time faculty (three full-time HVAC-R instructors), along with constant changes in program leadership and administration. This hurt the quality of instruction, which had a direct impact on enrollment and on students continuing to finish a CSC, certificate, and/or A.A.S. degree. To improve enrollment, during Summer 2018 the program began working with local high schools to recruit more dual-enrollment students and working hand-in-hand with Veterans Affairs and the local unemployment office to generate more registrations. In addition, we are in the process of hiring two full-time faculty by Fall 2019. Assessed: annually.


Annual Planning and Evaluation Report: 2017-2018

American Sign Language to English Interpretation, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: Designed for students who have limited, if any, previous experience with interpreting for Deaf people, this degree program provides the comprehensive training in theory and practical interpreting skills necessary for employment as an educational or community interpreter. Successful completion of this program prepares the student to pursue either a Virginia Quality Assurance Screening level or national certification through the Registry of Interpreters for the Deaf or the Educational Interpreter Performance Assessment. These credentials qualify the student to interpret in either educational or community settings.

Student Learning Outcomes   Evaluation Methods   Assessment Results   Use of Results

Students will interpret a 20-minute live and videotaped segment of American Sign Language into English with 80% accuracy.

Simultaneous Interpreting, ASL to English INT 233
Direct Measure: A 20-minute live and videotaped segment of American Sign Language interpreted into English with 80% accuracy. This SLO is assessed using the final exam grade. The final exam consists of a videotaped selection that students have never seen before; students videotape themselves providing an interpretation. Students are required to voice a 20-minute story with a familiar signer and are evaluated on English grammar, appropriate word choice, dynamic equivalence, error recovery, processing time, fingerspelling and number comprehension, voice quality, deletions/additions/substitutions, and mannerisms. The grading rubric is attached.

Sample size (N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AN only           1                          1                     9
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 80% of students will score 80% or higher on each criterion as well as on the overall score for the course.

Overall results (final grades):
Grade   Spring 2018   Spring 2017
A       2 students    3 students
B       4 students    5 students
C       2 students    3 students
D       1 student     1 student
F       0             0

Current results improved: [ ] Yes [X] No [ ] Partially

Results by SLO criteria (number of students per score band):

Parameter                 Year        90-100%   80-89%   70-79%   60-69%   Below 59%
English grammar           2017-2018   0         0        1        3        5
                          2016-2017   4         1        3        0        4
Appropriate Word Choice   2017-2018   0         3        1        3        2
                          2016-2017   2         4        1        4        1
Dynamic Equivalence       2017-2018   0         1        2        2        4
                          2016-2017   0         7        2        1        2
Error Recovery            2017-2018   0         1        2        2        4
                          2016-2017   0         7        1        4        0

Previous action(s) to improve SLO: In Spring 2018 the instructor focused on grammar and word choice/ equivalent meaning in an effort to balance out the parameters and assist in overall achievement. Target Met: [ ] Yes [ X ] No [ ] Partially Based on recent results, areas needing improvement: All areas need improvement since none of the parameters met their target. However, the parameters that will have the most impact on the students’ product continues to be English Grammar, Appropriate Word Choice, and Dynamic Equivalence. Current actions to improve SLO based on the results: The instructor has provided several practice/analysis opportunities for students throughout the semester in years past. However, these results prove that even more opportunities are necessary to guide students towards mastery of this skill. Therefore, instead of 8 assignments throughout the semester, the instructor will add two more guided assignments to help students achieve this goal. In addition, further self-analysis activities will be included to assist students in understanding the importance of these skills. Next assessment of this SLO: This SLO will be assessed in the Spring of 2019.


Parameter                           Year        90-100%   80-89%   70-79%   60-69%   Below 59%
Processing time                     2017-2018   0         3        1        2        3
                                    2016-2017   1         4        2        2        3
Fingerspelling/Numbers              2017-2018   1         1        2        0        5
                                    2016-2017   1         5        1        3        2
Voice quality                       2017-2018   0         4        0        3        2
                                    2016-2017   4         3        2        2        1
Deletions/additions/substitutions   2017-2018   0         3        1        2        3
                                    2016-2017   3         3        2        2        2
Mannerisms/fillers                  2017-2018   0         4        1        2        2
                                    2016-2017   2         3        4        1        2

Current results improved: [ ] Yes [ ] No [ ] Partially
Strengths by criterion/question/topic: The one parameter that showed improvement was Mannerisms/fillers; this was a particular issue with this group of students, and time was spent correcting it. However, only 66% of the students achieved the overall target for the course.
Weaknesses by criterion/question/topic: None of the parameter targets were met this evaluation period. More work is needed in all areas.

Students will transliterate a 20-minute live and videotaped segment of Contact Sign into English.

Transliteration INT 141 Direct Measure: Students are required to voice a 20-minute story with a familiar signer and they are evaluated on their English grammar, appropriate word choice, dynamic equivalence, processing times, fingerspelling and number comprehension. The assessment rubric is attached. Sample Size: (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AN only           1                          1                     10
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A

*Dual-enrollment

Semester/year data collected: Summer 2017
Target: 80% of students will score 80% or higher overall and on each criterion.

Final grades:
Summer 2017: 90-100%: 2; 80-89%: 7; 70-79%: 0; 60-69%: 0; below 59%: 1
Summer 2015: A: 90, 92; B: 89, 88; C: 77, 76, 78, 78, 74; D: 60

Current results improved: [X] Yes [ ] No [ ] Partially

Previous action(s) to improve SLO: In Summer 2015 the target was not achieved for any of the parameters, and as a result the following actions were taken: 1. Since English grammar is also covered in INT 105, a class students take earlier in the program, the instructor for that course altered its curriculum to reinforce the section focusing on grammar. 2. The second area of concern was dynamic equivalence, which is the focus of INT 107, another course taken prior to INT 141. The instructor for that course included a more intensive unit on dynamic


Results by SLO criteria (number of students per score band):

Parameter                 Year        90-100%   80-89%   70-79%   60-69%   Below 59%
English grammar           2017-2018   2         3        2        2        1
                          2015-2016   0         1        3        1        5
Appropriate Word Choice   2017-2018   3         6        1        0        1
                          2015-2016   1         3        2        3        1
Dynamic Equivalence       2017-2018   2         7        0        0        2
                          2015-2016   2         2        0        1        4
Processing Time           2017-2018   0         4        0        5        2
                          2015-2016   0         2        3        2        3
Fingerspelling/Numbers    2017-2018   5         2        0        3        1
                          2015-2016   1         3        1        3        2

Current results improved: [X] Yes [ ] No [ ] Partially
Strengths by criterion/question/topic: All parameters demonstrated improvement since the last time this SLO was assessed. The target was achieved for Dynamic Equivalence and Appropriate Word Choice. English Grammar demonstrated significant improvement, although the target was not yet achieved: in 2015-16 only 11% of students achieved this goal, while in 2017-18 45% did.
Weaknesses by criterion/question/topic: Processing Time and Fingerspelling/Numbers both demonstrated increased success rates, but the target was not achieved. These will be the next areas of focus, along with a continued focus on English Grammar.

equivalence to ensure that students are ready for INT 141. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: English Grammar, Processing Time and Fingerspelling and Numbers all demonstrated increased success rates but the target was not achieved. Current actions to improve SLO based on the results: While the efforts in INT 105 have provided positive results, more work is needed. The instructor for INT 141 will break down this broad category to determine any patterns in the areas of weakness within this parameter in order to focus efforts. Next assessment of this SLO: This SLO will be assessed in the summer of 2018.

Students will transliterate a 20-minute live and videotaped segment of English into Contact Sign.

Transliteration INT 141
Direct Measure: Students are required to transliterate a 20-minute story presented by a familiar speaker and are evaluated on their PSE grammar, appropriate sign choice,

Semester/year data collected: Summer 2017
Target: 80% of students will score 80% or higher overall and on each parameter.

Previous action(s) to improve SLO: After the 2015-16 assessment, more activities and feedback focusing on PSE grammar, appropriate sign choice, dynamic equivalence, and processing time were introduced earlier in the course.



dynamic equivalence, mouthing, processing time, fingerspelling and number use, fluency and NMS, deletions/additions/substitutions, and mannerisms. Sample Size: (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AN only           1                          1                     10
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A

*Dual-enrollment

Final grades:
Summer 2017: 90-100%: 3; 80-89%: 5; 70-79%: 2; 60-69%: 0; below 59%: 0
Summer 2015: A: 96, 93, 92, 93, 91, 90; B: 87, 86; C: 77; D: none; F: 18

Current results improved: [ ] Yes [X] No [ ] Partially. The target was achieved at the same rate as in Summer 2015.

Results by SLO criteria (number of students per score band):

Parameter                 Year        90-100%   80-89%   70-79%   60-69%   Below 59%
PSE grammar               2017-2018   1         3        2        1        3
                          2015-2016   1         6        1        0        2
Appropriate Sign Choice   2017-2018   3         6        1        0        0
                          2015-2016   1         4        2        0        3
Dynamic Equivalence       2017-2018   3         5        0        0        2
                          2015-2016   1         4        2        0        3
Processing Time           2017-2018   5         3        0        1        1
                          2015-2016   3         4        2        0        1
Mouthing                  2017-2018   3         4        0        0        3
                          2015-2016   6         2        0        1        1
Fingerspelling/Numbers    2017-2018   3         5        0        1        1
                          2015-2016   6         2        0        1        1

Current results improved: [ ] Yes [ ] No [X] Partially
Strengths by criterion/question/topic: The target was achieved for Appropriate Sign Choice, Dynamic Equivalence, Processing Time, and Fingerspelling/Numbers. The first three of these parameters also showed improvement from 2015-16, and fingerspelling

Target Met: [X] Yes [ ] No [ ] Partially. Based on recent results, areas needing improvement: PSE Grammar and Mouthing are the parameters that need more support. Current actions to improve SLO based on the results: the instructor will conduct more detailed assessments to determine exactly which parts of PSE grammar students struggle with, which will allow focused practice in those areas. Since mouthing is a skill introduced in this course, more practice opportunities will be added throughout the course. Next assessment of this SLO: this SLO will be assessed in Summer 2018.


and numbers confirmed achievement but showed no improvement. Weaknesses by criterion/question/topic: Mouthing demonstrated a slight decline in achievement, and the target was not achieved during this assessment year. PSE Grammar also experienced a decline in success, although the target was not achieved in 2015-16 either.

Core Learning Outcome   Evaluation Methods   Assessment Results   Use of Results

CLO: Students will demonstrate the ability to analyze their own interpreting product, identify strengths and at least 2 areas of improvement that will have the greatest impact on their product, and determine a plan for practice outside of the classroom. [ X ] CT

Interpreting in Safe Settings INT 237 Direct Measure: In the Final Exam, students are required to write a reflection paper that analyzes their final video as well as their progress throughout the semester. In addition, students are asked to create a realistic plan for the future after graduation. The questions and grading Rubric are provided. Sample Size: (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AN only           1                          1                     9
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 90% of students will score 80% or higher overall and on each criterion.

Results (final grades, Spring 2018): 90-100%: 7; 80-89%: 1; 70-79%: 1; 60-69%: 0; below 59%: 0

Results by CLO criteria (Spring 2018):
Criterion   Average Score   % of Students > Target
1           96              100
2           96              100
3           96              100
4           91              89
5           91              89
Total       94              95.6

This is the first year that this CLO has been assessed. Strengths by Criterion/ Question/Topic: Students overall were able to critically evaluate their work. This is a skill that is emphasized throughout the program. Weaknesses by Criterion/ Question/Topic: The area that was the weakest related to the students’ ability to apply what they know about themselves to predict what challenges they may face in the working world and their internship.

Previous action(s) to improve CLO if applicable: This is the first time that this CLO has been assessed. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: The program only has two years to teach students a skill that in truth is a life-long learning process. One of the important skills that they must learn is how to look at their work critically, and how to predict what challenges they may face in order to prepare for them. While students were able to critically look at their work, they were not as competent at applying this knowledge to new situations. Current actions to improve CLO based on the results: In INT 237, the instructor will introduce more role playing opportunities to give students an opportunity to analyze future situations and apply what they know about their own work to determine the best path towards success. Next assessment of this CLO: This CLO will be assessed again in 2018-19.

Program Goals Evaluation Methods Assessment Results Use of Results

The program will produce at least 30 FTES (or 15 for the fall semester).

Enrollment and graduation totals will be tracked through OIR data in the fall of 2018.

Target: Number of program-placed students in each degree/certificate will increase by 2 percent.

Previous action(s) to improve program goal:


Results for the past 5 years (fall headcount):
Fall   Number of Students   Percentage Increase
2017   62                   8.8
2016   57                   -3.3
2015   59                   -3.2
2014   61                   -1.6
2013   62                   -
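The percentage column is the year-over-year change in headcount, e.g., (62 − 57) / 57 ≈ 8.8% for Fall 2017; values may differ slightly from the report's figures due to rounding. A minimal sketch of that calculation:

```python
# Fall headcounts from the table above.
headcount = {2013: 62, 2014: 61, 2015: 59, 2016: 57, 2017: 62}

# Year-over-year percentage change: (current - prior) / prior * 100.
years = sorted(headcount)
for prior, current in zip(years, years[1:]):
    change = (headcount[current] - headcount[prior]) / headcount[prior] * 100
    print(f"Fall {current}: {change:+.1f}%")  # e.g., Fall 2017: +8.8%
```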

Target Met: [ X ] Yes [ ] No [ ] Partially

Academic Year   Number of FTES   Percentage Increase
2017-18         31.0             8.8
2016-17         28.5             -7.5
2015-16         33.33            -3.6
2014-15         34.6             26.8
2013-14         27.27            43.3

Target Met: [ X ] Yes [ ] No [ ] Partially

1. Starting in Fall 2015, students who are program-placed in the INT program have been assigned to the Assistant Dean and a full-time faculty member as faculty advisors to help guide them to completion of the program. This has been extremely helpful in tracking students and leading them to graduation.

2. After the success of the Dual Enrollment program in Loudoun County, we have expanded the program to include a second school. The Assistant Dean will be visiting this school to talk with students and encourage them to consider a career in ASL to English Interpretation as an ongoing project.

Most recent results: After years of declining enrollment, we are finally experiencing increases in enrollment within the ASL and INT department. The Program Head has been promoting the program by visiting all of the classes to do group academic advising and to ensure that all students know the degree options available to them. In addition, the Program Head has increased efforts to personally encourage students to declare a major in ASL or INT and to take their degree to completion. Results improved: [ X ] Yes [ ] No [ ] Partially Current action(s) to improve program goal: Regarding the Dual Enrollment classes, the Program Head is working with FCPS in an effort to bring these classes back there as well. We have already updated the placement expectations website for students who take HS ASL classes at FCPS and will continue to investigate Dual Enrollment options in the spring of 2019. Assessed: Annually

The program will graduate at least 12 students each year.

This information is obtained from the OIR Fact Book.

Target: The program will graduate at least 12 students each year.

Results for the past 5 years:
Academic Year   Number of Graduates   Percentage Increase
2017-18         4                     -20
2016-17         5                     -37.5
2015-16         8                     -11
2014-15         9                     0
2013-14         9                     0

Comparison to previous assessment(s): Graduation rates are down by 20%, to our lowest level, despite efforts to convince students to apply for graduation. Students have finished the program but have not applied for graduation.

Target Met: [ ] Yes [ X] No [ ] Partially

Previous action to improve program goal: In an effort to provide more options for students, the department worked hard to finalize an articulation agreement between NOVA and Gallaudet University in Fall 2016. However, during this period of finalization, a large group of students decided to transfer to Gallaudet University and did not complete the internship requirements for the NOVA degree. These students, who entered in 2016, should have graduated in 2018 before transferring to Gallaudet. To prevent this problem in the future, we included in the agreement a preference for students to



graduate from NOVA in order to ensure a seamless transition. Most recent results: Half of these students transferred to Gallaudet University. These students have not applied for graduation because Gallaudet University has not enforced the need for students to transfer in with their AAS completed, per our original agreement. Results improved: [ ] Yes [ X ] No [ ] Partially Current actions to improve program goal: The Program Head is continuing to strongly encourage students to apply for graduation. One suggested strategy is to have students apply during the internship class as a group. This issue is being discussed as part of the Program Review. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Architecture Technology, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: The Architecture curriculum is designed to prepare students for employment. Graduates find employment in the fields of architecture, construction, and urban design, using their construction knowledge, graphic communication, and problem-solving skills. Students must see their architecture advisor to plan for their individual goals.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to describe how site characteristics affect the design and construction of buildings.

Architectural Design and Graphics I & II ARC 231 and ARC 232
Direct Measure: Student learning outcomes were measured by evaluation of projects produced in our capstone course. Projects were evaluated in 4 areas for each SLO on a scale from 1-4 (1 = not demonstrated, 2 = marginally demonstrated, 3 = well demonstrated, 4 = very well demonstrated). See attached Capstone Course Evaluation forms.
a. Documentation of site characteristics.
b. Manipulation of site topography to accommodate the new structure.
c. Attention to solar orientation.
d. Organization of the site.

Sample size (N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL only           1                          1                     7
AN only           1                          1                     6
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             2                          2                     13

*Dual-enrollment

Semester/year data collected: Fall 2017 and Spring 2018. A total of 24 projects were evaluated in December 2017 and May 2018 by two teams, one for each campus, including seven faculty members and professional architects and engineers. The evaluation teams rated the projects at 2.47 for this SLO on a scale of 1-4. Target: the Architecture Cluster has agreed that a target of 2.5 is acceptable for each of the SLOs, with an ultimate goal of 3.0. Results by campus/modality (N/A where not offered):

Results by Campus/Modality:
          Fall 2017                              Fall 2015
Campus    Average Score   Score as % of Target   Average Score   Score as % of Target
AL        2.18            87.2                   2.52            100.8
AN        2.78            111.2                  2.93            117.2
Total     2.47            98.8                   2.72            108.8
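In the campus-level tables the percentage column appears to be the average rubric score expressed as a percent of the 2.5 target (for example, 2.47 / 2.5 = 98.8%), whereas the criteria-level tables report the percent of students at or above the target. A minimal sketch of the first conversion:

```python
# Average capstone rubric scores by campus (Fall 2017 column of the table above).
average_score = {"AL": 2.18, "AN": 2.78, "Total": 2.47}
TARGET = 2.5  # Architecture Cluster target on the 1-4 rubric scale

# Express each average score as a percentage of the target.
for campus, score in average_score.items():
    print(f"{campus}: {score / TARGET * 100:.1f}% of target")  # AL: 87.2, AN: 111.2, Total: 98.8
```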

Results by SLO Criteria:
            Fall 2017                                Fall 2015
Criterion   Average Score   % of Students > Target   Average Score   % of Students > Target
a           2.59            61.5                     2.89            63
b           2.17            46                       2.98            61
c           2.15            46                       2.78            58
d           2.95            61.5                     3.07            64
Total       2.47            53.75                    2.93            61.5

Current results improved: [ ] Yes [ ] No [X] Partially. Criterion d, Organization of the site, was the highest at 2.95; criterion c, Attention to solar orientation, was the lowest at 2.15.

This SLO was not evaluated in 2016-2017. The score of 2.47 is lower than the last evaluation (2015) score of 2.72. AN showed a higher score than AL, with 50% vs. 0% of students scoring at or above the target score of 2.50. Instructors of the site planning course have been informed of the results for further improvements. By breaking down our SLOs into specific criteria for more detailed evaluation, we (the faculty) can now concentrate on the areas that need the most improvement. We have taken into consideration the advice of the Architecture Curriculum Advisory Committee. By measuring the SLOs through evaluation of the capstone courses, the evaluation includes all other relevant courses, making it comprehensive and efficient. Target Met: [ ] Yes [ ] No [X] Partially. Based on recent results, areas needing improvement: criterion b, Manipulation of site topography to accommodate the new structure, and criterion c, Attention to solar orientation.


Current actions to improve SLO based on the results: instructors of the site planning course have been informed of the evaluation results for further improvements. Starting in Spring 2019, we need to make sure that students taking ARC 231 or ARC 232 have already taken the site planning course. Next assessment of this SLO: December 2019. SLOs 4 and 5 will be assessed in December 2018; SLOs 1 and 3 will be assessed in May 2019.

Students will be able to methodically design a building.

Architectural Design and Graphics I & II ARC 231 and ARC 232
Direct Measure: SLO measured by evaluation of projects produced in our capstone course. Projects were evaluated in 4 areas for each SLO on a scale from 1-4 (1 = not demonstrated, 2 = marginally demonstrated, 3 = well demonstrated, 4 = very well demonstrated). See attached Capstone Course Evaluation forms.
a. Demonstrates logical organization of spaces.
b. Clearly communicates horizontal and vertical circulation.
c. Demonstrates an appropriate scale for spaces.
d. Preliminary selection of materials for appearance.

Sample size (N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL only           1                          1                     7
AN only           1                          1                     6
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             2                          2                     13

*Dual-enrollment

Semester/year data collected: Fall 2017 and Spring 2018. A total of 24 projects were evaluated in December 2017 and May 2018 by two teams, one for each campus, including seven faculty members and professional architects and engineers. The evaluation teams rated the projects at 2.75 for this SLO on a scale of 1-4. Target: the Architecture Cluster has agreed that a target of 2.5 is acceptable for each of the SLOs, with an ultimate goal of 3.0. Results by campus/modality (N/A where not offered):

Results by Campus/Modality:
          Fall 2017                              Fall 2015
Campus    Average Score   Score as % of Target   Average Score   Score as % of Target
AL        2.17            86.80                  2.89            115.60
AN        3.34            133.60                 3.53            141.20
Total     2.75            110.20                 3.05            128.4

Results by SLO Criteria:
            Fall 2017                                Fall 2015
Criterion   Average Score   % of Students > Target   Average Score   % of Students > Target
a           2.80            54                       3.34            64.5
b           2.91            61.5                     3.17            70
c           2.85            54                       3.42            65
d           2.46            69                       2.91            81
Total       2.75            59.6                     3.21            70

Current results improved: [ ] Yes [ ] No [ X ] Partially

This SLO was not evaluated in 2016-2017. The score of 2.75 is lower than the last evaluation (2015) score of 3.05. AN showed a higher score than AL. By breaking down our SLOs into specific criteria for more detailed evaluation, we (the Architecture faculty) can now concentrate on the areas that need the most improvement. We have taken into consideration the advice of the Architecture Curriculum Advisory Committee. By measuring the SLOs through evaluation of the capstone courses, the evaluation includes all other relevant courses, making it comprehensive and efficient. Target Met: [X] Yes [ ] No [ ] Partially. Based on recent results, areas needing improvement: criterion d, Preliminary selection of materials for appearance. Current actions to improve SLO based on the results: we will improve this area by providing more specific information on exterior materials in the next project in Spring 2019.


Criterion b, Clearly communicates horizontal and vertical circulation, was the highest (2.91); criterion d, Preliminary selection of materials for appearance, was the lowest (2.46).

Next assessment of this SLO: December 2019. SLOs 4 and 5 will be assessed in December 2018. SLOs 1 and 3 will be assessed in May 2019.

Students will be able to communicate graphically using computer applications.

Architectural Design and Graphics I & II ARC 231 and ARC 232
Direct Measure: Student learning outcomes were measured by evaluation of the projects produced in our capstone course. Projects were evaluated for each SLO on a scale from 1-4 (1 = not demonstrated, 2 = marginally demonstrated, 3 = well demonstrated, 4 = very well demonstrated). See attached Capstone Course Evaluation forms.
a. Project demonstrates the students' competence in using architectural software commonly used in the industry.
b. Project demonstrates the students' ability to organize graphic communication using computer applications.
c. Project demonstrates the students' ability to represent building components using architectural software commonly used in the industry.

Sample size (N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL only           1                          1                     5
AN only           1                          1                     6
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             2                          2                     11

*Dual-enrollment

Semester/year data collected: Fall 2017 and Spring 2018. A total of 24 projects were evaluated in December 2017 and May 2018 by two teams, one for each campus, including seven faculty members and professional architects and engineers. The evaluation teams rated the projects at 3.31 for this SLO on a scale of 1-4. Target: the Architecture Cluster has agreed that a target of 2.5 is acceptable for each of the SLOs, with an ultimate goal of 3.0. Results by campus/modality (N/A where not offered):

Results by Campus/Modality:
          Spring 2018                            Spring 2016
Campus    Average Score   Score as % of Target   Average Score   Score as % of Target
AL        2.78            111.2                  2.93            117.2
AN        3.85            154.0                  3.82            152.8
Total     3.31            132.6                  3.37            135.0

No dual enrollment.

            Spring 2018                              Spring 2016
Criterion   Average Score   % of Students > Target   Average Score   % of Students > Target
a           3.22            100                      3.39            90
b           3.44            77                       3.37            90
c           3.28            92                       3.36            85
Total       3.31            89.6                     3.37            88

Current results improved: [ X ] Yes [ ] No [ ] Partially The result is above the ultimate goal of 3.0.

SLO 7 was not evaluated in 2016-2017. The score of 3.31 is slightly lower than the last evaluation (2016) score of 3.37. AN showed a higher score than AL. By breaking down our SLOs into specific criteria for more detailed evaluation, we (the Architecture faculty) can now concentrate on the areas that need the most improvement. We have taken into consideration the advice of the Architecture Curriculum Advisory Committee. By measuring the SLOs through evaluation of the capstone courses, the evaluation includes all other relevant courses, making it comprehensive and efficient. Target Met: [X] Yes [ ] No [ ] Partially. Based on recent results, areas needing improvement: the result is above the ultimate goal of 3.0. Though the target was exceeded, we will continue to make our CAD courses more challenging and marketable, per the recommendations of the Architecture Curriculum Advisory Committee members. Next assessment of this SLO: May 2020. SLOs 4 and 5 will be assessed in December 2018; SLOs 1 and 3 will be assessed in May 2019.


Core Learning Outcome   Evaluation Methods   Assessment Results   Use of Results

CLO: Students will be able to describe how buildings are constructed. [ X ] CT [ X ] QR

Architectural Design and Graphics I & II ARC 231 and ARC 232
Direct Measure: Measured by evaluation of projects produced in our capstone course. Projects were evaluated in 4 areas for each SLO on a scale from 1-4 (1 = not demonstrated, 2 = marginally demonstrated, 3 = well demonstrated, 4 = very well demonstrated). See attached Capstone Course Evaluation forms.
a. Project demonstrates the students' ability to research building materials and methods.
b. Project demonstrates the students' ability to assemble building components.
c. Project demonstrates the students' ability to design construction details.
d. Project demonstrates the students' ability to graphically communicate construction systems.

Sample size (N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL only           1                          1                     5
AN only           1                          1                     6
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             2                          2                     11

*Dual-enrollment

Semester/year data collected: Fall 2017 and Spring 2018. A total of 24 projects were evaluated in December 2017 and May 2018 by two teams, one for each campus, including seven faculty members and professional architects and engineers. The evaluation teams rated the projects at 3.10 for this CLO on a scale of 1-4. Target: the Architecture Cluster has agreed that a target of 2.5 is acceptable for each of the SLOs, with an ultimate goal of 3.0. Results by campus/modality (N/A where not offered):

Results by Campus/Modality:
          Spring 2018                            Spring 2016
Campus    Average Score   Score as % of Target   Average Score   Score as % of Target
AL        2.50            100                    2.57            102.8
AN        3.71            148.4                  3.46            138.4
Total     3.1             -                      3.0             -

Offered only at AL and AN.

Results by CLO Criteria:

            Spring 2018                              Spring 2016
Criterion   Average Score   % of Students > Target   Average Score   % of Students > Target
a           3.00            50                       3.05            50
b           3.14            67                       2.97            63
c           3.08            75                       2.84            69
d           3.19            83                       3.19            83
Total       3.10            68.75                    3.01            66.25

Current results improved: [ X ] Yes [ ] No [ ] Partially

SLO 7 was not evaluated in 2016-17. The score of 3.10 is slightly higher than the last evaluation (2016) score of 3.01. AN showed a higher score than AL. By breaking down our SLOs into specific criteria for more detailed evaluation, we (the Architecture faculty) can now concentrate on the areas that need the most improvement. We have taken into consideration the advice of the Architecture Curriculum Advisory Committee. By measuring the SLOs through evaluation of the capstone courses, the evaluation includes all other relevant courses, thereby making it comprehensive and efficient. Target Met: [X] Yes [ ] No [ ] Partially. Based on the recent results, areas needing improvement: the result is above the ultimate goal of 3.0. Though the target has been exceeded, we will continue to make the courses more challenging and marketable, per the recommendations of the Architecture Curriculum Advisory Committee members. Next assessment of this CLO: May 2020.

Program Goals   Evaluation Methods   Assessment Results   Use of Results

To prepare students for employment in the field of architecture technology and construction.

A survey was administered to Architectural Design and Graphics II (ARC 232) students in May 2018 at the Annandale campus to determine their perceptions of how well prepared they are in our program's four SLOs. There were six participating students. No exit survey was conducted in May 2018.

Target: the number of program-placed students in each degree/certificate will increase by 10 percent by the next academic year. The student survey revealed that students rate their skills and knowledge in our four learning objectives at an average of 3.10 on a scale from 1-4, where 3 represents "good" and 4 represents "excellent." Site planning received the lowest score (2.47) and computer graphics the highest (3.31).

No exit survey was conducted in May 2018. We have asked students enrolled in the capstone courses who were preparing to graduate to inform us by mail of their academic and employment status. So far we have not received enough responses.


From the responses we have received, we can see that some of our graduates have started their own small design-build firms involved in residential renovations and additions. Some found work at local construction firms and in construction-related retail (for example, as a kitchen designer for a cabinet supplier). We will continue the same kind of survey this year so that the program can better understand our students' future activities, which will allow the program to adjust to students' needs in preparing them for their career goals. To recruit students to the program, with the help and participation of our Architecture Curriculum Advisory Committee members, we have secured exciting internship opportunities, making the program more attractive and practical. Turner Construction Co. has already accepted three of our students into its internship program for Summer 2018 and will continue accepting applications for next year; this gives students invaluable hands-on experience and the possibility of future scholarships. In Spring 2018, the Advisory Committee members received copies of all architecture Course Content Summaries to comment on and recommend necessary revisions per market needs. We expect their responses by Spring 2019. Results improved: [ ] Yes [ ] No [X] Partially


Current actions to improve program goal: To increase the number of student internship opportunities in Architecture and Construction. To revise or improve existing courses per the Architecture Curriculum Advisory Committee members' recommendations. Assessed: Annually

To increase the number of program-placed students in the program.

Distribution of Program-Placed Students by Curriculum and Award Type. Data from the 2017-18 Planning and Evaluation Report.

Target: Program graduation totals will increase by 10 percent. Results for Past 5 Years:

Academic Year    Number of Graduates    Percentage Change (%)
2013-2014        14                     -
2014-2015        25                     70.00
2015-2016        10                     -58.82
2016-2017        18                     78.57
2017-2018        12                     -33.33
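A minimal sketch of the standard year-over-year percentage-change calculation, shown for the most recent year only; this is an assumption about how the column above was computed, since the report does not state it:

\[
\text{percentage change} = \frac{\text{current year} - \text{previous year}}{\text{previous year}} \times 100,
\qquad \frac{12 - 18}{18} \times 100 \approx -33.33\%.
\]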

Target Met: [ ] Yes [ X ] No [ ] Partially Comparison to previous assessment(s): In the 2017-18 academic year, graduates decreased by 33.33% from 2016-17. Target: Increase the number of program-placed students in each degree/certificate by 10 percent for the next academic year by advising our students and explaining the possibilities.

Fall    Number of Students    Percentage Change (%)
2013    240                   -
2014    189                   -21.25
2015    169                   -10.58
2016    133                   -21.31
2017    119                   -10.52

Target Met: [ ] Yes [ X ] No [ ] Partially In Fall 2017, program-placed enrollment decreased by 10.52% from Fall 2016.

Previous action to improve program goal: We had 119 program-placed students last year. We have seen a decline in the number of program-placed students over the last five school years. We believe this is largely due to the economic downturn: the building trade industry was hit hard, and recovery has been slow and uncertain, which has contributed to the decline in our program-placed students. We also had many students who were taking our ARC courses without program placement or with no intention of graduating. We believe that, with the recent economic recovery and improvement in the building industry, we will also see good improvement in our program enrollment. Results improved: [ ] Yes [ X ] No [ ] Partially Current actions to improve program goal: We need to make sure that all students entering the Architecture Technology program are program-placed. Students entering our program (ARC 123) now fill out a questionnaire about their goals and intentions, which gives faculty a good basis for advising them properly and encouraging them to program-place in Architecture.


Assessed: Annually

Additional Program Goals:
1. To prepare and encourage students to continue their undergraduate education at four-year universities.
2. To prepare students who possess an undergraduate degree in an unrelated discipline to enter a Master of Architecture program.
3. To encourage our students to continue in and graduate from the Architecture program.

Our students mainly transfer to the Catholic University of America (CUA). We also have an agreement with the Virginia Tech graduate architecture program to accept students who possess an unrelated undergraduate degree and complete our Architecture program into its Master of Architecture program. In Fall 2018 we began negotiations with other area universities, such as UDC, Howard University, and UVA, for additional transfer agreements.

Target: To negotiate with all local architecture schools in VA, MD, and DC to increase the transfer possibilities for our students who want to continue their studies in Architecture.
Results for Past 5 Years:
A.A.S. Architecture Technology students transferring to CUA: 7 (2014), 5 (2015), 6 (2016), 4 (2017), 1 (2018)
A.A.S. Architecture Technology students transferring to the Virginia Tech graduate program: 1 (2014), 2 (2015), 2 (2016), 1 (2017), 0 (2018)
Target Met: [ ] Yes [ ] No [ X ] Partially

We have two agreements for our students to transfer to higher institutions: one with CUA and the other with the graduate architecture program at Virginia Tech. Students who already have a bachelor's degree in an unrelated area and have graduated from our program can enter the Virginia Tech graduate program. Most recent results: Last year, four of our graduates continued their studies at CUA; this year only one graduate transferred to CUA, and no students transferred to the Virginia Tech graduate program. We have started negotiations with other area universities for more transfer agreements. The Architecture program offers an A.A.S. degree, an occupational program designed to prepare our students for employment, but we also have students planning to transfer to four-year colleges, so we offer a two-track program for both employment and transfer by using our Technical Elective courses and offering courses according to the needs of each group. Results improved: [ ] Yes [ ] No [ X ] Partially Current actions to improve program goal: Negotiate with all local architecture schools in VA, MD, and DC to increase the transfer possibilities for our students who want to continue their studies in Architecture. Next assessment: Fall 2019


Annual Planning and Evaluation Report: 2017-2018 Automotive Technology, A.A.S. and Emissions Specialization

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This curriculum is designed to prepare students for full-time employment in the automotive field.

Student Learning Outcomes    Evaluation Methods    Assessment Results    Use of Results

Inspect and measure a rotor with a dial indicator and micrometer to determine the serviceability of the rotor.

Brakes AUT 265 Direct Measure: The same NATEF-approved task (number 61) as in the previous assessment was used to show that students are able to perform this entry-level task of diagnosing rotor condition. Assessment attached. This task was assessed in Brakes, where mastery of these skills is expected. Sample size: (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
AL only            2                           2                      26
MA only            2                           2                      27
DE*                3                           2                      37
ELI                N/A                         N/A                    N/A
Total              7                           6                      90
*Dual-enrollment

Semester/year data collected: Spring 2018 Target: 80% or higher overall score Current results improved: [ ] Yes [X] No [ ] Partially Results by In-Class, ELI, Dual Enrollment: Data was collected at the end of the Spring 2018 semester from all sections offered at NOVA. 26 out of 34 Alexandria students and 27 out of 27 Manassas students were assessed in all AUT 265 courses (2 at Alexandria and 2 at Manassas). The average pass rate was 67% for Alexandria, below the target, and 99% for Manassas, above the target. Three dual-enrolled high schools also offered this course, although due to communication challenges only 2 schools participated in this cycle, which is 1 more than in the previous cycle. The pass rate for dual-enrolled students was 78%, below the 80% target. Previous Data

NOVA Students - 57
Question                                  Spring 2016    Spring 2017 AL    Spring 2017 MA
1 - Visual Inspection                     100%           100%              100%
2 - Service Information                   100%           87%               100%
3 - Thickness Variation Specifications    100%           98%               100%
4 - Micrometer Measurements               98%            81%               88%
5 - Dial Indicator Measurements           100%           84%               85%
6 - Recommendations                       100%           81%               100%

Dual Enrolled Students - 8
Question                                  Spring 2016    Spring 2017
1 - Visual Inspection                     N/A            100%
2 - Service Information                   N/A            75%

Previous action(s) to improve SLO: Faculty noticed during the Spring 2016 assessment that the only area in which students had difficulty was acquiring an accurate micrometer reading, leading to an overall score of 99%. In the following semesters, faculty gave students feedback on their accuracy and measuring technique and also increased the difficulty level by measuring to the thousandths of an inch instead of hundredths of an inch. With the added difficulty, the current cycle's performance on this skill decreased at both campuses to an overall score of 91%, similar to the dual-enrolled students at 92%. Although the target was met during our last assessment in Spring 2017, there is room for improvement for the next assessment. The following recommendations were agreed upon by faculty during the Fall 2017 cluster and will be implemented in every Brakes (AUT 265) course by faculty immediately to improve student outcomes in every class and in preparation for the next assessment at the end of Spring 2018:
• For question 2, students will be instructed during practice to use the same unit of measurement when comparing specifications to actual measurements.
• For question 4, students had a difficult time measuring accurately given the style and accuracy of the micrometer. Also, many students forgot to index the rotor and to ensure the positions they measured were an appropriate distance from each other.
• Question 5 will be reworded for clarity.


Dual Enrolled Students - 8 (continued)
Question                                  Spring 2016    Spring 2017
3 - Thickness Variation Specifications    N/A            100%
4 - Micrometer Measurements               N/A            87%
5 - Dial Indicator Measurements           N/A            87%
6 - Recommendations                       N/A            87%

Spring 2018 Results
Question                                  AL (26)    MA (27)    DE (37)
1 - Visual Inspection                     88%        100%       89%
2 - Service Information                   63%        100%       78%
3 - Thickness Variation Specifications    92%        100%       84%
4 - Micrometer Measurements               42%        93%        67%
5 - Dial Indicator Measurements           58%        100%       78%
6 - Recommendations                       58%        100%       70%

Strengths by Criterion/Question/Topic: The Manassas campus seems to have a proven approach to teaching and assessing students in inspecting, measuring, and assessing rotors, which will be shared with the Alexandria campus and the dual-enrolled schools. Therefore, we will focus on the Alexandria campus and our dual-enrolled partners. Looking up specifications and performing a visual inspection are the two highest-scoring areas. Weaknesses by Criterion/Question/Topic: The lowest-scoring areas relate to measuring the components and comparing them to service information (questions 2, 4, 5, and 6). It was noted that taking measurements on a rotor with multiple grooves makes it challenging for students to measure in the exact same spot the instructor chose as the basis for comparison against the students' answers. Also, some of the micrometers were not calibrated properly, so if a student used a different micrometer than the instructor, the answer would be incorrect.

Question 6 showed that students need to understand how much material is removed during rotor machining in order to determine the minimum thickness a rotor needs to start with. Target Met: [ ] Yes [ ] No [X] Partially Based on recent results, areas needing improvement: For students, reading micrometers properly and understanding the different nomenclature that manufacturers use for various specifications. For faculty, ensuring the equipment is the same for all students and for the instructor who makes the initial comparison measurement. Current actions to improve SLO based on the results: The following recommendations were agreed upon by faculty during the Fall 2018 discipline meeting and will be implemented in all Brakes (AUT 265) courses by faculty immediately to improve student outcomes in every Brakes class and in preparation for the next assessment at the end of Spring 2019:
• Ensure all sections are practicing and proctoring the assessment the same way and are using the same equipment, so that the measurement the instructor takes as the basis for comparison with a student's answer will be accurate.
• Add a metric specification for runout and parallelism to the worksheet.
• Update the worksheet so the tasks are listed in a more sequential order and the verbiage matches what is found in Alldata service information.
• Review different manufacturers' nomenclature for different specifications to aid in understanding the titles of specifications.
Next assessment of this SLO: Spring 2019

Apply electrical theory using wiring diagrams and schematics to diagnose and repair automotive electrical circuits.

Electricity 2 AUT 242 Direct Measure: All instructors collaborated on adding two questions at the end that had the same answer as number 5,

Semester/year data collected: Fall 2017 Target: 80% or higher overall score Current results improved:

Previous action(s) to improve SLO: Faculty noticed during the Fall 2015 assessment that students had difficulty with questions 5 and 6 which required the highest level of critical thinking skills, the basis of which is the ability to read wiring



but the questions were worded differently to assess which wording best communicates question number 5. Questions 1-6 are the same questions used on the previous SLO worksheet. An accompanying wiring diagram that highlighted the main components and characteristics of circuits (relays, switches, modules, and loads) was included. The core questions targeting fundamental knowledge about circuit voltages and problem-solving abilities were used. Assessment attached. This task was assessed in Electricity 2, where mastery of these skills is expected. Sample size: (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
AL only            1                           1                      15
DE*                N/A                         N/A                    N/A
ELI                N/A                         N/A                    N/A
*Dual-enrollment

[X] Yes [ ] No [ ] Partially Data was collected at the end of Fall 2017 to assess the question wording for number 5 in order to decide how to communicate the question properly before all campuses participate in the Spring 2018 assessment. In the fall, 15 out of 16 students were assessed in one AUT 242 - Electricity 2 course at Alexandria, which was the only section offered across all campuses and dual-enrolled high schools. The average pass rate was 91%, above the target of 80% or higher. This assessment showed that the last question-wording option was the most effective in communicating with students, although a large number of students still did not answer the question properly because they missed details in the question. Previous Data

Question                Fall 2015 AL    Fall 2016 AL    Fall 2016 MA
1 - Voltage Point W     95%             89%             100%
2 - Voltage Point X     95%             100%            100%
3 - Voltage Point Y     81%             100%            100%
4 - Voltage Point Z     86%             100%            100%
5 - Open Circuit        50%             33%             77%
6 - Corrosion           72%             44%             100%

Question                Spring 2017 AL    Spring 2017 MA
1 - Voltage Point W     80%               100%
2 - Voltage Point X     80%               100%
3 - Voltage Point Y     72%               100%
4 - Voltage Point Z     80%               96%
5 - Open Circuit        56%               96%
6 - Corrosion           52%               100%

Question                Fall 2017 AL
1 - Voltage Point W     100%
2 - Voltage Point X     100%
3 - Voltage Point Y     81%
4 - Voltage Point Z     100%
5 - Open Circuit        56%
6 - Corrosion           75%
7 - #5 Reworded         56%
8 - #5 Reworded         69%

diagrams to diagnose a problem, and resulted in an overall score of 80%. In the following semesters, faculty placed more emphasis in lecture on circuit operation without faults and on relating symptoms on a work order to electrical flow on a diagram. Faculty have also increased the amount of basic electrical content taught in non-electricity courses. In the 2016-17 cycle, performance on this skill improved overall to 87% in fall and 84% in spring, although Alexandria scored below the pass rate at 78% in the fall and 70% in the spring. The target was met at Manassas but not at Alexandria during our last assessments in Fall 2016 and Spring 2017. There is room for improvement and collaboration for the next assessment. The following recommendations were agreed upon by faculty during the Fall 2017 cluster and will be implemented in all automotive courses by faculty immediately to improve student outcomes in every class and in preparation for the next assessment at the end of Fall 2017:
• Color code the diagram during normal operation.
• Color code diagrams during abnormal operation on this and multiple other horn circuits. This should also be done in AUT 241 (Electricity 1).
• Electricity 1 students should practice color coding normal operation, but should also be able to "diagnose on paper" with color coding during a fault.
• Question number 5 appears to be confusing to many students. Our next goal is to ask the same question in multiple ways in Fall 2017 to see whether the question is worded properly.
Target Met: [ ] Yes [ ] No [X] Partially Based on recent results, areas needing improvement: Reading and comprehending written questions so students will be able to use the information presented in the questions and schematics to diagnose a fault based on symptoms.


Strengths by Criterion/Question/Topic: Students are able to understand what voltages should be present in a normally functioning circuit. Weaknesses by Criterion/Question/Topic: Looking at the data below, question 5 has the lowest score for both campuses, followed by question 6. Given the amount of time spent reviewing the material in class, it is clear students still become confused when visualizing how electricity normally flows through the circuit, then analyzing the symptom the vehicle is experiencing, and finally using those symptoms to conceptualize how the flow of electricity could have been changed.

Current actions to improve SLO based on the results: The following recommendations were agreed upon by faculty during the Fall 2018 discipline meeting and will be implemented in all AUT 242 (Electricity 2) courses by faculty immediately to improve student outcomes in every Electricity 2 class and in preparation for the next assessment at the end of Fall 2021:
• Expand our 8-week course offering into a 16-week hybrid course with weekly written assignments on voltage drop, relay, and control circuitry.
• Reword question 5 on the assessment to list the technicians' findings in bullet-point format so students don't miss pertinent information.
• Ensure all sections are practicing and proctoring the assessment the same way.
Next assessment of this SLO: Fall 2021

Retrieve diagnostic trouble codes and monitor status using a scan tool. Using the scan tool data and wiring diagrams, determine the next logical step in the drivability diagnostic process.

Fuels 2 AUT 122 Direct Measure: All instructors teaching this course collaborated on creating a hands-on and written assessment that tests the student's ability to retrieve codes, read scan data, understand system operation, and use a wiring diagram to determine a diagnostic process. This task is assessed in Fuels 2, where mastery of these skills is expected. Sample size: (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
MA only            2                           2                      30
DE*                N/A                         N/A                    N/A
ELI                N/A                         N/A                    N/A
*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 80% Data was collected at the end of the Fall 2017 semester in all sections offered. 30 students were assessed in two AUT 122 - Fuels 2 courses at Manassas. The average pass rate was 98%, which is above the target of 80%. This is consistent with last year's pass rate at the Alexandria campus.

Question                   Spring 2017 (AL)    Fall 2017 (MA)
1 - Retrieve DTCs          100%                100%
2 - DTC Descriptions       100%                100%
3 - Monitor Readiness      100%                93%
4 - Component/Wire Check   94%                 100%

Current results improved: [ ] Yes [X] No [ ] Partially Strengths by Criterion/Question/Topic: The data show that students at both campuses are proficient in diagnostic trouble code retrieval, code description, and accessing monitor readiness. Question 4 shows students are able to combine code,

Previous action(s) to improve SLO: Faculty noticed during the Spring 2017 assessment that students did well on the paper exam; however, many students chose a test location on paper that would be difficult to access in an actual service-bay diagnostic situation. The recommendation was made to set up a similar diagnostic scenario in the lab to allow students the opportunity to look for the most practical test points. Scores for question 4 improved to 100% over Alexandria's Spring 2017 score of 94%. However, 7% of students missed question 3 at Manassas in Fall 2017, which is below Alexandria's Spring 2017 score of 100%. This brings the total score to 98% for Manassas in Fall 2017, the same as Alexandria's 98% in Spring 2017. Although the target was met during our last assessment in Fall 2017, there is room for improvement for the next assessment. Target Met: [X] Yes [ ] No [ ] Partially Current actions to improve SLO based on the results: The following recommendations were agreed upon by faculty during the Fall 2018 discipline meeting.


symptom, system operation, and wiring diagram data to interpret the next diagnostic step. Weaknesses by Criterion/Question/Topic: Both campuses did equally well but had their lowest scores in different areas. Manassas had minor trouble with question 3, which requires the student to locate a different menu item than is typically used to retrieve codes or data.

These recommendations will be implemented in all Fuels 2 (AUT 122) courses by faculty immediately to improve student outcomes in future courses and for the next assessment at the end of Fall 2018, which will collect data when both campuses are offering the course at the same time: Practice using Mode 6 data monitor readiness as a part of the normal diagnostic process during labs. Next assessment of this SLO: Fall 2018

Core Learning Outcomes    Evaluation Methods    Assessment Results    Use of Results

CLO: Apply electrical theory using wiring diagrams and schematics to diagnose and repair automotive electrical circuits. [ X ] CT

Electricity 2 AUT 242 Direct Measure: The test questions were updated to include the best-understood version of question 5, with an accompanying wiring diagram that highlighted the main components and characteristics of circuits, including relays, switches, modules, and loads. The questions that targeted fundamental knowledge about circuit voltages and problem-solving abilities were used. Assessment attached. This task was assessed in Electricity 2, where mastery of these skills is expected. Sample size: (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
AL only            2                           2                      27
MA only            2                           2                      22
DE*                1                           N/A                    N/A
ELI                N/A                         N/A                    N/A
Total              5                           4                      49
*Dual-enrollment

Semester/year data collected: Spring 2018 Target: 80% or higher overall score Current results improved: [ ] Yes [ ] No [X] Partially In Spring 2018, 27 out of 29 students at Alexandria and 22 out of 22 students at Manassas were assessed in all AUT 242 courses (2 at Alexandria and 2 at Manassas). The average pass rate was 76% for Alexandria and 97% for Manassas. In the spring semester, Alexandria did not meet the 80% pass rate but is above the last cycle's score of 70%. Manassas did meet the 80% pass rate and is slightly below last cycle's score of 98%. One dual-enrolled high school course administered the SLO assessment but unfortunately gave it to all students in the course, not just those who were dual enrolled; therefore, the data were inaccurate and could not be used. The instructor was made aware of the process and will submit clear data for the next SLO assessment. Previous Data

Question                Fall 2015 AL    Fall 2016 AL    Fall 2016 MA
1 - Voltage Point W     95%             89%             100%
2 - Voltage Point X     95%             100%            100%
3 - Voltage Point Y     81%             100%            100%
4 - Voltage Point Z     86%             100%            100%
5 - Open Circuit        50%             33%             77%

Previous action(s) to improve SLO: Faculty noticed during the Fall 2015 assessment that students had difficulty with questions 5 and 6, which required the highest level of critical thinking skills, the basis of which is the ability to read wiring diagrams to diagnose a problem; this resulted in an overall score of 80%. In the following semesters, faculty placed more emphasis in lecture on circuit operation without faults and on relating symptoms on a work order to electrical flow on a diagram. Faculty have also increased the amount of basic electrical content taught in non-electricity courses. In the 2016-17 cycle, performance on this skill improved overall to 87% in fall and 84% in spring, although Alexandria scored below the pass rate at 78% in the fall and 70% in the spring. The target was met at Manassas but not at Alexandria during our last assessments in Fall 2016 and Spring 2017. There is room for improvement and collaboration for the next assessment. The following recommendations were agreed upon by faculty during the Fall 2017 cluster and will be implemented in all automotive courses by faculty immediately to improve student outcomes in every class and in preparation for the next assessment at the end of Spring 2018:
• Color code the diagram during normal operation.
• Color code diagrams during abnormal operation on this and multiple other horn circuits. This should also be done in Electricity 1.


Question                Fall 2015 AL    Fall 2016 AL    Fall 2016 MA
6 - Corrosion           72%             44%             100%

Question                Spring 2017 AL    Spring 2017 MA
1 - Voltage Point W     80%               100%
2 - Voltage Point X     80%               100%
3 - Voltage Point Y     72%               100%
4 - Voltage Point Z     80%               96%
5 - Open Circuit        56%               96%
6 - Corrosion           52%               100%

Question                Fall 2017 AL (15)    Spring 2018 AL (27)    Spring 2018 MA (22)
1 - Voltage Point W     100%                 81%                    100%
2 - Voltage Point X     100%                 81%                    100%
3 - Voltage Point Y     81%                  74%                    100%
4 - Voltage Point Z     100%                 78%                    100%
5 - Open Circuit        56%                  55%                    90%
6 - Corrosion           75%                  89%                    95%
7 - #5 Reworded         56%                  -                      -
8 - #5 Reworded         69%                  -                      -

Strengths by Criterion/Question/Topic: Students are able to understand what voltages should be present in a normally functioning circuit. Weaknesses by Criterion/Question/Topic: Looking at the data, question 5 still has the lowest score for both campuses, followed by question 6. Given the amount of time spent reviewing the material in class, it is clear students still become confused when visualizing how electricity normally flows through the circuit, then analyzing the symptom the vehicle is experiencing, and finally using those symptoms to conceptualize the possibilities of how the flow of electricity could have been changed.

• Electricity 1 students should practice color coding normal operation, but should also be able to "diagnose on paper" with color coding during a fault.

• Question number 5 appears to be confusing to many students, so our next goal is to ask the same question in multiple ways in Fall 2017 to see whether the question is worded properly.

Target Met: [ ] Yes [ ] No [X] Partially Based on recent results, areas needing improvement: Using schematics to diagnose a fault based on symptoms. Current actions to improve SLO based on the results: The following recommendations were agreed upon by faculty during the Fall 2018 discipline meeting and will be implemented in all Electricity 2 (AUT 242) courses by faculty immediately to improve student outcomes in future courses and for the next assessment at the end of Fall 2018, which will collect data when both campuses are offering the course at the same time.
• Expand our 8-week course offering into a 16-week hybrid course with weekly written assignments on voltage drop, relay, and control circuitry.

• Reword question number 5 on the assessment to list the technicians’ findings in bullet-point format so students don’t miss pertinent information.

• Ensure all sections are practicing and proctoring the assessment the same way.

Next assessment of this SLO: Fall 2021

Program Goals Evaluation Methods Assessment Results Use of Results

Create an advising path for all students entering and continuing

Per the outcomes of last year’s program goal 1 (create an advising path for all students entering and continuing in the program to advise them about our program, career options, and guidance on how to achieve their goals), we found that our new advising process and tools are much improved. This is indicated by the number of graduates we

Although our advising path for current students is working well, automotive faculty recognized the need for NOVA advertising to stretch beyond transfer agreements and into career-ready programs to increase the number of incoming students, as evidenced by our shrinking enrollment. It would also be beneficial for Automotive to increase the awareness

Previous actions to improve Program Goal: As of Spring 2017, faculty have created a single website for our one program that can be found at: http://www.nvcc.edu/automotive/index.html One feature of the website provides any interested party with a fillable contact form directly on the website that goes to a group e-mail account. The


in the program to advise them about our program, career options, and guidance on how to achieve their goals. Meets NATEF requirement Pre-Admission Counseling 5.2 - Advise automotive students about careers and our program before enrolling.

have in spite of the falling enrollment numbers (see data under goals 2 and 3). However, many members of the public, as well as NOVA personnel and incoming and continuing students, were still unaware of all our offerings.

of our program to the college as a whole, so counselors can send prospective and incoming students directly to an automotive faculty advisor for all the required information before enrolling in classes. In addition, many continuing students are unsure how to proceed when looking for an employer who will help them grow in their career while they are in school or close to graduating. An automotive-specific SDV 101 course would help set the tone for what is required in an automotive program, prepare automotive students for employment, and prepare them for possible transfer to four-year institutions with automotive-specific programs.

request can be handled within 48 business hours by the department the requestor has indicated. The website also includes key components that the GM ASEP marketing committee survey found to be important to incoming students. These components are faculty credentials, pictures, and an automotive program guide that provides the first step in the advising process for students not yet enrolled in the program. This includes information about each of our programs, career and transfer options, financing options, the courses required to complete each degree, as well as maps for each campus. In Spring 2017 we also created a tri-fold pamphlet to hand out when visiting high school automotive programs or attending career fairs. Once students are in the program, we distribute the "10 Steps to Success" automotive student guide, which includes more detailed information such as tool discounts, a resume template, and an "Are You on Track" degree-progress check sheet. We have successfully been using the guide since Fall 2016. Guide is attached. All shared documents, including the recruiting pamphlet, program guide, advising sheets, lab contracts, and the SLOs, can be found on our shared "Automotive SLO" Blackboard site to better collaborate between campuses. The Blackboard site has been used successfully since Spring 2017. Screen capture of site is attached. We have had dozens of form requests from the website and have successfully replied to them. Most prospective students then signed up for classes after speaking with a representative. This, coupled with visiting almost every high school automotive class in the Northern Virginia area, has helped to improve our first-year course enrollment. Students are better aware of their options and, as a result, are better prepared to complete their certificate or degree. Most recent results: In Fall 2017, the Automotive Program Chair contacted the NOVA President about the possibility of expanding the marketing reach of NOVA Automotive through NOVA's regular advertisement routes. This included radio, print, and


online, as well as expanding to YouTube advertisements. A meeting was held to discuss these options, and the outcome was that the President, Campus Provost, and Program Dean worked to put NOVA Automotive materials in the queue for NOVA Graphics to complete. All full-time faculty have participated in creating the content of the materials, and we expect to have the pamphlets and banner designs completed by December 2018. The third draft of the banner and program guide are attached. As of Fall 2018, we received permission to begin developing and teaching an automotive-specific SDV 101 course. We have started to build the course and will complete the necessary training as it becomes available. The course should be ready for deployment for the Fall 2019 semester. Some materials and automotive-specific areas to cover in the course are attached. Given that building an SDV 101 course and professional marketing materials takes over a year to complete, we deployed some strategies this year to help our current incoming students become aware of the offerings at NOVA. First, we contacted all students who registered as automotive technology students and advised them over the phone or in person if they were willing to meet with us. Second, we delayed the General Motors ASEP orientation to two weeks before the start of classes to help capture any students who might be interested in this once-per-year opportunity or had questions about the global automotive program. These methods proved to be very successful, as all our first-year classes are filled. Results improved: [X] Yes [ ] No [ ] Partially Current actions to improve Program Goal: The 2018-19 program goal will be to host orientation sessions at both campuses over the 2019 summer semester and to reach out to all counseling departments at all campuses in Spring 2019 to give automotive faculty advisor contact information to everyone, in order to capture as many students as possible before they register.


The next assessment of this Program Goal: Fall 2019

To increase the number of program-placed Automotive Technology A.A.S. degree students

Enrollment totals are tracked through OIR data.

The results for the past 5 years are as follows:

Year    2013    2014    2015    2016    2017
AAS     434     437     397     359     359

Previous actions to improve Program Goal: This is a new program goal and has not been previously assessed.

Most recent results: Enrollment has stayed the same compared to last year, and the target has not been met. Enrollment on both the Alexandria and Manassas campuses has declined as a whole in most discipline areas, and enrollment in automotive reflects this downward trend. Current actions to improve Program Goal: There is an opportunity to attract students to come to NOVA instead of a for-profit institution. We will increase our marketing presence at the high schools and surrounding locations such as libraries, parts stores, shops, and dealerships. We will also host two orientation sessions during Summer 2019 to capture more students and place them in the program best suited to them. Next assessment of this Program Goal: the end of the Spring 2019 semester.

To increase the number of Automotive Technology Graduates

Graduation totals are tracked through OIR data.

The results for the past 5 years are as follows:

Year         AAS    AAS Emissions Specialization    CSC Certification
2013-2014    31     -                               17
2014-2015    29     -                               11
2015-2016    43     -                               30
2016-2017    43     -                               31
2017-2018    45     8                               26

Previous actions to improve Program Goal: This is a new program goal and has not been previously assessed.

Most recent results: The goal of increasing graduates has been met, as the total number of graduates from our A.A.S. degrees increased by 10, or 23%, over last year. Even though our enrollment has declined, our graduation numbers have improved because of our increased effort to reach out to every student at every stage of their college career to ensure they are advised properly to meet their personal goals.

Current actions to improve Program Goal: The following recommendations were agreed upon by faculty during the Fall 2018 discipline meeting and will be implemented immediately. There is still room for improvement as we try to ensure that every student who has earned an associate degree also applies for and graduates with the Maintenance and Light Repair Career Studies Certificate, which is stackable with both associate degrees.


Our advising plan is to ensure all students are also signed up for the MLR CSC as well as the Degree when first entering the program if they are not at their limit for degrees claimed.

Next assessment of this Program Goal: Spring 2019.

Create a system for locating graduates for employment opportunities and NATEF program review. NATEF requirement Annual Follow Up 5.4

Per the outcomes of last year’s program goal 2 (create a system for locating graduates for employment opportunities and NATEF program review), we found that electronic surveys are difficult to employ and are generally unsuccessful due to lack of internet and/or computer access in the classroom, and the inundation of faculty evaluation surveys and exams at the end of the semester.

For the 2017-18 school year, we have given a paper survey to all students at the beginning of each semester in all advanced level classes such as Fuels 2 and Electronics. Our target was to collect data from 30% of our students who are near graduation.

Previous actions to improve Program Goal: A graduate survey was created online at: https://goo.gl/forms/vmn2sgwI0BdCoz3o1 The survey was designed to collect data from students easily and securely through the secure Gmail account on myNOVA, using a Google Form accessible to program heads. Any student who was in a second-year class or was close to graduating was asked to complete the survey. The survey did not prove to be an effective tool due to the lack of internet and/or computer access in the classroom and the inundation of faculty evaluation surveys and exams at the end of the semester. During the Fall 2017 cluster meeting, faculty made the following recommendations to improve the collection of graduate contact information at the end of Spring 2018:
• Ask students to check SIS at the end of the semester and update their contact information to include a cell phone number, with the use of written surveys on paper.
• Pass out a 3x5 card and ask only for basic information.
• Use a general-purpose exit survey that is automatically populated through Blackboard for all automotive students.
• Give the survey on paper at the beginning or middle of the semester.
Most recent results: This practice yielded a 50% response rate among students who were near graduation and willing to release their contact information to us, so we will be able to keep in touch with them for NATEF accreditation and job opportunities. Graduate survey attached. This is now a part of our regular process, and we expect the rate of return to continue to increase each year.


Results improved: [X] Yes [ ] No [ ] Partially For next year's program goal, we will be looking to freshen up our website to include profiles of employers who regularly look for our graduates, including non-automotive employers such as Micron Technology and hospitals in need of medical equipment repair technicians. Next assessment of this program goal: Fall 2019.


Annual Planning and Evaluation Report: 2017-2018 Biotechnology, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This program is designed to prepare graduates for employment in entry-level positions at biotechnology and pharmaceutical companies, as laboratory, research, or manufacturing technicians. Coursework will develop an understanding of basic scientific principles in biology and chemistry, and will emphasize laboratory techniques and procedures such as solution and media preparation, DNA purification and analysis, electrophoresis, chromatography, maintenance of cells in culture, and quality control techniques.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Demonstrate proficiency in standard lab procedures and in the use of basic lab equipment.

Biotechnology Methods BIO 250 One section was offered in the traditional format. Students were given an open-note, hands-on practical final exam, which was used as the assessment for this SLO. Students were evaluated in the following categories:
• Aseptic Technique/Streak Plate Method
• Use of a thermocycler
• Use of a microcentrifuge
Each activity was worth 10 points. Scores were normalized to a 0-4 scale (see the sketch after the sample-size table below). Students were expected to receive an 80% (3) or higher to demonstrate competency in each area. Sample size: (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
MA only            1                           1                      8
DE*                N/A                         N/A                    N/A
ELI                N/A                         N/A                    N/A
Total              1                           1                      8
*Dual-enrollment
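A minimal sketch of the score normalization referenced above, assuming a simple linear rescaling of each 10-point activity onto the 0-4 scale (the report does not state the exact conversion):

\[
\text{normalized score} = \frac{\text{raw score}}{10} \times 4,
\qquad \text{e.g. } \frac{8}{10} \times 4 = 3.2 .
\]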

Semester/year data collected: Fall 2017 Eight students were included in this assessment. Target: Students were expected to achieve an 80% (score of 3 or above) for their demonstration of the course topics.
Aseptic Technique/Streak Plate Method
• 100% (8/8) of students scored 3 or better
• Increased percent of students scoring 3 or better (46% to 100%) as compared to the 2014-15 report
• Increased average score from 2.7 to 3.4 as compared to the 2014-15 report
Use of a Thermocycler
• 88% (7/8) of students scored 3 or better
• Maintained percent of students scoring 3 or better (88%) as compared to the 2014-15 report
• Average score dropped slightly from 3.5 to 3.4 as compared to the 2014-15 report
Use of a Microcentrifuge
• 62% (5/8) of students scored 3 or better
• Decreased percent of students scoring 3 or better (75% to 62%) as compared to the 2014-15 report
• Increased average score from 3.0 to 3.2 as compared to the 2014-15 report

Current results improved: [ ] Yes [ ] No [ X ] Partially

See Column 4 for comparison.

BIO 250 was offered only on the Manassas Campus in Fall 2017. Students were expected to achieve an 80% (score of 3 or above) for their demonstration of the course topics. Students mastered the streak plate method using aseptic technique and the use of a thermocycler; the percentage of students receiving a 3 or better (80%) was 100% and 88%, respectively (n=8). Students struggled more with using a microcentrifuge, with only 62% of the students earning a 3 or better (80%). Assessment of this SLO last occurred in 2014. There was a 54-percentage-point increase in the percentage of students mastering the streak plate method using aseptic technique in Fall 2017 as compared to Fall 2014. The Fall 2014 report notes that this skill requires proficient use of the Bunsen burner and proper labeling of petri dishes. Since Fall 2014, these skills have been emphasized to ensure students are proficient at setting up the equipment and workspace before demonstrating the actual technique. Although the average score for the microcentrifuge activity increased (3.0 to 3.2) as compared to Fall 2014, the percent of students scoring 3 or better decreased compared to that same year (75% to 62%). Students struggled to note the differences in volumes held in the sample tubes and therefore did not properly balance the equipment. More emphasis will be placed in the future so


that students take time to make the necessary observations. Next assessment of this SLO: Fall 2018.

Demonstrate professional communication and interpersonal skills and behaviors necessary for working in a collaborative laboratory environment.

Introduction to Careers in Biotechnology BIO 180 Eight students in one section were assessed during the 8-week course on their reliability as potential employees and on their personal presentation. Students were also assessed on their performance in a mock interview setting. Reliability was equated to attendance and punctuality, as an employee would be evaluated. In addition, students were expected to dress professionally (business casual) for each class meeting. Mock interview panels were set up for students to demonstrate their ability to submit a professional application for employment (including a professional resume and cover letter) and to display their professional communication and interpersonal skills. Attendance (including punctuality and professional appearance) was worth 100 points. The mock interview activity (including the final resume and cover letter) was worth 100 points. Scores were normalized to a 0-4 scale. Students were expected to receive an 80% (3 or better) or higher to demonstrate competency. Sample size: (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
MA only            1                           1                      8
DE*                N/A                         N/A                    N/A
ELI                N/A                         N/A                    N/A
Total              1                           1                      8
*Dual-enrollment

Semester/year data collected: Spring 2018 Eight students were enrolled in BIO 180. Only one section was offered during the semester (Manassas Campus only). Target: Students were expected to achieve an 80% (score of 3 or above) for their demonstration of professional communication and behavior.
Attendance
• 100% (8/8) of students scored 3 or better.
• The average score (n=8) was 3.8.
Mock Interview, Final Resume and Cover Letter
• 83% (5/6) of students scored 3 or better. (Two students did not complete the assignment and are not included in this data.)
• The average score (n=6) was 3.4.
Spring 2017: Seventeen students were enrolled in BIO 180. Only one section was offered during the semester (Manassas Campus only).
Attendance
• 100% (17/17) of students scored 3 or better.
• The average score (n=17) was 3.9.
Mock Interview, Final Resume and Cover Letter
• 81% (13/16) of students scored 3 or better. (One student did not complete the assignment and is not included in this data.)
• The average score (n=16) was 3.4.
Current results improved: [ X ] Yes [ ] No [ ] Partially Results remained essentially the same.

Students were expected to achieve an 80% (score of 3 or above) for their demonstration of professional communication and behavior. The percentage of students receiving a 3 or better (80%) was 100% (n=8) for attendance. The percentage of students receiving 3 or better (80%) for the mock interview assignment was 83% (n=6). This course was last taught and assessed in Spring 2017. As compared to that assessment, there was no significant change in the percentage of students demonstrating competency in this SLO. Next assessment of this SLO: Spring 2020.

Demonstrate proficiency in standard lab calculations.

Basic Laboratory Calculations for Biotechnology BIO 147

Semester/year data collected: Fall 2016 Nineteen students were enrolled in two sections of BIO 147 at the Manassas Campus only.

BIO 147 was taught as a hybrid course for the third time on the Manassas Campus in Fall 2016. Students were expected to achieve an 80% (score of 3


Nineteen students in two hybrid sections were given a closed-book final exam which included math and word problems commonly encountered in the laboratory. The points for each topic varied: Significant Figures = 3 points each; Scientific Notation = 2 points each; Dilutions = 3 points; Stock Solutions = 8 points and 9 points. Scores were normalized to a 0-4 scale. Students were expected to receive an 80% (3 or better) or higher to demonstrate competency in each area. Sample size: (Specify N/A where not offered)

Campus/Modality    # Sections Offered    # Sections Assessed    # Students Assessed
MA only            2                     2                      19
DE*                N/A                   N/A                    N/A
ELI                N/A                   N/A                    N/A
Total              2                     2                      19
*Dual-enrollment

Target: Students were expected to achieve an 80% (score of 3 or above) for their demonstration of the course topics.
Significant Figures (3 questions)
• 96% (18/19) of students scored 3 or better
• Increased percent of students scoring 3 or better (93% to 96%) as compared to the 2015-16 report
• Maintained average score at 3.8 as compared to the 2015-16 report
Scientific Notation (3 questions)
• 98% (19/19) of students received a score of 3 or better
• Increased percent of students scoring 3 or better (79% to 98%) as compared to the 2015-16 report
• Increased average score from 3.6 to 3.9 as compared to the 2015-16 report
Dilutions (1 question)
• 81% (14/19) of students scored 3 or better
• Increased percent of students scoring 3 or better (50% to 81%) as compared to the 2015-16 report
• Increased average score from 2.9 to 3.2 as compared to the 2015-16 report
Multi-Component Stock Solutions (2 questions)
• 94% (17/19) of students scored 3 or better
• Increased percent of students scoring 3 or better (43% to 94%) as compared to the 2015-16 report
• Increased average score from 3.1 to 3.8 as compared to the 2015-16 report

Current results improved: [ X ] Yes [ ] No [ ] Partially

or above) for their demonstration of the course topics. Students mastered significant figures, scientific notation, and multi-component stock solutions; the percentage of students receiving a 3 or better (80%) was 96%, 98%, and 94%, respectively (n=19). Students struggled more with dilutions, with 81% of the students earning a 3 or better (80%). The topic with the lowest score in the 2015-16 report (Multi-Component Stock Solutions) showed an increase in the average score as compared to that report. All increases are due in part to better alignment of BIO 147 with BIO 250, implemented by the instructor in Fall 2016. In the previous assessment of this SLO (2015-16), it was noted that some students did not adhere to the desired block scheduling and were confused by the differing calculation methods employed by the two different instructors of BIO 147. The program faculty's actions to alleviate these issues (described in the 2015-16 APER) were implemented for this assessment period and included 1) requiring students to adhere to block scheduling to take advantage of the BIO 147/BIO 250 alignment and 2) having a single instructor teach all BIO 147 sections for consistency in calculation methodology. These same actions will be used to further improve outcomes in these areas. Next assessment of this SLO: Fall 2019.

Explain fundamental scientific concepts in biotechnology.

Biotechnology Concepts BIO 253 Students completed a cumulative final exam. Students were evaluated in the following categories:
• Basic concepts of biotechnology
• Structure and functions of DNA and proteins
• Genetic engineering to produce products

Semester/year data collected: Fall 2017
8 students enrolled in the ELI course, which was the only section offered.
Fall 2016
11 students enrolled in the hybrid course (assessed on the Manassas Campus)

This SLO is designed to gauge the understanding of students with respect to the fundamental concepts of biotechnology. Four categories encompassing the course objectives and addressing the SLO were designed and assessed during the written final exam.


• Techniques used in bio-manufacturing

Students were scored from 1-4 in each category listed above based on answers to multiple short answer questions in each category given on the final exam. 1=poor; 2=fair; 3=good; 4=excellent. Students were expected to receive an 80% (3 or better) or higher to demonstrate competency in each area. Sample size: (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
MA only            N/A                         N/A                    N/A
DE*                N/A                         N/A                    N/A
ELI                1                           1                      8
Total              1                           1                      8
*Dual-enrollment

10 students enrolled in the ELI course (taught by the instructor of the hybrid section). Number of sections: 2
Definition and applications of Biotechnology
Fall 2017 – ELI
• 75% (6/8) of students scored a 3 or better
• Class average: 3.5
Fall 2016 – Hybrid
• 91% (10/11) of students scored a 3 or better
• Class average: 3.7
Fall 2016 – ELI
• 90% (9/10) scored a 3 or better
• Class average: 3.8
Fall 2014
• 95% (21/22) of students scored a 3 or better
• Class average: 3.8
Structure and functions of DNA and proteins
Fall 2017 – ELI
• 75% (6/8) of students scored a 3 or better
• Class average: 3.1
Fall 2016 – Hybrid
• 82% (9/11) of students scored a 3 or better
• Class average: 3.6
Fall 2016 – ELI
• 80% (8/10) scored a 3 or better
• Class average: 3.6
Fall 2014
• 91% (20/22) of students scored a 3 or better
• Class average: 3.5
Genetic engineering to produce products
Fall 2017 – ELI
• 87.5% (7/8) of students scored a 3 or better
• Class average: 3.6
Fall 2016 – Hybrid
• 91% (10/11) of students scored a 3 or better
• Class average: 3.6
Fall 2016 – ELI
• 90% (9/10) scored a 3 or better
• Class average: 3.7
Fall 2014
• 91% (20/22) of students scored a 3 or better
• Class average: 3.6
Techniques used in bio-manufacturing
Fall 2017 – ELI

In two categories, "genetic engineering to produce products" and "techniques used in bio-manufacturing," the class met the target goal of 80% or greater competency (87.5%). In the other two categories, "basic concepts of biotechnology" and "structure and functions of DNA and proteins," 75% of the class (6/8) showed competency, which is below the 80% target. The exam used for the assessment was the same as the one used by the ELI section in Fall 2016. A test pool was created and used in the administration of the ELI final exam. To improve student mastery of the second category, "structure and functions of DNA and proteins," the instructor added a lecture and a homework assignment to help students review the central dogma and related knowledge. To improve the success rate of this course, the instructor met with each student after each exam to review the exam questions and address any weaknesses as soon as possible. However, two students still struggled in this challenging course, which resulted in a 75% success rate. Next assessment of this SLO: Fall 2018.

Page 57: Annual Planning and Evaluation Report Instructional Programs … · report; if there is a question about an evaluation method, please contact the instructional program or OIESS)

47

Biotechnology, A.A.S. • 87.5% (7/8) students scored a 3 or better • Class average: 3.6 Fall 2016 – Hybrid • 72% (8/11) Students scored a 3 or better • Class average: 3.5 Fall 2016 – ELI • 80% (8/10) scored a 3 or better • Class average: 3.6 Fall 2014 • 95% (21/22) students scored a 3 or better • Class average: 3.7 The target for this SLO was for 80 percent or more of the students to be competent in each category listed above. We define competence as a student score of 3 or better. This target was met. Additionally, a cumulative score for every student was calculated with a target of 13 points (~80%) or greater. 75% of students (6/8) met this goal, as compared to 81% in Fall 2016. Current results improved: [ ] Yes [ X ] No [ ] Partially

Describe the ethical and regulatory aspects of the biotechnology industry.

Principles in Regulatory and Quality Environments for Biotechnology BIO 165 There were eight students enrolled in one online section. They were required to present their biotech product to the class by generating a presentation and sharing the link to the presentation on the discussion board. It was graded according to the following rubric:

1. A brief introduction of the company
2. A brief introduction of the biotech product
3. A flowchart that shows the process of how your company produces the biotech product
4. An action plan (a plan for quality control)
5. Attributions provided properly
6. The link to the presentation works well
7. Script attached

Sample size: (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
MA only           N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
ELI               1                          1                     8
Total             1                          1                     8
*Dual-enrollment

Semester/year data collected: Fall 2017 8 students took the course in an online format, and 7 of them submitted the presentation. Out of 55 total points, three students got a 55, three got 53, and one got 50. The class average was 53.

SLO Criteria Results:
1. 87.5% (7/8) students achieved
2. 87.5% (7/8) students achieved
3. 87.5% (7/8) students achieved
4. 87.5% (7/8) students achieved
5. 87.5% (7/8) students achieved
6. 50% (4/8) students achieved
7. 75% (6/8) students achieved

7 out of 8 students submitted their presentation on the biotechnological product they had been working on in the course. Due to the detailed rubric, all of the students who submitted the assignment achieved 90% or higher.


Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Apply the scientific method including: planning an experiment, collecting data, analyzing and interpreting data. [ X ] CT

Nucleic Acid Methods BIO 252
Seventeen students were enrolled in one section of BIO 252, offered at the Manassas Campus only. Students were tasked with completing two projects (a molecular cloning project and a DNA sequencing project), documenting the experiments in their scientific lab notebooks, and reporting the results in scientific paper format. The two lab notebook assignments were worth 30 points each and the two scientific papers were worth 50 points each; all scores were normalized to a 0-4 scale. Students were expected to receive an 80% or higher (3 or better) to demonstrate competency. Sample size: (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
MA only           1                          1                     17
DE*               N/A                        N/A                   N/A
ELI               N/A                        N/A                   N/A
Total             1                          1                     17

*Dual-enrollment

Semester/year data collected: Spring 2017
Target: Students were expected to achieve an 80% (score of 3 or above) to demonstrate competency.
Seventeen students were enrolled in BIO 252 in one section at the Manassas Campus. Only fifteen students completed both scientific paper assignments: one student submitted no paper at all and one student submitted one paper late, so these students were not included in the scientific paper data. Only sixteen students completed both notebook assignments; one student did not submit the DNA sequencing notebook assignment and is therefore not included in the lab notebook data.
Lab Notebook Assignments: 69% (11/16) of students scored 3 or better; the average score (n=14) was 3.4.
Scientific Papers: 47% (7/15) of students scored 3 or better; the average score (n=11) was 3.1.
Spring 2016: Fourteen students were enrolled in BIO 252. Only thirteen students completed the scientific paper assignment. Two students submitted very similar papers and were docked points as a result; these students were not included in the scientific paper data.
Lab Notebook Assignment: 64% (9/14) of students scored 3 or better; the average score (n=14) was 3.3.
Scientific Paper: 18% (2/11) of students scored 3 or better; the average score (n=11) was 2.9.
Current results improved if applicable: [ X ] Yes [ ] No [ ] Partially

In the last assessment of this SLO (2015-16), program faculty determined that students needed increased guidance in completing lab notebook assignments and in preparing scientific manuscripts. This was accomplished by improving alignment between the lecture portion of the course and the lab course. Students are now expected to participate in journal club activities in which they dissect and present scientific literature. Students were expected to achieve an 80% (score of 3 or above) for their application of the scientific method. The percentage of students receiving a 3 or better (80%) was only 47% (n=15) for the scientific paper. However, this is an increase of 29 percentage points as compared to the 2015-16 assessment (18%) and an increased average score (3.1) as compared to 2015-2016 (2.9). Student grades not only depend on their application of the scientific method, but also their writing skills. To compensate for this additional variable, a second paper assignment was added and students could use feedback from the first assignment to modify their writing skills. The percentage of students receiving 3 or better (80%) for the lab notebook assignment was 69% (n=16). The average score was 3.4. This is an increase in both metrics from 2015-16: 64% and 3.3, respectively. Due to the improved alignment of the lecture with the laboratory, student performance has increased since the last assessment period. In the future, the assessment method will exclude students’ writing skills when assessing their application of the scientific method.


Next assessment of this SLO: Spring 2019.

Program Goals Evaluation Methods Assessment Results Use of Results

Promote the Biotechnology Program to increase awareness, recruitment, and proper program placement.

Organization and participation at outreach events (Data: surveys and headcounts at events)

Outreach events:
1. STEM Day celebration, April 2018 – attended by more than 200 students, similar to the attendance reported in 2016-17.
2. Biotechnology Club members participated in campus student group activities to raise awareness of the Biotechnology Program and to reach out to NOVA students interested in the program or in a scientific career.

The Biotechnology Program has a target of increasing enrollment, or at least maintaining enrollment above 24 students for each new cohort, which begins each Fall semester. This goal was not met in Fall 2017, falling short by 16 students. This is partially due to the change of leadership in Summer 2017 and the decrease in biotechnology faculty members from four to two during the 2017-18 academic year. Despite these changes, faculty members and students still actively participated in outreach events, such as STEM Day, to reach out to the local community. To prevent the loss of applicants due to incorrect contact information on the outdated biotechnology homepage, the webpage will be updated soon to give clear application guidance to potential applicants. Assessed: Annually

Retain Biotechnology Program students.

Number of students now completing biotech lab exercises (Data: students enrolled in BIO101 labs at participating campuses)

Students completing the biotechnology lab exercise in BIO 101 from years 2012-18:

Year      MA     WO    AL     Total # of Students
2017-18   958    0     700    1658
2016-17   1008   948   912    2868
2015-16   1034   792   782    2608
2014-15   1043   869   1025   2937
2013-14   1084   192   1440   2716
2012-13   1018   168   1118   2304
2011-12   480    71    72     623

The Biotechnology Program attracts students for the AAS track primarily through recruitment in BIO 101 courses. Manassas and Alexandria Campuses continue to implement a Biotech-themed lab activity in the BIO 101 lab sections. The numbers of students who finished the Biotech-themed lab activity at Manassas and Alexandria Campuses were comparable to those of 2015-16. However, Woodbridge Campus stopped this lab activity because of budget issues. In the next academic year, efforts will be made to introduce this lab exercise to other campuses, such as Annandale Campus and Loudoun Campus. Assessed: Annually


Graduate Biotech Program students.

Students who graduate each semester by degree type (Data: OIR)

Number of graduates for years 2011-18:

Year      AAS   CSC   Total # of Graduates
2017-18   2     0     2
2016-17   5     6     11
2015-16   10    5     15
2014-15   9     5     14
2013-14   12    6     18
2012-13   5     1     6
2011-12   8     0     8

The number of graduates in 2017-18 was 2, down from 11 in 2016-17. The decrease can be attributed to the smaller class recruited in Fall 2017. There remains a disconnect between the number of students completing the program requirements and the number who apply for graduation. As stated above, faculty are working on a more comprehensive database to better track students and to implement more targeted advising that urges students to apply for graduation; it is common for students to obtain employment or transfer and forget to apply. Assessed: Annually

Program placement goal: grow enrollments in both the AAS and CSC programs.
• Number of students program-placed in the Biotechnology Program (Data: OIR report)
• Number of program-placed students designated as active (Data: comparison of program-placed students with those enrolled in biotech curricula)
• Number of students who persist in the program from Fall to the consecutive Spring semester (Data: queries of enrollment and program placement of students in all biotech courses)
NOTE: Active students are defined as students who are program-placed and have taken at least one biotech course. They remain active until they go three semesters without enrolling in a biotech class.
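
To make the "active student" rule above concrete, here is a minimal hypothetical sketch (illustrative Python only; the program's actual tracking database and queries are not shown in this report, and all names are invented) of how active status could be derived from a student's biotech enrollment history:

def is_active(program_placed, enrolled_semesters, current_semester):
    # enrolled_semesters: indices of semesters in which the student took a
    # biotech course (e.g., 0 = Fall 2012, 1 = Spring 2013, ...).
    # A program-placed student who has taken at least one biotech course stays
    # active until three semesters pass with no biotech enrollment.
    if not program_placed or not enrolled_semesters:
        return False
    return (current_semester - max(enrolled_semesters)) < 3

# Example: last biotech course in semester 5; by semester 8 the student has
# gone three semesters without enrolling and is no longer counted as active.
print(is_active(True, [2, 4, 5], 8))  # False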

Target: retention rates of 80% for each category assessed. This target was met for both the AAS and CSC student populations.
Number of students enrolled in the program, 2012-17:

Term          AAS Enrollment   Certificate Enrollment
Fall 2017     45               8
Fall 2016     87               20
Fall 2015     75               7
Fall 2014     96               13
Spring 2014   66               11
Fall 2013     92               12
Spring 2013   100              26
Fall 2012     97               26

Active in the Program:
Fall 2017: 12/53 (28%)
Fall 2016: 20/80 (25%)
Fall 2015: 22/83 (27%)
Fall 2014: 45/109 (42%)
Spring 2014: 47/77 (61%)
Fall 2013: 45/104 (43%)
Spring 2013: 28/100 (28%)
Fall 2012: 32/116 (27%)

Retention rate of all students enrolled in Fall biotech courses:
Fall 2017-Spring 2018: 7/8 (88%)
Fall 2016-Spring 2017: 18/20 (90%)
Fall 2015-Spring 2016: 12/17 (71%)

Based on OIR data, Fall AAS enrollment decreased from 87 students in Fall 2016 to 45 in Fall 2017. An additional 8 students were placed in the CSC track, for a total of 53 students in the Biotechnology Program. Overall, 8 newly program-placed students enrolled in Biotechnology courses this academic year, and 4 previously program-placed students took other Biotechnology requirements. The remaining 41 students fall into three categories: 1) they need additional courses to complete the program, 2) they are improperly placed in the Biotechnology Program (i.e., not intending to complete it), or 3) they completed the program but have not applied for graduation. To better track the status and progress of biotechnology students, the Biotechnology Program head is currently creating a comprehensive database of incoming and current Biotechnology students, with the dual aim of identifying students who have completed the program but have not yet applied for graduation and students who do not intend to complete the program.


Fall 2014-Spring 2015: 28/39 (72%)
Fall 2013-Spring 2014: 21/40 (52.5%)
Fall 2012-Spring 2013: 30/36 (83%)

Retention rates of Active Students enrolled in Fall biotech courses:
Term                      AAS            CSC             Overall
Fall 2017-Spring 2018     6/7 (86%)      1/1 (100%)      7/8 (88%)
Fall 2016-Spring 2017     8/10 (80%)     10/10 (100%)    18/20 (90%)
Fall 2015-Spring 2016     7/11 (64%)     5/6 (83%)       12/17 (71%)
Fall 2014-Spring 2015     18/21 (86%)    10/13 (77%)     28/34 (82%)
Fall 2013-Spring 2014     20/27 (74%)    4/4 (100%)      24/31 (77%)
Fall 2012-Spring 2013     17/19 (89.5%)  11/13 (85%)     28/32 (87.5%)

This database currently has all biotechnology students program placed since Fall 2017. Starting from the 2016-17 academic year, Biotechnology courses were offered only on the Manassas Campus. This has helped decrease the number of students taking Biotechnology courses who are not program placed at the time of enrolling in the Fall Biotechnology courses, as is evident in the 88% retention rate in 2017-18 and the 90% retention rate in 2016-17. This action was implemented by the Dean and Associate Dean. Enrollments in the core biotechnology courses BIO 250 and BIO 253 decreased from the 2016-17 numbers (from 20 to 8). This decrease is possibly due to the change in program head in Summer 2017, which negatively affected recruitment in Summer 2017. The overall fall-to-spring retention rate for students enrolled in Fall Biotech courses was 88% (7/8) in 2017-18, which is comparable to 90% (18/20) in 2016-17 and better than the 2015-16 rate of 71%. Assessed: Annually

Obtain high student success in courses.

Percent of students who obtain a grade of B or higher (Data: grade rosters)

Target: Each class has a target of 80% success. 75% (6/8) of courses met this target in 2016-17.
Spring 2018: BIO180: 6/8 students successful (75%); BIO252: 5/8 students successful (63%); BIO254: 9/9 students successful (100%); BIO255: 9/9 students successful (100%)
Fall 2017: BIO147: 6/10 students successful (60%); BIO165: 4/8 students successful (50%); BIO250: 4/8 students successful (50%); BIO253: 5/8 students successful (63%)
Spring 2017: BIO252: 13/17 students successful (76%); BIO254: 15/15 students successful (100%); BIO255: 16/18 students successful (89%)

There has been extensive change to the Biotechnology Program faculty in the past year. Two of four faculty members left NOVA. As a result, four Biotech core courses are now taught by new faculty members. In 2017-18, only 25% of courses met the target of 80% or more of students earning a “B” or better. Some of the courses, such as BIO 250 and BIO 252, continued to be challenging for the students despite the measures adopted to improve student success in the past year. In the future, prompt communication and collaboration among Biotech faculty members will be practiced to identify students who struggle in these courses.


Fall 2016: BIO147: 19/21 students successful (90%); BIO165: 22/23 students successful (96%); BIO180: 16/17 students successful (94%); BIO250: 14/20 students successful (70%); BIO253: 18/20 students successful (90%)
Spring 2016: BIO252: 12/14 students successful (86%); BIO254: 12/14 students successful (86%); BIO255: 12/13 students successful (92%)
Fall 2015: BIO147: 12/14 students successful (86%); BIO165: 12/13 students successful (92%); BIO180: 16/16 students successful (100%); BIO250: 11/16 students successful (69%); BIO253: 13/15 students successful (87%)
Spring 2015: BIO252: 17/23 students successful (74%); BIO253: 2/4 students successful (50%); BIO254: 27/28 students successful (96%); BIO255: 21/25 students successful (84%)
Fall 2014: BIO147: 22/27 students successful (81%); BIO165: 21/24 students successful (88%); BIO180: 30/30 students successful (100%); BIO250: 19/26 students successful (73%); BIO253: 27/29 students successful (93%)
Spring 2014: BIO147: 8/15 students successful (53%); BIO165: 4/4 students successful (100%); BIO250: 9/12 students successful (75%); BIO252: 9/15 students successful (60%); BIO254: 12/16 students successful (75%); BIO255: 9/17 students successful (53%); BIO253: 11/16 students successful (69%)
Fall 2013: BIO147: 17/26 students successful (65%); BIO165: 21/23 students successful (91%); BIO180: 16/25 students successful (64%); BIO250: 17/23 students successful (74%); BIO253: 25/36 students successful (69%)

Students seemed to have more difficulties in the fall courses of their first semester entering the program. More advising and mentoring will be given to the incoming biotech students in the fall semester in the future. Assessed: Annually


Spring 2013: BIO253: 13/13 students successful (100%); BIO250: 7/7 students successful (100%); BIO252: 9/12 students successful (75%); BIO254: 22/23 students successful (96%); BIO165: 8/11 students successful (73%); BIO180: 11/11 students successful (100%); BIO147: 4/5 students successful (80%)
Fall 2012: BIO250: 13/15 students successful (87%); BIO253: 14/16 students successful (87%); BIO251: 8/11 students successful (73%); BIO165: 13/15 students successful (86%); BIO180: 13/14 students successful (93%)

Strengthen relationships with local biotechnology industry.

Number of company partners (as designated by participation in advisory committee or as internship supervisor). Involvement of company reps as guest speakers in classes or at biotechnology events. New course development or course enhancement in response to industry need.

Target: To maintain or increase the number of internship host sites and to maintain or increase the presence of industry and academic partners in courses. Both targets were met. Current partners include: Bode Technology, Ceres Nanosciences, HHMI, INOVA, GMU, ATCC, Innovate Tech Ventures, NOVA Alexandria, NOVA Manassas, Mediatech, and Quest. Placement of students in internships:

Year        # of Students Placed   # of Internship Locations
2017-2018   7                      6
2016-2017   11                     7
2015-2016   20                     7
2014-2015   14                     6
2013-2014   17                     6
2012-2013   12                     6
2011-2012   9                      --

Participation of industry in program: Representatives doing mock interviews with students: Bode Technology (Sayed Mosavi), ATCC (Solmaz Eskandarinezhad & Loricel Champ). Representatives evaluating student “Entrepreneur Project”: MediaTech (Brian Posey), IQVIA (Mitch DeKoven), United States Consumer Product Safety Commission (Jay Kadiwala).

Representatives from industry, government, and academia are continuously involved in outreach events and classroom workshops, and they serve as guest speakers in classes. We had 6 industry and academic guest speakers in 2017-18, comparable to 9 in 2016-17, 10 in 2015-16, 13 in 2014-15, 16 in 2013-14, and 12 in 2012-13. Biotechnology faculty continually engage potential industry and academic partners to increase the number of guest speakers, internship opportunities, and career placements. A new internship pathway at Virongy for students in Summer 2018 was added to the growing list of opportunities available to Biotechnology students. These new opportunities are needed as student enrollment is expected to increase and student interests become more diversified. In 2017-18, we placed 7 students in internships at 6 different host sites. To increase the number of sites available to Biotechnology students for internships, we continually petition our advisory board for increased opportunities at their sites and for connections to their industry partners to further diversify internship locations. This request will be repeated at the Fall 2018 advisory board meeting.


Guest speakers in classes:

• 8 guest speakers in 2016-17
• 9 guest speakers in 2016-17
• 10 guest speakers in 2015-16
• 13 guest speakers in 2014-15
• 16 guest speakers in 2013-14
• 12 guest speakers in 2012-13
• 6 guest speakers in 2011-12

The program head routinely attended meetings hosted by Prince William County Department of Economic Development, met with local biotech industry leaders, participated in local high school career fair, and worked with GMU faculty members to form and fortify collaborations. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Business Administration, A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Associate of Science degree curriculum in Business Administration is designed for persons who plan to transfer to a four-year college or university to complete a baccalaureate degree program in Business Administration with a major in Accounting, Business Management, Decision Science and Management, Information Systems, Finance, Marketing, etc.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to describe the various theories related to the development of leadership skills, motivation techniques, teamwork and effective communication.

Introduction to Business BUS 100
Direct Measure: Criteria: a = know; b = understand; c = apply
Method: Short Answer Questions
a. List the 3 major leadership styles
b. Describe the differences between the 3 major leadership styles
c. Give specific example of each leadership style
No rubric provided
Sample Size (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL      7     4     54
AN      21    7     85
MA      6     2     0
LO      13    5     56
ME      N/A   N/A   N/A
WO      0     0     N/A
ELI     13    9     0
DE*     N/A   N/A   N/A
Total   60    27    195

*Dual-enrollment

Data collected: Fall 2017 Target: 70% of students will score 70% or higher overall

• Criteria a: 75% of students will score 70% or higher overall

• Criteria b: 70% of students will score 70% or higher overall

• Criteria c: 65% of students will score 70% or higher overall

Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):

Results by Campus/Modality (Fall 2017):
Campus/Modality   Average Score   Percent > 70%
AL      75.72   59.26
AN      85.63   64.71
MA      DNR     DNR
ME      N/A     N/A
LO      92.14   83.93
WO      N/A     N/A
ELI     DNR     DNR
DE      N/A     N/A
Total   90.08   69.74

DNR: Did not report the data

Results by SLO Criteria:
Criteria/Question Topics   Fall 2017 Average Score   Fall 2017 % Students > target   Spring 2016 Average Score   Spring 2016 % Students > target

a       92.69   77.95   **   79
b       89.58   74.36   **   71
c       87.98   78.46   **   56
Total   90.08   69.74   **   68.6
** Did not record average scores by criteria

Current results improved: [ x ] Yes [ ] No [ ] Partially
Strengths by Criterion: The overall raw scores are exceptional across the criteria, and the overall percentage of students above the target has risen to the standard, with significant improvement in the "apply" criterion, which the cluster sees as a measure of critical thinking skills.
Weaknesses by Criterion: A comparison of the total raw scores to the percentage of students above the target indicates that student knowledge is neither homogeneous across all criteria nor consistent across campuses. In addition, the overall target was barely met.

Previous action(s) to improve SLO: The cluster agreed (beginning Fall 2016) to use interactive classroom exercises such as "Imagine you are stranded on an island with your classmates; what leadership skills will you need to get people to work together to ensure the group's survival?" to emphasize understanding and application.
Target Met: [ x ] Yes [ ] No [ ] Partially
Based on recent results, areas needing improvement: A comparison of the raw scores to the percentage of students above the target indicates that student knowledge is neither homogeneous across all criteria nor consistent across campuses.
Current actions to improve SLO based on the results: To address consistency across the campuses, the program must first improve reporting across all campuses and ELI to determine whether this inconsistency is broad across the discipline or narrow with regard to individual campuses. In the meantime, the program will discuss the possible sources of these inconsistencies. The program will also discuss the cause of the non-homogeneous results across criteria and the potential link to inconsistent presentation of the material. Finally, the fact that the overall target was barely met might be a statistical anomaly based on the choice of three equally weighted criteria and a 70% target: 70% of the students must answer all three questions correctly or the target will not be met. This SLO was assessed in a course in which the topic is introduced, yet the target expects mastery. The program will discuss either weighting the criteria questions when calculating the overall standard or adjusting the standards to reflect this fact when the curriculum map is reviewed by the Business Program curriculum committee in Spring 2019.
Next assessment of this SLO: The curriculum committee for the program has an agenda item to revisit, rewrite and reschedule the curriculum map during their next meeting.
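
To illustrate the weighting concern discussed above, the following is a small hypothetical check (illustrative Python, not part of the program's reporting tools) of how three equally weighted criteria interact with the 70% overall target when each criterion is scored simply as correct or incorrect: a student who misses any one criterion tops out at roughly 66.7%, so only students who answer all three correctly can count toward the overall target.

def overall_score(criterion_results):
    # criterion_results: 1 for a correct criterion, 0 for an incorrect one;
    # all three criteria are weighted equally.
    return sum(criterion_results) / len(criterion_results)

print(overall_score([1, 1, 1]))  # 1.0 -> counts toward the 70% target
print(overall_score([1, 1, 0]))  # ~0.667 -> falls below the 70% cutoff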

Students will apply the planning, organizing, leading and control processes of management in identifying the various theories related to the development of leadership skills.

Introduction to Business BUS 100
Direct Measure: Criteria: a = know; b = understand; c = apply
Method: Short Answer Questions
a. List the four functions of management
b. Explain the importance of each function
c. Give an example of each function in practice
Rubric for a: Planning, Organizing, Leading (Directing), and Controlling. No rubric provided for b or c.
Sample Size (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL      7     4     54
AN      21    7     85
MA      6     2     0
LO      13    5     56
ME      N/A   N/A   N/A
WO      0     0     N/A
ELI     13    9     0
DE*     N/A   N/A   N/A
Total   60    27    195
*Dual-enrollment

Data collected: Spring 2018
Target: 70% of students will score 70% or higher overall
• Criteria a: 75% of students will score 70% or higher
• Criteria b: 70% of students will score 70% or higher
• Criteria c: 65% of students will score 70% or higher

Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):
Campus/Modality   Average Score (Fall 2017)   Percent > 70%
AL      63.86   38.89
AN      78.7    62.35
MA      DNR     DNR
ME      N/A     N/A
LO      93.37   85.71
WO      N/A     N/A
ELI     DNR     DNR
DE      N/A     N/A
Total   88.28   64.1
DNR: Did not report the data

Results by SLO Criteria:
Criteria/Question Topics   Fall 2017 Average Score   Fall 2017 % Students > target   Spring 2016 Average Score   Spring 2016 % Students > target
a.      90.28   81.54   **   87
b.      88.34   78.97   **   71
c.      86.18   67.18   **   46
Total   88.28   64.1    **   57
** Did not report average scores by criteria

Current results improved: [ x ] Yes [ ] No [ ] Partially
Strengths by Criterion: The overall raw scores are exceptional across the criteria, and the overall percentage of students above the target has risen, with significant improvement in the "apply" criterion, which the cluster sees as a measure of critical thinking skills.
Weaknesses by Criterion: A comparison of the raw scores to the percentage of students above the target indicates that student knowledge is neither homogeneous across all criteria nor consistent across campuses.

Previous action(s) to improve SLO: The cluster agreed (beginning with Fall 2016) to use interactive classroom exercises such as "Imagine you own your own business; give examples of each of these management functions in the day-to-day operation of your business" to emphasize application.
Target Met: [ ] Yes [ ] No [ x ] Partially
Based on recent results, areas needing improvement: A comparison of the raw scores to the percentage of students above the target indicates that student knowledge is neither homogeneous across all criteria nor consistent across campuses.
Current actions to improve SLO based on the results: To address consistency across the campuses, the program must first improve reporting across all campuses and ELI to determine whether this inconsistency is broad across the discipline or narrow with regard to individual campuses. In the meantime, the program will discuss the possible sources of these inconsistencies. The program will also discuss the cause of the non-homogeneous results across criteria and the potential link to inconsistent presentation of the material. Finally, the fact that the overall target was only partially met might be a statistical anomaly based on the choice of three equally weighted criteria and a 70% target: 70% of the students must answer all three questions correctly or the target will not be met. This SLO was assessed in a course in which the topic is introduced, yet the target expects mastery. The program will discuss either weighting the criteria questions when calculating the overall standard or adjusting the standards to reflect this fact when the curriculum map is reviewed by the Business Program curriculum committee in Spring 2019.
Next assessment of this SLO: The curriculum committee for the program has an agenda item to revisit, rewrite and reschedule the curriculum map during their next meeting.


Students will be able to utilize computer skills through the Internet, word processing, and other productivity software to construct reports and presentations of current business practices.

Interpersonal Dynamics in the Business Organization BUS 270
Direct Measure: Criteria: a = know; b = understand; c = apply
Method: Short Answer/Multiple Choice
A. technology used in business communication
B. digital messages
C. e-mails
No rubric for a & b; rubric for c
Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL      1     1     20
AN      1     1     22
MA      1     1     50
LO      3     3     77
ME      N/A   N/A   N/A
WO      1     1     7
ELI     N/A   N/A   N/A
DE*     N/A   N/A   N/A
Total   8     8     176

*Dual-enrollment

Data collected: Spring 2018 Target: 70% of students will score 70% or higher overall

• Criteria a: 75% of students will score 70% or higher • Criteria b: 70% of students will score 70% or higher • Criteria c: 65% of students will score 70% or higher

Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):

Campus/Modality   Spring 2018 Average Score   % Students > 70%
AL      94.17   95
AN      92.42   86.36
MA      91.2    92
ME      N/A     N/A
LO      84.54   84.42
WO      95.24   100
ELI     N/A     N/A
DE*     N/A     N/A
Total   86.68   88.64

Results by SLO Criteria:

Criteria/Question Topics   Spring 2018 Average Score   Spring 2018 % Students > target   Spring 2017 Average Score   Spring 2017 % Students > target
a.      90.12   89.20   **   66
b.      88.33   88.64   **   63
c.      79.69   82.95   **   55
Total   86.68   88.64   **   62

**Did not report average scores by criteria Current results improved: [ x ] Yes [ ] No [ ] Partially Strengths by Criterion: The reported results for both raw scores and students scoring > target were outstanding across all campuses and criteria. Weaknesses by Criterion: The weakest results were for the criteria “apply” which the cluster considers to be a measure of critical thinking skills.

Previous action(s) to improve SLO: The cluster believed that this course was new for the college and the curriculum was still developing. These results were relayed to the instructors for all sections going forward to provide some structure to that development. The improvement year over year is an indication that this action may have had beneficial results. The cluster will use this same method to assess this SLO in the future and monitor the results to verify that these actions were, in fact, beneficial and that they should be incorporated into a standard procedure for all SLO assessments. Target Met: [ x ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: These results will be provided to instructors for the course so that they can discuss appropriate communication through e-mail with their students. Current actions to improve SLO based on the results: The cluster will continue to provide feedback to teaching faculty regarding SLO results. Given that the weakest results were reported for the third criteria assessment, it is important that faculty take every opportunity to distinguish between appropriate and inappropriate content for official communications in all program courses. Next assessment of this SLO: The curriculum committee for the program has an agenda item to

revisit, rewrite and reschedule the curriculum map during their next meeting.

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Students will be able to calculate the basic impact of marginal cost for the production of goods in a capitalist system. [ x ] QR

Principles of Micro Economics ECON 202
Direct Measure: Calculate the average total, fixed, and marginal costs for a "competitive" firm given a certain production cost schedule.
1. determining the efficient level of output
2. calculating output based on market price
3. calculating total profit
No rubric provided
Sample Size (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL      5     DNR   0
AN      15    2     39
MA      9     DNR   0
LO      10    1     21
ME      N/A   N/A   N/A
WO      7     1     16
ELI     N/A   N/A   N/A
DE*     N/A   N/A   N/A
Total   46    4     76

*Dual-enrollment DNR: Did not report the data Data not parsed by A.S. or A.A.S. Program
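
For readers unfamiliar with the exercise, the following is a rough, hypothetical sketch (illustrative Python; the cost schedule, price, and variable names are invented and not taken from the ECON 202 assessment) of the kind of calculation the three criteria describe: deriving marginal cost from a total-cost schedule, choosing the output level a competitive firm would produce at a given market price, and computing total profit.

# Illustrative total-cost schedule (quantity -> total cost) and an assumed market price.
total_cost = {0: 10, 1: 14, 2: 20, 3: 28, 4: 38, 5: 50}
price = 10

# Marginal cost of each additional unit.
marginal_cost = {q: total_cost[q] - total_cost[q - 1] for q in total_cost if q > 0}

# With rising marginal cost, a competitive firm produces every unit whose
# marginal cost does not exceed the market price.
efficient_q = max((q for q, mc in marginal_cost.items() if mc <= price), default=0)

# Total profit = total revenue - total cost at that output level.
profit = price * efficient_q - total_cost[efficient_q]

print(marginal_cost)        # {1: 4, 2: 6, 3: 8, 4: 10, 5: 12}
print(efficient_q, profit)  # 4 2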

Data collected: Spring 2018
Target: 65% of students will score 65% or higher overall and on each criterion.
Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):

Campus/Modality   Spring 2018 Average Score   Percent > 65
AL      DNR     DNR
AN      74.07   66.67
MA      DNR     DNR
ME      N/A     N/A
LO      42.86   38.10
WO      62.5    68.75
ELI     DNR     DNR
DE      N/A     N/A
Total   68.04   59.21

DNR: Did not report the data
Results by CLO Criteria:

Question/Topics   Spring 2018 Average Score   Spring 2018 % Students > target
1.      82.66   65.79
2.      54.96   53.95
3.      53.44   51.32
Total   68.04   65.79

Current results improved: [ ] Yes [ ] No [ ] Partially [x] NA (First time assessed) Strengths by Question: Students were strongest on question 1 which required that they fill out the table and understand the concept of efficiency. Weaknesses by Question: The students were weak at understanding marginal cost as related to marginal revenue and projecting profits based on total cost and total revenue.

Previous action(s) to improve CLO if applicable: First time assessment Target Met: [ ] Yes [ ] No [ x ] Partially Based on recent results, areas needing improvement: The students were weak at understanding marginal cost as related to marginal revenue and projecting profits based on total cost and total revenue. Two of the campuses and ELI did not report results. Current actions to improve CLO based on the results: These results (and the failure to report) were communicated to the SLO lead, the discipline, the pathway dean, and the pathway provost and remedies were discussed at the Business and Accounting Pathway Council meeting in Fall 2018. The pathway dean and provost will follow up by speaking with their peers from other disciplines and ELI to develop a college-wide system to ensure ELI results are included in future assessments. The SLO lead will raise the issue again with the pathway dean at the January 2019 college-wide discipline meeting. Next assessment of this CLO: The curriculum committee for the program has an agenda item to revisit, rewrite and reschedule the curriculum map during their next meeting.


Program Goals Evaluation Methods Assessment Results Use of Results

To increase the number of students program placed in the Business Administration, A.S. program.

Distribution of Program Placed Students by curriculum and award type (Data: OIR Fact Book 2013-14 through 2017-18)

Target: Program graduates as a % of students placed in the program outpace the increase in all A.S. graduates as a % of all A.S. students. Results for Past 5 years:

Fall   Program: Number of Students   College: Number of Students   Program: % Change   College: % Change
2017   5020   28811   (10)   (3)
2016   5539   29808   (3)    (0.2)
2015   5726   30452   3      0.5
2014   5538   30297   1      1
2013   5508   29853   9      4

Target Met: [ ] Yes [x ] No [ ] Partially

Previous action(s) to improve program goal: The cluster is working with GMU regarding their proposed Bachelor of Science in Business - an online focused program targeting older adults. This program will have the same core requirements as their BA in Business. Adult learners would then, theoretically, enroll as A.S. Business Administration students to take classes that would be prerequisites to the GMU program. This program is not yet in place and the target date is dependent on GMU’s ability to successfully navigate The State Council of Higher Education’s requirements. Results improved: [ ] Yes [ x ] No [ ] Partially Current action(s) to improve program goal: At this time, the program is captive to the demographics in the high schools and the diversion of students into Computer Science and IT related programs. Another issue raised by the discipline members is that both the A.S. Business Administration and A.A.S. Business Management Programs are sensitive to the employment cycle in that the demand for these degrees is highest when unemployment levels are high. To diversify enrollment risk, the program will continue the partnership with GMU’s Advance Program, targeted at non-traditional students, so that the college will be ready to take advantage of the opportunity to increase enrollment by attracting older students.

Assessed: Annually

To encourage students placed in the Business Administration, A.S. program to complete the degree.

Number of Graduates by Degree and Specialization, 2017-18 (Data: OIR Data for Annual Planning and Evaluation Reports, Report 44-18)

Target: Program graduates as a percentage of students placed in the program outpace the increase in all A.S. graduates as a percentage of all A.S. students. Results for Past 5 Years:

Academic Year   Program: Number of A.S. Graduates   College: Number of A.S. Graduates   Program: % Change   College: % Change
2017-2018   845    3979   (6)    0.3
2016-2017   902    3967   (11)   (8)
2015-2016   1015   4316   0.1    3
2014-2015   1006   4171   0.2    3
2013-2014   1004   4046   2      1

Target Met: [ ] Yes [ x] No [ ] Partially Results improved: [ ] Yes [ ] No [ x ] Partially

Previous action to improve program goal: The cluster is working with GMU regarding their proposed Bachelor of Science in Business, an online focused program targeting older adults. This program will have the same core requirements as their BA in Business. Adult learners would then, theoretically, enroll as A.S. Business Administration students to take classes that would be prerequisites to the GMU program. This program is not yet in place and the target date is dependent on GMU’s ability to successfully navigate The State Council of Higher Education’s requirements. At the same time, cluster members participated in the Structured Pathways committees to ensure that the curriculum provided students with an efficient path to the other major transfer institutions: Old Dominion University, Virginia Commonwealth University, Virginia Tech, and James Madison University. Results improved: [ ] Yes [ ] No [ x ] Partially Current actions to improve program goal: At this time, the program is captive to the demographics in the high schools and the diversion of students into Computer Science and IT related programs. Another issue raised by the discipline members is that both the A.S. Business Administration and A.A.S. Business Management Programs are sensitive to the employment cycle in that the demand for

these degrees is highest when unemployment levels are high. To diversify enrollment risk, the program will continue the partnership with GMU's Advance Program, targeted at non-traditional students, so that the college will be ready to take advantage of the opportunity to increase enrollment by attracting older students.
Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Business Management, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Associate of Applied Science degree curriculum in Business Management is designed for persons who seek employment in business management or for those presently in management who are seeking promotion. The occupational objectives include administrative assistant, management trainee, department head, branch manager, office manager, manager of small business, and supervisor.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to describe the various theories related to the development of leadership skills, motivation techniques, teamwork and effective communication.

Principles of Management BUS 200
Direct Measure: Criteria: a = know; b = understand; c = apply
Method: Short Answer Questions
a. leadership styles
b. differences between leadership styles
c. examples of leadership styles
No rubric provided
Sample Size (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL      1     1     23
AN      2     1     9
MA      1     1     0
LO      1     1     15
ME      N/A   N/A   N/A
ELI     4     DNR   0
DE*     N/A   N/A   N/A
Total   9     4     47

*Dual-enrollment DNR = Did not report

Data collected: Fall 2017 Target: 70% of students will score 70% or higher overall

• Criteria a: 75% of students will score 70% or higher overall

• Criteria b: 70% of students will score 70% or higher overall

• Criteria c: 65% of students will score 70% or higher overall

Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):

Results by Campus/Modality (Fall 2017):
Campus/Modality   Average Score   Percent > target
AL      89.37   86.96
AN      82.72   77.78
MA      DNR     DNR
ME      N/A     N/A
LO      56.11   46.67
WO      N/A     N/A
ELI     DNR     DNR
DE      N/A     N/A
Total   80.03   72.34

Results by SLO Criteria:
Criteria   Fall 2017 Average Score   Fall 2017 % Students > target   Spring 2017 Average Score   Spring 2017 % Students > target

1.      90.87   87.23   **   63
2.      77.38   72.34   **   59
3.      71.83   68.09   **   59
Total   80.03   72.34   **   DNR
** Did not report average score by criteria

Current results improved: [ x ] Yes [ ] No [ ] Partially
Strengths by Criterion: The average scores were excellent, and the percentage of students above the target met the standards for all three criteria.
Weaknesses by Criterion: Although Manassas collected data in the form of raw assessments, the assessments were ungraded and a request to resubmit the data was not honored. In addition, the results were not homogeneous across the campuses.

Previous action(s) to improve SLO: Since the sample size was too small in the Spring 2017 assessment for meaningful comparison, this SLO was assessed again in Fall 2017, and the cluster examined those results in Fall 2018 to make a comparison and to develop recommendations.
Target Met: [ x ] Yes [ ] No [ ] Partially
Based on recent results, areas needing improvement: Although the overall results met the standards, these results were not homogeneous across the reporting campuses. In addition, obtaining assessments from ELI continues to be problematic.
Current actions to improve SLO based on the results: The weakest campus reported that the assessment was framed as extra credit, whereas the other campuses integrated the questions into graded assessments. By offering the assessment as extra credit, students were given a penalty-free option not to provide an answer, which would then be counted as incorrect. The cluster agreed that the next time this SLO is assessed it should not be framed as extra credit. The ELI issue is systemic to the college; the cluster feels that SLO assessments should be administered centrally. These results (and the failure to report) were communicated to the SLO lead, the discipline, the pathway dean, and the pathway provost, and remedies were discussed at the Business and Accounting Pathway Council meeting in Fall 2018. The pathway dean and provost will follow up by speaking with their peers from other disciplines and ELI to develop a college-wide system to ensure ELI results are included in future assessments. The SLO lead will raise the issue again with the pathway dean at the January 2019 college-wide discipline meeting.
Next assessment of this SLO: The curriculum committee for the program has an agenda item to revisit, rewrite and reschedule the curriculum map during their next meeting.

Students will apply the planning, organizing, leading and control processes of management in identifying the various theories related to the development of leadership skills.

Principles of Management BUS 200
Direct Measure: Criteria: a = know; b = understand; c = apply
Method: Short Answer Questions
a. functions of management
b. the importance of each function
c. examples of each function
Sample Size (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL      1     1     23
AN      2     1     9
MA      1     1     0
ME      N/A   N/A   N/A
LO      1     1     15
WO      N/A   N/A   N/A
ELI     4     DNR   0
DE*     N/A   N/A   N/A
Total   9     4     47
*Dual-enrollment; DNR = Did not report

Data collected: Fall 2017
Target: 70% of students will score 70% or higher overall
• Criteria a: 75% of students will score 70% or higher overall
• Criteria b: 70% of students will score 70% or higher overall
• Criteria c: 65% of students will score 70% or higher overall

Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):
Campus/Modality   Average Score (Fall 2017)   % Students > target
AL      90.91   95.65
AN      97.72   100
MA      DNR     DNR
ME      N/A     N/A
LO      72.78   53.33
WO      N/A     N/A
ELI     DNR     DNR
DE*     N/A     N/A
Total   88.90   82.98

Results by SLO Criteria:
Criteria/Question Topics   Fall 2017 Average Score   Fall 2017 % Students > target   Spring 2017 Average Score   Spring 2017 % Students > target
1.      94.30   95.74   **   86
2.      90.29   91.49   **   76
3.      82.12   82.98   **   63
Total   88.90   82.98   **   DNR
** Did not report average score by criteria

Current results improved: [ x ] Yes [ ] No [ ] Partially
Strengths by Criterion/Question/Topic: The average scores were excellent, and the percentage of students above the target met the standards for all three criteria.
Weaknesses by Criterion/Question/Topic: Although Manassas collected data in the form of raw assessments, the assessments were ungraded and a request to resubmit the data was not honored. In addition, the results were not homogeneous across the campuses.

Previous action(s) to improve SLO: Since the sample size was too small in the Spring 2017 assessment for meaningful comparison, this SLO was assessed again in Fall 2017, and the cluster examined those results in Fall 2018 to make a comparison and to develop recommendations.
Target Met: [ x ] Yes [ ] No [ ] Partially
Based on recent results, areas needing improvement: Although the overall results met the standards, these results were not homogeneous across the reporting campuses. In addition, obtaining assessments from ELI continues to be problematic.
Current actions to improve SLO based on the results: The weakest campus reported that the assessment was framed as extra credit, whereas the other campuses integrated the questions into graded assessments. By offering the assessment as extra credit, students were given a penalty-free option not to provide an answer, which would then be counted as incorrect. The cluster agreed that the next time this SLO is assessed it should not be framed as extra credit. The ELI issue is systemic to the college; the cluster feels that SLO assessments should be administered centrally.
Next assessment of this SLO: The curriculum committee for the program has an agenda item to revisit, rewrite and reschedule the curriculum map during their next meeting.

Students will be able to utilize computer skills through the Internet, word processing, and other productivity software to construct reports and presentations of current business practices.

Principles of Management BUS 200
Direct Measure: Criteria: a = know; b = understand; c = apply
Method: Short Answer Questions
1. technologies used in business communication
2. digital messages
3. e-mails

Data collected: Spring 2018 Target: 70% of students will score 70% or higher overall

• Criteria a: 75% of students will score 70% or higher overall

• Criteria b: 70% of students will score 70% or higher overall

• Criteria c: 65% of students will score 70% or higher overall

Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):

Previous action(s) to improve SLO: First-time assessment. Target Met: [ ] Yes [ ] No [ x ] Partially Based on recent results, areas needing improvement: The results were not consistent across the campuses and the overall standard was not met.


Provided Rubric Criteria or Question Topics Sample Size (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL      2     1     4
AN      1     1     14
MA      2     1     3
ME      N/A   N/A   N/A
LO      1     1     19
WO      1     DNR   DNR
ELI     3     DNR   DNR
DE*     N/A   N/A   N/A
Total   10    4     40

*Dual-enrollment DNR = Did not report

Campus/Modality   Spring 2018 Average Score   Percent > target
AL      91.67   75
AN      66.67   50
MA      86.67   100
ME      N/A     N/A
LO      77.82   73.68
WO      N/A     N/A
ELI     DNR     DNR
DE*     N/A     N/A
Total   75.98   67.50

Results by SLO Criteria:

Criteria/Question Topics   Spring 2018 Average Score   Spring 2018 % Students > target
1.      84.91%   77.50%
2.      69.48%   70%
3.      68.96%   70%
Total   75.98%   67.50%

Current results improved: [ ] Yes [ ] No [ ] Partially [ x ] N/A (First-time assessment) Strengths by Criterion: The strongest results were reported for the criteria “know”. Weaknesses by Criterion: The weakest results were reported for the criteria “apply”. The overall standard was not met.

In addition, obtaining assessments from ELI continues to be problematic. Current actions to improve SLO based on the results: To address consistency across the campuses, the program must first improve reporting across all campuses and ELI to determine whether this inconsistency is broad across the discipline or narrow with regard to individual campuses. In the meantime, the program will discuss the possible sources of these inconsistencies, with special focus on inconsistent presentation of the material. Finally, the fact that the overall target was not met might be a statistical anomaly based on the choice of three equally weighted criteria and a 70% target: 70% of the students must answer all three questions correctly or the target will not be met. The discipline will address the anomaly at the January 2019 college-wide meeting and reflect any changes in future assessments, such as adjusting the targets, weighting the questions, or adding a fourth question. The ELI issue is systemic to the college; the cluster feels that SLO assessments should be administered centrally. These results (and the failure to report) were communicated to the SLO lead, the discipline, the pathway dean, and the pathway provost, and remedies were discussed at the Business and Accounting Pathway Council meeting in Fall 2018. The pathway dean and provost will follow up by speaking with their peers from other

disciplines and ELI to develop a college-wide system to ensure ELI results are included in future assessments. The SLO lead will raise the issue again with the pathway dean at the January 2019 college-wide discipline meeting. Next assessment of this SLO: The curriculum committee for the program has an agenda item to revisit, rewrite and reschedule the curriculum map during their next meeting.

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Students will be able to calculate the basic impact of marginal cost for the production of goods in a capitalist system. [ x ] QR

Principles of Micro Economics ECON 202
Direct Measure: Calculate the average total, fixed, and marginal costs for a "competitive" firm given a certain production cost schedule.
1. determining the efficient level of output
2. calculating output based on market price
3. calculating total profit
No rubric provided
Sample Size (Specify N/A where not offered)

Campus/ Modality

Total # Sections Offered

# Sections assessed

# Students assessed

AL 5 DNR 0 AN 15 3 74 MA 9 DNR 0 ME N/A N/A N/A LO 10 1 21 WO 7 1 16 DE* N/A N/A N/A Total 46 5 111

*Dual-enrollment DNR = Did not report
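To make the measured skill concrete, the following is a minimal worked sketch of the kind of calculation the direct measure asks for. The cost schedule, fixed cost, and market price are hypothetical values, not taken from the actual exam:

```python
# Hypothetical cost schedule for a price-taking ("competitive") firm.
# Quantities and total costs below are illustrative, not from the assessment.
fixed_cost = 50.0
total_cost = {0: 50, 1: 80, 2: 100, 3: 130, 4: 170, 5: 220, 6: 280}  # TC(q)
price = 48.0  # market price per unit

rows = []
for q in range(1, max(total_cost) + 1):
    mc = total_cost[q] - total_cost[q - 1]          # marginal cost of unit q
    afc = fixed_cost / q                            # average fixed cost
    avc = (total_cost[q] - fixed_cost) / q          # average variable cost
    atc = total_cost[q] / q                         # average total cost
    rows.append((q, mc))
    print(f"q={q}  MC={mc:5.1f}  AFC={afc:5.1f}  AVC={avc:5.1f}  ATC={atc:5.1f}")

# Profit-maximizing output: produce every unit whose marginal cost is <= price.
q_star = max((q for q, mc in rows if mc <= price), default=0)
profit = price * q_star - total_cost[q_star]        # total revenue minus total cost
print(f"Efficient output at price {price}: q*={q_star}, total profit={profit:.1f}")
```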

Data collected: Spring 2018
Target: 65% of students will score 65% or higher overall and on each criterion.
Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):

Results by Campus/Modality (Spring 2018):

Campus/Modality   Average Score   Percent > 65%
AL        DNR      DNR
AN        62.91%   59.46
MA        DNR      DNR
ME        N/A      N/A
LO        42.86%   38.10
WO        62.5%    68.75
ELI       DNR      DNR
DE        N/A      N/A
Total     61.33%   56.76%

Results by CLO Criteria (Spring 2018):

Criteria/Question Topic   Average Score   % Students > Target
1.        72.52   60.36
2.        53.39   53.15
3.        52.54   51.35
Total     61.33   56.76

Previous action(s) to improve CLO if applicable: First-time assessment.
Current results improved: [ ] Yes [ ] No [ ] Partially [ x ] N/A (First-time assessment)
Strengths by Criterion/Question/Topic: Students were strongest on question 1, which required them to fill out the table and understand the concept of efficiency.
Weaknesses by Criterion/Question/Topic: Students were weak at relating marginal cost to marginal revenue and at projecting profits from total cost and total revenue.
Target Met: [ ] Yes [ ] No [ x ] Partially
Based on recent results, areas needing improvement: The students were weak at understanding marginal cost as related to marginal revenue and at projecting profits based on total cost and total revenue. Two of the campuses and ELI did not report results. The students assessed were not parsed by their placement in either the A.S. or A.A.S. program; because ECON 202 is an alternative course for A.A.S. students, we cannot be certain that the overall results are representative of the program population.
Current actions to improve CLO based on the results: These results (and the failure to report) will be communicated to the SLO lead for the discipline and the appropriate deans for follow-up. To ensure that students are exposed to the topics of marginal cost/marginal revenue and total cost/total revenue, the discipline will discuss ways to include these topics in other courses within the curriculum. This discussion will take place at the college-wide discipline meeting in January 2019. The next time this SLO is assessed, ECON 202 will be the chosen course for the A.S. program and ECON 120 will be used to assess the A.A.S. program. In addition, the Economics department will be given better instructions regarding the need to parse students by program placement.
Next assessment of this CLO: The curriculum committee for the program has an agenda item to revisit, rewrite, and reschedule the curriculum map at its next meeting.

Program Goals Evaluation Methods Assessment Results Use of Results

To increase the number of students program-placed in the Business Management, A.A.S. program.

Distribution of Program Placed Students by curriculum and award type (Data: OIR Fact Book 2013-2018)

Target: The program is seeking an increase of 2% year to year.
Results for Past 5 Years:

Fall        Number of Students   Percentage Increased
2017-2018   753    (6)
2016-2017   801    (23)
2015-2016   1038   (14)
2014-2015   1209   (18)
2013-2014   1482   (15)

(A brief sketch of how the percentage column relates to the year-to-year counts follows.)
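The "Percentage Increased" column appears to be the year-over-year change in program-placed headcount, with parentheses marking declines; that reading is consistent with the counts shown. A minimal sketch of the calculation, using the counts from the table above (the formula itself is an assumption about how the column was derived):

```python
# Year-over-year percent change in program-placed students, using the counts
# reported in the table above. Negative values correspond to the parenthesized
# entries, e.g. 753 vs. 801 is roughly a 6% decline.
counts = {"2013-2014": 1482, "2014-2015": 1209, "2015-2016": 1038,
          "2016-2017": 801, "2017-2018": 753}

years = sorted(counts)
for prev, curr in zip(years, years[1:]):
    change = (counts[curr] - counts[prev]) / counts[prev] * 100
    print(f"{curr}: {counts[curr]:>5}  ({change:+.1f}% vs. {prev})")
```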

Target Met: [ ] Yes [ x ] No [ ] Partially Comparison to previous assessment(s): Although the program has fewer students enrolled, the decline has decelerated.

Previous action(s) to improve program goal: The discipline members noted that the A.A.S. Business Management program is susceptible to changes in the labor market, in that demand is greatest when unemployment is high. The discipline also believes there has been a shift toward IT-related fields. The cluster is working with the Curriculum Steering Committee, a group of cluster members and representatives from the private sector, to find ways to add industry certification and internship opportunities within the program. The committee is also looking at better ways to communicate the value of an A.A.S. Business Management degree to both employers and employees in the area. This committee meets again in November 2018 and has been tasked with providing recommendations. Additionally, in concert with our partners at GMU, a pathway has been created (through the Advance Program) for Business Management A.A.S. students to obtain a four-year Bachelor of Science in Business degree from GMU.
Results improved: [ x ] Yes [ ] No [ ] Partially
Current action(s) to improve program goal: The pathways curriculum committee is meeting in November 2018, and methods for promoting the Advance Program with the intention of attracting more adult learners and other non-traditional enrollments are on the agenda.
Assessed: Annually.

To encourage students to complete the degree.

Target: The program is seeking an increase of 2% year to year.
Results for Past 5 Years:

Academic Year   Number of Graduates   Percentage Increased
2017-2018   35   (22)
2016-2017   45   28
2015-2016   36   (29)
2014-2015   50   (14)
2013-2014   59   7

Target Met: [ ] Yes [ x ] No [ ] Partially

Previous action to improve program goal: The discipline members noted that the A.A.S. Business Management program is susceptible to changes in the labor market, in that demand is greatest when unemployment is high. The discipline also believes there has been a shift toward IT-related fields. The cluster is working with the Curriculum Steering Committee, a group of cluster members and representatives from the private sector, to find ways to add industry certification and internship opportunities within the program. The committee is also looking at better ways to communicate the value of an A.A.S. Business Management degree to both employers and employees in the area. This committee met again in November 2018 and has been tasked with providing recommendations at the January 2019 college-wide discipline meeting. Additionally, in concert with our partners at GMU, a pathway has been created (through the Advance Program) for Business Management A.A.S. students to obtain a four-year Bachelor of Science in Business degree from GMU.
Most recent results: Results improved: [ ] Yes [ x ] No [ ] Partially
Current actions to improve program goal: The pathways curriculum committee met in November 2018, and methods for promoting the Advance Program with the intention of attracting more adult learners and other non-traditional enrollments were discussed. The pathway dean and provost agreed that the college needs to develop a systematic marketing program directed to these constituencies.
Next assessment: Annually.


Annual Planning and Evaluation Report: 2017-2018

Computer Science, A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed primarily for students who wish to transfer to a four-year college or university to complete the baccalaureate degree in Computer Science. The curriculum emphasizes the study of the science of computing and the use of computing in a scientific setting.

Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Demonstrate techniques for problem analysis and algorithm design.

Computer Science CSC 200
Direct Measure: No direct data was available.
Other Method: Used final grades from course as a relative indicator of SLO achievement. The final grade was based on programming projects and exams from all Introduction to Computer Science, CSC 200, courses.

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed
AL        3     3
AN        23    23
MA        8     8
ME        N/A   N/A
LO        8     8
WO        4     4
ELI       9     9
DE*       4     4
Total     59    59

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 90% of students will score 70% or higher overall.
Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered): Data from the previous assessment aggregated all campus results into combined assessments and did not include by-campus breakouts.

Results by Campus/Modality:

                  Current (Spring 2018)              Previous Assessment
Campus/Modality   Average Score   Percent > Target   Average Score   Percent > Target
AL        N/A   79    N/A    N/A
AN        N/A   71    N/A    N/A
MA        N/A   66    N/A    N/A
ME        N/A   NA    N/A    N/A
LO        N/A   72    N/A    N/A
WO        N/A   75    N/A    N/A
ELI       N/A   53    N/A    N/A
DE        N/A   100   N/A    N/A
Total     N/A   70    2.44   81.2

Results by SLO Criteria: No data for individual criteria Current results improved: [ ] Yes [ X ] No [ ] Partially N/A Strengths by Criterion/ Question/Topic: N/A Weaknesses by Criterion/ Question/Topic: N/A

Previous action(s) to improve SLO: This SLO was last assessed in Fall 2015. Designed common assessment projects, used across all campuses, at the same time during the semester. Designed assessment rubric for the projects used as assessment across all campuses.

Target Met: [ ] Yes [ X ] No [ ] Partially Based on recent results, areas needing improvement: Due to the lack of collected data, no specific areas of concern regarding students were identified. Old SLO lead left NOVA during the middle of the Fall 2018 semester. Current actions to improve SLO based on the results: New SLO lead selected beginning Spring 2019 to coordinate SLO evaluations and methodology. In 2015, the Discipline faculty attempted to align all campuses to teach the same programming language for CSC 200. This was only partially successful and is an ongoing concern.


During the Fall 2019 discipline meetings, the CS faculty will continue the discussion of all faculty teaching the same programming language. The faculty will also discuss using a common syllabus.
Next assessment of this SLO: Fall 2019

Write computer programs using object-oriented programming features

Computer Science I CSC 201
Direct Measure: No direct data was available.
Other Method: Used final grades from course as a relative indicator of SLO achievement. The final grade was based on programming projects and exams from all Computer Science I, CSC 201, courses.

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed
AL        3     3
AN        15    15
MA        4     4
ME        N/A   N/A
LO        5     5
WO        3     3
ELI       6     6
DE*       1     1
Total     37    37

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 90% of students will score 70% or higher overall.
Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered): Data from the previous assessment aggregated all campus results into combined assessments and did not include by-campus breakouts.

Results by Campus/Modality:

                  Current (Spring 2018)              Previous Assessment
Campus/Modality   Average Score   Percent > Target   Average Score   Percent > Target
AL        N/A   84    N/A    N/A
AN        N/A   85    N/A    N/A
MA        N/A   81    N/A    N/A
ME        N/A   NA    N/A    N/A
LO        N/A   65    N/A    N/A
WO        N/A   60    N/A    N/A
ELI       N/A   52    N/A    N/A
DE        N/A   100   N/A    N/A
Total     N/A   73    2.64   87.81

Results by SLO Criteria: No data for individual criteria Current results improved: [ ] Yes [ x ] No [ ] Partially N/A Strengths by Criterion/ Question/Topic: N/A Weaknesses by Criterion/ Question/Topic: N/A

Previous action(s) to improve SLO: Designed common assessment projects, used across all campuses at the same time during the semester, and designed an assessment rubric for the projects used as the assessment. In 2017-18, faculty began reassessing course prerequisites and discussing standardized minimum expectations for students entering CSC 201. Beginning in Fall 2016, there was increased attention to the implementation of classes and other object-oriented programming features.
Target Met: [ ] Yes [ X ] No [ ] Partially
Based on recent results, areas needing improvement: Due to the lack of collected data, no specific areas of concern regarding students were identified. The previous SLO lead left NOVA during the middle of the Fall 2018 semester.
Current actions to improve SLO based on the results: A new SLO lead was selected beginning Spring 2019 to coordinate SLO evaluations and methodology. Discipline faculty will discuss a common assessment methodology and rubric during the Fall 2019 discipline meeting.
Next assessment of this SLO: Fall 2019

Core Learning Outcome

Evaluation Methods Assessment Results Use of Results

CLO: Demonstrate critical thinking by applying appropriate data structures and Abstract Data Types (ADTs). [ X ] CT

Computer Science II CSC 202
Direct Measure: No direct data was available.
Other Method: Used final grades from course as a relative indicator of SLO achievement. The final grade was based on programming projects and exams from all Computer Science II, CSC 202, courses.

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL        2     2     N/A
AN        8     8     21
MA        5     5     N/A
ME        N/A   N/A   N/A
LO        4     4     N/A
WO        1     1     16
ELI       0     0     N/A
DE*       2     2     N/A
Total     22    22    37

*Dual-enrollment

Target: 90% of students will score 70% or higher overall.
Results by In-Class, ELI, Dual Enrollment (Specify N/A where not offered):

Results by Campus/Modality (Current Assessment):

Campus/Modality   Average Score   Percent > Target
AL        N/A   88
AN        N/A   79
MA        N/A   71
ME        N/A   NA
LO        N/A   67
WO        N/A   64
ELI       N/A   NA
DE*       N/A   92
Total     N/A   75

Results by SLO Criteria: No data for individual criteria Current results improved: [ ] Yes [ x ] No [ ] Partially N/A Strengths by Criterion/ Question/Topic: N/A Weaknesses by Criterion/ Question/Topic: N/A

Previous action(s) to improve CLO: Not previously assessed. Target Met: [ ] Yes [ X ] No [ ] Partially Based on recent results, areas needing improvement: Due to the lack of collected data, no specific areas of concern regarding students were identified. Current actions to improve CLO based on the results: New SLO lead selected beginning Spring 2019 to coordinate SLO evaluations and methodology.

Next assessment of this CLO: Fall 2019

Program Goals Evaluation Methods Assessment Results Use of Results

Program goal on program-placed students: to increase the number of program-placed students.

Data obtained from previous editions of this document and current NVCC enrollment.

Semester/year data collected: Fall 2018
Target: The number of program-placed students in the Computer Science, A.S. program will increase every academic year.
Results for Past 5 Years:

Fall   Number of Students   Percentage Increased
2012   1738   -
2014   1955   12.5
2016   1936   -1.0
2017   2011   3.9
2018   2004   -0.3

Target Met: [ ] Yes [ ] No [ x ] Partially
Most recent results: The program shows an apparent leveling off of program-placed students. This appears to be within reasonable limits of the ebb and flow of student registration.
Results improved: [ ] Yes [ X ] No [ ] Partially

Current action(s) to improve program goal: Our discipline group continually reviews each course curriculum to meet transfer criteria for four-year colleges. Faculty members participate in STEM programs and encourage students to major in Computer Science.
Assessed: Annually

Program goal on graduation: to increase program graduation totals.

Data obtained from previous editions of this document and 2017-18 NVCC graduation records.

Semester/year data collected: 2017-18
Target: Program graduation totals will increase every academic year.
Results for Past 5 Years:

Academic Year   Number of Graduates   Percentage Increased
2012-13   83    -
2013-14   117   41
2014-15   135   15.4
2015-16   169   25.2
2016-17   192   13.6
2017-18   207   7.8

Target Met: [ X ] Yes [ ] No [ ] Partially

Previous action to improve program goal: Faculty members participated in STEM programs and encouraged students to major in Computer Science.
Results improved: [ x ] Yes [ ] No [ ] Partially
Current actions to improve program goal: The program hired a Computer Science tutor in Spring 2019 and is in the process of hiring a Computer Science tutor for Fall 2019.
Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Construction Management Technology, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed to qualify personnel in both engineering technology and management for employment in all areas of a construction firm. Occupational objectives include engineering aide, construction project manager, construction supervisor, estimator, and facilities planning and supervision.

Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Communicate effectively, in writing and orally, consistent with the requirements of a construction management industry career.

Direct Measure:
• BLD 101 - Student evaluation is made in this course through project presentations (previously evaluated through a different assignment).
• The presentation reports on research into selected topics from different chapters.
• The presentation is evaluated on two attributes: project content and presentation technique, which targets audio/visual as well as graphic communication skills.
• Students also submit a project hardcopy, which assesses written communication abilities.

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed   %
AL only   4     4     58/62   94
Online    N/A   N/A   N/A     N/A
DE        N/A   N/A   N/A     N/A
Total     4     4     58/62   94

Program is offered at the Alexandria Campus only:
• 4 sections offered/year; method introduced Fall 2017
• Data collected: Fall 2017 and Spring 2018

Target: Student average score is a minimum of 75%.
Results by In-Class Enrollment:

Results by Campus/Modality   Spring 2018                        Fall 2017
                             Average Score   Percent > Target   Average Score   Percent > Target
AL only                      83              75                 79              75

Results by SLO Criteria:

Criteria/Question Topic      Spring 2018 Average Score   % of Students > Target
1. Content coverage          86                          90
2. Presentation Technique    81                          80
Total                        83                          75

Current results improved: [ X ] Yes [ ] No [ ] Partially

Action to improve SLO: Modified the assignment/assessment for the following semester, Spring 2019.
Target Met: [ X ] Yes [ ] No [ ] Partially
Results are above the 75% average success rate. Students were allowed to present slides, video, or PowerPoint, and a sample presentation was demonstrated. Results differed insignificantly from one course section to another.
Results will be used to:
- Better demonstrate the sample presentation
- Encourage attention to presentation tools and content
*New data for "Results by SLO." Past data are not comparable due to the change in assessment method.
Next assessment of this SLO: Fall 2019

Use building quantity measurement to accurately estimate construction costs (BLD 231).

Construction Estimating I (BLD 231)
Direct Measure: Student evaluation is made through a given sample building with masonry components (previously evaluated through the same assignment).
• Students' understanding of the materials and methods of construction used is measured through the components they select to quantify.

Program is offered at the Alexandria Campus:
• 2 sections offered/year; method introduced in Fall 2016-17
• Data collected: Fall 2017 and Spring 2018
Target: Student average score is a minimum of 75%.
Results by SLO Criteria: (see table below)

Action to improve SLO: In the past, instructors have worked with students to assist with remedial math. A program revision has been proposed to encourage completion of the MTH requirement prior to BLD 231.


This course evaluates the major areas of materials quantity survey. The specific subject evaluated by this SLO is Masonry Materials (Chapter 11 of the course textbook). See the assignment attached.
• Students' ability to arrive at correct quantities of various masonry parts, such as modules, mortar, and reinforcement, using an understanding of geometric properties such as the areas and volumes of different shapes and masses.
• Students' ability to then use algebraic tools to calculate the total quantities of the various masonry system components and arrive at the proper quantities. (An illustrative takeoff calculation appears after the sample size table below.)
• Students are able to score a maximum of 10 points for accurate computation.

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL only   2     2     50/62
ELI       N/A   N/A   N/A
DE*       N/A   N/A   N/A
Total     2     2     50

*Dual-enrollment
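As a concrete illustration of the geometric and algebraic steps described above, here is a minimal takeoff sketch. The wall dimensions, opening area, block size, waste factor, and mortar rule of thumb are hypothetical assumptions, not values from the BLD 231 assignment:

```python
# Hypothetical quantity takeoff for a single masonry wall (illustrative only).
import math

wall_length_ft = 40.0
wall_height_ft = 8.0
opening_area_sqft = 21.0          # door/window area to deduct

# A standard 8x8x16 in. CMU covers roughly 0.89 sq ft of wall face per block.
block_face_area_sqft = (8.0 / 12) * (16.0 / 12)
waste_factor = 1.05               # 5% allowance for cuts and breakage (assumption)

net_area = wall_length_ft * wall_height_ft - opening_area_sqft          # geometric step
blocks = math.ceil(net_area / block_face_area_sqft * waste_factor)      # algebraic step

# Rule of thumb used here (assumption): roughly 3 bags of mortar per 100 blocks.
mortar_bags = math.ceil(blocks / 100 * 3)

print(f"Net wall area: {net_area:.1f} sq ft")
print(f"Blocks required (incl. 5% waste): {blocks}")
print(f"Mortar bags (approx.): {mortar_bags}")
```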

Results by SLO Criteria:

Criteria/Question Topic      Spring 2018                              Fall 2017
                             Average Score   % of Students > Target   Average Score   % of Students > Target
1. Geometric computation     78              72                       82.1            91
2. Algebraic computation     86              83                       83.3            93
Total                        82              77.5                     82.7            92

Current results improved: [X] Yes [ ] No [ ] Partially
Students showed weakness in geometric computations but stronger algebraic results.

The revision was proposed in Spring 2018 and implemented in Fall 2018.
Target Met: [X] Yes [ ] No [ ] Partially
Results are above the 75% average success rate. Based on recent results, students are encouraged to improve their geometric skills. To improve the SLO based on the results, a program revision is proposed to require the MTH course to be satisfied in the freshman year, prior to BLD 231; this is scheduled for Fall 2019.
Next assessment: Spring 2019

Successfully identify and demonstrate skills necessary to manage human resources related to the construction industry.

Construction Management (BLD 241)
Direct Measure: BLD 241 evaluates this SLO through lecture and weekly homework, including both tests and papers. The specific subject evaluated by this SLO is Supervisory Planning (Chapter 7 of the course text). See the assignment attached: Review Test Submission_ Chapter 7 Homework.

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed   %
AL only   2     2     42/46   91
ELI       N/A   N/A   N/A     N/A
DE*       N/A   N/A   N/A     N/A
Total     2     2     42      N/A

*Dual-enrollment

BLD 241 is offered only at the Alexandria Campus, one section annually. The fall semesters of 2017 and 2018 were evaluated, with 46 students tested.
Target: Student average score is a minimum of 75%.
Results by In-Class Enrollment:

Results by Campus/Modality   Spring 2018                        Fall 2017
                             Average Score   Percent > Target   Average Score   Percent > Target
AL only                      87              90                 91.3            95

Current results improved: [X] Yes [ ] No [ ] Partially

Previous action to improve SLO: Completion of the beginning management courses BLD 101 and BLD 102 is required prior to this course.
Target Met: [X] Yes [ ] No [ ] Partially
Results are above the 75% average success rate. Based on recent results, students are encouraged to continue completing the lower-level management courses.
To improve the SLO based on the results: Students will be assigned team research so they are exposed to more chapters (topics) to study, research, and report on. Future assessments will break down the results to better determine areas for improvement.
Next assessment: Spring 2019


Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Students are assessed on quantitative skills. This CLO is an extension of SLO 2, which measures mathematically the areas, sizes, and quantities of a typical building system (i.e., a masonry system). [ X ] QR

Direct Measure: A sample building layout containing masonry walls is issued to the students.
• Students' ability to survey, calculate, and quantify the materials used in the masonry system is measured.
• Students' use of unit prices and price extension to arrive at the total estimated cost of that masonry system is assessed.

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL only   1     1     9
ELI       N/A   N/A   N/A
DE*       N/A   N/A   N/A
Total     1     1     9

*Dual-enrollment

Semester/year data collected: Fall 2017
• 1 section offered/year; method introduced Fall 2017
• Data collected: Fall 2017
Target: Student average score is a minimum of 75%.

Results by Campus/Modality (Fall 2017):

Campus/Modality   Average Score   Percent > Target
AL only           78              84

Results by CLO Criteria (Fall 2017):

Criteria/Question Topic      Average Score   % Students > Target
1. Quantify material         75              81
2. Calculate Units & Price   81              84
Total

Current results improved: [X] Yes [ ] No [ ] Partially
Strengths: Students' strength is generally in system identification.
Weaknesses: Quantifying the material applied.

Previously, this CLO was not assessed.
Target Met: [X] Yes [ ] No [ ] Partially
Based on recent results, students in general need improvement in geometry more than arithmetic, as indicated by the SLO 2 assessment. A program revision is currently proposed to require MTH course completion prior to BLD 231 so that students are better prepared quantitatively; the revision will be reviewed and implemented in Fall 2019.
Next assessment: Fall 2019


Annual Planning and Evaluation Report: 2017-2018 Contract Management, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This program is designed for individuals who plan to seek employment in contract management positions and for those presently in contract management positions who seek career advancement. The program is designed to create opportunities for positions in contract management for both government agencies and private industry. Instruction includes both the theoretical concepts and the practical applications needed for future success in the contract management field. This will provide a greater understanding of acquisition, life cycle management, and contracting processes. Occupational objectives include project manager, procurement analyst, contract administrator, contract specialist, contract negotiator, contract price analyst, and contract termination specialist.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to conduct market research in accordance with the Federal Acquisition Regulation (FAR).

Contract Execution (CON 124)
Direct Measure: The assignment was for students to conduct and prepare a complete and thorough market research report. The Contract Management program used the direct evaluation method to assess the SLO. The program rubric used 7 criteria:
1. Define the overall objectives for conducting market research
2. Define the market research steps
3. Apply market research
4. Demonstrate a solid understanding of the industry
5. Analyze and identify commercial practices
6. Analyze and identify the level of market competition
7. Identify efficiency standards in the marketplace
Performance levels are as follows:
• 4 - Exemplary
• 3 - Good/Solid
• 2 - Acceptable
• 1 - Unacceptable

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
WO only   1     1     9
ELI       N/A   N/A   N/A
DE*       N/A   N/A   N/A
TOTAL     1     1     9

*Dual-enrollment

Data collected: Spring 2018
Target: 95% of the students should score 3 or higher.
Results by SLO Criteria:

Criteria   4   3   2   1   % of Students
1          8   1   0   0   100
2          9   0   0   0   100
3          8   1   0   0   100
4          6   2   0   0   100
5          5   4   1   0   100
6          6   3   0   0   100
7          6   3   0   0   100

Overall: 96.9% of students achieved the 95% target of 3 or higher in each category.

Describe the results: The rubric above clearly demonstrates that the students in the Contract Management Program are grasping the concepts of market research in the CON 100-level classes. The students demonstrated the ability to (1) define market research, (2) define the overall steps needed to implement market research, (3) apply market research, (6) analyze and identify levels of market competition, and (7) identify efficiency standards in the marketplace. The students were weakest, and did not meet the target, in the following criteria: (4) demonstrate a solid understanding of industry and (5) analysis of commercial practices. It should be noted that criteria 4 and 5 are advanced techniques in federal contracting and require students to have an understanding of private/commercial industry standards. In 2018, the students showed improvements in criteria 6 and 7.
Comparison of previous assessment: SLOs for 2014 and 2015 showed that students achieved an overall 86.9% (the established benchmark). Students in 2015 and 2016 achieved an overall 95.9%, students in 2016 and 2017 achieved an overall 97%, and students in 2017 and 2018 achieved an overall success rate of 96.9%. For SLO criteria 1, 2, 3, 6, and 7, the students demonstrated a clear understanding of the market research process, performing at or above the 95% level on these basic concepts of market research. There was significant improvement in criterion (3), apply market research to arrive at the most suitable acquisition method (100%), and criterion (7), identify efficiency standards in the marketplace (93% to 100%). The only areas that had a small drop in performance were criteria 4 (93% to 88%) and 5 (100% to 88%).
Previous actions to improve SLO: To improve the learning outcomes from Fall 2014 and Spring 2018, the Contract Management Program placed a greater emphasis on market research in its CON 100 and CON 104 classes. The program has decided to measure student learning outcomes in CON 124 to assess the students' overall understanding of the concepts. Instructors have implemented the following:
1. Targeted essays and presentations that focus on understanding (7) marketplace standards (Fall 2017 and Spring 2018).
2. Additional emphasis placed on the identified areas in the CON 100 and CON 104 courses in order to ensure students have a better understanding of commercial practices and the marketplace (Fall 2017 and Spring 2018).
3. More student-led discussion and involvement in the identified area of marketplace efficiencies (Fall 2017 and Spring 2018).
4. Targeted case scenarios that focus on the identified area of the analysis of the marketplace (Spring 2018).

Most recent results: Following the 2015 and 2016 SLO, the Contract Management Program established a target of 95% of students scoring 3 or better. Based on the evaluation, criteria 1, 2, 3, 6, and 7 were met; overall, the program achieved 95% with a score of 3 or higher. The students were weakest, and did not meet the target, in the following criteria: (4) demonstrate a solid understanding of industry and (5) analysis of commercial practices, two criteria that focus on private industry and commercial practices.
Achievement of targets: The Contract Management Program established a target of 95% of students scoring 3 or better. Based on the evaluation criteria, all CON program goals were met; overall, the program achieved 96.9% with a score of 3 or higher. Based on the results, students remain weaker than the target in the following criteria: (4) demonstrate a solid understanding of industry and (5) analysis of commercial practices.
Current action to improve SLO: To improve the learning outcomes, the Contract Management Program instructors will take the following steps:
• Targeted essays that research the identified areas of commercial and industry practices (Fall 2018).
• Additional emphasis placed on the identified areas in the CON 100 course in order to ensure students have a better understanding of commercial practices (Fall 2018).
• More student- and professor-led discussions and involvement in the identified area of marketplace efficiencies (Spring 2019).
• Targeted case scenarios that focus on the identified area of the analysis of the marketplace (Spring 2019).
• Oral presentations on common practices in private and commercial industries (Spring 2019).

Next Assessment: Fall 2017

Students will be able to apply strategic acquisition planning so that informed business decisions can be made on behalf of the government.

Fundamentals of Cost and Price Analysis (CON 170)
Direct Measure: The assessment tools used to measure student progress were exam questions/essays and presentations that focus students on defining purchase request (PR) package requirements, methods of engaging industry, evaluation of solicitations, cost and price analysis techniques, determining a fair and reasonable proposal, and negotiation strategies. The Contract Management program used the direct evaluation method to assess the SLO. The program rubric used 7 criteria:
1. Define the federal requirements of a purchase request package.
2. Identify the applicable methods for exchanging information with industry/vendors.
3. Identify the specifics of the requirements to determine the components and procedures for preparing an oral or written solicitation.
4. Identify and apply the procedures for processing a solicitation response, including safeguarding quotes and proposals and processing timely and late offers.
5. Identify the analytical techniques that will be used to evaluate contractors' proposals to ensure that both the Government and the contractor get a fair and reasonable price.

Data collected: Spring 2018
Target: 90% of the students should score 3 or higher.
Results by SLO Criteria:

Criteria   4    3    2   1   % of Students
1          15   3    2   0   90%
2          10   5    5   0   75%
3          12   3    4   1   75%
4          15   4    0   1   95%
5          7    4    8   1   55%
6          12   5    2   1   85%
7          2    10   7   1   60%
Total Overall Score: 76.4%

(A brief sketch of how the summary figures follow from the per-criterion counts appears below.)
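A minimal sketch of how the summary figures above can be reproduced, assuming the per-criterion percentage is the share of the 20 assessed students scoring 3 or 4 and the overall score is the mean of the per-criterion percentages (an assumption, but one consistent with the reported numbers):

```python
# Per-criterion counts taken from the table above: (# of 4s, # of 3s, # of 2s, # of 1s).
counts = {
    1: (15, 3, 2, 0), 2: (10, 5, 5, 0), 3: (12, 3, 4, 1), 4: (15, 4, 0, 1),
    5: (7, 4, 8, 1),  6: (12, 5, 2, 1), 7: (2, 10, 7, 1),
}

per_criterion = {}
for criterion, (fours, threes, twos, ones) in counts.items():
    n = fours + threes + twos + ones                      # 20 students per criterion
    per_criterion[criterion] = (fours + threes) / n * 100  # share scoring 3 or higher
    print(f"Criterion {criterion}: {per_criterion[criterion]:.0f}% scored 3 or higher")

overall = sum(per_criterion.values()) / len(per_criterion)
print(f"Overall: {overall:.1f}%")   # reproduces the reported 76.4%
```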

Describe the results: The rubric above demonstrates that the students in the Contract Management Program have an acceptable understanding of concepts regarding the strategic execution of government contracts. The students demonstrated the ability to define and identify the requirements of a PR package and the procedures for processing responses to a solicitation (criteria 1 and 4). Students fell short of the overall goals in the following areas: (2) identify the applicable methods for exchanging information with industry/vendors, (3) identify the specifics of the requirements to determine the components and procedures for preparing an oral or written solicitation, (5) identify the analytical techniques that will be used to evaluate contractors' proposals to ensure that both the Government and the contractor get a fair and reasonable price, (6) complete a price analysis of a contractor's proposal in order to establish price objectives for negotiations, and (7) develop a negotiation strategy. It is important to note that this is the first year of assessing this SLO; therefore, this year will serve as the benchmark.
Previous actions to improve SLO: This is the first year the CMP formally collected and reported these data. This measurement will serve as the benchmark.
Achievement of targets: The Contract Management Program established a target of 90% or a score of 3 or better. The students demonstrated the ability to define and identify the requirements of a PR package and the procedures for processing responses to a solicitation (criteria 1 and 4). Students fell short of the overall goals in the same areas listed above: criteria 2, 3, 5, 6, and 7. This is the first year of assessing this SLO; therefore, this year will serve as the benchmark for this SLO.


Rubric criteria (continued):
6. Capable of completing a price analysis of a contractor's proposal in order to establish price objectives for negotiations.
7. Develop a negotiation strategy.
Performance levels are as follows:
• 4 - Exemplary
• 3 - Good/Solid
• 2 - Acceptable
• 1 - Unacceptable

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
WO only   1     1     20
ELI       N/A   N/A   N/A
DE*       N/A   N/A   N/A
TOTAL     1     1     20

*Dual-enrollment

Comparison of previous assessment: This is the first year of assessing this SLO and will serve as the benchmark.

Current action to improve SLO: To improve the learning outcomes, the Contract Management Program instructors will take the following steps:

1. Provide students with additional research material that targets engagement with industry, the source selection process, cost and pricing techniques, and developing negotiation strategies (Fall 2018).
2. Emphasis will be placed on the identified areas in CON 100 (Spring 2019), CON 121 (Spring 2019), and CON 124 (Fall 2019) to ensure that students understand the concepts of engagement with industry, cost and price analysis, and fair and reasonable proposals.
3. Add study guides focused on contract negotiations and source selection in CON 100, CON 104, and CON 105 (Fall 2018).
4. More student- and professor-led discussions and involvement around negotiations (Spring 2019).


Next assessment: Fall 2018

Students will be able to independently apply contract administration programs in support of Federal contracts.

Contract Administration (CON 127)
The assignment to the students was to prepare a major acquisition plan that focuses on independent government estimates, strategies for conducting cost and price analysis, and establishing a negotiating position. This assignment also required the students to understand labor mix, material mix, and indirect cost. In addition, the focus was on principles of contract administration, cost-effectively managing government contracts, creating effective work flows, maintaining accurate contract documentation, applying the performance matrix, creating change control tools, and mitigating risk to the government. The Contract Management program used the direct evaluation method to assess the SLO. The program rubric used 7 criteria:
1. Define the principle of contract administration.
2. Create a comprehensive management system that focuses on delivering cost-effective approaches to managing government contracts.
3. Create a contract management process that defines the work flow of the contract, defines roles for completing each step in the work flow, and defines how work flow exceptions will be handled.
4. Create contract management standards that include maintaining accurate documentation for the start and end dates of a contract.
5. Apply performance metrics to measure and assess the effectiveness of the contract management function.
6. Create a contract change control system by which the contract can be modified.
7. Analyze, assess, and mitigate risk associated with government contracting.
Performance levels are as follows:
• 4 - Exemplary
• 3 - Good/Solid
• 2 - Acceptable
• 1 - Unacceptable

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
WO only   1     1     16
ELI       N/A   N/A   N/A
DE*       N/A   N/A   N/A
TOTAL     1     1     16

*Dual-enrollment

Data collected: Spring 2018
Target: 90% of the students should score 3 or higher.
Results by SLO Criteria:

Criteria   4    3   2   1   % of Students
1.         15   0   0   1   94%
2.         6    9   0   1   94%
3.         5    8   2   1   87%
4.         8    7   0   1   94%
5.         5    8   1   1   87%
6.         8    6   1   1   87%
7.         6    6   3   1   75%

Overall: 87.4% of students achieved the 90% target of 3 or higher in each category.

Describe the results: The rubric above clearly demonstrates that the students in the Contract Management Program are grasping contract administration principles: (1) the students demonstrated the ability to develop a comprehensive contract management system that can track and monitor government contracts, and (2) the challenges for this year reflect the CMP's decision to take a deeper dive into a real-life, situational contracting approach to help better prepare our students for the challenges they will face as contracting officers. The following student outcomes dropped this year: (3) create a contract management process that defines the work flow of the contract, defines roles for completing each step in the work flow, and defines how work flow exceptions will be handled; (4) create contract management standards that include maintaining accurate documentation for the start and end dates of a contract; (5) apply performance metrics to measure and assess the effectiveness of the contract management function; (6) create a contract change control system by which the contract can be modified; and (7) analyze, assess, and mitigate risk associated with government contracting. The students did achieve 87% on criteria 3, 4, 5, and 6, and the CMP feels that the students are grasping the basic principles. Criterion 7 is at 75%.

Previous actions to improve SLO: To improve the learning outcomes, the Contract Management Program instructor implemented the following steps:
• Provided students with additional research material that targets the identified areas of contract management work flows, maintaining records, and developing and implementing the performance matrix in the CON 127 class (i.e., risk mitigation, communication plans, change management structure) (Spring 2018).
• Emphasis was placed on the identified areas 3, 4, 5, 6, and 7 in CON 127 Contract Administration (Spring 2018).
Most recent results: The Contract Management Program established and met the targeted goal of 90% of students scoring 3 or better in 2016 and 2017. Based on the evaluation, criteria 1, 2, 3, 4, 5, 6, and 7 were met; overall, the program achieved 91.9% scoring 3 or higher. Based on the results, students were weakest, but demonstrated considerable improvement, in the following criteria: (4) create contract management standards that include maintaining accurate documentation for the start and end dates of a contract, (5) apply performance metrics to measure and assess the effectiveness of the contract management function, and (7) analyze, assess, and mitigate risk associated with government contracting.



Comparison of previous assessment: The students achieved an overall 85.7% on the SLOs for 2014 and 2015, and in 2015 and 2016 have achieved an overall 91.9%. The result of 93.4% in 2016 and 2017 demonstrate that the students are improving in their overall knowledge of contract administration. SLO criteria 1, 2, 3, 4, 5, 6 and 7 show improvement. The students performed at a 93 to 100% level in basic concepts of Contract Administration (criteria 1 thru 3). Nearly all students performed at the acceptable levels in all areas of measurement. The advanced criteria 4, 5, 6 and 7 showed an increase in the targeted performance levels from 87% to 92%.

Achievement of targets: Overall, 87.4% of students scored 3 or better, short of the 90% target. This score reflects a shift in assignment strategies: the CMP adopted a much more robust assignment strategy that reflects real contract management scenarios, which will allow students to become more familiar with developing and implementing contract administration strategies. Based on the evaluation, criteria 1, 2, and 4 were met. Based on the results, students were weakest, but demonstrated considerable improvement, in criterion (7), analyze, assess, and mitigate risk associated with government contracting. Just shy of the overall goal, students achieved a strong 87% in criteria 3 and 5.
Current action to improve SLO: To further improve the learning outcomes, the Contract Management Program instructors will take the following steps:
• Provide students with additional research material that targets the identified areas of contract management work flows, change management systems, and developing and implementing the performance matrix in the CON 127 class (Fall 2019).
• Emphasis will be placed on the identified areas 3, 4, 5, 6, and 7 in the introduction courses (Fall 2019).


Next assessment: Spring 2018

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

Students will be able to recognize and apply fundamental contracting techniques by utilizing the basic Federal contracting processes: cost estimation procedures, requirement determinations, and characteristics of best value analysis. [ x ] QR

Cost and Price Analysis and Negotiation Techniques (CON 217)
Direct Measure: The assignment was for students to prepare a major acquisition plan that focuses on independent government estimates, strategies for conducting cost and price analysis, and determining best value. The assignment also required students to understand labor mix, material mix, and indirect cost in order to develop a cost estimate. In addition, it focused on principles of contract administration, cost-effectively managing government contracts, creating effective work flows, maintaining accurate contract documentation, applying the performance matrix, creating change control tools, and mitigating risk to the government, as well as the information needed to establish effective evaluation criteria and conduct a best value analysis. The Contract Management program used the direct evaluation method to assess the SLO. The program rubric used 6 criteria:
1. Identify the seven fundamental federal contracting processes.
2. Analyze customer requirement determinations.
3. Define federal cost estimation procedures.
4. Analyze direct material and direct labor requirements in order to develop a cost estimate.
5. Analyze requirements in order to develop effective federal contracting evaluation criteria.
6. Identify the quantitative and qualitative methods for determining best value.
Performance levels are as follows:
• 4 - Exemplary
• 3 - Good/Solid
• 2 - Acceptable
• 1 - Unacceptable
(A worked sketch of the kind of cost buildup described in criterion 4 follows.)
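To illustrate the kind of cost buildup that criterion 4 asks for, here is a minimal sketch. The labor categories, hours, rates, material cost, and indirect rate are hypothetical assumptions, not drawn from the course assignment:

```python
# Hypothetical cost-estimate buildup: direct labor (hours x rate per category),
# plus direct material, plus indirect cost applied as a burden rate on direct
# labor. All figures are illustrative assumptions.
labor_mix = {            # category: (hours, hourly rate in $)
    "Program Manager": (100, 95.0),
    "Senior Engineer": (400, 80.0),
    "Technician":      (600, 45.0),
}
direct_material = 25_000.0
indirect_rate = 0.35      # overhead applied to direct labor (assumption)

direct_labor = sum(hours * rate for hours, rate in labor_mix.values())
indirect_cost = direct_labor * indirect_rate
total_estimate = direct_labor + direct_material + indirect_cost

print(f"Direct labor:    ${direct_labor:,.2f}")
print(f"Direct material: ${direct_material:,.2f}")
print(f"Indirect cost:   ${indirect_cost:,.2f}")
print(f"Total cost estimate (pre-fee): ${total_estimate:,.2f}")
```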

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
WO only   1     1     12
ELI       N/A   N/A   N/A
DE*       N/A   N/A   N/A
TOTAL     1     1     12

*Dual-enrollment

Data collected: Spring 2018
Target: 80% of the students should score 3 or higher.
Results by SLO Criteria:

Criteria   4    3   2   1   % of Students
1          8    3   0   1   92%
2          10   1   0   1   92%
3          6    3   2   1   75%
4          6    3   2   1   75%
5          8    3   0   1   92%
6          4    5   2   1   75%

Overall: 83% of students achieved the target of 3 or higher in each category.

Describe the results: The rubric above clearly demonstrates that the students in the Contract Management Program are grasping the basic federal contracting processes: (2) analyzing customer requirement determinations and (5) developing evaluation criteria. However, the students were weakest, and did not meet the target, in the following advanced areas of applying the federal contracting process: (3) defining cost estimation procedures, (4) analyzing direct material and labor requirements to develop a cost estimate, and (6) identifying the quantitative and qualitative methods for determining best value. Criterion 6 showed some improvement, going from a benchmark score of 61% to 75%.
Comparison of previous assessment: The students achieved an overall 83% (the established benchmark) for the 2016 and 2017 SLOs, and students in 2017 and 2018 again achieved an overall 83%. Students performed at the 92% level in the basic concepts of fundamental contracting processes, analysis of requirements, and developing and analyzing evaluation criteria (criteria 1, 2, and 5), with significant improvement in criterion 5 (70% to 92%) and criterion 6 (61% to 75%). The only area that dropped in performance was criterion 3 (93% to 75%).

Previous actions to improve SLO: To improve the learning outcomes from Fall 2017 and Spring 2018, the Contract Management Program placed a greater emphasis on fundamentals of cost and price analysis in CON 170. Instructors implemented the following to further develop students' skills:
• Provided students with additional research material that targets the identified areas of labor, material, indirect costs, and requirements determination (Spring 2018).
• Introduced cost accounting concepts into the cost and pricing assignments of 100-level courses (i.e., CON 100 and CON 170).
• Emphasis was placed on the identified areas in the Fundamentals of Cost and Price Analysis (CON 170) course (Spring 2018).
• Additional assignments were given to focus on the identified areas: requirements determination, evaluation criteria, best value analysis, and cost and price analysis (Spring 2018).

Most recent results: Following the 2017 and 2018 SLO, the Contract Management Program established a target of 80% or a score of 3 or better. Based on the evaluation, criteria 1, 2, and 3 were met. Overall the program achieved 83% or a score of 3 or higher. However, students did



not achieve the target in the more advanced areas of cost estimating, best value analysis, and requirements determination. Based on the results, students were weakest and did not meet the target in the following criteria: (4) analyze direct material and labor requirements to develop a cost estimate, (5) analyze requirements to develop effective federal contracting evaluation criteria, and (6) identify the quantitative and qualitative methods for determining best value. Criterion 6 was the weakest at 61%. It should also be noted that the CMP changed the format of the assignments this year to better reflect the issues that a contracting professional will face while performing his or her duties.
Achievement of targets: The Contract Management Program established a target of 83% or a score of 3 or better. Based on the evaluation, criteria 1, 2, and 5 were met; overall, the program achieved 83% scoring 3 or higher. However, students did not achieve the target in the more advanced areas of cost estimating, best value analysis, and requirements determination. Based on the results, students were weakest and did not meet the target in the following criteria: (3) define cost estimation procedures, (4) analyze direct material and labor requirements to develop a cost estimate, and (6) identify the quantitative and qualitative methods for determining best value. Criterion 6 remains weak, even though students improved from 61% to 75%.


Current action to improve SLO: To improve the learning outcomes, the Contract Management Program instructors will take the following steps:
• Provide students with additional research material in CON 100 that targets the identified areas of labor, material, and indirect costs (Spring 2018).
• Introduce quantitative methods into the cost and pricing assignments of 100-level courses.
• Emphasis will be placed on the identified areas of cost estimation and quantitative methods in the Fundamentals of Cost and Price Analysis course, CON 170 (Fall 2018).
• Additional assignments that focus on the identified areas of requirements determination, evaluation criteria, best value analysis, and cost and price analysis (Spring 2018).

Next assessment: Spring 2018

Program Goals Evaluation Methods Assessment Results Use of Results

Increase the number of graduates in the Contract Management Program by 15%.

Number of NOVA Grads by Degree and Specialization (Data: OIR Report). The data was collected in Fall 2018.

Five (5) Years of Data:

Academic Year   AAS Grad Total   Certificate Grad Total   Grad Totals
2017-18   4   2    6
2016-17   3   0    3
2015-16   3   4    7
2014-15   5   6    11
2013-14   2   13   15

The targeted goal was not met. The number of grads increased by 1 for the AAS and 2 for the certificate. While we did see an increase in both programs, the CON program wants to achieve a higher number of graduates. This goal will be assessed for the 2018-19 academic year. Our goal was approximately a 20% increase.

Previous action was taken to improve the graduation totals. The Contract Management Program emphasized and strongly encouraged students to begin the process of applying for graduation as they begin their last semester of classes. Additional emphasis will be added as the students complete their 200-level classes. The CON program is continuing to work with DAU to establish a partnership agreement. In addition, the CON program is working with ELI to get 50% of the program online; the first CON class delivered through ELI was held in Summer 2018.
Actions: The Contract Management Program instructors will continue to emphasize and encourage students to immediately apply for program placement and to begin the process of applying for graduation as they begin their final semester of classes. Additional emphasis will be added as the students complete their 200-level classes. The CON program is working with DAU to establish a partnership and renew the Memorandum of Understanding. In addition, the CON program is working with ELI to get 50% of the program online; three classes (CON 100, CON 121, and CON 124) have been completed online. Both initiatives will help boost enrollment and graduation (Spring 2019).
Assessed: Annually

Increase the number of program placed students

Distribution of Program Placed Students by Curriculum and Certificate Type (Data: OIR Reports 2010 through 2017)

FALL   AAS   Certificate
2017   34    9
2016   34    9
2014   51    13
2013   52    23
2012   64    23
2011   69    29

The number of placed students held steady for this year.

Actions: The major initiative for the Contract Management program in the 2018 Academic year was to get multiple classes online. The program now has 3 online classes and the remaining 3 classes will be online by Spring 2019. In addition, the program has designed and will be taking to the curriculum committee a new initiative that will offer two certificate programs for Level I contracting professionals and Level II contracting professionals. Our target date for beginning these certificates is Fall 2019. This initiative will be presented to the Contract Management Program’s

committee and the cluster for approval. The Contract Management Program will continue to encourage all students to be officially placed in either the AAS or certificate program. The program will emphasize that students should be officially placed in the AAS or certificate contract program during all entry level classes. Prior to the first class, Associate Professors are encouraging students to be sure to place themselves officially in the program (Spring 2019). Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Cybersecurity, A.A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. A.A.S. in Cybersecurity Program Purpose Statement: This curriculum is designed for those who seek employment in the field of Cybersecurity (information assurance), for those who are presently in IT or a security field and who desire to increase their knowledge and update their skills, and for those who must augment their abilities in other fields with knowledge and skills in information security. The curriculum is mapped to the NSA/DHS Knowledge Units necessary for NOVA’s designation as a Center of Academic Excellence.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Describe the basic concepts of information assurance fundamentals (3, 4)

Internet/Intranet Firewalls and E-Commerce ITN 263 Direct Measure: Cybersecurity faculty coordinated on questions to ask ITN 263 students. These questions not only needed to be consistent with NOVA SLO objectives, but also with NSA CAE2Y designation outcome expectations to describe the differences between symmetric and asymmetric algorithms. Two multiple choice questions were provided. Faculty at all 5 campuses provided these questions at the time the final exam was given. Topics included:

#   TOPICS FOR QUESTIONS
3   Security basics
4   Security basics

Questions and answers are attached in file SLO Questions and Answers.docx

Data collected: Fall 2017 Target: Students should answer questions with a 70% accuracy rate. This is consistent with CompTIA exam standards. Data collection: 6 sections of ITN 263 offered, including 2 ELI sections via Woodbridge. 3 sections provided results: Alexandria (1 section), Loudoun (1 section), Annandale (0 sections), Manassas (0 sections), and Woodbridge campus (1 section). ELI courses did not report. It is unknown if they received copies. Manassas did not report as its single section ended before the questions were distributed. Dual enrollment sections were not captured in this report. Of the 3 reporting sections, one section was not submitted in a format conducive to analysis. The remaining two sections, with a total of 40 students were assessed (23 in one class and 17 in another).

#   TOPICS FOR QUESTIONS   % CORRECT
3   Security basics        85 (34/40 students)
4   Security basics        72.5 (29/40 students)

Students demonstrated an accuracy rate of 0.6625 with all of the answers. Firewall functionality questions were answered with more inaccuracies than security basic questions, as the latter topics are covered in multiple courses. This is the first time this topic has been assessed in this course, in this program, so no previous data is available to trend performance.

Previous actions to improve SLO: The results of this assessment revealed significant issues with students being able to configure a firewall. As this was not flagged as a previous issue, no previous remediation was implemented to improve the SLO. This is a topic that is difficult for experienced network administrators to understand. Areas to be improved: Firewall configuration instruction. Actions for improvement: 1. An assessment tool is in

development and will be implemented in Fall 2019 that will allow this SLO to be assessed on a semester-by-semester basis in greater detail.

2. Other resources are being sought and may need to be developed in-house.

When will the improvements take place: Spring 2018 - The SLO will be reassessed after the development of an automated tool that will enable all of the program outcomes to be assessed on a semester-by-semester basis. This is estimated to take effect in Fall 2019.

Describe the basic concepts of information assurance fundamentals.

Cyberlaw ITN 267 This course is an elective in the Cybersecurity AAS degree program, so not all campuses offer the course each semester. There were 4 sections offered at 4 campuses (including 2 at ELI), with a total of 64 students enrolled across the college. Students were assessed based on five (5) multiple choice questions. Faculty at all campuses teaching this course were provided these questions at the time the final exam was given. While the Learning Outcome was very broad, three of the topics were developed to cover specific outcomes for NOVA’s CAE2Y designation requirement to identify different security and privacy regulations (questions 1, 2, 3, and 4). Topics included:

#   TOPICS FOR MULTIPLE CHOICE QUESTIONS
1   Security and Privacy Regulations
2   Security and Privacy Regulations
3   Security and Privacy Regulations
4   Security and Privacy Regulations
5   Intellectual Property

Questions and answers are attached.

Data collected: Fall 2017 Target: Students should answer questions with a 70% accuracy rate. This is consistent with CompTIA exam standards. Data collection: This is the first time this SLO was assessed in this program, so this should be considered its baseline. Results were provided by faculty. ELI courses did not report; it is unknown if the information was disseminated to them. Only 1 campus reported results (LO). This is largely due to the fact that these tend to be short, 8-week courses, and the questions may not have been disseminated in a timely enough manner to sample those students. This course is not eligible to be taught as a dual enrollment course. Of this single reporting section, a total of 9 students in that class were assessed. Individual answer results were not captured; however, an overall total for the SLO was captured.

#       TOPICS FOR MULTIPLE CHOICE QUESTIONS   PERCENTAGE OF STUDENTS RESPONDING CORRECTLY
1       Security and Privacy Regulations       Not captured
2       Security and Privacy Regulations       Not captured
3       Security and Privacy Regulations       Not captured
4       Security and Privacy Regulations       Not captured
5       Intellectual Property                  Not captured
Total                                          96.66

Previous actions to improve SLO: First time offered. Students in this class exceeded the target, which is not surprising. This is a conceptual class with few, if any, labs and these concepts are also embedded in several other courses. Areas to be improved: None identified, with the exception of collection processes. Actions for improvement: An automated tool is being piloted in the Spring 2018 that will directly survey the students. When will the improvements take place: Fall 2019

Explain basic operations involved in system administration

Administration of Network Resources ITN 260 Students were assessed based on ten (10) multiple choice questions and two (2) essay questions requiring them to critically analyze a scenario and propose a solution. While the Learning Outcome was very broad, topics were developed that covered a broad range of IA topics that met the requirement for both NOVA’s SLOs as well as mapped directly to NSA’s CAE2Y designation requirement to be knowledgeable on Windows Server administration. Topics included:

#    TOPICS FOR QUESTIONS
1    User rights
2    Files and folders
3    RAID
4    Active Directory
5    File System
6    RAID
7    File System
8    User rights
9    Account Management
10   Virtual Machines
11   User accounts
12   File permissions

Data collected: Spring 2018 Target: Students should answer questions with a 70% accuracy rate. As this was the first time that these topics were assessed in this course, no data is available to trend student performance at this time. Data collection: There were a total of 158 students across 7 sections (2 ELI, one with only 1 student in it). Results were provided by faculty. No data was available from Dual Enrollment courses as this is not offered for Dual Enrollment. No data was provided by ELI; it is unknown if they received the survey. Essay questions were not administered consistently across the sections, and so only the first 10 multiple choice questions have been included in this study.

Previous actions to improve SLO: First assessment. Based on recent results, areas to be improved: ITN 200 is a core course supporting multiple programs. It is foundational to the other cybersecurity programs. Because this is the first assessment of this SLO, there have been no previous actions for improvement. The complex manner in which the assessment was set up did not lend itself to accurately determining outcomes across the college.

Questions and answers are attached.

Because this is the first assessment of this SLO, previous results are unavailable for comparison. Results were not analyzed by campus, but will be in future assessments once an automated tool is in place to facilitate the analysis. Results by SLO Criteria:

#     TOPICS FOR QUESTIONS   # CORRECT (PROPORTION)
1     User rights            40 (.81)
2     Files and folders      13 (.265)
3     RAID                   44 (.897)
4     Active Directory       46 (.938)
5     File System            42 (.857)
6     RAID                   15 (.306)
7     File System            39 (.795)
8     User rights            38 (.775)
9     Account Management     2 (.04)
10    Virtual Machines       38 (.775)
Total                        317 (.646)

The results indicate that students are weakest with respect to files/folders and account management. These are critical aspects of securing any network system, whether the student is in the Cybersecurity AAS or the IT AAS program. At this time, students were underperforming at .65 against the .70 target established.
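As a hedged reading of the totals above (the report does not state the denominator): the per-topic counts imply roughly 49 usable responses per question (for example, 40 correct at a proportion of .81 implies about 49 attempts), so the total corresponds to 317 correct answers out of approximately 49 × 10 = 490 question attempts, and 317 / 490 ≈ .647, consistent with the reported .646.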

It would be difficult to ensure that faculty are teaching the exact same concepts to the students, as questions were not based on the course content summary, which is only required to be taught in 80% of its entirety. As a result, the most useful metric of this analysis is how the students did, on average, across all of the questions. Areas to be improved: More focus on risk calculations. Actions for improvement: A post assessment is being developed that will accommodate SLOs and requirements. Scenarios are being developed that will focus on this. Faculty have designated ITN 200 as a course for which a common syllabus will be developed. When will the improvements take place: A common syllabus will be developed before Summer 2019 and an outcomes assessment will be piloted in May 2019.
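To illustrate the kind of college-wide tally the planned automated tool and common post-assessment could support, a minimal sketch follows; the file name, CSV layout, and column names are hypothetical assumptions, not part of any existing NOVA or VCCS system.

# Minimal sketch (assumptions: a hypothetical CSV "itn200_responses.csv" with one row
# per student answer and columns: section, question, correct [0 or 1]).
import csv
from collections import defaultdict

totals = defaultdict(lambda: [0, 0])  # question number -> [correct count, attempt count]

with open("itn200_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        q = row["question"]
        totals[q][0] += int(row["correct"])
        totals[q][1] += 1

overall_correct = sum(c for c, _ in totals.values())
overall_attempts = sum(n for _, n in totals.values())

for q in sorted(totals, key=int):
    correct, attempts = totals[q]
    print(f"Question {q}: {correct}/{attempts} = {correct / attempts:.3f}")
print(f"Overall: {overall_correct}/{overall_attempts} = {overall_correct / overall_attempts:.3f}")

Run once per semester, a tally of this form would make per-topic proportions like those reported above (for example, .646 overall) reproducible across campuses and sections.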

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Describe current threats and explain how to continuously monitor the threats that may be present in the cyber realm (1, 2, 5, 6) CT [X]

Internet/Intranet Firewalls and E-Commerce ITN 263 Cybersecurity faculty coordinated on questions to ask ITN 263 students. These questions not only needed to be consistent with NOVA SLO objectives, but also with NSA CAE2Y designation outcome expectations to describe the differences between symmetric and asymmetric algorithms. Four multiple choice questions were provided. Faculty at all 5 campuses provided these questions at the time the final exam was given. Topics included:

#   TOPICS FOR QUESTIONS
1   Firewall rule sets
2   Firewall/network device functionality
5   Firewall functionality
6   Threats

Data collected: Fall 2017 Target: Students should answer questions with a 70% accuracy rate. This is consistent with CompTIA exam standards. Data collection: There were 6 sections of ITN 263 offered, including 2 ELI sections via Woodbridge. Of these, 3 sections provided results. Results were received from Alexandria (1 section), Loudoun (1 section), Annandale (0 sections), Manassas (0 sections), and Woodbridge campuses (1 section). ELI courses did not report, it is unknown if they received copies. Manassas did not report as its single section ended before the questions were distributed. Dual enrollment sections were not captured in this report. Of these 3 reporting sections, one section was not submitted in a format conducive to analysis. The

The results of this assessment revealed significant issues with students being able to configure a firewall. As this was not flagged as a previous issue, no previous remediation was implemented to improve the SLO. This is a topic that is difficult for experienced network administrators to understand. Areas to be improved: Firewall configuration instruction. Actions for improvement:

1. An assessment tool is in development and will be implemented in Fall 2019

Questions and answers are attached in file SLO Questions and Answers.docx

remaining two sections, with a total of 40 students, were assessed (23 in one class and 17 in another). Results by CLO Criteria:

#   TOPICS FOR QUESTIONS                    % CORRECT
1   Firewall rule sets                      27.5 (11/40 students)
2   Firewall/network device functionality   62.5 (25/40 students)
5   Firewall functionality                  67.5 (27/40 students)
6   Threats                                 82.5 (33/40 students)

Students demonstrated an accuracy rate of .6625 with all of the answers. Firewall functionality questions were answered with more inaccuracies than security basic questions, as the latter topics are covered in multiple courses. This is the first time this topic has been assessed in this course and in this program, so no previous data is available to trend performance.
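One hedged reading of the .6625 figure (the report does not show the calculation): pooling the four questions above with the two security-basics questions reported under the related SLO for this course gives 34 + 29 + 11 + 25 + 27 + 33 = 159 correct answers out of 6 × 40 = 240 responses, and 159 / 240 = .6625.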

which will allow this SLO to be assessed on a semester-by-semester basis in greater detail.

2. Other resources are being sought and may need to be developed in-house.

When will the improvements take place: Spring 2018 - The SLO will be reassessed after the development of an automated tool that will enable all of the program outcomes to be assessed on a semester-by-semester basis. This is estimated to take effect in Fall 2019.

Program Goals Evaluation Methods Assessment Results Use of Results To increase the number of program placed students in the AAS Cybersecurity degree program in order to meet workforce needs.

The distribution of program-placed students was received from OIR as it was reported 8/24/2016 for dates prior to Fall 2018. For Fall 2018, it was pulled directly from SIS. A request needs to be made for OIR to pull data on this, as the SIS data counts all students currently enrolled.

Target: According to the Bureau of Labor Statistics, the growth rate for information security jobs is projected at 37% from 2012-2022. Realistically, an initial target of 15% continued growth is anticipated for the next 3 years, while data can be trended and compared to national models. The number of program-placed students in the AAS Cybersecurity degree from its inception in Fall 2014 through Fall 2018 is shown below:

Year          # of Program Placed Students   Percent Change
Fall 2014     42                             N/A
Fall 2015     518                            1133.33%
Fall 2016     847                            63.51%
Spring 2017   1155                           36.36%
Fall 2017     1432                           24%
Fall 2018     2875                           101%

Results: According to these data, the target was exceeded.
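For reference, the Percent Change column follows the usual year-over-year formula, (current − prior) / prior × 100; for example, Fall 2018 versus Fall 2017 gives (2875 − 1432) / 1432 ≈ 1.01, or approximately the 101% shown.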

The Cybersecurity AAS degree is a new degree, launched in Fall 2014, in a highly visible career field. This has garnered great interest, increasing enrollments significantly in the degree program. Some of this rapid growth can be attributed to the transfer “pathways” into senior institutions (at present, there are 8 degree paths that take the AAS Cybersecurity degree in full satisfaction of the first two years). It is expected that this rapid growth will level out after three years beginning in the 2018-19 academic year. One issue with program enrollments that must be carefully monitored are students that are moving from the ASIT into the AAS Cybersecurity degree due to restricting academic requirements at GMU on the BSIT degree. Research needs to be done to determine the number of students previously program placed in the ASIT that are now in the AAS

Cybersecurity degree as this may negatively impact completion rates, as discussed below. Area to be improved:
• Accurate reporting is an absolute necessity to estimate capacity (room and resource) requirements. OIR should provide this data automatically to program chairs to facilitate the completion of this report with accurate, consistent data.

• Increase targeting of high school programs to attract younger students into the program.

Actions for improvement:
• Cybersecurity department presence at NOVA CTE events with high school students, campus open houses; participation in DHS NCCIC tours.

• A student run cyber conference was hosted at Loudoun last summer.

• Increase in the number of dual enrollment classes offered at the high schools.

When will the improvements take place:
• Planned improvements were implemented, with more forthcoming this next academic year.

• This goal was assessed and needs to be reassessed at the end of the Spring 2018 when a more comprehensive enrollment analysis can be performed.

Assessed: Annually

To encourage students to complete their A.A.S. in

Data received by the Registrar’s office 8/2016 and SIS graduation queries performed 10/4/2016.

Target: TBD – 20% of Enrolled Students

As is the case with a growing program, graduations trend upwards. While the program is a

Cybersecurity degree prior to transferring to a four-year institution.

Number of students graduating with an A.A.S. in Cybersecurity is listed below.

Term               # of Graduates   Percent Change
Fall 2014          1                N/A
Spring 2015        3                200%
Summer 2015        0                -100%
2014-2015 Totals   4                N/A
Fall 2015          7
Spring 2016        12
Summer 2016        15
2015-2016 Totals   34
Fall 2016          30
Spring 2017        64
Summer 2017        17
2015-2017 Totals   107
Fall 2017          49               39% over Fall 2016
Spring 2018        86               35% increase over Spring 2017

Change from 2014-15 to 2016-17: 4 to 107 students.

The target has not yet been met. This target needs to be revisited as graduation rates vary by full-time versus part-time students, such that the number of enrollments into the program will not be an indicator of how many of those will graduate within two or seven years (often the case, according to OIR data, with part-time students). The program is too new to meaningfully report the percent of increase in graduates across the last two years. Certainly, the cluster’s program goal of a 10% increase was realized Summer 2016 over Summer 2015 totals. The data is presented to enable the reader to further analyze it.

terminal (A.A.S.) degree, the pathways it provides to senior institution cybersecurity programs are not only a great attractor to the program, but also serve to retain students in the program. Major transfer programs, such as those at GMU, GWU, and Capitol, either prevent students from transferring into the senior institution’s cybersecurity program without the degree, or offer significant financial incentives to transfer. It is important to ensure that students program-placed in the AAS Cybersecurity degree are appropriately advised as to the demands and expectations on students in the field. While IT is, by and large, an occupational field, the emphasis on critical thinking skills and academic rigor in the program is increasing. Area to be improved: Advising support to faculty and counselors to ensure that students understand career demands in the field and are appropriately placed. Actions for improvement:
• Discussion with students at advisement sessions and orientation.

• Advising sessions with Faculty Advisors.

• Update: Faculty have been assigned to all new and pathway students.

When will the improvements take place: Improvements to advising will take place Fall 2018. Assessed: Annually.

The Fall 2014 semester is the first year the A.A.S. in Cybersecurity degree is listed in the printed NOVA catalog. Prior to students being able to graduate with the A.A.S. in Cybersecurity degree, students were able to graduate with the A.A.S. in Information Technology degree and use the courses in the Cybersecurity certificate as their IT elective courses. Students selecting that option could graduate with the A.A.S. in Information Technology degree and the Cybersecurity certificate. This resulted in one student being able to meet all of the program requirements in the first year that the degree was offered.

Provide up-to-date course content and new certificate programs

Review the course content summaries for the Cybersecurity courses that were updated in December 2014 and degrees/certificates listed in the NOVA catalog for currency and relevance in the current job market.

Semester /Year: 2017-18 Target: Two courses per semester will be evaluated and updated.

• Courses have been identified for revision and updating by the Dean and Chair for the Spring 2019 semester, prior to conversion to Canvas.

• Faculty are also working with the IST Peer Group across the VCCS interested in updating the cybersecurity courses.

• A Virtual Range was developed by Virginia Tech and is being explored by NOVA for use with ITN 200 and other labs.

• A Critical Infrastructure course was developed and offered in the Fall 2017. Also offered was a Malware Reverse Engineering course. Enrollments were low and it will likely not be offered again. These new courses are developed to meet very topical needs and may be proposed as a new course to the VCCS after a couple of years.

• Note that the curriculum is driven, in part, by national security standards. NOVA is an NSA designated CAE2Y (Center of Academic Excellence for Two-Year colleges) and is required to comply with established academic standards.

The target was not met this past academic year, with only one course assessed in Spring 2018, as the Division underwent significant reorganization and its decentralized nature hindered communications across the college. This has been resolved with the new reporting structure put into place Fall 2018.

Areas to be improved:
• Courses are being revised at the VCCS. Course content updates at this point may be unnecessary as they may be redone this semester.

• Dissemination of new course updates to adjunct and full-time faculty, so that all are teaching from the same material.

Actions for improvement: Coordinate course updates with Advisory Board to ensure they are current and relevant. When will the improvements take place: May 2019 Next assessment: End of Spring 2019

Develop competencies-based program assessments.

Review SLO content so that it can be migrated to a full competencies-based assessment program. Students should, eventually, sit for a program assessment. This will be supported by the Virginia Cyber Range after Spring 2017.

Pilot - Fall 2018 Semester Target: 80% of graduating students will pass the competencies exam at a pass rate of 70%. Update: A pilot in two ITN 260 classes was attempted; however, the vendor had significant issues that prevented all but one or two students from successfully authenticating to the application. A new pilot is planned for Spring 2019.

Area to be improved: Greater participation in the SLO process by faculty. Actions for improvement:
• Collect the SLO results for a more accurate picture of the SLOs.

• Move away from paper-based, exam-like questions to projects and assessments that demonstrate hard skills.

When will the improvements take place: The improvement will take place during the Fall 2019 semester.


Annual Planning and Evaluation Report: 2017-2018 Dental Assisting Program, Certificate

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Dental Assisting Program prepares students to perform chairside assisting skills, dental laboratory and dental practice management procedures, and exposing radiographs. The program prepares students to perform advanced functions as delegated by the Virginia Board of Dentistry.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Demonstrate knowledge and understanding for managing protocols for infection control practices and biohazardous waste.

Chairside Assisting I Proficiency Evaluation DNA 113 Direct Measure:

1. Proper placement of PPE
2. Proper use of disinfectants
3. Proper handwashing technique

The student learning outcome was rated as follows: 5 = Excellent 4 = Above Average 3 = Average 2 = Below Average 1 = Poor 0 = Student fails to perform task

Campus/ Modality

# of Total

Sections Offered

# Sections assessed

# Students Assessed

ME only 1 1 14 ELI N/A N/A N/A DE* N/A N/A N/A Total 1 1 14

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 100% of the students will meet the 75% competency level on the Proficiency Evaluation.

Results by Campus/ Modality

Fall 2017 Fall 2016 Average

Score Percent > [target]

Average Score

Percent > [target]

ME 89 100 > 75 91 100 > 75

Results by SLO Criteria:

Results by SLO Criteria/

Question Topics

Fall 2017 Fall 2016

Average Score

% of Students > [target]

Average Score

% of Students > [target]

1 91 100 > 75 97 100 > 75 2 96 100 > 75 94 100 > 75 3 95 100 > 75 95 100 > 75 Total 94 95.3

Current results improved: [ X] Yes [ ] No [ X] Partially Strengths by Criterion/ Question/Topic: Knowledge is appropriate for application in the clinical setting. Weaknesses by Criterion/ Question/Topic: Students fail to understand the importance of proper sequencing of the placement of PPE, proper use of disinfectants, and proper handwashing technique in preventing the transmission of diseases.

Previous action(s) to improve SLO: Peer review before evaluation, discussion and demonstration of proficiency to be evaluated in the laboratory session. Review of material in the lecture component of the class. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: For proper sequencing of placement of personal protective equipment (PPE). Current actions to improve SLO based on the results: For future assessments for this student learning outcome, the instructors in the Fall of 2018 will demonstrate the proper placement of PPE and the use of disinfectants before the graded proficiency, distribute handouts, and review the lecture material on this SLO. Next assessment of this SLO: This Student Learning Outcome will be assessed by the Program Director in the Spring of 2019.

Students will be able to identify legal and ethical aspects of clinical practice.

Dental Office Practice Management Test II DNA 130 Direct Measure: Question Topics

1. Knowledge of patient confidentiality

Semester/year data collected: Spring 2018 Target: 100% of the students will meet the 75% competency level on Test II. Results:

Previous action(s) to improve SLO: Reevaluate and review test questions on content evaluated in the Spring semester of 2017. Target Met:

2. Knowledge of protocol for protecting patient information
3. Knowledge of phone protocol

The SLO was rated as follows: A = 92 – 100% B = 84 – 91% C = 75 – 83% D = 66 – 74% F = 65% and below

Campus/ Modality

# of Total

Sections Offered

# Sections assessed

# Students assessed

ME only 1 1 13 ELI N/A N/A N/A DE* N/A N/A N/A Total 1 1 13

*Dual-enrollment

Results by Campus/ Modality

Spring 2018 Spring 2017

Average Score

% of students >

[target] Average

Score % of

students > [target]

ME 91.3 100 > 75 91.7 100 > 75 Results by SLO Criteria:

Results by SLO Criteria/

Question Topics

Spring 2017 Spring 2016

Average Score

% of Students > [target]

Average Score

% of Students > [target]

1 92 100 > 75 95 100 > 75 2 72 92 > 75 77 100 > 75 3 90 100 > 75 93 100 > 75 Total 84.6 88.3

Current results improved: [ ] Yes [ X ] No [ ] Partially Strengths by Criterion/ Question/Topic: Knowledge is appropriate for application in a clinical setting. Weaknesses by Criterion/ Question/Topic: Protecting patient information questions were the most often missed questions. These test questions and the lecture content will be reviewed.

[ X] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Information on protecting patient information. Current actions to improve SLO based on the results: Review protecting patient information questions and course content in the Spring of 2018. Next assessment of this SLO: This SLO will be assessed by the Program Director in the Spring of 2019.

Students will be able to perform dental assisting expanded duties.

Chairside Assisting I Proficiency Evaluation – Application of a Topical Anesthetic Agent DNA 113 Direct Measure:

1. Identifies site of application 2. Application amount of topical anesthetic 3. Application time of topical anesthetic

The student learning outcome was rated as follows: 5 = Excellent 4 = Above Average 3 = Average 2 = Below Average 1 = Poor N/A = Not Applicable

Campus/ Modality

# of Total

Sections Offered

# Sections assessed

# of Students assessed

ME only 1 1 13 ELI N/A N/A N/A DE* N/A N/A N/A Total 1 1 13

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 100% of the students will meet the 75% competency level on the Proficiency Evaluation. Results:

Results by Campus/ Modality

Fall 2017 Fall 2016

Average Score

% of students >

[target] Average

Score % of

students > [target]

ME 93 100 > 75 93 100 > 75 Results by SLO Criteria:

Results by SLO Criteria/

Question Topics

Fall 2017 Fall 2016

Average Score

% of Students >

[target] Average

Score % of

Students > [target]

1 93 100 > 75 92 100 > 75 2 92 100 > 75 96 100 > 75 3 96 100 > 75 96 100 > 75 Total 93.6 94.6

Current results improved: [ ] Yes [ ] No [ X ] Partially Strengths by Criterion/ Question/Topic: Knowledge is appropriate for application in the clinical setting.

Previous action(s) to improve SLO: Review material with students in the Fall of 2017 to accurately identify the oral location site for placement of the topical anesthetic agent, placement of the correct amount, and the proper time frame for application. Target Met: [ X] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Location of application of local anesthetic agent. Current actions to improve SLO based on the results: Use the manikins and skulls to demonstrate correct site of application of local anesthetic agent in the Fall of 2018.

Weaknesses by Criterion/ Question/Topic: A drop in the assessment of amount of topical anesthetic agent will be addressed in the lecture material and in the laboratory sessions in the Fall of 2019.

Next assessment of this SLO: The student learning outcome will be assessed by the Program Director in the Spring of 2019.

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Understand and demonstrate knowledge of radiation safety measures in order to produce diagnostic radiographic surveys. CT [ X ]

Direct Measure: Final Exam Question Topics:

1. Patient protection 2. Radiographer protection

Campus/ Modality

# of Total

Sections Offered

# Sections assessed

# of Students Assessed

ME only 1 1 13 ELI N/A N/A N/A DE* N/A N/A N/A Total 1 1 13

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 100% of students will score 75% or higher on question topics. Results:

Results by Campus/ Modality

Fall 2017 Fall 2016

Average Score

% of students >

[target] Average

Score % of

students > [target]

ME 85 100 > 75 89 100 > 75 Total 85 89

Results by CLO Criteria:

Results by CLO Criteria/

Question Topics

Fall 2017 Fall 2016

Average Score

% of Students >

[target] Average

Score % of

Students > [target]

1 89 100 > 75 88 100 > 75 2 90 100 > 75 86 100 > 75 Total 89.5 87

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criterion/ Question/Topic: Knowledge is critical for application in the clinical setting. Weaknesses by Criterion/ Question/Topic: Student performance in this area increased from the previous year but there is room for improvement in the content of patient and radiographer protection, and radiation safety issues. Lecture material and laboratory safety measures will be evaluated in the Fall of 2018.

Previous action(s) to improve CLO if applicable: Review test questions and provide rationale. Target Met: [ X] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Even though the average score was well above the target of 75%, radiation safety is very important for patient and radiographer protection. The material will be reviewed in specific lectures to increase student performance on these exam questions in the lecture portion of the course in the Fall of 2018. Current actions to improve CLO based on the results: Continue to review and demonstrate radiation safety measures in the lecture and laboratory session. Next assessment of this CLO: The SLO will be assessed by the Program Director in the Spring of 2019.

Program Goals Evaluation Methods Assessment Results Use of Results Increase number of program-placed students

Program Student Enrollment. A maximum enrollment of 16 students is approved by the accrediting body, the Commission on Dental Accreditation, for the Dental Assisting program.

Target: 100% of maximum capacity of 16 students Results for Past 5 years:

Fall   Number of Students   Percentage Increased
2017   14                   15%
2016   13                   -8%
2015   14                   27%
2014   11                   120%
2013   5                    N/A

Target Met: [ ] Yes [X ] No [ ] Partially

Previous action(s) to improve program goal:
1. Participation in job fairs in the Spring of 2018.
2. Discussion of program at Curriculum Advisory Committee meetings in the Fall of 2018 and Spring of 2019.




3. Discussion by the program director with colleagues and peers at Dental Continuing Education Seminars and Meetings.

4. Student Services identification of potential dental assisting students and career counseling during the academic year.

5. Participate in high school career days in the Spring of 2019.

6. Outreach programs and community service programs in the academic year of 2018-2019.

7. Clinical site visits to market the program in the Spring of 2018.

Most recent results: Although more students continue to be accepted for the target mark of 16, students do not enter the program for the following reasons: financial, change in program interest, and family responsibilities. Results improved: [ X] Yes [ ] No [ ] Partially Current action(s) to improve program goal: Online identification of students who may be interested in the dental assisting program and track them during the academic year of 2018-2019. Assessed: Annually

Increase program graduation rates

Graduation Rates Target: 100% graduation total Results for Past 5 Years:

Academic Year   Graduates     Percentage Increased
2017-18         13/14 = 93%   1%
2016-17         12/13 = 92%   6%
2015-16         12/14 = 86%   4%
2014-15         9/11 = 82%    2%
2013-14         4/5 = 80%     N/A

Target Met: [ X] Yes [ ] No [ ] Partially
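As a note on reading the table (an interpretation, since the report does not define the columns): the Graduates column is graduates over enrolled students, e.g., 13 / 14 ≈ .93, reported as 93% for 2017-18, and the Percentage Increased column appears to track the change in that rate in percentage points (93% − 92% = 1 for 2017-18).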

Previous action to improve program goal: Actions to improve graduation rates include targeting at-risk students within the first four



weeks of the Fall semester of the program. Most recent results: 2018 Graduation rate = 93% Results improved: [ X ] Yes [ ] No [ ] Partially Current actions to improve program goal: Identifying students who are at-risk, referring them to the academic success counselor, remediation, and follow-up to ensure continued program success within the first four weeks of the semester. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Dental Hygiene, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The program is designed to prepare students to serve in a dynamic and growing health profession as members of the dental health team. After successful completion of the program, the student will be eligible to take the National Board Dental Hygiene Examination and professional licensure examinations. Upon successful completion of the licensing process, the title “Registered Dental Hygienist” (R.D.H.) is awarded.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Evaluate the outcomes of treatment for determining a patient’s subsequent treatment needs.

Dental Hygiene IV DNH 244 Direct Measure: Evaluation of DH Care Skill evaluation Question Topics:

1. Identifies evaluative criteria and expected outcomes of care.

2. Accurately assesses periodontal status to determine whether goals are being met.

3. Interprets and summarizes the findings accurately.

4. Determines modifications to the ongoing treatment sequence or to the maintenance care plan.

5. Recommends an appropriate recall interval based on the findings.

6. Writes an evaluative statement in the treatment records if the goals of treatment have been met.

7. Documents the completion of this service in the treatment record.

Sample Size (Specify N/A where not offered)

Campus/ Modality

# of Total Sections Offered

# Sections assessed

# Students assessed

ME only 1 1 31 ELI N/A N/A N/A DE* N/A N/A N/A TOTAL 1 1 31

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 85% of students will pass with an 80% or higher on first attempt. Results by In-Class, ELI, Dual Enrollment:

Results by

Campus/ Modality

Fall 2017 Fall 2015

Average Score

Percent > target

Average Score

Percent > target

ME 97.9 100 84.4 76.7 Results by SLO Criteria:

Results by SLO Criteria/

Question Topics

Fall 2017 Fall 2015

Average Score

% of Students > target

Average Score

% of Students > target

1 99% 99% (1%NI) 3 97.4% 96.6% 4 99% 99% (1%NI) 6 96.1% 90.3% 73.3% 73.3% (26.7%NI)

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criterion/ Question/Topic: As noted above for Criterion #6, there was a significant improvement since the last assessment period. Weaknesses by Criterion/ Question/Topic: Based on the above results for the criteria assessed, no significant weaknesses were noted.

Previous action(s) to improve SLO: Prior improvements suggested reviewing the evaluating criteria and the role that plays in determining the outcome of care to improve the students’ analysis to determine outcomes of dental hygiene care. In addition, reviewing the connection between periodontal status and treatment to determine if the goal has been met would strengthen the students’ analytical abilities in this area. These improvement suggestions by faculty have shown to be beneficial as the students’ average score and the number reaching the target increased significantly since the last assessment. This will be a continuous activity each semester. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: No significant improvement areas were identified based on the results for current year. Current actions to improve SLO based on the results: Currently, possible actions to improve this SLO based on the results could be to review the rubric. Next assessment of this SLO: Fall 2019

Communicate the provision of oral health care services with diverse population groups

Dental Hygiene IV DNH 244 Direct Measure: Oral Health Education Part I Skill Evaluation Question Topics:

1. Selects the appropriate patient for plaque control
2. Conducts a caries risk assessment
3. Asks the patient if they have any areas of concern
4. Points out areas of concern in the patient’s mouth
5. Explains the disease process clearly and at a level the patient can understand
6. Utilizes disclosing solution to illustrate and evaluate the patient’s oral hygiene technique
7. Observes the patient’s present TB techniques and modifies if necessary
8. Observes the patient’s present flossing technique and modifies if necessary
9. Asks the patient to set short-term goals to be practiced until next session
10. Uses appropriate audio-visual and educational material to clarify the presentation

Sample Size (Specify N/A where not offered)

Campus/ Modality

# of Total Sections Offered

# Sections assessed

# Students assessed

ME only 1 1 31 ELI N/A N/A N/A DE* N/A N/A N/A Total 1 1 31 *Dual-enrollment

Semester/year data collected: Fall 2017 Target: 80% of students will achieve 85% or better on first attempt. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/ Modality

Fall 2017 Spring 2016* Average

Score Percent >

target Average

Score Percent >

target ME 88.4 83.8 92.2 80

*The measurement tool used in Spring 2016 is different from the current measurement tool; therefore, there is not a direct comparative correlation. Also, this is the first time this measurement tool has been assessed, so there is no comparative data available. Results by SLO Criteria:

Results by SLO Criteria/

Question Topics

Fall 2017

Average Score % of Students > target

3 89.6 87.1 5 86.5 77.4 9 90.9 87.1

Current results improved: Undetermined, as this measurement tool has not been assessed previously. Strengths by Criterion/ Question/Topic: Criteria 3 and 9 above indicate that the target is being reached. Weaknesses by Criterion/ Question/Topic: Criterion 5 clearly indicates the target was not met and needs improvement.

Previous action(s) to improve SLO: Previous suggestions involved needing improvement in population description. This was addressed with the class and will continue to be stressed with future classes, i.e., the importance of including all aspects of the chosen population and how these aspects affect the outcome of the program. Faculty will have examples of previous program descriptions that met all requirements for the students to review. The above action will be carried out by the professor of the course when teaching the concepts of target populations in Fall 2018. Target Met: [ ] Yes [ ] No [X ] Partially Based on recent results, areas needing improvement: Overall scores for the skill evaluation met the target. However, when the individual criteria were broken down, criteria 5 needs improvement. Current actions to improve SLO based on the results: To improve results with the criteria 5, this area will be reviewed and remediated by didactic professors as well as clinical instructors in Spring 2018 to enhance the student’s ability to communicate more clearly to the patient. The suggestion of using layman terms to assist the patient with understanding and the use of interpreter’s will be encouraged. Next assessment of this SLO: Fall 2019

Recognize the importance of discerning and managing ethical issues consistent with a professional code of ethics

Office Practice and Ethics DNH 230 Direct Measure: Case Study Discussion Board Sample Size (Specify N/A where not offered)

Semester/year data collected: Fall 2017 Target: 80% of students will score 75% or above. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/ Modality

Fall 2017* Average Score Percent > target

ME 90 100

Previous action(s) to improve SLO: N/A Target Met: [ X] Yes [ ] No [ ] Partially

Campus/ Modality

# of Total Sections Offered

# Sections assessed

# Students assessed

ME only 1 1 31 ELI N/A N/A N/A DE* N/A N/A N/A Total 1 1 31

*Dual-enrollment

*No comparative data available; the first time this measurement tool has been used. Results by SLO Criteria:

Results by SLO Criteria/ Question

Topics

Fall 2017

Average Score % of Students > target

Case Study discussion board 90 100

Current results improved: [X ] Yes [ ] No [ ] Partially Strengths by Criterion/ Question/Topic: The results indicate that students are successful in analyzing case studies and coming to an ethical solution. Weaknesses by Criterion/ Question/Topic: The weakness was that this assignment didn’t utilize a rubric and break the discussion down into components.

Based on recent results, areas needing improvement: No significant weaknesses were noted. Current actions to improve SLO based on the results: The discussion board wasn’t broken into individual components so a rubric will be developed in the Spring 2019. Next assessment of this SLO: Fall 2020

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

Students will calculate, interpret, and use numerical and quantitative information in a variety of settings. [X ] QR

Dental Public Health II DNH 227 Direct Measure: Program Development Project Evaluation: Includes a comparative analysis using appropriate statistics, graphs and charts, and accurate labeling and explanation of graphs. Includes determination of success of reaching program goals. Sample Size (Specify N/A where not offered)

Campus/ Modality

# of Total Sections Offered

# Sections assessed

# Students assessed

ME only 1 1 31 ELI N/A N/A N/A DE* N/A N/A N/A Total 1 1 31

*Dual-enrollment

Semester/year data collected: Spring 2018 Target: 80% of students to achieve 75% or higher Results by In-Class, ELI, Dual Enrollment

Results by

Campus/ Modality

Spring 2018 Spring 2016

Average Score

Percent > target

Average Score

Percent > target

ME 93.7 100 92.2 100 Results by CLO Criteria:

Results by CLO Criteria/ Question

Topics

Spring 2018

Average Score % of Students > target

Program Development Project 94.5 92.8

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criterion/ Question/Topic: The results show that the students understand how to interpret the statistical analysis and how that can be used to demonstrate the success of a community health program.

Previous action(s) to improve CLO if applicable: N/A Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: No significant weaknesses were noted so no suggestions for improvement are being made at this time. Current actions to improve CLO based on the results: This Rubric is being revised for Spring 2020. Next assessment of this CLO: Fall 2020

Weaknesses by Criterion/ Question/Topic: Rubric is being revised to further break down into components to better identify strengths and weaknesses.

Program Goals Evaluation Methods Assessment Results Use of Results

Program goal on program-placed students (e.g., To increase number of program-placed students)

Short description of method(s) and/or source of data: OIR Planning and Evaluation Report: Distribution of Program Placed Students by Curriculum and Award Type 2014-2017

Fall 2018 Target: To have the program operating at full capacity, which would reflect enrollment of 32 students in even-numbered years and 39 students in odd-numbered years, as our distance site accepts 7 students in odd-numbered years. Results for Past 5 years:

Fall   Number of Students   Percentage Increased
2018   72                   0
2017   72                   1.4
2016   71                   -6.5
2015   76                   -12.6
2014   87

Target Met: [ X ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): Between 2015 and 2016, there was a decrease due to our distance program at Germanna Community College deciding to accept students every other year.

Previous action(s) to improve program goal: No changes were made for improvement as this is a definitive number due to capacity. Most recent results: Stable Results improved: Remained the same Current action(s) to improve program goal: None (see comment above) Assessed: Annually

Program goal on graduation (e.g., To increase program graduation totals)

Short description of method(s) and/or source of data: Department Records for Student Enrollment and Completion 2013-2017 OIR 2010-2016 Planning and Evaluation Reports Number of Graduates by Program and Specialization

2018-19 Target: 80% of the students complete the program and graduate Results for Past 5 Years:

Academic Year   Number of Graduates   Percentage Increased   Percentage of Completion
2018            31                    -16                    30/31 = 96.7%
2017            37                    2.7                    37/39 = 94.8%
2016            36                    -2.7                   36/39 = 92.3%
2015            37                    5.7                    37/39 = 94.8%
2014            35                                           35/39 = 89.7%

Target Met: Not Comparable Comparison to previous assessment(s): 2018 graduating class did not have a cohort from GCC (6 students). They are accepting students every other year now.

Previous action to improve program goal: Remediation procedures have been implemented by the department to improve student success. The data from “Number of Graduates by Program and Specialization” indicate the Dental Hygiene program is meeting this goal. The average program completion rate is 93.6% from 2014-2018. When comparing the data from the past assessment periods, similar trends in the percentage of program completion rate were noted. The results indicate the Dental Hygiene program is meeting its target of 80% of the students completing the program and graduating. Although this goal is being met, an improvement could perhaps be made by reviewing and/or revising the applicant qualification/evaluation portion of the acceptance process. This will be discussed

between Student Services and the Dental Hygiene department for the 2019 application period. This program goal is assessed on an annual basis. The data will be reviewed and reported again for the Academic Year 2018-19. Most recent results: There was a change in the number of graduates due to the change in the number of enrolled students, as GCC didn’t have a cohort graduating in 2018. Results improved: Change in Enrollment Pattern Current actions to improve program goal: The Science portion of the TEAS test will be scored as part of the candidate criteria starting with the January 2019 application cycle. Previously, this section was required to be taken but was not scored on the candidate’s application. This could improve the quality of candidates, as this is a science-heavy curriculum. Assessed: Annually
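As an arithmetic check of the average completion rate cited above: (96.7 + 94.8 + 92.3 + 94.8 + 89.7) / 5 = 468.3 / 5 = 93.66%, consistent with the 93.6% average reported for 2014-2018.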


Annual Planning and Evaluation Report: 2017-2018

Diagnostic Medical Sonography, A.A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This curriculum is designed to prepare students to produce diagnostic images of the human body using special equipment to direct high frequency sound waves into different anatomic structures in a patient’s body. The sonographer is a central member of the healthcare team and assists the radiologist in gathering diagnostic data for interpretation. NOVA’s program emphasizes didactic and “hands-on” practice of sonographic techniques in a well-equipped scanning laboratory at the Medical Education Campus in Springfield, Virginia. Clinical experience is acquired at numerous area hospitals and private medical affiliates. Students in the Diagnostic Medical Sonography degree program learn to perform ultrasound of the Abdomen and Small Parts as well as Obstetric and Gynecologic sonography. Upon successful completion of the degree requirements, the student will be eligible to apply to take the American Registry for Diagnostic Medical Sonography (ARDMS) examination(s) leading to credentials as a Registered Diagnostic Medical Sonographer (RDMS®).

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Apply principles of ultrasound physics in the operation of medical sonographic equipment to recognize and perform image optimization techniques.

Acoustical Physics and Instrumentation I DMS 208

Direct Measure: Week 15 Quiz
Evaluation Criteria: Multiple choice quiz questions on the topics of ultrasound system receiver functions (optimization controls). Each question was worth 5 points. Quizzes were not identical because questions came from larger question pools built into the Blackboard learning management system.
Question Topics:
1. Compensation (receiver sweep gain)
2. Returning echoes
3. Controls
4. Compression
5. Receiver functions
6. Receiver overall gain
7. Low amplitude echoes
8. Read zoom function
9. Contrast resolution
10. Cine-loop function
Sample Size (Specify N/A where not offered):
Campus/Modality | Total # Sections Offered | # of Sections Assessed | # Students Assessed
ME only | 1 | 1 | 14
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 14

*Dual enrollment

Semester/year: Fall 2017

Target Overall Score: 75/100 points • Percent of Students Meeting Target Score: 64.3% • Percent of Students Below Target Score: 35.7% • Average Student Overall Score: 79.3% • Range of Scores: 65 - 100
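The summary figures above (average, percent meeting target, range) follow from simple arithmetic over the individual overall quiz scores. As an illustration only, with a hypothetical score list rather than the actual Fall 2017 data, a minimal sketch of that calculation:

```python
# Hypothetical overall quiz scores, one per student (illustrative only).
scores = [65, 70, 72, 78, 80, 82, 85, 88, 90, 92, 95, 97, 98, 100]
target = 75  # target overall score out of 100 points

average = sum(scores) / len(scores)
met = sum(1 for s in scores if s >= target)

print(f"Average student overall score: {average:.1f}")
print(f"Percent of students meeting target: {met / len(scores) * 100:.1f}%")
print(f"Percent of students below target: {(len(scores) - met) / len(scores) * 100:.1f}%")
print(f"Range of scores: {min(scores)} - {max(scores)}")
```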

Results by In-Class, ELI:

Results by Campus/Modality | Fall 2017 Average Score | Fall 2017 Percent > target | Fall 2016 Average Score | Fall 2016 Percent > target
ME | 79.3 | 64.3% | 82.3 | 100%
Results by SLO Criteria:

Results by SLO Criteria/Question Topics | Fall 2017 Average Score (out of 5.0) | Fall 2017 % of Students Overall Score > 75/100 | Fall 2016 Average Score | Fall 2016 % of Students Overall Score > 75/100
1 | 1.67 | N/A | 1.43 | N/A
2 | 5.0 | N/A | 5.0 | N/A
3 | 5.0 | N/A | 4.38 | N/A
4 | 2.5 | N/A | 4.0 | N/A
5 | 4.45 | N/A | 5.0 | N/A
6 | 3.89 | N/A | 5.0 | N/A
7 | 2.28 | N/A | 2.5 | N/A
8 | 5.0 | N/A | 5.0 | N/A
9 | 4.42 | N/A | 5.0 | N/A
10 | 5.0 | N/A | 5.0 | N/A
Total | 3.92 | 64.3% | 4.23 | 100%

Current results improved: [ ] Yes [ ] No [ X ] Partially

Purpose of Evaluation: To assess student awareness of ultrasound machine controls and their corresponding image optimization effects. Previous Action(s) to Improve SLO: In the 2016-17 assessment of this SLO, copies of the student lab assignments for DMS 218 were not being saved in the permanent record. Rather, the lab assignments were handed back to the students, making tracking of individual evaluation measures impossible. The recommendation at that time was to ensure final copies of student assessments are saved for later evaluation. This has been implemented via the requirement that students submit copies of their assignments online. This holds true for all quizzes as well, since some are conducted fully online and the paper versions are scanned in electronically for record keeping. Target Met: [ ] Yes [ ] No [ X ] Partially Based on Recent Results, Areas Needing Improvement: The 2017 cohort performed below their 2016 cohort peers in questions related to


Strengths by Criteria/Question Topic: In the 2017 cohort, students on average performed better on criteria 1 and 3 than students in the 2016 cohort and performed as well as the 2016 cohort on criteria 2, 8, and 10. Also, more students (5) scored in the 90-100 range on the overall assessment than in the 2016 cohort (3). Weaknesses by Criteria/Question Topic: In the 2017 cohort, students on average performed worse than the 2016 cohort on criteria 4, 5, 6, 7, and 9. Also, more students (5) failed this assessment in 2017 than in the 2016 cohort (0). Note that it was not possible to compare the percentage of students meeting the target for each criterion because the questions were drawn from exam pools; thus, not all students in the cohort answered a question for each criterion. For example, in 2017 criterion 10 appeared on 19 student quizzes whereas criterion 4 appeared on only 4 student quizzes.

the following image optimization techniques:

• Compression • Compensation • Amplification • Reject/Filter

Many of these questions required student understanding of key term definitions, alternative terms and the application of these machine controls. It will be useful to identify if one or more of these required elements is responsible for the lower average scores in 2017. Another area needing improvement involves the overall score for the assessment in 2017. Overall there were more students who performed with a higher grade on the assessment in 2017 than in 2016. However, there were five students who failed the assessment in 2017 versus no failures in 2016. For both years, the majority of each class met the target score which lends confidence that students are understanding and accomplishing this SLO successfully. One factor that may have been a reason for the poor grade results of the five students in 2017 is that the syllabus allows for the lowest quiz grade to be dropped. This knowledge may lead students to feel they have leeway to not fully prepare for one assessment since they know it will not count against them. Action Taken by DMS Faculty: This particular assessment will be analyzed by the course instructor before Fall 2019 for question quality in terms of discrimination and difficulty since a large majority of students are performing so well.


Also, the course instructor for DMS 208 will implement an assignment to help students match optimization controls with their alternative names and effects. The DMS Program Director will ask students for feedback during the Spring 2019 semester regarding whether knowing in advance that the lowest quiz grade will be dropped from the overall course grade provides a disincentive to prepare effectively for quizzes. Next Assessment of this SLO: This SLO will be assessed again in 2019-20.

Provide high quality patient care in an ethical, legal, safe, and effective manner.

Coordinated Internship DMS 190

Direct Measure: End of Term Evaluation – Ethics and Attitude Evaluation
Ranking Scores:
• Excellent = 100
• Above Average = 90
• Average = 80
• Below Average = 70
Respect for Patient Privacy by:
1. Respecting patient modesty
2. Discussing patient history and findings with appropriate individual(s)
3. Reserving medical questions for the appropriate time/place
Proper Patient Communication by:
4. Introducing her/himself to the patient
5. Confirming the patient’s identity verbally or by nametag
6. Confirming the type of exam in lay terms with the patient or caregiver
7. Confirming the proper exam preparations with the patient or caregiver
8. Explaining the type of exam in lay terms with the patient or caregiver
9. Keeping patient informed of exam progress

Semester/Year: Fall 2017 Target Overall Score: 80/100 in each of the evaluation criteria

• Percent of Students Meeting Target Score = 100% • Percent of Students Below Target Score = 0% • Average Student Overall Score = 80.5 • Range of Scores = 80-100

Results by In-Class, ELI:
Results by Campus/Modality | Fall 2017 Average Score | Fall 2017 Percent > 80/100 | Fall 2016 Average Score | Fall 2016 Percent > 80/100
ME | 80.5 | 78% | 97.3 | 100%
Results by SLO Criteria:

Results by SLO Criteria/Question Topics | Fall 2017 Average Score | Fall 2017 % of Students > 80% | Fall 2016 Average Score | Fall 2016 % of Students > 80%
1. | 81.8 | 78% | 99.2 | 100
2. | 81.2 | 78% | 99.2 | 100
3. | 80.6 | 78% | 97.7 | 100
4. | 81.2 | 78% | 96.9 | 100
5. | 81.8 | 78% | 96.2 | 100
6. | 80.6 | 78% | 96.2 | 100
7. | 81.2 | 78% | 96.9 | 100
8. | 79.4 | 78% | 96.2 | 100
9. | 80.0 | 78% | 96.2 | 100
10. | 80.6 | 78% | 99.2 | 100

Purpose of Evaluation: For clinical instructors to evaluate students’ professional readiness to provide high quality patient care. Previous Action(s) to Improve SLO: This SLO was revised and accepted for implementation during the 2016-17 session. There are not prior actions available for comparison as this is the first time it is being assessed. Target Met: [ ] Yes [ ] No [ X] Partially Based on Recent Results, Areas Needing Improvement: The primary weakness with these results is the fact that evaluations were not completed for 3 students in the 2017 cohort. That missing data affects the averaged calculations dramatically and is inconsistent with the 2016 cohort that had 100% compliance with submission. In addition, some students need improvement in the criteria related to the category of Self-Confidence


Adherence to Program Dress Code and Personal Cleanliness:
10. Wearing the appropriate NVCC uniform
11. Having hair clean, nails short and well kept
12. Wearing an NVCC student nametag and patch
Seeking Assistance When Necessary:
13. When moving difficult patients
14. In an emergency situation
15. Any situation where the student is not competent
Self-Confidence by:
16. The ability to adapt to new situations
17. Instilling confidence in patients
18. Demonstrating initiative
19. Basing decisions on clear thought
Total: 100 points

Sample Size (Specify N/A where not offered):

Campus/Modality | Total # Sections Offered | # of Sections Assessed | # Students Assessed
ME only | 1 | 1 | 13
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 13

*Dual enrollment

11. | 82.4 | 78% | 100 | 100
12. | 82.4 | 78% | 99.2 | 100
13. | 81.2 | 78% | 98.5 | 100
14. | 81.2 | 78% | 98.5 | 100
15. | 79.4 | 78% | 96.2 | 100
16. | 78.8 | 78% | 94.6 | 100
17. | 77.7 | 78% | 95.4 | 100
18. | 78.8 | 78% | 96.2 | 100
19. | 78.8 | 78% | 96.9 | 100
Total | 80.5 | 78% | 97.3 | 100

Current results improved: [ ] Yes [ X ] No [ ] Partially Strengths by Criteria/ Question Topic: Across all criteria, all students in the 2016 cohort and the majority of students in the 2017 cohort met or exceeded the target goal. Weaknesses by Criteria/ Question Topic: The primary weakness with these results is the fact that evaluations were not completed for 3 students in the 2017 cohort. That missing data affects the averaged calculations dramatically and is inconsistent with the 2016 cohort that had 100% compliance with submission. In addition, the following criteria had at least one student who received the minimum target score of 80 in the 2017 cohort: 10, 15, and 18. The following criteria had at least two students who received the minimum target score of 80 in the 2017 cohort: 8, 16, and 19. The following criteria had at least three students who received the minimum target score of 80 in the 2017 cohort: 17.

as there were several scores just at the minimum target criteria. Action Taken by DMS Faculty: First, the DMS Clinical Coordinator will be asked to develop a plan of action to collect missing end of semester evaluations for every student. This plan of action will be implemented in Fall 2019. Second, the marginal results in the Self-Confidence category are to be expected somewhat because this evaluation occurs during the students’ very first clinical rotation. They are still learning a lot and do not have mastery yet at this stage. Reiterating professionalism and reinforcing techniques to help students with their self-confidence will take place during the Fall 2019 DMS 206 course. Next Assessment of this SLO: 2018-19

Practice professional work habits and appropriate interpersonal relationships in a clinical setting when working with clinical staff, other health care providers, and/or physicians.

Clinical Education I DMS 231

Direct Measure: Midterm Evaluation – Clinical Instructor Evaluation of Student
Assessment Description: Evaluation Ranking Scores:
• Excellent = 100
• Above Average = 90
• Average = 80
• Below Average = 70
Relevant Evaluation Criteria:
1. Adhere to official rules and protocols
2. Demonstrate an eagerness to learn

Semester/year: Summer 2018 Target Overall Score: 80/100 in each relevant evaluation criteria

• Percent of Students Meeting Target Score: 99.2% • Percent of Students Below Target Score: 0% • Average Student Overall Score: 97.7 • Range of Scores = 70-100

Results by In-Class, ELI:

Results by Campus/Modality | Summer 2018 Average Score | Summer 2018 Percent > 80/100 | Summer 2017 Average Score | Summer 2017 Percent > 80/100
ME | 97.7 | 92.3% | 74.3 | 50%

Results by SLO Criteria:


Purpose of Evaluation: For clinical instructors to evaluate students’ professionalism in workplace employee interactions. Previous Action(s) to Improve SLO: This SLO was revised and accepted for implementation during the 2016-17 session. There are no prior actions available for comparison as this is the first time it is being assessed. Target Met: [ X ] Yes [ ] No [ ] Partially


3. Demonstrate appropriate amount of professionalism while on affiliate grounds
4. Adapt to new clinical situations (new equipment, protocols, staff, fluxes in patient demographics, etc.)
5. Follow instructions from assigned Clinical Instructor within a reasonable time frame and without reluctance
6. Consistently accepts constructive criticism from Clinical Instructors and supervising (interpreting) physician
7. Demonstrate respect for Clinical Instructor(s)
8. Interact appropriately with affiliate staff
9. Demonstrate an appropriate rate of skill acquisition (e.g. technical, interpersonal, etc.)
10. Is reliable with the performance of tasks as well as with communication
Sample Size (Specify N/A where not offered):

Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
ME only | 1 | 1 | 13
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 13

*Dual enrollment

Results by SLO Criteria/Question Topics | Summer 2018 Average Score | Summer 2018 % of Students > 80/100 | Summer 2017 Average Score | Summer 2017 % of Students > 80/100
1. | 97.7 | 100% | 75.0 | 50%
2. | 99.5 | 100% | 73.3 | 50%
3. | 99.0 | 100% | 75.0 | 50%
4. | 97.3 | 100% | 74.2 | 50%
5. | 97.9 | 100% | 75.0 | 50%
6. | 98.0 | 100% | 74.2 | 50%
7. | 99.2 | 100% | 74.2 | 50%
8. | 98.7 | 100% | 74.2 | 50%
9. | 94.7 | 92.3% | 73.3 | 50%
10. | 98.2 | 100% | 74.2 | 50%
Total | 97.7 | 99.2% | 74.3 | 50%

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criteria/Question Topic: An overwhelming majority of students met or exceeded the target score in the 2018 class. Weaknesses by Criteria/Question Topic: For the 2018 course, criterion 9 had one student who scored below the minimum target score and one student who scored exactly at the minimum target score. Criterion 5 had two students who scored exactly at the minimum target score. Criteria 1, 4, 6, and 10 each had one student who scored exactly at the minimum target score. For 2017, the evaluation results were saved in a limited format that cut off some student results. These results are no longer available through the clinical tracking system. Data for 7 evaluations (6 students) could not be considered for comparison purposes here.

Based on Recent Results, Areas Needing Improvement: The 2018 course results were a drastic improvement over the 2017 course mostly because a complete set of data was saved. For the 2018 course, there was one student who scored below the minimum target in the following criteria: Demonstrate an appropriate rate of skill acquisition. This is troublesome at this stage in the curriculum as all students are expected to improve their performance dramatically during this summer session. As this is a midterm evaluation, outcomes should be analyzed against the final evaluation to determine if this particular student was able to become more successful. In addition, there were several other criteria where some students only met the minimum target score. Criteria 1, 4, 5, 6, 9, and 10 need to be evaluated for possible ways to avoid these lower scores. Action Taken by DMS Faculty: The DMS Clinical Coordinator will continue to save all student evaluation data in a full spreadsheet format at the end of each semester. The new DMS faculty with clinical responsibility will be trained on the proper way to download student results. This will be implemented in Spring of 2019. The DMS faculty team will meet before the end of the Spring 2019 semester to review SLO results and develop targeted instruction to address criteria where students are meeting the bare minimum standard. Next Assessment of this SLO: 2018-19


Core Learning Outcome | Evaluation Methods | Assessment Results | Use of Results

Integrate patient history, current medical condition, and sonographic findings to provide accurate diagnostic information. [ X ] CT

Abdominal Sonography DMS 211

Direct Measure: Week 9 Discussion – Topic: Discuss an interesting Abdominal Case Study you observed or took part in during your clinical rotation. Be sure to include all required elements.
Assessment Description: Assignment Relevant Required Elements
1. Information about patient history
2. Patient presentation signs and symptoms
3. Sonographic description of the findings
4. Post explanation of actual diagnosis, treatment and prognosis
5. Research possible differentials for peer replies

Relevant Rubric Criteria
1. Written Organization
a. Exceptional – Writing is organized with a logical sequence that is easy to follow. All aspects of the topic are addressed in a clear and thorough manner that reflects a high degree of comprehension. (20)
b. Acceptable – Writing is presented with a basic level of organization that is able to be followed. Topic is addressed in a manner that reflects a basic level of comprehension. (15)
c. Not Acceptable – Writing lacks proper organization and is difficult to follow. Topic is not addressed or does not demonstrate comprehension. (0)
2. Peer Replies
a. Exceptional – All peer replies add insight and further the discussion. (20)
b. Acceptable – Some but not all of the peer replies add insight and/or further the discussion. (15)
c. Not Acceptable – Peer replies do not add insight or help to further the discussion. (0)

Sample Size (Specify N/A where not offered):

Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
ME only | 1 | 1 | 13
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 13
*Dual enrollment

Semester/year: Spring 2018

Target Overall Score: 15/20 in each relevant criterion
• Percent of Students Meeting Target Score = 84.6%
• Percent of Students Below Target Score = 15.4%
• Average Student Overall Score = 18.5/20 in Written Organization and 16.5/20 in Peer Replies
• Range of Scores = 0-40

Results by SLO Criteria:
Results by SLO Criteria/Question Topics | Spring 2018 Average Score | Spring 2018 % of Students > 15/20 | Spring 2017 Average Score | Spring 2017 % of Students > 15/20
1. Written Organization | 18.5 | 92.3 | 19.6 | 100
2. Peer Replies | 16.5 | 84.6 | 20 | 100
Total | 17.5 | 88.45 | 19.8 | 100

Current results improved: [ ] Yes [ X] No [ ] Partially Strengths by Criteria/Question Topic: The 2018 class overall did well in the Writing Organization category. This means students had original topic posts that were well organized and that met information criteria for relevant pathology, patient history and presentation as well as sonographic descriptions of the findings. Weaknesses by Criteria/Question Topic: The 2018 class did poorly with their peer replies. Two students did not submit any and one only submitted one peer reply rather than two. Thus, these students missed out on the opportunity to apply their knowledge of clinical presentations for pathology to determine possible differential diagnoses.

Purpose of Evaluation: To assess student ability to complete an ultrasound examination technical report incorporating all relevant patient history and ultrasound findings to provide accurate diagnostic information. Previous Action(s) to Improve SLO: This SLO was evaluated in 2016-17. During the Spring 2018 semester, the DMS 212 instructor included a more complete explanation of both high quality and poor quality technical reports in advance of the assignment in order to improve student attention to detail and use of sonographic descriptions. Target Met: [ ] Yes [ ] No [ X ] Partially Based on Recent Results, Areas Needing Improvement: Because 3 of the students in the 2018 class either did not complete or only partially completed the peer reply portion of the assignment, their critical thinking was limited for this evaluation. Though their initial posts were well constructed and demonstrated competence in integrating patient data for presentation purposes, analyzing peer scenarios for which they did not have the full background information was a crucial part of the critical thinking element of this assignment. Solutions need to be sought so that students cannot simply leave the assignment incomplete. Action Taken by DMS Faculty:



For the next administration of this course in Spring 2020, the DMS 211 faculty will be encouraged to develop examples of discussion replies and peer replies that provide sufficient analysis and detail to satisfy the rubric requirements. These examples will be posted in the online Canvas course and will be addressed during class meeting time. Next Assessment of this SLO: This SLO will be evaluated again in 2018-19

Program Goals | Evaluation Methods | Assessment Results | Use of Results

Upon completion, DMS graduates will meet industry standards for employment

1. DMS Graduation Totals (Dec 2018) – as reported in NOVA student records system; Job Placement Rate – as reported to the CAAHEP accrediting body in the JRC-DMS Annual Report
2. Credential Success Rate – as reported to the CAAHEP accrediting body in the JRC-DMS Annual Report

Number of Graduates Dec 2018: 12 Target Job Placement: 80% (CAAHEP requires 75%) Job Placement Results:

• 2013-2014 Cohort – 90% • 2014-2015 Cohort – 88% • 2015-2016 Cohort – 92% • 2016-2017 Cohort – 92% • 2017-2018 Cohort – 92%

Target Credential Success Rate (RDMS): 100% RDMS Credential Success Rates:

• 2014-2015 Cohort A – 100% • 2014-2015 Cohort B – 100% • 2015-2016 Cohort – 100% • 2016-2017 Cohort – 100% • 2017-2018 Cohort – 100%

Credential Exam Success Rate by Cohort:

DMS Cohort | ARDMS SPI | ARDMS Abdomen | ARDMS Ob/Gyn
2014-15 A | 100% | 100% | 100%
2014-15 B | 100% | 100% | 100%
2015-16 | 100% | 100% | 100%
2016-17 | 100% | 100% | 100%
2017-18 | 100% | 100% | 100%

Purpose of the Evaluation: To ensure DMS program graduates are ready for the sonography workforce and to maintain accreditation standards. Previous Action(s) to Improve Program Goal: 1. In Fall 2016 the DMS program established a direct line of contact with our clinical affiliates and other health care sites to collect job announcements. These postings are actively sought and shared with senior DMS students and graduates via email announcements. 2. Resume and Interview Skills Workshops were implemented in the DMS 299 Supervised Study course in Fall 2016 and continued as part of the DMS 222 Registry Review course in Fall 2017 to better prepare students for job searching and employment success. 3. A new instructor was assigned to teach the acoustical physics courses as of Fall 2016 to ensure continuity and continued success on the SPI exam. Needed Improvements:



All students met the target success rate for both the job placement results and the credentialing exams. The ideal would be to have 100% of the DMS students employed for each cohort. To facilitate this, the DMS Clinical Coordinator will routinely ask about open job opportunities when conducting site visits and will disseminate this information to DMS students in their last semester of the program and encourage them to apply for jobs. Action Taken by Faculty: 1. In Fall 2016 the DMS program established a direct line of contact with our clinical affiliates and other health care sites to collect job announcements. These are still being sought and shared with senior DMS students and graduates via email announcements. 2. Resume and Interview Skills Workshops were continued in Fall 2017 and Fall 2018 as part of the DMS 222 Registry Review course. The purpose of the workshops is to better prepare students for job searching and employment success. 3. The DMS Program Director has continued to identify national and local networking opportunities for students to expose them to sonography professional organizations. Students have attended the Virginia Society of Ultrasound Meeting in Spring of 2016, AIUM annual conference in Spring 2018, LET IFSER Educators Summit in Summer 2018, and volunteered for a registry review event in October 2017 and 2018. 4. In October, the DMS Program Director conducts a pre-graduation Fall Student Success Meeting with each individual senior student in the DMS program. The Program Director will gather information from


the students regarding their intent and plan for obtaining employment post-graduation. Future Assessment: Annually

Increase the program retention rates over the length of the entire program (18 months)

OIR Data – Number of Graduates by Program and Specialization: Fact Book 2010-2011 through 2015-2016
VCCS SIS: Student Information System
Student cohort rosters 2017-2018
Student cohort rosters 2016-2017
Student cohort rosters 2015-2016
Student cohort rosters 2014-2015**
Student cohort rosters 2013-2015*
Student cohort rosters 2012-2014
Student cohort rosters 2011-2013
Student cohort rosters 2010-2012
Student cohort rosters 2009-2011
Student cohort rosters 2008-2010
* Last cohort under the five semester curriculum

** First cohort under the four semester curriculum

Target Retention: 80% (as required by CAAHEP) Graduates -

• 2017-2018 Cohort – 12 • 2016-2017 Cohort – 12 • 2015-2016 Cohort – 13 • 2014-2015 - Cohort B - (Dec Grads) - 13 • 2013-2015 - Cohort A - (May Grads) - 8 • 2012-2014 Cohort - 10 • 2011-2013 Cohort - 10 • 2010-2012 Cohort - 11 • 2009-2011 Cohort - 7 • 2008-2010 Cohort – 7

Retention Rate
Year | Retained Students | Percentage
2017-2018 | 13 of 14 | 93%
2016-2017 | 12 of 14 | 86%
2015-2016 | 13 of 15 | 87%
2014-2015 Cohort B | 13 of 17 | 76%
2013-2015 Cohort A | 8 of 9 | 89%
2012-2014 | 10 of 11 | 91%
2011-2013 | 10 of 11 | 91%
2010-2012 | 11 of 13 | 85%
2009-2011 | 7 of 13 | 53%
2008-2010 | 7 of 9 | 78%
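Each retention percentage above is simply the number of retained students divided by the starting cohort size, with attrition as its complement. A minimal sketch of that arithmetic, using a few of the counts from the table (illustrative only):

```python
# Cohort counts (retained, enrolled) taken from the retention table above.
cohorts = {
    "2017-2018": (13, 14),
    "2016-2017": (12, 14),
    "2015-2016": (13, 15),
}

for year, (retained, enrolled) in cohorts.items():
    retention = retained / enrolled * 100
    attrition = 100 - retention
    print(f"{year}: retention {retention:.0f}%, attrition {attrition:.0f}%")
```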

Attrition Rate:
2017-2018 Cohort – In progress; so far 1 of 14 (7%). One student from the 2016 cohort returned during the 2017 cohort.
• Involuntary – N/A
• Voluntary – one student withdrew from the program for health reasons.
2016-2017 Cohort – 2 of 14 (14%)
• Involuntary – N/A
• Voluntary – one student withdrew from the program for pregnancy reasons but will return for 2017. One student withdrew from the program because she did not realize how much work it would be.
2015-2016 Cohort – 2 of 15 (13%)

Purpose of the Evaluation: To prioritize student retention and to define factors related to student attrition. Previous Action(s) to Improve Program Goal: 1. One student who withdrew from the 2016-2017 cohort for pregnancy reasons returned to the program as part of the 2017-2018 cohort and is progressing successfully thus far. 2. Twice in Fall, twice in Spring, and once in Summer the DMS faculty presents professional and program expectations to enhance prospective student awareness by offering in-person mandatory prospective student advising sessions. 3. Within a week of a student indicating that they wish to withdraw, the DMS Program Director meets with the student to discuss reasons, potential solutions to challenges, and options for continued success. Students who withdraw in good standing are given an opportunity for guaranteed readmission to the DMS program for the following Fall. The requirement is that these students must reapply for non-competitive admission during the May application period as an indication that they wish to return. Two email reminders are sent at one month and at two weeks prior to the start of the application period to remind students of their option to reapply during this time. Needed Improvements: The DMS program has met the targeted CAAHEP accreditation standards (requires at least 80%


Two students from the 2014 cohort returned during the 2015 cohort. • Involuntary – one student received an “F” in clinical DMS

196. • Voluntary – one student withdrew from the program for

personal reasons.

2014-2015 Cohort – 4 of 17 (23%) • Involuntary – one student was administratively removed

from the program for behavioral problems; three students received a “D” in physics 208.

• Voluntary - N/A

2013-2015 Cohort - 2 of 9; 1 returning (11%) • Involuntary - one student received a “D” in physics 208;

unsatisfactory academic performance. • Voluntary - one student withdrew due to medical disability

issues from a previous, long-standing injury and was physically unable to continue in the program.

2012- 2014 Cohort - 1 of 11 (9%) • Involuntary - N/A • Voluntary – one student withdrew due to pregnancy and

will return during the summer of 2014

2011-2013 Cohort - 1 of 11 (9%) • Involuntary - one student removed from the program for

academic dishonesty. • Voluntary - N/A

2010-2012 Cohort - 2 of 13 (15%) Two students from the 2009 cohort returned during the 2010 cohort. • Involuntary - one student received an “F” in DMS 231. • Voluntary – one student reported it was too much work and

not what she wanted to do for the next 2 years. This student withdrew two weeks past the census date.

2009-2011 Cohort - 6 of 13 (46%) • Involuntary – two students received a “D” in physics 208;

unsatisfactory academic performance. One student received a “D” in physics 209 and an “F” in DMS 211. Two students unable to work effectively with patients; unable to complete clinical coursework.

• Voluntary - one student left due to personal financial problems

2008-2010 Cohort - 2 of 9 (22%)

retention) for student retention during the 2017-18 cohort. More can be done in terms of disseminating information to help prepare prospective students to set realistic expectations of what the program will be like in terms of rigor, time commitments, and cost. The 2017-2018 cohort is the fourth class to graduate from the accelerated 18-month program. Although one student received an incomplete and is expected to officially graduate in Spring 2019, the rest graduated in December 2018. Thus, the program met its goal of 80% retention which is the accreditation standard. Action Taken by Faculty: 1. The DMS faculty continues to offer their presentation of professional and program expectations to enhance prospective student awareness. These sessions are offered twice each Fall and Spring and once during the Summer semester. 2. The DMS program faculty and administrative assistant continue to work diligently to update the DMS website to reflect the most current and accurate program information for prospective students. This effort was undertaken to help students understand the program progression and required time commitment. 3. To further help prevent student attrition and provide interested students with a realistic and comprehensive set of expectations, in Spring 2018 the DMS program developed a list of approximate program costs and a frequently asked questions link on the DMS website. This information was incorporated into the in-person pre-admission advising sessions.


• Involuntary – one student was removed from the program for academic dishonesty.
• Voluntary – one student left the program due to medical health issues.

4. To further aid student understanding of program entry and progression requirements, the DMS administrative staff developed a tri-fold brochure in Spring 2018 that contains accurate and concise information on admissions and program criteria. The brochure is a link on the DMS website and is distributed and discussed during the prospective student workshops. 5. By Fall 2019, the DMS Program Director will identify resources to better help students differentiate between the different types of ultrasound practice. Doing so will help prospective students choose between the available degree options (General Sonography, Vascular, or Echocardiography). Future Assessment: Annually


Annual Planning and Evaluation Report: 2017-2018

Drivers Education Career Studies Certificate NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Driver Education Career Studies Certificate program is designed for students who wish to become licensed teachers of driver education or maintain qualifications in the state of Virginia.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Students will be able to create a competent classroom lesson plan for Driver Education students under the age of 19.

Driver Task Analysis EDU 114 Direct Measure: Mandatory lesson plan assignment where students must create a lesson plan from a topic from the Drivers Education Curriculum and teach it to the class.

Criteria Assignment Rubric
1 He/she brought all the materials, submitted the lesson plan and components on Blackboard, and provided an extra hard copy for the Instructor.

2 He/she adhered to the time limit. 30 mins. MAX. Time: _____

3 The objective was stated and clear at the beginning and students were clear on what they would be learning.

4 The lesson plan followed the DOE curriculum and slides/content were from the DOE.

5 He/she made sure to check for understanding during the lesson.

6 ALL students were engaged in active learning (hands on and involved in the lesson).

7 Directions for the activities were specific and clear to students.

8 The activities were appropriate for the HS age level.

9 He/she understands content and could communicate clearly with the class.

10 The visuals were professional and used proper English.

11 He/she provided learning activities for students of all abilities (differentiated learning).

12 He/she included a SEPARATE formative assessment.

13 The closure reinforced the objectives of the lesson.

Semester/year data collected:
• Summer 2017 (8 students at Manassas, 8 students at Alexandria)
• Fall 2017 (16 students)
• Spring 2018 (10 students)

Target: All students must achieve a minimum of 80 out of 100 points. Our goal is for 85% of the students to pass this objective with a score of at least 80%. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality (% of students who passed with 80% or above)
Campus/Modality | Summer 2017/Fall 2017/Spring 2018 Average Score | Percent > target | 2016-2017 Average Score | 2016-2017 Percent > target
AL | 87.5 | 85 | N/A | N/A
MA | 86/95/97 | 85 | 86 | 80
Total | 85 | 86 | 80

Percent of Students who Passed each Section
Criteria | Summer 2017 Manassas | Summer 2017 Alexandria | Fall 2017 | Spring 2018
1 | 90% | 87.5% | 100% | 100%
2 | 80% | 100% | 83% | 81%
3 | 90% | 87.5% | 95% | 100%
4 | 80% | 87.5% | 94% | 100%
5 | 90% | 87.5% | 100% | 100%
6 | 80% | 87.5% | 100% | 90%
7 | 80% | 87.5% | 95% | 100%
8 | 100% | 87.5% | 95% | 100%
9 | 80% | 87.5% | 100% | 100%
10 | 100% | 100% | 88% | 100%
11 | 80% | 87.5% | 95% | 100%
12 | 100% | 100% | 94% | 94%
13 | 70% | 87.5% | 94% | 94%

Current results improved:

Previous action(s) to improve SLO: In Fall 2016 we started to implement a new way of having the students prepare for this lesson by breaking it up into smaller lessons throughout the course leading up to this large lesson. Last year we saw a large increase in the percentage of students who passed this SLO using this action, and we have therefore kept the approach of mini lessons leading up to it. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Based on these recent results, we need to stress the very short amount of time students have to give the lesson. Displaying a large timer at the back of the class may help them stay aware of how much time has passed. It might also be a good idea for our instructors to model what we mean by an engaging lesson that is specific and clear and is tailored toward students with various abilities. The instructor could also demonstrate what an unclear, vague lesson looks like and see whether the students in the class are able to follow it or become confused, and then use this as a teaching


Sample (Specify N/A where not offered)
Campus/Modality | Total # Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 1 | 8
MA | 3 | 3 | 34
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 4 | 4 | 42

*Dual-enrollment

[ X ] Yes [ ] No [ ] Partially Strengths by Criterion/Question/Topic: Based on the chart above, it appears that bringing the materials, checking for understanding, keeping activities appropriate for the HS level, understanding the content, using professional visuals, and including a separate formative assessment were all strengths of the students who took this course. Weaknesses by Criterion/Question/Topic: The weaknesses were adhering to the 30-minute limit, the objective not being clearly posted, the lesson plan not following the DOE objectives, not all students being engaged in active learning, directions not being specific and clear, not providing learning activities for students of all abilities, and the closure not reinforcing the objectives.

point as they prepare for their own lesson. Current actions to improve SLO based on the results: Display a clock/timer during the lesson so the students know how much time has passed, and model good/bad lessons so the students understand that they need to be clear and prepared for all learning abilities. These new changes will be implemented in Spring 2019. Next assessment of this SLO: Fall 2018

Students will demonstrate proficiency in their own driving skills as demonstrated on the range.

Instructional Principles of Drivers Education EDU 214 Direct Measure: At the range, students will be assessed on their driving skills by certified Driver Education Instructors. Students must demonstrate the skills below and be able to perform all skills with 100% accuracy.

1 Push Pull Slide Steering
2 BGE Mirror Setting
3 Reference Points
4 Parallel Parking
5 Perpendicular Parking
6 Angle Parking
7 Two Point Turns
8 3 Point Turns
9 Backing
10 Backing and Turning
11 Off Road Recovery

Sample (Specify N/A where not offered)
Campus/Modality | Total # Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 1 | 10
MA | 3 | 3 | 32
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 4 | 4 | 42

*Dual-enrollment

Semester/year data collected:
• Summer 2017 (10 students at Manassas and 7 students at Alexandria)
• Fall 2017 (16 students)
• Spring 2018 (9 students)

Target: All students must achieve a 100% passing rate on this assignment. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality (% of students who passed with 100% or above)
Campus/Modality | Summer 2017/Fall 2017/Spring 2018 Average Score | Percent > target | 2016-2017 Average Score | 2016-2017 Percent > target
AL | 100 | 100 | N/A | N/A
MA | 100/100/100 | 100/100/100 | 96 | 100
Total | 100 | 100 | 96 | 100

Results by SLO Criteria - Driving Skills Breakdown
Criteria | Summer 2017 (Section 1) | Summer 2017 (Section 2) | Fall 2017 | Spring 2018
1 | 7 passed 1st attempt | 10 passed 1st attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 9 passed 1st attempt
2 | 7 passed 1st attempt | 10 passed 1st attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 9 passed 1st attempt
3 | 7 passed 1st attempt | 10 passed 1st attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 9 passed 1st attempt
4 | 7 passed 1st attempt | 9 passed 1st attempt; 1 passed 3rd attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 8 passed 1st attempt; 1 passed 2nd attempt
5 | 7 passed 1st attempt | 10 passed 1st attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 9 passed 1st attempt
6 | 7 passed 1st attempt | 10 passed 1st attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 9 passed 1st attempt
7 | 7 passed 1st attempt | 9 passed 1st attempt; 1 passed 3rd attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 9 passed 1st attempt
8 | 7 passed 1st attempt | 9 passed 1st attempt; 1 passed 3rd attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 9 passed 1st attempt
9 | 7 passed 1st attempt | 10 passed 1st attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 9 passed 1st attempt
10 | 7 passed 1st attempt | 10 passed 1st attempt | 15 passed 1st attempt; 1 passed 2nd attempt | 8 passed 1st attempt; 1 passed 2nd attempt
11 | 7 passed 1st attempt | 10 passed 1st attempt | 15 passed 1st attempt; 1 passed 2nd attempt | NC on range

Previous action(s) to improve SLO: Previously we implemented a strategy of allowing the students more time on the range to practice their skills before testing. This appeared to help students who really needed that extra practice time and for those who still struggled, the instructors started allowing for a retake but recorded how many times it took the students to retake the specific skill. If a retake was needed the instructor would work one on one with the student to address what they did wrong and how to correct it. These changes were implemented in Fall 2017 and are still in place today. Target Met: [X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Based on these new results, even though there was a 100% pass rate there were a few instances where it appears some students are still struggling. In the area of parallel parking and two and three point turns, it might be a good idea for



the instructor to demonstrate these before the range day so that the students can practice on their own at home. Current actions to improve SLO based on the results: The instructor will demonstrate proper parallel parking, two and three point turns before the range day. This will allow the student time to go home on their own to practice the skills taught so that they do not need the retakes on the range day. These changes will be implemented in Spring 2019. Next assessment of this SLO: Fall 2018

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criterion/Question/Topic: It appears that our students are really excelling in their driving skills. The areas of strength are push, pull slide, BGE mirror setting, reference points, perpendicular parking, angle parking, backing, backing and turning, and off road recovery.


Weaknesses by Criterion/Question/Topic: Areas of weakness that need to be addressed, and that students need to practice more before getting to the range, are parallel parking, two point turns, and three point turns.

Core Learning Outcome | Evaluation Methods | Assessment Results | Use of Results

CLO: Students will be able to design a proper behind the wheel driving route. [ X ] CT

Instructional Principles of Drivers Education EDU 214 Direct Measure: All students will design a proper behind the wheel driving route.

1. Diagram
2. School/area you are working at
3. Step by Step Directions
4. Level of Risk Addressed
5. Appropriate
6. Skills being taught

Sample (Specify N/A where not offered)
Campus/Modality | Total # Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 1 | 10
MA | 3 | 3 | 32
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 4 | 4 | 42

*Dual-enrollment

Semester/year data collected: • Summer 2017 • Fall 2017 • Spring 2018

Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality (% of students who passed with 100% or above)
Campus/Modality | Summer 2017/Fall 2017/Spring 2018 Average Score | Percent > target | 2016-2017 Average Score | 2016-2017 Percent > target
AL | 100 | 100 | N/A | N/A
MA | 100/100/100 | 100/100/100 | N/A | N/A

Results by CLO Criteria:
Results by CLO Criteria/Question Topics | Current Assessment Results [Semester/year] Average Score | % of Students > target
1 | 100% | 100%
2 | 100% | 100%
3 | 100% | 100%
4 | 100% | 10%
5 | 100% | 100%
6 | 100% | 100%
Total | 100% | 100%

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criterion/Question/Topic: This assignment has always been one that the students have completed, and it is a mandatory part of becoming a driver’s education instructor. The instructor models this lesson in class and shows the students the step-by-step process of creating this driving route, so all areas of this assignment are a strength. Weaknesses by Criterion/Question/Topic: The only weakness may be the limitations of the students in reference to the area surrounding where they plan to work. It may be difficult to create a

Previous action(s) to improve CLO if applicable: We have never assessed this before. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: It does not appear that there is any need to improve upon this assignment. Current actions to improve CLO based on the results: None Next assessment of this CLO: Fall 2018


proper diagram in this case if the person does not know where they plan to work or they are unsure of the area and roads.

Program Goals | Evaluation Methods | Assessment Results | Use of Results

Improve program graduation totals

NOVA OIR Number of Graduates by Program and Specialization reports.

Target: At least 50% of those who enroll in the Driver Education Certificate courses will graduate. Number of graduates of the Certificate program:

Year | # of Graduates
2017-18 | 10
2016-17 | 0
2015-16 | 1
2014-15 | 1
2013-14 | 2
2012-13 | 3
2011-12 | 1
2010-11 | 3

Target Met: [ ] Yes [ X ] No [ ] Partially Comparison to previous assessment(s): Compared to the previous years we have seen significant improvement. We are now incorporating the application for graduation into our EDU 114 final exam process and that is why we are seeing these numbers rise.

Previous action(s) to improve program goal: Starting in Spring 2016, we required students to submit official transcripts if they had already completed the ENG 111 requirement, which is the third course in this certificate. By requiring the official transcript, we could mark that class as completed and students could apply for graduation. However, we are still working on getting those transcripts processed and entered into the system, so the rates have not increased as much as we would like. In addition, many students are still having trouble actually applying to graduate because we found out last summer that they needed to be placed into the Driver’s Education Certificate program before they could apply. Most recent results: The most recent results show that we are not yet at 50%; however, we have improved in the past year. We will continue to require the official transcript and have the students apply for graduation during the EDU 114 final exam. Results improved: [ X ] Yes [ ] No [ ] Partially Current action(s) to improve program goal: Require official transcripts to prove completion of that third course, ENG 111, and have students apply for graduation during EDU 114. These actions were implemented in Spring 2017. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Early Childhood Development, A.A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed for persons who seek employment involving the care and education of young children, or for those persons presently employed in these situations who wish to update and enhance their competencies. Occupational opportunities include, program leaders, supervisors, and/or directors in child development programs.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Using observation techniques to assess development and effective practices with children, families and programs Accreditation Alignments: Aligns with NAEYC Standard 3: Observing, documenting, and assessing to support young children and families. NAEYC key elements: 3a: Understanding the goals, benefits, and uses of assessment 3b: Knowing about and using observation, documentation, and other appropriate assessment tools and approaches 3c: Understanding and practicing responsible assessment to promote positive outcomes for each child.

Observation and Participation CHD 165 Direct Measure: Physical Development Checklist with common grading rubric. Students apply a checklist to observations of fine and gross motor development. They describe developmental levels from checklist indicators and suggest strategies for support of that development. Provided Rubric Criteria or Question Topics: Directions for assignment and grading rubric are attached. Other Method (if used): Qualitative data and assessment collected as part of the rubric summary data completed by instructors. Used in analysis/comments. Sample Size (Specify N/A where not offered):

Campus/Modality | Total # Sections Offered | # Sections Assessed | # Students Assessed
AL | 2 | 2 | 21
MA | 1 | 1 | 8
LO | 1 | 1 | 10
ELI | 1 | 1 | 24
DE* | N/A | N/A | N/A
Total | 5 | 5 | 63

*Dual-enrollment

Semester/year data collected: Fall 2017. Target: 80% of students will score 80% or higher on each criterion as well as on the overall score. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality | Fall 2017 Average Score | Percent > target
AL | 63% | (17)
MA | 87% | 7
LO | 85% | 5
ELI | 88% | 8
DE | N/A | N/A
Total | 80 | 3
Note: Data not separated by campus in previous assessment. Results by SLO Criteria:

Results by SLO Criteria/Question Topics | Current Assessment Results (Fall 2017) Average Score | % of Students > target
1. KE-3a | 80% | 54%
2. KE-3b | 89% | 71%
3. KE-3c | 81% | 80%

Note: Data not separated by criteria in previous assessments. Current results improved: [ ] Yes [ ] No [ X ] Partially NOTE: Last assessment showed us at 97% and meeting the target. We meet the target overall for this reporting cycle but with a lower percentage. The previous percentages were not focused on the individual criteria and considered elements not

Previous action(s) to improve SLO: This SLO was last assessed in Spring of 2017. Some changes were made to this SLO Assessment in anticipation of the NAEYC Program Accreditation visit in Fall of 2017. Previously we collected data with the categories “exceeds, meets or didn’t meet.” No range of scores were collected and the percentage indicators that these categories represented were not consistent between instructors. This report collects data on point range and total points that more accurately reflects the scores of students. During the NAEYC Accreditation visit, data collection was a major topic and focus. Our SLOs and NAEYC Standards are in alignment but we were not collecting data on each NAEYC Key Element (criteria) separately. We were collecting data that represented more than one criterion and on examination sometimes more than that one SLO. We needed to rethink our assessment and SLOs. We started this during Spring 2018 while awaiting the Accreditation results. We determined as a faculty that our assessments are good but the pieces of the assignments needed to be more clearly identified by SLO and NAEYC Key Assessments. The data for this SLO was collected in Fall 2017. It was revised to collect more detailed data by range but not by individual criteria.


part of the SLO. This makes the results incompatible with prior years. We have struggled with consistency in data reporting and collection. Some previous data was based on overall assignment results rather than on individual criteria. This period, we also had some data reported in an older format that may have skewed some results, particularly at the Alexandria campus. We have few sections to compare, and problems in one or two can result in major shifts in results. Overall, we are consistent in the comments and summary results we are getting for this SLO. The criteria section of the rubric still needs to be refined to clarify the SLO and Key Element (Criteria) for the three major parts. Strengths by Criterion/Question/Topic: Students are able to complete the checklist by observing and recording physical development (Part 1 of assignment and KE 3b). Weaknesses by Criterion/Question/Topic: Although they are able to complete the checklist, they struggle with being able to then describe what the checklist tells them about development and where the child is developmentally relative to expectations (Part 2 of assignment and KE 3a). This is shown in both the score for this section and the comments from instructors. Students do provide instructional strategies based on analysis of the checklist (Part 3 of assignment and KE 3c) and overall meet the target, but instructors indicate that the depth of the selected strategies is lacking. Often the strategy is focused on practicing the indicator activity from the checklist and not the developmental skill.

Target Met: [ X] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: As indicated above, the rubric and data collection process are undergoing continuous review which will help identify issues and lead to refinement of data collection and analysis. The ECD instructors indicate that while the assignment is good, students need help in narrative descriptions and matching strategies to developmental indicators. More focus needs to be given to strategies. We need a clarification on what is preventing a concise and clear narrative description of the data gathered from the checklist. The students may need more review of development than provided in the course or stronger writing skills; both have been suggested in the instructor comments in the summary rubric. Current actions to improve SLO based on the results: At the September 2018 Discipline Meeting, the faculty decided that a more focused approach needed to be taken with instructors. Rather than full adjunct meetings to discuss SLOs or email reminders to use the current rubric to collect data, we would target with one on one and small group discussions with instructors specifically assigned to teach courses with SLO specific assignments. These will start in Fall 2018 and continue into the Spring 2019 with future follow-up to be determined. The goal is more consistency in data collection and emphasis in the assignment. Next assessment of this SLO: Spring 2019

Applying developmental theories and early childhood program model components, select educational strategies appropriate for the learning environment.

Language Arts for Young Children CHD 118

Semester/year data collected: Fall 2017

Previous action(s) to improve SLO: Last assessment: Fall 2016. Previously we used an Environmental


Accreditation Alignments: Aligns with NAEYC Standard 4: Using Developmentally Effective Approaches to Connect with Children and Families. NAEYC key elements: 4a: Understanding positive relationships and supportive interactions as the foundation of their work with young children. 4b: Knowing and understanding effective strategies and tools for early education, including appropriate use of technology. 4c: Using a broad repertoire of developmentally appropriate teaching and learning approaches. 4d: Reflecting on their own practice to promote positive outcomes for each child.

Direct Measure: CHD 118 course assignment: Lesson Plan with Props. Students create a lesson plan with appropriate props that incorporates content knowledge and effective educational strategies. Focus for this SLO are the strategies used to develop and deliver the activity plan. Provided Rubric Criteria or Question Topics: Directions for assignment and grading rubric are attached Other Method (if used): Qualitative data and assessment collected as part of the rubric summary data completed by instructors. Used in analysis/comments. Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL                3                          2                     32
MA                1                          1                     19
LO                1                          1                     7
ELI               2                          2                     27
DE*               N/A                        N/A                   N/A
Total             7                          6                     85

*Dual-enrollment

Target: 80% of students will score 80% or higher overall on each criterion as well as the overall score Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality (Fall 2017)
Campus/Modality   Average Score   Percent > Target
AL                80              0
MA                94              14
LO                81              1
ELI               75              (5)
DE                N/A             N/A
Total             83              3

Note: The previous report was not broken down by campus. Results by SLO Criteria:

Criteria/Question Topics   Fall 2017 Average Score   % of Students > Target
1. KE-4a                   76                        56
2. KE-4b                   80                        78
3. KE-4c                   76                        57
4. KE-4d                   86                        80

Current results improved: [ ] Yes [ ] No [X] Partially. Overall the target of 80% was reached, but individual criteria showed weaknesses. Strengths by Criterion/Question/Topic: Strengths were in reflection on practice (4d) and in demonstrating effective strategies (4b). These areas are emphasized throughout the program and within the course. Weaknesses by Criterion/Question/Topic: Students displayed less strength in using a broad repertoire of approaches (4c) and in understanding positive relationships and interactions (4a). In comments, instructors suggested that because this course comes early in the program (first year), the students may not have the depth of experience to respond well in these areas.

checklist from CHD 166 Infant and Toddler Programs to collect data for this assignment. During the NAEYC Accreditation visit, it was determined that this assignment better measured the criteria for NAEYC Standards 4 and 5, which align with NOVA SLOs 4 and 5; the previous assignment better measures SLO 1. We collect data each semester on all SLOs/NAEYC Standards for our accreditation reports; as a result, we have a number of assessments collecting data on more than one SLO. Some changes were made to this SLO assignment in anticipation of the NAEYC Program Accreditation visit in Fall 2017. Previously we collected data in the categories "exceeds," "meets," or "did not meet"; no range of scores was collected, and the percentage indicators that these categories represented were not consistent across instructors. This report collects data on point ranges and total points, which more accurately reflect student scores. During the NAEYC Accreditation visit, data collection was a major topic and focus. We had made changes in Fall 2017 to better align rubrics with the NAEYC criteria, and this was the first assignment we reviewed. We continued to refine this assignment rubric after the accreditation visit for Spring 2018. The previously used assignment from CHD 166 may continue to be used to collect SLO data but is in the process of being re-evaluated. The data for this SLO were collected in Fall 2017. The assessment was revised to collect more detailed data by range. It also provided more


detail about the criteria and broke out individual elements. Target Met: [ ] Yes [ ] No [X] Partially. The overall target was met, but not all individual criteria met it. Based on recent results, areas needing improvement: The faculty need to ensure that everyone is using the correct and most current rubric. During a transition, some faculty have relied on rubrics from previous course sections and have not checked for updates. The assignments have not changed, but we have better matched the SLO and criteria to specific sections, and if an old rubric is used the data do not always reflect the appropriate criteria. We also need to look at the level of expectations for interactions in this particular assignment. Current actions to improve SLO based on the results: Group and individual meetings with instructors teaching this course will be held in Spring 2019 to ensure that rubrics are being used consistently and the criteria are understood. Next assessment of this SLO: Fall 2019

Create curriculum that integrates content and developmental knowledge Accreditation Alignments: Aligns with NAEYC Standard 5: Using content knowledge to build meaningful curriculum NAEYC key elements:

Language Arts for Young Children CHD 118 Direct Measure: Assignment: Lesson Plan with Props. Students create a lesson plan with appropriate props that incorporates content knowledge and effective educational strategies. Focus for this SLO is the content knowledge needed for the curriculum development. Provided Rubric Criteria or Question Topics: Directions for assignment and grading rubric are attached

Semester/year data collected: Spring 2018 Target: 80% of students will score 80% or higher overall on each criterion as well as the overall score. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality (Spring 2018)
Campus/Modality   Average Score   Percent > Target
AL                N/A             N/A
MA                96              16
LO                87              7
ELI               76              (4)
DE                N/A             N/A
Total             89              9

Previous action(s) to improve SLO: Previous assessment of this SLO: Fall 2016. Some changes were made to this SLO in anticipation of the NAEYC Program Accreditation visit in Fall 2017. Previously we collected data in the categories "exceeds," "meets," or "did not meet." No range of scores was collected, and the percentage indicators that these categories represented were not consistent across instructors.


5a: Understanding content knowledge and resources in academic disciplines. 5b: Knowing and using the central concepts, inquiry tools, and structure of content areas or academic disciplines. 5c: Using own knowledge, appropriate early learning standards, and other resources to design, implement, and evaluate developmentally meaningful and challenging curricula for each child.

Other Method (if used): Qualitative data and assessment collected as part of the rubric summary data completed by instructors. Used in analysis/comments. Sample (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL                1                          0                     0
MA                2                          2                     12
LO                1                          1                     14
ELI               1                          1                     17
DE*               N/A                        N/A                   N/A
Total             5                          4                     43

*Dual-enrollment


Results by SLO Criteria:

Criteria/Question Topics   Spring 2018 Average Score   % of Students > Target
1. KE-5a                   83                          67%
2. KE-5b                   91                          81%
3. KE-5c                   93                          83%

Current results improved: [X] Yes [ ] No [X] Partially. Previous results represented only an overall score and not a report by criteria. We have exceeded the target but have gone down compared with the previous report. This report represents a deeper analysis of the criteria. Strengths by Criterion/Question/Topic: Because we do not have prior-year assessment results by criterion, a clear picture of strengths is not available. Criteria 5b and 5c meet the goal of 80%. Weaknesses by Criterion/Question/Topic: Criterion 5a falls below the goal of 80% and is of some concern because this criterion speaks specifically to content, which is the overall theme of this Standard.

This report collects data on point ranges and total points, which more accurately reflect student scores. During the NAEYC Accreditation visit, data collection was a major topic and focus. Our SLOs and the NAEYC Standards are in alignment, but we were not collecting data on each NAEYC Key Element (criterion) separately; we were collecting data that represented more than one criterion and, on examination, sometimes more than one SLO. We needed to rethink our assessments and SLOs, and we started this during Spring 2018 while awaiting the accreditation results. We determined as a faculty that our assessments are good, but the pieces of the assignments needed to be more clearly identified by SLO and NAEYC Key Assessment. The data for this SLO were collected in Spring 2017. The assessment was revised to collect more detailed data by point range. We also separated out the criteria to distinguish SLO 5 from SLO 4, which previously had overlapped. This process continued, and the rubric was revised slightly for the spring to further realign the criteria. Target Met: [X] Yes [ ] No [ ] Partially. Based on recent results, areas needing improvement: This SLO has been measured extensively using one particular assignment. It covers curriculum content, and we have been using only one curriculum area. It has consistently exceeded the target. The faculty will discuss the possibility of using other curriculum/lesson plan/content assignments to measure SLO 5 at the Spring 2019 discipline meeting. Current actions to improve SLO based on the results: Faculty


discussion in Spring 2019 about using a different content area assignment to assess SLO 5. Next assessment of this SLO: Fall 2020

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

Describe current threats and explain how to continuously monitor the threats that may be present in the cyber realm (1, 2, 5, 6) [ X] CT

Advanced Observation and Participation in Early Childhood/Primary Settings CHD 265 Direct Measure: Assignment Project Reflection - Students provide self-analysis and reflection on the Program Capstone Project. Provided Rubric Criteria or Question Topics: Directions for assignment and grading rubric are attached. Other Method (if used): Qualitative data and assessment collected as part of the rubric summary data completed by instructors. Used in analysis/comments. Sample (Specify N/A where not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL                2                          2                     30
MA                1                          1                     13
LO                1                          1                     9
ELI               N/A                        N/A                   N/A
DE                N/A                        N/A                   N/A
Total             4                          4                     52

*Dual-enrollment

Semester/year data collected: Spring 2018 Target: 80% of students will score 80% or higher overall on each criterion as well as the overall score. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality (Spring 2018)
Campus/Modality   Average Score   Percent > Target
AL                78              (2)
MA                67              (13)
LO                86              6

Results by CLO Criteria:

Criteria/Question Topics   Spring 2018 Average Score   % of Students > Target
1.                         80%                         66%
2.                         74%                         61%
3.                         60%                         50%
4.                         79%                         58%

Current results improved: NA [ ] Yes [ ] No [ ] Partially Strengths by Criterion/ Question/Topic: Students were able to identify what was successful and usually equated successful with personal growth. Weaknesses by Criterion/ Question/Topic: By score and by instructor comments, students struggled to analyze their own challenges with the project and comment on how they would improve the project, which also indicated lack of analysis on what was needed to improve. Self-reflection and critical introspection was not evident.

Previous action(s) to improve CLO if applicable: N/A. Target Met: [ ] Yes [ ] No [ ] Partially. Based on recent results, areas needing improvement: Self-reflection on how to improve seemed to be a hard concept for some students. It is unclear whether the issue was the identification of challenges or an inability to be self-critical in order to obtain positive results. There is some concern that basic expressive writing skills may be the cause of some inability to capture critical thinking. Current actions to improve CLO based on the results: The faculty have decided to retain this assignment across the campuses and review the data to see if improvements can be made. Next assessment of this CLO: We will collect data from this course and assignment as part of the accreditation requirements but plan on assessing other CLO requirements in the near future.


Program Goals   Evaluation Methods   Assessment Results   Use of Results

Increase degree graduation totals

Graduation totals by discipline are supplied by the college OIR for use by disciplines and programs in preparing the APER. The data are provided by OIR at: http://www.nvcc.edu/college-planning/data.html Data for 2017-18 and prior years provided by OIR were used for reporting and comparison here.

Target: 5% increase in the number of graduates over the previous year. Results for Past 5 years:

Academic Year   Number of Students   Percentage Change
2017-18         57                   (17%)
2016-17         69                   30%
2015-16         53                   7%
2014-15         57                   7%
2013-14         53                   --
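For readers checking the figures, the percentage column follows the standard year-over-year change calculation; a worked example for the 2017-18 row, using only the values in the table above, is

$$\text{percentage change} = \frac{57 - 69}{69} \times 100 \approx -17\%,$$

which appears in the table as (17%), i.e., a 17% decrease.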

Target Met: [ ] Yes [X] No [ ] Partially. Comparison to previous assessment(s): Last year the ECD program had a 30% increase in the number of graduates over the previous year and the largest number of graduates in the last five years. Although we are down 17% this year, our overall numbers are within the normal range of fluctuation for the program. We are a stacked degree and encourage students to complete the Career Studies Certificate and the Certificate on the way to the AAS degree. As shown below, we have an increase in the number of Certificate and Career Studies Certificate graduates over the previous year, indicating a potential increase in degree graduates over the next two years.

Year       ECD Certificate   Career Studies Certificate (Infant and Toddler Certificates)
2016-17    39                79 (+8 certificates)
2017-18    45                84 (+11 certificates)
% change   15%               9%

Previous action(s) to improve program goal: Faculty actively advised students about efficient pathways through the program. We have a stacked degree and regularly circulate information about pathways through the program. There has been a general decline in enrollment at the college, which has impacted programs such as ECD. Advising takes place throughout the year; faculty at each campus visit classes each semester as part of advising week to encourage enrollment for the next semester and to inform students about efficient and economical pathways through the program. Most recent results: This reporting period showed a significant drop in AAS graduates but an increase in both ECD Certificates and Career Studies Certificates. As we have a stacked degree, those numbers can be seen as students progressing through the program toward the AAS degree. Results improved: [ ] Yes [X] No [ ] Partially. While the results did not improve, graduation rates do fluctuate, and last year represented a five-year high with a 30% increase, well above the annual goal of 5%. Current action(s) to improve program goal: The program has changed beginning in Fall 2018. It has been aligned at the state level to encourage transfer to four-year institutions. This alignment was a joint project of the VCCS and the four-year colleges in Virginia. The transfer pathway will allow ECD students to seek teacher licensure at a four-year college using their ECD degree as the base.


This is anticipated to increase the number of ECD graduates who wish to use this as a transfer degree. Assessed: Annually

Increase the number of program placed students

Program placement totals by degree and certificate are supplied by the college OIR for use by disciplines and programs in preparing the APER. The data are provided by OIR at: http://www.nvcc.edu/college-planning/data.html Data for 2017-18 and prior years provided by OIR were used for reporting and comparison here.

Target: 5% increase in the number of program-placed students over the previous year. Results for Past 5 Years:

Academic Year   Number of Program-Placed Students   Percentage Change
2017-18         450                                 (2%)
2016-17         459                                 (11%)
2015-16         514                                 (13%)
2014-15         594                                 (4%)
2013-14         621                                 --

Target Met: [ ] Yes [ x] No [ ] Partially Comparison to previous assessment(s): While placement continues in a downward direction there was less of a decrease in program placement than in the past three years. The decrease in placement is consistent with the overall drop in enrollment and graduation rates. It is also a function of the advising by ECD faculty which encourages placement in the ECD Career Studies Certificate.

Previous action to improve program goal: The full-time faculty have actively reached out to students on rosters who are not program placed, or who are placed in a program other than ECD but enrolled in ECD classes, to advise them on placement. Adjuncts have supported this effort by referring students who are not placed to the program heads. This was done when rosters were pulled at the beginning of classes in the Fall 2017 and Spring 2018 semesters. Most recent results: While there is not an increase in placements, there is a narrowing of the percentage decrease. Some of this is attributable to lower enrollments and some to the program's encouragement to first be placed in the Career Studies Certificate. Results improved: [ ] Yes [ ] No [X] Partially. Current actions to improve program goal: Because of the change in the program starting in Fall 2018, the faculty have made a deliberate effort at outreach to students in all classes to inform them of the placement choices they have and how selection of a catalog year and a program will impact the courses they take. Next assessment: Annually


Annual Planning and Evaluation Report: 2017-2018 Emergency Medical Services, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed to develop the competencies needed to prepare the student to take and successfully pass the Virginia certification exams for Emergency Medical Technician-Basic (EMT-B), Emergency Medical Technician-Intermediate (EMT-I), and/or Paramedic. EMT-Basic certification is foundational to all other EMS certifications. This means that all EMS providers must successfully complete EMT-Basic certification in order to continue on to any other level of certification. While the EMT-Intermediate and Paramedic curricula introduce “advanced” competencies to the students, they are—in essence—a more in-depth continuation of the competencies introduced and mastered in the EMT-Basic curriculum. Competencies at each level of certification are demonstrated via State and/or National examinations that include both cognitive (“written”) and psychomotor (“practical”) components

Student Learning Outcomes   Evaluation Methods   Assessment Results   Use of Results

When presented with four unique written clinical scenarios (each including a 6-second static ECG cardiac rhythm strip) the student will: correctly interpret the ECG cardiac rhythm presented, accurately determine the patient’s acute condition classification as stable or unstable, deploy a clinically appropriate treatment algorithm, appropriately utilize electrical therapy (i.e. defibrillation/ cardioversion), call for the administration

Paramedic Review EMS 216-001H Direct Measure: Direct evidence gained via student testing by program faculty. The assessment was conducted as a test during the aforementioned class. The tool utilized for scoring/documentation was partially derived from the National Registry of Emergency Medical Technician’s “Static Cardiology Psychomotor Examination” form. This form was altered to allow for the itemization of the 5 specific key regions that we wished to assess. Question Topics: Students were each given a total of four unique scenarios. Faculty first read the scenario to the student and then the sheet (each containing one written clinical scenario and one cardiac EKG tracing “rhythm strip”) was handed to the student for their evaluation. The students were then required to verbally relay both appropriate EKG rhythm interpretation “diagnosis” (area 1 below) and proper clinical management “treatment” (areas 2-5 below). The student was given a total time limit (for all 4 scenarios) of 6 minutes. As faculty evaluators were not allowed to alter nor “add to” the scenarios in any way the condition of the patient (in each scenario) could not change (remained static). The student could opt to “skip” and later return to a particular scenario but the 6-minute time limit remained in effect. Students were awarded points for each of the scenarios as follows:

Data collected: Spring 2018 Target: > 80% of all students assessed will score > 80% in each of the noted topic areas. 13 students were evaluated and each presented with 4 unique written scenarios. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality
                  Current (Spring 2018)               Previous (Spring 2014)
Campus/Modality   Average Score   % > 80%             Average Score   % > Target
ME                94.6%           84.6%               *               87.4%

*Data unavailable

Results by SLO Criteria
                           Spring 2018                 Spring 2014
Criteria/Question Topics   Average Score   % > 80%     Average Score   % > 80%
1                          92.3%           69.2%       *               86%
2                          94.2%           76.9%       *               98.3%
3                          98.1%           92.3%       *               69.1%
4                          88.5%           84.6%       *               94.4%
5                          100%            100%        *               89.3%
Total                      94.6%           84.6%       *               87.4%

*Data unavailable Current results improved: [ ] Yes [x ] No [ ] Partially Strengths by Criterion/ Question/Topic: Target of > 80% was met in topic areas:

Previous action(s) to improve SLO: A new textbook (thought to be better suited for EMS clinicians) was selected for teaching EKG interpretation within our EMS 153 (Basic EKG Recognition) class and was deployed during the Fall 2014 semester. Greater time reviewing current AHA ACLS algorithms was also instituted within EMS 153, as well as in EMS 207 (Advanced Patient Assessment), during this same term.
Target Met: [ ] Yes [ ] No [X] Partially
Based on recent results, areas needing improvement: Compared with our results from the last assessment, we showed a noteworthy increase within topic areas:
• 3 (appropriate treatment algorithm selection): 92.3% vs. 69.1%
• 5 (appropriate medication and dosage): 100% vs. 89.3%
These increases are likely secondary to the increased time spent within EMS 153 and EMS 207 reviewing current AHA ACLS treatment pathways, which was instituted in response to the initial assessment results of 2014. When compared to the last assessment conducted, we showed a decline in topic areas:
• 1 (cardiac rhythm interpretation): 69.2% vs. 86%


of clinically appropriate medication(s) with correct dosage(s).

• 1 point (maximum) for each correct EKG rhythm identification (see area 1 below) = 4 total points possible

• 0.5 points (maximum) for each of the remaining categories (see areas 2-5 below) = 8 total points possible
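As a brief worked total of the maximum obtainable points under the scoring scheme listed above (an illustrative check based only on the stated point values; the percentage conversion shown is an assumption about how raw points were normalized for reporting):

$$\underbrace{4 \times 1}_{\text{area 1, four scenarios}} + \underbrace{4 \times 4 \times 0.5}_{\text{areas 2-5, four scenarios}} = 4 + 8 = 12 \text{ points}, \qquad \text{score \%} = \frac{\text{points earned}}{12} \times 100.$$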

Areas faculty assessed, including desired actions. Did the student:
1) Cardiac Rhythm Interpretation: Correctly interpret the ECG cardiac rhythm presented?
2) Patient Classification: Accurately determine the patient's acute condition classification as stable or unstable?
3) Treatment Algorithm: Deploy a clinically appropriate treatment algorithm (i.e., current ACLS standards)?
4) Electrical Therapy: *Appropriately utilize electrical therapy (i.e., defibrillation/cardioversion)?
5) Medications: *Call for the administration of clinically appropriate medication(s) with correct dosage(s)?

*Note: As the student could elect to utilize this management when clinically contraindicated, credit was also awarded for those scenarios where it was withheld when not warranted.

• Copy of form (assessment method) attached, with SLO alterations highlighted.

• Assessment method utilized during maiden assessment (2014) not available.

Sample Size (Specify N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
ME only           1                          1                     13
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             1                          1                     13

*Dual-enrollment

• 3 (appropriate treatment algorithm selection): 92.3%
• 4 (appropriate electrical therapy): 84.6%
• 5 (appropriate medication and dosage): 100%
Weaknesses by Criterion/Question/Topic: The target of > 80% was not met in topic areas:
• 1 (cardiac rhythm interpretation): 69.2%
• 2 (patient classification of stable or unstable): 76.9%
Additionally, topic area 4 (appropriate electrical therapy) met the target yet showed a decline compared to the previous assessment: 84.6% vs. 94.4%.

• 2 (patient classification of stable or unstable): 76.9% vs. 98.3%
• 4 (appropriate electrical therapy): 84.6% vs. 94.4%

It should be noted that data regarding average scores were not available for the previously conducted assessment. Current actions to improve SLO based on the results: Our faculty believe that the areas in which we did not meet our target (topics 1 and 2), as well as the topic in which we met the target yet showed a decline compared to our last assessment (topic 4), are of vital importance to master, both for credentialing and for clinical practice. In an effort to improve these areas, starting with the Fall 2018 term our faculty will institute the following actions: we will increase the time spent during our clinical laboratory scenario sessions (within EMS 213, ALS Skills, and EMS 207, Advanced Patient Assessment) reviewing cardiac rhythm strips and the proper use of electrical therapy measures (i.e., when it should be utilized as well as withheld). Furthermore, all faculty will ensure that during these sessions, students acting in the "lead" medic role are specifically tasked with verbally stating each mock cardiac patient's stability classification (stable vs. unstable) in a rapid-triage fashion. Also starting with the Fall 2018 term, faculty will provide real-time feedback regarding these items during the debrief period that follows each scenario training session. Next assessment of this SLO: Fall 2018


Core Learning Outcome   Evaluation Methods   Assessment Results   Use of Results

CLO: The EMS Advanced Life Support Student will demonstrate competent affective behavior related to emergency medical care, as measured by the Northern Virginia Community College EMS Program Affective Behavior Assessment tool. [ x ] QR

Students from the following advanced life support level EMS sections were assessed:
• 151: Introduction to Advanced Life Support
• 201: Professional Development
• 207: Advanced Patient Assessment
• 205: Advanced Pathophysiology
Direct Measure: Assessments were completed by faculty based upon direct student observation as well as any applicable peer-reported incidents occurring during the relevant term. Faculty were assigned students based on the student's primary ALS class level. The affective behavior assessment tool utilized was developed, in part, from information gained from the Joint Review Committee on Educational Programs for the EMT-Paramedic and incorporates eleven relevant affective domain topic areas that directly reflect content from the roles and responsibilities portion of our national paramedic-level curriculum. Accompanying each topic area were expectations to guide faculty in appropriate scoring. Faculty were advised to assign scores based on behavioral patterns and not on remote atypical occurrences. The assessments were conducted by all full-time faculty members. Scoring: Each of the eleven topic areas was scored via a Likert scale of 0-2:

• 2 = Competent
• 1 = Needs Improvement
• 0 = Not Yet Competent

Assessment Tool Topics:
1. INTEGRITY
2. EMPATHY
3. SELF-MOTIVATION
4. APPEARANCE AND PERSONAL HYGIENE
5. SELF-CONFIDENCE
6. COMMUNICATIONS
7. TIME MANAGEMENT
8. TEAMWORK AND DIPLOMACY
9. RESPECT
10. PATIENT ADVOCACY
11. CAREFUL DELIVERY OF SERVICE

Data collected: Spring 2018 Target: > 80% of all students assessed will achieve > 80% (> 1.6 points) for each of the eleven topic areas. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality
                  Spring 2018                      Previous Assessment
Campus/Modality   Average Score   Percent > 80%    Average Score   Percent > Target
ME                89.1%           81.6%            N/A             N/A

Results by CLO Criteria:

Results by Individual CLO Criteria/Question Topics (Spring 2018)
Topic    Average Score        % of All Students > 80% (1.6 points)
1        95.9% (1.9 points)   91.8% (44/49)
2        88.8% (1.8 points)   77.6% (38/49)
3        83.7% (1.7 points)   69.4% (34/49)
4        98.0% (1.9 points)   95.9% (47/49)
5        73.5% (1.5 points)   51.0% (25/49)
6        83.7% (1.7 points)   69.4% (34/49)
7        91.8% (1.8 points)   83.7% (41/49)
8        95.9% (1.9 points)   91.8% (45/49)
9        98.9% (1.9 points)   98.0% (48/49)
10       91.8% (1.8 points)   83.7% (42/49)
11       83.7% (1.7 points)   85.7% (42/49)
Totals   89.1% (19.6/22)      81.6% (440/539)

Current results improved: N/A - This is the first term that this CLO has been assessed and will serve as the benchmark for future assessments.

Previous action(s) to improve CLO if applicable: This is the first time assessing this CLO. Target Met: [ ] Yes [ ] No [x ] Partially Based on recent results, areas needing improvement: This was our program’s first assessment of this CLO. It was noted that several areas (empathy, self-motivation, self-confidence, and communications) did not meet our ascribed target and will need to be addressed via the below prescribed action plan. It is believed that one reason for our not achieving our target specifically in the self-confidence topic area is likely related to the low scores received by students in our initial ALS classes who are just starting the advanced portion of the program and thus would expectedly have less self-confidence than their seasoned 200-paramedic level peers. Current action(s) to improve CLO, based on results: Starting with the Fall 2018 term all faculty will ensure that whenever a student’s affect (regarding any of the sub-target regions) begins to trend in a declining fashion, we will attempt to reverse the trend via a formal one on one meeting with the student to actively discuss the areas of potential concern. Also beginning with the Fall 2018 term the faculty will work with each student in order to determine if internal or external causative factors are at play. Utilizing this knowledge, the student and faculty member will collaborate to develop individualized strategies to assist the student. These could include program or college resources or the assigning of a fellow student as a peer mentor. The above prescribed collaborative faculty/student strategizing sessions will


Note: Expectations for the assessment tool topics are saved as an attachment. Sample Size (Write N/A where not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
ME only           4                          4                     49
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             4                          4                     49

* Dual Enrollment

Strengths by Criterion/Question/Topic: Our assessment results showed several areas that achieved or surpassed our target. These areas included:
1. Integrity
4. Appearance and Personal Hygiene
7. Time Management
8. Teamwork and Diplomacy
9. Respect
10. Patient Advocacy
11. Careful Delivery of Service
Weaknesses by Criterion/Question/Topic: Our target score was not met in the following topic areas:
2. Empathy
3. Self-Motivation
5. Self-Confidence
6. Communications

be documented (when legally/ethically permissible and adhering to all college policies) within our behavioral assessment tool. These processes are to be enacted by all program faculty starting Fall 2018. Next assessment of this CLO: Fall 2018

Program Goals Evaluation Methods Assessment Results Use of Results

To show escalation in the number of program placed students within the EMS program

Data Source: NOVA Data for Evaluation and Planning Reports/Distribution of Program Placed Students by Curriculum and Award Type (http://www.nvcc.edu/college-planning/data.html)

Target: The overall number of program-placed students (degree and certificate) will increase by at least 10 percent. Results for Past 5 years:

Fall    Number of Students   Percentage Change
2017    46                   130%
2016    20                   -23%
2015    26                   30%
2014    20                   0%
2013    20                   -26%

Target Met: [x ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): Our overall program placement percentage showed an exceptional increase over all of the previously assessed four years.

Previous action(s) to improve program goal: In 2013 (when this item was last addressed), the program director, in collaboration with student services, devised a new EMS CSC worksheet that was deployed in Spring 2015. As the level of placements declined the following year, it is unlikely that this action was beneficial. Target Met: [X] Yes [ ] No [ ] Partially. Most recent results: The program had a 130% increase in program placement over the previous year; the 2017 total also exceeded each of the previous four assessed years. We believe that a key to our success was the outstanding effort put forth by our program's administrative assistant, which included immediately responding to inquiries from current and prospective students and carefully explaining the nuances and layers of complexity that are specific to our unique program. These areas must be understood to allow for appropriate program placement and, ultimately, for student success. When certain vital information is not appropriately relayed to prospective and current students alike, we run the risk of


attrition secondary to their justifiable disgruntlement. Current Action Plan: Faculty and the program director (working with our administrative assistant and the student services department) will ensure that each student who is taking classes in our program, regardless of level, is actively program placed. We have considered allowing students to complete the program placement form online in order to streamline the process; the program director will collaborate with the dean of students to ascertain the pros and cons and overall viability of such an undertaking. The program director, in collaboration with the administrative assistant, will revamp our current online information session to reflect the multitude of changes that will be occurring in the EMS program's curriculum. The state-mandated curriculum changes will begin in Spring 2019 and will alter the types of CSCs offered by our program. These changes make effective communication with current and prospective students imperative. The above actions are to be implemented during the Spring 2019 term. Assessed: Annually

To show escalation in the number of annual graduates from the EMS program

Data Source: NOVA Data for Evaluation and Planning Reports/Distribution of Program Placed Students by Curriculum and Award Type (http://www.nvcc.edu/college-planning/data.html)

Target: The number of annual graduates from the EMS program will show an increase of at least 10 percent. Results for Past 5 Years:

Academic Year   Number of Graduates   Percentage Change
2017-2018       19                    -5%
2016-2017       20                    -9%
2015-2016       22                    -24%
2014-2015       29                    12%
2013-2014       26                    -16%

Target Met: [ ] Yes [ x] No [ ] Partially

Previous action to improve program goal: Note: Limited data regarding 2013 graduation number enhancement strategies was found. Instead of enhancing program graduation rates, the goal appears to have been related to: “Increase the number of participants in the EMS program”. The action plan apparently involved both faculty and the program director and included: We will continue to promote the CSCs and the EMS AAS degree, informing students of the benefits these can provide to them as they progress through their EMS career. In 2015, we will add one additional section of Basic EMT to both the Spring and Summer


semesters. We also will add a second section of each of our paramedic classes and will offer an accelerated Intermediate program in Fall 2015. We continue to brainstorm ways to improve our offerings; one such idea is to offer "non-traditional" times for some classes, such as weekend classes and/or dynamic accelerated classes (intensives). Most recent results: Unfortunately, our total of 19 graduates is the lowest within the last five-year assessment period. Results improved: [ ] Yes [X] No [ ] Partially. Current actions to improve program goal: We believe that in the increasingly competitive field of EMS, employers are beginning to see the value of staff who possess not only a health-care credential but also a discipline-specific degree. Starting with the Spring 2019 term, our program director, along with program staff and faculty, will undertake a more directed marketing approach showing the multitude of positives and potentials for advancement that one may gain from holding a degree along with National/State licensure. Additionally, beginning with the Spring 2019 term, the value of a degree versus clinical licensure alone will also be stressed by faculty during academic advising sessions with each of our students. Next assessment: Fall 2019


Annual Planning and Evaluation Report: 2017-2018 Engineering, A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed to prepare the student to transfer into a baccalaureate degree program in engineering fields such as mechanical engineering, civil engineering, chemical engineering, aeronautical engineering, and naval architecture/marine engineering. Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Student will demonstrate knowledge of mechanics of deformable bodies.

Mechanics of Materials EGR 246
Direct Measure: A comprehensive problem involving the following:
a. Determine the maximum shear and maximum bending moment.
b. Draw the shear and bending moment diagram.
c. Identify the properties of the I-beam.
d. Calculate normal stresses.
e. Calculate Qb (first moment of the area) of the beam flange.
f. Calculate shear stress at the flange-web junction.
g. Calculate the maximum value of the principal stress at the flange-web junction.
Sample Size (Specify N/A when not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL                1                          1                     30
AN                1                          1                     19
MA                1                          **                    -
ME                N/A                        N/A                   N/A
LO                N/A                        N/A                   N/A
WO                N/A                        N/A                   N/A
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             3                          2                     49

*Dual Enrollment **Data not available

Semester/year data collected: Fall 2017 Target: Minimum acceptable success rate: 60% Success rate - % of students who scored 60% or above on their completed test questions. Results by SLO Criteria:

Results by SLO Criteria/Question Topics   Fall 2017   Spring 2017
a. b.                                     67%         54%
c.                                        80%         57%
d.                                        53%         39%
e.                                        53%         43%
f.                                        59%         38%
g.                                        49%         30%

Current results improved: [X] Yes [ ] No [ ] Partially
Strengths by Criterion/Question/Topic: Questions "a" and "b" met the minimum acceptable success rate.
Weaknesses by Criterion/Question/Topic: Questions "c, d, e and f" did not meet the minimum acceptable success rate.

Previous action(s) to improve SLO: At the Fall 2017 cluster meeting, it was suggested that the SLO be assessed again to determine whether improvements to the SLO assessment instruments were needed and to identify the necessary steps to improve the outcome. Students encountered challenges in solving engineering problems requiring multiple steps; exposing students to these multiple steps of computation will allow them to gain confidence in solving complex engineering problems. Instructors should demonstrate several similar problems when covering Chapter 8 of the course. EGR 246 is the last engineering course in the program, and students need to acquire the ability to solve complex problems that will be encountered in their junior and senior engineering courses. Target Met: [ ] Yes [ ] No [X] Partially. Based on recent results, areas needing improvement: EGR 246 was assessed last Fall 2017 to determine if there is a need to revise the assessment instrument. This assessment period, the results for all parts of the problem showed improvements, with an average increase of 17%. The assessment questions used in this SLO were shared and discussed with Mason's Mechanical Engineering Department and were accepted. Even with the increase in each part of the problem-solving question, only the first two questions, a) determine the maximum shear and maximum bending moment and draw the shear and bending moment diagram, and b) identify the properties of the I-beam, met the minimum acceptable success rate of


60%. These parts of the problem were also covered in EGR 240 Solid Mechanics – Statics, while the other parts of the question cover mechanics of materials concepts and applications. Current actions to improve SLO based on the results: Continue to use this comprehensive question in EGR 246. Reinforce the lectures on the calculation of shear and bending moment and on the stress calculations required to select the proper size of the horizontal structural support; this recommendation will be implemented in Spring 2019. Instructors should continue to demonstrate solving similar problems when covering Chapter 8 of the course. Students need to acquire the ability to solve complex problems that will be encountered in their junior and senior engineering classes. The Engineering Discipline Dean and the Engineering Discipline Group will be responsible for the implementation of the recommendations. Next assessment of this SLO: Fall 2018

Student will be able to analyze the position of rigid bodies and their applied forces at rest and in motion.

Engineering Mechanics – Dynamics EGR 245
Direct Measure: Problem Solving Tests in EGR 245
SLO Question 1: Part A: Equation of Motion; Part B: Application of Linear Momentum
SLO Question 2: Part A: Summation of Forces and Moments; Part B: Application of Newton's Second Law
Sample Size (Specify N/A when not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed   % of Students Assessed
AL                1                          1                     15                    20
AN                2                          2                     40                    56
MA                N/A                        N/A                   N/A                   N/A
ME                N/A                        N/A                   N/A                   N/A
LO                1                          1                     17                    24
WO                N/A                        N/A                   N/A                   N/A
ELI               N/A                        N/A                   N/A                   N/A
DE*               N/A                        N/A                   N/A                   N/A
Total             4                          4                     72                    100

*Dual-enrollment

Semester/year data collected: Spring 2018 Target: Minimum acceptable success rate: 60% Success rate - % of students who scored 60% or above on their completed test questions. Results by SLO Criteria:

Results by SLO Criteria/Question Topics   Spring 2018   Fall 2017
SLO Question 1, Part A                    54%           60%
SLO Question 1, Part B                    42%           45%
SLO Question 2, Part A                    64%           76%
SLO Question 2, Part B                    47%           56%

Current results improved: [ ] Yes [ X ] No [ ] Partially Strengths by Criterion/ Question/Topic:

Previous action(s) to improve SLO: Prior recommendations were: a) Benchmark the use of this SLO Question, b) Review of the summation of forces and moments; concepts were learned and applied in EGR 240 Solid Mechanics – Statics where the rigid bodies are at rest, and c) Emphasize the need for students to solve engineering problems with multiple parts and the ability to review their solutions step-by-step to ensure solving of the entire problem was done correctly. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: It was noted that some students failed to identify the directions causing incorrect signs in the equations. However, the overall concepts were assimilated.



Only SLO Question 2, Part A (Summation of Forces and Moments) met the minimum acceptable success rate, at 64%. Weaknesses by Criterion/Question/Topic: All of the SLO questions, with the exception of SLO Question 2, Part A, failed to meet the minimum acceptable success rate.

The second question showed the need to emphasize the concept of friction in the prerequisite course, EGR 240 Solid Mechanics – Statics. Current actions to improve SLO based on the results: Continue to use the SLO question to gather additional data; this additional data will be used to design new and additional SLO questions. Continue the prior recommendation to review the concepts of the summation of forces and moments; this initiative should address the issue of incorrect vector directions in the equations. Continue to emphasize the need for students to solve engineering problems with multiple parts and to review their solutions step by step to ensure the entire problem is solved correctly. Also, there is a need to emphasize the concept of friction and its application to rigid bodies in motion in EGR 240 Solid Mechanics – Statics, which is the prerequisite course. This recommendation will be implemented in Spring 2019. The Engineering Discipline Dean and the Engineering Discipline Group will be responsible for the implementation of the recommendations. Next assessment of this SLO: Fall 2018

Students will demonstrate ability to work effectively as a team.

EGR 295 Direct Measure: Assessment of this SLO is primarily done with a design project team evaluation, but it also reviews student success rates in completing team projects in EGR 295 in the Spring 2018 semester. Design Project Team Evaluation Form: an assessment tool from the course's textbook, Engineering Design, 4th Edition, Dym and Little (ISBN: 978-1-118-32458-5), Publisher: Wiley.

Semester/year data collected: Spring 2018. Target: The success rate is defined as the number of students who received 75% or greater on their sum of ratings. Results by SLO Criteria:

Results by SLO Criteria/Question Topics   Spring 2018   Fall 2016
a.                                        91%           88%
b.                                        82%           86%
c.                                        79%           82%
d.                                        86%           90%
e.                                        91%           88%

Previous action(s) to improve SLO: The previous action was to provide additional discussions on leadership and identifying leadership qualities and opportunities. This will allow students to realize their leadership qualities and potentials. This recommendation will be implemented in Spring 2019. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: “Ability to Provide Leadership” is consistently lower in both sets of data. All other


Students are required to rate their team members and themselves in the following categories:

a) Quality of the technical work
b) Ability to Communicate
c) Ability to Provide Leadership
d) Commitment to Team Project
e) Demonstrated Effectiveness

Sample Size (Specify N/A when not offered)

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AN only           1                          1                     23
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A

*Dual-enrollment


Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criterion/ Question/Topic: All categories of the SLO assessment instrument were met. Weaknesses by Criterion/ Question/Topic: No weakness on the SLO assessment instrument.

categories have significantly high ratings above the success rate. Current actions to improve SLO based on the results: The instructor in EGR 295 should continue to cover the Team Dynamics lecture early in the semester, before the midterm team project, and reinforce the discussion of team dynamics and leadership prior to the final team project. Encourage the different teams to communicate the progress and challenges they encounter during the completion of their projects. Often every team member demonstrates leadership qualities; however, they do not realize these qualities. Additional discussion will focus on recognizing leadership qualities and ensuring that the teams work efficiently and effectively. This recommendation will be implemented in Spring 2019. The Engineering Discipline Dean and the Engineering Discipline Group will be responsible for the implementation of the recommendations. Next assessment of this SLO: Spring 2019

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Student will apply and demonstrate engineering problem solving methodology. [ X ] QR

Solid Mechanics – Statics EGR 240
Direct Measure: Problem Solving Tests in EGR 240. See attached method.
SLO Question 1: Part A: Defining vectors of forces in 3D; Part B: Solving the problem using simultaneous equations of 3 unknowns and 3 equations.
SLO Question 2: Solving 3 questions of vector cross product.
Sample Size (Specify N/A when not offered):

Campus/Modality   Total # Sections Offered   # Sections Assessed   # Students Assessed
AL                1                          1                     35
AN                2                          **                    -
MA                2                          2                     27
ME                N/A                        N/A                   N/A
LO                1                          1                     11
WO                N/A                        N/A                   N/A
ELI               N/A                        N/A                   N/A
DE*               N/A                        N/A                   N/A
Total             6                          4                     73

*Dual Enrollment   **Data not available

Semester/year data collected: Spring 2018 Target: Minimum acceptable success rate: 60% Success rate - % of students who scored 60% or above on their completed test questions. Results by CLO Criteria:

Results by CLO Criteria/Question Topics   Spring 2018   Fall 2017
SLO Question 1, Part A                    54%           60%
SLO Question 1, Part B                    42%           45%
SLO Question 2: One Problem               93%           39%
SLO Question 2: Two Problems              73%           42%
SLO Question 2: Three Problems            79%           51%

Current results improved:

Previous action(s) to improve CLO if applicable: SLO Question 1: Engineering Mechanics instructors include additional exercises in viewing mechanics illustrations in order to properly identify vectors. Provide students additional practice problems in solving three equations simultaneously. SLO Question 2: Engineering Mechanics instructors will continue to provide additional mechanics problems in solving force couple system problems using Vector cross product. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: Students need to improve their ability: a) To define the vectors of forces in 3D;



[ ] Yes [ ] No [X ] Partially Strengths by Criterion/ Question/Topic: SLO Question 2 met the minimum acceptable success rate. Weaknesses by Criterion/ Question/Topic: SLO Question 1 failed to meet the minimum acceptable success rate.

b) Solve simultaneous equation problems with 3 equations and 3 unknowns. Current actions to improve CLO based on the results: This is the first time this question set has been used to assess core learning; however, the program has previously tracked students' ability to solve engineering problems and their quantitative reasoning skills as an SLO. Engineering Mechanics instructors will include additional lectures with exercises in reading mechanics illustrations and extracting the information required to develop the vectors. Providing these additional lectures on defining vectors from mechanics problem illustrations will help identify whether students' difficulty in completing the problem occurs in defining the vector equations or in solving the equations simultaneously. Further, to the extent possible, instructors will assess students' math skills and their challenges in solving simultaneous equations, and will provide students several practice exercises in solving three equations simultaneously. Continuing prior recommendations, Engineering Mechanics instructors will provide additional mechanics problems on solving force-couple system problems using the vector cross product. These recommendations, though focused on engineering mechanics, also address students' quantitative reasoning skills in solving complex problems. These recommendations will be implemented in Spring 2019. The Engineering Discipline Dean and the Engineering Discipline Group will be responsible for the implementation of the recommendations. Next assessment: Fall 2018

Program Goals   Evaluation Methods   Assessment Results   Use of Results
To teach students principles of statics/dynamics, and prepare them for future study in

The courses use problem solving tests to evaluate student’s performance in both statics and dynamics.

Fall 2017 Target: Minimum acceptable success rate: 60%

Previous action(s) to improve program goal: One of the prior year’s recommendations was the continuation of constantly evaluating the testing methods and questions to ensure students are successful when they transfer to


Aerospace, Mechanical, or Civil Engineering, as well as other engineering fields.

The use of problem-solving tests measures students' reading and comprehension skills in addition to their math and physics abilities, coupled with the engineering concepts learned in statics and dynamics. OIR Data:

• Grade summary for EGR 240 (Fall 2017)
• Grade summary for EGR 245 (Fall 2017)

Success rate is the percentage of students who took the course and completed it with a grade of C or better.

EGR 240 Solid Mechanics - Statics
Fall    Success Rate
2017    77%
2016    64%
2015    63%
2014    69%
2013    63%

EGR 245 Engineering Mechanics - Dynamics
Fall    Success Rate
2017    76%
2016    50%
2015    68%
2014    81%
2013    58%

Target Met: [X] Yes [ ] No [ ] Partially
Comparison to previous assessment(s): EGR 240 met the acceptable success rate and increased by 13 percentage points from the prior year. EGR 245 also met the acceptable success rate and increased by 26 percentage points from the prior year.

the four-year institutions. Also, in the latest offering of EGR 120 Introduction to Engineering, a section on mechanics was reinforced to highlight the importance of these courses and to make students aware of the commitment in both time and effort needed to successfully complete the courses and the engineering degree program. Most recent results: Both EGR 240 and EGR 245 showed significant improvements; EGR 240 increased by 7% and EGR 245 increased by 16%. Results improved: [ X ] Yes [ ] No [ ] Partially Current action(s) to improve program goal: Continue the prior year's recommendation, since both EGR 240 and EGR 245 increased their success rates. This will determine whether the improvement initiatives continue to meet the acceptable success rate; if not, additional steps will be identified to continue the improvement or maintain current levels. Additionally, engineering mechanics instructors should meet to exchange teaching techniques and experiences acquired during the academic year. This recommendation will be implemented in Spring 2019. The Engineering Discipline Dean and the Engineering Discipline Group will be responsible for implementing the recommendations. Assessed: Annually
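As context for the success-rate figures reported for EGR 240 and EGR 245, the sketch below illustrates the calculation described in the evaluation method. This is a minimal, hypothetical example: the grade counts shown are illustrative only and are not OIR data.

    # Minimal sketch of the success-rate calculation described above.
    # The grade counts are hypothetical examples, not actual OIR data.

    def success_rate(c_or_better: int, total_enrolled: int) -> float:
        """Share of students completing the course with a grade of C or better."""
        return c_or_better / total_enrolled

    # Hypothetical section: 77 of 100 enrolled students earned a C or better.
    rate_2017 = success_rate(77, 100)   # 0.77 -> 77%
    rate_2016 = success_rate(64, 100)   # 0.64 -> 64%

    meets_target = rate_2017 >= 0.60                # compare against the 60% minimum
    change_points = (rate_2017 - rate_2016) * 100   # year-over-year change, in percentage points

    print(f"{rate_2017:.0%} (target met: {meets_target}); change: {change_points:+.0f} points")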

To teach students the fundamental principles of stress and strain relationships in structures, and techniques for handling transformation of stresses and deflection of beams.

The courses use problem-solving tests to evaluate students' performance in mechanics of materials. Problem-solving tests measure students' reading and comprehension skills in addition to their math and physics abilities, coupled with the engineering concepts learned in mechanics of materials. OIR Data: Grade summary for EGR 246

Fall 2017 Target: Minimum acceptable success rate: 60%
Success rate is the number of students who completed the course with a grade of C or better divided by the total number of students who took the course.

EGR 246 Mechanics of Materials
Fall    Success Rate
2017    77%
2016    56%
2015    68%
2014    63%
2013    50%

Target Met: [ X ] Yes [ ] No [ ] Partially
Comparison to previous assessment(s): EGR 246 met the acceptable success rate and increased by 21% from the prior year.

Previous action to improve program goal: Review Engineering Statics content to ensure that students are able to compute the required values (i.e., reactions, center of gravity, moment of inertia, internal forces, etc.) needed for solving mechanics of materials problems. Also, present the course requirements in detail during the initial two sessions of the course. Making students aware of the commitment in both time and effort needed for successful completion will be an ongoing initiative.

This recommendation will be emphasized and implemented in Spring 2019. Results improved: [ X ] Yes [ ] No [ ] Partially Current actions to improve program goal: Continue both recommendations, even though the past two reporting cycles showed different results; the most recent data show an increase of 21% from the prior year. Additionally, engineering mechanics instructors should meet to exchange teaching techniques and experiences acquired during the academic year. This recommendation will be implemented in Spring 2019. The Engineering Discipline Dean and the Engineering Discipline Group will be responsible for implementing the recommendations. Assessed: Annually

Increase number of program placed students

OIR Report: Distribution of Program Placed Students and Grade Summary Report

Fall 2017 Target: Minimum acceptable rate of increase in the number of program placed students: 5%

A.S. Engineering
Year    No. of Program Placed Students
2017    1912
2016    2021
2015    2024
2014    2004
2013    1918

Target Met: [ ] Yes [ X ] No [ ] Partially
Comparison to previous assessment(s): The number of program placed students decreased by 5%.

Previous action(s) to improve program goal: It was recommended in the prior year's assessment that the Assistant Dean and Associate Dean responsible for the engineering program continue to establish an accurate count of engineering students. This will allow a better understanding of the trend, which will be useful in establishing course schedules and availability. Most recent results: In Fall 2017, even with duplicated headcount, only 955 students enrolled in EGR courses, compared to 1,912 program placed students. The number of program placed students includes students in lower-level math courses taken prior to the program's calculus requirement. EGR enrollment declined by 6% from the prior year, similar to the 5% decline in program placed students. Results improved: [ ] Yes [ X ] No [ ] Partially


Current action(s) to improve program goal: Encourage students to get program placed early and to identify the pathway(s) they plan to follow. This will ensure that they are able to select the courses required in their pathway and avoid taking unnecessary courses that will not transfer to their selected transfer institution. Presently, NOVA is participating in an NSF grant, VT-NETS (Virginia Tech – Network for Engineering Transfer Students). This program actively recruits high school students to enroll in NOVA's engineering program and plan to transfer to Virginia Tech upon completion of their two-year degree; a scholarship is awarded to student participants. Promote the NOVA and Mason ADVANCE program, which streamlines students' journey toward a four-year degree; the Engineering degree program is part of this program. Engineering faculty should participate in college activities that promote the engineering program. The engineering faculty also needs to determine whether the 5% increase is a realistic target. This recommendation will be implemented in Spring 2019. The Engineering Discipline Dean and the Engineering Discipline Group will be responsible for implementing the recommendations. Assessed: Annually

To encourage students to complete their A.S. degree in Engineering.

OIR Data – Awards by Curriculum Code and Specialization

Fall 2017 Target: Minimum acceptable rate of increase in the number of graduates: 5%

A.S. Engineering
Year    No. of Graduates
2017    121
2016    101
2015    108
2014    86
2013    89

Previous action to improve program goal: The recommendations from last year's report were: a) encourage students to participate in the information sessions conducted by GMU and Virginia Tech, and b) during faculty one-on-one advising and engineering information sessions, make efforts to inform students of the benefits of completing their associate degree prior to transferring. Most recent results: The number of graduates of the parent program increased by 20%, while the Electrical Engineering Specialization decreased by 19%. There is also a 1% increase in the


A.S. Engineering, Specialization in Electrical Engineering

Year    No. of Graduates
2017    52
2016    64
2015    70
2014    71
2013    64

Percent of Graduates to Program Placed Students
Year    Percent
2017    9%
2016    8.2%
2015    8.8%
2014    7.8%
2013    8%

Target Met: [ ] Yes [ ] No [ X ] Partially
Comparison to previous assessment(s): The number of graduates of the parent program increased by 20%, while the Electrical Engineering Specialization saw a decrease of 19% from the prior year. The number of graduates compared to the number of program placed students shows a 1% increase. (An illustrative calculation of these percentages follows this row.)

number of graduates compared to the number of program placed students. Results improved: [ ] Yes [ ] No [ X ] Partially Current actions to improve program goal: Changes to the degree program to improve the graduation rate were initiated last year but were not recorded in the prior assessment. The A.S. Engineering degree was restructured to allow students to select the engineering courses that transfer to their chosen senior institution and to eliminate course requirements that may not transfer to their specific engineering degree program and transfer institution. With this restructuring, the Specialization in Electrical Engineering was eliminated, and students will now earn an A.S. in Engineering only. Although the changes and the selection of courses are similar to the prior programs, the Engineering faculty will ensure that students are well informed about the available pathways so they can select the courses that facilitate completion of their degree and transfer to their selected institution. Promote the NOVA and Mason ADVANCE program, which streamlines students' journey toward a four-year degree; the Engineering degree program is part of this program. Further, continue to invite senior institutions to provide information sessions so students are informed of the various transfer opportunities. This recommendation will be implemented in Spring 2019. The Engineering Discipline Dean and the Engineering Discipline Group will be responsible for implementing the recommendations. Assessed: Annually
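The percentage comparisons in this goal (for example, the roughly 20% increase in parent-program graduates and the 9% graduates-to-program-placed figure for 2017) can be reproduced as sketched below. This is an illustrative calculation only; it assumes the graduates-to-program-placed percentage is total graduates (parent program plus the Electrical Engineering Specialization) divided by program placed students, a reading that is consistent with the published figures but is not a formula stated in the report.

    # Illustrative reconstruction of the percent-change and ratio figures above.
    # Assumes the graduates-to-program-placed percentage combines parent-program
    # and specialization graduates; this matches the published values but is an
    # interpretation, not a stated formula.

    def percent_change(current: int, prior: int) -> float:
        """Year-over-year change expressed as a percentage of the prior year."""
        return (current - prior) / prior * 100

    print(f"Parent program graduates: {percent_change(121, 101):+.0f}%")   # about +20%
    print(f"EE specialization graduates: {percent_change(52, 64):+.0f}%")  # about -19%

    ratio_2017 = (121 + 52) / 1912 * 100   # total 2017 graduates over program placed students
    print(f"Graduates to program placed students: {ratio_2017:.0f}%")      # about 9%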


Annual Planning and Evaluation Report: 2017-2018

Engineering Technology, A.A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This curriculum is designed to prepare students for employment in Civil Engineering, Mechanical Engineering, or Drafting Technology fields. The degree also prepares individuals to continue their education in advanced degrees for programs in these fields. Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to apply the principles of mechanics in the analyses and solutions of engineering problems.

Statics and Strength of Materials for Engineering Technology EGR 130 Direct Measure: Midterm Examination and Final Examination. (See attachment) Sample Size (Write N/A where not offered)

Campus/Modality    # of Total Sections Offered    # Sections Assessed    # Students Assessed
AL only    1      1      17
ELI        N/A    N/A    N/A
DE*        N/A    N/A    N/A
Total      1      1      17

*Dual-enrollment

Semester/year data collected Fall 2017 Target: Minimum acceptable success rate: 70%

                             Fall 2017    Fall 2016
No. of Students:             17           13
Success rate: Midterm Exam   82%          77%
Success rate: Final Exam     71%          92%

Current results improved: [ ] Yes [ ] No [ X ] Partially Strengths by Criterion/Question/Topic: The success rate on the Midterm Examination (Statics) was 82%, higher than the 71% on the Final Examination (Strength of Materials). Weaknesses by Criterion/Question/Topic: The evaluation method was not able to determine students' strengths and weaknesses from the results.

Previous action(s) to improve SLO: Statics: The EGR 130 Statics and Strength of Materials for Engineering Technology instructor will provide additional lectures and sample problems to help students create the required free-body diagram that leads to the correct computation of the reactions and forces in the problem. The EGR 130 instructor will also provide additional lectures and sample problems on correctly identifying the orientation of forces; this should lead to correct solutions in the analysis of structures using the Method of Joints and Sections. Additionally, the review of math concepts in algebra, analytic geometry, and trigonometry will continue. These steps will also address the visualization issues and the identification of force orientation noted in prior assessment recommendations. Strength of Materials: The EGR 130 instructor will also include additional discussion of solving problems with beams composed of two materials with different properties. Highlighting the rationale for and advantage of using steel reinforcement to increase load capacity will be an essential part of the discussion, giving students further exposure to engineering practices commonly used in building and construction. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: In the Statics portion of the course, there was a slight increase in the success rate. The recommendations from Fall 2017 described above contributed to this increase. However, in


the Strength of Materials portion of the course, a decrease of 11% occurred. Current actions to improve SLO based on the results: In reviewing the results from both the Midterm Examination (Statics) and the Final Examination (Strength of Materials), it appears that better management of the additional lecture time is warranted. The EGR 130 instructor should ensure that adequate time is spent on both Statics and Strength of Materials when additional lecture time is devoted to any of the mechanics topics; this will be implemented in Fall 2019. The EGR 130 instructor should also ensure that other topics are not compromised when additional lectures are warranted. Additional problem-solving examples may be covered in Blackboard so that material can be covered efficiently without compromising lecture time for the other parts of the course. Next assessment of this SLO: The degree program that includes this course was restructured and its specializations discontinued; EGR 130 is not part of the new degree program.

Students will be able to apply their knowledge and skills of Computer Aided Drafting to start and complete engineering drafting projects.

Technical Drawing CAD 140 Direct Measure: Final Examination Sample Size (Specify N/A where not offered)

Campus/Modality    # of Total Sections Offered    # Sections Assessed    # Students Assessed
AN only    1      1      11
ELI        N/A    N/A    N/A
DE*        N/A    N/A    N/A
Total      1      1      11

*Dual-enrollment

Semester/year data collected: Spring 2018 Target: Minimum acceptable success rate: 75% Results by In-Class, ELI, Dual Enrollment:

                                                        Spring 2018    Spring 2017
Number of students                                      11             14
Percent of students with CAD Assignments above 90%      73%            79%
Percent of students with Final Examination above 90%    82%            86%

Current results improved: [ ] Yes [ ] No [ X ] Partially Strengths by Criterion/Question/Topic: The percentage of students with a Final Examination score above 90% was down by 4% but met the target. Weaknesses by Criterion/Question/Topic:

Previous action(s) to improve SLO: The CAD 140 Technical Drawing instructor will reinforce the need for proper placement of views and will spend additional lecture time on multi-view and auxiliary-view drawing preparation. The lectures will include the standards and conventions required in preparing engineering drawings. The CAD 140 instructor will highlight the importance of proper engineering documentation and its use in fabrication and construction, and the cost associated with failing to communicate the design correctly. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: In the prior assessment, it was noted that the conventions of technical drawing need to be reinforced. Even though the software facilitates the creation of the layout, students should be made aware of the accepted engineering practices for showing the different views.


The percentage of students with CAD Assignments above 90% was down by 6% to 73%, and the target was not achieved. Further, the rubric needs to be broken down into different criteria in order to identify the areas that need improvement.

Current actions to improve SLO based on the results: CAD 140 students are able to generate 3D models of complex shapes and to apply their knowledge and skills of Computer Aided Drafting to start and complete engineering drafting projects. However, their graphic communication still needs to address drawing conventions and standards in order to properly communicate designs. It is recommended that the instructor ensure that students focus not only on the 3D modeling aspects but also on the proper views and labels of the drawing. Final examination questions should specify a grade point distribution that includes the proper placement of the required views in addition to drawing the correct graphics. This will be implemented in Spring 2020 or the next course offering. Next assessment of this SLO: The degree program that includes this course was restructured and its specializations discontinued; CAD 140 is not part of the new degree program.

Students will demonstrate interpersonal skills to function as part of a team.

Automated Manufacturing MEC 118 Direct Measure: Survey Questions

• Question 1: Percent of respondents who stated that they have formal/informal interactions.

• Question 2: Percent of respondents who stated that these interactions are beneficial to their learning.

• Question 3: Percent of respondents who stated that these interactions improved their interpersonal skills.

• Question 4: Percent of respondents who stated they will be able to use these skills outside of the classroom environment.

Sample Size (Specify N/A where not offered)

Campus/Modality    # of Total Sections Offered    # Sections Assessed    # Students Assessed
AN only    1    1    12

Semester/year data collected: Spring 2018 Target: Minimum acceptable success rate: 85% Results by In-Class, ELI, Dual Enrollment:

                          Spring 2018                               Spring 2016
Number of respondents*    12                                        10
Question 1                Often (50%), Seldom (50%), None (0%)      Often (60%), Seldom (40%), None (0%)
Question 2                92%                                       100%
Question 3                92%                                       70%
Question 4                92%                                       90%

Current results improved: [ ] Yes [ ] No [ X ] Partially Strengths by Criterion/Question/Topic: Questions 2 and 3: 92% of the respondents identified their interactions as beneficial to their learning and as having improved their interpersonal skills. Weaknesses by Criterion/Question/Topic:

Previous action(s) to improve SLO: In the Fall 2016 report, it was recommended that the MEC 118 instructor foster both structured and unstructured team activities to ensure that students are able to learn and practice interpersonal skills more frequently. Target Met: [ X ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: Since the course was conducted in an eight-week format, students may have been preoccupied with the completion of their projects, contributing to a lower frequency of interactions given the short course duration. The next offering will be a sixteen-week semester. Current actions to improve SLO based on the results: The result is 7% above the acceptable rate; the instructor should continue the previous action of fostering both structured and unstructured team activities to ensure that students are able to learn and experience interpersonal skills with more frequency.


(Sample size, continued)
ELI      N/A    N/A    N/A
DE*      N/A    N/A    N/A
Total    1      1      12

*Dual-enrollment

Question 1: Only 50% of the respondents reported having formal/informal interactions often.

Additionally, scheduling the course in a longer 16-week format may contribute to a positive result, allowing students to interact frequently and improve their interpersonal skills. This will be implemented in Spring 2020 or the next course offering. Next assessment of this SLO: The degree program which includes this course was restructured and its specializations discontinued. MEC 118 is not part of the new degree program.

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Students will be able to synthesize their knowledge of the fundamentals and practices of engineering technology. [ X ] QR

Automated Manufacturing Technology MEC 118 Direct Measure: MEC 118 Final Examination. The final examination consisted of two questions: CNC Lathe and CNC Mill (rubric is attached) Sample Size (Specify N/A when not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed    % Students Assessed
AN only    1      1      12     100
ELI        N/A    N/A    N/A    N/A
DE*        N/A    N/A    N/A    N/A
Total      1      1      12     100

*Dual Enrollment

Semester/year data collected: Spring 2018 Target: Minimum acceptable success rate: 75%

                           Spring 2018    Spring 2016
No. of Students:           12             10
Success rate: CNC Lathe    93%            60%
Success rate: CNC Mill     73%            60%

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criterion/Question/Topic: Question 1 (CNC Lathe) showed improvement and met the target. Weaknesses by Criterion/Question/Topic: Question 2 (CNC Mill) showed improvement; however, it did not meet the target.

Previous action(s) to improve CLO if applicable: This is the first time the assessment focused on Quantitative Reasoning. The activities of the course allow students to combine the knowledge and experience acquired in the other Engineering Technology courses to create a part using Computer Numerical Control (CNC) machines (lathe and mill). The Fall 2016 recommendation stated in that report, that the MEC 118 instructor emphasize the applications and benefits of using canned cycle codes to complete machining tasks on both the lathe and the mill, is applicable to this outcome. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: The CNC mill still needs additional lectures. The complexity of working with three axes, coupled with the machine's speed and feed when producing a part, will require additional examples. It is recommended to continue the prior recommendation to emphasize the applications and benefits of using canned cycle codes to complete machining tasks on both the lathe and the mill; this will facilitate the creation of parts with a simpler set of machining instructions. The MEC 118 instructor will be responsible for implementing the recommendation. Current actions to improve CLO based on the results: To improve students' ability to synthesize their knowledge of the fundamentals and practices of engineering technology, instructors in the prior Computer Aided Drafting and Design (CAD) courses discussed the use of geometric


coordinates beyond traditional design and drawing applications. This initiative was implemented in Spring 2019. By providing this information, students will be able to use their CAD skills and the existing numerical database created in CAD to solve various engineering applications, i.e., machining, manufacturing, and assembly, thus allowing them to synthesize their knowledge of the fundamentals and practices of engineering technology. This recommendation will be implemented in Spring 2020 or the next course offering. Next assessment of this CLO: The degree program that includes this course was restructured and its specializations discontinued; MEC 118 is not part of the new degree program.

Program Goals Evaluation Methods Assessment Results Use of Results

Graduates will be able to perform satisfactorily as Engineering Technologists and gain advanced engineering skills and knowledge in the many and varied entry-level or higher positions.

OIR Data (Fall 2017) and Grade summary for CAD 202 & EGR 130 (CAD 202 provides essential skills to work in an engineering field and EGR 130 provides the analytical skills)

Target:
• CAD 202 minimum acceptable success rate: 75%
• EGR 130 minimum acceptable success rate: 70%

Success rate is the number of students who completed the course with a grade of C or better divided by the total number of students who took the course.

Results for Past 5 Years - CAD 202:
Fall    Success Rate
2017    88%
2016    90%
2015    78%
2014    74%
2013    93%

Results for Past 5 Years - EGR 130:
Fall    Success Rate
2017    50%
2016    77%
2015    65%
2014    89%
2013    71%

Target Met: [ ] Yes [ ] No [ X ] Partially

Previous action(s) to improve program goal: The A.A.S. in Engineering Technology (with Specializations in Civil, Drafting, and Mechanical) was restructured, eliminating the three specializations and revising the courses offered in the parent program. This will be the last assessment for the discontinued Associate of Applied Science in Engineering Technology with Specializations in Civil, Drafting, and Mechanical. In the next reporting cycle, Fall 2019, the revised Associate of Applied Science in Engineering Technology will report its assessment results and recommendations. Results improved: [ ] Yes [ ] No [ X ] Partially Current action(s) to improve program goal: The degree program that includes these courses was restructured and its specializations discontinued; the courses are not part of the new program.


Comparison to previous assessment(s): The CAD 202 target was met despite a 2% drop, while EGR 130's success rate was down 27% from the prior year and did not meet the target success rate.

To encourage students to complete their A.A.S. degree in Engineering Technology

Office of Institutional Effectiveness and Student Success – Number of NOVA Graduates by Degree and Specialization: 2017-2018. Research Report No. 41-17 (August 2018)

Target:
• Minimum of 7 graduates annually for the A.A.S. in Engineering Technology (and Specializations in Civil, Drafting, and Mechanical)
• Minimum of 15 graduates annually for the Career Studies Certificate in Computer Aided Drafting and Design

Results for Past 5 Years:

Academic Year    Number of Graduates: Engineering Technology A.A.S.    Number of Graduates: Computer Aided Drafting and Design CSC
2017    2     19
2016    15    12
2015    9     15
2014    10    21
2013    14    19

Engineering Technology Target Met: [ ] Yes [ X ] No [ ] Partially
Computer Aided Drafting and Design Target Met: [ X ] Yes [ ] No [ ] Partially
Comparison to previous assessment(s): The number of graduates in Engineering Technology (and Specializations in Civil, Drafting, and Mechanical) decreased by 87%, from 15 graduates to 2, while the number of graduates in the Computer Aided Drafting and Design Career Studies Certificate increased by 58% (from 12 to 19).

Previous action to improve program goal: Several core courses were cancelled in Spring 2018; these cancellations contributed to the decline in the number of graduates. The A.A.S. in Engineering Technology (with Specializations in Civil, Drafting, and Mechanical) was restructured, eliminating the three specializations and revising the courses offered in the parent program. This will be the last assessment for the discontinued Associate of Applied Science in Engineering Technology with Specializations in Civil, Drafting, and Mechanical. In the next reporting cycle, the revised Associate of Applied Science in Engineering Technology will report its assessment results and recommendations. However, the Computer Aided Drafting and Design Career Studies Certificate will continue in its current form. Next assessment: Fall 2018


Annual Planning and Evaluation Report: 2017-2018

Fine Arts, A.A./A.A.A. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: (A.A.): The Associate of Arts degree with a major in Fine Arts is designed for students who plan to transfer to a four-year program in a professional school or to a college or university baccalaureate degree program in Fine Arts. Program Purpose Statement: (A.A.A.): The Associate of Applied Arts degree with a major in Fine Arts is designed for students who seek direct employment in the applied arts field.

Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Critique personal studio art projects, their own or those of other students.

Drawing I ART 121 Direct Measure: Students were asked to write a brief one to two-page critique of an assignment from their portfolio discussing the elements and principles of design. The students commented specifically on use of variety (a principle of design that refers to the combination of design elements to achieve complex relationships within the artwork), thoughtfulness of media, appropriateness of technique, and level of craftsmanship. The project was completed outside of class. Instructors evaluated the critique based on four factors listed below as they related to the assessment criteria of variety, media, technique, and craft: 1. Description: Written, detailed observations on

the subject matter and/or elements seen in the work.

2. Analysis: An understanding of composition and design elements through clear descriptions and correct usage of art terminology.

3. Interpretation: Use of multiple criteria such as mark-making, use of medium, expression, and creativity to explain meaning or intent of artwork.

4. Judgment: Use of multiple criteria to measure the success of the artwork such as craftsmanship, expression, creativity, and design. Judgment is based on a thorough review of visual evidence and excellent interpretation.

Point totals for each area: • Variety: 0 – 25 points • Media: 0 – 25 points • Technique: 0 – 25 points • Craft: 0 – 25 points

Semester/year data collected: Fall 2017 Target: Students will score a combined 70% or higher overall and at least 15 points on each criterion. (An illustrative check of these targets against a single set of rubric scores follows this outcome's entry.) Results by In-Class assessment:

Results by Campus/Modality:

Campus    Fall 2017 Average Score    Fall 2017 % of Students >70    Fall 2015 Average Score
AL        85.0     27.3    84.8
AN**      0        0       86.3
MA**      0        0       76.76
LO        84.3     13      86.25
WO        68.5     53      74.9
Total     81.8     22.5    81.8

** No results submitted

Results by SLO Criteria:

Criteria/Question Topics    Fall 2017 Average Score    Fall 2017 % of Students >15    Fall 2015 Average Score
Variety      19.7    12.5     20.32
Media        20.4    8.8      21.4
Technique    20.8    6.3      20.31
Craft        20.7    11.3     19.7
Total        20.4    9.725    20.43

Current results improved: [ ] Yes [ ] No [ x ] Partially Strengths by Criterion/Question/Topic: Media, Technique, and Craft improved from the previous 2015 assessment, and students showed the greatest achievement in Technique.

Fall 2015 targets were met; however, given the lower averages in media, technique, and craft during that assessment, the cluster felt a need to increase student understanding of critical and analytical techniques in ART 121. In addition, appropriate vocabulary and concepts relevant to assignments were stressed throughout the course to assist students as they became more self-critical and engaged in critical dialogues with each other. Target Met: [ ] Yes [ ] No [ x ] Partially In the current assessment, targets were once again met at all campuses except Woodbridge; however, the total average score across the college remained the same at 81.8%, 11.8 percentage points above the 70% target. Average scores were relatively consistent across the criteria, varying only about 1 point from the lowest to the highest average score (19.7 and 20.8). Scores in technique and craft were highest; however, 11.3% of students did not meet the target score in craft. This suggests that craft and variety are two areas of instruction that should be more thoroughly addressed. This will be done by implementing assignments that enhance exploration in those areas. In turn, instructors will offer targeted feedback specific to craft and use of variety. The cluster is currently developing a rubric that will aid this targeted feedback. The discipline group hopes to implement the new rubric for future assessments. Next assessment: Fall 2019


• Total: 0 – 100 points

Rubric suggestions for point totals within each area of assessment:

• Excellent: 20 – 25 • Above Average: 15 – 19 • Average: 10 – 14 • Below average: >10

Total scores in the four areas of assessment suggest the following four levels of success:

• Excellent: 80 – 100 • Above Average: 60 – 79 • Average: 40 – 59 • Below Average: >40

The instructors collected the written critique, and point totals were distributed in each of the four areas of assessment criteria as indicated above. Instructors scored a rubric for each student and recorded individual student scores on the class tally sheet. Assessment rubrics, example instructions, and class tally sheets are attached. Sample Size (Specify N/A where not offered)

Campus/ Modality

Total # Sections Offered

# Sections assessed

# Students assessed

AL 5 1 11 AN 6 0 0 MA 4 0 0 LO 4 4 54 WO 3 1 15 ELI N/A N/A N/A DE* N/A N/A N/A Total 22 6 80

*Dual-enrollment

Weaknesses by Criterion/ Question/Topic: Use of variety fell slightly over previous 2015 assessment. Variety, a principle of design that refers to the combination of design elements to achieve complex relationships within the artwork, suggests that students can improve in their ability to combine the design elements.
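To illustrate how the ART 121 critique rubric described above is scored against its targets (a combined score of at least 70 out of 100 and at least 15 points on each of the four criteria), the following sketch checks one hypothetical student's scores. The numbers are invented for illustration and are not assessment data.

    # Minimal sketch of the ART 121 target check: combined score >= 70/100 and
    # at least 15 points on each criterion. Scores below are hypothetical.

    CRITERIA = ("Variety", "Media", "Technique", "Craft")

    def meets_targets(scores: dict) -> bool:
        """True if the combined score is >= 70 and every criterion is >= 15."""
        total = sum(scores[c] for c in CRITERIA)
        return total >= 70 and all(scores[c] >= 15 for c in CRITERIA)

    student = {"Variety": 18, "Media": 21, "Technique": 22, "Craft": 14}
    print(meets_targets(student))  # False: the total is 75, but Craft falls below 15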

Critically evaluate works of art and architecture within their historical context.

History and Appreciation of Art II ART 102 Direct Measure: Students were given a formal writing assignment (topics could vary as long as they were selected from ART 102 course content area) and tasked with producing a well-organized piece of formal art criticism. Each student was given a detailed rubric outlining the goals of art criticism: to identify and describe the most important formal attributes of a selected

Semester/year data collected: Fall 2017 Target: Average total score of 3.0 or above for at least 70% of students assessed. These scores were also broken down into category sub-scores with a target score of 3. Results by SLO Criteria:

Criteria/Question Topics    Fall 2017 Average Score    Fall 2017 % of Students > 3    Fall 2015 Average Score    Fall 2015 % of Students > 3
(Table data appear after the evaluation-method details below.)

Fall 2017 target scores were met and exceeded overall and in areas identified for improvement, such as analysis and evaluation. In our review of both the Fall 2015 and Fall 2017 assessments, the art history discipline group concluded that the data indicate the continued importance of providing students opportunities to practice formal analysis both in the classroom and through Blackboard, as practice and prompt


artwork, and using formal terminology, to analyze how the artwork communicates through formal means, evaluating the artwork within its own art historical period. The rubric outlined the specific criteria used to assess each of 5 subcategories: organization, identification, description, analysis and evaluation. The range for each sub-category score was as follows:

• Excellent: 5 • Good: 4 • Acceptable: 3 • Weak: 2

Instructors scored the rubric for each student submission and then recorded individual student scores to obtain class scores on the class tally sheet. Assessment rubric, instructions and tally sheet example attached. Sample Size (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # of Students Assessed
On campus    14     13     274
ELI          N/A    N/A    N/A
DE*          N/A    N/A    N/A
Total        14     13     274

*Dual-enrollment

Organization      3.729    *       3.45    *
Identification    3.83     *       3.82    *
Description       3.73     *       3.52    *
Analysis          3.69     *       3.49    *
Evaluation        3.72     *       3.43    *
Total             3.74     78.4    3.54    77.5

*No data provided at the topic level. Current results improved: [ x ] Yes [ ] No [ ] Partially All targets were met or exceeded. The average total score was 3.74, and 78.4% of students assessed received a total score of 3 or better. Comparing these results to Fall 2015 shows an increase in average total score from 3.54 in Fall 2015 to 3.74 in Fall 2017, as well as a rise in the percentage of students receiving a 3 or better from 77.5% in Fall 2015 to 78.4% in Fall 2017. Strengths by Criterion/Question/Topic: All targets were met or exceeded, with gains over Fall 2015 in both the average total score and the percentage of students scoring 3 or better. Weaknesses by Criterion/Question/Topic: Compared to Fall 2015, the scores show no perceived weaknesses. (An illustrative cohort-level calculation of this target check follows this row.)

instructor feedback are key to improving student learning. Steering committee faculty for the art history discipline group will share this data with their campuses in August 2018 and encourage adjunct faculty to maintain current classroom and Blackboard opportunities to practice formal analysis and to provide prompt instructor feedback on formal analysis throughout the semester. The art history discipline group is revising and adding more SLOs now that we will be assessing our program separately from Fine Arts. A schedule will be forthcoming.
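The ART 102 target is evaluated at the cohort level: at least 70% of assessed students must earn an average rubric score of 3.0 or higher across the five subcategories. The sketch below illustrates that aggregation with hypothetical scores (each subcategory rated 2 through 5); it is not the discipline group's actual tally procedure.

    # Minimal sketch of the cohort-level ART 102 target check: at least 70% of
    # students with an average subcategory score of 3.0 or higher. Hypothetical data.

    def student_average(scores: list) -> float:
        """Average of the five subcategory scores (each rated 2-5)."""
        return sum(scores) / len(scores)

    # Each inner list: organization, identification, description, analysis, evaluation
    cohort = [
        [4, 4, 3, 3, 4],
        [3, 3, 2, 2, 3],
        [5, 4, 4, 4, 4],
        [3, 4, 3, 3, 3],
    ]

    share_meeting = sum(student_average(s) >= 3.0 for s in cohort) / len(cohort)
    print(f"{share_meeting:.0%} of students at or above 3.0")  # 75% -> 70% target met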

Identify and apply the principles and elements of design.

Fundamentals of Design I ART 131 Direct Measure: Students were asked to plan and complete a project that utilizes and synthesizes the elements and principles of design, and applies color as an element of design. The project was completed both in class and out of class. Students were assessed on two criteria: 1) Utilization of principles and elements of art and

design. 2) Application of color as an Element of Design.

Semester/year data collected: Spring 2018 Target: Students will score 4 or greater in the two categories assessed and a combined target of 8. Across the college, the target is an average of 4.1 in each category and a combined target average of 8.1. Results by In-Class assessment:

Results by Campus/Modality:

Campus    Spring 2018 Average Score    Spring 2018 % > 8    Spring 2015 Average Score
AL        8.6    24    ***
AN**      0      0     ***

Spring 2016 assessment achievement targets were met; however, the cluster was dissatisfied with the number of assessments returned from ART 132 Fundamentals of Design II. In Spring 2016, course content for ART 131 and ART 132 was revised: content for ART 131 was revised to include color theory and other principles of design, and content for ART 132 was revised to focus primarily on 3D design. The cluster agreed to shift assessment of SLO 7 to examine the use of color as new course content


Within each criterion, students were scored from 1-5 points. Each score represented the following levels of achievement in the use of Color, Principles, and Elements of Design. A score of:

• 5 = Master use • 4 = Accomplished use • 3 = Competent use • 2 = Developing use • 1 = Beginning use

Scores in both categories were then combined for totals of 6 – 10. Sample Size (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
AL       2      2      25
AN       3      0      0
MA       2      1      18
LO       2      2      23
WO       2      0      0
ELI      N/A    N/A    N/A
DE*      N/A    N/A    N/A
Total    11     5      66

*No Dual-enrollment

MA        9.2    5.6     ***
LO        8.5    21.7    ***
WO**      0      0       ***
Total     8.7    18.2    9.3

** No results submitted
*** No data for individual students on file

Results by SLO Criteria:

Criteria/Question Topics                    Spring 2018 Average Score    Spring 2018 % of Students > 4.1    Spring 2015 Average Score
Use of Elements and Principles of Design    4.35                         13.6                               4.5
Use of Color                                4.37                         15.2                               4.9
Total                                       8.7                          n/a                                9.3

Current results improved: [ ] Yes [ x ] No [ ] Partially
Strengths by Criterion/Question/Topic: The difference in scores between the two criteria is negligible, although scores did fall somewhat from the previous Spring 2015 assessment.
Weaknesses by Criterion/Question/Topic: Target scores were met in both criteria, and there are no demonstrable weaknesses based upon the assessment.

of ART 131. ART 131 is also a more highly enrolled course, and shifting assessment to this class offered the advantage of providing a larger sample of data. In addition, rubrics were revised to provide more focused assessment and to include the use of principles of design (balance, color, texture, and space). Target Met: [ x ] Yes [ ] No [ ] Partially It is difficult to compare the Spring 2018 and Spring 2015 sets of data for three reasons: 1. Assessment criteria remained the same, but

the 2018 rubric was changed to reflect a greater emphasis on color.

2. ART 131 Fundamentals of Design I is an introductory studio art class, and a less advanced group of students was assessed during the Spring 2018 assessment cycle.

3. Records of individual student scores were not retained, and no data indicating the percentage of students who met the target are available.

Upon comparison of the available data, average scores met targets but declined from the previous Spring 2015 assessment by 0.6 points. This is, once again, likely due to the assessment of less advanced students in ART 131 in Spring 2018 versus the more advanced students in ART 132 in Spring 2015. The assessment did not expose weaknesses in the assessment criteria, which suggests the rubric can be refined to better identify student weaknesses. The discipline group is currently developing a rubric that will expand upon the criteria and aid instructors in their evaluation of projects and subsequent feedback to students. The rubric should be completed and implemented in Fall 2018. Next assessment: Spring 2020

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Critical Thinking [ x] CT

Art Appreciation ART 100 Direct Measure: Students were given a formal writing assignment (topics could vary as long as

Semester/year data collected: Spring 2018 Target: Average total score of 3.0 or above for at least 70% of students assessed. These scores were

There was no previous assessment of a Critical Thinking Core Learning Outcome and no previous actions for improvements to report. This is the first


they were selected from ART 100 course content area) and tasked with producing a well-organized piece of formal art criticism. Each student was given a detailed rubric outlining the goals of art criticism: to identify and describe the most important formal attributes of a selected artwork, and using formal terminology, to analyze how the artwork communicates through formal means evaluating the artwork within its own art historical period. The rubric outlined the specific criteria used to assess each of 5 subcategories: organization, identification, description, analysis and evaluation. The range for each sub-category score was as follows:

• Excellent – 5 • Good – 4 • Acceptable – 3 • Weak – 2

Instructors scored the rubric for each student submission and then recorded individual student scores to obtain class scores on the class tally sheet. The assessment rubric, instructions and tally sheet example are attached. Sample Size (Specify N/A where not offered)

Campus/Modality    Total # Sections Offered    # Sections Assessed    # Students Assessed
On Campus    27     22     549
ELI          N/A    N/A    N/A
DE*          N/A    N/A    N/A
Total        27     22     549

*Dual-enrollment

also broken down into category sub-scores with a target score of 3. This was the first time ART 100 has been assessed and the first assessment of a VCCS core competency. As a result, no comparison between previous and current assessment is available. Results by CLO Criteria:

Criteria/Question Topics    Average Score
1. Organization     3.52
2. Identification   3.69
3. Description      3.59
4. Analysis         3.25
5. Evaluation       3.25
Total               3.46

All targets were met or exceeded. The average total score was 3.46, and 76.1% of students assessed received a total score of 3 or better. Current results improved: Inconclusive; this was the first assessment of this CLO, so no comparison is available. Strengths by Criterion/Question/Topic: Student scores were strongest in Identification, with an average of 3.69, 0.69 points above the target average of 3. Weaknesses by Criterion/Question/Topic: Student scores were weakest in Analysis and Evaluation, with an average score of 3.25 in both categories. This score still falls in an acceptable range, as it is 0.25 points above the target average of 3.

time this VCCS core competency has been assessed. Target scores and averages were met during this initial assessment, but areas of improvement were identified by the art history discipline group. Analysis and evaluation are critical thinking skills that are essential to all students – not just art history students. ART 100 is a popular General Education course, and the art history discipline group needs to make certain that critical thinking (as well as other core competencies) is an essential component of EVERY ART 100 course taught at NOVA. As such, the art history discipline group will review and revise the ART 100 Course Content Summary (CCS) to ensure core competencies are addressed. The art history discipline group will then work with the Academic Dean for Liberal Arts to ensure that a course taught at one campus meets the same standards as one taught at another; standards are currently inconsistent across the college, and this impacts student learning. The art history discipline group's review of the CCS will take place in Fall 2018, with submission of revisions by Spring 2018. All campuses, sections, and instructors will need to comply with the new CCS course outcomes/VCCS core competencies and revised standards by Summer 2018. Next assessment: Spring 2018

Program Goals Evaluation Methods Assessment Results Use of Results

Increase number of graduates in Fine Arts programs.

Data was gathered from the NOVA OIR 2017-2018 Fact Book.

Target: 40 graduates from the Associate of Arts (A.A.) and 18 graduates from the Associate of Applied Arts (A.A.A.)

Results for Past 5 Years – Associate of Arts (A.A.) in Fine Arts:

Fall    Number of Graduates    % Change
2013    40

OIR data from 2013-2016 indicated a trend in declining graduation from AA and the AAA in Fine Arts degree programs. Student surveys administered in Spring 2016 to art students indicated students felt the AA and AAA did not adequately prepare them for successful transfer. It was recorded in the 2015-2016 APER that the cluster felt implementation of an Associate of Fine


2014    38    -5
2015    35    -7.89
2016    31    -11.43
2017    44    +41.9

Target Met: [ x ] Yes [ ] No [ ] Partially

Results for Past 5 Years – Associate of Applied Arts (A.A.A.) in Fine Arts:

Fall    Number of Graduates    % Change
2013    21
2014    19    -9.52
2015    14    -26.32
2016    11    -21.43
2017    19    +72.7

Target Met: [ x ] Yes [ ] No [ ] Partially
Graduation in both the AA and AAA improved significantly, by 13 students (41.9%) and 8 students (72.7%), respectively.

Arts (AFA) degree was essential in increasing degree completion rates. It was also recommended that the AA in Fine Arts and AAA in Fine Arts close upon implementation of the AFA. In Fall 2016, NOVA's Curriculum Committee and Administrative Council approved the AFA and the subsequent closure of the AAA and AA in Fine Arts. In Fall 2018, the AFA was implemented, the AA and AAA in Fine Arts were officially closed, and there will effectively be no new AA and AAA in Fine Arts program placed students. Current students who wish to stay on the AA or AAA path, instead of migrating to the AFA, will have three years to complete the degree, and the fine arts discipline group anticipates a sharp decline in AA and AAA in Fine Arts graduates during that time. This APER marks the last program goal assessment before the AA and AAA in Fine Arts degree plans closed in Fall 2018. Recent results from 2017-2018 data demonstrate a very large increase in graduation rates, due to measures taken during the 2016-2017 assessment to improve student advising and promote graduation. Results improved: [ X ] Yes [ ] No [ ] Partially Faculty will continue to advise and assist remaining AA and AAA in Fine Arts students to either graduate within three years or pursue the Associate of Fine Arts (AFA) in Visual Art. There are currently no data available to suggest a new goal for AFA graduates. When data are available, new targets will be created for the AFA in Visual Arts, and those for the AA and AAA in Fine Arts will be adjusted.

Increase program placement in the Fine Arts Program

Data was gathered from the NOVA OIR 2017-2018 fact book.

Target: 650 program placed students in the A.A. and 175 program placed students in the A.A.A.

Results for Past 5 Years – Associate of Arts (A.A.) in Fine Arts Program Placed Students:

Academic Year    Number of Students    % Change
2013    626

In 2016, AA program placement declined sharply by 82 students and fell below the target of 650 students. AAA program placement increased by eight students but fell below the target of 175 students by 34 students. The sharp decline in AA program placed students in 2016 suggested students found alternative majors


2014    656    +4.79
2015    690    +5.18
2016    608    -11.88
2017    631    +3.78

Target Met: [ ] Yes [ x ] No [ ] Partially

Results for Past 5 Years – Associate of Applied Arts (A.A.A.) in Fine Arts Program Placed Students:

Academic Year    Number of Students    % Change
2013    182
2014    145    -20.33
2015    131    -9.66
2016    139    +6.11
2017    114    -17.99

Target Met: [ ] Yes [ x ] No [ ] Partially Program placement declined by approximately 18% from 139 students to 114.

such as the AAA in Fine Arts. A decline in ART FTES from 961.1 in 2015 to 938.0 in 2016 suggested the overall loss in ART enrollment was in line with an overall loss in NOVA enrollment over the same period. Recent Fall 2017 data showed an FTES increase among AA and AAA in Fine Arts program placed students to 1,044.7, up 11.3% from 938 FTES in Fall 2016. Enrollment is strengthening in the fine arts, and the current assessment suggests that active academic advising during the 2017-2018 academic year has assisted AA Fine Arts program placement. A declining trend persisted in program placed AAA Fine Arts students. In Fall 2018, the college opened the Associate of Fine Arts in Visual Arts. With this new degree came the closure of the AA and AAA in Fine Arts; as a result, it is expected there will be zero program placed students in these now-closed degrees. The studio art discipline groups expect program placed students within the new AFA in Visual Arts degree. Currently, no data are available to suggest a new goal for AFA program placed students. When data are available, new targets will be created for the AFA in Visual Arts, and those for the AA and AAA in Fine Arts will be adjusted. Results improved: [ ] Yes [ x ] No [ ] Partially


Annual Planning and Evaluation Report: 2017-2018 Fine Arts: A.A.A., Photography Specialization

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Fine Arts, A.A.A. – Photography Specialization Program Purpose Statement: The Photography specialization is designed for students who seek employment in the applied arts field. Course work will stress both technical and aesthetic elements, enabling students to solve a wide range of visual problems with imagination and originality

Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Control the image capture process

Photography II PHT 102 Direct measure: Multiple choice exam questions on the following topics:

• Shutter speeds • Shallow Depth of Field (DOF) • Maximum DOF

Sample Size (Write N/A where not offered)

Campus/Modality    # of Total Sections Offered    # of Sections Assessed    # Students Assessed
AL       1      1      AAA 1, AAS 1, AAA+AAS 1, Non-major 6 (Total 9)
WO       1      1      AAA 6, AAS 5, AAA+AAS 1, Non-major 3 (Total 15)
ELI      N/A    N/A    N/A
DE*      N/A    N/A    N/A
Total    2      2      24

*Dual-enrollment The assessment was previously done in PHT101 and the result was not sorted by AAA and AAS.

Semester/year data collected: Fall 2017
Target: 80% of students will answer correctly on each criterion as well as on the overall score.

Results by campus and degree:

SLO Question     Campus    Spring 2016    Fall 2017                                    Change from previous assessment
Shutter speeds   AL        (31) 96%       AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100%    +4%
                 ELI       (5) 55%        NA                                           NA
                 WO        (20) 76%       86%                                          +10%
                 All       83%            93%                                          +10%
Shallow DOF      AL        (19) 59%       AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100%    +41%
                 ELI       (6) 66%        NA                                           NA
                 WO        (11) 42%       73%                                          +31%
                 All       53%            86.5%                                        +33.5%
Max. DOF         AL        (31) 96%       AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100%    +4%
                 ELI       (9) 100%       NA                                           NA
                 WO        (16) 61%       80%                                          +19%
                 All       83%            90%                                          +7%
Overall                    73%            89%                                          +16%

The results for all three SLO questions improved, and the average across all three questions is above target.

Previous action(s) to improve SLO: The PHT discipline group decided to simplify the answer choices and eliminate the "None of the above" selection. The PHT discipline group also decided to simplify the wording of the question to the following: "To obtain shallow Depth of Field in an image." This modified question was implemented in the Fall 2016 assessment, and the revised wording helped students with reading/language challenges. This assessment was previously done in PHT101 – Photography I; the PHT discipline group decided to assess the SLO in PHT102, where the levels of outcome are "Practiced" and "Mastered." Target was met. Based on recent results, areas needing improvement: The Shallow DOF question still has the lowest achievement level, even though simplifying the wording helped increase achievement. Students commented that the wording "wide aperture" in one of the answer choices is confusing. Current action(s) to improve SLO, based on results: The PHT discipline group will discuss answer choices for the Shallow DOF question and consider different wording such as "wide open" or "large" instead of "wide aperture" at the Spring 2019 Cross Campus meeting. The change will be implemented in Spring 2019. For the next assessment in Fall 2020, the SLO lead faculty member will send multiple reminders to collect assessment data

Page 170: Annual Planning and Evaluation Report Instructional Programs … · report; if there is a question about an evaluation method, please contact the instructional program or OIESS)

160

Fine Arts: A.A.A., Photography Specialization separated by Photography + Media AAS degree and others. Next assessment of this SLO: Fall 2020

Manage image assets and workflow

Photography II (PHT 102) Direct measure: Multiple-choice exam questions on the following topics:
• Backing up files
• Lightroom
• Metadata
• Missing image files

Fall 2017
Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
AL | 1 | 1 | AAA 1, AAS 1, AAA+AAS 1, Non-major 6, Total 9
WO | 1 | 1 | AAA 6, AAS 5, AAA+AAS 1, Non-major 3, Total 15
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 2 | 2 | 24
*Dual-enrollment

Semester/year data collected: Fall 2017
Target: 80% of students will answer correctly on each criterion and on the overall score.
Results by campus and degrees: The WO results are not sorted by AAA and AAS. The previous results were not sorted by AAA and AAS.

SLO question | Spring 2016 | Fall 2017 | Change from previous assessment
Backing up files | AL (31) 75% | AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% | +25%
 | WO (10) 76% | 93% | +17%
 | All = 76% | 96.5% | +20.5%
Lightroom | AL (40) 97% | AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% | +3%
 | WO (13) 100% | 100% | +/-0%
 | All = 98% | 100% | +2%
Metadata | AL (37) 90% | AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% | +10%
 | WO (13) 100% | 93% | -7%
 | All = 92% | 96.5% | +4.5%
Missing image files | AL (20) 90% | AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% | +10%
 | WO (13) 100% | 93% | -7%
 | All = 94% | 96.5% | +2.5%
Overall | 90% | 97% | +7%
The overall average increased by 7% compared to the previous assessment. The success rate for the "Backing up files" question increased by 20.5%. The other three questions' success rates are almost identical to the previous assessment (within a 4.5% difference overall).

Previous action(s) to improve SLO: The PHT discipline group revised the "Backing up files" question to fit the current trend, incorporating the "3-2-1 rule," in Spring 2017. The concept of the "3-2-1 rule" was discussed in class in addition to providing online materials on Blackboard. The achievement level for "Backing up files" increased by 20.5%. The target was met.
Based on recent results, areas needing improvement: While the overall achievement level for all four questions improved and is above target, the questions on "Metadata" and "Missing image files" decreased by 7% on one campus.
Current action(s) to improve SLO, based on results: For the next assessment in Fall 2019, the SLO lead faculty member will send multiple reminders to collect assessment data separated by the Photography + Media AAS degree and others. Faculty decided to review the Library Module in Lightroom and how Lightroom handles files within a catalog system, in class and online, starting in Fall 2018.
Next assessment of this SLO: Fall 2019

Solve technical and aesthetic problems independently and creatively

Photography II (PHT 102) Direct measure: Multiple-choice exam questions on the following topics:
• Orange color cast (WB)
• Minimize highlight clipping
• Vibrance

Semester/year data collected: Spring 2018
Target: 80% of students will answer correctly on each criterion and on the overall score.
Results by campus and degrees:

Previous action(s) to improve SLO: The faculty continued providing review materials online and multiple practice tests to prepare students for the exam, which appeared to be effective after the last assessment in Fall 2016. Target was not met.


Sample Size (Write N/A where not offered):
Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
AL | 2 | 2 | AAA 6, AAS 7, AAA+AAS NA, Non-major 12, Total 25
WO | 1 | 1 | AAA 2, AAS 3, AAA+AAS NA, Non-major 7, Total 12
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 3 | 3 | 37
*Dual-enrollment
All PHT102 students were assessed. This involved 3 faculty members in 3 sections. The total sample was 37 students (AAS = 10 students). The previous results included AAS only.

SLO question | Fall 2016 | Spring 2018 | Change from previous assessment
Orange color cast (WB) | AL: AAS (5) = 100% | AL/AAS 57%, AL/AAA 33% | -43%
 | WO: NA | WO/AAS 66%, WO/AAA 100% | NA
 | Total: AAS (5) = 100% | Total/AAS 61.5%, Total/AAA 66.5% | -38.5%
Minimize highlight clipping | AL: AAS (4) = 80% | AL/AAS 100%, AL/AAA 66% | +20%
 | WO: NA | WO/AAS 66%, WO/AAA 50% | NA
 | Total: AAS (4) = 80% | Total/AAS 83%, Total/AAA 58% | +3%
Vibrance | AL: AAS (4) = 80% | AL/AAS 71%, AL/AAA 66% | -9%
 | WO: NA | WO/AAS 100%, WO/AAA 50% | NA
 | Total: AAS (4) = 80% | Total/AAS 85.5%, Total/AAA 58% | +5.5%
Overall | 86.6% | AAS 76.6% | -10%
The success rate for the "Minimize highlight clipping" question improved by 20%. The overall average is under the target by 38.5%. The success rates for the "Orange color cast" and "Vibrance" questions are under the target.

Based on recent results, areas needing improvement: The achievement level for "Orange color cast" was significantly lower than in the previous assessment. Also, the achievement level for "Vibrance" at the AL campus was down by 9%.
Current action(s) to improve SLO, based on results: For the "Orange color cast" question, many students chose "b. adjust Temp and Tint slider in Lightroom," which suggests that the students did not consider shooting with a correct white balance ("a. reshoot with correct white balance"). Starting in Fall 2018, faculty will remind students of the importance of correct capture rather than depending on post-processing in Lightroom. For the "Vibrance" question, starting in Fall 2018, faculty will use a portrait to demonstrate how skin tones are adjusted when using the Vibrance slider.
Next assessment of this SLO: Spring 2020

CLO: Critical Thinking [ x] CT

Advanced Photography I and II: PHT 201+202
Direct measure: Writing an Artist Statement (rubric attached at the end of the report)
1. Identifies and explains the relevance
2. Recognizes context (i.e., cultural/social, scientific, technological, political, ethical, personal experience)
3. Communicates personal points of view (perspective)
4. Analyzes and justifies decisions (i.e., visual styles, technical, and aesthetic)
5. Uses college-level writing
Sample Size (Write N/A where not offered): see the table below (AL only).

Semester/year data collected: Spring 2018
Target: 70 percent of students will score 3 points or better on each criterion and 15 points or better on the overall score.
Results by degrees:

Results by CLO Criteria (Spring 2018)
Criterion | Average Score | % of Students Scoring ≥ 3 Points (≥ 15 Points Overall)
1. Identifies and explains the relevance | AAS = 3.6 points, AAA = 4 points | AAS = 94%, AAA = 100%
2. Recognizes context (i.e., cultural/social, scientific, technological, political, ethical, personal experience) | AAS = 2.9 points, AAA = 3.3 points | AAS = 66%, AAA = 66%
3. Communicates personal points of view (perspective) | AAS = 3.3 points, AAA = 3.6 points | AAS = 94%, AAA = 100%
4. Analyzes and justifies decisions (i.e., visual styles, technical, and aesthetic) | AAS = 3.4 points, AAA = 4 points | AAS = 77%, AAA = 100%

This was the first assessment of this CLO. The areas needing improvement, all below target, are: 2. Recognizes context (i.e., cultural/social, scientific, technological, political, ethical, personal experience); 4. Analyzes and justifies decisions (i.e., visual styles, technical, and aesthetic); and 5. Uses college-level writing.
Current action(s) to improve CLO, based on results: For criterion 2, starting in Fall 2018, faculty will emphasize these aspects of writing an artist's statement in class and provide students with websites and examples of well-written artists' statements in Blackboard. For criterion 5, starting in Fall 2018, faculty will encourage students to get help from the Writing Center.


Sample Size (AL only):
Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
AL | 1 | 1 | AAA 3, AAS 18, AAA+AAS 3, Non-major 5, Total 23
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 23
*Dual Enrollment
All PHT 201+202 students were assessed. This involved 1 faculty member in 1 section. The total sample was 23 students (AAS = 18 students).

5. Uses college-level writing | AAS = 2.8 points, AAA = 3.6 points | AAS = 61%, AAA = 100%
Total | AAS = 16.2 points, AAA = 18.6 points | AAS = 72%, AAA = 100%

Results: The following criteria are above target: 1. Identifies and explains the relevance, and 3. Communicates personal points of view (perspective). The following criteria are below target: 2. Recognizes context (i.e., cultural/social, scientific, technological, political, ethical, personal experience), 4. Analyzes and justifies decisions (i.e., visual styles, technical, and aesthetic), and 5. Uses college-level writing.

Next assessment of this CLO: Spring 2021

Program Goals | Evaluation Methods | Assessment Results | Use of Results

To provide quality education and preparation for diverse career options in the field of photography

Studio Lighting PHT 221, WO campus
Photography AAA Student Survey 2018 – developed within the Photography program. 1 out of 3 AAA students participated in the survey (33%).
• 0 identified their home campus as AL (0%)
• 1 identified their home campus as WO (100%)
The rating scale ranges from 1 to 5, with 5 as the highest rating, and students were asked, "How do you rate the overall:"
1. Faculty concern for students
2. Instructors' expertise in area of study
3. Quality of the PHT facilities

Photography Student Survey was conducted in PHT 221 in Spring 2018. Total sample was 1 student.
Target: 80 percent of students will rate 4 points or above on each criterion.

Question | Spring 2017 Avg Rating | Spring 2017 % Students ≥ 4 | Spring 2018 Avg Rating | Spring 2018 % Students ≥ 4 | % Change
1. | 5.00 | 100% | 5.00 | 100% | +/-0%
2. | 4.75 | 100% | 5.00 | 100% | +/-0%
3. | 4.50 | 75% | 4.00 | 100% | +25%

Overall, students are satisfied with the program.

Previous actions to improve Program Goal: The quality of the photo facilities scored the lowest rating of the three questions. Moving into a new facility at the Alexandria campus in Fall 2017 solved some facility-related issues.
Target: [ X ] Yes [ ] No [ ] Partially
Most recent results: Although the target was met, "Quality of the PHT facilities" has the lowest rating. At the Alexandria campus, one student said the ceiling lights interfere with viewing the computer monitors in AFA room 305 (Mini Lab) and AFA rooms 304/308.
Current actions to improve Program Goal: Alexandria faculty are aware of the ceiling lighting problems. The Alexandria Photo Lab Manager and faculty are working to solve the issue. Currently, there is no estimated time when the problem will be solved; however, an inquiry on this issue was sent to Facilities in Fall 2018. At Woodbridge, faculty work toward consistently updating the facilities to provide state-of-the-art education.
Next assessment: Spring 2019

Studio Lighting PHT221

Photography Student Survey was conducted in PHT221 in Spring 2018. Total sample was 1 student.

Previous actions to improve Program Goal: The previous assessment indicated that students struggled with "The history and concepts of Photography" and "Final-Cut."


Photography AAA Student Survey 2018 – developed within the Photography program. 1 out of 3 AAA students participated in the survey (33%).

• 0 identified their home campus as AL (0%)
• 1 identified their home campus as WO (100%)
https://www.surveymonkey.com/r/DZYGS7J
(The rating scale ranges from 1 to 5, with 5 as the highest rating.) After completing the Photography program, how would you rate your own mastery of:
1. Camera operation and exposure control
2. Photography software – Photoshop
3. Photography software – Lightroom
4. Video software – Final-Cut
5. The history and concepts of Photography

Target: 80 percent of students will rate 4 points or above on each criterion.

Question | Spring 2017 Avg Rating | Spring 2017 % of Students ≥ 4 | Spring 2018 Avg Rating | Spring 2018 % of Students ≥ 4 | Change from previous assessment
1. | 4.75 | 100% | 4 | 100% | +/-0%
2. | 4.25 | 100% | 4 | 100% | +/-0%
3. | 4.5 | 100% | 4 | 100% | +/-0%
4. | 3.33 | 75% | 3 | 0% | -75%
5. | 4 | 50% | 4 | 100% | +50%

Faculty encouraged students to take PHT110 – History of Photography and PHT130 – Video I early in their degree. The survey indicated that the advice to take PHT110 early was effective. The target goal was partially met.
Most recent results: The rating for "Final-Cut" still scores low. This may be because of the small sample size of 1 student. Other areas' ratings are high and meet the target.
Current actions to improve Program Goal: During the Fall 2018 advising week, faculty will encourage students to enroll in PHT130 and PHT110 toward the beginning of their degree program.
Next assessment of this goal: Spring 2019

Studio Lighting PHT221
Photography AAA Student Survey 2018 – developed within the Photography program. 1 out of 3 AAA students participated in the survey (33%).
• 0 identified their home campus as AL (0%)
• 1 identified their home campus as WO (100%)
The rating scale ranges from 1 to 5, with 5 as the highest rating, and students were asked, "Do you feel technically prepared to work in the field?"

Photography Student Survey was conducted in PHT221 in Spring 2018. Total sample was 1 student.

Target: 70 percent of students will rate 4 points or above.

Response | Spring 2017 Avg Rating | Spring 2017 % of Students ≥ 4 | Spring 2018 Avg Rating | Spring 2018 % of Students ≥ 4 | Change from previous assessment
Yes, prepared | 4 | 75% | 3 | 0% | -75%

Previous actions to improve Program Goal: Beginning in Fall 2017, faculty advised students to take Studio Lighting (PHT221) for preparation in commercial and portrait photography. PHT221 continues to be a popular class. The target goal was not met.
Most recent results: The students' level of preparedness decreased by 75%, falling from a rating of 4 to a rating of 3. There were no student comments in this section. Faculty decided that comments for this question would be helpful to determine why students do not feel fully prepared.
Current actions to improve Program Goal: Faculty decided to add a comment section for this question in the Spring 2019 survey and to monitor the results.
Next assessment of this goal: Spring 2019

Professional development
One professor was invited to be one of 20 curators for the Alchemical Vessels exhibition and benefit for the nonprofit Smith Center for Healing & the Arts, which serves both individuals with cancer and the Washington, DC community by using the arts as tools for healing.

Faculty participation in art exhibitions and other professional activities in the art community enhances the visibility and prestige of the photography program.


She was also invited to exhibit work in the exhibition. One professor was invited to participate in the exhibition "(IM)migration," Baltimore Community College-Essex, Baltimore, MD, 2017. One professor's artist's books were acquired by the Beinecke Rare Book & Manuscript Library at Yale University, New Haven, CT. One faculty member attended the National Society for Photographic Education Conference in Philadelphia.

Faculty decided that the achievement level should be full support to attend professional conferences. Full funding is not available; the faculty member received partial funding and subsidized attendance at the conference. The achievement level to attend conferences was not met. Faculty will continue to request full funding for professional conferences.

Enhance the curriculum

Curriculum and instruction evaluation/review by PHT cluster members

The Visual Art AFA degree was approved in Spring 2018. The Fine Arts AAA Photography Specialization was discontinued in Fall 2018. The PHT discipline program review is in progress; when completed, the review should provide insights into improving the curriculum. In Fall 2017, the Woodbridge campus started to offer the AAS degree. The PHT discipline is no longer allowed to offer PHT195 Special Topics courses.

In Summer 2018, faculty updated the course content summary for PHT235. Faculty decided that the following course content summaries need revisions: PHT106, 201-202, 206, 211, 221-222, 227, 231, 246, 247, 265, 270-271, and 274. Faculty will start revising them in Fall 2018. PHT195 courses such as Business of Photography, Flash/Speedlight Techniques, Street Photography, and Low/Night Photography were offered as special topics elective courses. The VCCS master course list does not include similar courses for us to offer, and faculty are disappointed that we can no longer offer these courses. Faculty decided that the achievement level should be one new initiative annually. The achievement level was met.
Next assessment of this goal: Spring 2019

To graduate more students

Graduation data: Number of NOVA Graduates by Degree and Specialization 2017-18, from OIR

2017-2018: College = 7,004, AAA = 5
2016-2017: College = 7,443, AAA = 5
2015-2016: College = 7,752, AAA = 16
2014-2015: College = 7,528, AAA = 21
2013-2014: College = 7,373, AAA = 25
2012-2013: College = 7,601, AAA = 33
Target: an increase (+10%)

The number of students graduating with a Photography Specialization degree was unchanged.

Target: [ ] Yes [ X ] No [ ] Partially

With the enactment of the AFA-Fine Arts degree the AAA Photo Specialization will no longer be offered.

Success rates: Fall 2017 success rates by discipline (excludes ELI classes), from OIR
Fall 2017: 82.8% (+9.9% from previous year)
Fall 2016: 72.9% (-0.6% from previous year)
Fall 2015: 73.5% (+2% from previous year)

Previous actions to improve Program Goal: Full-time faculty worked with adjuncts to ensure quality instruction in all classes in Fall 2017 and Spring 2018. This may be responsible for the substantial increase in the success rate.


Fall 2014: 72% (+0.4% from previous year)
Fall 2013: 71.7% (+5% from previous year)
Fall 2012: 66.7%

Target: 75%
The success rate increased by almost 10%.

Target: [ ] Yes [ X ] No [ ] Partially

Most recent results: The success rate increased by 9.9%. Detailed data for individual courses with success rates below 75% are presented below, along with actions to improve the success rate.
Next assessment of this goal: Spring 2019

Fall 2017 Student Grade Distribution by Course, College-Wide, from OIR
Fall 2017 Student Grade Distribution by Course, Alexandria Campus, from OIR
Fall 2017 Student Grade Distribution by Course, Woodbridge Campus, from OIR

14 courses were offered in Fall 2017. Some of the courses had multiple sections at both the AL and WO campuses. 4 courses had college-wide success rates at or under 75%.
Target: 75%

Fall 2017 Grade distribution by course – College-Wide
Course / % Success | Total # Students | # Credit Students | D | F | W | Audit | I
PHT101* / 72% | 128 | 116 | 7 (6%) | 13 (11%) | 12 (10%) | 12 (9%) | 0
PHT103 / 67% | 7 | 6 | 0 | 1 (16%) | 1 (16%) | 1 (14%) | 0
PHT195 / 60% | 21 | 10 | 0 | 0 | 4 (40%) | 11 (52%) | 0
PHT270 / 72% | 40 | 36 | 3 (8%) | 5 (13%) | 2 (5%) | 4 (10%) | 0
*This may include ELI data

Sections of courses whose college-wide success rates are at or under 75%, organized by campus:
Course offered in Fall 2016 | Course offered in Fall 2017 | Change from previous year
PHT101 / 69% (WO)* | PHT101 / 63% (WO) | -6%
NA | PHT103 / 67% (AL) | NA
NA | PHT195 / 60% (AL) | NA
PHT270 / 74% (AL) | PHT270 / 79% (AL) | +5%
PHT270 / 42% (WO) | PHT270 / 65% (WO) | +23%
*This may include ELI data

Most recent results and previous actions to improve Program Goal: The success rate for PHT270 (AL) increased by 5%, and the success rate for PHT270 (WO) increased by 23%. Continuous reminders to students to submit assignments, and encouragement to meet with their professors during office hours, helped increase the success rate in Fall 2017. However, 21% of students still received a D or F in Fall 2017. The success rate for PHT101 (WO) declined by 6%; three of the five sections were offered through ELI, which historically has lowered success rates. PHT103 (AL) had a total of 6 credit students, 4 of whom earned A's or B's; the low success rate may be due to the small sample size. 4 students withdrew from PHT195 (AL), which resulted in a low success rate for the course. One student had a death in her family, two students had changes in employment, and the other student withdrew without communicating the reason to the faculty. 3 of the 4 students had circumstances beyond the students' or faculty's control, which resulted in withdrawal from the course.
Current actions to improve Program Goal: Faculty members continue to send emails to students in academic difficulty encouraging them to submit assignments and meet with their professor during office hours.


At the cross-campus meeting in Spring 2019, the PHT discipline group will discuss course materials for PHT270 and possible assessment methods to identify the reasons for the low success rate. At the same meeting, the group will discuss PHT101 course materials, ELI course materials, and teaching methods to identify reasons for the low success rate.

Studio Lighting PHT221
Photography AAA Student Survey 2018 – developed within the Photography program. 1 out of 3 AAA students participated in the survey (33%).
• 0 identified their home campus as AL (0%)
• 1 identified their home campus as WO (100%)
The rating scale ranges from 1 to 5, with 5 as the highest rating. Questions: Do you have a Photo advisor? If yes, how often do you contact your advisor?
The PHT cluster decided that it is important for students to have a faculty advisor and to meet with the advisor to discuss degree progress in order to increase the number of graduates. This survey indicates whether, and how much, guidance students are receiving from a faculty advisor.

Photography Student Survey was conducted in PHT221 in Spring 2018. Total sample was 1 student.

Target: 100 percent of students will have a PHT faculty advisor and meet with an advisor at least "Once a semester."
Spring 2018: Do you have a Photo advisor?

Response | Spring 2017 # of Students | Spring 2017 % of Students | Spring 2018 # of Students | Spring 2018 % of Students | % Change
Yes | 4 | 100% | 1 | 100% | +/-0%
No | 0 | 0% | 0 | 0% | +/-0%
I don't know | 0 | 0% | 0 | 0% | +/-0%

Spring 2018: If yes, how often do you contact your advisor?

Response | Spring 2017 # of Students | Spring 2017 % of Students | Spring 2018 # of Students | Spring 2018 % of Students | % Change
Once a semester or more | 4 | 100% | 1 | 100% | +/-0%

Previous actions to improve Program Goal: The previous survey indicated that it was effective to meet with 100% of students (PHT101 and 102) to inform them of the importance of having a faculty advisor and to provide clear guidance on the program and degrees. Since the last assessment, faculty have continued to meet with all students to give information about having a faculty advisor and about the program/degrees. The target goal was met.
Most recent results: 100 percent of students have a PHT faculty advisor and meet with the advisor at least "Once a semester."
Current actions to improve Program Goal: Faculty continue to meet with all students to give information about having a faculty advisor. Also, faculty will inform students about the new Visual Art AFA program. Currently, there is no clear implementation date for this goal because it is not up to the faculty; it involves a college-wide (possibly VCCS-wide) change.
Next assessment: Spring 2019

To obtain the instructional resources and develop the curriculum needed to provide excellent instruction

The PHT cluster evaluates the traditional and digital photographic facilities

Target: adequate funding of technology. Tech Plan, FF&E, and ETF funding are used annually to upgrade technology. New equipment at AL:

The AFA building at AL officially opened in Fall 2017. In the darkroom, electrical problems caused five enlargers to not work properly in Fall 2017 and Spring 2018; rewiring the enlargers in Summer 2018 seemed to solve the problem. In addition to the listed new equipment, a studio blackout curtain was installed in the Lighting Studio/AFA328 in Fall 2018.



• Smith Victor Cool LED 100K 2-light kit – ETF
• Qty 11 – 15" Apple MacBook Pro laptops with sleeves – FF&E
• Qty 2 – Manfrotto 025B large boom with stand
• Hasselblad Imacon Flextight X1 virtual drum scanner
• Edwards Engineered Products 16x20" high-output UV vacuum frame exposure unit
• Epson P9000 printer
• GoPro Hero 5 action camera with charger, extra battery, case, grip, 64GB micro SD card – ETF
• Nikon KeyMission 360 action camera with charger, extra battery, case, grip, 64GB micro SD card – ETF
• Qty 6 – JVC GY-HM200U 4K streaming camcorders – FF&E
• Qty 6 – JVC XLR shotgun microphones for camcorders – FF&E
• Qty 3 – Epson Perfection V850 Pro flatbed scanners
• Canon EOS 5D Mark IV DSLR with 24-70mm f/4 L IS USM lens – ETF
• Sony 70-200mm f/2.8 G Master zoom lens for Sony E mount – ETF
At the Woodbridge campus, all iMacs were replaced in Summer 2017.

Also, whiteboard daylight wash lighting with a dimmer switch was installed in AFA304 and AFA308 in Spring 2018.

At the Woodbridge campus, all iMacs were replaced in Summer 2017. However, due to budget constraints, only 20 of the 40 iMacs were 27-inch models with extra RAM, which is unfortunate. The target goal for the adequate funding of technology was met.
Current actions to improve Program Goal: Faculty and the lab manager will work closely to check the inventory of the photography and media equipment and request necessary updates, upgrades, and replacements when FF&E, ETF, and Tech Plan funds are available.
Next assessment: Spring 2019


Annual Planning and Evaluation Report: 2017-2018 Fire Science Technology, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.
Program Purpose Statement: The overall goal of the program is to prepare individuals for entry or advancement in the fire service, or a related field, by providing them with knowledge of the fire protection profession and giving them the general education necessary to function and advance in one of these professions.
Program Goals | Evaluation Methods | Assessment Results | Use of Results
Program goal on program-placed students

Data Source: NOVA Data for Evaluation and Planning Reports/Distribution of Program Placed Students by Curriculum and Award Type (http://www.nvcc.edu/college-planning/data.html)

Semester/year data collected: Fall 2017

Target: The overall number of program-placed students will increase by at least 10 percent.
Results for Past 5 Years:
Fall | Number of Students | Percentage Increased
2017 | 22 | -48.8%
2016 | 43 | -18.8%
2015 | 53 | -19.6%
2014 | 66 | -25.0%
2013 | 88 | -16.9%
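For reference, the "Percentage Increased" column is the year-over-year change, computed as (current year − previous year) ÷ previous year × 100. The short Python sketch below is illustrative only (it is not part of the program's assessment tools) and reproduces the Fall 2017 figure from the table above.

    def percent_change(current, previous):
        # Percentage change from the previous fall term to the current fall term.
        return (current - previous) / previous * 100

    # Fall 2016 -> Fall 2017 program-placed counts from the table above.
    print(round(percent_change(22, 43), 1))  # -48.8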

Target Met: [ ] Yes [ x ] No [ ] Partially
Comparison to previous assessment(s): A decrease in program-placed students from 43 to 22 (-48.8%) was noted.

Previous action(s) to improve program goal: Data not available (see below).
Most recent results: A decrease in program-placed students from 43 to 22 (-48.8%) was noted.
Results improved: [ ] Yes [ x ] No [ ] Partially
Current action(s) to improve program goal: During this assessment cycle, the FST program underwent several significant alterations. When the former Program Head/SLO Lead faculty member departed abruptly during the Spring 2018 term, an interim was named. It was then discovered that scant data had been preserved regarding the program's student learning outcomes. Without this vital data, compiling accurate and reliable SLO/CLO reports is impossible, and patterns of programmatic strengths and shortcomings that need to be addressed cannot be identified. It was further discovered that the entire FST program (all of which is online-based) is in need of revitalization, not only to meet online course delivery standards but also to ensure that our students receive the finest FST educational experience possible. With these adverse findings in mind, the new leadership has deployed or will deploy the actions noted below to achieve each related goal (projected target dates are also noted): To allow the continuation of the FST program, we sought assistance from fellow VCCS colleges offering FST and were able to secure a shared services for distance learning (SSDL) agreement to ensure course delivery as we undertake the necessary refurbishment process. Our target date for the completion of the revamp is Spring 2019 or early Summer 2019. We will also begin the process of hiring a permanent full-time FST program head/SLO lead faculty member; the target date is the end of the Fall 2018 term or early Spring 2019. The program will develop a formalized process in which all SLO assessment-related documents (surveys, formulas, skills check-sheets, rubrics, etc.) shall be maintained and stored for future reference and use.


This is to be initiated following the program revamp. In an attempt to increase both program placement and graduation rates, the interim program head has begun a limited, targeted marketing strategy involving the dissemination of information about our program to applicable regional fire service employers. It should be noted that several large career fire service agencies in the region are beginning to consider mandating a two-year FST discipline-specific degree in order to be eligible for promotion. This should equate to an overall increase in students for this program, as long as proper dialog and active marketing with regional fire suppression career agencies continue.
Next Assessment: Fall 2019

Program goal on graduation

Data Source: NOVA Data for Evaluation and Planning Reports/Distribution of Program Placed Students by Curriculum and Award Type (http://www.nvcc.edu/college-planning/data.html)

Target: The number of annual graduates from the FST program will show an increase of at least 10 percent.
Results for Past 5 Years:
Academic Year | Number of Graduates | Percentage Increased
2017-18 | 9 | 28.6%
2016-17 | 7 | 133.3%
2015-16 | 3 | 50.0%
2014-15 | 2 | -84.6%
2013-14 | 13 | 85.7%

Target Met: [ x ] Yes [ ] No [ ] Partially
Comparison to previous assessment(s): When compared to 2016-17, the number of graduates increased, as has been the noted trend for the last 4 cycles. The fact that the program's graduation rate has never rebounded to the level noted in 2013-14 is relevant and needs to be addressed.

See information above.


Annual Planning and Evaluation Report: 2017-2018 General Studies, A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This program is a flexible associate degree. For students who plan to transfer, the degree can parallel the first two years of a four-year bachelor of science program if they choose courses that match the transfer institution's requirements. For those students who do not plan to transfer, the degree allows them to structure a program to suit their needs using accumulated credits from a variety of formal and experiential sources.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Civic Engagement

PLS Discipline Student Learning Outcomes: “Students will be able to describe the political institutions and processes of the government of the United States.”

American National Politics PLS 135
Direct Measure: This assessment was performed in PLS 135 classes, which deal directly with this SLO.
Provided Rubric Criteria or Question Topics: We asked students 20 multiple choice (MC) questions requiring them to identify correct responses in four areas of importance to American politics and government: the Constitution, the Legislative Branch, the Executive Branch, and the Judicial Branch.
Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 2 | 2 | 36
AN | 2 | 1 | 32
MA | 0 | 0 | 0
ME | 0 | 0 | 0
LO | 1 | 1 | 17
WO | 0 | 0 | 0
NOVA Online | 3 | 0 | 0
DE* | 0 | 0 | 0
Total | 8 | 4 | 85

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: Average score of 80% or higher on each criterion as well as on the overall score.
Results:

Results by Campus/Modality | Spring 2018 Average Score
AL | 82
AN | 78
LO | 77
Avg Score | 79

Results by SLO Criteria:
Criterion/Question Topic | Spring 2018 Average Score
Constitution | 78
Legislature | 81
Executive | 86
Judiciary | 73
Avg Score | 79.5

Current results improved: N/A; first time assessed.
Strengths by Criterion/Question Topic: Students performed best in the area of the Executive Branch, which should not be surprising as it receives the most media attention. Next is the Legislature. Both topics are above the target.
Weaknesses by Criterion/Question Topic: Knowledge of the Constitution and the Judiciary was lacking, which is not surprising, particularly for the Judiciary. Still, knowledge in both areas was not too far off target.

Previous action(s) to improve SLO: N/A; first time assessed.
Target Met: [ ] Yes [ ] No [X] Partially
All campuses were slightly above or below target, which is encouraging. However, knowledge of the Constitution and the Judiciary lags with regard to scores.

Current actions to improve, based on recent results and areas needing improvement:
- Spend more time in class discussing the Constitution and the Judiciary.
- Inform students of the importance of both topics.
- After Spring 2019, adjust the questions.
- All campuses and NOVA Online have agreed to participate for the Fall 2018 semester.
Results and recommendations to address these issues will be shared with all faculty. The issue will be discussed at the Discipline Group meeting during the Spring 2019 convocation, and at every Fall convocation.

Next assessment of this SLO: Fall 2018

Critical Thinking

Student Development Orientation SDV 100

Semester/year data collected: Spring 2018

Previous action(s) to improve CLO: The SDV Curriculum Committee has a yearly mandatory SDV In-Service.


SDV 100: Identify three to five aspects of critical thinking, such as identifying faulty logic, problem-solving, and asking questions/probing, etc.

Direct Measure: Students were quizzed on 5 critical thinking questions embedded in a College Resource Quiz in SDV 100.
Question Topics:
• Q9: Thinking creatively
• Q10: Solving problems
• Q15: Critical thinking in high school versus college
• Q17: Narrowing the problem
• Q18: Critical thinking

Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 21 | 13 | 230
AN | 36 | 32 | 678
MA (+1 SDV 101) | 15 | 9 | 161
ME (SDV 101) | 11 | 5 | 49
LO | 18 | 13 | 250
WO | 22 | 5 | 109
NOVA Online | 24 | 17 | 246
DE* | 10 | 1 | 21
Total | 157 | 95 | 1744
*Dual-enrollment
The major improvement in the data overall is due to the support and insistence of the Associate Deans of Student Development on each campus.

Target: 80% of students will answer correctly on the 5 critical thinking questions included on the College Resource and Critical Thinking Quiz.
Results:
Campus/Modality | Q9 | Q10 | Q15 | Q17 | Q18 | Total
AL | 97% | 93% | 31% | 11% | 83% | 63%
AN | 95% | 88% | 24% | 11% | 80% | 60%
MA | 98% | 94% | 28% | 3% | 86% | 62%
ME | 98% | 92% | 16% | 78% | 80% | 73%
LO | 99% | 93% | 23% | 13% | 84% | 62%
WO | 100% | 96% | 100% | 100% | 100% | 99%
NOVA Online | 96% | 68% | 13% | 76% | 90% | 69%
DE | 100% | 95% | 24% | 86% | 100% | 81%
Total Average | 98% | 90% | 32% | 47% | 88% | 71%
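The "Total Average" row above appears consistent with an unweighted mean across the campuses/modalities listed; this is an inference, not a statement from the report. A minimal Python sketch of that reading, using the Q9 column:

    # Illustrative only: Q9 percent-correct values in campus order
    # (AL, AN, MA, ME, LO, WO, NOVA Online, DE), taken from the table above.
    q9 = [97, 95, 98, 98, 99, 100, 96, 100]
    print(round(sum(q9) / len(q9)))  # 98, matching the Q9 "Total Average" entry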

Current results improved: N/A; first time we assessed this topic.
Strengths by Criterion/Question Topic: Questions 9, 10, and 18 had the best scores because they could be answered by using good test-taking skills and eliminating answers that are not the best (multiple choice). The questions are broad enough that they can be answered even without reviewing the textbook.
Weaknesses by Criterion/Question Topic: Questions 15 and 17 had the lowest scores. Question 15 requires the student to pick several right answers, so there is more room for error. Question 17 had the highest number of wrong answers because it is not worded directly from the text; it is inferred from the reading material and requires a bit more critical thinking to figure out the best answer.

At the In-Service, instructors present on best practices in student engagement and learning (May 2016, May 2017, June 2018). The Committee has also considered using a different textbook, but our primary goal has been to keep the textbook affordable by using OER (Open Education Resources). We have considered that, since the textbook is only available online, students may be discouraged from reading it. The committee reviewed textbooks in 2017-2018 and voted against the alternative options because they could not remain free. At this time, we have not found a better free textbook that covers the topics we review in this class. Most of the assignments require self-assessment and reflection, and students feel more comfortable with those assignments than with assessments and quizzes that require them to review the textbook available online. NOVA Online (formerly ELI) differed on when and where it assessed the critical thinking questions: they were not in the first quiz/assessment and not attached to a college resource quiz, but formed their own separate quiz. This suggests that placing a critical thinking reading assignment/assessment in its own category later in the class may improve the results.
Target Met: [ ] Yes [ ] No [ x ] Partially
Based on recent results, areas needing improvement: The Critical Thinking CLO is currently located along with College Resources and Communication Skills. Comparing with ELI's placement of the assessment, students may do best if Critical Thinking has its own category after Academic and Test-Taking Skills.
Current actions to improve CLO based on the results: Unfortunately, the Fall 2018 assessment is well underway, and it is too late to make any improvements or changes. Critical Thinking is not going to be assessed in Spring 2019. Comparing Spring 2018 to Fall 2018 will provide more results to see if there is improvement or if the data stay the same.
Next assessment of CT: Spring 2020

Quantitative Reasoning

General Chemistry I & II CHM 111 and 112

Semester/year data collected: Spring 2018

Previous action(s) to improve SLO: This was the second round of assessing the QR objectives. At the January 2018 cluster meeting, the discipline group discussed the previous assessment and ways to improve faculty participation and the Core Learning Outcomes.


Students will use numerical values to perform various calculations and draw reasonable conclusions. Students will use graphical methods to organize and interpret data.

Direct Measure: Lab Report (pilot) Rubric Criteria - QR Rubric for Lab assignment: Five criteria presented on the Quantitative Reasoning (QR) Rubric:

I. Interprets Quantitatively: Explains the numerical information presented in mathematical forms (equations, formulas, graphs, diagrams and tables).

II. Presents quantitatively: Converts the given information into mathematical forms such as tables, graphs, diagrams, and equations.

III. Analyzes thoughtfully: Draws relevant conclusions from provided information and data, and predicts future trends.

IV. Communicates qualitatively and persuasively: uses quantitative evidence to support the argument or purpose of the work (what evidence is used, how it is formatted and contextualized).

V. Problem solving: Sets up a numerical problem and calculates the solution correctly

Sample Size (Specify N/A where not offered)
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 10 | 1 | 23
AN | 18 | 1 | 25
MA | 8 | 3 | 52
ME | 0 | 0 | 0
LO | 23 | 8 | 128
WO | 8 | 0 | 0
NOVA Online | 1 | 1 | 18
DE* | 8 | 8 | 78
Total | 76 | 22 | 324

*Dual-enrollment
Assessment Results Calculation: Average Score = Total Points in all courses ÷ Total Number of Students. Maximum points available = 20 points; for example, (15.2/20) × 100 = 76% and (16.7/20) × 100 = 84%.
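A minimal sketch of the calculation described above (illustrative only; the point total used here is hypothetical, chosen so the average matches the 15.2-point example):

    MAX_POINTS = 20  # maximum available on the QR rubric

    def average_score(total_points, num_students):
        # Average rubric score = total points across all courses / number of students.
        return total_points / num_students

    avg = average_score(total_points=4924.8, num_students=324)  # hypothetical totals
    print(round(avg, 1), round(avg / MAX_POINTS * 100))  # 15.2 76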

Target: The average score of students participating will be 70%. For itemized criteria, 70% of students will correctly answer each item.
Results by In-Class, ELI, and Dual Enrollment (Spring 2018):
Campus/Modality | Average Score | % Earned
AL | 16.8 | 84.1
AN | 14.7 | 73.4
MA | 17.9 | 89.6
ME | N/A | N/A
LO | 14.2 | 71.1
WO | 0 | 0
ELI | 16.8 | 84.0
DE* | 15.6 | 78.0
Total AVG | 14.8 | 74

Results by CLO Criteria (Spring 2018):
Criterion | Average Score | % Earned on Questions
I. | 2.9 | 72.5
II. | 3.0 | 75.0
III. | 3.0 | 75.0
IV. | 3.0 | 75.0
V. | 2.9 | 72.5
Total | 3.0 | 74

Current results improved: [ X ] Yes [ ] No [ x ] Partially
Four out of the five campuses offering in-person Chemistry courses contributed data for this report, in addition to ELI and DE courses. Although the larger sample of students evaluated resulted in a lower score on each criterion, the results for this assessment are considered more meaningful compared to Fall 2017. In spite of the overall decrease in the average, the targeted values for the evaluation were met by each campus and on each criterion. There was little to no variation in the average scores among criteria, which indicates students' overall preparation. Furthermore, students met the targeted goal for each item.

There were some questions regarding interpreting the rubric, which seemed to be the reason for insufficient faculty participation. After the meeting, on January 5, an informative follow-up email was sent to the cluster to allow enough time to plan for the semester. The following changes were made:
• To improve the consistency of the assessments, and hence the results, two laboratory experiments were selected and shared with the faculty to use for the evaluation.
• To increase students' Core Learning Outcomes, a handout with guidelines regarding analysis of data, thinking quantitatively, and writing analytically was developed and shared with the discipline to distribute to all students on all campuses. This was to ensure that all students have access to the same information prior to their analytical writing and interpretation of data.
• To maintain standardization of the collected data, a table for collecting information was developed and shared with the Assistant Deans.

Target Met: [ X ] Yes [ ] No [ ] Partially
All campuses met, and some exceeded, the targeted value. WO did not participate in the assessment, and only one course from each of AL and AN participated. Compared to Fall 2017, the proportion of courses participating increased from 10% to 29% in Spring 2018. The number of students participating in this assessment increased by over 200% compared to Fall 2017. Moreover, ELI and DE courses participated at close to 100%. Future results may be improved by the addition of a lab activity at the beginning of the semester to familiarize students with some of the mathematical manipulation and graphical analysis that they will encounter throughout the course.


Strengths by Criterion/Question Topic: Three of the criteria, "Presents quantitatively," "Analyzes thoughtfully," and "Communicates qualitatively and persuasively," were scored equally high. "Interprets quantitatively" and "Problem solving" were among the weaknesses of the students evaluated. Both of these criteria are math-related, and more students find these types of assessments challenging. This may improve with the addition of some kind of math-related activity to the curriculum during the first few weeks of school.

Scientific Literacy: Students will understand the scientific method and identify methods of inquiry that lead to scientific knowledge.

General Biology I BIO 101
Direct Measure: A quiz on the Scientific Method was available on Blackboard to all BIO 101 students in the college (students from all campuses, including ELI and DE) toward the end of the Fall 2017 semester. The quiz consisted of 10 multiple-choice questions that assessed steps in the Scientific Method. The topics were as follows:
• Item #1: observation is first step
• Item #2: order of steps
• Item #3: definition of hypothesis
• Item #4: validity of hypotheses
• Item #5: importance of control
• Item #6: definition of data
• Item #7: example of hypothesis
• Item #8: definition of variable
• Item #9: definition of theory
• Item #10: defining data collecting
This assessment is the same as that given to students in the previous year. All assessment data are gathered through Blackboard.
Sample: The assessment tool was deployed on Blackboard to all students taking BIO 101 on campus (AL, AN, LO, MA, WO), at ELI, or as dual-enrolled students, and 572 students took part in this assessment. The exact total number of students in BIO 101 during Fall 2017 is not available, but it is around 1,600. This approximate number allows us to determine that about a third of all students responded to the Blackboard notice and took the quiz.

Data Collection: Fall 2017
Achievement Target:
• For the whole quiz, 70% of students achieving 70% on the quiz.
• For each item, 70% of students correctly answering that item.
Like the previous year, students identified themselves by major. This allowed us to compare results from students program-placed in General Studies (219), Social Science (195), and Science (279). Note that these numbers add to 693; some of the students listed double majors.
Results for all students:

1) Average student score: 84.2%
2) Percentage of students earning at least 70%: 88.5% (506/572)

Results by SLO Criteria for all students:
Item # | # of Students Scoring Correctly (out of 572) | Percentage Answering Correctly
1 | 366 | 64.0%
2 | 538 | 94.0%
3 | 529 | 92.4%
4 | 509 | 88.9%
5 | 464 | 81.1%
6 | 537 | 93.8%
7 | 489 | 85.4%
8 | 500 | 87.4%
9 | 376 | 65.7%
10 | 500 | 89.6%

Results for students placed in A.S. Science:

1) Average student score: 83.3%

Instructors and students of BIO 101 are becoming more used to assessment by Blackboard. During the Fall 2018 Cluster meeting, faculty members requested results of the previous year's data. These data were sent to the Biology discipline chair for dissemination. The low achievement results on Items 1 and 9 are important to the biology faculty because they show that students do not understand that curiosity is the first step in solving a scientific problem through the scientific method. Also, the term "theory" in science continues to confuse students; students' wrong answers indicate that they do not realize that a "theory" in science is not a hypothesis but a well-substantiated explanation of the natural world. It is valuable for instructors to have this feedback. The discipline chair recently elected in the Biology discipline in Fall 2018 has already seen these data. She wants to work with faculty on the concepts behind the two low-scoring questions for the 2019-2020 academic year. We need to find out whether students are not understanding the concepts or whether there is a problem with the questions themselves. This is the second year that A.S. Science students were identified in the assessment. Although most A.S. Science majors take BIO 101, many students in General Studies, Social Sciences, and other majors also take BIO 101. Faculty assessing Social Science and General Studies asked if we could identify their students, since those programs also wish to use this Scientific Method assessment for students in their majors. For the 2018-2019 assessment year, we plan to add A.S. Liberal Arts. It is interesting that the results again show very similar performance for students regardless of major. BIO 101 is a class taken by science students early in their academic career, and the results show that science students at this early stage did not outperform students in other majors.


Dual enrollment students were included, and 101 DE students (17.6% of the total) took the assessment. In the case of ELI, 128 ELI students (22.3% of the total) took the assessment. The number of students from each campus and from ELI was not tallied; however, the student ID numbers are in the raw data, and specific information can be gleaned from the data.

2) Percentage of students earning at least 70%: 87.8% (245/279)
Item # | # of Students Scoring Correctly (out of 279) | Percentage Answering Correctly
1 | 164 | 58.7%
2 | 261 | 93.5%
3 | 253 | 90.6%
4 | 253 | 90.6%
5 | 218 | 78.1%
6 | 262 | 93.9%
7 | 236 | 84.5%
8 | 245 | 87.8%
9 | 177 | 63.4%
10 | 246 | 88.1%

Results for students placed in A.S. Social Science:
1) Average student score: 82.4%
2) Percentage of students earning at least 70%: 86.2% (245/279)
Item # | # of Students Scoring Correctly (out of 195) | Percentage Answering Correctly
1 | 112 | 57.4%
2 | 183 | 93.8%
3 | 170 | 87.1%
4 | 176 | 90.2%
5 | 145 | 74.3%
6 | 185 | 94.8%
7 | 164 | 84.1%
8 | 168 | 86.1%
9 | 125 | 64.1%
10 | 172 | 88.2%

Results for students placed in A.S. General Studies:
1) Average student score: 82.7%
2) Percentage of students earning at least 70%: 85.8% (188/219)
Item # | # of Students Scoring Correctly (out of 219) | Percentage Answering Correctly
1 | 123 | 56.1%

In this assessment, we were able to demonstrate for the first time that students from all campuses, ELI, and Dual Enrollment took part. In the current Blackboard setup, each question is posed as an independent little exam, and that takes more time for students. The two additional questions about ELI and DE status did not discourage students: nearly 18% of student responders were DE, and 22% were ELI students. The next assessment for this SLO is scheduled for Spring 2019.


2 | 200 | 91.3%
3 | 191 | 87.2%
4 | 191 | 87.2%
5 | 177 | 80.8%
6 | 206 | 94.0%
7 | 189 | 86.3%
8 | 187 | 85.3%
9 | 145 | 66.2%
10 | 193 | 88.1%

Results indicate that total scores are well above 70%, and most (8 out of 10) individual items meet achievement goals. Scores were very similar to those of last year. The lowest scores were on Items 1 and 9. Item 1 asked about the first step in the Scientific Method; Item 9 asked for the definition of the word "theory."
Current results improved: [ ] Yes [ ] No [ X ] Partially
Scores from students program-placed in Science, Social Science, and General Studies are very similar. In the 2015-2016 academic year, students scored below 70% on questions 1, 2, and 9 (42%, 47.5%, and 57.4%). In 2016-2017, students scored below 70% on questions 1 and 9 (65.4% and 66.6%). This cycle, students also scored below 70% on questions 1 and 9 (64% and 65.7%). This shows a marked improvement in identifying the steps of the Scientific Method (question 2) over the years assessed, and an improvement in general knowledge of the Scientific Method.
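As a reference for how the item-level figures above are derived, the percentage answering correctly is simply the count of correct responses divided by the number of students assessed. A minimal Python sketch (illustrative only, using a subset of the all-students counts from the table above):

    N_STUDENTS = 572
    counts_correct = {1: 366, 5: 464, 9: 376}  # items 1, 5, and 9 from the all-students table
    for item, n in counts_correct.items():
        print(item, round(n / N_STUDENTS * 100, 1))  # 64.0, 81.1, 65.7
    print(round(506 / N_STUDENTS * 100, 1))  # 88.5% of students earned at least 70%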

Program Goals | Evaluation Methods | Assessment Results | Use of Results

Program goal on program-placed students

Data: Distribution of Program Placed Students by Curriculum and Award Type (Fact Book 2012-2013 through 2017-2018)

Target: Number of program-placed students in each degree/certificate will increase by 1.5%.
Fall | Number of Students | Percentage Difference
Fall 2017 | 10,519 | -1.34%
Fall 2016 | 10,662 | +1.46%
Fall 2015 | 10,508 | -4.7%
Fall 2014 | 11,030 | -3.45%
Fall 2013 | 11,425 |

Previous action(s) to improve program goal: Previously, the SLO team reviewed the decrease in program-placed students since Fall 2013 and established a more realistic achievement target of 1.5% for the 2015-2016 assessment year, which continued through the 2017-2018 assessment year.
Most recent results: 1.34% decrease.
Results improved: [ ] Yes [ ] No [ X ] Partially


Target Met: [ ] Yes [ X ] No [ ] Partially
Achievement target not met: 1.34% decrease from the previous year.
Comparison to previous assessment(s): There was a 1.46% increase in General Studies program-placed students in Fall 2016 as compared with Fall 2015.

Current action(s) to improve program goal: The number of program-placed students decreased by 1.34%, thereby not meeting the achievement target. The achievement target for the 2016-17 assessment year was reviewed for possible revision in light of the previous decrease in program-placed students, and it was decided to retain it at 1.5% as a realistic target. The SLO team reviewed the achievement target for the 2017-18 assessment year for possible revision, keeping in mind the increase in program-placed students in Fall 2016. Given this increase, the SLO team decided to maintain the achievement target of a 1.5% increase for the 2017-18 assessment year. The SLO team will review the achievement target for the 2018-19 assessment year. The program will also dedicate itself to promoting this discipline with more publicity to students and by offering them support services. The guided pathways program is also likely to help increase program-placed students, as pathways are mapped to student-centered goals that give students a clear picture of the programs of study being offered.
Assessed: Annually

Program goal on graduation

Data: Number of Graduates (Fact Book 2012-2013 through 2017- 2018; Number of Graduates by Program and Specialization: 2017-18)

Target: Program graduation totals will increase by 1.5% from the previous year.
Results for Past 5 Years:
Academic Year | Number of Graduates | Percentage Difference
2017-18 | 1,430 | -4.3%
2016-17 | 1,495 | -4.2%
2015-16 | 1,562 | +6.4%
2014-15 | 1,468 | -0.33%
2013-14 | 1,473 |

Target Met: [ ] Yes [ X] No [ ] Partially Target not met as the number of graduates decreased by 4.3%.

Comparison to previous assessment(s): The number of graduates decreased during 2016-2017. However, the number of graduates increased during 2015-2016. That increase took place after three consecutive years of decreases in the number of graduates starting in 2013-14.

Previous action to improve program goal: The SLO team reviewed the resources available for students to assess how more students could be retained and how to increase program graduation totals. Advising was a major resource for guiding students towards completion of their degrees. The SLO team promoted a responsive curriculum by putting student learning at its core. The SLO team communicated with all Advising Divisions and all appropriate offices so there could be further initiatives to increase student enrollment, success, and retention. Most recent results: Decrease in the number of graduates by 4.3%. Results improved: [ ] Yes [ X ] No [ ] Partially Current actions to improve program goal: In light of the decrease in the number of graduates for the previous year, the SLO team reviewed the achievement target in the 2017-2018 assessment year and left it unchanged at a 1.5% increase in graduates from the previous year, as the SLO team agreed that it was a realistic goal.



The SLO team took into consideration the substantial increase in the number of graduates during 2015-2016. These results will be shared with the Office of Student Success and Initiatives, Counseling/Advising Divisions, and all appropriate offices so there can be further initiatives to increase student enrollment, success, and retention. Students are now receiving structured advising from their faculty advisors, which is an essential component of student success. This is likely to have a positive effect on student retention, along with the guided pathways program. The SLO team is hopeful that student enrollment will stabilize as the advising program is given further structure and enhancements. This program goal will be assessed again next year in 2018-2019, and the achievement target will be reviewed again to decide whether it remains realistic or should be revised. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Graphic Design, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed for persons who seek full-time employment in the graphic design field. The occupational objectives include graphic and/or interactive designer in the graphic design marketplace.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

To analyze artwork from various perspectives and apply to projects

History of Design ART 250 Direct Measure: Students in ART 250 were asked to pick a graphic design of their choice, identify the most important formal aspects of the selected artwork, and apply that knowledge to a class project. Students should then be able, verbally or in writing, to analyze and describe the formal attributes. While no specific word count was required, the descriptions were to be at least one paragraph in length. The students were given a rubric with the assessment guidelines that also outlined the specific criteria used to evaluate each of the four subcategories: identification, description, proper use of terminology, and application of knowledge. The range for each sub-category score was as follows:

• Excellent – 5 • Good – 4 • Acceptable – 3 • Weak – 2

Instructors scored the rubric for each student submission and then recorded individual student scores to obtain class scores on the class tally sheet. Assessment rubric, instructions, and tally sheet example attached. Sample: Students in 3 sections out of 3, from the AL, LO and MA campuses, were assessed. ART 250 is not taught through ELI or dual enrollment, so those modalities are not part of this assessment. Total Sample: 65

Semester/year data collected: Fall 2017 Target: Average total score of 3.0 or above for at least 85% of students assessed. A total of 65 written critiques were assessed and the average total score was 3.98. These scores were also broken down into category sub-scores with a target score of 3. The combined category sub-score averages for all students were:

Summary of Outcomes, Fall 2017
Overall (20 pts) | 3.98 Avg. (85%)
Part 1: Identification (2-5 pts) | 4
Part 2: Description (2-5 pts) | 4.1
Part 3: Terminology (2-5 pts) | 3.7
Part 4: Application (2-5 pts) | 3.9

Target Met: [ X ] Yes [ ] No [ ] Partially All targets were met or exceeded. The average total score was 3.98 and 85% of students assessed received a total score of 3 or better. NOTE: The Graphic Design SLO working group along with the Art History SLO Faculty had to slightly change the assessment instructions and measurement tool to fit the ART 250 History of Design class, which prior to this assessment had not been evaluated. We do not have enough information at this time to provide a comparative analysis to previous semesters because different methods of data collection were used. Before Fall 2015, designated rubric section data was not collected. Until more courses have been evaluated using this new method, these results are not comparable.
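As context for how the class-level figures above are produced, the tally-sheet arithmetic amounts to averaging each student's four sub-scores and counting how many averages reach the target. The sketch below is a hypothetical illustration only; the student scores are invented and the code is not part of the program's assessment instrument.

```python
# Hypothetical illustration of the tally-sheet arithmetic: average each
# student's rubric sub-scores (scored 2-5) and report the class average and
# the share of students at or above the 3.0 target. Scores are invented.
students = [
    {"identification": 4, "description": 5, "terminology": 3, "application": 4},
    {"identification": 3, "description": 4, "terminology": 3, "application": 3},
    {"identification": 5, "description": 4, "terminology": 4, "application": 5},
]

TARGET = 3.0

averages = [sum(s.values()) / len(s) for s in students]
class_average = sum(averages) / len(averages)
pct_at_target = 100 * sum(a >= TARGET for a in averages) / len(averages)

print(f"Class average sub-score: {class_average:.2f}")
print(f"Students at or above {TARGET}: {pct_at_target:.0f}%")
```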

Previous action(s) to improve SLO: This SLO was last assessed in Spring 2017. Students were to evaluate and create a portfolio of design work (traditional or digital) to present to professionals in the Graphic Design profession.

The instructors for this assessment planned to further strengthen students’ abilities to self-reflect, self-analyze, and create critical insights into their work, so they are not overly dependent on what others think. Target Met: [ X ] Yes [ ] No [ ] Partially Current actions to improve based on recent results, areas needing improvement: Fall 2017 - Although target scores were met and exceeded, the Art History SLO working group will make recommendations for improvement to the Cluster for implementation by Fall 2018. While students excelled at identification and description of various media, their use of formal art terminology could be further developed. ACTION TO BE TAKEN: Opportunities to practice formal description will be increased in all sections of ART 250. Next assessment of this SLO: Spring 2021



To design visual concepts based on set criteria

Interactive Design 1 ART 263 Direct Measure: Students in ART 263 were evaluated on a web advertising banner campaign project based around advertising for a musical or theatrical production. The instructor evaluated each project based on research and analysis, conceptualization and ideation, technical proficiency, and final project design. An assessment rubric is attached. The range for each sub-category score was as follows:

• Excellent – 5 • Good – 4 • Average – 3 • Poor – 2

Sample: Number of Sections – One section in total was evaluated, which took place on the AL campus. No dual enrollment or ELI sections are offered, so none are part of this assessment. Total Sample: 9

Semester/year data collected: Fall 2017 Target: To have more students above the Average level, which corresponds to the 75% range (a C). For this assessment, 77% of students were at or above the average. Results by SLO Criteria:

Summary of Outcomes, Fall 2017
Overall (25 pts) | 22.1 Avg. (88%)
Part 1: Research / Analysis (1-5 pts) | 4.55 (91%)
Part 2: Conceptualization / Ideation (1-5 pts) | 4.4 (88%)
Part 3: Technical Proficiency (1-5 pts) | 4.1 (82%)
Part 4: Final (1-10 pts; captures the concluding stage of the process) | 9.1 (91%)

Target Met: [ X ] Yes [ ] No [ ] Partially All targets were met or exceeded. The average total score was 22.1 and 88% of students assessed received a total score of 23.3 or better. We exceeded our target goal. The majority of the students were in the Good and Excellent ranges, with an overall average of 93% in the rubric domains, resulting in 77% of the students being successful. The other 23% of students had an overall average of 72%, with the weakest domain being technical proficiency. No students were in the Poor range in any of the rubric domains. NOTE: The Graphic Design SLO working group changed the assessment instructions and measurement tool to be more precise and to move away from scores that seemed too much like “grades.” These revisions were made in the Fall 2015 semester and delivered to faculty before the end of the semester. We do not have enough information at this time to provide a comparative analysis to previous semesters because different methods of data collection were used. Before Fall 2015, designated rubric section data was not collected. Until more courses have been evaluated using this new method, these results are not comparable.

Previous action(s) to improve SLO: This SLO was last assessed in Spring 2015 in ART 140 Intro to Graphic Skills, and the achievement target was not met. With 58% of the students above average, the faculty believed the students needed to be encouraged to work outside of class time to improve the skill sets taught in the preparatory exercises and class-time demonstrations. Students needed additional practice to achieve the technical skills and design knowledge required for a satisfactory final design. The faculty considered providing more assistance in class. Target Met: [ X ] Yes [ ] No [ ] Partially Current actions to improve based on recent results, areas needing improvement: Fall 2017 - From the results of this study, student results were very good and met expectations. The weakest results were found within the Conceptualization / Ideation and Technical Proficiency domains, so these areas need more attention and additional actions to strengthen students’ academic results. The Graphic Design SLO working group changed the assessment instructions and measurement tool to be more precise and to move away from scores that seemed too much like “grades.” These revisions were made in the Fall 2015 semester and delivered to ART 263 faculty at the beginning of that semester. The project illustrated a consistent application of research and analysis, conceptualization and ideation, technical proficiency, and final design. Class time, in the form of process critiques, was used to have students share their work and the technical issues they encountered. This led to discussions and shared problem-solving of the necessary directions, techniques, and tools used in the class. Further class time focused on students’ work so they could receive additional feedback from their classmates and faculty and achieve stronger final results. ACTIONS TO BE TAKEN:



A higher degree of Conceptualization / Ideation will be pursued through more direction and focus on what is desired from each student, with more measurable results. This would include additional sketches and basic project ideas from the students to facilitate a broader range of concepts created by the student for the assignment. Extra class time will be used for students to share their findings with the class as a whole or in selected smaller groups for further discussion of their subject and to receive additional feedback from their classmates, group members, and faculty. A higher degree of Technical Proficiency will be addressed through additional in-class demonstrations of the techniques and tools used to execute the assignment(s), as well as required additional exercises, completed outside of class sessions, that strengthen these skills. These actions are to be considered and/or adopted for the next academic year (2018-2019). Next assessment of this SLO: Fall 2020

To consider and apply technical and conceptual expertise in the creation of visual concepts.

Graphic Design II ART 218 Direct Measure: Students were asked to consider and apply technical and conceptual expertise in the creation of visual concepts to a project. The project was to design and produce product packaging or a product brochure. Students were assessed on research and analysis, ideation and conceptualization, technical proficiency, and final (concluding stage). An assessment rubric is attached. The range for each sub-category score was as follows:

• Excellent – 5 • Good – 4 • Average – 3 • Poor – 2

Sample: Number of Sections – One section in total was evaluated on the AL campus. No dual enrollment or ELI sections are offered, so none are part of this assessment. Total Sample: 14

Semester/year data collected: Spring 2018 Target: The target was to have more students above the Average level, which corresponds to the 75% range (a C). Results by SLO Criteria:

Summary of Outcomes, Spring 2018
Overall (25 pts) | 23.68 Avg. (94.7%)
Part 1: Research and Analysis (1-5 pts) | 4.86 (97.3%)
Part 2: Ideation and Conceptualization (1-5 pts) | 4.82 (96.4%)
Part 3: Technical Proficiency (1-5 pts) | 4.73 (94.5%)
Part 4: Final (1-10 pts; captures the concluding stage of the process) | 9.27 (92.7%)

Target Met: [ X ] Yes [ ] No [ ] Partially For this assessment, all students were at or above the average. We exceeded our goal and think the approach to the project is good. Most students were in the Good to Excellent ranges in each of the rubric domains, with very few bringing the overall average down, and even those still landed in the above-average range. The research, ideation and conceptualization, and technical proficiency domains were all above 90% and gave rise to good and excellent projects. Overall, the final rubric domain was strong, resulting in a unique overall project, which we are happy about.

Previous action(s) to improve SLO: This SLO was last assessed in Spring 2016 with overall results that were very good and met expectations. Although acceptable, the weakest results were found within the research/analysis and technical proficiency domains. It was suggested that faculty address these minor concerns by providing more guidance on the research and analysis part of the project and more assistance in class. Target Met: [ X ] Yes [ ] No [ ] Partially Current actions to improve based on recent results, areas needing improvement: Spring 2018 - From the results of this study, student results were excellent and met expectations. The overall results were exceptional, as illustrated by the students’ concept development and execution. The students were able to realize their ideas, work through the development and prototype process, and execute them on deadline. ACTIONS TO BE TAKEN: For some students, it was difficult to manage their time during the prototype process, and some had trouble making decisions. A high level of feedback and process critiques should be maintained in future sections of this class.



This was an exceptionally high-achieving group of students. NOTE: The Graphic Design SLO working group changed the assessment instructions and measurement tool to be more precise and to move away from scores that seemed too much like “grades.” These revisions were made in the Fall 2015 semester and delivered to faculty before the end of the semester. We do not have enough information at this time to provide a comparative analysis to previous semesters because different methods of data collection were used. Before Fall 2015, designated rubric section data was not collected. Until more courses have been evaluated using this new method, these results are not comparable.

Some students have economic barriers to buying the supplies needed for package design. Costs to complete this project should be emphasized at the beginning of the semester, and students should be required to research options to meet their budgets. (A real-world issue.) These actions should be adopted for the next academic year (2018-2019). Next assessment of this SLO: Spring 2021

CLO: To represent mathematical information numerically, symbolically, and visually, using graphs and charts. [ X ] QR

Graphic Design II ART 218 Direct Measure: In Spring 2018, students were asked to design and develop an informational graphics project for ART 218. Students were given a project in which the learning outcome was to represent mathematical information numerically, symbolically, and visually, using graphs and charts for a product brochure or presentation. An assessment rubric is attached. The range for each sub-category score was as follows:

• Excellent – 5 • Good – 4 • Average – 3 • Poor – 2

Sample: Number of Sections – One section in total was evaluated on the AL campus. No dual enrollment or ELI sections are offered, so none are part of this assessment. Total sample: 14

Semester/year data collected: Spring 2018 Target: To have more students above the Average level, which corresponds to the 75% range (a C). Results by SLO Criteria:

Summary of Outcomes, Spring 2018
Overall (25 pts) | 23.41 Avg. (93.6%)
Part 1: Investigation and Research (1-5 pts) | 4.86 (97.3%)
Part 2: Interpretation and Concept Formulation (1-5 pts) | 4.82 (96.4%)
Part 3: Mathematical Visualization Proficiency (1-5 pts) | 4.64 (92.7%)
Part 4: Final infographic execution, which captures the concluding stage of the process (1-10 pts) | 9.09 (90.9%)

Target Met: [ X ] Yes [ ] No [ ] Partially For this assessment, all students were at or above the average. The class assessed had a high degree of success with the project. We exceeded our goal, yet there is room for growth and understanding in the area of data visualization and informational graphics for both print and web/interactive applications. This was an exceptionally high-achieving group of students.

Previous action(s) to improve SLO: This quantitative reasoning SLO is new and was implemented in the Spring of 2018 semester. Based on SCHEV and SACSCOC recommendations, assessment of General Education competencies was to rely primarily on direct measures, actual student work or student performance (course embedded assessments), similar to current SLO assessment methods. The Graphic Design Program chose the SLO that incorporated representing quantitative reasoning visually as it closely relates to our field. Target Met: [ X ] Yes [ ] No [ ] Partially Current actions to improve based on recent results, areas needing improvement: Spring 2018 - From the results of this study, student results were excellent and met expectations. Design skills combined with quantitative reasoning were evident, and students gained knowledge in producing informational graphics. ACTIONS TO BE TAKEN: Further strengthen students’ abilities to represent mathematical information numerically, symbolically, and visually, using graphs and charts in relevant projects. Using informational graphics in annual reports or editorial designs should be considered. Teach graphing tools within Adobe Creative Suite Illustrator software. These actions are to be considered and adopted over the next few academic years to build expertise in this area.


Next assessment of this SLO: Spring 2020

Program Goals Evaluation Methods Assessment Results Use of Results

Use outside resources to speak on the graphic design field.

Lectures by visiting professionals to a select number of Graphic Design courses. Data was collected at the end of the 2017-2018 academic year (Fall 2017-Spring 2018).

Summary: Faculty had a decrease in the number of outside resources to speak on the graphic design field from 2016-2017 to 2017-2018. We had 16 versus 20 in the previous year. We did meet our target to have an average of 16 outside resources each academic year, but reached 26 fewer students.

The majority of faculty worked diligently to bring in more design professionals to speak with the students to provide them with more “real world” education. Students are more excited about their education and learning in the classroom after hearing from and engaging with professionals.

Fall 2017 (Speakers/Trips | Student Attendance | Course)
1 | 49 | ART 140
2 | 8 | ART 287
3 | 8 | ART 287
4 | 13 | ART 135
5 | 17 | ART 217
6 | 8 | ART 265
7 | 22 | Various courses
8 | 18 | ART 209
9 | 25 | ART 140
TOTAL | 168 |

Spring 2018 (Speakers/Trips | Student Attendance | Course)
1 | 10 | ART 265
2 | 9 | ART 265
3 | 21 | ART 140
4 | 15 | ART 281 + others
5 | 13 | ART 209
6 | 19 | ART 203 & 204
TOTAL | 87 |

Fall 2016 (Speakers/Trips | Student Attendance | Course)
1 | 12 | ART 140
2 | 13 | ART 142
3 | 13 | ART 265
4 | 10 | Various courses
5 | 11 | Various courses
6 | 21 | ART 140
7 | 13 | ART 209
8 | 13 | ART 209
TOTAL | 106 |

Before classes began in Fall 2017, faculty met at the cluster and on each campus to plan for new events for the 2017-2018 academic year. They brainstormed to come up with new ways to promote each event to increase student interest and participation. Because funding events can be challenging, faculty sought funding in advance to ensure events could be handled at the level anticipated. The faculty will also look at other types of funding such as grants or sponsorship.

FALL 2017 1. ART 140 (L): Guest Speakers, Jarrett Michael, UI/UX Designer at Creative2, and Jessica Michael, Design Associate for Students for Liberty and freelance designer. They came in to talk about who they are, what they are currently doing after graduation from the program, and what their experiences have been like since graduating from NOVA, followed by a question and answer period in which students could ask questions.

2. ART 287 (L) Guest Speakers, Lilian Cortinas & Liz Montes de Oca of BlueWing Creative. Bluewing is a versatile creative studio. They came in during portfolio presentation to help students refine their work, giving pointers on layout, font choice, and readability.

3. Art 287 (L): Guest Speaker, Missy Grimm - Graphic Designer, CACI and George Holton of Holton Design. They came in during portfolio presentation to help students refine their work, giving pointers on layout, font choice, and readability.

4. ART 135 (L) Guest Speaker, Julia Sarver, Creative Director at the Merritt Group came in to speak on creating a strong logo and identity. She critiqued students’ logos during the design refinement stage, giving pointers on layout, font choice, and readability. It was a lively conversation.

5. ART 217 (AL): Guest Speaker, Environmental Designer Rob Steele talked with students about wayfinding for their course branding project.

6. ART 265 (A): Field Trip. Stephenson Printing Plant Tour. George Stephenson (owner) and Greg Troup (sales rep) gave students a guided tour of the printing plant. The visit highlighted the circulation of a print job through the plant, as well as the different printing presses, and finishing and mailing processes offered by the company. Stephenson Printing is one of the few local printing plants, within the DC area, which provides web, sheet-fed, and digital printing, as well as binding services.



Spring 2017 (Speakers/Trips | Student Attendance | Course)
1 | 8 | Various courses
2 | 18 | Various courses
3 | 14 | ART 265
4 | 10 | ART 217
5 | 30 | ART 209 & 264
6 | 15 | ART 209
7 | 8 | ART 218
8 | 12 | ART 265
9 | 12 | ART 265
10 | 15 | ART 251
11 | 12 | ART 142
12 | 21 | ART 250
TOTAL | 175 |

Fall 2015 (Speakers/Trips | Student Attendance | Course)
1 | 14 | ART 209
2 | 9 | ART 209
3 | 9 | ART 209
4 | 12 | Various classes
5 | 11 | Various classes
6 | 15 | ART 290

Spring 2016 (Speakers/Trips | Student Attendance | Course)
1 | 15 | ART 290
2 | 40 | Various
3 | 40 | Various
4 | 7 | ART 290
5 | 10 | Various
6 | 14 | ART 265
7 | 11 | ART 217

Spring 2015 (Speakers | Student Attendance | Course)
1 | 25 | ART 150
2 | 12 | ART 209
3 | 14 | ART 209
4 | 21 | ART 265
5 | 11 | ART 265
6 | 23 | ART 290 & 263
7 | 18 | ART 264
8 | 23 | ART 264 + various
9 | 22 | ART 264 + various
10 | 8 | ART 287


7. Various classes (A): Guest speaker Bailey O’Connell. Bailey is a Sr. Recruiter with Creative Circle, one of the largest creative staffing agencies in the business, run by creatives, for creatives. They connect design professionals with companies looking for full-time or freelance talent.

8. ART 209 (A): Guest Speaker, Paul Chapman, talked about comic book and graphic novel sequential writing and the use of text and image. He demonstrated how to find books in the VCCS library collection. Mr. Chapman has a degree in Fine Arts and library science.

9. Art 140 (A): Guest Speaker, comic book artist, Sako Davis talked about comic book/graphic novel cover design which coincided with students’ project.

SPRING 2018 1. ART 265 (L): Guest Speakers, Ronnie Price, Creative Director at GAM Graphics and Marketing, and Jeff Geurin of Concept Marketing, a print broker. They came in to critique students’ work and discuss booklet design.

2. ART 265 (A): Field Trip. Stephenson Printing Plant Tour. George Stephenson (owner) and Greg Troup (sales rep), gave students a guided tour of the printing plant. The visit highlighted the circulation of a print job through the plant, as well as the different printing presses, and finishing and mailing processes offered by the company. Stephenson Printing is one of the few local printing plants, within the DC area, which provides web, sheet-fed, and digital printing, as well as binding services.

3. ART 140 (A): Field Trip. AIGA DC presents Gail Anderson. Students attended a lecture given by Gail Anderson. Anderson is a New York-based designer, writer, and educator. She is a partner at Anderson Newton Design. Ms. Anderson has co-authored numerous books with Steven Heller, such as The Typographic Universe, New Modernist Type, New Ornamental Type, New Vintage Type, and many more. She also teaches both masters and undergraduate studies at the School of Visual Arts, as well as high school design programs. In the lecture, she talked about her journey as a designer and how to collaborate with others.



Fall 2014 (Speakers | Student Attendance | Course)
1 | 10 | ART 140
2 | 25 | ART 150
3 | 12 | ART 263
4 | 7 | ART 265
5 | 20 | ART 209
6 | 60 | Exhibit/lecture/various classes
7 | 11 | ART 290
8 | 11 | ART 290
9 | 11 | ART 290
10 | 22 | Master Class
11 | 5 | ART 287

Spring 2014 (Speakers | Student Attendance | Course)
1 | 10 | ART 209
2 | 10 | ART 209
3 | 15 | ART 287
4 | 20 | Various
5 | 8 | ART 295
6 | 8 | ART 295
7 | 8 | ART 295
8 | 8 | ART 295
9 | 8 | ART 295
10 | 17 | ART 140

Fall 2013 (Speakers | Student Attendance | Course)
1 | 21 | ART 209
2 | 21 | ART 209
3 | 13 | ART 209
4 | 13 | ART 209
5 | 13 | ART 209
6 | 19 | ART 140
7 | 7 | ART 287
8 | 7 | ART 287
9 | 25 | ART 263
10 | 23 | ART 290 & 263
11 | 18 | Master Class
12 | 18 | Master Class


4. ART 281 (A): Guest Speaker. Donald Ely. Donald discussed his personal workflow, process, and techniques of illustration. He shared his work and held a lengthy Q&A session and participated in a critique of book cover illustrations created by the students.

5. ART 209 (A): Guest Speaker, Paul Chapman, talked about comic book and graphic novel sequential writing and the use of text and image. He demonstrated how to find books in the VCCS library collection. Mr. Chapman has a degree in Fine Arts and library science.

6. ART 203 & 204 (A): Guest speaker Callison Slater talked to animation students about his work, process, and techniques. He is an award-winning animator who studied at The Art Institute of Washington, VCU School of the Arts, and NOVA Community College. He won ‘Best Animation’ at the New York Short Film and Screenplay Competition, ‘Best Animation’ at the Sixteen is to Nine International Film Festival, and the ‘Audience Choice Award’ at the Rockport Film Festival. He is the lead animator for Scamazon, a political satire website. Most recently, Callison Slater has been invited to RTX Austin as one of fifty guest animators from across the USA.

ACTION TO BE TAKEN: In 2018-2019, faculty will continue our efforts to find outside resources to speak on the graphic design field. Full-time faculty will identify topics and look for additional resources to support and enhance the curriculum. All faculty members should work diligently to bring in guest speakers and/or take students on field trips every semester. Students will be encouraged to participate in local AIGA events that occur over the course of the school year, as well as to join the AIGA. Students on the Alexandria campus will be told about the school chapter, AIGA NOVA. This program goal is ongoing.



Spring 2013 (Speakers | Student Attendance | Course)
1 | 16 | ART 209
2 | 16 | ART 209
3 | 13 | ART 218
4 | 13 | ART 287
5 | 13 | ART 264
6 | 35 | ART 116, 140, 265
7 | 13 | ART 287
8 | 15 | ART 140 (2 sections)

Fall 2012 (Speakers | Student Attendance | Course)
1 | 14 | ART 209 & 141
2 | 11 | ART 209
3 | 20 | ART 140

Spring 2012 (Speakers | Student Attendance | Course)
1 | 22 | ART 140
2 | 15 | ART 116
3 | 12 | ART 218
4 | 11 | ART 218
5 | 12 | ART 195
6 | 17 | ART 265

Fall 2011 (Speakers | Student Attendance | Course)
1 | 22 | ART 140
2 | 17 | ART 116
3 | 18 | ART 263

A year-to-year comparison of the lectures provided and the students served by them follows:

Academic Year | Total Speakers/Trips | Total Students | +/- Change from Previous Year
2017-2018 | 15 | 255 | -26
2016-2017 | 20 | 281 | +84
2015-2016 | 13 | 197 | -120
2014-2015 | 21 | 317 | +7
2013-2014 | 22 | 310 | +131
2012-2013 | 11 | 179 | +33
2011-2012 | 9 | 146 | n/a
AVERAGES | 15.7 | 238+ |


To encourage the number of and attendance at program-sponsored events.

Participation in programs sponsored by the Graphic Design Advisory Committee, developed to teach students about the field from the viewpoint of practicing professionals.

Data was collected at the end of the 2017-18 academic year.

Summary: The target to maintain the number of events from the previous academic year was achieved even without a Fall topic show. The number of students attending increased by 23+ over the previous year.

FALL 2017 (Event | Student Attendance | Event Content)
1 | 35+ | Portfolio review, Alexandria: students + additional students from multiple classes attending to see what is involved with creating a portfolio and speak with reviewers and students
2 | 8 | Portfolio review, Loudoun: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students
TOTAL | 43+ |

SPRING 2018 (Event | Student Attendance | Event Content)
1 | 80+ | Annual Student Show: Alexandria
2 | 25+ | Portfolio review, Alexandria: students + additional students from multiple classes attending to see what is involved with creating a portfolio and speak with reviewers and students
3 | 100+ | Arts Festival on Loudoun campus. The event displayed graphic design students’ work and a variety of other activities such as making chalk graffiti art, a photo props booth, and button making.
4 | 50+ | Annual Student Show: Loudoun
5 | 6 | Portfolio review, Loudoun: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students
6 | 40+ | Documentary film screening “Graphic Means” in the Waddell theater
TOTAL | 311+ |

FALL 2016 (Event | Student Attendance | Event Content)
1 | 75+ | Copyright Panel Discussion
2 | 5 | Portfolio review: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students
TOTAL | 80+ |

Faculty met with the advisory board in early Fall 2017* to plan for new events for the 2017-18 academic year and to promote each event to increase student interest and participation. Events were discussed and then implemented with the committee’s help. *As expected, in the Fall 2017 semester the Alexandria Campus program moved into a new building, and the teaching Art Gallery was not completed until the spring semester, so a topic show and talk could not be held. FALL 2017: Members of the advisory committee attended the portfolio reviews and talked briefly about their work. This allowed students to have exposure to the professionals who help govern the program. 1. Portfolio Review (A): Advisory Board members

affiliated with the Alexandria campus and other design professionals attended the review, including: Eddie Sutton, principal, JustPixels.com; Jennifer Knotts, Graphic Designer at Knotts Design; Zenon Slawinski, Principal of Zen Arts Design Studio; Marie Elizabeth Pierce, a UX/Web Designer at Booz Allen Hamilton; Bailey O’Connell, Senior Recruiter, Creative Circle DC; Karin Huggens, Principal for PaletteHead; and Julia Sarver, Creative Director for the Merritt Group.

2. Portfolio Review (L): Advisory Board members affiliated with the Loudoun campus attended the review.

SPRING 2018: Events included portfolio reviews, a film screening, an arts festival, and annual student shows. Members of the advisory committee attended the portfolio reviews and some events where they talked briefly about their work. This allowed students to have exposure to the professionals that help govern the program. 1. Annual Student Show Alexandria (Juried) 2. Portfolio Review Alexandria: Advisory Board

members affiliated with the Alexandria campus and other design professionals attended the review: Eddie Sutton, principal, JustPixels.com; Rahsaan Williams, Web and Graphics Designer for the Office of the Chief Information Officer, National Aeronautics and Space Administration (NASA); Bailey O’Connell, Senior Recruiter, Creative Circle DC; Anthony Dihle, owner/creative director, Victory Dance Creative; and Don Starr, Associate Director, School of Art, and Director, Graphic Design at George Mason University.



SPRING 2017 (Event | Student Attendance | Event Content)
1 | 60+ | Topic Show: Tad Carpenter
2 | 75+ | Annual Student Show: Alexandria
3 | 25+ | Portfolio review, Alexandria: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students
4 | 35 | Topic Event: Guess the Guest
5 | 50+ | Annual Student Show: Loudoun
6 | 5 | Portfolio review: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students
TOTAL | 250+ |

Fall 2015 (Event | Student Attendance | Event Content)
1 | 14 | Portfolio review: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students

Spring 2016 (Event | Student Attendance | Event Content)
1 | 45 | Topic: Bauhaus: The Face of the 20th Century
2 | 100+ | Student Show
3 | 21 | Portfolio review: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students

Fall 2014 (Event | Student Attendance | Event Content)
1 | 33 | Topic Show “Go Font Yourself 2” workshop
2 | 12 | Portfolio review: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students


3. Loudoun campus-wide Arts Festival. The event displayed students’ work and a variety of other activities such as making chalk graffiti art, a photo props booth, and button making. (Two advisory committee members were involved.)

4. Juried Annual Student Show in the Loudoun Waddell Gallery

5. Portfolio Review Loudoun: Advisory Board members affiliated with the Loudoun campus attended the review.

6. Screening of the documentary film “Graphic Means” in our Waddell theater. It’s a film about graphic design production from the 1950s through the 1990s, from linecaster to photocomposition, from paste-up to PDF.

The program would like to maintain the event offerings for the academic year ahead. ACTION TO BE TAKEN: The Advisory Board Committee and the faculty will continue to discuss the success of the previous events and plan for new events in the 2018-19 academic year. This program goal is ongoing.


Spring 2015 (Event | Student Attendance | Event Content)
1 | 120 | Topic Show “Go Font Yourself 2”
2 | 120 | Student Show
3 | 9 | Portfolio review: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students
4 | 65 | Topic: Design Thinking

FALL 2013 (Event | Student Attendance | Event Content)
1 | 100 | Topic Show opening & meet the illustrators
2 | 100 | Topic Show panel discussion
3 | 19 | Portfolio review: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students

SPRING 2014 (Event | Student Attendance | Event Content)
1 | 19 | Student Design Challenge
2 | 100 | Student Show
3 | 76 | AIGA
4 | 28 | Portfolio review: students + additional students attending to see what is involved with creating a portfolio and speak with reviewers and students
5 | 35 | Student Show

FALL 2012 (Event | Student Attendance | Event Content)
1 | 65 | Topic Show

SPRING 2013 (Event | Student Attendance | Event Content)
1 | 65 | Student Show
2 | 76 | Design +


3 | 13 + 19 additional students | Portfolio Review + other attending students
4 | 15 | Type Workshop
5 | 35 | Spring Festival and Gallery Show

Fall 2011 (Event | Student Attendance | Event Content)
1 | 16 | Topics Show
2 | 69 | Topic show 2 speakers, Night 1
3 | 78 | Topic show 2 speakers, Night 2

Spring 2012 (Event | Student Attendance | Event Content)
1 | 66 | Topics Show
2 | 18 | Portfolio review + approximately an additional 26 students attending
3 | 78 | Student and Topics Show

2010-11 (Event | Student Attendance Fall 2010 | Student Attendance Spring 2011)
1 | 96 | 16
2 | 39 | 15
3 | 17 | 60
4 | 25 | N/A

To encourage students to continue through all courses and complete the degree.

Enrollment totals of higher-level classes.

Data provided by each campus Assistant Dean, obtained from enrollment reports in MyNova SIS system and collected at the end of the 2017-18 academic year.

Target: To grow each higher-level course by at least two students each semester.

Course | 2010-11 | 2011-12 | 2012-13 | 2013-14 | 2014-15 | 2015-16 | 2016-17 | 2017-18
ART 209 | – | – | – | 51 | 54 (+3) | 37 (-17) | 39 (+2) | 48 (+9)
ART 217 | 56 | 41 (-15) | 56 (+15) | 53 (-3) | 44 (-9) | 29 (-15) | 39 (+10) | 38 (-1)
ART 218 | 28 | 36 (+8) | 20 (-16) | 31 (+11) | 28 (-3) | 24 (-4) | 37 (+13) | 51 (+15)
ART 251 | – | – | – | – | 14 | 45 (+31) | 36 (-9) | 51 (+15)
ART 263 | 14 | 26 (+12) | 22 (-4) | 25 (+3) | 26 (+1) | 14 (-12) | 13 (-1) | 18 (+5)
ART 264 | 24 | 20 (+4) | 17 (-3) | 22 (+5) | 17 (-5) | 11 (-6) | 10 (-1) | 4 (-6)
ART 265 | 50 | 33 (-23) | 45 (+12) | 30 (-15) | 49 (+19) | 36 (-13) | 39 (+2) | 27 (-12)
ART 268* | – | – | – | – | – | – | – | 7*
ART 270 | – | – | – | – | 13 | 21 (+8) | 27 (+6) | 25 (-2)
ART 281 | – | – | – | – | – | 11 (n/a) | 22 (+11) | 20 (-2)
ART 287 | – | – | – | – | 21 (n/a) | 32 (+11) | 21 (-9) | 30 (+9)
Total Students | – | – | – | – | – | 260 | 283 (+23) | 319 (+36)

Comparing the data over the eight academic years, enrollments have fluctuated, with decreases in some of the listed courses. The trend could be due to the college administration looking more closely at low class enrollments, which has caused an increase in course cancellations and reduced the availability of classes during the 2014 through 2018 academic years. The Alexandria and Loudoun Graphic Design faculty reviewed enrollment data to determine how the reduction in the program’s course offerings impacted students’ ability to complete their degrees. The faculty worked together to implement new rotation methods, offering 200-level courses in either Fall or Spring with the goal of having larger class enrollments in one of those sessions rather than lower enrollments every semester; this has resulted in slightly reduced course offerings each semester. New Associate Deans should monitor class schedules and courses and inform students and faculty of the availability of the courses.



* First time tracking Target Met: [ ] Yes [ X ] No [ ] Partially The program goal has not been met and should be changed to a measure that is a more realistic overall rate of growth/decline. Note the addition of a “Total Students” data point in the table above. The Graphic Design program would like to target 2% growth each semester in place of the goal “To grow each higher-level course by at least two students each semester.” The program goal as stated has not been met for ART 217, ART 264, ART 265, ART 270 and ART 281. Yet, overall the program increased enrollment by 36 students, which equates to a 12.7% increase over the previous year (283 in the 2016-17 year and 319 in the 2017-18 year). ART 209, 218, 251, 263, and 281* had the most significant increases in students. *ART 281 is a new course offering as an elective for the degree.

This will allow students to plan their schedules for degree completion. The program is aware some students are leaving the program early to pursue advanced degrees at a four-year institution. The Graphic Design program has a track that is part of the new A.F.A. that will launch in the Fall 2018 semester. The hope is for the degree to attract more students who need Graphic Design courses to transfer and to increase the overall enrollments in program courses. ACTIONS TO BE TAKEN: 1. Advise students one-on-one as well as in groups

on the next course or courses to take after completing the current session.

2. Encourage students in existing classes to enroll early in class sequencing that will help in their development in the program.

3. Integrate and update a variety of support training videos in Blackboard classes to help support student learning. These resources are included to enhance student success, with the goal of increasing student persistence, and offer students ample opportunities for success in classes.

4. Track individual students who are close to graduation to ensure they have met all the degree requirements, to keep them on track for graduation, and provide frequent feedback and support as needed.

5. Participate in campus advising week with walk-in and appointment-based sessions to help students gain a clear sense of how they are progressing toward their goals.

7. Encourage students to apply for internships and assist them in finding internship opportunities in order to give them hands on experience in the field.

8. Provide insight about the design industry so students can see how their efforts could aid them to find possible employment.

The program will work to implement the A.F.A. Visual Arts Graphic Design degree with a projected launch in Fall 2018. Faculty will continue to encourage students in all classes to enroll early in courses and guide their degree progression. Faculty will make sure all students are program placed and have a faculty advisor.


Faculty members will continue to talk with their classes and emphasize the importance of working with a faculty advisor for course sequence and degree selection. This program goal is ongoing.

To prepare students for employment in the graphic design field.

Requesting and participating in a “practice” interview and portfolio presentation with a member of the design community, an opportunity made available by the Advisory Committee, the program and other professional organizations. Data was collected at the end of the 2017-18 academic year.

Target: To increase interviewing skills and practice, the program will work to increase student enrollment in the ART 287 Portfolio & Resume capstone course by 2 students each semester. The capstone course is where the faculty schedule interviews with professionals for the students. Currently, 100% of students in the capstone course complete interviews. This is a stated requirement to pass the course.

Semester | Total Students | Internship Interviews | Professional Interviews
Spring 2018 | 13 | 2 | 39*
Fall 2017 | 17 | 4 | 51*
Spring 2017 | 16 | 3 | 48*
Fall 2016 | 5 | 2 | 15*
Spring 2016 | 21 (1 student did not pass) | 6 | 42*
Fall 2015 | 11 | 1 | 18*
Spring 2015 | 20 | 4 | 39
Fall 2014 | 14 | 2 | 30
Spring 2014 | 22 | 7 | N/A
Fall 2013 | 7 | 2 | N/A
Spring 2013 | 21 | 6 | N/A
Fall 2012 | 4 | 3 | N/A
Spring 2012 | 36 | N/A | N/A
Fall 2011 | 9 | N/A | N/A
Spring 2011 | 28 | N/A | N/A
Fall 2010 | 7 | N/A | N/A

*Each student interviewed with at least three professionals. Target Met: [ ] Yes [ X ] No [ ] Partially Target was not met this year. The results were mixed this year, with increased enrollments in other capstone classes, which may lead to better enrollments in ART 287. The program fell short of its target to increase enrollment in the capstone course, with a decrease of 11 students compared to the 2016-17 academic year.

Prior to this assessment, faculty identified areas for enhancement and made necessary modifications to the curriculum to ensure student work was aligned with industry best practices and current. Multiple opportunities for students to refine their portfolios in order to demonstrate their talents and help steer students toward excellence were presented through coursework, workshops, studio tours, interviews, and internships. During both the Fall 2017 and Spring 2018 semesters, as well as during advising over the summer, faculty encouraged students to enroll in the Interactive degree to help round out their knowledge and offer more opportunities when pursuing a career in the design field. ACTIONS TO BE TAKEN: In the 2018-19 academic year, faculty will continue to keep the curriculum current, identify areas for enhancement, and make necessary modifications to the curriculum to ensure student work is aligned with industry best practices and skill sets. They will continue to offer multiple opportunities for students to refine their portfolios to achieve a higher-caliber outcome in order to demonstrate their talents. This will also help steer students toward excellence while stressing the importance of meeting deadlines. Individuals in the design field must be able to communicate their ideas in writing, visually, and verbally. Faculty will continue to give students multiple opportunities, and encourage them to get as much practice as possible, interviewing and presenting their resumes and portfolios in order to get feedback and improve their work and their presentation skills. In the 2018-2019 academic year, faculty will make students aware of design internships and encourage them to apply in order to further prepare for the graphic design field. Faculty will work with students who are planning on getting the new A.F.A. degree to prepare their portfolios for transfer and understand the application process for the visual arts with a graphic design focus.


The program goal is ongoing.

To encourage students to complete an AAS degree in Graphic Design.

Graduation totals: Data from Number of NOVA Graduates by Degree and Specialization: 2017-2018 on the OIR website and from the Planning and Evaluation Reports within the Fact Book dated 2013-2017 were used to determine the totals. Data was collected at the end of the 2017-18 academic year.

Target: To continue to stabilize or increase the graduation totals across all degrees and certificates in the Graphic Design program.

Academic Year | Number of Grads (AAS, C, CSC) | % Change from Previous Year
2017-18 | 39 | 8%
2016-17 | 36 | -27%
2015-16 | 57 | 46%
2014-15 | 39 | -25%
2013-14 | 52 | 8%
2012-13 | 48 | 23%
2011-12 | 39 | 22%
2010-11 | 32 | 18%
2009-10 | 27 | N/A
AVERAGE | 41.25 |

Target Met: [ X ] Yes [ ] No [ ] Partially Target was met this academic year. The number of graduates for the 2017-18 academic year was up from the previous year: the 2017-18 graduation total increased by 3, to 39 students. This was still slightly below the average of 41.25 graduates.

Loudoun and Alexandria campuses will continue to coordinate class offerings to ensure course availability within an academic year so that students can complete their degrees. Faculty advisors will go over students’ progress reports. The program assistant deans reviewed schedules for conflicts or multiple offerings and have begun to alternate some course schedules. The schedule reviews took place in the Fall 2017 and Spring 2018 semesters. This work needs to be taken up by the new Associate Deans whose positions were created in the college reorganization. We will not be able to assess this for another year or two, but we have been able to cancel some classes and send the students to the other campus to complete the course. ACTION TO BE TAKEN: The faculty will discuss at the program council meeting potential reasons why the percentage of graduates increased over the previous year and how faculty can continue to modestly increase and stabilize graduation totals. In 2018-19, to continue to improve student graduation percentages, faculty will: 1. Actively remind students in the capstone courses

regarding the last day to apply for graduation. This will be done through verbal classroom announcements, Blackboard announcements, advising sessions, and classroom posters.

2. Go through the class rosters of the higher-level courses to see which students are close to graduation and what needs to be done to assist these students to complete their course of study and obtain their degree.

3. Proactively advise students about what is needed to complete a degree

The program goal is ongoing.

To increase the program placement rate in the Graphic Design program

Program placement totals: Data from Distribution of Program Placed Students by Curriculum and Award Type Fall 2013 through Fall 2017* on the OIR website.

Year | Total | % Change from Previous Year
2017 | 175* | -0.9%*
2016 | 176* | -23%*
2015 | 229 | -3%
2014 | 232 | -19%
2013 | 289 | -8%
2012 | 315 | 18%
2011 | 266 | 10%
2010 | 242 | -7%
2009 | 259 | -1%
2008 | 262 | 8%
2007 | 242 | N/A

For 2017-2018, the target was not met again, yet we saw an improvement in that our enrollment stabilized while overall enrollment at the college decreased. We are not sure whether changing the name to Graphic Design, intended to better communicate the objectives of our program and attract more students, has helped with this.


Data from the Fall 2007-Fall 2016 Distribution of Program Placed Students by Curriculum and Award Type chart on the OIR website and collected at the end of the 2016-17 academic year. *Please note: There is no data for 2016 or 2017 regarding the Web Design Specialist, Career Studies Certificate program placement in the report above.


Target Met: [ ] Yes [ X ] No [ ] Partially Target was not met this academic year. While the overall number of students stayed relatively steady, we did not meet our target to increase program-placed students. After four straight years of declining program placement, 2017 was an improvement over previous years in not having a substantial decrease in program-placed students.

The program has worked diligently to get students program placed over the last 5 years, and we believe the name change to “Graphic Design” has helped more students find the program and counselors recommend it.

With the push for “transfer” degrees, our students in the past were often told they could not study Graphic Design if they intended to get a 4-year degree. With the new A.F.A. degree with a Graphic Design track being implemented, we hope to capture more students. We also hope that creating an ADVANCE program in Graphic Design with GMU will help future program placement, although these numbers may be tied to the A.F.A. degree. We should develop a way to track the data related to the A.F.A. degree.

Tactics used by professors over the past year to increase the program placement in the Graphic design program have included: 1. Reviewed class rosters to encourage students to

be program placed within the first 6 to 8 weeks of the semester.

2. Offered explanation of degrees and certificates to students in Graphic Design classes.

3. Offered outreach opportunities to current NOVA students and area high schools to promote the program and careers in graphic design.

4. Began discussions and a relationship with Monroe Technology Center in Leesburg, VA. Monroe Tech is a Career and Technical High School which has a thriving graphic design program.

5. Talked with advisors about the graphic design program and the upcoming transfer options.

ACTION TO BE TAKEN: In 2018-19: 1. Loudoun will continue to build a relationship and

market to the Monroe Technology Center High School in Leesburg. Monroe Technology Center has a program which focuses on the area of graphic communications and computer & digital animation. The goal is to make their students more aware of our program.

2. Full-time faculty will continue their class visits and scheduled advising sessions to get students to understand the sequence of courses and the importance of program placement towards their goal of graduation. Faculty will continue to work diligently to get students placed, which will hopefully yield an increase in graduations in the future.

3. Faculty will work with GMU to solidify an ADVANCE program in Graphic Design.

4. Continue to monitor if the name change from Communication Design to Graphic Design leads to increased understanding of the degree and certificate offerings, and if enrollments will increase as a result.

5. Faculty will work with NOVA leadership to see if moving ahead with an AFA Graphic Design specialization is needed.

This program goal is ongoing.


Annual Planning and Evaluation Report: 2017-2018 Health Information Management, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed to prepare students to work as Health Information Management (HIM) professionals. These individuals play a critical role in maintaining, collecting, and analyzing the data that doctors, nurses, and other healthcare providers rely upon to deliver quality health care. The program emphasizes professionalism and instructional methods in a state-of-the-art computerized laboratory at the Medical Education Campus in Springfield, followed by clinical experience at various affiliated health care organizations. After successful completion of degree requirements, the student will be eligible to take the Registered Health Information Technician (RHIT) examination, which leads to the American Health Information Management Association (AHIMA) RHIT credential.

Outcomes / Assessment Methods / Assessment Results / Use of Results

Analyze policies and procedures to ensure organization compliance with regulations and standards. (V.A.)

Supervision & Management Practices HIM 249 (E05H)
Direct measure(s): Ensuring students understand the difference between policy and procedure. The direct assessment method derives from readings in Chapter 5 of the Kelly book assigned in Week 4. The material is then covered in Quiz 1.
Sample Size (Write N/A where not offered):
Campus/Modality | Total # Sections Offered | # Sections Assessed | # Students Assessed
ELI | 1 | 1 | 15
DE* | N/A | N/A | N/A
Total | 1 | 1 | 15
* Dual-enrollment

Semester/year data collected: Fall 2017-2018
Target: 80%
Results by In-Class, ELI, Dual Enrollment:
Campus/Modality | Average Score | Percent > Target
ELI | 40.77 | 35
Results by SLO Criteria:
Criteria/Question Topics | Average Score | % of Students > Target
Quiz 1 Scores | 40.77/50 pts | 86
Current results improved: [ X ] Yes [ ] No [ ] Partially
Strengths by Criteria/Question Topic: 13/15 students earned 36 pts or better on Quiz 1, which was worth a total of 50 pts.
Weaknesses by Criteria/Question Topic: Results demonstrate that students needed further reinforcement in the critique of different leadership theories, employee/employer relationships, and performance appraisal.

Previous action(s) to improve SLO: Program Director and HIM Faculty added more assignments to meet AHIMA competencies in Spring 2018.
Target Met: [ ] Yes [ ] No [ X ] Partially
Based on recent results, areas needing improvement: The program needs to implement assignments in which students analyze policies and procedures related to organizational compliance and regulations.
Current action(s) to improve SLO, based on results: The program will review this SLO for improvements in student test results in the areas mentioned in Fall 2019.
Next assessment of this SLO: Spring 2020
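The "% of Students > target" figures reported in the results tables above are simple proportions of students scoring at or above a cut score. A minimal illustrative sketch, not part of the report (the score list is hypothetical; only the 13-of-15 count and the 36-of-50 cut come from the results above):

```python
# Illustrative only: computing a "% of Students > target" figure from quiz scores.
def percent_at_or_above(scores, cut_score):
    met = sum(1 for s in scores if s >= cut_score)
    return 100 * met / len(scores)

quiz1_scores = [36] * 13 + [30, 28]   # hypothetical: 13 of 15 students meet the 36-point cut
print(round(percent_at_or_above(quiz1_scores, 36), 1))   # 86.7 (reported in the table as 86)
```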

Differentiate the roles and responsibilities of various providers and disciplines to support documentation requirements throughout the continuum of healthcare. (I.B)

Performance Improvement in Healthcare Settings HIM 229 (E05H)
Direct measure(s): Ensuring students can identify attributes of a good faith peer review system as it relates to the healthcare setting.

Semester/year data collected: Fall 2017
Target: The achievement target is that 75% of students will successfully achieve scores of 20 points or higher on the assignment, which would show that students understood how to interpret and apply quality improvement concepts and principles as they relate to the relationship between quality management and ensuring individual competency.

Target Met: [ X ] Yes [ ] No [ ] Partially
Based on recent results, areas needing improvement: The target for this objective was met. Specific areas where students remained below target occurred where instructions for the assignment were not followed.



Data for this assessment is from discussion board #9, worth a total of 25 points. In this assessment, students are to research and explain the concept of "good faith peer review." Peer review refers to the good faith activities utilized by the medical staff to conduct patient care review for the purpose of analyzing the quality of care provided to patients by physicians. The federal HCQIA (Healthcare Quality Improvement Act) was passed by Congress in 1986 to extend immunity to good faith peer review of physicians and dentists. The purpose of this project is to provide students with an opportunity to apply occurrence screening criteria to actual case scenarios to identify events requiring peer review.
Grading - 25 points:
• 15 points for original post (points are deducted if the original post does not contain the elements of a good faith review)
• 5 points for each response to a peer (points are deducted for not posting to 2 students' discussions)

Sample Size (Write N/A where not offered):
Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
ELI | 1 | 1 | 12
DE* | N/A | N/A | N/A
Total | 1 | 1 | 12
* Dual-enrollment

Results by In-Class, ELI, Dual Enrollment:
Campus/Modality | Fall 2017 Average Score | Percent > Target
ELI | 29.29 | 80%
Results by SLO Criteria:
Criteria/Question Topics | Fall 2017 Average Score | % of Students > Target
Discussion Board #9 Scores | 99.67% | 100%

Current results improved if applicable: [X ] Yes [ ] No [ ] Partially Weaknesses by Criteria/ Question Topic: In reviewing the outcomes, the instructor will provide an additional assessment. Students will also be required to write a sample of a Good Faith Peer Review to reinforce their understanding of the process.

To continue improvement in student learning, the HIM 229 instructors will continue to modify and/or update scenarios each year, as this class is only taught once a year. This will be reviewed in Fall 2019. Next assessment of this SLO: Fall 2019

Apply report generation technologies to facilitate decision-making. (III.C)

Electronic Health Records Management HIM233 (E05H) Direct measure: At least one direct measure of students’ knowledge/skill/ ability is used to assess each SLO. Assignment Summary: This activity is intended for the beginning and intermediate EHR student user. The activity introduces the student to meaningful use, the certification of EHRs, and selecting an EHR for use in physician practice. The current assessment method is different from previous assessments, as the earlier method was assessing other standards and criteria. Objectives:

Semester/year data collected: Spring 2018
Target: 80% of students will score 80% or better on each criterion as well as the overall score. Two out of 10 students did not complete the assignment.
Results by SLO Criteria:
Criteria/Question Topics | Spring 2018 Average Score | % of Students > Target
1. Question answered | 8/10 pts | 80%

Current results improved: [ X] Yes [ ] No [ ] Partially Strengths by Criteria/ Question Topic:

Previous action(s) to improve SLO: HIM program faculty member and Program Director have added more assignments that meet different AHIMA competencies and standards (Fall 2017).
Target Met: [ X ] Yes [ ] No [ ] Partially
Based on recent results, areas needing improvement: This was not assessed previously and therefore there is no data available for comparison. No improvements needed.
Current action(s) to improve SLO, based on results: Only one student scored below the average, and the program will make changes in the future to determine areas for improvement.


1. Define key terms commonly used in health information technology.
2. Define key acronyms commonly used in health information technology.
3. Explain the process used in the selection and implementation of EHRs.
4. Explain why certification is important.
5. Locate and apply the necessary information to assess systems' capabilities to make an informed decision for EHR selection.

Sample Size (Write N/A where not offered):
Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
ELI | 1 | 1 | 10
DE* | N/A | N/A | N/A
Total | 1 | 1 | 10
* Dual-enrollment

8/10 students met the criterion. Weaknesses by Criteria/ Question Topic: 2/10 students did not complete the assignment. In this assessment only one student scored below the average. The program will evaluate the assessment and make changes in the future to determine areas for improvement.

Most recent results: The report indicates the achievement targets for the most recent cycle were met. Results are described in the context of the impact the actions may have had when assignments were added to meet AHIMA competencies and standards. Next assessment of this SLO: Spring 2020

CLO: Apply policies and procedures surrounding issues of access and disclosure of protected health information. (II.C) [ x ] QR

Legal Aspects of Health Record Documentation HIM 226 E05H
Direct measure: Ensuring students understand how to apply policies and procedures in granting authority to release protected health information. Students will use scenarios to determine if the situation is identity theft, medical identity theft, or neither.
Grading Rubric - Scenario questions: 4 questions x 6.25 points = 25 points
Sample Size (Write N/A where not offered):
Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
ELI | 1 | 1 | 9
DE* | N/A | N/A | N/A
Total | 1 | 1 | 9
* Dual-enrollment

Semester/year data collected: Spring 2018
Target: 80% of students will score 75% or better on this assessment.
Results by In-Class, ELI, Dual Enrollment:
Campus/Modality | Spring 2018 Average Score | Percent > Target
ELI | 19.79 | 79.17%
Results by CLO Criteria:
Criteria/Question Topics | Spring 2018 Average Score | % of Students > Target
1. | 6.25 | 100
2. | 5.56 | 89
3. | 3.47 | 56
4. | 5.22 | 78
Total | 19.79 | 79.19

Current results improved if applicable: N/A: First assessment Strengths by Criteria/ Question Topic: Students demonstrated an understanding of how to apply policies and procedures for protected health information as it relates to release.

Previous action(s) to improve CLO if applicable: This CLO was not previously assessed. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: The HIM program has added more assignments that focus on access and disclosure of protected health information as outlined in the AHIMA competencies and standards, to be implemented by the HIM faculty member and Program Director in Spring 2019. Current action(s) to improve CLO, based on results: As the students met the overall target, specific areas of protected health information focused on re-disclosure still needs to be reviewed. This will be reviewed in Spring 2019 by the HIM faculty member and Program Director. Next assessment: Spring 2019


Weaknesses by Criteria/Question Topic: Based on the case scenarios, the results indicated that improvement is needed in identification of medical identity theft.

Program Goals / Evaluation Methods / Assessment Results / Use of Results
NOVA's HIM Program will increase the number of new program placed students from the previous year.

The data are from the OIR report: https://www.nvcc.edu/oir/_files/factbooks2013-2018.pdf

Target: Increase the AAS by 5%. There was no increase from Fall 2016 to Fall 2017:
Year | Enrollment
Fall 2017 | 45
Fall 2016 | 55
Fall 2015 | 56
Fall 2014 | 56
Fall 2013 | 55
This goal was previously assessed and was not met, although enrollment remained steady over the three prior years, and enrollment was low throughout the college's programs.

Target Met: [ ] Yes [ X ] No [ ] Partially
Not met; the goal has been re-evaluated. The Program Director and faculty were involved in the recruitment of students for the 2016-17 school year. Faculty met with prospective students at the Medical Education Campus. In Fall 2017, the Program worked with the Dean of Allied Health to develop and implement strategies for promoting the program, including: the HIM Program Director and faculty working with student services to review and revise marketing materials for the program, updating the HIM website with current program information and pathways to success, and faculty continuing in-person student advisement for the HIM program. Despite the implemented strategies, there has not been an increase in program placed students. After reviewing the statistics over the last 5 years, the HIM program has re-evaluated this goal. Since progress towards increased enrollment has not been at 2% or greater, it is not reasonable to continue the goal as stated. The program will revise this goal to state: NOVA's HIM Program will minimize the attrition rate of program placed students to not exceed 5%. This revision is a direct result of the college's total decrease in enrollment across all campuses. The new goal will be implemented in Fall 2020. To support program placed students, the program will request support for outreach to the college community for students interested in the HIM Program. HIM faculty advisors will review enrolled students and continue to meet with advisees twice a semester to monitor academic progress. To monitor this outcome, the program faculty will continue to track the number of program placed students who develop difficulties while enrolled, for early intervention. Assessed: Annually

NOVA’s HIM Program will increase total number of students graduating from the previous year.

Comparison of graduation rates from previous school year to assessment year. The Data are from OIR Reports: Number of Graduates by Degree Specialization Reports, https://www.nvcc.edu/oir/_files/factbooks2013-2018.pdf

Target: The HIM Program will increase the number of graduates over the previous year's total by 5%.
Number of HIM Program Graduates:
Year | HIM Total Graduates
2016-17 | 9
2015-16 | 16
2014-15 | 7
2013-14 | 16
2012-13 | 17

Target: [ ] Yes [ X ] No [ ] Partially In the HIM program the goal was not met. The HIM Program Director and faculty increased academic advising and planning from once a semester to twice a semester during Spring 2017 to assure they met the qualifications to graduate prior to applying. The program will begin to track students by 3-year cohorts to better monitor the attrition rate. The HIM Program Director and faculty will continue to monitor students who are enrolled in the program. Students who are on track to graduate within 3 years will continue to receive academic support, and students who are in the program for more than 3 years will be evaluated to determine the best course of action, program restart, completion, and graduation during the 2017-18 academic year. There are 18 courses in the HIM program and of these, 12 are offered online and/or on campus. All courses, whether online or campus based, are reviewed in the present assessment. The program will continue to transition applicable courses in the HIM curriculum to distance learning and/or hybrid courses to provide additional options for students at NOVA locations outside of the MEC and throughout Virginia, with projected completion revised to 2020 as funding provides. Since this goal is linked to the program attrition rate it will be modified. The program will revise this goal to state: NOVA’s HIM Program placed students will graduate within three years of placement. The new goal will be implemented in Fall 2020. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Horticulture Technology, A.A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed to prepare the students for full-time employment within the field of commercial horticulture as well as those presently employed who seek further knowledge and advancement. Graduates of the program are prepared for managerial/supervisory level positions in areas which include: landscape design and installation, grounds maintenance, floristry, greenhouse and nursery management, garden center operation, and sales and marketing in related industries.

Student Learning Outcomes / Evaluation Methods / Assessment Results / Use of Results

Correctly identify parts of a plant under microscope or as a sample.

Horticultural Botany HRT 127 Direct Measure: Students should successfully complete a comprehensive Lab Practical Final Exam at the end of the semester. Success is a score of 70% or higher out of 100%. There were 30 questions and two bonus questions. The questions all involved correctly identifying parts of a plant that were on display as actual samples, microscope images, or models that were seen previously in lab. Provided Rubric Criteria or Question Topics: A Sample Exam is Provided. There is no written test as models, microscopes, and specimen stations are set up that day. The answer sheet is hand written as we set-up the exam. The exam of 30 questions plus 2 bonus questions was administered covering the material from all labs. The questions were physical plant samples, microscope images or models of plants that were on display and involved in most cases a single word correct answer identifying the part of the plant. Sample Size (Write NA where not offered.)

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 18
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 18

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: A target success rate was the majority of the class achieving a 70% or better on the exam. Results by SLO Criteria: Grades were as follows (#s of students):

• 90-100+% - 9 • 80-89% - 1 • 70-79% - 2 • 60-69% - 2 • 50 -59% - 3 • Below 50% - 1

Sub-scores: 47.5% of the missed questions were on the microscope questions 1-9, 14-15 and 28-30. Of the students that took the exam, 66.7% achieved the target score or better; 33.3% did not meet the target.
Current results improved: [ x ] Yes [ ] No [ ] Partially
Of the students that took the exam, 66.7% achieved the target score of 70% or above. In 2015, 65% achieved the target score or better, compared to 61.5% in Fall 2014. Also, the share of 90-100% scores increased to 50% in 2017 from 15% in 2015. See below.

Percentage | 2017 | 2015 | 2014 | 2013
90-100 | 9 | 3 | 2 | 1
80-89 | 1 | 6 | 5 | 3
70-79 | 2 | 4 | 2 | 2
60-69 | 2 | 2 | 1 | 1
50-59 | 3 | 2 | 0 | 3
Below 50 | 1 | 2 | 4 | 1
No Show | 0 | 1 | 2 | 1
Strengths by Criterion/Question/Topic: Questions identifying parts of a plant in a sample or model.
Weaknesses by Criterion/Question/Topic: Questions identifying parts of a plant in a microscope image or under a microscope.

Previous action(s) to improve SLO: This SLO was assessed in Fall 2011, Fall 2012, Fall 2013, Fall 2014 and Fall 2015 by the previous program head. His results are included here in the assessment results from previous years. He implemented sample and practice laboratory exams. In Fall 2017 under his guidance, another faculty member taught the course and implemented weekly quizzes, but not specifically Lab practical quizzes. They also allowed slides and microscopes to be available for review throughout the semester. This practice was continued. Target Met: [ ] Yes [ ] No [ x ] Partially Based on recent results, areas needing improvement: Questions identifying parts of a plant in a microscope image or under a microscope. Current actions to improve SLO based on the results: After the results of the Fall 2017 assessment, the course was further redesigned to address the issues with the laboratory based questions. For Fall 2018, refinements of the actual lab worksheets and procedures was undertaken to further direct the lab learning outcomes. Additionally, instead of practice exams, two additional lab practical exams were added to make sure that the students were comfortable with the exam format and how the laboratory information related to the exams. This included giving two lab practical exams at 5-6 week intervals coupled with the midterm exams to better familiarize students with the



laboratory materials, microscope images and plant samples. In addition the laboratory worksheets will be discussed and reviewed in class after they are graded to help students review. The lab slides and microscopes are also still available for review throughout the semester. These practical exams and review techniques will hopefully improve the overall success rate. A new faculty member is teaching the class now and is implementing these changes. Next assessment of this SLO: We will re-assess and evaluate again in the Fall of 2018, based on these improvements.

Neatly draw and correctly label a landscape plan.

Planting Design I HRT 231
Direct Measure: Students were given an assigned view and design scenario for a take-home final drawing project. Each drawing is graded in relation to the base plan provided, the criteria of the design scenario, and the standard requirements of a landscape plan view (scale, spacing, hardscape, etc.). Students were also graded on accuracy in use of the sheet, line weight, creativity, and use of color and ink.
Provided Rubric Criteria or Question Topics: A sample project is included. The instructor was a new adjunct faculty member and did not have a written rubric; in the future a rubric will be provided. The final project was worth 100 total points.
Sample Size (Specify N/A where not offered):
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 14
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 14
* Dual-enrollment

Semester/year data collected: Fall 2017 Target: A target success rate was the majority of the class achieving a 70% or better on the project. Results by SLO Criteria: Grades were as follows (#s of students):

• 90-100+% - 2 • 80-89% - 3 • 70-79% - 3 • 60-69% - 0 • 0 -59% - 3 • Withdrawn – 3

Sub-scores: The point reductions were given by percentage (of total reductions) in the following categories (some students had numerous reductions):

• Plant Spacing/ Size: 36.3% • Plant and material quantities: 9.1% • Neatness and Miscellaneous:9.1% • Labels and Graphics: 45.5%

Of the students that completed the project (and did not withdraw from the class), 72.7% achieved the target score or better; 27.2% did not meet the target. Additionally, 3 students withdrew from the class and did not attempt this assessment.
Current results improved: N/A - This is a new SLO which was rewritten in Spring 2016 and had not yet been assessed.
Strengths by Criterion/Question/Topic: Plant lists: plant names, plant quantities and material quantities. Also the neatness and miscellaneous criteria.
Weaknesses by Criterion/Question/Topic: Plant size and spacing and appropriate graphics, especially labelling.

Previous action(s) to improve SLO: This is a new SLO which was rewritten in Spring 2016 and had not yet been assessed. This was done by the previous program head. There were no similar or analogous SLOs previous to this new addition. Target Met: [ x ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: The sub scores indicate that the main areas needing improvement are plant spacing and correct sizing and placement of plants based on these criteria. Also the area which may need the greatest improvement is graphics skills and especially proper labelling. Current actions to improve SLO based on the results: The adjunct faculty member will be contacted by the program head to brainstorm additional lab and studio work, or class activities which deal with these plant spacing and size issues as well as focus on graphics and labelling protocol and requirements. We will work to add design critique and assessment throughout the semester to address these areas. This will facilitate the learning of these components of landscape plan and prepare students for their final project. To be implemented in Fall 2019. Next assessment of this SLO: We will re-assess and evaluate again in Fall 2018, based on these improvements. The new



program head will assess this SLO. The new adjunct faculty member who is teaching this course in Fall 2018 is working on these recommended improvements.

Calculate areas and volumes of landscape features and amounts of required materials

Site Analysis HRT 230 Direct Measure: Students should successfully complete a comprehensive Final Exam at the end of the semester. Success is a score of 70% or higher out of 100% of the relevant questions relating to area, volume and geometric calculations. On the exam there were 5 questions that met the criteria. Three of these questions were simple multiple choice and two were multiple choice but with complex scenarios that required critical thinking to calculate the area. These 5 relevant questions on the final exam were worth 4 points each for a total of 20 points of the final exam. Provided Rubric Criteria or Question Topics: Sample Exam Provided. Sample Size (Specify N/A where not offered)

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 10
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 10
* Dual-enrollment

Semester/year data collected: Spring 2018
Target: The target success rate was a majority of the class achieving 70% or better on these questions.
Results by SLO Criteria: Score percentages (points correct out of 20 points total) for these questions were as follows (# of students):
Year | 90-100% | 80-89% | 70-79% | 60-69% | 0-59%
2018 | 1 | 6 | 0 | 1 | 2
Sub-scores: 50% of students missed the most complex scenario question involving calculation of area (Q. 13), while 0% missed the simple scenario question. Of the straightforward multiple-choice questions there was strong variation and no pattern observed. Of the students that completed the exam, 70% achieved the target score or better; 30% did not meet the target.
Current results improved: N/A - This is a new SLO which was rewritten in Spring 2016 and had not yet been assessed.
Strengths by Criterion/Question/Topic: Answering simple multiple-choice questions related to area, volume or slope.
Weaknesses by Criterion/Question/Topic: Calculating complex scenario questions, especially those with subtle criteria.

Previous action(s) to improve SLO: This is a new SLO which was rewritten in Spring 2016 and had not yet been assessed. This was done by the previous program head. There were no similar or analogous SLOs previous to this new addition. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: The subscores indicate that the best area to focus on would be the complex scenario questions which require interpretation of subtle criteria in order to correctly calculate the area in a landscape. Current actions to improve SLO based on the results: The adjunct faculty member will be contacted by the program head to brainstorm additional lab or worksheet activities which have complex or subtle scenarios. These will be assigned throughout the semester to facilitate the learning of these procedures and to prepare students for the test. To be implemented in Spring 2020. Next assessment of this SLO: We will re-assess and evaluate again in Fall 2018, based on these improvements. The new program head will assess this SLO. The new adjunct faculty member who taught this course in Spring 2017 and is teaching this course in the future (most likely Spring 2020) is working on these recommended improvements.


Core Learning Outcome / Evaluation Methods / Assessment Results / Use of Results

CLO: Critical Thinking [ x ] CT

History of Garden Design HRT 120
Direct Measure: Gardens and culture research paper. This project was assigned as a semester-long research paper, allowing students to think critically about the course material and to research and explore in depth a garden history topic related to a specific culture and period in history. Students were able to select any site and research and discuss it in the context of its gardens, landscape, and the social and cultural context in which it was built. The paper was graded to a rubric for a total of 50 points.
Provided Rubric Criteria or Question Topics: Research Paper Rubric is included.
Sample Size (Specify N/A where not offered):
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 21
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 21
* Dual-enrollment

Semester/year data collected: Spring 2018
Target: The target success rate was 75% of the class achieving 90% or better on the research paper.
Results by CLO Criteria: Grades were as follows (# of students):
Year | 90-100% | 80-89% | 70-79% | 60-69% | 0-59%
2018 | 18 | 3 | 0 | 0 | 0
Sub-scores: Students who received reduced points based on the rubric received point reductions, by percentage, in the following categories:

• Spelling and grammar: 33.3% • Research and references: 0% • Structure and quality of content: 28.6% • Also: No points reduced: 38.1%

All of the students completed the project; 85.7% achieved the target score or better, and 14.3% did not meet the target. Spelling and grammar was the largest source of point reduction in the research paper. These spelling and grammar skills are important, although not necessarily a direct measure of the critical thinking CLO. The research and references requirement was the lowest source of point reduction. Use of references and citation is an important part of critical thinking, and these sub-scores indicate an aptitude in this CLO. Structure and quality of content was the largest source of point reductions attributable to a critical-thinking-related subcategory. This would be an appropriate area to focus on to improve students' critical thinking CLO. Finally, the 38.1% of students who had no reduced points also indicate aptitude in the critical thinking CLO.
Current results improved: N/A - new assignment with no comparable direct measure

Previous action(s) to improve CLO if applicable: This is the first time that this specific class and assignment was used to assess this CLO. There is not a comparable previous measure. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: The subscores indicate that the best area to improve the critical thinking CLO for this assessment would be to emphasize quality of content and using structure for the representation of ideas in a research paper form. Current actions to improve CLO based on the results: The rubric will be revised with more details elaborating on the quality of content and structural requirements of the paper to better articulate the CLO critical thinking aspects of this assignment. To be implemented Spring 2019. Next assessment of this CLO: This course is offered every semester and this assignment will be given each time the course is offered. The results can be saved from each class and assessed on a 2-year cycle, but which includes every semester’s course results. A 2- year cycle would place the assignment as a CLO assessment for 2020-21. The course is taught by and will be assessed by the program head.


Strengths by Criterion/Question/Topic: Use of research, citation, and proper use of references.
Weaknesses by Criterion/Question/Topic: Structure and quality of content remain the weakest area relevant to this CLO.

Program Goals / Evaluation Methods / Assessment Results / Use of Results
To increase the number of program placed students in the program.

Program placed rates as recorded by OIR in report entitled Distribution of Program Placed Students by Curriculum and Award Type

Semester/year data collected: Fall 2017
Target: Number of program-placed students in each degree/certificate will increase by 2-5%.
Results for Past 5 Years:
Fall | Number of Students | Percentage Increased
2017 | 67 | 0
2016 | 67 | 0
2015 | 67 | -11.8%
2014 | 76 | -2.5%
2013 | 78 |

Target Met: [ ] Yes [ ] No [ x ] Partially
Comparison to previous assessment(s): Horticulture enrollment tends to fluctuate with the economy. We continue to encourage students to be program placed as they enter in the Fall and discuss it in our Introduction to Horticulture classes. Although our number of students remaining the same for the last three years does not meet our program goal, it is better than the negative trend of decreasing student numbers from 2013-15. Having a stable number of students over the past 3 years partially meets our goal.

Previous action(s) to improve program goal: Outreach within the NOVA campus community, especially via Student Services and the Counseling office. These actions began previously under different faculty directions but were revised in Spring 2018 with the hiring of a new full time faculty member and program head. This activity will continue with more of a concerted effort under this new leadership. Most recent results: No decrease or increase Results improved: [ ] Yes [ ] No [ x ] Partially Current action(s) to improve program goal: The faculty continue to promote the program/curriculum through local and national professional organizations and look at new areas of interest for courses including Arboriculture/Urban Forestry and the offerings for Viticulture starting in Fall 2015. In the past assessments the plan was for the program to schedule a meeting with the Student Services/Counseling staff to encourage earlier program placement. This was done by previous faculty, but did not seem to increase student placement. Current faculty, including adjunct faculty will be increasing the promotion of the program off campus to professional organizations and also to local high schools including the new Academies of Loudoun. Additional on-campus promotion of the program will be done at student club events, Spring into Fall (annually in Spring), and specific Horticulture Club activities on campus. Assessed: Annually

To encourage students to complete their A.A.S. degree in Horticulture Technology and/or Landscape Design Specialization

Graduation rates as recorded by OIR in report titled: Number of Graduates by Program and Specialization: 2017-18

Semester/year data collected: 2017-18

Previous action to improve program goal: Previous training of a new faculty member was done by former faculty.



Target: Program graduation totals will increase by 20%.
Results for Past 5 Years:
Academic Year | Number of Graduates | Percentage Increased
2017-18 | 8 | 100%
2016-17 | 4 | -50%
2015-16 | 8 | 33%
2014-15 | 6 | -68%
2013-14 | 19 |

Target Met: [ x ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): The Horticulture program graduation rates tend to fluctuate alongside enrollment. Although the trends tend to correlate, our most recent graduation rates have been growing compared to past academic years. Additionally, these graduation rates have most recently increased even though student enrollment has been flat over the past 3 years.
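The "Percentage Increased" column above is a year-over-year percent change from the prior year's count. A minimal illustrative sketch, not part of the report, using the graduate counts from the table (rounding to whole percentages is an assumption):

```python
# Illustrative only: year-over-year percent change, as in the "Percentage Increased" column.
graduates = [("2013-14", 19), ("2014-15", 6), ("2015-16", 8), ("2016-17", 4), ("2017-18", 8)]

for (prev_year, prev), (year, curr) in zip(graduates, graduates[1:]):
    change = (curr - prev) / prev * 100
    print(f"{year}: {curr} graduates ({change:+.0f}% vs. {prev_year})")
# 2014-15: 6 graduates (-68% vs. 2013-14)
# 2015-16: 8 graduates (+33% vs. 2014-15)
# 2016-17: 4 graduates (-50% vs. 2015-16)
# 2017-18: 8 graduates (+100% vs. 2016-17)
```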

With the hiring of the new program head, additional training from counseling and advising and the new MSTB advising manager will be undertaken. This will allow a better approach to the previous goal of conducting an annual review with each student alongside the other full-time faculty member.
Results improved: [ ] Yes [ x ] No [ ] Partially
Current actions to improve program goal: Graduation rates fluctuate, as the average Horticulture student takes longer than two years to complete their degree due to work, family and other reasons. Many more students finished in 2017-18 as compared to the previous year. Full-time faculty members will implement a much more rigorous advising process. These program advisors will seek additional training from the new MSTB advising manager (FAM) and have already each met with this advising manager. The new Navigate system will be used to better advise students. Faculty continue to encourage students to complete their degrees. Additional reorganization will allow for further consultation with the Dean and Associate Dean. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Hospitality Management, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. HRI Program Purpose Statement: The curriculum is designed to enable the student to enter management and management training positions in the hospitality industry and for those presently employed who desire updating in the field.

Student Learning

Outcomes / Evaluation Methods / Assessment Results / Use of Results

Students will compare the various sectors of the hospitality industry and recognize the unique characteristics and attributes of each and their effect on quality of work life

Introduction to Hospitality HRI 101; Introduction to Hospitality II HRI 102
Data were collected from the 3 sections of Fall HRI 101 and 2 sections of Spring HRI 102 during the Fall 2017 and Spring 2018 semesters. There was one faculty member and 103 students. Only taught on the Annandale Campus.
Direct Measure: Students will answer 3 specific exam questions identifying the sectors of the hospitality industry and the expected weekly workload and salary expectations in the hospitality industry.
• Question #1 - Identifying the sectors of the hospitality industry. A correct response is 3 or more identified sectors.
• Question #2 - Salary range for management trainees. Only one correct answer.
• Question #3 - Weekly schedule for managers. Only one correct answer.

Sample:

FALL 2017-HRI 101 Data HRI 101-002N TR # Students=21

FALL 2017-HRI 101 Data HRI 101-036N R # Students=25

FALL 2017-HRI 101 Data

Semester/year data collected: Fall 2017 and Spring 2018 Achievement Target: 80% correct response rate on the questions. Specific Questions’ Achievement Rates:

• Q#1 - 79% correct response rate; 26% did not even answer the question
• Q#2 - 50% correct response rate; 18% did not even answer the question
• Q#3 - 50% correct response rate; 17% did not even answer the question

Results: No one met our target of 80% correct response. The first question requiring a list of sectors had the highest rate of response at 79%. There is a high confidence since students had to list the sectors, versus a multiple choice question. Question #1: In the Fall HRI 101- 3 sections, with a total of 61 students: 15 students were able to answer the question by listing all 5 of the Primary Sectors of the Hospitality Industry. However, 11 Students were beginning to list at least 4 Primary Sectors of the Hospitality Industry. Another 11 Students were able to list at least 4 of the Primary Sectors of the Hospitality Industry. There were 2 students who could only list 2 of the Primary Sectors of the Hospitality Industry and finally, there was 2 students, who could list only 1 of the Primary Sectors of the Hospitality Industry. The interesting data was the 20 Students who did not know even 1 of the Primary Sectors of the Hospitality Industry, or perhaps they chose not to answer, which is also interesting data.

In the Spring HRI 102- 2 sections, with a total of 42 students: 6 students were able to answer the question by listing all 5 of the Primary Sectors of the Hospitality Industry. However, 16 Students were able to list at least 4 Primary Sectors of the Hospitality Industry. Another 12 Students were able to list at least 4 of the Primary Sectors of the Hospitality Industry. There was 1 student who could only list 2 of the Primary Sectors of

Previous action to improve SLO: Since the last assessment, faculty incorporated more lecture materials, role playing and salary surveys to help students develop a stronger understanding of the industry. We used salary surveys from the National Restaurant Association and the American Hotel and Lodging Association.
Based on recent results, areas needing improvement: We continue to fall short on explaining salary and work environment expectations in our industry. More time will be devoted to these subjects in all the introductory classes in hospitality, beginning in Fall 2018.
Next assessment of this SLO: Fall 2018 and Spring 2019


HRI 101-001N MW # Students=15

Spring 2018-HRI 102 Data HRI 102-040N W # Students=20

Spring 2018-HRI 102 Data HRI 102-001N MW # Students=22

the Hospitality Industry and finally, there were 0 students who could list only 1 of the Primary Sectors of the Hospitality Industry. There was another 7 Students who did not know even 1 of the Primary Sectors of the Hospitality Industry, or perhaps they chose not to answer, which is also interesting data.

In the Fall HRI 101- 3 sections, with a total of 61 students and the Spring HRI 102- 2 sections, with a total of 42 students: 21 students were able to answer the question by listing all 5 of the Primary Sectors of the Hospitality Industry. However, 27 Students were able to list at least 4 Primary Sectors of the Hospitality Industry. Another 23 Students were able to list at least 4 of the Primary Sectors of the Hospitality Industry. There were 3 students who could only list 2 of the Primary Sectors of the Hospitality Industry and finally, there were 3 students, who could list only 1 of the Primary Sectors of the Hospitality Industry. There was another 27 Students who did not know even 1 of the Primary Sectors of the Hospitality Industry, or perhaps they chose not to answer, which is also interesting data. QUESTION #2: In the Fall HRI 101- 3 sections, with a total of 61 students: 28 Students chose the correct response.

Class | A | B | C | D | No Answer
HRI 101 (61 students, 3 classes) | 1 | 28 | 13 | 2 | 17
HRI 102 (42 students, 2 classes) | 1 | 24 | 12 | 3 | 2
HRI 101 & HRI 102 | 2 | 52 | 25 | 5 | 19
QUESTION #3:
Class | A | B | C | D | No Answer
HRI 101 (61 students, 3 classes) | 4 | 15 | 26 | 0 | 16
HRI 102 (42 students, 2 classes) | 1 | 12 | 25 | 2 | 2
HRI 101 & HRI 102 | 5 | 27 | 51 | 2 | 18



The new assessment target for 2018-2019 will be 75%. Faculty feel our expectations were too high initially.

Students will illustrate the proper use and care of commercial food production equipment

Principles of Food Preparation HRI 120
Direct Measure: Students will use commercial equipment in the HRI commercial kitchen to prepare a recipe from scratch. Instructor and kitchen chef will note on a grading rubric if proper use and cleanup of the equipment occurred. Observation was conducted by instructors and the chef instructional assistant during the final practical exam for HRI 120 students. The grading rubric for proper equipment usage and care is below:
Criterion | Score 0 | Score 1
Equipment turned on; correct temps used while cooking | Not properly preheated; poor temp control while cooking | Preheated; correct temp control during cooking
Equipment cleaned properly | Left dirty equipment and/or equipment not adequately cleaned or equipment left on | Equipment was properly cleaned, turned off, and no dirty pans were left

Sample: Data was collected from 2 out of 2 sections. One faculty and 32 students.

Semester/year data collected: Fall 2017 Achievement target: 90% Results:

• HRI 120 Sec 01: 12 students; 11 students successfully used and cleaned commercial equipment

• HRI 120 Sec 02: 20 students; 16 students successfully used and cleaned commercial equipment

Total points earned:
Criterion | Number of Students Failing | Number of Students Passing
Equipment turned on; correct temps used while cooking | 2 | 30
Equipment cleaned properly | 3 | 29

Success rate: The rubric scores were pass/fail. In HRI 120, 27 out of 32 students observed were successful in the use and cleaning of commercial equipment, an 84% success rate. This is a dramatic drop from the previous assessment in 2015, which had a success rate of 96%. Our goal was not met for all classes; students achieved 84%. The last assessment of this SLO was Fall 2015, when the success rate was 96%, and the current result fell short of that by more than 10 percentage points.

The success rate of 84% fell below our target of 90%. This also was much lower than the previous assessment in 2015. There was a drop of over 10% points. The failure rate appeared to be higher because the students failed to adequately clean the equipment. There was a stronger understanding of how to use the equipment. Students were assessed with a Rubric score that was a simple pass/fail. Faculty were disappointed in the low cleaning scores. Additional time will be spent in the classroom training and working with students each week on how to thoroughly take care of commercial equipment. We will continue to monitor correct use of equipment. The 2 students who failed in this area cooked at too high of a temperature. There was a correlation with students who failed the cleaning and use of equipment; they were the same students who had not enrolled in the Sanitation Course. The curriculum tells the students to take the classes at the same time; however, not all students are full time students. Sanitation will be stressed in all classes working in the kitchen. Next Assessment: Fall 2018. We will use the same achievement goal of 90%.

Students will apply approved food handling/safety standards in the preparation, service and storage of food.

Sanitation and Safety HRI 158
Direct Measure: Results of the national standardized National Restaurant Association (NRA) ServSafe Food Managers Certification Exam. The exam is a numbered, highly controlled exam booklet sent only to our certified instructor, who is able to administer the exam.

Semester/year data collected: Fall 2017 and Spring 2018 Achievement target: A 75% pass score is required by NRA to obtain the ServSafe certification and our 2017-18 goal is that 75% of our students will meet that target and successfully pass the exam. Results: Results of the NRA ServSafe exam at NOVA are the combined results of 3 sections.

2017-18 student success rates improved from the previous year. For this year the certification pass rate was 75% compared to last year with only 71%. The exam content has not changed but every year new questions are added while others are removed. Faculty does not get a copy of the exam so we don’t teach to the exam. We teach to cover best food handling practices.



For security purposes, copies of the exam are not available for reference. The link for the ServSafe exam overview: www.ServSafe.com. Results from the NRA exam are in Attachment C.
Exam sections:
D-1 Implement Food Safety SOPs
D-2 Employee Hygiene & Health
D-3 Receipt, Storage and Transport
D-4 Food Prep, Display & Service
D-5 Compliance with Regulatory
Sample: All HRI 158 courses during both Fall 2017 and Spring 2018: 3 sections, 1 faculty member, 43 students. The three sections are taught on the Annandale campus.

Exam Pass Rates Compared to Previous Years:
Year | 2017-18 | 2016-17 | 2015-16 | 2014-15 | 2013-14
Pass Rate | 75% | 71% | 70% | 52% | 78.5%
Summary of average raw scores per skill section:
Skill Section | 2013-14 | 2014-15 | 2015-16 | 2016-17 NOVA | 2016-17 Dual Enroll | 2017-18
D-1 Implement Food Safety SOPs | 83.5 | 80.9 | 80.3 | 76 | 79.5 | 80
D-2 Employee Hygiene & Health | 74.5 | 82.9 | 86.0 | 82 | 87 | 94.7
D-3 Receipt, Storage and Transport | 76.0 | 79.0 | 68.9 | 77 | 93 | 68.8
D-4 Food Prep, Display & Service | 71.0 | 78.0 | 78.2 | 74 | 76 | 77.5
D-5 Compliance with Regulatory | 87.1 | 77.3 | 80.2 | 81 | 82 | 76.9

Score summary: 32/43 NOVA students successfully passed the exam, with an average passing score of 83.4%. This is an improvement from last year's average passing score of 82.3%. The 11 students who failed all missed passing by just a few points (four or fewer). NOVA students achieved the goal for the first time since 2013; our target goal was met. The area of greatest improvement was D-2 Employee Hygiene and Health. The areas needing the most improvement are D-3 Receipt/Storage of food and D-5 Compliance with Regulatory guidelines.

The cohort this year was only 43 students, compared to 66 students last year. Enrollments have declined substantially due to the overall decline in community college enrollments at NOVA. We will continue to track this cohort next year. We improved significantly, by 7 percentage points, in Employee Hygiene, which is a critical area in which to maintain competence. New class engagement games and activities contributed to the overall improvements. We will reassess in Fall 2018. Exam sections are being expanded, and we will look at new areas in 2018, including D-6 Cleaning and Sanitation and D-7 Facility and Equipment. The achievement target for 2018-19 will continue to be that 75% of our students will meet the target and successfully pass the exam. Faculty will motivate students to attend class regularly and provide weekly quizzes for a grade to emphasize and review sanitation principles, especially food prep and display standards, to try to improve this skill area. These actions will be implemented beginning in the Fall 2018 semester. Next Assessment: Fall 2018

Students will describe and apply the four functions of management: plan, organize, lead and control.

Principles and Applications of Catering HRI 256
Direct Measure: Students will be observed planning, organizing, leading and controlling a catered lunch function. Each student assumed the role of a manager, which included adequate planning, organizing, leading and controlling the preparation, execution and post-evaluation of a catered lunch for 36-75 guests on campus.

Semester/year data collected: Spring 2018
Achievement target: Students will score at least 20 points out of 25.
Results:

Faculty wanted to see if our graduates were prepared with the skills for success in the workplace. We assessed an upper level capstone course, HRI 256 Principles and Applications of Catering, which is the last course students take prior to graduation. Faculty worked directly with students to apply all the functions of management in the execution of an actual catered lunch. The assessment was more


[ X ] CT

A written project checklist will be tallied as functions are completed. Sample: HRI 256 had 1 faculty member and 21 students. Only taught on the Annandale campus.

Individual Manager Scores (based on a maximum of 25 points) and grades:
24 (A), 24 (A), 23.5 (A), 23.5 (A), 23.5 (A), 22.5 (A), 22 (B), 22 (B), 21.5 (B), 21.5 (B), 21 (B), 21 (B), 20.5 (B), 20 (B), 20 (B), 19.5 (C), 19.5 (C), 19.5 (C), 18.5 (C), 17.5 (C), 17 (D)

Students scored an average of 21.04 points out of 25 on their leadership skill and performance. The range of scores was 17-24. 28% of the class scored below the achievement target. However, for final grades, only one student performed below average. This is the second time the HRI 256 catering class was used as the platform for this SLO assessment. Previously students’ management skills had a greater success rate which would have been 84% compared to our current success rate of 73%. This discrepancy may be due to a new instructor teaching a course, when a different perspective on grading is used. HRI 256 is a capstone course that encompasses all of the courses taught in Hospitality Management, from food costing to human resource management to cooking to marketing, menu planning, etc. Compared to the previous assessment in 2017 of this objective, results of this catering course revealed a lower level of competency in the actual practice of management functions. There was an 73% success rate. Our success rate in DEMONSTRATING management functions fell from the previous assessment, but a different instructor may be the reason.

than theory – students assumed a management role. Faculty felt this was a more accurate way to evaluate the learning outcome. Previously, students simply answered test questions and listed the 4 functions of management. Evaluating performance is far more difficult, but a Rubric was used. See Attachment A. Results will be used to improve this assessment method by designing a well-defined grading rubric and dividing the management tasks into the four function areas. This should direct the students to specific management actions needed for success. Results also indicated that we need to look at the 27% who fell short of the goal. Faculty will brainstorm on how we can better prepare students for management roles over the course of the upcoming spring 2019 semester. Faculty will screen ALL enrolled students to ensure they have the pre-requisites for this course. Students taking the course, without completing all other HRI requirements, most likely are lacking the skilled management skills they need to succeed in this class. Changes to be implemented in Spring 2019. Student screening will be implemented in Spring 2019. We will monitor students who do not have all the pre-requisites. Current data was from a small sample size of 21 students. We need to apply the new matrix using another HRI 256 catering class when the SLO is reassessed in 2019. The achievement target was met and exceeded. Next Assessment: Will reassess in Spring 2019 to compare results using the new rubric.

Program Goals / Evaluation Methods / Assessment Results / Use of Results

To encourage students to complete their AAS degrees and certificate programs in Hospitality Management.

Track the number of students graduating. Data: Number of NOVA Graduates by Degree and Specialization: 2017-2018, OIR publication No. 44-18, August 2018.

Achievement target for AAS degree Graduates: 30 or more graduates. Number of HRI AAS Graduates:


Previous actions to improve program goal: Last year’s actions to improve graduation rates included increased faculty advising; visitations by the assistant dean to classes at the start of the Fall 2017 semester, and teaching faculty using



Year | Number of Graduates
2011-12 | 22
2012-13 | 24
2013-14 | 27
2014-15 | 26
2015-16 | 26
2016-17 | 17
2017-18 | 23

We fell short of the target with 23 graduates; however, compared to the year prior, when we only had 17 graduates, we have improved. College enrollments are down in general and we are seeing the decline in fewer students in the classrooms. New achievement target for next year: Maintain number of AAS degree graduates in 2018-19. Achievement target for Certificate Graduates: Maintain number of certificate graduates in 2017-18. Number of HRI Certificate Graduates:

Year       Number of Graduates
2012-13            14
2013-14            17
2014-15            19
2015-16            18
2016-17            12
2017-18            10

We missed this target slightly, with a drop of only 2 graduates this year. With only two certificate programs still active, we see fewer graduates with certificates; even the popular Culinary Arts certificate has seen its graduates decline. New achievement target: maintain the number of certificate graduates in 2018-19.

classroom time to encourage students to register for classes and meet with their advisor every subsequent semester. The Assistant Dean and two full-time faculty members were involved in the advising portion during the 2017-18 academic year. AAS graduates rose from 17 to 23, while certificate graduates fell from 12 to 10 compared with the previous year. The results will be used to address enrollment declines in our programs and to determine why students are not finishing their degree programs. We will also use the results to focus on more frequent academic advising with individual students. Both the Culinary Arts and Meeting Planning certificate programs are attracting fewer students as well. We will continue marketing the culinary arts and hospitality programs in six area high school academies in Fall 2018. A career day is also being planned with George Mason University for all high school students in Northern Virginia next spring. We will increase advising opportunities for our students and encourage degree completion beginning Fall 2018. Faculty will make announcements in their classes, and the assistant dean will make classroom visits in Fall 2018 to encourage students to see their faculty advisors AND provide a curriculum update. We will continue to work closely with students to see that they complete their degrees in a timely fashion AND APPLY for graduation. All faculty announced to their classes that faculty advisors need to be seen each semester for guidance; this is encouraged at the start of each semester, prior to "Advising Week," and at the end of each semester. All faculty will participate in additional advising to keep students focused and on track. Assessed: Annually

To encourage students to become program placed in AAS degree and certificate programs in Hospitality Management

Track the number of students program placed in hospitality programs. OIR report: DISTRIBUTION OF PROGRAM PLACED STUDENTS BY CURRICULUM AND AWARD TYPE, Fall 2013 through Fall 2017

Achievement target for HRI AAS: 190 program placed students. HRI AAS degree program placed students:

Term        Program Placed Students
Fall 2017            175
Fall 2016            191
Fall 2015            213
Fall 2014            221
Fall 2013            229

Previous actions to improve program goal: Last year’s actions to improve recruitment of new students and retain current students included high school visitations by Chef Mike in Fall 2017; presentations at career fairs by the Assistant Dean in Summer and Fall 2017; and increased availability to work with students during Summer 2017 when 9-




The target was not met; in fact, we saw a decline from the previous year (from 191 to 175 program placed students). Total college enrollment declined about 4%, so this was not surprising. New achievement target: maintain or increase to 170 program placed students in HRI. Achievement target for HRI certificates: maintain or increase the number of certificate program placed students. HRI certificate program placed students:

Term        Total   Culinary Arts   Meeting Planning   Food   Hotel
Fall 2017     41          32                9           d/c     d/c
Fall 2016     41          34                7           d/c     d/c
Fall 2015     73          66                7             0       0
Fall 2014     82          66               15             1       1
Fall 2013     95          75               18             2       2
Fall 2012     98          78               16             4       4
(d/c = discontinued)

The target was not met; Culinary Arts, however, saw only a minor decline of 2 students. The Meeting Planning certificate program target was met; in fact, that program increased by 2 students, which is a good sign. However, these low numbers still make it hard to grow the programs, since there are often not enough students to run the classes. New achievement target: maintain or increase the number of certificate program placed students. Focus on marketing the Meeting Planning certificate program to potential students who are already working within the industry and are looking for a certificate as part of their continued personal growth and development.

month faculty are off contract. The faculty also used classroom time to encourage students to register for classes and meet with their advisor every semester. The Assistant Dean and two full-time faculty members were involved in the advising portion during the 2017-18 academic year. This goal was first assessed in 2014-15, and the historical perspective over the past six years shows a slow but steady decline in HRI program placements. We hope to stabilize the number of students placed in our programs; this may be ambitious since college enrollment overall has declined over the past year. These results are alarming at a time when training for the hospitality industry, and especially for restaurants, is in great demand. We will use the results to: 1. Continue recruiting from the high schools by

visiting all the Fairfax County hospitality and culinary academies.

2. Encourage networking with industry professionals to send employees to NOVA for skill enhancement and management training

3. Co-host a high school career fair along with George Mason University to market the NOVA hospitality programs. The event is scheduled for Spring 2019.

The decline in culinary school enrollment across the country reflects this same trend. Culinary careers are hard work, and students are not attracted to the weekend, night, and holiday work requirements, long hours, and often low pay scale. In addition, when the economy is good, students have many job opportunities and are not returning to college for skill updates. Over the past year, the shortage of chefs, cooks, and foodservice workers has only increased in the DC area. Students can easily find a job in foodservice and culinary work; hotels, restaurants, and food production sites are desperate for workers. Faculty and advisors have used this data to encourage students to declare their major early in their studies. Announcements will be made at the start of each semester, during advising week, and at the end of each semester; reminders may help prompt students to commit to a degree. Faculty will



also get involved in career fairs and high school visits. NOVA experienced a decline in total student enrollment during 2017-18, and we saw this in Hospitality as well. Faculty are very concerned and are brainstorming on how to find and engage new students for the program. Faculty will participate in career fairs, high school visits, and the joint GMU Career Day in Fall 2018 and Spring 2019. The achievement target was not met for AAS or certificate placements; we see this not only as a program issue but as a college-wide issue. Assessed: Annually



Annual Planning and Evaluation Report: 2017-18

Information Systems Technology, A.A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. A.A.S. in Information Systems Technology Program Purpose Statement: This curriculum is designed for those who seek employment in the field of information technology, for those who are presently in that field and who desire to increase their knowledge and update their skills, and for those who must augment their abilities in other fields with knowledge and skills in information technology.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Be able to design problems using procedural and object-oriented design and implement sequence, selection, and loop structures within the design solution.

Java Programming I ITP 120 Direct Measure: Students were assessed based on five (5) multiple-choice questions and five (5) design problems included with the course common final exam given across the Annandale, Alexandria, Loudoun, Manassas and Woodbridge campuses and ELI. Data collection: Faculty are given files that include all questions for the common final exam, answers to the questions, a spreadsheet for the results, and a grading rubric. The files are stored on the IT Cluster Blackboard site. Each faculty member is given access to the Blackboard site and can print copies of the exam from the Blackboard site. The exam questions are graded using the common rubric to ensure consistency across the campuses. The results from the exam are stored in the spreadsheet. The exam questions must be given in a closed book, closed notes, proctored environment. The results for each question are given to the IT Assistant Dean for the campus. The IT Assistant Dean tallies the numbers for the campus and forwards the results to the faculty in charge of the IT Cluster. The faculty in charge of the IT Cluster creates one tally of the results for the NOVA IT departments on all five campuses. Sample: There were 11 sections of ITP 120, including ELI sections. Of these 11 reporting sections, a total of 184 students were assessed.
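For illustration only – the common final exam questions themselves are not reproduced in this report – the following is a minimal Java sketch of the kind of design problem this SLO targets, combining an object-oriented design with sequence, selection, and loop structures. The class, data, and values used here are hypothetical.

    // Hypothetical example of a small object-oriented design that uses
    // sequence (statements in order), selection (if/else), and a loop (for).
    public class GradeBook {
        private final int[] scores;

        public GradeBook(int[] scores) {
            this.scores = scores;
        }

        // Loop structure: iterate over all scores to accumulate a total.
        public double average() {
            int total = 0;
            for (int score : scores) {
                total += score;
            }
            return scores.length == 0 ? 0.0 : (double) total / scores.length;
        }

        // Selection structure: choose a letter grade based on the average.
        public char letterGrade() {
            double avg = average();
            if (avg >= 90) {
                return 'A';
            } else if (avg >= 80) {
                return 'B';
            } else if (avg >= 70) {
                return 'C';
            } else {
                return 'F';
            }
        }

        public static void main(String[] args) {
            // Sequence structure: create the object, then call its methods in order.
            GradeBook book = new GradeBook(new int[] {88, 92, 79});
            System.out.println("Average: " + book.average());
            System.out.println("Grade: " + book.letterGrade());
        }
    }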

Semester/year data collected: Fall 2017 Target: 70% of the students answering the questions correctly. Results:

SLO Grade      Percentage of Students Receiving Grade – Fall 2017
A (90-100)          10.33%
B (80-89)           19.02%
C (70-79)           17.39%
D (60-69)           11.96%
F (Below 60)        21.74%
C or Better         46.74%

Students demonstrated a passing rate of 46.74% on the questions. Largely, this was due to the short-answer nature of the exam, compared to the higher success rates typically seen with multiple-choice questions. Otherwise, the assessment results were equally poor across all of the reporting campuses, with no distinctive differentiation in scores. Comparison to previous assessment: Because this is a new assessment project, there are no past assessments to compare results against. Weaknesses of the ITP 120 course SLO assessment: It is more difficult to design a rubric for the ITP 120 course; the SLO assessment questions' grading rubric must be very detailed, and instructors must follow the rubric to correctly compare the results. It is very time consuming to grade the ITP 120 exams within the amount of time that faculty have to submit the results for the exam and post the final course grades.

Posted the results of the SLO assessment for faculty on the IT Cluster Blackboard site. Target Met: No. Actions to Improve SLO: Assistant deans will continue to discuss assessment areas for improvement based on the scores. Emphasis will be placed on results broken down by campus, to include ELI and dual enrollment sections. Develop a strategy to nest the SLO into a program learning objective to reduce the amount of assessment needed. Results are to be broken down by question/topic in the next assessment. In addition, data will be divided for online and off-site dual-enrolled sections. When will the improvements take place: The improvements will take place during the Fall 2020 semester. Next Assessment: Fall 2020



Be able to identify correct syntax and logic in a programming language.

Java Programming I ITP 120 Direct Measure: Students were assessed based on five (5) multiple-choice questions and five (5) design problems included with the course common final exam given across the Annandale, Alexandria, Loudoun, Manassas and Woodbridge campuses and ELI. Data collection: Faculty are given files that include all questions for a common final exam, answers to the questions, spreadsheet for the results, and a grading rubric. The files are stored on the IT Cluster Blackboard site. Each faculty member is given access to the Blackboard site and can print copies of the exam from the Blackboard site. The exam questions are graded using the common rubric to ensure consistency across the campuses. The results from the exam are stored in the spreadsheet. The exam questions must be given in a closed book, closed notes, proctored environment. The results for each question are given to the IT Assistant Dean for the campus. The IT Assistant Dean tallies the numbers for the campus and forwards the results to the faculty in charge of the IT Cluster. The faculty in charge of the IT Cluster creates one tally of the results for the NOVA IT departments on all five campuses. Sample: There were 11 sections of ITP 120, including ELI sections. Of these 11 reporting sections, a total of 184 students were assessed.

Semester/year data collected: Fall 2017 Target: 70% of the students answering the questions correctly. Results:

SLO Grade      Percentage of Students Receiving Grade – Fall 2017
A (90-100)          10.33%
B (80-89)           19.02%
C (70-79)           17.39%
D (60-69)           11.96%
F (Below 60)        21.74%
C or Better         46.74%

Students demonstrated a passing rate of 46.74% on the questions. Largely, this was due to the short-answer nature of the exam, compared to the higher success rates typically seen with multiple-choice questions. Otherwise, the assessment results were equally poor across all of the reporting campuses, with no distinctive differentiation in scores. Comparison to previous assessment: Because this is a new assessment project, there are no past assessments to compare results against. Weaknesses of the ITP 120 course SLO assessment: It is more difficult to design a rubric for the ITP 120 course; the SLO assessment questions' grading rubric must be very detailed, and instructors must follow the rubric to correctly compare the results. It is very time consuming to grade the ITP 120 exams within the amount of time that faculty have to submit the results for the exam and post the final course grades.

Posted the results of the SLO assessment for faculty on the IT Cluster Blackboard site. Target Met: No. Fall 2017 – Areas to be improved: Work towards a better understanding of the areas where the students scored below the 70% target. Reinforce the topics covered in the ITP 100 class during the ITP programming course. Current Actions to Improve SLO: Assist the faculty with a better understanding of the topics listed in the course content summary. The Campus Coordinator will discuss the SLO assessment areas for improvement with the faculty teaching the ITP 120 class. Results are to be broken down by question/topic in the next assessment. In addition, the data will be divided for online and off-site dual-enrolled sections. Strategies for improvement in covering the topics in the classroom will include:
• Improve the rubric for the ITP 120 course.
• Give faculty ample time to grade the ITP 120 exam.
• Add the SLO to dual enrollment classes.
When will the improvements take place: The improvements will take place during the Fall 2020 semester. Next Assessment: Fall 2020

Be able to describe memory types and allocation methods

Introduction to Telecommunications ITN 100 Direct Measure: Students were assessed based on seven (7) multiple choice questions, three (3) matching questions and one (1) order question. Faculty at all 5 campuses provided these questions at the time the final exam was given. As the questions addressed the single topic of OSI reference model and layers, scores were not broken down individually.

Semester/year data collected: Fall 2017. Target: Students should answer questions with a 70% accuracy rate; this is consistent with CompTIA exam standards. Results:

SLO Grade      Percentage of Students Receiving Grade – Fall 2017
A (90-100)          16.27%
B (80-89)           12.70%
C (70-79)           14.69%
C or Better         43.66%

Posted the results of the SLO assessment for faculty on the IT Cluster Blackboard site. Target Met: No. Fall 2017 results of this assessment revealed significant issues with this topic. Actions for improvement: It may be more appropriate to structure some of these answers as multiple choice for the Spring 2019 assessment. To further promote



Data collection: Results were provided by faculty for 13 sections at the Alexandria, Loudoun, Annandale, Manassas, and Woodbridge campuses. ELI courses did not report, although they received copies of the assessment and instructions. Sample: Of these 12 reporting sections, a total of 252 students were assessed.


Students demonstrated an accuracy rate of 43.66% on the questions. Largely, this was due to the short-answer nature of the exam, compared to the higher success rates typically seen with multiple-choice questions. Because this is a new assessment project, there are no past assessments to compare results against.

consistent course content across the five NOVA campuses offering the IT program, a common textbook and syllabus will be used in the course starting with the Fall 2018 semester. Results will be broken down by campus (including ELI and dual enrollment sections) and by question/topic in the next assessment. In addition, data will be divided for online and off-site dual-enrolled sections. When will the improvements take place: The improvement will take place during the Spring 2019 semester. Next Assessment: Spring 2019 semester

Be able to identify terminology, correct syntax, and appropriate use of graphics, animation, and XHTML for a successful multimedia website.

Multimedia Software ITE 170 Direct Measure: Students were assessed based on a common final exam given across all campuses and ELI containing fifty (50) multiple-choice questions relating specifically to identifying terminology, correct syntax, and appropriate use of graphics, animation, and XHTML for a successful website. The topics for the multiple-choice questions directly relate to the SLO that assesses the content of the entire course. The questions are used to determine if students completing the course are prepared for further study in web page design.

Topics for Student Learning Outcome questions:
1. Demonstrate the ability to plan and organize a multimedia web site.
2. Understand the design concepts for creating a multimedia website.
3. Use a web authoring tool to create a website.
4. Understand the design concepts related to creating and using graphics for the web.
5. Use graphics software to create and edit images for the web.
6. Understand the design concepts related to creating and using animation, audio and video for the web.
7. Use animation software to create and edit animations for the web.
8. Use software tools to publish and maintain a multimedia web site.

Semester/year data collected: Spring 2018 Target – multiple choice questions: 70% of students answering the question correctly. Results: The table below shows the percentage of students across all campuses and ELI who received a specific final exam grade on the common final exam during the Spring 2017 and Spring 2018 semesters.

Final Exam Grade   % of Students – Spring 2017   % of Students – Spring 2018
A (90-100)                  15.00                        48.37
B (80-89)                   35.45                        19.06
C (70-79)                   25.65                        10.70
D (60-69)                    5.51                         4.65
F (Below 60)                 0.00                         5.58
C or Better                 81.61                        78.13

Comparison to previous assessment: The questions are the same questions that were used during the Spring 2016 – Spring 2017 academic year. It would be best to look not just at the overall average but at averages for individual questions/problems; using those averages, the program can determine the topics most in need of attention to increase student learning. There were more students earning an A in the class during the Spring 2018 semester than in the Spring 2017 semester. The difference in the number of students earning an A may be due to the instructors focusing more

Previous Actions to Improve SLO: With the decrease in the amount of time students are given to complete the final exam (1 hour and 40 minutes instead of 1 hour and 50 minutes), it is difficult to add requirements to the exam. The ITE 170 faculty added a semester project to the course instead of adding a project to the final exam. Target Met: [ X ] Yes [ ] No [ ] Partially. Most Recent Results: During the Spring 2018 semester, 78.13% of students in the ITE 170 class earned an A, B, or C in the class; during the Spring 2017 semester, 81.61% of students did. During the Spring 2018 semester, 48.37% of students earned an A in the class. The increase in the number of students earning an A in the course may be due to the change in how the ITE 170 course is taught: the emphasis has shifted from testing definitions to understanding how to use the material learned in the class to solve a problem. Current Actions to Improve SLO: During the Fall 2018 semester, the IT faculty will create CASE-based problem-solving SLO assessment questions instead of solely using multiple-choice questions to assess the course. The new SLO



Data collection: The questions for the ITE 170 SLOs are given to all sections of ITE 170 across the Annandale, Alexandria, Loudoun, Manassas and Woodbridge campuses and ELI. The questions are included in a common final exam. The common final exam is stored on the IT Cluster Blackboard site. Each faculty member is given access to the Blackboard site and instructions on how to copy the electronic version of the common final exam from the IT Cluster Blackboard site into the Blackboard site for their ITE 170 course. If the faculty member does not have access to a computer lab to give the final exam, a paper version of the exam is available. The electronic version of the exam is self-graded. Instructors are responsible for grading the paper version of the exam. The IT Cluster Blackboard site provides the instructors with a grading rubric and spreadsheet to report the results. The exam questions must be given in a closed book, closed notes, proctored environment. The results for each question are given to the IT Assistant Dean for the campus. The IT Assistant Dean tallies the numbers for the campus and forwards the results to the faculty in charge of the IT Cluster. The faculty in charge of the IT Cluster creates one tally of the results for the NOVA IT departments on all five campuses. Sample: Ten (10) sections of ITE 170 with a combined two hundred fifteen (215) students responding to the five multiple choice questions administered at the end of the semester with the course final exam.

on problem-solving skills by giving more projects in the class instead of having students learn terms and definitions. Multiple-choice questions – students met the 70% success rate for the course. The SLO for this course determines whether the student earns an A, B, or C in the course. Strengths of the SLO Assessment: It is easier to grade and compare the results from multiple-choice SLO questions than from CASE study questions; faculty need to develop and use a more detailed rubric to assess CASE study questions. The SLO assessment format tests the students' understanding of the course material. Weaknesses of the SLO Assessment: The current data collection does not include a question that asks the students about their program placement. Adding such a question would allow the faculty to determine if the students in the A.S. in IT degree program are meeting the SLO expectations; the expectation for the A.S. in IT degree program is for students to learn the skills needed to be successful when they transfer to an IT program at a four-year institution/university. Multiple-choice questions are not an appropriate method to assess whether the student has mastery of when to use the appropriate web design tool; CASE-based questions are a more appropriate format for the SLO assessment. CASE-based SLO questions were developed for the ITE 170 class but were not developed in time to use them for the Spring 2018 semester. Faculty need to modify the SLO assessment to include more hands-on problem solving instead of using multiple-choice questions.

assessment method is a closer match to the change in how the material is presented in the class. The data collection for future SLOs will include a question that asks students whether they are program placed in the A.A.S. in IST degree, A.S. in IT degree, or A.A.S. in Cybersecurity degree. The data will be divided between campus-based classes, ELI (online) classes, and dual enrollment classes. This will allow the faculty to determine if the students in the A.A.S. in IST degree program are meeting the SLO expectations and ensure that the material is covered adequately in all instructional formats.

Results are to be broken down by question/topic in the next assessment. In addition, data will be divided for online and off-site dual-enrolled sections.

When will the improvements take place: The improvements will take place during the Fall 2018 semester. Next Assessment: Fall 2018

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

Quantitative reasoning problems

PC Hardware and Operating System Architecture ITE 221 Direct Measure: The ITE 221 Core Learning Outcome Quantitative Reasoning Assessment was a timed and proctored assessment. Students had 30 minutes in which to complete the assessment. Students with a documented Memorandum of Accommodation granting additional time for in-class assessments could take the assessment in the local campus Testing Center.

Semester/year data collected: Spring 2018. Target: Students should answer questions with a 70% accuracy rate; this is consistent with CompTIA exam standards. Results:

CLO Grade      Percentage of Students Receiving Grade – Spring 2018
A (90-100)          24.11%
B (80-89)           15.02%
C (70-79)           10.67%
C or Better         49.80%

Spring 2018 results of this assessment revealed significant issues with this topic. Actions for improvement: Because all instructors do not teach the above concepts the exact same way, a somewhat subjective grading rubric was used for evaluating the "correctness" of a student's solution. It is proposed to require ELI sections to report. These results need to be broken



Students could only use a simple four-function calculator. Students were not allowed to use any other type of calculator including, but not limited to, advanced graphing calculators, smart phone calculators, internet-based calculators, operating system calculator utilities, etc. Instructors were required to verify the calculator being used by each student before the assessment. There were five categories of quantitative reasoning problems, which made up the ITE 221 CLO Assessment. This was a paper-based assessment and not a computer-based assessment. Instructors were required to select one problem from each category:
1. Binary-to-Decimal Conversion
2. Hexadecimal-to-Decimal Conversion
3. Decimal-to-Binary Conversion
4. Hexadecimal-to-Binary Conversion
5. Two's Complement Notation
Data collection – Results were provided by faculty of 10 sections received from the Alexandria, Loudoun, Annandale, Manassas, and Woodbridge campuses. ELI courses did not report. A separate Excel spreadsheet was used to record the results of the assessment. Instructors were required to provide the date of the report, their campus, the course section, and their name. Students remained anonymous, only identified as Student 1, Student 2, etc. Sample: Of these 10 reporting sections, a total of 253 students were assessed.
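For reference, the five problem categories can be illustrated with a short, hypothetical Java sketch (the CLO assessment itself is a paper-and-pencil exercise; the values below are illustrative and are not taken from the assessment):

    // Hypothetical worked examples of the five ITE 221 problem categories,
    // done in Java so the expected answers can be checked.
    public class BaseConversions {
        public static void main(String[] args) {
            // 1. Binary-to-decimal: 1011 0101 (base 2) -> 181 (base 10)
            System.out.println(Integer.parseInt("10110101", 2));      // 181

            // 2. Hexadecimal-to-decimal: B5 (base 16) -> 181 (base 10)
            System.out.println(Integer.parseInt("B5", 16));           // 181

            // 3. Decimal-to-binary: 181 (base 10) -> 10110101 (base 2)
            System.out.println(Integer.toBinaryString(181));          // 10110101

            // 4. Hexadecimal-to-binary: B5 -> 1011 0101 (each hex digit maps
            //    to a 4-bit pattern: B = 1011, 5 = 0101)
            System.out.println(Integer.toBinaryString(Integer.parseInt("B5", 16)));

            // 5. Two's complement notation: -75 in 8 bits.
            //    Invert the bits of 75 (0100 1011) and add 1 -> 1011 0101.
            int negative = -75;
            System.out.println(Integer.toBinaryString(negative & 0xFF)); // 10110101
        }
    }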


Students demonstrated an accuracy rate of 49.8% on the questions. Largely, this was due to the short-answer nature of the assessment, compared to the higher success rates typically seen with multiple-choice questions.

Because this is a new assessment project, there are no past assessments to compare results against.

down by campus, to include ELI and dual enrollment sections. When will the improvements take place: The improvements will take place during the Fall 2020 semester. Next Assessment: Fall 2020

Program Goals Evaluation Methods Assessment Results Use of Results
To increase the number of program placed students in the AAS in IST program.

Review the number of program placed students in the A.A.S. in IST degree program listed in the OIR Factbook – section entitled Distribution of Program Placed by Curriculum and Award Type. https://www.nvcc.edu/college-planning/data.html

Semester/year data collected: Fall 2017. Target: Increase the number of program placed students by 1%. The numbers of program placed students in the A.A.S. in Information Systems Technology degree from 2012 through 2017 are listed below:

Term        Program Placed Students (A.A.S. in IT degree)   Percent Change
Fall 2012            641                                         n/a
Fall 2013            627                                        -2.23%
Fall 2014            739                                        15.15%
Fall 2015            472                                       -56.5%
Fall 2016            383                                       -23.2%
Fall 2017            390                                         1.83%

Established campus coordinators for high-enrollment programs. Target Met: [ X ] Yes [ ] No [ ] Partially. Actions for Improvement: Improve the assignment of advisors to students; it was discussed that a new advising procedure needs to be in place. G3 – Get Skills, Get a Job, Give Back – focuses on technical programs, and the AAS can fall into G3: put skills up front and the soft skills at the end (a free model for a community college).




Comment: As can be seen, the number of students program placed in the A.A.S. Information Systems Technology degree program has increased by 1.83%. Goal: The IT Cluster will focus on strategies to increase the number of program placed students by 2% for the 2018–2019 academic year.

When will the improvements take place: The improvements will take place during the Fall 2018 semester. Assessed: Annually

To encourage students to complete their A.A.S. in Information Systems Technology degree prior to transferring to a four-year institution and seeking employment in the IT industry

Review the number of graduating students in the A.A.S. in IST degree program listed in the OIR Factbook – section entitled Distribution of Graduates by Curriculum and Award Type. https://www.nvcc.edu/college-planning/data.html

Semester/year data collected: 2016-17. Target: Increase the number of graduating students by 1%. The chart with the number of students graduating with an A.A.S. in Information Systems Technology degree from 2012 through 2017 is listed below.

Year         Number of Students (A.A.S. Degree)   Percent Change
2011-2012            49                                n/a
2012-2013            61                               19.67
2013-2014            54                              -12.96
2014-2015            69                               21.74
2015-2016            54                              -27.78
2016-2017            56                                3.70

Comment: There is an increase in the number of students graduating with the A.A.S. in IST degree. Goal: The IT Cluster will focus on strategies to increase the number of students graduating with the A.A.S. in IST degree by 2% for the 2018 – 2019 academic year.

The Dean and faculty used results to increase the number of students graduating with the A.A.S. in IST degree.

• IT faculty discussed retention strategies for students in their classes and/or in the IST program.

Target Met: [ X ] Yes [ ] No [ ] Partially. Action for improvement:
• Design programs so that, upon completion, graduates are prepared to study for industry certification programs.
• Develop promotional materials that focus on the transfer pathway for NOVA IST students – AAS IST or AAS Business.

Assessed: Annually



Annual Planning and Evaluation Report: 2017-18

Information Technology, A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. A.S. in Information Technology Program Purpose Statement: The Associate of Science degree curriculum in Information Technology is designed for persons who plan to transfer to a four-year college or university to complete a baccalaureate degree program in Information Technology.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Be able to design problems using procedural and object oriented design and implement sequence, selection and loop structures within the design solution.

Software Design ITP 100 Direct Measure: Students were assessed based on five (5) multiple-choice questions and five (5) design problems included with the course common final exam given across the Annandale, Alexandria, Loudoun, Manassas and Woodbridge campuses and ELI. The topics for the multiple choice questions and the design problems are directly related to the SLO that assesses the content of the entire course:

Topics for multiple-choice questions:
1. Class vs. Object
2. Methods
3. Private
4. Value of Index
5. OO Call Syntax
Topics for problem solving:
1. Design a problem using the appropriate control structure (selection structure).
2. Design a problem using parallel arrays.
3. Given an input value, identify the correct result from executing modules.
4. Design a problem using the appropriate statements to create modules and call statements.
5. Debugging modular code.
Data collection: Faculty are given files that include all questions for a common final exam, answers to the questions, a spreadsheet for the results, and a grading rubric. The files are stored on the IT Cluster Blackboard site. Each faculty member is given access to the Blackboard site and can print copies of the exam from the Blackboard site. The exam questions are graded using the common

Semester/year data collected: Fall 2017 Target – multiple choice questions: 70% of the students answering the questions correctly. Target – problem solving questions: 70% of the students answering the questions correctly. Results: The table below shows the percentage of students across the campuses and ELI that provided a correct answer for each question/problem:

Multiple-Choice Question   % Responding Correctly – Fall 2016   % Responding Correctly – Fall 2017
1                                      50.25                              41.07
2                                      86.14                              77.78
3                                      75.00                              76.43
4                                      48.12                              51.85
5                                      75.39                              78.11
% Correct (average)                    66.98                              65.05

Problem-Solving Question   Average Score – Fall 2016   Average Score – Fall 2017
1                                  73.26                       72.67
2                                  53.54                       92.98
3                                  67.23                       89.80
4                                  76.10                       80.90
5                                  60.21                       81.03
Average                            66.05                       83.48

It would be best to look not just at the overall average but at averages for individual questions/problems. Using those averages, the program can determine the topics most in need of

Posted the results of the SLO assessment for faculty on the IT Cluster Blackboard site. Continue to provide students with supplementary videos and online resources. Continue to provide assistance for students taking IT classes through the campus tutoring centers. Target Met: [ ] Yes [ ] No [ X ] Partially. Based on most recent results – areas to be improved: The areas of focus for improvement on the multiple-choice test are the two questions where students scored below the 70% target: Multiple Choice question 1 (Class vs. Object) and Multiple Choice question 4 (Value of Index). It is also noted that the overall multiple-choice results did not change much. Current Actions to Improve SLO: Campus Coordinators will continue to discuss assessment areas for improvement based on the scores; emphasis will be placed on the multiple-choice questions whose scores did not improve. ELI and dual-enrolled off-site sections will be disaggregated separately in future assessments. When will the improvements take place: The improvement will take place during the Fall 2018 semester. The SLOs will be assessed at the end of the semester, and the numbers from the Fall 2017 semester will be compared to the numbers from the Fall 2018 semester.



rubric to ensure consistency across the campuses. The results from the exam are stored in the spreadsheet. The exam questions must be given in a closed book, closed notes, proctored environment. The results for each question are given to the IT Assistant Dean for the campus. The IT Assistant Dean tallies the numbers for the campus and forwards the results to the faculty in charge of the IT Cluster. The faculty in charge of the IT Cluster creates one tally of the results for the NOVA IT departments on all five campuses. Sample: Sixteen (16) sections out of thirty-five (35) sections of ITP 100 with a combined two hundred and ninety-seven (297) students out of seven hundred and seventy-one (721) students responding to the five (5) common multiple-choice questions and four (4) design problems included with the course common final exam given across the campuses and (1) ELI section.

attention to increase student learning and students' preparation for the programming language class. ITP 100 is a prerequisite course for the required programming class. Comparison to previous assessment: The questions used to assess the course during the Fall 2017 semester are the same questions used to assess the ITP 100 course during the Fall 2016 semester. The exam consisted of five multiple-choice questions and five problem-solving questions. Multiple-Choice Questions: Comparing the results from the Fall 2017 semester to the results from the Fall 2016 semester, there is a slight decrease in the percentage of students answering the questions correctly. During the Fall 2017 semester, students met the 70% target for three out of the five multiple-choice questions, two of which exceeded the percentage for the same questions in the Fall 2016 semester. The students did not meet the 70% target when the percentages for the five multiple-choice questions are averaged. Problem-Solving Questions: There is an increase in the percentage of students answering the five problem-solving questions correctly, with all five of the problem-solving questions exceeding the 70% target. Improving students' understanding of the material is always a concern among the faculty. There is a significant increase in the number of students understanding the material when the results for the five questions are averaged; this is a positive sign from the previous actions to improve the SLO. The average for the problem-solving questions was 66.05% during the Fall 2016 semester and 83.48% during the Fall 2017 semester, above the 70% target.

Next Assessment: Fall 2019
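For illustration only, the following is a brief, hypothetical Java sketch in the spirit of the ITP 100 problem-solving topics – parallel arrays, a called module, and a selection structure. None of the names or values are taken from the common exam.

    // Hypothetical sketch of an ITP 100-style design exercise: parallel arrays
    // hold related data, a module (method) is called from main, and a selection
    // structure chooses the output.
    public class ParkingRates {
        // Parallel arrays: index i in each array describes the same vehicle type.
        static String[] vehicleTypes = {"car", "motorcycle", "bus"};
        static double[] hourlyRates  = {2.50, 1.00, 6.00};

        // Module: look up the rate for a vehicle type using a loop and selection.
        static double rateFor(String type) {
            for (int i = 0; i < vehicleTypes.length; i++) {
                if (vehicleTypes[i].equals(type)) {   // selection structure
                    return hourlyRates[i];
                }
            }
            return 0.0; // unknown type
        }

        public static void main(String[] args) {
            // Call statement to the module, then a selection on the result.
            double rate = rateFor("bus");
            if (rate > 0.0) {
                System.out.println("Hourly rate: " + rate);
            } else {
                System.out.println("Vehicle type not found.");
            }
        }
    }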

Be able to describe memory types and allocation methods.

Introduction to Telecommunications ITN 100 Direct Measure: Students were assessed based on seven (7) multiple-choice questions, three (3) matching questions and one (1) order question. Faculty at all 5 campuses provided these questions at the time the final exam was given.

Semester/year data collected: Fall 2017 Target: In order to be consistent with CompTIA exam standards students are expected to answer questions with a 70% accuracy rate. Results: As the questions addressed the single topic of OSI reference model and layers, scores were not broken down individually:

Fall 2017 results of this assessment revealed significant issues with this topic. Target Met: [ ] Yes [ X ] No [ ] Partially. Actions for improvement: It is proposed to structure these answers as multiple choice for the Spring 2019 assessment. Results will be shared with faculty in this report. Add the SLO to dual



Data collection – Results were provided by faculty of 13 sections received from Alexandria, Loudoun, Annandale, Manassas and Woodbridge campuses. ELI courses did not report, although they received copies of the assessment and instructions. Sample: Of these 12 reporting sections, a total of 252 students were assessed.

SLO Grade      Percentage of Students Receiving Grade – Fall 2017
A (90-100)          16.27
B (80-89)           12.70
C (70-79)           14.69
C or Better         43.66

Students demonstrated an accuracy rate of 43.66% on the questions. Largely, this was due to the short-answer nature of the exam, compared to the higher success rates typically seen with multiple-choice questions. Because this is a new assessment project, there are no past assessments to compare results against.

enrollment classes. In addition, it is planned to break down overall results by question rather than by overall grade; this will help identify specific problem areas for students. When will the improvements take place: The improvement will take place during the Spring 2019 semester. Next Assessment: Spring 2019

Be able to identify terminology, correct syntax, and appropriate uses of graphics, animation, and XHTML for a successful multimedia website.

Multimedia Software ITE 170 Direct Measure: Students were assessed based on a common final exam given across all campuses and ELI containing fifty (50) multiple-choice questions relating specifically to identifying terminology, correct syntax, and appropriate uses of graphics, animation and XHTML for a successful website. The topics for the multiple-choice questions directly relate to the SLO that assesses the content of the entire course. The questions are used to determine if students completing the course are prepared for further study in web page design.
Topics for Student Learning Outcome questions:
1. Demonstrate the ability to plan and organize a multimedia web site.
2. Understand the design concepts for creating a multimedia website.
3. Use a web authoring tool to create a website.
4. Understand the design concepts related to creating and using graphics for the web.
5. Use graphics software to create and edit images for the web.
6. Understand the design concepts related to creating and using animation, audio and video for the web.
7. Use animation software to create and edit animations for the web.
8. Use software tools to publish and maintain a multimedia web site.

Semester/year data collected: Spring 2018 Target – multiple-choice questions: 70% of students answering the question correctly. Results: The table below shows the percentage of students across all campuses and ELI who received a specific final exam grade on the common final exam during the Spring 2017 and Spring 2018 semesters:

Final Exam Grade   % of Students – Spring 2017   % of Students – Spring 2018
A (90-100)                  15.00                        48.37
B (80-89)                   35.45                        19.06
C (70-79)                   25.65                        10.70
D (60-69)                    5.51                         4.65
F (Below 60)                 0.00                         5.58
C or Better                 81.61                        78.13

Comparison to previous assessment: The questions are the same questions that were used during the Spring 2016 – Spring 2017 academic year. It would be best to look not just at the overall average but at averages for individual questions/problems; using those averages, the program can determine the topics most in need of attention to increase student learning. There were more students earning an A in the class during the Spring 2018 semester than in the Spring 2017 semester. The difference in the number of students earning an A may be due to the instructors focusing more on problem-solving skills by giving more projects in the class instead of having students learn terms and definitions.

During the Spring 2018 semester, 78.13% of students in the ITE 170 class earned an A, B, or C in the class; during the Spring 2017 semester, 81.61% of students did. During the Spring 2018 semester, 48.37% of students earned an A in the class. The increase in the number of students earning an A in the course may be due to the change in how the ITE 170 course is taught: the emphasis has shifted from testing definitions to understanding how to use the material learned in the class to solve a problem. Target Met: [ X ] Yes [ ] No [ ] Partially. Current Actions to Improve SLO: Add the SLO to dual enrollment classes. The rubric needs to break results down further for the SLO. Modify the SLO assessment to include hands-on problem solving instead of using multiple-choice questions. When will the improvements take place: The improvements will take place during the Fall 2018 semester. Next Assessment: Fall 2018



Data collection – the questions for the ITE 170 SLOs are given to all sections of ITE 170 across the Annandale, Alexandria, Loudoun, Manassas and Woodbridge campuses and ELI. The questions are included in a common final exam. The common final exam is stored on the IT Cluster Blackboard site. Each faculty member is given access to the Blackboard site and instructions on how to copy the electronic version of the common final exam from the IT Cluster Blackboard site into the Blackboard site for their ITE 170 course. If the faculty member does not have access to a computer lab to give the final exam, a paper version of the exam is available. The electronic version of the exam is self-graded. Instructors are responsible for grading the paper version of the exam. The IT Cluster Blackboard site provides the instructors with a grading rubric and spreadsheet to report the results. The exam questions must be given in a closed book, closed notes, proctored environment. The results for each question are given to the IT Assistant Dean for the campus. The IT Assistant Dean tallies the numbers for the campus and forwards the results to the faculty in charge of the IT Cluster. The faculty in charge of the IT Cluster creates one tally of the results for the NOVA IT departments on all five campuses. Sample: Ten (10) sections of ITE 170 with a combined two hundred fifteen (215) students responding to the five multiple choice questions administered at the end of the semester with the course final exam.

Multiple-choice questions – Students met the 70% success rate for the course. The SLO for this course determines whether the student earns an A, B, or C in the course. Strengths of the SLO Assessment: It is easier to grade and compare the results from multiple-choice SLO questions than from CASE study questions; faculty need to develop and use a more detailed rubric to assess CASE study questions. The SLO assessment format tests the students' understanding of the course material. Weaknesses of the SLO Assessment: The current data collection does not include a question that asks the students about their program placement. Adding such a question would allow the faculty to determine if the students in the A.S. in IT degree program are meeting the SLO expectations; the expectation for the A.S. in IT degree program is for students to learn the skills needed to be successful when they transfer to an IT program at a four-year institution/university. Multiple-choice questions are not an appropriate method to assess whether the student has mastery of when to use the appropriate web design tool; CASE-based questions are a more appropriate format for the SLO assessment. CASE-based SLO questions were developed for the ITE 170 class but were not developed in time to use them for the Spring 2018 semester.

Core Learning Outcome Evaluation Methods Assessment Results Use of Results
CLO: Quantitative reasoning problems [ X ] QL

PC Hardware and Operating System Architecture ITE 221

Direct Measure: The ITE 221 Core Learning Outcome Quantitative Reasoning Assessment was a timed and proctored assessment. Students had 30-minutes in which to complete the assessment. Students with a documented Memorandum of Accommodation granting additional time for

Semester/year data collected: Spring 2018. Target: Students should answer questions with a 70% accuracy rate; this is consistent with CompTIA exam standards. Results:

CLO Grade      Percentage of Students Receiving Grade – Spring 2018
A (90-100)          24.11
B (80-89)           15.02
C (70-79)           10.67
C or Better         49.80

Spring 2018 results of this assessment revealed significant issues with this topic. Target Met: [ ] Yes [ X ] No [ ] Partially. Actions for improvement: Faculty will develop a better grading rubric for evaluating the "correctness" of a student's solution. Appoint dual enrollment coordinators and specify their role in adding the SLO to dual enrollment



in-class assessments could take the assessment in the local campus Testing Center.

Students could only use a simple four-function calculator. Students were not allowed to use any other type of calculator including, but not limited to, advanced graphing calculators, smart phone calculators, internet-based calculators, operating system calculator utilities, etc. Instructors were required to verify the calculator being used by each student before the assessment. There were five categories of quantitative reasoning problems which made up the ITE 221 CLO Assessment. This was a paper-based assessment and not a computer-based assessment. Instructors were required to select one problem from each category:
1. Binary-to-Decimal Conversion
2. Hexadecimal-to-Decimal Conversion
3. Decimal-to-Binary Conversion
4. Hexadecimal-to-Binary Conversion
5. Two's Complement Notation

Data collection: Results were provided by faculty of 10 sections received from Alexandria, Loudoun, Annandale, Manassas and Woodbridge campuses. ELI courses did not report. A separate Excel spreadsheet was used to record the results of the assessment. Instructors were required to provide the date of the report, their campus, the course section, and their name. Students remained anonymous, only identified as Student 1, Student 2, etc. Sample: Of these 10 reporting sections, a total of 253 students were assessed.
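As a small illustration of the two's complement category in particular, the following hypothetical Java check works through one value by hand (the value is not taken from the assessment):

    // Hypothetical check of a two's complement problem: represent -42 in 8 bits.
    // By hand: 42 = 0010 1010; invert -> 1101 0101; add 1 -> 1101 0110.
    public class TwosComplementCheck {
        public static void main(String[] args) {
            int value = -42;
            String eightBits = Integer.toBinaryString(value & 0xFF);
            System.out.println(eightBits);                              // 11010110
            // Reading it back: 1101 0110 = 214 unsigned, and 214 - 256 = -42.
            System.out.println((byte) Integer.parseInt(eightBits, 2));  // -42
        }
    }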


Students demonstrated an accuracy rate of 49.8% on the questions. Largely, this was due to the short-answer nature of the assessment, compared to the higher success rates typically seen with multiple-choice questions. Because this is a new assessment project, there are no past assessments to compare results against.

classes. Results are to be broken down by question/topic in the next assessment. In addition, data will be divided for online and off-site dual enrolled sections. When will the improvements take place: The improvements will take place during the Fall 2021 semester. Next Assessment: Fall 2021

Program Goals Evaluation Methods Assessment Results Use of Results
To increase the number of program placed students in the AS in IT program.

Review the number of program placed students in each degree and certificate program listed in the OIR Factbook – section entitled Distribution of Program Placed Students by Curriculum and Award Type.

Target: Increase the number of program placed students in the A.S. in IT degree program by 2% Results: The numbers of program placed students in the A.S. in Information Technology degree from 2011 through 2017 are listed below:

The Dean encouraged IT faculty to discuss with students their program placements. The results from the statistics for students program placed in the A.S. in IT degree program will be used as follows: in Fall



https://www.nvcc.edu/college-planning/data.html

Term        Program Placed Students (A.S. in IT degree)   Percent Change
Fall 2012           2150                                       n/a
Fall 2013           2191                                       1.906
Fall 2014           2388                                       8.99
Fall 2015           2374                                      -0.586
Fall 2016           2273                                      -4.25
Fall 2017           2307                                       1.50

Comment: As can be seen, the number of program placed students in the A.S. Information Technology degree program increased from Fall 2016 to Fall 2017.

2019, the Provost, Dean, and faculty will discuss effective outreach efforts used by each campus to promote the IT program in their service area. An IET Canvas site will be implemented to share documents with faculty. Target Met: [ ] Yes [ ] No [ X ] Partially. When will the improvements take place: The improvements will take place during the Fall 2019 semester. Assessed: Annually

To encourage students to complete their A.S. in Information Technology degree prior to transferring to a four-year institution.

Review the number of graduates in the OIR Factbook – section entitled Distribution of Graduates by Curriculum and Award Type https://www.nvcc.edu/college-planning/data.html

Target: Increase the number of graduating students in the A.S. in IT degree program by 2%. Results: The chart with the number of students graduating with an A.S. in Information Technology degree from 2011-12 through 2017-18 is listed below:

Year         Number of Students Graduating (A.S. Degree)   Percent Change
2011-2012            641                                       n/a
2012-2013            627                                      -2.18
2013-2014            739                                      17.86
2014-2015            472                                     -36.12
2015-2016            383                                     -18.85
2016-2017            326                                     -14.88
2017-2018            401                                      23.00

Comment: The number of students graduating with the A.S. in IT degree increased during the 2017 – 2018 academic year. During the 2016 – 2017 academic year, there was a 14.88% decrease in the number of students graduating with the A.S. in IT degree.

There is a market demand for students with education and experience in the IT field. Instructors realize that students may be able to obtain jobs without completing a degree or certificate. The IT Cluster needs to compare the number of students program placed in the degree/certificate programs to the number of students graduating from those programs to make sure that the percentage is in line with the percentage of students who graduate from the community college. Action for improvement: • Address issues that impact the

program by soliciting input and feedback from IET Faculty.

• Develop and assign sub-groups of faculty to address challenges and opportunities within the curriculum.

• Take and present issues to the Pathways Council and support initiatives through curriculum committee and other college/state committees.

Target Met: [ X ] Yes [ ] No [ ] Partially When will the improvements take place: The improvements will take place during the Fall 2019 semester. Assessed: Annually



Annual Planning and Evaluation Report: 2017-2018

Interior Design, A.A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Interior Design program provides quality education for students to prepare them for entry level employment in the interior design field or to transfer to an accredited university for further education. The curriculum provides a foundation education covering a broad range of topics in interior design, art history, furniture history, and basic design. Computer aided drafting, rendering and business practices round out the curriculum. Students become knowledgeable in both residential and contract design.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Students will apply technical drawing skills to graphically illustrate design concepts.

Three Dimensional Drawing and Rendering, IDS 106, Loudoun Campus
Direct Measure: IDS 106 Final Project, submitted with SLO document. Previous measure: final project with different skills breakdown.
Provided Rubric Criteria: IDS 106 Grade Rubric Final Project F17, submitted with SLO document.
Sample Size (Specify N/A where not offered):

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 16
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual-enrollment

Semester/year data collected: Fall 2017 (previous evaluation Fall 2015) Target: 80% of the class will score 80% or better on the project. Results: For this project, 14 of the 16 students, or 88%, achieved a score of 80% or better. This is a new target based on the new course structure. As noted below for the previous assessment, the target for that project was established at 90% of students achieving 90% or better. For that assessment, 15 of 18 students (83%) achieved the target, and the average score was 94%, 4 percentage points over the target of 90%. Those who did not achieve the target failed to do so because they did not hand in their projects on time. Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality | Fall 2017 Average Score | Fall 2017 Percent > Target | Fall 2015 Average Score | Fall 2015 Percent > Target
LO | 87.56 | 8 | 94 | 4

Results by SLO Criteria: (Specify N/A where not offered)

Results by SLO Criteria/Question Topics | Fall 2017 Average Score | % of Students > Target
1. Bubble | 88 | +14
2. Block | 84 | +14
3. Plan | 85 | -5
4. 1 Point | 92 | +8
5. 2 Point | 89 | +1
Total | 87.6

In the chart above, specific topics from the rubric were selected for analysis. Based on the target, students achieving 80% or greater in each area were determined to have completed that portion of the assignment successfully. For the bubble diagrams, for example, the average score is listed at 88 for that topic; the percentage of students that achieved the target of 80% was 94%, or 14 points above the target of 80%. The same follows for each of the other rubric areas. Current results improved: Not applicable. Strengths by Criterion/Question/Topic: Students exhibit proficiency in developing Bubble Diagrams, Blocking Plans, and One-Point Perspectives. Weaknesses by Criterion/Question/Topic: Students can improve results on Plan Rendering and Two-Point Perspectives.

Previous action(s) to improve SLO: This course has been significantly updated since it was evaluated in 2015. There is a new format for the class and a new instructor. The course now provides more rigorous instruction in rendering of floor plans and other drawings used to graphically portray organizational relationships, as well as three-dimensional drawings. In addition, each assignment now has a specific rubric documenting exercise requirements. This was a recommendation of the advisory committee, faculty from local universities, and previous students. These changes were implemented in Fall 2017. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Overall, students exceeded the target by 8 percentage points. From the results by SLO criteria, students did very well in three of the areas, but performance can be improved in the development of the rendered plan. In addition, while the target was achieved for the two-point perspective, that area could use some improvement as well. Current actions to improve SLO based on the results: Faculty will evaluate course exercises to determine the best way to improve in the specified areas. The plan portion of the final project could be improved by providing more practice exercises and further explanation of the graphic scale. The two-point perspective portion of the project could be improved by the repetition of very basic exercises such as drawing a box, drawing a chair, and drawing a room in both one- and two-point perspectives. Faculty will discuss and implement modifications in Spring 2019. Next assessment of this SLO: Spring 2020

Students will utilize basic building and accessibility codes related to the health, safety and welfare of the public to develop interior floor plans.

Theory and Research in Interior Design IDS 215, Loudoun Campus Direct Measure: IDS 215 Codes Quiz F17 and IDS 215 Final Project FA 17 Previous measure: same quiz, different project. Provided Rubric Criteria or Question Topics: IDS 215 Codes Quiz Questions 18 (ramps), 22 (corridor width), 23 (door width), 27 (obstructed path), and 30 (dead ends); IDS 215 Grade Sheet Final Project F17 (Office Plan and Design Development). Sample Size (Specify NA where not offered)

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 9
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A

*Dual-enrollment

Semester/year data collected: Fall 2017 (previous assessment Spring 2015) Target: Two targets for the two different exercises:
• Quiz: all students will score 90% or higher on the code quiz.
• Project: all students will score 90% or better on the space planning portion of the project. This portion of the project requires special analysis as well as knowledge of building codes. It was worth 20 of 100 points. In the previous assessment it was worth 50 of 200 points. The change was made by faculty to streamline the grading process and simplify the rubric.

Results for Code Quiz:

Results by Campus/Modality | Fall 2017 Average Score | Fall 2017 Percent > 90% | Spring 2015 Average Score | Spring 2015 Percent > 90%
LO only | 89 | 33 | 89 | 50

While it appears that a significantly larger share of students failed to achieve the target compared to the previous assessment (33% vs. 50%), this is not the case: the difference is a total of 1 student. It is the small size of the class that makes the percentage seem large.

Code Quiz Results by SLO Criteria:

Results by SLO Criteria/Question Topics | Fall 2017 # Students Correct | Fall 2017 % of Students > 90% | Spring 2015 # Students Correct | Spring 2015 % of Students > 90%
1. Q 18 | 9 | 100 | 10 | 100
2. Q 22 | 9 | 100 | 9 | 90
3. Q 23 | 9 | 100 | 9 | 90
4. Q 27 | 9 | 100 | 10 | 100
5. Q 30 | 5 | 56 | 2 | 20
Total | 41 | 91 | 8

Code quiz current results improved: [ ] Yes [ ] No [X] Partially

Results for Space Planning portion of Project:

Results by Campus/Modality | Fall 2017 # Students 90% or Above | Fall 2017 Percent > 90% | Spring 2015 # Students 90% or Above | Spring 2015 Percent > 90%
LO | 7 | 78% | 10 | 100%

Project current results improved: [ X ] Yes [ ] No [ ] Partially
Code Quiz and Project Strengths by Criterion/Question/Topic: Students are able to understand the majority of concepts related to the effect of building codes on planning space.
Code Quiz and Project Weaknesses by Criterion/Question/Topic: The concept of dead-end corridors needs work.

Previous action(s) to improve SLO: Per the previous assessment, modifications have been made to online quizzes. In addition, exercises testing practical application of the concepts were developed, and a code checklist to be used during project preparation has been used to help develop awareness of specific planning and code issues. These items were added in Fall 2017. Target Met: [ ] Yes [ ] No [ x ] Partially Based on recent results, areas needing improvement: While improving overall from the previous assessment, students still need to work on developing an understanding of the impact of dead-end corridors on space planning. Current actions to improve SLO based on the results: Results from the previous assessment and this assessment are virtually the same, despite the fact that the course was taught by different instructors using different projects but the same test. Faculty will continue to stress the importance of building codes in space planning. For the next assessment, faculty will separate the code issues from the space planning on the project; the element evaluated for the next evaluation will be the code checklist for the project rather than the design project itself. In addition, more difficult design exercises related to planning and code issues will be incorporated into the class and into the quiz. This additional difficulty will necessitate a modification in expectation for success and a change of target, which will be discussed with faculty at an upcoming discipline meeting. Next assessment of this SLO: Spring 2020

Students will practice business management including estimating, marketing, business structures and ethics as they relate to the field of interior design.

Business Procedures, IDS 225, Loudoun Campus
Direct Measure: Specific questions from three tests were evaluated.
Provided Rubric Criteria or Question Topics:
• Test 2: Contract Elements, questions 13-18
• Test 2: Business Formations, questions 42-44
• Test 3: Project Phases, question 2
• Test 3: Advantages of having your own business, question 14
• Test 3: Disadvantages of having your own business, question 15
• Test 4: 5 Ps of Marketing, questions 17-21
• Test 4: Reasons for presentations, questions 30-33

In the previous assessment, only questions related to business formation were evaluated.

Semester/year data collected: Spring 2017 (previous assessment Spring 2015) Target: 75% of students will score 75% or higher on each criterion and 75% or higher overall on the selected questions. Results by In-Class, ELI, Dual Enrollment: This SLO was not assessed by grades for one test or project; it was evaluated by specific questions from three tests. In the past, one test was evaluated, but the amount of material required for each test was overwhelming for students, so the instructor broke the topics into smaller segments via shorter tests. See below for the criteria assessment. In the chart below, the current assessment results are based on a total of 56 points from the three tests. Assuming 56 points = 100%, the average score was 78%. In the previous assessment of the same SLO, the total number of points was 40 and the average was 74%. The target was the same for both assessment years.

Results by Campus/Modality | Spring 2017 Average Score | Spring 2017 Percent > Target | Spring 2015 Average Score | Spring 2015 Percent > Target
LO | 78 | +3 | 74 | -1

Sample Size (Specify N/A where not offered):

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 21
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual-enrollment

Results by SLO Criteria:

Results by SLO Criteria/Question Topics | Spring 2017 Average Score | % of Students > 75%
1. Contract (12 pt) | 5.9 | 33%
2. Bus. Form (6 pt) | 4.1 | 52%
3. Phase (8 pt) | 5.33 | 71%
4. Adv (6 pt) | 6.00 | 100%
5. Disadv (6 pt) | 5.71 | 95%
6. Market (10 pt) | 9.9 | 100%
7. Pres (8 pt) | 6.57 | 86%

Elements evaluated above include contract elements and business formations (the topic of the previous assessment) from Test 2; design phases and the advantages and disadvantages of having a business from Test 3; and the 5 Ps of marketing and reasons for presentations from Test 4. Current results improved: [ ] Yes [ ] No [ x ] Partially Of the seven topics listed above, four successfully reached the target set by faculty, one almost reached that target, and two could use improvement. Strengths by Criterion/Question/Topic: Students were very engaged with the marketing and presentation topics and recognized the advantages and disadvantages of having a business. Weaknesses by Criterion/Question/Topic: Students' understanding of contract details is weak, and students could use additional exploration of types of businesses.

Previous action(s) to improve SLO: Since the last assessment, faculty decided that to improve assessment of this SLO, a wider range of questions should be evaluated. In this assessment, faculty decided to evaluate a few key questions from three non-cumulative tests to determine some baseline understanding of various aspects of the business of interior design, including some of the material evaluated in the Spring 2015 assessment (business formations). In that assessment, faculty determined that students did not really understand the business formations. To help with this, a new faculty member with experience across a wide range of business types has been teaching this course, providing real-life examples of the different business styles. The new faculty member has been teaching this course since Fall 2016. Target Met: [ ] Yes [ ] No [ x ] Partially Based on recent results, areas needing improvement: Contract language and business format and structure seem to be the points that students continue to have problems deciphering. Current actions to improve SLO based on the results: Faculty will revisit lessons related to contract language and business formats and develop short exercises to reinforce the vocabulary. These exercises may include the creation of case studies related to these different business modes. Discussion of modifications will occur at the Discipline Meeting in March 2019, with implementation in Fall 2019. Next assessment of this SLO: Fall 2020

Core Learning Outcome | Evaluation Methods | Assessment Results | Use of Results

CLO: Students will demonstrate the ability to use numerical, geometric, and measurement data and concepts, mathematical skills, and principles of mathematical reasoning to draw logical conclusions and to make well-reasoned decisions, and will possess the skills and knowledge necessary to apply the use of logic, numbers, and mathematics to deal effectively with common problems and issues. [ x ] QR

Lighting and Furnishings, IDS 206, Loudoun Campus
Direct Measure: Calculations Test. There has not been a previous assessment of this CLO.
Provided Rubric Criteria or Question Topics: NVCC Pilot Quantitative Reasoning Rubric (Spring 2017) combined with the Calculations Test. There were five questions on the test, each matched to one of the points on the Quantitative Reasoning Rubric:
• Interprets Quantitatively = question 1
• Presents Quantitatively = question 5
• Analyzes Thoughtfully = question 2
• Communicates Qualitatively and Persuasively = question 3
• Problem Solving = question 4

Sample Size (Specify N/A where not offered):

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO | 1 | 1 | 16
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 50% of students will score 70% or higher on the Calculations Test.
Results by In-Class, ELI, Dual Enrollment:

Results by Campus/Modality | Spring 2018 Average Score | Percent > Target
LO | 76 | 69

Results by CLO Criteria:

Results by CLO Criteria/Question Topics | Spring 2018 Average Score (out of 4) | % of Students > Target
1. Interprets Quantitatively | 3.3 | 75
2. Presents Quantitatively | 2.69 | 56
3. Analyzes Thoughtfully | 3.56 | 75
4. Communicates Qualitatively | 2.69 | 56
5. Problem Solving | 2.81 | 69

Current results improved: N/A: first assessment. Strengths by Criterion/Question/Topic: Students seem to do best at interpreting and analyzing data. Weaknesses by Criterion/Question/Topic: Students appear to need to develop their problem solving and qualitative communication skills.

Previous action(s) to improve CLO if applicable: This is the first time this CLO has been assessed; therefore there are no previous results. Target Met: [ x ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Interior Design students find it difficult to do math in any form, despite the fact that they use it daily in the business of design. For this course, Lighting and Furnishings, faculty designed a test in which students calculate the amount of light for a space and the number of fixtures needed, then apply it to a plan. The majority of students were able to do the simple formulas to determine the amount of light and the number of fixtures required, but had more trouble with what to do with that information. Current actions to improve CLO based on the results: Faculty will develop worksheets for students to practice calculations and lighting layouts. In addition, in order to remove the stress associated with taking a math test in a design course (though it is not new to this course), faculty will offer the test twice to allow students to learn from any errors they may have made the first time. This will be added to the course in Spring 2019. Next assessment of this CLO: Fall 2022
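The report does not reproduce the Calculations Test itself. Purely as an illustrative sketch of the kind of computation described above, a common form of the lumen-method fixture count is shown below; all symbols and sample values here are hypothetical and are not taken from IDS 206.

\[
N_{\text{fixtures}} = \frac{E \times A}{\Phi \times CU \times LLF}
= \frac{30\ \text{fc} \times 200\ \text{ft}^2}{3000\ \text{lm} \times 0.6 \times 0.8} \approx 4.2 \;\Rightarrow\; 5\ \text{fixtures},
\]

where \(E\) is the target illuminance, \(A\) the floor area, \(\Phi\) the lumen output per fixture, \(CU\) the coefficient of utilization, and \(LLF\) the light loss factor.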

Program Goals | Evaluation Methods | Assessment Results | Use of Results

To increase the number of program placed students in the Interior Design Program.

Short description of method(s) and/or source of data: OIR Document DISTRIBUTION OF PROGRAM PLACED STUDENTS BY CURRICULUM AND AWARD TYPE FALL 2013 THROUGH FALL 2017 https://www.nvcc.edu/college-planning/_docs/Distribution-of-Program-Placed-Students-by-Curriculum-and-Award-Type-Fall-2013-Fall-2017.pdf

Target: Increase the number of program-placed students by 5% by 2017-2018. Results for Past 5 Years:

Fall | Number of Students | Percentage Change
F2013 | 165 | -7%
F2014 | 139 | -16%
F2015 | 130 | -6%
F2016 | 125 | -4%
F2017 | 137 | +10%

Target Met: [ X ] Yes [ ] No [ ] Partially
Comparison to NVCC total A.A.S. program-placed students:

Fall | # NVCC AAS Program Placed | % Change Per Year
F2013 | 7995 | -7%
F2014 | 7319 | -8%
F2015 | 6893 | -6%
F2016 | 6398 | -7%
F2017 | 6528 | +2%

Comparison to previous assessment(s): As shown in the charts above, there was a downturn for the College as well as for the program in the previous assessment. This year both the College and the program have increased enrollments, and the Interior Design program exceeded the College's increase by 8 percentage points.

Previous action(s) to improve program goal: Starting in Spring 2017, faculty have been monitoring enrollment and counseling students early on to make sure they are program placed from the time they enroll in the program. Most recent results: Active monitoring of program identification by faculty has contributed to the increase in program placement in the Interior Design program. Many students who were enrolled in General Education or Liberal Arts by counselors have been moved to Interior Design for better advising. Results improved: [ x ] Yes [ ] No [ ] Partially Current action(s) to improve program goal: Faculty will continue to monitor program placement with entry-level students to make sure they are placed correctly. In addition, since the entry-level course, IDS 100, is often filled, the program will offer an additional 12-week IDS 100 section starting in Fall 2018 to capture additional students who may want to enroll but try to do so after classes have already started. Assessed: Annually

To encourage students to complete their A.A.S. degree in Interior Design

Short description of method(s) and/or source of data: OIR document COLLEGE GRADUATES BY CURRICULUM AND AWARD TYPE 2012-13 THROUGH 2016-17 Number of NOVA Graduates By Degree and Specialization: 2017-2018 https://www.nvcc.edu/college-planning/_docs/Number-of-NOVA-Graduates-By-Degree-and-Specialization-2017-2018.pdf

Target: Interior Design program graduates will increase by 5% for the year 2017-2018. Results for Past 5 Years:

Academic Year | Number of Graduates | Percentage Change
2017-2018 | 15 | +25%
2016-2017 | 12 | -29%
2015-2016 | 17 | -19%
2014-2015 | 21 | +17%
2013-2014 | 18 | -22%
2012-2013 | 23 | +130%

Target Met: [ x ] Yes [ ] No [ ] Partially
Number of NVCC A.A.S. graduates and total graduates:

Academic Year | Number of AAS Grads | % Change | Number of NVCC Grads | % Change
2017-2018 | 1050 | +2.3% | 7004 | +4.7%
2016-2017 | 1026 | +9% | 7338 | -4%
2015-2016 | 941 | -4% | 7656 | +3%
2014-2015 | 976 | -1% | 7424 | +2%
2013-2014 | 989 | -2% | 7243 | -5%
2012-2013 | 1012 | -2% | 7601 | -3%

Comparison to previous assessment(s): previous assessment data can be seen on charts above.

Previous action to improve program goal: Interior Design faculty actively engage in academic advising to keep students on track with their course progression. Reminders are sent to students about the last day to apply for graduation, and upper-level course students are informed of this in person as well. These actions started in Fall 2016 and are ongoing. In this program, most students are part time and take more than the normal two years to graduate. Those who finish often do so in three to four years rather than two, so the number of graduates seems to spike every other year. There is no apparent correlation between the number of program-placed students and the number of graduates: at the highest enrollment shown on the chart, in 2012-2013, the ratio of program-placed students to graduates was not significantly different than it is today. Most recent results: In the current year, the number of graduates increased from 12 in 2016-17 to 15 in 2017-18, an increase of 25%. Results improved: [ x ] Yes [ ] No [ ] Partially Current actions to improve program goal: As noted above, Interior Design faculty actively engage in academic advising to keep students on track with their course progression. Reminders are sent to students about the last day to apply for graduation, and upper-level course students are informed of this in person as well. Modifications to the Interior Design program may help to boost graduation rates by decreasing the number of overall credits. Faculty will make changes to the curriculum via the Curriculum Committee in Fall 2018 for implementation in Fall 2019. This will be continued in the future. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Liberal Arts, A.A. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Associate of Arts degree major in Liberal Arts is designed for persons who plan to transfer to a four-year institution to complete a Bachelor of Arts Degree (B.A.).

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Civic Engagement PLS Discipline Student Learning Outcomes: “Students will be able to describe the political institutions and processes of the government of the United States.”

American National Politics, PLS 135
Direct Measure: This assessment was performed in PLS 135 classes, American National Politics, which deals directly with this SLO.
Provided Rubric Criteria or Question Topics: We asked students 20 multiple choice (MC) questions requiring them to identify correct responses in four areas of importance to American politics and government: the Constitution, the Legislative Branch, the Executive Branch, and the Judicial Branch.
Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 2 | 2 | 36
AN | 2 | 1 | 32
MA | N/A | N/A | N/A
ME | 0 | 0 | 0
LO | 1 | 1 | 17
WO | 0 | 0 | 0
NOVA Online | 3 | 0 | 0
DE* | N/A | N/A | N/A
Total | 8 | 4 | 85
*Dual-enrollment

The major improvement in data coverage overall is due to the support and insistence of the Associate Deans of Student Development on each campus.

Semester/year data collected: Spring 2018
Target: An average score of 80% or higher on each criterion as well as on the overall score.
Results:

Results by Campus/Modality | Spring 2018 Average Score
AL | 82
AN | 78
LO | 77
Average score | 79

Results by SLO Criteria:

Results by SLO Criteria/Question Topics | Spring 2018 Average Score
Constitution | 78
Legislature | 81
Executive | 86
Judiciary | 73
Average score | 79.5

Current results improved: N/A: First time assessed Strengths by Criterion/ Question/Topic: Students performed best in the area of the Executive Branch, which should not be surprising as it receives the most media attention. Next is legislature. Both topics are above the target. Weaknesses by Criterion/ Question/Topic: Knowledge of the Constitution and Judiciary was lacking, which is not surprising, particularly with the Judiciary. Still, knowledge of answers to questions in both fields was not too far off of target.

Previous action(s) to improve SLO: N/A: First time assessed. Target Met: [ ] Yes [ ] No [X] Partially All campuses scored slightly above or below the target. However, knowledge of the Constitution and the Judiciary lags the other categories. Current actions to improve based on recent results:
- Spend more time in class discussing the Constitution and the Judiciary.
- Inform students of the importance of both topics.
- After Spring 2019, adjust the questions.
- All campuses and NOVA Online have agreed to participate for the Fall 2018 semester.
Results and recommendations to address these issues will be shared with all faculty. The issue will be discussed at the Discipline Group meeting, during the Spring 2019 convocation, and at every Fall convocation. Next assessment of this SLO: Fall 2018

Critical Thinking SDV 100: Identify three to five aspects of critical thinking such as: identifying faulty logic, problem-solving, and asking questions/probing, etc.

Student Development Orientation, SDV 100
Direct Measure: Students were quizzed on 5 critical thinking questions embedded in a College Resource Quiz in SDV 100.
Question Topics:
• Q9: Thinking creatively
• Q10: Solving problems
• Q15: Critical thinking in high school versus college
• Q17: Narrowing the problem
• Q18: Critical thinking

Sample Size (Specify N/A where not offered):

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 21 | 13 | 230
AN | 36 | 32 | 678
MA (+1 SDV 101) | 15 | 9 | 161
ME (SDV 101) | 11 | 5 | 49
LO | 18 | 13 | 250
WO | 22 | 5 | 109
ELI | 24 | 17 | 246
DE* | 10 | 1 | 21
Total | 157 | 95 | 1744
*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 80% of students will answer correctly on the 5 critical thinking questions included on the College Resource and Critical Thinking Quiz.
Results by In-Class, ELI, Dual Enrollment:

Campus/Modality | Q9 | Q10 | Q15 | Q17 | Q18 | Total
AL | 97% | 93% | 31% | 11% | 83% | 63%
AN | 95% | 88% | 24% | 11% | 80% | 60%
MA | 98% | 94% | 28% | 3% | 86% | 62%
ME | 98% | 92% | 16% | 78% | 80% | 73%
LO | 99% | 93% | 23% | 13% | 84% | 62%
WO | 100% | 96% | 100% | 100% | 100% | 99%
NOVA Online | 96% | 68% | 13% | 76% | 90% | 69%
DE | 100% | 95% | 24% | 86% | 100% | 81%
Total Average | 98% | 90% | 32% | 47% | 88% | 71%

Current results improved: N/A: first time this topic was assessed. Strengths by Criterion/Question/Topic: Questions 9, 10, and 18 had the best scores because they could be answered by using good test-taking skills and eliminating the answers that are not the best (multiple choice). The questions are broad enough that they can be answered even without reviewing the textbook. Weaknesses by Criterion/Question/Topic: Questions 15 and 17 had the lowest scores. Question 15 requires the student to pick several right answers, so there is more room for error. Question 17 had the most wrong answers because it is not worded directly from the text; the answer is inferred from the reading material and requires a bit more critical thinking to figure out the best response.

Previous action(s) to improve CLO: The SDV Curriculum Committee has a yearly mandatory SDV In-Service where instructors present on best practices in student engagement and learning (May 2016, May 2017, June 2018). The Committee has also considered using a different textbook, but our primary goal has been to keep the textbook affordable by using OER (Open Educational Resources). We have considered that because the textbook is only available online, students may be discouraged from reading it. The committee reviewed textbooks in 2017-2018 and voted against the different options because they could not remain free; at this time we have not found a better free textbook that covers the topics we review in this class. Most of the assignments require self-assessment and reflection, and students feel more comfortable with those assignments than with assessments and quizzes that require them to review the textbook available online. NOVA Online, formerly ELI, differed on when and where it assessed the critical thinking questions: they were not in the first quiz/assessment and not attached to a college resource quiz, but formed their own separate quiz. This suggests that placing a critical thinking reading assignment/assessment in its own category later in the class may improve the results. Target Met: [ ] Yes [ ] No [ x ] Partially Based on recent results, areas needing improvement: The Critical Thinking CLO is currently assessed along with College Resources and Communication Skills. Comparing with ELI on where they placed their assessment, students may do best if Critical Thinking has its own category after Academic and Test-Taking Skills. Current actions to improve CLO based on the results: The Fall 2018 assessment is already well underway, so it is too late to make any improvements or changes for that cycle. Critical Thinking is not going to be assessed in Spring 2019. Comparing Spring 2018 to Fall 2018 would allow for more results to see if there is improvement or if the data stay the same. Next assessment of CT: Spring 2020

Quantitative Reasoning: Students will use numerical values to perform various calculations and draw reasonable conclusions. Students will use graphical methods to organize and interpret data.

General Chemistry I & II CHM 111 and 112 Direct Measure: Lab Report (pilot) Rubric Criteria: QR Rubric for Lab assignment: Five criteria presented on the Quantitative Reasoning (QR) Rubric:

I. Interprets Quantitatively: Explains the numerical information presented in mathematical forms (equations, formulas, graphs, diagrams and tables).

II. Presents quantitatively: Converts the given information into mathematical forms such as tables, graphs, diagrams, and equations.

III. Analyzes thoughtfully: Draws relevant conclusions from provided information and data, and predicts future trends.

IV. Communicates qualitatively and persuasively: uses quantitative evidence to support the argument or purpose of the work (what evidence is used, how it is formatted and contextualized).

V. Problem solving: Sets up a numerical problem and calculates the solution correctly

Sample Size (Specify N/A where not offered.)

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 10 | 1 | 23
AN | 18 | 1 | 25
MA | 8 | 3 | 52
ME | 0 | 0 | 0
LO | 23 | 8 | 128
WO | 8 | 0 | 0
NOVA Online | 1 | 1 | 18
DE* | 8 | 8 | 78
Total | 76 | 22 | 324
*Dual-enrollment

Assessment Results' Calculation: Average Score = Total Points in all courses ÷ Total Number of Students. Maximum points available = 20; for example, (15.2/20) × 100 = 76% and (16.7/20) × 100 = 84%.
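The percent-earned figures reported below appear to follow directly from this calculation; how the column was derived is an assumption, but it is consistent with the reported totals. A minimal worked check using the overall average of 14.8 points out of 20 from the results table that follows:

\[
\%\ \text{earned} = \frac{\text{average score}}{20} \times 100, \qquad \frac{14.8}{20} \times 100 = 74\%
\]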

Semester/year data collected: Spring 2018 Target: The average score of students participating will be 70%. For itemized criteria, 70% of students will correctly answer each item. Results by In-Class, ELI, and Dual Enrollment:

Results by Campus/Modality | Spring 2018 Average Score | Percent Earned
AL | 16.8 | 84.1
AN | 14.7 | 73.4
MA | 17.9 | 89.6
ME | N/A | N/A
LO | 14.2 | 71.1
WO | 0 | 0
ELI | 16.8 | 84.0
DE* | 15.6 | 78.0
Total Avg | 14.8 | 74

Results by CLO Criteria:

Results by CLO Criteria/Question Topics | Spring 2018 Average Score | % Earned on Questions
1. | 2.9 | 72.5
2. | 3.0 | 75.0
3. | 3.0 | 75.0
4. | 3.0 | 75.0
5. | 2.9 | 72.5
Total Avg | 3.0 | 74

Current results improved: [ X ] Yes [ ] No [ ] Partially Four out of the five campuses offering in-person Chemistry courses contributed data for this report, in addition to ELI and DE courses. Although the larger sample of students evaluated resulted in lower scores on each criterion, the results for this assessment are considered more meaningful compared to Fall 2017. In spite of the overall decrease in the average, the targeted values for the evaluation were met by each campus and on each criterion. There was very little to no variation in the average score among criteria, which indicates students' overall preparation. Furthermore, students met the targeted goal for each item. Strengths by Criterion/Question/Topic: Three of the criteria, "Presents quantitatively," "Analyzes thoughtfully," and "Communicates qualitatively and persuasively," were scored equally high. Weaknesses by Criterion/Question/Topic: "Interprets Quantitatively" and "Problem solving" were among the weaknesses of the students evaluated. Both of these criteria are math related, and more students find these types of assessments challenging. This may improve with the addition of some kind of math-related activity to the curriculum during the first few weeks of the course.

Previous action(s) to improve SLO: This was the second round of assessing the QR objectives. In the January 2018 cluster meeting, the discipline group discussed the previous assessment in Fall 2017 and ways to improve faculty participation and the Core Learning Outcomes. There were some questions regarding interpreting the rubric that seemed to be the reason for insufficient faculty participation. After the meeting, on January 5, an informative follow-up email was sent to the cluster to allow enough time to plan for the semester. The following changes were adopted:
• To improve the consistency of the assessments, and hence the results, two laboratory experiments were selected and shared with the faculty to use for the evaluation.
• To strengthen students' Core Learning Outcomes, a handout with guidelines regarding analysis of data, thinking quantitatively, and writing analytically was developed and shared with the discipline to distribute among all students on all campuses. This was to ensure that all students have access to the same information prior to their analytical writing and interpretation of data.
• To maintain standardization of the collected data, a table for collecting information was developed and shared with the Assistant Deans.

Target Met: [ X ] Yes [ ] No [ ] Partially All campuses met, and some exceeded, the targeted value. WO did not participate in the assessment, and only one course from each of AL and AN participated. Compared to Fall 2017, the share of courses participating increased from 10% to 29% in Spring 2018, and the number of students participating in this assessment increased by over 200%. Moreover, ELI and DE courses participated at close to 100%. Future results may be improved by the addition of a lab activity at the beginning of the semester to familiarize students with some of the mathematical manipulation and graphical analysis that they will encounter throughout the course.

Scientific Literacy: Students will understand the scientific method and identify methods of inquiry that lead to scientific knowledge.

General Biology I, BIO 101
Direct Measure: A quiz on the Scientific Method was available on Blackboard to all BIO 101 students in the college (students from all campuses, including ELI and DE) towards the end of the Fall 2017 semester. The quiz consisted of 10 multiple choice questions that assessed steps in the Scientific Method. The topics were as follows:
• Item #1: observation is first step
• Item #2: order of steps
• Item #3: definition of hypothesis
• Item #4: validity of hypotheses
• Item #5: importance of control
• Item #6: definition of data
• Item #7: example of hypothesis
• Item #8: definition of variable
• Item #9: definition of theory
• Item #10: defining data collecting
This assessment is the same as that given to students in the previous year. All assessment data are gathered through Blackboard.
Sample:

Program | Sample Size (#) | %
General Studies | 173 | 31
Social Sciences | 92 | 16
Liberal Arts | 39 | 7
Science | 115 | 20
Computer Science | 11 | 2
Business Admin | 56 | 10
Information Tech | 25 | 4
Dual Enrollment (Off-site and On-campus Students) | 9 | 2
Other (8 Students or Less) | 43 | 8
All Programs (Total) | 563 | 100

The assessment tool was deployed on Blackboard to all students taking BIO 101 on campus (AL, AN, LO, MA, WO), at ELI, or as dual-enrolled students, and 563 students took part in this assessment. The exact total number of students in BIO 101 during Fall 2017 is not available, but it is around 1,600; this approximate number suggests that about a third of all students responded to the Blackboard notice and took the quiz. Dual enrollment students were included, and 101 DE students (17.6% of the total) took the assessment. In the case of ELI, 128 ELI students (22.3% of the total) took the assessment. The number of students from each campus was not tallied; however, the student ID numbers are in the raw data, and specific information can be gleaned from the data.
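As a quick arithmetic check on the response rate described above (the 1,600 figure is the approximate total enrollment given in the narrative, not an exact count):

\[
\text{Response rate} \approx \frac{563}{1600} \times 100 \approx 35\%,
\]

which is consistent with the statement that about a third of all BIO 101 students took the quiz.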

Semester/year data collected: Fall 2017
Achievement Target:
• For the whole quiz, 70% of students will achieve 70% on the quiz.
• For each item, 70% of students will correctly answer that item.
As in the previous year, students identified themselves by major. This allowed us to compare results from students program-placed in Liberal Arts (39), General Studies (173), Social Science (92), and Science (115). Note that these numbers add to 693; some of the students listed double majors.
Results:

Item # | Lib Arts | General Studies | Social Sciences | Science | Average of All Students Assessed
Q1 | 0.6 | 0.6 | 0.7 | 0.6 | 0.7
Q2 | 1.0 | 0.9 | 1.0 | 1.0 | 1.0
Q3 | 1.0 | 0.9 | 0.9 | 0.9 | 0.9
Q4 | 0.9 | 0.9 | 0.9 | 1.0 | 0.9
Q5 | 0.8 | 0.8 | 0.8 | 0.8 | 0.8
Q6 | 0.9 | 0.9 | 1.0 | 1.0 | 0.9
Q7 | 0.8 | 0.9 | 0.8 | 0.9 | 0.9
Q8 | 0.9 | 0.9 | 0.9 | 0.9 | 0.9
Q9 | 0.7 | 0.6 | 0.8 | 0.6 | 0.7
Q10 | 0.9 | 0.9 | 0.9 | 0.9 | 0.9
Ave Total | 8.6 | 8.3 | 8.8 | 8.5 | 8.5

Results indicate that total scores are well above 70%, and most (8 out of 10) individual items meet the achievement goals. Scores were very similar to those of last year. The lowest scores were on Items 1 and 9. Item 1 asked about the first step in the Scientific Method; Item 9 asked for the definition of the word "theory." Current results improved: [ ] Yes [ ] No [ X ] Partially Scores from students program-placed in Liberal Arts, Science, Social Science, and General Studies are very similar. In the 2015-2016 academic year, students scored below 70% on questions 1, 2, and 9 (42%, 47.5%, and 57.4%). In 2016-2017, students scored below 70% on questions 1 and 9 (65.4% and 66.6%). This cycle, students again scored below 70% on questions 1 and 9 (64% and 65.7%). This shows a marked improvement in identifying the steps of the Scientific Method (question 2) over the years assessed, and an improvement in general knowledge of the Scientific Method.

Instructors and students of BIO 101 are becoming more used to assessment by Blackboard. During the Fall 2018 Cluster meeting, faculty members requested results of the previous year's data. These data were sent to the Biology discipline chair for dissemination. The low achievement results on Items 1 and 9 are important to the Biology faculty because they show that students do not understand that curiosity is the first step in solving a scientific problem through the scientific method. Also, the term "theory" in science continues to confuse students. Students' wrong answers indicate that they do not realize "theory" in science is not a hypothesis, but a well-substantiated explanation of the natural world. It is valuable for instructors to have this feedback. The discipline chair elected in the Biology discipline in Fall 2018 has already seen these data. She wants to work with faculty on the concepts behind the two low-scoring questions for the 2019-20 academic year. We need to find out whether students are not understanding the concepts or there is a problem with the questions themselves. This is the second year that A.S. Science students were identified in the assessment. Although most A.S. Science majors take BIO 101, many students in General Studies, Liberal Arts, Social Sciences, and other majors also take BIO 101. Faculty assessing Social Science and General Studies asked if we could identify their students, since those programs also wish to use this Scientific Method assessment for students in their majors. For the 2018-2019 assessment year, we plan to add A.S. Liberal Arts. It is interesting that the results again show very similar outcomes for students regardless of major. BIO 101 is a class taken by science students early in their academic career, and results show that science students at this early stage did not outperform students in other majors. In this assessment, we were able to demonstrate for the first time that students from all campuses, ELI, and Dual Enrollment took part. In the current Blackboard setup, each question is posed as an independent little exam, and that takes more time for students. The two additional questions identifying ELI and DE students did not discourage students: nearly 18% of student responders were DE, and 22% were ELI students. Next Assessment: Spring 2019


Program Goals | Evaluation Methods | Assessment Results | Use of Results

Program goal on program-placed students

Distribution of Program Placed Students by Curriculum and Award Type: Fact Book 2012-2013 through 2017-2018

Semester/year data collected: Fall 2017 Target: Number of program-placed students in each degree/certificate will increase by 1.5%.

Fall | Number of Students | Percentage Difference
Fall 2017 | 2570 | -8.34%
Fall 2016 | 2804 | -14.4%
Fall 2015 | 3278 | -6.4%
Fall 2014 | 3505 | -6.1%
Fall 2013 | 3733 | n/a

Target Met: [ ] Yes [ X ] No [ ] Partially Achievement Target Not Met: 8.34% decrease in program-placed students. Comparison to previous assessment: The number of program-placed students in Liberal Arts has been decreasing over the past five years. The percentage decrease for Fall 2016 (14.4%) was the largest in the past 5 years.

Previous action(s) to improve program goal: Previously, the SLO team reviewed the decrease in program-placed students over the past five years and established a more realistic achievement target of 1.5% for the 2017-2018 assessment year. Most recent results: -8.34% (decrease)

Results improved: [ ] Yes [ ] No [ X] Partially Current action(s) to improve program goal: The number of program-placed students decreased by 8.34%, thereby not meeting the achievement target. The achievement target for 2016-17 assessment year was reviewed for possible revision in light of the decrease in program placed students for five consecutive years, and it was decided to revise it at 1.5% as a realistic target. The SLO team will review the achievement target for 2018-19 assessment year. The Program will also dedicate itself to promoting this discipline with more publicity to the students and offering them support services. The guided pathways program is also likely to help increase the program placed students as they are mapped to student centered goals where students get a clear picture of programs of study being offered. Efforts to encourage program placement in Liberal Arts include: encouragement of faculty to participate in Advising Week. Programs to increase interest in Liberal Arts have been continued or developed. Such programs include the CST Showcase at Woodbridge. This is a multi-day event which brings together CST, technology and film. Liberal Arts student success is celebrated, and student film and theatre projects are shared with the campus. At Annandale campus, Liberal Arts Week showcases students’ music performances and art shows. Last Spring, this week included Awards Night for Liberal Arts students, a political science symposium, and daily events to attract students. The Liberal Arts team will continue to work with Counseling and First Year Advisors to support enrollment in Liberal Arts. Assessed: Annually

Program goal on graduation

Number of Graduates: Fact Book 2012-2013 through 2017-2018; Number of Graduates by Program and Specialization: 2017-18

Target: Program graduation totals will increase by 1.5% from the previous year. Results for Past 5 Years:

Academic Year | Number of Graduates | Percentage Difference
2017-2018 | 215 | -43.5%
2016-2017 | 381 | -6.1%
2015-2016 | 406 | -0.73%
2014-2015 | 409 | -4.21%
2013-2014 | 427 | n/a

Target Met: [ ] Yes [ X ] No [ ] Partially Target not met, as the number of graduates decreased by 43.5%. Comparison to previous assessment(s): The number of Liberal Arts degree graduates has continued to decline. The rate of decline rose to 6.1% in the 2016-17 assessment year, and there was a steep decrease in Liberal Arts graduates in the 2017-18 assessment year.

Previous action to improve program goal: The SLO team reviewed the resources available to students to assess how more students could be retained and program graduation totals increased. Advising was a major resource for guiding students towards completion of their degrees. The SLO team promoted a responsive curriculum by putting student learning at its core. The SLO team communicated with all Academic Divisions and all appropriate offices related to student success and advising so there could be further initiatives to increase student enrollment, success, and retention. Most recent results: Decrease in the number of graduates by 43.5%. Results improved: [ ] Yes [ X ] No [ ] Partially Current actions to improve program goal: In light of decreases in the number of graduates in previous years, the SLO team reviewed the achievement target in the 2017-18 assessment year and left it unchanged at a 1.5% increase in graduates from the previous year, as the SLO team agreed that this was a realistic goal given the new initiatives and resources for students. These results will be shared with the Office of Student Success and Initiatives, the Counseling/Advising Divisions, and all appropriate offices so there can be further initiatives to increase student enrollment, success, and retention. Students are now receiving structured advising from their faculty advisors, which is an essential component of student success and has had a positive effect on student retention so far. The guided pathways program is also likely to positively impact student retention and student success. The SLO team is hopeful of stable student enrollment and retention as the advising program gains further structure and enhancements. This program goal will be assessed again next year in 2018-2019, and the achievement target will be reviewed again to decide whether it is realistic to sustain or whether a more realistic target should be established. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Liberal Arts: English Specialization, A.A. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This program is designed for students who plan to transfer to a college or university for a Bachelor of Arts or a Bachelor of Science in English, Creative Writing or Writing and/or Rhetoric as an entry-level professional writer.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Student analyzes written, oral, and visual texts

Survey of American Literature II ENG 242 Direct Measure: Student essays from ENG 242. Rubric attached. The ENG Discipline Group selected 1 of our existing Student Learning Outcomes (SLO): SLO 2, Student analyzes written, oral, and visual texts and collected data from ENG 242: Survey of American Literature II. We elected to separate 241 and 242 data because the sample size for 242 was considerably smaller than that of 241. 241 data for SLO 2 is used as the ENG Discipline Group’s Core Learning Outcome assessment data (see below). In Spring 2018, a subcommittee of 7 full-time disciplinary faculty designed a 2-criteria rubric to measure student learning for SLO 2 and for SLO 4 (described below). The faculty subcommittee tested the two SLO rubrics used by norming them against samples from Fall 2017 semester ENG 241 students. The samples were provided by 2 of the faculty on the committee. This assessment relied upon random sampling of students. Samples were generated by the Office of Student Success Initiatives (OSSI). For each section, a sample of 5 students and 3 alternates was generated. Additionally, OSSI generated a list of all students in ENG 242 who were enrolled in the Liberal Arts-English Specialization degree program. Faculty who taught sections of ENG 242 were requested to provide a written assignment and responses written by 5 of those randomly selected students and all English Specialization students identified by OSSI. Only 8 Specialization students were enrolled in ENG

Semester/year data collected: Spring 2018 Target: Students’ average scores on the two rubric criteria will be at least 2. Results by In-Class, ELI, Dual Enrollment: SLO 2a: Student analyzes written, oral, and visual texts: Identifies content, structure, and rhetorical features of the text(s) under consideration in the paper.

Results by Campus/Modality | Spring 2018 Average Score
MA | 2.55
LO | 2.1
ELI | 2.57
Total | 2.44

Results by In-Class, ELI, Dual Enrollment: SLO 2b: Student analyzes written, oral, and visual texts: Appropriately employs critical terminology in written work

Results by Campus/Modality | Spring 2018 Average Score
MA | 2.5
LO | 2.1
ELI | 2.57
Total | 2.42

Current results improved: N/A: This SLO has never been assessed by the English Discipline Group. Improvement cannot be determined at this time. Strengths by Criterion/Question/Topic: Limitation: Given that only 4 of 5 sections of ENG 242 offered in Spring 2018 are represented in the dataset, it is difficult to draw definitive conclusions about student learning. However, the students in the sample seemed to be able to analyze written texts and integrate evidence from those texts into their writing.

Previous action(s) to improve SLO: None - This is the program’s first attempt to assess SLOs; therefore, past results are unavailable. Target Met: [X] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Assignment design: The ENG Discipline needs to further investigate the alignment of assignments to student learning outcomes (SLOs). The data comparing on campus and online versions of the course demonstrate that student success varied between these two delivery modes, which may be related to the type of assignments given in each setting. Faculty development that focuses on assignment design would help ensure that assignments incorporate student learning outcomes without mandating particular pedagogical approaches that would limit academic freedom. Additionally, the assignments submitted as part of this assessment could be used as a starting point for discussing the variety of ways SLOs can be incorporated. In particular, one instructor’s assignment seemed to produce better average results for both criteria when compared to both online and other in-person essays.

Results: Current Assessment Results

Results by Instructor/Modality | SLO 2a | SLO 2b
Instructor 9 | 2.75 | 3.33
Other in-person | 2.18 | 1.82
ELI | 2.57 | 2.57
Overall Average | 2.44 | 2.42

Students overall seem to be able to identify textual features in their analysis papers and to have some success in applying critical terminology. However, tighter integration of SLOs into assignments could improve these outcomes. Current actions to improve SLO based on the results:


242, and only 5 of those completed assignments that were represented in the data set. Spring 2018 data were scored by 5 disciplinary faculty in September 2018; 2 of these disciplinary faculty had been part of the rubric generation. Each student essay was scored by 2 faculty readers. During the scoring session, faculty had the opportunity to discuss divergent scores and used these discussions to refine application of the scoring rubrics. Sample Size:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 0 | 0
MA | 2 | 2 | 11
LO | 1 | 1 | 6
ELI | 1 | 1 | 7
DE* | N/A | N/A | N/A
Total | 5 | 4 | 24

*Dual-enrollment

Weaknesses by Criterion/ Question/Topic: Given the limitations of the sample size, it is difficult to accurately assess weakness. Students seem to be adequately performing analysis tasks in their writing.

Revision of Course Content Summaries: The ENG Discipline Group has formed a committee to review the 200-level literature Course Content Summaries (CCS). This committee will recommend changes to the CCS for ENG 241 and other literature courses to ensure that the SLOs for the discipline and specialization are better reflected in the CCS (Spring 2019).
Initiation of a standing subcommittee for assessment: The ENG Discipline Group will initiate a standing subcommittee for assessment. This subcommittee should develop assessment plans, including developing rubrics to assess the SLOs for both the Discipline and the English Specialization as well as for the Core Learning Outcomes (Spring 2019).
Professional Development – Rubric and assessment development: The committee tasked with developing assessments and creating rubrics should receive some training to assist with these tasks (Spring-Fall 2019).
Professional Development – Assignment design: The ENG Discipline Group will investigate resources for assignment design professional development (Spring-Fall 2019).
Professional Development – Scoring opportunities: The initial group of 5 faculty scorers appreciated the opportunity to score and have conversations about these essays. The discussions about divergent scores and about the assignments that prompted the student work assessed were valuable opportunities to better understand teaching and learning. In the future, the ENG Discipline Group would like to broaden this opportunity to include other full-time disciplinary faculty. Adjunct faculty, too, would find this activity valuable; however, the Discipline Group recognizes that there should ideally be some sort of stipend or additional compensation provided, as this is a labor-intensive assessment method (seek funding for the 19-20 academic year).
Next assessment of this SLO: 2018-19

Student integrates evidence and competing primary and/or secondary claims

Survey of American Literature II ENG 242 Direct Measure: Student essays from ENG 242. Rubric attached. The ENG Discipline Group selected 1 of our existing Student Learning Outcomes (SLO): SLO 4, Student integrates evidence and competing primary and/or secondary claims effectively into

Semester/year data collected: Spring 2018
Target: Students’ average scores on the two rubric criteria will be at least 2.
Results by In-Class, ELI, Dual Enrollment: SLO 4a: Student integrates evidence and competing primary and/or secondary claims

Previous action(s) to improve SLO: None - This is the discipline’s first attempt to assess SLOs; therefore, past results are unavailable. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement:


effectively into argument-based writing (or other) assignments.

argument-based writing (or other) assignments, and collected data from ENG 242: Survey of American Literature II. We elected to separate 241 and 242 data because the sample size for 242 was considerably smaller than that of 241. In Spring 2018, a subcommittee of 7 full-time disciplinary faculty designed a 2-criteria rubric to measure student learning for SLO 2 and SLO 4. The CLO assessment did not use SLO 4. The assessment of this SLO will be discussed in the 2017-18 Annual Planning and Evaluation Report. The faculty subcommittee tested the two SLO rubrics used by norming them against samples from Fall 2017 semester 241 students. The samples were provided by 2 of the faculty on the committee. This assessment relied upon random sampling of students. Samples were generated by the Office of Student Success Initiatives (OSSI). For each section, a sample of 5 students and 3 alternates was generated. Additionally, OSSI generated a list of all students in ENG 241 who were enrolled in the Liberal Arts-English Specialization degree program. Faculty who taught sections of 241 were requested to provide a written assignment, and responses written by 5 of those randomly selected students and all English Specialization students identified by OSSI. Only 8 Specialization students were enrolled in ENG 242, and only 5 of those completed assignments that were represented in the data set. Spring 2018 data were scored by 5 disciplinary faculty in September 2018; 2 of these disciplinary faculty had been part of the rubric generation. Each student essay was scored by 2 faculty readers. During the scoring session, faculty had the opportunity to discuss divergent scores and used these discussions to refine application of the scoring rubrics. Sample Size:

effectively into argument-based writing (or other) assignments: Identifies and integrates evidence from sources.

Results by Campus/Modality   Spring 2018 Average Score
MA                           2.77
LO                           2.30
ELI                          2.71
Total                        2.44

Results by In-Class, ELI, Dual Enrollment: SLO 4b: Student integrates evidence and competing primary and/or secondary claims effectively into argument-based writing (or other) assignments: Considers competing claims from sources.

Results by Campus/Modality   Spring 2018 Average Score
MA                           1.64
LO                           1.60
ELI                          2.43
Total                        1.83

Current results improved: N/A - This SLO has never been assessed by the English Discipline Group. Improvement cannot be determined at this time. Strengths by Criterion/ Question/Topic: Limitation: Given that only 4 of 5 sections of ENG 242 offered in Spring 2018 are represented in the dataset, it is difficult to draw definitive conclusions about student learning. However, the students in the sample seemed to be able to analyze written texts and integrate evidence from those texts into their writing. Weaknesses by Criterion/ Question/Topic: Students do not seem to consider claims from competing sources in their written analyses. However, we cannot draw any definitive conclusions from the limited sample of student work.

Assignment design: The ENG Discipline needs to further investigate the alignment of assignments to student learning outcomes (SLOs). The data comparing on campus and online versions of the course demonstrate that student success varied between these two delivery modes, which may be related to the type of assignments given in each setting. Faculty development that focuses on assignment design would help ensure that assignments incorporate student learning outcomes without mandating particular pedagogical approaches that would limit academic freedom. Additionally, the assignments submitted as part of this assessment could be used as a starting point for discussing the variety of ways SLOs can be incorporated. In particular, one instructor’s assignment seemed to produce better average results for both criteria when compared to both online and other in-person essays.

Results

Current Assessment Results

                   SLO 4a   SLO 4b
Instructor 9        2.92     2.25
Other in-person     2.45     1.23
ELI                 2.71     2.43
Overall Average     2.65     1.83

Acknowledging competing, or at least alternative, claims is a consistent area of concern. This may suggest that assignments do not explicitly require that students incorporate other points of view into their papers. This suggests that both assignment design and SLO-Course Content Summary alignment may need more attention.
Current actions to improve SLO based on the results:
Revision of Course Content Summaries: The ENG Discipline Group has formed a committee to review the 200-level literature Course Content Summaries (CCS). This committee will recommend changes to the CCS for ENG 242 and other literature courses to ensure that the SLOs for the discipline and specialization are better reflected in the CCS (Spring 2019).
Initiation of a standing subcommittee for assessment: The ENG Discipline Group will initiate a standing subcommittee for assessment. This subcommittee should develop assessment plans, including developing rubrics to assess the SLOs for both the Discipline and English Specialization as well as for the Core Learning Outcomes (Spring 2019).


Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
AL                1                             0                     0
MA                2                             2                     11
LO                1                             1                     6
ELI               1                             1                     7
DE*               N/A                           N/A                   N/A
Total             5                             4                     24

*Dual-enrollment

Professional Development—Rubric and assessment development: The committee tasked with developing assessments and creating rubrics should receive some training to assist with these tasks (Spring-Fall 2019).
Professional Development—Assignment Design: The ENG Discipline Group will investigate resources for assignment design professional development (Spring-Fall 2019).
Professional Development—Scoring Opportunities: The initial group of 5 faculty scorers appreciated the opportunity to score and have conversations about these essays. The discussions about divergent scores and the assignments that prompted student work assessed were valuable opportunities to better understand teaching and learning. In the future, the ENG Discipline Group would like to broaden this opportunity to include other full-time disciplinary faculty. Adjunct faculty, too, would find this activity valuable; however, the Discipline Group recognizes that there should ideally be some sort of stipend or additional compensation provided as this is a labor-intensive assessment method (seek funding for the 2019-20 Academic Year).
Next assessment of this SLO: 2018-19

Student integrates evidence and competing primary and/or secondary claims effectively into argument-based writing (or other) assignments.

Survey of American Literature I ENG 241 Direct Measure: Student essays from ENG 241. Rubric attached. See description of methods in CLO below. Sample Size (Specify N/A where not offered)

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
AN                3                             3                     14
MA                1                             1                     6
WO                1                             1                     6
ELI               4                             4                     21
DE*               N/A                           N/A                   N/A
Total             9                             9                     47

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: Students’ average scores on the two rubric criteria will be at least 2.
Results by In-Class, ELI, Dual Enrollment: SLO 2a: Student analyzes written, oral, and visual texts: Identifies content, structure, and rhetorical features of the text(s) under consideration in the paper.

Results by Campus/Modality   Spring 2018 Average Score
AN                           2.50
MA                           2.08
WO                           1.50
ELI                          2.38
Total                        2.27

Results by In-Class, ELI, Dual Enrollment: SLO 4b: Student integrates evidence and competing primary and/or secondary claims effectively into argument-based writing (or other)

Previous action(s) to improve SLO: None - This is the discipline’s first attempt to assess SLOs; therefore, past results are unavailable. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: Assignment design: The ENG Discipline needs to further investigate the alignment of assignments to student learning outcomes (SLOs). The data comparing on campus and online versions of the course demonstrate that student success varied between these two delivery modes, which may be related to the type of assignments given in each setting. Faculty development that focuses on assignment design would help ensure that assignments incorporate student learning outcomes without mandating particular pedagogical approaches that would limit academic freedom. Additionally, the assignments submitted as part of this assessment could be used as a starting point for discussing the variety of ways SLOs can be incorporated. In particular, one instructor’s assignment seemed to produce better


assignments: Considers competing claims from sources.

Results by Campus/Modality   Spring 2018 Average Score
AN                           2.14
MA                           0.67
WO                           0.83
ELI                          1.83
Total                        1.65

Current results improved: N/A - This SLO has never been assessed by the English Discipline Group. Improvement cannot be determined at this time. Strengths by Criterion/ Question/Topic: Students seem to be able to analyze texts with competency, as demonstrated by the overall average score of 2.27 on SLO 2a: Identifies content, structure, and rhetorical features of the text(s) under consideration in the paper. Weaknesses by Criterion/ Question/Topic: Not all students used literary terminology or other critical terminology in their analyses (SLO 2b: Appropriately employs critical terminology in written work). Limitations of the analysis: Though ENG 241 is the most frequently offered 200-level literature course in the discipline, it was not offered on all campuses, nor were many sections offered on each campus. Average scores, especially on the Woodbridge campus, skewed lower due to performance of one student in the small 6-student sample. We have discovered that capturing data from our specialization students is extremely challenging. The Discipline Group committee that developed this assessment did not feel that it would be appropriate to capture data from specialization students in ENG 111, 112, or 125 as these are the foundational courses and would not truly allow us to differentiate between our specialization students’ learning and that of other students.

average results for both criteria when compared to both online and other in-person essays.
Acknowledging competing, or at least alternative, claims is a consistent area of concern. This may suggest that assignments do not explicitly require students to incorporate other points of view into their papers. This suggests that both assignment design and SLO-Course Content Summary alignment may need more attention.
Current actions to improve SLO based on the results:
Revision of Course Content Summaries: The ENG Discipline Group has formed a committee to review the 200-level literature Course Content Summaries (CCS). This committee will recommend changes to the CCS for ENG 241 and other literature courses to ensure that the SLOs for the discipline and specialization are better reflected in the CCS (Spring 2019).
Initiation of a standing subcommittee for assessment: The ENG Discipline Group will initiate a standing subcommittee for assessment. This subcommittee should develop assessment plans, including developing rubrics to assess the SLOs for both the Discipline and English Specialization as well as for the Core Learning Outcomes (Spring 2019).
Professional Development—Rubric and assessment development: The committee tasked with developing assessments and creating rubrics should receive some training to assist with these tasks (Spring-Fall 2019).
Professional Development—Assignment Design: The ENG Discipline Group will investigate resources for assignment design professional development (Spring-Fall 2019).
Professional Development—Scoring Opportunities: The initial group of 5 faculty scorers appreciated the opportunity to score and have conversations about these essays. The discussions about divergent scores and the assignments that prompted student work assessed were valuable opportunities to better understand teaching and learning. In the future, the ENG Discipline Group would like to broaden this opportunity to include other full-time disciplinary faculty. Adjunct faculty, too, would find this activity valuable; however, the Discipline Group recognizes that there should ideally be some sort of stipend or additional compensation


provided as this is a labor-intensive assessment method (seek funding for the 2019-20 Academic Year).
Next assessment of this SLO: 2018-19

Core Learning Outcome

Evaluation Methods Assessment Results Use of Results

CLO: [ X ] CT

Survey of American Literature I ENG 241 Direct Measure: Student essays from ENG 241. Rubric attached. To assess critical thinking, the ENG Discipline Group selected 1 of our existing Student Learning Outcomes (SLO): Student analyzes written, oral, and visual texts. In Spring 2018, a subcommittee of 7 full-time disciplinary faculty designed a 2-criteria rubric to measure student learning for SLO 2 and for SLO 4, Student integrates evidence and competing primary and/or secondary claims effectively into argument-based writing (or other) assignments. The CLO assessment did not use SLO 4. The assessment of this SLO will be discussed in the 2017-18 Annual Planning and Evaluation Report. The faculty subcommittee tested the two SLO rubrics used by norming them against samples from Fall 2017 semester ENG 241 students. The samples were provided by 2 of the faculty on the committee. This assessment relied upon random sampling of students. Samples were generated by the Office of Student Success Initiatives (OSSI). For each section, a sample of 5 students and 3 alternates was generated. Additionally, OSSI generated a list of all students in ENG 241 who were enrolled in the Liberal Arts-English Specialization degree program. Faculty who taught sections of ENG 241 were requested to provide a written assignment and responses written by 5 of those randomly selected students and all English Specialization students identified by OSSI. Only 4 Specialization students were enrolled in ENG 241, and all are represented in the data set.

Semester/year data collected: Spring 2018
Target: Students’ average scores on the two rubric criteria will be at least 2.
Results by In-Class, ELI, Dual Enrollment: SLO 2a: Student analyzes written, oral, and visual texts: Identifies content, structure, and rhetorical features of the text(s) under consideration in the paper.

Results by Campus/Modality   Spring 2018 Average Score
AN                           2.43
MA                           2.25
WO                           1.42
ELI                          2.33
Total                        2.23

Results by In-Class, ELI, Dual Enrollment: SLO 2b: Student analyzes written, oral, and visual texts: Appropriately employs critical terminology in written work

Results by Campus/Modality   Spring 2018 Average Score
AN                           2.29
MA                           1.67
WO                           0.92
ELI                          1.45
Total                        1.66

Current results improved: N/A - This CLO has never been assessed by the English Discipline Group. Improvement cannot be determined at this time. Strengths by Criterion/ Question/Topic: Students seem to be able to analyze texts with competency, as demonstrated by the overall average score of 2.23 on SLO 2a: Identifies content, structure, and rhetorical features of the text(s) under consideration in the paper.

Previous action(s) to improve CLO: None - This is the discipline’s first attempt to assess CLOs; therefore, past results are unavailable.
Target Met: [ ] Yes [ ] No [ X ] Partially
Based on recent results, areas needing improvement:
Assignment design: The ENG Discipline needs to further investigate the alignment of assignments to student learning outcomes (SLOs). The data comparing on campus and online versions of the course demonstrate that student success varied between these two delivery modes, which may be related to the type of assignments given in each setting. Faculty development that focuses on assignment design would help ensure that assignments incorporate student learning outcomes without mandating particular pedagogical approaches that would limit academic freedom. Additionally, the assignments submitted as part of this assessment could be used as a starting point for discussing the variety of ways SLOs can be incorporated.
Incorporation of critical or literary terminology in analysis papers and acknowledging competing, or at least alternative, claims are consistent areas of concern. This may suggest that assignments do not explicitly require that students apply the terminology taught in the classes and/or that the assignments assessed do not require students to incorporate other points of view into their papers.
Current actions to improve CLO based on the results:
Revision of Course Content Summaries: The ENG Discipline Group has formed a committee to review the 200-level literature Course Content Summaries (CCS). This committee will recommend changes to the CCS for ENG 241 and other literature courses to ensure that the SLOs for the discipline and specialization are better reflected in the CCS (Spring 2019).
Initiation of a standing subcommittee for assessment: The ENG Discipline Group will initiate a standing subcommittee for assessment. This subcommittee should develop assessment plans, including developing rubrics to assess the SLOs for both the Discipline and English


Spring 2018 data were scored by 5 disciplinary faculty in September 2018; 2 of these disciplinary faculty had been part of the rubric generation. Each student essay was scored by 2 faculty readers. During the scoring session, faculty had the opportunity to discuss divergent scores and used these discussions to refine application of the scoring rubrics. Sample Size (Specify N/A where not offered)

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
AN                3                             3                     14
MA                1                             1                     6
WO                1                             1                     6
ELI               4                             4                     21
DE*               N/A                           N/A                   N/A
Total             9                             9                     47

*Dual-enrollment

Weaknesses by Criterion/ Question/Topic: Not all students used literary terminology or other critical terminology in their analyses (SLO 2b: Appropriately employs critical terminology in written work). Limitations of the analysis: Though ENG 241 is the most frequently offered 200-level literature course in the discipline, it was not offered on all campuses, nor were many sections offered on each campus. Average scores, especially on the Woodbridge campus, skewed lower due to performance of one student in the small 6-student sample. The next time this SLO is assessed, we will determine whether variation between campuses is a trend that needs to be addressed in either the Discipline Group or the Language Pathways Council. We have discovered that capturing data from our specialization students is extremely challenging. The Discipline Group committee that developed this assessment did not feel that it would be appropriate to capture data from specialization students in ENG 111, 112, or 125 as these are the foundational courses and would not truly allow us to differentiate between our specialization students’ learning and that of other students.

Specialization as well as for the Core Learning Outcomes (Spring 2019).
Professional Development—Rubric and assessment development: The committee tasked with developing assessments and creating rubrics should receive some training to assist with these tasks (Spring-Fall 2019).
Professional Development—Assignment Design: The ENG Discipline Group will investigate resources for assignment design professional development (Spring-Fall 2019).
Professional Development—Scoring Opportunities: The initial group of 5 faculty scorers appreciated the opportunity to score and have conversations about these essays. The discussions about divergent scores and the assignments that prompted student work assessed were valuable opportunities to better understand teaching and learning. In the future, the ENG Discipline Group would like to broaden this opportunity to include other full-time disciplinary faculty. Adjunct faculty, too, would find this activity valuable; however, the Discipline Group recognizes that there should ideally be some sort of stipend or additional compensation provided as this is a labor-intensive assessment method (seek funding for the 2019-20 Academic Year).
Next assessment of this CLO: Spring 2020

Program Goals Evaluation Methods Assessment Results Use of Results

Program goal on program-placed students

Prior to 2017-2018, the Liberal Arts-English Specialization was not assessed as a program, but was assessed as part of the Liberal Arts program. In the Office of Institutional Effectiveness and Student Success’ annual reports of data for planning and evaluation, none of the specializations, including the English specialization, are disaggregated from the Liberal Arts degree program. Liberal Arts program placement data for Fall 2013, 2014, and 2015 was derived from the Annual Planning and Evaluation Report for each of those academic years, which is available at https://www.nvcc.edu/assessment/

Semester/year data collected: Fall 2017
Target: No target has been established.
Results for Past 5 Years (number of program-placed students and percentage change):
Fall 2017 - Liberal Arts: OIESS report not available (change unknown); English Specialization: 61 (-15.28%)
Fall 2016 - Liberal Arts: OIESS report not available (change unknown); English Specialization: 72 (-2.78%)

Previous action(s) to improve program goal: N/A – This is the first time reporting on the English Specialization separately from Liberal Arts. Most recent results: Like the larger Liberal Arts degree program, the English Specialization has seen some fluctuation in program placement since 2014. Results improved: N/A Current action(s) to improve program goal: The English Discipline Group plans to consult with the Liberal Arts Pathway Council to determine what actions need to be taken to improve program placement rates. We expect this coordination to take place during the 2019-20 Academic Year, once the Pathway Councils have become an established part of the newly-reorganized NOVA college structure. At that time, the English Discipline Group will


An SIS query was run to determine program placement for the English specialization (Academic Plan 5113).

Fall 2015 - Liberal Arts: 3,278 (-6.48%); English Specialization: 74 (-15.5%)
Fall 2014 - Liberal Arts: 3,505 (+23.58%); English Specialization: 87 (N/A)
Fall 2013 - Liberal Arts: 3,278; English Specialization: N/A

Target Met: N/A - No target has been established. Comparison to previous assessment(s): N/A. The English Specialization has not been assessed previously as an entity separate from the Liberal Arts degree program.

reconstitute an English Specialization Subcommittee to coordinate with the Liberal Arts Pathway Council on an on-going basis. Additionally, the English Discipline Group will identify a program placement target for 2018-19. Assessed: Annually

Program goal on graduation

Prior to 2017-2018, the Liberal Arts-English Specialization was not assessed as a program, but was assessed as part of the Liberal Arts program. The Program drew upon the graduation rates published by Office of Institutional Effectiveness and Student Success in its annual reports of data for planning and evaluation. The Liberal Arts-English Specialization was first offered in the 2014-15 academic year; therefore, there is no data for 2013-2014.

Semester/year data collected: 2017-18
Target: No target has been established.
Results for Past 5 Years:

Academic Year   Number of Graduates   Percentage Increased
2017-2018       25                    92%
2016-2017       13                    117%
2015-2016       6                     100%
2014-2015       3                     N/A (1st year)
2013-2014       N/A                   N/A

Target Met: N/A: No target has been established Comparison to previous assessment(s): N/A. The English Specialization has not been assessed previously as an entity separate from the Liberal Arts degree program.

Previous action to improve program goal: The English Discipline Group, formerly the English Discipline Cluster, has encouraged full-time faculty to work closely with Specialization students to ensure that they receive advising that helps them finish their degree programs. Most recent results: The number of English Specialization graduates continues to grow since it was first offered in the 2014-15 academic year. Results improved: [ X ] Yes [ ] No [ ] Partially Current actions to improve program goal: The English Discipline Group plans to consult with the Liberal Arts Pathway Council to determine what actions need to be taken to improve graduation rates. We expect this coordination to take place during the 2019-20 academic year, once the Pathway Councils have become an established part of the newly-reorganized NOVA college structure. At that time, the English Discipline Group will reconstitute an English Specialization Subcommittee to coordinate with the Liberal Arts Pathway Council on an on-going basis. Additionally, the English Discipline Group will identify a graduation target for 2018-19. Assessed: Annually
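For reference, the "Percentage Increased" figures in the placement and graduation tables above are simple year-over-year changes. A minimal sketch of that calculation is shown below (illustrative only and not part of the original report; the function name and data structure are assumptions), with the graduation counts from the table above as input.

```python
def pct_change(current, previous):
    """Year-over-year percentage change; None mirrors the report's N/A."""
    if previous in (None, 0):
        return None
    return round((current - previous) / previous * 100, 2)

# Graduates per academic year, copied from the table above
grads = {"2014-2015": 3, "2015-2016": 6, "2016-2017": 13, "2017-2018": 25}
years = sorted(grads)
for prev, curr in zip(years, years[1:]):
    print(curr, pct_change(grads[curr], grads[prev]))
# -> 2015-2016 100.0, 2016-2017 116.67, 2017-2018 92.31
#    (reported above, rounded, as 100%, 117%, and 92%)
```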


Annual Planning and Evaluation Report: 2017-2018 Marketing, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: The curriculum is designed for persons who seek full-time employment or advancement in areas involving marketing and marketing management. The career objectives include marketing assistant manager, store owner and department manager, sales supervisor, customer service representative, front-line supervisor, promotion and public relations assistant, advertising account associate, marketing communications assistant, international marketing intern, social media marketing specialist, brand ambassador, event marketing associate and e-commerce sales support for business, government and not-for-profit organizations. Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to communicate effectively to the public, press and employees in both oral and written formats with grammatical accuracy.

Public Relations MKT 221
Direct Measure: News release and media kit with a PR plan for a new product launch presented during a mock press conference to show mastery of the written and spoken aspects of public relations used to measure this SLO.
SLO components: Evaluation based on clarity and format style, grammatical accuracy, clear communications during presentation and overall mastery of PR communications skills. Faculty member evaluated each component. Students were rated as underperforming (0-39 points) or meeting and exceeding expectations (40-50 points). Assignment attached.
Sample size: 21 students
• Sections surveyed: 1 at Annandale
• Total sections: 1 classroom

Semester/year data collected: Fall 2017
Target: 75% of students will meet skill requirements indicating mastery of SLO.
Results: All 21 students successfully achieved the target:

Semester           0-39 pts.   40-50 pts.
Fall 2017 (n=21)   0           21 (100%)
Fall 2016 (n=3)    0           3 (100%)
Fall 2015 (n=9)    1           8 (88%)
Fall 2014 (n=10)   2           8 (80%)
Fall 2013 (n=13)   3           10 (76%)

SLO components evaluation - Fall 2017:

Component                                       Underperforming   Meeting Expectations
Clarity & format style                          0                 21 (100%)
Grammatical accuracy                            0                 21 (100%)
Clear oral communications during presentation   0                 21 (100%)

Current results improved: [ ] Yes [ X ] No [ ] Partially - Data same as last year; all areas met expectations in 2016 and 2017.

Target Met: [ X ] Yes [ ] No [ ] Partially
The target was met or exceeded expectations for this SLO and for individual SLO components (see column 3). One hundred percent of the students successfully achieved the overall target of 75%. This is the continuation of a positive trend over the past five years. This trend is due to the use of project outlines, plus support from the Campus Writing Center and faculty assistance. The success also carried over to the individual component evaluations.
Previous action: Fall 2017 required the student project outline prior to the final project presentation to improve SLO results. Students having trouble with this assignment will be referred to the Campus Writing Center for assistance. Also, more individual assistance was provided by the professor for SLO components success in 2017. The Assistant Dean will remind faculty during 2017-18 semester meetings to focus on communication activities in their MKT classes.
Current action: Continue previous actions which contributed to student SLO success in 2017 such as project outlines and use of the Campus Writing Center for writing and editing purposes during 2018.
Next assessment: Fall 2018

Students will be able to describe the elements of the

Introduction to Marketing MKT 201

Semester/year data collected: Fall 2017 Target: 80% of students will meet skill requirements

Target Met: [ ] Yes [ X ] No [ ] Partially The target was not met for this SLO for both in-class and ELI sessions, as well as for some individual SLO components


marketing mix (product, price, place, and promotion) and their integration to achieve customer satisfaction and organization goals.

Direct Measure: Survey used to test the student’s ability to list, explain and then apply their knowledge indicating mastery of the information. Assignment attached.
SLO components: Evaluation based on student understanding of each individual component. Faculty member evaluated each component. Students were rated as underperforming (0-79%) or meeting and exceeding expectations (80-100%).
Sample size: 111 students total
Total sections: 11
Sample size: In-class - 88; ELI - 23
Sections surveyed: 9 - AL, AN, ELI, LO

indicating mastery of SLO.
Results - Classroom: 69 out of 88 students (78%) successfully achieved the target.

Classroom          0-79%      80-100%
Fall 2017 (n=88)   19 (22%)   69 (78%)

Results - ELI: 18 out of 23 students (78%) enrolled in ELI successfully achieved the target.

ELI                0-79%      80-100%
Fall 2017 (n=23)   5 (22%)    18 (78%)

Classroom results - SLO components, Fall 2017:

Component    Underperforming   Meeting Expectations
Product      15 (17%)          73 (83%)
Price        22 (25%)          66 (75%)
Placement    19 (22%)          69 (78%)
Promotion    16 (18%)          72 (82%)

ELI results - SLO components, Fall 2017:

Component    Underperforming   Meeting Expectations
Product      3 (13%)           20 (87%)
Price        6 (26%)           17 (73%)
Placement    7 (30%)           16 (70%)
Promotion    3 (13%)           20 (87%)

Results from Fall 2016:
• Overall: 81% meeting expectations
• Product: 83% meeting expectations
• Price: 79% meeting expectations
• Placement: 79% meeting expectations
• Promotion: 81% meeting expectations

Current results improved: [ ] Yes [ ] No [ X ] Partially
Strengths by Criteria/Question Topic: Areas where results improved or are above target:
• Product mastery improved 2%
• Promotion mastery improved 3.5%

(see column 3). SLO results for classroom and ELI are reported separately as now required. Only 78% of the students successfully achieved the 80% target in both in-class and ELI formats, and this needs to be improved in 2018. When ELI and classroom results were combined in 2017 and 2016, success was achieved at 81% and 80% respectively. The SLO results have been right at the edge of the 80% target goal. The in-class and ELI individual components reflect underperforming results for the “pricing” and “placement” components. Additional work in these two areas is needed to achieve the 80% target in 2018.
Annandale Campus offers the most MKT 201 sections. Woodbridge and Reston Campuses did not respond to three emails sent starting September 2017. The College website listed no MKT 201 classes for Manassas during Fall 2017. College faculty teaching MKT 201 were provided with survey instructions and a reply form for results, a student survey form which could be duplicated or emailed, along with a faculty answer key. Faculty were asked to complete and forward results by the end of the Fall 2017 semester.
Previous action: Use current rubric in 2017 and continue previous actions which contributed to a positive trend in student success. Also, focus additional time on students’ reviews, study guides, online activities, videos and assignments in specific areas of “pricing” and “placement” to improve student overall SLO and individual components success in 2017.
Current action: Use current rubric in 2018 and focus more time than previously assigned on student reviews, study guides, online activities, videos and assignments in specific areas to increase 2018 SLO overall success to 80% in both classroom and ELI formats. More individual attention is also needed for the “pricing” and “placement” components and can be supported by reinforcing terminology. This should improve both overall and component SLO success in 2018. Request participation from Woodbridge and Reston Campuses in 2018.


Weaknesses by Criteria/Question Topic: Areas where results did not improve or remain below target:
• Price mastery decreased by 5%
• Placement mastery decreased by 5%

Next assessment: Fall 2018

Students will be able to demonstrate how to achieve organizational objectives by effectively interacting with others as team members and team leaders.

Principles of eCommerce MKT 282
Direct Measure: Student groups create a company website using Google. Evaluation based on ability to work as a team producing a website that is engaging, easy to navigate, allows for interaction with customers, appropriate for target audience and without spelling errors. Assignment attached. Faculty member evaluated each component. Teams were rated as underperforming with 0-79 points or meeting and exceeding expectations with 80-100 points in 2018.
Sample size: 15 students
• Sections surveyed: 1 at Annandale - Hybrid
• Total sections: 1

Semester/year data collected: Spring 2018
Target: 80% of students will meet skill requirements indicating mastery of SLO. (Target goal increased to 80% in 2017.)
Results: 13 out of 15 students (87%) successfully achieved the target. Target goal 80 points:

Semester             0-79 pts.   80-100 pts.
Spring 2018 (n=15)   2           13 (87%)
Spring 2017 (n=13)   0           13 (100%)

Target goal 75 points:

Semester             0-74 pts.   75-100 pts.
Spring 2016 (n=12)   0           12 (100%)
Spring 2015 (n=10)   0           10 (100%)
Spring 2014 (n=20)   0           20 (100%)

Student component collaboration: Success based on student survey results. Rating for each individual component is done on a possible one through five points. One through three points indicates expectations not met. Four through five points indicates success. Percentage of success indicated below. Target goal 80%:

Student Rating Criteria of Success   2017 (n=11)   2018 (n=15)
Attendance                           100%          87%
Goals                                90%           87%
Accountability                       100%          93%
Cohesion                             100%          87%
Communication                        100%          87%
Decision Making                      90%           87%

Target Met: [ X ] Yes [ ] No [ ] Partially
The target of 80% was met or exceeded expectations for this SLO and for individual SLO components (see column 3). The ability to work as student teams was successfully achieved in 2018 by the majority of students. Two students had some difficulty working together; however, they did manage to produce a successful website. Students enjoy this assignment since this is a small class and most students know each other from other marketing courses. The individual component evaluation for 2018 also indicates these teams function well even though there was some decrease in results. There were no 100% successes as reported in 2017; however, all 2018 components achieved 87% success or above. In 2017, “conflict resolution” was the least successful component at 81% success. The 2018 evaluation indicates an increase to 87% success which is an improvement and positive trend.
Previous action: Use 80% for the overall target and the comparison of component results in 2018. Also, discuss with students the importance of class attendance and basic conflict resolution methods as the assignment is presented.
Current action: Continue using the student survey to determine overall team success and evaluation of individual components in 2018. Focus more time on improving “leadership” and “conflict resolution” skills when the project is presented by the professor. Also, continue to emphasize the importance of class attendance for teams to function properly. These actions should improve overall success and individual component success in 2018.
Current results improved: [ ] Yes [ x ] No [ ] Partially
Strengths by Criteria/Question Topic: Areas where results improved or are above target – None


Student Rating Criteria of Success (continued)   2017 (n=11)   2018 (n=15)
Adjusting                                        90%           87%
Assessment                                       90%           87%
Timely Work                                      100%          93%
Leadership                                       90%           87%
Conflict                                         81%           87%

Weaknesses by Criteria/Question Topic: Areas where results did not improve or remain below target - All areas weaker in 2018.
Next Assessment: Spring 2019

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Quantitative Reasoning Students will be able to apply basic business math to inventory planning and control, pricing strategies, budget calculations, stock turns, and inventory loss.

Merchandise Buying & Control MKT 227
Direct Measure: Comprehensive merchandising math exam used to evaluate CLO – selected as the General Education core competency evaluation of student math skills. Includes stock turnover, planned purchases, open-to-buy, vendor discounts, inventory shrinkage, pricing, mark ups and mark downs, etc. Math exam part of the final comprehensive class exam. Exam attached. Faculty member evaluated each math question. Students were rated as underperforming (0-44 points) or meeting and exceeding expectations (45-60 points).
Sample size: 19 students
• Sections surveyed: 1 at Annandale
• Total sections: 1

Semester/year data collected: Spring 2018
Target: 75% of students will meet skill requirements indicating mastery of SLO.
Results: 17 out of 19 students (89%) successfully achieved the target:

Semester             0-44 pts.   45-60 pts.
Spring 2018 (n=19)   2           17 (89%)
Spring 2017 (n=11)   2           9 (81%)
Spring 2016 (n=14)   2           12 (85%)
Spring 2015 (n=12)   2           10 (83%)
Spring 2014 (n=20)   2           19 (90%)

CLO components evaluation, Spring 2016 - 2018 (% correct):

Question           2016   2017   2018
1. Stock turnover  73%    71%    78%
2. BOM stock       83%    84%    85%
3. S/S ratio       82%    82%    84%
4. P. purchases    84%    84%    96%
5. Open-to-buy     92%    100%   100%
6. Shrinkage       73%    79%    84%
7. Markup          93%    91%    94%
8. Original MU     84%    83%    86%
9. P. reductions   90%    91%    92%
10. Mark downs     86%    84%    89%

Current results improved: [ X] Yes [ ] No [ ] Partially - All areas are improved over prior year.

Target Met: [ X ] Yes [ ] No [ ] Partially
The target was met or exceeded expectations for this CLO and for individual SLO components (see column 3). This SLO was selected for the General Education Core Competency Assessment of student math skills. Eighty-nine percent of the students successfully achieved the overall CLO target of 75%. This is a positive trend. CLO components also showed overall improvements, especially in “stock turnover,” which achieved the 75% target for the first time. A student merchandising math workbook developed by the faculty member is used to cover this information and to provide numerous practice problems. These problems cover basic high school math.
Previous actions: Faculty will focus on additional student problems calculating “stock turnover” and “merchandise shrinkage” using group work to encourage students to help each other in class. Faculty will refer students struggling with math to the Math and Science Tutoring Center for assistance in Spring 2018.
Current action: Remove SLO #4 as a stand-alone program goal. MKT 227 has been removed from the Marketing Program curriculum. The College MTH requirement MTH 154: Quantitative Reasoning will satisfy program MTH requirements in 2018.
Next assessment: This SLO will no longer be evaluated. The program SLO list will drop from five goals to four. This change appears in the 2018-19 Marketing Program Curriculum Map.

Students will be able to apply

Sales and Marketing Management MKT 215

Semester/year data collected: Spring 2018

Target Met: [ X ] Yes [ ] No [ ] Partially The target was met and/or exceeded expectations for this


marketing principles such as marketing strategies, sales promotion, sales management, good customer service, public relations and ethical procedures in consumer and business transactions...

Direct Measure: Sales presentation used to evaluate this SLO. Assignment attached.
SLO components: Students must follow the appropriate steps in the selling process such as getting the customer’s attention, presenting product benefits, handling objections, and closing the sale. Faculty member evaluated each component. Students were rated as underperforming (0-17 points) or meeting and exceeding expectations (18-25 points).
Sample size: 10 students
• Sections surveyed: 1 at Annandale
• Total sections: 1

Target: 75% of students will meet skill requirements indicating mastery of SLO.
Results: 10 out of 10 students (100%) successfully achieved the target:

Semester             0-17 pts.   18-25 pts.
Spring 2018 (n=10)   0           10 (100%)
Spring 2017 (n=23)   0           23 (100%)
Spring 2016 (n=10)   0           10 (100%)
Spring 2015 (n=14)   0           14 (100%)
Spring 2014 (n=19)   0           19 (100%)

SLO components - Spring 2017 and Spring 2018:

Component             Meeting Goals 2017 (n=23)   Meeting Goals 2018 (n=10)
Approach              23 (100%)                   10 (100%)
Securing desire       23 (100%)                   10 (100%)
Handling objections   23 (100%)                   10 (100%)
Closing the sale      23 (100%)                   10 (100%)

Current results improved: [ ] Yes [ x ] No [ ] Partially - Results same as prior year

SLO and for individual SLO components (see column 3). One hundred percent of the students successfully achieved the overall target, which is a positive trend. Student skills for this SLO were supported by additional student training on sales preparation methods and practice demonstrations. SLO component evaluations also show continued success. New training activities included three additional case study presentations by students prior to the final presentation and Ted Talks videos incorporated into the course materials with student follow-up discussions. Also, Shark Tank (TV show) sales pitches were critiqued by students. This activity was particularly valuable to encourage critical thinking.
Past action: Continue to maintain the positive results by means of student practice demonstrations and using videos to illustrate the proper methods of developing a sales presentation. Previous student activities that have proven successful will be continued in 2018.
Current action: Continue in 2019 with both past and present student learning activities that produced the positive overall and individual component SLO success in 2018.
Next Assessment: Spring 2019

Program Goals Evaluation Methods Assessment Results Use of Results

To keep the marketing curriculum up-to-date reflecting current industry trends.

The Marketing Faculty Cluster and Marketing Program Advisory Committee, using its Marketing Trends Statement (2015), reviewed program courses, degrees and certificates offered.

Assessment results produced recommended AAS degree curriculum changes in 2017.

Target Met: [ X ] Yes [ ] No [ ] Partially
The goal was met or exceeded expectations for keeping the curriculum up-to-date. The Marketing Program recommendations were approved in 2017 by the College Curriculum Committee and College Board. Three courses were removed from the program (MKT 200, MKT 216 and MKT 227) with recommendations from the Advisory Committee. These courses focused on retail skills that are no longer demanded by employers.
Past Actions: Continue using the Marketing Faculty Cluster and Program Advisory Committee with its Industry Trends Statement for program evaluation purposes in 2017-18.
Current Action: Update the Marketing Trends Statement (2015) produced by the Marketing Advisory Committee. Use the updated Trends Statement to review the program skills training and topics


covered, to ensure they are included in current courses to some greater or lesser degree in 2018-19.
Next Assessment: 2018-19

To support and improve successful marketing course completion to encourage student advancement in their program of study.

OIR Success Rates by Disciplines, ELI courses excluded: Data for 2017-18 Planning and Evaluation Reports

Target: 75% completion rate for Marketing courses desired.
Course Completion Rates:

Term        Total Students   Success Rate
Fall 2017   191              83% (159)
Fall 2016   167              80% (134)
Fall 2015   257              77% (197)
Fall 2014   244              76% (185)
Fall 2013   329              78% (256)
Fall 2012   338              79% (266)

Target Met: [ X ] Yes [ ] No [ ] Partially
The goal was met or exceeded expectations for course completion. Results show that the course completion rate of 83.2% for Fall 2017 is an improvement over the previous year and is above the target of 75%. Previous actions taken by the faculty support this effort.
Past action: Discuss MKT course completion rates with faculty. Develop methods of encouraging students to successfully finish their courses in 2016-17. Methods: (1) Closely monitor students needing help; (2) Inform students of tutoring and writing services on campus; (3) Remind students of the academic harm possible if they don’t attend class.
Current action: Continue previous actions in 2018-19 to support the target goal. In addition, remind students at the end of the fall and spring semesters to see their marketing faculty advisor to make sure they are on track to graduate.
Next Assessment: 2018
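As a cross-check on the completion-rate table above, a small sketch (illustrative only and not part of the report; the variable names are assumptions) that recomputes the success rates from the successful/total counts, reading the Fall 2017 row as 159 successful completers out of 191 enrolled to match the 83.2% cited in the narrative:

```python
# (successful completers, total enrolled) per fall term, from the table above
completion = {
    "Fall 2017": (159, 191),
    "Fall 2016": (134, 167),
    "Fall 2015": (197, 257),
    "Fall 2014": (185, 244),
    "Fall 2013": (256, 329),
    "Fall 2012": (266, 338),
}
TARGET = 0.75  # 75% completion-rate target for Marketing courses

for term, (passed, total) in completion.items():
    rate = passed / total
    print(f"{term}: {rate:.1%} ({'target met' if rate >= TARGET else 'target not met'})")
# Fall 2017 -> 83.2%, matching the figure cited above; every term listed exceeds 75%
```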

To maintain and increase student graduation totals.

OIR Number of Graduates by Program and Specializations: 2017-18.

Target: 10 AAS graduates in the Marketing Program.
AAS MKT Graduates:
• 2017-2018: 9
• 2016-2017: 12
• 2015-2016: 12
• 2014-2015: 8
• 2013-2014: 10

Target: 7 CSC graduates each in the Marketing Program.
CSC MKT Graduates:

Program          2015-16   2016-17   2017-18
Soc. Media       6         11        6
Marketing        7         9         7
Promotion & PR   7         6         6
Retail Mgt.*     3         9         8

*Retail Management name change to Marketing Management approved for 2018-19.

Target Met: [ ] Yes [ X ] No [ ] Partially
The goal was not met for both AAS and CSC graduates in 2017-18. The number of AAS graduates decreased. The number of CSC graduates also decreased in all four certificates. This needs to be improved in 2018-19.
Past Action: To maintain and increase the number of graduates by means of (1) promoting the AAS degree, certificates and courses; (2) funding student scholarships; and (3) focusing on active student advising efforts in 2017-18.
Current action: To maintain and increase the number of graduates by means of (1) promoting the AAS degree, certificates and courses; (2) funding student scholarships; and (3) focusing on faculty student advising efforts in 2018-19. The College Pathways reorganization of the Marketing Program should have a positive influence on the number of program graduates by 2018-19. Changes to the Marketing Program curriculum should also attract more students, along with the student option of completing this AAS degree and all certificates online for the first time


starting Fall 2018.
Assessed: Annually

To maintain and increase enrollment and the number of program placed students

OIR Student Enrollments by FTES: NOVA Fact Book 2011-2015, and 2012-2016

OIR Distribution of Program Placed Students: Data for 2015-2016 and 2016-2017

Target: MKT FTES 75%
MKT FTES:
• Fall 2016 – 55.7%
• Fall 2015 – 79.7%
• Fall 2014 – 74.7
• Fall 2013 – 87.2
• Fall 2012 – 95.0
• Fall 2011 – 109.9

Target: Program Placed MKT students 110
Program Placed:
• Fall 2017 – 97
• Fall 2016 – 108
• Fall 2015 – 162
• Fall 2014 – 186
• Fall 2013 – 206
• Fall 2012 – 213

Target Met: [ ] Yes [ X ] No [ ] Partially
The goal of 75% was not met for FTES. The goal of 110 program placed students was not achieved. FTES and program-placed numbers have been dropping, which is a reflection of College enrollments overall. Results show a decline in the number of student FTES and program placed students. New action is needed to increase FTES and the number of program placed students in 2018-19.
Past action: Initiate approval for a Dual Enrollment option for the Marketing Program with area high schools to increase FTES in 2018-19. Hopefully, many of these students will continue their education at NOVA, increasing the number of program placed students and graduates by 2019. Continue to work with MKT faculty and the Program Advisory Committee to (1) promote degrees/courses by means of student handouts and advising; (2) fund student scholarships; and (3) focus on student advising efforts in 2017-18.
Current action: Start the Dual Enrollment option with area high schools during 2018-19. This should maintain and hopefully improve the FTES. Support the college initiative of a more hands-on approach with incoming students and trying new methods of identifying current students having problems in class in 2018-19, along with encouraging students to use the Writing Center and Math Tutoring Center for assistance. Continue to work with MKT faculty and the Program Advisory Committee to (1) promote degrees/courses by means of student handouts and advising; (2) fund student scholarships; and (3) focus on student advising efforts in 2018-19.
Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Medical Laboratory Technology, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed to prepare students to perform essential laboratory testing on blood and body fluids that is critical to the detection, diagnosis, and treatment of disease. In a medical laboratory, the MLT is part of a team of highly skilled pathologists, technologists, and phlebotomists working together to determine the presence, extent or absence of disease, and helping to evaluate effectiveness of treatment. This program emphasizes “hands-on” practice of laboratory methods in a state-of-the-art laboratory at the Medical Education Campus in Springfield, followed by clinical experience at various affiliating health care organizations. Upon completion of the program graduates will be eligible to take the American Society for Clinical Pathology (ASCP) Board of Certification examination, and other national certification examinations offered at the technician level.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Collect, process, and analyze biological specimens and other substances in Blood Bank

Blood Banking MDL 216
Direct Measure: Questions about procedures embedded in MDL 216 midterm exam. SLO 1 2017 Attachments: 1.1 BB midterm exam Fall 2017
Topics:
• Question #21: Blood Bank procedure
• Question #24: Forward grouping test
• Question #22: Gel system reagents
• Question #23: ABO typing
• Question #15: Indirect Antiglobulin Test
• Question #25: Cord blood and mom samples
Sample: One course; one section. All second year MLT students were included in the data collection; N=12.

Semester/year data collected: Fall 2017
Target: At least 80% of second-year MLT students will answer correctly questions related to basic Blood Banking procedures in their midterm exam.
Target Final Score in Blood Banking Midterm Exam: 75% of second-year students will score equal to or greater than 80% on the BB midterm exam.
Results: multiple-choice questions, Fall 2017 (percentage of students answering correctly):

Level 1   Q#21: 83%    Q#24: 92%
Level 2   Q#22: 100%   Q#23: 100%
Level 3   Q#25: 25%    Q#15: 100%

Target was met for questions #21, 24, 22, 23 and 15 on the MDL 216 midterm exam. Target was not met for question #25.
Final Score in BB Midterm Exam, Fall 2017: 75% of students scored >80%. Target was met on the BB midterm final score.
Comparison with 2016-2017 Results for this SLO: Fall 2016 results were assessed for this SLO in the MDL 101 Introduction to Medical Laboratory Technology course at an entry level. One course; one section; all first-year students, N=13. 100% of students (n=13) received a cumulative score equal to or greater than 70% on cumulative quizzes, meeting the set target.

During Fall 2017, results of the MDL 216 midterm exam were used to determine whether second-year MLT students understand the testing protocols and principles associated with Blood Bank (BB) procedures. This is the first time that the performance of students on MDL 216 procedures was analyzed in terms of the degree of difficulty of the questions utilized for assessment of this SLO. A solid grasp of the principles for practicing laboratory medicine in BB is needed so that students are well prepared to transfer this knowledge into practice in the clinical training course and to successfully complete the MLT program.
Previous actions implemented by program faculty during the 2016-2017 cycle, prior to the current assessment and with the intention of improving SLO performance, include the purchase of tutorial software that reinforces the principles of testing and demonstrates different testing techniques. The increase in open lab hours during Fall 2017 under the supervision of the Laboratory Instructor provided more opportunity to develop entry-level skills in BB laboratory procedures and in the interpretation of such results. The Fall 2017 students demonstrated the required knowledge for performing ABO forward and reverse grouping, Rh typing, and Direct and Indirect Antiglobulin testing at the entry level. Upon analysis of correct responses to exam items related to basic BB procedures, we found that the target was met on questions classified as Level I and II. Some students still show difficulties in Level III

Page 266: Annual Planning and Evaluation Report Instructional Programs … · report; if there is a question about an evaluation method, please contact the instructional program or OIESS)

256

Medical Laboratory Technology, A.A.S. The evaluation of this SLO with now the same students but in their 2nd year shows the expected improvement for a cohort of students in an advanced course. They demonstrated meeting the higher target of 75% of students achieving a score greater than or equal to 80% Evaluations of the same SLO across the curriculum in advanced courses should demonstrate improvement as students are exposed to repetition and practice of skills that lead to becoming proficient in laboratory techniques. Repetition and practice of techniques lead to naturalization of skills that help to achieve a higher level of performance.

questions (Bloom’s taxonomy). These questions involve higher order analytical skills and may require more practice to develop a systematic approach to analyze such situations. Based on recent results, areas needing improvement include: • Analysis of situations or case studies that require

interpretation of tests results to suggest patient’s condition/disorder or further testing needs.

Current actions to improve SLO, based on results: • Upon revision of the SLOs, appropriate actions

to improve SLO may include more guided sessions of Case discussions during Fall 2018. An appropriate strategy will be to model the process used to analyze the information provided in the case, showing the steps used to reach the correct solution. Basic concepts will be reinforced and additional practice problems will help to sharpen the analytical process. These activities will be developed by the course instructor and will be implemented in the MDL 216 course scheduled for Fall 2018.

Next Assessment: Fall 2018

Perform, discuss, and demonstrate principles and methodologies of diagnostic assays, problem solving, and troubleshooting techniques.

Hematology MDL 125 Direct Measure: Questions about procedures embedded in the MDL 125 midterm exam. SLO 2 2017 Attachments: 2.1 Midterm Practical; 2.2 Laboratory Techniques competency assessment, Part 1, Skills 1-10. A set of questions within the MDL 125 midterm practical requires students to perform the calculation of red cell indices and to demonstrate problem-solving skills by recognizing procedural errors while performing the hemoglobin and hematocrit tests in their midterm practical. Sample: one course, one section. All first-year MLT students were included in the data collection; N=20.

Semester/year data collected: Fall 2017
Target: 100% of first-year MLT students will score 75% or more on questions related to correctly performing hematology procedures in their midterm practical exam.
Fall 2017 Results: Average score = 93%. The target was met for the group of students assessed during the Fall 2017 period. Comparison with 2016-2017 results for this SLO: this SLO was evaluated in the MDL 243 Introduction to Clinical Molecular Diagnostics course in Spring 2016 (one course; one section; all second-year MLT students were included in the data; N=17). 100% of students received a score greater than or equal to 70% on the PCR/ELP (Polymerase Chain Reaction/Electrophoresis) lab exercise. Although that was a second-year group, the lab exercise is considered of moderate complexity, and it was the first time they performed this technique, obtaining a score that met the target.

First-year students in the MLT program learn in their MDL 125 Hematology course the principles and methods for performing the complete blood count (CBC). They are expected to perform the hemoglobin and hematocrit tests by manual and automated methods and to use this information to calculate the red cell indices that serve as a guide for classifying anemia disorders (the standard index formulas are sketched below for reference). In this course, first-year students learn a procedure, practice it, and discuss expected results. By the time they take their midterm exam, they should be able to recognize errors that produce discrepant results in basic hematology tests. The evaluation of this SLO shows progression from novice toward a practicing level in diagnostic assays. The midterm practical test also assessed problem-solving and troubleshooting skills: if something went wrong with any of the manual tests students performed, they had to know how to correct it, and if the values were incorrect, their calculated red cell indices would be off and they needed to recognize this in order to troubleshoot it. Students had previously been taught to recognize sources of error in the hemoglobin and hematocrit procedures. Instructor observations during the practical exam provided a record of problem-solving activities performed by students, and the rubric used for competency assessment of MDL 125 laboratory techniques is included. Current actions to improve the SLO, based on results: the course instructor will continue to provide more discussion of areas that introduce error into manual testing procedures, to reinforce problem-solving options. Midterm practical exams help monitor whether students have developed adequate psychomotor skills to perform tests manually and the cognitive skills to apply learned concepts to resolve discrepancies recognized by calculating indices. An open question will be added to the test in which the student describes the problem-solving options used after performing the manual procedures and calculating the red cell indices. The proposed target was met by first-year students during Fall 2017 on the MDL 125 midterm test. Next Assessment: Fall 2018
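For reference, a minimal sketch of the standard red cell index formulas that this kind of calculation exercise relies on (these are textbook relationships, not a reproduction of the exam items; the example values are illustrative only):

\[
\mathrm{MCV\ (fL)} = \frac{\mathrm{Hct}\,(\%)\times 10}{\mathrm{RBC}\,(10^{6}/\mu\mathrm{L})}, \qquad
\mathrm{MCH\ (pg)} = \frac{\mathrm{Hgb}\,(\mathrm{g/dL})\times 10}{\mathrm{RBC}\,(10^{6}/\mu\mathrm{L})}, \qquad
\mathrm{MCHC\ (g/dL)} = \frac{\mathrm{Hgb}\,(\mathrm{g/dL})\times 100}{\mathrm{Hct}\,(\%)}
\]

For example, Hgb = 14 g/dL, Hct = 42%, and RBC = 5.0 × 10^6/µL give MCV = 84 fL, MCH = 28 pg, and MCHC = 33.3 g/dL, all within typical adult reference ranges. A hematocrit that departs markedly from roughly three times the hemoglobin (the "rule of three") is the kind of discrepancy the midterm practical expects students to recognize and troubleshoot.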

Comply with applications of safety and government regulations.

Laboratory Instrumentation MDL 260 Direct Measure: 3.1 Electrical Safety Quiz. Topics:
1. Result from shock
2. Cause of ignition of flammable liquids
3. Avoidance of permanent use of extension cords
4. Action required for a frayed power cord
5. Areas where electricity can travel
6. Part of the human body offering principal resistance
7. Workers allowed to work on exposed cables
8. Inspection of cords
9. Resetting circuit breakers
10. Role of circuit breakers and fuses

Semester/year data collected: Spring 2018
Sample: One course, one section. All first-year MLT students were included in this data collection; N=17.
Target: 100% passing with a score of 70% or more.
Results, average percentage correct per topic (Spring 2018):
1. Result from shock: 100%
2. Cause of ignition of flammable liquids: 100%
3. Avoidance of permanent use of extension cords: 100%
4. Action required for a frayed power cord: 81%
5. Areas where electricity can travel: 97%
6. Part of the human body offering principal resistance: 94%
7. Workers allowed to work on exposed cables: 97%
8. Inspection of cords: 81%
9. Resetting circuit breakers: 87%
10. Role of circuit breakers and fuses: 69%
Spring 2017 (N=12): 33% received a passing grade greater than 70%.
Spring 2018 (N=17): 100% of students received a score greater than 70%.
Spring 2018 distribution of scores on the Electrical Safety Quiz: 75%, 3 students; 80%, 1 student; 85%, 2 students; 90%, 3 students; 95%, 1 student; 100%, 7 students.
The target was met for students taking MDL 260 during Spring 2018, and comparison with the 2017 results shows marked improvement in the 2018 cohort. The Laboratory Instrumentation course into which the electrical safety module was incorporated will be maintained, but an action plan will be developed to improve knowledge about the role of circuit breakers and fuses in electrical safety.

This is the second time that performance on the Electrical Safety Quiz has been used to assess this SLO. Last year's findings showed that only 33% of the first-year MLT students taking this course during Spring 2017 obtained at least a 70% passing score. The topic was covered during safety education in all areas, but because electrical safety is considered an important factor in maintaining a laboratory environment free of electrical fires, the SLO was assessed a second time. A plan to develop a specific module on electrical safety was proposed after the 2016-17 cycle; the module was developed, included in the MDL 260 Laboratory Instrumentation course, and delivered during Spring 2018, and the Electrical Safety Quiz used for the Spring 2017 evaluation was administered again to the first-year students taking MDL 260. This emphasis on the importance of electrical safety and the steps required to maintain a safe laboratory environment helped to improve performance on this SLO. All students obtained a passing grade greater than 70%, and 76% of students scored 85% or higher on the quiz, demonstrating a significant improvement in knowledge of electrical safety. The action plan includes adding more lecture-discussion questions about the role of fuses and circuit breakers in avoiding electrical accidents and assessing this area on quizzes and exams. For Spring 2019, we will consider both electrical safety and chemical safety to make sure students can demonstrate appropriate knowledge in all regulatory areas.

VCCS Core Learning Outcome

General Ed Evaluation Methods Assessment Results Use of Results

Critical Thinking

Clinical Hematology II MDL 225 Direct Measure: Questions for cell identification and disease correlations embedded in the MDL 225 final lab practical. CLO 1 2018 Attachments: 1.1 Hematology II Final Lab Practical exam, Spring 2018. Topics:
• Question #1: Identify abnormal red cell morphology present in a blood smear image.
• Question #4: Recognize the presence of hyper-segmented neutrophils in a blood smear.
• Question #6: Recognize immature white blood cells (blasts) on a blood smear and their prominent nucleoli.
• Question #11: Identify abnormal white blood cells (hairy cells) by comparing blood smears stained with regular stains and special stains.
• Question #13: Distinguish nuclear fragments from white blood cells on blood smear images.
Sample for Core Learning Outcome (Critical Thinking) assessment: one course, one section. All first-year MLT students in MDL 225; N=19.

Semester/year data collected: Spring 2018
Target: At least 80% of MLT students will correctly identify normal blood cells and will be able to correlate changes in blood cells associated with diseases.
Results, percentage of students answering correctly in Spring 2018:
• Question #1: 19/19 (100%)
• Question #4: 18/19 (95%)
• Question #6: 16/19 (85%)
• Question #11: 19/19 (100%)
• Question #13: 19/19 (100%)
The target was met, as shown by the results for the selected questions embedded in the final lab practical exam, which required critical thinking skills to correctly identify cells and correlate findings with disease states. A survey of clinical preceptors after the clinical rotation in Hematology (MDL 276 Clinical Hematology Techniques) helped validate the effectiveness of the multi-head teaching microscope as a valuable resource for accomplishing program goals in the area of Hematology. A question on the clinical training evaluation form asks the affiliate whether students' ability to identify and classify blood cells has improved, remained the same, or declined compared with the previous year's students. Out of 16 clinical affiliates, 15 felt students' ability for cell recognition improved.

Correct classification of white blood cells is a microscopy skill required to correctly perform the blood smear differential count, and MLT students are expected to demonstrate proficiency in the identification and classification of normal and abnormal white blood cells. Classification of blood cells has been identified as an area of difficulty for our first-year students; this weakness was noted by clinical preceptors receiving students for hematology rotations in the 2016-17 evaluations, which included remarks about the difficulty some students had differentiating lymphocytes from monocytes and distinguishing immature blood cells under the microscope. Several efforts were introduced beginning in Fall 2017; for example, more hands-on laboratory sessions dedicated to evaluating blood smears were incorporated into the curriculum with the objective of improving blood cell recognition. The acquisition of a multi-head teaching microscope in Spring 2018 provided the opportunity to enhance learning in this course by synchronizing the description and observation of cell characteristics that are critical to classifying blood cells. Cell recognition improved as students participated in sessions using the multi-head teaching microscope: receiving immediate feedback about white blood cell characteristics helped them compare cells in the same field of view, recognize differences, and apply classification criteria to their observations. Performance on cell identification improved dramatically, and the skill transferred from real images seen through the microscope to printed and computer images. After the guided practice of cell identification using the multi-head microscope, students expressed greater confidence in assessing the morphology of normal and abnormal cells and in correlating findings with hematological conditions such as leukemia. The target was met for this CLO assessing critical thinking skills, and the positive results are attributed to the systematic approach applied to distinguishing characteristics of blood cells and to the use of the multi-head microscope. The equipment will be incorporated during Fall 2018 into other courses that have a microscopy component, such as MDL 140, Microscopic Analysis of Body Fluids, and during Fall 2018 an assessment of improvement in microscopic analysis in MDL 140 attributable to the multi-head microscope will be performed.

Program Goals Evaluation Methods Assessment Results Use of Results The Medical Laboratory Technology Program will increase the retention rate from the previous year.

Retention Rate: Data collected from Student Information System (SIS) class Rosters

Semester/year data collected: Fall 2016 – Fall 2017 Target: 90% Retention Results: First year cohort: 12 traditional students were accepted into the MLT program for the Fall 2016. The retention rate the following year for this cohort was 92%. The attrition rate was 8% (N=1).

Previous retention rates from SIS rosters: Fall 2015-2016, 85%; Fall 2014-2015, 83%.
Target Met: [ X ] Yes [ ] No [ ] Partially

The goal is to maintain or exceed a retention rate of 90% per year, and the cohort of 12 students that entered the program in Fall 2016 achieved 92% retention, with the loss of 1 student during the first semester. The faculty and program director concentrate their efforts on identifying at-risk students early in the program. During 2016-17, once an at-risk student was identified, the faculty member invited the student to a meeting to discuss the situation and offer ideas on how to get support for developing the needed skills. Referrals to other support services, including the Academic Success Counselor and tutoring services, are incorporated into the plan developed to help the student. Open lab hours can be set up to provide more practice in developing the required skills, and online tutorials are also available for review in all areas of the medical laboratory scope of practice. Assessed: Annually

Student performance on the national MLT certification exam will be equal to or better than the national passing rate.

BOC-ASCP Certification Rate: Percentage of students that finished the program obtaining a passing score as first timers. Attachment: PPR BOC Program Performance Reports for students finishing the program in May 2017 and taking MLT BOC during 2017-18

Target: obtain a 90% passing rate or better on the MLT BOC exam. Results: MLT BOC-ASCP exam pass rate, first-time test takers during the 2017 cycle (N=9). Only 9 of the 10 A.A.S.-MLT graduates finishing the program in 2018 took the MLT BOC test, obtaining an 88.89% passing rate (8/9) for first-time examinees. The MLT program passing rate of 88.89% was higher than the national passing rate of 86.26% for this cycle. The student who failed as a first-time examinee repeated the test and obtained a passing grade within 3 months of graduation, so all 9 examinees sitting for the ASCP MLT BOC passed within 6 months of graduation, for an overall 100% passing rate. Target Met: [ X ] Yes [ ] No [ ] Partially. MLT (ASCP) exam pass rate for previous years:

• 2016 cycle: 93% • 2015 cycle: 95% • 2014 cycle: 100%

Program performance on the Board of Certification test for MLT was excellent. The target was met, and our students' performance was better than the national mean score in the Blood Bank area, one of the 7 areas of the body of knowledge. In response to the previous cycle's results, the MDL 281 course concentrated in Spring 2017 on providing an extensive review of all areas by incorporating exam-simulator software that has proven very helpful in preparing students for the certification exam. The software can be used as a review course and, when used as a certification test simulator, provides statistics for each individual on the types of questions missed per area. We will continue to use this software in the MDL 281 course for review and exam simulations.

The Medical Laboratory Technology program will increase the total number of students graduating compared to the previous year.

Graduation totals: OIR Reports - Number of Graduates by Program and Specialization: 2017-2018

Semester/year data collected: 2017-18
Target: increase the total number of students graduating from the previous year.
Results: A total of 11 students graduated in the 2017-18 academic year. Graduates from previous years: 2016-17, 18; 2015-16, 15; 2014-15, 16.

The goal of the 2017-18 cycle to increase the total number of graduates was not met. The program director and faculty are continually seeking opportunities to promote the program in the community and among personnel working at clinical affiliates. A plan has been developed to set up information tables at different clinical labs to provide prospective students with information about the MDL program. Another opportunity for growth is to provide students in the phlebotomy (PBT) and medical laboratory assistant (MLA) programs with career-ladder information about the foundation courses that allow those interested in the MLT program to advance in their plan of study. PBT and MLA students may want to gain more skills and advanced credentials that help them move up the career ladder. The target is to increase enrollment to the full capacity of 20 admissions to the MLT program per year. Assessed: Annually

Students employed in the field following graduation from the program at a rate equivalent to the previous year.

Career placement rate: Exit interviews; voluntary graduate e-mail feedback; social media

Target: 100% of graduates employed within six months of graduation. Employment of 2017-18 graduates within 6 months after graduation: information about employment is available for 9 graduates. A total of 8 graduates are working as MLTs in Virginia healthcare institutions such as INOVA hospitals, and 1 is a new mother who decided to stay at home. Data for the other graduates is not available.

The data for job placement is updated continually as we receive information from graduates or employers. The results are reviewed during Advisory Committee meetings because members of this committee employ many of our graduates. The target was partially met because we have information only for the 8 graduates who reported they were seeking employment; for those 8, placement in Virginia healthcare institutions was confirmed. In effect, all 8 BOC-certified graduates who were seeking employment have been placed, for 100% placement, and the graduate who decided to stay at home indicated that she will not be seeking employment in the near future. Faculty members continue to reach out to alumni to update their information. The next evaluation will be with the Spring 2019 graduates.


Annual Planning and Evaluation Report: 2017-2018

Music, A.A., A.A.A., and A.A.A. Jazz/Popular Music Specialization NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. A.A. Program Purpose Statement: The Associate of Arts degree curriculum in Music offers an emphasis in fine arts. The Associate of Arts degree curriculum may be used by students who wish to transfer to a four-year college or university to complete a Bachelor of Arts degree in Music. A.A.A. Program Purpose Statement: The Associate of Applied Arts degree curricula in Music and Jazz/Popular Music are designed for students who seek employment in performing music.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to render a performance that is musically expressive and technically accurate.

Applied Wind and Percussion Lessons MUS 255, 275, and 285 Direct Measure: Performing at a music jury, before a panel of music faculty members, is a capstone experience for applied music students, who prepare in private lessons throughout the year. This is a standardized evaluation of applied music lessons, which are one-on-one weekly lessons. The assessment for this SLO is unique to the music discipline because it is performance-based; juries have been used for decades to measure students' musical achievement at the world's finest colleges. Grades were recorded using the NOVA Music Program Performance Examination, a twelve-point rubric. Each jury performance is evaluated by at least three adjudicators. The jury form is broken down by component to assess student success in each category as well as overall score.

Semester/year data collected: Fall 2017
Target: The achievement target was an average of 4 for each individual category.
200-Level Wind and Percussion Students' Assessment Results: The breakdown below shows the average score for the Technique and Interpretation components (worth 5 points each) and the percentage score for each category.
SLO #3 Data Breakdown – Applied Winds (both woodwinds & brass) – 200-level music majors:
• Technique: AL average 3.57 (71.4%); LO average 4.67 (93.4%); AL & LO combined 4.12 (82.4%)
• Interpretation: AL average 3.71 (74.3%); LO average 4.67 (93.4%); AL & LO combined 4.19 (83.8%)

SLO #3 for Winds (2 categories, 5 points each): AL campus, 5 students, 2-3 judges, 14 scores total; LO campus, 1 student, 3 judges, 3 scores total; AL & LO combined, 6 students, 5-6 judges, 17 scores total.
SLO #3 Data Breakdown – Applied Percussion (specifically drumset) – 200-level music majors:
• Technique: AL average 2.5 (50%); LO average 4.67 (93.4%); AL & LO combined 3.56 (71.2%)
• Interpretation: AL average 2.5 (50%); LO average 4.67 (93.4%); AL & LO combined 3.56 (71.2%)

Our achievement target was met this year for our wind students in MUS 255 and 275, but not for our percussion students in MUS 285. Current results are lower than those of the piano students from Fall 2016 and the voice students from Fall 2015; the Fall 2017 scores at left can be compared to the Fall 2016 and Fall 2015 scores. Overall, the wind and percussion students assessed in this report scored lower than the piano students from Fall 2016 and the voice students from MUS 236 in 2015. However, it is hard to compare scores when the number of students assessed is so low (6 from AL and 2 from LO). Differences in scoring standards across campuses are certainly a factor: average scores from the AL campus are lower than those from the LO campus on this report, and AL averages are also lower than AN and LO scores on the 2016-17 and 2015-16 reports. AL campus faculty have consistently graded this assessment with more rigorous standards than other campuses. At our meeting on March 17, 2017, we discussed ways to become more objective in grading this assessment; it was decided that a more detailed grading rubric was needed, and one was sent to the cluster to be used alongside this assessment (it is attached). Our cluster agreed that our goal is to be more rigorous and consistent across campuses in our grading standards, but we still need to be more in line when we grade. We are assessing this SLO again in Fall 2019 and will see whether this goal is met then.

Source of Evidence: Performance. Maximum Scores: Wind/String/Vocal, 40 points total; Piano/Harp, Composition, and Percussion, 30 points total (with sight-reading; 25 without sight-reading); Drumset, 25 points total. See the attached Music SLO method of assessment. Sample: The information is from the MUS 255, 275, and 285 courses (woodwind, brass, and percussion applied lessons at the 200 level) at the Alexandria and Loudoun Campuses in Fall 2017. No information was received from the Annandale campus for this assessment.


SLO #3 for Drumset (2 categories, 5 points each): AL campus, 1 student, 2 judges, 2 scores total; LO campus, 1 student, 3 judges, 3 scores total; AL & LO combined, 2 students, 5 judges, 5 scores total.
Previous Results: SLO #1 Data Breakdown – Fall 2016 Applied Piano – 200-level (from AL & AN):
• Technique: 4.43 (88.6%)
• Interpretation: 4.5 (90%)
SLO #1 Data Breakdown – Fall 2015 Applied Voice – 200-level – MUS 236 Applied Voice Lessons:
• Technique: 4.22 (84.4%)
• Interpretation/Musicianship: 4.36 (87.2%)

Strengths and Weaknesses by Criteria/ Question Topic: The drumset player’s performance on the AL campus was weak and unprepared, so that affected the results significantly since there were only 2 drumset players total. The scores on the piano juries in Fall 2016 were consistent and above 80% in every category. However, the lowest scores both on the piano juries from Fall 2016 and the voice juries from the 2015-2016 APER were in “Technique.” The Fall 2017 wind students on the AL campus had the second lowest score in the “Technique” category. Clearly, we need to do better in this category.

Previous Actions: Before 2015-16, this assessment focused only on overall scores. Since 2015-16, we have broken down scores by category as well as looked at overall scores; this report looks only at specific categories for this SLO. Current action: This assessment focuses on 200-level wind and percussion students using the jury form broken down by component, which allowed us to compare component scores from Fall 2017 to both Fall 2016 and Fall 2015. We will hold a judges' training session at our next meeting in Fall 2019 to see if we can agree on the level of rigor we need when we judge. Both campuses need to focus on improving Technique scores starting in Fall 2019. Another action for improvement: The assessments for SLO #1 and SLO #3 are the same; this report takes data from different categories of the assessment for the two different SLOs, so the SLOs and/or the assessments need revision. The program will work with the Office of Academic Assessment in Summer 2019 to fix this. Next assessment: The next assessment will focus on string and voice students in MUS 245 and 236, and we will continue to assess students using the jury form broken down by component as well as overall score. We will hold a training session and continue to use the more detailed grading rubric in hopes of gaining more consistent scores across campuses. This will be done in Fall 2019.

Students will be able to perform pieces, exercises, scales, and progressions accurately.

Applied Wind and Percussion Lessons MUS 255, 275, and 285 Direct Measure: Performing at a music jury, before a panel of music faculty members, is a capstone experience for applied music students, who prepare in private lessons throughout the semester.

Semester/year data collected: Fall 2017
Target: The achievement target was an average of 4 for each individual category.
200-Level Wind and Percussion Students' Assessment Results: The breakdowns below show the average score of the components for Tone Quality, Intonation, Rhythm, and Scales for the wind players, and Tone and Accuracy for the percussion players.

Our overall achievement target was met in some categories but not in others; the Fall 2017 scores at left can be compared to the Fall 2016 and Fall 2015 scores. Overall, the wind and percussion students assessed in this report scored lower than the piano students from Fall 2016 and the voice students from MUS 236 in 2015 in most comparable categories. However, it is hard to compare scores when the number of students assessed is so low (6 from AL and 2 from LO).

This is a standardized evaluation of applied music lessons, which are one-on-one weekly lessons. The assessment for this SLO is unique to the music discipline because it is performance-based; juries have been used for decades to measure students' musical achievement at the world's finest colleges. Each jury performance is evaluated by at least two adjudicators. The jury form is broken down by component to assess student success in each category as well as overall score. Source of Evidence: Performance. Maximum Scores: Wind/String/Vocal, 40 points total; Piano/Harp, Composition, and Percussion, 30 points total (with sight-reading; 25 without sight-reading); Drumset, 25 points total. See the attached Music SLO method of assessment. Sample: The information is from the MUS 255, 275, and 285 courses (woodwind, brass, and percussion applied lessons at the 200 level) at the Alexandria and Loudoun Campuses in Fall 2017. No information was received from the Annandale campus for this assessment.

Each category is worth 5 points, and the percentage score for each category is also shown.
SLO #3 Data Breakdown – Applied Winds (both woodwinds & brass) – 200-level music majors:
• Tone Quality: AL average 4.14 (82.9%); LO average 4.67 (93.4%); AL & LO combined 4.41 (88.2%)
• Intonation: AL average 3.86 (77.1%); LO average 5 (100%); AL & LO combined 4.43 (88.6%)
• Rhythm: AL average 4.21 (84.3%); LO average 4.67 (93.4%); AL & LO combined 4.44 (88.8%)
• Scales: AL average 3.07 (61.4%); LO average 4.33 (86.6%); AL & LO combined 3.7 (74%)

SLO #3 for Winds (4 categories, 5 points each): AL campus, 5 students, 2-3 judges, 14 scores total; LO campus, 1 student, 3 judges, 3 scores total; AL & LO combined, 6 students, 5-6 judges, 17 scores total.
SLO #3 Data Breakdown – Applied Percussion (specifically drumset) – 200-level music majors:
• Tone: AL average 2.5 (50%); LO average 5 (100%); AL & LO combined 3.75 (75%)
• Accuracy: AL average 4 (80%); LO average 4.67 (93.4%); AL & LO combined 4.34 (86.8%)
SLO #3 for Drumset (2 categories, 5 points each): AL campus, 1 student, 2 judges, 2 scores total; LO campus, 1 student, 3 judges, 3 scores total; AL & LO combined, 2 students, 5 judges, 5 scores total.
Previous Results: SLO #1 Data Breakdown – Fall 2016 Applied Piano – 200-level (from AL & AN):
• Tone Quality: 4.65 (92.9%)
• Accuracy: 4.36 (87.1%)
• *Optional Caption (sight-reading, scales): n/a

It is also hard to compare because the categories differ somewhat on the assessment sheets for the different mediums (wind instruments, percussion, piano, and voice). Differences in scoring standards across campuses are certainly a factor: average scores from the AL campus are lower than those from the LO campus on this report, and AL averages are also lower than AN and LO scores on the 2016-17 and 2015-16 reports. AL campus faculty have consistently graded this assessment with more rigorous standards than other campuses. At our meeting on March 17, 2017, we discussed ways to become more objective in grading this assessment; it was decided that a more detailed grading rubric was needed, and one was sent to the cluster to be used alongside this assessment (it is attached). Our Discipline Group agreed that our goal is to be more rigorous and consistent across campuses in our grading standards, but we still need to be more in line when we grade. We are assessing this SLO again in Fall 2019 and will see whether this goal is met then. Previous Actions: Before 2015-16, this assessment focused only on overall scores. Since 2015-16, we have broken down scores by category as well as looked at overall scores; this report looks only at data from specific categories. Current action: This assessment focuses on 200-level wind and percussion students using the jury forms broken down by component, which allowed us to compare component scores from Fall 2017 to both Fall 2016 and Fall 2015. We will hold a judges' training session at our next meeting in Fall 2019 to see if we can agree on the level of rigor we need when we judge. Both campuses need to focus on improving scores for Scales and Technique. Full-time faculty need to inform all applied lesson teachers of these results and communicate the urgency of putting more emphasis on scales and technique in lessons with students; this emphasis started in Spring 2019 and will continue in Fall 2019. Another action for improvement: The assessments for SLO #1 and SLO #3 are the same, so the SLOs or the assessments need revision. The program will work with the Office of Academic Assessment in Summer 2019 to fix this.

*The Optional Caption row above is marked n/a because AL assessed the Optional Caption and AN left it out. Also, the assessment sheets are a little different for the different mediums (wind instruments, percussion, piano, and voice), which is why some categories differ.
SLO #1 Data Breakdown – Fall 2015 Applied Voice – 200-level – MUS 236 Applied Voice Lessons:
• Tone Quality: 4.43 (88.6%)
• Intonation: 4.35 (87%)
• Rhythm: 4.69 (93.8%)
• Sight-reading/Scales: N/A
*NOTE: The same assessment is used for SLO #1 and SLO #3 because the SLOs are very similar; this is how we can compare results even though the SLO is different. Strengths by Criteria/Question Topic: The highest scores from both campuses (looking at just the two campus averages) for the wind players are in the Tone Quality and Rhythm categories; this is where we met our achievement goal for the individual categories on both campuses. The drumset player's performance on the AL campus was weak and unprepared, which affected the results significantly since there were only 2 drumset players in total. Weaknesses by Criteria/Question Topic: The lowest scores from both campuses (looking at just the two campus averages) for the wind players are in the Scales category; while LO met the achievement goal, both campuses scored lowest in this category. The scores on the piano juries in Fall 2016 were consistent and above 80% in every category.

Next assessment: The next assessment will focus on string and voice students in MUS 245 and 236, and we will continue to assess students using the jury form broken down by component as well as overall score. We will have a training session and continue to use the more detailed grading rubric in hopes of gaining more consistent scores across campuses. This will be done in Fall 2019.

Analyze the musical structure of a composition.

Music Theory II MUS 112 Direct Measure: All students who were enrolled in MUS 112 (the second-semester music theory class) across campuses were given a rigorous assessment as a part of their final exam.

Semester/year data collected: Spring 2018
Target Score: Using a 100-point rubric, the achievement target was an average score of 75% or higher, overall and in each individual category. MUS 112 Assessment Results: the breakdown below shows the SLO overall and component scores (average points achieved out of points possible, and percentage achieved).

Current results are very slightly improved overall, from 62.84% in Spring 2017 to 65.63% in Spring 2018, but we are still 9.37 percentage points under our achievement goal. Previous Actions to Improve SLO: In January 2017, teachers in the cluster decided to spend more time in class reviewing non-chord tones and secondary chords in an effort to raise scores in these categories on the assessment.

Students were asked to analyze a piece of music by labeling the key, completing a Roman numeral analysis, finding a modulation and labeling its type, and labeling non-chord tones and cadences. Maximum score: 100. Target score: 75 (75%). See the attached Music SLO method of assessment. Sample: A total of 8 students from one section on the AL campus were assessed. This was one of only 2 sections of MUS 112 offered in the Spring 2018 semester across all campuses. AN reported overall scores for Spring 2018 but not data for each category.

• Labeling Keys: 8.75/10 (87.5%)
• Roman Numeral Analysis for Diatonic Chords: 24/32 (75%)
• Secondary Function Chords: 16/32 (50%)
• Labeling Modulations (type and location): 12.5/20 (62.5%)
• Labeling Non-Chord Tones: 3.875/6 (64.6%)
• Labeling Cadences: 1.75/2 (87.5%)
• Overall Average Score: 65.63 (65.63%)

Previous Results: MUS 112 Assessment Results (SLO #2 components; Spring 2017 and Spring 2016):
• Labeling Keys: Spring 2017, 9.5/10 (95%); Spring 2016, 8.63/10 (86.36%)
• Roman Numeral Analysis for Diatonic Chords: Spring 2017, 24.4/32 (76.25%); Spring 2016, 16.636/18 (92.42%)
• Secondary Function Chords: Spring 2017, 15.2/32 (47.5%); Spring 2016, 20.727/32 (64.77%)
• Labeling Modulations (type and location): Spring 2017, 9.4/20 (47%); Spring 2016, 15.636/20 (78.18%)
• Labeling Non-Chord Tones: Spring 2017, 3.8/6 (63.33%); Spring 2016, 3.636/6 (60.606%)
• Labeling Cadences: Spring 2017, 2/2 (100%); Spring 2016, 2/2 (100%)
• Total Overall Averages: Spring 2017, 64.1 (62.84%); Spring 2016, 84.23%

Our overall achievement target was not met. The achievement targets for “Labeling Keys,” “Analysis for Diatonic Chords,” and “Labeling Cadences” were met. Students scored much lower in the “Secondary Function Chords,” “Labeling Modulations” and “Labeling Non-Chord Tones” categories. These are the same categories that students scored lower on in the previous two years. We did not meet our goal to improve scores in these categories. Current results: Current results were very similar to previous results from Spring 2017. The students in Spring 2016 performed far above average and those scores all seem to be outliers. The current results for the Total Overall Averages score was slightly higher than in Spring 2017 but not by much.

For Spring 2017, we lowered our achievement target from 80% to 75%, and we are going to keep the achievement target at 75% for this assessment. Current Results: This is the third year we have been able to compare data broken down by category for this SLO on the AL campus. AN reported overall scores for Spring 2018 but not data for each category; deans at AN have been contacted and should be able to help with data collection in the future. New action to improve student learning based on the results: We need to improve student learning in the three categories where our students scored the lowest: "Secondary Function Chords," "Labeling Modulations," and "Labeling Non-Chord Tones." Our students have consistently not met the achievement target in these three categories over the past three APERs. Starting in Fall 2019, faculty who teach this course must devote more curriculum time to these topics and assess student learning of them more thoroughly in practice and assignments before the final SLO assessment; we will achieve this by assessing more practice homework in these areas starting in Fall 2019. These improvement strategies were discussed at our meeting on March 17, 2017 and will be discussed again at our next meeting on April 12, 2019. Next Assessment: Our achievement target is for students to score 75% or higher overall and in each individual category. This SLO will be assessed again in Spring 2020.

Strengths and Weaknesses by Criterion/Question Concept: Current results showed that both the strengths and weaknesses of each criterion (or category) were similar to those of the Spring 2017 results.

CLO Evaluation Methods Assessment Results Use of Results Critical Thinking: Students will be able to effectively research and write on topics in the area of music / jazz and popular music.

This assessment was given to music majors in MUS 111 in Fall 2017 and to music majors in MUS 112 in Spring 2018. Direct Measure: It is a writing assignment. Students were asked to attend a classical concert and write a 2-3 page concert report reviewing it. The concert report is not a research paper but a critical thinking paper. Scoring: There are three criteria for grading:
1. Summary (worth 20 points)
2. Integration of Course Work (worth 20 points)
3. Writing Style (worth 10 points)
See the attached Music SLO method of assessment and grading rubric. Sample: 18 students across 3 sections (1 at LO and 2 at AL). Breakdown of students by campus:
• 5 at LO, from Fall 2017 MUS 111
• 5 at AL, from Fall 2017 MUS 111
• 8 at AL, from Spring 2018 MUS 112

Semester/year data collected: Fall 2017 (MUS 111) and Spring 2018 (MUS 112). Target Score: 37.5 (75%). Assessment Results:
• Total average score: 40.53/50 (81.1%); maximum score: 50
Breakdown of scores:
• "Summary" category average score: 17.33/20 (86.7%)
• "Integration of Course Work" average score: 17/20 (85%)
• "Writing Style" category average score: 6.2/10 (62%)
This is the first time we have assessed SLO #6 in more than 10 years, so we have no previous results to compare. Strengths by Criteria/Question Topic: We met our achievement target overall and in 2 categories, "Summary" and "Integration of Course Work." Weaknesses by Criteria/Question Topic: The results in the "Writing Style" category (62%) are far under our 75% achievement target. It would benefit our students if we helped them more with their writing style; clear, specific feedback on written assignments would help with future writing assignments, and we could also require students to submit their papers to the Academic Success Center for feedback before turning them in for a grade. The program will ask faculty to implement these suggestions. We are assessing SLO #6 again in Spring 2019 to see if this helps students improve their "Writing Style" scores.

Achievement target met in all areas except one: “Writing Style” Previous Results: This is the first time we have assessed SLO #6 in more than 10 years so we have no previous results to compare. Previous Actions Implemented by Discipline Group: This SLO has not been assessed in over 10 years. We decided to assess it and we will continue to assess it on a more regular basis. Current Action: We assessed MUS 111 music majors in Fall 2017 and MUS 112 music majors in Spring of 2018. To improve the “Writing Style” scores, faculty will be encouraged in Spring 2019 to give good, clear feedback on students’ writing assignments and also to have their students submit their papers to the Academic Success Center for feedback before submitting their paper for a grade. Next Assessment: Spring 2019

Program Goals Evaluation Methods Assessment Results Use of Results Increase Graduation Totals

Graduation Totals: OIR Data from http://www.nvcc.edu/college-planning/data.html

College Graduates by Curriculum and Award Type (Music A.A. / Music A.A.A.):
• 2017-18: 14 / 7
• 2016-17: 5 / 12
• 2015-16: 11 / 5
• 2014-15: 6 / 7
• 2013-14: 7 / 12
• 2012-13: 7 / 4
• 2011-12: 14 / 23
• 2010-11: 4 / 10

Data was gathered from OIR in December 2018. Our goal was to increase our graduation numbers overall, and the goal was met. It is interesting that there were significantly more A.A.A. graduates in 2016-17 and then more A.A. graduates in 2017-18; the ratio of graduates between the two degrees nearly flipped. Graduation totals for the A.A. degree increased substantially, from 5 in 2016-17 to 14 in 2017-18, while totals for the A.A.A. degrees (including the jazz/pop specialization) decreased from 12 to 7. Graduation totals for the A.A.A. Jazz/Pop Specialization were zero in 2015-16, 5 in 2016-17, and 1 in 2017-18. The increase in A.A. graduates is likely due to its being the "guaranteed transfer" degree; however, the A.A.A. degree requirements more closely match the first two years of a B.M. degree at four-year institutions. Some faculty strongly encourage their students to pursue the A.A.A. degree, while others do not, and advisors and counselors who are not music faculty are channeling students into the A.A. degree even though the A.A.A. degree is a better option for most students. That is probably why the graduation numbers have flip-flopped between the two degrees over the past three years; there is inconsistency in which degree most students pursue, the A.A.A. or the A.A. To remedy this, the Music Cluster has proposed a new A.F.A. degree that will help guide our students into one program best suited for transferring to a B.M. degree in Music Performance or Music Education. If the proposal is approved, we will implement the new A.F.A. degree and the other music degrees will be grandfathered out; the proposal will be discussed at the next Curriculum Council meeting in January 2019. To increase graduation rates, music faculty members began pushing in Spring 2017 for students to see us during advising week and to register early in the registration period for Fall 2017, by sending emails to students and making announcements in classes. We hope this will improve retention rates and therefore completion rates. Assessed: Annually

Increase Program Placement Rates

Program Placement Rates: OIR data from https://www.nvcc.edu/college-planning/_docs/Distribution-of-Program-Placed-Students-by-Curriculum-and-Award-Type-Fall-2013-Fall-2017.pdf
Program Placement Rates (Music A.A. / Music A.A.A.):
• Fall 2017: 163 / 59
• Fall 2016: 154 / 69
• Fall 2015: 206 / 78
• Fall 2014: 205 / 67
• Fall 2013: 213 / 87
• Fall 2012: 204 / 120
• Fall 2011: 216 / 141
• Fall 2010: 221 / 155

Data was gathered from OIR in December 2018. Our goal was to increase program placement rates for the A.A.A. degree. The goal was not met. However, program placement rates for the A.A. degree increased almost as much as they decreased for the A.A.A. degree; again, this "flip-flopping" of numbers between the two degrees reflects the inconsistency in advising explained in the section above. Program-placed students at the college as a whole decreased by 2.95%, continuing a five-year downward trend, and the music department had only one more student placed in Fall 2016 than in Fall 2017. To see an increase, each full-time music faculty member on the AL campus is making one to four recruiting trips per year to area high school music programs. On these trips, we talk to high school students about the benefits of attending NOVA and being a music major, in an effort to recruit more students for our NOVA AL music programs. We started this recruiting effort three years ago and are continuing to visit more schools, with the goal of visiting at least two schools every spring semester. In addition, the NOVA Alexandria campus band annually hosts approximately two hundred 5th through 12th graders from local area schools to perform in combined concerts, and the AL Jazz Ensemble's Jazz4Justice concert in March 2018 will feature a performance by a local school jazz ensemble. These young students get to see part of the strong music program at NOVA and remember performing on stage in the Rachel M. Schlesinger Concert Hall and Arts Center on the Alexandria campus. This has helped, and will hopefully continue to help, recruit music majors. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Music Recording Technology Certificate

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Music Recording Technology curriculum is designed for persons who desire to set up their own studio or seek employment as music recording technicians. Occupational objectives include development for positions as assistants and aides in recording studios, broadcast studios, myriad other recording enterprises, and countless private studios in the recording industry. Training in digital audio is emphasized using industry standard software.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Predict and control room reverberation time (RT60) with regard to proper musical acoustic support for a traditional recording studio space exhibiting frequency-dependent exponential decay.

Sound Studio Design MUS 157 Direct Measure - Assignments:
1. Students had one week to complete the Room Reverberation Calculation assignment (see attached).
2. Studio Design Project: the final studio design project required the student to design a two-room recording studio on paper, with an actual model, or in CAD software. See the attached instructions for the design project.
Expectations:
1. Create a measured room drawing
2. Calculate the room volume
3. Choose a target RT60 (reverberation time) (a reference sketch of the standard reverberation-time formula follows this SLO)
4. Define surface materials
5. Locate material absorption coefficient (a) data
6. Calculate surface area
7. Combine surface area for each material
8. Calculate total absorption for the room surface
Sample: One section of MUS 157 (Loudoun campus) was used to evaluate this SLO.

Semester/year data collected: Spring 2018
Target: Assigned to 15 students; the instructor determined that a grade of 81% or better indicated successful completion.
Results:
• 8 earned an "A" (91-100%)
• 1 earned a "B" (81-90%)
• 2 earned a "C" (71-80%)
• 0 earned a "D" (61-70%)
• 2 earned an "F" (0-60%)
• 60% of the students achieved 81% or higher on the assignment.
Past assessment results (Fall 2016): assigned to 16 students at the LO campus, one section; the instructor determined that a grade of 81% or better indicated successful completion of the assignment.
• 11 earned an "A" (91-100%)
• 0 earned a "B" (81-90%)
• 0 earned a "C" (71-80%)
• 1 earned a "D" (61-70%)
• 3 earned an "F" (0-60%), no assignment turned in
Expectations #1-6 were each completed successfully by 11 of 11 student projects.

The number of students who did well on the assignments indicates that the material was taught thoroughly and that the assignment was adequately designed to demonstrate the skill. Students who did poorly did so because they handed in late or incomplete assignments; students who fail the final design project do so mostly because of poor attendance, and they do not drop the class. In the future, results will be broken down further. We will fill the gaps by requesting SLO data along with final grades from adjuncts starting in Fall 2018 and by stressing to new adjuncts the importance of SLO assessment when working at NVCC. Next Assessment: Fall 2020

Past results for this SLO have shown a similar outcome. Additional data for this SLO is lacking because faculty have left. The program will take active steps to ensure complete reporting of this SLO in the future by emphasizing that reporting is part of teaching responsibilities and is to be turned in with final grades.
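As context for the reverberation-time expectations above, a minimal sketch of the Sabine relationship that such room calculations are conventionally based on (the course's own worksheet is attached to the program report and may differ in detail; the example numbers are illustrative only):

\[
RT_{60} \approx \frac{0.161\,V}{\sum_i \alpha_i S_i}
\]

where V is the room volume in cubic meters, S_i is the area of each surface in square meters, and \alpha_i is that surface's absorption coefficient; because the coefficients are frequency dependent, RT60 is normally computed per octave band (with imperial units the constant becomes approximately 0.049, V in ft³ and S in ft²). For example, a 6 m × 5 m × 3 m room (V = 90 m³) with 12 m² sabins of total absorption gives RT60 ≈ 0.161 × 90 / 12 ≈ 1.2 s, which a designer would then bring down toward the chosen target by adding absorptive surface area.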

Calculate basic electrical circuit resistance, amperage, and voltage

Recording Studio Electronics MUS 158 Direct Measure: Electrical Circuit Worksheet assignment. To demonstrate competency, the student needed to choose the appropriate formula and apply it correctly to calculate the correct answer (a reference sketch of the relevant formulas follows this SLO). Please see the attached assignment. Sample: One section of MUS 158 (Loudoun campus) was used to evaluate this SLO.

Semester/year data collected: Fall 2017
Results - 13 students took the exam:
• 10 earned an "A"
• 2 earned a "B"
• 1 earned an "F"

Past results for this SLO have shown a similar outcome. Additional data for this SLO is lacking because faculty have left. We will add new areas for assessment starting in Fall 2018.

60% of the students met the 81% grade required for satisfactory performance on this assessment. Two students did not even attempt the assignment. The program will take active steps to ensure complete reporting of this SLO in the future, including submitting accurate reports with final grades. Next Assessment: Fall 2020
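For context, a minimal sketch of the standard circuit relationships such a worksheet typically exercises (the actual worksheet is attached to the program report; the numbers below are illustrative only):

\[
V = IR, \qquad P = VI = I^{2}R, \qquad
R_{\mathrm{series}} = R_1 + R_2 + \cdots, \qquad
\frac{1}{R_{\mathrm{parallel}}} = \frac{1}{R_1} + \frac{1}{R_2} + \cdots
\]

For example, a 9 V source across a 450 Ω load draws I = V/R = 0.02 A (20 mA) and dissipates P = VI = 0.18 W, and two 450 Ω resistors in parallel present 225 Ω. Choosing the appropriate formula and keeping units consistent is the competency the assignment asks students to demonstrate.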

Define terms used in Pro Audio

Introduction to Recording Techniques MUS 140 Direct Measure: Students had to define terms in a written exam and apply them in a practical setting when recording and mixing audio. This is the first time this SLO is being assessed. Sample: One section of MUS 140 (Loudoun campus) was used to evaluate this SLO

Semester/year data collected: Fall 2017
Target: 90% or better
Results by SLO Criteria/Question (Fall 2017):

Topic (Audio)               Average Score   % of Students > Target
1. Tone                     90/100          >90
2. Color                    95/100          >90
3. Density                  90/100          >90
4. Balance                  100/100         >90
5. Frequency distribution   90/100          >90
6. Reverb                   80/100          <90
7. EQ                       90/100          >90
8. Mood                     100/100         >90
Total                       95/100          >90

The target of 90% or better was met. Weak areas involved judging reverberation tail, compression knee, and EQ low-shelf curve. One student had trouble with decay and RT60 and had the lowest score, 80. There is no need to improve, since the target of >90 was met by all but one student. This is within the realm of judging parameters in an artistic domain dictated somewhat by personal taste, as evidenced by the millions of recordings in existence: no two are mixed the same, and that does not make one mix wrong and the other right.

Target Met: [ X ] Yes [ ] No [ ] Partially
Over 80% earned an A, and this was the expectation for a beginning course with no entrance audition. The target was met.
Improvements in Spring 2018-Summer 2018 based on prior results: High-resolution Adam S6X far-field monitors were added to make audible the nuances and accuracy lost on lesser monitors, providing students with the finest audio fidelity in the Commonwealth of Virginia. Special stands had to be built and bolted to the floor to install each speaker, which weighs over 250 pounds, and the speakers had to be tuned to the room using various measurement tools. The program head, who is the only faculty member, had to oversee the purchase, selection, installation, and subsequent upkeep, which alone required hours of research. Twenty new software titles for audio and spectral analysis were added to studio computers, again requiring hours of research, purchase, installation, and updates for a staff of one. A complete new workstation for student practice, with an Avid C24 console, Genelec speakers, a Mac Pro computer, and peripherals, was assembled over Summer 2018 for classroom use starting in Fall 2018, all by a staff of one.
Based on recent results, areas needing improvement: Judging reverb types (such as plate, hall, and non-linear) and evaluating parameters like bloom and tail; compressors and compression ratios, knee, and output using RTA analyzers and plug-ins (implemented in Fall 2018).


Perhaps an essay on how and why one mixes sound differently would need to be provided, along with CDs, if this is to be understood; since the material is aural, one has to listen. If needed, please let us know and we will provide that data in a subsequent SLO. Students have this information from program faculty and from listening assignments. Strong areas were judging reverb intensity; mix and signal-flow setup within the DAW and on the console/hardware via patchbay aux sends; automation of sends and returns; and the importance and execution of a high-pass filter, notched at 500K or to taste, placed before the insert to remove information unnecessary for the plug-in. The addition of high-definition monitors makes this distinction stand out aurally, which is where it matters in the end, since people listen to audio rather than read it unless they have sophisticated music-reading skills and perfect (or at least relative) pitch. This is the first time this SLO was assessed.

EQ curves: 20 EQ plug-ins were added in Spring 2018 to bring the studio in line with current technology used in professional facilities. Constant updates are made each semester to keep hardware and software current. The addition of new software and high-resolution speakers in Fall/Spring 2018 is a major update to the listening environment, which is now equivalent to the finest available; this will help improve essential hearing skills while making learning enjoyable. Extra practice on a "B" station, added in Summer 2018 and available starting in Fall 2018, will help improve scores. Updates and new software will continue to be added to keep the studio current with industry practices. The "B" (practice) station will be expanded, space permitting, with more hardware and software if approved for purchase in Fall 2019-Spring 2020. Since funding comes from various sources (ETF for hardware, MPS for software), each available once a year but at different times, and with a staff of one doing everything, it is difficult to predict whether this component will be completed by the next assessment.
Actions to improve the SLO included installation of the high-resolution speakers. This took over 6 months: stands had to be built to support 250 pounds per speaker since the wall could not support the weight, which created many installation challenges that a staff of one had to resolve. The work started in December, and final tuning was completed during Spring 2018. The speakers were professionally fine-tuned to the room, providing accuracy. A highly detailed monitoring and tuned acoustical environment is needed to assess finer details; students now have to learn to comprehend the aural information they are receiving like never before. New books and listening methods will be added in Fall 2019.
Specific areas that need improvement:
• Ability to concentrate for an hour at a time without a break (3 hours, 40 minutes total for the duration of the class).
• Listening critically using the specific criteria provided, for example whether all frequencies are present and in balance.
• Related to the point above, recognizing dimensionality in the music: front/back, top/bottom.
• Identifying problematic frequency areas in the music as well as room-generated reflections.
• Removing DC offset, a common problem.
• Providing solutions to fix problem frequencies with hardware and software by identifying: 1. which component will address the issue; 2. within that component, which parameters will address the issue; and 3. how the issues change as correction is applied. Other areas change as well and will need to be identified and addressed, since sounds interact with each other and do not exist in isolation on the final product.


Next Assessment: Fall 2020

Explain issues in copyright law. For this class, issues that arise with regard to infringement by sampling OR music piracy, problems and possible solutions. [ X ] CT

Music Copyright Law MUS 179 Direct Measure: Essay, 2500 words minimum. Areas addressed:

1. What constitutes a sampling violation? What constitutes piracy?
2. The context in which sampling happens: music is not isolated but a continuous thread. When is it infringement? In piracy, what is considered to be pirated? For instance, if a live show is recorded, is sharing that recording illegal?
3. Various perspectives and affected parties had to be considered: the artist, secondary artists, producer, record label, marketer, radio, internet, copyright enforcement, sales, and live concerts.
4. Assumptions were examined. In sampling: when buying a recording, who do you expect to hear, the artist? Who gets what percentage of the money you spent, what does the musician earn, and is it sustainable, and for whom? In piracy: who is affected when a person steals music, and what are the consequences for all parties, including the writer, producer, the store, download links, concerts, and more?
5. Students provided evidence of piracy and sampling using examples of music and high-profile litigations, including current cases. The evidence of piracy is obvious and was identified by every student who chose the topic.
6. The implications of piracy and sampling are vast, ranging from the creation of a work (or even before) to the end, which is payment to all parties involved. This was perhaps the most elusive aspect, since the data is continually changing. For an artist not in the mainstream, and even within the mainstream, the rate of success is very low. Getting paid is becoming almost impossible, and income is shifting toward live music and production as a solution to lost wages; music is becoming a "service industry."

Semester/year data collected: Spring 2018
Target: 50% of students will score 80% or higher overall and on each criterion.
Results:
• 5 earned an "A"
• 3 earned a "B"
• 1 earned a "C"
• 1 earned a "D"
Results by CLO Criteria:

Criterion   High Proficiency   Proficient   Some Proficiency   No Proficiency
1           5/11               3/11         2/11               1/11
2           5/11               5/11         3/11               1/11
3           6/11               5/11         n/a                n/a
4           6/11               3/11         2/11               1/11
5           5/11               4/11         2/11               1/11
6           6/11               3/11         2/11               n/a

This was the first time this CLO was assessed, so there is no directly comparable data yet; these results will be compared with the next assessment in Spring 2019. The CLO was assessed using an existing SLO. Results from the prior SLO assessment (data collection Fall 2014, one section, paper assigned to 17 students):
• 12 earned an "A"
• 2 earned a "B"
• 2 earned a "C"
• 1 earned a "D"
Current results: over 75% of students earned 80% or better. With a target of 75%, students completed the assignment successfully with a score of 80% or better.

Target Met: [ X ] Yes [ ] No [ ] Partially
This was the first time this topic was used as a CLO. The faculty expectation was 50% success for the CLO. More emphasis will be placed on weak areas through quizzes and instruction. The class meets 4 times in the semester, so it is incumbent upon students to research what has been discussed in class; further emphasis will be placed on this beginning in Spring 2019 to improve results.
Current actions to improve the CLO based on the results, starting Spring 2019:
- Add more study guides and discussion groups, emphasizing the importance of research during the 2-week gaps between meetings.
- Add multimedia presentations to enhance engagement.
- Improve assessment methods by accurately tabulating criteria and scores.
- Provide more videos and study guides for use outside of meeting times, starting in Spring 2019.
New adjunct faculty will be teaching this course in Spring 2019 and providing the CLO data. Next assessment of this CLO: Spring 2019



Sample: One section of an 8-week session, Loudoun Campus, 11 students.

Program Goals Evaluation Methods Assessment Results Use of Results

Program Goal: To increase number of program-placed students

Evaluation Methods: OIR factbook

Target: To increase the number of program-placed students. Results for the past 5 years:

Fall   Number of Students   Change from Previous Fall
2017   39                   +7
2016   32                   -3
2015   35                   -22
2014   57                   -2
2013   59                   (baseline)

Target Met: [ X ] Yes [ ] No [ ] Partially Comparison to previous assessment: The increase meets the target. Program moved in Fall 2015 to a brand new studio in a new building.

Faculty continue to advise students, pointing out the advantages of being program-placed, such as scholarships and advising. Fall 2016-present: Work with area schools and the community to showcase the program and recording studio; award free recording time to NOVA Idol winners; provide recording time to area high school music programs; and participate in college promotions and seminars to promote the program. The studio is continually updated each semester to ensure the finest technology is available to students. In 2018-19 this will be accompanied by obtaining program-specific scholarships through grants and by keeping up with trends in new technology, music, and music production techniques. The first scholarships were awarded by the NVCC Foundation in 2018-19 to 4 students; two of the awardees accepted the award but never took another class. The current improvement request includes:
• Mics for specific instruments
• A frequency analyzer to help students initially
• FET mics
• Protected storage or RAID for the studio to safeguard against data loss
Assessed: Annually

To increase program graduation numbers

Evaluation Methods: Data compiled by OIR and published in the Factbook.

Target: To increase program graduation numbers. Results for the past 5 years:

Academic Year   Number of Graduates   Change from Previous Year
2016-17         5                     -2
2015-16         7                     +1
2014-15         6                     No change
2013-14         6                     -1
2012-13         7                     (baseline)

Results improved: [ ] Yes [ ] No [X] Partially
Current action(s) to improve program goal: Advertising; the timing depends on the college's PR department. One pilot ad ran on a bus during 2017-18, but results from that have not yet been compiled by PR. A flyer is handed out at all events hosted in the studio, and a display is carried to tabling events such as STEM events and NOVA Idol, among others.



Target Met: [ ] Yes [ X ] No [ ] Partially
Comparison to previous assessment(s): Lower by 2 students. This is a small program, so tracking raw numbers is a more useful measure than percentages, which suit larger programs; in percentages, a decrease of one student in a cohort of 12 can produce a skewed result.

Four scholarships per year in recording technology, worth $2,000 each and funded through grant money procured by the Program Head, benefit students in the program and help with completion. Approximately 50% of students ostensibly have no intent to complete the certificate, based on observations and on program-placed enrollment numbers compared with the total number of students in core classes. This is not unusual: audio engineering and live sound are not advanced-degree-oriented fields outside of academia, so students cherry-pick the courses that give them the knowledge they seek and then move on to work or projects. For example, copyright law is not required knowledge to run a console at an event; it is useful when recording in a studio but is rarely, if ever, mentioned as a requirement in a job announcement. A minority of students do not intend to complete in 3 semesters and are on a longer track that may combine an A.A. in music or the arts, with recording as a complementary additional skill. Assessed: Annually.


Annual Planning and Evaluation Report: 2017-2018 Nursing, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose: The Nursing Program is designed to prepare students to participate as contributing members of the healthcare team, rendering direct care to patients in a variety of health care facilities and agencies. Upon satisfactory completion of the program, students will be eligible to take the National Council Licensure Examination (NCLEX) leading to state licensure as Registered Nurse (RN) and are qualified to assume registered nurse positions in hospitals, nursing homes, clinics, physician’s offices, HMOs, and other community based settings. The Nursing program consists of 5 semesters, one pre-requisite semester and then four semesters of nursing courses. Students may follow one of two tracks to complete the Nursing program: (1) the Traditional-hybrid, four semester track with no classes in the summer; (2) The Online/Hybrid track, which includes four consecutive semesters (including summer) and presents the didactic portion of the nursing courses online. Advanced placement to the Traditional track is available for Licensed Practical Nurses (LPNs) wishing to enter the Nursing Program through the LPN to RN Transition program.

Student Learning Outcomes     Evaluation Methods     Assessment Results     Use of Results

Demonstrate using community-based nursing in the promotion of health, in providing for a safe and effective environment, and in promoting/maintaining physiological and psychological integrity

Community-Based Nursing in a Multicultural Environment NUR 150 Direct Measure: Midterm and Final Exam. A subset of items from the NUR 150 midterm and final, 50-item multiple-choice exams, was used to measure SLO #2. A total score of 78% is required for passing a unit exam. NUR 150 is only offered at the MEC, and testing was done in Blackboard, proctored on campus. Sample: N=135; Sections=4. Two faculty members were assigned to NUR 150.

Semester/year data collected: Fall 2017
Target: A mean P value of 84% on unit exams and finals. A total of 5 items out of 50 on the midterm exam and 5 items on the final exam assessed SLO #2.
Results: NUR 150 Midterm and Final Exam items, Fall 2017:

Exam      Test Items
Midterm   7, 11, 17, 19, 24
Final     6, 8, 11, 13, 32

Results: SLO #2: The mean score for the items assessing SLO #2 on the midterm and final was ( ), exceeding the benchmark of 84%. None of the items had a P value below 50%. The PBC for the items ranged from ( ). This is the first time SLO #2 has been assessed via midterm and final exam items; this SLO was previously assessed with the HESI Community Health Standardized Exam in Fall 2015.

Prior Action: SLO #2 was last assessed in NUR 150 in Fall 2015. The HESI score of 821 did not meet the benchmark of 850. At that time it was determined that the HESI standardized exam was not well aligned with the NUR 150 course objectives and that alternative assessment measures would be established beginning in Fall 2017. Current action: Fall 2018: The nursing program is adopting the VCCS common curriculum and concept-based learning model. Due to the implementation of the VCCS common curriculum, this is the final report for this SLO and course.

Utilize the teaching/learning process in providing nursing care

Nursing Fundamentals NUR 111

Semester/year data collected: Fall 2017 Method A: Seven items on the HESI assessed the concept of Teaching and Learning. The weighted HESI category score for these items was 745.

Prior Action: SLO #6 was last assessed in NUR 111 in Fall 2015. The HESI score of 821 did not meet the benchmark of 850. The assessment of SLO #6 via the subset of 10 items from the NUR 111 final exam exceeded the benchmark of 84% with a mean p-value of 87%.


Method A: Health Education System Inc. (HESI) Standardized Comprehensive Fundamentals of Nursing examination, extrapolated scores for Teaching and Learning.
Method B: Unit 2 Exam. A subset of items from NUR 111 Exam 2, a 50-item multiple-choice exam, was used to measure SLO #6 (Attachment 2). A total score of 78% is required for passing a unit exam. Item analysis (IA) consisting of the percentage of correct responses (P) and the point-biserial correlation (PBC) for all items was carried out. In addition, KR(20) is reported for the exam to evaluate the reliability of the instrument. NUR 111 is only offered at the MEC, and testing was done in Blackboard, proctored on campus.
Sample: N=102; Sections=3. Three faculty members were assigned to NUR 111.
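For readers unfamiliar with the item-analysis statistics named above, the sketch below shows how they are typically computed from a 0/1 item-response matrix: the P value is the proportion of correct responses per item, the point-biserial correlation (PBC) correlates each item with the total score, and KR-20 estimates the reliability of the whole exam. The response data are made up for illustration; this is not the program's actual scoring code:

```python
import math

# Rows = students, columns = items; 1 = correct, 0 = incorrect (hypothetical data).
responses = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
]
n_students = len(responses)
n_items = len(responses[0])
totals = [sum(row) for row in responses]

def mean(xs):
    return sum(xs) / len(xs)

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

for j in range(n_items):
    item = [row[j] for row in responses]
    p = mean(item)                # P value: proportion of correct responses
    pbc = pearson(item, totals)   # point-biserial correlation with total score
    print(f"Item {j + 1}: P = {p:.2f}, PBC = {pbc:.2f}")

# KR-20 reliability for the exam as a whole.
p_values = [mean([row[j] for row in responses]) for j in range(n_items)]
variance = sum((t - mean(totals)) ** 2 for t in totals) / n_students  # population variance
kr20 = (n_items / (n_items - 1)) * (1 - sum(p * (1 - p) for p in p_values) / variance)
print(f"KR-20 = {kr20:.2f}")
```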

SLO #6 was last assessed in NUR 111 in Fall 2015 (N=80); 14 items assessed the concept of Teaching and Learning, and the weighted HESI category score for those items was 821. The benchmark for the HESI Standardized Comprehensive Fundamentals of Nursing Examination is 850. The Fall 2017 weighted category score of 745 falls below the defined target and is a decrease from the previous Fall 2015 value of 821.
Method B, Fall 2017, NUR 111 Unit Exam 2: N=104; Sections=3.
Target: A mean P value of 84% on unit exams and finals. There were 4 items out of 50 on the unit exam that assessed SLO #6.
Results: SLO #6: The mean score for the 4 items assessing SLO #6 was 88%, exceeding the benchmark of 84%. None of the items had a P value below 50% (see Table 1). The PBC for the items ranged from 0.11 to 0.31.
Table 1: NUR 111 Unit Exam 2, Fall 2017:

Test Item   p-value   PBC
6           85.6      0.23
7           83.7      0.11
9           93.3      0.31
50          92.3      0.21
MEAN        88

These results are similar to the last time this SLO was reported via this method, in Fall 2015 (see Table 2), when the mean score of the 10 items assessing SLO #6 was 87%.
Table 2: NUR 111 Final Exam, Fall 2015:

Test Item   p-value
4           56.82%
9           95.45%
22          100%
26          88.64%
34          89.77%
45          89.77%
65          85.23%
74          68.18%
81          97.73%
94          98.86%
MEAN        87%

Actions included implementing teaching strategies that engage students in purposeful critical thinking (Fall 2016) and providing HESI Success guidance for students, including common errors to avoid and suggestions to improve testing strategies.
Most Recent Results: NUR 111 students did not meet the target of 850, with a HESI category score of 745. However, the assessment of SLO #6 via the subset of 4 items from the NUR 111 Unit Exam 2 exceeded the benchmark of 84% with a mean p-value of 88%. NUR 111 (Traditional Track): As part of the ongoing course evaluation review, faculty will conduct a detailed analysis of HESI results to identify potential gaps in the curriculum.
Current action: Fall 2018: The nursing program is adopting the VCCS common curriculum and concept-based learning model. Due to the implementation of the VCCS common curriculum, this is the final report for this SLO and course.

Demonstrate the ability to manage client care

Nursing Management NUR 255 Method A: Health Education System Inc. (HESI) Standardized Nursing Management examination, extrapolated scores for Manager of Care.

Semester/year data collected: Spring 2018
Target: The benchmark for the HESI Standardized Nursing Exams is 850.
Results: SLO #7: The total number of students taking the HESI Exit Standardized Management Exam was 6. The results for the SLO (Manager of Care) were as follows:

Previous actions to improve SLO: Prior actions to improve SLO #7 include the following. Teaching faculty incorporated practice case-study assignments, starting in Fall 2016, to allow students to practice the role of manager of care with patient scenarios on an acute unit. Furthermore, teaching faculty updated projects and assignments on the discussion board, as well as test questions, to ensure currency and adherence to evidence-based practice.


The analysis of this specialized HESI was done by extrapolating the individual scores of the students and evaluating whether each student met the benchmark of 850. The HESI scores are stratified by nursing program enrollment track (Traditional, Online, and LPN-RN); however, nursing students from all three tracks can choose to enroll in either the hybrid or the online NUR 255.
Method B: Online Evolve management case studies are utilized, with a completion level of 78% or above. Students have three (3) attempts to achieve the 78%.
Sample: One online section; N=6. One faculty member was assigned to the spring section.

Three (3) students met or exceeded the benchmark of 850, and three (3) students scored below the benchmark. The average score for this cohort of six (6) is 844. The six students from the Spring cohort were added to the Fall 2017 HESI Management Specialty Exam cohort; it is no longer possible to use the average score, so each student's overall score was used for the analysis.
Method B: SLO #7: Spring 2018: NUR 255, complete three (3) Evolve case studies with a grade of 78%.
Target: 100% of students will complete the three case studies with a grade of 78%.
Results: 100% of students completed the three Evolve management case studies.
Comparison to previous results: This is the first time Evolve case studies have been used to assess SLO #7 in the Nursing Management class.

Most Recent Results: A comparison of HESI Management scores was not possible because only six students completed NUR 255 in Spring 2018, and these students' scores were added to the previous fall cohort. However, students met the target for Method B.
Current actions to improve SLO #7: Teaching faculty will continue to review and update the course as needed before it is taught for the last time in Fall 2018, as a new common core curriculum will replace the old curriculum. Hence, this course will not be reevaluated for the Annual Planning and Evaluation Report. Due to the implementation of the VCCS common curriculum, this is the final report for this SLO and course.

CLO: Critical Thinking Program SLO #8: Demonstrate the use of critical thinking throughout the nursing process in the provision of client care

Second Level Nursing Principles and Concepts II NUR 222
Method A: RN Exit Exam. Health Education System Inc. (HESI) Standardized Exit RN Exam, extrapolated scores for Critical Thinking.
Method B: Final Exam. The NUR 222 final exam is a 100-item multiple-choice exam used to measure SLO #8, "Critical Thinking." The cognitive level of each item on the final was application or higher.

Semester/year data collected: Spring 2018. NUR 222 - Traditional track and Online track.
Traditional track: N=120; Sections=3. Two faculty members were assigned to the traditional track. The Traditional track is a combination of traditional students and students from the LPN to RN track.
Online track: N=36; Sections=2. One faculty member was assigned to the online track.
Combined Traditional and Online: N=156 total students; no students withdrew before the end of the semester. 155 students passed the course; one student received an incomplete.
Target: The benchmark for the HESI Standardized Nursing Exams is 850.

Previous actions to improve SLO: PrepU, an adaptive quizzing tool that mimics the NCLEX-RN, was used for assignments; it allows students to answer questions until they reach mastery levels from 1 through 8, and a mastery level of 4 was used as the minimum requirement for the assignments. No change was made to the pediatric textbook.
Most Recent Results: The benchmark of 850 for HESI standardized testing was not met for SLO #8; however, the benchmark was met for Method B.

Current actions to improve SLO: Based on the current analysis, the current actions to improve SLO #8 are as follows.


A total score of 78% is required for passing. Item analysis (IA) consisting of the percentage of correct responses (P) and the point-biserial correlation (PBC) for all items was carried out; the KR(20) reported for the final exam was 0.602. NUR 222 is only offered at the MEC. Sample: See next column.

Results: SLO #8: The total number of students taking the HESI Exit Standardized Exam was 152. For the purpose of HESI testing in NUR 222, the Traditional group was further broken down into 34 LPN-to-RN Traditional students and 86 Traditional students. The HESI Exit scores were as follows: LPN-RN Traditional: 776; Traditional: 789; Online: 867. This means the target was met for the online students, but not for the Traditional LPN-RN or Traditional students.
Method B: Spring 2018, NUR 222: Of the 156 enrolled students, 153 tested on Blackboard and the remaining 3 took the test on paper. 144 of 153 achieved >78% (94%). The KR(20)/Cronbach's alpha = 0.6. However, this exam does not currently break down the results to the level of how many students met the benchmark for each SLO category.

Teaching faculty will provide immediate remediation, including referrals to the student success center, for all students who score below 80% on the first unit exam or who have other at-risk issues. The total faculty approved a new pediatric textbook (Kyle) that is leveled for associate degree nursing students and will be used by the next academic cohort (graduating class of 2018). Finally, an increase in the mastery level of PrepU assignments to 6 out of 8 was suggested. NUR 222 will be phased out in May 2019; no further action will be taken.

Program Goals     Evaluation Methods     Assessment Results     Use of Results
The graduates are eligible to sit for and complete the NCLEX-RN leading to licensure as a registered nurse, as determined by NCLEX-RN pass rates

Method: Virginia Board of Nursing (VBON) quarterly reports: NCLEX-RN pass rates. Reported April 01-June 30; July 01-September 30; October 01-December 31 National Council State Board of Nursing (NCSBN) NCLEX Annual Report-Available annually in March. This report provides information on how the graduates performed against other programs regionally and nationally.

Target: The expected level of achievement (ELA) for this program outcome is: NCLEX-RN pass rates for NOVA Nursing Program graduates will be at or above the national licensure exam pass rates for associate degree programs as reported by the NCSBN. NCLEX-RN pass rate data:

Graduate Year   # Pass   # Candidates   Pass Rate   National Benchmark
2016            139      158            88%         83%
2017            165      185            89.2%       84%
*2018

* NCLEX annual pass rate data is available the first of the following year and will be reported with the 2018-19 APER. The NCLEX-RN pass rates for NOVA Nursing Program graduates have consistently been at or above the national licensure exam pass rates for associate degree programs as reported by the National Council State Board of Nursing.
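The pass-rate comparison reported above is a simple ratio check; the short sketch below reproduces the percentages in the table (the dictionary structure is ours, and only the published 2016 and 2017 figures are used):

```python
# Reproduce the NCLEX-RN pass-rate calculation from the table above.
cohorts = {
    2016: {"passed": 139, "candidates": 158, "national_benchmark": 0.83},
    2017: {"passed": 165, "candidates": 185, "national_benchmark": 0.84},
}

for year, data in cohorts.items():
    rate = data["passed"] / data["candidates"]
    meets_ela = rate >= data["national_benchmark"]  # ELA: at or above the national rate
    print(f"{year}: pass rate {rate:.1%}, benchmark {data['national_benchmark']:.0%}, ELA met: {meets_ela}")
```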

Spring 2016: Anticipate and prepare for changes to the NCLEX test plan; the Detailed Test Plan for the National Council Licensure Examination for Registered Nurses was revised in April 2016. The program revised and updated its curriculum mapping document to the NCLEX-RN competencies. The 2016 NCLEX test plan includes items in multiple formats, including but not limited to multiple-choice, multiple-response, fill-in-the-blank calculation, ordered-response, and hot-spot items. Additionally, the percentage of items testing Pharmacological and Parenteral Therapies has increased. Program exams have been updated to address the changes in the 2016 NCLEX test plan.
Fall 2017: Continue to monitor NCLEX pass rates. The Medical Education Campus (MEC) and Nursing Division have sustained programmatic efforts in place to assist nursing student success on the NCLEX-RN. MEC student services offers an NCLEX-RN review course on campus each spring semester, and the MEC Library has a number of NCLEX-RN preparation resources.
Next Assessment: NCLEX-RN pass rates are assessed annually. The NCLEX pass rate for the May 2018 graduates will be reported with the 2018-19 APER.


The students enrolled in the Nursing Program complete curriculum requirements in the prescribed length of time, as illustrated by program and college retention and graduation rates.

Method: The retention rate is calculated by dividing the number of students who have progressed to the third semester in the program by the number of students enrolled at census date in the first semester of the program. ELA is >60%. Completion is calculated by dividing the number of students who successfully complete the last semester of the program within 150% of program length by the number of students enrolled at census date in the first semester of the program. Retention and completion data are aggregated from SIS by the Office of Institutional Research (OIR).
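The retention and completion metrics defined above are simple ratios; the sketch below illustrates the calculation using figures from the tables that follow (the helper function is ours, not OIR's, and the >75% completion ELA is taken from the narrative later in this section):

```python
# Illustrative calculation of the retention and completion metrics defined above.
def rate(numerator, denominator):
    return numerator / denominator

# Retention: NUR 221 enrollment the subsequent fall / NUR 111 enrollment at census (ELA > 60%).
retention_fall2016 = rate(120, 154)   # Fall 2016 -> Fall 2017 cohort from the retention table
print(f"Retention: {retention_fall2016:.0%} (ELA met: {retention_fall2016 > 0.60})")

# Completion: students finishing within 150% of program length / students enrolled at census (ELA > 75%).
completion_2017 = rate(162, 191)      # Fall 2014-Spring 2015 entrants, completing by May 2017
print(f"Completion: {completion_2017:.0%} (ELA met: {completion_2017 > 0.75})")
```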

Program Retention

Terms                 NUR 111 Enrollment   Subsequent Fall NUR 221 Enrollment   %
Fall 2015-Fall 2016   224                  160                                  71
Fall 2016-Fall 2017   154                  120                                  78
Fall 2017-Fall 2018

Nursing Program retention rates exceed the established ELA of >60%. Fall 2016 retention rates declined by 11%; the decline from Fall 2015 to Fall 2016 is attributed to a significant decline in student pass rates in the second-semester Maternal Child Health course. The Nursing Course Evaluation Summaries reflected a decline in pass rates from 76% in Spring 2015 to 63% in Spring 2016. Following the course changes described under Use of Results, the pass rates for Maternal Child Health increased to 93% in Summer 2016 and 86% in Spring 2017. Fall 2017: retention rates improved by 7%.
Program Completion Data

Terms                              # Students Enrolled at Census Date   # Students Completing Program at 150%   %
Fall 2013-Spring 2014 (May 2016)   187                                  153                                     82%
Fall 2014-Spring 2015 (May 2017)   191                                  162                                     85%
Fall 2015-Spring 2016 (May 2018)

Nursing Program completion rates have consistently met or exceeded the established ELA of >75%. The improvement in completion rates is attributed to sustained efforts put in place over the past academic years to assist with student retention and completion. Student support services at the MEC offer a wide range of academic, fiscal, and personal support services to promote student success. The nursing division has an established advisory program: each student is assigned a nursing faculty member as an advisor in the first semester to support their academic progression. Additionally, in Fall 2014 faculty teaching the first-semester nursing courses implemented a strategy of reaching out to students who were unsuccessful on the first exam, rather than waiting until mid-term, to discuss strategies for success in the program and develop a remediation plan.
Fall 2017 Nursing Exit Interview Data: Data from Nursing Program Exit Interviews indicate that the primary reason for students exiting the program is academic in nature (63%), followed by personal reasons (28%).

Fall 2016: The retention rate declined by 11%. A plan of action to improve retention rates was developed by faculty, which included:
• Level the NUR 180 curriculum to the entry-level associate degree nurse.
• Decrease the required readings, focusing on central concepts.
• Remove STI, female reproduction, and contraception content from this course and place it into Medical-Surgical Nursing, where this material originally belonged per the VCCS course content.
• Assign concepts not central to the material as recommended readings.
Completion rates improved from 80% to 82%.
Fall 2017: Retention rates improved by 7%. To better inform program decision-making for the maintenance and improvement of students' completion of the nursing program, a retrospective analysis of data from Nursing Program Exit Interviews was aggregated for AY 2014-15, 2015-16, and 2016-17. The total faculty group is working with student services to further assess the implications of these results. The MEC and Nursing Division have sustained efforts in place to assist with student retention. Student support services at the MEC offer a wide range of academic, fiscal, and personal support services to promote student success. The nursing division has an established advisory program: each student is assigned a nursing faculty member as an advisor in the first semester to support their academic progression. Additionally, any student who is unsuccessful on a nursing exam is contacted by their course instructor, and a remediation plan is implemented.
Next Assessment: Retention rates are assessed annually. Program retention for the 2017-18 AY will be reported with the 2018-19 APER.


GPA and TEAS scores do not appear to be factors in a student's lack of success. Of possible significance is the percentage of students who indicated they did not use the nursing tutor resource (77%). The data appear to suggest that working does impact student success in the nursing program: all of the students indicated they were working, with the majority working 10-20 hours per week, and 51% of these students believed that work impacted their success compared to 48% who did not. It is difficult to say definitively that work impacted students' success, as comparative data for successful students are not available.

Completion rates are assessed annually. Completion rates for students who entered the nursing program in Fall 2015-Spring 2016 will be reported with the 2018-19 APER.

The Nursing Program prepares students to practice in various community-based settings as identified by employment data

Method A: Annual Survey of Graduates. Target: >80% of NOVA Nursing graduates surveyed will be employed either full time or part time within 12 months of graduation.

Fall 2017 Nursing Program Graduate Survey Employment Data: Are you currently employed as an RN?

Graduates   Yes   No   %
May 2015    42    2    95
May 2016    32    2    93
May 2017    22    10   54

For graduate survey respondents surveyed at 12 months, job placement rates exceed the benchmark of >80%. Job placement for the May 2017 graduate respondents is below the program's target; this cohort will be surveyed again in May 2018, at 12 months after graduation. The job placement percentages support anecdotal discussion at Advisory Board meetings and Division meetings regarding the quality of the NOVA graduate.

Fall 2016: NOVA Survey of Graduates: 83% of survey respondents indicated employment; ELA met.
Fall 2017: To better inform program decision-making, a survey of the last three graduating classes was conducted in Fall 2017. The program will work with the NVCC Office of Institutional Research (OIR) to develop a systematic process for surveying nursing graduates at 12 months, beginning in May 2018 with the May 2017 graduates.
Next Assessment: Employment data is assessed annually. Employment rates for the May 2018 graduates will be reported with the 2018-19 APER.


Annual Planning and Evaluation Report: 2017-2018 Occupational Therapy Assistant, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.
Program Purpose Statement: The NOVA Occupational Therapy Assistant (OTA) curriculum is designed to prepare students to assist occupational therapists in providing occupational therapy treatments and procedures. Graduates may, in accordance with state laws, assist in the development of occupational therapy treatment plans, carry out routine functions, direct activity programs, and document the progress of treatments. Upon completion of the program requirements, students are able to sit for the National Board for Certification in Occupational Therapy certification exam for OTAs.

Student Learning Outcomes     Evaluation Methods     Assessment Results     Use of Results

Establish and maintain a therapeutic rapport with clients, families, colleagues, and other health care professionals through effective communication and appropriate professional behaviors during the screening and evaluation process.

Coordinated Internship in Physical Disabilities OCT 190 (Fall 2017)
Coordinated Internship OCT 290 (Spring 2018)
Direct Measure OCT 190: Clinical Fieldwork Level I Evaluations - The Clinical Fieldwork Level I Evaluations are completed by the assigned Fieldwork Educator and reviewed by the NOVA Academic Fieldwork Coordinator (AFWC) at the end of each Level I fieldwork experience. Specifically, the questions on the Clinical Fieldwork Level I Evaluations that target this SLO include:
• Question #5: Demonstrates appropriate professional behaviors and communicates appropriately with fieldwork educator and other staff.
• Question #11: Establishes rapport with clients and families, as appropriate to the setting.
Direct Measure OCT 290: The AOTA Fieldwork Performance Evaluation (FPE) for the Occupational Therapy Assistant Student is completed by the assigned Fieldwork Educator and reviewed by the NOVA AFWC at the end of each Level II fieldwork experience. Specifically, the questions on the AOTA FPE that target this SLO include:
• Question #16 - Therapeutic Use of Self
• Question #18 - Verbal/Nonverbal Communication
• Question #20 - Self Responsibility
• Question #21 - Responds to Feedback
• Question #22 - Work Behaviors
• Question #23 - Time Management
• Question #24 - Interpersonal Skills
• Question #25 - Cultural Competence

Semester/year data collected: Fall 2017 (OCT 190) and Spring 2018 (OCT 290)
Target: 100% of OTA students will Exceed or Meet the Standard on Questions #5 and #11 on the Clinical Fieldwork Level I Evaluations.
Current Results, OCT 190, Fall 2017:
Question #     % Exceeds Standard   % Meets Standard
Question #5    12%                  88%
Question #11   12%                  88%
Previous Results, OCT 190, Fall 2016:
Question #     % Exceeds Standard   % Meets Standard
Question #5    50%                  50%
Question #11   61%                  39%
Target: 100% of OTA students will Exceed or Meet the Standard on Questions #16, #18, #20, #21, #22, #23, #24, and #25 on the AOTA FPE for the Occupational Therapy Assistant student.
Current Results, OCT 290, Spring 2018:
Question #     % Exceeds   % Meets   % Needs Imp
Question #16   47%         50%       3%
Question #18   32%         65%       3%
Question #20   29%         68%       3%
Question #21   41%         56%       3%
Question #22   32%         65%       3%
Question #23   26%         65%       9%
Question #24   53%         47%
Question #25   35%         65%
Previous Results, OCT 290, Spring 2017:
Question #     % Exceeds   % Meets   % Needs Imp
Question #16   44%         50%       6%
Question #18   15%         79%       6%
Question #20   41%         59%
Question #21   62%         38%
Question #22   50%         47%       3%
Question #23   50%         41%       9%
Question #24   53%         47%
Question #25   56%         44%

Previous action(s) to improve SLO: The Academic Fieldwork Coordinator (AFWC) continues to stress to all cohorts and all fieldwork classes, OCT 100 and in OTA orientation, the importance of professional behaviors. The AFWC includes practical examples and methods to reflect on their personal professional behaviors in the Fall 2017 OCT 190 and in Spring 2018 OCT 290 prior to students starting fieldwork experiences. The OTA faculty reviews professional behaviors during the bi-annual advising sessions with each student and watched for trends in score changes. Scores are somewhat subjective by each Fieldwork Educator. This topic is of key concern for all Fieldwork Educators. Target Met: [ x ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Continued education and monitoring of professional behaviors throughout the OTA curriculum is utilized. Professional behaviors are emphasized at multiple points during the year including: during bi-annual advising sessions, during the initial OTA program orientation session, in all OTA courses in the curriculum, and especially in the OCT 190 courses. Additionally, all faculty address time management routinely in classes and 1:1 as appropriate.


Sample Size (Specify N/A where not offered):

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
ME only           4                             4                     16
ELI               N/A                           N/A                   N/A
DE*               N/A                           N/A                   N/A
*Dual-enrollment

Current results improved: [ ] Yes [ ] No [X] Partially
Strengths by Criterion/Question/Topic: Based on the data analysis, 100% of the students demonstrated professional behaviors on their Level I fieldwork in Fall 2017.
Weaknesses by Criterion/Question/Topic: One student demonstrated difficulty with professional behaviors on Level II fieldwork (OCT 290), but this did not rise to the level of failure for the internship experience. The program's Academic Fieldwork Coordinator assisted the student and the Fieldwork Educator throughout the fieldwork to ensure that the student completed it successfully.
Comparison to Previous Assessment: Although the target was met for the OCT 190 course, fewer students exceeded expectations compared with Fall 2016. On the AOTA FPE for the OCT 290 course, students' scores were slightly improved this year on questions #16, #18, and #23, slightly lower on questions #20, #21, and #22, and consistent with last year on questions #24 and #25. Overall, the scores are within the acceptable range for fieldwork, as all students completed the Level II fieldwork experience.

Current actions to improve SLO based on the results: Since professional behaviors are a critical component of a successful fieldwork education process, the OTA faculty will continue to stress appropriate behaviors in all fieldwork classes (OCT 190 and OCT 290) prior to students going on each fieldwork experience in Fall 2019, Spring 2020, and Summer 2020; during the fieldwork experiences; during the bi-annual (Fall 2019 and Spring 2020) advising meetings; and at the OTA program new student orientation in August 2019. Next assessment of this SLO: This SLO will be reassessed in AY 2019-20.

Implement evidence-based practice skills when working with clientele across the lifespan.

Topics in Evidence-Based Practice in Occupational Therapy OCT 195
Direct Measure: Critically Appraised Topic (CAT) Paper. The rubric for the CAT paper is attached. OTA program grading scale: A = 90-100; B = 80-89; C = 75-79; D = 70-74; F = <70. The Critically Appraised Topic (CAT) was a group paper requiring students to implement the evidence-based practice skills they acquired throughout the summer semester in OCT 195.

Semester/year data collected: Summer 2017
Target: 80% of NOVA OTA students will receive an A (90-100%) on the CAT paper rubric, demonstrating effectiveness at utilizing the evidence-based practice skills needed to be an informed OTA.
Results: In Summer 2017, 16/16 (100%) of the second-year OTA students completed the CAT paper in OCT 195, Topics in Evidence-Based Practice in Occupational Therapy. Specifically, the following grades were earned on the CAT paper:

• A = 13/16 students (81.2%) • B = 3/16 students (18.8%) • C = 0%

Previous action(s) to improve SLO: During each summer session, the Program Director, who is also the professor of the OCT 195 course, attempted to spend increased instructional time specifically reviewing the components of the CAT, including the study design, outcome measures, main findings, and interpretation of the results.
Target Met: [ X ] Yes [ ] No [ ] Partially
Based on recent results, areas needing improvement: Understanding and translating the results section of a quantitative research article tends to be challenging for students. Spending more time reviewing outcomes, main findings, and the implementation of the results will be helpful.


Students were required to choose an evidence-based practice topic and a specific targeted sample of clients (a particular diagnosis, an age category anywhere across the lifespan, and a particular OT treatment approach) of their choice and to perform an exhaustive search of the evidence. The students then were required to categorize the evidence by quality and draw conclusions about best practices when working with their chosen clientele. Students wrote a CAT paper and presented this information to the class.
Sample Size (Specify N/A where not offered):

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
ME only           1                             1                     16
ELI               N/A                           N/A                   N/A
DE*               N/A                           N/A                   N/A
*Dual-enrollment

Strengths by Criterion/Question/Topic: Of the 16 students who completed the evidence-based practice paper, which included an exhaustive search of the evidence, 13 received an A and 3 earned a B, showing an overall excellent understanding and application of the evidence-based practice process in OT.
Weaknesses by Criterion/Question/Topic: When reviewing the rubrics of those who received a B, the main area students had difficulty with was translating the results section of the research articles.
Current results improved: [ ] Yes [ X ] No [ ] Partially
Comparison to previous assessment: This SLO was previously assessed in AY 2015-16. In comparison to those results, students demonstrated a slight decline of 3.1%: in 2015-16, 84.3% of students received an A and 15.7% received a B. The slight decrease was due to a loss of points in the "Summary of Best Evidence" section, specifically the translation of the "Results" section of the articles.

Current actions to improve SLO based on the results: Based on the current results, the program will propose expanding the summer session from 5 weeks to 6 weeks to allow students more time with the translation of the results sections of research articles and, thereby, improve the data pertaining to this SLO. Additionally, collaborating with library services to gain further resources for students will help enhance their understanding and interpretation of the results sections of articles. These actions will occur during the OCT 195 course iteration in Summer 2019. Next assessment of this SLO: This SLO will be reassessed in AY 2019-20.

Utilize the teaching and learning process in providing occupational therapy interventions.

Therapeutic Skills OCT 207 Direct Measure: Final Practical Rubric Item: In the OCT 207 final practical, students receive a case and have to provide a simulated client with education, utilizing and applying teaching and learning principles integrated into the provision of occupational therapy interventions. Rubric item: Provided accurate and easy-to-understand education (5 points). OCT 207 Therapeutic Skills Final Practical is attached. The AOTA Fieldwork Performance Evaluation (FPE) for the Occupational Therapy Assistant Student is completed by the assigned Fieldwork Educator and reviewed by the AFWC at the end of each Level II fieldwork experience. Specifically, the question on the AOTA FPE for the Occupational Therapy Assistant Student that targets this SLO includes:

Semester/year data collected: Spring 2018
Target: 90% of OTA students will utilize the teaching and learning process in providing OT interventions, as demonstrated by receiving a 4/5 or 5/5 on the rubric item in the OCT 207 Skills final practical exam.
Results:
Rubric Item Score (out of 5)   % That Earned
5/5                            89%
4/5                            5.5%
3/5                            5.5%
2/5                            0%
1/5                            0%
Total percentage receiving a 5/5 or 4/5: 94.5%
Average score: 4.83/5
Current results improved: [X] Yes [ ] No [ ] Partially
Strengths by Criterion/Question/Topic: 94.5% of NOVA OTA students were able to demonstrate effective teaching and learning principles during the OCT 207 final practical exam, and therefore the target was exceeded. Being an effective patient educator is critical to being a successful OT practitioner, and the students exhibited this skill.
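The rubric result above is the share of students scoring at or above 4/5 on the item. The sketch below reproduces that check under the assumption, inferred from the reported percentages and the N of 18, that the individual scores were sixteen 5s, one 4, and one 3; the score list is an inference rather than published data, and the 94.4% computed here differs from the reported 94.5% only by rounding:

```python
# Hypothetical reconstruction of the OCT 207 rubric-item scores (out of 5) for 18 students,
# inferred from the reported percentages; used only to illustrate the target check.
scores = [5] * 16 + [4, 3]

target = 0.90          # 90% of students should earn 4/5 or 5/5 on the rubric item
meeting = sum(1 for s in scores if s >= 4)
share = meeting / len(scores)
average = sum(scores) / len(scores)

print(f"Students earning 4/5 or 5/5: {meeting}/{len(scores)} ({share:.1%})")
print(f"Average score: {average:.2f}/5")
print(f"Target of {target:.0%} met: {share >= target}")
```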

Previous action(s) to improve SLO: The OTA faculty and the AFWC consistently stress to all cohorts the importance of the teaching-learning process throughout the curriculum. Additionally, students are required to demonstrate the ability to effectively teach "simulated" clients in a variety of OCT 207 lab sessions, giving them practice teaching therapeutic skills to clients and peers in order to improve student achievement in Spring 2019.
Target Met: [X] Yes [ ] No [ ] Partially
Based on recent results, areas needing improvement: Ensuring that patient education is easy to understand is a critical clinical skill that all OTAs need to be able to demonstrate. Continued exposure, faculty-to-student modeling, and discussion of the implementation of teaching-learning principles will be a focus in OCT 207 and all intervention coursework.


• Question #14 - Implements Intervention
• Question #18 - Verbal and Non-Verbal Communication
Sample Size (Specify N/A where not offered):

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
ME                1                             1                     18
ELI               N/A                           N/A                   N/A
DE*               N/A                           N/A                   N/A
*Dual-enrollment

Weaknesses by Criterion/Question/Topic: The one student who scored below the benchmark earned a 3/5. She was required to retake the OCT 207 final practical and received a 4/5 on this criterion on the retake exam, showing improvement.
Comparison to previous assessment: This SLO was previously assessed in 2015-16, and improvement was shown in this reporting period. Previously, 72.2% earned a 5/5, 22.2% earned a 4/5, and one student earned a 2/5, with an average score of 4.61/5. Thus, more students received a 5/5 in this assessment period, and the average score improved by 0.22 points from the previous reporting period. Also, the lowest score in the previous reporting period was a 2/5, whereas in 2017-18 the lowest was a 3/5, which improved to a 4/5 upon retake. All data show improvement.
Semester/year data collected: Spring 2018 (OCT 290)
Target: 100% of OTA students will Exceed or Meet the Standard on Questions #14 and #18 on the AOTA FPE for the Occupational Therapy Assistant student.
Current Results, OCT 290, Spring 2018:
Question #     % Exceeds   % Meets   % Needs Imp
Question #14   18%         82%
Question #18   32%         65%       3%
Previous Results, OCT 290, Fall 2016:
Question #     % Exceeds   % Meets   % Needs Imp
Question #14   32%         68%
Question #18   32%         64%       4%
Current results improved: [ ] Yes [ ] No [ X ] Partially
Strengths by Criterion/Question/Topic: 34 students were identified as meeting or exceeding the goal of successfully implementing interventions with their clients at the end of the Level II fieldwork experience, and 33 students met or exceeded the goal of demonstrating appropriate verbal and nonverbal communication on their Level II fieldwork.
Weaknesses by Criterion/Question/Topic: The student who "needed improvement" in verbal and non-verbal communication was still able to successfully complete the fieldwork experience.

Current actions to improve SLO based on the results: To continue to help students improve patient education, the OTA faculty will demonstrate accurate, easy-to-understand education in a video library of skill implementation that students can access at any time. Further, continued practice in the simulation center, with peers acting as patients, will help students incorporate understandable and accurate education into their daily practice as OTA students. The AFWC will stress the importance of the teaching and learning process and will model appropriate verbal and non-verbal communication in all fieldwork courses, including each OCT 190 course in Fall 2019, Spring 2020, and Summer 2020. During the Fall 2019 and Spring 2020 advising sessions, these professional behaviors will also be a discussion topic with each student. Next assessment of this SLO: This SLO will be reassessed in AY 2019-2020.


This student received feedback from the evaluation and was able to identify strategies to continue working on this issue in the future.
Comparison to previous assessment: The data were similar to the 2015-16 assessment with respect to the verbal and non-verbal communication scores; however, there was a slight decline in the "Implements Intervention" category.

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Apply reflective problem-solving skills and decision-making skills while providing OT intervention in a safe manner. [ X ] CT

Therapeutic Skills OCT 207 Direct Measure: OCT 207 Final Practical Exam Rubric Item. Grading Rubric (as attached): CATEGORY V: CREATIVE PROBLEM-SOLVING/FUNCTIONAL IMPLICATIONS (10 points) A. Demonstrates creative problem solving when performing:

1. Procedures to transfer the client or to perform grip/pinch strength and others. (5 points)

B. Demonstrates creative problem solving when discussing 2 functional implications that will be considered for discharge. (5 points)

Sample Size (Specify N/A where not offered):
Campus/Modality: # of Total Sections Offered, # Sections Assessed, # Students Assessed
ME: 1, 1, 17
ELI: N/A, N/A, N/A
DE*: N/A, N/A, N/A

*Dual enrollment. One student was not assessed in this category because she failed the practical before reaching this section of the practical.

Semester/year data collected: Spring 2018 Target: 90% of OTA students will apply reflective problem-solving skills and decision-making skills while providing OT intervention in a safe manner on a case-based Therapeutic Skills final practical by receiving a 4/5 or 5/5 on each of the two problem-solving areas on the OCT 207 final practical rubric. Category V.A.1 Rubric Item (out of 5 points): 5/5 earned by 82.5%, 4/5 by 11.7%, 3/5 by 5.8%, 2/5 by 0%, 1/5 by 0%. Total percentage receiving a 5/5 or 4/5: 94.2%. Average score: 4.76/5. Category V.B Rubric Item (out of 5 points): 5/5 earned by 82.4%, 4/5 by 17.6%, 3/5 by 0%, 2/5 by 0%, 1/5 by 0%. Total percentage receiving a 5/5 or 4/5: 100%. Average score: 4.82/5. Strengths by Criterion/Question/Topic: Of the 17 students assessed on this practical, 100% scored a 5/5 or a 4/5 on the Category V.B rubric item, and 94.2% did so on the Category V.A.1 rubric item, exceeding the 90% target in both problem-solving and decision-making categories of the Therapeutic Skills grading rubric. Overall, students demonstrated excellent problem-solving skills and decision-making skills in this case-based practical, which mimicked a typical treatment session in the clinic.
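As an illustrative sketch only (the per-score counts are back-calculated from the percentages reported above and are not an official data extract), the roll-up from the Category V.A.1 score distribution to the benchmark percentage and average score can be reproduced as follows:

    # Hypothetical roll-up of the OCT 207 Category V.A.1 rubric scores for the 17 assessed students.
    # Counts are inferred from the reported percentages (14, 2, and 1 students).
    scores = {5: 14, 4: 2, 3: 1, 2: 0, 1: 0}  # rubric score -> number of students

    total = sum(scores.values())
    meeting_benchmark = scores[5] + scores[4]                        # students earning 4/5 or 5/5
    pct_meeting = 100 * meeting_benchmark / total                    # ~94.1% (reported as 94.2%)
    average = sum(score * n for score, n in scores.items()) / total  # ~4.76/5

    print(f"{pct_meeting:.1f}% met the 4/5-or-5/5 benchmark; average score {average:.2f}/5")
    print("90% target met:", pct_meeting >= 90)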

Previous action(s) to improve CLO if applicable: This is the first time that this specific core learning outcome has been assessed. Target Met: [x] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Continued practice applying problem-solving and decision-making skills in the OCT 207 lab sessions is recommended to further improve success on this CLO. Current actions to improve CLO based on the results: During Spring 2019, the OTA program will make greater use of case-based scenarios in the OCT 207 lab in the simulation center so that students apply their problem-solving and decision-making skills more consistently. Next assessment of this CLO: This CLO will be reassessed in AY 2019-2020.


Weaknesses by Criterion/Question/Topic: One student failed the practical before these two areas were assessed, so these two criteria were not evaluated. Of the 17 students assessed on these rubric categories, only one received a 3/5 on Category V.A.1 because she did not explain footwear safety during the transfer, although she did incorporate proper footwear during the transfer process. Comparison to previous assessment: This is the first assessment of this CLO, and the results will serve as a benchmark for future assessments.

Program Goals Evaluation Methods Assessment Results Use of Results The NOVA OTA Program will maintain a retention rate of 85% or higher.

The number of students still enrolled in the program at the end of the spring semester divided by the number of students enrolled at the start of the fall semester. Percentage of OTA students with two signed NOVA OTA Program Advising Forms per year to ensure progress through the program.

Target: The NOVA OTA Program will maintain a retention rate of 85% or higher each year. Results for Past 5 Years - OTA Program Retention Chart:

Academic Year: Number of Students Enrolled in the Fall / Number of Students by End of Spring, Retention Rate
2013-2014: 15/14, 93.3%
2014-2015: 33/28, 84.8%
2015-2016: 37/34, 91.89%
2016-2017: 38/44, 86.8%
2017-2018: 37/35, 94.59%
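As a minimal illustration (a sketch from the figures above, not part of the program's own reporting workflow), the retention rate implied by the chart is end-of-spring enrollment divided by fall enrollment:

    # Hypothetical retention-rate check using the enrollment figures reported above.
    # Each entry: academic year -> (students enrolled in fall, students enrolled by end of spring).
    # 2016-2017 is omitted because the fall/spring figures as printed (38/44) do not
    # reproduce the reported 86.8% rate.
    enrollment = {
        "2013-2014": (15, 14),
        "2014-2015": (33, 28),
        "2015-2016": (37, 34),
        "2017-2018": (37, 35),
    }

    TARGET = 85.0  # program target: retention of 85% or higher each year

    for year, (fall, spring) in enrollment.items():
        rate = 100 * spring / fall
        print(f"{year}: {rate:.2f}% retention (target met: {rate >= TARGET})")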

Target: 100% of OTA students will participate in advising in the fall and spring semesters annually and will sign an advising form at the end of each advising session to enhance retention rates. OTA Program Advising Chart:
Academic Year: Fall Semester (% of students receiving advising), Spring Semester (% of students receiving advising)
2013-2014: 100%, 100%
2014-2015: 100%, 100%
2015-2016: 100%, 100%
2016-2017: 100%, 100%
2017-2018: 100%, 100%

Target Met: [ X ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): The retention rate improved by 7.79 percentage points from 2016-17 to 2017-18. Additionally, a consistent 100% of OTA students have received advising twice a year to ensure successful program progression.

Previous action(s) to improve program goal: These actions are ongoing. The competitive admissions process is re-examined each year by the faculty and the NOVA OTA program curriculum advisory committee to ensure that the demands of the program are accurately targeted. Another action for improvement is for the faculty to identify early when a student discloses personal circumstances that hinder academic performance or when academic performance is poor. Early interventions utilized to support academic success include Student Services resources, the tutor center, and the CARE team. The assigned OTA faculty advisor is responsible for following up with the identified student every other week to assist the student academically and to navigate resources as necessary, again aiding student success and retention. Most recent results: The NOVA OTA program's retention rate improved by 7.79 percentage points this year. As per the recommendations above, the program re-examines the admissions process annually and has been proactive in providing early interventions to students who are struggling academically and personally. Utilization of the college's resources has proven to be helpful. Additionally, advising


continues to be a programmatic priority, as 100% of OTA students have received advising in the fall and spring semesters to enhance retention. Results improved: [ x ] Yes [ ] No [ ] Partially Current action(s) to improve program goal: One action to improve this programmatic goal is to recommend the college's new Single Stop program, particularly for students whose personal challenges are hindering academic performance. This referral will begin in Fall 2018 and continue on an as-needed basis. Assessed: Annually

The NOVA OTA Program will increase program graduation total.

Number of Graduates by Program and Specialization: Comparing graduation rates and graduate totals in 2014-15, 2015-16, 2016-17 and 2017-18.

Target: NOVA OTA program will graduate 85% of its students who successfully completed 100% of the curricular program requirements within the cohort. Results for Past 4 Years: The program had its first graduating class in 2015 (4 years of data collected):

Academic Year: Students Entering/Graduating (Graduation Rate), Percentage-Point Increase
2014-2015: 15/12 = 80%, N/A
2015-2016: 18/15 = 83.3%, 3.3
2016-2017: 20/17 = 85%, 1.7
2017-2018: 20/17 = 85%, 0
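As a hedged illustration (a sketch using the entering/graduating counts above, not the program's official calculation), the cohort graduation rates and the year-over-year percentage-point changes can be reproduced as:

    # Hypothetical check of cohort graduation rates from the entering/graduating counts above.
    cohorts = [
        ("2014-2015", 15, 12),
        ("2015-2016", 18, 15),
        ("2016-2017", 20, 17),
        ("2017-2018", 20, 17),
    ]

    TARGET = 85.0
    previous_rate = None
    for year, entered, graduated in cohorts:
        rate = 100 * graduated / entered
        change = "N/A" if previous_rate is None else f"{rate - previous_rate:+.1f} pts"
        print(f"{year}: {rate:.1f}% graduated (change {change}, target met: {rate >= TARGET})")
        previous_rate = rate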

Target Met: [ X ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): The graduation rates in 2016-17 and 2017-18 are the same, with 20 students entering the program and 17 graduating in each year. The target of graduating 85% of students who entered the program was met.

Previous action to improve program goal: To continue to improve graduation rates, students are identified early by OTA faculty if there are academic performance issues or if personal circumstances hinder academic performance, and they are referred to resources, including the MEC retention counselor, the tutor center, the CARE team, and/or Student Services as appropriate. The assigned OTA faculty advisor follows up with these identified students every other week, or more frequently as needed, to support the students' academic success and ultimately assist them in becoming graduates of the NOVA OTA program. These actions are completed on an as-needed basis throughout the Fall, Spring, and Summer semesters and are ongoing. Most recent results: The program had 20 students enter the program and 17 graduate from the program during the 2017-18 AY, meeting the target. Results improved: [ X ] Yes [ ] No [ ] Partially


Current actions to improve program goal: In addition to continuing the same plan as outlined above, the OTA program will utilize the new Single Stop program resources for students whose personal struggles are hindering academic performance. It will be implemented in Fall 2018 and used on an as-needed basis thereafter, based on student needs. Assessed: Annually

The NOVA OTA program will maintain a 100% pass rate on the National Board for Certification in Occupational Therapy (NBCOT) OTA certification exam.

NBCOT Program Director Portal and via the website: https://secure.nbcot.org/data/schoolstats.aspx

Target: A 100% pass rate on the NBCOT OTA certification exam. Results for Past 4 Years: The program had its first graduating class in 2015 (4 years of data collected).
Academic Year: NBCOT OTA Board Pass Rate, Percentage-Point Increase
2014-2015: 100%, N/A
2015-2016: 100%, 0
2016-2017: 100%, 0
2017-2018: 100%, 0

Target Met: [ X ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): The program continues to maintain a 100% NBCOT OTA board certification pass rate.

Previous action to improve program goal: Not assessed in 2016-17. Most recent results: The NOVA OTA program continues to have a consistent 100% pass rate on the NBCOT OTA certification board exam. Results improved: [ ] Yes [ ] No [ X ] Partially: maintaining an optimal standard. Current actions to improve program goal: The program will continue to host a Therapy Ed Board Prep course at the college, require the students to take a prep exam known as the OTKE during the final semester and fund it for each student, orient the students to exam preparation materials in the OCT 295 course, and support students with classroom study resources to help them best prepare for successfully taking the Board Exam. In May 2019, NOVA's OTA program will host a Therapy Ed Board Prep course. In Spring 2019, the students will take the OTKE and will participate in a discussion about study resources for Board preparation as part of the OCT 295 course. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Paralegal Studies, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum is designed to provide an individual with a sufficient level of knowledge, understanding, and proficiency to perform the tasks associated with meeting a client's needs. These tasks can be performed by a trained, non-lawyer assistant working under the direction and supervision of a lawyer. A paralegal or legal assistant will have a basic understanding of the general processes of American law, along with the knowledge and proficiency required to perform specific tasks under the supervision of a lawyer in the fields of civil and criminal law. Occupational objectives include employment in corporate law firms, government agencies, and any of the varied law related fields. Legal Specialty courses are only offered at the Alexandria Campus and through NOVA Online. This program is approved by the American Bar Association (ABA).

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Distinguish between the roles of attorney and paralegal.

Criminal Law LGL 218 Direct Measure: Data was collected from exam questions to determine whether students could distinguish between the roles of attorney and paralegal. The following exam question topics were used as the method of assessment: • Ethical rules for legal professionals • Attorney breach of confidentiality Sample: Number of Sections: 1 Number of Students: 21 Alexandria Campus

Semester/year data collected: Fall 2017 Target: 80 percent of students will answer two exam questions correctly showing that they are able to distinguish between the roles of attorney and paralegal. Results: The target was exceeded: 95 percent of students answered Question #25 correctly and 81 percent of students answered Question #26 correctly. Comparison to previous assessment: Fall 2016: This SLO was assessed using a different assignment (Court Visit Report); therefore, an accurate comparison cannot be made. Fall 2015: 80 percent of students answered similar questions correctly. Two of the three exam questions that were assessed focused on ethical rules and conflicts of interest.

The last time this SLO was assessed using the current evaluation method (Fall 2015), faculty who taught this course discussed conflicts of interest and ethical rules during one class period. During this period of assessment, faculty members placed more emphasis on these topics in all courses through the use of Blackboard discussions and in-class group exercises. Plan for improvement: Faculty will continue to emphasize the important distinction between the roles of attorney and paralegal in all courses. They will focus more on what constitutes the unlicensed practice of law for paralegals. A different method of assessment will be used in Fall 2019. Next Assessment: Fall 2019

Research federal and state laws using manual and computer assisted methods such as Lexis or Westlaw.

Legal Research LGL 125 Direct Measure: Data was collected from a Law Library Exercise to determine whether students understood how to research federal and state laws using manual and computer methods. The following grading rubric was used:

Criteria (Factors Considered): Points Assigned
On Time (Turn in on or before due date): 5
Followed Instructions (Answer each question completely; attach hand-written worksheet to typed answers): 25

Semester/year data collected: Fall 2017 Target: 80 percent of students will successfully locate federal and state laws and resolve legal issues thereby demonstrating their knowledge of manual and computer assisted research methods. Results: The target was exceeded: 89 percent of students demonstrated an understanding of how to research federal and state laws. The students who were successful answered each question fully which met the requirement of rubric component “Followed Instructions” by including case name, judge’s name, dissenting/concurring opinion, and summary of facts; these students also

The last time this SLO was assessed (Fall 2016) students were able to use the law library at George Mason University (GMU). Since that time, GMU’s law library is no longer open to the public. As a result, the program was forced to find other locations where professors are allowed to conduct classes and students are allowed to complete exercises, which is an ABA requirement. In Fall 2017, faculty used the Fairfax Public Law Library where students were able to complete their assignment with the guidance of their instructors. This library has excellent resources; however, the hours of operation are not as convenient for evening students.


Proper Legal Citation (Underline or use italics for parties' names; correct volume and page number; identify court and year case decided): 20
Total: 50
Sample: Number of Sections: 2; Number of Students: 36; Alexandria Campus

used proper legal citation, which met the requirement of rubric component “Proper Legal Citation.” Students who were not as successful lost points for rubric component “Followed Instructions.” These students did not answer each question completely and did not attach their handwritten worksheets. In addition, these students lost points for rubric component “Proper Legal Citation” because they failed to use italics or underline parties' names; 11 percent of students did not turn in the assignment. Comparison to previous assessment: Fall 2016: A different method was used to assess this SLO (Final Research Project); however, students were also required to use proper legal citation and correctly identify the name of the court and the year the case was decided; 89 percent of students were able to research federal and state laws using manual and computer assisted methods.

Plan for Improvement: In Fall 2018, faculty will include more legal citation exercises in their lesson plans. For example, citation competitions will be held: faculty will recite scenarios, and students will work in groups to identify proper legal citations and write them on the board. Also beginning in Fall 2018, faculty will put more emphasis on proper legal citation in all writing exercises. The issue of finding an appropriate law library (an ABA requirement) was discussed at the Fall 2018 Faculty and Advisory Committee meetings. Before Fall 2020, the program will try to locate a law library that meets the ABA criteria and is accessible for day and evening students. Next Assessment: Fall 2018

Draft legal documents including but not limited to pleadings, contracts, wills, and deeds.

Legal Aspects of Business Organizations LGL 235 Direct Measure: Data was collected from a General Partnership Agreement to determine how well students drafted legal documents. The grading rubric is attached and includes the following components:

1) Partners’ names 2) Purpose 3) Liability 4) Financing 5) Management 6) Duration 7) Organization

Sample: Number of Sections: 2 Number of Students (Alexandria): 7 Number of Students (NOVA Online): 9

Semester/year data collected: Spring 2018 Target: 80 percent of students will draft a General Partnership Agreement and earn a grade of 70 percent or higher. Results: The target was met: 81 percent of students earned a grade of 70 percent or higher; 50 percent of students earned an “A” grade, 12 percent earned a “B” grade, and 19 percent earned a “C” grade. The students who were not as successful either turned in the assignment late or did not turn it in. Students who earned an “A” grade included the required information for rubric components 1-6 of the General Partnership Agreement: 1) partners' names, 2) purpose, 3) liability, 4) financing, 5) management, and 6) duration. The content of these students' agreements flowed from point to point with a clear, logical order of these details (meeting the requirement of rubric component 7, organization). Students who earned a “B” grade included 5 of the 7 required components of the agreement; the components were generally consistent. Students who earned a “C” grade included only 4 of the 7

The last time this SLO was assessed (Spring 2015) a different method of assessment was used. In order to improve student learning, the program changed the method of assessment from Articles of Incorporation to General Partnership Agreement. During Spring 2018, the same instructor taught LGL 235 on campus and through NOVA Online. This contributed to the consistency of lesson plans and improved student learning. Plan for Improvement The next time this SLO is assessed the program will use a different method of assessment (contract, will, or pleading). This will enable us to improve student learning as it applies to drafting legal documents. This SLO will be assessed again in Spring 2018.


required components; the components deviated from the topic of the agreement, and the organization was only somewhat clear. Comparison to previous assessment: Spring 2015: A different method was used to assess this SLO (Articles of Incorporation). In that year, 100 percent of students were successful in completing the following components:

• Name of organization • Purpose • Registered agent • Incorporators • Capitalization • Address • Duration • Organization of document (information should flow in an orderly fashion).

Because the methods of assessment were different, the data from the Spring 2018 assessment cannot be compared with any previous academic year.

CLO: Draft legal documents including but not limited to pleadings, contracts, wills, and deeds. [ X ] CT

Legal Writing LGL 126 Direct Measure: Data was collected from a legal memorandum assignment (Susie Speaker) to determine how well students solved issues using legal citation form and writing style. The following Critical Thinking rubric was used:

Criteria (Explanation): Points
On Time (Turned in on or before due date): 3
Followed Format (Explained in class and posted on Blackboard): 5
Discussion Section (Used critical thinking to research and analyze relevant case law and statutes to reach a conclusion): 25
Proper Legal Citation (Use italics or underline parties' names; identify correct reporter, court name, and year case decided): 5
Spelling/Grammar (No typographical errors): 2
Total: 40

Semester/year data collected: Spring 2018 Target: 80 percent of students will successfully use proper legal citation form and writing style when they draft a legal memorandum and earn a grade of 70 percent or higher. Results: The target was met: 93 percent of students earned a grade of 70 percent or higher; 70 percent of students earned an “A” grade. These students followed the prescribed format, used all of the legal sources (case law, First Amendment, federal statute), used proper legal citation, and correct spelling and grammar. Seven percent of students earned a “B” grade. These students did not follow the correct format or use all of the legal sources. Fifteen percent of students earned a “C” grade. These students did not use all of the legal sources and did not use proper legal citation. The 7 percent of students who did not earn 70 percent or higher did not turn in the assignment. Comparison to previous assessment: Spring 2017: A different memorandum of law assignment was used for this assessment (Victoria

The last time this SLO was assessed (Spring 2017), the target was not met, so the program decided to use a different method of assessment to improve student learning. In Spring 2018, the program also decided to use this SLO to evaluate critical thinking, and the faculty used a different method of assessment for this CLO. The Memorandum of Law assignment proved to be a better method for students to use critical thinking to resolve legal issues while using proper legal citation, because this assignment required them to use more resources than the Trial Court Brief. For example, students used case law, the First Amendment, and a federal statute to analyze the discussion section of the Memorandum of Law. Plan for improving Critical Thinking based on results: During Spring 2018 faculty and Advisory Committee meetings, the program will discuss methods to improve students' ability to use legal research to locate,


Sample: Number of Sections: 2 Number of Students: 27 Alexandria Campus

V); however, the grading rubric was similar. At that time, two sections of the course were evaluated (21 students) and the target was not met—only 71 percent of students earned a grade of 70 percent or higher. Spring 2016: A different method of assessment was used during this period (Trial Court Brief); therefore, a comparison to the present results cannot be made.

evaluate, and interpret case law and statutes to reach appropriate conclusions. Next Assessment: Spring 2020

Program Goals Evaluation Methods Assessment Results Use of Results To increase the graduation rate of students in the Paralegal Studies program.

College Planning and Evaluation Process http://www.nvcc.edu/college-planning/_docs/41-17-Number-of-NOVA-Graduates-By-Degree-and-Specialization-2016-2017.pdf

Target: To increase the graduation rate of students in the Paralegal Studies program by equaling or surpassing the previous year's graduate total. Number of graduates: 2014: 39; 2015: 41; 2016: 41; 2017: 36; 2018: 27. Results: The target was not met.
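As a small illustrative sketch (using only the graduate totals listed above; not an official OIESS calculation), the year-over-year target check works out as follows:

    # Hypothetical year-over-year check of Paralegal Studies graduate totals against the
    # "equal or surpass the previous year" target, using the counts reported above.
    graduates = {2014: 39, 2015: 41, 2016: 41, 2017: 36, 2018: 27}

    years = sorted(graduates)
    for prev, curr in zip(years, years[1:]):
        met = graduates[curr] >= graduates[prev]
        print(f"{curr}: {graduates[curr]} graduates vs {graduates[prev]} in {prev} -> target met: {met}")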

Previous Action to improve Program Goal: In Fall 2017 and Spring 2018, the program continued to work with the Fairfax Bar Association (FBA) Paralegal Section. In Summer 2017, the program conducted a Paralegal Symposium where current and prospective students learned about the program, benefits of membership in professional organizations, and interviewing techniques from the employee’s and employer’s perspectives. One of the goals of the symposium was to encourage students to complete the program. Faculty and Advisory Committee members along with legal professionals from the community were involved in the Paralegal Symposium. Target Met: [ ] Yes [ X ] No [ ] Partially Current Actions to improve Program Goal based on results: The program is planning to conduct a Panel Discussion in April 2019. During this event, current and prospective students will have an opportunity to ask questions about the program (benefits of ABA approval, reasons for sequencing plan and prerequisites, etc.), job opportunities, and so forth. Students who have more knowledge about the benefits of completing the Paralegal Studies program should be more inclined to do so, which should result in an increase in the graduation rate. Faculty and Advisory Committee members will be involved in the Panel Discussion. Assessed: Annually

To increase the success rate of students in the

College Planning and Evaluation Process http://www.nvcc.edu/college-

End of Fall 2013: 73.5% pass rate End of Fall 2014: 82.1% pass rate

Previous Action to improve Program Goal: An expanded version of Coordinated


Paralegal Studies program.

planning/_docs/Fall-2016-Success-Rates-by-Discipline_Excludes%20ELI%20Classes.pdf

End of Fall 2015: 88.6% pass rate End of Fall 2016: 85.8% pass rate End of Fall 2017: 85.7% pass rate Results: The target was not met.

Internship (LGL 290) was developed in Fall 2017. This course allows students to develop skills they learned in previous courses. For example, students participate in practical exercises such as client (intake) interviews and mock (job) interviews. They also draft complaints, memoranda, simple contracts, cover letters, and resumes. During the second half of the semester, students participate in job shadowing and internships in the public and private sector. They write reports explaining what they learned from this experience. The Advisory Committee and FBA Paralegal Section members helped identify internship and job shadowing opportunities for the students. The results of an employer survey that was conducted at the end of the semester are as follows: • 89 percent of students who worked onsite reported to work on time and dressed appropriately for the environment

• 100 percent of students comprehended and followed verbal/written instructions

• 100 percent of students requested assistance from co-workers and supervisors when appropriate

• 100 percent of students met the objectives prescribed at the beginning of the placement

Target Met: The target was missed by 0.1 percentage points. Current Actions to improve Program Goal based on results: In Fall 2017, the program offered the same course with a different title (Paralegal Studies Capstone, LGL 298); however, the course was cancelled due to insufficient enrollment. The program will offer this course again in Fall 2019 or Spring 2020. The program will continue to monitor this data and discuss the student success rate during the Spring 2018 faculty and Advisory Committee meetings. Assessed: Annually


Increase number of program placed students.

College Planning and Evaluation Process http://www.nvcc.edu/college-planning/_docs/Distribution-of-Program-Placed-Students-by-Curriculum-and-Award-Type-Fall-2012-%20Fall-2016.pdf

Number of program placed students:
Fall 2017: 152
Fall 2016: 161
Fall 2015: 189
Fall 2014: 199
Fall 2013: 246
Results: The target was not met.

Previous Action to improve Program Goal: In Summer 2017, the program conducted a Paralegal Symposium for current and prospective students. The agenda included an overview of the program, library resources, benefits of membership in professional organizations, and interviewing techniques from the employee’s and employer’s perspectives. The program provided computers for students to sign up for Fall 2017 classes. One of the goals of the symposium was to increase the number of program placed students. Faculty and Advisory Committee members along with legal professionals from the community were involved in the Paralegal Symposium. Target Met: [ ] Yes [ X ] No [ ] Partially Current Actions to improve Program Goal based on results: The program is planning to conduct a Panel Discussion in April 2019 for current and prospective students. During this event, participants will have an opportunity to ask questions about the program, job opportunities, etc. Prospective and current students who have more knowledge about the benefits of the Paralegal Studies program should be more inclined to choose this major, thereby increasing the number of program placed students. Faculty and Advisory Committee members will be involved in the Panel Discussion. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Personal Training Career Studies Certificate NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This program is based on the standards of the American Council on Exercise (ACE) and prepares you to become a knowledgeable fitness professional in health clubs, recreation departments, and fitness facilities in private, commercial, corporate or government settings. The program will prepare you to sit for a nationally recognized certification exam in Personal Training. Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to identify the stages of motivational readiness and recognize behavioral strategies to enhance change.

Lifetime Fitness and Wellness PED 116 Concepts of Personal and Community Health HLT 110 Direct Measure: The Physical Education/Personal Health cluster chose two multiple-choice exam questions focusing on the widely used Transtheoretical Model of Behavior Change. The goal was to include these questions on an exam in all PED 116 and HLT 110 sections during the Fall 2017 term. These courses were chosen because they provide foundation knowledge of behavior change for all students, including those in the Personal Training CSC (See Appendix A for questions). Sample:

Campus/Modality: # of Total Sections Offered, # Sections Assessed
AN: 13, 3
MA: 5, 5
AL: 8, 5
LO: 10, 10
WO: 4, 0
NOVA Online: **, **
DE*: N/A, N/A
Total: 40, 23

*Dual Enrollment. **NOVA Online evaluates its PED/HLT courses separately from our program. We have encouraged cooperation, but at the time of the assessment, we were not working under NOVA's new organizational model.

Semester/year data collected: Fall 2017 Target: 80% of students answering the questions correctly. Results:

Campus: Question 1, Question 2
Annandale: 71/101 = 70%, 87/101 = 86%
Manassas: 112/139 = 81%, 127/139 = 91%
Alexandria: 145/169 = 86%, 149/169 = 88%
Loudoun: 278/341 = 81%, 328/341 = 96%
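As an illustrative sketch only (the campus counts are taken from the Fall 2017 table above; the helper itself is hypothetical and not part of the cluster's workflow), the per-campus percentages and the check against the 80% target can be reproduced as:

    # Hypothetical per-campus check of the two behavior-change exam questions against the 80% target.
    # Counts (correct, assessed) are taken from the Fall 2017 results table above.
    results = {
        "Annandale":  {"Question 1": (71, 101),  "Question 2": (87, 101)},
        "Manassas":   {"Question 1": (112, 139), "Question 2": (127, 139)},
        "Alexandria": {"Question 1": (145, 169), "Question 2": (149, 169)},
        "Loudoun":    {"Question 1": (278, 341), "Question 2": (328, 341)},
    }

    TARGET = 80.0
    for campus, questions in results.items():
        for question, (correct, assessed) in questions.items():
            pct = 100 * correct / assessed
            flag = "" if pct >= TARGET else "  <- below target"
            print(f"{campus}, {question}: {pct:.1f}%{flag}")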

This is the first time this SLO was assessed.

With the exception of Question 1 at the Annandale Campus (70%), the percentages of students answering the exam questions correctly are at or above the target percentage. This demonstrates that PED 116 and HLT 110 are successful at providing students with basic knowledge of behavior change models. It is imperative that these courses continue providing this knowledge, as behavior change is an important concept in the Personal Training curriculum outlined by the American Council on Exercise (ACE) that is used in PED 168 - Introduction to Personal Training. We ascertain that the low number of faculty completing the assessments is due to three factors: (1) many cover behavior change but do not use the term Transtheoretical Model; (2) the questions were sent to the faculty late; and (3) there is no full-time faculty representation at the Woodbridge Campus. As part of the action plan: (1) the questions should be identified and clarified earlier. Effective Fall 2018, SLOs were distributed a month earlier; for Fall 2019, our new goal is to have the SLO distributed by the first week of classes. (2) A dialogue is needed on how the Woodbridge Campus will disseminate and collect data without representation of a full-time PED/HLT faculty member. In Fall 2018, as requested by our previous dean, these concerns were shared with our discipline Dean.


Students will identify modifiable risk factors contributing to Cardiovascular disease.

Lifetime Fitness and Wellness PED 116 Concepts of Personal and Community Health HLT 110 Direct Method: In January 2018, the PED/HLT Cluster chose to evaluate this SLO in two ways: (1) a True/False question to include in PED 116

and HLT 110 that focuses on the risk factors associated with all forms of Cardiovascular Disease. The aim was to include this question in all sections of both courses (see Appendix A for question).

(2) The completion of a personal Cardiovascular Disease Risk Assessment. These assessments are a required part of the curriculum in both courses and can be completed as an in-class lab or as a homework assignment (see Appendix B for a CVD Risk Assessment example).

Sample:

Campus/Modality: # of Total Sections Offered, # Sections Assessed
AN: 10, 10
MA: 8, 8
AL: 8, 6
LO: 9, 8
WO: 4, 0
NOVA Online: **, **
DE*: N/A, N/A
Total: 39, 32

*Dual Enrollment **See SLO 1

Semester/year data collected: Spring 2018 Target: 80% of students answering the question correctly. Results - CVD Risk Factors Question:

Campus: # Students with Correct Answers, Percentage Correct
Annandale: 185 of 207, 89%
Alexandria: 155 of 180, 86%
Loudoun: 158 of 172, 92%
Manassas: 225 of 233, 96%
Total: 723 of 792, Cumulative 91.2%

Results - CVD Risk Factor Assessment:
Campus: # Students Completing Risk Assessment, Percentage Completing
Annandale: 229 of 247, 93%
Alexandria: 162 of 180, 90%
Loudoun: 151 of 169, 89%
Manassas: 233 of 233, 100%
Total: 775 of 829, Cumulative 93.4%
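As an illustrative sketch (a hypothetical aggregation script using only the per-campus counts in the tables above), the cumulative percentage for the risk-factor question is derived as follows:

    # Hypothetical aggregation of the Spring 2018 CVD risk-factor question results above.
    question_results = {          # campus -> (correct answers, students assessed)
        "Annandale":  (185, 207),
        "Alexandria": (155, 180),
        "Loudoun":    (158, 172),
        "Manassas":   (225, 233),
    }

    total_correct = sum(correct for correct, _ in question_results.values())
    total_assessed = sum(assessed for _, assessed in question_results.values())
    cumulative_pct = 100 * total_correct / total_assessed  # 723 of 792, roughly 91%

    print(f"Cumulative: {total_correct} of {total_assessed} = {cumulative_pct:.1f}% correct")
    print("80% target met:", cumulative_pct >= 80)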

This is the first time this SLO was assessed. Cardiovascular disease is the leading cause of death in the United States, and the ability to recognize these risk factors during the college years is a proactive step toward decreasing this major health concern in later years. Target Met: [ X ] Yes [ ] No [ ] Partially Areas needing improvement: 1) lack of comparison data; 2) no full-time faculty representation at the Woodbridge Campus (i.e., no assessment completed there). As part of the action plan: (1) to obtain comparison data, we are reassessing this SLO in Fall 2018; (2) a dialogue is needed on how the Woodbridge Campus will disseminate and collect data without representation of a full-time PED/HLT faculty member. In Fall 2018, as requested by our previous dean, these concerns were shared with our discipline Dean. Next Assessment: Fall 2018

Program Goals Evaluation Methods Assessment Results Use of Results Upon completion of training, graduates will have the skills and knowledge to become entry-level fitness professionals.

Number of students completing PED 220 and PED 168 with a grade of C or better: Data obtained from course completion information from the instructors Successful completion of the internship: PED 190

PED 220: Number of Fitness Students / Total Completions
Campus: Fall 2017, Spring 2018, Total
AN: 4/8, 3/6, 7/14
AL: 3/6, 3/13, 6/19

PED 168: Number of Fitness Students / Total Completions
Campus: Spring 2017, Summer 2017, Total
AN: N/A, 4/9, 4

In 2017-18 there were a total of 13 Personal Training students who completed PED 220. The remaining 20 students were PTA students. To successfully complete PED 220, PT students were required to teach small group exercise, conduct fitness assessments (muscular strength and endurance, flexibility, cardiovascular endurance, and balance), develop exercise plans as well as work one-on-one with older adults with varying health conditions.


AL: 6/10, N/A, 6

PED 190: Three students successfully completed an internship: • In Fall 2017, a student interned at Providence Recreation Center, which is part of the Fairfax County Park Authority.

• In Spring 2018, a student interned at Gold’s Gym in Reston.

• This past summer, a student interned at the Washington House, a continuing care facility in Alexandria.

PED 168 is the Introduction to Personal Training. This course is the foundation for all the concepts and practices of personal training, and it is the class that prepares PT students for the ACE national fitness exam. Please note: the ten students who completed the course were program placed. The remaining 9 students who did not complete the course enrolled for personal enrichment to learn about Personal Training; they did not realize the difficulty and amount of material covered for the ACE Exam. All three students had a successful internship according to their supervisors and an exit interview with the AN Assistant Dean. Each student was able to choose the facility and type of experience they desired. Action for improvement: PED 168: It was noted that students enrolling in the course were unclear on the scope of the class. As such, we should strongly encourage students to follow the recommended program sequence and prerequisites/co-requisites. PED 190: Explore additional internship locations and consider the possibility of all personal training students having an internship opportunity. Next Assessment: Graduate totals and program placed student totals will be reassessed in 2018-2019.

Increase the number of graduates.

Data: Number of students graduating; number of students completing the graduation process in Personal Training Career Studies; graduation totals from OIR data; survey of program placed Personal Training Certificate students; financial aid requirements clarified

Year: Number of Graduates
2017-2018: 7
2016-2017: 8

Although 2016-17 was the major transitional year for the Personal Training Career Certificate, a few students completed the Fitness Career Certificate during this time period as well. Based on OIR data, there are 63 program placed students. In an effort to learn more about these students, a phone survey was conducted by Manassas faculty. In Fall 2017, forty-three (43) students were contacted to establish their program status and to provide support and advice. Of those 43 students, 14 had conversations, 21 received voice messages, and the remaining 8 were sent emails. Four of the


students changed their majors. Looking at the student list, only 3 had full-time PED faculty advisors; the remainder had no assigned advisor. As an action plan item, a collaborative effort should be made to develop strategies to better advise our PT students. This would include key stakeholders such as Deans, Associate Deans, first-year advisors, counselors, enrollment services, financial aid officers, Veterans Affairs, and PED faculty. Spring 2018 Manassas Pilot: To open lines of communication and increase awareness of our program, we conversed with our stakeholders, provided clear and concise marketing materials, and shared Campus TV monitor PowerPoint slides to market the program across the college. In Fall 2017, financial aid officers across all campuses gave out misinformation, stating that students enrolled in the PTCSC were not eligible for financial aid, which impacted enrollment and recruitment. Once the mistake was corrected, only a few students were able to be identified. As an action plan, since the latter part of Fall 2017, a concerted effort has been made to maintain lines of communication with financial aid officers to ensure accurate financial aid eligibility information is provided. Assessed: Annually

Increase marketing/visibility of program.

o Website Development o Updated information in the NOVA catalogue o Brochures Developed (see attachment C) o Information posted on campus TV monitors o Listed as an educational partner with the American Council on Exercise (ACE) o Advisory Board

Website for the Personal Training Career Studies Certificate developed. Catalogue names, descriptions, and credit amounts were updated to reflect the change from the Fitness CSC (10-12 credits) to the new Personal Training Career Studies Certificate (24 credits).

Campuses advertise the Personal Training CSC on the campus TV monitors. NOVA is listed as an ACE provider.

The website for the Personal Training CSC was developed in Summer 2017; as new information becomes available, it is given to the webmaster, and the website is maintained on a continuous basis. The updated revisions to the catalog provide students with more accurate information, which is extremely important as many students do not have advisors. The marketing materials increase the visibility of the Personal Training CSC.


o Increase accessibility and placement into internships within the community. Advisory board members from the Fairfax County Park Authority and Gold's Gym are interested in placing interns in their organizations.

NOVA maintains a relationship with ACE and utilizes all of the resources provided by the national organization. The newly formed Advisory Board includes the Assistant Dean of GMU's Kinesiology Department and a member from the commercial and municipal fitness industry. These individuals will be key assets in increasing our visibility in the community. Action plan for Spring 2019: • Explore the addition of internships to the curriculum. • Pilot (1) PED/HLT Facebook and Twitter pages at the Manassas Campus and (2) a Fitness/Wellness Club.

Curriculum revision

Program Purpose revised Collaborating with Advance/GMU to establish Pathways

HLT and PED continue to be part of the Advance/GMU discussions to establish pathways to GMU in Kinesiology and Sport Management.

The Program Purpose statement was revised in Spring 2019 to better reflect the curriculum changes and the association with our educational partner, the American Council on Exercise. The updated statement was forwarded for approval. PED 168, along with PED 111, was accepted as KINS 200. The HLT 206 syllabus and course content were revised and approved for Fall 2017 to meet the KIN 100 requirement.


Annual Planning and Evaluation Report: 2017-2018

Phlebotomy Career Studies Certificate NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Career Studies Certificate in Phlebotomy is designed to prepare personnel who collect and process blood and other samples for medical laboratory analysis. Phlebotomists work in hospitals, medical clinics, commercial laboratories, and in other settings where blood is collected from patients. The curriculum includes learning experiences in both on-campus laboratories and affiliated clinical laboratories. Graduates are eligible to sit for the national examinations to become certified as a Phlebotomy Technician.

Student Learning

Outcomes Evaluation Methods Assessment Results Use of Results

Relate knowledge of body systems with the most common diagnostic laboratory testing for each system.

Phlebotomy MDL 105 Direct Measure: Quiz #3, question 1. The students' knowledge is assessed in table format, where they must identify the correct order of draw, the tube color and additive, the lab tests performed on the tube, and the department of the lab that performs the testing. SLO attachment 1.1, Quiz #3. Areas of assessment in the question: a. Determine the correct order of draw for blood collection tubes; b. Match the cap color of the blood collection tube with its additive/anticoagulant content; c. Indicate the lab tests associated with the tube/anticoagulant; d. Correlate tests with the department of the Clinical Laboratory. Sample: One section, all students, N=11

Semester/year data collected: Fall 2017 Target: 100% of students will pass this test with a grade equal to 70% or better. Results: The target was met during Fall 2017, as 100% of students (N=11) in this course obtained a grade better than 70%. The average score on this question was 20 out of 22 points, for a grade of 90%. 100% of students provided correct responses in the areas of assessment related to the order of draw for blood tubes in the venipuncture procedure and the corresponding department of the Clinical Laboratory that performs that testing. These correct responses included common tests performed in routine Chemistry, Hematology/Coagulation testing, and Microbiology. The association of the anticoagulant utilized for special tests in Chemistry and Blood Bank showed a lower percentage of correct responses. In Chemistry, this was related to failure to recognize the additive included in gray tubes as well as the special alcohol toxicology tests for which this tube is used; two of the eleven students failed to indicate either the special tests or the additive related to gray tube utilization. One of the eleven students was not able to indicate the additive included in the yellow tube used for special Blood Bank testing. Current results improved: [ X ] Yes [ ] No [ ] Partially Current results improved when compared to Spring 2017 results. The students assessed during Spring 2017 showed lower mastery of additive content and more difficulty correlating the type of test with the department of the Clinical Laboratory that performs the testing, for both special and routine tests.

During Fall 2017, sessions in the MEC Anatomy Lab were incorporated into the MDL 105 Phlebotomy course. The goal of these sessions was to emphasize the correlation between specimens collected for laboratory analysis and the body systems or organ functions related to these tests. The target of a 100% passing grade for students in the course was met, and students' comments indicate that the improved outcome was directly related to their participation in the course activity using the facilities of the MEC Anatomy Lab. Based on these results, which support the benefit of using Anatomy Lab sessions to reinforce correlations between laboratory testing and organ/system function, this activity will be maintained in the MDL 105 Phlebotomy course. Additional activities related to special tests, including quizzes and clinical scenarios for the selection of venipuncture tubes and test routing to Clinical Lab departments, will be implemented in lecture and lab sessions. These additional activities will be used to help improve outcomes related to special test requirements. Next Assessment: Fall 2018


Spring 2017: One section, N=16; all phlebotomy students. Only 75% of the students obtained a passing grade equal to 70% or better. Strengths: Better understanding of the correlations among tests and the type of anticoagulated blood utilized in clinical chemistry testing to evaluate metabolic conditions, in hematology to perform the CBC, and in hemostasis testing of coagulation panels. Weaknesses: Questions related to special test anticoagulants; these were missed by 3 students, while 8 out of 11 answered them correctly. An action plan will be developed to improve the specific outcomes related to the correlation of special test requirements with body systems.

Develop problem-solving skills in the pre-analytical phase of specimen testing.

Clinical Phlebotomy MDL 106 Direct Measure: Performance rating based on Clinical Objectives for Specimen Collection. Motor skills are essential for performing successful phlebotomy procedures, which is the main objective of the pre-analytical phase of specimen testing. Students in the Clinical Phlebotomy course should perform venipunctures using the CLSI standard technique, making no more than two attempts. This process requires selection of the venipuncture site, vein palpation to locate the safest vein, and taking appropriate corrective actions during the process to obtain the right amount of blood to complete a successful blood draw on assigned patients. Because the process of sample collection occurs quickly and each of these steps takes only seconds, the best indicator of appropriate performance and development of mechanical skills during clinical rotations is the number of successful blood draws. The comparison of students' evaluations during the first and final weeks of clinical training at affiliates is used to demonstrate their developing psychomotor skills during sample collection procedures. The number of successful draws shows the

Semester/year data collected: Fall 2017 Target: 100% of students will score 75% or higher (3 or 4) in the phlebotomy clinical training rubric for mechanical ability. Current Results:

• Week 1 = 5.9 successful draws • Week 8 = average 37.9

Previous Results: Spring 2017, MDL 106, N=16:

• Week 1 = 6.5 average of successful draws • Week 8 = average 34.4

Target was met on the evaluation of mechanical ability: 100% of PBT students (N=11) scored 75% or greater (3 to 4 points out of a maximum of 4) on the clinical training rubric. Clinical preceptors expressed their satisfaction with the progress of phlebotomy students in clinical training and with their ability to correct situations that would otherwise lead to unsuccessful blood draws. Strengths: The clinical training time, a period devoted to practice, shows that students have the required cognitive skills and, by the end of their clinical training period, demonstrate a rate of success equivalent to an entry-level professional able to perform the process with manual dexterity. Weaknesses: Missed blood draws constitute a weakness that will be improved through consecutive trials and practice on patients with different physical and behavioral characteristics.

This SLO was formerly evaluated on cognitive knowledge; this time, the psychomotor skills exhibited by phlebotomy students during clinical rotations were assessed. An increased amount of practice in phlebotomy procedures was scheduled in MDL 105 during Spring 2017 to help students transition between the stages of psychomotor skills development, given that some students had difficulty reaching the expected level of skill to complete more successful draws by the end of MDL 105. The course instructor, with the assistance of the lab manager, provided more time for practical training in this novice stage, which requires students to move from cognitive to associative skill development through accumulated practice. Open lab hours were created to provide additional assistance in developing psychomotor skills for students who needed more time to reach the precision stage of the psychomotor domain. The lab manager, who is a certified MLT, provided the additional supervised practice. The psychomotor assessment of adjusting sample collection skills to each patient's situation in order to achieve a successful blood draw was selected to be performed in MDL 106. In the previous year, cognitive knowledge of specimen collection was assessed and met the target goal.


improvement in the mechanical ability to perform the phlebotomy process at week 8, when practice has helped students reach a higher level in the psychomotor skills domain. The rubric for the evaluation of skills and performance in clinical training includes a section for mechanical abilities; this section is used to determine the achievement of this psychomotor learning outcome in the phlebotomy training course. SLO attachment 2.1, MDL 106 Clinical Site Visit Evaluation. Sample: One section, all students in the 8-week course, N=11

Students gain more confidence and expertise as they continue to draw blood from a variety of patients during clinical training. The diversity in test orders provides a variety of challenges that help build and sharpen their mechanical ability to perform successful draws, and at the end of their 8 weeks of training there is a marked improvement in the number of successful draws. Psychomotor skills need to be performed and observed to determine mastery of the skill. Final evaluations with scores of 3 or more on a scale of 0 to 4 in the area of mechanical ability indicate that the target was met: students demonstrated improvement in mechanical ability, developing natural movements to perform successful venipunctures on patients who presented with difficult veins and other problems, which were correctly managed. Based on recent results, areas needing improvement were successfully addressed by increasing the time to practice phlebotomy procedures and providing flexibility in open lab hours, with the goal of reaching the natural stage of the psychomotor skills domain by the end of clinical training in MDL 106. During clinical training, an unsuccessful blood draw can be caused by many situations. Students will be asked to make a daily list of unsuccessful draws and provide an explanation of why they think they were not successful. This reflection will be useful for reviewing the process and making a self-evaluation. Students will also be required to write a weekly plan of the steps or personal actions needed to improve their venipuncture technique, with the goal of achieving a higher number of successful blood draws during each week of training. Next Assessment: Fall 2018

Perform venipuncture and dermal puncture collection, handling and processing.

Direct Measure: External examination – Phlebotomy ASCP BOC Exam Statistics. Performance is based on a comparison of program mean scaled scores with overall national mean scaled scores in the specimen collection sub-content area (including venipuncture and skin puncture), which constitutes 50% of the certification test content. This is a computer adaptive test, and scores are not broken down further in the report. The areas included in this section are: review and clarification of order; patient communication; patient identification; patient assessment and preparation; site selection; techniques; common tests; order of draw; complications and considerations; equipment. SLO attachment 3.1: BOC Program Performance Report 2018. SLO attachment 3.2: PBT (ASCP) Examination content outline. Sample: all PBT graduates sitting as first-time examinees for the PBT (ASCP) certification test, N=14.

Semester/year data collected: Spring 2018. Target: 90% of graduates sitting as first-time examinees in this term cohort will earn a mean scaled score equal to or higher than the national mean in the specimen collection sub-content area. Results: mean scaled scores for the specimen collection area:

• Program: 577
• National: 546

Program examinee scores in specimen collection:

• 800-999: N=2/14
• 700-799: N=4/14
• 600-699: N=1/14
• 500-599: N=1/14
• 400-499: N=4/14
• 300-399: N=1/14
• Less than 300: N=1/14

The required passing score for each area of this test is a minimum of 400, and only one student did not achieve this minimum; this student did not pass the certification test on the first attempt. All others (13/14) achieved passing scores ranging from the low 400s to the mid-900s. The overall program mean scaled score on the PBT (ASCP) was 584, compared to a national mean of 540. The target was met, and the results show that graduates demonstrate appropriate knowledge in the main content area of their profession. Improvement is observed in the overall mean scaled scores of first-time examinees compared to 2017 and 2016:

• 2017 – overall program mean scaled score of 548 compared to a national score of 490
• 2016 – overall program mean scaled score of 543 compared to a national score of 522

Previous actions introduced to improve this SLO include the addition in Spring 2016 of web-based learning modules on specimen collection and handling. We have continued to use these modules, and they have provided positive reinforcement toward better scores on this important SLO. For the Fall 2017 cohort, the faculty and program director decided to implement ASCP practice exams to prepare these students, and this activity contributed to better preparation for the test and to meeting the target for this SLO. Target Met: [ x ] Yes [ ] No [ ] Partially. The target was met for this cycle, and the use of an external measure for this assessment demonstrates that graduates are proficient in the phlebotomy body of knowledge related to specimen collection, handling, and processing. Action plan for improvement: based on recent results, we will maintain the use of web-based training modules for additional reinforcement of the knowledge required to pass the specimen collection sub-content area; the web-based phlebotomy modules include activities that reinforce the course objectives. We will continue to use certification exam simulation tests to help students meet the target passing score for this sub-content area of the phlebotomy certification test. Next Assessment: Spring 2019

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Critical Thinking [ X ] CT

Direct Measure: External examination – Phlebotomy ASCP BOC Exam Statistics. This is a computer adaptive test, and results provided in the report are not broken down by topic. CLO attachment 1.1: BOC Program Performance Report 2018. CLO attachment 1.2: PBT (ASCP) Examination content outline. The Laboratory Operations section of the Phlebotomy Technician PBT (ASCP) certification test includes the topics of quality control, quality improvement, and regulatory applications to maintain safety and infection control. Sample: all PBT graduates sitting as first-time examinees for the PBT (ASCP) certification test, N=14.

Semester/year data collected: Spring 2018. Target: 90% of graduates sitting as first-time examinees in this term cohort will earn a passing score of 400 or better in the Laboratory Operations sub-content area. Results: the program mean scaled score for Laboratory Operations was 566, higher than the overall mean scaled score of 561. Program examinee scores in Laboratory Operations:

• 800-999: N=1/14
• 700-799: N=2/14
• 600-699: N=2/14
• 500-599: N=4/14
• 400-499: N=4/14
• Less than 400: N=1/14

The required passing score for each area of this test is a minimum of 400, and only one student did not achieve this minimum; this student did not pass the certification test on the first attempt. All others (13/14) achieved passing scores ranging from the mid-400s to the mid-800s. The target was met, as 93% of graduates sitting as first-time examinees scored above 400 in this section.

Critical thinking had not previously been measured in a cohort sitting for the certification test. This cohort includes graduates from Fall 2017 and Spring 2018. The Laboratory Operations section requires critical thinking skills to evaluate laboratory data, determine the accuracy of generated data, and detect systematic error in laboratory instruments; other questions present situations requiring the examinee to determine the course of action needed to reduce the risk of infection and accidental needle sticks and to maintain safety in the laboratory environment. The target was met, showing that graduates have the ability to weigh evidence and select an appropriate course of action in laboratory operations and in procedures and processes that provide safety, reliability, and accuracy. Actions to maintain the target and provide opportunities to improve student performance on the Laboratory Operations area of the certification test include keeping the hands-on activities related to QC, QA, and safety regulatory compliance; these activities encourage the development of critical thinking skills. Case studies and laboratory scenarios play an important role in assessing well-developed critical thinking skills, and group discussions clarify the process of analysis for students who have not yet achieved the expected outcome. Next Assessment: Assessment of this CLO should be monitored annually.

Program Goals Evaluation Methods Assessment Results Use of Results

Student retention at a rate equivalent to the previous year.

Retention rate: Student Information System (SIS) class rosters; progression from the didactic course (MDL 105 – Phlebotomy) to the clinical course (MDL 106 – Clinical Phlebotomy). SIS enrollment reports: MDL 105 and MDL 106.

Target: Maintain a retention rate above 80%. Year 2018: 25 students enrolled; 2% were not able to continue, for 98% retention. Results for Past 5 Years:

• 2018: 98%
• 2017: 96%
• 2016: 96%
• 2015: 84%
• 2014: 90%
• 2013: 90%

Comparison to previous assessment: The retention rate target was met.

Actions implemented since Fall 2016 show that efforts to provide early support to students identified as being at academic risk have been successful. Academic counseling is provided for students at risk, and referral to a retention counselor and development of a tutoring plan are among the supportive measures implemented to increase retention. Additional hours to perform, review, or refine psychomotor skills have been included in the curriculum as open lab activities. All of these efforts have increased retention over the last three years. Non-academic factors account for the remaining, unavoidable decisions to withdraw from the program. Assessed: Annually

Increase program graduation totals.

OIR reports: Number of Graduates by Program and Specialization: 2017-2018

Target: Full program capacity of 18 students per semester for a total of 36 combined. Results for Past 5 Years:

• 2017-18: 16 graduates
• 2016-17: 17 graduates
• 2015-16: 25 graduates
• 2014-15: 25 graduates
• 2013-14: 29 graduates

Target was not met for the 2017-18 year.

Previous actions to improve this program goal were directed at approving a 5-credit increase to the Phlebotomy CSC so that students can qualify for financial aid, since financial need is the number one non-academic reason students drop out of the program. This change was approved in Fall 2017. An increase in graduates was anticipated, but the goal was not met. Analysis of this situation, given an enrollment of 11 students in Fall 2017 and 12 in Spring 2018, shows that some students begin the phlebotomy program in order to continue into the Medical Laboratory Assistant program or into Medical Laboratory Technology in the near future. Although the program serves some students as a standalone credential by fulfilling the educational requirements for PBT (ASCP) certification, for other students it provides the advantage of completing courses needed for the AAS in MLT.

The most recent results show that the desired target was not met, but within the Spring 2018 group some students began their first semester of the MLA program in the summer. In the past year, only about 65% of students enrolled in the Fall and Spring phlebotomy courses received the NOVA PBT CSC. Actions to improve this program goal are included in a plan beginning in Fall 2018, including the development of outreach activities with surrounding healthcare institutions and clinical training sites to increase recruitment into our programs. The new online application system for MEC programs, scheduled to launch in January 2019, is expected to expedite information exchange and provide more support to applicants. Assessed: Annually

Provide students with the knowledge and skills necessary to pass certification examinations.

Number of students passing the national exam.

There is more than one certifying agency available to our phlebotomy graduates; we recommend the most widely known, the ASCP-BOC. The ASCP is the only certification agency to which we subscribe for access to exam performance reports. Goal 3 attachment: PBT (ASCP) Program Performance Report 2018

Target: 80% pass rate on the 2017-2018 PBT (ASCP) First-Time Examinee Cycle Report, which includes graduates from Fall 2017 and Spring 2018. First-time examinees: N=14. Results: number of first-time examinees obtaining a passing score: 13/14; pass rate = 92.86%. The target was met, and improvement is observed in the overall mean scaled scores of our first-time examinees compared to 2017 and 2016 scores:

• 2017- overall program mean scaled score was 548 compared to a 490 national score

• 2016- overall program mean scaled score was 543 compared to 522 national

Results for Past 5 Years in PBT (ASCP) Performance Summary:

• 2017 = 86%
• 2016 = 91%
• 2015 = 92%
• 2014 = 87%
• 2013 = 83%

Results were satisfactory, demonstrating improvement in the passing rate for this year's cohort of students sitting for the certification exam. Previous actions taken to improve this program goal in Fall 2017 include an orientation, provided by the Clinical Education Coordinator and Program Director at the end of MDL 106, about the external exam's characteristics and application process. The target was met, and an improvement of 12% in the pass rate was achieved for this cycle, which included students from Fall 2017 and Spring 2018. These numbers validate the program goal of providing the knowledge and skills necessary to pass the PBT certification examination. Current actions to stay on target and improve on the previous year's performance will be to maintain review sessions, using web-based simulation exams, with the Fall 2018 and Spring 2019 groups. Each Fall and Spring semester, at the end of MDL 106, the students in the course will receive an orientation about applying for certification and the professional benefits of certification.

Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Photography and Media, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Photography and Media, A.A.S. Program Purpose Statement: The program is designed to prepare students for diverse career options within the field of photography, digital imaging, and media. Course work stresses both technical and aesthetic elements, enabling students to solve a wide range of visual problems with imagination and originality.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Control the image capture process

Photography II PHT 102
Direct Measure: Multiple choice exam questions on the following topics:
• Shutter speeds
• Shallow Depth of Field (DOF)
• Maximum DOF
Sample:
• AL: 1 section offered, 1 section assessed; students assessed: AAA 1, AAS 1, AAA+AAS 1, non-major 6, total 9
• WO: 1 section offered, 1 section assessed; students assessed: AAA 6, AAS 5, AAA+AAS 1, non-major 3, total 15
• ELI: N/A
• DE (Dual Enrollment): N/A
• Total: 2 sections offered, 2 sections assessed, 24 students assessed
All PHT102 students were assessed. This assessment involved two faculty members in 2 sections. The total sample was 24 students (AAS = 6 students). The WO result is not sorted by AAA and AAS. The assessment was previously done in PHT101, where the result was also not sorted by AAA and AAS.

Semester/year data collected: Fall 2017. Target: 80% of students will answer correctly on each criterion, as well as on the overall score. Results by campus and degree (Spring 2016 vs. Fall 2017, with change from previous assessment):
• Shutter speeds: Spring 2016 AL (n=31) 96%, WO (n=20) 76%, All 83%; Fall 2017 AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% (+4%), WO 86% (+10%), All 93% (+10%)
• Shallow DOF: Spring 2016 AL (n=19) 59%, WO (n=11) 42%, All 53%; Fall 2017 AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% (+41%), WO 73% (+31%), All 86.5% (+33.5%)
• Maximum DOF: Spring 2016 AL (n=31) 96%, WO (n=16) 61%, All 83%; Fall 2017 AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% (+4%), WO 80% (+19%), All 90% (+7%)
• Overall: Spring 2016 73%; Fall 2017 89% (+16%)
The results for all three SLO questions improved, and the average across the three questions is above target. The achievement level on the Shallow DOF question increased significantly on both campuses.

Previous action(s) to improve SLO: The PHT discipline group decided to simplify answer choices and eliminate the answer selection of “None of the above.” The PHT discipline group also decided to simplify the wording of the question to the following: “To obtain shallow Depth of Field in an image.” This modified question was implemented in the assessment in Fall 2016. This revised wording helped students with reading/language challenges. This assessment was previously done in PHT101 – Photography I. The PHT discipline group decided to assess the SLO in PHT102 where the level of outcome is “Practiced” and “Mastered”. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Shallow DOF question still has the lowest achievement level even though simplifying the wording helped increase the achievement level. Students commented that wording "wide aperture" in one of the answer choices is confusing. Current action(s) to improve SLO, based on results: The PHT discipline group will discuss answer choices for the Shallow DOF question and consider different wording such as "wide open" or "large" instead of “wide aperture” in the Spring 2019 Cross Campus meeting. The change will be implemented in Spring 2019. For the next assessment in Fall 2020, the SLO lead faculty member will send multiple reminders to collect assessment data separated by Photography and Media AAS degree and others. Next assessment of this SLO: Fall 2020.


Manage image assets and workflow

Photography II PHT102
Direct Measure: Multiple choice exam questions on the following topics:
• Backing up files
• Lightroom
• Metadata
• Missing image files
Sample:
• AL: 1 section offered, 1 section assessed; students assessed: AAA 1, AAS 1, AAA+AAS 1, non-major 6, total 9
• WO: 1 section offered, 1 section assessed; students assessed: AAA 6, AAS 5, AAA+AAS 1, non-major 3, total 15
• ELI: N/A
• DE (Dual Enrollment): N/A
• Total: 2 sections offered, 2 sections assessed, 24 students assessed
All PHT102 students were assessed. This involved 2 faculty members in 2 sections. The total sample was 24 students (AAS = 6 students). The WO result is not sorted by AAA and AAS. The previous result was not sorted by AAA and AAS.

Semester/year data collected: Fall 2017. Target: 80% of students will answer correctly on each criterion, as well as on the overall score. Results by campus and degree (Spring 2016 vs. Fall 2017, with change from previous assessment):
• Backing up files: Spring 2016 AL (n=31) 75%, WO (n=10) 76%, AL+WO 76%; Fall 2017 AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% (+25%), WO 93% (+17%), AL+WO 96.5% (+20.5%)
• Lightroom: Spring 2016 AL (n=40) 97%, WO (n=13) 100%, AL+WO 98%; Fall 2017 AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% (+3%), WO 100% (+/-0%), AL+WO 100% (+2%)
• Metadata: Spring 2016 AL (n=37) 90%, WO (n=13) 100%, AL+WO 92%; Fall 2017 AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% (+10%), WO 93% (-7%), AL+WO 96.5% (+4.5%)
• Missing image files: Spring 2016 AL (n=20) 90%, WO (n=13) 100%, AL+WO 94%; Fall 2017 AL/AAS 100%, AL/AAA 100%, AL/AAA+AAS 100% (+10%), WO 93% (-7%), AL+WO 96.5% (+2.5%)
• Overall: Spring 2016 90%; Fall 2017 97% (+7%)
The overall average increased by 7% compared to the previous assessment. The success rate for the "Backing up files" question increased by 20.5%. The other three questions' success rates are nearly identical to the previous assessment (within a 4.5% difference overall).

Previous action(s) to improve SLO: The PHT discipline group revised the "backing up files" question in Spring 2017 to fit the current trend by incorporating the "3-2-1 rules." The concept of the "3-2-1 rules" was discussed in class, in addition to providing online materials on Blackboard. The achievement level for "backing up files" increased by 20.5%. Target Met: [ X ] Yes [ ] No [ ] Partially. Based on recent results, areas needing improvement: while the overall achievement level for all four questions improved and is above target, the questions on "Metadata" and "Missing image files" decreased by 7% on one campus. Current action(s) to improve SLO, based on results: For the next assessment in Fall 2019, the SLO lead faculty member will send multiple reminders to collect assessment data separated by the Photography and Media AAS degree and others. Faculty decided to review the Library module in Lightroom, and how Lightroom handles files within a catalog system, in class and online starting in Fall 2018. Next Assessment: Fall 2019

Solve technical and aesthetic problems independently and creatively

Photography II PHT 102
Direct Measure: Multiple choice exam questions on the following topics:
• Orange color cast (WB)
• Minimize highlight clipping
• Vibrance
Sample:
• AL: 2 sections offered, 2 sections assessed; students assessed: AAA 6, AAS 7, AAA+AAS N/A, non-major 12, total 25
• WO: 1 section offered, 1 section assessed; students assessed: AAA 2, AAS 3, AAA+AAS N/A, non-major 7, total 12
• ELI: N/A
• DE (Dual Enrollment): N/A
• Total: 3 sections offered, 3 sections assessed, 37 students assessed
All PHT102 students were assessed. This involved 3 faculty members in 3 sections. The total sample was 37 students (AAS = 10 students). The previous result included AAS students only.

Semester/year data collected: Spring 2018. Target: 80% of students will answer correctly on each criterion, as well as on the overall score. Results by campus and degree (Fall 2016 vs. Spring 2018, with change from previous assessment):
• Orange color cast (WB): Fall 2016 AL AAS (n=5) 100%, WO N/A, Total AAS (n=5) 100%; Spring 2018 AL/AAS 57%, AL/AAA 33% (-43%), WO/AAS 66%, WO/AAA 100% (N/A), Total/AAS 61.5%, Total/AAA 66.5% (-38.5%)
• Minimize highlight clipping: Fall 2016 AL AAS (n=4) 80%, WO N/A, Total AAS (n=4) 80%; Spring 2018 AL/AAS 100%, AL/AAA 66% (+20%), WO/AAS 66%, WO/AAA 50% (N/A), Total/AAS 83%, Total/AAA 58% (+3%)
• Vibrance: Fall 2016 AL AAS (n=4) 80%, WO N/A, Total AAS (n=4) 80%; Spring 2018 AL/AAS 71%, AL/AAA 66% (-9%), WO/AAS 100%, WO/AAA 50% (N/A), Total/AAS 85.5%, Total/AAA 58% (+5.5%)
• Overall: Fall 2016 86.6%; Spring 2018 AAS 76.6% (-10%)
The success rate on the "Minimize highlight clipping" question improved by 20%. The success rates on two questions (Minimize highlight clipping, Vibrance) are above target. The overall average is under the target by 3.4%. The success rate on the "Orange color cast" question decreased by 38.5%.

Previous action(s) to improve SLO: The faculty continued providing review materials online and multiple practice tests to prepare students for the exam, which appeared to be effective after the last assessment in Fall 2016. Target Met: [ ] Yes [ X ] No [ ] Partially. Based on recent results, areas needing improvement: the achievement level for "Orange color cast" was significantly lower than in the previous assessment, and the achievement level for "Vibrance" at the AL campus was down by 9%. Current action(s) to improve SLO, based on results: For the "Orange color cast" question, many students chose an answer which suggests that they did not consider shooting with a correct white balance. Starting in Fall 2018, faculty will remind students in class of the importance of a correct capture process rather than depending on post-processing in Lightroom. For the "Vibrance" question, faculty will use a portrait in class to demonstrate how skin tones are adjusted when using the Vibrance slider, starting in Fall 2018. Next Assessment: Spring 2020

CLO: Critical Thinking

Advanced Photography I + II PHT 201 + 202
Direct Measure: Writing an artist statement (rubric attached at the end of the report). The statement should be a reflection on the student's work throughout the semester; writing it should clarify the conceptual intent of the student's work and help them identify and become aware of their creative process. Critical thinking will be demonstrated by an artist's statement that:
1. Identifies and explains the relevance (overview of the project; main idea)
2. Recognizes context, i.e., cultural/social, scientific, technological, political, ethical, personal experience (subject matter; styles; how the work relates to the world around us; influences)
3. Communicates personal points of view (perspective) (the experience of creating the work; messages to convey through the work)
4. Analyzes and justifies decisions, i.e., visual styles, technical, and aesthetic (specific ways of seeing, styles; whether the project changes over time and, if so, why; what medium it involves and whether using a specific medium has significance for the project)
5. Uses college-level writing
Sample:
• AL: 1 section offered, 1 section assessed; students assessed: AAA 3, AAS 18, AAA+AAS 3, non-major 5, total 23
• ELI: N/A
• DE (Dual Enrollment): N/A
• Total: 1 section offered, 1 section assessed, 23 students assessed
All PHT201+202 students were assessed. This involved 1 faculty member in 1 section. The total sample was 23 students (AAS = 18 students).

Semester/year data collected: Spring 2018. Target: 70 percent of students will score 3 points or better on each criterion and 15 points or better on the overall score. Results by degree (Spring 2018, average score and % of students scoring 3 or more points, 15 or more for the total):
• 1. Identifies and explains the relevance: AAS 3.6 points (94%), AAA 4 points (100%)
• 2. Recognizes context (i.e., cultural/social, scientific, technological, political, ethical, personal experience): AAS 2.9 points (66%), AAA 3.3 points (66%)
• 3. Communicates personal points of view (perspective): AAS 3.3 points (94%), AAA 3.6 points (100%)
• 4. Analyzes and justifies decisions (i.e., visual styles, technical, and aesthetic): AAS 3.4 points (77%), AAA 4 points (100%)
• 5. Uses college-level writing: AAS 2.8 points (61%), AAA 3.6 points (100%)
• Total: AAS 16.2 points (72%), AAA 18.6 points (100%)
Above target: criteria 1 (Identifies and explains the relevance) and 3 (Communicates personal points of view) are above target. Below target: criteria 2 (Recognizes context), 4 (Analyzes and justifies decisions), and 5 (Uses college-level writing) are below target.

This was the first assessment of this CLO. The areas that need improvement are criterion 2, Recognizes context (i.e., cultural/social, scientific, technological, political, ethical, personal experience), and criterion 5, Uses college-level writing, which are below target. Current action(s) to improve SLO, based on results: For #2, faculty will emphasize these aspects of writing an artist's statement in class and provide students with websites and examples of well-written artists' statements in Blackboard, starting in Fall 2018. For #5, faculty will encourage students to get help from the Writing Center, starting in Fall 2018. Next Assessment: Spring 2021

Program Goals Evaluation Methods Assessment Results Use of Results

To provide quality education and preparation for diverse career options in the field of photography

Advanced Photography I + II (AL) PHT 201 + 202
Measure: Photography and Media Student Survey 2018, developed within the Photography program. The survey was given on SurveyMonkey, and the rating scale ranged from 1 to 5, with 5 as the highest rating, for the following question: How do you rate the overall:
• Faculty concern for students
• Instructors' expertise in area of study
• Quality of the PHT facilities
Sample: 12 out of 18 AAS students participated in the survey (66%); 9 identified their home campus as AL (75%) and 3 identified their home campus as WO (25%).

Semester/year data collected: Spring 2018. Target: 80 percent of students will rate 4 points or above on each criterion. Results (average rating and % of students rating 4 or above):
• Faculty concern for students: Spring 2017 average 5.00 (100%); Spring 2018 average 4.92 (100%); change +/-0%
• Instructors' expertise in area of study: Spring 2017 average 5.00 (100%); Spring 2018 average 5.00 (100%); change +/-0%
• Quality of Photo Facilities: Spring 2017 average 4.60 (100%); Spring 2018 average 4.73 (91%); change -9%
Comments by students:
• Excellent facilities. The computers, printers, light studio, dark room, etc. All first rate.
• The overall facility is amazing. I think the most basic improvement that could be made is to modify the lighting in both the mini lab and the classrooms. The ceiling lights interfere with clear views of the screens in the classrooms, and reflect on the screens where we edit photos in the mini lab. On a positive note, the availability of computers and printers is wonderful! Of less importance, but along the lines of students' convenience, larger lockers would be helpful to store supplies such as large format papers and such (things that are awkward to transport, especially for those who take public transportation to school).
• Needs more computers.
Overall, students are satisfied with the program.

Previous actions to improve Program Goal: The quality of the photo facilities scored the lowest rating of the three questions. Moving into a new facility on the Alexandria campus in Fall 2017 solved some facility-related issues. Target Met: [ X ] Yes [ ] No [ ] Partially. Most recent results: Although the target was met, "Quality of Photo Facilities" has the lowest rating. At the Alexandria campus, one student said the ceiling lights interfere with viewing the computer monitors in AFA room 305 (mini lab) and AFA rooms 304/308. Current actions to improve Program Goal: Alexandria faculty are aware of the ceiling light problems, and the Alexandria Photo Lab Manager and faculty are working to solve the issue. Currently, there is no estimated time when the problem will be solved; however, an inquiry on this issue was sent to Facilities in Fall 2018. In Woodbridge, faculty work toward consistently updating the facilities to provide state-of-the-art education. Next Assessment: Spring 2019

Advanced Photography I + II (AL) PHT 201 + 202
Measure: Photography and Media Student Survey 2018, developed within the Photography program. The survey was given on SurveyMonkey, and the rating scale ranged from 1 to 5, with 5 as the highest rating, for the following question: After completing the Photography program, how would you rate your own mastery of:
• Camera operation and exposure control
• Photography software - Photoshop
• Photography software - Lightroom
• Video software - Final Cut
• The history and concepts of photography
Sample: 12 out of 18 AAS students participated in the survey (66%); 9 identified their home campus as AL (75%) and 3 identified their home campus as WO (25%).

Semester/year data collected: Spring 2018. Target: 80 percent of students will rate 4 points or above on each criterion. Results (average rating and % of students rating 4 or above):
• Camera operation and exposure control: Spring 2017 average 5 (100%); Spring 2018 average 4.42 (100%); change +/-0%
• Photoshop: Spring 2017 average 4 (60%); Spring 2018 average 3.75 (50%); change -10%
• Lightroom: Spring 2017 average 5 (100%); Spring 2018 average 4.17 (91%); change -9%
• Final Cut: Spring 2017 average 4.5 (40%); Spring 2018 average 3.43 (50%*); change +10%
• The history and concepts of photography: Spring 2017 average 4.8 (100%); Spring 2018 average 4.1 (66%**); change -34%
*41.67% (5 students) have not yet taken PHT130. **16.67% (2 students) have not yet taken PHT110.
Comments by students:
• The NOVA photography program improved my technical & composure skills tremendously. Plus, it introduced me to new areas of interest.
• I have learned a lot, but feel mastery will come with more and more experience.
• I am a non-major. I have taken a limited selection of classes for my own growth. I will probably never rate myself a 5, but the knowledge I have gained from the classes I have taken is invaluable!

Previous actions to improve Program Goal: The previous assessment indicated that students struggled with Photoshop. Faculty created online demo videos starting in Fall 2016 as additional learning material so students could see, hear, and practice the software outside class time; however, this action was not enough to increase students' confidence level with Photoshop. Target Met: [ ] Yes [ ] No [ X ] Partially. Most recent results: Students' mastery of Photoshop and Final Cut needs improvement. Five of the 12 students participating in the survey (41.6%) answered that they had not taken PHT130; the survey results may therefore indicate that students have not learned the software because they have not yet taken the class that covers it (PHT 130). Similarly, 16.6% (2 students) answered that they had not taken PHT110. Student comments include "but feel mastery will come with more and more experience" and "I will probably never rate myself a 5, but…". Revising the question to "After completing the Photography program, how comfortable are you with the following areas of photography?" may help modest students rate their mastery level higher. Current actions to improve Program Goal: During Fall 2018 advising week, faculty will encourage students to enroll in PHT130 and PHT110 toward the beginning of their degree program. Faculty will revise the question for the Spring 2019 survey. Next Assessment: Spring 2019

Advanced Photography I + II (AL) PHT 201 + 202
Measure: Photography and Media Student Survey 2018, developed within the Photography program. The survey was given on SurveyMonkey, and the rating scale ranged from 1 to 5, with 5 as the highest rating, for the following question: Do you feel technically prepared to work in the field?
Sample: 12 out of 18 AAS students participated in the survey (66%); 9 identified their home campus as AL (75%) and 3 identified their home campus as WO (25%).

Semester/year data collected: Spring 2018. Target: 70 percent of students will rate 4 points or above. Results:
• Yes, prepared: Spring 2017 average rating 4.8 (100% of students rating 4 or above); Spring 2018 average rating 4 (75% of students rating 4 or above); change -25%

Previous actions to improve Program Goal: Beginning in Fall 2017, faculty advised students to take Studio Lighting (PHT221) for preparation in commercial and portrait photography. PHT221 continues to be a popular class. Target Met: [ X ] Yes [ ] No [ ] Partially Most recent results: Even though the target goal was met, the students’ level of preparedness decreased by 25%. There were no student comments in this section. Faculty decided that comments for this question would be helpful to determine why students do not feel fully prepared. Current actions to improve Program Goal: Faculty decided to add a comment section for this question in the Spring 2019 survey and will monitor the results. Next Assessment: Spring 2019

Professional development: One professor was invited to be one of 20 curators for the Alchemical Vessels exhibition and benefit for the nonprofit Smith Center for Healing & the Arts, which serves both individuals with cancer and the Washington, DC, community by using the arts as tools for healing; she was also invited to exhibit work in the exhibition. One professor was invited to participate in the exhibition "(IM)migration" at Baltimore Community College-Essex, Baltimore, MD, in 2017. One professor's artists' books were acquired by the Beinecke Rare Book & Manuscript Library at Yale University, New Haven, CT. One faculty member attended the National Society for Photographic Education Conference in Philadelphia.

Faculty participation in art exhibitions and other professional activities in the art community enhances the visibility and prestige of the photography program. Faculty decided that the achievement level should be full support to attend professional conferences. Full funding is not available. The faculty member received partial funding and subsidized attendance at the conference. The achievement level to attend conferences was not met. Faculty will continue to request full funding for professional conferences.


Enhance the curriculum

Consult with Photography Curriculum Advisory Board about current trends in the profession

Advisory board meetings were held on November 6, 2017 and February 16, 2018. November 6, 2017 meeting agenda

• Sarah Raymond’s retirement • Video instruction • Adjunct instructors • Visual Art AFA degree • FOTOweek

Feb 16, 2018 meeting agenda

• Reorganization • Curriculum update • Program Review

The minutes from AAS advisory board meetings are attached.

The faculty decided that the achievement level would be to have one meeting each semester. Faculty will continue to consult with the Advisory Board about aligning the curriculum with current trends in the field. At the Spring 2018 meeting, two board members recommended making PHT 221, Studio Lighting, a required course; no motion was made regarding this recommendation. Target Met: [ X ] Yes [ ] No [ ] Partially. In Summer 2018, new criteria and guidelines for Curriculum Advisory Board membership were announced. Faculty have concerns about these new criteria, which appear to exclude self-employed professionals and academics from area colleges. Faculty will consult with the Associate Vice President for Academic Affairs regarding the implementation of those changes. Next Assessment: Fall 2019

Curriculum and instruction evaluation/review by PHT discipline group members

The Visual Art AFA degree was approved in Spring 2018. The Fine Arts AAA Photography Specialization was discontinued in Fall 2018. The PHT discipline program review is in progress. When completed, the review should provide insights into improving the curriculum. In Fall 2017, the Woodbridge campus started to offer the AAS degree. The PHT discipline is no longer allowed to offer PHT195 Special Topic courses.

In Summer 2018, faculty updated the course content summary for PHT 235. Faculty decided that the following course content summaries need revisions: PHT106, 201-202, 206, 211, 221-222, 227, 231, 246, 247, 265, 270-271, and 274; faculty will start revising them in Fall 2018. PHT195 courses such as Business of Photography, Flash/Speedlight Techniques, Street Photography, and Low/Night Photography were offered as special topic elective courses. The VCCS master course list does not include similar courses for us to offer, and faculty are disappointed that these courses can no longer be offered. Faculty decided that the achievement level should be one new initiative annually; the achievement level was met. Next Assessment: Spring 2019

To enroll and retain students

Number of program placed students: Distribution of Program Placed Students by Curriculum and Award Type fall 2013-fall 2017 from OIR

Target: an increase (+5%) Results from Past 5 Years:

• Fall 2017 = 92 (College = 40121, AAS = 6528)
• Fall 2016 = 97 (College = 41342, AAS = 6398)
• Fall 2015 = 113 (College = 43323, AAS = 6893)
• Fall 2014 = 97 (College = 43871, AAS = 7319)
• Fall 2013 = 122 (College = 44592, AAS = 7995)
The number of program-placed students in the Photography and Media AAS decreased by 5% from the previous year.

Previous actions to improve Program Goal: Full-time faculty would like to be able to program-place students during face-to-face office visits. The process of signing up for a faculty advisor should be simplified.

Target Met: [ ] Yes [ X ] No [ ] Partially Most recent results: Program placed students decreased by 5% from the previous year. Current actions to improve Program Goal: Full-time faculty members continue to meet with 100% of students without a faculty advisor in PHT 101 and PHT 102 to make sure the students have a faculty advisor and also are provided with clear guidance on the program and degrees. In addition, faculty decided to post flyers at local libraries and community areas to promote the program and courses starting in Fall 2018. Assessed: Annually

Graduation data: Number of NOVA Graduates by Degree and Specialization 2017-18 from OIR

Target: an increase (+10%) Results from Past 5 Years:

• 2017-18 = 10 (College = 7004, AAS = 1050)
• 2016-17 = 9 (College = 7443, AAS = 1041)
• 2015-16 = 12 (College = 7752, AAS = 958)
• 2014-15 = 13 (College = 7528, AAS = 995)
• 2013-14 = 11 (College = 7374, AAS = 1012)

The number of students graduating with a Photography and Media AAS degree increased by 1 student (+11%) from the previous year.
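As a worked check of the reported change (added here for clarity; the graduate counts are taken from the list above, and the percentage is assumed to be computed against the prior year's total):

\[
\frac{10 - 9}{9} \approx 0.111 \approx +11\%
\]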

Previous actions to improve Program Goal: Full-time faculty members continued to meet with 100% of the program placed students in PHT 101 and PHT 102 providing clear guidance on the program, pathway, and degrees. Also, full-time faculty members continue to remind students to apply for graduation before the deadline each semester. The graduation data indicates that clear guidance and a reminder about the deadline for graduation are effective. Target Met: [ X ] Yes [ ] No [ ] Partially Most recent results: The graduation rate increased by 11%. Multiple reminders about graduation deadlines and meeting 100% of program placed students appears to have helped increase the graduation rate. Current actions to improve Program Goal: Starting Fall 2018, faculty will continue to advise in person and send graduation deadline reminders, and faculty will start using Navigate to communicate with other faculty members about students’ progress. Assessed: Annually

To graduate students

Success rates: Fall 2017 Success Rates by Discipline (excludes ELI classes), from OIR

Target: 75% Results from Past 5 Years:

• Fall 2017: 82.8% (+9.9% from previous year)
• Fall 2016: 72.9% (-0.6% from previous year)
• Fall 2015: 73.5% (+2% from previous year)
• Fall 2014: 72% (+0.4% from previous year)
• Fall 2013: 71.7% (+5% from previous year)
The success rate increased by 9.9%.

Previous actions to improve Program Goal: Full-time faculty worked with adjuncts to ensure quality instruction in all classes in Fall 2017 and Spring 2018. This may be responsible for the substantial increase in the success rate.

Target Met: [ X ] Yes [ ] No [ ] Partially. Most recent results: The success rate increased by 9.9%. Detailed data for individual courses with success rates less than 75% are presented below, along with actions to improve the success rate.

Fall 2017 Student Grade Distribution by Course: College-Wide (from OIR); Fall 2017 Student Grade Distribution by Course: Alexandria Campus (from OIR); Fall 2017 Student Grade Distribution by Course: Woodbridge Campus (from OIR)

Target: 75%. Results: 14 courses were offered in Fall 2017, some with multiple sections at both the AL and WO campuses. Four courses had college-wide success rates at or under 75%.

Fall 2017 grade distribution by course, college-wide (courses at or under 75% success):
• PHT101*: 72% success; 128 total students; 116 credit students; D 7 (6%); F 13 (11%); W 12 (10%); Audit 12 (9%); I 0
• PHT103: 67% success; 7 total students; 6 credit students; D 0; F 1 (16%); W 1 (16%); Audit 1 (14%); I 0
• PHT195: 60% success; 21 total students; 10 credit students; D 0; F 0; W 4 (40%); Audit 11 (52%); I 0
• PHT270: 72% success; 40 total students; 36 credit students; D 3 (8%); F 5 (13%); W 2 (5%); Audit 4 (10%); I 0
*This may include ELI data
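As an illustrative check of how the reported success rate relates to the grade distribution (the report does not state the formula, so this assumes success means credit students earning above a D, with D, F, and W counted in the denominator), the PHT101 figures give:

\[
\frac{116 - (7 + 13 + 12)}{116} = \frac{84}{116} \approx 72\%
\]

which matches the 72% success rate reported above.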

Sections of courses with college-wide success rates at or under 75%, organized by campus (Fall 2016 vs. Fall 2017, with change from the previous year):
• PHT101 (WO): Fall 2016 69%*; Fall 2017 63%; change -6%
• PHT103 (AL): Fall 2016 N/A; Fall 2017 67%; change N/A
• PHT195 (AL): Fall 2016 N/A; Fall 2017 60%; change N/A
• PHT270 (AL): Fall 2016 74%; Fall 2017 79%; change +5%
• PHT270 (WO): Fall 2016 42%; Fall 2017 65%; change +23%
*This may include ELI data

Most recent results and previous actions to improve Program Goal: The success rate for PHT 270 (AL) increased by 5%, and the success rate for PHT 270 (WO) increased by 23%. Continuous reminders to students to submit assignments, and encouragement to meet with their professors during office hours, helped increase the success rate in Fall 2017; however, 21% of students still received a D or F in Fall 2017. The success rate for PHT 101 (WO) declined by 6%; three of the five sections were offered through ELI, which historically has lower success rates. PHT 103 (AL) had a total of 6 credit students, 4 of whom earned A's or B's; the low success rate may be due to the small sample size. Four students withdrew from PHT 195 (AL), which resulted in a low success rate for the course: one student had a death in her family, two students had changes in employment, and the other student withdrew without communicating the reason to the faculty. Three of the four withdrawals resulted from circumstances beyond the students' or faculty's control. Current actions to improve Program Goal: Faculty members continue to send emails to students in academic difficulty, encouraging them to submit assignments and meet with their professor during office hours. At the cross-campus meeting in Spring 2019, the PHT discipline group will discuss course materials for PHT 270 and possible assessment methods to identify the reasons for the low success rate. At the same meeting, the group will discuss course materials for PHT 101, including ELI course materials and teaching methods, to identify reasons for the low success rate.

Advanced Photography I + II (AL) PHT 201 + 202. Measure: Photography and Media Student Survey 2018, developed within the Photography program. The survey was given on SurveyMonkey, and the rating scale ranged from 1 to 5, with 5 as the highest rating, for the following questions: Do you have a Photo advisor? If yes, how often do you contact your advisor? The PHT cluster decided that it is important for students to have a faculty advisor and to meet with the advisor to discuss degree progress in order to increase the number of graduates. This survey indicates whether, and how much, guidance students are receiving from a faculty advisor. Sample: 12 out of 18 AAS students participated in the survey (66%); 9 identified their home campus as AL (75%) and 3 identified their home campus as WO (25%).

Semester/year data collected: Spring 2018. Target: 100 percent of students will have a PHT faculty advisor and meet with the advisor at least "once a semester."
Results: Do you have a Photo advisor?
• Yes: Spring 2017 = 5 students (100%); Spring 2018 = 8 students (66.67%); change = -33.33%
• No: Spring 2017 = 0 students (0%); Spring 2018 = 3 students (25%); change = +25%
• I don't know: Spring 2017 = 0 students (0%); Spring 2018 = 1 student (8.33%); change = +8.33%
Results: If yes, how often do you contact your advisor?
• Once a semester or more: Spring 2017 = 5 students (100%); Spring 2018 = 8 students (100%); change = +/-0%

Previous actions to improve Program Goal: The previous survey indicated that it was effective to meet with 100% of students in PHT101 and PHT102 to inform them of the importance of having a faculty advisor and to provide clear guidance on the program and degrees. Since the last assessment, faculty have continued to meet with all students to give information about having a faculty advisor and about the program and degrees. Target Met: [ ] Yes [ ] No [ X ] Partially. Most recent results: 3 students (25%) answered that they do not have a PHT faculty advisor, and 1 student (8.33%) did not know whether she/he had a PHT faculty advisor. Two-thirds of students have a PHT faculty advisor, and they contact their PHT advisor at least once a semester for advising. Current actions to improve Program Goal: Simplifying the process of assigning an advisor would increase the number of students with a PHT advisor; currently there is no clear implementation date for this goal because it is not up to the faculty but involves a college-wide (and possibly VCCS-wide) change. Faculty continue to meet with all students to give information about having a faculty advisor and about the program and degrees. Next Assessment: Spring 2019

To obtain the instructional resources and develop the curriculum needed to provide excellent instruction

The PHT cluster evaluates the traditional and digital photographic facilities.

Target: adequate funding of technology. Results: Tech Plan, FF&E, and ETF funding are used annually to upgrade technology. New equipment at AL:
• Smith Victor Cool LED 100K 2-light kit – ETF
• Qty 11 - 15" Apple MacBook Pro laptops with sleeves – FF&E
• Qty 2 - Manfrotto 025B large boom with stand
• Hasselblad Imacon Flextight X1 virtual drum scanner
• Edwards Engineered Products 16x20" high-output UV vacuum frame exposure unit
• Epson P9000 printer
• GoPro Hero 5 action camera with charger, extra battery, case, grip, 64GB micro SD card – ETF
• Nikon KeyMission 360 action camera with charger, extra battery, case, grip, 64GB micro SD card – ETF
• Qty 6 - JVC GY-HM200U 4K streaming camcorders – FF&E
• Qty 6 - JVC XLR shotgun microphones for camcorders – FF&E
• Qty 3 - Epson Perfection V850 Pro flatbed scanners
• Canon EOS 5D Mark IV DSLR with 24-70mm f/4L IS USM lens – ETF
• Sony 70-200mm f/2.8 G Master zoom lens for Sony E mount – ETF
At the Woodbridge campus, all iMacs were replaced in Summer 2017.

The AFA building in AL officially opened in Fall 2017. In the darkroom, electrical problems caused five enlargers to not work properly in Fall 2017 and Spring 2018; rewiring the enlargers in Summer 2018 seemed to solve the problem. In addition to the new equipment listed, a studio blackout curtain was installed in the Lighting Studio (AFA328) in Fall 2018, and whiteboard daylight wash lighting with a dimmer switch was installed in AFA304 and AFA308 in Spring 2018. At the Woodbridge campus, all iMacs were replaced in Summer 2017; however, due to budget constraints, only 20 of the 40 iMacs have 27-inch monitors with extra RAM, which is unfortunate. The target goal for the adequate funding of technology was met.

Current actions to improve Program Goal: Faculty and the lab manager will work closely to check the inventory of the photography and media equipment and request necessary updates, upgrades, and replacements when FF&E, ETF, and Tech Plan funds are available. Next assessment: Spring 2019


Annual Planning and Evaluation Report: 2017-2018 Physical Therapist Assistant, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The PTA Program is designed to prepare students to utilize exercise, specialty equipment and other treatment procedures to prevent, identify, correct and alleviate movement dysfunction. The program design provides students with the philosophical, theoretical, and clinical knowledge necessary to deliver high-quality patient care. Ultimately, students are prepared as skilled technical health care providers who work under the direction and supervision of a physical therapist to provide selected components of physical therapy treatments. Upon successful completion of the program, students must take and pass a licensing examination to begin their career as a physical therapist assistant (PTA). Students are prepared for employment in a variety of health-care settings including acute care hospitals, outpatient clinics, extended care facilities, rehabilitation centers, contract agencies and schools.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Demonstrate competence in implementing interventions identified in the plan of care established by the physical therapist.

Clinical Experience III PTH 232 Direct Measure: The summative evaluation method is performance on Criterion #9 Interventions: Therapeutic Techniques (“Applies selected manual therapy, airway clearance, and integumentary repair and protection techniques in a competent manner”) on the PTA Clinical Performance Instrument (CPI) in PTH 232 Clinical Experience III in the Spring semester of the second year. For this Student Learning Outcome (SLO), the student’s aggregate therapeutic techniques intervention performance is assessed. For this year’s SLO, the focus is on joint mobilization. Individual components are listed for Criterion # 9 but cannot be teased out; the relevant component to joint mobilization performance is “manual therapy,” defined in the CPI as “Skilled hand movements intended to improve tissue extensibility; increase range of motion; induce relaxation; mobilize or manipulate soft tissue and joints; modulate pain; and reduce soft tissue swelling, inflammation, or restriction”. Per the CPI, criteria which must be met in order for a student to achieve “entry level performance” are: • Is capable of completing tasks, clinical problem

solving, and interventions/data collection for patients with simple or complex conditions under general supervision of the physical therapist.

• Is consistently proficient and skilled in simple and complex tasks, clinical problem solving, and interventions/data collection.

• Is capable of maintaining 100% of a full-time PTA’s patient care workload in a cost effective

Semester/year data collected: Spring 2018, Cohort Class of 2019 Target: 100% of students will score “Entry Level” on PTH 232 CPI Criteria #9-12 on the Clinical Performance Instrument (CPI) assessment tool in PTH 232 Clinical Experience III in the Spring of 2019 Summative Results by In-Class Enrollment:

Results by Campus/ Modality

Spring 2019 Spring 2018 Average

Score Percent >

100% Average

Score Percent >

100% ME N/A N/A 100% =

The formative achievement targets are: 1. 100% of students will pass the joint mobilization portion of the

PTH 122 Therapeutic Exercise Practical Exam with a score of 19/25 (76%) on the first attempt.

2. 100% of students will correctly answer Question #7 on the PTH 122 Therapeutic Exercise Written exam regarding grade and direction of joint mobilization.

3. 100% of students will correctly answer Question #19 on the PTH 122 Therapeutic Exercise written exam regarding ankle joint mobilization technique.

4. 100% of students will correctly answer Question #23 on the PTH 122 cumulative final written exam regarding knee position for patellar joint mobilization.

5. 100% of students will correctly answer Question #24 on the PTH 122 cumulative final written exam regarding identification of joint mobilization grades.

Results by SLO Criteria:

Results by SLO Criteria/

Question Topics Spring 2018

Average Score > 76% by 1. 85.2% >10.2% 1st Time Pass Rate > 100% target by

Previous action(s) to improve SLO: For the Class of 2018, a pre and post joint mobilization quiz was developed to assess the impact of the joint mobilization seminar presented in the seminar portion of PTH 232 Clinical Experience III in the Spring 2018 semester. Target Met: N/A: Summative achievement target for the Class of 2019 will be assessed in Spring 2019. Based on recent results, areas needing improvement: 1. Students who do not

demonstrate competency in the joint mobilization section of the PTH 122 Therapeutic Procedures II Therapeutic Exercise Practical exam may still receive a passing grade on the exam and are not required to retake to establish competency.

2. As evidenced by question #22 on the cumulative written exam, students do not demonstrate knowledge of appropriate positioning for the clinically common patellar gliding mobilization technique.

Page 330: Annual Planning and Evaluation Report Instructional Programs … · report; if there is a question about an evaluation method, please contact the instructional program or OIESS)

320

Physical Therapist Assistant, A.A.S. manner with direction and supervision from the physical therapist.

"Entry level" is a single point highest level terminal benchmark without gradations. Students achieving this benchmark are deemed ready to practice as physical therapist assistants. There are no strengths or weaknesses defined or identified for individual criteria on this national performance assessment tool.
Provided Rubric Criteria or Question Topics for Formative Measure: The component intervention that is the focus for this SLO is joint mobilization. The formative evaluation methods include:
• Performance on the Therapeutic Exercise practical exam in PTH 122 Therapeutic Procedures II in the second semester of the program (rubric attached). Prior to the practical exam with its component on joint mobilization, students take a written PTH 122 Therapeutic Procedures II Therapeutic Exercise Unit exam. Two exam questions were examined:
1. Question #7 on the grade and direction of glenohumeral joint mobilization
2. Question #19 on ankle joint mobilization technique
• In addition, two questions on the final course comprehensive exam in PTH 122 Therapeutic Procedures II were selected:
1. Question #23 on the knee position for patellar mobilization
2. Question #24 on identification of grades of mobilization
Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
ME only | 1 | 1 | 29
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual Enrollment


Current results improved: N/A - this is a new inquiry
Strengths by Criterion/Question/Topic:
1. Overall, the cohort performed well on the joint mobilization section of the practical exam with an average score above 76%. Breaking down the practical exam into its component parts allowed the program to identify 2 students who passed the practical in its entirety but were deficient in joint mobilization. The criterion also identified the joint mobilization component as a key factor in determining overall failure of the practical exam, as 3 students who failed the practical exam also failed the joint mobilization component.

2. Overall, the cohort performed well on this criterion, correctly performing the grade and direction of the glenohumeral mobilization. This is one of the most common joint mobilizations performed clinically.

3. Overall, the cohort performed well on this criterion which asked for just one component of the technique of ankle mobilization. The instructor can be confident that students know this information.

4. Review of this criterion confirmed that the question was correctly worded, matched material presented in the course, and was highly clinically relevant. This was a higher order question, requiring that students integrate their knowledge of anatomy with the concepts of mobilization to a greater degree than is needed with other joints.

5. The 100% benchmark was achieved on this key concept question.

Weaknesses by Criterion/Question/Topic:
1. This criterion doesn't discriminate between students who failed the component due to a lack of knowledge/skill and those who failed for other reasons. Three of the five students who failed received a zero for the joint mobilization section because, although they correctly performed the joint mobilization, they misread the case scenario and treated the wrong joint.

2. The question tested knowledge in two areas, direction and grade. Students may have been able to correctly guess if they knew one but not the other.

3. No weakness in this criterion is apparent.


Current actions to improve SLO based on the results:
1. Faculty will review the policy for passing the PTH 122 Therapeutic Exercise practical exam in Spring 2019. Any changes will be effective in Spring 2020.

2. The instructor for PTH 122 will retain the patellar gliding question on the exam, but will put added emphasis on the anatomical considerations in lecture in Spring 2019.

3. For the Class of 2019, faculty will collect pre and post quiz data for the PTH 232 joint mobilization seminar, integrate the findings with the data from the first year, and compare the data to that collected for the pre and post quiz in Spring 2018 for the Class of 2018.

Next Assessment: Spring 2019


4. The point-biserial discrimination for this question is -.12: students who scored well on the test were not more likely to get this question right than students who scored poorly (see the note following this list).
5. No weakness in this criterion is apparent.
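For reference, a standard formulation of the point-biserial discrimination index cited above is sketched below; the exam-analysis software used by the program may apply a slightly different variant, so this should be read as illustrative rather than as the program's exact computation.

\[ r_{pb} = \frac{M_1 - M_0}{s_x}\sqrt{p(1-p)} \]

Here M_1 is the mean total exam score of students who answered the item correctly, M_0 is the mean total score of those who answered it incorrectly, s_x is the standard deviation of all total scores, and p is the proportion of students answering correctly. A negative value, such as the -.12 reported for Question #23, indicates that students with higher overall scores were no more likely (and here slightly less likely) than lower-scoring students to answer the item correctly.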

Demonstrate competence in performing data collection skills essential for carrying out the plan of care established by the physical therapist.

Clinical Experience III PTH 232 Direct Measure: The summative evaluation method is performance on data collection skills contained in Criteria #8 (Therapeutic Exercise), 9 (Therapeutic Technique), 10 (Physical Agents & Mechanical Modalities), 11 (Electrotherapeutic Modalities), and 12 (Functional Training & Application of Devices and Equipment) in PTH 232 Clinical Experience III on the PTA Clinical Performance Instrument (CPI) in the Spring semester of the second year. For this Student Learning Outcome (SLO), the student's aggregate data collection skill is assessed. For this year's SLO, the focus is on goniometry. Individual components are listed for Criteria #9-12 but cannot be teased out. For each of these criteria, associated data collection techniques include "anthropometric characteristics; arousal, attention, and cognition; integumentary integrity; pain; range of motion; sensory response; and vital signs". The component relevant to goniometry is "range of motion." Per the CPI, criteria which must be met in order for a student to achieve "entry level performance" are:
• Is capable of completing tasks, clinical problem solving, and interventions/data collection for patients with simple or complex conditions under general supervision of the physical therapist.
• Is consistently proficient and skilled in simple and complex tasks, clinical problem solving, and interventions/data collection.
• Is capable of maintaining 100% of a full-time PTA's patient care workload in a cost effective manner with direction and supervision from the physical therapist.

"Entry level" is a single point highest level terminal benchmark without gradations. Students achieving this benchmark are deemed ready to practice as physical therapist assistants. There are no strengths or weaknesses defined or identified for individual criteria on this national performance assessment tool.

Semester/year data collected: Fall 2017/Summer 2018, Cohort Class of 2019 Target: The summative achievement target is that 100% of students will achieve scores of 100% (Entry Level) on Criteria # 8-12 on the Clinical Performance Instrument (CPI) assessment tool in PTH 232 Clinical Experience III in Spring 2019. Results:

Results by Campus/Modality:
Campus/Modality | Fall 2017 Average Score | Fall 2017 Percent at Target | Fall 2016 Average Score | Fall 2016 Percent at Target
ME | N/A | N/A | 100% | = target

The formative achievement targets are:
1. 100% of students will achieve a passing score (75%) on the first attempt of the PTH 121 Therapeutic Procedures I Goniometry Practical Exam. In the past decade no student has failed the second attempt.

2. 100% of students will accurately perform goniometry on the PTH 131 seminar Goniometry Skills Check without guidance/assistance.

3. 100% of clinical instructors of students who are performing goniometry (it is not commonly done in certain settings) in the clinical portion of PTH 131 will respond that the student is independent or requires minimal cueing in the skill on the Early Performance Assessment.

Results by SLO Criteria:
Criterion/Question Topic | Fall 2017 | Fall 2016
1. Average Score | 85.9% (% of students at target: N/A) | Not available
1. 1st Time Pass Rate (target 100%) | 29/32 = 90.6% (below target by 9.4%) | 31/35 = 88.6% (below target by 11.4%)
2. % Independent (target 100%) | 23/29 = 79.3% (below target by 20.7%) | N/A
3. % Independent (target 100%) | 22/27 = 81.5% (below target by 18.5%) | N/A

Current results improved: N/A - not previously formatively assessed in this way.

Previous action(s) to improve SLO: This is a new initiative for this component (goniometry) of the SLO.
Target Met: N/A; the summative achievement target for the Class of 2019 will be assessed in Spring 2019.
Based on recent results, areas needing improvement:
1. A better strategy to reduce testing stress in order to more accurately assess student skill in the PTH 131 seminar skills assessment.

2. Remediation strategy for students who are not independent in goniometry prior to the first clinical placement.

3. Follow up regarding goniometry skill acquisition/retention using the Clinical Placement Instrument.

Current actions to improve SLO based on the results:
1. Continued tracking of average scores on the PTH 121 Goniometry Practical Exam to determine if a course of action is needed for students scoring between 75% and 80%.

2. Extended feedback session conducted by the Director of Clinical Education and the Lab Manager with students who require more assistance on the PTH 131 seminar skills assessment in Summer 2019.


Provided Rubric Criteria or Question Topics: The component intervention that is the focus for this SLO is goniometric measurement. The formative evaluation methods include:
1. Performance on the Goniometry practical exam in PTH 121 Therapeutic Procedures I in the first semester of the program (rubric attached).

2. In Summer 2018 in the third semester of the program in the seminar portion of PTH 131 Clinical Experience I preceding placement in clinic, faculty initiated a Goniometry Skills Check of 6 joints noting “correctly performed independently” or “correctly performed with assistance” to assist students in recognizing the need for additional practice prior to entering the clinic.

3. In Summer 2018 in the third semester of the program in the clinical portion of PTH 131 Clinical Experience I, faculty added a new question specifically asking whether students required assistance with goniometry to the PTH 131 Clinical Experience I Early Performance Assessment form (see attached) used during the midterm clinical site visit. The previous version of the form included a subjective checklist for “meeting expectations/ not meeting expectations” for goniometry; this was retained.

Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
ME only | 1 | 1 | 32
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual Enrollment

Strengths by Criterion/Question/Topic:
1. First time pass rate on the practical exam exceeds the previous year's cohort.
2. The seminar skills check identified a half dozen students in need of review before clinic placement. Of the 6, 3 had previously earned an A grade on the practical and would otherwise have been complacent regarding their competency. Two had previously failed the practical and 1 had earned the lowest passing score; this assessment identified them as still weak.

3. All students who were performing goniometry in clinic received “meets expectations” on the Early Performance Assessment form; asking the specific question teased out the actual level of assistance needed. The majority of students (22/27=81.5%) in clinic who were performing goniometry were performing well, requiring minimal cueing or no assistance. Of the 6 identified as weak through the PTH 131 assessment, 5 were required to perform goniometry in the clinic. Of those 5, 4 (80%) were performing well and 1 required moderate assistance. Of note, that student had passed the goniometry practical with the lowest possible passing score.

Weaknesses by Criterion/Question/Topic:
1. Although the first time pass rate is higher than for the previous year, it is below the target. Because stress management always plays a role in failures, a better indicator of how well students are performing may be the average score.

2. Students were unaware they would be assessed in the PTH 131 seminar and had not performed goniometry for several months.

3. The PTH 131 seminar assessment tool did not perfectly predict the 4 students who required 50% assistance in clinic. Follow up at the end of the PTH 131 clinical experience through review of the CPI was not performed.

3. The Director of Clinical Education will provide advance notification to students that there will be a skills assessment in PTH 131 in Summer 2019.

4. The program director will track students in the PTH 231 Clinical Experience II in Fall 2018 who required assistance in the skills assessment but did not perform goniometry in PTH 131 in Summer 2018.

Next assessment: Spring 2019

Communicate verbally and nonverbally with the patient, the physical therapist, health care delivery personnel and others in an effective, appropriate, and capable manner.

Clinical Experience III PTH 232 Direct Measure: The summative evaluation method is performance on Criterion #5 Communication ("Communicates in ways that are congruent with situational needs") on the PTA Clinical Performance Instrument (CPI) in PTH 232 Clinical Experience III in the Spring semester of the second year. For this Student Learning Outcome (SLO), the student's aggregate communication performance is assessed. For this year's SLO, the focus is on inter-professional communication. Individual components ("Essential Skills") are listed for Criterion #5 but cannot be teased out; components related to inter-professional communication include "d. Adjusts style of communication based on target audience (e.g., age appropriateness, general public, professional staff)" and "j. Instructs members of the health care team, using established techniques, programs, and instructional materials, commensurate with the learning characteristics of the audience."

Semester/year data collected: Fall 2017 and Spring 2018 Target: The summative achievement target is that 100% of students will achieve scores of 100% (Entry Level) on Criterion #5 on the Clinical Performance Instrument (CPI) assessment tool in PTH 232 Clinical Experience III in Spring 2019. Summative Results by In-Class Enrollment:

Results by Campus/Modality:
Campus/Modality | Spring 2019 Average Score | Spring 2019 Percent at 100% | Spring 2018 Average Score | Spring 2018 Percent at 100%
ME | N/A | N/A | 100% | = target

Previous action(s) to improve SLO:
1. The PTH 105 lead instructor will add an inter-professional communication opportunity to the PTH 105 Introduction to Physical Therapy Evaluation of Student Clinical Performance form in Fall 2020.
2. Addition of an inter-professional communication rating scale to the Early Performance Assessment form in PTH 131 Clinical Experience I.


Per the CPI, criteria which must be met in order for a student to achieve "entry level performance" are:
1. Is capable of completing tasks, clinical problem solving, and interventions/data collection for patients with simple or complex conditions under general supervision of the physical therapist.
2. Is consistently proficient and skilled in simple and complex tasks, clinical problem solving, and interventions/data collection.
3. Is capable of maintaining 100% of a full-time PTA's patient care workload in a cost effective manner with direction and supervision from the physical therapist.

"Entry level" is a single point highest level terminal benchmark without gradations. Students achieving this benchmark are deemed ready to practice as physical therapist assistants. There are no strengths or weaknesses defined or identified for individual criteria on this national performance assessment tool.
Provided Rubric Criteria or Question Topics: The component intervention that is the focus for this SLO is inter-professional communication. The formative evaluation methods included:
1. Clinical instructors assessing first year students in the Fall 2017 (Class of 2019) semester in the PTH 105 Introduction to Physical Therapy integrated clinical experience answered yes/no to the criterion "Initiates communication with family members and other health care providers" on the Evaluation of Student Clinical Performance (see attached).
2. In the summer of 2018 (Class of 2019), faculty added a new question to the PTH 131 Clinical Experience I Early Performance Assessment form (see attached) used during the midterm clinical site visit, specifically asking clinical instructors to rate (on a scale of 1-10) the student's communication with other members of the health care team. The previous version of the form included a subjective checklist for "meeting expectations/not meeting expectations" for communication; this was retained.

The formative achievement targets are:
1. 100% of students will be marked "yes" for "Initiates communication with family members and other health care providers" on the Evaluation of Student Clinical Performance in the Fall 2017 semester in the PTH 105 Introduction to Physical Therapy integrated clinical experience.
2. 100% of students will be rated at least 6 out of 10 (above average) for inter-professional communication on the Early Performance Assessment form in PTH 131 Clinical Experience I in Summer 2018.
3. 100% of students will achieve the passing benchmark (Intermediate) on Criterion #5 on the Clinical Performance Instrument (CPI) assessment tool in PTH 131 Clinical Experience I in Summer 2018.

Results by SLO Criteria:
Criterion/Question Topic | Summer 2018 | Summer 2017
1. Marked "yes" | 94.3% of students | N/A
2. Inter-professional communication rating | Average 7.2/10; 78.6% of students rated ≥ 6/10 | N/A
3. CPI Criterion #5 | Average just below Advanced Intermediate; 100% of students rated Intermediate or above | Average just below Advanced Intermediate; 100% of students rated Intermediate or above

Current results improved: N/A - examining inter-professional communication is a new initiative
Strengths by Criterion/Question/Topic:
1. For the students provided the opportunity to initiate communication with other health care professionals, all were able to perform this task. Review of this item on the Evaluation of Student Clinical Performance revealed the important information that for 2 of the 35 students evaluated, the clinical instructor marked this item "N/A", indicating that the opportunity to communicate with other members of the health care team was not provided to 100% of students. One of the students who did not have this opportunity did not meet the target achievement of "above average" on the Early Performance Assessment in PTH 131 Clinical Experience I and was one of the students with the lowest achievements on the final CPI in PTH 131 Clinical Experience I.


Target Met: N/A; the summative achievement target for the Class of 2019 will be assessed in Spring 2019.
Based on recent results, areas needing improvement:
1. Clearer guidelines for faculty on the PTH 105 Introduction to Physical Therapy Evaluation of Student Clinical Performance form.

2. Clarification of the inter-professional communication scale used in the Early Performance Assessment form in PTH 131 Clinical Experience I.

Current actions to improve SLO based on the results:
1. The PTH 105 lead instructor will amend the Guidelines for the PTH 105 Introduction to Physical Therapy Evaluation of Student Clinical Performance in Fall 2019 to reflect that opportunities for inter-professional communication must be provided to each student.

2. The Director of Clinical Education will provide descriptions for key anchors on the inter-professional communication scale used in the Early Performance Assessment form in PTH 131 Clinical Experience I in Summer 2019.

Next Assessment: Spring 2019



3. Clinical instructors assessed student performance on Criterion #5 on the Clinical Performance Instrument (CPI) assessment tool in PTH 131 Clinical Experience I in the summer of 2018 (Class of 2019).

Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
ME | 1 | 1 | 35**
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual Enrollment  **Due to attrition, by Summer 2018 the number of students was 29.

2. For the 29 students in clinic in PTH 131 Clinical Experience I, 28 Early Performance Assessment forms were correctly completed to include the communication rating. Of those 28, 4 were identified as average (5/10), and 2 were below average (1 student each at 4/10 and 3/10). This specific midterm assessment may have functioned as an effective early warning system for the student and the clinical instructor, promoting more guidance for the instructor and more feedback for the student, as all 6 students were able to meet the Criterion #5 Communication Intermediate benchmark for the clinical experience by the final CPI evaluation. This increased awareness is supported by the instructor’s comments on the Early Performance Assessment for the two lowest rated students (“working on it” and “need to work on communication”). Of the 6 students who were not above average on the Early Performance Assessment, 3 achieved the lowest passing rating (Intermediate) on the final CPI.

3. Clinical instructors are familiar with the CPI as it is the national assessment tool used for both physical therapist assistant and doctor of physical therapy students. Clinical instructors learn to use the CPI through a standardized training module, which improves its validity. The rating system used makes comparisons between cohorts relatively easy. For the Class of 2018, 8/31 or 25.8% of students were at the lowest passing level for Criterion #5 compared to 6/29 or 20.7% for the Class of 2019, supporting that the Early Performance Assessment may have assisted students in identifying communication weaknesses early.

Weaknesses by Criterion/Question/Topic:
1. The Guidelines for Student Evaluation provided with the Evaluation of Student Clinical Performance do not mandate that students must be provided the opportunity for inter-professional communication, which had been the faculty's original intent. As written, it is acceptable for faculty to note that there were no opportunities for the student to perform the skill.

2. Unlike the CPI, no benchmarks were offered to the clinical instructors for the 1-10 inter-professional communication rating scale leaving it open to interpretation and the individual instructor’s concept of what is acceptable for a novice learner. Although 100% of instructors marked “meets expectations” for the more global “communication” criterion on the checklist on the Early Performance Assessment, 2 students were marked below 5/10 on the specific inter-professional communication item. Clinicians’ comments did not appear to support the marks for 2 of the 4 students marked 5/10 (“doing fine” and “working well, getting more comfortable”).

3. Criterion #5 on the CPI is not restricted to inter-professional communication.


Core Learning Outcome | Evaluation Methods | Assessment Results | Use of Results

CLO: Present sound rationales for clinical problem solving within the plan of care established by the physical therapist. [ X ] CT

Clinical Experience III PTH 232 Direct Measure: The summative evaluation method is performance on Criterion #7 Clinical Problem Solving on the PTA Clinical Performance Instrument (CPI) in PTH 232 Clinical Experience III in the Spring semester of the second year. Per the CPI, criteria which must be met in order for a student to achieve "entry level performance" are:
• Is capable of completing tasks, clinical problem solving, and interventions/data collection for patients with simple or complex conditions under general supervision of the physical therapist.
• Is consistently proficient and skilled in simple and complex tasks, clinical problem solving, and interventions/data collection.
• Is capable of maintaining 100% of a full-time PTA's patient care workload in a cost effective manner with direction and supervision from the physical therapist.

"Entry level" is a single point highest level terminal benchmark without gradations. Students achieving this benchmark are deemed ready to practice as physical therapist assistants. There are no strengths or weaknesses defined or identified for individual criteria on this national performance assessment tool.
Provided Rubric Criteria or Question Topics: Performance on written and practical exam questions across the first year that required students to problem solve increasingly complex clinical applications of the concept of passive insufficiency of muscles was examined. The formative evaluation methods included:
1. In PTH 121 Therapeutic Procedures I in the first semester, in Fall 2017 for the Class of 2019, on the Unit II Goniometry exam Question #22, students were asked to apply the concept of passive insufficiency to arrive at the correct positioning for a goniometric joint measurement.
2. In PTH 115 Kinesiology for the PTA in Spring 2018, the Class of 2019 was asked Unit III Lower Extremity Unit exam Question #6, requiring problem solving for the higher level task of assessing muscle length by applying the passive insufficiency concept to hip flexors.

Semester/year data collected: Fall 2017/Spring 2018, Cohort Class of 2019 Target: 100% of students will score “Entry Level” on PTH 232 CPI criterion #7 Summative Results by In-Class Enrollment:

Results by Campus/Modality:
Campus/Modality | Spring 2019 Average Score | Spring 2019 Percent at 100% | Spring 2018 Average Score | Spring 2018 Percent at 100%
ME | N/A | N/A | 100% | = target

The formative achievement targets are:
1. 100% of students will correctly answer Question #22 on the Unit II Goniometry exam in PTH 121 Therapeutic Procedures I.
2. 100% of students will correctly answer Question #6 on the PTH 115 Kinesiology for the PTA Unit III written exam.
3. 100% of students will correctly answer Question #3b on the PTH 115 Unit IV Posture and Gait practical exam.
Results by CLO Criteria:

Criterion/Question Topic | Fall 2017 Average Score (vs. 100% target) | Fall 2016 Average Score (vs. 100% target)
1. | 71.9% (below target by 28.1%) | 40% (below target by 60%)
2. | 69% (below target by 31%) | 68.75% (below target by 31.25%)
3. | 89.7% (below target by 10.3%) | 90.3% (below target by 9.7%)

Current results improved: [ X ] Yes [ ] No [ ] Partially
Strengths by Criterion/Question/Topic:
1. The question was worded differently in Fall 2016, with a "When measuring hip extension ROM" stem and four answer choices beginning with "the knee is (flexed or extended) to minimize tightness in the (muscle group name)." Although the point-biserial discrimination was .37, indicating a strong correlation between students who scored high on the exam and students who answered the question correctly, the instructor reworded the question to improve clarity in Fall 2017, resulting in a 31.9 percentage point increase in the number of students who were able to problem solve by applying the concept.
2. Students were able to problem solve by applying the passive insufficiency concept at a higher level than in the previous semester with very little attenuation in the percentage of students answering the question correctly. There was little change between the two cohorts despite a change in instructors to a first-time adjunct in Spring 2018.

Previous actions to improve CLO: Improving clarity of PTH 121 Therapeutic Procedures I Unit II Goniometry exam question #22 in Fall 2017 resulted in a sharp increase in students able to demonstrate their problem solving ability. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: Retention of concepts from Fall semester to Spring semester. Opportunity to coordinate learning in PTH 115 Kinesiology for the PTA with PTH 122 Therapeutic Procedures II. Current actions to improve CLO based on the results: PTH 115 faculty will reinforce passive insufficiency concepts presented in lecture through integrated manual muscle testing activities in lab in Spring 2019. Faculty will triangulate PTH 115 Kinesiology for the PTA data with PTH 122 Therapeutic Procedures II Therapeutic Exercise practical exam results to identify students still struggling with problem solving using the passive insufficiency concept in order to offer additional remediation activities to improve the final PTH 115 practical exam performance in this area in Spring 2019. Next Assessment: Spring 2019



3. In PTH 115 Kinesiology for the PTA in Spring 2018, the Class of 2019 was asked on the Unit 4 Gait and Posture practical exam to problem solve at the highest level, applying the passive insufficiency concept to construct a stretching exercise for a tight muscle.

Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
ME only | 1 | 1 | 29
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual Enrollment


3. In each cohort, the same pattern is observed across the course with an increasing percentage of students able to problem solve across the continuum of difficulty suggesting that sequential learning has taken place.

Weaknesses by Criterion/Question/Topic:
1. N/A
2. Although there was only a small drop (2.9%) in the percentage of students correctly answering this question compared to the earlier, lower-level question, it is still not a gain, suggesting that students may need more lab activities in this area to help cement this skill.
3. Faculty did not triangulate data from the PTH 122 Therapeutic Procedures II Therapeutic Exercise practical exam 6 weeks prior to the PTH 115 Unit 4 practical exam to identify students still struggling to problem solve by applying the concept who could have received remediation.

Program Goals | Evaluation Methods | Assessment Results | Use of Results

Program goal on program-placed students: NOVA PTA program faculty are dedicated to increasing public awareness of the program in order to increase the number of qualified applicants to the program. Note: this is not an actual listed Goal of the program, as Goals focus on graduates of the program.

The number of students accepted into the PTA program is capped at 40 by the accrediting body, CAPTE. Although the program cannot increase the number of its accepted students, it can increase the size of the applicant pool in order to improve the caliber of the students accepted through its selective admissions process. The target goal is 60 qualified applicants, which on average requires 71 applications, which typically would require that 236 students attend an Information Session.
1. The program requires each applicant to attend a two-hour, face-to-face Information Session in the 12 months prior to the March application period. Information Sessions are scheduled 11 times per year, but in most years 10 sessions are held due to inclement weather. Applicant attendance is recorded and tracked by month. In the 2018 application cycle, 2nd and 3rd priority application windows were added to capture students who completed prerequisites in the 16-week Spring semester and those who completed them in the first summer session.
2. Admissions staff track the number of applicants and the number of qualified applicants each year and provide the program director with an Excel spreadsheet of the data.

Target: 60 qualified applicants, which on average requires 71 applications, which typically would require that 236 students attend an Information Session. Results for Past 5 years:

Admission Cycle | Number of Information Session Attendees | Percentage Change
3/13-2/14 | 337 | -
3/14-2/15 | 292 | 13.4% decline
4/15-3/16 | 182 | 37.7% decline
4/16-3/17 | 170 | 14.6% decline
4/17-3/18, extended 5/18-6/18 | 143 + 35 = 178 | 4.7% increase overall

Target Met: [ ] Yes [ X ] No [ ] Partially

Admission Cycle | # of Qualified Applicants / # of Applicants | % Change in Qualified Applicants
3/13-2/14 | 109/124 | -
3/14-2/15 | 66/73 | 39.4% decline
4/15-3/16 | 50/58 | 24.2% decline
4/16-3/17 | 48/48 | 4% decline
4/17-3/18, extended 5/18-6/18 | 46/53 | 4.2% decline
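As a rough internal check, the conversion rates implied by the target figures (236 attendees, 71 applications, 60 qualified applicants) can be applied to this cycle's attendance; these rates are inferred from the target itself rather than reported separately by Admissions.

\[ \frac{71}{236} \approx 0.30 \qquad \frac{60}{71} \approx 0.85 \]
\[ 178 \times 0.30 \approx 53 \text{ applications} \qquad 53 \times 0.85 \approx 45 \text{ qualified applicants} \]

These estimates track closely with the 53 applications and 46 qualified applicants actually received in the extended 2017-2018 cycle, suggesting that the shortfall against the target is driven by Information Session attendance rather than by weaker conversion of attendees into qualified applicants.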

Results improved: [ ] Yes [ ] No [ X ] Partially
There was a small increase in the number of students attending Information Sessions and in the number of applications, but the number of qualified applicants actually fell slightly. The real value of adding the 2nd and 3rd windows was that 15 students were accepted in those windows, allowing for a full cohort.
Current action(s) to improve program goal:
1. Continue to advocate to Administration for more recruitment efforts, including the return of the campus Open House event.

2. Maintain a more robust social media presence with a minimum of 2 postings/week in Fall 2018 and Spring 2019.

Assessed: Annually


Target Met: [ ] Yes [ X ] No [ ] Partially
Comparison to previous assessment(s): This is the first formal assessment.

NOVA PTA program faculty are dedicated to enhancing student retention and student success in the PTA program.

The program’s graduation records are compiled by the program director and corroborated by the OIR.

Target: The overall (150%) graduation rate target using the Commission on Accreditation in Physical Therapy Education (CAPTE) calculation method is to meet or exceed CAPTE's reported national average (most recently 85.4%). For CAPTE accounting, students who leave the program for reasons other than dismissal for academic failure are not counted as members of the original cohort. Students who withdraw before census are not included in the calculation. Results for Past 5 Years:

Academic Year | Number of Graduates / Graduation Rate per CAPTE | Percentage Change
2013-2014 | 30 / 93.8% | -
2014-2015 | 28 / 84.8% | ↓ 9%
2015-2016 | 29 / 96.7% | ↑ 11.9%
2016-2017 | 28 / 84.8% | ↓ 11.9%
2017-2018 | 31* / 91.2%* | ↑ 6.4% (may rise to ↑ 9.3%)
*With 1 student still enrolled and on track to graduate within 150% of the program length, the graduation number and rate may rise to 32 / 94.1%.

Target Met: [ X ] Yes [ ] No [ ] Partially
Comparison to previous assessment(s): The previous three-year average was 28.3 graduates per year with an 88.8% graduation rate. The current three-year average is 29.3 graduates per year with a 90.9% graduation rate, an increase of one graduate per year and an increase of 2.1 percentage points in the graduation rate.
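To make the CAPTE calculation concrete, the following is a worked illustration using the 2017-2018 figures above; the adjusted-cohort size of 34 is inferred from the reported rate rather than stated in program records, so it should be treated as an assumption.

\[ \text{graduation rate} = \frac{\text{graduates within 150\% of the program length}}{\text{adjusted cohort}} = \frac{31}{34} \approx 91.2\% \]

If the one student still enrolled graduates within the 150% window, the numerator rises to 32 and the rate becomes 32/34, or approximately 94.1%, matching the projection in the table.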

Previous action to improve program goal: In Fall 2016, faculty began providing the assistant dean with all test and practical exam scores which were then collated and distributed to the full faculty so that all were aware of struggling students and core faculty could make appointments to counsel their at-risk advisees. By disseminating this information after every test and practical exam, all faculty could easily see whether students were trending up or down and could increase or decrease their outreach accordingly to improve student retention and ultimately graduation rates. Most recent results: 31 of 39 students in the Class of 2018 persisted to graduation compared to 28 of 40 students in the Class of 2017. This 9.48% improvement in the raw graduation rate suggests that intense early intervention may be an effective tool to increase student retention. Results improved: [ X ] Yes [ ] No [ ] Partially Current actions to improve program goal: The program was unable to track whether or not referred students actually followed up with the academic success counselor as requested. Using a new electronic referral system, the program director will coordinate with the academic


success counselor to track this information in Fall 2018. The impact of early intervention by the academic success counselor on retention from the first semester to the second semester will be assessed in Spring 2019. Early analysis suggests that students who score in the bottom 20% of the class on the Test of Essential Academic Skills (TEAS) used for admission are less likely to be retained, as are students who earn a B grade in both the anatomy and physiology and medical terminology prerequisites. The program director has identified the students in the Class of 2020 who fall in these categories and will track their progress across the 2018-19 academic year in order to determine if the minimum TEAS scores should be raised and if a lower ranking in the admissions scoring rubric should be assigned to applicants with B grades in both prerequisites in order to admit a better prepared cohort. The impact of the lower TEAS scores and two B prerequisite grades will be assessed in Spring 2019. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Professional Writing Certificate

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Professional Writing Certificate program prepares candidates to compose documents and manage professional communications for a variety of contemporary professions, including business, military, medicine, government, science, and industry. Writers will gain expertise in composing, designing, and editing electronic texts, as well as a comprehensive foundation in grammar and punctuation. Students may tailor their preparation for particular writing environments by selecting from a variety of elective courses in journalism, technical report writing, graphic design, writing for publication, writing for the Web, social media, and communications. Students may also incorporate a professional internship into the Certificate program.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Students will write a precise, clear, and properly formatted technical report.

Technical Writing ENG 115 Direct Measure: This outcome was analyzed during this evaluation cycle using data from English 115: Technical Writing. The technical reports studied for this assessment were composed collaboratively in small groups. This assessment analyzed the achievement of the resulting reports with respect to seven categories: ● Attribute 1: Audience awareness ● Attribute 2: Purpose ● Attribute 3: Organization ● Attribute 4: Style ● Attribute 5: Design ● Attribute 6: Graphics ● Attribute 7: Documentation For each of these elements, students were given a score of 1 (unsatisfactory), 2 (generally satisfactory), or 3 (effectively). Sample: During Spring 2018, there were two sections of English 115 offered. Both sections were affiliated with the Woodbridge campus. Both sections were ELI courses.

Semester/year data collected: Spring 2018
Target: 2.5
Assessment results for most recent cycle: Of the 23 students enrolled in both sections of the course, 20 students received credit for the course. Of the 23 students, 19 (82%) received passing scores on their final report assignment. The results below include 4 missing reports and show that the target of 2.5 was not met.

Attribute | Mean
Attribute 1 | 2.10
Attribute 2 | 2.23
Attribute 3 | 2.41
Attribute 4 | 2.13
Attribute 5 | 2.23
Attribute 6 | 1.87
Attribute 7 | 1.87

Because there were only 19 reports studied, the mean scores are greatly impacted by the 4 missing reports. This impact resulted in no attributes meeting the target (2.5). However, when the data is considered without these missing reports included in the mean, the target is met for Attributes 1 through 5, as shown below. Attributes 6 and 7 score 2.3.

Attribute | Mean
Attribute 1 | 2.50
Attribute 2 | 2.60
Attribute 3 | 3.00
Attribute 4 | 2.60
Attribute 5 | 2.60
Attribute 6 | 2.26
Attribute 7 | 2.26

Comparison to previous assessment: This outcome was previously assessed in Fall 2016. During the 2016-2017 evaluation

Previous actions to improve SLO: When this SLO was last evaluated in Fall 2016, results were similar. Areas of weakness were related to references and documentation. Faculty were encouraged to integrate documentation style-related activities throughout the term.
Most recent results: The current results demonstrate that the greatest areas of weakness continue to be graphics and documentation. The most common issue was the documenting or referencing of graphics. The other weakness was related to documentation overall: while most students scored satisfactorily on documentation, a handful were missing references to cited material.
Current actions to improve SLO: The first action in response to this SLO will be to communicate this finding with faculty who teach English 115 regularly. The SLO program lead will complete this task by the end of Fall 2018, encouraging faculty to integrate documentation style-related activities throughout the term and to include additional exercises related to graphic integration. The next step will be to notify ELI about this finding so that they will consider it when they next revise the course. This action will be completed by the end of Fall 2018.
Next Assessment: Spring 2020


cycle, the target for each area was 2.5. The results were as follows:

Attribute | Mean
Attribute 1 | 2.25
Attribute 2 | 2.50
Attribute 3 | 2.25
Attribute 4 | 1.75
Attribute 5 | 2.00
Attribute 6 | 1.75
Attribute 7 | 1.50

In both assessment cycles (Fall 2016 and Spring 2018), Attributes 6 and 7 were areas of weakness. As with this evaluation cycle, Fall 2016 results were also heavily impacted by enrolled students who had not submitted a report.

Students will plan, write, and finalize an effective business letter with attachments.

Writing for Business ENG 116 Direct Measure: This evaluation cycle, this outcome was studied using data from English 116: Writing for Business. The final assignment, a cover letter in response to a job posting, was the basis for the assessment. While the cover letter assignment has been used for previous SLO assessments, this particular outcome has not been assessed before. The cover letters were evaluated based on the following attributes:
● Attribute 1: Adherence to business letter conventions (block letter form)
● Attribute 2: Opening contains a clear objective for the letter
● Attribute 3: Letter contains a brief summary of qualifications connected to the job
● Attribute 4: Letter ends with specific contact information
● Attribute 5: References an attachment
These attributes were adapted from the associated readings and PowerPoint lessons for the cover letter assignment. For each of these elements, students were given a score of 1 (unsatisfactory), 2 (generally satisfactory), or 3 (effectively).

Semester/year data collected: Fall 2017
Target: 2.5
Assessment results for the most recent cycle: The cover letter assignment is the only assignment in English 116 that requires an attachment. For the purposes of this assessment, only the cover letter (and not the attached resume) was evaluated. This is because the business/cover letter has several unique features requiring mastery. This is the last course in the PW certificate program requiring a business letter, and the genre should be mastered by the end of the course. Between the two sections of English 116 offered during Fall 2017, there were a total of 15 students, all of whom received credit for the course. Fourteen of the fifteen students submitted cover letters. All 14 cover letters were analyzed for the purpose of this assessment (thus representing 93% of the population enrolled in the course during the term under consideration). Of the 15 students enrolled, all but three received a passing grade on the assignment.

Attribute | Mean
Attribute 1 | 2.30
Attribute 2 | 2.50
Attribute 3 | 2.50
Attribute 4 | 2.50
Attribute 5 | 1.70

The cover letters analyzed met the target (2.5) in each category except Attributes 1 and 5.
Comparison to previous assessment: Although this outcome was assessed during the 2011-12 evaluation cycle, comparison data for each attribute was not available due to different reporting methods.

Previous actions to improve SLO: Because there is no meaningful comparison data, there are no SLO actions to assess. Most recent results: Student work from the period demonstrates a strong understanding of the information typically provided in the standard cover letter. For Attribute 1: Adherence to business letter conventions, all but two of the students scored satisfactorily on conventional business letter form. However, an area of weakness was Attribute 5: Referencing an attachment. While referencing attachments is standard business letter format, this requirement was not emphasized in the instructions for the assignment which might explain why so many students omitted the resume reference. Current actions to improve SLO: As this is the first meaningful data collected on these attributes, the first action in response to this SLO will be to communicate this finding with faculty who teach English 116 regularly by the end of the Fall 2018 semester. Next Assessment: Fall 2019


Sample: During the Fall 2017 semester, there were two sections of English 116 online through the Woodbridge campus. Both sections were studied during this evaluation cycle; there were no other sections offered at the college.


Students will produce an edited document with application of copy and comprehensive editing.

Technical Editing ENG 205 Direct Measure: Examining assignments in English 205. The assignments analyzed for this report were from a unit called "Practical Application of Editing Skills." The first assignment was a quiz on terminology and writing style; the second assignment was a collaborative comprehensive editing project. The first measurement is a quantitative indicator of taxonomical mastery, which is a fair benchmark of a basic skill set necessary in professional writing. The second assignment requires students to synthesize a set of documents into a unified and purposeful professional document. Mastery of this outcome is a good indicator of this editorial task. Student work was assessed in the following categories:
● Attribute 1: Displays comprehensive editing
● Attribute 2: Addresses copy and comprehensive issues
● Attribute 3: Uniform in style and tone
● Attribute 4: Free of stylistic, mechanical, or grammar errors
For each of these categories, students were given a score of 1 (unsatisfactory), 2 (generally satisfactory), or 3 (effectively).
Sample: During Fall 2017, two sections of ENG 205 were offered. Both sections were online courses through Annandale.

Semester/year data collected: Fall 2017
Assessment results for the most recent cycle: During Fall 2017, two sections of ENG 205 were offered as online courses through the Annandale campus. There were 8 students enrolled in the two sections, all of whom earned credit for the course. All students completed both assignments.
The Quiz: Target is a 70% passing rate. The average score for both sections combined was 7.8/10.0, or 78%. The 78% average reflects two failed quizzes and 6 quizzes with near-perfect scores; the average is lowered considerably by the two failed quizzes.
Comprehensive Editing Project: Target = 2.5. This was a collaborative editing project, and two documents were assessed for this evaluation cycle. Both documents received passing scores of 99%. The average scores of the projects for each category were as follows:

Attribute | Mean
Attribute 1 | 1.0
Attribute 2 | 2.5
Attribute 3 | 3.0
Attribute 4 | 2.5

The assessment met the target (2.5) for a majority of the categories. Attribute 1 scored unsatisfactorily on both documents because students had not used "track changes" to show the edits they had made; therefore, it was impossible to discern whether the document was comprehensively edited.
Comparison to previous assessment: The Quiz: This outcome was assessed in 2014-15. For that assessment, the average quiz grade was 76%; the quiz grades for this assessment cycle are 2 percentage points higher. Comprehensive Editing Project: In 2014-15 the average class grade was 77%. The class grade for the current evaluation cycle is considerably higher. There was no data collected for Attributes 1-4, so comparison is impossible.

Previous actions to improve SLO: In 2014-15, when this SLO was last assessed, it was noted that ENG 205 was being revised to meet industry standard expectations. This is the first assessment of the revised course. Most recent results: Students are meeting or exceeding the targets in the majority of areas. These results show that students are successfully meeting the requirements for this SLO. Current actions to improve SLO: Study of this course shows that students exhibit strong performance in terms of both completion rates and grades on assignments. The low scores on Attribute 1 will be communicated to the faculty who regularly teach this course. An email will be sent by the end of the Fall 2018 semester. Next Assessment: Spring 2020

CLO: Critical Thinking

Writing for Business ENG 116

Semester/year data collected: Spring 2018 Target: 2.5

Previous actions to improve SLO: Not applicable; this outcome has not previously been studied.


Direct Measure: This outcome was analyzed during this evaluation cycle using data from the English 116 Writing for Business course. The research reports studied for this assessment were composed collaboratively in small groups. The assessment analyzed the achievement of the resulting reports with respect to the following categories:
● Attribute 1: Explanation of issues
● Attribute 2: Evidence
● Attribute 3: Influence of context and assumptions
● Attribute 4: Students' perspective or thesis
● Attribute 5: Conclusions
For each of these elements, students were given a score of 1 (unsatisfactory), 2 (generally satisfactory), or 3 (effectively).
Sample: During Spring 2018 there were two sections of English 116 offered. Both sections were affiliated with the Annandale campus and both were ELI courses.

Assessment results for the most recent cycle: There were 24 students enrolled between the two sections of the course; 22 students received credit for the course. Of the 24 enrolled students, 23 participated in the group report assignment. A total of 6 collaborative reports were assessed.
Group report: The assessment did not meet the target (2.5) for a majority of the categories. Five of the six submitted reports scored satisfactorily or effectively in almost all categories. However, skewing the results is the failure of one group to complete the assignment as directed. The average scores of the group reports for each category, with and without the outlier report, were as follows:

Attribute | Mean (all reports) | Mean (5 completed reports)
Attribute 1 | 2.5 | 2.8
Attribute 2 | 2.3 | 2.6
Attribute 3 | 1.8 | 2.0
Attribute 4 | 2.0 | 2.2
Attribute 5 | 2.0 | 2.2

Since these were group reports, there were six documents studied in total. Because one group's report did not achieve a passing score, the mean scores were greatly impacted. This impact resulted in only one category meeting the target (2.5) when all reports are included. When the data for these reports is considered without the outlier report included, the target is met for Attributes 1 and 2. Attributes 3, 4, and 5 score lowest regardless of whether the low-performing report is included in the data set.
Comparison to previous assessment: This outcome has not previously been assessed in the context of an Annual Planning and Evaluation Report on Student Learning Outcomes. Therefore, prior data is not available for comparison.

Most recent results: The main area of weakness is Attribute 3 wherein students discuss alternative positions and evaluate those positions. Attributes 4 and 5, which also score low, require students to account for the complexities of alternative positions and draw conclusions. Since these three attributes are linked, it would make sense that there is some consistency in the low scores. Current actions to improve SLO: This is the first time studying this outcome; therefore, the first action in response to this SLO will be to communicate this finding with faculty who teach English 116 regularly. Faculty with disciplinary expertise in this area, and who teach the course regularly, will be asked to consider how they might develop activities to support incorporation and evaluation of alternative viewpoints. These actions will be completed in the Fall 2018 semester. Next Assessment: Spring 2020

Program Goals | Evaluation Methods | Assessment Results | Use of Results

To continue to increase the number of program-placed students.

Data for program-placed students and graduates are supplied by the College's Office of Institutional Effectiveness and Student Success Fact Book for 2012-2013 through 2016-2017. The average from this period was compared to the average from the last time program placement and graduation were studied (for the 2014-15 APER).

Target: Increase the number of program-placed students.
Results: The Fall 2013 through Fall 2017 counts of program-placed students represent a five-year average placement of 19.0.

Year | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016 | Fall 2017
Program-Placed Students | 20 | 18 | 18 | 19 | 20
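For reference, the five-year average cited below is the simple mean of the placement counts in the table:

\[ \frac{20 + 18 + 18 + 19 + 20}{5} = \frac{95}{5} = 19.0 \]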

The 5-year average of 19.0 falls below the target of 22 set in 2014-15. However, it is slightly higher than the 5-year average from the last evaluation cycle, which was 18.8. It also represents a decrease in program-placed students compared to the years between 2010 and 2014, which were studied for the 2014-15 APER. The average for that period was 20.4 program-placed students.

Previous actions to improve the Program: When numbers of program-placed students were last examined, three specific areas were identified as needs to help the program grow:
● Better tracking of current and program-placed students.
● Stronger community advertising to attract strong students to the program.
● Integration of advisory board recommendations and program review findings to develop future programmatic vision and strategy.



Most recent improvements:
● An advisory board was formed and approved by the college in Spring 2017; the board has had three meetings: Fall 2017, Spring 2018, and Fall 2018.

● A new program chair (as of Sept 2018) has taken over leading the program through the formal Program Review, the APER process, and serves as executive secretary to the new advisory board.

● Rubrics for all SLOs were created for consistency in future SLO reporting.

● A welcome letter is being developed (to be finalized Fall 2018) to welcome students to the program.

Current actions to improve the program: To continue progress toward the stated needs, the new program lead will work on:
● Better tracking and advising of current program-placed students
● Stronger community advertising to attract a strong cohort of students
● Bringing the Advisory Board from 6 members to a maximum of 12
● Transferability between Professional Writing courses and George Mason University (ongoing)

These issues will be examined in more detail as part of the Program Review initiated in Fall 2018, and a new assessment schedule will be established by that evaluation. Assessed: Annually

Increase the number of potential graduates.

Data on program-placed students and graduates are supplied by the College's Office of Institutional Effectiveness and Student Success Fact Book for 2012-2013 through 2016-2017.

Target: 7. Results: The program has seen an increase in program-placed students; however, the number of graduates decreased in 2017. While the number is consistent with graduation trends of the last 5 years, graduation rates overall remain low compared to program placement.

                    Fall 2013   Fall 2014   Fall 2015   Fall 2016   Fall 2017
Program Graduates       3           3           2           6           4

Previous actions to improve graduation rates: When graduation rates were last examined, three specific areas were identified as needed to help the program grow:
● Better tracking of current and program-placed students
● Stronger community advertising to attract strong students to the program
● Integrate advisory board recommendations and program review findings to develop future programmatic vision and strategy


Thus, the average over this five-year period was 3.6 graduates per year. During the period previously studied, the average was 3.4. The program therefore currently shows a noticeable increase in its graduation average, but falls below the target of 7.

Most recent improvements:
● A new program chair has taken over leadership of the program through the formal Program Review and the APER process, and serves as executive secretary to the new advisory board.
● A Program Review is currently underway, which will examine placement and graduation trends. More information is expected by Fall 2019.

Current actions to improve the program: To continue progress toward the stated needs, the new program lead will work on:
● Better tracking and advising of current program-placed students so they reach graduation
● Using targeted surveys (from Program Review) to better understand and assess student trends in the program courses

These issues will be examined in more detail as part of the Program Review initiated in Fall 2018, and a new assessment schedule will be established by that evaluation. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Public History & Historic Preservation Career Studies Certificate

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: This curriculum is designed for persons seeking to develop research, analytical, and field skills in historic preservation, archaeology, and museum studies sufficient for the student to continue or to participate in local community-based projects.

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Students will analyze and assess museum exhibits and objects.

Survey of Museum Practice HIS 183 Direct Measure: Museum Exhibit Evaluation – Evaluation/Critique Paper. Each student was required to visit a museum exhibition and then produce a written evaluation of their experience based on museum and collection management best practices. Students needed to consider such questions as the exhibition's message/central theme, how the subject matter was presented (such as the use of objects, label copy, and space design), how the exhibit fits within the scope/mission of the museum, and how it relates to the audience. This SLO has been assessed across various courses, including HIS 183, HIS 186, and HIS 187, during the past few years. With the revised Curriculum Map, it will be assessed in HIS 180, HIS 183, and HIS 187 moving forward. Sample:

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
LO only                       1                           1                     13
ELI                         N/A                         N/A                    N/A
DE*                         N/A                         N/A                    N/A
*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 90% of students should receive a C or better. A student who has received a grade of “C” or better on this assignment has successfully completed this objective. Results:

Results by Campus/Modality:
    LO, Fall 2017: Average Score >80%, Percent > target 100%
    LO, Previous Assessment: Average Score >80%, Percent > target 100%
Results by SLO Criteria:
    Fall 2017: Average Score >80%, % of Students > target 100%
    Previous Assessment: Average Score >80%, % of Students > target 100%

Current results improved: [ X ] Yes [ ] No [ ] Partially
Thirteen students (100%) demonstrated competence with this learning outcome. In the future, the exact average score will be incorporated into the data collected.
Strengths by Criterion/Question/Topic: See below
Weaknesses by Criterion/Question/Topic: See below
The assessment demonstrates that most students are indeed practicing this outcome successfully, but that the program needs to examine whether this particular paper assignment is sufficient to achieve and maintain the target outcome. In the future, we may examine and parse our rubric results to better assess SLO performance, or expand the SLO through multiple projects.
• In the Fall 2012/Spring 2013 evaluation, this was assessed in HIS 183 and 12 out of 13 students demonstrated competence with this learning outcome (92%).

Previous action(s) to improve SLO: The program was revised before the Fall 2015 semester as a way to help streamline the learning process and produce more graduates. Since then, we have hired a full-time faculty member with a public history and historic preservation background to teach these classes (hired in August 2015), revised our SLOs (March 2017), improved the curriculum map (March 2017), updated course content summaries (July 2016 and August 2018), and stabilized the program. We have used the SLOs as a way to measure the program’s strengths and weaknesses, always looking for ways to improve, especially in regard to course content delivery (on campus, hybrid, and online). A challenge we have had is trying to recognize an appropriate sample size for the SLO assessment, using multiple years as a way to indicate whether the data collected is solid or not. We are also embarking on a guided pathway for transferability with the University of Mary Washington and their Historic Preservation program (agreement signed June 2018); however, a definitive timeline for that process has not been determined yet. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: The level of achievement was 100%, surpassing the target threshold of 90% and maintaining consistency with last year’s APER assessment. Please consult the previous years’ APER reports for prior assessment details and analysis. The data results are listed in the Assessment Results column.


• In the Fall 2013/Spring 2014 evaluation, this was assessed in HIS 183 and 6 out of 7 students demonstrated competence with this learning outcome (86%).

• In the Fall 2014/Spring 2015 evaluation, this was assessed in HIS 187 and 12 out of 13 students demonstrated competence with this learning outcome (92%).

• In the Fall 2015/Spring 2016 evaluation, this was assessed in HIS 186 and 9 out of 11 students demonstrated competence with this learning outcome (81.8%)

• In the Fall 2016/Spring 2017 evaluation, this was assessed in HIS 183 and 14 out of 14 students demonstrated competence with this learning outcome (100%), the highest percentage within the past few years.

Within the context of past evaluations across multiple course offerings, performance on this student learning objective has been consistent between 82% and 92%, hovering near the target of 90%. As the data showed, only one or two students did not meet the threshold of earning a "C" on this SLO, which caused the percentage to dip below our target percentage. Students who did not earn at least a "C" typically did not complete the related assignment.

Current actions to improve SLO based on the results: In order to help students more effectively practice this outcome, the instructor will provide students with examples of excellent exhibits through reading assignments and in-class discussions as a way to provide a guide for their own analyses. The goal is to begin implementing these examples in the Fall 2018 semester. Students also participated in question-and-answer sessions at the start of class, providing updates on their progress and listening to instructor suggestions. Next Assessment: This SLO will next be assessed during the 2018-19 academic year, when we will break down the results from our data collected for analysis.

Students will explain the role and function of preservation in society.

Introduction to Historic Preservation HIS 181 Direct Measure: Preservation Commission Visit and Paper. Students were asked to identify and visit a meeting of a local Historic Preservation or Architectural Planning Commission and then produce a series of discussion posts analyzing that commission's powers, jurisdiction and role in local preservation efforts. This was the first time this course was offered in a hybrid format. Students needed to answer the following questions related to this hybrid discussion assignment: what are the powers and jurisdiction of the commission, what role does the commission play in local preservation efforts, and what happened at the meeting? Their discussion board posts were assessed on whether they could successfully answer the questions. This SLO was last assessed during the 2016-17 academic year. Sample:

Semester/year data collected: Fall 2017 Target: A student who has received a grade of “C” or better on the analysis element of the rubric for this assignment has successfully completed this objective. 90% of students should demonstrate competence. Results:

Results by Campus/Modality:
    LO, Fall 2017: Average Score >80%, Percent > target 75%
    LO, Previous Assessment: Average Score >80%, Percent > target 100%
Results by SLO Criteria:
    Fall 2017: Average Score >80%, % of Students > target 75%
    Previous Assessment: Average Score >80%, % of Students > target 75%

Current results improved: [ ] Yes [ X ] No [ ] Partially
Nine out of 12 students demonstrated competence with the outcome (75%), a decrease from the 100% recorded in the Fall 2016/Spring 2017 evaluation period. In the future, the exact average score will be incorporated into the data collected.

Previous action(s) to improve SLO: The program was revised before the Fall 2015 semester as a way to help streamline the learning process and produce more graduates. Since then, we have hired a full-time faculty member with a public history and historic preservation background to teach these classes (hired in August 2015), revised our SLOs (March 2017), improved the curriculum map (March 2017), updated course content summaries (July 2016 and August 2018), and stabilized the program. We have used the SLOs as a way to measure the program’s strengths and weaknesses, always looking for ways to improve, especially in regard to course content delivery (on campus, hybrid, and online). A challenge we have had is trying to recognize an appropriate sample size for the SLO assessment, using multiple years as a way to indicate whether the data collected is solid or not. We are also embarking on a guided pathway for transferability with the University of Mary Washington and their Historic Preservation program (agreement signed June 2018); however, a definitive timeline for that process has not been determined yet.


Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
LO only                       1                           1                     12
ELI                         N/A                         N/A                    N/A
DE*                         N/A                         N/A                    N/A
*Dual-enrollment

Strengths by Criterion/Question/Topic: See below
Weaknesses by Criterion/Question/Topic: See below
These results show that most of the students did in fact demonstrate mastery of the outcome, but performance was still under the target goal of 90%. Lack of student participation in the assignment was the number one issue with regard to missing the target threshold.
• In the Fall 2011/Spring 2012 evaluation, this was assessed in HIS 181 and 11 out of 13 students demonstrated competence with this learning outcome (85%).

• In the Fall 2012/Spring 2013 evaluation, this was assessed in HIS 187 and 11 out of 13 students demonstrated competence with this learning outcome (85%).

• In the Fall 2014/Spring 2015 evaluation, this was assessed in HIS 181 and 6 out of 8 students demonstrated competence with this learning outcome (75%).

• In the Fall 2015/Spring 2016 evaluation, this was assessed in an online HIS 181 course and 7 out of 8 students demonstrated competence with this learning outcome (87.5%).

• In the Fall 2016/Spring 2017 evaluation, this was assessed in a HIS 181 course and 10 out of 10 ten students demonstrated competence with this learning outcome (100%)

As the data showed, on average only two students did not meet the threshold of earning a "C" on this SLO, which caused the percentage to dip below our target percentage. Students who did not earn at least a "C" typically did not complete the related assignment.

Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: The level of achievement was 75%, which meant that we did not achieve the goal of 90% and represented a decrease from the 100% recorded in the Fall 2016/Spring 2017 evaluation period. Please consult the previous years' APER reports for prior assessments. The data results are listed in the Assessment Results column. This evaluation was based on the hybrid Loudoun Reston Center version of HIS 181, which differed from the previous assessments conducted during in-person and online course offerings. Current actions to improve SLO based on the results: The program included the suggestions from the previous APER assessment, increased the number of possible preservation and architectural planning commissions that students may attend, and moved the due date of the project to later in the semester in order for students to have greater flexibility in fully participating in this project. The hybrid nature of the course also gave students more open evenings through the semester in which to attend these meetings in person. These changes should have helped students better achieve mastery of the outcome. Feedback for the hybrid course was positive, but revisions to make the course stronger will be discussed during the Fall 2018 semester for future implementation. The course might be offered in the Spring 2019 semester and, hopefully, these suggestions and changes can be implemented at that time. This will also give us a variety of course delivery data from the past few years (online, in-class, and hybrid) so that we can compare the results. Next Assessment: This SLO will next be assessed during the 2018-19 academic year, when we will break down the results from our data collected for analysis.


Students will explain the historical development of preservation law and its applications in local, state, and national government.

Introduction to Historic Preservation HIS 181 Direct Measure: VDHR PIF Assignment - Paper. Instructor-selected student groups visited a local site or building, approved by the instructor, and completed a Virginia Department of Historic Resources Preliminary Information Form (VDHR PIF), evaluating its suitability for inclusion in the National Register of Historic Places. During the course of developing the VDHR PIF, students were exposed to the history of the preservation movement at all three levels of government – local, state, and federal – and learned to navigate the nuances of preservation law and policy. To justify a potential site to the National Register, students must argue that it has historical importance and significance locally, regionally, or nationally, and understand that their approach must fit in with larger historical trends. Students also had the opportunity to work with local organizations and government agencies, such as Arlington County, Prince William County, and the Purcellville Historical Society. This SLO was assessed using the "VDHR PIF" assignment in the attached rubric, which explains the various tasks students had to complete in order to successfully complete the project. There were numerous due dates throughout the semester, helping students move along in the project and reach important milestones. Since this was a group project, it was challenging to determine the exact contributions of each member. Perhaps a reflection paper or a peer assessment might be included for next time. This SLO was last assessed during the 2016-17 academic year. Sample:

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
LO only                       1                           1                     12
ELI                         N/A                         N/A                    N/A
DE*                         N/A                         N/A                    N/A
*Dual-enrollment

Semester/year data collected: Fall 2017 Target: A student who has received a grade of “C” or better on the “VDHR PIF” element of the rubric for this assignment has successfully completed this objective. 90% of students should demonstrate competence. Results:

Results by Campus/Modality:
    LO, Fall 2017: Average Score >80%, Percent > target 100%
    LO, Previous Assessment: Average Score >80%, Percent > target 100%
Results by SLO Criteria:
    Fall 2017: Average Score >80%, % of Students > target 100%
    Previous Assessment: Average Score >80%, % of Students > target 100%

Twelve out of 12 students demonstrated competence with the outcome (100%). Since this was a group project, it was challenging to determine individual student participation; in the future, an assessment of individual students might be included. In upcoming SLO reports, the exact average score will be incorporated into the data collected.
Current results improved: [ X ] Yes [ ] No [ ] Partially
Strengths by Criterion/Question/Topic: See below
Weaknesses by Criterion/Question/Topic: See below
• In the Fall 2014/Spring 2015 evaluation, this was assessed in an online course of HIS 181 and 6 out of 8 students demonstrated competence with this learning outcome (75%).

• In the Fall 2016/2017 evaluation, this was assessed in an in-person course of HIS 181 and 10 out of 10 students demonstrated competence with this learning outcome.

As the data showed, only a handful of students throughout the various reporting periods did not meet the threshold of earning a "C" on this SLO, which caused the percentage to dip below our target percentage. Students who did not earn at least a "C" typically did not complete the related assignment.

Previous action(s) to improve SLO: The program was revised before the Fall 2015 semester as a way to help streamline the learning process and produce more graduates. Since then, we have hired a full-time faculty member with a public history and historic preservation background to teach these classes (hired in August 2015), revised our SLOs (March 2017), improved the curriculum map (March 2017), updated course content summaries (July 2016 and August 2018), and stabilized the program. We have used the SLOs as a way to measure the program’s strengths and weaknesses, always looking for ways to improve, especially in regard to course content delivery (on campus, hybrid, and online). A challenge we have had is trying to recognize an appropriate sample size for the SLO assessment, using multiple years as a way to indicate whether the data collected is solid or not. We are also embarking on a guided pathway for transferability with the University of Mary Washington and their Historic Preservation program (agreement signed June 2018); however, a definitive timeline for that process has not been determined yet. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: The level of achievement was 100%, successfully achieving the goal of 90% and sustaining our level from our Fall 2016 assessment. Again, a larger sample size of data is required in order to adequately assess whether this particular SLO has plateaued at this current level of achievement. This evaluation was based on the hybrid campus version of HIS 181, which was different than the previous assessments during in-person and online course offerings. Please consult the previous years’ APER reports for prior assessment. The data results are listed in the Assessment Results column. Current actions to improve SLO based on the results: For Fall 2017, we offered a hybrid


version of this course at our Loudoun Reston Center, a first for the program and an opportunity to reach out to a potentially different student population. The hybrid course gave students a chance to give in-person presentations of their VDHR PIF projects and allowed for more direct instructor feedback to help students master the outcome. The goal is to strike a balance between the SLO as assessed in the online course and in the campus course, trying to determine which learning mode might suit the students best. The flexibility of the hybrid course schedule gave students more time to research their historic site and allowed for greater instructor/student in-person progress reports. The online course lacked those elements, while the on-campus course made it challenging for working students to spend enough time researching and writing their finalized VDHR PIFs. Next Assessment: This SLO will next be assessed during the 2018-19 academic year, when we will break down the results from our data collected for analysis.

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: [ X ] CT

Historical Archaeology HIS 180 Direct Measure: Historic Documents Research Project – Research Paper Report. Historical archaeology is dependent on archives and archival research. Students will learn the use and utility of diverse archival document types and their purpose. Students will attempt to use information from these sources to investigate historic topics from Loudoun County designed by the adjunct instructor for this class. This requires some out-of-class work at the Balch Library, the County Municipal Records office, and elsewhere. Students will write a professional-style Preliminary Research Report on all archives consulted and their possible usefulness related to their topic. The goal of the assignment is the process of historical archaeological research and not necessarily the end results. Students needed to demonstrate their competency in understanding how archives and research libraries operated and the material those repositories held. This particular assignment, along with the entire HIS 180 – Historical Archaeology course, is being reworked by a new adjunct instructor for the Fall 2018 semester.

Semester/year data collected: Spring 2018 Target: A student who has received a grade of “C” or better on this assignment has successfully completed this objective. 90% of students should receive a C or better. Results:

Results by Campus/Modality:
    LO, Spring 2018: Average Score 90%, Percent > target: 11
Results by CLO Criteria:
    Spring 2018: Average Score 90%, % of Students > target: 11

Ten out of thirteen students demonstrated competence with this learning outcome on this particular assignment (77%).

Previous action(s) to improve CLO: The program was revised before the Fall 2015 semester as a way to help streamline the learning process and produce more graduates. Since then, we have hired a full-time faculty member with a public history and historic preservation background to teach these classes (hired in August 2015), revised our SLOs (March 2017), improved the curriculum map (March 2017), updated course content summaries (July 2016 and August 2018), and stabilized the program. We have used the SLOs as a way to measure the program’s strengths and weaknesses, always looking for ways to improve, especially in regard to course content delivery (on campus, hybrid, and online). A challenge we have had is trying to recognize an appropriate sample size for the SLO assessment, using multiple years as a way to indicate whether the data collected is solid or not. We are also embarking on a


Sample:

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
LO only                       1                           1                     13
ELI                         N/A                         N/A                    N/A
DE*                         N/A                         N/A                    N/A
*Dual-enrollment
Data was collected from one section of HIS 180 on the Loudoun Campus, which was the only section of the class offered this academic year.

As the data showed, only a handful of students did not meet the threshold of earning a "C" on this SLO, which caused the percentage to dip below our target percentage. Students who did not earn at least a "C" typically did not complete the related assignment. Current results improved: N/A - Since this was the first time we assessed a CLO, we have no measurements from past results. This particular assignment, along with the entire HIS 180 – Historical Archaeology course, is being reworked by a new adjunct instructor for the Fall 2018 semester.

guided pathway for transferability with the University of Mary Washington and their Historic Preservation program (agreement signed June 2018); however, a definitive timeline for that process has not been determined yet. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement and current actions to improve CLO based on the results: This was the first time that this Core Learning Outcome, Critical Thinking, was assessed in our APER report. Ten students were able to successfully complete this objective, demonstrating their critical thinking skills with the research paper, as well as through the other assignments in the course. This particular course is being redesigned, and a new adjunct instructor is scheduled to teach it going forward (Fall 2018). All of that should help us reach our target goal of 90%, because the redesigned course can factor this Core Learning Outcome into the development process for better assessment going forward. Next Assessment: This Core Learning Outcome could be assessed during the 2018-19 academic year, when we will break down the results from our data collected for analysis, as well as select a different course for evaluation.

Program Goals Evaluation Methods Assessment Results Use of Results

Increase Program Enrollment

Short description of method(s) and/or source of data: Number of students enrolled in program courses.

Target: To increase the number of students enrolling in core classes for the program. The program has succeeded in achieving modest enrollment increases. Results for Past 5 years:

Fall          Number of Students                              Percentage Increased
2012-2013     23
2014-2015     40                                              74%
2015-2016     47                                              18%
2016-2017     47 (but 67 students have taken courses)         0%
2017-2018     65                                              36%
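The percentage column follows the standard year-over-year formula. As a minimal illustration, the Python sketch below applies that formula to the earlier rows of the table above (the function name is illustrative, not part of the report's methodology).

```python
# Year-over-year percentage increase in program enrollment,
# computed from consecutive headcounts in the table above.
def percent_increase(previous: int, current: int) -> float:
    """Percentage change from the previous count to the current count."""
    return (current - previous) / previous * 100

print(round(percent_increase(23, 40)))  # 74 (23 -> 40 students)
print(round(percent_increase(40, 47)))  # 18 (40 -> 47 students)
print(round(percent_increase(47, 47)))  # 0  (no change)
```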

Previous action(s) to improve program goal: The program was revised before the Fall 2015 semester as a way to help streamline the learning process and produce more graduates. Since then, we have hired a full-time faculty member with a public history and historic preservation background to teach these classes (hired in August 2015), revised our SLOs (March 2017), improved the curriculum map (March 2017), updated course content summaries (July 2016 and August 2018), and stabilized the program. We have used the SLOs as a way to measure the program’s strengths and weaknesses, always looking for


Target Met: [ X ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): The program continues to encourage students to enroll in the certificate program and complete it as soon as they possibly can.

ways to improve, especially in regard to course content delivery (on campus, hybrid, and online). A challenge we have had is trying to recognize an appropriate sample size for the SLO assessment, using multiple years as a way to indicate whether the data collected is solid or not. We are also embarking on a guided pathway for transferability with the University of Mary Washington and their Historic Preservation program (agreement signed June 2018); however, a definitive timeline for that process has not been determined yet. Results improved: [ X ] Yes [ ] No [ ] Partially Current action(s) to improve program goal:
(1) Distribute marketing pamphlets to local historical sites and associations.
(2) Program directors will visit local historical institutions and organizations to make them aware of the program.
(3) Hold activities on campus and coordinate with the campus History Club.
(4) Continue to build partnerships with local, state, and national organizations to help promote the program.
(5) Participate in college-wide career development fairs.
(6) Develop showcase events to highlight student work.
(7) Create a larger social media presence to promote classes, events, and student achievements.

Some of these goals have been phased in since Fall 2017 with varying success and continue to be revised and reworked. Assessed: Annually

To encourage students to complete their career studies certificate

Short description of method(s) and/or source of data: Number of students graduating during the 2017 -2018 academic year.

Target: To maintain and increase the graduation total annually. This target has been met, but it will need to be continuously watched and is tied to increased program participation: the more students we enroll in the program, the more graduates we expect. Results for Past 5 Years:

Previous action to improve program goal: The program was revised before the Fall 2015 semester as a way to help streamline the learning process and produce more graduates. Since then, we have hired a full-time faculty member with a public history and historic preservation background to teach these


Academic Year   Number of Graduates   Percentage Increased
2013-2014               3
2014-2015               3                      0%
2015-2016               5                     66%
2016-2017               6                     20%
2017-2018               8                     33%

Target Met: [ X ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): The program continues to encourage students to enroll in the certificate program and complete as soon as they possibly can.

classes (hired in August 2015), revised our SLOs (March 2017), improved the curriculum map (March 2017), updated course content summaries (July 2016 and August 2018), and stabilized the program. We have used the SLOs as a way to measure the program’s strengths and weaknesses, always looking for ways to improve, especially in regard to course content delivery (on campus, hybrid, and online). A challenge we have had is trying to recognize an appropriate sample size for the SLO assessment, using multiple years as a way to indicate whether the data collected is solid or not. We are also embarking on a guided pathway for transferability with the University of Mary Washington and their Historic Preservation program (agreement signed June 2018); however, a definitive timeline for that process has not been determined yet. Results improved: [ X ] Yes [ ] No [ ] Partially Current actions to improve program goal: (1) Better and timelier communication with

students to encourage them to complete the certificate, including contact through our blog, Facebook page, and email.

(2) Make students aware of volunteer, employment, and academic options upon completion of the certificate.

(3) Increase the number of activities, such as speakers and local field trips that will further engage students with the profession.

(4) Continue to foster career development relationships with local institutions and organizations.

All of these measures will continue to be implemented and improved upon during the 2018-19 academic year and can be assessed during the next cycle. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Radiography, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: The curriculum is designed to prepare students to produce diagnostic images of the human body through safe application of x-radiation. The radiographer is a central member of the health care team and assists the radiologist, a physician specialized in interpreting images of the body. This program emphasizes "hands-on" practice of instructional methods in a state-of-the-art laboratory at the Medical Education Campus in Springfield followed by clinical experience at various affiliating health care organizations. Upon successful completion of degree requirements, the student will be eligible to take the American Registry of Radiologic Technologists examination leading to certification as a Registered Technologist in Radiography: A.R.R.T. (R).

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Evaluate images for diagnostic information

Principles of Radiographic Quality II RAD 142 Direct Measure: Specific assessments related to this SLO:
• RAD 142 Lecture Week 5: Chapter 30, Carlton, 5th Ed - The Art of Image Critique
• Assessment tool: RAD 142 Quiz 5: Chapter 30 - The Art of Image Critique, Carlton, 5th Ed
The quiz and its statistics are available through Blackboard. Assessment: RAD 142 requires weekly quizzes that assess the student's comprehension of subject matter delivered in interactive live classroom sessions, reading assignments, and practical laboratory exercises. The weekly quiz content is drawn from a pool of test items used as a random block, containing a wide variety of relevant questions and radiographic images that are available on Blackboard (RAD 142, Spring 2017). Students are provided up to 3 attempts at weekly quizzes in order to research their wrong answers and master the content in subsequent assessment attempts. The grade for the assessment is the average grade of all attempts. Sample: 39 First-Year Students: RAD 142 (1 Section) – Full enrollment

Semester/year data collected: Cohort Students: Fall 2018. Target: 60% of students score 90% or higher on the assessment. Results: Blackboard Grade Center does not provide an analysis of random blocks on a question-by-question basis. The performance analysis provided by Blackboard is limited to the cohort's (39 students) success on multiple (156) attempts at the entire assessment. Two students did not take Quiz 5 and were assigned grades of zero (0), which subsequently skewed the results downward. Result: 39 Students, 156 Attempts:

Statistics (Points Possible = 15):
Count: 39; Minimum Value: 0.00; Maximum Value: 15.00; Range: 15.00; Average: 12.38; Median: 13.20; Standard Deviation: 3.27; Variance: 10.70

Grade Distribution:
90-100: 19; 80-89: 12; 70-79: 3; 60-69: 1; 50-59: 1; 40-49: 1; 30-39: 0; 20-29: 0; 10-19: 1; 0-9: 2

Spring 2017 – New unit of instruction implemented in Fall 2016 by faculty. No previous results for comparison. RAD 142-Quiz 5: Chapter 30 Result: 48.7% of students scored 90% or higher on assessment. Previous Target: 60% of students score 90% or higher on assessment.
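As a check on the 48.7% figure, the minimal Python sketch below recomputes the share of the 39-student cohort that falls in the 90-100 band of the grade distribution above (the dictionary keys are illustrative labels for the reported bins).

```python
# Share of the RAD 142 cohort scoring 90% or higher on Quiz 5,
# taken from the grade distribution reported above (19 of 39 students).
grade_distribution = {"90-100": 19, "80-89": 12, "70-79": 3, "60-69": 1, "50-59": 1,
                      "40-49": 1, "30-39": 0, "20-29": 0, "10-19": 1, "0-9": 2}
cohort_size = 39

share_at_or_above_90 = grade_distribution["90-100"] / cohort_size * 100
print(f"{share_at_or_above_90:.1f}% scored 90% or higher")  # 48.7%, below the 60% target
```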

Previous actions to improve SLO: Implemented in 2016-17: Image Analysis Sessions have historically been included as end-of-semester exercises for all Clinical Courses: RAD 196, RAD 131-135, and RAD 231-232; radiographic images (performed by the student or selected by the instructor) are presented by each student, then critiqued and evaluated in interactive group sessions led by the instructor. Fall 2016: Faculty reviewed quiz results and a new unit of instruction was added to RAD 142 - Principles of Radiographic Quality II for Spring 2017: "The Art of Image Critique." Faculty also changed the students' textbook in Spring 2018 for an edition focused on digital radiography. Most Recent Results: The most recent results are described in detail in the Assessment Results column. With 52.90% of students scoring 90% or higher, there is room for academic improvement. Future actions - Fall semester 2019:
1. Expand the assessment tool (quiz) to a total of thirty (30) test items by including 10-15 additional diagnostic images and non-diagnostic images as assessment (quiz) items.

2. Expand the time limit of the assessment to 3 minutes per question from the current 1.5 minutes per question in the Blackboard quiz session, allowing students more thoughtful analysis of the image(s) being displayed.

3. Provide two (2) additional forms of assessment (quiz) items, to include: 1) "Hot Spot" and 2) "Fill in the Blank." Currently, only "Multiple Choice" questions are used in the RAD 142 Assessment, Quiz 5: Chapter 30 - "The Art of Image Critique."


Future Target: 70% of students score 90% or higher on the assessment. Strengths: Students have a better understanding of radiographic physics, including the major components of a digital radiography system. Weaknesses: Students need more assessments to better analyze the latent radiographic image and identify the variables affecting it.

4. Incorporate image evaluation into RAD 141 Principles of Radiographic Quality II, RAD 121 Radiographic Procedures I and RAD 221 Radiographic Procedures II. Faculty believe that incorporating image evaluation earlier in the program will aid in student understanding and success.

5. New textbook required the RAD 141 and RAD 142 course material to be updated along with all exams and quizzes.

These changes will be reviewed at the next Radiography Advisory Board meeting in April 2018. Next Assessment: Fall 2019

Provide patient care essential to radiological science

Patient Care Procedures RAD 125 Direct Measure: Specific assessments related to this SLO:
1. Quiz 3 – Body Mechanics. Questions 1, 3, 4, 5, 6, 10, 13, 16, 20, 21. Questions centered on correct methods of patient handling and transfer.
2. Quiz 6 – Medical Emergencies. Questions 2, 4, 5, 7, 8, 9, 10, 11, 12, 13. Questions centered on methods of handling patients in acute and emergency situations.
Quizzes and statistics are available through Blackboard.
Assessment Rubric: 92.50-100 A; 83.50-92.49 B; 74.50-83.49 C; 69.50-74.49 D; 00.00-69.49 F
Sample: 39 first-year students in RAD 125 (total enrollment)

Semester/year data collected: Cohort students: Fall 2018 Target: 87% of students score 90% or higher on each assessment. Quiz 3 Body Mechanics:

Question   % Correct   % Incorrect
1          90          10
3          90          10
4          100         0
5          62.5        37.5
6          80          20
10         92.5        7.5
13         85          15
16         80          20
20         47.5        52.5
21         75          25

Question   % Correct Change from 2017-2018
1          10% Increase
3          5% Decrease
4          30% Increase
5          35% Decrease
6          60% Increase
10         38% Increase
13         15% Increase
16         4% Decrease
20         40% Decrease
21         12% Decrease

Current results improved: [ ] Yes [ ] No [ X ] Partially

Previous actions to improve SLO: Implemented in 2015-16:
1. Scenario questions to improve critical thinking skills and their application to the clinical environment.
2. Class/Blackboard discussion boards to address critical thinking skills.
3. Additional homework assignments to better understand the content of the class.
4. Additional study and practical applications on safety, transfer, and patient positioning. Special emphasis was put on sliding board transfer, immobilization techniques, and the base of support for lifting heavy objects.

Most recent results: Quiz 3: Body Mechanics: Five questions decreased in the percentage of correct answers. Two of these questions showed a larger decrease in the percentage of correct scores compared with the 2016-17 report; these two questions addressed patient restraints and cast removal. Quiz 6: Medical Emergencies: Nine questions decreased in the percentage of correct answers. One of these questions showed a larger decrease in the percentage of correct scores compared with the 2016-17 report; this question addressed patient emergency response to fainting. Spring 2017: Faculty reviewed and adjusted instruction to include more didactic, practical, and preclinical activities in the following areas:
1. Practical applications on safety transfers and positioning
2. Sliding board transfers


Weaknesses: Students need more assessments and clinical scenarios to demonstrate correct patient transfer methods.

Quiz 6 Medical Emergencies:
Question   % Correct   % Incorrect
2          92.5        7.5
4          92.5        7.5
5          92.5        7.5
7          95          5
8          90          10
9          85          15
10         87.5        12.5
11         87.5        12.5
12         92.5        7.5
13         72.5        27.5

Question   % Correct Change from Fall 2017
2          2% Decrease
4          5% Increase
5          0% No change
7          6% Increase
8          8% Increase
9          1% Increase
10         10% Decrease
11         30% Increase
12         5% Increase
13         3% Increase

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths: Students have improved in their ability to describe and demonstrate methods for handling patients in acute and emergency situations. Result: Quiz 3 Body Mechanics:

Cohort     Fall 2018   Fall 2016   Fall 2015   Fall 2013
Average    80.25%      79.67%      81.0%       85.7%

Quiz 6 Medical Emergencies:

Cohort     Fall 2018   Fall 2016   Fall 2015   Fall 2013
Average    88.25%      87.8%       96.31%      88.0%

Previous 2016 Target: 87% of students score 90% or higher on each assessment. Previous 2015 Target: 80% of students score 90% or higher on each assessment.

3. Immobilization techniques
4. Signs and symptoms of shock
5. Emergency response steps
6. Signs of hypoglycemia
The above revisions are intended to address the most recent results from the body mechanics and medical emergencies quizzes.
• Quiz 3: Body Mechanics - Target for body mechanics was not met.
• Quiz 6: Medical Emergencies - Target for medical emergencies was met.
Future actions: Faculty are revising RAD 125 to incorporate simulation labs. These changes will be reviewed during the next faculty meeting (Spring 2019). Faculty will continue to research ways of improvement and discuss them at the next meeting (Spring 2019). Revisions will be implemented with Radiography Advisory Board permission in Spring 2019. Next Assessment: Fall 2019


Previous 2013 Target: 80% of students score 90% or higher on each assessment.
Future Target: RAD 125 - Quiz 3 Body Mechanics: 87% or higher; RAD 125 - Quiz 6 Medical Emergencies: 90% or higher

Communicate effectively with patients, families and other health care providers

Advanced Clinical Procedures II RAD 232 Direct Measure: Specific assessments related to this SLO: Professional Development Evaluation, items #6, 7, and 8, completed by clinical faculty on each student. Tool found at www.Trajecsys.com; 1-2 clinical instructors evaluate each student.
Evaluation Rubric:
• 97.25: Outstanding
• 91.5: Good
• 87.25: Acceptable
• 79.5: Significant improvement by the student is needed
• 50: Remediation by NOVA faculty is required
Definition of items assessed:
6. Basic Communication: Understands and is understood. Uses correct communication protocols at the clinical site.
7. Interpersonal Relationships: Interacts effectively with other students, technologists, other clinical employees, and patients.
8. Patient Rapport: Interacts well with patients.
Attachments: Professional Development Evaluation RAD 232 (2016)
Sample: Of 34 second-year students in RAD 232 (1 section): 10-student sample

Semester/year data collected: Cohort students: Spring 2018 Average of scores for Items 6, 7, 8:

Student   Item #6   Item #7   Item #8
1         95.61     97.25     97.25
2         92.96     94.18     92.96
3         94.79     95.61     93.96
4         89.68     91.71     90.89
5         94.79     90.29     93.36
6         96.43     97.25     97.25
7         93.96     93.96     97.25
8         97.25     97.25     97.25
9         97.25     96.43     97.25
10        97.25     96.43     97.25

Average scores 2018: Item #6: 95.00; Item #7: 95.03; Item #8: 95.46
Average scores 2016: Item #6: 95.52; Item #7: 95.87; Item #8: 95.82
Average scores 2015: Item #6: 95.54; Item #7: 95.64; Item #8: 95.95

Spring 2014: Only one item (#6) was measured: Professional Development Evaluations – 10 students, Item #6 Basic Communication. Average score: 95.60
Result:
• Spring 2016: Average score for communication SLO: 95.73
• Spring 2015: Average score for communication SLO: 95.71
• Spring 2014: Average score for communication SLO: 95.60
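The per-item averages reported above follow directly from the ten-student table; the minimal Python sketch below recomputes them (the list names are illustrative, and small differences from the reported values reflect rounding of the source figures).

```python
# Average Professional Development Evaluation scores for Items 6-8,
# computed from the ten sampled RAD 232 students listed above.
item_6 = [95.61, 92.96, 94.79, 89.68, 94.79, 96.43, 93.96, 97.25, 97.25, 97.25]
item_7 = [97.25, 94.18, 95.61, 91.71, 90.29, 97.25, 93.96, 97.25, 96.43, 96.43]
item_8 = [97.25, 92.96, 93.96, 90.89, 93.36, 97.25, 97.25, 97.25, 97.25, 97.25]

for name, scores in [("Item #6", item_6), ("Item #7", item_7), ("Item #8", item_8)]:
    print(f"{name} average: {sum(scores) / len(scores):.2f}")
# Prints 95.00, 95.04, and 95.47; the report lists 95.00, 95.03, and 95.46,
# with the small gaps attributable to rounding conventions.
```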

Previous actions to improve SLO: Implemented in 2017-18: RAD 196 pre-clinical activities were revised to include practical activities with an emphasis on verbal communication in the clinical environment. Most recent results: The results for this SLO show that the majority of students in the program demonstrate good communication skills:
1. Faculty implemented additional lecture and practical assessments in RAD 125 Patient Care. Practical assessments included labs on communication with different population groups: cultural diversity, family, and group communication.
2. In Fall 2017, faculty implemented additional simulation labs to increase students' comprehension of interpersonal relationships and patient rapport.

These changes were initiated by the faculty and approved by the Radiography Advisory Board in Spring 2018. Target Met: [ X ] Yes [ ] No [ ] Partially Future action - Fall 2019: The following adjustments to the delivery of instruction and assessment are planned for the future:
1. Incorporate additional practical assessments on communication in RAD 121 Radiographic Procedures I and RAD 221 Radiographic Procedures II. Adding a verbal component to additional non-clinical courses will better prepare the students for communication in clinical rotations.
2. Develop labs to address non-verbal communication and body language in RAD 196 pre-clinical activities.

Faculty will meet in February 2019 to develop simulations and labs for RAD 121, RAD 221 and RAD 196. Revisions will be implemented with Radiography Advisory Board permission (Spring 2019). Next Assessment: 2018-19


Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths: Additional simulation labs have increased students' use of effective communication methods. Weaknesses: Additional simulation labs are needed to address non-verbal communication. Current and Future Target: RAD 232: 90% of students average a score of 97.00 or higher for communicating effectively with patients, families, and other health care providers.

CLO: Apply Knowledge of anatomy and positioning, and radiographic techniques to accurately image anatomical structures [ X ] CT

On Site Training RAD 196 Direct Measure: Specific assessment related to this SLO: Radiographic Image Analysis. See attachments: Image Analysis Tool.
Grading Scale:
A: 94.50 - 100.00
B: 89.50 - 94.499
C: 84.50 - 89.499
D: 74.50 - 84.499
F: Below 74.499
American Registry of Radiologic Technologists (ARRT) national certification examination. Category: Imaging Procedures. See Attachments: ARRT Program Summary; ARRT National Comparison.
Sample: 39 first-year students in RAD 196 (4 sections) - total enrollment

Semester/year data collected: Cohort students: Fall 2018 Target: 80% of students score 90% or higher on assessment Results: Blackboard Grade Center does not provide an analysis of random blocks on a question-by-question basis. Students presented images they acquired in clinical rotations. Some students presented radiographic exams that were 2 projections, 3 projections, 4 projections or 5 projections. This creates a random block in the Blackboard Grade Center:

Student   Average     Student   Average
1         100         11        100
2         100         12        100
3         100         13        94.40
4         100         14        85.71
5         100         15        93.33
6         95          16        90.48
7         100         17        86.66
8         100         18        100
9         95          19        97.92
10        90          20        93.33

Student   Average     Student   Average
21        93.34       31        100
22        90.48       32        93.33
23        100         33        93.33
24        96.67       34        100
25        92.31       35        96.67
26        92.31       36        95.24
27        97.43       37        95.23
28        96.67       38        86.67
29        100         39        100
30        100
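As a rough cross-check on the cohort-level figure reported in the next column, the Python sketch below averages the 39 per-student scores above; it lands near the reported Fall 2018 image analysis average of 95.92, with the small gap attributable to rounding of the per-student values.

```python
# Mean of the 39 per-student image analysis averages listed above (RAD 196, Fall 2018).
scores = [100, 100, 100, 100, 100, 95, 100, 100, 95, 90,
          100, 100, 94.40, 85.71, 93.33, 90.48, 86.66, 100, 97.92, 93.33,
          93.34, 90.48, 100, 96.67, 92.31, 92.31, 97.43, 96.67, 100, 100,
          100, 93.33, 93.33, 100, 96.67, 95.24, 95.23, 86.67, 100]

cohort_mean = sum(scores) / len(scores)
print(f"Cohort image analysis mean: {cohort_mean:.2f}")  # about 95.94 vs. the reported 95.92
```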

Previous actions to improve SLO: Implemented in 2016-17: In Spring 2017, Image Analysis was incorporated into RAD 221 Radiographic Procedures II, a non-clinical course. Lab assignments that include image analysis were created for each anatomy and positioning lecture topic. Faculty confirmed that students demonstrated increased knowledge of anatomy and image quality through this assignment. Clinical instructors also noticed improved knowledge of anatomy and image quality during Summer 2017 clinical rotations. Most recent results: The Image Analysis tool was updated to reflect changes from Computed Radiography to Digital or Direct Radiography Imaging X-ray machines. This update was initiated by faculty and approved by the Radiography Advisory Board in Fall 2014. Current results improved: [ X ] Yes [ ] No [ ] Partially Future action: Current scores indicate a drop in student understanding of radiographic anatomy and image quality. Faculty will meet in Spring 2019 to re-evaluate the assessment tool to ensure that it is keeping up with the current changes in Digital Radiography. These changes will be reviewed at the next Radiography Advisory Board meeting in May 2019. ARRT results indicate the program has dropped one point from the prior year. We exceeded the national average in this category. This goal will be monitored again next year. Future Target: 9.0 or higher for all students. Next Assessment: 2018-19


Result:
• Fall 2018 RAD 196 Image Analysis Average: 95.92
• Fall 2016 RAD 196 Image Analysis Average: 97.67
• Fall 2015 RAD 196 Image Analysis Average: 97.19
• Fall 2013 RAD 196 Image Analysis Average: 95.99
Previous Target: 80% of students score 90% or higher on the assessment; this target was met.
Future Target: 95% of students score 97% or higher on the assessment.
ARRT Results:
• 2017-2018: 8.3 NOVA / 8.3 national results (33 students)
• 2016-2017: 8.4 NOVA / 8.4 national results (40 students)
• 2015-2016: 8.4 NOVA / 8.5 national results (37 students)
• 2013-2014: 8.4 NOVA / 8.5 national results (38 students)
• 2011-2012: 8.6 NOVA / 8.5 national results (34 students)
• 2010-2011: 8.6 NOVA / 8.5 national results (50 students)
• 2009-2010: 8.2 NOVA / 8.5 national results (45 students)
ARRT results dropped by one point for 2016-17. Results included scores from two previous students from 2015-16 who retook the certification exam. Strengths: Students are able to correctly identify topographic anatomy of the skeletal system, cranial bones, facial bones, urinary system, and gastrointestinal system. Weaknesses: More image analysis is needed.

Program Goals Evaluation Methods Assessment Results Use of Results

Increase retention rates between first and second year in the radiography program

Retention Rates: SIS Class Rosters:
• Fall 2016: RAD 121
• Spring 2017: RAD 205
Class rosters are available through Blackboard:
• RAD 121 (001H), Fall 2016
• RAD 205 (001H), Fall 2017

• 2017-2018: 82.5% retention / 17.5% attrition between first- and second-year students
• 2016-2017: 82% retention / 18% attrition between first- and second-year students
• 2015-2016: 95% retention / 5% attrition between first- and second-year students
• 2014-2015: 84% retention / 16% attrition between first- and second-year students
• 2013-2014: 95% retention / 5% attrition between first- and second-year students

Previous actions to improve Program Goal 1: Implemented in 2015-2016:
1. TEAS Examination: The class entering Fall 2015 sat for the TEAS Exam. The purpose was to evaluate student skills and, if necessary, refer students to remedial classes prior to program placement. This change was initiated by faculty and approved by the Radiography Advisory Board in Spring 2015.


• 2012-2013: 65% retention / 35% attrition between first- and second-year students
• 2011-2012: 80% retention / 20% attrition between first- and second-year students
• 2010-2011: 95% retention / 5% attrition between first- and second-year students
Result: The Faculty and Radiography Advisory Board believe that all of the previous actions to improve retention resulted in an increased retention rate for the academic year 2015-16.
Future target: 2019: 90% retention / 10% attrition between first- and second-year students.
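Retention and attrition are complementary percentages of the entering cohort. The Python sketch below shows the arithmetic with hypothetical roster counts; the 40/33 figures are illustrative only and are not taken from the SIS rosters cited above.

```python
# Retention and attrition between first and second year, from roster headcounts.
# The counts below are hypothetical examples; actual counts come from SIS class rosters.
def retention_and_attrition(entering: int, returning: int) -> tuple[float, float]:
    """Return (retention %, attrition %) for a cohort."""
    retention = returning / entering * 100
    return retention, 100 - retention

retention, attrition = retention_and_attrition(entering=40, returning=33)
print(f"Retention: {retention:.1f}%  Attrition: {attrition:.1f}%")  # 82.5% / 17.5%
```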

2. Student Referral Reports: Faculty refer students to a student success counselor if there is any indication of a learning, personal or financial issue that could interfere with their progress in the program. This change was initiated by faculty and approved by the Radiography Advisory Board in Spring 2016.

3. Clinical Counseling Reports: Clinical instructors at clinical sites complete counseling forms on students who are performing poorly or appear to be troubled. This change was initiated by faculty and approved by the Radiography Advisory Board in Spring 2016

4. Tutor referral: Initiated Fall 2015 - High risk students were referred by the RAD faculty to NOVA tutoring center. This change was initiated by faculty and approved by the Radiography Advisory Board in Fall 2015.

The retention rate decreased in the 2016-17 cohorts. A review of the students leaving indicated that 3 students left for academic reasons, 1 student left for health reasons, and 3 students left for clinical behavioral issues. Actions to improve retention: 1. TEAS Examination: The TEAS Examination was

implemented in Fall 2015. The purpose was to evaluate student skills and if necessary refer to remedial classes prior to program placement. The goal is to produce a more successful student. The faculty will compare TEAS exam scores with students who left for academic reasons and reassess the acceptable score limits during Fall 2019. This change was initiated by faculty and approved by the Radiography Advisory Board in Spring 2017.

2. Student Referral Reports: Faculty refer students to a student success counselor if there is any indication of a learning, personal or financial issue that could interfere with their progress in the program. Student services will access the results of the student referral and gauge if additional resources are needed. Faculty reviewed this process and found that many of the students referred never responded or set up appointments to meet with the success counselors. Referral appointments are now mandatory, and students must show a counselor signature as proof of referral follow-up. This change was initiated by faculty and

Page 360: Annual Planning and Evaluation Report Instructional Programs … · report; if there is a question about an evaluation method, please contact the instructional program or OIESS)

350

Radiography, A.A.S. approved by the Radiography Advisory Board in Spring 2017.

3. Clinical Counseling Reports: Clinical instructors at clinical sites complete counseling forms on students who are performing poorly or appear to be troubled. These forms are submitted to the NOVA faculty for review and counseling of the students. Clinical instructors are now documenting all failed clinical competencies, so faculty can quickly address clinical deficiencies. Clinical counseling reports have been updated to report improvement recommendations. This change was initiated by faculty and approved by the Radiography Advisory Board in Spring 2017

4. Tutor referral: High risk students were referred by the RAD faculty to NOVA tutoring center. This change was initiated by faculty and approved by the Radiography Advisory Board in Spring 2017.

5. RAD 196 Pre-Clinical Activities: Additional instruction and labs have been incorporated into RAD 196 pre-clinical activities to address behavioral issues that may occur during clinical rotations. This change was initiated by faculty and approved by the Radiography Advisory Board in Spring 2017

Assessed: Annually

Improve or equal student performance on the national certification exam (ARRT) from previous year

Source: American Registry of Radiologic Technologists (ARRT) National Examination Scores; Attachment 5, ARRT Summary Report 2016

Results:
• 2017-2018 Cohort (34 students): 94% pass rate, mean scaled score of 84.4; 85% pass rate, mean scaled score of 83; national statistics not available until Jan 2019
• 2015-2017 Cohort (37 students): 80% pass rate, mean scaled score of 81; national statistics not available until Jan 2018
• 2014-2016 Cohort (35 students): 80% pass rate, mean scaled score of 81.6; national statistics not available until Jan 2017
• 2013-2015 Cohort (37 students): 85% pass rate, mean scaled score of 81.6; national statistics not available until Jan 2016
• 2012-2014 Cohort (35 students): 91% pass rate, mean scaled score of 84; national statistics not available until Jan 2015
• 2011-2013 Cohort (30 students): 80% pass rate, mean scaled score of 83.2; national comparison: 89% pass rate
Future target: 95% pass rate

Previous actions to improve Program Goal 2, implemented in 2016-17: After reviewing the ARRT results, in Summer 2017 the faculty re-evaluated and revised the curriculum in the following sections: Equipment Operation & Quality Control, and Image Acquisition & Evaluation. These changes were made because of changes to the ARRT Registry Exam and the acquisition of new digital imaging machines in the on-campus Radiography Labs.

Actions to improve student performance on the national certification exam (ARRT): The pass rate on the ARRT examination remained constant from the previous cohort. After reviewing the ARRT results, in Summer 2017 the faculty re-evaluated and revised the curriculum in the two sections noted above, and they will continue to review and revise these two sections of instruction. Faculty also reviewed the current schedule for mock registry exams and determined that mock registry exams will now start in the second semester, in RAD 131; this change to RAD 131 was implemented in Fall 2018 and will begin preparing students to pass the registry exam in the first year of the program. The faculty believe that starting registry review and remediation earlier will produce better results. Faculty will review these changes with the Advisory Board during the Spring 2019 meeting.

Assessed: Annually

Increase the number of graduates in the program

Graduation Rates: SIS Class Rosters:
• Fall 2016: RAD 121
• Spring 2018: RAD 240
Class rosters are available through Blackboard:
• RAD 121 (001H), Fall 2016
• RAD 240 (001H), Spring 2018

Target: 87% graduation rate
Results:
• 2016-2018 Cohort: 40 entering students / 34 graduates (86% graduation rate)
• 2015-2017 Cohort: 40 entering students / 35 graduates (87% graduation rate)
• 2014-2016 Cohort: 39 entering students / 37 graduates (95% graduation rate)
• 2013-2015 Cohort: 44 entering students / 37 graduates (84% graduation rate)
• 2012-2014 Cohort: 48 entering students / 38 graduates (79% graduation rate)
• 2011-2013 Cohort: 47 entering students / 31 graduates (66% graduation rate)
• 2010-2012 Cohort: 46 entering students / 38 graduates (83% graduation rate)
Future target: 96% graduation rate
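For reference, the graduation rates above are consistent with a simple ratio of graduates to entering students, rounded to a whole percent; the formula below is inferred from the reported counts rather than stated explicitly in the program's report (illustrated with the 2013-2015 cohort figures):

\[ \text{graduation rate} = \frac{\text{graduates}}{\text{entering students}} = \frac{37}{44} \approx 84\% \]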

Previous actions to improve Program Goal 3, implemented in 2016-2017: Mandatory appointments with the academic success counselor address high-risk students earlier in the program. Attrition in the program continues to occur in the first year; 100% of the students who started the second year of the program graduated. Faculty will continue to monitor the progress of students individually to identify student weaknesses early in the program.
Actions to improve retention: The graduation rate decreased by 13% from the last cohort. The faculty and the Radiography Advisory Board believe the decrease is due to communication and academic difficulties. Additional communication instruction and labs will be incorporated into non-clinical courses to address this issue (Fall 2019). Mandatory appointments with the academic success counselor will address high-risk students earlier in the program. Attrition in the program continues to occur in the first year; 100% of the students who started the second year of the program graduated. Faculty will continue to monitor the progress of students individually to identify student weaknesses early in the program, will continue to refer at-risk students to the academic success counselor, and will utilize all of the student services resources to help students succeed.
Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Respiratory Therapy, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.
Program Purpose Statement: The curriculum is designed to prepare students to be effective members of the healthcare team in assisting with diagnosis, treatment, management, and preventive care of patients with cardiopulmonary problems. Upon successful completion of the program, students are eligible to take the entry-level and advanced practitioner examinations leading to certification as a Certified Respiratory Therapist (CRT) and registration as a Registered Respiratory Therapist (RRT).

Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Perform the psychomotor skills and demonstrate the cognitive skills in all areas of adult non-critical care

Fundamental Clinical Procedures RTH 151
Direct Measure: RTH 151 Cognitive Exam questions #1-9, taken from 3 separate tests. Student group to achieve > 85% on each item evaluated.
Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
ME only | 1 | 1 | 18/19
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 18/19**
*Dual enrollment. **One student dropped after the first test; the original cohort of 19 decreased to 18.

Semester/year data collected: Fall 2017 (2019 grads)
Target: Student group to achieve > 85% on each item evaluated.
Results by Campus/Modality (average score):
ME | Fall 2017: .86 | Fall 2015: .88
Results by SLO Criteria/Question Topic (Fall 2017 / Fall 2015 average score):
1. SFM: .9 / .85
2. TTO2: .95 / .85
3. NRB: .9 / .85
4. Hyperbaric O2: .78 / 1.0
5. H2O Content: .89 / .85
6. Particle Deposition: .84 / .95
7. Asthma: .84 / .95
8. LVN: .78 / .8
9. Airway Obstruction: .84 / .9
Current results improved: [ ] Yes [ ] No [x] Partially
Strengths by Criterion/Question/Topic: Questions are used to prepare students for the board exams.
Weaknesses by Criterion/Question/Topic: Some of the concepts are abstract and therefore difficult for students to grasp.

Previous action(s) to improve SLO: SLO results were reviewed at the Advisory Committee meeting in April 2016. Faculty reviewed this SLO at their December 2015 faculty meeting and decided to add more repetition of equipment setups while the student verbalizes the setup's parameters; this should support the type of course material involved in Item 3 of the assessment. The instructor incorporated this addition in Fall 2016. The assessment was adjusted to 9 items, and the repeated item was removed.
Target Met: [ ] Yes [x] No [ ] Partially
Based on recent results, areas needing improvement: Students showed improvement in some areas and declines in others. Items 6, 7, and 9 are .01 below the threshold, while Items 4 and 8 have declined.
Current actions to improve SLO based on the results: Since 2016, second-year students have been hired as lab assistants to support students during practice and preparation for equipment proficiency testing prior to entering clinical rotations. Although helpful in the psychomotor domain, this does not appear to be helpful in the cognitive domain, as reflected in the decline on some of the topic questions. To address the question topics, more patient scenarios will be introduced into the material during Fall 2018; this may help with the application and analysis of material.
Next assessment of this SLO: Fall 2020. In the next assessment, items #1-3 and 5 can be eliminated because they have met the criteria.

Perform the psychomotor skills and demonstrate the cognitive skills in the areas of respiratory therapy home care, patient education/disease management, pulmonary rehabilitation, and cardiac diagnostics

Cardiopulmonary Science III RTH 223
Direct Measure: RTH 223 Cognitive ECG: Students will be able to accurately interpret ECG waveforms for exam questions #36-50.
Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
ME only | 1 | 1 | 15
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual enrollment

Semester/year data collected: Fall 2017 (2018 grads)
Target: Students will achieve > 80% on individual strip interpretation. (Note: The previous target was for students individually to achieve > 84%.)
Results by Campus/Modality (average score):
ME | Fall 2017: 75% | Fall 2015: 86%
Results by SLO Criteria (the criteria used are student test scores for Fall 2015 and individual ECG strip scores for Fall 2017):
Fall 2015 (student test scores): Student 1: 100%; Student 2: 100%; Student 3: 100%; Student 4: 93%; Student 5: 73%; Student 6: 67%; Student 7: 67%
Fall 2017 (ECG strip scores): Strip 36: 100%; 37: 100%; 38: 100%; 39: 80%; 40: 100%; 41: 87%; 42: 93%; 43: 100%; 44: 100%; 45 (1st degree AV block): 60%; 46: 93%; 47: 80%; 48 (pacing): 67%; 49 (failure to capture): 73%; 50: 93%
Current results improved: Not comparable
Strengths by Criterion/Question/Topic: Changing the focus to individual questions instead of test grades will narrow the focus to specific student weaknesses.
Weaknesses by Criterion/Question/Topic: Students rarely see some of the abnormal ECG strips in the clinical environment, making the application of these concepts more challenging.

Previous actions to improve SLO: For Fall 2016, the instructor added a comparison-of-rhythms section to the Blackboard assignment area and added class time for comparing rhythms to increase comprehension. This action adds an assignment to be completed by the students independently so they will contemplate the information needed to improve their learning.
Target Met: [ ] Yes [x] No [ ] Partially
Based on recent results, areas needing improvement: Examination of this criterion is being slightly modified. Historically, test scores were used in the data analysis; moving forward, specific ECG strips will be utilized to get a better breakdown of the questions, enabling better scrutiny of student challenges.
Current actions to improve SLO based on the results: Three strips scored below the threshold. An assignment will be created that emphasizes these deficits. These changes will be implemented in Fall 2018.
Next assessment of this SLO: Fall 2020; the strips that consistently meet or exceed the threshold will be eliminated.

Students will perform the psychomotor skills and demonstrate the cognitive skills in all areas of adult critical care.

Critical Care Monitoring RTH 236
Direct Measure: RTH 236 Cognitive Adult Ventilation, specifically Noninvasive Ventilation (NIV), is assessed using the following test questions: #1-3, 22, 23, 26-28.
Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
ME only | 1 | 1 | 15
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual enrollment

Semester/year data collected: Fall 2017 (2018 grads)
Target: Students are expected to achieve a minimum of 82% on the identified knowledge questions pertaining to the NIV ventilator modality; exam target class average > 85%.
Results by Campus/Modality (average):
ME | Fall 2017: 68% | Fall 2015: 80%
Results by SLO Criteria/Question Topics (Fall 2017 / Fall 2015 average score):
1. Exceptions: 100% / 100%
2. True statements: 71% / 40%
3. Advantages: 50% / 100%
20. Characteristics: 71% / 40%
21. Pressure Support: 50% / 80%
22. Identified Problem: 71% / 40%
23. Description: 43% / 100%
25. Pressure Control: 57% / 60%
26. False statement: 79% / 100%
27. Asthma: 57% / 100%
28. True statements: 93% / 60%
Current results improved: [ ] Yes [x] No [ ] Partially
Strengths by Criterion/Question/Topic: Having the newer technology in-house should improve future outcomes.
Weaknesses by Criterion/Question/Topic: Lack of the ability to apply the didactic content has hampered learning for the students.

Previous action(s) to improve SLO: The Noninvasive Ventilator (NIV) machine in the Respiratory Therapy Lab, which students use to practice skills, had been very problematic and was not in good operating condition. It does not work consistently, and it is an older version not comparable to what students see during their hospital clinical rotations. Approval was obtained to purchase a new machine in Fall 2017.
Target Met: [ ] Yes [X] No [ ] Partially
Based on recent results, areas needing improvement: The new NIV arrived in Spring 2018, which means the Fall 2017 class did not have access to it. Areas 2, 20, and 22, although still not meeting the threshold, showed a 31% improvement, while other areas showed a sometimes significant decline. Overall performance in all but one of these areas needs improvement.
Current actions to improve SLO based on the results: Several adjustments will be made for the next cohort. The lecture materials will be subdivided into more digestible chunks of content, and the new NIV will be used with the high-fidelity simulator. Using the simulator will give students a more realistic application of the technology and a better depiction of how it is used on an actual patient. These changes will be made during Fall 2018.
Next Assessment: Fall 2020

Critical Care Monitoring RTH 236
Direct Measure: RTH 236, Psychomotor Domain, has a lecture and a lab. During the lab, students are taught the clinical application of materials learned during the lecture. At different intervals during the semester, students are expected to demonstrate proficiency in the new skills they have learned, in the form of a clinical competency in the adult weaning modality using a Noninvasive Ventilator (NIV).

Semester/year data collected: Fall 2017 (2018 grads)
Target: Students are expected to achieve a minimum of 85% on all clinical laboratory competencies. If they do not, they are given one opportunity to repeat the competency; if the second attempt is not successful, the student fails the class. Each student will achieve > 84% on the competency performed on the first attempt.
Results: The class average for the 2017 cohort was 92%. Three students in this cohort had to repeat the competency. When a student has to repeat a competency, they must receive a minimum of 85% on the repeat to receive a final score of 72%. Listed below are the errors and scores for the three repeating students and the others in the cohort.
Results by Campus/Modality (average score):
ME | Fall 2017: 92% | Fall 2015: 92%
Results by SLO Criteria (student competency scores, errors, and repeat scores):
Student 1: Fail; could not interpret ABG; repeat score 72%
Student 2: Fail; could not put on circuit; repeat score 72%
Student 3: Fail; could not put on circuit; repeat score 72%
Student 4: 100%
Student 5: 90%; mask error
Student 6: 100%
Student 7: 90%; settings did not match ABG
Student 8: 95%; settings did not match ABG
Student 9: 95%; settings did not match ABG
Students 10-15: 100%, 100%, 100%, 100%, 95%, 95%
Current results improved: [ ] Yes [x] No [ ] Partially
Strengths by Criteria/Question Topic: Deficiencies in student performance were identified and resolved before students entered the hospital and needed to perform this clinical intervention on a patient.
Weaknesses by Criteria/Question Topic: Students must invest practice time to become proficient at the skills they are being taught, and sometimes they do not. A weakness is that they are permitted to make these independent decisions and are not required to practice at an assigned time.

Previous action(s) to improve SLO: No actions were necessary, as students met the criteria.
Target Met: [x] Yes [ ] No [ ] Partially
Current actions to improve SLO based on the results: The student errors identified two areas of weakness: not making the appropriate parameter adjustments when given patient clinical data, and placing the ventilator circuit on the machine and the patient. Beginning in Fall 2018, students will be encouraged to invest additional time preparing for the competency by practicing in the lab. In addition, an assignment in which students must match the clinical data with the appropriate settings will be added.
Next Assessment: Fall 2020

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Students will appropriately interpret graphic depictions of ventilator waveforms as they apply to the patient's clinical status. [X] QR

Critical Care Monitoring RTH 236
Direct Measure: RTH 236 Cognitive Adult Ventilation, specifically pertaining to graphic waveforms, is assessed using the following test questions: #18, 19, 21, 24, 25, 26, 28, 31.
Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
ME only | 1 | 1 | 15
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual enrollment

Semester/year data collected: Fall 2017 (2018 grads)
Target: 80% of students will score 75% or higher overall and on each criterion during an in-class assessment.
Results by Campus/Modality (average score):
ME | Fall 2017: 36%
Results by CLO Criteria/Question Topics (Fall 2017 average):
18. Appropriate action: 53%
19. Flow curve: 47%
21. Airway resistance: 33%
24. Flow cycling: 33%
25. Static compliance: 26%
26. Compliance & P-V loop: 33%
28. F-V loop: 53%
31. Pressure-volume loop & disease: 7%
Total average: 36%
Current results improved: [ ] Yes [x] No [ ] Partially
Strengths by Criterion/Question/Topic: Students will feel more confident interpreting ventilator waveforms once this content is added back into the curriculum.
Weaknesses by Criterion/Question/Topic: This content had fallen through the cracks after some modifications to the program.

Previous action(s) to improve CLO, if applicable: There are no previous actions; this is the first time this outcome has been monitored.
Target Met: [ ] Yes [X] No [ ] Partially
Based on recent results, areas needing improvement: Students are expected to identify graphic abnormalities and the related problems and solutions as they relate to ventilator management. Overall performance on these specific questions is poor. Where, when, and how this content is taught will be explored.
Current actions to improve CLO based on the results: Because this content is very difficult to comprehend and difficult to cover in the limited class time, an outside speaker historically delivered an immersive workshop on this topic. This has not been done for the last several years, and in her absence the content has not been fully re-absorbed into any specific course. Curriculum mapping will be done in Fall 2018 to identify where this specific content is or should be taught. In addition, we will explore whether the workshops can be resumed in Summer 2019 and/or create similar content that can be taught utilizing the high-fidelity simulators.
Next Assessment: Fall 2020

Program Goals Evaluation Methods Assessment Results Use of Results

70% retention of the total number of students in the enrollment cohort (3-year average)

Short description of method(s) and/or source of data: Review total number of enrolled students and number of students that were unsuccessful. Some students in the cohort will be considered in-progress (IP) and therefore may not be accounted for in the attrition rate until they are fully withdrawn or graduate from the RTH program. Since students are continually leaving the program, data gathering is ongoing. Information is obtained from the student record.

Target: There will be 70% retention on a 3-year average.
Results for recent years (grads / students admitted / academic attrition and retention / non-academic attrition and retention):
• 2018 grads (21 new, 3 IP, 24 total): academic attrition 24% (5/21), retention 76%; non-academic attrition 19% (4/21), retention 81%
• 2017 grads (24 new, 1 IP, 25 total): academic attrition 8% (2/24), retention 91%; non-academic attrition 21% (5/24), retention 79%
• 2016 grads (16 new, 2 AP, 1 IP, 18 total): attrition 44% (8/18), retention 56%
• 2015 grads (26 new, 6 IP, 32 total): attrition 27% (7/26), retention 73%
• 2014 grads (25 new, 2 IP, 27 total): attrition 28% (7/25), retention 72%
• 2013 grads (23 new, 1 IP, 1 Adv, 25 total): attrition 37.5% (9/24), retention 62.5%
Target Met: [x] Yes [ ] No [ ] Partially
Comparison to previous assessment(s): Prior to 2017, academic and non-academic retention/attrition were not separated; therefore, comparison across 2013-2018 is not possible. The 2017 and 2018 data show the significance of non-academic life issues and how they impact students' ability to continue their education.

Previous action(s) to improve program goal: In Fall 2016, an annual Meet and Greet between the first- and second-year students was initiated by the program director; the goal was to establish formal and informal mentoring relationships between the two groups. In addition to the peer tutor, a work-study position with set hours was added specifically to assist students in the lab. To be more proactive, the lab assistant and peer tutor were introduced during the program orientation for the class of 2017 (2019 graduates), and the new cohort was encouraged to be more proactive and seek support early. The retention counselor has come to classes to give students time management, test-taking, and study tips; this was done for the 2016 and 2017 cohorts. In Fall 2016, an online exit interview was created by the program director to capture data on students who leave the program. From these data and conversations with students, it appears that many continue to work long hours while attending the program full time. This, coupled with a lack of social support and/or networks, prevents students from investing the time needed to be successful in the program; therefore, they leave or fail.
Most recent results: Retention for academic and non-academic reasons is now being teased apart. This shows that program attrition due to non-academic reasons has averaged 20% over the last two years. There is some inconsistency in how IP students are counted, which has an impact on the data; clarification will be sought this fall and adjustments will be made.
Results improved: [x] Yes [ ] No [ ] Partially
Assessed: Annually
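For reference, the attrition and retention figures above appear to follow the computation sketched below; the formula is inferred from the reported counts (here, the 2018 academic figures) rather than stated explicitly in the report:

\[ \text{attrition} = \frac{\text{students lost}}{\text{new students admitted}} = \frac{5}{21} \approx 24\%, \qquad \text{retention} = 1 - \text{attrition} \approx 76\% \]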

Short description of method(s) and/or source of data: A first draft of a survey to gather demographic data on the 2020 cohort was administered during orientation in Fall 2018. The instrument was completed by 23 of 30 students.
Sample:
Campus/Modality | # of Total Sections Offered | # of Sections Assessed | # Students Assessed
ME | 1 | 1 | 23
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual enrollment

Semester/year data collected: Fall 2018
Results: Summary of the findings that could have a negative impact on student outcomes:
• Employment status: 30.41% (35 or more hours/week); 39.13% (20-34 hours/week)
• Caring for children or family members: 26% (1 child/family member); 26% (2 children/family members); 13% (3 children/family members)
• Single parent: 13%
• Friends or family to support you: 17% most of the time; 17% some of the time; 8.7% rarely

Current action(s) to improve program goal: In Summer 2018, a survey was developed to capture more data on the students entering the program (2020 cohort), and the final instrument was refined with OIR during Summer 2018. The final survey will be administered in Fall 2019. These data will be used to steer students toward the services and supports they will need to complete the program.
Assessed: Annually

80% of total number of graduates obtaining NBRC CRT credential (3-year average)

Source of Data: The National Board for Respiratory Care database

Target: 80% of the total number of graduates obtaining the NBRC CRT credential (3-year average)
Results for Past 5 Years:
Academic Year | Number of Graduates | Percentage Earning CRT
2018 | 15 | 100%
2017 | 17 | 100%
2016 | 7 | 100%
2015 | 22 | 100%
2014 | 15 | 100%

Target Met: [ x ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): Students continue to meet goals on this credentialing exam.

Previous action to improve program goal: None.
Most recent results: Students have consistently performed very well on their board exams over the past 5 years, and we have exceeded the goals set by our accrediting body. This must be assessed every year, and we will continue to monitor it.
Results improved: [x] Yes [ ] No [ ] Partially
Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Science, A.S.
NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.
Program Purpose Statement: The curriculum is designed for persons who are interested in a professional or scientific program and who plan to transfer to a four-year college or university to complete a baccalaureate degree program with a major in one of the following fields: agriculture, biology, chemistry, pre-dentistry, forestry, geology, home economics, nursing, oceanography, pharmacy, physics, physical therapy, pre-medicine, science education, or mathematics.
Student Learning Outcomes Evaluation Methods Assessment Results Use of Results

Students will be able to use mathematical reasoning to draw logical conclusions and make well-reasoned decisions. This goal also fits the CLO core competency #3, Quantitative Reasoning

General College Physics I PHY 201
Direct Measure: A thermodynamics problem was used to assess whether students could identify the correct formula, insert the given information into it, and use algebra to solve the problem. The problem was the same for all sections of PHY 201 in the Spring 2018 assessment and involved calculating the specific heat of a given mass of water at a given temperature. A common proficiency rubric was used for scoring, with a score of 0 to 2 for each of three criteria associated with correctly answering the assigned problem:
1.) identifying the correct formula
2.) utilizing the correct information/parameters
3.) using the correct algebra to solve the problem
The question and rubric were approved at the Physics Cluster meetings at the beginning of the 2016-2017 school year.
Sample: Data was collected from 112 students in 7 out of 11 sections of PHY 201 from AL (10), AN (16), WO (29), MA (21), and LO (36). Dual-enrollment sections were not included; this may be something to consider in future assessments if we can coordinate with the high school instructors. There was one section of ELI, but it did not report results. Of these 112 students, 78 were identified as A.S. Science students. This year, all physical campuses that offer physics contributed to the results. Special care was taken this year to send multiple reminders via email and to stress the importance of completing the assessment in order to collect as much data as possible; however, our compliance rate was very similar to last year, when 8 out of 12 sections submitted data. Annandale had the lowest contribution rate (as measured by the fraction of sections contributing).

Semester/year data collected: Spring 2018
Achievement target: 70% average
Results: In the results below, two percentages are listed; the first is for all students, and the second, in parentheses, is for A.S. Science students only.
• 75% (76%) of students possessed the proficiency required.
1.) 76% (78%) identified the correct law/formula necessary for the solution of the problem
2.) 76% (78%) were able to insert the given information correctly into the context of the problem
3.) 75% (76%) performed the necessary algebra without mistakes
4.) The summary result is the lowest success percentage of the above 3 criteria, or 75% (76%)
• The achievement target was exceeded by 5 (6) percentage points.
Summary:
• Proficiency in the physics discipline is defined as the percentage of students who successfully perform all 3 criteria associated with the rubric.
• The results are very consistent across all three criteria.
• The results from this assessment indicate that 75% of PHY 201 students and 76% of A.S. Science students taking PHY 201 possessed the necessary proficiency to successfully accomplish each of the 3 criteria associated with this SLO.
Previous Assessment Results: In the previous year's assessment, the results were as follows:
• 73% (71%) of students possessed the proficiency required.
1.) 84% (81%) identified the correct law/formula necessary for the solution of the problem
2.) 73% (71%) were able to insert the given information correctly into the context of the problem
3.) 73% (74%) performed the necessary algebra without mistakes
The summary result is the lowest success percentage of the above 3 criteria, or 73% (71%).
Current results improved: [X] Yes [ ] No [ ] Partially

In Spring 2018, the achievement target was met, and results increased slightly from the previous year. Unlike in previous years' assessments, there was no noticeable drop from the first criterion (identifying the correct formula) to the second and third criteria, which involve actual numerical and algebraic manipulation; we have seen large decreases in that percentage in previous years. There is a math prerequisite for the course, but students often enter having passed that class with a low grade and may still struggle with mathematical concepts, which the cluster believes was the major contributor to those earlier decreases. In Fall 2017, the cluster created a mathematics pretest for our PHY 201 courses to assess our students' preparation in mathematics; this test is given to students on the first day of class. Most faculty do this, and the early warning may help students better prepare for the mathematical rigor required in the course. Faculty also regularly point out the mathematical requirements of the course while covering the required material, so students can determine whether they need to review various mathematical concepts. The results for A.S. Science students are approximately the same as those for all students. At the Physics Cluster meeting in August 2018, the faculty decided to continue to assess this SLO. There was some discussion of assessing it in PHY 231, but there are very few A.S. Science students in that class, as it is geared mainly toward Engineering majors; therefore, the cluster decided to stay with PHY 201. The assessment methods and the proficiency rubric were approved at the Physics Cluster meeting in August 2018. The assessment will be conducted on students who are program-placed in the Science program as well as among all students in our courses. In addition, faculty are aware of the need for continued focus and effort in this area, as well as for allowing increased time for students to work on problems and examples to help them achieve our target. We plan to do all of this in Fall 2018.
Next Assessment: The assessment and data collection will occur in the Fall 2018 semester, with detailed data analysis occurring in the Spring 2019 semester. We will perform this assessment in the same way as last year.

Students will understand the scientific method and identify methods of inquiry that lead to scientific knowledge.

General Biology I BIO 101
Direct Measure: A quiz on the Scientific Method was made available on Blackboard to all BIO 101 students in the college (students from all campuses, including ELI and DE) toward the end of Fall 2017. The quiz consisted of 10 multiple-choice questions that assessed steps in the Scientific Method. The topics were as follows:
• Item #1: observation is the first step
• Item #2: order of steps
• Item #3: definition of hypothesis
• Item #4: validity of hypotheses
• Item #5: importance of a control
• Item #6: definition of data
• Item #7: example of a hypothesis
• Item #8: definition of variable
• Item #9: definition of theory
• Item #10: definition of data collecting
This assessment is the same as the one given to students in the previous year. All assessment data are gathered through Blackboard.
Sample: The assessment tool was deployed on Blackboard to all students taking BIO 101 on campus (AL, AN, LO, MA, WO), at ELI, and as dual-enrolled students; 572 students took part in this assessment. The exact total number of students in BIO 101 during Fall 2017 is not available, but it is around 1,600, which indicates that about a third of all students responded to the Blackboard notice and took the quiz. Dual-enrollment students were included, and 101 DE students (17.6% of the total) took the assessment; 128 ELI students (22.3% of the total) also took the assessment. The number of students from each campus and from ELI was not tallied; however, student ID numbers are in the raw data, and specific information can be gleaned from them. As in the previous year, students identified themselves by major, which allowed us to compare results for students program-placed in General Studies (219), Social Science (195), and Science (279). Note that these numbers add to 693; some students listed double majors.

Semester/year data collected: Fall 2017
Achievement target: For the whole quiz, 70% of students achieving 70% on the quiz; for each item, 70% of students correctly answering that item.
Results for all students:
1) Average student score: 84.2%
2) Percentage of students earning at least 70%: 88.5% (506/572)
3) Percentage of students answering each item correctly: Item 1: 64.0% (366/572); Item 2: 94.0% (538/572); Item 3: 92.4% (529/572); Item 4: 88.9% (509/572); Item 5: 81.1% (464/572); Item 6: 93.8% (537/572); Item 7: 85.4% (489/572); Item 8: 87.4% (500/572); Item 9: 65.7% (376/572); Item 10: 89.6% (500/572)
Results for students placed in A.S. Science:
1) Average student score: 83.3%
2) Percentage of students earning at least 70%: 87.8% (245/279)
3) Percentage of students answering each item correctly: Item 1: 58.7% (164/279); Item 2: 93.5% (261/279); Item 3: 90.6% (253/279); Item 4: 90.6% (253/279); Item 5: 78.1% (218/279); Item 6: 93.9% (262/279); Item 7: 84.5% (236/279); Item 8: 87.8% (245/279); Item 9: 63.4% (177/279); Item 10: 88.1% (246/279)
Results for students placed in A.S. Social Science:
1) Average student score: 82.4%
2) Percentage of students earning at least 70%: 86.2%
3) Percentage of students answering each item correctly: Item 1: 57.4% (112/195); Item 2: 93.8% (183/195); Item 3: 87.1% (170/195); Item 4: 90.2% (176/195); Item 5: 74.3% (145/195); Item 6: 94.8% (185/195); Item 7: 84.1% (164/195); Item 8: 86.1% (168/195); Item 9: 64.1% (125/195); Item 10: 88.2% (172/195)
Results for students placed in A.S. General Studies:
1) Average student score: 82.7%
2) Percentage of students earning at least 70%: 85.8% (188/219)
3) Percentage of students answering each item correctly: Item 1: 56.1% (123/219); Item 2: 91.3% (200/219); Item 3: 87.2% (191/219); Item 4: 87.2% (191/219); Item 5: 80.8% (177/219); Item 6: 94.0% (206/219); Item 7: 86.3% (189/219); Item 8: 85.3% (187/219); Item 9: 66.2% (145/219); Item 10: 88.1% (193/219)
Results indicate that total scores are well above 70%, and most (8 out of 10) individual items meet achievement goals. Scores were very similar to those of last year. The lowest scores were on Items 1 and 9: Item 1 asked about the first step in the Scientific Method, and Item 9 asked for the definition of the word "theory."
Current results improved: [ ] Yes [ ] No [X] Partially
Scores from students program-placed in Science, Social Science, and General Studies are very similar. In the 2015-2016 academic year, students scored below 70% on questions 1, 2, and 9 (42%, 47.5%, and 57.4%). In 2016-17, students scored below 70% on questions 1 and 9 (65.4% and 66.6%). This cycle, students again scored below 70% on questions 1 and 9 (64% and 65.7%). This shows a marked improvement in identifying the steps of the Scientific Method (question 2) over the years assessed, and an improvement in general knowledge of the Scientific Method.

Instructors and students of BIO 101 are becoming more accustomed to assessment through Blackboard. During the Fall 2018 cluster meeting, faculty members requested the results of the previous year's data; these data were sent to the Biology discipline chair for dissemination. The low achievement results on Items 1 and 9 are important to the biology faculty because they show that students do not understand that curiosity is the first step in solving a scientific problem through the scientific method. The term "theory" in science also continues to confuse students: their wrong answers indicate that they do not realize a scientific "theory" is not a hypothesis but a well-substantiated explanation of the natural world. It is valuable for instructors to have this feedback. The discipline chair elected in Biology in Fall 2018 has already seen these data and wants to work with faculty on the concepts behind the two low-scoring questions for the 2019-20 academic year. We need to find out whether students are not understanding the concepts or whether there is a problem with the questions themselves. This is the second year that A.S. Science students were identified in the assessment. Although most A.S. Science majors take BIO 101, many students in General Studies, Social Sciences, and other majors also take BIO 101, and faculty assessing Social Science and General Studies asked if we could identify their students, since those programs also wish to use this Scientific Method assessment for students in their majors. For the 2018-19 assessment year, we plan to add A.S. Liberal Arts. It is interesting that the results again show very similar performance for students regardless of major: BIO 101 is a class taken by science students early in their academic career, and the results show that science students at this early stage did not outperform students in other majors. In this assessment, we were able to demonstrate for the first time that students from all campuses, ELI, and dual enrollment took part. In the current Blackboard setup, each question is posed as an independent little exam, which takes more time for students; the two additional questions identifying ELI and DE students did not discourage them, as nearly 18% of responders were DE students and 22% were ELI students.
Next Assessment: Spring 2019

Students will use graphical methods to organize and interpret data.

Physical Geology GOL 105
Direct Measure: Assessment of SLO #3 utilized a laboratory assignment designed to demonstrate the process of finding earthquake epicenters. This task required students to create a graph using seismic wave data and then use the graph to determine the distances of various recording stations from earthquake epicenters. From this information, students were then asked to triangulate an earthquake epicenter and indicate its location on a map. Success on this SLO was based on a point scale for the entire exercise. (SLO #3 Assessment Method example attached.)
Sample: Students enrolled in GOL 105 courses at the AN, AL, LO, WO, and ELI campuses took part in this assessment, and results were determined for A.S. Science degree students as well as for students seeking non-science degrees. No data were provided from MA campus courses at this time. Data were collected from 13 of 17 sections of standard (face-to-face) GOL 105 and 1 of 1 section of ELI GOL 105 offered during the Fall 2017 semester. No DE (dual-enrollment) courses were offered this semester. Of the 374 students, 20 were program-placed A.S. Science majors. The instructor graded the student results and provided a score of successful or not successful.

Semester/year data collected: Fall 2017
Target: An accumulation of 70% of possible points was considered successful for non-science majors and 90% for science majors.
Results, standard sections:
• 87% of students were successful
• 89% of the science majors were successful
Results, ELI:
• 91% of students were successful
• 100% of the science majors were successful
Results, overall (standard and ELI):
• 87% of students were successful
• 90% of the science majors were successful
Non-science majors scored well above their 70% successful-completion target in both ELI and standard courses. Overall, science majors achieved their 90% target; however, the breakdown of the total data showed that ELI was 10% above target and standard sections were 1% below.
Previous results (2016-17 academic year):
• 83% of students were successful
• 88% of the science majors were successful
Results for all students enrolled in the 2017-18 academic year rose three percentage points above those of the previous year for the non-science-major population, and science majors rose by 2% from the previous year.

During Fall 2017, the GOL discipline continued to work on improving SLO data collection, focusing on faculty/adjunct communication and on the clarity of SLO assessment methods. All actions were intended to increase the number of sections reporting data. Actions took two forms: discussions at the discipline meeting, and email discussions between the discipline SLO liaison and faculty as well as between faculty and adjunct faculty on each campus. These efforts produced little change from the previous semester in the number of sections reporting data; however, the discipline succeeded in its first separate reporting of delineated data for standard, ELI, and DE courses. The discipline must keep working on improving the communication aspect of assessment, especially with adjunct faculty. The Fall 2017 overall results met the 90% passing goal for science majors established by the A.S. Science Program SLO committee and the geology discipline. While standard courses fell short by 1%, the discipline considers this result within an acceptable range, as it represents an improvement from the last assessment. ELI results exceeded the targets for both majors and non-majors by a considerable margin, a success in the view of the discipline. The current data reflect that the success of students completing SLO #3-related tasks rose, although not significantly. This was the first semester GOL separated ELI from the overall data, so no comparison can be made to past results for this course type. While GOL courses met their overall objectives, the increase in the percentage of successful students was small; therefore, we can consider our results stable compared with previous semesters. Although our goals were met or exceeded, the GOL discipline should continue discussions on improving student success during future meetings, perhaps with a focus on students' interpretational skills when extracting meaning from graphs. The Spring 2018 discipline meeting will be the first chance to discuss the path forward. The established target of assessing 70% of all GOL 105 sections taught at NOVA was met this semester, with 76% of all sections reporting, unchanged from Fall 2016. The discipline must continue to work on improving communication in an effort to increase the reporting percentage during our next assessment. This topic will be further discussed at the next discipline meeting.
Next Assessment: Fall 2018

Students will be able to explain the principles of chemical bonding in the formation and properties of molecules

College Chemistry I CHM 111
Direct Measure: A series of chemical bonding questions was used to assess whether CHM 111 students understood the principles of chemical bonding in atoms and molecules.
Fall 2017: There were 5 multiple-choice questions, each with 5 possible responses. The same questions were used at each campus involved in the assessment. The questions were given at the discretion of the faculty involved, on either an in-class exam or the final exam. The faculty doing the assessment graded the student responses, collated them, and provided the collected assessment results. The 5 questions involved the following subject material:
• Question #1: bond polarity
• Question #2: bond type
• Question #3: Lewis structures
• Question #4: formal charges
• Question #5: molecular geometry
Spring 2018: All students enrolled in CHM 111 at NVCC (including dual enrollment and ELI) were placed into a Blackboard class. The students answered the 5 SLO questions and were also prompted to indicate the campus and the delivery method for the course. Unfortunately, the setup in Blackboard has not allowed us to extract anything except the total score for the questions. The 5 questions involved the following subject material:
• Question #1: intermolecular forces
• Question #2: colligative properties
• Question #3: colligative properties
• Question #4: viscosity
• Question #5: intermolecular forces
Data was collected from CHM 112, General Chemistry II: 227 out of 582 students initially registered for the course completed the assessment. The data was not separated by campus, delivery method, or individual question.
Sample: Data was submitted from 24 sections of CHM 111 from all campuses out of 50 possible sections. ELI and DE students were not assessed.

Semester/year data collected: Fall 2017
Target: 70%
Fall 2017 data (CHM 111), percentage of students with the correct answer:
• Question 1: 70%
• Question 2: 65%
• Question 3: 83%
• Question 4: 61%
• Question 5: 75%
Spring 2018 (CHM 112), percentage of students earning each total score (out of 5):
• 5: 16%
• 4: 31%
• 3: 23%
• 2: 17%
• 1: 76%
Previous assessment, Fall 2016 (data collected for 5/15 CHM 112 sections), percentage of students with the correct answer:
• Question 1: 75%
• Question 2: 52%
• Question 3: 66%
• Question 4: 68%
• Question 5: 57%
Spring 2018, percentage of students earning each total score (out of 5):
• 5: 18%
• 4: 30%
• 3: 28%
• 2: 12%
• 1: 4%
Current results improved: [ ] Yes [ ] No [X] Partially

The CHM cluster continued to use the new questions from the Fall 2016 assessment. Chemical bonding in the formation of molecules was assessed in CHM 111; the second part of the SLO, properties of molecules, was assessed in a few sections of CHM 112. We feel that these new questions were better written and more consistent with the material that students should have mastered, as well as with when that material should be mastered. However, after the assessments, the new steering committee of the chemistry discipline met with assessment staff to better understand SLO/CLO goals, and the steering committee began writing completely new assessments for CHM 111 to be used in 2018-19. For Spring 2018, 890 students were enrolled in the Blackboard course to enable them to take the SLO. Of these, 303 completed the SLO and 587 either did not attempt it or did not complete it; 70% of the students who completed it earned a 3/5 or above (60%). Unfortunately, this method of assessment has not allowed us to form any good conclusions, and we will be returning to the previous method of allowing faculty to give the questions on either an in-class exam or the final.
Spring 2018: All CHM 112 sections were given the opportunity to complete the SLO on molecular properties. The assessment was given in Blackboard. It was supposed to separate students by campus and delivery method (hybrid, in-person, or ELI), but this did not work; it was also supposed to separate results by question, which did not work either. Instead, we have a total score for each student. Each question was worth 1 point, so 5/5 is a perfect score. All students who registered for CHM 112 were put into this Blackboard site and were able to take the quiz; 227 students completed the quiz and 355 did not. Of the students who took the quiz, 80% received a 3, 4, or 5, and 48% earned a 4 or 5. The importance of this assessment was not well communicated to the students, and many did not complete the assessment or put much effort into it. With that in mind, a large number of students completed the assessment, and 80% earned 60% or above on the SLO. This is an encouraging result and suggests that we should continue to assess the understanding of molecular properties in CHM 112. We will be assessing student learning on this topic using traditional exams in Fall 2018 and hope to get more illuminating results. The steering committee feels that future efforts will be much more successful, since the members now better understand the goals and purposes of the SLO/CLO. The steering committee is taking responsibility for addressing these issues and ensuring faculty participation.
Next Assessment: Spring 2019

Program Goals Evaluation Methods Assessment Results Use of Results
To encourage students to complete their degree

Number of Graduates by Program and Specialization: Data from OIR, College Graduates by Curriculum and Award Type (http://www.nvcc.edu/college-planning/_docs/41-16-number-of-nova-graduates-by-degree-and-specialization-2015-2016-final.pdf and http://www.nvcc.edu/college-planning/_docs/41-17-Number-of-NOVA-Graduates-By-Degree-and-Specialization-2016-2017.pdf), and data from the FACT BOOK, 2013-2014 through 2017-2018.

Target: a 4% increase per year in the number of Science graduates.
Results:
• 2016-2017: 338 (8.9% decrease from previous yr.)
• 2015-2016: 371 (0.5% increase from previous yr.)
• 2014-2015: 369 (0.5% increase from previous yr.)
• 2013-2014: 370 (6.9% increase from previous yr.)
• 2012-2013: 346 (3.9% increase from previous yr.)
• 2011-2012: 333 (24.7% increase from previous yr.)
• 2010-2011: 267 (44.3% increase from previous yr.)
• 2009-2010: 185 (19.4% increase from previous yr.)
For the first time, the number of graduates in Science decreased; this year, there was an 8.9% decrease. The target has not been met for the last three years. As seen above, the rate of increase in graduates has slowed over the past three years: large increases after the 2009 recession slowed to smaller increases and leveled off in 2014-16.
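The year-over-year figures above follow the standard percent-change computation; the formula below is implied by the reported numbers rather than stated in the report (shown with the 2016-2017 value):

\[ \text{percent change} = \frac{338 - 371}{371} \times 100\% \approx -8.9\% \]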

The number of Science graduates has not met our 4% target, and for the first time there is a decrease in numbers. This follows a decreasing rate of increase of A.S. Science degree graduates, discussed in previous APERs.
The prime reason for this decrease is felt to be the reduced population in the 18-24-year-old demographic, the population that makes up the majority of college enrollments. U.S. Census data (http://www.newgeography.com/content/00269-number-18-24-year-olds-united-states-2000-2050) show that the 18-24-year-old population decreased nationwide from about 27 million in 2012 to 26 million in 2016. This decline has resulted in decreased student enrollments. The impact of a decreased student population, especially the sub-population that is STEM-inclined, is felt more in community colleges, since many science students prefer going directly to a 4-year school.
Another reason for the reduction in the number of Science graduates is the increased mathematics requirement for the key biology courses, BIO 101 and BIO 102. In Fall 2014, the Biology Discipline Cluster implemented the Developmental Math Unit 3 requirement for BIO 101 and BIO 102. This reduced the number of students taking biology classes, and since the biology concentration produces the largest number of A.S. Science graduates, it impacted the number of graduates.
Given that the majority of the decrease in graduates is due to the smaller 18-24-year-old population and the resulting decrease in enrollments, a key initiative is to work with local high schools and students to make sure they know the advantages of attending NOVA before going to a 4-year school for a B.S. degree. Some of these advantages are guaranteed admission agreements, reduced costs, equal course standards with smaller class sizes, and time to mature and to ensure the field a student is pursuing is the right one. As a result, the NOVA Technical Divisions, in cooperation with the NOVA High School Outreach and NOVA Pathways groups, will continue to promote NOVA technical education options with the local high schools, especially at college information nights.


Another initiative that the NOVA Technical Divisions are engaged in is making it easy for NOVA students to transfer to key 4-year schools in Virginia. Once it is clear to the local high school population that transfer with an A.S. Science degree is as "seamless" as possible, more students should be willing to enroll at NOVA to earn their A.S. Science degree first, and enrollment and the resulting number of graduates will increase.

The NOVA Technical Divisions continue, as indicated in previous APERs, to work closely with key 4-year schools in Virginia to ensure that our Science and technical degree programs are consistent with the requirements of those colleges, so that our transfer students meet the transfer requirements. As an example, the Biology and Chemistry Departments have worked closely with GMU to ensure that BIO/CHM labs are transferable to GMU; changes have been made to NOVA's labs consistent with GMU's transfer requirements. The overall goal for all disciplines is to provide information on transfer opportunities to these schools as well as to establish as seamless a transition as possible. Faculty already work with colleagues at transfer schools: the chemists meet regularly with 4-year school chemistry faculty, and in 2018 NOVA biologists worked with six transfer schools to develop working pathways. Several faculty members have worked diligently with the NOVA/Mason ADVANCE initiative, which helps students transfer to Mason while assuring completion of the NOVA degree.

Finally, the new Pathway Deans for Physical and Life Sciences and the Science Provost are now involved during pathway meetings in looking at student completion data. The new cluster chairs and steering committees will have input in decisions affecting assessment and student success. There are no definite plans yet, but we expect movement in this direction, along with the Navigate system to assist student planning.

Assessed: Annually


To increase the number of program-placed students in the program

Distribution of Program Placed Students by Curriculum and Award Type: Data from OIR Distribution of Program Placed Students by Curriculum and Award Type 2010-2016 (http://www.nvcc.edu/college-planning/_docs/distribution-of-program-placed-students.pdf) and http://www.nvcc.edu/college-planning/_docs/Distribution-of-Program-Placed-Students-by-Curriculum-and-Award-Type-Fall-2012-%20Fall-2016.pdf Data from FACT BOOK, 2013-2014 through 2017-2018.

Target: a 4% per year increase in the number of program-placed Science students.

Results:
• Fall 2017: 3340 (-5.0%)
• Fall 2016: 3514 (-5.9%)
• Fall 2015: 3733 (+1.6%)
• Fall 2014: 3676 (+2.7%)
• Fall 2013: 3581 (+7.2%)
• Fall 2012: 3338 (+11.2%)
• Fall 2011: 3002 (+16.1%)
• Fall 2010: 2584 (+15.1%)
• Fall 2009: 2244

For the second time, the number of students program placed in Science has decreased: a 5.9% decrease in Fall 2016 and a 5.0% decrease in Fall 2017. The increases seen in the previous six years had been slowing, from large increases in 2010-2012 to smaller increases in 2013-15; numbers were leveling off, and Fall 2016 shows a decrease to roughly the Fall 2013 level of Science-placed students. As indicated above, the number of program-placed students in the A.S. Science degree program has declined, which is consistent with the decreasing rate seen in the number of graduates in Program Goal #1.

The target of a 4% per year increase in program-placed students was not met, and there was a decrease for the second time since 2009. The lower enrollment correlates with the lower number of graduates in Goal 1. The economic recovery as well as the smaller high school demographic are likely reasons for the decrease in enrollment. Increasing the number of graduates as well as overall enrollments in the NOVA Technical Divisions will take more effort on the part of NOVA, with efforts on campus as well as in ELI and DE (dual enrollment) courses.

A key initiative to increase the number of program-placed students in the Sciences is to work with Financial Aid to develop curricula that will be fully supported by Financial Aid. Financial Aid has established new policies, based on Department of Education direction, that do not allow financial aid for "hidden" courses. A "hidden" course results when a prerequisite is required for a course in the official curriculum of the program in which the student is placed ("program placed"), but the prerequisite itself is not listed in that curriculum; Financial Aid will not pay for prerequisite courses unless they are listed on the official curriculum of that program of study. As an example, until this year (2018-19 catalog), the A.S. Science degree required Calculus I during the first semester. However, many students did not qualify for Calculus and had to take Pre-calculus without financial aid. Now Pre-calculus is listed in the Science degree, and students will be able to apply their financial aid to the course. Discipline Clusters and Pathway Councils are working to develop Science pathways that have all the key "hidden" courses listed in the curriculum and advising sheets.

The community college population has a large percentage of students who require prerequisites. By adjusting our curricula to include these courses while still leading to an acceptable and transferable A.S. Science degree, more of our students may choose to be program placed in Science and retain access to financial aid for their courses. It may be that this previous impediment caused a drop in Science enrollment over the past two years, a time when many students were applying for aid.


Another important issue is advising. Only one-third of NOVA students receive advising, which means most students never see a counselor or faculty advisor. Better tracking of students and better advising are certainly important to success. A new college-wide advising initiative will be put in place in Spring 2018, to be functional for Fall 2018. Part of this initiative is to guide students so they do not waste time and money (or financial aid) on courses that will not help them transfer.

As discussed in previous APERs, another key effort associated with the A.S. Science degree is the promotion of STEM opportunities in the local communities. Campuses have science programs (Green Festivals, STEM Fairs, Night of Science, etc.) open to the community, and tours from local schools are encouraged. The efforts associated with Goal 1 above are highly synergistic with meeting Goal 2: as we increase enrollment, the number of our graduates will also increase.

Assessed: Annually

To provide a top-level assessment of adequate course standards and student success

Distribution of Course Success Rate by Discipline: Data from https://www.nvcc.edu/college-planning/data.html Data from FACT BOOK, 2013-2014 through 2017-2018.

Target: the course success rate for each discipline is 70% or better.

Results:

% Success in NOVA Courses for Science and Math Disciplines (C or better grade)

            BIO    CHM    GOL    MTH    PHY    Total*
Fall 2017   74.9   71.0   83.6   59.7   77.9   76.2
Fall 2016   76.3   70.7   80.1   61.8   75.9   75.3
Fall 2015   76.8   71.3   79.9   61.7   75.1   74.4
Fall 2014   75.1   71.3   78.2   64.3   72.8   74.0
Fall 2013   73.5   71.3   74.9   61.9   67.9   -**
Fall 2012   74.9   72.5   79.1   64.6   70.1   -
Fall 2011   73.3   71.2   75.4   59.2   67.8   -
Mean        75.0   71.3   78.7   61.8   72.5   75.0

*Total = total for all NOVA courses
**Not reported

The Science course success rate for Fall 2017 exceeds 70%, so the target is met for Biology, Chemistry, Geology, and Physics. The Math success rate does not meet 70%. We will continue to monitor this program goal next year.

The purpose of this program goal is to assess the maintenance of adequate course standards (i.e., course difficulty) while also ensuring that acceptable student success is being achieved. The goal uses % success in NOVA Science and Math courses to determine overall trends that might need action. The overall course success rates during the Fall 2017 semester are consistent with the past six years. Biology, Chemistry, Geology, and Physics all had course success rates greater than the target of 70%. The Math success rate is below the target, as it has been for the last six years, and is similar to the 7-year mean for Math. The overall trend for the science disciplines is consistent with the established long-term pattern of Geology having the highest success rate and Math the lowest. The success rate in Physics has improved over the last three years and now exceeds that of Chemistry. The Fall 2017 grading of Science courses is holding steady compared to past years. As a result, it is concluded that NOVA's Science disciplines are maintaining adequate course standards (difficulty) and student success.

The Math discipline success rate is the lowest, which is consistent with past performance and is most likely due to the fact that Math is a key gateway for STEM-related courses. All NOVA campuses continue to implement the VCCS Developmental Math program as part of the College's overall efforts to increase STEM awareness and student STEM options. NOVA and VCCS are working to find ways to improve Developmental Math education and thereby improve overall Math success rates. The new Math courses began in Fall 2018; this was a state-mandated change for all math courses in the VCCS, and the effect will be seen in the coming years. Providing this data allows the Discipline Clusters to determine whether proper trends and performance in each discipline are maintained.
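For reference, a minimal sketch of how a discipline-level success rate like those in the table above could be computed from an enrollment-level grade extract. The file name and column names here are assumptions rather than the actual OIR data layout, and the handling of withdrawals would need to match OIR's definition.

```python
# Hypothetical sketch: compute "% success (C or better)" by discipline from a
# per-enrollment grade file. "fall2017_grades.csv", "discipline", and "grade"
# are assumed names, not NOVA's actual extract.
import csv
from collections import defaultdict

PASSING = {"A", "B", "C"}  # success defined as a grade of C or better

counts = defaultdict(lambda: [0, 0])  # discipline -> [passing, graded total]
with open("fall2017_grades.csv", newline="") as f:
    for row in csv.DictReader(f):
        disc, grade = row["discipline"], row["grade"].strip().upper()
        counts[disc][1] += 1
        if grade in PASSING:
            counts[disc][0] += 1

for disc in sorted(counts):
    passing, total = counts[disc]
    print(f"{disc}: {100 * passing / total:.1f}% success ({passing}/{total})")
```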


Annual Planning and Evaluation Report: 2017-2018 Science: Mathematics Specialization, A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: The curriculum is designed for persons who plan to transfer to a four-year college or university to complete a baccalaureate degree. This curriculum is designed to prepare students to major in one of the following fields: mathematics, mathematics education, statistics, operations research, applied mathematics or computer science.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Solving Applied Problems

Applied Calculus MTH 271
Direct Measure: Average score on final exam in MTH 271.

Sample Size
Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
AL                6                             1                     37
AN                15                            1                     33
MA                10                            2                     50
LO                9                             0                     0
WO                6                             1                     26
ELI               5                             0                     0
DE*               N/A                           N/A                   N/A
Total             51                            5                     146

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: the average final exam score will be at least 60%.
Results: Since most campuses used results from one section (MA used two), the results have great variability. In addition, a common assessment was not used.

Results by Campus/Modality   Spring 2018 Average Score
AL                           81%
AN                           61.5%
MA                           63%
LO                           no data
WO                           53%
ELI                          no data
Weighted average             65.4%
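The 65.4% weighted average above is consistent with weighting each campus's average by the number of students assessed there (146 in total). The sketch below is a hypothetical illustration of that calculation, not the SLO lead's actual worksheet.

```python
# Hypothetical sketch: weight each campus average by its number of students
# assessed (from the sample-size table) to reproduce the 65.4% figure.
campuses = {            # campus: (average final-exam score %, students assessed)
    "AL": (81.0, 37),
    "AN": (61.5, 33),
    "MA": (63.0, 50),
    "WO": (53.0, 26),
}

total_students = sum(n for _, n in campuses.values())
weighted_avg = sum(score * n for score, n in campuses.values()) / total_students
print(f"Weighted average: {weighted_avg:.1f}%")   # -> 65.4%
```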

Target Met: [ ] Yes [ ] No [X] Partially

Based on recent results, areas needing improvement: Because final exam scores were collected as the data, the specific areas needing improvement are unclear. Further, the data collection did not include an adequate number of campuses/modalities or sections, nor was the data disaggregated by program of study. Collecting data from MTH 271 is not an appropriate source, since students pursuing the A.S. Science / Mathematics Specialization do not take this course; the assessment of learning outcomes in MTH 271 is more appropriate for discipline- or course-specific assessment and may be beneficial to programs requiring the course, such as the A.S. Business Administration. Data collection is the key area needing improvement to make results meaningful.

Current actions to improve SLO based on the results: Previously determined SLO questions were not used to assess the 2017-18 SLOs. A temporary change in leadership and the lack of an SLO lead for the 2017-18 academic year have been resolved for the 2018-19 year. Further SLO collection is being done using common questions instead of final exam scores. A robust schedule of SLO assessment is under development and will be implemented starting Spring 2019. Future SLO collection will be disaggregated by program of study and will include courses in the A.S. Science / Mathematics Specialization program of study.


Next Assessment: This SLO will be measured again according to the SLO schedule, which is under development. DE, NOVA Online, and all comprehensive campuses will be included in data collection.

Core Learning Outcome | Evaluation Methods | Assessment Results | Use of Results

CLO: [ X ] QR

Calculus I MTH 173
Direct Measure: Grade on final exam in MTH 173.

Sample:
Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
AL                5                             1                     25
AN                11                            1                     13
MA                7                             1                     25
LO                6                             0                     0
WO                3                             1                     23
ELI               7                             0                     0
DE*               1                             0                     0
Total             40                            4                     86

*Dual-enrollment

Results by Program Placement   Spring 2018 # of Students
AAS-Architecture               2
AS-Computer Science            21
AS-Engineering                 32
AS-General Studies             2
AS-Science                     24
AS-Science/Mathematics         4
Career Exploration             1
Total                          86

Semester/year data collected: Spring 2018
Target: 50% of students will score at least 70% on the final exam.
Results: Since most campuses used results from one section, the results have great variability. In addition, a common assessment was not used.

Results by Campus/Modality   Spring 2018 Average Score   Percent > Target
AL                           75.36                       64%
AN                           56.65                       31%
MA                           61.16                       32%
LO                           no data                     no data
WO                           48.37                       22%
ELI                          no data                     no data
Total                        61.18                       38%

Results by Program Placement   Spring 2018 Average Score   Percent > Target
AAS-Architecture               44.75                       50%
AS-Computer Science            58.16                       38%
AS-Engineering                 64.83                       41%
AS-General Studies             56.20                       50%
AS-Science                     59.98                       33%
AS-Science/Mathematics         72.10                       50%
Career Exploration             36.20                       0%

Target Met: [ ] Yes [ ] No [X] Partially
Only one campus (AL) and two programs met the target.

Based on recent results, areas needing improvement: Previously determined SLO questions were not used to assess the 2017-18 SLOs. A temporary change in leadership and the lack of an SLO lead for the 2017-18 academic year have been resolved for the 2018-19 year. Further SLO collection is being done using common questions instead of final exam scores. Because final exam scores were collected as the data, the specific areas needing improvement are unclear; further, the data collection did not include an adequate number of campuses/modalities or sections. Data collection is the key area needing improvement to make results meaningful.

Current actions to improve CLO based on the results: This CLO will be reassessed in a more meaningful way according to the CLO assessment schedule. The Quantitative Reasoning CLO will be assessed in the new MTH 154 Quantitative Reasoning course.

Program Goals | Evaluation Methods | Assessment Results | Use of Results

Increase the number of students placed in the AS Science / Mathematics Specialization.

SIS query NV_STUDENTS_IN_PLAN Acad Inst: NV 280 Enrl Term: 21x4 Acad Plan: 8802

Target: Increase the number of students program placed in the AS Science / Mathematics Specialization by at least 5% each year.

Results for the past 5 years:
Fall    Number of Students   Percentage Change
2018    344                  -8.3%
2017    375                  +12.6%
2016    333                  +18.9%
2015    280                  +7.7%
2014    260                  --

Target Met: [ ] Yes [ ] No [X] Partially
Comparison to previous assessment(s): Program placement in the AS Science / Mathematics Specialization increased from 2014 to 2017. There was a decrease of 8.3% in 2018.

Most recent results: There was a decrease in the number of students program placed in the AS Science / Mathematics Specialization after three consecutive years of increases. It is unclear why the decrease occurred; this data will continue to be monitored to determine whether the decrease is a trend.

Current action(s) to improve program goal: The Computer Science and Mathematics Pathway Council will continue to monitor the number of program-placed students. It may be necessary to develop marketing and advising material for students and advisors to increase awareness of the program. The Council is currently investigating the need/demand for and feasibility of a separate A.S. Mathematics degree.

Assessed: Annually

Increase the number of students graduating with an AS Science / Mathematics Specialization.

Number of NOVA Graduates by Degree and Specialization: 2017-2018 (Research Report No. 44-18); 2016-2017 (Research Report No. 41-17); 2015-2016 (Research Report No. 41-16); 2014-2015 (Research Report No. 35-15); 2013-2014 (Research Report No. 46-14)

Target: Increase the number of students graduating with an AS Science / Mathematics Specialization at a rate comparable to the percentage increase in program-placed students in the prior year; i.e., since there was a 12.6% increase in program-placed students in Fall 2017, one would expect a similar increase in the number of graduates at the end of the following academic year.

Results for the past 5 years:
Academic Year   Number of Graduates   Percentage Change
2017-18         41                    Goal: +12.6%; Actual: -6.8%
2016-17         44                    Goal: +18.9%; Actual: +41.9%
2015-16         31                    Goal: +7.7%; Actual: -31%
2014-15         45                    --
2013-14         50                    --
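The goal-versus-actual comparison in this table can be reproduced directly from the counts above; the sketch below is illustrative only, with the goal values taken from the prior fall's placement growth reported earlier.

```python
# Hypothetical sketch of the comparison described in the target: the actual
# change in graduates is measured against the prior fall's growth in
# program-placed students. Figures are copied from the tables above.
graduates = {"2015-16": 31, "2016-17": 44, "2017-18": 41}
placement_growth_prior_fall = {"2016-17": 0.189, "2017-18": 0.126}  # goals

years = list(graduates)
for prev, curr in zip(years, years[1:]):
    actual = (graduates[curr] - graduates[prev]) / graduates[prev]
    goal = placement_growth_prior_fall[curr]
    met = "met" if actual >= goal else "not met"
    print(f"{curr}: goal {goal:+.1%}, actual {actual:+.1%} (target {met})")
```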

Target Met: [ ] Yes [ X ] No [ ] Partially

Most recent results: Despite the increases in the number of students program placed in the AS Science / Mathematics Specialization in 2015, 2016, and 2017, the number of graduates in the program decreased from academic year 2016-17. It is unclear why this decrease occurred. With the decrease in the number of program-placed students in 2018, this decline may continue. More data may be needed to determine how long it typically takes program-placed students to graduate.

Current actions to improve program goal: The Computer Science and Mathematics Pathway Council and the Mathematics Discipline Steering Committee are currently investigating whether a separate A.S. Mathematics degree is desirable and feasible. Make 200-level courses more accessible to campuses with traditionally low enrollment through delivery of synchronous courses, which will allow for completion of the degree. Continue to work with transfer institutions to make the degree more transferable. Continue to investigate whether the development of an A.S. in Mathematics is a good option for students.

Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Social Sciences, A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This program is designed for individuals who plan to transfer to a four-year college or university to complete a bachelor of science in one of the social sciences. It also prepares students for some teacher certification programs. Students from the A.S. program major in a wide variety of fields, including anthropology, economics, government/political science, history, mass communications, pre-law, psychology, public administration, social work, and sociology.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Civic Engagement PLS Discipline Student Learning Outcomes: “Students will be able to describe the political institutions and processes of the government of the United States.”

American National Politics PLS 135
Direct Measure: This assessment was performed in PLS 135 classes, American National Politics, which deals directly with this SLO.
Provided Rubric Criteria or Question Topics: We asked students 20 multiple-choice (MC) questions requiring them to identify correct responses in four areas of importance to American politics and government: the Constitution, the Legislative Branch, the Executive Branch, and the Judicial Branch.

Sample Size (N/A where not offered)
Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
AL                2                             2                     36
AN                2                             1                     32
MA                N/A                           N/A                   N/A
ME                0                             0                     0
LO                1                             1                     17
WO                0                             0                     0
NOVA Online       3                             0                     0
DE*               N/A                           N/A                   N/A
Total             8                             4                     85

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: Average score of 80% or higher on each criterion as well as on the overall score.

Results by Campus/Modality   Spring 2018 Discipline Average Score
AL                           82
AN                           78
LO                           77
Avg score                    79

Results by SLO Criteria/Question Topics   Spring 2018 Average Score
Constitution                              78
Legislature                               81
Executive                                 86
Judiciary                                 73
Avg score                                 79.5

Current results improved: N/A; first time assessed.
Strengths by Criterion/Question/Topic: Students performed best in the area of the Executive Branch, which should not be surprising, as it receives the most media attention; the Legislative Branch was next. Both topics are above the target.
Weaknesses by Criterion/Question/Topic: Knowledge of the Constitution and the Judiciary was lacking, which is not surprising, particularly for the Judiciary. Still, scores on questions in both areas were not far off target.

Previous action(s) to improve SLO: N/A; first time assessed.

Target Met: [ ] Yes [ ] No [X] Partially
All campuses were slightly above or below target, which is encouraging. However, knowledge of the Constitution and the Judiciary lags behind the other categories.

Current actions to improve based on recent results, areas needing improvement:
- Spend more time in class discussing the Constitution and the Judiciary.
- Inform students of the importance of both topics.
- After Spring 2019, adjust the questions.
- We have gotten all campuses and NOVA Online to participate for the Fall 2018 semester.

We will share results and recommendations with all faculty to address these issues, discuss them at the Discipline Group meeting and during the Spring 2019 convocation, and continue to discuss them at every fall convocation.

Next Assessment: Fall 2018

Critical Thinking Student Development Orientation SDV 100

Semester/year data collected: Spring 2018



SDV 100: Identify three to five aspects of critical thinking, such as identifying faulty logic, problem-solving, and asking questions/probing.

Direct Measure: Students were quizzed on 5 critical thinking questions embedded in a College Resource Quiz in SDV 100.
Question Topics:
• Q9: Thinking creatively
• Q10: Solving problems
• Q15: Critical thinking in high school versus college
• Q17: Narrowing the problem
• Q18: Critical thinking

Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
AL                21                            13                    230
AN                36                            32                    678
MA (+1 SDV 101)   15                            9                     161
ME (SDV 101)      11                            5                     49
LO                18                            13                    250
WO                22                            5                     109
ELI               24                            17                    246
DE*               10                            1                     21
Total             157                           95                    1744

*Dual-enrollment
The major improvement in data collection overall is due to the support and insistence of the Associate Deans of Student Development on each campus.

Target: 80% of students will answer the 5 critical thinking questions included on the College Resource and Critical Thinking Quiz correctly.

Results (% answering correctly):
Campus/Modality   Q9     Q10    Q15    Q17    Q18    Total
AL                97%    93%    31%    11%    83%    63%
AN                95%    88%    24%    11%    80%    60%
MA                98%    94%    28%    3%     86%    62%
ME                98%    92%    16%    78%    80%    73%
LO                99%    93%    23%    13%    84%    62%
WO                100%   96%    100%   100%   100%   99%
NOVA Online       96%    68%    13%    76%    90%    69%
DE                100%   95%    24%    86%    100%   81%
Total Average     98%    90%    32%    47%    88%    71%
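A sketch of how a campus-by-question percent-correct table like this one could be assembled from item-level quiz records. The record layout (campus, question, correct flag) is an assumption rather than the actual Blackboard or SDV export format.

```python
# Hypothetical sketch: tally correct answers per (campus, question) pair and
# report the percent correct, mirroring the structure of the table above.
from collections import defaultdict

tally = defaultdict(lambda: [0, 0])   # (campus, question) -> [correct, attempted]

def add_response(campus, question, correct):
    cell = tally[(campus, question)]
    cell[1] += 1
    if correct:
        cell[0] += 1

# Illustrative records only
add_response("AL", "Q9", True)
add_response("AL", "Q9", False)
add_response("AN", "Q18", True)

for (campus, question), (correct, attempted) in sorted(tally.items()):
    print(f"{campus} {question}: {100 * correct / attempted:.0f}% correct")
```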

Current results improved: N/A; first time this topic was assessed.
Strengths by Criterion/Question/Topic: Questions 9, 10, and 18 had the best scores because they could be answered using good test-taking skills and by ruling out the weaker multiple-choice answers. The questions are broad enough that they can be answered even without reviewing the textbook.
Weaknesses by Criterion/Question/Topic: Questions 15 and 17 had the lowest scores. Question 15 requires the student to pick several right answers, so there is more room for error. Question 17 had the most wrong answers because it is not worded directly from the text; it is inferred from the reading material and requires more critical thinking to identify the best answer.

Previous action(s) to improve CLO: The SDV Curriculum Committee holds a yearly mandatory SDV In-Service where instructors present best practices on student engagement and learning (May 2016, May 2017, June 2018). The Committee has also considered using a different textbook, but our primary goal has been to keep the textbook affordable by using OER (Open Education Resources). We have considered that, because the textbook is only available online, students may be discouraged from reading it. The Committee reviewed textbooks in 2017-2018 and voted against the different options because they could not remain free; at this time we have not found a better free textbook that covers the topics we review in this class. Most of the assignments require self-assessment and reflection, and students feel more comfortable with those assignments than with assessments and quizzes that require them to review the online textbook. NOVA Online, formerly ELI, differed on when and where it assessed the critical thinking questions: the questions were not in the first quiz/assessment and were not attached to a college resource quiz but formed their own separate quiz. This suggests that placing the critical thinking reading assignment/assessment in its own category later in the class may improve results.

Target Met: [ ] Yes [ ] No [X] Partially

Based on recent results, areas needing improvement: The Critical Thinking CLO is currently located with College Resources and Communication Skills. Comparing with how ELI placed its assessment, students may do best if Critical Thinking has its own category after Academic and Test-Taking Skills.

Current actions to improve CLO based on the results: Unfortunately, the Fall 2018 assessment is already well underway, so it is too late to make any improvements or changes. Critical Thinking is not going to be assessed in Spring 2019. Comparing Spring 2018 to Fall 2018 will provide more results to see whether there is improvement or whether the data stay the same.

Next assessment of CT: Spring 2020

Quantitative Reasoning: Students will use numerical values to perform various calculations and draw reasonable conclusions. Students will use graphical methods to organize and interpret data.

General Chemistry I & II CHM 111 and 112 Direct Measure: Lab Report (pilot) Rubric Criteria: QR Rubric for Lab assignment: Five criteria presented on the Quantitative Reasoning (QR) Rubric:

I. Interprets Quantitatively: Explains the numerical information presented in mathematical forms (equations, formulas, graphs, diagrams and tables).

II. Presents quantitatively: Converts the given information into mathematical forms such as tables, graphs, diagrams, and equations.

III. Analyzes thoughtfully: Draws relevant conclusions from provided information and data, and predicts future trends.

IV. Communicates qualitatively and persuasively: uses quantitative evidence to support the argument or purpose of the work (what evidence is used, how it is formatted and contextualized).

V. Problem solving: Sets up a numerical problem and calculates the solution correctly

Sample:
Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
AL                10                            1                     23
AN                18                            1                     25
MA                8                             3                     52
ME                0                             0                     0
LO                23                            8                     128
WO                8                             0                     0
NOVA Online       1                             1                     18
DE*               8                             8                     78
Total             76                            22                    324

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: The average score of participating students will be 70%. For itemized criteria, 70% of students will correctly answer each item.

Results by Campus/Modality   Spring 2018 Average Score   % Earned
AL                           16.8                        84.1
AN                           14.7                        73.4
MA                           17.9                        89.6
ME                           N/A                         N/A
LO                           14.2                        71.1
WO                           0                           0
ELI                          16.8                        84.0
DE*                          15.6                        78.0
Total Ave                    14.8                        74

Results by CLO Criteria/Question Topics   Spring 2018 Average Score   % Earned on Questions
1.                                        2.9                         72.5
2.                                        3.0                         75.0
3.                                        3.0                         75.0
4.                                        3.0                         75.0
5.                                        2.9                         72.5
Total Ave                                 3.0                         74

Current results improved: [X] Yes [ ] No [ ] Partially
Four out of the five campuses offering in-person Chemistry courses contributed data for this report, in addition to ELI and DE courses. Although the larger sample of students evaluated resulted in lower scores on each criterion, the results for this assessment are considered more meaningful than those from Fall 2017. In spite of the overall decrease in the average, the targeted values for the evaluation were met by each participating campus and on each criterion.

Previous action(s) to improve SLO: This was the second round of assessing the QR objectives. In the January 2018 cluster meeting, the discipline group discussed the previous assessment from Fall 2017 and ways to improve faculty participation and the Core Learning Outcomes. There were some questions about interpreting the rubric that seemed to be the reason for insufficient faculty participation. After the meeting, on January 5, an informative follow-up email was sent to the cluster to allow enough time to plan for the semester. The following changes were adopted:
• To improve the consistency of the assessments, and hence the results, two laboratory experiments were selected and shared with the faculty to use for the evaluation.
• To strengthen students' Core Learning Outcomes, a handout with guidelines on data analysis, thinking quantitatively, and writing analytically was developed and shared with the discipline to distribute to all students on all campuses. This was to ensure that all students have access to the same information prior to their analytical writing and interpretation of data.
• To maintain standardization of the collected data, a table for collecting information was developed and shared with the Assistant Deans.

Target Met: [X] Yes [ ] No [ ] Partially
All campuses met, and some exceeded, the targeted value. WO did not participate in the assessment, and only one course each from AL and AN participated. Compared to Fall 2017, course participation increased from 10% to 29% in Spring 2018.

Page 385: Annual Planning and Evaluation Report Instructional Programs … · report; if there is a question about an evaluation method, please contact the instructional program or OIESS)

375

Assessment Results' Calculation: Average Score = Total Points in all courses ÷ Total Number of Students. Maximum points available = 20 points, so (15.2/20) × 100 = 76% and (16.7/20) × 100 = 84%.
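A one-line restatement of that calculation as code (illustrative only): divide the average rubric points by the 20-point maximum and express the result as a percent.

```python
# Hypothetical sketch of the calculation described above.
MAX_POINTS = 20

def percent_earned(average_score):
    """Convert an average rubric score (out of 20) to a percent."""
    return average_score / MAX_POINTS * 100

print(percent_earned(15.2))  # 76.0
print(percent_earned(16.7))  # 83.5 (reported as 84% after rounding)
```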

There was very little to no variation in the average scores among the criteria, which indicates students' overall preparation. Furthermore, students met the targeted goal for each item.
Strengths by Criterion/Question/Topic: Three of the criteria, "Presents quantitatively," "Analyzes thoughtfully," and "Communicates qualitatively and persuasively," were scored equally high.
Weaknesses by Criterion/Question/Topic: "Interprets quantitatively" and "Problem solving" were among the weaknesses of the students evaluated. Both of these criteria are math related, and more students find these types of assessments challenging. This may improve with the addition of some kind of math-related activity to the curriculum during the first few weeks of the semester.

The number of students participating in this assessment increased by over 200% compared to Fall 2017. Moreover, ELI and DE courses participated at close to 100%. Future results may be improved by adding a lab activity at the beginning of the semester to familiarize students with some of the mathematical manipulation and graphical analysis that they will encounter throughout the course.

Scientific Literacy: Students will understand the scientific method and identify methods of inquiry that lead to scientific knowledge.

General Biology I BIO 101
Direct Measure: A quiz on the Scientific Method was available on Blackboard to all BIO 101 students in the college (students from all campuses, including ELI and DE) toward the end of the Fall 2017 semester. The quiz consisted of 10 multiple-choice questions that assessed steps in the Scientific Method. The topics were as follows:
• Item #1: observation is first step
• Item #2: order of steps
• Item #3: definition of hypothesis
• Item #4: validity of hypotheses
• Item #5: importance of control
• Item #6: definition of data
• Item #7: example of hypothesis
• Item #8: definition of variable
• Item #9: definition of theory
• Item #10: defining data collecting
This assessment is the same as the one given to students in the previous year. All assessment data is gathered through Blackboard.

Sample:

Program                                        Sample Size (#)   %
General Studies                                173               31
Social Sciences                                92                16
Liberal Arts                                   39                7
Science                                        115               20
Computer Science                               11                2
Business Admin                                 56                10
Information Tech                               25                4
Dual Enrollment (Off-site and On-campus)       9                 2
Other (8 Students or Less)                     43                8
Total, All Programs                            563               100

The assessment tool was deployed on Blackboard to all students taking BIO 101 on campus (AL, AN, LO, MA, WO), at ELI, or as dual-enrolled students, and 572 students took part in this assessment. The exact total number of students in BIO 101 during Fall 2017 is not available, but it is around 1,600; this approximate number allows us to determine that about a third of all students responded to the Blackboard notice and took the quiz. Dual enrollment students were included, and 101 DE students (17.6% of the total) took the assessment. In the case of ELI, 128 ELI students (22.3% of the total) took the assessment. The number of students from each campus was not tallied; however, the student ID numbers are in the raw data, and specific information can be gleaned from the data.

Semester/year data collected: Fall 2017
Achievement Target:
• For the whole quiz, 70% of students achieving 70% on the quiz.
• For each item, 70% of students correctly answering that item.

Like the previous year, students identified themselves by major. This allowed us to compare results from students program placed in General Studies (219), Social Science (195), and Science (279). Note that these numbers add to 693; some of the students listed double majors.

Results for all students:
• Average student score: 84.2%
• Percentage of students earning at least 70%: 88.5% (506/572)
• Percentage of students answering each item correctly:
  1. Item 1: 64.0% (366/572)
  2. Item 2: 94.0% (538/572)
  3. Item 3: 92.4% (529/572)
  4. Item 4: 88.9% (509/572)
  5. Item 5: 81.1% (464/572)
  6. Item 6: 93.8% (537/572)
  7. Item 7: 85.4% (489/572)
  8. Item 8: 87.4% (500/572)
  9. Item 9: 65.7% (376/572)
  10. Item 10: 89.6% (500/572)
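The two-part achievement target can be checked mechanically once item-level scores are available. The sketch below is a hypothetical illustration; the `responses` structure (one row of 0/1 item scores per student) is an assumption about how the Blackboard data might be arranged.

```python
# Hypothetical sketch: check the "70% of students achieve 70%" quiz target
# and the 70%-correct-per-item target from item-level scores.
def check_targets(responses, threshold=0.70):
    n_students = len(responses)
    n_items = len(responses[0])

    # Share of students who reached the 70% score threshold on the quiz
    passing = sum(1 for row in responses if sum(row) / n_items >= threshold)
    quiz_target_met = passing / n_students >= threshold

    # Percent answering each item correctly, and whether each item met 70%
    item_rates = [sum(row[i] for row in responses) / n_students
                  for i in range(n_items)]
    items_met = [rate >= threshold for rate in item_rates]
    return quiz_target_met, item_rates, items_met

# Tiny illustrative example with 4 students and 3 items
demo = [[1, 1, 0], [1, 1, 1], [1, 0, 1], [1, 1, 1]]
print(check_targets(demo))
```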

Results for students placed in A.S. Science:


• Average student score: 83.3%
• Percentage of students earning at least 70%: 87.8% (245/279)
• Percentage of students answering each item correctly:
  1. Item 1: 58.7% (164/279)
  2. Item 2: 93.5% (261/279)
  3. Item 3: 90.6% (253/279)
  4. Item 4: 90.6% (253/279)
  5. Item 5: 78.1% (218/279)
  6. Item 6: 93.9% (262/279)
  7. Item 7: 84.5% (236/279)
  8. Item 8: 87.8% (245/279)
  9. Item 9: 63.4% (177/279)
  10. Item 10: 88.1% (246/279)

Results for students placed in A.S. Social Science:
• Average student score: 82.4%
• Percentage of students earning at least 70%: 86.2% (168/195)
• Percentage of students answering each item correctly:
  1. Item 1: 57.4% (112/195)
  2. Item 2: 93.8% (183/195)
  3. Item 3: 87.1% (170/195)
  4. Item 4: 90.2% (176/195)
  5. Item 5: 74.3% (145/195)
  6. Item 6: 94.8% (185/195)
  7. Item 7: 84.1% (164/195)
  8. Item 8: 86.1% (168/195)
  9. Item 9: 64.1% (125/195)
  10. Item 10: 88.2% (172/195)

Results for students placed in A.S. General Studies:
• Average student score: 82.7%
• Percentage of students earning at least 70%: 85.8% (188/219)
• Percentage of students answering each item correctly:
  1. Item 1: 56.1% (123/219)
  2. Item 2: 91.3% (200/219)
  3. Item 3: 87.2% (191/219)
  4. Item 4: 87.2% (191/219)
  5. Item 5: 80.8% (177/219)
  6. Item 6: 94.0% (206/219)
  7. Item 7: 86.3% (189/219)
  8. Item 8: 85.3% (187/219)
  9. Item 9: 66.2% (145/219)
  10. Item 10: 88.1% (193/219)

Instructors and students of BIO 101 are becoming more accustomed to assessment through Blackboard. During the Fall 2018 cluster meeting, faculty members requested the results of the previous year's data; these data were sent to the Biology discipline chair for dissemination. The low achievement results on Items 1 and 9 are important to the biology faculty, because they show that students do not understand that curiosity is the first step in solving a scientific problem through the scientific method. The term "theory" also continues to confuse students: their wrong answers indicate that they do not realize a "theory" in science is not a hypothesis but a well-substantiated explanation of the natural world. It is valuable for instructors to have this feedback. The discipline chair elected in Fall 2018 has already seen these data and wants to work with faculty on the concepts behind the two low-scoring questions for the 2019-20 academic year. We need to find out whether students are not understanding the concepts or there is a problem with the questions themselves.

This is the second year that A.S. Science students were identified in the assessment. Although most A.S. Science majors take BIO 101, many students in General Studies, Social Sciences, and other majors also take BIO 101. Faculty assessing Social Science and General Studies asked if we could identify their students, since those programs also wish to use this Scientific Method assessment for students in their majors. For the 2018-19 assessment year, we plan to add A.S. Liberal Arts. It is interesting that the results again are very similar for students regardless of major: BIO 101 is a class taken by science students early in their academic career, and the results show that science students at this early stage did not outperform students in other majors.

In this assessment, we were able to demonstrate for the first time that students from all campuses, ELI, and Dual Enrollment took part. In the current Blackboard setup, each question is posed as an independent mini-exam, which takes more time for students; the two additional questions about ELI and DE status did not discourage students. Nearly 18% of student responders were DE students, and 22% were ELI students.

Next Assessment: Spring 2019


Results indicate that total scores are well above 70%, and most individual items (8 out of 10) meet the achievement goal. Scores were very similar to those of last year. The lowest scores were on Items 1 and 9: Item 1 asked about the first step in the Scientific Method, and Item 9 asked for the definition of the word "theory."

Current results improved: [ ] Yes [ ] No [X] Partially
Scores from students program placed in Science, Social Science, and General Studies are very similar. In the 2015-16 academic year, students scored below 70% on questions 1, 2, and 9 (42%, 47.5%, and 57.4%). In 2016-17, students scored below 70% on questions 1 and 9 (65.4% and 66.6%). This cycle, students again scored below 70% on questions 1 and 9 (64% and 65.7%). This shows a marked improvement in identifying the steps of the Scientific Method (question 2) over the years assessed, and an improvement in general knowledge of the Scientific Method.

Program Goals | Evaluation Methods | Assessment Results | Use of Results

Program goal on program-placed students

Distribution of Program Placed Students by Curriculum and Award Type: Fact Book 2012-2013 through 2017-2018

Semester/year data collected: Fall 2017
Target: The growth from Fall 2016 to Fall 2017 in the number of program-placed students for the Social Sciences degree will meet or exceed that of the College total.

Fall        Number of Students   Percentage Difference
Fall 2017   3702                 -4.1%
Fall 2016   3863                 -6.5%
Fall 2015   4132                 +5.3%
Fall 2014   3923                 +6.1%
Fall 2013   3697                 --

Total NOVA Program-Placed Students
                          Fall 2013   Fall 2014   Fall 2015   Fall 2016   Fall 2017
Students                  44592       43871       43323       41342       40121
Change from prev. year    --          -1.6%       -1.2%       -4.5%       -2.95%

Target Met: [ ] Yes [X] No [ ] Partially
Achievement Target Not Met: From Fall 2016 to Fall 2017, there was a 4.1% decrease in Social Sciences program-placed students, while the College experienced a decrease of 2.95%.
Comparison to previous assessment: The achievement target was also not met in the previous assessment year of 2016-17, when Social Sciences program-placed students decreased by 6.5% while the College experienced a decrease of 3.9%.

Previous action(s) to improve program goal: Previously, the SLO team reviewed the decrease in program-placed students and established a more realistic achievement target that mirrors College enrollment in terms of meeting or exceeding it.
Most recent results: 4.1% decrease.
Results improved: [ ] Yes [ ] No [X] Partially
Current action(s) to improve program goal: The number of program-placed students decreased by 4.1%, thereby not meeting the achievement target. The achievement target for the 2017-18 assessment year was reviewed for possible revision in light of the decrease in program-placed students in Fall 2016, and it was decided to retain the target of mirroring College enrollment in terms of meeting or exceeding it. The program will also dedicate itself to promoting this discipline with more publicity to students and by offering them support services. The guided pathways program is also likely to help increase the number of program-placed students, as programs are mapped to student-centered goals and students get a clear picture of the programs of study being offered.
Assessed: Annually

Program goal on graduation

Number of Graduates: Fact Book 2012-2013 through 2017-2018; Number of Graduates by Program and Specialization: 2017-18

Target: The growth from 2016-17 to 2017-18 in the number of graduates for the Social Sciences degree will meet or exceed that of the College total.

Results for the past 5 years:
Academic Year   Number of Graduates   Percentage Difference
2017-2018       357                   +10.8%
2016-2017       322                   -19.5%
2015-2016       400                   +2.3%
2014-2015       391                   +4%
2013-2014       376                   --

Target Met: Yes. The number of Social Science graduates increased by 10.8%, while the number of College graduates fell by 5.6% in 2017-18 (total College graduates in 2017-18 = 7004; total graduates in 2016-17 = 7443).
Comparison to previous assessment: From 2015-16 to 2016-17, Social Sciences graduates decreased by 19.5%, while College graduates decreased by 3.9% (total graduates in 2015-16 = 7752; total graduates in 2016-17 = 7443).

Previous action to improve program goal: The SLO team reviewed the resources available for students to assess how more students could be retained and how program graduation totals could be increased. Advising was a major resource for guiding students toward completion of their degrees. The SLO team promoted a responsive curriculum with student learning as its core value, and communicated with the Advising Divisions and all appropriate offices so there could be further initiatives to increase student enrollment, success, and retention.
Most recent results: Social Science graduates increased by 10.8%.
Results improved: [X] Yes [ ] No [ ] Partially
Current actions to improve program goal: In light of the decrease in the number of graduates in the previous assessment year, the SLO team reviewed the achievement target in 2017-18 and left it unchanged as a realistic goal. These results will be shared with the Office of Student Success and Initiatives, the Counseling/Advising Divisions, and all appropriate offices so there can be further initiatives to increase student enrollment, success, and retention. Students are now receiving structured advising from their faculty advisors, an essential component for student success, and this has had a positive effect on student retention so far. The guided pathways program is also likely to positively impact student retention and success. The SLO team is hopeful of stable student enrollment as the advising program gets further structure and enhancements. This program goal will be assessed again next year in 2018-19, and the achievement target will be reviewed to decide whether it is realistic to sustain or whether a more realistic target should be established.
Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Geographic Information Systems Career Studies Certificate

Social Science: Geospatial Specialization, A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: This program is designed to prepare students to transfer into baccalaureate programs in the geospatial or social sciences at a four-year institution. Students will develop both theoretical knowledge and a practical facility with geospatial systems.

Geographic Information Systems Career Studies Certificate: This program is designed to help students develop both theoretical knowledge and a practical facility with GIS. Students who already hold a baccalaureate or master's degree will acquire the requisite skills and knowledge to switch careers or to apply spatial analysis in their present workplaces. Students will be positioned to pursue additional coursework toward an associate degree and/or transfer to a four-year institution for further study in the geospatial, environmental, or physical sciences; in civil engineering; in information technology; or in business/marketing.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Students will prepare and present geospatial materials to an end user

Cartography for GIS GIS 203
Direct Measure (rubric attached): Students were assessed on 15 criteria within a class project:
1. Title
2. Scale
3. North Arrow
4. Name
5. Legend
6. Visual Hierarchy
7. Font Selection
8. Labeling
9. Use of Greyscale
10. Symbolism
11. Required Content
12. Harmony
13. Projection
14. Map Package
15. Presentation

For each element, the inclusion, placement, selection, fit, and design of the item or concept is assessed. Each student was evaluated on a 4-point scale, where a score of 4 is above and beyond skill level, 3 is at skill level, 2 is working toward skill level, and 1 is below skill level. Instructors are encouraged to identify any areas of instructional deficiency, the rationale for lower scores, and suggested improvements or modifications to be considered.

Semester/year data collected: Fall 2017
Target: 80% of students will score 3 or higher overall. The prior assessment was in GIS 201 in Spring 2017; a different scoring metric was used at that time, so the earlier average score is not comparable, but for continued assessment the percent above target is relevant.

Results by Campus/Modality:
        Fall 2017 Average Score   Fall 2017 % > Target   Spring 2017 Average Score   Spring 2017 % > Target
LO      3.9                       100                    NA                          86.7

Results by SLO Criteria/Question Topics (previous results used an alternative metric, so the assessment questions differ and are not directly comparable):
Criterion   Fall 2017 Average   Fall 2017 % > Target   Spring 2017 Average   Spring 2017 % > Target
1.          3.7                 93                     85.7                  86.7
2.          3.9                 93                     84.8                  93.3
3.          4.0                 100                    84.8                  93.3
4.          3.8                 93                     79.0                  80
5.          4.0                 100                    90.0                  86.7
6.          3.9                 93                     83.8                  93.3
7.          3.8                 93
8.          3.9                 93
9.          3.8                 93
10.         3.9                 93
11.         3.7                 93
12.         3.8                 93
13.         4.0                 100
14.         3.9                 93
15.         4.0                 100
Total       3.9                 95                     83.2                  88.8

Previous action(s) to improve SLO: In GIS 201, to improve students' basic GIS skills, the credit hours of the class were expanded and the classes were structured as hybrid classes. The teaching tutorials (a set of guided exercises) were combined with assigned readings and other elements to become the hybrid portion of the course. As students improve their basic GIS skills, they have more time to hone their speaking and writing skills. This SLO was evaluated in Spring 2017 with a high level of success in meeting its objectives, and many of those students would have enrolled in GIS 203 in Fall 2017 (this assessment). These results would seem to confirm what was measured in the prior class.

Target Met: [X] Yes [ ] No [ ] Partially

Based on recent results, areas needing improvement: The evaluation metrics are so specific that in many instances they become more of a pass/fail judgment. This needs to be addressed.

Current actions to improve SLO based on the results: For the next assessment, groups of items such as Legend and North Arrow will be combined and evaluated as required map elements, while new, broader metrics will be selected and new guidelines for these specific metrics will be developed.

Next Assessment: Spring 2019 in GIS 205


Prior assessment methodology: In the prior assessment, GIS 201 instructors evaluated each student presentation based on a specific presentation rubric. Presentations were graded on 6 criteria; each criterion was graded 1-7, where a score of 1 indicated work completed at a minimal level and a 7 indicated excellent work. As this SLO is about presenting spatial data, the ability to write clearly, speak well, and do so professionally is important. As such, the presentation of the project was graded relative to these considerations:
6. Is the report well written and easy to read?
7. Do the charts and tables enhance the report?
8. Did the author fully develop a logical problem and solution?
9. Quality of the presentation
10. Quality of the video
11. Does the report have a professional feel?

Sample:
Campus/Modality   # of Total Sections Offered   # Sections Assessed   # Students Assessed
LO only           1                             1                     15
ELI               N/A                           N/A                   N/A
DE*               N/A                           N/A                   N/A

*Dual-enrollment


Current results improved: [X] Yes [ ] No [ ] Partially
Strengths by Criterion/Question/Topic: The new assessment is clear, in depth, and measurable across classes and assignments. This approach evaluates student learning over a higher number of variables than we typically assess, but the nature of these criteria is not without issue.
Weaknesses by Criterion/Question/Topic: A 4-point scale proves to be a limited means of differentiating between students when the criteria are so numerous and specific that the evaluation nearly becomes a pass/fail test.

Students will plan and perform spatial analysis

Exploring Our Earth: Introduction to Remote Sensing GIS 255
Direct Measure: Students were assessed on 3 questions relative to an aspect of their class project. Each student was evaluated 1-10 on each part, and the combined scores were converted to point values of 1-4 such that a combined score of 18 or less receives a score of 1; 19-21 receives 2 points; 22-24 receives a score of 3; and 25 and above receives a score of 4. Instructors are encouraged to identify any areas of instructional deficiency, the rationale for lower scores, and suggested improvements or modifications to be considered. Graded components were:
1. Creation of a training data set for the study area; evaluation of class statistics
2. Processing using multiple classification algorithms; selection of the best technique
3. Post-classification filtering
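The score conversion described above maps the 3-30 combined score onto the 1-4 rubric bands; the minimal sketch below is a hypothetical helper illustrating that mapping, not part of the course materials.

```python
# Hypothetical sketch of the score conversion described above: three parts
# graded 1-10 each, with the combined score mapped to a 1-4 point value.
def band(part_scores):
    combined = sum(part_scores)          # three parts, each scored 1-10
    if combined <= 18:
        return 1
    if combined <= 21:
        return 2
    if combined <= 24:
        return 3
    return 4                             # 25 and above

print(band([6, 6, 6]))   # 18 -> 1
print(band([8, 8, 9]))   # 25 -> 4
```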

Semester/year data collected: Fall 2017
Target: 80% of students will score 3 or higher overall.
Results: Scores from prior assessments use a different metric, so the average scores are not comparable, but the percent above target is applicable.

Results by Campus/Modality:
        Fall 2017 Average Score   Fall 2017 % > Target   Fall 2016
LO      4                         100                    68.8

Results by SLO Criteria/Question Topics (different questions between assessments prohibit direct comparisons):
Criterion   Fall 2017 Average   Fall 2017 % > Target   Fall 2016 Average   Fall 2016 % > Target
1.          4                   100
2.          4                   100
3.          4                   100                    57.9                42.1
4.                                                     57.9                47.4
5.                                                     55.3                42.1
6.                                                     68.4                57.8
7.                                                     71.1                68.4
8.                                                     73.7                73.7
9.                                                     81.6                78.9
10.                                                    78.9                73.7
Total       4                   100                    68.1                60.5

Previous action(s) to improve SLO: Previously, credit hours were expanded and the classes were structured as hybrid classes. The teaching tutorials (a set of guided exercises) were combined with assigned readings and other elements to become the hybrid portion of the course. Students were given online quizzes related to the teaching tutorial exercises and allowed to retake them until they answered the tutorials correctly. It was believed this could facilitate student preparation for in-class lecture and lab work, and that, combined, these changes would facilitate not only increased scores but also elevated understanding and achievement among students. However, once implemented, students seemed to simply retake the quizzes until they got perfect scores, so the practice was reverted to a more traditional quiz format for the Spring 2017 semester.

Target Met: [X] Yes [ ] No [ ] Partially


Prior Assessment Methodology: In GIS 200 in Fall 2016, instructors evaluated each student at the end of the semester based on class projects relative to this SLO, using specific criteria. The specific criteria for the project were:
1. Precise and unambiguous writing
2. Contains required components
3. Quality of your GIS technique
4. Aesthetics of your map(s)
5. Utility of the graph/table
6. Data quality
7. Map package
8. Geodatabase
9. Fully developing a logical problem
10. Articulating a plausible solution
Each criterion was scored 0-10. Criteria 3-10 directly address the ability to plan and perform spatial analysis.
Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 13
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual-enrollment

Criterion | Fall 2017 Average Score | Fall 2017 % > target | Fall 2016 Average Score | Fall 2016 % > target
4. | | | 57.9 | 47.4
5. | | | 55.3 | 42.1
6. | | | 68.4 | 57.8
7. | | | 71.1 | 68.4
8. | | | 73.7 | 73.7
9. | | | 81.6 | 78.9
10. | | | 78.9 | 73.7
Total | 4 | 100 | 68.1 | 60.5

Current results improved: [ x ] Yes [ ] No [ ] Partially Strengths by Criterion/ Question/Topic: Criteria are applied techniques used in data analysis, and to be successful students must plan and implement multiple procedures. Weaknesses by Criterion/ Question/Topic: When an assessment identifies nothing but perfect scores, the assessment should be evaluated carefully.

More questions should be asked, and those questions should be difficult enough to mute the near-perfect scores from the last assessment. Current actions to improve SLO: Based on these results, we plan to redesign our assessment to have between 6 and 8 questions. This change will be implemented prior to the next assessment of this SLO, in Fall 2018. Next Assessment: Fall 2018

Students will manage diverse spatial data

Geographic Information Systems I GIS 200 Direct Measure: Elements of a student project were assessed to evaluate achievement in relation to the SLO. Evaluation Criteria: Student projects were scored on selected questions to evaluate their level of mastery of this SLO. Managing diverse spatial data can be reviewed in several ways; for this assessment, the following aspects of the final projects were examined: (1) the student's correct use of GIS techniques and database design; (2) the quality of the GIS analysis and the ability to explain it; and (3) creativity in problem solving. Together, these indicate how well students manage data.

Semester/year data collected: Spring 2018 Target: 80% of students will score 3 or higher overall Results:

Results by Campus/Modality:
Campus/Modality | Spring 2018 Average Score | Spring 2018 Percent > target | Spring 2017 Average Score | Spring 2017 Percent > target
LO only | 3.4 | 81.8 | |
Results by SLO Criteria:

Results by SLO Criteria/Question Topics:
Criterion | Spring 2018 Average Score | Spring 2018 % of Students > target | Spring 2017 Average Score | Spring 2017 % of Students > target
1. | 3.4 | 81.8 | 2.47 | 88.9
2. | 3.4 | 81.8 | 2.53 | 77.8
3. | 3.5 | 81.8 | 2.76 | 88.9
4. | | | 2.65 | 94.4
Total | 3.4 | 81.8 | 2.60 | 87.5

Previous action(s) to improve SLO: Beginning in Fall 2017, to improve student GIS skills, the credit hours of the class were expanded and the classes were structured as hybrids. The hybrid structure allows students to combine online elements from multiple sources while still receiving high-quality one-on-one instruction in lecture as well as in lab environments. Additionally, a new text was selected that hosts lab activities more in line with the premise of 3D GIS. The new text will be supplemented with additional content to enhance the significantly improved laboratory activities. No additional corrective actions are planned for GIS 205 at this time. Target Met: [ x ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: We will continue to monitor, but we believe we are successfully achieving this SLO in this class; we need to consider alternative assessment metrics in other classes.


As complex problems require skilled application of theory and management of data to facilitate solutions, students' successful application of this SLO was evaluated against these aspects of their projects. Students were evaluated on a 4-point scale, where 4 was above and beyond skill level, 3 at skill level, 2 working toward skill level, and 1 below skill level, and the results are reported here. Instructors are encouraged to identify any areas of instructional deficiency, rationale for lower scores, and suggested improvements or modifications to be considered. Prior assessment: The prior assessment was conducted in GIS 205 in Spring 2017, based on student projects. Specific components of the final project in GIS 205 directly relate to this SLO. The use and management of data in ModelBuilder, the types of analysis, and the originality of the analysis all speak to a student's ability to manage data. As such, the projects can be evaluated relative to these components to gauge the student's mastery of the concepts:
1. Effective use of ModelBuilder: 0 - not used; 1 - used in simple analysis; 2 - used in a nonfunctional complex model; 3 - used successfully in a complex model
2. Use of 3D tools: 1 - very basic 3D tool usage; 2 - complex 3D tool usage, inclusion of multiple 3D tools
3. Originality: 1 - very simple design, limited data; 2 - inclusion of multiple data types and data sets in a basic 3D model; 3 - well-designed original model using multiple types of data
4. Analysis: 1 - observational analysis; 2 - usage of tools to generate new data layers; 3 - complex data analysis with statistical elements
Students at an acceptable level in the program should accumulate between 8 and 12 points from this metric. The instructor re-evaluated the class project relative to this SLO-specific metric and scored each project. Instructors are encouraged to identify any areas of instructional deficiency, rationale for lower scores, and suggested improvements or modifications to be considered.
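As a rough sketch only, the prior rubric's point total and the "acceptable level" band described above could be computed as follows; the dictionary keys and example scores are illustrative placeholders, not actual student data.

```python
def gis205_total(scores):
    """Sum the four GIS 205 rubric criteria (ModelBuilder use, 3D tools,
    originality, analysis) and flag whether the total falls within the
    8-12 point 'acceptable level' band described in the report."""
    total = sum(scores.values())
    return total, 8 <= total <= 12

# Illustrative example only (hypothetical scores)
example = {"modelbuilder": 3, "three_d_tools": 2, "originality": 3, "analysis": 2}
total, acceptable = gis205_total(example)
print(total, acceptable)  # 10 True
```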

Current results improved: [ ] Yes [ ] No [ x ] Partially Strengths by Criterion/Question/Topic: A consistent and repeatable scoring metric has innate value for long-term tracking and evaluation. A simple assessment design reduces ambiguity and facilitates competent and consistent evaluation. However, this SLO is somewhat difficult to ascertain, as the ability to achieve it depends partly on the difficulty of the specific project and its timing within the semester. Weaknesses by Criterion/Question/Topic: The primary weakness is the metric's limited scope. Specifically, the limited number of questions asked may not adequately encompass the scope of the SLO, and there is some difficulty in developing new and better ways to evaluate this particular achievement. This is further complicated when this SLO is evaluated across differing classes.

Current actions to improve SLO based on the results: During the next assessment cycle for this SLO in GIS 101, 200, or 201, timely completion of lab activities (which requires students to manage and move data) will be examined as a potential metric. Success will be defined as a reasonable percentage of students completing a reasonable percentage of lab assignments on time after the third week of the semester (to provide some time to develop the skills in earlier GIS classes), along with the assessment of finding and managing data for class projects when applicable. A trial of this assessment strategy will be implemented for the next assessment cycle of this SLO. Next assessment: GIS 201 in Fall 2019


Sample:
Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 11
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A

*Dual-enrollment
Core Learning Outcome | Evaluation Methods | Assessment Results | Use of Results
CLO: [ X ] CT (Critical Thinking)

Geographic Information Systems II GIS 201 Direct Measure: Measure student ability to articulate a complex problem and the associated steps to solve it, based on assessment of the project proposal (rubric attached). Instructors are encouraged to identify any areas of instructional deficiency, rationale for lower scores, and suggested improvements or modifications to be considered. Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO only | 1 | 1 | 9
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
*Dual-enrollment

Semester/year data collected: Spring 2018 Target: 70% of students will score 4 or higher overall Results:

Results by Campus/Modality:
Campus/Modality | Spring 2018 Average Score | Percent > target
LO only | 4.4 | 77%
Results by CLO Criteria:

Results by CLO Criteria/Question Topics:
Criterion | Spring 2018 Average Score | % of Students > target
1. Explicitly ask a GIS question | 4.3 | 77
2. Explain in a general sense how you propose to answer that question | 5 | 100
3. Clearly identify the actual data to be used | 5 | 100
4. Describe the maps, tables, charts, or graphs you will make | 5 | 100
5. Utilize the graph/table | 3.6 | 67
6. Articulate a plausible solution | 3.6 | 77

Current results improved: N/A: This was the first semester that this CLO was assessed. Strengths by Criterion/Question/Topic: Grades students on multiple direct measures of their ability to follow instructions, develop a concept, think the concept out in a stepwise design, and anticipate multiple outcomes and uses.

Previous action(s) to improve CLO if applicable: This was the first semester in which CLOs were evaluated. Target Met: [X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: A closer look at the scores of individual components indicates students need to work more on expressing how their analytical output (maps, charts, graphs) will or can be used, and on explaining that in some detail. Current actions to improve CLO based on the results: For the most part, no changes are planned at this time. However, because we traditionally provide additional class time on how to utilize the products of analytical procedures later in the class, after the proposals are due, we plan to implement an additional question on how well students articulate the use of explanatory elements in the reports associated with their class projects. Next assessment of this CLO: The next CLO to be assessed will be Civic Engagement in GIS 201 in Spring 2019.


Weaknesses by Criterion/Question/Topic: Some topics are problematic when trying to assign a variable score (see rubric; some only have options of 1, 2, and 5), as the possible graded outcomes are: did nothing, did a bad job, did it as required.

Program Goals Evaluation Methods Assessment Results Use of Results

Increase enrollment in GIS classes

Track GIS enrollment in all GIS classes from data collected in SIS to see trends over time. Our short-term goal is to return enrollment to 2013-14 levels and show improvement in class occupancy. Overall program growth of 1-2 percent is a reasonable goal moving forward, with a goal of over 60 percent occupancy.

Target: Greater than 60 percent occupancy in our classes and 1-2 percent growth in overall student numbers. Results:

Year | Students | Classes | Occupancy
2013 | 174 | 15 | 60%
2014 | 217 | 18 | 59%
2015 | 147 | 14 | 53%
2016 | 165 | 16 | 54%
2017 | 173 | 13 | 72%
2018 | 174 | 14 | 67%
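The report does not state how occupancy is calculated. A minimal sketch of one plausible calculation is shown below, assuming occupancy is enrolled students divided by total seat capacity; the seats-per-section value is a hypothetical parameter, not a figure from the report.

```python
def occupancy(students, sections, seats_per_section):
    """Percent of available seats filled, assuming a uniform section capacity.
    seats_per_section is a hypothetical assumption; the report does not state it."""
    return 100 * students / (sections * seats_per_section)

# Illustrative check against the 2017 row (173 students, 13 sections): with an
# assumed capacity of roughly 18-19 seats, occupancy lands near the reported 72%.
print(round(occupancy(173, 13, 18.5), 1))
```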

Over the previous four years, student numbers have fluctuated significantly. Historically the program had been growing nicely, but those numbers dropped off when the program's founder and only full-time faculty member left NOVA. For a period of time the program was unattended and relied solely on adjunct faculty, which set back the growth and development of the program somewhat. Overall we are not at the 2014 level, but we are increasing total enrollment and exceeding the target occupancy rate. Our overall enrollment was up one student, or 0.6%, so we are only partially meeting our target goals. Target Met: [ ] Yes [ ] No [ x ] Partially Assessment is unchanged.

In 2016, a new full-time program head was hired to oversee the program. Since that time we have decreased the use of adjuncts, limited the number of sections taught, and begun to rebuild the program. As the table shows, in 2016 we increased enrollment and percent occupancy in our classes. This trend continued into 2017, where (using the best available numbers) the number of students continued to climb, we reduced the sections taught, and we improved our classroom use efficiency, or occupancy, to 72%. We will continue to monitor these trends each year to see short- and long-term changes in enrollment patterns. Most recent results: We experienced a slight decline in class use efficiency, due in part to an additional section during this reporting period. Overall we are meeting our goals, but we have modified our software and moved into a new location with additional seating. Moving forward, our maximum capacity will increase, thus lowering our usage unless we pick up an increased number of students. Results improved: [ ] Yes [ ] No [ x ] Partially Current action(s) to improve program goal: Continue monitoring class enrollment and evaluate the program on a yearly basis to foster minor modifications in a timely manner and avoid dramatic, negative changes in enrollment. Assessed: Annually

Increase GIS AS and CSC program graduates

Track the number of students graduating from the AS and CSC programs from data collected in SIS to see trends over time. Our short-term goal is to maintain a graduation count of 10 students per year in the CSC program and 10 per year in the AS degree.

Target: Increase GIS AS and CSC program graduates. Results for Past 5 Years:
Year | AS Specialization | CSC | Total
2014 | 1 | 8 | 9
2015 | 2 | 10 | 12
2016 | 1 | 4 | 5
2017 | 3 | 5 | 8
2018 | 5 | 12 | 18

Previous action to improve program goal: As we improve our course scheduling and increase our student numbers, we expect to see increases in graduation rates, but it is unwise not to consider other options as well. To begin resolving the issues with the CSC program, a short-term solution adopted was to offer the necessary classes more frequently, starting in Spring 2018. This will allow students to move through the degree faster but does not address the issue of degree length.



Target Met: [ X ] Yes [ ] No [ ] Partially Graduation numbers have improved since last year and are at the highest levels we have seen in 5 years.

Dividing the CSC into multiple stackable degree programs is also being considered and will be the subject of the GIS advisory committee's work in 2018. The number of classes in the specialization was reduced, and it was made more flexible in terms of the required courses, which should make it easier to fulfill the degree requirements of the specialization. Further revision to the AS specialization is under consideration; this would include revising the degree requirements to align more closely with what our transfer partners expect in terms of GIS-specific course work for incoming transfer students. Both programs are currently underperforming but improving. While the CSC seems ready to rebound, the AS degree is underperforming. We will continue to monitor these goals annually. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Social Sciences: Political Science Specialization A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The Social Science Associate of Science degree, with a specialization in Political Science, prepares graduates for transfer into four-year colleges and universities to pursue a Bachelor's degree in political science. The political science discipline educates students about this area of social science and how power operates at the local, state, national, and international levels. Students will also learn about important current events in all of these areas of politics. Most importantly, those with political knowledge are more likely to be civically engaged in public life.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Students will be able to identify the major subfields in political science (PLS): American Politics, Comparative Politics, International Relations, and Political Theory.

Intro to Political Science PLS 120 Direct Measure: This assessment was performed in PLS 120 classes, Intro to Political Science, which deals directly with this SLO. Provided Rubric Criteria or Question Topics: We asked students 20 Multiple Choice (MC) questions requiring them to identify the proper subfields of PLS, which include the following: Political Theory; American Politics; International Relations; Comparative Politics; and Science of Politics. See attached Assessment Quizzes. Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 2 | 2 | 13
AN | 3 | 0 | 0
MA | 1 | 1 | 6
ME | 0 | 0 | 0
LO | 1 | 1 | 5
WO | 0 | 0 | 0
ELI | 3 | 0 | 0
DE* | 0 | 0 | 0
Total | 10 | 4 | 24

*Dual-enrollment

Semester/year data collected: Spring 2018 Target: Average score of 80% or higher on each criterion as well as on the overall score. Results:

Results by Campus/Modality:
Campus/Modality | Spring 2018 Average Score
AL | 71
MA | 67
LO | 68
Total | 69

Results by SLO Criteria:
Criteria/Question Topic | Spring 2018 Average Score
Theory | 64
American Politics | 69
International Relations | 78
Comparative Politics | 72
Science of Politics | 63
Total | 69

Current results improved: N/A: First time assessed. Strengths by Criterion/Question/Topic: International Relations had a surprisingly high score, comparatively (although still below target). Weaknesses by Criterion/Question/Topic: All topics warrant review, since all were below target. In particular, the Theory topic questions, answer choices, and content as discussed in class need to be evaluated.

Previous action(s) to improve SLO: N/A: First time assessed. Target Met: [ ] Yes [X] No [ ] Partially None of the scores reached the minimum, but students did well, and for the first PLS assessment, this worked. Based on recent results, areas needing improvement:
- Need to present important topics earlier in the semester and reinforce the subfield topics throughout the semester.
- Addressing the topic of theory is even more important, as it received the lowest scores. We will need to better explain theories, hypotheses, and research designs.
- Use some portion of every PLS course to address how all subfields are connected.

Current actions to improve SLO based on the results:
- Results will be shared with all faculty, along with recommendations to address the issues.
- Discuss the issue at the Discipline Group meeting. It was discussed in Fall 2018 and will be discussed in Spring 2019.
- For Fall 2018, we have included all campuses (including ELI).
- The program will keep deans and associate deans in the loop so they know what is needed.
Next Assessment: Fall 2018

CLO: Civic Engagement:

American National Politics PLS 135 Direct Measure: This assessment was performed in PLS 135 classes,

Semester/year data collected: Spring 2018

Previous action(s) to improve SLO: N/A: First time assessed.


Students will be able to describe the political institutions and processes of the government of the United States.

American National Politics, which deals directly with this SLO. Provided Rubric Criteria or Question Topics: We asked students 20 Multiple Choice (MC) questions requiring them to identify correct responses in four areas important to American politics and government: The Constitution; Legislative Branch; Executive Branch; and Judicial Branch. Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 2 | 2 | 9
AN | 2 | 1 | 7
LO | 1 | 1 | 1
ELI | 3 | 0 | 0
DE* | N/A | N/A | N/A
Total | 8 | 4 | 17

*Dual-enrollment

Target: Average score of 80% or higher on each criterion as well as on the overall score. Results:

Results by Campus/Modality:
Campus/Modality | Spring 2018 Average Score
AL | 87
AN | 90
LO | 95
Total | 89

Results by SLO Criteria:
Criteria/Question Topic | Spring 2018 Average Score
Constitution | 82
Legislature | 88
Executive | 94
Judiciary | 89
Total | 89

Current results improved: N/A: First time assessed. Strengths by Criterion/Question/Topic: Students in the PLS Specialization performed best in the area of the Executive Branch, which should not be surprising as it receives the most media attention. Surprisingly, students did second best with the Judiciary questions. This area often gets less attention; however, events like major decisions or vacancies may change that on occasion. Thankfully, all scores were above the target of 80%. Weaknesses by Criterion/Question/Topic: Knowledge of the Constitution was the lowest score, albeit still above the target of 80%. We spend a healthy bit of time on the making of the Constitution and what is in it, but we can reconsider where our focus is placed.

Target Met: [ ] Yes [ ] No [X] Partially All campuses are above target, which is great. However, knowledge of the Constitution lags the other categories with regard to scores. Based on recent results, areas needing improvement:
- Need to spend more time in class discussing the Constitution.
- Inform students of the importance of the topic.
- After Spring 2019, we will adjust questions.
- We have gotten all campuses and ELI to participate for the Fall 2018 semester.
Current actions to improve SLO based on the results:
- Results will be shared with all faculty, along with recommendations to address the issues.
- Discuss the issue at the Discipline Group meeting and during the Spring 2019 convocation, and we will discuss this at every fall convocation.
Next Assessment: Fall 2018

Program Goals Evaluation Methods Assessment Results Use of Results

To increase the number of students placed in SS-PLS.

Short description of method(s) and/or source of data:
- We will ask OIR to provide us with numbers on students enrolled in SS-PLS. Currently this information is not on the Planning and Evaluation Data website.

Target: To increase the number of students placed in SS-PLS.

Previous action(s) to improve program goal: N/A Most recent results:
- We do not currently have this information but will ask OIR for it in the future.
- PLS faculty advise students on the value of this degree/specialization.
- One PLS professor is a FAM and is able to keep the Discipline Group informed on trends and information.
- One PLS professor connects students to area internships.


- One PLS professor has compiled information on the value of a PLS degree on a website: https://sites.google.com/site/jlechelt/Home/advisementassistance
- Useful resource on how NOVA PLS courses transfer: https://docs.google.com/spreadsheets/d/1W86h2V4vMEOMZz386foP9PWfzlM4bhWvffxAuB3_3GA/edit?usp=sharing

Current action(s) to improve program goal: See points in most recent results. Assessed: Annually

Increase students enrolled in program. Consider changes as warranted.

Short description of method(s) and/or source of data: - OIR data on graduation rates.

Target: Increase students enrolled in program. Results for Past 5 Years:
Academic Year | Number of Graduates | Percentage Increased
2017-18 | 42 | 27% over 16-17
2016-17 | 33 | 3% over 15-16
2015-16 | 32 | 3% over 14-15
2014-15 | 31 | 3% over 13-14
2013-14 | 30 |

Target Met: [X] Yes [ ] No [ ] Partially

Previous action to improve program goal:
- See points in the previous section, which involve improved advising, coordinated information, and internship opportunities.
- Beyond that, we present information on elections and political issues to various gatherings and through a yearly Post-Election Conference.

Results improved: [ X ] Yes [ ] No [ ] Partially Current actions to improve program goal:
- See above. We will continue with all activities listed above.
- Regular Discipline Group meetings.
- We will be addressing the CLO on Civic Engagement in Spring 2019.
Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Social Sciences: Teacher Education Specialization, A.S. NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This curriculum prepares students to transfer to a 4-year college or university teacher preparation program. It is specifically designed for students who plan to seek endorsement and licensure as teachers in PK-3, PK-6, or special education.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Students will illustrate and explain their understanding of assessment techniques in the public school systems.

Introduction to Teaching as a Profession EDU 200 Direct Measure: Assessment Essay: Identify five (5) different assessment tools you have observed during your field placement or learned about during the course of EDU 200. Describe each one, including how they were used to complete assessments. Rubric: 1. Identify 5 assessments (20 points) 2. Describe Classroom Use (30 points) Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 17 | 1 | 17
AN | 1 | 1 | 16
MA | 1 | 1 | 18
LO | 1 | 1 | 10
WO | 1 | 1 | 9
ELI | 1 | 1 | 6
DE* | N/A | N/A | N/A
Total | 21 | 5 | 76

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 85% of students will score 85% or higher overall on each criterion as well as the overall score Results:

Results by Campus/Modality:
Campus/Modality | Fall 2017: Students Who Completed Assignment at 85% or Better | Percent vs. target | Previous Assessment Results
AL | (15/17) 88% | +3% | 97% of total students passed at 85% (results by campus unavailable)
AN | (16/16) 100% | +15% |
MA | (18/18) 100% | +15% |
LO | (9/10) 90% | +5% |
WO | (7/9) 78% | -7% |
ELI | (5/6) 83% | -2% |

Results by SLO Criteria:
Criteria/Question Topic | Fall 2017: # Students Who Successfully Completed | Percent vs. target | Previous Assessment Score | % of Students vs. target
1. Identify 5 assessment tools | (72/76) 94.7% | +9.7% | 100% | +15%
2. Describe how to implement assessment in classroom | (69/76) 90.8% | +5.8% | 97% | +12%

Current results improved: [ ] Yes [ x ] No [ ] Partially

Previous action(s) to improve SLO: Instructors provided direct classroom instruction on how to create and use assessments. Students shared examples of assessments seen in the field placement. Finally, students completed a teacher-interview assignment on how to assess students beginning in Fall 2017. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: The target was met. Continued work needed on student writing skills as they complete essays. Note: Small class sections negatively impacted data for ELI and WO. Current actions to improve SLO based on the results: Maintain direct instruction. Increase enrollment for ELI and WO. Continue teacher interviews on assessments. For assessment implementation improvements, instructors will more intentionally reference strategies that students observe in the field placement beginning in Fall 2018. Next Assessment: Fall 2019


Strengths by Criterion/Question/Topic: Students are easily able to identify assessments. Weaknesses by Criterion/Question/Topic: Students need more help with assessment implementation.

Differentiated Instruction Students will prepare and compose topics related to practice in a variety of communities; identifying students of differing ages and with culturally diverse and exceptional populations.

Introduction to Teaching as a Profession EDU 200 Direct Measure: As a course requirement, EDU 200 students will complete a final exam essay which will test their understanding of topics related to practice in a variety of communities, identifying students of differing ages and with culturally diverse and exceptional populations. Rubric: 1. 5 guidelines for differentiated instruction 2. 4 approaches to differentiated instruction 3. 1 example of differentiated instruction in the classroom Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL | 1 | 1 | 17
AN | 1 | 1 | 16
MA | 1 | 1 | 18
LO | 1 | 1 | 10
WO | 1 | 1 | 9
ELI | 2 | 1 | 6
DE* | N/A | N/A | N/A
Total | 5 | 5 | 76

*Dual-enrollment

Semester/year data collected: Spring 2018 Target: 85% of students will achieve 85% or greater on assignment Results:

Results by Campus/Modality:
Campus/Modality | Fall 2017: # Students Who Completed Assignment at 85% or Better | Percent vs. target | Spring 2015
AL | 15/17 (88%) | +3% | Results not available by campus; (86/90) 95% of students earned 85% or better
AN | 16/16 (100%) | +15% |
MA | 18/18 (100%) | +15% |
ME | N/A | N/A |
LO | 8/10 (80%) | -5% |
WO | 9/9 (100%) | +15% |
ELI | 5/6 | |
Results by SLO Criteria:

Criteria/Question Topic | Fall 2017: % of Students Who Earned 85% or Better | % vs. target | Spring 2015: % of Students Who Earned 85% or Better | % vs. target
1. 5 Guidelines for differentiated instruction | (66/76) 86.8% | +1.8% | 94% | +9%
2. 4 Approaches to differentiated instruction | (66/76) 86.8% | +1.8% | 100% | +15%
3. 1 Example of differentiated instruction | (67/76) 88.1% | +3.1% | 97% | +12%

Current results improved: [ X ] Yes [ ] No [ ] Partially Strengths by Criterion/Question/Topic: Students do a good job identifying examples of differentiation in the classroom. This means that students are also discussing differentiation with their supervising teachers during field placement.

Previous action(s) to improve SLO: In the past, we had students interview their teachers on differentiated instruction. In class, we have discussed differentiated instruction and provided some OER materials that describe differentiated instruction in more detail beginning in Fall 2018. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Beginning in Spring 2019, we are going to modify this assessment to better align with the objective. The current objective requires students to identify strategies to work with diverse cultures and exceptional students. While differentiated instruction does relate to this, the current essay does not require students to include information regarding diverse cultures or exceptional students. In Spring 2019, students will complete an essay to identify different types of diversity, including language, culture, and specific learning needs. The essay will be graded with a rubric. Current actions to improve SLO based on the results: Students will complete a teacher interview regarding exceptional children in their field placement. Next Assessment: Fall 2019


Weaknesses by Criterion/Question/Topic: Students have more trouble with the guidelines and approaches to differentiated instruction, which scored slightly lower than the classroom-example criterion.

Students will summarize, reflect upon and outline their field experiences in the public school systems during their 40 hour field placement.

Introduction to Teaching as a Profession EDU 200 Direct Measure: As a course requirement, EDU 200 students will complete a 40-hour field placement. To assess this field placement, instructors require that students turn in the following:
- Hours sheet signed by the supervising teacher, proving that the student completed 40 hours of placement.
- Evaluation sheet, signed by the supervising teacher.
- Journal entries for each visit to the classroom, using one of two required formats outlined in the Field Placement Manual.
Rubric: 1. Hour Log (30%) 2. Evaluation Sheet (40%) 3. Journal Entries (30%)

Sample:

Campus/ Modality

# of Total

Sections Offered

# Sections assessed

# Students assessed

AL 17 1 17 AN 1 1 16 MA 1 1 18 LO 1 1 10 WO 1 1 9 ELI 1 1 6 DE* N/A N/A N/A Total 21 5 76

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 85% of students will score 85% or higher overall and on each criterion. Results:

Results by Campus/Modality:
Campus/Modality | Spring 2018: Students Who Completed Assignment at 85% or Better | Percent vs. target | Fall 2016
AN | (17/17) 100% | +15% | Previous assessment results not available by campus: 90% completed hours; 91% completed journal entries; 90% received a positive evaluation from the supervising teacher
MA | (15/15) 100% | +15% |
LO | (12/14) 85.7% | +0.7% |
ELI | (29/32) 90.6% | +5.6% |

Results by SLO Criteria:
Criteria/Question Topic | Spring 2018: Students Who Met Criteria at 85% or Better | % vs. target | Fall 2016 Score | % vs. target
1. Hour Log | 71/76 (93.4%) | +8.4% | 90% | +5%
2. Evaluation Sheet | 71/76 (93.4%) | +8.4% | 90% | +5%
3. Journal Entries | 68/76 (89.4%) | +4.4% | 91% | +6%

Strengths by Criterion/ Question/Topic: Students are completing their 40-hour field placement requirements. Students are following the directions for completing journal entries as outlined in the field placement manual. Weaknesses by Criterion/ Question/Topic: Some supervising teachers report not understanding the components of the field placement on the evaluation.

Previous action(s) to improve SLO: Spot-checks on journals during face-to-face meetings to ensure that students were taking notes and correctly following the template and requirements. This dramatically improved student performance beginning Spring 2016. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: Helping supervising teachers better understand the expectations for field placements. Current actions to improve SLO based on the results: Instructors will write to all supervising teachers and provide a copy of the field placement manual and tips for field experience. Instructors will also stress to students that the student must provide the supervising teacher with the guidelines for field placement beginning Fall 2018. Next Assessment: Spring 2020


Core Learning Outcome | Evaluation Methods | Assessment Results | Use of Results

CLO: Critical Thinking – Philosophy of Education [ X ] CT

Introduction to Teaching as a Profession EDU 200 Direct Measure: To assess students' critical thinking in EDU 200, students compose a 2-3 page philosophy of education. In their philosophies, they must synthesize information from class, the instructional materials, and their field experiences. Rubric: 1. How Students Learn (20%) 2. What Students Should Be Taught (20%) 3. How Students Should Be Taught (20%) 4. The Conditions Under Which Students Learn the Best (20%) 5. Qualities That Make Up a Good Teacher (20%)

Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AN | 1 | 1 | 17
MA | 1 | 1 | 15
LO | 1 | 1 | 14
ELI | 2 | 2 | 32
DE* | N/A | N/A | N/A
Total | 5 | 5 | 78

*Dual-enrollment

Semester/year data collected: Spring 2018 Target: 85% of students will score 85% or higher overall and on each criterion. Results:

Results by Campus/Modality:
Campus/Modality | Spring 2018: Students Who Completed Assignment at 85% or Better | Percent vs. target | Previous Assessment Results
AN | (16/17) 94% | +9% | N/A: This CLO was not previously assessed.
MA | (14/15) 93.3% | +8.3% |
LO | (11/14) 78.6% | -6.4% |
ELI | (29/32) 90.6% | +5.6% |

Results by SLO Criteria:
Criteria/Question Topic | Spring 2018: Students Who Met Criteria at 85% or Better | % of Students vs. target
1. How Students Learn | 66/78 (84.6%) | -0.4%
2. What Students Should Be Taught | 68/78 (87.1%) | +2.1%
3. How Students Should Be Taught | 68/78 (87.1%) | +2.1%
4. The Conditions Under Which Students Learn the Best | 68/78 (87.1%) | +2.1%
5. Qualities That Make Up a Good Teacher | 70/78 (89.7%) | +4.7%

Current results improved: N/A: This SLO has not previously been assessed. Strengths by Criterion/ Question/Topic: Students do a good job describing the qualities that make a good teacher. Instructors spend a lot of class time discussing field experiences and observations of teacher strengths. In addition, students have enough background knowledge to be successful on this section. Weaknesses by Criterion/ Question/Topic: Students need more help describing instructional strategies and classroom environment.

Previous action(s) to improve SLO: Data on this CLO has not been collected in the past. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: How Students Should Be Taught; What Students Should Be Taught; The Conditions Under Which Students Learn the Best. Current actions to improve CLO based on the results: Instructors will work with students on describing instructional strategies and curricula. Instructors will be more explicit in pointing out instructional techniques and components of a quality classroom environment beginning Fall 2018. Next Assessment: Fall 2019


In addition, students need more help identifying and defining skills and curriculum that should be covered at school.

Program Goals | Evaluation Methods | Assessment Results | Use of Results
Program Goal: To expand the Teacher Education Program at Northern Virginia Community College so that classes are available at all main campuses.

*Number of sections available. *Campuses where sections of EDU 200 are offered to students. Information is obtained through the VCCS SIS.

Fall 2016-Spring 2017 Target: Offer at least 1 section of EDU 200 at all 5 campuses and ELI each year. Results for Past 5 Years:
Academic Year | Number of Sections | Locations
Fall 2016-Spring 2017 | 12 sections | 5 campuses & ELI
Fall 2015-Spring 2016 | 14 sections | 5 campuses & ELI
Fall 2014-Spring 2015 | 15 sections | 5 campuses & ELI
Fall 2013-Spring 2014 | 16 sections | 5 campuses & ELI
Fall 2012-Spring 2013 | 17 sections | 5 campuses & ELI

Target Met: [ X ] Yes [ ] No [ ] Partially Comparison to previous assessment(s): We were unable to offer as many sections of EDU 200 in Spring 2017. Enrollment was lower at Alexandria and Woodbridge campuses.

Previous action(s) to improve program goal: This has included designing Teacher Ed flyers. Each semester, the Program Head requests that these posters be placed on the closed-circuit TV systems campus-wide before registration begins. Additionally, we are offering EDU 200 in more creative formats (such as a hybrid format beginning in Spring 2016). Results improved: [ ] Yes [ X ] No [ ] Partially Current action(s) to improve program goal: We are going to do targeted advertising at both Alexandria and Woodbridge. We will distribute Teacher Ed flyers and meet with First Year Advisors. We also expect to see enrollment in EDU 200 increase with the addition of EDU 200 to the AAS Early Childhood degree. We will be sure to add night sections of EDU 200 to meet the needs of working child care providers beginning Spring 2019. Assessed: Annually

To increase the number of students who complete the Teacher Education Program and graduate with the Social Sciences with Teacher Education Specialization Associate's Degree or the General Education with Teacher Specialization Degree.

Number of students that graduate in the program. Information is obtained through the OIR Fact Book.

Target: To increase the number of students who complete the Teacher Education Program and graduate with the Social Sciences with Teacher Education Specialization degree. Results for Past 5 Years:
Academic Year | Number of Graduates | Percentage Change
2016-2017 | 67 | -31%
2015-2016 | 88 | -12%
2014-2015 | 99 | +8%
2013-2014 | 92 | +12%
2012-2013 | 82 | -

Target Met: [ ] Yes [ X ] No [ ] Partially Comparison to previous assessment(s):

Previous action to improve program goal: In Spring 2016, we began meeting with the First Year Advising Teams at the various campuses. We went over our program requirements and the requirements for licensure. We also stressed the appropriate placement of students who would like to become teachers. We encouraged the First Year Advisors to refer students who would like to become teachers to us for further counseling. We will continue to schedule meetings with the first-year advisors at all of the campuses. We may see a decrease in Teacher Education majors as some students select the new AAS degree.



Currently, enrollment is down across the college and in Teacher Education programs throughout the United States. We have seen a decrease in graduates over the last two years.

Results improved: [ ] Yes [ X ] No [ ] Partially Current actions to improve program goal: We will work to advertise our program more and provide information to local high school students and their parents about opportunities for becoming a teacher at NOVA beginning in Fall 2018. We will continue to advertise our program at each campus with flyers and announcements. Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018 Substance Abuse Rehabilitation Counselor, Certificate

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: This curriculum is designed to fulfill the Virginia state educational requirements for the certification of substance abuse counseling assistants. To meet substance abuse counselor assistant certification requirements, the applicant is expected to meet specific education requirements, including didactic and experiential learning with a supervised internship. Individuals desiring skills and knowledge in this career field, but not seeking state certification, may also enroll. Students in this curriculum will participate in at least 3 semester hours of Cooperative Education unless they already have equivalent experience.
Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Students will be able to explain the scientific facts of disease and the effects of psychoactive drugs on the central nervous system.

Effects of Psychoactive Drugs HMS 145 Direct Measure: Written Exam Provided Rubric Criteria or Question Topics: The exam covered how the brain operates at homeostasis (before any substance use), how the brain operates while under the influence of substances (drugs, alcohol, nicotine, etc.), and how the brain operates and functions post drug use. Sample Size (Specify N/A where not offered):

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL only | 1 | 1 | 15
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 15

*Dual-enrollment

Semester/year data collected: Fall 2017 Results:
Results by Campus/Modality:
Campus/Modality | Fall 2017 Average Score
AL only | 81
Results by SLO Criteria:
Criteria/Question Topic | Fall 2017 Average Score | % of Students > target
1. | 81 | 92

Previous action(s) to improve SLO: Provide a rubric for the assignment or a more detailed preview and summary of the exam. Target Met: [ ] Yes [ ] No [ x ] Partially No previous standard was set. Based on recent results, areas needing improvement: The next SLO assessment needs to include a comprehensive exam with multiple choice, fill-in-the-blank, true/false, and short essay questions. Current actions to improve SLO based on the results: Create the comprehensive exam, along with short quizzes leading up to the exam, as well as videos on topics and in-class discussion. All methods will be geared toward preparing for the comprehensive exam in Fall 2019. Next Assessment: Fall 2019

Students will be able to explain and discuss at least 8 of the counseling theories and why counselors use them.

Counseling Psychology HMS 266 Direct Measure: Written Exam Provided Rubric Criteria or Question Topics: Why do counselors use theories when providing counseling services to clients? What is Cognitive Behavior Therapy? What is Behavioral Therapy? What is Rational Emotive Behavioral Therapy? What are the three main categories discussed and reviewed this semester? Sample Size (Specify N/A where not offered):

Semester/year data collected: Fall 2017 Results:
Results by Campus/Modality:
Campus/Modality | Fall 2017 Average Score | Percent > target
AL only | 85 | 80

Previous action(s) to improve SLO: None were made. Target Met: [ x ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: It appears the class size is growing, so to personalize the learning, triads will be used in every class for a portion of the class time: roughly 30 minutes of triads and 30 minutes of class discussion.


Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL only | 1 | 1 | 25
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 25

*Dual-enrollment

Current actions to improve SLO based on the results: The next implementation will include triads and group discussions on each theory in Fall 2019. Next Assessment: Fall 2019

Conduct a site visit of an addiction treatment facility and make a presentation on the findings including levels of care, theoretical basis, modalities of treatment, and the requirements of staff.

Direct Measure: Site Visit Provided Rubric Criteria or Question Topics: What are the requirements for someone to enter your program? What is the philosophy of the program? What educational background and certifications do your staff have? Is there a particular style or modality your agency uses in counseling such as CBT? Sample Size (Specify N/A where not offered):

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
AL only | 1 | 1 | 17
ELI | N/A | N/A | N/A
DE* | N/A | N/A | N/A
Total | 1 | 1 | 17

*Dual-enrollment

Semester/year data collected: Spring 2018

Previous action(s) to improve SLO: The activity was not completed because the students could not find agencies willing to be observed by students in Spring 2018. Target Met: [ ] Yes [ x ] No [ ] Partially Based on recent results, areas needing improvement: The assignment needs to be reworked with respect to the time it requires and its accessibility for students. Current actions to improve SLO based on the results: The activity will be redesigned so that it is appropriate for an 8-week hybrid course and suited to adult learners in Spring 2019. Next Assessment: Spring 2020


Annual Planning and Evaluation Report: 2017-2018 Veterinary Technology, A.A.S.

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce. Program Purpose Statement: The curriculum will prepare the student for a career as a veterinary technician. Satisfactory completion of the curriculum will make the student eligible to take the Veterinary Technician National Examination for certification as a veterinary technician. The curriculum is broad based and includes both practical and theoretical course work which prepares the student for employment in various areas of animal health care, including veterinary hospitals and research and diagnostic laboratories.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Safely and accurately prepare, dispense, administer, and explain use of prescribed medications.

Animal Pharmacology VET 216 Direct Measure: Completion of separate proctored quiz questions in VET 216 (Animal Pharmacology) on tasks for prescribed veterinary medications. See APPENDIX SLO #1 – VET 216 (Animal Pharmacology) for the assessment. The questions used for assessment were in the following categories:
• Topic A: Safely and accurately prepare prescribed medications: 1, 2, 6
• Topic B: Safely and accurately dispense prescribed medications: 3, 4
• Topic C: Safely and accurately administer prescribed medications: 5, 7, 8
• Topic D: Safely and accurately explain use of prescribed medications: 9, 10, 11
Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO | 1 | 1 | 23
ELI | 1 | 1 | 24
DE* | N/A | N/A | N/A
Total | 2 | 2 | 57

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 100% of students will answer 70% or more of each exam question correctly. Results: Data were collected in Fall 2017 from VET 216 EO5L (online cohort) and in Spring 2018 from VET 216 Y01L (on-campus cohort). A total of 11 exam questions were used for this written assessment. The questions were the same for the on-campus and online cohorts.

Question # | % of On-Campus Students Answering Correctly | % of Online Students Answering Correctly
1 | 100% | 96%
2 | 74% | 82%
3 | 100% | 88%
4 | 100% | 96%
5 | 78% | 82%
6 | 100% | 100%
7 | 91% | 96%
8 | 100% | 96%
9 | 86% | 86%
10 | 78% | 100%
11 | 100% | 88%

Average per SLO Subcomponent Topic:
Topic of SLO | % Correct On-Campus | % Correct Online
Topic A | 91% | 93%
Topic B | 100% | 92%
Topic C | 90% | 91%
Topic D | 88% | 91%
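A minimal sketch of how per-question percent-correct values can be rolled up into topic averages like those above is shown below. The data structure is an illustrative placeholder (the on-campus percentages from the table are used as example inputs), not the actual VET 216 exam records.

```python
from statistics import mean

# On-campus per-question percent correct, taken from the table above (illustrative input)
percent_correct = {1: 100, 2: 74, 3: 100, 4: 100, 5: 78, 6: 100,
                   7: 91, 8: 100, 9: 86, 10: 78, 11: 100}

# Topic groupings as listed in the evaluation method above
topics = {"A (prepare)": [1, 2, 6], "B (dispense)": [3, 4],
          "C (administer)": [5, 7, 8], "D (explain)": [9, 10, 11]}

# Average percent correct per topic (matches the reported 91/100/90/88 for on-campus)
for topic, questions in topics.items():
    avg = mean(percent_correct[q] for q in questions)
    print(f"Topic {topic}: {avg:.0f}% correct on average")
```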

This SLO has not been previously assessed. Target Met: [ X ] Yes [ ] No [ ] Partially Based on recent results, areas needing improvement: For the on-campus cohort, the item with the lowest percent score was Question #2, in the "safely and accurately prepare" category. For the online cohort, the subtopics with the lowest average percent correct (82%) occurred with "safely and accurately explain" and "safely and accurately administer." Current actions to improve SLO based on the results: For the on-campus cohort, Topic A category questions will be emphasized more throughout course lessons, which can be accomplished through additional reading assignments and an in-class student demonstration/presentation assignment beginning in Spring 2019. The two subtopics for the online cohort will be addressed through additional reading, video, and possible writing assignments. For more peer-to-peer engagement, a discussion board assignment may be implemented beginning Spring 2019. Next Assessment: Spring 2019 and Spring 2020, when both courses are next offered, with biennial reporting in 2020.



Explain how to perform and expedite triage, emergency and critical care nursing procedures in the implementation of prescribed

Advanced Clinical Practices VET 221 Direct Measure: Completion of separate proctored review quiz regarding animal triage and veterinary emergency/critical care nursing in VET 221 (Advanced Clinical Practices). See APPENDIX SLO #2 – VET 221 (Advanced Clinical Practices) Quiz for the assessment. The questions used for assessment were in the following categories:
• Recognizing normal ECG - #1, #2, #3
• Recognizing abnormal ECG - #4, #6
• Recognizing normal state - #9
• Recognizing abnormal state - #8
• Implementing diagnostics/treatments - #5, #7
Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO | 1 | 1 | 34
ELI | 1 | 1 | 30
DE* | N/A | N/A | N/A
Total | 2 | 2 | 64

*Dual-enrollment

Semester/year data collected: Fall 2017 Target: 100% of students to score 70% or higher on each selected written exam question. Results: Data collected from VET 221 001L (on-campus cohort) and VET 221 EO5L (online cohort). A total of 9 exam questions were used for the assessment. The questions were the same for the on-campus and online cohorts:

Question # | % of On-Campus Students Answering Correctly | % of Online Students Answering Correctly
3 | 91% | 72%
4 | 82% | 83%
5 | 91% | 93%
7 | 73% | 79%
8 | 91% | 79%
9 | 50% | 62%
18 | 91% | 93%
21 | 15% | 60%
99 | 88% | 96%

This SLO has not been previously assessed. Target Met: [ ] Yes [ ] No [ X ] Partially Based on recent results, areas needing improvement: As can be seen from the table, the goal of 70% or more of students answering correctly was met for all questions except #9 and #21. These two questions involve recognizing abnormal states that require intervention. Therefore, the recommendation is to increase training in triage and recognition of emergent and critical states. It is noted that the online cohort performed higher than the on-campus cohort on questions #9 and #21, although it still did not reach the 70% benchmark. There are some slight differences in the instructional approach for the two cohorts that may account for this. The online cohort submits a written assignment on critical care concepts based on independent reading assignments. They also review case scenarios as part of the online instruction, as well as in additional face-to-face meetings on campus. The on-campus cohort receives instruction via PowerPoint lecture and reviews case scenarios on campus in lab exercises. The on-campus students also have the same reading assignment as the online cohort, but do not submit a written assignment. Current actions to improve SLO based on the results: Both cohorts would benefit from additional training in critical care concepts and clinical case scenarios emphasizing the recognition of urgent and


emergent conditions the next time the course is offered during the fall semester. For the on-campus students, this can be accomplished by requiring a formal assignment to reinforce the lecture and reading material beginning in Fall 2018. For both cohorts, additional case discussions and testing will be added to emphasize critical thinking about clinical situations and to identify the features of abnormal conditions. Next Assessment: Fall 2018 and Fall 2019, when both courses are next offered, with biennial reporting in 2020.

Safely and effectively administer and monitor animal patient anesthesia.

Anesthesia of Domestic Animals VET 135
Direct Measure: Completion of separate written exam questions and a laboratory practical exam on anesthesia machine set-up and safety in VET 135 (Anesthesia of Domestic Animals). See APPENDIX SLO #3A and #3B – VET 135 (Anesthesia of Domestic Animals) for assessments.

Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO    | 3   | 3   | 22
ELI   | 3   | 3   | 22
DE*   | N/A | N/A | N/A
Total | 6   | 6   | 44

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 100% of students to score 80% or higher for overall introduced and practiced skills.
Results: Data collected from VET 135 lecture and lab sections for both the on-campus and online cohorts of students.

Anesthesia Equipment Lab Practical Results, Spring 2018 (by campus/modality):
Campus/Modality | Average Score | Percent > 80%
LO  | 9.77 | 97.5%
ELI | 9.95 | 99.5%

Results by Assessment Score Distribution:
Lab Score % Range | # of On-Campus/LO Students | # of Online/ELI Students
96-100 | 21 (100% score) | 20 (100% score)
90-95  | 0               | 2 (95% score)
86-89  | 0               | 0
80-85  | 0               | 0
70-79  | 0               | 0
0-69   | 1 (50% score)   | 0
Total  | 22 students     | 22 students

Results by SLO Lab Practical Item Analysis, Spring 2018:
Question Topic | On-Campus Program Anesthesia Item Analysis (% of Students > 80%) | On-Line Program Anesthesia Item Analysis (% of Students > 80%)
PART A - ID Machine Parts              | 100%  | 100%
PART B - Set-Up of System              | 95.5% | 90.9%
PART C - Describe O2 Flow Thru Circuit | 95.5% | 100%
PART D - Leak Test Circuit             | 95.5% | 100%

Target: 100% of students to score 70% or higher for overall knowledge.
Results:

Anesthesia Monitoring Quiz Results, Spring 2018 (by campus/modality):
Campus/Modality | Average Score | Percent > 70%
LO  | 9.57 | 95.7%
ELI | 9.21 | 92.1%

Results by SLO Quiz Item Analysis, Spring 2018:
Question # | % of On-Campus Students > 70 | % of On-Line Students > 70
1  | 91%  | 100%
2  | 96%  | 86%
3  | 81%  | 91%
4  | 96%  | 100%
5  | 96%  | 73%
6  | 100% | 96%
7  | 100% | 100%
8  | 100% | 100%
9  | 100% | 82%
10 | 100% | 100%

Strengths by Criterion/Question/Topic: Students are assessed in person on realistic skills that an entry-level veterinary technician anesthetist must possess for positive animal patient outcomes. Trouble-shooting begins with equipment knowledge and preparation. The assessment also evaluates students under timed conditions for performance under stress, similar to the veterinary anesthesia/surgical environment.
Weaknesses by Criterion/Question/Topic: The lab practical and selected quiz questions test discrete chunks of knowledge and skills, but not the sequencing and critical thinking that would occur with a clinical veterinary medical case.

This SLO has not been previously assessed.
Target Met: [ ] Yes [ ] No [ X ] Partially
Based on recent results, areas needing improvement: Students in both the on-campus and online program cohorts scored very high (>97%) for hands-on skills, except for one student in the on-campus class who failed 3 of the 4 parts involving proper set-up and leak testing of the anesthesia machine. That student declined individual remediation and subsequently dropped the class. The weakest part for online students was set-up of the anesthesia machine, due to lack of familiarity with the anesthesia equipment models in the campus teaching lab. For the written knowledge questions, the on-campus students scored lowest on Question #3, with only 81% answering correctly; this is a patient monitoring parameter question. In contrast, the online students scored lowest (73%) on Question #5, regarding anatomical

landmarks for proper endotracheal tube placement, and on Question #9, regarding appropriate anesthetic action steps for adjusting patient anesthetic depth. Both cohorts answered correctly on 50% of the questions.
Current actions to improve SLO based on the results: To improve lab practical outcomes, individual out-of-class practice sessions for feedback will be offered prior to the exam, beginning in Spring 2019. Additionally, students will be encouraged to have a classmate video-record them practicing and describing skills techniques to aid with self-awareness and self-correction. For the quiz questions, all topics will be reviewed verbally during in-person lab sessions when the course is next offered in Spring 2019. For Question #5, the correct multiple-choice answer was in close proximity to the incorrect ones; students will be cautioned to take time to read each possible answer carefully before selecting the best one, then to recheck prior to submitting an answer. Attention to written detail is similar to reading and following a veterinarian's written medical orders for an animal patient.
Next assessment of this SLO: Spring 2019 and Spring 2020, when both courses are next offered, with biennial reporting in 2020.

Page 412: Annual Planning and Evaluation Report Instructional Programs … · report; if there is a question about an evaluation method, please contact the instructional program or OIESS)

402

Perform and assist with dental procedures including dental equipment preparation and maintenance.

Animal Dentistry VET 214 Direct Measure: Completion of small animal dental radiographic portfolio with assessment of dental prophylaxis technique and a specific written case study project in VET 214 (Animal Dentistry). See APPENDIX SLO #4A, #4B, #4C, #4D, #4E and #4F – VET 214 (Animal Dentistry) for assessments. For dental radiographic technique evaluation: Students worked in groups of 3 or 4 and set up the positioning for intraoral dental radiographs from a dog or cat skull. The portfolio was comprised of standard views from a predetermined list. The views assigned required the students to apply concepts of dental radiographic positioning in which they had been previously instructed. The student groups were evaluated on their ability to set up correct positioning that would generate an anatomically accurate and diagnostic image, taking into account the effects of cone and sensor alignment. Following the group practice work, each student was then required to individually set up the positioning for a dental radiographic view on demand without knowing ahead of time what the view would be. The view chosen for each student was random. Both the portfolio and individual radiographs were evaluated in the same way. Categories of image evaluation included observing the effects of the student’s positioning technique on the image caused by sensor-cone-subject alignment, vertical angulation of the cone, horizontal angulation of the cone, and if diagnostic criteria were met. Success was determined if an accurate and diagnostic image was achieved on the first try. The number of retakes and reasons for the retakes were recorded. If there were no retakes, then 10 points were awarded. For every view that required any retakes, one point was deducted. A range of scores and an average score were tabulated for both the groups and the individual student. For the group portfolio, the number of

Semester/year data collected: Summer 2018
Target for dental radiographic technique evaluation: 70% of the students or groups should be able to correctly set up equipment for a dental radiograph (complete 100% of the task) on the first try at this stage in the training. Average scores and ranges of scores should be 7 or more.
Results: Data collected for VET 214 (online and on-campus cohorts).
Dental radiographic technique evaluation - VET 214 E20L data (online cohort):
• Group portfolio work – 6 groups of 3 or 4 students
• Portfolio evaluation overall:
  o Average group score 7.5 out of 10
  o Range of 7-8 out of 10

Dental radiography intraoral positioning group portfolio results – group success rate for each view (success on first attempt) and sources of error. Error types recorded: sensor-cone-subject alignment (subject not centered and imaged) and vertical angulation (image is not an accurate length).
• BA (bisecting angle) - Upper incisors, dog: 16% (1 out of 6 groups); alignment and vertical angulation errors
• BA - Upper lateral oblique canine, dog: 50% (3 out of 6 groups); alignment and vertical angulation errors
• BA - Upper lateral premolars, dog: 83% (5 out of 6 groups); one error source recorded
• BA - Upper distoblique P4, dog: 83% (5 out of 6 groups); alignment and vertical angulation errors
• BA - Lower incisors + canine, dog: 83% (5 out of 6 groups)
• Parallel - Lower premolars/molars, dog: 83% (5 out of 6 groups); one error source recorded
• BA - Lower premolars and molars, cat: 0% (0 out of 6 groups); alignment and vertical angulation errors

Dental radiographic technique evaluation - VET 214 0G1L and 0G2L data (on-campus cohort):

This SLO has not been previously assessed. Target Met: [ ] Yes [ ] No [ X ] Partially Online cohort results for dental radiographic techniques: Group portfolio work overall was satisfactory as indicated by the overall scores, although the benchmark of 70% was not met for every view. Positioning for each view was done sequentially in the order as displayed in the table. The first view success rate was 16% and then the second view success rate was 50%. The success rate for the dog improved thereafter to 83% and met the benchmark. As the students work through the portfolio list, the success rate improves with practice. However, the last view was of the cat and no group was successful on the first attempt. Applying the positioning concepts to another species with slightly different anatomy appeared to be more of a challenge than expected. Opportunity for more practice with the cat would be beneficial. For individual performance, the average score was 9.6 and the success rate was 77%, and therefore the benchmark was met. Sources of error were determined for those groups or individuals who were not successful with correct positioning on the first attempt. The most common source of error was caused by incorrect sensor-cone-subject alignment, with incorrect vertical angulation of the cone second. Both types of errors indicate that students need additional instruction in projection


groups successful on the first try for each radiographic view was tabulated. The number of students who, on an individual basis, were successful on the first try was also tabulated. The rubrics for the portfolio and the individual radiograph are attached separately.

For dental prophylaxis technique: After training in the use of hand and power instruments for cleaning the teeth of dogs and cats, each student's skill set for dental cleaning technique using the instruments was evaluated individually. The dental cleaning technique was performed on prepared artificial dental models, a canine typodont with simulated plaque and calculus. Students demonstrated techniques with the following instruments: calculus removal forceps, power scaler, sickle scaler, curette, probe, explorer, power polisher, and air-water syringe. Along with skill, students were also evaluated on knowledge of the routine order of operations. Observations about ergonomic habits were also noted to give students feedback on preventing repetitive/cumulative strain disorders, but this line item did not affect the scores for the evaluation. Feedback was also given regarding the student's ability to work efficiently, critique his/her own work effectively, and keep the instrument tray organized, but this line item likewise did not affect the scores for the evaluation. Criteria for proper instrumentation technique and order of operations were established, and a grading checklist was developed for each task in the cleaning technique. The total possible number of points for this evaluation was set at 10. If a task was completed correctly with no coaching needed, a point was given; if coaching was required to correct the student's technique, there was a deduction. The number of students who were able to perform each technique without coaching was also tabulated.

For the dog and cat case study: The students' ability to recognize abnormal dental and oral conditions and describe the conditions properly

• Group portfolio work – 5 groups total; 1 group of 5 students, 3 groups of 4 students and 1 group of 3 students

• Portfolio evaluation overall:
  o Average group score 7.4 out of 10
  o Range of 5-9 out of 10

Target for dental radiographic intraoral positioning: 70% of the students or groups should be able to correctly position the patient and equipment for a dental radiograph (complete 100% of the task) on the first try at this stage in the training. Average scores and ranges of scores should be 7 or more.
Dental radiography intraoral positioning group portfolio results – group success rate for each view (success on first attempt) and sources of error. Error types recorded: sensor-cone-subject alignment (subject not centered and imaged), vertical angulation (image is not an accurate length), and horizontally inadequate adjustment for superimposition.
• BA (bisecting angle) - Upper incisors, dog: 60% (3 out of 5 groups); one error source recorded
• BA - Upper lateral oblique canine, dog: 80% (4 out of 5 groups); one error source recorded
• BA - Upper lateral premolars, dog: 100% (5 out of 5 groups)
• BA - Upper distoblique P4, dog: 60% (3 out of 5 groups); two error sources recorded
• BA - Lower incisors + canine, dog: 60% (3 out of 5 groups); two error sources recorded
• Parallel - Lower premolars/molars, dog: 100% (4 out of 4 groups; one group of 4 students forgot to take this view, so only 4 groups, totaling 16 students, completed this radiograph)

geometry and bisecting angle concepts. On-campus cohort results: The on-campus cohort did the radiographs in the order listed in the table. The first view's success rate was 60%. The second and third views met the benchmark with 80% and 100%, respectively. The fourth and fifth success rates were both 60%, not meeting the benchmark. The parallel technique (the 6th view) had a success rate of 100%; in general, the parallel technique is a simpler concept to understand and there are fewer variables in positioning technique. The last view, for the cat, had a 0% success rate on the first attempt. The cat view appeared to be the most difficult for all the groups, and devoting more time to techniques for the cat would be beneficial. For individual performance, the average score was 9.1 and the success rate was 70%, and therefore the benchmark was met. Sources of error were determined for those groups or individuals who were not successful with correct positioning on the first attempt. The most common source of error was incorrect sensor-cone-subject alignment, with incorrect vertical angulation of the cone second. Both types of errors indicate that students need additional instruction in projection geometry and bisecting angle concepts. Comparison of online and on-campus cohort results: The online portfolio (group) met the benchmark for 4 out of 7 views. The on-campus portfolio (group) met the benchmark


was evaluated using case photos on an exam. One photograph of a dog case and one photograph of a cat case were used. Both cases exhibited more than 4 abnormal conditions in the photo. Students were asked to identify at least 4 abnormal conditions for each case and describe them using the appropriate dental or medical terms along with correct dental notation. Full points (4 out of 4) were given for correctly identifying and describing 4 conditions for each photo. If a condition that did not exist in the photo was described, this was noted as an incorrect answer. Missing one of the 4 items was not considered an incorrect identification; a student in that case received less than 4 out of 4 as a score. The students' essays were also evaluated for spelling and use of slang or informal language. If a student used a term reserved for diagnosis, instead of or in addition to describing the dental or medical criteria, or recommended treatments, this was also noted.

Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO    | 2   | 2   | 20
ELI   | 1   | 1   | 22
DE*   | N/A | N/A | N/A
Total | 3   | 3   | 42

*Dual-enrollment

(On-campus cohort portfolio results, continued)
• BA - Lower premolars and molars, cat: 0% (0 out of 5 groups); one error source recorded

Individual dental radiograph intraoral positioning results, VET 214 E20L (online cohort) - individual success rate on first attempt:
• Average score 9.6 points out of 10
• Range 7-10 points out of 10
• Success rate: 77% (17 out of 22) of students were individually successful on the first attempt; the remaining 5 students were successful on subsequent attempts
• Source of error for the 5 students: horizontal angulation error - 1 out of 5; sensor-cone-subject alignment error - 4 out of 5

Individual dental radiograph intraoral positioning results, VET 214 0G1L and 0G2L (on-campus cohort) - individual success rate on first attempt:
• Average score 9.1 points out of 10
• Range 7-10 points out of 10
• Success rate: 70% (14 out of 20) of students were individually successful on the first attempt; the remaining 6 students were successful on subsequent attempts
• Source of error for the 6 students: vertical angulation error - 3 out of 6; sensor-cone-subject alignment error - 3 out of 6

Summary of individual dental radiograph positioning results:

Metric | Online | On-campus
Average score (out of 10 points) | 9.6 | 9.1
Score range (out of 10 points)   | 7-10 | 7-10
Success rate on first attempt    | 77% (17 out of 22) | 70% (14 out of 20)

Source of error, cause of retake | Online student errors | On-campus student errors
Sensor-cone-subject alignment | 80% (4 out of 5) | 50% (3 out of 6)
Vertical angulation           | -                | 50% (3 out of 6)
Horizontal angulation         | 20% (1 out of 5) | -
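The 10-point radiograph scores and first-attempt success rates reported above follow the deduction rule described under the Direct Measure: 10 possible points, with one point deducted for every view that required any retakes. Below is a minimal Python sketch of that arithmetic, using hypothetical retake counts rather than the assessed groups' data.

    def portfolio_score(retakes_per_view):
        # One point is deducted for each assigned view that needed any retakes.
        deductions = sum(1 for retakes in retakes_per_view if retakes > 0)
        return 10 - deductions

    def first_attempt_rate(retakes_per_view):
        # Share of views captured as accurate, diagnostic images on the first try.
        successes = sum(1 for retakes in retakes_per_view if retakes == 0)
        return successes / len(retakes_per_view)

    group_retakes = [0, 2, 0, 1, 0, 0, 3]                  # hypothetical: 7 assigned views
    print(portfolio_score(group_retakes))                   # 7 (three views required retakes)
    print(round(first_attempt_rate(group_retakes) * 100))   # 57 (% of views on first try)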

for 3 out of 7 views. One of the on-campus groups (4 students) had a more difficult time with the positioning. They scored a 5 out of 10. This group did not have any experience with dental radiographs and appeared to get frustrated. Offering additional practice for these students would be beneficial. Requiring the students to take turns leading the radiographs would also be helpful. The students tended to rely on one individual to position most of the views, which makes it less likely for the other students to develop the skill. The individual radiograph positioning results indicated similar scores for the online cohort and the on-campus cohort. Recommendations - Online course: Opportunity to practice several radiographic views beginning in Summer 2019 should improve introductory student success. More time to practice in general and practicing with the cat would improve performance. This will require more lab time which could be accomplished for the online students by providing more lab visits with smaller lab groups, and offering extra lab visits for those students needing tutoring. Starting in Summer 2019 when the course is offered next, additional instruction in projection geometry and bisecting angle could be accomplished by adding another assignment or series of assignments to the online lessons that require the student to evaluate positioning photos and radiographic images and to determine troubleshooting actions. A non-multiple choice quiz format is recommended for these additional assignments so that each individual



Target for dental prophylaxis technique: 70% of the students or groups should be able to correctly perform small-animal teeth cleaning skills (complete 100% of the task) on the first try at this stage in the training. Average scores and ranges of scores should be 7 or more.
Dental prophylaxis technique score results:
Metric | Online | On-campus
Average score (out of 10 points) | 8.4 | 8.4
Score range (out of 10 points)   | 6.5-10 | 6-10

Dental Cleaning/Prophylaxis grading checklist results, both online and on-campus cohorts:
Task | % of online students (out of 22) successful without coaching | % of on-campus students (out of 20) successful without coaching
Preop equipment check | 100% (22 out of 22) | 100% (20 out of 20)
Operator prep - PPE | 100% (22 out of 22) | 100% (20 out of 20)
Patient prep - position, oral exam, antiseptic rinse | 100% (22 out of 22) | 100% (20 out of 20)
Calculus removal forceps - technique | 73% (16 out of 22) | 100% (20 out of 20)
Power scaling - technique | 77% (17 out of 22) | 90% (18 out of 20)
Power scaling - travel sequence | 91% (20 out of 22) | 95% (19 out of 20)
Hand scaler - technique | 32% (7 out of 22) | 80% (16 out of 20)
Curette - technique | 82% (18 out of 22) | 80% (16 out of 20)
Curette - travel sequence | 50% (11 out of 22) | 95% (19 out of 20)
Power scaling periodontal technique | 82% (18 out of 22) | 90% (18 out of 20)
Oral exam probe - technique | 91% (20 out of 22) | 95% (19 out of 20)
Probe - travel sequence | 100% (22 out of 22) | 100% (20 out of 20)
Oral exam explorer - technique for caries (occlusal crevice) and technique for other surfaces (CEJ) | 77% (17 out of 22) | 75% (15 out of 20)
(checklist continues below)

student is an active independent participant. Recommendations – On-campus course: More opportunities to practice positioning, especially in cats would be beneficial to improve success rates. Additional lab time would be helpful and offer more practice opportunities. The summer session is 12 weeks. The Fall and Spring semesters are 16 weeks. Having this class during the Fall or Spring may give the students more lab time to work on their technique. Program resequencing of the Summer course offering will be proposed to the department and academic deans during Fall 2018. Having more than one employee with a dosimeter badge (to physically generate the radiographs) would be beneficial. This would allow the instructor to be available in the lab to continue practicing the techniques with the students while there is a group taking radiographs. Providing students additional assignments beginning in Summer 2019 on radiographic techniques and providing opportunities to troubleshoot would also improve technique. Prophylaxis technique: A benchmark of 70% was chosen, meaning that 70% of the students should be able to perform each technique skillfully enough and show knowledge of the order of operations without coaching. A benchmark score of 7 out of 10 points for each individual student was chosen for the overall performance of the dental cleaning technique. Online cohort results: Average scores for the online cohort indicate


(Dental Cleaning/Prophylaxis grading checklist results, continued)
Task | % of online students (out of 22) successful without coaching | % of on-campus students (out of 20) successful without coaching
Explorer - travel sequence | 77% (17 out of 22) | 100% (20 out of 20)
Polishing - technique | 91% (20 out of 22) | 95% (19 out of 20)
Polishing - travel sequence | 95% (21 out of 22) | 100% (20 out of 20)
Rinsing - technique | 86% (19 out of 22) | 90% (18 out of 20)
Order of operations | 77% (17 out of 22) | 65% (13 out of 20)
Ergonomics - modified pen grasp, wrist position | 50% (11 out of 22 with correct modified pen grasp) | 85% (17 out of 20 with correct modified pen grasp)
Critiques own work effectively - makes corrections if needed, organized, efficient | 73% (16 out of 22) | 75% (15 out of 20)

Target for dentistry case study: 70% of the students or groups should be able to correctly identify abnormal conditions using properly spelled dental and medical terminology and create complete written documentation in the form of an oral dental chart with appropriate written notations (complete 100% of the task) on the first try at this stage in the training. Average scores and ranges of scores should be 7 or more.
Dentistry case study results for both online and on-campus cohorts:
Criteria | Online students (% out of 22) | On-campus students (% out of 20)
Correct identification and description of 4/4 abnormal conditions for dog case | 68% (15 out of 22) | 80% (16 out of 20)
Correct identification and description of 4/4 abnormal conditions for cat case | 59% (13 out of 22) | 85% (17 out of 20)
Incorrect identification made, either dog or cat case | 18% (4 out of 22) | 55% (11 out of 20)
(table continues below)

that the benchmark was met, but the range indicates that some students did not meet the benchmark. There were 2 students who scored less than 7 because excessive coaching was required for most tasks. Otherwise, most students appeared prepared and required only some coaching to refine their technique, or no coaching was required at all. The data showing which tasks required coaching and what percent of students required that coaching are in the table called "Dental Cleaning/Prophylaxis grading checklist results." The benchmark was met for all graded tasks except the hand scaler technique and the travel sequence for the curette. Half of the students needed coaching on ergonomics and the modified pen grasp. More specific instruction emphasizing those skills that did not meet the benchmark and more practice time with these techniques would be beneficial in improving student preparation and performance. On-campus cohort results: Average scores for the on-campus cohort met the benchmark. One student did not meet the benchmark because of the excess prompts needed. The majority of the students needed fewer than 3 prompts during the dental prophylaxis exercise. Dental Cleaning/Prophylaxis grading checklist results: The benchmark was met for all graded tasks except the order of operations. Comparison of online and on-campus results: The on-campus cohort met all benchmarks except 1 skill. The online cohort met all


(Dentistry case study results, continued)
Criteria | Online students (% out of 22) | On-campus students (% out of 20)
Correct dental terms, dog case (no misspellings or informal language) | 73% (16 out of 22) | 60% (12 out of 20)
Correct dental terms, cat case (no misspellings or informal language) | 77% (17 out of 22) | 75% (15 out of 20)
Correct dental notation, dog | 82% (18 out of 22) | 70% (14 out of 20)
Correct dental notation, cat | 68% (15 out of 22) | 65% (13 out of 20)
Inappropriate comments - diagnosis stated or treatment recommended | 54% (12 out of 22) | 65% (13 out of 20)

benchmarks except for 3 skills. The on-campus and online average scores were exactly the same at 8.4, with a similar score range. Recommendations - Online course: Opportunity to practice dental cleaning technique with the full complement of instruments improves success. More time to practice in general and under supervision would improve performance and ergonomic awareness. This would require more lab time which could be accomplished for the online students by providing more lab visits with smaller lab groups, and offering extra lab visits for those students needing tutoring. Future instruction should provide more emphasis on the hand scaler and travel sequence of the curette. Additional instruction when the course is next offered to prepare students as to what is expected for the techniques will be accomplished by the addition of video demonstrations to enhance the online lesson material beginning in Summer 2019. Recommendations – On-campus course: More lab time for students to practice the order of operations would be beneficial to the students and implemented starting in Summer 2019. Occasional quizzes for the dental prophylaxis sequence would help students learn through repetition. The students requested an Open Lab prior to the final exam. One of the stations was a verbal review of the dental prophylaxis. A hands-on, more interactive prophylaxis station during the open lab would increase the student’s comfort level and allow them to gain more experience to develop a routine. There were several


mistakes made with hand instruments (modified pen grasp, using the incorrect side of the curette, incorrect use of the hand scaler, explorer and probe). More hand instrument instruction and practice time would help the students learn proper techniques and reduce their errors. This can be implemented in Summer 2019.

Case Study: A benchmark of 70% was chosen, meaning that 70% of the students should be able to correctly identify 4 out of 4 abnormal conditions using correct terminology, spelling, and notation. 70% of the students should also describe the conditions in a way that is descriptive and not use the name of a diagnosis as part of, or in place of, the description. The percent of students making an incorrect identification or an inappropriate comment by including a diagnosis statement or recommending treatments with the description should be 30% or less at this stage of the training. (Said another way, 70% of the students should not make an incorrect identification, and 70% of the students should not make inappropriate statements regarding diagnosis or treatment.) Online cohort results: The results are tabulated in the table called "Case Study results" and indicate that some benchmarks were met and some were not. For full and correct identification, the benchmarks were not met at 68% (15 out of 22 students correct) for the dog, and 59% (13 out of 22 students correct) for the cat. Incorrect identification, or an actual mistake, was not common but did occur with 4 students at 18%. However, the benchmark was met


for the incorrect identification category as the incidence was below 30%. The benchmark was met for the correct use of dental/medical terms at 73% for the dog and 77% for the cat. The benchmark was met for the correct use of dental notation for the dog at 82% but not for the cat at 68%. This is likely because the cat has different anatomy than the dog. The familiar notation system of the dog, which is learned first because the dog has more teeth, must be adapted to fit the cat anatomy, which can be challenging. Regarding inappropriate comments in which the student named a diagnosis or recommended a treatment, this occurred frequently at 54% (12 out of 22 students). Therefore, the benchmark for this category was not met, as the expectation was that this should be less than 30%. This may be because the student did not completely understand the intent of the exam question. The intent was for the student to merely describe the abnormal conditions. The concern here is that the student may not understand the difference between a clinical description that leads to a diagnosis, versus naming a diagnosis. Identifying or describing the clinical criteria for a diagnosis in a report is within the authority of the veterinary technician, but actually declaring a diagnosis is not. The student may also not recognize the difference between knowing about treatments and recommending treatments. Knowing what the common treatments are and how to assist in providing the treatment is a responsibility of the veterinary technician, whereas prescribing


treatments is not. This finding of 54% of the students making inappropriate statements may only reflect a difference in semantics from the students' perspective, or the students were attempting to be very thorough. But it is important to clarify this issue for students so that they avoid overstepping boundaries regarding diagnosis and prescribing. On-campus cohort results: The benchmarks were met for correct identification and description of abnormal conditions for the dog (80%) and the cat (85%). Incorrect identification, or an actual mistake, occurred with 11 students at 55%. The benchmark was not met for the incorrect identification category. The benchmark was not met for the correct use of dental/medical terms at 60% for the dog. However, the benchmark for the cat was met at 75%. The benchmark was not met for the correct use of dental notation for the dog at 60%. The benchmark for the cat was not met at 65%. Correct dental notation was problematic on the midterm and final exam as well. The most common mistake is misidentifying the carnassial teeth, designated as the 4th premolar in the maxilla and the 1st molar in the mandible. Students often confuse these with both the modified Triadan System and the anatomical names. Regarding inappropriate comments in which the student named a diagnosis or recommended a treatment, this occurred with 13 of 20 students (65%). Therefore, the benchmark for this category was not met, as the expectation was that this should be less than 30%. Eight students declared a diagnosis,


which is outside the limits of an LVT. In total, 65% of students made inappropriate comments (a stated diagnosis and/or prescribed treatments). It is important that the students understand the difference between reporting clinical findings and making a diagnosis. Comparison of online and on-campus cohort results: The online cohort met the benchmark for 4 out of 8 criteria for the case study results. The on-campus cohort met the benchmark for 4 out of 8 criteria for the case study results. Recommendations - Online course: To improve the percent of online students giving the correct identification of abnormal conditions, it is recommended to add more instruction with case reviews of the dog and cat beginning in Summer 2019. This could be done by developing a photographic collection of cases for students to study and practice their clinical recognition and writing skills on a regular basis. Cats should be emphasized more than is currently done, so that students can become more accurate at using the dental notation for the cat. The case reviews can be incorporated into the weekly online lessons as an assignment. Guidance on how to write up clinical observations properly and formally, without inadvertently making a diagnosis or prescribing a treatment, would be included with the case reviews assignment. Instructor feedback on the case reviews can also be used to help students develop discernment about what kind of statements are appropriate for a veterinary technician to make about a case. Because the VET 214


course is taught in the shorter 12-week Summer semester, time is very limited to make full use of this recommendation for the students' long-term benefit. Adding more dental case reviews in the dental review section of the more advanced dentistry component of VET 221 (taught in Fall 2018) might be helpful. The alternative would be to recommend that the online VET 214 course be moved to a 16-week semester as part of overall curriculum resequencing. To further decrease the use of slang or informal language, assignments can be added to the weekly online lessons in which the student is asked to convert an informal description with slang terms into formal medical terms that would be more appropriate for a medical record. If spelling is noted to be an issue for a cohort, spelling quizzes requiring handwritten answers can be given, as needed, that cover the glossary from the weekly online lesson. The quizzes can be given during the campus lab visits or at testing centers to encourage more attention to accurate spelling beginning in Summer 2019. Finally, the exam question for the photos can be clarified to avoid misunderstanding and ensure the student understands the purpose of the question, meaning that the objective is for the student to only describe and identify the clinical features and location of a condition, not to suggest a diagnosis or treatment. Recommendations – On-campus: The benchmark for 4 out of 8 categories was met for the case study results. To improve skills,


case studies should be incorporated more frequently throughout the semester to ensure students become familiar with normal and abnormal conditions, correct language and comments, and correct dental notation. More dental worksheets focusing on dental notation in the dog and cat should be assigned. Quizzes on the modified Triadan system and the anatomical system should be administered throughout the session. More lab time starting in Summer 2019 should be helpful for the students to study the model cat and dog mouths, review case studies, and identify abnormalities. The summer session was a 12-week semester. It may be more beneficial for this class to be a full 16-week semester.
Next Assessment: Summer 2019 and Summer 2020, when both courses are next offered, with biennial reporting in 2020.
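The case-study scoring above applies two separate benchmarks to each cohort: at least 70% of students fully identifying and describing 4 of 4 conditions, and no more than 30% including a diagnosis or treatment recommendation. The following Python function is only an illustrative sketch of that check; the example tallies are the online cohort's dog-case figures reported above.

    def case_study_benchmarks(n_students, n_full_credit, n_inappropriate):
        correct_rate = n_full_credit / n_students
        inappropriate_rate = n_inappropriate / n_students
        return {
            "correct-identification benchmark met (>= 70%)": correct_rate >= 0.70,
            "inappropriate-comment benchmark met (<= 30%)": inappropriate_rate <= 0.30,
            "rates (%)": (round(correct_rate * 100), round(inappropriate_rate * 100)),
        }

    # Online cohort, dog case: 22 students, 15 with full credit, 12 with a diagnosis/treatment comment
    print(case_study_benchmarks(22, 15, 12))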

Core Learning Outcome Evaluation Methods Assessment Results Use of Results

CLO: Explain animal patient assessment, nursing procedures, and the implementation of prescribed diagnostics and treatments, including basic animal care or husbandry. [ X ] CT

Intro to Veterinary Technology VET 105
Direct Measure: Completion of written veterinary medical case-based patient scenarios including various decision-making components for animal assessment and nursing procedures in VET 105 (Intro to Veterinary Technology). See APPENDIX for CLO - VET 105 (Intro to Veterinary Technology) assessment.

Sample:

Campus/Modality | # of Total Sections Offered | # Sections Assessed | # Students Assessed
LO (OC)**  | 1   | 1   | 25
ELI (OL)** | 1   | 1   | 20
DE*        | N/A | N/A | N/A
Total      | 2   | 2   | 45

*Dual-enrollment

Semester/year data collected: Fall 2017
Target: 100% of students will score 70% or higher on each scenario topic question.
Results by Enrollment – OC and OL Cohorts:
Campus/Modality | Fall 2017 Average Score | Percent > target
LO    | 3.81 of 4 points (95%)  | 100% > target
ELI   | 41.5 of 50 points (83%) | 100% > target
Total | 89%                     | 100%

Results by CLO Criteria – OL Cohort (results by toxicity question topics, Fall 2017):
Question Topic | Average Score | % of Students > target
Q#1 - Plants    | 16.5 of 20 points (82.5%) | 58% > target
Q#2 - Chocolate | 7 of 9 points (77.7%)     | 81% > target
(table continues below)

This CLO has not been previously assessed.
Target Met - OC Cohort: [ X ] Yes [ ] No [ ] Partially
Target Met - OL Cohort: [ ] Yes [ ] No [ X ] Partially
Based on recent results, areas needing improvement: OC students need to differentiate between clinical signs and treatment protocols for varying classes of rodenticides. OL students need the most improvement with calculating the toxicity levels of xylitol and chocolate, along with the anticipated diagnostic tests, treatments, and nursing interventions


**Separate Veterinary Technology Program cohorts:
• On-Campus (full-time) = OC
• Online (part-time) = OL

(Results by toxicity question topics, continued)
Q#3 – Xylitol     | 6 of 9 points (66.7%)  | 44% > target
Q#4 – Rodenticide | 12 of 12 points (100%) | 100% > target
Total             | 81.7%                  | 70.75%

Current results improved: N/A; this CLO has not previously been assessed.
Strengths by Question Topics:
• Triage of small animal patients, including telephone triage (pre-arrival to facility)
• Case-based, with a patient history-taking component to simulate real-world client interactions and communications
• Calculating toxic dosages and applying them to particular patients to determine whether a life-threatening amount of toxic material has been ingested
• Use of credible Internet resources provided to students
Weaknesses by Question Topics:
• Plant Toxicity Questions – missing explanation of abnormal physical assessment findings as relates to applied anatomy
• Chocolate Toxicity Questions – lacking explanation of patient monitoring parameters and basic animal care
• Xylitol Toxicity Questions – lacking order of priority for nursing care and treatment explanation
• Rodenticide Questions – emphasizes memorization vs. application of rodenticide information

to prepare for based on evidence-based protocols of care. They also need to improve upon describing specific client triage advice prior to and after the client/owner's arrival at the veterinary medical facility. Both OC and OL students need to apply knowledge regarding species anatomy based on physical exam palpation and veterinary nursing assessment.
Current actions to improve CLO based on the results: Add a written assignment or quizzing specifically regarding the varying classes of rodenticides, along with practice toxicity calculation problem sets. Beginning in Fall 2018, review veterinary anatomy in class for the OC cohort and during campus lab visits for the OL cohort, with class exercises on how this applies to patient illness and disease as well as proper animal patient restraint, handling, and implementation of nursing procedures based on patient priority needs.
Next Assessment: Fall 2018 and Fall 2019, when both courses are next offered, with biennial reporting in 2020.

Program Goals Evaluation Methods Assessment Results Use of Results

To maintain and increase the number of qualified Veterinary Technology graduates

Data obtained from OIR Factbook 2013 to 2017 for Veterinary Technology:
• Number of NOVA Graduates by Degree and Specialization 2017-2018
• College Graduates by Curriculum and Award Type, Fall 2013 through Fall 2017

Target: Produce 30 Veterinary Technology graduates per year, with an increase of 25%.
Results for Past 6 Years*:
Academic Year | Number of Graduates | Percentage Increased
2012-2013 | 40 | 33%
2013-2014 | 49 | 63%
2014-2015 | 48 | 60%
2015-2016 | 52 | 73%
2016-2017 | 46 | 53%
2017-2018 | 60 | 100%

*Data includes both On-Campus and Online program graduates.
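The "Percentage Increased" column appears to be computed against the 30-graduate annual target (for example, 60 graduates in 2017-2018 is double the target, a 100% increase). A small Python check of that arithmetic, under that assumed interpretation:

    TARGET = 30  # annual graduate target stated above
    graduates = {"2012-2013": 40, "2013-2014": 49, "2014-2015": 48,
                 "2015-2016": 52, "2016-2017": 46, "2017-2018": 60}
    for year, count in graduates.items():
        pct_increase = round(100 * (count - TARGET) / TARGET)
        print(f"{year}: {count} graduates, {pct_increase}% above the target of {TARGET}")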

There is a regional and national workforce need for LVTs that remains unmet, as verified by the various veterinary medical industry members of the program curriculum advisory board.
Previous action to improve program goal: Increased on-campus enrollment capacity by more than doubling the veterinary medical laboratory space with the Phase III Loudoun campus building expansion project completed in


Target Met: [X] Yes [ ] No [ ] Partially
Comparison to previous assessment(s): The number of program graduates continues to exceed the annual goal, reaching double the target (a 100% increase for the 2017-2018 year). Over the past 6 years, both the On-Campus and Online Veterinary Technology programs have retained a similar number of students, who are highly motivated to complete the A.A.S. degree because it meets part of the requirement to obtain state licensure as a veterinary technician (LVT) in Virginia and nationally.

January 2012. Strengthen the applicant pool for both programs by targeting outreach to working non-credentialed staff already employed in veterinary practices, various animal-related volunteers, self-identified NOVA pre-veterinary students, and current veterinary mentors/employers of program students and graduates. Military veterans are also advised to seek the degree to become LVTs.
Most recent results: Continued increase in program applicants with previous college credits and degrees, from >60% to >85%. The completion of all or most general education courses prior to program acceptance decreases student tuition cost and reduces the academic load toward degree completion. Approximately 20% more veterinary employers are contacting the program to post job positions for both students and LVT graduates.
Results improved: [ X ] Yes [ ] No [ ] Partially
Current actions to improve program goal: Conducted two successful veterinary career and job fairs (March and October 2018) with an average of 25 veterinary employers from the northern Virginia region; for the first time, these were opened to both program and pre-veterinary technology students. Students from the newly NAVTA-accredited Veterinary Assistant certification program through NOVA Workforce also attended, and the plan is to open the fair to all NOVA students interested in exploring the career.
Assessed: Annually


To maintain retention of selectively program-placed Veterinary Technology A.A.S. degree program students (College Goal)

Data obtained from OIR Factbook 2013 to 2017 for Veterinary Technology:
• Distribution of Program-Placed Students by Curriculum and Award Type: 2013-2017

Target: 60% or greater program-placed student retention rate.
Results for past 6 years:
Fall Semester Year | Total VET Program-Placed Students
2018 (prelim) | 129*
2017 | 152
2016 | 150
2015 | 156
2014 | 144
2013 | 152
*Preliminary data based on Program course rosters at the College census date in September 2018.
Target Met: [ X ] Yes [ ] No [ ] Partially
Comparison to previous assessment: The total number of program-placed veterinary technology students has decreased 15%, from 152 students in Fall 2017 to an estimated 129 in Fall 2018. This is similar to the trend of declining enrollment college-wide. The previous 5-year trend was flat.

Previous action(s) to improve program goal: Use selective program admissions and re-admissions process criteria for program re-joiners to improve retention rates by establishing a wait-list of students to potentially fill program seats unexpectedly vacated by students dropping from the program. However, fewer students in the applicant pool qualified for the wait-list due to academic reasons (less than a 2.0 overall GPA, a pattern of college course withdrawals, F's, and academic warnings). Additionally, the On-Campus program cohort size is limited by physical laboratory space and live animal teaching resources.
Most recent results: The overall applicant pool for both the On-Campus and Online programs in Spring 2018 for Fall 2018 admissions was noticeably decreased. Several program-placed students were lost after acceptance but prior to Fall enrollment due to a combination of personal, health, financial, and commuting distance issues.
Results improved: [ ] Yes [ X ] No [ ] Partially
Current action(s) to improve program goal: Shift the Program application window from January-March to March-June beginning Spring 2019 to permit potential applicants time to complete the required prerequisite courses in the Spring semester so that they officially appear on the college transcript and qualify for selective admissions. This also aids students transferring in-progress Spring credits from an external institution. The program is offering monthly small-group pre-veterinary technology advising sessions and encouraging


all potential applicants to attend via the website contact form, program faculty advisors, and campus general counseling offices.
Assessed: Annually

To maintain retention of Veterinary Technology Program students based on national accreditation standards of the Committee on Veterinary Technician Education and Activities (CVTEA) of the American Veterinary Medical Association (Program Goal)

Data collected by the Program from all VET course rosters at the beginning of each academic year after the College census date (September). Additional data obtained from:
• Distribution of Program-Placed Students by Curriculum and Award Type: 2013-2017

Target: 57% or greater retention rate (the current average retention rate for the 210 AVMA-accredited veterinary technology programs).
Results for On-Campus 2-year Program (Full-Time):
Fall Admitted | 1st-Yr Students at Start | Remain for 2nd Yr | Retention Rate | Re-joiners
2017 | 32 | 21 (Fall 2018) | 66% | 0
2016 | 44 | 34 (Fall 2017) | 77% | 0
2015 | 46 | 29 (Fall 2016) | 63% | 0
2014 | 48 | 35 (Fall 2015) | 73% | 0
2013 | 48 | 28 (Fall 2014) | 58% | 1

Target Met: [X] Yes [ ] No [ ] Partially
Results for Online 3-year Program (Part-Time):
Starting Program Yr | Following Program Yr | Retention Rate | Re-joiners
1st Yr Online, Fall 2017 = 27 | 2nd Yr Online, Fall 2018 = 24 | 89% | 5
2nd Yr Online, Fall 2017 = 26 | 3rd Yr Online, Fall 2018 = 24 | 92% | 1
1st Yr Online, Fall 2016 = 28 | 2nd Yr Online, Fall 2017 = 26 | 93% | 0
2nd Yr Online, Fall 2016 = 32 | 3rd Yr Online, Fall 2017 = 30 | 92% | 0

Target Met: [X] Yes [ ] No [ ] Partially
Comparison to previous assessment(s): The 2017 to 2018 program retention rates range from 66% to 92% across the On-Campus and Online programs, which exceed the national average of 57% for veterinary technology programs nationwide (currently 204 AVMA-accredited programs). This is similar to the 2016 to 2017 retention rate range of 77% to 94%.
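The retention rates in the tables above follow directly from dividing the students remaining in the next program year by the starting cohort and comparing against the 57% national average. A brief, illustrative Python check using the Fall 2017 on-campus cohort figures reported above:

    NATIONAL_AVERAGE = 0.57  # average retention for AVMA-accredited programs, per the target

    def retention_rate(cohort_start, remaining_next_year):
        return remaining_next_year / cohort_start

    rate = retention_rate(32, 21)   # Fall 2017 on-campus cohort: 32 admitted, 21 remained
    print(f"{rate:.0%} retention; meets or exceeds national average: {rate >= NATIONAL_AVERAGE}")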

Previous action(s) to improve program goal: Encouragement of On-Campus pre-veterinary and 1st-semester veterinary technology students to enroll in and develop connections with classmates when completing program-required foundational courses such as VET 111 (Animal A&P), MTH 133 (Math for Health Professionals) and SDV 101 (Orientation to Vet Tech). The Online students are advised to seek guidance with personal issues from the program faculty advisor, veterinary mentor(s) at their place of employment, NOVA Online and/or campus counselors, and financial aid to support student retention. The distance education students are more likely to unexpectedly un-enroll themselves from courses during the semester and take time off from the program without communicating with their veterinary practice mentors, instructors, or the program head. Online students living long distances from the Loudoun campus are advised by course instructors to carpool with classmates to attend on-campus lab sessions and off-site field trips when scheduled throughout the semester, for companionship and small student groups. This helps online students feel connected to peers and campus program facilities.
Most recent results: There is a wide retention rate range between the full-time and part-time program students, with more students dropping from the On-Campus program after the 1st semester due to academic


and personal reasons. The Online program has more re-joiners due to the greater flexibility of part-time enrollment with asynchronous online instruction. Students who reenter either program after 1 to 4 years tend to show greater persistence and determination to complete degree goals despite any previous set-backs. On-Campus students show decreased resiliency for time constraints and the veterinary field altogether. They often leave to complete other degrees, pursue animal-related out-of-country work opportunities, or seek other career paths. Financial issues (an increasing number of program students receiving financial aid vs. self-funding) and health issues also contribute to higher stress levels and program withdrawals in the full-time program.
Results improved: [ ] Yes [ X ] No [ ] Partially
Current action(s) to improve program goal: Students considering program withdrawal are reminded via course Bb announcements to meet with their instructor or assigned program faculty advisor prior to making a final decision, especially those on financial aid or receiving military veteran benefits. All students are recruited to join the Veterinary Technology Club for campus social activities and peer-tutoring support. Beginning Summer 2018, all accepted program students are required to attend a Loudoun campus veterinary technology-specific New Student Orientation (NSO) session conducted by faculty, staff, and current upperclassman program students to emphasize the social networking benefits of


attending this well-established AVMA-accredited program. In Spring 2019, all program applicants must complete and discuss their self-assessment results for SmarterMeasure in their letter of intent. This tool will help the student and program selection committee objectively gauge their readiness for online and hybrid instruction occurring in both the On-Campus and Online programs.
Assessed: Annually

To maintain and increase the number of students completing VET-prefixed and equivalent courses

Data obtained from:
• Fall 2017 Success Rate by Discipline
• Fall 2017 Student Grade Distribution by Course for Loudoun Campus

Target: 100% course completion.
Total VET courses offered = 9 (4 offered concurrently for the On-Campus and Online programs). Total equivalent VET courses offered = 1 (offered only at the Loudoun campus to On-Campus program and pre-Veterinary Technology students).
Results for Fall 2017:
VET Course | Success Rate
VET 105* | 87%
VET 111 (Online Only) | 78%
BIO 195 (On-Campus Only, equivalent to VET 111) | 50%
VET 121 | 74%
VET 122 | 100%
VET 132** | 93%
VET 133 | 90%
VET 212 | 100%
VET 216 | 88%
VET 221** | 97%
Target Met: [ ] Yes [ X ] No [ ] Partially

VET Course | Total Student Withdrawals
VET 105* | 7
VET 111 (Online Only) | 3
BIO 195 (On-Campus Only, equivalent to VET 111) | 2
VET 121 | 7
VET 122 | 0
VET 132** | 0
VET 133 | 0
(table continues below)

The Program requirement for students in VET courses is 100% completion, achieved by earning a 'C' grade or better in both the theoretical and practical portions of all VET courses, and a 'C' grade or better in the other general education courses, before continuing in the curriculum sequence. The program goal for 100% of selectively admitted program students to pass all of their degree courses on the first attempt is a much higher benchmark than for other general science majors at NOVA. This is due to the applied nature of the veterinary medical field and the outside accreditation and credentialing standards for preparing entry-level Licensed Veterinary Technicians (LVTs). The 1st-semester courses (3 for the On-Campus program and 2 for the Online program) have historic completion rates between 50% and 90%. The lowest completion and highest withdrawal rates continue to correspond to the most challenging foundational laboratory science course, Veterinary Anatomy & Physiology (VET 111 and BIO 195). Possible causes for this change are the larger number of unprepared pre-veterinary technology students mixed with selectively admitted program students in the same course and a


(Total Student Withdrawals, continued)
VET 212 | 0
VET 216 | 0
VET 221** | 0

VET Course | Total Students With D or F
VET 105* | 0
VET 111 (On-Campus Only) | 2
BIO 195 (On-Campus Only, equivalent to VET 111) | 10
VET 121 | 1
VET 122 | 0
VET 132** | 6
VET 133 | 3
VET 212 | 0
VET 216 | 3
VET 221** | 2

Total number of VET students with incompletes: 0
*Includes both on-campus and online 1st-year program students in separate course sections
**Includes both 2nd-year on-campus program students and 2nd- or 3rd-year online program students in separate course sections
Comparison to previous assessment(s): For Fall 2017, the total passing rate for VET courses = 218 (94.4%), compared to previous program data for 2016-17 of 221 (96.9%); this is a 2.5 percentage point decrease in student success after the 1st-semester curriculum in both the On-Campus and Online program cohorts combined. There was also a similar small increase in the percentage of 2nd- and 3rd-year Online cohort students not passing VET courses compared to previous data, due to various personal, academic, and health issues.
Total non-passing VET course grades: Fall 2017 = 13 (5.6%)
Total non-passing VET course grades: Fall 2016 = 7 (3.1%)
Previous Results from Fall 2016:

VET Course | Success Rate
VET 105* | 94%
VET 111 (Online Only) | 83%
BIO 195 (On-Campus Only, equivalent to VET 111) | 70%
VET 121 | 90%
VET 122 | 100%
VET 132** | 97%
VET 133 | 100%
VET 212 | 100%
VET 216 | 97%
VET 221** | 100%

Previous action(s) to improve program goal: To improve pass rates in BIO 195, course availability has increased from a single yearly offering in the Fall after acceptance to the Veterinary Technology program to an offering every semester. This permits pre-Veterinary Technology students to enroll in the course sooner and to retake it prior to program application. Pre-Veterinary Technology students are advised to seek peer and private tutoring and to enroll in the separate medical terminology course, HLT 141, in preparation for the large volume of unfamiliar anatomy and medical terms. Students are also strongly encouraged to lighten their first-semester academic load in the program by completing other required general education courses, such as MTH 126 and CHM 101 or 111, in the semesters prior to their 1st Fall program semester. In addition, since Fall 2013, program students have been required to complete SDV 101 (Orientation to Veterinary Technology) during their 1st program semester.
Most recent results: Students in BIO 195 and VET 111 received an increased number of failing grades, while VET 105 and VET 121 students withdrew to avoid the grade penalty. Completion of SDV 101 by the On-Campus program cohort for the past 5 years does not appear to have an effect on student success rates in these other courses.
Results improved: [ ] Yes [ X ] No [ ] Partially



Target Met: [ ] Yes [ X ] No [ ] Partially

VET Course | Total Student Withdrawals
VET 105* | 2
VET 111 (Online Only) | 2
BIO 195 (On-Campus Only, equivalent to VET 111) | 3
VET 121 | 2
VET 122 | 0
VET 132** | 0
VET 133 | 0
VET 212 | 0
VET 216 | 0
VET 221** | 0

VET Course | Total Students With D or F
VET 105* | 2
VET 111 (On-Campus Only) | 3
BIO 195 (On-Campus Only, equivalent to VET 111) | 5
VET 121 | 2
VET 122 | 0
VET 132** | 2
VET 133 | 0
VET 212 | 0
VET 216 | 1
VET 221** | 0
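For reference, the following minimal sketch (in Python; not part of the program's or OIESS's reporting workflow) shows how the total passing rates and the percentage-point change cited in the comparison above can be derived from the passing and non-passing grade counts; the function name is illustrative only.

```python
# Sketch: derive total VET passing rates and the change in percentage points
# from passing / non-passing grade counts. The counts restate the Fall 2017
# and Fall 2016 totals reported above; names are illustrative only.

def passing_rate(passing: int, non_passing: int) -> float:
    """Percent of graded enrollments that passed."""
    total = passing + non_passing
    return 100.0 * passing / total

fall_2017 = round(passing_rate(passing=218, non_passing=13), 1)  # ~94.4%
fall_2016 = round(passing_rate(passing=221, non_passing=7), 1)   # ~96.9%

# The report compares the rounded rates: 94.4 - 96.9 = -2.5 percentage points.
print(f"Fall 2017: {fall_2017}%  Fall 2016: {fall_2016}%  "
      f"Change: {fall_2017 - fall_2016:+.1f} percentage points")
```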

Current action(s) to improve program goal: Continue to advise pre-program and degree-program students to meet frequently with their instructors for assistance with course material and discussion of their course progress, and explain the academic withdrawal option as a benefit to their GPA. Key foundational courses shift effective 2018-19 (see College advising sheets) through curriculum resequencing consistent with the College's Guided Pathways and pre-application advising, to support degree completion. Beginning Fall 2019, students must demonstrate successful completion of the foundation courses prior to selective program admission. The 5 prerequisite courses are ENG 111, SDV 101, MTH 133 (previously MTH 126; a VCCS change), CHM 101 (or CHM 111 for students who plan to transfer to a 4-year college), and VET 111.
Assessed: Annually

To demonstrate Veterinary Technology Program graduate achievement and mastery of skills based on the national accreditation standards of the Committee on Veterinary Technician Education and Activities (CVTEA) of the American Veterinary Medical Association (Program Goal)

Data obtained from required individual course task lists completed and collected by faculty prior to graduation for both the On-Campus and Online Programs in Spring 2018:
• Essential and Recommended Veterinary Technician Student Task List – Appendix I of the CVTEA Policy & Procedures manual

Target: 100% completion
Results for both the On-Campus and Online Programs:

Graduating Class | Course Task Lists Submitted | Program Candidates for Graduation | Completion Rate
May 2018 | 59 | 60 | 98%*
May 2017 | 44 | 46 | 96%*
May 2016 | 52 | 52 | 100%
May 2015 | 48 | 48 | 100%
May 2014 | 48 | 49 | 98%*

*One or two task lists were not submitted because an Online student had an incomplete general education requirement.
Target Met: [ ] Yes [ ] No [ X ] Partially
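As a minimal illustration of the completion-rate calculation behind the table above (the counts restate the May 2018 row; the helper function is illustrative, not part of any NOVA reporting tool):

```python
# Sketch: task-list completion rate = task lists submitted / graduation candidates.
# The counts restate the May 2018 row above; the function name is illustrative.

def completion_rate(submitted: int, candidates: int) -> float:
    return 100.0 * submitted / candidates

may_2018 = completion_rate(submitted=59, candidates=60)
print(f"May 2018 completion rate: {may_2018:.0f}%")  # ~98%
```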

Each board-exam-eligible veterinary technician student was individually signed off on over 200 entry-level skills required by the CVTEA. For the past 7 years, a higher percentage of accepted program students have already held a previous degree (AS, BS, BA, MS, MA, or MD/DVM). Therefore, program students are less likely to experience delayed graduation due to missing or unsatisfactory grades in required program-specific general education courses such as Math for Health Professionals (MTH 133) and College Chemistry (CHM 101/CHM 111). The task list completion rate has minor fluctuations but ranges from 96% to 100%.



Comparison to previous assessment: Student submission rates have been similar for the past 5 years.

Previous action(s) to improve program goal: Ongoing maintenance of an individual task list by each student, with course instructor verification and submission of documentation throughout enrollment, as required for continued external program accreditation by the CVTEA. The accreditation is one of the qualifications graduates need to obtain a state license to practice veterinary technology. Students are responsible for reproducing lost or missing lists to qualify for program completion. Reminders are made to students by the Program Head and instructors, and submission of the course lists is part of course grades and is described in the syllabus.
Most recent results: Online program students completing a general education requirement (usually just one course) after completion of all VET courses resulted in a minor drop in task list submission. This often occurs with students who rejoin the program after a gap of a few years and are affected by program curriculum changes. Other common reasons for unmet general education course requirements are procrastination, course repeats due to unsatisfactory grades or failed transfer credits, money constraints for tuition and books, and time constraints due to employment commitments. All such students subsequently complete the missing course the following Summer, with a delay in graduation and in application for the state LVT license.
Results improved:


[ X ] Yes [ ] No [ ] Partially
Current action(s) to improve program goal: Program staff will convert the paper-based skills documentation to an electronic format for improved feedback to students and more efficient reporting to CVTEA accreditors. Prerequisite requirements for both the On-Campus and Online programs have been changed so that 5 foundational general education courses (the ones most likely to result in course repeats) must be completed with a 'C' letter grade or better prior to selective admission, beginning with the Spring 2019 and Fall 2019 cohorts. These courses are ENG 111, SDV 101 for Veterinary Technology, MTH 133, CHM 101 or 111, and VET 111 (Animal A&P). The remaining general education courses in the Social Sciences and Humanities are lecture-based only and have not presented obstacles for program students to complete during their curriculum. Faculty advisement has also been increased so that all program students complete all general education courses prior to the final program semester and remain on track to graduate. A delay in conferring the degree subsequently delays the LVT license, which leads to delays in employment options, career opportunities, and pay increases. Implementation of these changes begins in Fall 2018, with the switch to electronic task list documentation by Fall 2019.
Assessed: Annually

To achieve successful program student pass rates on the Veterinary Technician National Examination (VTNE) and maintain the external accreditation standards of the Committee on Veterinary Technician Education and Activities (CVTEA) of the American Veterinary Medical Association (Program Goal)

Data obtained from triannual American Association of Veterinary State Boards (AAVSB) reports for VTNE candidates*

Target: Meet or exceed national benchmarks (see the asterisked National VTNE Benchmarks table below)

For the On-Campus (Full-Time) Program:

VTNE Testing Window | 1st-Time Candidates | 1st-Time Pass Rate | Repeat Candidates | Repeat Pass Rate
Jul-Aug 2017 | 0 | N/A | 8 | 62.5% (5)
Nov-Dec 2017 | 0 | N/A | 3 | 0% (0)
Mar-Apr 2018 | 34 | 88.24% (30) | 2 | 50% (1)
Jul-Aug 2018 | 0 | N/A | 5 | 20% (1)

The VTNE national board exam is an applied, knowledge-based, entry-level test that all students completing a degree in Veterinary Technology from a CVTEA (AVMA)-accredited program are qualified to apply for prior to graduation. Passing scores, along with the conferred academic degree, aid graduates in meeting state requirements for becoming a Licensed Veterinary Technician (LVT).



For the Online (Part-Time) Program:

VTNE Testing Window | 1st-Time Candidates | 1st-Time Pass Rate | Repeat Candidates | Repeat Pass Rate
Jul-Aug 2017 | 10 | 90% (9) | 1 | 0% (0)
Nov-Dec 2017 | 0 | N/A | 0 | N/A
Mar-Apr 2018 | 7 | 85.71% (6) | 2 | 50% (1)
Jul-Aug 2018 | 17 | 82.35% (14) | 0 | N/A

*National VTNE Benchmarks:

Testing Window | 1st-Time Candidates Pass Rate | Repeaters Pass Rate
Jul-Aug 2017 | 72.88% | 29.90%
Nov-Dec 2017 | 63.95% | 31.87%
Mar-Apr 2018 | 69.80% | 50%
Jul-Aug 2018 | 79.01% | 38.66%
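The following is a minimal sketch of the benchmark comparison implied by the target above; the figures restate the 2017-18 first-time pass rates from the tables, and the variable names and structure are illustrative only.

```python
# Sketch: check each testing window's first-time pass rate against the
# national first-time benchmark, mirroring the "meet or exceed" target above.
# Figures restate the 2017-18 tables; names are illustrative only.

national_benchmark = {  # first-time candidate pass rates (%)
    "Jul-Aug 2017": 72.88,
    "Nov-Dec 2017": 63.95,
    "Mar-Apr 2018": 69.80,
    "Jul-Aug 2018": 79.01,
}

program_first_time = {  # (cohort, window) -> program first-time pass rate (%)
    ("On-Campus", "Mar-Apr 2018"): 88.24,
    ("Online", "Jul-Aug 2017"): 90.00,
    ("Online", "Mar-Apr 2018"): 85.71,
    ("Online", "Jul-Aug 2018"): 82.35,
}

for (cohort, window), rate in program_first_time.items():
    met = rate >= national_benchmark[window]
    print(f"{cohort} {window}: {rate:.2f}% vs {national_benchmark[window]:.2f}% "
          f"-> {'meets/exceeds' if met else 'below'} benchmark")
```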

Target Met: [ ] Yes [ ] No [ X ] Partially
Comparison to previous assessment(s): In both cohorts, the number of repeat test-takers was very low, at 5 or fewer graduates. The repeaters are a mix of graduates from previous years, some of whom are not working, or are only minimally working, in the field as non-credentialed veterinary technicians. Most notably, the full-time On-Campus students all chose to take the VTNE in the March-April 2018 testing window, approximately 1 to 2 months prior to graduation, to qualify for state licensure quickly. In comparison, more than two-thirds of the part-time Online students, all of whom are required to work in a veterinary practice as part of their program enrollment, delayed taking the VTNE until after graduation. This is consistent with the previous year and is due in part to time constraints for VTNE test preparation, money constraints for the VTNE exam fee, and presumed decreased personal or veterinary-employer pressure to obtain the LVT license sooner.
Previous Results from Summer 2016 to Summer 2017 - On-Campus (Full-Time) Program:

Previous action(s) to improve program goal: Program students are advised to prepare for and take the VTNE in the March-April testing window prior to graduation, while still in student mode, and are encouraged to use online VTNE prep courses and to study with classmates for support and encouragement. Faculty assign computer-based exercises via an online adaptive quizzing tool for content mastery with customized student remediation modules. Course assessment questions have been revised to mimic the VTNE content and computer-based testing format to help with general test anxiety. On-Campus program students continue to complete a mock VTNE practice exam, proctored by faculty in a campus computer lab, during the semester prior to graduation. In Fall 2017, students achieved an 81.64% scaled score on the HESI Veterinary Technology Exit Exam, compared to 79.19% in Fall 2016; this is a slight improvement, and student performance on the mock exam overall mirrors performance on the VTNE. Students pay the fee for the mock exam, which includes individual test feedback with content mastery tools.
Most recent results: Program 1st-time candidate pass rates for On-Campus and Online students are similar, ranging from 82.35% to 90%, which exceeds the national benchmark set for each testing window. This is an improvement from the 2016-17 data, in which the Online cohort's 1st-time pass rates exceeded the On-Campus cohort's (see tables below).


VTNE Testing Window | 1st-Time Candidates | 1st-Time Pass Rate | Repeat Candidates | Repeat Pass Rate
Jul-Aug 2016 | 3 | 33.3% (1) | 5 | 60% (3)
Nov-Dec 2016 | 0 | N/A | 4 | 50% (2)
Mar-Apr 2017 | 29 | 69% (20) | 3 | 66.6% (2)
Jul-Aug 2017 | 0 | N/A | 8 | 63% (5)

Online (Part-Time) Program:

VTNE Testing Window | 1st-Time Candidates | 1st-Time Pass Rate | Repeat Candidates | Repeat Pass Rate
Jul-Aug 2016 | 5 | 80% (4) | 2 | 0% (0)
Nov-Dec 2016 | 1 | 100% (1) | 2 | 100% (2)
Mar-Apr 2017 | 6 | 100% (6) | 0 | N/A
Jul-Aug 2017 | 10 | 90% (9) | 1 | 0% (0)

*National VTNE Benchmarks:

Testing Window | 1st-Time Candidates Pass Rate | Repeaters Pass Rate
Jul-Aug 2016 | 74% | 40%
Nov-Dec 2016 | 65% | 44%
Mar-Apr 2017 | 64% | 39%
Jul-Aug 2017 | 73% | 30%

The repeat-candidate VTNE pass rate met or exceeded the benchmarks in 2 of the 4 testing windows for the On-Campus program and in only 1 of the 2 testing windows with repeat candidates for the Online program.
Results improved: [ X ] Yes [ ] No [ ] Partially
Current action(s) to improve program goal: Continue the previous actions and expand the course requirement so that Online program students complete the HESI test prep exam in the same semester as On-Campus students. This action is planned for late Fall 2018.
Assessed: Annually


Annual Planning and Evaluation Report: 2017-2018

Welding: Basic Techniques Career Studies Certificate

NOVA Mission Statement: With commitment to the values of access, opportunity, student success, and excellence, the mission of Northern Virginia Community College is to deliver world-class in-person and online post-secondary teaching, learning, and workforce development to ensure our region and the Commonwealth of Virginia have an educated population and globally competitive workforce.

Program Purpose Statement: The curriculum is designed primarily for students who wish to find employment in various industries as entry-level welders. The curriculum emphasizes the study of equipment, the reading of blueprint designs, and the various welding processes utilized in today's industry.

Student Learning Outcomes | Evaluation Methods | Assessment Results | Use of Results

Perform technical work related to welding applying OSHA safety and industry standards in a work environment.

Introduction to Welding WEL 120
Direct Measure: Students were assessed based on a culminating laboratory project.
Criteria - Students will:
• Set up an oxy-fuel torch for welding
• Light torch and adjust flame for proper fuel/O2 mixture
• Weld a lap joint using proper technique and correct filler mixture
• Shut down torch correctly
• Break down torch for proper storage
See attached procedure students were to follow to complete this SLO.

Sample:

Campus/Modality | # of Total Sections Offered | # of Sections Assessed | Students Assessed (#) | Students Assessed (%)
MA only | 1 | 1 | 15 | 100
ELI | N/A | N/A | N/A | N/A
DE* | N/A | N/A | N/A | N/A
Total | 1 | 1 | 15 | 100

*Dual-enrollment

Semester/year data collected: Fall 2017
Target: 80% of students passing with a score of 80% or better
Results:

Results by Campus/Modality | Average Score (Fall 2017) | Percent > Target (Fall 2017) | Average Score (Fall 2016) | Percent > Target (Fall 2016)
MA only | 85 | 95 | 83 | 95
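As a minimal sketch of how an "Average Score" and "Percent > target" figure of this kind can be computed from individual project scores (the score list below is a hypothetical placeholder, not the actual Fall 2017 section data):

```python
# Sketch: compute the average score and the share of students at or above the
# 80% target from individual lab-project scores. The scores below are
# hypothetical placeholders, not the actual Fall 2017 section data.

scores = [95, 88, 72, 90, 85, 81, 79, 93, 86, 84, 90, 77, 88, 92, 80]
target = 80

average_score = sum(scores) / len(scores)
percent_meeting_target = 100 * sum(s >= target for s in scores) / len(scores)

print(f"Average score: {average_score:.0f}")
print(f"Percent of students scoring {target}% or better: {percent_meeting_target:.0f}%")
print(f"Target met (at least 80% of students at or above {target}%): "
      f"{percent_meeting_target >= 80}")
```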

Target Met: [ X ] Yes [ ] No [ ] Partially
The percentage of students passing remained the same as in the previous assessment. Given this, the Welding faculty will continue to work with students so that they fully understand and execute the procedures, and will give them more opportunities within the lab period to practice.
Next Assessment: Fall 2018

Apply basic machine and technique adjustments to solve typical welding problems.

Arc Welding WEL 121
Direct Measure: Students were assessed based on a classroom assignment.
Criteria - Students will:
• Perform visual inspections of examples depicting different bad welds
• Examine, analyze, and explain the cause of bad welds
• State corrective action required to prevent bad welds
See attached sheet students completed for this SLO. Samples were actual welded coupons and could not be attached.

Sample:

Campus/Modality | # of Total Sections Offered | # of Sections Assessed | Students Assessed (#) | Students Assessed (%)
MA only | 1 | 1 | 15 | 100
ELI | N/A | N/A | N/A | N/A
DE* | N/A | N/A | N/A | N/A
Total | 1 | 1 | 15 | 100

*Dual-enrollment

Semester/year data collected: Fall 2017
Target: 80% of students passing with a score of 80% or higher
Results:

Results by Campus/Modality | Average Score (Fall 2017) | Percent > Target (Fall 2017) | Average Score (Fall 2016) | Percent > Target (Fall 2016)
MA only | 90 | 90 | 86 | 85

Target Met: [ X ] Yes [ ] No [ ] Partially
The target was met again, with results slightly improved from the previous assessment. To improve further, the Welding faculty is going to work with the students using the American Welding Society standards for visual inspection. Also, the peer inspections that started last year seem to help the students because they are learning from each other.
Next Assessment: Fall 2018


Select appropriate filler material for compatible admixing and dilution of welding procedure for various ferrous and non-ferrous metals.

Inert Gas Welding WEL 130
Direct Measure: Students were assessed based on a classroom exam.
Criteria - Students will:
• Recognize metals and filler materials listed in a given chart
• Match a list of metals with appropriate filler materials
See attached sheet students completed for this SLO.

Sample:

Campus/Modality | # of Total Sections Offered | # of Sections Assessed | Students Assessed (#) | Students Assessed (%)
MA only | 1 | 1 | 15 | 100
ELI | N/A | N/A | N/A | N/A
DE* | N/A | N/A | N/A | N/A
Total | 1 | 1 | 15 | 100

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 80% of students passing with a score of 80% or higher
Results:

Results by Campus/Modality | Average Score (Spring 2018) | Percent > Target (Spring 2018) | Average Score (Spring 2017) | Percent > Target (Spring 2017)
MA | 92 | 90 | 83 | 85

Target Met: [ ] Yes [ ] No [ X ] Partially
The quizzes that are being given have helped with student retention, and the exercises during the lab periods have reinforced the concepts. The Welding faculty is also going to introduce practical quizzes within the lab period to test the students' knowledge and retention.
Next Assessment: Spring 2019

Interpret basic welding fabrication drawings, sketches, symbols, and/or welding specifications.

Welding Drawing and Interpretation WEL 150
Direct Measure: Students were assessed based on a specific drawing given to them to analyze.
Criteria - Students will:
• Recognize welding symbols and specifications
• Find dimensions of objects on the drawing
• Identify line types used on the drawing
See attached drawing and list of questions students completed for this SLO.

Sample:

Campus/Modality | # of Total Sections Offered | # of Sections Assessed | Students Assessed (#) | Students Assessed (%)
MA | 1 | 1 | 12 | 95
ELI | N/A | N/A | N/A | N/A
DE* | N/A | N/A | N/A | N/A
Total | 1 | 1 | 12 | 95

*Dual-enrollment

Semester/year data collected: Spring 2018
Target: 80% of students passing with a score of 80% or higher
Results:

Results by Campus/Modality | Average Score (Spring 2018) | Percent > Target (Spring 2018) | Average Score (Spring 2017) | Percent > Target (Spring 2017)
MA | 90 | 95 | 82 | 90

Target Met: [ X ] Yes [ ] No [ ] Partially
The achievement target for this SLO was met with a slightly higher average than in the previous assessment. The students seem able to grasp the concepts better since the faculty have been giving them more drawings to examine. The repetition of examining drawings is helping to reinforce what the students have been taught.
Next Assessment: Spring 2019


Program Goals | Evaluation Methods | Assessment Results | Use of Results

To encourage graduation and career advancement through successful completion of the CSC in welding technology.

The number of program graduates for the 2017-18 academic year as provided by the NOVA OIR. This data can be found at http://www.nvcc.edu/college-planning/_docs/41-17-Number-of-NOVA-Graduates-By-Degree-and-Specialization-2016-2017.pdf

Target: The program's target number of graduates is 15 per year.
The total number of CSC recipients for 2017-18 was 14. The following are the numbers of graduates for the previous four years:
• 2016-2017: 16
• 2015-2016: 21
• 2014-2015: 13
• 2013-2014: 5

The number of graduates for the 2017-18 academic year was 14, slightly lower than the previous year. The faculty still attribute this to the fact that the program is typically part-time for students. Also, after students complete a few classes, they may find a job in the welding field and put off taking the remaining classes until later.
Assessed: Annually

To promote course completion and progression through the program.

Most recent course completion rates as documented by the NOVA OIR report “Success Rates by Discipline”. This data can be found at http://www.nvcc.edu/college-planning/_docs/Fall-2016-Success-Rates-by-Discipline_Excludes%20ELI%20Classes.pdf

Target: The program's target completion rate per course is 85%. The following figures show the average course completion rates per year:
• Fall 2017: 89%
• Fall 2016: 89.8%
• Fall 2015: 80.9%
• Fall 2014: 90.5%
• Fall 2013: 87%
• Fall 2012: 93%
• Fall 2011: 87%
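As a minimal sketch of how an average course completion rate of this kind can be derived from per-course success rates (the per-course figures below are hypothetical placeholders chosen only to illustrate the calculation; the report lists only the resulting yearly averages):

```python
# Sketch: average per-course completion (success) rates into a single yearly
# figure, as in the list above. The per-course rates are hypothetical
# placeholders; only the resulting yearly averages appear in the report.

fall_2017_course_rates = {
    "WEL 120": 0.91,
    "WEL 121": 0.88,
    "WEL 130": 0.90,
    "WEL 150": 0.87,
}

average_completion = sum(fall_2017_course_rates.values()) / len(fall_2017_course_rates)
target = 0.85

print(f"Fall 2017 average completion rate: {average_completion:.1%}")
print(f"Meets the 85% target: {average_completion >= target}")
```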

The results show a slight decrease in the completion rate, but it is still above the target. The faculty still believe the decrease stems from students signing up for a class, finding that it isn't for them, and not completing the course. The faculty will place more emphasis on advising these students to find out what the issues are and correct them, or to make sure they withdraw from the class instead of simply quitting and receiving the grade penalty.


PATHWAY TO THE AMERICAN DREAM—NOVA’S STRATEGIC PLAN 2017-2023

THE NOVA COMMITMENT
As its primary contribution to meeting the needs of the Commonwealth of Virginia, Northern Virginia Community College pledges to advance the social and economic mobility of its students while producing an educated citizenry for the 21st Century.

THE STRATEGIC PLAN GOALS AND OBJECTIVES
To deliver on this commitment, NOVA will focus its creativity and talent, its effort and energy, and its resources and persistence on achieving three overarching goals: success, achievement, and prosperity. It will strive to enable Every Student to Succeed, Every Program to Achieve, and Every Community to Prosper. To advance the completion agenda described above, thereby promoting students' success and enhancing their social mobility, ensuring that programs achieve, and producing an educated citizenry for the 21st Century, the following goals and objectives are adopted:

GOAL 1: Every Student Succeeds
• Objective 1: Develop a College-wide approach to advising that ensures all students are advised and have access to support throughout their time at NOVA
• Objective 2: Implement the VIP-PASS System as the foundational technology, based on NOVA Informed Pathways, for student self-advising, assignment and coordination of advisors, and course registration

GOAL 2: Every Program Achieves
• Objective 3: Develop comprehensive, fully integrated Informed Pathways for every program to ensure seamless transitions from high school and other entry points to NOVA, and from NOVA to four-year transfer institutions or the workforce
• Objective 4: Develop effective processes and protocols for programmatic College-wide collective decisions that include consistent, accountable leadership and oversight of each academic program with designated "owners," active advisory committees, clear student learning outcomes and assessments, and program reviews in all modalities of instruction
• Objective 5: Align NOVA's organizational structures, position descriptions, and expectations for accountability with its overarching mission to support student engagement, learning, success, and institutional effectiveness

GOAL 3: Every Community Prospers
• Objective 6: Enhance the prosperity of every community in Northern Virginia by refocusing and prioritizing NOVA's workforce development efforts
• Objective 7: Further develop NOVA's IT and Cybersecurity programs to support regional job demand and position NOVA as the leading IT community college in the nation
• Objective 8: Re-envision workforce strategies and integrate workforce development into a NOVA core focus
• Objective 9: Plan to expand the breadth and reach of NOVA's healthcare and biotechnology programs, and prioritize future programs to support regional economic development goals
