
QUALITY ASSURANCE PLAN

DRAFT V02.02

MOSAIC Project
Oklahoma Department of Human Services

DRAFT V02.02 3/19/2009

Quality Assurance Plan Version Control

VERSION   DATE       CHANGE DESCRIPTION
V02.00    09/08/08   Draft version
V02.01    10/03/08   Team Lead Review Draft version
V02.02    12/09/08   Final Draft version

QUALITY PLAN APPROVALS

Prepared by _________________________________________ Quality Assurance/Quality Control Lead

Approved by ________________________________________ Program Quality Manager

_________________________________________ Quality Team Lead

_________________________________________ Program Manager

_________________________________________ MOSAIC Project Sponsor


Quality Assurance Plan - Table of Contents

1 GENERAL INFORMATION
  1.1 PURPOSE
  1.2 QA PLAN SCOPE
  1.3 QA TEAM ROLES AND RESPONSIBILITIES
  1.4 RESOURCES FOR QA TEAM

2 QUALITY ASSURANCE TASKS
  2.1 QA TEAM TASKS AND MATRIX
  2.2 CONDUCTING QA TEAM TASKS
  2.3 QA SCHEDULE
  2.4 EVALUATE PLANNING OVERSIGHT
  2.5 EVALUATE PROJECT MANAGEMENT
  2.6 EVALUATE QUALITY MANAGEMENT
  2.7 EVALUATE TRAINING
  2.8 EVALUATE REQUIREMENTS MANAGEMENT
  2.9 EVALUATE OPERATING ENVIRONMENT
  2.10 EVALUATE DEVELOPMENT ENVIRONMENT
  2.11 EVALUATE SOFTWARE DEVELOPMENT
  2.12 EVALUATE SYSTEM AND ACCEPTANCE TESTING
  2.13 EVALUATE DATA MANAGEMENT
  2.14 EVALUATE OPERATIONS AND BUSINESS OVERSIGHT
  2.15 EVALUATE SOFTWARE PRODUCTS REVIEW PROCESS
  2.16 EVALUATE COMPONENT DELIVERABLE (RELEASE) PROCESS
  2.17 EVALUATE MEDIA CERTIFICATION, STORAGE AND HANDLING PROCESS
  2.18 NON-DELIVERABLE SOFTWARE CERTIFICATION
  2.19 EVALUATE PERFORMANCE STANDARDS

3 PROJECT DELIVERABLES

4 REVIEWS AND AUDITS
  4.1 VERIFY DOCUMENT AND ARTIFACT DELIVERABLE REVIEW
  4.2 VERIFY PROJECT MANAGEMENT COMPLIANCE REVIEWS
  4.3 CONDUCT PROCESS AUDITS AND REVIEWS
  4.4 EVALUATION

5 TESTING PROCESS AND ENVIRONMENTS
  5.1 SYSTEM AND USER ACCEPTANCE TESTING PROCESSES
  5.2 SYSTEM ENVIRONMENTS
  5.3 QUALITY CONTROL TESTING PROCESS

6 VALIDATION AND ACCEPTANCE PROCESS
  6.1 DELIVERABLE ACCEPTANCE
  6.2 CONTRACTOR'S SYSTEM CERTIFICATION
  6.3 ACCEPTANCE PLANS AND RELEASES
  6.4 FEDERAL ACCEPTANCE AND APPROVAL PREPARATION
  6.5 FOOD AND NUTRITION SERVICE (FNS) ACCEPTANCE REQUIREMENTS
  6.6 HEALTH AND HUMAN SERVICES (HHS) ACCEPTANCE REQUIREMENTS

7 ISSUE REPORTING AND CORRECTIVE ACTION
  7.1 ESCALATION PROCEDURE FOR RESOLUTION DISPUTES
  7.2 CORRECTIVE ACTION PROCESS
  7.3 RECORDING DEFECTS (INCIDENTS) IN SOFTWARE CODE OR DOCUMENTATION

8 QUALITY METRICS
  8.1 MEASUREMENTS
  8.2 MONITOR AND CONTROL
  8.3 TREND ANALYSIS
  8.4 PROCESS IMPROVEMENT ANALYSIS

APPENDIX A: QA TEAM TRAINING
APPENDIX B: QA TASK MATRIXES
APPENDIX C: PROCESS AUDIT AND REVIEW SCHEDULE
APPENDIX D: PROCESS AUDIT REPORT
APPENDIX E: SOFTWARE TOOL EVALUATION CHECKLIST
APPENDIX F: PERFORMANCE STANDARDS EVALUATION CHECKLIST
APPENDIX G: PILOT EVALUATION CHECKLIST
APPENDIX H: IMPLEMENTATION EVALUATION CHECKLIST


1 GENERAL INFORMATION

This Quality Assurance Plan (QA Plan) presents a framework for managing quality assurance activities which, when followed, will verify delivery of quality products and services for the OKDHS MOSAIC Project. The QA Plan provides the Project standards and procedures to be used as the basis for the Quality Assurance (QA) Team's reviews and audits. The QA Plan documents how to plan, implement, and assess the effectiveness of Quality Assurance and Quality Control operations. A MOSAIC Project QA Team will implement the QA Plan, and the Contractor will assist OKDHS staff throughout the life cycle of the MOSAIC Project.

1.1 Purpose

The QA Plan will outline and reference the Quality Assurance (QA) and Quality Control (QC) procedures used to evaluate overall project and product performance, business and technical processes, all Deliverable software, and traceability within all MOSAIC Project documentation. The Quality Assurance activities performed will verify that project management and project Deliverables are of high quality and meet the quality standards determined by the MOSAIC Project stakeholders. Section 2 of this QA Plan outlines QA tasks, and Section 5 explains QC testing procedures. The purpose of this QA Plan is to:

1. Define the OKDHS MOSAIC Project Quality Assurance organization;

2. Identify QA tasks and responsibilities;

3. Implement quality standards and objectives that provide maintainable, defect-free systems and business or technical processes delivered on time and within budget;

4. Provide reference documents and guidelines to perform the QA and QC activities;

5. Provide the standards, practices and conventions used in carrying out QA and QC activities;

6. Provide the tools, techniques, and methodologies to support QA and QC activities and reporting.

1.2 QA Plan Scope

This QA Plan establishes the QA activities performed throughout the life cycle of the MOSAIC Project. This QA Plan is written to follow OKDHS-approved QA standards and procedures, adhering to state and federal policies and guidelines. Specifically, this QA Plan will show that the QA function is in place for the MOSAIC Project and that QC measurement results are gathered by the QA Teams for use in re-evaluating and analyzing the quality standards and processes. This QA Plan will help assure the following:

1. That quality control system development, evaluation and acceptance standards are developed, documented and followed throughout the life of the MOSAIC Project.

2. Results from software quality reviews and audits performed by the QA Team will be submitted to the Quality Assurance/Quality Control (QA/QC) Lead.

3. Test results adhere to predetermined acceptance standards.

4. The quality plan processes will be developed and approved to confirm Deliverable and milestone acceptance criteria and to manage approved project processes.

1.2.1 Quality Management Scope

The scope of Quality Management is to implement a Quality Management methodology and process that will verify that all activities necessary to design, develop, implement, and utilize a product or service are effective and efficient with respect to the Enterprise System and its performance. For the OKDHS MOSAIC Project, Quality Management encompasses the Quality Assurance, Quality Control, Change Management, and Risk Management operating methodologies within the plans and documents established for the MOSAIC Project.

1.2.2 Quality Assurance Scope

The scope of Quality Assurance is to implement the Quality Assurance methodology and processes to verify that all the processes, methodologies, and technologies used for the MOSAIC Project, and used to build and utilize the product, perform as specified.

1.2.3 Quality Control Scope

The scope of Quality Control is to verify and validate that Deliverables comply with quality standards and the project requirements, and are complete and correct. QA and QC tools will be utilized to verify and track the MOSAIC Project life cycle activities and the creation and tracking of the associated documents for those activities, and to verify through QC testing that the product is complete and meets its requirements.

1.3 QA Team Roles and Responsibilities

The QA Team has the responsibility to report directly to the OKDHS Program Manager if the quality of the project or product is jeopardized or compromised. The QA Team will follow the escalation process defined in this plan. In practice, escalation above the project level rarely occurs, because almost all defects are correctly addressed at the project level; it is the QA Team's ability to escalate beyond the project level that helps keep the majority of defects resolved there. The diagram below reflects the reporting structure for resolving QA issues.

Although the Program Quality Subject Matter Experts (SMEs) report directly to the QA/QC Lead, they may interact directly with the MOSAIC Project Teams, including the Contractors. They have the authority to delegate responsibilities among interacting functions. The Program Quality Manager and Quality Team Lead review the QA Team's work and will have final approval. The QA Team is responsible for identifying compliance areas as either conforming or non-conforming with the QA standards, procedures, and guidelines set forth in this plan, with the goal of ensuring compliance with QA requirements. Figure 1 reflects the QA Team members in relation to the MOSAIC Project.
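The escalation idea described above can be sketched in code. This is a minimal, purely illustrative model (not part of the plan): it assumes the reporting chain runs from the project level through the QA/QC Lead and the Program Quality Manager / Quality Team Lead up to the OKDHS Program Manager, as the surrounding text describes, and all function and variable names are hypothetical.

```python
# Illustrative sketch of Section 1.3's escalation path: a quality issue
# is addressed at the project level first and moves up the chain only
# if it remains unresolved. Level names follow the reporting structure
# described in the text; everything else here is hypothetical.
ESCALATION_CHAIN = [
    "Project Team",                                  # where almost all defects are resolved
    "QA/QC Lead",
    "Program Quality Manager / Quality Team Lead",
    "OKDHS Program Manager",                         # final escalation point
]

def escalate(resolved_at_level: int) -> list:
    """Return the levels an issue passes through before being
    resolved at index `resolved_at_level` of the chain."""
    return ESCALATION_CHAIN[: resolved_at_level + 1]

print(escalate(0))  # typical case: resolved at the project level
print(escalate(3))  # rare case: escalated all the way up
```

The point of the model is simply that the existence of the upper levels is what keeps most issues resolved at the lowest one.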

The following describes the roles of the QA Team members that influence and control OKDHS MOSAIC Project quality.

OKDHS Program Manager(s) is responsible for the following items:

1. Establishing a quality program by committing the project to implement quality standards and methodologies.

2. Reviewing and approving the QA Plan.

3. Gaining approval from the Decision Team for the implementation of the QA Plan.

4. Resolving and following up on any quality issues escalated to this level by the QA Team.

5. Facilitating the identification of an individual or group independent from the project to audit and report on the MOSAIC Project's QA Team function.

6. Facilitating the identification of the quality factors to be implemented in the system and software.

7. Facilitating the identification, development, and maintenance of planning documents such as the Project Management Plan.

Program Quality Manager and Quality Team Lead are responsible for:

1. Implementing the quality program in accordance with the approved QA Plan which defines QA standards, procedures, and guidelines.


2. Reviewing and approving the QA activities to be performed by the QA/QC Lead.

3. Reviewing and approving the QA Team’s work and recommendations during all phases of the program.

4. Resolving and following-up on any quality issues escalated to this level by the QA Team.

5. Identifying, developing and maintaining Quality Management planning documents such as the Risk Management Plan, Change Management Plan, and this QA Plan.

Quality Assurance/Quality Control (QA/QC) Lead is responsible for:

1. Implementation and adherence to the quality program in accordance with the approved QA Plan which defines QA standards, procedures, and guidelines.

2. Identifying the QA activities to be performed by the QA Team.

3. Reviewing and approving the QA Team's work and recommendations during all phases of the program.

4. Resolving and following up on any quality issues raised by the QA Team.

5. Identifying, developing, and maintaining planning documents such as the Test Plans, the Standards and Process Manuals (Quality Control Testing and Migration Approval), and this QA Plan.

6. Implementing practices, processes, and procedures as defined in OKDHS MOSAIC Project RFP, Project Plan, Business or Technical Requirements and Design Documents, SOW, and this QA Plan.

QA Team is responsible for:

1. Reviewing and commenting on the QA Plan.

2. Implementing the quality program in accordance with this QA Plan.

3. Resolving and following up on any quality issues raised by the QA Team related to software development and hardware implementation activities.

4. Identifying and evaluating the quality factors to be implemented in the system (software and hardware).

5. Implementing practices, processes, and procedures as defined in the OKDHS MOSAIC Project RFP, Project Plan, Business or Technical Requirements and Design Documents, SOW, and this QA Plan.

Contractor Project Manager and Contractors are responsible for:

1. Reviewing and commenting on the QA Plan.

2. Implementing the quality program in accordance with this QA Plan.

3. Resolving and following up on any quality issues raised by the QA Team related to software design, application development, and hardware implementation.

4. Identifying and evaluating the quality factors to be implemented in the software and hardware.

5. Implementing the software design/development practices, processes, and procedures as defined in the project Software Development Methodology and Development Requirements Document, Functional and Technical Design Documents and this QA Plan.


Analysis, Design, Development, & Test (ADDT) Teams are responsible for:

1. Reviewing and commenting on the QA Plan.

2. Implementing the quality program in accordance with this QA Plan.

3. Resolving and following up on any quality issues raised by the QA Team related to software design, application development, and hardware implementation.

4. Identifying and evaluating the quality factors to be implemented in the software and hardware.

5. Implementing the software design/development practices, processes, and procedures as defined in the project Software Development Methodology and Development Requirements Document, Functional and Technical Design Documents and this QA Plan.

Program Quality SME(s) and QCT Supervisor are responsible for:

1. Reviewing and commenting on the QA Plan.

2. Implementing the quality program in accordance with this QA Plan.

3. Resolving and following up on any quality issues raised by the QA Team related to Change Management and the QCT Standards and Process Manual.

4. Ensuring the quality factors are implemented in the quality control testing of software as related to Change Management.

5. Implementing the practices, processes, and procedures as defined in the MOSAIC Project Change Management Plan, the QCT Standards and Processes Manual, the Migration Approval Process Manual, and this QA Plan.

Quality Control Testers are responsible for:

1. Reviewing and commenting on the QA Plan.

2. Implementing and adhering to the quality program in accordance with this QA Plan.

3. Resolving and following up on any quality issues raised by the QA Team related to software testing.

4. Verifying the quality factors are implemented in the system, specifically the software.

5. Implementing the software test practices, processes, and procedures as defined in the Quality Control Standards and Processes, the Software Development Methodology, and this QA Plan.

1.4 Resources for QA Team

1.4.1 Facilities and Equipment

The QA Team will have the same access to facilities and equipment as the MOSAIC Project Team. In addition, the QA Team will have access to the computer resources needed to perform QA functions such as process and product evaluations, audits, and quality control testing.

1.4.2 Testing Sites

The testing sites that will be established will have all the necessary tools and documentation available to perform QA and QC activities.


1.4.3 QA Team Personnel

The QA Team selected and assigned to the MOSAIC Project Team has demonstrated experience in quality assurance as it relates to software engineering and software development, as well as how it directly impacts the business units and their functions. The QA Team has a clear understanding of quality assurance, its implications for the project and the MOSAIC Project Team, and how it contributes directly to overall project success. The QA Team will be responsible for ensuring the quality standards and procedures described in the QA Plan are executed. Additionally, the QA Team will be familiar with software quality; software development-related activities; and structured analysis, design, coding, and testing. The QA Team will also be familiar with how each division business unit functions and what their process requirements demand of new software implementation and design.

1.4.4 Independent Verification & Validation Contractor

An Independent Verification & Validation (IV&V) Contractor will assist OKDHS by determining whether development products conform to requirements and whether the software satisfies its intended use and user needs. This determination includes assessment, analysis, evaluation, review, inspection, and testing of software products and processes. The IV&V Contractor will work with the QA Team and will follow the procedures established or referenced in this Quality Assurance Plan, which will be used as the basis for managing the quality assurance of project Deliverables.

1.4.5 Quality Control Testing Team

OKDHS will determine how and when QA/QC staff will be brought into the MOSAIC Project to perform quality control testing; this will also be defined in the Staffing Plan.
1.4.6 QA Team Training

To establish a QA Team experienced in quality assurance activities, team members will receive training on OKDHS-specific activities to supplement their formal training and experience and to achieve an effective Quality Program. As required, other MOSAIC Project Team members will be provided with training to support the QA activities. Any QA training requirements will be coordinated with the MOSAIC Project Team. Training may be conducted in several formats, as specified in the MOSAIC Training Plan. The training schedule will be compatible with the project schedule. Appendix A provides a matrix that identifies the skills required to perform the QA tasks that implement the QA Plan.

2 QUALITY ASSURANCE TASKS

The scheduling of QA tasks is driven by the Project Plan. A QA task is therefore performed in relation to the project activities taking place, and one or more QA tasks can be performed concurrently. A task is considered complete when the required report (such as a QA Report or Process Audit Report) is satisfactorily completed or its action items have been closed. The tasks identified


below, which require coordination and cooperation with the MOSAIC Project Team, shall be performed by the QA Team as defined in the OKDHS MOSAIC Project RFP, this QA Plan, and the MOSAIC Project Standards and Guidelines reference materials.

2.1 QA Team Tasks and Matrix

The QA/QC Lead will have the responsibility for ensuring tasks are performed so as to assure the quality of the Deliverables on the MOSAIC Project. The QA Team will assist the QA/QC Lead in ensuring the QA procedures are executed in accordance with the QA Plan. The QA/QC Lead shall monitor project staff activities and review products for compliance with applicable standards and procedures by utilizing this QA Plan's methodology. The results of QA Team monitoring and analysis, along with the QA Team's recommendations for corrective action, shall be reported to the MOSAIC Project Quality Program Manager and Quality Team Lead and, as required, to the OKDHS Program Manager. All documents and software approved by the OKDHS Program Manager for release to the business units will have been reviewed and approved by the QA Team. The QA task matrix is reflected in Appendix B.

2.2 Conducting QA Team Tasks

To conduct the QA/QC activities identified in Section 2, checklists will be designed and used as a tool for conducting task audits, reviews, and evaluations to verify that the required steps have been performed. The results of these evaluated tasks shall be documented using the Process Audit Form described in Appendix D or the sample evaluation forms in Appendices E, F, G, and H; all results will be reviewed with the appropriate team members and submitted to the QA/QC Lead for review.

2.2.1 Adding QA Team Tasks

Any QA/QC tasks not identified in the planning stages of the MOSAIC Project can be added to the Quality Team's scheduled activities by contacting the MOSAIC Project Quality Program Manager or Quality Team Lead for approval.

2.3 QA Schedule

QA schedules are closely coordinated with the project Deliverables and product development.
Process audits will be performed at the beginning of each new phase of development to verify that the appropriate processes are correctly implemented as defined in the planning documents. In addition, unscheduled audits will be made during each phase of development to verify that the processes and procedures are being followed. At the completion of a software development phase, the QA Team will review and report whether all steps required to transition to the next phase have been accomplished. Section 4 of this QA Plan further explains reviews and audits.

NOTE: In the tables in Sections 2.4 - 2.19, tasks marked with a 'Q' indicator in the task number column are activities that will be performed in addition to the required IV&V RFP tasks.
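The checklist-driven audit described in Sections 2.2 and 2.3 can be sketched as a small data structure: each audit walks a list of required steps and yields a pass/fail summary plus findings for the Process Audit Form. This is an illustrative sketch only; all class, field, and example names are hypothetical and are not taken from the MOSAIC plan or its appendices.

```python
# Hypothetical model of a process-audit checklist: each required step
# is verified, and non-conforming steps become findings for corrective
# action. Names are illustrative, not part of the plan.
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    step: str          # required step being verified
    performed: bool    # was the step performed as defined?
    note: str = ""     # auditor's comment or evidence reference

@dataclass
class ProcessAudit:
    process: str                 # e.g. a process area under review
    phase: str                   # development phase being audited
    items: list = field(default_factory=list)

    def add(self, step, performed, note=""):
        self.items.append(ChecklistItem(step, performed, note))

    def conforming(self):
        # An area conforms only if every required step was performed.
        return all(i.performed for i in self.items)

    def findings(self):
        # Steps not performed are reported as findings.
        return [i for i in self.items if not i.performed]

audit = ProcessAudit("Requirements Management", "Design")
audit.add("Requirements baselined", True)
audit.add("Traceability matrix updated", False, "matrix incomplete")
print(audit.conforming())      # False: one required step failed
print(len(audit.findings()))   # 1 finding for the audit report
```

The same pass/fail structure maps naturally onto the conforming / non-conforming determination the QA Team makes in Section 1.3.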


2.4 Evaluate Planning Oversight

As stated in the OKDHS MOSAIC Project RFP, key staff within each business area serve on the Project Decision Team. The MOSAIC Program Manager will report to the Executive Sponsors, who report directly to the OKDHS Chief Executive Officer (CEO). The Project Decision Team will perform overall oversight of the MOSAIC Project and will receive input from an independent Project Manager and the Independent Verification and Validation (IV&V) Team.

TABLE 1 – PLANNING OVERSIGHT

Contract Verification

PO-1 QA Team shall evaluate and verify that the obligations of the Contractor, Sub-Contractors, and external staff, including terms, conditions, statement of work, requirements, technical standards, performance standards, development milestones, acceptance criteria, and delivery dates, are clearly defined. QA Team shall verify that performance metrics are included that allow tracking of project performance and progress against the criteria set by OKDHS.

Feasibility Study

PO-2 QA Team shall perform ongoing evaluations and reviews of the OKDHS methodologies used for the Feasibility Study, verifying the Feasibility Study is objective, reasonable, measurable, repeatable, consistent, accurate, and verifiable.

PO-3 QA Team shall review and evaluate PAPD(U)/IAPD(U) documents.

PO-4 QA Team shall review and evaluate the Cost Benefit Analysis.

2.5 Evaluate Project Management

The overall guidance and direction of the MOSAIC Project is provided by the OKDHS Program Manager. The facilitation of project tasks and Deliverables will be organized in a project work plan with a recommended sequence and schedule. The QA Team will review the joint project management efforts as needed and initiate corrective actions where appropriate to verify that project delays and defects are minimal.

TABLE 2 – PROJECT MANAGEMENT

Project Sponsorship

PM-1 QA Team shall evaluate and recommend improvements, as needed, to verify continuous executive agreement, participation, support, and commitment, and verify that open pathways of communication exist among all stakeholders.

PM-2 QA Team shall verify that OKDHS management has signed off on all changes that impact project objectives, cost, or schedule.

PM-3-Q QA Team shall review and propose project schedule, scope, and expenditure controls.

Management Assessment

PM-4 QA Team shall evaluate project management and organization to verify that lines of reporting and responsibility provide adequate technical and managerial oversight of the MOSAIC Project.

PM-5 QA Team shall evaluate and report findings on project progress, resources, budget, schedules, workflow, and reporting.

PM-6 QA Team shall evaluate and review coordination, communication, and management to verify stakeholders are working collaboratively and are following the MOSAIC Project Communication Plan.

Project Management

PM-7 QA Team shall evaluate MOSAIC Project Management Plans and procedures to verify they are developed, communicated, implemented, monitored, and complete.


PM-8 QA Team shall evaluate MOSAIC Project Management Plans and project reports to verify project status is accurately tracked using defined project metrics.

PM-9-Q QA Team shall evaluate and verify that project planning complies with MOSAIC Project Management Plan requirements.

PM-10-Q QA Team shall evaluate and verify that milestones and completion dates determined by OKDHS are planned, monitored, and met.

PM-11-Q QA Team shall evaluate and verify the Deliverables are entered and tracked by the approved Deliverables Management Tool and comply with the Deliverables requirements.

PM-12 QA Team shall evaluate and verify that project issues and defects are documented in the Tracking Tool as they arise; that the tool enables communication of issues and defects to the appropriate stakeholders; that a mitigation strategy is documented as appropriate; and that issues and defects are tracked to resolution. Tracking of issues and defects shall include, but is not limited to, technical and development efforts.

PM-13-Q QA Team shall evaluate and verify that Defect (Incident) Tracking Reports are provided for any defects that may jeopardize any project component presented in the MOSAIC Project Management Plan, and that the reports comply with Defect Tracking Report requirements.

PM-14-Q QA Team shall review, analyze, and propose the Deliverable content, review, and submission process.

PM-15-Q QA Team shall review, critique, and propose issue resolution/escalation procedures.

PM-16 QA Team shall evaluate the life cycle development methodology(s), such as waterfall, evolutionary spiral, rapid prototyping, and incremental, to verify that the chosen methodology is appropriate for the system being developed.

Business Process Reengineering

PM-17 QA Team shall evaluate and verify project capabilities and plans to redesign business systems to achieve improvements in critical measures of performance, such as cost, quality, service, and speed.

PM-18 QA Team shall evaluate and verify the Enterprise Architecture Methodology has the strategy, management backing, resources, skills, and incentives required for effective change.

Risk Management

PM-19 QA Team shall evaluate and verify that the MOSAIC Project Risk Management Plan is created and followed.

PM-20 QA Team shall evaluate the Risk Management Plan and procedures to verify that risks are identified and quantified and that Mitigation Plans are developed, communicated, implemented, monitored, and complete.

PM-21-Q QA Team shall review and be familiar with the Risk Management Plan compliance standards and procedures.

Change Management

PM-22 QA Team shall evaluate and verify that the MOSAIC Project Change Management Plan is created and followed.

PM-23-Q QA Team shall evaluate and verify that requirements identified as having potential defects are reviewed with the Change Control Committee to determine the impact or necessity of the change.

PM-24 QA Team shall evaluate the Change Management Plan, Organizational Readiness Plan and procedures to verify they are developed, communicated, implemented, monitored, and complete and that resistance to change is anticipated and prepared for.

PM-25-Q QA Team shall evaluate and verify that Change Control Management activities comply with Change Control Management requirements.

Communication Management

PM-26 QA Team shall verify that the MOSAIC Project Communication Plan is created and followed. QA shall verify Organizational Management is addressed in the MOSAIC Project Communication Plan.

PM-27 QA Team shall evaluate the Communication Plan and strategies to verify they support communications and work product sharing between all MOSAIC Project stakeholders, and verify Communication Plan and strategies are effective, implemented, monitored, and complete.

Configuration Management

PM-28 QA Team shall evaluate and verify the Change Management Plan and procedures associated with the development process.

PM-29 QA Team shall evaluate and verify that all critical development documents, including, but not limited to, requirements, design, and code, are maintained under the required level of control.

PM-30 QA Team shall evaluate and verify that processes and tools are in place to identify code versions and rebuild system configurations from source code.

PM-31 QA Team shall evaluate and verify that all source and object libraries are maintained for training, testing, and production, and that formal sign-off procedures are in place for approving Deliverables.

PM-32 QA Team shall evaluate and verify that processes and tools are in place to manage system changes, including formal logging of change requests and review, prioritization, and timely scheduling of maintenance actions.

PM-33 QA Team shall evaluate and verify that mechanisms are in place to prevent unauthorized changes being made to the system and to prevent authorized changes from being made to the wrong version.

PM-34-Q QA Team shall evaluate and verify that a configuration management process is in place to control and manage the baseline configuration for all hardware, communications equipment, network equipment, application development security principles, security controls, application deployment, and operational configurations.
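PM-33 and PM-34 do not prescribe a mechanism for detecting unauthorized changes to a controlled baseline. One common control is a digest comparison against the approved baseline, sketched here under the assumption that controlled files live under a single root; all names are hypothetical:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 digest of one controlled file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_baseline(baseline: dict[str, str], root: Path) -> list[str]:
    """Return controlled files whose current digest differs from the approved
    baseline (a possible unauthorized change) or that are missing entirely."""
    violations = []
    for rel, approved in baseline.items():
        p = root / rel
        if not p.exists() or file_digest(p) != approved:
            violations.append(rel)
    return violations
```

A QA spot check would record digests when a baseline is approved and re-run the comparison before each release.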

Project Estimating and Scheduling

PM-35 QA Team shall evaluate and make recommendations on the estimating and scheduling process of the project to verify that project budget and resources are adequate for the Work Breakdown Structure and schedule.

PM-36 QA Team shall review and evaluate schedules to verify that adequate time and resources are assigned for planning, development, review, testing, and rework.

Project Staff

PM-37 QA Team shall evaluate the job assignments, skills, training, and experience of the staff involved in program development to verify that they are adequate for the development task.

Project Organization

PM-38 QA Team shall verify that lines of reporting and responsibility provide adequate technical and managerial oversight of the MOSAIC Project.

PM-39 QA Team shall verify that the MOSAIC Project's organizational structure supports training, process definition, independent quality assurance, Configuration Management, product evaluation, and any other functions critical to the MOSAIC Project's success.

Sub-Contractors and External Staff, if any

PM-40 QA Team shall evaluate the use, in project development, of Sub-Contractors or other external sources of project staff, such as IT staff from another State of Oklahoma organization.


PM-41 QA Team shall evaluate and verify that the obligations of Sub-Contractors and external staff, including terms, conditions, statements of work, requirements, standards, development milestones, acceptance criteria, and delivery dates, are clearly defined.

PM-42 QA Team shall evaluate and verify that Sub-Contractor's software development methodology and product standards are compatible with OKDHS standards and environment.

PM-43 QA Team shall verify that Sub-Contractor has and maintains the required skills, staff, plans, resources, procedures, and standards to meet Sub-Contractor's commitment. QA Team shall evaluate and verify the feasibility of any off-site support of the MOSAIC Project.

PM-44 QA Team shall evaluate and verify that any proprietary tools used by Sub-Contractors do not restrict the future maintainability, portability, and reusability of the Enterprise System.

OKDHS Oversight

PM-45 QA Team shall verify that OKDHS oversight is provided in the form of periodic status reviews and technical interchanges.

PM-46 QA Team shall verify that OKDHS has defined the technical and managerial inputs the Sub-Contractor requires, including reviews, approvals, requirements, and interface clarifications, and has the resources to supply them on schedule.

PM-47 QA Team shall evaluate MOSAIC Project oversight to verify that OKDHS, not Contractor, exercises ultimate responsibility for monitoring project cost and schedule.

2.6 Evaluate Quality Management

QA Team will verify that the Contractor assists in updating the Quality Assurance Plan, which shall document how to plan, implement, and assess the effectiveness of Quality Assurance and Quality Control operations. The QA Plan shall fully define the entire Quality system, including, but not limited to, the organizational structure, roles and responsibilities, processes, and resources required to implement and manage the QA Plan.

TABLE 3 – QUALITY MANAGEMENT

Quality Assurance

QA-1 QA Team shall evaluate and make recommendations about the MOSAIC Project Quality Assurance Plan, procedures, and organization.

QA-2 QA Team shall verify that the QA organization monitors the fidelity of all defined processes in all phases of the MOSAIC Project.

QA-3 QA Team shall verify that the quality of all products produced by the MOSAIC Project is monitored by formal reviews and sign-offs.

QA-4 QA Team shall verify that project self-evaluations are performed and that measures are continually taken to improve the process.

QA-5 QA Team shall monitor the performance of QA processes and/or designated staff by reviewing QA processes and reports, and by performing spot checks of system documentation. Contractor shall evaluate findings and performance of the processes and reports.

QA-6 QA Team shall verify that Project Management supports the appropriate levels of independence for QA activities.

QA-7-Q QA Team will review and be familiar with the QA Plan compliance standards and procedures.


Process Definition and Product Standards

QA-8 QA Team shall evaluate and make recommendations about all defined processes and product standards associated with system development.

QA-9 QA Team shall evaluate and verify that all major development processes are defined and that the defined and approved processes and standards are followed in development.

QA-10 QA Team shall evaluate and verify that processes and standards are compatible with each other and with system development methodology.

QA-11 QA Team shall evaluate and verify that all process definitions and standards are complete, clear, up-to-date, consistent in format, and easily available to MOSAIC Project staff.

QA-12-Q QA Team shall evaluate and verify that Progress/Status Reports comply with Communications Plan requirements.

QA-13-Q QA Team shall evaluate and verify that all QA Reports comply with the Reporting requirements as stated in the QA Plan.

QA-14-Q QA Team shall evaluate and verify that Requirements Discovery and Documentation processes are followed as defined in the MOSAIC Requirements Management Plan.

QA-15-Q QA Team shall evaluate and verify that Functional Requirement Discovery and Documentation processes are followed as defined in the MOSAIC Requirements Management Plan.

2.7 Evaluate Training

The QA Team will review the Training Plan, the training systems, and any physical training space needs to verify that they meet the requirements.

TABLE 4 – TRAINING

User Training and Documentation

TR-1 QA Team shall verify that the training plan and processes provide adequate training to system users.

TR-2 QA Team shall verify that user-friendly training materials and Help Desk services are easily available to all users.

TR-3 QA Team shall verify that all required standards and processes, and related documentation, are easily available to users.

TR-4 QA Team shall verify that all training is provided on time, and is evaluated and monitored for effectiveness, with additional training provided as needed.

Developer Training and Documentation

TR-5 QA Team shall verify that the training plan and processes provide adequate training to system developers.

TR-6 QA Team shall evaluate and verify that developer training is technically adequate, appropriate for the development phase, and available at appropriate times.

TR-7 QA Team shall verify that all required policy, process, and standards documentation is easily available to developers.

TR-8 QA Team shall verify the required knowledge transfer occurs for maintenance and operation of the new system.

TR-9 QA Team shall evaluate and verify that all training is provided on time and is evaluated and monitored for effectiveness, with additional training provided as needed.

TR-10-Q QA Team shall evaluate and validate training results.

TR-11-Q QA Team shall evaluate and verify that MOSAIC Project staff involved in any process are trained, in accordance with the Training Plan, in the procedures and standards applicable to their area of responsibility.


2.8 Evaluate Requirements Management

Requirements analysis establishes a common understanding of the business unit's requirements between the business unit customer and the MOSAIC Project software team. QA Team will verify that the requirements process reviews are conducted in accordance with the standards and procedures established by the MOSAIC Project Team. The Contractor will assist OKDHS in ensuring that all MOSAIC Project RFP requirements are met, are incorporated and traceable within all documents, models, and Deliverables, and are accomplished in an efficient and effective manner.

TABLE 5 – REQUIREMENTS MANAGEMENT

Requirements Management

RM-1 QA Team shall evaluate and make recommendations about the Requirements Management Plan processes and procedures for managing system requirements.

RM-2 QA Team shall evaluate and verify that requirements are well-defined, understood, documented and traceable throughout all phases of the MOSAIC Project.

RM-3 QA Team shall evaluate and verify the allocation of system resources per hardware and software requirements.

RM-4 QA Team shall evaluate and verify that software requirements can be traced through design, code, and test phases to verify that the Enterprise System performs as intended and contains no unnecessary software elements.

RM-5 QA Team shall evaluate and verify that requirements are under formal configuration control.

RM-6 QA Team shall evaluate and verify the project's security processes for managing requirements and performing the risk analysis on each requirement.

RM-7-Q QA Team shall evaluate and verify that requirements are reviewed to determine if they are clearly stated and consistent.

RM-8-Q QA Team shall evaluate and verify that changes to requirements, work products and activities are identified, reviewed, and tracked to closure.

RM-9-Q QA Team shall evaluate and verify that the prescribed processes for defining, documenting, and allocating requirements are followed and documented.

RM-10-Q QA Team shall evaluate and verify that requirements are managed, controlled, and traced by the approved requirements Change Management Tool.

RM-11-Q QA Team shall evaluate and verify that Requirements Reviews comply with the Requirements standards and guidelines.

RM-12 QA Team shall evaluate and verify that processes and equipment are in place to back-up project information and files and archive them safely at appropriate intervals.
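A traceability check of the kind RM-2 and RM-4 describe can be sketched as a simple matrix audit: every requirement must map to at least one downstream artifact, and every artifact must trace back to some requirement (RM-4's "no unnecessary software elements"). The requirement and artifact IDs below are hypothetical:

```python
def trace_gaps(links: dict[str, set[str]], artifacts: set[str]):
    """links maps a requirement ID to its design/code/test artifact IDs.
    Returns (requirements with no downstream trace, orphan artifacts)."""
    untraced = {r for r, arts in links.items() if not arts}
    covered = set().union(*links.values()) if links else set()
    orphans = artifacts - covered
    return untraced, orphans

links = {"REQ-1": {"DD-1", "TC-1"}, "REQ-2": set()}   # hypothetical IDs
artifacts = {"DD-1", "TC-1", "CODE-9"}
untraced, orphans = trace_gaps(links, artifacts)
assert untraced == {"REQ-2"}      # requirement with no design/code/test trace
assert orphans == {"CODE-9"}      # artifact tracing to no requirement
```

In practice the links would come from the approved requirements Change Management Tool rather than a hand-built dictionary.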

Security Requirements

RM-13 QA Team shall evaluate and make recommendations on MOSAIC Project standards and procedures for ensuring the system is secure and the privacy of client information is protected.

RM-14 QA Team shall evaluate and verify project restrictions to verify system and information access control.

RM-15-Q QA Team shall evaluate and verify security testing and functional tests are performed.

Requirements Analysis

RM-16 QA Team shall verify that the Requirements Analysis of OKDHS and federal and state requirements and objectives has been performed to verify that Enterprise System requirements are well understood, well-defined, and meet federal and state regulations.

RM-17 QA Team shall verify that all stakeholders have been consulted about the desired functionality of the Enterprise System, and users have been involved in prototyping the user interface.

RM-18 QA Team shall verify that performance requirements, such as timing, response time, and throughput, meet user requirements.

RM-19 QA Team shall evaluate and verify that user's maintenance requirements for the Enterprise System are completely specified.

RM-20-Q QA Team shall evaluate and verify that the requirements analysis process and associated requirements reviews are conducted in accordance with the standards and procedures established by the MOSAIC Project and as described in MOSAIC Project RFP, SOW, and Requirements Management Document.

RM-21-Q QA Team shall evaluate and verify that action items resulting from reviews of the requirements analysis are resolved in accordance with these standards and procedures.

Interface Requirements

RM-22 QA Team shall evaluate and verify that all system interfaces are exactly described, by medium and by function, including input/output control codes, data format, polarity, range, units, and frequency.

RM-23 QA Team shall evaluate and verify approved interface documents are available and that appropriate relationships, such as interface working groups, are in place with all agencies and organizations supporting the interfaces.
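The interface attributes RM-22 enumerates (format, range, units, frequency) can be captured in a machine-checkable specification rather than prose alone. A minimal sketch, with all field names hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FieldSpec:
    # Hypothetical interface-field description per RM-22: name, units, valid range.
    name: str
    units: str
    lo: float
    hi: float

def in_range(spec: FieldSpec, value: float) -> bool:
    """True when the value falls within the documented range for the interface."""
    return spec.lo <= value <= spec.hi

# Example: a polling-frequency field documented in Hz with a 0.1–10.0 range.
freq = FieldSpec("polling_frequency", "Hz", 0.1, 10.0)
assert in_range(freq, 1.0)
assert not in_range(freq, 60.0)
```

Encoding each interface field this way lets a test harness verify exchanged data against the approved interface documents that RM-23 requires.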

RM-24 QA Team shall evaluate that requirements specifications have been developed for all hardware and software subsystems in a level of detail to verify successful implementation and business continuity.

Reverse Engineering

RM-25 QA Team shall evaluate and verify that a well-defined Transition Plan and process for reengineering the system is in place and is followed, if a legacy system or a transfer system is or will be used in development. The process, depending on the goals of the reuse/transfer, may include reverse engineering, code translation, re-documentation, restructuring, normalization, and re-targeting.

2.9 Evaluate Operating Environment

The QA Team shall evaluate software and hardware compatibility in all Operating Environments to verify that system requirements have been met and that the system is maintainable and upgradeable.

TABLE 6 – OPERATING ENVIRONMENT

System Hardware

OE-1 QA Team shall evaluate current and projected system hardware configurations to verify that performance meets system requirements.

OE-2 QA Team shall evaluate and verify that hardware is compatible with existing OKDHS processing environment, maintainable, and easily upgradeable. Evaluation and verification shall include, but is not limited to, CPUs and other processors, memory, network connections, bandwidth, communication controllers, terminals, printers, telecommunications systems (LAN/WAN), and storage devices.

OE-3 QA Team shall evaluate and verify current and projected outside Contractor support of the hardware and OKDHS hardware Configuration Management Plan and procedures.

System Software

OE-4 QA Team shall evaluate current and projected system software to verify that capabilities meet system requirements.


OE-5 QA Team shall evaluate and verify that software is compatible with existing OKDHS hardware and software environment, maintainable, and easily upgradeable. Evaluation and verification shall include, but is not limited to, operating systems, middleware, and network software, including communications and file-sharing protocols.

OE-6-Q QA Team shall evaluate and verify that system requirements are traceable throughout all phases of the MOSAIC Project.

OE-7-Q QA Team shall evaluate and verify that relevant documents are updated and based on approved requirements changes.

OE-8-Q QA Team shall evaluate and verify that the agreed upon requirements follow the Systems Standards and Guidelines.

OE-9-Q QA Team shall verify that design walkthroughs evaluate compliance of the design with requirements, identify defects in the design, and evaluate and report design alternatives.

OE-10-Q QA Team shall participate in walkthroughs and verify all walkthroughs are conducted.

OE-11-Q QA Team shall identify defects, verify resolution of previously identified defects, and verify change control integrity.

OE-12-Q QA Team shall selectively review and audit the content of system documents.

OE-13-Q QA Team shall evaluate and review demonstration prototypes for compliance with requirements and standards, verify that the demonstration conforms to standards and procedures, identify lack of compliance with standards, and determine corrective actions.

Database Software

OE-14 QA Team shall evaluate current and projected database products to verify that capabilities meet system requirements.

OE-15 QA Team shall evaluate and verify the data format of the database is easily convertible to other formats, supports the addition of new data items, is scalable, easily refreshable, and compatible with existing OKDHS hardware and software, including any online transaction processing (OLTP) environment.

OE-16-Q QA Team shall review and verify that software tools are available to provide database back-up, recovery, performance analysis, and data creation control.

System Capacity

OE-17 QA Team shall evaluate the existing processing capacity of the system and verify that it is adequate for current statewide requirements for both batch and online processing.

OE-18 QA Team shall evaluate the historic availability and reliability of the system, including the frequency and criticality of system failure.

OE-19 QA Team shall evaluate the results of any volume testing or stress testing.

OE-20 QA Team shall evaluate any existing measurement and capacity planning program and shall evaluate the system's capacity to support future growth.

OE-21 QA Team shall make recommendations about changes in processing hardware, storage, network systems, operating systems, Enterprise System software, and software design to meet future growth and improve system performance.

OE-22-Q QA Team shall evaluate and verify the system's response time meets requirements stated in the RFP.

OE-23-Q QA Team shall evaluate and verify the system accessibility complies with RFP requirements.

OE-24-Q QA Team shall evaluate and verify that Operational Support complies with the Operational Support requirements.

OE-25-Q QA Team shall evaluate and validate system tests and procedures.
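OE-22 ties response time to figures stated in the RFP, which this plan does not reproduce. A hedged sketch of a nearest-rank percentile check follows; the percentile and limit shown are placeholders, not the RFP's values:

```python
import math

def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile of observed response times, in seconds."""
    ranked = sorted(samples)
    k = max(0, math.ceil(pct / 100 * len(ranked)) - 1)
    return ranked[k]

def meets_response_requirement(samples: list[float], pct: float = 95,
                               limit_s: float = 2.0) -> bool:
    # pct and limit_s are placeholders; the binding figures come from the RFP.
    return percentile(samples, pct) <= limit_s
```

A verification run would collect timing samples under representative statewide load (per OE-17 and OE-19) and evaluate them against the contractual threshold.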


2.10 Evaluate Development Environment

The QA Team shall evaluate software and hardware for compatibility in all Development Environments to verify that development requirements have been met and that documentation complies with the MOSAIC Software Development Methodologies and documentation standards.

TABLE 7 – DEVELOPMENT ENVIRONMENT

Development Hardware

DE-1 QA Team shall evaluate current and projected development hardware configurations to verify performance meets the requirements of system development.

DE-2 QA Team shall evaluate and verify that hardware is compatible with existing OKDHS development and processing environment, maintainable, and easily upgradeable. Evaluation and verification shall include, but is not limited to, CPUs and other processors, memory, network connections, bandwidth, communication controllers, terminals, printers, telecommunications systems (LAN/WAN), and storage devices.

DE-3 QA Team shall evaluate and verify current and projected outside Contractor support of the hardware.

Development Software

DE-4 QA Team shall evaluate current and projected development software to verify capabilities meet system development requirements.

DE-5 QA Team shall evaluate and verify that software is compatible with the existing OKDHS hardware and software environment, maintainable, and easily upgradeable.

DE-6 QA Team shall evaluate the environment as a whole to verify that its components are integrated. Evaluation shall include, but is not limited to, operating systems, network software, Computer-Aided Software Engineering (CASE) Tools, project management software, configuration management software, compilers, cross-compilers, linkers, loaders, debuggers, editors, and reporting software.

DE-7 QA Team shall evaluate and verify current and projected outside Contractor support of the software.

DE-8-Q QA Team shall evaluate and verify that documentation complies with MOSAIC Software Development Methodology and Documentation standards.

DE-9-Q QA Team shall evaluate and verify that development requirements comply with the Development standards and guidelines.

2.11 Evaluate Software Development

The system design process develops decisions about the system's behavioral design and other decisions affecting the selection and design of system components. The MOSAIC Project Design Documents Standards & Guidelines template will define the organization and content of the Functional Design Document and the Technical Design Document.

TABLE 8 – SOFTWARE DEVELOPMENT

High-Level Design

SD-1 QA Team shall evaluate and make recommendations on existing high-level design products to verify that the design is workable, efficient, and meets all system and system interface requirements.

SD-2 QA Team shall evaluate and verify that high-level design products conform to design methodology and standards.

SD-3 QA Team shall evaluate and make recommendations on high-level process, standards, methodologies, and CASE tools used.


SD-4 QA Team shall evaluate and verify that design requirements can be traced back to system requirements.

SD-5 QA Team shall evaluate and verify that all design products are under configuration control and formally approved before detailed design begins.

SD-6-Q QA Team shall evaluate and verify that the software requirements analysis process and associated requirements reviews are conducted in accordance with the standards and procedures established by the MOSAIC Project and as described in MOSAIC Project RFP, SOW, and Requirements Management Process.

SD-7-Q QA Team shall evaluate and verify that action items resulting from reviews of the software requirements analysis are conducted in accordance with the standards and procedures established by the MOSAIC Project and as described in MOSAIC Project RFP, SOW, and Requirements Management Process.

SD-8-Q QA Team shall evaluate and verify that Design Reviews are conducted in accordance with the standards and procedures established by the MOSAIC Project and as described in MOSAIC Project RFP, SOW, and Requirements Management Process.

Detailed Design

SD-9 QA Team shall evaluate and make recommendations on existing detailed design products to verify that the design is workable, efficient, and meets all high-level design requirements.

SD-10 QA Team shall evaluate and verify that detailed design products conform to design methodology and standards.

SD-11 QA Team shall evaluate and make recommendations on the detailed design and analysis process, standards, methodologies, and CASE Tools used.

SD-12 QA Team shall evaluate and verify that design requirements can be traced back to system requirements and high-level design.

SD-13 QA Team shall evaluate and verify that all design products are under configuration control and formally approved before coding begins.

SD-14-Q QA Team shall evaluate and verify that the method, such as the appropriate Software Development Repository, used for tracking and documenting the development of a software unit is implemented and is kept current.

Job Control

SD-15 QA Team shall evaluate and make recommendations on the existing job control and the process for designing job control.

SD-16 QA Team shall evaluate the system's division between batch and online processing to verify system performance and data integrity.

SD-17 QA Team shall evaluate batch jobs to verify appropriate scheduling, timing, and internal and external dependencies.

SD-18 QA Team shall evaluate and verify that job control language scripts are under configuration control.
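The batch-job dependency review SD-17 calls for can be checked mechanically by deriving a valid execution order from the declared dependencies. A sketch using Python's standard `graphlib` (3.9+), with hypothetical job names:

```python
from graphlib import TopologicalSorter

# Hypothetical batch dependencies: job -> set of jobs that must finish first.
deps = {
    "eligibility_extract": set(),
    "payment_calc": {"eligibility_extract"},
    "nightly_report": {"payment_calc"},
}

# static_order yields each job only after all of its prerequisites;
# it raises CycleError if the declared dependencies are circular.
order = list(TopologicalSorter(deps).static_order())
```

A circular dependency among batch jobs, which would make the schedule unsatisfiable, surfaces here as an exception rather than a production failure.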

Code

SD-19 QA Team shall evaluate and make recommendations on the standards and process for code development currently in place.

SD-20 QA Team shall evaluate the existing code base to verify portability and maintainability, taking into account software metrics, including, but not limited to, modularity, complexity, and source and object size.

SD-21 QA Team shall evaluate code documentation to verify quality, completeness (including maintenance history), and accessibility.

SD-22 QA Team shall evaluate the coding standards and guidelines to verify compliance with these standards and guidelines. Evaluation shall include, but is not limited to, structure, documentation, modularity, naming conventions, and format.

SD-23 QA Team shall evaluate and verify that developed code is under configuration control and is easily accessible by developers.


SD-24 QA Team shall evaluate and verify use of software metrics in management and quality assurance.
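Neither SD-20 nor SD-24 names a metrics tool; complexity is one of the metrics SD-20 lists. As an illustration only, an approximate McCabe-style count for Python source can be derived from its syntax tree (real projects would use an established metrics tool for the actual implementation language):

```python
import ast

def cyclomatic(source: str) -> int:
    """Approximate McCabe complexity: 1 plus the number of branch points
    (if/for/while/try, boolean operators, conditional expressions)."""
    branches = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp, ast.IfExp)
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, branches) for node in ast.walk(tree))
```

A QA review might track this figure per module over time and flag modules whose complexity trends above an agreed threshold.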

Unit Test

SD-25 QA Team shall evaluate and make recommendations on the plans, requirements, environment, tools, and procedures used for unit testing system modules.

SD-26 QA Team shall evaluate the level of test automation, interactive testing, and interactive debugging available in the test environment.

SD-27 QA Team shall evaluate and verify the test process, that test results are verified, that the correct code configuration has been tested, and that the test is appropriately documented.

SD-28-Q QA Team shall evaluate and verify that the software development methodology, associated code reviews, and software unit testing are conducted in conformance with the standards and procedures established by the MOSAIC Project.

SD-29-Q QA Team shall evaluate and verify that action items resulting from reviews of the code are resolved in accordance with these standards and procedures established by the MOSAIC Project.

SD-30-Q QA Team shall evaluate and verify that the system or module presented for acceptance accounts for all required functionality, training, conversion, documentation, and any other related requirements of the Contract and federal and State of Oklahoma rules, regulations, and policies for the components comprising the release.

Software Development Life Cycle

SD-31-Q QA Team shall evaluate and verify that the Contractor, in conjunction with OKDHS, provides the Iterative Application Software Development Methodology.

SD-32-Q QA Team shall evaluate and verify that documentation and computer program materials are approved and placed under library control.

SD-33-Q QA Team shall evaluate and verify the establishment of formal release procedures for approved documentation and software versions.

SD-34-Q QA Team shall evaluate and verify that library controls prevent unauthorized changes to the controlled software and verify the incorporation of all approved changes.

2.12 Evaluate System and Acceptance Testing

QA Team activities will verify that software integration and test activities combine individually developed components in the testing environment and confirm that they work together to provide the complete software and system functionality. A Pilot shall verify the functional and technical usability of the Enterprise System in a limited production environment.

TABLE 9 – SYSTEM AND ACCEPTANCE TESTING

System Integration Test

ST-1 QA Team shall evaluate the plans, requirements, environment, tools, and procedures used for integration testing of system modules.

ST-2-Q QA Team shall evaluate and verify that software test activities are identified, test environments have been defined, and guidelines for testing have been designed. QA Team will verify the software integration process, software integration testing activities and all software performance testing activities are being performed in accordance with QA Plan processes and procedures as well as the Software Development Methodology.


ST-3-Q QA Team shall evaluate and verify any transfer of control of code to personnel performing software integration testing or software performance testing is being accomplished in accordance with QA Plan processes and procedures as well as Software Development Methodology.

ST-4-Q QA Team shall evaluate and review the Test Plan and Software Test Descriptions for compliance with requirements and standards.

ST-5-Q QA Team shall evaluate and monitor test activities, witness tests, and certify test results.

ST-6-Q QA Team shall evaluate and verify that requirements have been established for the certification or calibration of all support software or hardware used during tests.

ST-7 QA Team shall evaluate the level of automation and the availability of the system test environment.

ST-8 QA Team shall verify that an appropriate level of test coverage is achieved by the test process, test results are verified, the correct code configuration has been tested, and tests are appropriately documented, including formal logging of errors found in testing.
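The coverage check in ST-8 lends itself to automation. The following is a minimal sketch only: the MOSAIC RFP does not prescribe a coverage tool, report format, or threshold, so the report structure (module name mapped to percent of statements covered) and the 80% default are hypothetical.

```python
# Sketch: flag modules whose test coverage falls below an agreed threshold.
# The report format and the 80% default are assumptions, not RFP requirements.

def coverage_gaps(report: dict[str, float], threshold: float = 80.0) -> list[str]:
    """Return module names whose coverage percentage is below the threshold."""
    return sorted(m for m, pct in report.items() if pct < threshold)

def coverage_acceptable(report: dict[str, float], threshold: float = 80.0) -> bool:
    """True when no module falls below the acceptance threshold."""
    return not coverage_gaps(report, threshold)
```

A QA reviewer could feed this the parsed output of whatever coverage tool the project adopts and attach the gap list to the test report.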

ST-9 QA Team shall verify that the test organization has a level of independence from the development organization.

ST-10 QA Team shall evaluate the plans, requirements, environment, tools, and procedures used to pilot the Enterprise System.

ST-11 QA Team shall verify that test scenarios are used to verify comprehensive but manageable testing, and that tests are run in a realistic, real-time environment.

ST-12 QA Team shall verify that test scripts are complete, with step-by-step procedures, required pre-existing events or triggers, and expected results.
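A completeness check of the kind ST-12 calls for can be sketched as below. The field names are hypothetical, invented for illustration; the actual test-script elements come from the QCT Standards and Processes document.

```python
# Sketch: verify a test-script record carries the elements ST-12 requires:
# step-by-step procedures, pre-existing events or triggers, and expected
# results. Field names are hypothetical.

REQUIRED_FIELDS = ("steps", "preconditions", "expected_results")

def script_is_complete(script: dict) -> bool:
    """True when every required element is present and non-empty."""
    return all(script.get(field) for field in REQUIRED_FIELDS)
```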

ST-13-Q QA Team shall verify that pilot results meet criteria, that the correct code configuration has been used, and that the test runs are appropriately documented, including formal logging of errors found in testing as defined in the Implementation Plan.

ST-14-Q QA Team shall evaluate and validate pilot effectiveness.

ST-15-Q QA Team shall evaluate and verify all functional aspects of the system.

ST-16-Q QA Team shall evaluate and verify operability and stability of software.

ST-17-Q QA Team shall evaluate and verify accuracy of conversion of legacy data and manual data.

ST-18-Q QA Team shall evaluate and verify impact of missing and erroneous data.

ST-19-Q QA Team shall evaluate and verify completeness and accuracy of system documentation.

ST-20-Q QA Team shall evaluate and verify effectiveness of training methods and materials.

ST-21-Q QA Team shall evaluate and verify Pilot impact on workflow and staff productivity.

ST-22-Q QA Team shall evaluate and verify response time and overall system and network performance.

ST-23-Q QA Team shall evaluate and verify system hardware, software and telecommunications performance.

Pilot

ST-24-Q QA Team shall evaluate and verify appropriateness of system, data and application security.


ST-25-Q QA Team shall evaluate and verify accuracy and performance of system interfaces.

ST-26-Q QA Team shall evaluate and verify that Project Managers create a work plan that includes an approach to the implementation of components with a recommended sequence and schedule.

ST-27-Q QA Team shall verify that the Contractor, with OKDHS staff, leads the effort to develop the Implementation Plan for implementing new and reengineered processes.

ST-28-Q QA Team shall evaluate and verify that the tool selected for Data Integration provides the ability for data profiling, data quality, data integration, data enrichment, and data monitoring.

Implementation

ST-29-Q QA Team shall evaluate and verify that the Contractor’s proposed Data Conversion Plan presents a comprehensive strategy for both the automated and manual conversion efforts and incorporates the OKDHS schedule for Pilot testing and statewide Implementation.

ST-30-Q QA Team shall evaluate and verify that the Contractor tests performance of the software during the Pilot and of the application after Implementation by conducting Benchmark Tests and reporting Benchmark Results to OKDHS.

Benchmark Tests

ST-31-Q QA Team shall evaluate and verify that the Contractor meets expected capacity simulation results, tuning specifications, and performance benchmarks for all installations and deployments.

ST-32 QA Team shall evaluate interface testing plans and procedures for compliance with requirements and verify that the plans meet acceptance criteria.

Interface Testing

ST-33-Q QA Team shall verify that interface testing is conducted in accordance with the Interface Control Document.

ST-34 QA Team shall verify that acceptance procedures and acceptance criteria for each product are defined, reviewed, and approved by program stakeholders prior to test, and that results of the test are documented. QA Team shall verify that acceptance procedures address the process by which any software product that does not pass acceptance testing will be corrected.

ST-35-Q QA Team shall evaluate and verify that as many of the software integration tests as necessary and all software performance tests are witnessed to verify that the approved test procedures are being followed, that accurate records of test results are being kept, that all discrepancies discovered during the tests are being properly reported, that test results are being analyzed, and the associated test reports are completed.

ST-36-Q QA Team shall evaluate and verify that discrepancies discovered during software integration and performance tests are identified, analyzed, documented, and corrected; software unit tests, and software integration tests are re-executed as necessary to validate corrections made to the code; and the software unit’s design, code, and test is updated based on the results of software integration testing, software performance testing, and corrective action process.

ST-37-Q QA Team shall evaluate and verify the software performance tests produce results that will permit determination of performance parameters of the software.
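One way ST-37's test results could "permit determination of performance parameters" is to reduce recorded response-time samples to summary statistics. This is a sketch under stated assumptions: the actual parameters and targets come from the MOSAIC Project RFP, and the mean/95th-percentile pair shown here is simply a common choice, not a contractual one.

```python
# Sketch: derive summary performance parameters from response-time samples
# (seconds). Which statistics and thresholds apply is defined by the RFP,
# not by this code.
import math

def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile of a non-empty list of samples."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def performance_summary(samples: list[float]) -> dict[str, float]:
    """Mean and 95th-percentile response time for a test run."""
    return {
        "mean": sum(samples) / len(samples),
        "p95": percentile(samples, 95),
    }
```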

ST-38-Q QA Team shall verify that the responsibility for testing and for reporting on results has been assigned to a specific program owner for sign-off.

Acceptance and Turnover

ST-39 QA Team shall verify that acceptance testing based on the defined acceptance criteria is performed satisfactorily before acceptance of products.


ST-40 QA Team shall verify that the acceptance test organization has a level of independence from the Sub-Contractor or Contractor that has been contracted with OKDHS to develop the Enterprise System.

ST-41 QA Team shall verify that training of OKDHS staff in using Contractor-supplied software shall be ongoing throughout the development process.

ST-42 QA Team shall review and evaluate the Implementation Plan.

ST-43-Q QA Team shall evaluate and validate implementation success.

ST-44-Q QA Team shall evaluate and validate post-implementation activities.

ST-45-Q QA Team shall assist in activities to secure federal acceptance and approval.

ST-46-Q QA Team shall verify that warranty and maintenance periods do not start until OKDHS acceptance.

Turnover Documentation

ST-47-Q QA Team shall evaluate and verify Contractor provides the Data Conversion Test Results Document.

2.13 Evaluate Data Management

QA Team will verify that the Enterprise Database Repository being developed can perform quickly and efficiently by meeting MOSAIC Project RFP performance goals. QA Team will develop processes to verify that the converted data is accurate and, to the degree practicable, free of duplication.

TABLE 10 – DATA MANAGEMENT
TASK ITEM | TASK # | TASK DESCRIPTION

DM-1 QA Team shall review and evaluate existing and proposed plans, procedures, and software for data conversion.

DM-2 QA Team shall verify that procedures are in place and are being followed to review the completed data for completeness and accuracy and to perform data clean-up as required.

DM-3 QA Team shall evaluate conversion error rates to verify the error rates are manageable and meet acceptance criteria.
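The error-rate check in DM-3 can be sketched as below. The 2% ceiling is a hypothetical placeholder; the real acceptance criteria come from the MOSAIC Project RFP and the Data Conversion Plan.

```python
# Sketch: compute a conversion error rate and compare it against an agreed
# ceiling. The 2% default is an illustrative assumption, not an RFP value.

def conversion_error_rate(failed: int, total: int) -> float:
    """Fraction of converted records that failed validation."""
    if total <= 0:
        raise ValueError("total converted records must be positive")
    return failed / total

def meets_acceptance(failed: int, total: int, max_rate: float = 0.02) -> bool:
    """True when the observed error rate is within the agreed ceiling."""
    return conversion_error_rate(failed, total) <= max_rate
```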

Data Conversion

DM-4 QA Team shall make recommendations to make the conversion process more efficient and to maintain the integrity of data during the conversion.

DM-5 QA Team shall evaluate new and existing database design documents to verify they meet existing and proposed system requirements.

DM-6 QA Team shall verify that appropriate processes are used in database design to improve data integrity and system performance.

DM-7 QA Team shall verify appropriate processes are used to verify the database design has maintainability, scalability, concurrence, normalization – where appropriate, and any other factors affecting performance and data integrity.

Database Design

DM-8 QA Team shall review and evaluate the process for administering the database, including back-up, recovery, performance analysis, and control of data item creation.

2.14 Evaluate Operations and Business Oversight

This QA task assures that Operation Process Improvement will be an ongoing and continuous process.

TABLE 11 – OPERATIONS OVERSIGHT
TASK ITEM | TASK # | TASK DESCRIPTION

Operational & Business Change Tracking

OB-1 QA Team shall review and evaluate the OKDHS statewide systems change request and defect tracking processes.

OB-2 QA Team shall evaluate implementation of the Operational & Business process activities and request data to verify processes are effective and are being followed.

User Operational/Business Satisfaction

OB-3 QA Team shall review and evaluate user satisfaction with system and make recommendations for improvement.

Operational & Business Goals

OB-4 QA Team shall evaluate impact of the system on program goals and performance standards.

Operational Documentation

OB-5 QA Team shall evaluate operational/business plans and processes.

Operational Processes and Activity

OB-6 QA Team shall evaluate implementation of operational/business processes and activities, including back-up, disaster recovery testing, and day-to-day operations, to verify the processes are being followed.

2.15 Evaluate Software Products Review Process

This QA task assures that quality review processes are in place for all software products, which may include representations of information other than traditional hard-copy documents, and that these products have undergone software product evaluation, testing, and corrective action as required by the standard. QA Team will review all software tools and verify that OKDHS becomes the owner of all software tools at the conclusion of the MOSAIC Project.

TABLE 12 – SOFTWARE PRODUCTS REVIEW
TASK ITEM | TASK # | TASK DESCRIPTION

SP-1-Q QA Team shall evaluate and verify that software products that are ready for review are reviewed and results are reported.

Software Product Review

SP-2-Q QA Team shall evaluate and verify issues or defects reported are resolved in accordance with QA standards and procedures.

SP-3-Q QA Team shall conduct evaluations of tools, both existing and planned, used for software development and support.

SP-4-Q QA Team shall evaluate and verify tools are adequate by assessing whether they perform the functions for which the tools are intended.

SP-5-Q QA Team shall evaluate and verify tools for applicability by assessing whether the tool capabilities are needed for the software development or support.

SP-6-Q QA Team shall evaluate and verify planned tools are feasible by assessing whether they can be developed with the techniques and computer resources available or by procurement.

SP-7-Q QA Team shall research and evaluate software tools available to automate QA functions.

Software Tool Evaluation

SP-8-Q QA Team shall evaluate the tools to verify the techniques used include review of the use of standards, software inspections, requirements tracing, requirements and design verification, reliability measurements and assessments, and rigorous or formal logic analysis.

2.16 Evaluate Component Deliverable (Release) Process

QA Team shall evaluate the activities in preparation for component delivery to verify that program or project requirements for functional and physical audits of the end-item product are being satisfied. In some cases, QA Team should be allowed to prohibit delivery of certain items, such as documentation, code, or a system, if the project fails to meet contractual requirements or standards.


TABLE 13 – COMPONENT DELIVERABLE PROCESS
TASK ITEM | TASK # | TASK DESCRIPTION

CD-1-Q QA Team shall evaluate and verify the MOSAIC Project Migration Approval Process guide defines the migration of products being released or migrated from one environment to another.

Component Deliverable

CD-2-Q QA Team shall evaluate and verify that all required interfaces are properly connected and integrated.

2.17 Evaluate Media Certification, Storage and Handling Process

The MOSAIC Project will have various media and storage needs throughout its life cycle. QA Team will put processes in place to verify media, products, versioning, and storage are current, compatible, and in sync.

TABLE 14 – MEDIA CERTIFICATION PROCESS
TASK ITEM | TASK # | TASK DESCRIPTION

MC-1-Q QA Team shall evaluate and verify the media containing the source code and the media containing the object code which are delivered to the MOSAIC Project correspond to one another.

MC-2-Q QA Team shall evaluate and verify that all products delivered will be in a format consistent with Standards as described in the MOSAIC Project RFP, SOW, and Requirements documents.

Media Certification

MC-3-Q QA Team shall evaluate and verify that the software version represented by this media matches that on which software performance testing was performed, or correctly represents an authorized update of the code, as applicable.

MC-4-Q QA Team shall evaluate and verify that there is an established plan, methodology, or set of procedures for storage and handling of media.

MC-5-Q QA Team shall evaluate the storage of the software product and documentation to verify that storage areas for paper products or media are free from adverse environmental effects such as high humidity, magnetic forces, and dust.

Storage and Handling

MC-6-Q QA Team shall verify that electronic copies of source code are in the custody of an independent escrow agent.
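MC-1 above requires that delivered source and object media correspond to one another, but the plan does not prescribe a mechanism. One common approach is to record a cryptographic digest manifest of each medium at build time and re-verify it at delivery. The sketch below assumes media are mounted as file trees; none of these function names come from the MOSAIC documents.

```python
# Sketch: digest-manifest verification that delivered media match what was
# recorded at build time. A tree layout for the media is assumed.
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 digest of a file's contents, hex-encoded."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def media_manifest(root: Path) -> dict[str, str]:
    """Map each file (relative to the media root) to its digest."""
    return {
        str(p.relative_to(root)): file_digest(p)
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def manifests_match(delivered: dict[str, str], recorded: dict[str, str]) -> bool:
    """True when the delivered media carry exactly the recorded contents."""
    return delivered == recorded
```

Comparing the manifest of the source medium against the one archived when performance testing was run also gives a concrete check for MC-3.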

2.18 Non-Deliverable Software Certification

The MOSAIC Project may use non-Deliverable software in the development of Deliverable software as long as the operation and support of the Deliverable software after delivery to the project do not depend on the non-Deliverable software, or provision is made to verify that OKDHS has or can obtain the same software.

TABLE 15 – NON-DELIVERABLE SOFTWARE CERTIFICATION
TASK ITEM | TASK # | TASK DESCRIPTION

SC-1-Q QA Team shall evaluate and verify that the use of non-Deliverable software meets the above criteria, that is, Deliverable software is not dependent on non-Deliverable software to execute, or verify that OKDHS can obtain the same software.

Non-Deliverable Software

SC-2-Q QA Team shall evaluate and verify that all non-Deliverable software used on the project performs its intended functions.

2.19 Evaluate Performance Standards

The minimum performance standard expectations and requirements are described in detail in the MOSAIC Project RFP and will provide the state and federal documentation the Contractor will follow regarding both business and systems requirements.

TABLE 17 – PERFORMANCE STANDARDS
TASK ITEM | TASK # | TASK DESCRIPTION

PE-1-Q QA Team shall evaluate and verify the Contractor adheres to the performance requirements established in the MOSAIC Project RFP.

PE-2-Q QA Team shall review and measure whether response time standards are met in the production environment for the life of the MOSAIC Project.

PE-3-Q QA Team shall review and evaluate whether response time standards are met during testing of the system.

Performance Evaluation

PE-4-Q QA Team shall review and evaluate performance metrics.

3 PROJECT DELIVERABLES

During the MOSAIC Project, QA activities will be performed to verify the Contractor complies with OKDHS minimum requirements for content, submission, review, testing, and acceptance of all project Deliverables. In the event reviewing or testing identifies a requirement that is not met in whole or in part, OKDHS may return the Deliverable to the Contractor immediately for rework, or continue reviewing or testing until complete and then return the Deliverable to the Contractor for rework. The Contractor shall facilitate a walkthrough with OKDHS for each MOSAIC Project Deliverable prior to delivery. Following the walkthrough, OKDHS will receive the Deliverable unless a deficiency is discovered in the walkthrough. In the event OKDHS identifies a Deliverable that requires more than the minimum period for review, the period will be scheduled in the Project Plan. OKDHS will receive MOSAIC Project Deliverables based upon the Project Plan. Deliverable Acceptance is further explained in the MOSAIC Project RFP and Section 6.1 of this QA Plan.

4 REVIEWS AND AUDITS

In order for QA Team to evaluate compliance with the QA Plan and MOSAIC Project RFP, QA Team will review and approve Deliverables throughout the MOSAIC Project life cycle. These reviews will specify that the evidence of work generated is adequate to verify compliance with project scope, contract, and quality requirements. Audits performed by the QA Team or state and federal auditors shall include examination of both internal and external project and product Deliverables.

4.1 Verify Document and Artifact Deliverable Review

All documentation and artifacts generated throughout the MOSAIC Project life cycle will be subject to a QA review. In addition to Project Management Plans, other identified documentation for review will be that which controls the processes, development, verification and validation, use, and maintenance of the software and hardware. Appropriate stakeholders and MOSAIC Project Team members will be requested as needed by the QA Team to help in the document reviews. QA checklists will be developed to verify all document components meet project and contract requirements.


4.1.1 Disciplines for Documentation Standard Practices

MOSAIC Project documentation templates will be used as the basis for all document Deliverables of the MOSAIC Project RFP. A waiver exempting project documentation Deliverables from standard template usage will be required. The MOSAIC Project Team shall use industry best-practice documentation standards and application-related documentation modeling tools, and adhere to sound modeling principles, to verify standardization and traceability of all system application documentation. QA reviews will also focus on Business Requirement Documents, Technical Requirement Documents, and Design Documents, which will be the basis for developing Test Plans and performing quality control testing.

4.1.2 Document and Artifact Review Timeframes

As stated in the MOSAIC Project RFP, for each program document Deliverable the schedule must allow ten business days for review of documents of 50 pages or less, fifteen business days for documents of 51 – 100 pages, and twenty business days for documents of 101 pages or more. All document Deliverables will be reviewed in a group walkthrough conducted by the Contractor with OKDHS. No more than three documents will be scheduled by OKDHS for review during the same timeframe.

4.2 Verify Project Management Compliance Reviews

QA Team periodic project management compliance reviews of project status, progress, defects, and risks will provide an independent assessment of project activities. QA Team will provide the following information to the MOSAIC Project Team:

1. Compliance – Identification of the level of compliance of the MOSAIC Project with established organizational and project processes.

2. Defect areas – Identification of potential or actual project defect areas based on analysis of technical review results.

3. Risks – Identification of risks based on participation and evaluation of project progress and trouble areas.
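The document-review windows stated in Section 4.1.2 (ten business days for documents of 50 pages or less, fifteen for 51 – 100 pages, twenty for 101 pages or more) can be expressed as a small lookup. This is a convenience sketch for schedule planning, not a substitute for the RFP text.

```python
# Sketch of the Section 4.1.2 review windows, keyed on page count.

def review_business_days(pages: int) -> int:
    """Business days OKDHS must be allowed to review a document Deliverable."""
    if pages < 1:
        raise ValueError("page count must be at least 1")
    if pages <= 50:
        return 10
    if pages <= 100:
        return 15
    return 20
```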

4.2.1 Scheduled Compliance Reviews

QA Team will generate and maintain a compliance review schedule. Reviews will occur based upon milestones or triggers as indicated in the Project Plan. The results of audits will be discussed with the Quality Program Manager and the MOSAIC Team Lead or other individual(s) responsible for the production of the Deliverable. Results will be submitted by the QA Team to the OKDHS Program Manager(s) in scheduled status reports.

4.2.2 Unscheduled Compliance Reviews

QA Team will perform random and unannounced compliance reviews to verify the corrective actions agreed to during the scheduled reviews are being followed. The results of the reviews will be discussed with the Quality Program Manager and the MOSAIC Team Lead or other individual(s) responsible for the production of the Deliverable. Results will be submitted by the QA Team to the OKDHS Program Manager(s) in scheduled status reports.


4.2.3 Compliance Review Reports

Compliance review reports and recommended corrective actions generated by QA Team will be brought to the attention of the individual(s) or MOSAIC Team Lead responsible for producing the Deliverable using the MOSAIC Deliverable process. Corrective action will be recommended and reviewed with the individual(s) and the Quality Program Manager or MOSAIC Team Lead. Compliance, defect areas, and risks will be followed up and tracked to closure. The results of reviews of the QA function will be tracked and maintained by the QA Team. Section 7 further explains Issue Reporting and Corrective Action.

4.3 Conduct Process Audits and Reviews

MOSAIC Project processes are audited according to the tasks specified in this QA Plan and performed in accordance with the Project Plan and schedule. The forms utilized by QA Team for audit reporting include, but are not limited to, the Process Audit Report. The Process Audit Report can be adapted and used for any specific audit or review.

4.3.1 Process Audit and Review Schedule

The MOSAIC Project Teams will provide current status to the Executive Sponsor and team members through the reviews, compliance audits, reports, and working interchanges established for the program. Additional audits, reviews, inspections, walkthroughs, tactics, and measurements will be further refined in later phases of the MOSAIC Project. Appendix C reflects the Process Audit and Review Schedule.

4.3.2 Process Audit Report

QA Team reports the results of a process audit and provides recommendations, if necessary, using the Process Audit Report. The Process Audit Report is used to record that the process is being followed correctly and working effectively, is being followed but is not working effectively, or is not being followed. Appendix D reflects the Process Audit Report form.

4.3.3 Submittal and Disposition of Process Audit Report

The Process Audit Report is directed to the groups listed below:

1. MOSAIC Project Decision Team – The results of process audits are used in conjunction with other project status information to guide the Decision Team’s attention to identifying and mitigating project risks at the organizational level.

2. Program Quality Manager – The Program Quality Manager utilizes the report in the ways listed below:

a. To provide insight into whether there is compliance with the development process and its effectiveness in meeting project goals. Where necessary and appropriate, the Program Quality Manager may initiate enforcement activities or initiate change to the established processes using the approved procedures.


b. To provide insight to MOSAIC Team Lead into whether there is compliance with the development process and its effectiveness in meeting project goals.

c. To indicate agreement, disagreement, or deferral of recommendations cited in the Process Audit Report. If the OKDHS Program Manager indicates disagreement with the recommendations recorded on the Process Audit Report, the final disposition of report recommendations is made by the MOSAIC Project Decision Team.

4.4 Evaluation

MOSAIC Project processes are evaluated according to the tasks specified in this QA Plan and performed in accordance with the Project Plan and schedule. The forms utilized by QA Team for evaluation reporting include, but are not limited to, the Software Tool Evaluation checklist, Performance Standards Evaluation checklist, Pilot Evaluation checklist, and the Implementation Evaluation checklist. The Evaluation checklists can be adapted and used for any specific review.

4.4.1 Software Tool Evaluation

Quality review processes are in place for all software products, and these products will be evaluated, tested, and corrective action performed as required by the standard. Appendix E reflects the Software Tool Evaluation checklist.

4.4.2 Performance Standards Evaluation

Quality review processes are in place for evaluation of performance standards. The QA Team will work with the Contractor and the OKDHS Performance Analysis Team to analyze capacity and performance information and resolve any capacity and performance issues prior to or in conjunction with Implementation. Appendix F reflects the Performance Standards Evaluation checklist.

4.4.3 Pilot Evaluation

The pilot criteria will be evaluated by the QA Team for pilot success. As stated in the MOSAIC Project RFP, the Pilot Plan the Contractor provides will detail the methodologies and best practices used and shall include, but is not limited to, pilot test criteria, which are subject to evaluation of how the criteria will be performed, captured, or measured. Appendix G reflects the Pilot Test Evaluation checklist.

4.4.4 Implementation Evaluation

The implementation criteria will be evaluated by the QA Team for implementation success. As stated in the MOSAIC Project RFP, the Implementation Plan the Contractor provides will detail the methodologies and best practices used and shall include, but is not limited to, implementation test criteria, which are subject to evaluation of how the criteria will be performed, captured, or measured. Appendix H reflects the Implementation Evaluation checklist.


5 TESTING PROCESS AND ENVIRONMENTS

Six types of testing and five OKDHS-DSD environments are defined below and will encompass MOSAIC Project testing activity for all the environments. The Quality Control Testing (QCT) Standards and Processes document provides the test process flow. QA Team shall audit the QCT activities as defined in the QCT Standards and Processes document and QA Plan, and shall verify that the software and test documentation are subject to change management control. QA Team shall witness the tests and verify that test results are evaluated and documented.

5.1 System and User Acceptance Testing Processes

State system and User Acceptance testing shall not begin until the Contractor has completed thorough internal testing, all programming is completed, and approval of all documents has been received. All internal test documents shall be forwarded to OKDHS for review to verify the Contractor has performed testing of each component. Failure to adequately test, or provision of components that have not been tested or that failed testing, shall result in the application of liquidated damages as outlined in the MOSAIC Project RFP. Figure 2 below reflects the component testing processes in DSD environments.

[Figure 2: Component Testing Process – flow from Start through Unit Testing, System/Integrated Testing, Functional Testing, Regression Testing, Load/Performance Testing, User Acceptance Testing, and Training to Production, spanning the Development, QA/QC, User Acceptance, Training, and Production DSD environments. Approval is required to move objects from one environment to the next; DSD environments are indicated by italics.]

1. Unit testing is usually performed on a smaller scale and is focused specifically on the functionality of the component. Each data field should be tested for its limits as described below. All paths through the component should be tested. Test scenarios should be defined for unit testing based on the functionality required and followed as part of the unit test.

2. Integration testing is performed either with multiple new components or a mixture of one or more new components and production code. All functionality (new and old) should be tested during this testing phase. All paths through the component should be tested with multiple values of test data.

3. Functional testing focuses upon both new functionality requested and previously existing functionality. It compares new functionality to requirements documents and verifies that requirements have been met successfully.

4. Regression testing is a subset of functional testing. Testing is performed with both new and existing components to determine whether all work well together. All functionality (new and old) should be tested during this testing phase. All paths through components should be tested with multiple values of test data.

5. Load/Performance testing is a subset of functional testing. It verifies that performance best practices and guidelines are met concerning, but not limited to, resource utilization, response time, baselines, and processing time. Success criteria will be based upon defined metrics.

6. User acceptance testing is focused primarily upon new functionality requested, or if a completely new component, in its entirety.
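As a minimal illustration of the unit-testing discipline in item 1 — exercising a data field at its limits and covering both paths — consider the following sketch. The component and its field rule are invented for illustration and are not taken from the MOSAIC system.

```python
# Hypothetical component: a case-number field with defined limits (1-999999).
# The test class exercises both boundaries and both outcomes, as item 1
# recommends for unit tests.
import unittest

def valid_case_number(value: int) -> bool:
    """Hypothetical field rule: case numbers run from 1 to 999999."""
    return 1 <= value <= 999999

class CaseNumberLimits(unittest.TestCase):
    def test_lower_boundary(self):
        self.assertTrue(valid_case_number(1))
        self.assertFalse(valid_case_number(0))

    def test_upper_boundary(self):
        self.assertTrue(valid_case_number(999999))
        self.assertFalse(valid_case_number(1000000))
```

Such a class would be run with `python -m unittest` as part of the unit-test step in the Development environment.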

5.2 System Environments

The six OKDHS-DSD environments are Research, Development, Functional, User Acceptance, Training and Production. Only five of these environments will be utilized in the development of the MOSAIC Project Enterprise System. The figure below illustrates a typical migration path for all production-bound modifications and enhancements. QA Team will verify that the MOSAIC Project Team will develop and provide acceptance criteria for the hardware, software, and Enterprise database for each environment. Figure 3 below reflects the six OKDHS-DSD environments.

The dotted line represents check-out of production objects. The blue dashed line indicates that the Training Environment is optional.

1. Research Environment is primarily for allowing the research analysts to install, deploy, and test new technology and products in an isolated environment. Research activities that would change the nature of the environment enough to negatively impact the development environment occur here. Basic feasibility studies and preliminary proofs of concept can be implemented here. It will include elements of the production environment.

2. Application Development Environment is primarily for allowing the analysts, designers, and developers to create source objects (programs, databases, etc.) for initial development in a protected environment. It closely resembles the production environment. Unit testing and maintenance occur here. Unit testing will generally be performed by the requestor of the object.

3. QA/QC Environment will be used primarily to perform quality assurance and quality control testing for all applications before implementation in production. The objects created in the development environment will be migrated to this environment after unit testing by the analysts, designers, and developers is complete. Application, integration, performance, and regression testing occur here. Changes to the software, hardware, or process will never be made directly to this environment. Each change must follow the migration process and start with the Application Development environment.

4. User Acceptance Environment will be used to perform user acceptance testing of new or changed applications in a protected environment. Performance testing occurs here. The new applications and their supporting elements and processes are the only variances from the production environment. End-users will use this environment to verify that requirements have been translated into the end product. This is where users review, reject, or approve applications. Changes to the software, hardware, or process will never be made directly to this environment. Each change must follow the migration process and start with the Development environment.

5. Training Environment will be primarily used for training for all OKDHS production applications. Changes to the software, hardware or process will never be made directly to this environment. Each change will need to follow the migration process and start with the Development environment.

6. Production Environment is the final destination of all finished products. All OKDHS production applications and databases reside in this environment. Applications can only be introduced here after they have been through the migration process and software configuration management procedures. Changes to the software, hardware or process will never be made directly to this environment. Each change will need to follow the migration process and start with the Development environment.
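The migration path described above can be expressed as an ordered sequence with a small helper that enforces it. This is a hedged sketch: the environment names follow this section, but the `next_environment` helper and the optional-Training handling are illustrative assumptions, not an actual OKDHS tool.

```python
# Production-bound migration order per Section 5.2. Research is isolated
# and not part of the path; Training is an optional stop before go-live.
MIGRATION_PATH = [
    "Development",      # source objects originate here
    "QA/QC",            # integration, performance, regression testing
    "User Acceptance",  # end-user verification of requirements
    "Training",         # optional (blue dashed line in Figure 3)
    "Production",       # final destination of all finished products
]

def next_environment(current, skip_training=False):
    """Return the next environment a change may migrate to, or None
    if the change has already reached Production."""
    i = MIGRATION_PATH.index(current)
    if i == len(MIGRATION_PATH) - 1:
        return None
    nxt = MIGRATION_PATH[i + 1]
    if nxt == "Training" and skip_training:
        return "Production"
    return nxt

print(next_environment("QA/QC"))                                # User Acceptance
print(next_environment("User Acceptance", skip_training=True))  # Production
```

Because every change must start in Development, direct edits to QA/QC, User Acceptance, Training, or Production are simply not reachable in this model.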

5.3 Quality Control Testing Process

A thorough and consistent quality control process will be implemented by the QA Team. The QA Team will form a testing team to perform the activities defined or referenced in this plan. Each major Deliverable will be reviewed by the MOSAIC Project QA/QC Lead, the QA Team, and the development Contractor against the quality control procedures to verify that no requirements are overlooked.


5.3.1 Quality Control Testing (QCT) and Migration Manual

The QCT processes to be followed by the testing team are outlined in the MOSAIC Project QCT Standards and Processes document. This document establishes and flowcharts the QCT processes to follow and the tools to be utilized during testing of the various framework modules. The QCT document shows how the QCT Deliverables are to be met for each of the planned modules. A MOSAIC Project Migration Approval Process manual will also be followed to identify the processes for migrating modules or sub-modules into the OKDHS-DSD environments. An approval process with QC checklists will be utilized that gives OKDHS final approval of any migration package and verifies that all impacted customers are notified promptly.

5.3.2 Test Plans

As stated in the MOSAIC Project RFP, the Contractor shall assist OKDHS in developing the Test Plans and the test cases, scenarios, scripts, and data sufficient to fully prove the Enterprise System meets all business and technical requirements. The template requirements for the Test Plans are documented in the MOSAIC QCT Standards and Processes document. The QCT document describes how the system will be tested, how the Test Plans will be developed, and who will perform each required test. The QA Team will verify that the Test Plans are traceable to the requirements and design documents to validate that all requirements have been addressed.

6 VALIDATION AND ACCEPTANCE PROCESS

The QA Team shall verify that all validations and acceptances are done in accordance with the MOSAIC Project RFP. The QA Team will document completed Deliverables that have been accepted or not accepted, along with reasons for non-acceptance. A Deliverable Tool will be maintained by the QA Team to track each Deliverable, review and approval dates, issues, etc. The Contractor, in conjunction with the OKDHS Quality Team, shall facilitate the User Acceptance process.

6.1 Deliverable Acceptance

OKDHS will notify the Contractor in writing of the acceptance status by the end of the review/testing period for the Deliverable, stating whether the Deliverable is:

1. Accepted without condition;
2. Accepted with conditions that must be satisfied; or
3. Not accepted and returned to Contractor for rework.

In the event OKDHS returns a Deliverable to the Contractor for rework, the review period begins again upon Contractor redelivery. At that time, the Contractor shall estimate the redelivery date. Deliverables returned to the Contractor for rework will be considered delayed, and appropriate liquidated damages may be applied. The Contractor shall provide the OKDHS Program Manager a copy of the Deliverable Acceptance Letter with the invoice for each Deliverable.
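The three acceptance outcomes and the restarted review period on rework can be modeled with a small record. This is a sketch only: the `Deliverable` class and its fields are hypothetical illustrations of the tracking a Deliverable Tool might perform, not the actual tool described in this plan.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Acceptance(Enum):
    ACCEPTED = "accepted without condition"
    CONDITIONAL = "accepted with conditions that must be satisfied"
    REJECTED = "not accepted; returned to Contractor for rework"

@dataclass
class Deliverable:
    name: str
    review_cycles: int = 0
    status: Optional[Acceptance] = None

    def record_review(self, status: Acceptance) -> bool:
        """Record a review outcome. Returns True when rework is required,
        meaning the review period restarts at redelivery (and the
        Deliverable is considered delayed)."""
        self.review_cycles += 1
        self.status = status
        return status is Acceptance.REJECTED

d = Deliverable("Test Plan")
needs_rework = d.record_review(Acceptance.REJECTED)  # True: cycle restarts
d.record_review(Acceptance.ACCEPTED)
print(d.name, d.review_cycles, d.status.name)  # Test Plan 2 ACCEPTED
```

The cycle counter gives a simple basis for the delay determination the text describes: any Deliverable with more than one review cycle went through rework.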


6.2 Contractor’s System Certification

The Contractor and OKDHS must certify in writing, via the Final Implementation Acceptance Letter, that the statewide implementation of the module successfully met the requirements of the Implementation Plan. The Final Implementation Acceptance Letter is required for final payment of the implemented module. OKDHS shall certify the acceptance of systems when all requirements are met and federal or State of Oklahoma approval, or both, if applicable, is received.

6.3 Acceptance Plans and Releases

As stated in the MOSAIC Project RFP, upon successful implementation of each incremental release, the Contractor must present the systems/modules to OKDHS for acceptance. The system presented for acceptance must account for all required functionality, training, conversion, documentation, and any other related requirements of the MOSAIC Project RFP, federal regulations, and OKDHS policies for the components comprising the release. The Contractor shall turn over the system component(s) release to OKDHS for final acceptance upon successful implementation of each MOSAIC Project module in all OKDHS offices. The Contractor shall produce Turnover Documentation and conduct interactive knowledge transfer sessions using the delivered documentation in the appropriate environment.

6.4 Federal Acceptance and Approval Preparation

The Contractor shall assist OKDHS in activities to secure federal acceptance and approval of the Enterprise System. OKDHS shall submit the Enterprise System Certification Review Plan for federal approval. The Contractor shall provide support and documentation to OKDHS, as required, in all reviews in preparation for or during the federal approval process, to attain federal approval of the Enterprise System. The Contractor shall provide all Required System Documentation and support to obtain federal approval.
6.5 Food and Nutrition Service (FNS) Acceptance Requirements

FNS policies and procedures that OKDHS must follow in order to receive federal funding to develop, acquire, and implement the MOSAIC Project Enterprise System are described in FNS Handbook 901.

6.5.1 Status Reports

The results of MOSAIC Project monitoring will be reported in status reports. FNS requires status reporting, which will be defined in the Communication Plan. FNS Handbook 901 refers to Appendix D-23 for the Status Report Checklist and Appendix E for a sample status report. The QA Team will verify that the required information is included in the status reports.

6.5.2 Planning Document Reviews and Closures

Reviewing planning documents such as the PAPD, IAPD, APDU, and APDs includes confirming that the project objectives have been met and determining the actual costs incurred. The required information is in FNS Handbook 901, section 2.7.


6.5.3 FNS Post-Implementation Review

FNS may conduct a post-implementation review of the system once it is fully operational statewide (approximately six months after statewide deployment, to accommodate the initial user learning curve). FNS may conduct an onsite post-implementation review to verify that OKDHS accomplished the goals stated in its APD. This review encompasses the program, technical, security, and financial aspects of the system. The required information is in FNS Handbook 901, section 2.7.1. The Food Stamp Program Post-Implementation Review printable checklist can be found at http://www.fns.usda.gov/apd/FSP_PIR/Full_Checklist.PDF

6.5.4 Systems Functional Requirements Review

FNS may elect to conduct a Systems Functional Requirements Review before and/or during the initial pilot training and before the deployment of software. The required information is in FNS Handbook 901, section 2.11.3.1.

6.5.5 Cost Reviews and Audits

OKDHS shall provide access to all cost records relating to system development and operations. FNS may use data mining software during these reviews, which will require OKDHS to provide FNS staff with project expenditures in an electronic format. Failure to cooperate with federal requests for information in support of a review or audit may result in suspension or termination of FNS funding for the system and its operations. The required information is in FNS Handbook 901, section 7.4.

6.5.6 Regional Office Expenditure Review

The FNS Regional Office will compare reported expenditures for IS development from Form SF-269 (http://www.whitehouse.gov/omb/grants/sf269.pdf), or other expenditure reports, with the expenditures reported in the annual APD. Any differences will be examined and will need to be reconciled. There should be no significant differences between expenditures reported on Form SF-269 and those reported in the annual APDU.
6.6 Health and Human Services (HHS) Acceptance Requirements

Automated Systems for Child Support Enforcement: A Guide for States was developed by the U.S. Department of Health and Human Services’ (DHHS) Administration for Children and Families (ACF). The guide addresses the requirements associated with federal certification of comprehensive, automated, statewide Child Support Enforcement systems. The guide and other related certification material may be found at the following link: http://www.acf.hhs.gov/programs/cse/stsys/dsts_cert_guide.html

6.6.1 Authority

The origin of the programs overseen and financed by DHHS/ACF is the Social Security Act. Included under the Administration for Children and Families’ scope of review authority is Title IV-D, Child Support Enforcement. All other authority is referenced in Chapter I, Section B, of the Automated Systems Guide.


6.6.2 General Requirements

OKDHS personnel and Contractors working on systems subject to certification should use this guide throughout the life cycle of the system development effort. Chapter II of the Automated Systems Guide encompasses the general requirements to be adhered to.

6.6.3 CSE System Requirements

Nine functional areas of child support enforcement, with their related system requirements, are identified in Chapter III of the Automated Systems Guide. Several hyperlinks to related websites are referenced throughout Chapter III of the guide.

7 ISSUE REPORTING AND CORRECTIVE ACTION

This section describes the issue reporting and control process used by the QA Team to record and analyze issues, discrepancies, and defects, and to monitor the implementation of corrective action. Corrective action will involve actions taken as a result of a QA/QC measurement, audit, or review that indicates the development process exceeds established parameters. Jointly with OKDHS, the Contractor shall review, critique, and propose issue and defect resolution and escalation procedures.

7.1 Escalation Procedure for Resolution Disputes

In the event that affected MOSAIC Project staff dispute the findings and recommendations of a Process Audit Report, Defect (Incident) Report, or evaluation checklist, the QA Team will first communicate with the OKDHS Program Manager to resolve the dispute. If the OKDHS Program Manager also disputes the findings and/or recommendations, the Quality Program Manager directs the final disposition of recommendations and may implement, defer, or cancel the implementation of corrective actions. This direction is recorded and dated by the OKDHS Program Manager and added to the QA evaluation records of the MOSAIC Project. The QA Team retains the original record of findings and subsequent resolution data in its audit files. The Authority/Decision Chart, as reflected in the Project Management Plan and Table 20 below, defines the escalation levels to be followed by the QA Team.

Person Responsible | Degree of Authority* | Responsible to:
Commission & Legislative Liaison | 1 | OKDHS Commission
Sponsor | 2 | Commission & Legislative Liaison
Decision Team | 3 | Sponsor
Program Manager | 3 | Decision Team
Project Manager or Team Lead | 4 | Program Manager
Team Member | 4 | Project Manager or Team Lead

*Degree of Authority Legend


1 – Final authority to decide or act on any and all matters.

2 – Final authority to decide or act on any matters within budgets and existing OKDHS staff.

3 – Authority to decide or act on any matters within project budget and existing project staff.

4 – Authority to decide or act; participates with, or acts after first consulting, the person to whom they are responsible.
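The Authority/Decision Chart above can be expressed as a simple lookup table. This is an illustrative sketch: the role names and levels are copied from Table 20, but the `escalation_chain` helper is an invented convenience, not part of the plan's tooling.

```python
# Table 20 as a mapping: role -> (degree of authority, responsible to).
ESCALATION = {
    "Commission & Legislative Liaison": (1, "OKDHS Commission"),
    "Sponsor": (2, "Commission & Legislative Liaison"),
    "Decision Team": (3, "Sponsor"),
    "Program Manager": (3, "Decision Team"),
    "Project Manager or Team Lead": (4, "Program Manager"),
    "Team Member": (4, "Project Manager or Team Lead"),
}

def escalation_chain(role):
    """Walk the chart upward from a role until final authority is reached."""
    chain = [role]
    while role in ESCALATION:
        role = ESCALATION[role][1]
        chain.append(role)
    return chain

print(escalation_chain("Team Member"))
# ['Team Member', 'Project Manager or Team Lead', 'Program Manager',
#  'Decision Team', 'Sponsor', 'Commission & Legislative Liaison',
#  'OKDHS Commission']
```

A dispute that cannot be resolved at one level is simply handed to the next entry in the chain, terminating at the OKDHS Commission, which holds final authority on all matters (level 1).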

7.2 Corrective Action Process

Defects identified for resolution may be subject to a corrective action process and may include, but are not limited to, schedule and plan non-conformance, documentation errors, software errors, and noncompliance with standards and procedures. The steps of the corrective action process are as follows:

1. Defect identification and correction occurring during software development to verify early detection of actual or potential defects;

2. Reporting of the defect to the proper authority;
3. Analysis of the defect to propose corrective measures;
4. Timely and complete corrective action;
5. Recording and follow-up of each defect’s status.

QA activities for the corrective action process are listed below:

1. Periodically review the corrective action process and its results against the Change Management Plan to assess the effectiveness of the corrective action process.

2. Perform periodic analysis of all reported defects to identify trends that may disclose generic defect areas. These analyses shall include the study of the causes, magnitude of impact, frequency of occurrence, and preventive measures.
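The periodic trend analysis described above amounts to grouping reported defects by cause and ranking by frequency of occurrence. A minimal sketch follows; the defect records and cause categories are invented for illustration, and a real analysis would also cover magnitude of impact and preventive measures.

```python
from collections import Counter

# Illustrative defect records; the fields mirror the analysis dimensions
# named above. All data values are invented.
defects = [
    {"id": 1, "cause": "requirements gap", "impact": "high"},
    {"id": 2, "cause": "coding error", "impact": "low"},
    {"id": 3, "cause": "requirements gap", "impact": "medium"},
    {"id": 4, "cause": "environment config", "impact": "low"},
    {"id": 5, "cause": "requirements gap", "impact": "high"},
]

def defect_trends(records):
    """Frequency of occurrence by cause, most common first. The top
    entries are candidates for generic defect areas and preventive action."""
    return Counter(r["cause"] for r in records).most_common()

print(defect_trends(defects))
```

In this invented sample, "requirements gap" dominates, which would steer preventive measures toward the requirements process rather than coding practices.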

7.3 Recording Defects (Incidents) in Software Code or Documentation

Defects (Incidents) found in the software code or documentation being developed must be recorded by the means explained below, regardless of how or by whom the defect was discovered. The QA Team shall analyze defects for trends in an effort to prevent recurring discrepancies. The QA Team will report the results of trend analyses along with suggestions for defect resolution and prevention. As stated in the MOSAIC Project RFP, the Contractor shall track questions, issues, and defects, issue and defect resolution, and approved changes to design, and make necessary changes to documentation within 30 calendar days after the approved change, using an agreed-upon issue and defect tracking system.

7.3.1 Defect (Incident) Tracking Tool

If a defect is found during the OKDHS QCT Process, the defect will be documented using the approved Defect (Incident) Tracking Tool and submitted to the Contractor for review and statement of corrective measures. The following process will be followed for the documented Defect Tracking Tool:

1. The Contractor shall document corrective actions in the original Defect Tracking Tool.

2. OKDHS will review the correction and re-test the correction and all related automation components that the correction could affect.

3. If the re-test is successful, OKDHS will document the re-testing effort within the Defect Tracking Tool, close out the incident, and notify the Contractor.

4. If the re-test was unsuccessful, OKDHS will notify Contractor that the defect still exists and further corrective action must be taken. Contractor shall continue to use the Defect Tracking Tool and document all actions.
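The four-step process above is a small state machine. The sketch below paraphrases the steps as an audit log; the function and its states are illustrative assumptions, not the API of the actual Defect Tracking Tool.

```python
# Minimal model of the Section 7.3.1 workflow. Steps 1-2 always run;
# the re-test outcome decides between step 3 (close) and step 4 (reopen).
def process_retest(retest_passed, log):
    log.append("contractor documents corrective action")            # step 1
    log.append("OKDHS re-tests correction and related components")  # step 2
    if retest_passed:
        log.append("OKDHS documents re-test, closes incident, "
                   "notifies contractor")                           # step 3
        return "closed"
    log.append("OKDHS notifies contractor; further corrective "
               "action required")                                   # step 4
    return "reopened"

log = []
print(process_retest(False, log))  # reopened
print(process_retest(True, log))   # closed
print(len(log))                    # 6 audit entries across both cycles
```

A reopened incident simply re-enters the same loop, with the Defect Tracking Tool accumulating the full action history across cycles.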

7.3.2 Resolve Defects (Incidents)

As stated in the MOSAIC Project RFP, the Contractor shall correct any defect found during system testing by OKDHS within two business days from the time the defect is entered in the Defect Tracking Tool. The Contractor must use the approved Change Control process to make corrections. The Contractor shall immediately report any corrections anticipated to require more than two days to the OKDHS Program Manager for assessment and determination of consequences.

7.3.3 Defect Repair Review

Defect repair review is an action taken by the QA Team to verify that product defects are repaired or replaced and brought into compliance with requirements or specifications. The MOSAIC Project Team should make every reasonable effort to minimize the errors that cause the need for defect repair. A defect log will be used to collect the set of recommended repairs and will be implemented in an automated defect tracking system.
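The two-business-day correction window in Section 7.3.2 can be computed mechanically from the entry date. This sketch assumes business days exclude only weekends; the plan does not say how holidays are counted, so they are ignored here.

```python
from datetime import date, timedelta

def correction_due(entered, business_days=2):
    """Add business days (skipping weekends) to the date a defect is
    entered in the Defect Tracking Tool. Holidays are ignored in this
    sketch, which is an assumption beyond the plan's text."""
    due = entered
    remaining = business_days
    while remaining > 0:
        due += timedelta(days=1)
        if due.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return due

# A defect entered on Friday 2009-03-20 is due Tuesday 2009-03-24.
print(correction_due(date(2009, 3, 20)))  # 2009-03-24
```

Any correction whose projected due date exceeds this window triggers the immediate report to the OKDHS Program Manager described above.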

8 QUALITY METRICS

To verify the delivery of a fully conforming, high-quality product, every individual assigned to the MOSAIC Project will participate in quality assurance. This section describes the procedures used by the QA Team to verify that the quality assurance provisions of this QA Plan measure the value of Deliverables and activities. The metrics will relate to business, project, process, and technical operations or performance. Performance metrics must be tracked within the application for an identified set of transactions and be reportable in order to provide information concerning the performance requirements. The Contractor will work with OKDHS in selecting the tools and techniques for the Quality Management plans and establishing proper and sufficient measurements and metrics to assess the MOSAIC Project. OKDHS and the Contractor will also identify, define, and create the Quality Metrics Definitions and Report, which shall measure the value of tasks and Deliverables related to the MOSAIC Project.


8.1 Measurements

Measurements will be made by assigning numerical values to specific attributes of each project component identified above and/or any other project component that management believes is a candidate for measurement and reporting. The objective of measuring these components is to help management better understand the project and the relationships of project components to each other, as well as take corrective action, if necessary, to verify the success of the MOSAIC Project. Records and reports that provide a history of product quality throughout the MOSAIC Project life cycle document QA activities. Measurement data collected will be reviewed for trends and process improvement, as explained in the next section. All QA Team records will be collected and maintained for the life of the MOSAIC Project in the established data repositories. The MOSAIC Project Decision Team will indicate whether every activity must start on time or only finish on time, and whether individual activities will be measured or only certain Deliverables and, if so, which ones.

8.2 Monitor and Control

The QA Team will verify that the monitor and control process is performed to monitor project processes associated with initiating, planning, executing, and closing. Corrective or preventive actions are taken to control project performance. Monitoring includes collecting, measuring, and disseminating performance information, and assessing measurements and trends to effect process improvements. Continuous monitoring gives the MOSAIC Project Decision Team insight into the health of the project and identifies any areas that may require special attention. The Monitor and Control process is concerned with:

1. Comparing actual project performance against the Project Plan;

2. Assessing performance to determine whether any corrective or preventive actions are indicated, and then recommending those actions as necessary;

3. Analyzing, tracking, and monitoring project risks to make sure the risks are identified, their status is reported, and appropriate risk action plans are being executed;

4. Maintaining an accurate, timely information base concerning the project’s product(s) and their associated documentation through project completion;

5. Providing information to support status reporting, progress measurement, and forecasting;

6. Providing forecasts to update current cost and current schedule information;

7. Monitoring implementation of approved changes when and as they occur.

8.3 Trend Analysis

The Contractor will work with the OKDHS QA Team to analyze project performance over time in an effort to prevent recurring discrepancies. The QA Team will perform periodic analysis of all reported defects to identify trends that may disclose


generic defect areas. These analyses shall include the study of the causes, magnitude of impact, frequency of occurrence, and preventive measures. The QA Team will report the results of trend analyses along with suggestions for defect resolution and prevention. The QA Team, with assistance from the Contractor, shall provide a QA Trend Analysis Report, which shall identify trends that will assist with the planning, development, and implementation of Deliverables.

8.4 Process Improvement Analysis

The Contractor will work with the OKDHS QA Team to create and consolidate the process improvement methodologies, which should detail the steps for analyzing processes to facilitate the identification of value-added and non-value-added activities. The methodologies should also provide guides for process improvement activities, such as internal surveys and process analysis.

8.4.1 Process Improvement Plan

The process improvement plan details the steps for analyzing processes to facilitate the identification of waste and non-value-added activity, thus increasing Deliverable value, such as:

1. Process boundaries. Describes the purpose, start, and end of processes; their inputs and outputs; data required, if any; and the owner and stakeholders of processes.

2. Process configuration. A flowchart of processes, with interfaces identified, to facilitate analysis.

3. Process metrics. Maintain control over the status of processes.

4. Targets for improved performance. Guide the process improvement activities.

8.4.2 Lessons Learned Documentation

The causes of variances, the reasoning behind the corrective action chosen, and other types of lessons learned from quality assurance reviews and audits shall be documented so that they become part of the historical data for the MOSAIC Project and OKDHS. Lessons learned will be documented throughout the MOSAIC Project life cycle and finalized during project closure.


Appendix A: QA Team Training

Source of training will be obtained from the Training Plan/Coordinator.

TASK | SKILL REQUIREMENTS | TYPE OF TRAINING | SOURCE OF TRAINING
Application Software Reviews | Project Knowledge, Product Knowledge, Source Language Knowledge | Peer Review, OJT | TBD
Documentation Reviews | Project Knowledge, Product Knowledge, Subject Matter Expertise | Peer Review, Workshops, OJT | TBD
Process Audits | Project Knowledge, Product Knowledge, Process Improvement Knowledge, Project Methodology Knowledge | Process Improvement, Six Sigma/Lean | TBD
QCT Testing | QCT Methodology Knowledge, Project Knowledge, Product Knowledge, Process Improvement Knowledge, Project Methodology Knowledge | QCT Methodology, Workshops, OJT | TBD
QA Management | Project Management, Process Improvement Knowledge, Project Knowledge, Product Knowledge, Project Methodology | Project Management, Process Improvement, Six Sigma/Lean, OJT | OU Lean Institute, Character First
Metrics Collection | Data Collection skills, Data Analysis skills, Metrics Definition skills, Data Measurement skills, Process Improvement Knowledge | Tools Training (EXCEL, MiniTab), Process Improvement, Six Sigma/Lean, OJT | OU Lean Institute
Defect Reporting | QCT Methodology Knowledge, Project Knowledge, Product Knowledge, Defect Tracking Methodology Knowledge | QCT Methodology Workshops, OJT | TBD
Corrective Action | QCT Methodology Knowledge, Project Knowledge, Product Knowledge, Defect Tracking Methodology Knowledge | QCT Methodology Workshops, OJT | TBD
Tool Utilization | Product Knowledge | Training with Product Contractors | TBD
Risk Management | Risk Tools Knowledge, Risk Identification skills, Data Collection skills, Data Analysis skills, Metrics Definition skills, Data Measurement skills, Process Improvement Knowledge | Risk Management Training, Tools Training (EXCEL), Process Improvement, OJT | TBD


Appendix B: QA Task Matrixes

Appendix B consists of multiple responsibility matrixes for the tasks identified in Section 2 of this QA Plan.

QA Task Legend: P = Participant; A = Accountable; R = Review Required; I = Input Required; S = Sign-off Required

Evaluate Planning Oversight (SECTION 2.4)
Roles: Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | Mosaic Team Lead
Contract Verification: I | R, A | R, I | P | P | I, S | P, A
Feasibility Study: I | R, A | R, I | P | P | I, S | P, A

Evaluate Project Management (SECTION 2.5)
Roles: Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | Mosaic Team Lead
Project Sponsorship: I | R, A | R, I | I, P, A, S | S
Management Assessment: I | R, A | R, I | I, P, A, S | I, S
Project Management: I | R, A | R, I | I, P, A, S | S
Business Process Reengineering: I | R, A | R, I | I, P, A, S | I, P, A, S
Risk Management: I | R | R, I | I, P, A | I, P, A, S | I, P, S
Change Management: I | R | R, I | I, P, A | I, P, S | I, A, S
Communication Management: I | R | R, I | I, P, S | I, A, S
Configuration Management: I | R | R, I | I, P, A | I, P, S | I, A, S
Project Estimating and Scheduling: I | R | R, I | I, P, A, S | I, S
Project Staff: I | R | R, I | I, P, S | I, A, S
Project Organization: I | R | R, I | I, P, S | I, A, S
Sub-Contractors and External Staff, if any: I | R | R, I | I, P, A, S | I, S
OKDHS Oversight: I | R | R, I | I, P, A, S | I, S

Evaluate Quality Management (SECTION 2.6)
Roles: Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | Mosaic Team Lead
Quality Assurance: I, A | I, P, A | R, I | I, P | I, P | I, P, S | I, P, S
Process Definition and Product Standards: I, A | I, P, A | R, I | I, P | I, P | I, P, S | I, P, S

Evaluate Training (SECTION 2.7)
Roles: Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Training Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | Mosaic Team Lead
User Training and Documentation: I, A | I, P, A | R, I, S | I, P, A | I, P | I, P | I, P
Developer Training and Documentation: I, A | I, P, A | R, I, S | I, P, A | I, P | I, P | I, P

Evaluate Requirements Management (SECTION 2.8)
Roles: Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Business Engineer | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Prog Mgr(s) | Mosaic Team Lead
Requirements Management: I, A | I, P, A | R, I, S | I, P, A | I, P | I, P | I, P, S | I, A, S
Security Requirements: I, A | I, P, A | R, I, S | I, P, A | I, P | I, P | I, P, S | I, A, S
Requirements Analysis: I, A | I, P, A | R, I, S | I, P, A | I, P | I, P | I, P | I, A
Interface Requirements: I, A | I, P, A | R, I, S | I, P, A | I, P | I, P | I, P | I, A
Reverse Engineering: I, A | I, P, A | R, I, S | I, P, A | I, P | I, P | I, P | I, A

Evaluate Operating Environment (SECTION 2.9)
Roles: Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Technical Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Prog Mgr(s) | Mosaic Team Lead
System Hardware: I, A | I, P, A | R, I | I, P, A, S | I, P | I, P | I, P | I, A
System Software: I, A | I, P, A | R, I | I, P, A, S | I, P | I, P | I, P | I, A
Database Software: I, A | I, P, A | R, I | I, P, A, S | I, P | I, P | I, P | I, A
System Capacity: I, A | I, P, A | R, I | I, P, A, S | I, P | I, P | I, P | I, A

Evaluate Development Environment (SECTION 2.10)
Roles: Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Infrastructure Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Prog Mgr(s) | Mosaic Team Lead
Development Hardware: I, A | I, P, A | R, I | I, P, A, S | I, P | I, P | I, P | I, A
Development Software: I, A | I, P, A | R, I | I, P, A, S | I, P | I, P | I, P | I, A

Evaluate Software Development (SECTION 2.11)
Roles: Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Quality Control Testing Supv | Chg Mgt Lead/Config Specialist | Quality Control Testers | Prog Mgr(s) | Mosaic Team Lead
High-Level Design: I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P, S | I, A
Detailed Design: I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P, S | I, A
Job Control: I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P, S | I, A
Code: I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P, S | I, A
Unit Test: I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P, S | I, A

Evaluate System and Acceptance Testing (Section 2.12)

| Activity | Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Quality Control Testing Supv | Chg Mgt Lead/Config Specialist | Quality Control Testers | Program Manager(s) | MOSAIC Team Lead |
| System Integration Test | I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P | I, A |
| Implementation | I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, A | I, A |
| Benchmark Tests | I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, A | I, A |
| Interface Testing | I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P | I, A |
| Acceptance and Turnover | I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P | I, A |
| Pilot | I, A | I, P, A | R, I | I, P, A | I, P | I, A | I, A | I, A |
| Turnover Documentation | I, A | I, P, A | R, I | I, P, A | I, P | I, P | I, P | I, A |

Evaluate Data Management (Section 2.13)

| Activity | Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Technical Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | MOSAIC Team Lead |
| Data Conversion | I, A | I, P, A | R, I, S | I, P, A, S | I, P | I, P | I, P | I, A |
| Database Design | I, A | I, P, A | R, I, S | I, P, A, S | I, P | I, P | I, P | I, A |


Evaluate Operations Oversight (Section 2.14)

| Activity | Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | MOSAIC Team Lead |
| Operational & Business Change Tracking | I, A | I, P, A | R, I | I, P, S | I, P | I, P | I, A |
| User Operational/Business Satisfaction | I, A | I, P, A | R, I | I, P, S | I, P | I, P | I, A |
| Operational & Business Goals | I, A | I, P, A | R, I | I, P, S | I, P | I, P | I, A |
| Operational & Business Documentation | I, A | I, P, A | R, I | I, P, S | I, P | I, P | I, A |
| Operational & Business Processes and Activity | I, A | I, P, A | R, I | I, P, S | I, P | I, P | I, A |

Evaluate Software Products Review Process (Section 2.15)

| Activity | Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | MOSAIC Team Lead |
| Software Product Review | I, A | I, P, A | R, I, S | I, P, S | I, P | I, P | I, A |
| Software Product Evaluation | I, A | I, P, A | R, I, S | I, P, S | I, P | I, P | I, A |

Evaluate Component Deliverable Process (Section 2.16)

| Activity | Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | MOSAIC Team Lead |
| Component Deliverables | I, A | I, P, A | R, I, S | I, P, S | I, P | I, P | I, A |

Evaluate Media Certification, Storage and Handling Process (Section 2.17)

| Activity | Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | MOSAIC Team Lead |
| Media Certification | I, A | I, P, A | R, I, S | I, P, S | I, P | I, P | I, A |
| Storage and Handling | I, A | I, P, A | R, I, S | I, P, S | I, P | I, P | I, A |


Evaluate Non-Deliverable Software Certification (Section 2.18)

| Activity | Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Business Engineer | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | MOSAIC Team Lead |
| Non-Deliverable Software | I, A | I, P, A | R, I, S | I, P, S | I, P | I, P | I, A | I, A |

Evaluate Performance Standards (Section 2.19)

| Activity | Prog Qual Mgr/Lead | QA/QC Lead | QA Compliance Specialist | Infrastructure Specialist | Chg Mgt Lead/Config Specialist | Risk Mgt Lead/Risk Specialist | Program Manager(s) | MOSAIC Team Lead |
| Performance Evaluation | I, A | I, P, A | R, I, S | I, P, S | I, P | I, A | I, A | I, A |


Appendix C: Process Audit and Review Schedule

| Project Quality Audit/Review | Planned Frequency or Phase | Quality Review Auditor | Comments |
| DSD Project Compliance Review | Monthly | DSD Business Quality Unit | The DSD QA unit completes a project quality audit to assess project quality. The report and results are submitted to the program manager and sponsors. |
| MOSAIC Project Decision Team Reviews | Periodically as needed | Decision Team | The Decision Team will conduct or delegate periodic reviews of funding, resource leveling, and project performance. |
| IV&V Audit | Every 6 months | IV&V Contractor | The IV&V Contractor will perform audits as required by the federal partners. Audit items are outlined in the IV&V RFP. |
| MOSAIC Project Quality Reviews | Weekly | Quality Assurance Team | Project management audits and reviews performed as determined by the Project Team. |
| MOSAIC Scope and Project Reviews | Periodically as needed | Program Manager | The program manager will monitor all new tasks added to the schedule in order to assess scope change. |
| Procurement Audit | As scheduled | OIG Auditors | Procurement audits can happen at any time during the project. |
| Cost & Variance Review | As scheduled | Program Manager and/or Quality Assurance Team | The PM or QA Team will assess noted variances to determine project impact. |
| State Audit | As scheduled | State Auditor & Inspector | The State Auditor may audit books and financial records for compliance with state laws, accounting controls, and government auditing standards. |
| Earned Value Audit | As scheduled | Program Manager and/or Quality Assurance Team | The PM or QA Team will assess physical work completed against the authorized budget. |
| Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) Review for Certification | As scheduled | Federal Auditors | ACF may assess whether data provided by the computer-based system is reliable and accurate. |
| Statewide Automated Child Welfare Information System (SACWIS) Review for Certification | As scheduled | Federal Auditors | The Division of State Systems may assess the system's functionality requirements. |
| Treasury Offset Program (TOP) Review for Certification | As scheduled | Treasury Department Auditors | The Treasury Department may assess states' compliance with the federal tax offset program. |
| Adoption & Foster Care Analysis and Reporting System (AFCARS) Review for Certification | As scheduled | Federal Auditors | ACF may assess the ability of states' systems to gather, extract, and submit correct data accurately. |
| National Child Abuse & Neglect Data System (NCANDS) Review for Certification | As scheduled | Federal Auditors | Voluntary national data collection and analysis system. |
| Risk Management Review | As scheduled | Quality Assurance Team | The QA Team will assess the management of risks and verify that documenting, monitoring, and reporting processes are followed. |
| Change Management Review | As scheduled | Quality Assurance Team | The QA Team will assess that procedures are followed for the identification, reporting, tracking, and approval of all changes. |
| Project Management Reviews | TBD | Team Leads | To review areas of management responsibility in coordination with the quality assurance team as needed. |
| Children and Family Services Review | January 25-29, 2010 | Federal Auditors | To examine the state's capacity and performance in improving outcomes for families engaged in child welfare services. |
| Title IV-E Foster Care Eligibility Review | January 25-29, 2010 | Federal Auditors | ACF may assess the accuracy of state claims made on children placed in foster care services. |
| National Youth in Transition Database (NYTD) Review | Implemented by October 1, 2010 | Federal Auditors | ACF may assess states' performance in operating independent living programs. |
| FNS Food Stamp Post-Implementation Review (Section 6.5.3) | As scheduled | Federal Auditors | FNS may conduct a review of the system once it is fully operational. |
| FNS Systems Functional Requirements Review (Section 6.5.4) | As scheduled | Federal Auditors | FNS may conduct a review before and/or during the initial pilot training and before the deployment of software. |
| FNS Cost Reviews and Audits (Section 6.5.5) | As scheduled | Federal Auditors | FNS may conduct a review of all cost records relating to system development and operations. |
| FNS Regional Office Expenditure Review (Section 6.5.6) | As scheduled | Federal Auditors | The FNS Regional Office will compare reported expenditures for IS development from Form SF-269. |
| DHHS/ACF CSE System Certification Audit (Section 6.6) | As scheduled | Federal Auditors | ACF certification of comprehensive, automated, statewide Child Support Enforcement systems. |


Appendix D: Process Audit Report

PROCESS AUDIT REPORT

TASK IDENTIFIER: ____________
LEAD AUDITOR: _____________________
DATE OF REPORT: ____________
AUDIT TEAM: ___________________________________________________________
PROJECT NAME: _________________________________________________
DATE OF AUDIT: ________________________
PROCESS/PROCEDURE AUDITED: ________________________________________
AUDIT CHECKLIST USED: (Attach) _____________________________________

AUDIT FINDINGS: (Check one)
_____ Process/Procedure Acceptable
_____ Process/Procedure Conditionally Acceptable (subject to satisfactory completion of the action items listed below)
      Conditions noted:
_____ Process/Procedure Unacceptable (subject to satisfactory completion of the action items listed below)
      Conditions noted:

ACTION ITEM (AI):

AI # | TITLE | ASSIGNED TO | DUE DATE | COMP DATE

CORRECTIVE ACTION:

DISPOSITION: APPROVE / CANCEL / DEFER

Program Manager: _____________________ DATE: ____________

AI CLOSURE:

QA Team Sign-off: _____________________ DATE: ____________


Appendix E: Software Tool Evaluation Checklist

SOFTWARE TOOL EVALUATION

QA Team Member: _________________________

DATE OF EVALUATION: ___________

Software Tool Evaluated:

Methods or criteria used in the evaluation:

Evaluation Results:

Recommended Corrective Actions:

Corrective Action Taken:


Appendix F: Performance Standards Evaluation Checklist

PERFORMANCE STANDARDS EVALUATION

Quality Assurance Team Member: _____________________________
Date of Evaluation: _________________________________________
Time of Evaluation: _________________________________________
Location of Evaluation: ______________________________________

Performance Standard Evaluated:
________ Search Results of new and existing records
________ Refresh and Error Notification
________ Print Results
________ Navigation Results
________ Functionality Report
________ Data Entry/Transaction Results
________ Capacity Results
________ Batch Turnaround Results
________ Other: ____________________________________________

Methods or criteria used in the evaluation:

Evaluation Results:

Recommended Corrective Actions:

Corrective Action Taken:


Appendix G: Pilot Evaluation Checklist

PILOT EVALUATION

Quality Assurance Team Member: ______________________________
Date of Evaluation: _________________________________________
Time of Evaluation: _________________________________________
Location of Evaluation: ______________________________________

Pilot Criteria Evaluated:
________ Staffing requirements
________ User satisfaction
________ Verification of business processes and workflow
________ Verification of system functionality
________ Operability and stability of software
________ Accuracy of conversion of legacy data and manual data
________ Impact of missing and erroneous data
________ Completeness and accuracy of documentation
________ Effectiveness of training methods and materials
________ Impact on workflow and staff productivity
________ Response time and overall system and network performance
________ System infrastructure performance
________ Appropriateness of system, data, and application security
________ Accuracy and performance of system interfaces and integrated processes
________ Periodic and Final Pilot communication

Methods or criteria used in the evaluation:

Evaluation Results:

Recommended Corrective Actions:

Corrective Action Taken:


Appendix H: Implementation Evaluation Checklist

IMPLEMENTATION EVALUATION

Quality Assurance Team Member: ______________________________
Date of Evaluation: _________________________________________
Time of Evaluation: _________________________________________
Location of Evaluation: ______________________________________

Implementation Criteria Evaluated:
________ User satisfaction
________ Verification of system functionality
________ Operability and stability of software and hardware
________ Accuracy of conversion of legacy and manual data
________ Impact of missing or erroneous data
________ Completeness and accuracy of documentation
________ Complete product turnover documents
________ Response time, overall system and network performance
________ System infrastructure performance
________ Security of system, data, and application in place
________ Accuracy and performance of system interface
________ Integrated process (accuracy and performance)
________ Training complete
________ Workflow and staff impact

Methods or criteria used in the evaluation:

Evaluation Results:

Recommended Corrective Actions:

Corrective Action Taken: