TRANSCRIPT
Computer Validation Training
Formal Testing Practices for
Auditable Test Scripts
Dr. Teri Stokes, Director, QA Compliance
Cytel, Inc., Cambridge, Waltham, Geneva
System “Stress” Tester or “Stressed” System Tester?
Context for Computer Validation Testing
[Diagram] Operating environment: GMP Laboratory. The computerized system (Lab. Info. Mgt. Sys.) comprises the computer system (software + hardware) plus people, SOPs, equipment, and the work process. Two layers are distinguished: the Infrastructure Platform (IT/IQ) and the Software Application (User/PQ).
Slide 2
Context for Platform Requirements (PRS)
[Diagram] The same operating-environment diagram, highlighting the Infrastructure Platform (IT/IQ). The PRS captures needs for the IT system process and IT data support.
Slide 3
Context for User Requirements (URS)
[Diagram] The same operating-environment diagram, highlighting the Software Application (User/PQ). The URS captures user flows for the lab work process and the GXP data process.
Slide 4
Application Life Cycle for GXP LIMS
[Diagram] Application life cycle (LIMS User Group):
1. SYSTEM IDEA – Needs Analysis Report
2. SYSTEM PLAN – URS (User Requirements)
3. DESIGN – How? FRS & SDD
4. BUILD – Program or Configure
5. TEST – Verify to SDD & Release (test fit to design)
6. COMMISSION – Validate & re-test after fixes (test fit to work process)
7. OPERATE – GCP Work Process
8. MAINTAIN – Fix & Modify
9. RETIRE – Decommission & Replace
Steps 3-5 belong to the Software Development Life Cycle (SDLC) at the LIMS software supplier.
LIMS platform life cycle (User's IT Dept.): 1. PRS – Configure, 2. Install, 3. Test fit to SLA needs, 4. Operate, 5. Maintain, 6. Retire.
Slide 5
Data Capture – Lab URS & PRS View
[Diagram] Laboratory work & data flow(s): lab methods, SOPs & work instructions, analysts & supervisors, sample handling, reagents, instruments & devices, and capture & analysis systems (PCs) produce defined raw data, calculations, results, verification, and the analytical report. The URS covers the work-process view; the PRS covers the platform view.
Slide 6
User Acceptance at System Go-Live
[Diagram] The same application life cycle, mapped to qualification testing:
1. SYSTEM IDEA – Needs Analysis, RFP & Contract
2. SYSTEM PLAN – What? URS
3. DESIGN – How? FRS & SDD
4. BUILD – Program or Configure
5. TEST – Verify to SDD & Release (OQ – Operational Qualification, performed within the LIMS software supplier's SDLC; users may audit the OQ)
6. COMMISSION – Accept & Validate to URS (PQ – Performance Qualification, by the LIMS user group; test fit to work process)
7. OPERATE – Use & Monitor
8. MAINTAIN – Fix & Modify
9. RETIRE – Decommission & Replace
LIMS platform (User's IT Dept.): 1. PRS – Configure, 2. Install, 3. Test fit to install specs (IQ – Installation Qualification).
Slide 7
OECD GLP – Acceptance Testing
• “…there should be evidence that the system was adequately tested for conformance with acceptance criteria … prior to being put into routine use.”
• “Formal acceptance testing requires the conduct of tests following a pre-defined plan and retention of all testing procedures, test data, test results, a formal summary of testing, and a record of formal acceptance.”
Slide 8
[Diagram] Test Plan → Test Results → Test Report → Formal Release
Requirements are required for testing.
Slide 9
All formal testing requires criteria as a metric or benchmark (e.g. a requirement) to test against. If you don't know your “expected results”, then testing is meaningless and validation is not possible.
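To make the point concrete, a minimal sketch in Python (the step, values, and wording are hypothetical): a recorded observation only becomes a test result when an expected result was written down first.

```python
# Hypothetical test step: meaningful only because the expected
# result is defined before the system is exercised.
test_step = {
    "step_id": "TS.01.03",
    "action": "Enter 3 consecutive invalid passwords for user LAB01",
    "expected_result": "Account LAB01 is locked and an audit entry is written",
}

observed_result = "Account LAB01 locked after 3rd failure; audit entry written 10:02"

# Without test_step["expected_result"], there is no benchmark to
# compare against and the observation above proves nothing.
passed = "locked" in observed_result.lower()
print(f"{test_step['step_id']}: {'PASS' if passed else 'FAIL'} - {observed_result}")
```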
[Diagram] Function & design specifications, platform needs, and work process needs all feed the testing criteria.
Do You Have a Requirement or a Wish?
Slide 10
What Does the User's Process Need? (URS)
Slide 11

Req. ID | User's Work Step | System Role – User Req. | User Acceptance Test
UR01 | Secure manual & automated data entry for Lab Reports | Control access and log off intruders. Verify user & instrument. | Use positive and negative IDs. Log off after 3 failed tries.
UR02 | Track data edits | Audit data changes | Change items. Check screen. Run report.
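As an illustration of how the UR01 acceptance test above could be mechanized, here is a minimal Python sketch; the login() stub, its signature, and the lock-out behaviour are assumptions standing in for the real LIMS.

```python
# Sketch of the UR01 acceptance test: positive and negative IDs,
# lock-out after 3 failed tries. login() is a stand-in stub, not
# the actual LIMS call.
FAIL_LIMIT = 3

class AccountLocked(Exception):
    pass

_failures = {}

def login(user_id: str, password: str, valid: dict) -> bool:
    """Stubbed system under test: counts failures, locks after the 3rd."""
    if _failures.get(user_id, 0) >= FAIL_LIMIT:
        raise AccountLocked(user_id)
    if valid.get(user_id) == password:
        _failures[user_id] = 0
        return True
    _failures[user_id] = _failures.get(user_id, 0) + 1
    return False

valid_ids = {"LAB01": "s3cret"}

assert login("LAB01", "s3cret", valid_ids)       # positive ID accepted
assert not login("LAB01", "wrong1", valid_ids)   # negative IDs rejected...
assert not login("LAB01", "wrong2", valid_ids)
assert not login("LAB01", "wrong3", valid_ids)   # 3rd failure triggers lock-out
try:
    login("LAB01", "wrong4", valid_ids)          # 4th try must be refused
except AccountLocked:
    print("UR01: lock-out after 3 failed tries observed")
```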
How Can Technology Help? (FRS)
Slide 12

Req. ID | URS Ref. | Functional Requirement | System Test
FR01 | UR01 (secure manual & auto data entry) | Alpha/numeric PW control. Third-fail lock-out. Auto ID check. | Use positive and negative IDs. Fail 4th try.
FR02 | UR01 (site & ID match) | Verify user, site, & study IDs. | Use positive and negative IDs.
FR03 | UR02 (track data edits) | Audit trail, screen marker on changed item, report. | Change items in fields. Check screen. Run report.

Functional Requirements Specification
How Is the System to Be Built? (SDD)
Slide 13

Req. ID | System Function | System Design Description (SDD) | Unit Test
FR01 | Password control | User role/access schema. Lock-out after 3rd fail. | Use positive and negative IDs. Time out 4th negative.
FR02 | Study ID verification | User/Auto/Site/Study access schema. | Use positive and negative study IDs.
FR03 | Audit trail | Role/privilege schema. Screen mark. Report items. | Change roles/items per schema. Run report.

Software/System Design Description
CSV Package Model
[Diagram] Standard CSV package, prepared and maintained by a CSV Package Team for IQ, OQ, or PQ. The CSV Package Plan (Verification/Validation Plan) covers three controls:
1. System Control – Requirements & specifications for software, system & services (SLA); control of the software & platform environment; control of system & software changes; audit of the CSV package (QMS).
2. Human Control – SOPs, training & materials for human work with the system.
3. Testing Control – Test plans (startup & ongoing); test cases, scripts, data & result logs; test summary reports. Tests trace to requirements.
The package rolls up into the CSV Package Summary Report (Verification/Validation Summary Report).
QMS = Quality Management System. SLA = Service Level Agreement.
Slide 14
Part 11 Guidance*: GXP Testing Principles
5.1 System Requirements Specifications: Established, documented end-user requirements are essential (GXP, Part 11, work process, scanning, scalability, and operating-environment requirements).
5.2 Documentation of Validation Activity: Management-approved plan(s), procedures, and report(s).
5.3 Equipment Installation: Qualify installation prior to testing.
Slide 15 * Retracted Draft Guidance 2002
Part 11 Guidance*: GXP Testing Principles
5.4 Dynamic Testing: 5.4.1 – Test conditions, simulation tests, and live, user-site tests. 5.4.2 – Software testing to include structural, functional, and program-build testing. 5.4.3 – Quantifiable tests recorded in quantified, not pass/fail, terms.
5.5 Static Verification Techniques: Document and code inspections, walk-throughs, and technical reviews.
5.6 Extent of Validation: System risk to product safety, efficacy, and quality; system risk to data integrity, authenticity, and confidentiality; system complexity.
Slide 16 * Retracted Draft Guidance 2002
Part 11 Guidance*: GXP Testing Principles
5.7 Independence of Review: CSV to be performed by persons other than the system developers.
5.8 Change Control (Configuration Management): Systems to control changes and evaluate the extent of revalidation needed. Regression testing is a key tool.
Extent of validation testing is based on:
– System risk to product safety, efficacy, and quality
– System risk to data integrity, authenticity, and confidentiality
– System complexity
Slide 17 * Retracted Draft Guidance 2002
Different Kinds of Testing
• Static or Desk Testing: Examination of documents and procedures associated with a computerized system. No interaction with the system itself.
• Dynamic Testing: Direct interaction with a system using keyboard, mouse, wand, automated tools, etc.
• Code Review: Peer examination of software code for its compliance to prescribed coding and engineering standards.
• Walk-through Review: Group review of a work product for its compliance to accepted standards and requirements.
Slide 18
Focus for OQ, IQ, and PQ Testing
• Operational Qualification (OQ): Provision of documented evidence by system supplier that the software reliably operates as expected to its functional and design specifications.
• Installation Qualification (IQ): Provision of documented evidence by IT that the infrastructure and network platform reliably operates as intended to support regulated software.
• Performance Qualification (PQ): Provision of documented evidence by end users that a software application reliably performs as intended in a regulated work process.
Slide 19
Stressed System Tester
Test Step #1 – Develop Requirements
CDM work process activities: CRF design → data entry → data edits → lab data transfers → CRO/SAS data sets → final reports → submit data.
Slide 20

Work Step | Input Doc. | System Step | Output Doc. | SOP, WI | Role(s) Responsible
(1) CRF design | Protocol | N/A | CRF/NCR | DM.SOP 001 | Designer
(2) Data entry | CRF/NCR | Data Entry Menu | Data in Database | DM.WI 06 | DE.01/02
Step #2 – Map System to Work Process
1. The team agrees on a common workflow for use of the system in the work process (approved URS).
2. Work process steps interacting with the system are identified.
3. For each work process step, positive and negative challenges are identified as test step procedures.
[Diagram] Workflow actions (URS) → work process steps → challenges.
Slide 21
Step #2 – Map System to Work Process (cont.)
4. Logical work process flows are organized into Test Cases, and related activity steps into specific Test Scripts.
5. Test Scripts and related documents (printouts, etc.) are managed with large envelopes.
[Diagram] Test Cases → Test Scripts → Test Envelope holding the test script & printouts.
Slide 22
User Test Case Descriptions (PQ)
To include normal, problem, and stress conditions in the user's work process environment.
Slide 23
1. Work Area Preparedness – SOP, WI, Manual, CVs
2. System Setup/Admin – User Profiles, DB schema
3. Work Process – Activity A (vanilla run)
4. Work Process – Activity B (chocolate issues)
5. Work Process – Activity C (strawberry stresses)
6. Special Challenges – Multi-user, Business Continuity
Server Test Case Descriptions (IQ)
To include normal, problem, and stress conditions in the IT/IS environment.
Slide 24
1. Configure and install Hardware & Network
2. Operating System (O/S) & Software Tools
3. Database Engine (DB) & Query Tools
4. Network Operations – LAN/WAN
5. Platform Routine Backup & Recovery HW/SW/DB
6. Platform Disaster Recovery for System & Data
Test Step #3 – Develop a Test Plan
[Diagram] Standard user's CSV package, prepared and maintained by a user department(s) team: 1. System Control, 2. Human Control, 3. Testing Control. Test Plan(s) (startup & ongoing) drive Test Cases, Scripts, Data & Result Logs, which trace to requirements and roll up into Test Summary Report(s).
Slide 25
IEEE Format for a Software Test Plan
Software Test Plan Outline – IEEE Std. 829-1983
1. Test plan identifier
2. Introduction
3. Test items
4. Features to be tested
5. Features not to be tested
6. Approach
7. Item pass/fail criteria
8. Suspension criteria & resumption requirements
Slide 26
Software Test Plan Outline - IEEE Std. 829-1983
9. Test deliverables
10. Testing tasks
11. Environmental needs
12. Responsibilities
13. Staffing and training needs
14. Schedule
15. Risks and contingencies
16. Approvals
(IEEE Tel: 800-678-4333 or Search Internet for IEEE Standards)
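For teams that keep plans in a structured form, the 16 sections above map naturally onto a fill-in template. A sketch in Python, with field names paraphrasing the outline (the class and its fields are illustrative, not part of the standard):

```python
# Fill-in skeleton for the 16 IEEE Std. 829-1983 test plan sections.
from dataclasses import dataclass, field

@dataclass
class SoftwareTestPlan:
    test_plan_identifier: str                  # 1
    introduction: str = ""                     # 2
    test_items: list = field(default_factory=list)                 # 3
    features_to_be_tested: list = field(default_factory=list)      # 4
    features_not_to_be_tested: list = field(default_factory=list)  # 5
    approach: str = ""                         # 6
    item_pass_fail_criteria: str = ""          # 7
    suspension_and_resumption: str = ""        # 8
    test_deliverables: list = field(default_factory=list)          # 9
    testing_tasks: list = field(default_factory=list)              # 10
    environmental_needs: str = ""              # 11
    responsibilities: str = ""                 # 12
    staffing_and_training: str = ""            # 13
    schedule: str = ""                         # 14
    risks_and_contingencies: str = ""          # 15
    approvals: list = field(default_factory=list)                  # 16

# Hypothetical usage: unfilled sections are immediately visible.
plan = SoftwareTestPlan(test_plan_identifier="LIMS.PQ.TP.001")
```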
IEEE Format for a Software Test Plan
Slide 27
[Diagram] Schedule graphic: task list, role responsible, due date.
Test Step #4 – Develop a Traceability Matrix
No testing project is too big or too small for a trace matrix. Always trace Test Scripts to Requirements to ensure that all requirements have been sufficiently tested.
Slide 28

Trace Matrix – System X (column headings): Req. ID | Requirement Description | Test Topic | Test ID
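One benefit of a trace matrix is that completeness can be checked mechanically. A minimal Python sketch, assuming the matrix is available as requirement/test-ID pairs (all IDs below are illustrative):

```python
# Trace check: every requirement must map to at least one test script.
requirements = {
    "UR01": "Secure manual & automated data entry",
    "UR02": "Track data edits",
}
trace = [
    ("UR01", "PQ.TS.1.1"),
    ("UR01", "PQ.TS.1.2"),
    # UR02 deliberately left untraced to show the gap report
]

covered = {req_id for req_id, _ in trace}
for req_id in sorted(set(requirements) - covered):
    print(f"GAP: {req_id} ({requirements[req_id]}) has no test script")
```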
Requirements Format with Trace Matrix
Slide 29

Pharmacovigilance System – URS, Data Entry Requirements

DEUR ID | Requirement Description | User Test Topics | Test Script ID
DEUR.10 | The system provides automatic narrative generation based on data entered. | Enter data and check for narrative generation in MEDWATCH & CIOMS forms. | PV.PQ.TS.4.5
DEUR.11 | The system supports automatic calculations. | Enter patient case data; dependent fields are calculated, e.g. after date of birth & event date are entered, the field for age at onset of event is automatically calculated. | PV.PQ.TS.4.8
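The DEUR.11 calculation lends itself to a worked example. A Python sketch of the expected age-at-onset arithmetic, written as an independent check rather than the vendor's actual code:

```python
# DEUR.11-style derived field: age at onset from date of birth and
# event date. Illustrative reference calculation for the test step.
from datetime import date

def age_at_onset(dob: date, event: date) -> int:
    years = event.year - dob.year
    # subtract one year if the birthday has not yet occurred that year
    if (event.month, event.day) < (dob.month, dob.day):
        years -= 1
    return years

assert age_at_onset(date(1960, 6, 15), date(2010, 6, 14)) == 49
assert age_at_onset(date(1960, 6, 15), date(2010, 6, 15)) == 50
```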
Who Is Who in Formal Testing?
Test Coordinator – writes test plan, test cases, and prepares testing materials. Trains testers & witnesses, manages testing process, and tracks results and anomalies. Writes conclusion per case. Writes test summary report. Organizes test records in a test records binder.
Slide 30
Roles &
Responsibilities
Who Is Who in Formal Testing?
Script author – can never be tester or approver of the same script, but can witness testing of that script.
Script approver – independent reviewer of the script for content and strategy.
Tester – performs the test, records the system response, and signs the script for accurate recording of the system response.
Witness – verifies that the tester is prepared and ready to perform the test, checks the presence of final test records, and signs the script for compliance to the testing practices SOP.
Slide 31
Roles &
Responsibilities
Step #5 – Prepare Test Script Materials
[Diagram] Standard Test Script Envelope (a large mailing envelope), prepared by the Test Coordinator & Package Team. Contents: test script step procedures; incident form (red note) & WI; URS trace & script pass/fail; tester/witness signatures; witness checklist; result logs & system issues; test data form, SOP, work instructions; support materials, result printouts.
Slide 32
Test Script Author Guidelines – The High 5
1. Keep Tester Alert – author scripts that can be executed within 30-50 minutes.
2. Rule of Three – test key functions 3 times using a variety of challenges: normal, problem A, and issue B.
3. Trace to Requirements – identify requirements being addressed by each script.
4. System Output – have at least one screen print or report per test script. Identify label info needed for output.
5. More than Pass/Fail – have tester write out an observed system response.
Slide 33
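Guideline 5 above is worth illustrating: a result-log entry records the observed system response in full, never a bare tick or P/F. A minimal Python sketch with hypothetical values:

```python
# One result-log entry: the observed response is written out in full,
# alongside the expected result, tester initials, and date.
from datetime import date

log_entry = {
    "script_id": "PQ.TS.4.5",
    "step": 3,
    "action": "Save case and open narrative tab",
    "expected": "Narrative generated with patient initials and event term",
    "observed": ("Narrative displayed: 'Patient J.D., 49, experienced "
                 "headache on 15-Jun-2010...' Screen print attached."),
    "tester_initials": "TS",
    "date": date.today().isoformat(),
}

# Bare abbreviations are exactly what the guideline forbids.
assert log_entry["observed"] not in ("Y", "N", "P", "F", "\u2713")
```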
Step #6 – The Do's in Formal Testing
1. Check script, logs, data, & testing area.
2. Record all system responses as they occur.
3. Use only indelible ink to record results.
4. Correct with a single line through a record & the new result next to it. Initial, date & explain.
5. Draw a single line through unused log spaces.
6. Label, sign & date all printouts or CDs/diskettes.
[Diagram] System setup → system testing → printouts → T.S. Envelope.
Slide 34
Step #6 – The Don'ts in Formal Testing
1. Don't use pencil or other erasable media.
2. Don't correct by using white-out or scribbling over to obliterate a prior result.
3. Don't use check marks, dittos, Y/N, or other abbreviations. Write results & comments in full.
4. Don't leave large blank spaces in result logs. Line through.
5. Don't forget to sign & date all output documents & logs.
[Diagram] System testing → printouts → T.S. Envelope.
Slide 35
Witness Participation in Testing
1. Checks contents of the Test Script Envelope with the Tester for completeness and understanding.
2. Checks system setup with the Tester for logon access. Watches the start of the first testing step.
3. After testing, the T.S. Envelope is re-examined. Checks the Test Script and printouts for completeness, signatures, dates, and labels.
[Diagram] System setup → system testing → test script & printouts → T.S. Envelope.
Slide 36
Step #7 – Reporting on Test Results
[Diagram] Standard user's CSV package, prepared and maintained by the site user team: Test Plan(s) trace through system control and human control to the Test Summary Report.
Slide 37
Test Plan Summary Report
1. Test Summary Report Identifier - Unique ID traceable to associated Test Plan.
2. Summary - Describes items tested (application version), test environment (platform system), and test documentation used (Test Cases, Test Scripts, T.S. Envelope contents).
3. Variances - States any deviations from Test Plan or Test Scripts and reason why.
4. Comprehensive Assessment - Discusses assumptions and limits to the scope of testing. Were the scope of testing and the results obtained sufficient to assess system reliability? Discuss the reasons for the limits chosen.
IEEE Std. 829-1983 Adapted
Slide 38
5. Summary of Results - Gives table of testing results per Test Case. Table of anomalies and their resolutions. List of outstanding issues and risks (unresolved anomalies).
6. Evaluation - Pass/Fail conclusion based on test results and criteria in the Test Plan.
7. Summary of Activities - Tester/Witness staffing, task list from Test Plan with updated status.
8. Approvals - Names, titles, signatures, dates and meaning of signatures.
9. Appendix - Table of Contents list for test documentation and traceability to requirements.
Test Plan Summary Report
IEEE Std. 829-1983 Adapted
Slide 39
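The Summary of Results table in section 5 can be compiled directly from the result logs. A minimal Python sketch with illustrative data:

```python
# Per-Test-Case result counts plus outstanding anomalies, built from
# illustrative result-log records.
from collections import Counter

results = [
    ("TC1", "pass"), ("TC1", "pass"),
    ("TC2", "pass"), ("TC2", "fail"),
    ("TC3", "pass"),
]
anomalies = [("AN-01", "TC2", "resolved"), ("AN-02", "TC2", "open")]

per_case = Counter(results)
for case in sorted({c for c, _ in results}):
    print(case, "pass:", per_case[(case, "pass")], "fail:", per_case[(case, "fail")])

# Unresolved anomalies become the list of outstanding issues and risks.
print("Outstanding issues:", [a[0] for a in anomalies if a[2] == "open"])
```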
Step #8 – Store Test Results for Audit
[Diagram] Site Test Binder (Site A), prepared by the Test Coordinator at the site. Contents:
– SOP for Testing; Tester/Witness training records
– Test Case 1 (SOP & Document Review): test scripts, test data & result logs; Test Case 1 Summary Report
– Test Case 2 (System Admin Workflow, step by step): test scripts, test data & result logs; Test Case 2 Summary Report
– Test Case 3 (Work Process Workflow(s), step by step): test scripts, test data & result logs; Test Case 3 Summary Report
– Site Test Summary Report
Slide 40
Step #9 – System Ongoing Test Plan (OTP)
Over time, systems change. An Ongoing Test Plan sets up the process for maintaining a validated status as change occurs. All change-control testing operates under the Ongoing Test Plan until the next full version of the software is installed.
Slide 41
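One way an Ongoing Test Plan can scope change-control testing is to walk the trace matrix from changed requirements to the scripts that must be re-run. A minimal Python sketch (IDs illustrative):

```python
# Regression scoping under the OTP: changed requirements, taken from
# the change-control record, select the scripts to re-execute.
trace = {
    "UR01": ["PQ.TS.1.1", "PQ.TS.1.2"],
    "UR02": ["PQ.TS.2.1"],
}
changed_requirements = ["UR02"]  # from the change-control record

rerun = sorted({s for r in changed_requirements for s in trace[r]})
print("Scripts to re-execute under the OTP:", rerun)
```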
Ongoing Testing and Change Control
[Diagram] The application life cycle (1. SYSTEM IDEA through 9. RETIRE, as in Slide 7) with a change symbol: a change enters at 8. MAINTAIN (Fix & Modify), flows through the software supplier's SDLC (Design, Build, Test) and the platform steps (1. Change, 2. Install, 3. Test), and is tested by the system user for fit to design & work before returning to 7. OPERATE (Use & Monitor).
Slide 42
Step #10 – Internal Audit of Testing
Audit checkpoints:
• Test Plan was approved with or after the Validation Plan approval date and before the Test Script approval dates.
• Test Plan describes the system to be tested, gives a specific strategy for testing, and identifies tasks and roles responsible.
• All issues arising are recorded, tracked, and resolved.
• Repeat testing is performed using a new copy of a script, and the run number is incremented.
Slide 43
Internal Audit of Testing (cont.)
Audit checkpoints:
• Test Script author is not tester or approver for the same script.
• All printouts are labelled with tester ID, date, and step ID.
• All log entries are made in indelible blue or black ink.
• No abbreviations (P/F, Y/N), ditto or check marks were used.
• Signature page identifies names, initials, signatures, and Tester/Witness roles.
• Test Script identifies the requirements being tested.
Slide 44
Internal Audit of Testing (cont.)
Audit checkpoints:
• Test Summary Report clearly describes what happened, how problems were handled, who was responsible, and how the Test Plan was followed, or what deviations were made and why.
• Test Summary Report should show that test execution was consistent with the Test Plan strategy.
• PQ testing uses GXP work process SOPs and work instructions to test their suitability for working with the system.
• IQ testing uses IT Dept. SOPs and work instructions to test their suitability.
Slide 45
SW Supplier's View of Product Life Cycle
[Diagram] The supplier's Software Development Life Cycle (SDLC) within the application life cycle:
1. SYSTEM IDEA – Needs Analysis Report
2. SYSTEM PLAN – MRS (Market Requirements Specification)
3. DESIGN – FRS (Functional Req. Spec.) and SDD (SW Design Description)
4. BUILD – Program to SDD using SW engineering SOPs
5. TEST – Requirements & design review; code review; peer, unit, integration, system & regression testing
6. COMMISSION – Validate & retest after fixes (fit?)
7. OPERATE – GCP work process (user group)
8. MAINTAIN – Fix & Modify
9. RETIRE – Decommission & Replace
The platform system sits with the IT/IS dept. under an SLA (Service Level Agreement) supporting the work process.
Slide 46
OQ Test Steps – Project Smoke Test
[Diagram] SDLC testing process: MRS to FRS → SDD to Milestone → Code Review → Unit Test → Smoke Test.
– Code Review: developer peer tests to SDD; check that SDD & coding standards are met; identify errors & deviations.
– Unit Test: developer tests to the milestone SDD.
– Smoke Test: project regression test to ensure a safe SQC build.
Reference documents: Trace Matrix, Coding Standards, Milestone Definition, Master Test Plan.
Slide 47
Smoke Test
The smoke test should exercise the entire system from end to end. It does not have to be exhaustive, but it should be capable of exposing major problems. The smoke test should be thorough enough that if the build passes, you can assume that it is stable enough to be tested more thoroughly. The smoke test is the sentry that guards against deteriorating product quality and creeping integration problems.
Slide 48
At a minimum, a "good" build should compile all files, libraries, and other components successfully; link all files, libraries, and other components successfully; not contain any showstopper bugs that prevent the program from being launched or that make it hazardous to operate; and pass the smoke test.
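A minimal smoke test in the spirit described above might look like the following Python sketch; the lims command and the listed checks are hypothetical stand-ins for a real system's launch and core workflow:

```python
# End-to-end smoke test sketch: not exhaustive, but each check can
# expose a showstopper. The "lims" CLI is a hypothetical stand-in.
import subprocess

def smoke_test() -> bool:
    checks = [
        ("application launches", lambda: subprocess.run(
            ["lims", "--version"], capture_output=True).returncode == 0),
        # further checks would log in, open a study, save a record...
    ]
    for name, check in checks:
        try:
            ok = check()
        except Exception:
            ok = False
        print(f"{'PASS' if ok else 'FAIL'}: {name}")
        if not ok:
            return False  # build is not stable enough for deeper testing
    return True

if __name__ == "__main__":
    smoke_test()
```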
Software QC Test Build
Ref.: "Daily Build and Smoke Test," Best Practices column, IEEE Software, Vol. 13, No. 4, July 1996.
Software Quality Control Testing
[Diagram] SDLC testing process: MRS to FRS → SDD to Milestone → Code Review → Unit Test → Smoke Test → SW QC Testing (function, module, system, interface, regression, release).
– Code Review: developer peer tests to SDD; check that SDD & coding standards are met; identify errors & deviations.
– Unit Test: developer tests to the milestone SDD.
– Smoke Test: project regression test to ensure a safe SQC build.
Reference documents: Trace Matrix, Coding Standards, Milestone Definition, Master Test Plan.
Slide 50
Product Release Testing
[Diagram] SDLC release testing process: MRS to FRS → Alpha Test → Beta Test → GMR* Test.
– Alpha Test: internal check that all sections are working together.
– Beta Test: trusted-customer check-out of the product in pre-release test mode.
– GMR Test: final formal testing by SW QC to authorize commercial product release.
SQC testing levels: function, module, system, interface, regression, release.
Reference documents: Trace Matrix, Master Test Plan, Summary Report.
* GMR = Gold Master Release
Slide 51
Patch/Upgrade Release Testing
[Diagram] SDLC release testing process for patches and upgrades: MRS to FRS → Alpha Test → GMR* Test (no beta).
– Alpha Test: internal check that all sections are working together with the new code.
– GMR Test: final formal testing by SW QC to authorize commercial product release.
SW QC testing levels: function, module, system, interface, regression, release.
Reference documents: Trace Matrix, Master Test Plan, Summary Report.
* GMR = Gold Master Release
Slide 52
User Audit of SW Supplier Testing (OQ)
To include normal, problem, and stress conditions.
Slide 53
1. Intended unit, module, or system functions
2. Interface(s) to other unit, module, or system
3. Database (DB) interactions
4. Robustness & error handling
5. Logic pathway & security challenges
6. Regression analysis & requirements match
7. Audit trail integrity
Software Supplier's (SDLC) OQ Package
[Diagram] Standard SW supplier's CSV package, prepared by an internal or external software supplier team. The OQ Verification Plan covers three controls and rolls up into an OQ Verification Summary Report:
1. System Control – MRS, FRS, SDD, code, SDLC documents, support contract (SLA), (escrow); code & tools management logs; controlled SW development platform; tools IQ; SW upgrade plan, bug tracking, audit logs; QMS with SQC & QA.
2. Human Control – SW engineering SOPs, programmer training, product manuals, user help desk.
3. Testing Control – Master Test Plan & sub test plans (U,F,S,M,S,I,R*); test cases, scripts, data & result logs; test summary report (per plan). Tests trace to requirements.
*U,F,S,M,S,I,R = Unit, Function, Smoke, Module, System, Interface, Regression
Slide 54
OQ Scripts versus PQ Scripts
OQ Test Scripts (verified SW):
• Software design focus
• Feature and function tests
• Often automated scripts
• Code reviews, unit tests, smoke tests
• Simulated acceptance tests based on hypothetical use of features and functions
PQ Test Scripts (validated SW):
• User's work process focus
• Normal work and data run
• Problem work and data issues
• No tick boxes or P/F
• Record from screen or printout of SW response
• Use work actions to exercise SW interface links and access to devices on the network
Slide 55
IQ Scripts versus PQ Scripts
IQ Test Scripts (qualified test bed):
• HW, SW, network focus
• Standard review items
• No tick boxes or P/F
• Record or print out configuration items
• Open-and-print test of related SW, interface links, and access to devices on the network
PQ Test Scripts (validated SW):
• User's work process focus
• Normal work and data run
• Problem work and data issues
• No tick boxes or P/F
• Record from screen or printout of SW response
• Use work actions to exercise SW interface links and access to devices on the network
Slide 56
Formal Testing Major Themes
MANAGEMENT CONTROL
• Formal Testing Practices SOP
• Approved Test Plans, Test Scripts, & Test Summary Reports
• Anomaly Tracking Process
• User & Support Documents
SYSTEM RELIABILITY
• Requirements & Specifications
• Trace Matrix – Test IDs to Requirements IDs
• Defined Acceptance Criteria in Test Procedures
• Limits, Logic & Problem Testing
DATA INTEGRITY & PRIVACY
• Tester & Witness Signatures
• Result Logs in Indelible Ink
• IQ for Automated Testing Tools
• Security & Disaster Recovery Tests
• Audit Trail Tests
AUDITABLE QUALITY
• Approved Plans & Reports
• Traceable Test Coverage for All Requirements & Specifications
• Documented Anomaly Resolution
• Independence of Testing Process
Slide 57
Common Sense Computer Validation
Computer Validation Training
Thank You!
Tak – Tack – Takk
Mahalo
Ngiyabonga – Asante!
Dankie – Danku – Danke
[Photo] Taxi Ride – Bangalore 2012
Any Questions or Comments?
CSV Package XQ Definitions
• Operational Qualification (OQ) – Documented evidence that a system operates as intended throughout its expected range, per design/functional specifications. (Supplier)
• Installation Qualification (IQ) – Documented evidence that all system components are installed to supplier instructions & user requirements. (Infrastructure/IT)
• Performance Qualification (PQ) – Documented evidence that a system operates as intended in the user's work process. (User – start-up & ongoing)
Slide 60