Testing: Verification and Validation

David Wettergreen, School of Computer Science, Carnegie Mellon University. Systems Engineering, Carnegie Mellon © 2006


Page 1: Testing: Verification and Validation

Testing: Verification and Validation

David Wettergreen
School of Computer Science
Carnegie Mellon University

Systems Engineering, Carnegie Mellon © 2006

Page 2: Testing: Verification and Validation


Definitions

Error

A problem at its point of origin

Example: coding problem found in code inspection

Defect

A problem beyond its point of origin

Examples: a requirements problem found in design inspection; a system failure during deployment

Page 3: Testing: Verification and Validation


Cost of Defects

According to NASA analysis:

The cost of finding a defect during system test versus finding it during design:
• Dollar cost is 100:1
• Cost in time is 200:1

Page 4: Testing: Verification and Validation


Cost of Defects

Wow!

If finding a defect in a requirement costs $100, finding it in test costs $10,000

On average, design and code reviews reduce the cost of testing by 50-80%, including the cost of the reviews

Page 5: Testing: Verification and Validation


Cost of Repair

Relative cost to repair a defect, by the phase in which it is found [Boehm81]:

Phase          Relative cost
Requirements     1
Design           5
Build           20
Test            50
Maintain       100

Page 6: Testing: Verification and Validation


Quality and Profit

Which is better?

Time-to-profit may be improved by more investment in build quality early in the process

[Chart: revenue and support-cost curves for two release scenarios, comparing time-to-market with time-to-profit] [Fujimura93]

Page 7: Testing: Verification and Validation


Product Stability

Measure defects (correlated with delivered unit)
• System, product, component, etc.
• Normalize defects to importance/criticality

[Chart: number of defects over time]
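One way to normalize might look like the following minimal sketch; the severity labels and weights are assumptions for illustration, not values from the slides.

```python
# Minimal sketch of criticality weighting; labels and weights are assumed.
CRITICALITY_WEIGHT = {"critical": 10, "major": 3, "minor": 1}

def normalized_defect_count(defects):
    """defects: iterable of criticality labels, e.g. ["major", "minor"]."""
    return sum(CRITICALITY_WEIGHT[d] for d in defects)

print(normalized_defect_count(["critical", "minor", "minor"]))  # -> 12
```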

Page 8: Testing: Verification and Validation


Definitions

From Webster’s

Verify 1) to confirm or substantiate in law by oath 2) to establish the truth, accuracy, or reality of

Validate 2) to support or corroborate on sound or authoritative basis <as in experiments designed to confirm an hypothesis>

Page 9: Testing: Verification and Validation


What’s the difference?

Verification is determining whether the system is built right: correctly translating design into implementation

Validation is determining whether the right system is built: does the implementation meet the requirements?

Page 10: Testing: Verification and Validation


Verification and Validation

Verification is applied at each transition in the development process

Validation is applied with respect to the results of each phase either for acceptance or process improvement.

Inspection for Verification

Testing for Validation

Page 11: Testing: Verification and Validation


Verification and Validation

[Diagram: User → Requirements → Architecture → Design → Product]

What is the practical difference between verification and validation?

Page 12: Testing: Verification and Validation


Verification and Validation

[Diagram: User → Requirements → Architecture → Design → Product, with verification at each transition]

Page 13: Testing: Verification and Validation


Verification and Validation

[Diagram: as above, adding user validation of the product]

Page 14: Testing: Verification and Validation


Verification and Validation

[Diagram: as above, adding requirements validation]

Page 15: Testing: Verification and Validation


Verification and Validation

[Diagram: as above, adding architectural validation and design validation: verification at each transition, validation of each phase's results]

Page 16: Testing: Verification and Validation


Real-Time Systems

Not only do we have the standard software and system concerns…

…but performance is crucial as well

[Diagram: a real-time system has temporal, structural, and functional dimensions]

Page 17: Testing: Verification and Validation


Real-Time Systems

In real-time systems (soft, firm, hard), correctness of function depends upon the ability of the system to be timely

In real-time systems, correct functionality may also depend on:

reliability, robustness, availability, security

If the system fails to meet any of these constraints, it may be defective
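As a sketch of what "timely" means as a testable property, a latency assertion could look like the following; the task, workload, and 10 ms deadline are illustrative assumptions, not values from the slides.

```python
# Minimal sketch of a timeliness test with assumed task and deadline.
import time

DEADLINE_S = 0.010  # assumed hard deadline: 10 ms

def control_step():
    """Stand-in for the real-time task under test."""
    sum(range(1000))

def test_meets_deadline():
    start = time.perf_counter()
    control_step()
    elapsed = time.perf_counter() - start
    assert elapsed <= DEADLINE_S, f"missed deadline: {elapsed * 1000:.2f} ms"

test_meets_deadline()
```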

Page 18: Testing: Verification and Validation


Requirements Inspections

Biggest potential return on investment

Attributes of a good requirements specification:
• Unambiguous
• Complete
• Verifiable
• Consistent
• Modifiable
• Traceable
• Usable

Page 19: Testing: Verification and Validation


Requirements Inspections

Inspection objectives:
• Each requirement is consistent with and traceable to prior information
• Each requirement is clear, concise, internally consistent, unambiguous, and testable
• Are we building the right system?

Page 20: Testing: Verification and Validation


Design Inspection

Opportunity to catch problems early.

Objectives:
• Does the design address all the requirements?
• Are all design elements traceable to specific requirements?
• Does the design conform to applicable standards?
• Are we building the system correctly?

Page 21: Testing: Verification and Validation


Test Procedure Inspections

Focus on verifying the validation process: does the test validate all the requirements using a formal procedure with predictable results and metrics?

Objectives:

Do validation tests accurately reflect requirements?

Have validation tests taken advantage of knowledge of the design?

Is the system ready for validation testing?

Page 22: Testing: Verification and Validation


Requirements Validation

Check:

Validity - Does the system provide the functions that best support customer need?

Consistency - Are there any requirements conflicts?

Completeness - Are all required functions included?

Realism - Can requirements be implemented with available resources and technology?

Verifiability - Can requirements be checked?

Page 23: Testing: Verification and Validation


Testing

Testing is an aspect of Verification and Validation

Testing can verify correct implementation of a design

Testing can validate accomplishment of requirement specifications

Testing is often tightly coupled with implementation (integration and evaluation), but it is also important to production

Page 24: Testing: Verification and Validation


When to Test

To test, you need something to evaluate:
• Algorithms
• Prototypes
• Components/Sub-systems
• Functional Implementation
• Complete Implementation
• Deployed System

Testing can begin as soon as there’s something to test!

Page 25: Testing: Verification and Validation


Testing Participants

[Diagram: System Engineering, Test Engineering, and Component Engineering linked by their testing products: test requirements and evaluation, test architecture, test planning, test measurements, test conduct and analysis, test equipment, and requirements] [Kossiakoff03]

Page 26: Testing: Verification and Validation


Testing Strategies

White Box (or Glass Box) Testing
• Component-level testing where internal elements are exposed
• Test cases are developed with developers’ knowledge of critical design issues
• Functional testing for verification

Black Box Testing
• Component-level testing where the structure of the test object is unknown
• Test cases are developed using specifications only
• Operational testing for validation
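A small illustration of the two strategies, using a hypothetical clamp() utility: the black-box cases come from the specification alone, while the white-box case targets an internal branch the developer knows about.

```python
# Illustrative example: clamp() is a hypothetical component under test.
def clamp(x, lo, hi):
    if lo > hi:  # internal guard known only to a white-box tester
        raise ValueError("lo > hi")
    return max(lo, min(x, hi))

# Black-box cases: derived from the specification alone
assert clamp(5, 0, 10) == 5     # in range, passes through
assert clamp(-1, 0, 10) == 0    # clipped to lower bound
assert clamp(99, 0, 10) == 10   # clipped to upper bound

# White-box case: chosen with knowledge of the internal error branch
try:
    clamp(5, 10, 0)
    raise AssertionError("expected ValueError for inverted bounds")
except ValueError:
    pass
```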

Page 27: Testing: Verification and Validation


Black Box Testing

Positive Tests

Valid test data is derived from specifications
• Both high- and low-probability data are used
• Tests reliability

Negative Tests

Invalid test data is derived by violating specifications
• Tests the robustness of the test object

Both kinds of tests, with both high- and low-probability events, are needed to develop statistical evidence for reliability and robustness
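A sketch of positive and negative test data, assuming a hypothetical parse_temperature() whose specification accepts integer strings from -40 to 125:

```python
# Hypothetical spec: parse_temperature() accepts integer strings
# from "-40" to "125" (degrees Celsius). Name and range are assumptions.
def parse_temperature(text):
    value = int(text)  # raises ValueError on malformed input
    if not -40 <= value <= 125:
        raise ValueError("out of range")
    return value

# Positive tests: valid data from the specification (exercise reliability)
for text, expected in [("0", 0), ("125", 125), ("-40", -40)]:
    assert parse_temperature(text) == expected

# Negative tests: data violating the specification (exercise robustness)
for bad in ["126", "-41", "12.5", "abc", ""]:
    try:
        parse_temperature(bad)
        raise AssertionError(f"accepted invalid input: {bad!r}")
    except ValueError:
        pass
```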

Page 28: Testing: Verification and Validation


Testing Strategies

Test Envelopes

[Diagram: test envelope over two parameters X and Y, bounded by Min X, Max X, Min Y, and Max Y, with normal, boundary, and abnormal regions]

Given a behavior with two parameters, we can establish the test envelope

Useful for identifying boundary conditions

Page 29: Testing: Verification and Validation


Test Envelopes

Boundary conditions define positive and negative tests

Test cases should include:
• High-probability zones in the normal region
• High-probability zones in the abnormal region
• Low-probability zones in the abnormal region if the outcome is catastrophic
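A sketch of how test points might be enumerated from such an envelope; the bounds and epsilon here are hypothetical.

```python
# Sketch: generating positive and negative test points around a
# two-parameter envelope. The bounds and epsilon are hypothetical.
import itertools

MIN_X, MAX_X = 0.0, 100.0
MIN_Y, MAX_Y = -50.0, 50.0
EPS = 0.1  # step just outside the envelope

def envelope_points():
    xs = [MIN_X - EPS, MIN_X, (MIN_X + MAX_X) / 2, MAX_X, MAX_X + EPS]
    ys = [MIN_Y - EPS, MIN_Y, (MIN_Y + MAX_Y) / 2, MAX_Y, MAX_Y + EPS]
    for x, y in itertools.product(xs, ys):
        inside = MIN_X <= x <= MAX_X and MIN_Y <= y <= MAX_Y
        yield x, y, ("positive" if inside else "negative")

for x, y, kind in envelope_points():
    print(f"({x:6.1f}, {y:6.1f}) -> {kind} test")
```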

Page 30: Testing: Verification and Validation


Hierarchical Testing

Top-down testing

Developed early in the development cycle

High level components are developed

Low level components are “stubbed”

Allows for verification of overall structure of the system (testing the architectural pattern and infrastructure)
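A minimal sketch of a stub supporting a top-down test; all names here are illustrative.

```python
# Illustrative top-down test: the high-level Controller is real while
# the low-level sensor driver is stubbed. All names are hypothetical.
class SensorStub:
    """Stands in for a low-level component that is not built yet."""
    def read(self):
        return 42.0  # canned value chosen for the test

class Controller:
    def __init__(self, sensor):
        self.sensor = sensor
    def status(self):
        return "OK" if self.sensor.read() < 100.0 else "ALARM"

# Verifies the overall control structure without any real hardware
assert Controller(SensorStub()).status() == "OK"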

Page 31: Testing: Verification and Validation


Hierarchical Testing

Bottom-up testing

Lowest level components are developed first

Dedicated “test harnesses” are developed to operationally test the low-level components

A good approach for flat, distributed, functionally partitioned systems (pipeline architectures)
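A minimal sketch of such a harness, driving a hypothetical low-level CRC routine directly:

```python
# Illustrative harness around a low-level CRC-8 routine (polynomial 0x07).
def crc8(data: bytes) -> int:
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ 0x07) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def harness(cases):
    """Drive the component directly and report pass/fail per case."""
    for data, expected in cases:
        got = crc8(data)
        print(f"crc8({data!r}) = {got:#04x}:", "PASS" if got == expected else "FAIL")

harness([(b"", 0x00), (b"\x00", 0x00), (b"123456789", 0xF4)])
```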

Page 32: Testing: Verification and Validation


Testing Strategies

Regression Testing

Testing done after the system has been modified:
• Assure that the things it used to do, and still should do, still function
• Assure that any new functionality behaves as specified
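A sketch of a regression check, assuming a baseline of input/output pairs captured before the modification:

```python
# Sketch of a regression check: input/output pairs captured from the
# previous release are re-run after a change. All data is illustrative.
def format_id(n):
    """Function that was just modified."""
    return f"ID-{n:05d}"

REGRESSION_BASELINE = {1: "ID-00001", 42: "ID-00042", 99999: "ID-99999"}

def run_regression():
    failures = [n for n, out in REGRESSION_BASELINE.items() if format_id(n) != out]
    assert not failures, f"regressions detected for inputs: {failures}"

run_regression()  # passes while old behavior is preserved
```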

Page 33: Testing: Verification and Validation


Testing Complications

When a test discrepancy occurs (the test “fails”) it could be a fault in:
• Test equipment (test harness)
• Test procedures
• Test execution
• Test analysis
• System under test
• Impossible performance requirement

The first step in resolution is to diagnose the source of the test discrepancy

Page 34: Testing: Verification and Validation


Operational Testing

Validation Techniques

Simulation
• Simulate the real world to provide inputs to the system
• Simulate the real world to evaluate the output from the system
• Simulate the system itself to evaluate its fitness

Simulation can be expensive
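A toy sketch of the idea: a simple plant model supplies inputs to the system under test and checks the resulting behavior. The thermostat and plant model are stand-ins, not from the slides.

```python
# Toy simulation sketch: a plant model feeds inputs to the system
# under test (a thermostat) and checks its outputs. All hypothetical.
def thermostat(temp_c):
    """System under test: returns a heater command."""
    return "HEAT_ON" if temp_c < 20.0 else "HEAT_OFF"

def simulate(steps=24):
    temp = 15.0
    for t in range(steps):
        command = thermostat(temp)                     # simulated input -> system
        temp += 1.5 if command == "HEAT_ON" else -1.0  # plant reacts to output
        assert 5.0 < temp < 35.0, f"temperature out of bounds at t={t}"
    print("simulation completed without violations")

simulate()
```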

Page 35: Testing: Verification and Validation


Operational Testing

[Diagram: avionics subsystems: radar, cockpit displays, flight controls (flaps, ailerons, rudder, elevator), inertial navigation, propulsion systems]

Simulation is a primary tool in real-time systems development

Page 36: Testing: Verification and Validation



Operational Testing

Avionics integration labs develop “airplane on a bench”

Page 37: Testing: Verification and Validation



Operational Testing

Full motion simulators are developed to train aircrews, test usability of flight control systems, and human factors

Page 38: Testing: Verification and Validation



Operational Testing

Radar, INS, offensive, operability, and defensive avionics are tested in anechoic chambers

Page 39: Testing: Verification and Validation



Operational Testing

Flight control software and systems are installed on “flying test beds” to ensure they work

Page 40: Testing: Verification and Validation



Operational Testing

The whole system is put together and a flight test program is undertaken

Page 41: Testing: Verification and Validation


Operational Test Plan

Types of tests:
• Unit Tests – Test a component
• Integration Tests – Test a set of components
• System Tests – Test an entire system
• Acceptance Tests – Have users test the system

Page 42: Testing: Verification and Validation


Operational Test Plan

The Operational Test Plan should identify:
• Objectives
• Prerequisites
• Preparation, Participants, Logistics
• Schedule
• Tests
• Expected Outcomes and Completion

For each specific test, detail:
• Measurement/Metric, Objective, Procedure
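As a sketch, that per-test detail could be captured in a small record type; the fields and example values here are hypothetical.

```python
# Hypothetical record for the per-test detail named above; fields and
# example values are illustrative, not from the slides.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    name: str
    objective: str                 # what the test is meant to demonstrate
    metric: str                    # measurement used to judge success
    procedure: list = field(default_factory=list)  # ordered steps
    expected_outcome: str = ""

plan = [
    TestCase(
        name="INS drift",
        objective="Show inertial navigation drift stays within spec",
        metric="position error (m) after 1 hour",
        procedure=["Align INS", "Hold position for 1 hour", "Record drift"],
        expected_outcome="drift < 100 m",
    ),
]

for tc in plan:
    print(f"{tc.name}: {tc.objective} [{tc.metric}]")
```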

Page 43: Testing: Verification and Validation


Verification and Validation Plans

Test plans, or more rigorous V&V plans, are often left until late in the development process

Many development models do not consider V&V, or focus on testing after the system is implemented

Verifying progress in development is a continuous, parallel process

Start thinking about V&V during requirements specification

How will these requirements be verified in design? Think traceability.

How will the implementation be verified?

What formulation of requirements will clarify system validation?

Page 44: Testing: Verification and Validation


Review

Testing can verify correct implementation of a design and verify operational performance

Testing can validate accomplishment of requirement specifications

A variety of test strategies must be tailored to the specific application, depending upon:

Likely failure modes

Known complexities

Reliability and safety concerns