Verification & Validation and Systems Integration

Page 1: Verification & Validation and Systems Integration

Verification & Validation and Systems Integration

Page 2: Verification & Validation and Systems Integration

Software Engineering Roadmap

[Figure: development phases – plan project, analyze requirements, design, implement, test units, integrate & test system, maintain.]

Identify corporate practices

Construct system in stages:
– Plan integration of parts to yield whole
– Test subassemblies
– Assemble in "builds"
– Test whole system in a variety of ways

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 3: Verification & Validation and Systems Integration

Chapter Learning Goals

• Understand the link between testing and integration

• Understand the types of testing required

• Be able to plan and execute testing beyond the unit level

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 4: Verification & Validation and Systems Integration

Verification vs. Validation

• Validation: "Are we building the right product?"
– The software should do what the user really requires (the requirements should express this)

• Verification: "Are we building the product right?"
– The software should conform to the architecture and design as driven by the requirements

Page 5: Verification & Validation and Systems Integration

Two Methods of V & V

• Software inspections: concerned with analysis of the static system representation to discover problems (static verification)
– May be supplemented by tool-based document and code analysis

• Software testing: concerned with exercising and observing product behaviour (dynamic verification)
– The system is executed with test data and its operational behaviour is observed

Page 6: Verification & Validation and Systems Integration

V & V Goals

• Verification and validation should establish confidence that the software is ready for use

• This does NOT mean completely free of defects

• Rather, it must be good enough for its intended use and the type of use will determine the degree of confidence that is needed

Page 7: Verification & Validation and Systems Integration

V & V Planning

• Careful planning is required to get the most out of testing and inspection processes

• Planning should start early in the development process, during requirements analysis

• The plan should identify the balance between static verification and dynamic testing

Page 8: Verification & Validation and Systems Integration

1. Introduction to system integration

Before we can talk more about V & V, we have to discuss an approach to integration.

Page 9: Verification & Validation and Systems Integration

Unified Process for Integration & Test

[Figure (Jacobson et al.: USDP): the four phases – Inception, Elaboration, Construction, Transition – span the preliminary iterations and Iter. #1 … Iter. #k, cutting across the Requirements, Analysis, Design, Implementation, and Test workflows. Unit tests, integration tests, and system tests accompany integration as the iterations progress.]

Page 10: Verification & Validation and Systems Integration

Development Overview (after Myers)

[Figure: information flows from the Customer to the Requirements, then to the Architecture, Interface specs, Detailed design, Module (e.g., package) code, Function code, and finally System code, with a loss of information at each step. Why?]

Page 11: Verification & Validation and Systems Integration

Testing Overview: Artifact Flow

[Figure: code plus documents flow from function code through module (e.g., package) code up to the iteration or system code and finally the complete code. Order of testing: (1), (4) function tests; (2), (3) module tests (against the detailed design); (5) interface tests (against the interface specs); (6) integration tests; (7) regression tests; (8) system tests (against the architecture); (9) usability tests; (10) installation tests; (11) acceptance tests (against the requirements). Unit tests precede integration tests, which precede system tests.]

Page 12: Verification & Validation and Systems Integration

Testing for Validation and Verification (after Myers)

[Figure: the function tests (1), (4), regression tests (7)*, system tests (8)*‡, usability tests (9)*‡, installation tests (10)*‡, and acceptance tests (11)* are all tested against the requirements ("validation"). * includes use cases; ‡ includes performance testing.]

Page 13: Verification & Validation and Systems Integration

Testing for Validation and Verification (after Myers), continued

[Figure: in addition, the module tests (2), (3) are tested against the detailed design, the interface tests (5) against the interface specs, and the integration tests (6)* against the architecture – each a form of "verification" (Note 2: tested against the documents indicated). The remaining tests are checked against the requirements – "validation" (Note 1). * includes use cases; ‡ includes performance testing.]

Page 14: Verification & Validation and Systems Integration

2. The integration process

Page 15: Verification & Validation and Systems Integration

The Build Process for a Bridge

[Figure: Builds 1, 2, 3, … lead to the final build of a single level (the builds for a single-level iteration); repeating the process yields the final build of the double level (the double-level iteration).]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 16: Verification & Validation and Systems Integration

Integration in Spiral Development

[Figure: first iteration – requirements analysis, design, implementation, test. Second iteration – 1. get additional requirements; 2. design for the additional requirements; 3. code the additions; 4. integrate the new code; 5. test.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 17: Verification & Validation and Systems Integration

Relating Builds and Iterations in the Unified Process

[Figure (Jacobson et al.: USDP): within the Inception, Elaboration, Construction, and Transition phases, each iteration (Iter. #i) comprises a sequence of builds, from the first build for iteration i to the last build for iteration i.]

Page 18: Verification & Validation and Systems Integration

Build Sequences: Ideal vs. Typical

Build a Bridge

Build a Video Game

Page 19: Verification & Validation and Systems Integration

Plan Integration & Builds

1. Understand the architecture decomposition.
– Try to make the architecture simple to integrate.

2. Identify the parts of the architecture that each iteration will implement.
– Build framework classes first, or in parallel.
– If possible, integrate "continually".
– Build enough UI to anchor testing.
– Document requirements for each iteration.
– Try to build bottom-up, so the parts are available when required.
– Try to plan iterations so as to retire risks (biggest risks first).
– Specify iterations and builds so that each use case is handled completely by one.

3. Decompose each iteration into builds if necessary.

4. Plan the testing, review, and inspection process.

5. Refine the schedule to reflect the results.

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 20: Verification & Validation and Systems Integration

Roadmap for Integration and System Test

1. Decide the extent of all tests.

2. For each iteration …
2.1 For each build …
2.1.1 Perform regression testing from the prior build
2.1.2 Retest functions if required
2.1.3 Retest modules if required
2.1.4 Test interfaces if required
2.1.5 Perform build integration tests -- section 3.1
2.2 Perform iteration system and usability tests -- sections 3.4, 3.5
(Development of iteration complete)

3. Perform installation tests -- section 3.8
(System implemented; system installed)

4. Perform acceptance tests -- section 3.7
(Job complete)

Page 21: Verification & Validation and Systems Integration

Factors Determining the Sequence of Integration

• Usage of modules by other modules (technical factor)
– Build and integrate modules used before the modules that use them

• Defining and using framework classes (technical factor)

• Exercising integration early (technical factor)

• Exercising key risky parts of the application as early as possible (risk reduction)

• Showing parts or prototypes to customers (requirements)

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
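The "build used-before-using modules first" rule is a topological sort of the module dependency graph. A minimal sketch using Python's standard-library graphlib, with a hypothetical dependency map loosely echoing the Encounter example:

```python
from graphlib import TopologicalSorter

# Hypothetical module dependency map: each module lists the modules it
# uses, which must therefore be built and integrated first.
uses = {
    "EncounterGame":        ["EncounterCharacters", "EncounterEnvironment"],
    "EncounterCharacters":  ["Characters"],
    "EncounterEnvironment": ["Layout"],
    "Characters":           [],
    "Layout":               [],
}

# static_order() yields a used-before-using sequence: every module
# appears only after all of the modules it depends on.
order = list(TopologicalSorter(uses).static_order())
print(order)
```

Framework packages such as Characters and Layout come out first, and the top-level EncounterGame last, matching the bottom-up integration advice above.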

Page 22: Verification & Validation and Systems Integration

Integration Schedule

[Figure: a five-month Gantt-style schedule with rows for milestones, iterations, builds, and modules. The inception iteration produces the prototype requirements; the elaboration iterations complete the prototype; Iteration 1 ("view characters in areas") spans builds 1 and 2; Iteration 2 ("elementary interaction") follows with build 3.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 23: Verification & Validation and Systems Integration

Integration Schedule (with modules)

[Figure: the same five-month schedule with the module rows filled in. Build 1: Characters package, then EncounterCharacters package, then integrate & test. Build 2: RolePlayingGame package, then EncounterGame package, then integrate & test. Build 3: GameEnvironment package, then EncounterEnvironment package, then integrate & test.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 24: Verification & Validation and Systems Integration

3. The testing process

Page 25: Verification & Validation and Systems Integration

Encounter Continual Integration, week 3

[Class diagram: EncounterCharacters (using the Characters package, with GameCharacter), EncounterGame (using RolePlayingGame), and EncounterLayout (using Layout).]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 26: Verification & Validation and Systems Integration

Encounter Continual Integration, week 7

[Class diagram: the week-3 packages plus EncounterEnvironment «facade»; the Layout package now also contains Map.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 27: Verification & Validation and Systems Integration

Encounter Continual Integration, week 11

[Class diagram: adds EncounterCast «facade» to EncounterCharacters; the RolePlayingGame package now also contains RPGame.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 28: Verification & Validation and Systems Integration

Encounter Continual Integration, week 15

[Class diagram: adds EncounterGame «facade»; the Characters package now contains EncounterCharacter as well as GameCharacter, and the Layout package contains Map, Location, and Area.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 29: Verification & Validation and Systems Integration

Plan and Execute Integration Tests

1. Decide how and where to store, reuse, and code the integration tests.
– Show this in the project schedule.

2. Execute as many unit tests (again) as time allows, this time in the context of the build.
– No drivers or stubs are required this time.
– Prioritize those most likely to uncover defects.

3. Exercise regression tests to ensure existing capability has not been compromised.

4. Ensure build requirements are properly specified.

5. Exercise use cases that the build should implement.
– Test against the SRS.

6. Execute the system tests supported by this build.

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
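The remark that "no drivers or stubs are required this time" is the key difference between unit and integration testing: before integration, a stub stands in for a missing component; at integration time the real component is wired in and the same checks rerun. A minimal Python sketch with illustrative classes (Bank, AccountStore, StubStore are hypothetical, not from the slides):

```python
class AccountStore:                      # real component, ready at integration
    def __init__(self):
        self._balances = {"alice": 100}
    def balance(self, owner):
        return self._balances[owner]

class StubStore:                         # stub used before integration
    def balance(self, owner):
        return 100                       # canned answer, no real storage

class Bank:                              # unit under test
    def __init__(self, store):
        self.store = store
    def can_withdraw(self, owner, amount):
        return amount <= self.store.balance(owner)

def check_bank(bank):                    # the same checks run in both phases
    return bank.can_withdraw("alice", 50) and not bank.can_withdraw("alice", 200)

print(check_bank(Bank(StubStore())))     # unit test: stub in place of the store
print(check_bank(Bank(AccountStore())))  # integration test: real component
```

Rerunning the unit-level checks against the real component is exactly what step 2 above asks for, and it can expose mismatches that the canned stub answers hid.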

Page 30: Verification & Validation and Systems Integration

Relationship between Use Cases, Iterations and Builds

[Figure: iteration 5 (between iterations 4 and 6) comprises builds 5.2, 5.3, and 5.4. Use cases 7, 9, 14, and 23 map onto these builds; * marks «extends» or «includes» relationships, and use case 14 builds on the details of use case 7.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 31: Verification & Validation and Systems Integration

Final Code Build and Integration Schedule: a Banking Example

[Figure: from week 23 to the week-31 release, code builds move from weekly to twice weekly to daily, and regression testing against the baseline moves from biweekly to overnight. The frequency of regression testing increases toward the end of the project.]

Page 32: Verification & Validation and Systems Integration

Final Code Build and Integration Schedule: a Banking Example (modules)

[Figure: the same week 23 to week 31 schedule, showing the bank query, bank deposit, and bank withdrawal modules being integrated into the baseline as tasks. Code is frozen while it is being tested. Why?]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 33: Verification & Validation and Systems Integration

Typical Day-by-Day Code Integration Process

[Figure: across weeks 23 to 31, builds become more frequent (weekly, then biweekly, then daily). In the daily cycle, development is frozen at 6 pm, the overnight regression tests run, and at 7 am the team confirms the new baseline or reverts to the previous baseline before development resumes.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
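The overnight confirm-or-revert step is a simple gate. A minimal Python sketch (all helper names hypothetical): promote the frozen build to the new baseline only if the regression suite passes, otherwise fall back to the previous baseline.

```python
def nightly_gate(regression_passed, promote_baseline, revert_baseline):
    """Overnight step: confirm the baseline only if regression passes."""
    if regression_passed():
        promote_baseline()
        return "baseline confirmed"
    revert_baseline()
    return "reverted to previous baseline"

# Simulated runs: one green night, one red night.
events = []
print(nightly_gate(lambda: True,
                   lambda: events.append("promote"),
                   lambda: events.append("revert")))
print(nightly_gate(lambda: False,
                   lambda: events.append("promote"),
                   lambda: events.append("revert")))
print(events)   # ['promote', 'revert']
```

Keeping the gate automatic is what lets the 7 am decision be routine rather than a debate: developers always start the day from a baseline known to pass regression.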

Page 34: Verification & Validation and Systems Integration

Artifacts and Roles for Integration Testing (Jacobson et al.: USDP)

[Figure: the roles – test engineer, component engineer, integration tester, and system tester – are shown as responsible for the test artifacts: the test plan, test cases, test procedures, test evaluation, and test components derived from the use-case model, together with defect management.]

Page 35: Verification & Validation and Systems Integration

Success in Interface Testing

• Understand the interface requirements

• Perform early "smoke" tests to weed out unanticipated interface issues

• Test thoroughly as per "black box" testing (see Unit Testing) so as to fully exercise the interface in terms of:
– Variety of values for each parameter
– Parameter combinations
– Volume and timing
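Enumerating parameter combinations for an interface is mechanical. A minimal Python sketch for a hypothetical interface transfer(amount, currency, urgent), crossing boundary and typical values for each parameter:

```python
from itertools import product

# Hypothetical interface under test: transfer(amount, currency, urgent).
# Black-box interface testing crosses interesting values of each parameter.
amounts    = [0, 1, 10_000]            # boundary and typical values
currencies = ["USD", "EUR"]
urgencies  = [True, False]

test_cases = list(product(amounts, currencies, urgencies))
print(len(test_cases))   # 3 * 2 * 2 = 12 combinations
```

For interfaces with many parameters the full cross product explodes, which is why pairwise selection is often used instead; the sketch shows the exhaustive baseline.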

Page 36: Verification & Validation and Systems Integration

Types of System Tests, part 1 (see Kit [Ki])

• Volume: subject the product to large amounts of input.

• Usability: measure user reaction (e.g., score 1-10).

• Performance: measure speed under various circumstances.

• Configuration: configure to various hardware / software; e.g., measure set-up time.

• Compatibility: with other designated applications; e.g., measure adaptation time.

• Reliability / Availability: measure up-time over an extended period.
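Several of these tests reduce to "measure, then compare against a budget". A minimal Python sketch of a performance-style measurement (the budget and workload are illustrative, not from the slides):

```python
import time

def measure_speed(fn, *args, repetitions=5):
    """Time fn over several runs and report the best wall-clock time."""
    best = float("inf")
    for _ in range(repetitions):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best

# Example: a performance test passes if sorting stays under a budget.
elapsed = measure_speed(sorted, list(range(100_000, 0, -1)))
print(elapsed < 1.0)   # generous one-second budget for a sanity check
```

Taking the best of several repetitions reduces noise from other processes; a real performance test would pin the environment and record the numbers in the test log.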

Page 37: Verification & Validation and Systems Integration

Types of System Tests, part 2 (see Kit [Ki])

• Security: subject to compromise attempts; e.g., measure average time to break in.

• Resource usage: measure usage of RAM, disk space, etc.

• Installability: install under various circumstances; measure time to install.

• Recoverability: force activities that take the application down; measure time to recover.

• Serviceability: service the application under various situations; measure time to service.

• Load / Stress: subject to extreme data and event traffic.

Page 38: Verification & Validation and Systems Integration

Organization of Integration and System Test Documentation (with R. Bostwick)

[Figure: the SCMP references the Software Test Documentation (STD), which consists of Integration, System, Acceptance, and Installation Test Documentation; the Integration Test Documentation in turn consists of Build 1, Build 2, and Build 3 Test Documentation. Each Test Documentation is divided into: Introduction, Test plans, Test designs, Test cases, Test procedures, Test log.]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 39: Verification & Validation and Systems Integration

ANSI/IEEE 829-1983 Software Test Documentation (reaff. 1991)

1. Introduction

2. Test plan: items under test, scope, approach, resources, schedule, personnel

3. Test design: items to be tested, the approach, the plan in detail

4. Test cases: sets of inputs and events

5. Test procedures: steps for setting up and executing the test cases

6. Test item transmittal report: items under test, physical location of results, person responsible for transmitting

7. Test log: chronological record, physical location of test, tester name

8. Test incident report: documentation of any event occurring during testing which requires further investigation

9. Test summary report: summarizes the above

NOT REQUIRED IN PROJECTS

Page 40: Verification & Validation and Systems Integration

5. The Transition iterations

Page 41: Verification & Validation and Systems Integration

Goals of the Transition Iterations

[Figure: the transition phase spans iterations #m+1 through #k across the Requirements, Analysis, Design, Implementation, and Test workflows.]

• Find defects through customer use

• Test user documentation and help

• Determine realistically whether the application meets customer requirements

• Retire deployment risks

• Satisfy miscellaneous marketing goals

Page 42: Verification & Validation and Systems Integration

Alpha and Beta Releases

Alpha: in-house and highly trusted users
– Several repetitions by real users
– Previews customer reaction
– Benefits third-party developers
– Forestalls competition

Beta: selected customers
– Many repetitions by real users
– Gets customer reaction

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 43: Verification & Validation and Systems Integration

Roadmap for the Transition Iterations

1. Plan alpha and beta testing.
• Define the population
• Plan defect collection
• Identify stopping criteria (num. allowed defects/test or /hour)

2. Conduct alpha testing.
• Prepare
• Distribute & install
• Carry out (users / customers)
• Gather defect reports
• Observe stopping criteria
• Correct defects

3. Conduct beta testing.

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.

Page 44: Verification & Validation and Systems Integration

Stopping Criteria: Graphical Representation

[Figure: from week 1 to the end of tests around week 20, three curves are tracked: the error detection rate per 1000 hrs (target: <= 7 per 1000 hrs for 4 weeks, reached near week 10), the percentage of deposit transaction types tested (target: 91%), and the percentage of withdrawal transactions tested (target: 98%).]

Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
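The numeric part of the stopping rule is easy to state as code. A minimal Python sketch of the rate criterion suggested by the graph (the function name and data are illustrative): stop once the weekly error detection rate has stayed at or below the target for the required number of consecutive weeks.

```python
def met_stopping_criterion(weekly_rates, target=7, weeks_required=4):
    """weekly_rates: defects per 1000 test hours, one value per week.
    True once the rate stays <= target for weeks_required weeks in a row."""
    consecutive = 0
    for rate in weekly_rates:
        consecutive = consecutive + 1 if rate <= target else 0
        if consecutive >= weeks_required:
            return True
    return False

print(met_stopping_criterion([20, 12, 9, 6, 7, 5, 4]))  # True: weeks 4-7 qualify
print(met_stopping_criterion([20, 12, 9, 6, 8, 5, 4]))  # False: streak broken
```

Resetting the streak when a bad week appears is the important detail: a single spike in the detection rate restarts the clock rather than merely lowering an average.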

Page 45: Verification & Validation and Systems Integration

7. Tools for integration and system testing

Page 46: Verification & Validation and Systems Integration

Capabilities of Automated System Test Tools

1. Record mouse and keyboard actions to enable repeated playback

2. Run test scripts repeatedly

3. Enable recording of test results

4. Record execution timing

5. Record runtime errors

6. Create and manage regression tests

7. Generate test reports

8. Generate test data

9. Record memory usage

10. Manage test cases

11. Analyze coverage
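Capability 1 (record, then replay) can be illustrated with a toy harness. A minimal Python sketch (the Recorder class and action names are hypothetical): actions are captured as data, serialized, and replayed repeatedly by dispatching each step to a handler.

```python
import json

class Recorder:
    """Capture user 'actions' as a replayable script."""
    def __init__(self):
        self.script = []
    def record(self, action, **params):
        self.script.append({"action": action, **params})
    def save(self):
        return json.dumps(self.script)   # scripts persist between sessions

def playback(script_json, handlers):
    """Replay a recorded script by dispatching each step to its handler."""
    log = []
    for step in json.loads(script_json):
        action = step.pop("action")
        log.append(handlers[action](**step))
    return log

rec = Recorder()
rec.record("click", x=10, y=20)
rec.record("type", text="hello")

handlers = {"click": lambda x, y: f"clicked ({x},{y})",
            "type":  lambda text: f"typed {text!r}"}
print(playback(rec.save(), handlers))   # ["clicked (10,20)", "typed 'hello'"]
```

Storing the script as data rather than code is what makes capabilities 2 and 3 (repeated runs, recorded results) follow almost for free.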

Page 47: Verification & Validation and Systems Integration

Types of Capture/Playback Tests (adapted from Kit [Ki])

• Native / software intrusive: test software intermingled with the software under test
– Could compromise the software under test
– Least expensive

• Native / hardware intrusive: test hardware intermingled with the software under test
– Could compromise the software under test

• Non-intrusive: uses separate test hardware
– Does not compromise the software under test
– Most expensive

Page 48: Verification & Validation and Systems Integration

Memory Usage Test Tools

• Memory leaks
– Detect growing amounts of unusable memory, inadvertently caused by the implementation

• Memory usage behavior
– Confirm expectations
– Identify bottlenecks

• Data bounds behavior
– e.g., confirm integrity of arrays
– e.g., detect attainment of limiting values

• Variable initialization
– Indicate uninitialized variables

• Overwriting of active memory
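The leak-detection idea (watch for memory that keeps growing) can be sketched with Python's standard-library tracemalloc; the leaky cache below is a deliberate, hypothetical example.

```python
import tracemalloc

_cache = []   # simulated leak: grows on every call and is never cleared

def leaky_operation():
    _cache.append(bytearray(100_000))   # ~100 kB retained per call

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()   # (current, peak) in bytes
for _ in range(20):
    leaky_operation()
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

growth = after - before
print(growth > 1_000_000)   # ~2 MB retained: the tool flags the growth
```

A real memory tool does the same comparison repeatedly over a long run and attributes the growth to allocation sites, which tracemalloc's snapshot comparison also supports.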

Page 49: Verification & Validation and Systems Integration

Test Case Management (adapted from Kit [Ki])

• Provide a user interface to manage tests

• Organize tests
– For ease of use
– For maintenance

• Manage test execution sessions
– To run tests selected by the user

• Integrate with other test tools
– To capture and play back
– To analyse coverage

• Provide reporting

• Provide documentation