Automated Generation of Test Suites from Formal Specifications

Alexander K. Petrenko
Institute for System Programming of Russian Academy of Sciences (ISP RAS), Moscow
February, 2000


Page 1

Page 2

Cambridge, February, 2000

Ideal Testing Process

Why formal specification? What kind of specifications?

Forward engineering: design specifications as sources → Criteria? Oracles? Partition → Tests

Reverse engineering: sources → Criteria? Oracles? Partition → Tests

Pre- and post-conditions and invariants: for Oracles and Partition
Algebraic specifications: for test sequences

Page 3

Ideal Testing Process

Why formal specification? What kind of specifications?

Forward engineering: design specifications as sources → Criteria, Oracles, Partition → Tests

Reverse engineering: post-specifications as sources → Criteria, Oracles, Partition → Tests

Pre- and post-conditions and invariants: for Oracles and Partition
Algebraic specifications: for test sequences

Page 4

KVEST project history

• Started under contract with Nortel Networks in 1994 to develop a system automatically generating test suites for regression testing from formal specifications, reverse engineered from the existing code

• A joint collaboration effort between Nortel Networks and ISP RAS. ISP RAS background:

— Soviet Space Mission Control Center OS and networks;

— Soviet space shuttle “Buran” OS and real-time programming language;

— formal specification of the real-time programming language

Page 5

What is KVEST?

• KVEST: Kernel Verification and Specification Technology

• Area of Application: specification, test generation and test execution for APIs such as OS kernel interfaces

• Specification Language: RAISE/RSL (VDM family)

• Specification Style: state-oriented, implicit (pre- and post-conditions, subtype restrictions)

• Target Language: a programming language such as C/C++

• Size of Application: over 600Kline

• Size of specification: over 100Kline

• Size of test suites: over 2Mline

• Results: over a hundred errors have been detected in several projects

Page 6

Position

• Constraint specification

• Semi-automated test production

• Fully automated test execution and test result analysis

• Orientation toward use in industrial software development processes

Page 7

Research and design problems

• Test system architecture

• Mapping between specification and programming languages

• Integration of generated and manual components: re-use of manual components

• Test sequence and test case generation

Page 8

Verification processes

• Reverse engineering: (post-) specification, testing based on the specification

• Forward engineering: specification design, development, test production

• Co-verification: specification design, simultaneous development and test production

Page 9

Reverse engineering: Technology stream

Phase 1: Interface definition. From the documentation and the source code, the software contract contents are determined (Interface A1, Interface A2, ...).
Phase 2: Specification.
Phase 3: Test suite production, yielding test drivers and test cases.
Phase 4: Test execution analysis, guided by test plans, yielding detected-error and test coverage reports and actual documentation.

Page 10

Key features of KVEST test suites

• Phase 1: A minimal and orthogonal API (Application Programming Interface) is determined.

• Phase 2: A formal specification in the RAISE Specification Language is developed for the API.

• Phase 3: Automatic generation of sets of test suites (test cases and test sequences) in the target language.

• Phase 4: Automatic execution of the generated test suites. A pass/fail verdict is assigned for every test case execution, and an error summary is provided at the end of the run. The user can specify the completeness of the test coverage and the form of tracing.

Page 11

An example of specification in RAISE:

DAY_OF_WEEK : INT >< INT -~-> RC >< WEEKDAY
DAY_OF_WEEK( tday, tyear ) as ( post_rc, post_Answer )
post
  if tyear <= 0 \/ tday <= 0 \/
     tday > 366 \/ tday = 366 /\ ~a_IS_LEAP( tyear )
  then
    BRANCH( bad_param, "Bad parameters" );
    post_Answer = 0 /\ post_rc = NOK
  else
    BRANCH( ok, "OK" );
    post_Answer = ( a_DAYS_AFTER_INITIAL_YEAR( tyear, tday ) +
                    a_INITIAL_DAY_OF_WEEK ) \ a_DAYS_IN_WEEK /\
    post_rc = OK
  end
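In this style the post-condition itself acts as the test oracle. As an illustration only (not KVEST-generated code), the following Python sketch re-evaluates the DAY_OF_WEEK post-condition against an implementation's actual result; `is_leap`, `days_after_initial_year`, and the anchor constants are hypothetical stand-ins for the `a_*` auxiliary functions, and `\` in the RAISE text is read here as the remainder operation.

```python
# Sketch of a post-condition used as a test oracle for DAY_OF_WEEK.
# The helpers below are illustrative stand-ins for the a_* auxiliaries.

DAYS_IN_WEEK = 7
INITIAL_DAY_OF_WEEK = 0          # assumed anchor: year 1 starts on weekday 0

def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def days_after_initial_year(tyear, tday):
    days = sum(366 if is_leap(y) else 365 for y in range(1, tyear))
    return days + tday - 1

def day_of_week_oracle(tday, tyear, rc, answer):
    """Check (rc, answer) against the post-condition; return the branch hit."""
    bad = tyear <= 0 or tday <= 0 or tday > 366 or \
          (tday == 366 and not is_leap(tyear))
    if bad:
        # bad_param branch: post_Answer = 0 /\ post_rc = NOK
        assert rc == "NOK" and answer == 0, "post-condition violated"
        return "bad_param"
    # ok branch: answer is the weekday computed from the day count
    expected = (days_after_initial_year(tyear, tday)
                + INITIAL_DAY_OF_WEEK) % DAYS_IN_WEEK
    assert rc == "OK" and answer == expected, "post-condition violated"
    return "ok"
```

Note that the oracle never computes the result in advance for the bad branch; it only checks that the implementation's outcome satisfies the relation the specification states.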

Page 12

Partition based on the specification

Specification:

post
  if a \/ b \/ c \/ d /\ e
  then BRANCH( bad_param, "Bad parameters" )
  else BRANCH( ok, "OK" )
  end

Partition (branches and Full Disjunctive Normal Forms, FDNF):

BRANCH "Bad parameters"
  a /\ b /\ c /\ d /\ e
  ~a /\ b /\ c /\ d /\ e
  ...

BRANCH "OK"
  ~a /\ ~b /\ ~c /\ ~d /\ e
  ...
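The FDNF partition can be derived mechanically: every full truth assignment over the elementary conditions a..e is one disjunct, grouped under the branch it selects. A minimal sketch (the guard is the slide's condition, with /\ binding tighter than \/):

```python
# Enumerate the FDNF partition of the branch guard a \/ b \/ c \/ (d /\ e).
from itertools import product

CONDS = "abcde"

def guard(a, b, c, d, e):
    # the specification's condition, with /\ binding tighter than \/
    return a or b or c or (d and e)

def fdnf_partition():
    partition = {"Bad parameters": [], "OK": []}
    for values in product([True, False], repeat=len(CONDS)):
        env = dict(zip(CONDS, values))
        # render one full disjunct, e.g. ~a/\b/\c/\d/\e
        disjunct = "/\\".join(
            (name if env[name] else "~" + name) for name in CONDS)
        branch = "Bad parameters" if guard(**env) else "OK"
        partition[branch].append(disjunct)
    return partition
```

With five conditions there are 32 disjuncts in total; only the assignments with a, b, c false and d /\ e false fall into the "OK" branch.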

Page 13

Test execution scheme

On UNIX: the specifications feed the test suite generators, which produce the test drivers and the test case parameters.
On the target platform: the test harness applies the test drivers to the SUT (System Under Test).
The program behavior model derived from the specifications is compared with the actual outcome, yielding the verdict and the trace.

Page 14

Test execution management

Repository

Navigator: test suite generation, repository browsing, test plan runs

Unix workstation and target platform

Test bed: process control, communication, basic data conversion

Test suite: script driver, MDC, basic drivers

MDC: Manually Developed Components

Page 15

KVEST Test Drivers

• Hierarchy of Test Drivers

— Basic test drivers: test a single procedure by receiving input, calling the procedure, recording the output, and assigning a verdict

— Script drivers: generate sets of input parameters, call basic drivers, evaluate results of test sequences, monitor test coverage

— Test plans: define the order of script driver calls with given test options and check their execution

• KVEST uses a set of script driver skeletons to generate script drivers

• Test drivers are compiled from RAISE into the target language
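The two lower levels of this hierarchy can be sketched as follows. This is an illustration of the division of labor only, not KVEST output; `make_basic_driver`, `script_driver`, and the toy absolute-value oracle are hypothetical names introduced here.

```python
# Sketch of the driver hierarchy: a basic driver wraps one procedure call
# and assigns a verdict via an oracle; a script driver iterates parameter
# tuples, calls the basic driver, and accumulates branch coverage.

def make_basic_driver(procedure, oracle):
    def basic_driver(args):
        result = procedure(*args)               # call the procedure under test
        branch, verdict = oracle(args, result)  # oracle checks the outcome
        return {"args": args, "result": result,
                "branch": branch, "verdict": verdict}
    return basic_driver

def script_driver(basic_driver, param_iterator):
    coverage, failures = set(), []
    for args in param_iterator:
        record = basic_driver(args)
        coverage.add(record["branch"])          # monitor test coverage
        if record["verdict"] != "pass":
            failures.append(record)
    return coverage, failures

# usage with a toy absolute-value procedure and its oracle
def abs_oracle(args, result):
    (x,) = args
    branch = "negative" if x < 0 else "non_negative"
    ok = result == (-x if x < 0 else x)
    return branch, "pass" if ok else "fail"

driver = make_basic_driver(abs, abs_oracle)
cov, fails = script_driver(driver, [(-2,), (0,), (3,)])
```

A test plan, the top level, would then sequence such script driver runs with given options and check their execution.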

Page 16

Test generation scheme

Tools (UNIX): the RAISE specifications and the script driver skeletons feed the basic driver generator, the script driver generator, and the test case generator; the generated drivers pass through the RAISE -> target language compiler.
Target platform: the resulting test suites (basic drivers, script drivers, test case parameters).

Page 17

Test generation scheme, details

The same scheme, with the Manually Developed Components shown explicitly: iterators, data converters, state observers, and filters feed the generators alongside the RAISE specifications and script driver skeletons; the RAISE -> target language compiler then produces the basic drivers, script drivers, and test case parameters that form the test suites on the target platform.

Page 18

Test sequence generation based on an implicit Finite State Machine (FSM)

– Partition based on pre- and post-conditions
– Implicit FSM definition

[FSM diagram: states S1, S2, S3, S4 connected by transitions labeled op1, op2, op3]
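The traversal idea behind such test sequences can be sketched as follows. The four-state machine here is illustrative, not the one on the slide: states are never enumerated up front; an operation is applicable in a state when its pre-condition holds, and the successor state stands in for what the post-condition defines.

```python
# Sketch of exhaustive traversal of an implicitly defined FSM: applicable
# operations are found by checking pre-conditions, and the next state
# comes from executing the operation's effect (the post-condition's role).
from collections import deque

# illustrative operations on an integer "state" modulo 4
OPERATIONS = {
    "op1": (lambda s: s == 0, lambda s: 1),           # (pre-condition, effect)
    "op2": (lambda s: s in (1, 2), lambda s: (s + 1) % 4),
    "op3": (lambda s: True, lambda s: (s + 2) % 4),
}

def traverse(initial_state):
    """Breadth-first exploration covering every (state, operation) pair
    whose pre-condition holds; returns the covered transitions."""
    transitions = []
    seen = {initial_state}
    queue = deque([initial_state])
    while queue:
        state = queue.popleft()
        for name, (pre, effect) in OPERATIONS.items():
            if pre(state):
                nxt = effect(state)
                transitions.append((state, name, nxt))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return transitions
```

A real test sequence generator must additionally turn this exploration into a single executable path through the system under test, since a live system cannot be reset to an arbitrary state the way this sketch re-enters states.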

Page 19

Test sequence generation based on implicit FSM

[FSM diagram: states S1, S2, S3, S4 with transitions op1, op2, op3; each disjunct of the partition corresponds to an operation of the FSM]

Partition (branches and Full Disjunctive Normal Forms, FDNF):

BRANCH "Bad parameters"
  a /\ b /\ c /\ d /\ e    -- op1
  ~a /\ b /\ c /\ d /\ e   -- op2
  ...

BRANCH "OK"
  ~a /\ ~b /\ ~c /\ ~d /\ e  -- opi
  ...

Page 20

Conclusion on KVEST experience

• Code inspection during formal specification can detect up to 1/3 of the errors

• Code inspection cannot replace testing. Up to 2/3 of the errors are detected during and after testing.

• Testing is necessary to develop correct specifications.

• Up to 1/3 of the errors were caused by lack of knowledge of pre-conditions and of some details of the called procedures' behavior.

Page 21

What part of testware is generated automatically?

Kind of source for test generation                | Percentage in the sources | Source size : generated test size | Kind of generation result
Specification                                     | 50                        | 1:5                               | Basic drivers
Data converters, iterators, state observers (MDC) | 50                        | 1:10                              | Script drivers

Page 22

Solved and unsolved problems in test automation

Automated or simple:
– Phase 1, interface definition: for well designed software
– Phase 2, specification: test oracles, partition, filters for single operations
– Phase 4, test execution analysis: test plans, execution and analysis, browsing, reporting

Not automated and not simple:
– Phase 1: interface definition for legacy software
– Phase 3, test suite production: test sequence design for operation groups
– Phase 4: test result understanding

Page 23

Specification based testing: problems and prospects

Problems
• Lack of correspondence between specification languages and programming languages
• Users' resistance to learning any specification language and any additional SDE
• Methodology of test sequence generation
• Testing methodologies for specific software areas

Prospects
• Use an OO programming language specification extension and a standard SDE instead of a specific SDE
• FSM extraction from implicit specifications, FSM factorization
• Research on distributed software specification and testing

Page 24

Part II. KVEST revision

Page 25

Specification notation revision

UniTesK: Universal TEsting and Specification toolKit

• Formal methods deployment problems
— lack of users with theoretical background
— lack of tools
— non-conventional languages and paradigms

• UniTesK solutions
— the first step is possible without "any theory"
— extension of C++ and Java
— integration with a standard software development environment

• Related works
— ADL/ADL2
— Eiffel, Larch, iContract
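The "specification as a host-language extension" idea can be sketched as a contract decorator, in the spirit of the Eiffel/iContract line of work mentioned above (UniTesK itself extended Java and C++, not Python; `contract` and `int_sqrt` are hypothetical names for this illustration).

```python
# Sketch of contract-style specification in the host language: the
# attached pre- and post-conditions double as a runtime test oracle.
import functools

def contract(pre, post):
    """Attach a pre-condition and a post-condition to a function."""
    def wrap(func):
        @functools.wraps(func)
        def checked(*args):
            assert pre(*args), "pre-condition violated"
            result = func(*args)
            assert post(result, *args), "post-condition violated"
            return result
        return checked
    return wrap

@contract(pre=lambda x: x >= 0,
          post=lambda r, x: r * r <= x < (r + 1) * (r + 1))
def int_sqrt(x):
    # deliberately naive implementation; the contract checks it
    r = 0
    while (r + 1) * (r + 1) <= x:
        r += 1
    return r
```

The first step is indeed possible "without any theory": a developer writes ordinary predicates in the language they already know, and a test generator can later reuse the same predicates as oracles and partition conditions.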

Page 26

UniTesK: Test generation scheme

Tools: specifications in a Java or C++ extension feed the test oracle generator and the OO test suite generator; iterators, FSMs, use cases, path builder engines, and the test sequence fabric drive the generation.
Target platform: the test oracles and the test suites in the target language.

Page 27

Integration of Constraint Verification tools into software development environment

[UML class diagram in a UML based design environment, showing generated classes: OracleClass, AbstractInputTypeIterator, ClassUnderTestImplementation, and Iterator_seq_of_int_binary with instance variables length : nat1 and current : nat and operations init() and next()]

A standard Software Development Environment, plus specification and verification tools for the standard notation.

Page 28

Part III. Test generation inside

Page 29

Requirements. Test coverage criteria

– All branches
– All disjuncts (all accessible disjuncts)

Specification:

post
  if a \/ b \/ c \/ d /\ e
  then BRANCH( bad_param, "Bad parameters" )
  else BRANCH( ok, "OK" )
  end

Partition (branches and Full Disjunctive Normal Forms, FDNF):

BRANCH "Bad parameters"
  a /\ b /\ c /\ d /\ e
  ~a /\ b /\ c /\ d /\ e
  ...

BRANCH "OK"
  ~a /\ ~b /\ ~c /\ ~d /\ e
  ...

Page 30

Test sequence kinds. Kinds 1, 2, and 3

These procedures can be tested separately, because no other target procedure is needed to generate input parameters or analyze the outcome.

— Kind 1. The input is data that can be represented in literal (text) form and can be produced without accounting for any interdependencies between the values of different parameters.

— Kind 2. No interdependencies exist between the input items (values of input parameters), but the input does not have to be in literal form.

— Kind 3. Some interdependencies exist; however, separate testing is possible.
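For kind 2, the division of labor shown on the next slide can be sketched directly: manually written data-type iterators are combined automatically into a parameter-tuple iterator, and since no interdependencies exist between the parameters a plain Cartesian product suffices. The iterators and their values here are illustrative, not from KVEST.

```python
# Sketch of kind-2 test data generation: manually developed data-type
# iterators, combined automatically into parameter tuples.
from itertools import product

# manually developed data-type iterators (illustrative boundary values)
def int_iterator():
    yield from (-1, 0, 1, 100)

def flag_iterator():
    yield from (False, True)

def tuple_iterator(*type_iterators):
    """Automatically generated part: iterate all parameter tuples."""
    return product(*(it() for it in type_iterators))

tuples = list(tuple_iterator(int_iterator, flag_iterator))
```

For kind 3 the product would additionally be filtered or ordered to respect the interdependencies between parameters.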

Page 31

Kinds 1, 2, 3. What is automated?

Kind | Automatically                                | Manually
1    | Everything                                   | Nothing
2    | Test sequences and parameter tuple iterators | Data type iterators
3    | Test sequences                               | Parameter tuple iterators

Page 32

Test sequence kinds. Kinds 4 and 5

Kinds 4 and 5. The operations cannot be tested separately, because some input can be produced only by calling another operation from the group, and/or some outcome can be analyzed only by calling other procedures.

Page 33

Requirements for kinds 4th and 5th

The same requirements: all branches/all disjuncts

Additional problem: how to traverse all states?

Page 34

FSM use for API testing

Traditional FSM approach (explicit FSM definition):
— define all states
— for each state, define all transitions (operation, input parameters, outcome, next state)

ISPRAS approach (implicit FSM definition):
— the state is defined by a type definition
— for each state, the applicable operations and their input are defined by pre-conditions, and the outcome and the next state are defined by post-conditions

Page 35

Advanced FSM use

—FSM factorization

—Optimization of exhaustive FSM traversal

—Use-case based test sequence generation

—Test scenario modularization

—Friendly interface for test sequence generation and debugging

Page 36

References

– Igor Bourdonov, Alexander Kossatchev, Alexander Petrenko, and Dmitri Galter. KVEST: Automated Generation of Test Suites from Formal Specifications. Proceedings of the World Congress on Formal Methods, Toulouse, France, LNCS 1708, 1999, pp. 608-621.

– Igor Burdonov, Alexander Kosachev, Victor Kuliamin. FSM Using for Software Testing. Programming and Computer Software, Moscow / New York, No. 2, 2000.

Page 37

Contacts

Alexander Petrenko
Institute for System Programming of Russian Academy of Sciences (ISP RAS), Moscow, Russia
[email protected]
phone: +7 (095) 912-5317 ext 4404
fax: +7 (095) 912-1524

http://www.ispras.ru/~RedVerst/index.html