

Page 1

Lecturer: Henry Muccini and Vittorio Cortellessa

Computer Science Department University of L'Aquila - Italy

muccini@di.univaq.it – cortelle@di.univaq.it – [www.di.univaq.it/muccini] – [www.di.univaq.it/cortelle]

Course:

Ingegneria del Software II

academic year: 2004-2005

Course Web-site: [www.di.univaq.it/ingegneria2/]

An Introduction to Testing

Page 2

Copyright Notice

» The material in these slides may be freely reproduced and distributed, partially or totally, as long as an explicit reference or acknowledgement of the author of the material is preserved.

Henry Muccini

Page 3

Acknowledgment

» With very special thanks to Antonia Bertolino and Debra J. Richardson, who collaborated on previous versions of these lecture notes

Page 4

Agenda

» Testing Basics

- Definition of Software Testing

» Testing Glossary

» Testing Approaches

Page 5

Testing basics

» Definition of software testing

» Fault vs. failure

» Test selection

- Coverage testing

- Reliability testing

- Specification-based testing

» Traceability and test execution

» Testing from LTS

Page 6

Testing

» Testing is a word with many interpretations

- Inspection/review/analysis

- Debugging

- Conformance testing

- Usability testing

- Performance testing

- ….

Page 7

What testing is not (citation from Hamlet, 1994):

» I've searched hard for defects in this program, found a lot of them, and repaired them. I can't find any more, so I'm confident there aren't any.

The fishing analogy:

» I've caught a lot of fish in this lake, but I fished all day today without a bite, so there aren't any more.

» NB: Quantitatively, a day's fishing probes far more of a large lake than a year's testing does of the input domain of even a trivial program.

Page 8

An all-inclusive definition: software testing consists of

» the dynamic verification of the behavior of a program

» on a finite set of test cases

» suitably selected from the (in practice infinite) input domain

» against the specified expected behavior

Page 9

Testing vs. Other Approaches

» Where is the boundary between testing and other (analysis/verification) methods?

What distinguishes software testing from “other” approaches is execution. This requires the ability to:

- launch the tests on some executable version (traceability problem), and

- analyse (observe and evaluate) the results (not a given for embedded systems)

» I am not saying testing is superior and we should disregard other approaches: on the contrary, testing and other methods should complement/support each other

Page 10

Test purpose

» The software is tested in order to:

- Find bugs: “debug testing”, e.g., by:

> Conformance testing
> Robustness testing
> Coverage testing

- Measure its reliability: by statistical inference

- Measure its performance

- Release it to the customer (acceptance test)

- …

not mutually exclusive goals, but rather complementary

» Of course different approaches are taken

Page 11

The high cost of testing

» Any software test campaign involves a trade-off between

- Limiting resource utilization (time, effort): as few tests as possible

- Augmenting our confidence: as many tests as possible

» Two research challenges:

- Determining a feasible and meaningful stopping rule (a naive example is sketched below)

- Evaluating test effectiveness (reliability, a "coverage" notion, ...): a very tough problem
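A minimal sketch of one naive stopping rule (my illustration, not from the slides; the constants and the Run_Next_Test hook are invented placeholders): stop once a fixed number of consecutive test cases has revealed no failure, or once the test budget is exhausted.

with Ada.Text_IO; use Ada.Text_IO;

procedure Stopping_Rule_Sketch is

   Budget             : constant := 1_000;  -- maximum number of tests we can afford
   Quiet_Period       : constant := 50;     -- failure-free tests required before stopping
   Consecutive_Passes : Natural := 0;
   Executed           : Natural := 0;

   --  Placeholder hook: a real harness would select an input, run the
   --  program under test and apply an oracle, returning True on failure.
   function Run_Next_Test return Boolean is
   begin
      return False;
   end Run_Next_Test;

begin
   while Executed < Budget and then Consecutive_Passes < Quiet_Period loop
      Executed := Executed + 1;
      if Run_Next_Test then
         Consecutive_Passes := 0;   -- a failure resets the quiet period
      else
         Consecutive_Passes := Consecutive_Passes + 1;
      end if;
   end loop;
   Put_Line ("Tests executed:" & Natural'Image (Executed));
end Stopping_Rule_Sketch;

Such a rule limits resource usage, but by itself says nothing about effectiveness, which is exactly the second challenge above.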

Page 12

Testing Glossary

Page 13

Failures, Errors, Faults

» Failure: incorrect/unexpected behavior or output

- an incident is the set of symptoms revealed by execution

- failures are usually classified

» Potential Failure: incorrect internal state

- sometimes also called an error, state error, or internal error

» Fault: anomaly in source code

- may or may not produce a failure

- a “bug”

» Error: inappropriate development action

- action that introduced a fault

Page 14

THE FAULT-FAILURE MODEL

Fault (must be activated)
  -> Error: the software's internal state is corrupted (latent or manifest)
  -> Failure: the delivered service is affected

A fault will manifest itself as a failure if all of the 3 following conditions hold:

1) EXECUTION: the faulty piece of code is executed

2) INFECTION: the internal state of the program is corrupted: error

3) PROPAGATION: the error propagates to program output

PIE MODEL (Voas, IEEE TSE Aug. 1992)
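To make the three conditions concrete, here is a minimal sketch (my own illustration, not from the slides): the intended computation for positive X is T := X * X, the code contains the fault T := X + X, and only T mod 4 reaches the output, so execution, infection and propagation can be observed separately.

with Ada.Text_IO; use Ada.Text_IO;

procedure PIE_Sketch is

   --  Hypothetical program under test.
   --  Intended behaviour: for X > 0 compute T := X * X, otherwise T := 0;
   --  the returned result is T mod 4.
   function Faulty (X : Integer) return Integer is
      T : Integer;
   begin
      if X > 0 then
         T := X + X;        -- FAULT: should be  T := X * X
      else
         T := 0;
      end if;
      return T mod 4;       -- only part of T propagates to the output
   end Faulty;

begin
   --  X = -1 : the faulty statement is not EXECUTED                  -> no failure
   --  X =  2 : executed, but T = 4 as intended, so no INFECTION      -> no failure
   --  X =  4 : executed, T = 8 instead of 16 (error), but
   --           8 mod 4 = 16 mod 4, so no PROPAGATION                 -> no failure
   --  X =  3 : executed, T = 6 instead of 9, 6 mod 4 /= 9 mod 4      -> FAILURE
   for X in -1 .. 4 loop
      Put_Line ("Faulty (" & Integer'Image (X) & " ) =" &
                Integer'Image (Faulty (X)));
   end loop;
end PIE_Sketch;

Inputs such as X = 4 are precisely the ones that make testing hard: the fault is executed and the state is infected, yet no failure is observed.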

Page 15

The ambiguous notion of a “fault”

» What a tester or a user observes is a failure. What is then a fault?

- It is a useful and intuitive notion, but very difficult to formalize.

» In practice we often characterise a fault with the modifications made to fix it: e.g. “Wrong operator”. In such a way,

- the concept of a fault becomes meaningful only after it has been fixed.

- Moreover, different people may react to the same failure with different fixes (and clever programmers find minimal fixes)

» A solution is to avoid the ambiguous notion of a fault and reason instead in terms of failure regions, i.e. collections of input points that give rise to failures.

- A failure region is characterised by a size and a pattern

Page 16

Fundamental Testing Questions

» Test Case: How is a test described/recorded?

» Test Criteria: What should we test?

» Test Oracle: Is the test outcome correct?

» Test Adequacy: How much is enough?

» Test Process: Is testing complete and effective? How to make the most of limited resources?

Page 17

Test Case

» Specification of

- identifier

- test items

- input and output specs

- environmental requirements

- procedural requirements

» Augmented with history of

- actual results

- evaluation status

- contribution to coverage
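A minimal sketch of how such a test case description, plus its execution history, could be recorded as a data structure (the record layout, field names and bounded-string length are illustrative assumptions, not a prescribed format):

with Ada.Strings.Bounded;

package Test_Case_Records is

   package B_Str is new Ada.Strings.Bounded.Generic_Bounded_Length (Max => 200);

   type Evaluation_Status is (Not_Run, Passed, Failed, Blocked);

   type Test_Case is record
      Identifier       : B_Str.Bounded_String;  -- unique name of the test case
      Test_Items       : B_Str.Bounded_String;  -- units/features under test
      Input_Spec       : B_Str.Bounded_String;  -- inputs and preconditions
      Expected_Output  : B_Str.Bounded_String;  -- output specification (oracle data)
      Environment      : B_Str.Bounded_String;  -- environmental requirements
      Procedure_Notes  : B_Str.Bounded_String;  -- procedural requirements
      --  History, augmented after each execution:
      Actual_Result    : B_Str.Bounded_String;
      Status           : Evaluation_Status := Not_Run;
      Coverage_Percent : Natural := 0;          -- contribution to coverage
   end record;

end Test_Case_Records;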

Page 18

Test Criterion

» Test Criterion provides the guidelines, rules, strategy by which test cases are selected

requirements on test data -> conditions on test data -> actual test data

» Equivalence partitioning is what one hopes for (a small example is sketched below)

- a test of any value in a given class is equivalent to a test of any other value in that class

- if a test case in a class reveals a failure, then any other test case in that class should reveal the failure

- some approaches limit conclusions to some chosen class of faults and/or failures
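A small invented example of the idea: suppose the specification says a routine accepts an integer mark and reports Pass for 18..30, Fail for 0..17 and Invalid otherwise. A criterion of "one test per specified class" induces four subdomains, and under the equivalence assumption one representative per class suffices.

with Ada.Text_IO; use Ada.Text_IO;

procedure Equivalence_Sketch is

   type Verdict is (Invalid, Fail, Pass);
   type Mark_List is array (Positive range <>) of Integer;

   --  One representative per subdomain induced by the specification:
   --  below range, failing marks, passing marks, above range.
   Representatives : constant Mark_List := (-5, 10, 25, 40);
   Expected        : constant array (1 .. 4) of Verdict :=
                       (Invalid, Fail, Pass, Invalid);

   --  Hypothetical unit under test, implementing the informal spec above.
   function Classify (Mark : Integer) return Verdict is
   begin
      if Mark < 0 or Mark > 30 then
         return Invalid;
      elsif Mark < 18 then
         return Fail;
      else
         return Pass;
      end if;
   end Classify;

begin
   for I in Representatives'Range loop
      Put_Line ("Mark" & Integer'Image (Representatives (I)) & " -> " &
                Verdict'Image (Classify (Representatives (I))) &
                ", expected " & Verdict'Image (Expected (I)));
   end loop;
end Equivalence_Sketch;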

Page 19

Theory of Testing

Let

P = program under test
S = specification of P
D = input domain of S and P
T = subset of D, used as test set for P
C = test criterion

» P is incorrect if it is inconsistent with S on some element of D

» T is unsuccessful if there exists an element of T on which P is incorrect
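The last definition reads directly as an executable check, as in this minimal sketch (P and S are tiny invented stand-ins for the program under test and an executable specification):

with Ada.Text_IO; use Ada.Text_IO;

procedure Theory_Sketch is

   type Test_Set is array (Positive range <>) of Integer;

   --  A test set T, a subset of the input domain D (= Integer here).
   T : constant Test_Set := (0, 1, -7, 100);

   --  Hypothetical program under test P, with a deliberately seeded fault
   --  for negative inputs.
   function P (X : Integer) return Integer is
   begin
      if X < 0 then
         return 2 * X + 1;   -- seeded fault
      else
         return 2 * X;
      end if;
   end P;

   --  Hypothetical executable specification S of the intended behaviour.
   function S (X : Integer) return Integer is
   begin
      return X + X;
   end S;

   --  T is unsuccessful if there exists an element of T on which
   --  P is inconsistent with S.
   function Unsuccessful (The_Tests : Test_Set) return Boolean is
   begin
      for I in The_Tests'Range loop
         if P (The_Tests (I)) /= S (The_Tests (I)) then
            return True;
         end if;
      end loop;
      return False;
   end Unsuccessful;

begin
   Put_Line ("T is unsuccessful: " & Boolean'Image (Unsuccessful (T)));
end Theory_Sketch;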

Page 20

Subdomain-based Test Criterion

» A test criterion C is subdomain-based if it induces one or more subsets, or subdomains, of the input domain

» A subdomain-based criterion typically does not partitionthe input domain (into a set of non-overlapping subdomains whose union is the entire domain)

Page 21

Test Oracle

» A test oracle is a mechanism for verifying the behavior of a test execution

- verifying results is extremely costly and error prone

- oracle design is a critical part of test planning

» Sources of oracles

- input/outcome oracle

- tester decision

- regression test suites

- standardized test suites and oracles

- gold or existing program

- formal specification

Page 22

The oracle problem

Given input d, the expected output is f*(d); the oracle answers YES only if the actual output equals the expected one, i.e. f(d) = f*(d).

» In some cases easier (e.g., an existing version, existing formal specification), but generally very difficult (e.g., operational testing)

» A research problem that is not emphasized enough?
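A minimal sketch of an input/outcome oracle in this style (F and F_Star are invented stand-ins; in practice f* may come from a gold program, a previous version, or a formal specification, as listed on the previous slide):

with Ada.Text_IO; use Ada.Text_IO;

procedure Oracle_Sketch is

   --  Hypothetical implementation under test, f.
   function F (D : Integer) return Integer is
   begin
      return abs D;
   end F;

   --  Reference computation f*, e.g. a trusted existing program or a
   --  table derived from the specification.
   function F_Star (D : Integer) return Integer is
   begin
      if D < 0 then
         return -D;
      else
         return D;
      end if;
   end F_Star;

begin
   --  The oracle answers YES (pass) for input d exactly when f(d) = f*(d).
   for D in -3 .. 3 loop
      if F (D) = F_Star (D) then
         Put_Line ("d =" & Integer'Image (D) & " : YES");
      else
         Put_Line ("d =" & Integer'Image (D) & " : NO (failure)");
      end if;
   end loop;
end Oracle_Sketch;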

Page 23

Test Adequacy

» Theoretical notions of test adequacy are usually defined in terms of adequacy criteria

- Coverage metrics (sufficient percentage of the program structure has been exercised)

- Empirical assurance (the failures-per-test curve flattens out)

- Error seeding (percentage of seeded faults found is proportional to the percentage of real faults found)

- Independent testing (faults found in common are representative of total population of faults)

» Adequacy criteria are evaluated with respect to a test suite and a program under test
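The error-seeding bullet above also yields a rough quantitative estimate: assuming seeded and real faults are found in the same proportion, (real faults found) / (total real faults) ≈ (seeded faults found) / (seeded faults total). A minimal sketch with invented numbers:

with Ada.Text_IO; use Ada.Text_IO;

--  Rough fault-seeding estimate under the proportionality assumption
--  stated above.  All the numbers are invented for illustration.
procedure Seeding_Sketch is
   Seeded_Total : constant Float := 10.0;  -- faults deliberately seeded
   Seeded_Found : constant Float := 8.0;   -- seeded faults detected by the tests
   Real_Found   : constant Float := 40.0;  -- real (non-seeded) faults detected
   Estimated_Real_Total : Float;
begin
   --  real_found / real_total ~ seeded_found / seeded_total
   Estimated_Real_Total := Real_Found * Seeded_Total / Seeded_Found;
   Put_Line ("Estimated total real faults:" &
             Integer'Image (Integer (Estimated_Real_Total)));                -- ~ 50
   Put_Line ("Estimated remaining faults :" &
             Integer'Image (Integer (Estimated_Real_Total - Real_Found)));   -- ~ 10
end Seeding_Sketch;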

Page 24

Testing Approaches

Page 25

Unit, Integration and System Testing [BE90, ch1]

» Unit Testing:

- A unit is the smallest piece of software

> A unit may be a procedure, a piece of code, one component (in object oriented systems), one system as a whole.

- The purpose of unit testing is to ensure that the unit satisfies its functional specification and/or that its implemented structure matches the intended design structure

- Unit tests can also be applied to test interfaces or local data structures.
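A minimal sketch of a unit test in this spirit (the Minimum function is an invented unit; the driver checks it directly against its functional specification):

with Ada.Text_IO; use Ada.Text_IO;

procedure Unit_Test_Sketch is

   --  The unit under test (hypothetical): returns the smaller of A and B.
   function Minimum (A, B : Integer) return Integer is
   begin
      if A < B then
         return A;
      else
         return B;
      end if;
   end Minimum;

   Failures : Natural := 0;

   --  One check against the functional specification.
   procedure Check (A, B, Expected : Integer) is
   begin
      if Minimum (A, B) /= Expected then
         Failures := Failures + 1;
         Put_Line ("FAIL: Minimum (" & Integer'Image (A) & "," &
                   Integer'Image (B) & " ) /=" & Integer'Image (Expected));
      end if;
   end Check;

begin
   Check (1, 2, 1);
   Check (2, 1, 1);
   Check (-3, -3, -3);
   Put_Line ("Failures:" & Natural'Image (Failures));
end Unit_Test_Sketch;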

Page 26

Integration Testing

» Integration testing is specifically aimed at exposing the problems that arise from the combination of components

- Assuming that components have been unit tested

- In particular, the communication interfaces among the integrated components need to be tested to avoid communication errors.

» Different approaches:

- non-incremental: big-bang

- incremental: top-down, bottom-up, mixed

- In Object-Oriented distributed systems: new techniques, based on dependence analysis or dynamic relations of inheritance and polymorphism.

Page 27

System Testing

» Involves the whole system embedded in its actual hardware environment

» It attempts to reveal bugs which depend on the environment (bugs that cannot be attributed to components)

» Recovery testing, security testing, stress testing and performance testing

Page 28

System Testing (cont.)

» Stress testing: designed to test the software in abnormal situations.

- Stress testing attempts to find the limits at which the system will fail through abnormal quantity or frequency of inputs.

- The test is expected to succeed even when the system is stressed with higher rates of inputs, maximum use of memory or other system resources.

» Performance testing is usually applied to real-time, embedded systems in which low performance may have a serious impact on normal execution.

- Performance testing checks the run-time performance of the system and may be coupled with stress testing.

- Performance is not strictly related to functional requirements: functional tests may fail, while performance ones may succeed.

Page 29

Henry Muccini's metaphor

» The metaphor I like to use to explain how unit, integration and system testing differ considers

- a unit as a "musician", playing his/her instrument. Unit testing consists of verifying that the musician produces high-quality music.

- Putting together musicians each one playing good music is not enough to guarantee that the group (i.e., the integration of units) produces good music. They may play different pieces, or the same piece with different rhythms.

- The group may then play in different environments (e.g., theater, stadium, arena) with different results (i.e., system testing).

Page 30

Test Planning throughout the Process

[figure: a V-model relating development artifacts to test levels, plotted against time and levels of abstraction. User Requirements correspond to Acceptance Testing, the Software Requirements Specification to System Testing, the Architecture Design Specification to Integration Testing, Component Specifications to Component Testing, and Unit Implementations to Unit Testing. The left branch is "design & analyze", the right branch is "integrate & test"; each test level is planned against, and validates/verifies, the corresponding artifact.]

Page 31

Operational Testing

» A completely different approach:

- We cannot realistically presume to find and remove the last failure.

- Then, we should invest test resources to find out the “biggest” ones.

» “Goodness” here means to minimise the user perceived faults, i.e.

- Try to find earlier those bugs that are more frequently encountered (neglect “small” bugs)

- Not suitable for safety-critical applications
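A minimal sketch of the underlying idea (the operations and usage probabilities are invented): test effort is allocated in proportion to an assumed operational profile, so the operations users exercise most, and whose failures they would perceive first, receive most of the tests.

with Ada.Text_IO; use Ada.Text_IO;

procedure Operational_Profile_Sketch is

   type Operation is (Login, Search, Checkout, Admin_Report);

   --  Hypothetical operational profile: estimated fraction of field usage.
   Profile : constant array (Operation) of Float :=
               (Login        => 0.30,
                Search       => 0.55,
                Checkout     => 0.14,
                Admin_Report => 0.01);

   Budget : constant Float := 200.0;  -- total number of test cases available

begin
   for Op in Operation loop
      Put_Line (Operation'Image (Op) & " gets" &
                Integer'Image (Integer (Profile (Op) * Budget)) & " tests");
   end loop;
end Operational_Profile_Sketch;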

Page 32

Structural and Functional Testing

» Structural Testing

- Test cases selected based on structure of code

- Views the program/component as a white box (also called glass-box testing)

» Functional Testing

- Test cases selected based on specification

- Views program/component as black box

Can do black-box testing of program by white-box testing of specification

Page 33

Black Box vs. White Box Testing

[figure: in "black box" testing, selected inputs are fed to the software and the resultant outputs are compared with the desired output; in "white box" testing the software design is also used, so the internal behavior can be examined in addition to the resultant outputs.]

Page 34

Structural (White-Box) Test Criteria

» Criteria based on

- control flow

- data flow

- expected faults

» Defined formally in terms of flow graphs

» Metric: percentage of coverage achieved

» Adequacy based on metric requirements for criteria

Objective: Cover the software structure

Page 35

Flow Graphs

» Control Flow

- The partial order of statement execution, as defined by the semantics of the language

» Data Flow

- The flow of values from definitions of a variable to its uses

Graph representation of control flow and data flow relationships
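A small invented fragment annotated with the definitions and uses that data-flow relationships are built from (this is not the sample program on the next slide):

--  An invented fragment annotated with definitions (def) and uses (use),
--  the building blocks of data-flow relationships.
procedure Def_Use_Sketch is
   X, Y : Integer;
begin
   X := 10;           -- def X
   Y := X + 1;        -- use X, def Y
   if Y > 5 then      -- use Y (a use in a predicate)
      X := Y * 2;     -- use Y, def X
   end if;
   Y := X - 3;        -- use X (reached by two different defs of X), def Y
end Def_Use_Sketch;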

Page 36

A Sample Program to Test

 1  function P return INTEGER is
 2     X, Y: INTEGER;
 3  begin
 4     READ(X); READ(Y);
 5     while (X > 10) loop
 6        X := X - 10;
 7        exit when X = 10;
 8     end loop;
 9     if (Y < 20 and then X mod 2 = 0) then
10        Y := Y + 20;
11     else
12        Y := Y - 20;
13     end if;
14     return 2*X + Y;
15  end P;

Page 37

P's Control Flow Graph (CFG)

[figure: flow graph with nodes 2,3,4 (entry block), 5, 6, 7, 9a, 9b, 10, 12, 14; node 9 is split into 9a and 9b because of the short-circuit "and then" condition. Edges: (2,3,4)->5; 5 -T-> 6, 5 -F-> 9a; 6->7; 7 -T-> 9a, 7 -F-> 5; 9a -T-> 9b, 9a -F-> 12; 9b -T-> 10, 9b -F-> 12; 10->14; 12->14.]

Page 38

All-Statements Coverage of P

[figure: P's control flow graph, as on the previous slide, with node 9 unsplit]

At least 2 test cases are needed.

Example all-statements-adequate test set:

(X = 20, Y = 10)   <2,3,4,5,6,7,9,10,14>

(X = 20, Y = 30)   <2,3,4,5,6,7,9,12,14>
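For completeness, here is a self-contained adaptation of P (my adaptation: X and Y become parameters instead of READ, so a driver can supply them) that runs the all-statements-adequate test set above:

with Ada.Text_IO; use Ada.Text_IO;

procedure All_Statements_Driver is

   --  P from the sample slide, with READ(X); READ(Y) replaced by parameters
   --  so that a test driver can provide the inputs.
   function P (X_In, Y_In : Integer) return Integer is
      X : Integer := X_In;
      Y : Integer := Y_In;
   begin
      while X > 10 loop
         X := X - 10;
         exit when X = 10;
      end loop;
      if Y < 20 and then X mod 2 = 0 then
         Y := Y + 20;
      else
         Y := Y - 20;
      end if;
      return 2 * X + Y;
   end P;

begin
   --  All-statements-adequate test set from the slide.
   Put_Line ("P (20, 10) =" & Integer'Image (P (20, 10)));  -- path <2,3,4,5,6,7,9,10,14>
   Put_Line ("P (20, 30) =" & Integer'Image (P (20, 30)));  -- path <2,3,4,5,6,7,9,12,14>
end All_Statements_Driver;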

Page 39

All-Edges Coverage of P

[figure: P's control flow graph with node 9 split into 9a and 9b for the two parts of the "and then" condition]

At least 3 test cases are needed.

Example all-edges-adequate test set:

(X = 20, Y = 10)   <2,3,4,5,6,7,9a,9b,10,14>

(X = 5, Y = 30)    <2,3,4,5,9a,12,14>

(X = 21, Y = 10)   <2,3,4,5,6,7,5,6,7,5,9a,9b,12,14>

Page 40

P's Control and Data Flow Graph

[figure: the control flow graph of P annotated with data-flow information: the definitions and uses of X and Y are attached to the nodes and edges (both are defined in the entry block 2,3,4; X is used and redefined in the loop nodes 5, 6, 7 and used in 9b; Y is used in 9a and redefined in 10 or 12; both are used in the return at node 14).]

Page 41

Functional Testing

» The internal structure and behavior of the program are not considered

» What is assumed is the availability of a (formal or informal) specification of the system under test; the objective is to analyze whether the system behaves correctly with respect to that specification

» This analysis technique is particularly useful when the source code is not available

» A functional test consists of analyzing how the system reacts to external inputs

Page 42

References

» [BE90] B. Beizer. Software Testing Techniques. Van Nostrand Reinhold, 2nd edition, 1990.

» [M03] E. Marchetti. Software Testing in the XXI Century: Methods, Tools and New Approaches to Manage, Control and Evaluate This Critical Phase. Ph.D. Thesis, Dipartimento di Informatica, Università di Pisa, 2003.