
Page 1:

PRESENTATION

International Conference On Software Testing, Analysis & Review
DEC 4-8, 2000 • COPENHAGEN, DENMARK

Friday, Dec 8, 2000

Test Effort Estimation with Test Point Analysis

Ruud Teunissen

Page 2:

Test effort estimation with Test Point Analysis

Ruud Teunissen
Gitek nv - interaction through software

EuroSTAR 2000

Page 3:

Topics

• Test effort estimation
• Systematic software testing
• Risk based testing
• Test effort estimation with Test Point Analysis
• Conclusion

Page 4:

Test effort estimation

• Possible approaches:
  – top down, according to the permitted budget
  – extrapolation from (expected) design and/or build time
  – related to functionality (FPs, # pages of FD)
  – bottom up, from a work breakdown
  – intuition of experienced testers
  – comparison with earlier test projects
  – Test Point Analysis (TPA®)
  – ...

Page 5:

Required: Structure

Life-cycle       What, when?
Techniques       How?
Infrastructure   Where, etc.?
Organisation     Who?

[TMap cornerstones diagram: L, T, I, O]

TEST → DEFECTS, ADVICE
• Informing about quality & risks
• Delivering re-usable testware
• Preventing defects

Page 6:

TMap® testing life cycle

[TMap® life-cycle diagram]

Planning & Control (P&C)
  Preparation (P)
  Specification (S)
  Execution (E)
  Completion (C)

Page 7:

Test Strategy

Aim: Find the most important defects as early as possible at the lowest price

What are the most important defects?

• Risks (business, project, test)
• Quality characteristics
• Available resources
  – people & expertise
  – infrastructure and tools
  – techniques
• Budget and time

[TMap cornerstones diagram: Techniques highlighted]

Page 8:

Risk thinking: Introduction

• Business reasons

• No risk, no test

• Risk analysis approach

[Diagram: balancing risk, budget and coverage]

Page 9:

Risk: Definition

Risk = Chance of failure × Damage

Chance of failure = Frequency of use × Chance of error

Page 10:

Parties involved

[Diagram: the parties involved, positioned around the risk factors (frequency of use, damage, chance of error) with TEST at the centre: business, functional management, system management, data center, accountancy, CM & CC, QA, DBA, programmers, functional design, technical design, project management.]

Page 11:

Test importance per Q-char/subsystem

[Strategy matrix: rows are the quality characteristics (functionality, security, time behavior, suitability, usability), columns are the subsystems s1, s2, s3, conversion and the total. '+' marks in the cells indicate the relative test importance; subsystem weights 30 / 15 / 20 / 15 / 20 %, quality-characteristic weights 60 / 5 / 5 / 20 / 10 %.]

Page 12:

Strategy based effort estimation

Size + Strategy + Productivity
  ↓
Test Point Analysis
  ↓
Test effort

Page 13:

TPA, The model

Dynamic testing + Static testing   (driven by the test strategy)
  ↓
Total test points
  ↓  (× environmental factor, × productivity factor)
Primary test hours
  ↓  (+ management overhead)
Total test hours
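
Read as a calculation, the model chains these boxes together. The Python sketch below, reused for the running examples later in this deck, is an illustration of that flow; the function name and the exact way the factors combine are read off the diagram rather than quoted from the slides, and the individual factors are derived on the following pages.

```python
# Sketch of the TPA flow above; names and the combination of factors
# are illustrative assumptions based on the model diagram.

def total_test_hours(dynamic_tp: float, static_tp: float,
                     environmental_factor: float,
                     productivity_factor: float,
                     overhead_pct: float) -> float:
    total_test_points = dynamic_tp + static_tp
    # Test points become hours via the environmental and productivity factors.
    primary_test_hours = (total_test_points
                          * environmental_factor
                          * productivity_factor)
    # Management overhead is added on top as a percentage.
    return primary_test_hours * (1 + overhead_pct / 100)
```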

Page 14:

Dynamic and static testing

• Dynamic testing
  – activities related to individual functions or subsystems
  – explicit testing (dedicated test cases)
  – implicit testing (evaluating certain characteristics during explicit tests)

• Static testing
  – activities related to the system as a whole
  – verification (checklists)

Page 15:

Dynamic Testing

• Dynamic test points = ∑ (FP × T × Qd)
  – FP = size in function points per function
  – T = test impact per function
  – Qd = dynamic quality attributes

[TPA model diagram: Dynamic Testing highlighted]

Page 16:

Function Point Analysis

• Establish the size of the functionality based on:
  – number of functions:
    • number of screens and fields
    • number of reports and fields
    • processing functions
  – number of files and fields
• Measurement unit: function point (FP)
• Accepted standard, worldwide
• Productivity (= hours/FP) depends on the organisation and programming language

Page 17:

Test impact per function

• user-importance
• usage-intensity
• interfacing
• complexity
• uniformity

Page 18:

User-Importance

• The user-importance is an expression of the importance that the user attaches to a given function relative to the other system functions.

Rating
3   Low: the importance of the function relative to the other functions is low;
6   Normal: the importance of the function relative to the other functions is normal;
12  High: the importance of the function relative to the other functions is high.

Page 19:

Usage-Intensity

• Usage-intensity is defined as the frequency with which a certain function is processed by the users and the size of the user group that uses it. As with user-importance, usage-intensity is determined at the user-function level.

Rating
2   Low: the function is executed by the user organisation only a few times per day or per week;
4   Normal: the function is executed by the user organisation a great many times per day;
8   High: the function is used continuously throughout the day.

Page 20:

Interfacing

• Interfacing is an expression of the extent to which a modification in a given function affects other parts of the system.

Rating
2   The degree of interfacing associated with the function is low;
4   The degree of interfacing associated with the function is normal;
8   The degree of interfacing associated with the function is high.

LDS \ functions    1     2-5    >5
1                  L     L      A
2-5                L     A      H
>5                 A     H      H
(L = low, A = average, H = high degree of interfacing)

Page 21:

Complexity

• The complexity of a function is determined on the basis of its algorithm.

Rating
3   The function contains no more than five conditions;
6   The function contains between six and eleven conditions;
12  The function contains more than eleven conditions.

Page 22:

Uniformity

• Under the following circumstances, only 60% of the test points assigned to the function under analysis count towards the system total:
  – a second occurrence of a virtually unique function: in such cases, the test specifications can be largely reused;
  – a clone function: the test specifications can be reused for clone functions;
  – a dummy function (provided that reusable test specifications have already been drawn up for the dummy).

Page 23:

Example: test impact (T)

Factor               Rating
• user-importance    3    6    12
• usage-intensity    2    4    8
• interfacing        2    4    8
• complexity         3    6    12
• uniformity         0.6  1

T(f) = ((3 + 4 + 4 + 12) / 20) × 1 = 1.15
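
A minimal sketch of this calculation, assuming, as the example implies, that the four selected ratings are summed, divided by 20 (the sum of the nominal ratings 6 + 4 + 4 + 6), and multiplied by the uniformity factor:

```python
def test_impact(user_importance: int, usage_intensity: int,
                interfacing: int, complexity: int,
                uniformity: float = 1.0) -> float:
    # 20 = sum of the nominal ('normal') ratings 6 + 4 + 4 + 6.
    return ((user_importance + usage_intensity
             + interfacing + complexity) / 20) * uniformity

# Reproduces the slide's example: ((3 + 4 + 4 + 12) / 20) * 1 = 1.15
assert abs(test_impact(3, 4, 4, 12) - 1.15) < 1e-9
```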

Page 24:

Standard Functions

• error report function
• help-screen function
• menu structure function

Function        FP's   Ue   Uy   I   C   U   Df
Error message   4      6    8    4   3   1   1.05
Help screens    4      6    8    4   3   1   1.05
Menus           4      6    8    4   3   1   1.05

Page 25:

Dynamic Testing

• Dynamic test points = ∑ (FP × T × Qd)
  – FP = size in function points per function
  – T = test impact per function
  – Qd = dynamic quality attributes

[TPA model diagram: Dynamic Testing highlighted]
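
With FP, T and Qd now defined, the dynamic test point sum itself is a one-liner. A sketch with invented per-function values (the tuple layout is purely illustrative):

```python
# (fp, t, qd) per function; the numbers below are illustrative only.
functions = [
    (4, 1.05, 1.26),   # a standard function such as an error report
    (12, 1.15, 1.26),  # a function with the worked example's test impact
]

# Dynamic test points = sum of FP * T * Qd over all functions.
dynamic_test_points = sum(fp * t * qd for fp, t, qd in functions)
print(f"Dynamic test points: {dynamic_test_points:.2f}")
```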

Page 26:

Dynamic quality attributes (explicit)

• Required depth (test coverage) of explicit tests

Rating
0   Quality requirements are not important and are therefore disregarded for test purposes;
3   Quality requirements are relatively unimportant but do need to be taken into consideration for test purposes;
4   Quality requirements are of average importance (this rating is generally appropriate when the information system relates to a support process);
5   Quality requirements are very important (this rating is generally appropriate when the information system relates to a primary process);
6   Quality requirements are extremely important.

Page 27:

Dynamic quality attributes

• Balanced, using a weight factor:
  – functionality   (0.75)   0 3 4 5 6
  – security        (0.05)   0 3 4 5 6
  – suitability     (0.10)   0 3 4 5 6
  – performance     (0.10)   0 3 4 5 6

Qd' = (5/4) × 0.75 + (3/4) × 0.05 + (5/4) × 0.10 + (4/4) × 0.10 = 1.20

Page 28:

Dynamic quality attributes (implicit)

• User friendliness   (0.02)   Y N
• Resource usage      (0.02)   Y N
• Performance         (0.02)   Y N
• Maintainability     (0.04)   Y N

Qd(implicit) = 0.02 + 0 + 0 + 0.04 = 0.06

Qd = 1.20 + 0.06 = 1.26
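
A minimal sketch of the Qd calculation as the last two slides show it: each explicit rating is normalised against the average rating of 4 and weighted, and each selected implicit attribute simply adds its weight. The dictionary keys are illustrative names, not TPA terminology:

```python
EXPLICIT_WEIGHTS = {"functionality": 0.75, "security": 0.05,
                    "suitability": 0.10, "performance": 0.10}
IMPLICIT_WEIGHTS = {"user_friendliness": 0.02, "resource_usage": 0.02,
                    "performance": 0.02, "maintainability": 0.04}

def qd(explicit_ratings: dict, implicit_selected: set) -> float:
    # Explicit part: rating / 4 (the 'average' rating), times the weight.
    explicit = sum((explicit_ratings[name] / 4) * w
                   for name, w in EXPLICIT_WEIGHTS.items())
    # Implicit part: selected attributes contribute their weight directly.
    implicit = sum(w for name, w in IMPLICIT_WEIGHTS.items()
                   if name in implicit_selected)
    return explicit + implicit

# Reproduces the slides: Qd' = 1.20, implicit = 0.06, Qd = 1.26
value = qd({"functionality": 5, "security": 3,
            "suitability": 5, "performance": 4},
           {"user_friendliness", "maintainability"})
print(round(value, 2))  # 1.26
```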

Page 29:

Rating versus test design techniques

Rating:           3                  4                      5                     6

Suitability
• Processing:     Error guessing     DFT and                DFT and ECT           ECT
                                     error guessing

• Screen checks:  Error guessing     sample SEM and         sample SEM and        SEM and
                                     error guessing         SYN                   sample SYN

Security          Error guessing     SEM, sample of         SEM, user             SEM, user profiles
                                     user profiles          profiles              and overall system*

Usability         No test specs,     Use cases or PCT,      Use cases or PCT,     Usability
                  SUMI               and SUMI               and SUMI              Laboratory Test

Efficiency        The thoroughness of the RLT is variable and is thus determined by the
                  rating and the amount of hours that becomes available as a consequence.

DFT = Data Flow Test
ECT = Elementary Comparison Test
SEM = Semantic Test
SYN = Syntactic Test
PCT = Process Cycle Test
SUMI = Software Usability Measurement Inventory
RLT = Real Life Test

Page 30:

Static testing

• TP(indirect) = Qs × (FP / 500)
  – Qs = static quality attributes to be tested
  – FP = total number of function points of the system

Each static quality attribute costs 16 TPs per 500 FPs, so Qs is 16 times the number of attributes selected.

[TPA model diagram: Static Testing highlighted]

Page 31:

Example: static test points

• Size of the system = 3500 FP
• Static quality attributes:
  – flexibility     Y N
  – testability     Y N
  – security        Y N
  – continuity      Y N
  – traceability    Y N

TP(indirect) = (16 + 16) × 3500 / 500 = 224
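
A minimal sketch of the static test point rule, taking Qs as 16 per selected static quality attribute, which is what the "16 TPs per 500 FPs" rule and this example imply:

```python
def static_test_points(total_fp: int, attributes_selected: int) -> float:
    # TP(indirect) = Qs * (FP / 500), with Qs = 16 per selected attribute.
    qs = 16 * attributes_selected
    return qs * total_fp / 500

# Reproduces the slide: two attributes selected on a 3500 FP system.
print(static_test_points(3500, 2))  # 224.0
```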

Page 32:

Environmental factor

[TPA model diagram: Environmental factor highlighted]

• Test tool support
• Development tests
• Product documentation
• Development environment
• Test environment
• Testware

Page 33:

Test Tools

• The test tools variable reflects the extent to which the testing activities are supported by automated tools.

Rating
1   During testing, a supporting tool for test specification is applied and a tool is used for 'record & playback';
2   During testing, a supporting tool for test specification is applied or a tool is used for 'record & playback';
4   No test tools are available.

Page 34:

Development Testing

• The development testing variable reflects the quality of earlier testing.

Rating
2   A development testing plan is available and the test team is familiar with the actual test cases and test results;
4   A development testing plan is available;
8   No development testing plan is available.

Page 35:

Test Basis

• The test basis variable reflects the quality of the (system) documentation upon which the test under consideration is to be based.

Rating
3   During system development, documentation standards and templates are used; in addition, reviews are organized;
6   During system development, documentation standards and templates are used, but their use is not checked in any formal way;
12  The system documentation was not developed using specific standards and templates.

Page 36:

Development Environment

• The development environment variable reflects the nature of the environment within which the information system was realized.

Rating
2   The system was developed using a 4GL programming language with an integrated DBMS containing numerous constraints;
4   The system was developed using a 4GL programming language, possibly in combination with a 3GL programming language;
8   The system was developed using only a 3GL programming language.

Page 37:

Test Environment

• The test environment variable reflects the extent to which the test infrastructure in which testing is to take place has previously been tried out.

Rating
1   The environment has been used for testing several times in the past;
2   The test is to be conducted in a newly equipped environment similar to other well-used environments within the organization;
4   The test is to be conducted in a newly equipped environment, which may be considered experimental within the organization.

Page 38:

Testware

• The testware variable reflects the extent to which the tests can be conducted using existing testware.

Rating
1   A usable, general initial data set (tables, etc.) and specified test cases are available for the test;
2   A usable, general initial data set (tables, etc.) is available for the test;
4   No usable testware is available.

Page 39:

Example: factor conditions

External condition           Score
• Test tool support          1   2   4
• Development tests          2   4   8
• Product documentation      3   6   12
• Development environment    2   4   8
• Test environment           1   2   4
• Testware                   1   2   4

Factor = (1 + 8 + 6 + 4 + 4 + 4) / 21 = 1.29
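
A minimal sketch of the environmental factor: the six selected condition scores are summed and divided by the fixed norm of 21 used on this slide.

```python
def environmental_factor(scores):
    # One score for each of the six environmental conditions,
    # divided by the fixed TPA norm of 21.
    assert len(scores) == 6, "expected one score per condition"
    return sum(scores) / 21

# Reproduces the slide: (1 + 8 + 6 + 4 + 4 + 4) / 21 = 1.29
print(round(environmental_factor([1, 8, 6, 4, 4, 4]), 2))  # 1.29
```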

Page 40:

Productivity factor

• Depending on:
  – experienced personnel
  – culture
  – work procedures

• Typical range: 0.7 - 2.0

[TPA model diagram: Productivity factor highlighted]

Page 41:

Management overhead

• Team size
  – larger team -> more coordination
• Management tool support
  – time registration and reporting
  – problem registration and reporting

[TPA model diagram: Management overhead highlighted]

Page 42:

Team Size

• The team size factor reflects the number of people making up the team (including the test manager and, where appropriate, the test controller).

Rating
3   The team consists of no more than four people;
6   The team consists of between five and ten people;
12  The team consists of more than ten people.

Page 43:

Management Tools

• The management (planning and control) tools variable reflects the extent to which automated resources are to be used for planning and control.

Rating
2   Both an automated time registration system and an automated defect tracking system (incl. configuration management) are available and applied;
4   Either an automated time registration system or an automated defect tracking system (incl. configuration management) is available and applied;
8   No automated (management) systems are available.

Page 44:

Example: Management overhead

• Added directly as a percentage
  – team size      3   6   12
  – mgt. support   2   4   8

Percentage = 6 + 8 = 14%
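
A minimal sketch of this step: the two selected ratings are added and the sum is applied as a percentage on top of the primary test hours.

```python
def management_overhead_pct(team_size_rating: int,
                            mgt_tool_rating: int) -> int:
    # Added directly as a percentage on top of the primary test hours.
    return team_size_rating + mgt_tool_rating

# Reproduces the slide: 6 + 8 = 14 (%)
print(management_overhead_pct(6, 8))  # 14
```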

Page 45:

Distribution of the test effort

Preparation      10 %
Specification    40 %
Execution        45 %
Completion        5 %

Planning & Control overhead: an additional 5-20 %
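
A small sketch that spreads an estimated number of primary test hours over the phases using the percentages above; the 10% used for Planning & Control is one assumed point within the 5-20% range.

```python
PHASE_SHARES = {"Preparation": 0.10, "Specification": 0.40,
                "Execution": 0.45, "Completion": 0.05}

def distribute(primary_hours: float, pc_overhead: float = 0.10) -> dict:
    # Planning & Control comes on top; 10% assumed from the 5-20% range.
    plan = {phase: primary_hours * share
            for phase, share in PHASE_SHARES.items()}
    plan["Planning & Control"] = primary_hours * pc_overhead
    return plan

for phase, hours in distribute(1000).items():
    print(f"{phase}: {hours:.0f} h")
```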

Page 46:

From business risks to estimated test effort!

Risk = Chance of failure (Frequency of use × Chance of error) × Damage
  ↓
Size + Strategy + Productivity
  ↓
Test Point Analysis
  ↓
Estimated test effort
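
Plugging the example numbers from the earlier slides into the total_test_hours() sketch given after the model diagram; the dynamic test point total of 600 and the productivity factor of 1.0 are invented for illustration:

```python
# Reuses total_test_hours() from the sketch after the TPA model diagram.
hours = total_test_hours(dynamic_tp=600,            # invented for illustration
                         static_tp=224,             # static TP example (page 31)
                         environmental_factor=1.29, # page 39
                         productivity_factor=1.0,   # assumed
                         overhead_pct=14)           # page 44
print(round(hours))  # ~1212 hours
```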

Page 47:

Literature on Test Point Analysis

• English: "Structured Testing of Information Systems, an Introduction to TMap", by Martin Pol and Erik van Veenendaal, published by Kluwer, Deventer, 1998, ISBN 90-267-2910-3.

• English: "Structured Testing of Information Systems", by Martin Pol, Ruud Teunissen and Erik van Veenendaal, to be published by Addison-Wesley, Jan-Feb 2001.

• German: "Management und Optimierung des Testprozesses. Ein praktischer Leitfaden für erfolgreiches Testen von Software", by Martin Pol, Tim Koomen and Andreas Spillner, dpunkt.verlag, ISBN 3-932588-65-7.

• Dutch: "Testen volgens TMap", by Martin Pol, Ruud Teunissen and Erik van Veenendaal, published by Tutein Nolthenius, 's-Hertogenbosch, 1999, ISBN 90-72194-58-6.

Page 48:

Questions?

Ruud Teunissen
Gitek n.v. - interaction through software

e-mail: [email protected]
http://www.gitek.be

Page 49:

Gitek n.v. - interaction through software

• Founded in September 1986
• Number of employees: 100+
• IT Services: services on design & realization of applications, system management, network management, project assistance or complete project realization
• Test Expertise: advice, implementing a structured test process, planning and management, test design & execution, training & coaching in testing, total test projects
• Customers in the financial and insurance sector, public sector, pharmaceutical and telecommunications industry, and chemical industry

Page 50:

Friday 8 December 2000

F10

Test Effort Estimation with Test Point Analysis
Ruud Teunissen

Ruud Teunissen has worked in the testing world since 1989. He has performed several test functions in a number of ICT projects: tester, test specialist, test consultant and test manager. Ruud participated in the development of the structured testing methodology TMap® and is co-author of several books on structured testing. As testing manager for Gitek n.v. in Belgium, Ruud is responsible for the testing department, which employs 40 professional test engineers. Ruud frequently speaks at conferences across Europe.

Books:

English: "Systematic Testing of Information Systems according to TMap®", by Martin Pol, Ruud Teunissen and Erik van Veenendaal, to be published by Addison-Wesley, January 2001.

Dutch: "Testen volgens TMap®", by Martin Pol, Ruud Teunissen and Erik van Veenendaal, published by Tutein Nolthenius, 's-Hertogenbosch, 1999, ISBN 90-72194-58-6.

Dutch: "Gestructureerd testen: een introductie tot TMap®", by Martin Pol, Ruud Teunissen and Erik van Veenendaal, published by Tutein Nolthenius, 's-Hertogenbosch, 1996, ISBN 90-72194-45-4.