12 Automated Testing Tools

Automatic Testing Tool

Upload: softwarecentral

Post on 13-Jun-2015


Page 1: 12 Automated Testing Tools

Automatic Testing Tool

Page 2: 12 Automated Testing Tools

Introduction

• The tool provides facilities for:

 Dynamic Testing: executing the software under test in order to verify its compliance

 Coverage Analysis: measuring the proportion of software exercised by dynamic testing

 Static Analysis: examining source code to assess its complexity

Page 3: 12 Automated Testing Tools
Page 4: 12 Automated Testing Tools

Functional Overview

• Dynamic Testing facilities support testing on both the host and the target.

– Coverage Analysis facilities allow the coverage of individual software units, programs/tasks and entire software systems to be assessed.

– Static Analysis: software that has been thoroughly dynamically tested can still have problems.

– Static analysis provides information on qualities such as maintainability and compliance with coding standards.

• The number of code statements and McCabe's cyclomatic complexity metric are calculated.
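The cyclomatic complexity metric mentioned above can be illustrated with a small worked example. The helper and sample function below are hypothetical, not part of the tool; they assume the common "binary decision points plus one" formulation of McCabe's metric.

```c
#include <assert.h>

/* McCabe's cyclomatic complexity for a single function can be taken
   as the number of binary decision points plus one. Illustrative
   helper, not the tool's implementation. */
static int cyclomatic_complexity(int decision_points) {
    return decision_points + 1;
}

/* Sample unit under analysis: one 'for' and one 'if' give two
   decision points, so its cyclomatic complexity is 2 + 1 = 3. */
static int clamp_sum(const int *v, int n, int limit) {
    int sum = 0;
    for (int i = 0; i < n; i++)   /* decision point 1 */
        sum += v[i];
    if (sum > limit)              /* decision point 2 */
        sum = limit;
    return sum;
}
```

A tool reporting this function would thus list a statement count alongside a McCabe value of 3.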

Page 5: 12 Automated Testing Tools

• Users can define their own coverage analysis and static analysis criteria.

Dynamic Testing

The Testing Problem

Page 6: 12 Automated Testing Tools
Page 7: 12 Automated Testing Tools

• There must be some form of specification for the software under test.

• The testing tool must handle test cases generated using:

 functional (black-box) testing
 structural (white-box) testing
 isolation testing
 integration testing
 positive testing
 negative testing

Page 8: 12 Automated Testing Tools

• The Solution

Page 9: 12 Automated Testing Tools

• A test script controls the tests.

• The test script is compiled and linked with the Test Harness (TH) and the software under test.

• This produces an executable program that generates a test Results file.

• The benefits of this approach are:

 Documentation: The tool provides documented testing.
 Repeatability: Tests can be easily repeated.
 Maintainability: Test scripts are easy to understand and update.

Page 10: 12 Automated Testing Tools

The Test Harness

• Test Harness (TH) provides facilities to run, verify the results of, document, and repeat dynamic tests.

• TH consists of a set of library directives accessed from the test script.

• The test script calls the software under test, and the TH directives embedded in the script check the effects of the call on the environment.
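The script/harness/software-under-test arrangement described above can be sketched as follows. The CHECK_INT directive and the function under test are hypothetical stand-ins, since the document does not give the harness's actual directive names.

```c
#include <stdio.h>
#include <assert.h>

/* Minimal sketch of a test-harness "directive": compare an actual
   value with its expected value and record the outcome. Hypothetical
   name, not the real TH library API. */
static int failed_checks = 0;

static void CHECK_INT(const char *name, int actual, int expected) {
    if (actual == expected) {
        printf("Check PASSED : %s\n", name);
    } else {
        printf("Check FAILED : %s expected %d, got %d\n",
               name, expected, actual);
        failed_checks++;
    }
}

/* Hypothetical software under test. */
static int double_it(int x) { return 2 * x; }

/* Test case: call the software under test, then check the effects. */
static void test_001(void) {
    CHECK_INT("double_of_21", double_it(21), 42);
    CHECK_INT("double_of_0",  double_it(0),  0);
}
```

Compiling and linking such a script with the harness library and the software under test yields the executable test program the slides describe.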

Page 11: 12 Automated Testing Tools
Page 12: 12 Automated Testing Tools

• When a test is run, a Results file is produced detailing every step of the test and highlighting any failures.

• A Results table is displayed summarizing the results for each test case and providing total figures.

• An overall statement of test pass or fail is provided and returned to the command shell.

Checking Values

• The most important aspect of dynamic testing is checking that the outputs from the software under test are as expected.

Page 13: 12 Automated Testing Tools

• TH verifies data using CHECK directives. These cause the comparison of a data item with its expected value. 

Simulating External Software

• Isolation testing implies that calls to external units and external data must be simulated. 

• External calls can be simulated, ensuring that they are made in the expected order and that input parameters have the correct values at each call.

• Return values can be individually set on different calls to the same simulated function.
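The stubbing behaviour described above can be sketched as follows. The stub and unit names are hypothetical, and a real harness would generate and manage such stubs itself rather than having them hand-written.

```c
#include <assert.h>

/* Sketch of isolation testing with a simulated external call. The
   stub records the call count, verifies its input parameter, and
   returns a value set individually for each call. */
static int stub_call_count = 0;
static int stub_return_values[2] = { 10, 20 };  /* per-call returns */

/* Stub standing in for an external unit. */
static int external_read(int channel) {
    assert(channel == 7);  /* input parameter has the expected value */
    return stub_return_values[stub_call_count++];
}

/* Unit under isolation test: combines two reads from the
   (simulated) external unit. */
static int read_twice(void) {
    return external_read(7) + external_read(7);
}
```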

Page 14: 12 Automated Testing Tools

• Similarly, external data areas may be simulated and checked.

Timing Analysis

• Correct functioning alone does not determine the acceptability of software.

• Some applications need to perform certain activities both correctly and within defined time constraints. 

 The tool permits execution times to be recorded and tests passed or failed depending on the performance of the software.
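A minimal sketch of that timing-analysis idea, assuming a simple measurement with the standard C clock() routine; a real tool would record times with greater precision and report them in the Results file.

```c
#include <time.h>

/* Record the execution time of a call and pass (1) or fail (0) the
   test against a time limit. Illustrative only. */
static int passed_within_limit(void (*activity)(void),
                               double limit_seconds) {
    clock_t start = clock();
    activity();
    double elapsed = (double)(clock() - start) / CLOCKS_PER_SEC;
    return elapsed <= limit_seconds;
}

/* Hypothetical activity under test; completes almost instantly. */
static void trivial_activity(void) { }
```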

Page 15: 12 Automated Testing Tools

Results

• TH directives mark the progress of the test run in the output.

• Any unexpected event is highlighted by a message.

• For example: a failed check will be marked with FAILED, and a diagnostic will give both the actual and expected values of the item being checked.

• The output below contains a typical section of a TH output file:

Page 16: 12 Automated Testing Tools

Test Results For : example 

Results File : example.ctr

Tests Run At : Feb 17 09:38:32

Start Test 001

  EXECUTE: my_function,

    Expected calls = 1

    START_STUB : my_stub

        CALL_REF/ACTION: Action 1, Call 1

        Check PASSED : my_stub_string

        Item = "Hello, world"

    END_STUB : my_stub

  DONE : my_function

Check FAILED : my_external

Expected 0x0000712B 28971 

Item 0x000080E8 33000 

End Test 001

Page 17: 12 Automated Testing Tools

Test    Script   Checks   Checks   Checks    Stubs    Paths    Assertions   Status
        Errors   Passed   Failed   Warning   Failed   Failed
PTE     0        0        0        0         0        0        0            PASS
001     0        0        1        0         0        1        0            >>FAIL
ANS     0        0        0        0         0        0        0            PASS
Total   0        0        1        0         0        1        0            >>FAIL

Page 18: 12 Automated Testing Tools

Test Script Generation

• The Test Script Generator (TS) prepares dynamic test scripts for execution with TH.

• TS takes information from a test case definition file and generates a test script.

• The test case definition file specifies test cases, establishes initial conditions and defines expected results.

• TS can produce comprehensive test script templates which can be the basis of manually coded test scripts.

• TS works by scanning the test case definition file and the software under test to produce the test script.

Page 19: 12 Automated Testing Tools
Page 20: 12 Automated Testing Tools

• Test scripts produced by TS feature positive and negative data checking.

• TS automatically codes check routines for user defined types and stubs for external functions/classes.

Analysis

• Test Analysis (TA) provides the user with Coverage Analysis and Static Analysis features.

Page 21: 12 Automated Testing Tools

Coverage Analysis

• Coverage analysis measures the proportion of software executed during dynamic testing:

 statement coverage
 decision (or branch) coverage
 boolean expression coverage
 call pair coverage: measures the proportion of calls to other functions which have been exercised
 call coverage: verifies that expected functions have been called
 data value coverage: checks that program variables have held a series of (user defined) values during the testing process

Page 22: 12 Automated Testing Tools

Static Analysis

• Static Analysis provides an assessment of various non-functional features relating to the software:

 Enforcement of coding standards
 Measurement of code complexity and structure

Coding Standards

• The metrics examined are incorporated into overall test pass/fail criteria.

– For example, users may check that no goto statements (or labels) are used, that only one return statement is present, and that there are no switch statements without a default.
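The goto/return/switch checks above can be approximated in a few lines. This is a deliberately crude sketch that scans source text for a keyword; a real static analyser parses the code rather than matching strings, and the function names here are hypothetical. The same idea covers restricting library calls such as malloc.

```c
#include <string.h>

/* Count occurrences of a keyword (e.g. "goto", or a banned call such
   as "malloc") in a string of source text. Crude: it does not skip
   comments or string literals, unlike a real analyser. */
static int count_occurrences(const char *source, const char *keyword) {
    int count = 0;
    size_t len = strlen(keyword);
    for (const char *p = strstr(source, keyword); p != NULL;
         p = strstr(p + len, keyword))
        count++;
    return count;
}

/* A standard such as "no goto statements" then becomes a pass/fail
   check that the count is zero. */
static int meets_no_goto_standard(const char *source) {
    return count_occurrences(source, "goto") == 0;
}
```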

Page 23: 12 Automated Testing Tools

• Data areas and types that are declared but not used are highlighted.

• Users can define their own static analysis metrics to check on the use of code.

– For instance, these facilities can be used to restrict access to certain library routine calls (such as malloc).

Page 24: 12 Automated Testing Tools

Code Complexity

• Many metrics are supported:

 McCabe's measure and Myers' extension
 Essential McCabe's
 Hansen's measure of software complexity (a cyclomatic number / operator count pair)
 Halstead's software science metrics
 Harrison's scope ratio

Page 25: 12 Automated Testing Tools

Using Analysis

• Test Analysis may be used in many different ways and, if required, be extended by the user to meet client-specific requirements.

• The Analysis comprises:

 a special C pre-processor (TP)
 an instrumenter program (TI), which analyses source code files and inserts coverage 'probes'
 an additional library of test directives, which may be incorporated into a test script (the TA library)

• The Analysis can be used in two main ways.

Page 26: 12 Automated Testing Tools

– As an extension to TH, allowing the user to fully integrate dynamic testing with coverage analysis and static verification.

• This diagram illustrates the use of Cantata Analysis in this way:

Page 27: 12 Automated Testing Tools
Page 28: 12 Automated Testing Tools

• The pre-processor and instrumenter are used to produce instrumented source code, that is, source code containing probes to facilitate the collection of coverage data.

• The instrumenter also produces an annotated code listing containing the source code and a static analysis report.

• The instrumented source code is compiled and linked with a test script and the TA library.

• When run, the resultant executable produces test results which include static analysis information, coverage analysis information, and dynamic test results.
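The probe mechanism described above can be sketched as follows. The probe array and its layout are illustrative, since the document does not show the instrumenter's actual output format.

```c
/* Sketch of what instrumentation adds: each decision outcome gets a
   probe that bumps a counter when executed, from which the coverage
   report is later produced. */
static int probe_hits[2];  /* [0] = TRUE outcome, [1] = FALSE outcome */

/* Instrumented version of a one-decision function. */
static int abs_value(int x) {
    if (x < 0) {
        probe_hits[0]++;   /* probe: decision 1, TRUE outcome */
        return -x;
    }
    probe_hits[1]++;       /* probe: decision 1, FALSE outcome */
    return x;
}
```

After a test run, any counter still at zero marks an outcome that was never exercised, which is exactly what the NOT EXECUTED entries in the decision report flag.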

Page 29: 12 Automated Testing Tools

Stand-Alone Analysis

• Test Analysis can also be used stand-alone.

• Developers can use their own or a third-party tool to test software, while generating analysis reports using Test Analysis:

Page 30: 12 Automated Testing Tools
Page 31: 12 Automated Testing Tools

Coverage Analysis Results

Coverage Analysis Within a Test Script

• Users have access to coverage and static analysis results from within the test script.

• The next example shows a simple decision coverage report, indicating execution of each decision:

Page 32: 12 Automated Testing Tools

DECISION STATISTICS REPORT

DECISION NO   LINE NO   TYPE     NO OF OUTCOMES   BREAKDOWN
1             4         if       2                TRUE:        3
                                                  FALSE:       0 >> NOT EXECUTED
2             6         switch   4                case #1:     1
                                                  case #2:     1
                                                  case #3:     0 >> NOT EXECUTED
                                                  case #4:     0 >> NOT EXECUTED
                                                  imp default: 1 >> WARNING
3             9         ?:       2                TRUE:        1
                                                  FALSE:       0 >> NOT EXECUTED
4             12        for      2                TRUE:        8
                                                  FALSE:       1
5             18        while    2                TRUE:        0 >> NOT EXECUTED
                                                  FALSE:       0 >> NOT EXECUTED
6             28        while    2                TRUE:        0 >> NOT EXECUTED
                                                  FALSE:       0 >> NOT EXECUTED

Page 33: 12 Automated Testing Tools

>> WARNING: Switch executed with unknown case value

Total of decision outcomes = 14

Total outcomes exercised at least once = 6

Decision coverage = 42%

>> WARNING: DECISION COVERAGE INCOMPLETE
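The 42% figure follows from the two totals reported above, assuming the report truncates rather than rounds the percentage (6 of 14 outcomes is 42.86%):

```c
/* Decision coverage as a truncated percentage: outcomes exercised at
   least once over total decision outcomes. (100 * 6) / 14 = 42. */
static int decision_coverage_pct(int exercised, int total_outcomes) {
    return (100 * exercised) / total_outcomes;  /* integer truncation */
}
```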

Stand-Alone Coverage Analysis

• In stand-alone mode, coverage analysis can be used to provide coverage reporting for units, modules or complete application programs.

Page 34: 12 Automated Testing Tools

Static Analysis Results

• Static analysis results can be accessed in a number of ways:

 as part of the instrumenter list file
 as a comma separated value (.CSV) file, which can be exported to spreadsheet or database packages
 from within the TH test script, where they may be checked

• An example of each of these formats is shown next.

Page 35: 12 Automated Testing Tools

Static Analysis Measures for "control.c a03m00"

SOURCE_LINES                     557

CODE_LINES                       198

COMMENT_LINES                    353

BLANK_LINES                      45

EXPRESSION_STATEMENTS            53

FOR_LOOP_STATEMENTS              3

WHILE_LOOP_STATEMENTS            0

DO_LOOP_STATEMENTS               0

IF_STATEMENTS                    13

SWITCH_STATEMENTS                1

RETURN_STATEMENTS                1

GOTO_STATEMENTS                  0

STATEMENTS                       74

DECLARATIONS                     13

COMMENTS                         215

Page 36: 12 Automated Testing Tools

.

.

MAXIMUM_NESTING_LEVEL            4

AVERAGE_NESTING_LEVEL            0.58

.

.

.

MCCABE                           19

ESSENTIAL_MCCABE                 1

MYERS_MCCABE_LOWER               19

MYERS_MCCABE_UPPER               21

HANSEN_CYCLOMATIC_NUM            18

HANSEN_OPERATOR_COUNT            98

HARRISON_SCOPE_RATIO             0.45

HALSTEAD_NUM_UNIQUE_OPRS         21

HALSTEAD_TOTAL_NUM_OPERATORS     194

Page 37: 12 Automated Testing Tools

HALSTEAD_NUM_UNIQUE_OPERANDS     75

HALSTEAD_TOTAL_NUM_OPERANDS      221

.

.

.

CLASSES                          0

NEW                              0

DELETE                           0

THROW                            0

TRY_CATCH                        0

ANONYMOUS_UNIONS                 0

PARAMETERS                       2

UNUSED_PARAMETERS                0

AUTOMATICS                       13

STATICS                          0

UNUSED_DATA                      0

LOCAL_TYPES                      0

Page 38: 12 Automated Testing Tools

UNUSED_LOCAL_TYPES               0

DATA ANALYSIS

-------------------------------------------------

Definitions and Declarations Outside of Any Function

-------------------------------------------------

Name                     Flags              No. of References

-----              -----------------

a03m01                   function           9

a03m02                   function           5

a03m03                   function           2

.

.

.

Page 39: 12 Automated Testing Tools

ct02_dirsep_ca           extern             1

ct02_hdrftr_pca          extern             6

ct02_notmdte_ca          extern             0 UNUSED

.

.

Definitions and Declarations Within Function : a03m00

--------------------------------------------

Name                     Flags                  No. of References

                 -----------------

cr_argv_pca              parameter              1

cr_numargc_n             parameter              0 UNUSED

vl_baselen_i                                    2

Page 40: 12 Automated Testing Tools

vl_error_b                                      10

vl_error_n                                      4

vl_exit_z                                       0 UNUSED

vl_fileloop_i                                   6

vl_format_pca                                   4 

============== Overall Preprocessor Measures For The File ============= 

TOTAL_MACROS                     1272

MACROS_TO_BE_EXPANDED            1270

MACROS_NOT_TO_BE_EXPANDED        2

SUBSTITUTED_MACROS               111

UNSUBSTITUTED_MACROS             0

DIRECT_INCLUDE_FILES             10

INDIRECT_INCLUDE_FILES           1

MAX_INCLUDE_NESTING              2

Page 41: 12 Automated Testing Tools

MAX_CONDITION_COMP_NESTING       3

==================== Overall Measures For The File ====================

FILE_SOURCE_LINES                767

FILE_CODE_LINES                  238

FILE_COMMENT_LINES               465

FILE_BLANK_LINES                 105

FILE_STATEMENTS                  74

FILE_DECLARATIONS                50

FILE_COMMENTS                    309

FILE_FUNCTIONS                   1

FILE_CHECKSUM                    846162015

FILE_CLASSES                     0

FUNCTION_CLASSES                 0

FILE_FRIENDS                     0

FUNCTIONS_IN_CLASSES             0