
Page 1: Software Verification / Testing

Software Verification / Testing

Rance Cleaveland

Department of Computer Science and
Fraunhofer Center for Experimental Software Engineering

University of Maryland

20 October 2011

©2011 University of Maryland

Page 2: Software Verification / Testing

This Talk

Some recent developments in software verification and testing

Page 3: Software Verification / Testing

Software Verification?

• Related to, but different from, IEEE definition
• Traditionally, in CS: formal methods
  – Given software, spec
  – Software = “code”
  – Spec = “requirement” = logical formula
  – Prove software meets spec
• (Informal verification often called “validation”.)

Page 4: Software Verification / Testing

Model Checking

• Verification = proof
• Model checking: automated proof! (see the sketch below)
  – Given software, spec
  – Model checker tries to build proof
• Ongoing research: applicability
  – Decidability
  – Scalability
• Embedded control applications!
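To make the “automated proof” idea concrete, here is a minimal sketch of explicit-state model checking over an invented toy transition system; real model checkers face enormous state spaces and use symbolic representations and reductions, but the core search looks like this:

```python
# Minimal explicit-state model checker sketch: breadth-first search
# over a toy transition system, checking that a "bad" predicate is
# unreachable. Everything here is invented for illustration.
from collections import deque

def check_invariant(initial, successors, is_bad):
    """Return a counterexample path to a bad state, or None if none is reachable."""
    frontier = deque([(initial, [initial])])
    visited = {initial}
    while frontier:
        state, path = frontier.popleft()
        if is_bad(state):
            return path                       # counterexample found
        for succ in successors(state):
            if succ not in visited:
                visited.add(succ)
                frontier.append((succ, path + [succ]))
    return None                               # exhaustive search: property proved

# Toy system: a counter mod 8 that must never reach 5.
print(check_invariant(0, lambda s: [(s + 1) % 8], lambda s: s == 5))
# -> [0, 1, 2, 3, 4, 5]
```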

Page 5: Software Verification / Testing

Software Testing

• Most often-used method for checking software correctness
  – Select tests
  – Run software on tests
  – Analyze results
• Traditionally
  – Manual, hence time-consuming, expensive
  – In control applications: hard to test software by itself

Page 6: Software Verification / Testing

Exciting Developments

• Combine
  – Formal specs
  – Testing
• To automate testing “scalably”
  – Model-based testing
  – Instrumentation-based verification
  – Requirements reconstruction

Page 7: Software Verification / Testing

Model-Based Testing

• Develop specs as executable models
  – Simulink
  – State machines
  – Etc.
• Use model to determine correct test response (see the sketch below)
  – Automates “results analysis”
  – Models, tests needed
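The oracle role of the model is the key point. A minimal sketch, assuming invented stand-in functions for the model and the code under test:

```python
# Model-based-testing sketch: the executable model is the test oracle.
# Both functions are invented stand-ins; the "implementation" carries a
# seeded bug so the comparison loop has something to report.

def model(x):
    """Executable spec: clamp a sensor value to the range [0, 100]."""
    return max(0, min(100, x))

def code(x):
    """Implementation under test, with a seeded bug in the upper clamp."""
    return max(0, min(99, x))

def run_tests(tests):
    """Run each test on code and model; report divergences."""
    return [(t, model(t), code(t)) for t in tests if model(t) != code(t)]

print(run_tests([-5, 0, 50, 100, 200]))   # [(100, 100, 99), (200, 100, 99)]
```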

Page 8: Software Verification / Testing

Model-Based Testing (cont.)

[Diagram: tests are run on both the code and the model, and the outputs are compared]

Page 9: Software Verification / Testing

Tests Can Be Generated from Models!

[Diagram: a test-generation tool derives tests from the model]

• Functionality provided by tools like Reactis® for Simulink / Stateflow
• Goal: automate test generation task by creating tests that cover model logic
• Reactis: guided simulation algorithm (a generic sketch follows)
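Reactis's guided-simulation algorithm is the tool's own; the sketch below only illustrates the general idea of coverage-guided test generation, on an invented toy model:

```python
# Generic sketch of coverage-guided test generation: randomly simulate
# a toy model and keep an input sequence only when it covers behavior
# not yet exercised. This is not Reactis's actual algorithm; everything
# here is invented for illustration.
import random

def step(state, inp):
    """Toy model: a mod-4 counter that advances only on input 1."""
    return (state + inp) % 4

def generate_tests(attempts=200, length=5, seed=0):
    rng = random.Random(seed)
    covered, suite = set(), []          # coverage target: visited states
    for _ in range(attempts):
        inputs = [rng.choice([0, 1]) for _ in range(length)]
        state, visited = 0, set()
        for inp in inputs:
            state = step(state, inp)
            visited.add(state)
        if not visited <= covered:      # keep only tests that add coverage
            covered |= visited
            suite.append(inputs)
    return suite, covered

suite, covered = generate_tests()
print(len(suite), sorted(covered))      # a few tests covering all 4 states
```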

Page 10: Software Verification / Testing

Applying Model-Based Testing

• Widespread in automotive, less so in aero / medical-device
  – Regulatory issues
  – Need for models
  – Modeling notations, support
• What about models?
  – Sometimes result of earlier design phases
  – Models as reusable testing infrastructure

Page 11: Software Verification / Testing

Challenges

• Technical
  – Algorithms for test generation
  – Modeling languages
• Procedural
  – Integration into existing QA processes
  – Regulatory considerations

Page 12: Software Verification / Testing

Instrumentation-Based Verification

• Model-based testing assumes model correct
• IBV: a way to check model correctness vis-à-vis requirements

[Diagram: requirements at the top, specification models below, design models at the bottom]

Page 13: Software Verification / Testing

Instrumentation-Based Verification: Requirements

• Verification needs formalized requirements
• IBV: formalize requirements as monitor models
• Example (sketched below): “If speed is < 30, cruise control must remain inactive”
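A monitor model is itself executable: at each step it observes the design's signals and reports whether the requirement holds. A minimal sketch of the speed example, with invented signal names:

```python
# Monitor-model sketch for the example requirement: at every step of a
# trace, speed < 30 implies cruise control inactive. The signal names
# (speed, cruise_active) are invented for illustration.

def cruise_monitor(step):
    """True iff this step of the trace satisfies the requirement."""
    return not (step["speed"] < 30 and step["cruise_active"])

trace = [
    {"speed": 25, "cruise_active": False},   # ok: slow, cruise off
    {"speed": 40, "cruise_active": True},    # ok: fast enough
    {"speed": 28, "cruise_active": True},    # violation
]
print([i for i, s in enumerate(trace) if not cruise_monitor(s)])   # [2]
```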

Page 14: Software Verification / Testing

Instrumentation-Based Verification: Checking Requirements

• Instrument design model with monitors
• Use coverage testing to check for monitor violations (see the sketch below)
• Tool: Reactis®
  – Product of Reactive Systems, Inc.
  – Separates instrumentation, design
  – More info: www.reactive-systems.com
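Conceptually, the check simulates the instrumented design over a test suite and records each step where a monitor fires. Reactis does this inside the tool; this standalone sketch, with a deliberately buggy toy design, just shows the shape of the computation:

```python
# Conceptual sketch of the IBV check: simulate the design model on each
# test while the monitor watches every step. All names and models here
# are invented for illustration.

def design_step(cruise_on, speed):
    """Toy design model: latches cruise on above 35 but never releases it (bug)."""
    cruise_on = cruise_on or speed > 35
    return cruise_on, {"speed": speed, "cruise_active": cruise_on}

def monitor(out):
    """Requirement: speed < 30 implies cruise control inactive."""
    return not (out["speed"] < 30 and out["cruise_active"])

def run_ibv(tests):
    """Return (test index, step index) for every monitor violation."""
    violations = []
    for t, speeds in enumerate(tests):
        cruise_on = False
        for i, speed in enumerate(speeds):
            cruise_on, out = design_step(cruise_on, speed)
            if not monitor(out):
                violations.append((t, i))
    return violations

print(run_ibv([[20, 40, 50, 25]]))   # [(0, 3)]: cruise still on at speed 25
```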

Page 15: Software Verification / Testing

Applying Instrumentation-Based Testing

• Robert Bosch production automotive application
  – Requirements: 300-page document
  – 10 subsystems formalized (20% of system)
  – 62 requirements formalized as monitor models
  – IBV applied
  – 11 requirements issues identified
• Another Bosch case study: product-line verification using IBV
• A number of other case studies

Page 16: Software Verification / Testing

Requirements Reconstruction

• The Requirements Reconstruction problem
  – Given: software
  – Produce: requirements
• Why?
  – System comprehension
  – Specification reconstruction
    • Missing / incomplete / out-of-date documentation
    • “Implicit requirements” (introduced by developers)

Page 17: Software Verification / Testing

Invariants as Requirements

• Some requirements given as invariants
  – “When the brake pedal is depressed, the cruise control must disengage”
• State machines can be viewed as invariants (see the sketch after this list)
  – States: values of variables
  – Transitions: invariants
  – “If the current state is A then the next state can be B”

• Another project with Robert Bosch
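A minimal sketch of the transitions-as-invariants view, with an invented state machine: each entry “A may be followed by B” is an invariant, and observed executions are checked against the relation:

```python
# Sketch of "transitions as invariants": the transition relation is a
# set of invariants of the form "state A may be followed by state B",
# and an observed execution is checked against it step by step.
# The states and relation below are invented for illustration.

ALLOWED = {
    "OFF":    {"OFF", "ARMED"},
    "ARMED":  {"ARMED", "ACTIVE", "OFF"},
    "ACTIVE": {"ACTIVE", "OFF"},
}

def first_violation(states):
    """Index of the first transition violating the relation, else None."""
    for i in range(len(states) - 1):
        if states[i + 1] not in ALLOWED[states[i]]:
            return i
    return None

print(first_violation(["OFF", "ARMED", "ACTIVE", "ARMED"]))
# -> 2: ACTIVE may not be followed directly by ARMED
```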

Page 18: Software Verification / Testing

Invariant Reconstruction

• Generate test data satisfying coverage criteria

• Use machine learning to propose invariants
• Check invariants using instrumentation-based verification (pipeline sketched below)
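Together the three steps form a pipeline. A sketch with toy stand-ins for every stage; in the actual work, stage 1 is coverage-driven test generation (e.g. with Reactis) and stage 3 is IBV:

```python
# End-to-end sketch of the reconstruction loop above, with invented
# toy stand-ins for every stage.

def generate_coverage_tests():
    """Stage 1: produce (input, output) pairs from coverage-driven tests."""
    return [(x, max(0, x)) for x in range(-3, 4)]

def mine_invariants(data):
    """Stage 2: propose candidate invariants learned from the data."""
    return {"y >= 0": lambda x, y: y >= 0,       # plausible candidate
            "y == x": lambda x, y: y == x}       # spurious candidate

def ibv_check(inv, data):
    """Stage 3: keep only candidates that survive checking on tests."""
    return all(inv(x, y) for x, y in data)

data = generate_coverage_tests()
survivors = [name for name, inv in mine_invariants(data).items()
             if ibv_check(inv, data)]
print(survivors)   # ['y >= 0'] -- the spurious candidate is discarded
```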

Page 19: Software Verification / Testing

Machine Learning: Association Rule Mining

• Tools for inferring relationships among variables based on time-series data
  – Input: table
  – Output: relationships (“association rules”), e.g. 0 ≤ x ≤ 3 → y ≥ 0 (see the sketch below)
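A minimal sketch of what such a tool computes, using the slide's example rule over an invented table:

```python
# Minimal sketch of what an association-rule miner computes: support
# and confidence for a candidate rule over a table of variable values.
# Real miners (Apriori-style algorithms) also enumerate the candidate
# rules automatically; here the slide's example rule is hard-coded.

rows = [{"x": 0, "y": 1}, {"x": 2, "y": 3}, {"x": 3, "y": 0},
        {"x": 5, "y": -2}, {"x": 1, "y": 4}]

def antecedent(r):            # "0 <= x <= 3"
    return 0 <= r["x"] <= 3

def consequent(r):            # "y >= 0"
    return r["y"] >= 0

matches = [r for r in rows if antecedent(r)]
support = len(matches) / len(rows)
confidence = sum(consequent(r) for r in matches) / len(matches)
print(support, confidence)    # 0.8 1.0: the rule holds on this table
```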

Page 20: Software Verification / Testing

Association Rules and Invariant Reconstruction

• General idea
  – Treat tests (I/O sequences) as data
  – Use machine learning to infer relationships between inputs, outputs
• Our insight
  – Ensure test cases satisfy coverage criteria to ensure “thoroughness”
  – Use IBV to double-check proposed relationships

Page 21: Software Verification / Testing

Pilot Study: Production Automotive Application

• Artifacts
  – Simulink model (ca. 75 blocks)
  – Requirements formulated as state machine
  – Requirements correspond to 42 invariants defining transition relation
• Goal: compare our approach with random testing [Raz]
  – Completeness (% of the 42 detected?)
  – Accuracy (% false positives?)

Page 22: Software Verification / Testing

Experimental Results

• Hypothesis: coverage testing yields better invariants than random testing
• Coverage results:
  – 95% of inferred invariants true
  – 97% of requirements inferred
  – Two missing requirements detected
• Random results:
  – 55% of inferred invariants true
  – 40% of requirements inferred
• Hypothesis confirmed

Page 23: Software Verification / Testing

Summary

• Intersection of formal methods, testing can yield practical verification approaches
  – Model-based testing
  – Instrumentation-based verification

• Automated test generation can be used to infer invariants
