
Model Checking Techniques for Test Generation from Business Process Models

Didier Buchs, Levi Lucio, and Ang Chen

Software Modeling and Verification Laboratory, University of Geneva,

route de Drize 7, CH-1227 Carouge, Switzerland
{didier.buchs,levi.lucio,ang.chen}@unige.ch

http://smv.unige.ch

Abstract. We will present a methodology and a tool to generate test cases from a model expressed in Business Process models and a set of test intentions for choosing a particular kind of tests. In order to do this we transform the Business Process models into an intermediate format called Algebraic Petri Nets. We then use model checking techniques (e.g. Decision Diagrams) to encode the state space — the semantics — of the model and produce test cases, including their oracles, according to that transition system.

Keywords: System design and verification, Higher-level Nets, Algebraic Petri Nets, State Space Generation, Decision Diagrams.

1 Introduction

Model-Driven Engineering (MDE) is currently a very promising technique which aims at increasing the reliability of software while lowering the development time and associated costs. The approach consists of building models (abstractions) of the system under construction and automatically or semi-automatically refining those models until a program executable by a machine is reached. Some of the main advantages of such approaches are: models are in general platform-independent and simpler to maintain than programs; their properties can be proved on those models, ensuring — to a certain extent — that the final implementation will also have those properties; test sets can be generated at the level of the model, thus reducing the complexity of producing oracles for the test cases; the language in which the model is written can be aimed at describing certain kinds of realities — in other words, the models can be written in Domain Specific Languages (DSL).

In this paper we will describe an approach to generating test cases from models for business processes. The business process models are subsequently implemented or deployed into real systems (e.g. information systems) to which we can apply our test cases. In order to build the oracles for the test cases we will use techniques pertaining to model checking. In model checking an exhaustive state space of the model is built and certain properties can be proved by examining the states in that state space. Currently very efficient model checkers exist — e.g.

F. Kordon and Y. Kermarrec (Eds.): Ada-Europe 2009, LNCS 5570, pp. 59–74, 2009.
© Springer-Verlag Berlin Heidelberg 2009


NuSMV [1] — allowing encoding state spaces of models of up to tens of millions of states and efficiently verifying properties expressed in temporal logics. Compact representation techniques for large state spaces based on BDD (Binary Decision Diagrams) are used to achieve such performance. We use similar techniques in order to efficiently build oracles for our test cases.

The paper is organized in the following fashion: section 2 provides the methodological framework for our approach and introduces the concepts of Model-Driven Engineering we will be using throughout the paper. In section 3 we introduce the language used to build business process models and the running example (a credit approval process) we will use throughout the paper. We also describe Model Based Testing (MBT) from the point of view of generating test cases for business processes. Section 4 introduces and describes the technique we have developed to transform business process models into Algebraic Petri Nets (APN) — an extension of Petri Nets with Algebraic Abstract Data Types. In fact we use Algebraic Petri Nets as the intermediate format to produce the necessary state space. We then go on to introduce the semantics of Algebraic Petri Nets in section 5. In section 6 we introduce some test intentions for the running example. Test intentions are specifications of the test cases we wish to produce for the credit approval process. Finally, section 7 introduces the model checking techniques we have used to build the state space and to generate test cases with their respective oracles. In order to do that we have used both the test intentions and the state space obtained from the model.

2 Model-Driven Engineering

Scientists have tried for decades to find repeatable, predictable processes or methodologies that improve the productivity and quality of software. Some try to systematize or formalize the tasks of developing software, others apply project management techniques to the development process. A software engineering process is composed of many activities, notably the following: requirement analysis, specification, verification, system design (architecture), and testing. In particular, safety-critical software systems are often carefully specified (verification) prior to application development and thoroughly tested after implementation.

MDE is a software development methodology which focuses on creating models, or abstractions, closer to particular domain concepts than to computing or algorithmic concepts. One objective of MDE is to apply model transformation and composition automatically via supporting tools. In this context, "models" are rigorously defined specifications. Some activities or steps of the development process can then be automated or semi-automated.

For example, instead of manually writing code, tools can generate executable code from specifications; on the other hand, tests can be automatically generated from specifications to validate the code written by programmers; verification of properties (model checking) can also be applied automatically on the models. Figure 1 shows the activities and artifacts involved in MDE; activities such as test generation, testing, properties validation, and implementation are supposed to be fully automated or automated with human assistance.


Fig. 1. Activities and Artifacts in Model-Driven Engineering

Depending on the level of automation of the implementation task in figure 1, the testing activity may be necessary for different purposes. In the cases where an implementation is obtained with strong human assistance, a large set of test cases is justified in the sense that the specification may have been misinterpreted, and a test set derived from the specification will provide a way of detecting those discrepancies. A more automated code generation will in principle reduce the need for strong functional testing, although the correctness of the implementation with regard to the specification assumes the test generator itself is well implemented. When code generation is fully automatic it may still be necessary to use automatically generated test sets for regression testing as additional functionalities or hardware are introduced.

3 MBT from BP Models

The notation we use for business process modeling is the Business Process Modeling Notation (BPMN) — a graphical representation for specifying business processes. Developed by the Business Process Management Initiative (BPMI), it is currently an OMG standard for business process modeling. An improvement of BPMN since its initial version, and in comparison with similar approaches, is that the semantics of its control structures, i.e. activities, flows and gateways, is given by Petri Nets. Details on BPMN can be found in OMG's BPMN specification [2].

Figure 2 shows a simple BPMN process for credit approval. It has 4 pools, each of which represents a process participant: client, credit approval, approver, and


assessor. The pool client is the user interface which represents a client in this process; the pool credit approval contains the main processing we are modeling; approver and assessor are entities which provide services. Moreover, 3 types of messages are used by this process: request, risk, and approval. Each message has simple attributes related to the identification of the process, i.e. the name of the credit demander, and they can be represented using tuples with the following signatures:

– request: (name: String, amount: Integer), e.g. ('John', 1000)
– risk: (name: String, risk: String), e.g. ('Peter', 'high')
– approval: (name: String, result: String), e.g. ('John', 'approved'), ('Peter', 'refused')

A process instance starts when a client submits a request via the request form and ends when the client receives an approval message. Basically, this process tries to automatically approve credit requests with a low amount (less than 10000) and low risk. The evaluation of risk is delegated to the assessor, which returns the risk of the credit demander. If the request's amount is high (equal to or more than 10000) or its risk is high, the approval is delegated to the approver, which implies a human approval task.
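The decision logic just described can be sketched as follows. This is a minimal illustration only; the function and callback names (approve_credit, assess_risk, human_approve) are hypothetical, not part of the BPMN model.

```python
# Hypothetical sketch of the approval rules described above; the names
# approve_credit, assess_risk and human_approve are illustrative only.
def approve_credit(name, amount, assess_risk, human_approve):
    """Decide a credit request following the rules of the process above."""
    if amount >= 10000:
        return human_approve(name, amount)    # high amount: human decision
    risk = assess_risk(name)                  # delegated to the assessor
    if risk == 'high':
        return human_approve(name, amount)    # low amount but high risk
    return 'approved'                         # low amount and low risk

# A low amount with low risk is approved automatically:
verdict = approve_credit('John', 1000,
                         assess_risk=lambda name: 'low',
                         human_approve=lambda name, amount: 'refused')
```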

In the context of MDE, BPMN is the central artifact upon which the automation tools can work. By providing deployment information, the BPMN process in figure 2 can be transformed into executable BPEL specifications. Our current work focuses on automating the activities of test generation and property

Fig. 2. BPMN Example for Credit Approval


Fig. 3. Model Based Testing of the Credit Approval Implementation

validation with BPMN models. In particular, in this paper we focus on automated test generation.

The two current main trends in testing are white-box and black-box. In white-box testing the actual code for the system under test (SUT) is analyzed and tests are produced from an observation of that code. On the other hand, the assumption behind black-box testing is that the state of the SUT is hidden and that its behavior is only observable by providing its inputs and observing its outputs. In this case an auxiliary — intellectual or formalized — specification of the SUT is necessary to produce the tests and decide their verdict (pass, meaning no error was found, or fail, meaning at least one error was found). When the model of the SUT is formalized we have a specialized kind of black-box testing called Model-Based Testing (MBT). In reality MBT corresponds to a comparison of two artifacts (the specification and the SUT) via a third artifact called a test set.

In order to apply MBT techniques to models expressed in BPMN we have chosen to isolate each pool as an individual specification from which we wish to extract test cases. This choice naturally fits the paradigm of MBT as a pool is a process which exchanges messages (inputs/outputs) with other pools. The test cases extracted from a particular pool can then be applied to an implementation of that pool which is seen as a black box. This principle can be observed in figure 3.

A possible test case extracted from the credit approval swimlane specification in figure 2 is:

<receiveRequestIn('jo',1000)> <invokeAcessorOut('jo',1000)>
<invokeAcessorIn('jo',high)> <invokeApproverOut('jo',1000)>
<invokeApproverIn('jo',approved)> <replyOut('jo',approved)>


The formalism we use to describe test cases consists of sequences of events which can be either inputs or outputs. The test case described above consists of a sequence of six events: firstly, client 'jo' sends a message asking for the approval of a loan of 1000 (the parameter (jo,1000) corresponds to the request message). Because the quantity of money asked for is inferior to 10000 (see figure 2), the automatic assessor is sent a message to decide whether there is a risk associated to giving the loan. The assessor replies that the risk associated to lending 1000 to 'jo' is high, which means a request message has to be sent to the human approver asking for a decision. Finally the human approver sends a message stating the loan can be granted and the client is sent a message saying the loan has been approved.

Note that the test case we have presented corresponds to an expected behavior of the SUT. When automatically generating test cases it is necessary to understand whether or not a sequence of inputs/outputs corresponds to an expected behavior. In order to automatically make this decision we need an operational version of the specification against which the generated test cases can be tried. In the context of MBT this operational version is called an oracle.
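The idea of an oracle can be sketched as a replay of event sequences over an operational model: an event sequence is an expected behavior if and only if it is a trace of that model. The deterministic transition table below is a hypothetical fragment written for illustration, not the model actually produced by our transformation.

```python
# Hypothetical oracle sketch: the specification is a (deterministic)
# labelled transition system; a test case passes the oracle check iff
# it is a trace of that system. The transition table is illustrative.
SPEC = {
    ('s0', "receiveRequestIn('jo',1000)"): 's1',
    ('s1', "invokeAcessorOut('jo',1000)"): 's2',
    ('s2', "invokeAcessorIn('jo',high)"): 's3',
    ('s3', "invokeApproverOut('jo',1000)"): 's4',
    ('s4', "invokeApproverIn('jo',approved)"): 's5',
    ('s5', "replyOut('jo',approved)"): 's6',
}

def oracle_accepts(events, start='s0', spec=SPEC):
    """Replay the events on the specification; reject at the first mismatch."""
    state = start
    for e in events:
        if (state, e) not in spec:
            return False          # unexpected event: invalid behavior
        state = spec[(state, e)]
    return True

valid = ["receiveRequestIn('jo',1000)", "invokeAcessorOut('jo',1000)"]
```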

4 Building an Oracle

We have built an automatic procedure to extract oracles for our test cases from BPMN specifications. The procedure corresponds to the transformation of BPMN specifications into Algebraic Petri Nets. The transformation itself is similar to the approaches described in [3] and we present the set of rules we have devised for the transformation in [4]. From a technical point of view we use standard model transformation techniques, in particular the ATL transformation language [5]. ATL allows writing rules at the level of the elements of the meta-models of BPMN and APN. Those rules then act over instantiations of the meta-model of BPMN and transform them into instantiations of the meta-model of APN.

Fig. 4. Result of Transformation in APN


In figure 4 we partially present the result of transforming the Credit Approval swimlane of figure 2 into an APN. In this example we use modularized APN, i.e. APN encapsulated in a rectangle, with black boxes on the edges representing incoming events and white boxes representing outgoing events. An event is parametrized by the type of message it sends or receives. Internal transitions are invisible from outside of the APN module, except when they are synchronized with observable events. Synchronization between an event and a transition is represented as a dashed arrow.

In the following parts of this paper, we apply our test generation approach to the transformed encapsulated Algebraic Petri Nets.

5 Algebraic Petri Nets

Algebraic Petri Nets are an evolution of P/T Petri nets where tokens belong to domains defined by algebraic specifications. Although not very different from other extensions of Petri nets, APN have several significant advantages:

– the possibility to define any data structure;
– an abstract level of axiomatisation;
– a formal notation allowing to reason about data types and their usage.

An APN definition is split into two parts: a Petri Net with places holding typed tokens; a set of ADT (Abstract Algebraic Data Types) representing data.

5.1 Algebraic Abstract Data Type

Algebraic Abstract Data Types (AADT, or ADT for simplicity) provide a mathematical way to define properties of data types. ADT modules define data types by means of algebraic specifications. Each module describes one or more sorts, along with generators and operations on these sorts. The properties of the operations are given in the body section of the modules, by means of positive conditional equational axioms.

An algebraic abstract data type is then composed of a signature Σ = 〈S, OP〉 where S is the set of sorts and OP the set of operations with their arity. In general, there are two types of ADTs:

– Primitive data ADTs: boolean, natural, and integer are simple data types.
– Container ADTs: set, list, pair, bag, etc. They represent structured data and allow constructing complex data types and operations. The contained types can be data ADTs or container ADTs.

Moreover, the properties are defined through equations (Ax) based on the terms defined by the signature (TΣ) and, with variables (X), the terms with variables (TΣ,X). A specification is given by Spec = 〈Σ, Ax, X〉. It is theoretically possible to model any data type with this algebraic approach (under the hypothesis that only first-order axiomatisation is necessary, which is not the case for real numbers). There are no predefined data types in ADT and all used


types should be defined by ADT modules. However, we provide a library of ADTs as part of the COOPNBuilder [6]/ALPINA framework which includes the most commonly used data types: boolean, natural, string, pair, list, etc. These types are fully axiomatised, allowing to infer properties of models for verification purposes.

A calculus can be defined through rewriting techniques, Rew : TΣ → TΣ, which provides a normal form that can be used for deciding equality between terms (the eval function evaluates terms into their semantic domains), i.e. ∀t, t′ ∈ TΣ, Rew(t) = Rew(t′) ⇒ eval(t) = eval(t′).
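This use of rewriting can be sketched on a toy ADT of Peano naturals with addition, assuming the usual axioms plus(zero, y) = y and plus(succ(x), y) = succ(plus(x, y)); the encoding of terms as nested tuples is purely illustrative.

```python
# Toy sketch: deciding term equality by rewriting to normal form.
# Terms are nested tuples; the ADT and its two axioms are assumptions
# made for illustration, not the paper's actual library.
ZERO = ('zero',)

def succ(t):
    return ('succ', t)

def plus(a, b):
    return ('plus', a, b)

def rew(t):
    """Rewrite a ground term to its normal form (a Peano numeral)."""
    if t[0] == 'zero':
        return t
    if t[0] == 'succ':
        return ('succ', rew(t[1]))
    a, b = rew(t[1]), rew(t[2])           # t = plus(a, b)
    if a[0] == 'zero':
        return b                          # plus(zero, y) -> y
    return ('succ', rew(plus(a[1], b)))   # plus(succ(x), y) -> succ(plus(x, y))

# Rew(t1) = Rew(t2) lets us decide that 1 + 1 and 2 + 0 denote the same value:
t1 = plus(succ(ZERO), succ(ZERO))
t2 = plus(succ(succ(ZERO)), ZERO)
```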

5.2 APN

Components are described by modular Algebraic Petri Nets with particular parameterized transitions which are defined by so-called behavioral axioms, similar to the axioms in an ADT. Encapsulated Petri nets possess input and output ports, allowing developers to model the components of a system. Inputs and outputs represent the provided services (input events) and the required services (output events) of the module, respectively.

An APN spec is noted apn-spec = 〈Spec, P, In, Out, Beh, X〉 where Spec is the algebraic specification, P the set of places (with τ as the typing function), In the input ports and Out the output ports, Beh the behavioural axioms (detailed below) and X some variables. The (behavior) axioms have the following structure:

Cond ⇒ event :: pre → post

In which each one of the terms has the following meaning:

– Cond is a set of equational conditions, similar to a guard;
– event is the name of an input or output or local silent transition λ with algebraic term parameters (only this part of the axiom is mandatory); Events_{In,Out,Σ} = In_Σ ∪ Out_Σ ∪ {λ} and:
  • In_Σ = {in(t1, ..., tn) | in ∈ In_{s1,...,sn} ∧ ti ∈ (TΣ,X)_{si}, i ∈ 1, ..., n}
  • Out_Σ = {out(t1, ..., tn) | out ∈ Out_{s1,...,sn} ∧ ti ∈ (TΣ,X)_{si}, i ∈ 1, ..., n}
– pre and post are typical Petri net flow relations (indexed by P) determining what is consumed and what is produced in the net's places.

5.3 Semantics of APN

Knowing the semantics of an ADT, we can build the semantics of an APN. The semantics of an APN is given by a transition system where the labels represent what is visible from the outside, i.e. the events. The states are represented by a place-indexed set of multisets of tokens expressed by algebraic values of the ADTs:

M : P-indexed family of Dom(τ(p)), with operations such as marking union, marking difference and comparisons.


Given an axiom Cond ⇒ event :: pre → post, markings m, m′ ∈ M and event ∈ Events_{In,Out,Σ}, the transitions induced by that behavioral property are:

∀σ¹, if ⊨σ Cond and eval(pre)σ ⊆ m, then m —eval(event)σ→ m − eval(pre)σ + eval(post)σ ²

The resulting construction is the transition system built from the initial marking by applying this firing rule until reaching a fixpoint: TSapn(m0).
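The firing rule can be sketched operationally for a single, already substituted (ground) behavioural axiom. Markings are place-indexed multisets, here modelled with Python's Counter; the place and event names below are illustrative, not taken from the actual model.

```python
# Minimal operational sketch of the firing rule above for one ground
# behavioural axiom; places, tokens and the event name are illustrative.
from collections import Counter

def included(pre, m):
    """eval(pre)σ ⊆ m : every token required by `pre` is present in `m`."""
    return all(m.get(p, Counter())[tok] >= n
               for p, toks in pre.items() for tok, n in toks.items())

def fire(m, behaviour):
    """If Cond holds and pre ⊆ m, return (event, m - pre + post)."""
    event, pre, cond, post = behaviour
    if not cond or not included(pre, m):
        return None                      # the axiom does not apply in m
    m2 = {p: Counter(c) for p, c in m.items()}
    for p, toks in pre.items():
        m2[p] = m2.get(p, Counter()) - toks      # consume pre
    for p, toks in post.items():
        m2[p] = m2.get(p, Counter()) + toks      # produce post
    return event, m2

m0 = {'request': Counter({('jo', 1000): 1}), 'pending': Counter()}
beh = ("receiveRequestIn('jo',1000)",
       {'request': Counter({('jo', 1000): 1})},  # pre
       True,                                     # Cond, already evaluated
       {'pending': Counter({('jo', 1000): 1})})  # post
event, m1 = fire(m0, beh)
```

Iterating this rule over all axioms and all reachable markings until no new marking appears yields the fixpoint construction of TSapn(m0).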

6 Test Intentions

The formalism we have used to represent test cases is HML (Hennessy-Milner Logic). HML is a simple branching-time logic that includes the sequence, not and and operators. Test intentions are written as HML formulas with constrained variables. The domains of those variables correspond to the three abstract dimensions of a test case we stated previously, namely: the shape of the execution paths; the kind of input/output pairs inside a path; and the parameters of the inputs and outputs. The constraints over the variables allow shaping the test intention.
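The HML connectives used here (true, negation, conjunction and the event modality 〈e〉f) can be sketched together with their satisfaction relation over a transition system. The tiny transition system and its event names below are assumptions made for illustration.

```python
# Sketch of HML formulas and their satisfaction over a transition system:
# T (true), negation, conjunction, and the diamond <e>f meaning "some
# e-labelled transition leads to a state satisfying f". The transition
# system and its events are illustrative.
T = ('T',)

def neg(f):
    return ('not', f)

def conj(f, g):
    return ('and', f, g)

def dia(e, f):
    return ('dia', e, f)    # <e> f

def sat(ts, state, f):
    """ts maps each state to a list of (event, successor) pairs."""
    if f[0] == 'T':
        return True
    if f[0] == 'not':
        return not sat(ts, state, f[1])
    if f[0] == 'and':
        return sat(ts, state, f[1]) and sat(ts, state, f[2])
    # diamond: some outgoing transition with the right label satisfies f[2]
    return any(e == f[1] and sat(ts, s2, f[2]) for e, s2 in ts.get(state, []))

ts = {'s0': [('req', 's1')], 's1': [('reply', 's2')]}
formula = dia('req', dia('reply', T))    # a test case as a chain of events
```

A test case, as a sequence of events, is then just a nesting of diamonds ending in T.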

Interface
  SixEvents
Axioms
  (nbEvents(f) <= 6) => f in SixEvents;
Variables
  f : primitiveHML

Fig. 5. Test Intentions for the Credit Approval Process

In the very simple test intention named SixEvents in figure 5, f is a variable having as domain the execution paths of the Credit Approval process. The nbEvents function allows measuring the number of events (inputs or outputs) contained in an execution path — thus the 'nbEvents(f) <= 6' condition reduces the domain of f to execution paths including no more than six events. The idea behind this test intention — admittedly not very sophisticated — is thus to generate tests that cover at most six interactions (inputs or outputs) with the SUT. Possible test cases produced by this test intention are depicted in figure 6.

Notice that the last test is annotated with the False logic value. This is because that particular sequence of events is an invalid behavior according to the specification — the domain of variable f includes any path that can be generated from the signature of the Credit Approval process, independently of whether it corresponds to a valid or invalid behavior of the SUT.
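A brute-force sketch of such an intention: enumerate every event sequence of at most six events over the model's signature, and annotate each one with true or false according to whether the transition system can execute it. The miniature transition system and its two-event alphabet are assumptions for illustration.

```python
# Illustrative sketch of the SixEvents intention over a toy transition
# system: sequences of at most `max_events` events are paired with a
# true/false annotation (valid vs invalid behavior).
from itertools import product

def runs(ts, state, events):
    """Can the transition system execute this event sequence from `state`?"""
    if not events:
        return True
    return any(e == events[0] and runs(ts, s2, events[1:])
               for e, s2 in ts.get(state, []))

def intention(ts, alphabet, start, max_events=6):
    """Yield (sequence, verdict) for every sequence of at most max_events."""
    for n in range(1, max_events + 1):
        for seq in product(alphabet, repeat=n):
            yield list(seq), runs(ts, start, list(seq))

ts = {'s0': [('req', 's1')], 's1': [('reply', 's2')]}
tests = {tuple(s): v for s, v in intention(ts, ['req', 'reply'], 's0')}
```

This enumeration is of course exponential in the number of events; the decision-diagram techniques of section 7 exist precisely to avoid such explicit enumeration.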

¹ σ is a term substitution.
² eval is the evaluation morphism extended to APN structures, such as multisets and parameterized events, from the algebraic morphism eval.


<receiveRequestIn('maria',20000)> <invokeAcessorOut('maria',20000)>
<invokeApproverIn('maria',refused)> <replyOut('maria',refused)>, true

<receiveRequestIn('john',1000)> <invokeAcessorOut('john',1000)>
<invokeAcessorIn('john',high)> <invokeApproverOut('john',1000)>
<invokeApproverIn('john',approved)> <replyOut('john',approved)>, true

<invokeAcessorIn('john',low)> <invokeApproverOut('john',1000)>, false

Fig. 6. Simple Test Cases for Credit Approval Process

Summarizing, a test intention is formally written as a set of partially instantiated HML formulas where the variables present in those formulas are by default universally quantified. All the combined instantiations of the variables will produce a (possibly infinite) number of test cases.

Each test intention may be given by several rules, each rule having the form: [ condition => ] inclusion. In the condition part of the rule it is possible to define constraints over the variables present in the HML formulas that make up the test. The inclusion part is comprised of the HML formulas with variables and a name for the test intention defined by that rule. Although for space reasons we cannot include in this paper the definition of the constraint language for the variable types of SATEL (execution paths, input/output pairs, input/output parameters), the full description can be found in [7].

SATEL (Semi-Automatic Testing Language) allows guiding test generation in a more precise fashion than the "brute force" approach of the SixEvents test intention.

In the test intentions in figure 7, the oneLoanCycle test intention generates test cases that exercise one cycle of the credit approval process. The shape of the test intention states that tests generated from it start with the event receiveRequestIn('john', amount) and end with the replyOut(approval)

Interface
  oneLoanCycle
  multipleLoanCycle
  LoadTestLoanCycle
Axioms
  uniformity(amount), nbEvents(path), (sequence(path) <= 4) =>
    receiveRequestIn('john',amount) . path . replyOut(approval)
      in oneCreditCycle;
  path in oneLoanCycle, bigPath in multipleCreditCycle =>
    path . bigPath in multipleLoanCycle;
  path in multipleLoanCycle, (nbEvents(path)/6 <= 3) =>
    path in LoadCreditCycle;
Variables:
  path : primitiveHML;
  bigPath : primitiveHML;
  amount : Integer;
  approval : Boolean;

Fig. 7. Load Test Test Intentions


<receiveRequestIn('john',12000)> <invokeApproverOut('maria',20000)>
<invokeApproverIn('john',refused)> <replyOut('john',refused)>, true

Fig. 8. Guided Test Cases for Credit Approval Process

event, where amount and approval are variables. The path variable represents an HML formula itself, which is a sequence of events of size inferior to 4 as stated in the conditions. On the other hand, the uniformity predicate ensures that only one value is chosen for variable amount, and the variable approval is left unbound. Test sets generated by this test intention include only one test case. A possible test set generated from this test intention is depicted in figure 8. The multipleLoanCycle intention is defined recursively and allows building test cases which are repetitions of one cycle of the credit approval process as defined in the oneCreditCycle test intention. Finally, the LoadCreditCycle intention generates test sets which include up to three repetitions of the credit approval process — as a load test for the SUT. Note that in order to generate the test sets there are two main problems to solve: expanding the axioms which are recursive, and instantiating the variables within those axioms according to the stated conditions. This procedure is called unfolding.
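The unfolding of a recursively defined intention such as multipleLoanCycle, bounded as in LoadCreditCycle by at most three repetitions, can be sketched as follows. The concrete one-cycle path used below is a placeholder, not a path actually drawn from oneLoanCycle.

```python
# Sketch of unfolding: multipleLoanCycle is "one cycle" or "one cycle
# followed by more cycles"; the bound plays the role of LoadCreditCycle's
# limit of three repetitions. The one-cycle path is a placeholder.
def unfold(one_cycle, max_cycles):
    """Expand the recursion: one cycle, two cycles, ... up to max_cycles."""
    tests, path = [], []
    for _ in range(max_cycles):
        path = path + one_cycle          # the `path . bigPath` concatenation
        tests.append(list(path))
    return tests

one_cycle = ["receiveRequestIn('john',1000)", "replyOut(approved)"]
load_tests = unfold(one_cycle, max_cycles=3)
```

Variable instantiation (e.g. picking one value for amount under the uniformity condition) would then be applied to each unfolded path.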

More formally, let apn-spec = 〈Spec, P, In, Out, Beh, X〉 be an algebraic Petri net. A test intention module for apn-spec is a triple 〈I, Λ, X〉 where:

– I is a set of test intention names. In the example of figure 7, I = {oneLoanCycle, multipleLoanCycle, LoadTestLoanCycle};
– Λ is a set of test intention axioms, where a test intention axiom is a triple 〈cond, pat, i〉, also written cond ⇒ pat ∈ i. In the topmost axiom of figure 7 we have:
  • cond = {uniformity(amount), nbEvents(path) <= 4}. Conditions can be of several types, notably equalities 〈l = r〉, inclusion of other test intentions f ∈ i, and uniformity over variables unif(v);
  • pat = {receiveRequestIn('john', amount) . path . replyOut(approval)}. Patterns correspond to concatenations of HML formulas where events are inputs or outputs of the SUT. These HML formulas include variables at the level of parameters, inputs or outputs, or the formulas themselves. The abstract syntax of the formulas (without variables) is T ∈ Hml, (¬f) ∈ Hml, (f ∧ g) ∈ Hml or 〈e〉f ∈ Hml, where f and g are HML formulas and e ∈ Events_{In,Out,Σ}. The abstract syntax of patterns is f ∈ Pat, p . p′ ∈ Pat, where f ∈ Hml and p, p′ ∈ Pat;
  • i = {oneLoanCycle};
– X is a set of variables used in the test intention axioms. In the example of figure 7, X = {path, bigPath, amount, approval}.

7 Operational Techniques

In this section we will define the model checking techniques that will be used for representing the transition system of the specification, the test space expressed


in the test intentions and for extracting valid test cases according to the testspace.

Data Decision Diagrams (DDD [8]) and Set Decision Diagrams (SDD [9]) are both evolutions of the well-known Binary Decision Diagrams (BDD) [10]. While a BDD is often seen as representing a Boolean function, it can also be seen as a set of sequences of assignments of Boolean values to variables. DDD (resp. SDD) are similar for assignments of any kind of values (resp. sets of values) of the form (var1 := val1).(var2 := val2)...(varn := valn). Val will designate the possible values and Var the variable names.
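This view of a decision diagram as a shared representation of a set of assignment sequences can be sketched with a toy hash-consed structure (it is not the actual DDD/SDD library): nodes are canonicalized in a unique table, so equal sets get equal representations and comparison is cheap, and union is a recursive merge of children. The sketch assumes one fixed variable ordering and sequences of equal length.

```python
# Toy hash-consed sketch of the "set of assignment sequences" view of
# decision diagrams; illustrative only, not the real DDD/SDD library.
_table = {}
ONE = ('one',)                 # terminal accepting the empty sequence

def node(var, children):
    """Canonical node; `children` maps a value to a successor diagram."""
    key = (var, tuple(sorted(children.items())))
    return _table.setdefault(key, key)

def encode(seq):
    """Encode a single assignment sequence [(var, val), ...] as a diagram."""
    d = ONE
    for var, val in reversed(seq):
        d = node(var, {val: d})
    return d

def union(d1, d2):
    """Set union of two diagrams over the same variable ordering."""
    if d1 == d2:               # canonicity: equal sets, equal representation
        return d1
    var, c1, c2 = d1[0], dict(d1[1]), dict(d2[1])
    merged = {v: (union(c1[v], c2[v]) if v in c1 and v in c2
                  else c1.get(v, c2.get(v)))
              for v in set(c1) | set(c2)}
    return node(var, merged)

a = encode([('x', 1), ('y', 2)])
b = encode([('x', 1), ('y', 3)])
c = union(a, b)                # {x:=1.y:=2, x:=1.y:=3}, sharing the x node
```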

7.1 Abstract Definition of Decision Diagrams

A decision diagram (DD) is a structure DD with the properties of a set and:

– internal operations (DD × DD → DD) such as ∪DD, ∩DD, \DD, and one constant ∅DD;
– encoding and decoding operations enDD : SeqA → DD, decDD : DD → ⋃ SeqA;
– specific internal operations (DD → DD) such as inductive homomorphisms homDD;
– all operations are homomorphic, i.e. op(d ∪ d′) = op(d) ∪ op(d′);
– the domain SeqA is the sequence of assignments specific to the encoded information. Sequences of assignments are composable with concatenation '.' or ⊗ for an indexed sequence of concatenations;
– an efficient comparison is provided, = : (DD × DD → B), which works in constant time due to the canonicity of the representations;
– encoding and decoding are inverse operations: enDD ◦ decDD = decDD ◦ enDD = Id.

In the encoding necessary for this paper we will use various structures based on different kinds of assignments (defined in the domain SeqA = {(v1 := val1).(v2 := val2)...(vn := valn)}):

– DDD (Data Decision Diagrams), where SeqA is based on v := val, v ∈ Var, val ∈ Val;
– SDD (Set Decision Diagrams), where SeqA is based on v := val, v ∈ Var, val ∈ P(Val);
– MSDD (Multi Set Decision Diagrams), where SeqA is based on v := val, v ∈ Var, val ∈ PMS(Val)³;
– ΣDD (Signature based Decision Diagrams), where SeqA is based on v := val, v ∈ S, val ∈ TΣ,X, with a homomorphism implementing rewriting RewΣDD compatible with the rewriting on terms Rew.

7.2 Computing the Transition System

Computing the transition system requires a basic schema of how to encode the transition system, together with homomorphisms that compute new transition relations from existing ones. Without entering into a detailed description, we will just give a sketch of the encoding of states, events and the transition relation. This kind of encoding is unfortunately not optimal, due to the limited reuse caused by the presence of discriminating events.




Encoding. The encoding is given for transition systems following their inductive definition:

– enSDD(m −event→ m′) = ⊗p∈P (p := enMSDD(mp) . p′ := enMSDD(m′p)) . ev := enΣEventDD(event)
– enMSDD(ǫs) = s := ∅MSDD, s ∈ S
– enMSDD([t]s) = (s := enΣDD({t})) (the basic encoding element of a sequence), s ∈ S and t ∈ TΣ
– enMSDD(ms + ms′) = enMSDD(ms) ∪MSDD enMSDD(ms′)
– enMSDD(ms − ms′) = enMSDD(ms) \MSDD enMSDD(ms′)
– enΣDD : P(TΣ,X) → SIGDDΣ, which encodes a term as a ΣDD [11].
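To make the shape of this encoding concrete, here is a naive rendering, with our own illustrative names rather than the actual SDD implementation, of how one transition m −event→ m′ becomes a single assignment sequence over p, p′ and ev:

```python
def en_transition(places, m, m2, event):
    """Flatten a transition m --event--> m' into one assignment sequence:
    for each place p, assign its source marking to p and its target
    marking to p', then assign the event to ev."""
    seq = []
    for p in places:
        seq.append((p, frozenset(m.get(p, ()))))         # p  := en(m_p)
        seq.append((p + "'", frozenset(m2.get(p, ()))))  # p' := en(m'_p)
    seq.append(("ev", event))                            # ev := en(event)
    return tuple(seq)

# One transition of a toy net: place p1 loses token "a" on event "consume".
t = en_transition(["p1"], {"p1": {"a"}}, {"p1": set()}, "consume")
print(t)
```

In the real encoding each marking is itself an MSDD and the sequence is stored in a shared SDD, which is where the space savings come from.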

Homomorphisms. Homomorphisms are used to encode the transition relation; following [12], we can define homomorphisms for algebraic nets that compute the successor states of given states.

Let Beht = 〈event, pre, Cond, post〉 be a transition behaviour. We define H−Beht, CheckBeht and H+Beht, based on elementary homomorphisms H− and H+ working on each individual place [12], by:

– H−Beht = ◯p∈P H−(p, prep, event),
– H+Beht = ◯p∈P H+(p, postp, event),
– CheckBeht = ◯〈l,r〉∈Cond check(〈l, r〉).

The homomorphism HomBeh applies the behaviour of all transitions of T by combining the previous operators: HomBeh = Proj ◦ ⋃Beht∈Beh H−Beht ◦ CheckBeht ◦ H+Beht, where Proj is a projection function keeping only the destination (third argument) of the transition relation. Finally we compute the transitive closure: Hom∗Beh = (HomBeh ∪ Id)∗.

At the end of this process, we obtain an SDD structure building the whole transition system TSapn(m0) from an initial marking (a pseudo silent event λ must be provided in order to start the process):

⋃ {enSDD(m −event→ m′) | m −event→ m′ ∈ TSapn(m0)} = Hom∗Beh(enSDD(m0 −λ→ m0))
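Operationally, the closure Hom∗Beh = (HomBeh ∪ Id)∗ amounts to saturating the set of reachable encodings: the one-step homomorphism is applied until nothing new appears. A least-fixpoint sketch, with a hypothetical one-step successor function `succ` standing in for HomBeh:

```python
def closure(succ, initial):
    """Least fixpoint of reach = reach ∪ succ(reach): the set of states
    reachable from `initial` by iterating the one-step relation `succ`."""
    reach = set(initial)
    frontier = set(initial)
    while frontier:
        # Apply the one-step relation to the frontier only, keeping
        # states not already reached (the 'Id' part of HomBeh ∪ Id).
        new = {s2 for s in frontier for s2 in succ(s)} - reach
        reach |= new
        frontier = new
    return reach

# Toy one-step relation on integer "markings": 0 -> 1 -> 2 -> 2.
succ = lambda n: {min(n + 1, 2)}
print(closure(succ, {0}))  # {0, 1, 2}
```

In the DD setting the loop body is a single symbolic homomorphism application over the whole SDD, not a state-by-state enumeration as here.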

7.3 Computing Tests by Unfolding Test Intentions

Unfolding test intentions is essentially an iterative process determined by the rules expressed in the test intention language. We need to have a collection of sets keeping these structures. If I is the set of test intention names, we will have an I-indexed structure initialized as empty sets:

⊗i∈I i := ∅SDD




and directly the set of values for each existing sort of S. We do not worry about this very exhaustive definition of the domains because the domains are only explored once and we do not give a combinatorial definition:

⊗s∈S s := TΣ,s

Each step of the computation successively applies a test intention rule to the structure; conditions on other intentions are used to instantiate variables, and finally the set of tests under construction is altered. Consider 〈I, Λ, X〉 a test intention module. The Apply : Λ × SDD → SDD homomorphism is based on several other homomorphisms not detailed here but close to those defined on ΣDD. Additional ones such as Select : Pat × I × SDD → SDD, Keep : X × SDD → SDD, Add : SDD → SDD and Instantiate : Hml × SDD → SDD are necessary to manage the set structure of test intentions. In particular the Apply and Select homomorphisms are defined as follows:

Apply(〈l, r〉 ∪ Cond ⇒ t ∈ i) = Apply(Cond ⇒ t ∈ i) ◦ check(〈l, r〉)

Apply(f′ ∈ i′ ∪ Cond ⇒ t ∈ i) = Select(f′ ∈ i′) ◦ Apply(Cond ⇒ t ∈ i)

Apply(unif(v) ∪ Cond ⇒ t ∈ i) = Keep(v) ◦ Apply(Cond ⇒ t ∈ i)

Apply(∅ ⇒ t ∈ i) = Add(t ∈ i)

Select(T ∈ i) = Id

Select(〈e〉f ∈ i) = Instantiate(e) . Select(f ∈ i)

Select(f ∧ g ∈ i) = Select(f ∈ i) ∪ Select(g ∈ i)

Select(¬f ∈ i) = Select(e ∈ f)

Select(pat . pat′ ∈ i) = Select(pat ∈ i) . Select(pat′ ∈ i)

The Apply function is successively computed for each rule of Λ until a fixpoint is reached.

Apply∗Λ = (⋃〈cond⇒t∈i〉∈Λ Apply(cond ⇒ t ∈ i))∗

The encoded tests are then:

Apply∗Λ(⊗s∈S s := TΣ,s . ⊗i∈I i := ∅SDD)

This DD structure can be used to extract the generated tests.
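The same fixpoint pattern drives the unfolding: each rule maps the current I-indexed test sets to newly derived tests, and Apply∗Λ iterates until the sets stabilize. A sketch with hypothetical rules (`unfold` and the toy rule set are ours, not the SATEL engine):

```python
def unfold(rules, intentions):
    """Apply every rule to the I-indexed test sets until a fixpoint:
    stop when no rule derives a test that is not already selected."""
    changed = True
    while changed:
        changed = False
        for rule in rules:
            for name, test in rule(intentions):
                if test not in intentions[name]:
                    intentions[name].add(test)
                    changed = True
    return intentions

# Toy rules for one intention "i": seed the event sequence ("ev",),
# then extend already-selected sequences up to length 2.
rules = [
    lambda ts: [("i", ("ev",))],
    lambda ts: [("i", t + ("ev",)) for t in ts["i"] if len(t) < 2],
]
result = unfold(rules, {"i": set()})
print(sorted(result["i"]))  # [('ev',), ('ev', 'ev')]
```

In the symbolic setting each iteration is one application of the Apply∗Λ homomorphism over the whole SDD rather than an explicit loop over individual tests.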

7.4 Computing Tests by Unfolding Intentions and Validating Tests

The Apply homomorphism in section 7.3 allows calculating all the test case shapes by expanding all the test intentions in a test intention module. Within the unfolding process of the test intentions, the formula t ∈ i expresses the fact that we add a new test to the set of already selected test cases. However, the



Apply homomorphism only expands the test intentions into test cases without any semantics, which means we do not know whether those test cases correspond to behaviors that are expected in the SUT or not; in other words, we still need to compute the oracles for those test cases. This step is achieved by performing a walk on the transition system of the algebraic Petri net resulting from translating the BPMN specification, as explained in section 4. Although we leave the additional details of this walk abstract, the final tests will be marked as satisfied or not when computing the homomorphism Add in Apply(∅ ⇒ t ∈ i) = Add(t ∈ i).

8 Conclusions and Future Work

We have presented in an abstract manner a line of work currently in progress at our laboratory that links model checking and verification for event-based systems. We explored such methods on a case study of business process modeling, i.e. transforming BPMN into APN models, automatically generating test sets and finally applying them to verify implementations of those models.

There are many approaches in the literature to providing semantics to BPMN in terms of Petri nets, e.g. [3], as well as to test case generation using model checkers, e.g. [13]. Finally, several model checkers such as the already mentioned NuSMV [1] use BDD-based technology to encode large state spaces. We have presented a way to select test cases from the model given a description of the test intentions, i.e. the tests that seem useful to provide to the system under test. We have developed techniques, based on model checking principles using symbolic representations, to first build a representation of the semantics of the system and then unfold the test intentions into usable tests that are compared head to head with the specification semantics. This lets us produce tests with correct oracles. Apart from the fact that a real benchmark remains to be done, we think that this approach is extensible to many DSLs, following the same pattern of activities: definition of the modeling language; definition of transformations into a target language; establishment of the interface of the SUT in a black-box fashion; development of the test selection tools.

References

1. Cimatti, A., Clarke, E., Giunchiglia, E., Giunchiglia, F., Pistore, M., Roveri, M., Sebastiani, R., Tacchella, A.: NuSMV Version 2: An OpenSource Tool for Symbolic Model Checking. In: Brinksma, E., Larsen, K.G. (eds.) CAV 2002. LNCS, vol. 2404, p. 359. Springer, Heidelberg (2002)

2. Object Management Group: Business Process Modeling Notation, v1.1 (2008), http://www.bpmn.org

3. Dijkman, R.M., Dumas, M., Ouyang, C.: Semantics and analysis of business process models in BPMN. Inf. Softw. Technol. 50(12), 1281–1294 (2008)

4. Chen, A., Lucio, L.: Transform BPMN to algebraic Petri nets with encapsulation. Technical Report 207, CUI, University of Geneva (January 2009), http://smv.unige.ch/tiki-download_file.php?fileId=1153



5. ATLAS Group: ATLAS Transformation Language (2008), http://www.eclipse.org/m2m/atl/

6. Al-Shabibi, A., Buchs, D., Buffo, M., Chachkov, S., Chen, A., Hurzeler, D.: Prototyping object oriented specifications. In: van der Aalst, W.M.P., Best, E. (eds.) ICATPN 2003. LNCS, vol. 2679, pp. 473–482. Springer, Heidelberg (2003)

7. Lucio, L.: SATEL — A Test Intention Language for Object-Oriented Specifications of Reactive Systems. PhD thesis, Universite de Geneve, Switzerland (2008), http://smv.unige.ch/tiki-download_file.php?fileId=975

8. Couvreur, J.-M., Encrenaz, E., Paviot-Adet, E., Poitrenaud, D., Wacrenier, P.: Data decision diagrams for Petri net analysis. In: Esparza, J., Lakos, C.A. (eds.) ICATPN 2002. LNCS, vol. 2360, p. 101. Springer, Heidelberg (2002)

9. Couvreur, J.-M., Thierry-Mieg, Y.: Hierarchical decision diagrams to exploit model structure. In: Wang, F. (ed.) FORTE 2005. LNCS, vol. 3731, pp. 443–457. Springer, Heidelberg (2005)

10. Bryant, R.: Graph-based algorithms for Boolean function manipulation. IEEE Transactions on Computers C-35, 677–691 (1986)

11. Buchs, D., Hostettler, S.: Sigma decision diagrams. Technical Report 204, CUI, Universite de Geneve. TERMGRAPH 2009 (January 2009) (to appear), http://smv.unige.ch/tiki-download_file.php?fileId=1147

12. Buchs, D., Hostettler, S.: Toward efficient state space generation of algebraic Petri nets. Technical report, CUI, Universite de Geneve (January 2009), http://smv.unige.ch/tiki-download_file.php?fileId=1151

13. Lucio, L., Samer, M.: Technology of test-case generation. In: Model-Based Testing of Reactive Systems, pp. 323–354 (2004)