Comparing Approaches to Implement Feature Model Composition

Mathieu Acher 1, Philippe Collet 1, Philippe Lahire 1, Robert France 2

1 University of Nice Sophia Antipolis (France), Modalis Team (CNRS, I3S Laboratory)

2 Computer Science Department, Colorado State University


DESCRIPTION

The use of Feature Models (FMs) to define the valid combinations of features in Software Product Lines (SPLs) is becoming commonplace. To enhance the scalability of FMs, support for composing FMs describing different SPL aspects is needed. Some composition operators, with interesting property-preservation capabilities, have already been defined, but a comprehensive and efficient implementation has yet to be proposed. In this paper, we systematically compare the strengths and weaknesses of different implementation approaches. The study provides some evidence that generic model composition frameworks do not help much in the realization, whereas a dedicated solution proves necessary and clearly stands out by its qualities.

TRANSCRIPT

Page 1: Comparing Approaches to Implement Feature Model Composition

Comparing Approaches to Implement Feature Model Composition

Mathieu Acher1, Philippe Collet1, Philippe Lahire1, Robert France2

1 University of Nice Sophia Antipolis (France),

Modalis Team (CNRS, I3S Laboratory)

2 Computer Science Department,

Colorado State University

Page 2: Comparing Approaches to Implement Feature Model Composition

Context: Managing Variability

• Constructing a Repository of Medical Imaging Algorithms

– deployable on Grid infrastructures

– services embed the business code and are invoked remotely through a standardized protocol

• Highly Parameterized Services

– efficiently extend, change, customize, or configure services for use in a particular context

– reusability and composability

– service as software product line (SPL) (SOAPL’08, MICCAI-Grid’08)


Page 3: Comparing Approaches to Implement Feature Model Composition

Context: Managing Variability

• Constructing a Repository of Medical Imaging Services

– Deployable on Grid infrastructures

– Services embed the business code (e.g., algorithms) and are invoked remotely through a standardized protocol

• Highly Parameterized Services


[Feature model diagram: MedicalImage with features Anonymized, Format (DICOM, Nifti, Analyze), ModalityAcquisition (MRI with T1/T2, CT, SPEC, PET); legend: And-Group, Or-Group, Xor-Group, Mandatory, Optional]

Page 4: Comparing Approaches to Implement Feature Model Composition

Context: Managing Variability

• Constructing a Repository of Medical Imaging Services

– Deployable on Grid infrastructures

– Services embed the business code (e.g., algorithms) and are invoked remotely through a standardized protocol

• Highly Parameterized Services


[Feature model diagrams: the MedicalImage FM above, plus a Registration FM with features InteractiveMethod, Spatial, Frequency, Transformation (Linear: Rotation, Affine, Scaling; NonGrid); legend: And-Group, Or-Group, Xor-Group, Mandatory, Optional]

Page 5: Comparing Approaches to Implement Feature Model Composition

Context: Managing Variability

• Constructing a Repository of Medical Imaging Services

– Deployable on Grid infrastructures

– Services embed the business code (e.g., algorithms) and are invoked remotely through a standardized protocol

• Highly Parameterized Services


[Feature model diagrams: the MedicalImage and Registration FMs above, plus a GridComputingNode FM with features FileSizeLimit, Processor (x32, x64), OperatingSystem (Windows, Linux); legend: And-Group, Or-Group, Xor-Group, Mandatory, Optional]

Page 6: Comparing Approaches to Implement Feature Model Composition

Context: Managing Variability

• Constructing a Repository of Medical Imaging Services

– Deployable on Grid infrastructures

– Services embed the business code (e.g., algorithms) and are invoked remotely through a standardized protocol

• Highly Parameterized Services


[Feature model diagrams: the MedicalImage, Registration and GridComputingNode FMs above, plus a NetworkProtocol FM (CryptographicFormat, HeaderEncoding: XML, HTTP) and a QoS FM (DynamicDimension, Reliability, Time, Measurement); legend: And-Group, Or-Group, Xor-Group, Mandatory, Optional]

Page 7: Comparing Approaches to Implement Feature Model Composition

Issues in Variability Modeling

• Current variability modeling techniques often do not scale up to SPLs with a large number of features.


Scalability issues in terms of: construction, evolution, reasoning

Page 8: Comparing Approaches to Implement Feature Model Composition

Separation of Concerns in SPLs

• Rather than a large and monolithic variability model:

– Use smaller models representing the variability of well-identified concerns

– When variability models are separated, composition operators are needed

• In earlier work, we proposed a set of composition operators for feature models (SLE’09)

• In this work, we focus on the merge operator


Page 9: Comparing Approaches to Implement Feature Model Composition

Purpose and Intended Audience

• An efficient, accurate implementation to automatically merge feature models

• Our interest here:

– determine how (MBE/AOM/specific) techniques perform when implementing feature model merging

– and which techniques are the most suitable

• Intended audience:

– (1) SPL researchers working on feature modeling techniques, or developers of feature modeling tools;

– (2) researchers/practitioners involved in the MBE/AOM community


Page 10: Comparing Approaches to Implement Feature Model Composition

Agenda

• Background and Motivation

– Feature models and Merge operators

• Requirements for Merge Operators

– Criteria

• Comparison of Different Approaches

– Results

• Conclusion


Page 11: Comparing Approaches to Implement Feature Model Composition

Background: Feature Models

• Hierarchy + Variability

– Mandatory features, Optional features

– Alternatives and Constraints


[Feature model diagram: MedicalImage with features Anonymized, Format (DICOM, Nifti, Analyze), ModalityAcquisition (MRI with T1/T2, CT, SPEC, PET); legend: And-Group, Or-Group, Xor-Group, Mandatory, Optional]
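To make the slide's notation concrete, here is a minimal sketch in Python (hypothetical names, not the paper's tooling; the group types of the real MedicalImage FM are not asserted here) of a feature model as a tree with mandatory/optional children and Xor/Or groups, plus a naive validity check for a configuration.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Minimal, illustrative feature-model structure (hypothetical names,
# not the tooling used in the paper).
@dataclass
class Feature:
    name: str
    mandatory: bool = True                 # relation to its parent
    children: List["Feature"] = field(default_factory=list)
    group: Optional[str] = None            # None, "xor" or "or" over the children

def is_valid(fm: Feature, config: set) -> bool:
    """Check that a set of selected feature names respects the tree."""
    def check(f: Feature, parent_selected: bool) -> bool:
        selected = f.name in config
        if selected and not parent_selected:
            return False                    # a child needs its parent
        if f.mandatory and parent_selected and not selected:
            return False                    # mandatory child must follow its parent
        if selected and f.group in ("xor", "or"):
            n = sum(c.name in config for c in f.children)
            if f.group == "xor" and n != 1:
                return False
            if f.group == "or" and n < 1:
                return False
        return all(check(c, selected) for c in f.children)
    return check(fm, True)

# A fragment of the MedicalImage FM from the slides (relations are assumptions).
mri = Feature("MRI", children=[Feature("T1"), Feature("T2", mandatory=False)])
fm = Feature("MedicalImage", children=[mri])
print(is_valid(fm, {"MedicalImage", "MRI", "T1"}))   # True
print(is_valid(fm, {"MedicalImage", "T1"}))          # False: mandatory MRI is missing
```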

Page 12: Comparing Approaches to Implement Feature Model Composition

Background: Feature Models

• Hierarchy + Variability

– Mandatory features, Optional features

– Alternatives and Constraints


[Feature model diagram: MedicalImage with features Anonymized, Format (DICOM, Nifti, Analyze), ModalityAcquisition (MRI with T1/T2, CT, SPEC, PET); legend: And-Group, Or-Group, Xor-Group, Mandatory, Optional]

Page 13: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Principles


When two feature models (FMs) share several features, there is a need to merge the overlapping parts.

[Figure: two small input FMs over the shared features MRI, T1, T2]

Page 14: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Principles


Which semantic properties should the merged FM preserve?

[Figure: FM1 and FM2, both over the features MRI, T1, T2, and a candidate merged FM]

Page 15: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Union


[Figure: FM1 and FM2, both over MRI, T1, T2, and their merge in union mode]

One input FM admits the configurations {{MRI, T1}, {MRI, T1, T2}}, the other admits {{MRI, T1}, {MRI, T2}}; the merged FM in union mode admits {{MRI, T1}, {MRI, T1, T2}, {MRI, T2}}.

Page 16: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Intersection


[Figure: FM1 and FM2, both over MRI, T1, T2, and their merge in intersection mode]

One input FM admits the configurations {{MRI, T1}, {MRI, T1, T2}}, the other admits {{MRI, T1}, {MRI, T2}}; the merged FM in intersection mode admits only the common configuration {{MRI, T1}}.
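Both modes are defined on the configuration sets of the input FMs, so they can be checked directly on those sets. A minimal Python sketch (illustrative only, not the paper's implementation) reproducing the example above:

```python
# Configurations are sets of selected features; an FM denotes a set of them.
# frozenset is used so configurations can live inside Python sets.
fm1 = {frozenset({"MRI", "T1"}), frozenset({"MRI", "T1", "T2"})}
fm2 = {frozenset({"MRI", "T1"}), frozenset({"MRI", "T2"})}

def merge_union(confs1, confs2):
    """Union mode: every configuration of either input FM remains valid."""
    return confs1 | confs2

def merge_intersection(confs1, confs2):
    """Intersection mode: only configurations valid in both input FMs remain."""
    return confs1 & confs2

print(sorted(map(sorted, merge_union(fm1, fm2))))
# [['MRI', 'T1'], ['MRI', 'T1', 'T2'], ['MRI', 'T2']]
print(sorted(map(sorted, merge_intersection(fm1, fm2))))
# [['MRI', 'T1']]
```

The actual challenge addressed in the paper is to produce a merged FM whose configuration semantics equals these sets without enumerating configurations.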

Page 17: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Requirements (1)


[Figure: two candidate merged FMs, one rejected ("no!"), one accepted ("OK")]

Page 18: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Requirements (1)


[Figure: candidate merged FMs; two acceptable results ("OK") and one that is not optimal because Nifti becomes a "dead" feature]

Page 19: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Requirements (1)


[Figure: a candidate merged FM in which everything is OK]

Page 20: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Requirements (2)


“Managing Variability in Workflow with Feature Model Composition Operators”, Software Composition (SC) conference, 2010

Page 21: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Requirements (3)

The ability of the merge operator to deal with several kinds of input FMs


Page 22: Comparing Approaches to Implement Feature Model Composition

Merge Operator: Requirements (4)

Aspects of the Implementation


Page 23: Comparing Approaches to Implement Feature Model Composition

Now the competition can start!

• Separate FMs

• AGG

• Kompose

• Kermeta

• Boolean Logic

– Large spectrum: from modeling/composition techniques to FM-specific solutions

– Some approaches have been proposed by other researchers


Page 24: Comparing Approaches to Implement Feature Model Composition

Separate FMs and Intersection


[Figure: a Base FM and an Aspect FM kept separate]

1. Prime the features of one FM (A becomes A', B becomes B')
2. Add the constraints p ⇔ p' for each shared feature p
3. Add a new root R with an And-group over both FMs

Resulting configuration set: {{R, A, A', B, B'}}

(Schobbens et al. 2007)
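A brute-force sketch of this encoding, based on my reading of the slide rather than the authors' code: prime the Aspect's features, put both FMs under a synthetic root R with an And-group, keep only combinations in which each shared feature p agrees with its copy p', and project away the primed copies to obtain the intersection.

```python
# Illustrative sketch of the "keep the FMs separate" encoding (my reading
# of the slide, not the paper's implementation).
base   = {frozenset({"A", "B"})}          # configurations of the Base FM
aspect = {frozenset({"A", "B"})}          # configurations of the Aspect FM
shared = {"A", "B"}

def prime(conf):
    return frozenset(f + "'" for f in conf)

# Root R with an And-group over both FMs: pick one configuration of each,
# and keep only pairs where every shared feature p agrees with its copy p'.
combined = {
    frozenset({"R"}) | cb | prime(ca)
    for cb in base for ca in aspect
    if all((p in cb) == (p in ca) for p in shared)
}
print(sorted(sorted(c) for c in combined))
# [['A', "A'", 'B', "B'", 'R']]  -- i.e. {{R, A, A', B, B'}} as on the slide

# Projecting away R and the primed copies gives the intersection of the inputs.
intersection = {frozenset(f for f in c if f in shared) for c in combined}
print(intersection == (base & aspect))    # True
```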

Page 25: Comparing Approaches to Implement Feature Model Composition

Separate FMs and Intersection


[Figure: the Base and Aspect FMs with evaluation marks against the criteria]

Page 26: Comparing Approaches to Implement Feature Model Composition

Separate FMs and Intersection


[Figure: the Base and Aspect FMs with evaluation marks against the criteria]

Page 27: Comparing Approaches to Implement Feature Model Composition

Separate FMs and Intersection


[Figure: the Base and Aspect FMs with evaluation marks against the criteria]

Page 28: Comparing Approaches to Implement Feature Model Composition

Separate FMs and Intersection


[Figure: the Base and Aspect FMs with evaluation marks against the criteria]

Page 29: Comparing Approaches to Implement Feature Model Composition

AGG

• Attributed Graph Grammar

• Graph Transformation

– Left-Hand Side (LHS): source graph

– Right-Hand Side (RHS): target graph

• Catalogue of merge rules (Segura et al. 2007)

– Only for Union mode
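The catalogue expresses such rules as AGG graph transformations (an LHS pattern matched in the input FMs and an RHS producing the merged elements). As a plain-code paraphrase of a single union-mode rule (not AGG syntax, not necessarily one of Segura et al.'s exact rules, and assuming the matched feature has the same parent in both FMs):

```python
# Plain-code paraphrase of one union-mode rule (illustrative; the real
# catalogue is expressed as graph-transformation rules with an LHS and an RHS).
def merge_child_edge_union(kind1: str, kind2: str) -> str:
    """Combine the parent-child relation of a feature matched in both FMs.

    kind1/kind2 are "mandatory" or "optional". In union mode the merged FM
    must accept every configuration of either input, so the child can only
    stay mandatory if it is mandatory on both sides.
    """
    if kind1 == "mandatory" and kind2 == "mandatory":
        return "mandatory"
    return "optional"

print(merge_child_edge_union("mandatory", "optional"))   # optional
print(merge_child_edge_union("mandatory", "mandatory"))  # mandatory
```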


Page 30: Comparing Approaches to Implement Feature Model Composition

AGG and a non-trivial example


(Intersection mode)

Page 31: Comparing Approaches to Implement Feature Model Composition

On the Difficulties of AGG

• The semantic properties currently implemented are limited to the merge in union mode

– The intersection mode remains particularly challenging to implement

• A strategy based on patterns is difficult to realize

– AGG expressiveness: non-recursive patterns

• Negative application conditions can precisely locate the source of errors


[Evaluation marks against the criteria]

Page 32: Comparing Approaches to Implement Feature Model Composition

Kompose

• Generic composition tool (Fleurey, R. France et al.)

• Two major phases:

– (1) The Matching phase identifies model elements that describe the same concepts in the input models to be composed;

– (2) In the Merging phase, matched model elements are merged to create new elements in the resulting model.

• Each element type has a signature: two elements with equivalent signatures are merged.


[Figure: signatures defined for the Feature and Operator element types; merge example in union mode]
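A minimal sketch of this two-phase scheme in Python (illustrative only; Kompose itself is written in Kermeta, and the Feature class and helpers here are mine): features are matched on a name-based signature, matched features are merged recursively, and unmatched features are copied.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative two-phase match/merge, not Kompose itself.
@dataclass
class Feature:
    name: str
    children: List["Feature"] = field(default_factory=list)

def signature(f: Feature) -> str:
    # Kompose derives a signature per element type; for features the name
    # is the natural choice.
    return f.name

def merge(a: Feature, b: Feature) -> Feature:
    """Merging phase: recursively merge matched children, copy the others."""
    assert signature(a) == signature(b)
    merged = Feature(a.name)
    b_children = {signature(c): c for c in b.children}
    for ca in a.children:
        cb = b_children.pop(signature(ca), None)      # matching phase
        merged.children.append(merge(ca, cb) if cb else ca)
    merged.children.extend(b_children.values())       # unmatched elements of b
    return merged

fm1 = Feature("MRI", [Feature("T1"), Feature("T2")])
fm2 = Feature("MRI", [Feature("T1")])
print([c.name for c in merge(fm1, fm2).children])     # ['T1', 'T2']
```

This naive version only handles the feature tree; the variability information (optional vs. mandatory, groups, constraints) is precisely where the signature-based, local scheme becomes too restrictive, as discussed next.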

Page 33: Comparing Approaches to Implement Feature Model Composition

Kompose and a non-trivial example

• Two major phases:

– (1) The Matching phase identifies model elements that describe the same concepts in the input models to be composed;

– (2) In the Merging phase, matched model elements are merged to create new elements in the resulting model.


(Intersection mode)

Page 34: Comparing Approaches to Implement Feature Model Composition

On the Difficulties of Kompose

• The compositional approach structured in two stages (matching and merging) is too restrictive for implementing an FM-specific merge operator.

• Recursive detection of matching elements is not sufficient, since we need a more global vision to decide whether elements should be merged or not

– Post-conditions: hard to implement

– As Kompose implies local reasoning, handling constraints is not feasible either


[Evaluation marks against the criteria]

(Intersection mode)

Page 35: Comparing Approaches to Implement Feature Model Composition

Experience with Kermeta

• Executable, imperative and object-oriented (meta-)modeling language

– Kompose is built on top of Kermeta

– we apply the same strategy as with Kompose, but without strictly following the compositional approach

• We gain some benefits, notably a better coverage of the semantic properties. Now that global and more complex reasoning is possible, some features are no longer unnecessarily added and fewer FM errors are generated.

• There is still an issue when dealing with different hierarchies.

• Finally, the handling of constraints appears to be impractical.



Page 36: Comparing Approaches to Implement Feature Model Composition

Boolean Logic

• The set of configurations represented by a FM can be described by a propositional formula defined over a set of Boolean variables

– A & (A<=>B) & (C=>A)

• We can define the Intersection mode


Configurations of the formula: {{A, B}, {A, B, C}}
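A brute-force Python sketch (illustrative only; a real implementation would rely on a SAT solver or BDDs rather than enumeration) that recovers the configuration set of the formula above and expresses the intersection mode as the conjunction of the two input formulas:

```python
from itertools import product

def configurations(features, formula):
    """Enumerate all variable assignments that satisfy the formula.

    Brute force is fine for illustration, but it enumerates 2^n assignments.
    """
    confs = set()
    for values in product([False, True], repeat=len(features)):
        env = dict(zip(features, values))
        if formula(env):
            confs.add(frozenset(f for f in features if env[f]))
    return confs

# phi = A and (A <=> B) and (C => A)
phi = lambda e: e["A"] and (e["A"] == e["B"]) and (not e["C"] or e["A"])
print(sorted(sorted(c) for c in configurations(["A", "B", "C"], phi)))
# [['A', 'B'], ['A', 'B', 'C']]   -- as on the slide

# Intersection mode of two FMs encoded as formulas: the conjunction phi & phi2.
phi2 = lambda e: e["A"] and (not e["A"] or e["C"])       # A and (A => C)
phi_inter = lambda e: phi(e) and phi2(e)
print(sorted(sorted(c) for c in configurations(["A", "B", "C"], phi_inter)))
# [['A', 'B', 'C']]   -- the common configurations of the two formulas
```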

Page 37: Comparing Approaches to Implement Feature Model Composition

Boolean Logic

• We have only a Boolean formula: where is the hierarchy? the variability information?

• Czarnecki et al. precisely propose an algorithm to construct an FM from a Boolean formula (SPLC’07)

– The algorithm constructs a tree, with additional nodes for feature groups, that can be translated into a basic FM.

– We first simplify the formula:

• If φ ∧ f is unsatisfiable, the feature f is dead and can be removed.

• The feature f can be identified as a full mandatory feature if φ ∧ ¬f is unsatisfiable.
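The two simplification rules translate directly into satisfiability checks. A self-contained brute-force sketch (illustrative only; the helper names are mine, and a SAT solver would be used in practice):

```python
from itertools import product

def satisfiable(features, formula):
    """Brute-force satisfiability check (a SAT solver would be used in practice)."""
    return any(formula(dict(zip(features, values)))
               for values in product([False, True], repeat=len(features)))

def is_dead(features, phi, f):
    # phi ∧ f unsatisfiable  ->  f can never be selected
    return not satisfiable(features, lambda e: phi(e) and e[f])

def is_mandatory(features, phi, f):
    # phi ∧ ¬f unsatisfiable  ->  f is selected in every configuration
    return not satisfiable(features, lambda e: phi(e) and not e[f])

feats = ["A", "B", "C", "D"]
# phi = A and (A <=> B) and (C => A) and not D
phi = lambda e: e["A"] and (e["A"] == e["B"]) and (not e["C"] or e["A"]) and not e["D"]
print([f for f in feats if is_dead(feats, phi, f)])        # ['D']
print([f for f in feats if is_mandatory(feats, phi, f)])   # ['A', 'B']
```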


Page 38: Comparing Approaches to Implement Feature Model Composition

Boolean Logic: Strengths and Current Limits

• Experiment on a set of input FMs sharing the same set of features and the same hierarchy

– The algorithm indicates all parent-child relationships (mandatory features) and all possible optional subfeatures, so that the hierarchy of the merged FM corresponds to the hierarchies of the input FMs.

– And-groups, Or-groups and Xor-groups can be efficiently restored in the resulting FM when necessary (a sketch of such a check follows this list).

• Strengths

– The semantic properties are respected by construction.

– The technique does not introduce FM errors and does not unnecessarily increase the number of features.

– Constraints in FMs can be expressed using the full expressiveness of Boolean logic, and different sets of features can be manipulated.

– A priori detection of errors: the formula is unsatisfiable.

• Current Limits

– Hierarchy mismatch

– Explanation
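One way such a group restoration can be checked against the formula (my sketch, not necessarily the exact procedure of Czarnecki et al.): the children of a parent p form an Xor-group if, whenever p is selected, at least one child is selected and no two children can be selected together.

```python
from itertools import product, combinations

def satisfiable(features, formula):
    # Brute-force check, purely for illustration.
    return any(formula(dict(zip(features, values)))
               for values in product([False, True], repeat=len(features)))

def is_xor_group(features, phi, parent, children):
    """Check that, under phi, the children behave as an Xor-group of parent."""
    # At least one child whenever the parent is selected:
    none_sel = lambda e: phi(e) and e[parent] and not any(e[c] for c in children)
    if satisfiable(features, none_sel):
        return False
    # No two children selected together:
    for c1, c2 in combinations(children, 2):
        both = lambda e, c1=c1, c2=c2: phi(e) and e[c1] and e[c2]
        if satisfiable(features, both):
            return False
    return True

feats = ["MRI", "T1", "T2"]
# MRI and (T1 or T2) and not (T1 and T2): T1/T2 behave as an Xor-group of MRI
phi = lambda e: e["MRI"] and (e["T1"] or e["T2"]) and not (e["T1"] and e["T2"])
print(is_xor_group(feats, phi, "MRI", ["T1", "T2"]))    # True
```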


Page 39: Comparing Approaches to Implement Feature Model Composition

Results


Page 40: Comparing Approaches to Implement Feature Model Composition

Model-based composition techniques

• Difficulties. Why?

• The merge of FMs is not purely structural

– You cannot focus only on syntactic properties

– Semantic transformations or semantics-preserving model composition are needed

• A new challenge for modeling tools?

– Of course, modeling solutions can be revisited

• Other modeling approaches and technologies can be considered and may emerge to outperform the solutions considered in this paper

• e.g., using another graph transformation language


Page 41: Comparing Approaches to Implement Feature Model Composition

Conclusion

• The implementation of a merge operator for FMs is an interesting challenge:

– We defined a set of criteria to systematically evaluate an implementation

– We compared MBE/AOM/state-of-the-art techniques

– We proposed a solution based on Boolean logic that fulfills most of the criteria

• and lifts some limitations of our earlier work

• Future Work – Open Issues

– diff and refactoring operations for FMs

– Practical use of merge operators in different domains


Page 42: Comparing Approaches to Implement Feature Model Composition

?

Page 43: Comparing Approaches to Implement Feature Model Composition

Related Work

• Schobbens, P.Y., Heymans, P., Trigaux, J.C., Bontemps, Y.: Generic semantics of feature diagrams. Comput. Netw. 51(2) (2007) 456–479

• Segura, S., Benavides, D., Ruiz-Cortés, A., Trinidad, P.: Automated merging of feature models using graph transformations. Post-proceedings of the Second Summer School on GTTSE 5235 (2008) 489–505

• Fleurey, F., Baudry, B., France, R.B., Ghosh, S.: A generic approach for automatic model composition. In Giese, H., ed.: MoDELS Workshops, Springer (2007) 7–15

• Reddy, Y.R., Ghosh, S., France, R.B., Straw, G., Bieman, J.M., McEachen, N., Song, E., Georg, G.: Directives for composing aspect-oriented design class models. Transactions on Aspect-Oriented Software Development 3880 (2006) 75–105

• Czarnecki, K., Wasowski, A.: Feature diagrams and logics: There and back again. In: SPLC 2007. (2007) 23–34

• Acher, M., Collet, P., Lahire, P., France, R.: Composing Feature Models. In: 2nd Int’l Conference on Software Language Engineering (SLE’09). LNCS (2009) 20


Page 44: Comparing Approaches to Implement Feature Model Composition

A non-trivial example
