Object-Centric Reflection

Jorge Ressia, Software Composition Group


DESCRIPTION

ESUG 2012, Ghent

TRANSCRIPT

Page 1: Object Centric Reflection

Object-Centric Reflection

Jorge Ressia, Software Composition Group

Page 2: Object Centric Reflection

Profiling


Page 3: Object Centric Reflection

Profiling: the activity of analyzing a program execution.

Page 4: Object Centric Reflection

Profile

Excerpt from the paper “Domain-Specific Profiling” (p. 3):

CPU time profiling

Mondrian [9] is an open and agile visualization engine. Mondrian describes a visualization using a graph of (possibly nested) nodes and edges. In June 2010 a serious performance issue was raised¹. Tracking down the cause of the poor performance was not trivial. We first used a standard sample-based profiler.

Execution sampling approximates the time spent in an application’s methods by periodically stopping a program and recording the current set of methods under execution. Such a profiling technique is relatively accurate since it has little impact on the overall execution. This sampling technique is used by almost all mainstream profilers, such as JProfiler, YourKit, xprof [10], and hprof.

MessageTally, the standard sampling-based profiler in Pharo Smalltalk², textually describes the execution in terms of CPU consumption and invocation for each method of Mondrian:

54.8% {11501ms} MOCanvas>>drawOn:
  54.8% {11501ms} MORoot(MONode)>>displayOn:
    30.9% {6485ms} MONode>>displayOn:
      | 18.1% {3799ms} MOEdge>>displayOn:
      ...
      | 8.4% {1763ms} MOEdge>>displayOn:
      | | 8.0% {1679ms} MOStraightLineShape>>display:on:
      | | 2.6% {546ms} FormCanvas>>line:to:width:color:
      ...
    23.4% {4911ms} MOEdge>>displayOn:
    ...

We can observe that the virtual machine spent about 54% of its time in the method displayOn: defined in the class MORoot. A root is the unique non-nested node that contains all the nodes and edges of the visualization. This general profiling information says that rendering nodes and edges consumes a great share of the CPU time, but it does not help in pinpointing which nodes and edges are responsible for the time spent. Not all graphical elements equally consume resources.

Traditional execution sampling profilers center their result on the frames of the execution stack and completely ignore the identity of the object that received the method call and its arguments. As a consequence, it is hard to track down which objects cause the slowdown. For the example above, the traditional profiler says that we spent 30.9% in MONode>>displayOn: without saying which nodes were actually refreshed too often.
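For readers who have not used it, such a report is produced by handing MessageTally a block to run. A minimal sketch, assuming the Mondrian visualization to be rendered is held in a variable named view:

    "Sample the execution of the block and print the CPU-time tree;
     'view' is an assumed variable holding the Mondrian visualization."
    MessageTally spyOn: [ view open ]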

Coverage

PetitParser is a parsing framework combining ideas from scannerless parsing, parser combinators, parsing expression grammars and packrat parsers to model grammars and parsers as objects that can be reconfigured dynamically [11].

1 http://forum.world.st/Mondrian-is-slow-next-step-tc2257050.html#a2261116

2 http://www.pharo-project.org/
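To give a feel for PetitParser as described above, here is a minimal sketch of a parser built from composable parser objects; the grammar is invented for illustration and assumes PetitParser is loaded in the image:

    "An unsigned-integer parser: one or more digits, flattened into a
     string and converted to a number by the attached action block."
    | number |
    number := #digit asParser plus flatten ==> [ :digits | digits asNumber ].
    number parse: '2012'    "=> 2012"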


Page 5: Object Centric Reflection

Domain

Profile

(The same excerpt from “Domain-Specific Profiling” as on Page 4.)

Page 6: Object Centric Reflection

Mondrian


Page 7: Object Centric Reflection


Page 8: Object Centric Reflection

System Complexity
Lanza and Ducasse, 2003

Page 9: Object Centric Reflection

(The same excerpt from “Domain-Specific Profiling” as on Page 4.)

Page 10: Object Centric Reflection

What is the relationship?

(The same excerpt from “Domain-Specific Profiling” as on Page 4.)

?

Page 11: Object Centric Reflection

Debugging


Page 12: Object Centric Reflection

Debugging: the process of interacting with a running software system to test and understand its current behavior.

Page 13: Object Centric Reflection

Mondrian


Page 14: Object Centric Reflection


Page 15: Object Centric Reflection

System Complexity
Lanza and Ducasse, 2003

Page 16: Object Centric Reflection

Rendering


Page 17: Object Centric Reflection

Shape and Nodes


Page 18: Object Centric Reflection


Page 19: Object Centric Reflection


Page 20: Object Centric Reflection

How do we debug this?


Page 21: Object Centric Reflection

Breakpoints


Page 22: Object Centric Reflection

Conditional Breakpoints
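In Pharo such a breakpoint is typically written with haltIf:, which opens the debugger only when a condition holds. A minimal sketch with an invented loop and condition:

    "The debugger opens only on the 42nd iteration; every other
     iteration runs through undisturbed."
    1 to: 100 do: [ :i |
        self haltIf: [ i = 42 ] ]

To stop for one particular object this way, the condition and the halt still have to be edited into the method source and removed afterwards, which is exactly the friction the following slides address.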


Page 23: Object Centric Reflection


Page 24: Object Centric Reflection


Page 25: Object Centric Reflection

Developer Questions


Page 26: Object Centric Reflection

When during the execution is this method called? (Q.13)
Where are instances of this class created? (Q.14)
Where is this variable or data structure being accessed? (Q.15)
What are the values of the argument at runtime? (Q.19)
What data is being modified in this code? (Q.20)
How are these types or objects related? (Q.22)
How can data be passed to (or accessed at) this point in the code? (Q.28)
What parts of this data structure are accessed in this code? (Q.33)

Page 27: Object Centric Reflection

When during the execution is this method called? (Q.13)
Where are instances of this class created? (Q.14)
Where is this variable or data structure being accessed? (Q.15)
What are the values of the argument at runtime? (Q.19)
What data is being modified in this code? (Q.20)
How are these types or objects related? (Q.22)
How can data be passed to (or accessed at) this point in the code? (Q.28)
What parts of this data structure are accessed in this code? (Q.33)

Sillito et al., “Questions programmers ask during software evolution tasks”, 2008

Page 28: Object Centric Reflection

What is the relationship?

?

When during the execution is this method called? (Q.13)

Where are instances of this class created? (Q.14)

Where is this variable or data structure being accessed? (Q.15)

What are the values of the argument at runtime? (Q.19)

What data is being modified in this code? (Q.20)

How are these types or objects related? (Q.22)

How can data be passed to (or accessed at) this point in the code? (Q.28)

What parts of this data structure are accessed in this code? (Q.33)


Page 29: Object Centric Reflection

What is the problem?


Page 30: Object Centric Reflection

Traditional Reflection


Page 31: Object Centric Reflection

(The same excerpt from “Domain-Specific Profiling” as on Page 4.)

?

Profiling


Page 32: Object Centric Reflection

?

When during the execution is this method called? (Q.13)

Where are instances of this class created? (Q.14)

Where is this variable or data structure being accessed? (Q.15)

What are the values of the argument at runtime? (Q.19)

What data is being modified in this code? (Q.20)

How are these types or objects related? (Q.22)

How can data be passed to (or accessed at) this point in the code? (Q.28)

What parts of this data structure are accessed in this code? (Q.33)

Debugging


Page 33: Object Centric Reflection

Object Paradox


Page 34: Object Centric Reflection

Object-Centric Reflection


Page 35: Object Centric Reflection


Page 36: Object Centric Reflection

Organize the Meta-level


Page 37: Object Centric Reflection

Explicit Meta-objects

Page 38: Object Centric Reflection

Object

Meta-object

Class


Page 39: Object Centric Reflection

Object

Meta-object

Class


Page 40: Object Centric Reflection

Evolved Object

Meta-object

Class
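To make the diagram concrete: one object acquires its own meta-object and thereby new behaviour, while its class and every other instance stay untouched. The following is not the Bifrost API, only a low-tech sketch of the same per-object effect using Pharo's newAnonymousSubclass and adoptInstance: (both assumed to be present in the image):

    "Give one Point instance extra behaviour without touching class Point."
    | special adapted |
    special := 3 @ 4.
    adapted := Point newAnonymousSubclass.
    adapted compile: 'isSpecial ^ true'.
    adapted adoptInstance: special.
    special isSpecial.              "=> true"
    (5 @ 6) respondsTo: #isSpecial  "=> false, other points are unaffected"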


Page 41: Object Centric Reflection

Execution Reification

Structure Evolution

Profiling Debugging

Page 42: Object Centric Reflection

Execution Reification

Structure Evolution

Profiling Debugging

Page 43: Object Centric Reflection

Object-Centric Debugging


Page 44: Object Centric Reflection

Object-Centric Debugging

ICSE 2012, J. Ressia, A. Bergel and O. Nierstrasz

Page 45: Object Centric Reflection


Page 46: Object Centric Reflection


Page 47: Object Centric Reflection


Page 48: Object Centric Reflection


Page 49: Object Centric Reflection


Page 50: Object Centric Reflection


Page 51: Object Centric Reflection

Stack-centric debugging, centered on the InstructionStream class:

  InstructionStream class>>on:
  InstructionStream class>>new
  InstructionStream>>initialize
  CompiledMethod>>initialPC
  InstructionStream>>method:pc:
  InstructionStream>>nextInstruction
  MessageCatcher class>>new
  InstructionStream>>interpretNextInstructionFor:
  ...
  step into, step over, resume

Object-centric debugging, centered on the InstructionStream object:

  on:
  new
  initialize
  method:pc:
  nextInstruction
  interpretNextInstructionFor:
  ...
  next message, next change

Page 52: Object Centric Reflection

Mondrian


Page 53: Object Centric Reflection

Shape and Nodes


Page 54: Object Centric Reflection


Page 55: Object Centric Reflection


Page 56: Object Centric Reflection

halt on object in call


Page 57: Object Centric Reflection

Halt on next message
Halt on next message/s named
Halt on state change
Halt on state change named
Halt on next inherited message
Halt on next overloaded message
Halt on object/s in call
Halt on next message from package
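Expressed as code, these operations attach to a specific object rather than to a method. The receiver and selector names below are illustrative assumptions, not necessarily the debugger's actual API:

    "Illustrative only: ask one object to halt, instead of planting a
     breakpoint in a method that every instance goes through."
    aNode haltOnNextMessage.
    aNode haltOnStateChangeNamed: #shape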


Page 58: Object Centric Reflection

Execution Reification

Structure Evolution

Debugging Profiling

Page 59: Object Centric Reflection

MetaSpy


Page 60: Object Centric Reflection

MetaSpy

TOOLS 2011, Bergel et al.

Page 61: Object Centric Reflection

Mondrian Profiler


Page 62: Object Centric Reflection


Page 63: Object Centric Reflection

System Complexity
Lanza and Ducasse, 2003

Page 64: Object Centric Reflection


Page 65: Object Centric Reflection

Profiling

Structure Evolution

Debugging

Execution Reification

Page 66: Object Centric Reflection

What if we do not know what to evolve?


Page 67: Object Centric Reflection


Page 68: Object Centric Reflection

?

Page 69: Object Centric Reflection

Prisma


Page 70: Object Centric Reflection


Page 71: Object Centric Reflection


Page 72: Object Centric Reflection


Page 73: Object Centric Reflection


Page 74: Object Centric Reflection


Page 75: Object Centric Reflection

Back-in-Time Debugger

Page 76: Object Centric Reflection

Back-in-Time Debugger

Object Flow Debugger

Lienhard et al., ECOOP 2008

Page 77: Object Centric Reflection

person := Person new.    "t1"
...
name := 'Doe'.           "t2"
...
name := 'Smith'.         "t3"

(Object-flow diagram: the :Person instance records init@t1 and two field-writes of its name slot, 'Doe' at t2 and 'Smith' at t3; each field-write holds a value and a predecessor link, the first predecessor being null.)

Page 78: Object Centric Reflection

Profiling

Execution Reification

Debugging

Structure Evolution

Page 79: Object Centric Reflection

Talents

scg.unibe.ch/research/talents


Page 80: Object Centric Reflection

Talents

scg.unibe.ch/research/talents

IWST 2011, J. Ressia, T. Gîrba, O. Nierstrasz, F. Perin and L. Renggli

Page 81: Object Centric Reflection

Dynamically composable units of reuse

Page 82: Object Centric Reflection

Streams


Page 83: Object Centric Reflection

scg.unibe.ch/research/bifrost


Page 84: Object Centric Reflection

Object-Centric Debugger
Prisma
Subjectopia
MetaSpy
Talents
Chameleon

Page 85: Object Centric Reflection

scg.unibe.ch/jenkins/


Page 86: Object Centric Reflection

Alexandre Bergel
Marcus Denker
Stéphane Ducasse
Oscar Nierstrasz
Lukas Renggli
Tudor Gîrba
Fabrizio Perin

Page 87: Object Centric Reflection

scg.unibe.ch/research/bifrost
