Agile Metrics
Dave Nicolette
An approach to software development based on the values and principles expressed in the Agile Manifesto.
http://www.agilemanifesto.org
Definition: Agile Software Development
Copyright © 2007-2009 Dave Nicolette
A metric is a standard for measuring or evaluating something.
A measure is a quantity, a proportion, or a qualitative comparison of some kind.
Quantity: "There are 25 open defect reports on the application as of today."
Proportion: "This week there are 10 percent fewer open defect reports than last week."
Qualitative comparison: "The new version of the software is easier to use than the old version."
Definition: Metrics
• Informational – tells us what's going on
• Diagnostic – identifies areas for improvement
• Motivational – influences behavior
One metric may function in multiple categories.
Example: Delivering high value to customers (informational) can increase team morale (motivational).
Beware of unintended side-effects.
Example: Rewarding people for fixing bugs may result in an increase in bugs, as people create opportunities to earn the rewards.
Three Kinds of Metrics
Metrics as Indicators
Leading Indicator – suggests future trends or events
Lagging Indicator – provides information about outcomes
Einstein’s Wisdom
Agile Rule of Thumb About Metrics
Measure outcomes, not activity.
A Minimalist Philosophy of Metrics
Measure everything necessary and nothing more.
All the information they need to make decisions, and no more.
Information at the level of detail they can use.
Information at the scope they care about (team, project, program, line of business, enterprise, industry)
Information pertaining to the time frame they care about (day, iteration, release, project, strategic milestone)
For each stakeholder...
Stakeholders
Team member
Product Owner
ScrumMaster
Project Manager
User
Executive
Auditor
Process Improvement
Researcher
Production Support
General style of the agile process
Type of work being done
How the work is decomposed and planned
The team's self-organization choices
Characteristics of the organization
The team's continuous improvement goals
Factors Influencing Your Choice of Metrics
Iterative
Based on time-boxed iterations of fixed duration
Continuous Flow
Based on principles derived from Lean Manufacturing
Styles of Agile Software Development
New Dev or Major Enhancement
Project with a defined “end” to build a planned scope of work
Ongoing Maintenance & Support
Variable rate of incoming work requests, no defined “end” or planned scope
Broad Categories of Business SW Development
Variation 1
Short iterations (e.g., 1 week)
Low-overhead process
High maturity in agile thinking

• Unit of work for execution is the User Story
• User Stories are not decomposed into Tasks
• User Stories are small and of consistent size
• No sizing or estimation of fine-grained work items
• Team commits to completing selected User Stories
• No daily burn tracking
How Agile Teams Plan Their Work (Short Term)
Variation 2
Short to medium length iterations (e.g., 1-2 weeks)
Reasonably low-overhead process
Moderately high maturity in agile thinking

• Unit of work for execution is the User Story
• User Stories are not decomposed into Tasks
• User Stories are small and of consistent size
• Team sizes User Stories relative to each other
• Team commits to completing a given number of Story Points
• No daily burn tracking
How Agile Teams Plan Their Work (Short Term)
Variation 3
Short to medium length iterations (e.g., 1-2 weeks)
Moderately low-overhead process
Average maturity in agile thinking

• Unit of work for execution is the Task
• User Stories are decomposed into Tasks
• Variations in story size may affect planning
• Team sizes User Stories relative to each other
• Team estimates Tasks in terms of ideal time
• Team agrees to try to complete a given number of ideal hours or days of work
• Daily burn tracking and re-estimation of Tasks
How Agile Teams Plan Their Work (Short Term)
Variation 4
Iteration length up to 6 weeks
Iterative process with some elements of agile work
Low maturity in agile thinking

• Unit of work for execution is the Task
• User Stories are decomposed into Tasks
• Story points (if used) are pegged to ideal time
• Team estimates Tasks in terms of ideal time
• Team uses a "load factor" to guess at the amount of non-ideal time
• Team agrees to work a given number of hours or days
• Daily burn tracking and re-estimation of Tasks
How Agile Teams Plan Their Work (Short Term)
The Team's Self-Organization Choices
• Generalizing specialists – peer model
• Chief Programmer / technical lead model
• Specialists with internal hand-offs
Characteristics of the Organization
Fully supportive lean or “agile” organization
Organization embraces “agile,” some areas operate in a traditional way
“Agile” is an experiment or skunkworks operation, low level of organizational buy-in
Organizational Differences: Traditional vs. Agile/Lean

Culture
• Traditional: risk aversion, blame-shifting, competition, zero-sum thinking, fear of failure
• Agile/Lean: risk management, trust, transparency, collaboration, failure as learning opportunity

Structure
• Traditional: administrative separation between application developers and their customers
• Agile/Lean: application developers work for the lines of business they serve; central IT is for central functions

Management philosophy
• Traditional: command-and-control, Theory X, crack the whip
• Agile/Lean: self-organizing teams, Theory Y, enable and support people

Teams
• Traditional: temporary assignment, multiple assignment, functional silos
• Agile/Lean: stable teams, dedicated teams, cross-functional teams

Financial management
• Traditional: cost-center mentality; Cost Accounting
• Agile/Lean: profit-center mentality; Throughput Accounting
Team's Self-Improvement Goals
Choose metrics that track the team's progress toward self-improvement goals
Discontinue these metrics when the goals have been achieved
"Our highest priority is to satisfy the customer through early and continuous delivery of valuable software."
and
"Working software is the primary measure of progress."
Two Agile Principles That Guide the Choice of Metrics
Running Tested Features
Graphic from Ron Jeffries
Running Tested Features

[Chart: Running Tested Features (RTF). X-axis: Iteration 1-14; y-axis: Features 0-12; series: RTF.]
Running Tested Features
Process style: Time-boxed iterations, Continuous flow
Nature of the work: Ongoing support, Delivery of defined scope
Stakeholders
Frequency: As each feature is delivered
Running Tested Features
Principle: Working software is the primary measure of progress.
Informational: Direct measure of delivered results.
Diagnostic: If flat or declining over time, a problem is indicated.
Motivational: Team members naturally want to see RTF increase.
Forms of "Business" Value
• Revenue
• Cost savings
• Market share
• Customer relations
• Reputation

Tracking Hard Financial Value
Profit = Income - Costs
Incremental delivery to production
Baseline: Calculate profitability of the system/process being replaced or enhanced
Per release: Calculate the change in profitability
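The baseline-and-delta calculation above takes only a few lines. This is a minimal sketch; all dollar figures are hypothetical, used only to illustrate the arithmetic:

```python
def profitability(income, costs):
    """Profit = Income - Costs, per the definition above."""
    return income - costs

# Hypothetical figures for the system being replaced (baseline) and for
# the first incremental release.
baseline  = profitability(150_000, 120_000)   # 30,000
release_1 = profitability(160_000, 118_000)   # 42,000

# Hard value delivered by Release 1 = change in profitability vs. baseline.
change = release_1 - baseline
print(change)  # 12000
```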
Tracking Hard Financial Value
[Chart: Hard Value Delivered per Release. X-axis: Baseline, Release 1, Release 2, Release 3, Release 4; y-axis: 0-180,000; series: Total Revenue, Total Costs, Profitability, Profitability Change.]
Hard Financial Value
Process style: Time-boxed iterations, Continuous flow
Nature of the work: Ongoing support, Delivery of defined scope
Stakeholders
Frequency: As process performance is observed in production operation
Hard Financial Value
Principle: Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
Informational: Direct measure of financial value delivered.
Diagnostic: Downward trend or projection can be used to inform business decisions about continuing or modifying the project.
Motivational: Team members like to deliver value because it makes them feel they are contributing to the success of the organization. Stakeholders are motivated to pay attention to the business value of incremental releases.
Tracking Projected Value
When incremental delivery is not to production...
Earned Business Value
from Dan Rawsthorne, Calculating Earned Business Value for an Agile Project, 2006
Earned Business Value
BV(bucket) = BV(parent) × wt(bucket) / (wt(bucket) + Σsiblings wt(sibling))
This results in a percentage value for each item delivered to the customer representing the relative business value of the item as defined by the customer.
Earned Business Value Example
Applying the formula to “Update Cust Info” in the feature decomposition, we have
1 × (3/4) × (1/1) × (10/40) × (10/20) ≈ 9.4%
This means that when the team delivers the item named “Update Cust Info,” they will have delivered 9.4% of the business value of the project, according to the customer’s own definition of the relative value of each item.
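The chained-fraction arithmetic can be checked programmatically. The (weight, sibling-weights) pairs below are hypothetical, chosen only to reproduce the factors in the worked example:

```python
def bucket_fraction(weight, sibling_weights):
    """wt(bucket) / (wt(bucket) + sum of sibling weights)."""
    return weight / (weight + sum(sibling_weights))

# Hypothetical path from the project root down to "Update Cust Info",
# reproducing the factors (3/4) x (1/1) x (10/40) x (10/20).
path = [(3, [1]), (1, []), (10, [30]), (10, [10])]

ebv = 1.0  # BV(root) = 100% of the project's business value
for weight, siblings in path:
    ebv *= bucket_fraction(weight, siblings)

print(round(ebv * 100, 1))  # 9.4 (percent)
```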
Earned Business Value: When To Use It
Yes: The scope of the project is well-known up front and it is possible to develop a fairly comprehensive decomposition of features before development begins.
No: There is a high level of uncertainty about scope, and the expectation is that the scope will emerge as the team makes progress and stakeholders learn more about the problem and the solution.
EBV breaks down in the latter case because as new scope is added, the percentage of business value already delivered decreases. This makes it appear as if the project is taking business value away from the customer.
Earned Business Value by Points
[Diagram: early in the project, the backlog is decomposed into Feature Group A (600 points), Feature Group B (300 points), and Feature Group C (100 points).]

Earned Business Value by Points

[Diagram: part-way through, Feature Group A (600 points) has been decomposed into Feature A-1 (400 points) and Feature A-2 (200 points), and Feature A-1 into Story 1 (50 points), Story 2 (35 points), etc.]
Earned Business Value Charts
See the Agile Metrics spreadsheet, “EBV Charts” sheet, for examples
Based on percentages
Based on points
Earned Business Value
Process style: Time-boxed iterations, Continuous flow
Nature of the work: Ongoing support, Delivery of defined scope
Stakeholders
Frequency: As each feature is delivered
Earned Business Value
Principle: Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
Informational: Direct measure of customer-defined value delivered.
Diagnostic: Trend should be an S curve; otherwise, problems in prioritization or valuation are indicated.
Motivational: Team members like to deliver value because it makes them feel they are contributing to the success of the organization. Stakeholders are motivated to pay attention to the business value of incremental releases.
Velocity
Velocity is... an empirical observation of the team's capacity to complete work per iteration.
...and not...
• an estimate
• a target to aim for
Velocity
Velocity is... based on the team's own sizing of work items.
...and not...
• based on estimated or actual time
• dictated or imposed by anyone other than team members
Velocity
Velocity is... comparable across iterations for a given team on a given project.
...and not...
• comparable across teams
• comparable across projects
Unit of Measure for Velocity
How the team plans → Unit of measure
• Commitment to stories → Story
• Relative sizing (points) → Points
• Estimation (ideal hours) → Ideal Hours
What Counts Toward Velocity?
Only completed work counts toward velocity
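A minimal sketch of the rule: only completed items contribute to the iteration's velocity, and a short rolling average (a common smoothing choice, not prescribed by the slides) tames iteration-to-iteration variation:

```python
def velocity(completed_sizes):
    """Velocity for one iteration: the sum of sizes of *completed* work
    items only; partially finished items contribute nothing."""
    return sum(completed_sizes)

def average_velocity(iteration_velocities, window=3):
    """Rolling average over the last few iterations, used to smooth
    normal variation before projecting."""
    recent = iteration_velocities[-window:]
    return sum(recent) / len(recent)

print(velocity([3, 5, 2]))                 # 10 points completed this iteration
print(average_velocity([10, 14, 12, 13]))  # 13.0
```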
Velocity
Velocity
Process style: Time-boxed iterations, Continuous flow
Nature of the work: Ongoing support, Delivery of defined scope
Stakeholders
Frequency: At the end of each iteration
Velocity
Principle: Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
Informational: Empirical observation of the team's capacity for work; useful for projecting the likely completion date of a given amount of scope; useful for estimating the amount of scope that can be delivered by a given date.
Diagnostic: Patterns in velocity trends indicate various problems; provides a baseline for continuous improvement efforts.
Motivational: Team members take pride in achieving a high velocity and keeping it stable.
Putting Velocity to Work: Burn Charts
Burn-down chart
How much work remains to be completed?
Burn-up chart
How much work has been completed?
Combined burn chart
How much work has been completed and how much work remains?
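The three questions above reduce to simple arithmetic on scope and velocity. A sketch, assuming a stable average velocity and hypothetical numbers:

```python
import math

def burndown(total_scope, completed_per_iteration):
    """Remaining work after each iteration: the data behind a burn-down chart."""
    remaining = [total_scope]
    for done in completed_per_iteration:
        remaining.append(remaining[-1] - done)
    return remaining

def iterations_remaining(remaining_points, avg_velocity):
    """Projected iterations left, assuming velocity holds near its average."""
    return math.ceil(remaining_points / avg_velocity)

print(burndown(100, [12, 10, 15]))     # [100, 88, 78, 63]
print(iterations_remaining(63, 13.0))  # 5
```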
Burndown Chart – Line Style
Burndown Chart – Bar Style
Burnup and Burndown Chart
[Chart: Burn-Up and Burn-Down. X-axis: Iteration 0 through Iteration 15; y-axis: story points, 0-800; series: Remaining Story Points, Completed Story Points (cumulative) with a linear trend line, and Total Story Points (Scope) with a linear trend line.]
Burn Chart
Process style: Time-boxed iterations, Continuous flow
Nature of the work: Ongoing support, Delivery of defined scope
Stakeholders
Frequency: When time-boxed iterations are used, at the end of each iteration; when continuous flow is used, at fixed time intervals (e.g., monthly)
Burn Charts
Principle: Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
Informational: Direct measure of work remaining; projected completion dates; impact of scope changes on schedule.
Diagnostic: Indicates whether scope changes or team performance is the cause of schedule variance.
Motivational: Team members are motivated by seeing clearly when they are likely to finish the project and by seeing the amount of work remaining steadily reduced.
Little's Law

LT = WIP (units) / ACR (units per time period)
WIP = LT × ACR
ACR = WIP / LT

• Lead Time (LT) is the time required to deliver a given amount of work.
• WIP is work in process – items started but not completed.
• ACR is average completion rate.
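Little's Law translates directly into code. A sketch of the three equivalent forms, with hypothetical numbers:

```python
def lead_time(wip, acr):
    """LT = WIP / ACR: time for a newly started item to emerge."""
    return wip / acr

def work_in_process(lt, acr):
    """WIP = LT * ACR."""
    return lt * acr

def avg_completion_rate(wip, lt):
    """ACR = WIP / LT."""
    return wip / lt

# Example: 12 items in process, 4 items completed per week
# -> expected lead time of 3 weeks.
print(lead_time(12, 4))  # 3.0
```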
Cumulative Flow Diagram

[Chart: Cumulative Flow Diagram. X-axis: weekly dates from 2/21/2008 through 5/29/2008; y-axis: Total Features, 0-140; stacked bands for Requirements, Architecture / Design, Construction, Testing, and Deployed, plus a Total Scope (Features) line.]

Cumulative Flow Diagram

[Chart: the same Cumulative Flow Diagram, annotated to show where lead time (horizontal distance across a band) and WIP inventory (vertical depth of a band) can be read directly from the chart.]
Cumulative Flow Diagram
Process style: Time-boxed iterations*, Continuous flow
Nature of the work: Ongoing support, Delivery of defined scope
Stakeholders
Frequency: When time-boxed iterations are used, at the end of each iteration; when continuous flow is used, at fixed time intervals (e.g., monthly)
* if release cadence is decoupled from development cadence
Cumulative Flow Diagram
Principle: Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
Metric: Cumulative Flow
Informational: Visualization of flow. Empirical observation of lead time and WIP queue depth.
Diagnostic: Exposes capacity constraints and not-immediately-available constraints.
Motivational: Team members take pride in seeing the workflow in a visual form.
“Continuous attention to technical excellence and good design enhances agility.”
“Simplicity – the art of maximizing the amount of work not done – is essential.”
“At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.”
More Agile Principles That Guide the Choice of Metrics
Static Code Analysis Example
Static Code Analysis Example
• cyclomatic complexity
• not covered by tests
• warns of large methods
Automated Inference of TDD Practices
Earned Value Management (EVM) Myth
EVM doesn’t apply to agile projects because it requires a detailed WBS at the outset.
Earned Value Formula

EV = Σ PV(Completed), summed from Start through Current
Adapting Earned Value Management (EVM) to Agile Projects
Predictive planning (traditional)
• Detailed work breakdown structure at the outset
• Method of quantifying "done" for each item in the WBS
• Definition of the value of each item in the WBS
• Track planned (BCWS) and actual costs (ACWP)
• EV is the budgeted cost of work performed (BCWP)

Adaptive planning (agile)
• Scope defined at a high level at the outset (features)
• Definition of "done" for each feature in scope
• Definition of the value of each feature in scope
• Track planned (budget) and actual costs (spend)
• EV is the budgeted cost of features delivered
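The adaptive-planning variant can be sketched directly: EV is the budgeted cost of delivered features, and partially finished features earn nothing. Feature names, budgets, and done flags below are hypothetical:

```python
# Each feature carries a budgeted cost and a binary "done" state
# (per the team's definition of done). All values are hypothetical.
features = [
    {"name": "login",   "budgeted_cost": 20_000, "done": True},
    {"name": "search",  "budgeted_cost": 35_000, "done": True},
    {"name": "reports", "budgeted_cost": 45_000, "done": False},
]

# EV = budgeted cost of features delivered; partial work earns nothing.
earned_value = sum(f["budgeted_cost"] for f in features if f["done"])
print(earned_value)  # 55000
```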
Budgeted Cost of Work Scheduled (BCWS) on Agile Projects
Iterative process (or non-iterative process with equal-length releases)

Sum of one-time costs / number of iterations (or releases) = one-time cost allocation per iteration
Total on-going costs per iteration × number of iterations = total on-going costs

BCWS = sum of one-time costs + total on-going costs
Cost per iteration = one-time cost allocation per iteration + on-going costs per iteration

Non-iterative process with variable release schedule

Sum of one-time costs / a chosen time interval (e.g., week) = one-time cost allocation per time interval
Total on-going costs per time interval × number of time intervals = total on-going costs

BCWS = sum of one-time costs + total on-going costs
Cost per time interval = one-time cost allocation per time interval + on-going costs per time interval
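The iterative-process formulas above, as a sketch with hypothetical cost figures:

```python
def bcws(one_time_costs, ongoing_per_iteration, iterations):
    """BCWS = sum of one-time costs + total on-going costs."""
    return sum(one_time_costs) + ongoing_per_iteration * iterations

def cost_per_iteration(one_time_costs, ongoing_per_iteration, iterations):
    """One-time cost allocation per iteration + on-going costs per iteration."""
    return sum(one_time_costs) / iterations + ongoing_per_iteration

# Hypothetical: 40,000 in one-time costs, 8,000 ongoing per iteration, 10 iterations.
print(bcws([30_000, 10_000], 8_000, 10))                # 120000
print(cost_per_iteration([30_000, 10_000], 8_000, 10))  # 12000.0
```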
EV Examples
See the Agile Metrics spreadsheet, “EV Iterative” and “EV Non-iterative” sheets for examples.
Iterative process
Non-iterative process
When EVM is Applicable
Yes: Level of effort per task is well understood.
Example: Corporate intranet CRUD webapp based on existing standards.

No: Project involves a high degree of uncertainty and will involve prototyping, spiking, research, and/or experimentation.
Example: The company's first business application using an unfamiliar programming language.

No: Work items are added to the work queue in an unpredictable fashion.
Example: Production support group that addresses bug reports as they are received.
Throughput Accounting
Throughput (T): The rate at which a system produces goal units (money).
S = net sales
TVC = totally variable cost
T = S – TVC

Investment (I): The money tied up in the system.
Operating Expense (OE): The cost of generating goal units.
Throughput Accounting

Net Profit (NP) is throughput less operating expense.
NP = T – OE
Return on Investment (ROI) is net profit / investment.
ROI = NP / I
TA Productivity (TAP) is throughput / operating expense.
TAP = T / OE
Investment Turns (IT) are throughput / investment.
IT = T / I
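The four ratios translate directly into code. The figures below are hypothetical, with the project budget standing in for the "sales price" as the deck suggests for internal IT projects:

```python
def throughput(sales, totally_variable_cost):
    return sales - totally_variable_cost   # T = S - TVC

def net_profit(t, operating_expense):
    return t - operating_expense           # NP = T - OE

def roi(np_, investment):
    return np_ / investment                # ROI = NP / I

def ta_productivity(t, operating_expense):
    return t / operating_expense           # TAP = T / OE

def investment_turns(t, investment):
    return t / investment                  # IT = T / I

# Hypothetical internal-IT figures.
t = throughput(200_000, 20_000)     # 180,000
np = net_profit(t, 120_000)         # 60,000
print(roi(np, 40_000))              # 1.5
print(ta_productivity(t, 120_000))  # 1.5
print(investment_turns(t, 40_000))  # 4.5
```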
Throughput Accounting and Iterative Agile Methods
There are no “sales” and therefore no “sales price” in internal IT projects. Use the project budget as the sales price.
Investment is the total cost of preparing the Master Story List or Product Backlog – the list of all the features to be developed. May include:• All up-front analysis costs• All up-front requirements elaboration costs• All project planning, release planning, and iteration planning costs
Operating Expense includes all costs for the iteration except investment.
Throughput Accounting: Investment for a Release

I = Irelease + Σ(n = 0 .. iterations) In
Throughput Accounting: Operating Expense for a Release

OErelease = OEiteration × iterations
Throughput Accounting Example
See the Agile Metrics spreadsheet, “TA Iterative” sheet for an example.
NP (net profit) isn't really profit. It tells you whether you're doing better than your budget. I (Investment) is the cost of requirements, analysis, writing acceptance tests, and writing user stories. OE (operating expense) is the cost of building the solution. If you can drive I and OE down, then T (throughput) and NP (net profit) will go up.
Reliability of Promises
Reliable promise:
I deliver as promised, or
I tell you I can't deliver as soon as I know it.
Niko-Niko Calendar Symbols
Positive
Neutral
Negative
Niko-Niko Calendar Patterns
Niko-Niko Calendar Patterns (continued)
Niko-Niko Calendar Example
Story Cycle Time (Iterative)
The number of iterations it takes to complete a story.
Cycle Time (Lean)
The average time between delivery of completed work items.
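For the lean definition, cycle time can be computed from delivery timestamps. A sketch with hypothetical day numbers:

```python
def average_cycle_time(delivery_times):
    """Average gap between consecutive deliveries of completed items.
    delivery_times: timestamps (here, day numbers) in delivery order."""
    gaps = [b - a for a, b in zip(delivery_times, delivery_times[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical: items delivered on days 2, 5, 6, and 10 -> gaps of 3, 1, 4 days.
print(average_cycle_time([2, 5, 6, 10]))  # about 2.67 days
```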
Problematic Measures
Not relevant to agile methods:
Gantt chart
Percent complete
Time per team member per task
Actual time vs. estimated time
Using Trends to Spot Problems
[Chart: Velocity by iteration. X-axis: Iteration 1-12; y-axis: 0-45 points.]

[Chart: Cyclomatic Complexity, Unit Test Coverage, and Percentage of Tests Passing by iteration. X-axis: Iteration 1-12; y-axis: 0%-350%.]

[Chart: Stability – broken tests that used to pass, by iteration. X-axis: 1-12; y-axis: 0-16.]
Sample Scorecard: Value Delivery
Risks
Sample Scorecard: Delivery Effectiveness
Story Cycle Time: 2
Sample Scorecard
Software Quality
• Customer satisfaction
• Non-functional requirements
• Testing metrics: coverage, tests passing, least-tested components
• Static code analysis metrics: cyclomatic complexity, structural complexity, cyclic dependencies
• Observational/calculated: defect density
Sample Scorecard
Continuous Improvement
• Build frequency
• Escaped defects
• Use of TDD
• Big-bang refactorings
• Pairing time vs. solo time
• Overtime
• Issues from Retrospective
Agile Balanced Metrics (Forrester)
Operational Excellence
• Project Management
• Productivity
• Organizational Effectiveness
• Quality

User Orientation
• User Satisfaction
• Responsiveness to needs
• Service Level Performance
• IT Partnership

Business Value
• Business value of projects
• Alignment with strategy
• Synergies across business units

Future Orientation
• Development capability improvement
• Use of emerging processes and methodologies
• Skills for future needs
Agile Project Scorecard (Ross Pettit)
Sample Agile Dashboard (VersionOne)
Sample Agile Dashboard (Serena)
Thanks for your time!
Contact information:
Dave Nicolette
Blogs
http://www.davenicolette.net/agile
http://www.davenicolette.net/taosoft

Workshops
http://davenicolette.wikispaces.com