Agile Adoption Dashboard Recommendation

Posted 03-Jun-2018

  • 8/12/2019 Agile Adoption Dashboard Recommendation

    1/24

    Greg Reiser, ThoughtWorks

    Updated November 18, 2011

    ISG SW Engineering – Lean Enterprise / Agile Transformation

    Proposed Program Dashboard, 22 November 2011


    Guiding Principles for Metrics

    "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software."

    "Working software is the primary measure of progress."

    http://agilemanifesto.org/principles.html

    "Measure outcomes, not activity." (Source: Dave Nicolette and various authors)

    Agile Balanced Metrics (Forrester)

    Operational Excellence
    - Project Management
    - Productivity
    - Organizational Effectiveness
    - Quality

    User Orientation
    - User Satisfaction
    - Responsiveness to needs
    - Service Level Performance
    - IT Partnership

    Business Value
    - Business value of projects
    - Alignment with strategy
    - Synergies across business units

    Future Orientation
    - Development capability improvement
    - Use of emerging processes and methodologies
    - Skills for future needs


    Agile Balanced Metrics (NCR)

    Operational Excellence
    - New-Functionality Ratio
    - Internal Code Quality
    - Build Hygiene
    - Percent Accurate and Complete
    - Defect Resolution Time

    User Orientation
    - Customer Satisfaction
      - External Customers
      - Internal Customers

    Business Value
    - Feature Lead Time
    - Defects
    - Cost per Feature Point

    Future Orientation
    - Team's Agile Maturity
    - People
      - Number of Agile Practitioners
      - Number of Agile Leads


    Future Orientation – Agile Maturity

    What it is: Measure of team agility (ability to respond to customer demand and change) based on the ThoughtWorks Agile Maturity Model.

    Measurement: Qualitative assessment along 10 dimensions of software development

    Purpose: Assess how teams are progressing towards a targeted future state

    Caveat: The AMM is not a compliance tool. Its intent is to define the current state of a software development team with respect to agile principles and practices, develop a plan for change, and track progress against that plan.


    Future Orientation – Agile Practitioners and Leaders

    What it is: The number (and rate of increase) of qualified Agile Practitioners and Leaders in ISG Software Engineering

    Measurement: Use a variation of Net Promoter Score to assess individuals
    - On a scale of 0 to 10, this person does not require coaching support in order to be a positive contributor on an agile team.
    - On a scale of 0 to 10, this person is effective as an agile coach for one or more roles.
    - Initial assessments will be performed by ThoughtWorks coaches. As NCR staff achieve Practitioner and Leader status, they will assume this responsibility.

    Purpose: Monitor the rate at which NCR staff are developing agile expertise. Ensure that NCR is on track to developing the skills required to support broad agile adoption.

    [Progression diagram: Student → Practitioner (via Training, Project Experience, Coaching, Mentoring) → Leader (via Focused Coaching, Ongoing Support)]


    Operational Excellence – New Functionality Ratio

    What it is: The ratio of effort (cost) spent on new feature functionality vs. support

    Measurement: Hours-New-Functionality : Hours-Support

    Purpose: Well-run agile teams generate higher-quality, fit-for-purpose functionality and minimize the amount of effort spent on marginal-value features. This translates into lower failure, customer service, and other support costs. This in turn translates into increased capacity to innovate and satisfy customer demand.

    Recommendation:
    - When developing new functionality, the cost of defects discovered at the end of the development lifecycle should be counted as Support (Appraisal or Internal Failure costs).
    - When developing new functionality, the cost of defects discovered earlier in the development cycle should be counted as New Functionality costs (Prevention costs).
    - Defects reported by consumers of shared components (Product and Solution Teams) should be treated as Support (External Failure costs) by Component Teams.
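As a sketch only (class and parameter names are illustrative, not from the deck), the ratio could be computed from logged hours, folding late-lifecycle defect fixes into the support side per the recommendation above:

```java
// Illustrative sketch: New Functionality Ratio from logged hours.
public class NewFunctionalityRatio {
    /**
     * Ratio of hours spent on new functionality to hours spent on support.
     * Hours spent fixing defects discovered late in the lifecycle are
     * counted as support (appraisal / internal failure costs).
     */
    public static double ratio(double newFeatureHours,
                               double supportHours,
                               double lateDefectFixHours) {
        double support = supportHours + lateDefectFixHours;
        if (support == 0) {
            throw new IllegalArgumentException("support hours must be > 0");
        }
        return newFeatureHours / support;
    }
}
```

For example, 800 new-functionality hours against 150 support hours plus 50 hours of late defect fixes yields a ratio of 4:1.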


    Operational Excellence – Internal Code Quality

    What it is: Metrics that describe the design quality of the software in a product

    Measurements: The four Cs: Coverage, Complexity, Cohesion, Coupling (see Appendix for details)

    Purpose: Compares a codebase to generally accepted guidelines for good design. Identifies opportunities for making software more malleable. Increasing adherence to such guidelines, plus decreasing Defect Resolution Time and shorter Feature Lead Time, are all indicators of reducing technical debt.

    Tools such as Sonar can collect and report on a broad range of metrics; it's better to focus on a small subset and adapt with caution as you learn how to use metrics to drive desired behavior.


    Operational Excellence – Internal Code Quality: Dashboard Example


    Operational Excellence – Build Hygiene

    What it is: Number of builds and percentage of successful builds in a given timeframe. This can be easily monitored through a Sonar plugin.

    Measurement:
    - Number of builds (in a given timeframe)
    - Build Success % (in a given timeframe)

    Purpose: Drive a desired development behavior

    Example:
    - Number of builds = 15 builds in the last 30 days
    - Build Success = 86.7% in the last 30 days
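The success percentage is a plain ratio; the slide's example figure of 86.7% corresponds to 13 successful builds out of 15. A minimal sketch (names are illustrative):

```java
// Illustrative sketch: Build Hygiene success rate for a timeframe.
public class BuildHygiene {
    /** Build success rate as a percentage, e.g. 13 of 15 builds -> ~86.7%. */
    public static double successPercent(int successfulBuilds, int totalBuilds) {
        if (totalBuilds == 0) {
            return 0.0; // no builds in the timeframe
        }
        return 100.0 * successfulBuilds / totalBuilds;
    }
}
```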


    Operational Excellence – Percent Accurate and Complete

    What it is: Percent of stories that do not revert to an earlier state

    Measurement: (Ideal State Transitions / Total State Transitions) x 100

    Purpose: Indicator of rework (waste). Measuring at the individual state-transition level identifies opportunities for continuous improvement.

    Example:
    - Story lifecycle: Not Started → Analysis → Development → Testing → Acceptance Testing → Deployed
    - Size of backlog = 100 stories; hence, the ideal number of forward state transitions for the entire project is 500 (5 x 100)
    - Actual experience:
      - 20 instances of stories reverting from UAT to Testing
      - 50 instances of stories reverting from Testing to Development (may include many of the above 20)
    - PAC (Dev-to-Test) = (100/150) x 100 = 67%
    - PAC (Test-to-UAT) = (100/120) x 100 = 83%
    - PAC (Project) = (500/570) x 100 = 88%
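The worked numbers above follow from one formula: each revert forces an extra forward transition, so the actual count is the ideal count plus the reverts. A sketch (names are illustrative) that reproduces the slide's figures:

```java
// Illustrative sketch: Percent Accurate and Complete (PAC).
// Each reverted story must repeat a forward transition, so
// actual transitions = ideal transitions + reverts.
public class PercentAccurateAndComplete {
    /** PAC = (ideal forward transitions / actual forward transitions) x 100. */
    public static double pac(int idealTransitions, int reverts) {
        return 100.0 * idealTransitions / (idealTransitions + reverts);
    }
}
```

With the example's inputs: pac(100, 50) rounds to 67% for Dev-to-Test, pac(100, 20) to 83% for Test-to-UAT, and pac(500, 70) to 88% for the project overall.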


    Operational Excellence – Defect Resolution Time

    What it is: Average time between defect identification and resolution

    Measurement: Defect Closed Timestamp − Defect Open Timestamp

    Purpose: Demonstrates the malleability of the codebase and the extent to which the team has adopted a zero-defect culture

    Malleability of the codebase: Disciplined agile teams strive to limit technical debt as much as possible. Technical debt is assessed at two levels: defects, and code metrics that indirectly describe how easy it is to maintain and enhance the software (malleability). Rapid resolution of defects is one measure of the business benefit of low technical debt.
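Averaging the timestamp differences can be sketched as follows (the `Defect` record and method names are illustrative, not from the deck):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;

// Illustrative sketch: average Defect Resolution Time from
// open/close timestamps of closed defects.
public class DefectResolutionTime {
    public record Defect(Instant opened, Instant closed) {}

    /** Average resolution time in hours across a list of closed defects. */
    public static double averageHours(List<Defect> defects) {
        return defects.stream()
                .mapToLong(d -> Duration.between(d.opened(), d.closed()).toMinutes())
                .average()
                .orElse(0.0) / 60.0;
    }
}
```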


    Business Value – Feature Lead Time

    What it is: The average time to deploy a feature from the time that the development team begins work on it

    Measurement: Date Feature Deployed − Date Analysis begins on first relevant story
    - Deployed = feature is shippable
    - Shippable (Component Teams) = binary is fully tested and available for consumption by Product Teams
    - Shippable (Product Teams) = feature is part of a fully-tested release. Customer deployment is strictly a business decision.

    Purpose: Measures the responsiveness of the team once a feature has been identified as the next highest priority.

    Why not start measurement when the feature is first identified?
    - It is much more important to be responsive with respect to higher-priority features. If the metric starts at feature-identification time, teams will be tempted to work on the easiest features regardless of priority.
    - If consumers are not receiving high-quality components fast enough to meet their business commitments, they don't need a metric to tell them that. If the root cause is determined to be high-priority features queuing up to get started, this indicates an obvious capacity issue rather than a process issue.


    Business Value – Defects

    What it is: Number of defects reported after a story has been flagged as "Done" by the testers that are embedded in the development team

    Measurement:
    - Consider the following story lifecycle: New → In Analysis → Ready for Dev → In Dev → Ready for Test → In Test → Done
    - Track defects reported by any downstream activities (e.g., component integration test, controlled deployment, professional services, external customer, etc.)
    - Raw number of defects reported per severity level and time (activity) of detection
    - Report in terms of density (per KLOC) and technology stack when comparing across projects

    Purpose: One indicator of the quality of software produced

    Why not record defects identified earlier in the story lifecycle?
    Testing within a sprint is a defect-prevention cost (up-front acceptance criteria and TDD are other examples of defect prevention). We want to encourage defect prevention by focusing on the expected reduction in defect appraisal, internal-failure, and external-failure costs. Hence the focus on defects that are indicators of those other quality costs.


    Business Value – Cost per Feature Point

    What it is: Cost per deployed unit of business value, where units are "feature points" as defined by the Product Owner

    Measurement: Development-Cost / Feature-Points-Deployed
    - Development Cost = development costs incurred within a specific timeframe. If comprehensive development costs are difficult to obtain, hours of effort may serve as a reasonable proxy.
    - Feature Point = relative units of business value for a feature as determined by the Product Owner (Solution Manager)
    - Feature-Points-Deployed = sum of feature points for those features deployed (shippable) during the target timeframe

    Purpose: The direction of change indicates if teams and the organization are becoming more or less productive

    Recommendation: Trend is more important than raw value. If used to compare teams, limit comparison to teams that serve the same line of business. Since feature points are subjective values assigned by Product Owners, comparisons are only valid where there is consistency amongst people that work together.
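The measurement is a simple quotient; a sketch (illustrative names, cost given in dollars or hours per the proxy note above):

```java
// Illustrative sketch: Cost per Feature Point for a reporting period.
public class CostPerFeaturePoint {
    /** developmentCost may be dollars or, as a proxy, hours of effort. */
    public static double costPerPoint(double developmentCost,
                                      int featurePointsDeployed) {
        if (featurePointsDeployed == 0) {
            throw new IllegalArgumentException("no feature points deployed in period");
        }
        return developmentCost / featurePointsDeployed;
    }
}
```

Tracked period over period, a falling value suggests rising productivity; per the recommendation, compare the trend, not the raw number.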


    User Orientation – Customer Satisfaction

    What it is: Net Promoter Score (NPS) for NCR, ISG Software Engineering, and individual project teams

    Measurement: Use the Net Promoter Score methodology (http://en.wikipedia.org/wiki/Net_Promoter) with the following customer groups:
    - External customers (e.g., Kohl's, Toys "R" Us, Tesco, etc.)
    - Professional Services and ISG Solution Management, that is, the customers of ISG Software Engineering
    - Consumers of Component Team products; for example, the Vision, Travel Air and SSCO teams are consumers of components developed by the P24 team

    Purpose: Simple way to collect and monitor how well ISG Software Engineering is responding to the needs of its customers at multiple levels. Reinforces a customer-centric culture even for teams that are several levels detached from NCR's external customers.
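Under the standard NPS methodology referenced above, respondents scoring 9-10 are promoters and 0-6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch (class name is illustrative):

```java
import java.util.List;

// Illustrative sketch: Net Promoter Score from 0-10 survey responses.
// Promoters score 9-10, detractors 0-6; passives (7-8) only affect the total.
public class NetPromoterScore {
    /** Returns NPS in the range -100 to +100. */
    public static double nps(List<Integer> scores) {
        long promoters = scores.stream().filter(s -> s >= 9).count();
        long detractors = scores.stream().filter(s -> s <= 6).count();
        return 100.0 * (promoters - detractors) / scores.size();
    }
}
```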


    Appendix


    Operational Excellence – Internal Code Quality: Coverage

    What it is: A measurement of the extent to which lines and branches are executed as part of some test.

    Measurements:
    - Line coverage: lines of code reached by a unit test / executable lines of code
    - Branch coverage: (branches that evaluate to true at least once + branches that evaluate to false at least once) / (2 x total number of branches)

    Purpose: Indicates how much code isn't executed by tests
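The two formulas above, expressed directly as a sketch (names are illustrative):

```java
// Illustrative sketch: the line- and branch-coverage formulas from this slide.
public class Coverage {
    /** Line coverage %: lines reached by tests / executable lines. */
    public static double lineCoverage(int linesReached, int executableLines) {
        return 100.0 * linesReached / executableLines;
    }

    /** Branch coverage %: each branch should evaluate both true and false. */
    public static double branchCoverage(int branchesTrueAtLeastOnce,
                                        int branchesFalseAtLeastOnce,
                                        int totalBranches) {
        return 100.0 * (branchesTrueAtLeastOnce + branchesFalseAtLeastOnce)
                / (2 * totalBranches);
    }
}
```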


    Operational Excellence – Internal Code Quality: Complexity

    What it is: McCabe Metric or Cyclomatic Complexity Number (CCN); the number of independent flow paths through a method/function.

    Measurement: (no. of branches in method: if, for, &&, ||, etc.) + 1

    Purpose: A quantitative indicator of the complexity of code

    Example: a method with a CCN of 3
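The original slide's code sample is not reproduced in the transcript; as a stand-in, here is a method whose CCN is 3 under the formula above (the `if` and the `&&` are the two branch points, plus one):

```java
// Illustrative example: a method with cyclomatic complexity 3.
public class ComplexityExample {
    public static String classify(int age, boolean hasConsent) {
        if (age >= 18 && hasConsent) {  // `if` = +1 branch, `&&` = +1 branch
            return "eligible";
        }
        return "not eligible";          // +1 base path, so CCN = 2 + 1 = 3
    }
}
```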


    Operational Excellence – Internal Code Quality: Cohesion

    What it is: A measure of a class's adherence to the single responsibility principle. High cohesion is a good thing.

    Measurement: Number of connected methods and fields

    Purpose: Measures the number of connected components in a class

    Example

    // The crypto-related members form one group of methods and fields...
    private final String privKey = readSecurelyFromSomewhere();
    private String encode(String plainText, String pubKey) { }
    private String decode(String encrypted, String pubKey) { }

    // ...while the session members form a second, unconnected group:
    // two connected components, i.e. low cohesion (two responsibilities).
    public void login(String uname, String pwd) { }
    public void logout(String uname) { }


    Operational Excellence – Internal Code Quality: Coupling

    What it is: A measure of dependency both of and on a class. Low coupling is a good thing.

    Measurement:
    - Afferent: number of other classes that use this class
    - Efferent: number of other classes used by this class

    Purpose: Measures the number of adjacent classes in the dependency tree

    Example: [Diagram: a class exposing log(), toString(), hashCode(), equals(), and T clone(T me); afferent couplings arrive from many other classes, while efferent couplings exit to UtilityLogger, StringBuilder, and EqualsBuilder]

    Mnemonic:
    - Afferent: Arrive at class
    - Efferent: Exit class


    Remember the Future Exercise

    [Bar chart (counts 0-30) of themes raised in the exercise: Responsive; Innovative; Efficient; Transparent; Partnership; Quality; Time to market; Feedback; Predictability; Higher value / lower cost; Reduce waste (marginal-value features); Clear backlog, short/long term; Competency; Less rework, shorter cycle time; Dollar value of software developed; Clarity of customer requirements; Understanding of goals; Reusing software across enterprise; Sustainable rhythms; Innovative/disruptive technology; New product ideas introduced into products]


    Metrics Template

    What it is:

    Measurement:

    Purpose:

    Example:


    Scorecard Graphic

    [Quadrant graphic: Operational Excellence | User Orientation | Business Value | Future Orientation]