University of Southern California
Center for Systems and Software Engineering
Software Metrics and Measurements
Supannika Koolmanojwong, CS577
Outline
• General Concepts about Metrics
• Example of Metrics
• Agile Metrics
• Case Studies
– Metrics from Empirical Data – Northrop Grumman
– Metrics in IT services – Lockheed Martin
Measurements in daily life
Why do we measure?
http://www.teamqualitypro.com/software-metrics/do-you-know-your-abcs-of-software-metrics/
Objectives of software measurement
• “You cannot control what you cannot measure.” – Tom DeMarco
• “Not everything that counts can be counted. Not everything that is counted counts.” – Albert Einstein
Software Metrics
• Numerical data related to software development
• Strongly support software project management activities
• Can be directly observable quantities or derived from them
A simplified measurement information model
Ref: Ebert and Dumke 2007
[Diagram: information needs, objectives, and control drive measurements of process and work-product attributes; measurement results become information products that satisfy the information needs and feed decisions and actions.]
How are software measurements used?
• Understand and communicate
• Specify and achieve objectives
• Identify and resolve problems
• Decide and improve
Measurement Standard
• ISO/IEC 12207: Software Life Cycle Processes
• ISO/IEC 15288: System Life Cycle Processes
• SWEBOK: Software Engineering Body of Knowledge
• PMBOK: Project Management Body of Knowledge
• CMMI: Capability Maturity Model Integration
• ISO/IEC 15504: Software Process Capability Determination
• ISO 9001: Quality Management System
• ISO/IEC 9126: Software Product Quality (with TL 9000, AS 9100, etc., as objective adaptations)
These standards cover how to do and how to do better; ISO/IEC 15939:2002 (Software Measurement Process) covers how to measure what you are doing.
Ground Rules for Metrics
• Metrics must be:
– Understandable to be useful
– Economical
– Field tested
– Highly leveraged
– Timely
– Must give proper incentives for process improvement
– Evenly spaced throughout all phases of development
– Useful at multiple levels
http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
Measurements for Senior Management
• Easy and reliable visibility of business performance
• Forecasts and indicators where action is needed
• Drill-down into underlying information and commitments
• Flexible resource refocus
Measurements for Project Management
• Immediate project reviews
• Status and forecasts for quality, schedule, and budget
• Follow-up action points
• Reports based on consistent raw data
Metrics for Senior Management
Measurements for Engineers
• Immediate access to team planning and progress
• Get visibility into own performance and how it can be improved
• Indicators that show weak spots in deliverables
• Focus energy on software development
http://www.ibm.com/developerworks/rational/library/customized-reports-rational-team-concert/
The E4-Measurement Process
[Diagram: driven by objectives and needs from the business process, and by the environment and resources, the measurement process runs in four steps – 1. Establish, 2. Extract, 3. Evaluate, 4. Execute – and produces decisions, re-direction, and updated plans.]
Ref: Ebert and Dumke 2007
Aggregation of information
• Enterprise: cash flow, shareholder value, operations cost
• Division: cost reduction, sales, margins, customer service
• Product line / department: sales, cost reduction, innovative products, level of customization
• Projects: cycle time, quality, cost, productivity, customer satisfaction, resources, skills
SMART goals
• Specific – precise
• Measurable – tangible
• Accountable – in line with individual responsibilities
• Realistic – achievable
• Timely – suitable for the current needs
Components of software measurements
Example of Metrics
• Progress / Effort / Cost Indicators
• Earned value management
• Requirements / Code Churn
• Defect-related metrics
• Test-related metrics
Size
• How big is the healthcare.gov website?
– http://www.informationisbeautiful.net/visualizations/million-lines-of-code/
Size
• Earth System Modeling Framework Project
http://www.earthsystemmodeling.org/metrics/sloc.shtml
Progress Indicator
Effort Indicator
Cost Indicator
Earned Value Management
• Planned Value (PV), or Budgeted Cost of Work Scheduled (BCWS)
• Earned Value (EV), or Budgeted Cost of Work Performed (BCWP)
http://en.wikipedia.org/wiki/Earned_value_management
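With an Actual Cost (AC, also called ACWP) figure alongside PV and EV, the standard EVM variances and indices follow directly. A minimal sketch with illustrative numbers (not real project data):

```python
# Standard EVM quantities; PV and EV are defined above, AC (ACWP) is the
# third standard input. All figures below are illustrative.
PV = 100_000.0  # Planned Value (BCWS): budgeted cost of work scheduled
EV = 80_000.0   # Earned Value (BCWP): budgeted cost of work performed
AC = 90_000.0   # Actual Cost (ACWP): actual cost of work performed

schedule_variance = EV - PV  # negative => behind schedule
cost_variance = EV - AC      # negative => over budget
spi = EV / PV                # Schedule Performance Index (< 1.0 is behind)
cpi = EV / AC                # Cost Performance Index (< 1.0 is over budget)

print(schedule_variance, cost_variance, round(spi, 2), round(cpi, 2))
# -20000.0 -10000.0 0.8 0.89
```

Here the project has earned only 80% of the value planned to date (SPI 0.8) and spent more than it earned (CPI below 1.0).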
Burndown Chart
http://en.wikipedia.org/wiki/Burn_down_chart
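The chart plots remaining work against time and compares it with an ideal straight-line burndown. A sketch of the underlying numbers, assuming a hypothetical 5-day sprint:

```python
# Hypothetical 5-day sprint with 40 story points of committed work.
total_points = 40
sprint_days = 5

# Ideal burndown: a straight line from total_points down to zero.
ideal = [total_points - total_points * d / sprint_days
         for d in range(sprint_days + 1)]

# Actual remaining work recorded at the end of each day (illustrative).
actual = [40, 36, 30, 24, 15, 0]

# A positive gap means the team is behind the ideal line that day.
gap = [a - i for a, i in zip(actual, ideal)]
print(ideal)  # [40.0, 32.0, 24.0, 16.0, 8.0, 0.0]
print(gap)    # [0.0, 4.0, 6.0, 8.0, 7.0, 0.0]
```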
Requirements Churn / Requirements Creep / Requirements Volatility
• Number of changes to system requirements in each phase/week/increment
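One common way to normalize the change count is as a fraction of the requirements baseline. A sketch with hypothetical counts:

```python
def requirements_churn(added, modified, deleted, baseline_total):
    """Fraction of the requirements baseline changed in one period."""
    return (added + modified + deleted) / baseline_total

# Hypothetical increment: 5 added, 8 modified, 2 deleted out of 100 baselined.
churn = requirements_churn(added=5, modified=8, deleted=2, baseline_total=100)
print(f"{churn:.0%}")  # 15%
```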
Code Churn
• Software change history
• Large / recent changes
• Total added, modified, and deleted LOC
• Number of times that a binary was edited
• Number of consecutive edits
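Aggregating churn from a change history can be sketched as follows; the file names and line counts are illustrative:

```python
from collections import defaultdict

# Simplified change history: (file, added LOC, modified LOC, deleted LOC)
# per commit. File names and counts are illustrative.
history = [
    ("parser.c", 120, 30, 10),
    ("parser.c", 15, 40, 5),
    ("lexer.c", 200, 0, 0),
]

churned_loc = defaultdict(int)  # total added + modified + deleted LOC
edit_count = defaultdict(int)   # number of times each file was edited

for path, added, modified, deleted in history:
    churned_loc[path] += added + modified + deleted
    edit_count[path] += 1

print(dict(churned_loc))  # {'parser.c': 220, 'lexer.c': 200}
print(dict(edit_count))   # {'parser.c': 2, 'lexer.c': 1}
```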
Code Complexity
• Gathered from the code itself
• Multiple complexity values
• Cyclomatic complexity
• Fan-in / fan-out of functions
• Lines of code
• Weighted methods per class
• Depth of inheritance
• Coupling between objects
• Number of subclasses
• Total global variables
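Cyclomatic complexity, for instance, is McCabe's V(G) = E − N + 2P over the control-flow graph (equivalently, decision points + 1 for a single function). A minimal sketch:

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's V(G) = E - N + 2P over the control-flow graph."""
    return edges - nodes + 2 * components

# A function whose control-flow graph has 8 nodes and 9 edges, e.g. one
# if/else plus one loop: equivalently, 2 decision points + 1 = 3.
print(cyclomatic_complexity(edges=9, nodes=8))  # 3
```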
Code coverage
• Degree to which the source code is tested
• Statement coverage
– Has each statement in the program been executed?
• Branch coverage
– Has each control structure been evaluated to both true and false?
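Both measures reduce to a ratio of exercised items to total items. A sketch with hypothetical execution counts:

```python
def statement_coverage(executed_statements, total_statements):
    """Fraction of statements executed at least once by the tests."""
    return executed_statements / total_statements

def branch_coverage(taken_outcomes, total_outcomes):
    """Fraction of branch outcomes taken; each condition has two outcomes."""
    return taken_outcomes / total_outcomes

# Hypothetical run: 90 of 100 statements executed, 14 of 20 branch outcomes.
print(f"{statement_coverage(90, 100):.0%}")  # 90%
print(f"{branch_coverage(14, 20):.0%}")      # 70%
```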
Code Coverage
http://www.firstlinesoftware.com/metrics_group2.html
JUnit Code Coverage
• The tool instruments byte code with extra code to measure which statements are and are not reached.
[Screenshots: a code coverage report, showing package-level and line-level coverage]
http://www.cafeaulait.org/slides/albany/codecoverage/Measuring_JUnit_Code_Coverage.html
Defect reporting metric
• Can be categorized by:
– Status
• Remaining / Resolved / Found
– Defect source
• Requirements / Design / Development
– Where found
• Peer review / unit testing / sanity check
– Time
• Defect arrival rate / Defect age
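The time-based metrics above (arrival rate, defect age) fall out of per-defect reported/resolved dates. A sketch with illustrative dates:

```python
from datetime import date

# Per-defect (reported, resolved) dates; None means still open. Illustrative.
reports = [
    (date(2024, 1, 1), date(2024, 1, 5)),
    (date(2024, 1, 2), None),
    (date(2024, 1, 9), date(2024, 1, 10)),
]

remaining = sum(1 for _, resolved in reports if resolved is None)
ages_days = [(resolved - reported).days
             for reported, resolved in reports if resolved]

print(remaining, ages_days)  # 1 [4, 1]
```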
Defect Status
Defect Density
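Defect density is conventionally computed as defects per thousand source lines of code (KSLOC). A sketch with illustrative numbers:

```python
def defect_density(defects, sloc):
    """Defects per thousand source lines of code (KSLOC)."""
    return defects / (sloc / 1000)

# Hypothetical: 45 defects found in a 30,000-SLOC component.
print(defect_density(45, 30_000))  # 1.5
```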
Test Pass Coverage
http://www.jrothman.com/Papers/QW96.html
Defect Density
Defect Per LOC
Developer Code Review
After 60–90 minutes, our ability to find defects drops off precipitously.
http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/
As the size of the code under review increases, our ability to find all the defects decreases. Don’t review more than 400 lines of code at a time.
http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/
Top 6 Agile Metrics
Ref: Measuring Agility, Peter Behrens
Velocity = Work Completed per sprint
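Velocity is typically averaged over recent sprints and used to forecast how many sprints remain for a given backlog. A sketch with hypothetical numbers:

```python
# Story points completed in the last four sprints (hypothetical numbers).
completed = [21, 25, 18, 24]

velocity = sum(completed) / len(completed)  # average points per sprint
backlog_points = 110                        # remaining backlog, in points
sprints_left = backlog_points / velocity    # naive forecast

print(velocity, round(sprints_left, 1))  # 22.0 5.0
```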
Case studies
• Metrics from Empirical Data – Northrop Grumman
• Metrics in IT services – Lockheed Martin
Richard W. Selby, Northrop Grumman Space Technology, ICSP '09: "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"
Case Studies: Lockheed Martin Consolidated Information Technology Infrastructure Contract (CITIC)
• Customer: Centers for Medicare & Medicaid Services (CMS)
– Day-to-day IT Operations & Maintenance (O&M) functions (high availability, 24/7)
• Service Desk, Mainframe Support (Tier 1), Mid-Tier Support (Tier 2), Desktop Support (Tier 3), Voice, Data, NOC, etc.
• Performance, Request, Incident, Configuration, Asset, and Change management
• Security and Disaster Recovery
– Modernization of the IT infrastructure
http://www.psmsc.com/UsersGroup2012.asp
Service Measurement Plans & SLAs
http://www.psmsc.com/UsersGroup2012.asp
Measurements for progress vs. predictions
• Project Management
– For measurements: effort and budget tracking; requirements status; task status
– For predictions: top 10 risks; cost to complete; schedule evolution
• Quality Management
– For measurements: code stability; open defects; review status and follow-up
– For predictions: residual defects; reliability; customer satisfaction
• Requirements Management
– For measurements: analysis status; specification progress
– For predictions: requirements volatility / completeness
• Construction
– For measurements: status of documents; change requests; review status
– For predictions: design progress of requirements; cost to complete; time to complete
• Test
– For measurements: test progress (defects, coverage, efficiency, stability)
– For predictions: residual defects; reliability
• Transition, deployment
– For measurements: field performance (failures, corrections); maintenance effort
– For predictions: reliability; maintenance effort
Ref: Ebert and Dumke, 2007
Recommended books
Software Measurement: Establish - Extract - Evaluate – Execute by Christof Ebert, Reiner Dumke (2010)
Practical Software Measurement: Objective Information for Decision Makers by John McGarry, David Card, Cheryl Jones and Beth Layman (Oct 27, 2001)
References
• http://sunset.usc.edu/classes/cs577b_2001/metricsguide/metrics.html
• Fenton, N.E., Software Metrics: A Rigorous Approach, Chapman and Hall, 1991.
• http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
• Christof Ebert and Reiner Dumke, Software Measurement: Establish, Extract, Evaluate, Execute, Springer, 2007.
• http://se.inf.ethz.ch/old/teaching/2010-S/0276/slides/kissling.pdf
• Richard W. Selby, Northrop Grumman Space Technology, ICSP '09, "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"
• Measuring Agility, Peter Behrens
• [Nikora 91] Nikora, Allen P., Error Discovery Rate by Severity Category and Time to Repair Software Failures for Three JPL Flight Projects, Software Product Assurance Section, Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099, November 5, 1991.
Today’s in-class assignment
• Individual in-class assignment
• Pick any 2 measurements that would be useful to your team
• After class, talk to your team members and pick 2 measurements that your team will use
• Start collecting data; you will have to present it during the DCR ARB / TRR ARB (Transition Readiness Review)