Project Metrics for Software Development
Posted by Carlos Sirias on Jul 14, 2009
Since 2007, I have been involved in an effort to measure success for software development projects regardless of their methodology, so that we can report to upper management. The following article presents some of the conclusions I personally drew during this research, in an effort to share with a broader audience the challenges we faced and how they were addressed. It focuses on performance rather than progress metrics, as I believe progress metrics focus only on the present and have little impact on the team's future accomplishments. I see progress metrics as a way to help your team achieve a goal on time; however, unless team members reflect on their performance, their chances to improve are reduced. For example, if a project manager keeps showing something like "Progress Against Schedule", the team will rush to recover lost time without stopping to think about what went wrong and how to improve, since they won't have time. That is why I believe progress metrics are helpful but not complete.
You might all remember the famous quote: "If you can't measure it, you can't manage it." If a company is unable to manage a software project, how will they know how to improve? When they have improved? And what is the real value added by a change introduced into the process, for example a transformation of their software practices and principles (did someone mention "a switch from Waterfall to Agile"?). Software project success has always been the goal of the industry; however, the metrics that help us measure that success have been as diverse as they could be. Depending on the particular methodology you follow, the sets of suggested metrics may have nothing in common. We faced that challenge at Hewlett Packard, as we had a diverse set of projects using different methodologies, so our upper management received mixed metrics depending on what the different organizations wanted to report.
For Agile readers: we know their projects are uniquely well suited for metrics, as data collection can be done seamlessly and without distracting the team. However, the suggested set of metrics might not be suitable for projects not using the principles and practices embraced by Agile; things such as velocity, cumulative flow and burndown might not make sense for teams that have not embraced Agile. What if we want to measure projects for what they are, projects, and not for what they use?
Management of the software development process requires the ability to identify measures that characterize the underlying parameters to control, and that aid continuous improvement of the project. After several attempts and research inside and outside the company, we ended up struggling with a huge set of proposed metrics (around twenty), but in a very Agile way we sat with a team of experts to retrospect on each particular metric. First we eliminated all the proposed ones that dealt with "size" (of the project, artifacts, code, deliverables), as big companies manage all kinds of software projects, and we really asked ourselves: do we really want to measure this? The answer was no, as we first wanted to have a set of easy metrics that did not involve any burden on the team and that were more useful in determining team maturity and performance.
Finally we decided to keep it simple and defined three core metrics for all of IT, as follows:
Effort was defined as the total amount of time for a task that results in a work product or a service. The planned amount of time is called the 'Planned Effort' and the actual amount of time spent is called the 'Actual Effort'; it can be measured in hours or in days depending on the specifics of the project. A task is part of a set of actions which accomplishes a job, problem or assignment, which should not be different for Waterfall or Agile projects. However, we do acknowledge differences in how tasks are managed under different methodologies, which can be seen as advantages or disadvantages of each. The following figure shows how this metric is summarized for the team.
Figure 1: Cumulative Person Effort Visual Display & Actual Effort distribution per phase
Effort is a directly observable quantity which is measured cumulatively for the tasks assigned to specific resources; it can also be computed for specific tasks, milestones or phases. The 'Planned Effort' is collected when the work for a specific task is planned, and is refined when the task is analyzed or designed. The 'Actual Effort' is collected during the construction, testing and warranty work related to the specific task. At the beginning, organizations should expect big differences or gaps between their estimates and actuals; however, as teams mature, the numbers tend to get closer. Our experience has been that if, after a six-month period, the effort charts don't show converging trends, the team must retrospect and find root causes, which can be related to factors inside or outside the team. Defining the constraint could be a great place to start, as suggested by Poppendieck [5].
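As an illustration of how the effort metric rolls up, the planned and actual totals and their gap can be computed with a few lines of code. This is a minimal sketch; the task names and field names are hypothetical, not taken from the tooling described in the article.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    planned_hours: float  # 'Planned Effort', set when the task is planned
    actual_hours: float   # 'Actual Effort', collected during construction/testing/warranty

def effort_summary(tasks):
    """Cumulative planned and actual effort, plus the gap as a percentage
    of the plan; a shrinking gap over successive periods is the
    'maturing team' trend described above."""
    planned = sum(t.planned_hours for t in tasks)
    actual = sum(t.actual_hours for t in tasks)
    gap_pct = abs(actual - planned) / planned * 100
    return planned, actual, gap_pct
```

For example, the tasks `[Task("login", 10, 12), Task("report", 5, 6)]` yield a planned total of 15 hours, an actual total of 18 hours, and a 20% estimation gap.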
Productivity was defined as how many "simple tasks" can be delivered per day. It can be computed at various levels within a project: individual, profile, task phase or project.
The definition of a "simple task" will always raise heated conversations; we finally defined it as the amount of time required by a resource to deliver something, and settled on five hours (of course some simple tasks take less or more time, but we settled on that number).
Given our definition, "simple tasks that can be delivered per day (8 hours)", the formula was defined as:
Productivity = ((Planned Effort / 5) / Actual Effort) * 8
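Expressed in code, the formula is a direct transcription; the parameter defaults encode the five-hour "simple task" and the eight-hour day:

```python
def productivity(planned_effort, actual_effort,
                 simple_task_hours=5, workday_hours=8):
    """Simple tasks delivered per 8-hour day:
    ((Planned Effort / 5) / Actual Effort) * 8."""
    simple_tasks_planned = planned_effort / simple_task_hours
    return simple_tasks_planned / actual_effort * workday_hours
```

Note that when actual effort exactly matches planned effort the value comes out to 8/5 = 1.6, so 1.6, not 1.0, is the "on plan" baseline (a point raised in the comments on this article).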
The following figure shows how this metric is summarized to the team.
Figure 2: Productivity Visual Display
The metric can obviously be challenged, and a lot of valid arguments were raised while we defined it; however, we needed to settle on something the majority would agree with, just as in any nice democracy. Once we had such a metric, we found its power to compare project health by week, month, resource, etc.
The productivity metric begins at a task level and rolls up to a deliverable, phase and project level. It has an obvious dependency on the estimation technique and the effort-collection tool.
Figure 3: Productivity Visual Display
The figure shows the cumulative productivity for two similar project releases worked by the same team, with the same members, using the same metrics. First the team worked six months using their familiar Waterfall type of methodology (all analysis upfront, all coding in between, all testing at the end), and then they spent six more months on a release while adopting Agile. The productivity formula applied to both project releases shows the typical trend Agilists often describe: batches of work being moved from one phase (for example coding) to another phase (for example testing), which does not allow the team to deliver a quality product at a sustainable pace, therefore decreasing their productivity as defined by the previous formula (perhaps due to the thrashing of switching from one task to another in an effort to multitask). The same team, using Agile principles and practices such as iterative development, focus on a given feature, and fail-early development (choosing risky requirements first), was able to increase their productivity from their previous project release.
With a common metric we have been able to measure the predictability in terms of delivery that Agile projects have, as well as the accuracy of their estimations as the project matures.
Quality was defined as the number of severe, medium or low defects delivered through the lifetime of the project. It helps identify the goodness of the deliverable to the end user. Each team needs to define what severe, medium and low mean for their specific project. Quality should be reported throughout the life of the project; the later defects are caught, the more impact they will cause on the project. The following figure shows how this metric is summarized for the team.
Figure 4: Quality Visual Display
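A minimal sketch of how the quality counts could be tallied, assuming each defect is recorded as a (severity, phase) pair; the phase names and record shape are illustrative assumptions, not the article's actual tooling:

```python
from collections import Counter

SEVERITIES = ("severe", "medium", "low")

def quality_summary(defects):
    """Count defects per severity bucket over the life of the project.
    `defects` is an iterable of (severity, phase) tuples."""
    counts = Counter(severity for severity, _phase in defects)
    return {s: counts.get(s, 0) for s in SEVERITIES}
```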
With defects collected, we also track a derived metric, "defect removal". Its goal is to evaluate what percentage of issues are found late in the process (they obviously cost more) as opposed to early. Here we have also found some interesting behavior when comparing Agile to Waterfall types of projects.
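The derived metric can be sketched as the share of defects found in late phases; the phase names below are assumptions for illustration:

```python
def late_detection_rate(defects_by_phase, late_phases=("testing", "warranty")):
    """Percentage of all defects that were found late in the process.
    `defects_by_phase` maps a phase name to its defect count."""
    total = sum(defects_by_phase.values())
    late = sum(count for phase, count in defects_by_phase.items()
               if phase in late_phases)
    return late / total * 100
```

A Waterfall-like profile such as `{"analysis": 1, "coding": 2, "testing": 6, "warranty": 1}` yields 70.0, i.e. 70% of defects caught late.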
Figure 5: Defect Removal Visual Display
The previous display shows how an Agile project has a more sustainable defect rate throughout the lifecycle of the software, whereas Waterfall types of projects show a peak towards the end. The information was collected from two different product releases created by the same team (with a comparable set of requirements) but using the two approaches.
This set of metrics is constantly collected, analyzed and reported. We can enclose them in a "project dashboard" so as to have a holistic view and a way to share and interpret the results with the stakeholders.
There is an obvious correlation among the metrics, as shown by the following table.
Productivity vs. Delivered Defect Density

Positive trend: High productivity with low delivered defect density is good behavior; the team could aim for further improvement in productivity while monitoring the delivered defect density level.

Negative trend: High productivity with high delivered defect density is an indicator that the planned effort for QA activities is insufficient in the project. While high productivity is desirable, there has to be a balance between productivity and quality for overall benefit; hence, an optimal level of productivity with good quality is desirable. High delivered defect density with low productivity indicates a need to fine-tune, tailor or automate testing to detect issues early in the game.

Productivity vs. Defect Removal Efficiency

Positive trend: High productivity with high defect removal efficiency indicates a good balance between productivity and quality. The team can aim for further improvement in productivity while monitoring the defect removal efficiency.

Negative trend: High productivity with lower defect removal efficiency indicates that the planned effort for QA activities is insufficient in the project. High defect removal efficiency with low productivity indicates a need to put more attention on quality throughout the cycle.
Guilherme Souto, one of our project managers in Brazil R&D, documented the following quick tips that help us stay honest and rationally driven during metrics adoption:
- Metrics need to be linked to an organizational or project objective, to demonstrate whether we are achieving what we committed to.
- The metric needs to be integrated into the day-by-day workflow (as much as possible), in order to avoid having time allocated just to collect data.
- There needs to be a verification that the chosen metrics cover the different areas that will define whether a project reaches the end with or without success.
- Metrics are like words: they truly make sense in a set that creates a sentence. So, as much as possible, it is necessary to stay away from over-analyzing the data provided by one metric. When this set of metrics was introduced we kept seeing teams focus too much on productivity; however, isolating it puts us in a bad position to take into account other variables, such as how difficult or challenging the project was, or the level of quality that the team produced (because of their maturity, the process followed, or other factors). At the end of the day we didn't want to commit the same mistakes we make when we look at progress metrics, such as performance against schedule, in isolation, driving us to sacrifice quality or resources.
- It is useless to use a metric that cannot be used to base or define action plans.
- Target values need to be defined for metrics like performance, since in order to take corrective actions the team must be aware of the expected minimum and maximum value ranges.
The metrics provide only part of the story, and their applicability to decision making depends on knowing the chain of events that triggered a specific change in a team's trend. The approach and defined set of metrics might not be the best for your particular team, but it is what we have done so far, and it is working for us. Hopefully it will work for some readers, but of course I could be way off.
References
The following are good starting points on the topics presented in this article.
1. H.T. Goranson. The Agile Virtual Enterprise: Cases, Metrics, Tools. Quorum Books, 2003.
2. Tom DeMarco. Controlling Software Projects: Management, Measurement and Estimation. Prentice Hall, 1986.
3. Cem Kaner. Software Engineering Metrics: What Do They Measure and How Do We Know? 10th International Software Metrics Symposium, 2004.
4. Rüdiger Lincke, Jonas Lundberg, Welf Löwe. Comparing Software Metrics Tools. International Symposium on Software Testing and Analysis, 2008.
5. Poppendieck and Poppendieck. Implementing Lean Software Development: From Concept to Cash. Addison-Wesley Professional, 2003.
Community comments
Productivity (by Piers Thompson, Jul 15, 2009)
If I understand correctly, your productivity measure is actually a measure of estimation accuracy, and therefore the reasoning relating "productivity" and "discovered defect density" is bogus.
Rewriting your formula for "productivity" I get:

Productivity = (planned / actual) * k, where k = 8/5

Therefore if I plan that a particular task will take 1 hour, and it takes 1 hour, then "Productivity" = 8/5 = 1.6.

"Productivity" > 1.6 means "faster than planned"; "productivity" < 1.6 means "slower than planned".
Incidentally, the use of defect counts to compare dissimilar projects has precisely one thing going for it: it's easy to do. It doesn't tell you much, but at least you don't waste much money collecting the metric.
Re: Productivity (by Carlos Sirias, Jul 15, 2009)
Thanks for your reply, Piers... We actually thought our definition was bogus... yeah, we did... However, trying to find another definition proved to be really painful and time-consuming; after all, how can you measure productivity? I know some methodologies have better definitions, and I agree; however, keep in perspective that I'm trying to have a metric that fits under most of them. So that was our reasoning behind the formula: not perfect, not even close, but at least something easy to understand and enough to get your team going.
As you pointed out, one of our premises was to have the simplest set of metrics that could possibly work... I hope we have shown that. Thanks for your comments.
Performance is not the only aspect to be measured (by Mark Kofman, Jul 19, 2009)
In your article you put a lot of attention on performance. Why performance? Why not measure the expertise of the team? Why not spend time evaluating scope and product size? In my opinion, it is important, while measuring a software project, to collect metrics informing you about different aspects of the project. Only that would give you the whole picture of how well things are going.
Mark
CEO, programeter.com
Re: Productivity (by Hal Arnold, Jul 19, 2009)
So... "it's really painful and time consuming, [and] how do you measure productivity?" But this is exactly why so many others have tried and failed. And it's really dangerous, because your audience and stakeholders won't understand how "bogus" it really is. They'll be happily expecting that you're REALLY improving productivity, while other, more important things like technical debt are not being concentrated on by the team, because they are coding to your productivity metric. It seems to me that you've measured how well the team is able to estimate, which is wonderful, but not what you want.
Re: Performance is not the only aspect to be measured (by Carlos Sirias, Jul 20, 2009)
Thanks for your reply, Mark; indeed we pay a lot of attention to "performance". Keep in perspective that I work for a roughly 5000-technologist shop, so we are still maturing. Measuring scope and product size is part of what we do when estimating; we have found it very difficult to measure how well teams do this... the best we could do was our definition of "performance".
Re: Productivity (by Carlos Sirias, Jul 20, 2009)

Great comment, Hal; I totally agree with you. Keep in perspective that I work for a roughly 5000-technologist shop, so we are still maturing; measuring technical debt in this kind of environment is really difficult, and I would be interested in reading a proposal on that... I guess one thing we could do is have a really senior technologist audit this, but you can see that this would be a full-time job for not one person but probably a whole organization of dozens (to cover 5000). First we need to sell this to upper management, and it will take a while for the industry trend to reach us.
Measurements (by Robert Fisher, Jul 21, 2009)
I like this article, and it deserves prominence.
The definitions of measurements are important (reference the misunderstanding about"Productivity" and "Estimating Accuracy" above).
About 80% of our software development shop is Agile with long experience, and we need common measurements across Agile and traditional projects. We found it boils down to just three measurements:
1) Size (key for any endeavour; cf. building a bridge: "how big is it?")
2) Effort (the cost of work, estimated and actual)
3) Waste (in products and processes)
This is only part of the story; they all come with attributes, e.g. business benefit, activity, defect source. But we keep them few and simple.
Useful measurements are ratios, e.g. resource ability, productivity, accuracy, defect density,escape rates, progress, etc. All derived from the three basics plus their attributes.
As the article states, we found measurements must be tied to key business objectives, e.g. how good are our products? How good are we? Where are we with the project? Unsurprisingly, the three measurements are the same controls used by our professional engineers each day.
Most of all, they are automatic: they are collected and analysed by the development tools at no cost to our professionals. Using spreadsheets invites failure. To do this, the sources, form, translation, charting and statistical analysis are strongly defined and common in the tools we use. This is where all the hard work came in.
Re: Measurements (by Carlos Sirias, Jul 22, 2009)
Hi Robert
Thanks for your input. I concur with you that every shop needs to achieve some level of automation in collecting the data that feeds the metrics... that's key so that we don't hurt the chemistry of our technical teams. Great feedback.