performance metrics example for software development v1 4-26-05


8/12/2019 Performance Metrics EXAMPLE for Software Development v1 4-26-05

Business Solutions Center of Excellence: Performance Metrics

Version 1.1


Performance Metrics

2. Introduction

    2.1 Purpose

This Performance Metrics document addresses the basic principles of measurement and metrics within a measurement framework and aids in the establishment of a measurement plan for projects by utilizing a set of standard common metrics. This document's primary targets are software development leads and managers. It will also be of interest to system engineers, software process specialists, managers of disciplines related to software engineering (such as software configuration management), and program managers whose programs have a significant software component. The information contained in this document applies specifically to the Business Solutions Center of Excellence (BSCoE) environment.

BSCoE is a project of the Bureau of Enterprise Architecture (EA), part of the Office of Administration/Office for Information Technology (OA/OIT).

EA is responsible for overseeing development of IT technical standards and monitoring compliance across Commonwealth agencies. The BSCoE project was initiated to help EA meet its responsibilities. Moreover, EA is a contributing sponsor for the Commonwealth as a whole. BSCoE is intended to provide value in return for EA's sponsorship.

    2.2 Scope

The Performance Metrics document is intended to serve as a reference for establishing a measurement process for BSCoE.

    2.3 References

Glossary

Refer to Appendix B for references and resources.

2.4 Overview

The first section of this document describes exactly what is meant by the term "Software Metrics" and introduces the reader to the domain of Software Metrics by discussing the need for a measurement-based approach to the management of software engineering. The second section is really the core of the document. This section describes an approach to the development and implementation of a Software Metrics Framework. Essentially, the approach centers around a model that breaks the work into a number of stages. This division of labor into phases is, of course, nothing more than the way in which most successful projects are handled.

OA/OIT Page 232


3. Software Metrics

The purpose of software metrics is to make assessments throughout the software life cycle as to whether the software project and its quality requirements are being met. The use of software metrics reduces subjectivity in the assessment and control of software quality and its development processes by providing a quantitative basis for making decisions.

Specifically, metrics are used to:

Track and monitor project status
Identify project management and tracking issues
Achieve stated project quality goals
Establish quality requirements for a system at its outset
Establish acceptance criteria and standards
Evaluate the level of quality achieved against the established requirements
Detect anomalies or point to potential problems in the system
Predict the level of quality that will be achieved in the future
Monitor changes in quality when software is modified, and
Assess the ease of change to the system during product evolution.

Metrics should be used only when the value they add to the software development process outweighs the cost of tracking them. The end goal (developing high-quality, low-cost software that meets requirements in a timely manner) should always remain the focus of any development team. Metrics can be a valuable tool to aid in this process but should not be emphasized to the point that their importance overshadows the final product.

3.1 What are metrics?

A metric is a measure of some aspect of a program, design, or algorithm. It can be systematically calculated, and it can be used to make inferences about that program, design, or algorithm. By systematically calculating values for programs of known complexity, we can infer the complexity of other programs from their calculated values.

For example: we know that programs, designs, and algorithms with value y for metric x had problem z (high defect rate, poor maintainability, etc.). Subsequent programs with those metric values will probably have similar problems.

3.2 Measures vs. Metrics

To understand how to apply software metrics, we must first understand what measurement means and why we need it. Measurement lets us quantify concepts or attributes in order to manipulate them and learn more about them. A measure is a mapping from the empirical world (that is, the real world in which we live and function) to a more formal, mathematical world. We identify an entity to study and then an attribute of that entity. Next, we map the attribute to its mathematical representation, where manipulation of the mathematical symbols may reveal more about the entity or attribute than our direct observation (in the real world) would allow.


For example, we may choose to examine a code module (the entity) and capture its size (the attribute) using a measure such as lines of code (LOC).
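This entity-to-number mapping can be sketched in a few lines. The counting rule below (non-blank, non-comment physical lines) is an illustrative assumption; a real program would use the counting standard that SQA defines, as described later in the responsibilities section.

```python
def count_sloc(source: str) -> int:
    """Map a module (the entity) to its size (the attribute) measured as
    physical source lines of code: non-blank lines that are not pure
    comments.  The '#' comment convention is an assumption here; a real
    counting standard is language-specific."""
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            count += 1
    return count

module = """\
# compute a factorial
def fact(n):
    if n <= 1:
        return 1
    return n * fact(n - 1)
"""
print(count_sloc(module))  # 4 physical SLOC
```

Once the attribute is mapped to a number, the usual mathematical manipulations (sums across modules, trends over time) become available.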


4. Establishing a Measurement Methodology

4.1 The Measurement Program Plan

A software measurement methodology is a systematic method of measuring, assessing, and adjusting the software development process using objective data. Within such a systematic approach, software data is collected based on known or anticipated development issues, concerns, questions, or needs. The data are analyzed with respect to the characteristics of the software development process and products, and are used to assess progress, quality, and performance throughout the development. There are seven key components to an effective measurement methodology:

Define clearly the software development goals and the software measures (data elements) that support insight into the goals.
Use the Goals-Questions-Metrics (GQM) paradigm framework.
Define and develop a set of metrics.
Collect and validate the data.
Process the software data into graphs and tabular reports (indicators) that support the analysis of issues.
Analyze the indicators to provide insight into the goals.
Use the results to implement improvements and to identify new issues and questions.

    4.1.1 Responsibilities

Responsibility for data collection should be assigned as follows:

The Project Planning Manager should take on the responsibility for collecting all size estimates, time estimates, and scheduling details. The PM should be the hub of activity for any planning metrics program.

For software sizing purposes, Software Quality Assurance (SQA) shall develop a coding standard and a program counting standard. It is the programmers' responsibility to provide SQA a consensus on what should be included in the standard.

Configuration Management should be responsible for measuring and reporting the code size for the project.

It is the responsibility of each developer working on a project to produce a time recording log of their activities for each day and to report it to the project planning manager. The developer is also required to provide details of their availability for a new project when required.

    4.1.2 Resources

There are a number of resources that may be utilized in the measurement methodology, namely:

A metrics database - In fact, research shows that the most popular implementation for storing the collected data is Microsoft Excel spreadsheets. However, if an organization is to have a collective history over all projects, then populating and maintaining an RDBMS is the best practice.

Metric toolset - Some data can be collected automatically and unobtrusively by software tools. For example, code analyzers and compilers can count lines of code; operating system accounting packages can supply data about processor and tool usage; and organizational accounting systems can typically report hours of effort by interfacing with the time card system.

Reporting tools - Crystal Reports is a common reporting tool.


Data collection forms - The forms are usually designed by the analysis and presentation team members and completed by the development and maintenance team. All forms require the submitter to provide identifying information such as the project name, the team member's name, and the date. In addition, each type of form is designed to provide some of the measures that satisfy the goals of the measurement program. Some forms request both objective data (directly observed) and subjective data (based on opinion). All require only short answers or the selection of options from a checklist. These forms could be paper-based or on-line.

4.2 Goal-Question-Metric Methodology

The Goal-Question-Metric (GQM) methodology is used to define measurement on the software project, process, and product in such a way that:

Resulting metrics are tailored to the organization and its goals.
Resulting measurement data play a constructive and instructive role in the organization.
Metrics and their interpretation reflect the values and the viewpoints of the different groups affected (e.g., developers, users, and operators).

GQM defines a measurement model on three levels:

Conceptual level (goal): A goal is defined for an object, for a variety of reasons, with respect to various models of quality, from various points of view, and relative to a particular environment.

Operational level (question): A set of questions is used to define models of the object of study, and then focuses on that object to characterize the assessment or achievement of a specific goal.

Quantitative level (metric): A set of metrics, based on the models, is associated with every question in order to answer it in a measurable way.

Although originally used to define and evaluate a particular project in a particular environment, GQM can also be used for control and improvement of a single project within an organization running several projects.
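The three levels can be sketched as a small tree structure. The goal and question wording below is purely illustrative, not drawn from an actual BSCoE measurement plan:

```python
# One GQM goal, refined into questions, each answered by metrics.
# All names below are illustrative assumptions.
goal = {
    "purpose":   "improve",
    "object":    "the requirements process",
    "viewpoint": "project manager",
    "questions": [
        {"text": "How stable are the requirements?",
         "metrics": ["total requirements", "cumulative changes", "open TBDs"]},
        {"text": "How complete is the definition effort?",
         "metrics": ["completed requirements", "open TBDs"]},
    ],
}

# The same metric ("open TBDs") answers two different questions under
# the same goal, as the methodology allows.
distinct_metrics = {m for q in goal["questions"] for m in q["metrics"]}
print(len(goal["questions"]), len(distinct_metrics))  # 2 questions, 4 distinct metrics
```

The hierarchy makes the traceability explicit: every collected metric can be traced upward to a question and a goal, so nothing is collected without a stated reason.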

    Figure 1 Goal-Question-Metric Paradigm


The GQM model has a hierarchical structure, starting with a goal that specifies the purpose of measurement, the object to be measured, and the viewpoint from which the measure is taken. The goal is refined into several questions that usually break down the issue into its major components. Each question is then refined into metrics. The same metric can be used in order to answer different questions under the same goal. Several GQM goals can also have questions and metrics in common, provided that when the measure is actually collected, the different viewpoints are taken into account correctly (i.e., the metric might have different values if taken from different viewpoints).

With the GQM method, the number of metrics that need to be collected is focused on those that correspond to the most important goals. Thus, data collection and analysis costs are limited to the metrics which give the best return. On the other hand, the emphasis on goals and business objectives establishes a clear link to strategic business decisions and helps in the acceptance of measurements by managers, team leaders, and engineers.

    4.2.1 Phases of GQM

The GQM method contains four phases:

The Planning phase, during which the project for measurement application is selected, defined, characterized, and planned, resulting in a project plan.

The Definition phase, during which the measurement program is defined (goal, questions, metrics, and hypotheses are defined) and documented.

The Data collection phase, during which the actual data collection takes place, resulting in collected data.

The Interpretation phase, during which the collected data is processed with respect to the defined metrics into measurement results that provide answers to the defined questions, after which goal attainment can be evaluated.

    Figure 2 GQM Phases

    4.2.2 Planning Phase

The planning process for GQM measurement consists of the following activities:


Establish GQM team
Select improvement area
Select project or organization
Create project team
Training

The GQM team must contain a variety of stakeholders to ensure proper metric balance. Agency management, customers, and SMEs will have the best perspective on the business and operational environments. The Project Manager will work with all of these stakeholders to design effective metrics.

    4.2.3 Definition Phase

The definition process for GQM measurement consists of the following activities:

Pre-study - The pre-study examines and characterizes the application context and project in order to make current problems and goals explicit. The pre-study is an important preparation for measurement.

Measurement goal selection - During this activity, informal improvement goals are described, refined, and ranked. Priorities are assigned, and it is decided which goals will be used and transformed into GQM goals.

GQM planning - GQM planning is the actual design of the measurements. The GQM paradigm is applied for defining a detailed tree of goals, questions, and metrics. Interviews are held with project members to retrieve the necessary information.

Measurement planning - Measurement planning is done to develop data collection procedures and introduce automated tools for data analysis. During this activity, the initial GQM plan is made operational for practice.

A formal definition process is documented as a seven-step procedure and is illustrated in Figure 3.

4.2.4 Definition Phase - Effective Coding Standard Example


4.2.5 Definition Phase - Core Metrics based on the GQM Methodology

    Table 1 GQM core metrics

Goal: Size Metrics - ensure the size status of the project agrees with plan

Question: What is the current size of the requirements status?
Metric: Number of requirements completed to date vs. total number of requirements planned

Question: What is the current size of the design status?
Metric: Number of S


The metrics addressed in Table 1 are intended for mature measurement programs where the measurement team has gained expertise in the design and collection of metrics. Four basic measures are among the management tools used within BSCoE to acquire, develop, and maintain software systems. These measures address important product and process characteristics that are central to planning, tracking, and improving the software development process. Table 2 lists the measures and relates them to the characteristics they address.

    Table 2 BSCoE core metrics

Unit of Measure | Characteristics Addressed
Counts of physical source lines of code | Size, progress, reuse, rework
Counts of staff hours expended | Effort, cost, rework, resource allocations
Calendar dates tied to milestones, reviews and audits, and deliverable products | Schedule
Counts of software problems and defects | Quality, readiness for delivery, improvement trends, rework

The measures in Table 2 are not the only ones that can be used to describe software products and processes, but they represent a starting point and are practical measures that produce useful information.

    4.2.5.1 BSCoE Requirements Metrics

Requirements development and management have always been critical in the implementation of software systems. Recently, automated tools have become available to support requirements management. The use of these tools not only can provide support in the definition and tracing of requirements but also opens the door to effective use of metrics in characterizing and assessing risks. These types of metrics are important because of the benefits associated with early detection and correction of problems with requirements; problems not found until testing are at least 4 times more costly to fix than problems found in the requirements phase. Despite the significant advantages attributed to the use of automated tools, their use has not become common practice and will not be addressed further.

The remainder of this section discusses metrics analysis of information to be entered into the BSCoE requirements database and later used to provide insight into the stability and expansion of requirements. These metrics will assist BSCoE project managers and quality assurance engineers in identifying risks. These metrics help assure that the completed software system contains the functionality specified by the requirements. Table 3 lists the common metrics associated with requirements.

Table 3 Standard Requirements Metrics

Requirements Approval Status | Determining how well requirements are moving through life-cycle phases
Requirements Verification Method | Tabular list of requirements by verification method with supporting requirements details
Requirements Criticality | Tabular list of requirements by criticality
Changed Requirements | Provides a count of requirements changes
Completed Requirements | Describes the number of requirements completed
Deleted Requirements | Provides a count of deleted requirements
Incomplete Requirements | Describes the number of requirements that are incomplete
New Requirements | Provides a count of requirements added
Total Requirements | Tracks accuracy of requirements planning and total requirements progress
Requirements Stability | How frequently requirements are changing
Requirements Volatility | General activity level with respect to requirements
Requirements TBDs | Amount of requirement definition effort still needed
Requirements Allocation Status | Progress of allocating system-level requirements to subsystems
Requirements with Risk | Insight into how much requirements volatility can be expected in future periods
Requirements Details | Tabular list of requirements

Three areas of requirements metrics will be discussed:

Requirements Stability
Requirements Testing
Automated Change Management

4.2.5.1.1 Requirements Stability

Requirements Stability provides an indication of the completeness, stability, and understanding of the requirements. It indicates the number of changes to the requirements and the amount of information needed to complete the requirements definition. A lack of requirements stability can lead to poor product quality, increased cost, and schedule slippage.

Requirements stability indicators are in the form of trend charts that show the total number of requirements, cumulative changes to the requirements, and the number of TBDs over time. A TBD refers to an undefined requirement. Based on requirements stability trends, corrective action may be necessary.

Requirements stability is applicable during all life-cycle phases, from project inception to the end. The requirements stability indicators are most important during the requirements and design phases.

Requirements are developed and baselined at major reviews during the system development life cycle. At these milestone reviews, documents containing the requirements are reviewed and commented upon. After resolution of the comments, the requirements documents are baselined and put under configuration control. Ideally, the rate of change in each level of requirements should decrease as a milestone review approaches. Requirements stability metrics are collected and reported on a monthly basis.

Figure 6 shows an example of the total number of requirements, the cumulative number of requirements changes, and the number of remaining TBDs over time. It may be desirable to also show the number of added, modified, and deleted requirements over time.
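The numbers behind such a trend chart can be tracked with a small sketch like the one below. The snapshot values and the stability ratio (the share of the requirement set not yet touched by changes) are illustrative conventions, not figures from this program:

```python
def stability(total_reqs: int, cumulative_changes: int) -> float:
    """Requirements stability as the fraction of the requirement set not
    affected by cumulative changes.  The exact formula is an assumption;
    each program defines its own stability indicator."""
    if total_reqs == 0:
        return 1.0
    return max(0.0, 1.0 - cumulative_changes / total_reqs)

# Monthly snapshots: (total requirements, cumulative changes, open TBDs).
snapshots = [(120, 30, 15), (125, 38, 9), (126, 41, 4)]
for total, changes, tbds in snapshots:
    print(f"stability={stability(total, changes):.2f}  open TBDs={tbds}")
```

A rising stability value together with a falling TBD count is the pattern one would hope to see as a milestone review approaches.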


Table 4 Requirements Stability

    4.2.5.2 BSCoE Repository and Publication Metrics

The BSCoE document Code Reuse Strategy (CRS) details the strategy for reusing modeling, design, and code artifacts during the software development process. Figure 4 provides an overview of this process.

Figure 4 Reuse Workflow

Software metrics provide an element of quantification on the entire reuse process. When selecting metrics, the traditional approach to evaluating software reuse has usually been based on local optimization, in that:

An individual aspect (usually a cost aspect) is studied closely and used for evaluation, or


A special (empirically established) target area is selected, and then questions for improvement are derived from the area and the metrics are determined using GQM.

Examples of the "individual aspects" are relative cost of reuse (RCR) and relative cost of writing for reuse (RCWR).
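RCR and RCWR combine into a simple break-even calculation. The numeric defaults below are illustrative assumptions, not BSCoE figures:

```python
def relative_reuse_cost(n_uses: int, rcr: float = 0.25, rcwr: float = 1.5) -> float:
    """Cost of building one reusable component and reusing it, relative
    to writing every instance from scratch (new development = 1.0 each).
    rcr  - relative cost of reuse, paid on each reuse after the first
    rcwr - relative cost of writing for reuse, paid once
    Both default values are illustrative assumptions."""
    cost_with_reuse = rcwr + (n_uses - 1) * rcr
    return cost_with_reuse / n_uses

for n in (1, 2, 5):
    print(n, round(relative_reuse_cost(n), 3))  # a ratio below 1.0 means reuse pays off
```

With these assumed ratios the reusable component pays for itself by the second use, which is the kind of local cost argument the paragraph above describes.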

Figure 5 tabulates examples of special target areas where the metrics are determined from the goals, although the metrics are only generally stated and not subjected to any theoretical measurement analysis.

Goals | Repository use | Reuse expansion in a project | Component costs | Process costs including reuse | Component quality | Reusability
Metrics | Number of components | Number of accesses | Reuse

Figure 6 Deriving Metrics from Goals and Questions

Once these questions are identified, QA must analyze each question to determine what must be measured in order to answer the question. For example, to understand who is using the standard, it is necessary to know what proportion of coders is using the standard. However, it is also important to have an experience profile of the coders, explaining how long they have worked with the standard, the environment, the language, and other factors that will help to evaluate the effectiveness of the standard. The productivity question requires a definition of productivity, which is usually some measure of product size divided by some measure of effort. As shown in the figure, the metric can be in terms of lines of code, function points, or any other metric that will be useful.
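A minimal sketch of such a productivity metric follows; the size unit and the sample numbers are illustrative:

```python
def productivity(size: float, effort_hours: float) -> float:
    """Product size per unit of effort.  'size' can be physical SLOC,
    function points, or any other size measure the program has
    standardized on; the choice of unit is up to the measurement plan."""
    return size / effort_hours

# e.g. 4,200 physical source lines delivered in 600 staff-hours
print(productivity(4200, 600))  # 7.0 SLOC per staff-hour
```

Whatever size measure is chosen, it must be the same one used consistently across projects, or the resulting productivity figures cannot be compared.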

4.2.6 Data Collection Phase

The data collection phase for GQM measurement consists of the following activities:

Hold trial period (optional)
Hold kick-off sessions (optional)
Create metrics repository
Collect and validate data
Store data in metrics database

The collection occurs at periodic intervals defined in the project plans and is monitored for completeness, integrity, and accuracy. The primary source for actual data is outlined in Section 3.3.

The following figure illustrates the data sources and repository for the data collection phase of the measurement program.


Figure 7 Data Collection and Presentation

4.2.6.1 Metrics Repository

Establish a metrics repository where metrics history is kept for future projects. The availability of past metrics data can be the primary source of information for calibration, planning estimates, benchmarking, process improvement, calculating return on investment, etc. At a minimum, the repository should store the following:

Description of projects and their objectives.
Metrics used.
Reasons for using the various metrics.
Actual metrics collected over the life of each project.
Data indicating the effectiveness of the metrics used.
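The minimum repository contents listed above can be sketched as a single table. The schema and column names below are assumptions for illustration only; an organization would normally design a fuller RDBMS schema:

```python
import sqlite3

# A single-table sketch of the minimal repository contents listed above.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE metric_history (
        project    TEXT NOT NULL,   -- project description / objectives
        metric     TEXT NOT NULL,   -- metric used
        rationale  TEXT,            -- reason for using the metric
        period     TEXT NOT NULL,   -- reporting period, e.g. '2005-04'
        value      REAL NOT NULL,   -- actual value collected
        effective  INTEGER          -- 1 if the metric proved useful
    )""")
con.execute(
    "INSERT INTO metric_history VALUES (?, ?, ?, ?, ?, ?)",
    ("Pilot project", "open TBDs", "requirements stability", "2005-04", 15, 1),
)
count = con.execute("SELECT COUNT(*) FROM metric_history").fetchone()[0]
print(count)  # 1
```

Keeping the rationale and effectiveness columns alongside the raw values is what makes the history usable for calibration and return-on-investment questions later.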

4.2.7 Interpretation Phase

4.2.7.1 Analyzing Data

Analyzing metrics and making objective, quantitative management decisions is the true benefit step in the integrated engineering metrics process. Metrics are most often communicated graphically, conveying a clear and easily understood message. It is better to have many graphs than it is to have many messages on one graph. Metrics are indicators that give warnings of problems associated with issues. An issue may be tracked with several metrics that may be based on different measures. Insight into an issue typically requires statistical analysis of metrics over time and is trend-based or limit-based, as follows:


Trend-based metrics are used when expected or planned values change regularly over time. The analysis of a trend-based metric involves determining whether the performance implied in the trend is achievable.
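Both analysis styles can be sketched in a few lines. The control limits and the sample defect series here are illustrative assumptions:

```python
def out_of_limits(values, lower, upper):
    """Limit-based analysis: indices of the periods whose metric falls
    outside the agreed control limits."""
    return [i for i, v in enumerate(values) if not lower <= v <= upper]

def average_trend(values):
    """Trend-based analysis: mean period-to-period change, used to judge
    whether the performance implied by the trend is achievable."""
    deltas = [b - a for a, b in zip(values, values[1:])]
    return sum(deltas) / len(deltas)

defects_per_ksloc = [42, 38, 35, 31, 52, 28]  # illustrative monthly series
print(out_of_limits(defects_per_ksloc, lower=25, upper=45))  # [4]
print(average_trend(defects_per_ksloc))
```

The spike at index 4 is exactly the kind of out-of-limit value that would be reviewed for variance, while the negative average trend suggests quality is improving overall.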

(Figure: Defects per KSLOC plotted against Reviews/Inspections.)

Figure 10 Defect Analysis

4.2.7.2 Reporting

Reporting integrated engineering metrics is the final step in making quantitative management decisions and communicating them to project team members, management, and customers. Reporting and reviewing metrics should be integrated into the management process and should occur as soon as possible after analysis has been completed, to assure that there is time for corrective action. These metrics should be reviewed by executive management, project sponsors, and program/project manager(s) at key cost control points during the project. Any metric falling outside the control limits is reviewed for variance. Corrective actions are recorded and tracked to closure.

(Figure 10 plots Number of Defects against Time, weeks W1 through W19, broken out by severity: Hardware, Minor, Major, Critical.)


4.3 Lessons Learned

Many organizations have implemented measurement programs and detailed the results and lessons learned as a result of the implementation. The community has collected these and established a common understanding of best practices for starting a measurement program.

Getting Started | Using Measurement Results

Ensure that everyone in the organization understands both the capabilities and limitations of the measurement process.

Do not allow anyone in the organization to use measurement to evaluate individual or workgroup performance.

Start small. Implement only a few measures to address key issues, and show how the measurement results support both individual and management objectives.

Make the measurement data and information available to everyone in the organization. This is a key approach in helping people to actually use the results. If the information is valid, people will find a way to use it.

Ensure that only the required measures are implemented, based on the issues and objectives of the organization. If you don't need the data, don't collect it. The measurement process must be cost-effective to succeed.

Do something early. A considerable amount of meaningful analysis can be performed with a minimal amount of data. Don't wait until all of the data is available to apply it.

Assign a key individual to implement the measurement process. This "measurement analyst" should be an integral part of the program team and should act as the primary interface with the developer with respect to software measurement.

Different levels within the same organization have different information needs. Organization managers may make investment decisions with respect to software process, technology, and tools, while program managers make decisions about specific technologies used to best satisfy program objectives. Organizational issues and objectives do not always equate to those of a specific program.

The program manager should not incur significant costs for the developer to collect software data. The unavailability of data may indicate a low level of maturity in the developer's software process.

Measurement should be made an integral part of the program or organization. Measurement should support the existing management and technical processes. Measurement should not be treated as an "add-on" within the organization.

The measurement process can initially be implemented with basic commercially available database, spreadsheet, word processing, and presentation graphics applications. More advanced tools can be added as required.

The program manager must be at least willing to listen to "bad news" resulting from the measurement analysis. Not every analysis result requires action. In some cases the recommended action is not feasible. Measurement is intended to help the program manager make a decision, not make it for him.

All users at all levels must understand what the measurement data represents. This understanding is vital to the proper interpretation of the measurement analysis results.

Management should not try to "influence" the measurement results before they are reported. They should, however, understand how the reported results were arrived at and what they mean with respect to the associated software issues.

Pro-actively use the measurement information to report program status.


Appendix A - Abbreviations

CM - Configuration Management
CMM - Capability Maturity Model
CPU - Central Processing Unit
CSC - Computer Software Component
EAC - Estimate at Completion
EDSI - Equivalent Delivered Source Instructions
ESLOC - (Cost) Equivalent to New Source


Appendix B - References & Resources

Basili, V.R., "Data Collection, Validation and Analysis," in Tutorial on Models and Metrics for Software Management and Engineering, IEEE Catalog No. EHO-167-7, 1980, pp. 30-33.

Basili, V.R., "Quantitative Evaluation of Software Engineering Methodology," Proceedings of the First Pan Pacific Computer Conference, Melbourne, Australia, September 1985.

Basili, V.R., "Software Development: A Paradigm for the Future," Proceedings of the Annual International Computer Software & Applications Conference (COMPSAC), Keynote Address, Orlando, FL (UMIACS-TR-92-96, University of Maryland, College Park, MD), September 1992.

Basili, V.R., Rombach, H.D., "The TAME Project: Towards Improvement-Oriented Software Environments," IEEE Transactions on Software Engineering, vol. SE-14, no. 6, June 1988, pp. 758-773.

Basili, V.R., Selby, R.W., "Data Collection and Analysis in Software Research and Management," Proceedings of the American Statistical Association and Biomeasure Society Joint Statistical Meetings, Philadelphia, PA, August 1984.

Basili, V.R., Weiss, D.M., "A Methodology for Collecting Valid Software Engineering Data," IEEE Transactions on Software Engineering, vol. SE-10, no. 6, November 1984, pp. 728-738.

Boehm, W., Brown, J.R., et al., pp. 592-605.

DeMarco, T., Controlling Software Projects: Management, Measurement and Estimation, Yourdon Press, NY, 1982.

Grady, R.B., Caswell, D.


Humphrey, W., Managing the Software Process, Addison-Wesley Publishing, 1989.

Hussein, A., "Software Measurement Plan for the SENG623 Project," January 1997.

IEEE Transactions on Software Engineering, Vol. 29.

McCall, J.A., Richards, P.K., Walters, G.F., "Factors in Software Quality," Rome Air Development Center, RADC TR-77-369, 1977.

NASA, "Software Engineering."

Paulk, M., "Capability Maturity Model for Software," SEI, February 1993.

Paulk, M., "Key Practices of the Capability Maturity Model," SEI, February 1993.

Rombach, H.D., and V.R. Basili, "Practical benefits of goal-oriented measurement," in Proceedings of the Annual Workshop of the Centre for Software Reliability: Reliability and Measurement, Garmisch-Partenkirchen, Germany: Elsevier, 1990.

Shepperd, M.J., "An empirical study of design measurement," Software Engineering Journal, 5(1), 1990, pp. 3-10.
