
PROJECT MEASUREMENT AND CONTROL


QUALITATIVE AND QUANTITATIVE DATA

Software project managers need both qualitative and quantitative data to make decisions and control software projects: if there are deviations from what is planned, control can be exerted.

Control actions may include:

Extending the schedule

Adding more resources

Using superior resources

Improving software processes

Reducing scope (product requirements)


MEASURES AND SOFTWARE METRICS

Measurements enable managers to gain insight for objective project evaluation.

If we do not measure, judgments and decision making can be based only on our intuition and subjective evaluation.

A measure provides a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or a process.

IEEE defines a software metric as “a quantitative measure of the degree to which a system, component, or process possesses a given attribute”.


MEASURES AND SOFTWARE METRICS

Engineering is a quantitative discipline in which direct measures such as voltage, mass, velocity, or temperature are routinely taken. But unlike other engineering disciplines, software engineering is not grounded in the basic laws of physics. Some members of the software community argue that software is not measurable. There will always be qualitative assessments, but project managers need software metrics to gain insight and control. “Just as temperature measurement began with an index finger...and grew to sophisticated scales, tools, and techniques, so too is software measurement maturing”.


DIRECT AND INDIRECT MEASURES

A direct measure is obtained by applying measurement rules directly to the phenomenon of interest.

For example, by using specified counting rules, a software program’s “Lines of Code” can be measured directly. http://sunset.usc.edu/research/CODECOUNT/

An indirect measure is obtained by combining direct measures.

For example, number of “Function Points” is an indirect measure determined by counting a system’s inputs, outputs, queries, files, and interfaces.
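As an illustration, a minimal sketch of an unadjusted function point count; the average complexity weights are the commonly cited values and the element counts are hypothetical:

```python
# Illustrative sketch: unadjusted function point (FP) count from the five
# countable elements. Weights are the commonly cited "average complexity"
# values; the sample counts below are hypothetical.
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def unadjusted_function_points(counts: dict) -> int:
    """Sum each counted element multiplied by its complexity weight."""
    return sum(AVERAGE_WEIGHTS[name] * n for name, n in counts.items())

if __name__ == "__main__":
    sample = {
        "external_inputs": 12,
        "external_outputs": 8,
        "external_inquiries": 5,
        "internal_logical_files": 4,
        "external_interface_files": 2,
    }
    # 12*4 + 8*5 + 5*4 + 4*10 + 2*7 = 162 unadjusted function points
    print(unadjusted_function_points(sample))
```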


SOFTWARE METRIC TYPES

Product metrics, also called predictor metrics, are measures of the software product and are mainly used to determine product quality attributes such as performance.

Process metrics, also called control metrics, are measures of the software process and are mainly used to determine the efficiency and effectiveness of the process, such as the number of defects discovered during unit testing. They are used for Software Process Improvement (SPI).

Project metrics are measures of effort, cost, schedule, and risk. They are used to assess status of a project and track risks.


SOFTWARE METRIC USAGE

Product, process, and project metrics are collected in a metrics repository and used for:

Project Time

Project Cost

Product Scope/Quality

Risk Management

Future Project Estimation

Software Process Improvement


SOFTWARE METRICS COLLECTION

Very few companies have made a long-term commitment to collecting metrics.

In the 1990s, several large companies such as Hewlett-Packard, AT&T, and Nokia introduced metrics programs.

Most of the focus was on collecting metrics on software program defects and the verification and validation processes.

There is little information publicly available about the current use of systematic software measurement in industry.


SOFTWARE METRICS COLLECTION

Some companies do collect information about their software, such as the number of requirements change requests or the number of defects discovered during testing.

However, it is not clear whether they then use these measurements. The reasons include:

It is impossible to quantify the Return On Investment (ROI) of introducing an organizational metrics program.

There are no standards for software metrics or standardized processes for measurement and analysis.


SOFTWARE METRICS COLLECTION

In many companies, processes are not standardized and are poorly defined.

Much of the research has focused on code-based metrics and plan-driven development processes. However, much software is now developed using ERP packages, COTS products, and agile methods.

Software metrics are the basis of empirical software engineering. This is a research area in which experiments on software systems and the collection of data about real projects are used to form and validate hypotheses about methods and techniques.


REVIEWS AND INSPECTIONS

A group examines part or all of a process or system and its documentation to find potential problems.

Software or documents may be 'signed off' at a review, which signifies that progress to the next development stage has been approved by management.

There are different types of review with different objectives:

Inspections for defect removal (product);

Reviews for progress assessment (product and process);

Quality reviews (product and standards).


REVIEWS AND INSPECTIONS

A group of people carefully examine part or all of a software system and its associated documentation.

Code, designs, specifications, test plans, standards, etc. can all be reviewed.

Software or documents may be 'signed off' at a review which signifies that progress to the next development stage has been approved by management.


REVIEWS AND INSPECTIONS

The review process in agile software development is usually informal. In Scrum, for example, there is a review meeting after each iteration of the software has been completed (a sprint review), where quality issues and problems may be discussed.

In extreme programming, pair programming ensures that code is constantly being examined and reviewed by another team member.

XP relies on individuals taking the initiative to improve and refactor code. Agile approaches are not usually standards-driven, so issues of standards compliance are not usually considered.


REVIEWS AND INSPECTIONS

These are peer reviews where engineers examine the source code of a system with the aim of discovering anomalies and defects.

Inspections do not require execution of a system so may be used before implementation.

They may be applied to any representation of the system (requirements, design, configuration data, test data, etc.).

They have been shown to be an effective technique for discovering program errors.


REVIEWS AND INSPECTIONS

A checklist of common errors should be used to drive the inspection.

Error checklists are programming language dependent and reflect the characteristic errors that are likely to arise in the language.

In general, the 'weaker' the type checking, the larger the checklist.

Examples: Initialization, Constant naming, loop termination, array bounds, etc.


REVIEWS AND INSPECTIONS

Fault class: Data faults
Inspection checks: Are all program variables initialized before their values are used? Have all constants been named? Should the upper bound of arrays be equal to the size of the array or size - 1? If character strings are used, is a delimiter explicitly assigned? Is there any possibility of buffer overflow?

Fault class: Control faults
Inspection checks: For each conditional statement, is the condition correct? Is each loop certain to terminate? Are compound statements correctly bracketed? In case statements, are all possible cases accounted for? If a break is required after each case in a case statement, has it been included?

Fault class: Input/output faults
Inspection checks: Are all input variables used? Are all output variables assigned a value before they are output? Can unexpected inputs cause corruption?


REVIEWS AND INSPECTIONS

Fault class: Interface faults
Inspection checks: Do all function and method calls have the correct number of parameters? Do formal and actual parameter types match? Are the parameters in the right order? If components access shared memory, do they have the same model of the shared memory structure?

Fault class: Storage management faults
Inspection checks: If a linked structure is modified, have all links been correctly reassigned? If dynamic storage is used, has space been allocated correctly? Is space explicitly deallocated after it is no longer required?

Fault class: Exception management faults
Inspection checks: Have all possible error conditions been taken into account?


REVIEWS AND INSPECTIONS

Agile processes rarely use formal inspection or peer review processes.

Rather, they rely on team members cooperating to check each other’s code, and informal guidelines, such as ‘check before check-in’, which suggest that programmers should check their own code.

Extreme programming practitioners argue that pair programming is an effective substitute for inspection as this is, in effect, a continual inspection process.

Two people look at every line of code and check it before it is accepted.


SOFTWARE PRODUCT METRICS

Product metrics fall into two classes:

Dynamic metrics, such as response time, availability, or reliability, are collected by measurements made of a program in execution. These metrics can be collected during testing or after the system has gone into use.

Static metrics are collected by measurements made of representations of the system, such as the design, program, or documentation.


SOFTWARE PRODUCT METRICS

Unfortunately, direct product measurements cannot be made of some quality attributes, such as understandability and maintainability.

Therefore, you have to assume that there is a relationship between the internal attribute you can measure and the quality attribute you are interested in.

Model formulation involves identifying the functional form of the model (linear or exponential) by analysis of collected data, identifying the parameters that are to be included in the model, and calibrating these parameters.
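A minimal sketch of this model-formulation step, assuming a linear relationship between a measurable internal attribute and a quality attribute; the data points below are hypothetical:

```python
# Minimal sketch: calibrate a linear model  defects ≈ a * complexity + b
# from collected data. The data points below are hypothetical.
import numpy as np

complexity = np.array([5, 8, 12, 20, 25, 31], dtype=float)   # internal attribute (measured)
defects    = np.array([1, 2,  3,  6,  7, 10], dtype=float)   # quality attribute (observed)

a, b = np.polyfit(complexity, defects, deg=1)  # identify and calibrate the parameters
print(f"defects ≈ {a:.2f} * complexity + {b:.2f}")

# The calibrated model can then predict the quality attribute for a new module.
print("predicted defects for complexity 15:", a * 15 + b)
```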


SOFTWARE QUALITY ATTRIBUTES

Subjective quality of a software system is largely based on its non-functional system attributes:

Safety, Understandability, Portability, Security, Testability, Usability, Reliability, Adaptability, Reusability, Resilience, Modularity, Efficiency, Robustness, Complexity, Learnability


SOFTWARE QUALITY ATTRIBUTES

The quality factors fall into three categories: Product Operation (correctness, reliability, efficiency, integrity, usability), Product Revision (maintainability, flexibility, testability), and Product Transition (portability, reusability, interoperability).


REQUIREMENTS MODEL METRICS

Technical work in software engineering begins with the creation of the requirements model.

These metrics examine the requirements model with the intent of predicting the size of the resultant system.

Size is an indicator of increased coding, integration, and testing effort.


REQUIREMENTS MODEL METRICS

Function-Based Metrics: Function Point (FP) metric is used to measure the functionality delivered by a software system.

Specification Metrics: Quality of analysis model and its specification such as ambiguity, completeness, correctness, understandability, verifiability, consistency, and traceability.

Although many of these characteristics are qualitative, there are quantitative specification metrics.


REQUIREMENTS MODEL METRICS

Number of total requirements: Nr = Nf + Nnf, where

Nr = total number of requirements

Nf = number of functional requirements

Nnf = number of non-functional requirements

Ambiguity: Q = Nui / Nr, where

Nui = number of requirements for which all reviewers had identical interpretations

The closer the value of Q to 1, the lower the ambiguity of the specification.
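A minimal sketch of these two specification metrics; the review counts below are hypothetical:

```python
# Sketch: total requirements count and the ambiguity metric Q = Nui / Nr.
def total_requirements(n_functional: int, n_non_functional: int) -> int:
    return n_functional + n_non_functional          # Nr = Nf + Nnf

def ambiguity(n_unambiguous: int, n_total: int) -> float:
    return n_unambiguous / n_total                  # Q = Nui / Nr

nr = total_requirements(n_functional=120, n_non_functional=30)   # hypothetical counts
q = ambiguity(n_unambiguous=135, n_total=nr)
print(nr, round(q, 2))   # 150, 0.9 -> Q close to 1, so low ambiguity
```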


REQUIREMENTS MODEL METRICS

The adequacy of use cases can be measured using an ordered triple (Low, Medium, High) to indicate:

Level of granularity (excessive detail) specified

Level in the primary and secondary scenarios

Sufficiency of the number of secondary scenarios in specifying alternatives to the primary scenario

Semantics of analysis UML models such as sequence, state, and class diagrams can also be measured.


OO DESIGN MODEL METRICS

Coupling: Physical connections between elements of OO design such as number of collaborations between classes or the number of messages passed between objects

Cohesion: Cohesiveness of a class is determined by examining the degree to which the set of properties it possesses is part of the problem or design domain.


CLASS-ORIENTED METRICS – THE CK METRICS SUITE

Depth of the inheritance tree (DIT) is the maximum length from the node to the root of the tree. As DIT grows, it is likely that lower-level classes will inherit many methods. This leads to potential difficulties when attempting to predict the behavior of a class, but large DIT values imply that many methods may be reused.

Coupling between object classes (CBO). The CRC model may be used to determine the value for CBO. It is the number of collaborations listed for a class on its CRC index card. As CBO increases, it is likely that the reusability of a class will decrease.
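As a concrete illustration of one of these metrics, a minimal sketch computing DIT from a class hierarchy represented as a child-to-parent map; the hierarchy below is hypothetical:

```python
# Sketch: Depth of Inheritance Tree (DIT) = maximum path length from a class
# to the root of its inheritance hierarchy. The hierarchy below is hypothetical.
PARENTS = {
    "Object": None,
    "Shape": "Object",
    "Polygon": "Shape",
    "Rectangle": "Polygon",
    "Square": "Rectangle",
}

def dit(cls: str) -> int:
    """Count edges on the path from cls up to the root class."""
    depth = 0
    while PARENTS[cls] is not None:
        cls = PARENTS[cls]
        depth += 1
    return depth

print(dit("Square"))   # 4: Square -> Rectangle -> Polygon -> Shape -> Object
print(dit("Object"))   # 0: the root itself
```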


USER INTERFACE DESIGN METRICS

Layout Appropriateness (LA) measures the user’s movements between layout entities such as icons, text, menus, and windows.

Web page metrics include the number of words, links, graphics, colors, and fonts contained within a Web page.


USER INTERFACE DESIGN METRICS

Does the user interface promote usability?

Are the aesthetics of the WebApp appropriate for the application domain and pleasing to the user?

Is the content designed in a manner that imparts the most information with the least effort?

Is navigation efficient and straightforward?

Has the WebApp architecture been designed to accommodate the special goals and objectives of WebApp users, the structure of content and functionality, and the flow of navigation required to use the system effectively?

Are components designed in a manner that reduces procedural complexity and enhances correctness, reliability, and performance?


SOURCE CODE METRICS

Cyclomatic Complexity is a measure of the number of independent paths through the code and measures structural complexity. Values greater than 10-12 are considered too complex: such code is difficult to understand, modify, and test.

C = E – N + 2, where C is the cyclomatic complexity, E is the number of edges, and N is the number of nodes in the control flow graph.
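A minimal sketch applying this formula to a control flow graph given as an edge list; the graph here is hypothetical (one if/else followed by one loop):

```python
# Sketch: cyclomatic complexity C = E - N + 2 for a single connected
# control flow graph, given as a list of (from, to) edges.
edges = [
    ("entry", "if"), ("if", "then"), ("if", "else"),
    ("then", "loop"), ("else", "loop"),
    ("loop", "body"), ("body", "loop"), ("loop", "exit"),
]

nodes = {n for edge in edges for n in edge}
complexity = len(edges) - len(nodes) + 2
print(complexity)   # 8 edges - 7 nodes + 2 = 3 independent paths
```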


SOURCE CODE METRİCS

Length of identifiers is the average length of identifiers (names for variables, methods, etc.). The longer the identifiers, the more likely they are to be meaningful, and thus the more maintainable the code.

Depth of conditional nesting measures the nesting of if-statements. Deeply nested if-statements are hard to understand and potentially error-prone.
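A minimal sketch of the first of these metrics, average identifier length, computed for Python source with the standard-library ast module; the sample source is hypothetical:

```python
# Sketch: average length of identifiers (variable, function, argument names)
# in a piece of Python source, using the standard-library ast module.
import ast

SAMPLE_SOURCE = """
def compute_total_price(unit_price, quantity):
    discount_rate = 0.1
    total = unit_price * quantity
    return total * (1 - discount_rate)
"""

def average_identifier_length(source: str) -> float:
    names = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Name):          # variable references
            names.add(node.id)
        elif isinstance(node, ast.FunctionDef): # function names
            names.add(node.name)
        elif isinstance(node, ast.arg):         # function arguments
            names.add(node.arg)
    return sum(len(n) for n in names) / len(names)

print(round(average_identifier_length(SAMPLE_SOURCE), 1))  # 11.0 for the sample above
```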


PROCESS METRICS

Process metrics are quantitative data about software processes, such as the time taken to perform some process activity.

Three types of process metrics can be collected:

Time taken: time devoted to a process or spent by particular engineers

Resources required: effort, travel costs, or computer resources

Number of occurrences of events: for example, the number of defects discovered, the number of requirements changes, or the number of lines of code modified in response to a requirements change


SOFTWARE QUALITY MANAGEMENT

Quality management is concerned with ensuring that the required level of quality is achieved in a software product.

It has three principal concerns:

At the organizational level, quality management is concerned with establishing a framework.

At the project level, quality management involves the application of specific quality processes and checking that these planned processes have been followed.

At the project level, quality management is also concerned with establishing a quality plan for a project. The quality plan should set out the quality goals for the project and define what processes and standards are to be used.


SOFTWARE QUALITY MANAGEMENT

Quality management provides an independent check on the software development process.

The quality management process checks the project deliverables to ensure that they are consistent with organizational standards and goals.

The quality team should be independent from the development team so that they can take an objective view of the software. This allows them to report on software quality without being influenced by software development issues.


SOFTWARE QUALITY PLANNING

A quality plan sets out the desired product qualities and how these are assessed, and defines the most significant quality attributes.

The quality plan should define the quality assessment process.

It should set out which organizational standards should be applied and, where necessary, define new standards to be used.


SOFTWARE QUALITY PLANNING

Quality plan structure:

Product introduction

Product plans

Process descriptions

Quality goals

Risks and risk management

Quality plans should be short, succinct documents. If they are too long, no one will read them.


SOFTWARE QUALITY

Quality, simplistically, means that a product should meet its specification.

This is problematic for software systems:

There is a tension between customer quality requirements (efficiency, reliability, etc.) and developer quality requirements (maintainability, reusability, etc.);

Some quality requirements are difficult to specify in an unambiguous way;

Software specifications are usually incomplete and often inconsistent.

The focus may be ‘fitness for purpose’ rather than specification conformance.


SOFTWARE QUALITY

Have programming and documentation standards been followed in the development process?

Has the software been properly tested?

Is the software sufficiently dependable to be put into use?

Is the performance of the software acceptable for normal use?

Is the software usable?

Is the software well-structured and understandable?


SOFTWARE STANDARDS

Standards define the required attributes of a product or process. They play an important role in quality management.

Standards may be international, national, organizational or project standards.

Product standards define characteristics that all software components should exhibit, e.g. a common programming style.

Process standards define how the software process should be enacted.


SOFTWARE STANDARDS

Encapsulation of best practices – avoids repetition of past mistakes.

They are a framework for defining what quality means in a particular setting, i.e. that organization’s view of quality.

They provide continuity – new staff can understand the organization by understanding the standards that are used.


SOFTWARE STANDARDS

Product standards / Process standards:

Design review form / Design review conduct

Requirements document structure / Submission of new code for system building

Method header format / Version release process

Java programming style / Project plan approval process

Project plan format / Change control process

Change request form / Test recording process


SOFTWARE PROCESS IMPROVEMENT

Software Process Improvement (SPI) means understanding existing processes and changing these processes to increase product quality or reduce cost and development time.

Typical process measures used for SPI include:

Effort/duration per work product

Number of errors found before release of software

Defects reported by end users

Number of work products delivered (productivity)


SOFTWARE PROCESS IMPROVEMENT

The American engineer W. Edwards Deming pioneered SPI. He worked with the Japanese manufacturing industry after World War II to help improve the quality of manufactured goods.

Deming and several others introduced statistical quality control, which is based on measuring the number of product defects and relating these defects to the process.

The aim is to reduce the number of defects by analyzing and modifying the process.


SOFTWARE PROCESS IMPROVEMENT

However, the same techniques cannot be applied to software development.

The relationship between process quality and product quality is less obvious when the product is intangible and mostly dependent on intellectual processes that cannot be automated.

People’s skills and experience are significant.

Still, for very large projects, the principal factor that affects product quality is the software process.


SOFTWARE PROCESS IMPROVEMENT

SPI encompasses a set of activities that will lead to a better software process and, as a consequence, higher-quality software delivered in a more timely manner.

SPI helps software engineering companies to find their process inefficiencies and to improve them.


SOFTWARE PROCESS IMPROVEMENT

Process measurement: Attributes of the current process are measured. These form a baseline for assessing improvements.

Process analysis: The current process is assessed, and bottlenecks and weaknesses are identified.

Process change: Changes to the process that have been identified during the analysis are introduced. For example, a process change can be better UML tools, improved communications, or changing the order of activities.


SOFTWARE PROCESS IMPROVEMENT

There are two different approaches to SPI:

Process maturity: aimed at plan-driven development; focuses on improving process and project management.

Agile: focuses on iterative development and the reduction of overheads.

SPI frameworks are intended as a means to assess the extent to which an organization’s processes follow best practices and to help identify areas of weakness for process improvement.


CMMI PROCESS IMPROVEMENT FRAMEWORK

There are several process maturity models:

SPICE

ISO/IEC 15504

Bootstrap

Personal Software Process (PSP)

Team Software Process (TSP)

TickIT

SEI CMMI


CMMI PROCESS IMPROVEMENT FRAMEWORK

Capability Maturity Model Integrated (CMMI) framework is the current stage of work on process assessment and improvement that started at the Software Engineering Institute (SEI) in the 1980s.

The SEI’s mission is to promote software technology transfer particularly to US defense contractors.

It has had a profound influence on process improvement: the Capability Maturity Model was introduced in the early 1990s, and the revised maturity framework (CMMI) was introduced in 2001.

CMMI PROCESS IMPROVEMENT FRAMEWORK

CMMI allows a software company’s development and management processes to be assessed and assigned a score.

There are 4 process groups, which include 22 process areas.

These process areas are relevant to software process capability and improvement.


CMMI PROCESS IMPROVEMENT FRAMEWORK

Category: Process management
Process areas: Organizational process definition (OPD); Organizational process focus (OPF); Organizational training (OT); Organizational process performance (OPP); Organizational innovation and deployment (OID)

Category: Project management
Process areas: Project planning (PP); Project monitoring and control (PMC); Supplier agreement management (SAM); Integrated project management (IPM); Risk management (RSKM); Quantitative project management (QPM)


CMMI PROCESS IMPROVEMENT FRAMEWORK

Category: Engineering
Process areas: Requirements management (REQM); Requirements development (RD); Technical solution (TS); Product integration (PI); Verification (VER); Validation (VAL)

Category: Support
Process areas: Configuration management (CM); Process and product quality assurance (PPQA); Measurement and analysis (MA); Decision analysis and resolution (DAR); Causal analysis and resolution (CAR)


CMMI GOALS AND PRACTICES

Process areas are defined in terms of the Specific Goals (SG) and Specific Practices (SP) required to achieve these goals.

SGs describe the characteristics that must be present for the process area to be effective.

SPs refine a goal into a set of process-related activities.


CMMI GOALS AND PRACTICES

For example, for Project Planning (PP), the following SGs and SPs are defined:

SG1: Establish Estimates

SP1.1-1 Estimate the Scope of the Project

SP1.2-1 Establish Estimates of Work Product and Task Attributes

SP1.3-1 Define Project Life Cycle

SP1.4-1 Determine Estimates of Effort and Cost


CMMI GOALS AND PRACTICES

In addition to SGs and SPs, CMMI also defines a set of 5 Generic Goals (GG) and Generic Practices (GP) for each process area.

Each of these 5 GGs corresponds to one of the 5 capability levels.

To achieve a particular capability level, GG for that level and GPs that correspond to that GG must be achieved.


CMMI GOALS AND PRACTICES

For example, for Project Planning (PP), the following GGs and GPs are defined:

GG1: Achieve Specific Goals

GP1.1 Perform Base Practices

GG2: Institutionalize a Managed Process

GP2.1 Establish an Organizational Policy

GP2.2 Plan the Process

GP2.3 Provide Resources

GP2.4 Assign Responsibility

GP2.5 Train People


CMMI GOALS AND PRACTICES

GG2: Institutionalize a Managed Process (continued)

GP2.6 Manage Configurations

GP2.7 Identify and Involve Relevant Stakeholders

GP2.8 Monitor and Control the Process

GP2.9 Objectively Evaluate Adherence

GP2.10 Review Status with Higher-Level Management

GG3: Institutionalize a Defined Process

GP3.1 Establish a Defined Process

GP3.2 Collect Improvement Information


CMMI GOALS AND PRACTICES

GG4: Institutionalize a Quantitatively Managed Process

GP4.1 Establish Quantitative Objectives for the Process

GP4.2 Stabilize Sub-process Performance

GG5: Institutionalize an Optimizing Process

GP5.1 Ensure Continuous Process Improvement

GP5.2 Correct Root Causes of Problems


CMMI MODELS

There are two different CMMI models:

Staged CMMI: Assesses a software company’s maturity on a scale from 1 to 5. Process improvement is achieved by implementing practices at each level and moving from the lower levels to the higher levels. It is used to assess a company as a whole.

Continuous CMMI: Assesses each process area separately and assigns a capability score from 0 to 5. Companies normally operate at different maturity levels for different process areas; a company may be at level 5 for the Configuration Management process but at level 2 for the Risk Management process.


STAGED CMMI

Initial: Essentially uncontrolled.

Managed: Product management procedures defined and used.

Defined: Process management procedures and strategies defined and used.

Quantitatively Managed: Quality management strategies defined and used.

Optimizing: Process improvement strategies defined and used.


STAGED CMMI (LEVELS 1-2)

Level: Managed
Focus: Basic project management
Process areas: Requirements management; Project planning; Project monitoring and control; Supplier agreement management; Measurement and analysis; Process and product quality assurance; Configuration management

Level: Performed
Focus and process areas: (none)


STAGED CMMI (LEVEL 3)

Level: Defined
Focus: Process standardization
Process areas: Requirements development; Technical solution; Product integration; Verification; Validation; Organizational process focus; Organizational process definition; Organizational training; Integrated project management; Integrated supplier management; Risk management; Decision analysis and resolution; Organizational environment for integration; Integrated teaming


STAGED CMMI (LEVELS 4-5)

Level: Quantitatively managed
Focus: Quantitative management
Process areas: Organizational process performance; Quantitative project management

Level: Optimizing
Focus: Continuous process improvement
Process areas: Organizational innovation and deployment; Causal analysis and resolution


CONTINUOUS CMMI

The maturity assessment is not a single value but a set of values showing the organization’s maturity in each area.

It examines the processes used in an organization and assesses their maturity in each process area.

The advantage of the continuous approach is that organizations can pick and choose process areas to improve according to their local needs.
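As an illustration, a minimal sketch of such a per-process-area capability profile; apart from the two example levels mentioned earlier (Configuration Management at level 5, Risk Management at level 2), the values are hypothetical:

```python
# Sketch: a continuous-CMMI capability profile maps each process area to a
# capability level (0-5). Values other than the two examples above are hypothetical.
capability_profile = {
    "Configuration Management": 5,
    "Risk Management": 2,
    "Project Planning": 3,
    "Requirements Management": 4,
}

TARGET_LEVEL = 3
improvement_candidates = [
    area for area, level in capability_profile.items() if level < TARGET_LEVEL
]
print(improvement_candidates)   # ['Risk Management'] -> pick areas by local need
```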


PEOPLE CMM

The People CMM is used to improve the workforce.

It defines a set of 5 organizational maturity levels that provide an indication of the sophistication of workforce practices and processes.

The People CMM complements any SPI framework.


PROJECT METRICS

Unlike software process metrics, which are used for strategic purposes, software project metrics are tactical.

Adjustments to time, cost, and quality are made, and risk management is performed.

As the project proceeds, measures of effort and calendar time expended are compared to original estimates.

Project manager uses these data to monitor and control progress.


PROJECT TRACKING

Binary tracking records status of work packages as 0% or 100% complete.

As a result, progress of a project can be measured as completed or not completed.

Earned Value Management (EVM) measures the progress of a project by combining technical performance, schedule performance, and cost performance.



EARNED VALUE MANAGEMENT

EVM compares PLANNED work to COMPLETED work to determine if work accomplished, cost, and schedule are progressing as planned.

It compares:

the amount of work actually completed and the resources actually consumed at a certain point in a project

TO

the amount of work planned (budgeted) to be completed and the resources planned to be consumed at that same point in the project.


EARNED VALUE MANAGEMENT

Budgeted Cost of Work Scheduled (BCWS): the cost of the work scheduled (planned) to be completed in a certain time period, per the plan. This is also called the PLANNED VALUE (PV).

Budgeted Cost of Work Performed (BCWP): the budgeted cost of the work done up to a defined point in the project. This is also called the EARNED VALUE (EV).

Actual Cost of Work Performed (ACWP): the actual cost of the work done up to a defined point in the project. This is also called the ACTUAL COST (AC).


EARNED VALUE MANAGEMENT

Schedule Variance:

SV = BCWP – BCWS

Schedule Performance Index:

SPI = BCWP / BCWS

Cost Variance:

CV = BCWP - ACWP

Cost Performance Index:

CPI = BCWP / ACWP


EARNED VALUE MANAGEMENT

SV, CV = 0 Project On Budget and Schedule

SV, CV < 0 Over Budget and Behind Schedule

SV, CV > 0 Under Budget and Ahead of Schedule

CPI, SPI = 1 Project On Budget and Schedule

CPI, SPI < 1 Over Budget and Behind Schedule

CPI, SPI > 1 Under Budget and Ahead of Schedule


EARNED VALUE MANAGEMENT

Project description:

We are supposed to build 10 units of equipment.

We are supposed to complete the project within 6 weeks.

We estimated 600 man-hours to complete the 10 units.

It costs us $10/hour to build the equipment.

Our plan:

We are supposed to build 1.67 units each week.

Each unit costs $600.

We will spend $1,000 each week.


EARNED VALUE MANAGEMENT

Project status at the end of week 3:

4 units of equipment completed

400 man-hours spent

How are we doing? Are we ahead of or behind schedule? Are we under or over budget?

Results:

Accomplished work: 4/10 = 40% complete

Schedule: 3/6 = 50% of the schedule elapsed

Budget: 400/600 = 67% of the budgeted hours spent


EARNED VALUE MANAGEMENT

BCWS = (600 man-hours * $10/hour) * (3/6 weeks) = $3,000

BCWP = (600 man-hours * $10/hour) * (4/10 units) = $2,400

ACWP = 400 man-hours * $10/hour = $4,000

The price of the job that we have done is only $2,400 (4 units).

Schedule: in 3 weeks, the price of the job that we should have done was $3,000.

Cost: we spent much more; we spent $4,000.


EARNED VALUE MANAGEMENT

SV = BCWP – BCWS = $2400 - $3000 = -$600

SV is negative; we are behind schedule

CV = BCWP – ACWP = $2400 - $4000 = -$1600

CV is negative; we are over budget

SPI = BCWP / BCWS = $2400 / $3000 = 0.8

SPI is less than 1; we are behind schedule

CPI = BCWP / ACWP = $2400 / $4000 = 0.6

CPI is less than 1; we are over budget
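The same worked example can be reproduced as a short script; a minimal sketch:

```python
# Sketch: the EVM calculations from the worked example above.
BCWS = 600 * 10 * (3 / 6)    # planned value after 3 of 6 weeks     -> $3,000
BCWP = 600 * 10 * (4 / 10)   # earned value for 4 of 10 units built -> $2,400
ACWP = 400 * 10              # actual cost of 400 man-hours         -> $4,000

SV = BCWP - BCWS             # -600  -> behind schedule
CV = BCWP - ACWP             # -1600 -> over budget
SPI = BCWP / BCWS            # 0.8   -> behind schedule
CPI = BCWP / ACWP            # 0.6   -> over budget

print(f"SV={SV:+.0f}  CV={CV:+.0f}  SPI={SPI:.1f}  CPI={CPI:.1f}")
```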


EARNED VALUE MANAGEMENT

Earned Value analysis results are used to predict the future performance of the project:

Budget At Completion (BAC): the total budget (PV or BCWS) at the end of the project. If a project has Management Reserve (MR), it is typically added to the BAC.

Amount expended to date (AC).

Estimated cost To Complete (ETC): ETC = (BAC – EV) / CPI

Estimated cost At Completion (EAC): EAC = ETC + AC
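Continuing the worked example, and assuming BAC is the full planned budget of $6,000 (600 man-hours at $10/hour) with no management reserve, a minimal sketch of the forecasting formulas:

```python
# Sketch: forecasting with the worked example's figures. BAC is assumed to be
# the full planned budget of $6,000 (600 man-hours * $10/hour), no reserve.
BAC = 600 * 10          # budget at completion
EV = 2400               # earned value (BCWP) at week 3
AC = 4000               # actual cost (ACWP) at week 3
CPI = 0.6               # cost performance index from the example

ETC = (BAC - EV) / CPI  # estimated cost to complete the remaining work -> $6,000
EAC = ETC + AC          # estimated cost at completion                  -> $10,000
print(ETC, EAC)
```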


REFERENCES

Roger S. Pressman, Software Engineering: A Practitioner’s Approach, 7th Edition, McGraw-Hill International Edition, 2010. ISBN 978-007-126782-3 (MHID 007-126782-4).

Ian Sommerville, Software Engineering, 9th Edition, Addison-Wesley. ISBN-13: 978-0137035151.