
Benchmarking Standard Draft 09/02/2014

International Software Benchmarking Standards Group

(ISBSG)

DRAFT

Author: Anthony L Rollo, Pam Morris, Ewa Wasylkowski, Carol Dekkers, Pekka Forselius

Date: February 2014
Version: 1.0
Reference: 205461.doc

Commercial In Confidence


Contents

1. Purpose
2. Scope
1.1 Tailoring This Standard
1.2 Conformance
1.3 Limitations
2. Definitions
2.1 Benchmark (noun)
2.2 Benchmark (verb)
2.3 Benchmark Analyst
2.4 Benchmark Experience Base
2.5 Benchmark Librarian
2.6 Benchmark Method
2.7 Benchmark Procedure
2.8 Benchmark Process
2.9 Benchmark Process Owner
2.10 Benchmark User
2.11 Performance
2.12 Repository
2.13 Stakeholder
2.14 Type of Benchmark method
2.15 External Benchmarking
2.16 Internal Benchmarking
3. Application of this International Standard
3.1 Purpose and Outcomes of the software benchmarking process
3.2 Overview of this Standard
3.3 Overview of this Standard
4. Description of the Activities
4.1 Establish and sustain benchmark commitment
4.1.1 Accept Requirements
4.1.2 Maintain Requirements
4.1.3 Assign Responsibility
4.1.4 Assign Resources
4.1.5 Management Commitment
4.1.6 Communicate Commitment
4.2 Identify Information Needs
4.2.1 Benchmark information needs
4.2.2 Prioritise Information Needs
4.2.3 Select Information needs
4.3 Determine Questions
4.4 Establish Benchmark parameters
4.4.1 Benchmark Type
4.4.2 Benchmark Scope
4.4.3 Benchmark Frequency
4.5 Plan the Benchmark Process
4.5.1 Describe organisation
4.5.2 Select Measures
4.5.3 Document Measures


4.5.4 Select Benchmark Supplier
4.5.5 Select Benchmark Dataset
4.5.6 Define Procedures
4.5.7 Configuration Management
4.5.8 Evaluating Information Products
4.5.9 Evaluating Benchmark Process
4.5.10 Approving the Benchmark Process
4.5.11 Approval of Planning
4.5.12 Acquire support technologies
4.6 Perform the Benchmark Process
4.6.1 Integrate Procedures
4.6.2 Collect Data
4.6.3 Analyse Data
4.6.4 Communicate Information Products
4.7 Evaluate the Benchmark
4.7.1 Evaluate Measures
4.7.2 Evaluate the benchmark process
4.7.3 Identify potential improvements
5. Informative References
Annex A: Examples (informative)
A.1 Productivity example
A.2 Schedule adherence
Annex B: Process Work Products (informative)
C.1 Timeliness
C.2 Efficiency
C.3 Defect containment
C.4 Stakeholder satisfaction
C.5 Process conformance


Foreword

ISBSG (the International Software Benchmarking Standards Group) is a not-for-profit organisation made up from member organisations. These member organisations are the various national software measurement organisations, including:

• Australian Software Metrics Association - Software Quality Association (ASMA-SQA)

• Chinese Software Benchmarking Standards Group (CSBSG)

• Deutschsprachige Anwendergruppe für Software-Metrik und Aufwandschätzung (DASMA)

• Finnish Software Measurement Association (FiSMA)

• Gruppo Utenti Function Point Italia - Italian Software Metrics Association (GUFPI-ISMA)

• International Function Point Users Group (IFPUG)

• Japanese Function Point Users Group (JFPUG)

• Korean Software Measurement Association (KOSMA)

• National Association of Software and Service Companies (NASSCOM - India)

• Netherlands Software Metrics Users Association (NESMA)

• UK Software Metrics Association (UKSMA)


Introduction

Since the Benchmarking Process is a specific instance of a measurement process, the process described here is a tailoring of the ISO/IEC 15939 Measurement Process standard. It has therefore adopted the structure and major activities described within ISO/IEC 15939, adapted to the specific needs of a benchmarking process.

The benchmarking of software and software related activities could take one of several forms:

• External Benchmarking - the process of continuously comparing and measuring an organisation with business leaders anywhere in the world to gain information to help the organisation take action to improve its performance.

• Peer group benchmarks may also be used within an organisation to allow comparisons between divisions or sites within that organisation.

• Periodic, or Internal, Benchmarking is the process of determining a metric baseline for an organisational or functional unit for the purposes of comparison.

Continual improvement requires change within the organisation. Evaluation of change requires benchmarking of performance and comparison. A benchmark can be used as the vehicle for the performance benchmark and comparison, as well as to provide the impetus to initiate a process improvement initiative. Benchmarks should lead to action, and not be employed purely to accumulate information. Benchmarks should have a clearly defined purpose.

This Benchmarking Standard defines a software benchmark process applicable to all software-related engineering and management disciplines. The process is described through a model that defines the activities of the benchmark process, which are required to adequately specify what information is required, how the measures and analysis results are to be applied, and how to determine if the analysis results are valid. The software benchmark process is flexible, tailorable, and adaptable to address the needs of different users.

Benchmarking can be regarded as a special application of software measurement, in that a benchmark requires some measurement of some aspect(s) of performance. Benchmarking extends the needs of measurement by the requirement to perform comparisons against some repository, either internal or external. For these reasons the ISO/IEC 15939 standard has been utilised in the derivation of this ISBSG standard.


The ISBSG group has developed this first version of the document specifically for the needs of the ISBSG community. The longer-term strategy is to refine the document into a generic framework standard via contribution from the wider community (e.g. benchmarking companies). It is anticipated that the final document may become the basis of a future ISO standard.

This benchmarking standard is applicable to benchmarking any aspect of Information Technology; however, in order to assist understanding, we have provided informative guidance on how to use this standard to benchmark software development productivity, using Functional Size Measurement as the product measure.


1. Purpose

This Benchmarking Standard identifies the required activities and tasks that are necessary to successfully identify, define, select, apply and improve benchmarking for IT Service Management. It also provides standard definitions for benchmarking terms within the IT industry.

This Benchmarking Standard is intended to provide guidance to organisations about issues and considerations for data selection and comparison in IT benchmarking. It will also assist them in being able to interpret the output from a benchmark.

This Benchmarking Standard does not provide an exhaustive catalogue of all possible benchmark types, nor does it provide a recommended set of benchmarks for IT service management. It provides a generic framework to define the most suitable set of benchmark requirements that address specific information needs.

2. Scope …. NOT SURE ABOUT THIS SECTION ….

This Benchmarking Standard is intended to be used by software suppliers and acquirers. Software suppliers include personnel performing management, technical and quality management functions in software development, maintenance, integration and product support organisations. Software acquirers include personnel performing management, technical and quality management functions in software procurement and user organisations.

The following are examples of how this Benchmarking Standard can be used:

• By a supplier, to implement a benchmarking process to address specific project or organisational information requirements.
• By an acquirer (or third-party agents), for evaluating the performance of the supplier's processes and services in the context of a contract to supply software or software related services.
• By an organisation, to be used internally to answer specific information needs.

1.1 Tailoring This Standard …. NOT SURE THIS SHOULD BE HERE ….

This ISBSG Standard contains a set of activities and tasks that result in a benchmarking process that meets the specific needs of software organisations and projects. The tailoring process consists of modifying the non-normative descriptions of the tasks to achieve the purpose of the benchmarking process and to produce the required outcomes in a specific organisational context. The purpose and outcomes specified for the benchmarking process in this Standard must all be satisfied, and all the normative descriptions of the tasks must be satisfied. New activities and tasks not defined in this Benchmarking Standard may be added as part of tailoring.

Throughout this Standard, "shall" is used to express a provision that is binding on the party that is applying this International Standard, "should" to express a recommendation among other possibilities, and "may" to indicate a course of action permissible within the limits of the Standard.


1.2 Con!or"ance %onformance to this Standard is defined as satisfying the purpose allthe normatie clauses within the tasks in clause ./ Any organisationimposing this Standard as a condition of trade is responsi*le forspecifying and making pu*lic all task@specific criteria to *e imposed inconEunction with this Standard/

1.3 Li"itations 6his Benchmarking Standard does not assume or prescri*e anorganisational model for *enchmarking/ 6he user of this Standardshould decide for e;ample whether a separate *enchmark function isnecessary within the organisation whether the *enchmark functionshould *e integrated within an e;isting function such as softwaremetrics or software =uality/ 5r as in many organisations where a

 *enchmark process is regularly inoked eg/ annually or *iannually thenit may *e more economic to rely upon an e;ternal data collection andor *enchmark agency/6his Standard is not intended to prescri*e the name format or e;plicitcontent of the documentation to *e produced from the *enchmarking

 process/ 6he Standard does not imply that documents *e packaged orcom*ined in some fashion/ 6hese decisions are left to the user of theStandard/6he *enchmarking process should *e appropriately integrated with theorganisational =uality system/ All aspects of internal audits and non@conformance reporting will not *e e;plicitly coered in thisInternational Standard as they are assumed to *e in the domain of the=uality system/6his Standard is not intended to conflict with any organisational

 policies standards or procedures that are already in place/ 8oweerany conflict needs to *e resoled and any oerriding conditions and

situations need to *e cited in writing as e;ceptions to the application ofthe International Standard/

2. Definitions

This standard uses many of the terms used in software measurement; in general, only those terms associated with benchmarking have been defined here. The reader is referred to the ISO/IEC 15939 standard for definitions of measurement terms.

2.1 Benchmark (noun)

A value of some measure or derived measure which is indicative of the relationship between an organisational attribute and the values of that attribute maintained in a benchmarking repository.

2.2 Benchmark (verb)

Carrying out the set of processes undertaken to establish the relative value of some organisational attribute with respect to the data repository to be used for comparison purposes.

2.3 Benchmark Analyst

An individual or organisation that is responsible for the planning, performance, evaluation and improvement of a benchmark.


2.4 Benchmark Experience Base

A data store that contains the evaluation of the information products and the benchmark process, as well as any lessons learned during benchmark and analysis.

2.5 Benchmark Librarian

An individual or organisation that is responsible for managing the benchmark data store(s).

2.6 Benchmark Method

A logical sequence of operations, described generically, used in quantifying an attribute with respect to a specified scale (based on the definition in the International Vocabulary of Basic and General Terms in Metrology [13]).

2.7 Benchmark Procedure

A set of operations, described specifically, used in the performance of a particular benchmark according to a given method (International Vocabulary of Basic and General Terms in Metrology [13]).

2.8 Benchmark Process

The process for identifying, defining, selecting, applying and improving software benchmark within an overall project or organisational benchmark structure.

2.9 Benchmark Process Owner

An individual or organisation responsible for the benchmark process.

2.10 Benchmark User

An individual or organisation that uses the information products.

2.11 Performance

A derived measure which gives an indication of some attribute associated with how well, how quickly or how efficiently a product performs or a process is carried out. A typical performance attribute of a product may be quality, while a typical process attribute might be productivity.

2.12 Repository

An organised and persistent collection of multiple data sets that allows for its retrieval, and which is designated for use as the source of comparative measures for the purpose of benchmarking.
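To make the definition concrete, the short Python sketch below shows one way a handful of repository data sets might be held and queried as a source of comparative measures. It is purely illustrative: the record fields (drawn loosely from the dataset attributes listed in 4.5.5), the project values and the use of the median are assumptions, not part of the normative text.

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class ProjectRecord:
    """One data set in a benchmark repository (illustrative fields only)."""
    project_id: str
    industry_sector: str
    platform: str
    functional_size_fp: float  # functional size in function points
    effort_hours: float        # total project work effort

    @property
    def pdr(self) -> float:
        """Project delivery rate: hours per function point."""
        return self.effort_hours / self.functional_size_fp

# A real repository would be a database or an external provider's service;
# a list is enough to show its role as a source of comparative measures.
repository = [
    ProjectRecord("P1", "Banking", "Mainframe", 250, 2500),
    ProjectRecord("P2", "Insurance", "Web", 400, 3200),
    ProjectRecord("P3", "Banking", "Web", 120, 1500),
]

median_pdr = median(r.pdr for r in repository)
print(f"Repository median PDR: {median_pdr:.1f} hours per function point")
```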

2.13 Stakeholder

An individual or organisation that sponsors benchmarking, provides data, is a user of the benchmark results, or otherwise participates in the benchmarking process.


2.14 Type of Benchmark method

The type of benchmark method depends on the nature of the operations used to quantify an attribute. Two types may be distinguished:

• qualitative - quantification involving human judgement or comparison. An example would be rating some aspect in line with a model. Whilst often expressed numerically, these will be ordinal measures arrived at by subjective judgement.
• quantitative - based on numerical rules; may be subjected to arithmetic processes.

2.15 External Benchmarking

The process of continuously comparing and measuring organisational units within an organisation with business leaders anywhere in the world, to gain information to help the organisation take action to improve its performance.

Note: External benchmarks are benchmarks where comparison is drawn between an organisation, or part of an organisation, and either industry average performances or competitor organisations within the same industry. These types of benchmark may be conducted for a variety of reasons, for example to answer the following questions:

• How does the organisation compare to industry "standards"?
• Are we more or less effective in comparison to our competitors?
• Are our processes effective, or do we need to launch an improvement initiative?
• Can we reduce the costs of maintaining our existing portfolio and still provide a sufficient service?
• Is the outsource contract achieving the service level agreed in the contract?
• Are we getting value for money internally or from our suppliers?
• What is the likely ROI if we undertake a process improvement project?


2.16 Internal Benchmarking

The process of determining a metric baseline for an organisational or functional unit, for the purposes of comparison.

Note: Internal benchmarks may be conducted for a variety of reasons; it is important to be clear as to the reason for the conduct of the benchmark, as this will be a major factor in the determination of the type of benchmark to be undertaken. Questions which may be addressed by an internal benchmark are:

• Is our process improvement initiative proving effective? (baselining)
• Is the outsource contract meeting the levels agreed in the contract?
• Are all the divisions and sites in our organisation performing at the same level?
• Has the introduction of a new technology achieved the benefits expected?
• What evidence is there to support the estimates that we are using?

3. Application of this International Standard

This clause presents an overview of the software benchmark process. The objective is to orient the users of this Benchmarking Standard so that they can apply it properly within context.

3.1 Purpose and Outcomes of the software benchmarking process


The purpose of the software benchmarking process defined in this Standard is to collect, analyse and report data relating to the products developed and processes implemented within the organisational unit, to support effective management of the processes, and to objectively demonstrate the comparative performance of these processes.

As a result of successful implementation of the benchmarking process:

• organisational commitment for benchmarking will be established and sustained;
• the information objectives of technical and management processes will be identified;
• an appropriate set of questions, driven by the information needs, will be identified and/or developed;
• benchmark scope will be identified;
• the required performance data will be identified;
• the required performance data will be measured, stored and presented in a form suitable for the benchmark;
• the benchmark outcomes will be used to support decisions and provide an objective basis for communication;
• benchmark activities will be planned;
• opportunities for process improvements will be identified and communicated to the relevant process owner;
• the benchmark process and measures will be evaluated.

The performance measures defined and utilised during the benchmark process should be integrated with the organisation's existing measurement process, which should comply with ISO/IEC 15939:2002.

Note: The purposes for doing the comparison may include:

• Comparing other divisions or sites within your organisation
• Comparison with your closest competitors
• Comparable benchmarking against industry performance averages
• Year-on-year comparisons of the organisation's performance for process improvements
• Obtaining performance measures from completed projects for input into project estimates


3.2 Overview of this Standard

This Benchmarking Standard defines the activities and tasks necessary to implement a benchmarking process. An activity is a set of related tasks that contributes towards achieving the purpose and outcomes of the benchmarking process (see clause 4.1). A task is a well-defined segment of work. Each activity is comprised of one or more normative tasks. This Standard does not specify the details of how to perform the tasks included in the activities.

The benchmarking process consists of the activities illustrated in the process model in Figure 1. The activities are sequenced in an iterative cycle, allowing for continuous feedback and improvement of the benchmark process. The process model in Figure 1 is based upon the Measurement process in ISO/IEC 15939. Within the activities, the tasks are in practice also iterative.

Three activities are considered to be the Core Benchmark Process: Plan the Benchmark Process, Perform the Benchmark Process, and Evaluate and present the benchmark results. These activities mainly address the concerns of the benchmark user. The other activities provide a foundation for the Core Benchmark Process and provide feedback to help sustain benchmark commitment and Evaluate Benchmark Results. Benchmarks should be evaluated in terms of the added value they provide for the organisation, and only deployed where the benefit can be identified. These latter two activities address the concerns of the benchmark process owner.

Figure 1 shows that the Core Benchmark Process is driven by the information needs of the organisation. For each information need, the Core Benchmark Process produces an information product that satisfies the information need. The information product is presented to the organisation as a basis for decision-making.

The link between benchmarks and an information need is described as the benchmark information model in Annex A and illustrated with examples.

The process defined in this Benchmarking Standard includes an evaluation activity (as shown in Figure 1). The intent is to emphasise that evaluation and feedback are an essential component of the benchmark process and should lead to improvements of the benchmark process. Evaluation can be simple and performed in an ad-hoc manner (when capability is low), or it can be quantitative, with sophisticated statistical techniques to evaluate the quality of the benchmark process and its outputs (when capability is high).


At the centre of the cycle is the "benchmark experience base". This is intended to capture information needs from past iterations of the cycle, previous evaluations of information products, and evaluations of previous iterations of the benchmark process. This would include the measures that have been found to be useful in the organisational unit.

The cycle also includes a benchmark repository; this may be incorporated in the benchmark experience base, or may be maintained externally, often by external organisations which provide access to their proprietary benchmark repository and analysis.

No assumptions are made about the nature or technology of the "benchmark repository" or the benchmark experience base, only that it be a persistent storage.

Information products stored in the "benchmark experience base" are expected to be reused in future iterations of the benchmark process.

Since the process model is cyclical, subsequent iterations may only update benchmark products and practices. This Standard does not imply that benchmark products and practices need to be developed and implemented for each instantiation of the process. The wording used in this International Standard adopts the convention that one is implementing the benchmark process for the first time (i.e. the first instantiation). During subsequent instantiations, this wording should be interpreted as updating or changing documentation and current practices.

The typical functional roles mentioned in this Standard are: stakeholder, sponsor, benchmark user, benchmark analyst, benchmark librarian, data provider and benchmark process owner. These are defined in the "Definitions" section of this International Standard.

A number of work products are produced during the performance of the benchmark process. The work products are described in Annex B and mapped to the tasks that produce them.

3.3 Overview of this Standard

In this International Standard, clauses 4.n denote an activity and 4.n.n a task within an activity. Non-normative text is italicised. In addition, informative notes are headed "Note:".


4. Description of the Activities

In implementing a benchmark process in line with this International Standard, the organisational unit shall perform the activities described below. The "Requirements for Benchmark" from the Technical and Management processes trigger the benchmark process.

Note: Benchmarking is an imprecise tool, as it is not possible to find directly comparable:
• Organisations
• Contracts
• Years

Clients, suppliers and benchmarkers should understand benchmarking limitations, to ensure the process scope is designed to provide the most comparable benchmark data possible and that the method and underlying assumptions are transparent and auditable.

As technology changes, the original benchmark measurements may no longer provide suitable comparisons, and there may be a need to re-establish the baseline measurements or the comparison group against which the benchmark is being conducted. It may even be necessary to reconsider the terms of an outsourcing contract, as new technologies may render older agreements unfair to one side or the other. Over time, the 'normal' balance of project types undertaken may alter significantly, with results similar to those described for changing technologies.

Effective benchmarking begins at the contract stage. Due to a 'lag time' factor, benchmark results can be anywhere from six months to a year old.

4.1 Establish and sustain benchmark commitment

This activity consists of the following tasks:

• Accept the requirements for benchmark
• Assign resources
• Establish management commitment
• Communicate to the organisational unit

4.1.1 Accept Requirements

Requirements for the benchmark should be gathered and agreed between all the stakeholders for the benchmark.

4.1.2 Maintain Requirements

The requirements will be recorded in some suitable format and, as changes to these requirements emerge over time, they shall be maintained. It will be necessary to ensure that the impact of any changes to requirements is properly understood by the stakeholders before they can be accepted.

Note: As the benchmark process proceeds over a number of years there may well be changes to the requirements, caused by changes in business information needs or due to advances in technology. These changes may invalidate some of the data and conclusions of preceding benchmarks, and these effects will need to be evaluated to assess their impact.

4.1.3 Assign Responsibility

The sponsor of the benchmark should assign this responsibility.


It should be ensured that competent individuals are assigned this responsibility. Competent individuals may be acquired through transfer, coaching, training, sub-contracting and/or hiring professional benchmarking organisations. Competence includes knowledge of the principles of benchmark, how to collect data, perform data analysis and communicate the information products. At a minimum, competent individuals should be assigned the responsibility for the following typical roles:

• the benchmark user
• the benchmark analyst
• the benchmark librarian

The number of roles shown above does not imply the specific number of people needed to perform the roles. The number of people is dependent on the size and structure of the organisational unit. These roles could be performed by as few as one person for a small project.

4.1.4 Assign Resources

The sponsor of the benchmark should be responsible for ensuring that resources are provided. Resources include funding and staff. Resource allocations may be updated in the course of activity 4.2.

4.1.5 Management Commitment

Commitment should be established when a "Requirement for Benchmark" is defined (see Figure 1). This includes the commitment of resources to the benchmark process and the willingness to maintain this commitment. The organisational unit should demonstrate its commitment through, for example, a benchmark policy for the organisational unit, allocation of responsibility and duties, training, and the allocation of budget and other resources. Commitment may also come in the form of a contract with a customer stipulating that certain measures be used.

4.1.6 Communicate Commitment

This can be achieved, for example, through organisational unit-wide announcements or newsletters.

4.2 Identify Information Needs

4.2.1 Benchmark information needs

Information needs are obtained from the stakeholders in the benchmark. Information needs are based on benchmark goals, constraints, risks and organisational and/or project problems.


The information needs may be derived from the business, organisational, regulatory (such as legal or governmental), product and/or project objectives.

Information needs may address questions such as: "How do I predict the productivity of my planned project?", "How do I evaluate the quality of the software product compared to industry norms?" and "How do I know the cost effectiveness and efficiency of my supplier compared to industry norms?"

Note: In measuring performance in order to make comparisons, it is important to establish what aspects of performance are of interest to the organisation wishing to conduct a benchmark. The implication of the choice of a range of aspects to be measured is also important when utilising the results of a performance benchmark or measurement. Performance measurement may include:

• Measures of productivity
• Measures of time to market (time to deliver)
• Measures of quality
• Measures of cost
• Measures of rework
• Measures of user (customer) satisfaction

Before approaching benchmarking organisations, it is important not only that an organisation decides which aspects of performance are of importance, but also that it defines what is meant by the various terms described above. For example, what is meant by quality: is it the number of defects discovered in a delivered system in the first few months of operation, or should measures of usability, reliability, maintainability and so on be included? When measuring cost, is it simply the cost to develop a system, or should the cost to maintain the system be included?

4.2.2 )rioritiseIn!or"ation Needs

6his prioritisation is normally accomplished *y or inconEunction with the stakeholders/ 5nly a su*set of the initialinformation needs may *e pursued further/ 6his is particularlyreleant if *enchmark is *eing tried for the first time within anorganisational unit where it is prefera*le to start small/

An e;ample of a simple and concrete prioritisation approach isto ask a group of stakeholders to rank the information needs/'or each information need calculate the aerage rank/ 6henorder the aerage ranks/ 6his ordering would proide a

 prioritisation of the information need/
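The average-rank prioritisation described above is simple enough to express directly; the following Python sketch assumes invented stakeholder roles, information needs and ranks purely for illustration.

```python
from statistics import mean

# Each stakeholder ranks the information needs (1 = most important).
rankings = {
    "project manager": {"predict productivity": 1, "compare quality": 2, "supplier efficiency": 3},
    "quality manager": {"predict productivity": 2, "compare quality": 1, "supplier efficiency": 3},
    "IS manager":      {"predict productivity": 2, "compare quality": 3, "supplier efficiency": 1},
}

needs = next(iter(rankings.values())).keys()
average_rank = {need: mean(r[need] for r in rankings.values()) for need in needs}

# Order the average ranks: the lowest average rank is the highest priority.
for need, avg in sorted(average_rank.items(), key=lambda item: item[1]):
    print(f"{need}: average rank {avg:.2f}")
```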

Note: The purpose for which a benchmark is undertaken relates directly to the types of questions set out above, for which answers are sought. However, it must be recognised that the list of questions is not exhaustive and the answers to many other questions may be needed; it is nevertheless important to decide exactly what questions need to be addressed before undertaking a benchmark exercise, and hence to define the purpose of the benchmark.


Possible reasons for undertaking a benchmark are:

• Set a competitive range for a metrics baseline
• Demonstrate ongoing competitiveness and continuous improvement in pricing and service levels
• Identify Process Improvement opportunities
• Identify best practices
• Decision making re: outsourcing
• Establish market position

4.2.3 SelectIn!or"ation needs

 4o assumptions are made a*out the type of documentation/ It can *e on paper or electronic for e;ample/ It is only necessary that thedocumentation is retriea*le/6he selected information needs should *e communicated to all

stakeholders/ 6his is to ensure that they understand why certain data areto *e collected and how they are to *e used/  -xamples of the definition of a measure deri*ed frominformation needs can be found in 3nnex 3 to this document.he reader is also referred to the I!@I- "#$%$&''

 standard for further information on defining measures

4.3 Determine Questions

The information needs previously identified shall be used in determining the questions which need to be answered.

For example, if the information need is to establish the relative productivity of an organisational unit, the questions which need to be answered would be:

1. What is the productivity of the unit?
2. How does it compare with other organisational units?
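For this productivity example, the two questions reduce to a small calculation. The sketch below assumes hours per function point as the productivity measure (as in the informative Annex A) and uses invented figures for the unit and its peers.

```python
# Question 1: what is the productivity of the unit?
# Productivity is expressed here as a project delivery rate (hours per function point).
unit_effort_hours = 5400.0
unit_functional_size_fp = 450.0
unit_pdr = unit_effort_hours / unit_functional_size_fp  # 12.0 hours per function point

# Question 2: how does it compare with other organisational units?
other_units_pdr = {"Unit B": 9.5, "Unit C": 14.0, "Unit D": 11.0}

for name, pdr in other_units_pdr.items():
    relation = "better (lower) than" if unit_pdr < pdr else "no better than"
    print(f"Unit A ({unit_pdr:.1f} h/FP) is {relation} {name} ({pdr:.1f} h/FP)")
```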

4.4 Establish Benchmark parameters

4.4.1 Benchmark Type

Note: Business Need
What are the questions that the business needs to have answered by the benchmark exercise?

Benchmarking Type

Internal
Is an internal benchmark sufficient to answer the questions posed? If so, is it to be undertaken as a year-on-year comparison? In which case, is sufficient data likely to be available in any single year to meet the business objectives? It should be remembered that a sample of one or two performance measurements is unlikely to be a sound basis for comparison; you may require 2-3 years of data before a suitable basis for comparison is available, especially if a range of project types or technologies are to be measured.


Is the internal benchmark being conducted to allow comparison between divisions or sites in the organisation? If that is the case, do the disparate divisions or sites develop the same type of software, utilising the same technologies and approaches? A comparison between modern e-business development and a more traditional legacy-based system is unlikely to provide a useful basis for comparison, as very different performance is to be expected.

External
If an external benchmark is to be conducted, then it is essential to ensure that the scope of the systems being measured is representative of the long-term development profile of the work being measured. Comparing the performance of a help desk in the first year of introduction of a radically new system against industry "standards", where most systems will be mature, is unlikely to reveal worthwhile insights.

It is important that the period over which measurements are taken for the benchmark is comparable to the period of work which forms the bulk of the external benchmark data repository. Thus, to compare three months' maintenance and support effort against a benchmark database which consists of measurements reflecting a whole year's work is liable to lead to misleading comparisons.

It must be established whether the benchmark period is representative of past and future years.

4.4.2 Bench"ar(Scope

6he scope of *enchmark is an organisational unit/ 6his may *e a single proEect a functional area the whole enterprise a single site or a multi@site organisation/ 6his may consist of software proEects or supporting

 processes or *oth/ All su*se=uent *enchmark tasks should *e withinthe defined scope/In addition all stakeholders should *e identified/ 'or e;ample thesemay *e proEect managers the Information Systems manager or thehead of >uality "anagement/ 6he stakeholders may *e internal ore;ternal to the organisational unit/6he scope of the organisational unit can *e identified throughinteriews and the inspection of documentation such as organisationalcharts/

 ote& For example an organisational unit may be the

 3pplications >e*elopment Function Benchmarking this unit is often referred to as an 3>@=benchmark the applications de*elopment function usuallyincludes enhancement pro7ects o*er a certain si;e,maintenance acti*ity will carry out minor enhancementsusually of small duration (typically less than ten days or

 smaller than # function points). <robably the most usefulbenchmark for this area is a pro7ect based benchmark and

 se*eral benchmark pro*iders will undertake this type ofbenchmark. =any pro*iders howe*er ha*e a standard

 product, which they will recommendA this may well be

acceptable as it may include a pro7ect:based element. /owe*er if this is the type of benchmark you belie*e will


However, if this is the type of benchmark you believe will answer the business questions you wish to answer, then check carefully that you will get the benefits you need from a wider benchmark, which will of course provide you with information on other aspects of your organisation's performance, but may increase your monitoring costs for little real benefit.

4.4.3 Benchmark Frequency

Before undertaking a benchmarking exercise it is important to decide at what frequency it will be necessary to carry out further benchmarks. Benchmark providers may suggest standard frequencies, usually annually or biannually. However, you need to decide what intervals meet your needs. This will relate to the questions decided upon during the initial planning stage. Various strategies are available:

• Annually throughout the contract. If you have entered an outsourcing agreement, then it might be wise to have an annual benchmark at the end of each year of the contract. However, remember that the outsourcer will probably not achieve all the benefits in a linear manner. The outsourcer will have to absorb some or all of your staff, so they will need to make some cultural change; in addition, user departments will now be working in a different relationship to the IT supplier than previously. There will be several part-complete projects during the first year, so improvement will be difficult on these projects. Many contracts do not demand a demonstrated improvement in the first year, and some allow the first and second years to form the baseline for improvement, though this can act as a disincentive to making improvements in year one of the contract.


• Prior to contract renewal. It may well be that you are satisfied to have the performance of the outsourcer demonstrated just prior to the end of the contract, when you are considering its renewal. This is probably not a sufficient frequency if you have a contract longer than a couple of years, since if there is no demonstrable improvement you need to understand this well before the end of, say, a five-year contract.

• After a pre-defined period (e.g. second/third year). This is a compromise between the first and second options, and will mean that in a long contract performance is being monitored at several intermediate stages. This strategy is probably adequate for contracts over 6 years in length. It obviates the problem of non-linear improvement, which can create problems in the contract relationship, whilst still allowing an organisation to have demonstrable progress towards the expected benefits.

• Biannual (twice yearly). If you are conducting internal process improvement, then you may well wish to monitor progress at a frequency greater than annually. A benchmark allows you to identify where improvement effort should be focussed. A benchmark carried out at this frequency is likely to be an internal benchmark monitoring your own progress. This can be supplemented by an external benchmark at some lower frequency, say every two years.

4.5 Plan the Benchmark Process

This activity consists of the following tasks:

• Characterise the Organisational unit
• Select measures
• Select the Benchmarking supplier
• Select the Benchmark Dataset

4.5.1 Describe organisation

The Organisational unit shall be explicitly described.

The organisational unit provides the context for the benchmark, and therefore it is important to make explicit this context, the assumptions that it embodies and the constraints that it imposes. Attributes of the organisational unit that are relevant to the interpretation of the information products should be identified. Characterisation can be in terms of organisational processes, interfaces amongst divisions/departments and organisational structure. Processes may be characterised in the form of a descriptive process model.


The organisational unit characterisation should be taken into account in all subsequent activities and tasks.

4.5.2 Select Measures

Candidate measures that allow one or more questions to be answered, and hence the selected information needs to be addressed, shall be identified.

There should be a clear link between the information needs, the questions and the candidate measures. Such a link can be made using the Benchmark Information Model described in Annex A.

New measures should be defined in sufficient detail to allow for a selection decision (task 4.4.3). Other International Standards, e.g. ISO/IEC 9126:1991 and ISO/IEC 14143:1998, describe some commonly used software measures and requirements for their definition. A new measure does not have to be defined anew, but may involve an adaptation of an existing measure.

The selected measures should reflect the priority of the information needs. Further example criteria that may be used for the selection of measures may be found in ISO/IEC 15939:2002.

Measures that have been selected but that have not been fully specified should be developed. This may involve the definition of objective measures, for example a product coupling measure, or subjective measures, such as a user satisfaction questionnaire, to meet new information needs.

It should be noted that context information may need to be considered as well. For example, in some cases measures may need to be normalised. This may require the selection and definition of new measures, such as a nominal measure identifying the programming language used.

4.5.3 oc$"ent%eas$res

An e;ample of a unit of *enchmark is hours per function point dollars per function pointH/6he formal definition descri*es e;actly how the alues are to *ecomputed including input measures and constants for deried measures/

 4ote that such definitions may already e;ist in the Benchmark #;perienceBaseH which should *e consulted/6he method of data collection may *e for e;ample a static analyser adata collection form or a standard =uestionnaire/Anne; A proides guidelines for linking the measures to the informationneeds through the Benchmark Information "odel
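A formal measure definition of this kind can be captured in a simple structured form. The sketch below is one illustrative way to record the "hours per function point" example together with its base measures, derivation rule and data collection method; the field names are assumptions, not a prescribed format.

```python
# A documented measure definition: base measures, derivation rule and
# data collection method recorded together (illustrative structure only).
pdr_definition = {
    "name": "Project Delivery Rate",
    "unit": "hours per function point",
    "base_measures": ["effort_hours", "functional_size_fp"],
    "derivation": "effort_hours / functional_size_fp",
    "data_collection": "project closure data collection form",
}

def project_delivery_rate(effort_hours: float, functional_size_fp: float) -> float:
    """Derived measure as documented above: hours per function point."""
    return effort_hours / functional_size_fp

print(f"{project_delivery_rate(3200, 400):.1f} {pdr_definition['unit']}")  # 8.0 hours per function point
```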

4.5.4 SelectBench"ar( S$pplier 

%riteria for selecting the *enchmarking supplier shall *e defined/

 ote& the decision needs to be made whether to use a benchmark pro*ider. >o they ha*e rele*ant experience in benchmarkingarea5ithin target region5ithin Industry ector 

6he *enchmarking supplier should *e assessed for the "ethodology they propose in the following areas:

• Sampling techni=ues (if proposed)


• Analysis techniques
• Extent to which a customised solution is possible
• Sample findings, reports and conclusions

Logistics:
• Required resource commitments
• Ability to meet the project timetable
• On-site data collection
• Cost

Note: While it is typically a client organisation that negotiates a benchmark provision in an outsourcing contract, costs tend to be borne by both the client and supplier organisations; hence both should be actively involved in the planning activities. Capabilities and costs of benchmarking organisations can differ widely. Client and supplier need to agree on the evaluation criteria between them.

4.5.5 SelectBench"ar( ataset 

%riteria for selecting the *enchmarking dataset shall *e defined ote & It is recommended that the >ataset is from!rganisations of comparable si;e4o*erage of elected =etrics within the dataset need to becomparable with measures pre*iously defined. he following

attributes of the measures should be e*aluated.• egmentation by industry sector, application type and@or

business en*ironment •  <rocess =aturity Ce*els (e.g. 4==I, 4== or pice

assessment le*el)•  <ro7ects profiles (e.g. <ro7ect si;e , pro7ect *olumes,

de*elopment or enhancement or migration, pro7ectduration)

•  >eli*ery =echanisms, (for example <ackage

4ustomi;ation, !pen ource, Bespoke)

• 4urrency of data

• echnology <latforms (e.g. =ainframe, client ser*er, <4 ,

5eb based, multi:tiered etc)

6he *enchmark data set should *e ealuated to ensure it satisfies dataintegrity re=uirements/ ote& he benchmark dataset should be checked to ensure thatadequate data *alidation procedures are performed before data isaccepted.

• Define data collection, analysis, and reporting procedures
• Define criteria for evaluating the information products and the benchmark process
• Review, approve, and staff benchmark tasks
• Acquire and deploy supporting technologies


Information products and evaluation results in the "Benchmark Experience Base" should be consulted during the performance of this activity. Examples of the benchmark planning details that need to be addressed during this activity are described in ….

4.5.6 Define Procedures

Procedures for data collection, storage and verification shall be defined. The procedures should specify how data are to be collected, as well as how and where they will be stored. Data verification may be accomplished through an audit.
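As an illustration of the kind of data verification procedure this task calls for, the following sketch checks each collected record for mandatory fields and plausible value ranges before it is accepted into the data store. The field names and thresholds are assumptions chosen for the example, not requirements of this Standard.

```python
# Minimal data verification sketch: reject records with missing fields or
# implausible values before they enter the benchmark data store.
REQUIRED_FIELDS = ("project_id", "functional_size_fp", "effort_hours")

def verify_record(record: dict) -> list:
    """Return a list of verification findings; an empty list means the record passes."""
    findings = [f"missing field: {f}" for f in REQUIRED_FIELDS if f not in record]
    if not findings:
        if record["functional_size_fp"] <= 0:
            findings.append("functional size must be positive")
        if not (0 < record["effort_hours"] < 200_000):
            findings.append("effort hours outside plausible range")
    return findings

collected = [
    {"project_id": "P10", "functional_size_fp": 320, "effort_hours": 2900},
    {"project_id": "P11", "functional_size_fp": 0, "effort_hours": 450},
]
for rec in collected:
    print(rec["project_id"], verify_record(rec) or "accepted")
```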

Note: Consideration needs to be given to the scope of the data collection, i.e. whether a sample set of projects or applications is selected for benchmarking, versus the entire population within the organisational unit. If a sample set is chosen, then consideration needs to be given to the following:

• Size of the sample sets
• Sample selection technique, e.g. random, representative profiling

Verify that the profile of the data collection set adequately matches the profile for the benchmark data collection set.

Note: For example, if the majority of your development work is small projects, then you should choose a benchmark data set that has sufficient small projects to enable meaningful comparison.

Expecting a single result that is supposed to 'magically' factor in the complexities of the systems development and business environments is naive and a waste of time and money. Some degree of profiling must be undertaken.

Inclusion/exclusion rules for development projects that span the boundaries of the benchmark period must be clearly defined and equitable. (Quantify what is likely to be excluded based upon the rules. Consider assessing % complete.)

You need to decide if reworked functional size will be included in the product output measure, or just the delivered functional size, and ensure that the benchmark dataset has comparable measures.

Verify that the definitions for the base measures and derived measures of the data collection correspond to, or can be derived from, those used for the benchmark data set.

Note: if functional size is used as the base measure of the product delivered, ensure that it has been measured using an equivalent technique. E.g. if the data collection set has had its functional size measured in accordance with an ISO-conformant FSM Method, it is not valid to compare it with a set which has functional size derived from lines of code or estimated from other parameters.

 

Procedures for data analysis and reporting of information products shall be defined. The procedures should specify the data analysis method(s) and format, and the methods for reporting the information products.



Note: Prior to the benchmark proceeding, all parties need to agree on the content, degree of detail, layout, review and approval process for reporting the benchmark information products.

The range of possible statistical tools that would be needed to perform the data analysis should be identified.

4.5.7 Configuration Management

Items such as the raw data, information products and selected information needs should be placed under configuration management. This may be the same configuration management procedure used in other parts of the organisational unit.

4.5.8 Evaluating Information Products

These criteria would allow one to determine whether the data that are needed have been collected and analysed with sufficient quality to satisfy the information needs. The criteria need to be defined at the beginning and act as success criteria.

The criteria need to be defined within the context of the technical and business objectives of the organisational unit. Example criteria for the evaluation of information products are the accuracy of a benchmark procedure and the reliability of a benchmark method. However, it may be necessary to define new criteria and measures for evaluating the information products.

4.5.9 Evaluating the Benchmark Process

The criteria need to be defined within the context of the technical and business objectives of the organisational unit. Examples of such criteria are timeliness and efficiency of the benchmark process. However, it may be necessary to define new criteria and measures for evaluating the benchmark process.

4.5.10 Approving the Benchmark Process

The criteria for approval will be based upon meeting those criteria defined for the evaluation. They should also include criteria such as the process for dispute resolution and the basis upon which any benchmark report will be accepted.

The following are example elements that may be included in a benchmark plan:

• characterization of the organisational unit
• business and project objectives
• prioritized information needs and how they link to the business, organisational, regulatory, product and/or project objectives
• definition of the measures and how they relate to the information needs
• responsibility for data collection and sources of data
• when will the data be collected (e.g. at the end of each inspection) and frequency
• tools and procedures for data collection (e.g. instructions for executing a static analyser)



• data storage
• requirements on data verification
• data entry and verification procedures
• data analysis plan including frequency of analysis and reporting
• any necessary organisational and/or software process changes to implement the benchmark plan
• criteria for the evaluation of the information products
• criteria for the evaluation of the benchmark process
• confidentiality constraints on the data and information products, and the actions/precautions necessary to ensure confidentiality
• schedule and responsibilities for the implementation of the benchmark plan, including pilots and organisational unit-wide implementation
• procedures for configuration management of data, the benchmark experience base and data definitions

4.5.11 Approval of Planning

The benchmark planning tasks constitute all tasks from clause 4.5.1 through 4.5.12. The results of benchmark planning include the data collection procedures, storage, analysis and reporting procedures, evaluation criteria, schedules and responsibilities. Benchmark planning should take into consideration improvements and updates proposed from previous benchmark cycles ('Improvement Actions' in Figure 1) as well as relevant experiences in the 'Benchmark Experience Base'. Criteria such as the feasibility of making changes to existing plans in the short term, the availability of resources and tools for the realisation of changes, and any potential disruptions to projects from which data is collected, should be considered when selecting proposed improvements to implement.

If benchmark planning information already exists, for example from a previous benchmark cycle, then it may only need to be updated as opposed to being 'developed'.

Stakeholders must review and comment on the benchmark planning information. The sponsor of the benchmark will then approve the benchmark planning information. Approval demonstrates commitment to the benchmark.

The benchmark planning information should be agreed to by the management of the organisational unit, and resources allocated. For approval, the planning information may undergo a number of iterations. Note that the benchmark may be piloted on individual projects before committing to organisation-wide use. Therefore, resource availability may be staged.




4.5.12 Acquire Support Technologies

Available supporting technologies shall be evaluated and appropriate ones selected.

Supporting technology may consist of, for example, automated tools and training courses. The types of automated tools that may be needed include graphical presentation tools, data analysis tools and databases. Tools for collecting data, for example static analysers for the collection of product data, may also be required. This may involve the modification and/or extension of existing tools, and the calibration and testing of the tools. Based on the evaluation and selection of supporting technologies, the benchmark planning information may have to be updated.

The selected supporting technologies shall be acquired and deployed.

If the supporting technologies concern the infrastructure for data management, then access rights to the data should be implemented in accordance with organisational security policies and any additional confidentiality constraints.

4.6 Perform the Benchmark Process

This activity consists of the following tasks:

• Integrate procedures
• Collect data
• Present data to the benchmark
• Analyse data
• Evaluate benchmark results

Performing the benchmark process should be done in accordance with the planning information described in clause 4.5.

Information products and evaluation results in the 'Benchmark Experience Base' should be consulted during the performance of this activity.

4.6.1 Integrate Procedures

Data generation and collection shall be integrated into the relevant processes.

Integration may involve changing current processes to accommodate data generation and collection activities. For example, the inspection process may be changed to require that the moderator of an inspection hand over the preparation effort sheets and defect logs to the benchmark librarian at the closure of every inspection. This would then necessitate modifying inspection procedures accordingly. Integration involves a trade-off between the extent of impact on existing processes that can be tolerated and the needs of the benchmark process. The required changes to collect data should be minimized.



The extent of integration varies depending on the type of measures and the information needs. For example, a one-of-a-kind staff morale survey requires little integration. Filling in time sheets at the end of every week requires the staff to keep track of their effort during the week.

The data that need to be collected may include extra measures defined specifically to evaluate the information products, or performance measures to evaluate the benchmark process.

The integrated data collection procedures shall be communicated to the data providers.

This communication may be accomplished during, for example, staff training, an orientation session, or via a company newsletter.

The objective of communicating the data collection procedures is to ensure that the data providers are competent in the required data collection. Competence may be achieved, for example, through training in the data collection procedures. This increases confidence that data providers understand exactly the type of data that are required, the format that is required, the tools to use, when to provide data, and how frequently. For example, the data providers may be trained on how to complete a defect data form, to ensure that they understand the defect classification scheme, and the meanings of different types of effort (such as isolation and correction effort).

Data analysis and reporting shall be integrated into the relevant processes.

Data analysis and reporting may need to be performed on a regular basis. This requires that data analysis and reporting be integrated into the current organisational and project processes as well.

4.6.2 Collect Data

Data shall be generated and collected.

Data may be generated, for example, by a static code analyser that calculates values for product measures every time a module is checked in. Data may be collected, for example, by completing a defect data form and sending it to the benchmark librarian.

The collected data shall be verified.



Ensure that all of the contributing base measures collected cover the same scope.

Note: If collecting functional size and effort or cost, you need to ensure that:

• All effort collected has been expended in functional size generating activities, e.g. exclude effort related to user training when measuring development productivity (see the sketch following this list).
• The scope of the work effort breakdown of the data collected corresponds to the scope of the work effort breakdown of the benchmark data set, e.g. if a supplier only expends development effort post design, then they need to be benchmarked against projects which have collected effort for the same development activities.
• The scope of the effort collected is for the same set of project related activities, e.g. is the effort for administrative work, user participation, quality assurance etc. included in both the collected and benchmarked datasets.
• The effort and costs allocated to a particular organisational unit (e.g. project) have actually been expended on the organisational unit for which they have been recorded.
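A minimal sketch of the first of these checks is shown below: collected effort records are restricted to an agreed set of size-generating activities before any productivity figure is derived. The activity names and hours are illustrative assumptions.

```python
# Sketch: restrict collected effort to the activity scope agreed for the
# benchmark before deriving productivity. The agreed scope must mirror the
# work effort breakdown of the benchmark data set.

IN_SCOPE_ACTIVITIES = {"specify", "design", "build", "test", "implement"}

effort_records = [
    {"project": "X", "activity": "design", "hours": 320},
    {"project": "X", "activity": "build", "hours": 910},
    {"project": "X", "activity": "user_training", "hours": 150},  # out of scope
]

in_scope_hours = sum(r["hours"] for r in effort_records
                     if r["activity"] in IN_SCOPE_ACTIVITIES)
excluded_hours = sum(r["hours"] for r in effort_records
                     if r["activity"] not in IN_SCOPE_ACTIVITIES)

print(f"Effort within benchmark scope: {in_scope_hours} hours")
print(f"Effort excluded from scope: {excluded_hours} hours")
```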

Data verification may be performed by inspecting against a checklist. The checklist should be constructed to verify that missing data are minimal, and that the values make sense. Examples of the latter include checking that a defect classification is valid, or that the size of a component is not ten times greater than all previously entered components. In case of anomalies, the data provider(s) should be consulted and corrections to the raw data made where necessary.

Automated range and type checks may be used. Data verification may also be performed after data have been stored, since errors (for example, data entry errors) may be introduced during the data storage.

Data verification should be the responsibility of the benchmark librarian in conjunction with the data provider(s).
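The following sketch illustrates automated range, type and plausibility checks of the kind described above. The field names (size_fsm, effort_hours) and thresholds are assumptions for illustration only.

```python
# Sketch of automated range, type and plausibility checks on collected data.

def verify_record(record, previous_sizes):
    """Return a list of verification problems found in one collected record."""
    problems = []
    size = record.get("size_fsm")
    effort = record.get("effort_hours")

    if not isinstance(size, (int, float)) or size <= 0:
        problems.append("functional size missing or not a positive number")
    if not isinstance(effort, (int, float)) or effort <= 0:
        problems.append("effort missing or not a positive number")

    # Plausibility: flag a size more than ten times any previously entered one.
    if previous_sizes and isinstance(size, (int, float)) and size > 10 * max(previous_sizes):
        problems.append("size is more than ten times all previously entered entries")

    return problems

records = [
    {"project": "A", "size_fsm": 250, "effort_hours": 1800},
    {"project": "B", "size_fsm": 9999, "effort_hours": -5},
]
seen_sizes = []
for record in records:
    issues = verify_record(record, seen_sizes)
    if issues:
        # Anomalies are referred back to the data provider for correction.
        print(record["project"], "->", issues)
    if isinstance(record.get("size_fsm"), (int, float)):
        seen_sizes.append(record["size_fsm"])
```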

Data shall be stored, including any context information necessary to verify, understand or evaluate the data.

Note that the data store does not have to be an automated tool. It is possible to have a paper-based data store, for example, in the situation where only a handful of measures are collected for a short period of time in a small organisation.



4.6.3 Analyse Data

The collected data shall be analysed.

Data may be aggregated, transformed, or re-coded prior to analysis. Guidance for performing statistical analysis may be found in ISO TR 10017: 1999.

The data analysis results shall be interpreted by the benchmark analyst(s).

The benchmark analyst(s) would be able to draw some initial conclusions based on their results. However, since the analyst(s) may not be directly involved in the technical and management processes, such conclusions need to be reviewed by other stakeholders as well (see 4.6.4).

All interpretations should take into account the context of the measures. The data analysis results make up one or more indicators.

The collected data should be normalised if appropriate.

Note: Data measures need to be normalised prior to benchmarking to ensure comparability. Costs should be normalised to account for variability caused by factors such as salary fluctuations, inflation and currency exchange rates.

If the collected data set was recorded in Adjusted Functional Size and the Benchmark Dataset was in Unadjusted Function Points, then the collected data set would need to have the adjustment reversed.
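As a sketch of such normalisation (illustrative index, exchange-rate and adjustment-factor values only, and assuming an IFPUG-style value adjustment factor where adjusted size equals unadjusted size multiplied by the VAF):

```python
# Sketch of normalising collected data before comparison with the benchmark
# dataset. All numeric values are illustrative assumptions.

def normalise_cost(cost, year_index, base_index, exchange_rate=1.0):
    """Bring a cost to the benchmark's base-year price level and currency."""
    return cost * (base_index / year_index) * exchange_rate

def unadjusted_size(adjusted_fp, vaf):
    """Reverse the value adjustment applied to a functional size."""
    return adjusted_fp / vaf

collected_cost = 480_000   # cost recorded in the collection year and currency
print(normalise_cost(collected_cost, year_index=104.0, base_index=110.0,
                     exchange_rate=1.12))
print(unadjusted_size(adjusted_fp=520, vaf=1.05))   # roughly 495 unadjusted FP
```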

The data analysis results shall be reviewed.

The review is intended to ensure that the analysis was performed, interpreted, and reported properly. It may be an informal 'self review', or a more formal inspection process.

4.6.4 Communicate Information Products

The information products shall be reviewed with the data providers and the benchmark users.

This is to ensure that they are meaningful and, if possible, actionable. Qualitative information should be considered as a support to interpreting quantitative results.

The information products shall be documented.

The information products shall be communicated to the data providers and the benchmark users.

Feedback should be provided to the stakeholders, as well as being sought from the stakeholders. This ensures useful input for evaluating the information products and the benchmark process.



4.7 Evaluate the Benchmark

This activity consists of the following tasks:

• Evaluate measures and the benchmark process
• Identify potential improvements to the benchmark process

4.7.1 Evaluate Measures

The information products shall be evaluated against the specified evaluation criteria, and conclusions on strengths and weaknesses of the information products drawn.

The evaluation of information products may be accomplished through an internal or independent audit. Example criteria for the evaluation of information products, and suitable evaluation criteria, can be found in ISO/IEC 15939.

The inputs to this evaluation are the performance measures, the information products, and the benchmark user feedback.

The evaluation of information products may conclude that some measures ought to be removed, for example, if they no longer meet a current information need.

4.7.2 Evaluate the Benchmark Process

The benchmark process shall be evaluated against the specified evaluation criteria, and conclusions on strengths and weaknesses of the benchmark process drawn.

The evaluation of the benchmark process may be accomplished through an internal or independent audit. The quality of the benchmark process influences the quality of the information products. The inputs to this evaluation are the performance measures, the information products, and the benchmark user feedback.

Lessons learned from the evaluation shall be stored in the 'Benchmark Experience Base'.

Note: If the benchmark had been performed to assess a supplier in an outsourcing contract, then the Benchmark Report may be considered for input into re-calibration of performance targets.

Lessons learned may take the form of strengths and weaknesses of the information products, of the benchmark process, of the evaluation criteria themselves, and/or experiences in benchmark planning (for example, "there was great resistance by the data providers in collecting a specific measure at a specific frequency").

4.7.3 Identify Potential Improvements

Potential improvements to the benchmark process shall be identified.

Such 'Improvement Actions' should be used in future instances of the 'Plan the Benchmark Process' activity.



The costs and benefits of potential improvements should be considered when selecting the 'Improvement Actions' to implement. It should be noted that making a particular improvement may not be cost effective, or the benchmark process may be good as it is, and therefore no potential improvements may be identified.

Potential improvements to the benchmark process shall be communicated to the benchmark process owner and other stakeholders.

This would allow the benchmark process owner to make decisions about potential improvements to the benchmark process. If no potential improvements are identified in this clause, then it should be communicated that there were no potential improvements.

5. Informative References

ISO/IEC 2382-1: 1993 Data Processing - Vocabulary - Part 1: Fundamental Terms.

ISO/IEC 2382-20: 1990 Information Technology - Vocabulary.

ISO 8402: 1994 Quality management and quality assurance - Vocabulary.

ISO 9001: 1994 Quality Systems - Models for quality assurance in design/development, production, installation and servicing.

ISO/IEC 12207: 1995 Information Technology - Software Life Cycle Processes.

ISO/IEC 9126: 1991 Information Technology - Software Product Evaluation - Quality Characteristics and Guidelines for their Use.

ISO/IEC 14143-1: 1998 Information Technology - Software Measurement - Definition of Functional Size Measurement.

ISO/IEC 14598-1: 1999 Information Technology - Software Product Evaluation - Part 1: General Overview.

ISO/IEC TR 15504-2: 1998 Information Technology - Software Process Assessment - Part 2: A Reference Model for Processes and Process Capability.

ISO/IEC TR 15504-9: 1998 Information Technology - Software Process Assessment - Part 9: Vocabulary.

ISO International Vocabulary of Basic and General Terms in Metrology, 1993.

ISO TR 10017: 1999 Guidance on Statistical Techniques for ISO 9001: 1994.

F. Roberts: Measurement Theory with Applications to Decisionmaking, Utility and the Social Sciences. Addison-Wesley, 1979.



Annex A Examples (informative)

The following subsections provide examples of instantiations of the model that address specific information needs. These examples are not designed to recommend best benchmark practices, but rather to show the applicability of the benchmark information model in a variety of common situations.

A.1 Productivity example

The decision-maker in this example needs to select a specific productivity level as the basis for project planning. The measurable concept is that productivity is related to effort expended and amount of software produced. Thus effort and code are the measurable entities of concern. This example assumes that the productivity is estimated based on past performance. Thus data for the base measures (numbered entries in the table below) must be collected and the derived measure computed for each project in the data store.

Regardless of how the productivity number is arrived at, the uncertainty inherent in software engineering means that there is a considerable probability that the estimated productivity won't be realized exactly. Estimating productivity based on historical data enables the computation of confidence limits that help to assess how close actual results are likely to come to the estimated value. A small worked sketch of this computation follows the table.

Information Need: Establish average productivity of development projects undertaken in the last year
Measurable Concept: Project productivity
Relevant Entities: Functional Size of projects; Effort expended by projects
(Measurable) Attributes: Functional Size; Timecard entries (recording effort)
Base Measures: Project X Functional Size (e.g. CFSU, FPs); Project X Hours of effort
Measurement Method: ISO/IEC 14143-1 Conformant FSM Method; Add timecard entries together for Project X
Type of Method: Objective; Objective
Scale: Integers from minimum FSM for Method to infinity; Real numbers from zero to infinity
Type of Scale: Ordinal; Ratio
Unit of Benchmark: FSM Method Unit; Hour
Derived Measure: Project X Productivity
Function: Divide Project X FSM Units by Project X Hours of Effort
Indicator: Average productivity



"odel %ompute mean and standard deiation of all proEect productiity alues

&ecision %riteria %omputed confidence limits *ased on thestandard deiation indicate the likelihoodthat an actual result close to the aerage

 productiity will *e achieed/ ery wideconfidence limits suggest a potentially largedeparture and the need for inestigation todetermine the reasons for departure/ 6hisshould *e followed *y a plan for action toimproe productiity of proEects
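A minimal sketch of this computation, using illustrative project data that are not drawn from any ISBSG repository, might look as follows:

```python
# Sketch of the productivity example: per-project productivity, the mean and
# standard deviation, and approximate 95% confidence limits.
import math
import statistics

projects = [
    {"name": "P1", "size_fsm": 400, "effort_hours": 3200},
    {"name": "P2", "size_fsm": 250, "effort_hours": 2300},
    {"name": "P3", "size_fsm": 610, "effort_hours": 4400},
    {"name": "P4", "size_fsm": 180, "effort_hours": 1900},
]

# Derived measure: Project X productivity = FSM units / hours of effort.
productivity = [p["size_fsm"] / p["effort_hours"] for p in projects]

mean = statistics.mean(productivity)      # indicator: average productivity
stdev = statistics.stdev(productivity)    # model: mean and standard deviation

# Decision criteria: approximate 95% confidence limits for the mean
# (normal approximation; a t-based interval could be used for small samples).
half_width = 1.96 * stdev / math.sqrt(len(productivity))
print(f"Average productivity: {mean:.3f} FSM units per hour")
print(f"95% confidence limits: {mean - half_width:.3f} to {mean + half_width:.3f}")
```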

A.2 Schedule adherence

The decision-maker in this example needs to evaluate whether or not the rate of progress on a project is sufficient. The measurable concept is that progress is related to the amount of work planned and the amount of work completed. Thus planned work items are the entities of concern. This example assumes that the status (degree of completion) of each unit is reported by the developer assigned to it. Thus data for the base measures (numbered entries in the table below) must be collected and the derived measure computed for each work item in the plan. Since the status of units is a subjective assessment, a simple numerical threshold is used as a decision criterion rather than statistical limits. A small worked sketch follows the table.

Information Need: Determine to what extent projects have adhered to their planned schedules
Measurable Concept: Project schedule lead/lag
Relevant Entities: Planned project schedule; Achieved project turn out
(Measurable) Attributes: Time - months; Time - months
Base Measures: Project X planned scheduled time; Project X actually achieved time to deliver
Measurement Method: Note planned time - in months; Subtract start date from end date - result in elapsed months
Type of Method: Objective; Objective
Scale: Positive real numbers to infinity; Positive real numbers to infinity
Type of Scale: Ratio; Ratio
Unit of Benchmark: Month; Month
Derived Measure: Schedule lag/lead for Project X
Function: Subtract Project X schedule months from Project X actual elapsed months
Indicator: Average schedule deviation
Model: Compute mean and standard deviation of all absolute schedule leads



Decision Criteria: Computed confidence limits based on the standard deviation indicate the likelihood that an actual result close to the average schedule deviation will be achieved. Very wide confidence limits suggest a potentially large schedule deviation and the need for investigation to determine the reasons. This should be followed by a plan for action to reduce schedule deviation of projects.
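A corresponding sketch for the schedule adherence example, again with illustrative data only:

```python
# Sketch of the schedule adherence computations: derive each project's
# schedule lag/lead and summarise the absolute deviations.
import statistics

projects = [
    {"name": "P1", "planned_months": 6.0,  "actual_months": 7.5},
    {"name": "P2", "planned_months": 9.0,  "actual_months": 8.5},
    {"name": "P3", "planned_months": 12.0, "actual_months": 15.0},
]

# Derived measure: schedule lag/lead = actual elapsed months - planned months.
deviations = [p["actual_months"] - p["planned_months"] for p in projects]
abs_deviations = [abs(d) for d in deviations]

print("Schedule lag/lead per project (months):", deviations)
print("Average absolute schedule deviation:", statistics.mean(abs_deviations))
print("Standard deviation of absolute deviations:", statistics.stdev(abs_deviations))
```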

Annex B Process Work Products (informative)

This annex contains a mapping between the work products (WPs) mentioned in this Benchmarking Standard and the activities or tasks that produce them. Note that this annex only presents the final WPs, not all of the intermediate WPs that may need to be produced during the performance of the activities and tasks.

All of the tasks/activities that produce these work products are normative. However, only the work products with an asterisk (*) next to them have normative requirements that they be documented.

This Benchmarking Standard is not intended to prescribe the name, format or explicit content of the documentation to be produced. The International Standard does not imply that documents be stored, packaged or combined in some fashion. These decisions are left to the user of this International Standard.

Work Product – Activity/Task Producing WP

Work Products Produced Externally
Requirements for Benchmark – Technical and Management Processes
Information Needs – Technical and Management Processes
Benchmark Users Feedback – Technical and Management Processes

Work Products Produced by the 'Plan the Benchmark' Process
Characterisation of the Organisational Unit – 4.5.1 Describe Organisational Unit
Selected Information Needs – 4.2.1 Identify benchmark information needs
Definition of selected measures – 4.5.2 Select Measures
Procedures for data collection and storage – 4.5.6 Define data collection and reporting
Configuration Management procedures – 4.5.7 Configuration management
Criteria for the evaluation of information products – 4.5.8 Evaluating information products
Criteria for evaluating the benchmark process – 4.5.9 Evaluating the benchmark process
Approved results of benchmark planning – 4.5.11 Approving the benchmark plan
Selected supporting technologies – 4.5.12 Acquire support technologies

Work Products Produced by the 'Perform the Benchmark' Process
Integrated data collection and storage procedures – 4.6.1 Integrate Procedures
Stored Data – 4.6.2 Collect Data
Data Analysis and interpretations – 4.6.3 Analyse Data
Information products – 4.6.4 Communicate information products

Work Products Produced by the 'Evaluate Benchmark' Process
Benchmark experience data base (update)
Evaluation Results – 4.7 Evaluate Measures / Evaluate the Benchmark Process



Improvement actions – 4.7.3 Identify potential Improvements

Annex C Criteria for evaluating the Benchmark Process

The goodness of a process may be judged by assessing its capability (as described in ISO/IEC TR 15504) or by measuring and evaluating its performance. While this International Standard as a whole may be used as a reference model for assessing the capability of a benchmark process, this section only addresses the evaluation of the performance of the benchmark process.

Below is a set of example criteria that may be used for evaluating the performance of the benchmark process. In some cases the criteria can be used for a quantitative evaluation, and in other situations a qualitative evaluation may be appropriate.

The following criteria may be regarded as potential information needs of the benchmark process owner. The benchmark process described in this Benchmarking Standard may be applied to produce information products that address the information needs identified by the benchmark process owner.

C.1 Timeliness

The benchmark process should provide information products in time to support the needs of the benchmark user. Timeliness may well be critical in providing information products to meet the needs of the sponsors and stakeholders. Untimely information may lead to the benchmark process being viewed as unhelpful or even misleading.

C.2 Efficiency

The benchmark process should not cost more to perform than the value of the information that it provides. The more efficient the process, the lower its cost and the greater the cost/benefit.

C.3 Defect containment

The benchmark process should minimise the introduction of erroneous data and results, while removing any that do get introduced as thoroughly and as soon as possible. Analysis of benchmark data should always take into account, and make explicit, the amount of any unavoidable variation in the data from which the information products are derived.

C.4 Stakeholder satisfaction

The users of information products (stakeholders and sponsors) should be satisfied with the quality of the information products and the performance of the benchmark process in terms of timeliness, efficiency and defect containment. Satisfaction may be affected by the user's expectation of the level of quality and performance to be provided. A high degree of satisfaction is vital if commitment to the benchmark process is to be maintained.

C.5 Process conformance

The execution of benchmark activities should conform to any plans and procedures developed to describe the intended benchmark process. This may be judged by quality management audits or process capability assessments.






Document Control

Change History

Version – Author / Date – QC / Date – Authorised By / Date – Comments
0a – Tony Rollo, Nov 2004 – First DRAFT for review
0.1 – Pam Morris, Jan 2005 – Intermediate draft
0. – Tony Rollo, 2 September 2005 – Final Draft
0. – Carol Dekkers, September 2005 – Re-draft for ISO submission
1.0 – Tony Rollo – Final – Incorporating suggested improvements from IFPUG

Copy Control

Electronic copies: ISBSG Admin Office – Master
Tony Rollo, Pam Morris, Ewa Wasylkowski, Carol Dekkers, Pekka Forselius – Authors
ISBSG Members

File reference: 205461.doc