
University of Sunderland CIFM03 Lecture 12 1

Feasibility

Lecture Twelve

University of Sunderland CIFM03 Lecture 12 2

Is It Feasible?

• Once scope has been defined, with the customer, it is as well to stand back and ask ‘Can this job be done?’

• ‘Not everything imaginable is feasible, not even in software.’ Putnam and Myers

University of Sunderland CIFM03 Lecture 12 3

Four Tests

• Test One: Technology.

• Test Two: Finance.

• Test Three: Time.

• Test Four: Resources.
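One informal way to read these four tests is as a go/no-go screening checklist applied before any detailed planning. The sketch below is only an illustration of that idea; the question wording and the example answers are assumptions added here, not part of the lecture.

```python
# Illustrative sketch only: the four feasibility tests treated as a go/no-go checklist.
# The question wording and example answers are assumptions, not lecture material.
FOUR_TESTS = {
    "technology": "Can the system be built with technology we can actually obtain?",
    "finance":    "Can the customer afford the likely development and running costs?",
    "time":       "Can it be delivered by the date the customer needs it?",
    "resources":  "Do we have, or can we recruit, the people and equipment required?",
}

def is_feasible(answers: dict) -> bool:
    """The job is only feasible if every one of the four tests is passed."""
    return all(answers.get(test, False) for test in FOUR_TESTS)

# Example: failing the time test alone makes the project infeasible as scoped.
print(is_feasible({"technology": True, "finance": True, "time": False, "resources": True}))
```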

University of Sunderland CIFM03 Lecture 12 4

Failures

• Large software projects are notoriously difficult to bring in on time, on budget, and to the standard required.

• But the same story is true in large Engineering projects.

• Human beings are not good at visualising the work involved in such monsters.

University of Sunderland CIFM03 Lecture 12 5

Failures: Home Office Project

• Designed to track details of asylum seekers.

• Development schedule 1996 - 1998.

• Targeted £110M pa staff savings.

University of Sunderland CIFM03 Lecture 12 6

Failures: Performing Rights

• The Performing Right Society spent £20M in 1987 collecting and distributing royalties.

• It decided to create a database to encompass all its functions, and migrate to a new hardware and software platform.

• The system was to be the largest in the world at that time.

University of Sunderland CIFM03 Lecture 12 7

Failures: Performing Rights

• The costs were to be offset by huge staff reductions.

• Late in the project the team found that data on music publishing contracts could not be automatically input to the database, and a two year extension was sought.

University of Sunderland CIFM03 Lecture 12 8

Failures: Benefits Card

• Use magnetic stripe cards in Post Offices to claim benefit.

• Benefit - cut presumed fraud.

• Costs to develop and run: £1,000,000,000.

• Started in 1996, dropped in 1999.

• ‘The complexity and resource requirements had been seriously underestimated.’ National Audit Report

University of Sunderland CIFM03 Lecture 12 9

Failures: Passports

• In July 1996 the Passport Agency decided to replace its 1989 systems.

• The new system was installed in Liverpool in October 98, then in Newport in November 98.

University of Sunderland CIFM03 Lecture 12 10

Failures: Ambulances

• 1991 decision taken to develop a computerised despatch system for London ambulances.

• Jan 92 deadline missed. ‘Full’ system installed Oct 92.

• The system degraded under load, and locked up on 4 Nov 92.

University of Sunderland CIFM03 Lecture 12 11

Failures: Air Traffic Control

• Six years late, £180m over budget. Cost £480m.

• Switched on in January, failed five times by May, and twice in June.

• Exhaustive spec., tackled by IBM 92-94, Loral 94-96, and Lockheed 96-02.

• Yet this is a safety-critical system...

University of Sunderland CIFM03 Lecture 12 12

A Risky Business

• Developing software is not like building a bridge

University of Sunderland CIFM03 Lecture 12 13

Quantifying Risk

• Since risk is part and parcel of every software project, its impact and likelihood need to be quantified.

• Boehm (1989) started the ball rolling, and gave us a list of the top ten problems of the day, together with ways of minimising impact.

University of Sunderland CIFM03 Lecture 12 14

Top Ten Software Risks

Risk item and risk management technique:

1. Insufficient or incompetent staff: Use top talent. Build teams. Key-personnel agreements. Training. Good scheduling.

2. Unrealistic schedules and budgets: Detailed task and cost estimation in network. Design to cost. Incremental development. Software reuse. Requirements sifting.

3. Developing the wrong functionality: Organisation analysis. Scope and spec. review. User surveys. Prototyping. Early user manuals.

4. Developing the wrong user interface: Prototyping. Scenarios. Task analysis. User definition (functionality, style, workload).

5. Gold plating: Requirements sifting. Prototyping. Cost/Benefit analysis. Design to cost.

Boehm, 1989. Software Risk Management

University of Sunderland CIFM03 Lecture 12 15

Top Ten Software Risks (continued)

Risk item and risk management technique:

6. Stream of requirements changes: Make changes formal and difficult. Hide information. Develop in increments (defer changes to later modules).

7. Shortfall in bought-in modules: Benchmarking. Inspections. Check references. Check compatibility.

8. Shortfalls in externally performed tasks: Check references. Visits/Audits. Penalty clauses. Competitive design or prototyping. Teambuilding.

9. Real-time performance shortfalls: Simulation. Benchmarking. Modelling. Prototyping. Instrumentation. Tuning.

10. Pioneering work: Technical analysis. Cost/Benefit. Prototyping.

Boehm, 1989. Software Risk Management

University of Sunderland CIFM03 Lecture 12 16

Your Views in Teams

• How would you rank Boehm’s list?

• Do you have fresh points to add?

University of Sunderland CIFM03 Lecture 12 17

Another Version

• Glass (1998) has a different list:
  – Objectives not fully specified
  – Bad planning and estimating
  – Technology new to the organisation
  – Inadequate project management technique
  – Too few senior client staff on team
  – Poor suppliers of hardware/software

Glass, R.L. (1998) Software Runaways: Lessons Learnt from Massive Software Project Failures. New Jersey: Prentice Hall.

University of Sunderland CIFM03 Lecture 12 18

Research on Failure Causes

• There is some reliable research culled from a very large sample of American software projects.

• One organisation started to survey software projects in 1994, and has amassed a file of 30,000 completed ones.

• Each year it analyses the current crop.


University of Sunderland CIFM03 Lecture 12 19

Standish Group Chaos Report

• 16% success rate in 1994 in the States.

• 24% overall in 1998, but lower for governmental projects. Total cost $75 billion.

• 28% success rate overall in 2000.

• Cost over-runs fell from 189% in 1994, to 69% in 1998, and only 45% in 2000.

http://www.softwaremag.com/archive/2001feb/CollaborativeMgt.html

University of Sunderland CIFM03 Lecture 12 20

Categories

• Standish categorise projects into:

– Successful

– Challenged

– Failed

• Most of us would call a ‘challenged’ project a failure.

University of Sunderland CIFM03 Lecture 12 21

Standish Group

• The Standish Group points out that projects should be limited to six months for six people…

• So does the Agile Alliance, which advocates building software in modules and validating completed work very frequently.

http://www.cio.com/archive/070101/secret_content.html

University of Sunderland CIFM03 Lecture 12 22

Success Factors

• Standish now suggest the following success factors as a guide to good practice.

• Top of the pile is support from senior client management.

• Techniques like PRINCE 2 demand high level user involvement...

University of Sunderland CIFM03 Lecture 12 23

Conclusions: Success Factors

Success factor (points out of 100):

Executive support: 18
User involvement: 16
Experienced project manager: 14
Clear business objectives: 12
Minimised scope: 10
Standard software infrastructure: 8
Firm basic requirements: 6
Formal methodology: 6
Reliable estimates: 5
Other criteria: 5
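Since the points above add up to 100, they can be read as weights. As a rough sketch of my own (not a Standish procedure), a project's 'success potential' could be scored by summing the points for the factors it can actually demonstrate:

```python
# Sketch only: the Standish success factors used as weights (the points sum to 100).
# Scoring a project this way is an illustrative assumption, not a published method.
SUCCESS_FACTOR_POINTS = {
    "Executive support": 18,
    "User involvement": 16,
    "Experienced project manager": 14,
    "Clear business objectives": 12,
    "Minimised scope": 10,
    "Standard software infrastructure": 8,
    "Firm basic requirements": 6,
    "Formal methodology": 6,
    "Reliable estimates": 5,
    "Other criteria": 5,
}

def success_potential(factors_present):
    """Add up the points for every factor the project can demonstrate."""
    return sum(points for factor, points in SUCCESS_FACTOR_POINTS.items()
               if factor in factors_present)

# Example: committed sponsor and involved users, but little else in place.
print(success_potential({"Executive support", "User involvement"}))  # 34 out of 100
```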

University of Sunderland CIFM03 Lecture 12 24

Whither Next?

• Standish suggest that the trend is to micro projects (or modular developments) that use only four people for four months.

• Or you can have massive upheaval projects that take forever, cost the earth, the moon and the stars, and then fail…

University of Sunderland CIFM03 Lecture 12 25

Risk Management

• Introduction
  – Will look at the management of risk during the project.
  – Risks vary in importance.
  – The importance of a particular risk depends on the project.
  – Risk management should reduce the impact of a potential risk.

University of Sunderland CIFM03 Lecture 12 26

Murphy’s Law

• ‘If something can go wrong, it will.’

• The extended law is: ‘If something cannot possibly go wrong, it will, and usually on a Monday.’

University of Sunderland CIFM03 Lecture 12 27

Risk Categories

• Project risk types:
  – Those caused by the inherent difficulties of estimation.
  – Those due to assumptions made during the planning process.
  – Those arising from unforeseen events.

University of Sunderland CIFM03 Lecture 12 28

Risk Categories

• Estimation Errors

• Planning Errors

• Eventualities

University of Sunderland CIFM03 Lecture 12 29

Managing Risk

• There are various models of risk management.

• They are generally similar, and identify two main elements:
  – Risk identification
  – Risk management

• A popular model is the Boehm Risk Engineering Model.

University of Sunderland CIFM03 Lecture 12 30

Managing Risk

[Figure: Boehm’s Risk Engineering model. Risk Engineering divides into Risk Analysis (Risk Identification, Risk Estimation, Risk Evaluation) and Risk Management (Risk Planning and Risk Control, with Risk Control covering Risk Staffing, Risk Monitoring and Risk Directing).]

From Boehm: Tutorial on Software Risk Management, IEEE Computer Society, 1989.

University of Sunderland CIFM03 Lecture 12 31

Reducing Risks

• There are five broad categories for risk reduction:
  – Hazard prevention
  – Likelihood reduction
  – Risk reduction
  – Risk transfer
  – Contingency planning

University of Sunderland CIFM03 Lecture 12 32

Risk Identification

• Identification of the hazards that may affect a project must be the first step in a risk assessment.

• A hazard is an event that, if it occurs, may adversely affect the project.

University of Sunderland CIFM03 Lecture 12 33

Risk Identification

• Checklists are often used to help in identifying hazards

• Knowledge-based software is also available to help with the task of hazard identification

University of Sunderland CIFM03 Lecture 12 34

Risk Identification

• Various categories of risk factors will need to be considered. For software:
  – Application factors
  – Staff factors
  – Project factors
  – Project methods

University of Sunderland CIFM03 Lecture 12 35

Risk Identification

  – Hardware/software factors
  – System changeover factors
  – Supplier factors
  – Environmental and social factors
  – Health and safety factors
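In practice a checklist is just a set of prompt questions grouped under categories like those above. The sketch below is a minimal illustration; the individual prompt questions are assumptions invented for the example, not a standard checklist.

```python
# Minimal illustration of a checklist-driven hazard identification pass.
# Only the category names come from the slides; the prompt questions are invented examples.
HAZARD_CHECKLIST = {
    "Application factors": ["Is the application unusually large or novel for the organisation?"],
    "Staff factors": ["Are key staff unfamiliar with the chosen technology?"],
    "Project factors": ["Has the schedule been imposed rather than estimated?"],
    "Project methods": ["Is the development method new to the team?"],
    "Hardware/software factors": ["Does delivery depend on unproven bought-in components?"],
    "System changeover factors": ["Must existing data be migrated automatically?"],
    "Supplier factors": ["Is a single external supplier on the critical path?"],
    "Environmental and social factors": ["Will the system significantly change users' jobs?"],
    "Health and safety factors": ["Could a failure of the system endanger people?"],
}

def identify_hazards(answers):
    """Every question answered 'yes' flags a candidate hazard for later analysis."""
    return [question
            for questions in HAZARD_CHECKLIST.values()
            for question in questions
            if answers.get(question, False)]
```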

University of Sunderland CIFM03 Lecture 12 36

Risk Analysis

• Once identified, risks should be assessed for their possible effect on the project.

• The level of importance of a risk must also be established.

• This is often done by assessing the risk value.

University of Sunderland CIFM03 Lecture 12 37

Risk Analysis

• Risk impact is estimated in monetary terms.

• Risk likelihood is assessed as a probability, often on a simple scale (say 1-10).

• Risk exposure is therefore an expected cost: exposure = likelihood × impact.

• Ranking schemes can be used to assess impact and likelihood.

University of Sunderland CIFM03 Lecture 12 38

Risk Ranking Table

Hazard                                                       Chance  Impact  Exposure
1. Changes to the requirements specification during coding      1       8        8
2. Specification takes longer than expected                     3       7       21
3. Staff sickness affecting critical path activities            5       7       35
4. Staff sickness affecting non-critical activities            10       3       30
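The exposure column is simply chance multiplied by impact, so hazards can be ranked automatically once those two estimates exist. A minimal sketch, using the figures from the table above (the data representation itself is an assumption):

```python
# Minimal sketch: exposure = chance x impact, then rank hazards by exposure.
# The figures are those in the table above; the representation is an assumption.
hazards = [
    ("Changes to the requirements specification during coding", 1, 8),
    ("Specification takes longer than expected", 3, 7),
    ("Staff sickness affecting critical path activities", 5, 7),
    ("Staff sickness affecting non-critical activities", 10, 3),
]

ranked = sorted(
    ((name, chance * impact) for name, chance, impact in hazards),
    key=lambda item: item[1],
    reverse=True,  # highest exposure first
)

for name, exposure in ranked:
    print(f"{exposure:>3}  {name}")
# Staff sickness on the critical path tops the list with an exposure of 35.
```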

University of Sunderland CIFM03 Lecture 12 39

Risk Analysis

• Managing risk involves the use of two strategies:
  – Reducing the risk exposure
  – Drawing up contingency plans

University of Sunderland CIFM03 Lecture 12 40

Other Factors

• Other factors should be taken into account when prioritising risk management:
  – Confidence of the risk assessment
  – The number of risks
  – Cost of action