TRANSCRIPT
-
Dynamic Modeling for Project Management
Dan Houston The Aerospace Corporation
18 May 2011
© The Aerospace Corporation 2011
18 May 2011
1
-
Report Documentation Page (Form Approved OMB No. 0704-0188)
1. Report Date: 18 May 2011
3. Dates Covered: 00-00-2011 to 00-00-2011
4. Title and Subtitle: Dynamic Modeling for Project Management
7. Performing Organization Name and Address: The Aerospace Corporation, 2310 E. El Segundo Blvd., El Segundo, CA 90245-4609
12. Distribution/Availability Statement: Approved for public release; distribution unlimited
13. Supplementary Notes: Presented at the 23rd Systems and Software Technology Conference (SSTC), 16-19 May 2011, Salt Lake City, UT. Sponsored in part by the USAF. U.S. Government or Federal Rights License.
16. Security Classification: Report, Abstract, and This Page all unclassified
17. Limitation of Abstract: Same as Report (SAR)
18. Number of Pages: 37
Standard Form 298 (Rev. 8-98), prescribed by ANSI Std Z39-18
-
Agenda
• Defining characteristics of current large product development projects
• Technical demands on project management and limitations of most-used project management techniques
• How dynamic modeling has helped The Aerospace Corporation
2
-
Defining characteristics of large product development projects
• Structural complexity
– Multiplicity of elements (organizations, resources, tasks, etc.)
– Various types of interdependency
• Pooled (resources)
• Sequential (tasks)
• Reciprocal (feedback)
• Uncertainty
– Goal (requirements)
– Methods (processes)
• Tight time constraints
– Underestimation
– Political factors
3
-
State of the Practice in Project Management
• Existing
– Models activities and dependencies
• PERT charts
• Gantt charts
– Resource leveling
• Project management software, e.g., Microsoft Project
– Earned value management
• Lagging indicators of progress
4
-
Challenges to Project Management Current Practice
• Feedback (the test-fix cycle)
• Downstream effects of quality problems
• Effects of waiting for work products and resources
• Intangible factors
– Schedule pressure
– Morale
– Overtime effects
5
-
How dynamic modeling has helped The Aerospace Corporation
• Program office support
– Lessons learned with Dynamic COQUALMO
– Test and fix modeling
– Acquisition planning
– Modeling probabilities of successful completion
• Research
– Study of concurrent processes
6
-
Lessons Learned with Dynamic COQUALMO
7
-
COQUALMO
• Extension of COCOMO II
– Relates defectivity to cost and schedule
– COCOMO II drivers are treated as quality drivers
– Quality measured in counts of non-trivial defects (critical system function impairment or worse)
• Submodels
– Defect introduction
– Defect removal

COCOMO II and COQUALMO were developed at the Center for Systems and Software Engineering of the University of Southern California.
8
-
Defect Introduction Submodel
• Sources of defects: Requirements, Design, and Code

$DI_{source} = DIR_{nom,source} \cdot Size^{B_{source}} \cdot \prod_{i=1}^{21} DefectDriver_{i,source}$

• DI = defects introduced from each source
• DIRnom = nominal defect introduction rate by source
• Size^B = software size raised to scale factor by source
• Defect Drivers in Quality Adjustment Factors (QAFs)
– Example: Analyst Capability (ACAP)
• Defect driver values produced through a two-round Delphi process.

ACAP Level | Requirements | Design | Coding
Very High  | .75          | .83    | .90
High       | .87          | .91    | .95
Nominal    | 1.0          | 1.0    | 1.0
Low        | 1.15         | 1.10   | 1.05
Very Low   | 1.33         | 1.22   | 1.11

9
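As a sketch, the defect introduction equation above reduces to a few lines. The function and variable names are mine, not COQUALMO's; driver values would come from tables like the ACAP table above:

```python
# Sketch of the COQUALMO defect introduction submodel:
# DI_source = DIR_nom * Size^B * product of defect driver values.
def defects_introduced(dir_nom, size, b, driver_values):
    """Defects introduced from one source (requirements, design, or code)."""
    qaf = 1.0
    for v in driver_values:  # one value per defect driver (21 in COQUALMO)
        qaf *= v
    return dir_nom * (size ** b) * qaf

# Illustrative values: nominal rate 10 defects/KSLOC, 50 KSLOC, scale
# factor 1.0, only ACAP off-nominal (High analyst capability, coding = .95).
print(defects_introduced(10, 50, 1.0, [0.95]))  # -> 475.0
```

All numeric inputs here are assumptions for illustration; a real estimate multiplies in all 21 calibrated driver values.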
-
Defect Removal Submodel
• Defect removal activities: peer reviews, automated analysis, testing
$DR_{artifact} = DI_{artifact} \cdot \prod_{i=1}^{3} (1 - DRF_{i,artifact})$

• DR = defects removed from artifact
• DI = defects introduced into each artifact
• DRF = removal fraction for each activity, i, applied to each artifact
• DRF assigned to quality levels of activities in 2-round Delphi
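A minimal sketch of the removal arithmetic, with illustrative DRF values (not calibrated ones). Note that the product of (1 − DRF) terms is the fraction of introduced defects that survive all three activities; defects removed is the complement:

```python
# Sketch of the COQUALMO defect removal submodel (my naming).
def residual_defects(di, drfs):
    """Defects remaining in an artifact after peer review, automated
    analysis, and testing: DI * prod(1 - DRF_i)."""
    remaining = di
    for drf in drfs:
        remaining *= (1.0 - drf)
    return remaining

# 100 defects introduced; removal fractions below are assumed values.
left = residual_defects(100, [0.40, 0.20, 0.50])
removed = 100 - left
print(left, removed)  # 24.0 remain, 76.0 removed
```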
10
-
Model Description: Defect Flows
• Three inflows, one each for requirements, design, code
• Outflow for each review type, automated analysis, and testing phase
• Flows arrayed by interval
11
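The flow structure above can be sketched as a one-stock system-dynamics update. The inflow and removal rates here are invented for illustration, not taken from the model:

```python
# One Euler step of a defect stock: inflows add defects, outflows remove them.
def step(stock, inflows, outflows, dt=1.0):
    """Stock level after one time step of net inflow minus outflow."""
    return stock + dt * (sum(inflows) - sum(outflows))

# Assumed rates: three constant inflows (requirements, design, code defects
# per month) and one removal outflow proportional to the current stock.
stock = 0.0
for month in range(10):
    stock = step(stock, inflows=[5, 8, 12], outflows=[0.3 * stock])
print(round(stock, 1))  # -> 81.0 defects remaining after 10 months
```

The stock approaches the equilibrium 25 / 0.3 ≈ 83.3, where removal balances introduction, which is the qualitative behavior a defect-flow model exhibits.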
-
Major Defect Discovery Profiles for Projects A & C, actual and modeled

[Chart: cumulative defect count (0-3500) vs. project time (0-100 months), showing actual and modeled discovery profiles for Project A and Project C]

12
-
Study of Concurrent Processes
13
"'0 Q)
"'0 0
~ II) .. u
PCON, TOOL& • DOCU
~ f PCON & DOCU QJ PCON .c
§ • • TooL z bocu
PCON, TOOL, DOCU • & RESL • PCON, DOCU & RESL
• RESL
• PMAT
Estimated Additional Cost to Project
• TEAM
(~)AEROSPACE
-
Study of Concurrent Processes
14
[Chart: cumulative defect count (0-1200) vs. project time (0-100 months), comparing the fitted model, an early-revision scenario, and an improved-project scenario]
-
Duration of Test and Fix Cycling
15
-
The difficulty of estimating test duration
• A discovery process (a terrifying adventure into the unknown) versus an insurance process (validating what we already know)
• Some factors in test duration
– Amount of quality-inducing effort applied prior to testing
– Type and complexity of software (including architecture)
– Organizational knowledge of the product
– Organizational discipline (change mgmt, SCM, build planning, etc.)
– Types of testing required (high reliability requirements?)
– Resource constraints (people/facilities)
– Product (software, test cases, test tools) availability
– Duration of individual activities
16
> Many factors contribute to software readiness
-
Previous Approaches to the Question
• Linear estimates
Number of tests × Average time for each test = Total test duration
• Expert judgment plus Monte Carlo sampling

[Chart: probability density over duration, bounded by an optimistic estimate and a pessimistic estimate, peaking at the estimate of most likely duration]

• Software project estimation tools
Development Time = 2.5 × (Effort Applied)^b [months]

Software project type | b
Basic                 | 0.38
Intermediate          | 0.35
Highly-constrained    | 0.32
17
> Each of these methods has peculiar limitations
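Each of the three approaches can be sketched in a few lines. All numeric inputs are illustrative; the Monte Carlo step samples a triangular distribution bounded by the optimistic and pessimistic estimates:

```python
import random

# 1. Linear estimate: number of tests times average time per test.
def linear_estimate(n_tests, avg_hours):
    return n_tests * avg_hours

# 2. Expert judgment plus Monte Carlo sampling of a triangular distribution.
def monte_carlo_estimate(optimistic, likely, pessimistic, n=10_000, seed=1):
    rng = random.Random(seed)
    samples = [rng.triangular(optimistic, pessimistic, likely) for _ in range(n)]
    return sum(samples) / n  # mean of sampled durations

# 3. Schedule equation from the slide: time = 2.5 * effort^b months,
#    with exponent b set by project type.
B = {"basic": 0.38, "intermediate": 0.35, "highly-constrained": 0.32}
def development_time(effort_pm, project_type):
    return 2.5 * effort_pm ** B[project_type]

print(linear_estimate(200, 3))                     # 600 hours
print(round(monte_carlo_estimate(30, 45, 90), 1))  # near (30+45+90)/3 = 55
print(round(development_time(100, "basic"), 1))    # ~14.4 months
```

The triangular mean converges to the average of the three estimates; in practice one would report a percentile of the sampled distribution rather than the mean.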
-
An example test-and-fix (TaF) duration model
18
[Flowchart of the TaF duration model: test cases enter a queue and are sorted by RunsCompleted; TF2 availability, RunsPerTestCase, and TestRunDuration (linearly increasing) are set, with RunsCompleted initialized to 0; each run increments RunsCompleted; if the required test runs are not complete, the case incurs a diagnosis-and-fix delay (Set FixTime) and re-enters the queue; when the test case passes its required runs, its completion time is recorded]
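A deliberately crude back-of-the-envelope version of the model: it assumes every run before the last fails and that fixes serialize on the calendar, whereas the real model queues test cases and samples these quantities. Parameter names are mine:

```python
def taf_duration(test_cases, runs_per_case, run_hours, fix_hours,
                 facility_hours_per_week):
    """Upper-bound sketch of test-and-fix cycling duration in weeks:
    testing hours are stretched by limited facility availability, and
    each run before the last triggers a fix delay on the calendar clock."""
    test_hours = test_cases * runs_per_case * run_hours
    fix_calendar_hours = test_cases * (runs_per_case - 1) * fix_hours
    weeks_testing = test_hours / facility_hours_per_week
    weeks_fixing = fix_calendar_hours / (24 * 7)
    return weeks_testing + weeks_fixing

# Assumed inputs: 50 test cases, 4 runs each, 3-hr runs, 48-hr fixes,
# 60 facility hours per week.
print(round(taf_duration(50, 4, 3, 48, 60), 1))  # -> 52.9 weeks
```

Even this crude version shows the structure that makes linear estimates misleading: fix cycling, not test execution, dominates the total.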
-
Discovering significant factors
• Used a full factorial experiment
– Use constant inputs representing expected operational values
– All combinations of four factors at two levels each (2^4): 16 simulation runs
– Response variable is duration of TaF process

Factor                     | Low Value    | High Value
Test Facility Availability | 60 hrs/week  | 100 hrs/week
Runs per Test Case         | 2            | 8
Test Run Duration          | 2 hrs        | 5 hrs
Fix Time                   | 24 hrs       | 96 hrs
• Analysis of variance
– Calculate percentage contribution to variation in duration from sums of squares

19
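The experiment-and-ANOVA procedure can be sketched end to end. The response function standing in for the TaF simulation is invented, so the percentages it prints are illustrative, not the study's results:

```python
from itertools import product

# 2^4 full factorial over the four factors from the table above.
levels = {
    "availability": (60, 100),  # test facility hrs/week
    "runs":         (2, 8),     # runs per test case
    "run_hours":    (2, 5),     # test run duration, hrs
    "fix_hours":    (24, 96),   # fix time, hrs
}

def response(availability, runs, run_hours, fix_hours):
    # Stand-in for the simulated TaF duration (my invention, for arithmetic).
    return runs * run_hours * 100 / availability + fix_hours / 10

names = list(levels)
table = [(dict(zip(names, combo)), response(*combo))
         for combo in product(*(levels[n] for n in names))]

N = len(table)  # 16 simulation runs
mean = sum(y for _, y in table) / N
total_ss = sum((y - mean) ** 2 for _, y in table)

def main_effect_ss(name):
    """Main-effect sum of squares for a 2-level factor: N * effect^2 / 4."""
    hi = [y for f, y in table if f[name] == levels[name][1]]
    lo = [y for f, y in table if f[name] == levels[name][0]]
    effect = sum(hi) / len(hi) - sum(lo) / len(lo)
    return N * effect ** 2 / 4

for n in names:
    print(f"{n:14s} {100 * main_effect_ss(n) / total_ss:5.1f}% of variation")
# Main effects do not sum to 100%: the remainder is interaction effects.
```

This is exactly the percentage-contribution calculation the slide describes, just driven by a toy response instead of simulation output.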
-
Significant factors

[Bar chart: percentage contribution to variation in TaF duration (0-60%) by factor: A: Runs per Test Case, B: Test Run Duration, C: Fix Time, D: Test Facility Availability. Runs per Test Case and Test Run Duration interact to produce an effect in addition to their individual effects]

20
-
Discovering behavior
• Used an additional full factorial experiment to produce response surfaces
– Focus on Runs per Test Case and Test Run Duration
– Use one Fix Time value and two Test Facility Availability values

Factor                     | Values
Test Facility Availability | 60 and 100 hrs/week
Runs per Test Case         | 2, 4, 6, 8
Test Run Duration          | 2, 3, 4, 5 hrs
Fix Time                   | 7 days
21
-
Behavior: the TF threshold
Factor interaction above the TF full utilization threshold
TF availability moves the threshold
22
-
Modeling a likely scenario and alternatives
• Used likely inputs to estimate the duration of the test-and-fix cycle
Factor | Values | Sample
Test Facility Availability | Both test facilities at 40 hrs/week each | Constant for all simulation runs
Runs per Test Case | (2, .1), (3, .1), (4, .3), (5, .2), (6, .1), (7, .05), (8, .05) | Randomly for each test case in each simulation run
Test Run Duration | Triangular(2, 3.5, 5) hrs | Randomly for each test case in each simulation run
Fix Time | (7, .125), (8, .125), (9, .125), (10, .125), (11, .125), (12, .125), (13, .125), (14, .125) days | Randomly for each test cycle of each test case in each simulation run
• Alternative scenarios
– Additional test facility availability or an additional test facility
– More optimistic Test Run Duration and/or Fix Time

23
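The sampling scheme in the table can be sketched directly with Python's random module. The per-case accounting — every run before the last fails, and facility contention is ignored — is my simplification of the model:

```python
import random

rng = random.Random(7)

def sample_test_case():
    """Hours to complete one test case, sampled per the scenario table."""
    # Runs per test case: discrete distribution (weights as listed on the
    # slide; random.choices normalizes them automatically).
    runs = rng.choices([2, 3, 4, 5, 6, 7, 8],
                       weights=[.1, .1, .3, .2, .1, .05, .05])[0]
    run_hours = rng.triangular(2, 5, 3.5)  # one duration per test case
    hours = runs * run_hours
    for _ in range(runs - 1):              # each failed run triggers a fix
        hours += rng.choice(range(7, 15)) * 24  # fix: 7-14 days, equal odds
    return hours

durations = [sample_test_case() for _ in range(1000)]
mean_hours = sum(durations) / len(durations)
print(round(mean_hours / 24, 1), "days per test case on average")
```

Repeating this over whole simulation runs, and replaying it with the alternative-scenario inputs, yields the comparative duration estimates the slide refers to.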
-
Test case completion times
3 regions: startup, at capacity, clearout
Dominant factor in each region:
Startup: test case availability
At capacity: TF availability
Clearout: Fix Time
24
-
Findings and impacts
• Good, early estimate of test duration
– Eliminates re-planning activities
• Identify the primary factor in test duration
– Focus on the real problem
– Avoid expensive, inconclusive experimentation on the actual system
• Understand system behavior
– Identify test management options
• Staffing levels
• Test facility availability
• Degree of overlap in test and reviews
• Rate of taking test cases into TaF cycle
• Order of test cases
• Backlogging defects
25
> Actions supported by analytic underpinnings of statistical modeling
-
Acquisition Planning
26
-
Acquisition Seat Creation in the Concept Design Center

[Diagram: process and tool(s), repeated for each concept/configuration. Topics and primary choices include: Program Type (ACAT 1); changes to basic Acquisition doctrines and processes (a. standard (large) Air Force program procurements [Standard], b. standard procurements with ability to insert waivers, c. new and special Acquisition doctrines: i. "faster, better, cheaper", ii. urgent needs and "DX" DPAS priority); special considerations (a. sole source procurement [None], b. foreign suppliers); Contracts (FFP); Procurement (Sole Source); Program Structure (DoD 5000.2). Reference documents: 1. DoD 5000.2, 2. JCIDS 3170, 3. FARs, 4. DFARs, 5. Defense Acquisition Guide. Inputs include experience, a preliminary Acquisition Strategy Panel, technical parameters, TRL, and heritage. The CDC Acquisition Seat feeds an Acquisition Model producing: 1. estimate of success of achieving Milestone C, 2. time duration to Milestone C, 3. issues and impediments to program success, 4. time duration/success of alternative approaches, along with a likelihood/consequence risk matrix]

27
-
Concurrent Decision Support at Aerospace
Concept Design Center (CDC) & Concurrent Program Definition Environment (CPDE)*
28
-
Acquisition Enterprise Model
Upper left hand portion of model shown in detail

[Diagram: swim lanes (JCIDS; PPB&EP; DAS, Govt; DAS, Contractor; new lane TBD) spanning pre-MS-A, pre-MS-B, and pre-MS-C phases, with Milestones A, B, and C marked; the upper left portion is the part shown]

29
-
Histograms of Formal Process Time to MS C by ACAT
30
[Histogram: number of programs (0-120) by formal process time to MS C in months, grouped by ACAT]
-
Study of Concurrent Processes
31
-
Phase Relationships Example
32
[Diagram: leading and trailing increments over time, showing work available and inherited defects, rework found in dependent phase, and staff allocation]
-
Rework Implications of Concurrent Engineering
Dynamic Feedback
33
> Potential regression at the “ILA” anchor of the Trailing Increment
-
Duration
Sequential: 45 mo.
Possible 3 mo. duration reduction
34
> Duration risk increases with degree of concurrency
-
Duration in the Larger Staff Scenario
Double staffing
Duration reduced ⅓ to 30 months
Duration risk dramatically reduced

Duration in the Better Quality Scenario
Less defect introduction, better discovery
Duration reduced ~17 months
No duration benefit to concurrency
Duration risk substantially reduced

35
-
Conclusion
36
-
Experiential Lessons from Dynamic Modeling
• Dynamic modeling is useful, often necessary, for
– Gaining insight into the nonlinear processes of programs
– Estimating outcomes
• We are learning to use it in project management
– Asking the right question
– When the cost is justified
– At this point, limited engagements work best
– Can be used to identify dominant process constraints
• Valuable tool for research
– Across projects
– Process concepts

37