University of Southern California
Center for Systems and Software Engineering
Future Challenges for Systems and Software Cost Estimation
and Measurement
Barry Boehm, USC-CSSE
Fall 2011
Summary
• Current and future trends create challenges for systems and software data collection and analysis
– Metrics and “productivity”: “equivalent” size; requirements/design/product/value metrics; productivity growth and decline phenomena
– Cost drivers: effects of complexity, volatility, architecture
– Alternative processes: rapid/agile; systems of systems; evolutionary development
– Model integration: systems and software; cost, schedule, and quality; costs and benefits
• Updated systems and software data definitions and estimation methods needed for good management
– Being addressed in Nov 4-5 workshops
Metrics and “Productivity”
• “Equivalent” size
• Requirements/design/product/value metrics
• Productivity growth phenomena
• Incremental development productivity decline
Size Issues and Definitions
• An accurate size estimate is the most important input to parametric cost models
• Desire consistent size definitions and measurements across different models and programming languages
• The AFCAA Guide sizing chapter addresses these:
– Common size measures defined and interpreted for all the models
– Guidelines for estimating software size
– Guidelines to convert size inputs between models so projects can be represented in a consistent manner
• Using Source Lines of Code (SLOC) as the common measure
– Logical source statements consisting of data declarations and executables
– Rules for considering statement type, how produced, origin, build, etc.
– Providing automated code counting tools adhering to the definition
– Providing conversion guidelines for physical statements
• Addressing other size units such as requirements, use cases, etc.
Equivalent SLOC – A User Perspective*
• “Equivalent”: a way of accounting for the relative work done to generate software, relative to the code-counted size of the delivered software
• “Source” lines of code: the number of logical statements prepared by the developer and used to generate the executing code
– Usual Third Generation Language (C, Java): count logical 3GL statements
– For Model-driven, Very High Level Language, or Macro-based development: count statements that generate customary 3GL code
– For maintenance above the 3GL level: count the generator statements
– For maintenance at the 3GL level: count the generated 3GL statements
• Two primary effects: Volatility and Reuse
– Volatility: % of ESLOC reworked or deleted due to requirements volatility
– Reuse: either with modification (modified) or without modification (adopted)
*Stutzke, Richard D, Estimating Software-Intensive Systems,
Upper Saddle River, N.J.: Addison Wesley, 2005
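As a hedged sketch of these definitions, equivalent size can be computed by weighting reused code by its relative adaptation effort and inflating for requirements volatility (the 20% weight below is the figure used in the later Site Defense example, not a general constant):

```python
def equivalent_sloc(new_sloc, reused_sloc, reuse_factor, revl_pct=0.0):
    """Equivalent SLOC: new code counts at full size; reused code is
    weighted by the relative effort to adapt it (reuse_factor); REVL
    adds the percentage of code reworked or deleted due to
    requirements volatility."""
    esloc = new_sloc + reused_sloc * reuse_factor
    return esloc * (1.0 + revl_pct / 100.0)

# Build 1 of the Site Defense example: 200 KSLOC new plus
# 200 KSLOC reused at a 20% adaptation weight = 240 KSLOC equivalent
print(equivalent_sloc(200_000, 200_000, 0.20))  # 240000.0
```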
“Number of Requirements”
– Early estimation availability at kite level
– Data collection and model calibration at clam level
(Cockburn, Writing Effective Use Cases, 2001)
IBM-UK Expansion Factor Experience
Business Objectives 5 Cloud
Business Events/Subsystems 35 Kite
Use Cases/Components 250 Sea level
Main Steps/Main Operations 2000 Fish
Alt. Steps/Detailed Operations 15,000 Clam
SLOC* 1,000K – 1,500K Lava
(Hopkins & Jenkins, Eating the IT Elephant, 2008)
*(70 – 100 SLOC/Detailed Operation)
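The bottom row follows from simple expansion arithmetic; a sketch, using the table's footnote of 70-100 SLOC per detailed operation:

```python
# IBM-UK expansion-factor arithmetic: 15,000 detailed operations at
# 70-100 SLOC each gives roughly the 1,000K-1,500K SLOC shown above.
detailed_operations = 15_000
sloc_low = 70 * detailed_operations
sloc_high = 100 * detailed_operations
print(sloc_low, sloc_high)  # 1050000 1500000
```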
SLOC/Requirement Data (Selby, 2009)
Estimation Challenges: A Dual Cone of Uncertainty
– Need early systems engineering, evolutionary development
[Figure: relative cost range narrows across phases and milestones, from 4x/0.25x at Feasibility, through Concept of Operation (2x/0.5x), Plans and Rqts./Rqts. Spec. (1.5x/0.67x), Product Design and Detail Design Specs, to 1.25x/0.8x at Development and Test / Accepted Software; the upper cone reflects uncertainties in competition, technology, organizations, and mission priorities; the lower cone reflects uncertainties in scope, COTS, reuse, and services]
Incremental Development Productivity Decline (IDPD)
• Example: Site Defense BMD Software
– 5 builds, 7 years, $100M; operational and support software
– Build 1 productivity over 200 LOC/person-month
– Build 5 productivity under 100 LOC/PM
• Including Build 1-4 breakage, integration, rework
• 318% change in requirements across all builds
• IDPD factor = 20% productivity decrease per build
– Similar trends in later unprecedented systems
– Not unique to DoD: key source of Windows Vista delays
• Maintenance of full non-COTS SLOC, not ESLOC
– Build 1: 200 KSLOC new; 200K reused@20% = 240K ESLOC
– Build 2: 400 KSLOC of Build 1 software to maintain, integrate
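A sketch of the implied per-build decline: treating the decline as a constant percentage per build transition (an assumption), the Build 1 and Build 5 productivities pin down the factor:

```python
def idpd_factor(first_build_prod, last_build_prod, num_builds):
    """Average productivity decline per build transition implied by the
    first- and last-build productivities (LOC/person-month), assuming a
    constant percentage decline each build."""
    transitions = num_builds - 1
    return 1.0 - (last_build_prod / first_build_prod) ** (1.0 / transitions)

# Site Defense: over 200 LOC/PM in Build 1, under 100 LOC/PM in Build 5;
# exactly 200 -> 100 over 4 transitions implies ~16% per build, consistent
# with the reported ~20% given the "over"/"under" qualifiers.
print(round(idpd_factor(200, 100, 5), 3))  # 0.159
```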
IDPD Cost Drivers: Conservative 4-Increment Example
• Some savings: more experienced personnel (5-20%)
– Depending on personnel turnover rates
• Some increases: code base growth, diseconomies of scale, requirements volatility, user requests
– Breakage, maintenance of full code base (20-40%)
– Diseconomies of scale in development, integration (10-25%)
– Requirements volatility; user requests (10-25%)
• Best case: 20% more effort (IDPD = 6%)
• Worst case: 85% (IDPD = 23%)
Effects of IDPD on Number of Increments
[Figure: cumulative KSLOC vs. build (1-8) for 0%, 10%, 15%, and 20% productivity decline per build, from 2M SLOC in Build 1 toward the 8M SLOC target]
• Model relating productivity decline to number of builds needed to reach 8M SLOC Full Operational Capability
• Assumes Build 1 production of 2M SLOC @ 100 SLOC/PM
– 20000 PM/ 24 mo. = 833 developers
– Constant staff size for all builds
• Analysis varies the productivity decline per build
– Extremely important to determine the incremental development productivity decline (IDPD) factor per build
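The chart's message can be reproduced in a few lines; a sketch assuming constant staff and schedule per build, so each successive build delivers the previous build's output reduced by the IDPD factor:

```python
def builds_to_reach(target_sloc, build1_sloc, idpd):
    """Builds needed to reach the target size when each successive
    build's output shrinks by the IDPD factor. The geometric sum must
    exceed the target (build1_sloc / idpd > target_sloc) to terminate."""
    total, build_output, builds = 0.0, float(build1_sloc), 0
    while total < target_sloc:
        total += build_output
        build_output *= (1.0 - idpd)
        builds += 1
    return builds

# 8M SLOC Full Operational Capability from a 2M-SLOC Build 1:
for decline in (0.00, 0.10, 0.15, 0.20):
    print(f"{decline:.0%} decline: {builds_to_reach(8e6, 2e6, decline)} builds")
# -> 4, 5, 6, and 8 builds respectively
```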
Incremental Development Data Challenges
• Breakage effects on previous increments
– Modified, added, deleted SLOC: need Code Count with diff tool
• Accounting for breakage effort
– Charged to current increment or I&T budget (IDPD)
• IDPD effects may differ by type of software
– “Breakage ESLOC” added to next increment
– Hard to track phase and activity distributions
• Hard to spread initial requirements and architecture effort
• Size and effort reporting
– Often reported cumulatively
– Subtracting previous increment size may miss deleted code
• Time-certain development
– Which features completed? (Fully? Partly? Deferred?)
“Equivalent SLOC” Paradoxes
• Not a measure of software size
• Not a measure of software effort
• Not a measure of delivered software capability
• A quantity derived from software component sizes and reuse factors that helps estimate effort
• Once a product or increment is developed, its ESLOC loses its identity
– Its size expands into full SLOC
– Can apply reuse factors to this to determine an ESLOC quantity for the next increment
• But this has no relation to the product’s size
COCOMO II Database Productivity Increases
• Two productivity increasing trends exist: 1970 – 1994 and 1995 – 2009
[Figure: average SLOC per PM in the COCOMO II database by five-year period, 1970-1974 through 2005-2009]
• 1970-1999 productivity trends largely explained by cost drivers and scale factors
• Post-2000 productivity trends not explained by cost drivers and scale factors
Constant A Decreases Over Post-2000 Period
• Calibrate the constant A while holding B fixed at 0.91
• Constant A is the inverse of adjusted productivity
– adjusts the productivity with SF’s and EM’s
• Constant A decreases over the periods
[Figure: calibrated constant A by five-year period, 1970-1974 through 2005-2009, showing about a 50% decrease over the post-2000 period]

PM = A × Size^(B + 0.01 × ΣSF) × ΠEM

• Productivity is not fully characterized by SF’s and EM’s
• 50% decrease over the post-2000 period: what factors can explain the phenomenon?
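For reference, the effort equation being calibrated, with the published COCOMO II.2000 constants as defaults (a sketch; a real calibration would fit A to project data):

```python
def cocomo_ii_effort(ksloc, a=2.94, b=0.91, scale_factors=(), effort_multipliers=()):
    """COCOMO II effort: PM = A * Size^(B + 0.01 * sum(SF)) * prod(EM),
    with Size in KSLOC. A = 2.94 and B = 0.91 are the COCOMO II.2000
    calibration constants; a falling calibrated A means the same driver
    ratings now predict less effort, i.e. higher productivity."""
    exponent = b + 0.01 * sum(scale_factors)
    product = 1.0
    for em in effort_multipliers:
        product *= em
    return a * ksloc ** exponent * product
```

With all-nominal scale factors the SF sum is about 18.97, giving an exponent near 1.10, so effort grows slightly faster than size.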
Candidate Explanation Hypotheses
• Productivity has doubled over the last 40 years
– But scale factors and effort multipliers did not fully characterize this increase
• Hypotheses/questions for explanation
– Is the standard for rating personnel factors being raised?
• E.g., relative to the “national average”
– Was generated code counted as new code?
• E.g., model-driven development
– Was reused code counted as new code?
– Are the ranges of some cost drivers not large enough?
• Improvement in tools (TOOL) only contributes a 20% reduction in effort
– Are more lightweight projects being reported?
• Documentation relative to life-cycle needs
Cost Driver Rating Scales and Effects
• Application Complexity
– Difficulty and Constraints scales
• Architecture, Criticality, and Volatility Effects
– Architecture effects as a function of product size
– Added effects of criticality and volatility
Candidate AFCAA Difficulty Scale
Very Easy: simple handling of events/inputs; relies on O/S or middleware for control; can be restarted with minor inconvenience
Very Challenging: autonomous operation; extremely high reliability; complex algorithms; micro-second control loop; micro-code programming
Example applications along the scale: Firmware (ROM); Attitude Control; Guidance/Navigation; Radar/Sonar/Telemetry processing; Process Control; Seismic Processing; Factory Automation; Digital Switches & PBX; CASE Tools/Compilers; Website Automation; Business Systems
Difficulty would be described in terms of required software reliability, database size, product complexity, integration complexity, information assurance, real-time requirements, different levels of developmental risks, etc.
Candidate AFCAA Constraints Scale
Very Unconstrained: virtually unlimited resources; accessible physical environment; stationary conditions; stable platform
Very Constrained: limited physical volume, power & weight; unmanned; inaccessible physical environment; evolving platform; hard real-time
Hardware platforms along the scale: Unmanned Space Bus; Manned Space Vehicle; Missile; Unmanned Airborne; Airborne; Shipboard; Mobile Ground; Land Site
Dimensions of constraints include electrical power, computing capacity, storage capacity, repair capability, platform volatility, physical environment accessibility, etc.
Added Cost of Weak Architecting
Calibration of the COCOMO II Architecture and Risk Resolution factor to 161 project data points
Effect of Size on Software Effort Sweet Spots
Effect of Volatility and Criticality on Sweet Spots
Estimation for Alternative Processes
• Agile Methods
– Planning Poker/Wideband Delphi
– Yesterday’s Weather Adjustment: Agile COCOMO II
• Evolutionary Development
– Schedule/Cost/Quality as Independent Variable
– Incremental Development Productivity Decline
• Systems of Systems
– Hybrid Methods
Planning Poker/Wideband Delphi
• Stakeholders formulate the story to be developed
• Developers choose and show cards indicating their estimated ideal person-weeks to develop the story
– Card values: 1, 2, 3, 5, 8, 13, 20, 30, 50, 100
• If card values are about the same, use the median as the estimated effort
• If card values vary significantly, discuss why some estimates are high and some low
• Re-vote after discussion
– Generally, values will converge, and the median can be used as the estimated effort
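The convergence rule above can be sketched in a few lines (the 2x spread threshold is an illustrative assumption, not part of the method's definition):

```python
import statistics

CARD_VALUES = (1, 2, 3, 5, 8, 13, 20, 30, 50, 100)

def planning_poker_round(votes, spread_threshold=2.0):
    """One voting round: if the estimates are close (max within
    spread_threshold times min), return the median as the effort
    estimate; otherwise return None to signal discussion and a re-vote."""
    if max(votes) / min(votes) <= spread_threshold:
        return statistics.median(votes)
    return None

print(planning_poker_round([5, 5, 8, 8, 8]))  # 8 -> use as estimate
print(planning_poker_round([2, 5, 30, 50]))   # None -> discuss, re-vote
```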
Agile COCOMO II
Adjusting agile “yesterday’s weather” estimates
Agile COCOMO II is a web-based software cost estimation tool that
enables you to adjust your estimates by analogy through identifying the
factors that will be changing and by how much.
[Tool input form, Step 1: analogy parameter and baseline value (dollars; person-months; dollars/function point; dollars/line of code; function points/person-month; lines of code/person-month; or ideal-person-weeks/iteration); current project size (function points or SLOC); current labor rate; current iteration number]
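A sketch of the analogy adjustment (the function and its parameters are illustrative, not the tool's actual interface): scale yesterday's productivity by the ratio of each changed cost-driver rating, then re-estimate effort for the new size:

```python
def adjusted_estimate(baseline_productivity, new_size, driver_changes):
    """'Yesterday's weather' by analogy: baseline_productivity is the
    prior project's size per person-month; driver_changes maps each
    changed cost driver to (old_rating, new_rating) effort multipliers.
    Higher multipliers mean more effort, so productivity scales by
    old/new for each change. Returns estimated person-months."""
    productivity = baseline_productivity
    for old_em, new_em in driver_changes.values():
        productivity *= old_em / new_em
    return new_size / productivity

# Hypothetical example: prior project ran at 400 SLOC/PM; complexity
# rating rises (1.00 -> 1.17) while tool support improves (1.00 -> 0.90).
pm = adjusted_estimate(400, 20_000, {"CPLX": (1.00, 1.17), "TOOL": (1.00, 0.90)})
```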
Incremental Development Forms
Evolutionary Sequential
– Examples: Small: Agile; Large: Evolutionary Development
– Pros: adaptability to change; rapid fielding
– Cons: easiest-first; late, costly breakage
– Cost estimation: Small: planning-poker-type; Large: parametric with IDPD
Prespecified Sequential
– Examples: platform base plus PPPIs
– Pros: prespecifiable full-capability requirements
– Cons: emergent requirements or rapid change
– Cost estimation: COINCOMO with no increment overlap
Overlapped Evolutionary
– Examples: product lines with ultrafast change
– Pros: modular product line
– Cons: cross-increment breakage
– Cost estimation: parametric with IDPD and requirements volatility
Rebaselining Evolutionary
– Examples: mainstream product lines; systems of systems
– Pros: high assurance with rapid change
– Cons: highly coupled systems with very rapid change
– Cost estimation: COINCOMO and IDPD for development; COSYSMO for rebaselining
Time phasing terms: Scoping; Architecting; Developing; Producing; Operating (SADPO)
– Prespecified Sequential: SA; DPO1; DPO2; DPO3; …
– Evolutionary Sequential: SADPO1; SADPO2; SADPO3; …
– Evolutionary Overlapped: SADPO1; SADPO2; SADPO3; …
– Evolutionary Concurrent: SA; D1; PO1 … SA2; D2; PO2 … SA3; D3; PO3 …
Evolutionary Development Implications
• Total Package Procurement doesn’t work
– Can’t determine requirements and cost up front
• Need significant, sustained systems engineering effort
– Need best-effort up-front architecting for evolution
– Can’t dismiss systems engineers after the Preliminary Design Review
• Feature set size becomes the dependent variable
– Add or drop borderline-priority features to meet schedule or cost
– Implies prioritizing and architecting steps in the SAIV process model
– Safer than trying to maintain a risk reserve
Future DoD Challenges: Systems of Systems
[Figure: SoS-level valuation, exploration, architecting, and development (FCR1, DCR1) lead to operation (OCR1) with periodic rebaseline/adjustment (OCRx1 … OCRx5); candidate suppliers/strategic partners 1 … n provide LCO-type proposal and feasibility info via source selection; constituent systems A, B, C, … x each run their own valuation-exploration-architecting-develop-operation cycles with FCR/DCR/OCR milestones]
Conceptual SoS SE Effort Profile
[Figure: effort profiles across Inception, Elaboration, Construction, and Transition for three activity areas: Planning, Requirements Management, and Architecting; Source Selection and Supplier Oversight; SoS Integration and Testing]
• SoS SE activities focus on three somewhat independent activities, performed by relatively independent teams
• A given SoS SE team may be responsible for one, two, or all activity areas
• Some SoS programs may have more than one organization performing SoS SE activities
SoS SE Cost Model
[Figure: size drivers and cost factors feed SoS SE effort estimates, with calibration, across three activity areas: Planning, Requirements Management, and Architecting; Source Selection and Supplier Oversight; SoS Integration and Testing]
• SoSs supported by cost model
– Strategically-oriented stakeholders interested in tradeoffs and costs
– Long-range architectural vision for SoS
– Developed and integrated by an SoS SE team
– System component independence
• Size drivers and cost factors
– Based on product characteristics, processes that impact SoS SE team effort, and SoS SE personnel experience and capabilities
Comparison of SE and SoSE Cost Model Parameters
Size drivers
– COSYSMO: # of system requirements; # of system interfaces; # of operational scenarios; # of algorithms
– COSOSIMO: # of SoS requirements; # of SoS interface protocols; # of constituent systems; # of constituent system organizations; # of operational scenarios
“Product” characteristics
– COSYSMO: size/complexity/volatility; requirements understanding; architecture understanding; level of service requirements; # of recursive levels in design; migration complexity; technology risk; #/diversity of platforms/installations; level of documentation
– COSOSIMO: size/complexity/volatility; requirements understanding; architecture understanding; level of service requirements; component system maturity and stability; component system readiness
Process characteristics
– COSYSMO: process capability; multi-site coordination; tool support
– COSOSIMO: maturity of processes; tool support; cost/schedule compatibility; SoS risk resolution
People characteristics
– COSYSMO: stakeholder team cohesion; personnel/team capability; personnel experience/continuity
– COSOSIMO: stakeholder team cohesion; SoS team capability
Reasoning about the Value of Dependability – iDAVE
• iDAVE: Information Dependability Attribute Value Estimator
• Use the iDAVE model to estimate and track software dependability ROI
– Help determine how much dependability is enough
– Help analyze and select the most cost-effective combination of software dependability techniques
– Use estimates as a basis for tracking performance
– Integrates cost estimation (COCOMO II), quality estimation (COQUALMO), and value estimation relationships
iDAVE Model Framework
• Inputs: time-phased information processing capabilities; project attributes; time-phased dependability investments
• Cost estimating relationships (CER’s): Cost = f(IP capabilities (size), project attributes)
• Dependability attribute estimating relationships (DER’s): Di = gi(dependability investments, project attributes)
• Value estimating relationships (VER’s): Vj = hj(IP capabilities, dependability attribute levels Di)
• Outputs: time-phased cost; dependability attribute levels Di; value components Vj; return on investment
Examples of Utility Functions: Response Time
[Figure: value-vs-time utility functions for four classes of service: Real-Time Control / Event Support (with a critical region); Mission Planning / Competitive Time-to-Market; Event Prediction (weather, software size); Data Archiving / Priced Quality of Service]
Tradeoffs Among Cost, Schedule, and Reliability: COCOMO II
[Figure: cost ($0-9M) vs. development time (0-50 months) trade-off curves through the points (RELY, MTBF in hours): (VL, 1), (L, 10), (N, 300), (H, 10K), (VH, 300K)]
– Cost/Schedule/RELY: “pick any two” points
• For a 100-KSLOC set of features
• Can “pick all three” with a 77-KSLOC set of features
The SAIV* Process Model
1. Shared vision and expectations management
2. Feature prioritization
3. Schedule range estimation and core-capability determination
– Top-priority features achievable within the fixed schedule with 90% confidence
4. Architecting for ease of adding or dropping borderline-priority features
– And for accommodating post-IOC directions of growth
5. Incremental development
– Core capability as increment 1
6. Change and progress monitoring and control
– Add or drop borderline-priority features to meet schedule
*Schedule As Independent Variable; feature set as the dependent variable
– Also works for cost, or schedule/cost/quality, as the independent variable
How Much Testing is Enough?
– Early Startup: risk due to low dependability
– Commercial: risk due to low dependability
– High Finance: risk due to low dependability
– Risk due to market share erosion

RELY rating:                   VL     L      N      H      VH
COCOMO II added % test time:   0      12     22     34     54
COQUALMO P(L):                 1.0    .475   .24    .125   .06
Early Startup S(L):            .33    .19    .11    .06    .03
Commercial S(L):               1.0    .56    .32    .18    .10
High Finance S(L):             3.0    1.68   .96    .54    .30
Market Risk REm:               .008   .027   .09    .30    1.0

[Figure: combined risk exposure RE = P(L) × S(L) vs. RELY rating (VL to VH), with curves for Early Startup, Commercial, High Finance, and Market Share Erosion; each market's sweet spot minimizes combined risk exposure]
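The sweet-spot calculation can be sketched directly from the table's numbers. This sketch simply adds dependability risk P(L) × S(L) to market risk REm and picks the minimizing RELY rating; the deck's curves may combine or weight the terms differently, so treat the exact minimizer as illustrative:

```python
RELY = ("VL", "L", "N", "H", "VH")
P_L = (1.0, 0.475, 0.24, 0.125, 0.06)        # COQUALMO probability of loss
RE_MARKET = (0.008, 0.027, 0.09, 0.30, 1.0)  # market share erosion risk

def sweet_spot(size_of_loss):
    """RELY rating minimizing combined risk exposure
    P(L) * S(L) + REm for a given size-of-loss profile."""
    combined = [p * s + m for p, s, m in zip(P_L, size_of_loss, RE_MARKET)]
    return RELY[combined.index(min(combined))]

for name, s_l in [("Early Startup", (0.33, 0.19, 0.11, 0.06, 0.03)),
                  ("Commercial",    (1.0, 0.56, 0.32, 0.18, 0.10)),
                  ("High Finance",  (3.0, 1.68, 0.96, 0.54, 0.30))]:
    print(name, sweet_spot(s_l))
```

For all three markets the minimum falls at an interior rating: too little testing loses to dependability risk, too much loses to market share erosion.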
Related Additional Measurement Challenges
• Tracking progress of rebaselining, V&V teams
– No global plans; individual changes or software drops
– Earlier test preparation: surrogates, scenarios, testbeds
• Tracking content of time-certain increments
– Deferred or partial capabilities; effects across the system
• Trend analysis of emerging risks
– INCOSE Leading Indicators; SERC Effectiveness Measures
• Contributions to systems effectiveness
– Measures of Effectiveness models, parameters
• Systems of systems progress, risk, change tracking
– Consistent measurement flow-up, flow-down, flow-across
Some data definition topics for discussion
In the SW Metrics Unification workshop, Nov 4-5
• Ways to treat data elements
– COTS, other OTS (open source; services; GOTS; reuse; legacy code)
– Other size units (function points, object points, use case points, etc.)
– Generated code: counting generator directives
– Requirements volatility
– Rolling up CSCIs into systems and systems of systems
– Cost model inputs and outputs (e.g., submitting estimate files)
• Scope issues
– Cost drivers, scale factors
– Reuse parameters: Software Understanding, Programmer Unfamiliarity
– Phases included: hardware-software integration; systems of systems integration, transition, maintenance
– WBS elements and labor categories included
– Parallel software WBS
• How to involve various stakeholders
– Government, industry, commercial cost estimation organizations
References
Boehm, B., “Some Future Trends and Implications for Systems and Software Engineering Processes,” Systems Engineering 9(1), pp. 1-19, 2006.
Boehm, B., and Lane, J., “Using the ICM to Integrate System Acquisition, Systems Engineering, and Software Engineering,” CrossTalk, October 2007, pp. 4-9.
Boehm, B., Brown, A.W., Clark, B., Madachy, R., Reifer, D., et al., Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
Dahmann, J., “Systems of Systems Challenges for Systems Engineering,” Systems and Software Technology Conference, June 2007.
Department of Defense (DoD), Instruction 5000.02, Operation of the Defense Acquisition System, December 2008.
Galorath, D., and Evans, M., Software Sizing, Estimation, and Risk Management, Auerbach, 2006.
Lane, J., and Boehm, B., “Modern Tools to Support DoD Software-Intensive System of Systems Cost Estimation,” DACS State of the Art Report; also Tech Report USC-CSSE-2007-716.
Lane, J., and Valerdi, R., “Synthesizing System-of-Systems Concepts for Use in Cost Modeling,” Systems Engineering, Vol. 10, No. 4, December 2007.
Madachy, R., “Cost Model Comparison,” Proceedings, 21st COCOMO/SCM Forum, November 2006, http://csse.usc.edu/events/2006/CIIForum/pages/program.html
Northrop, L., et al., Ultra-Large-Scale Systems: The Software Challenge of the Future, Software Engineering Institute, 2006.
Reifer, D., “Let the Numbers Do the Talking,” CrossTalk, March 2002, pp. 4-8.
Stutzke, R., Estimating Software-Intensive Systems, Addison Wesley, 2005.
Valerdi, R., Systems Engineering Cost Estimation with COSYSMO, Wiley, 2010 (to appear).
USC-CSSE Tech Reports, http://csse.usc.edu/csse/TECHRPTS/by_author.html
Backup Charts
COSYSMO Operational Concept
[Diagram: size drivers (# requirements, # interfaces, # scenarios, # algorithms, plus a volatility factor) and effort multipliers (application factors: 8; team factors: 6; schedule driver) feed the effort estimate, with calibration; WBS guided by ISO/IEC 15288]
4. Rate Cost Drivers - Application
COSYSMO Change Impact Analysis – I – Added SysE Effort for Going to 3 Versions
• Size: number, complexity, volatility, reuse of system requirements, interfaces, algorithms, scenarios (elements)
– 13 Versions: add 3-6% per increment for number of elements; add 2-4% per increment for volatility
– Exercise Prep.: add 3-6% per increment for number of elements; add 3-6% per increment for volatility
• Most significant cost drivers (effort multipliers)
– Migration complexity: 1.10 – 1.20 (versions)
– Multisite coordination: 1.10 – 1.20 (versions, exercise prep.)
– Tool support: 0.75 – 0.87 (due to exercise prep.)
– Architecture complexity: 1.05 – 1.10 (multiple baselines)
– Requirements understanding: 1.05 – 1.10 for increments 1, 2; 1.0 for increment 3; 0.9 – 0.95 for increment 4
COSYSMO Change Impact Analysis – II – Added SysE Effort for Going to 3 Versions
Cost Element       Incr. 1       Incr. 2       Incr. 3       Incr. 4
Size               1.11 – 1.22   1.22 – 1.44   1.33 – 1.66   1.44 – 1.88
Effort Product     1.00 – 1.52   1.00 – 1.52   0.96 – 1.38   0.86 – 1.31
Effort Range       1.11 – 1.85   1.22 – 2.19   1.27 – 2.29   1.23 – 2.46
Arithmetic Mean    1.48          1.70          1.78          1.84
Geometric Mean     1.43          1.63          1.71          1.74
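The table's rows combine multiplicatively; a sketch reproducing the effort ranges and means from the size and effort-product rows (small last-digit differences from the table come from rounding):

```python
from math import sqrt

# Per increment: (size_lo, size_hi), (effort_product_lo, effort_product_hi)
INCREMENTS = {
    1: ((1.11, 1.22), (1.00, 1.52)),
    2: ((1.22, 1.44), (1.00, 1.52)),
    3: ((1.33, 1.66), (0.96, 1.38)),
    4: ((1.44, 1.88), (0.86, 1.31)),
}

for n, (size, product) in INCREMENTS.items():
    lo, hi = size[0] * product[0], size[1] * product[1]
    print(f"Incr. {n}: effort range {lo:.2f} - {hi:.2f}, "
          f"arithmetic mean {(lo + hi) / 2:.2f}, geometric mean {sqrt(lo * hi):.2f}")
```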
COSYSMO Requirements Counting Challenge
• Estimates made in early stages– Relatively few high-level design-to requirements
• Calibration performed on completed projects– Relatively many low-level test-to requirements
• Need to know expansion factors between levels– Best model: Cockburn definition levels
• Cloud, kite, sea level, fish, clam
• Expansion factors vary by application area, size
– One large company: Magic Number 7
– Small e-services projects: more like 3:1, fewer lower levels
• Survey form available to capture your experience
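A sketch of the expansion-factor idea, using the slide's observed per-level factors ("Magic Number 7" at one large company, roughly 3:1 for small e-services projects); the counts here are illustrative:

```python
def expand_requirements(high_level_count, levels_down, factor=7):
    """Estimated low-level (test-to) requirements implied by a count of
    high-level (design-to) requirements and a per-level expansion factor."""
    return high_level_count * factor ** levels_down

print(expand_requirements(35, 2))            # large-company factor of ~7
print(expand_requirements(35, 2, factor=3))  # small e-services, ~3:1
```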
Achieving Agility and High Assurance – I
Using timeboxed or time-certain development: precise costing unnecessary; feature set as the dependent variable
[Figure: short development increments address rapid change; short, stabilized development of increment N provides high assurance, yielding the increment N baseline and transition/O&M; stable development increments plan for foreseeable change]
Evolutionary Concurrent: Incremental Commitment Model
[Figure: agile rebaselining for future increments absorbs unforeseeable change (adapt) and produces future increment baselines; short, stabilized development of increment N, with short development increments for rapid change and stable development increments for foreseeable change (plan), produces the increment N baseline for transition/operations and maintenance; continuous verification and validation (V&V) of increment N, drawing on current and future V&V resources, provides high assurance, with deferrals, artifacts, and concerns fed back]
Effect of Unvalidated Requirements: 15-Month Architecture Rework Delay
[Figure: cost ($50M, $100M levels) vs. response time (1-5 sec) for Arch. A (custom, many cache processors) and Arch. B (modified client-server), with the original spec, the spec after prototyping, and the available budget marked]