
Page 1

OSCAR IPT/Bold Stroke Open Systems Lessons Learned

Prepared by the OSCAR IPT for:

Glenn T. Logan - Lt Col USAF

Open Systems Joint Task Force

Page 2

REPORT DOCUMENTATION PAGE -- Form Approved OMB No. 0704-0188

Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden to Department of Defense, Washington Headquarters Services, Directorate for Information Operations and Reports (0704-0188), 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS.

1. REPORT DATE (DD-MM-YYYY): 26-02-2000
2. REPORT TYPE: Briefing
3. DATES COVERED (FROM - TO): xx-xx-2000 to xx-xx-2000
4. TITLE AND SUBTITLE: OSCAR IPT/Bold Stroke Open Systems Lessons Learned (Unclassified)
5a. CONTRACT NUMBER / 5b. GRANT NUMBER / 5c. PROGRAM ELEMENT NUMBER
5d. PROJECT NUMBER / 5e. TASK NUMBER / 5f. WORK UNIT NUMBER
6. AUTHOR(S): Logan, Glenn T.
7. PERFORMING ORGANIZATION NAME AND ADDRESS: Open Systems Joint Task Force, 2001 N. Beauregard St., Suite 800, Alexandria, VA 22311
8. PERFORMING ORGANIZATION REPORT NUMBER
9. SPONSORING/MONITORING AGENCY NAME AND ADDRESS: Open Systems Joint Task Force (OSJTF), 1931 Jefferson Davis Highway, Crystal Mall 3, Suite 104, Arlington, VA 22202
10. SPONSOR/MONITOR'S ACRONYM(S) / 11. SPONSOR/MONITOR'S REPORT NUMBER(S)
12. DISTRIBUTION/AVAILABILITY STATEMENT: A -- PUBLIC RELEASE
13. SUPPLEMENTARY NOTES
14. ABSTRACT: See Report.
15. SUBJECT TERMS
16. SECURITY CLASSIFICATION OF:
a. REPORT: Unclassified
b. ABSTRACT: Unclassified
c. THIS PAGE: Unclassified
17. LIMITATION OF ABSTRACT: Public Release
18. NUMBER OF PAGES: 59
19. NAME OF RESPONSIBLE PERSON: http://www.acq.osd.mil/osjtf/library/library_alpha.html, (blank), [email protected]
19b. TELEPHONE NUMBER (International Area Code / Area Code / Telephone Number): 703 767-9007, DSN 427-9007

Standard Form 298 (Rev. 8-98), Prescribed by ANSI Std Z39.18

Page 3

Lessons Learned Agenda

0900-0915 Welcome (D. Weissgerber/J. Wojciehowski)

0915-1045 OSCAR Program (D. Weissgerber)

Early Expectations & Assumptions

Actual Experiences

1045-1100 Break

1100-1130 OSCAR Hardware (B. Abendroth)

1130-1145 Tools (C. Hibler)

1145-1200 Summary (D. Weissgerber)

1200-1300 Lunch

Page 4

1300-1400 Bold Stroke

OASIS (D. Seal)

Cost Performance & Metrics (E. Beckles)

1400-1500 Open Discussions

1500 Closing Remarks (D. Weissgerber/J. Wojciehowski)

Lessons Learned Agenda

Page 5

BOLD STROKE

Products: OC1.1 and OC1.2 OFPs
Status: I-6 Flight Test
COTS: DY-4 PowerPC Processor
OFP Architecture: OOD / C++

Products: EMD OFP; Suite 5 OFP
Status: EMD Go-Ahead - May ‘00
COTS: DY-4 PowerPC Processor; HI Image Processor
OFP Architecture: Ada / C++ / C

Products: COSSI AMC variant H/W; Stage 1 functionality OFP
Status: CDR upcoming
COTS: DY-4 PowerPC Processor; HI Image Processor
OFP Architecture: OOD / C++

Products: H1, H2 and H3 OFPs
Status: H1 Build 2 flight test - Aug. ‘00
COTS: DY-4 PowerPC Processor; HI Image Processing; Fibre Channel Network
OFP Architecture: OOD / C++

Common Products: HOL OFPs; DOORS; ROSE; TORNADO (WindRiver); Gen Purpose Processor; Image Proc. Module

Page 6

Boeing’s Previous System Arch Lesson Learned Case Studies

• Software Modification/Maintenance Costs Are a Significant Recurring Investment

• Must Break the Block Upgrade Paradigm Made Necessary by the Tight Coupling Between OFPs and Specific H/W Configurations

• Assembly Language OFPs Have Become Increasingly Unstructured Through Many Upgrade Iterations

Page 7

OSCAR IPT Open System Lesson Learned Analysis

• Represents a Snapshot-In-Time
– Where We’ve Been
– Where We Are
– Where We’re Going

• Compiled by the Engineers Working the Issues
– Analysis of Key Impact Areas

• Identifies Current Top 10 OSCAR Lessons Learned

• Provides a Basis for Future Lessons Learned Comparisons/Analysis

Page 8

AV-8B OSCAR Principles

• Follow US DoD Directive For Acquisition Reform
– Apply Revised DoD Directive 5000 (dated 15 Mar 96)
– Commercial Business Philosophy
– Performance Based Specs vs Procurement Specs

• Insert Commercial Technologies
– COTS Hardware
– COTS Software Development Environment

• Reduce Life Cycle Cost

• Apply Open System Architecture
– Emphasis on Non-Proprietary Hardware and Software
– Object Oriented Design and High Order Language
– Software Independent of Hardware

• Increase Allied Software Development Workshare

Page 9

Review of Early Expectations

• OSCAR’s Goals
– Reduce Life Cycle Support Cost of Software Upgrades (Cost Savings to be Realized during 3rd Block Upgrade)
• Shortened OFP Development Cycle
• Reduce Rework in Dev Cycle & DT/OT
• Reduce Regression Testing in OC1.2 (OC1.1 set baseline)
– Leverage Commercial Technology
– Incorporate an Open Architecture Concept
– No Reduction in System Performance

Page 10

Review of OSCAR Open System Assumptions

• Implementation of Open Systems H/W and S/W Requires Up-Front Investment
– Recoupment Within 2-3 Updates to the S/W

• Open System Computing H/W is Based on Commercial Standards
– Promotes Competition
– Takes Advantage of Commercially Driven Requirements for Technology Insertion

• LCC Analysis Shows a 30-40% Cost Reduction in Core Computing H/W and S/W Development, but not necessarily applicable to System Integration/Test of Multi-Sys Block Upgrades

Page 11

Review of OSCAR Open System Assumptions (cont.)

• OSCAR and Open Systems Computing Does Not Affect Tasks Associated with the Airframe or Flight Qualification of New Weapons/Capabilities

• Two-Level Maintenance Concept Philosophy Will Reduce LCC and Increase Operational Availability

• OSA provides Arch for a Plug-and-Play Trainer Concept

• With OSCAR as First Large Scale Implementation of Open Systems and Object Oriented S/W:
– Reluctance to Fully Realize the Cost Benefits Until OSCAR is Fielded and all the Data Collected and Analyzed

Page 12

Review of OSCAR’s Open System Assumptions (cont.)

• OSCAR’s Open System Architecture Will Make Incremental Upgrades Possible by Decoupling H/W and S/W (i.e., MSC-750-G4)

• Commercial Off-The-Shelf Products can be Directly Incorporated with Minimal Development Costs
– Multi-Vendor Support Ensures Competitive Procurement Costs

• Software LCC Savings are Derived from the High Degree of Modularity Envisioned
– Less Than Half the Regression Test and Re-Qual Effort of Today

Page 13

• SPI

• CPI

• Requirements -- System & software levels, stability index

• SLOC -- Estimates vs. actuals, productivity factor

• Classes

• Peer Review

• TWD -- Development & ground test execution

• Flight Test -- flights, test points, analysis

• Problem Reports -- various flavors

• Throughput & Memory Spare

• Hardware Performance

• Risk

Data & Metrics Currently Collected

Page 14

Initial Expectations for Metrics

• SPI -- Identify an immediate schedule problem

• CPI -- Control overspending, identify underruns

• System & Software Requirements -- Track the development to plan and identify any Growth

• Requirements Stability -- Control requirements growth

• SLOC Actuals vs. Estimated -- Control growth and ‘gold-plating’

• Software productivity (Manhrs/SLOC) -- Improve the efficiency with which software is produced

• Classes Actuals vs. Planned To Date -- Indication of performance to schedule

• Peer Review -- Capture errors before the product is delivered

Page 15

• TWD Development & Ground Test -- Readiness of test team to support system level test phase

• Problem Reports -- Quality of the software & where problems are found

• Throughput/Memory -- Keep software within the bounds of hardware performance

• Risk -- Control risks & be prepared to act quickly if they materialize

Initial Expectations of Metrics

Page 16

What Metrics Actually Provided

• SPI -- Watch The Details
– Lower level problems are masked within larger cost accounts
– Top-level SPI can mask lower level account SPI difficulties
– Provides good focus for the CAMs

[Chart: OSCAR OC1.1 Performance Stop Light Chart, week ending 27 Feb 00 -- earned-value status (SVdelta, CVdelta, SV, CV, SPI, CPI, ACI, BAC, EAC) by team and team leader for Total Conversion (TDL01), Total AMRAAM (TDL02), and Total 1760B (TDL03). Callouts: Overall Program Healthy; Critical Path Behind Schedule.]
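A minimal sketch of the standard earned-value arithmetic behind the SPI and CPI columns in a stop-light chart like the one above (illustrative C++; the account values are hypothetical, not taken from the chart):

    #include <cstdio>

    // Standard earned-value inputs for one cost account (labor hours).
    struct CostAccount {
        double bcws;  // Budgeted Cost of Work Scheduled (planned value)
        double bcwp;  // Budgeted Cost of Work Performed (earned value)
        double acwp;  // Actual Cost of Work Performed
    };

    int main() {
        CostAccount acct{9900.0, 10000.0, 9700.0};   // hypothetical values
        double spi = acct.bcwp / acct.bcws;          // >1.0: ahead of schedule
        double cpi = acct.bcwp / acct.acwp;          // >1.0: under cost
        double sv  = acct.bcwp - acct.bcws;          // schedule variance
        double cv  = acct.bcwp - acct.acwp;          // cost variance
        std::printf("SPI %.1f  CPI %.1f  SV %.0f  CV %.0f\n",
                    100.0 * spi, 100.0 * cpi, sv, cv);
        return 0;
    }

On the chart these ratios appear as percentages, so 100.0 means on plan; a top-level value near 100 can still hide off-nominal lower-level accounts, which is the "watch the details" lesson.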

Page 17

What Metrics Actually Provided

• CPI -- New Functionality Costs More Than Legacy

[Chart: OSCAR OC1.1 Performance Stop Light Chart, week ending 27 Feb 00 -- the same earned-value data as on the previous page (columns: TEAM, TEAM LEADER, SVdelta, CVdelta, SV, CV, SPI, CPI, ACI, BAC, EAC), including a grand total row. Callout: New Functionality.]

Page 18

What Metrics Actually Provided

• System Requirements - No Changes Resulting From OO/C++ Development

– Level Of Detail & Complexity Commensurate With Assembly

– OO Makes Traceability To Code Difficult (see other chart)

• Requirements Stability -- good to show what’s moving through the system, but don’t really know how many requirements and corresponding code/tests are affected (traceability)

• Risks -- hard to maintain a monthly review juggling schedules, but a good tool to keep on top of issues; when High risks are identified, resources are focused on them
– Engineers tend to set risks at HW/SW detail level and not see the top level System Functionality High Risks

Page 19

What Metrics Actually Provided

• Throughput Usage
– OO, COTS OS makes throughput consumption difficult to predict

[Chart: MSC/AMC Throughput Utilization -- actual throughput consumed, estimate by iteration, and estimate at complete, Jan 98 through Nov 01 (Iterations 3-6 and OC1.2), 50% to 120% scale. Series include MSC Actual, MSC OC1.1 Estimate and EAC (with and without EDC off / run from RAM), AMC OC1.1 Estimate and EAC, AMC OC1.2 Estimate, predicted usage, and spare. Notes: worst-case scenario was Radar A/G, initial data point 63%, complete worst-case path functionality not yet incorporated; worst-case scenario is now Radar A/A -- Radar A/A AMRAAM, 3 targets: 97% pre-launch, 104% post-launch; Radar A/G = 80%; Night Attack A/G = 77%.]

Page 20

What Metrics Actually Provided

• Memory Usage
– Consumption can be predictably scaled from assembly language implementation

[Charts: MSC/AMC RAM, NVRAM, and Flash Memory Utilization -- actual memory consumed, estimate by iteration, and estimate at complete, Jan 98 through Nov 01 (Iterations 3-6 and OC1.2), with MSC/AMC estimates, EACs, spare objective, and spare threshold lines. Capacities: RAM -- MSC = 16 MB (64 MB with I5 Option 11), AMC = 64 MB; NVRAM -- MSC = 256 KB, AMC = 128 KB; Flash -- MSC = 16 MB, AMC = 32 MB.]

Page 21

What Metrics Actually Provided

• Problem Reports -- Open/Closed/Rejected
– OO/C++ enables trained developers with Tools to rapidly diagnose and correct anomalies.

[Chart: number of PRs (Open, Closed, Rejected) from 12/8/97 through 2/8/00, 0 to 2,500, spanning Iterations 3, 4, and 5.]

Page 22

What Metrics Actually Provided

• Problem Reports - Where Found
– DTE Saves Time & Money
– Provides a “Software Test Facility” on every desktop
– Fewer problems found in flight than Legacy OFP

[Chart: number of PRs by where found (Vendor Test Facility, A/C - Flight, A/C - Ground, AIC CL, AIC STL, STF, Flight Sim, Desktop), 12/8/97 through 2/8/00, 0 to 2,500.]

Page 23

What Metrics Actually Provided

• SLOC
– Not very useful
• Some code “auto”-generated by 4th generation tools
• Poor unit for estimating resources required

[Chart: MSC OC1.1 SLOC Count -- Source Lines of Code (actual) against the Iteration 5 and Iteration 6 plans, 1 Nov 97 through 30 Oct 00, 0 to 300,000 SLOC, spanning Iterations 2 through 6.]

Page 24

What Metrics Actually Provided

• Classes
– Best measure of development progress
• Similar to function points
• SLOC difficult to estimate

[Chart: MSC Development Status History -- classes completed, actual total vs. planned total vs. end total, 3 Jan 99 through 21 Nov 99, 0 to 700 classes.]

Page 25

What Metrics Actually Provided

• Risk
– Good tool to keep on top of issues but can bring too much Political help
• When high risks are identified -- resources are focused on them
– Discipline of regular periodic review is important to keep the focus

[Chart: risk matrix, Likelihood (Po) vs. Consequence (Co) on 1-5 scales with High/Medium/Low bands. Tracked risks and mitigation steps: Supplier support during flight test -- get Raytheon on contract; A/G solutions -- compare flight test results to manned A/G solutions; Transition to AMC&D -- measure AV-8 AMC unit #1 throughput performance, begin integration with initial delivery of s/w & h/w (with 750 processor), verify sufficient I6 throughput for alternate configuration (RAM with ...); Tactical mode sensors -- provide programmable filter constants for flight test; Integrate AMRAAM/13C into AV-8B -- confirm MSC/Radar timing analysis on AMC. Consider New Risks: I6 OFP Release.]
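A minimal sketch of how a Likelihood/Consequence pair maps onto High/Medium/Low bands on a 5x5 matrix like the one above (the thresholds below are illustrative assumptions, not the program's actual banding):

    #include <cstdio>

    // Map a likelihood (Po) / consequence (Co) pair, each 1-5, to a risk band.
    // The thresholds are illustrative assumptions only.
    const char* riskBand(int po, int co) {
        int exposure = po * co;              // simple product as the exposure index
        if (exposure >= 15) return "High";
        if (exposure >= 6)  return "Medium";
        return "Low";
    }

    int main() {
        std::printf("Po=4, Co=5 -> %s\n", riskBand(4, 5));   // High
        std::printf("Po=2, Co=3 -> %s\n", riskBand(2, 3));   // Medium
        std::printf("Po=1, Co=2 -> %s\n", riskBand(1, 2));   // Low
        return 0;
    }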

Page 26

• SPI -- Watch The Details

• CPI -- New functionality Costs More Than Legacy

• System Requirements - No Changes For Assembly
– Traceability To Code Is Difficult

• TWD Development -- Same as in Traditional Development

• SLOC count -- Not as Useful for OO/C++ Development Tracking

• Classes -- Good Indicator of Development Progress

Summary of OS Lessons Learned For Currently Collected Metrics

Page 27

• Problem Reports - Total -- OO/C++ a Benefit to Problem Resolution

• Problem Reports - Where Found -- DTE Saves Time & Money

• Throughput Usage - OO, COTS Makes Prediction Difficult

• Memory Usage - Scalable from Legacy Development

• Risk - Good Tool to Focus Attention & Resources, if Risk Identification doesn’t get too Political

Summary of OS Lessons Learned For Currently Collected Metrics

Page 28

Technology Challenges

COTS supports the code/debug/unit test stages of development well, but many voids still exist:

• “Front end” of process
– Model-based tools for requirements/design capture
– Automated configuration and integration of components

• “Back end” of process
– Simulation-based testing

• Support for hard real-time embedded systems is limited
– Quality-of-service requirements expression/guarantees

• Legacy system constraints
– Infusing new technology into resource-limited, “closed” systems

• High Integrity System development technologies

Page 29

Cultural Challenges

• Acquisition culture presents impediments as well

– “Silo” approach to planning/funding system modernization
– “Wasn’t invented here” mindset in programs
– Inability to trade front-end investment for life-cycle returns, even when business case is compelling
– Synergy with COTS industry will always be limited without cultural transformation
– Support structure based on single fielded configuration
– T&E community resistance to tailored re-qualification
– No incentive for multi-platform development

Page 30

OSA Lessons Learned - Standards

Goal: Use Widely Accepted Commercial Standards
• Standardize Module Form, Fit, Function and Interface (F3I) to Allow Functional Performance Upgrades
• Use COTS Standards for Networks, Processors, Memory, and Operating System

Reality: Existing Commercial Standards Do Not Typically Accommodate Aerospace Requirements
• Real Time Operation - Flight Dynamics
• Memory Partitioning for Fault Containment
• Built-In-Test

Solution: Modify Commercial Standards Through Active Participation in Standards Bodies
• ANSI Fibre Channel Avionics Environment (FC-AE)
• Modify Commercial STD Common Object Request Broker Architecture (CORBA) for Real-Time Operation
• Add Service Layers on Top of Commercial Software Infrastructure (see the sketch below)
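A minimal sketch of the service-layer idea (illustrative only; the EventChannel interface is invented for this example, it is not the OSCAR or Bold Stroke code, and it makes no actual middleware calls): application components code against a program-owned interface, and the commercial ORB/RTOS is confined to one adapter class.

    #include <cstddef>
    #include <functional>
    #include <string>

    // Program-owned service layer: OFP application components publish and
    // subscribe through this interface and never touch the ORB or RTOS directly.
    class EventChannel {
    public:
        virtual ~EventChannel() = default;
        virtual void publish(const std::string& topic, const void* data, std::size_t len) = 0;
        virtual void subscribe(const std::string& topic,
                               std::function<void(const void*, std::size_t)> handler) = 0;
    };

    // Adapter over the commercial infrastructure (e.g., a real-time ORB's event
    // service).  Only this class changes when the COTS product or version changes.
    class CotsEventChannel : public EventChannel {
    public:
        void publish(const std::string& topic, const void* data, std::size_t len) override {
            // ... forward to the middleware's event service here ...
            (void)topic; (void)data; (void)len;
        }
        void subscribe(const std::string& topic,
                       std::function<void(const void*, std::size_t)> handler) override {
            // ... register the handler with the middleware here ...
            (void)topic; (void)handler;
        }
    };

The same containment idea is what later lets the Desktop Test Environment swap the flight infrastructure for a desktop one without touching application code.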

Page 31

OSA Lessons Learned - Specifications

Goal: Focus on Specifying Functional/Performance Requirements versus “How To”
• Use Commercial Specs Wherever Possible
• Use Tailored Mil-Specs
• Eliminate Unnecessary “How To” specs

Reality: It is Difficult to Prevent Engineers (Boeing, Customer, and Supplier) From Diving Down Into Too Much Detail
• Commercial Specifications may not match Aerospace requirements
• Additional effort needed to ensure Performance Levels and interoperability Are Achievable

Solution: Need to get a Better Handle on the High Level Performance Requirements
• Develop benchmark application program to validate memory and throughput for COTS processors (see the sketch below)
• Using a “Performance Prediction Team” to Conduct Simulation and Modeling of Key System Attributes
• Evaluate Lab Prototype H/W to Gather Data
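A minimal sketch of such a benchmark (the workload, working-set size, and frame count are illustrative assumptions, not the program's validation suite): time a representative processing kernel over a fixed working set and report average frame time.

    #include <chrono>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // Representative kernel standing in for a slice of OFP frame processing.
    static double kernel(std::vector<double>& v) {
        double acc = 0.0;
        for (std::size_t i = 0; i < v.size(); ++i) {
            v[i] = v[i] * 1.0001 + 0.5;   // arbitrary arithmetic load
            acc += v[i];
        }
        return acc;
    }

    int main() {
        std::vector<double> v(1 << 20, 1.0);   // ~8 MB working set (hypothetical size)
        const int frames = 1000;               // emulate 1000 processing frames
        auto t0 = std::chrono::steady_clock::now();
        double acc = 0.0;
        for (int frame = 0; frame < frames; ++frame)
            acc += kernel(v);
        auto t1 = std::chrono::steady_clock::now();
        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        std::printf("avg frame time %.3f ms (checksum %g)\n", ms / frames, acc);
        return 0;
    }

Run unchanged on each candidate COTS board, the same kernel gives a like-for-like throughput comparison before committing to a processor.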

Page 32

COTS Lessons Learned

• COTS May Not Work As Well For Your Application As The Application For Which It Was Developed

• COTS Frequently Has Surprises, Especially With Little Used Features

• COTS Documentation May Be Lacking, Or Will Not Tell You How It Will Work In Your System

Page 33

Lessons Learned - Diagnostics

• Diagnostics Processes/Tools must better address False Alarm Rate

• Supplier must better understand Total Diagnostics Requirements
– Fault Coverage
– Fault Isolation
– False Alarms
– Failure Reporting & Recording

• Diagnostic System must have integrated on-board and off-board capability that can be updated in a timely manner

Total System Diagnostics Architecture Must Minimize NFF Occurrences

Page 34

Lessons Learned - Prototyping

• Early And Frequent Prototyping Required Throughout The Program

• Develop Software Incrementally Utilizing Daily Builds

• Complex Functionality needs to be partitioned and implemented early

• Verify Design And Ensure API’s Meet Needs Of User

• Verify Software And Hardware Performing As Expected

No New Lessons from Legacy Developments

Page 35

Object Oriented Design in a Functional Decomposition World

[Diagram: current traceability links run from Functional Requirements through Object Oriented Design to Code, Unit Tests, and System Tests (Black Box). Callout: Precise Traceability is Infeasible to Achieve.]

Page 36

Early Returns - Measured Benefit

Key Sources of Gain:

● Reuse (of all types)

● COTS Tools

● Change Containment

● Desktop Testing

● High Order Language

Measured Software Development Affordability Improvement

[Chart: Cumulative Software Development Productivity (labor hours/SLOC) for the F/A-18 and F-15, measured against the historical F/A-18 and F-15 legacy costs and the program goal.]

Page 37

[Chart: S/W Development Productivity (Hand plus Rose Generated Code), in manhours/SLOC, dated 4 November 1999 -- AV-8B iterative and cumulative values, with plotted figures of 3.18, 1.30, 5.59, 2.49, 1.26, and 1.00 on a 0-6 scale.]

Page 38

Lesson Learned - OSCAR Hardware

Page 39

Qual Test

• The following environmental qual tests have been completed:

MSC & WMC
– Temp-Alt
– Vibration
– EMIC
– Acoustic Noise
– Loads
– Shock
– Humidity
– Salt
– Exp Atmosphere
– Sand & Dust

Page 40

Qual Test Cont’d

• COTS hardware did well.
– No problems with off-the-shelf DY-4 Processor board (one capacitor failure in RDT).

• No problems with plastic parts (PEMs)
– Hardware with plastic parts was exposed to MIL-STD-810 Humidity and Salt-Fog environments in two WRAs with no failures.
– Was a major concern of some people early in the program.

Page 41

Reliability

• Reliability experience to date with COTS hardware has been good.

• Reliability Development Testing (RDT) done on three WRAs.
– WMC - 1,000+ hours
– MSC #1 - 1,000+ hours
– MSC #2 - 1,000+ hours

• One capacitor failure on a COTS board; root cause unknown.

• One commercial grade capacitor failed on another SRA. Switching to a MIL-SPEC capacitor.

• Other failures occurred, but unrelated to COTS hardware.

Page 42

Memory and Throughput

• OOD is a big resource consumer.

• The F-15 Central Computer OFP had already been converted from assembly language to a HOL (Ada) in the early 1990s.

• Felt comfortable with initial OSCAR estimates based on the complexity of the F-15 aircraft versus the AV-8B, a six-processor solution (on the F-15) versus a single processor, and the continued growth in available throughput in commercial processors.

However, a 4x estimate turned into a 40x reality

Page 43

F-15 Mission Computer Memory Utilization

[Chart: F-15 mission computer memory utilization growth from 1974 to 2001, 0 to 3,000 on the vertical scale. Annotations: VHSIC Central Computer -- Ada language OFP, six processors with fault tolerance; F-15E -- dual cockpit, full A/A and A/G capability, eight displays.]

Page 44

F-15 / AV-8B OSCAR Mission Computer Memory Comparison

[Chart: F-15 and AV-8B Mission Computer (pre-OSCAR) Memory Utilization, 1974 to 2001, 0 to 3,000 on the same scale as the previous chart.]

Page 45

[Chart: F-15 and AV-8B Mission Computer Memory Utilization, 1974 to 2001, with the AV-8B OSCAR value added and the vertical scale extended to 14,000.]

Page 46

Memory and Throughput Conclusions

• Use of OOD has a tremendous impact on Memory usage.

• Believe throughput impact is even greater, although more difficult to compare.

• Lesson Learned - Use of OOD adds an order of magnitude (or more) to memory and throughput requirements.

Page 47

Tools Lessons

Page 48

OSA Lessons Learned - Tools

• Not All Commercial Tools Scale To Large Development Programs

• Interoperability Of Commercial Tools Must Be Evaluated Prior To Selection

• Keep Up With New Tool Versions To Maintain Vendor Support

• Plan Tool Transitions

• Utilize Dedicated Tool Engineers

Page 49

Tool Compatibility

S/SEE Management Across Geographically and Organizationally Separated Development Sites requires tight Tool CM

[Diagram: the OSCAR development community at the center, connected to COTS vendors (GDIS, Smiths, Wind River), other development communities (F18 H1), Boeing Bold Stroke, PMA-209 (GPWS function), NAWC-WD, and Boeing St. Louis. Noise sources: rapid version rolls (Rose 98, 98i, 2000) and a “tool du jour” attitude (PVCS vs. ClearCase).]

Page 50

Desktop Test Environment

Transitioned to multiple programs

Rapidly design once
• Autogenerated code
• COTS processors & tools
• Developers run OFP at their desk
• Reduces time and cost
• Enabled by hardware and O/S change containment (see the sketch below)
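A minimal sketch of the change-containment mechanism that makes desktop execution possible (illustrative; AltitudeSource and the DESKTOP_TEST build flag are invented for this example and are not taken from the OFP): the application codes against an abstract device interface, and the desktop build substitutes a simulation for the flight driver.

    #include <cstdio>
    #include <memory>

    // Abstract device interface: the OFP application layer sees only this.
    class AltitudeSource {
    public:
        virtual ~AltitudeSource() = default;
        virtual double altitude_ft() = 0;
    };

    // Flight build: would wrap the real bus driver on the mission computer (stubbed here).
    class FlightAltitudeSource : public AltitudeSource {
    public:
        double altitude_ft() override { /* read from the avionics bus */ return 0.0; }
    };

    // Desktop build: a simple simulation model, so developers can run the OFP at their desk.
    class SimAltitudeSource : public AltitudeSource {
    public:
        double altitude_ft() override { t_ += 0.1; return 5000.0 + 200.0 * t_; }
    private:
        double t_ = 0.0;
    };

    int main() {
    #ifdef DESKTOP_TEST
        std::unique_ptr<AltitudeSource> alt = std::make_unique<SimAltitudeSource>();
    #else
        std::unique_ptr<AltitudeSource> alt = std::make_unique<FlightAltitudeSource>();
    #endif
        std::printf("altitude = %.0f ft\n", alt->altitude_ft());   // application logic identical in both builds
        return 0;
    }

Because only the adapter differs between builds, problems found at the desktop reproduce against the same application code that flies.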

Page 51

Summary

Page 52

Lessons Learned Summary (Most Critical)

• COTS
– Use Existing Products
• Don’t Push Technology, Follow It (Cost/Schedule/Risk)
• Use Technology Rolls To Satisfy Growth, Not Baseline Requirements
– DOD Programs Have Limited Influence On Commercial Developments
• Very-Very-Small Quantities Compared to Industry
– COTS Does Well In Qualification Testing

• Open Systems Design
– Cultivate/Develop Multiple Production Sources Up Front
– Partition Software Workpackages Along Functional Lines (Self Contained Packages)

Page 53

Lessons Learned Summary (Cont.) (Most Critical)

• C++ / OO Design
– Throughput Is Difficult To Estimate
– Scale The Software To the EXISTING Computer Resources:
• Memory, Throughput, I/O
– In Order To Reuse Functional Software The Top Level Requirements MUST Be The Same
– Reused Software Will Require Significant Rework
– Process & Procedures Are No Substitute For A Stable, Well-Trained Workforce
– Troubleshooting Transient Problems Is More Difficult in COTS Environment
– Turnaround On Fixes Is Much Quicker

• Functionality
– Document And Bound All Requirements
– Limit New Functionality Until After Legacy Is Complete
– Be Selective in Legacy Problem Fixing During Conversion

• Use Multiple Metrics To Identify Problems

Page 54

Priority Order of the Top 10 OSCAR Lessons Learned

1 -- Document And Bound All Requirements

2 -- Reused Software Will Require Significant Rework

3 -- Process & Procedures Are No Substitute For A Stable Well Trained Workforce

4 -- Throughput Is Difficult To Estimate (OO)

5 -- Use Existing Products (COTS)

6 -- Use Multiple Metrics To Identify Problems

7 -- DOD Programs Have Limited Influence On Commercial Developments

8 -- Troubleshooting Transient Problems Is More Difficult

9 -- In Order To Reuse Functional Software The Top Level Requirements MUST Be The Same

10 -- Partition Software Workpackages Along Functional Lines (Self Contained Packages)

Page 55

Summary

• How Are We Doing with Respect to Earlier Expectations?
– LCC savings and schedule improvements will not be realized until 2nd and 3rd upgrades
– Thruput estimates were off by an order of magnitude

• Where Are We Going with the Open Systems Approach?
– Boeing Company roadmap for all legacy and future A/C system upgrades

• Where Are We Going with Metrics Collection?
– Classes planned-vs-actuals is the best indicator of program progress
– Will continue to collect thru OC1.3 to set baseline

• What Are We Going to “Do” with Lessons Learned Metrics?
– Compare to legacy systems metrics (where available) and produce/quantify data to establish baseline for F/A-18 & JSF systems development
– Incorporate lessons learned into Boeing-wide training programs

Page 56

The Next Step

Answer 5 Questions (Based On OSCAR Experiences):

1 -- How Fast Can The Investment Costs Be Recaptured?
2 -- Is OO/C++ Software Transparent To Hardware?
3 -- What Is the Ratio Of New Functionality Development Costs Of OO/C++ vs. Assembly?
4 -- Does OO/C++ Software Reduce Retest?
5 -- Is COTS Less Expensive?

Page 57

The Next Steps - Develop A Plan

Develop A Plan/Process to Collect/Generate Data* that will Support the Determination of:

1 -- Actual Cost Of OSCAR Software Conversion
• Use As Basis For Determining Investment Cost
• Factor Out New Functionality
• Requirements through Fleet Release
• Compare Against Original Estimates
– If Different, Why?

2 -- Actual Cost Of New Hardware (WMC / AMC)
• Development Of Boxes
– Use As Basis For Determining Investment Cost
• Unit Production Costs
• Compare Against Predictions
• Compare Against Dedicated Mil Spec. Box (Non-COTS)

3 -- Was COTS Less Expensive?
• Why or Why Not?

*Note: Some Data May Not Be Available Until After The Completion Of OC1.1 & AMC&D

Page 58

The Next Steps - Develop A Plan

Develop A Plan/Process to Collect/Generate Data* that will Support the Determination of:

4 -- Actual Costs Of New Functionality
• AMRAAM/13C (OC1.1)
• JDAM, HQ/SG (OC1.2)

5 -- Comparison With Assembly Language Version
• Was It Cheaper to Develop? To Test?
– Why?

6 -- “Will OO & C++ Cause Less Retest In Subsequent OFPs?”
• How?
– Generate An OC1.2 Metric To Measure Unplanned Fixes To Legacy Caused By New Functionality

7 -- Costs Associated With Migrating OSCAR OFP To New Processors
• 603e to 750
• 750 to G4
• Was Hardware Transparent to Applications OFP?
– If Not, then Why?
– Identify Issues

*Note: Some Data May Not Be Available Until After The Completion Of OC1.1 & AMC&D

Page 59

The Next Steps - Determine the Pay Back

• Using
– The Initial Investment Costs
– Follow On New Development Costs

• Determine
– How Much Software Must Be Written To Pay Back Initial Investment (see the sketch below)
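A minimal sketch of that payback arithmetic (all numbers are hypothetical placeholders; the real inputs are the investment and manhours/SLOC figures the plan above would collect):

    #include <cstdio>

    int main() {
        // Hypothetical inputs -- to be replaced with the measured OSCAR values.
        double investment_hours    = 200000.0;  // up-front conversion / investment cost
        double legacy_hrs_per_sloc = 5.0;       // legacy (assembly) development, manhours/SLOC
        double new_hrs_per_sloc    = 2.5;       // OO/C++ development, manhours/SLOC

        double savings_per_sloc = legacy_hrs_per_sloc - new_hrs_per_sloc;
        double breakeven_sloc   = investment_hours / savings_per_sloc;
        std::printf("break-even after ~%.0f SLOC of follow-on development\n", breakeven_sloc);
        return 0;
    }

With these made-up figures the initial investment is recovered after roughly 80,000 SLOC of follow-on development; the real answer depends entirely on the measured productivity gap.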

Page 60

Bold Stroke Open Systems Lessons Learned