
Page 1: Systemic Analysis of Software Findings

Systemic Analysis of Software Findings

Scott Lucero

Office of the Deputy Undersecretary of Defense (Acquisition and Technology)
Software Engineering and System Assurance

Page 2: Systemic Analysis of Software Findings

Approach

• Used keywords to pull findings from the systemic analysis database and binned them against the top issues (see the sketch below)
  – Questions remain about the binning methodology
• Looked at the totality of findings and allocated them to new affinity groups, based on the SWEBOK
• Conducted two one-day workshops with the authors of the findings to provide overall context
  – Participants had first-hand experience with over 90 percent of the findings
• Developed a summary statement of the issues associated with each affinity group
  – Started looking at associated affinity groups

Question: Are the findings from Program Support Reviews consistent with the NDIA top software issues?
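As a rough illustration of the keyword binning step (not the actual SADB procedure), the sketch below assumes hypothetical keyword lists for each NDIA top issue and a plain-text finding:

# Hypothetical keyword lists for binning findings against the NDIA top issues;
# the real binning criteria used in the analysis may have differed.
TOP_ISSUE_KEYWORDS = {
    "Requirements": ["requirement", "traceability"],
    "SE/SW Integration": ["systems engineering", "integration"],
    "SW Sustainment": ["sustainment", "life-cycle", "maintenance"],
    "Human Capital": ["staff", "skills", "experience"],
    "SW Testing": ["test", "verification"],
    "SW Assurance": ["assurance", "security", "vulnerability"],
    "SW COTS/NDI/Reuse": ["cots", "ndi", "reuse"],
}

def bin_finding(finding_text: str) -> list[str]:
    """Return the top-issue labels whose keywords appear in a finding's text."""
    text = finding_text.lower()
    return [issue for issue, keywords in TOP_ISSUE_KEYWORDS.items()
            if any(kw in text for kw in keywords)]

# Example: a finding that mentions requirements traceability and test coverage
# would be binned under "Requirements" and "SW Testing".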

Page 3: Systemic Analysis of Software Findings

Top Software Issues*

1. The impact of requirements upon software is not consistently quantified and managed in development or sustainment. “Requirements”

2. Fundamental system engineering decisions are made without full participation of software engineering. “SE/SW Integration”

3. Software life-cycle planning and management by acquirers and suppliers is ineffective. “SW Sustainment”

4. The quantity and quality of software engineering expertise is insufficient to meet the demands of government and the defense industry. “Human Capital”

5. Traditional software verification techniques are costly and ineffective for dealing with the scale and complexity of modern systems. “SW Testing”

6. There is a failure to assure correct, predictable, safe, secure execution of complex software in distributed environments. “SW Assurance”

7. Inadequate attention is given to total lifecycle issues for COTS/NDI impacts on lifecycle cost and risk. “SW COTS/NDI/Reuse”

*NDIA Top Software Issues Workshop - August 2006

Page 4: Systemic Analysis of Software Findings

Program Support Review (PSR)

• Repeatable, tailorable, exportable process
• Trained workforce with in-depth understanding of PMs’ program issues

PSR Evaluation Areas:
1. Mission Capabilities/Requirements
2. Resources
3. Management
4. Technical Process
5. Technical Product
6. Environment

[Diagram: Program Support Review methodology – SME insight, program reference material, PSR plan, questions, and PSR reference materials (templates, sample questions, documented processes, training materials, execution guidance)]

PMs report the process is insightful, valuable, and results oriented; better than 95% acceptance of recommendations.

“…PSR team serves as ‘disinterested 3rd party’ that allows [the PM] to approach leadership armed with powerful program truths, reinforce issues.” (PM)

Page 5: Systemic Analysis of Software Findings


Source of the Findings

• 68 reviews of 38 different acquisition programs
  – Conducted from early 2004 to present
  – Primarily ACAT 1D programs
  – Findings of these reviews placed into the Systemic Analysis Database (SADB), a formal repository for all review findings
• Data extracted from the SADB using the following keywords (see the sketch after this list):
  – Software
  – Systems-of-Systems (SoS)
  – Assurance
  – Architecture
  – Security
• 600+ findings resulted from the keyword search
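For illustration only, here is one way the keyword pull might look, assuming the SADB findings were exported to a CSV file with a free-text column; the file name and column name are hypothetical:

import csv

# Keywords listed above; matching is case-insensitive.
KEYWORDS = ["software", "systems-of-systems", "system of systems",
            "assurance", "architecture", "security"]

def pull_findings(path: str) -> list[dict]:
    """Return rows of a (hypothetical) SADB export whose text contains any keyword."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row for row in csv.DictReader(f)
                if any(kw in row["finding_text"].lower() for kw in KEYWORDS)]

findings = pull_findings("sadb_export.csv")         # hypothetical export file
print(f"{len(findings)} keyword-matched findings")  # 600+ in the actual analysis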

Page 6: Systemic Analysis of Software Findings


Data Validation

• Data validation was conducted to:
  – Remove findings unrelated to software
  – Ensure that positive, neutral, and negative findings were identified properly
• Resulted in 284 findings directly related to software (see the sketch below)
  – The keyword search probably missed some software-related findings

[Chart: software-related findings by category – positive, neutral, and negative – totaling 284 (values shown: 164, 47, 73)]

We examined these software findings without a predefined taxonomy in order to allow issue areas and recurring trends to emerge.
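A minimal sketch of the validation bookkeeping, assuming each keyword-matched finding was given two hypothetical review fields, software_related and tone; the actual review was a manual expert process, not an automated one:

from collections import Counter

def validate(findings: list[dict]) -> tuple[list[dict], Counter]:
    """Drop findings marked unrelated to software and tally the rest by tone."""
    software = [f for f in findings if f.get("software_related") == "yes"]
    return software, Counter(f["tone"] for f in software)

# Example with hypothetical review fields:
software_findings, tally = validate([
    {"software_related": "yes", "tone": "negative"},
    {"software_related": "no", "tone": "neutral"},  # dropped as unrelated to software
])
print(len(software_findings), tally)  # the actual analysis retained 284 findings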

Page 7: Systemic Analysis of Software Findings


Affinity Groups for Negative Findings

Systemic Analysis Database (SADB) – Negative Affinity Groups

[Diagram: negative findings from the SADB allocated to affinity groups – SW/SE Integration, Software Development, Human Capital, Data/Metrics, Knowledge Sharing, Software Engineering Management, Software Assurance, and Requirements – with sub-topics including Project Planning, Management Oversight, Software Configuration Management, SW Metrics, EVM, Architecture, Software COTS/Reuse, Software Testing, Sustainment/Maintenance, Tech Readiness, Resources, Quality Level, Process, Reporting, Engineering, Management, Acquisition Strategy, System of Systems, Interoperability, and Tech Refresh]

Definitions of affinity groups use sources such as Software Engineering Body of Knowledge (SWEBOK) to bring consistency to the methodology

Page 8: Systemic Analysis of Software Findings


Analysis of Findings

• Conducted workshops to provide context for findings:
  – Examined findings to identify related issues based on the experience of the review participants
  – Characterized the strength of the relationship between each finding and its affinity group
  – Added issues beyond the originally identified affinity groups
• Results were transferred to a graphing editor tool (yEd) for further analysis (see the sketch below)
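One way that transfer could be scripted, assuming the finding-to-affinity-group relationships and their workshop-assigned strengths are held in a simple list; the graph is built with networkx and written as GraphML, a format yEd opens directly (the finding names and strength values below are made up):

import networkx as nx

G = nx.Graph()

# Hypothetical findings linked to affinity groups, with the workshop-assigned
# relationship strength stored as an edge weight (3 = strong, 1 = weak).
relationships = [
    ("Finding 042", "Software Engineering Management", 3),
    ("Finding 042", "Human Capital", 1),
    ("Finding 117", "Requirements", 2),
]
for finding, group, strength in relationships:
    G.add_node(finding, kind="finding")
    G.add_node(group, kind="affinity_group")
    G.add_edge(finding, group, weight=strength)

# yEd can open the GraphML file for layout and further visual analysis.
nx.write_graphml(G, "software_findings.graphml")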

Page 9: Systemic Analysis of Software Findings

Description of Issues

Management Oversight
• Insufficient tracking of program against plans throughout the lifecycle
• Underestimation of system complexity
• Failure to manage “the big picture”
  – e.g., focusing on short-term vs. long-term goals, management of SoS and GFE

Process Planning
• Lack of mature software processes, impacting management oversight

Page 10: Systemic Analysis of Software Findings

Description of Issues (2)

Human Capital
• Staff lacks software skills and experience, hindering delivery
• Insufficient availability of software leads and other key software personnel

Knowledge Sharing
• Poor communication on software issues within the program office and between organizations, resulting in poorly synchronized plans and oversight

Page 11: Systemic Analysis of Software Findings

Initial Analysis of Relationships between Affinity Groups


Page 12: Systemic Analysis of Software Findings

Path Forward

• Develop issue statements for remaining affinity groups

• Continue to examine findings for relationships between affinity groups

• Periodically query the systemic analysis database for software findings from additional reviews
  – Conduct analysis about once a year

Systemic analysis of software findings is consistent with the NDIA top software issues and overall systemic analysis findings.


Page 13: Systemic Analysis of Software Findings


Back-up Slides

Page 14: Systemic Analysis of Software Findings

Negative Software Trends [2]


▫ Lack of planning for integration within engineering plans (i.e., CONOPS and architecture)
▫ Underestimated effort required to integrate software
▫ Software complexity (GFE/COTS), requirements instability, and time constraints contribute to inadequate risk identification and management (i.e., updating of legacy systems)
▫ Lack of resources (equipment)
▫ Underestimation of available budget and resources

▫ Lack of detail in planning, leading to schedule delays
▫ Over-reliance on EVM to provide visibility into schedule risks
▫ Lack of planned software updates due to management issues (i.e., competing priorities and over-reliance on legacy tools)
▫ Poor software estimation analysis for COTS/reuse within programs

▫ Undefined expectations, leading to a lack of software metrics across the full program lifecycle
▫ Lack of comparability/integration of metrics; programs are not held accountable to comparable metrics

▫ Security classification requirements can inhibit flexibility and information exchange
▫ Lack of software assurance guidelines, evident in a lack of coordination across security plans/processes, unclear countermeasure efforts/techniques, and a lack of understanding of foreign-involvement standards

▫ Inadequate requirements management process, causing undeveloped definition of requirements and lack of traceability
▫ Requirements gathering is incomplete (i.e., lack of funding, over-reliance on contractor, staff experience, and immature technology)

▫ Inconsistent test process management, especially during planning

▫ Lack of emphasis on software architecture priorities in software requirements documents
▫ Lack of, or poorly designed, software architecture

▫ Lack of emphasis on the configuration management process

[Diagram labels: Process Planning; Management Oversight; Knowledge Sharing; Human Capital; Schedule Estimation; Resource Allocation; Risk Management; Configuration Management; Software Assurance; Data and Metrics; Requirements; Systems Engineering and Software Integration; Architecture; Software COTS/Reuse; Software Testing Techniques and Process Management; Sustainment/Maintenance]

Page 15: Systemic Analysis of Software Findings

Relationships between Issues


▫ Inadequate knowledge management system and poor communication within the program and between organizations (i.e., management and contractors)
▫ Lack of understanding and poor communication involving contract language, program plans, legacy systems, schedule status, software complexity, and software architecture

▫ Staff lacks software skills and experience, hindering software delivery
▫ Lack of staffing for software leads and other key positions to meet workload requirements
▫ Lack of authority to manage integration of systems (i.e., multi-platform, legacy systems)

▫ Insufficient tracking of program and plan implementation throughout the lifecycle
▫ Underestimation of system complexity
▫ Program is not keeping the overall picture in mind (i.e., focused on short-term rather than long-term goals)

▫ Lack of mature software processes to aid in management oversight

[Diagram: relationships between the Process Planning, Management Oversight, Knowledge Sharing, and Human Capital issues and the other topic areas – Schedule Estimation, Resource Allocation, Risk Management, Configuration Management, Software Assurance, Data and Metrics, Requirements, Systems Engineering and Software Integration, Architecture, Software COTS/Reuse, Software Testing Techniques and Process Management, and Sustainment/Maintenance]

Page 16: Systemic Analysis of Software Findings

Common Threads


[Diagram: 1st and 2nd common threads through the findings]

Thread Definition: In arguments about specific events, a reason for seeing X as the cause of Y. X must be the only factor common to more than one example of Y, and the examples of Y should not be linked by chance.

Page 17: Systemic Analysis of Software Findings


Affinity Group Definitions [1]

Software Engineering Management: Application of management activities – planning, coordinating, measuring, monitoring, controlling, and reporting – to ensure that the development and maintenance of software is systematic, disciplined, and quantified

Requirements: A property which must be exhibited in order to solve some problem in the real world

Data/Metrics: Measure of some property for a piece of software or its specifications

Software Assurance: Relates to the level of confidence that software functions as intended and is free of vulnerabilities, either intentionally or unintentionally designed or inserted as part of the software

*See the SADB Affinity Group Definitions Word document for the complete set of definitions

Page 18: Systemic Analysis of Software Findings


Affinity Group Definitions [2]

SW/SE Integration: Bringing together of the component subsystems into one system and ensuring that the subsystems function together as a system; the process of linking together different computing systems and software applications physically or functionally

Human Capital: Stock of productive skills and technical knowledge embodied in the workforce

Knowledge Sharing: Ensuring communication of information and sources both within and between programs and organizations

Software Development: Encompasses software engineering processes combined with research and goals to develop computer software products

*See the SADB Affinity Group Definitions Word document for the complete set of definitions

Page 19: Systemic Analysis of Software Findings


Challenge

[Diagram: reviews (PSRs, NARs, Nunn-McCurdy reviews, AOTRs) feed the Systemic Analysis Database (SADB) and the SSA Software Systemic Analysis Process, whose outputs include briefings as part of outreach, guidance development, preparation and training for reviewers, and internal SSA PPBE and control]

Challenge: Define a consistent and flexible SSA Software Systemic Analysis Process that will be used to identify the top positive, neutral, and negative recurring software trends within Acquisition Category (ACAT) 1D programs.