University of Southern California
Center for Systems and Software Engineering
Using Software Project Courses to Integrate Education and Research
Barry Boehm
November 18, 2009
Outline

• Nature of real-client project course
  – Primarily USC campus, neighborhood e-services
  – MS-level; 2-semester; 6-8 person teams
  – Well-instrumented for continuous improvement
• Research/education integration via project experiments
  – Validate new methods and tools via project usage
  – Partial basis of 12 PhD dissertations: requirements negotiation and formalization (3), COTS integration (2), value-based methods (3), agile methods (1), quality tradeoffs (1), risk analysis (1), cost estimation (1)
• Conclusions
2009-10 Software Engineering Projects
Project | Client | Organization
Online DB support for CSCI 511 | Jim Alstad | CSCI 511
SHIELDS for Family | Reginald Van Appelen | SHIELDS for Families
Theater Stage Manager Program | Julie Sanchez | Jules T Bear
Growing Great Online | Matt McMahon | GrowingGreat
SPC Website Automation Enhancement | Dean L. Jones | Southland Partnership Corporation
VALE Information Management System | Pamela Clay | Livingadvantage Inc.
LANI D-Base | Ashley Westman | Los Angeles Neighborhood Initiative (LANI)
Freehelplist.org | Stephen Wolfson | Freehelplist
Early Medieval East Asian Timeline | Ken Klien | USC East Asian Library
BHCC Website Development | Cesar Armendariz | Boyle Heights Chamber of Commerce
Client Case Management Database | Marcy Pullard | Avenue Of Independence
Website Development: Avenue Of Independence | Marcy Pullard | Avenue Of Independence
Healthcare The Rightway | Roderick Foreman | Right Way Direction
AROHE Web Development | Janette Brown | AROHE
MBASE Model Integration: LCO Stage
[Diagram: MBASE model integration at the LCO stage. The Domain Model, WinWin Taxonomy, Basic Concept of Operation, and Frequent Risks identify the stakeholders and their primary win conditions. The WinWin Negotiation Model, IKIWISI Model, Prototypes, Properties Models, and Environment Models are exercised to produce WinWin Agreements and a Shared Vision, Viable Architecture Options, an Updated ConOps and Business Case, Life Cycle Plan elements, Outstanding LCO risks, a Requirements Description, and the LCO Rationale. These elements form the Life Cycle Objectives (LCO) Package, the first Anchor Point Model; the models iterate to feasibility and consistency, which determine the LCO exit criteria and validate readiness to proceed.]
S&C Subdomain (General)

Type of Application: Multimedia Archive
Projects (nos.): 1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 20, 31, 32, 35, 36, 37, 39

[Simple block diagram: queries and update notifications flow through a Catalog of MM asset info to the MM Archive, which receives MM asset updates.]

Developer Simplifiers:
– Use standard query languages
– Use standard or COTS search engine
– Uniform media formats

Developer Complicators:
– Natural language processing
– Automated cataloging or indexing
– Digitizing large archives
– Digitizing complex or fragile artifacts
– Automated annotation/description or meanings to digital assets
– Integration of legacy systems
– Rapid access to large archives
– Access to heterogeneous media collections
The Results

• Projects that failed LCO criteria (table below)
• Post-1997 failures due to non-S&C causes
  – Team cohesion, client outages, poor performance

Year | 1996-97 | 1997-98 | 1998-99 | 1999-2000 | ... | 2004-05 | 2005-06 | 2006-07
Teams | 15 | 16 | 20 | 21 | ... | 17 | 23 | 21
Teams failing LCO | 4 | 4 | 1 | 1 | ... | 0 | 1 | 1
Teams failing LCA review | 0 | 0 | 0 | 0 | ... | 0 | 1 | 1
Outline

• Nature of real-client project course
  – Primarily USC campus, neighborhood e-services
  – MS-level; 2-semester; 6-8 person teams
  – Well-instrumented for continuous improvement
• Research/education integration via project experiments
  – Validate new methods and tools via project usage
  – Partial basis of 12 PhD dissertations: requirements negotiation and formalization (3), COTS integration (2), value-based methods (3), agile methods (1), quality tradeoffs (1), risk analysis (1), cost estimation (1)
• Conclusions and references
Empirical Software Engineering Research
• Empirical software engineering research is generally slow
  – Projects take 2-5 years to complete
  – Improvements confounded with other factors
  – Data generally sparse, hard to compare across projects
• Team projects are the ESE equivalent of the fruit fly
  – 20 per year, real clients, using industrial-grade processes
  – Teams include 2 off-campus working professionals
  – 1-2 of the 6 on-campus students have 1-2 years of work experience
  – Extensive data, consistently collected
  – Opportunities to run (partially) controlled experiments
• Projects, teams not identical
Project Course Experience Factory

[Diagram: projects organization feeding an experience factory. Project side: 1. Characterize; 2. Set Goals; 3. Tailor Process; execution plans drive 4. Execute Process, backed by project support; data and lessons learned feed 5. Analyze, whose products, lessons learned, and models feed 6. Technology activities: Initialize, Apply, Refine, Formalize, Disseminate. The Experience Base supplies environment characteristics, tailorable technology, and mentoring to the projects, and receives project analysis and process modifications in return.]
WikiWinWin – Tool

• A shaper facilitates the negotiation in WikiWinWin:
  – Initial ideas are surfaced at the meeting
  – The shaper organizes them into prospective win conditions after the meeting
  – Stakeholders engage in further discussion
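A hypothetical data sketch of that flow: meeting ideas become prospective win conditions after the shaper's pass, then collect stakeholder comments. All names and states here are illustrative, not taken from the WikiWinWin tool itself.

```python
from dataclasses import dataclass, field

@dataclass
class WinCondition:
    text: str
    state: str = "idea"               # idea -> prospective -> agreed
    comments: list = field(default_factory=list)

def shape(ideas):
    """The shaper's post-meeting pass: raw ideas become prospective win conditions."""
    return [WinCondition(text=i, state="prospective") for i in ideas]

wcs = shape(["Search the archive by keyword", "Page loads under 2 s"])
wcs[0].comments.append("Client: should also cover metadata fields")  # further discussion
print([(w.text, w.state, len(w.comments)) for w in wcs])
```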
WikiWinWin – Current Progress

• Initial results (Fall 2007)
  – Correlation between usage and outcome
  – Not all feedback was positive

[Charts: LCO package quality shortfall vs. usage by team; LCO package quality shortfall vs. usage by shaper]
Growth of COTS-Based USC e-Services Projects

– Requires new processes, architectures, methods
– It's not "all about programming" anymore
– Similar trends at Motorola, other USC-CSE Affiliates

[Chart: percentage of COTS-based USC e-services projects per year, 1997-2001 (y-axis 0-70%); * marks the industry figure from the 2001 Standish Report]
Axiom 1. Process Happens Where the Effort Happens
• COTS Assessment, Tailoring, Glue Code/Integration
[Charts: CBA effort distribution across Assessment, Tailoring, and Glue Code (stacked percentages, projects 1-17): a. CBA effort distribution of USC e-service projects; b. CBA effort distribution of industry COCOTS calibration data]
CBA Spiral Framework
[Flowchart legend: boxes = task/process; diamonds = decision/review; process elements tagged A (Assessment), T (Tailoring), G (Glue-Code).]

P1: Identify Objectives, Constraints and Priorities (OC&Ps)
P2: Do relevant COTS products exist? (yes or unsure: continue to P3)
P3: Assess COTS candidates [A]
  – No acceptable COTS-based solution: go to P6
  – Single COTS solution best: go to P4
  – Multiple COTS products relevant: go to P5
P4: Tailoring required? (yes: P11; no: P12)
P5: Multiple COTS cover all OC&Ps? (yes, multiple COTS required to satisfy OC&Ps: P10; no, custom code required to satisfy all OC&Ps: P8)
P6: Can adjust OC&Ps? (yes: respiral from P1; no: P7)
P7: Custom development
P8: Coordinate custom code and glue code development (leads to P9 and P10)
P9: Develop custom code
P10: Develop glue code [G]
P11: Tailor COTS [T]
P12: Production, test, and transition; then deploy
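Read as code, the flowchart's branching structure looks roughly like the sketch below. This is an illustrative rendering under my reading of the decision branches; the assessment outcome is injected as a stub, and names such as Assessment and cba_spiral are mine, not the framework's.

```python
from dataclasses import dataclass, field

@dataclass
class Assessment:                      # outcome of P3
    acceptable: bool = True            # any acceptable COTS-based solution?
    single_best: bool = False          # single COTS solution best?
    needs_tailoring: bool = False      # P4: tailoring required?
    covers_all_ocps: bool = True       # P5: multiple COTS cover all OC&Ps?
    steps: list = field(default_factory=list)

def cba_spiral(assess, can_adjust_ocps, max_spirals=3):
    for _ in range(max_spirals):
        a = assess()                                   # P1-P3: OC&Ps, assessment
        if not a.acceptable:
            if can_adjust_ocps():                      # P6: respiral with revised OC&Ps
                continue
            return ["P7: custom development"]
        if a.single_best:
            if a.needs_tailoring:                      # P4
                a.steps.append("P11: tailor COTS")
        elif a.covers_all_ocps:                        # P5: yes
            a.steps.append("P10: develop glue code")
        else:                                          # P5: no, custom code needed
            a.steps += ["P8: coordinate custom and glue code development",
                        "P9: develop custom code",
                        "P10: develop glue code"]
        return a.steps + ["P12: production, test, transition", "deploy"]
    return ["no acceptable COTS-based solution"]

# Example: several COTS packages jointly cover the OC&Ps.
print(cba_spiral(lambda: Assessment(), lambda: False))
```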
COTS Assessment Example
• USC Collaborative Services (USCCS)
  – Objectives:
    • Project management
    • File management
    • Discussion board
    • Project calendaring
    • Chat room
  – COTS candidates:
    • eProject (577a and 577b)
    • Dot Project (577a)
    • eStudio (577a)
    • eRoom (577a)
    • Blackboard (577b)
    • iPlanet (577b)
Example: USCCS Evaluation Results
Weight | Evaluation Criteria | eProject Score | eProject Wtd. | iPlanet Score | iPlanet Wtd. | Blackboard Score | Blackboard Wtd.
90 | Inter-component compatibility | 6.4 | 576 | 7 | 630 | 8 | 720
150 | Product performance | 8.8 | 1320 | 9 | 720 | 8 | 1200
500 | Functionality | 8.239 | 4119.5 | 5 | 500 | 7.28 | 3640
60 | Documentation understandability | 8.2 | 492 | 9 | 540 | 9 | 540
80 | Flexibility | 6.6 | 528 | 8 | 640 | 8 | 640
50 | Maturity of the product | 9.8 | 490 | 10 | 500 | 9 | 450
80 | Vendor support | 9.2 | 736 | 9 | 720 | 9 | 720
150 | Security | 9.2 | 1380 | 9 | 810 | 9 | 1350
150 | Ease of use | 8.4 | 1260 | 8 | 720 | 7 | 1050
80 | Training | 7 | 560 | 7 | 560 | 8 | 640
60 | Ease of installation/upgrade | 7 | 420 | 7 | 420 | 8 | 480
60 | Ease of maintenance | 6.2 | 372 | 8 | 480 | 8 | 480
100 | Scalability | 8.4 | 840 | 9 | 900 | 8 | 800
60 | Vendor viability/stability | 8.6 | 516 | 8 | 480 | 9 | 540
100 | Compatibility with USC IT infrastructure | 6.6 | 660 | 10 | 1000 | 9 | 900
80 | Evolution ability | 7.2 | 576 | 8 | 640 | 9 | 720
90 | Ease of integration with third-party software | 9 | 810 | 8 | 720 | 8 | 720
1850 | Total | | 15655.5 | | 14630 | | 15590
| Average | 8.06985 | | 7.5412 | | 8.0361 |
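The totals above are weighted sums (weight times score per criterion, summed over criteria). A minimal sketch of this scoring scheme in Python, using a few hypothetical criteria and scores rather than the full 17-criterion course data:

```python
# Illustrative weighted-sum COTS scoring in the style of the USCCS evaluation.
# Weights and scores below are hypothetical stand-ins.

def evaluate(weights, scores):
    """Return (weighted total, weight-normalized average) for one candidate."""
    total = sum(weights[c] * scores[c] for c in weights)
    return total, total / sum(weights.values())

weights = {"functionality": 500, "security": 150, "ease of use": 150}
candidates = {
    "eProject":   {"functionality": 8.2, "security": 9.2, "ease of use": 8.4},
    "Blackboard": {"functionality": 7.3, "security": 9.0, "ease of use": 7.0},
}

for name, scores in candidates.items():
    total, avg = evaluate(weights, scores)
    print(f"{name}: weighted total = {total:.1f}, weighted average = {avg:.2f}")
```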
Interoperability Evaluation Framework

[Diagram: interfaces, integration rules and strategies, and COTS representation attributes, together with the COTS components and proposed system architecture, feed the COTS Interoperability Evaluator (StudioI). The evaluator produces a COTS interoperability analysis report for the developer and estimates the lines of glue code, which feed the COCOTS glue-code estimation model [Chris Abts 2002] to produce a cost and effort estimate to integrate the COTS products.]
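One way the output side of this pipeline could be wired up, sketched in Python: the evaluator's glue-code size estimate feeds an effort model. The nominal form effort = A * KSLOC**B below is a generic COCOMO-style stand-in with hypothetical constants, not the calibrated COCOTS glue-code model, which has its own constants and cost drivers.

```python
def glue_code_effort(gloc: float, a: float = 3.0, b: float = 1.12) -> float:
    """Person-months from estimated lines of glue code.
    Constants a and b are hypothetical, not COCOTS calibrations."""
    return a * (gloc / 1000) ** b

# e.g., the evaluator estimates 2,400 lines of glue code for a proposed assembly
print(f"{glue_code_effort(2400):.1f} person-months")
```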
iStudio Tool
Experiment 1 Results

Data Set | Group | Mean | Standard Deviation | P-Value
Dependency accuracy* | Pre-framework application | 79.3% | 17.9 | 0.017
Dependency accuracy* | Post-framework application | 100% | 0 |
Interface accuracy** | Pre-framework application | 76.9% | 14.4 | 0.0029
Interface accuracy** | Post-framework application | 100% | 0 |
Actual assessment effort | Projects using the framework | 1.53 hrs | 1.71 | 0.053
Actual assessment effort | Equivalent projects not using the framework | 5 hrs | 3.46 |
Actual integration effort | Projects using the framework | 9.5 hrs | 2.17 | 0.0003
Actual integration effort | Equivalent projects not using the framework | 18.2 hrs | 3.37 |
* Accuracy of dependency assessment: 1 - (number of unidentified dependencies / total number of dependencies)
** Accuracy of interface assessment: 1 - (number of unidentified interface interaction mismatches / total number of interface interactions)
Accuracy: a quantitative measure of the magnitude of error [IEEE 1990]
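The two starred measures, written out as formulas (a direct transcription of the definitions above):

```latex
\[
A_{\text{dependency}} = 1 - \frac{N_{\text{unidentified dependencies}}}{N_{\text{total dependencies}}}
\qquad
A_{\text{interface}} = 1 - \frac{N_{\text{unidentified mismatches}}}{N_{\text{total interface interactions}}}
\]
```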
Model Clashes Occurrence Probability & Severity: 35-Project RAD Sample

Clash 1 (OP 0.86, SV H): Developer needs extensive, on-demand user/customer interaction (Dev, SS) vs. user/customer are available at limited times (User/Cust, SS)
Clash 2 (OP 0.46, SV H): User/customer are free to add/modify the system requirements during system implementation (User/Cust, PD) vs. changes/additions to system requirements require extra budget and schedule (Dev, PP)

S: Stakeholder; M: Model; OP: Occurrence Probability; SV: Severity
Clash Types and their Contribution to Project Risk

Clash type | % of Clashes | % of Risk
Success-Property | 4 | 6
Success-Product | 12 | 17
Success-Success | 3 | 4
Product-Property | 16 | 20
Process-Property | 4 | 5
Property-Property | 13 | 12
Product-Product | 30 | 24
Process-Process | 7 | 5
Product-Process | 6 | 4
Success-Process | 5 | 3

• The majority of research (product-product clashes) addresses a minority of the risk
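The bullet can be checked against the table: dividing each type's share of risk by its share of clashes gives a risk-to-clash ratio, and product-product clashes come out below 1 despite being the most common type. A minimal recomputation in Python, using the numbers from the table above:

```python
# Risk-to-clash ratios per clash type; ratio > 1 means the type contributes
# more risk than its frequency would suggest. Numbers are from the table above.

clashes = {"Success-Property": 4, "Success-Product": 12, "Success-Success": 3,
           "Product-Property": 16, "Process-Property": 4, "Property-Property": 13,
           "Product-Product": 30, "Process-Process": 7, "Product-Process": 6,
           "Success-Process": 5}
risk = {"Success-Property": 6, "Success-Product": 17, "Success-Success": 4,
        "Product-Property": 20, "Process-Property": 5, "Property-Property": 12,
        "Product-Product": 24, "Process-Process": 5, "Product-Process": 4,
        "Success-Process": 3}

for t in sorted(clashes, key=lambda t: risk[t] / clashes[t], reverse=True):
    print(f"{t:18s} risk/clash ratio = {risk[t] / clashes[t]:.2f}")
```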
Inter- and Intra-Model Clashes and their Contribution to Project Risk

[Charts: distribution of inter- vs. intra-model clashes (% of total model clashes) and their contribution to total risk (% of total risk); values shown on the panels: 47%, 53%, 55%, 43%]

• Inter-model clashes caused the majority of the risk
Value-Based Review Process (II)

[Diagram: a negotiation meeting among developers, customers, users, and other stakeholders yields the priorities of system capabilities; a domain expert and a general value-based checklist yield the criticalities of issues; these, plus an artifacts-oriented checklist, drive the reviewing of artifacts.]

Review order by capability priority and issue criticality (number indicates the usual ordering of review*):

Priority \ Criticality | High | Medium | Low
High | 1 | 2 | 3
Medium | 4 | 5 | optional
Low | 6 | optional | optional

* May be more cost-effective to review highly-coupled mixed-priority artifacts.
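A small sketch of how the ordering matrix might drive a review plan. The ORDER mapping transcribes my reading of the matrix above (None marks the optional cells); the function and artifact names are illustrative, not from the course materials.

```python
# Review ordering by (capability priority, issue criticality).

ORDER = {
    ("high",   "high"): 1, ("high",   "medium"): 2,    ("high", "low"): 3,
    ("medium", "high"): 4, ("medium", "medium"): 5,    ("medium", "low"): None,
    ("low",    "high"): 6, ("low",    "medium"): None, ("low", "low"): None,
}

def review_plan(artifacts):
    """artifacts: list of (name, priority, criticality) triples.
    Returns mandatory reviews in matrix order, then the optional ones."""
    mandatory = [(ORDER[(p, c)], n) for n, p, c in artifacts if ORDER[(p, c)]]
    optional = [n for n, p, c in artifacts if ORDER[(p, c)] is None]
    return [n for _, n in sorted(mandatory)], optional

todo, maybe = review_plan([
    ("SSRD security requirement", "high", "high"),
    ("OCD workload characterization", "medium", "medium"),
    ("GUI error-message detail", "low", "low"),
])
print("review now:", todo)
print("optional:", maybe)
```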
Value-Based Checklist (I): General Value-Based Checklist

Completeness
• High-criticality issues: critical missing elements (backup/recovery, external interfaces, success-critical stakeholders, critical exception handling, missing priorities); critical missing processes and tools, including planning and preparation for major downstream tasks (development, integration, test, transition); critical missing project assumptions (client responsiveness, COTS adequacy, needed resources)
• Medium-criticality issues: medium-criticality missing elements, processes, and tools (maintenance and diagnostic support, user help); medium-criticality exceptions and off-nominal conditions; smaller tasks (reviews, client demos); missing desired growth capabilities; workload characterization
• Low-criticality issues: easily-deferrable, low-impact missing elements (straightforward error messages, help messages, GUI details doable via GUI builder, project task sequence details)

Consistency/Feasibility
• High: critical elements in OCD, SSRD, SSAD, LCP not traceable to each other; critical inter-artifact inconsistencies (priorities, assumptions, input/output, preconditions/postconditions); missing evidence of critical consistency/feasibility assurance in FRD
• Medium: medium-criticality shortfalls in traceability, inter-artifact inconsistencies, evidence of consistency/feasibility in FRD
• Low: easily-deferrable, low-impact inconsistencies or inexplicit traceability (GUI details, report details, error messages, help messages, grammatical errors)

Ambiguity
• High: vaguely defined critical dependability capabilities (fault tolerance, graceful degradation, interoperability, safety, security, survivability); critical misleading ambiguities (stakeholder intent, acceptance criteria, critical user decision support, terminology)
• Medium: vaguely defined medium-criticality capabilities, test criteria; medium-criticality misleading ambiguities
• Low: non-misleading, easily-deferrable, low-impact ambiguities (GUI details, report details, error messages, help messages, grammatical errors)

Conformance
• High: lack of conformance with critical operational standards, external interfaces
• Medium: lack of conformance with medium-criticality operational standards, external interfaces; misleading lack of conformance with document formatting standards, method and tool conventions
• Low: non-misleading lack of conformance with document formatting standards, method and tool conventions, optional or low-impact operational standards

Risk
• High: missing FRD evidence of critical capability feasibility (high-priority features, levels of service, budgets and schedules); critical risks in top-10 risk checklist (personnel, budgets and schedules, requirements, COTS, architecture, technology)
• Medium: missing FRD evidence of mitigation strategies for low-probability, high-impact or high-probability, low-impact risks (unlikely disasters, off-line service delays, missing but easily-available information)
• Low: missing FRD evidence of mitigation strategies for low-probability, low-impact risks
Value-Based Reading (VBR) Experiment (Keun Lee, ISESE 2005)

• Group A: 15 IV&V personnel using VBR procedures and checklists
• Group B: 13 IV&V personnel using previous value-neutral checklists
  – Significantly higher numbers of trivial typo and grammar faults

By number | P-value | % Gr A higher
Average of concerns | 0.202 | 34
Average of problems | 0.056 | 51
Average of concerns per hour | 0.026 | 55
Average of problems per hour | 0.023 | 61

By impact | P-value | % Gr A higher
Average impact of concerns | 0.049 | 65
Average impact of problems | 0.012 | 89
Average cost-effectiveness of concerns | 0.004 | 105
Average cost-effectiveness of problems | 0.007 | 108
Pair Development vs. Fagan Inspection
TDC = Total Development Costs (all costs in man-hours)

Experiment | Group | TDC | Production Costs | Appraisal Costs | Rework Costs
E1 (Thailand 05) | PD | 526.73 | 314.02 | 102.07 | 8.03
E1 (Thailand 05) | FI | 695.11 | 309.23 | 234.97 | 43.72
E2 (Thailand 05) | PD | 336.66 | 186.67 | 73.33 | 13.67
E2 (Thailand 05) | FI | 482.5 | 208.5 | 165 | 45
E3 (Thailand 05) | PD | 1392.9 | 654.2 | 325.7 | 233
E3 (Thailand 05) | FI | 1342 | 429 | 436 | 317
E4 (US 05) | PD | 187.54 | 68.16 | 88.83 | 20.05
E4 (US 05) | FI | 237.93 | 62.82 | 122.10 | 42.52
Lean MBASE Effort Comparison

[Charts: (a) comparison of effort in generating the I&E document set: average effort per team (hours) for the OCD, SSRD, SSAD, LCP, and FRD Inception and Elaboration documents in Fall 03 (MBASE), Fall 04 (MBASE), Fall 05 (LeanMBASE), and Fall 06 (LeanMBASE); (b) hours per page in generating the I&E document set, same documents and semesters]

• Average hours spent on documentation: less effort under LeanMBASE, except SSAD in Fall 2005
• Average hours per page: fewer hours per page under LeanMBASE, except SSRD in Fall 2006
ICM Electronic Process Guide
Integrating Software Research and Education

• Empirical software engineering research is generally slow
  – Projects take 2-5 years to complete
  – Improvements confounded with other factors
  – Data generally sparse, hard to compare across projects
• MS-student projects are the ESE equivalent of the fruit fly
  – 20 per year, real clients, using industrial-grade processes
  – Extensive data, consistently collected
  – Opportunities to run (partially) controlled experiments
• Projects, teams not identical
  – Results frequently correlate with industry experience
• Results strengthen future educational experiences