
Chapter 8. The Process Phases

1. Launching teams
2. The development strategy
3. Team planning
4. The requirements process
5. Designing with teams
6. Implementation
7. Test
8. The postmortem

8.1. Launching Teams

In the launch phase, the likely questions concern roles, goals, the team meeting, and data requirements.

Team roles

During the launch phase, you allocate the students to teams and give them their role assignments. The key requirement for a high-performance team is that everyone does their own job plus something extra. It is that something extra that makes great teams.

Team goals

Teams should review their goals, make sure they understand them, and then use these goals to guide their work. Also, suggest that the teams periodically reexamine their goals to make sure they have not lost sight of their overall objectives.

The team meeting

The team meeting is the principal communication mechanism among team members. The teams hold their first weekly meeting during the launch phase. In this meeting, the first order of business is to decide where and when to hold their regular weekly meetings.

Data requirements

During the launch phase, the team members discuss how to complete the required data forms and agree on when they will provide them to the planning manager.

8.2. The Development Strategy

Students are often confused about what a strategy is and why one is needed. Discuss these points and explain the criteria used for evaluating alternative strategies.

Producing the strategy

The teams select a strategy, decide how to divide the work among the development cycles, and document the strategy on the STRAT form. Finally, suggest that they attach a brief description of the conceptual design to this STRAT form.

TSPi Strategy Form - Form STRAT

Header fields: Name, Date, Team, Instructor, Part/Level, Cycle.
Columns: Ref. | Functions | Cycle LOC (cycles 1, 2, 3) | Cycle Hours (cycles 1, 2, 3). The final row holds the Totals.

TSPi Strategy Form Instructions - Form STRAT

Purpose: This form is used to record strategic decisions. It is used during strategy development to allocate product functions to cycles. It is also used during high-level design to allocate SRS functions to components.

General: This form suggests a way to record strategic decisions. Use it or any other format that contains the same data.

Header: Enter your name, date, team name, and instructor's name. Name the part or assembly and its level. Enter the cycle number.

Reference: Use this column to list the need statement or SRS paragraph or sentence number for every function.

Functions: In this column, list all the functions to be included in the product in all cycles.

Cycle LOC: Use these columns for the estimated LOC for each function. Enter the LOC estimated for each function under the number of the cycle that will include that function. If you plan to implement a function partially in two or even three of the cycles, enter the estimated new and changed LOC for each cycle. If one function is included in another function's LOC, mark it with an X.

Cycle Hours: Use these columns for the estimated time required to develop each function. Enter the time estimated for each function under the number of the cycle where you plan to include that function. If you plan to implement a function partially in two or even three of the cycles, enter the estimated development time for each cycle. If one function is included in another function's LOC, mark it with an X.
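To make the cycle allocations concrete, the following is a minimal sketch (in Python, not a TSPi form or tool) of the data the STRAT form captures and of the per-cycle totals; the references, function names, and estimates are hypothetical.

    # A minimal sketch of the STRAT data: each function carries its
    # need-statement/SRS reference and the estimated LOC and hours
    # allocated to each development cycle. All values are hypothetical.
    strategy = [
        # (reference, function, {cycle: LOC}, {cycle: hours})
        ("SRS 3.1", "change-counting engine", {1: 250},        {1: 12}),
        ("SRS 3.2", "multi-file totals",      {1: 80, 2: 120}, {1: 4, 2: 6}),
        ("SRS 3.4", "history reporting",      {3: 300},        {3: 15}),
    ]

    for cycle in (1, 2, 3):
        loc = sum(per_cycle.get(cycle, 0) for _, _, per_cycle, _ in strategy)
        hrs = sum(per_cycle.get(cycle, 0) for _, _, _, per_cycle in strategy)
        print(f"Cycle {cycle}: {loc} LOC, {hrs} hours")

The per-cycle LOC and hour totals are what the team uses to judge whether the planned work is balanced across the cycles.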

Workload balance

While teams should not attempt to develop too large a product in the first development cycle, they should also realize that the subsequent cycles will have much shorter schedules.

System infrastructure

Need statements typically define the functions users will see but say little about the system facilities needed to provide these functions. There is, however, a lot more to systems than the user-visible behavior.

The configuration management plan

The configuration management plan need only consist of naming the CCB members, establishing CCB meeting guidelines, and ensuring that the team understands the configuration change request (CCR) procedure. The support manager also needs to produce a configuration status report (CSR) every week.

8.3. Team Planning

One issue that is often confusing is the relationship between the conceptual design produced in the strategy phase and the conceptual design used in the planning phase. A second possible area of confusion is the quality plan. A third common area of confusion is workload balancing. Depending on the accuracy of the team's plan, the engineers might even have to rebalance their workload every week.

8.4. The Requirements Process

Students are often confused about the need for a requirements document. They assume that the need statement describes what is wanted and wonder why an SRS is needed. The need statement, however, describes what the customer wants while the SRS specifies what the team intends to build.

The system test plan

When teams wait until after the product is designed to produce the system test plan, they tend to test what they are building, not what the requirements specify. Such tests will generally fail to identify missing functions or to reveal major design errors. Producing the system test plan during the requirements phase makes these problems less likely.

Inspecting the requirements and test plan

The key point is to make sure the teams inspect the requirements and that they also inspect the system test plan at the same time.

Requirements baseline

Once the SRS has been inspected, reviewed with the instructor, and approved, make sure that it is baselined. Also, check that the teams use the CCR procedure for every change to the baselined requirements and report these activities in the CSR for that week.

8.5. Designing with Teams

In producing the first-cycle design, the teams should anticipate the enhancements planned for subsequent cycles. Since high-level design mistakes can be extraordinarily difficult to find and fix during implementation and test, each team should do a very careful design inspection.

Design standards and methods

Most students, and even many working engineers, do not truly believe that design is important. They may produce vague design notes or brief sketches or even start coding directly from the requirements.

Inspecting the design and test plan

Again, make sure the students have read Appendix C in the textbook before they do the inspections.

Design baseline

Reemphasize to the teams that they must baseline their products once they have completed and inspected them. They must not then change these products without using the CCR procedure.

8.6. Implementation

Just as in the PSP, the students should make personal implementation-phase plans for each program before they start its detailed design. They then document these plans on the SUMP and SUMQ forms.

TSPi Plan Summary - Form SUMP

Header fields: Team, Name, Date, Instructor, Part/Level, Cycle. Each section has Plan and Actual columns; the time and defect sections also have an Actual % column.

Product Size
- Requirements pages (SRS), other text pages, high-level design pages (SDS), and detailed design lines
- Base LOC (B): measured
- Deleted LOC (D): estimated / counted
- Modified LOC (M): estimated / counted
- Added LOC (A): N - M (plan) / T - B + D - R (actual)
- Reused LOC (R): estimated / counted
- Total New & Changed LOC (N): estimated (plan) / A + M (actual)
- Total LOC (T): N + B - M - D + R (plan) / measured (actual)
- Total New Reuse LOC, Estimated Object LOC (E), and the 70% upper and lower prediction intervals

Time in Phase (hours): management and miscellaneous, launch and strategy, planning, requirements, system test plan, requirements inspection, high-level design, integration test plan, high-level design inspection, implementation planning, detailed design, detailed design review, test development, detailed design inspection, code, code review, compile, code inspection, unit test, build and integration, system test, documentation, postmortem, and total, plus the 70% upper and lower prediction intervals for total time.

Defects Injected and Defects Removed (by phase): strategy and planning, requirements, system test plan, requirements inspection, high-level design, integration test plan, high-level design inspection, detailed design, detailed design review, test development, detailed design inspection, code, code review, compile, code inspection, unit test, build and integration, system test, and total development.

TSPi Plan Summary Instructions - Form SUMP

Purpose: This form holds plan and actual data for program parts or assemblies.

General: An assembly could be a system with multiple products, a product with multiple components, or a component with multiple modules. A part could be a module, component, or product. Note: the lowest-level parts or modules typically have no system-level data, such as requirements, high-level design, or system test.

Using the TSPi tool: When using the TSPi tool, the plan values are automatically generated. The time and size data are computed from the TASK and SUMS forms. The defect values are automatically generated during the quality planning process (SUMQ). The actual values are also automatically generated by the TSPi tool. Time and size values come from the LOGT, TASK, and SUMS forms. Defect data come from the LOGD forms. When not using the TSPi tool, follow the instructions below.

Header: Enter your name, date, team name, and instructor's name. Name the part or assembly and its level. Enter the cycle number.

Columns: Plan: this column holds the part or assembly plan data. Actual: for assemblies, this column holds the sum of the actual data for the parts of the assembly (at the lowest level, the modules).

Product Size: For text and designs, enter only the new and changed size data. For program parts or assemblies, enter all the indicated LOC data. Obtain the data from the SUMS form.

Time in Phase: Enter estimated and actual time by phase. For parts, obtain these data from the TASK forms for those parts. For assemblies, obtain the part-level time data from the totals on the SUMT form and the assembly-level data from the assembly-level TASK form. For example, HLD time would come from the assembly TASK form, while total part unit-test time would come from the SUMT form. Actual %: enter the percent of the actual development time by phase.

Defects Injected: Enter estimated and actual defects injected by phase. Enter the defect estimates while producing the quality plan. For parts, obtain actual data from the LOGD forms for those parts. For assemblies, get part-level defect data from the totals of the SUMDI form and assembly-level data from the assembly LOGD form. For example, HLD defects would come from the assembly LOGD form, while the total part coding defects would come from the SUMDI form. Actual %: enter the percent of the actual defects injected by phase.

Defects Removed: Enter estimated and actual defects removed by phase. Enter the defect estimates while producing the quality plan. For parts, obtain actual data from the LOGD forms for those parts. For assemblies, obtain part-level defect data from the totals of the SUMDR form and assembly-level data from the assembly LOGD form. For example, HLD review defects would come from the assembly LOGD form, while the total part code-review defects would come from the SUMDR form. Actual %: enter the percent of the actual defects removed by phase.
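The LOC accounting on the SUMP form follows directly from the definitions above. The following is a minimal sketch (not the TSPi tool) of those relationships; the numbers in the example are hypothetical.

    # A minimal sketch of the SUMP size accounting, using the LOC
    # definitions shown on the form above.
    def sump_size(base, deleted, modified, added, reused):
        """Return the derived SUMP size measures.

        base     (B): LOC in the base program before this cycle
        deleted  (D): base LOC deleted
        modified (M): base LOC modified
        added    (A): LOC added to the base or written new
        reused   (R): LOC taken unchanged from the reuse library
        """
        new_and_changed = added + modified                            # N = A + M
        total = new_and_changed + base - modified - deleted + reused  # T = N + B - M - D + R
        # Cross-check: A can also be derived as T - B + D - R
        assert added == total - base + deleted - reused
        return {"N": new_and_changed, "T": total}

    # Example: a second-cycle part that starts from a 400-LOC base
    print(sump_size(base=400, deleted=30, modified=50, added=220, reused=100))
    # -> {'N': 270, 'T': 690}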

TSPi Quality Plan - Form SUMQ

Header fields: Name, Date, Team, Instructor, Part/Level, Cycle. Each measure has Plan and Actual columns.

Summary rates: LOC/hour; % Reuse (% of total LOC); % New Reuse (% of N&C LOC).
Percent Defect Free (PDF): in compile, in unit test, in build and integration, in system test.
Defects/page: requirements inspection, HLD inspection.
Defects/KLOC: DLD review, DLD inspection, code review, compile, code inspection, unit test, build and integration, system test, total development.
Defect ratios: code review/compile, DLD review/unit test.
Development time ratios (%): requirements inspection/requirements, HLD inspection/HLD, DLD/code, DLD review/DLD, code review/code.
A/FR.
Review rates: DLD lines/hour, code LOC/hour.
Inspection rates: requirements pages/hour, HLD pages/hour, DLD lines/hour, code LOC/hour.
Defect-injection rates (defects/hour): requirements, HLD, DLD, code, compile, unit test, build and integration, system test.
Defect-removal rates (defects/hour): requirements inspection, HLD inspection, DLD review, DLD inspection, code review, compile, code inspection, unit test, build and integration, system test.
Phase yields: requirements inspection, HLD inspection, DLD review, test development, DLD inspection, code review, compile, code inspection, unit test, build and integration, system test.
Process yields: % before compile, % before unit test, % before build and integration, % before system test, % before system delivery.

TSPi Quality Plan Instructions - Form SUMQ

Purpose: This form holds plan and actual quality data for parts or assemblies.

General: Where possible, establish goals based on your own historical data. Where data are not available, use the QUAL standard for guidance (see Appendix G). Before making the quality plan, you must have a partially completed SUMP form with size and development time data by process phase.

Make the quality plan: To make the quality plan, do the following:
1. Estimate the defects injected in each phase (use plan data and the QUAL standard for defects injected per hour times hours spent by phase).
2. Estimate the yield for each defect-removal phase (QUAL standard).
3. The defects removed in each phase are estimated as the number of defects at phase entry, times the estimated yield for that phase, divided by 100.
4. Examine the defects/KLOC values for reasonableness.
5. If the defects/KLOC values are not reasonable, adjust phase times, defect-injection rates, or yields (use the QUAL standard for guidance).
6. When the numbers appear reasonable, the quality plan is complete.

Record actual quality data: To complete the quality plan with actual values, enter the following data. Record development time in the time log and summarize it in SUMP. Record the defects found in the defect log and summarize them in SUMP. Enter the size of each product produced and summarize it in SUMP. With the completed SUMP data, complete the SUMQ form with the TSPi tool or as described below and in Chapter 5.

TSPi tool: If you use the TSPi tool, it will complete all the SUMQ calculations. Without the tool, you will have to make the SUMQ calculations as you complete each step described above. At part completion, make the quality calculations by following the instructions below and in Chapter 5.
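The planning steps above amount to propagating estimated defects through the phases. The following minimal sketch illustrates that flow with hypothetical hours, injection rates, and yields; the real planning values should come from your own data or the QUAL standard in Appendix G.

    # A minimal sketch of the quality-plan estimate: inject defects at a
    # rate per hour in development phases, then remove a yield percentage
    # of the defects present at entry to each removal phase. All numbers
    # are hypothetical placeholders, not QUAL standard values.
    phases = ["DLD", "DLD review", "code", "code review", "compile", "unit test"]

    hours = {"DLD": 10, "code": 12}                       # planned hours (hypothetical)
    inject_rate = {"DLD": 0.75, "code": 2.0}              # defects injected per hour (hypothetical)
    yield_pct = {"DLD review": 70, "code review": 70,     # phase yields in percent (hypothetical)
                 "compile": 50, "unit test": 45}

    escaping = 0.0  # defects present at entry to the next phase
    for phase in phases:
        injected = inject_rate.get(phase, 0) * hours.get(phase, 0)
        escaping += injected
        removed = escaping * yield_pct.get(phase, 0) / 100.0  # defects at entry * yield / 100
        escaping -= removed
        print(f"{phase:12s} injected {injected:5.1f}  removed {removed:5.1f}  remaining {escaping:5.1f}")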

Header: Enter your name, date, team name, and instructor's name. Name the part or assembly and its level. Enter the cycle number.

Summary rates: LOC/hour is new and changed LOC divided by total development hours. % Reuse is the percentage of total LOC that was reused. % New Reuse is the percentage of new and changed LOC that was inserted in the reuse library.

Percent Defect Free (PDF): PDF refers to the percentage of a program's components that had no defects in a development or test phase. Thus, if 3 of a program's 10 components had no defects in compile, that program would have a PDF of 30% in compile. Base the plan PDF values on the QUAL standard.

Defects/page and defects/KLOC: Set the defects/page and defects/KLOC plan values during planning. Defects/page are calculated as (number of defects)/(number of pages). Defects/KLOC are calculated as 1000*(number of defects)/(N&C LOC).

Defect ratios: These are the ratios of the number of defects found in various phases. Thus, the (code review)/compile ratio is the ratio of the defects found in code review to those found in compile. These ratios can also be calculated from the defects/KLOC values. When the denominator phase values are 0, enter "inf."

Development time ratios (%): These are the ratios of the times spent in each development phase. Thus, the DLD/code ratio is the ratio of the time spent in detailed design to the time spent in coding a program. Calculate the planned and actual ratios from the SUMP data. When the denominator phase values are 0, enter "inf."

A/FR: A/FR is calculated as the ratio of appraisal to failure time. Appraisal time is the time spent reviewing and inspecting programs. Failure time is the time spent compiling and testing programs. To calculate A/FR, divide the total detailed design review, code review, and inspection times by the total compile and unit test times. Use the sum of personal review and total team inspection times. When the denominator phase values are 0, enter "inf."

Review and inspection rates: Calculate the review and inspection rates by dividing the size of the reviewed product by the total review or inspection time in hours. Make this calculation for each review and inspection. In planning, use the QUAL standard for guidance (Appendix G). When the denominator phase values are 0, enter "inf."
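As an illustration of these calculations, the following minimal sketch computes defects/KLOC, a defect ratio, A/FR, and PDF from hypothetical phase data; it is not part of the TSPi tool, and the numbers are only placeholders.

    # A minimal sketch of a few SUMQ summary-rate calculations, using
    # hypothetical data for a single 850-LOC (new and changed) part.
    def ratio(numerator, denominator):
        """Return numerator/denominator, or "inf." when the denominator is 0."""
        return numerator / denominator if denominator else "inf."

    new_and_changed_loc = 850
    hours = {"DLD review": 3.0, "code review": 4.0, "DLD inspection": 2.5,
             "code inspection": 3.5, "compile": 1.0, "unit test": 5.0}
    defects = {"code review": 17, "compile": 4, "unit test": 6}

    # Defects/KLOC = 1000 * (number of defects) / (N&C LOC)
    unit_test_defects_per_kloc = 1000 * defects["unit test"] / new_and_changed_loc

    # Defect ratio: defects found in code review versus defects found in compile
    code_review_to_compile = ratio(defects["code review"], defects["compile"])

    # A/FR = (review + inspection time) / (compile + unit test time)
    appraisal = hours["DLD review"] + hours["code review"] + hours["DLD inspection"] + hours["code inspection"]
    failure = hours["compile"] + hours["unit test"]
    a_fr = ratio(appraisal, failure)

    # PDF in compile = percent of components with no compile defects
    components_defect_free_in_compile = 7
    total_components = 10
    pdf_compile = 100 * components_defect_free_in_compile / total_components

    print(f"defects/KLOC in unit test: {unit_test_defects_per_kloc:.1f}")
    print(f"code review/compile defect ratio: {code_review_to_compile}")
    print(f"A/FR: {a_fr}")
    print(f"PDF in compile: {pdf_compile:.0f}%")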

Defect injection and removal rates: The defect rates are calculated in defects injected or removed per hour. Thus, for coding, if you spent 2 hours coding a 100-LOC module and injected 12 defects, you would have injected 6 defects/hour. Similarly, if you spent 1 hour reviewing this module and found 4 defects, you would have removed 4 defects/hour. Based on the QUAL standard, establish standard team rates.

Phase yield: Phase yield refers to the percentage of the defects in the product that were removed in that phase. Thus, in reviewing a 100-LOC module, if the review found 4 defects and you later determine that there were 6 defects in the module, the phase yield would be 100*4/6 = 66.7%. In planning, use historical data to estimate the yield values needed for each defect-removal phase. After each phase, calculate the estimated yield values.

Process yield: Process yield refers to the percentage of the defects injected into a product that were removed before a given phase. Thus, for a 100-LOC module, if you later determine that a total of 8 defects were injected into the module before compile and 5 were removed before compile, the yield before compile would be 100*5/8 = 62.5%. In planning, use the QUAL standard or your own data to estimate the yield values for each defect-removal phase.

Detailed design

In detailed design, the teams should use their agreed design standard. Again, stress the need for a complete and documented design and tell them that you will look at their design documents.

Implementation standards

If the teams have not already produced their coding and LOC-counting standards, they should produce them now and review them with all team members. To ensure that they produce these standards, ask the team leaders to bring copies to your next weekly meeting. Also, point out that if several teams want to share the same standards, they can do so. They should not waste time developing a new standard if they can find an acceptable one that someone else has already produced.
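If a team needs a starting point for its LOC-counting standard, even a counter as simple as the following sketch will do; the counting rule shown (count every line that is neither blank nor a comment-only line) is only an assumed example, not a prescribed TSPi standard.

    # A minimal sketch of a logical LOC counter under an assumed counting
    # standard: skip blank lines and comment-only lines, count everything
    # else. A real team standard would spell out its own rules (braces,
    # continuations, multi-statement lines, and so forth).
    def count_loc(path):
        loc = 0
        with open(path, encoding="utf-8") as source:
            for line in source:
                stripped = line.strip()
                if stripped and not stripped.startswith("#"):
                    loc += 1
        return loc

    if __name__ == "__main__":
        import sys
        for filename in sys.argv[1:]:
            print(f"{filename}: {count_loc(filename)} LOC")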

The unit test plan

The unit test plans need not be elaborate, but they should explain what tests will be run. They should test all variables and parameters at nominal values, limit values, and outside these limits, and, where appropriate, they should test for overflows, underflows, zero values, empty and full states, date sensitivity, and so forth.

Test development

For TSPi, the test-development work will likely be modest, but the teams should still use disciplined methods. They should record the test-development hours in the appropriate phase, but they should not count the LOC or defects against the system components.
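The nominal, limit, and out-of-range pattern described under the unit test plan above can be illustrated with a short test sketch; the clamp_percentage() function and its tests are hypothetical, not drawn from the textbook.

    # A minimal sketch of nominal, limit, and out-of-range unit tests for a
    # hypothetical function that confines its input to the range 0..100.
    import unittest

    def clamp_percentage(value):
        """Hypothetical unit under test: clamp a number into the range 0..100."""
        return max(0, min(100, value))

    class ClampPercentageTest(unittest.TestCase):
        def test_nominal_value(self):
            self.assertEqual(clamp_percentage(42), 42)

        def test_limit_values(self):
            self.assertEqual(clamp_percentage(0), 0)       # lower limit
            self.assertEqual(clamp_percentage(100), 100)   # upper limit

        def test_outside_limits(self):
            self.assertEqual(clamp_percentage(-5), 0)      # below the range
            self.assertEqual(clamp_percentage(250), 100)   # above the range

    if __name__ == "__main__":
        unittest.main()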

Design and code reviews

In the implementation phase, require that the students do personal design and code reviews. While you should look at how the engineers do these reviews, do not set numerical yield or rate targets, as this could bias the data. Look at the SUMQ forms to see whether they are doing the reviews and whether they are getting reasonable yields.

Detailed design and code inspections

Again, make sure the students have read Appendix C in the textbook before they do these inspections.

Component quality review

The component quality review is the final quality check before integration and system test. Before a module or component is baselined, the quality/process manager examines the engineer's data and attests that the work was properly done. The quality review is important because any single poor-quality component can often delay the entire team while they try to clean it up in system test.

Component release and baseline

Again, emphasize that the teams must baseline their products once they have completed and inspected them. They must not then change these products without using the CCR procedure.

8.7. Test

If the students have developed thorough test plans, carefully followed these plans, and observed the CCR process for all changes, they should have few problems in the test phase. There is, however, a chance that some teams will spend too much effort on the user documentation. They need only write enough so that a typical user could follow the documentation to install and use the product.

8.8. The Postmortem

At the end of each cycle, each team writes a brief summary report on its work. To ensure that all the teams produce comparable reports, suggest some basic report contents. At a minimum, the teams should compare their actual performance with their cycle goals and use TSPi data to justify their conclusions. They should also provide summary plan versus actual comparisons for

- Product size
- Development hours
- LOC/hour
- PDF
- Yield before compile
- Yield before system test
- Defect levels in compile
- Defect levels in all test phases

The teams should also give their review and inspection rates and ratios and relate these values to the yields they achieved in these reviews and inspections. Also suggest that they show the defect-removal profiles for each system and for every component.

Preparing role evaluations

Discuss the role evaluations with the class and ask the students to point out specific ways to improve the way each role was performed. Explain how you will use the evaluations in grading, and tell them that you will remove any identifying comments from the evaluations and make them available to help this and other teams in the future.

8.9. The Final Report

The teams should prepare and present a brief final report at the end of the course. In these reports, they should review their results across all development cycles and explain any significant variations. Have them briefly summarize their results, discuss how these results changed from cycle to cycle, and explain what caused the variations. Finally, ask them to summarize the key lessons they took from the course and what they would do differently on the next project.