8/3/2019 ftr-tetmeyer (2)
Formal Technical Reviews
Annette Tetmeyer
Fall 2009
Outline
Overview of FTR and relationship to software quality improvement
History of software quality improvement
Impact of quality on software products
The FTR process
Beyond FTR
Discussion and questions
Formal Technical Review
What is Formal Technical Review (FTR)?
Definition (Philip Johnson)
A method involving a structured encounter in which a group of technical personnel analyzes or improves the quality of the original work product as well as the quality of the method.
Software Quality Improvement (1/4)
Improve the quality of the original work
Find defects early (less costly)
Reduce defects
Leads to improved productivity
Benefits from reducing rework build throughout the project:
requirements, design, coding, testing
Software Quality Improvement (2/4)
Survey regarding when reviews are conducted:
Design or requirements: 40%
Code review: 30%
Code reviews pay off even if the code is being tested later (Fagan)
Software Quality Improvement (3/4)
Improve the quality of the method
Improve team communication
Enhance team learning
Software Quality Improvement (4/4)
Which impacts overall quality the most?
To raise the quality of the finished product
To improve developer skills
Key Process Areas of CMMI

Maturity Level | Key Process Areas
1: Initial | None
2: Repeatable | Requirements Management, Software Project Planning, Software Project Tracking and Oversight, Software Subcontract Management, Software Quality Assurance, Software Configuration Management
3: Defined | Organization Process Focus, Organization Process Definition, Training Program, Integrated Software Management, Software Product Engineering, Intergroup Coordination, Peer Reviews
4: Managed | Quantitative Process Management, Software Quality Management
5: Optimizing | Defect Prevention, Technology Change Management, Process Change Management
Peer Reviews and CMMI
Does not dictate specific techniques, but instead requires that:
A written policy about peer reviews is in place
Resources, funding, and training are provided
Peer reviews are planned
The peer review procedures to be used are documented
SEI-CMMI Checklist for Peer Reviews
Are peer reviews planned?
Are actions associated with defects identified during peer reviews tracked until they are resolved?
Does the project follow a written organizational policy for performing peer reviews?
Do participants of peer reviews receive the training required to perform their roles?
Are measurements used to determine the status of peer review activities?
Are peer review activities and work products subjected to Software Quality Assurance review and audit?
Outline
Overview of FTR and relationship to software quality improvement
History of software quality improvement
Impact of quality on software products
The FTR process
Beyond FTR
Discussion and questions
Researchers and Influencers
Fagan
Johnson
Ackerman
Gilb and Graham
Weinberg
Wiegers
Inspection, Walkthrough or Review? (1/2)
An inspection is a visual examination of a software product to detect and identify software anomalies, including errors and deviations from standards and specifications
Inspection, Walkthrough or Review? (2/2)
A walkthrough is a static analysis technique in which a designer or programmer leads members of the development team and other interested parties through a software product, and the participants ask questions and make comments about possible errors, violations of development standards, and other problems
A review is a process or meeting during which a software product is presented to project personnel, managers, users, customers, user representatives, or other interested parties for comment or approval
Source: IEEE Std. 1028-1997
Families of Review Methods
Method Family | Typical Goals | Typical Attributes
Walkthroughs | Minimal overhead; developer training; quick turnaround | Little/no preparation; informal process; no measurement; not FTR!
Technical Reviews | Requirements elicitation; ambiguity resolution; training | Formal process; author presentation; wide range of discussion
Inspections | Detect and remove all defects efficiently and effectively | Formal process; checklists; measurements; verify phase

Source: Johnson, P. M. (1996). Introduction to formal technical reviews.
Informal vs. Formal
Informal:
Spontaneous
Ad-hoc
No artifacts produced

Formal:
Carefully planned and executed
Reports are produced

In reality, there is also a middle ground between informal and formal techniques
Outline
Overview of FTR and relationship to software quality improvement
History of software quality improvement
Impact of quality on software products
The FTR process
Beyond FTR
Discussion and questions
Cost-Benefit Analysis
Fagan reported that IBM inspections found 90% of all defects for a 9% reduction in average project cost
Johnson estimates that rework accounts for 44% of development cost
Finding defects, finding them early, and reducing rework can impact the overall cost of a project
Cost of Defects
What is the annual cost of software defects in the US?
$59 billion
An estimated $22 billion could be avoided by introducing a best-practice defect detection infrastructure

Source: NIST, The Economic Impact of Inadequate Infrastructure for Software Testing, May 2002
Cost of Defects
Gilb project with a jet manufacturer
Initial analysis estimated that 41,000 hours of effort would be lost through faulty requirements
Manufacturer concurred because:
10 people on the project, each using 2,000 hours/year
Project is already one year late (20,000 hours)
Project is estimated to take one more year (another 20,000 hours)
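The manufacturer's arithmetic can be checked with a short sketch. The figures are the slide's own; the variable names are just illustrative:

```python
# Sketch of the slide's effort-loss arithmetic (figures taken from the slide).
PEOPLE = 10
HOURS_PER_PERSON_YEAR = 2_000

# One year already lost plus one more estimated year of schedule slip.
years_lost = 1 + 1
hours_lost = PEOPLE * HOURS_PER_PERSON_YEAR * years_lost

print(hours_lost)  # 40000, close to the analyst's 41,000-hour estimate
```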
Jet Propulsion Laboratory Study
Average two-hour inspection exposed four major and fourteen minor faults
Savings estimated at $25,000 per inspection
Additional studies showed the number of faults detected decreases exponentially by phase
Detecting early saves time and money
Software Inspections
Why are software inspections not widely used?
Lack of time
Not seen as a priority
Not seen as value added (measured by LOC)
Lack of understanding of formalized techniques
Improper tools used to collect data
Lack of training of participants
Pits programmer against reviewers
Twelve Reasons Conventional Reviews are Ineffective (1/2)
1. The reviewers are swamped with information.
2. Most reviewers are not familiar with the product design goals.
3. There are no clear individual responsibilities.
4. Reviewers can avoid potential embarrassment by saying nothing.
5. The review is a large meeting; detailed discussions are difficult.
6. Presence of managers silences criticism.
Twelve Reasons Conventional Reviews are Ineffective (2/2)
7. Presence of uninformed reviewers may turn the review into a tutorial.
8. Specialists are asked general questions.
9. Generalists are expected to know specifics.
10. The review procedure reviews code without respect to structure.
11. Unstated assumptions are not questioned.
12. Inadequate time is allowed.

From class website: sw-inspections.pdf (Parnas)
Fagan's Contributions
Design and code inspections to reduce errors in program development (1976)
A systematic and efficient approach to improving programming quality
Continuous improvement: reduce initial errors and follow up with additional improvements
Beginnings of formalized software inspections
Fagan's Six Major Steps (1/2)
1. Planning
2. Overview
3. Preparation
4. Examination
5. Rework
6. Follow-up

Can steps be skipped or combined?
How many person-hours are typically involved?
Fagan's Six Major Steps (2/2)
1. Planning: Form team, assign roles
2. Overview: Inform team about product (optional)
3. Preparation: Independent review of materials
4. Examination: Inspection meeting
5. Rework: Author verifies defects and corrects them
6. Follow-up: Moderator checks and verifies corrections
Fagan's Team Roles
Fagan recommends that a good-sized team consists of four people
Moderator: the key person, manages the team and offers leadership
Readers, reviewers, and authors:
Designer: programmer responsible for producing the program design
Coder/Implementer: translates the design to code
Tester: writes and executes test cases
Common Inspection Processes
Active Design Reviews
Parnas and Weiss (1985)
Rationale:
Reviewers may be overloaded during the preparation phase
Reviewers lack familiarity with goals
Large team meetings can have drawbacks
Several brief reviews rather than one large review
Focus on a certain part of the project
Used this approach for the design of a military flight navigation system
Two Person Inspection
Bisant and Lyle (1989)
One author, one reviewer (eliminates the moderator)
Ad-hoc preparation
Noted immediate benefits in program quality and productivity
May be more useful in small organizations or small projects
N-fold Inspection
Martin and Tsai (1990)
Rationale:
A single team finds only a fraction of defects
Different teams do not duplicate efforts
Follows Fagan inspection steps
N teams inspect in parallel
Results from teams are merged
After merging results, only one team continues on
Team size 3-4 people (author, moderator, reviewers)
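The merge step can be sketched as a set union over the teams' defect reports. This is a hypothetical illustration (the tuple format and the sample data are invented), not Martin and Tsai's actual procedure:

```python
# Hypothetical sketch: merge defect reports from N parallel inspection teams.
# Each team reports defects as (location, description) tuples; merging keeps
# the union, so duplicates across teams collapse automatically.

def merge_reports(team_reports):
    """Union of all teams' defect sets."""
    merged = set()
    for report in team_reports:
        merged |= report
    return merged

teams = [
    {("req-3", "ambiguous timeout"), ("req-7", "missing error path")},
    {("req-3", "ambiguous timeout"), ("req-9", "units unspecified")},
    {("req-7", "missing error path"), ("req-2", "conflicting range")},
]

merged = merge_reports(teams)
print(len(merged))                 # 4 unique defects across all teams
print(max(len(t) for t in teams))  # the best single team found only 2
```

The gap between the merged count and any single team's count is the rationale for running N teams in parallel.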
Phased Inspection
Knight and Myers (1993)
Combines aspects of active design, Fagan, and N-fold
Mini-inspections or phases with specific goals
Uses checklists for inspection
Can have single-inspector or multiple-inspector phases
Team size 1-2 people
Inspection without Meeting
Research by Votta (1993) and Johnson (1998)
Does every inspection need a meeting?
Builds on the fact that most defects are found in preparation for the meeting (90/10)
Is synergy as important to finding defects as stated by others?
Collection occurs after preparation
Rework follows
Gilb Inspections
Gilb and Graham (1993)
Similar to Fagan inspections
Process brainstorming meeting immediately following the inspection meeting
Other Inspections
Structured Walkthrough (Yourdon, 1989)
Verification-Based Inspection (Dyer, 1992)
Inspection, Walkthrough or Review?
Some researchers interpret Fagan's work as a combination of all three
Does present many of the elements associated with FTR
FTR may be seen as a variant of Fagan inspections (Johnson & Tjahjono, 1998)
Outline
Overview of FTR and relationship to software quality improvement
History of software quality improvement
Impact of quality on software products
The FTR process
Beyond FTR
Discussion and questions
Formal Technical Review (FTR)
Process: phases and procedures
Roles: Author, Moderator, Reader, Reviewer, Recorder
Objectives: defect removal, requirements elicitation, etc.
Measurements: forms, consistent data collection, etc.
FTR Process
How much to review
Review pacing
When to review
Pre-meeting preparation
Meeting pace
How Much to Review?
Tied into meeting time (hours)
Should be manageable
Break into chunks if needed
Review Pacing
How long should the meeting last?
Based on:
Lines per hour?
Pages?
Specific time frame?
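One way to turn a pacing rule into a meeting plan, sketched under assumed numbers (the 200 LOC/hour rate and 2-hour session cap are illustrative, not prescribed by the slide):

```python
# Sketch: derive the number of review sessions from an inspection-rate budget.
import math

def plan_sessions(total_loc, loc_per_hour=200, max_session_hours=2.0):
    """Return (number of sessions, LOC per session) for a paced review."""
    total_hours = total_loc / loc_per_hour
    sessions = math.ceil(total_hours / max_session_hours)
    return sessions, math.ceil(total_loc / sessions)

print(plan_sessions(1500))  # (4, 375): four sessions of ~375 LOC each
```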
When to Review?
How much work should be completed before the review?
Set out the review schedule with project planning
Again, break into manageable chunks
Prioritize based on impact of the code module on the overall project
Pre-Meeting Preparation
Materials to be given to reviewers
Time expectations prior to the meeting
Understand the roles of participants
Training for team members on their various roles
Expected end product
Pre-Meeting Preparation (2/2)
How is document examination conducted?
Ad-hoc
Checklist
Specific reading techniques (scenarios or perspective-based reading)
Preparation is crucial to effective reviews
FTR Team Roles
Select the correct participants for each role
Understand team review psychology
Choose the correct team size
FTR Team Roles (2/2)
Author
Moderator
Reader
Reviewer
Recorder (optional?)
Who should not be involved and why?
Team Participants
Must be actively engaged
Must understand the bigger picture
Team Psychology
Stress
Conflict resolution
Perceived relationship to performance reviews
Team Size
What is the ideal size for a team?
Less than 3?
3-6?
Greater than 6?
What is the impact of large, complex projects?
How to work with globally distributed teams?
FTR Objectives
Review meetings can take place at various stages of the project lifecycle
Understand the purpose of the review:
Requirements elicitation
Defect removal
Other
Goal of the review is not to provide solutions
Raise issues, don't resolve them
FTR Measurements
Documentation and use
Sample forms
Inspection metrics
Documentation
Forms used to facilitate the process
Documenting the meeting
Use of standards
How is documentation used by:
Managers
Developers
Team members
Sample Forms
NASA Software Formal Inspections Guidebook
Sample checklists
Architecture design
Detailed design
Code inspection
Functional design
Software requirements
Refer to sample forms distributed in class
Inspection Metrics
How to gather and classify defects?
How to collect?
What to do with collected metrics?
What metrics are important?
Defects per reviewer?
Inspection rate?
Estimated defects remaining?
Historical data
Future use (or misuse) of data
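As a sketch of how such metrics might be computed: the first two functions are simple ratios, and the "estimated defects remaining" question is often answered with a capture-recapture (Lincoln-Petersen) estimate from two reviewers' overlapping findings. The functions, units, and sample numbers here are illustrative assumptions, not something the slide prescribes:

```python
# Sketch of common inspection metrics (illustrative, assumed definitions).

def inspection_rate(loc_reviewed, meeting_hours):
    """How fast the material was covered, in LOC per hour."""
    return loc_reviewed / meeting_hours

def defect_density(defects_found, loc_reviewed):
    """Defects found per thousand lines of code (KLOC)."""
    return defects_found / (loc_reviewed / 1000)

def estimated_total_defects(found_by_a, found_by_b, found_by_both):
    """Lincoln-Petersen capture-recapture: N ~= (A * B) / overlap."""
    return (found_by_a * found_by_b) / found_by_both

print(inspection_rate(600, 2))            # 300.0 LOC/hour
print(defect_density(12, 600))            # 20.0 defects/KLOC
print(estimated_total_defects(10, 8, 5))  # 16.0 estimated total defects
```

With 10 + 8 - 5 = 13 distinct defects actually found, the estimate of 16 suggests roughly 3 defects remain undetected.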
Inspection Metrics (2/2)
Tools for collecting metrics
Move beyond spreadsheets and word processors
Primary barriers to using:
Cost
Quality
Utility
Outline
Overview of FTR and relationship to software quality improvement
History of software quality improvement
Impact of quality on software products
The FTR process
Beyond FTR
Discussion and questions
Beyond the FTR Process
Impact of reviews on the programmer
Post-meeting activities
Review challenges
Survey of reviews and comparisons
Future of FTR
Impact on the Programmer
Should reviews be used as a measure of performance during appraisal time?
Can they help improve a programmer's commitment to the work?
Will they make the programmer a better reviewer when roles are reversed?
Improve teamwork?
Post-Meeting Activities
Defect correction:
How to ensure that identified defects are corrected?
What metrics or communication tools are needed?
Follow-up:
Feedback to team members
Additional phases of reviews
Data collection for historical purposes
Gauging review effectiveness
Review Challenges
Distributed, global teams
Large teams
Complex projects
Virtual vs. face-to-face meetings
Survey of Reviews
Reviews were integrated into software development with a range of goals:
Early defect detection
Better team communication
Review approaches vary widely
Tending towards nonsystematic methods and techniques

Source: Ciolkowski, M., Laitenberger, O., & Biffl, S. (2003). Software reviews, the state of the practice. Software, IEEE, 20(6), 46-51.
Survey of Reviews (2/2)
What were common review goals?
Quality improvement
Project status evaluation
Means to enforce standards
Common obstacles:
Time pressures
Cost
Lack of training (most train by participation)
Do We Really Need a Meeting?
Phantom Inspector (Fagan): the synergism among the review team that can lead to the discovery of defects not found by any of the participants working individually
Meetings are perceived as higher quality
What about false positives and duplicates?
A Study of Review Meetings
The need for face-to-face meetings has never been questioned
Meetings are expensive! They require:
Simultaneous attendance of all participants
Preparation
Readiness of the work product under review
High-quality moderation
Team personalities
Adds to project time and cost (15-20% overhead)
A Study of Review Meetings (2/3)
Studied the impact of:
Real (face-to-face) vs. nominal (individual) groups
Detection effectiveness (number of defects detected)
Detection cost
Significant differences were expected
A Study of Review Meetings (3/3)
Results:
Defect detection effectiveness was not significantly different for either group
Cost was less for nominal than for real groups (average time to find defects was higher)
Nominal groups generated more issues, but had higher false positives and more duplication
Does Openness and Anonymity Impact Meetings?
"Working in a group helped me find errors faster and better."
"Working in a group helped me understand the code better."
"You spent less time arguing the issue validity when working alone."
"I could get a lot more done in a given amount of time when working by myself."
Study on Successful Industry Uses (1/3)
Lack of systematic execution during preparation and detection:
60% don't prepare at all, only 50% use checklists, less than 10% use advanced reading techniques
Reviews are not part of an overall improvement program
Only 23% try to optimize the review process
Study on Successful Industry Uses (2/3)
Factors for sustained success:
Top-management support is required
Need evidence (external, internal) to warrant using reviews
Process must be repeatable and measurable for continuous improvement
Techniques need to be easily adaptable to changing needs
Study on Successful Industry Uses (3/3)
Repeatable success tends to use well-defined techniques
Reported success (NASA, Motorola, IBM):
95% defect detection rates before testing
50% overall cost reduction
50% reduction in delivery time
Future of FTR
1. Provide tighter integration between FTR and the development method
2. Minimize meetings and maximize asynchronicity in FTR
3. Shift the focus from defect removal to improved developer quality
4. Build organizational knowledge bases on review
5. Outsource review and in-source review knowledge
6. Investigate computer-mediated review technology
7. Break the boundaries on review group size
Outline
Overview of FTR and relationship to software quality improvement
History of software quality improvement
Impact of quality on software products
The FTR process
Beyond FTR
Discussion and questions
Discussion
Testing is commonly outsourced, but what about reviews?
What are the implications of outsourcing one over the other?
What if code production is outsourced? What do you review, and how?
What is the relationship between reviews and testing?
Discussion
Relationship between inspections and testing
Do anonymous review tools impact the quality of the review process?
How often to review? When to re-review?
How to estimate number of defects expected?
Wiegers' Seven Deadly Sins of Software Reviews
1. Participants don't understand the review process
2. Reviewers critique the producer, not the product
3. Reviews are not planned
4. Review meetings drift into problem solving
5. Reviewers are not prepared
6. The wrong people participate
7. Reviewers focus on style, not substance

Source: www.processimpact.com
Observations
1985-1995: a fair amount of interest and research
Terminology changes, and interest appears to wane post-2000
Many sites are obsolete or have not been updated
Very few surveys on quantifiable results regarding reviews, cost, and quality improvements
Those using quality methods tend to be enthusiastic; others have not joined in yet
Questions
References
Full references handout provided in class
References and Resources
Ackerman, A. F., Buchwald, L. S., & Lewski, F. H. (1989). Software inspections: an effective verification process. Software, IEEE, 6(3), 31-36.
Aurum, A., Petersson, H., & Wohlin, C. (2002). State-of-the-art: software inspections after 25 years. Software Testing, Verification and Reliability, 12(3), 133-154.
Boehm, B., & Basili, V. R. (2001). Top 10 list [software development]. Computer, 34(1), 135-137.
Ciolkowski, M., Laitenberger, O., & Biffl, S. (2003). Software reviews, the state of the practice. Software, IEEE, 20(6), 46-51.
D'Astous, P., Détienne, F., Visser, W., & Robillard, P. N. (2004). Changing our view on design evaluation meetings methodology: a study of software technical review meetings. Design Studies, 25(6), 625-655.
References and Resources
Denger, C., & Shull, F. (2007). A practical approach for quality-driven inspections. Software, IEEE, 24(2), 79-86.
Fagan, M. E. (1976). Design and code inspections to reduce errors in program development. IBM Systems Journal, 15(3), 182-211.
Freedman, D. P., & Weinberg, G. M. (2000). Handbook of Walkthroughs, Inspections, and Technical Reviews: Evaluating Programs, Projects, and Products. Dorset House Publishing Co., Inc.
IEEE Standard for Software Reviews and Audits (2008). IEEE STD 1028-2008, 1-52.
Johnson, P. M. (1996). Introduction to formal technical reviews, from http://www.ccs.neu.edu/home/lieber/com3205/f02/lectures/Reviews.ppt
References and Resources
Johnson, P. M. (1998). Reengineering inspection. Commun. ACM, 41(2), 49-52.
Johnson, P. M. (2001). You can't even ask them to push a button: Toward ubiquitous, developer-centric, empirical software engineering. Retrieved from http://www.itrd.gov/subcommittee/sdp/vanderbilt/position_papers/philip_johnson_you_cant_even_ask.pdf, accessed October 7, 2009.
Johnson, P. M., & Tjahjono, D. (1998). Does every inspection really need a meeting? Empirical Software Engineering, 3(1), 9-35.
Neville-Neil, G. V. (2009). Kode Vicious: Kode reviews 101. Commun. ACM, 52(10), 28-29.
References and Resources
NIST (May 2002). The economic impact of inadequate infrastructure for software testing. Retrieved October 8, 2009, from http://www.nist.gov/public_affairs/releases/n02-10.htm.
Parnas, D. L., & Weiss, D. M. (1987). Active design reviews: principles and practices. J. Syst. Softw., 7(4), 259-265.
Porter, A., Siy, H., & Votta, L. (1995). A review of software inspections. University of Maryland at College Park.
Porter, A. A., & Johnson, P. M. (1997). Assessing software review meetings: results of a comparative analysis of two experimental studies. Software Engineering, IEEE Transactions on, 23(3), 129-145.
References and Resources
Rombach, D., Ciolkowski, M., Jeffery, R., Laitenberger, O., McGarry, F., & Shull, F. (2008). Impact of research on practice in the field of inspections, reviews and walkthroughs: learning from successful industrial uses. SIGSOFT Softw. Eng. Notes, 33(6), 26-35.
Votta, L. G. (1993). Does every inspection need a meeting? SIGSOFT Softw. Eng. Notes, 18(5), 107-114.
Interesting Websites
Gilb: Extreme Inspection
http://www.result-planning.com/Inspection
Collaboration Tools:
http://www.result-planning.com/Site+Content+Overview
http://www.sdtcorp.com/pdf/ReviewPro.pdf