Testing with Limited, Vague, and Missing Requirements


DESCRIPTION

Requirements are essential for the success of projects. Or are they? As testers, we often demand concrete requirements, specified and documented in minute detail. However, does the business really know what they want early in the project? Can they actually produce such a document? Is it acceptable to test with limited or vague requirements? Lloyd Roden challenges your most basic beliefs, explaining how detailed requirements can damage and hinder the progress of testing. Lloyd provides example applications that have no requirements, vague requirements, evolving requirements, complex requirements, and detailed requirements. You will assess and test each of these examples in turn and establish the best approach in each situation, and the reasons for it. Learn how to question applications and provide feedback on their quality, using your experience and appropriate techniques, regardless of the level of detail provided in the requirements.

TRANSCRIPT

MN Half-day Tutorials

5/5/2014 1:00:00 PM

Testing with Limited, Vague, and Missing Requirements

Presented by: Lloyd Roden, Lloyd Roden Consultancy

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073

888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com

Lloyd Roden, Lloyd Roden Consultancy

With more than twenty-eight years in the software industry, Lloyd Roden has worked as a developer, test analyst, and test manager for many different organizations. Lloyd was a consultant/partner with Grove Consultants for twelve years. In 2011 he created Lloyd Roden Consultancy, an independent UK-based training and consultancy company specializing in software testing. Lloyd’s passion is to enthuse, excite, and inspire people in the area of software testing. He has spoken at conferences worldwide including STAREAST, STARWEST, Better Software, EuroSTAR, AsiaSTAR, and Special Interest Groups in software testing in several countries. In 2004, he won the European Testing Excellence award.

Testing with Limited, Vague, and Missing Requirements

Written by Lloyd Roden ∙ www.lloydrodenconsultancy.com ∙ Version 1_0 ∙ © Lloyd Roden

LRC110815

Copyright Notice

It is strictly prohibited to reproduce any of these materials in any form without prior written permission from Lloyd Roden Consultancy. This includes photocopying, scanning and the use of any recording devices.

Requirement Testing Workshop Notes, slides 1–34 (© Lloyd Roden RTW140308; slide images not included in this transcript)

Exploratory Testing Charter

Number:          System:          Project:
Author:          Date:

Why (purpose of the test/mission):

What (what to test and look for; bugs, risks to mitigate, expected results):

How (the test approach: tools, techniques and tactics):

When:   Duration:   Start Time:   End Time:

Results (observations, faults raised, tests passed):

Test Notes (what happened during testing):

ETW110825 © Lloyd Roden
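If you prefer to capture charters electronically rather than on the paper form, the fields above map naturally onto a small record type. The following Python sketch is illustrative only: the field names mirror the form, but the class itself and the example values are not part of the workshop materials.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExploratoryCharter:
    """Electronic version of the charter form above (illustrative only)."""
    number: str
    system: str
    project: str
    author: str
    date: str
    why: str                  # purpose of the test / mission
    what: str                 # what to test and look for
    how: str                  # approach: tools, techniques, tactics
    start_time: str = ""
    end_time: str = ""
    results: List[str] = field(default_factory=list)     # observations, faults raised, tests passed
    test_notes: List[str] = field(default_factory=list)  # what happened during testing

# Example usage with made-up values
charter = ExploratoryCharter(
    number="ET-001", system="Slider Puzzle", project="Workshop",
    author="A. Tester", date="2014-05-05",
    why="Assess basic playability and find blocking bugs",
    what="Tile movement rules, new-game shuffle, browser support",
    how="Manual exploratory testing in two browsers",
)
charter.results.append("Tile adjacent to blank did not move on first click")
```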

Slider Puzzle Exercise

Slider Puzzle Exercise110831 Page 1 of 1 © Lloyd Roden

Specification

An application has been written that allows you to play the well-known slider puzzle game on your computer. The object after starting the game is to move the numbers back to their original places in order, starting with number 1 in the top left-hand corner and the blank in the bottom right-hand corner.

Installing the game: Either double-click on the icon in the folder called “Slider Puzzle” or drag the icon onto your browser.

Supported browsers: All major browsers are supported (Internet Explorer 6.0 or higher, Safari, Netscape and Firefox).

Starting the game: After opening the application, click on “New Game” and you will see the game start by the numbers moving around the grid to new positions. Once the automatic movement has finished, you can commence the game.

Moving the numbers:

• You can only move a number next to the blank field.
• To move it, click on that number and continue until all the numbers are in their correct sequence.

Starting a new game: You can start a new game at any time by selecting “New Game”.

To help you understand the game, a “prototype” has been generated, which is a physical game.
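The movement rule above (only a number adjacent to the blank may slide) is the heart of the specification, so a minimal model of it can help when formulating test conditions. The sketch below is purely illustrative: the application under test is a browser game, and none of these names come from it.

```python
# Minimal model of the slider puzzle rules described above (illustrative only).
from typing import List, Optional, Tuple

Grid = List[List[Optional[int]]]  # None represents the blank square

def find_blank(grid: Grid) -> Tuple[int, int]:
    for r, row in enumerate(grid):
        for c, value in enumerate(row):
            if value is None:
                return r, c
    raise ValueError("grid has no blank square")

def move(grid: Grid, number: int) -> bool:
    """Slide `number` into the blank if they are adjacent; return True on success."""
    br, bc = find_blank(grid)
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        r, c = br + dr, bc + dc
        if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == number:
            grid[br][bc], grid[r][c] = number, None
            return True
    return False  # the clicked number is not next to the blank

def is_solved(grid: Grid) -> bool:
    """Solved when numbers run 1..n-1 in order with the blank bottom-right."""
    flat = [v for row in grid for v in row]
    return flat == list(range(1, len(flat))) + [None]

# Example: a 3x3 puzzle one move away from the solved state
grid = [[1, 2, 3], [4, 5, 6], [7, None, 8]]
assert move(grid, 8)       # 8 is next to the blank, so it slides across
assert is_solved(grid)
assert not move(grid, 1)   # 1 is not adjacent to the blank, so nothing moves
```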

Exercise

• formulate a mission/objective for your testing (e.g. complete the game without any bugs, or find at least 3 major bugs within the time)
• generate some high-level test conditions relating to your mission
• if bugs are found, re-create the bugs that you find and write a description of each bug so that it can be re-created by the developer
• at the end of the session, assess the quality level of this product

Aspects covered:

• defining a mission/objective for the testing
• creation of high-level tests and documenting results
• re-creating bugs
• providing an assessment of the quality level

Time allocated:

• 5 minutes to generate a mission and high-level test conditions
• 15 minutes to test the application and write bug information
• 5 minutes to provide an assessment report

Triangle 1 Exercise

Triangle 1 Exercise110831 Page 1 of 1 © Lloyd Roden

Specification

A program reads three integer values on a screen. The three values are interpreted as representing the lengths of the sides of a triangle. The program prints a message that states whether the triangle is:

• an equilateral (all three sides are the same length)
• an isosceles (two sides are the same length), or
• a scalene (all sides are different lengths)

Note that the three side lengths must make a valid triangle, e.g. 4, 4 and 6 make a valid isosceles triangle whereas 1, 1 and 6 do not (see diagrams below).
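Because the exercise asks for test cases that someone other than yourself could run, it can help to write down how you interpret the specification. The sketch below is one reasonable reading of it in Python, usable as a test oracle when choosing data; it is not the source of the program under test, and the function name is invented for illustration.

```python
# One reasonable reading of the triangle specification, usable as a test oracle
# when designing test cases. The real program under test is a separate
# application; this sketch is not its source code.

def classify_triangle(a, b, c) -> str:
    sides = sorted((a, b, c))
    # A valid triangle needs positive sides, and the two shorter sides
    # together must be longer than the longest side (so 1, 1, 6 is rejected).
    if sides[0] <= 0 or sides[0] + sides[1] <= sides[2]:
        return "not a valid triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# A few candidate test cases and the result this oracle expects
cases = {
    (4, 4, 6): "isosceles",                 # valid example from the specification
    (1, 1, 6): "not a valid triangle",      # invalid example from the specification
    (5, 5, 5): "equilateral",
    (3, 4, 5): "scalene",
    (0, 4, 4): "not a valid triangle",      # boundary: zero-length side
}
for sides, expected in cases.items():
    assert classify_triangle(*sides) == expected, sides
```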

1. write a charter for this program stating:
   • why: the mission/objective of the session
   • what: the tests to perform
      o write a set of test cases (i.e., specific sets of data) that you feel would adequately test this program, using any techniques you would like to generate these tests. Include anything that you would consider if this were a “real” (important) application, and write the tests so that someone other than yourself could run them.
   • how: the approach, tools and tactics that will be used
2. run these tests on the application
3. write down the following on the charter:
   • any new tests that were generated
   • any bugs you find
   • any observations that were made during testing
4. write a brief report to the project manager stating the following:
   a. your opinion as to the quality of the application
   b. how much coverage of the application you feel you have made
   c. what you consider the next step should be, and why:
      i. ship the product?
      ii. obtain a new/better version of the product?
      iii. perform more tests?
      iv. anything else?

Aspects covered:

• individual Exploratory Testing
• producing and updating a charter
• providing a de-brief report

Time allocated:

• 5 minutes to generate the first part of the charter
• 20 minutes to run tests
• 5 minutes to write up a brief report

[Diagrams: sides 4, 4 and 6 form a valid isosceles triangle; sides 1, 1 and 6 form an invalid triangle.]

Triangle 2 Exercise

Triangle 2 Exercise110831 Page 1 of 1 © Lloyd Roden

Specification

A new version of the triangle program has been produced (version 2), following some of the problems and observations that were made about version 1 of the program. Not only have some of the bugs been fixed, but the following enhancements have also been made (a sketch of candidate test data for them follows the list):

a) each side is entered into a different field
b) there is no drop-down list
c) the colours have changed to make the application more usable
d) a drawing is made of the triangle
e) any number can be entered (integer and decimal)
f) error messaging has been improved
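One way to record the “which tests to re-run, which to add” decision from the charter is as a small data table. The sketch below lists candidate test data implied by the enhancements above, particularly the new decimal input; the expected results come from the specification’s definitions rather than from the application, and the whole table is illustrative, not part of the workshop pack.

```python
# Candidate test data for the version 2 session (illustrative; the "expected"
# column is what the specification implies, not output captured from the tool).
regression_and_new_tests = [
    # (a, b, c, expected result, reason for the test)
    (4,   4,   6,   "isosceles",            "re-run: valid example from version 1"),
    (1,   1,   6,   "not a valid triangle", "re-run: invalid example from version 1"),
    (3,   4,   5,   "scalene",              "re-run: scalene case, now with separate fields"),
    (2.5, 2.5, 2.5, "equilateral",          "new: decimals are now accepted"),
    (2.5, 2.5, 5.0, "not a valid triangle", "new: decimal boundary, 2.5 + 2.5 is not > 5.0"),
]

for a, b, c, expected, reason in regression_and_new_tests:
    print(f"sides {a}, {b}, {c}: expect '{expected}'  ({reason})")
```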

Work in pairs

1. create a charter for “paired testing” stating:
   • why: the mission/objective of the session
   • what: the tests to perform
      a. which tests from the last test session you will run again (giving reasons for your choice)
      b. based upon the information above, generate some further tests (if you feel this is required) and give your reasons
   • how: the approach, tools and tactics that will be used
2. the execution time (20 minutes, divided into two):
   a. person 1: run the tests on the application
   b. person 2: write down the following on the charter:
      • any new tests that were generated
      • any bugs that have been fixed
      • any bugs that were found
      • any observations that were made during testing
   After 10 minutes, swap roles.
3. choose a bug that you have found and write an “incident report” using the incident report form provided. Write it so that the person receiving it can re-create the bug.
4. write a brief report to the project manager stating the following:
   a. your opinion as to the quality of the application
   b. how much coverage of the application you feel you have made
   c. what you consider the next step should be, and why:
      i. ship the product?
      ii. obtain a new/better version of the product?
      iii. perform more tests?
      iv. anything else?

Aspects covered:

• paired Exploratory Testing
• producing and updating a paired charter
• writing a good bug log
• providing a de-brief report

Time allocated:

• 5 minutes to generate the first part of the charter
• 20 minutes to run tests
• 5 minutes to write up a brief report

PowerPoint Compare Exercise

PowerPoint Compare Exercise110831 Page 1 of 2 © Lloyd Roden

Specification

Introduction

A compare utility has been put together to help identify what has changed between versions of courses. This utility is an integral part of a course Configuration Management System and allows the comparison of 2 PowerPoint presentations.

How it works

A macro called 'Compare' has been built. It can be found in Compare Macros.ppt, under Tools/Macro/Macros. Using this utility you can highlight differences between two PowerPoint presentations (a scripted sketch of a similar text comparison appears after the usage notes below). The PowerPoint difference macro detects the following differences in PowerPoint presentations:

• text that is different (missing or changed)
• charts and images that are different
• extra slides
• missing slides
• animation that is different

Using the utility

• open 2 PowerPoint presentations; the first is known as the “Master” and the second is known as the “Slave”
• open Compare Macros.ppt
• within Compare Macros.ppt run the “Compare” macro
   o this can be found within Tools/Macro/Macros
   o select the “compare” macro and run it

Troubleshooting: if the “Run” button is greyed out, it means that security is set too high and the macros cannot be enabled. To rectify this, go to the Tools menu and select Options. In the Options dialogue box, select the 'Security' tab and then the 'Macro Security...' button. Then in the 'Security' dialogue select the 'Security Level' tab and check the 'Medium' radio button. Save the settings, close PowerPoint, and then reopen the macros and the PowerPoint presentations.

• a “presentation comparison results” screen is produced; this is divided into 3 sections:
   o master only: this shows slides that are present in the master only
   o slave only: this shows slides that are present in the slave only
   o titles in both presentations
• select one of the differences shown and then select the “view differences” button; this then displays the differences within the 2 presentations
• there are now a number of features available:
   o view options:
      • single: shows a single slide of the slave or the master presentation; this can be toggled using the “show slave/show master” button
      • top/bottom: shows the slave and master presentations on top of each other
      • left/right: shows the slave and master presentations next to each other
   o back and next buttons: move through the differences
   o location of presentation: shows the location of the slave and master presentations
   o type of difference detected:
      • shape (such as an image or chart)
      • text
      • animation
      • black and white (difference can only be seen in B&W mode)
   o feature buttons include:
      • B&W: shows the black and white view
      • animate: shows the animation differences
      • reverse and advance: to be included in the next version
      • compare: highlights the differences
      • updatem: updates the master presentation with the slave
      • updates: updates the slave presentation with the master
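As referenced above, here is a rough scripted analogue of the macro’s text comparison, written with the python-pptx library. It is an illustration only: python-pptx reads .pptx files rather than the .ppt files named in the specification, the file names are placeholders, and it covers text differences only, not the shape, animation or black-and-white differences the macro detects.

```python
# A rough, scripted analogue of the macro's text comparison (illustrative only;
# this is not the workshop macro). Requires the python-pptx package.
from pptx import Presentation

def slide_texts(path: str) -> list:
    """Return the concatenated text of each slide, in presentation order."""
    texts = []
    for slide in Presentation(path).slides:
        parts = [shape.text_frame.text
                 for shape in slide.shapes if shape.has_text_frame]
        texts.append("\n".join(parts))
    return texts

def compare(master_path: str, slave_path: str) -> None:
    """Report extra slides, missing slides, and slides whose text differs."""
    master, slave = slide_texts(master_path), slide_texts(slave_path)
    for i in range(max(len(master), len(slave))):
        if i >= len(slave):
            print(f"slide {i + 1}: present in master only")
        elif i >= len(master):
            print(f"slide {i + 1}: present in slave only")
        elif master[i] != slave[i]:
            print(f"slide {i + 1}: text differs")

compare("master.pptx", "slave.pptx")  # placeholder file names
```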


Exercise

1. spend 5 minutes in pairs setting up the utility and running a test on the sample presentations provided
2. spend 10 minutes in pairs reading the requirement specification and creating a charter stating:
   a. why: the mission/objective of the session
   b. what: the tests to perform
      i. high-level tests that you want to run, with expected results
      ii. bugs you want to look for
   c. how: the approach, tools and tactics that will be used
3. spend 30 minutes on paired testing
   the first 15 minutes:
   a. person 1: run the tests on the application
   b. person 2: write down the following on the charter:
      • any new tests that were generated
      • any bugs that were found
      • any observations that were made during testing
      • generate another mission and charter based on your findings
   the next 15 minutes:
   c. person 2: run the tests on the application
   d. person 1: write down the following on the charter:
      • any new tests that were generated
      • any bugs that were found
      • any observations that were made during testing
4. at the end of this session spend 10 minutes writing a brief report of your assessment of the quality of this utility. Special notes should be made against the following:
   a. tests that pass and fail
   b. new tests that you run in addition to the original 2 charters
   c. problems/bugs you encounter
   d. whether this application can be used in its current form or whether a new version is required
   e. what to do next

Aspects covered:

• paired Exploratory Testing
• producing charters based on a requirement specification
• providing a further charter based upon the findings of the first charter
• providing a team de-brief report

Time allocated:

• 5 minutes for setting up the utility
• 10 minutes creating a charter based upon the requirement specification
• 30 minutes to test the application in pairs, creating a further charter
• 10 minutes to provide an assessment

Website Exercise

Website Exercise110831 Page 1 of 1 © Lloyd Roden

1. the overall mission is “to find as many bugs as possible with the www.lloydrodenconsultancy.com website”
   a. conduct a team meeting and break down the mission into sub-missions for individuals and paired testing
   b. each individual and pair should:
      i. utilise their strengths
      ii. produce a charter for their mission
2. run these tests on the application
3. write down the following on the charter:
   • any new tests that were generated
   • any bugs you find
   • any observations that were made during testing
4. re-group to discuss the following:
   • problems found
   • observations made
   • tests that were run
5. as a group, prepare a brief report with the following information:
   • coverage of the website with the tests performed
   • assessment of the potential problem areas
   • assessment of the quality of the website
   • what you would like to do next

Aspects covered:

• team Exploratory Testing
• producing and updating sub-charters
• providing a team de-brief report

Time allocated:

• 10 minutes for allocation of tasks and charter production
• 20 minutes to test the application
• 10 minutes to discuss
• 5 minutes to provide an assessment
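If one of the sub-missions is a quick automated sweep for broken links, a short script can complement the manual sessions. The sketch below is illustrative only and assumes the requests and beautifulsoup4 packages are installed; the workshop exercise itself is manual, team-based exploratory testing.

```python
# One possible scripted sub-mission: a quick broken-link sweep of the site.
# Illustrative only; assumes requests and beautifulsoup4 are installed.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.lloydrodenconsultancy.com"

def check_links(start_url: str, limit: int = 50) -> None:
    """Breadth-first crawl from start_url, reporting pages that fail to load."""
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print(f"ERROR {url}: {exc}")
            continue
        if response.status_code >= 400:
            print(f"{response.status_code} {url}")
            continue
        # Only crawl onward within the site itself
        if urlparse(url).netloc != urlparse(start_url).netloc:
            continue
        soup = BeautifulSoup(response.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            href = anchor["href"]
            if href.startswith(("http://", "https://", "/")):  # skip mailto:, fragments, etc.
                queue.append(urljoin(url, href))

check_links(START)
```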
