what do testers hate about testing
DESCRIPTION
A nice list about what testers *hate* about software testing. By the Software Testing Club.
TRANSCRIPT
What do testers hate about testing?
A short while back there was an interesting thread on the
Software Testing Club about what testers hated about
testing. The results were interesting and entertaining. Here
we present 62 of them.
As a technology industry should we all reflect on these
points to see how and if it is possible to improve ourselves
and our industries?
www.softwaretestingclub.com
1 Finding the same silly bugs year after year whatever program you are testing - leaving an input
box blank, finding that 99999999999 overflows. Yawn.
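The two "silly bugs" in item 1 are classic boundary cases. As a minimal sketch, a hypothetical input handler (`parse_quantity` is invented for illustration, not from any real product) that guards against both might look like this:

```python
# Sketch of the boundary checks item 1 complains about.
# `parse_quantity` is a hypothetical input handler, invented for this example.

def parse_quantity(text: str) -> int:
    """Parse a quantity field, rejecting the two perennial 'silly bug' inputs."""
    if not text.strip():
        raise ValueError("input must not be blank")
    value = int(text)
    # Guard against the classic 99999999999 overflow of a 32-bit field.
    if not (0 <= value <= 2**31 - 1):
        raise ValueError("value out of range")
    return value

# The two inputs testers find year after year:
for bad in ["", "99999999999"]:
    try:
        parse_quantity(bad)
        print(f"{bad!r}: accepted (bug!)")
    except ValueError as e:
        print(f"{bad!r}: rejected ({e})")
```

The point of the complaint, of course, is that checks this cheap keep being missing.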
2 People asking for the program to be QA'd when they mean tested.
3 Calling testers "QA" (Quality Assurance). QA is a process, not a job title.
4 Not having time to satisfactorily determine the root cause of unexpected behaviour.
5 Crappy, unreliable test environments where most of my time and effort is expended identifying
environment issues rather than product issues.
6 The same problem recurring again and again.
7 People who just follow the process by rote without using their brains.
8 Wasted effort.
9 Poor test releases where blocking bugs are found within minutes meaning testing needs to stop.
10 UI Automation
11 Code that isn't architected in a testable manner, causing testers to have to write integration tests
when simple unit tests could find the same bugs.
12 Testers who get comfortable with what they already know and stop pushing themselves to learn
more.
13 Developers believing they know HOW to test, WHEN to test and WHAT to test. (Note: Not aimed at
ALL devs. Just some of them.) (Note: If they do know testing - then why do we find so many
bugs?)
14 Non-testers' belief that automation is not only a silver bullet, but a really cheap one at that.
15 Poorly interpreted or miscommunicated requirements and designs which result in a system that is
not only unstable but is also not future proof.
16 The almost unbelievable emphasis and trust placed in metrics and their value by testers and non-testers (especially when used to gauge a tester's competence and project completion).
17 "Best Practices", and certain industry beliefs that they actually exist and should be followed to the letter.
18 Unproductive test-tool shelfware, which was historically over-purchased and still swallows a
recurring chunk of budget despite my protestations.
19 Inaccurate test prep: you have a script whose objective is to test a specific condition, yet the tester hasn't properly researched the way the system works, so the script is effectively pointless.
20 My corresponding lead developer!!!
21 The fact that no matter how methodically and conscientiously you plan & test, once in a while a trivial (and catastrophic) bug will escape that was literally one click away from at least five of the scenarios you ran before the release.
22 Many of our colleagues who see testing as a stepping stone to their "real job" within the company.
23 Project managers, developers and non-testers in general who don't understand why you can't "just
run a quick pass, so that we can release on time."
24 The lack of standard education around the field.
25 "Testing is the bottleneck" - attitude.
26 Delivery date stays the same when development time spills over (way over).
27 Forgetting that the process is iterative.
28 No/Bad/Clueless responses from project managers.
29 Sometimes we view ourselves as second-class citizens.
30 The perception that a career in testing is somehow of less value.
31 Falling into the automation trap: projects and organisations jumping into automation development for all the benefits, without always considering the costs.
32 Day 1 revelation and verdict, "We haven't yet discussed performance testing. Go performance test
it."
Day 2 questions, "Are you done performance testing? Where are the results?"
33 Test execution is an unskilled task: only scoping and planning require skill and knowledge.
34 Testers are responsible for putting quality into the product: gimme my sack of magic quality
beans.
35 Successful testing is about passing tests.
36 Clients or PMs asking to certify a release. Against what? For what? They have no answer.
37 Testers testing on a developer's PC, and bad lab environments for testing.
38 Generating and sharing meaningless reports/metrics.
39 Waiting for a release from the development team.
40 Testers doing mundane manual tasks repeatedly where technology could be applied (for example, preparing a 1 GB file by hand or typing 255 characters).
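The drudge work item 40 describes is a few lines of code. As a minimal sketch (file name and sizes are illustrative; swap in `1 * 1024**3` for the full 1 GB case):

```python
# Sketch: generating item 40's test data programmatically instead of by hand.
import os
import tempfile

def make_big_file(path: str, size_bytes: int) -> None:
    """Write `size_bytes` of zero-filled data in 1 MiB chunks."""
    chunk = b"\0" * (1024 * 1024)
    with open(path, "wb") as f:
        written = 0
        while written < size_bytes:
            part = chunk[: size_bytes - written]
            f.write(part)
            written += len(part)

# A 255-character boundary input, typed by no one.
max_len_string = "x" * 255

# Small 4 KiB demo file here; use 1 * 1024**3 for the real 1 GB scenario.
path = os.path.join(tempfile.gettempdir(), "filler.bin")
make_big_file(path, 4096)
print(os.path.getsize(path), len(max_len_string))
```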
41 The testing team not being informed about changes.
42 Finding schedules and budgets have been set with insane optimism by people who know nothing
about testing, but reckon that if everyone else does their job then the testing will be a breeze.
43 Having to explain for the 100th time the difference between defect fix priority and defect severity.
“Yes, I know it has to be fixed and it will be before we go live, but there are other fixes we need
now so we can carry on with testing”.
44 Having to explain to users that exit criteria from testing are not an aspiration or a target, but
triggers that will stop the implementation if they’re exceeded.
45 Builders of the test environment who think that proving the environment works is the job of the
test team.
46 Being at the bottom of the food chain when using waterfall (yay Agile).
47 Untestable requirements.
48 Project managers.
49 The amount of meetings and the time it takes to get through them.... I have a pet peeve about
preparedness, and this really, REALLY gets to me at times. Think about how much time could be
saved if everyone attending was prepared... and time IS money. So part of the annoyance is
business related, and part of it is testing related... I would rather be testing :)
50 Testing time being crunched because developers ran over schedule, which really comes back to
executives making poor estimates in the initial stages.
51 Badly written defect reports. It's just not that hard to do, you just have to give a damn.
52 Automated testing tools that require as much scripting skill as the code base they are testing. Why should it be that hard?
53 People thinking that testing certifications, in their current state, have value (testers, companies, et al.).
54 Assumptions, especially when dealing with the impact of code changes.
55 Testers that don't continue to learn their craft and hone their skills (or have open minds for
thoughtful discussions).
56 Managers who ask for a QA strategy, then, after you spend time writing it, cut everything down for lack of time or resources. Why bother writing a strategy then?
57 Not finding the time to read all the books and blogs on testing that I want to.
58 Testing being seen as a checklist.
59 Writing Test Plans which will never get read.
60 Realising your test plans were never read, after going back to review them and noticing that what you wrote in no way resembles what you actually ended up doing, because you defined areas for testing that were never going to be in the software.
61 Having to write test plans with information you gathered from outdated marketing material where
the software detailed in the material was 5 revisions back and was already outdated before you
even joined the organisation.
62 All the talk about respect. Respect is earned.
Big credits to Phil Kirkham for starting the original forum post:
http://www.softwaretestingclub.com/forum/topics/5-things-you-hate-about
And to all the Software Testing Club members who participated:
Fatima, Glenn Halstead, Rosie Sherry, Jason Barile, Rob Lambert, Andy Smith, Joel Montvelisky, Simon
Godfrey, Peter L, Mak, Anne-Marie Charrett, Teemu Vesala, Simon Morley, Sean Murphy, Jake Brake, Pari,
Anna Baik, Linda Wilkinson, Sharath Byregowda, Bhagawati, James Christie, Sherilyn Tasker, Michelle
Smith, Reddy, Jeroen Rosink, Tom Lo, Joseph Ours, Shrini, Georgia Motoc, Trisherino and Tony Bruce.