scrimps-std: test automation design principles - and asking the right questions!


Download, install, record… we’re automating!?? Asking good questions and building good design to reach reasonable Automation outcomes

Richard Robinson, May 2014

Automation frameworks…. arggghhh!

• So many platforms!

• So many programming languages!

• So many tool vendors!

• So many interfaces!

• So much data!

• So many stakeholders!

• So little time!

Automation framework design principles

• Parameterise your input data

• Modularise your automation blocks

• Setup and tear down maintenance

• Use self-descriptive data (SDD)

• Use a database for scalability

• Manage states

• Expand input stakeholders

• Simple trigger points

• Simple reporting methods

Design principles in a diagram

[Diagram: stakeholders supply input values from flexible entry sources into an automation database; a web UI triggers execution and receives reports; the automation engine executes tests against the system under test, applying modularisation, setup/teardown, parameterisation, self-descriptive data, and state management; the database collects input values and results.]

Parameterise - explanation

• Parameterised data is changed each time the test is run

• Values can be randomised

• Known stored valid values can be used (or invalid, depending on your test)

• Functions can be used to create data depending on other variables

• Input fields reference the variable, not actual concrete data

Parameterise - example 1

• For a registration form with Firstname/Lastname fields, you can use a name lookup list: a long static list of first names of all lengths and punctuation styles.

• Chandrarajan Sivasaravanamuttu (Sri Lankan Cricket President)

• Mata’afa Mulinu’u II (Samoan President)

• Haven’T (an actual boy’s name from 2012)

Parameterise - example 2

• Suburbs in Australia

http://burbs.com.au/australian-postcodes/

Parameterise - example 3

• Credit card numbers used around the world

http://en.wikipedia.org/wiki/Bank_card_number

• Remember to also refer to your institutional whitelist to further reduce the set of valid credit card numbers

Parameterise - example 4

• Mobile numbers in use in Australia:

http://en.wikipedia.org/wiki/Telephone_numbers_in_Australia
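A minimal sketch of what parameterisation can look like in practice (Python; the toy lookup lists stand in for long curated ones, and all field names are illustrative):

```python
import random

# Toy lookup lists -- in a real framework these would be long, curated
# lists or database tables of names, suburbs, card numbers, etc.
FIRST_NAMES = ["Chandrarajan", "Mata'afa", "Haven'T", "Jo"]
LAST_NAMES = ["Sivasaravanamuttu", "Mulinu'u", "O'Brien", "Ng"]

def registration_input():
    """Build fresh input for each run: the test references these
    variables, never hard-coded concrete values."""
    first = random.choice(FIRST_NAMES)
    last = random.choice(LAST_NAMES)
    return {
        "firstname": first,
        "lastname": last,
        # A function can derive data from other variables.
        "display_name": f"{first} {last}",
    }

print(registration_input())  # a different test every time it runs
```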

Parameterise - benefits

• A different test is run each time

• Framework is more extensible to other users

• Can scale the input values

• Reduces maintenance

• Gets around constraints of the data or function

• Data is realistic, which supports any bugs found

• Can build in safety so that live records, accounts, or data are not used

Modularise - explanation

• Modules are building blocks of script or code

• They are like a family tree, and any layer can be executed

• Built once and reused often across scenario tests

Modularise - examples

• Field definitions and reuse - eg. password field is used on registration and also password reset

• Library use of common functions

• Parts of a scenario - eg. log in, register, find random item, checkout, payment, deregister, change password, update profile

• Full scenarios that can be executed in serial to show real end-to-end usage of the system (see the sketch below)

From http://safsdev.sourceforge.net/FRAMESDataDrivenTestAutomationFrameworks.htm
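As a rough sketch (Python; the stub functions are illustrative, not any specific tool’s API), blocks compose into scenarios like this:

```python
# Building blocks: built once, reused often across scenario tests.
# The print calls are stubs standing in for real driver/API steps.

def login(user):
    print(f"log in as {user}")

def find_random_item():
    print("find random item")
    return "item-42"

def checkout(item):
    print(f"checkout {item}")

def change_password(user):
    print(f"change password for {user}")

# A full scenario composed from blocks; full scenarios can themselves
# be run in serial to show real end-to-end usage of the system.
def purchase_scenario(user):
    login(user)
    checkout(find_random_item())

purchase_scenario("test_user_01")
change_password("test_user_01")  # any layer of the family tree can run alone
```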

Modularise - breaking it down

Modularise - benefits

• Fewer code blocks to maintain

• Build once, use often, saves time when building

• Non-expert coders can build scenarios using blocks (premise of BDD)

Setup & teardown - explanation

• Test your framework

• Build quality checks into the framework

• Ensure the system is ready for execution before finding out from a failing functional test

• Check and maintain data

Setup & teardown - examples

• Execution setup script - checking and maintaining environment variables, 3rd party connectivity, server availability. eg ping servers, send a test request

• Data setup script - gather a record to use, and apply any initial checks on it

• Reset your data once it’s been used (see the fixture sketch below)
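A minimal setup/teardown sketch using a pytest fixture (the server list, record shape, and checks are illustrative assumptions):

```python
import socket

import pytest

SERVERS = [("app.test.internal", 443)]  # illustrative environment values

@pytest.fixture
def customer_record():
    """Setup: check the environment and gather a record; teardown: reset it."""
    # Fail fast here rather than inside a functional test.
    for host, port in SERVERS:
        try:
            socket.create_connection((host, port), timeout=2).close()
        except OSError:
            pytest.skip(f"{host}:{port} unreachable - environment not ready")
    record = {"id": "cust-001", "state": "available"}  # gather + initial checks
    yield record
    record["state"] = "available"  # reset the data once it has been used

def test_update_profile(customer_record):
    customer_record["state"] = "in-use"
    assert customer_record["id"] == "cust-001"
```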

Setup & teardown - benefits

• Informed of environmental issues before any test is run

• Can be used as an environment monitoring tool

• Allows for better re-use of data and records

Self-descriptive data - explanation

• Self-descriptive data is labelled to identify itself and the test

• It is constrained by what is acceptable for the field

• Used for quick checks of test results by scanning sizeable output

Self-descriptive data - examples

• PDF report on screen showing transactions and balances

• SMS texts being received for all notifications - matching the action to the message

• Web table data showing groups of results
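A minimal sketch of generating self-descriptive values (Python; the naming scheme and length limit are illustrative):

```python
import datetime

def self_descriptive_value(test_id: str, field: str, max_len: int) -> str:
    """Generate a value that labels itself and the test that created it,
    constrained to what the field will accept."""
    stamp = datetime.datetime.now().strftime("%d%m-%H%M")
    return f"T{test_id}-{field}-{stamp}"[:max_len]

# Scanning sizeable output, a human can see at a glance which test
# produced each record and whether it landed in the right place.
print(self_descriptive_value("042", "lastname", 30))  # e.g. T042-lastname-0607-1430
```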

Self-descriptive data - benefits

• Good for manual output checking from time to time

• Good for simplified results checking, as all tests should fit in the same results class

• Does not require external oracles

• Can generate and verify large amounts of complex data

• Reduces time to do any manual verification

• Reduces diagnosis time when things go wrong

• Good record keeping practice

Database-driven - explanation

• If your framework is starting to grow and expand, then consider using a database for:

- input data storage

- parameter storage (eg. environment variables)

- output result storage

• Excel is great for small operations, but at some point having Excel output files created for all tests becomes unmanageable

Database-driven - examples

• Environmental variables - server names, IPs, usernames, passwords, interface values

• Data-set up - time-based variables

• Input data - customer registration, orders, credit cards
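A minimal sketch with Python’s built-in sqlite3 (the table and column names are illustrative assumptions):

```python
import sqlite3

# One central collection point for input data, parameters, and results.
conn = sqlite3.connect("automation.db")
conn.execute("""CREATE TABLE IF NOT EXISTS input_values
                (test_id TEXT, field TEXT, value TEXT)""")
conn.execute("""CREATE TABLE IF NOT EXISTS results
                (test_id TEXT, run_at TEXT, outcome TEXT)""")

# Input data lives in the database instead of scattered Excel files...
conn.execute("INSERT INTO input_values VALUES (?, ?, ?)",
             ("reg-001", "firstname", "Mata'afa"))
# ...and results are written back to the same central store.
conn.execute("INSERT INTO results VALUES (?, datetime('now'), ?)",
             ("reg-001", "pass"))
conn.commit()

for row in conn.execute("SELECT * FROM results"):
    print(row)
```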

Database-driven - benefits

• Scalable

• Efficient processing of data, results, and reports

• Centralised - one collection point for all data and results

• Allows for multiple users

State management - explanation

• Know your record states

• Maintain record states as in-use

• Return records for re-use

State management - examples

• Capture main records in a DB, or file

• Capture defining field values eg. in-use state, customer cohort, type of payment method, address type, comms type
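A minimal in-memory sketch of state management (the record shapes are illustrative; a real framework would keep this in the database so multiple executions and users can share it safely):

```python
records = {
    "cust-001": {"state": "available", "cohort": "gold"},
    "cust-002": {"state": "available", "cohort": "new"},
}

def acquire(cohort):
    """Find a free record of the right kind and mark it in-use."""
    for rid, rec in records.items():
        if rec["state"] == "available" and rec["cohort"] == cohort:
            rec["state"] = "in-use"
            return rid
    raise RuntimeError(f"no available {cohort} record")

def release(rid):
    """Return the record for re-use by the next test."""
    records[rid]["state"] = "available"

rid = acquire("gold")
try:
    print(f"testing with {rid}")
finally:
    release(rid)  # always return the record, even when the test fails
```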

State management - benefits

• Efficient for role-based, and state-based systems

• Minimises the occurrence of record conflicts

• Allows for multiple instances of execution

• Allows for multiple users to execute

Stakeholder usage - explanation

• Framework used by any stakeholder

• Additional framework components built to support others

• Access to data inputting, execution, results generation

• Input to methodologies such as BDD, ATDD, or any keyword-driven framework implementation

Stakeholder usage - examples

• Adding values to data input lists

• UI to add values to input data directly to database

• UI to trigger tests on demand

• Running a demo at a stakeholder meeting

• Obtaining reports

Stakeholder usage - benefits

• Buy-in from stakeholders, increase in investment

• Credibility and quality for all to see

• More feedback on quality of tests

Simple triggers - explanation

• Automation can be used for many purposes and for many stakeholders

• If the trigger points are easy to use, then the framework will get used more often

Simple triggers - examples

• Web front end page with scenarios and suites available as a menu to visitors

• Mobile app to trigger tests (see the sketch below)
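A minimal web-trigger sketch using Flask (the endpoint, suite names, and the pytest command are illustrative assumptions):

```python
import subprocess

from flask import Flask

app = Flask(__name__)

SUITES = {"smoke": "tests/smoke", "regression": "tests/regression"}

@app.route("/run/<suite>", methods=["POST"])
def run(suite):
    if suite not in SUITES:
        return f"unknown suite: {suite}\n", 404
    # Fire and forget; a real framework would queue runs and report back.
    subprocess.Popen(["pytest", SUITES[suite]])
    return f"started {suite}\n", 202

if __name__ == "__main__":
    app.run(port=8080)  # visitors pick a suite from this simple menu
```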

Simple triggers - benefits

• Framework more widely used, improving value (“ROI”)

• More usage means more testing (and checking), which means a greater chance of finding bugs

Simple reporting - explanation

• Reporting is what stakeholders see and use to communicate to others

• Publish automation runs to a web GUI

• Allows layers of access to different reports (role-based login)

Simple reporting - examples

• Report to levels of management, or stakeholder groups - eg. product owner, development team, managers

• Report to levels of test execution - eg. acceptance test runs, function test runs, environmental monitoring, data-driven tests, scenarios (see the sketch below)
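A minimal reporting sketch: render the central results table as a static HTML page for the web GUI (the table and column names follow the database sketch above and are assumptions):

```python
import sqlite3

conn = sqlite3.connect("automation.db")
rows = conn.execute(
    "SELECT test_id, run_at, outcome FROM results ORDER BY run_at DESC"
).fetchall()

html = ["<html><body><h1>Automation runs</h1><table border='1'>",
        "<tr><th>Test</th><th>Run at</th><th>Outcome</th></tr>"]
for test_id, run_at, outcome in rows:
    html.append(f"<tr><td>{test_id}</td><td>{run_at}</td><td>{outcome}</td></tr>")
html.append("</table></body></html>")

# Publish this file behind the web GUI; role-based login can control
# which stakeholder groups see which report.
with open("report.html", "w") as f:
    f.write("\n".join(html))
```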

Simple reporting - benefits

• Exposes the value of automation to the stakeholders

• Allows for feedback into the quality of the framework

• Allows for higher usage of framework by other stakeholder groups

Focussing questions… (we are not done yet!)

• Do you agree?

• Learned something new?

• Applicable? How much do I implement the design principles?

• Are there any more?

• What’s missing?

10th design principle… CONTEXT IS KING!

• What are the outcomes we are trying to realise, and why?

• What problem are we trying to solve?

• What matters and what questions do we ask?

[Diagram: outcome, automation, and problem sit together within their context & environment]

What outcome are we trying to achieve?

• To automate 100% of the functions

• To gain 100% coverage of a specification

• To reduce the number of manual testers to zero

• To shorten regression cycles to 1 minute

• To deepen exploratory testing to be exhaustive

• To reduce administration overhead to zero

• To monitor all user pathways

• To deliver working code overnight, ready for utilisation

What problems are we trying to solve?

• Regression testing takes too long and is too costly

• We cannot afford all these testers; let’s get rid of 40%

• We are losing quality from our product; let’s automate

• Let’s make use of dead time by testing overnight, automatically

• Let’s save time by checking the quality of the release prior to manually testing it

• There are too many bugs found in production; let’s increase our testing coverage by automating

What matters?

Strategy - Cost considerations

• Prototype to get further funding?

• Automation is the saviour of our project: spend, spend, spend

• Whose vision is it to automate? Do they have a budget?

• Are the automation team in budget?

Strategy - People considerations

• Full-time dedicated resources, or stealing developer time

• Proven skills, or developing them

• Full domain knowledge, or getting it from business resources

• Expert automation consultant (selling of implementation)

• Expert automation specialist (coding, implementing)

Strategy - Time considerations

• Big bang approach, or easy onset

• Quick wins, or consistent development

• Milestone based, or open ended outcomes

Strategy - System considerations

• Platform tools, language

• Interfaces - 3rd party vendors, access

• Data sensitivity, record availability

• Environment management and coverage

• Deployment infrastructure and processes

Strategy - Quality considerations

• How to measure?

• Are we actually speeding things up?

• Are we actually reducing any cost?

• What are our maintenance costs?

• When do we know it’s not working?

• What does ‘good’ look like?

Want a trick to remembering the principles?

The single most important reason for management buy-in to Automation is… to reduce cost, or save…

and another word for ‘save’ is…

so…

Let’s rearrange the principles…

• Setup and tear-down

• Context is king

• Reporting interface

• Input stakeholders

• Modularise

• Parameterise

• Self-descriptive

• State management

• Trigger interface

• Database-driven

And take the first letters together…

and you get…

SCRIMPS STD

(Scrimps Standard)

“A management-supported automation framework design standard, with the sole objective to gain economical outcomes of the quality assurance function”

What is your context? Experience reports…

• Setup and tear-down

• Context is king

• Reporting interface

• Input stakeholders

• Modularise

• Parameterise

• Self-descriptive

• State management

• Trigger interface

• Database-driven

Resources

Excellent overview of testing and automation, best next steps to read.

Kaner, 2010, http://kaner.com/pdfs/VISTACONexploratoryTestAutmation.pdf

Models of automation

McCowatt, http://www.ministryoftesting.com/2014/05/automation-time-change-models-iain-mccowatt/

http://exploringuncertainty.com/blog/archives/1010

Problems, solutions, maintainability

http://www.ministryoftesting.com/2010/07/an-introduction-to-test-automation-design/

Self-verifying data

Doug Hoffman, 2012, http://www.ssqa-sv.org/presentations/Doug%20Hoffman%20SSQA%2011-13-2012.pdf

Noel Nyman, Microsoft, 2001, http://www.stickyminds.com/sites/default/files/presentation/file/2013/01TAU_T6.pdf

Maintainability

http://dhemery.com/pdf/writing_maintainable_automated_acceptance_tests.pdf

Automation test pyramid

Martin Fowler (2012), http://martinfowler.com/bliki/TestPyramid.html

Framework design

Carl Nagle, “The test framework should be application-independent”

http://safsdev.sourceforge.net/FRAMESDataDrivenTestAutomationFrameworks.htm

General automation methodology

Borland (2012), https://www.borland.com/_images/Silk-Test_WP_How-to-successfully-automate-the-functional-testing-process_tcm32-205735.pdf

IBM, http://public.dhe.ibm.com/common/ssi/rep_wh/n/GBW03035USEN/GBW03035USEN.PDF

Hybrid approach most popular

Yogi Gupta, general implementation methodology

http://www.slideshare.net/yogindernath/understanding-of-automation-framework-presentation

Resources continued…

IMPLEMENTATION

Good design ideas, technical implementation (2011)

http://technologyandleadership.com/30-feet-view-of-test-automation-framework-with-selenium/

Generic implementation principles

http://www.theeffectiveengineer.com/blog/what-makes-a-good-engineering-culture

Doug Hoffman on Automation

http://www.slideshare.net/Softwarecentral/software-test-automation-beyond-regression-testing

Failure and problems with automation

http://five-ways-to-make-test-automation-fail.pen.io/

Smartbear Guide to converting to automation testing - desktop http://www2.smartbear.com/rs/smartbear/images/Your%20Guide%20to%20Converting%20to%20Automated%20Testing.pdf

Alan Page - The A Word - desktop https://leanpub.com/TheAWord

Automation Checklist - strategy, business, tests

https://hackpad.com/Tips-to-Approaching-Test-Automation-fDRllKNrRJ2

About the presenter

An independent contract tester who works primarily in automation, performance, and test management roles.

The current President of the Sydney Testers Meetup, and facilitator of WeekendTesting.com/WTANZ.

He heads the facilitation of the Let’s Test Oz and CAST conferences.

Blogs at www.hellotestworld.com and publishes his ideas to slideshare, and testing magazines such as Testing Trapeze.

A Black Belt Tester of the Miagi-do school of software testing, and a founding member of the ISST.

Richard Robinson

@richrichnz

richrob79
