Improving Specifications and Test to Enhance System Quality
Sigrid Eldh, PhD, Adj. Prof., Ericsson AB, Sweden
ICES Presentation 2016 | Public | © Ericsson AB 2016 | 2016-07-06 | Page 2
Dr. Sigrid Eldh
› Ericsson: 20+ years at many levels of test, plus 10+ years of management
– Coordinating Ericsson Research on Software Test and Debug (fault finding…)
› 10+ years of experience from other businesses (mgmt.): HP, government, consultancy, university
– Supervised/supervising 6 PhD students
› Started SAST, ISTQB, SSTB, ASTA
› PhD “On Test Design”, 2011
Twitter: @DrSEldh
Do the “right” system for the customer
› Ultra-large, complex, backward compatibility, combinations
› Most focus is on functional aspects, easily forgetting the non-functional ones
› It is not sufficient to address only one abstraction level, from “high level” down to detailed implementation (especially with proprietary hardware)
› Massive parallelism in the system: redundancy, reliability, robustness
› Going from HW to SW, from proprietary to virtual/cloud
› Transformation to the NETWORKED SOCIETY = INTERNET OF THINGS
[Diagram: testability spectrum. System types placed along axes of observability and controllability, ranging from hard real-time systems, embedded systems/MC, soft real-time systems, OS, and database systems toward cloud, web, and user interface/GUI; high observability plus high controllability gives high testability.]
Requirements Context: for us, and probably for many industries
› Few customer requirements on internal/lower levels (few GUIs)
– Embedded, hidden M2M, multi-layered
– We must define the requirements ourselves
› Explore the unknown
› Customers’ requirements are textual, implicit, derived from standards, and “too late”…
› We drive future trends and customer wishes
› Hence, from a test standpoint: FUZZY requirements, sometimes all the way until “solved”/implemented
The Reality of Testing More Complex Systems
› Difficult test environments
Test Environment Solution?
› Simulators/emulators
› Before: “test quality into the system, end-to-end”
› Now: automation, with more than a million test cases and thousands running every hour
› Future: hopefully BETTER automation, with
– Statistical analysis / decision-making analysis support
– Mutation testing at more levels than one
– Configuration-“sensitive” executions (“what is a configuration definition?”)
› Refactoring of the TEST ARCHITECTURE
› Automatic improvement of test cases
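Mutation testing, one of the hopes listed above, can be shown with a minimal sketch. The `clamp` function, its mutants, and the deliberately weak test suite are all invented for illustration, not Ericsson code: a mutant that no test can distinguish from the original has “survived”, pointing at a gap in the suite.

```python
# Minimal mutation-testing sketch: run a small test suite against the
# original function and against hand-made mutants; a mutant that passes
# every test has "survived", revealing a gap in the suite.

def clamp(x, lo, hi):               # original implementation under test
    return max(lo, min(x, hi))

def mutant_off_by_one(x, lo, hi):   # mutant: lower boundary shifted
    return max(lo + 1, min(x, hi))

def mutant_swapped(x, lo, hi):      # mutant: min/max swapped
    return min(lo, max(x, hi))

def suite_passes(fn):
    """Return True if fn passes every test in the (weak) suite."""
    cases = [((5, 0, 10), 5), ((15, 0, 10), 10)]  # no test below the lower bound!
    return all(fn(*args) == expected for args, expected in cases)

killed = {m.__name__: not suite_passes(m)
          for m in (mutant_off_by_one, mutant_swapped)}
print(killed)  # mutant_off_by_one survives: the suite never tests x < lo
```

Real mutation tools generate the mutants automatically from the source; the surviving mutant here tells us exactly which test case to add.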
› Design specification? Implementation specification?
› Textual requirements are insufficient (even if kept in a tool)
› Requirements in our context are a “delta” from the last version
– All other requirements are “gone”; we only talk about “existing test cases” = LEGACY!
› They do not mirror the entire system and how it works
– Complex systems need break-down: every level needs to be tested
› Testing requirements (only) is greatly insufficient and incomplete
– Even if we add all LEGACY tests
– We cannot define everything about a software system in the requirements
› Customer focus might mean we miss other markets: over-tailoring
› Separate the requirement from the technical solution (implementation detail)
The main reason for not modelling is that specifying systems in full and up front is not feasible: systems are often too complex.
Also a lack of competence, time, and history.
– For smaller, well-defined systems it Is Possible!
Dependencies/Relations, Opinions/Views
Requirements Interaction Analysis: how requirements depend on and relate to each other (integration test), and how to handle different viewpoints
– Robinson WN, Pawlowski SD, Volkov V. Requirements interaction management. ACM Computing Surveys (CSUR). 2003;35(2):132-190.
– Sommerville I, Sawyer P. Viewpoints: principles, problems and a practical approach to requirements engineering. Annals of Software Engineering. 1997;3(1):101-130.
An interesting list*:
1. The interactions among technical requirements (both intra- and inter-viewpoint)
2. The interactions among legal and technical requirements
3. The mapping of requirements to the architecture
* Gürses S, Seguran M, Zannone N. Requirements engineering within a large-scale security-oriented research project: lessons learned. Requirements Engineering. 2013;18(1):43-66.
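A sketch of how such interaction analysis might look in practice, assuming a hypothetical set of requirement IDs and “depends on” relations (the IDs and graph are invented, not taken from the cited papers):

```python
# Sketch of requirements-interaction analysis: model requirements as a
# directed graph of "depends on" relations and compute which requirements
# a given one transitively interacts with, a precondition for choosing
# what to cover in integration test.

depends_on = {            # hypothetical requirement IDs
    "R1": ["R2"],         # R1 depends on R2
    "R2": ["R3"],
    "R3": [],
    "R4": ["R2"],
}

def transitive_deps(req, graph):
    """All requirements that req transitively depends on."""
    seen, stack = set(), list(graph.get(req, []))
    while stack:
        r = stack.pop()
        if r not in seen:
            seen.add(r)
            stack.extend(graph.get(r, []))
    return seen

print(transitive_deps("R1", depends_on))  # {'R2', 'R3'}
```

Real requirements bases add typed relations (conflicts, refines) and viewpoints, but even this bare reachability view tells you which requirement pairs an integration test must exercise together.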
Dealing with Reality
› A further complication is that the existing system must also be taken into consideration, as well as the test environments, architectures, evaluation, costs, organizations, frameworks, and tools that surround the test in any software-developing industry.
The Ultimate Goal
OUR ultimate goal: find ”all” failures and ”cover” all aspects of the system
› We should at least find the failures that matter and are important for the product
– And that work for the customer (every environment/configuration and context is different)
› Every selected set of tests is also a selection of the failures we are likely to find
[Diagram: system levels mapped against system tests]
It is not…
› Test verdicts, the outcome, and the log from executions contain information
– IF we have sufficient trace information that aids in giving information
› It was a necessary step, because we had 11 test verdicts before… a typical evolution of an area: every test “result” told you something…
– So no, simplicity does not always help
– Test cases are not only True or False, “not run”, or “blocked”
– Combining “the reason” improves the analysis
› This requires that we ADD information, to be interpreted and understood
– The future focus is more about “adding, retrieving, sorting and analyzing data”
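A minimal sketch of verdicts richer than pass/fail, with invented test-case names and reasons; attaching “the reason” to each verdict is what lets the analysis separate product faults from environment problems:

```python
# Sketch of richer test verdicts: each result carries a reason, so
# post-execution analysis can tell product faults from environment faults.
from collections import Counter
from enum import Enum

class Verdict(Enum):
    PASS = "pass"
    FAIL = "fail"
    NOT_RUN = "not_run"
    BLOCKED = "blocked"       # e.g. test environment unavailable

results = [                   # hypothetical execution log
    ("tc_001", Verdict.PASS, None),
    ("tc_002", Verdict.FAIL, "product fault"),
    ("tc_003", Verdict.FAIL, "config mismatch"),
    ("tc_004", Verdict.BLOCKED, "simulator down"),
]

# Only one of the two failures points at the product itself; without the
# reason field, both would just be "FAIL".
by_reason = Counter(reason for _, v, reason in results if v is Verdict.FAIL)
print(by_reason)
```

An industrial scheme has many more verdict values and structured log traces, but the principle is the same: add information at execution time so it can be retrieved, sorted, and analyzed later.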
DIVIDE and Conquer
Many ways to combine and influence the actual quality of the system
Test Analysis
Test analysis: understanding the software/system or “product”; understanding the context of our test design, execution and measurements, the people, the goals, etc.
› We know what we have done before: experience, legacy
› Do we know what we did not do before?
– New tools, new methods, new approaches
Test Design
› The issue is the selection of WHAT we test and HOW we test it
› How mature is industry in its test design? IMMATURE!
– There is an entire universe here to explore!
› Still, the more advanced code coverage is a great measure (!)
› Simple techniques go a long way!
– But you have to do them! Repeatedly!
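As an example of a simple technique that goes a long way, here is a minimal sketch of boundary-value analysis for a hypothetical valid input range:

```python
# Boundary-value analysis, one of the simplest test-design techniques:
# for a valid input range [lo, hi], test the boundaries and their
# immediate neighbours, where off-by-one faults cluster.
def boundary_values(lo, hi):
    # A set removes duplicates when the range is very narrow.
    return sorted({lo - 1, lo, lo + 1, hi - 1, hi, hi + 1})

print(boundary_values(0, 100))  # [-1, 0, 1, 99, 100, 101]
```

Six inputs instead of 101, yet they catch exactly the boundary mistakes that a handful of “typical value” tests miss. The technique only pays off if it is applied systematically, and repeatedly, to every constrained input.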
How can we know the Quality of our Software?
Test/execution is the measurement we have of Quality.
How can we know the Quality of our TEST?
Systematic Improvement to Increase Reliability
[Slide of competing opinions:]
› “ALL PARTS MATTER!”
› “Faults! I SAY FAULTS! Faults are costly, cannot happen”
› “TOOLS – let’s discuss TOOLS”
› “Agile – the CI machinery”
› “We are now automating our test execution”
› “Just use a good test PROCESS”
› “Requirements should just be better”
TAIM – Test Automation Improvement Model

Maturity levels (Level – Focus):
1 Initial        – Metrics defined & deployed (Initial)
2 Repeatable     – Data collected (Analysis)
3 Defined        – Mechanism (Statistical validity)
4 Self-Managed   – Actions and issues “highlighted” (Accuracy)
5 Self-Optimized – Cost minimization; safety-critical; fail-safe
General: cost, standards, metrics

Key areas (Area – Focus – Automation goal):
1  Test Management – Planning factors, automation ++, trend, cost etc. – Self-adapting guidance; management “redundant” in ongoing work
2  Test Requirements – Standards, e.g. ISO/IEC 25010 testability – Traceability, validation
3  Test Specifications – TDT, TC generation, pre-process
4  Test Code – Language, templates (models), architecture
5  Test Automation Process – Context, type, level, CR/AR mgmt., improvement, flow
6  Test Execution – Selection, type, when (functional/non-functional, regression) – Automatic priority schemes of what, how and when to execute (validate)
7  Test Verdicts – Test oracle, post-process – Validity & gap analysis
8  Test Environment (context) – Set-up/prep; type: simulated/emulated/HW/virtual; test data – Self-installing, self-configuring, self-utilizing; self-optimizing
9  Tools – Selection, integration (tool-chains), components/API
10 Fault/Defect Management – CR/AR; classification, identification/triage, localization, prediction – Self-healing systems; cost optimization
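Row 6 of the TAIM table mentions automatic priority schemes for what, how and when to execute. A minimal sketch of one such scheme, ordering regression tests by historical failure rate; the test names and statistics are invented for illustration:

```python
# Sketch of an automatic priority scheme: order regression tests by
# historical failure rate so the most fault-revealing tests run first,
# which matters when thousands of tests run every hour.
history = {                      # test -> (runs, failures), hypothetical
    "tc_login":    (100, 1),
    "tc_handover": (80, 12),
    "tc_upgrade":  (50, 5),
}

def prioritize(hist):
    """Highest historical failure rate first."""
    return sorted(hist, key=lambda t: hist[t][1] / hist[t][0], reverse=True)

print(prioritize(history))  # ['tc_handover', 'tc_upgrade', 'tc_login']
```

Production schemes combine more signals (code churn, coverage of changed files, execution cost), but the pattern is the same: a sort key computed from collected execution data, which is exactly the level-2 “data collected” prerequisite in the maturity ladder above.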
Focus Areas in TAIM
1. Test Management
   1. Planning & deployment
   2. Evaluation
   3. Automation analysis
   4. Technical debt
2. Test Requirements
   1. Traceability
   2. Validation
3. Test Specifications
   1. Test case generation
   2. Test design technique (TDT)
   3. Pre-process analysis
4. Test Code
   1. Language
   2. Standards/templates
   3. Architectures (within code)
5. Test Automation Process
   1. Context, type, level
   2. CR/AR
   3. Improvements
   4. Flow, speed & workflows
6. Test Execution
   1. Selection
   2. Functional
   3. Non-functional (robustness, …)
   4. Regression test (legacy)
7. Test Verdicts
   1. Post-process analysis
   2. Test oracle
8. Test Environment (context)
   1. Test case set-up
   2. Type: simulated, emulated, limited, actual
   3. Test data
   4. Standards and certification suites and APIs
9. Test Tools
   1. Tool selection
   2. Integration, context (“tool chain”)
   3. Tool(s) architecture: classification
   4. Components, APIs
10. Fault/Defect Management
   1. Change report/anomaly (failure bug reports)
   2. Classifications
   3. Fault identification, triaging
   4. Fault localization
   5. Fault correction
   6. Fault prediction
General: measurements, standards, cost
Major Challenges
› Lack of understanding of the role of, and need for, testing for quality
› Lack of sufficiently serious (university) education on test leads to lack of know-how on test
– Developers do not understand how to test their code
› Some lack of focus in research: formal verification vs. “real” test with large & legacy systems…
– Historic reasons have an impact
– This also goes for poor architecture: untestable/low testability (but here is the cost)
› Time-lapse!
Mature Your Test Strategy
› Know the difference between a PLAN and a STRATEGY!
– Different focus: short-term, medium-term and long-term test strategies
› More than 90% of the issues called “test problems” are NOT test problems!
– TECHNICAL DEBT
– Cost of poor quality
› Non-functional aspects: performance, installability, upgradability, adaptability, reliability, maintainability, interoperability, compatibility, usability, security…
Thank you for Your Attention