Intro to Specification By Example
Transcript of Intro to Specification By Example
In the real world: a journey, not a destination
Approach Zero Defects
Find bugs early (it costs less)
Build the right thing
Just what is it?
Specification by Example, also known as BDD, ATDD, AAT, STDD.
Essentially, a lightweight specification based on example behaviour.
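As a minimal sketch of the idea (the `free_delivery` rule and its threshold are invented for illustration), a specification by example pairs a business-readable example table with a check against the code:

```python
# A minimal sketch of specification by example: business-readable
# examples drive the check. The domain rule below is invented
# purely for illustration.

def free_delivery(order_total, is_member):
    """Toy rule under test: members get free delivery from 20 upwards."""
    return is_member and order_total >= 20

# Examples, as the business would state them:
# | order total | member? | free delivery? |
examples = [
    (25.00, True,  True),
    (25.00, False, False),
    (19.99, True,  False),
]

for total, member, expected in examples:
    assert free_delivery(total, member) == expected
print("all examples pass")
```

The table, not the assertion loop, is the valuable part: it is the piece the whole team can read, discuss and agree on.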
Application drivers can be difficult
Payback can take time
Many hurdles (practice pitfalls)
Probably won't make you any faster
(But will reduce rework due to bugs)
Will blur roles within teams
The process
Break work into functional blocks
Team collaborates on developing examples
Individuals create the executable specification
Developer instruments spec
Developer implements code to prove examples work
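The "developer instruments spec" step can be sketched as binding plain-text example lines to step functions. Real tools (Cucumber, FitNesse, SpecFlow) do this far more fully; the patterns and step names below are invented for this sketch:

```python
# Sketch of instrumenting a spec: plain-text example lines are
# matched against registered step functions by a tiny matcher.
import re

steps = {}

def step(pattern):
    """Register a function as the implementation of a spec line."""
    def register(fn):
        steps[pattern] = fn
        return fn
    return register

@step(r"the basket contains (\d+) items")
def given_basket(ctx, n):
    ctx["items"] = int(n)

@step(r"the item count is (\d+)")
def then_count(ctx, n):
    assert ctx["items"] == int(n)

def run(spec_lines):
    """Execute each spec line via its matching step function."""
    ctx = {}
    for line in spec_lines:
        for pattern, fn in steps.items():
            m = re.fullmatch(pattern, line)
            if m:
                fn(ctx, *m.groups())
                break

run(["the basket contains 3 items", "the item count is 3"])
```

The spec text stays readable by the business; the matcher and step functions are the developer's instrumentation underneath it.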
Tests unusable as live documentation
Expecting acceptance tests to be a full regression suite
Focussing on tools
Failure to recognise the value
Test code not maintained with love
Team members' objectives not aligned
No management buy-in
Underestimating the effort required
Focussing on HOW, not WHAT
Specifications define when a story is complete; examples are collaboratively produced and agreed upon.
Business people, developers and testers give input from their own perspectives.
Automated examples prove the team has built the correct thing.
The collaboration process is more valuable than the tests themselves.
Developers writing examples themselves.
Testers trying to handle all the acceptance testing.
Business dictating tests.
Team members missing from spec workshops.
Hard to maintain tests
Far from facilitating system change, tests focussing on 'WHAT' inhibit change since you become reluctant to break the tests.
Possible poor performance and slow feedback.
AAT considered part of 'Testing' and not a team discipline.
Test automation given to junior team members
Missing members from specification workshops
The spec is not reviewed
Test automation is always finished last in story/sprint.
Tests and code for automation are sometimes considered less important than production code.
But tests must be maintained and kept in sync with the code and serve as live documentation.
Acceptance tests are the specification of a system; they are crucial to the success of a project.
If business analysts only deliver the specifications, developers only build the system, and testers only ensure quality, then automated acceptance tests fall between responsibilities.
Introducing AAT to an organisation requires significant change to technical practices and collaboration within teams.
These changes cannot be achieved without the support, understanding and active participation of management.
Build the wrong thing
Test the wrong things
Dead-end development
Unaware when code is finished.
Business specifications and examples should be perfectly understandable by users of the system.
Specifications define WHAT a feature does, and NOT HOW it does it.
Teams are free to implement the best solution they can devise.
Action-oriented acceptance tests
Lots of workflows
Duplication in tests
Very technical examples
Hard to understand examples
A great benefit of agile acceptance testing is the gradual building of a human readable specification.
This specification is automated so you can be confident it describes the function of the code at all times.
Correct, lightweight, easily understandable and accessible documentation is crucial for future change.
Very technical tests
Hard to understand tests
Tests that are poorly organised
Specs irrelevant to business
Poor review of specs by business leads
No ownership of broken builds
AAT becomes technician's plaything, not team practice.
A large number of similar cases
Tedious spec workshops
Acceptance tests are a specification of how the system behaves and contain examples representing entire sets of test cases.
It is tempting to add more and more examples, just to be sure everything is covered for regression, but this is a mistake.
By trying to cover all possible edge cases you will make specs hard to understand and may even introduce an effect similar to paralysis by analysis.
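One way to avoid this is to keep a single key example per equivalence class (boundaries plus a typical case) rather than enumerating inputs. The discount rule and threshold below are invented for this sketch:

```python
# Keep a handful of key examples that each represent a class of
# cases, instead of hundreds of near-duplicate rows. The rule and
# its threshold are invented for illustration.

def bulk_discount(quantity):
    """Toy rule: 10% discount for orders of 100 or more."""
    return 0.10 if quantity >= 100 else 0.0

# One example per equivalence class:
key_examples = {
    "typical small order":  (5,   0.0),
    "just below threshold": (99,  0.0),
    "at threshold":         (100, 0.10),
}

for name, (qty, expected) in key_examples.items():
    assert bulk_discount(qty) == expected, name
```

Exhaustive input coverage, if it is needed at all, belongs in lower-level regression tests, not in the readable specification.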
Ignoring areas that are hard to automate.
Specifications written for a tool not for humans.
Tools defining what can and cannot be automatically tested.
Tool becomes a scapegoat for failures of the process.
This makes the specification vague in parts that aren’t covered by a particular tool.
Loss of business interest in AAT.
Introducing acceptance testing can be challenging and often means changing the way teams are organised and approach their work.
It requires time and investment in skill-up, mastering tools, facilitating workshops and dealing with resistance to change.
Teams think they have failed and get disappointed early.
Resistance to change
Disillusionment and complaints
Agile acceptance testing is not just about Quality Assurance, but about specifying and agreeing on what gets implemented.
Benefits are not immediately apparent, even while team practices are being (dramatically) changed.
It is easy to create huge expectations which cannot be delivered.
But if expectations are too low, then the team may not commit the required time and effort.
Individual ownership of acceptance tests
AAT left until last in story/iteration
Poor quality acceptance tests
Finished is not "done"
Loss of interest in spec workshop
Loss of interest in build failures
Abandonment of test suites
Slow development of tests
Code standards not applied to test code
Re-factor stage not applied to test code
Test code not peer reviewed
High-maintenance acceptance test code
Test failures difficult to diagnose
No time given for exploring required tools
AAT resources not forthcoming
Gradual return to pre-AAT practices
Pressure to reduce quality, not scope
AAT introduced as a new development tool not a process
Treat test code as production code
Solve test failures immediately
Demand business involvement in specification workshops and spec review
Keep specifications at business intent level.
Ensure that specs can be used as live documentation.
Reorganise specs when inevitable change happens.
Include business in creating and reviewing specs.
Follow good agile practices
Teams should focus on the business specifications and then choose the right tool for the job.
Tools should not play an important part in workshops.
Prove success with statistics
Do not underestimate the effort of writing good business-level test automation.
Automated testing doesn’t come for free.
Acceptance tests cannot provide a full regression suite. Additional tests are always required.
If needed, regression tests can be written and automated using the same tools as acceptance tests.
Communicate business intent
Keep asking "What are we building?"
Distil the specification, until it cannot be simplified further.
Box of solutions
Automated acceptance testing has arisen out of clever tools created by developers.
There can be a tendency to equate the AAT practice with the use of the tool.
But AAT is a business practice designed to get the team talking and to prove the delivery against business intent.
Pressure escalates to reduce time spent on AAT.
Build quality is not maintained.
Our Experience with a famous bank:
Approach Zero defects
Greater business involvement
Found critical defects close to release
the point at which you find the bug
Spec by example
It does not come easily, BUT...
if you want reduced risk of production defects, a talking team, increased confidence in the work, and improved business involvement
Examples are easy to understand
First, a warning: Spec by Example is not just a few new tools and an excuse to muck about with WebDriver and other cool stuff.
It's a change to the way teams work...
I can specify in minute technical detail how I want my bonce to look.
Or... I can give them a suitable example
I want to look like this...
Key things to notice about the demo:
SBE is not for testing implementation directly, it tests business intentions...
Specs are written for the business. (users)
So that we build the right thing, and the business knows the right thing has been built.
The Spec is primarily a business communication device.
Good automated tests can be reused lower down the test pyramid: edge-case tests, scripts, performance testing, data injection, etc.
Good layering of test code (reuse)
Automated specs follow code through to deployment
As part of a testing strategy
Translate into code (fixture)
Make the app do what you want (driver)
application built for testing
Business intentions (acceptance)
End to end
Fulfilled through good layering
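The layering above (acceptance on top, fixture translating business language, driver operating the application) can be sketched with a toy in-memory app; every class and method name here is invented for illustration:

```python
# Sketch of layered test code. The acceptance layer speaks business
# intent; the fixture translates it; the driver knows HOW to operate
# the application. All names are invented for this sketch.

class App:
    """The system under test: a toy in-memory account store."""
    def __init__(self):
        self.accounts = {}
    def deposit(self, name, amount):
        self.accounts[name] = self.accounts.get(name, 0) + amount

class Driver:
    """Knows how to make the app do what you want."""
    def __init__(self, app):
        self.app = app
    def make_deposit(self, name, amount):
        self.app.deposit(name, amount)
    def balance(self, name):
        return self.app.accounts.get(name, 0)

class AccountFixture:
    """Translates business language into driver calls."""
    def __init__(self, driver):
        self.driver = driver
    def customer_deposits(self, name, amount):
        self.driver.make_deposit(name, amount)
    def balance_should_be(self, name, expected):
        assert self.driver.balance(name) == expected

# Acceptance layer: reads as business intent, end to end.
fixture = AccountFixture(Driver(App()))
fixture.customer_deposits("alice", 50)
fixture.balance_should_be("alice", 50)
```

Because only the driver touches the application, UI or API changes are absorbed in one layer, and the fixture and driver can be reused lower down the test pyramid.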
10 Barriers teams face
Prove to stakeholders that we built what we said we would build, in the stakeholders' language.