Usability testing in Scrum environments

a story about usability testing in Scrum environments

by Matthew Hodgson, 11 April 2013
Scrum Australia, Sydney, April 10-11 2013 (timebox: 1 hour, 10.30am - 11.30am)

Transcript of Usability testing in Scrum environments

usability. better.

Matthew Hodgson
CIO. Agile Coach. UX Strategist.
matthew.hodgson@zenexmachina.com

www.zenexmachina.com

www.facebook.com/zenexmachina

@magia3e

The Team
comprised solely of developers
varying levels of skills - junior to advanced
used to testing their own work
some with architecture experience
a few with an understanding of UX
a little understanding of Scrum

The Environment
Traditional Waterfall SDLC
Microsoft Dynamics CRM
18 months of analysis without delivery
Business "designing" the solution
First project to adopt Scrum
BAU team slow to move to support the project
No integration testing until the end of the project
'Lead Developer' doing up-front design and promising 'features' to the business without notifying the Product Owner

The Product Owner
"But I don't have the budget for another resource"
"But we'll build less if we have a UX person instead of another developer"
"It's just a COTS solution, the UX can't be changed"

Sprint 1
Come Demo time: lack of consistency of UI between feature sets
Rationale: different team members working on different features; "it's just a skinny solution"

Sprint 2
Observations: BAU team attempting to line up 9 projects for delivery at the end of the year
Still no integration testing environment
Team working well with Scrum
Come Demo time: forms layout issues
Rationale: "it's just a skinny solution"

Sprint 3
Observations: team norming well
Team showing "unconscious competence"
Minimal Viable Product vs. Minimal Usable Product
Come Demo time: inconsistent naming conventions
No calls to action
Rationale: "we can fix it later"

Usability as Definition of Done?
"In accordance with usability standards": ISO 9241-210, Ergonomics of human-system interaction, including guidance on human-centred design throughout the lifecycle of interactive systems.

The Product Owner and Team didn't:
Understand it
Know what it meant
Know how to apply it
Know what to look for in the end product

Problems with the Definition of Done
Understood usability as a concept, but not how to turn it into practical rules for developers.
Team's interpretation: minimal product to get it over the line.
Complication: no cycle of collecting information from real users to iterate features.
Major factor: the Waterfall environment.

Time for usability testing!
Wrote scenarios with business users based on their business processes
Identified which User Stories were involved (traceability)
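
As a hedged illustration of that traceability (the scenario names and story IDs below are invented, not from the talk), a simple map from each test scenario to the User Stories it exercises lets defects found during testing land on the right backlog items:

```typescript
// Hypothetical traceability map: each usability-test scenario lists the
// User Stories it exercises, so defects found in a scenario can be
// recorded against the right Product Backlog items.
const scenarioTraceability: Record<string, string[]> = {
  "Register a new investigation": ["US-101", "US-104"],
  "Assign an investigator": ["US-104", "US-112"],
  "Submit an application for review": ["US-120", "US-121"],
};

// Invert the map: every scenario that exercises a given User Story.
function scenariosForStory(storyId: string): string[] {
  return Object.entries(scenarioTraceability)
    .filter(([, stories]) => stories.includes(storyId))
    .map(([scenario]) => scenario);
}

console.log(scenariosForStory("US-104"));
// -> ["Register a new investigation", "Assign an investigator"]
```
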
Preparing for usability testing in Sprint 7:
Set up environments for users
Set up pre-conditions for scenarios in SIT (system integration testing)
Some consternation amongst the team that this took them away from development work

Test strategy
Techniques:
Cognitive walk-through
Expert review
Heuristics

Testing with real users
Developers set up the environment in SIT
Actual users participated over 3 days
Gave brief instructions on what outcomes, not tasks, to accomplish
Encouraged self-talk
Recorded the sessions using a Nikon D7000
20-minute timebox per user for recording
20-minute timebox for the cognitive walkthrough
Importantly: done at their own desks

After each test, a cognitive walkthrough:
Played back the video
Asked them to give a commentary on what they were doing and what they were trying to achieve
Talked over where they struggled
Asked what they were expecting to happen at the difficult points

Why in vivo and not in a lab?
To see how real-world distractions impacted users' ability to use the system.
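
A sketch of the session protocol above captured as data (the field names are my own, not the team's):

```typescript
// One test session per user, per the protocol above: outcome-oriented
// instructions (not task steps), self-talk encouraged, a 20-minute
// recording timebox, then a 20-minute walkthrough - at the user's own desk.
interface TestSession {
  participant: string;
  location: "user's own desk";     // in vivo, not a lab
  outcomesToAccomplish: string[];  // outcomes, not step-by-step tasks
  encourageSelfTalk: boolean;
  recordingTimeboxMinutes: number;
  walkthroughTimeboxMinutes: number;
}

const session: TestSession = {
  participant: "User A",
  location: "user's own desk",
  outcomesToAccomplish: ["Get a new application ready for review"],
  encourageSelfTalk: true,
  recordingTimeboxMinutes: 20,
  walkthroughTimeboxMinutes: 20,
};
```
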
Sprint 5 - Set up testing in the integration testing environment
Sprint 6 - Do the testing
Sprint 7 - Team implements usability fixes
Sprint 8 - UAT

Heuristics
Visibility of system status.
Match between system and the real world: cognitive and behavioural metaphors.
User control and freedom.
Consistency and standards.
Error prevention.
Recognition over recall.
Flexibility and efficiency of use.
Aesthetic and minimalist design.
Assistance in recognition, diagnosis, and recovery from errors.
Visual flow, visual hierarchy and chunking.

Expert review
Visual flow
Visual hierarchy
Visual contrast
Serial position effect
Limited choice
Chunking

Usability Testing in Traditional Environments
[Diagram: cost-of-change curve across Analysis > Design > Develop > Test - cheap to change early, expensive to change by Dev and UAT]
Fixing bugs is more important; usability issues are a "nice to have".

How do you adapt usability testing to work better in Scrum?
Especially in a Waterfall enterprise environment (Scrum + PRINCE2)?
[Diagram: competence ladder - Conscious Incompetence > Conscious Competence > Unconscious Competence]

Conclusions
Usability testing issues: integration testing was not available until about Sprint 7, the earliest point at which usability testing could be done.
[Diagram: sprint flow - Setting up for testing > Heuristics > Expert review > Usability testing preparation > Usability testing > Fix usability issues > UAT > Fix UAT issues & deploy]

Visibility of system status
Visual cues tell people:
A system has reacted to a person's interaction
What the system is doing
Whether the system has finished that action

Finding: people initiated commands, but there was no sign that the system was carrying them out... people then kept pressing the button... multiple actions were then queued.
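
A minimal sketch of the fix this heuristic implies (my illustration, not the project's code): disable the control and surface status while a command runs, so extra presses can't queue duplicate actions. The saveApplication call is a hypothetical stand-in.

```typescript
// Guard a command button so repeated presses can't queue duplicate actions:
// disable it and surface the system's status while the work is in flight.
async function onSubmitClick(button: HTMLButtonElement, status: HTMLElement): Promise<void> {
  if (button.disabled) return;       // already running: ignore extra presses
  button.disabled = true;
  status.textContent = "Saving...";  // the system has reacted and says what it is doing
  try {
    await saveApplication();         // hypothetical async command
    status.textContent = "Saved. Next step: submit for review.";  // finished
  } catch {
    status.textContent = "Save failed. Please try again.";
  } finally {
    button.disabled = false;
  }
}

// Hypothetical stand-in for the real save call.
function saveApplication(): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, 500));
}
```
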
Match between system and the real world
Finding: the system "lacks an overall real-world model or metaphor on which to base its interaction design... the business area does not have a metaphor on which they base their processes... this issue is further complicated by the lack of a standardised lexicon to describe their processes."
Using an existing mental model means immediate recognition of how to interact with a system.

Error prevention
Good form field validation, however:
No messaging is given regarding the minimal set of data required to save and then submit an application for review.
In one instance, a user can complete a field and then, on executing a "Save", have it overwritten by templated text without being alerted.
No messaging is given regarding the need to save a document before some actions can be initiated from the ribbon menu.
No messaging is given to indicate that an action succeeded, or to suggest the next likely step given the workflow of an investigation (or its constituent components).
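
For the first gap above - no messaging about the minimal data needed to save - a hedged sketch of save-time validation that names exactly what is still missing (the field names are invented):

```typescript
// Tell users up front which of the minimal save fields are still missing,
// instead of letting a "Save" fail silently. Field names are illustrative.
const requiredToSave = ["applicantName", "caseType", "receivedDate"] as const;

function missingForSave(form: Record<string, string>): string[] {
  return requiredToSave.filter((field) => !form[field]?.trim());
}

const form = { applicantName: "J. Smith", caseType: "", receivedDate: "" };
const missing = missingForSave(form);
if (missing.length > 0) {
  console.log(`Cannot save yet - please complete: ${missing.join(", ")}`);
  // -> "Cannot save yet - please complete: caseType, receivedDate"
}
```
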
Error messages require:
A description of what happened
What the user should do next

Example:
The interface license cannot be accessed

Suggested solutions:
1. Verify the license manager is running
2. Ensure connectivity with the license manager
3. Verify the license hasn't expired

Click here to view the FAQs...
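
The license-manager example above already has the required anatomy. Purely as an illustration of that structure (not an API from the project), the same shape made reusable:

```typescript
// The anatomy the slide calls for: what happened, plus what to do next.
interface UserFacingError {
  whatHappened: string;
  suggestedSolutions: string[];
  helpLink?: string;
}

const licenseError: UserFacingError = {
  whatHappened: "The interface license cannot be accessed",
  suggestedSolutions: [
    "Verify the license manager is running",
    "Ensure connectivity with the license manager",
    "Verify the license hasn't expired",
  ],
  helpLink: "View the FAQs",
};

function renderError(e: UserFacingError): string {
  const steps = e.suggestedSolutions.map((s, i) => `${i + 1}. ${s}`).join("\n");
  return `${e.whatHappened}\n\nSuggested solutions:\n${steps}\n\n${e.helpLink ?? ""}`;
}

console.log(renderError(licenseError));
```
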
Recognition rather than recall
Finding: users are "required to be highly cognisant of their business processes in order to interact with the system... this includes the language they currently use to describe their processes to initiate workflow..."
Minimise memory load: people shouldn't have to remember information from one part of the system to use in another.

Aesthetic & minimalist design
Finding: the "layout of the form data labels and fields results in a highly visually complex interaction design..."

"...icons are not generally representative of the functionality provided and should act more clearly as wayfinding signals."

Dialogues should not contain irrelevant or rarely used information or visual stimuli.
Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

[Expert review themes: recognition over recall, visual complexity, visual contrast, visual flow, visual hierarchy - defined below]

Wrapping up Sprint 6
Note usability defects against all relevant User Stories
Create new Product Backlog items
Brief the Product Owner
Help prepare for Grooming for Sprint 7
Get it all done and estimated prior to Planning for Sprint 7

Sprint Planning
Business Owner:
Had been involved in usability testing
Learned for himself the issues
Understood what they were and why they were important
He wanted them done
Ranked them with the Product Owner

The Team estimated the complexity of the usability issues
Total points equated to approximately 2 Sprints' worth of work

"Formal" UAT was:
A non-event
No surprises from users' perspective
A very 'usable' system

The Team continued to:
Work on usability issues
Tweak & simplify functionality
Finalise defects

Users had already:
Seen the system
Played with the system
Identified problem areas in usability testing
Given feedback
Seen that issues were being addressed

Design principles referenced
Proximity & uniform connectedness: things that are close to one another are perceived to be more related than things that are spaced farther apart.
Visual contrast: the order in which the human eye perceives what it sees is created by the visual contrast between forms in a field of perception; objects with the highest contrast to their surroundings are processed first by the human mind.
Visual flow: the path the human eye moves, or is led, through a composition based on visual hierarchy and contrast.
Visual complexity: the brain processes everything it sees; the more complex a visual image, the longer it takes to process and the longer it takes for a person to make a conscious decision to act.
Recognition over recall: it's easier for the brain to recognise things we have previously experienced than it is to recall them from memory.

Considerations:
Usability testing highlighted re-work that could have been avoided
UX capability in the team
Increase UX skills of the team
Develop explicit usability rules (Definition of Done)

Impact of MVP - addressing waste next time around?
Minimal Viable Product:
Made development effort easier
Would have resulted in ongoing co$t for training > waste

Future considerations
Teams should own the usability of their design decisions, not an 'outsider'
Explicit DoD where the team is homogeneous in capability (e.g. developers without UX skills)

Conflict & resistance
Observations:
Developers were somewhat resistant to usability improvements
Reinforcement of team's Minimal Viable Product philosophy
Core values of development reinforced over others' opinions

Groupthink
A psychological phenomenon
Occurs within a group of people
Desire for harmony or conformity results in an incorrect or deviant decision-making outcome
Group members try to minimise conflict
Blame the messenger

Observations:
Very few issues
Users happy with system
No need for a manual

Issues:
Some hostility against usability findings
Inflated complexity estimates for usability work
Reflected groupthink over a performing team

Issues emerging out of the Retrospective:
Team felt usability rushed
Not enough time
Shouldn't do it in SIT
Reputation issues with bugs

System integration testing - overall impressions:
Quite easy to use
After about 10-20 minutes users understood how to get around the system
Needed to avoid recognition-over-recall issues to remove the necessity to train people

Aesthetic-usability effect
Aesthetically pleasing designs are often perceived as being easier to use.

How many users to test?
4-6 users "provides the greatest insight into usability issues and will account for approximately 70% of a product's serious usability problems" - Nielsen Norman Group
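
That figure traces back to the Nielsen-Landauer problem-discovery model. As a rough, hedged illustration (assuming the commonly cited average detection rate of about 31% per user, which in practice varies by product and task), the expected proportion of problems found by n users is 1 - (1 - 0.31)^n:

```typescript
// Nielsen-Landauer problem-discovery model: expected proportion of
// usability problems found by n test users, assuming each user
// independently uncovers a fraction L of them (L ~ 0.31 is the commonly
// cited average; the real rate varies by product and task).
function problemsFound(n: number, L = 0.31): number {
  return 1 - Math.pow(1 - L, n);
}

for (const n of [1, 3, 5, 6]) {
  console.log(`${n} users: ~${Math.round(problemsFound(n) * 100)}% of problems`);
}
// 1 user: ~31%, 3 users: ~67%, 5 users: ~84%, 6 users: ~89%
```
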
Scenarios were used only to:
Increase understanding of flow within the system amongst testers
Set up environment pre-conditions within the system
They were not used as instructions for users.

Testing was done at users' own desks, not in a big lab.

Immediate action
Discovered usability issues early enough to do something about them.
Broke down complex UI issues that affected multiple pages, focusing instead on the pages used most frequently.
Used usability findings to write a new overarching Definition of Done for future Product Backlog items.

Implications for MVP
MVP had previously been used as a "skinny solution first" pattern
It became a crutch: minimal won out over usable
'Usable' came to be interpreted only as a subjective aesthetic-usability effect
Where feedback on the MVP wasn't possible, design debt was established and grew

Big wins before system launch
Usability testing had decreased acceptance risk.

Usability testing recycled
Scenarios refined and re-used to assess end-to-end process support
Areas had already been seen by users
Formal acceptance easier to obtain from executives

Definition of Done (and usable) - now:
Form labels above their fields
Form elements aligned to the left
When a form region is not editable, it appears as plain text rather than as an explicitly non-interactive area
Only pairwise elements appear side-by-side, e.g. start date & end date
Only headings are in bold
Tool tips and related information appear beside their parent element (not off to the side)
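
One way to keep rules like these actionable (a sketch of my own, not an artifact from the talk) is to encode the Definition of Done as a checklist that every Product Backlog item is reviewed against:

```typescript
// The form-layout Definition of Done above, encoded as a review checklist
// so every Product Backlog item can be checked the same way.
const usabilityDoD: string[] = [
  "Form labels sit above their fields",
  "Form elements are aligned to the left",
  "Non-editable regions render as plain text",
  "Only pairwise elements (e.g. start/end date) sit side-by-side",
  "Only headings are bold",
  "Tool tips appear beside their parent element",
];

function reviewItem(passed: boolean[]): string {
  const failures = usabilityDoD.filter((_, i) => !passed[i]);
  return failures.length === 0
    ? "Done (usable)"
    : `Not done - failed: ${failures.join("; ")}`;
}

console.log(reviewItem([true, true, true, true, false, true]));
// -> "Not done - failed: Only headings are bold"
```
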
Patterns
Focus on reducing system training debt over making something purely 'functional' but not 'usable'.

Pros & cons
Testing in SIT wasn't ideal, but worked very well
Did it before we ran out of actual Sprints in which to make usability improvements
Made UAT easy
Made deployment and transition to BAU easier
The idea of Minimal Viable Product conflicted with the usable-product philosophy and produced waste & rework

Usability testing from Waterfall to Scrum
Before: "According to usability standards" - Analysis > Design > Develop > Test; wasted effort; features over usability.
After: [Timeline: Sprint 5 Prepare > Sprint 6 Test usability > Sprint 7 Implement usability improvements > Sprint 9]

FIN