Lessons Learned from Software Testing at Startups

by Michael Kelly, 8 March 2013


Transcript of Lessons Learned from Software Testing at Startups

A Story of Two Startups

Startup A: technical debt, unhappy customers, the illusion of progress
Startup B: linear velocity, validated learning, happy users

The Life of a Startup

Seed stage: validated learning; make the demo work; finding revenue and meeting funding milestones
Growth stage: winning, delivering on, and keeping early clients; operationalization and maintainability
Established: don't break old features when you create new features; preserve the power of small batches

Inside a "winning" technology startup

testing as an accelerator to the process
testing to provide visibility into the world as it really is, not how we want it to be
testing provides the capability to answer business questions quickly
testing to identify and track technical debt

What else are they doing?

agile development
working in the cloud
rapid development tools/technologies
simple and understandable business models
customer development (Steven Blank)
lean and small batches (Eric Ries)

Four areas of focus for testers: Product and Platform Scalability, Customer Experience and Validated Learning, Operations and Maintenance, and Compliance and Regulation.

Product and Platform Scalability

Technical-facing scalability of the product and the platform
- How hard is it to add or support more features?
- How hard is it to add or support more technologies?
- How hard is it to add or support more developers?

Business-facing scalability of the product and the platform
- How hard is it to add or support more users?
- How hard is it to add or support more customers?
- How hard is it to add or support more content?

Managing application performance in an iterative manner

What are testers doing in these companies?

Customer Experience and Validated Learning

Identify core users/use cases
Generate a list of key assumptions being made about those users or their interactions with the product
Force-rank those assumptions based on risk
Select the highest-ranked assumption and test it
Review the test results
Re-assess the assumption backlog
Rinse and repeat
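A minimal sketch of that loop in Python, assuming a simple impact x likelihood risk score (the Assumption class, the example scores, and the run_experiment stub are illustrative, not from the talk):

    from dataclasses import dataclass

    @dataclass
    class Assumption:
        statement: str   # e.g. "Users will pay after a 14-day trial"
        impact: int      # 1-5: cost of being wrong
        likelihood: int  # 1-5: chance the assumption is wrong
        validated: bool = False

        @property
        def risk(self) -> int:
            # Force-ranking heuristic: impact times likelihood of being wrong.
            return self.impact * self.likelihood

    def run_experiment(assumption: Assumption) -> bool:
        """Stub: replace with a real experiment (fake door, split test, interview...)."""
        print(f"Testing: {assumption.statement} (risk={assumption.risk})")
        return True  # pretend the assumption held

    backlog = [
        Assumption("Users will pay after a 14-day trial", impact=5, likelihood=3),
        Assumption("Users prefer email signup to OAuth", impact=2, likelihood=4),
    ]

    while backlog:
        backlog.sort(key=lambda a: a.risk, reverse=True)  # re-assess and re-rank every pass
        top = backlog.pop(0)                              # highest-risk assumption first
        top.validated = run_experiment(top)               # review results, rinse and repeat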
Experiment methods for testing those assumptions:

Application Data: Pull data from the application database
Concierge: Functionality faked by a person (the user might know); high-interaction, opinion-based, qualitative content
Dummy Site: HTML-only version to simulate workflows and interfaces, or a clickable PDF
Event Tracking: Application events tracked with a tool (e.g. Mixpanel)
Fake Doors: Links to nowhere used to track interest behaviors
Inline Feedback: User feedback tools within the application, usually at the location of the item of interest
Interviews: Live, interactive surveys and discussion
Mechanical Turk: Functionality faked by a person (the user may or may not know)
Pinocchio: A physical stand-in for the product (e.g. a block-of-wood Palm Pilot)
Provincials: Limited rollout of a feature to a subset of users
Split Testing: Expose users to different interfaces to track usage patterns/preferences (see the sketch after this list)
Surveys: Non-interactive opinion gathering, often with a form-based tool
User Testing: Alpha, private beta, beta, and UAT
Video Demo: Faked functionality recorded as a video, used to solicit interview/survey feedback
Web Data: Pull data from web server logs
Wireframes: Drawings used to model layouts and workflows

- List created by Rick Grey, rickgrey@gmail.com (techdarkside.com)
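As one concrete example, the Split Testing entry above needs stable variant assignment so the same user always sees the same interface. A minimal sketch (the function name and experiment id are illustrative):

    import hashlib

    def variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
        # Hash the user id together with the experiment name so a user can
        # land in different buckets for different experiments.
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return variants[int(digest, 16) % len(variants)]

    print(variant("user-42", "signup-flow"))  # deterministic: stable across calls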
"Most testers ignore (or loosely pay attention to) testing focused on controlling operations costs. Founders think this testing activity is important."
- 2010 Online Survey of Startup Founders

Operations and Maintenance

Assisting with monitoring and alerting
Assisting with setup for debugging

Testing for:
- compatibility
- data migration
- upgrade and rollback
- regression issues

Make sure "done" includes performance:
- Diagrams updated
- Monitors implemented (or stories added)
- Performance tests completed (or stories added)
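A minimal sketch of what "performance tests completed" can mean at this stage: a latency budget check small enough to run on every build. The URL and budget below are placeholders, not from the talk:

    import time
    import urllib.request

    BUDGET_SECONDS = 0.5                  # assumed budget; tune per endpoint
    URL = "http://localhost:8000/health"  # hypothetical endpoint under test

    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=5) as resp:
        resp.read()
    elapsed = time.perf_counter() - start

    assert elapsed <= BUDGET_SECONDS, f"{URL} took {elapsed:.3f}s (budget {BUDGET_SECONDS}s)"
    print(f"OK: {elapsed:.3f}s")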
Compliance and Regulation

Subject matter expertise with regard to key compliance concerns:
- third party terms and conditions
- regulatory controls
- technology standards

Running quick tests using automated tools that check basic standards compliance
Periodically listing out "the bad stuff" and performing quick tests or audits for those items
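One possible shape for those quick automated checks, sketched with the standard library only; the URL and the specific checks are illustrative (real audits would lean on dedicated validators and scanners):

    import urllib.request

    URL = "http://localhost:8000/"  # hypothetical page under test

    with urllib.request.urlopen(URL, timeout=5) as resp:
        headers = resp.headers
        body = resp.read().decode("utf-8", errors="replace").lower()

    problems = []
    if "<!doctype html>" not in body:
        problems.append("missing HTML5 doctype")
    if "<html lang=" not in body:
        problems.append("missing lang attribute (basic accessibility)")
    if headers.get("X-Content-Type-Options") != "nosniff":
        problems.append("missing X-Content-Type-Options header")

    print("problems:", problems or "none found")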
"Founders largely don't care about testing to ensure that they are compliant with regulations. Nearly half of testers feel this testing is critical."
- 2010 Online Survey of Startup Founders

The Dark Side

Tester as Facilitator

Using testing to establish credibility
Pair testing/programming
Supplementing existing unit tests
Helping write new unit tests
Writing simple but high-impact UI automation (see the sketch after this list)
Removing automation roadblocks
Teaching others
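A minimal sketch of "simple but high-impact UI automation": a login smoke test with Selenium. The URL, field names, and post-login title are assumptions for illustration:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("http://localhost:8000/login")  # hypothetical login page
        driver.find_element(By.NAME, "email").send_keys("test@example.com")
        driver.find_element(By.NAME, "password").send_keys("secret")
        driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()
        assert "Dashboard" in driver.title  # assumed post-login page title
    finally:
        driver.quit()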
Questions? Thank you.

DeveloperTown.com
MichaelDKelly.com
@michael_d_kelly

Michael Kelly, DeveloperTown
@esconfs #esconfs www.eurostarconferences.com

Managing Partner, DeveloperTown
www.DeveloperTown.com

Past President, Association for Software Testing
www.AssociationForSoftwareTesting.org

Articles and Blogs
www.MichaelDKelly.com

Summary of Lessons

Validated learning is testing. Testers should excel at designing and executing experiments.
There are a vast number of methods to test customer experience - use them all.
Testers should be tech-debt sonar.
The tester should be the first derivative of a product owner. Testers can play a critical role in identifying product and process gaps, even if all the stories are "done."
Testing can be a subversive activity. The tester doesn't have to do all the testing; they can facilitate it.