What are testers doing in these companies?

  • Identify core users/use cases
  • Generate a list of key assumptions being made around those users or their interactions with the product
  • Force rank those assumptions based on risk
  • Select the highest ranking assumption and test it
  • Review test results
  • Re-assess the assumption backlog
  • Rinse and Repeat
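
As a minimal sketch of the loop above, assuming a simple risk score of likelihood times impact (the assumptions, fields, and scoring below are illustrative placeholders, not from the talk):

    # Force-rank an assumption backlog by risk (likelihood x impact).
    # The assumptions and scores are illustrative placeholders.
    assumptions = [
        {"assumption": "Users will sign up with a work email", "likelihood": 2, "impact": 5},
        {"assumption": "Customers will pay monthly rather than annually", "likelihood": 4, "impact": 4},
        {"assumption": "The import wizard handles files over 100 MB", "likelihood": 3, "impact": 3},
    ]

    for item in assumptions:
        item["risk"] = item["likelihood"] * item["impact"]

    # Test the highest-risk assumption first; results feed back into the backlog.
    backlog = sorted(assumptions, key=lambda item: item["risk"], reverse=True)
    next_to_test = backlog[0]
    print(f"Test next: {next_to_test['assumption']} (risk={next_to_test['risk']})")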

  • Technical-facing scalability of the product and the platform
    - How hard is it to add or support more features?
    - How hard is it to add or support more technologies?
    - How hard is it to add or support more developers?
  • Business-facing scalability of the product and the platform
    - How hard is it to add or support more users?
    - How hard is it to add or support more customers?
    - How hard is it to add or support more content?
  • Managing application performance in an iterative manner

Make sure "done" includes performance:

  • Diagrams updated
  • Monitors implemented (or stories added)
  • Performance tests completed (or stories added)
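
A minimal sketch of the kind of lightweight check that could back the "performance tests completed" item, assuming the requests library; the endpoint and response-time budget are hypothetical placeholders:

    # Lightweight response-time check for a definition-of-done gate.
    # URL and threshold are hypothetical; real monitors and tests would replace them.
    import time
    import requests

    URL = "https://example.com/api/health"   # placeholder endpoint
    THRESHOLD_SECONDS = 0.5                  # placeholder performance budget

    start = time.monotonic()
    response = requests.get(URL, timeout=5)
    elapsed = time.monotonic() - start

    assert response.status_code == 200, f"Unexpected status: {response.status_code}"
    assert elapsed < THRESHOLD_SECONDS, f"Too slow: {elapsed:.3f}s (budget {THRESHOLD_SECONDS}s)"
    print(f"OK: {elapsed:.3f}s")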

Most testers ignore (or loosely pay attention to) testing focused on controlling operations costs. Founders think this testing activity is important.

- 2010 Online Survey of Startup Founders

Founders largely don’t care about testing to ensure that they are compliant with regulations. Nearly half of testers feel this testing is critical.

- 2010 Online Survey of Startup Founders

The Dark Side

  • Validated learning is testing. Testers should excel at designing and executing experiments.
  • There are a vast number of methods to test customer experience - use them all.
  • Testers should be tech-debt sonar.
  • The tester should be the first derivative of a product owner. Testers can play a critical role in identifying product and process gaps, even if all the stories are "done."
  • Testing can be a subversive activity. The tester doesn't have to do all the testing; they can facilitate it.
  • Using testing to establish credibility:
    - Pair testing/programming
    - Supplementing existing unit tests
    - Helping write new unit tests
    - Writing simple, but high-impact UI automation (see the sketch after this list)
    - Removing automation roadblocks
    - Teaching others
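
As one example of "simple, but high-impact UI automation", a minimal Selenium sketch that smoke-tests a login page; the URL, element IDs, and credentials are illustrative placeholders, not from the talk:

    # Minimal UI smoke test: load the login page, sign in, confirm the dashboard appears.
    # URL, element IDs, and credentials are illustrative placeholders.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")
        driver.find_element(By.ID, "email").send_keys("test@example.com")
        driver.find_element(By.ID, "password").send_keys("not-a-real-password")
        driver.find_element(By.ID, "submit").click()
        assert "Dashboard" in driver.title, "Login smoke test failed"
    finally:
        driver.quit()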

Customer Experience and Validated Learning

(techdarkside.com)

This One Not This One

Tester as Facilitator

Product and Platform Scalability

Operations and Maintenance

  • Application Data: Pull data from the application database
  • Concierge: Functionality faked by a person (the user might know), high interaction/opinion/qualitative content
  • Dummy Site: HTML-only version to simulate workflows and interfaces, or a clickable PDF
  • Event Tracking: Application events tracked with a tool (e.g. MixPanel)
  • Fake Doors: Links to nowhere used to track interest behaviors
  • Inline Feedback: User feedback tools within the application, usually at the location of the item of interest
  • Interviews: Live, interactive surveys and discussion
  • Mechanical Turk: Functionality faked by a person (the user may or may not know)
  • Pinocchio: A physical stand-in for the product (e.g. the block-of-wood Palm Pilot)
  • Provincials: Limited rollout of a feature to a subset of users
  • Split Testing: Expose users to different interfaces to track usage patterns/preferences (see the sketch after this list)
  • Surveys: Non-interactive opinion gathering, often with a form-based tool
  • User Testing: Alpha, Private Beta, Beta, and UAT
  • Video Demo: A recorded video of faked functionality, used to solicit interview/survey feedback
  • Web Data: Pull data from web server logs
  • Wireframes: Drawings used to model layouts and workflows

- List created by Rick Grey, rickgrey@gmail.com
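
For the Split Testing, Fake Doors, and Event Tracking items above, a minimal sketch of deterministic variant bucketing with simple event logging; the experiment, variant names, and in-memory event log are illustrative placeholders, and a real product would send events to a dedicated tool such as MixPanel instead:

    # Deterministically bucket users into split-test variants and log the events
    # needed to compare usage patterns. Names and the in-memory "event log" are placeholders.
    import hashlib

    VARIANTS = ["old_checkout", "new_checkout"]
    events = []  # stand-in for an analytics tool

    def variant_for(user_id: str, experiment: str) -> str:
        """Same user always lands in the same bucket for a given experiment."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return VARIANTS[int(digest, 16) % len(VARIANTS)]

    def track(user_id: str, event: str, **properties):
        events.append({"user": user_id, "event": event, **properties})

    user = "user-42"
    bucket = variant_for(user, "checkout-redesign")
    track(user, "checkout_viewed", variant=bucket)
    # Later: compare conversion events per variant to see which interface wins.
    print(bucket, events)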

Compliance and Regulation

  • Assisting with monitoring and alerting
  • Assisting with setup for debugging
  • Testing for:
    - compatibility
    - data migration
    - upgrade and rollback
    - regression issues
  • Subject matter expertise with regard to key compliance concerns:
    - third-party terms and conditions
    - regulatory controls
    - technology standards
  • Running quick tests using automated tools that check basic standards compliance (see the sketch after this list)
  • Periodically listing out "the bad stuff" and performing quick tests or audits for those items
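
A minimal sketch of the "quick tests using automated tools" idea, here flagging responses that are missing basic security-related headers; the URL list and expected headers are illustrative and would need to reflect the actual compliance concerns:

    # Quick automated audit: flag responses missing basic security headers.
    # The URLs and the header list are illustrative placeholders.
    import requests

    URLS = ["https://example.com/", "https://example.com/login"]
    EXPECTED_HEADERS = ["Strict-Transport-Security", "X-Content-Type-Options", "Content-Security-Policy"]

    for url in URLS:
        response = requests.get(url, timeout=10)
        missing = [h for h in EXPECTED_HEADERS if h not in response.headers]
        status = "OK" if not missing else f"missing {', '.join(missing)}"
        print(f"{url}: {status}")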

Managing Partner, DeveloperTown

www.DeveloperTown.com

Past President, Association for Software Testing

www.AssociationForSoftwareTesting.org

Articles and Blogs

www.MichaelDKelly.com

DeveloperTown.com

MichaelDKelly.com

@michaeldkelly

Michael Kelly

Software Testing at Startups

A story of two startups

What else are they doing?

  • agile development
  • working in the cloud
  • rapid development tools/technologies
  • simple and understandable business models
  • customer development (Steven Blank)
  • lean and small batches (Eric Ries)

Seed Stage

validated learning

make the demo work

The Life of a Startup

Growth Stage

What else are they doing?

  • agile development
  • working in the cloud
  • rapid development tools/technologies
  • simple and understandable business models
  • customer development (Steven Blank)
  • lean and small batches (Eric Ries)

finding revenue and meeting funding milestones

winning, delivering on, and keeping early clients

  • testing as an accelerator to the process
  • testing to provide visibility into the world as it really is, not how we want it to be
  • testing to answer business questions quickly
  • testing to identify and track technical debt (see the sketch after this list)
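
One lightweight way a tester might make technical debt visible, as a hedged sketch: count debt markers in the codebase over time. The source directory and marker list are illustrative placeholders:

    # Rough technical-debt indicator: count TODO/FIXME/HACK markers in the codebase.
    # The source directory and marker list are illustrative placeholders.
    from pathlib import Path

    MARKERS = ("TODO", "FIXME", "HACK")
    SRC = Path("src")  # placeholder source root

    counts = {marker: 0 for marker in MARKERS}
    for path in SRC.rglob("*.py"):
        text = path.read_text(errors="ignore")
        for marker in MARKERS:
            counts[marker] += text.count(marker)

    print(counts)  # track this per sprint to see whether debt is growing or shrinking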

Inside a "winning" technology startup

Established

operationalization and maintainability

don't break old features when you create new features

preserve the power of small batches
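
The "don't break old features when you create new features" point above is essentially a call for regression coverage. A minimal pytest-style sketch, assuming the requests library; the endpoints and expectations are placeholders:

    # Regression smoke tests run on every change so new features don't break old ones.
    # The endpoints and expectations are illustrative placeholders; run with pytest.
    import requests

    BASE_URL = "https://example.com"

    def test_existing_signup_flow_still_loads():
        response = requests.get(f"{BASE_URL}/signup", timeout=10)
        assert response.status_code == 200

    def test_existing_export_endpoint_still_returns_csv():
        response = requests.get(f"{BASE_URL}/reports/export", timeout=10)
        assert response.headers.get("Content-Type", "").startswith("text/csv")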

Questions?

DeveloperTown.com

MichaelDKelly.com

@michael_d_kelly

Startup B

Startup A

Technical Debt

Linear Velocity

Validated Learning

Illusion of Progress

Happy Users

Unhappy Customers

Summary of Lessons:
