How to tell if you're dangerously popular (or 'Performance Testing Better')


Tim Perry

24 April 2013


Transcript of How to tell if you're dangerously popular (or 'Performance Testing Better')

How to tell if you're dangerously popular (or 'Performance Testing Better')

WATERFALL PERFORMANCE TESTING
  • Derive NFRs from customer
  • Implement
  • Manually test performance
  • Panic when it's too slow
  • Furiously fix performance issues
  • Release

THE ICEBERG OF CONTINUOUS PERFORMANCE TESTING
  • Define your initial project requirements
  • Plan performance tests for your story
  • Implement your story (and make them pass!)
  • Automate some performance tests
  • Monitor production performance
  • Deliver project to happy customer

Define your initial project requirements
  • Stay high-level: number of users on common paths, or equivalent
  • Aim for something that will easily inform later, specific story targets

Plan performance tests for your story
  • Define some specific test user paths
  • Identify target performance results (e.g. throughput, latency, or both)
  • Not performance tests first: as far as possible, functional tests should influence design, not performance tests (except performance fixes)

Automate some performance tests
  • Works best when done at many levels
  • Can repeat unit tests as microbenchmarks (e.g. Google Caliper), but carefully; this gives fine granularity of performance results
  • Can parallelise integration tests for high-level tests
  • Creates a performance regression safety net from the start of the project

Monitor performance test results
  • Add to your build pipeline, after your functional tests
  • Graph results somewhere visible to the team to help spot trends
  • Set error thresholds (in either your CI server or in the testing framework itself)
  • Time critical segments in your code and monitor the results
  • Time single runs of your performance tests against live too, to sanity-check

Monitor production performance
  • It isn't fully applicable to every project (e.g. http://status.github.com), but you can emulate it for cheap with good metrics and reporting in production (and you should!)

"Make it work. Make it right. Make it fast." - Kent Beck

Thanks for listening!

Further reading:
  • Incorporating Performance Testing in Test-Driven Development - IEEE Software, vol. 24, no. 3
  • https://github.com/carrotsearch/junit-benchmarks
  • https://github.com/jamesgpearce/confess/
  • https://www.destroyallsoftware.com/blog/2011/continuous-automated-performance-testing
  • http://databene.org/contiperf
  • http://www.infoq.com/articles/iterative-continuous
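The transcript's advice to reuse unit tests as microbenchmarks "but carefully" is about JVM pitfalls: without warm-up the JIT hasn't compiled the hot path, and unused results can be optimised away. Harnesses like Google Caliper or junit-benchmarks handle this properly; the sketch below only illustrates the idea, with a hypothetical `reverseString` workload standing in for the unit under test:

```java
public class MicroBenchmark {
    // Hypothetical workload; stands in for the unit being benchmarked.
    static String reverseString(String s) {
        return new StringBuilder(s).reverse().toString();
    }

    public static void main(String[] args) {
        String input = "dangerously popular";

        // Warm-up so the JIT compiles the hot path before measurement begins.
        for (int i = 0; i < 10_000; i++) reverseString(input);

        int iterations = 100_000;
        String last = null; // keep a result live so the loop isn't eliminated
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            last = reverseString(input);
        }
        long elapsed = System.nanoTime() - start;

        System.out.println("avg ns/op: " + (elapsed / iterations)
                + " (last result: " + last + ")");
    }
}
```

A real harness also runs multiple forked trials and reports variance, which a single timed loop like this cannot do; treat any one number it prints as a rough signal, not a benchmark result.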
Full transcript
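The "set error thresholds ... in the testing framework itself" and "time critical segments" points can be combined into a performance test that fails the build when a code path exceeds its budget. A minimal sketch, assuming a hypothetical 200 ms budget and a stand-in critical segment:

```java
public class PerformanceThreshold {
    // Times one run of a critical segment, in milliseconds.
    static long timeMillis(Runnable segment) {
        long start = System.nanoTime();
        segment.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long elapsed = timeMillis(() -> {
            // Stand-in for the critical code path being monitored.
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) sum += i;
        });

        long budgetMillis = 200; // hypothetical threshold; tune per test path
        if (elapsed > budgetMillis) {
            throw new AssertionError("Critical segment took " + elapsed
                    + "ms, budget was " + budgetMillis + "ms");
        }
        System.out.println("within budget: " + elapsed + "ms");
    }
}
```

Run in CI after the functional tests, a failure here is the "error threshold" trip-wire; the same timings can also be graphed over builds to spot trends, as the transcript suggests. Libraries mentioned in the further reading (junit-benchmarks, ContiPerf) provide annotation-driven versions of this pattern.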