It all started for me as an idea: to understand how performance testing fits into Agile. My quest for useful information kept leading me to the same place: get the client requirements, find the load pattern, simulate the load... the normal stuff. But my question of how to do performance testing effectively, earlier, and more often remained unanswered. After a few hours or days of brainstorming (I forget which :D) and posting on Twitter, here is the list of points I could come up with.
Some of them are ideas and some are things I have done before, like using functional tests. I would like to keep this as a living document, constantly updating it as new ideas come up. I would also encourage you to contribute to it. Ideas can be crazy but plausible :).
1) When writing some complicated functionality that may create performance or scalability problems, use something like jUnitPerf along with TDD to continuously test how it performs. The functionality, as I say, need not be end to end but just the unit that is complicated. This narrows the area of concentration and helps localize performance problems.
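The idea can be sketched in plain Python without jUnitPerf itself: a functional assertion and a performance assertion live side by side in the unit test. `merge_records` and the one-second budget are made-up placeholders for whatever complicated unit and budget your team agrees on.

```python
import time


def merge_records(a, b):
    # Hypothetical "complicated unit" under test.
    return sorted(a + b)


def assert_within_budget(func, budget_seconds, *args):
    """Poor man's jUnitPerf TimedTest: fail if the call exceeds its budget."""
    start = time.perf_counter()
    result = func(*args)
    elapsed = time.perf_counter() - start
    assert elapsed < budget_seconds, (
        f"{func.__name__} took {elapsed:.3f}s, budget was {budget_seconds}s")
    return result


# Functional assertion (normal TDD) plus a performance assertion, together.
assert assert_within_budget(merge_records, 1.0, [3, 1], [2]) == [1, 2, 3]
```

Because the check runs with every red-green-refactor cycle, a performance regression in this one unit shows up the moment it is introduced, not at the end of the release.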
2) Use profiling early to understand the application's performance and memory usage.
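In Python this is as cheap as wrapping the code you just wrote with the standard-library profiler; `build_report` below is a stand-in workload, not anything from a real app.

```python
import cProfile
import io
import pstats


def build_report(n):
    # Stand-in workload; profile whatever your iteration just produced.
    return sum(i * i for i in range(n))


profiler = cProfile.Profile()
profiler.enable()
build_report(200_000)
profiler.disable()

stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(5)  # top five functions by cumulative time
print(stream.getvalue())
```

Running something like this in each iteration means hotspots are discovered while the code is still fresh in the developer's head.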
3) Place some performance metrics in functional tests like a Selenium or Watir suite, and graph the test suite execution time to spot abnormalities early. (Thanks @jtf and @cgoldberg.) A sudden increase in execution time may indicate a problem.
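A minimal sketch of the "spot the sudden increase" part: feed in the suite duration recorded per CI build and flag any run that blows past the average of the earlier runs. The 1.5x threshold is an assumption you would tune for your suite's noise level.

```python
from statistics import mean


def flag_regressions(durations, threshold=1.5):
    """Return the indices of runs whose duration exceeds `threshold` times
    the mean of all earlier runs. `durations` holds the functional suite's
    execution time per CI build, in seconds."""
    flagged = []
    for i in range(1, len(durations)):
        baseline = mean(durations[:i])
        if durations[i] > threshold * baseline:
            flagged.append(i)
    return flagged


# Build 4's suite suddenly takes twice as long -> worth investigating.
history = [120.0, 118.0, 123.0, 119.0, 240.0]
print(flag_regressions(history))  # -> [4]
```

The same numbers you graph for humans can drive an automated alert, so nobody has to stare at the chart every day.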
4) Use YSlow, PageSpeed, and Firebug to identify client-side performance bottlenecks such as too many HTTP requests or missing browser-side caching. It is important to understand the bottlenecks across the different layers, and in a typical Agile iteration the app is built in thin vertical slices of functionality rather than horizontal ones. I really like Adam Goucher's idea of integrating YSlow reports into the Continuous Integration process to surface these bottlenecks. Check his post here: http://adam.goucher.ca/?p=1017.
5) When performance is one of the primary considerations, use a continuous performance benchmarking tool, perhaps wired into your CI build pipeline, to understand changes in performance over time. (Thanks @cgoldberg.)
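One way such a CI gate can work, as a sketch: keep the best known time per benchmark (say, in a JSON file next to the CI scripts) and fail the build when a run is more than some allowed factor slower. The `allowed_slowdown` of 1.2 and the sorting workload are illustrative assumptions.

```python
import time


def benchmark(func, repeats=5):
    """Best-of-N wall time; the minimum is less noisy than a single run."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        func()
        best = min(best, time.perf_counter() - start)
    return best


def check_regression(name, seconds, baseline, allowed_slowdown=1.2):
    """`baseline` maps benchmark name -> best known seconds. Returns True
    if this run is within budget, and records any new best time."""
    previous = baseline.get(name)
    if previous is not None and seconds > previous * allowed_slowdown:
        return False  # CI should fail the build here
    baseline[name] = min(seconds, previous) if previous is not None else seconds
    return True


baseline = {}
elapsed = benchmark(lambda: sorted(range(50_000), reverse=True))
print("PASS" if check_regression("sort_benchmark", elapsed, baseline) else "FAIL")
```

Persisting `baseline` between builds turns a one-off measurement into the over-time trend the tip is about.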
6) In an iteration, when developing a piece of functionality, use something like Pylot or JMeter to identify the bottlenecks in that functionality and clear them. Before release it is important to test the application under realistic conditions, so use something like BrowserMob or SOASTA to drive real browsers and performance test from the internet. (Thanks @bradjohnsonsv.) I have spoken about testing in the cloud at CloudCamp before :)
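The core of what a tool like Pylot does can be sketched with the standard library: several virtual users hammer an endpoint concurrently while latencies are collected. To keep the sketch self-contained it spins up its own throwaway HTTP server; against a real app you would point `url` at the slice of functionality you just built.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # keep the console quiet
        pass


server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

latencies = []
lock = threading.Lock()


def user():
    for _ in range(5):  # each virtual user issues five requests
        start = time.perf_counter()
        urllib.request.urlopen(url).read()
        with lock:
            latencies.append(time.perf_counter() - start)


threads = [threading.Thread(target=user) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
server.shutdown()

print(f"{len(latencies)} requests, worst {max(latencies) * 1000:.1f} ms")
```

This is only for in-iteration bottleneck hunting; it deliberately does not replace the real-browser, real-internet runs before release.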
7) Discuss with the team and identify the patterns in code which can affect performance. Use a code quality tool to catch these patterns in the build process if possible.
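A build-script check for such patterns can start out as simple as a regex scan; the two patterns below are hypothetical examples of things a team might agree are risky, not a real rule set.

```python
import re

# Hypothetical patterns the team agreed hurt performance; extend as needed.
RISKY_PATTERNS = {
    r"SELECT\s+\*": "unbounded SELECT *, fetch only needed columns",
    r"\.findAll\(\)": "loads every row into memory",
}


def scan_source(text):
    """Return (line_number, advice) pairs for each risky pattern found.
    A build script can fail the build when this list is non-empty."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for pattern, advice in RISKY_PATTERNS.items():
            if re.search(pattern, line, re.IGNORECASE):
                findings.append((lineno, advice))
    return findings


sample = 'rows = db.query("SELECT * FROM orders")\ntotal = len(rows)'
print(scan_source(sample))
```

A dedicated static-analysis tool will do this better, but even a script like this makes the team's agreed-upon patterns executable instead of tribal knowledge.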
8) Wrap all integration-level or API-level calls in a timer, so the system logs always contain timing information. Then you can mine the logs in dev/test/staging/prod for real performance data and fix anything that takes too long. (Thanks Chris McMahon.)
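In Python the wrapping can be a one-decorator job; `fetch_customer` and its sleep are stand-ins for a real downstream call.

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("timing")


def timed(func):
    """Wrap an integration/API call so every invocation logs its duration.
    These log lines can later be mined in any environment for real timings."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - start
            log.info("%s took %.3f ms", func.__name__, elapsed * 1000)
    return wrapper


@timed
def fetch_customer(customer_id):
    # Hypothetical downstream call; the sleep stands in for network latency.
    time.sleep(0.01)
    return {"id": customer_id}


print(fetch_customer(42))
```

Because the timer lives in the wrapper rather than in test scripts, production traffic generates the same timing data as test runs, for free.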
I have faced two problems with respect to performance work in Agile teams. The first is the "premature optimization is evil" mindset; there is a difference between premature optimization and context-sensitive optimization that is really needed. The second is that in Agile most of the user stories revolve around functionality: "As a user I would like to...". When giving the stories, customers assume the feature will perform, and developers miss it because it is not stated. Testers need to pitch in and ask, if the functionality is business critical, what the performance considerations for it are (Lisa's Agile Testing book has an excellent chapter on this).
Well, that is all I have to say for now, but this is in no way complete. I would really love to have people with much better knowledge than me pitch in here and correct my mistakes or contribute better ideas.
Feel free to comment on this post :).