Real Agile Approach To Performance Testing

Got a few spare minutes? Come and read the best of The Testing Planet archive from Ministry of Testing

By Rajni Singh

Performance testing can be an integral part of Agile processes. It can help organisations to develop higher quality software in less time while reducing development costs. The goal is to test performance early and often in the development effort and to test functionality and performance in the same sprint. That’s because the longer you wait to conduct performance tests, the more expensive it will become to incorporate changes.

Performance issues are pernicious because, while the application meets the functional requirements, it may not deliver that functionality within the expected (or demanded) time frame. What makes performance issues particularly serious is that they frequently cannot be corrected with code changes alone. Too often the remedy involves architectural rework, changes to the footprint (typically memory usage) or demands for more specific topologies with greater horsepower. This kind of repair is costly and can take a long time to effect. In the case of a performance issue in embedded code, it simply may not be possible to solve it without re-architecting the silicon. So early-lifecycle performance testing is critical if project costs are to be managed. Many tools exist that can simulate load on the application as well as throttle its resources (bandwidth, data transfer speeds, latency) so that realistic expectations of the resulting performance can be observed.

In an Agile project, the definition of “done” should include the completion of performance testing within a sprint. Only when performance testing within a sprint has been successfully completed can you confidently deliver a successful application to your end users.

We need an approach that introduces performance testing early on, rather than the big-bang approach (sprint hardening) of conducting performance testing at the end of the sprint, which delays the product's time to market and, as explained earlier, makes changes more expensive to incorporate at a later stage. It's relatively easy to adapt functional testing to the Agile environment. Adapting performance testing is harder, because it is normally expected that the application will be fully developed and functionally stable before performance testing is conducted on it.

To introduce performance testing early in the development phase, we need to take the points below into consideration:

  • Performance testing at the code level.
  • Performance testing or response time testing for newly developed features during the sprint.
  • Performance regression testing for the system integrated with newly developed features.

As soon as coding starts in an Agile project, performance testing at the code level can begin in parallel. Upon completion of functional testing, each newly developed feature should be performance or response time tested to understand how long it takes to perform its functionality. Once the newly developed functions are integrated with the overall system, performance testing is conducted to ensure that the new features do not introduce any performance bottlenecks in overall system behaviour. Performance testing during the regression phase will also help to identify any configuration, hardware sizing or infrastructure issues.
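Code-level response time testing can start very small. As a minimal sketch (a hypothetical helper, not one of the tools discussed later in the article), a timing decorator can flag a function that exceeds a per-function budget agreed during planning:

```python
import functools
import time

def timed(budget_ms):
    """Fail fast if a function exceeds its response-time budget.

    `budget_ms` is a hypothetical per-function target; a real project
    would complement this with a profiler for method-level timings.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed_ms = (time.perf_counter() - start) * 1000
            if elapsed_ms > budget_ms:
                raise AssertionError(
                    f"{func.__name__} took {elapsed_ms:.1f} ms "
                    f"(budget {budget_ms} ms)")
            return result
        return wrapper
    return decorator

@timed(budget_ms=50)
def lookup_customer(customer_id):
    # Stand-in for real business logic under test.
    return {"id": customer_id, "name": "example"}

lookup_customer(42)  # raises AssertionError only if the call exceeds 50 ms
```

Checks like this can run alongside the unit tests the developers are already writing, so a slow method fails the build the day it is introduced rather than weeks later.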

So, any Agile approach to performance testing must consider the above three points to be successful. With this in mind, I propose the approach detailed below: a simple, easy-to-adopt approach that can be used in a wide variety of projects and organisations.

An Agile approach to performance testing needs to be divided into three sections to ensure coding, feature and overall integrated system-level performance testing is conducted in parallel. We also need to ensure that these sections are covered within the sprint so that working, shippable software can be delivered upon sprint completion. The figure below shows how these three sections fit into an Agile environment:

 Figure 1: An Agile Approach To Performance Testing


During the planning phase, Agile teams prioritise the list of features to be delivered during the current sprint. As soon as the sprint backlog is decided and prioritised, the performance tester should start reviewing the high-level user stories and developing a performance test plan. The testers should decide among themselves which user stories need to be performance tested and divide that work, along with the regression performance testing, between them. The performance tester should also speak to the product owner to agree the performance acceptance criteria for the selected user stories in terms of response time.
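Recording those agreed criteria in a machine-checkable form makes the sprint's "done" decision objective. A sketch of one way to do this, assuming illustrative story IDs and response time targets (none of these figures come from the article):

```python
# Hypothetical acceptance criteria agreed with the product owner.
# Story IDs and millisecond targets are illustrative only.
ACCEPTANCE_CRITERIA = {
    "US-101 search products": {"p90_ms": 800, "max_ms": 2000},
    "US-102 add to basket":   {"p90_ms": 300, "max_ms": 1000},
}

def meets_criteria(story, p90_ms, max_ms):
    """Check measured response times against the agreed targets."""
    target = ACCEPTANCE_CRITERIA[story]
    return p90_ms <= target["p90_ms"] and max_ms <= target["max_ms"]

print(meets_criteria("US-101 search products", p90_ms=650, max_ms=1800))  # True
```

Keeping the criteria next to the test code means the whole team, not just the performance tester, can see what each story is being measured against.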

Sprint 1…N

In every sprint, daily stand-ups play a very important role, as they give team members an opportunity to let the Scrum Master and the rest of the team know about the progress and issues they are facing in their day-to-day work. A sprint typically lasts two to four weeks, though it can be shorter or longer depending on the nature of the project.

As shown in the figure [1] above, performance or response time testing at code, feature and integrated system level need to be performed in parallel within the sprint. This can be achieved through the following approach:

During sprint initiation, performance testers start assisting team members by recommending performance best practices to follow during code development. This can help to make the code more robust in terms of performance. At the same time, while the team is writing code, performance or response time testing at the code level is performed to understand function or method response times and any memory leak issues. This can be conducted using various industry-standard tools, such as the HP Java probe or .NET probe, depending on the programming language used for application development.

In parallel with this activity, as soon as the functional tester signs off a user story, the performance tester starts conducting performance or response time testing on it. After test execution, the results are analysed to determine whether the performance acceptance criteria are met, and the feedback is shared with the entire Agile team. The development team can then act on this feedback and make fixes to ensure that the code drop achieves the required performance standards. This testing can be conducted using various proprietary or open source performance testing tools.
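A minimal sketch of the feature-level measurement step, using a stand-in function in place of the real feature call (a real project would drive the service with one of the load testing tools mentioned above and far larger sample sizes):

```python
import statistics
import time

def measure_response_times(call, samples=50):
    """Call the feature repeatedly and summarise response times in ms."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        call()
        timings.append((time.perf_counter() - start) * 1000)
    timings.sort()
    return {
        "median_ms": statistics.median(timings),
        "p90_ms": timings[int(0.9 * len(timings)) - 1],
        "max_ms": timings[-1],
    }

def checkout():
    # Stand-in for a call to the newly signed-off feature; a real test
    # would hit the service over HTTP with realistic test data.
    time.sleep(0.001)

summary = measure_response_times(checkout)
assert summary["median_ms"] <= summary["p90_ms"] <= summary["max_ms"]
```

The resulting summary can be compared directly against the acceptance criteria agreed with the product owner, and the numbers shared with the team at the next stand-up.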

At the same time, the integrated features need to be regression tested to ensure that the newly developed features have not introduced any performance issues in the overall system. This activity can be performed in parallel with the other performance testing activities. However, in the case of time or resource constraints, performance regression testing can be executed during off-hours or over the weekend.
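The regression step can be as simple as comparing the latest run's timings against a stored baseline from a previous sprint. This sketch assumes hypothetical transaction names, baseline figures and a 10% tolerance, all illustrative:

```python
# Hypothetical baseline (ms) captured from a previous sprint's run.
BASELINE = {"login": 120.0, "search": 450.0, "checkout": 900.0}

def regression_check(current, baseline=BASELINE, tolerance=0.10):
    """Return transactions whose response time grew by more than
    `tolerance` (10% here, an illustrative threshold) over the baseline."""
    regressions = {}
    for name, old_ms in baseline.items():
        new_ms = current.get(name)
        if new_ms is not None and new_ms > old_ms * (1 + tolerance):
            regressions[name] = (old_ms, new_ms)
    return regressions

latest = {"login": 125.0, "search": 520.0, "checkout": 880.0}
print(regression_check(latest))  # {'search': (450.0, 520.0)}
```

A check like this is cheap enough to schedule nightly or at the weekend when resources are constrained, with any flagged transaction raised at the next stand-up.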

All the performance scripts developed during this stage are maintained so that a robust performance regression pack can be built up for future performance testing of the overall end-to-end system. Changes in requirements can happen at any stage of the sprint, so if any changes are required, we go back to the planning stage, re-prioritise the list of features and repeat the cycle.

As soon as all the features are developed, UAT can be performed by the product owners and other concerned stakeholders. During this time, the Agile team can hold their retrospective meeting. The objective of this meeting is to capture the lessons learned during the sprint. It basically covers:

  • What went well?
  • What did not go well?
  • What can be done to improve?

Real Agile Approach

There are some advantages to an agile performance testing approach:

Early Performance Checks: As performance can be considered early in the software lifecycle, robust code and functionality can be delivered before releasing the product.

Cost Savings: As performance is considered early in the software lifecycle, testing can help to avoid costly code or software changes.

Time Savings: As performance testing is conducted in parallel with software development and within the sprint, time to market can be reduced compared to a waterfall or hardening-sprint approach.

There may be some disadvantages too:

Less Emphasis on Documentation: The Agile approach does not emphasise documentation. This can sometimes create challenges if a team member leaves during the sprint: a newly joined team member will need time to get up to speed with sprint delivery.

This approach can be applied to a wide variety of projects (small to large-scale) and can help organisations not only to conduct performance testing in an Agile environment by following a systematic methodology, but also to increase confidence by delivering a more robust application to the production environment.

This approach targets three major points:

  • Performance or response time testing at code level
  • Performance or response time testing for newly developed features
  • Regression testing for the overall system integrated with newly developed features

This approach has worked for me in my organisation and it may work in yours… What do you think?


Author Bio

Rajni Singh has more than seven years of extensive experience in the performance testing field, gained while working with various world-renowned organisations. Rajni completed an MS in software systems from BITS and started her career in performance testing with Mercury Interactive, subsequently HP.
