When Fast Isn’t Fast Enough: Performance Testing in an Agile Development Framework

For any of us to thrive in the digital age, one thing is paramount: we must embrace change. At the enterprise level, this is about more than just being responsive to a dynamic market – it’s about being able to move quickly. This is why many IT shops are adopting agile and other bi-modal development methodologies to boost the frequency and flexibility with which they can deliver new functionality. These new methodologies also allow development teams to work closely with business teams so IT solutions are more likely to solve actual business problems – and get to market faster.

But even the fastest companies and the most competitive and creative application development teams can’t produce high-performing functionality without a full understanding of customer expectations. New functionality that shows account balances can easily be added to an online banking site, for example, but if a bank customer has to wait five – or even two – minutes for their account balance to load, they are likely to begin looking for another bank that can guarantee a better online experience.

These kinds of performance errors can be some of the most costly to fix – often involving the underlying architecture of the application. This is why it is important to communicate performance criteria early on and to integrate performance testing throughout the agile process. Performance testing only at the end of a release, or after the product has shipped to production, will likely result in functionality that becomes known for its defects.

To improve performance, enterprises should concentrate specifically on the following:

  1. Requirements and planning. For any new project, business users need to identify and communicate the specific performance criteria that are important to the success of the end product. For instance, if seeing an account balance quickly is important to the user, performance criteria should include:

    • Volume of throughput the application should expect in both peak and off-peak times

    • Projected number of concurrent users for the application in peak and off-peak times, along with any projected user growth over time

    • Response times for critical functionality within the application

    Agile teams must cultivate the habit of adding performance criteria to their user stories (including a description of the user, what he or she wants and why) or creating performance-only stories in each sprint. Most importantly, the team needs to include expected performance criteria in the definition of “done” for each story in the sprint; the criteria-check sketch following this list shows one way to turn those criteria into an automated test.

  2. Environment. The performance test environment is perhaps the biggest expense associated with performance testing. To get the most accurate and meaningful results, it should mirror the production environment as closely as possible. The typical development environment, where most sprint activity takes place, is too dynamic, with many different coding activities going on at once. The more stable the environment, the more accurately the results will predict how the application will behave in production. To evaluate performance criteria most effectively, the team should promote user stories to the performance environment as soon as they are functionally tested. Because developers are building the application with agile practices, other applications or interfaces may not be fully ready to test against. The performance tester should therefore be prepared to use “stubs” and “drivers” – the dummy code that enables components to be tested in isolation – to simulate interactions with those outside applications (see the stub sketch following this list).
  3. Frequency. Traditionally, performance testing has been conducted at the very end of the systems development lifecycle (SDLC), making errors much more expensive to correct. In an agile framework, in which sprints occur every two to three weeks, teams can adapt by conducting several smaller performance tests at the feature level during each sprint and by building a regression library (see the regression-check sketch following this list). These iterative performance tests will also make it easier to identify the source of performance issues, since testing occurs every time a user story is promoted into the environment. “Early and often” should be the guiding principle for performance testing in an agile development framework.
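
The criteria-check sketch below shows one way to turn the kind of criteria described in item 1 into an automated test. It is a minimal Python sketch using the widely available requests library; the endpoint URL, concurrency level, and two-second threshold are illustrative assumptions, not figures from this article.

```python
"""Minimal sketch: expressing a story's performance criteria as an automated check.
The endpoint, concurrency, and threshold below are illustrative assumptions."""
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

BALANCE_URL = "https://test-env.example.com/accounts/12345/balance"  # hypothetical endpoint
CONCURRENT_USERS = 50        # assumed peak concurrency from the user story
P95_THRESHOLD_SECONDS = 2.0  # assumed response-time criterion in the definition of "done"


def timed_request(_):
    """Issue one request and return its wall-clock response time in seconds."""
    start = time.perf_counter()
    requests.get(BALANCE_URL, timeout=10)
    return time.perf_counter() - start


def test_balance_response_time_under_load():
    # Simulate the projected number of concurrent users for this feature.
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        timings = list(pool.map(timed_request, range(CONCURRENT_USERS)))

    # quantiles(..., n=20)[18] is the 95th percentile of the observed timings.
    p95 = statistics.quantiles(timings, n=20)[18]
    assert p95 <= P95_THRESHOLD_SECONDS, (
        f"95th percentile response time {p95:.2f}s exceeds "
        f"the {P95_THRESHOLD_SECONDS}s criterion"
    )


if __name__ == "__main__":
    test_balance_response_time_under_load()
```

A check like this can run as part of the sprint’s test suite, so a story is not “done” until its stated performance criteria pass.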
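
The stub sketch below illustrates the “dummy code” idea from item 2: a stand-in for a downstream system that is not yet available in the performance environment. It uses only Python’s standard library; the path, payload, port, and simulated latency are all illustrative assumptions.

```python
"""Minimal stub sketch: a stand-in for a downstream application that is not yet
available in the performance environment. Path, payload, and latency are assumptions."""
import json
import time
from http.server import BaseHTTPRequestHandler, HTTPServer


class CoreBankingStub(BaseHTTPRequestHandler):
    """Pretends to be the core banking system the new feature will call."""

    def do_GET(self):
        if self.path.startswith("/accounts/"):
            time.sleep(0.15)  # simulate the downstream system's typical latency
            body = json.dumps({"accountId": "12345", "balance": "1024.56"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()


if __name__ == "__main__":
    # Point the application under test at http://localhost:8081 while the
    # real core banking interface is unavailable.
    HTTPServer(("localhost", 8081), CoreBankingStub).serve_forever()
```

While the real interface is unavailable, the application under test can be pointed at this stub so that performance results reflect the new functionality rather than a missing dependency.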
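
Finally, the regression-check sketch below suggests one way the regression library mentioned in item 3 might be used: each sprint’s measured response times are compared against a stored baseline, and any feature that slows down beyond a tolerance fails the run. The file name, feature names, and 10 percent tolerance are illustrative assumptions.

```python
"""Minimal sketch: flagging performance regressions between sprints by comparing
current measurements against a stored baseline. Names and tolerance are assumptions."""
import json
from pathlib import Path

BASELINE_FILE = Path("perf_baseline.json")  # e.g., committed after the prior sprint
TOLERANCE = 1.10                            # assumed: allow up to a 10% slowdown


def check_against_baseline(current: dict) -> list:
    """Return the features whose p95 response time regressed beyond the tolerance."""
    baseline = json.loads(BASELINE_FILE.read_text()) if BASELINE_FILE.exists() else {}
    regressions = []
    for feature, p95 in current.items():
        allowed = baseline.get(feature, float("inf")) * TOLERANCE
        if p95 > allowed:
            regressions.append(f"{feature}: {p95:.2f}s vs. allowed {allowed:.2f}s")
    return regressions


if __name__ == "__main__":
    # p95 response times (seconds) measured in this sprint's test run.
    this_sprint = {"account_balance": 1.4, "funds_transfer": 2.9}
    problems = check_against_baseline(this_sprint)
    if problems:
        raise SystemExit("Performance regressions found:\n" + "\n".join(problems))
    print("No regressions against baseline.")
```

Committing the baseline alongside the code lets every sprint answer the same question: did anything get slower since the last run?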

ISG helps enterprises incorporate agile methodologies to increase business agility and speed of delivery. Contact me to discuss further.

About the author

Jerry Lawson brings extensive experience in the implementation of strategic change in information technology, specializing in the areas of application development and maintenance, quality and process management, strategic sourcing, IT governance, and service management. With over 15 years of experience, he is highly skilled at analyzing organizations’ development methodologies, identifying inefficiencies and quality gaps in existing processes, and tailoring innovative solutions to meet the client’s needs. Jerry is part of ISG’s IT Strategy practice as well as a leader in ISG’s ADM practice.
