
Load Testing by Example

Best Practices


Overview

The purpose of this tutorial is to present some of the best practices we have developed at Web Performance. Through our Load Tester product development and our load testing service engagements, we have found some simple tips and general strategies that help make load testing efforts more successful. We present them here in the hope that they will do the same for yours.

General Strategies

As testers, our job is to provide the best possible feedback on the readiness of an application, as early as possible in the development cycle. Of course, in the real world deadlines are always tight and testing begins later than it should. We've found a few simple rules that have made our load testing projects more effective.

1. A reasonably accurate result right now is worth more than a very accurate result later. It is well understood that changes become more expensive as a project progresses; the sooner performance problems are found, the more easily and cheaply they can be fixed.
2. Test broadly, then deeply. Testing a broad range of scenarios in a simple manner is better than testing a few testcases in great depth. Early in a project, an approximate simulation of the real world is acceptable: time that would be spent making each testcase exactly mimic the predicted real-world scenario, or testing dozens of slight variations of it, is better spent covering a wide range of scenarios.
3. Test in a controlled environment. Testing without dedicated servers and good configuration management will yield test results that are not reproducible. If you cannot reproduce the results, then you cannot measure the improvements accurately when the next version of the system is ready for testing.

Testing Steps

At Web Performance, we generally recommend that beginners follow a series of steps like the following (a minimal sketch of step 3 appears after the list):
  1. Select the scenarios to be tested
  2. Pick a scenario and configure it
  3. Load test the scenario
  4. Repeat steps 2-3 until each scenario is ready, as schedule permits
  5. Combine the load tests and run a combined test
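To make step 3 concrete, here is a minimal sketch of what a load test does at its core: many virtual users replaying the same scenario concurrently while response times are measured. It uses plain Python with the third-party requests library rather than any particular testing product, and the URL, credentials, scenario steps, and user count are all hypothetical.

```python
import time
from concurrent.futures import ThreadPoolExecutor

import requests  # third-party: pip install requests

BASE_URL = "https://app.example.com"   # hypothetical system under test
CONCURRENT_USERS = 25                  # assumed target load

def run_scenario(user_id: int) -> float:
    """Replay one pass of a recorded scenario; return its duration in seconds."""
    start = time.monotonic()
    with requests.Session() as session:
        session.get(f"{BASE_URL}/login")                  # load the login page
        session.post(f"{BASE_URL}/login",                 # submit credentials
                     data={"user": f"tester{user_id}", "password": "secret"})
        session.get(f"{BASE_URL}/search", params={"q": "widgets"})
    return time.monotonic() - start

if __name__ == "__main__":
    # Drive all virtual users concurrently and report a simple aggregate.
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        durations = list(pool.map(run_scenario, range(CONCURRENT_USERS)))
    print(f"mean scenario time: {sum(durations) / len(durations):.2f}s")
```

A real load testing tool adds ramp-up, think times, detailed per-page metrics, and error checking on top of this basic pattern.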

Scenario Selection

When selecting scenarios to test from the set of all possible scenarios, we rate each one on two aspects: Difficulty of Simulation and Importance. Unfortunately for beginners, the Difficulty of Simulation is hard to judge without some experience with the load testing tool and some understanding of the application's implementation details. In general, though, scenarios that are short and easy to verify are frequently also easy to simulate. Verifying a scenario means confirming that, when simulated in the testing tool, it completed successfully and performed the desired action in the system. For instance, a scenario that sends an e-mail is easy to verify: just check whether the e-mail was sent. Similarly, any scenario that performs an add operation whose result can later be queried is easy to verify.
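As a concrete illustration of that last point, here is a minimal sketch (in plain Python, with hypothetical endpoints and field names) of verifying an add operation by querying for the record after the scenario runs:

```python
import requests  # third-party: pip install requests

BASE_URL = "https://app.example.com"  # hypothetical application under test

# Simulate the scenario: add a new user through the application.
resp = requests.post(f"{BASE_URL}/users", data={"name": "loadtest-001"})
resp.raise_for_status()

# Verify the scenario: the added record should now be queryable.
found = requests.get(f"{BASE_URL}/users", params={"name": "loadtest-001"})
assert "loadtest-001" in found.text, "scenario did not create the user"
```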

Once the scenarios have been rated on Difficulty of Simulation and Importance, we order them: the scenarios that are both easy to simulate and very important come first. Then we choose scenarios that are either easy to simulate but only somewhat important, or harder to simulate but very important. We usually alternate between those, favoring the easy ones when schedules are tight, because experience has shown that the factors that cause performance problems are rarely related to the importance of the scenario. There is a lot of value in testing as many scenarios as possible early in the testing effort: it gives the developers more time to troubleshoot and tune the application when problems are found.
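The core of this ordering can be approximated with a simple sort: importance first, with easier simulations breaking ties. The sketch below omits the alternation described above, and the scenario names and ratings are invented for illustration:

```python
# Each scenario is rated 1 (low) to 5 (high) on both aspects; ratings invented.
scenarios = [
    {"name": "search catalog", "importance": 5, "difficulty": 1},
    {"name": "checkout",       "importance": 5, "difficulty": 4},
    {"name": "update profile", "importance": 2, "difficulty": 1},
    {"name": "admin report",   "importance": 2, "difficulty": 5},
]

# Favor important scenarios, breaking ties toward the easier simulations,
# so the easy-and-important work lands at the top of the schedule.
ordered = sorted(scenarios, key=lambda s: (-s["importance"], s["difficulty"]))

for s in ordered:
    print(f"{s['name']}: importance={s['importance']}, difficulty={s['difficulty']}")
```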

Configuration Management

Good configuration management is essential to getting consistent load test results. If the test environment is frequently changing, it will be difficult to compare the result of one test to another. It will also be difficult to troubleshoot problems, since environmental factors cannot easily be ruled out as a cause of the observed results.

One pointer concerns test data. Part of managing the application data (the database) is populating it. Without good testing tools available, you might consider injecting data directly into the database. With a good load testing tool, however, you can easily populate the database with large amounts of data, and since the data is entered through the same interface real users would use, you have the added assurance that it has passed all of the application's cleansing and validation rules. Once you have created and configured a testcase that adds users to your application, for instance, you can easily populate the database with hundreds or thousands of users. You can even create different populations, backing up each one, to allow for comparison testing. This can help answer questions like "Will the performance change in 6 months, after 50,000 users have been registered?"
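As a sketch of this approach, the following plain-Python loop registers users through a hypothetical front-end endpoint rather than by writing to the database directly, so every record passes through the application's normal validation:

```python
import requests  # third-party: pip install requests

BASE_URL = "https://app.example.com"  # hypothetical application under test

# Register a few thousand users through the same path real users take,
# so each record is subject to the application's cleansing and
# validation rules.
for i in range(5000):
    resp = requests.post(f"{BASE_URL}/register", data={
        "username": f"loaduser{i:05d}",
        "email":    f"loaduser{i:05d}@example.com",
        "password": "Test-Pass-123",
    })
    resp.raise_for_status()  # stop early if the application rejects a record
```

Running this against a fresh database, then backing the database up, yields a reusable population for comparison testing.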

Testcase Construction

First, I want to clarify our terminology. When we describe the way a user interacts with a web application, we refer to a scenario. Once we have recorded the scenario and begun configuring it for simulation, we refer to it as a testcase. It is possible to have multiple testcases for the same scenario: for example, minor variations might be considered the same scenario while requiring separate testcases for load testing purposes.

Creating and customizing a testcase generally follows these steps:
  1. Record the scenario using a browser while the testing software records the actions.
  2. Configure the testcase to simulate multiple user identities (for any system with a login)
  3. Customize the testcase to supply different inputs (search keywords, for example)
  4. Replay the testcase to verify correct simulation
Note that when first working with a new application, we will typically perform replays after steps 1 and 2, to ensure the simulation is correct before moving on to the next step.
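Steps 2 and 3 amount to replacing the recorded identity and inputs with values drawn from a dataset. Here is a minimal sketch in plain Python; the CSV layout, endpoints, and field names are assumptions, not any particular tool's format:

```python
import csv

import requests  # third-party: pip install requests

BASE_URL = "https://app.example.com"  # hypothetical application under test

def run_testcase(username: str, password: str, keyword: str) -> None:
    """One parameterized pass of the recorded scenario."""
    with requests.Session() as session:
        session.post(f"{BASE_URL}/login",
                     data={"user": username, "password": password})
        session.get(f"{BASE_URL}/search", params={"q": keyword})

# users.csv is assumed to hold one identity and input set per row:
# username,password,keyword
with open("users.csv", newline="") as f:
    for row in csv.DictReader(f):
        run_testcase(row["username"], row["password"], row["keyword"])
```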

Testcase Complexity

In order to get quick results, we try to keep our testcases as simple as possible while still reflecting the expected usage of the system.

One of the ways we do this is to model variations of a scenario as separate testcases. For example, if some variations of a scenario require a user to visit a confirmation page while others do not, we model these as two separate testcases. First, modeling if-then logic in a testcase adds complexity and generally requires a script-based testing tool, and maintaining scripts is labor-intensive and increases the overall cost of testing. Second, unless measuring both variations is critical, we will generally choose only one of them early in the testing process, which lets us test more broadly; in later stages, we come back to address the remaining variations.
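For example, rather than one scripted testcase containing a branch, the confirmation-page variation can simply become a second straight-line testcase. A sketch in plain Python, with hypothetical endpoints:

```python
import requests  # third-party: pip install requests

BASE_URL = "https://app.example.com"  # hypothetical application under test

def order_simple(session: requests.Session) -> None:
    """Testcase A: the variation without a confirmation page."""
    session.post(f"{BASE_URL}/order", data={"item": "widget"})

def order_with_confirmation(session: requests.Session) -> None:
    """Testcase B: the variation that visits the confirmation page."""
    session.post(f"{BASE_URL}/order", data={"item": "widget"})
    session.get(f"{BASE_URL}/order/confirm")

# Two straight-line testcases replace one script containing if-then logic;
# each can be scheduled, measured, and maintained independently.
```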

Feedback & Comments

Comments about this report may be posted on the company blog.

Metadata

Christopher L Merrill
©2007 Web Performance, Inc.; v1.1

Version History

v1.0 - first public release (10 September 2007)
v1.1 - email cleanup (23 January 2009)

