Previous versions of Load Tester had two metrics describing the success or failure of a web page (or single transaction) during a load test: Repeats and Errors. A Repeat was counted any time a valid HTTP response was received from the server. Errors, however, were a bit more complex. Errors could occur at any point in a transaction, from establishing the connection to post-response validation. An error was counted if the connection was terminated in the middle of a transaction, or if the status code from the server indicated a failure (e.g. a 500 status). Validators recorded an error when the expected content was not received from the server, and Extractors, which search for dynamic content on a page that is needed later in the testcase, recorded an error when the required content was not found. As a result, a single web page could report several errors, since every transaction within the page could generate one or more of them. This made it impossible to determine exactly how many pages were attempted or failed during a given period in a load test.
The 3.6 release of Load Tester will include four new metrics, which replace the Repeats and Errors metrics. The new metrics are Successes, Failures, Completions, and Failure Rate.
With these new metrics, it is easy to see exactly how many pages succeeded and failed, as well as the total number of requests served (Completions) and the total number of attempts (the sum of Successes and Failures). The Failure Rate is clearly shown for each page and transaction – and is shown by default in the displays and reports wherever the Errors metric was shown in previous versions.
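The relationships between the four metrics can be sketched in a few lines. This is a minimal illustration using hypothetical names, not Load Tester's actual API: each page attempt is modeled with a `completed` flag (a response was received) and a `failed` flag (an error occurred anywhere in the transaction).

```python
from dataclasses import dataclass

@dataclass
class PageResult:
    completed: bool  # a response was received from the server
    failed: bool     # any error occurred (network, status, validation, ...)

def page_metrics(results):
    """Aggregate per-attempt outcomes into the four new metrics."""
    successes = sum(1 for r in results if not r.failed)
    failures = sum(1 for r in results if r.failed)
    completions = sum(1 for r in results if r.completed)
    attempts = successes + failures  # total attempts for the page
    failure_rate = failures / attempts if attempts else 0.0
    return {
        "successes": successes,
        "failures": failures,
        "completions": completions,
        "failure_rate": failure_rate,
    }
```

For example, three attempts where one succeeds, one completes but fails validation, and one is cut off mid-transaction would yield 1 Success, 2 Failures, 2 Completions, and a Failure Rate of 2/3; note that Completions and attempts can differ, since an attempt that never receives a response counts as a Failure but not a Completion.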
Note that each error encountered (network, validation, etc.) is still recorded as an individual entity, but errors are no longer counted as a metric, since an error count is not generally useful (for the reasons mentioned above).
We hope this will clear up confusion faced by some users in previous versions of Load Tester!
When his dad brought home a Commodore PET computer, Chris was drawn to computers. Seven years later, after finishing his degree in Computer and Electrical Engineering at Purdue University, he found himself writing software for industrial control systems. His first foray into testing software resulted in an innovative control system for testing lubricants in automotive engines. The Internet grabbed his attention and he became one of the first Sun Certified Java Developers. His focus then locked on performance testing of websites. As Chief Engineer for Web Performance since 2001, Chris now spends his time turning real-world testing challenges into new features for the Load Tester product.