No, Microsoft has not admitted that our load testing solution is superior…yet. We anticipate that announcement any day now :>
At last year’s Velocity conference, Microsoft’s Eric Schurman (who works on Bing optimization) presented the results of tests that Microsoft ran to measure the business impact of Bing’s performance. Using an A/B test platform that directs a subset of users to a dedicated testing system, they are able to directly observe the effect of specific changes to the performance of the Bing test site. They can collect and analyze a number of metrics about both subsets of users (the experiment group and the control group), including the amount of time a user waits before selecting a search result, the number of times they refine their search, user satisfaction, and revenue per user.
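The core of such a platform is deterministic bucketing: the same user must always land in the same group. The presentation doesn’t describe Microsoft’s implementation, but a common approach is to hash a stable user identifier, as in this minimal sketch (the function name and percentage parameter are illustrative, not Bing’s actual API):

```python
import hashlib

def assign_bucket(user_id: str, experiment_pct: float = 1.0) -> str:
    """Deterministically assign a user to the experiment or control group.

    Hashing the user ID gives a stable, roughly uniform split, so the same
    user always sees the same variant across visits. `experiment_pct` is the
    percentage of users (0-100) diverted to the experiment group.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    # Map the hash to a number in [0, 100) with 0.01% granularity
    bucket = int(digest, 16) % 10000 / 100.0
    return "experiment" if bucket < experiment_pct else "control"
```

Because assignment depends only on the user ID, metrics for each group can be aggregated server-side without storing any per-user experiment state.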
Their results are nothing short of amazing! For many years, the industry-standard recommendation for web page load time was 6 seconds; we have been recommending 4 seconds for several years now. Bing searches typically respond in less than a second, and they found that increasing page load time by a mere 1-2 seconds had a dramatic impact on both user satisfaction and revenue per user:
These numbers may sound small, but consider the impact when total search revenue is $100M: a 4.3% decrease amounts to a $4.3M drop in revenue. Compare this to the cost of adding a major new feature to the site, which may only improve user satisfaction and revenue by 0.1%: performance has more than 10x the impact on the bottom line.
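The arithmetic is simple but worth spelling out, since it is the argument you will make to management. A quick sketch (the revenue figure is the hypothetical $100M from above, not a Microsoft number):

```python
def revenue_impact(total_revenue: float, pct_change: float) -> float:
    """Dollar impact of a percentage change in revenue per user,
    assuming the number of users stays fixed."""
    return total_revenue * pct_change / 100.0

# A 4.3% drop on $100M of annual search revenue: roughly -$4.3M
slowdown_loss = revenue_impact(100_000_000, -4.3)

# A 0.1% gain from a major new feature: roughly +$100K
feature_gain = revenue_impact(100_000_000, 0.1)
```

At these numbers, the revenue lost to a 1-2 second slowdown dwarfs the gain from a typical feature by more than an order of magnitude.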
In another experiment, the Bing engineers implemented progressive rendering of the pages. This allows the header of the page (including the Bing logo and the “refine search” box) to render nearly instantaneously. While this part of the page is being sent from the server to the browser and rendering on the user’s screen, the server is working on the search operation. The results are sent from the server at nearly the same time as before, but the user sees the start of the page much sooner. Microsoft’s measurements indicate an improvement in user satisfaction of 0.7%, which they consider a huge improvement.
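The presentation doesn’t show Bing’s server code, but the technique can be sketched as a chunked (streamed) response: flush the static header immediately, then send the results once the search completes. In this illustrative sketch, `slow_search` is a stand-in for a real search backend:

```python
import time

def render_page(query: str):
    """Stream a page in chunks to illustrate progressive rendering:
    the static header is flushed immediately, then the (simulated)
    search runs, and the results follow in a later chunk."""
    # Chunk 1: the browser can paint this the moment it arrives
    yield "<html><body><header>logo + refine-search box</header>"
    # The search runs only after the header chunk has been emitted
    results = slow_search(query)  # hypothetical search backend
    # Chunk 2: results stream down once the search completes
    items = "".join(f"<li>{r}</li>" for r in results)
    yield f"<ul>{items}</ul>"
    yield "</body></html>"

def slow_search(query: str):
    time.sleep(0.1)  # stand-in for real search latency
    return [f"result {i} for {query!r}" for i in range(3)]
```

Most web frameworks flush each yielded chunk to the client as it is produced (e.g., WSGI iterable responses), so the perceived wait drops even though the results arrive at nearly the same time.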
Next time somebody in your organization questions the money spent on performance testing, send them this article or the link to the full presentation at Velocity 2009 for more details about the Microsoft research.
The Velocity 2010 Conference is coming up soon — we’re looking forward to more good information about the business impacts of web performance.
Chris, Chief Engineer
When his dad brought home a Commodore PET, Chris was drawn into computers. Seven years later, after finishing his degree in Computer and Electrical Engineering at Purdue University, he found himself writing software for industrial control systems. His first foray into testing software resulted in an innovative control system for testing lubricants in automotive engines. The Internet grabbed his attention and he became one of the first Sun Certified Java Developers. His focus then locked on performance testing of websites. As Chief Engineer for Web Performance since 2001, Chris now spends his time turning real-world testing challenges into new features for the Load Tester product.