The Measurements extension is a free, open-source project extension for MuseIDE and the Muse Test Framework that adds evaluation of performance criteria to a test. This initial release adds two new capabilities: a Step Duration Collector, which records how long each step in a test takes, and step duration goal assessment, which fails the test when a step exceeds its duration goal.
The extension is available for installation directly within MuseIDE: after opening your project, go to Extensions and switch to the Available tab. The Measurements extension can then be installed into your project with the click of a button:
Each of the new capabilities is implemented as a context initializer, which means that to make use of it, you must add it to a context initializer group. Here is an example with both features in use:
In the case of the Step Duration Collector, that is all there is to do. Once added to an initializer group, it will collect the duration of each step in the test. When the test is complete and the output is stored, it will appear in a file named StepDurations.json and the contents will be in this format:
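The details will depend on the test, but assuming the file simply maps each Step Id to the list of durations (presumably in milliseconds) recorded for that step, a minimal illustrative sketch might look like this (the ids and values here are made up):

{
  "9" : [ 52, 48, 51 ],
  "12" : [ 104 ]
}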
This output is intended to be consumed by other tools for analysis (which could include transforming it into a more human-readable format). Each duration is identified by the Step Id, which references a step in the test. Since a step could execute multiple times in a test (e.g. in a loop), each step has a list of recorded durations.
Note that the output of a test is stored when the test is run from the command line with the output option. For example:
muse run test1 -o out-folder
This feature has a few parameters that you must provide before it can be useful. Looking more closely at the above example:
The parameters are:
Look at the tags and metadata attributes on this step. When used with the configuration shown above, goals will be assessed on this step because it has the assess-goal tag. The goal will be 100ms, instead of the default 200ms goal, because it has a duration-goal attribute.
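As a rough illustration (the JSON keys below are assumptions made for the example; only the assess-goal tag and the 100ms duration-goal value come from the step described above), the relevant settings on the step amount to something like this:

{
  "tags" : [ "assess-goal" ],
  "metadata" : { "duration-goal" : 100 }
}

A step without the assess-goal tag is not assessed at all, and a tagged step without a duration-goal attribute is measured against the default goal.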
When run in the IDE, failure of the performance goal will generate an event that looks like this:
The test will fail due to the goal failure event:
If you want to dig into the source, it can be found on GitHub.
As always, please contact us with any questions!
Chris
When his dad brought home a Commodore PET computer, Chris was drawn to computers. Seven years later, after finishing his degree in Computer and Electrical Engineering at Purdue University, he found himself writing software for industrial control systems. His first foray into testing software resulted in an innovative control system for testing lubricants in automotive engines. The Internet grabbed his attention and he became one of the first Sun Certified Java Developers. His focus then locked onto performance testing of websites. As Chief Engineer for Web Performance since 2001, Chris now spends his time turning real-world testing challenges into new features for the Load Tester product.