
Performance testing automation with XRebel Hub REST API

Last week we released the XRebel Hub public API, which you can use to automate your performance testing. This article guides you through setting up an automated performance validation step in your CI server, using Jenkins as an example. We also provide a few real-life automation examples.

Setting the baseline

Setting the baseline is the first step of any performance testing initiative — assuming we have defined our performance goals. The best practice here is to test the current build against the one that has been previously approved and shipped to production.

The XRebel Hub API allows you to set the baseline build. The API call should be made as part of the deployment script where the approved build is pushed to production:

curl -sX PUT "$APPLICATION_NAME/baselines/default/" \
-H "Content-Type:application/json" \
-H "authorization:$SECRET_API_KEY" \
-d "{\"build\":\"$BUILD_NUMBER\"}" 

Here (and throughout this article):

  • $BUILD_NUMBER is a Jenkins environment variable holding your current build number. This is the recommended way to identify builds in XRebel Hub.
  • $SECRET_API_KEY is your API key. You can find this on the project settings page in XRebel Hub.
  • $APPLICATION_NAME is your application name in XRebel Hub.

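To make the deployment step fail loudly when the baseline call is rejected, you can wrap the call in a small status check. The following is a sketch of our own (the check_status and set_baseline helpers are not part of the API) that captures the HTTP status code with curl's -w option, using the same variables as above:

```shell
#!/bin/sh
# check_status STATUS: succeed only for 2xx HTTP status codes.
check_status() {
  [ "$1" -ge 200 ] && [ "$1" -lt 300 ]
}

# Sketch: capture the HTTP status of the baseline call and fail the
# deployment step when XRebel Hub rejects it.
set_baseline() {
  STATUS=$(curl -so /dev/null -w "%{http_code}" -X PUT \
    "$APPLICATION_NAME/baselines/default/" \
    -H "Content-Type:application/json" \
    -H "authorization:$SECRET_API_KEY" \
    -d "{\"build\":\"$BUILD_NUMBER\"}")
  if ! check_status "$STATUS"; then
    echo "Failed to set baseline build (HTTP $STATUS)" >&2
    exit 1
  fi
}
```

A non-zero exit code from this step aborts the deployment, so a misconfigured API key or application name is caught immediately instead of silently leaving the old baseline in place.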

Checking for performance issues

Now that the baseline is set, we can add another build step to check for performance issues.

This performance validation step should be added after all your performance tests or regular tests are executed. XRebel Hub needs the data generated during those tests.

To check for performance issues in the current build:

curl -s "$APPLICATION_NAME/issues?targetBuild=$BUILD_NUMBER&defaultBaseline" \
-H "authorization:$SECRET_API_KEY" 


You should receive a response like this:

{
  "appName": "Petclinic-9966",
  "target": "32",
  "baseline": "27",
  "issuesCount": {
    "DURATION": 2,
    "IO": 1
  },
  "entryPoints": [
    {
      "name": "HTTP /owners/{ownerId}",
      "hits": 48,
      ...
    }
  ]
}

Parsing the performance issues response

We have learned how to call the API to find performance regressions between builds. Now we need a way to use this JSON data from the shell.

We recommend using jq.

Let’s get the most recent version of jq, save it as “jq” and make it executable:

curl -Lo jq "" && chmod +x jq

Make sure to choose the right binary for your OS from the jq download page.

Next, since we will be running this as a build step, there is no need to download jq on every build. Fetch it only when it is not already present:

if [ ! -f jq ]; \
then curl -Lo jq "" \
&& chmod +x jq; fi

Finally, let us save XRebel Hub’s API response into a file and use jq for formatting and output:

if [ ! -f jq ]; \
then curl -Lo jq "" \
&& chmod +x jq; fi
curl -s "$APPLICATION_NAME/issues?targetBuild=$BUILD_NUMBER&defaultBaseline" \
-H "authorization:$SECRET_API_KEY" \
> response.json
cat response.json | ./jq "."


Failing a build

Now for the actual automation — we are not calling an API just to see some nicely formatted data. Let us start simple and fail the build when it has performance regressions.

curl -s "$APPLICATION_NAME/issues?targetBuild=$BUILD_NUMBER&defaultBaseline" -H "authorization:$SECRET_API_KEY" > response.json
cat response.json | ./jq "."
cat response.json | ./jq "if .issuesCount == {} then 0 else 1 end" | grep -q 0

The second line can be skipped if you do not need to output the response to the console.

As you can see on the third line, the jq syntax is quite straightforward. Here, jq goes through the file, finds the “issuesCount” node, and if it is empty (meaning there are no issues), outputs “0”. If “issuesCount” contains data, jq outputs “1”.

The “grep” pipe is there to convert that output into an exit code: “grep -q 0” exits with “0” when it finds a “0” and with “1” otherwise, failing the build.
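You can see the grep trick in isolation: grep -q exits with “0” as soon as it finds the pattern and with “1” otherwise, and the CI server treats a non-zero exit code as a failed build step:

```shell
# "0" is found, so grep exits with 0: the build step passes.
printf '0\n' | grep -q 0; echo "exit=$?"

# "1" does not contain "0", so grep exits with 1: the build step fails.
printf '1\n' | grep -q 0; echo "exit=$?"
```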

Practical examples

Fail when exceptions are present

Find the EXCEPTIONS field and look at its value. If the value is greater than 0, jq outputs “1” and the grep pipe fails the build:

cat response.json | ./jq "if .issuesCount.EXCEPTIONS > 0 then 1 else 0 end" | grep -q 0

Pro tip: you can simplify the response by asking XRebel Hub for only exceptions:

curl -s "$APPLICATION_NAME/issues?targetBuild=$BUILD_NUMBER&defaultBaseline&issues=EXCEPTIONS" \
-H "authorization:$SECRET_API_KEY" \
> response.json

Fail when a specific entry point has issues

This one is a bit trickier. We need to look at all entry points in the response and figure out if they have a “name” field with the entry point name that we are looking for.

cat response.json | ./jq ".entryPoints[] | if contains({name: \"GET /owners/{ownerId}/pets/new\"}) then 1 else 0 end" | grep -c 1 | grep -qx 0

We will apply a pipeline of jq filters for this:

  1. First, find an array of entry points and feed them to the next step one by one.
  2. Second, run each entry point object through the conditional filter. This checks the “name” field for the value we are looking for.

This pipeline will output “1” for every entry point with a matching name and “0” for all others.

There will be multiple outputs, and every non-matching entry point produces a “0”, so we cannot simply grep for “0” and pass when it is found. Instead, we have to get a little more creative: first count the number of “1” values with “grep -c 1”, then exit with an error code when that count is not “0” (meaning there is at least one matching entry point with issues).
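The counting step can be simulated without jq by feeding grep a hand-written output stream. Here the final grep uses -x to match the whole line, so a double-digit count such as “10” cannot accidentally match “0” and let the build pass:

```shell
# Three entry points, one of them matched: the count is 1, the final grep
# finds no exact "0" line, and the build step fails.
printf '0\n1\n0\n' | grep -c 1 | grep -qx 0; echo "exit=$?"

# No matches at all: the count is 0, the final grep succeeds,
# and the build step passes.
printf '0\n0\n0\n' | grep -c 1 | grep -qx 0; echo "exit=$?"
```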

Fail depending on issue severity

Let us add on to the previous example and fail the build only when a certain entry point is present and that entry point became at least 20 times slower. To do that, we simply need to verify that the average request duration increase is over 20:

cat response.json | ./jq ".entryPoints[] | if contains({name: \"GET /owners/{ownerId}/pets/new\"}) and .duration.averageIncrease > 20 then 1 else 0 end" | grep -c 1 | grep -qx 0

Note that this example is simply to demonstrate the flexibility that the API provides. A better approach is to set relevant increase thresholds directly in XRebel Hub.

Fail on a group of entry points

You can also make the filter more generic and instead of listing specific names, fail when any entry point matches a given string. To do that, we will add another pipe before the conditional filter to extract the value of the name field. For example, dogs are important, so let us not approve the build if there are entry points containing “dog” in the list of regressions:

cat response.json | ./jq ".entryPoints[] | .name | if contains(\"dog\") then 1 else 0 end" | grep -c 1 | grep -qx 0

Pro tip: you can tell the agent to whitelist entry points instead of filtering them with a shell script. To do this, use the custom profiling configuration file.

Multiple conditions

Finally, let us figure out how to add multiple conditions. The best practice is to list them as separate lines, one after another:

  1. This makes the script more readable.
  2. You can clearly see which condition was triggered to fail the build — it will be the last one printed out to the console.
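One way to keep multiple conditions readable is a small helper of our own (not part of the API or of jq) that prints which condition failed before stopping the build. Each jq pipeline from the sections above would be passed to it as a command:

```shell
#!/bin/sh
# check NAME COMMAND...: run COMMAND and fail the build with a named
# message when it returns a non-zero exit code.
check() {
  DESC=$1; shift
  if ! "$@"; then
    echo "FAILED: $DESC" >&2
    exit 1
  fi
  echo "OK: $DESC"
}

# With real data, each condition wraps one of the jq pipelines, e.g.:
#   check "no exceptions" sh -c \
#     'cat response.json | ./jq "if .issuesCount.EXCEPTIONS > 0 then 1 else 0 end" | grep -q 0'
check "demo condition" true
```

The console log then names the first condition that failed, which is easier to scan than picking the last printed pipeline out of the output.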

Putting it all together

Here is the final shell script for the performance check build step:

if [ ! -f jq ]; then curl -Lo jq "" && chmod +x jq; fi
curl -s "$APPLICATION_NAME/issues?targetBuild=$BUILD_NUMBER&defaultBaseline" -H "authorization:$SECRET_API_KEY" > response.json
cat response.json | ./jq "."
cat response.json | ./jq "if .issuesCount.EXCEPTIONS > 0 then 1 else 0 end" | grep -q 0
cat response.json | ./jq ".entryPoints[] | if contains({name: \"GET /owners/{ownerId}/pets/new\"}) and .duration.averageIncrease > 20 then 1 else 0 end" | grep -c 1 | grep -qx 0
cat response.json | ./jq ".entryPoints[] | .name | if contains(\"dog\") then 1 else 0 end" | grep -c 1 | grep -qx 0

Breaking this example down line by line:

  1. Check for jq and download it when not found.
  2. Ask XRebel Hub for performance regressions, comparing the current build against a default baseline. Do not forget to set that baseline during your deployment stage.
  3. Output the nicely formatted response to the console.
  4. Fail the build if there are any exceptions.
  5. Fail the build if “GET /owners/{ownerId}/pets/new” requests became 20 times slower.
  6. Fail the build if there are any regressions with “dog” in the entry point names.


We have now walked through an example of automating performance regression testing with Jenkins and XRebel Hub. The best-practice approach consists of only two steps:

  1. Use the last approved and shipped build as the performance baseline for future builds. To do that, add a call to the XRebel Hub API, telling it the number of the currently deployed build.
  2. Add a performance check step to your testing pipeline. This should be the final testing step. Simply ask XRebel Hub’s API for a list of regressions in your current build and fail the build if your performance criteria are not met.

Both the API and XRebel Hub itself are flexible tools that allow you to tailor performance testing according to your requirements. Hopefully, the examples provided in this article will help you automate performance testing for your product. Should you have any questions, please feel free to post in the comments below!