Oh boy, oh boy, oh boy! I’m extremely happy to say that we’re really close to publishing the results of the Java Performance Survey that we ran this spring. I hope you remember it: we asked about the tools you use for performance testing, how you test, how long it takes to verify that there are no performance regressions, and who monitors and cares about the performance of your production deployments.
We shared it countless times on our social networks and asked everybody to do the same, in order to reach the most respondents and get the least biased statistics possible. Also, for the second year in a row, ZeroTurnaround donated $0.50 for every completed survey. I’m happy to say that the Dogs for the Disabled organization benefited from your participation and now has an opportunity to improve people’s quality of life.
But let’s not digress too far. All in all, we collected 1562 responses from Java developers, architects, performance engineers, and operations people all over the world. Here’s the basic spread across the roles our respondents hold on their teams. You can see that almost 65% are developers, which is great news, because this is the group most of our readers belong to, so the analysis of the results will be more directly applicable.
One of the most important questions we wanted to ask is: are the respondents working on projects similar to the ones you care about? The majority of respondents indicated that they work on web applications: imagine a back-end server, a web front-end, and a database or two. Other types of projects were also represented, but the results for them unfortunately won’t be as conclusive as for web app developers. Check out the spread in the image below.
I promised Simon Maple, who’s the main author of the data analysis in the upcoming report, that I won’t spoil it by giving away too many details. However, this is one bit that makes things really interesting. When we asked if you see performance issues affecting end users in production, this is what the answers looked like:
Three quarters of respondents honestly said yes, issues slip into the late stages of the software lifecycle and are visible to the end users. A happy 24% of respondents don’t think their users struggle with application performance. The best bit is that we can compare these honest apples and ignorant oranges and see what they do differently. There’s a chance they all work on Android apps, where there’s only one active user at a time, but when was the last time you saw a responsive Android app?
Anyway, the upcoming RebelLabs report with the analysis of the data will be published next week! In it, we analyze the answers to questions like which tools are the most popular for uncovering performance issues, how long the lifespan of a typical performance bug is, and whether having a dedicated performance team gives a larger safety margin against performance issues.
Stay tuned for the report, follow us on Twitter (@ZeroTurnaround) to see the announcements, or just subscribe to our email list and we’ll send you the report as a beautiful PDF.