What did we find out?
In September 2013, just a few days before JavaOne, we released our 4th annual report looking into different aspects of Developer Productivity. In past years we looked at things like build and restart times, the market share of different Java tools and technologies, what stresses out developers, and how developers spend their work weeks; this time we went hunting after larger game.
Our question was: How do the methodologies we practice and the tools we use affect the quality of our software and our ability to deliver it predictably?
A reasonable question is “How come we didn’t know these things before?” Because most of those life-saving tools and best practices are supported only by anecdotal evidence and stories, without any data or stats behind them. That is what we wanted to change: find real data, not just parrot more stories. For example, we saw a relationship between specific tool types, like version control, code quality analysis and CI servers, and your ability to deliver releases predictably:
We ended up creating a survey that over 1000 engineers around the world responded to; you can read more about the survey and our methodology in the full report. The findings below are the most significant statistics we uncovered, so without further ado I offer you “11 stats we [now] know about developer productivity” (note: links below take you to the specific page of the report where that data is presented):
- Fixing code quality issues increases your software quality by up to 7%, and the predictability of your releases by up to 9%. See the stats.
- Automating tests increases your software quality by up to 9% and the predictability of your releases by up to 12%. See the stats.
- Having developers pair up increases software quality by up to 7%. See the stats.
- Doing code reviews increases the predictability of releases by up to 11%. See the stats.
- Making sure that developers test the software themselves, rather than relying only on the QA team and automated tests, raises software quality by up to 5%. See the stats.
- Who does the task estimation influences predictability: when done as a team, predictability increases by 6%, whereas involving management drops it by 6%. See the stats.
- Using version control increases the predictability of releases by 9%. See the stats.
- Using an IDE increases the predictability of releases by 8%. See the stats.
- The top individual tools that increase the predictability of releases are JRebel (+8%), Jenkins (+4%), Bamboo (+4%) and Confluence (+3%). See the stats. Note: the original survey didn’t include JRebel, which would have been a conflict of interest. However, we matched respondents’ email addresses afterwards and saw that JRebel has a positive effect on release predictability. Cool!
- The trends among the top 10% of respondents (i.e. those with quality and predictability scores better than 90% of respondents) are as follows: definitely use Jenkins, look for something other than Google Drive for collaborating, choose Git instead of Subversion, and opt for Google Hangouts over Skype. See the stats.
It’s now possible for your organization to calculate, with some degree of accuracy, general trends in how tools and practices can affect the quality of your software and the degree to which its delivery is predictable. Hell yeah! Just check the full report for how: http://zeroturnaround.com/rebellabs/developer-productivity-report-2013-how-engineering-tools-practices-impact-software-quality-delivery/
What do you think of these findings? They pretty much speak for themselves, but the complete report provides more context and information. Show it to your colleagues, user group members, Twitter followers, etc. Most folks in the software game, even non-engineers, can benefit from this data. Love/hate this report? Leave comments or tweet @RebelLabs to get in touch.