
Developer Productivity Report 2012: Java Tools, Tech, Devs & Data

Introduction to the Java Developer Productivity Survey

Digging into data in search of insights is always an exciting activity. Last year's Java EE Productivity Report focused on the tools, technologies and standards in use (exclusive selections) and on turnaround time in Java (i.e. how much time is spent per hour redeploying/restarting).

In this year’s Java Productivity Report, we expanded the selection of technologies and tools that Java development teams could choose from, made the selections non-exclusive, and covered more areas, for example Version Control Systems and Code Quality Tools. We also focused more on the question, “What makes developers tick?” and learned a lot about how devs spend their work week, what elements of the developer worklife increase or decrease efficiency, and what stresses developers out. We found a lot of interesting trends and insights, and broke them down into 4 parts:

Part I: Developer Tools & Technologies Usage Report
Coverage of Java versions, JVM languages, IDEs, Build Tools, Application Servers (containers), Web Frameworks, Application (server-side) Frameworks, JVM Standards, Continuous Integration Servers, Frontend Technology, Code Quality Tools, Version Control Systems

Part II: Developer Timesheet
How do devs spend their work week?

Part III: Developer Efficiency
What aspects of their job make devs more/less efficient?

Part IV: Developer Stress
What keeps devs awake at night?

But before we go any deeper, let’s review the data as a whole.


Although we ran the survey through an open call on the web, a certain amount of bias is present in it. Obviously, our customers are more likely to reply to the call than anyone else. It is also likely that people who find and take such surveys are less conservative in their technology choices than those who don’t. To protect against such bias, we asked for the size of the company and the industry, so that we could normalize the data if any group was overrepresented. However, no company size or industry was over-represented, so aside from some “early adopter” bias there shouldn’t be much misrepresentation.

Another thing to understand is what we are measuring. Popularity can be measured in different ways: number of users, number of organizations, number of lines of code, etc. In this survey we measure the penetration of tools and technologies in the market. So 10% of respondents means that likely 10% of the organizations that do Java development use this tool or technology somewhere in the organization. It does not tell us whether it is critical to them, whether it is used heavily or only in rare cases, or whether it is used by everyone or only by the respondent.

[Figure: Tools and technologies leaderboard]