
Developer Productivity Report 2012: Java Tools, Tech, Devs & Data

Introduction to the Java Developer Productivity Survey

Digging into data in search of insights is always exciting. Last year’s Java EE Productivity Report focused on the tools, technologies and standards in use (as exclusive selections) and on turnaround time in Java (i.e. how much time per hour is spent redeploying/restarting).

In this year’s Java Productivity Report, we expanded the selection of technologies and tools that Java development teams could choose from, made the selections non-exclusive, and covered more areas, for example Version Control Systems and Code Quality Tools. We also focused more on the question, “What makes developers tick?” and learned a lot about how devs spend their work week, which elements of developer work life increase or decrease efficiency, and what stresses developers out. We found a lot of interesting trends and insights, and broke them down into 4 parts:

Part I: Developer Tools & Technologies Usage Report
coverage of Java versions, JVM languages, IDEs, Build Tools, Application Servers (containers), Web Frameworks, Application (server-side) Frameworks, JVM Standards, Continuous Integration Servers, Frontend Technology, Code Quality Tools, Version Control Systems

Part II: Developer Timesheet
How do devs spend their work week?

Part III: Developer Efficiency
What aspects of your job make devs more/less efficient?

Part IV: Developer Stress
What keeps devs awake at night?

But before we go any deeper, let’s review the data as a whole.


Although we ran the survey through an open call on the web, a certain amount of bias is present in the results. Obviously, our customers are more likely to respond to the call than anyone else. It is also likely that people who find and take such surveys are less conservative in their technology choices than those who don’t. To protect against such bias, we asked for company size and industry, so that we could normalize the data if any group was overrepresented. However, no size or industry was over-represented, so aside from some “early adopter” bias there shouldn’t be much misrepresentation.
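To make that normalization step concrete, here is a minimal sketch of one such approach (post-stratification weighting), with entirely hypothetical buckets, market shares and data; it illustrates the idea rather than the method actually used for this report:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// A minimal sketch of post-stratification weighting. The company-size
// buckets, market shares and responses below are hypothetical.
public class SurveyNormalizer {

    record Response(String companySize, String tool) {}

    public static void main(String[] args) {
        // Hypothetical target share of each company size in the real market.
        Map<String, Double> marketShare =
                Map.of("small", 0.6, "medium", 0.3, "large", 0.1);

        List<Response> responses = List.of(
                new Response("small", "Maven"),
                new Response("large", "Maven"),
                new Response("large", "Ant"));

        // Observed count per bucket in the sample.
        Map<String, Long> counts = new HashMap<>();
        for (Response r : responses) counts.merge(r.companySize(), 1L, Long::sum);

        // Weight = target share / observed share; an over-represented
        // bucket ends up with a weight below 1, damping its influence.
        double total = responses.size();
        for (Response r : responses) {
            double observed = counts.get(r.companySize()) / total;
            double weight = marketShare.get(r.companySize()) / observed;
            System.out.printf("%s (%s) weight = %.2f%n", r.tool(), r.companySize(), weight);
        }
    }
}
```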

Another thing to understand is what we are measuring. Popularity can be measured in different ways — number of users, number of organizations, number of lines of code, etc. In this survey we measure the penetration of tools & technology in the market. So 10% of respondents means that likely 10% of the organizations that do Java development use this tool or technology somewhere in the organization. It does not tell us whether the tool is critical to them, whether it is used heavily or only in rare cases, or whether it is used by everyone or only by the respondent.
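As a rough illustration of that penetration measure, the sketch below counts, for each tool, the share of respondents who selected it at least once. The data and tool names are hypothetical, and because selections are non-exclusive the percentages can sum to more than 100%:

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;

// A minimal sketch of computing tool penetration from non-exclusive
// (multi-select) survey answers; the respondents and tools are made up.
public class PenetrationReport {
    public static void main(String[] args) {
        // Each set is one respondent's selections; a tool may appear in many sets.
        List<Set<String>> respondents = List.of(
                Set.of("Eclipse", "Maven"),
                Set.of("Eclipse", "Ant", "Maven"),
                Set.of("IntelliJ IDEA"));

        Map<String, Long> usedBy = new TreeMap<>();
        for (Set<String> selections : respondents)
            for (String tool : selections)
                usedBy.merge(tool, 1L, Long::sum);

        // Penetration: share of respondents (~ organizations) using the tool
        // at all. Multi-select answers mean the column can sum past 100%.
        usedBy.forEach((tool, n) ->
                System.out.printf("%-14s %.0f%%%n", tool, 100.0 * n / respondents.size()));
    }
}
```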

[Figure: Tools and tech leaderboard]




  • http://twitter.com/halyph Orest Ivasiv

    Guys, it’s a very nice report. Keep going. Looking forward to the next report ;-)

  • Phillip VU

    It looks like some helpful tools were missed in this report.

  • J. Johnson

    It would be great if it covered ZK and specific JSF solutions, such as RichFaces and PrimeFaces.

  • arhan

    As with any statistical slice, there are tools that are more popular, and some cool tools might be mentioned just a couple of times, which makes the numbers negligible. It is very possible that some very, very cool tools just didn’t make it to the list because not many people mentioned ’em.

  • arhan

    Surprisingly, ZK received a negligible number of votes for the survey. Next time be sure to vote for it! :)

  • Pieter Humphrey

    great report — do you publish any details about survey size, sample, methodology?

  • neil stockton

    LOL. JDO lacked a good implementation for years, and that is your reason it “lags behind”? Suggest you do a bit more background reading. DataNucleus (previously JPOX) was the RI from 2006, and there were some very good commercial implementations before that. What caused it to “lag behind” was politics from RedHat, Oracle and IBM. This is common knowledge, just ask anyone involved in JSR-243. Suggest you at least get the facts straight before pushing out any further such “reports”.

  • Gaius

    Agree – I thought that comment was interesting myself. There’s a JPA kool-aid drinking factor in play too. “Everybody’s doing it, so it must be the best.”