
Java Productivity Report 2011: India vs Rest of World



Have you ever wondered what it’s like to be a Java developer in India? Which tools, containers and frameworks do Indian developers use? How do Indian development environments compare to the environments of other Java developers globally? India is home to hundreds of thousands of Java developers, and we think that the show “Outsourced” shouldn’t be the only window into the daily work lives of India’s IT crowd. To illuminate the space, we collected information from over 1250 Indian developers in order to provide some answers.

As a follow-up to this report, we’d like to ask you for your opinions on this data, and for responses related to the following questions:

  • What influences the differences we’ve found?
  • Do you feel that this small sampling is somewhat accurate, or way off?
  • Why are WebSphere and WebLogic so much more popular in India than RoW?
  • Is productivity a greater or lesser concern in India?
  • What other similarities and differences have you found?

Our Methodology

This report includes several comparisons between the data we received from our India survey and the Java EE Productivity Report, which we published early in 2011. We followed up that report with Reactions to the Java EE Productivity Report, which featured commentary from various reviewers, such as Joe Ottinger (of TheServerSide and GigaSpaces), Jason van Zyl (from Sonatype, talking about Maven), Duncan Mills (from Oracle, talking about NetBeans), Ian Skerrett (of the Eclipse Foundation), Todd Williams (of Genuitec), Rich Sharples (on JBoss), Howard Lewis Ship (on Tapestry), Eelco Hillenius & Jeremy Thomerson (on Wicket), and Matt Raible & Ed Burns (on JSF).

In previous surveys, we emailed contacts that we met at JavaOne and other conferences, and announced the survey on sites like DZone and The Server Side.  Those methods may have a location bias and other self-selection biases. For the data on India, we had to do our best to ensure that the audience we reached was exclusively based in India, so we did something that we’ve never done before at ZeroTurnaround – we rented an email list, and asked that audience to complete the survey. This introduces other forms of bias, and we openly acknowledge it. In the worst case, this data may only serve to start a conversation, but if the data is indeed accurate, it may be used for other purposes. The choice is up to you, and the data is freely available for your own analysis.  Let’s get started.

First, a look at some demographics…

We wanted to get a quick picture of the job profiles, industries/sectors and the breakdown of Java versions in use in India, according to our survey results.

When asked about their role at work, we found that 72% of respondents are directly involved in Java development as software developers, senior developers or engineers. Another 10% fall into the management segment as project managers, directors or VPs involved with Java development teams. The remaining 18% belong to academic institutions as university professors or students, or identified themselves as one of the following: Software Architect, Technical Staff, Team Lead, Technology Architect, Business Development, Delivery Manager or Trainer.

When asked to describe which Industry or Sector of the workforce the respondents work in, the data showed that 31% selected the “Services” industry/sector, which includes system integration plus managed and professional services. Independent Software Vendors (ISV) took the number two position at 16%, and consulting firms, outsourcing firms, telecommunications providers and academic institutions each represent about 10% of the market.

Finally, we got some interesting data on Java versions used in India. Just over 68% of respondents use Java 5.0 or later, and 11% continue to use Java 1.4 or earlier. These choices were not mutually exclusive, nor required, so it is unclear what other respondents were using.

Build Tools Comparison

In our last report, Ant and Maven were nearly neck and neck, but there is a huge gap between the two among development teams in India.

Maven adoption appears much lower in India compared to the RoW report, which had actually placed Maven with a slight lead over Ant (53% vs 50%). Perhaps name recognition (i.e. Apache Maven vs. Maven) led respondents to hesitate or misread the choices?

In any case, it would be great to learn more about this large discrepancy in Maven use between India, where only 17% of respondents use it, and the RoW segment. What could cause this?

We will expand the list of build tool choices as much as possible for our next survey to include tools rising in popularity, such as Gradle.

IDE Popularity Comparison

The Eclipse IDE is still the market leader in India, but lower penetration of IntelliJ IDEA has led to NetBeans taking the #2 market spot.

IDE usage trends in India are relatively in line with RoW, with one significant exception immediately visible: IntelliJ IDEA is greatly underrepresented in India compared to the rest of the world. In the RoW report, IntelliJ was used by 22% of survey respondents, compared to only 2% in India.

MyEclipse, NetBeans, JDeveloper and RAD pick up the slack for IntelliJ with larger market shares, demonstrating the reach of IBM and Oracle on the subcontinent. While less than 1% of RoW respondents mentioned using RAD, it claims 8% of the IDE market share in India. Small note: In the RoW report, the “Other” option was not offered, so the comparison here is not accurate.

Application Server Comparison

Heavier, enterprise-focused containers move in for the steal in India, turning the tables on app servers that are more popular in RoW.

The major players are all represented in India, but market shares are significantly different when compared with the RoW. The greatest difference can be seen with IBM WebSphere AS, which is the #2 player in India with 26% of respondents using it. Oracle WebLogic, the #3 application server both in India and RoW, takes a larger share of the Indian market with 16%. Compared with the last report, where only 7% of those surveyed used WebSphere and 10% used WebLogic, we can see pretty definitively that both IBM and Oracle have been far more successful in the Indian application server market.

While Tomcat still enjoys nearly identical market share in India as everywhere else, JBoss, Glassfish and Jetty all have significantly smaller roles in the application server market. JBoss accounted for over one-quarter of the market for application servers in the last report, but this is nearly halved for India.

Glassfish (3%) and Jetty (1%), which were tied for the fourth most-used application server in the last report with 8% of respondents using each, have considerably smaller influence in India.

Is it safe to say that larger companies, using heavier application servers, are more likely to outsource to India, and that their preferences dictate the technologies used in development environments? How does Tomcat maintain its market share while other application servers falter? Who is more likely to use JBoss, Glassfish and Jetty in North America and Europe? Are there similar numbers of those types of users in India?

Java Frameworks Comparison

Older technologies are more common in India, but the most popular frameworks are still well represented.

In the last report, we separated Web Frameworks and Server-side Frameworks, but lack of clarity in some areas caused confusion as some frameworks, like Seam and Spring, can be used for both.

The last report also omitted entries for log4j and AspectJ, an omission rectified in this latest survey. Among the more widely used tools, Google Web Toolkit (GWT) shows the closest match between India and RoW, although Stripes is just as popular in India as everywhere else.

Significant gaps appeared in the use of Struts (1 and 2), Hibernate, JSF and JPA. Spring has a significantly smaller representation in India, which also shows a greater use of EJB 2.0 over EJB 3.0, and EJB 1.0 is used 3x more in Indian development teams than in the rest of the world.

Turnaround Time

Although we can’t measure all forms of productivity in a report like this, we do like to identify one that can be measured and compared by any Java EE developer globally: Turnaround Time.

In these surveys, we collect data on turnaround time by asking:

  1. “How long does it take to build, restart your container and redeploy your app?”
  2. “In an hour of coding, how many times do you redeploy?”

Responses to these questions give us a reasonable estimate of the amount of time, in an average coding hour, that a Java developer spends building, restarting and redeploying their application. A low number here is seen as highly productive; quick turnaround times mean that developers are constantly verifying their work, writing code incrementally, and delivering features that are less likely to introduce new bugs. Higher numbers mean that teams spend a significant amount of time waiting to see the effects of changes they make to code, implying that they do it less often, lose their flow, and take longer to get the same amount of work done at the same quality.

That being said, we could not directly watch each respondent as they timed their redeploy process – an issue that we’ll address later.

Something we’ve never asked is, “After your build, restart, and redeploy process finishes, how much time does it take to get back into the flow of your work?”. Unfortunately, this is not easily calculable, but it clearly exists and is certain to affect developer productivity, turnaround time and project completion times when applied across a team of hundreds of Java developers over a period of even one month.

Disclosure: ZeroTurnaround builds and sells JRebel, a tool designed to eliminate the time spent building & redeploying. This provides another reason why we are interested in measuring the amount of Turnaround Time spent in Java development, and the reader should be made aware of this.  All data is available for personal analysis.

The length of a single redeploy cycle – India vs RoW 2011

The first question asked to get at the heart of turnaround time in Indian Java development teams was “How long does it take to build, restart your container and redeploy your app?”

Average redeploy times between India and RoW are quite different. In the last report, we saw an average redeploy time for RoW at 3.1 minutes with a standard deviation of 2.8. For India, we see an average of 4.8 minutes with a standard deviation of 4.2.

Just over one-third of respondents enjoy redeploy times of 1 minute or less, but nearly a quarter (23.1%) of the population require at least 10 minutes for each redeploy.

It is also interesting to note that the peaks at “About 5 minutes” and “10 min or more” were selected more often than close alternatives. These self-reported estimates would likely be more accurate if redeploys were actually timed.

Number of redeploys in an hour of coding – India vs RoW 2011

When asked how many times per hour the respondent redeploys their application (“In an hour of coding, how many times do you redeploy?”), the responses resulted in an average of 3.3 redeploys with a standard deviation of 2.9.

Over 50% of respondents redeploy once or twice an hour. The most frequent selection was 1x hour (Once), which is not surprising considering the statistically high proportion of developers with redeploy times of 10 minutes or more – it’s possible to imagine that they may not even be permitted to redeploy more than once per hour.

At the tail end, we see that a statistically significant segment redeploys more than 10 times per hour. In the raw data, nearly 40 respondents who redeploy 10 or more times per hour have redeploy times greater than 5 minutes, which means that they spend 80% or more of each hour redeploying. This brings up an issue that we’ll address in the next section.

For comparison, in the RoW report we found that only 18% of respondents said they redeploy once per hour, 12% said they redeploy 5x per hour, and 11% said they redeploy 10x or more per hour.

Those respondents that do not redeploy were asked to give some indication about this. Here are the first dozen or so responses to this question (these responses are unedited):

  • Not in one hour about 8 Hours work, we redeploy.
  • Deployment of code on dev server is once in day.
  • I am using Struts so until all the 3 layers i.e UI, business logic, and DAO are complete you cannot deploy
  • We redeploy our project in every 4 months gap.  Since its a security product, we cannot redeploy it in hours. (this response indicates that there is confusion between a development and production deployment – something we’ve found globally)
  • We are using Grails framework, which reflects most of the changes on the go
  • Actually I am working in very small firm so I directly work on server which is Linux and i use apache and jdk for db I use mysql
  • At times I don’t redeploy since I can attach my IDE to the container. For some scenarios hot deployment is helpful when there are no structural changes.
  • We had integrated JBoss with our ide. once the code is saved it will be updated in the server and that process is automated through the ide integration.
  • WE develop it in WebSphere IDE using its test server. Once the developement is done. we pack it in war /ear file and the deploy it.
  • I use Jetty for UI changes which does the hot – deploy for GWT based application and in case there is any server side changes, then I just replace the maven jar of that module and restart tomcat.
  • I always work in debug mode so I don’t need to redeploy often.
  • We use the Intellij Idea / incremental change deploy process. We need not re-deploy the entire war
  • I am using JRebel which will automatically reload the modified class file in the server, so the change would reflect immediately instead of doing a build and deploy.

Amount of coding time per hour spent redeploying – India vs. RoW 2011

Total turnaround time per hour of coding is calculated as the length of a single redeploy multiplied by the number of redeploys per hour.

The average respondent spends about 11.8 minutes per hour (vs. RoW, at 10.5 min/hour) redeploying with a standard deviation of 13.2 (RoW: 9.8). This is almost 20% of total coding time (RoW: 17.5%), and adds up to over 20 hours per month in time wasted. If you consider 5 hours out of each day as “coding time”, and assume 4 weeks out of each year are vacation time, then on average 5.8 full, 40-hour workweeks (RoW: 5.25 weeks) per year are spent exclusively redeploying and restarting.
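As a back-of-the-envelope check, the arithmetic behind these figures can be sketched as follows. All constants come from the report’s stated assumptions, not new data; the small difference from the published 5.8-week figure comes down to rounding.

```python
# Back-of-the-envelope check of the turnaround-time figures above.
# All constants are the report's stated assumptions, not new data.
MINUTES_LOST_PER_CODING_HOUR = 11.8  # India average time spent redeploying
CODING_HOURS_PER_DAY = 5             # assumed coding hours in an 8-hour day
WORK_DAYS_PER_WEEK = 5
WORK_WEEKS_PER_YEAR = 48             # 52 weeks minus 4 weeks of vacation

minutes_per_day = MINUTES_LOST_PER_CODING_HOUR * CODING_HOURS_PER_DAY     # 59 min/day
hours_per_month = minutes_per_day * WORK_DAYS_PER_WEEK * (52 / 12) / 60   # ~21 h/month
hours_per_year = minutes_per_day * WORK_DAYS_PER_WEEK * WORK_WEEKS_PER_YEAR / 60
full_workweeks_per_year = hours_per_year / 40                             # ~5.9 weeks

print(f"{hours_per_month:.1f} h/month, {full_workweeks_per_year:.1f} weeks/year")
```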

Data for this chart has been cleaned to reflect the following:

  • Removal of developers who “redeploy instantly” – since they do not have to wait for redeployment, their environment is not affected by turnaround time.
  • Removal of developers who “don’t redeploy” – whether this is due to technology choice, a conscious decision, their role in the team, etc, these respondents are not affected by turnaround time
  • Removal of developers whose amount of redeploy time per hour of coding was >60 minutes – these responses are considered outliers, and some results were mathematically impossible (e.g. 144 minutes each hour spent redeploying). Even responses as high as 60 minutes should be considered strange; however, the fact that a developer feels he spends most of his day redeploying is a powerful indicator.

How would you interpret this data?

Measuring Efficiency in Java development 

As in the RoW report, we saw large standard deviations across the board for India, which speaks to the level of variation among development environments. For this survey, we wanted to provide readers with a view into development efficiency, and to attempt to define what is considered efficient or inefficient. As in the past, we make the assumption that 5 hours of each 8-hour work day is spent writing code, although we’ve been told that 6-6.5 hours is more accurate. The following is a subjective view of efficiency, and we acknowledge that.

For the purposes of color-coding, we have segmented the data from our Indian report as follows:

  • Efficient – 26% of respondents. Developers spend 5% or less of their coding time redeploying. This consumes a maximum of 75 minutes per week, which is still over 1.5 work weeks/year.
  • Average – 19% of respondents. Developers spend >5 – 10% of their coding time redeploying. This adds up to 76 – 150 minutes per week, which costs as much as 3.25 work weeks/year.
  • Inefficient – 21% of respondents. Developers spend >10 – 20% of coding time redeploying. This equals 151 – 300 minutes per week, or as high as 1 hr per work day, per developer. Annually, 6.5 work-weeks are lost to Turnaround Time.
  • Very inefficient – 17% of respondents. Developers spend >20% – 33% of coding time redeploying. This costs 301 – 500 minutes per week, which is over 33 hours per month and 10+ work-weeks per year. What is the opportunity cost of 10+ work weeks?
  • Highly wasteful – 18% of respondents. Sadly, there are some development teams that, according to their own numbers, spend 1/3 or more of their coding time waiting for redeploys. This is 20 or more minutes per hour, more than 1 full workday each week, and approximately 3 months per year.

As we mentioned, this is subjective segmentation and your work environment may have a different view on efficiency.
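As a rough illustration, the segmentation above could be expressed as a small classifier. This is a hypothetical helper, not part of the survey tooling; the thresholds are taken directly from the list, interpreted as fractions of coding time spent redeploying.

```python
# Hypothetical helper reproducing the report's (subjective) efficiency bands.
# Thresholds are fractions of coding time spent redeploying, as listed above.
def efficiency_band(redeploy_minutes_per_coding_hour: float) -> str:
    """Classify turnaround-time waste per the report's color-coded segments."""
    share = redeploy_minutes_per_coding_hour / 60.0
    if share <= 0.05:
        return "Efficient"          # 5% or less of coding time
    elif share <= 0.10:
        return "Average"            # >5 - 10%
    elif share <= 0.20:
        return "Inefficient"        # >10 - 20%
    elif share < 1 / 3:
        return "Very inefficient"   # >20% - 33%
    else:
        return "Highly wasteful"    # 1/3 or more of coding time

print(efficiency_band(11.8))  # the 11.8 min/hour survey average -> "Inefficient"
```

Note that by this segmentation, the average Indian respondent (11.8 minutes per coding hour, just under 20%) lands at the top of the “Inefficient” band.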


Well, we’d like you to draw your own conclusions, rather than listen to those of a small (but major award-winning) company :-)

As a follow-up to this report, we’d like to ask you for your opinions on this data, and for responses related to the following questions:

  • What influences the differences we’ve found?
  • Do you feel that this small sampling is somewhat accurate, or way off?
  • Why are WebSphere and WebLogic so much more popular in India than RoW?
  • Is productivity a greater or lesser concern in India than RoW?
  • What other similarities and differences have you found?

Please leave comments below, or let us know @jrebel