Is node.js turning into the new Java?

It's been quite a ride for node.js. A group of core developers secured a nice series A round to start npm, Inc., bringing a corporate steward to the growing repository site. Success stories about node.js at various companies seem commonplace, with new ones appearing weekly. Recently, NodeSource closed its series A round to bring node.js to enterprise customers. All this excitement for node.js and JavaScript was succinctly summed up in 140 characters:

The adoption of a new technology isn't the interesting part of the story. Engineers, hackers and technically minded individuals like the challenge of learning and adopting a new technology. It's more interesting to see the blatant bashing of Java, the technology node.js is trying to replace. For instance:

"Monolithic applications, mainly written in Java, are killing development cycles, stifling innovation and keeping Java-heavy IT organizations many steps behind their competitors, especially those who have embraced a microservices architecture."

-- Joe McCann, NodeSource Co-Founder and CEO (source)

This comment is only fair if you want to compare writing Java 1.2 code with writing node.js v0.10 code. The problem being highlighted isn't with Java; it's with the code being produced by the average Java developer.

The code being produced by the average Java developer sucks. It's not that Java hasn't kept up with advances in software development. The problem is that the average Java developer is still using old patterns and libraries to solve their problems. Take non-blocking I/O (NIO). Java added support for NIO back in Java 1.4, yet most answers on how to perform I/O in Java still highlight the blocking java.io package. The same can be seen in how to build a service. We used to bundle things in a WAR file and deploy it to Tomcat. It was painful and lacked dependency isolation, so people just built bigger services. Today, a Java developer can use Dropwizard to roll their application up into a standalone service, enabling true microservices.

Even though you can write Java that doesn't suck, developers won't use the new functionality in Java unless Oracle breaks the language and improves the default behaviors. When node.js was created, the design of JavaScript made blocking I/O impossible, and thus callbacks were built into the platform. Even though promises provide a lot of benefits, the average node.js developer will solve the problem with callbacks. Why? The core library uses callbacks, so the concept is introduced early on. Armed with a tool, the average developer will continue to use that tool regardless of any better or more efficient tool that may be offered to them.

This pattern isn't new either - look at C++. Everyone who learned C++ was first introduced to allocation via pointers and the new keyword. While better options existed, such as allocating objects on the stack or using auto_ptr from the standard library, developers continued to allocate memory with raw pointers. Combined with sloppy designs, this made memory leaks and null-pointer dereference bugs common in C++ code. Java solved this problem by making garbage collection the default regardless of how an object was allocated. In the end, Java and its garbage collector won partially because it was the default behavior of the platform. (Funny enough, Java and JavaScript code still suffers from memory leaks and null-pointer dereference bugs.)

All that aside, I love writing node.js code, and if I'm honest with myself, it's not for any technical merit I couldn't achieve in Java. It's simply because everything in node.js is a greenfield project and there isn't a lot of sucky legacy code to deal with. It's the same feeling I had about Java in 2000. Yet if history is any indication, the next few years will attract the average Java developer to node.js, bringing with them the problems of sucky code that never gets refactored.