I wrote From Java to Ruby: Things Every Manager Should Know not for programmers, but for technical decision makers. Ruby advocates have done an excellent job of helping new developers understand the intricacies of Ruby and the flagship Rails framework, but less information is available for managers and executives deciding between technologies. In the last article of this series, I discussed strategies for establishing a pilot project in a Java shop. In this article, I'll examine the changing risk profiles for Java and Ruby.

In the mainstream, "Ruby is risky" is a common perception, and for good reason. New languages are inherently risky. As Ruby on Rails moves closer to the mainstream, that risk will decrease, because you'll have access to a growing set of programmers, components (called gems or plug-ins), books, and business partners. You'll also see the mainstream opinion that "Java is safe". On this point, I strongly disagree. As any language bloats, the risk will often increase. To understand what's happening on this front today, it pays to examine Java's initial adoption.

Technology adoption profiles

Many analysts have described models for technology adoption. One of the most popular was defined in Iowa to describe the adoption of agricultural products, and was later described in the context of technology in a book called Crossing the Chasm by Geoffrey A. Moore. In it, Moore describes the technology adoption cycle in terms of five distinct groups:

Technologists. This group is predisposed to adopt new technologies. Any promising technology can attract this group.

Early adopters. This group will adopt new technologies for competitive advantage, regardless of whether they are successful in the mainstream.

Pragmatists. This group will adopt technologies once they become mainstream, or have a steep enough growth curve to effectively assure widespread adoption.

Conservatives. This group will adopt technologies only after it becomes necessary.

Skeptics. This group may adopt very late, or may never adopt a given technology.


Moore argues that the key to technology adoption is getting pragmatists on board. Since pragmatists require mass adoption, they want to see other pragmatists using a technology before they will commit to it. It's a catch-22: you can't get pragmatists without other pragmatists. For that reason, you'll often see a downward trend in a market acceptance curve after the early adopters are on board, but before the pragmatists. Moore called this downward trend the chasm, and this notion should be at the center of a risk discussion surrounding any new technology.

Moore's solution was to focus on crossing the chasm in stages. Normally, you can't cross the chasm with one big leap. You need to niche-market. Java did so by attacking Internet clients first with applets, and then moving into server-side computing and other niches such as mobile computing and enterprise architectures.

In Beyond Java, I argue that the chasm for programming languages is especially severe. Most of us recognize that an investment in Lisp may lead to productivity gains, but will also make it harder to find programmers, education, libraries, and components. We'll also have to spend more than we'd like to do any significant integration. For this reason, the mass market adopts a major new programming language only every ten years or so. You can easily see this trend in server-side programming languages. Fortran and COBOL emerged in the late 1950s, C in the early 1970s, C++ in the mid 1980s, and Java in 1996. I'd throw C# into the mix as effectively a Java clone, though there's room for some argument. Many other languages emerged over this time, but none received the dominant adoption of those above. Risk is the overriding reason that so many resist adopting a new programming language.

Java's risk profile

Java once had to overcome high risk. At the time, most server-side programming was in the C++ programming language. C++ was effectively a systems language, adapted to applications development. The C family succeeded in that space because client-server development--and user-interface development--demanded a combination of performance and flexibility that was not available in many languages of the time. To overcome the risk of adopting a new language, Java needed three conditions to be true:

C++ developers had to experience a high level of pain. Pointer arithmetic (combined with the lack of compile-time safety) led to a difficult family of bugs. Manual memory management made leaks commonplace. C++ was simply too difficult for many applications developers. These problems increased the risk profile for C++.

Java needed to solve some problems that C++ could not. The Java language provided simplicity, portability, and libraries that C++ couldn't touch. These factors reduced the overall risk profile for Java, keeping teams smaller and radically improving productivity.

Java needed a catalyst. With the exploding Internet, applets embedded in Netscape gave C++ developers a compelling reason to take a look at Java. The C++-like syntax simplified the transition. Java was able to quickly build a massive community, and a backlash against Microsoft accelerated the transition.

Java's explosion was bigger than anything we'd seen before, and much larger than anything we're likely to see again in my lifetime, but the blueprint is clear. To establish a new language, the old language must be painful, the new language must overcome that pain in a compelling way, and the new language must rapidly accumulate a community through some catalyst.

Java got a foothold quickly as an Internet applications language on the client side. Though the toehold with applets was tenuous, Java quickly moved onto the server side because it offered features that application developers found useful, including:

Memory management

A cleaner inheritance model

Better features for object orientation

Portability

Internet libraries

Security

...and many others. In my opinion, Java is the most successful programming language of all time. Over time, through growth, Java became less risky, and eventually dominated the market for server-side Internet programming. Commercial investment, the pool of programmers, available education, open source frameworks, and many kinds of published information all drive risks down. The reason is intuitive and clear.

Risk associated with a programming language decreases dramatically with market share once the language crosses the chasm.

Java has had an amazingly successful run. But programming languages do not remain the state of the art indefinitely. All successful languages bloat, because they must adapt to the changing needs of their users. Successful programming languages cannot move as quickly as others, because they must maintain a certain level of backward compatibility to satisfy a growing user base. As the technology lags and the language bloats, a different kind of risk profile emerges: risks related to market share decrease, while risks based on the programmer's ability to get work done effectively increase.

So far, I've focused on the marketplace risks of an emerging technology. As Java reaches its 10th year, another kind of risk assessment becomes necessary. Many influential books, such as The Mythical Man Month, Death March, and Peopleware preach about a different kind of risk:

Poor productivity leads to larger teams and longer schedules

Risk increases with project length

Risk increases with the size of a team

Quality risks, measured in the numbers of bugs, increase with the size of a code base

Risk increases with cost

Integration costs increase with complexity

As a programming language--or even a programming paradigm--ages, the language will often slip in terms of productivity, and expressiveness, relative to the state of the art. Project teams will need to increase in size, and programmers will need to write more lines of code to solve the same problem. Both of these factors inherently increase risk. All of these factors lead to an inevitable conclusion.

Toward the end of market dominance, productivity risks associated with a language will increase relative to the state of the art.

Whether and how this happens within the Java language is the subject of intense debate. Certainly, Java remains the best language for solving a whole host of enterprise problems, such as very large projects, or those with demands such as two-phase commit or hardcore object-relational mapping. Java's commercial investment has never been stronger, and the community is at an all-time high. But cracks in the foundation may be beginning to appear.

Java's Enterprise JavaBeans framework, WS-* style web services, and Java EE have come under increasing criticism for complexity and sagging productivity. James Duncan Davidson, one of the fathers of the servlet, says Java is no longer as approachable as it once was. It's harder to educate a typical Java developer to solve the most common programming problem: database-backed web applications. Circumstantial evidence is emerging that frameworks in other languages, most notably Ruby on Rails, are several times as productive for solving niche problems. High-profile Java developers--James Duncan Davidson, Mike Clark, Justin Gehtland, Stuart Halloway, and many others--have reported very high productivity after using Rails in that important niche: greenfield database-backed web applications. Certainly, my own experience is that I can build, deploy, and maintain such applications with far less effort using Ruby on Rails.

These reports will be broadly debated, just as the early reports of Java's productivity were. Remember, Java emerged first in a variety of niches before it expanded more broadly. Programmer productivity was one of the most important criteria driving Java's early growth. Keep in mind Moore's theory for the emergence of technologies. You'll best cross the chasm not with one giant leap, but one niche at a time.

I strongly believe that complexity and sagging productivity are driving Java's risks up now.

Inherent Ruby risks

Ruby is no different than any other emerging programming language. Lack of commercial investment, a limited pool of developers, and lack of experience all will add risk to an emerging language. Here are the biggest risks I've encountered.

Lack of talent. It's harder to find existing Ruby developers. As it did with Java, that fact will change quickly, but right now, if you need to build large teams in a short time, you're better off with an established market leader such as Java.

Lack of experience. Some LAMP languages have established track records: Google uses Python; many major dot-coms use Perl or C. There's not yet a flagship account for Ruby that demonstrates massive scalability or complex enterprise integration. We just don't know whether it can solve a certain class of problems.

Deployment and profiling strategies. Ruby on Rails has been out for less than a year, so deployment and profiling experience isn't nearly as rich as it is for competing languages.

Lack of libraries. Ruby does not have nearly as rich a set of libraries as Java.

Lack of commercial investment. You have to work harder to find Ruby consulting, education, or contractors, and off-shoring is practically nonexistent.

There are many others. Still, you can effectively mitigate risks associated with Ruby. Take performance-related risks. Though the body of knowledge around large-scale Ruby deployments is limited, you can learn if you look in the right places. The industry has a wealth of knowledge about other LAMP languages such as PHP, Perl, and Python. The deployment mechanisms, web servers, and shared-nothing strategies for scalability are all similar.

Or consider staffing. Don't underestimate your ability to build an effective staff through internal training. My training schedule for new Java developers for Spring, Eclipse, Hibernate, and WebWork is effectively five times as long as a similar schedule for a Ruby on Rails developer. You can do well by starting with a programming language with characteristics similar to Ruby, such as Perl, Python, or Smalltalk. If you want to build a programmer from scratch, you'll probably build a productive Ruby developer at least as fast as you can train a Java developer how to use the latest bevy of frameworks.

And think about libraries. How much do you really need? If you need distributed two-phase commit, use Java. If you need perfect integration with Microsoft Office macros, use .NET. But if you're building operating system scripts for integration, or greenfield database-backed applications, Ruby will have just about everything you need. And you can often build what you need if it's not there. I work with one company that built their own database driver in two weeks, but more than made up that time over the rest of the project. I talked to another that extended Oracle support by patching existing code in four hours. ThoughtWorks built RBatis, Ruby's version of iBATIS, in a very short time.
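The kind of quick patching those teams relied on works because Ruby classes are open: you can reopen any class, even one shipped in a third-party library, and replace or add methods without touching its source. A minimal sketch, using a hypothetical Legacy::Adapter class standing in for a shipped database driver:

```ruby
# Imagine this class ships inside a third-party gem (hypothetical example).
module Legacy
  class Adapter
    def connect
      "generic connection"
    end
  end
end

# Reopen the shipped class and patch just the behavior you need.
# No source access or recompilation is required.
module Legacy
  class Adapter
    def connect
      "oracle-aware connection"   # replaces the gem's implementation
    end

    def supports_oracle?          # adds a capability the gem lacked
      true
    end
  end
end

adapter = Legacy::Adapter.new
adapter.connect            # => "oracle-aware connection"
adapter.supports_oracle?   # => true
```

This is the mechanism behind a four-hour Oracle patch: the fix lives in your own code base and is applied at load time, so you never fork the library.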

So Ruby's risks are often overstated when you consider the whole picture, especially if Java is not giving you everything you need. The best way to put these risks into perspective is often to try Ruby for yourself. Use Rails to build something nontrivial, and make a call based on what you find. Don't buy into the myths.

Myth versus reality

Rails is a silver bullet.

People have failed with Rails, and many more will fail. If you apply it without the requisite skills, you'll fail too.

On a similar note, if Java's not your problem, Ruby will not be the answer. Most software development problems are not related to technology. If you're thrashing, Ruby on Rails will only help you thrash faster.

Choosing Ruby is too risky, because you could guess wrong.

The primary risk of adopting any new language is that you'll guess wrong, and be left with a stagnated set of libraries. That's certainly a significant risk, but that problem is in no way limited to just Ruby. Within Java, you need to make potentially dozens of small decisions about major libraries, any of which can leave you with a struggling, stagnating code base. Should you pick Spring, or EJB 3 for declarative transactions? Is the Java Persistence API the right choice, or is Hibernate ultimately the answer? What's the right answer for the web MVC layer: a fading Struts, or something cleaner?

Within Ruby, choosing a web development framework is much easier. You'll likely be working with Rails. The dynamic nature of the language also makes it easier to decouple layers of the architecture, making certain decisions much less invasive than their Java counterparts.
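The decoupling comes from duck typing: a layer depends only on the messages it sends, not on a concrete class or declared interface, so you can swap implementations without rewiring the architecture. A small sketch with illustrative names (OrderService, InMemoryStore are invented for this example):

```ruby
# The service layer depends only on an object that responds to #save --
# no interface declaration, no shared base class.
class OrderService
  def initialize(store)
    @store = store    # any object with a #save method will do
  end

  def place(order)
    @store.save(order)
  end
end

# A stand-in for the persistence layer, useful in tests or prototypes.
class InMemoryStore
  attr_reader :records

  def initialize
    @records = []
  end

  def save(record)
    @records << record
    record
  end
end

store = OrderService.new(InMemoryStore.new)
# Swapping in a real database-backed store later requires no change
# to OrderService, because the coupling is only to the #save message.
```

In Java, the same decoupling usually requires defining an interface up front and threading it through every layer; in Ruby, it falls out of the language.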

It's always easier to staff a Java project.

Java does have a much larger pool of developers, but the community has significant fragmentation. If you want to use an integrated stack, your choices are limited. Even if you do choose a popular stack such as Spring, your developers must learn potentially dozens of libraries that are specific to a given project. In this case, Java's core strength, a plethora of libraries, works against it. In contrast, most Ruby developers know Rails. Also, you typically need more Java developers to handle a similar task. Sometimes, staffing for Java is easier. Sometimes, it's not.

Rails cannot scale.

Ruby on Rails actually has good scalability. The caching model is strong, and the shared-nothing architecture has proven effective dozens of times over within the LAMP community. In reality, we know that Ruby on Rails can scale to moderately large applications. We don't yet know whether it can handle very large application deployments, but nothing inherent in the architecture leads me to believe that it is a dead end. For typical applications, the latency is in the database anyway.
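The core idea behind that caching model is simple: render an expensive fragment once, then serve every subsequent request from the cache. This is not the Rails API itself, just a plain-Ruby sketch of the compute-once principle that makes it scale:

```ruby
# A toy fragment cache: the block is evaluated only on a cache miss.
class FragmentCache
  def initialize
    @store = {}
  end

  def fetch(key)
    return @store[key] if @store.key?(key)  # cache hit: no recomputation
    @store[key] = yield                      # cache miss: compute and store
  end
end

cache = FragmentCache.new
renders = 0
render_sidebar = -> { renders += 1; "<ul>...expensive fragment...</ul>" }

# Three "requests" for the same fragment...
3.times { cache.fetch("sidebar") { render_sidebar.call } }
renders  # => 1, the fragment was rendered only once
```

Because each application server holds no per-user state (the shared-nothing part), you scale by adding identical servers behind a load balancer, with caches absorbing the repeated rendering work.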

Rails integration options are too limited.

Rails has very good support for REST-based web services. Ruby also has emerging support for the JVM through JRuby, and for Microsoft's CLR through a separate project. Good messaging options are emerging as well. In the end, you'll be in good shape if you pick the best tool for the job. Good teams can succeed with either Java or Ruby.

Wrapping up: What actions can you take?

If you're considering using Ruby, there's a wealth of information at your fingertips. Talk to people who have done both Java and Ruby effectively. Read about the frameworks. Check out From Java to Ruby. If you don't think you can leave Java but want a lightweight development experience, check out the Java projects that give you better leverage, such as RIFE, JMatter, or Wicket. If you think Ruby might be a good choice, consider these suggestions:

Pick the right tool for the job. Ruby on Rails is not a silver bullet. It's a highly-tailored environment for database-backed web applications. It will work much better with new database schemas, or those you can modify to take advantage of Rails defaults.

Plan your team ramp-up carefully. You won't be able to throw out an ad on Monster.com and staff the project in three days. You might want to consider training some or all of your developers, and recruiting a few top Rails developers, or taking on some limited consulting help to jump-start things.

Know your legacy integration points. Often, the hardest part of a project is defining interactions with external systems. Your initial proof-of-concept work should work through some of these touch points, at least to the point where you're comfortable with your solutions.

If you're not sure, do a pilot, or go with the conservative option. The best risk mitigation is always good judgment.

About the author

Bruce Tate is a mountain biker, kayaker and father of two in Austin, Texas. He has written nine programming books, including two on Ruby and five on Java. He is the founder of RapidRed, a company with a focus on lightweight development technologies including Ruby and Rails, offering development, consulting, and training. Bruce is recognized worldwide as an excellent speaker, programmer, trainer, and consultant.