As announced in the recent Scala roadmap, starting with version 2.12 Scala will only run on Java 8 and later. InfoQ caught up with Adriaan Moors (Scala tech lead at Typesafe) and Jason Zaugg (Typesafe engineer) to hear more about this change and how Scala will make use of Java 8's lambda implementation.

InfoQ: What's the biggest driver behind making the change?

Adriaan: Targeting the Java platform has been instrumental to Scala’s success and rapid adoption. We’re keen to evolve with the platform to enjoy the improvements made to it and its eco-system. Native support for lambdas makes the Java 8 VM an even better host for Scala.

InfoQ: What do you think the biggest challenges are likely to be when making the change?

Adriaan: Moving to Java 8 is a natural evolution for us. For example, Scala 2.11 already has an experimental feature which emulates Java 8-style functions as much as possible on Java 6.

A function’s body is lifted into a private method of the enclosing class, and the anonymous class that’s instantiated to represent the function at run-time consists of a method that simply invokes the lifted method. By moving to Java 8, we no longer have to generate this anonymous class at compile time; instead we use the LambdaMetaFactory at run time. Similarly, the 2.11 type checker supports synthesizing Single Abstract Method classes from Scala function literals (when running under -Xexperimental).
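The SAM synthesis described above can be sketched in a few lines. `Transformer` here is a hypothetical single-abstract-method trait, not part of any library, standing in for a Java-style functional interface:

```scala
// Hypothetical single-abstract-method (SAM) type, analogous to a
// Java 8 functional interface.
trait Transformer {
  def apply(s: String): Int
}

object SamDemo {
  // Under 2.11's -Xexperimental (and by default from 2.12), the
  // compiler adapts a function literal to the expected SAM type.
  val length: Transformer = (s: String) => s.length

  // The pre-Java-8 encoding spells out the anonymous class by hand;
  // this is what the compiler generated behind the scenes.
  val lengthOld: Transformer = new Transformer {
    def apply(s: String): Int = s.length
  }
}
```

Both values behave identically at the call site; the difference lies in what the compiler emits: a synthetic anonymous class versus (on Java 8) an invokedynamic call site resolved by LambdaMetaFactory.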

These are the highlights of the technical side of the challenge. As you’d expect with a platform change, the social side is a very important factor. Everything about 2.12 is planned around facilitating the upgrade, and we are counting on the community to upgrade eagerly. Rapid adoption of Scala 2.12 is contingent on the availability of the core libraries, testing frameworks, IDE support and other tooling. That said, we realize not everyone will be able to upgrade to Java 8 immediately. To this end, we plan to limit the divergence between 2.11 and 2.12, both in the language and in the library. We want to make it dead simple for open source authors to cross-compile to both versions, which is an important part of making 2.11 a viable target for a longer period of time. The 2.12 roadmap has some more details on how we intend to execute this strategy.

InfoQ: From the initial blog post it looks like this is a fairly big change that touches a lot of Scala's subsystems. How big a team do you think you'll need to get it done?

Adriaan: It’s hard to put a number on the head count of “the Scala team”. As always, a significant part of the work will be done by the community. Several Typesafe engineers are working on Scala’s Java 8 support: specifically, Jason Zaugg is working on Java 8-style lambdas, and Lukas Rytz is continuing the work that Miguel Garcia started on the new ASM-based back-end and optimizer. The whole effort by its nature crosscuts the whole company: Akka & Play have great Java 8 APIs and we will keep refining them, the Scala IDE and sbt already support Java 8, as does all our other tooling.

InfoQ: Scala has had equivalents of basically the Java 8 capabilities for some time now. Are there any semantic differences between what Scala has, and what's arrived with Java 8 that you think will cause problems when adopting JDK 8 as a backend?

Jason: There are a few corner cases where the semantics of the JDK8 LambdaMetaFactory don’t work out of the box for us. For example, it doesn’t know how to box/unbox our Value Classes, or how to bridge a void-returning method handle to the generic return type of FunctionN::apply by using BoxedUnit. But we have plenty of options to adapt to these limitations: we can choose to stick with our current anonymous-class encoding in some cases, or we can emit accessor methods that perform the requisite boxing. Emitting correctly specialized versions of functions over primitives also requires some careful work. But we’ve prototyped solutions to these challenges and are pretty confident that there are no show-stoppers.
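The two corner cases Jason mentions can be illustrated with a short sketch; `Meters` is a hypothetical value class, chosen only for illustration:

```scala
// Hypothetical value class: a Double with no wrapper allocation in
// most positions, but boxed when seen through a generic interface.
final class Meters(val value: Double) extends AnyVal

object BoxingDemo {
  // Through the erased Function1#apply (Object => Object), the
  // underlying Double must be boxed into a Meters instance --
  // an adaptation LambdaMetaFactory cannot synthesize on its own.
  val double: Meters => Meters = m => new Meters(m.value * 2)

  // A Unit-returning function: the erased apply must return
  // BoxedUnit, where a Java lambda would simply be void.
  var hits = 0
  val record: String => Unit = _ => hits += 1
}
```

At the source level nothing special is visible; the extra boxing only appears in the bytecode, which is why the compiler may fall back to the anonymous-class encoding or emit bridging accessors for these shapes.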

We’re also planning to take advantage of default methods to compile traits (Scala's lightweight form of multiple inheritance). This encoding will be much more restricted: it won’t work for a trait with a method that overrides a method in a class, nor are fields supported, among other limitations. Default methods simply weren’t designed with this application in mind, but they are an important asset in designing for binary compatibility. Future JDK versions might offer more powerful tools here; Brian Goetz's recently drafted proposal for classdynamic sounds extremely promising for Scala.
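The restriction can be seen by comparing two traits; the names below are illustrative, not from any library:

```scala
// A trait whose concrete method could compile to a Java 8 default
// method: no fields, and nothing overriding a class method.
trait Greeter {
  def name: String                      // abstract interface method
  def greet: String = "Hello, " + name  // candidate default method
}

// A trait with state falls outside the default-method encoding:
// the var needs a field, and interfaces cannot hold instance state,
// so storage must still be mixed into each implementing class.
trait Counter {
  var count = 0
  def tick(): Unit = count += 1
}

class World extends Greeter {
  def name: String = "world"
}
```

For stateless traits like `Greeter`, a default-method encoding means adding or changing a concrete method no longer forces recompilation of every implementing class, which is the binary-compatibility benefit Jason refers to.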

InfoQ: Any additional comments on the overall design of the lambdas subsystem in JDK 8?

Jason: The JDK side of the lambda implementation seems very well conceived and executed. The key design insight centres around the use of invokedynamic (to defer the details of the encoding). This in turn builds on the forward-thinking specification for invokedynamic itself, which has passed the test of a great API: it has been used successfully in ways that the API authors could not have foreseen! That said, the way that lambdas are exposed through the Java language takes a different set of tradeoffs to Scala. The most notable difference is the use of Functional Interfaces, rather than a canonical set of generic types for Functions of various arities. This in itself isn’t such a bad thing: we’re extending our functions to support functional interfaces, too. But the selection of manually specialized functional interfaces seems somewhat ad hoc and makes it more difficult to write generic code.
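The tradeoff can be seen by putting Scala's single generic function family next to Java 8's hand-specialized interfaces. A sketch, using the standard java.util.function types:

```scala
import java.util.function.{IntUnaryOperator, Function => JFunction}

object InterfaceDemo {
  // Scala: one generic family, Function1[-A, +B], with primitive
  // specializations generated by the compiler.
  val inc: Int => Int = _ + 1

  // Java 8: separate, manually specialized interfaces
  // (IntUnaryOperator, ToIntBiFunction, ...). SAM conversion lets a
  // Scala function literal target them directly.
  val jInc: IntUnaryOperator = (i: Int) => i + 1
  val jShow: JFunction[Integer, String] = (i: Integer) => "n=" + i
}
```

Generic code over Scala functions needs only `Function1`; generic code over the Java interfaces has to enumerate each specialized variant, which is the difficulty Jason points to.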

InfoQ: Java 8 obviously brings in the capability to use some basic functional idioms (map, filter, etc). Do you see that as a threat or an opportunity for Scala? Is it going to take away developers who would otherwise have used Scala, or is it increasing the size of the pool of devs who could become Scala users?

Adriaan: I’m absolutely convinced it’s a great opportunity for Scala. Functional programming is not just about lightweight syntax for function literals and some more type inference for polymorphic method calls. That's just the start. To me, it’s about building and understanding programs by composing abstractions of the right size. Scala offers the whole gamut: from composing and defining single-method abstractions (functions) to multiple coherent classes (traits). To me, a (statically-typed) FP language:

1) Focuses on composing small, easy-to-understand units of functionality. Functions are easy to understand because they only depend on their arguments. The types of the functions and the combinators that are used to combine them capture the higher-level structure of the program. Types are kind of like high-tech plumbing here: essential to guide the data correctly, while accommodating structural changes without getting in the way. This is only true if the plumbing is hidden by the language. Scala’s powerful local type inference is essential to easily consume an FP library, because such libraries tend to make heavy use of generics. Java 8’s limited type inference can be augmented by IDE support, but that leads to the age-old problem of maintaining IDE-generated boilerplate about types.

2) Has a type system that lets library designers express these safe higher-order abstractions succinctly. Scala has long supported higher-order parametric polymorphism ("higher-kinded types" or "type constructor polymorphism"), as well as ad-hoc polymorphism. Java’s generics are strictly first-order, and overloading is an extremely limited form of ad-hoc polymorphism that causes more pain than benefit.

3) Encourages immutability. Immutable code is usually easier to understand. It also has clear benefits for scaling out your application (using all cores in your machine or across a datacenter) and for compartmentalizing failure, using Akka, of course. Concretely, mutation often causes hard-to-diagnose entanglement between different parts of your code. Scala is pragmatic here as well: val and var intentionally only differ in one letter (though the Scala IDE highlights references to mutable variables in red).

4) Is biased towards a fixed data model with evolving operations, by defining functions in terms of pattern matching on this data (and making sure all cases are covered); Scala complements and deeply integrates this with OOP, where the focus is on an evolving data model, with constant functionality (see the Visitor pattern).
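Points 2 and 4 can be sketched concretely; the names below are illustrative, not from any library:

```scala
// Point 4: a fixed data model. Sealing the hierarchy lets the
// compiler warn when a pattern match misses a case.
sealed trait Shape
final case class Circle(r: Double) extends Shape
final case class Rect(w: Double, h: Double) extends Shape

object FpDemo {
  def area(s: Shape): Double = s match {
    case Circle(r)  => math.Pi * r * r
    case Rect(w, h) => w * h   // omit a case and the compiler warns
  }

  // Point 2: higher-kinded polymorphism. F[_] abstracts over the
  // type constructor itself, which Java's first-order generics
  // cannot express.
  trait Functor[F[_]] {
    def map[A, B](fa: F[A])(f: A => B): F[B]
  }
  val listFunctor: Functor[List] = new Functor[List] {
    def map[A, B](fa: List[A])(f: A => B): List[B] = fa.map(f)
  }
}
```

Adding a new operation over `Shape` is a single new function; adding a new case is a change to the data model that the compiler then tracks through every match, which is the FP/OOP duality Adriaan contrasts with the Visitor pattern.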

InfoQ: Finally, one of our editors was slightly confused by the fact that 2.12 is a major release. Any plans for a numbering change to Scala 3?

Adriaan: We distinguish two kinds of upgrades that require action from our users: those that require only recompilation (they preserve source compatibility but not binary compatibility), and those that are more involved, requiring source code changes. We reserve the most significant component of the version (the epoch) for the latter kind of upgrades, which are "rare" (let’s say, once a decade). We release a new major version every 18 months or so and this bumps the middle part of the version number. To ensure smooth upgrades, the same code base should compile without modification on adjacent major versions of Scala – as long as deprecation warnings were addressed. Deprecation is a crucial part of our process: we do our very best to (conservatively) balance stability and evolution, and will not break your code lightly, but we do believe everyone benefits from a healthy pace of innovation.

Finally, minor versions of Scala are indistinguishable drop-in replacements: they should be forward and backward binary compatible. Their cadence is variable: early in the release cycle they may happen every other month, slowing down to quarterly releases as our development focus shifts to the next major release.

As explained in the roadmap, we plan for cross-building between the 2.11 and 2.12 major versions to be at least as easy as cross-building for 2.10 and 2.11. When we do break source compatibility in Scala 3, we will do so very conservatively and for good reasons like simplifying the language and speeding up the compiler.

When going from Scala x.y.z to (x+1).y.z, you will likely have to change your source code or have a tool do it for you. When upgrading from x.y.z to x.(y+1).z, you should only need to recompile assuming you dealt with deprecation warnings in the old version, because deprecated members may have been removed in the new version. Finally, Scala x.y.a and x.y.b should be able to be used interchangeably.

About the Interviewees

Adriaan Moors leads the Scala team at Typesafe. He started programming in Scala in 2006, when he was experimenting with datatype-generic programming, which makes heavy use of type constructor polymorphism. During an internship at the Scala team, he implemented this feature in Scala 2.5 (making him the first external contributor to the Scala type checker). After graduating, Adriaan joined the Scala team as a post-doc, working on Scala's theoretical foundations as well as its implementation (dependent method types and implicit search, type constructor inference, 2.10's new pattern matcher).

Jason Zaugg is a software engineer in the Scala team at Typesafe. He has been coding in Scala for the last six years, both in enterprise and open source projects, and is now devoted to advancing the Scala platform itself.