Gradle Release Notes

Version 2.9

The Gradle team is pleased to bring you Gradle 2.9, delivering significant performance benefits together with some major enhancements to the Gradle TestKit.

Gradle 2.9 brings both faster incremental build speeds and reduced memory consumption. All builds can benefit from these changes, but the improvements should be particularly noticeable in large builds with many source files.

This release also brings further improvements to the Gradle TestKit. With support for debugging, cross-version testing, and capturing build output, Gradle TestKit now makes it easier than ever to develop and test Gradle plugins.

Within the experimental Java software model, the ability to declare the API of a JVM library brings a number of advantages. Separation of API and implementation is enforced at compile time, and recompilation is avoided where possible. This feature also provides a path for migrating to the Java Module System coming in JDK 9, providing build-time enforcement of concepts that will be enforced at runtime in Java 9.

New and noteworthy

Here are the new features introduced in this Gradle release.

Easier Gradle Plugin development with Gradle TestKit

The Gradle TestKit was introduced in Gradle 2.6, and provides support for developing and testing Gradle plugins. This release delivers significant improvements to TestKit, with support for debugging, cross-version testing, and capturing build output.

Easier debugging of functional tests

The Gradle TestKit facilitates programmatic execution of Gradle builds for the purpose of testing plugins and build logic. This release of Gradle makes it easier to use a debugger to debug build logic under test.

In order to provide an accurate simulation of a Gradle build, the TestKit executes the build in a separate process by default. This facilitates more accurate testing by preventing interference between the build environment and the test environment. However, it does mean that executing a test via a debugger does not automatically allow debugging the build process.

To support debugging, it is now possible to specify that the build should be run in the same process as the test. This can be done by setting the org.gradle.testkit.debug system property to true for the test process, or by using the withDebug(boolean) method of the GradleRunner. Please see the Gradle User Guide section on debugging with the TestKit for more information.

Test plugins against multiple Gradle versions

It is now possible to use the GradleRunner to execute builds with arbitrary Gradle versions and distributions. This feature is extremely useful for verifying a plugin's functionality with a range of different Gradle versions. The version to use can be specified via the new GradleRunner.withGradleVersion(String) method. Please see the section in the User Guide on specifying versions for more information.
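The debugging and cross-version features are both configured on the GradleRunner. A minimal sketch in Groovy (the project directory and task name are hypothetical, and the TestKit library is assumed to be on the test classpath):

```groovy
import org.gradle.testkit.runner.GradleRunner

// Cross-version testing: execute the build under test with an older Gradle version
def crossVersionResult = GradleRunner.create()
        .withProjectDir(testProjectDir)   // hypothetical directory containing the build under test
        .withArguments('helloWorld')      // hypothetical task
        .withGradleVersion('2.8')
        .build()

// Debugging: run the build in the same process as the test,
// so breakpoints set in the plugin's build logic are hit
def debugResult = GradleRunner.create()
        .withProjectDir(testProjectDir)
        .withArguments('helloWorld')
        .withDebug(true)
        .build()
```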
Capturing build output

When using the GradleRunner to programmatically execute Gradle builds for testing plugins and build logic, it is now possible to capture the output from the build under test. By default, no output is captured.

The new forwardOutput() method can be used to route the output from the build under test to the output stream of the process using the Gradle runner. This is often convenient in a testing context, as output generated by the test is typically associated with the test results (e.g. in the IDE UI or test results report). If more control is needed, the new forwardStdOutput(Writer) and forwardStdError(Writer) methods can be used.

Performance improvements for incremental builds

In many cases, Gradle 2.9 is much faster than Gradle 2.8 when performing incremental builds. Very large builds (many thousands of source files) could see incremental build speeds up to 80% faster than 2.7 and up to 40% faster than 2.8.

Faster up-to-date checking for incremental builds

Gradle now uses a more efficient mechanism to scan the filesystem, making up-to-date checks significantly faster. This improvement is only available when running Gradle with Java 7 or newer. Other improvements have been made to speed up include and exclude pattern evaluation; these improvements apply to all supported Java versions. No build script changes are needed to take advantage of these performance optimizations.

Reduced memory footprint for incremental builds

Gradle now uses much less memory than previous releases when performing incremental builds. By de-duplicating Strings used as file paths in internal caches, and by reducing the overhead of listing classes under test for Java projects, some builds use 30-70% less memory than Gradle 2.8. Reduced memory consumption can translate into significant performance improvements when a build process is running low on memory. No build script changes are needed to take advantage of these memory savings.
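Returning to the TestKit output-forwarding methods described earlier, a functional test might capture build output for later assertions. A sketch in Groovy (directory, task, and expected message are hypothetical):

```groovy
import org.gradle.testkit.runner.GradleRunner

def stdOut = new StringWriter()
def stdErr = new StringWriter()

GradleRunner.create()
        .withProjectDir(testProjectDir)   // hypothetical directory containing the build under test
        .withArguments('helloWorld')      // hypothetical task
        .forwardStdOutput(stdOut)         // capture standard output for later assertions
        .forwardStdError(stdErr)
        .build()

assert stdOut.toString().contains('Hello world!')   // hypothetical expected message
```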
Explicit declaration of library API with the experimental Java software model

Developing with the experimental Java software model is now more powerful, with the ability to explicitly declare which packages and dependencies make up the API of a library. Declaring the API of a JVM library has many benefits, including:

- Preventing the accidental leakage of "internal" classes to downstream consumers. Code compiled against the library will only have access to the API.

- Removing the need to recompile downstream consumers when the signature of the library has not changed. This can significantly improve performance by avoiding recompilation.

- Providing a path for migrating to the Java Module System coming in JDK 9. With this feature, Gradle can provide build-time enforcement of the type of separation that JDK 9 will bring at runtime.

Declaring packages that belong to the API of a JVM Library

It is now possible to declare the packages that make up the API of a JVM component. Declaring the API of a component is done using the api { ... } block:

    model {
        components {
            myJvmLibrary(JvmLibrarySpec) {
                api {
                    exports 'com.acme'
                }
            }
        }
    }

Gradle will automatically create an API jar for the 'myJvmLibrary' component. Components that depend on that component will be compiled against the 'myJvmLibrary' API jar. The API jar will only include classes that belong to declared api packages. As a consequence:

- Attempting to compile a consumer that accesses a non-API class will result in a compile time error.

- Updating a non-API class will not result in the recompilation of downstream consumers.

- Downstream consumers will not be recompiled when an API class is changed in a way that does not change the signature of the API (e.g. changing the implementation of a method body, renaming parameters, adding a private method).

Declaring dependencies that form part of the API of a JVM Library

As well as exported packages, the library API can include types from dependent libraries. In this case, we say that the API of the dependent library is included in the library API. Dependencies that are included in the API are declared in a similar way to regular compile dependencies, but they are declared within the api { ... } block.

    model {
        components {
            logging(JvmLibrarySpec) {
                api {
                    exports 'my.logging.api'
                }
            }
            myJvmLibrary(JvmLibrarySpec) {
                api {
                    exports ...
                    dependencies {
                        library "logging"
                        library "utils" project ":util"
                    }
                }
            }
        }
    }

If component 'main' depends on the 'myJvmLibrary' library, it will be compiled against the 'myJvmLibrary' API jar together with the 'logging' library API jar. It is illegal for any classes in 'main' to access non-exported classes of 'myJvmLibrary' or non-exported classes of 'logging'.

Improvements to the incubating model infrastructure

Rules defined in build scripts can now declare input dependencies

It is now possible for rules declared directly in a build script to depend on other model elements as inputs.

    model {
        components {
            all {
                targetPlatform = $.platforms.java6
            }
        }
    }

In the above example, a model rule declares that all components target Java 6, by setting their targetPlatform property to the Java 6 platform. The $.platforms.java6 construct is an input reference to that model element. This dependency is understood by the rule execution system, which ensures that the definition of the depended-upon item is complete when it is used in this manner. Please see the section in the User Guide on declaring input dependencies for Model DSL rules for more information.
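For completeness, the consuming component 'main' mentioned above would declare an ordinary (non-API) dependency on 'myJvmLibrary'. A sketch in the same DSL (source set layout assumed to follow the Java software model conventions):

```groovy
model {
    components {
        main(JvmLibrarySpec) {
            sources {
                java {
                    dependencies {
                        // 'main' is compiled against myJvmLibrary's API jar,
                        // so only exported packages are visible here
                        library "myJvmLibrary"
                    }
                }
            }
        }
    }
}
```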
Consistent validation of model types

The error messages produced for an unknown model type have been improved, to describe the types that are actually supported. In the following example, MyModel is not a valid managed model type because managed models cannot have properties of type java.io.FileInputStream.

    @Managed
    interface MyModel {
        FileInputStream getStream()
        void setStream(FileInputStream stream)
    }

The resulting error message describes the supported property types:

    A model element of type: 'MyModel' can not be constructed.
    Its property 'java.io.FileInputStream stream' can not be constructed
    It must be one of:
        - A managed type (annotated with @Managed)
        - A managed collection. A valid managed collection takes the form of ModelSet<T> or ModelMap<T> where 'T' is:
            - A managed type (annotated with @Managed)
        - A scalar collection. A valid scalar collection takes the form of List<T> or Set<T> where 'T' is one of (String, Boolean, Character, Byte, Short, Integer, Float, Long, Double, BigInteger, BigDecimal, File)
        - An unmanaged property (i.e. annotated with @Unmanaged)

Directly add a LanguageSourceSet to a FunctionalSourceSet

It is now possible to add a LanguageSourceSet instance of any registered type to a FunctionalSourceSet which exists in the model space. This can be done via a RuleSource plugin:

    class Rules extends RuleSource {
        @Model
        void functionalSources(FunctionalSourceSet fss) {
            fss.create("myJavaSourceSet", JavaSourceSet) { LanguageSourceSet lss ->
                lss.source.srcDir "src/main/myJavaSourceSet"
            }
        }
    }
    apply plugin: Rules

Or via the model DSL:

    model {
        functionalSources(FunctionalSourceSet) {
            myJavaSourceSet(JavaSourceSet) {
                source {
                    srcDir "src/main/myJavaSourceSet"
                }
            }
        }
    }

Any registered LanguageSourceSet implementation can be specified for creation.
LanguageSourceSet types are registered via a rule annotated with @LanguageType:

    class JavaLangRuleSource extends RuleSource {
        @LanguageType
        void registerLanguage(LanguageTypeBuilder<JavaSourceSet> builder) {
            builder.setLanguageName("java");
            builder.defaultImplementation(DefaultJavaLanguageSourceSet.class);
        }
    }
    apply plugin: JavaLangRuleSource

Note: LanguageSourceSet instances added to a FunctionalSourceSet in this fashion are not yet added to the top-level sources container. This will be addressed in a subsequent release.

Tooling API provides details of Eclipse builders and natures

Clients of the Tooling API can now query the list of Eclipse builders and natures via the EclipseProject model. The results of EclipseProject.getProjectNatures() and EclipseProject.getBuildCommands() contain any builders and natures required for the target project. These values contain any natures and builders determined by Gradle as required, as well as any customisations defined in the configuration for the 'eclipse' Gradle plugin.
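A minimal Tooling API client might query these values as follows. This is a sketch in Groovy, assuming the Gradle tooling-api library is on the classpath and the current directory contains a Gradle build:

```groovy
import org.gradle.tooling.GradleConnector
import org.gradle.tooling.ProjectConnection
import org.gradle.tooling.model.eclipse.EclipseProject

ProjectConnection connection = GradleConnector.newConnector()
        .forProjectDirectory(new File('.'))
        .connect()
try {
    EclipseProject project = connection.getModel(EclipseProject)
    // Print the Eclipse natures and builders Gradle determined for this project
    project.projectNatures.each { println "nature: ${it.id}" }
    project.buildCommands.each { println "builder: ${it.name}" }
} finally {
    connection.close()
}
```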

Fixed issues

Potential breaking changes

The binaries container is no longer accessible as a project extension

The binaries container is no longer bridged into the regular plugin space, and is now only visible to rules via the model. The binaries project extension has been removed.

For the following code that works in Gradle 2.8 and earlier:

    binaries.all { ... }

use this in Gradle 2.9:

    model {
        binaries {
            all { ... }
        }
    }

A similar change is required for binaries.withType and binaries.matching.

Changes to the incubating native software model

- NativeExecutableBinarySpec.executableFile is now reachable via NativeExecutableBinarySpec.executable.file.

- NativeTestSuiteBinarySpec.executableFile is now reachable via NativeTestSuiteBinarySpec.executable.file.

- Tool settings like cppCompiler.args are no longer added via the Gradle extension mechanism. PreprocessingTool accessors are now implemented directly by NativeBinarySpec, which no longer implements ExtensionAware. No changes to build scripts should be required.

Changes to the experimental model rules DSL

The Model DSL has stricter syntax in Gradle 2.9. The only top-level elements permitted within a model block are rule definitions: any other constructs (e.g. if statements, variables, etc.) are invalid and will fail to compile. The following is no longer valid:

    model {
        if (someCondition) {
            tasks {
                create("someTask")
            }
        }
    }

Within a given rule, arbitrary statements may still be used.

In order to improve incremental build performance, a number of changes and optimizations were made to the way Gradle checks whether a Task's inputs are up-to-date:

- For inputs of type zipTree or tarTree, Gradle will no longer extract the archive, but will simply compare the backing archive file for changes.

- For a Task that has a zipTree or tarTree input with a filter, any change to the backing archive will cause the Task to be out-of-date. Previously, only changes to files inside the archive matching the filter would make the input out-of-date.

- Previous versions of Gradle did not consider changes to directories when performing up-to-date checks. This has been fixed, and directories are now involved in up-to-date checking.
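As an illustration of the archive-related changes above, consider a task with a filtered zipTree input (the file paths and task name are hypothetical):

```groovy
task copyReports(type: Copy) {
    // As of Gradle 2.9, up-to-date checking compares 'reports.zip' itself rather
    // than its extracted contents; with a filter like this, any change to the
    // archive now makes the task out-of-date, even in files the filter excludes
    from zipTree('libs/reports.zip').matching { include '**/*.html' }
    into "$buildDir/reports"
}
```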

External contributions

We would like to thank the following community members for making contributions to this release of Gradle.

- Raluca Sauciuc - Do not attempt to set AWT system properties in the daemon JVM.

We love getting contributions from the Gradle community. For information on contributing, please see gradle.org/contribute.