The Gradle team is pleased to bring you Gradle 2.9, delivering significant performance benefits together with some major enhancements to the Gradle TestKit.
Gradle 2.9 brings both faster incremental build speeds and reduced memory consumption. All builds can benefit from these changes, but the improvements should be particularly noticeable in large builds with many source files.
This release also brings further improvements to the Gradle TestKit. With support for debugging, cross-version testing, and capturing build output, Gradle TestKit now makes it easier than ever to develop and test Gradle plugins.
Within the experimental Java software model, the ability to declare the API of a JVM library brings a number of advantages. Separation of API and implementation is enforced at compile time, and recompilation is avoided where possible. This feature also provides a path for migrating to the Java Module System coming in JDK 9, providing build-time enforcement of concepts that will be enforced at runtime in Java 9.
Here are the new features introduced in this Gradle release.
The Gradle TestKit was introduced in Gradle 2.6, and provides support for developing and testing Gradle plugins.
This release delivers significant improvements to TestKit, with support for debugging, cross-version testing, and capturing build output.
The Gradle TestKit facilitates programmatic execution of Gradle builds for the purpose of testing plugins and build logic. This release of Gradle makes it easier to use a debugger to debug build logic under test.
In order to provide an accurate simulation of a Gradle build, the TestKit executes the build in a separate process by default. This facilitates more accurate testing by preventing interference between the build environment and the test environment. However, it does mean that executing a test via a debugger does not automatically allow debugging the build process.
To support debugging, it is now possible to specify that the build should be run in the same process as the test. This can be done by setting the org.gradle.testkit.debug system property to true for the test process, or by using the withDebug(boolean) method of the GradleRunner.
Please see the Gradle User Guide section on debugging with the TestKit for more information.
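As a sketch, assuming a test with a project directory fixture (the testProjectDir name and the invoked task are hypothetical), debugging can be enabled on the runner like this:

import org.gradle.testkit.runner.GradleRunner

def result = GradleRunner.create()
    .withProjectDir(testProjectDir) // hypothetical test fixture directory
    .withArguments("build")
    .withDebug(true)                // run the build in the test process so an attached debugger can step into it
    .build()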
It is now possible to use the GradleRunner to execute builds with arbitrary Gradle versions and distributions. This feature is extremely useful for verifying a plugin's functionality against a range of different Gradle versions. The version to use can be specified via the new GradleRunner.withGradleVersion(String) method.
Please see the section in the User Guide on specifying versions for more information.
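As a sketch, a cross-version test might invoke the runner once per version; the testProjectDir fixture and the helloWorld task name are assumptions for illustration:

["2.6", "2.7", "2.8"].each { version ->
    def result = GradleRunner.create()
        .withProjectDir(testProjectDir) // hypothetical test fixture directory
        .withArguments("helloWorld")    // hypothetical task under test
        .withGradleVersion(version)     // execute the build with this specific Gradle version
        .build()
}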
When using the GradleRunner to programmatically execute Gradle builds for testing plugins and build logic, it is now possible to capture the output of the build under test.
By default, no output is captured. The new forwardOutput() method can be used to route the output of the build under test to the output stream of the process using the Gradle runner. This is often convenient in a testing context, as output generated by a test is typically associated with its test results (e.g. in the IDE UI or test results report).
If more control is needed, the new forwardStdOutput(Writer) and forwardStdError(Writer) methods can be used.
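As a sketch, the standard output of the build under test can be captured in a Writer for assertions; the testProjectDir fixture, the task name, and the expected message are assumptions for illustration:

def output = new StringWriter()
GradleRunner.create()
    .withProjectDir(testProjectDir)  // hypothetical test fixture directory
    .withArguments("helloWorld")     // hypothetical task under test
    .forwardStdOutput(output)        // route the build's standard output into the writer
    .build()
assert output.toString().contains("Hello world!") // expected message is an assumption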
In many cases, Gradle 2.9 is much faster than Gradle 2.8 when performing incremental builds.
Very large builds (many thousands of source files) could see incremental build speeds up to 80% faster than 2.7 and up to 40% faster than 2.8.
Gradle now uses a more efficient mechanism to scan the filesystem, making up-to-date checks significantly faster. This improvement is only available when running Gradle with Java 7 or newer.
Other improvements have been made to speed up include and exclude pattern evaluation; these improvements apply to all supported Java versions.
No build script changes are needed to take advantage of these performance optimizations.
Gradle now uses much less memory than previous releases when performing incremental builds. By de-duplicating Strings used as file paths in internal caches, and by reducing the overhead of listing classes under test for Java projects, some builds use 30-70% less memory than Gradle 2.8.
Reduced memory consumption can translate into significant performance improvements when a build process is running low on memory.
No build script changes are needed to take advantage of these memory savings.
Developing with the experimental Java software model is now more powerful, with the ability to explicitly declare which packages and dependencies make up the API of a library.
Declaring the API of a JVM library has many benefits: separation of API and implementation is enforced at compile time, recompilation of consumers is avoided where possible, and the build gains compile-time enforcement of concepts that will be enforced at runtime by the Java Module System in JDK 9.
It is now possible to declare the packages that make up the API of a JVM component. Declaring the API of a component is done using the api { ... } block:
model {
    components {
        myJvmLibrary(JvmLibrarySpec) {
            api {
                // declares the package 'com.acme' as belonging to the public, exported API
                exports 'com.acme'
            }
        }
    }
}
Gradle will automatically create an API jar for the 'myJvmLibrary' component. Components that depend on that component will be compiled against the 'myJvmLibrary' API jar.
The API jar will only include classes that belong to declared api packages. As a consequence, internal (non-exported) classes are not visible to components that compile against the library, and changes to those internal classes do not affect the API jar.
As well as exported packages, the library API can include types from the libraries it depends on. In this case, we say that the API of the dependency is included in the library API.
Dependencies that are included in the API are declared in a similar way to regular compile dependencies, but they are declared within the api { ... } block.
model {
    components {
        logging(JvmLibrarySpec) {
            api {
                exports 'my.logging.api'
            }
        }
        myJvmLibrary(JvmLibrarySpec) {
            api {
                exports ...
                dependencies {
                    // The API of the 'logging' library is included in the 'myJvmLibrary' API
                    // Any classes in the 'my.logging.api' package of 'logging' are exported in the API of 'myJvmLibrary'
                    library "logging"
                    // The API of the 'utils' library from the ':util' project is included in the 'myJvmLibrary' API
                    library "utils" project ":util"
                }
            }
        }
    }
}
If component 'main' depends on the 'myJvmLibrary' library, it will be compiled against the 'myJvmLibrary' API jar together with the 'logging' library API jar. It is illegal for any classes in 'main' to access non-exported classes of 'myJvmLibrary' or non-exported classes of 'logging'.
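As a sketch of the consuming side, assuming the software model convention of declaring compile dependencies on a source set, the 'main' component could depend on 'myJvmLibrary' like this; only the API jar of the target then appears on the compile classpath:

model {
    components {
        main(JvmLibrarySpec) {
            sources {
                java {
                    dependencies {
                        // 'main' is compiled against the API jar of 'myJvmLibrary'
                        library "myJvmLibrary"
                    }
                }
            }
        }
    }
}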
It is now possible for rules declared directly in a build script to depend on other model elements as inputs.
model {
    components {
        all {
            targetPlatform = $.platforms.java6
        }
    }
}
In the above example, a model rule declares that all components target Java 6, by setting their targetPlatform property to the Java 6 platform. The $.platforms.java6 construct is an input reference to that model element. This dependency is understood by the rule execution system, which ensures that the definition of the depended-upon item is complete when it is used in this manner.
Please see the section in the User Guide on declaring input dependencies for Model DSL rules for more information.
The error messages produced for an unknown model type have been improved, and now describe the types that are actually supported. In the following example, MyModel is not a valid managed model type because managed models cannot have properties of type java.io.FileInputStream.
@Managed
interface MyModel {
    FileInputStream getStream()
    void setStream(FileInputStream stream)
}
A model element of type: 'MyModel' can not be constructed.
Its property 'java.io.FileInputStream stream' can not be constructed
It must be one of:
- A managed type (annotated with @Managed)
- A managed collection. A valid managed collection takes the form of ModelSet<T> or ModelMap<T> where 'T' is:
- A managed type (annotated with @Managed)
- A scalar collection. A valid scalar collection takes the form of List<T> or Set<T> where 'T' is one of (String, Boolean, Character, Byte, Short, Integer, Float, Long, Double, BigInteger, BigDecimal, File)
- An unmanaged property (i.e. annotated with @Unmanaged)
Adding a LanguageSourceSet to a FunctionalSourceSet
It is now possible to add a LanguageSourceSet instance of any registered type to a FunctionalSourceSet which exists in the model space.
This can be done via a RuleSource plugin:
class Rules extends RuleSource {
    @Model
    void functionalSources(FunctionalSourceSet fss) {
        fss.create("myJavaSourceSet", JavaSourceSet) { LanguageSourceSet lss ->
            lss.source.srcDir "src/main/myJavaSourceSet"
        }
    }
}
apply plugin: Rules
Or via the model DSL:
model {
    functionalSources(FunctionalSourceSet) {
        myJavaSourceSet(JavaSourceSet) {
            source {
                srcDir "src/main/myJavaSourceSet"
            }
        }
    }
}
Any registered LanguageSourceSet implementation can be specified for creation. LanguageSourceSet types are registered via a rule annotated with @LanguageType:
class JavaLangRuleSource extends RuleSource {
    @LanguageType
    void registerLanguage(LanguageTypeBuilder<JavaSourceSet> builder) {
        builder.setLanguageName("java");
        builder.defaultImplementation(DefaultJavaLanguageSourceSet.class);
    }
}
apply plugin: JavaLangRuleSource
Note: LanguageSourceSet instances added to a FunctionalSourceSet in this fashion are not yet added to the top-level sources container. This will be addressed in a subsequent release.
Clients of the Tooling API can now query the list of Eclipse builders and natures via the EclipseProject model.
The results of EclipseProject.getProjectNatures() and EclipseProject.getBuildCommands() contain all builders and natures required for the target project: those determined as required by Gradle, as well as any customisations defined in the configuration of the 'eclipse' Gradle plugin.
The binaries container is no longer accessible as a project extension
The binaries container is no longer bridged into the regular plugin space, and is now only visible to rules via the model. The binaries project extension has been removed.
For the following code that works in Gradle 2.8 and earlier:
binaries.all {
    ...
}
use this in Gradle 2.9:
model {
    binaries {
        all {
            ...
        }
    }
}
A similar change is required for binaries.withType and binaries.matching.
NativeExecutableBinarySpec.executableFile is now reachable via NativeExecutableBinarySpec.executable.file.
NativeTestSuiteBinarySpec.executableFile is now reachable via NativeTestSuiteBinarySpec.executable.file.
Tool settings like cppCompiler.args are no longer added via the Gradle extension mechanism. PreprocessingTool accessors are now implemented directly by NativeBinarySpec, which no longer implements ExtensionAware. No changes to build scripts should be required.
The Model DSL has stricter syntax in Gradle 2.9.
The only top-level elements permitted within a model block are rule definitions: any other constructs (e.g. if statements, variables, etc.) are invalid and will fail to compile.
The following is no longer valid:
model {
    if (someCondition) {
        tasks {
            create("someTask")
        }
    }
}
Within a given rule, arbitrary statements may still be used.
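As a sketch, the same conditional task creation remains valid when the condition is moved inside a rule body; someCondition and the task name are hypothetical:

model {
    tasks {
        // arbitrary statements are permitted inside a rule definition
        if (someCondition) {
            create("someTask")
        }
    }
}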
In order to improve incremental build performance, a number of changes and optimizations were made to the way Gradle checks whether a Task's inputs are up-to-date:
- When a Task's input is an archive accessed via zipTree or tarTree, Gradle will no longer extract the archive, but will simply compare the backing archive file for changes.
- When using a zipTree or tarTree input with a filter, any change to the backing archive will cause the Task to be out-of-date. Previously, only changes to files inside the archive matching the filter would make the input out-of-date.
We would like to thank the following community members for making contributions to this release of Gradle.
We love getting contributions from the Gradle community. For information on contributing, please see gradle.org/contribute.
Known issues are problems that were discovered post release that are directly related to changes made in this release.