Gradle User Manual: Version 8.12
- OVERVIEW
- RELEASES
- UPGRADING
- RUNNING GRADLE BUILDS
- CORE CONCEPTS
- AUTHORING GRADLE BUILDS
- CORE CONCEPTS
- GRADLE TYPES
- STRUCTURING BUILDS
- DEVELOPING TASKS
- DEVELOPING PLUGINS
- OTHER TOPICS
- OPTIMIZING BUILD PERFORMANCE
- THE BUILD CACHE
- DEPENDENCY MANAGEMENT
- CORE CONCEPTS
- DECLARING DEPENDENCIES
- DECLARING REPOSITORIES
- CENTRALIZING DEPENDENCIES
- MANAGING DEPENDENCIES
- UNDERSTANDING DEPENDENCY RESOLUTION
- CONTROLLING DEPENDENCY RESOLUTION
- PUBLISHING LIBRARIES
- OTHER TOPICS
- PLATFORMS
- JVM BUILDS
- JAVA TOOLCHAINS
- JVM PLUGINS
- INTEGRATION
- REFERENCE
- GRADLE DSL/API
- CORE PLUGINS
- HOW TO GUIDES
- LICENSE INFORMATION
OVERVIEW
Gradle User Manual
Gradle Build Tool
Gradle Build Tool is a fast, dependable, and adaptable open-source build automation tool with an elegant and extensible declarative build language.
In this User Manual, Gradle Build Tool is abbreviated as Gradle.
Why Gradle?
Gradle is a widely used and mature tool with an active community and a strong developer ecosystem.
- Gradle is the most popular build system for the JVM and is the default system for Android and Kotlin Multi-Platform projects. It has a rich community plugin ecosystem.
- Gradle can automate a wide range of software build scenarios using either its built-in functionality, third-party plugins, or custom build logic.
- Gradle provides a high-level, declarative, and expressive build language that makes it easy to read and write build logic.
- Gradle is fast, scalable, and can build projects of any size and complexity.
- Gradle produces dependable results while benefiting from optimizations such as incremental builds, build caching, and parallel execution.
Gradle, Inc. offers a free service called Build Scan® that provides extensive information and insights about your builds. You can view scans to identify problems or share them for debugging help.
Supported Languages and Frameworks
Gradle supports Android, Java, Kotlin Multiplatform, Groovy, Scala, JavaScript, and C/C++.
Compatible IDEs
All major IDEs support Gradle, including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse, and NetBeans.
You can also invoke Gradle via its command-line interface (CLI) in your terminal or through your continuous integration (CI) server.
Releases
Information on Gradle releases and how to install Gradle is found on the Installation page.
User Manual
The Gradle User Manual is the official documentation for the Gradle Build Tool:
- Running Gradle Builds — Learn how to use Gradle with your project.
- Authoring Gradle Builds — Learn how to develop tasks and plugins to customize your build.
- Working with Dependencies — Learn how to add dependencies to your build.
- Authoring JVM Builds — Learn how to use Gradle with your Java project.
- Optimizing Builds — Learn how to use caches and other tools to optimize your build.
Education
- Training Courses — Head over to the courses page to sign up for free Gradle training.
Support
- Forum — The fastest way to get help is through the Gradle Forum.
- Slack — Community members and core contributors answer questions directly on our Slack Channel.
Licenses
Gradle Build Tool source code is open and licensed under the Apache License 2.0. Gradle user manual and DSL reference manual are licensed under Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright
Copyright © 2024 Gradle, Inc. All rights reserved. Gradle is a trademark of Gradle, Inc.
For inquiries related to commercial use or licensing, contact Gradle Inc. directly.
RELEASES
Installing Gradle
Gradle Installation
If all you want to do is run an existing Gradle project, then you don’t need to install Gradle if the build uses the Gradle Wrapper. This is identifiable by the presence of the gradlew or gradlew.bat files in the root of the project:
.                   // (1)
├── gradle
│   └── wrapper     // (2)
├── gradlew         // (3)
├── gradlew.bat     // (3)
└── ⋮
- Project root directory.
- Scripts for executing Gradle builds.
If the gradlew or gradlew.bat files are already present in your project, you do not need to install Gradle. But you need to make sure your system satisfies Gradle’s prerequisites.
You can follow the steps in the Upgrading Gradle section if you want to update the Gradle version for your project. Please use the Gradle Wrapper to upgrade Gradle.
Android Studio comes with a working installation of Gradle, so you don’t need to install Gradle separately when only working within that IDE.
If you do not meet the criteria above and decide to install Gradle on your machine, first check if Gradle is already installed by running gradle -v in your terminal. If the command does not return anything, then Gradle is not installed, and you can follow the instructions below.
You can install Gradle Build Tool on Linux, macOS, or Windows. The installation can be done manually or using a package manager like SDKMAN! or Homebrew.
You can find all Gradle releases and their checksums on the releases page.
Prerequisites
Gradle runs on all major operating systems. It requires Java Development Kit (JDK) version 8 or higher to run. You can check the compatibility matrix for more information.
To check, run java -version:

❯ java -version
openjdk version "11.0.18" 2023-01-17
OpenJDK Runtime Environment Homebrew (build 11.0.18+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.18+0, mixed mode)
Gradle uses the JDK it finds in your path, the JDK used by your IDE, or the JDK specified by your project. In this example, the $PATH points to JDK 17:

❯ echo $PATH
/opt/homebrew/opt/openjdk@17/bin
You can also set the JAVA_HOME environment variable to point to a specific JDK installation directory. This is especially useful when multiple JDKs are installed:

❯ echo %JAVA_HOME%
C:\Program Files\Java\jdk1.7.0_80

❯ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk-16.jdk/Contents/Home
Linux installation
Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and maintained by SDKMAN!:
❯ sdk install gradle
Other package managers are available, but the version of Gradle distributed by them is not controlled by Gradle, Inc. Linux package managers may distribute a modified version of Gradle that is incompatible or incomplete when compared to the official version.
Installing manually
Step 1 - Download the latest Gradle distribution
The distribution ZIP file comes in two flavors:
- Binary-only (bin)
- Complete (all) with docs and sources
We recommend downloading the bin file; it is a smaller file that is quick to download (and the latest documentation is available online).
Step 2 - Unpack the distribution
Unzip the distribution zip file in the directory of your choosing, e.g.:
❯ mkdir /opt/gradle
❯ unzip -d /opt/gradle gradle-8.12-bin.zip
❯ ls /opt/gradle/gradle-8.12
LICENSE  NOTICE  bin  README  init.d  lib  media
Step 3 - Configure your system environment
To install Gradle, the path to the unpacked files needs to be in your PATH. Configure your PATH environment variable to include the bin directory of the unzipped distribution, e.g.:

❯ export PATH=$PATH:/opt/gradle/gradle-8.12/bin

Alternatively, you could add the environment variable GRADLE_HOME and point it to the unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add $GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change the GRADLE_HOME environment variable.

export GRADLE_HOME=/opt/gradle/gradle-8.12
export PATH=${GRADLE_HOME}/bin:${PATH}
macOS installation
Installing with a package manager
SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and maintained by SDKMAN!:
❯ sdk install gradle
Using Homebrew:
❯ brew install gradle
Using MacPorts:
❯ sudo port install gradle
Other package managers are available, but the version of Gradle distributed by them is not controlled by Gradle, Inc.
Installing manually
Step 1 - Download the latest Gradle distribution
The distribution ZIP file comes in two flavors:
- Binary-only (bin)
- Complete (all) with docs and sources
We recommend downloading the bin file; it is a smaller file that is quick to download (and the latest documentation is available online).
Step 2 - Unpack the distribution
Unzip the distribution zip file in the directory of your choosing, e.g.:
❯ mkdir /usr/local/gradle
❯ unzip gradle-8.12-bin.zip -d /usr/local/gradle
❯ ls /usr/local/gradle/gradle-8.12
LICENSE  NOTICE  README  bin  init.d  lib
Step 3 - Configure your system environment
To install Gradle, the path to the unpacked files needs to be in your PATH. Configure your PATH environment variable to include the bin directory of the unzipped distribution, e.g.:

❯ export PATH=$PATH:/usr/local/gradle/gradle-8.12/bin

Alternatively, you could add the environment variable GRADLE_HOME and point it to the unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add $GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change the GRADLE_HOME environment variable.

It’s a good idea to edit .bash_profile in your home directory to add the GRADLE_HOME variable:

export GRADLE_HOME=/usr/local/gradle/gradle-8.12
export PATH=$GRADLE_HOME/bin:$PATH
Windows installation
Installing manually
Step 1 - Download the latest Gradle distribution
The distribution ZIP file comes in two flavors:
- Binary-only (bin)
- Complete (all) with docs and sources
We recommend downloading the bin file.
Step 2 - Unpack the distribution
Create a new directory C:\Gradle with File Explorer.

Open a second File Explorer window and go to the directory where the Gradle distribution was downloaded. Double-click the ZIP archive to expose the content.

Drag the content folder gradle-8.12 to your newly created C:\Gradle folder.

Alternatively, you can unpack the Gradle distribution ZIP into C:\Gradle using the archiver tool of your choice.
Step 3 - Configure your system environment
To install Gradle, the path to the unpacked files needs to be in your Path.

In File Explorer, right-click the This PC (or Computer) icon, then click Properties → Advanced System Settings → Environment Variables.

Under System Variables, select Path, then click Edit.

Add an entry for C:\Gradle\gradle-8.12\bin.

Click OK to save.

Alternatively, you can add the environment variable GRADLE_HOME and point it to the unzipped distribution. Instead of adding a specific version of Gradle to your Path, you can add %GRADLE_HOME%\bin to your Path. When upgrading to a different version of Gradle, just change the GRADLE_HOME environment variable.
Verify the installation
Open a console (or a Windows command prompt) and run gradle -v to display the version, e.g.:

❯ gradle -v

------------------------------------------------------------
Gradle 8.12
------------------------------------------------------------

Build time:    2024-06-17 18:10:00 UTC
Revision:      6028379bb5a8512d0b2c1be6403543b79825ef08

Kotlin:        1.9.23
Groovy:        3.0.21
Ant:           Apache Ant(TM) version 1.10.13 compiled on January 4 2023
Launcher JVM:  11.0.23 (Eclipse Adoptium 11.0.23+9)
Daemon JVM:    /Library/Java/JavaVirtualMachines/temurin-11.jdk/Contents/Home (no JDK specified, using current Java home)
OS:            Mac OS X 14.5 aarch64
You can verify the integrity of the Gradle distribution by downloading the SHA-256 file (available from the releases page) and following these verification instructions.
Compatibility Matrix
The sections below describe Gradle’s compatibility with several integrations. Versions not listed here may or may not work.
Java Runtime
Gradle runs on the Java Virtual Machine (JVM), which is often provided by either a JDK or JRE. A JVM version between 8 and 23 is required to execute Gradle. JVM 24 and later versions are not yet supported.
Executing the Gradle daemon with JVM 16 or earlier has been deprecated and will become an error in Gradle 9.0. The Gradle wrapper, Gradle client, Tooling API client, and TestKit client will remain compatible with JVM 8.
JDK 6 and 7 can be used for compilation. Testing with JVM 6 and 7 is deprecated and will not be supported in Gradle 9.0.
Any fully supported version of Java can be used for compilation or testing. However, the latest Java version may only be supported for compilation or testing, not for running Gradle. Support is achieved using toolchains and applies to all tasks supporting toolchains.
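As a sketch of how a toolchain lets compilation and testing use a different Java version than the one running Gradle, a build script might declare one like this (the Java version shown is purely illustrative):

```kotlin
// build.gradle.kts -- minimal sketch, assuming the java plugin; the
// toolchain version here is an arbitrary example.
plugins {
    java
}

java {
    toolchain {
        // Compile and test with JDK 21, even if Gradle itself runs on
        // an older supported JVM.
        languageVersion = JavaLanguageVersion.of(21)
    }
}
```

Gradle then locates (or, with toolchain repositories configured, downloads) a matching JDK for all tasks that support toolchains.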
See the table below for the Java version supported by a specific Gradle release:
| Java version | Support for toolchains | Support for running Gradle |
|---|---|---|
| 8 | N/A | 2.0 |
| 9 | N/A | 4.3 |
| 10 | N/A | 4.7 |
| 11 | N/A | 5.0 |
| 12 | N/A | 5.4 |
| 13 | N/A | 6.0 |
| 14 | N/A | 6.3 |
| 15 | 6.7 | 6.7 |
| 16 | 7.0 | 7.0 |
| 17 | 7.3 | 7.3 |
| 18 | 7.5 | 7.5 |
| 19 | 7.6 | 7.6 |
| 20 | 8.1 | 8.3 |
| 21 | 8.4 | 8.5 |
| 22 | 8.7 | 8.8 |
| 23 | 8.10 | 8.10 |
| 24 | N/A | N/A |
Kotlin
Gradle is tested with Kotlin 1.6.10 through 2.1.0-Beta2. Beta and RC versions may or may not work.
| Embedded Kotlin version | Minimum Gradle version | Kotlin Language version |
|---|---|---|
| 1.3.10 | 5.0 | 1.3 |
| 1.3.11 | 5.1 | 1.3 |
| 1.3.20 | 5.2 | 1.3 |
| 1.3.21 | 5.3 | 1.3 |
| 1.3.31 | 5.5 | 1.3 |
| 1.3.41 | 5.6 | 1.3 |
| 1.3.50 | 6.0 | 1.3 |
| 1.3.61 | 6.1 | 1.3 |
| 1.3.70 | 6.3 | 1.3 |
| 1.3.71 | 6.4 | 1.3 |
| 1.3.72 | 6.5 | 1.3 |
| 1.4.20 | 6.8 | 1.3 |
| 1.4.31 | 7.0 | 1.4 |
| 1.5.21 | 7.2 | 1.4 |
| 1.5.31 | 7.3 | 1.4 |
| 1.6.21 | 7.5 | 1.4 |
| 1.7.10 | 7.6 | 1.4 |
| 1.8.10 | 8.0 | 1.8 |
| 1.8.20 | 8.2 | 1.8 |
| 1.9.0 | 8.3 | 1.8 |
| 1.9.10 | 8.4 | 1.8 |
| 1.9.20 | 8.5 | 1.8 |
| 1.9.22 | 8.7 | 1.8 |
| 1.9.23 | 8.9 | 1.8 |
| 1.9.24 | 8.10 | 1.8 |
| 2.0.20 | 8.11 | 1.8 |
| 2.0.21 | 8.12 | 1.8 |
Groovy
Gradle is tested with Groovy 1.5.8 through 4.0.0.
Gradle plugins written in Groovy must use Groovy 3.x for compatibility with Gradle and Groovy DSL build scripts.
Android
Gradle is tested with Android Gradle Plugin 7.3 through 8.7. Alpha and beta versions may or may not work.
The Feature Lifecycle
Gradle is under constant development. New versions are delivered on a regular and frequent basis (approximately every six weeks) as described in the section on end-of-life support.
Continuous improvement combined with frequent delivery allows new features to be available to users early. Early users provide invaluable feedback, which is incorporated into the development process.
Getting new functionality into the hands of users regularly is a core value of the Gradle platform.
At the same time, API and feature stability are taken very seriously and considered a core value of the Gradle platform. Design choices and automated testing are engineered into the development process and formalized by the section on backward compatibility.
The Gradle feature lifecycle has been designed to meet these goals. It also communicates to users of Gradle what the state of a feature is. The term feature typically means an API or DSL method or property in this context, but it is not restricted to this definition. Command line arguments and modes of execution (e.g. the Build Daemon) are two examples of other features.
1. Internal
Internal features are not designed for public use and are only intended to be used by Gradle itself. They can change in any way at any point in time without any notice. Therefore, we recommend avoiding the use of such features. Internal features are not documented: if a feature appears in this User Manual, the DSL Reference, or the API Reference, then it is not internal.
Internal features may evolve into public features.
2. Incubating
Features are introduced in the incubating state to allow real-world feedback to be incorporated into the feature before making it public. It also gives users willing to test potential future changes early access.
A feature in an incubating state may change in future Gradle versions until it is no longer incubating. Changes to incubating features for a Gradle release will be highlighted in the release notes for that release. The incubation period for new features varies depending on the feature’s scope, complexity, and nature.
Features in incubation are clearly indicated. In the source code, all methods, properties, and classes that are incubating are annotated with @Incubating. This results in a special mark for them in the DSL and API references.
If an incubating feature is discussed in this User Manual, it will be explicitly said to be in the incubating state.
Feature Preview API
The feature preview API allows certain incubating features to be activated by adding enableFeaturePreview('FEATURE') to your settings file.
Individual preview features will be announced in release notes.
When incubating features are either promoted to public or removed, the feature preview flags for them become obsolete, have no effect, and should be removed from the settings file.
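A settings file opting in to a preview feature could look like the sketch below. The flag name must match one announced in the release notes for your Gradle version; STABLE_CONFIGURATION_CACHE is shown purely as an illustration:

```kotlin
// settings.gradle.kts -- illustrative sketch; substitute a feature name
// announced in the release notes for the Gradle version you are using.
rootProject.name = "my-project"

enableFeaturePreview("STABLE_CONFIGURATION_CACHE")
```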
3. Public
The default state for a non-internal feature is public. Anything documented in the User Manual, DSL Reference, or API reference that is not explicitly said to be incubating or deprecated is considered public. Features are said to be promoted from an incubating state to public. The release notes for each release indicate which previously incubating features are being promoted by the release.
A public feature will never be removed or intentionally changed without undergoing deprecation. All public features are subject to the backward compatibility policy.
4. Deprecated
Some features may be replaced or become irrelevant due to the natural evolution of Gradle. Such features will eventually be removed from Gradle after being deprecated. A deprecated feature may become stale until it is finally removed according to the backward compatibility policy.
Deprecated features are indicated to be so. In the source code, all methods/properties/classes that are deprecated are annotated with “@java.lang.Deprecated” which is reflected in the DSL and API References. In most cases, there is a replacement for the deprecated element, which will be described in the documentation. Using a deprecated feature will result in a runtime warning in Gradle’s output.
The use of deprecated features should be avoided. The release notes for each release indicate any features being deprecated by the release.
Backward compatibility policy
Gradle provides backward compatibility across major versions (e.g., 1.x, 2.x, etc.).
Once a public feature is introduced in a Gradle release, it will remain indefinitely unless deprecated.
Once deprecated, it may be removed in the next major release.
Deprecated features may be supported across major releases, but this is not guaranteed.
Release end-of-life Policy
Every day, a new nightly build of Gradle is created.
This contains all of the changes made through Gradle’s extensive continuous integration tests during that day. Nightly builds may contain new changes that may or may not be stable.
The Gradle team creates a pre-release distribution called a release candidate (RC) for each minor or major release. When no problems are found after a short time (usually a week), the release candidate is promoted to a general availability (GA) release. If a regression is found in the release candidate, a new RC distribution is created, and the process repeats. Release candidates are supported for as long as the release window is open, but they are not intended to be used for production. Bug reports are greatly appreciated during the RC phase.
The Gradle team may create additional patch releases to replace the final release due to critical bug fixes or regressions. For instance, Gradle 5.2.1 replaces the Gradle 5.2 release.
Once a release candidate has been made, all feature development moves on to the next release for the latest major version. As such, each minor Gradle release causes the previous minor releases in the same major version to become end-of-life (EOL). EOL releases do not receive bug fixes or feature backports.
For major versions, Gradle will backport critical fixes and security fixes to the last minor in the previous major version. For example, when Gradle 7 was the latest major version, several releases were made in the 6.x line, including Gradle 6.9 (and subsequent releases).
As such, each major Gradle release causes:
- The previous major version to become maintenance only. It will only receive critical bug fixes and security fixes.
- The major version before the previous one to become end-of-life (EOL); that release line will not receive any new fixes.
UPGRADING
Upgrading your build from Gradle 8.x to the latest
This chapter provides the information you need to migrate your Gradle 8.x builds to the latest Gradle release. For migrating from Gradle 4.x, 5.x, 6.x, or 7.x, see the older migration guide first.
We recommend the following steps for all users:
- Try running gradle help --scan and view the deprecations view of the generated build scan. This lets you see any deprecation warnings that apply to your build. Alternatively, you can run gradle help --warning-mode=all to see the deprecations in the console, though it may not report as much detailed information.
- Update your plugins. Some plugins will break with this new version of Gradle because they use internal APIs that have been removed or changed. The previous step will help you identify potential problems by issuing deprecation warnings when a plugin tries to use a deprecated part of the API.
- Run gradle wrapper --gradle-version 8.12 to update the project to 8.12.
- Try to run the project and debug any errors using the Troubleshooting Guide.
Upgrading from 8.11 and earlier
Potential breaking changes
Upgrade to Kotlin 2.0.21
The embedded Kotlin has been updated from 2.0.20 to Kotlin 2.0.21.
Upgrade to Ant 1.10.15
Ant has been updated to Ant 1.10.15.
Upgrade to Zinc 1.10.4
Zinc has been updated to 1.10.4.
Swift SDK discovery
To determine the location of the macOS SDK for Swift, Gradle now passes the --sdk macosx arguments to xcrun. This is necessary because, without this argument, the SDK could be discovered inconsistently across different environments.
Source level deprecation of TaskContainer.create methods
Eager task creation methods on the TaskContainer interface have been marked @Deprecated and will generate compiler and IDE warnings when used in build scripts or plugin code. There is not yet a Gradle deprecation warning emitted for their use. However, if the build is configured to fail on warnings during Kotlin script or plugin code compilation, this behavior may cause the build to fail. A standard Gradle deprecation warning will be printed upon use when these methods are fully deprecated in a future version.
Deprecations
Deprecated Ambiguous Transformation Chains
Previously, when at least two equal-length chains of artifact transforms were available that would produce compatible variants that would each satisfy a resolution request, Gradle would arbitrarily, and silently, pick one.
Now, Gradle emits a deprecation warning that explains this situation:
There are multiple distinct artifact transformation chains of the same length that would satisfy this request. This behavior has been deprecated. This will fail with an error in Gradle 9.0.
Found multiple transformation chains that produce a variant of 'root project :' with requested attributes:
- color 'red'
- texture 'smooth'
Found the following transformation chains:
- From configuration ':squareBlueSmoothElements':
- With source attributes:
- artifactType 'txt'
- color 'blue'
- shape 'square'
- texture 'smooth'
- Candidate transformation chains:
- Transformation chain: 'ColorTransform':
- 'BrokenColorTransform':
- Converts from attributes:
- color 'blue'
- texture 'smooth'
- To attributes:
- color 'red'
- Transformation chain: 'ColorTransform2':
- 'BrokenColorTransform2':
- Converts from attributes:
- color 'blue'
- texture 'smooth'
- To attributes:
- color 'red'
Remove one or more registered transforms, or add additional attributes to them to ensure only a single valid transformation chain exists.
In such a scenario, Gradle has no way to know which of the two (or more) possible transformation chains should be used. Picking an arbitrary chain can lead to inefficient performance or unexpected behavior changes when seemingly unrelated parts of the build are modified. This is potentially a very complex situation and the message now fully explains the situation by printing all the registered transforms in order, along with their source (input) variants for each candidate chain.
When encountering this type of failure, build authors should either:
- Add additional, distinguishing attributes when registering transforms present in the chain, to ensure that only a single chain will be selectable to satisfy the request
- Request additional attributes to disambiguate which chain is selected (if they result in non-identical final attributes)
- Remove unnecessary registered transforms from the build
This will become an error in Gradle 9.0.
init must run alone
The init task must run by itself. This task should not be combined with other tasks in a single Gradle invocation. Running init in the same invocation as other tasks will become an error in Gradle 9.0. For instance, this will not be allowed:
> gradlew init tasks
Calling Task.getProject() from a task action
Calling Task.getProject() from a task action at execution time is now deprecated and will be made an error in Gradle 10.0. This method can still be used during configuration time.
The deprecation is only issued if the configuration cache is not enabled. When the configuration cache is enabled, calls to Task.getProject() are reported as configuration cache problems instead.
This deprecation was originally introduced in Gradle 7.4 but was only issued when the STABLE_CONFIGURATION_CACHE feature flag was enabled. That feature flag no longer controls this deprecation.
This is another step towards moving users away from idioms that are incompatible with the configuration cache, which will become the only mode supported by Gradle in a future release.
Please refer to the configuration cache documentation for alternatives to invoking Task.getProject() at execution time that are compatible with the configuration cache.
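One common configuration-cache-compatible pattern is to capture the values a task needs as task inputs at configuration time, instead of reaching back through Task.getProject() in the task action. A minimal sketch (the task and property names here are made up for illustration):

```kotlin
// build.gradle.kts -- sketch of a configuration-cache-friendly task.
// Task and property names are hypothetical examples.
abstract class PrintProjectName : DefaultTask() {
    // The value is captured as a task input at configuration time.
    @get:Input
    abstract val projectName: Property<String>

    @TaskAction
    fun run() {
        // Uses the captured value; no Task.getProject() at execution time.
        println("Project: ${projectName.get()}")
    }
}

tasks.register<PrintProjectName>("printProjectName") {
    projectName = project.name // read from the project at configuration time
}
```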
Groovy "space assignment" syntax
Currently, there are multiple ways to set a property with Groovy DSL syntax:

- propertyName = value
- setPropertyName(value)
- setPropertyName value
- propertyName(value)
- propertyName value

The last of these, "space-assignment", is a Gradle-specific feature that is not part of the Groovy language. In regular Groovy, this is just a method call, propertyName(value), and Gradle generates a propertyName method at runtime if such a method is not already present.
This feature may be a source of confusion (especially for new users) and adds an extra layer of complexity for users and the Gradle codebase without providing any significant value.
Sometimes, classes declare methods with the same name, and these may even have semantics that are different from a plain assignment.
These generated methods are now deprecated and will be removed in Gradle 10.0; both propertyName value and propertyName(value) will stop working unless an explicit propertyName method is defined. Use the explicit assignment propertyName = value instead. For explicit methods, consider using the propertyName(value) syntax instead of propertyName value for clarity. For example, jvmArgs "some", "arg" can be replaced with jvmArgs("some", "arg") or with jvmArgs = ["some", "arg"] for Test tasks.
If you have a big project, you can replace occurrences of the space-assignment syntax with, for example, the following sed command:

find . -name 'build.gradle' -type f -exec sed -i.bak -E 's/([^A-Za-z]|^)(replaceme)[ \t]*([^= \t{])/\1\2 = \3/g' {} +

You should replace replaceme with one or more property names you want to replace, separated by |, e.g. (url|group).
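As a concrete illustration of the command above, here is a small self-contained demo using group as the property name; the temporary directory and file contents are made up for the example:

```shell
# Demo of the space-assignment rewrite on a throwaway build.gradle.
mkdir -p /tmp/space-assign-demo
cd /tmp/space-assign-demo
cat > build.gradle <<'EOF'
group 'com.example'
version = '1.0'
EOF

# Same sed invocation as above, with "replaceme" substituted by "group".
find . -name 'build.gradle' -type f -exec sed -i.bak -E 's/([^A-Za-z]|^)(group)[ \t]*([^= \t{])/\1\2 = \3/g' {} +

# The space-assignment line is rewritten; the explicit assignment is untouched.
cat build.gradle
```

After running this, build.gradle contains group = 'com.example' while the version = '1.0' line is unchanged, and the original file is kept as build.gradle.bak.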
DependencyInsightReportTask.getDependencySpec
The method was deprecated because it was not intended for public use in build scripts.
ReportingExtension.baseDir
ReportingExtension.getBaseDir(), ReportingExtension.setBaseDir(File), and ReportingExtension.setBaseDir(Object) were deprecated. They should be replaced with the ReportingExtension.getBaseDirectory() property.
Upgrading from 8.10 and earlier
Potential breaking changes
Upgrade to Kotlin 2.0.20
The embedded Kotlin has been updated from 1.9.24 to Kotlin 2.0.20. Also see the Kotlin 2.0.10 and Kotlin 2.0.0 release notes.
The default kotlin-test version in JVM test suites has been upgraded to 2.0.20 as well.
Kotlin DSL scripts are still compiled with the Kotlin language version set to 1.8 for backward compatibility.
Gradle daemon JVM configuration via toolchain
The type of the property UpdateDaemonJvm.jvmVersion is now Property&lt;JavaLanguageVersion&gt;.
If you configured the task in a build script, you will need to replace:
jvmVersion = JavaVersion.VERSION_17
With:
jvmVersion = JavaLanguageVersion.of(17)
Using the CLI options to configure which JVM version to use for the Gradle Daemon has no impact.
Name matching changes
The name-matching logic has been updated to treat numbers as word boundaries for camelCase names.
Previously, a request like unique would match both uniqueA and unique1. Such a request will now fail due to ambiguity. To avoid issues, use the exact name instead of a shortened version.
This change impacts:
- Task selection
- Project selection
- Configuration selection in dependency report tasks
Deprecations
Deprecated javaHome property of ForkOptions
The javaHome property of the ForkOptions type has been deprecated and will be removed in Gradle 9.0. Use JVM Toolchains or the executable property instead.
Deprecated mutating buildscript configurations
Starting in Gradle 9.0, mutating configurations in a script’s buildscript block will result in an error. This applies to project, settings, init, and standalone scripts.
The buildscript configurations block is only intended to control buildscript classpath resolution.
Consider the following script that creates a new buildscript configuration in a Settings script and resolves it:
buildscript {
configurations {
create("myConfig")
}
dependencies {
"myConfig"("org:foo:1.0")
}
}
val files = buildscript.configurations["myConfig"].files
This pattern is sometimes used to resolve dependencies in Settings, where there is no other way to obtain a Configuration. Resolving dependencies in this context is not recommended. Using a detached configuration is a possible but discouraged alternative.
The above example can be modified to use a detached configuration:
val myConfig = buildscript.configurations.detachedConfiguration(
buildscript.dependencies.create("org:foo:1.0")
)
val files = myConfig.files
Selecting Maven variants by configuration name
Starting in Gradle 9.0, selecting variants by name from non-Ivy external components will be forbidden.
Selecting variants by name from local components will still be permitted; however, this pattern is discouraged. Variant aware dependency resolution should be preferred over selecting variants by name for local components.
The following dependencies will fail to resolve when targeting a non-Ivy external component:
dependencies {
implementation(group: "com.example", name: "example", version: "1.0", configuration: "conf")
implementation("com.example:example:1.0") {
targetConfiguration = "conf"
}
}
Deprecated manually adding to configuration container
Starting in Gradle 9.0, manually adding configuration instances to a configuration container will result in an error. Configurations should only be added to the container through the eager or lazy factory methods. Detached configurations and copied configurations should not be added to the container.
Calling the following methods on ConfigurationContainer will be forbidden:
- add(Configuration)
- addAll(Collection)
- addLater(Provider)
- addAllLater(Provider)
Deprecated ProjectDependency#getDependencyProject()
The ProjectDependency#getDependencyProject() method has been deprecated and will be removed in Gradle 9.0.
Accessing the mutable project instance of other projects should be avoided.
To discover details about all projects that were included in a resolution, inspect the full ResolutionResult. Project dependencies are exposed in the DependencyResult. See the user guide section on programmatic dependency resolution for more details on this API. This is the only reliable way to find all projects that are used in a resolution. Inspecting only the declared `ProjectDependency`s may miss transitive or substituted project dependencies.
To get the identity of the target project, use the new Isolated Projects-safe project path method: ProjectDependency#getPath().
To access or configure the target project, consider this direct replacement:
val projectDependency: ProjectDependency = getSomeProjectDependency()
// Old way:
val someProject = projectDependency.dependencyProject
// New way:
val someProject = project.project(projectDependency.path)
This approach will not fetch project instances from different builds.
Deprecated ResolvedConfiguration.getFiles() and LenientConfiguration.getFiles()
The ResolvedConfiguration.getFiles() and LenientConfiguration.getFiles() methods have been deprecated and will be removed in Gradle 9.0.
These deprecated methods do not track task dependencies, unlike their replacements.
val deprecated: Set<File> = conf.resolvedConfiguration.files
val replacement: FileCollection = conf.incoming.files
val lenientDeprecated: Set<File> = conf.resolvedConfiguration.lenientConfiguration.files
val lenientReplacement: FileCollection = conf.incoming.artifactView {
    isLenient = true
}.files
Deprecated AbstractOptions
The AbstractOptions class has been deprecated and will be removed in Gradle 9.0. All classes extending AbstractOptions will no longer extend it.
As a result, the AbstractOptions#define(Map) method will no longer be present. This method exposes a non-type-safe API and unnecessarily relies on reflection. It can be replaced by directly setting the properties specified in the map.
Additionally, CompileOptions#fork(Map), CompileOptions#debug(Map), and GroovyCompileOptions#fork(Map), which depend on define, are also deprecated for removal in Gradle 9.0.
Consider the following example of the deprecated behavior and its replacement:
tasks.withType(JavaCompile) {
    // Deprecated behavior
    options.define(encoding: 'UTF-8')
    options.fork(memoryMaximumSize: '1G')
    options.debug(debugLevel: 'lines')

    // Can be replaced by
    options.encoding = 'UTF-8'
    options.fork = true
    options.forkOptions.memoryMaximumSize = '1G'
    options.debug = true
    options.debugOptions.debugLevel = 'lines'
}
Deprecated Dependency#contentEquals(Dependency)
The Dependency#contentEquals(Dependency) method has been deprecated and will be removed in Gradle 9.0.
The method was originally intended to compare dependencies based on their actual target component, regardless of whether they were of different dependency type. The existing method does not behave as specified by its Javadoc, and we do not plan to introduce a replacement that does.
Potential migrations include using Object.equals(Object) directly, or comparing the fields of dependencies manually.
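A manual field comparison might look like the following sketch; `sameModule` is a hypothetical helper, and it assumes both sides are external module dependencies:

```kotlin
import org.gradle.api.artifacts.ExternalModuleDependency

// Hypothetical replacement for contentEquals: compare the declared
// coordinates of two dependencies directly.
fun sameModule(a: ExternalModuleDependency, b: ExternalModuleDependency): Boolean =
    a.group == b.group && a.name == b.name && a.version == b.version
```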
Deprecated Project#exec and Project#javaexec
The Project#exec(Closure), Project#exec(Action), Project#javaexec(Closure), Project#javaexec(Action) methods have been deprecated and will be removed in Gradle 9.0.
These methods are scheduled for removal as part of the ongoing effort to make writing configuration-cache-compatible code easier. There is no way to use these methods without breaking configuration cache requirements, so it is recommended to migrate to a compatible alternative. The appropriate replacement for your use case depends on the context in which the method was previously called.
At execution time, for example in @TaskAction or doFirst/doLast callbacks, use of the Project instance is not allowed when the configuration cache is enabled.
To run external processes, tasks should use an injected ExecOperations service, which has the same API and can act as a drop-in replacement. The standard Java/Groovy/Kotlin process APIs, like java.lang.ProcessBuilder, can be used as well.
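For example, a task could inject the service as in this sketch (the task class name and external command are illustrative):

```kotlin
import javax.inject.Inject
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction
import org.gradle.process.ExecOperations

// A configuration-cache-compatible task: the injected ExecOperations service
// replaces Project.exec at execution time.
abstract class PrintToolVersion @Inject constructor(
    private val execOperations: ExecOperations
) : DefaultTask() {
    @TaskAction
    fun print() {
        execOperations.exec {
            commandLine("git", "--version") // illustrative external command
        }
    }
}

tasks.register<PrintToolVersion>("printToolVersion")
```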
At configuration time, only the special Provider-based APIs must be used to run external processes when the configuration cache is enabled. You can use ProviderFactory.exec and ProviderFactory.javaexec to obtain the output of a process. A custom ValueSource implementation can be used for more sophisticated scenarios.
The configuration cache guide has a more elaborate example of using these APIs.
Detached Configurations should not use extendsFrom
Detached configurations should not extend other configurations using extendsFrom. This behavior has been deprecated and will become an error in Gradle 9.0.
To create extension relationships between configurations, use non-detached configurations created via the other factory methods on the project’s ConfigurationContainer.
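For example, instead of extending another configuration from a detached configuration, a resolvable configuration can be created in the container (a minimal sketch; the name is illustrative):

```kotlin
// A container-managed, resolvable-only configuration may extend others;
// a detached configuration may not.
val inspectionClasspath = configurations.create("inspectionClasspath") {
    isCanBeConsumed = false
    isCanBeResolved = true
    extendsFrom(configurations.getByName("implementation"))
}
```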
Deprecated customized Gradle logging
The Gradle#useLogger(Object) method has been deprecated and will be removed in Gradle 9.0.
This method was originally intended to customize logs printed by Gradle. However, it only allows intercepting a subset of the logs and cannot work with the configuration cache. We do not plan to introduce a replacement for this feature.
Unnecessary options on compile options and doc tasks have been deprecated
Gradle’s API allowed some properties that represented nested groups of properties to be replaced wholesale with a setter method.
This was awkward and unusual to do and would sometimes require the use of internal APIs.
The setters for these properties will be removed in Gradle 9.0 to simplify the API and ensure consistent behavior.
Instead of using the setter method, these properties should be configured by calling the getter and configuring the object directly or using the convenient configuration method.
For example, in CompileOptions, instead of calling the setForkOptions setter, you can call getForkOptions() or forkOptions(Action).
The affected properties are:
Deprecated Javadoc.isVerbose() and Javadoc.setVerbose(boolean)
These methods on Javadoc have been deprecated and will be removed in Gradle 9.0.
- isVerbose() is replaced by getOptions().isVerbose().
- Calling setVerbose(boolean) with true is replaced by getOptions().verbose().
- Calling setVerbose(false) did nothing.
Upgrading from 8.9 and earlier
Potential breaking changes
JavaCompile tasks may fail when using a JRE even if compilation is not necessary
The JavaCompile tasks may sometimes fail when using a JRE instead of a JDK.
This is due to changes in the toolchain resolution code, which enforces the presence of a compiler when one is requested.
The java-base plugin uses the JavaCompile tasks it creates to determine the default source and target compatibility when sourceCompatibility/targetCompatibility or release are not set.
With the new enforcement, the absence of a compiler causes this to fail when only a JRE is provided, even if no compilation is needed (e.g., in projects with no sources).
This can be fixed by setting sourceCompatibility/targetCompatibility explicitly in the java extension, or by setting sourceCompatibility/targetCompatibility or release in the relevant task(s).
Upgrade to Kotlin 1.9.24
The embedded Kotlin has been updated from 1.9.23 to Kotlin 1.9.24.
Upgrade to Ant 1.10.14
Ant has been updated to Ant 1.10.14.
Upgrade to JaCoCo 0.8.12
JaCoCo has been updated to 0.8.12.
Upgrade to Groovy 3.0.22
Groovy has been updated to Groovy 3.0.22.
Deprecations
Running Gradle on older JVMs
Starting in Gradle 9.0, Gradle will require JVM 17 or later to run. Most Gradle APIs will be compiled to target JVM 17 bytecode.
Gradle will still support compiling Java code to target JVM version 6 or later. The target JVM version of the compiled code can be configured separately from the JVM version used to run Gradle.
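For example, assuming the java plugin is applied, a toolchain can keep the build targeting an older JVM while the daemon itself runs on 17 or later (a minimal sketch):

```kotlin
// The toolchain controls the JVM used for compilation and tests,
// independently of the JVM running Gradle itself.
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(11)
    }
}
```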
All Gradle clients (wrapper, launcher, Tooling API and TestKit) will remain compatible with JVM 8 and will be compiled to target JVM 8 bytecode. Only the Gradle daemon will require JVM 17 or later. These clients can be configured to run Gradle builds with a different JVM version than the one used to run the client:
- Using Daemon JVM criteria (an incubating feature)
- Setting the org.gradle.java.home Gradle property
- Using the ConfigurableLauncher#setJavaHome method on the Tooling API
Alternatively, the JAVA_HOME environment variable can be set to a JVM 17 or newer, which will run both the client and daemon with the same version of the JVM.
Running Gradle builds with --no-daemon or using ProjectBuilder in tests will require JVM version 17 or later. The worker API will remain compatible with JVM 8, and running JVM tests will require JVM 8.
We decided to upgrade the minimum version of the Java runtime for a number of reasons:
- Dependencies are beginning to drop support for older versions and may not release security patches.
- Significant language improvements between Java 8 and Java 17 cannot be used without upgrading.
- Some of the most popular plugins already require JVM 17 or later.
- Download metrics for Gradle distributions show that JVM 17 is widely used.
Deprecated consuming non-consumable configurations from Ivy
In prior versions of Gradle, it was possible to consume non-consumable configurations of a project using published Ivy metadata.
An Ivy dependency may sometimes be substituted for a project dependency, either explicitly through the DependencySubstitutions API or through included builds.
When this happens, configurations marked as non-consumable could be selected in the substituted project.
Consuming non-consumable configurations in this manner is deprecated and will result in an error in Gradle 9.0.
Deprecated extending configurations in other projects
In prior versions of Gradle, it was possible to extend a configuration in a different project.
The hierarchy of a Project’s configurations should not be influenced by configurations in other projects. Cross-project hierarchies can lead to unexpected behavior when configurations are extended in a way that is not intended by the configuration’s owner.
Projects should also never access the mutable state of another project. Since Configurations are mutable, extending configurations across project boundaries restricts the parallelism that Gradle can apply.
Extending configurations in different projects is deprecated and will result in an error in Gradle 9.0.
Upgrading from 8.8 and earlier
Potential breaking changes
Change to toolchain provisioning
In previous versions of Gradle, toolchain provisioning could leave a partially provisioned toolchain in place with a marker file indicating that the toolchain was fully provisioned.
This could lead to strange behavior with the toolchain.
In Gradle 8.9, the toolchain is fully provisioned before the marker file is written. However, to avoid treating potentially broken toolchains from earlier versions as valid, a different marker file (.ready) is used. This means all your existing toolchains will be re-provisioned the first time you use them with Gradle 8.9.
Gradle 8.9 also writes the old marker file (provisioned.ok) to indicate that the toolchain was fully provisioned. This means that if you return to an older version of Gradle, an 8.9-provisioned toolchain will not be re-provisioned.
Upgrade to Kotlin 1.9.23
The embedded Kotlin has been updated from 1.9.22 to Kotlin 1.9.23.
Change the encoding of daemon log files
In previous versions of Gradle, the daemon log file, located at $GRADLE_USER_HOME/daemon/8.12/, was encoded with the default JVM encoding.
This file is now always encoded with UTF-8 to prevent clients that may use different default encodings from reading data incorrectly.
This change may affect third-party tools trying to read this file.
Compiling against Gradle implementation classpath
In previous versions of Gradle, Java projects that had no declared dependencies could implicitly compile against Gradle’s runtime classes.
This means that some projects were able to compile without any declared dependencies even though they referenced Gradle runtime classes.
This situation is unlikely to arise in projects since IDE integration and test execution would be compromised.
However, if you need to utilize the Gradle API, declare a gradleApi dependency or apply the java-gradle-plugin plugin.
Configuration cache implementation packages now under org.gradle.internal
References to Gradle types not part of the public API should be avoided, as their direct use is unsupported. Gradle internal implementation classes may suffer breaking changes (or be renamed or removed) from one version to another without warning.
Users need to distinguish between the API and internal parts of the Gradle codebase. This is typically achieved by including internal in the implementation package names. However, before this release, the configuration cache subsystem did not follow this pattern.
To address this issue, all code initially under the org.gradle.configurationcache* packages has been moved to new internal packages (org.gradle.internal.*).
File-system watching on macOS 11 (Big Sur) and earlier is disabled
Since Gradle 8.8, file-system watching has only been supported on macOS 12 (Monterey) and later. We added a check to automatically disable file-system watching on macOS 11 (Big Sur) and earlier versions.
Possible change to JDK8-based compiler output when annotation processors are used
The Java compilation infrastructure has been updated to use the Problems API. This change will supply the Tooling API clients with structured, rich information about compilation issues.
The feature should not have any visible impact on the usual build output, with JDK 8 being an exception. When annotation processors are used, the compiler output messages differ slightly from previous versions.
The change mainly manifests in the type names printed: Java standard types will now be reported with their fully qualified names, for example java.lang.String instead of String.
Upgrading from 8.7 and earlier
Deprecations
Deprecate mutating configuration after observation
To ensure the accuracy of dependency resolution, Gradle checks that Configurations are not mutated after they have been used as part of a dependency graph.
- Resolvable configurations should not have their resolution strategy, dependencies, hierarchy, etc., modified after they have been resolved.
- Consumable configurations should not have their dependencies, hierarchy, attributes, etc., modified after they have been published or consumed as a variant.
- Dependency scope configurations should not have their dependencies, constraints, etc., modified after a configuration that extends from them is observed.
In prior versions of Gradle, many of these circumstances were detected and handled by failing the build. However, some cases went undetected or did not trigger build failures. In Gradle 9.0, all changes to a configuration, once observed, will become an error. After a configuration of any type has been observed, it should be considered immutable. This validation covers the following properties of a configuration:
- Resolution Strategy
- Dependencies
- Constraints
- Exclude Rules
- Artifacts
- Role (consumable, resolvable, dependency scope)
- Hierarchy (extendsFrom)
- Others (Transitive, Visible)
Starting in Gradle 8.8, a deprecation warning will be emitted in cases that were not already an error.
Usually, this deprecation is caused by mutating a configuration in a beforeResolve hook. This hook is only executed after a configuration is fully resolved, but not when it is partially resolved for computing task dependencies.
Consider the following code that showcases the deprecated behavior:
plugins {
    id("java-library")
}

configurations.runtimeClasspath {
    // `beforeResolve` is not called before the configuration is partially resolved for
    // build dependencies, but only before a full graph resolution.
    // Configurations should not be mutated in this hook.
    incoming.beforeResolve {
        // Add a dependency on `com:foo` if not already present
        if (allDependencies.none { it.group == "com" && it.name == "foo" }) {
            configurations.implementation.get().dependencies.add(project.dependencies.create("com:foo:1.0"))
        }
    }
}

tasks.register("resolve") {
    val conf: FileCollection = configurations["runtimeClasspath"]
    // Wire build dependencies
    dependsOn(conf)
    // Resolve dependencies
    doLast {
        assert(conf.files.map { it.name } == listOf("foo-1.0.jar"))
    }
}
For the following use cases, consider these alternatives when replacing a beforeResolve hook:
- Adding dependencies: Use a DependencyFactory and addLater or addAllLater on DependencySet.
- Changing dependency versions: Use preferred version constraints.
- Adding excludes: Use Component Metadata Rules to adjust dependency-level excludes, or withDependencies to add excludes to a configuration.
- Roles: Configuration roles should be set upon creation and not changed afterward.
- Hierarchy: Configuration hierarchy (extendsFrom) should be set upon creation. Mutating the hierarchy prior to resolution is highly discouraged but permitted within a withDependencies hook.
- Resolution Strategy: Mutating a configuration’s ResolutionStrategy is still permitted in a beforeResolve hook; however, this is not recommended.
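The first of these alternatives, lazily adding a dependency, might be sketched as follows (the coordinates are illustrative):

```kotlin
// Add a dependency lazily instead of mutating the configuration in beforeResolve.
// The provider is only realized when the dependency set is observed.
configurations.named("implementation") {
    dependencies.addLater(provider {
        dependencyFactory.create("com:foo:1.0")
    })
}
```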
Filtered Configuration file and fileCollection methods are deprecated
In an ongoing effort to simplify the Gradle API, the following methods that support filtering based on declared dependencies have been deprecated:
On Configuration:
- files(Dependency…)
- files(Spec)
- files(Closure)
- fileCollection(Dependency…)
- fileCollection(Spec)
- fileCollection(Closure)
On ResolvedConfiguration:
- getFiles(Spec)
- getFirstLevelModuleDependencies(Spec)
On LenientConfiguration:
- getFirstLevelModuleDependencies(Spec)
- getFiles(Spec)
- getArtifacts(Spec)
To mitigate this deprecation, consider the example below that leverages the ArtifactView API along with the componentFilter method to select a subset of a Configuration’s artifacts:
val conf by configurations.creating

dependencies {
    conf("com.thing:foo:1.0")
    conf("org.example:bar:1.0")
}

tasks.register("filterDependencies") {
    val files: FileCollection = conf.incoming.artifactView {
        componentFilter {
            when (it) {
                is ModuleComponentIdentifier ->
                    it.group == "com.thing" && it.module == "foo"
                else -> false
            }
        }
    }.files
    doLast {
        assert(files.map { it.name } == listOf("foo-1.0.jar"))
    }
}
configurations {
    conf
}

dependencies {
    conf "com.thing:foo:1.0"
    conf "org.example:bar:1.0"
}

tasks.register("filterDependencies") {
    FileCollection files = configurations.conf.incoming.artifactView {
        componentFilter {
            it instanceof ModuleComponentIdentifier
                && it.group == "com.thing"
                && it.module == "foo"
        }
    }.files
    doLast {
        assert files*.name == ["foo-1.0.jar"]
    }
}
Contrary to the deprecated Dependency filtering methods, componentFilter does not consider the transitive dependencies of the component being filtered. This allows for more granular control over which artifacts are selected.
Deprecated Namer of Task and Configuration
Task and Configuration have a Namer inner class (also called Namer) that can be used as a common way to retrieve the name of a task or configuration.
Now that these types implement Named, these classes are no longer necessary and have been deprecated. They will be removed in Gradle 9.0. Use Named.Namer.INSTANCE instead.
The super interface, Namer, is not being deprecated.
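For instance, code that previously relied on Task.Namer might migrate as in this sketch:

```kotlin
// Named.Namer.INSTANCE works for any Named object, including tasks and configurations.
val namer = org.gradle.api.Named.Namer.INSTANCE
val taskName: String = namer.determineName(tasks.getByName("help"))
```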
Unix mode-based file permissions deprecated
A new API for defining file permissions was added in Gradle 8.3.
The new API has now been promoted to stable, and the old Unix mode-based methods have been deprecated.
Deprecated setting retention period directly on local build cache
In previous versions, cleanup of the local build cache entries ran every 24 hours, and this interval could not be configured.
The retention period was configured using buildCache.local.removeUnusedEntriesAfterDays.
In Gradle 8.0, a new mechanism was added to configure the cleanup and retention periods for various resources in Gradle User Home. In Gradle 8.8, this mechanism was extended to permit the retention configuration of local build cache entries, providing improved control and consistency.
- Specifying Cleanup.DISABLED or Cleanup.ALWAYS will now prevent or force the cleanup of the local build cache.
- Build cache entry retention is now configured via an init script, in the same manner as other caches.
If you want build cache entries to be retained for 30 days, remove any calls to the deprecated method:
buildCache {
    local {
        // Remove this line
        removeUnusedEntriesAfterDays = 30
    }
}
Add a file like this in ~/.gradle/init.d:
beforeSettings {
    caches {
        buildCache.setRemoveUnusedEntriesAfterDays(30)
    }
}
Calling buildCache.local.removeUnusedEntriesAfterDays is deprecated, and this method will be removed in Gradle 9.0.
If set to a non-default value, this deprecated setting will take precedence over Settings.caches.buildCache.setRemoveUnusedEntriesAfterDays().
Deprecated Kotlin DSL gradle-enterprise plugin block extension
In settings.gradle.kts (Kotlin DSL), you can use gradle-enterprise in the plugins block to apply the Gradle Enterprise plugin with the same version as gradle --scan.
plugins {
    `gradle-enterprise`
}
There is no equivalent to this in settings.gradle (Groovy DSL).
Gradle Enterprise has been renamed Develocity, and the com.gradle.enterprise plugin has been renamed com.gradle.develocity. Therefore, the gradle-enterprise plugin block extension has been deprecated and will be removed in Gradle 9.0.
The Develocity plugin must be applied with an explicit plugin ID and version. There is no develocity shorthand available in the plugins block:
plugins {
    id("com.gradle.develocity") version "3.17.3"
}
If you want to continue using the Gradle Enterprise plugin, you can specify the deprecated plugin ID:
plugins {
    id("com.gradle.enterprise") version "3.17.3"
}
We encourage you to use the latest released Develocity plugin version, even when using an older Gradle version.
Potential breaking changes
Changes in the Problems API
We have implemented several refactorings of the Problems API, including a significant change in how problem definitions and contextual information are handled. The complete design specification can be found here.
In implementing this spec, we have introduced the following breaking changes to the ProblemSpec interface:
- The label(String) and description(String) methods have been replaced with the id(String, String) method and its overloaded variants.
Changes to collection properties
The following incubating APIs introduced in 8.7 have been removed:
- MapProperty.insert*(…)
- HasMultipleValues.append*(…)
Replacements that better handle conventions are under consideration for a future 8.x release.
Upgrade to Groovy 3.0.21
Groovy has been updated to Groovy 3.0.21.
Some changes in static type checking have resulted in source-code incompatibilities.
Starting with 3.0.18, if you cast a closure to an Action without generics, the closure parameter will be Object instead of any explicit type specified. This can be fixed by adding the appropriate type to the cast, after which the now-redundant parameter declaration can be removed:
// Before
tasks.create("foo", { Task it -> it.description = "Foo task" } as Action)
// Fixed
tasks.create("foo", { it.description = "Foo task" } as Action<Task>)
Upgrade to ASM 9.7
ASM was upgraded from 9.6 to 9.7 to ensure earlier compatibility for Java 23.
Upgrading from 8.6 and earlier
Potential breaking changes
Upgrade to Kotlin 1.9.22
The embedded Kotlin has been updated from 1.9.10 to Kotlin 1.9.22.
Upgrade to Apache SSHD 2.10.0
Apache SSHD has been updated from 2.0.0 to 2.10.0.
Replacement and upgrade of JSch
JSch has been replaced by com.github.mwiede:jsch and updated from 0.1.55 to 0.2.16.
Upgrade to Eclipse JGit 5.13.3
Eclipse JGit has been updated from 5.7.0 to 5.13.3.
This includes reworking the way that Gradle configures JGit for SSH operations by moving from JSch to Apache SSHD.
Upgrade to Apache Commons Compress 1.25.0
Apache Commons Compress has been updated from 1.21 to 1.25.0. This change may affect the checksums of the produced jars, zips, and other archive types because the metadata of the produced artifacts may differ.
Upgrade to ASM 9.6
ASM was upgraded from 9.5 to 9.6 for better support of multi-release jars.
Upgrade of the version catalog parser
The version catalog parser has been upgraded and is now compliant with version 1.0.0 of the TOML spec.
This should not impact catalogs that use the recommended syntax or were generated by Gradle for publication.
Deprecations
Deprecated registration of plugin conventions
Using plugin conventions has been emitting warnings since Gradle 8.2. Now, registering plugin conventions will also trigger deprecation warnings. For more information, see the section about plugin convention deprecation.
Referencing tasks and domain objects by "name"() in Kotlin DSL
In Kotlin DSL, it is possible to reference a task or other domain object by its name using the "name"() notation.
There are several ways to look up an element in a container by name:
tasks {
    "wrapper"() // 1 - returns TaskProvider<Task>
    "wrapper"(Wrapper::class) // 2 - returns TaskProvider<Wrapper>
    "wrapper"(Wrapper::class) { // 3 - configures a task named wrapper of type Wrapper
    }
    "wrapper" { // 4 - configures a task named wrapper of type Task
    }
}
The first notation is deprecated and will be removed in Gradle 9.0.
Instead of using "name"() to reference a task or domain object, use named("name") or one of the other supported notations.
The above example would be written as:
tasks {
    named("wrapper") // returns TaskProvider<Task>
}
The Gradle API and Groovy build scripts are not impacted by this.
Deprecated invalid URL decoding behavior
Before Gradle 8.3, Gradle would decode a CharSequence given to Project.uri(Object) using an algorithm that accepted invalid URLs and improperly decoded others.
Gradle now uses the URI class to parse and decode URLs, but with a fallback to the legacy behavior in the event of an error. Starting in Gradle 9.0, the fallback will be removed, and an error will be thrown instead.
To fix a deprecation warning, invalid URLs that require the legacy behavior should be re-encoded to be valid URLs. For example, spaces are not valid in URLs and should be encoded as %20, and when an input has no scheme, the path is taken as-is, without decoding.
Deprecated SelfResolvingDependency
The SelfResolvingDependency interface has been deprecated for removal in Gradle 9.0.
This type dates back to the first versions of Gradle, where some dependencies could be resolved independently. Now, all dependencies should be resolved as part of a dependency graph using a Configuration.
Currently, ProjectDependency and FileCollectionDependency implement this interface. In Gradle 9.0, these types will no longer implement SelfResolvingDependency. Instead, they will both directly implement Dependency.
As such, the following methods of ProjectDependency and FileCollectionDependency will no longer be available:
- resolve
- resolve(boolean)
- getBuildDependencies
Consider the following scripts that showcase the deprecated interface and its replacement:
plugins {
    id("java-library")
}

dependencies {
    implementation(files("bar.txt"))
    implementation(project(":foo"))
}

tasks.register("resolveDeprecated") {
    // Wire build dependencies (calls getBuildDependencies)
    dependsOn(configurations["implementation"].dependencies.toSet())
    // Resolve dependencies
    doLast {
        configurations["implementation"].dependencies.withType<FileCollectionDependency>() {
            assert(resolve().map { it.name } == listOf("bar.txt"))
            assert(resolve(true).map { it.name } == listOf("bar.txt"))
        }
        configurations["implementation"].dependencies.withType<ProjectDependency>() {
            // These methods do not even work properly.
            assert(resolve().map { it.name } == listOf<String>())
            assert(resolve(true).map { it.name } == listOf<String>())
        }
    }
}

tasks.register("resolveReplacement") {
    val conf = configurations["runtimeClasspath"]
    // Wire build dependencies
    dependsOn(conf)
    // Resolve dependencies
    val files = conf.files
    doLast {
        assert(files.map { it.name } == listOf("bar.txt", "foo.jar"))
    }
}
Deprecated members of the org.gradle.util package now report their deprecation
These members will be removed in Gradle 9.0.
- Collection.stringize(Collection)
Upgrading from 8.5 and earlier
Potential breaking changes
Upgrade to JaCoCo 0.8.11
JaCoCo has been updated to 0.8.11.
DependencyAdder renamed to DependencyCollector
The incubating DependencyAdder interface has been renamed to DependencyCollector. A getDependencies method has been added to the interface that returns all declared dependencies.
Deprecations
Deprecated calling registerFeature using the main source set
Calling registerFeature on the java extension using the main source set is deprecated and will change behavior in Gradle 9.0.
Currently, features created by calling usingSourceSet with the main source set are initialized differently than features created with any other source set. Previously, when using the main source set, new implementation, compileOnly, runtimeOnly, api, and compileOnlyApi configurations were created, and the compile and runtime classpaths of the main source set were configured to extend these configurations.
Starting in Gradle 9.0, the main source set will be treated like any other source set. With the java-library plugin applied (or any other plugin that applies the java plugin), calling usingSourceSet with the main source set will throw an exception. This is because the java plugin already configures a main feature. Only if the java plugin is not applied will the main source set be permitted when calling usingSourceSet.
Code that currently registers features with the main source set, such as:
plugins {
    id("java-library")
}

java {
    registerFeature("feature") {
        usingSourceSet(sourceSets["main"])
    }
}

plugins {
    id("java-library")
}

java {
    registerFeature("feature") {
        usingSourceSet(sourceSets.main)
    }
}
should instead create a separate source set for the feature and register the feature with that source set:
plugins {
    id("java-library")
}

sourceSets {
    create("feature")
}

java {
    registerFeature("feature") {
        usingSourceSet(sourceSets["feature"])
    }
}

plugins {
    id("java-library")
}

sourceSets {
    feature
}

java {
    registerFeature("feature") {
        usingSourceSet(sourceSets.feature)
    }
}
Deprecated publishing artifact dependencies with explicit name to Maven repositories
Publishing dependencies with an explicit artifact whose name differs from the dependency’s artifactId to Maven repositories has been deprecated. This behavior is still permitted when publishing to Ivy repositories. It will result in an error in Gradle 9.0.
When publishing to Maven repositories, Gradle will interpret the dependency below as if it were declared with coordinates org:notfoo:1.0:
dependencies {
    implementation("org:foo:1.0") {
        artifact {
            name = "notfoo"
        }
    }
}
Instead, this dependency should be declared as:
dependencies {
    implementation("org:notfoo:1.0")
}
Deprecated ArtifactIdentifier
The ArtifactIdentifier class has been deprecated for removal in Gradle 9.0.
Deprecate mutating DependencyCollector dependencies after observation
Starting in Gradle 9.0, mutating dependencies sourced from a DependencyCollector after those dependencies have been observed will result in an error.
The DependencyCollector interface is used to declare dependencies within the test suites DSL.
Consider the following example where a test suite’s dependency is mutated after it is observed:
plugins {
    id("java-library")
}

testing.suites {
    named<JvmTestSuite>("test") {
        dependencies {
            // Dependency is declared on a `DependencyCollector`
            implementation("com:foo")
        }
    }
}

configurations.testImplementation {
    // Calling `all` here realizes/observes all lazy sources, including the `DependencyCollector`
    // from the test suite block. Operations like resolving a configuration similarly realize lazy sources.
    dependencies.all {
        if (this is ExternalDependency && group == "com" && name == "foo" && version == null) {
            // Dependency is mutated after observation
            version {
                require("2.0")
            }
        }
    }
}
In the above example, the build logic uses iteration and mutation to try to set a default version for a particular dependency if the version is not already set.
Build logic like the above example creates challenges in resolving declared dependencies, as reporting tools will display this dependency as if the user declared the version as "2.0", even though they never did.
Instead, the build logic can avoid iteration and mutation by declaring a preferred version constraint on the dependency’s coordinates.
This allows the dependency management engine to use the version declared on the constraint if no other version is declared.
Consider the following example that replaces the above iteration with an indiscriminate preferred version constraint:
dependencies {
    constraints {
        testImplementation("com:foo") {
            version {
                prefer("2.0")
            }
        }
    }
}
Upgrading from 8.4 and earlier
Potential breaking changes
Upgrade to Kotlin 1.9.20
The embedded Kotlin has been updated to Kotlin 1.9.20.
Changes to Groovy task conventions
The groovy-base plugin is now responsible for configuring source and target compatibility version conventions on all GroovyCompile tasks.
If you use this task without applying groovy-base, you will have to set compatibility versions on these tasks manually.
In general, the groovy-base plugin should be applied whenever working with Groovy language tasks.
Provider.filter
The type of the argument passed to Provider.filter has changed from Predicate to Spec for a more consistent API.
This change should not affect anyone using Provider.filter with a lambda expression.
However, it might affect plugin authors who don’t use SAM conversions to create a lambda.
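To illustrate, here is a sketch of both call styles; the provider value and the predicate logic are purely illustrative:

```kotlin
// Lambda callers are unaffected: Kotlin SAM-converts the lambda to a Spec
val nonEmpty: Provider<String> = provider { "value" }.filter { it.isNotEmpty() }

// Callers that construct the argument explicitly must now build a
// org.gradle.api.specs.Spec rather than a java.util.function.Predicate
val spec = Spec<String> { it.isNotEmpty() }
val filtered: Provider<String> = provider { "value" }.filter(spec)
```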
Deprecations
Deprecated members of the org.gradle.util package now report their deprecation
These members will be removed in Gradle 9.0:
- VersionNumber.parse(String)
- VersionNumber.compareTo(VersionNumber)
Deprecated depending on resolved configuration
When resolving a Configuration, selecting that same configuration as a variant is sometimes possible.
Configurations should be used for one purpose (resolution, consumption or dependency declarations), so this can only occur when a configuration is marked as both consumable and resolvable.
This can lead to circular dependency graphs, as the resolved configuration is used for two purposes.
To avoid this problem, plugins should mark all resolvable configurations as canBeConsumed=false or use the resolvable(String) configuration factory method when creating configurations meant for resolution.
In Gradle 9.0, consuming configurations in this manner will no longer be allowed and result in an error.
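As a sketch of the factory-method approach, the role-scoped factory methods make the intended usage explicit at creation time; the configuration names and the dependencyScope pairing below are illustrative:

```kotlin
configurations {
    // Holds dependency declarations only; neither resolvable nor consumable
    dependencyScope("langDeps")

    // Resolution-only: created with canBeConsumed = false, so it can never be
    // selected as a variant during another resolution
    resolvable("langClasspath") {
        extendsFrom(configurations["langDeps"])
    }
}
```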
Including projects without an existing directory
Gradle will warn if a project is added to the build where the associated projectDir does not exist or is not writable.
Starting with version 9.0, Gradle will not run builds if a project directory is missing or read-only.
If you intend to dynamically synthesize projects, make sure to create directories for them as well:
include("project-without-directory")
project(":project-without-directory").projectDir.mkdirs()
include 'project-without-directory'
project(":project-without-directory").projectDir.mkdirs()
Upgrading from 8.3 and earlier
Potential breaking changes
Upgrade to Kotlin 1.9.10
The embedded Kotlin has been updated to Kotlin 1.9.10.
XML parsing now requires recent parsers
Gradle 8.4 now configures XML parsers with security features enabled. If your build logic depends on old XML parsers that don’t support secure parsing, your build may fail. If you encounter a failure, check and update or remove any dependency on legacy XML parsers.
If you are an Android user, please upgrade your AGP version to 8.3.0 or higher to fix the issue caused by AGP itself. See the Update XML parser used in AGP for Gradle 8.4 compatibility for more details.
If you are unable to upgrade XML parsers coming from your build logic dependencies, you can force the use of the XML parsers built into the JVM.
In OpenJDK, for example, this can be done by adding the following to gradle.properties:
systemProp.javax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl
systemProp.javax.xml.transform.TransformerFactory=com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl
systemProp.javax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl
See the CVE-2023-42445 advisory for more details and ways to enable secure XML processing on previous Gradle versions.
EAR plugin with customized JEE 1.3 descriptor
Gradle 8.4 forbids external XML entities when parsing XML documents.
If you use the EAR plugin, configure the application.xml descriptor via the EAR plugin’s DSL, and customize the descriptor using withXml {} with asElement() in the customization block, then the build will now fail for security reasons.
plugins {
    id("ear")
}
ear {
    deploymentDescriptor {
        version = "1.3"
        withXml {
            asElement()
        }
    }
}
If you happen to use asNode() instead of asElement(), then nothing changes, since asNode() simply ignores external DTDs.
You can work around this by running your build with the javax.xml.accessExternalDTD system property set to http.
On the command line, add this to your Gradle invocation:
-Djavax.xml.accessExternalDTD=http
To make this workaround persistent, add the following line to your gradle.properties:
systemProp.javax.xml.accessExternalDTD=http
Note that this will enable HTTP access to external DTDs for the whole build JVM. See the JAXP documentation for more details.
Deprecations
Deprecated GenerateMavenPom methods
The following methods on GenerateMavenPom are deprecated and will be removed in Gradle 9.0.
They were never intended to be public API.
- getVersionRangeMapper
- withCompileScopeAttributes
- withRuntimeScopeAttributes
Upgrading from 8.2 and earlier
Potential breaking changes
Deprecated Project.buildDir can cause script compilation failure
With the deprecation of Project.buildDir, buildscripts that are compiled with warnings as errors could fail if the deprecated field is used.
See the deprecation entry for details.
TestLauncher API no longer ignores build failures
The TestLauncher interface is part of the Tooling API, specialized for running tests.
It is a logical extension of the BuildLauncher, which can only launch tasks.
A discrepancy has been reported in their behavior: when the same failing test is executed, BuildLauncher reports a build failure, but TestLauncher does not.
Originally, this was a deliberate design decision: execution continued past the first failure so that the tests in all test tasks would run.
At the same time, this behavior can be confusing for users, as they can experience a failing test in a successful build.
To make the two APIs more uniform, TestLauncher now also fails the build, which is a potential breaking change.
Tooling API clients should explicitly pass --continue to the build to continue test execution even if a test task fails.
Fixed variant selection behavior with ArtifactView and ArtifactCollection
The dependency resolution APIs for selecting different artifacts or files (Configuration.getIncoming().artifactView { } and Configuration.getIncoming().getArtifacts()) captured immutable copies of the underlying Configuration’s attributes to use for variant selection.
If the Configuration’s attributes were changed after these methods were called, the artifacts selected by these methods could be unexpected.
Consider the case where the set of attributes on a Configuration is changed after an ArtifactView is created:
tasks {
    myTask {
        inputFiles.from(configurations.classpath.incoming.artifactView {
            attributes {
                // Add attributes to select a different type of artifact
            }
        }.files)
    }
}
configurations {
    classpath {
        attributes {
            // Add more attributes to the configuration
        }
    }
}
The inputFiles property of myTask uses an artifact view to select a different type of artifact from the configuration classpath.
Since the artifact view was created before the attributes were added to the configuration, Gradle could not select the correct artifact.
Some builds may have worked around this by also putting the additional attributes into the artifact view. This is no longer necessary.
Upgrade to Kotlin 1.9.0
The embedded Kotlin has been updated from 1.8.20 to Kotlin 1.9.0. The Kotlin language and API levels for the Kotlin DSL are still set to 1.8 for backward compatibility. See the release notes for Kotlin 1.8.22 and Kotlin 1.8.21.
Kotlin 1.9 dropped support for Kotlin language and API level 1.3. If you build Gradle plugins written in Kotlin with this version of Gradle and need to support Gradle <7.0 you need to stick to using the Kotlin Gradle Plugin <1.9.0 and configure the Kotlin language and API levels to 1.3. See the Compatibility Matrix for details about other versions.
Eager evaluation of Configuration attributes
Gradle 8.3 updates the org.gradle.libraryelements and org.gradle.jvm.version attributes of JVM Configurations to be present at the time of creation, as opposed to previously, where they were only present after the Configuration had been resolved or consumed.
In particular, the value for org.gradle.jvm.version relies on the project’s configured toolchain, meaning that querying the value for this attribute will finalize the value of the project’s Java toolchain.
Plugins or build logic that eagerly queries the attributes of JVM configurations may now cause the project’s Java toolchain to be finalized earlier than before. Attempting to modify the toolchain after it has been finalized will result in error messages similar to the following:
The value for property 'implementation' is final and cannot be changed any further.
The value for property 'languageVersion' is final and cannot be changed any further.
The value for property 'vendor' is final and cannot be changed any further.
This situation may arise when plugins or build logic eagerly query an existing JVM Configuration’s attributes to create a new Configuration with the same attributes. Previously, this logic would have omitted the two above-noted attributes entirely, while now, the same logic will copy the attributes and finalize the project’s Java toolchain. To avoid early toolchain finalization, attribute-copying logic should be updated to query the source Configuration’s attributes lazily:
fun <T> copyAttribute(attribute: Attribute<T>, from: AttributeContainer, to: AttributeContainer) =
    to.attributeProvider<T>(attribute, provider { from.getAttribute(attribute)!! })

val source = configurations["runtimeClasspath"].attributes
configurations {
    create("customRuntimeClasspath") {
        source.keySet().forEach { key ->
            copyAttribute(key, source, attributes)
        }
    }
}
def source = configurations.runtimeClasspath.attributes
configurations {
    customRuntimeClasspath {
        source.keySet().each { key ->
            attributes.attributeProvider(key, provider { source.getAttribute(key) })
        }
    }
}
Deprecations
Deprecated Project.buildDir is to be replaced by Project.layout.buildDirectory
The Project.buildDir property is deprecated.
It uses eager APIs and has ordering issues if the value is read in build logic and then later modified.
It could result in outputs ending up in different locations.
It is replaced by a DirectoryProperty found at Project.layout.buildDirectory.
See the ProjectLayout interface for details.
Note that, at this stage, Gradle will not print deprecation warnings if you still use Project.buildDir.
We know this is a big change, and we want to give the authors of major plugins time to stop using it.
Switching from a File to a DirectoryProperty requires adaptations in build logic.
The main impact is that you cannot use the property inside a String to expand it.
Instead, you should leverage the dir and file methods to compute your desired location.
For example, when creating a file, the following:
// Returns a java.io.File
file("$buildDir/myOutput.txt")
Should be replaced by:
// Compatible with a number of Gradle lazy APIs that accept also java.io.File
val output: Provider<RegularFile> = layout.buildDirectory.file("myOutput.txt")
// If you really need the java.io.File for a non lazy API
output.get().asFile
// Or a path for a lazy String based API
output.map { it.asFile.path }
// Compatible with a number of Gradle lazy APIs that accept also java.io.File
Provider<RegularFile> output = layout.buildDirectory.file("myOutput.txt")
// If you really need the java.io.File for a non lazy API
output.get().asFile
// Or a path for a lazy String based API
output.map { it.asFile.path }
Similarly, when creating a directory, the following:
// Returns a java.io.File
file("$buildDir/outputLocation")
Should be replaced by:
// Compatible with a number of Gradle APIs that accept a java.io.File
val output: Provider<Directory> = layout.buildDirectory.dir("outputLocation")
// If you really need the java.io.File for a non lazy API
output.get().asFile
// Or a path for a lazy String based API
output.map { it.asFile.path }
// Compatible with a number of Gradle APIs that accept a java.io.File
Provider<Directory> output = layout.buildDirectory.dir("outputLocation")
// If you really need the java.io.File for a non lazy API
output.get().asFile
// Or a path for a lazy String based API
output.map { it.asFile.path }
Deprecated ClientModule dependencies
ClientModule dependencies are deprecated and will be removed in Gradle 9.0.
Client module dependencies were originally intended to allow builds to override incorrect or missing component metadata of external dependencies by defining the metadata locally. This functionality has since been replaced by Component Metadata Rules.
Consider the following client module dependency example:
dependencies {
    implementation(module("org:foo:1.0") {
        dependency("org:bar:1.0")
        module("org:baz:1.0") {
            dependency("com:example:1.0")
        }
    })
}
dependencies {
    implementation module("org:foo:1.0") {
        dependency "org:bar:1.0"
        module("org:baz:1.0") {
            dependency "com:example:1.0"
        }
    }
}
This can be replaced with the following component metadata rule:
@CacheableRule
abstract class AddDependenciesRule @Inject constructor(val dependencies: List<String>) : ComponentMetadataRule {
    override fun execute(context: ComponentMetadataContext) {
        listOf("compile", "runtime").forEach { base ->
            context.details.withVariant(base) {
                withDependencies {
                    dependencies.forEach {
                        add(it)
                    }
                }
            }
        }
    }
}
dependencies {
    components {
        withModule<AddDependenciesRule>("org:foo") {
            params(listOf(
                "org:bar:1.0",
                "org:baz:1.0"
            ))
        }
        withModule<AddDependenciesRule>("org:baz") {
            params(listOf("com:example:1.0"))
        }
    }
    implementation("org:foo:1.0")
}
@CacheableRule
abstract class AddDependenciesRule implements ComponentMetadataRule {
    List<String> dependencies

    @Inject
    AddDependenciesRule(List<String> dependencies) {
        this.dependencies = dependencies
    }

    @Override
    void execute(ComponentMetadataContext context) {
        ["compile", "runtime"].each { base ->
            context.details.withVariant(base) {
                withDependencies {
                    dependencies.each {
                        add(it)
                    }
                }
            }
        }
    }
}
dependencies {
    components {
        withModule("org:foo", AddDependenciesRule) {
            params([
                "org:bar:1.0",
                "org:baz:1.0"
            ])
        }
        withModule("org:baz", AddDependenciesRule) {
            params(["com:example:1.0"])
        }
    }
    implementation "org:foo:1.0"
}
Earliest supported Develocity plugin version is 3.13.1
Starting in Gradle 9.0, the earliest supported Develocity plugin version is 3.13.1. The plugin versions from 3.0 up to 3.13 will be ignored when applied.
Upgrade to version 3.13.1 or later of the Develocity plugin. You can find the latest available version on the Gradle Plugin Portal, and more information in the plugin's compatibility documentation.
Upgrading from 8.1 and earlier
Potential breaking changes
Upgrade to Kotlin 1.8.20
The embedded Kotlin has been updated to Kotlin 1.8.20. For more information, see What’s new in Kotlin 1.8.20.
Note that there is a known issue with Kotlin compilation avoidance that can cause OutOfMemory exceptions in compileKotlin tasks if the compilation classpath contains very large JAR files.
This applies to builds applying the Kotlin plugin v1.8.20 or the kotlin-dsl plugin.
You can work around it by disabling Kotlin compilation avoidance in your gradle.properties file:
kotlin.incremental.useClasspathSnapshot=false
See KT-57757 for more information.
Upgrade to Groovy 3.0.17
Groovy has been updated to Groovy 3.0.17.
Since the previous version was 3.0.15, the 3.0.16 changes are also included.
Upgrade to Ant 1.10.13
Ant has been updated to Ant 1.10.13.
Since the previous version was 1.10.11, the 1.10.12 changes are also included.
Upgrade to CodeNarc 3.2.0
The default version of CodeNarc has been updated to CodeNarc 3.2.0.
Upgrade to PMD 6.55.0
PMD has been updated to PMD 6.55.0.
Since the previous version was 6.48.0, all changes since then are included.
Upgrade to JaCoCo 0.8.9
JaCoCo has been updated to 0.8.9.
Plugin compatibility changes
A plugin compiled with Gradle >= 8.2 that makes use of the Kotlin DSL functions Project.the<T>(), Project.the(KClass), or Project.configure<T> {} cannot run on Gradle <= 6.1.
Deferred or avoided configuration of some tasks
When performing dependency resolution, Gradle creates an internal representation of the available Configurations. This requires inspecting all configurations and artifacts. Processing artifacts created by tasks causes those tasks to be realized and configured.
This internal representation is now created more lazily, which can change the order in which tasks are configured. Some tasks may never be configured.
This change may cause code paths that relied on a particular order to no longer function, such as conditionally adding attributes to a configuration based on the presence of certain attributes.
This impacted the bnd plugin and JUnit5 build.
We recommend not modifying domain objects (configurations, source sets, tasks, etc.) from configuration blocks for other domain objects that may not be configured.
For example, avoid doing something like this:
configurations {
    val myConfig = create("myConfig")
}
tasks.register("myTask") {
    // This is not safe, as the execution of this block may not occur, or may not occur in the order expected
    configurations["myConfig"].attributes {
        attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage::class.java, Usage.JAVA_RUNTIME))
    }
}
Deprecations
CompileOptions method deprecations
The following methods on CompileOptions are deprecated:
- getAnnotationProcessorGeneratedSourcesDirectory()
- setAnnotationProcessorGeneratedSourcesDirectory(File)
- setAnnotationProcessorGeneratedSourcesDirectory(Provider<File>)
Current usages of these methods should migrate to the DirectoryProperty getGeneratedSourceOutputDirectory().
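For example, build logic that configured the annotation processor output directory could migrate to the lazy property as in this sketch; the output location is illustrative:

```kotlin
tasks.withType<JavaCompile>().configureEach {
    // Replaces the deprecated File-based setter with the lazy DirectoryProperty
    options.generatedSourceOutputDirectory.set(
        layout.buildDirectory.dir("generated/sources/annotationProcessor")
    )
}
```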
Using configurations incorrectly
Gradle will now warn at runtime when methods of Configuration are called inconsistently with the configuration’s intended usage.
This change is part of a larger ongoing effort to make the intended behavior of configurations more consistent and predictable and to unlock further speed and memory improvements.
Currently, the following methods should only be called with these listed allowed usages:
- resolve() - RESOLVABLE configurations only
- files(Closure), files(Spec), files(Dependency…), fileCollection(Spec), fileCollection(Closure), fileCollection(Dependency…) - RESOLVABLE configurations only
- getResolvedConfigurations() - RESOLVABLE configurations only
- defaultDependencies(Action) - DECLARABLE configurations only
- shouldResolveConsistentlyWith(Configuration) - RESOLVABLE configurations only
- disableConsistentResolution() - RESOLVABLE configurations only
- getDependencyConstraints() - DECLARABLE configurations only
- copy(), copy(Spec), copy(Closure), copyRecursive(), copyRecursive(Spec), copyRecursive(Closure) - RESOLVABLE configurations only
Intended usage is noted in the Configuration interface’s Javadoc.
This list is likely to grow in future releases.
Starting in Gradle 9.0, using a configuration inconsistently with its intended usage will be prohibited.
Also note that although it is not currently restricted, the getDependencies() method is only intended for use with DECLARABLE configurations.
The getAllDependencies() method, which retrieves all declared dependencies on a configuration and any superconfigurations, will not be restricted to any particular usage.
Deprecated access to plugin conventions
The concept of conventions is outdated and superseded by extensions to provide custom DSLs.
To reflect this in the Gradle API, the following elements are deprecated:
- org.gradle.api.internal.HasConvention
Gradle Core plugins still register their conventions in addition to their extensions for backwards compatibility.
It is deprecated to access any of these conventions and their properties. Doing so will now emit a deprecation warning. This will become an error in Gradle 9.0. You should prefer accessing the extensions and their properties instead.
For specific examples, see the next sections.
Prominent community plugins have already migrated to using extensions to provide custom DSLs. Some of them still register conventions for backward compatibility. Registering conventions does not yet emit a deprecation warning, to provide a migration window, but future Gradle versions will.
Also note that plugins compiled with Gradle <= 8.1 that make use of the Kotlin DSL functions Project.the<T>(), Project.the(KClass), or Project.configure<T> {} will emit a deprecation warning when run on Gradle >= 8.2.
To fix this, these plugins should be recompiled with Gradle >= 8.2 or changed to access extensions directly using extensions.getByType<T>() instead.
Deprecated base plugin conventions
The convention properties contributed by the base plugin have been deprecated and scheduled for removal in Gradle 9.0.
For more context, see the section about plugin convention deprecation.
The conventions are replaced by the base { } configuration block backed by BasePluginExtension.
The old convention object defines the distsDirName, libsDirName, and archivesBaseName properties with simple getter and setter methods.
Those methods are available in the extension only to maintain backward compatibility.
Build scripts should solely use the properties of type Property:
plugins {
    base
}
base {
    archivesName.set("gradle")
    distsDirectory.set(layout.buildDirectory.dir("custom-dist"))
    libsDirectory.set(layout.buildDirectory.dir("custom-libs"))
}
plugins {
    id 'base'
}
base {
    archivesName = "gradle"
    distsDirectory = layout.buildDirectory.dir('custom-dist')
    libsDirectory = layout.buildDirectory.dir('custom-libs')
}
Deprecated application plugin conventions
The convention properties contributed by the application plugin have been deprecated and scheduled for removal in Gradle 9.0.
For more context, see the section about plugin convention deprecation.
The following code will now emit deprecation warnings:
plugins {
    application
}
applicationDefaultJvmArgs = listOf("-Dgreeting.language=en") // Accessing a convention
plugins {
    id 'application'
}
applicationDefaultJvmArgs = ['-Dgreeting.language=en'] // Accessing a convention
This should be changed to use the application { } configuration block, backed by JavaApplication, instead:
plugins {
    application
}
application {
    applicationDefaultJvmArgs = listOf("-Dgreeting.language=en")
}
plugins {
    id 'application'
}
application {
    applicationDefaultJvmArgs = ['-Dgreeting.language=en']
}
Deprecated java plugin conventions
The convention properties contributed by the java plugin have been deprecated and scheduled for removal in Gradle 9.0.
For more context, see the section about plugin convention deprecation.
The following code will now emit deprecation warnings:
plugins {
    id("java")
}
configure<JavaPluginConvention> { // Accessing a convention
    sourceCompatibility = JavaVersion.VERSION_18
}
plugins {
    id 'java'
}
sourceCompatibility = 18 // Accessing a convention
This should be changed to use the java { } configuration block, backed by JavaPluginExtension, instead:
plugins {
    id("java")
}
java {
    sourceCompatibility = JavaVersion.VERSION_18
}
plugins {
    id 'java'
}
java {
    sourceCompatibility = JavaVersion.VERSION_18
}
Deprecated war plugin conventions
The convention properties contributed by the war plugin have been deprecated and scheduled for removal in Gradle 9.0.
For more context, see the section about plugin convention deprecation.
The following code will now emit deprecation warnings:
plugins {
    id("war")
}
configure<WarPluginConvention> { // Accessing a convention
    webAppDirName = "src/main/webapp"
}
plugins {
    id 'war'
}
webAppDirName = 'src/main/webapp' // Accessing a convention
Clients should configure the war task directly.
Also, tasks.withType(War.class).configureEach(…) can be used to configure each task of type War.
plugins {
    id("war")
}
tasks.war {
    webAppDirectory.set(file("src/main/webapp"))
}
plugins {
    id 'war'
}
war {
    webAppDirectory = file('src/main/webapp')
}
Deprecated ear plugin conventions
The convention properties contributed by the ear plugin have been deprecated and scheduled for removal in Gradle 9.0.
For more context, see the section about plugin convention deprecation.
The following code will now emit deprecation warnings:
plugins {
    id("ear")
}
configure<EarPluginConvention> { // Accessing a convention
    appDirName = "src/main/app"
}
plugins {
    id 'ear'
}
appDirName = 'src/main/app' // Accessing a convention
Clients should configure the ear task directly.
Also, tasks.withType(Ear.class).configureEach(…) can be used to configure each task of type Ear.
plugins {
    id("ear")
}
tasks.ear {
    appDirectory.set(file("src/main/app"))
}
plugins {
    id 'ear'
}
ear {
    appDirectory = file('src/main/app') // use application metadata found in this folder
}
Deprecated project-report plugin conventions
The convention properties contributed by the project-report plugin have been deprecated and scheduled for removal in Gradle 9.0.
For more context, see the section about plugin convention deprecation.
The following code will now emit deprecation warnings:
plugins {
    `project-report`
}
configure<ProjectReportsPluginConvention> {
    projectReportDirName = "custom" // Accessing a convention
}
plugins {
    id 'project-report'
}
projectReportDirName = "custom" // Accessing a convention
Configure your report task instead:
plugins {
    `project-report`
}
tasks.withType<HtmlDependencyReportTask>() {
    projectReportDirectory.set(project.layout.buildDirectory.dir("reports/custom"))
}
plugins {
    id 'project-report'
}
tasks.withType(HtmlDependencyReportTask) {
    projectReportDirectory = project.layout.buildDirectory.dir("reports/custom")
}
Configuration method deprecations
The following method on Configuration is deprecated for removal:
- getAll()
Obtain the set of all configurations from the project’s configurations container instead.
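For illustration, build logic that previously called getAll() on a configuration instance can query the project’s container instead:

```kotlin
// Obtain every configuration from the project's `configurations` container
// rather than from a single Configuration instance
val allConfigurations: Set<Configuration> = project.configurations.toSet()
```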
Relying on automatic test framework implementation dependencies
In some cases, Gradle will load JVM test framework dependencies from the Gradle distribution to execute tests. This existing behavior can lead to test framework dependency version conflicts on the test classpath. To avoid these conflicts, this behavior is deprecated and will be removed in Gradle 9.0. Tests using TestNG are unaffected.
To prepare for this change in behavior, either declare the required dependencies explicitly or migrate to Test Suites, where these dependencies are managed automatically.
Builds that use test suites will not be affected by this change, since test suites manage the test framework dependencies automatically and do not require them to be declared explicitly. See the user manual for further information on migrating to test suites.
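For reference, here is a sketch of the default test suite configured with JUnit Jupiter, in which case Gradle manages the framework dependencies (including the platform launcher) itself:

```kotlin
testing {
    suites {
        // Configure the built-in "test" suite; useJUnitJupiter() lets Gradle
        // manage the engine and launcher dependencies automatically
        val test by getting(JvmTestSuite::class) {
            useJUnitJupiter()
        }
    }
}
```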
In the absence of test suites, dependencies must be manually declared on the test runtime classpath:
- If using JUnit 5, an explicit runtimeOnly dependency on junit-platform-launcher is required in addition to the existing implementation dependency on the test engine.
- If using JUnit 4, only the existing implementation dependency on junit 4 is required.
- If using JUnit 3, a test runtimeOnly dependency on junit 4 is required in addition to a compileOnly dependency on junit 3.
dependencies {
    // If using JUnit Jupiter
    testImplementation("org.junit.jupiter:junit-jupiter:5.9.2")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")

    // If using JUnit Vintage
    testCompileOnly("junit:junit:4.13.2")
    testRuntimeOnly("org.junit.vintage:junit-vintage-engine:5.9.2")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")

    // If using JUnit 4
    testImplementation("junit:junit:4.13.2")

    // If using JUnit 3
    testCompileOnly("junit:junit:3.8.2")
    testRuntimeOnly("junit:junit:4.13.2")
}
dependencies {
    // If using JUnit Jupiter
    testImplementation 'org.junit.jupiter:junit-jupiter:5.9.2'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'

    // If using JUnit Vintage
    testCompileOnly 'junit:junit:4.13.2'
    testRuntimeOnly 'org.junit.vintage:junit-vintage-engine:5.9.2'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'

    // If using JUnit 4
    testImplementation 'junit:junit:4.13.2'

    // If using JUnit 3
    testCompileOnly 'junit:junit:3.8.2'
    testRuntimeOnly 'junit:junit:4.13.2'
}
BuildIdentifier and ProjectComponentSelector method deprecations
The following methods on BuildIdentifier are deprecated:
- getName()
- isCurrentBuild()
You could use these methods to distinguish between different project components with the same name but from different builds. However, for certain composite build setups, these methods do not provide enough information to guarantee uniqueness.
Current usages of these methods should migrate to BuildIdentifier.getBuildPath().
Similarly, the method ProjectComponentSelector.getBuildName() is deprecated.
Use ProjectComponentSelector.getBuildPath() instead.
Upgrading from 8.0 and earlier
CACHEDIR.TAG files are created in global cache directories
Gradle now emits a CACHEDIR.TAG file in some global cache directories, as specified in Cache marking.
This may cause these directories to no longer be searched or backed up by some tools. To disable it, use the following code in an init script in the Gradle User Home:
beforeSettings {
    caches {
        // Disable cache marking for all caches
        markingStrategy.set(MarkingStrategy.NONE)
    }
}
beforeSettings { settings ->
    settings.caches {
        // Disable cache marking for all caches
        markingStrategy = MarkingStrategy.NONE
    }
}
Configuration cache options renamed
In this release, the configuration cache feature was promoted from incubating to stable.
As such, all properties originally mentioned in the feature documentation, which had an unsafe part in their names (e.g., org.gradle.unsafe.configuration-cache), were renamed, in some cases by removing the unsafe part of the name.

Incubating property | Finalized property
---|---
org.gradle.unsafe.configuration-cache | org.gradle.configuration-cache
org.gradle.unsafe.configuration-cache-problems | org.gradle.configuration-cache.problems
org.gradle.unsafe.configuration-cache.max-problems | org.gradle.configuration-cache.max-problems

Note that the original org.gradle.unsafe.configuration-cache… properties continue to be honored in this release, and no warnings will be produced if they are used, but they will be deprecated and removed in a future release.
Potential breaking changes
Kotlin DSL scripts emit compilation warnings
Compilation warnings from Kotlin DSL scripts are printed to the console output. For example, the use of deprecated APIs in Kotlin DSL will emit warnings each time the script is compiled.
This is a potentially breaking change if you are consuming the console output of Gradle builds.
Configuring Kotlin compiler options with the kotlin-dsl
plugin applied
If you are configuring custom Kotlin compiler options on a project with the kotlin-dsl plugin applied you might encounter a breaking change.
In previous Gradle versions, the kotlin-dsl
plugin was adding required compiler arguments on afterEvaluate {}.
Now that the Kotlin Gradle Plugin provides lazy configuration properties, our kotlin-dsl
plugin switched to adding required compiler arguments to the lazy properties directly.
As a consequence, if you were setting freeCompilerArgs
, the kotlin-dsl
plugin now fails the build because its required compiler arguments are overridden by your configuration.
plugins {
`kotlin-dsl`
}
tasks.withType(KotlinCompile::class).configureEach {
kotlinOptions { // Deprecated non-lazy configuration options
freeCompilerArgs = listOf("-Xcontext-receivers")
}
}
With the configuration above you would get the following build failure:
* What went wrong
Execution failed for task ':compileKotlin'.
> Kotlin compiler arguments of task ':compileKotlin' do not work for the `kotlin-dsl` plugin. The 'freeCompilerArgs' property has been reassigned. It must instead be appended to. Please use 'freeCompilerArgs.addAll(\"your\", \"args\")' to fix this.
You must instead add your custom compiler arguments to the lazy configuration properties of the Kotlin Gradle Plugin so that they are appended to the ones required by the kotlin-dsl
plugin:
plugins {
`kotlin-dsl`
}
tasks.withType(KotlinCompile::class).configureEach {
compilerOptions { // New lazy configuration options
freeCompilerArgs.addAll("-Xcontext-receivers")
}
}
If you were already adding to freeCompilerArgs
instead of setting its value, you should not experience a build failure.
New API introduced may clash with existing Gradle DSL code
When a new property or method is added to an existing type in the Gradle DSL, it may clash with names already used in user code.
When a name clash occurs, one solution is to rename the element in user code.
This is a non-exhaustive list of API additions in 8.1 that may cause name collisions with existing user code.
Using unsupported API to start external processes at configuration time is no longer allowed with the configuration cache enabled
Since Gradle 7.5, using Project.exec
, Project.javaexec
, and standard Java and Groovy APIs to run external processes at configuration time has been considered an error only if the feature preview STABLE_CONFIGURATION_CACHE
was enabled.
With the configuration cache promotion to a stable feature in Gradle 8.1, this error is detected regardless of the feature preview status.
The configuration cache chapter has more details to help with the migration to the new provider-based APIs to execute external processes at configuration time.
Builds that do not use the configuration cache, or only start external processes at execution time are not affected by this change.
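As a sketch of the migration, the provider-based API can replace configuration-time calls to Project.exec. This example assumes git is available on the PATH; the task name is hypothetical:

```kotlin
// build.gradle.kts
val gitHead = providers.exec {
    commandLine("git", "rev-parse", "HEAD")
}.standardOutput.asText // a Provider<String>, captured in a configuration-cache-safe way

tasks.register("printGitHead") {
    val head = gitHead // capture the provider, not the Project
    doLast {
        println("HEAD: " + head.get().trim())
    }
}
```

The process only runs when the provider's value is actually needed, and its output becomes a cacheable configuration input.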
Deprecations
Mutating core plugin configuration usage
The allowed usage of a configuration should be immutable after creation.
Mutating the allowed usage on a configuration created by a Gradle core plugin is deprecated.
This includes calling any of the following Configuration
methods:
-
setCanBeConsumed(boolean)
-
setCanBeResolved(boolean)
These methods now emit deprecation warnings on these configurations, except for certain special cases which make allowances for the existing behavior of popular plugins.
This rule does not yet apply to detached configurations or configurations created in buildscripts and third-party plugins.
Calling setCanBeConsumed(false)
on apiElements
or runtimeElements
is not yet deprecated in order to avoid warnings that would be otherwise emitted when using select popular third-party plugins.
This change is part of a larger ongoing effort to make the intended behavior of configurations more consistent and predictable, and to unlock further speed and memory improvements in this area of Gradle.
The ability to change the allowed usage of a configuration after creation will be removed in Gradle 9.0.
Reserved configuration names
Configuration names "detachedConfiguration" and "detachedConfigurationX" (where X is any integer) are reserved for internal use when creating detached configurations.
The ability to create non-detached configurations with these names will be removed in Gradle 9.0.
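For illustration (the configuration name below is hypothetical), any name outside the reserved pattern remains valid:

```kotlin
// build.gradle.kts
configurations {
    create("integrationTestImplementation") // fine: not a reserved name
    // create("detachedConfiguration1")     // deprecated: reserved for internal detached configurations
}
```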
Calling select methods on the JavaPluginExtension
without the java
component present
Starting in Gradle 8.1, calling any of the following methods on JavaPluginExtension
without
the presence of the default java
component is deprecated:
-
withJavadocJar()
-
withSourcesJar()
-
consistentResolution(Action)
This java
component is added by the JavaPlugin
, which is applied by any of the Gradle JVM plugins including:
-
java-library
-
application
-
groovy
-
scala
Starting in Gradle 9.0, calling any of the above listed methods without the presence of the default java
component will become an error.
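A minimal sketch of the safe pattern is to apply one of the JVM plugins listed above before calling these methods:

```kotlin
// build.gradle.kts
plugins {
    `java-library` // applies JavaPlugin, which registers the default `java` component
}

java {
    withSourcesJar() // safe: the `java` component is present
    withJavadocJar()
}
```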
WarPlugin#configureConfiguration(ConfigurationContainer)
Starting in Gradle 8.1, calling WarPlugin#configureConfiguration(ConfigurationContainer)
is deprecated.
This method was intended for internal use and was never intended to be used as part of the public interface.
Starting in Gradle 9.0, this method will be removed without replacement.
Relying on conventions for custom Test tasks
By default, when applying the java
plugin, the testClassesDirs and classpath
of all Test
tasks have the same convention.
Unless otherwise changed, the default behavior is to execute the tests from the default test
TestSuite
by configuring the task with the classpath
and testClassesDirs
from the test
suite.
This behavior will be removed in Gradle 9.0.
While this existing default behavior is correct for the use case of executing the default unit test suite under a different environment, it does not support the use case of executing an entirely separate set of tests.
If you wish to continue including these tests, use the following code to avoid the deprecation warning in 8.1 and prepare for the behavior change in 9.0. Alternatively, consider migrating to test suites.
val test by testing.suites.existing(JvmTestSuite::class)
tasks.named<Test>("myTestTask") {
testClassesDirs = files(test.map { it.sources.output.classesDirs })
classpath = files(test.map { it.sources.runtimeClasspath })
}
tasks.myTestTask {
testClassesDirs = testing.suites.test.sources.output.classesDirs
classpath = testing.suites.test.sources.runtimeClasspath
}
Modifying Gradle Module Metadata after a publication has been populated
Altering the GMM (e.g., changing a component's configuration variants) after a Maven or Ivy publication has been populated from its components is now deprecated. This feature will be removed in Gradle 9.0.
Eager population of the publication can happen if the following methods are called:
-
Maven
-
Ivy
Previously, the following code did not generate warnings, but it created inconsistencies between published artifacts:
publishing {
publications {
create<MavenPublication>("maven") {
from(components["java"])
}
create<IvyPublication>("ivy") {
from(components["java"])
}
}
}
// These calls eagerly populate the Maven and Ivy publications
(publishing.publications["maven"] as MavenPublication).artifacts
(publishing.publications["ivy"] as IvyPublication).artifacts
val javaComponent = components["java"] as AdhocComponentWithVariants
javaComponent.withVariantsFromConfiguration(configurations["apiElements"]) { skip() }
javaComponent.withVariantsFromConfiguration(configurations["runtimeElements"]) { skip() }
publishing {
publications {
maven(MavenPublication) {
from components.java
}
ivy(IvyPublication) {
from components.java
}
}
}
// These calls eagerly populate the Maven and Ivy publications
publishing.publications.maven.artifacts
publishing.publications.ivy.artifacts
components.java.withVariantsFromConfiguration(configurations.apiElements) { skip() }
components.java.withVariantsFromConfiguration(configurations.runtimeElements) { skip() }
In this example, the Maven and Ivy publications will contain the main JAR artifacts for the project, whereas the GMM module file will omit them.
Running tests on JVM versions 6 and 7
Running JVM tests on JVM versions older than 8 is deprecated. Testing on these versions will become an error in Gradle 9.0.
Applying Kotlin DSL precompiled scripts published with Gradle < 6.0
Applying Kotlin DSL precompiled scripts published with Gradle < 6.0 is deprecated. Please use a version of the plugin published with Gradle >= 6.0.
Applying the kotlin-dsl
together with Kotlin Gradle Plugin < 1.8.0
Applying the kotlin-dsl
together with Kotlin Gradle Plugin < 1.8.0 is deprecated.
Please let Gradle control the version of kotlin-dsl
by removing any explicit kotlin-dsl
version constraints from your build logic.
This will let the kotlin-dsl
plugin decide which version of the Kotlin Gradle Plugin to use.
If you explicitly declare which version of the Kotlin Gradle Plugin to use for your build logic, update it to >= 1.8.0.
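For example, a build-logic project should apply the plugin without pinning a version; a minimal sketch:

```kotlin
// buildSrc/build.gradle.kts
plugins {
    `kotlin-dsl` // no explicit version: the Gradle distribution supplies a compatible one,
                 // which in turn selects a compatible Kotlin Gradle Plugin (>= 1.8.0 in Gradle 8.1)
}
```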
Accessing libraries
or bundles
from dependency version catalogs in the plugins {}
block of a Kotlin script
Accessing libraries
or bundles
from dependency version catalogs in the plugins {}
block of a Kotlin script is deprecated.
Please only use versions
or plugins
from dependency version catalogs in the plugins {}
block.
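As a sketch (the catalog entries below are hypothetical), only plugin and version accessors belong in the plugins {} block:

```kotlin
// build.gradle.kts
plugins {
    alias(libs.plugins.myPlugin)                                  // OK: a plugin from the catalog
    id("org.example.plugin") version libs.versions.example.get()  // OK: a version from the catalog
    // libs.example.library  // deprecated: a library accessor in the plugins {} block
}
```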
Using ValidatePlugins
task without a Java Toolchain
Using a task of type ValidatePlugins without applying the Java Toolchains plugin is deprecated, and will become an error in Gradle 9.0.
To avoid this warning, please apply the plugin to your project:
plugins {
id("jvm-toolchains")
}
plugins {
id 'jvm-toolchains'
}
The Java Toolchains plugin is applied automatically by the Java library plugin and other JVM plugins, so applying any of them to your project will also fix the warning.
Deprecated members of the org.gradle.util
package now report their deprecation
These members will be removed in Gradle 9.0.
-
WrapUtil.toDomainObjectSet(…)
-
GUtil.toCamelCase(…)
-
GUtil.toLowerCase(…)
-
ConfigureUtil
Deprecated JVM vendor IBM Semeru
The enum constant JvmVendorSpec.IBM_SEMERU
is now deprecated and will be removed in Gradle 9.0.
Please replace it by its equivalent JvmVendorSpec.IBM
to avoid warnings and potential errors in the next major version release.
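A toolchain declaration only needs the constant swapped; a sketch (the Java version is an example):

```kotlin
// build.gradle.kts
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(17)
        vendor = JvmVendorSpec.IBM // was: JvmVendorSpec.IBM_SEMERU
    }
}
```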
Setting custom build layout on StartParameter
and GradleBuild
Following the related deprecation of this behaviour in Gradle 7.1, it is now also deprecated to use the related StartParameter and GradleBuild properties. These properties will be removed in Gradle 9.0.
Setting a custom build file using the buildFile property of the GradleBuild task has been deprecated.
Please use the dir property instead to specify the root of the nested build. Alternatively, consider using one of the recommended alternatives for the GradleBuild task.
Setting a custom build layout using the StartParameter methods setBuildFile(File) and setSettingsFile(File), as well as the counterpart getters getBuildFile() and getSettingsFile(), has been deprecated.
Please use standard locations for settings and build files:
-
settings file in the root of the build
-
build file in the root of each subproject
Deprecated org.gradle.cache.cleanup property
The org.gradle.cache.cleanup
property in gradle.properties
under Gradle User Home has been deprecated.
Please use the cache cleanup DSL instead to disable or modify the cleanup configuration.
Since the org.gradle.cache.cleanup
property may still be needed for older versions of Gradle, this property may still be present and no deprecation warnings will be printed as long as it is also configured via the DSL.
The DSL value will always take preference over the org.gradle.cache.cleanup
property.
If the desired configuration is to disable cleanup for older versions of Gradle (using org.gradle.cache.cleanup
), but to enable cleanup with the default values for Gradle versions at or above Gradle 8, then cleanup should be configured to use Cleanup.DEFAULT:
if (GradleVersion.current() >= GradleVersion.version('8.0')) {
apply from: "gradle8/cache-settings.gradle"
}
if (GradleVersion.current() >= GradleVersion.version("8.0")) {
apply(from = "gradle8/cache-settings.gradle")
}
beforeSettings { settings ->
settings.caches {
cleanup = Cleanup.DEFAULT
}
}
beforeSettings {
caches {
cleanup.set(Cleanup.DEFAULT)
}
}
Deprecated using relative paths to specify Java executables
Using relative file paths to point to Java executables is now deprecated and will become an error in Gradle 9. This is done to reduce confusion about what such relative paths should resolve against.
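Rather than pointing a task at a Java executable via a relative path, use an absolute path or, preferably, a toolchain. A sketch, assuming a JVM plugin such as java is applied (the version is an example):

```kotlin
// build.gradle.kts
tasks.withType<JavaExec>().configureEach {
    javaLauncher = javaToolchains.launcherFor {
        languageVersion = JavaLanguageVersion.of(17)
    }
}
```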
Calling Task.getConvention()
, Task.getExtensions()
from a task action
Calling Task.getConvention() or Task.getExtensions() from a task action at execution time is now deprecated and will become an error in Gradle 9.0.
See the configuration cache chapter for details on how to migrate these usages to APIs that are supported by the configuration cache.
Deprecated running test task successfully when no test executed
Running the Test
task successfully when no test was executed is now deprecated and will become an error in Gradle 9.0.
Note that it is not an error when no test sources are present; in this case, the test
task is simply skipped.
It is only an error when test sources are present, but no test was selected for execution.
This change avoids accidental successful test runs caused by erroneous configuration.
Changes in the IDE integration
Workaround for false positive errors shown in Kotlin DSL plugins {}
block using version catalog is not needed anymore
Version catalog accessors for plugin aliases in the plugins {}
block aren’t shown as errors in IntelliJ IDEA and Android Studio Kotlin script editor anymore.
If you were using the @Suppress("DSL_SCOPE_VIOLATION")
annotation as a workaround, you can now remove it.
If you were using the Gradle Libs Error Suppressor IntelliJ IDEA plugin, you can now uninstall it.
After upgrading Gradle to 8.1 you will need to clear the IDE caches and restart.
CORE CONCEPTS
Gradle Basics
Gradle automates building, testing, and deployment of software from information in build scripts.
Gradle core concepts
Projects
A Gradle project is a piece of software that can be built, such as an application or a library.
Single project builds include a single project called the root project.
Multi-project builds include one root project and any number of subprojects.
Build Scripts
Build scripts detail to Gradle what steps to take to build the project.
Each project can include one or more build scripts.
Dependency Management
Dependency management is an automated technique for declaring and resolving external resources required by a project.
Each project typically includes a number of external dependencies that Gradle will resolve during the build.
Tasks
Tasks are a basic unit of work, such as compiling code or running your tests.
Each project contains one or more tasks defined inside a build script or a plugin.
Plugins
Plugins are used to extend Gradle’s capability and optionally contribute tasks to a project.
Gradle project structure
Many developers will interact with Gradle for the first time through an existing project.
The presence of the gradlew
and gradlew.bat
files in the root directory of a project is a clear indicator that Gradle is used.
A Gradle project will look similar to the following:
project
├── gradle // (1)
│ ├── libs.versions.toml // (2)
│ └── wrapper
│ ├── gradle-wrapper.jar
│ └── gradle-wrapper.properties
├── gradlew // (3)
├── gradlew.bat // (3)
├── settings.gradle(.kts) // (4)
├── subproject-a
│ ├── build.gradle(.kts) // (5)
│ └── src // (6)
└── subproject-b
├── build.gradle(.kts) // (5)
└── src // (6)
-
Gradle directory to store wrapper files and more
-
Gradle version catalog for dependency management
-
Gradle wrapper scripts
-
Gradle settings file to define a root project name and subprojects
-
Gradle build scripts of the two subprojects -
subproject-a
and subproject-b
-
Source code and/or additional files for the projects
Invoking Gradle
IDE
Gradle is built-in to many IDEs including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse, and NetBeans.
Gradle can be automatically invoked when you build, clean, or run your app in the IDE.
It is recommended that you consult the manual for the IDE of your choice to learn more about how Gradle can be used and configured.
Command line
Gradle can be invoked in the command line once installed. For example:
$ gradle build
Note
|
Most projects do not use the installed version of Gradle. |
Gradle Wrapper
The Wrapper is a script that invokes a declared version of Gradle and is the recommended way to execute a Gradle build.
It is found in the project root directory as a gradlew
or gradlew.bat
file:
$ ./gradlew build // Linux or macOS
$ gradlew.bat build // Windows
Next Step: Learn about the Gradle Wrapper >>
Gradle Wrapper Basics
The recommended way to execute any Gradle build is with the Gradle Wrapper.
The Wrapper script invokes a declared version of Gradle, downloading it beforehand if necessary.
The Wrapper is available as a gradlew
or gradlew.bat
file.
The Wrapper provides the following benefits:
-
Standardizes a project on a given Gradle version.
-
Provisions the same Gradle version for different users.
-
Provisions the Gradle version for different execution environments (IDEs, CI servers…).
Using the Gradle Wrapper
It is always recommended to execute a build with the Wrapper to ensure a reliable, controlled, and standardized execution of the build.
Depending on the operating system, you run gradlew
or gradlew.bat
instead of the gradle
command.
Typical Gradle invocation:
$ gradle build
To run the Wrapper on a Linux or macOS machine:
$ ./gradlew build
To run the Wrapper on Windows PowerShell:
$ .\gradlew.bat build
The command is run in the same directory that the Wrapper is located in. If you want to run the command in a different directory, you must provide the relative path to the Wrapper:
$ ../gradlew build
The following console output demonstrates the use of the Wrapper on a Windows machine, in the command prompt (cmd), for a Java-based project:
$ gradlew.bat build
Downloading https://services.gradle.org/distributions/gradle-5.0-all.zip
.....................................................................................
Unzipping C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0-all.zip to C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-al\ac27o8rbd0ic8ih41or9l32mv
Set executable permissions for: C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0\bin\gradle

BUILD SUCCESSFUL in 12s
1 actionable task: 1 executed
Understanding the Wrapper files
The following files are part of the Gradle Wrapper:
.
├── gradle
│ └── wrapper
│ ├── gradle-wrapper.jar // (1)
│ └── gradle-wrapper.properties // (2)
├── gradlew // (3)
└── gradlew.bat // (4)
-
gradle-wrapper.jar
: This is a small JAR file that contains the Gradle Wrapper code. It is responsible for downloading and installing the correct version of Gradle for a project if it’s not already installed. -
gradle-wrapper.properties
: This file contains configuration properties for the Gradle Wrapper, such as the distribution URL (where to download Gradle from) and the distribution type (ZIP or TARBALL). -
gradlew
: This is a shell script (Unix-based systems) that acts as a wrapper around gradle-wrapper.jar
. It is used to execute Gradle tasks on Unix-based systems without needing to manually install Gradle. -
gradlew.bat
: This is a batch script (Windows) that serves the same purpose as gradlew
but is used on Windows systems.
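For reference, a typical gradle-wrapper.properties looks like the following (the Gradle version in the URL is an example):

```properties
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.12-bin.zip
networkTimeout=10000
validateDistributionUrl=true
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
```

The distributionUrl is what the `wrapper --gradle-version` command updates for you.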
Important
|
You should never alter these files. |
If you want to view or update the Gradle version of your project, use the command line. Do not edit the wrapper files manually:
$ ./gradlew --version
$ ./gradlew wrapper --gradle-version 7.2
$ gradlew.bat --version
$ gradlew.bat wrapper --gradle-version 7.2
Consult the Gradle Wrapper reference to learn more.
Next Step: Learn about the Gradle CLI >>
Command-Line Interface Basics
The command-line interface is the primary method of interacting with Gradle outside the IDE.
Use of the Gradle Wrapper is highly encouraged.
Substitute ./gradlew
(in macOS / Linux) or gradlew.bat
(in Windows) for gradle
in the following examples.
Executing Gradle on the command line conforms to the following structure:
gradle [taskName...] [--option-name...]
Options are allowed before and after task names.
gradle [--option-name...] [taskName...]
If multiple tasks are specified, you should separate them with a space.
gradle [taskName1 taskName2...] [--option-name...]
Options that accept values can be specified with or without =
between the option and argument. The use of =
is recommended.
gradle [...] --console=plain
Options that enable behavior have long-form options with inverses specified with --no-
. The following are opposites.
gradle [...] --build-cache
gradle [...] --no-build-cache
Many long-form options have short-option equivalents. The following are equivalent:
gradle --help
gradle -h
Command-line usage
The following sections describe the use of the Gradle command-line interface. Some plugins also add their own command line options.
Executing tasks
To execute a task called taskName
on the root project, type:
$ gradle :taskName
This will run the single taskName
and all of its dependencies.
Specify options for tasks
To pass an option to a task, prefix the option name with --
after the task name:
$ gradle taskName --exampleOption=exampleValue
Consult the Gradle Command Line Interface reference to learn more.
Next Step: Learn about the Settings file >>
Settings File Basics
The settings file is the entry point of every Gradle project.
The primary purpose of the settings file is to add subprojects to your build.
Gradle supports single and multi-project builds.
-
For single-project builds, the settings file is optional.
-
For multi-project builds, the settings file is mandatory and declares all subprojects.
Settings script
The settings file is a script.
It is either a settings.gradle
file written in Groovy or a settings.gradle.kts
file in Kotlin.
The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.
The settings file is typically found in the root directory of the project.
Let’s take a look at an example and break it down:
rootProject.name = "root-project" // (1)
include("sub-project-a") // (2)
include("sub-project-b")
include("sub-project-c")
-
Define the project name.
-
Add subprojects.
rootProject.name = 'root-project' // (1)
include('sub-project-a') // (2)
include('sub-project-b')
include('sub-project-c')
-
Define the project name.
-
Add subprojects.
1. Define the project name
The settings file defines your project name:
rootProject.name = "root-project"
There is only one root project per build.
2. Add subprojects
The settings file defines the structure of the project by including subprojects, if there are any:
include("app")
include("business-logic")
include("data-model")
Consult the Writing Settings File page to learn more.
Next Step: Learn about the Build scripts >>
Build File Basics
Generally, a build script details build configuration, tasks, and plugins.
Every Gradle build comprises at least one build script.
In the build file, two types of dependencies can be added:
-
The libraries and/or plugins on which Gradle and the build script depend.
-
The libraries on which the project sources (i.e., source code) depend.
Build scripts
The build script is either a build.gradle
file written in Groovy or a build.gradle.kts
file in Kotlin.
The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.
Let’s take a look at an example and break it down:
plugins {
id("application") // (1)
}
application {
mainClass = "com.example.Main" // (2)
}
-
Add plugins.
-
Use convention properties.
plugins {
id 'application' // (1)
}
application {
mainClass = 'com.example.Main' // (2)
}
-
Add plugins.
-
Use convention properties.
1. Add plugins
Plugins extend Gradle’s functionality and can contribute tasks to a project.
Adding a plugin to a build is called applying a plugin and makes additional functionality available.
plugins {
id("application")
}
The application
plugin facilitates creating an executable JVM application.
Applying the Application plugin also implicitly applies the Java plugin.
The java
plugin adds Java compilation along with testing and bundling capabilities to a project.
2. Use convention properties
A plugin adds tasks to a project. It also adds properties and methods to a project.
The application
plugin defines tasks that package and distribute an application, such as the run
task.
The Application plugin provides a way to declare the main class of a Java application, which is required to execute the code.
application {
mainClass = "com.example.Main"
}
In this example, the main class (i.e., the point where the program’s execution begins) is com.example.Main
.
Consult the Writing Build Scripts page to learn more.
Next Step: Learn about Dependency Management >>
Dependency Management Basics
Gradle has built-in support for dependency management.
Dependency management is an automated technique for declaring and resolving external resources required by a project.
Gradle build scripts define the process to build projects that may require external dependencies. Dependencies refer to JARs, plugins, libraries, or source code that support building your project.
Version Catalog
Version catalogs provide a way to centralize your dependency declarations in a libs.versions.toml
file.
The catalog makes sharing dependencies and version configurations between subprojects simple. It also allows teams to enforce versions of libraries and plugins in large projects.
The version catalog typically contains four sections:
-
[versions] to declare the version numbers that plugins and libraries will reference.
-
[libraries] to define the libraries used in the build files.
-
[bundles] to define a set of dependencies.
-
[plugins] to define plugins.
[versions]
androidGradlePlugin = "7.4.1"
mockito = "2.16.0"
[libraries]
googleMaterial = { group = "com.google.android.material", name = "material", version = "1.1.0-alpha05" }
mockitoCore = { module = "org.mockito:mockito-core", version.ref = "mockito" }
[plugins]
androidApplication = { id = "com.android.application", version.ref = "androidGradlePlugin" }
The file is located in the gradle
directory so that it can be used by Gradle and IDEs automatically.
The version catalog should be checked into source control: gradle/libs.versions.toml
.
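A [bundles] section, which the example catalog above omits, groups several libraries under one accessor; a sketch using the libraries declared above (the bundle name is hypothetical):

```toml
[bundles]
uiAndMocking = ["googleMaterial", "mockitoCore"]
```

The whole group can then be added with a single declaration, e.g. implementation(libs.bundles.uiAndMocking).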
Declaring Your Dependencies
To add a dependency to your project, specify a dependency in the dependencies block of your build.gradle(.kts)
file.
The following build.gradle.kts
file adds a plugin and two dependencies to the project using the version catalog above:
plugins {
alias(libs.plugins.androidApplication) // (1)
}
dependencies {
// Dependency on a remote binary to compile and run the code
implementation(libs.googleMaterial) // (2)
// Dependency on a remote binary to compile and run the test code
testImplementation(libs.mockitoCore) // (3)
}
-
Applies the Android Gradle plugin to this project, which adds several features that are specific to building Android apps.
-
Adds the Material dependency to the project. Material Design provides components for creating a user interface in an Android App. This library will be used to compile and run the Kotlin source code in this project.
-
Adds the Mockito dependency to the project. Mockito is a mocking framework for testing Java code. This library will be used to compile and run the test source code in this project.
Dependencies in Gradle are grouped by configurations.
-
The
material
library is added to theimplementation
configuration, which is used for compiling and running production code. -
The
mockito-core
library is added to thetestImplementation
configuration, which is used for compiling and running test code.
Note
|
There are many more configurations available. |
Viewing Project Dependencies
You can view your dependency tree in the terminal using the ./gradlew :app:dependencies
command:
$ ./gradlew :app:dependencies
> Task :app:dependencies
------------------------------------------------------------
Project ':app'
------------------------------------------------------------
implementation - Implementation only dependencies for source set 'main'. (n)
\--- com.google.android.material:material:1.1.0-alpha05 (n)
testImplementation - Implementation only dependencies for source set 'test'. (n)
\--- org.mockito:mockito-core:2.16.0 (n)
...
Consult the Dependency Management chapter to learn more.
Next Step: Learn about Tasks >>
Task Basics
A task represents some independent unit of work that a build performs, such as compiling classes, creating a JAR, generating Javadoc, or publishing archives to a repository.
You run a Gradle build
task using the gradle
command or by invoking the Gradle Wrapper (./gradlew
or gradlew.bat
) in your project directory:
$ ./gradlew build
Available tasks
All available tasks in your project come from Gradle plugins and build scripts.
You can list all the available tasks in the project by running the following command in the terminal:
$ ./gradlew tasks
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
...
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.
...
Other tasks
-----------
compileJava - Compiles main Java source.
...
Running tasks
The run
task is executed with ./gradlew run
:
$ ./gradlew run
> Task :app:compileJava
> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:run
Hello World!
BUILD SUCCESSFUL in 904ms
2 actionable tasks: 2 executed
In this example Java project, the output of the run
task is a Hello World
statement printed on the console.
Task dependency
Many times, a task requires another task to run first.
For example, for Gradle to execute the build
task, the Java code must first be compiled.
Thus, the build
task depends on the compileJava
task.
This means that the compileJava
task will run before the build
task:
$ ./gradlew build
> Task :app:compileJava
> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:jar
> Task :app:startScripts
> Task :app:distTar
> Task :app:distZip
> Task :app:assemble
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check
> Task :app:build
BUILD SUCCESSFUL in 764ms
7 actionable tasks: 7 executed
Build scripts can optionally define task dependencies. Gradle then automatically determines the task execution order.
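A sketch of declaring such a dependency explicitly in a build script (the task name is hypothetical):

```kotlin
// build.gradle.kts
tasks.register("printClasses") {
    dependsOn("compileJava") // Gradle will run compileJava before this task
    doLast {
        println("Java classes are compiled.")
    }
}
```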
Consult the Task development chapter to learn more.
Next Step: Learn about Plugins >>
Plugin Basics
Gradle is built on a plugin system. Gradle itself is primarily composed of infrastructure, such as a sophisticated dependency resolution engine. The rest of its functionality comes from plugins.
A plugin is a piece of software that provides additional functionality to the Gradle build system.
Plugins can be applied to a Gradle build script to add new tasks, configurations, or other build-related capabilities:
- The Java Library Plugin - java-library - Used to define and build Java libraries. It compiles Java source code with the compileJava task, generates Javadoc with the javadoc task, and packages the compiled classes into a JAR file with the jar task.
- The Google Services Gradle Plugin - com.google.gms:google-services - Enables Google APIs and Firebase services in your Android application with a configuration block called googleServices{} and a task called generateReleaseAssets.
- The Gradle Bintray Plugin - com.jfrog.bintray - Allows you to publish artifacts to Bintray by configuring the plugin using the bintray{} block.
Plugin distribution
Plugins are distributed in three ways:
- Core plugins - Gradle develops and maintains a set of Core Plugins.
- Community plugins - Gradle’s community shares plugins via the Gradle Plugin Portal.
- Local plugins - Gradle enables users to create custom plugins using APIs.
Applying plugins
Applying a plugin to a project allows the plugin to extend the project’s capabilities.
You apply plugins in the build script using a plugin id (a globally unique identifier / name) and a version:
plugins {
id «plugin id» version «plugin version»
}
1. Core plugins
Gradle Core plugins are a set of plugins that are included in the Gradle distribution itself. These plugins provide essential functionality for building and managing projects.
Some examples of core plugins include:
- java: Provides support for building Java projects.
- groovy: Adds support for compiling and testing Groovy source files.
- ear: Adds support for building EAR files for enterprise applications.
Core plugins are unique in that they provide short names, such as java for the core JavaPlugin, when applied in build scripts. They also do not require versions.
To apply the java plugin to a project:
plugins {
id("java")
}
There are many Gradle Core Plugins users can take advantage of.
2. Community plugins
Community plugins are plugins developed by the Gradle community, rather than being part of the core Gradle distribution. These plugins provide additional functionality that may be specific to certain use cases or technologies.
The Spring Boot Gradle plugin packages executable JAR or WAR archives, and runs Spring Boot Java applications.
To apply the org.springframework.boot plugin to a project:
plugins {
id("org.springframework.boot") version "3.1.5"
}
Community plugins can be published at the Gradle Plugin Portal, where other Gradle users can easily discover and use them.
3. Local plugins
Custom or local plugins are developed and used within a specific project or organization. These plugins are not shared publicly and are tailored to the specific needs of the project or organization.
Local plugins can encapsulate common build logic, provide integrations with internal systems or tools, or abstract complex functionality into reusable components.
Gradle provides users with the ability to develop custom plugins using APIs. To create your own plugin, you’ll typically follow these steps:
-
Define the plugin class: create a new class that implements the Plugin<Project> interface:
// Define a 'HelloPlugin' plugin
class HelloPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Define the 'hello' task
        val helloTask = project.tasks.register("hello") {
            doLast {
                println("Hello, Gradle!")
            }
        }
    }
}
-
Build and optionally publish your plugin: generate a JAR file containing your plugin code and optionally publish this JAR to a repository (local or remote) to be used in other projects:
// Publish the plugin
plugins {
    `maven-publish`
}

publishing {
    publications {
        create<MavenPublication>("mavenJava") {
            from(components["java"])
        }
    }
    repositories {
        mavenLocal()
    }
}
-
Apply your plugin: when you want to use the plugin, include the plugin ID and version in the plugins{} block of the build file:
// Apply the plugin
plugins {
    id("com.example.hello") version "1.0"
}
Consult the Plugin development chapter to learn more.
Next Step: Learn about Incremental Builds and Build Caching >>
Gradle Incremental Builds and Build Caching
Gradle uses two main features to reduce build time: incremental builds and build caching.
Incremental builds
An incremental build is a build that avoids running tasks whose inputs have not changed since the previous build. Re-executing such tasks is unnecessary if they would only reproduce the same output.
For incremental builds to work, tasks must define their inputs and outputs. Gradle will determine whether the inputs or outputs have changed at build time. If they have changed, Gradle will execute the task. Otherwise, it will skip execution.
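As a minimal sketch of how a task declares its inputs and outputs, a custom task class can annotate its properties; Gradle then tracks them for up-to-date checks. The task and property names here are hypothetical:

```kotlin
// build.gradle.kts — a hypothetical task with tracked inputs/outputs
abstract class GenerateGreeting : DefaultTask() {
    @get:Input
    abstract val greeting: Property<String>        // input value

    @get:OutputFile
    abstract val outputFile: RegularFileProperty   // output location

    @TaskAction
    fun generate() {
        // Re-executed only when 'greeting' or the output file changes
        outputFile.get().asFile.writeText(greeting.get())
    }
}

tasks.register<GenerateGreeting>("generateGreeting") {
    greeting.set("Hello, Gradle!")
    outputFile.set(layout.buildDirectory.file("greeting.txt"))
}
```

The second run of ./gradlew generateGreeting with unchanged inputs would report the task as UP-TO-DATE.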
Incremental builds are always enabled, and the best way to see them in action is to turn on verbose mode. With verbose mode, each task state is labeled during a build:
$ ./gradlew compileJava --console=verbose
> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE
> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :list:compileJava UP-TO-DATE
> Task :utilities:compileJava UP-TO-DATE
> Task :app:compileJava UP-TO-DATE
BUILD SUCCESSFUL in 374ms
12 actionable tasks: 12 up-to-date
When you run a task that has been previously executed and hasn’t changed, then UP-TO-DATE is printed next to the task.
Tip
To permanently enable verbose mode, add org.gradle.console=verbose to your gradle.properties file.
Build caching
Incremental Builds are a great optimization that helps avoid work already done. If a developer continuously changes a single file, there is likely no need to rebuild all the other files in the project.
However, what happens when the same developer switches to a new branch created last week? The files are rebuilt, even though the developer is building something that has been built before.
This is where a build cache is helpful.
The build cache stores previous build results and restores them when needed. It prevents the redundant work and cost of executing time-consuming and expensive processes.
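The local build cache is enabled by setting org.gradle.caching=true in gradle.properties, or per-invocation with --build-cache. Its behavior can also be configured in the settings file; the cache location below is a hypothetical example, not a required layout:

```kotlin
// settings.gradle.kts — a sketch of local build cache configuration
buildCache {
    local {
        // Store cached task outputs in a custom location (hypothetical path)
        directory = File(rootDir, "build-cache")
    }
}
```

Teams typically add a remote {} cache node on top of this so CI-produced outputs can be reused on developer machines.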
When the build cache has been used to repopulate the local directory, the tasks are marked as FROM-CACHE:
$ ./gradlew compileJava --build-cache
> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE
> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :list:compileJava FROM-CACHE
> Task :utilities:compileJava FROM-CACHE
> Task :app:compileJava FROM-CACHE
BUILD SUCCESSFUL in 364ms
12 actionable tasks: 3 from cache, 9 up-to-date
Once the local directory has been repopulated, the next execution will mark tasks as UP-TO-DATE and not FROM-CACHE.
The build cache allows you to share and reuse unchanged build and test outputs across teams. This speeds up local and CI builds since cycles are not wasted re-building binaries unaffected by new code changes.
Consult the Build cache chapter to learn more.
Next Step: Learn about Build Scans >>
Build Scans
A build scan is a representation of metadata captured as you run your build.
Build Scans
Gradle captures your build metadata and sends it to the Build Scan Service. The service then transforms the metadata into information you can analyze and share with others.
The information that scans collect can be an invaluable resource when troubleshooting, collaborating on, or optimizing the performance of your builds.
For example, with a build scan, it’s no longer necessary to copy and paste error messages or include all the details about your environment each time you want to ask a question on Stack Overflow, Slack, or the Gradle Forum. Instead, copy the link to your latest build scan.
Enable Build Scans
To enable build scans on a gradle command, add the --scan option to the command line:
./gradlew build --scan
You may be prompted to agree to the terms to use Build Scans.
Visit the Build Scans page to learn more.
Next Step: Start the Tutorial >>
CORE CONCEPTS
Gradle Directories
Gradle uses two main directories to perform and manage its work: the Gradle User Home directory and the Project Root directory.
Gradle User Home directory
By default, the Gradle User Home (~/.gradle or C:\Users\<USERNAME>\.gradle) stores global configuration properties, initialization scripts, caches, and log files.
It can be set with the environment variable GRADLE_USER_HOME.
Tip
Not to be confused with GRADLE_HOME, the optional installation directory for Gradle.
It is roughly structured as follows:
├── caches // (1)
│ ├── 4.8 // (2)
│ ├── 4.9 // (2)
│ ├── ⋮
│ ├── jars-3 // (3)
│ └── modules-2 // (3)
├── daemon // (4)
│ ├── ⋮
│ ├── 4.8
│ └── 4.9
├── init.d // (5)
│ └── my-setup.gradle
├── jdks // (6)
│ ├── ⋮
│ └── jdk-14.0.2+12
├── wrapper
│ └── dists // (7)
│ ├── ⋮
│ ├── gradle-4.8-bin
│ ├── gradle-4.9-all
│ └── gradle-4.9-bin
└── gradle.properties // (8)
1. Global cache directory (for everything that is not project-specific).
2. Version-specific caches (e.g., to support incremental builds).
3. Shared caches (e.g., for artifacts of dependencies).
4. Registry and logs of the Gradle Daemon.
5. Global initialization scripts.
6. JDKs downloaded by the toolchain support.
7. Distributions downloaded by the Gradle Wrapper.
8. Global Gradle configuration properties.
Consult the Gradle Directories reference to learn more.
Project Root directory
The project root directory contains all source files from your project.
It also contains files and directories Gradle generates, such as .gradle and build, as well as the Gradle configuration directory: gradle.
Tip
The gradle and .gradle directories are different.
While gradle is usually checked into source control, the build and .gradle directories contain the output of your builds, caches, and other transient files Gradle uses to support features like incremental builds.
The anatomy of a typical project root directory looks as follows:
├── .gradle // (1)
│ ├── 4.8 // (2)
│ ├── 4.9 // (2)
│ └── ⋮
├── build // (3)
├── gradle
│ └── wrapper // (4)
├── gradle.properties // (5)
├── gradlew // (6)
├── gradlew.bat // (6)
├── settings.gradle.kts // (7)
├── subproject-one // (8)
| └── build.gradle.kts // (9)
├── subproject-two // (8)
| └── build.gradle.kts // (9)
└── ⋮
1. Project-specific cache directory generated by Gradle.
2. Version-specific caches (e.g., to support incremental builds).
3. The build directory of this project into which Gradle generates all build artifacts.
4. Contains the JAR file and configuration of the Gradle Wrapper.
5. Project-specific Gradle configuration properties.
6. Scripts for executing builds using the Gradle Wrapper.
7. The project’s settings file where the list of subprojects is defined.
8. Usually, a project is organized into one or multiple subprojects.
9. Each subproject has its own Gradle build script.
Consult the Gradle Directories reference to learn more.
Next Step: Learn how to structure Multi-Project Builds >>
Multi-Project Build Basics
Gradle supports multi-project builds.
While some small projects and monolithic applications may contain a single build file and source tree, it is more common for a project to be split into smaller, interdependent modules. The word "interdependent" is vital, as you typically want to link the many modules together through a single build.
Gradle supports this scenario through multi-project builds. This is sometimes referred to as a multi-module project. Gradle refers to modules as subprojects.
A multi-project build consists of one root project and one or more subprojects.
Multi-Project structure
The following represents the structure of a multi-project build that contains three subprojects:
The directory structure should look as follows:
├── .gradle
│ └── ⋮
├── gradle
│ ├── libs.versions.toml
│ └── wrapper
├── gradlew
├── gradlew.bat
├── settings.gradle.kts // (1)
├── sub-project-1
│ └── build.gradle.kts // (2)
├── sub-project-2
│ └── build.gradle.kts // (2)
└── sub-project-3
└── build.gradle.kts // (2)
1. The settings.gradle.kts file should include all subprojects.
2. Each subproject should have its own build.gradle.kts file.
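For the structure above, the settings file could look like the following sketch; the root project name is a hypothetical example:

```kotlin
// settings.gradle.kts — a sketch matching the structure above
rootProject.name = "my-multi-project-build" // hypothetical name

include("sub-project-1")
include("sub-project-2")
include("sub-project-3")
```

Each included name must match a directory (relative to the settings file) containing that subproject's build.gradle.kts.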
Multi-Project standards
The Gradle community has two standards for multi-project build structures:
- Multi-Project Builds using buildSrc - where buildSrc is a subproject-like directory at the Gradle project root containing all the build logic.
- Composite Builds - a build that includes other builds where build-logic is a build directory at the Gradle project root containing reusable build logic.
1. Multi-Project Builds using buildSrc
Multi-project builds allow you to organize projects with many modules, wire dependencies between those modules, and easily share common build logic amongst them.
For example, a build that has many modules called mobile-app, web-app, api, lib, and documentation could be structured as follows:
.
├── gradle
├── gradlew
├── settings.gradle.kts
├── buildSrc
│ ├── build.gradle.kts
│ └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│ └── build.gradle.kts
├── web-app
│ └── build.gradle.kts
├── api
│ └── build.gradle.kts
├── lib
│ └── build.gradle.kts
└── documentation
└── build.gradle.kts
The modules will have dependencies between them such as web-app and mobile-app depending on lib.
This means that in order for Gradle to build web-app or mobile-app, it must build lib first.
In this example, the root settings file will look as follows:
include("mobile-app", "web-app", "api", "lib", "documentation")
Note
The order in which the subprojects (modules) are included does not matter.
The buildSrc directory is automatically recognized by Gradle. It is a good place to define and maintain shared configuration or imperative build logic, such as custom tasks or plugins.
buildSrc is automatically included in your build as a special subproject if a build.gradle(.kts) file is found under buildSrc.
If the java plugin is applied to the buildSrc project, the compiled code from buildSrc/src/main/java is put in the classpath of the root build script, making it available to any subproject (web-app, mobile-app, lib, etc.) in the build.
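For instance, assuming buildSrc applies the kotlin-dsl plugin so that the shared-build-conventions.gradle.kts script shown in the layout above is compiled into a precompiled script plugin, any subproject can apply it by its file name; this is a sketch:

```kotlin
// mobile-app/build.gradle.kts — a sketch
plugins {
    // Applies buildSrc/src/main/kotlin/shared-build-conventions.gradle.kts
    id("shared-build-conventions")
}
```

This keeps repeated configuration (plugin application, compiler options, test setup) in one place instead of copying it into every subproject build script.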
Consult how to declare dependencies between subprojects to learn more.
2. Composite Builds
Composite Builds, also referred to as included builds, are best for sharing logic between builds (not subprojects) or isolating access to shared build logic (i.e., convention plugins).
Let’s take the previous example.
The logic in buildSrc has been turned into a project that contains plugins and can be published and worked on independently of the root project build.
The plugin is moved to its own build called build-logic with a build script and settings file:
.
├── gradle
├── gradlew
├── settings.gradle.kts
├── build-logic
│ ├── settings.gradle.kts
│ └── conventions
│ ├── build.gradle.kts
│ └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│ └── build.gradle.kts
├── web-app
│ └── build.gradle.kts
├── api
│ └── build.gradle.kts
├── lib
│ └── build.gradle.kts
└── documentation
└── build.gradle.kts
Note
The fact that build-logic is located in a subdirectory of the root project is irrelevant. The folder could be located outside the root project if desired.
The root settings file includes the entire build-logic build:
pluginManagement {
includeBuild("build-logic")
}
include("mobile-app", "web-app", "api", "lib", "documentation")
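The build-logic build is itself a complete Gradle build with its own settings file. Matching the layout above, its settings file would simply include the conventions project, sketched here:

```kotlin
// build-logic/settings.gradle.kts — a sketch
include("conventions")
```

The conventions project would then apply the kotlin-dsl plugin in its own build script so its *.gradle.kts files are compiled into plugins that the main build can apply by ID.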
Consult how to create composite builds with includeBuild to learn more.
Multi-Project path
A project path has the following pattern: it starts with an optional colon, which denotes the root project.
The root project, :, is the only project in a path not specified by its name.
The rest of a project path is a colon-separated sequence of project names, where the next project is a subproject of the previous project:
:sub-project-1
You can see the project paths when running gradle projects:
------------------------------------------------------------
Root project 'project'
------------------------------------------------------------
Root project 'project'
+--- Project ':sub-project-1'
\--- Project ':sub-project-2'
Project paths usually reflect the filesystem layout, but there are exceptions, most notably for composite builds.
Identifying project structure
You can use the gradle projects command to identify the project structure.
As an example, let’s use a multi-project build with the following structure:
$ gradle -q projects
Projects:
------------------------------------------------------------
Root project 'multiproject'
------------------------------------------------------------
Root project 'multiproject'
+--- Project ':api'
+--- Project ':services'
| +--- Project ':services:shared'
| \--- Project ':services:webservice'
\--- Project ':shared'
To see a list of the tasks of a project, run gradle <project-path>:tasks. For example, try running gradle :api:tasks.
A multi-project build is still a collection of tasks you can run. The difference is that you may want to control which project’s tasks get executed.
The following sections will cover your two options for executing tasks in a multi-project build.
Executing tasks by name
The command gradle test will execute the test task in any subprojects relative to the current working directory that have that task.
If you run the command from the root project directory, you will run test in api, shared, services:shared and services:webservice.
If you run the command from the services project directory, you will only execute the task in services:shared and services:webservice.
The basic rule behind Gradle’s behavior is to execute all tasks down the hierarchy with this name, and to complain only if no such task is found in any of the subprojects traversed.
Note
Some task selectors, like help or dependencies, will only run the task on the project they are invoked on and not on all the subprojects, to reduce the amount of information printed on the screen.
Executing tasks by fully qualified name
You can use a task’s fully qualified name to execute a specific task in a particular subproject.
For example: gradle :services:webservice:build will run the build task of the webservice subproject.
The fully qualified name of a task is its project path plus the task name.
This approach works for any task, so if you want to know what tasks are in a particular subproject, use the tasks task, e.g. gradle :services:webservice:tasks.
Multi-Project building and testing
The build task is typically used to compile, test, and check a single project.
In multi-project builds, you may often want to do all of these tasks across various projects. The buildNeeded and buildDependents tasks can help with this.
In this example, the :services:person-service project depends on both the :api and :shared projects. The :api project also depends on the :shared project.
Assuming you are working on a single project, the :api project, you have been making changes but have not built the entire project since performing a clean. You want to build any necessary supporting JARs but only perform code quality and unit tests on the parts of the project you have changed.
The build task does this:
$ gradle :api:build
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
BUILD SUCCESSFUL in 0s
If you have just gotten the latest version of the source from your version control system, which included changes in other projects that :api depends on, you might want to build all the projects you depend on AND test them too.
The buildNeeded task builds AND tests all the projects from the project dependencies of the testRuntime configuration:
$ gradle :api:buildNeeded
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :shared:assemble
> Task :shared:compileTestJava
> Task :shared:processTestResources
> Task :shared:testClasses
> Task :shared:test
> Task :shared:check
> Task :shared:build
> Task :shared:buildNeeded
> Task :api:buildNeeded
BUILD SUCCESSFUL in 0s
You may want to refactor some part of the :api project used in other projects. If you make these changes, testing only the :api project is insufficient. You must test all projects that depend on the :api project.
The buildDependents task tests ALL the projects that have a project dependency (in the testRuntime configuration) on the specified project:
$ gradle :api:buildDependents
> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :services:person-service:compileJava
> Task :services:person-service:processResources
> Task :services:person-service:classes
> Task :services:person-service:jar
> Task :services:person-service:assemble
> Task :services:person-service:compileTestJava
> Task :services:person-service:processTestResources
> Task :services:person-service:testClasses
> Task :services:person-service:test
> Task :services:person-service:check
> Task :services:person-service:build
> Task :services:person-service:buildDependents
> Task :api:buildDependents
BUILD SUCCESSFUL in 0s
Finally, you can build and test everything in all projects. Any task you run in the root project folder will cause that same-named task to be run on all the children.
You can run gradle build to build and test ALL projects.
Consult the Structuring Builds chapter to learn more.
Next Step: Learn about the Gradle Build Lifecycle >>
Build Lifecycle
As a build author, you define tasks and specify dependencies between them. Gradle guarantees that tasks will execute in the order dictated by these dependencies.
Your build scripts and plugins configure this task dependency graph.
For example, if your project includes tasks such as build, assemble, and createDocs, you can configure the build script so that they are executed in the order: build → assemble → createDocs.
Task Graphs
Gradle builds the task graph before executing any task.
Across all projects in the build, tasks form a Directed Acyclic Graph (DAG).
This diagram shows two example task graphs, one abstract and the other concrete, with dependencies between tasks represented as arrows:
Both plugins and build scripts contribute to the task graph via the task dependency mechanism and annotated inputs/outputs.
Build Phases
A Gradle build has three distinct phases.
Gradle runs these phases in order:
- Phase 1. Initialization
-
Evaluates the settings file, settings.gradle(.kts), to detect which projects and included builds participate in the build.
-
Creates a Settings instance, then a Project instance for every project.
- Phase 2. Configuration
-
Evaluates the build scripts, build.gradle(.kts), of every project participating in the build.
-
Creates a task graph for requested tasks.
- Phase 3. Execution
-
Schedules and executes the selected tasks.
-
Dependencies between tasks determine execution order.
-
Execution of tasks can occur in parallel.
Example
The following example shows which parts of settings and build files correspond to various build phases:
rootProject.name = "basic"
println("This is executed during the initialization phase.")
println("This is executed during the configuration phase.")
tasks.register("configured") {
println("This is also executed during the configuration phase, because :configured is used in the build.")
}
tasks.register("test") {
doLast {
println("This is executed during the execution phase.")
}
}
tasks.register("testBoth") {
doFirst {
println("This is executed first during the execution phase.")
}
doLast {
println("This is executed last during the execution phase.")
}
println("This is executed during the configuration phase as well, because :testBoth is used in the build.")
}
rootProject.name = 'basic'
println 'This is executed during the initialization phase.'
println 'This is executed during the configuration phase.'
tasks.register('configured') {
println 'This is also executed during the configuration phase, because :configured is used in the build.'
}
tasks.register('test') {
doLast {
println 'This is executed during the execution phase.'
}
}
tasks.register('testBoth') {
doFirst {
println 'This is executed first during the execution phase.'
}
doLast {
println 'This is executed last during the execution phase.'
}
println 'This is executed during the configuration phase as well, because :testBoth is used in the build.'
}
The following command executes the test and testBoth tasks specified above.
Because Gradle only configures requested tasks and their dependencies, the configured task is never configured:
> gradle test testBoth
This is executed during the initialization phase.
> Configure project :
This is executed during the configuration phase.
This is executed during the configuration phase as well, because :testBoth is used in the build.
> Task :test
This is executed during the execution phase.
> Task :testBoth
This is executed first during the execution phase.
This is executed last during the execution phase.
BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
Phase 1. Initialization
In the initialization phase, Gradle detects the set of projects (root and subprojects) and included builds participating in the build.
Gradle first evaluates the settings file, settings.gradle(.kts), and instantiates a Settings object. Then, Gradle instantiates Project instances for each project.
Phase 2. Configuration
In the configuration phase, Gradle adds tasks and other properties to the projects found by the initialization phase.
Phase 3. Execution
In the execution phase, Gradle runs tasks.
Gradle uses the task execution graphs generated by the configuration phase to determine which tasks to execute.
Next Step: Learn how to write Settings files >>
Writing Settings Files
The settings file is the entry point of every Gradle build.
Early in the Gradle Build lifecycle, the initialization phase finds the settings file in your project root directory.
When the settings file settings.gradle(.kts) is found, Gradle instantiates a Settings object.
One of the purposes of the Settings object is to allow you to declare all the projects to be included in the build.
Settings Scripts
The settings script is either a settings.gradle file in Groovy or a settings.gradle.kts file in Kotlin.
Before Gradle assembles the projects for a build, it creates a Settings instance and executes the settings file against it. As the settings script executes, it configures this Settings object. Therefore, the settings file defines the Settings object.
Important
There is a one-to-one correspondence between a Settings instance and a settings.gradle(.kts) file.
The Settings Object
The Settings object is part of the Gradle API. Many top-level properties and blocks in a settings script are part of the Settings API.
For example, we can set the root project name in the settings script using the Settings.rootProject property:
settings.rootProject.name = "application"
Which is usually shortened to:
rootProject.name = "application"
rootProject.name = 'application'
Standard Settings properties
The Settings object exposes a standard set of properties in your settings script.
The following table lists a few commonly used properties:
Name | Description
---|---
buildCache | The build cache configuration.
plugins | The container of plugins that have been applied to the settings.
rootDir | The root directory of the build. The root directory is the project directory of the root project.
rootProject | The root project of the build.
settings | Returns this settings object.
The following table lists a few commonly used methods:
Name | Description
---|---
include() | Adds the given projects to the build.
includeBuild() | Includes a build at the specified path to the composite build.
Settings Script structure
A Settings script is a series of method calls to the Gradle API that often use { … }, a special shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a closure in Groovy.
Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy closure object is passed as the argument.
It is the short form for:
plugins(function() {
id("plugin")
})
Blocks are mapped to Gradle API methods.
The code inside the function is executed against a this object, called a receiver in Kotlin lambdas and a delegate in Groovy closures. Gradle determines the correct this object and invokes the correct corresponding method.
The this of the id("plugin") method invocation is an object of type PluginDependenciesSpec.
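The receiver mechanism can be sketched in plain Kotlin; the names here are illustrative only and are not Gradle API:

```kotlin
// A plain-Kotlin sketch of a configuration block with a receiver
class PluginSpec {
    fun id(name: String) {
        println("applying plugin: $name")
    }
}

// The lambda parameter has receiver type PluginSpec, so inside the
// block 'this' is a PluginSpec and 'id(...)' resolves against it
fun plugins(configure: PluginSpec.() -> Unit) {
    PluginSpec().configure()
}

fun main() {
    plugins {
        id("java") // equivalent to this.id("java")
    }
}
```

Gradle's DSLs work the same way: each block determines a receiver (or delegate) type, which is why id("...") is valid inside plugins { } but not at the top level of the script.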
The settings file is composed of Gradle API calls built on top of the DSLs. Gradle executes the script line by line, top to bottom.
Let’s take a look at an example and break it down:
pluginManagement { // (1)
repositories {
gradlePluginPortal()
}
}
plugins { // (2)
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}
rootProject.name = "simple-project" // (3)
dependencyResolutionManagement { // (4)
repositories {
mavenCentral()
}
}
include("sub-project-a") // (5)
include("sub-project-b")
include("sub-project-c")
pluginManagement { // (1)
repositories {
gradlePluginPortal()
}
}
plugins { // (2)
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}
rootProject.name = 'simple-project' // (3)
dependencyResolutionManagement { // (4)
repositories {
mavenCentral()
}
}
include("sub-project-a") // (5)
include("sub-project-b")
include("sub-project-c")
1. Define the location of plugins.
2. Apply settings plugins.
3. Define the root project name.
4. Define dependency resolution strategies.
5. Add subprojects to the build.
1. Define the location of plugins
The settings file can manage plugin versions and repositories for your build using the pluginManagement block. It provides a way to define which plugins should be used in your project and from which repositories they should be resolved.
pluginManagement { // (1)
repositories {
gradlePluginPortal()
}
}
2. Apply settings plugins
The settings file can optionally apply plugins that are required for configuring the settings of the project. Common examples are the Develocity plugin and the Toolchain Resolver plugin shown in the example below.
Plugins applied in the settings file only affect the Settings object.
plugins { // (2)
id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}
3. Define the root project name
The settings file defines your project name using the rootProject.name property:
rootProject.name = "simple-project" // (3)
rootProject.name = 'simple-project' // (3)
There is only one root project per build.
4. Define dependency resolution strategies
The settings file can optionally define rules and configurations for dependency resolution across your project(s). It provides a centralized way to manage and customize dependency resolution.
dependencyResolutionManagement { // (4)
repositories {
mavenCentral()
}
}
dependencyResolutionManagement { // (4)
repositories {
mavenCentral()
}
}
You can also include version catalogs in this section.
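For instance, a version catalog can be declared alongside the repositories. The sketch below assumes the conventional `libs` catalog name; the alias and coordinates are illustrative:

```kotlin
dependencyResolutionManagement {
    repositories {
        mavenCentral()
    }
    versionCatalogs {
        // "libs" is the conventional catalog name; the alias and version are illustrative
        create("libs") {
            library("guava", "com.google.guava:guava:32.1.1-jre")
        }
    }
}
```

Build scripts can then reference the dependency as `libs.guava` instead of repeating the coordinates.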
5. Add subprojects to the build
The settings file defines the structure of the project by adding all the subprojects using the include
statement:
include("sub-project-a") // (5)
include("sub-project-b")
include("sub-project-c")
include("sub-project-a") // (5)
include("sub-project-b")
include("sub-project-c")
You can also include entire builds using includeBuild
.
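As a sketch, a composite build can pull in a separately built project by its directory path (the path below is illustrative):

```kotlin
// Includes the build located in the ../my-other-build directory (illustrative path).
// Tasks and dependency substitutions from that build become available to this one.
includeBuild("../my-other-build")
```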
Settings File Scripting
There are many more properties and methods on the Settings
object that you can use to configure your build.
It’s important to remember that while many Gradle scripts are typically written in short Groovy or Kotlin syntax, every item in the settings script is essentially invoking a method on the Settings
object in the Gradle API:
include("app")
Is actually:
settings.include("app")
Additionally, the full power of the Groovy and Kotlin languages is available to you.
For example, instead of using include
many times to add subprojects, you can iterate over the list of directories in the project root folder and include them automatically:
rootDir.listFiles().filter { it.isDirectory && File(it, "build.gradle.kts").exists() }.forEach {
    include(it.name)
}
Tip
|
This type of logic should be developed in a plugin. |
Next Step: Learn how to write Build scripts >>
Writing Build Scripts
The initialization phase in the Gradle Build lifecycle finds the root project and subprojects included in your project root directory using the settings file.
Then, for each project included in the settings file, Gradle creates a Project
instance.
Gradle then looks for a corresponding build script file, which is used in the configuration phase.
Build Scripts
Every Gradle build comprises one or more projects: a root project and, optionally, subprojects.
A project typically corresponds to a software component that needs to be built, like a library or an application. It might represent a library JAR, a web application, or a distribution ZIP assembled from the JARs produced by other projects.
On the other hand, it might represent a thing to be done, such as deploying your application to staging or production environments.
Gradle scripts are written in either Groovy DSL or Kotlin DSL (domain-specific language).
A build script configures a project and is associated with an object of type Project
.
As the build script executes, it configures Project
.
The build script is either a *.gradle
file in Groovy or a *.gradle.kts
file in Kotlin.
Important
|
Build scripts configure Project objects and their children.
|
The Project
object
The Project
object is part of the Gradle API:
Many top-level properties and blocks in a build script are part of the Project API.
For example, the following build script uses the Project.name property to print the name of the project:
println(name)
println(project.name)
println name
println project.name
$ gradle -q check
project-api
project-api
Both println
statements print out the same property.
The first uses the top-level reference to the name
property of the Project
object.
The second statement uses the project
property available to any build script, which returns the associated Project
object.
Standard project properties
The Project
object exposes a standard set of properties in your build script.
The following table lists a few commonly used properties:
Name | Type | Description
---|---|---
name | String | The name of the project directory.
path | String | The fully qualified name of the project.
description | String | A description for the project.
dependencies | DependencyHandler | Returns the dependency handler of the project.
repositories | RepositoryHandler | Returns the repository handler of the project.
layout | ProjectLayout | Provides access to several important locations for a project.
group | Object | The group of this project.
version | Object | The version of this project.
The following table lists a few commonly used methods:
Name | Description
---|---
uri() | Resolves a file path to a URI, relative to the project directory of this project.
task() | Creates a Task with the given name and adds it to this project.
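For example, the uri() method can turn a project-relative path into an absolute URI (a short sketch; the path is illustrative):

```kotlin
// Resolves "src/config.xml" against this project's directory
val configUri = uri("src/config.xml")
println(configUri) // e.g. file:/path/to/project/src/config.xml
```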
Build Script structure
A build script is composed of { } blocks, each a special object in both Groovy and Kotlin.
This object is called a lambda in Kotlin or a closure in Groovy.
Simply put, the plugins { } block is a method invocation in which a Kotlin lambda object or Groovy closure object is passed as the argument.
It is the short form for:
plugins(function() {
id("plugin")
})
Blocks are mapped to Gradle API methods.
The code inside the function is executed against a this object, called a receiver in a Kotlin lambda and a delegate in a Groovy closure.
Gradle determines the correct this object and invokes the corresponding method.
The this of the method invocation id("plugin") is of type PluginDependenciesSpec.
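The same mechanism can be sketched in plain Kotlin, outside Gradle, to show how a block like plugins { id("…") } resolves its calls against a receiver. PluginSpec and its id method below are illustrative stand-ins, not Gradle's real types:

```kotlin
// A minimal sketch of a DSL block: the lambda has a receiver, so calls
// inside the braces resolve against that receiver object.
class PluginSpec {
    val applied = mutableListOf<String>()
    fun id(pluginId: String) {
        applied += pluginId // record the requested plugin id
    }
}

// `plugins` is just a method taking a lambda with PluginSpec as receiver
fun plugins(configure: PluginSpec.() -> Unit): PluginSpec {
    val spec = PluginSpec()
    spec.configure() // inside the lambda, `this` is the PluginSpec
    return spec
}

fun main() {
    val spec = plugins {
        id("application") // resolved as this.id("application")
    }
    println(spec.applied) // [application]
}
```

Gradle's real DSL works the same way, except the receiver types and method lookups are provided by the Gradle API.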
The build script is essentially composed of Gradle API calls built on top of the DSLs. Gradle executes the script line by line, top to bottom.
Let’s take a look at an example and break it down:
plugins { // (1)
id("application")
}
repositories { // (2)
mavenCentral()
}
dependencies { // (3)
testImplementation("org.junit.jupiter:junit-jupiter-engine:5.9.3")
testRuntimeOnly("org.junit.platform:junit-platform-launcher")
implementation("com.google.guava:guava:32.1.1-jre")
}
application { // (4)
mainClass = "com.example.Main"
}
tasks.named<Test>("test") { // (5)
useJUnitPlatform()
}
tasks.named<Javadoc>("javadoc").configure {
exclude("app/Internal*.java")
exclude("app/internal/*")
}
tasks.register<Zip>("zip-reports") {
from("Reports/")
include("*")
archiveFileName.set("Reports.zip")
destinationDirectory.set(file("/dir"))
}
plugins { // (1)
id 'application'
}
repositories { // (2)
mavenCentral()
}
dependencies { // (3)
testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.9.3'
testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
implementation 'com.google.guava:guava:32.1.1-jre'
}
application { // (4)
mainClass = 'com.example.Main'
}
tasks.named('test', Test) { // (5)
useJUnitPlatform()
}
tasks.named('javadoc', Javadoc).configure {
exclude 'app/Internal*.java'
exclude 'app/internal/*'
}
tasks.register('zip-reports', Zip) {
from 'Reports/'
include '*'
archiveFileName = 'Reports.zip'
destinationDirectory = file('/dir')
}
1. Apply plugins to the build.
2. Define the locations where dependencies can be found.
3. Add dependencies.
4. Set properties.
5. Register and configure tasks.
1. Apply plugins to the build
Plugins are used to extend Gradle. They are also used to modularize and reuse project configurations.
Plugins can be applied using the PluginDependenciesSpec
plugins script block.
The plugins block is preferred:
plugins { // (1)
id("application")
}
plugins { // (1)
id 'application'
}
In the example, the application
plugin, which is included with Gradle, has been applied, describing our project as a Java application.
2. Define the locations where dependencies can be found
A project generally has a number of dependencies it needs to do its work. Dependencies include plugins, libraries, or components that Gradle must download for the build to succeed.
The build script lets Gradle know where to look for the binaries of the dependencies. More than one location can be provided:
repositories { // (2)
mavenCentral()
}
repositories { // (2)
mavenCentral()
}
In the example, dependencies such as the guava library and the JUnit testing libraries will be downloaded from the Maven Central repository.
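When a single repository is not enough, several can simply be listed; Gradle searches them in the declared order. A sketch, where the custom URL is illustrative:

```kotlin
repositories {
    mavenCentral()
    google()
    maven {
        url = uri("https://repo.example.com/releases") // illustrative private repository
    }
}
```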
3. Add dependencies
A project generally has a number of dependencies it needs to do its work. These dependencies are often libraries of precompiled classes that are imported in the project’s source code.
Dependencies are managed via configurations and are retrieved from repositories.
Use the DependencyHandler
returned by Project.getDependencies()
method to manage the dependencies.
Use the RepositoryHandler
returned by Project.getRepositories()
method to manage the repositories.
dependencies { // (3)
testImplementation("org.junit.jupiter:junit-jupiter-engine:5.9.3")
testRuntimeOnly("org.junit.platform:junit-platform-launcher")
implementation("com.google.guava:guava:32.1.1-jre")
}
dependencies { // (3)
testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.9.3'
testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
implementation 'com.google.guava:guava:32.1.1-jre'
}
In the example, the application code uses Google’s guava
libraries.
Guava provides utility methods for collections, caching, primitives support, concurrency, common annotations, string processing, I/O, and validations.
4. Set properties
A plugin can add properties and methods to a project using extensions.
The Project
object has an associated ExtensionContainer
object that contains all the settings and properties for the plugins that have been applied to the project.
In the example, the application
plugin added an application
property, which is used to detail the main class of our Java application:
application { // (4)
mainClass = "com.example.Main"
}
application { // (4)
mainClass = 'com.example.Main'
}
5. Register and configure tasks
Tasks perform some basic piece of work, such as compiling classes, running unit tests, or zipping up a WAR file.
While tasks are typically defined in plugins, you may need to register or configure tasks in build scripts.
Registering a task adds the task to your project.
You can register tasks in a project using the TaskContainer.register(java.lang.String)
method:
tasks.register<Zip>("zip-reports") {
from("Reports/")
include("*")
archiveFileName.set("Reports.zip")
destinationDirectory.set(file("/dir"))
}
tasks.register('zip-reports', Zip) {
from 'Reports/'
include '*'
archiveFileName = 'Reports.zip'
destinationDirectory = file('/dir')
}
You may have seen usage of the TaskContainer.create(java.lang.String)
method which should be avoided.
tasks.create<Zip>("zip-reports") { }
Tip
|
register(), which enables task configuration avoidance, is preferred over create().
|
You can locate a task to configure it using the TaskCollection.named(java.lang.String)
method:
tasks.named<Test>("test") { // (5)
useJUnitPlatform()
}
tasks.named('test', Test) { // (5)
useJUnitPlatform()
}
The example below configures the Javadoc
task to automatically generate HTML documentation from Java code:
tasks.named<Javadoc>("javadoc").configure {
exclude("app/Internal*.java")
exclude("app/internal/*")
}
tasks.named('javadoc', Javadoc).configure {
exclude 'app/Internal*.java'
exclude 'app/internal/*'
}
Build Scripting
A build script is made up of zero or more statements and script blocks:
println(project.layout.projectDirectory);
Statements can include method calls, property assignments, and local variable definitions:
version = '1.0.0.GA'
A script block is a method call which takes a closure/lambda as a parameter:
configurations {
}
The closure/lambda configures some delegate object as it executes:
repositories {
google()
}
A build script is also a Groovy or a Kotlin script:
tasks.register("upper") {
doLast {
val someString = "mY_nAmE"
println("Original: $someString")
println("Upper case: ${someString.toUpperCase()}")
}
}
tasks.register('upper') {
doLast {
String someString = 'mY_nAmE'
println "Original: $someString"
println "Upper case: ${someString.toUpperCase()}"
}
}
$ gradle -q upper
Original: mY_nAmE
Upper case: MY_NAME
It can contain elements allowed in a Groovy or Kotlin script, such as method definitions and class definitions:
tasks.register("count") {
doLast {
repeat(4) { print("$it ") }
}
}
tasks.register('count') {
doLast {
4.times { print "$it " }
}
}
$ gradle -q count
0 1 2 3
Flexible task registration
Using the capabilities of the Groovy or Kotlin language, you can register multiple tasks in a loop:
repeat(4) { counter ->
tasks.register("task$counter") {
doLast {
println("I'm task number $counter")
}
}
}
4.times { counter ->
tasks.register("task$counter") {
doLast {
println "I'm task number $counter"
}
}
}
$ gradle -q task1
I'm task number 1
Gradle Types
In Gradle, types, properties, and providers are foundational for managing and configuring build logic:
- Types: Gradle defines types (like Task, Configuration, File, etc.) to represent build components. You can extend these types to create custom tasks or domain objects.
- Properties: Gradle properties (e.g., Property<T>, ListProperty<T>, SetProperty<T>) are used for build configuration. They allow lazy evaluation, meaning their values are calculated only when needed, enhancing flexibility and performance.
- Providers: A Provider<T> represents a value that is computed or retrieved lazily. Providers are often used with properties to defer value computation until necessary. This is especially useful for integrating dynamic, runtime values into your build.
You can learn more about this in Understanding Gradle Types.
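As a sketch of how these pieces fit together in a build script, a Property set from a mapped Provider is only resolved when the task runs; the task and property names below are illustrative:

```kotlin
abstract class GreetingTask : DefaultTask() {
    @get:Input
    abstract val message: Property<String> // lazily evaluated; nothing computed yet

    @TaskAction
    fun greet() {
        println(message.get()) // value is only resolved here, at execution time
    }
}

tasks.register<GreetingTask>("greet") {
    // map derives a new Provider without computing anything eagerly
    message = providers.provider { "Hello" }.map { "$it, Gradle!" }
}
```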
Declare Variables
Build scripts can declare two variables: local variables and extra properties.
Local Variables
Declare local variables with the val
keyword. Local variables are only visible in the scope where they have been declared. They are a feature of the underlying Kotlin language.
Declare local variables with the def
keyword. Local variables are only visible in the scope where they have been declared. They are a feature of the underlying Groovy language.
val dest = "dest"
tasks.register<Copy>("copy") {
from("source")
into(dest)
}
def dest = 'dest'
tasks.register('copy', Copy) {
from 'source'
into dest
}
Extra Properties
Gradle’s enhanced objects, including projects, tasks, and source sets, can hold user-defined properties.
Add, read, and set extra properties via the owning object’s extra
property. Alternatively, you can access extra properties via Kotlin delegated properties using by extra
.
Add, read, and set extra properties via the owning object’s ext
property. Alternatively, you can use an ext
block to add multiple properties simultaneously.
plugins {
id("java-library")
}
val springVersion by extra("3.1.0.RELEASE")
val emailNotification by extra { "build@master.org" }
sourceSets.all { extra["purpose"] = null }
sourceSets {
main {
extra["purpose"] = "production"
}
test {
extra["purpose"] = "test"
}
create("plugin") {
extra["purpose"] = "production"
}
}
tasks.register("printProperties") {
val springVersion = springVersion
val emailNotification = emailNotification
val productionSourceSets = provider {
sourceSets.matching { it.extra["purpose"] == "production" }.map { it.name }
}
doLast {
println(springVersion)
println(emailNotification)
productionSourceSets.get().forEach { println(it) }
}
}
plugins {
id 'java-library'
}
ext {
springVersion = "3.1.0.RELEASE"
emailNotification = "build@master.org"
}
sourceSets.all { ext.purpose = null }
sourceSets {
main {
purpose = "production"
}
test {
purpose = "test"
}
plugin {
purpose = "production"
}
}
tasks.register('printProperties') {
def springVersion = springVersion
def emailNotification = emailNotification
def productionSourceSets = provider {
sourceSets.matching { it.purpose == "production" }.collect { it.name }
}
doLast {
println springVersion
println emailNotification
productionSourceSets.get().each { println it }
}
}
$ gradle -q printProperties
3.1.0.RELEASE
build@master.org
main
plugin
This example adds two extra properties to the project
object via by extra
. Additionally, this example adds a property named purpose
to each source set by setting extra["purpose"]
to null
. Once added, you can read and set these properties via extra
.
This example adds two extra properties to the project
object via an ext
block. Additionally, this example adds a property named purpose
to each source set by setting ext.purpose
to null
. Once added, you can read and set all these properties just like predefined ones.
Gradle requires special syntax for adding a property so that it can fail fast. For example, this allows Gradle to recognize when a script attempts to set a property that does not exist. You can access extra properties anywhere where you can access their owning object. This gives extra properties a wider scope than local variables. Subprojects can access extra properties on their parent projects.
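For example, an extra property set in the root build script can be read from a subproject's build script. A sketch in Kotlin DSL, where the property name is illustrative:

```kotlin
// In the root project's build.gradle.kts (illustrative property name):
val sharedVersion by extra("1.4.0")

// In a subproject's build.gradle.kts:
val sharedVersion: String by rootProject.extra
println(sharedVersion)
```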
For more information about extra properties, see ExtraPropertiesExtension in the API documentation.
Configure Arbitrary Objects
The greet() task below shows an example of arbitrary object configuration:
class UserInfo(
var name: String? = null,
var email: String? = null
)
tasks.register("greet") {
val user = UserInfo().apply {
name = "Isaac Newton"
email = "isaac@newton.me"
}
doLast {
println(user.name)
println(user.email)
}
}
class UserInfo {
String name
String email
}
tasks.register('greet') {
def user = configure(new UserInfo()) {
name = "Isaac Newton"
email = "isaac@newton.me"
}
doLast {
println user.name
println user.email
}
}
$ gradle -q greet
Isaac Newton
isaac@newton.me
Closure Delegates
Each closure has a delegate
object. Groovy uses this delegate to look up variable and method references to nonlocal variables and closure parameters.
Gradle uses this for configuration closures,
where the delegate
object refers to the object being configured.
dependencies {
assert delegate == project.dependencies
testImplementation('junit:junit:4.13')
delegate.testImplementation('junit:junit:4.13')
}
Default imports
To make build scripts more concise, Gradle automatically adds a set of import statements to scripts.
As a result, instead of writing throw new org.gradle.api.tasks.StopExecutionException(), you can write throw new StopExecutionException().
Gradle implicitly adds the following imports to each script:
import org.gradle.*
import org.gradle.api.*
import org.gradle.api.artifacts.*
import org.gradle.api.artifacts.capability.*
import org.gradle.api.artifacts.component.*
import org.gradle.api.artifacts.dsl.*
import org.gradle.api.artifacts.ivy.*
import org.gradle.api.artifacts.maven.*
import org.gradle.api.artifacts.query.*
import org.gradle.api.artifacts.repositories.*
import org.gradle.api.artifacts.result.*
import org.gradle.api.artifacts.transform.*
import org.gradle.api.artifacts.type.*
import org.gradle.api.artifacts.verification.*
import org.gradle.api.attributes.*
import org.gradle.api.attributes.java.*
import org.gradle.api.attributes.plugin.*
import org.gradle.api.cache.*
import org.gradle.api.capabilities.*
import org.gradle.api.component.*
import org.gradle.api.configuration.*
import org.gradle.api.credentials.*
import org.gradle.api.distribution.*
import org.gradle.api.distribution.plugins.*
import org.gradle.api.execution.*
import org.gradle.api.file.*
import org.gradle.api.flow.*
import org.gradle.api.initialization.*
import org.gradle.api.initialization.definition.*
import org.gradle.api.initialization.dsl.*
import org.gradle.api.initialization.resolve.*
import org.gradle.api.invocation.*
import org.gradle.api.java.archives.*
import org.gradle.api.jvm.*
import org.gradle.api.launcher.cli.*
import org.gradle.api.logging.*
import org.gradle.api.logging.configuration.*
import org.gradle.api.model.*
import org.gradle.api.plugins.*
import org.gradle.api.plugins.antlr.*
import org.gradle.api.plugins.catalog.*
import org.gradle.api.plugins.jvm.*
import org.gradle.api.plugins.quality.*
import org.gradle.api.plugins.scala.*
import org.gradle.api.problems.*
import org.gradle.api.project.*
import org.gradle.api.provider.*
import org.gradle.api.publish.*
import org.gradle.api.publish.ivy.*
import org.gradle.api.publish.ivy.plugins.*
import org.gradle.api.publish.ivy.tasks.*
import org.gradle.api.publish.maven.*
import org.gradle.api.publish.maven.plugins.*
import org.gradle.api.publish.maven.tasks.*
import org.gradle.api.publish.plugins.*
import org.gradle.api.publish.tasks.*
import org.gradle.api.reflect.*
import org.gradle.api.reporting.*
import org.gradle.api.reporting.components.*
import org.gradle.api.reporting.dependencies.*
import org.gradle.api.reporting.dependents.*
import org.gradle.api.reporting.model.*
import org.gradle.api.reporting.plugins.*
import org.gradle.api.resources.*
import org.gradle.api.services.*
import org.gradle.api.specs.*
import org.gradle.api.tasks.*
import org.gradle.api.tasks.ant.*
import org.gradle.api.tasks.application.*
import org.gradle.api.tasks.bundling.*
import org.gradle.api.tasks.compile.*
import org.gradle.api.tasks.diagnostics.*
import org.gradle.api.tasks.diagnostics.configurations.*
import org.gradle.api.tasks.incremental.*
import org.gradle.api.tasks.javadoc.*
import org.gradle.api.tasks.options.*
import org.gradle.api.tasks.scala.*
import org.gradle.api.tasks.testing.*
import org.gradle.api.tasks.testing.junit.*
import org.gradle.api.tasks.testing.junitplatform.*
import org.gradle.api.tasks.testing.testng.*
import org.gradle.api.tasks.util.*
import org.gradle.api.tasks.wrapper.*
import org.gradle.api.toolchain.management.*
import org.gradle.authentication.*
import org.gradle.authentication.aws.*
import org.gradle.authentication.http.*
import org.gradle.build.event.*
import org.gradle.buildconfiguration.tasks.*
import org.gradle.buildinit.*
import org.gradle.buildinit.plugins.*
import org.gradle.buildinit.specs.*
import org.gradle.buildinit.tasks.*
import org.gradle.caching.*
import org.gradle.caching.configuration.*
import org.gradle.caching.http.*
import org.gradle.caching.local.*
import org.gradle.concurrent.*
import org.gradle.external.javadoc.*
import org.gradle.ide.visualstudio.*
import org.gradle.ide.visualstudio.plugins.*
import org.gradle.ide.visualstudio.tasks.*
import org.gradle.ide.xcode.*
import org.gradle.ide.xcode.plugins.*
import org.gradle.ide.xcode.tasks.*
import org.gradle.ivy.*
import org.gradle.jvm.*
import org.gradle.jvm.application.scripts.*
import org.gradle.jvm.application.tasks.*
import org.gradle.jvm.tasks.*
import org.gradle.jvm.toolchain.*
import org.gradle.language.*
import org.gradle.language.assembler.*
import org.gradle.language.assembler.plugins.*
import org.gradle.language.assembler.tasks.*
import org.gradle.language.base.*
import org.gradle.language.base.artifact.*
import org.gradle.language.base.compile.*
import org.gradle.language.base.plugins.*
import org.gradle.language.base.sources.*
import org.gradle.language.c.*
import org.gradle.language.c.plugins.*
import org.gradle.language.c.tasks.*
import org.gradle.language.cpp.*
import org.gradle.language.cpp.plugins.*
import org.gradle.language.cpp.tasks.*
import org.gradle.language.java.artifact.*
import org.gradle.language.jvm.tasks.*
import org.gradle.language.nativeplatform.*
import org.gradle.language.nativeplatform.tasks.*
import org.gradle.language.objectivec.*
import org.gradle.language.objectivec.plugins.*
import org.gradle.language.objectivec.tasks.*
import org.gradle.language.objectivecpp.*
import org.gradle.language.objectivecpp.plugins.*
import org.gradle.language.objectivecpp.tasks.*
import org.gradle.language.plugins.*
import org.gradle.language.rc.*
import org.gradle.language.rc.plugins.*
import org.gradle.language.rc.tasks.*
import org.gradle.language.scala.tasks.*
import org.gradle.language.swift.*
import org.gradle.language.swift.plugins.*
import org.gradle.language.swift.tasks.*
import org.gradle.maven.*
import org.gradle.model.*
import org.gradle.nativeplatform.*
import org.gradle.nativeplatform.platform.*
import org.gradle.nativeplatform.plugins.*
import org.gradle.nativeplatform.tasks.*
import org.gradle.nativeplatform.test.*
import org.gradle.nativeplatform.test.cpp.*
import org.gradle.nativeplatform.test.cpp.plugins.*
import org.gradle.nativeplatform.test.cunit.*
import org.gradle.nativeplatform.test.cunit.plugins.*
import org.gradle.nativeplatform.test.cunit.tasks.*
import org.gradle.nativeplatform.test.googletest.*
import org.gradle.nativeplatform.test.googletest.plugins.*
import org.gradle.nativeplatform.test.plugins.*
import org.gradle.nativeplatform.test.tasks.*
import org.gradle.nativeplatform.test.xctest.*
import org.gradle.nativeplatform.test.xctest.plugins.*
import org.gradle.nativeplatform.test.xctest.tasks.*
import org.gradle.nativeplatform.toolchain.*
import org.gradle.nativeplatform.toolchain.plugins.*
import org.gradle.normalization.*
import org.gradle.platform.*
import org.gradle.platform.base.*
import org.gradle.platform.base.binary.*
import org.gradle.platform.base.component.*
import org.gradle.platform.base.plugins.*
import org.gradle.plugin.devel.*
import org.gradle.plugin.devel.plugins.*
import org.gradle.plugin.devel.tasks.*
import org.gradle.plugin.management.*
import org.gradle.plugin.use.*
import org.gradle.plugins.ear.*
import org.gradle.plugins.ear.descriptor.*
import org.gradle.plugins.ide.*
import org.gradle.plugins.ide.api.*
import org.gradle.plugins.ide.eclipse.*
import org.gradle.plugins.ide.idea.*
import org.gradle.plugins.signing.*
import org.gradle.plugins.signing.signatory.*
import org.gradle.plugins.signing.signatory.pgp.*
import org.gradle.plugins.signing.type.*
import org.gradle.plugins.signing.type.pgp.*
import org.gradle.process.*
import org.gradle.swiftpm.*
import org.gradle.swiftpm.plugins.*
import org.gradle.swiftpm.tasks.*
import org.gradle.testing.base.*
import org.gradle.testing.base.plugins.*
import org.gradle.testing.jacoco.plugins.*
import org.gradle.testing.jacoco.tasks.*
import org.gradle.testing.jacoco.tasks.rules.*
import org.gradle.testkit.runner.*
import org.gradle.util.*
import org.gradle.vcs.*
import org.gradle.vcs.git.*
import org.gradle.work.*
import org.gradle.workers.*
Next Step: Learn how to use Tasks >>
Using Tasks
The work that Gradle can do on a project is defined by one or more tasks.
A task represents some independent unit of work that a build performs. This might be compiling some classes, creating a JAR, generating Javadoc, or publishing some archives to a repository.
When a user runs ./gradlew build
in the command line, Gradle will execute the build
task along with any other tasks it depends on.
List available tasks
Gradle provides several default tasks for a project, which are listed by running ./gradlew tasks
:
> Task :tasks
------------------------------------------------------------
Tasks runnable from root project 'myTutorial'
------------------------------------------------------------
Build Setup tasks
-----------------
init - Initializes a new Gradle build.
wrapper - Generates Gradle wrapper files.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in root project 'myTutorial'.
...
Tasks come from either build scripts or plugins.
Once we apply a plugin to our project, such as the application
plugin, additional tasks become available:
plugins {
id("application")
}
plugins {
id 'application'
}
$ ./gradlew tasks
> Task :tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.
Other tasks
-----------
compileJava - Compiles main Java source.
...
Many of these tasks, such as assemble
, build
, and run
, should be familiar to a developer.
Task classification
There are two classes of tasks that can be executed:
- Actionable tasks have some action(s) attached to do work in your build: compileJava.
- Lifecycle tasks are tasks with no actions attached: assemble, build.

Typically, a lifecycle task depends on many actionable tasks and is used to execute many tasks at once.
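A custom lifecycle task is simply a registered task with no actions that wires actionable tasks together via dependencies. A sketch, where the task name is illustrative and the depended-on tasks are assumed to be provided by applied plugins:

```kotlin
// A lifecycle task: no doFirst/doLast actions, only dependencies.
tasks.register("qualityCheck") {
    group = "verification"
    description = "Runs all verification tasks."
    dependsOn("test", "javadoc") // actionable tasks from applied plugins
}
```

Running ./gradlew qualityCheck then executes test and javadoc, plus whatever they depend on.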
Task registration and action
Let’s take a look at a simple "Hello World" task in a build script:
tasks.register("hello") {
doLast {
println("Hello world!")
}
}
tasks.register('hello') {
doLast {
println 'Hello world!'
}
}
In the example, the build script registers a single task called hello
using the TaskContainer API, and adds an action to it.
If the tasks in the project are listed, the hello
task is available to Gradle:
$ ./gradlew app:tasks --all
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Other tasks
-----------
compileJava - Compiles main Java source.
compileTestJava - Compiles test Java source.
hello
processResources - Processes main resources.
processTestResources - Processes test resources.
startScripts - Creates OS-specific scripts to run the project as a JVM application.
You can execute the task in the build script with ./gradlew hello
:
$ ./gradlew hello
Hello world!
When Gradle executes the hello
task, it executes the action provided.
In this case, the action is simply a block containing some code: println("Hello world!")
.
Task group and description
The hello
task from the previous section can be detailed with a description and assigned to a group with the following update:
tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
}
tasks.register('hello') {
group = 'Custom'
description = 'A lovely greeting task.'
doLast {
println 'Hello world!'
}
}
Once the task is assigned to a group, it will be listed by ./gradlew tasks
:
$ ./gradlew tasks
> Task :tasks
Custom tasks
------------------
hello - A lovely greeting task.
To view information about a task, use the help --task <task-name>
command:
$ ./gradlew help --task hello
> Task :help
Detailed task information for hello
Path
:app:hello
Type
Task (org.gradle.api.Task)
Options
--rerun Causes the task to be re-run even if up-to-date.
Description
A lovely greeting task.
Group
Custom
As we can see, the hello task belongs to the Custom group.
Task dependencies
You can declare tasks that depend on other tasks:
tasks.register("hello") {
doLast {
println("Hello world!")
}
}
tasks.register("intro") {
dependsOn("hello")
doLast {
println("I'm Gradle")
}
}
tasks.register('hello') {
doLast {
println 'Hello world!'
}
}
tasks.register('intro') {
dependsOn tasks.hello
doLast {
println "I'm Gradle"
}
}
$ gradle -q intro
Hello world!
I'm Gradle
The dependency of taskX on taskY may be declared before taskY is defined:
tasks.register("taskX") {
dependsOn("taskY")
doLast {
println("taskX")
}
}
tasks.register("taskY") {
doLast {
println("taskY")
}
}
tasks.register('taskX') {
dependsOn 'taskY'
doLast {
println 'taskX'
}
}
tasks.register('taskY') {
doLast {
println 'taskY'
}
}
$ gradle -q taskX
taskY
taskX
The hello
task from the previous example is updated to include a dependency:
tasks.register("hello") {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
dependsOn(tasks.assemble)
}
tasks.register('hello') {
group = "Custom"
description = "A lovely greeting task."
doLast {
println("Hello world!")
}
dependsOn(tasks.assemble)
}
The hello
task now depends on the assemble
task, which means that Gradle must execute the assemble
task before it can execute the hello
task:
$ ./gradlew :app:hello
> Task :app:compileJava UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:startScripts UP-TO-DATE
> Task :app:distTar UP-TO-DATE
> Task :app:distZip UP-TO-DATE
> Task :app:assemble UP-TO-DATE
> Task :app:hello
Hello world!
Task configuration
Once registered, tasks can be accessed via the TaskProvider API for further configuration.
For instance, you can use this to add dependencies to a task at runtime dynamically:
repeat(4) { counter ->
tasks.register("task$counter") {
doLast {
println("I'm task number $counter")
}
}
}
tasks.named("task0") { dependsOn("task2", "task3") }
4.times { counter ->
tasks.register("task$counter") {
doLast {
println "I'm task number $counter"
}
}
}
tasks.named('task0') { dependsOn('task2', 'task3') }
$ gradle -q task0
I'm task number 2
I'm task number 3
I'm task number 0
Or you can add behavior to an existing task:
tasks.register("hello") {
doLast {
println("Hello Earth")
}
}
tasks.named("hello") {
doFirst {
println("Hello Venus")
}
}
tasks.named("hello") {
doLast {
println("Hello Mars")
}
}
tasks.named("hello") {
doLast {
println("Hello Jupiter")
}
}
tasks.register('hello') {
doLast {
println 'Hello Earth'
}
}
tasks.named('hello') {
doFirst {
println 'Hello Venus'
}
}
tasks.named('hello') {
doLast {
println 'Hello Mars'
}
}
tasks.named('hello') {
doLast {
println 'Hello Jupiter'
}
}
$ gradle -q hello
Hello Venus
Hello Earth
Hello Mars
Hello Jupiter
Tip
|
The calls doFirst and doLast can be executed multiple times.
They add an action to the beginning or the end of the task’s actions list.
When the task executes, the actions in the action list are executed in order.
|
Here is an example of the named method being used to configure a task added by a plugin:
tasks.dokkaHtml.configure {
outputDirectory.set(buildDir)
}
tasks.named("dokkaHtml") {
outputDirectory.set(buildDir)
}
Task types
Gradle tasks are subclasses of Task.
In the build script, the HelloTask class is created by extending DefaultTask:
// Extend the DefaultTask class to create a HelloTask class
abstract class HelloTask : DefaultTask() {
@TaskAction
fun hello() {
println("hello from HelloTask")
}
}
// Register the hello Task with type HelloTask
tasks.register<HelloTask>("hello") {
group = "Custom tasks"
description = "A lovely greeting task."
}
// Extend the DefaultTask class to create a HelloTask class
class HelloTask extends DefaultTask {
@TaskAction
void hello() {
println("hello from HelloTask")
}
}
// Register the hello Task with type HelloTask
tasks.register("hello", HelloTask) {
group = "Custom tasks"
description = "A lovely greeting task."
}
The hello task is registered with the type HelloTask.
Executing our new hello task:
$ ./gradlew hello
> Task :app:hello
hello from HelloTask
Now the hello task is of type HelloTask instead of type Task.
The Gradle help task reveals the change:
$ ./gradlew help --task hello
> Task :help
Detailed task information for hello
Path
:app:hello
Type
HelloTask (Build_gradle$HelloTask)
Options
--rerun Causes the task to be re-run even if up-to-date.
Description
A lovely greeting task.
Group
Custom tasks
Built-in task types
Gradle provides many built-in task types with common and popular functionality, such as copying or deleting files.
This example task copies *.war files from the source directory to the target directory using the Copy built-in task:
tasks.register<Copy>("copyTask") {
from("source")
into("target")
include("*.war")
}
tasks.register('copyTask', Copy) {
from("source")
into("target")
include("*.war")
}
There are many task types developers can take advantage of, including GroovyDoc, Zip, Jar, JacocoReport, Sign, and Delete, which are documented in the Gradle DSL reference.
Next Step: Learn how to write Tasks >>
Writing Tasks
Gradle tasks are created by extending DefaultTask.
However, the generic DefaultTask provides no action for Gradle.
If users want to extend the capabilities of Gradle and their build script, they must either use a built-in task or create a custom task:
- Built-in task - Gradle provides built-in utility tasks such as Copy, Jar, Zip, Delete, etc.
- Custom task - Gradle allows users to subclass DefaultTask to create their own task types.
Create a task
The simplest and quickest way to create a custom task is in a build script.
To create a task, inherit from the DefaultTask class and implement a @TaskAction handler:
abstract class CreateFileTask : DefaultTask() {
@TaskAction
fun action() {
val file = File("myfile.txt")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}
class CreateFileTask extends DefaultTask {
@TaskAction
void action() {
def file = new File("myfile.txt")
file.createNewFile()
file.text = "HELLO FROM MY TASK"
}
}
The CreateFileTask implements a simple set of actions.
First, a file called "myfile.txt" is created in the main project.
Then, some text is written to the file.
Register a task
A task is registered in the build script using the TaskContainer.register() method, which makes it available for use in the build logic.
abstract class CreateFileTask : DefaultTask() {
@TaskAction
fun action() {
val file = File("myfile.txt")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}
tasks.register<CreateFileTask>("createFileTask")
class CreateFileTask extends DefaultTask {
@TaskAction
void action() {
def file = new File("myfile.txt")
file.createNewFile()
file.text = "HELLO FROM MY TASK"
}
}
tasks.register("createFileTask", CreateFileTask)
Task group and description
Setting the group and description properties on your tasks can help users understand how to use your task:
abstract class CreateFileTask : DefaultTask() {
@TaskAction
fun action() {
val file = File("myfile.txt")
file.createNewFile()
file.writeText("HELLO FROM MY TASK")
}
}
tasks.register<CreateFileTask>("createFileTask") {
group = "custom"
description = "Create myfile.txt in the current directory"
}
class CreateFileTask extends DefaultTask {
@TaskAction
void action() {
def file = new File("myfile.txt")
file.createNewFile()
file.text = "HELLO FROM MY TASK"
}
}
tasks.register("createFileTask", CreateFileTask) {
group = "custom"
description = "Create myfile.txt in the current directory"
}
Once a task is added to a group, it is visible when listing tasks.
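For example, running the tasks report should now list the task under its group. The output below is an illustrative sketch (abridged; exact formatting varies by Gradle version):

```
$ ./gradlew tasks

Custom tasks
------------
createFileTask - Create myfile.txt in the current directory
```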
Task input and outputs
For a task to do useful work, it typically needs inputs, and it typically produces outputs.
abstract class CreateAFileTask : DefaultTask() {
@get:Input
abstract val fileText: Property<String>
@Input
val fileName = "myfile.txt"
@OutputFile
val myFile: File = File(fileName)
@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}
abstract class CreateAFileTask extends DefaultTask {
@Input
abstract Property<String> getFileText()
@Input
final String fileName = "myfile.txt"
@OutputFile
final File myFile = new File(fileName)
@TaskAction
void action() {
myFile.createNewFile()
myFile.text = fileText.get()
}
}
Configure a task
A task is optionally configured in a build script using the TaskCollection.named() method.
The CreateAFileTask class is updated so that the text in the file is configurable:
abstract class CreateAFileTask : DefaultTask() {
@get:Input
abstract val fileText: Property<String>
@Input
val fileName = "myfile.txt"
@OutputFile
val myFile: File = File(fileName)
@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}
tasks.register<CreateAFileTask>("createAFileTask") {
group = "custom"
description = "Create myfile.txt in the current directory"
fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}
tasks.named<CreateAFileTask>("createAFileTask") {
fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}
abstract class CreateAFileTask extends DefaultTask {
@Input
abstract Property<String> getFileText()
@Input
final String fileName = "myfile.txt"
@OutputFile
final File myFile = new File(fileName)
@TaskAction
void action() {
myFile.createNewFile()
myFile.text = fileText.get()
}
}
tasks.register("createAFileTask", CreateAFileTask) {
group = "custom"
description = "Create myfile.txt in the current directory"
fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}
tasks.named("createAFileTask", CreateAFileTask) {
fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}
In the named() method, we find the createAFileTask task and set the text that will be written to the file.
When the task is executed:
$ ./gradlew createAFileTask
> Configure project :app
> Task :app:createAFileTask
BUILD SUCCESSFUL in 5s
2 actionable tasks: 1 executed, 1 up-to-date
A text file called myfile.txt is created in the project root folder:
HELLO FROM THE NAMED METHOD
Consult the Developing Gradle Tasks chapter to learn more.
Next Step: Learn how to use Plugins >>
Using Plugins
Much of Gradle’s functionality is delivered via plugins, including core plugins distributed with Gradle, third-party plugins, and script plugins defined within builds.
Plugins introduce new tasks (e.g., JavaCompile), domain objects (e.g., SourceSet), conventions (e.g., locating Java source at src/main/java), and extend core or other plugin objects.
Plugins in Gradle are essential for automating common build tasks, integrating with external tools or services, and tailoring the build process to meet specific project needs. They also serve as the primary mechanism for organizing build logic.
Benefits of plugins
Writing many tasks and duplicating configuration blocks in build scripts can get messy. Plugins offer several advantages over adding logic directly to the build script:
- Promotes Reusability: Reduces the need to duplicate similar logic across projects.
- Enhances Modularity: Allows for a more modular and organized build script.
- Encapsulates Logic: Keeps imperative logic separate, enabling more declarative build scripts.
Plugin distribution
You can leverage plugins from Gradle and the Gradle community or create your own.
Plugins are available in three ways:
- Core plugins - Gradle develops and maintains a set of Core Plugins.
- Community plugins - Gradle plugins shared in a remote repository such as Maven or the Gradle Plugin Portal.
- Custom plugins - Gradle enables users to create plugins using APIs.
Types of plugins
Plugins can be implemented as binary plugins, precompiled script plugins, or script plugins:
1. Script Plugins
Script plugins are Groovy DSL or Kotlin DSL scripts that are applied directly to a Gradle build script using the apply from: syntax.
They are applied inline within a build script to add functionality or customize the build process.
They are not recommended, but it’s important to understand how they work:
// Define a plugin
class HelloWorldPlugin : Plugin<Project> {
override fun apply(project: Project) {
project.tasks.register("helloWorld") {
group = "Example"
description = "Prints 'Hello, World!' to the console"
doLast {
println("Hello, World!")
}
}
}
}
// Apply the plugin
apply<HelloWorldPlugin>()
2. Precompiled Script Plugins
Precompiled script plugins are Groovy DSL or Kotlin DSL scripts compiled and distributed as Java class files packaged in a library.
They are meant to be consumed as binary Gradle plugins, so they are applied to a project using the plugins {} block.
The plugin ID by which the precompiled script can be referenced is derived from its name and optional package declaration.
// This script is automatically exposed to downstream consumers as the `my-plugin` plugin
tasks {
register("myCopyTask", Copy::class) {
group = "sample"
from("build.gradle.kts")
into("build/copy")
}
}
plugins {
id("my-plugin") version "1.0"
}
3. BuildSrc and Convention Plugins
These are a hybrid of precompiled plugins and binary plugins that provide a way to reuse complex logic across projects and allow for better organization of build logic.
plugins {
java
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("org.junit.jupiter:junit-jupiter:5.8.1")
implementation("com.google.guava:guava:30.1.1-jre")
}
tasks.named<Test>("test") {
useJUnitPlatform()
}
tasks.register<Copy>("backupTestXml") {
from("build/test-results/test")
into("/tmp/results/")
exclude("binary/**")
}
plugins {
application
id("shared-build-conventions")
}
4. Binary Plugins
Binary plugins are compiled plugins, typically written in Java or Kotlin, that are packaged as JAR files.
They are applied to a project using the plugins {} block.
They offer better performance and maintainability compared to script plugins or precompiled script plugins.
class MyPlugin : Plugin<Project> {
override fun apply(project: Project) {
project.run {
tasks {
register("myCopyTask", Copy::class) {
group = "sample"
from("build.gradle.kts")
into("build/copy")
}
}
}
}
}
plugins {
id("my-plugin") version "1.0"
}
The difference between a binary plugin and a script plugin lies in how they are shared and executed:
- A binary plugin is compiled into bytecode, and the bytecode is shared.
- A script plugin is shared as source code, and it is compiled at the time of use.
Binary plugins can be written in any language that produces JVM bytecode, such as Java, Kotlin, or Groovy. In contrast, script plugins can only be written using Kotlin DSL or Groovy DSL.
However, there is also a middle ground: precompiled script plugins. These are written in Kotlin DSL or Groovy DSL, like script plugins, but are compiled into bytecode and shared like binary plugins.
A plugin often starts as a script plugin (because they are easy to write). Then, as the code becomes more valuable, it’s migrated to a binary plugin that can be easily tested and shared between multiple projects or organizations.
Using plugins
To use the build logic encapsulated in a plugin, Gradle needs to perform two steps.
First, it needs to resolve the plugin, and then it needs to apply the plugin to the target, usually a Project.
- Resolving a plugin means finding the correct version of the JAR that contains a given plugin and adding it to the script classpath. Once a plugin is resolved, its API can be used in a build script. Script plugins are self-resolving in that they are resolved from the specific file path or URL provided when applying them. Core binary plugins provided as part of the Gradle distribution are automatically resolved.
- Applying a plugin means executing the plugin’s Plugin.apply(T) on a project.
The plugins DSL is recommended to resolve and apply plugins in one step.
Resolving plugins
Gradle provides the core plugins (e.g., JavaPlugin, GroovyPlugin, MavenPublishPlugin, etc.) as part of its distribution, which means they are automatically resolved.
Core plugins are applied in a build script using the plugin name:
plugins {
id «plugin name»
}
For example:
plugins {
id("java")
}
Non-core plugins must be resolved before they can be applied. They are identified by a unique ID and a version in the build file:
plugins {
id «plugin id» version «plugin version»
}
And the location of the plugin must be specified in the settings file:
pluginManagement { // (1)
repositories {
gradlePluginPortal()
}
}
pluginManagement { // (1)
repositories {
gradlePluginPortal()
}
}
There are additional considerations for resolving and applying plugins:
| # | To | Use | For example: |
|---|----|-----|--------------|
| 1 | Apply a plugin to a project. | The plugins{} block | plugins { id("org.barfuin.gradle.taskinfo") version "2.1.0" } |
| 2 | Apply a plugin to multiple subprojects. | The allprojects{} or subprojects{} blocks | plugins { id("org.barfuin.gradle.taskinfo") version "2.1.0" } allprojects { apply(plugin = "org.barfuin.gradle.taskinfo") repositories { mavenCentral() } } |
| 3 | Apply a plugin to multiple subprojects. | A convention plugin in the buildSrc directory | plugins { id("my-convention.gradle.taskinfo") } |
| 4 | Apply a plugin needed for the build script itself. | The buildscript{} block | buildscript { repositories { mavenCentral() } dependencies { classpath("org.barfuin.gradle.taskinfo:gradle-taskinfo:2.1.0") } } apply(plugin = "org.barfuin.gradle.taskinfo") |
| 5 | Apply a script plugin. | The legacy apply() method | apply<MyCustomBarfuinTaskInfoPlugin>() |
1. Applying plugins using the plugins{} block
The plugin DSL provides a concise and convenient way to declare plugin dependencies.
The plugins block configures an instance of PluginDependenciesSpec:
plugins {
application // by name
java // by name
id("java") // by id - recommended
id("org.jetbrains.kotlin.jvm") version "1.9.0" // by id - recommended
}
Core Gradle plugins are unique in that they provide short names, such as java for the core JavaPlugin.
To apply a core plugin, the short name can be used:
plugins {
java
}
plugins {
id 'java'
}
All other binary plugins must use the fully qualified form of the plugin id (e.g., com.github.foo.bar).
To apply a community plugin from the Gradle Plugin Portal, the fully qualified plugin id, a globally unique identifier, must be used:
plugins {
id("org.springframework.boot") version "3.3.1"
}
plugins {
id 'org.springframework.boot' version '3.3.1'
}
See PluginDependenciesSpec for more information on using the Plugin DSL.
Limitations of the plugins DSL
The plugins DSL provides a convenient syntax for users and the ability for Gradle to determine which plugins are used quickly. This allows Gradle to:
- Optimize the loading and reuse of plugin classes.
- Provide editors with detailed information about the potential properties and values in the build script.
However, the DSL requires that plugins be defined statically.
There are some key differences between the plugins {} block mechanism and the "traditional" apply() method mechanism.
There are also some constraints and possible limitations.
The plugins{} block can only be used in a project’s build script build.gradle(.kts) and the settings.gradle(.kts) file.
It must appear before any other block.
It cannot be used in script plugins or init scripts.
The plugins {} block does not support arbitrary code.
It is constrained to be idempotent (produce the same result every time) and side effect-free (safe for Gradle to execute at any time).
The form is:
plugins {
id(«plugin id») // (1)
id(«plugin id») version «plugin version» // (2)
}
- for core Gradle plugins or plugins already available to the build script
- for binary Gradle plugins that need to be resolved
Where «plugin id» and «plugin version» must be constant, literal strings.
The plugins{} block must also be a top-level statement in the build script.
It cannot be nested inside another construct (e.g., an if-statement or for-loop).
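For example, the following is rejected by Gradle (a hypothetical sketch of what not to do; the property name is illustrative):

```kotlin
// This FAILS: the plugins {} block must be a top-level statement
// and cannot be wrapped in conditional or other arbitrary logic.
if (project.hasProperty("useJava")) {
    plugins {
        id("java")
    }
}
```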
2. Applying plugins to all subprojects{} or allprojects{}
If you have a multi-project build, you probably want to apply plugins to some or all of the subprojects in your build, but not to the root project.
While the default behavior of the plugins{} block is to immediately resolve and apply the plugins, you can use the apply false syntax to tell Gradle not to apply the plugin to the current project. Then, use the plugins{} block without the version in subprojects' build scripts:
include("hello-a")
include("hello-b")
include("goodbye-c")
plugins {
// These plugins are not automatically applied.
// They can be applied in subprojects as needed (in their respective build files).
id("com.example.hello") version "1.0.0" apply false
id("com.example.goodbye") version "1.0.0" apply false
}
allprojects {
// Apply the common 'java' plugin to all projects (including the root)
plugins.apply("java")
}
subprojects {
// Apply the 'java-library' plugin to all subprojects (excluding the root)
plugins.apply("java-library")
}
plugins {
id("com.example.hello")
}
plugins {
id("com.example.hello")
}
plugins {
id("com.example.goodbye")
}
include 'hello-a'
include 'hello-b'
include 'goodbye-c'
plugins {
// These plugins are not automatically applied.
// They can be applied in subprojects as needed (in their respective build files).
id 'com.example.hello' version '1.0.0' apply false
id 'com.example.goodbye' version '1.0.0' apply false
}
allprojects {
// Apply the common 'java' plugin to all projects (including the root)
apply(plugin: 'java')
}
subprojects {
// Apply the 'java-library' plugin to all subprojects (excluding the root)
apply(plugin: 'java-library')
}
plugins {
id 'com.example.hello'
}
plugins {
id 'com.example.hello'
}
plugins {
id 'com.example.goodbye'
}
You can also encapsulate the versions of external plugins by composing the build logic using your own convention plugins.
3. Applying convention plugins from the buildSrc directory
buildSrc is an optional directory at the Gradle project root that contains build logic (i.e., plugins) used in building the main project.
You can apply plugins that reside in a project’s buildSrc directory as long as they have a defined ID.
The following example shows how to tie the plugin implementation class my.MyPlugin, defined in buildSrc, to the id "my-plugin":
plugins {
`java-gradle-plugin`
}
gradlePlugin {
plugins {
create("myPlugins") {
id = "my-plugin"
implementationClass = "my.MyPlugin"
}
}
}
plugins {
id 'java-gradle-plugin'
}
gradlePlugin {
plugins {
myPlugins {
id = 'my-plugin'
implementationClass = 'my.MyPlugin'
}
}
}
The plugin can then be applied by ID:
plugins {
id("my-plugin")
}
plugins {
id 'my-plugin'
}
4. Applying plugins using the buildscript{} block
To define libraries or plugins used in the build script itself, you can use the buildscript block.
The buildscript block is also used for specifying where to find those dependencies.
This approach is less common with newer versions of Gradle, as the plugins {} block simplifies plugin usage.
However, buildscript {} may be necessary when dealing with custom or non-standard plugin repositories, as well as library dependencies:
import org.yaml.snakeyaml.Yaml
import java.io.File
buildscript {
repositories {
maven {
url = uri("https://plugins.gradle.org/m2/")
}
mavenCentral() // Where to find the plugin
}
dependencies {
classpath("org.yaml:snakeyaml:1.19") // The library's classpath dependency
classpath("com.gradleup.shadow:shadow-gradle-plugin:8.3.4") // Plugin dependency for legacy plugin application
}
}
// Applies legacy Shadow plugin
apply(plugin = "com.gradleup.shadow")
// Uses the library in the build script
val yamlContent = """
name: Project
""".trimIndent()
val yaml = Yaml()
val data: Map<String, Any> = yaml.load(yamlContent)
import org.yaml.snakeyaml.Yaml
buildscript {
repositories { // Where to find the plugin or library
maven {
url = uri("https://plugins.gradle.org/m2/")
}
mavenCentral()
}
dependencies {
classpath 'org.yaml:snakeyaml:1.19' // The library's classpath dependency
classpath 'com.gradleup.shadow:shadow-gradle-plugin:8.3.4' // Plugin dependency for legacy plugin application
}
}
// Applies legacy Shadow plugin
apply plugin: 'com.gradleup.shadow'
// Uses the library in the build script
def yamlContent = """
name: Project Name
"""
def yaml = new Yaml()
def data = yaml.load(yamlContent)
5. Applying script plugins using the legacy apply() method
A script plugin is an ad-hoc plugin, typically written and applied in the same build script. It is applied using the legacy application method:
class MyPlugin : Plugin<Project> {
override fun apply(project: Project) {
println("Plugin ${this.javaClass.simpleName} applied on ${project.name}")
}
}
apply<MyPlugin>()
class MyPlugin implements Plugin<Project> {
@Override
void apply(Project project) {
println("Plugin ${this.getClass().getSimpleName()} applied on ${project.name}")
}
}
apply plugin: MyPlugin
Plugin Management
The pluginManagement{} block is used to configure repositories for plugin resolution and to define version constraints for plugins that are applied in the build scripts.
The pluginManagement{} block can be used in a settings.gradle(.kts) file, where it must be the first block in the file:
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = "plugin-management"
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
rootProject.name = 'plugin-management'
The block can also be used in an initialization script:
settingsEvaluated {
pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
}
settingsEvaluated { settings ->
settings.pluginManagement {
plugins {
}
resolutionStrategy {
}
repositories {
}
}
}
Custom Plugin Repositories
By default, the plugins{} DSL resolves plugins from the public Gradle Plugin Portal.
Many build authors would also like to resolve plugins from private Maven or Ivy repositories, either because those plugins contain proprietary implementation details or to gain more control over what plugins are available to their builds.
To specify custom plugin repositories, use the repositories{} block inside pluginManagement{}:
pluginManagement {
repositories {
maven(url = file("./maven-repo"))
gradlePluginPortal()
ivy(url = file("./ivy-repo"))
}
}
pluginManagement {
repositories {
maven {
url = file('./maven-repo')
}
gradlePluginPortal()
ivy {
url = file('./ivy-repo')
}
}
}
This tells Gradle to first look in the Maven repository at ./maven-repo when resolving plugins and then to check the Gradle Plugin Portal if the plugins are not found in the Maven repository.
If you don’t want the Gradle Plugin Portal to be searched, omit the gradlePluginPortal() line.
Finally, the Ivy repository at ./ivy-repo will be checked.
Plugin Version Management
A plugins{} block inside pluginManagement{} allows all plugin versions for the build to be defined in a single location.
Plugins can then be applied by id to any build script via the plugins{} block.
One benefit of setting plugin versions this way is that the pluginManagement.plugins{} block does not have the same constrained syntax as the build script plugins{} block.
This allows plugin versions to be taken from gradle.properties, or loaded via another mechanism.
Managing plugin versions via pluginManagement:
pluginManagement {
val helloPluginVersion: String by settings
plugins {
id("com.example.hello") version "${helloPluginVersion}"
}
}
plugins {
id("com.example.hello")
}
helloPluginVersion=1.0.0
pluginManagement {
plugins {
id 'com.example.hello' version "${helloPluginVersion}"
}
}
plugins {
id 'com.example.hello'
}
helloPluginVersion=1.0.0
The plugin version is loaded from gradle.properties and configured in the settings script, allowing the plugin to be added to any project without specifying the version.
Plugin Resolution Rules
Plugin resolution rules allow you to modify plugin requests made in plugins{} blocks, e.g., changing the requested version or explicitly specifying the implementation artifact coordinates.
To add resolution rules, use the resolutionStrategy{} inside the pluginManagement{} block:
pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == "com.example") {
useModule("com.example:sample-plugins:1.0.0")
}
}
}
repositories {
maven {
url = uri("./maven-repo")
}
gradlePluginPortal()
ivy {
url = uri("./ivy-repo")
}
}
}
pluginManagement {
resolutionStrategy {
eachPlugin {
if (requested.id.namespace == 'com.example') {
useModule('com.example:sample-plugins:1.0.0')
}
}
}
repositories {
maven {
url = file('./maven-repo')
}
gradlePluginPortal()
ivy {
url = file('./ivy-repo')
}
}
}
This tells Gradle to use the specified plugin implementation artifact instead of its built-in default mapping from plugin ID to Maven/Ivy coordinates.
Custom Maven and Ivy plugin repositories must contain plugin marker artifacts and the artifacts that implement the plugin. Read Gradle Plugin Development Plugin for more information on publishing plugins to custom repositories.
See PluginManagementSpec for complete documentation on using the pluginManagement{} block.
Plugin Marker Artifacts
Since the plugins{} DSL block only allows for declaring plugins by their globally unique plugin id and version properties, Gradle needs a way to look up the coordinates of the plugin implementation artifact.
To do so, Gradle will look for a Plugin Marker Artifact with the coordinates plugin.id:plugin.id.gradle.plugin:plugin.version.
This marker needs to have a dependency on the actual plugin implementation.
Publishing these markers is automated by the java-gradle-plugin.
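As an illustration of the naming scheme above, a plugin with the id com.example.hello published at version 1.0.0 would have a marker artifact with the coordinates:

```
com.example.hello:com.example.hello.gradle.plugin:1.0.0
```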
For example, the following complete sample from the sample-plugins project shows how to publish a com.example.hello plugin and a com.example.goodbye plugin to both an Ivy and Maven repository using the combination of the java-gradle-plugin, the maven-publish plugin, and the ivy-publish plugin.
plugins {
`java-gradle-plugin`
`maven-publish`
`ivy-publish`
}
group = "com.example"
version = "1.0.0"
gradlePlugin {
plugins {
create("hello") {
id = "com.example.hello"
implementationClass = "com.example.hello.HelloPlugin"
}
create("goodbye") {
id = "com.example.goodbye"
implementationClass = "com.example.goodbye.GoodbyePlugin"
}
}
}
publishing {
repositories {
maven {
url = uri(layout.buildDirectory.dir("maven-repo"))
}
ivy {
url = uri(layout.buildDirectory.dir("ivy-repo"))
}
}
}
plugins {
id 'java-gradle-plugin'
id 'maven-publish'
id 'ivy-publish'
}
group = 'com.example'
version = '1.0.0'
gradlePlugin {
plugins {
hello {
id = 'com.example.hello'
implementationClass = 'com.example.hello.HelloPlugin'
}
goodbye {
id = 'com.example.goodbye'
implementationClass = 'com.example.goodbye.GoodbyePlugin'
}
}
}
publishing {
repositories {
maven {
url = layout.buildDirectory.dir('maven-repo')
}
ivy {
url = layout.buildDirectory.dir('ivy-repo')
}
}
}
Running gradle publish in the sample directory creates the following Maven repository layout (the Ivy layout is similar):
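The layout is not reproduced here in full, but based on the marker naming scheme described above and the sample build script, it would look roughly like this (a sketch; the actual repository also contains module metadata and checksum files):

```
build/maven-repo
├── com/example/hello/com.example.hello.gradle.plugin/1.0.0/
│   └── com.example.hello.gradle.plugin-1.0.0.pom      (marker, depends on sample-plugins)
├── com/example/goodbye/com.example.goodbye.gradle.plugin/1.0.0/
│   └── com.example.goodbye.gradle.plugin-1.0.0.pom    (marker, depends on sample-plugins)
└── com/example/sample-plugins/1.0.0/
    └── sample-plugins-1.0.0.jar                       (plugin implementation)
```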
Legacy Plugin Application
With the introduction of the plugins DSL, users should have little reason to use the legacy method of applying plugins. It is documented here in case a build author cannot use the plugin DSL due to restrictions in how it currently works.
apply(plugin = "java")
apply plugin: 'java'
Plugins can be applied using a plugin id. In the above case, we are using the short name "java" to apply the JavaPlugin.
Rather than using a plugin id, plugins can also be applied by simply specifying the class of the plugin:
apply<JavaPlugin>()
apply plugin: JavaPlugin
The JavaPlugin symbol in the above sample refers to the JavaPlugin.
This class does not strictly need to be imported, as the org.gradle.api.plugins package is automatically imported in all build scripts (see Default imports).
In Kotlin, you need to append the ::class suffix to identify a class literal, instead of .class as in Java.
In Groovy, it is unnecessary to append .class to identify a class literal as it is in Java.
You may also see the apply method used to include an entire build file:
apply(from = "other.gradle.kts")
apply from: 'other.gradle'
Using a Version Catalog
When a project uses a version catalog, plugins can be referenced via aliases when applied.
Let’s take a look at a simple Version Catalog:
[versions]
groovy = "3.0.5"
checkstyle = "8.37"
[libraries]
groovy-core = { module = "org.codehaus.groovy:groovy", version.ref = "groovy" }
groovy-json = { module = "org.codehaus.groovy:groovy-json", version.ref = "groovy" }
groovy-nio = { module = "org.codehaus.groovy:groovy-nio", version.ref = "groovy" }
commons-lang3 = { group = "org.apache.commons", name = "commons-lang3", version = { strictly = "[3.8, 4.0[", prefer="3.9" } }
[bundles]
groovy = ["groovy-core", "groovy-json", "groovy-nio"]
[plugins]
versions = { id = "com.github.ben-manes.versions", version = "0.45.0" }
Then a plugin can be applied to any build script using the alias method:
plugins {
`java-library`
alias(libs.plugins.versions)
}
plugins {
id 'java-library'
alias(libs.plugins.versions)
}
Tip
|
Gradle generates type safe accessors for catalog items. |
Next Step: Learn how to write Plugins >>
Writing Plugins
If Gradle or the Gradle community does not offer the specific capabilities your project needs, creating your own custom plugin could be a solution.
Additionally, if you find yourself duplicating build logic across subprojects and need a better way to organize it, convention plugins can help.
Script plugin
A plugin is any class that implements the Plugin interface.
For example, this is a "hello world" plugin:
abstract class SamplePlugin : Plugin<Project> { // (1)
override fun apply(project: Project) { // (2)
project.tasks.register("ScriptPlugin") {
doLast {
println("Hello world from the build file!")
}
}
}
}
apply<SamplePlugin>() // (3)
class SamplePlugin implements Plugin<Project> { // (1)
void apply(Project project) { // (2)
project.tasks.register("ScriptPlugin") {
doLast {
println("Hello world from the build file!")
}
}
}
}
apply plugin: SamplePlugin // (3)
- Extend the org.gradle.api.Plugin interface.
- Override the apply method.
- apply the plugin to the project.
1. Extend the org.gradle.api.Plugin interface
Create a class that extends the Plugin interface:
abstract class SamplePlugin : Plugin<Project> {
}
class SamplePlugin implements Plugin<Project> {
}
2. Override the apply method
Add tasks and other logic in the apply() method:
override fun apply(project: Project) {
}
void apply(Project project) {
}
3. apply
the plugin to your project
When SamplePlugin
is applied to your project, Gradle calls the apply()
method you defined.
This adds the ScriptPlugin
task to your project:
apply<SamplePlugin>()
apply plugin: SamplePlugin
Note that this is a simple hello-world
example and does not reflect best practices.
Important: Script plugins are not recommended.
The best practice for developing plugins is to create convention plugins or binary plugins.
Pre-compiled script plugin
Pre-compiled script plugins offer an easy way to rapidly prototype and experiment.
They let you package build logic as *.gradle(.kts)
script files using the Groovy or Kotlin DSL.
These scripts reside in specific directories, such as src/main/groovy
or src/main/kotlin
.
To apply one, simply use its ID
derived from the script filename (without .gradle
).
You can think of the file itself as the plugin, so you do not need to subclass the Plugin
interface in a precompiled script.
Let’s take a look at an example with the following structure:
.
└── buildSrc
├── build.gradle.kts
└── src
└── main
└── kotlin
└── my-create-file-plugin.gradle.kts
Our my-create-file-plugin.gradle.kts
file contains the following code:
abstract class CreateFileTask : DefaultTask() {
@get:Input
abstract val fileText: Property<String>
@Input
val fileName = "myfile.txt"
@OutputFile
val myFile: File = File(fileName)
@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}
tasks.register<CreateFileTask>("createMyFileTaskInConventionPlugin") {
group = "from my convention plugin"
description = "Create myfile.txt in the current directory"
fileText.set("HELLO FROM MY CONVENTION PLUGIN")
}
abstract class CreateFileTask extends DefaultTask {
@Input
abstract Property<String> getFileText()
@Input
String fileName = "myfile.txt"
@OutputFile
File getMyFile() {
return new File(fileName)
}
@TaskAction
void action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}
tasks.register("createMyFileTaskInConventionPlugin", CreateFileTask) {
group = "from my convention plugin"
description = "Create myfile.txt in the current directory"
fileText.set("HELLO FROM MY CONVENTION PLUGIN")
}
The pre-compiled script can now be applied in the build.gradle(.kts) file of any subproject:
plugins {
id("my-create-file-plugin") // Apply the pre-compiled convention plugin
`kotlin-dsl`
}
plugins {
id 'my-create-file-plugin' // Apply the pre-compiled convention plugin
id 'groovy' // Apply the Groovy DSL plugin
}
The createMyFileTaskInConventionPlugin
task from the plugin is now available in your subproject.
Binary Plugins
A binary plugin is a plugin that is implemented in a compiled language and is packaged as a JAR file. It is resolved as a dependency rather than compiled from source.
For most use cases, convention plugins need to be updated only infrequently. Having each developer execute the plugin build as part of their development process is wasteful, and we can instead distribute them as binary dependencies.
There are two ways to update the convention plugin in the example above into a binary plugin.
-
Use composite builds, including the plugin build in settings.gradle.kts:
includeBuild("my-plugin")
-
Publish the plugin to a repository and apply it by ID in build.gradle.kts:
plugins { id("com.gradle.plugin.my-plugin") version "1.0.0" }
Let’s go with the second solution.
This plugin has been re-written in Kotlin and is called MyCreateFileBinaryPlugin.kt
.
It is still stored in buildSrc
:
import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction
import java.io.File
abstract class CreateFileTask : DefaultTask() {
@get:Input
abstract val fileText: Property<String>
@Input
val fileName = project.rootDir.toString() + "/myfile.txt"
@OutputFile
val myFile: File = File(fileName)
@TaskAction
fun action() {
myFile.createNewFile()
myFile.writeText(fileText.get())
}
}
class MyCreateFileBinaryPlugin : Plugin<Project> {
override fun apply(project: Project) {
project.tasks.register("createFileTaskFromBinaryPlugin", CreateFileTask::class.java) {
group = "from my binary plugin"
description = "Create myfile.txt in the current directory"
fileText.set("HELLO FROM MY BINARY PLUGIN")
}
}
}
The plugin can be published and given an id
using a gradlePlugin{}
block so that it can be referenced in the root:
group = "com.example"
version = "1.0.0"
gradlePlugin {
plugins {
create("my-binary-plugin") {
id = "com.example.my-binary-plugin"
implementationClass = "MyCreateFileBinaryPlugin"
}
}
}
publishing {
repositories {
mavenLocal()
}
}
group = 'com.example'
version = '1.0.0'
gradlePlugin {
plugins {
create("my-binary-plugin") {
id = "com.example.my-binary-plugin"
implementationClass = "MyCreateFileBinaryPlugin"
}
}
}
publishing {
repositories {
mavenLocal()
}
}
Then, the plugin can be applied in the build file:
plugins {
id("my-create-file-plugin") // Apply the pre-compiled convention plugin
id("com.example.my-binary-plugin") // Apply the binary plugin
`kotlin-dsl`
}
plugins {
id 'my-create-file-plugin' // Apply the pre-compiled convention plugin
id 'com.example.my-binary-plugin' // Apply the binary plugin
id 'groovy' // Apply the Groovy DSL plugin
}
Consult the Developing Plugins chapter to learn more.
GRADLE TYPES
Understanding Properties and Providers
Gradle provides properties that are important for lazy configuration. When implementing a custom task or plugin, it’s imperative that you use these lazy properties.
Gradle represents lazy properties with two interfaces: Provider, which represents a value that can only be queried, and Property, which represents a value that can be configured as well as queried.
Properties and providers manage values and configurations in a build script.
In this example, configuration
is a Property<String>
that is set to the configurationProvider
Provider<String>
.
The configurationProvider
lazily provides the value "Hello, Gradle!"
:
abstract class MyIntroTask : DefaultTask() {
@get:Input
abstract val configuration: Property<String>
@TaskAction
fun printConfiguration() {
println("Configuration value: ${configuration.get()}")
}
}
val configurationProvider: Provider<String> = project.provider { "Hello, Gradle!" }
tasks.register("myIntroTask", MyIntroTask::class) {
configuration.set(configurationProvider)
}
abstract class MyIntroTask extends DefaultTask {
@Input
abstract Property<String> getConfiguration()
@TaskAction
void printConfiguration() {
println "Configuration value: ${configuration.get()}"
}
}
Provider<String> configurationProvider = project.provider { "Hello, Gradle!" }
tasks.register("myIntroTask", MyIntroTask) {
it.setConfiguration(configurationProvider)
}
Understanding Properties
Properties in Gradle are variables that hold values. They can be defined and accessed within the build script to store information like file paths, version numbers, or custom values.
Properties can be set and retrieved using the project
object:
// Setting a property
val simpleMessageProperty: Property<String> = project.objects.property(String::class)
simpleMessageProperty.set("Hello, World from a Property!")
// Accessing a property
println(simpleMessageProperty.get())
// Setting a property
def simpleMessageProperty = project.objects.property(String)
simpleMessageProperty.set("Hello, World from a Property!")
// Accessing a property
println(simpleMessageProperty.get())
Properties:
-
Properties with these types are configurable.
-
Property
extends theProvider
interface. -
The method Property.set(T) specifies a value for the property, overwriting whatever value may have been present.
-
The method Property.set(Provider) specifies a
Provider
for the value for the property, overwriting whatever value may have been present. This allows you to wire togetherProvider
andProperty
instances before the values are configured. -
A
Property
can be created by the factory method ObjectFactory.property(Class).
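The wiring described above can be sketched in a build script (Kotlin DSL); this is a minimal illustration, and the names used here are not from the examples above:

```kotlin
// A property and a provider, wired together before any value is computed
val message: Property<String> = project.objects.property(String::class)
val greeting: Provider<String> = project.provider { "Hello from a provider" }

// Property.set(Provider) wires the provider in; nothing is evaluated yet
message.set(greeting)

// A later Property.set(T) call overwrites whatever was wired previously
message.set("An explicit value wins")

println(message.get()) // prints: An explicit value wins
```

Because the property is connected to a provider rather than a fixed value, the wiring can happen during configuration while the actual value is resolved only when queried.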
Understanding Providers
Providers are objects that represent a value that may not be immediately available. Providers are useful for lazy evaluation and can be used to model values that may change over time or depend on other tasks or inputs:
// Setting a provider
val simpleMessageProvider: Provider<String> = project.providers.provider { "Hello, World from a Provider!" }
// Accessing a provider
println(simpleMessageProvider.get())
// Setting a provider
def simpleMessageProvider = project.providers.provider { "Hello, World from a Provider!" }
// Accessing a provider
println(simpleMessageProvider.get())
Providers:
-
Properties with these types are read-only.
-
The method Provider.get() returns the current value of the property.
-
A
Provider
can be created from anotherProvider
using Provider.map(Transformer). -
Many other types extend
Provider
and can be used wherever aProvider
is required.
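As a brief sketch of deriving one provider from another with Provider.map(Transformer) (the names here are illustrative):

```kotlin
// The transformation runs lazily, only when the derived provider is queried
val version: Provider<String> = project.provider { "8.12" }
val banner: Provider<String> = version.map { v -> "Building with Gradle $v" }

println(banner.get()) // prints: Building with Gradle 8.12
```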
Using Gradle Managed Properties
Gradle’s managed properties allow you to declare properties as abstract getters (Java, Groovy) or abstract properties (Kotlin).
Gradle then automatically provides the implementation for these properties, managing their state.
A property may be mutable, meaning that it has both a get()
method and set()
method:
abstract class MyPropertyTask : DefaultTask() {
@get:Input
abstract val messageProperty: Property<String> // message property
@TaskAction
fun printMessage() {
println(messageProperty.get())
}
}
tasks.register<MyPropertyTask>("myPropertyTask") {
messageProperty.set("Hello, Gradle!")
}
abstract class MyPropertyTask extends DefaultTask {
@Input
abstract Property<String> getMessageProperty() // message property
@TaskAction
void printMessage() {
println(messageProperty.get())
}
}
tasks.register('myPropertyTask', MyPropertyTask) {
messageProperty.set("Hello, Gradle!")
}
Or read-only, meaning that it has only a get()
method.
The read-only properties are providers:
abstract class MyProviderTask : DefaultTask() {
val messageProvider: Provider<String> = project.providers.provider { "Hello, Gradle!" } // message provider
@TaskAction
fun printMessage() {
println(messageProvider.get())
}
}
tasks.register<MyProviderTask>("MyProviderTask") {
}
abstract class MyProviderTask extends DefaultTask {
final Provider<String> messageProvider = project.providers.provider { "Hello, Gradle!" }
@TaskAction
void printMessage() {
println(messageProvider.get())
}
}
tasks.register('MyProviderTask', MyProviderTask)
Mutable Managed Properties
A mutable managed property is declared using an abstract getter method of type Property<T>
, where T
can be any serializable type or a fully managed Gradle type.
The property must not have any setter methods.
Here is an example of a task type with an uri
property of type URI
:
public abstract class Download extends DefaultTask {
@Input
public abstract Property<URI> getUri(); // abstract getter of type Property<T>
@TaskAction
void run() {
System.out.println("Downloading " + getUri().get()); // Use the `uri` property
}
}
Note that for a property to be considered a mutable managed property, the property’s getter methods must be abstract
and have public
or protected
visibility.
The property type must be one of the following:
Property Type | Note |
---|---|
Property<T> |
Where T is any serializable type or a fully managed Gradle type |
RegularFileProperty |
Configurable regular file location, whose value is mutable |
DirectoryProperty |
Configurable directory location, whose value is mutable |
ListProperty<T> |
List of elements of type T |
SetProperty<T> |
Set of elements of type T |
MapProperty<K, V> |
Map of K to V elements |
ConfigurableFileCollection |
A mutable FileCollection, which represents a collection of file paths |
ConfigurableFileTree |
A mutable FileTree, which represents a hierarchy of files |
Read-only Managed Properties (Providers)
You can declare a read-only managed property, also known as a provider, using a getter method of type Provider<T>
.
The method implementation needs to derive the value.
It can, for example, derive the value from other properties.
Here is an example of a task type with a uri
provider that is derived from a location
property:
public abstract class Download extends DefaultTask {
@Input
public abstract Property<String> getLocation();
@Internal
public Provider<URI> getUri() {
return getLocation().map(l -> URI.create("https://" + l));
}
@TaskAction
void run() {
System.out.println("Downloading " + getUri().get()); // Use the `uri` provider (read-only property)
}
}
Read-only Managed Nested Properties (Nested Providers)
You can declare a read-only managed nested property by adding an abstract getter method for the property to a type annotated with @Nested
.
The property should not have any setter methods.
Gradle provides the implementation for the getter method and creates a value for the property.
This pattern is useful when a custom type has a nested complex type which has the same lifecycle.
If the lifecycle is different, consider using Property<NestedType>
instead.
Here is an example of a task type with a resource
property.
The Resource
type is also a custom Gradle type and defines some managed properties:
public abstract class Download extends DefaultTask {
@Nested
public abstract Resource getResource(); // Use an abstract getter method annotated with @Nested
@TaskAction
void run() {
// Use the `resource` property
System.out.println("Downloading https://" + getResource().getHostName().get() + "/" + getResource().getPath().get());
}
}
public interface Resource {
@Input
Property<String> getHostName();
@Input
Property<String> getPath();
}
Read-only Managed "name" Property (Provider)
If the type contains an abstract property called "name" of type String
, Gradle provides an implementation for the getter
method, and extends each constructor with a "name" parameter, which comes before all other constructor parameters.
If the type is an interface, Gradle will provide a constructor with a single "name" parameter and @Inject
semantics.
You can have your type implement or extend the Named interface, which defines such a read-only "name" property:
import org.gradle.api.Named
interface MyType : Named {
// Other properties and methods...
}
class MyTypeImpl(override val name: String) : MyType {
// Implement other properties and methods...
}
// Usage
val instance = MyTypeImpl("myName")
println(instance.name) // Prints: myName
Using Gradle Managed Types
A managed type is an abstract class or interface with no fields and whose properties are all managed. These types have their state entirely managed by Gradle.
For example, this managed type is defined as an interface:
public interface Resource {
@Input
Property<String> getHostName();
@Input
Property<String> getPath();
}
A named managed type is a managed type that additionally has an abstract property "name" of type String
.
Named managed types are especially useful as the element type of NamedDomainObjectContainer:
interface MyNamedType {
val name: String
}
class MyNamedTypeImpl(override val name: String) : MyNamedType
class MyPluginExtension(project: Project) {
val myNamedContainer: NamedDomainObjectContainer<MyNamedType> =
project.container(MyNamedType::class.java) { name ->
project.objects.newInstance(MyNamedTypeImpl::class.java, name)
}
}
interface MyNamedType {
String getName()
}
class MyNamedTypeImpl implements MyNamedType {
String name
MyNamedTypeImpl(String name) {
this.name = name
}
}
class MyPluginExtension {
NamedDomainObjectContainer<MyNamedType> myNamedContainer
MyPluginExtension(Project project) {
myNamedContainer = project.container(MyNamedType) { name ->
new MyNamedTypeImpl(name)
}
}
}
Using Java Bean Properties
Sometimes you may see properties implemented in the Java bean property style.
That is, they do not use a Property<T>
or Provider<T>
types but are instead implemented with concrete setter and getter methods (or corresponding conveniences in Groovy or Kotlin).
This style of property definition is legacy in Gradle and is discouraged:
public class MyTask extends DefaultTask {
private String someProperty;
public String getSomeProperty() {
return someProperty;
}
public void setSomeProperty(String someProperty) {
this.someProperty = someProperty;
}
@TaskAction
public void myAction() {
System.out.println("SomeProperty: " + someProperty);
}
}
Understanding Collections
Gradle provides types for maintaining collections of objects, intended to work well with Gradle’s DSLs and to provide useful features such as lazy configuration.
Available collections
These collection types are used for managing collections of objects, particularly in the context of build scripts and plugins:
-
DomainObjectSet<T>
: Represents a set of objects of type T. This set does not allow duplicate elements, and you can add, remove, and query objects in the set. -
NamedDomainObjectSet<T>
: A specialization ofDomainObjectSet
where each object has a unique name associated with it. This is often used for collections where each element needs to be uniquely identified by a name. -
NamedDomainObjectList<T>
: Similar toNamedDomainObjectSet
, but represents a list of objects where order matters. Each element has a unique name associated with it, and you can access elements by index as well as by name. -
NamedDomainObjectContainer<T>
: A container for managing objects of type T, where each object has a unique name. This container provides methods for adding, removing, and querying objects by name. -
ExtensiblePolymorphicDomainObjectContainer<T>
: An extension ofNamedDomainObjectContainer
that allows you to define instantiation strategies for different types of objects. This is useful when you have a container that can hold multiple types of objects, and you want to control how each type of object is instantiated.
These types are commonly used in Gradle plugins and build scripts to manage collections of objects, such as tasks, configurations, or custom domain objects.
1. DomainObjectSet
A DomainObjectSet
simply holds a set of configurable objects.
Compared to NamedDomainObjectContainer
, a DomainObjectSet
doesn’t manage the objects in the collection.
They need to be created and added manually.
You can create an instance using the ObjectFactory.domainObjectSet() method:
abstract class MyPluginExtensionDomainObjectSet {
// Define a domain object set to hold strings
val myStrings: DomainObjectSet<String> = project.objects.domainObjectSet(String::class)
// Add some strings to the domain object set
fun addString(value: String) {
myStrings.add(value)
}
}
abstract class MyPluginExtensionDomainObjectSet {
// Define a domain object set to hold strings
DomainObjectSet<String> myStrings = project.objects.domainObjectSet(String)
// Add some strings to the domain object set
void addString(String value) {
myStrings.add(value)
}
}
2. NamedDomainObjectSet
A NamedDomainObjectSet
holds a set of configurable objects, where each element has a name associated with it.
This is similar to NamedDomainObjectContainer
, however a NamedDomainObjectSet
doesn’t manage the objects in the collection.
They need to be created and added manually.
You can create an instance using the ObjectFactory.namedDomainObjectSet() method.
abstract class Person(val name: String)
abstract class MyPluginExtensionNamedDomainObjectSet {
// Define a named domain object set to hold Person objects
private val people: NamedDomainObjectSet<Person> = project.objects.namedDomainObjectSet(Person::class)
// Add a person to the set (elements are created externally and added manually)
fun addPerson(person: Person) {
people.add(person)
}
}
abstract class Person {
String name
}
abstract class MyPluginExtensionNamedDomainObjectSet {
// Define a named domain object set to hold Person objects
NamedDomainObjectSet<Person> people = project.objects.namedDomainObjectSet(Person)
// Add a person to the set (elements are created externally and added manually)
void addPerson(Person person) {
people.add(person)
}
}
3. NamedDomainObjectList
A NamedDomainObjectList
holds a list of configurable objects, where each element has a name associated with it.
This is similar to NamedDomainObjectContainer
, however a NamedDomainObjectList
doesn’t manage the objects in the collection.
They need to be created and added manually.
You can create an instance using the ObjectFactory.namedDomainObjectList() method.
abstract class Person(val name: String)
abstract class MyPluginExtensionNamedDomainObjectList {
// Define a named domain object list to hold Person objects
private val people: NamedDomainObjectList<Person> = project.objects.namedDomainObjectList(Person::class)
// Add a person to the list (elements are created externally and added manually)
fun addPerson(person: Person) {
people.add(person)
}
}
abstract class Person {
String name
}
abstract class MyPluginExtensionNamedDomainObjectList {
// Define a named domain object list to hold Person objects
NamedDomainObjectList<Person> people = project.objects.namedDomainObjectList(Person)
// Add a person to the list (elements are created externally and added manually)
void addPerson(Person person) {
people.add(person)
}
}
4. NamedDomainObjectContainer
A NamedDomainObjectContainer
manages a set of objects, where each element has a name associated with it.
The container takes care of creating and configuring the elements, and provides a DSL that build scripts can use to define and configure elements. It is intended to hold objects which are themselves configurable, for example a set of custom Gradle objects.
Gradle uses NamedDomainObjectContainer
type extensively throughout the API.
For example, the project.tasks
object used to manage the tasks of a project is a NamedDomainObjectContainer<Task>
.
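For instance, the familiar task configuration DSL goes through this container API; the task name hello below is a hypothetical example:

```kotlin
// tasks is a NamedDomainObjectContainer<Task>
tasks.register("hello") {   // create and configure a new element by name
    description = "A sample task"
    doLast { println("Hello!") }
}
tasks.named("hello") {      // configure an existing element by name
    group = "samples"
}
```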
You can create a container instance using the ObjectFactory service, which provides the ObjectFactory.domainObjectContainer() method.
This is also available using the Project.container() method, however in a custom Gradle type it’s generally better to use the injected ObjectFactory
service instead of passing around a Project
instance.
You can also create a container instance using a read-only managed property.
abstract class Person(val name: String)
abstract class MyPluginExtensionNamedDomainObjectContainer {
// Define a named domain object container to hold Person objects
private val people: NamedDomainObjectContainer<Person> = project.container(Person::class)
// Add a person to the container
fun addPerson(name: String) {
people.create(name)
}
}
abstract class Person {
String name
}
abstract class MyPluginExtensionNamedDomainObjectContainer {
// Define a named domain object container to hold Person objects
NamedDomainObjectContainer<Person> people = project.container(Person)
// Add a person to the container
void addPerson(String name) {
people.create(name: name)
}
}
In order to use a type with any of the domainObjectContainer()
methods, it must either
-
be a named managed type; or
-
expose a property named “name” as the unique, and constant, name for the object. The
domainObjectContainer(Class)
variant of the method creates new instances by calling the constructor of the class that takes a string argument, which is the desired name of the object.
Objects created this way are treated as custom Gradle types, and so can make use of the features discussed in this chapter, for example service injection or managed properties.
See the above link for domainObjectContainer()
method variants that allow custom instantiation strategies:
public interface DownloadExtension {
NamedDomainObjectContainer<Resource> getResources();
}
public interface Resource {
// Type must have a read-only 'name' property
String getName();
Property<URI> getUri();
Property<String> getUserName();
}
For each container property, Gradle automatically adds a block to the Groovy and Kotlin DSL that you can use to configure the contents of the container:
plugins {
id("org.gradle.sample.download")
}
download {
// Can use a block to configure the container contents
resources {
register("gradle") {
uri = uri("https://gradle.org")
}
}
}
plugins {
id("org.gradle.sample.download")
}
download {
// Can use a block to configure the container contents
resources {
register('gradle') {
uri = uri('https://gradle.org')
}
}
}
5. ExtensiblePolymorphicDomainObjectContainer
An ExtensiblePolymorphicDomainObjectContainer is a NamedDomainObjectContainer
that allows you to
define instantiation strategies for different types of objects.
You can create an instance using the ObjectFactory.polymorphicDomainObjectContainer() method:
abstract class Animal(val name: String)
class Dog(name: String, val breed: String) : Animal(name)
abstract class MyPluginExtensionExtensiblePolymorphicDomainObjectContainer(objectFactory: ObjectFactory) {
// Define a container for animals
private val animals: ExtensiblePolymorphicDomainObjectContainer<Animal> = objectFactory.polymorphicDomainObjectContainer(Animal::class)
// Add a dog to the container
fun addDog(name: String, breed: String) {
val dog = Dog(name, breed)
animals.add(dog)
}
}
abstract class Animal {
String name
}
abstract class Dog extends Animal {
String breed
}
abstract class MyPluginExtensionExtensiblePolymorphicDomainObjectContainer {
// Define a container for animals
ExtensiblePolymorphicDomainObjectContainer<Animal> animals
MyPluginExtensionExtensiblePolymorphicDomainObjectContainer(ObjectFactory objectFactory) {
// Create the container
animals = objectFactory.polymorphicDomainObjectContainer(Animal)
}
// Add a dog to the container
void addDog(String name, String breed) {
animals.create(Dog, name: name, breed: breed)
}
}
Understanding Services and Service Injection
Gradle provides a number of useful services that can be used by custom Gradle types. For example, the WorkerExecutor service can be used by a task to run work in parallel, as seen in the worker API section. The services are made available through service injection.
Available services
The following services are available for injection:
-
ObjectFactory
- Allows model objects to be created. -
ProjectLayout
- Provides access to key project locations. -
BuildLayout
- Provides access to important locations for a Gradle build. -
ProviderFactory
- CreatesProvider
instances. -
WorkerExecutor
- Allows a task to run work in parallel. -
FileSystemOperations
- Allows a task to run operations on the filesystem such as deleting files, copying files or syncing directories. -
ArchiveOperations
- Allows a task to run operations on archive files such as ZIP or TAR files. -
ExecOperations
- Allows a task to run external processes with dedicated support for running externaljava
programs. -
ToolingModelBuilderRegistry
- Allows a plugin to register a Gradle tooling API model.
Out of the above, ProjectLayout
and WorkerExecutor
services are only available for injection in project plugins.
BuildLayout
is only available in settings plugins and settings files.
ProjectLayout
is unavailable in Worker API actions.
1. ObjectFactory
ObjectFactory
is a service for creating custom Gradle types, allowing you to define nested objects and DSLs in your build logic.
It provides methods for creating instances of different types, such as properties (Property<T>
), collections (ListProperty<T>
, SetProperty<T>
, MapProperty<K, V>
), file-related objects (RegularFileProperty
, DirectoryProperty
, ConfigurableFileCollection
, ConfigurableFileTree
), and more.
You can obtain an instance of ObjectFactory
using the project.objects
property.
Here’s a simple example demonstrating how to use ObjectFactory
to create a property and set its value:
tasks.register("myObjectFactoryTask") {
doLast {
val objectFactory = project.objects
val myProperty = objectFactory.property(String::class)
myProperty.set("Hello, Gradle!")
println(myProperty.get())
}
}
tasks.register("myObjectFactoryTask") {
doLast {
def objectFactory = project.objects
def myProperty = objectFactory.property(String)
myProperty.set("Hello, Gradle!")
println myProperty.get()
}
}
Tip: It is preferable to let Gradle create objects automatically by using managed properties.
Using ObjectFactory
to create these objects ensures that they are properly managed by Gradle, especially in terms of configuration avoidance and lazy evaluation.
This means that the values of these objects are only calculated when needed, which can improve build performance.
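As an illustrative sketch of this laziness (the property name is hypothetical), a value wired into a property is not computed until it is queried:

```kotlin
val lazyValue = project.objects.property(String::class)
lazyValue.set(project.provider {
    println("computing...")   // runs only when the value is queried
    "expensive result"
})
// Nothing has been printed yet; the computation runs on get()
println(lazyValue.get())
```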
In the following example, a project extension called DownloadExtension
receives an ObjectFactory
instance through its constructor.
The constructor uses this to create a nested Resource
object (also a custom Gradle type) and makes this object available through the resource
property:
public class DownloadExtension {
// A nested instance
private final Resource resource;
@Inject
public DownloadExtension(ObjectFactory objectFactory) {
// Use an injected ObjectFactory to create a Resource object
resource = objectFactory.newInstance(Resource.class);
}
public Resource getResource() {
return resource;
}
}
public interface Resource {
Property<URI> getUri();
}
Here is another example using javax.inject.Inject
:
abstract class MyObjectFactoryTask
@Inject constructor(private var objectFactory: ObjectFactory) : DefaultTask() {
@TaskAction
fun doTaskAction() {
val outputDirectory = objectFactory.directoryProperty()
outputDirectory.convention(project.layout.projectDirectory)
println(outputDirectory.get())
}
}
tasks.register("myInjectedObjectFactoryTask", MyObjectFactoryTask::class) {}
abstract class MyObjectFactoryTask extends DefaultTask {
private ObjectFactory objectFactory
@Inject //@javax.inject.Inject
MyObjectFactoryTask(ObjectFactory objectFactory) {
this.objectFactory = objectFactory
}
@TaskAction
void doTaskAction() {
var outputDirectory = objectFactory.directoryProperty()
outputDirectory.convention(project.layout.projectDirectory)
println(outputDirectory.get())
}
}
tasks.register("myInjectedObjectFactoryTask",MyObjectFactoryTask) {}
The MyObjectFactoryTask
task uses an ObjectFactory
instance, which is injected into the task’s constructor using the @Inject
annotation.
2. ProjectLayout
ProjectLayout
is a service that provides access to the layout of a Gradle project’s directories and files.
It’s part of the org.gradle.api.file
package and allows you to query the project’s layout to get information about source sets, build directories, and other file-related aspects of the project.
You can obtain a ProjectLayout
instance from a Project
object using the project.layout
property.
Here’s a simple example:
tasks.register("showLayout") {
doLast {
val layout = project.layout
println("Project Directory: ${layout.projectDirectory}")
println("Build Directory: ${layout.buildDirectory.get()}")
}
}
tasks.register('showLayout') {
doLast {
def layout = project.layout
println "Project Directory: ${layout.projectDirectory}"
println "Build Directory: ${layout.buildDirectory.get()}"
}
}
Here is an example using javax.inject.Inject
:
abstract class MyProjectLayoutTask
@Inject constructor(private var projectLayout: ProjectLayout) : DefaultTask() {
@TaskAction
fun doTaskAction() {
val outputDirectory = projectLayout.projectDirectory
println(outputDirectory)
}
}
tasks.register("myInjectedProjectLayoutTask", MyProjectLayoutTask::class) {}
abstract class MyProjectLayoutTask extends DefaultTask {
private ProjectLayout projectLayout
@Inject //@javax.inject.Inject
MyProjectLayoutTask(ProjectLayout projectLayout) {
this.projectLayout = projectLayout
}
@TaskAction
void doTaskAction() {
var outputDirectory = projectLayout.projectDirectory
println(outputDirectory)
}
}
tasks.register("myInjectedProjectLayoutTask",MyProjectLayoutTask) {}
The MyProjectLayoutTask
task uses a ProjectLayout
instance, which is injected into the task’s constructor using the @Inject
annotation.
3. BuildLayout
BuildLayout
is a service that provides access to the root and settings directories in a Settings plugin or a Settings script; it is analogous to ProjectLayout
.
It is part of the org.gradle.api.file
package and exposes standard build-wide file system locations as lazily computed values.
Note: These APIs are currently incubating but should eventually replace the existing accessors in Settings, which return eagerly computed locations:
Settings.rootDir → Settings.layout.rootDirectory
Settings.settingsDir → Settings.layout.settingsDirectory
You can obtain a BuildLayout
instance from a Settings
object using the settings.layout
property.
Here’s a simple example:
println("Root Directory: ${settings.layout.rootDirectory}")
println("Settings Directory: ${settings.layout.settingsDirectory}")
println "Root Directory: ${settings.getLayout().rootDirectory}"
println "Settings Directory: ${settings.getLayout().settingsDirectory}"
Here is an example using javax.inject.Inject:
abstract class MyBuildLayoutPlugin @Inject constructor(private val buildLayout: BuildLayout) : Plugin<Settings> {
override fun apply(settings: Settings) {
println(buildLayout.rootDirectory)
}
}
apply<MyBuildLayoutPlugin>()
abstract class MyBuildLayoutPlugin implements Plugin<Settings> {
private BuildLayout buildLayout
@Inject //@javax.inject.Inject
MyBuildLayoutPlugin(BuildLayout buildLayout) {
this.buildLayout = buildLayout
}
@Override void apply(Settings settings) {
// the meat and potatoes of the plugin
println buildLayout.rootDirectory
}
}
apply plugin: MyBuildLayoutPlugin
This code defines a MyBuildLayoutPlugin
plugin that implements the Plugin
interface for the Settings
type.
The plugin expects a BuildLayout
instance to be injected into its constructor using the @Inject
annotation.
4. ProviderFactory
ProviderFactory
is a service that provides methods for creating different types of providers.
Providers are used to model values that may be computed lazily in your build scripts.
The ProviderFactory
interface provides methods for creating various types of providers, including:
-
provider(Callable<T> value)
to create a provider with a value that is lazily computed based on a Callable.
-
provider(Provider<T> value)
to create a provider that simply wraps an existing provider.
-
property(Class<T> type)
to create a property provider for a specific type.
-
gradleProperty(Class<T> type)
to create a property provider that reads its value from a Gradle project property.
Here’s a simple example demonstrating the use of ProviderFactory
using project.providers
:
tasks.register("printMessage") {
doLast {
val providerFactory = project.providers
val messageProvider = providerFactory.provider { "Hello, Gradle!" }
println(messageProvider.get())
}
}
tasks.register('printMessage') {
doLast {
def providerFactory = project.providers
def messageProvider = providerFactory.provider { "Hello, Gradle!" }
println messageProvider.get()
}
}
The task named printMessage
uses the ProviderFactory
to create a provider
that supplies the message string.
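Similarly, gradleProperty can be used to read a Gradle project property lazily. The following is a sketch; the property name myVersion and its fallback value are made up for illustration:

```kotlin
tasks.register("printVersion") {
    // Lazily reads -PmyVersion=... (or an entry in gradle.properties);
    // orElse() supplies a fallback when the property is not set
    val versionProvider = providers.gradleProperty("myVersion").orElse("unspecified")
    doLast {
        println("Version: ${versionProvider.get()}")
    }
}
```

Because the provider is only queried inside doLast, the property value is not read until the task actually executes.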
Here is an example using javax.inject.Inject:
abstract class MyProviderFactoryTask
@Inject constructor(private var providerFactory: ProviderFactory) : DefaultTask() {
@TaskAction
fun doTaskAction() {
val outputDirectory = providerFactory.provider { "build/my-file.txt" }
println(outputDirectory.get())
}
}
tasks.register("myInjectedProviderFactoryTask", MyProviderFactoryTask::class) {}
abstract class MyProviderFactoryTask extends DefaultTask {
private ProviderFactory providerFactory
@Inject //@javax.inject.Inject
MyProviderFactoryTask(ProviderFactory providerFactory) {
this.providerFactory = providerFactory
}
@TaskAction
void doTaskAction() {
var outputDirectory = providerFactory.provider { "build/my-file.txt" }
println(outputDirectory.get())
}
}
tasks.register("myInjectedProviderFactoryTask",MyProviderFactoryTask) {}
The ProviderFactory
service is injected into the MyProviderFactoryTask
task’s constructor using the @Inject
annotation.
5. WorkerExecutor
WorkerExecutor
is a service that lets a task submit units of work to be executed in parallel, optionally in separate worker processes.
This is particularly useful for tasks that perform CPU-intensive or long-running operations, as it allows them to be executed in parallel, improving build performance.
Using WorkerExecutor
, you can submit units of work (called actions) to be executed in separate worker processes.
This helps isolate the work from the main Gradle process, providing better reliability and performance.
Here’s a basic example of how you might use WorkerExecutor
in a build script:
abstract class MyWorkAction : WorkAction<WorkParameters.None> {
private val greeting: String = "Hello from a Worker!"
override fun execute() {
println(greeting)
}
}
abstract class MyWorkerTask
@Inject constructor(private var workerExecutor: WorkerExecutor) : DefaultTask() {
@get:Input
abstract val booleanFlag: Property<Boolean>
@TaskAction
fun doThings() {
workerExecutor.noIsolation().submit(MyWorkAction::class.java) {}
}
}
tasks.register("myWorkTask", MyWorkerTask::class) {}
abstract class MyWorkAction implements WorkAction<WorkParameters.None> {
private final String greeting;
@Inject
public MyWorkAction() {
this.greeting = "Hello from a Worker!";
}
@Override
public void execute() {
System.out.println(greeting);
}
}
abstract class MyWorkerTask extends DefaultTask {
@Input
abstract Property<Boolean> getBooleanFlag()
@Inject
abstract WorkerExecutor getWorkerExecutor()
@TaskAction
void doThings() {
workerExecutor.noIsolation().submit(MyWorkAction) {}
}
}
tasks.register("myWorkTask", MyWorkerTask) {}
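Besides noIsolation(), the Worker API also offers classLoaderIsolation() and processIsolation(), which run the work with an isolated classpath or in a forked worker JVM. As a sketch, the task action above could submit its work with process isolation instead (the heap size value is illustrative):

```kotlin
// Each unit of work runs in a forked worker process configured here
workerExecutor.processIsolation {
    forkOptions {
        maxHeapSize = "512m"
    }
}.submit(MyWorkAction::class.java) {}
```

Process isolation is the most expensive mode but protects the build process from native crashes or memory-hungry work.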
See the worker API for more details.
6. FileSystemOperations
FileSystemOperations
is a service that provides methods for performing file system operations such as copying, deleting, and syncing.
It is part of the org.gradle.api.file
package and is typically used in custom tasks or plugins to interact with the file system.
Here is an example using javax.inject.Inject:
abstract class MyFileSystemOperationsTask
@Inject constructor(private var fileSystemOperations: FileSystemOperations) : DefaultTask() {
@TaskAction
fun doTaskAction() {
fileSystemOperations.sync {
from("src")
into("dest")
}
}
}
tasks.register("myInjectedFileSystemOperationsTask", MyFileSystemOperationsTask::class)
abstract class MyFileSystemOperationsTask extends DefaultTask {
private FileSystemOperations fileSystemOperations
@Inject //@javax.inject.Inject
MyFileSystemOperationsTask(FileSystemOperations fileSystemOperations) {
this.fileSystemOperations = fileSystemOperations
}
@TaskAction
void doTaskAction() {
fileSystemOperations.sync {
from 'src'
into 'dest'
}
}
}
tasks.register("myInjectedFileSystemOperationsTask", MyFileSystemOperationsTask)
The FileSystemOperations
service is injected into the MyFileSystemOperationsTask
task’s constructor using the @Inject
annotation.
With some ceremony, it is possible to use FileSystemOperations
in an ad-hoc task defined in a build script:
interface InjectedFsOps {
@get:Inject val fs: FileSystemOperations
}
tasks.register("myAdHocFileSystemOperationsTask") {
val injected = project.objects.newInstance<InjectedFsOps>()
doLast {
injected.fs.copy {
from("src")
into("dest")
}
}
}
interface InjectedFsOps {
@Inject //@javax.inject.Inject
FileSystemOperations getFs()
}
tasks.register('myAdHocFileSystemOperationsTask') {
def injected = project.objects.newInstance(InjectedFsOps)
doLast {
injected.fs.copy {
from 'source'
into 'destination'
}
}
}
First, you need to declare an interface with a property of type FileSystemOperations
, here named InjectedFsOps
, to serve as an injection point.
Then call the method ObjectFactory.newInstance
to generate an implementation of the interface that holds an injected service.
Tip
|
This is a good time to consider extracting the ad-hoc task into a proper class. |
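FileSystemOperations also supports deletion. A minimal sketch of a clean-up task, assuming a generated-docs output directory under the build directory (the directory name is made up for illustration):

```kotlin
abstract class MyCleanupTask
@Inject constructor(
    private val fs: FileSystemOperations,
    private val layout: ProjectLayout
) : DefaultTask() {
    @TaskAction
    fun doTaskAction() {
        // Deletes the directory and its contents if present
        fs.delete {
            delete(layout.buildDirectory.dir("generated-docs"))
        }
    }
}
tasks.register("myCleanupTask", MyCleanupTask::class)
```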
7. ArchiveOperations
ArchiveOperations
is a service that provides methods for accessing the contents of archives, such as ZIP and TAR files.
It is part of the org.gradle.api.file
package and is typically used in custom tasks or plugins to unpack archive files.
Here is an example using javax.inject.Inject:
abstract class MyArchiveOperationsTask
@Inject constructor(
private val archiveOperations: ArchiveOperations,
private val layout: ProjectLayout,
private val fs: FileSystemOperations
) : DefaultTask() {
@TaskAction
fun doTaskAction() {
fs.sync {
from(archiveOperations.zipTree(layout.projectDirectory.file("sources.jar")))
into(layout.buildDirectory.dir("unpacked-sources"))
}
}
}
tasks.register("myInjectedArchiveOperationsTask", MyArchiveOperationsTask::class)
abstract class MyArchiveOperationsTask extends DefaultTask {
private ArchiveOperations archiveOperations
private ProjectLayout layout
private FileSystemOperations fs
@Inject
MyArchiveOperationsTask(ArchiveOperations archiveOperations, ProjectLayout layout, FileSystemOperations fs) {
this.archiveOperations = archiveOperations
this.layout = layout
this.fs = fs
}
@TaskAction
void doTaskAction() {
fs.sync {
from(archiveOperations.zipTree(layout.projectDirectory.file("sources.jar")))
into(layout.buildDirectory.dir("unpacked-sources"))
}
}
}
tasks.register("myInjectedArchiveOperationsTask", MyArchiveOperationsTask)
The ArchiveOperations
service is injected into the MyArchiveOperationsTask
task’s constructor using the @Inject
annotation.
With some ceremony, it is possible to use ArchiveOperations
in an ad-hoc task defined in a build script:
interface InjectedArcOps {
@get:Inject val arcOps: ArchiveOperations
}
tasks.register("myAdHocArchiveOperationsTask") {
val injected = project.objects.newInstance<InjectedArcOps>()
val archiveFile = "${project.projectDir}/sources.jar"
doLast {
injected.arcOps.zipTree(archiveFile)
}
}
interface InjectedArcOps {
@Inject //@javax.inject.Inject
ArchiveOperations getArcOps()
}
tasks.register('myAdHocArchiveOperationsTask') {
def injected = project.objects.newInstance(InjectedArcOps)
def archiveFile = "${projectDir}/sources.jar"
doLast {
injected.arcOps.zipTree(archiveFile)
}
}
First, you need to declare an interface with a property of type ArchiveOperations
, here named InjectedArcOps
, to serve as an injection point.
Then call the method ObjectFactory.newInstance
to generate an implementation of the interface that holds an injected service.
Tip
|
This is a good time to consider extracting the ad-hoc task into a proper class. |
8. ExecOperations
ExecOperations
is a service that provides methods for executing external processes (commands) from within a build script.
It is part of the org.gradle.process
package and is typically used in custom tasks or plugins to run command-line tools or scripts as part of the build process.
Here is an example using javax.inject.Inject:
abstract class MyExecOperationsTask
@Inject constructor(private var execOperations: ExecOperations) : DefaultTask() {
@TaskAction
fun doTaskAction() {
execOperations.exec {
commandLine("ls", "-la")
}
}
}
tasks.register("myInjectedExecOperationsTask", MyExecOperationsTask::class)
abstract class MyExecOperationsTask extends DefaultTask {
private ExecOperations execOperations
@Inject //@javax.inject.Inject
MyExecOperationsTask(ExecOperations execOperations) {
this.execOperations = execOperations
}
@TaskAction
void doTaskAction() {
execOperations.exec {
commandLine 'ls', '-la'
}
}
}
tasks.register("myInjectedExecOperationsTask", MyExecOperationsTask)
The ExecOperations
service is injected into the MyExecOperationsTask
task’s constructor using the @Inject
annotation.
With some ceremony, it is possible to use ExecOperations
in an ad-hoc task defined in a build script:
interface InjectedExecOps {
@get:Inject val execOps: ExecOperations
}
tasks.register("myAdHocExecOperationsTask") {
val injected = project.objects.newInstance<InjectedExecOps>()
doLast {
injected.execOps.exec {
commandLine("ls", "-la")
}
}
}
interface InjectedExecOps {
@Inject //@javax.inject.Inject
ExecOperations getExecOps()
}
tasks.register('myAdHocExecOperationsTask') {
def injected = project.objects.newInstance(InjectedExecOps)
doLast {
injected.execOps.exec {
commandLine 'ls', '-la'
}
}
}
First, you need to declare an interface with a property of type ExecOperations
, here named InjectedExecOps
, to serve as an injection point.
Then call the method ObjectFactory.newInstance
to generate an implementation of the interface that holds an injected service.
Tip
|
This is a good time to consider extracting the ad-hoc task into a proper class. |
9. ToolingModelBuilderRegistry
ToolingModelBuilderRegistry
is a service that allows you to register custom tooling model builders.
Tooling models are used to provide rich IDE integration for Gradle projects, allowing IDEs to understand and work with the project’s structure, dependencies, and other aspects.
The ToolingModelBuilderRegistry
interface is part of the org.gradle.tooling.provider.model
package and is typically used in custom Gradle plugins that provide enhanced IDE support.
Here’s a simplified example:
// Implements the ToolingModelBuilder interface.
// This interface is used in Gradle to define custom tooling models that can
// be accessed by IDEs or other tools through the Gradle tooling API.
class OrtModelBuilder : ToolingModelBuilder {
private val repositories: MutableMap<String, String> = mutableMapOf()
private val platformCategories: Set<String> = setOf("platform", "enforced-platform")
private val visitedDependencies: MutableSet<ModuleComponentIdentifier> = mutableSetOf()
private val visitedProjects: MutableSet<ModuleVersionIdentifier> = mutableSetOf()
private val logger = Logging.getLogger(OrtModelBuilder::class.java)
private val errors: MutableList<String> = mutableListOf()
private val warnings: MutableList<String> = mutableListOf()
override fun canBuild(modelName: String): Boolean {
return false
}
override fun buildAll(modelName: String, project: Project): Any? {
return null
}
}
// Plugin is responsible for registering a custom tooling model builder
// (OrtModelBuilder) with the ToolingModelBuilderRegistry, which allows
// IDEs and other tools to access the custom tooling model.
class OrtModelPlugin(private val registry: ToolingModelBuilderRegistry) : Plugin<Project> {
override fun apply(project: Project) {
registry.register(OrtModelBuilder())
}
}
// Implements the ToolingModelBuilder interface.
// This interface is used in Gradle to define custom tooling models that can
// be accessed by IDEs or other tools through the Gradle tooling API.
class OrtModelBuilder implements ToolingModelBuilder {
private Map<String, String> repositories = [:]
private Set<String> platformCategories = ["platform", "enforced-platform"]
private Set<ModuleComponentIdentifier> visitedDependencies = []
private Set<ModuleVersionIdentifier> visitedProjects = []
private static final logger = Logging.getLogger(OrtModelBuilder.class)
private List<String> errors = []
private List<String> warnings = []
@Override
boolean canBuild(String modelName) {
return false
}
@Override
Object buildAll(String modelName, Project project) {
return null
}
}
// Plugin is responsible for registering a custom tooling model builder
// (OrtModelBuilder) with the ToolingModelBuilderRegistry, which allows
// IDEs and other tools to access the custom tooling model.
class OrtModelPlugin implements Plugin<Project> {
ToolingModelBuilderRegistry registry
OrtModelPlugin(ToolingModelBuilderRegistry registry) {
this.registry = registry
}
void apply(Project project) {
registry.register(new OrtModelBuilder())
}
}
Constructor injection
There are two ways that an object can receive the services it needs. The first option is to add the service as a parameter of the class constructor.
The constructor must be annotated with the javax.inject.Inject
annotation.
Gradle uses the declared type of each constructor parameter to determine the services that the object requires.
The order of the constructor parameters and their names are not significant and can be whatever you like.
Here is an example that shows a task type that receives an ObjectFactory
via its constructor:
public class Download extends DefaultTask {
private final DirectoryProperty outputDirectory;
// Inject an ObjectFactory into the constructor
@Inject
public Download(ObjectFactory objectFactory) {
// Use the factory
outputDirectory = objectFactory.directoryProperty();
}
@OutputDirectory
public DirectoryProperty getOutputDirectory() {
return outputDirectory;
}
@TaskAction
void run() {
// ...
}
}
Property injection
Alternatively, a service can be injected by adding a property getter method annotated with the javax.inject.Inject
annotation to the class.
This can be useful, for example, when you cannot change the constructor of the class due to backwards compatibility constraints.
This pattern also allows Gradle to defer creation of the service until the getter method is called, rather than when the instance is created. This can help with performance.
Gradle uses the declared return type of the getter method to determine the service to make available. The name of the property is not significant and can be whatever you like.
The property getter method must be public
or protected
. The method can be abstract
or, in cases where this isn’t possible, can have a dummy method body.
The method body is discarded.
Here is an example that shows a task type that receives two services via property getter methods:
public abstract class Download extends DefaultTask {
// Use an abstract getter method
@Inject
protected abstract ObjectFactory getObjectFactory();
// Alternatively, use a getter method with a dummy implementation
@Inject
protected WorkerExecutor getWorkerExecutor() {
// Method body is ignored
throw new UnsupportedOperationException();
}
@TaskAction
void run() {
WorkerExecutor workerExecutor = getWorkerExecutor();
ObjectFactory objectFactory = getObjectFactory();
// Use the executor and factory ...
}
}
STRUCTURING BUILDS
Structuring Projects with Gradle
It is important to structure your Gradle project to optimize build performance. A multi-project build is the standard in Gradle.
A multi-project build consists of one root project and one or more subprojects. Gradle can build the root project and any number of the subprojects in a single execution.
Project locations
Multi-project builds contain a single root project in a directory that Gradle views as the root path: .
.
Subprojects are located physically under the root path: ./subproject
.
A subproject has a path, which denotes the position of that subproject in the multi-project build. In most cases, the project path is consistent with its location in the file system.
The project structure is created in the settings.gradle(.kts)
file.
The settings file must be present in the root directory.
A simple multi-project build
Let’s look at a basic multi-project build example that contains a root project and a single subproject.
The root project is called basic-multiproject
, located somewhere on your machine.
From Gradle’s perspective, the root is the top-level directory .
.
The project contains a single subproject called ./app
:
.
├── app
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts
.
├── app
│ ...
│ └── build.gradle
└── settings.gradle
This is the recommended project structure for starting any Gradle project. The build init plugin also generates skeleton projects that follow this structure: a root project with a single subproject.
The settings.gradle(.kts)
file describes the project structure to Gradle:
rootProject.name = "basic-multiproject"
include("app")
rootProject.name = 'basic-multiproject'
include 'app'
In this case, Gradle will look for a build file for the app
subproject in the ./app
directory.
You can view the structure of a multi-project build by running the projects
command:
$ ./gradlew -q projects

Projects:

------------------------------------------------------------
Root project 'basic-multiproject'
------------------------------------------------------------

Root project 'basic-multiproject'
\--- Project ':app'

To see a list of the tasks of a project, run gradle <project-path>:tasks
For example, try running gradle :app:tasks
In this example, the app
subproject is a Java application that applies the application plugin and configures the main class.
The application prints Hello World
to the console:
plugins {
id("application")
}
application {
mainClass = "com.example.Hello"
}
plugins {
id 'application'
}
application {
mainClass = 'com.example.Hello'
}
package com.example;
public class Hello {
public static void main(String[] args) {
System.out.println("Hello, world!");
}
}
You can run the application by executing the run
task from the application plugin in the project root:
$ ./gradlew -q run
Hello, world!
Adding a subproject
In the settings file, you can use the include
method to add another subproject to the root project:
include("project1", "project2:child1", "project3:child1")
include 'project1', 'project2:child1', 'project3:child1'
The include
method takes project paths as arguments.
The project path is assumed to be equal to the relative physical file system path.
For example, a path services:api
is mapped by default to a folder ./services/api
(relative to the project root .
).
More examples of how to work with the project path can be found in the DSL documentation of Settings.include(java.lang.String[]).
Let’s add another subproject called lib
to the previously created project.
All we need to do is add another include
statement in the root settings file:
rootProject.name = "basic-multiproject"
include("app")
include("lib")
rootProject.name = 'basic-multiproject'
include 'app'
include 'lib'
Gradle will then look for the build file of the new lib
subproject in the ./lib/
directory:
.
├── app
│ ...
│ └── build.gradle.kts
├── lib
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts
.
├── app
│ ...
│ └── build.gradle
├── lib
│ ...
│ └── build.gradle
└── settings.gradle
Project Descriptors
To further describe the project architecture to Gradle, the settings file provides project descriptors.
You can modify these descriptors in the settings file at any time.
To access a descriptor, you can:
include("project-a")
println(rootProject.name)
println(project(":project-a").name)
include('project-a')
println rootProject.name
println project(':project-a').name
Using this descriptor, you can change the name, project directory, and build file of a project:
rootProject.name = "main"
include("project-a")
project(":project-a").projectDir = file("custom/my-project-a")
project(":project-a").buildFileName = "project-a.gradle.kts"
rootProject.name = 'main'
include('project-a')
project(':project-a').projectDir = file('custom/my-project-a')
project(':project-a').buildFileName = 'project-a.gradle'
Consult the ProjectDescriptor class in the API documentation for more information.
Modifying a subproject path
Let’s take a hypothetical project with the following structure:
.
├── app
│ ...
│ └── build.gradle.kts
├── subs // Gradle may see this as a subproject
│ └── web // Gradle may see this as a subproject
│ └── my-web-module // Intended subproject
│ ...
│ └── build.gradle.kts
└── settings.gradle.kts
.
├── app
│ ...
│ └── build.gradle
├── subs // Gradle may see this as a subproject
│ └── web // Gradle may see this as a subproject
│ └── my-web-module // Intended subproject
│ ...
│ └── build.gradle
└── settings.gradle
If your settings.gradle(.kts)
looks like this:
include(':subs:web:my-web-module')
Gradle sees a subproject with a logical project name of :subs:web:my-web-module
and two, possibly unintentional, other subprojects logically named :subs
and :subs:web
.
This can lead to phantom build directories, especially when using allprojects {}
or subprojects {}
.
To avoid this, you can use:
include(':my-web-module')
project(':my-web-module').projectDir = file('subs/web/my-web-module')
So that you only end up with a single subproject named :my-web-module
.
So, while the physical project layout is the same, the logical results are different.
Naming recommendations
As your project grows, naming and consistency get increasingly more important. To keep your builds maintainable, we recommend the following:
-
Keep default project names for subprojects: It is possible to configure custom project names in the settings file. However, it’s an unnecessary extra effort for the developers to track which projects belong to what folders.
-
Use lower case hyphenation for all project names: All letters are lowercase, and words are separated with a dash (
-
) character. -
Define the root project name in the settings file: The
rootProject.name
effectively assigns a name to the build, which is used in reports like Build Scans. If the root project name is not set, the name defaults to the container directory name, which can be unstable (you can check out your project into any directory). If the root project name is not set and the project is checked out at a file system root (e.g., / or C:\), the name is generated randomly.
Declaring Dependencies between Subprojects
What if one subproject depends on another subproject? What if one project needs the artifact produced by another project?
This is a common use case for multi-project builds. Gradle offers project dependencies for this.
Depending on another project
Let’s explore a theoretical multi-project build with the following layout:
.
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts
└── settings.gradle.kts
.
├── api
│ ├── src
│ │ └──...
│ └── build.gradle
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle
└── settings.gradle
In this example, there are three subprojects called shared
, api
, and person-service
:
-
The person-service
subproject depends on the other two subprojects, shared and api.
-
The api
subproject depends on the shared
subproject.
We use the :
separator to define a project path such as services:person-service
or :shared
.
Consult the DSL documentation of Settings.include(java.lang.String[]) for more information about defining project paths.
rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
}
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
implementation(project(":shared"))
}
plugins {
id("java")
}
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
implementation(project(":shared"))
implementation(project(":api"))
}
rootProject.name = 'dependencies-java'
include 'api', 'shared', 'services:person-service'
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
}
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
implementation project(':shared')
}
plugins {
id 'java'
}
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
implementation project(':shared')
implementation project(':api')
}
A project dependency affects execution order. It causes the other project to be built first and adds the output with the classes of the other project to the classpath. It also adds the dependencies of the other project to the classpath.
If you execute ./gradlew :api:compileJava
, first the shared
project is built, and then the api
project is built.
Depending on artifacts produced by another project
Sometimes, you might want to depend on the output of a specific task within another project rather than the entire project. However, explicitly declaring a task dependency from one project to another is discouraged as it introduces unnecessary coupling between tasks.
The recommended way to model dependencies, where a task in one project depends on the output of another, is to produce the output and mark it as an "outgoing" artifact. Gradle’s dependency management engine allows you to share arbitrary artifacts between projects and build them on demand.
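As a rough sketch of this approach (the configuration name instrumentedJars and the project path :producer are illustrative), the producing project declares a consumable configuration and attaches a task output to it, and the consuming project depends on that configuration rather than on the task itself:

```kotlin
// producer/build.gradle.kts: expose the jar as an outgoing artifact
val instrumentedJars by configurations.creating {
    isCanBeConsumed = true   // other projects may depend on this configuration
    isCanBeResolved = false  // it is not used to resolve dependencies itself
}
artifacts {
    add("instrumentedJars", tasks.named("jar"))
}

// consumer/build.gradle.kts: depend on the producer's outgoing configuration
dependencies {
    implementation(project(path = ":producer", configuration = "instrumentedJars"))
}
```

Gradle then builds the producer's jar on demand and wires the task dependency automatically, without the consumer ever naming the producer's tasks.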
Sharing Build Logic between Subprojects
Subprojects in a multi-project build typically share some common dependencies.
Instead of copying and pasting the same Java version and libraries in each subproject build script, Gradle provides a special directory for storing shared build logic that can be automatically applied to subprojects.
Share logic in buildSrc
buildSrc
is a Gradle-recognized and protected directory which comes with some benefits:
-
Reusable Build Logic:
buildSrc
allows you to organize and centralize your custom build logic, tasks, and plugins in a structured manner. The code written in buildSrc can be reused across your project, making it easier to maintain and share common build functionality. -
Isolation from the Main Build:
Code placed in
buildSrc
is isolated from the other build scripts of your project. This helps keep the main build scripts cleaner and more focused on project-specific configurations. -
Automatic Compilation and Classpath:
The contents of the
buildSrc
directory are automatically compiled and included in the classpath of your main build. This means that classes and plugins defined in buildSrc can be directly used in your project’s build scripts without any additional configuration. -
Ease of Testing:
Since
buildSrc
is a separate build, it allows for easy testing of your custom build logic. You can write tests for your build code, ensuring that it behaves as expected. -
Gradle Plugin Development:
If you are developing custom Gradle plugins for your project,
buildSrc
is a convenient place to house the plugin code. This makes the plugins easily accessible within your project.
The buildSrc
directory is treated as an included build.
For multi-project builds, there can be only one buildSrc
directory, which must be in the root project directory.
Note
|
The downside of using buildSrc is that any change to it will invalidate every task in your project and require a rerun.
|
buildSrc
uses the same source code conventions applicable to Java, Groovy, and Kotlin projects.
It also provides direct access to the Gradle API.
A typical project including buildSrc
has the following layout:
.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──kotlin
│ │ └──MyCustomTask.kt // (1)
│ ├── shared.gradle.kts // (2)
│ └── build.gradle.kts
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts // (3)
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts // (3)
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts
└── settings.gradle.kts
-
Create the
MyCustomTask
task. -
A shared build script.
-
Uses the
MyCustomTask
task and shared build script.
.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──groovy
│ │ └──MyCustomTask.groovy // (1)
│ ├── shared.gradle // (2)
│ └── build.gradle
├── api
│ ├── src
│ │ └──...
│ └── build.gradle // (3)
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle // (3)
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle
└── settings.gradle
-
Create the
MyCustomTask
task. -
A shared build script.
-
Uses the
MyCustomTask
task and shared build script.
In the buildSrc
, the build script shared.gradle(.kts)
is created.
It contains dependencies and other build information that is common to multiple subprojects:
repositories {
mavenCentral()
}
dependencies {
implementation("org.slf4j:slf4j-api:1.7.32")
}
repositories {
mavenCentral()
}
dependencies {
implementation 'org.slf4j:slf4j-api:1.7.32'
}
In the buildSrc
, the MyCustomTask
is also created.
It is a helper task that is used as part of the build logic for multiple subprojects:
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction
open class MyCustomTask : DefaultTask() {
@TaskAction
fun calculateSum() {
// Custom logic to calculate the sum of two numbers
val num1 = 5
val num2 = 7
val sum = num1 + num2
// Print the result
println("Sum: $sum")
}
}
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction
class MyCustomTask extends DefaultTask {
@TaskAction
void calculateSum() {
// Custom logic to calculate the sum of two numbers
int num1 = 5
int num2 = 7
int sum = num1 + num2
// Print the result
println "Sum: $sum"
}
}
The MyCustomTask
task is used in the build script of the api
and shared
projects.
The task is automatically available because it’s part of buildSrc
.
The shared.gradle(.kts)
file is also applied:
// Apply any other configurations specific to your project
// Use the build script defined in buildSrc
apply(from = rootProject.file("buildSrc/shared.gradle.kts"))
// Use the custom task defined in buildSrc
tasks.register<MyCustomTask>("myCustomTask")
// Apply any other configurations specific to your project
// Use the build script defined in buildSrc
apply from: rootProject.file('buildSrc/shared.gradle')
// Use the custom task defined in buildSrc
tasks.register('myCustomTask', MyCustomTask)
Share logic using convention plugins
Gradle’s recommended way of organizing build logic is to use its plugin system.
We can write a plugin that encapsulates the build logic common to several subprojects in a project. This kind of plugin is called a convention plugin.
While writing plugins is outside the scope of this section, the recommended way to build a Gradle project is to put common build logic in a convention plugin located in the buildSrc directory.
Let’s take a look at an example project:
.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──kotlin
│ │ └──myproject.java-conventions.gradle.kts // (1)
│ └── build.gradle.kts
├── api
│ ├── src
│ │ └──...
│ └── build.gradle.kts // (2)
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle.kts // (2)
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle.kts // (2)
└── settings.gradle.kts
-
Creates the myproject.java-conventions convention plugin.
-
Applies the myproject.java-conventions convention plugin.
.
├── buildSrc
│ ├── src
│ │ └──main
│ │ └──groovy
│ │ └──myproject.java-conventions.gradle // (1)
│ └── build.gradle
├── api
│ ├── src
│ │ └──...
│ └── build.gradle // (2)
├── services
│ └── person-service
│ ├── src
│ │ └──...
│ └── build.gradle // (2)
├── shared
│ ├── src
│ │ └──...
│ └── build.gradle // (2)
└── settings.gradle
-
Creates the myproject.java-conventions convention plugin.
-
Applies the myproject.java-conventions convention plugin.
This build contains three subprojects:
rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")
rootProject.name = 'dependencies-java'
include 'api', 'shared', 'services:person-service'
The source code for the convention plugin created in the buildSrc
directory is as follows:
plugins {
id("java")
}
group = "com.example"
version = "1.0"
repositories {
mavenCentral()
}
dependencies {
testImplementation("junit:junit:4.13")
}
plugins {
id 'java'
}
group = 'com.example'
version = '1.0'
repositories {
mavenCentral()
}
dependencies {
testImplementation "junit:junit:4.13"
}
For the convention plugin to compile, basic configuration needs to be applied in the build file of the buildSrc
directory:
plugins {
`kotlin-dsl`
}
repositories {
mavenCentral()
}
plugins {
id 'groovy-gradle-plugin'
}
The convention plugin is applied to the api
, shared
, and person-service
subprojects:
plugins {
id("myproject.java-conventions")
}
dependencies {
implementation(project(":shared"))
}
plugins {
id("myproject.java-conventions")
}
plugins {
id("myproject.java-conventions")
}
dependencies {
implementation(project(":shared"))
implementation(project(":api"))
}
plugins {
id 'myproject.java-conventions'
}
dependencies {
implementation project(':shared')
}
plugins {
id 'myproject.java-conventions'
}
plugins {
id 'myproject.java-conventions'
}
dependencies {
implementation project(':shared')
implementation project(':api')
}
Do not use cross-project configuration
An improper way to share build logic between subprojects is cross-project configuration via the subprojects {}
and allprojects {}
DSL constructs.
Tip
|
Avoid using subprojects {} and allprojects {} .
|
With cross-project configuration, build logic can be injected into a subproject which is not obvious when looking at its build script.
In the long run, cross-project configuration usually grows in complexity and becomes a burden. Cross-project configuration can also introduce configuration-time coupling between projects, which can prevent optimizations like configuration-on-demand from working properly.
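As an illustration, a root build script like the following sketch injects build logic into every subproject; none of that logic appears in the subprojects' own build files, which is exactly the pattern to avoid:

```kotlin
// Discouraged: cross-project configuration from the root build script.
// Every subproject silently receives this logic, creating
// configuration-time coupling between projects.
subprojects {
    apply(plugin = "java")
    repositories {
        mavenCentral()
    }
}
```

A convention plugin in buildSrc expresses the same configuration, but each subproject opts in explicitly by applying the plugin.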
Convention plugins versus cross-project configuration
The two most common uses of cross-project configuration can be better modeled using convention plugins:
-
Applying plugins or other configurations to subprojects of a certain type.
Often, the cross-project configuration logic is if subproject is of type X, then configure Y. This is equivalent to applying the X-conventions plugin directly to a subproject.
-
Extracting information from subprojects of a certain type.
This use case can be modeled using outgoing configuration variants.
Composite Builds
A composite build is a build that includes other builds.
A composite build is similar to a Gradle multi-project build, except that instead of including subprojects
, entire builds
are included.
Composite builds allow you to:
-
Combine builds that are usually developed independently, for instance, when trying out a bug fix in a library that your application uses.
-
Decompose a large multi-project build into smaller, more isolated chunks that can be worked on independently or together as needed.
A build that is included in a composite build is referred to as an included build. Included builds do not share any configuration with the composite build or the other included builds. Each included build is configured and executed in isolation.
Defining a composite build
The following example demonstrates how two Gradle builds, normally developed separately, can be combined into a composite build.
my-composite
├── gradle
├── gradlew
├── settings.gradle.kts
├── build.gradle.kts
├── my-app
│ ├── settings.gradle.kts
│ └── app
│ ├── build.gradle.kts
│ └── src/main/java/org/sample/my-app/Main.java
└── my-utils
├── settings.gradle.kts
├── number-utils
│ ├── build.gradle.kts
│ └── src/main/java/org/sample/numberutils/Numbers.java
└── string-utils
├── build.gradle.kts
└── src/main/java/org/sample/stringutils/Strings.java
The my-utils
multi-project build produces two Java libraries, number-utils
and string-utils
.
The my-app
build produces an executable using functions from those libraries.
The my-app
build does not depend directly on my-utils
.
Instead, it declares binary dependencies on the libraries produced by my-utils
:
plugins {
id("application")
}
application {
mainClass = "org.sample.myapp.Main"
}
dependencies {
implementation("org.sample:number-utils:1.0")
implementation("org.sample:string-utils:1.0")
}
plugins {
id 'application'
}
application {
mainClass = 'org.sample.myapp.Main'
}
dependencies {
implementation 'org.sample:number-utils:1.0'
implementation 'org.sample:string-utils:1.0'
}
Defining a composite build via --include-build
The --include-build
command-line argument turns the executed build into a composite, substituting dependencies from the included build into the executed build.
For example, the output of ./gradlew run --include-build ../my-utils, run from the my-app directory:
$ ./gradlew --include-build ../my-utils run
Defining a composite build via the settings file
It’s possible to make the above arrangement persistent by using Settings.includeBuild(java.lang.Object) to declare the included build in the settings.gradle(.kts)
file.
The settings file can be used to add subprojects and included builds simultaneously.
Included builds are added by location:
includeBuild("my-utils")
In the example, the settings.gradle(.kts) file combines otherwise separate builds:
rootProject.name = "my-composite"
includeBuild("my-app")
includeBuild("my-utils")
rootProject.name = 'my-composite'
includeBuild 'my-app'
includeBuild 'my-utils'
To execute the run
task in the my-app
build from my-composite
, run ./gradlew my-app:app:run
.
You can optionally define a run
task in my-composite
that depends on my-app:app:run
so that you can execute ./gradlew run
:
tasks.register("run") {
dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
tasks.register('run') {
dependsOn gradle.includedBuild('my-app').task(':app:run')
}
Including builds that define Gradle plugins
A special case of included builds are builds that define Gradle plugins.
These builds should be included using the includeBuild
statement inside the pluginManagement {}
block of the settings file.
Using this mechanism, the included build may also contribute a settings plugin that can be applied in the settings file itself:
pluginManagement {
includeBuild("../url-verifier-plugin")
}
pluginManagement {
includeBuild '../url-verifier-plugin'
}
Restrictions on included builds
Most builds can be included in a composite, including other composite builds. There are some restrictions.
In a regular build, Gradle ensures that each project has a unique project path. It makes projects identifiable and addressable without conflicts.
In a composite build, Gradle adds additional qualification to each project from an included build to avoid project path conflicts. The full path to identify a project in a composite build is called a build-tree path. It consists of a build path of an included build and a project path of the project.
By default, build paths and project paths are derived from directory names and structure on disk. Since included builds can be located anywhere on disk, their build path is determined by the name of the containing directory. This can sometimes lead to conflicts.
To summarize, the included builds must fulfill these requirements:
-
Each included build must have a unique build path.
-
Each included build path must not conflict with any project path of the main build.
These conditions guarantee that each project can be uniquely identified even in a composite build.
If conflicts arise, the way to resolve them is by changing the build name of an included build:
includeBuild("some-included-build") {
name = "other-name"
}
Note
|
When a composite build is included in another composite build, both builds have the same parent. In other words, the nested composite build structure is flattened. |
Interacting with a composite build
Interacting with a composite build is generally similar to a regular multi-project build. Tasks can be executed, tests can be run, and builds can be imported into the IDE.
Executing tasks
Tasks from an included build can be executed from the command-line or IDE in the same way as tasks from a regular multi-project build. Executing a task will result in task dependencies being executed, as well as those tasks required to build dependency artifacts from other included builds.
You can call a task in an included build using a fully qualified path, for example, :included-build-name:project-name:taskName
.
Project and task names can be abbreviated.
$ ./gradlew :included-build:subproject-a:compileJava
> Task :included-build:subproject-a:compileJava

$ ./gradlew :i-b:sA:cJ
> Task :included-build:subproject-a:compileJava
To exclude a task from the command line, you need to provide the fully qualified path to the task.
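For example (paths hypothetical), excluding a test task that lives in an included build requires spelling out its full path; the abbreviated form shown above is not accepted for exclusion:

```shell
# -x (--exclude-task) with the fully qualified path of the
# included build's task
./gradlew build -x :included-build:subproject-a:test
```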
Note
|
Included build tasks are automatically executed to generate required dependency artifacts, or the including build can declare a dependency on a task from an included build. |
Importing into the IDE
One of the most useful features of composite builds is IDE integration.
Importing a composite build permits sources from separate Gradle builds to be easily developed together. For every included build, each subproject is included as an IntelliJ IDEA Module or Eclipse Project. Source dependencies are configured, providing cross-build navigation and refactoring.
Declaring dependencies substituted by an included build
By default, Gradle will configure each included build to determine the dependencies it can provide.
The algorithm for doing this is simple.
Gradle will inspect the group and name for the projects in the included build and substitute project dependencies for any external dependency matching ${project.group}:${project.name}
.
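For instance, if a library project in the included build declares coordinates like the following sketch (values assumed for illustration), any external dependency on org.sample:number-utils in the composite is replaced by a dependency on that project:

```kotlin
// In the included build's number-utils/build.gradle.kts (assumed values):
// together with the project name, these form the coordinates
// "org.sample:number-utils", so a matching external dependency
// in the composite is substituted automatically.
group = "org.sample"
version = "1.0"
```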
Note
|
By default, substitutions are not registered for the main build. To make the (sub)projects of the main build addressable by ${project.group}:${project.name}, you can tell Gradle to treat the main build like an included build by self-including it: includeBuild("."). |
There are cases when the default substitutions determined by Gradle are insufficient or must be corrected for a particular composite. For these cases, explicitly declaring the substitutions for an included build is possible.
For example, a single-project build called anonymous-library produces a Java utility library but does not declare a value for the group attribute:
plugins {
java
}
plugins {
id 'java'
}
When this build is included in a composite, it will attempt to substitute for the dependency module undefined:anonymous-library
(undefined
being the default value for project.group
, and anonymous-library
being the root project name).
Clearly, this isn’t useful in a composite build.
To use the unpublished library in a composite build, you can explicitly declare the substitutions that it provides:
includeBuild("anonymous-library") {
dependencySubstitution {
substitute(module("org.sample:number-utils")).using(project(":"))
}
}
includeBuild('anonymous-library') {
dependencySubstitution {
substitute module('org.sample:number-utils') using project(':')
}
}
With this configuration, the my-app
composite build will substitute any dependency on org.sample:number-utils
with a dependency on the root project of anonymous-library
.
Deactivate included build substitutions for a configuration
If you need to resolve a published version of a module that is also available as part of an included build, you can deactivate the included build substitution rules on the ResolutionStrategy of the Configuration that is resolved. This is necessary because the rules are globally applied in the build, and Gradle does not consider published versions during resolution by default.
For example, we create a separate publishedRuntimeClasspath
configuration that gets resolved to the published versions of modules that also exist in one of the local builds.
This is done by deactivating global dependency substitution rules:
configurations.create("publishedRuntimeClasspath") {
resolutionStrategy.useGlobalDependencySubstitutionRules = false
extendsFrom(configurations.runtimeClasspath.get())
isCanBeConsumed = false
attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage.JAVA_RUNTIME))
}
configurations.create('publishedRuntimeClasspath') {
resolutionStrategy.useGlobalDependencySubstitutionRules = false
extendsFrom(configurations.runtimeClasspath)
canBeConsumed = false
attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage.JAVA_RUNTIME))
}
A use-case would be to compare published and locally built JAR files.
Cases where included build substitutions must be declared
Many builds will function automatically as an included build, without declared substitutions. Here are some common cases where declared substitutions are required:
-
When the archivesBaseName property is used to set the name of the published artifact.
-
When a configuration other than default is published.
-
When MavenPom.addFilter() is used to publish artifacts that don’t match the project name.
-
When the maven-publish or ivy-publish plugins are used for publishing and the publication coordinates don’t match ${project.group}:${project.name}.
Cases where composite build substitutions won’t work
Some builds won’t function correctly when included in a composite, even when dependency substitutions are explicitly declared.
This limitation is because a substituted project dependency will always point to the default
configuration of the target project.
Any time the artifacts and dependencies specified for the default configuration of a project don’t match what is published to a repository, the composite build may exhibit different behavior.
Here are some cases where the published module metadata may be different from the project default configuration:
-
When a configuration other than default is published.
-
When the maven-publish or ivy-publish plugins are used.
-
When the POM or ivy.xml file is tweaked as part of publication.
Builds using these features function incorrectly when included in a composite build.
Depending on tasks in an included build
While included builds are isolated from one another and cannot declare direct dependencies, a composite build can declare task dependencies on its included builds. The included builds are accessed using Gradle.getIncludedBuilds() or Gradle.includedBuild(java.lang.String), and a task reference is obtained via the IncludedBuild.task(java.lang.String) method.
Using these APIs, it is possible to declare a dependency on a task in a particular included build:
tasks.register("run") {
dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
tasks.register('run') {
dependsOn gradle.includedBuild('my-app').task(':app:run')
}
Or you can declare a dependency on tasks with a certain path in some or all of the included builds:
tasks.register("publishDeps") {
dependsOn(gradle.includedBuilds.map { it.task(":publishMavenPublicationToMavenRepository") })
}
tasks.register('publishDeps') {
dependsOn gradle.includedBuilds*.task(':publishMavenPublicationToMavenRepository')
}
Limitations of composite builds
Limitations of the current implementation include:
-
No support for included builds with publications that don’t mirror the project default configuration.
See Cases where composite build substitutions won’t work.
-
Multiple composite builds may conflict when run in parallel if more than one includes the same build.
Gradle does not share the project lock of a shared composite build between Gradle invocations to prevent concurrent execution.
Configuration On Demand
Configuration-on-demand attempts to configure only the relevant projects for the requested tasks, i.e., it only evaluates the build script file of projects participating in the build. This way, the configuration time of a large multi-project build can be reduced.
The configuration-on-demand feature is incubating, so not every build is guaranteed to work correctly. The feature works well for decoupled multi-project builds.
In configuration-on-demand mode, projects are configured as follows:
-
The root project is always configured.
-
The project in the directory where the build is executed is also configured, but only when Gradle is executed without any tasks.
This way, the default tasks behave correctly when projects are configured on demand.
-
The standard project dependencies are supported, and relevant projects are configured.
If project A has a compile dependency on project B, then building A causes the configuration of both projects.
-
The task dependencies declared via the task path are supported and cause relevant projects to be configured.
Example: someTask.dependsOn(":some-other-project:someOtherTask")
-
A task requested via task path from the command line (or tooling API) causes the relevant project to be configured.
For example, building project-a:project-b:someTask causes configuration of project-b.
Enable configuration-on-demand
You can enable configuration-on-demand using the --configure-on-demand
flag or adding org.gradle.configureondemand=true
to the gradle.properties
file.
To configure on demand with every build run, see Gradle properties.
To configure on demand for a given build, see command-line performance-oriented options.
Decoupled projects
Gradle allows projects to access each other’s configurations and tasks during the configuration and execution phases. While this flexibility empowers build authors, it limits Gradle’s ability to perform optimizations such as parallel project builds and configuration on demand.
Projects are considered decoupled when they interact solely through declared dependencies and task dependencies. Any direct modification or reading of another project’s object creates coupling between the projects. Coupling during configuration can result in flawed build outcomes when using 'configuration on demand', while coupling during execution can affect parallel execution.
One common source of coupling is configuration injection, such as using allprojects{}
or subprojects{}
in build scripts.
To avoid coupling issues, it’s recommended to:
-
Refrain from referencing other subprojects' build scripts and prefer cross-project configuration from the root project.
-
Avoid dynamically changing other projects' configurations during execution.
As Gradle evolves, it aims to provide features that leverage decoupled projects while offering solutions for common use cases like configuration injection without introducing coupling.
Parallel projects
Gradle’s parallel execution feature optimizes CPU utilization to accelerate builds by concurrently executing tasks from different projects.
To enable parallel execution, use the --parallel
command-line argument or configure your build environment.
Gradle automatically determines the optimal number of parallel threads based on CPU cores.
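To enable parallel execution persistently, the following properties can be set in gradle.properties (org.gradle.workers.max is optional and shown with an arbitrary value):

```properties
# Enable parallel project execution for every build in this environment
org.gradle.parallel=true
# Optionally cap the number of worker processes
# (by default Gradle uses the number of CPU cores)
org.gradle.workers.max=4
```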
During parallel execution, each worker handles a specific project exclusively. Task dependencies are respected, with workers prioritizing upstream tasks. However, tasks may not execute in alphabetical order, as in sequential mode. It’s crucial to correctly declare task dependencies and inputs/outputs to avoid ordering issues.
DEVELOPING TASKS
Understanding Tasks
A task represents some independent unit of work that a build performs, such as compiling classes, creating a JAR, generating Javadoc, or publishing archives to a repository.
Before reading this chapter, it’s recommended that you first read the Learning The Basics and complete the Tutorial.
Listing tasks
All available tasks in your project come from Gradle plugins and build scripts.
You can list all the available tasks in a project by running the following command in the terminal:
$ ./gradlew tasks
Let’s take a very basic Gradle project as an example. The project has the following structure:
gradle-project
├── app
│ ├── build.gradle.kts // empty file - no build logic
│ └── ... // some java code
├── settings.gradle.kts // includes app subproject
├── gradle
│ └── ...
├── gradlew
└── gradlew.bat
gradle-project
├── app
│ ├── build.gradle // empty file - no build logic
│ └── ... // some java code
├── settings.gradle // includes app subproject
├── gradle
│ └── ...
├── gradlew
└── gradlew.bat
The settings file contains the following:
rootProject.name = "gradle-project"
include("app")
rootProject.name = 'gradle-project'
include('app')
Currently, the app
subproject’s build file is empty.
To see the tasks available in the app
subproject, run ./gradlew :app:tasks
:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project ':app'.
tasks - Displays the tasks runnable from project ':app'.
We observe that only a small number of help tasks are available at the moment. This is because the core of Gradle only provides tasks that analyze your build. Other tasks, such as those that build your project or compile your code, are added by plugins.
Let’s explore this by adding the Gradle core base
plugin to the app
build script:
plugins {
id("base")
}
plugins {
id('base')
}
The base
plugin adds central lifecycle tasks.
Now when we run ./gradlew app:tasks
, we can see the assemble
and build
tasks are available:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
clean - Deletes the build directory.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project ':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
Task outcomes
When Gradle executes a task, it labels the task with outcomes via the console.
These labels are based on whether a task has actions to execute and if Gradle executed them. Actions include, but are not limited to, compiling code, zipping files, and publishing archives.
(no label) or EXECUTED
-
Task executed its actions.
-
Task has actions and Gradle executed them.
-
Task has no actions and some dependencies, and Gradle executed one or more of the dependencies. See also Lifecycle Tasks.
-
UP-TO-DATE
-
Task’s outputs did not change.
-
Task has outputs and inputs but they have not changed. See Incremental Build.
-
Task has actions, but the task tells Gradle it did not change its outputs.
-
Task has no actions and some dependencies, but all the dependencies are UP-TO-DATE, SKIPPED, or FROM-CACHE. See Lifecycle Tasks.
-
Task has no actions and no dependencies.
-
FROM-CACHE
-
Task’s outputs could be found from a previous execution.
-
Task has outputs restored from the build cache. See Build Cache.
-
SKIPPED
-
Task did not execute its actions.
-
Task has been explicitly excluded from the command-line. See Excluding tasks from execution.
-
Task has an onlyIf predicate that returns false. See Using a predicate.
-
NO-SOURCE
-
Task did not need to execute its actions.
-
Task has inputs and outputs, but no sources (i.e., inputs were not found).
-
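As a concrete way to observe these outcomes, a task with declared inputs and outputs, such as this sketch using Gradle’s built-in Copy type (paths hypothetical), reports EXECUTED on the first run and UP-TO-DATE on a second run with unchanged inputs:

```kotlin
// First run: EXECUTED. Second run with unchanged sources: UP-TO-DATE,
// because the Copy type declares its inputs and outputs for Gradle.
tasks.register<Copy>("copyDocs") {
    from("src/docs")                         // declared inputs
    into(layout.buildDirectory.dir("docs"))  // declared outputs
}
```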
Task group and description
Task groups and descriptions are used to organize and describe tasks.
- Groups
-
Task groups are used to categorize tasks. When you run
./gradlew tasks
, tasks are listed under their respective groups, making it easier to understand their purpose and relationship to other tasks. Groups are set using thegroup
property. - Descriptions
-
Descriptions provide a brief explanation of what a task does. When you run
./gradlew tasks
, the descriptions are shown next to each task, helping you understand its purpose and how to use it. Descriptions are set using thedescription
property.
Let’s consider a basic Java application as an example.
The build contains a subproject called app
.
Let’s list the available tasks in app
at the moment:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application.
Build tasks
-----------
assemble - Assembles the outputs of this project.
Here, the :run
task is part of the Application
group with the description Runs this project as a JVM application
.
In code, it would look something like this:
tasks.register("run") {
group = "Application"
description = "Runs this project as a JVM application."
}
tasks.register("run") {
group = "Application"
description = "Runs this project as a JVM application."
}
Private and hidden tasks
Gradle doesn’t support marking a task as private.
However, tasks will only show up when running :tasks
if task.group
is set or no other task depends on it.
For instance, the following task will not appear when running ./gradlew :app:tasks
because it does not have a group; it is called a hidden task:
tasks.register("helloTask") {
println("Hello")
}
tasks.register("helloTask") {
println 'Hello'
}
Although helloTask
is not listed, it can still be executed by Gradle:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
Let’s add a group to the same task:
tasks.register("helloTask") {
group = "Other"
description = "Hello task"
println("Hello")
}
tasks.register("helloTask") {
group = "Other"
description = "Hello task"
println 'Hello'
}
Now that the group is added, the task is visible:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
Other tasks
-----------
helloTask - Hello task
In contrast, ./gradlew tasks --all
will show all tasks; hidden and visible tasks are listed.
Grouping tasks
If you want to customize which tasks are shown to users when listed, you can group tasks and set the visibility of each group.
Note
|
Remember, even if you hide tasks, they are still available, and Gradle can still run them. |
Let’s start with an example built by Gradle init
for a Java application with multiple subprojects.
The project structure is as follows:
gradle-project
├── app
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── utilities
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── list
│ ├── build.gradle.kts
│ └── src // some java code
│ └── ...
├── buildSrc
│ ├── build.gradle.kts
│ ├── settings.gradle.kts
│ └── src // common build logic
│ └── ...
├── settings.gradle.kts
├── gradle
├── gradlew
└── gradlew.bat
gradle-project
├── app
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── utilities
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── list
│ ├── build.gradle
│ └── src // some java code
│ └── ...
├── buildSrc
│ ├── build.gradle
│ ├── settings.gradle
│ └── src // common build logic
│ └── ...
├── settings.gradle
├── gradle
├── gradlew
└── gradlew.bat
Run app:tasks
to see available tasks in the app
subproject:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Application tasks
-----------------
run - Runs this project as a JVM application
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.
Distribution tasks
------------------
assembleDist - Assembles the main distributions
distTar - Bundles the project as a distribution.
distZip - Bundles the project as a distribution.
installDist - Installs the project as a distribution as-is.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project ':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.
The list of available tasks, even for a standard Java project, is extensive. Many of these tasks are rarely needed directly by developers using the build.
We can configure the :tasks task to limit the tasks shown to a certain group.
Let's create our own group, so that all other tasks are hidden by default, by updating the app build script:
val myBuildGroup = "my app build" // Create a group name
tasks.register<TaskReportTask>("tasksAll") { // Register the tasksAll task
group = myBuildGroup
description = "Show additional tasks."
setShowDetail(true)
}
tasks.named<TaskReportTask>("tasks") { // Move all existing tasks to the group
displayGroup = myBuildGroup
}
def myBuildGroup = "my app build" // Create a group name
tasks.register(TaskReportTask, "tasksAll") { // Register the tasksAll task
group = myBuildGroup
description = "Show additional tasks."
setShowDetail(true)
}
tasks.named(TaskReportTask, "tasks") { // Move all existing tasks to the group
displayGroup = myBuildGroup
}
Now, when we list the tasks available in app, the list is shorter:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
My app build tasks
------------------
tasksAll - Show additional tasks.
Task categories
Gradle distinguishes between two categories of tasks:
-
Lifecycle tasks
-
Actionable tasks
Lifecycle tasks define targets you can call, such as :build, which builds your project.
Lifecycle tasks do not provide Gradle with actions.
They must be wired to actionable tasks.
The base Gradle plugin only adds lifecycle tasks.
Actionable tasks define actions for Gradle to take, such as :compileJava, which compiles the Java code of your project.
Actions include creating JARs, zipping files, publishing archives, and much more.
Plugins like the java-library plugin add actionable tasks.
Let's update the build script of the previous example, which is currently an empty file, so that our app subproject becomes a Java library:
plugins {
id("java-library")
}
plugins {
id('java-library')
}
Once again, we list the available tasks to see what new tasks are available:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.
Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.
Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project ':app'.
tasks - Displays the tasks runnable from project ':app'.
Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.
We see that many new tasks are available, such as jar and testClasses.
Additionally, the java-library plugin has wired actionable tasks to lifecycle tasks.
If we call the :build task, we can see that several tasks are executed, including the :app:compileJava task.
$ ./gradlew :app:build
> Task :app:compileJava
> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:jar
> Task :app:assemble
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check
> Task :app:build
The actionable :compileJava task is wired to the lifecycle :build task.
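The same wiring pattern is available to your own build logic. As a sketch in Kotlin DSL (the verifyAll task name is a hypothetical example, not part of this project), a custom lifecycle task simply aggregates actionable tasks via dependsOn:

```kotlin
// Hypothetical lifecycle task: it defines no actions of its own and
// only serves as a target that aggregates already-registered tasks.
tasks.register("verifyAll") {
    group = "verification"
    description = "Compiles the code and runs all checks."
    dependsOn(tasks.named("classes"), tasks.named("check"))
}
```

Calling ./gradlew verifyAll would then execute the actionable tasks it is wired to, while verifyAll itself performs no work.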
Incremental tasks
A key feature of Gradle tasks is their incremental nature.
Gradle can reuse results from prior builds.
Therefore, if we've built our project before and made only minor changes, rerunning :build will not require Gradle to perform extensive work.
For example, if we modify only the test code in our project, leaving the production code unchanged, executing the build will solely recompile the test code.
Gradle marks the tasks for the production code as UP-TO-DATE, indicating that it remains unchanged since the last successful build:
$ ./gradlew :app:build
> Task :app:compileJava UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:assemble UP-TO-DATE
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check UP-TO-DATE
> Task :app:build UP-TO-DATE
Caching tasks
Gradle can reuse results from past builds using the build cache.
To enable this feature, activate the build cache by using the --build-cache command-line parameter or by setting org.gradle.caching=true in your gradle.properties file.
This optimization has the potential to accelerate your builds significantly:
$ ./gradlew :app:clean :app:build --build-cache
> Task :app:compileJava FROM-CACHE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar
> Task :app:assemble
> Task :app:compileTestJava FROM-CACHE
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses UP-TO-DATE
> Task :app:test FROM-CACHE
> Task :app:check UP-TO-DATE
> Task :app:build
When Gradle can fetch outputs of a task from the cache, it labels the task with FROM-CACHE.
The build cache is handy if you switch between branches regularly. Gradle supports both local and remote build caches.
Developing tasks
When developing Gradle tasks, you have two choices:
-
Use an existing Gradle task type such as Zip, Copy, or Delete
-
Create your own Gradle task type such as MyResolveTask or CustomTaskUsingToolchains.
Task types are simply subclasses of the Gradle Task class.
With Gradle tasks, there are three states to consider:
-
Registering a task - using a task (implemented by you or provided by Gradle) in your build logic.
-
Configuring a task - defining inputs and outputs for a registered task.
-
Implementing a task - creating a custom task class (i.e., custom class type).
Registration is commonly done with the register() method.
Configuring a task is commonly done with the named() method.
Implementing a task is commonly done by extending Gradle's DefaultTask class:
tasks.register<Copy>("myCopy") // (1)
tasks.named<Copy>("myCopy") { // (2)
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
abstract class MyCopyTask : DefaultTask() { // (3)
@TaskAction
fun copyFiles() {
val sourceDir = File("sourceDir")
val destinationDir = File("destinationDir")
sourceDir.listFiles()?.forEach { file ->
if (file.isFile && file.extension == "txt") {
file.copyTo(File(destinationDir, file.name))
}
}
}
}
-
Register the myCopy task of type Copy to let Gradle know we intend to use it in our build logic.
-
Configure the registered myCopy task with the inputs and outputs it needs according to its API.
-
Implement a custom task type called MyCopyTask which extends DefaultTask and defines the copyFiles task action.
tasks.register(Copy, "myCopy") // (1)
tasks.named(Copy, "myCopy") { // (2)
from "resources"
into "target"
include "**/*.txt", "**/*.xml", "**/*.properties"
}
abstract class MyCopyTask extends DefaultTask { // (3)
@TaskAction
void copyFiles() {
fileTree('sourceDir').matching {
include '**/*.txt'
}.forEach { file ->
file.copyTo(file.path.replace('sourceDir', 'destinationDir'))
}
}
}
-
Register the myCopy task of type Copy to let Gradle know we intend to use it in our build logic.
-
Configure the registered myCopy task with the inputs and outputs it needs according to its API.
-
Implement a custom task type called MyCopyTask which extends DefaultTask and defines the copyFiles task action.
1. Registering tasks
You define actions for Gradle to take by registering tasks in build scripts or plugins.
Tasks are defined using strings for task names:
tasks.register("hello") {
doLast {
println("hello")
}
}
tasks.register('hello') {
doLast {
println 'hello'
}
}
In the example above, the task is added to the TaskCollection using the register() method in TaskContainer.
2. Configuring tasks
Gradle tasks must be configured to complete their action(s) successfully.
If a task needs to ZIP a file, it must be configured with the file name and location.
You can refer to the API for the Gradle Zip task to learn how to configure it appropriately.
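For instance, a minimal Zip configuration could look like the following Kotlin DSL sketch (the packageDocs task name and the paths are illustrative assumptions):

```kotlin
// Hypothetical example: bundle the docs directory into build/dist/docs.zip
tasks.register<Zip>("packageDocs") {
    from("docs")                                                // what to zip
    archiveFileName.set("docs.zip")                             // name of the archive
    destinationDirectory.set(layout.buildDirectory.dir("dist")) // where to put it
}
```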
Let's look at the Copy task provided by Gradle as an example.
We first register a task called myCopy of type Copy in the build script:
tasks.register<Copy>("myCopy")
tasks.register('myCopy', Copy)
This registers a copy task with no default behavior.
Since the task is of type Copy, a Gradle-supported task type, it can be configured using its API.
The following examples show several ways to achieve the same configuration:
1. Using the named() method:
Use named() to configure an existing task registered elsewhere:
tasks.named<Copy>("myCopy") {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
tasks.named('myCopy') {
from 'resources'
into 'target'
include('**/*.txt', '**/*.xml', '**/*.properties')
}
2. Using a configuration block:
Use a block to configure the task immediately upon registering it:
tasks.register<Copy>("copy") {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
tasks.register('copy', Copy) {
from 'resources'
into 'target'
include('**/*.txt', '**/*.xml', '**/*.properties')
}
3. Name method as call:
A popular option that is only supported in Groovy is the shorthand notation:
copy {
from("resources")
into("target")
include("**/*.txt", "**/*.xml", "**/*.properties")
}
Note: This option breaks task configuration avoidance and is not recommended!
Regardless of the method chosen, the task is configured with the name of the files to be copied and the location of the files.
3. Implementing tasks
Gradle provides many task types, including Delete, Javadoc, Copy, Exec, Tar, and Pmd.
You can implement a custom task type if Gradle does not provide a task type that meets your build logic needs.
To create a custom task class, you extend DefaultTask and make the extending class abstract:
abstract class MyCopyTask : DefaultTask() {
}
abstract class MyCopyTask extends DefaultTask {
}
Controlling Task Execution
Task dependencies allow tasks to be executed in a specific order: a task that depends on others only runs after those dependencies have completed.
Task dependencies can be categorized as either implicit or explicit:
- Implicit dependencies: These dependencies are automatically inferred by Gradle based on the tasks' actions and configuration. For example, if taskB uses the output of taskA (e.g., a file generated by taskA), Gradle will automatically ensure that taskA is executed before taskB to fulfill this dependency.
- Explicit dependencies: These dependencies are explicitly declared in the build script using the dependsOn, mustRunAfter, or shouldRunAfter methods. For example, if you want to ensure that taskB always runs after taskA, you can explicitly declare this dependency using taskB.mustRunAfter(taskA).
Both implicit and explicit dependencies play a crucial role in defining the order of task execution and ensuring that tasks are executed in the correct sequence to produce the desired build output.
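Both flavors can be sketched in Kotlin DSL as follows (all task names here are hypothetical):

```kotlin
// Implicit: taskB consumes taskA's registered output, so Gradle
// infers that taskA must run before taskB.
val taskA = tasks.register("taskA") {
    val outFile = layout.buildDirectory.file("a.txt")
    outputs.file(outFile)
    doLast { outFile.get().asFile.writeText("from taskA") }
}
tasks.register("taskB") {
    inputs.files(taskA) // wiring the producer's outputs adds the dependency
    doLast { println(inputs.files.singleFile.readText()) }
}

// Explicit: taskD declares its dependency with dependsOn.
val taskC = tasks.register("taskC") { doLast { println("taskC") } }
tasks.register("taskD") {
    dependsOn(taskC)
    doLast { println("taskD") }
}
```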
Task dependencies
Gradle inherently understands the dependencies among tasks. Consequently, it can determine the tasks that need execution when you target a specific task.
Let's take an example application with an app subproject and a some-logic subproject:
rootProject.name = "gradle-project"
include("app")
include("some-logic")
rootProject.name = 'gradle-project'
include('app')
include('some-logic')
Let's imagine that the app subproject depends on the subproject called some-logic, which contains some Java code.
We add this dependency in the app build script:
plugins {
id("application") // app is now a java application
}
application {
mainClass.set("hello.HelloWorld") // main class name required by the application plugin
}
dependencies {
implementation(project(":some-logic")) // dependency on some-logic
}
plugins {
id('application') // app is now a java application
}
application {
mainClass = 'hello.HelloWorld' // main class name required by the application plugin
}
dependencies {
implementation(project(':some-logic')) // dependency on some-logic
}
If we run :app:build again, we see that the Java code of some-logic is also compiled by Gradle automatically:
$ ./gradlew :app:build
> Task :app:processResources NO-SOURCE
> Task :app:processTestResources NO-SOURCE
> Task :some-logic:compileJava UP-TO-DATE
> Task :some-logic:processResources NO-SOURCE
> Task :some-logic:classes UP-TO-DATE
> Task :some-logic:jar UP-TO-DATE
> Task :app:compileJava
> Task :app:classes
> Task :app:jar UP-TO-DATE
> Task :app:startScripts
> Task :app:distTar
> Task :app:distZip
> Task :app:assemble
> Task :app:compileTestJava UP-TO-DATE
> Task :app:testClasses UP-TO-DATE
> Task :app:test
> Task :app:check
> Task :app:build
BUILD SUCCESSFUL in 430ms
9 actionable tasks: 5 executed, 4 up-to-date
Adding dependencies
There are several ways you can define the dependencies of a task.
Defining dependencies using task names and the dependsOn() method is the simplest approach.
The following example adds a dependency from taskX to taskY:
tasks.register("taskX") {
dependsOn("taskY")
}
tasks.register("taskX") {
dependsOn "taskY"
}
$ gradle -q taskX
taskY
taskX
For more information about task dependencies, see the Task API.
Ordering tasks
In some cases, it is useful to control the order in which two tasks will execute, without introducing an explicit dependency between those tasks.
The primary difference between a task ordering and a task dependency is that an ordering rule does not influence which tasks will be executed, only the order in which they will be executed.
Task ordering can be useful in a number of scenarios:
-
Enforce sequential ordering of tasks (e.g., build never runs before clean).
-
Run build validations early in the build (e.g., validate I have the correct credentials before starting the work for a release build).
-
Get feedback faster by running quick verification tasks before long verification tasks (e.g., unit tests should run before integration tests).
-
A task that aggregates the results of all tasks of a particular type (e.g., test report task combines the outputs of all executed test tasks).
Two ordering rules are available: "must run after" and "should run after".
To specify a "must run after" or "should run after" ordering between 2 tasks, you use the Task.mustRunAfter(java.lang.Object...) and Task.shouldRunAfter(java.lang.Object...) methods. These methods accept a task instance, a task name, or any other input accepted by Task.dependsOn(java.lang.Object...).
When you use "must run after", you specify that taskY must always run after taskX when the build requires the execution of both taskX and taskY.
So if you only run taskY with mustRunAfter, you won't cause taskX to run.
This is expressed as taskY.mustRunAfter(taskX).
val taskX by tasks.registering {
doLast {
println("taskX")
}
}
val taskY by tasks.registering {
doLast {
println("taskY")
}
}
taskY {
mustRunAfter(taskX)
}
def taskX = tasks.register('taskX') {
doLast {
println 'taskX'
}
}
def taskY = tasks.register('taskY') {
doLast {
println 'taskY'
}
}
taskY.configure {
mustRunAfter taskX
}
$ gradle -q taskY taskX
taskX
taskY
The "should run after" ordering rule is similar but less strict, as it will be ignored in two situations:
-
If using that rule introduces an ordering cycle.
-
When using parallel execution and all task dependencies have been satisfied apart from the "should run after" task, then this task will be run regardless of whether or not its "should run after" dependencies have been run.
You should use "should run after" where the ordering is helpful but not strictly required:
val taskX by tasks.registering {
doLast {
println("taskX")
}
}
val taskY by tasks.registering {
doLast {
println("taskY")
}
}
taskY {
shouldRunAfter(taskX)
}
def taskX = tasks.register('taskX') {
doLast {
println 'taskX'
}
}
def taskY = tasks.register('taskY') {
doLast {
println 'taskY'
}
}
taskY.configure {
shouldRunAfter taskX
}
$ gradle -q taskY taskX
taskX
taskY
In the examples above, it is still possible to execute taskY without causing taskX to run:
$ gradle -q taskY
taskY
The “should run after” ordering rule will be ignored if it introduces an ordering cycle:
val taskX by tasks.registering {
doLast {
println("taskX")
}
}
val taskY by tasks.registering {
doLast {
println("taskY")
}
}
val taskZ by tasks.registering {
doLast {
println("taskZ")
}
}
taskX { dependsOn(taskY) }
taskY { dependsOn(taskZ) }
taskZ { shouldRunAfter(taskX) }
def taskX = tasks.register('taskX') {
doLast {
println 'taskX'
}
}
def taskY = tasks.register('taskY') {
doLast {
println 'taskY'
}
}
def taskZ = tasks.register('taskZ') {
doLast {
println 'taskZ'
}
}
taskX.configure { dependsOn(taskY) }
taskY.configure { dependsOn(taskZ) }
taskZ.configure { shouldRunAfter(taskX) }
$ gradle -q taskX
taskZ
taskY
taskX
Note that taskY.mustRunAfter(taskX) or taskY.shouldRunAfter(taskX) does not imply any execution dependency between the tasks:
-
It is possible to execute taskX and taskY independently. The ordering rule only has an effect when both tasks are scheduled for execution.
-
When run with --continue, it is possible for taskY to execute if taskX fails.
Finalizer tasks
Finalizer tasks are automatically added to the task graph when the finalized task is scheduled to run.
To specify a finalizer task, you use the Task.finalizedBy(java.lang.Object…) method. This method accepts a task instance, a task name, or any other input accepted by Task.dependsOn(java.lang.Object…):
val taskX by tasks.registering {
doLast {
println("taskX")
}
}
val taskY by tasks.registering {
doLast {
println("taskY")
}
}
taskX { finalizedBy(taskY) }
def taskX = tasks.register('taskX') {
doLast {
println 'taskX'
}
}
def taskY = tasks.register('taskY') {
doLast {
println 'taskY'
}
}
taskX.configure { finalizedBy taskY }
$ gradle -q taskX
taskX
taskY
Finalizer tasks are executed even if the finalized task fails or if the finalized task is considered UP-TO-DATE:
val taskX by tasks.registering {
doLast {
println("taskX")
throw RuntimeException()
}
}
val taskY by tasks.registering {
doLast {
println("taskY")
}
}
taskX { finalizedBy(taskY) }
def taskX = tasks.register('taskX') {
doLast {
println 'taskX'
throw new RuntimeException()
}
}
def taskY = tasks.register('taskY') {
doLast {
println 'taskY'
}
}
taskX.configure { finalizedBy taskY }
$ gradle -q taskX
taskX
taskY

FAILURE: Build failed with an exception.

* Where:
Build file '/home/user/gradle/samples/build.gradle' line: 4

* What went wrong:
Execution failed for task ':taskX'.
> java.lang.RuntimeException (no error message)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
> Get more help at https://help.gradle.org.

BUILD FAILED in 0s
Finalizer tasks are useful when the build creates a resource that must be cleaned up, regardless of whether the build fails or succeeds. An example of such a resource is a web container that is started before an integration test task and must be shut down, even if some tests fail.
Skipping tasks
Gradle offers multiple ways to skip the execution of a task.
1. Using a predicate
You can use Task.onlyIf to attach a predicate to a task.
The task's actions are only executed if the predicate evaluates to true.
The predicate is passed to the task as a parameter and returns true if the task will execute and false if the task will be skipped.
The predicate is evaluated just before the task is executed.
Passing an optional reason string to onlyIf() is useful for explaining why the task is skipped:
val hello by tasks.registering {
doLast {
println("hello world")
}
}
hello {
val skipProvider = providers.gradleProperty("skipHello")
onlyIf("there is no property skipHello") {
!skipProvider.isPresent()
}
}
def hello = tasks.register('hello') {
doLast {
println 'hello world'
}
}
hello.configure {
def skipProvider = providers.gradleProperty("skipHello")
onlyIf("there is no property skipHello") {
!skipProvider.present
}
}
$ gradle hello -PskipHello
> Task :hello SKIPPED

BUILD SUCCESSFUL in 0s
To find out why a task was skipped, run the build with the --info logging level:
$ gradle hello -PskipHello --info
...
> Task :hello SKIPPED
Skipping task ':hello' as task onlyIf 'there is no property skipHello' is false.
:hello (Thread[included builds,5,main]) completed. Took 0.018 secs.

BUILD SUCCESSFUL in 13s
2. Using StopExecutionException
If the logic for skipping a task can't be expressed with a predicate, you can use the StopExecutionException.
If this exception is thrown by an action, the task action as well as the execution of any following action is skipped. The build continues by executing the next task:
val compile by tasks.registering {
doLast {
println("We are doing the compile.")
}
}
compile {
doFirst {
// Here you would put arbitrary conditions in real life.
if (true) {
throw StopExecutionException()
}
}
}
tasks.register("myTask") {
dependsOn(compile)
doLast {
println("I am not affected")
}
}
def compile = tasks.register('compile') {
doLast {
println 'We are doing the compile.'
}
}
compile.configure {
doFirst {
// Here you would put arbitrary conditions in real life.
if (true) {
throw new StopExecutionException()
}
}
}
tasks.register('myTask') {
dependsOn('compile')
doLast {
println 'I am not affected'
}
}
$ gradle -q myTask
I am not affected
This feature is helpful if you work with tasks provided by Gradle. It allows you to add conditional execution of the built-in actions of such a task.[1]
3. Enabling and Disabling tasks
Every task has an enabled flag, which defaults to true.
Setting it to false prevents executing the task's actions.
A disabled task will be labeled SKIPPED:
val disableMe by tasks.registering {
doLast {
println("This should not be printed if the task is disabled.")
}
}
disableMe {
enabled = false
}
def disableMe = tasks.register('disableMe') {
doLast {
println 'This should not be printed if the task is disabled.'
}
}
disableMe.configure {
enabled = false
}
$ gradle disableMe
> Task :disableMe SKIPPED

BUILD SUCCESSFUL in 0s
4. Task timeouts
Every task has a timeout property, which can be used to limit its execution time.
When a task reaches its timeout, its task execution thread is interrupted.
The task will be marked as FAILED.
Finalizer tasks are executed.
If --continue is used, other tasks continue running.
Tasks that don’t respond to interrupts can’t be timed out. All of Gradle’s built-in tasks respond to timeouts.
tasks.register("hangingTask") {
doLast {
Thread.sleep(100000)
}
timeout = Duration.ofMillis(500)
}
tasks.register("hangingTask") {
doLast {
Thread.sleep(100000)
}
timeout = Duration.ofMillis(500)
}
Task rules
Sometimes you want to have a task whose behavior depends on a large or infinite number of parameters. Task rules are an expressive way to provide such tasks:
tasks.addRule("Pattern: ping<ID>") {
val taskName = this
if (startsWith("ping")) {
task(taskName) {
doLast {
println("Pinging: " + (taskName.replace("ping", "")))
}
}
}
}
tasks.addRule("Pattern: ping<ID>") { String taskName ->
if (taskName.startsWith("ping")) {
task(taskName) {
doLast {
println "Pinging: " + (taskName - 'ping')
}
}
}
}
$ gradle -q pingServer1
Pinging: Server1
The String parameter is used as a description for the rule, which is shown with ./gradlew tasks.
Rules are not only used when calling tasks from the command line.
You can also create dependsOn relations on rule-based tasks:
tasks.addRule("Pattern: ping<ID>") {
val taskName = this
if (startsWith("ping")) {
task(taskName) {
doLast {
println("Pinging: " + (taskName.replace("ping", "")))
}
}
}
}
tasks.register("groupPing") {
dependsOn("pingServer1", "pingServer2")
}
tasks.addRule("Pattern: ping<ID>") { String taskName ->
if (taskName.startsWith("ping")) {
task(taskName) {
doLast {
println "Pinging: " + (taskName - 'ping')
}
}
}
}
tasks.register('groupPing') {
dependsOn 'pingServer1', 'pingServer2'
}
$ gradle -q groupPing
Pinging: Server1
Pinging: Server2
If you run ./gradlew -q tasks, you won't find a task named pingServer1 or pingServer2, but this script is executing logic based on the request to run those tasks.
Exclude tasks from execution
You can exclude a task from execution using the -x or --exclude-task command-line option, providing the name of the task to exclude.
$ ./gradlew build -x test
For instance, you can run the check task but exclude the test task from running.
This approach can lead to unexpected outcomes, particularly if you exclude an actionable task that produces results needed by other tasks.
Instead of relying on the -x parameter, defining a suitable lifecycle task for the desired action is recommended.
Using -x is a practice that should be avoided, although it is still commonly observed.
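As a sketch of that recommendation in Kotlin DSL (the quickBuild task name is a hypothetical example), a dedicated lifecycle task can bundle exactly the work you want instead of excluding tasks with -x:

```kotlin
// Hypothetical lifecycle task: assemble the project without wiring in
// the test tasks, as an alternative to running 'build -x test'.
tasks.register("quickBuild") {
    group = "build"
    description = "Assembles the project without running tests."
    dependsOn(tasks.named("assemble"))
}
```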
Organizing Tasks
There are two types of tasks: actionable tasks and lifecycle tasks.
Actionable tasks in Gradle are tasks that perform actual work, such as compiling code. Lifecycle tasks are tasks that do not do work themselves. These tasks have no actions; instead, they bundle actionable tasks and serve as targets for the build.
A well-organized setup of lifecycle tasks enhances the accessibility of your build for new users and simplifies integration with CI.
Lifecycle tasks
Lifecycle tasks can be particularly beneficial for separating work between users or machines (CI vs local). For example, a developer on a local machine might not want to run an entire build on every single change.
Let's take a standard app as an example which applies the base plugin.
Note: The Gradle base plugin defines several lifecycle tasks, including build, assemble, and check.
We group the build, check, and run tasks by adding the following lines to the app build script:
tasks.build {
group = myBuildGroup
}
tasks.check {
group = myBuildGroup
description = "Runs checks (including tests)."
}
tasks.named("run") {
group = myBuildGroup
}
tasks.build {
group = myBuildGroup
}
tasks.check {
group = myBuildGroup
description = "Runs checks (including tests)."
}
tasks.named('run') {
group = myBuildGroup
}
If we now look at the app:tasks list, we can see the three tasks are available:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
My app build tasks
------------------
build - Assembles and tests this project.
check - Runs checks (including tests).
run - Runs this project as a JVM application
tasksAll - Show additional tasks.
This is already useful if the standard lifecycle tasks are sufficient. Moving the groups around helps clarify which tasks you expect to be used in your build.
In many cases, there are more specific requirements that you want to address.
One common scenario is running quality checks without running tests.
Currently, the :check task runs the tests and the code quality checks.
Instead, we want to run the code quality checks all the time, but not the lengthy tests.
To add a quality check lifecycle task, we introduce an additional lifecycle task called qualityCheck and a plugin called spotbugs.
To add a lifecycle task, use tasks.register().
The only thing you need to provide is a name.
Put this task in our group and wire the actionable tasks that belong to this new lifecycle task using the dependsOn() method:
plugins {
id("com.github.spotbugs") version "6.0.7" // spotbugs plugin
}
tasks.register("qualityCheck") { // qualityCheck task
group = myBuildGroup // group
description = "Runs checks (excluding tests)." // description
dependsOn(tasks.classes, tasks.spotbugsMain) // dependencies
dependsOn(tasks.testClasses, tasks.spotbugsTest) // dependencies
}
plugins {
id 'com.github.spotbugs' version '6.0.7' // spotbugs plugin
}
tasks.register('qualityCheck') { // qualityCheck task
group = myBuildGroup // group
description = 'Runs checks (excluding tests).' // description
dependsOn tasks.classes, tasks.spotbugsMain // dependencies
dependsOn tasks.testClasses, tasks.spotbugsTest // dependencies
}
Note that you don’t need to list all the tasks that Gradle will execute. Just specify the targets you want to collect here. Gradle will determine which other tasks it needs to call to reach these goals.
In the example, we add the classes task, a lifecycle task that compiles all our production code, and the spotbugsMain task, which checks our production code.
We also add a description that will show up in the task list and helps distinguish the two check tasks.
Now, if we run ./gradlew :app:tasks, we can see that our new qualityCheck lifecycle task is available:
$ ./gradlew :app:tasks
> Task :app:tasks
------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------
My app build tasks
------------------
build - Assembles and tests this project.
check - Runs checks (including tests).
qualityCheck - Runs checks (excluding tests).
run - Runs this project as a JVM application
tasksAll - Show additional tasks.
If we run it, we can see that it runs SpotBugs but not the tests:
$ ./gradlew :app:qualityCheck
> Task :buildSrc:checkKotlinGradlePluginConfigurationErrors
> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE
> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:processTestResources NO-SOURCE
> Task :list:compileJava UP-TO-DATE
> Task :utilities:compileJava UP-TO-DATE
> Task :app:compileJava
> Task :app:classes
> Task :app:compileTestJava
> Task :app:testClasses
> Task :app:spotbugsTest
> Task :app:spotbugsMain
> Task :app:qualityCheck
BUILD SUCCESSFUL in 1s
16 actionable tasks: 5 executed, 11 up-to-date
So far, we have looked at tasks in individual subprojects, which is useful for local development when you work on code in one subproject.
With this setup, developers only need to know that they can call Gradle with :subproject-name:tasks to see which tasks are available and useful for them.
Global lifecycle tasks
Another place to invoke lifecycle tasks is within the root build; this is especially useful for Continuous Integration (CI).
Gradle tasks play a crucial role in CI or CD systems, where activities like compiling all code, running tests, or building and packaging the complete application are typical. To facilitate this, you can include lifecycle tasks that span multiple subprojects.
Note: Gradle has been around for a long time, and you will frequently observe build files in the root directory serving various purposes. In older Gradle versions, many tasks were defined within the root Gradle build file, resulting in various issues. Therefore, exercise caution when determining the content of this file.
One of the few elements that should be placed in the root build file is global lifecycle tasks.
Let's continue using the Gradle init Java application multi-project as an example.
This time, we’re incorporating a build script in the root project. We’ll establish two groups for our global lifecycle tasks: one for tasks relevant to local development, such as running all checks, and another exclusively for our CI system.
Once again, we narrow down the tasks listed to our specific groups:
val globalBuildGroup = "My global build"
val ciBuildGroup = "My CI build"
tasks.named<TaskReportTask>("tasks") {
displayGroups = listOf<String>(globalBuildGroup, ciBuildGroup)
}
def globalBuildGroup = "My global build"
def ciBuildGroup = "My CI build"
tasks.named(TaskReportTask, "tasks") {
displayGroups = [globalBuildGroup, ciBuildGroup]
}
You could hide the CI tasks if you wanted to by updating displayGroups
.
Currently, the root project exposes no tasks: