Table of Contents

OVERVIEW

Gradle User Manual

Gradle Build Tool

Gradle Build Tool is a fast, dependable, and adaptable open-source build automation tool with an elegant and extensible declarative build language.

In this User Manual, Gradle Build Tool is abbreviated Gradle.

Why Gradle?

Gradle is a widely used and mature tool with an active community and a strong developer ecosystem.

  • Gradle is the most popular build system for the JVM and is the default system for Android and Kotlin Multiplatform projects. It has a rich community plugin ecosystem.

  • Gradle can automate a wide range of software build scenarios using either its built-in functionality, third-party plugins, or custom build logic.

  • Gradle provides a high-level, declarative, and expressive build language that makes it easy to read and write build logic.

  • Gradle is fast, scalable, and can build projects of any size and complexity.

  • Gradle produces dependable results while benefiting from optimizations such as incremental builds, build caching, and parallel execution.

Gradle, Inc. provides a free service called Build Scan® that provides extensive information and insights about your builds. You can view scans to identify problems or share them for debugging help.

Supported Languages and Frameworks

Gradle supports Android, Java, Kotlin Multiplatform, Groovy, Scala, JavaScript, and C/C++.

Compatible IDEs

All major IDEs support Gradle, including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse, and NetBeans.


You can also invoke Gradle via its command-line interface (CLI) in your terminal or through your continuous integration (CI) server.

Education

The Gradle User Manual is the official documentation for the Gradle Build Tool.

  • Getting Started Tutorial — Learn Gradle basics and the benefits of building your app with Gradle.

  • Training Courses — Head over to the courses page to sign up for free Gradle training.

Support

  • Forum — The fastest way to get help is through the Gradle Forum.

  • Slack — Community members and core contributors answer questions directly on our Slack Channel.

Licenses

Gradle Build Tool source code is open and licensed under the Apache License 2.0. The Gradle User Manual and DSL Reference Manual are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

The User Manual

Explore our guides and examples to use Gradle.

Releases

Information on Gradle releases and how to install Gradle is found on the Installation page.

Content

The Gradle User Manual is broken down into the following sections:

Running Gradle Builds

Learn Gradle basics and how to use Gradle to build your project.

Authoring Gradle Builds

Develop tasks and plugins to customize your build.

Authoring JVM Builds

Use Gradle with your Java project.

Working with Dependencies

Add dependencies to your build.

Optimizing Builds

Use caches to optimize your build and understand the Gradle daemon, incremental builds and file system watching.

Reference

  1. Gradle’s API Javadocs

  2. Gradle’s Groovy DSL

  3. Gradle’s Kotlin DSL

  4. Gradle’s Core Plugins


RELEASES

Installing Gradle

Gradle Installation

If all you want to do is run an existing Gradle project, then you don’t need to install Gradle if the build uses the Gradle Wrapper. This is identifiable by the presence of the gradlew or gradlew.bat files in the root of the project:

.   // (1)
├── gradle
│   └── wrapper // (2)
├── gradlew         // (3)
├── gradlew.bat     // (3)
└── ⋮
  1. Project root directory.

  2. Gradle Wrapper.

  3. Scripts for executing Gradle builds.

If the gradlew or gradlew.bat files are already present in your project, you do not need to install Gradle. But you need to make sure your system satisfies Gradle’s prerequisites.

You can follow the steps in the Upgrading Gradle section if you want to update the Gradle version for your project. Please use the Gradle Wrapper to upgrade Gradle.

Android Studio comes with a working installation of Gradle, so you don’t need to install Gradle separately when only working within that IDE.

If you do not meet the criteria above and decide to install Gradle on your machine, first check if Gradle is already installed by running gradle -v in your terminal. If the command does not print a Gradle version (for example, it reports that the command was not found), Gradle is not installed, and you can follow the instructions below.

You can install Gradle Build Tool on Linux, macOS, or Windows. The installation can be done manually or using a package manager like SDKMAN! or Homebrew.

You can find all Gradle releases and their checksums on the releases page.

Prerequisites

Gradle runs on all major operating systems. It requires Java Development Kit (JDK) version 8 or higher to run. You can check the compatibility matrix for more information.

To check, run java -version:

❯ java -version
openjdk version "11.0.18" 2023-01-17
OpenJDK Runtime Environment Homebrew (build 11.0.18+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.18+0, mixed mode)

Gradle uses the JDK it finds in your path, the JDK used by your IDE, or the JDK specified by your project. In this example, the $PATH points to JDK17:

❯ echo $PATH
/opt/homebrew/opt/openjdk@17/bin

You can also set the JAVA_HOME environment variable to point to a specific JDK installation directory. This is especially useful when multiple JDKs are installed:

❯ echo %JAVA_HOME%
C:\Program Files\Java\jdk1.7.0_80
❯ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk-16.jdk/Contents/Home

Gradle supports Kotlin and Groovy as the main build languages. Gradle ships with its own Kotlin and Groovy libraries, therefore they do not need to be installed. Existing installations are ignored by Gradle.

Linux installation

Installing with a package manager

SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and maintained by SDKMAN!:

❯ sdk install gradle

Other package managers are available, but the version of Gradle distributed by them is not controlled by Gradle, Inc. Linux package managers may distribute a modified version of Gradle that is incompatible or incomplete when compared to the official version.

Installing manually

Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

  • Binary-only (bin)

  • Complete (all) with docs and sources

We recommend downloading the bin file; it is a smaller file that is quick to download (and the latest documentation is available online).

Step 2 - Unpack the distribution

Unzip the distribution zip file in the directory of your choosing, e.g.:

❯ mkdir /opt/gradle
❯ unzip -d /opt/gradle gradle-8.10.2-bin.zip
❯ ls /opt/gradle/gradle-8.10.2
LICENSE  NOTICE  bin  README  init.d  lib  media

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be on your PATH. Configure your PATH environment variable to include the bin directory of the unzipped distribution, e.g.:

❯ export PATH=$PATH:/opt/gradle/gradle-8.10.2/bin

Alternatively, you could also add the environment variable GRADLE_HOME and point this to the unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add $GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change the GRADLE_HOME environment variable.

export GRADLE_HOME=/opt/gradle/gradle-8.10.2
export PATH=${GRADLE_HOME}/bin:${PATH}

macOS installation

Installing with a package manager

SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most Unix-like systems (macOS, Linux, Cygwin, Solaris and FreeBSD). Gradle is deployed and maintained by SDKMAN!:

❯ sdk install gradle

Using Homebrew:

❯ brew install gradle

Using MacPorts:

❯ sudo port install gradle

Other package managers are available, but the version of Gradle distributed by them is not controlled by Gradle, Inc.

Installing manually

Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

  • Binary-only (bin)

  • Complete (all) with docs and sources

We recommend downloading the bin file; it is a smaller file that is quick to download (and the latest documentation is available online).

Step 2 - Unpack the distribution

Unzip the distribution zip file in the directory of your choosing, e.g.:

❯ mkdir /usr/local/gradle
❯ unzip gradle-8.10.2-bin.zip -d /usr/local/gradle
❯ ls /usr/local/gradle/gradle-8.10.2
LICENSE	NOTICE	README	bin	init.d	lib

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be on your PATH. Configure your PATH environment variable to include the bin directory of the unzipped distribution, e.g.:

❯ export PATH=$PATH:/usr/local/gradle/gradle-8.10.2/bin

Alternatively, you could also add the environment variable GRADLE_HOME and point this to the unzipped distribution. Instead of adding a specific version of Gradle to your PATH, you can add $GRADLE_HOME/bin to your PATH. When upgrading to a different version of Gradle, simply change the GRADLE_HOME environment variable.

It’s a good idea to edit .bash_profile in your home directory to add GRADLE_HOME variable:

export GRADLE_HOME=/usr/local/gradle/gradle-8.10.2
export PATH=$GRADLE_HOME/bin:$PATH

Windows installation

Installing manually

Step 1 - Download the latest Gradle distribution

The distribution ZIP file comes in two flavors:

  • Binary-only (bin)

  • Complete (all) with docs and sources

We recommend downloading the bin file.

Step 2 - Unpack the distribution

Create a new directory C:\Gradle with File Explorer.

Open a second File Explorer window and go to the directory where the Gradle distribution was downloaded. Double-click the ZIP archive to expose the content. Drag the content folder gradle-8.10.2 to your newly created C:\Gradle folder.

Alternatively, you can unpack the Gradle distribution ZIP into C:\Gradle using the archiver tool of your choice.

Step 3 - Configure your system environment

To install Gradle, the path to the unpacked files needs to be in your Path.

In File Explorer right-click on the This PC (or Computer) icon, then click Properties → Advanced System Settings → Environment Variables.

Under System Variables select Path, then click Edit. Add an entry for C:\Gradle\gradle-8.10.2\bin. Click OK to save.

Alternatively, you can add the environment variable GRADLE_HOME and point this to the unzipped distribution. Instead of adding a specific version of Gradle to your Path, you can add %GRADLE_HOME%\bin to your Path. When upgrading to a different version of Gradle, just change the GRADLE_HOME environment variable.

Verify the installation

Open a console (or a Windows command prompt) and run gradle -v to display the Gradle version, e.g.:

❯ gradle -v

------------------------------------------------------------
Gradle 8.10.2
------------------------------------------------------------

Build time:    2024-06-17 18:10:00 UTC
Revision:      6028379bb5a8512d0b2c1be6403543b79825ef08

Kotlin:        1.9.23
Groovy:        3.0.21
Ant:           Apache Ant(TM) version 1.10.13 compiled on January 4 2023
Launcher JVM:  11.0.23 (Eclipse Adoptium 11.0.23+9)
Daemon JVM:    /Library/Java/JavaVirtualMachines/temurin-11.jdk/Contents/Home (no JDK specified, using current Java home)
OS:            Mac OS X 14.5 aarch64

You can verify the integrity of the Gradle distribution by downloading the SHA-256 file (available from the releases page) and following these verification instructions.

Compatibility Matrix

The sections below describe Gradle’s compatibility with several integrations. Versions not listed here may or may not work.

Java Runtime

Gradle runs on the Java Virtual Machine (JVM), which is often provided by either a JDK or JRE. A JVM version between 8 and 23 is required to execute Gradle. JVM 24 and later versions are not yet supported.

Executing the Gradle daemon with JVM 16 or earlier has been deprecated and will become an error in Gradle 9.0. The Gradle wrapper, Gradle client, Tooling API client, and TestKit client will remain compatible with JVM 8.

JDK 6 and 7 can be used for compilation. Testing with JVM 6 and 7 is deprecated and will not be supported in Gradle 9.0.

Any fully supported version of Java can be used for compilation or testing. However, the latest Java version may only be supported for compilation or testing, not for running Gradle. Support is achieved using toolchains and applies to all tasks supporting toolchains.
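
For example, a build can request a specific Java version for compiling and testing through the java extension's toolchain block. The following Kotlin DSL snippet is a minimal sketch; the version number is illustrative:

build.gradle.kts
java {
    toolchain {
        // Compile and test with a Java 17 toolchain, regardless of the JVM running Gradle
        languageVersion = JavaLanguageVersion.of(17)
    }
}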

See the table below for the Java version supported by a specific Gradle release:

Table 1. Java Compatibility

Java version    Support for toolchains    Support for running Gradle
8               N/A                       2.0
9               N/A                       4.3
10              N/A                       4.7
11              N/A                       5.0
12              N/A                       5.4
13              N/A                       6.0
14              N/A                       6.3
15              6.7                       6.7
16              7.0                       7.0
17              7.3                       7.3
18              7.5                       7.5
19              7.6                       7.6
20              8.1                       8.3
21              8.4                       8.5
22              8.7                       8.8
23              8.10                      8.10
24              N/A                       N/A

Kotlin

Gradle is tested with Kotlin 1.6.10 through 2.0.20-Beta2. Beta and RC versions may or may not work.

Table 2. Embedded Kotlin version

Embedded Kotlin version    Minimum Gradle version    Kotlin Language version
1.3.10                     5.0                       1.3
1.3.11                     5.1                       1.3
1.3.20                     5.2                       1.3
1.3.21                     5.3                       1.3
1.3.31                     5.5                       1.3
1.3.41                     5.6                       1.3
1.3.50                     6.0                       1.3
1.3.61                     6.1                       1.3
1.3.70                     6.3                       1.3
1.3.71                     6.4                       1.3
1.3.72                     6.5                       1.3
1.4.20                     6.8                       1.3
1.4.31                     7.0                       1.4
1.5.21                     7.2                       1.4
1.5.31                     7.3                       1.4
1.6.21                     7.5                       1.4
1.7.10                     7.6                       1.4
1.8.10                     8.0                       1.8
1.8.20                     8.2                       1.8
1.9.0                      8.3                       1.8
1.9.10                     8.4                       1.8
1.9.20                     8.5                       1.8
1.9.22                     8.7                       1.8
1.9.23                     8.9                       1.8
1.9.24                     8.10                      1.8

Groovy

Gradle is tested with Groovy 1.5.8 through 4.0.0.

Gradle plugins written in Groovy must use Groovy 3.x for compatibility with Gradle and Groovy DSL build scripts.

Android

Gradle is tested with Android Gradle Plugin 7.3 through 8.4. Alpha and beta versions may or may not work.

The Feature Lifecycle

Gradle is under constant development. New versions are delivered on a regular and frequent basis (approximately every six weeks) as described in the section on end-of-life support.

Continuous improvement combined with frequent delivery allows new features to be available to users early. Early users provide invaluable feedback, which is incorporated into the development process.

Getting new functionality into the hands of users regularly is a core value of the Gradle platform.

At the same time, API and feature stability are taken very seriously and considered a core value of the Gradle platform. Design choices and automated testing are engineered into the development process and formalized by the section on backward compatibility.

The Gradle feature lifecycle has been designed to meet these goals. It also communicates to users of Gradle what the state of a feature is. The term feature typically means an API or DSL method or property in this context, but it is not restricted to this definition. Command line arguments and modes of execution (e.g. the Build Daemon) are two examples of other features.

Feature States

Features can be in one of four states:

1. Internal

Internal features are not designed for public use and are only intended to be used by Gradle itself. They can change in any way at any point in time without any notice. Therefore, we recommend avoiding the use of such features. Internal features are not documented: if a feature appears in this User Manual, the DSL Reference, or the API Reference, then it is not internal.

Internal features may evolve into public features.

2. Incubating

Features are introduced in the incubating state to allow real-world feedback to be incorporated into the feature before it is made public. It also gives early access to users who are willing to test potential future changes.

A feature in an incubating state may change in future Gradle versions until it is no longer incubating. Changes to incubating features for a Gradle release will be highlighted in the release notes for that release. The incubation period for new features varies depending on the feature’s scope, complexity, and nature.

Features in incubation are clearly indicated. In the source code, all methods/properties/classes that are incubating are annotated with @Incubating. This results in a special mark for them in the DSL and API References.

If an incubating feature is discussed in this User Manual, it will be explicitly said to be in the incubating state.

Feature Preview API

The feature preview API allows certain incubating features to be activated by adding enableFeaturePreview('FEATURE') in your settings file. Individual preview features will be announced in release notes.

When incubating features are either promoted to public or removed, the feature preview flags for them become obsolete, have no effect, and should be removed from the settings file.
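
For example, a preview feature announced in the release notes can be enabled in the settings file as follows. This is a minimal Kotlin DSL sketch; the flag name is illustrative and must match a feature listed in the release notes of your Gradle version:

settings.gradle.kts
// Opt in to an incubating feature (illustrative flag name)
enableFeaturePreview("EXAMPLE_FEATURE")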

3. Public

The default state for a non-internal feature is public. Anything documented in the User Manual, DSL Reference, or API reference that is not explicitly said to be incubating or deprecated is considered public. Features are said to be promoted from an incubating state to public. The release notes for each release indicate which previously incubating features are being promoted by the release.

A public feature will never be removed or intentionally changed without undergoing deprecation. All public features are subject to the backward compatibility policy.

4. Deprecated

Some features may be replaced or become irrelevant due to the natural evolution of Gradle. Such features will eventually be removed from Gradle after being deprecated. A deprecated feature may become stale until it is finally removed according to the backward compatibility policy.

Deprecated features are clearly indicated. In the source code, all methods/properties/classes that are deprecated are annotated with @java.lang.Deprecated, which is reflected in the DSL and API References. In most cases, there is a replacement for the deprecated element, which will be described in the documentation. Using a deprecated feature will result in a runtime warning in Gradle’s output.

The use of deprecated features should be avoided. The release notes for each release indicate any features being deprecated by the release.

Backward compatibility policy

Gradle provides backward compatibility across major versions (e.g., 1.x, 2.x, etc.). Once a public feature is introduced in a Gradle release, it will remain indefinitely unless deprecated. Once deprecated, it may be removed in the next major release. Deprecated features may be supported across major releases, but this is not guaranteed.

Release end-of-life Policy

Every day, a new nightly build of Gradle is created.

This contains all of the changes made through Gradle’s extensive continuous integration tests during that day. Nightly builds may contain new changes that may or may not be stable.

The Gradle team creates a pre-release distribution called a release candidate (RC) for each minor or major release. When no problems are found after a short time (usually a week), the release candidate is promoted to a general availability (GA) release. If a regression is found in the release candidate, a new RC distribution is created, and the process repeats. Release candidates are supported for as long as the release window is open, but they are not intended to be used for production. Bug reports are greatly appreciated during the RC phase.

The Gradle team may create additional patch releases to replace the final release due to critical bug fixes or regressions. For instance, Gradle 5.2.1 replaces the Gradle 5.2 release.

Once a release candidate has been made, all feature development moves on to the next release for the latest major version. As such, each minor Gradle release causes the previous minor releases in the same major version to become end-of-life (EOL). EOL releases do not receive bug fixes or feature backports.

For major versions, Gradle will backport critical fixes and security fixes to the last minor in the previous major version. For example, when Gradle 7 was the latest major version, several releases were made in the 6.x line, including Gradle 6.9 (and subsequent releases).

As such, each major Gradle release causes:

  • The previous major version to become maintenance only. It will only receive critical bug fixes and security fixes.

  • The major version before that to become end-of-life (EOL); that release line will not receive any new fixes.

CORE CONCEPTS

Gradle Basics

Gradle automates building, testing, and deployment of software from information in build scripts.


Gradle core concepts

Projects

A Gradle project is a piece of software that can be built, such as an application or a library.

Single project builds include a single project called the root project.

Multi-project builds include one root project and any number of subprojects.

Build Scripts

Build scripts detail to Gradle what steps to take to build the project.

Each project can include one or more build scripts.

Dependency Management

Dependency management is an automated technique for declaring and resolving external resources required by a project.

Each project typically includes a number of external dependencies that Gradle will resolve during the build.

Tasks

Tasks are a basic unit of work, such as compiling code or running your tests.

Each project contains one or more tasks defined inside a build script or a plugin.
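
For example, a simple ad hoc task can be registered directly in a build script. The following Kotlin DSL snippet is a minimal sketch; the task name and message are illustrative:

build.gradle.kts
// Register a task named "hello" that prints a message when executed
tasks.register("hello") {
    doLast {
        println("Hello from Gradle!")
    }
}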

Plugins

Plugins are used to extend Gradle’s capability and optionally contribute tasks to a project.

Gradle project structure

Many developers will interact with Gradle for the first time through an existing project.

The presence of the gradlew and gradlew.bat files in the root directory of a project is a clear indicator that Gradle is used.

A Gradle project will look similar to the following:

project
├── gradle                              // (1)
│   ├── libs.versions.toml              // (2)
│   └── wrapper
│       ├── gradle-wrapper.jar
│       └── gradle-wrapper.properties
├── gradlew                             // (3)
├── gradlew.bat                         // (3)
├── settings.gradle(.kts)               // (4)
├── subproject-a
│   ├── build.gradle(.kts)              // (5)
│   └── src                             // (6)
└── subproject-b
    ├── build.gradle(.kts)              // (5)
    └── src                             // (6)
  1. Gradle directory to store wrapper files and more

  2. Gradle version catalog for dependency management

  3. Gradle wrapper scripts

  4. Gradle settings file to define a root project name and subprojects

  5. Gradle build scripts of the two subprojects - subproject-a and subproject-b

  6. Source code and/or additional files for the projects

Invoking Gradle

IDE

Gradle is built into many IDEs, including Android Studio, IntelliJ IDEA, Visual Studio Code, Eclipse, and NetBeans.

Gradle can be automatically invoked when you build, clean, or run your app in the IDE.

It is recommended that you consult the manual for the IDE of your choice to learn more about how Gradle can be used and configured.

Command line

Gradle can be invoked in the command line once installed. For example:

$ gradle build
Note
Most projects do not use the installed version of Gradle.

Gradle Wrapper

The Wrapper is a script that invokes a declared version of Gradle and is the recommended way to execute a Gradle build. It is found in the project root directory as a gradlew or gradlew.bat file:

$ gradlew build     // Linux or OSX
$ gradlew.bat build  // Windows

Gradle Wrapper Basics

The recommended way to execute any Gradle build is with the Gradle Wrapper.


The Wrapper script invokes a declared version of Gradle, downloading it beforehand if necessary.


The Wrapper is available as a gradlew or gradlew.bat file.

The Wrapper provides the following benefits:

  • Standardizes a project on a given Gradle version.

  • Provisions the same Gradle version for different users.

  • Provisions the Gradle version for different execution environments (IDEs, CI servers…​).

Using the Gradle Wrapper

It is always recommended to execute a build with the Wrapper to ensure a reliable, controlled, and standardized execution of the build.

Depending on the operating system, you run gradlew or gradlew.bat instead of the gradle command.

Typical Gradle invocation:

$ gradle build

To run the Wrapper on a Linux or OSX machine:

$ ./gradlew build

To run the Wrapper on Windows PowerShell:

$ .\gradlew.bat build

The command is run in the same directory that the Wrapper is located in. If you want to run the command in a different directory, you must provide the relative path to the Wrapper:

$ ../gradlew build

The following console output demonstrates the use of the Wrapper on a Windows machine, in the command prompt (cmd), for a Java-based project:

$ gradlew.bat build

Downloading https://services.gradle.org/distributions/gradle-5.0-all.zip
.....................................................................................
Unzipping C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0-all.zip to C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv
Set executable permissions for: C:\Documents and Settings\Claudia\.gradle\wrapper\dists\gradle-5.0-all\ac27o8rbd0ic8ih41or9l32mv\gradle-5.0\bin\gradle

BUILD SUCCESSFUL in 12s
1 actionable task: 1 executed

Understanding the Wrapper files

The following files are part of the Gradle Wrapper:

.
├── gradle
│   └── wrapper
│       ├── gradle-wrapper.jar  // (1)
│       └── gradle-wrapper.properties   // (2)
├── gradlew // (3)
└── gradlew.bat // (4)
  1. gradle-wrapper.jar: This is a small JAR file that contains the Gradle Wrapper code. It is responsible for downloading and installing the correct version of Gradle for a project if it’s not already installed.

  2. gradle-wrapper.properties: This file contains configuration properties for the Gradle Wrapper, such as the distribution URL (where to download Gradle from) and the distribution type (ZIP or TARBALL).

  3. gradlew: This is a shell script (Unix-based systems) that acts as a wrapper around gradle-wrapper.jar. It is used to execute Gradle tasks on Unix-based systems without needing to manually install Gradle.

  4. gradlew.bat: This is a batch script (Windows) that serves the same purpose as gradlew but is used on Windows systems.

Important
You should never alter these files.

If you want to view or update the Gradle version of your project, use the command line. Do not edit the wrapper files manually:

$ ./gradlew --version
$ ./gradlew wrapper --gradle-version 7.2
$ gradlew.bat --version
$ gradlew.bat wrapper --gradle-version 7.2
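
In addition to the command line, the Wrapper task can be configured in the root build script so that running the wrapper task regenerates the scripts for the declared version. This is a minimal Kotlin DSL sketch; the version and distribution type are illustrative:

build.gradle.kts
tasks.named<Wrapper>("wrapper") {
    gradleVersion = "8.10.2"                        // version written to gradle-wrapper.properties
    distributionType = Wrapper.DistributionType.BIN // binary-only distribution
}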

Consult the Gradle Wrapper reference to learn more.

Command-Line Interface Basics

The command-line interface is the primary method of interacting with Gradle outside the IDE.


Use of the Gradle Wrapper is highly encouraged.

Substitute ./gradlew (in macOS / Linux) or gradlew.bat (in Windows) for gradle in the following examples.

Executing Gradle on the command line conforms to the following structure:

gradle [taskName...] [--option-name...]

Options are allowed before and after task names.

gradle [--option-name...] [taskName...]

If multiple tasks are specified, you should separate them with a space.

gradle [taskName1 taskName2...] [--option-name...]

Options that accept values can be specified with or without = between the option and argument. The use of = is recommended.

gradle [...] --console=plain

Options that enable behavior have long-form options with inverses specified with --no-. The following are opposites.

gradle [...] --build-cache
gradle [...] --no-build-cache

Many long-form options have short-option equivalents. The following are equivalent:

gradle --help
gradle -h

Command-line usage

The following sections describe the use of the Gradle command-line interface. Some plugins also add their own command line options.

Executing tasks

To execute a task called taskName on the root project, type:

$ gradle :taskName

This will run the single taskName and all of its dependencies.

Specify options for tasks

To pass an option to a task, prefix the option name with -- after the task name:

$ gradle taskName --exampleOption=exampleValue

Consult the Gradle Command Line Interface reference to learn more.

Settings File Basics

The settings file is the entry point of every Gradle project.


The primary purpose of the settings file is to add subprojects to your build.

Gradle supports single and multi-project builds.

  • For single-project builds, the settings file is optional.

  • For multi-project builds, the settings file is mandatory and declares all subprojects.

Settings script

The settings file is a script. It is either a settings.gradle file written in Groovy or a settings.gradle.kts file in Kotlin.

The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.

The settings file is typically found in the root directory of the project.

Let’s take a look at an example and break it down:

settings.gradle.kts
rootProject.name = "root-project"   // (1)

include("sub-project-a")            // (2)
include("sub-project-b")
include("sub-project-c")
  1. Define the project name.

  2. Add subprojects.

settings.gradle
rootProject.name = 'root-project'   // (1)

include('sub-project-a')            // (2)
include('sub-project-b')
include('sub-project-c')
  1. Define the project name.

  2. Add subprojects.

1. Define the project name

The settings file defines your project name:

rootProject.name = "root-project"

There is only one root project per build.

2. Add subprojects

The settings file defines the structure of the project by including subprojects, if there are any:

include("app")
include("business-logic")
include("data-model")

Consult the Writing Settings File page to learn more.

Build File Basics

Generally, a build script details build configuration, tasks, and plugins.


Every Gradle build comprises at least one build script.

In the build file, two types of dependencies can be added:

  1. The libraries and/or plugins on which Gradle and the build script depend.

  2. The libraries on which the project sources (i.e., source code) depend.

Build scripts

The build script is either a build.gradle file written in Groovy or a build.gradle.kts file in Kotlin.

The Groovy DSL and the Kotlin DSL are the only accepted languages for Gradle scripts.

Let’s take a look at an example and break it down:

build.gradle.kts
plugins {
    id("application")               // (1)
}

application {
    mainClass = "com.example.Main"  // (2)
}
  1. Add plugins.

  2. Use convention properties.

build.gradle
plugins {
    id 'application'                // (1)
}

application {
    mainClass = 'com.example.Main'  // (2)
}
  1. Add plugins.

  2. Use convention properties.

1. Add plugins

Plugins extend Gradle’s functionality and can contribute tasks to a project.

Adding a plugin to a build is called applying a plugin and makes additional functionality available.

plugins {
    id("application")
}

The application plugin facilitates creating an executable JVM application.

Applying the Application plugin also implicitly applies the Java plugin. The java plugin adds Java compilation along with testing and bundling capabilities to a project.

2. Use convention properties

A plugin adds tasks to a project. It also adds properties and methods to a project.

The application plugin defines tasks that package and distribute an application, such as the run task.

The Application plugin provides a way to declare the main class of a Java application, which is required to execute the code.

application {
    mainClass = "com.example.Main"
}

In this example, the main class (i.e., the point where the program’s execution begins) is com.example.Main.

Consult the Writing Build Scripts page to learn more.

Dependency Management Basics

Gradle has built-in support for dependency management.


Dependency management is an automated technique for declaring and resolving external resources required by a project.

Gradle build scripts define the process to build projects that may require external dependencies. Dependencies refer to JARs, plugins, libraries, or source code that support building your project.

Version Catalog

Version catalogs provide a way to centralize your dependency declarations in a libs.versions.toml file.

The catalog makes sharing dependencies and version configurations between subprojects simple. It also allows teams to enforce versions of libraries and plugins in large projects.

The version catalog typically contains four sections:

  1. [versions] to declare the version numbers that plugins and libraries will reference.

  2. [libraries] to define the libraries used in the build files.

  3. [bundles] to define a set of dependencies.

  4. [plugins] to define plugins.

[versions]
androidGradlePlugin = "7.4.1"
mockito = "2.16.0"

[libraries]
googleMaterial = { group = "com.google.android.material", name = "material", version = "1.1.0-alpha05" }
mockitoCore = { module = "org.mockito:mockito-core", version.ref = "mockito" }

[plugins]
androidApplication = { id = "com.android.application", version.ref = "androidGradlePlugin" }

The file is located in the gradle directory so that it can be used by Gradle and IDEs automatically. The version catalog should be checked into source control: gradle/libs.versions.toml.

Declaring Your Dependencies

To add a dependency to your project, specify a dependency in the dependencies block of your build.gradle(.kts) file.

The following build.gradle.kts file adds a plugin and two dependencies to the project using the version catalog above:

plugins {
   alias(libs.plugins.androidApplication)  // (1)
}

dependencies {
    // Dependency on a remote binary to compile and run the code
    implementation(libs.googleMaterial)    // (2)

    // Dependency on a remote binary to compile and run the test code
    testImplementation(libs.mockitoCore)   // (3)
}
  1. Applies the Android Gradle plugin to this project, which adds several features that are specific to building Android apps.

  2. Adds the Material dependency to the project. Material Design provides components for creating a user interface in an Android App. This library will be used to compile and run the Kotlin source code in this project.

  3. Adds the Mockito dependency to the project. Mockito is a mocking framework for testing Java code. This library will be used to compile and run the test source code in this project.

Dependencies in Gradle are grouped by configurations.

  • The material library is added to the implementation configuration, which is used for compiling and running production code.

  • The mockito-core library is added to the testImplementation configuration, which is used for compiling and running test code.

Note
There are many more configurations available.
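
As a hedged sketch, a few other commonly used configurations look like this in a build script. The java-library plugin is assumed, and the dependency coordinates are illustrative:

build.gradle.kts
dependencies {
    api("com.example:exposed-lib:1.0")          // part of this library's public API, visible to consumers
    compileOnly("com.example:annotations:1.0")  // needed at compile time only, not packaged or run
    runtimeOnly("com.example:driver:1.0")       // needed at runtime only, not at compile time
}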

Viewing Project Dependencies

You can view your dependency tree in the terminal using the ./gradlew :app:dependencies command:

$ ./gradlew :app:dependencies

> Task :app:dependencies

------------------------------------------------------------
Project ':app'
------------------------------------------------------------

implementation - Implementation only dependencies for source set 'main'. (n)
\--- com.google.android.material:material:1.1.0-alpha05 (n)

testImplementation - Implementation only dependencies for source set 'test'. (n)
\--- org.mockito:mockito-core:2.16.0 (n)

...

Consult the Dependency Management chapter to learn more.

Next Step: Learn about Tasks >>

Task Basics

A task represents some independent unit of work that a build performs, such as compiling classes, creating a JAR, generating Javadoc, or publishing archives to a repository.


You run a Gradle build task using the gradle command or by invoking the Gradle Wrapper (./gradlew or gradlew.bat) in your project directory:

$ ./gradlew build

Available tasks

All available tasks in your project come from Gradle plugins and build scripts.

You can list all the available tasks in the project by running the following command in the terminal:

$ ./gradlew tasks
Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.

...

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.

...

Other tasks
-----------
compileJava - Compiles main Java source.

...

Running tasks

The run task is executed with ./gradlew run:

$ ./gradlew run

> Task :app:compileJava
> Task :app:processResources NO-SOURCE
> Task :app:classes

> Task :app:run
Hello World!

BUILD SUCCESSFUL in 904ms
2 actionable tasks: 2 executed

In this example Java project, the output of the run task is a Hello World statement printed on the console.

Task dependency

Many times, a task requires another task to run first.

For example, for Gradle to execute the build task, the Java code must first be compiled. Thus, the build task depends on the compileJava task.

This means that the compileJava task will run before the build task:

$ ./gradlew build

> Task :app:compileJava
> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:jar
> Task :app:startScripts
> Task :app:distTar
> Task :app:distZip
> Task :app:assemble
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check
> Task :app:build

BUILD SUCCESSFUL in 764ms
7 actionable tasks: 7 executed

Build scripts can optionally define task dependencies. Gradle then automatically determines the task execution order.
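
For example, a custom task can declare that another task must run first using dependsOn. The following Kotlin DSL snippet is a minimal sketch with illustrative task names:

build.gradle.kts
tasks.register("prepareAssets") {
    doLast { println("Preparing assets...") }
}

tasks.register("packageApp") {
    dependsOn("prepareAssets")  // prepareAssets always runs before packageApp
    doLast { println("Packaging application...") }
}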

Consult the Task development chapter to learn more.

Next Step: Learn about Plugins >>

Plugin Basics

Gradle is built on a plugin system. Gradle itself is primarily composed of infrastructure, such as a sophisticated dependency resolution engine. The rest of its functionality comes from plugins.

A plugin is a piece of software that provides additional functionality to the Gradle build system.


Plugins can be applied to a Gradle build script to add new tasks, configurations, or other build-related capabilities:

The Java Library Plugin - java-library

Used to define and build Java libraries. It compiles Java source code with the compileJava task, generates Javadoc with the javadoc task, and packages the compiled classes into a JAR file with the jar task.

The Google Services Gradle Plugin - com.google.gms:google-services

Enables Google APIs and Firebase services in your Android application with a configuration block called googleServices{} and a task called generateReleaseAssets.

The Gradle Bintray Plugin - com.jfrog.bintray

Allows you to publish artifacts to Bintray by configuring the plugin using the bintray{} block.

Plugin distribution

Plugins are distributed in three ways:

  1. Core plugins - Gradle develops and maintains a set of Core Plugins.

  2. Community plugins - Gradle’s community shares plugins via the Gradle Plugin Portal.

  3. Local plugins - Gradle enables users to create custom plugins using APIs.

Applying plugins

Applying a plugin to a project allows the plugin to extend the project’s capabilities.

You apply plugins in the build script using a plugin id (a globally unique identifier / name) and a version:

plugins {
    id «plugin id» version «plugin version»
}

1. Core plugins

Gradle Core plugins are a set of plugins that are included in the Gradle distribution itself. These plugins provide essential functionality for building and managing projects.

Some examples of core plugins include:

  • java: Provides support for building Java projects.

  • groovy: Adds support for compiling and testing Groovy source files.

  • ear: Adds support for building EAR files for enterprise applications.

Core plugins are unique in that they provide short names, such as java for the core JavaPlugin, when applied in build scripts. They also do not require versions. To apply the java plugin to a project:

build.gradle.kts
plugins {
    id("java")
}

There are many Gradle Core Plugins users can take advantage of.

2. Community plugins

Community plugins are plugins developed by the Gradle community, rather than being part of the core Gradle distribution. These plugins provide additional functionality that may be specific to certain use cases or technologies.

The Spring Boot Gradle plugin packages executable JAR or WAR archives, and runs Spring Boot Java applications.

To apply the org.springframework.boot plugin to a project:

build.gradle.kts
plugins {
    id("org.springframework.boot") version "3.1.5"
}

Community plugins can be published at the Gradle Plugin Portal, where other Gradle users can easily discover and use them.

3. Local plugins

Custom or local plugins are developed and used within a specific project or organization. These plugins are not shared publicly and are tailored to the specific needs of the project or organization.

Local plugins can encapsulate common build logic, provide integrations with internal systems or tools, or abstract complex functionality into reusable components.

Gradle provides users with the ability to develop custom plugins using APIs. To create your own plugin, you’ll typically follow these steps:

  1. Define the plugin class: create a new class that implements the Plugin<Project> interface.

    // Define a 'HelloPlugin' plugin
    class HelloPlugin : Plugin<Project> {
        override fun apply(project: Project) {
            // Define the 'hello' task
            val helloTask = project.tasks.register("hello") {
                doLast {
                    println("Hello, Gradle!")
                }
            }
        }
    }
  2. Build and optionally publish your plugin: generate a JAR file containing your plugin code and optionally publish this JAR to a repository (local or remote) to be used in other projects.

    // Publish the plugin
    plugins {
        `maven-publish`
    }
    
    publishing {
        publications {
            create<MavenPublication>("mavenJava") {
                from(components["java"])
            }
        }
        repositories {
            mavenLocal()
        }
    }
  3. Apply your plugin: when you want to use the plugin, include the plugin ID and version in the plugins{} block of the build file.

    // Apply the plugin
    plugins {
        id("com.example.hello") version "1.0"
    }

Consult the Plugin development chapter to learn more.

Gradle Incremental Builds and Build Caching

Gradle uses two main features to reduce build time: incremental builds and build caching.


Incremental builds

An incremental build is a build that avoids running tasks whose inputs have not changed since the previous build. Re-executing such tasks is unnecessary if they would only reproduce the same output.

For incremental builds to work, tasks must define their inputs and outputs. Gradle will determine whether the inputs or outputs have changed at build time. If they have changed, Gradle will execute the task. Otherwise, it will skip execution.
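
To illustrate, the following is a minimal sketch of a custom task type that declares one input and one output so Gradle can perform up-to-date checks. The class, property, and task names are illustrative:

build.gradle.kts
abstract class GenerateVersionFile : DefaultTask() {

    @get:Input
    abstract val versionText: Property<String>    // input value; changing it re-runs the task

    @get:OutputFile
    abstract val outputFile: RegularFileProperty  // tracked output file

    @TaskAction
    fun generate() {
        outputFile.get().asFile.writeText("version=${versionText.get()}")
    }
}

tasks.register<GenerateVersionFile>("generateVersionFile") {
    versionText = "1.0"
    outputFile = layout.buildDirectory.file("generated/version.txt")
}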

Incremental builds are always enabled, and the best way to see them in action is to turn on verbose mode. With verbose mode, each task state is labeled during a build:

$ ./gradlew compileJava --console=verbose

> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE
> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :list:compileJava UP-TO-DATE
> Task :utilities:compileJava UP-TO-DATE
> Task :app:compileJava UP-TO-DATE

BUILD SUCCESSFUL in 374ms
12 actionable tasks: 12 up-to-date

When you run a task that has been previously executed and hasn’t changed, then UP-TO-DATE is printed next to the task.

Tip
To permanently enable verbose mode, add org.gradle.console=verbose to your gradle.properties file.

Build caching

Incremental Builds are a great optimization that helps avoid work already done. If a developer continuously changes a single file, there is likely no need to rebuild all the other files in the project.

However, what happens when the same developer switches to a new branch created last week? The files are rebuilt, even though the developer is building something that has been built before.

This is where a build cache is helpful.

The build cache stores previous build results and restores them when needed. It prevents the redundant work and cost of executing time-consuming and expensive processes.

When the build cache has been used to repopulate the local directory, the tasks are marked as FROM-CACHE:

$ ./gradlew compileJava --build-cache

> Task :buildSrc:generateExternalPluginSpecBuilders UP-TO-DATE
> Task :buildSrc:extractPrecompiledScriptPluginPlugins UP-TO-DATE
> Task :buildSrc:compilePluginsBlocks UP-TO-DATE
> Task :buildSrc:generatePrecompiledScriptPluginAccessors UP-TO-DATE
> Task :buildSrc:generateScriptPluginAdapters UP-TO-DATE
> Task :buildSrc:compileKotlin UP-TO-DATE
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy NO-SOURCE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :list:compileJava FROM-CACHE
> Task :utilities:compileJava FROM-CACHE
> Task :app:compileJava FROM-CACHE

BUILD SUCCESSFUL in 364ms
12 actionable tasks: 3 from cache, 9 up-to-date

Once the local directory has been repopulated, the next execution will mark tasks as UP-TO-DATE and not FROM-CACHE.

The build cache allows you to share and reuse unchanged build and test outputs across teams. This speeds up local and CI builds since cycles are not wasted re-building binaries unaffected by new code changes.
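
The build cache can also be configured in the settings file. The following Kotlin DSL snippet is a minimal sketch of tuning the local cache; the custom directory is illustrative:

settings.gradle.kts
buildCache {
    local {
        isEnabled = true                              // the local cache is enabled by default
        directory = rootDir.resolve("build-cache")    // illustrative custom cache location
    }
}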

Consult the Build cache chapter to learn more.

Next Step: Learn about Build Scans >>

Build Scans

A build scan is a representation of metadata captured as you run your build.

Gradle captures your build metadata and sends it to the Build Scan Service. The service then transforms the metadata into information you can analyze and share with others.

build scan 1

The information that scans collect can be an invaluable resource when troubleshooting, collaborating on, or optimizing the performance of your builds.

For example, with a build scan, it’s no longer necessary to copy and paste error messages or include all the details about your environment each time you want to ask a question on Stack Overflow, Slack, or the Gradle Forum. Instead, copy the link to your latest build scan.

build scan 2

Enable Build Scans

To enable Build Scans for a Gradle build, add the --scan option to the command line:

$ ./gradlew build --scan

You may be prompted to agree to the terms to use Build Scans.

Visit the Build Scans page to learn more.

Next Step: Start the Tutorial >>

OTHER TOPICS

Continuous Builds

Continuous Build allows you to automatically re-execute the requested tasks when file inputs change. You can execute the build in this mode using the -t or --continuous command-line option.

For example, you can continuously run the test task and all dependent tasks by running:

$ gradle test --continuous

Gradle will behave as if you ran gradle test after a change to sources or tests that contribute to the requested tasks. This means unrelated changes (such as changes to build scripts) will not trigger a rebuild. To incorporate build logic changes, the continuous build must be restarted manually.

Continuous build uses file system watching to detect changes to the inputs. If file system watching does not work on your system, then continuous build won’t work either. In particular, continuous build does not work when using --no-daemon.

When Gradle detects a change to the inputs, it will not trigger the build immediately. Instead, it will wait until no additional changes are detected for a certain period of time - the quiet period. You can configure the quiet period in milliseconds by the Gradle property org.gradle.continuous.quietperiod.

Terminating Continuous Build

If Gradle is attached to an interactive input source, such as a terminal, the continuous build can be exited by pressing CTRL-D (On Microsoft Windows, it is required to also press ENTER or RETURN after CTRL-D).

If Gradle is not attached to an interactive input source (e.g. is running as part of a script), the build process must be terminated (e.g. using the kill command or similar).

If the build is being executed via the Tooling API, the build can be cancelled using the Tooling API’s cancellation mechanism.

Limitations

Under some circumstances, continuous build may not detect changes to inputs.

Creating input directories

Sometimes, creating an input directory that was previously missing does not trigger a build, due to the way file system watching works. For example, creating the src/main/java directory may not trigger a build. Similarly, if the input is a filtered file tree and no files are matching the filter, the creation of matching files may not trigger a build.

Inputs of untracked tasks

Changes to the inputs of untracked tasks or tasks that have no outputs may not trigger a build.

Changes to files outside of project directories

Gradle only watches for changes to files inside the project directory. Changes to files outside the project directory will go undetected and not trigger a build.

Build cycles

Gradle starts watching for changes just before a task executes. If a task modifies its own inputs while executing, Gradle will detect the change and trigger a new build. If every time the task executes, the inputs are modified again, the build will be triggered again. This isn’t unique to continuous build. A task that modifies its own inputs will never be considered up-to-date when run "normally" without continuous build.

If your build enters a build cycle like this, you can track down the task by looking at the list of files reported changed by Gradle. After identifying the file(s) that are changed during each build, you should look for a task that has that file as an input. In some cases, it may be obvious (e.g., a Java file is compiled with compileJava). In other cases, you can use --info logging to find the task that is out-of-date due to the identified files.

THE BASICS

Gradle Directories

Gradle uses two main directories to perform and manage its work: the Gradle User Home directory and the Project Root directory.


Gradle User Home directory

By default, the Gradle User Home (~/.gradle or C:\Users\<USERNAME>\.gradle) stores global configuration properties, initialization scripts, caches, and log files.

It can be set with the environment variable GRADLE_USER_HOME.

Tip
Not to be confused with GRADLE_HOME, the optional installation directory for Gradle.

It is roughly structured as follows:

├── caches                  // (1)
│   ├── 4.8                     // (2)
│   ├── 4.9                     // (2)
│   ├── ⋮
│   ├── jars-3                  // (3)
│   └── modules-2               // (3)
├── daemon // (4)
│   ├── ⋮
│   ├── 4.8
│   └── 4.9
├── init.d                  // (5)
│   └── my-setup.gradle
├── jdks                    // (6)
│   ├── ⋮
│   └── jdk-14.0.2+12
├── wrapper
│   └── dists                   // (7)
│       ├── ⋮
│       ├── gradle-4.8-bin
│       ├── gradle-4.9-all
│       └── gradle-4.9-bin
└── gradle.properties       // (8)
  1. Global cache directory (for everything that is not project-specific).

  2. Version-specific caches (e.g., to support incremental builds).

  3. Shared caches (e.g., for artifacts of dependencies).

  4. Registry and logs of the Gradle Daemon.

  5. Global initialization scripts.

  6. JDKs downloaded by the toolchain support.

  7. Distributions downloaded by the Gradle Wrapper.

  8. Global Gradle configuration properties.

Consult the Gradle Directories reference to learn more.

Project Root directory

The project root directory contains all source files from your project.

It also contains files and directories Gradle generates, such as .gradle and build.

While the gradle directory is usually checked into source control, the generated .gradle and build directories are not: the build directory contains the output of your builds as well as transient files Gradle uses to support features like incremental builds.

The anatomy of a typical project root directory looks as follows:

├── .gradle                 // (1)
│   ├── 4.8                     // (2)
│   ├── 4.9                     // (2)
│   └── ⋮
├── build                   // (3)
├── gradle
│   └── wrapper                 // (4)
├── gradle.properties       // (5)
├── gradlew                 // (6)
├── gradlew.bat             // (6)
├── settings.gradle.kts     // (7)
├── subproject-one          // (8)
|   └── build.gradle.kts        // (9)
├── subproject-two          // (8)
|   └── build.gradle.kts        // (9)
└── ⋮
  1. Project-specific cache directory generated by Gradle.

  2. Version-specific caches (e.g., to support incremental builds).

  3. The build directory of this project into which Gradle generates all build artifacts.

  4. Contains the JAR file and configuration of the Gradle Wrapper.

  5. Project-specific Gradle configuration properties.

  6. Scripts for executing builds using the Gradle Wrapper.

  7. The project’s settings file where the list of subprojects is defined.

  8. Usually, a project is organized into one or multiple subprojects.

  9. Each subproject has its own Gradle build script.

Consult the Gradle Directories reference to learn more.

Multi-Project Build Basics

Gradle supports multi-project builds.


While some small projects and monolithic applications may contain a single build file and source tree, it is often more common for a project to have been split into smaller, interdependent modules. The word "interdependent" is vital, as you typically want to link the many modules together through a single build.

Gradle supports this scenario through multi-project builds. This is sometimes referred to as a multi-module project. Gradle refers to modules as subprojects.

A multi-project build consists of one root project and one or more subprojects.

Multi-Project structure

The directory structure of a multi-project build containing three subprojects looks as follows:

├── .gradle
│   └── ⋮
├── gradle
│   ├── libs.versions.toml
│   └── wrapper
├── gradlew
├── gradlew.bat
├── settings.gradle.kts  // (1)
├── sub-project-1
│   └── build.gradle.kts // (2)
├── sub-project-2
│   └── build.gradle.kts // (2)
└── sub-project-3
    └── build.gradle.kts // (2)
  1. The settings.gradle.kts file should include all subprojects.

  2. Each subproject should have its own build.gradle.kts file.

Multi-Project standards

The Gradle community has two standards for multi-project build structures:

  1. Multi-Project Builds using buildSrc - where buildSrc is a subproject-like directory at the Gradle project root containing all the build logic.

  2. Composite Builds - a build that includes other builds where build-logic is a build directory at the Gradle project root containing reusable build logic.

1. Multi-Project Builds using buildSrc

Multi-project builds allow you to organize projects with many modules, wire dependencies between those modules, and easily share common build logic amongst them.

For example, a build that has many modules called mobile-app, web-app, api, lib, and documentation could be structured as follows:

.
├── gradle
├── gradlew
├── settings.gradle.kts
├── buildSrc
│   ├── build.gradle.kts
│   └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│   └── build.gradle.kts
├── web-app
│   └── build.gradle.kts
├── api
│   └── build.gradle.kts
├── lib
│   └── build.gradle.kts
└── documentation
    └── build.gradle.kts

The modules will have dependencies between them such as web-app and mobile-app depending on lib. This means that in order for Gradle to build web-app or mobile-app, it must build lib first.
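
A hedged sketch of how such a module-to-module dependency is declared, assuming web-app applies a plugin that provides the implementation configuration (such as java or application):

web-app/build.gradle.kts
dependencies {
    // Depend on the sibling subproject "lib"; Gradle builds :lib before :web-app
    implementation(project(":lib"))
}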

In this example, the root settings file will look as follows:

settings.gradle.kts
include("mobile-app", "web-app", "api", "lib", "documentation")
Note
The order in which the subprojects (modules) are included does not matter.

The buildSrc directory is automatically recognized by Gradle. It is a good place to define and maintain shared configuration or imperative build logic, such as custom tasks or plugins.

buildSrc is automatically included in your build as a special subproject if a build.gradle(.kts) file is found under buildSrc.

If the java plugin is applied to the buildSrc project, the compiled code from buildSrc/src/main/java is put in the classpath of the root build script, making it available to any subproject (web-app, mobile-app, lib, etc…​) in the build.
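A minimal sketch of what the buildSrc build script might contain, assuming the shared logic is written as a precompiled script plugin in the Kotlin DSL:

buildSrc/build.gradle.kts
plugins {
    `kotlin-dsl`   // enables precompiled script plugins such as shared-build-conventions.gradle.kts
}

repositories {
    mavenCentral()
}

Each subproject can then apply the shared convention plugin by the id derived from its file name, for example id("shared-build-conventions") in its plugins block.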

Consult how to declare dependencies between subprojects to learn more.

2. Composite Builds

Composite Builds, also referred to as included builds, are best for sharing logic between builds (not subprojects) or isolating access to shared build logic (i.e., convention plugins).

Let’s take the previous example. The logic in buildSrc has been turned into a project that contains plugins and can be published and worked on independently of the root project build.

The plugin is moved to its own build called build-logic with a build script and settings file:

.
├── gradle
├── gradlew
├── settings.gradle.kts
├── build-logic
│   ├── settings.gradle.kts
│   └── conventions
│       ├── build.gradle.kts
│       └── src/main/kotlin/shared-build-conventions.gradle.kts
├── mobile-app
│   └── build.gradle.kts
├── web-app
│   └── build.gradle.kts
├── api
│   └── build.gradle.kts
├── lib
│   └── build.gradle.kts
└── documentation
    └── build.gradle.kts
Note
The fact that build-logic is located in a subdirectory of the root project is irrelevant. The folder could be located outside the root project if desired.

The root settings file includes the entire build-logic build:

settings.gradle.kts
pluginManagement {
    includeBuild("build-logic")
}
include("mobile-app", "web-app", "api", "lib", "documentation")

Consult how to create composite builds with includeBuild to learn more.

Multi-Project path

A project path has the following pattern: it starts with an optional colon, which denotes the root project.

The root project, :, is the only project in a path not specified by its name.

The rest of a project path is a colon-separated sequence of project names, where the next project is a subproject of the previous project:

:sub-project-1

You can see the project paths when running gradle projects:

------------------------------------------------------------
Root project 'project'
------------------------------------------------------------

Root project 'project'
+--- Project ':sub-project-1'
\--- Project ':sub-project-2'

Project paths usually reflect the filesystem layout, but there are exceptions, most notably for composite builds.
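For example, including a nested subproject from the settings file gives it a path that mirrors the hierarchy (the project names here are illustrative):

settings.gradle.kts
include("services:webservice")   // creates the projects :services and :services:webservice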

Identifying project structure

You can use the gradle projects command to identify the project structure.

As an example, let’s use a multi-project build with the following structure:

> gradle -q projects
Projects:

------------------------------------------------------------
Root project 'multiproject'
------------------------------------------------------------

Root project 'multiproject'
+--- Project ':api'
+--- Project ':services'
|    +--- Project ':services:shared'
|    \--- Project ':services:webservice'
\--- Project ':shared'

To see a list of the tasks of a project, run gradle <project-path>:tasks
For example, try running gradle :api:tasks

Like single-project builds, multi-project builds are collections of tasks you can run. The difference is that you may want to control which project's tasks get executed.

The following sections will cover your two options for executing tasks in a multi-project build.

Executing tasks by name

The command gradle test will execute the test task in any subprojects, relative to the current working directory, that have that task.

If you run the command from the root project directory, you will run test in api, shared, services:shared and services:webservice.

If you run the command from the services project directory, you will only execute the task in services:shared and services:webservice.

The basic rule behind Gradle’s behavior is to execute all tasks down the hierarchy with this name, and to complain if no such task is found in any of the subprojects traversed.

Note
Some task selectors, like help or dependencies, will only run the task on the project they are invoked on and not on all the subprojects to reduce the amount of information printed on the screen.
Executing tasks by fully qualified name

You can use a task’s fully qualified name to execute a specific task in a particular subproject. For example: gradle :services:webservice:build will run the build task of the webservice subproject.

The fully qualified name of a task is its project path plus the task name.

This approach works for any task, so if you want to know what tasks are in a particular subproject, use the tasks task, e.g. gradle :services:webservice:tasks.

Multi-Project building and testing

The build task is typically used to compile, test, and check a single project.

In multi-project builds, you may often want to do all of these tasks across various projects. The buildNeeded and buildDependents tasks can help with this.

In this example, the :services:person-service project depends on both the :api and :shared projects. The :api project also depends on the :shared project.

Assuming you are working on a single project, the :api project, you have been making changes but have not built the entire project since performing a clean. You want to build any necessary supporting JARs but only perform code quality and unit tests on the parts of the project you have changed.

The build task does this:

$ gradle :api:build

> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build

BUILD SUCCESSFUL in 0s

If you have just gotten the latest version of the source from your version control system, which included changes in other projects that :api depends on, you might want to build all the projects you depend on AND test them too.

The buildNeeded task builds AND tests all the projects from the project dependencies of the testRuntime configuration:

$ gradle :api:buildNeeded

> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :shared:assemble
> Task :shared:compileTestJava
> Task :shared:processTestResources
> Task :shared:testClasses
> Task :shared:test
> Task :shared:check
> Task :shared:build
> Task :shared:buildNeeded
> Task :api:buildNeeded

BUILD SUCCESSFUL in 0s

You may want to refactor some part of the :api project used in other projects. If you make these changes, testing only the :api project is insufficient. You must test all projects that depend on the :api project.

The buildDependents task tests ALL the projects that have a project dependency (in the testRuntime configuration) on the specified project:

$ gradle :api:buildDependents

> Task :shared:compileJava
> Task :shared:processResources
> Task :shared:classes
> Task :shared:jar
> Task :api:compileJava
> Task :api:processResources
> Task :api:classes
> Task :api:jar
> Task :api:assemble
> Task :api:compileTestJava
> Task :api:processTestResources
> Task :api:testClasses
> Task :api:test
> Task :api:check
> Task :api:build
> Task :services:person-service:compileJava
> Task :services:person-service:processResources
> Task :services:person-service:classes
> Task :services:person-service:jar
> Task :services:person-service:assemble
> Task :services:person-service:compileTestJava
> Task :services:person-service:processTestResources
> Task :services:person-service:testClasses
> Task :services:person-service:test
> Task :services:person-service:check
> Task :services:person-service:build
> Task :services:person-service:buildDependents
> Task :api:buildDependents

BUILD SUCCESSFUL in 0s

Finally, you can build and test everything in all projects. Any task you run in the root project folder will cause that same-named task to be run on all the children.

You can run gradle build to build and test ALL projects.

Consult the Structuring Builds chapter to learn more.

Build Lifecycle

As a build author, you define tasks and dependencies between tasks. Gradle guarantees that these tasks will execute in order of their dependencies.

Your build scripts and plugins configure this dependency graph.

For example, if your project tasks include build, assemble, and createDocs, your build script(s) can ensure that they are executed in the order build → assemble → createDocs.
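A minimal sketch of how such an ordering can be expressed, assuming the base plugin provides the assemble and build lifecycle tasks and createDocs is an illustrative custom task:

build.gradle.kts
plugins {
    base
}

val createDocs by tasks.registering {
    doLast { println("Generating documentation") }
}

tasks.named("assemble") {
    dependsOn(createDocs)   // createDocs now runs before assemble, and therefore before build
}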

Task Graphs

Gradle builds the task graph before executing any task.

Across all projects in the build, tasks form a Directed Acyclic Graph (DAG).

This diagram shows two example task graphs, one abstract and the other concrete, with dependencies between tasks represented as arrows:

task dag examples

Both plugins and build scripts contribute to the task graph via the task dependency mechanism and annotated inputs/outputs.

Build Phases

A Gradle build has three distinct phases.

author gradle 1

Gradle runs these phases in order:

Phase 1. Initialization
  • Detects the settings.gradle(.kts) file.

  • Creates a Settings instance.

  • Evaluates the settings file to determine which projects (and included builds) make up the build.

  • Creates a Project instance for every project.

Phase 2. Configuration
  • Evaluates the build scripts, build.gradle(.kts), of every project participating in the build.

  • Creates a task graph for requested tasks.

Phase 3. Execution
  • Schedules and executes the selected tasks.

  • Dependencies between tasks determine execution order.

  • Execution of tasks can occur in parallel.

build lifecycle example
Example

The following example shows which parts of settings and build files correspond to various build phases:

settings.gradle.kts
rootProject.name = "basic"
println("This is executed during the initialization phase.")
build.gradle.kts
println("This is executed during the configuration phase.")

tasks.register("configured") {
    println("This is also executed during the configuration phase, because :configured is used in the build.")
}

tasks.register("test") {
    doLast {
        println("This is executed during the execution phase.")
    }
}

tasks.register("testBoth") {
    doFirst {
        println("This is executed first during the execution phase.")
    }
    doLast {
        println("This is executed last during the execution phase.")
    }
    println("This is executed during the configuration phase as well, because :testBoth is used in the build.")
}
settings.gradle
rootProject.name = 'basic'
println 'This is executed during the initialization phase.'
build.gradle
println 'This is executed during the configuration phase.'

tasks.register('configured') {
    println 'This is also executed during the configuration phase, because :configured is used in the build.'
}

tasks.register('test') {
    doLast {
        println 'This is executed during the execution phase.'
    }
}

tasks.register('testBoth') {
        doFirst {
          println 'This is executed first during the execution phase.'
        }
        doLast {
          println 'This is executed last during the execution phase.'
        }
        println 'This is executed during the configuration phase as well, because :testBoth is used in the build.'
}

The following command executes the test and testBoth tasks specified above. Because Gradle only configures requested tasks and their dependencies, the configured task is never configured:

> gradle test testBoth
This is executed during the initialization phase.

> Configure project :
This is executed during the configuration phase.
This is executed during the configuration phase as well, because :testBoth is used in the build.

> Task :test
This is executed during the execution phase.

> Task :testBoth
This is executed first during the execution phase.
This is executed last during the execution phase.

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

Phase 1. Initialization

In the initialization phase, Gradle detects the set of projects (root and subprojects) and included builds participating in the build.

Gradle first evaluates the settings file, settings.gradle(.kts), and instantiates a Settings object. Then, Gradle instantiates Project instances for each project.

Phase 2. Configuration

In the configuration phase, Gradle adds tasks and other properties to the projects found by the initialization phase.

Phase 3. Execution

In the execution phase, Gradle runs tasks.

Gradle uses the task execution graphs generated by the configuration phase to determine which tasks to execute.

Writing Settings Files

The settings file is the entry point of every Gradle build.

author gradle 7

Early in the Gradle Build lifecycle, the initialization phase finds the settings file in your project root directory.

When the settings file settings.gradle(.kts) is found, Gradle instantiates a Settings object.

One of the purposes of the Settings object is to allow you to declare all the projects to be included in the build.

Settings Scripts

The settings script is either a settings.gradle file in Groovy or a settings.gradle.kts file in Kotlin.

Before Gradle assembles the projects for a build, it creates a Settings instance and executes the settings file against it.

Settings

As the settings script executes, it configures this Settings object. In other words, the settings file defines the Settings object.

Important
There is a one-to-one correspondence between a Settings instance and a settings.gradle(.kts) file.

The Settings Object

The Settings object is part of the Gradle API.

  • In the Groovy DSL, the Settings object documentation is found here.

  • In the Kotlin DSL, the Settings object documentation is found here.

Many top-level properties and blocks in a settings script are part of the Settings API.

For example, we can set the root project name in the settings script using the Settings.rootProject property:

settings.rootProject.name = "root"

Which is usually shortened to:

rootProject.name = "root"
Standard Settings properties

The Settings object exposes a standard set of properties in your settings script.

The following table lists a few commonly used properties:

Name Description

buildCache

The build cache configuration.

plugins

The container of plugins that have been applied to the settings.

rootDir

The root directory of the build. The root directory is the project directory of the root project.

rootProject

The root project of the build.

settings

Returns this settings object.

The following table lists a few commonly used methods:

Name Description

include()

Adds the given projects to the build.

includeBuild()

Includes a build at the specified path to the composite build.
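A minimal sketch combining a few of these properties and methods in a settings script; the included build path is illustrative:

settings.gradle.kts
rootProject.name = "root"     // the rootProject property
println(rootDir)              // the rootDir property: the root project's directory

include("app")                // adds the :app subproject to the build
includeBuild("build-logic")   // includes another build to form a composite build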

Settings Script structure

A Settings script is a series of method calls to the Gradle API that often use { …​ }, a special shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a closure in Groovy.

Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy closure object is passed as the argument. It is the short form for:

plugins(function() {
    id("plugin")
})

Blocks are mapped to Gradle API methods.

The code inside the function is executed against a this object, called a receiver in a Kotlin lambda and a delegate in a Groovy closure. Gradle determines the correct this object and invokes the corresponding method. The this object of the id("plugin") method invocation is of type PluginDependenciesSpec.

The settings file is composed of Gradle API calls built on top of the DSLs. Gradle executes the script line by line, top to bottom.

Let’s take a look at an example and break it down:

settings.gradle.kts
pluginManagement {                                          // (1)
    repositories {
        gradlePluginPortal()
        google()
    }
}

plugins {                                                   // (2)
    id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}

rootProject.name = "root-project"                           // (3)

dependencyResolutionManagement {                            // (4)
    repositories {
        mavenCentral()
    }
}

include("sub-project-a")                                    // (5)
include("sub-project-b")
include("sub-project-c")
  1. Define the location of plugins

  2. Apply settings plugins.

  3. Define the root project name.

  4. Define dependency resolution strategies.

  5. Add subprojects to the build.

settings.gradle
pluginManagement {                                          // (1)
    repositories {
        gradlePluginPortal()
        google()
    }
}

plugins {                                                   // (2)
    id 'org.gradle.toolchains.foojay-resolver-convention' version '0.8.0'
}

rootProject.name = 'root-project'                           // (3)

dependencyResolutionManagement {                            // (4)
    repositories {
        mavenCentral()
    }
}

include('sub-project-a')                                    // (5)
include('sub-project-b')
include('sub-project-c')
  1. Define the location of plugins.

  2. Apply settings plugins.

  3. Define the root project name.

  4. Define dependency resolution strategies.

  5. Add subprojects to the build.

1. Define the location of plugins

The settings file can optionally manage plugin versions and repositories for your build with pluginManagement. It provides a centralized way to define which plugins should be used in your project and from which repositories they should be resolved.

pluginManagement {
    repositories {
        gradlePluginPortal()
        google()
    }
}
2. Apply settings plugins

The settings file can optionally apply plugins that are required for configuring the settings of the project. Common examples are the Develocity plugin and the Toolchain Resolver plugin, the latter of which is shown in the example below.

Plugins applied in the settings file only affect the Settings object.

plugins {
  id("org.gradle.toolchains.foojay-resolver-convention") version "0.8.0"
}
3. Define the root project name

The settings file defines your project name using the rootProject.name property:

rootProject.name = "root-project"

There is only one root project per build.

4. Define dependency resolution strategies

The settings file can optionally define rules and configurations for dependency resolution across your project(s). It provides a centralized way to manage and customize dependency resolution.

dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.PREFER_PROJECT)
    repositories {
        mavenCentral()
    }
}

You can also include version catalogs in this section.
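For example, a version catalog can be declared alongside the repositories. The following is a minimal sketch; the catalog name and coordinates are illustrative:

settings.gradle.kts
dependencyResolutionManagement {
    repositories {
        mavenCentral()
    }
    versionCatalogs {
        create("libs") {
            library("guava", "com.google.guava:guava:32.1.1-jre")
        }
    }
}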

5. Add subprojects to the build

The settings file defines the structure of the project by adding all the subprojects using the include statement:

include("app")
include("business-logic")
include("data-model")

You can also include entire builds using includeBuild.

Settings File Scripting

There are many more properties and methods on the Settings object that you can use to configure your build.

It’s important to remember that while many Gradle scripts are typically written in short Groovy or Kotlin syntax, every item in the settings script is essentially invoking a method on the Settings object in the Gradle API:

include("app")

Is actually:

settings.include("app")

Additionally, the full power of the Groovy and Kotlin languages is available to you.

For example, instead of using include many times to add subprojects, you can iterate over the list of directories in the project root folder and include them automatically:

rootDir.listFiles()?.filter { it.isDirectory && File(it, "build.gradle.kts").exists() }?.forEach {
    include(it.name)
}
Tip
This type of logic should be developed in a plugin.

Writing Build Scripts

The initialization phase in the Gradle Build lifecycle finds the root project and subprojects included in your project root directory using the settings file.

author gradle 6

Then, for each project included in the settings file, Gradle creates a Project instance.

Gradle then looks for a corresponding build script file, which is used in the configuration phase.

Build Scripts

Every Gradle build comprises one or more projects: a root project and, optionally, subprojects.

A project typically corresponds to a software component that needs to be built, like a library or an application. It might represent a library JAR, a web application, or a distribution ZIP assembled from the JARs produced by other projects.

On the other hand, it might represent a thing to be done, such as deploying your application to staging or production environments.

Gradle scripts are written in either Groovy DSL or Kotlin DSL (domain-specific language).

A build script configures a project and is associated with an object of type Project.

Build

As the build script executes, it configures this Project object.

The build script is either a *.gradle file in Groovy or a *.gradle.kts file in Kotlin.

Important
Build scripts configure Project objects and their children.

The Project object

The Project object is part of the Gradle API:

  • In the Groovy DSL, the Project object documentation is found here.

  • In the Kotlin DSL, the Project object documentation is found here.

Many top-level properties and blocks in a build script are part of the Project API.

For example, the following build script uses the Project.name property to print the name of the project:

build.gradle.kts
println(name)
println(project.name)
build.gradle
println name
println project.name
$ gradle -q check
project-api
project-api

Both println statements print out the same property.

The first uses the top-level reference to the name property of the Project object. The second statement uses the project property available to any build script, which returns the associated Project object.

Standard project properties

The Project object exposes a standard set of properties in your build script.

The following table lists a few commonly used properties:

Name Type Description

name

String

The name of the project directory.

path

String

The fully qualified name of the project.

description

String

A description for the project.

dependencies

DependencyHandler

Returns the dependency handler of the project.

repositories

RepositoryHandler

Returns the repository handler of the project.

layout

ProjectLayout

Provides access to several important locations for a project.

group

Object

The group of this project.

version

Object

The version of this project.

The following table lists a few commonly used methods:

Name Description

uri()

Resolves a file path to a URI, relative to the project directory of this project.

task()

Creates a Task with the given name and adds it to this project.
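A minimal sketch that reads a few of these properties and uses uri(); the values shown are illustrative:

build.gradle.kts
description = "A sample project"

println(name)                          // the name of the project directory
println(path)                          // the fully qualified project path
println(layout.buildDirectory.get())   // a location provided by ProjectLayout
println(uri("docs/index.html"))        // resolved relative to the project directory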

Build Script structure

The build script is composed of { …​ } blocks, a special shortcut in both the Groovy and Kotlin languages. A { } block is called a lambda in Kotlin or a closure in Groovy.

Simply put, the plugins{ } block is a method invocation in which a Kotlin lambda object or Groovy closure object is passed as the argument. It is the short form for:

plugins(function() {
    id("plugin")
})

Blocks are mapped to Gradle API methods.

The code inside the function is executed against a this object, called a receiver in a Kotlin lambda and a delegate in a Groovy closure. Gradle determines the correct this object and invokes the corresponding method. The this object of the id("plugin") method invocation is of type PluginDependenciesSpec.

The build script is essentially composed of Gradle API calls built on top of the DSLs. Gradle executes the script line by line, top to bottom.

Let’s take a look at an example and break it down:

build.gradle.kts
plugins {                                                               // (1)
    id("org.jetbrains.kotlin.jvm") version "1.9.0"
    id("application")
}

repositories {                                                          // (2)
    mavenCentral()
}

dependencies {                                                          // (3)
    testImplementation("org.jetbrains.kotlin:kotlin-test-junit5")
    testImplementation("org.junit.jupiter:junit-jupiter-engine:5.9.3")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")
    implementation("com.google.guava:guava:32.1.1-jre")
}

application {                                                           // (4)
    mainClass = "com.example.Main"
}

tasks.named<Test>("test") {                                             // (5)
    useJUnitPlatform()
}
  1. Apply plugins to the build.

  2. Define the locations where dependencies can be found.

  3. Add dependencies.

  4. Set properties.

  5. Register and configure tasks.

build.gradle
plugins {                                                               // (1)
    id 'org.jetbrains.kotlin.jvm' version '1.9.0'
    id 'application'
}

repositories {                                                          // (2)
    mavenCentral()
}

dependencies {                                                          // (3)
    testImplementation 'org.jetbrains.kotlin:kotlin-test-junit5'
    testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.9.3'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
    implementation 'com.google.guava:guava:32.1.1-jre'
}

application {                                                           // (4)
    mainClass = 'com.example.Main'
}

tasks.named('test') {                                                   // (5)
    useJUnitPlatform()
}
  1. Apply plugins to the build.

  2. Define the locations where dependencies can be found.

  3. Add dependencies.

  4. Set properties.

  5. Register and configure tasks.

1. Apply plugins to the build

Plugins are used to extend Gradle. They are also used to modularize and reuse project configurations.

Plugins can be applied using the PluginDependenciesSpec plugins script block.

The plugins block is preferred:

plugins {
    id("org.jetbrains.kotlin.jvm") version "1.9.0"
    id("application")
}

In the example, the application plugin, which is included with Gradle, has been applied, describing our project as a Java application.

The Kotlin Gradle plugin, version 1.9.0, has also been applied. This plugin is not included with Gradle and, therefore, has to be described using a plugin id and a plugin version so that Gradle can find and apply it.

2. Define the locations where dependencies can be found

A project generally has a number of dependencies it needs to do its work. Dependencies include plugins, libraries, or components that Gradle must download for the build to succeed.

The build script lets Gradle know where to look for the binaries of the dependencies. More than one location can be provided:

repositories {
    mavenCentral()
    google()
}

In the example, the guava library and the JetBrains Kotlin plugin (org.jetbrains.kotlin.jvm) will be downloaded from the Maven Central Repository.

3. Add dependencies

A project generally has a number of dependencies it needs to do its work. These dependencies are often libraries of precompiled classes that are imported in the project’s source code.

Dependencies are managed via configurations and are retrieved from repositories.

Use the DependencyHandler returned by Project.getDependencies() method to manage the dependencies. Use the RepositoryHandler returned by Project.getRepositories() method to manage the repositories.

dependencies {
    implementation("com.google.guava:guava:32.1.1-jre")
}

In the example, the application code uses Google’s guava libraries. Guava provides utility methods for collections, caching, primitives support, concurrency, common annotations, string processing, I/O, and validations.

4. Set properties

A plugin can add properties and methods to a project using extensions.

The Project object has an associated ExtensionContainer object that contains all the settings and properties for the plugins that have been applied to the project.

In the example, the application plugin added an application property, which is used to detail the main class of our Java application:

application {
    mainClass = "com.example.Main"
}
5. Register and configure tasks

Tasks perform some basic piece of work, such as compiling classes, or running unit tests, or zipping up a WAR file.

While tasks are typically defined in plugins, you may need to register or configure tasks in build scripts.

Registering a task adds the task to your project.

You can register tasks in a project using the TaskContainer.register(java.lang.String) method:

tasks.register<Zip>("zip-reports") {
    from 'Reports/'
    include '*'
    archiveName 'Reports.zip'
    destinationDir(file('/dir'))
}

You may have seen usage of the TaskContainer.create(java.lang.String) method which should be avoided:

tasks.create<Zip>("zip-reports") {
    from 'Reports/'
    include '*'
    archiveName 'Reports.zip'
    destinationDir(file('/dir'))
}
Tip
register(), which enables task configuration avoidance, is preferred over create().

You can locate a task to configure it using the TaskCollection.named(java.lang.String) method:

tasks.named<Test>("test") {
    useJUnitPlatform()
}

The example below configures the Javadoc task to automatically generate HTML documentation from Java code:

tasks.named("javadoc").configure {
    exclude 'app/Internal*.java'
    exclude 'app/internal/*'
    exclude 'app/internal/*'
}

Build Scripting

A build script is made up of zero or more statements and script blocks:

println(project.layout.projectDirectory);

Statements can include method calls, property assignments, and local variable definitions:

version = '1.0.0.GA'

A script block is a method call which takes a closure/lambda as a parameter:

configurations {
}

The closure/lambda configures some delegate object as it executes:

repositories {
    google()
}

A build script is also a Groovy or a Kotlin script:

build.gradle.kts
tasks.register("upper") {
    doLast {
        val someString = "mY_nAmE"
        println("Original: $someString")
        println("Upper case: ${someString.toUpperCase()}")
    }
}
build.gradle
tasks.register('upper') {
    doLast {
        String someString = 'mY_nAmE'
        println "Original: $someString"
        println "Upper case: ${someString.toUpperCase()}"
    }
}
$ gradle -q upper
Original: mY_nAmE
Upper case: MY_NAME

It can contain elements allowed in a Groovy or Kotlin script, such as method definitions and class definitions:

build.gradle.kts
tasks.register("count") {
    doLast {
        repeat(4) { print("$it ") }
    }
}
build.gradle
tasks.register('count') {
    doLast {
        4.times { print "$it " }
    }
}
$ gradle -q count
0 1 2 3 
Flexible task registration

Using the capabilities of the Groovy or Kotlin language, you can register multiple tasks in a loop:

build.gradle.kts
repeat(4) { counter ->
    tasks.register("task$counter") {
        doLast {
            println("I'm task number $counter")
        }
    }
}
build.gradle
4.times { counter ->
    tasks.register("task$counter") {
        doLast {
            println "I'm task number $counter"
        }
    }
}
$ gradle -q task1
I'm task number 1
Declare Variables

Build scripts can declare two variables: local variables and extra properties.

Local Variables

Declare local variables with the val keyword. Local variables are only visible in the scope where they have been declared. They are a feature of the underlying Kotlin language.

Declare local variables with the def keyword. Local variables are only visible in the scope where they have been declared. They are a feature of the underlying Groovy language.

build.gradle.kts
val dest = "dest"

tasks.register<Copy>("copy") {
    from("source")
    into(dest)
}
build.gradle
def dest = 'dest'

tasks.register('copy', Copy) {
    from 'source'
    into dest
}
Extra Properties

Gradle’s enhanced objects, including projects, tasks, and source sets, can hold user-defined properties.

Add, read, and set extra properties via the owning object’s extra property. Alternatively, you can access extra properties via Kotlin delegated properties using by extra.

Add, read, and set extra properties via the owning object’s ext property. Alternatively, you can use an ext block to add multiple properties simultaneously.

build.gradle.kts
plugins {
    id("java-library")
}

val springVersion by extra("3.1.0.RELEASE")
val emailNotification by extra { "build@master.org" }

sourceSets.all { extra["purpose"] = null }

sourceSets {
    main {
        extra["purpose"] = "production"
    }
    test {
        extra["purpose"] = "test"
    }
    create("plugin") {
        extra["purpose"] = "production"
    }
}

tasks.register("printProperties") {
    val springVersion = springVersion
    val emailNotification = emailNotification
    val productionSourceSets = provider {
        sourceSets.matching { it.extra["purpose"] == "production" }.map { it.name }
    }
    doLast {
        println(springVersion)
        println(emailNotification)
        productionSourceSets.get().forEach { println(it) }
    }
}
build.gradle
plugins {
    id 'java-library'
}

ext {
    springVersion = "3.1.0.RELEASE"
    emailNotification = "build@master.org"
}

sourceSets.all { ext.purpose = null }

sourceSets {
    main {
        purpose = "production"
    }
    test {
        purpose = "test"
    }
    plugin {
        purpose = "production"
    }
}

tasks.register('printProperties') {
    def springVersion = springVersion
    def emailNotification = emailNotification
    def productionSourceSets = provider {
        sourceSets.matching { it.purpose == "production" }.collect { it.name }
    }
    doLast {
        println springVersion
        println emailNotification
        productionSourceSets.get().each { println it }
    }
}
$ gradle -q printProperties
3.1.0.RELEASE
build@master.org
main
plugin

This example adds two extra properties to the project object via by extra. Additionally, this example adds a property named purpose to each source set by setting extra["purpose"] to null. Once added, you can read and set these properties via extra.

This example adds two extra properties to the project object via an ext block. Additionally, this example adds a property named purpose to each source set by setting ext.purpose to null. Once added, you can read and set all these properties just like predefined ones.

Gradle requires special syntax for adding a property so that it can fail fast. For example, this allows Gradle to recognize when a script attempts to set a property that does not exist. You can access extra properties anywhere where you can access their owning object. This gives extra properties a wider scope than local variables. Subprojects can access extra properties on their parent projects.

For more information about extra properties, see ExtraPropertiesExtension in the API documentation.

Configure Arbitrary Objects

The example greet() task shows an example of arbitrary object configuration:

build.gradle.kts
class UserInfo(
    var name: String? = null,
    var email: String? = null
)

tasks.register("greet") {
    val user = UserInfo().apply {
        name = "Isaac Newton"
        email = "isaac@newton.me"
    }
    doLast {
        println(user.name)
        println(user.email)
    }
}
build.gradle
class UserInfo {
    String name
    String email
}

tasks.register('greet') {
    def user = configure(new UserInfo()) {
        name = "Isaac Newton"
        email = "isaac@newton.me"
    }
    doLast {
        println user.name
        println user.email
    }
}
$ gradle -q greet
Isaac Newton
isaac@newton.me
Closure Delegates

Each closure has a delegate object. Groovy uses this delegate to look up variable and method references to nonlocal variables and closure parameters. Gradle uses this for configuration closures, where the delegate object refers to the object being configured.

build.gradle
dependencies {
    assert delegate == project.dependencies
    testImplementation('junit:junit:4.13')
    delegate.testImplementation('junit:junit:4.13')
}

Default imports

To make build scripts more concise, Gradle automatically adds a set of import statements to scripts.

As a result, instead of writing throw new org.gradle.api.tasks.StopExecutionException(), you can simply write throw new StopExecutionException().

Gradle implicitly adds the following imports to each script:

Gradle default imports
import org.gradle.*
import org.gradle.api.*
import org.gradle.api.artifacts.*
import org.gradle.api.artifacts.component.*
import org.gradle.api.artifacts.dsl.*
import org.gradle.api.artifacts.ivy.*
import org.gradle.api.artifacts.maven.*
import org.gradle.api.artifacts.query.*
import org.gradle.api.artifacts.repositories.*
import org.gradle.api.artifacts.result.*
import org.gradle.api.artifacts.transform.*
import org.gradle.api.artifacts.type.*
import org.gradle.api.artifacts.verification.*
import org.gradle.api.attributes.*
import org.gradle.api.attributes.java.*
import org.gradle.api.attributes.plugin.*
import org.gradle.api.cache.*
import org.gradle.api.capabilities.*
import org.gradle.api.component.*
import org.gradle.api.configuration.*
import org.gradle.api.credentials.*
import org.gradle.api.distribution.*
import org.gradle.api.distribution.plugins.*
import org.gradle.api.execution.*
import org.gradle.api.file.*
import org.gradle.api.flow.*
import org.gradle.api.initialization.*
import org.gradle.api.initialization.definition.*
import org.gradle.api.initialization.dsl.*
import org.gradle.api.initialization.resolve.*
import org.gradle.api.invocation.*
import org.gradle.api.java.archives.*
import org.gradle.api.jvm.*
import org.gradle.api.launcher.cli.*
import org.gradle.api.logging.*
import org.gradle.api.logging.configuration.*
import org.gradle.api.model.*
import org.gradle.api.plugins.*
import org.gradle.api.plugins.antlr.*
import org.gradle.api.plugins.catalog.*
import org.gradle.api.plugins.jvm.*
import org.gradle.api.plugins.quality.*
import org.gradle.api.plugins.scala.*
import org.gradle.api.problems.*
import org.gradle.api.project.*
import org.gradle.api.provider.*
import org.gradle.api.publish.*
import org.gradle.api.publish.ivy.*
import org.gradle.api.publish.ivy.plugins.*
import org.gradle.api.publish.ivy.tasks.*
import org.gradle.api.publish.maven.*
import org.gradle.api.publish.maven.plugins.*
import org.gradle.api.publish.maven.tasks.*
import org.gradle.api.publish.plugins.*
import org.gradle.api.publish.tasks.*
import org.gradle.api.reflect.*
import org.gradle.api.reporting.*
import org.gradle.api.reporting.components.*
import org.gradle.api.reporting.dependencies.*
import org.gradle.api.reporting.dependents.*
import org.gradle.api.reporting.model.*
import org.gradle.api.reporting.plugins.*
import org.gradle.api.resources.*
import org.gradle.api.services.*
import org.gradle.api.specs.*
import org.gradle.api.tasks.*
import org.gradle.api.tasks.ant.*
import org.gradle.api.tasks.application.*
import org.gradle.api.tasks.bundling.*
import org.gradle.api.tasks.compile.*
import org.gradle.api.tasks.diagnostics.*
import org.gradle.api.tasks.diagnostics.configurations.*
import org.gradle.api.tasks.incremental.*
import org.gradle.api.tasks.javadoc.*
import org.gradle.api.tasks.options.*
import org.gradle.api.tasks.scala.*
import org.gradle.api.tasks.testing.*
import org.gradle.api.tasks.testing.junit.*
import org.gradle.api.tasks.testing.junitplatform.*
import org.gradle.api.tasks.testing.testng.*
import org.gradle.api.tasks.util.*
import org.gradle.api.tasks.wrapper.*
import org.gradle.api.toolchain.management.*
import org.gradle.authentication.*
import org.gradle.authentication.aws.*
import org.gradle.authentication.http.*
import org.gradle.build.event.*
import org.gradle.buildconfiguration.tasks.*
import org.gradle.buildinit.*
import org.gradle.buildinit.plugins.*
import org.gradle.buildinit.tasks.*
import org.gradle.caching.*
import org.gradle.caching.configuration.*
import org.gradle.caching.http.*
import org.gradle.caching.local.*
import org.gradle.concurrent.*
import org.gradle.external.javadoc.*
import org.gradle.ide.visualstudio.*
import org.gradle.ide.visualstudio.plugins.*
import org.gradle.ide.visualstudio.tasks.*
import org.gradle.ide.xcode.*
import org.gradle.ide.xcode.plugins.*
import org.gradle.ide.xcode.tasks.*
import org.gradle.ivy.*
import org.gradle.jvm.*
import org.gradle.jvm.application.scripts.*
import org.gradle.jvm.application.tasks.*
import org.gradle.jvm.tasks.*
import org.gradle.jvm.toolchain.*
import org.gradle.language.*
import org.gradle.language.assembler.*
import org.gradle.language.assembler.plugins.*
import org.gradle.language.assembler.tasks.*
import org.gradle.language.base.*
import org.gradle.language.base.artifact.*
import org.gradle.language.base.compile.*
import org.gradle.language.base.plugins.*
import org.gradle.language.base.sources.*
import org.gradle.language.c.*
import org.gradle.language.c.plugins.*
import org.gradle.language.c.tasks.*
import org.gradle.language.cpp.*
import org.gradle.language.cpp.plugins.*
import org.gradle.language.cpp.tasks.*
import org.gradle.language.java.artifact.*
import org.gradle.language.jvm.tasks.*
import org.gradle.language.nativeplatform.*
import org.gradle.language.nativeplatform.tasks.*
import org.gradle.language.objectivec.*
import org.gradle.language.objectivec.plugins.*
import org.gradle.language.objectivec.tasks.*
import org.gradle.language.objectivecpp.*
import org.gradle.language.objectivecpp.plugins.*
import org.gradle.language.objectivecpp.tasks.*
import org.gradle.language.plugins.*
import org.gradle.language.rc.*
import org.gradle.language.rc.plugins.*
import org.gradle.language.rc.tasks.*
import org.gradle.language.scala.tasks.*
import org.gradle.language.swift.*
import org.gradle.language.swift.plugins.*
import org.gradle.language.swift.tasks.*
import org.gradle.maven.*
import org.gradle.model.*
import org.gradle.nativeplatform.*
import org.gradle.nativeplatform.platform.*
import org.gradle.nativeplatform.plugins.*
import org.gradle.nativeplatform.tasks.*
import org.gradle.nativeplatform.test.*
import org.gradle.nativeplatform.test.cpp.*
import org.gradle.nativeplatform.test.cpp.plugins.*
import org.gradle.nativeplatform.test.cunit.*
import org.gradle.nativeplatform.test.cunit.plugins.*
import org.gradle.nativeplatform.test.cunit.tasks.*
import org.gradle.nativeplatform.test.googletest.*
import org.gradle.nativeplatform.test.googletest.plugins.*
import org.gradle.nativeplatform.test.plugins.*
import org.gradle.nativeplatform.test.tasks.*
import org.gradle.nativeplatform.test.xctest.*
import org.gradle.nativeplatform.test.xctest.plugins.*
import org.gradle.nativeplatform.test.xctest.tasks.*
import org.gradle.nativeplatform.toolchain.*
import org.gradle.nativeplatform.toolchain.plugins.*
import org.gradle.normalization.*
import org.gradle.platform.*
import org.gradle.platform.base.*
import org.gradle.platform.base.binary.*
import org.gradle.platform.base.component.*
import org.gradle.platform.base.plugins.*
import org.gradle.plugin.devel.*
import org.gradle.plugin.devel.plugins.*
import org.gradle.plugin.devel.tasks.*
import org.gradle.plugin.management.*
import org.gradle.plugin.use.*
import org.gradle.plugins.ear.*
import org.gradle.plugins.ear.descriptor.*
import org.gradle.plugins.ide.*
import org.gradle.plugins.ide.api.*
import org.gradle.plugins.ide.eclipse.*
import org.gradle.plugins.ide.idea.*
import org.gradle.plugins.signing.*
import org.gradle.plugins.signing.signatory.*
import org.gradle.plugins.signing.signatory.pgp.*
import org.gradle.plugins.signing.type.*
import org.gradle.plugins.signing.type.pgp.*
import org.gradle.process.*
import org.gradle.swiftpm.*
import org.gradle.swiftpm.plugins.*
import org.gradle.swiftpm.tasks.*
import org.gradle.testing.base.*
import org.gradle.testing.base.plugins.*
import org.gradle.testing.jacoco.plugins.*
import org.gradle.testing.jacoco.tasks.*
import org.gradle.testing.jacoco.tasks.rules.*
import org.gradle.testkit.runner.*
import org.gradle.util.*
import org.gradle.vcs.*
import org.gradle.vcs.git.*
import org.gradle.work.*
import org.gradle.workers.*

Next Step: Learn how to use Tasks >>

Using Tasks

The work that Gradle can do on a project is defined by one or more tasks.

author gradle 5

A task represents some independent unit of work that a build performs. This might be compiling some classes, creating a JAR, generating Javadoc, or publishing some archives to a repository.

When a user runs ./gradlew build in the command line, Gradle will execute the build task along with any other tasks it depends on.

List available tasks

Gradle provides several default tasks for a project, which are listed by running ./gradlew tasks:

> Task :tasks

------------------------------------------------------------
Tasks runnable from root project 'myTutorial'
------------------------------------------------------------

Build Setup tasks
-----------------
init - Initializes a new Gradle build.
wrapper - Generates Gradle wrapper files.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in root project 'myTutorial'.
...

Tasks either come from build scripts or plugins.

Once we apply a plugin to our project, such as the application plugin, additional tasks become available:

build.gradle.kts
plugins {
    id("application")
}
$ ./gradlew tasks

> Task :tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the main source code.

Other tasks
-----------
compileJava - Compiles main Java source.

...

Many of these tasks, such as assemble, build, and run, should be familiar to a developer.

Task classification

There are two classes of tasks that can be executed:

  1. Actionable tasks have some action(s) attached to do work in your build: compileJava.

  2. Lifecycle tasks are tasks with no actions attached: assemble, build.

Typically, a lifecycle task depends on many actionable tasks and is used to execute many tasks at once.
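A minimal sketch of the difference, using illustrative task names:

build.gradle.kts
val generateDocs = tasks.register("generateDocs") {
    doLast {                                // actionable task: it has an action attached
        println("Generating documentation")
    }
}

tasks.register("allDocs") {                 // lifecycle task: no action, it only aggregates
    dependsOn(generateDocs)
}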

Task registration and action

Let’s take a look at a simple "Hello World" task in a build script:

build.gradle.kts
tasks.register("hello") {
    doLast {
        println("Hello world!")
    }
}
build.gradle
tasks.register('hello') {
    doLast {
        println 'Hello world!'
    }
}

In the example, the build script registers a single task called hello using the TaskContainer API, and adds an action to it.

If the tasks in the project are listed, the hello task is available to Gradle:

$ ./gradlew app:tasks --all

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Other tasks
-----------
compileJava - Compiles main Java source.
compileTestJava - Compiles test Java source.
hello
processResources - Processes main resources.
processTestResources - Processes test resources.
startScripts - Creates OS-specific scripts to run the project as a JVM application.

You can execute the task in the build script with ./gradlew hello:

$ ./gradlew hello
Hello world!

When Gradle executes the hello task, it executes the action provided. In this case, the action is simply a block containing some code: println("Hello world!").

Task group and description

The hello task from the previous section can be detailed with a description and assigned to a group with the following update:

build.gradle.kts
tasks.register("hello") {
    group = "Custom"
    description = "A lovely greeting task."
    doLast {
        println("Hello world!")
    }
}

Once the task is assigned to a group, it will be listed by ./gradlew tasks:

$ ./gradlew tasks

> Task :tasks

Custom tasks
------------------
hello - A lovely greeting task.

To view information about a task, use the help --task <task-name> command:

$ ./gradlew help --task hello

> Task :help
Detailed task information for hello

Path
:app:hello

Type
Task (org.gradle.api.Task)

Options
--rerun     Causes the task to be re-run even if up-to-date.

Description
A lovely greeting task.

Group
Custom

As we can see, the hello task belongs to the custom group.

Task dependencies

You can declare tasks that depend on other tasks:

build.gradle.kts
tasks.register("hello") {
    doLast {
        println("Hello world!")
    }
}
tasks.register("intro") {
    dependsOn("hello")
    doLast {
        println("I'm Gradle")
    }
}
build.gradle
tasks.register('hello') {
    doLast {
        println 'Hello world!'
    }
}
tasks.register('intro') {
    dependsOn tasks.hello
    doLast {
        println "I'm Gradle"
    }
}
$ gradle -q intro
Hello world!
I'm Gradle

The dependency of taskX on taskY may be declared before taskY is defined:

build.gradle.kts
tasks.register("taskX") {
    dependsOn("taskY")
    doLast {
        println("taskX")
    }
}
tasks.register("taskY") {
    doLast {
        println("taskY")
    }
}
build.gradle
tasks.register('taskX') {
    dependsOn 'taskY'
    doLast {
        println 'taskX'
    }
}
tasks.register('taskY') {
    doLast {
        println 'taskY'
    }
}
$ gradle -q taskX
taskY
taskX

The hello task from the previous example is updated to include a dependency:

build.gradle.kts
tasks.register("hello") {
    group = "Custom"
    description = "A lovely greeting task."
    doLast {
        println("Hello world!")
    }
    dependsOn(tasks.assemble)
}

The hello task now depends on the assemble task, which means that Gradle must execute the assemble task before it can execute the hello task:

$ ./gradlew :app:hello

> Task :app:compileJava UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:startScripts UP-TO-DATE
> Task :app:distTar UP-TO-DATE
> Task :app:distZip UP-TO-DATE
> Task :app:assemble UP-TO-DATE

> Task :app:hello
Hello world!

Task configuration

Once registered, tasks can be accessed via the TaskProvider API for further configuration.

For instance, you can use this to add dependencies to a task at runtime dynamically:

build.gradle.kts
repeat(4) { counter ->
    tasks.register("task$counter") {
        doLast {
            println("I'm task number $counter")
        }
    }
}
tasks.named("task0") { dependsOn("task2", "task3") }
build.gradle
4.times { counter ->
    tasks.register("task$counter") {
        doLast {
            println "I'm task number $counter"
        }
    }
}
tasks.named('task0') { dependsOn('task2', 'task3') }
$ gradle -q task0
I'm task number 2
I'm task number 3
I'm task number 0

Or you can add behavior to an existing task:

build.gradle.kts
tasks.register("hello") {
    doLast {
        println("Hello Earth")
    }
}
tasks.named("hello") {
    doFirst {
        println("Hello Venus")
    }
}
tasks.named("hello") {
    doLast {
        println("Hello Mars")
    }
}
tasks.named("hello") {
    doLast {
        println("Hello Jupiter")
    }
}
build.gradle
tasks.register('hello') {
    doLast {
        println 'Hello Earth'
    }
}
tasks.named('hello') {
    doFirst {
        println 'Hello Venus'
    }
}
tasks.named('hello') {
    doLast {
        println 'Hello Mars'
    }
}
tasks.named('hello') {
    doLast {
        println 'Hello Jupiter'
    }
}
$ gradle -q hello
Hello Venus
Hello Earth
Hello Mars
Hello Jupiter
Tip
The doFirst and doLast methods can be called multiple times. They add an action to the beginning or the end of the task’s action list. When the task executes, the actions in the list are executed in order.

Here is an example of the named method being used to configure a task added by a plugin:

tasks.named("dokkaHtml") {
    outputDirectory.set(buildDir.resolve("dokka"))
}

Task types

Gradle tasks are a subclass of Task.

In the build script, the HelloTask class is created by extending DefaultTask:

build.gradle.kts
// Extend the DefaultTask class to create a HelloTask class
abstract class HelloTask : DefaultTask() {
    @TaskAction
    fun hello() {
        println("hello from HelloTask")
    }
}

// Register the hello Task with type HelloTask
tasks.register<HelloTask>("hello") {
    group = "Custom tasks"
    description = "A lovely greeting task."
}

The hello task is registered with the type HelloTask.

Executing our new hello task:

$ ./gradlew hello

> Task :app:hello
hello from HelloTask

Now the hello task is of type HelloTask instead of type Task.

The Gradle help task reveals the change:

$ ./gradlew help --task hello

> Task :help
Detailed task information for hello

Path
:app:hello

Type
HelloTask (Build_gradle$HelloTask)

Options
--rerun     Causes the task to be re-run even if up-to-date.

Description
A lovely greeting task.

Group
Custom tasks

Built-in task types

Gradle provides many built-in task types with common and popular functionality, such as copying or deleting files.

This example task copies *.war files from the source directory to the target directory using the Copy built-in task:

tasks.register("copyTask",Copy) {
    from("source")
    into("target")
    include("*.war")
}

There are many task types developers can take advantage of, including GroovyDoc, Zip, Jar, JacocoReport, Sign, or Delete, which are available in the DSL reference.
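Other built-in types follow the same pattern. For example, a Delete task could remove a generated directory (a minimal sketch; the directory shown is illustrative):

tasks.register<Delete>("cleanReports") {
    delete(layout.buildDirectory.dir("reports"))
}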

Next Step: Learn how to write Tasks >>

Writing Tasks

Gradle tasks are created by extending DefaultTask.

However, the generic DefaultTask provides no action for Gradle. If users want to extend the capabilities of Gradle and their build script, they must either use a built-in task or create a custom task:

  1. Built-in task - Gradle provides built-in utility tasks such as Copy, Jar, Zip, Delete, etc…​

  2. Custom task - Gradle allows users to subclass DefaultTask to create their own task types.

Create a task

The simplest and quickest way to create a custom task is in a build script:

To create a task, inherit from the DefaultTask class and implement a @TaskAction handler:

build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @TaskAction
    fun action() {
        val file = File("myfile.txt")
        file.createNewFile()
        file.writeText("HELLO FROM MY TASK")
    }
}

The CreateFileTask implements a simple set of actions. First, a file called "myfile.txt" is created in the main project. Then, some text is written to the file.

Register a task

A task is registered in the build script using the TaskContainer.register() method, which allows it to be then used in the build logic.

build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @TaskAction
    fun action() {
        val file = File("myfile.txt")
        file.createNewFile()
        file.writeText("HELLO FROM MY TASK")
    }
}

tasks.register<CreateFileTask>("createFileTask")

Task group and description

Setting the group and description properties on your tasks can help users understand how to use your task:

build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @TaskAction
    fun action() {
        val file = File("myfile.txt")
        file.createNewFile()
        file.writeText("HELLO FROM MY TASK")
    }
}

tasks.register<CreateFileTask>("createFileTask", ) {
    group = "custom"
    description = "Create myfile.txt in the current directory"
}

Once a task is added to a group, it is visible when listing tasks.
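For example, the tasks report can be limited to the group used above:

$ ./gradlew tasks --group="custom"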

Task inputs and outputs

For a task to do useful work, it typically needs inputs, and it typically produces outputs.

build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @Input
    val fileText = "HELLO FROM MY TASK"

    @Input
    val fileName = "myfile.txt"

    @OutputFile
    val myFile: File = File(fileName)

    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText(fileText)
    }
}

tasks.register<CreateFileTask>("createFileTask") {
    group = "custom"
    description = "Create myfile.txt in the current directory"
}

Configure a task

A task is optionally configured in a build script using the TaskCollection.named() method.

The CreateFileTask class is updated so that the text in the file is configurable:

build.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @get:Input
    abstract val fileText: Property<String>

    @Input
    val fileName = "myfile.txt"

    @OutputFile
    val myFile: File = File(fileName)

    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText(fileText.get())
    }
}

tasks.register<CreateFileTask>("createFileTask") {
    group = "custom"
    description = "Create myfile.txt in the current directory"
    fileText.convention("HELLO FROM THE CREATE FILE TASK METHOD") // Set convention
}

tasks.named<CreateFileTask>("createFileTask") {
    fileText.set("HELLO FROM THE NAMED METHOD") // Override with custom message
}

In the named() method, we find the createFileTask task and set the text that will be written to the file.

When the task is executed:

$ ./gradlew createFileTask

> Configure project :app

> Task :app:createFileTask

BUILD SUCCESSFUL in 5s
2 actionable tasks: 1 executed, 1 up-to-date

A text file called myfile.txt is created in the project root folder:

myfile.txt
HELLO FROM THE NAMED METHOD

Consult the Developing Gradle Tasks chapter to learn more.

Next Step: Learn how to use Plugins >>

Using Plugins

Much of Gradle’s functionality is delivered via plugins, including core plugins distributed with Gradle, third-party plugins, and script plugins defined within builds.

Plugins introduce new tasks (e.g., JavaCompile), domain objects (e.g., SourceSet), conventions (e.g., locating Java source at src/main/java), and extend core or other plugin objects.

Plugins in Gradle are essential for automating common build tasks, integrating with external tools or services, and tailoring the build process to meet specific project needs. They also serve as the primary mechanism for organizing build logic.

Benefits of plugins

Writing many tasks and duplicating configuration blocks in build scripts can get messy. Plugins offer several advantages over adding logic directly to the build script:

  • Promotes Reusability: Reduces the need to duplicate similar logic across projects.

  • Enhances Modularity: Allows for a more modular and organized build script.

  • Encapsulates Logic: Keeps imperative logic separate, enabling more declarative build scripts.

Plugin distribution

You can leverage plugins from Gradle and the Gradle community or create your own.

Plugins are available in three ways:

  1. Core plugins - Gradle develops and maintains a set of Core Plugins.

  2. Community plugins - Gradle plugins shared in a remote repository such as Maven or the Gradle Plugin Portal.

  3. Local plugins - Gradle enables users to create custom plugins using APIs.

Types of plugins

Plugins can be implemented as binary plugins, precompiled script plugins, or script plugins:

Binary Plugins

Binary plugins are compiled plugins, typically written in Java or Kotlin, that are packaged as JAR files. They are applied to a project using the plugins {} block. They offer better performance and maintainability compared to script plugins or precompiled script plugins.

Precompiled Script Plugins

Precompiled script plugins are Groovy DSL or Kotlin DSL scripts compiled and distributed as Java class files packaged in a library. They are applied to a project using the plugins {} block. They provide a way to reuse complex logic across projects and allow for better organization of build logic.

Script Plugins

Script plugins are Groovy DSL or Kotlin DSL scripts that are applied directly to a Gradle build script using the apply from: syntax. They are applied inline within a build script to add functionality or customize the build process. They are simple to use.

A plugin often starts as a script plugin (because they are easy to write). Then, as the code becomes more valuable, it’s migrated to a binary plugin that can be easily tested and shared between multiple projects or organizations.

Using plugins

To use the build logic encapsulated in a plugin, Gradle needs to perform two steps. First, it needs to resolve the plugin, and then it needs to apply the plugin to the target, usually a Project.

  1. Resolving a plugin means finding the correct version of the JAR that contains a given plugin and adding it to the script classpath. Once a plugin is resolved, its API can be used in a build script. Script plugins are self-resolving in that they are resolved from the specific file path or URL provided when applying them. Core binary plugins provided as part of the Gradle distribution are automatically resolved.

  2. Applying a plugin means executing the plugin’s Plugin.apply(T) on a project.

The plugins DSL is recommended to resolve and apply plugins in one step.

Resolving plugins

Gradle provides the core plugins (e.g., JavaPlugin, GroovyPlugin, MavenPublishPlugin, etc.) as part of its distribution, which means they are automatically resolved.

Core plugins are applied in a build script using the plugin name:

plugins {
    id «plugin name»
}

For example:

build.gradle
plugins {
    id("java")
}

Non-core plugins must be resolved before they can be applied. Non-core plugins are identified by a unique ID and a version in the build file:

plugins {
    id «plugin id» version «plugin version»
}

And the location of the plugin must be specified in the settings file:

settings.gradle
pluginManagement {
    repositories {
        gradlePluginPortal()
        maven {
            url 'https://maven.example.com/plugins'
        }
    }
}

There are additional considerations for resolving and applying plugins:

# | To | Use | For example

1

Apply a core, community or local plugin to a specific project.

The plugins block in the build file

plugins {
  id("org.barfuin.gradle.taskinfo") version "2.1.0"
}

2

Apply a common core, community, or local plugin to multiple subprojects.

A build script in the buildSrc directory

plugins {
    id("org.barfuin.gradle.taskinfo") version "2.1.0"
}
repositories {
    mavenCentral()
}
dependencies {
    implementation(Libs.Kotlin.coroutines)
}

3

Apply a core, community or local plugin needed for the build script itself.

The buildscript block in the build file

buildscript {
  repositories {
    maven {
      url = uri("https://plugins.gradle.org/m2/")
    }
  }
  dependencies {
    classpath("org.barfuin.gradle.taskinfo:gradle-taskinfo:2.1.0")
  }
}
plugins {
  id("org.barfuin.gradle.taskinfo") version "2.1.0"
}

4

Apply local script plugins.

The legacy apply() method in the build file

apply(plugin = "org.barfuin.gradle.taskinfo")
apply<MyPlugin>()

1. Applying plugins using the plugins{} block

The plugin DSL provides a concise and convenient way to declare plugin dependencies.

The plugins block configures an instance of PluginDependenciesSpec:

plugins {
    application                                     // by name
    java                                            // by name
    id("java")                                      // by id - recommended
    id("org.jetbrains.kotlin.jvm") version "1.9.0"  // by id - recommended
}

Core Gradle plugins are unique in that they provide short names, such as java for the core JavaPlugin.

To apply a core plugin, the short name can be used:

build.gradle.kts
plugins {
    java
}
build.gradle
plugins {
    id 'java'
}

All other binary plugins must use the fully qualified form of the plugin id (e.g., com.github.foo.bar).

To apply a community plugin from the Gradle Plugin Portal, the fully qualified plugin id, a globally unique identifier, must be used:

build.gradle.kts
plugins {
    id("org.springframework.boot") version "3.3.1"
}
build.gradle
plugins {
    id 'org.springframework.boot' version '3.3.1'
}

See PluginDependenciesSpec for more information on using the Plugin DSL.

Limitations of the plugins DSL

The plugins DSL provides a convenient syntax for users and the ability for Gradle to determine which plugins are used quickly. This allows Gradle to:

  • Optimize the loading and reuse of plugin classes.

  • Provide editors with detailed information about the potential properties and values in the build script.

However, the DSL requires that plugins be defined statically.

There are some key differences between the plugins {} block mechanism and the "traditional" apply() method mechanism. There are also some constraints and possible limitations.

Constrained Syntax

The plugins {} block does not support arbitrary code.

It is constrained to be idempotent (produce the same result every time) and side effect-free (safe for Gradle to execute at any time).

The form is:

build.gradle.kts
plugins {
    id(«plugin id»)                             // (1)
    id(«plugin id») version «plugin version»    // (2)
}
  1. for core Gradle plugins or plugins already available to the build script

  2. for binary Gradle plugins that need to be resolved

build.gradle
plugins {
    id «plugin id»                            // (1)
    id «plugin id» version «plugin version»   // (2)
}
  1. for core Gradle plugins or plugins already available to the build script

  2. for binary Gradle plugins that need to be resolved

Where «plugin id» and «plugin version» must be constant, literal strings.

The plugins{} block must also be a top-level statement in the build script. It cannot be nested inside another construct (e.g., an if-statement or for-loop).

Only in build scripts and settings file

The plugins{} block can only be used in a project’s build script build.gradle(.kts) and the settings.gradle(.kts) file. It must appear before any other block. It cannot be used in script plugins or init scripts.
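For example, a minimal build script keeps the plugins{} block first, with all other configuration after it (a sketch; the java plugin and repository are just examples):

build.gradle.kts
plugins {
    id("java")
}

// All other configuration follows the plugins block
repositories {
    mavenCentral()
}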

Applying plugins to all subprojects

In a multi-project build, you probably want to apply plugins to some or all of the subprojects, but not to the root project.

While the default behavior of the plugins{} block is to immediately resolve and apply the plugins, you can use the apply false syntax to tell Gradle not to apply the plugin to the current project. Then, use the plugins{} block without the version in subprojects' build scripts:

settings.gradle.kts
include("hello-a")
include("hello-b")
include("goodbye-c")
build.gradle.kts
plugins {
    id("com.example.hello") version "1.0.0" apply false
    id("com.example.goodbye") version "1.0.0" apply false
}
hello-a/build.gradle.kts
plugins {
    id("com.example.hello")
}
hello-b/build.gradle.kts
plugins {
    id("com.example.hello")
}
goodbye-c/build.gradle.kts
plugins {
    id("com.example.goodbye")
}
settings.gradle
include 'hello-a'
include 'hello-b'
include 'goodbye-c'
build.gradle
plugins {
    id 'com.example.hello' version '1.0.0' apply false
    id 'com.example.goodbye' version '1.0.0' apply false
}
hello-a/build.gradle
plugins {
    id 'com.example.hello'
}
hello-b/build.gradle
plugins {
    id 'com.example.hello'
}
goodbye-c/build.gradle
plugins {
    id 'com.example.goodbye'
}

You can also encapsulate the versions of external plugins by composing the build logic using your own convention plugins.

2. Applying plugins from the buildSrc directory

buildSrc is an optional directory at the Gradle project root that contains build logic (i.e., plugins) used in building the main project. You can apply plugins that reside in a project’s buildSrc directory as long as they have a defined ID.

The following example shows how to tie the plugin implementation class my.MyPlugin, defined in buildSrc, to the id "my-plugin":

buildSrc/build.gradle.kts
plugins {
    `java-gradle-plugin`
}

gradlePlugin {
    plugins {
        create("myPlugins") {
            id = "my-plugin"
            implementationClass = "my.MyPlugin"
        }
    }
}
buildSrc/build.gradle
plugins {
    id 'java-gradle-plugin'
}

gradlePlugin {
    plugins {
        myPlugins {
            id = 'my-plugin'
            implementationClass = 'my.MyPlugin'
        }
    }
}

The plugin can then be applied by ID:

build.gradle.kts
plugins {
    id("my-plugin")
}
build.gradle
plugins {
    id 'my-plugin'
}

3. Applying plugins using the buildscript{} block

The buildscript block is used for:

  1. global dependencies and repositories required for building the project (applied in the subprojects).

  2. declaring which plugins are available for use in the build script (in the build.gradle(.kts) file itself).

So when you want to use a library in the build script itself, you must add this library to the script classpath using the buildscript{} block:

import org.apache.commons.codec.binary.Base64

buildscript {
    repositories {  // this is where the plugins are located
        mavenCentral()
        google()
    }
    dependencies { // these are the plugins that can be used in subprojects or in the build file itself
        classpath group: 'commons-codec', name: 'commons-codec', version: '1.2' // used in the task below
        classpath 'com.android.tools.build:gradle:4.1.0' // used in subproject
    }
}

tasks.register('encode') {
    doLast {
        byte[] encodedString = new Base64().encode('hello world\n'.getBytes())
        println new String(encodedString)
    }
}

You can then apply a plugin that was added to the build script classpath in the subproject that needs it:

plugins {
    id 'com.android.application'
}

Binary plugins published as external jar files can be added to a project by adding the plugin to the build script classpath and then applying the plugin.

External jars can be added to the build script classpath using the buildscript{} block as described in External dependencies for the build script:

build.gradle.kts
buildscript {
    repositories {
        gradlePluginPortal()
    }
    dependencies {
        classpath("org.springframework.boot:spring-boot-gradle-plugin:3.3.1")
    }
}

apply(plugin = "org.springframework.boot")
build.gradle
buildscript {
    repositories {
        gradlePluginPortal()
    }
    dependencies {
        classpath 'org.springframework.boot:spring-boot-gradle-plugin:3.3.1'
    }
}

apply plugin: 'org.springframework.boot'

4. Applying script plugins using the legacy apply() method

A script plugin is an ad-hoc plugin, typically written and applied in the same build script. It is applied using the legacy apply() method:

class MyPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        println("Plugin ${this.javaClass.simpleName} applied on ${project.name}")
    }
}

apply<MyPlugin>()

Let’s take a rudimentary example of a plugin written in a file called other.gradle located in the same directory as the build.gradle file:

public class Other implements Plugin<Project> {
    @Override
    void apply(Project project) {
        // Does something
    }
}

First, import the external file using:

apply from: 'other.gradle'

Then you can apply it:

apply plugin: Other

Script plugins are automatically resolved and can be applied from a script on the local filesystem or remotely:

build.gradle.kts
apply(from = "other.gradle.kts")
build.gradle
apply from: 'other.gradle'

Filesystem locations are relative to the project directory, while remote script locations are specified with an HTTP URL. Multiple script plugins (of either form) can be applied to a given target.
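For example, a script plugin could be applied from a remote location (the URL below is a placeholder):

build.gradle.kts
apply(from = "https://example.com/other.gradle.kts")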

Plugin Management

The pluginManagement{} block is used to configure repositories for plugin resolution and to define version constraints for plugins that are applied in the build scripts.

The pluginManagement{} block can be used in a settings.gradle(.kts) file, where it must be the first block in the file:

settings.gradle.kts
pluginManagement {
    plugins {
    }
    resolutionStrategy {
    }
    repositories {
    }
}
rootProject.name = "plugin-management"
settings.gradle
pluginManagement {
    plugins {
    }
    resolutionStrategy {
    }
    repositories {
    }
}
rootProject.name = 'plugin-management'

The block can also be used in an initialization script:

init.gradle.kts
settingsEvaluated {
    pluginManagement {
        plugins {
        }
        resolutionStrategy {
        }
        repositories {
        }
    }
}
init.gradle
settingsEvaluated { settings ->
    settings.pluginManagement {
        plugins {
        }
        resolutionStrategy {
        }
        repositories {
        }
    }
}
Custom Plugin Repositories

By default, the plugins{} DSL resolves plugins from the public Gradle Plugin Portal.

Many build authors would also like to resolve plugins from private Maven or Ivy repositories because they contain proprietary implementation details or to have more control over what plugins are available to their builds.

To specify custom plugin repositories, use the repositories{} block inside pluginManagement{}:

settings.gradle.kts
pluginManagement {
    repositories {
        maven(url = "./maven-repo")
        gradlePluginPortal()
        ivy(url = "./ivy-repo")
    }
}
settings.gradle
pluginManagement {
    repositories {
        maven {
            url './maven-repo'
        }
        gradlePluginPortal()
        ivy {
            url './ivy-repo'
        }
    }
}

This tells Gradle to first look in the Maven repository at ./maven-repo when resolving plugins and then to check the Gradle Plugin Portal if the plugins are not found in the Maven repository. If you don’t want the Gradle Plugin Portal to be searched, omit the gradlePluginPortal() line. Finally, the Ivy repository at ./ivy-repo will be checked.

Plugin Version Management

A plugins{} block inside pluginManagement{} allows all plugin versions for the build to be defined in a single location. Plugins can then be applied by id to any build script via the plugins{} block.

One benefit of setting plugin versions this way is that the pluginManagement.plugins{} does not have the same constrained syntax as the build script plugins{} block. This allows plugin versions to be taken from gradle.properties, or loaded via another mechanism.

Managing plugin versions via pluginManagement:

settings.gradle.kts
pluginManagement {
  val helloPluginVersion: String by settings
  plugins {
    id("com.example.hello") version "${helloPluginVersion}"
  }
}
build.gradle.kts
plugins {
    id("com.example.hello")
}
gradle.properties
helloPluginVersion=1.0.0
settings.gradle
pluginManagement {
  plugins {
        id 'com.example.hello' version "${helloPluginVersion}"
    }
}
build.gradle
plugins {
    id 'com.example.hello'
}
gradle.properties
helloPluginVersion=1.0.0

The plugin version is loaded from gradle.properties and configured in the settings script, allowing the plugin to be added to any project without specifying the version.

Plugin Resolution Rules

Plugin resolution rules allow you to modify plugin requests made in plugins{} blocks, e.g., changing the requested version or explicitly specifying the implementation artifact coordinates.

To add resolution rules, use the resolutionStrategy{} inside the pluginManagement{} block:

settings.gradle.kts
pluginManagement {
    resolutionStrategy {
        eachPlugin {
            if (requested.id.namespace == "com.example") {
                useModule("com.example:sample-plugins:1.0.0")
            }
        }
    }
    repositories {
        maven {
            url = uri("./maven-repo")
        }
        gradlePluginPortal()
        ivy {
            url = uri("./ivy-repo")
        }
    }
}
settings.gradle
pluginManagement {
    resolutionStrategy {
        eachPlugin {
            if (requested.id.namespace == 'com.example') {
                useModule('com.example:sample-plugins:1.0.0')
            }
        }
    }
    repositories {
        maven {
            url './maven-repo'
        }
        gradlePluginPortal()
        ivy {
            url './ivy-repo'
        }
    }
}

This tells Gradle to use the specified plugin implementation artifact instead of its built-in default mapping from plugin ID to Maven/Ivy coordinates.

Custom Maven and Ivy plugin repositories must contain plugin marker artifacts and the artifacts that implement the plugin. Read Gradle Plugin Development Plugin for more information on publishing plugins to custom repositories.

See PluginManagementSpec for complete documentation for using the pluginManagement{} block.

Plugin Marker Artifacts

Since the plugins{} DSL block only allows for declaring plugins by their globally unique plugin id and version properties, Gradle needs a way to look up the coordinates of the plugin implementation artifact.

To do so, Gradle will look for a Plugin Marker Artifact with the coordinates plugin.id:plugin.id.gradle.plugin:plugin.version. This marker needs to have a dependency on the actual plugin implementation. Publishing these markers is automated by the java-gradle-plugin.
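For instance, for a plugin with the id com.example.hello published at version 1.0.0, the marker artifact has the coordinates com.example.hello:com.example.hello.gradle.plugin:1.0.0.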

For example, the following complete sample from the sample-plugins project shows how to publish a com.example.hello plugin and a com.example.goodbye plugin to both an Ivy and Maven repository using the combination of the java-gradle-plugin, the maven-publish plugin, and the ivy-publish plugin.

build.gradle.kts
plugins {
    `java-gradle-plugin`
    `maven-publish`
    `ivy-publish`
}

group = "com.example"
version = "1.0.0"

gradlePlugin {
    plugins {
        create("hello") {
            id = "com.example.hello"
            implementationClass = "com.example.hello.HelloPlugin"
        }
        create("goodbye") {
            id = "com.example.goodbye"
            implementationClass = "com.example.goodbye.GoodbyePlugin"
        }
    }
}

publishing {
    repositories {
        maven {
            url = uri(layout.buildDirectory.dir("maven-repo"))
        }
        ivy {
            url = uri(layout.buildDirectory.dir("ivy-repo"))
        }
    }
}
build.gradle
plugins {
    id 'java-gradle-plugin'
    id 'maven-publish'
    id 'ivy-publish'
}

group 'com.example'
version '1.0.0'

gradlePlugin {
    plugins {
        hello {
            id = 'com.example.hello'
            implementationClass = 'com.example.hello.HelloPlugin'
        }
        goodbye {
            id = 'com.example.goodbye'
            implementationClass = 'com.example.goodbye.GoodbyePlugin'
        }
    }
}

publishing {
    repositories {
        maven {
            url layout.buildDirectory.dir("maven-repo")
        }
        ivy {
            url layout.buildDirectory.dir("ivy-repo")
        }
    }
}

Running gradle publish in the sample directory creates the following Maven repository layout (the Ivy layout is similar):

[Image: Maven repository layout containing the plugin marker artifacts]

Legacy Plugin Application

With the introduction of the plugins DSL, users should have little reason to use the legacy method of applying plugins. It is documented here in case a build author cannot use the plugin DSL due to restrictions in how it currently works.

build.gradle.kts
apply(plugin = "java")
build.gradle
apply plugin: 'java'

Plugins can be applied using a plugin id. In the above case, we are using the short name "java" to apply the JavaPlugin.

Rather than using a plugin id, plugins can also be applied by simply specifying the class of the plugin:

build.gradle.kts
apply<JavaPlugin>()
build.gradle
apply plugin: JavaPlugin

The JavaPlugin symbol in the above sample refers to the org.gradle.api.plugins.JavaPlugin class. It does not strictly need to be imported, as the org.gradle.api.plugins package is automatically imported in all build scripts (see Default imports).

Note that in Kotlin, class literals use the ::class suffix (for example, JavaPlugin::class) rather than Java’s .class, while in Groovy it is unnecessary to append .class at all.

Using a Version Catalog

When a project uses a version catalog, plugins can be referenced via aliases when applied.

Let’s take a look at a simple Version Catalog:

gradle/libs.versions.toml
[versions]
intellij-plugin = "1.6"

[plugins]
jetbrains-intellij = { id = "org.jetbrains.intellij", version.ref = "intellij-plugin" }

Then a plugin can be applied to any build script using the alias method:

build.gradle.kts
plugins {
    alias(libs.plugins.jetbrains.intellij)
}
Tip
jetbrains-intellij is available as the Gradle generated safe accessor: jetbrains.intellij.

Writing Plugins

If Gradle or the Gradle community does not offer the specific capabilities your project needs, creating your own plugin could be a solution.

Additionally, if you find yourself duplicating build logic across subprojects and need a better way to organize it, custom plugins can help.

Custom plugin

A plugin is any class that implements the Plugin interface.

To create a "hello world" plugin:

import org.gradle.api.Plugin
import org.gradle.api.Project

abstract class SamplePlugin : Plugin<Project> { // (1)
    override fun apply(project: Project) {      // (2)
        project.tasks.create("SampleTask") {
            println("Hello world!")
        }
    }
}
  1. Extend the org.gradle.api.Plugin interface.

  2. Override the apply method.

1. Extend the org.gradle.api.Plugin interface

Create a class that extends the Plugin interface.

abstract class MyCreateFilePlugin : Plugin<Project> {
    override fun apply(project: Project) {}
}
2. Override the apply method

Add tasks and other logic in the apply() method.

When SamplePlugin is applied in your project, Gradle calls the apply() method you defined, which adds the SampleTask to your project.

You can then apply the plugin in your build script:

build.gradle.kts
import org.gradle.api.Plugin
import org.gradle.api.Project

plugins {
    application
}

//
// More build script logic
//

abstract class SamplePlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.tasks.register("createFileTask") {
            val fileText = "HELLO FROM MY PLUGIN"
            val myFile = File("myfile.txt")
            myFile.createNewFile()
            myFile.writeText(fileText)
        }
    }
}

apply<SamplePlugin>()   // (1)
  1. Apply the SamplePlugin.

Note that this is a simple hello-world example and does not reflect best practices.

Important
Script plugins are not recommended. Plugin code should not be in your build.gradle(.kts) file.

Plugins should always be written as pre-compiled script plugins, convention plugins or binary plugins.

Pre-compiled script plugin

Pre-compiled script plugins offer an easy way to rapidly prototype and experiment. They let you package build logic as *.gradle(.kts) script files using the Groovy or Kotlin DSL. These scripts reside in specific directories, such as src/main/groovy or src/main/kotlin.

To apply one, simply use its ID, which is derived from the script filename (without the .gradle(.kts) extension). You can think of the file itself as the plugin, so you do not need to subclass the Plugin interface in a precompiled script plugin.

Let’s take a look at an example with the following structure:

└── buildSrc
    ├── build.gradle.kts
    └── src
       └── main
          └── kotlin
             └── my-create-file-plugin.gradle.kts

Our my-create-file-plugin.gradle.kts file contains the following code:

buildSrc/src/main/kotlin/my-create-file-plugin.gradle.kts
abstract class CreateFileTask : DefaultTask() {
    @get:Input
    abstract val fileText: Property<String>

    @Input
    val fileName = "myfile.txt"

    @OutputFile
    val myFile: File = File(fileName)

    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText(fileText.get())
    }
}

tasks.register("createFileTask", CreateFileTask::class) {
    group = "from my plugin"
    description = "Create myfile.txt in the current directory"
    fileText.set("HELLO FROM MY PLUGIN")
}

And the buildSrc build file contains the following:

buildSrc/build.gradle.kts
plugins {
    `kotlin-dsl`
}

The pre-compiled script can now be applied in the build.gradle(.kts) file of any subproject:

plugins {
    id("my-create-file-plugin")  // Apply the plugin
}

The createFileTask task from the plugin is now available in your subproject.

Convention Plugins

Convention plugins are a way to encapsulate and reuse common build logic in Gradle. They allow you to define a set of conventions for a project, and then apply those conventions to other projects or modules.

The example above has been re-written as a convention plugin as a Kotlin script called MyConventionPlugin.kt and stored in buildSrc:

buildSrc/src/main/kotlin/MyConventionPlugin.kt
import org.gradle.api.DefaultTask
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.OutputFile
import org.gradle.api.tasks.TaskAction
import java.io.File

abstract class CreateFileTask : DefaultTask() {
    @get:Input
    abstract val fileText: Property<String>

    @Input
    val fileName = project.rootDir.toString() + "/myfile.txt"

    @OutputFile
    val myFile: File = File(fileName)

    @TaskAction
    fun action() {
        myFile.createNewFile()
        myFile.writeText(fileText.get())
    }
}

class MyConventionPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.tasks.register("createFileTask", CreateFileTask::class.java) {
            group = "from my plugin"
            description = "Create myfile.txt in the current directory"
            fileText.set("HELLO FROM MY PLUGIN")
        }
    }
}

The plugin can be given an id using a gradlePlugin{} block so that it can be referenced in the root:

buildSrc/build.gradle.kts
gradlePlugin {
    plugins {
        create("my-convention-plugin") {
            id = "my-convention-plugin"
            implementationClass = "MyConventionPlugin"
        }
    }
}

The gradlePlugin{} block defines the plugins being built by the project. With the newly created id, the plugin can be applied in other build scripts accordingly:

build.gradle.kts
plugins {
    application
    id("my-convention-plugin") // Apply the plugin
}

Binary Plugins

A binary plugin is a plugin that is implemented in a compiled language and is packaged as a JAR file. It is resolved as a dependency rather than compiled from source.

Convention plugins typically change infrequently. Having each developer compile the plugin as part of every build is wasteful; instead, it can be distributed as a binary dependency.

There are two ways to update the convention plugin in the example above into a binary plugin.

  1. Use composite builds:

    settings.gradle.kts
    includeBuild("my-plugin")
  2. Publish the plugin to a repository:

    build.gradle.kts
    plugins {
        id("com.gradle.plugin.myconventionplugin") version "1.0.0"
    }

Consult the Developing Plugins chapter to learn more.

STRUCTURING BUILDS

Structuring Projects with Gradle

It is important to structure your Gradle project to optimize build performance. A multi-project build is the standard in Gradle.


A multi-project build consists of one root project and one or more subprojects. Gradle can build the root project and any number of the subprojects in a single execution.

Project locations

Multi-project builds contain a single root project in a directory that Gradle views as the root path: ..

Subprojects are located physically under the root path: ./subproject.

A subproject has a path, which denotes the position of that subproject in the multi-project build. In most cases, the project path is consistent with its location in the file system.

The project structure is created in the settings.gradle(.kts) file. The settings file must be present in the root directory.

A simple multi-project build

Let’s look at a basic multi-project build example that contains a root project and a single subproject.

The root project is called basic-multiproject, located somewhere on your machine. From Gradle’s perspective, the root is the top-level directory ..

The project contains a single subproject called ./app:

.
├── app
│   ...
│   └── build.gradle.kts
└── settings.gradle.kts
.
 app
   ...
    build.gradle
 settings.gradle

This is the recommended project structure for starting any Gradle project. The build init plugin also generates skeleton projects that follow this structure: a root project with a single subproject.

The settings.gradle(.kts) file describes the project structure to Gradle:

settings.gradle.kts
rootProject.name = "basic-multiproject"
include("app")
settings.gradle
rootProject.name = 'basic-multiproject'
include 'app'

In this case, Gradle will look for a build file for the app subproject in the ./app directory.

You can view the structure of a multi-project build by running the projects command:

$ ./gradlew -q projects

Projects:

------------------------------------------------------------
Root project 'basic-multiproject'
------------------------------------------------------------

Root project 'basic-multiproject'
\--- Project ':app'

To see a list of the tasks of a project, run gradle <project-path>:tasks
For example, try running gradle :app:tasks

In this example, the app subproject is a Java application that applies the application plugin and configures the main class. The application prints Hello World to the console:

app/build.gradle.kts
plugins {
    id("application")
}

application {
    mainClass = "com.example.Hello"
}
app/build.gradle
plugins {
    id 'application'
}

application {
    mainClass = 'com.example.Hello'
}
app/src/main/java/com/example/Hello.java
package com.example;

public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, world!");
    }
}

You can run the application by executing the run task from the application plugin in the project root:

$ ./gradlew -q run
Hello, world!

Adding a subproject

In the settings file, you can use the include method to add another subproject to the root project:

settings.gradle.kts
include("project1", "project2:child1", "project3:child1")
settings.gradle
include 'project1', 'project2:child1', 'project3:child1'

The include method takes project paths as arguments. The project path is assumed to be equal to the relative physical file system path. For example, a path services:api is mapped by default to a folder ./services/api (relative to the project root .).

More examples of how to work with the project path can be found in the DSL documentation of Settings.include(java.lang.String[]).

Let’s add another subproject called lib to the previously created project.

All we need to do is add another include statement in the root settings file:

settings.gradle.kts
rootProject.name = "basic-multiproject"
include("app")
include("lib")
settings.gradle
rootProject.name = 'basic-multiproject'
include 'app'
include 'lib'

Gradle will then look for the build file of the new lib subproject in the ./lib/ directory:

.
├── app
│   ...
│   └── build.gradle.kts
├── lib
│   ...
│   └── build.gradle.kts
└── settings.gradle.kts
.
 app
   ...
    build.gradle
 lib
   ...
    build.gradle
 settings.gradle

Project Descriptors

To further describe the project architecture to Gradle, the settings file provides project descriptors.

You can modify these descriptors in the settings file at any time.

To access a descriptor, you can:

settings.gradle.kts
include("project-a")
println(rootProject.name)
println(project(":project-a").name)
settings.gradle
include('project-a')
println rootProject.name
println project(':project-a').name

Using this descriptor, you can change the name, project directory, and build file of a project:

settings.gradle.kts
rootProject.name = "main"
include("project-a")
project(":project-a").projectDir = file("custom/my-project-a")
project(":project-a").buildFileName = "project-a.gradle.kts"
settings.gradle
rootProject.name = 'main'
include('project-a')
project(':project-a').projectDir = file('custom/my-project-a')
project(':project-a').buildFileName = 'project-a.gradle'

Consult the ProjectDescriptor class in the API documentation for more information.

Modifying a subproject path

Let’s take a hypothetical project with the following structure:

.
├── app
│   ...
│   └── build.gradle.kts
├── subs // Gradle may see this as a subproject
│   └── web // Gradle may see this as a subproject
│       └── my-web-module // Intended subproject
│           ...
│           └── build.gradle.kts
└── settings.gradle.kts
.
 app
   ...
    build.gradle
 subs // Gradle may see this as a subproject
    web // Gradle may see this as a subproject
        my-web-module // Intended subproject
           ...
            build.gradle
 settings.gradle

If your settings.gradle(.kts) looks like this:

include(':subs:web:my-web-module')

Gradle sees a subproject with a logical project name of :subs:web:my-web-module and two, possibly unintentional, other subprojects logically named :subs and :subs:web. This can lead to phantom build directories, especially when using allprojects{} or subprojects{}.

To avoid this, you can use:

include(':my-web-module')
project(':my-web-module').projectDir = file('subs/web/my-web-module')

So that you only end up with a single subproject named :my-web-module.

So, while the physical project layout is the same, the logical results are different.

Naming recommendations

As your project grows, naming and consistency get increasingly more important. To keep your builds maintainable, we recommend the following:

  1. Keep default project names for subprojects: It is possible to configure custom project names in the settings file. However, it’s an unnecessary extra effort for the developers to track which projects belong to what folders.

  2. Use lower case hyphenation for all project names: All letters are lowercase, and words are separated with a dash (-) character.

  3. Define the root project name in the settings file: The rootProject.name effectively assigns a name to the build, which is used in reports like Build Scans. If the root project name is not set, Gradle uses the name of the containing directory, which can be unstable (the project can be checked out into any directory). If the root project name is not set and the build is checked out at a file system root (e.g., / or C:\), the name is generated randomly.

Declaring Dependencies between Subprojects

What if one subproject depends on another subproject? What if one project needs the artifact produced by another project?


This is a common use case for multi-project builds. Gradle offers project dependencies for this.

Depending on another project

Let’s explore a theoretical multi-project build with the following layout:

.
├── api
│   ├── src
│   │   └──...
│   └── build.gradle.kts
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle.kts
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle.kts
└── settings.gradle.kts
.
 api
    src
      ...
    build.gradle
 services
    person-service
        src
          ...
        build.gradle
 shared
    src
      ...
    build.gradle
 settings.gradle

In this example, there are three subprojects called shared, api, and person-service:

  1. The person-service subproject depends on the other two subprojects, shared and api.

  2. The api subproject depends on the shared subproject.

We use the : separator to define a project path such as services:person-service or :shared. Consult the DSL documentation of Settings.include(java.lang.String[]) for more information about defining project paths.

settings.gradle.kts
rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")
shared/build.gradle.kts
plugins {
    id("java")
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("junit:junit:4.13")
}
api/build.gradle.kts
plugins {
    id("java")
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("junit:junit:4.13")
    implementation(project(":shared"))
}
services/person-service/build.gradle.kts
plugins {
    id("java")
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("junit:junit:4.13")
    implementation(project(":shared"))
    implementation(project(":api"))
}
settings.gradle
rootProject.name = 'basic-dependencies'
include 'api', 'shared', 'services:person-service'
shared/build.gradle
plugins {
    id 'java'
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "junit:junit:4.13"
}
api/build.gradle
plugins {
    id 'java'
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "junit:junit:4.13"
    implementation project(':shared')
}
services/person-service/build.gradle
plugins {
    id 'java'
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "junit:junit:4.13"
    implementation project(':shared')
    implementation project(':api')
}

A project dependency affects execution order. It causes the other project to be built first and adds the output with the classes of the other project to the classpath. It also adds the dependencies of the other project to the classpath.

If you execute ./gradlew :api:compileJava, the shared project is built first, and then the api project is built.

Depending on artifacts produced by another project

Sometimes, you might want to depend on the output of a specific task within another project rather than the entire project. However, explicitly declaring a task dependency from one project to another is discouraged as it introduces unnecessary coupling between tasks.

The recommended way to model dependencies, where a task in one project depends on the output of another, is to produce the output and mark it as an "outgoing" artifact. Gradle’s dependency management engine allows you to share arbitrary artifacts between projects and build them on demand.
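As a simplified sketch (the producer and consumer project names, configuration names, and task names below are illustrative, and the full variant-aware approach additionally uses attributes), a producer project can expose a task output on a consumable configuration, and a consumer project can resolve that configuration:

producer/build.gradle.kts
// Produce the artifact to share: an archive of the (assumed) reports directory
val packageReports by tasks.registering(Zip::class) {
    from(layout.buildDirectory.dir("reports"))
    archiveFileName.set("reports.zip")
    destinationDirectory.set(layout.buildDirectory.dir("dist"))
}

// A consumable configuration that exposes the archive to other projects
val reportElements by configurations.creating {
    isCanBeConsumed = true
    isCanBeResolved = false
}

artifacts {
    add(reportElements.name, packageReports)
}

consumer/build.gradle.kts
// A resolvable configuration used to pull in the artifact from the producer
val incomingReports by configurations.creating {
    isCanBeConsumed = false
    isCanBeResolved = true
}

dependencies {
    incomingReports(project(":producer", "reportElements"))
}

// Copying from the configuration builds the producer archive on demand
tasks.register<Copy>("copyReports") {
    from(incomingReports)
    into(layout.buildDirectory.dir("producer-reports"))
}

Because the consumer depends on the configuration rather than on a task, Gradle builds the producer’s archive only when the consumer actually needs it.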

Sharing Build Logic between Subprojects

Subprojects in a multi-project build typically share some common dependencies.


Instead of copying and pasting the same Java version and libraries in each subproject build script, Gradle provides a special directory for storing shared build logic that can be automatically applied to subprojects.

Share logic in buildSrc

buildSrc is a Gradle-recognized and protected directory which comes with some benefits:

  1. Reusable Build Logic:

    buildSrc allows you to organize and centralize your custom build logic, tasks, and plugins in a structured manner. The code written in buildSrc can be reused across your project, making it easier to maintain and share common build functionality.

  2. Isolation from the Main Build:

    Code placed in buildSrc is isolated from the other build scripts of your project. This helps keep the main build scripts cleaner and more focused on project-specific configurations.

  3. Automatic Compilation and Classpath:

    The contents of the buildSrc directory are automatically compiled and included in the classpath of your main build. This means that classes and plugins defined in buildSrc can be directly used in your project’s build scripts without any additional configuration.

  4. Ease of Testing:

    Since buildSrc is a separate build, it allows for easy testing of your custom build logic. You can write tests for your build code, ensuring that it behaves as expected.

  5. Gradle Plugin Development:

    If you are developing custom Gradle plugins for your project, buildSrc is a convenient place to house the plugin code. This makes the plugins easily accessible within your project.

The buildSrc directory is treated as an included build.

For multi-project builds, there can be only one buildSrc directory, which must be in the root project directory.

Note
The downside of using buildSrc is that any change to it will invalidate every task in your project and require a rerun.

buildSrc uses the same source code conventions applicable to Java, Groovy, and Kotlin projects. It also provides direct access to the Gradle API.

A typical project including buildSrc has the following layout:

.
├── buildSrc
│   ├── src
│   │   └──main
│   │      └──kotlin
│   │         └──MyCustomTask.kt    // (1)
│   ├── shared.gradle.kts   // (2)
│   └── build.gradle.kts
├── api
│   ├── src
│   │   └──...
│   └── build.gradle.kts    // (3)
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle.kts    // (3)
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle.kts
└── settings.gradle.kts
  1. Create the MyCustomTask task.

  2. A shared build script.

  3. Uses the MyCustomTask task and shared build script.

.
 buildSrc
    src
      main
         groovy
            MyCustomTask.groovy    // (1)
    shared.gradle   // (2)
    build.gradle
 api
    src
      ...
    build.gradle    // (3)
 services
    person-service
        src
          ...
        build.gradle    // (3)
 shared
    src
      ...
    build.gradle
 settings.gradle
  1. Create the MyCustomTask task.

  2. A shared build script.

  3. Uses the MyCustomTask task and shared build script.

In the buildSrc, the build script shared.gradle(.kts) is created. It contains dependencies and other build information that is common to multiple subprojects:

shared.gradle.kts
repositories {
    mavenCentral()
}

dependencies {
    implementation("org.slf4j:slf4j-api:1.7.32")
}
shared.gradle
repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.slf4j:slf4j-api:1.7.32'
}

In the buildSrc, the MyCustomTask is also created. It is a helper task that is used as part of the build logic for multiple subprojects:

MyCustomTask.kt
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

open class MyCustomTask : DefaultTask() {
    @TaskAction
    fun calculateSum() {
        // Custom logic to calculate the sum of two numbers
        val num1 = 5
        val num2 = 7
        val sum = num1 + num2

        // Print the result
        println("Sum: $sum")
    }
}
MyCustomTask.groovy
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

class MyCustomTask extends DefaultTask {
    @TaskAction
    void calculateSum() {
        // Custom logic to calculate the sum of two numbers
        int num1 = 5
        int num2 = 7
        int sum = num1 + num2

        // Print the result
        println "Sum: $sum"
    }
}

The MyCustomTask task is used in the build script of the api and shared projects. The task is automatically available because it’s part of buildSrc.

The shared.gradle(.kts) script is also applied:

build.gradle.kts
// Apply any other configurations specific to your project

// Use the build script defined in buildSrc
apply(from = rootProject.file("buildSrc/shared.gradle.kts"))

// Use the custom task defined in buildSrc
tasks.register<MyCustomTask>("myCustomTask")
build.gradle
// Apply any other configurations specific to your project

// Use the build script defined in buildSrc
apply from: rootProject.file('buildSrc/shared.gradle')

// Use the custom task defined in buildSrc
tasks.register('myCustomTask', MyCustomTask)

Share logic using convention plugins

Gradle’s recommended way of organizing build logic is to use its plugin system.

We can write a plugin that encapsulates the build logic common to several subprojects in a project. This kind of plugin is called a convention plugin.

While writing plugins is outside the scope of this section, the recommended way to build a Gradle project is to put common build logic in a convention plugin located in the buildSrc.

Let’s take a look at an example project:

.
├── buildSrc
│   ├── src
│   │   └──main
│   │      └──kotlin
│   │         └──myproject.java-conventions.gradle.kts  // (1)
│   └── build.gradle.kts
├── api
│   ├── src
│   │   └──...
│   └── build.gradle.kts    // (2)
├── services
│   └── person-service
│       ├── src
│       │   └──...
│       └── build.gradle.kts    // (2)
├── shared
│   ├── src
│   │   └──...
│   └── build.gradle.kts    // (2)
└── settings.gradle.kts
  1. Create the myproject.java-conventions convention plugin.

  2. Applies the myproject.java-conventions convention plugin.

.
 buildSrc
    src
      main
         groovy
            myproject.java-conventions.gradle  // (1)
    build.gradle
 api
    src
      ...
    build.gradle    // (2)
 services
    person-service
        src
          ...
        build.gradle    // (2)
 shared
    src
      ...
    build.gradle    // (2)
 settings.gradle
  1. Create the myproject.java-conventions convention plugin.

  2. Applies the myproject.java-conventions convention plugin.

This build contains three subprojects:

settings.gradle.kts
rootProject.name = "dependencies-java"
include("api", "shared", "services:person-service")
settings.gradle
rootProject.name = 'dependencies-java'
include 'api', 'shared', 'services:person-service'

The source code for the convention plugin created in the buildSrc directory is as follows:

buildSrc/src/main/kotlin/myproject.java-conventions.gradle.kts
plugins {
    id("java")
}

group = "com.example"
version = "1.0"

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("junit:junit:4.13")
}
buildSrc/src/main/groovy/myproject.java-conventions.gradle
plugins {
    id 'java'
}

group = 'com.example'
version = '1.0'

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "junit:junit:4.13"
}

For the convention plugin to compile, basic configuration needs to be applied in the build file of the buildSrc directory:

buildSrc/build.gradle.kts
plugins {
    `kotlin-dsl`
}

repositories {
    mavenCentral()
}
buildSrc/build.gradle
plugins {
    id 'groovy-gradle-plugin'
}

The convention plugin is applied to the api, shared, and person-service subprojects:

api/build.gradle.kts
plugins {
    id("myproject.java-conventions")
}

dependencies {
    implementation(project(":shared"))
}
shared/build.gradle.kts
plugins {
    id("myproject.java-conventions")
}
services/person-service/build.gradle.kts
plugins {
    id("myproject.java-conventions")
}

dependencies {
    implementation(project(":shared"))
    implementation(project(":api"))
}
api/build.gradle
plugins {
    id 'myproject.java-conventions'
}

dependencies {
    implementation project(':shared')
}
shared/build.gradle
plugins {
    id 'myproject.java-conventions'
}
services/person-service/build.gradle
plugins {
    id 'myproject.java-conventions'
}

dependencies {
    implementation project(':shared')
    implementation project(':api')
}

Do not use cross-project configuration

An improper way to share build logic between subprojects is cross-project configuration via the subprojects {} and allprojects {} DSL constructs.

Tip
Avoid using subprojects {} and allprojects {}.

With cross-configuration, build logic can be injected into a subproject without it being obvious when looking at that subproject’s build script.

In the long run, cross-configuration usually grows in complexity and becomes a burden. Cross-configuration can also introduce configuration-time coupling between projects, which can prevent optimizations like configuration-on-demand from working properly.
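For illustration only, a typical cross-configuration block in a root build script looks like the following sketch; the same effect is better achieved with a convention plugin applied in each subproject:

build.gradle.kts
// Discouraged: configuration is injected into every subproject,
// but none of it appears in the subproject build scripts
subprojects {
    apply(plugin = "java")
    repositories {
        mavenCentral()
    }
}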

Convention plugins versus cross-configuration

The two most common uses of cross-configuration can be better modeled using convention plugins:

  1. Applying plugins or other configurations to subprojects of a certain type.
    Often, the cross-configuration logic is if subproject is of type X, then configure Y. This is equivalent to applying X-conventions plugin directly to a subproject.

  2. Extracting information from subprojects of a certain type.
    This use case can be modeled using outgoing configuration variants.

Composite Builds

A composite build is a build that includes other builds.


A composite build is similar to a Gradle multi-project build, except that instead of including subprojects, entire builds are included.

Composite builds allow you to:

  • Combine builds that are usually developed independently, for instance, when trying out a bug fix in a library that your application uses.

  • Decompose a large multi-project build into smaller, more isolated chunks that can be worked on independently or together as needed.

A build that is included in a composite build is referred to as an included build. Included builds do not share any configuration with the composite build or the other included builds. Each included build is configured and executed in isolation.

Defining a composite build

The following example demonstrates how two Gradle builds, normally developed separately, can be combined into a composite build.

my-composite
├── gradle
├── gradlew
├── settings.gradle.kts
├── build.gradle.kts
├── my-app
│   ├── settings.gradle.kts
│   └── app
│       ├── build.gradle.kts
│       └── src/main/java/org/sample/my-app/Main.java
└── my-utils
    ├── settings.gradle.kts
    ├── number-utils
    │   ├── build.gradle.kts
    │   └── src/main/java/org/sample/numberutils/Numbers.java
    └── string-utils
        ├── build.gradle.kts
        └── src/main/java/org/sample/stringutils/Strings.java

The my-utils multi-project build produces two Java libraries, number-utils and string-utils. The my-app build produces an executable using functions from those libraries.

The my-app build does not depend directly on my-utils. Instead, it declares binary dependencies on the libraries produced by my-utils:

my-app/app/build.gradle.kts
plugins {
    id("application")
}

application {
    mainClass = "org.sample.myapp.Main"
}

dependencies {
    implementation("org.sample:number-utils:1.0")
    implementation("org.sample:string-utils:1.0")
}
my-app/app/build.gradle
plugins {
    id 'application'
}

application {
    mainClass = 'org.sample.myapp.Main'
}

dependencies {
    implementation 'org.sample:number-utils:1.0'
    implementation 'org.sample:string-utils:1.0'
}
Defining a composite build via --include-build

The --include-build command-line argument turns the executed build into a composite, substituting dependencies from the included build into the executed build.

For example, run ./gradlew --include-build ../my-utils run from the my-app directory:

$ ./gradlew --include-build ../my-utils run
Defining a composite build via the settings file

It’s possible to make the above arrangement persistent by using Settings.includeBuild(java.lang.Object) to declare the included build in the settings.gradle(.kts) file.

The settings file can be used to add subprojects and included builds simultaneously.

Included builds are added by location:

settings.gradle.kts
includeBuild("my-utils")

In the example, the settings.gradle(.kts) file combines otherwise separate builds:

settings.gradle.kts
rootProject.name = "my-composite"

includeBuild("my-app")
includeBuild("my-utils")
settings.gradle
rootProject.name = 'my-composite'

includeBuild 'my-app'
includeBuild 'my-utils'

To execute the run task in the my-app build from my-composite, run ./gradlew my-app:app:run.

You can optionally define a run task in my-composite that depends on my-app:app:run so that you can execute ./gradlew run:

build.gradle.kts
tasks.register("run") {
    dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
build.gradle
tasks.register('run') {
    dependsOn gradle.includedBuild('my-app').task(':app:run')
}
Including builds that define Gradle plugins

A special case of included builds are builds that define Gradle plugins.

These builds should be included using the includeBuild statement inside the pluginManagement {} block of the settings file.

Using this mechanism, the included build may also contribute a settings plugin that can be applied in the settings file itself:

settings.gradle.kts
pluginManagement {
    includeBuild("../url-verifier-plugin")
}
settings.gradle
pluginManagement {
    includeBuild '../url-verifier-plugin'
}

Restrictions on included builds

Most builds can be included in a composite, including other composite builds. There are some restrictions.

In a regular build, Gradle ensures that each project has a unique project path. It makes projects identifiable and addressable without conflicts.

In a composite build, Gradle adds additional qualification to each project from an included build to avoid project path conflicts. The full path to identify a project in a composite build is called a build-tree path. It consists of a build path of an included build and a project path of the project.

By default, build paths and project paths are derived from directory names and structure on disk. Since included builds can be located anywhere on disk, their build path is determined by the name of the containing directory. This can sometimes lead to conflicts.

To summarize, the included builds must fulfill these requirements:

  • Each included build must have a unique build path.

  • Each included build path must not conflict with any project path of the main build.

These conditions guarantee that each project can be uniquely identified even in a composite build.

If conflicts arise, the way to resolve them is by changing the build name of an included build:

settings.gradle.kts
includeBuild("some-included-build") {
    name = "other-name"
}
Note

When a composite build is included in another composite build, both builds have the same parent. In other words, the nested composite build structure is flattened.

Interacting with a composite build

Interacting with a composite build is generally similar to a regular multi-project build. Tasks can be executed, tests can be run, and builds can be imported into the IDE.

Executing tasks

Tasks from an included build can be executed from the command-line or IDE in the same way as tasks from a regular multi-project build. Executing a task will result in task dependencies being executed, as well as those tasks required to build dependency artifacts from other included builds.

You can call a task in an included build using a fully qualified path, for example, :included-build-name:project-name:taskName. Project and task names can be abbreviated.

$ ./gradlew :included-build:subproject-a:compileJava
> Task :included-build:subproject-a:compileJava

$ ./gradlew :i-b:sA:cJ
> Task :included-build:subproject-a:compileJava

To exclude a task from the command line, you need to provide the fully qualified path to the task.
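
For example, to build everything while skipping a test task from an included build, pass the full task path to the -x option (the build and project names below simply reuse the illustration above, so this is only a sketch):

$ ./gradlew build -x :included-build:subproject-a:test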

Note
Included build tasks are automatically executed to generate required dependency artifacts, or the including build can declare a dependency on a task from an included build.
Importing into the IDE

One of the most useful features of composite builds is IDE integration.

Importing a composite build permits sources from separate Gradle builds to be easily developed together. For every included build, each subproject is included as an IntelliJ IDEA Module or Eclipse Project. Source dependencies are configured, providing cross-build navigation and refactoring.

Declaring dependencies substituted by an included build

By default, Gradle will configure each included build to determine the dependencies it can provide. The algorithm for doing this is simple. Gradle will inspect the group and name for the projects in the included build and substitute project dependencies for any external dependency matching ${project.group}:${project.name}.

Note

By default, substitutions are not registered for the main build.

To make the (sub)projects of the main build addressable by ${project.group}:${project.name}, you can tell Gradle to treat the main build like an included build by self-including it: includeBuild(".").
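
For illustration, the number-utils library from the my-utils sample is addressable as org.sample:number-utils because its build script declares matching coordinates. A minimal sketch of such a build script (the sample’s actual script is not reproduced here):

my-utils/number-utils/build.gradle.kts
plugins {
    `java-library`
}

// The group combined with the project name forms the coordinates that Gradle substitutes:
// any external dependency on "org.sample:number-utils" resolves to this project.
group = "org.sample"
version = "1.0"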

There are cases when the default substitutions determined by Gradle are insufficient or must be corrected for a particular composite. For these cases, explicitly declaring the substitutions for an included build is possible.

For example, a single-project build called anonymous-library produces a Java utility library but does not declare a value for the group attribute:

build.gradle.kts
plugins {
    java
}
build.gradle
plugins {
    id 'java'
}

When this build is included in a composite, it will attempt to substitute for the dependency module undefined:anonymous-library (undefined being the default value for project.group, and anonymous-library being the root project name). Clearly, this isn’t useful in a composite build.

To use the unpublished library in a composite build, you can explicitly declare the substitutions that it provides:

settings.gradle.kts
includeBuild("anonymous-library") {
    dependencySubstitution {
        substitute(module("org.sample:number-utils")).using(project(":"))
    }
}
settings.gradle
includeBuild('anonymous-library') {
    dependencySubstitution {
        substitute module('org.sample:number-utils') using project(':')
    }
}

With this configuration, the my-app composite build will substitute any dependency on org.sample:number-utils with a dependency on the root project of anonymous-library.

Deactivate included build substitutions for a configuration

If you need to resolve a published version of a module that is also available as part of an included build, you can deactivate the included build substitution rules on the ResolutionStrategy of the Configuration that is resolved. This is necessary because the rules are globally applied in the build, and Gradle does not consider published versions during resolution by default.

For example, we create a separate publishedRuntimeClasspath configuration that gets resolved to the published versions of modules that also exist in one of the local builds. This is done by deactivating global dependency substitution rules:

build.gradle.kts
configurations.create("publishedRuntimeClasspath") {
    resolutionStrategy.useGlobalDependencySubstitutionRules = false

    extendsFrom(configurations.runtimeClasspath.get())
    isCanBeConsumed = false
    attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage.JAVA_RUNTIME))
}
build.gradle
configurations.create('publishedRuntimeClasspath') {
    resolutionStrategy.useGlobalDependencySubstitutionRules = false

    extendsFrom(configurations.runtimeClasspath)
    canBeConsumed = false
    attributes.attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage.JAVA_RUNTIME))
}

A use-case would be to compare published and locally built JAR files.

Cases where included build substitutions must be declared

Many builds will function automatically as an included build, without declared substitutions. Here are some common cases where declared substitutions are required:

  • When the archivesBaseName property is used to set the name of the published artifact.

  • When a configuration other than default is published.

  • When the MavenPom.addFilter() is used to publish artifacts that don’t match the project name.

  • When the maven-publish or ivy-publish plugins are used for publishing and the publication coordinates don’t match ${project.group}:${project.name}.
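
In these situations, declare the substitution explicitly, as in the earlier anonymous-library example. A minimal sketch for an included build whose published coordinates differ from ${project.group}:${project.name} (the build name, coordinates, and project path below are illustrative assumptions):

settings.gradle.kts
includeBuild("legacy-library") {
    dependencySubstitution {
        // The published artifact com.example:legacy-api is produced by the :api subproject
        substitute(module("com.example:legacy-api")).using(project(":api"))
    }
}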

Cases where composite build substitutions won’t work

Some builds won’t function correctly when included in a composite, even when dependency substitutions are explicitly declared. This limitation is because a substituted project dependency will always point to the default configuration of the target project. Any time the artifacts and dependencies specified for the default configuration of a project don’t match what is published to a repository, the composite build may exhibit different behavior.

Here are some cases where the published module metadata may be different from the project default configuration:

  • When a configuration other than default is published.

  • When the maven-publish or ivy-publish plugins are used.

  • When the POM or ivy.xml file is tweaked as part of publication.

Builds using these features function incorrectly when included in a composite build.

Depending on tasks in an included build

While included builds are isolated from one another and cannot declare direct dependencies, a composite build can declare task dependencies on its included builds. The included builds are accessed using Gradle.getIncludedBuilds() or Gradle.includedBuild(java.lang.String), and a task reference is obtained via the IncludedBuild.task(java.lang.String) method.

Using these APIs, it is possible to declare a dependency on a task in a particular included build:

build.gradle.kts
tasks.register("run") {
    dependsOn(gradle.includedBuild("my-app").task(":app:run"))
}
build.gradle
tasks.register('run') {
    dependsOn gradle.includedBuild('my-app').task(':app:run')
}

Or you can declare a dependency on tasks with a certain path in some or all of the included builds:

build.gradle.kts
tasks.register("publishDeps") {
    dependsOn(gradle.includedBuilds.map { it.task(":publishMavenPublicationToMavenRepository") })
}
build.gradle
tasks.register('publishDeps') {
    dependsOn gradle.includedBuilds*.task(':publishMavenPublicationToMavenRepository')
}

Limitations of composite builds

Limitations of the current implementation include:

  • No support for included builds with publications that don’t mirror the project default configuration.
    See Cases where composite build substitutions won’t work.

  • Multiple composite builds may conflict when run in parallel if more than one includes the same build.
    Gradle does not share the project lock of a shared composite build between Gradle invocations to prevent concurrent execution.

Configuration On Demand

Configuration-on-demand attempts to configure only the relevant projects for the requested tasks, i.e., it only evaluates the build script file of projects participating in the build. This way, the configuration time of a large multi-project build can be reduced.

The configuration-on-demand feature is incubating, so not every build is guaranteed to work correctly with it. The feature works well for decoupled multi-project builds.

In configuration-on-demand mode, projects are configured as follows:

  • The root project is always configured.

  • The project in the directory where the build is executed is also configured, but only when Gradle is executed without any tasks.
    This way, the default tasks behave correctly when projects are configured on demand.

  • The standard project dependencies are supported, and relevant projects are configured.
    If project A has a compile dependency on project B, then building A causes the configuration of both projects.

  • The task dependencies declared via the task path are supported and cause relevant projects to be configured.
    Example: someTask.dependsOn(":some-other-project:someOtherTask")

  • A task requested via task path from the command line (or tooling API) causes the relevant project to be configured.
    For example, building project-a:project-b:someTask causes configuration of project-b.

Enable configuration-on-demand

You can enable configuration-on-demand using the --configure-on-demand flag or adding org.gradle.configureondemand=true to the gradle.properties file.

To configure on demand with every build run, see Gradle properties.

To configure on demand for a given build, see command-line performance-oriented options.
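
In practice, that means either setting the property in gradle.properties or passing the flag on the command line (using the build task as an arbitrary target):

gradle.properties
org.gradle.configureondemand=true

$ ./gradlew build --configure-on-demand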

Decoupled projects

Gradle allows projects to access each other’s configurations and tasks during the configuration and execution phases. While this flexibility empowers build authors, it limits Gradle’s ability to perform optimizations such as parallel project builds and configuration on demand.

Projects are considered decoupled when they interact solely through declared dependencies and task dependencies. Any direct modification or reading of another project’s object creates coupling between the projects. Coupling during configuration can result in flawed build outcomes when using 'configuration on demand', while coupling during execution can affect parallel execution.

One common source of coupling is configuration injection, such as using allprojects{} or subprojects{} in build scripts.
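
For example, the following root build script injects configuration into every subproject; this is a sketch of the configuration-injection pattern referred to above, not a recommendation:

build.gradle.kts
// Configuration injection: the root build script reaches into each subproject's model,
// coupling the subprojects to the root project's configuration logic.
subprojects {
    apply(plugin = "java-library")
    version = "1.0"
}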

To avoid coupling issues, it’s recommended to:

  • Refrain from referencing other subprojects' build scripts and prefer cross-configuration from the root project.

  • Avoid dynamically changing other projects' configurations during execution.

As Gradle evolves, it aims to provide features that leverage decoupled projects while offering solutions for common use cases like configuration injection without introducing coupling.

Parallel projects

Gradle’s parallel execution feature optimizes CPU utilization to accelerate builds by concurrently executing tasks from different projects.

To enable parallel execution, use the --parallel command-line argument or configure your build environment. Gradle automatically determines the optimal number of parallel threads based on CPU cores.

During parallel execution, each worker handles a specific project exclusively. Task dependencies are respected, with workers prioritizing upstream tasks. However, tasks may not execute in alphabetical order, as in sequential mode. It’s crucial to correctly declare task dependencies and inputs/outputs to avoid ordering issues.
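
Parallel execution can be requested per invocation with the flag, or made the default for the build via the corresponding Gradle property (a minimal sketch; org.gradle.parallel is the standard build environment property for this setting):

gradle.properties
org.gradle.parallel=true

$ ./gradlew build --parallel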

DEVELOPING TASKS

Understanding Tasks

A task represents some independent unit of work that a build performs, such as compiling classes, creating a JAR, generating Javadoc, or publishing archives to a repository.

writing tasks 1

Before reading this chapter, it’s recommended that you first read the Learning The Basics and complete the Tutorial.

Listing tasks

All available tasks in your project come from Gradle plugins and build scripts.

You can list all the available tasks in a project by running the following command in the terminal:

$ ./gradlew tasks

Let’s take a very basic Gradle project as an example. The project has the following structure:

gradle-project
├── app
│   ├── build.gradle.kts    // empty file - no build logic
│   └── ...                 // some java code
├── settings.gradle.kts     // includes app subproject
├── gradle
│   └── ...
├── gradlew
└── gradlew.bat
gradle-project
 app
    build.gradle    // empty file - no build logic
    ...             // some java code
 settings.gradle     // includes app subproject
 gradle
    ...
 gradlew
 gradlew.bat

The settings file contains the following:

settings.gradle.kts
rootProject.name = "gradle-project"
include("app")
settings.gradle
rootProject.name = 'gradle-project'
include('app')

Currently, the app subproject’s build file is empty.

To see the tasks available in the app subproject, run ./gradlew :app:tasks:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project ':app'.
tasks - Displays the tasks runnable from project ':app'.

We observe that only a small number of help tasks are available at the moment. This is because the core of Gradle only provides tasks that analyze your build. Other tasks, such as those that build your project or compile your code, are added by plugins.

Let’s explore this by adding the Gradle core base plugin to the app build script:

app/build.gradle.kts
plugins {
    id("base")
}
app/build.gradle
plugins {
    id('base')
}

The base plugin adds central lifecycle tasks. Now when we run ./gradlew app:tasks, we can see the assemble and build tasks are available:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
clean - Deletes the build directory.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project ':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.

Task outcomes

When Gradle executes a task, it labels the task with outcomes via the console.

author tasks 1

These labels are based on whether a task has actions to execute and if Gradle executed them. Actions include, but are not limited to, compiling code, zipping files, and publishing archives.

(no label) or EXECUTED

Task executed its actions.

  • Task has actions and Gradle executed them.

  • Task has no actions and some dependencies, and Gradle executed one or more of the dependencies. See also Lifecycle Tasks.

UP-TO-DATE

Task’s outputs did not change.

  • Task has outputs and inputs but they have not changed. See Incremental Build.

  • Task has actions, but the task tells Gradle it did not change its outputs.

  • Task has no actions and some dependencies, but all the dependencies are UP-TO-DATE, SKIPPED or FROM-CACHE. See Lifecycle Tasks.

  • Task has no actions and no dependencies.

FROM-CACHE

Task’s outputs could be found from a previous execution.

  • Task has outputs restored from the build cache. See Build Cache.

SKIPPED

Task did not execute its actions.

NO-SOURCE

Task did not need to execute its actions.

  • Task has inputs and outputs, but no sources (i.e., inputs were not found).

Task group and description

Task groups and descriptions are used to organize and describe tasks.

Groups

Task groups are used to categorize tasks. When you run ./gradlew tasks, tasks are listed under their respective groups, making it easier to understand their purpose and relationship to other tasks. Groups are set using the group property.

Descriptions

Descriptions provide a brief explanation of what a task does. When you run ./gradlew tasks, the descriptions are shown next to each task, helping you understand its purpose and how to use it. Descriptions are set using the description property.

Let’s consider a basic Java application as an example. The build contains a subproject called app.

Let’s list the available tasks in app at the moment:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application.

Build tasks
-----------
assemble - Assembles the outputs of this project.

Here, the :run task is part of the Application group with the description Runs this project as a JVM application. In code, it would look something like this:

app/build.gradle.kts
tasks.register("run") {
    group = "Application"
    description = "Runs this project as a JVM application."
}
app/build.gradle
tasks.register("run") {
    group = "Application"
    description = "Runs this project as a JVM application."
}

Private and hidden tasks

Gradle doesn’t support marking a task as private.

However, tasks will only show up when running :tasks if task.group is set or no other task depends on it.

For instance, the following task will not appear when running ./gradlew :app:tasks because it does not have a group; it is called a hidden task:

app/build.gradle.kts
tasks.register("helloTask") {
    println("Hello")
}
app/build.gradle
tasks.register("helloTask") {
    println 'Hello'
}

Although helloTask is not listed, it can still be executed by Gradle:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.

Let’s add a group to the same task:

app/build.gradle.kts
tasks.register("helloTask") {
    group = "Other"
    description = "Hello task"
    println("Hello")
}
app/build.gradle
tasks.register("helloTask") {
    group = "Other"
    description = "Hello task"
    println 'Hello'
}

Now that the group is added, the task is visible:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.

Other tasks
-----------
helloTask - Hello task

In contrast, ./gradlew tasks --all will show all tasks; hidden and visible tasks are listed.

Grouping tasks

If you want to customize which tasks are shown to users when listed, you can group tasks and set the visibility of each group.

Note
Remember, even if you hide tasks, they are still available, and Gradle can still run them.

Let’s start with an example built by Gradle init for a Java application with multiple subprojects. The project structure is as follows:

gradle-project
├── app
│   ├── build.gradle.kts
│   └── src                 // some java code
│       └── ...
├── utilities
│   ├── build.gradle.kts
│   └── src                 // some java code
│       └── ...
├── list
│   ├── build.gradle.kts
│   └── src                 // some java code
│       └── ...
├── buildSrc
│   ├── build.gradle.kts
│   ├── settings.gradle.kts
│   └── src                 // common build logic
│       └── ...
├── settings.gradle.kts
├── gradle
├── gradlew
└── gradlew.bat
gradle-project
 app
    build.gradle
    src             // some java code
        ...
 utilities
    build.gradle
    src             // some java code
        ...
 list
    build.gradle
    src             // some java code
        ...
 buildSrc
    build.gradle
    settings.gradle
    src             // common build logic
        ...
 settings.gradle
 gradle
 gradlew
 gradlew.bat

Run app:tasks to see available tasks in the app subproject:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Application tasks
-----------------
run - Runs this project as a JVM application

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.

Distribution tasks
------------------
assembleDist - Assembles the main distributions
distTar - Bundles the project as a distribution.
distZip - Bundles the project as a distribution.
installDist - Installs the project as a distribution as-is.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
kotlinDslAccessorsReport - Prints the Kotlin code for accessing the currently available project extensions and conventions.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project ':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.

If we look at the list of tasks available, even for a standard Java project, it’s extensive. Many of these tasks are rarely required directly by developers using the build.

We can configure the :tasks task and limit the tasks shown to a certain group.

Let’s create our own group so that all tasks are hidden by default by updating the app build script:

app/build.gradle.kts
val myBuildGroup = "my app build"               // Create a group name

tasks.register<TaskReportTask>("tasksAll") {    // Register the tasksAll task
    group = myBuildGroup
    description = "Show additional tasks."
    setShowDetail(true)
}

tasks.named<TaskReportTask>("tasks") {          // Move all existing tasks to the group
    displayGroup = myBuildGroup
}
app/build.gradle
def myBuildGroup = "my app build"               // Create a group name

tasks.register("tasksAll", TaskReportTask) {    // Register the tasksAll task
    group = myBuildGroup
    description = "Show additional tasks."
    setShowDetail(true)
}

tasks.named("tasks", TaskReportTask) {          // Move all existing tasks to the group
    displayGroup = myBuildGroup
}

Now, when we list tasks available in app, the list is shorter:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

My app build tasks
------------------
tasksAll - Show additional tasks.

Task categories

Gradle distinguishes between two categories of tasks:

  1. Lifecycle tasks

  2. Actionable tasks

Lifecycle tasks define targets you can call, such as :build, which builds your project. Lifecycle tasks do not provide Gradle with actions; they must be wired to actionable tasks. The base Gradle plugin only adds lifecycle tasks.

Actionable tasks define actions for Gradle to take, such as :compileJava, which compiles the Java code of your project. Actions include creating JARs, zipping files, publishing archives, and much more. Plugins such as the java-library plugin add actionable tasks.

Let’s update the build script of the previous example, which is currently an empty file so that our app subproject is a Java library:

app/build.gradle.kts
plugins {
    id("java-library")
}
app/build.gradle
plugins {
    id('java-library')
}

Once again, we list the available tasks to see what new tasks are available:

$ ./gradlew :app:tasks

> Task :app:tasks

------------------------------------------------------------
Tasks runnable from project ':app'
------------------------------------------------------------

Build tasks
-----------
assemble - Assembles the outputs of this project.
build - Assembles and tests this project.
buildDependents - Assembles and tests this project and all projects that depend on it.
buildNeeded - Assembles and tests this project and all projects it depends on.
classes - Assembles main classes.
clean - Deletes the build directory.
jar - Assembles a jar archive containing the classes of the 'main' feature.
testClasses - Assembles test classes.

Documentation tasks
-------------------
javadoc - Generates Javadoc API documentation for the 'main' feature.

Help tasks
----------
buildEnvironment - Displays all buildscript dependencies declared in project ':app'.
dependencies - Displays all dependencies declared in project ':app'.
dependencyInsight - Displays the insight into a specific dependency in project ':app'.
help - Displays a help message.
javaToolchains - Displays the detected java toolchains.
outgoingVariants - Displays the outgoing variants of project ':app'.
projects - Displays the sub-projects of project ':app'.
properties - Displays the properties of project ':app'.
resolvableConfigurations - Displays the configurations that can be resolved in project ':app'.
tasks - Displays the tasks runnable from project ':app'.

Verification tasks
------------------
check - Runs all checks.
test - Runs the test suite.

We see that many new tasks are available such as jar and testClasses.

Additionally, the java-library plugin has wired actionable tasks to lifecycle tasks. If we call the :build task, we can see several tasks have been executed, including the :app:compileJava task.

$ ./gradlew :app:build

> Task :app:compileJava
> Task :app:processResources NO-SOURCE
> Task :app:classes
> Task :app:jar
> Task :app:assemble
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check
> Task :app:build

The actionable :compileJava task is wired to the lifecycle :build task.
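
This wiring can also be expressed directly in a build script: a lifecycle task carries no actions of its own and simply depends on actionable tasks. A minimal sketch (Kotlin DSL; the verify task name is an illustrative assumption):

app/build.gradle.kts
// A lifecycle task: it has no actions and only aggregates actionable tasks.
tasks.register("verify") {
    group = "verification"
    description = "Builds the jar and runs the tests."
    dependsOn("jar", "test")    // actionable tasks added by the java-library plugin
}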

Incremental tasks

A key feature of Gradle tasks is their incremental nature.

Gradle can reuse results from prior builds. Therefore, if we’ve built our project before and made only minor changes, rerunning :build will not require Gradle to perform extensive work.

For example, if we modify only the test code in our project, leaving the production code unchanged, executing the build will solely recompile the test code. Gradle marks the tasks for the production code as UP-TO-DATE, indicating that it remains unchanged since the last successful build:

$ ./gradlew :app:build

> Task :app:compileJava UP-TO-DATE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar UP-TO-DATE
> Task :app:assemble UP-TO-DATE
> Task :app:compileTestJava
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses
> Task :app:test
> Task :app:check UP-TO-DATE
> Task :app:build UP-TO-DATE
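
Gradle can only make these UP-TO-DATE decisions for tasks that declare their inputs and outputs. The following is a minimal sketch (Kotlin DSL) of such a task; the WordCount class, task name, and file locations are illustrative assumptions rather than part of the sample project:

app/build.gradle.kts
// A task with one declared input file and one declared output file.
abstract class WordCount : DefaultTask() {
    @get:InputFile
    abstract val source: RegularFileProperty       // the task re-runs only when this file changes

    @get:OutputFile
    abstract val report: RegularFileProperty       // declared output enables UP-TO-DATE checks and caching

    @TaskAction
    fun count() {
        val words = source.get().asFile.readText()
            .split(Regex("\\s+"))
            .count { it.isNotBlank() }
        val out = report.get().asFile
        out.parentFile.mkdirs()
        out.writeText("$words words")
    }
}

tasks.register<WordCount>("wordCount") {
    source = layout.projectDirectory.file("README.md")
    report = layout.buildDirectory.file("reports/words.txt")
}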

Caching tasks

Gradle can reuse results from past builds using the build cache.

To enable this feature, activate the build cache by using the --build-cache command line parameter or by setting org.gradle.caching=true in your gradle.properties file.

This optimization has the potential to accelerate your builds significantly:

$ ./gradlew :app:clean :app:build --build-cache

> Task :app:compileJava FROM-CACHE
> Task :app:processResources NO-SOURCE
> Task :app:classes UP-TO-DATE
> Task :app:jar
> Task :app:assemble
> Task :app:compileTestJava FROM-CACHE
> Task :app:processTestResources NO-SOURCE
> Task :app:testClasses UP-TO-DATE
> Task :app:test FROM-CACHE
> Task :app:check UP-TO-DATE
> Task :app:build

When Gradle can fetch outputs of a task from the cache, it labels the task with FROM-CACHE.

The build cache is handy if you switch between branches regularly. Gradle supports both local and remote build caches.

Developing tasks

When developing Gradle tasks, you have two choices:

  1. Use an existing Gradle task type such as Zip, Copy, or Delete

  2. Create your own Gradle task type such as MyResolveTask or CustomTaskUsingToolchains.

Task types are simply subclasses of the Gradle Task class.

With Gradle tasks, there are three states to consider:

  1. Registering a task - using a task (implemented by you or provided by Gradle) in your build logic.

  2. Configuring a task - defining inputs and outputs for a registered task.

  3. Implementing a task - creating a custom task class (i.e., custom class type).

Registration is commonly done with the register() method.
Configuring a task is commonly done with the named() method.
Implementing a task is commonly done by extending Gradle’s DefaultTask class:

tasks.register<Copy>("myCopy")                              // (1)

tasks.named<Copy>("myCopy") {                               // (2)
    from("resources")
    into("target")
    include("**/*.txt", "**/*.xml", "**/*.properties")
}

abstract class MyCopyTask : DefaultTask() {                 // (3)
    @TaskAction
    fun copyFiles() {
        val sourceDir = File("sourceDir")
        val destinationDir = File("destinationDir")
        sourceDir.listFiles()?.forEach { file ->
            if (file.isFile && file.extension == "txt") {
                file.copyTo(File(destinationDir, file.name))
            }
        }
    }
}
  1. Register the myCopy task of type Copy to let Gradle know we intend to use it in our build logic.

  2. Configure the registered myCopy task with the inputs and outputs it needs according to its API.

  3. Implement a custom task type called MyCopyTask which extends DefaultTask and defines the copyFiles task action.

tasks.register("myCopy", Copy)                              // (1)

tasks.named("myCopy", Copy) {                               // (2)
    from "resources"
    into "target"
    include "**/*.txt", "**/*.xml", "**/*.properties"
}

abstract class MyCopyTask extends DefaultTask {             // (3)
    @TaskAction
    void copyFiles() {
        fileTree('sourceDir').matching {
            include '**/*.txt'
        }.forEach { file ->
            file.copyTo(file.path.replace('sourceDir', 'destinationDir'))
        }
    }
}
  1. Register the myCopy task of type Copy to let Gradle know we intend to use it in our build logic.

  2. Configure the registered myCopy task with the inputs and outputs it needs according to its API.

  3. Implement a custom task type called MyCopyTask which extends DefaultTask and defines the copyFiles task action.

1. Registering tasks

You define actions for Gradle to take by registering tasks in build scripts or plugins.

Tasks are defined using strings for task names:

build.gradle.kts
tasks.register("hello") {
    doLast {
        println("hello")
    }
}
build.gradle
tasks.register('hello') {
    doLast {
        println 'hello'
    }
}

In the example above, the task is added to the TaskCollection using the register() method in TaskContainer.

2. Configuring tasks

Gradle tasks must be configured to complete their action(s) successfully. If a task needs to ZIP a file, it must be configured with the file name and location. You can refer to the API for the Gradle Zip task to learn how to configure it appropriately.
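
As a sketch of what such configuration looks like with a built-in type (Kotlin DSL), the following registers a Zip task; the docs source directory, the destination directory, and the archive name are illustrative assumptions rather than values required by the Zip type:

build.gradle.kts
tasks.register<Zip>("zipDocs") {
    from("docs")                                                // which files go into the archive
    destinationDirectory = layout.buildDirectory.dir("dist")   // where the archive is written
    archiveFileName = "docs.zip"                                // the archive's file name
}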

Let’s look at the Copy task provided by Gradle as an example. We first register a task called myCopy of type Copy in the build script:

build.gradle.kts
tasks.register<Copy>("myCopy")
build.gradle
tasks.register('myCopy', Copy)

This registers a copy task with no default behavior. Since the task is of type Copy, a Gradle supported task type, it can be configured using its API.

The following examples show several ways to achieve the same configuration:

1. Using the named() method:

Use named() to configure an existing task registered elsewhere:

build.gradle.kts
tasks.named<Copy>("myCopy") {
    from("resources")
    into("target")
    include("**/*.txt", "**/*.xml", "**/*.properties")
}
build.gradle
tasks.named('myCopy') {
    from 'resources'
    into 'target'
    include('**/*.txt', '**/*.xml', '**/*.properties')
}
2. Using a configuration block:

Use a block to configure the task immediately upon registering it:

build.gradle.kts
tasks.register<Copy>("copy") {
   from("resources")
   into("target")
   include("**/*.txt", "**/*.xml", "**/*.properties")
}
build.gradle
tasks.register('copy', Copy) {
   from 'resources'
   into 'target'
   include('**/*.txt', '**/*.xml', '**/*.properties')
}
3. Using the task name as a method call:

A popular option that is only supported in Groovy is the shorthand notation:

copy {
    from("resources")
    into("target")
    include("**/*.txt", "**/*.xml", "**/*.properties")
}
Note
This option breaks task configuration avoidance and is not recommended!

Regardless of the method chosen, the task is configured with the name of the files to be copied and the location of the files.

3. Implementing tasks

Gradle provides many task types including Delete, Javadoc, Copy, Exec, Tar, and Pmd. You can implement a custom task type if Gradle does not provide a task type that meets your build logic needs.

To create a custom task class, you extend DefaultTask and make the extending class abstract:

app/build.gradle.kts
abstract class MyCopyTask : DefaultTask() {

}
app/build.gradle
abstract class MyCopyTask extends DefaultTask {

}


Configuring Tasks Lazily

Knowing when and where a particular value is configured is difficult to track as a build grows in complexity. Gradle provides several ways to manage this using lazy configuration.

writing tasks 4

Understanding Lazy properties

Gradle provides lazy properties, which delay calculating a property’s value until it’s actually required.

Lazy properties provide three main benefits:

  1. Deferred Value Resolution: Allows wiring Gradle models without needing to know when a property’s value will be known. For example, you may want to set the input source files of a task based on the source directories property of an extension, but the extension property value isn’t known until the build script or some other plugin configures them.

  2. Automatic Task Dependency Management: Connects output of one task to input of another, automatically determining task dependencies. Property instances carry information about which task, if any, produces their value. Build authors do not need to worry about keeping task dependencies in sync with configuration changes.

  3. Improved Build Performance: Avoids resource-intensive work during configuration, impacting build performance positively. For example, when a configuration value comes from parsing a file but is only used when functional tests are run, using a property instance to capture this means that the file is parsed only when the functional tests are run (and not when clean is run, for example).

Gradle represents lazy properties with two interfaces:

Provider

Represents a value that can only be queried and cannot be changed.

  • Properties with these types are read-only.

  • The method Provider.get() returns the current value of the property.

  • A Provider can be created from another Provider using Provider.map(Transformer).

  • Many other types extend Provider and can be used wherever a Provider is required.

Property

Represents a value that can be queried and changed.

  • Properties with these types are configurable.

  • Property extends the Provider interface.

  • The method Property.set(T) specifies a value for the property, overwriting whatever value may have been present.

  • The method Property.set(Provider) specifies a Provider for the value for the property, overwriting whatever value may have been present. This allows you to wire together Provider and Property instances before the values are configured.

  • A Property can be created by the factory method ObjectFactory.property(Class).

Lazy properties are intended to be passed around and only queried when required. This typically happens during the execution phase.

The following demonstrates a task with a configurable greeting property and a read-only message property:

build.gradle.kts
abstract class Greeting : DefaultTask() { // (1)
    @get:Input
    abstract val greeting: Property<String> // (2)

    @Internal
    val message: Provider<String> = greeting.map { it + " from Gradle" } // (3)

    @TaskAction
    fun printMessage() {
        logger.quiet(message.get())
    }
}

tasks.register<Greeting>("greeting") {
    greeting.set("Hi") // (4)
    greeting = "Hi" // (5)
}
build.gradle
abstract class Greeting extends DefaultTask { // (1)
    @Input
    abstract Property<String> getGreeting() // (2)

    @Internal
    final Provider<String> message = greeting.map { it + ' from Gradle' } // (3)

    @TaskAction
    void printMessage() {
        logger.quiet(message.get())
    }
}

tasks.register("greeting", Greeting) {
    greeting.set('Hi') // (4)
    greeting = 'Hi' // (5)
}
  1. A task that displays a greeting

  2. A configurable greeting

  3. Read-only property calculated from the greeting

  4. Configure the greeting

  5. Alternative notation to calling Property.set()

$ gradle greeting

> Task :greeting
Hi from Gradle

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

The Greeting task has a property of type Property<String> to represent the configurable greeting and a property of type Provider<String> to represent the calculated, read-only, message. The message Provider is created from the greeting Property using the map() method; its value is kept up-to-date as the value of the greeting property changes.

Creating a Property or Provider instance

Neither Provider nor its subtypes, such as Property, are intended to be implemented by a build script or plugin. Gradle provides factory methods to create instances of these types instead.

In the previous example, two factory methods were presented:

  • ObjectFactory.property(Class) to create a new Property instance. An instance of ObjectFactory can be obtained from Project.getObjects() or injected through a constructor or method.

  • Provider.map(Transformer) to create a new Provider from an existing Provider or Property instance.

See the Quick Reference for all of the types and factories available.

A Provider can also be created by the factory method ProviderFactory.provider(Callable).
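
As a small sketch of these factories in a build script (Kotlin DSL; the property names and values are illustrative):

build.gradle.kts
// ObjectFactory.property(Class): creates a new, configurable Property instance
val greetingProp: Property<String> = objects.property(String::class.java)

// ProviderFactory.provider(Callable): creates a Provider whose value is computed when queried
val stampProvider: Provider<String> = providers.provider { "computed at ${System.currentTimeMillis()}" }

// Provider.map(Transformer): derives a new Provider from an existing Provider or Property
val shoutedGreeting: Provider<String> = greetingProp.map { it.uppercase() }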

Note

There are no specific methods to create a provider using a groovy.lang.Closure.

When writing a plugin or build script with Groovy, you can use the map(Transformer) method with a closure, and Groovy will convert the closure to a Transformer.

Similarly, when writing a plugin or build script with Kotlin, the Kotlin compiler will convert a Kotlin function into a Transformer.

Connecting properties together

An important feature of lazy properties is that they can be connected together so that changes to one property are automatically reflected in other properties.

Here is an example where the property of a task is connected to a property of a project extension:

build.gradle.kts
// A project extension
interface MessageExtension {
    // A configurable greeting
    abstract val greeting: Property<String>
}

// A task that displays a greeting
abstract class Greeting : DefaultTask() {
    // Configurable by the user
    @get:Input
    abstract val greeting: Property<String>

    // Read-only property calculated from the greeting
    @Internal
    val message: Provider<String> = greeting.map { it + " from Gradle" }

    @TaskAction
    fun printMessage() {
        logger.quiet(message.get())
    }
}

// Create the project extension
val messages = project.extensions.create<MessageExtension>("messages")

// Create the greeting task
tasks.register<Greeting>("greeting") {
    // Attach the greeting from the project extension
    // Note that the values of the project extension have not been configured yet
    greeting = messages.greeting
}

messages.apply {
    // Configure the greeting on the extension
    // Note that there is no need to reconfigure the task's `greeting` property. This is automatically updated as the extension property changes
    greeting = "Hi"
}
build.gradle
// A project extension
interface MessageExtension {
    // A configurable greeting
    Property<String> getGreeting()
}

// A task that displays a greeting
abstract class Greeting extends DefaultTask {
    // Configurable by the user
    @Input
    abstract Property<String> getGreeting()

    // Read-only property calculated from the greeting
    @Internal
    final Provider<String> message = greeting.map { it + ' from Gradle' }

    @TaskAction
    void printMessage() {
        logger.quiet(message.get())
    }
}

// Create the project extension
project.extensions.create('messages', MessageExtension)

// Create the greeting task
tasks.register("greeting", Greeting) {
    // Attach the greeting from the project extension
    // Note that the values of the project extension have not been configured yet
    greeting = messages.greeting
}

messages {
    // Configure the greeting on the extension
    // Note that there is no need to reconfigure the task's `greeting` property. This is automatically updated as the extension property changes
    greeting = 'Hi'
}
$ gradle greeting

> Task :greeting
Hi from Gradle

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

This example calls the Property.set(Provider) method to attach a Provider to a Property to supply the value of the property. In this case, the Provider happens to be a Property as well, but you can connect any Provider implementation, for example one created using Provider.map()

Working with files

In Working with Files, we introduced four collection types for File-like objects:

Read-only Type    Configurable Type
FileCollection    ConfigurableFileCollection
FileTree          ConfigurableFileTree

All of these types are also considered lazy types.

There are more strongly typed models used to represent elements of the file system: Directory and RegularFile. These types shouldn’t be confused with the standard Java File type as they are used to tell Gradle that you expect more specific values such as a directory or a non-directory, regular file.

Gradle provides two specialized Property subtypes for dealing with values of these types: RegularFileProperty and DirectoryProperty. ObjectFactory has methods to create these: ObjectFactory.fileProperty() and ObjectFactory.directoryProperty().

A DirectoryProperty can also be used to create a lazily evaluated Provider for a Directory and RegularFile via DirectoryProperty.dir(String) and DirectoryProperty.file(String) respectively. These methods create providers whose values are calculated relative to the location for the DirectoryProperty they were created from. The values returned from these providers will reflect changes to the DirectoryProperty.

build.gradle.kts
// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource : DefaultTask() {
    // The configuration file to use to generate the source file
    @get:InputFile
    abstract val configFile: RegularFileProperty

    // The directory to write source files to
    @get:OutputDirectory
    abstract val outputDir: DirectoryProperty

    @TaskAction
    fun compile() {
        val inFile = configFile.get().asFile
        logger.quiet("configuration file = $inFile")
        val dir = outputDir.get().asFile
        logger.quiet("output dir = $dir")
        val className = inFile.readText().trim()
        val srcFile = File(dir, "${className}.java")
        srcFile.writeText("public class ${className} { }")
    }
}

// Create the source generation task
tasks.register<GenerateSource>("generate") {
    // Configure the locations, relative to the project and build directories
    configFile = layout.projectDirectory.file("src/config.txt")
    outputDir = layout.buildDirectory.dir("generated-source")
}

// Change the build directory
// Don't need to reconfigure the task properties. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")
build.gradle
// A task that generates a source file and writes the result to an output directory
abstract class GenerateSource extends DefaultTask {
    // The configuration file to use to generate the source file
    @InputFile
    abstract RegularFileProperty getConfigFile()

    // The directory to write source files to
    @OutputDirectory
    abstract DirectoryProperty getOutputDir()

    @TaskAction
    def compile() {
        def inFile = configFile.get().asFile
        logger.quiet("configuration file = $inFile")
        def dir = outputDir.get().asFile
        logger.quiet("output dir = $dir")
        def className = inFile.text.trim()
        def srcFile = new File(dir, "${className}.java")
        srcFile.text = "public class ${className} { }"
    }
}

// Create the source generation task
tasks.register('generate', GenerateSource) {
    // Configure the locations, relative to the project and build directories
    configFile = layout.projectDirectory.file('src/config.txt')
    outputDir = layout.buildDirectory.dir('generated-source')
}

// Change the build directory
// Don't need to reconfigure the task properties. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')
$ gradle generate

> Task :generate
configuration file = /home/user/gradle/samples/src/config.txt
output dir = /home/user/gradle/samples/output/generated-source

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed
$ gradle generate

> Task :generate
configuration file = /home/user/gradle/samples/kotlin/src/config.txt
output dir = /home/user/gradle/samples/kotlin/output/generated-source

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

This example creates providers that represent locations in the project and build directories through Project.getLayout() with ProjectLayout.getBuildDirectory() and ProjectLayout.getProjectDirectory().

To close the loop, note that a DirectoryProperty, or a simple Directory, can be turned into a FileTree that allows the files and directories contained in the directory to be queried with DirectoryProperty.getAsFileTree() or Directory.getAsFileTree(). From a DirectoryProperty or a Directory, you can create FileCollection instances containing a set of the files contained in the directory with DirectoryProperty.files(Object...) or Directory.files(Object...).
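
A small sketch of these conversions (Kotlin DSL; the generated-source location reuses the directory from the example above):

build.gradle.kts
val generatedDir: DirectoryProperty = objects.directoryProperty()
generatedDir.set(layout.buildDirectory.dir("generated-source"))

// Lazily query everything under the directory as a FileTree
val generatedSources: FileTree = generatedDir.asFileTree

// Build a FileCollection containing specific files inside the directory
val selectedSources: FileCollection = generatedDir.files("Main.java")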

Working with task inputs and outputs

Many builds have several tasks connected together, where one task consumes the outputs of another task as an input.

To make this work, we need to configure each task to know where to look for its inputs and where to place its outputs. Ensure that the producing and consuming tasks are configured with the same location and attach task dependencies between the tasks. This can be cumbersome and brittle if any of these values are configurable by a user or configured by multiple plugins, as task properties need to be configured in the correct order and locations, and task dependencies kept in sync as values change.

The Property API makes this easier by keeping track of the value of a property and the task that produces the value.

As an example, consider the following plugin with a producer and consumer task which are wired together:

build.gradle.kts
abstract class Producer : DefaultTask() {
    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() {
        val message = "Hello, World!"
        val output = outputFile.get().asFile
        output.writeText( message)
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer : DefaultTask() {
    @get:InputFile
    abstract val inputFile: RegularFileProperty

    @TaskAction
    fun consume() {
        val input = inputFile.get().asFile
        val message = input.readText()
        logger.quiet("Read '${message}' from ${input}")
    }
}

val producer = tasks.register<Producer>("producer")
val consumer = tasks.register<Consumer>("consumer")

consumer {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is automatically added
    inputFile = producer.flatMap { it.outputFile }
}

producer {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file("file.txt")
}

// Change the build directory.
// Don't need to update producer.outputFile and consumer.inputFile. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")
build.gradle
abstract class Producer extends DefaultTask {
    @OutputFile
    abstract RegularFileProperty getOutputFile()

    @TaskAction
    void produce() {
        String message = 'Hello, World!'
        def output = outputFile.get().asFile
        output.text = message
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer extends DefaultTask {
    @InputFile
    abstract RegularFileProperty getInputFile()

    @TaskAction
    void consume() {
        def input = inputFile.get().asFile
        def message = input.text
        logger.quiet("Read '${message}' from ${input}")
    }
}

def producer = tasks.register("producer", Producer)
def consumer = tasks.register("consumer", Consumer)

consumer.configure {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is automatically added
    inputFile = producer.flatMap { it.outputFile }
}

producer.configure {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file('file.txt')
}

// Change the build directory.
// Don't need to update producer.outputFile and consumer.inputFile. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')
$ gradle consumer

> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/output/file.txt

> Task :consumer
Read 'Hello, World!' from /home/user/gradle/samples/output/file.txt

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
$ gradle consumer

> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/file.txt

> Task :consumer
Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/file.txt

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

In the example above, the task outputs and inputs are connected before any location is defined. The setters can be called at any time before the task is executed, and the change will automatically affect all related input and output properties.

Another important thing to note in this example is the absence of any explicit task dependency. Task outputs represented using Providers keep track of which task produces their value, and using them as task inputs will implicitly add the correct task dependencies.

Implicit task dependencies also work for input properties that are not files:

build.gradle.kts
abstract class Producer : DefaultTask() {
    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() {
        val message = "Hello, World!"
        val output = outputFile.get().asFile
        output.writeText( message)
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer : DefaultTask() {
    @get:Input
    abstract val message: Property<String>

    @TaskAction
    fun consume() {
        logger.quiet(message.get())
    }
}

val producer = tasks.register<Producer>("producer") {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file("file.txt")
}
tasks.register<Consumer>("consumer") {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is automatically added
    message = producer.flatMap { it.outputFile }.map { it.asFile.readText() }
}
build.gradle
abstract class Producer extends DefaultTask {
    @OutputFile
    abstract RegularFileProperty getOutputFile()

    @TaskAction
    void produce() {
        String message = 'Hello, World!'
        def output = outputFile.get().asFile
        output.text = message
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer extends DefaultTask {
    @Input
    abstract Property<String> getMessage()

    @TaskAction
    void consume() {
        logger.quiet(message.get())
    }
}

def producer = tasks.register('producer', Producer) {
    // Set values for the producer lazily
    // Don't need to update the consumer.inputFile property. This is automatically updated as producer.outputFile changes
    outputFile = layout.buildDirectory.file('file.txt')
}
tasks.register('consumer', Consumer) {
    // Connect the producer task output to the consumer task input
    // Don't need to add a task dependency to the consumer task. This is automatically added
    message = producer.flatMap { it.outputFile }.map { it.asFile.text }
}
$ gradle consumer

> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/build/file.txt

> Task :consumer
Hello, World!

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed
$ gradle consumer

> Task :producer
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/build/file.txt

> Task :consumer
Hello, World!

BUILD SUCCESSFUL in 0s
2 actionable tasks: 2 executed

Working with collections

Gradle provides two lazy property types to help configure Collection properties: ListProperty<T> for List values and SetProperty<T> for Set values.

These work exactly like any other Provider and, just like file providers, they have additional modeling around them: you can overwrite the entire collection with HasMultipleValues.set(Iterable) or HasMultipleValues.set(Provider), or add new elements through the various add and addAll methods.
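
As a hedged sketch of those methods (the property name and values below are made up), a build script could populate a ListProperty like this:

val names = objects.listProperty(String::class.java)
names.set(listOf("alpha"))                              // replace the entire collection
names.add("beta")                                       // add a fixed element
names.add(providers.provider { "gamma" })               // add a lazily computed element
names.addAll(providers.provider { listOf("delta") })    // add several lazily computed elements
println(names.get())                                    // [alpha, beta, gamma, delta]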

Just like every Provider, the collection is calculated when Provider.get() is called. The following example shows the ListProperty in action:

build.gradle.kts
abstract class Producer : DefaultTask() {
    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() {
        val message = "Hello, World!"
        val output = outputFile.get().asFile
        output.writeText(message)
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer : DefaultTask() {
    @get:InputFiles
    abstract val inputFiles: ListProperty<RegularFile>

    @TaskAction
    fun consume() {
        inputFiles.get().forEach { inputFile ->
            val input = inputFile.asFile
            val message = input.readText()
            logger.quiet("Read '${message}' from ${input}")
        }
    }
}

val producerOne = tasks.register<Producer>("producerOne")
val producerTwo = tasks.register<Producer>("producerTwo")
tasks.register<Consumer>("consumer") {
    // Connect the producer task outputs to the consumer task input
    // Don't need to add task dependencies to the consumer task. These are automatically added
    inputFiles.add(producerOne.get().outputFile)
    inputFiles.add(producerTwo.get().outputFile)
}

// Set values for the producer tasks lazily
// Don't need to update the consumer.inputFiles property. This is automatically updated as producer.outputFile changes
producerOne { outputFile = layout.buildDirectory.file("one.txt") }
producerTwo { outputFile = layout.buildDirectory.file("two.txt") }

// Change the build directory.
// Don't need to update the task properties. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir("output")
build.gradle
abstract class Producer extends DefaultTask {
    @OutputFile
    abstract RegularFileProperty getOutputFile()

    @TaskAction
    void produce() {
        String message = 'Hello, World!'
        def output = outputFile.get().asFile
        output.text = message
        logger.quiet("Wrote '${message}' to ${output}")
    }
}

abstract class Consumer extends DefaultTask {
    @InputFiles
    abstract ListProperty<RegularFile> getInputFiles()

    @TaskAction
    void consume() {
        inputFiles.get().each { inputFile ->
            def input = inputFile.asFile
            def message = input.text
            logger.quiet("Read '${message}' from ${input}")
        }
    }
}

def producerOne = tasks.register('producerOne', Producer)
def producerTwo = tasks.register('producerTwo', Producer)
tasks.register('consumer', Consumer) {
    // Connect the producer task outputs to the consumer task input
    // Don't need to add task dependencies to the consumer task. These are automatically added
    inputFiles.add(producerOne.get().outputFile)
    inputFiles.add(producerTwo.get().outputFile)
}

// Set values for the producer tasks lazily
// Don't need to update the consumer.inputFiles property. This is automatically updated as producer.outputFile changes
producerOne.configure { outputFile = layout.buildDirectory.file('one.txt') }
producerTwo.configure { outputFile = layout.buildDirectory.file('two.txt') }

// Change the build directory.
// Don't need to update the task properties. These are automatically updated as the build directory changes
layout.buildDirectory = layout.projectDirectory.dir('output')
$ gradle consumer

> Task :producerOne
Wrote 'Hello, World!' to /home/user/gradle/samples/output/one.txt

> Task :producerTwo
Wrote 'Hello, World!' to /home/user/gradle/samples/output/two.txt

> Task :consumer
Read 'Hello, World!' from /home/user/gradle/samples/output/one.txt
Read 'Hello, World!' from /home/user/gradle/samples/output/two.txt

BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed
$ gradle consumer

> Task :producerOne
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/one.txt

> Task :producerTwo
Wrote 'Hello, World!' to /home/user/gradle/samples/kotlin/output/two.txt

> Task :consumer
Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/one.txt
Read 'Hello, World!' from /home/user/gradle/samples/kotlin/output/two.txt

BUILD SUCCESSFUL in 0s
3 actionable tasks: 3 executed

Working with maps

Gradle provides a lazy MapProperty type to allow Map values to be configured. You can create a MapProperty instance using ObjectFactory.mapProperty(Class, Class).

Similar to other property types, a MapProperty has a set() method that you can use to specify the value for the property. Some additional methods allow entries with lazy values to be added to the map.
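
For instance, here is a hedged sketch of creating and populating a MapProperty directly in a build script (the keys and versions below are made up):

val versions = objects.mapProperty(String::class.java, String::class.java)
versions.set(mapOf("commons-io" to "2.5"))                 // replace the whole map
versions.put("commons-codec", "1.10")                      // add a fixed entry
versions.put("guava", providers.provider { "31.1-jre" })   // add an entry with a lazily computed value
println(versions.get())

The task-based example below wires the same kind of put and putAll calls into a task input.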

build.gradle.kts
abstract class Generator: DefaultTask() {
    @get:Input
    abstract val properties: MapProperty<String, Int>

    @TaskAction
    fun generate() {
        properties.get().forEach { entry ->
            logger.quiet("${entry.key} = ${entry.value}")
        }
    }
}

// Some values to be configured later
var b = 0
var c = 0

tasks.register<Generator>("generate") {
    properties.put("a", 1)
    // Values have not been configured yet
    properties.put("b", providers.provider { b })
    properties.putAll(providers.provider { mapOf("c" to c, "d" to c + 1) })
}

// Configure the values. There is no need to reconfigure the task
b = 2
c = 3
build.gradle
abstract class Generator extends DefaultTask {
    @Input
    abstract MapProperty<String, Integer> getProperties()

    @TaskAction
    void generate() {
        properties.get().each { key, value ->
            logger.quiet("${key} = ${value}")
        }
    }
}

// Some values to be configured later
def b = 0
def c = 0

tasks.register('generate', Generator) {
    properties.put("a", 1)
    // Values have not been configured yet
    properties.put("b", providers.provider { b })
    properties.putAll(providers.provider { [c: c, d: c + 1] })
}

// Configure the values. There is no need to reconfigure the task
b = 2
c = 3
$ gradle generate

> Task :generate
a = 1
b = 2
c = 3
d = 4

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

Applying a convention to a property

Often, you want to apply some convention, or default value, to a property that is used if no value has been configured. You can use the convention() method for this. It accepts either a value or a Provider, and this will be used as the property's value until some other value is configured.

build.gradle.kts
tasks.register("show") {
    val property = objects.property(String::class)

    // Set a convention
    property.convention("convention 1")

    println("value = " + property.get())

    // Can replace the convention
    property.convention("convention 2")
    println("value = " + property.get())

    property.set("explicit value")

    // Once a value is set, the convention is ignored
    property.convention("ignored convention")

    doLast {
        println("value = " + property.get())
    }
}
build.gradle
tasks.register("show") {
    def property = objects.property(String)

    // Set a convention
    property.convention("convention 1")

    println("value = " + property.get())

    // Can replace the convention
    property.convention("convention 2")
    println("value = " + property.get())

    property.set("explicit value")

    // Once a value is set, the convention is ignored
    property.convention("ignored convention")

    doLast {
        println("value = " + property.get())
    }
}
$ gradle show
value = convention 1
value = convention 2

> Task :show
value = explicit value

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

Where to apply conventions from?

There are several appropriate locations for setting a convention on a property at configuration time (i.e., before execution).

build.gradle.kts
// setting convention when registering a task from plugin
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.getTasks().register<GreetingTask>("hello") {
            greeter.convention("Greeter")
        }
    }
}

apply<GreetingPlugin>()

tasks.withType<GreetingTask>().configureEach {
    // setting convention from build script
    guest.convention("Guest")
}

abstract class GreetingTask : DefaultTask() {
    // setting convention from constructor
    @get:Input
    abstract val guest: Property<String>

    init {
        guest.convention("person2")
    }

    // setting convention from declaration
    @Input
    val greeter = project.objects.property<String>().convention("person1")

    @TaskAction
    fun greet() {
        println("hello, ${guest.get()}, from ${greeter.get()}")
    }
}
build.gradle
// setting convention when registering a task from plugin
class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.getTasks().register("hello", GreetingTask) {
            greeter.convention("Greeter")
        }
    }
}

apply plugin: GreetingPlugin

tasks.withType(GreetingTask).configureEach {
    // setting convention from build script
    guest.convention("Guest")
}

abstract class GreetingTask extends DefaultTask {
    // setting convention from constructor
    @Input
    abstract Property<String> getGuest()

    GreetingTask() {
        guest.convention("person2")
    }

    // setting convention from declaration
    @Input
    final Property<String> greeter = project.objects.property(String).convention("person1")

    @TaskAction
    void greet() {
        println("hello, ${guest.get()}, from ${greeter.get()}")
    }
}

From a plugin’s apply() method

Plugin authors may configure a convention on a lazy property from a plugin’s apply() method, while performing preliminary configuration of the task or extension defining the property. This works well both for regular plugins (meant to be distributed and used in the wild) and for internal convention plugins (which often configure properties defined by third-party plugins in a uniform way for the entire build).

build.gradle.kts
// setting convention when registering a task from plugin
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.getTasks().register<GreetingTask>("hello") {
            greeter.convention("Greeter")
        }
    }
}
build.gradle
// setting convention when registering a task from plugin
class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.getTasks().register("hello", GreetingTask) {
            greeter.convention("Greeter")
        }
    }
}

From a build script

Build engineers may configure a convention on a lazy property from shared build logic that configures tasks (for instance, tasks defined by third-party plugins) in a standard way for the entire build.

build.gradle.kts
apply<GreetingPlugin>()

tasks.withType<GreetingTask>().configureEach {
    // setting convention from build script
    guest.convention("Guest")
}
build.gradle
tasks.withType(GreetingTask).configureEach {
    // setting convention from build script
    guest.convention("Guest")
}

Note that for project-specific values, instead of conventions, you should prefer setting explicit values (using Property.set(…​) or ConfigurableFileCollection.setFrom(…​), for instance), as conventions are only meant to define defaults.
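
As a hedged illustration of that distinction (the value below is made up), reusing the GreetingTask example from above:

tasks.withType<GreetingTask>().configureEach {
    // Explicit, project-specific value: always wins over any convention
    guest.set("Alice")

    // A convention would only supply a default, to be overridden elsewhere:
    // guest.convention("Guest")
}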

From the task initialization

A task author may configure a convention on a lazy property from the task constructor or (if in Kotlin) initializer block. This approach works for properties with trivial defaults, but it is not appropriate if additional context (external to the task implementation) is required in order to set a suitable default.

build.gradle.kts
// setting convention from constructor
@get:Input
abstract val guest: Property<String>

init {
    guest.convention("person2")
}
build.gradle
// setting convention from constructor
@Input
abstract Property<String> getGuest()

GreetingTask() {
    guest.convention("person2")
}

Next to the property declaration

You may configure a convention on a lazy property next to the place where the property is declared. Note that this option is not available for managed properties, and it has the same caveats as configuring a convention from the task constructor.

build.gradle.kts
// setting convention from declaration
@Input
val greeter = project.objects.property<String>().convention("person1")
build.gradle
// setting convention from declaration
@Input
final Property<String> greeter = project.objects.property(String).convention("person1")

Making a property unmodifiable

Most properties of a task or project are intended to be configured by plugins or build scripts so that they can use specific values for that build.

For example, a property that specifies the output directory for a compilation task may start with a value specified by a plugin. A build script might then change the value to some custom location, and the task uses that value when it runs. However, once the task starts to run, we want to prevent further property changes. This way, we avoid errors that result from different consumers (such as the task action, Gradle’s up-to-date checks, build caching, or other tasks) using different values for the property.

Lazy properties provide several methods that you can use to disallow changes to their value once the value has been configured. The finalizeValue() method calculates the final value for the property and prevents further changes to the property.

libVersioning.version.finalizeValue()

When the property’s value comes from a Provider, the provider is queried for its current value, and the result becomes the final value for the property. This final value replaces the provider and the property no longer tracks the value of the provider. Calling this method also makes a property instance unmodifiable and any further attempts to change the value of the property will fail. Gradle automatically makes the properties of a task final when the task starts execution.

The finalizeValueOnRead() method is similar, except that the property’s final value is not calculated until the value of the property is queried.

modifiedFiles.finalizeValueOnRead()

In other words, this method calculates the final value lazily as required, whereas finalizeValue() calculates the final value eagerly. Use it when the value may be expensive to calculate or may not have been configured yet, and when you want to ensure that all consumers of the property see the same value when they query it.
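
The following is a minimal sketch of the difference (the task name, property names, and values are hypothetical): finalizeValue() resolves the provider immediately, while finalizeValueOnRead() defers resolution until the property is first queried:

tasks.register("finalizeDemo") {
    val eager = objects.property(String::class)
    eager.set(providers.provider { "computed now" })
    eager.finalizeValue()             // provider queried here; the property becomes unmodifiable

    val onRead = objects.property(String::class)
    onRead.set(providers.provider { "computed on first get()" })
    onRead.finalizeValueOnRead()      // provider not queried yet; resolved and frozen on first get()

    doLast {
        println(eager.get())
        println(onRead.get())         // the final value is calculated here
        // eager.set("other")         // would now fail: the value is final
    }
}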

Using the Provider API

Guidelines to be successful with the Provider API:

  1. The Property and Provider types have all of the overloads you need to query or configure a value. For this reason, you should follow these guidelines:

    • For configurable properties, expose the Property directly through a single getter.

    • For non-configurable properties, expose a Provider directly through a single getter.

  2. Avoid simplifying calls like obj.getProperty().get() and obj.getProperty().set(T) in your code by introducing additional getters and setters.

  3. When migrating your plugin to use providers, follow these guidelines:

    • If it’s a new property, expose it as a Property or Provider using a single getter.

    • If it’s incubating, change it to use a Property or Provider using a single getter.

    • If it’s a stable property, add a new Property or Provider and deprecate the old one. You should wire the old getter/setters into the new property as appropriate.
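
A hedged sketch of guideline 1 above (the extension name and properties are made up): expose the lazy type itself through a single getter and let callers use its own set() and get() methods:

import org.gradle.api.provider.Property
import org.gradle.api.provider.Provider

abstract class SiteExtension {
    // Configurable: expose the Property directly; no extra getSiteUrl()/setSiteUrl(String) pair
    abstract val siteUrl: Property<String>

    // Non-configurable: expose a Provider derived from the configurable property
    val reportUrl: Provider<String>
        get() = siteUrl.map { "$it/reports" }
}

A build script would then configure it with, for example, siteUrl.set("https://example.com") rather than through a custom setter.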

Property Files API Reference

Use these types for mutable values:

RegularFileProperty

File on disk

DirectoryProperty

Directory on disk

ConfigurableFileCollection

Unstructured collection of files

ConfigurableFileTree

Hierarchy of files

SourceDirectorySet

Hierarchy of source directories

Lazy Collections API Reference

Use these types for mutable values:

ListProperty<T>

a property whose value is List<T>

SetProperty<T>

a property whose value is Set<T>

Lazy Objects API Reference

Use these types for read-only values:

Provider<T>

a read-only value of type T

Use these types for mutable values:

Property<T>

a property whose value is an instance of T

Developing Parallel Tasks

Gradle provides an API that can split tasks into sections that can be executed in parallel.

This allows Gradle to fully utilize the resources available and complete builds faster.

The Worker API

The Worker API provides the ability to break up the execution of a task action into discrete units of work and then execute that work concurrently and asynchronously.

Worker API example

The best way to understand how to use the API is to go through the process of converting an existing custom task to use the Worker API:

  1. You’ll start by creating a custom task class that generates MD5 hashes for a configurable set of files.

  2. Then, you’ll convert this custom task to use the Worker API.

  3. Then, you’ll explore running the task with different levels of isolation.

In the process, you’ll learn about the basics of the Worker API and the capabilities it provides.

Step 1. Create a custom task class

First, create a custom task that generates MD5 hashes of a configurable set of files.

In a new directory, create a buildSrc/build.gradle(.kts) file:

buildSrc/build.gradle.kts
repositories {
    mavenCentral()
}

dependencies {
    implementation("commons-io:commons-io:2.5")
    implementation("commons-codec:commons-codec:1.9") // (1)
}
buildSrc/build.gradle
repositories {
    mavenCentral()
}

dependencies {
    implementation 'commons-io:commons-io:2.5'
    implementation 'commons-codec:commons-codec:1.9' // (1)
}
  1. Your custom task class will use Apache Commons Codec to generate MD5 hashes.

Next, create a custom task class in your buildSrc/src/main/java directory. You should name this class CreateMD5:

buildSrc/src/main/java/CreateMD5.java
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.FileUtils;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.OutputDirectory;
import org.gradle.api.tasks.SourceTask;
import org.gradle.api.tasks.TaskAction;
import org.gradle.workers.WorkerExecutor;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

abstract public class CreateMD5 extends SourceTask { // (1)

    @OutputDirectory
    abstract public DirectoryProperty getDestinationDirectory(); // (2)

    @TaskAction
    public void createHashes() {
        for (File sourceFile : getSource().getFiles()) { // (3)
            try {
                InputStream stream = new FileInputStream(sourceFile);
                System.out.println("Generating MD5 for " + sourceFile.getName() + "...");
                // Artificially make this task slower.
                Thread.sleep(3000); // (4)
                Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile.getName() + ".md5");  // (5)
                FileUtils.writeStringToFile(md5File.get().getAsFile(), DigestUtils.md5Hex(stream), (String) null);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }
}
  1. SourceTask is a convenience type for tasks that operate on a set of source files.

  2. The task output will go into a configured directory.

  3. The task iterates over all the files defined as "source files" and creates an MD5 hash of each.

  4. Insert an artificial sleep to simulate hashing a large file (the sample files won’t be that large).

  5. The MD5 hash of each file is written to the output directory into a file of the same name with an "md5" extension.

Next, create a build.gradle(.kts) that registers your new CreateMD5 task:

build.gradle.kts
plugins { id("base") } // (1)

tasks.register<CreateMD5>("md5") {
    destinationDirectory = project.layout.buildDirectory.dir("md5") // (2)
    source(project.layout.projectDirectory.file("src")) // (3)
}
build.gradle
plugins { id 'base' } // (1)

tasks.register("md5", CreateMD5) {
    destinationDirectory = project.layout.buildDirectory.dir("md5") // (2)
    source(project.layout.projectDirectory.file('src')) // (3)
}
  1. Apply the base plugin so that you’ll have a clean task to use to remove the output.

  2. MD5 hash files will be written to build/md5.

  3. This task will generate MD5 hash files for every file in the src directory.

You will need some source to generate MD5 hashes from. Create three files in the src directory:

src/einstein.txt
Intellectual growth should commence at birth and cease only at death.
src/feynman.txt
I was born not knowing and have had only a little time to change that here and there.
src/hawking.txt
Intelligence is the ability to adapt to change.

At this point, you can test your task by running gradle md5:

$ gradle md5

The output should look similar to:

> Task :md5
Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 9s
3 actionable tasks: 3 executed

In the build/md5 directory, you should now see corresponding files with an md5 extension containing MD5 hashes of the files from the src directory. Notice that the task takes at least 9 seconds to run because it hashes each file one at a time (i.e., three files at ~3 seconds apiece).

Step 2. Convert to the Worker API

Although this task processes each file in sequence, the processing of each file is independent of any other file. This work can be done in parallel and take advantage of multiple processors. This is where the Worker API can help.

To use the Worker API, you need to define an interface that represents the parameters of each unit of work and extends org.gradle.workers.WorkParameters.

For the generation of MD5 hash files, the unit of work will require two parameters:

  1. the file to be hashed and,

  2. the file to write the hash to.

There is no need to create a concrete implementation because Gradle will generate one for us at runtime.

buildSrc/src/main/java/MD5WorkParameters.java
import org.gradle.api.file.RegularFileProperty;
import org.gradle.workers.WorkParameters;

public interface MD5WorkParameters extends WorkParameters {
    RegularFileProperty getSourceFile(); // (1)
    RegularFileProperty getMD5File();
}
  1. Use Property objects to represent the source and MD5 hash files.

Then, you need to refactor the part of your custom task that does the work for each individual file into a separate class. This class is your "unit of work" implementation, and it should be an abstract class that implements org.gradle.workers.WorkAction:

buildSrc/src/main/java/GenerateMD5.java
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.FileUtils;
import org.gradle.workers.WorkAction;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

public abstract class GenerateMD5 implements WorkAction<MD5WorkParameters> { // (1)
    @Override
    public void execute() {
        try {
            File sourceFile = getParameters().getSourceFile().getAsFile().get();
            File md5File = getParameters().getMD5File().getAsFile().get();
            InputStream stream = new FileInputStream(sourceFile);
            System.out.println("Generating MD5 for " + sourceFile.getName() + "...");
            // Artificially make this task slower.
            Thread.sleep(3000);
            FileUtils.writeStringToFile(md5File, DigestUtils.md5Hex(stream), (String) null);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
  1. Do not implement the getParameters() method - Gradle will inject this at runtime.

Now, change your custom task class to submit work to the WorkerExecutor instead of doing the work itself.

buildSrc/src/main/java/CreateMD5.java
import org.gradle.api.Action;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.workers.*;
import org.gradle.api.file.DirectoryProperty;

import javax.inject.Inject;
import java.io.File;

abstract public class CreateMD5 extends SourceTask {

    @OutputDirectory
    abstract public DirectoryProperty getDestinationDirectory();

    @Inject
    abstract public WorkerExecutor getWorkerExecutor(); // (1)

    @TaskAction
    public void createHashes() {
        WorkQueue workQueue = getWorkerExecutor().noIsolation(); // (2)

        for (File sourceFile : getSource().getFiles()) {
            Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile.getName() + ".md5");
            workQueue.submit(GenerateMD5.class, parameters -> { // (3)
                parameters.getSourceFile().set(sourceFile);
                parameters.getMD5File().set(md5File);
            });
        }
    }
}
  1. The WorkerExecutor service is required in order to submit your work. Create an abstract getter method annotated with javax.inject.Inject, and Gradle will inject the service at runtime when the task is created.

  2. Before submitting work, get a WorkQueue object with the desired isolation mode (described below).

  3. When submitting the unit of work, specify the unit of work implementation, in this case GenerateMD5, and configure its parameters.

At this point, you should be able to rerun your task:

$ gradle clean md5

> Task :md5
Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed

The results should look the same as before, although the MD5 hash files may be generated in a different order since the units of work are executed in parallel. This time, however, the task runs much faster. This is because the Worker API executes the MD5 calculation for each file in parallel rather than in sequence.

Step 3. Change the isolation mode

The isolation mode controls how strongly Gradle will isolate items of work from each other and the rest of the Gradle runtime.

There are three methods on WorkerExecutor that control this:

  1. noIsolation()

  2. classLoaderIsolation()

  3. processIsolation()

The noIsolation() mode is the lowest level of isolation and will prevent a unit of work from changing the project state. This is the fastest isolation mode because it requires the least overhead to set up and execute the work item. However, it will use a single shared classloader for all units of work. This means that each unit of work can affect one another through static class state. It also means that every unit of work uses the same version of libraries on the buildscript classpath. If you wanted the user to be able to configure the task to run with a different (but compatible) version of the Apache Commons Codec library, you would need to use a different isolation mode.

First, you must change the dependency in buildSrc/build.gradle(.kts) to be compileOnly. This tells Gradle that it should use this dependency when building the classes, but should not put it on the build script classpath:

buildSrc/build.gradle.kts
repositories {
    mavenCentral()
}

dependencies {
    implementation("commons-io:commons-io:2.5")
    compileOnly("commons-codec:commons-codec:1.9")
}
buildSrc/build.gradle
repositories {
    mavenCentral()
}

dependencies {
    implementation 'commons-io:commons-io:2.5'
    compileOnly 'commons-codec:commons-codec:1.9'
}

Next, change the CreateMD5 task to allow the user to configure the version of the codec library that they want to use. It will resolve the appropriate version of the library at runtime and configure the workers to use this version.

The classLoaderIsolation() method tells Gradle to run this work in a thread with an isolated classloader:

buildSrc/src/main/java/CreateMD5.java
import org.gradle.api.Action;
import org.gradle.api.file.ConfigurableFileCollection;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.process.JavaForkOptions;
import org.gradle.workers.*;

import javax.inject.Inject;
import java.io.File;
import java.util.Set;

abstract public class CreateMD5 extends SourceTask {

    @InputFiles
    abstract public ConfigurableFileCollection getCodecClasspath(); // (1)

    @OutputDirectory
    abstract public DirectoryProperty getDestinationDirectory();

    @Inject
    abstract public WorkerExecutor getWorkerExecutor();

    @TaskAction
    public void createHashes() {
        WorkQueue workQueue = getWorkerExecutor().classLoaderIsolation(workerSpec -> {
            workerSpec.getClasspath().from(getCodecClasspath()); // (2)
        });

        for (File sourceFile : getSource().getFiles()) {
            Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile.getName() + ".md5");
            workQueue.submit(GenerateMD5.class, parameters -> {
                parameters.getSourceFile().set(sourceFile);
                parameters.getMD5File().set(md5File);
            });
        }
    }
}
  1. Expose an input property for the codec library classpath.

  2. Configure the classpath on the ClassLoaderWorkerSpec when creating the work queue.

Next, you need to configure your build so that it has a repository to look up the codec version at task execution time. We also create a dependency to resolve our codec library from this repository:

build.gradle.kts
plugins { id("base") }

repositories {
    mavenCentral() // (1)
}

val codec = configurations.create("codec") { // (2)
    attributes {
        attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage.JAVA_RUNTIME))
    }
    isVisible = false
    isCanBeConsumed = false
}

dependencies {
    codec("commons-codec:commons-codec:1.10") // (3)
}

tasks.register<CreateMD5>("md5") {
    codecClasspath.from(codec) // (4)
    destinationDirectory = project.layout.buildDirectory.dir("md5")
    source(project.layout.projectDirectory.file("src"))
}
build.gradle
plugins { id 'base' }

repositories {
    mavenCentral() // (1)
}

configurations.create('codec') { // (2)
    attributes {
        attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage.JAVA_RUNTIME))
    }
    visible = false
    canBeConsumed = false
}

dependencies {
    codec 'commons-codec:commons-codec:1.10' // (3)
}

tasks.register('md5', CreateMD5) {
    codecClasspath.from(configurations.codec) // (4)
    destinationDirectory = project.layout.buildDirectory.dir('md5')
    source(project.layout.projectDirectory.file('src'))
}
  1. Add a repository to resolve the codec library - this can be a different repository than the one used to build the CreateMD5 task class.

  2. Add a configuration to resolve our codec library version.

  3. Configure an alternate, compatible version of Apache Commons Codec.

  4. Configure the md5 task to use the configuration as its classpath. Note that the configuration will not be resolved until the task is executed.

Now, if you run your task, it should work as expected using the configured version of the codec library:

$ gradle clean md5

> Task :md5
Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed
Step 4. Create a Worker Daemon

Sometimes, it is desirable to utilize even greater levels of isolation when executing items of work. For instance, external libraries may rely on certain system properties to be set, which may conflict between work items. Or a library might not be compatible with the version of JDK that Gradle is running with and may need to be run with a different version.

The Worker API can accommodate this using the processIsolation() method that causes the work to execute in a separate "worker daemon". These worker processes will be session-scoped and can be reused within the same build session, but they won’t persist across builds. However, if system resources get low, Gradle will stop unused worker daemons.

To utilize a worker daemon, use the processIsolation() method when creating the WorkQueue. You may also want to configure custom settings for the new process:

buildSrc/src/main/java/CreateMD5.java
import org.gradle.api.Action;
import org.gradle.api.file.ConfigurableFileCollection;
import org.gradle.api.file.DirectoryProperty;
import org.gradle.api.file.RegularFile;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.*;
import org.gradle.process.JavaForkOptions;
import org.gradle.workers.*;

import javax.inject.Inject;
import java.io.File;
import java.util.Set;

abstract public class CreateMD5 extends SourceTask {

    @InputFiles
    abstract public ConfigurableFileCollection getCodecClasspath();

    @OutputDirectory
    abstract public DirectoryProperty getDestinationDirectory();

    @Inject
    abstract public WorkerExecutor getWorkerExecutor();

    @TaskAction
    public void createHashes() {
        WorkQueue workQueue = getWorkerExecutor().processIsolation(workerSpec -> { // (1)
            workerSpec.getClasspath().from(getCodecClasspath());
            workerSpec.forkOptions(options -> {
                options.setMaxHeapSize("64m"); // (2)
            });
        });

        for (File sourceFile : getSource().getFiles()) {
            Provider<RegularFile> md5File = getDestinationDirectory().file(sourceFile.getName() + ".md5");
            workQueue.submit(GenerateMD5.class, parameters -> {
                parameters.getSourceFile().set(sourceFile);
                parameters.getMD5File().set(md5File);
            });
        }
    }
}
  1. Change the isolation mode to PROCESS.

  2. Set up the JavaForkOptions for the new process.

Now, you should be able to run your task, and it will work as expected but using worker daemons instead:

$ gradle clean md5

> Task :md5
Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 3s
3 actionable tasks: 3 executed

Note that the execution time may be high. This is because Gradle has to start a new process for each worker daemon, which is expensive.

However, if you run your task a second time, you will see that it runs much faster. This is because the worker daemon(s) started during the initial build have persisted and are available for use immediately during subsequent builds:

$ gradle clean md5

> Task :md5
Generating MD5 for einstein.txt...
Generating MD5 for feynman.txt...
Generating MD5 for hawking.txt...

BUILD SUCCESSFUL in 1s
3 actionable tasks: 3 executed

Isolation modes

Gradle provides three isolation modes that can be configured when creating a WorkQueue and are specified using one of the following methods on WorkerExecutor:

WorkerExecutor.noIsolation()

This states that the work should be run in a thread with minimal isolation.
For instance, it will share the same classloader that the task is loaded from. This is the fastest level of isolation.

WorkerExecutor.classLoaderIsolation()

This states that the work should be run in a thread with an isolated classloader.
The classloader will have the classpath from the classloader that the unit of work implementation class was loaded from as well as any additional classpath entries added through ClassLoaderWorkerSpec.getClasspath().

WorkerExecutor.processIsolation()

This states that the work should be run with a maximum isolation level by executing the work in a separate process.
The classloader of the process will use the classpath from the classloader that the unit of work was loaded from as well as any additional classpath entries added through ClassLoaderWorkerSpec.getClasspath(). Furthermore, the process will be a worker daemon that will stay alive and can be reused for future work items with the same requirements. This process can be configured with different settings than the Gradle JVM using ProcessWorkerSpec.forkOptions(org.gradle.api.Action).

Worker Daemons

When using processIsolation(), Gradle will start a long-lived worker daemon process that can be reused for future work items.

build.gradle.kts
// Create a WorkQueue with process isolation
val workQueue = workerExecutor.processIsolation() {
    // Configure the options for the forked process
    forkOptions {
        maxHeapSize = "512m"
        systemProperty("org.gradle.sample.showFileSize", "true")
    }
}

// Create and submit a unit of work for each file
source.forEach { file ->
    workQueue.submit(ReverseFile::class) {
        fileToReverse = file
        destinationDir = outputDir
    }
}
build.gradle
// Create a WorkQueue with process isolation
WorkQueue workQueue = workerExecutor.processIsolation() { ProcessWorkerSpec spec ->
    // Configure the options for the forked process
    forkOptions { JavaForkOptions options ->
        options.maxHeapSize = "512m"
        options.systemProperty "org.gradle.sample.showFileSize", "true"
    }
}

// Create and submit a unit of work for each file
source.each { file ->
    workQueue.submit(ReverseFile.class) { ReverseParameters parameters ->
        parameters.fileToReverse = file
        parameters.destinationDir = outputDir
    }
}

When a unit of work for a worker daemon is submitted, Gradle will first look to see if a compatible, idle daemon already exists. If so, it will send the unit of work to the idle daemon, marking it as busy. If not, it will start a new daemon. When evaluating compatibility, Gradle looks at a number of criteria, all of which can be controlled through ProcessWorkerSpec.forkOptions(org.gradle.api.Action).

By default, a worker daemon starts with a maximum heap of 512MB. This can be changed by adjusting the workers' fork options.

executable

A daemon is considered compatible only if it uses the same Java executable.

classpath

A daemon is considered compatible only if its classpath exactly matches the requested classpath.

heap settings

A daemon is considered compatible if it has at least the same heap size settings as requested.
In other words, a daemon that has higher heap settings than requested would be considered compatible.

jvm arguments

A daemon is compatible if it has set all the JVM arguments requested.
Note that a daemon is compatible if it has additional JVM arguments beyond those requested (except for those treated especially, such as heap settings, assertions, debug, etc.).

system properties

A daemon is considered compatible if it has set all the system properties requested with the same values.
Note that a daemon is compatible if it has additional system properties beyond those requested.

environment variables

A daemon is considered compatible if it has set all the environment variables requested with the same values.
Note that a daemon is compatible if it has more environment variables than requested.

bootstrap classpath

A daemon is considered compatible if it contains all the bootstrap classpath entries requested.
Note that a daemon is compatible if it has more bootstrap classpath entries than requested.

debug

A daemon is considered compatible only if debug is set to the same value as requested (true or false).

enable assertions

A daemon is considered compatible only if the enable assertions setting matches the requested value (true or false).

default character encoding

A daemon is considered compatible only if the default character encoding is set to the same value as requested.

Worker daemons will remain running until the build daemon that started them is stopped or system memory becomes scarce. When system memory is low, Gradle will stop worker daemons to minimize memory consumption.

Note
A step-by-step description of converting a normal task action to use the worker API can be found in the section on developing parallel tasks.

Cancellation and timeouts

To support cancellation (e.g., when the user stops the build with CTRL+C) and task timeouts, custom tasks should react to interrupting their executing thread. The same is true for work items submitted via the worker API. If a task does not respond to an interrupt within 10s, the daemon will shut down to free up system resources.
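
A minimal, hedged sketch of reacting to interruption in a custom task action (the task class and the work loop below are made up; the same pattern applies inside a WorkAction's execute() method):

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

abstract class LongPollTask : DefaultTask() {
    @TaskAction
    fun poll() {
        repeat(600) {
            try {
                Thread.sleep(1000)                 // stands in for a chunk of real work
            } catch (e: InterruptedException) {
                Thread.currentThread().interrupt() // preserve the interrupt status
                logger.lifecycle("Cancelled, stopping early")
                return                             // respond promptly to cancellation or timeout
            }
        }
    }
}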

Advanced Tasks

Incremental tasks

In Gradle, implementing a task that skips execution when its inputs and outputs are already UP-TO-DATE is simple and efficient, thanks to the Incremental Build feature.

However, there are times when only a few input files have changed since the last execution, and it is best to avoid reprocessing all the unchanged inputs. This situation is common in tasks that transform input files into output files on a one-to-one basis.

To optimize your build process you can use an incremental task. This approach ensures that only out-of-date input files are processed, improving build performance.

Implementing an incremental task

For a task to process inputs incrementally, that task must contain an incremental task action.

This is a task action method that has a single InputChanges parameter. That parameter tells Gradle that the action only wants to process the changed inputs.

In addition, the task needs to declare at least one incremental file input property by using either @Incremental or @SkipWhenEmpty:

build.gradle.kts
abstract class IncrementalReverseTask : DefaultTask() {

    @get:Incremental
    @get:InputDirectory
    val inputDir: DirectoryProperty = project.objects.directoryProperty()

    @get:OutputDirectory
    val outputDir: DirectoryProperty = project.objects.directoryProperty()

    @get:Input
    val inputProperty: Property<String> = project.objects.property(String::class.java) // Non-incremental input property

    @TaskAction
    fun execute(inputs: InputChanges) { // InputChanges parameter
        val msg = if (inputs.isIncremental) "CHANGED inputs are out of date"
                  else "ALL inputs are out of date"
        println(msg)
    }
}
build.gradle
class IncrementalReverseTask extends DefaultTask {

    @Incremental
    @InputDirectory
    final DirectoryProperty inputDir = project.objects.directoryProperty()

    @OutputDirectory
    final DirectoryProperty outputDir = project.objects.directoryProperty()

    @Input
    final Property<String> inputProperty = project.objects.property(String) // Non-incremental input property

    @TaskAction
    void execute(InputChanges inputs) { // InputChanges parameter
        println inputs.incremental ? "CHANGED inputs are out of date"
                                   : "ALL inputs are out of date"
    }
}
Important

To query incremental changes for an input file property, that property must always return the same instance. The easiest way to accomplish this is to use one of the following property types: RegularFileProperty, DirectoryProperty or ConfigurableFileCollection.

You can learn more about RegularFileProperty and DirectoryProperty in Lazy Configuration.

The incremental task action can use InputChanges.getFileChanges() to find out what files have changed for a given file-based input property, be it of type RegularFileProperty, DirectoryProperty or ConfigurableFileCollection.

The method returns an Iterable of FileChange instances, which in turn can be queried for the affected file, the type of change (ADDED, MODIFIED, or REMOVED), the normalized path, and the file type.

The following example demonstrates an incremental task that has a directory input. It assumes that the directory contains a collection of text files and copies them to an output directory, reversing the text within each file:

build.gradle.kts
abstract class IncrementalReverseTask : DefaultTask() {
    @get:Incremental
    @get:PathSensitive(PathSensitivity.NAME_ONLY)
    @get:InputDirectory
    abstract val inputDir: DirectoryProperty

    @get:OutputDirectory
    abstract val outputDir: DirectoryProperty

    @get:Input
    abstract val inputProperty: Property<String>

    @TaskAction
    fun execute(inputChanges: InputChanges) {
        println(
            if (inputChanges.isIncremental) "Executing incrementally"
            else "Executing non-incrementally"
        )

        inputChanges.getFileChanges(inputDir).forEach { change ->
            if (change.fileType == FileType.DIRECTORY) return@forEach

            println("${change.changeType}: ${change.normalizedPath}")
            val targetFile = outputDir.file(change.normalizedPath).get().asFile
            if (change.changeType == ChangeType.REMOVED) {
                targetFile.delete()
            } else {
                targetFile.writeText(change.file.readText().reversed())
            }
        }
    }
}
build.gradle
abstract class IncrementalReverseTask extends DefaultTask {
    @Incremental
    @PathSensitive(PathSensitivity.NAME_ONLY)
    @InputDirectory
    abstract DirectoryProperty getInputDir()

    @OutputDirectory
    abstract DirectoryProperty getOutputDir()

    @Input
    abstract Property<String> getInputProperty()

    @TaskAction
    void execute(InputChanges inputChanges) {
        println(inputChanges.incremental
            ? 'Executing incrementally'
            : 'Executing non-incrementally'
        )

        inputChanges.getFileChanges(inputDir).each { change ->
            if (change.fileType == FileType.DIRECTORY) return

            println "${change.changeType}: ${change.normalizedPath}"
            def targetFile = outputDir.file(change.normalizedPath).get().asFile
            if (change.changeType == ChangeType.REMOVED) {
                targetFile.delete()
            } else {
                targetFile.text = change.file.text.reverse()
            }
        }
    }
}
Note
Note how the inputDir property is declared and annotated, and how the execute() action uses getFileChanges() to process only the subset of files that have changed since the last build. The action also deletes a target file if the corresponding input file has been removed.

If, for some reason, the task is executed non-incrementally (by running with --rerun-tasks, for example), all files are reported as ADDED, irrespective of the previous state. In this case, Gradle automatically removes the previous outputs, so the incremental task must only process the given files.

For a simple transformer task like the above example, the task action must generate output files for any out-of-date inputs and delete output files for any removed inputs.

Important
A task may only contain a single incremental task action.
Which inputs are considered out of date?

When a task has been previously executed, and the only changes since that execution are to incremental input file properties, Gradle can intelligently determine which input files need to be processed, a concept known as incremental execution.

In this scenario, the InputChanges.getFileChanges() method, available in the org.gradle.work.InputChanges class, provides details for all input files associated with the given property that have been ADDED, REMOVED or MODIFIED.

However, there are many cases where Gradle cannot determine which input files need to be processed (i.e., non-incremental execution). Examples include:

  • There is no history available from a previous execution.

  • You are building with a different version of Gradle. Currently, Gradle does not use task history from a different version.

  • An upToDateWhen criterion added to the task returns false.

  • An input property has changed since the previous execution.

  • A non-incremental input file property has changed since the previous execution.

  • One or more output files have changed since the previous execution.

In these cases, Gradle will report all input files as ADDED, and the getFileChanges() method will return details for all the files that comprise the given input property.

You can check if the task execution is incremental or not with the InputChanges.isIncremental() method.

An incremental task in action

Consider an instance of IncrementalReverseTask executed against a set of inputs for the first time.

In this case, all inputs will be considered ADDED, as shown here:

build.gradle.kts
tasks.register<IncrementalReverseTask>("incrementalReverse") {
    inputDir = file("inputs")
    outputDir = layout.buildDirectory.dir("outputs")
    inputProperty = project.findProperty("taskInputProperty") as String? ?: "original"
}
build.gradle
tasks.register('incrementalReverse', IncrementalReverseTask) {
    inputDir = file('inputs')
    outputDir = layout.buildDirectory.dir("outputs")
    inputProperty = project.properties['taskInputProperty'] ?: 'original'
}

The build layout:

.
├── build.gradle
└── inputs
    ├── 1.txt
    ├── 2.txt
    └── 3.txt
$ gradle -q incrementalReverse
Executing non-incrementally
ADDED: 1.txt
ADDED: 2.txt
ADDED: 3.txt

Naturally, when the task is executed again with no changes, then the entire task is UP-TO-DATE, and the task action is not executed:

$ gradle incrementalReverse
> Task :incrementalReverse UP-TO-DATE

BUILD SUCCESSFUL in 0s
1 actionable task: 1 up-to-date

When an input file is modified in some way or a new input file is added, then re-executing the task results in those files being returned by InputChanges.getFileChanges().

The following example modifies the content of one file and adds another before running the incremental task:

build.gradle.kts
tasks.register("updateInputs") {
    val inputsDir = layout.projectDirectory.dir("inputs")
    outputs.dir(inputsDir)
    doLast {
        inputsDir.file("1.txt").asFile.writeText("Changed content for existing file 1.")
        inputsDir.file("4.txt").asFile.writeText("Content for new file 4.")
    }
}
build.gradle
tasks.register('updateInputs') {
    def inputsDir = layout.projectDirectory.dir('inputs')
    outputs.dir(inputsDir)
    doLast {
        inputsDir.file('1.txt').asFile.text = 'Changed content for existing file 1.'
        inputsDir.file('4.txt').asFile.text = 'Content for new file 4.'
    }
}
$ gradle -q updateInputs incrementalReverse
Executing incrementally
MODIFIED: 1.txt
ADDED: 4.txt
Note
The various mutation tasks (updateInputs, removeInput, etc) are only present to demonstrate the behavior of incremental tasks. They should not be viewed as the kinds of tasks or task implementations you should have in your own build scripts.

When an existing input file is removed, then re-executing the task results in that file being returned by InputChanges.getFileChanges() as REMOVED.

The following example removes one of the existing files before executing the incremental task:

build.gradle.kts
tasks.register<Delete>("removeInput") {
    delete("inputs/3.txt")
}
build.gradle
tasks.register('removeInput', Delete) {
    delete 'inputs/3.txt'
}
$ gradle -q removeInput incrementalReverse
Executing incrementally
REMOVED: 3.txt

Gradle cannot determine which input files are out-of-date when an output file is deleted (or modified). In this case, details for all the input files for the given property are returned by InputChanges.getFileChanges().

The following example removes one of the output files from the build directory. However, all the input files are considered to be ADDED:

build.gradle.kts
tasks.register<Delete>("removeOutput") {
    delete(layout.buildDirectory.file("outputs/1.txt"))
}
build.gradle
tasks.register('removeOutput', Delete) {
    delete layout.buildDirectory.file("outputs/1.txt")
}
$ gradle -q removeOutput incrementalReverse
Executing non-incrementally
ADDED: 1.txt
ADDED: 2.txt
ADDED: 3.txt

The last scenario we want to cover concerns what happens when a non-file-based input property is modified. In such cases, Gradle cannot determine how the property impacts the task outputs, so the task is executed non-incrementally. This means that all input files for the given property are returned by InputChanges.getFileChanges() and they are all treated as ADDED.

The following example sets the project property taskInputProperty to a new value when running the incrementalReverse task. That project property is used to initialize the task’s inputProperty property, as you can see in the first example of this section.

Here is the expected output in this case:

$ gradle -q -PtaskInputProperty=changed incrementalReverse
Executing non-incrementally
ADDED: 1.txt
ADDED: 2.txt
ADDED: 3.txt

Command Line options

Sometimes, a user wants to declare the value of an exposed task property on the command line instead of the build script. Passing property values on the command line is particularly helpful if they change more frequently.

The task API supports a mechanism for marking a property to automatically generate a corresponding command line parameter with a specific name at runtime.

Step 1. Declare a command-line option

To expose a new command line option for a task property, annotate the corresponding setter method of a property with Option:

@Option(option = "flag", description = "Sets the flag")

An option requires a mandatory identifier. You can provide an optional description.

A task can expose as many command line options as properties available in the class.

Options may be declared in superinterfaces of the task class as well. If multiple interfaces declare the same property but with different option flags, they will both work to set the property.
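
As a hedged sketch of the superinterface case (all type and option names below are made up), for example in a Kotlin class in buildSrc:

import org.gradle.api.DefaultTask
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.TaskAction
import org.gradle.api.tasks.options.Option

interface VerificationOptions {
    @get:Input
    @get:Option(option = "strict", description = "Enables strict verification.")
    val strict: Property<Boolean>
}

abstract class VerifyAll : DefaultTask(), VerificationOptions {
    @TaskAction
    fun run() {
        logger.quiet("strict = {}", strict.getOrElse(false))
    }
}

A task registered with this type would then accept --strict on the command line (and, per the boolean handling described below, an automatically created --no-strict).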

In the example below, the custom task UrlVerify verifies whether a URL can be resolved by making an HTTP call and checking the response code. The URL to be verified is configurable through the property url. The setter method for the property is annotated with @Option:

UrlVerify.java
import org.gradle.api.DefaultTask;
import org.gradle.api.tasks.Input;
import org.gradle.api.tasks.TaskAction;
import org.gradle.api.tasks.options.Option;

public class UrlVerify extends DefaultTask {
    private String url;

    @Option(option = "url", description = "Configures the URL to be verified.")
    public void setUrl(String url) {
        this.url = url;
    }

    @Input
    public String getUrl() {
        return url;
    }

    @TaskAction
    public void verify() {
        getLogger().quiet("Verifying URL '{}'", url);

        // verify URL by making a HTTP call
    }
}

All options declared for a task can be rendered as console output by running the help task with the --task option.

Step 2. Use an option on the command line

There are a few rules for options on the command line:

  • The option uses a double-dash as a prefix, e.g., --url. A single dash does not qualify as valid syntax for a task option.

  • The option argument follows directly after the task declaration, e.g., verifyUrl --url=http://www.google.com/.

  • Multiple task options can be declared in any order on the command line following the task name.

Building upon the earlier example, the build script creates a task instance of type UrlVerify and provides a value from the command line through the exposed option:

build.gradle.kts
tasks.register<UrlVerify>("verifyUrl")
build.gradle
tasks.register('verifyUrl', UrlVerify)
$ gradle -q verifyUrl --url=http://www.google.com/
Verifying URL 'http://www.google.com/'
Supported data types for options

Gradle limits the data types that can be used for declaring command line options.

The use of the command line differs per type:

boolean, Boolean, Property<Boolean>

Describes an option with the value true or false.
Passing the option on the command line treats the value as true. For example, --foo equates to true.
The absence of the option uses the default value of the property. For each boolean option, an opposite option is created automatically. For example, --no-foo is created for the provided option --foo and --bar is created for --no-bar. Options whose name starts with --no are disabled options and set the option value to false. An opposite option is only created if no option with the same name already exists for the task.

Double, Property<Double>

Describes an option with a double value.
Passing the option on the command line also requires a value, e.g., --factor=2.2 or --factor 2.2.

Integer, Property<Integer>

Describes an option with an integer value.
Passing the option on the command line also requires a value, e.g., --network-timeout=5000 or --network-timeout 5000.

Long, Property<Long>

Describes an option with a long value.
Passing the option on the command line also requires a value, e.g., --threshold=2147483648 or --threshold 2147483648.

String, Property<String>

Describes an option with an arbitrary String value.
Passing the option on the command line also requires a value, e.g., --container-id=2x94held or --container-id 2x94held.

enum, Property<enum>

Describes an option as an enumerated type.
Passing the option on the command line also requires a value e.g., --log-level=DEBUG or --log-level debug.
The value is not case-sensitive.

List<T> where T is Double, Integer, Long, String, enum

Describes an option that can take multiple values of a given type.
The values for the option have to be provided as multiple declarations, e.g., --image-id=123 --image-id=456.
Other notations, such as comma-separated lists or multiple values separated by a space character, are currently not supported.

ListProperty<T>, SetProperty<T> where T is Double, Integer, Long, String, enum

Describes an option that can take multiple values of a given type.
The values for the option have to be provided as multiple declarations, e.g., --image-id=123 --image-id=456.
Other notations, such as comma-separated lists or multiple values separated by a space character, are currently not supported.

DirectoryProperty, RegularFileProperty

Describes an option with a file system element.
Passing the option on the command line also requires a value representing a path, e.g., --output-file=file.txt or --output-dir outputDir.
Relative paths are resolved relative to the project directory of the project that owns this property instance. See FileSystemLocationProperty.set().
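
The sketch below pulls several of these shapes together in a single task. It is only an illustration: the task, its option names, and its defaults are hypothetical, and it is written in Kotlin against the Property-based API used elsewhere in this section.

UrlDownload.kt
import org.gradle.api.DefaultTask
import org.gradle.api.provider.ListProperty
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.Optional
import org.gradle.api.tasks.TaskAction
import org.gradle.api.tasks.options.Option

// Hypothetical task combining several of the supported option types listed above.
abstract class UrlDownload : DefaultTask() {

    // boolean: --secure sets the value to true, the generated --no-secure sets it to false
    @get:Optional
    @get:Input
    @get:Option(option = "secure", description = "Requires the URL to use HTTPS.")
    abstract val secure: Property<Boolean>

    // Integer: --network-timeout=5000 or --network-timeout 5000
    @get:Optional
    @get:Input
    @get:Option(option = "network-timeout", description = "Network timeout in milliseconds.")
    abstract val networkTimeout: Property<Int>

    // List: --image-id=123 --image-id=456
    @get:Input
    @get:Option(option = "image-id", description = "Image IDs to download.")
    abstract val imageIds: ListProperty<String>

    @TaskAction
    fun download() {
        logger.quiet(
            "secure={}, timeout={}, imageIds={}",
            secure.getOrElse(false), networkTimeout.getOrElse(5000), imageIds.getOrElse(emptyList())
        )
    }
}

Once registered, such a task could be invoked with, for example, --secure --network-timeout=2500 --image-id=123 --image-id=456.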

Documenting available values for an option

Theoretically, an option for a property type String or List<String> can accept any arbitrary value. Accepted values for such an option can be documented programmatically with the help of the annotation OptionValues:

@OptionValues('file')

This annotation may be assigned to any method that returns a List of one of the supported data types. You need to specify an option identifier to indicate the relationship between the option and available values.

Note
Passing a value on the command line that is not supported by the option does not fail the build or throw an exception. You must implement custom logic for such behavior in the task action.

The example below demonstrates the use of multiple options for a single task. The task implementation provides a list of available values for the option output-type:

UrlProcess.java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.gradle.api.DefaultTask;
import org.gradle.api.provider.Property;
import org.gradle.api.tasks.Input;
import org.gradle.api.tasks.TaskAction;
import org.gradle.api.tasks.options.Option;
import org.gradle.api.tasks.options.OptionValues;

public abstract class UrlProcess extends DefaultTask {
    private String url;
    private OutputType outputType;

    @Input
    @Option(option = "http", description = "Configures the http protocol to be allowed.")
    public abstract Property<Boolean> getHttp();

    @Option(option = "url", description = "Configures the URL to send the request to.")
    public void setUrl(String url) {
        if (!getHttp().getOrElse(true) && url.startsWith("http://")) {
            throw new IllegalArgumentException("HTTP is not allowed");
        } else {
            this.url = url;
        }
    }

    @Input
    public String getUrl() {
        return url;
    }

    @Option(option = "output-type", description = "Configures the output type.")
    public void setOutputType(OutputType outputType) {
        this.outputType = outputType;
    }

    @OptionValues("output-type")
    public List<OutputType> getAvailableOutputTypes() {
        return new ArrayList<OutputType>(Arrays.asList(OutputType.values()));
    }

    @Input
    public OutputType getOutputType() {
        return outputType;
    }

    @TaskAction
    public void process() {
        getLogger().quiet("Writing out the URL response from '{}' to '{}'", url, outputType);

        // retrieve content from URL and write to output
    }

    private enum OutputType {
        CONSOLE, FILE
    }
}
Listing command line options

Command line options using the annotations Option and OptionValues are self-documenting.

You will see declared options and their available values reflected in the console output of the help task. The output renders options alphabetically, except for boolean disable options, which appear following the enable option:

$ gradle -q help --task processUrl
Detailed task information for processUrl

Path
     :processUrl

Type
     UrlProcess (UrlProcess)

Options
     --http     Configures the http protocol to be allowed.

     --no-http     Disables option --http.

     --output-type     Configures the output type.
                       Available values are:
                            CONSOLE
                            FILE

     --url     Configures the URL to send the request to.

     --rerun     Causes the task to be re-run even if up-to-date.

Description
     -

Group
     -
Limitations

Support for declaring command line options currently comes with a few limitations.

  • Command line options can only be declared for custom tasks via annotation. There’s no programmatic equivalent for defining options.

  • Options cannot be declared globally, e.g., on a project level or as part of a plugin.

  • When assigning an option on the command line, the task exposing the option needs to be spelled out explicitly, e.g., gradle check --tests abc does not work even though the check task depends on the test task.

  • If you specify a task option name that conflicts with the name of a built-in Gradle option, use the -- delimiter before calling your task to reference that option. For more information, see Disambiguate Task Options from Built-in Options.

Verification failures

Normally, exceptions thrown during task execution result in a failure that immediately terminates a build. The outcome of the task will be FAILED, the result of the build will be FAILED, and no further tasks will be executed. When running with the --continue flag, Gradle will continue to run other requested tasks in the build after encountering a task failure. However, any tasks that depend on a failed task will not be executed.

There is a special type of exception that behaves differently when downstream tasks only rely on the outputs of a failing task. A task can throw a subtype of VerificationException to indicate that it has failed in a controlled manner such that its output is still valid for consumers. A task depends on the outcome of another task when it directly depends on it using dependsOn. When Gradle is run with --continue, consumer tasks that depend on a producer task’s output (via a relationship between task inputs and outputs) can still run after the producer fails.

A failed unit test, for instance, will cause a failing outcome for the test task. However, this doesn’t prevent another task from reading and processing the (valid) test results the task produced. Verification failures are used in exactly this manner by the Test Report Aggregation Plugin.

Verification failures are also useful for tasks that need to report a failure even after producing useful output consumable by other tasks.

build.gradle.kts
val process = tasks.register("process") {
    val outputFile = layout.buildDirectory.file("processed.log")
    outputs.files(outputFile) // (1)

    doLast {
        val logFile = outputFile.get().asFile
        logFile.appendText("Step 1 Complete.") // (2)
        throw VerificationException("Process failed!") // (3)
        logFile.appendText("Step 2 Complete.") // (4)
    }
}

tasks.register("postProcess") {
    inputs.files(process) // (5)

    doLast {
        println("Results: ${inputs.files.singleFile.readText()}") // (6)
    }
}
build.gradle
tasks.register("process") {
    def outputFile = layout.buildDirectory.file("processed.log")
    outputs.files(outputFile) // (1)

    doLast {
        def logFile = outputFile.get().asFile
        logFile << "Step 1 Complete." // (2)
        throw new VerificationException("Process failed!") // (3)
        logFile << "Step 2 Complete." // (4)
    }
}

tasks.register("postProcess") {
    inputs.files(tasks.named("process")) // (5)

    doLast {
        println("Results: ${inputs.files.singleFile.text}") // (6)
    }
}
$ gradle postProcess --continue
> Task :process FAILED

> Task :postProcess
Results: Step 1 Complete.
2 actionable tasks: 2 executed

FAILURE: Build failed with an exception.
  1. Register Output: The process task writes its output to a log file.

  2. Modify Output: The task writes to its output file as it executes.

  3. Task Failure: The task throws a VerificationException and fails at this point.

  4. Continue to Modify Output: This line never runs due to the exception stopping the task.

  5. Consume Output: The postProcess task depends on the output of the process task due to using that task’s outputs as its own inputs.

  6. Use Partial Result: With the --continue flag set, Gradle still runs the requested postProcess task despite the process task’s failure. postProcess can read and display the partial (though still valid) result.

DEVELOPING PLUGINS

Understanding Plugins

Gradle comes with a set of powerful core systems such as dependency management, task execution, and project configuration. But everything else it can do is supplied by plugins.

Plugins encapsulate logic for specific tasks or integrations, such as compiling code, running tests, or deploying artifacts. By applying plugins, users can easily add new features to their build process without having to write complex code from scratch.

This plugin-based approach allows Gradle to be lightweight and modular. It also promotes code reuse and maintainability, as plugins can be shared across projects or within an organization.

Before reading this chapter, it’s recommended that you first read Learning The Basics and complete the Tutorial.

Plugins Introduction

Plugins can be sourced from Gradle or the Gradle community. But when users want to organize their build logic or need specific build capabilities not provided by existing plugins, they can develop their own.

As such, we distinguish between three different kinds of plugins:

  1. Core Plugins - plugins that come from Gradle.

  2. Community Plugins - plugins that come from Gradle Plugin Portal or a public repository.

  3. Local or Custom Plugins - plugins that you develop yourself.

Core Plugins

The term core plugin refers to a plugin that is part of the Gradle distribution such as the Java Library Plugin. They are always available.

Community Plugins

The term community plugin refers to a plugin published to the Gradle Plugin Portal (or another public repository) such as the Spotless Plugin.

Local or Custom Plugins

The term local or custom plugin refers to a plugin you write yourself for your own build.

Custom plugins

There are three types of custom plugins:

  1. Script plugins
     Location: a .gradle(.kts) script file
     Most likely: a local plugin
     Benefit: the plugin is automatically compiled and included in the classpath of the build script.

  2. Precompiled script plugins
     Location: the buildSrc folder or a composite build
     Most likely: a convention plugin
     Benefit: the plugin is automatically compiled, tested, and available on the classpath of the build script. The plugin is visible to every build script used by the build.

  3. Binary plugins
     Location: a standalone project
     Most likely: a shared plugin
     Benefit: a plugin JAR is produced and published. The plugin can be used in multiple builds and shared with others.

Script plugins

Script plugins are typically small, local plugins written in script files for tasks specific to a single build or project. They do not need to be reused across multiple projects. Script plugins are not recommended, but many other forms of plugins evolve from script plugins.

To create a plugin, you need to write a class that implements the Plugin interface.

The following sample creates a GreetingPlugin, which adds a hello task to a project when applied:

build.gradle.kts
class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.task("hello") {
            doLast {
                println("Hello from the GreetingPlugin")
            }
        }
    }
}

// Apply the plugin
apply<GreetingPlugin>()
build.gradle
class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.task('hello') {
            doLast {
                println 'Hello from the GreetingPlugin'
            }
        }
    }
}

// Apply the plugin
apply plugin: GreetingPlugin
$ gradle -q hello
Hello from the GreetingPlugin

The Project object is passed as a parameter to apply(), which the plugin can use to configure the project however it needs to (such as adding tasks, configuring dependencies, etc.). In this example, the plugin is written directly in the build file, which is not recommended practice.

When the plugin is written in a separate script file, it can be applied using apply(from = "file_name.gradle.kts") or apply from: 'file_name.gradle'. In the example below, the plugin is coded in the other.gradle(.kts) script file, which is then applied to build.gradle(.kts) using apply from:

other.gradle.kts
class GreetingScriptPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.task("hi") {
            doLast {
                println("Hi from the GreetingScriptPlugin")
            }
        }
    }
}

// Apply the plugin
apply<GreetingScriptPlugin>()
other.gradle
class GreetingScriptPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.task('hi') {
            doLast {
                println 'Hi from the GreetingScriptPlugin'
            }
        }
    }
}

// Apply the plugin
apply plugin: GreetingScriptPlugin
build.gradle.kts
apply(from = "other.gradle.kts")
build.gradle
apply from: 'other.gradle'
$ gradle -q hi
Hi from the GreetingScriptPlugin

Script plugins should be avoided.

Precompiled script plugins

Precompiled script plugins are compiled into class files and packaged into a JAR before they are executed. These plugins use the Groovy DSL or Kotlin DSL instead of pure Java, Kotlin, or Groovy. They are best used as convention plugins that share build logic across projects or as a way to neatly organize build logic.

To create a precompiled script plugin, you can:

  1. Use Gradle’s Kotlin DSL - the plugin is a .gradle.kts file, and the containing build applies id("kotlin-dsl").

  2. Use Gradle’s Groovy DSL - the plugin is a .gradle file, and the containing build applies id("groovy-gradle-plugin").

To apply a precompiled script plugin, you need to know its ID. The ID is derived from the plugin script’s filename and its (optional) package declaration.

For example, the script src/main/*/some-java-library.gradle(.kts) has a plugin ID of some-java-library (assuming it has no package declaration). Likewise, src/main/*/my/some-java-library.gradle(.kts) has a plugin ID of my.some-java-library as long as it has a package declaration of my.

Precompiled script plugin names have two important limitations:

  • They cannot start with org.gradle.

  • They cannot have the same name as a core plugin.

When the plugin is applied to a project, Gradle creates an instance of the plugin class and calls the instance’s Plugin.apply() method.

Note
A new instance of a Plugin is created within each project applying that plugin.

Let’s rewrite the GreetingPlugin script plugin as a precompiled script plugin. Since we are using the Groovy or Kotlin DSL, the file essentially becomes the plugin. The original script plugin simply created a hello task that printed a greeting; this is what we will do in the precompiled script plugin:

buildSrc/src/main/kotlin/GreetingPlugin.gradle.kts
tasks.register("hello") {
    doLast {
        println("Hello from the convention GreetingPlugin")
    }
}
buildSrc/src/main/groovy/GreetingPlugin.gradle
tasks.register("hello") {
    doLast {
        println("Hello from the convention GreetingPlugin")
    }
}

The GreetingPlugin can now be applied in other subprojects' builds by using its ID:

app/build.gradle.kts
plugins {
    application
    id("GreetingPlugin")
}
app/build.gradle
plugins {
    id 'application'
    id('GreetingPlugin')
}
$ gradle -q hello
Hello from the convention GreetingPlugin

Convention plugins

A convention plugin is typically a precompiled script plugin that configures existing core and community plugins with your own conventions (i.e. default values) such as setting the Java version by using java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are also used to enforce project standards and help streamline the build process. They can apply and configure plugins, create new tasks and extensions, set dependencies, and much more.

Let’s take an example build with three subprojects: one for data-model, one for database-logic and one for app code. The project has the following structure:

.
├── buildSrc
│   ├── src
│   │   └──...
│   └── build.gradle.kts
├── data-model
│   ├── src
│   │   └──...
│   └── build.gradle.kts
├── database-logic
│   ├── src
│   │   └──...
│   └── build.gradle.kts
├── app
│   ├── src
│   │   └──...
│   └── build.gradle.kts
└── settings.gradle.kts

The build file of the database-logic subproject is as follows:

database-logic/build.gradle.kts
plugins {
    id("java-library")
    id("org.jetbrains.kotlin.jvm") version "1.9.24"
}

repositories {
    mavenCentral()
}

java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
    useJUnitPlatform()
}

kotlin {
    jvmToolchain(11)
}

// More build logic
database-logic/build.gradle
plugins {
    id 'java-library'
    id 'org.jetbrains.kotlin.jvm' version '1.9.24'
}

repositories {
    mavenCentral()
}

java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
    useJUnitPlatform()
}

kotlin {
    jvmToolchain {
        languageVersion.set(JavaLanguageVersion.of(11))
    }
}

// More build logic

We apply the java-library plugin and add the org.jetbrains.kotlin.jvm plugin for Kotlin support. We also configure Kotlin, Java, tests and more.

Our build file is beginning to grow…​

The more plugins we apply and the more plugins we configure, the larger it gets. There’s also repetition in the build files of the app and data-model subprojects, especially when configuring common extensions like setting the Java version and Kotlin support.

To address this, we use convention plugins. This allows us to avoid repeating configuration in each build file and keeps our build scripts more concise and maintainable. In convention plugins, we can encapsulate arbitrary build configuration or custom build logic.

To develop a convention plugin, we recommend using buildSrc – which represents a completely separate Gradle build. buildSrc has its own settings file to define where dependencies of this build are located.

We add a Kotlin script called my-java-library.gradle.kts inside the buildSrc/src/main/kotlin directory (or, alternatively, a Groovy script called my-java-library.gradle inside the buildSrc/src/main/groovy directory). We put all the plugin application and configuration from the database-logic build file into it:

buildSrc/src/main/kotlin/my-java-library.gradle.kts
plugins {
    id("java-library")
    id("org.jetbrains.kotlin.jvm")
}

repositories {
    mavenCentral()
}

java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
    useJUnitPlatform()
}

kotlin {
    jvmToolchain(11)
}
buildSrc/src/main/groovy/my-java-library.gradle
plugins {
    id 'java-library'
    id 'org.jetbrains.kotlin.jvm'
}

repositories {
    mavenCentral()
}

java {
    toolchain.languageVersion.set(JavaLanguageVersion.of(11))
}

tasks.test {
    useJUnitPlatform()
}

kotlin {
    jvmToolchain {
        languageVersion.set(JavaLanguageVersion.of(11))
    }
}

The name of the file my-java-library is the ID of our brand-new plugin, which we can now use in all of our subprojects.

Tip
Why is the version of id 'org.jetbrains.kotlin.jvm' missing? See Applying External Plugins to Pre-Compiled Script Plugins.
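
In short, the version of an external plugin used by a precompiled script plugin is declared by the build that compiles the plugin, not in the plugin script itself. A minimal sketch of what buildSrc/build.gradle.kts might look like for the example above (only the Kotlin DSL variant is shown; the kotlin-gradle-plugin coordinates are an assumption based on the 1.9.24 version used earlier):

buildSrc/build.gradle.kts
plugins {
    `kotlin-dsl`
}

repositories {
    mavenCentral()
}

dependencies {
    // Puts the org.jetbrains.kotlin.jvm plugin on the classpath of the precompiled
    // script plugins; its version lives here rather than in my-java-library.gradle.kts.
    implementation("org.jetbrains.kotlin:kotlin-gradle-plugin:1.9.24")
}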

The database-logic build file becomes much simpler by removing all the redundant build logic and applying our convention my-java-library plugin instead:

database-logic/build.gradle.kts
plugins {
    id("my-java-library")
}
database-logic/build.gradle
plugins {
    id('my-java-library')
}

This convention plugin enables us to easily share common configurations across all our build files. Any modifications can be made in one place, simplifying maintenance.

Binary plugins

Binary plugins in Gradle are plugins that are built as standalone JAR files and applied to a project using the plugins{} block in the build script.

Let’s move our GreetingPlugin to a standalone project so that we can publish it and share it with others. The plugin is essentially moved from the buildSrc folder to its own build called greeting-plugin.

Note
You can publish the plugin from buildSrc, but this is not recommended practice. Plugins that are ready for publication should be in their own build.

greeting-plugin is simply a Java project that produces a JAR containing the plugin classes.

The easiest way to package and publish a plugin to a repository is to use the Gradle Plugin Development Plugin. This plugin provides the necessary tasks and configurations (including the plugin metadata) to compile your script into a plugin that can be applied in other builds.

Here is a simple build script for the greeting-plugin project using the Gradle Plugin Development Plugin:

build.gradle.kts
plugins {
    `java-gradle-plugin`
}

gradlePlugin {
    plugins {
        create("simplePlugin") {
            id = "org.example.greeting"
            implementationClass = "org.example.GreetingPlugin"
        }
    }
}
build.gradle
plugins {
    id 'java-gradle-plugin'
}

gradlePlugin {
    plugins {
        simplePlugin {
            id = 'org.example.greeting'
            implementationClass = 'org.example.GreetingPlugin'
        }
    }
}
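
The implementationClass above refers to a plugin class inside the standalone greeting-plugin project, which is not shown here. A minimal sketch of what it might look like, written in Kotlin for illustration (a Java or Groovy class works the same way; using Kotlin would also require applying a Kotlin plugin to the build above):

src/main/kotlin/org/example/GreetingPlugin.kt
package org.example

import org.gradle.api.Plugin
import org.gradle.api.Project

class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Same behavior as the earlier script-plugin version: add a "hello" task
        project.tasks.register("hello") { task ->
            task.doLast {
                println("Hello from the GreetingPlugin")
            }
        }
    }
}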

For more on publishing plugins, see Publishing Plugins.

Project vs Settings vs Init plugins

In the examples used throughout this section, the plugin accepts the Project type as a type parameter. Alternatively, the plugin can accept a parameter of type Settings to be applied in a settings script, or a parameter of type Gradle to be applied in an initialization script.

The difference between these types of plugins lies in the scope of their application:

Project Plugin

A project plugin is a plugin that is applied to a specific project in a build. It can customize the build logic, add tasks, and configure the project-specific settings.

Settings Plugin

A settings plugin is a plugin that is applied in the settings.gradle or settings.gradle.kts file. It can configure settings that apply to the entire build, such as defining which projects are included in the build, configuring build script repositories, and applying common configurations to all projects.

Init Plugin

An init plugin is a plugin that is applied in the init.gradle or init.gradle.kts file. It can configure settings that apply globally to all Gradle builds on a machine, such as configuring the Gradle version, setting up default repositories, or applying common plugins to all builds.
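
Settings and init plugins do not appear elsewhere in this chapter, so here is a minimal sketch (class names are hypothetical, written in Kotlin) showing that they differ from a project plugin only in the type parameter they accept:

Settings and init plugin sketch
import org.gradle.api.Plugin
import org.gradle.api.initialization.Settings
import org.gradle.api.invocation.Gradle

// Hypothetical settings plugin: applied from settings.gradle(.kts), it is handed the Settings object.
class MySettingsPlugin : Plugin<Settings> {
    override fun apply(settings: Settings) {
        println("Configuring build '${settings.rootProject.name}'")
    }
}

// Hypothetical init plugin: applied from an init script, it is handed the Gradle object.
class MyInitPlugin : Plugin<Gradle> {
    override fun apply(gradle: Gradle) {
        println("Using Gradle version ${gradle.gradleVersion}")
    }
}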

Understanding Implementation Options for Plugins

The choice between script, precompiled script, or binary plugins depends on your specific requirements and preferences.

Script Plugins are simple and easy to write. They are written in Kotlin DSL or Groovy DSL. They are suitable for small, one-off tasks or for quick experimentation. However, they can become hard to maintain as the build script grows in size and complexity.

Precompiled Script Plugins are Kotlin or Groovy DSL scripts compiled into Java class files packaged in a library. They offer better performance and maintainability compared to script plugins, and they can be reused across different projects. You can also write them in Groovy DSL but that is not recommended.

Binary Plugins are full-fledged plugins written in Java, Groovy, or Kotlin, compiled into JAR files, and published to a repository. They offer the best performance, maintainability, and reusability. They are suitable for complex build logic that needs to be shared across projects, builds, and teams. You can also write them in Scala or Groovy but that is not recommended.

Here is a breakdown of all options for implementing Gradle plugins:

  1. Using: Kotlin DSL
     Type: Script plugin
     The plugin is: in a .gradle.kts file as an abstract class that implements the apply(Project project) method of the Plugin<Project> interface.
     Recommended? No[1]

  2. Using: Groovy DSL
     Type: Script plugin
     The plugin is: in a .gradle file as an abstract class that implements the apply(Project project) method of the Plugin<Project> interface.
     Recommended? No[1]

  3. Using: Kotlin DSL
     Type: Precompiled script plugin
     The plugin is: a .gradle.kts file.
     Recommended? Yes

  4. Using: Groovy DSL
     Type: Precompiled script plugin
     The plugin is: a .gradle file.
     Recommended? Ok[2]

  5. Using: Java
     Type: Binary plugin
     The plugin is: an abstract class that implements the apply(Project project) method of the Plugin<Project> interface in Java.
     Recommended? Yes

  6. Using: Kotlin / Kotlin DSL
     Type: Binary plugin
     The plugin is: an abstract class that implements the apply(Project project) method of the Plugin<Project> interface in Kotlin and/or Kotlin DSL.
     Recommended? Yes

  7. Using: Groovy / Groovy DSL
     Type: Binary plugin
     The plugin is: an abstract class that implements the apply(Project project) method of the Plugin<Project> interface in Groovy and/or Groovy DSL.
     Recommended? Ok[2]

  8. Using: Scala
     Type: Binary plugin
     The plugin is: an abstract class that implements the apply(Project project) method of the Plugin<Project> interface in Scala.
     Recommended? No[2]

If you suspect issues with your plugin code, try creating a Build Scan to identify bottlenecks. The Gradle profiler can help automate Build Scan generation and gather more low-level information.

Implementing Pre-compiled Script Plugins

A precompiled script plugin is typically a Kotlin script that has been compiled and distributed as Java class files packaged in a library. These scripts are intended to be consumed as binary Gradle plugins and are recommended for use as convention plugins.

A convention plugin is a plugin that normally configures existing core and community plugins with your own conventions (i.e. default values) such as setting the Java version by using java.toolchain.languageVersion = JavaLanguageVersion.of(17). Convention plugins are also used to enforce project standards and help streamline the build process. They can apply and configure plugins, create new tasks and extensions, set dependencies, and much more.

Setting the plugin ID

The plugin ID for a precompiled script is derived from its file name and optional package declaration.

For example, a script named code-quality.gradle(.kts) located in src/main/groovy (or src/main/kotlin) without a package declaration would be exposed as the code-quality plugin:

buildSrc/build.gradle.kts
plugins {
    id("kotlin-dsl")
}
app/build.gradle.kts
plugins {
    id("code-quality")
}
buildSrc/build.gradle
plugins {
    id 'groovy-gradle-plugin'
}
app/build.gradle
plugins {
    id 'code-quality'
}

On the other hand, a script named code-quality.gradle(.kts) located in src/main/groovy/my (or src/main/kotlin/my) with the package declaration my would be exposed as the my.code-quality plugin:

buildSrc/build.gradle.kts
plugins {
    id("kotlin-dsl")
}
app/build.gradle.kts
plugins {
    id("my.code-quality")
}
buildSrc/build.gradle
plugins {
    id 'groovy-gradle-plugin'
}
app/build.gradle
plugins {
    id 'my.code-quality'
}

Making a plugin configurable using extensions

Extension objects are commonly used in plugins to expose configuration options and additional functionality to build scripts.

When you apply a plugin that defines an extension, you can access the extension object and configure its properties or call its methods to customize the behavior of the plugin or tasks provided by the plugin.

A Project has an associated ExtensionContainer object that contains all the settings and properties for the plugins that have been applied to the project. You can provide configuration for your plugin by adding an extension object to this container.

Let’s update our greetings example:

buildSrc/src/main/kotlin/greetings.gradle.kts
// Create extension object
interface GreetingPluginExtension {
    val message: Property<String>
}

// Add the 'greeting' extension object to project
val extension = project.extensions.create<GreetingPluginExtension>("greeting")
buildSrc/src/main/groovy/greetings.gradle
// Create extension object
interface GreetingPluginExtension {
    Property<String> getMessage()
}

// Add the 'greeting' extension object to project
def extension = project.extensions.create("greeting", GreetingPluginExtension)

You can set the value of the message property directly with extension.message.set("Hi from Gradle").

Alternatively, the GreetingPluginExtension object is available as a project property with the same name as the extension object. You can access message like so:

buildSrc/src/main/kotlin/greetings.gradle.kts
// Where the<GreetingPluginExtension>() is equivalent to project.extensions.getByType(GreetingPluginExtension::class.java)
the<GreetingPluginExtension>().message.set("Hi from Gradle")
buildSrc/src/main/groovy/greetings.gradle
extensions.findByType(GreetingPluginExtension).message.set("Hi from Gradle")

If you apply the greetings plugin, you can set the message in your build script:

app/build.gradle.kts
plugins {
    application
    id("greetings")
}

greeting {
    message = "Hello from Gradle"
}
app/build.gradle
plugins {
    id 'application'
    id('greetings')
}

configure(greeting) {
    message = "Hello from Gradle"
}

Adding default configuration as conventions

In plugins, you can define default values, also known as conventions, using the project object.

Convention properties are properties that are initialized with default values but can be overridden:

buildSrc/src/main/kotlin/greetings.gradle.kts
// Create extension object
interface GreetingPluginExtension {
    val message: Property<String>
}

// Add the 'greeting' extension object to project
val extension = project.extensions.create<GreetingPluginExtension>("greeting")

// Set a default value for 'message'
extension.message.convention("Hello from Gradle")
buildSrc/src/main/groovy/greetings.gradle
// Create extension object
interface GreetingPluginExtension {
    Property<String> getMessage()
}

// Add the 'greeting' extension object to project
def extension = project.extensions.create("greeting", GreetingPluginExtension)

// Set a default value for 'message'
extension.message.convention("Hello from Gradle")

extension.message.convention(…) sets a convention for the message property of the extension. This convention specifies that the value of message should default to "Hello from Gradle".

If the message property is not explicitly set, its value will automatically be "Hello from Gradle".

Mapping extension properties to task properties

Using an extension and mapping it to a custom task’s input/output properties is common in plugins.

In this example, the message property of the GreetingPluginExtension is mapped to the message property of the GreetingTask as an input:

buildSrc/src/main/kotlin/greetings.gradle.kts
// Create extension object
interface GreetingPluginExtension {
    val message: Property<String>
}

// Add the 'greeting' extension object to project
val extension = project.extensions.create<GreetingPluginExtension>("greeting")

// Set a default value for 'message'
extension.message.convention("Hello from Gradle")

// Create a greeting task
abstract class GreetingTask : DefaultTask() {
    @Input
    val message = project.objects.property<String>()

    @TaskAction
    fun greet() {
        println("Message: ${message.get()}")
    }
}

// Register the task and set the convention
tasks.register<GreetingTask>("hello") {
    message.convention(extension.message)
}
buildSrc/src/main/groovy/greetings.gradle
// Create extension object
interface GreetingPluginExtension {
    Property<String> getMessage()
}

// Add the 'greeting' extension object to project
def extension = project.extensions.create("greeting", GreetingPluginExtension)

// Set a default value for 'message'
extension.message.convention("Hello from Gradle")

// Create a greeting task
abstract class GreetingTask extends DefaultTask {
    @Input
    abstract Property<String> getMessage()

    @TaskAction
    void greet() {
        println("Message: ${message.get()}")
    }
}

// Register the task and set the convention
tasks.register("hello", GreetingTask) {
    message.convention(extension.message)
}
$ gradle -q hello
Message: Hello from Gradle

This means that changes to the extension’s message property will trigger the task to be considered out-of-date, ensuring that the task is re-executed with the new message.

You can find out more about types that you can use in task implementations and extensions in Lazy Configuration.

Applying external plugins

In order to apply an external plugin in a precompiled script plugin, it has to be added to the plugin project’s implementation classpath in the plugin’s build file:

buildSrc/build.gradle.kts
plugins {
    `kotlin-dsl`
}

repositories {
    mavenCentral()
}

dependencies {
    implementation("com.bmuschko:gradle-docker-plugin:6.4.0")
}
buildSrc/build.gradle
plugins {
    id 'groovy-gradle-plugin'
}

repositories {
    mavenCentral()
}

dependencies {
    implementation 'com.bmuschko:gradle-docker-plugin:6.4.0'
}

It can then be applied in the precompiled script plugin:

buildSrc/src/main/kotlin/my-plugin.gradle.kts
plugins {
    id("com.bmuschko.docker-remote-api")
}
buildSrc/src/main/groovy/my-plugin.gradle
plugins {
    id 'com.bmuschko.docker-remote-api'
}

The plugin version in this case is defined in the dependency declaration.

Implementing Binary Plugins

Binary plugins refer to plugins that are compiled and distributed as JAR files. These plugins are usually written in Java or Kotlin and provide custom functionality or tasks to a Gradle build.

Using the Plugin Development plugin

The Gradle Plugin Development plugin can be used to assist in developing Gradle plugins.

This plugin will automatically apply the Java Plugin, add the gradleApi() dependency to the api configuration, generate the required plugin descriptors in the resulting JAR file, and configure the Plugin Marker Artifact to be used when publishing.

To apply and configure the plugin, add the following code to your build file:

build.gradle.kts
plugins {
    `java-gradle-plugin`
}

gradlePlugin {
    plugins {
        create("simplePlugin") {
            id = "org.example.greeting"
            implementationClass = "org.example.GreetingPlugin"
        }
    }
}
build.gradle
plugins {
    id 'java-gradle-plugin'
}

gradlePlugin {
    plugins {
        simplePlugin {
            id = 'org.example.greeting'
            implementationClass = 'org.example.GreetingPlugin'
        }
    }
}

Writing and using custom task types is recommended when developing plugins, as custom tasks automatically benefit from incremental builds. As an added benefit of applying the plugin to your project, the validatePlugins task automatically checks that every public property defined in a custom task type implementation has an input/output annotation.

Creating a plugin ID

Plugin IDs are meant to be globally unique, similar to Java package names (i.e., a reverse domain name). This format helps prevent naming collisions and allows grouping plugins with similar ownership.

An explicit plugin identifier simplifies applying the plugin to a project. Your plugin ID should combine components that reflect the namespace (a reasonable pointer to you or your organization) and the name of the plugin it provides. For example, if your GitHub account is named foo and your plugin is named bar, a suitable plugin ID might be com.github.foo.bar. Similarly, if the plugin was developed at the baz organization, the plugin ID might be org.baz.bar.

Plugin IDs should adhere to the following guidelines:

  • May contain any alphanumeric character, '.', and '-'.

  • Must contain at least one '.' character separating the namespace from the plugin’s name.

  • Conventionally use a lowercase reverse domain name convention for the namespace.

  • Conventionally use only lowercase characters in the name.

  • org.gradle, com.gradle, and com.gradleware namespaces may not be used.

  • Cannot start or end with a '.' character.

  • Cannot contain consecutive '.' characters (i.e., '..').

A namespace that identifies ownership and a name is sufficient for a plugin ID.

When bundling multiple plugins in a single JAR artifact, adhering to the same naming conventions is recommended. This practice helps logically group related plugins.

There is no limit to the number of plugins that can be defined and registered (by different identifiers) within a single project.

The identifiers for plugins written as classes should be defined in the build script of the project containing the plugin classes. For this, the java-gradle-plugin needs to be applied:

buildSrc/build.gradle.kts
plugins {
    id("java-gradle-plugin")
}

gradlePlugin {
    plugins {
        create("androidApplicationPlugin") {
            id = "com.android.application"
            implementationClass = "com.android.AndroidApplicationPlugin"
        }
        create("androidLibraryPlugin") {
            id = "com.android.library"
            implementationClass = "com.android.AndroidLibraryPlugin"
        }
    }
}
buildSrc/build.gradle
plugins {
    id 'java-gradle-plugin'
}

gradlePlugin {
    plugins {
        androidApplicationPlugin {
            id = 'com.android.application'
            implementationClass = 'com.android.AndroidApplicationPlugin'
        }
        androidLibraryPlugin {
            id = 'com.android.library'
            implementationClass = 'com.android.AndroidLibraryPlugin'
        }
    }
}

Working with files

When developing plugins, it’s a good idea to be flexible when accepting input configuration for file locations.

It is recommended to use Gradle’s managed properties and project.layout to select file or directory locations. This will enable lazy configuration so that the actual location will only be resolved when the file is needed and can be reconfigured at any time during build configuration.

This Gradle build file defines a task GreetingToFileTask that writes a greeting to a file. It also registers two tasks: greet, which creates the file with the greeting, and sayGreeting, which prints the file’s contents. The greetingFile property is used to specify the file path for the greeting:

build.gradle.kts
abstract class GreetingToFileTask : DefaultTask() {

    @get:OutputFile
    abstract val destination: RegularFileProperty

    @TaskAction
    fun greet() {
        val file = destination.get().asFile
        file.parentFile.mkdirs()
        file.writeText("Hello!")
    }
}

val greetingFile = objects.fileProperty()

tasks.register<GreetingToFileTask>("greet") {
    destination = greetingFile
}

tasks.register("sayGreeting") {
    dependsOn("greet")
    val greetingFile = greetingFile
    doLast {
        val file = greetingFile.get().asFile
        println("${file.readText()} (file: ${file.name})")
    }
}

greetingFile = layout.buildDirectory.file("hello.txt")
build.gradle
abstract class GreetingToFileTask extends DefaultTask {

    @OutputFile
    abstract RegularFileProperty getDestination()

    @TaskAction
    def greet() {
        def file = getDestination().get().asFile
        file.parentFile.mkdirs()
        file.write 'Hello!'
    }
}

def greetingFile = objects.fileProperty()

tasks.register('greet', GreetingToFileTask) {
    destination = greetingFile
}

tasks.register('sayGreeting') {
    dependsOn greet
    doLast {
        def file = greetingFile.get().asFile
        println "${file.text} (file: ${file.name})"
    }
}

greetingFile = layout.buildDirectory.file('hello.txt')
$ gradle -q sayGreeting
Hello! (file: hello.txt)

In this example, we configure the greet task's destination property using a file property (greetingFile) whose actual value is only resolved when the file is needed. Note that we specify the greetingFile property value after the task configuration. This lazy evaluation is a key benefit of accepting a provider when setting a file property and then resolving that value when reading the property.

You can learn more about working with files lazily in Working with Files.

Making a plugin configurable using extensions

Most plugins offer configuration options for build scripts and other plugins to customize how the plugin works. Plugins do this using extension objects.

A Project has an associated ExtensionContainer object that contains all the settings and properties for the plugins that have been applied to the project. You can provide configuration for your plugin by adding an extension object to this container.

An extension object is simply an object with Java Bean properties representing the configuration.

Let’s add a greeting extension object to the project, which allows you to configure the greeting:

build.gradle.kts
interface GreetingPluginExtension {
    val message: Property<String>
}

class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Add the 'greeting' extension object
        val extension = project.extensions.create<GreetingPluginExtension>("greeting")
        // Add a task that uses configuration from the extension object
        project.task("hello") {
            doLast {
                println(extension.message.get())
            }
        }
    }
}

apply<GreetingPlugin>()

// Configure the extension
the<GreetingPluginExtension>().message = "Hi from Gradle"
build.gradle
interface GreetingPluginExtension {
    Property<String> getMessage()
}

class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        // Add the 'greeting' extension object
        def extension = project.extensions.create('greeting', GreetingPluginExtension)
        // Add a task that uses configuration from the extension object
        project.task('hello') {
            doLast {
                println extension.message.get()
            }
        }
    }
}

apply plugin: GreetingPlugin

// Configure the extension
greeting.message = 'Hi from Gradle'
$ gradle -q hello
Hi from Gradle

In this example, GreetingPluginExtension is an object with a property called message. The extension object is added to the project with the name greeting. This object becomes available as a project property with the same name as the extension object. the<GreetingPluginExtension>() is equivalent to project.extensions.getByType(GreetingPluginExtension::class.java).

Often, you have several related properties you need to specify on a single plugin. Gradle adds a configuration block for each extension object, so you can group settings:

build.gradle.kts
interface GreetingPluginExtension {
    val message: Property<String>
    val greeter: Property<String>
}

class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        val extension = project.extensions.create<GreetingPluginExtension>("greeting")
        project.task("hello") {
            doLast {
                println("${extension.message.get()} from ${extension.greeter.get()}")
            }
        }
    }
}

apply<GreetingPlugin>()

// Configure the extension using a DSL block
configure<GreetingPluginExtension> {
    message = "Hi"
    greeter = "Gradle"
}
build.gradle
interface GreetingPluginExtension {
    Property<String> getMessage()
    Property<String> getGreeter()
}

class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        def extension = project.extensions.create('greeting', GreetingPluginExtension)
        project.task('hello') {
            doLast {
                println "${extension.message.get()} from ${extension.greeter.get()}"
            }
        }
    }
}

apply plugin: GreetingPlugin

// Configure the extension using a DSL block
greeting {
    message = 'Hi'
    greeter = 'Gradle'
}
$ gradle -q hello
Hi from Gradle

In this example, several settings can be grouped within the configure<GreetingPluginExtension> block. The configure function is used to configure an extension object. It provides a convenient way to set properties or apply configurations to these objects. The type used in the build script’s configure function (GreetingPluginExtension) must match the extension type. Then, when the block is executed, the receiver of the block is the extension.

In this example, several settings can be grouped within the greeting closure. The name of the closure block in the build script (greeting) must match the extension object name. Then, when the closure is executed, the fields on the extension object will be mapped to the variables within the closure based on the standard Groovy closure delegate feature.

Declaring a DSL configuration container

Using an extension object extends the Gradle DSL to add a project property and DSL block for the plugin. Because an extension object is a regular object, you can provide your own DSL nested inside the plugin block by adding properties and methods to the extension object.

Let’s consider the following build script for illustration purposes.

build.gradle.kts
plugins {
    id("org.myorg.server-env")
}

environments {
    create("dev") {
        url = "http://localhost:8080"
    }

    create("staging") {
        url = "http://staging.enterprise.com"
    }

    create("production") {
        url = "http://prod.enterprise.com"
    }
}
build.gradle
plugins {
    id 'org.myorg.server-env'
}

environments {
    dev {
        url = 'http://localhost:8080'
    }

    staging {
        url = 'http://staging.enterprise.com'
    }

    production {
        url = 'http://prod.enterprise.com'
    }
}

The DSL exposed by the plugin exposes a container for defining a set of environments. Each environment the user configures has an arbitrary but declarative name and is represented with its own DSL configuration block. The example above instantiates a development, staging, and production environment, including its respective URL.

Each environment must have a data representation in code to capture the values. The name of an environment is immutable and can be passed in as a constructor parameter. Currently, the only other parameter the data object stores is a URL.

The following ServerEnvironment object fulfills those requirements:

ServerEnvironment.java
abstract public class ServerEnvironment {
    private final String name;

    @javax.inject.Inject
    public ServerEnvironment(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }

    abstract public Property<String> getUrl();
}

Gradle exposes the factory method ObjectFactory.domainObjectContainer(Class, NamedDomainObjectFactory) to create a container of data objects. The parameter the method takes is the class representing the data. The created instance of type NamedDomainObjectContainer can be exposed to the end user by adding it to the extension container with a specific name.

It’s common for a plugin to post-process the captured values within the plugin implementation, e.g., to configure tasks:

ServerEnvironmentPlugin.java
public class ServerEnvironmentPlugin implements Plugin<Project> {
    @Override
    public void apply(final Project project) {
        ObjectFactory objects = project.getObjects();

        NamedDomainObjectContainer<ServerEnvironment> serverEnvironmentContainer =
            objects.domainObjectContainer(ServerEnvironment.class, name -> objects.newInstance(ServerEnvironment.class, name));
        project.getExtensions().add("environments", serverEnvironmentContainer);

        serverEnvironmentContainer.all(serverEnvironment -> {
            String env = serverEnvironment.getName();
            String capitalizedServerEnv = env.substring(0, 1).toUpperCase() + env.substring(1);
            String taskName = "deployTo" + capitalizedServerEnv;
            project.getTasks().register(taskName, Deploy.class, task -> task.getUrl().set(serverEnvironment.getUrl()));
        });
    }
}

In the example above, a deployment task is created dynamically for every user-configured environment.
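
The Deploy task type referenced by the plugin is not shown above. A minimal sketch of the shape it needs (a URL input consumed in the task action), written in Kotlin for illustration:

Deploy.kt
import org.gradle.api.DefaultTask
import org.gradle.api.provider.Property
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.TaskAction

// Hypothetical shape of the Deploy task wired up by ServerEnvironmentPlugin above.
abstract class Deploy : DefaultTask() {

    @get:Input
    abstract val url: Property<String>

    @TaskAction
    fun deploy() {
        logger.quiet("Deploying to {}", url.get())
    }
}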

You can find out more about implementing project extensions in Developing Custom Gradle Types.

Modeling DSL-like APIs

DSLs exposed by plugins should be readable and easy to understand.

For example, let’s consider the following extension provided by a plugin. In its current form, it offers a "flat" list of properties for configuring the creation of a website:

build-flat.gradle.kts
plugins {
    id("org.myorg.site")
}

site {
    outputDir = layout.buildDirectory.file("mysite")
    websiteUrl = "https://gradle.org"
    vcsUrl = "https://github.com/gradle/gradle-site-plugin"
}
build-flat.gradle
plugins {
    id 'org.myorg.site'
}

site {
    outputDir = layout.buildDirectory.file("mysite")
    websiteUrl = 'https://gradle.org'
    vcsUrl = 'https://github.com/gradle/gradle-site-plugin'
}

As the number of exposed properties grows, you should introduce a nested, more expressive structure.

The following code snippet adds a new configuration block named siteInfo as part of the extension. This provides a stronger indication of what those properties mean:

build.gradle.kts
plugins {
    id("org.myorg.site")
}

site {
    outputDir = layout.buildDirectory.file("mysite")

    siteInfo {
        websiteUrl = "https://gradle.org"
        vcsUrl = "https://github.com/gradle/gradle-site-plugin"
    }
}
build.gradle
plugins {
    id 'org.myorg.site'
}

site {
    outputDir = layout.buildDirectory.file("mysite")

    siteInfo {
        websiteUrl = 'https://gradle.org'
        vcsUrl = 'https://github.com/gradle/gradle-site-plugin'
    }
}

Implementing the backing objects for such an extension is simple. First, introduce a new data object for managing the properties websiteUrl and vcsUrl:

SiteInfo.java
abstract public class SiteInfo {

    abstract public Property<String> getWebsiteUrl();

    abstract public Property<String> getVcsUrl();
}

In the extension, expose an instance of the SiteInfo class and a method that delegates the captured values to that data instance.

To configure underlying data objects, define a parameter of type Action.

The following example demonstrates the use of Action in an extension definition:

SiteExtension.java
abstract public class SiteExtension {

    abstract public RegularFileProperty getOutputDir();

    @Nested
    abstract public SiteInfo getSiteInfo();

    public void siteInfo(Action<? super SiteInfo> action) {
        action.execute(getSiteInfo());
    }
}
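
For the site { } block used at the start of this section to exist, a plugin still has to register the extension under the name site. A minimal sketch of that wiring (the class name is hypothetical, written in Kotlin for illustration):

SitePlugin.kt
import org.gradle.api.Plugin
import org.gradle.api.Project

// Hypothetical plugin class: registers SiteExtension under the "site" name used by the build scripts above.
class SitePlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.extensions.create("site", SiteExtension::class.java)
    }
}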

Mapping extension properties to task properties

Plugins commonly use an extension to capture user input from the build script and map it to a custom task’s input/output properties. The build script author interacts with the extension’s DSL, while the plugin implementation handles the underlying logic:

app/build.gradle.kts
// Extension class to capture user input
class MyExtension {
    @Input
    var inputParameter: String? = null
}

// Custom task that uses the input from the extension
class MyCustomTask : org.gradle.api.DefaultTask() {
    @Input
    var inputParameter: String? = null

    @TaskAction
    fun executeTask() {
        println("Input parameter: $inputParameter")
    }
}

// Plugin class that configures the extension and task
class MyPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Create and configure the extension
        val extension = project.extensions.create("myExtension", MyExtension::class.java)
        // Create and configure the custom task
        project.tasks.register("myTask", MyCustomTask::class.java) {
            group = "custom"
            inputParameter = extension.inputParameter
        }
    }
}
app/build.gradle
// Extension class to capture user input
class MyExtension {
    @Input
    String inputParameter = null
}

// Custom task that uses the input from the extension
class MyCustomTask extends DefaultTask {
    @Input
    String inputParameter = null

    @TaskAction
    def executeTask() {
        println("Input parameter: $inputParameter")
    }
}

// Plugin class that configures the extension and task
class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        // Create and configure the extension
        def extension = project.extensions.create("myExtension", MyExtension)
        // Create and configure the custom task
        project.tasks.register("myTask", MyCustomTask) {
            group = "custom"
            inputParameter = extension.inputParameter
        }
    }
}

In this example, the MyExtension class defines an inputParameter property that can be set in the build script. The MyPlugin class configures this extension and uses its inputParameter value to configure the MyCustomTask task. The MyCustomTask task then uses this input parameter in its logic.

You can learn more about types you can use in task implementations and extensions in Lazy Configuration.

Adding default configuration with conventions

Plugins should provide sensible defaults and standards in a specific context, reducing the number of decisions users need to make. Using the project object, you can define default values. These are known as conventions.

Conventions are properties that are initialized with default values and can be overridden by the user in their build script. For example:

build.gradle.kts
interface GreetingPluginExtension {
    val message: Property<String>
}

class GreetingPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Add the 'greeting' extension object
        val extension = project.extensions.create<GreetingPluginExtension>("greeting")
        extension.message.convention("Hello from GreetingPlugin")
        // Add a task that uses configuration from the extension object
        project.task("hello") {
            doLast {
                println(extension.message.get())
            }
        }
    }
}

apply<GreetingPlugin>()
build.gradle
interface GreetingPluginExtension {
    Property<String> getMessage()
}

class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        // Add the 'greeting' extension object
        def extension = project.extensions.create('greeting', GreetingPluginExtension)
        extension.message.convention('Hello from GreetingPlugin')
        // Add a task that uses configuration from the extension object
        project.task('hello') {
            doLast {
                println extension.message.get()
            }
        }
    }
}

apply plugin: GreetingPlugin
$ gradle -q hello
Hello from GreetingPlugin

In this example, GreetingPluginExtension is a class that represents the convention. The message property is the convention property with a default value of 'Hello from GreetingPlugin'.

Users can override this value in their build script:

build.gradle.kts
configure<GreetingPluginExtension> {
    message = "Custom message"
}
build.gradle
greeting {
    message = 'Custom message'
}
$ gradle -q hello
Custom message

Separating capabilities from conventions

Separating capabilities from conventions in plugins allows users to choose which tasks and conventions to apply.

For example, the Java Base plugin provides un-opinionated (i.e., generic) functionality like SourceSets, while the Java plugin adds tasks and conventions familiar to Java developers like classes, jar or javadoc.

When designing your own plugins, consider developing two plugins — one for capabilities and another for conventions — to offer flexibility to users.

In the example below, MyPlugin contains conventions, and MyBasePlugin defines capabilities. MyPlugin then applies MyBasePlugin; this is called plugin composition. To apply one plugin from another:

MyBasePlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class MyBasePlugin implements Plugin<Project> {
    public void apply(Project project) {
        // define capabilities
    }
}
MyPlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class MyPlugin implements Plugin<Project> {
    public void apply(Project project) {
        project.getPlugins().apply(MyBasePlugin.class);

        // define conventions
    }
}

Reacting to plugins

A common pattern in Gradle plugin implementations is configuring the runtime behavior of existing plugins and tasks in a build.

For example, a plugin could assume that it is applied to a Java-based project and automatically reconfigure the standard source directory:

InhouseStrongOpinionConventionJavaPlugin.java
public class InhouseStrongOpinionConventionJavaPlugin implements Plugin<Project> {
    public void apply(Project project) {
        // Careful! Eagerly applying plugins has downsides, and is not always recommended.
        project.getPlugins().apply(JavaPlugin.class);
        SourceSetContainer sourceSets = project.getExtensions().getByType(SourceSetContainer.class);
        SourceSet main = sourceSets.getByName(SourceSet.MAIN_SOURCE_SET_NAME);
        main.getJava().setSrcDirs(Arrays.asList("src"));
    }
}

The drawback to this approach is that it automatically forces the project to apply the Java plugin, imposing a strong opinion on it (i.e., reducing flexibility and generality). In practice, the project applying the plugin might not even deal with Java code.

Instead of automatically applying the Java plugin, the plugin could react to the fact that the consuming project applies the Java plugin. Only if that is the case, then a certain configuration is applied:

InhouseConventionJavaPlugin.java
public class InhouseConventionJavaPlugin implements Plugin<Project> {
    public void apply(Project project) {
        project.getPlugins().withType(JavaPlugin.class, javaPlugin -> {
            SourceSetContainer sourceSets = project.getExtensions().getByType(SourceSetContainer.class);
            SourceSet main = sourceSets.getByName(SourceSet.MAIN_SOURCE_SET_NAME);
            main.getJava().setSrcDirs(Arrays.asList("src"));
        });
    }
}

Reacting to plugins is preferred over applying plugins if there is no good reason to assume that the consuming project has the expected setup.

The same concept applies to task types:

InhouseConventionWarPlugin.java
public class InhouseConventionWarPlugin implements Plugin<Project> {
    public void apply(Project project) {
        project.getTasks().withType(War.class).configureEach(war ->
            war.setWebXml(project.file("src/someWeb.xml")));
    }
}

Reacting to build features

Plugins can access the status of build features in the build. The Build Features API allows checking whether the user requested a particular Gradle feature and if it is active in the current build. An example of a build feature is the configuration cache.

There are two main use cases:

  • Using the status of build features in reports or statistics.

  • Incrementally adopting experimental Gradle features by disabling incompatible plugin functionality.

Below is an example of a plugin that makes use of both cases.

Reacting to build features
public abstract class MyPlugin implements Plugin<Project> {

    @Inject
    protected abstract BuildFeatures getBuildFeatures(); // (1)

    @Override
    public void apply(Project p) {
        BuildFeatures buildFeatures = getBuildFeatures();

        Boolean configCacheRequested = buildFeatures.getConfigurationCache().getRequested() // (2)
            .getOrNull(); // could be null if user did not opt in nor opt out
        String configCacheUsage = describeFeatureUsage(configCacheRequested);
        MyReport myReport = new MyReport();
        myReport.setConfigurationCacheUsage(configCacheUsage);

        boolean isolatedProjectsActive = buildFeatures.getIsolatedProjects().getActive() // (3)
            .get(); // the active state is always defined
        if (!isolatedProjectsActive) {
            myOptionalPluginLogicIncompatibleWithIsolatedProjects();
        }
    }

    private String describeFeatureUsage(Boolean requested) {
        return requested == null ? "no preference" : requested ? "opt-in" : "opt-out";
    }

    private void myOptionalPluginLogicIncompatibleWithIsolatedProjects() {
    }
}
  1. The BuildFeatures service can be injected into plugins, tasks, and other managed types.

  2. Accessing the requested status of a feature for reporting.

  3. Using the active status of a feature to disable incompatible functionality.

Build feature properties

The status properties of a BuildFeature are represented by Provider<Boolean> types.

The BuildFeature.getRequested() status of a build feature determines if the user requested to enable or disable the feature.

When the requested provider value is:

  • true — the user opted in for using the feature

  • false — the user opted out from using the feature

  • undefined — the user neither opted in nor opted out from using the feature

The BuildFeature.getActive() status of a build feature is always defined. It represents the effective state of the feature in the build.

When the active provider value is:

  • true — the feature may affect the build behavior in a way specific to the feature

  • false — the feature will not affect the build behavior

Note that the active status does not depend on the requested status. Even if the user requests a feature, it may still not be active due to other build options being used in the build. Gradle can also activate a feature by default, even if the user did not specify a preference.
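
The fragment below is a rough sketch (the class name MyFeatureAwarePlugin is made up for illustration) of how the two providers described above might be read from a plugin:

MyFeatureAwarePlugin.java
import javax.inject.Inject;

import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.configuration.BuildFeatures;

public abstract class MyFeatureAwarePlugin implements Plugin<Project> {

    @Inject
    protected abstract BuildFeatures getBuildFeatures();

    @Override
    public void apply(Project project) {
        // requested may be undefined, so use getOrNull() and handle all three states
        Boolean requested = getBuildFeatures().getConfigurationCache().getRequested().getOrNull();
        if (requested == null) {
            // the user neither opted in nor opted out
        } else if (requested) {
            // the user opted in
        } else {
            // the user opted out
        }

        // active is always defined, so get() is safe
        boolean active = getBuildFeatures().getConfigurationCache().getActive().get();
        if (!active) {
            // functionality that is incompatible with the feature can safely run here
        }
    }
}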

Using a custom dependencies block

Note
Custom dependencies blocks are based on incubating APIs.

A plugin can provide dependency declarations in custom blocks that allow users to declare dependencies in a type-safe and context-aware way.

For instance, instead of users needing to know and use the underlying Configuration name to add dependencies, a custom dependencies block lets the plugin pick a meaningful name that can be used consistently.

Adding a custom dependencies block

To add a custom dependencies block, you need to create a new type that will represent the set of dependency scopes available to users. That new type needs to be accessible from a part of your plugin (from a domain object or extension). Finally, the dependency scopes need to be wired back to underlying Configuration objects that will be used during dependency resolution.

See JvmComponentDependencies and JvmTestSuite for an example of how this is used in a Gradle core plugin.

1. Create an interface that extends Dependencies
Note
You can also extend GradleDependencies to get access to Gradle-provided dependencies like gradleApi().
ExampleDependencies.java
/**
 * Custom dependencies block for the example plugin.
 */
public interface ExampleDependencies extends Dependencies {
2. Add accessors for dependency scopes

For each dependency scope your plugin wants to support, add a getter method that returns a DependencyCollector.

ExampleDependencies.java
    /**
     * Dependency scope called "implementation"
     */
    DependencyCollector getImplementation();
3. Add accessors for custom dependencies block

To make the custom dependencies block configurable, the plugin needs to add a getDependencies method that returns the new type from above and a configurable block method named dependencies.

By convention, the accessors for your custom dependencies block should be called getDependencies()/dependencies(Action). They could be named something else, but users would then need to know that the differently named block behaves like a dependencies block.

ExampleExtension.java
    /**
     * Custom dependencies for this extension.
     */
    @Nested
    ExampleDependencies getDependencies();

    /**
     * Configurable block
     */
    default void dependencies(Action<? super ExampleDependencies> action) {
        action.execute(getDependencies());
    }
4. Wire dependency scope to Configuration

Finally, the plugin needs to wire the custom dependencies block to some underlying Configuration objects. If this is not done, none of the dependencies declared in the custom block will be available to dependency resolution.

ExamplePlugin.java
        project.getConfigurations().dependencyScope("exampleImplementation", conf -> {
            conf.fromDependencyCollector(example.getDependencies().getImplementation());
        });
Note
In this example, the name users will use to add dependencies is "implementation", but the underlying Configuration is named exampleImplementation.
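For orientation, the fragment above might sit in a plugin roughly like the following sketch. It is not the complete sample source; it assumes the ExampleExtension type from step 3 is registered under the name example, matching the build scripts below:

ExamplePlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class ExamplePlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        // Create the extension that exposes the custom dependencies block under the name "example"
        ExampleExtension example = project.getExtensions().create("example", ExampleExtension.class);

        // Wire the "implementation" dependency scope to an underlying dependency scope configuration
        project.getConfigurations().dependencyScope("exampleImplementation", conf ->
            conf.fromDependencyCollector(example.getDependencies().getImplementation()));
    }
}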
build.gradle.kts
example {
    dependencies {
        implementation("junit:junit:4.13")
    }
}
build.gradle
example {
    dependencies {
        implementation("junit:junit:4.13")
    }
}
Differences between the custom dependencies and the top-level dependencies blocks

Each dependency scope returns a DependencyCollector that provides strongly-typed methods to add and configure dependencies.

There is also a DependencyFactory with factory methods to create new dependencies from different notations. Dependencies can be created lazily using these factory methods, as shown below.

A custom dependencies block differs from the top-level dependencies block in the following ways:

  • Dependencies must be declared using a String, an instance of Dependency, a FileCollection, a Provider of Dependency, or a ProviderConvertible of MinimalExternalModuleDependency.

  • Outside of Gradle build scripts, you must explicitly call the getter for the DependencyCollector and then call add.

    • dependencies.add("implementation", x) becomes getImplementation().add(x)

  • You cannot declare dependencies with the Map notation from Kotlin and Java. Use multi-argument methods instead in Kotlin and Java.

    • Kotlin: compileOnly(mapOf("group" to "foo", "name" to "bar")) becomes compileOnly(module(group = "foo", name = "bar"))

    • Java: compileOnly(Map.of("group", "foo", "name", "bar")) becomes getCompileOnly().add(module("foo", "bar", null))

  • You cannot add a dependency with an instance of Project. You must turn it into a ProjectDependency first.

  • You cannot add version catalog bundles directly. Instead, use the bundle method on each configuration.

    • Kotlin and Groovy: implementation(libs.bundles.testing) becomes implementation.bundle(libs.bundles.testing)

  • You cannot use providers for non-Dependency types directly. Instead, map them to a Dependency using the DependencyFactory.

    • Kotlin and Groovy: implementation(myStringProvider) becomes implementation(myStringProvider.map { dependencyFactory.create(it) })

    • Java: implementation(myStringProvider) becomes getImplementation().add(myStringProvider.map(getDependencyFactory()::create))

  • Unlike the top-level dependencies block, constraints are not in a separate block.

    • Instead, constraints are added by decorating a dependency with constraint(…) like implementation(constraint("org:foo:1.0")).

Keep in mind that the dependencies block may not provide access to the same methods as the top-level dependencies block.
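
As a hedged illustration of these transformations, the sketch below declares dependencies on the ExampleDependencies block from Java code rather than a build script. The class name ExampleJavaUsagePlugin is made up, and the sketch assumes the example extension from the previous section is present:

ExampleJavaUsagePlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.artifacts.Dependency;
import org.gradle.api.provider.Provider;

public class ExampleJavaUsagePlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        ExampleDependencies dependencies =
            project.getExtensions().getByType(ExampleExtension.class).getDependencies();

        // dependencies.add("implementation", x) becomes getImplementation().add(x)
        dependencies.getImplementation().add("junit:junit:4.13");

        // A Provider of a non-Dependency type is mapped to a Dependency via the DependencyFactory
        Provider<String> coordinates = project.getProviders().provider(() -> "junit:junit:4.13");
        dependencies.getImplementation().add(
            coordinates.map(dependencies.getDependencyFactory()::create));
    }
}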

Note
Plugins should prefer adding dependencies via their own dependencies block.

Providing default dependencies

The implementation of a plugin sometimes requires the use of an external dependency.

You might want to automatically download an artifact using Gradle’s dependency management mechanism and later use it in the action of a task type declared in the plugin. Ideally, the plugin implementation does not need to ask the user for the coordinates of that dependency - it can simply predefine a sensible default version.

Let’s look at an example of a plugin that downloads files containing data for further processing. The plugin implementation declares a custom configuration that allows for assigning those external dependencies with dependency coordinates:

DataProcessingPlugin.java
public class DataProcessingPlugin implements Plugin<Project> {
    public void apply(Project project) {
        Configuration dataFiles = project.getConfigurations().create("dataFiles", c -> {
            c.setVisible(false);
            c.setCanBeConsumed(false);
            c.setCanBeResolved(true);
            c.setDescription("The data artifacts to be processed for this plugin.");
            c.defaultDependencies(d -> d.add(project.getDependencies().create("org.myorg:data:1.4.6")));
        });

        project.getTasks().withType(DataProcessing.class).configureEach(
            dataProcessing -> dataProcessing.getDataFiles().from(dataFiles));
    }
}
DataProcessing.java
abstract public class DataProcessing extends DefaultTask {

    @InputFiles
    abstract public ConfigurableFileCollection getDataFiles();

    @TaskAction
    public void process() {
        System.out.println(getDataFiles().getFiles());
    }
}

This approach is convenient for the end user as there is no need to actively declare a dependency. The plugin already provides all the details about this implementation.

But what if the user wants to redefine the default dependency?

No problem. The plugin also exposes the custom configuration that can be used to assign a different dependency. Effectively, the default dependency is overwritten:

build.gradle.kts
plugins {
    id("org.myorg.data-processing")
}

dependencies {
    dataFiles("org.myorg:more-data:2.6")
}
build.gradle
plugins {
    id 'org.myorg.data-processing'
}

dependencies {
    dataFiles 'org.myorg:more-data:2.6'
}

You will find that this pattern works well for tasks that require an external dependency when the task’s action is executed. You can go further and abstract the version to be used for the external dependency by exposing an extension property (e.g. toolVersion in the JaCoCo plugin).
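
As a sketch of that last idea (not taken from the JaCoCo plugin; the extension name dataProcessing and the property toolVersion are assumptions), the default dependency version can be derived from an extension property:

DataProcessingVersionPlugin.java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.provider.Property;

public class DataProcessingVersionPlugin implements Plugin<Project> {

    // Hypothetical extension exposing the version of the default data artifact
    public interface DataProcessingExtension {
        Property<String> getToolVersion();
    }

    public void apply(Project project) {
        DataProcessingExtension extension =
            project.getExtensions().create("dataProcessing", DataProcessingExtension.class);
        extension.getToolVersion().convention("1.4.6");

        project.getConfigurations().create("dataFiles", c -> {
            c.setVisible(false);
            c.setCanBeConsumed(false);
            c.setCanBeResolved(true);
            c.setDescription("The data artifacts to be processed for this plugin.");
            // defaultDependencies is evaluated lazily, so a toolVersion set by the build script is picked up
            c.defaultDependencies(d -> d.add(
                project.getDependencies().create("org.myorg:data:" + extension.getToolVersion().get())));
        });
    }
}

A consuming build script could then set dataProcessing.toolVersion to switch versions without redeclaring the dependency.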

Minimizing the use of external libraries

Using external libraries in your Gradle projects can bring great convenience, but be aware that they can introduce complex dependency graphs. Gradle’s buildEnvironment task can help you visualize these dependencies, including those of your plugins. Keep in mind that plugins share the same classloader, so conflicts may arise with different versions of the same library.

To demonstrate let’s assume the following build script:

build.gradle.kts
plugins {
    id("org.asciidoctor.jvm.convert") version "4.0.2"
}
build.gradle
plugins {
    id 'org.asciidoctor.jvm.convert' version '4.0.2'
}

The output of the task clearly indicates the classpath of the classpath configuration:

$ gradle buildEnvironment

> Task :buildEnvironment

------------------------------------------------------------
Root project 'external-libraries'
------------------------------------------------------------

classpath
\--- org.asciidoctor.jvm.convert:org.asciidoctor.jvm.convert.gradle.plugin:4.0.2
     \--- org.asciidoctor:asciidoctor-gradle-jvm:4.0.2
          +--- org.ysb33r.gradle:grolifant-rawhide:3.0.0
          |    \--- org.tukaani:xz:1.6
          +--- org.ysb33r.gradle:grolifant-herd:3.0.0
          |    +--- org.tukaani:xz:1.6
          |    +--- org.ysb33r.gradle:grolifant40:3.0.0
          |    |    +--- org.tukaani:xz:1.6
          |    |    +--- org.apache.commons:commons-collections4:4.4
          |    |    +--- org.ysb33r.gradle:grolifant-core:3.0.0
          |    |    |    +--- org.tukaani:xz:1.6
          |    |    |    +--- org.apache.commons:commons-collections4:4.4
          |    |    |    \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
          |    |    \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
          |    +--- org.ysb33r.gradle:grolifant50:3.0.0
          |    |    +--- org.tukaani:xz:1.6
          |    |    +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
          |    |    +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
          |    |    \--- org.ysb33r.gradle:grolifant40-legacy-api:3.0.0
          |    |         +--- org.tukaani:xz:1.6
          |    |         +--- org.apache.commons:commons-collections4:4.4
          |    |         +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
          |    |         \--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
          |    +--- org.ysb33r.gradle:grolifant60:3.0.0
          |    |    +--- org.tukaani:xz:1.6
          |    |    +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
          |    |    +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
          |    |    +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
          |    |    \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
          |    +--- org.ysb33r.gradle:grolifant70:3.0.0
          |    |    +--- org.tukaani:xz:1.6
          |    |    +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
          |    |    +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
          |    |    +--- org.ysb33r.gradle:grolifant60:3.0.0 (*)
          |    |    \--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
          |    +--- org.ysb33r.gradle:grolifant80:3.0.0
          |    |    +--- org.tukaani:xz:1.6
          |    |    +--- org.ysb33r.gradle:grolifant40:3.0.0 (*)
          |    |    +--- org.ysb33r.gradle:grolifant50:3.0.0 (*)
          |    |    +--- org.ysb33r.gradle:grolifant60:3.0.0 (*)
          |    |    +--- org.ysb33r.gradle:grolifant70:3.0.0 (*)
          |    |    \--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
          |    +--- org.ysb33r.gradle:grolifant-core:3.0.0 (*)
          |    \--- org.ysb33r.gradle:grolifant-rawhide:3.0.0 (*)
          +--- org.asciidoctor:asciidoctor-gradle-base:4.0.2
          |    \--- org.ysb33r.gradle:grolifant-herd:3.0.0 (*)
          \--- org.asciidoctor:asciidoctorj-api:2.5.7

(*) - Indicates repeated occurrences of a transitive dependency subtree. Gradle expands transitive dependency subtrees only once per project; repeat occurrences only display the root of the subtree, followed by this annotation.

A web-based, searchable dependency report is available by adding the --scan option.

BUILD SUCCESSFUL in 0s
1 actionable task: 1 executed

A Gradle plugin does not run in its own, isolated classloader, so you must consider whether you truly need a library or if a simpler solution suffices.

For logic that is executed as part of task execution, use the Worker API that allows you to isolate libraries.
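
The following is a minimal sketch of that approach (the task, parameter, and classpath property names are assumptions): the external library is put on a dedicated classpath property and only loaded inside an isolated worker classloader.

IsolatedToolTask.java
import javax.inject.Inject;

import org.gradle.api.DefaultTask;
import org.gradle.api.file.ConfigurableFileCollection;
import org.gradle.api.provider.Property;
import org.gradle.api.tasks.Classpath;
import org.gradle.api.tasks.TaskAction;
import org.gradle.workers.WorkAction;
import org.gradle.workers.WorkParameters;
import org.gradle.workers.WorkQueue;
import org.gradle.workers.WorkerExecutor;

public abstract class IsolatedToolTask extends DefaultTask {

    // The external library, typically wired from a dedicated configuration by the plugin
    @Classpath
    public abstract ConfigurableFileCollection getToolClasspath();

    @Inject
    protected abstract WorkerExecutor getWorkerExecutor();

    @TaskAction
    public void runTool() {
        // The library is only visible to the isolated worker classloader, not to the plugin classloader
        WorkQueue workQueue = getWorkerExecutor().classLoaderIsolation(spec ->
            spec.getClasspath().from(getToolClasspath()));
        workQueue.submit(ToolWorkAction.class, parameters ->
            parameters.getInput().set("some input"));
    }

    public interface ToolParameters extends WorkParameters {
        Property<String> getInput();
    }

    public abstract static class ToolWorkAction implements WorkAction<ToolParameters> {
        @Override
        public void execute() {
            // Call into the external library here, using getParameters().getInput() as needed
        }
    }
}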

Providing multiple variants of a plugin

Variants of a plugin refer to different flavors or configurations of the plugin that are tailored to specific needs or use cases. These variants can include different implementations, extensions, or configurations of the base plugin.

The most convenient way to configure additional plugin variants is to use feature variants, a concept available in all Gradle projects that apply one of the Java plugins:

def someFeature = sourceSets.create('someFeature')    // source set backing the new variant

java {
    registerFeature(someFeature.name) {                // declares an additional (feature) variant
        usingSourceSet(someFeature)                    // see the full example below
    }
}

In the following example, each plugin variant is developed in isolation. A separate source set is compiled and packaged in a separate jar for each variant.

The following sample demonstrates how to add a variant that is compatible with Gradle 7.0+ while the "main" variant is compatible with older versions:

build.gradle.kts
val gradle7 = sourceSets.create("gradle7")

java {
    registerFeature(gradle7.name) {
        usingSourceSet(gradle7)
        capability(project.group.toString(), project.name, project.version.toString()) // (1)
    }
}

configurations.configureEach {
    if (isCanBeConsumed && name.startsWith(gradle7.name))  {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE, // (2)
                objects.named("7.0"))
        }
    }
}

tasks.named<Copy>(gradle7.processResourcesTaskName) { // (3)
    val copyPluginDescriptors = rootSpec.addChild()
    copyPluginDescriptors.into("META-INF/gradle-plugins")
    copyPluginDescriptors.from(tasks.pluginDescriptors)
}

dependencies {
    "gradle7CompileOnly"(gradleApi()) // (4)
}
build.gradle
def gradle7 = sourceSets.create('gradle7')

java {
    registerFeature(gradle7.name) {
        usingSourceSet(gradle7)
        capability(project.group.toString(), project.name, project.version.toString()) // (1)
    }
}

configurations.configureEach {
    if (canBeConsumed && name.startsWith(gradle7.name))  {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE, // (2)
                      objects.named(GradlePluginApiVersion, '7.0'))
        }
    }
}

tasks.named(gradle7.processResourcesTaskName) { // (3)
    def copyPluginDescriptors = rootSpec.addChild()
    copyPluginDescriptors.into('META-INF/gradle-plugins')
    copyPluginDescriptors.from(tasks.pluginDescriptors)
}

dependencies {
    gradle7CompileOnly(gradleApi()) // (4)
}
Note
Only Gradle versions 7 or higher can be explicitly targeted by a variant, as support for this was only added in Gradle 7.

First, we declare a separate source set and a feature variant for our Gradle 7 plugin variant. Then, we do some specific wiring to turn the feature into a proper Gradle plugin variant:

  1. Assign the implicit capability that corresponds to the component's GAV to the variant.

  2. Assign the Gradle API version attribute to all consumable configurations of our Gradle7 variant. Gradle uses this information to determine which variant to select during plugin resolution.

  3. Configure the processGradle7Resources task to ensure the plugin descriptor file is added to the Gradle7 variant Jar.

  4. Add a dependency to the gradleApi() for our new variant so that the API is visible during compilation time.

Note that there is currently no convenient way to access the API of a Gradle version other than the one you are building the plugin with. Ideally, every variant should be able to declare a dependency on the API of the minimal Gradle version it supports. This will be improved in the future.

The above snippet assumes that all variants of your plugin have the plugin class at the same location. That is, if your plugin class is org.example.GreetingPlugin, you need to create a second variant of that class in src/gradle7/java/org/example.

Using version-specific variants of multi-variant plugins

Given a dependency on a multi-variant plugin, Gradle will automatically choose the variant that best matches the current Gradle version when resolving the plugin.

The best matching variant is the variant that targets the highest Gradle API version and does not exceed the current build’s Gradle version.

In all other cases, a plugin variant that does not specify the supported Gradle API version is preferred if such a variant is present.

In projects that use plugins as dependencies, requesting the variants of plugin dependencies that support a different Gradle version is possible. This allows a multi-variant plugin that depends on other plugins to access their APIs, which are exclusively provided in their version-specific variants.

This snippet makes the plugin variant gradle7 defined above consume the matching variants of its dependencies on other multi-variant plugins:

build.gradle.kts
configurations.configureEach {
    if (isCanBeResolved && name.startsWith(gradle7.name))  {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE,
                objects.named("7.0"))
        }
    }
}
build.gradle
configurations.configureEach {
    if (canBeResolved && name.startsWith(gradle7.name))  {
        attributes {
            attribute(GradlePluginApiVersion.GRADLE_PLUGIN_API_VERSION_ATTRIBUTE,
                objects.named(GradlePluginApiVersion, '7.0'))
        }
    }
}

Reporting problems

Plugins can report problems through Gradle’s problem-reporting APIs. The APIs report rich, structured information about problems happening during the build. This information can be used by different user interfaces such as Gradle’s console output, Build Scans, or IDEs to communicate problems to the user in the most appropriate way.

The following example shows an issue reported from a plugin:

ProblemReportingPlugin.java
public class ProblemReportingPlugin implements Plugin<Project> {

    private final ProblemReporter problemReporter;

    @Inject
    public ProblemReportingPlugin(Problems problems) { // (1)
        this.problemReporter = problems.forNamespace("org.myorg"); // (2)
    }

    public void apply(Project project) {
        this.problemReporter.reporting(builder -> builder // (3)
            .id("adhoc-deprecation", "Plugin 'x' is deprecated")
            .details("The plugin 'x' is deprecated since version 2.5")
            .solution("Please use plugin 'y'")
            .severity(Severity.WARNING)
        );
    }
}
  1. The Problem service is injected into the plugin.

  2. A problem reporter is created for the plugin. While the namespace is up to the plugin author, it is recommended that the plugin ID be used.

  3. A problem is reported. This problem is recoverable so that the build will continue.

For a full example, see our end-to-end sample.

Problem building

When reporting a problem, a wide variety of information can be provided. The ProblemSpec describes all the information that can be provided.

Reporting problems

When it comes to reporting problems, we support three different modes:

  • Reporting a problem is used for reporting problems that are recoverable, and the build should continue.

  • Throwing a problem is used for reporting problems that are not recoverable, and the build should fail.

  • Rethrowing a problem is used to wrap an already thrown exception. Otherwise, the behavior is the same as Throwing.

For more details, see the ProblemReporter documentation.

Problem aggregation

When reporting problems, Gradle will aggregate similar problems by sending them through the Tooling API based on the problem’s category label.

  • When a problem is reported, the first occurrence is going to be reported as a ProblemDescriptor, containing the complete information about the problem.

  • Any subsequent occurrences of the same problem will be reported as a ProblemAggregationDescriptor. This descriptor will arrive at the end of the build and contain the number of occurrences of the problem.

  • If, for any bucket (i.e., category and label pairing), the number of collected occurrences is greater than 10,000, then it will be sent immediately instead of at the end of the build.

Testing Gradle plugins

Testing plays a crucial role in the development process by ensuring reliable and high-quality software. This principle applies to build code, including Gradle plugins.

The sample project

This section revolves around a sample project called the "URL verifier plugin". This plugin creates a task named verifyUrl that checks whether a given URL can be resolved via HTTP GET. The end user can provide the URL via an extension named verification.

The following build script assumes that the plugin JAR file has been published to a binary repository. The script demonstrates how to apply the plugin to the project and configure its exposed extension:

build.gradle.kts
plugins {
    id("org.myorg.url-verifier")        // (1)
}

verification {
    url = "https://www.google.com/"  // (2)
}
build.gradle
plugins {
    id 'org.myorg.url-verifier'         // (1)
}

verification {
    url = 'https://www.google.com/'     // (2)
}
  1. Applies the plugin to the project

  2. Configures the URL to be verified through the exposed extension

Executing the verifyUrl task renders a success message if the HTTP GET call to the configured URL returns with a 200 response code:

$ gradle verifyUrl

> Task :verifyUrl
Successfully resolved URL 'https://www.google.com/'

BUILD SUCCESSFUL in 0s
5 actionable tasks: 5 executed

Before diving into the code, let’s first revisit the different types of tests and the tooling that supports implementing them.

The importance of testing

Testing is a crucial part of the software development life cycle, ensuring that software functions correctly and meets quality standards before release. Automated testing allows developers to refactor and improve code with confidence.

The testing pyramid
Manual Testing

While manual testing is straightforward, it is error-prone and requires human effort. For Gradle plugins, manual testing involves using the plugin in a build script.

Automated Testing

Automated testing includes unit, integration, and functional testing.

testing pyramid

The testing pyramid introduced by Mike Cohen in his book Succeeding with Agile: Software Development Using Scrum describes three types of automated tests:

  1. Unit Testing: Verifies the smallest units of code, typically methods, in isolation. It uses Stubs or Mocks to isolate code from external dependencies.

  2. Integration Testing: Validates that multiple units or components work together.

  3. Functional Testing: Tests the system from the end user’s perspective, ensuring correct functionality. End-to-end tests for Gradle plugins simulate a build, apply the plugin, and execute specific tasks to verify functionality.

Tooling support

Testing Gradle plugins, both manually and automatically, is simplified with the appropriate tools. The table below provides a summary of each testing approach. You can choose any test framework you’re comfortable with.

For detailed explanations and code examples, refer to the specific sections below:

Test type          Tooling support

Manual tests       Gradle composite builds
Unit tests         Any JVM-based test framework
Integration tests  Any JVM-based test framework
Functional tests   Any JVM-based test framework and Gradle TestKit

Setting up manual tests

The composite builds feature of Gradle makes it easy to test a plugin manually. The standalone plugin project and the consuming project can be combined into a single unit, making it straightforward to try out or debug changes without re-publishing the binary file:

.
├── include-plugin-build   // (1)
│   ├── build.gradle
│   └── settings.gradle
└── url-verifier-plugin    // (2)
    ├── build.gradle
    ├── settings.gradle
    └── src
  1. Consuming project that includes the plugin project

  2. The plugin project

There are two ways to include a plugin project in a consuming project:

  1. By using the command line option --include-build.

  2. By using the method includeBuild in settings.gradle.

The following code snippet demonstrates the use of the settings file:

settings.gradle.kts
pluginManagement {
    includeBuild("../url-verifier-plugin")
}
settings.gradle
pluginManagement {
    includeBuild '../url-verifier-plugin'
}

The command line output of the verifyUrl task from the project include-plugin-build looks exactly the same as shown in the introduction, except that it now executes as part of a composite build.

Manual testing has its place in the development process, but it is not a replacement for automated testing.

Setting up automated tests

Setting up a suite of tests early on is crucial to the success of your plugin. Automated tests become an invaluable safety net when upgrading the plugin to a new Gradle version or enhancing/refactoring the code.

Organizing test source code

We recommend implementing a good distribution of unit, integration, and functional tests to cover the most important use cases. Separating the source code for each test type automatically results in a project that is more maintainable and manageable.

By default, the Java plugin provides the convention of organizing unit tests in the directory src/test/java. Additionally, if you apply the Groovy plugin, source code under the directory src/test/groovy is considered for compilation (and likewise for Kotlin under the directory src/test/kotlin). Consequently, source code directories for other test types should follow a similar pattern:

.
└── src
    ├── functionalTest
    │   └── groovy      // (1)
    ├── integrationTest
    │   └── groovy      // (2)
    ├── main
    │   ├── java        // (3)
    └── test
        └── groovy      // (4)
  1. Source directory containing functional tests

  2. Source directory containing integration tests

  3. Source directory containing production source code

  4. Source directory containing unit tests

Note
The directories src/integrationTest/groovy and src/functionalTest/groovy are not based on an existing standard convention for Gradle projects. You are free to choose any project layout that works best for you.

You can configure the source directories for compilation and test execution.

The Test Suite plugin provides a DSL and API to model multiple groups of automated tests into test suites in JVM-based projects. You can also rely on third-party plugins for convenience, such as the Nebula Facet plugin or the TestSets plugin.

Modeling test types
Note
A new configuration DSL for modeling the below integrationTest suite is available via the incubating JVM Test Suite plugin.

In Gradle, source code directories are represented using the concept of source sets. A source set is configured to point to one or more directories containing source code. When you define a source set, Gradle automatically sets up compilation tasks for the specified directories.

A pre-configured source set can be created with one line of build script code. The source set automatically registers configurations to define dependencies for the sources of the source set:

// Define a source set named 'test' for test sources
sourceSets {
    test {
        java {
            srcDirs = ['src/test/java']
        }
    }
}
// Specify a test implementation dependency on JUnit
dependencies {
    testImplementation 'junit:junit:4.12'
}

We use that to define an integrationTestImplementation dependency on the project itself, which represents the "main" variant of our project (i.e., the compiled plugin code):

build.gradle.kts
val integrationTest by sourceSets.creating

dependencies {
    "integrationTestImplementation"(project)
}
build.gradle
def integrationTest = sourceSets.create("integrationTest")

dependencies {
    integrationTestImplementation(project)
}

Source sets are responsible for compiling source code, but they do not deal with executing the bytecode. For test execution, a corresponding task of type Test needs to be established. The following setup shows the execution of integration tests, referencing the classes and runtime classpath of the integration test source set:

build.gradle.kts
val integrationTestTask = tasks.register<Test>("integrationTest") {
    description = "Runs the integration tests."
    group = "verification"
    testClassesDirs = integrationTest.output.classesDirs
    classpath = integrationTest.runtimeClasspath
    mustRunAfter(tasks.test)
}
tasks.check {
    dependsOn(integrationTestTask)
}
build.gradle
def integrationTestTask = tasks.register("integrationTest", Test) {
    description = 'Runs the integration tests.'
    group = "verification"
    testClassesDirs = integrationTest.output.classesDirs
    classpath = integrationTest.runtimeClasspath
    mustRunAfter(tasks.named('test'))
}
tasks.named('check') {
    dependsOn(integrationTestTask)
}
Configuring a test framework

Gradle does not dictate the use of a specific test framework. Popular choices include JUnit, TestNG and Spock. Once you choose an option, you have to add its dependency to the compile classpath for your tests.

The following code snippet shows how to use Spock for implementing tests:

build.gradle.kts
repositories {
    mavenCentral()
}

dependencies {
    testImplementation(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    testImplementation("org.spockframework:spock-core")
    testRuntimeOnly("org.junit.platform:junit-platform-launcher")

    "integrationTestImplementation"(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    "integrationTestImplementation"("org.spockframework:spock-core")
    "integrationTestRuntimeOnly"("org.junit.platform:junit-platform-launcher")

    "functionalTestImplementation"(platform("org.spockframework:spock-bom:2.2-groovy-3.0"))
    "functionalTestImplementation"("org.spockframework:spock-core")
    "functionalTestRuntimeOnly"("org.junit.platform:junit-platform-launcher")
}

tasks.withType<Test>().configureEach {
    // Using JUnitPlatform for running tests
    useJUnitPlatform()
}
build.gradle
repositories {
    mavenCentral()
}

dependencies {
    testImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    testImplementation 'org.spockframework:spock-core'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'

    integrationTestImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    integrationTestImplementation 'org.spockframework:spock-core'
    integrationTestRuntimeOnly 'org.junit.platform:junit-platform-launcher'

    functionalTestImplementation platform("org.spockframework:spock-bom:2.2-groovy-3.0")
    functionalTestImplementation 'org.spockframework:spock-core'
    functionalTestRuntimeOnly 'org.junit.platform:junit-platform-launcher'
}

tasks.withType(Test).configureEach {
    // Using JUnitPlatform for running tests
    useJUnitPlatform()
}
Note
Spock is a Groovy-based BDD test framework that even includes APIs for creating Stubs and Mocks. The Gradle team prefers Spock over other options for its expressiveness and conciseness.

Implementing automated tests

This section discusses representative implementation examples for unit, integration, and functional tests. All test classes are based on the use of Spock, though it should be relatively easy to adapt the code to a different test framework.

Implementing unit tests

The URL verifier plugin emits HTTP GET calls to check if a URL can be resolved successfully. The method DefaultHttpCaller.get(String) is responsible for calling a given URL and returns an instance of type HttpResponse. HttpResponse is a POJO containing information about the HTTP response code and message:

HttpResponse.java
package org.myorg.http;

public class HttpResponse {
    private int code;
    private String message;

    public HttpResponse(int code, String message) {
        this.code = code;
        this.message = message;
    }

    public int getCode() {
        return code;
    }

    public String getMessage() {
        return message;
    }

    @Override
    public String toString() {
        return "HTTP " + code + ", Reason: " + message;
    }
}

The class HttpResponse represents a good candidate for a unit test. It does not reach out to any other classes nor does it use the Gradle API.

HttpResponseTest.groovy
package org.myorg.http

import spock.lang.Specification

class HttpResponseTest extends Specification {

    private static final int OK_HTTP_CODE = 200
    private static final String OK_HTTP_MESSAGE = 'OK'

    def "can access information"() {
        when:
        def httpResponse = new HttpResponse(OK_HTTP_CODE, OK_HTTP_MESSAGE)

        then:
        httpResponse.code == OK_HTTP_CODE
        httpResponse.message == OK_HTTP_MESSAGE
    }

    def "can get String representation"() {
        when:
        def httpResponse = new HttpResponse(OK_HTTP_CODE, OK_HTTP_MESSAGE)

        then:
        httpResponse.toString() == "HTTP $OK_HTTP_CODE, Reason: $OK_HTTP_MESSAGE"
    }
}
Important
When writing unit tests, it’s important to test boundary conditions and various forms of invalid input. Try to extract as much logic as possible from classes that use the Gradle API to make it testable as unit tests. It will result in maintainable code and faster test execution.

You can use the ProjectBuilder class to create Project instances to use when you test your plugin implementation.

src/test/java/org/example/GreetingPluginTest.java
public class GreetingPluginTest {
    @Test
    public void greeterPluginAddsGreetingTaskToProject() {
        Project project = ProjectBuilder.builder().build();
        project.getPluginManager().apply("org.example.greeting");

        assertTrue(project.getTasks().getByName("hello") instanceof GreetingTask);
    }
}
Implementing integration tests

Let’s look at a class that reaches out to another system, the piece of code that emits the HTTP calls. At the time of executing a test for the class DefaultHttpCaller, the runtime environment needs to be able to reach out to the internet:

DefaultHttpCaller.java
package org.myorg.http;

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URI;
import java.net.URISyntaxException;

public class DefaultHttpCaller implements HttpCaller {
    @Override
    public HttpResponse get(String url) {
        try {
            HttpURLConnection connection = (HttpURLConnection) new URI(url).toURL().openConnection();
            connection.setConnectTimeout(5000);
            connection.setRequestMethod("GET");
            connection.connect();

            int code = connection.getResponseCode();
            String message = connection.getResponseMessage();
            return new HttpResponse(code, message);
        } catch (IOException e) {
            throw new HttpCallException(String.format("Failed to call URL '%s' via HTTP GET", url), e);
        } catch (URISyntaxException e) {
            throw new RuntimeException(e);
        }
    }
}

Implementing an integration test for DefaultHttpCaller doesn’t look much different from the unit test shown in the previous section:

DefaultHttpCallerIntegrationTest.groovy
package org.myorg.http

import spock.lang.Specification
import spock.lang.Subject

class DefaultHttpCallerIntegrationTest extends Specification {
    @Subject HttpCaller httpCaller = new DefaultHttpCaller()

    def "can make successful HTTP GET call"() {
        when:
        def httpResponse = httpCaller.get('https://www.google.com/')

        then:
        httpResponse.code == 200
        httpResponse.message == 'OK'
    }

    def "throws exception when calling unknown host via HTTP GET"() {
        when:
        httpCaller.get('https://www.wedonotknowyou123.com/')

        then:
        def t = thrown(HttpCallException)
        t.message == "Failed to call URL 'https://www.wedonotknowyou123.com/' via HTTP GET"
        t.cause instanceof UnknownHostException
    }
}
Implementing functional tests

Functional tests verify the correctness of the plugin end-to-end. In practice, this means applying, configuring, and executing the functionality of the plugin implementation. The UrlVerifierPlugin class exposes an extension and a task instance that uses the URL value configured by the end user:

UrlVerifierPlugin.java
package org.myorg;

import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.myorg.tasks.UrlVerify;

public class UrlVerifierPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        UrlVerifierExtension extension = project.getExtensions().create("verification", UrlVerifierExtension.class);
        UrlVerify verifyUrlTask = project.getTasks().create("verifyUrl", UrlVerify.class);
        verifyUrlTask.getUrl().set(extension.getUrl());
    }
}

Every Gradle plugin project should apply the plugin development plugin to reduce boilerplate code. By applying the plugin development plugin, the test source set is preconfigured for use with TestKit. If we want to use a custom source set for functional tests and leave the default test source set for unit tests only, we can configure the plugin development plugin to look for TestKit tests elsewhere.

build.gradle.kts
gradlePlugin {
    testSourceSets(functionalTest)
}
build.gradle
gradlePlugin {
    testSourceSets(sourceSets.functionalTest)
}

Functional tests for Gradle plugins use an instance of GradleRunner to execute the build under test. GradleRunner is an API provided by TestKit, which internally uses the Tooling API to execute the build.

The following example applies the plugin to the build script under test, configures the extension and executes the build with the task verifyUrl. Please see the TestKit documentation to get more familiar with the functionality of TestKit.

UrlVerifierPluginFunctionalTest.groovy
package org.myorg

import org.gradle.testkit.runner.GradleRunner
import spock.lang.Specification
import spock.lang.TempDir

import static org.gradle.testkit.runner.TaskOutcome.SUCCESS

class UrlVerifierPluginFunctionalTest extends Specification {
    @TempDir File testProjectDir
    File buildFile

    def setup() {
        buildFile = new File(testProjectDir, 'build.gradle')
        buildFile << """
            plugins {
                id 'org.myorg.url-verifier'
            }
        """
    }

    def "can successfully configure URL through extension and verify it"() {
        buildFile << """
            verification {
                url = 'https://www.google.com/'
            }
        """

        when:
        def result = GradleRunner.create()
            .withProjectDir(testProjectDir)
            .withArguments('verifyUrl')
            .withPluginClasspath()
            .build()

        then:
        result.output.contains("Successfully resolved URL 'https://www.google.com/'")
        result.task(":verifyUrl").outcome == SUCCESS
    }
}
IDE integration

TestKit determines the plugin classpath by running a specific Gradle task. You will need to execute the assemble task to initially generate the plugin classpath or to reflect changes to it even when running TestKit-based functional tests from the IDE.

Some IDEs provide a convenience option to delegate the "test classpath generation and execution" to the build. In IntelliJ, you can find this option under Preferences… > Build, Execution, Deployment > Build Tools > Gradle > Runner > Delegate IDE build/run actions to Gradle.

intellij delegate to build

Publishing Plugins to the Gradle Plugin Portal

Publishing a plugin is the primary way to make it available for others to use. While you can publish to a private repository to restrict access, publishing to the Gradle Plugin Portal makes your plugin available to anyone in the world.

plugin portal page

This guide shows you how to use the com.gradle.plugin-publish plugin to publish plugins to the Gradle Plugin Portal using a convenient DSL. This approach streamlines configuration steps and provides validation checks to ensure your plugin meets the Gradle Plugin Portal’s criteria.

Prerequisites

You’ll need an existing Gradle plugin project for this tutorial. If you don’t have one, use the Greeting plugin sample.

Attempting to publish this plugin will safely fail with a permission error, so don’t worry about cluttering up the Gradle Plugin Portal with a trivial example plugin.

Account setup

Before publishing your plugin, you must create an account on the Gradle Plugin Portal. Follow the instructions on the registration page to create an account and obtain an API key from your profile page’s "API Keys" tab.

plugin portal registration page

Store your API key in your Gradle configuration (gradle.publish.key and gradle.publish.secret) or use a plugin like Seauc Credentials plugin or Gradle Credentials plugin for secure management.

plugin portal api keys

It is common practice to copy and paste the text into your $HOME/.gradle/gradle.properties file, but you can also place it in any other valid location. All the plugin requires is that the gradle.publish.key and gradle.publish.secret are available as project properties when the appropriate Plugin Portal tasks are executed.

If you are concerned about placing your credentials in gradle.properties, check out the Seauc Credentials plugin or the Gradle Credentials plugin.

Alternatively, you can provide the API key via GRADLE_PUBLISH_KEY and GRADLE_PUBLISH_SECRET environment variables. This approach might be useful for CI/CD pipelines.

Adding the Plugin Publishing Plugin

To publish your plugin, add the com.gradle.plugin-publish plugin to your project’s build.gradle or build.gradle.kts file:

build.gradle.kts
plugins {
    id("com.gradle.plugin-publish") version "1.2.1"
}
build.gradle
plugins {
    id 'com.gradle.plugin-publish' version '1.2.1'
}

The latest version of the Plugin Publishing Plugin can be found on the Gradle Plugin Portal.

Note
Since version 1.0.0 the Plugin Publish Plugin automatically applies the Java Gradle Plugin Development Plugin (assists with developing Gradle plugins) and the Maven Publish Plugin (generates plugin publication metadata). If using older versions of the Plugin Publish Plugin, these helper plugins must be applied explicitly.

Configuring the Plugin Publishing Plugin

Configure the com.gradle.plugin-publish plugin in your build.gradle or build.gradle.kts file.

build.gradle.kts
group = "io.github.johndoe" // (1)
version = "1.0" // (2)

gradlePlugin { // (3)
    website = "<substitute your project website>" // (4)
    vcsUrl = "<uri to project source repository>" // (5)

    // ... // (6)
}
build.gradle
group = 'io.github.johndoe' // (1)
version = '1.0'     // (2)

gradlePlugin { // (3)
    website = '<substitute your project website>' // (4)
    vcsUrl = '<uri to project source repository>' // (5)

    // ... // (6)
}
  1. Make sure your project has a group set which is used to identify the artifacts (jar and metadata) you publish for your plugins in the repository of the Gradle Plugin Portal and which is descriptive of the plugin author or the organization the plugins belong to.

  2. Set the version of your project, which will also be used as the version of your plugins.

  3. Use the gradlePlugin block provided by the Java Gradle Plugin Development Plugin to configure further options for your plugin publication.

  4. Set the website for your plugin’s project.

  5. Provide the source repository URI so that others can find it, if they want to contribute.

  6. Set specific properties for each plugin you want to publish; see next section.

Having defined the common properties for all plugins, such as group, version, website, and source repository, now define the properties specific to each plugin you want to publish within the same gradlePlugin{} block:

build.gradle.kts
gradlePlugin { // (1)
    // ... // (2)

    plugins { // (3)
        create("greetingsPlugin") { // (4)
            id = "<your plugin identifier>" // (5)
            displayName = "<short displayable name for plugin>" // (6)
            description = "<human-readable description of what your plugin is about>" // (7)
            tags = listOf("tags", "for", "your", "plugins") // (8)
            implementationClass = "<your plugin class>"
        }
    }
}
build.gradle
gradlePlugin { // (1)
    // ... // (2)

    plugins { // (3)
        greetingsPlugin { // (4)
            id = '<your plugin identifier>' // (5)
            displayName = '<short displayable name for plugin>' // (6)
            description = '<human-readable description of what your plugin is about>' // (7)
            tags.set(['tags', 'for', 'your', 'plugins']) // (8)
            implementationClass = '<your plugin class>'
        }
    }
}
  1. Plugin specific configuration also goes into the gradlePlugin block.

  2. This is where we previously added global properties.

  3. Each plugin you publish will have its own block inside plugins.

  4. The name of a plugin block must be unique for each plugin you publish; this is a property used only locally by your build and will not be part of the publication.

  5. Set the unique id of the plugin, as it will be identified in the publication.

  6. Set the plugin name in human-readable form.

  7. Set a description to be displayed on the portal. It provides useful information to people who want to use your plugin.

  8. Specifies the categories your plugin covers. It makes the plugin more likely to be discovered by people needing its functionality.

For example, consider the configuration for the GradleTest plugin, already published to the Gradle Plugin Portal.

build.gradle.kts
gradlePlugin {
    website = "https://github.com/ysb33r/gradleTest"
    vcsUrl = "https://github.com/ysb33r/gradleTest.git"
    plugins {
        create("gradletestPlugin") {
            id = "org.ysb33r.gradletest"
            displayName = "Plugin for compatibility testing of Gradle plugins"
            description = "A plugin that helps you test your plugin against a variety of Gradle versions"
            tags = listOf("testing", "integrationTesting", "compatibility")
            implementationClass = "org.ysb33r.gradle.gradletest.GradleTestPlugin"
        }
    }
}
build.gradle
gradlePlugin {
    website = 'https://github.com/ysb33r/gradleTest'
    vcsUrl = 'https://github.com/ysb33r/gradleTest.git'
    plugins {
        gradletestPlugin {
            id = 'org.ysb33r.gradletest'
            displayName = 'Plugin for compatibility testing of Gradle plugins'
            description = 'A plugin that helps you test your plugin against a variety of Gradle versions'
            tags.addAll('testing', 'integrationTesting', 'compatibility')
            implementationClass = 'org.ysb33r.gradle.gradletest.GradleTestPlugin'
        }
    }
}

If you browse the associated page on the Gradle Plugin Portal for the GradleTest plugin, you will see how the specified metadata is displayed.

plugin portal plugin page
Sources & Javadoc

The Plugin Publish Plugin automatically generates and publishes the Javadoc and sources JARs for your plugin publication.

Sign artifacts

Starting from version 1.0.0 of Plugin Publish Plugin, the signing of published plugin artifacts has been made automatic. To enable it, all that’s needed is to apply the signing plugin in your build.

Shadow dependencies

Starting from version 1.0.0 of Plugin Publish Plugin, shadowing your plugin’s dependencies (i.e., publishing it as a fat jar) has been made automatic. To enable it, all that’s needed is to apply the com.github.johnrengelman.shadow plugin in your build.

Publishing the plugin

If you publish your plugin internally for use within your organization, you can publish it like any other code artifact. See the Ivy and Maven chapters on publishing artifacts.

If you are interested in publishing your plugin to be used by the wider Gradle community, you can publish it to Gradle Plugin Portal. This site provides the ability to search for and gather information about plugins contributed by the Gradle community. Please refer to the corresponding section on making your plugin available on this site.

Publish locally

To check how the artifacts of your published plugin look or to use it only locally or internally in your company, you can publish it to any Maven repository, including a local folder. You only need to configure repositories for publishing. Then, you can run the publish task to publish your plugin to all repositories you have defined (but not the Gradle Plugin Portal).

build.gradle.kts
publishing {
    repositories {
        maven {
            name = "localPluginRepository"
            url = uri("../local-plugin-repository")
        }
    }
}
build.gradle
publishing {
    repositories {
        maven {
            name = 'localPluginRepository'
            url = '../local-plugin-repository'
        }
    }
}

To use the repository in another build, add it to the repositories of the pluginManagement {} block in your settings.gradle(.kts) file.

Publish to the Plugin Portal

Publish the plugin by using the publishPlugins task:

$ ./gradlew publishPlugins

You can validate your plugins before publishing using the --validate-only flag:

$ ./gradlew publishPlugins --validate-only

If you have not configured your gradle.properties for the Gradle Plugin Portal, you can specify them on the command-line:

$ ./gradlew publishPlugins -Pgradle.publish.key=<key> -Pgradle.publish.secret=<secret>
Note
You will encounter a permission failure if you attempt to publish the example Greeting Plugin with the ID used in this section. That’s expected and ensures the portal won’t be overrun with multiple experimental and duplicate greeting-type plugins.

After approval, your plugin will be available on the Gradle Plugin Portal for others to discover and use.

Consume the published plugin

Once you successfully publish a plugin, it won’t immediately appear on the Portal. It also needs to pass an approval process, which is manual and relatively slow for the initial version of your plugin, but is fully automatic for subsequent versions. For further details, see here.

Once your plugin is approved, you can find instructions for its use at a URL of the form https://plugins.gradle.org/plugin/<your-plugin-id>. For example, the Greeting Plugin example is already on the portal at https://plugins.gradle.org/plugin/org.example.greeting.

Plugins published without Gradle Plugin Portal

If your plugin was published without using the Java Gradle Plugin Development Plugin, the publication will be lacking the Plugin Marker Artifact, which is needed for the plugins DSL to locate the plugin. In this case, the recommended way to resolve the plugin in another project is to add a resolutionStrategy section to the pluginManagement {} block of the project’s settings file, as shown below.

settings.gradle.kts
pluginManagement {
    resolutionStrategy {
        eachPlugin {
            if (requested.id.namespace == "org.example") {
                useModule("org.example:custom-plugin:${requested.version}")
            }
        }
    }
}
settings.gradle
pluginManagement {
    resolutionStrategy {
        eachPlugin {
            if (requested.id.namespace == 'org.example') {
                useModule("org.example:custom-plugin:${requested.version}")
            }
        }
    }
}

OTHER TOPICS

Gradle-managed Directories

Gradle uses two main directories to perform and manage its work: the Gradle User Home directory and the Project Root directory.

author gradle 2

Gradle User Home directory

By default, the Gradle User Home (~/.gradle or C:\Users\<USERNAME>\.gradle) stores global configuration properties, initialization scripts, caches, and log files.

It can be set with the environment variable GRADLE_USER_HOME.

Tip
Not to be confused with GRADLE_HOME, the optional installation directory for Gradle.

It is roughly structured as follows:

├── caches // (1)
│   ├── 4.8 // (2)
│   ├── 4.9 // (2)
│   ├── ⋮
│   ├── jars-3 // (3)
│   └── modules-2 // (3)
├── daemon // (4)
│   ├── ⋮
│   ├── 4.8
│   └── 4.9
├── init.d // (5)
│   └── my-setup.gradle
├── jdks // (6)
│   ├── ⋮
│   └── jdk-14.0.2+12
├── wrapper
│   └── dists // (7)
│       ├── ⋮
│       ├── gradle-4.8-bin
│       ├── gradle-4.9-all
│       └── gradle-4.9-bin
└── gradle.properties // (8)
  1. Global cache directory (for everything that is not project-specific).

  2. Version-specific caches (e.g., to support incremental builds).

  3. Shared caches (e.g., for artifacts of dependencies).

  4. Registry and logs of the Gradle Daemon.

  5. Global initialization scripts.

  6. JDKs downloaded by the toolchain support.

  7. Distributions downloaded by the Gradle Wrapper.

  8. Global Gradle configuration properties.

Cleanup of caches and distributions

Gradle automatically cleans its user home directory.

By default, the cleanup runs in the background when the Gradle daemon is stopped or shut down.

If using --no-daemon, it runs in the foreground after the build session.

The following cleanup strategies are applied periodically (by default, once every 24 hours):

  • Version-specific caches in all caches/<GRADLE_VERSION>/ directories are checked for whether they are still in use.

    If not, directories for release versions are deleted after 30 days of inactivity, and snapshot versions after 7 days.

  • Shared caches in caches/ (e.g., jars-*) are checked for whether they are still in use.

    If no Gradle version still uses them, they are deleted.

  • Files in shared caches used by the current Gradle version in caches/ (e.g., jars-3 or modules-2) are checked for when they were last accessed.

    Depending on whether the file can be recreated locally or downloaded from a remote repository, it will be deleted after 7 or 30 days, respectively.

  • Gradle distributions in wrapper/dists/ are checked for whether they are still in use, i.e., whether there’s a corresponding version-specific cache directory.

    Unused distributions are deleted.

Configuring cleanup of caches and distributions

The retention periods of the various caches can be configured.

Caches are classified into five categories:

  1. Released wrapper distributions: Distributions and related version-specific caches corresponding to released versions (e.g., 4.6.2 or 8.0).

    Default retention for unused versions is 30 days.

  2. Snapshot wrapper distributions: Distributions and related version-specific caches corresponding to snapshot versions (e.g. 7.6-20221130141522+0000).

    Default retention for unused versions is 7 days.

  3. Downloaded resources: Shared caches downloaded from a remote repository (e.g., cached dependencies).

    Default retention for unused resources is 30 days.

  4. Created resources: Shared caches that Gradle creates during a build (e.g., artifact transforms).

    Default retention for unused resources is 7 days.

  5. Build cache: The local build cache (e.g., build-cache-1).

    Default retention for unused build-cache entries is 7 days.

The retention period for each category can be configured independently via an