In large repositories, it's common for pipeline executions to take more than an hour. A pipeline execution refers to the automated sequence of steps—such as building, testing, and deploying code—that runs whenever changes are made to the codebase, typically triggered by a pull request. In many cases, all modules are built and tested sequentially, which leads to long feedback loops for the development team.
In this article, we’ll explore how to reduce test pipeline duration without sacrificing test coverage or quality.
So, first of all: does this strategy work in every context?
The short answer is no. The approach we’ll explore is particularly well-suited for projects that are organized into modules (or any structure that can be logically segmented). In this context, we can split test execution by module, which makes parallelization straightforward and effective.
Note: You can achieve parallel test execution even in non-modularized applications, but that’s beyond the scope of this article.
Let’s start with the required plugins. If you’re working on a Maven project, you’re likely already using the following:
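If they aren't already declared in your `pom.xml`, a minimal declaration of both plugins looks like this (the versions shown are recent examples, not requirements — pin whatever your project already uses):

```xml
<build>
  <plugins>
    <!-- Collects code coverage data during test execution -->
    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <version>0.8.12</version>
    </plugin>
    <!-- Runs the unit tests, optionally in a forked JVM -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>3.2.5</version>
    </plugin>
  </plugins>
</build>
```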
The JaCoCo plugin helps configure the agent required for collecting code coverage data during test execution. Meanwhile, the Surefire plugin handles the actual execution of tests and can launch a new JVM instance, allowing additional configuration through the argLine parameter.
So, for the current example, you will need to add the following configurations to your JaCoCo plugin:
Let’s break down what each part of this configuration does:
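As a minimal sketch of a `prepare-agent` setup that supports this workflow (the property name `jacocoArgLine` and the paths are assumptions for illustration):

```xml
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.12</version>
  <executions>
    <execution>
      <id>prepare-agent</id>
      <goals>
        <!-- Computes the -javaagent JVM argument and exposes it
             through the Maven property named below -->
        <goal>prepare-agent</goal>
      </goals>
      <configuration>
        <!-- Property that will hold the agent's JVM argument;
             Surefire picks it up through its argLine -->
        <propertyName>jacocoArgLine</propertyName>
        <!-- Where the coverage data (.exec) is written; can be
             overridden per CI step with -Djacoco.destFile=... -->
        <destFile>${project.build.directory}/jacoco.exec</destFile>
      </configuration>
    </execution>
  </executions>
</plugin>
```

In short: `prepare-agent` computes the `-javaagent` argument, `propertyName` controls which Maven property it is stored in, and `destFile` controls where the `.exec` coverage file ends up.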
Once this is configured, you’ll need to pass that property to the Surefire plugin so it can include the JaCoCo agent in the JVM when tests are executed.
Now let’s look at the Surefire plugin setup:
Here’s a breakdown of the key configurations:
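Again as a sketch rather than a definitive setup, a Surefire configuration that consumes the JaCoCo property could look like this (assuming the property was named `jacocoArgLine` in the JaCoCo plugin configuration):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.2.5</version>
  <configuration>
    <!-- Fork a fresh JVM for the test run -->
    <forkCount>1</forkCount>
    <reuseForks>true</reuseForks>
    <!-- @{...} is Surefire's late-evaluation syntax: the JaCoCo agent
         argument is resolved at fork time, not at POM interpolation time -->
    <argLine>@{jacocoArgLine} -Xmx1024m</argLine>
  </configuration>
</plugin>
```

The `@{jacocoArgLine}` reference is the piece that actually attaches the JaCoCo agent to the forked JVM; without it, the tests still run but no coverage data is recorded.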
With these two plugins correctly configured, you’re now ready to move on to executing tests efficiently.
Now, let's look at the commands we're going to use and how you can adapt them to your own use case.
Here’s how to execute tests selectively using Maven, which makes test execution more granular and parallelizable:
Now, let's break down how this command works:
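As an illustrative example (the module and test class names are hypothetical), a selective run for a single module could look like this:

```shell
# Run only two test classes of the payments-service module,
# building that module and its in-repo dependencies first (-am)
mvn test -pl payments-service -am \
  -Dtest="PaymentServiceTest,RefundServiceTest" \
  -DfailIfNoTests=false \
  -Djacoco.destFile=target/jacoco-payments.exec
```

Here `-pl` restricts the Maven reactor to one module, `-am` also builds the modules it depends on, `-Dtest` narrows execution to specific test classes, `-DfailIfNoTests=false` prevents the build from failing when the filter matches nothing in a module, and `-Djacoco.destFile` (the user property behind the plugin's `destFile` parameter) gives each parallel step its own coverage file.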
After executing this command, you should see the .exec file generated in the directory you set with destFile, along with the standard Surefire test reports. Keep in mind that test classes referenced in -Dtest are resolved from the compiled class files in target/test-classes, not from the .java files.
Most CI/CD tools like Bitbucket Pipelines, CircleCI, TravisCI, and SemaphoreCI allow you to define parallel steps. This is where you can take full advantage of the configuration we just built.
An ideal execution flow might look like this:
1. In the first step, you should run basic validations (e.g., linting, formatting, security checks).
2. Let your CI tool clone the repository.
3. Depending on whether your CI tool can share the build output with subsequent steps, you might need to recompile modules in each parallel step. Building only the selected module and its dependencies saves time, as long as your modules are not heavily interdependent.
4. Check your CI tool's documentation for how to define and run each of these steps.
5. Once your test steps have run in parallel and generated individual .exec files, you’ll want to merge them to create a unified coverage report.
Note: The ${class_and_source_files_arg_array[@]} parameter is essential. It must include all compiled classes and source files across all modules so that JaCoCo can accurately generate the coverage report.
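One way to put step 5 together is with the JaCoCo command-line interface (`jacococli.jar`); the module names and paths below are placeholders for your own layout:

```shell
# Merge the per-step .exec files into a single data file
java -jar jacococli.jar merge \
  module-a/target/jacoco.exec \
  module-b/target/jacoco.exec \
  --destfile merged.exec

# Build the --classfiles/--sourcefiles arguments for every module
class_and_source_files_arg_array=()
for module in module-a module-b; do
  class_and_source_files_arg_array+=(--classfiles "$module/target/classes")
  class_and_source_files_arg_array+=(--sourcefiles "$module/src/main/java")
done

# Generate one unified HTML report from the merged data
java -jar jacococli.jar report merged.exec \
  "${class_and_source_files_arg_array[@]}" \
  --html coverage-report
```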
This is a powerful and cost-effective way to improve pipeline execution time in CI environments, especially for large multi-module Maven projects. However, like any optimization, it’s essential to balance performance with resource usage. Start small, and monitor your CI environment’s capacity and behavior as you scale up parallelism. Consider grouping smaller or less active modules into batches to avoid spinning up unnecessary steps—particularly when initial scripts for each module are time-consuming.
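For the batching suggestion above, no extra tooling is needed: Maven's `-pl` flag accepts a comma-separated list of modules, so a single CI step can cover several small ones (the module names here are hypothetical):

```shell
# One CI step covering two low-activity modules instead of two steps
mvn test -pl reporting-service,audit-service -am
```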
Ultimately, finding the right balance between speed, cost, and maintainability is what will deliver the most value to your team.