
The Chromium Projects

Deterministic builds


Make Chromium's build process deterministic. Tracking issue:

Handling failures on the deterministic bots



Improve cycle time and reduce infrastructure utilization by deduplicating redundant tasks.

Sub goals

  1. Reduce I/O load on the Isolate Server by having deterministic binaries.
    • Binaries do not need to be archived if they are already in the Isolate Server cache; if the build is deterministic, this happens frequently.
  2. Take advantage of Swarming's native task deduplication by skipping redundant test executions.
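Sub-goal 1 can be sketched as follows. This is a hypothetical in-memory stand-in for the Isolate Server's content-addressed cache, not its actual API; it only illustrates why a deterministic rebuild, which produces byte-identical binaries, turns the archive step into a no-op.

```python
import hashlib

# Hypothetical stand-in for the Isolate Server's content-addressed cache.
cache = set()

def upload_if_missing(binary: bytes) -> bool:
    """Upload a binary only if its digest is not already cached.

    Returns True if an upload happened, False if it was deduplicated.
    """
    digest = hashlib.sha1(binary).hexdigest()
    if digest in cache:
        return False  # Deterministic rebuild produced the same bytes: skip the I/O.
    cache.add(digest)
    return True

# A deterministic rebuild yields identical bytes, so the second call is skipped.
assert upload_if_missing(b"\x7fELF...bar_test") is True
assert upload_if_missing(b"\x7fELF...bar_test") is False
```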

So it's actually a 2x multiplier for each target that becomes deterministic and runs on Swarming. The benefits are both monetary (less hardware is required) and developer time (lower latency from having less work to do on the TS and CI).

We estimate we'd save over 20% of the current testing load, resulting in faster test cycles on the Try Server (TS) and the Continuous Integration (CI) infrastructure. Swarming already dedupes 1–7% of task runtime simply due to incremental builds.


Test isolation is an ongoing effort to generate an exact list of the files needed by a given unit test at runtime. It enables 3 benefits:

Tracking issue:

Swarming is the task distributor that leverages test isolation to run tests concurrently, reducing the latency of getting test results.

Most projects do not have deterministic builds, and Chromium is no exception; a deterministic build does not happen naturally and has to be engineered. Swarming knows the relationship between an isolated test and its result when run on a bot with specific features. The specific features are determined by the request: for example, the request may specify the bot's OS version, bitness, and metadata like the video card.
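The relationship Swarming tracks can be sketched as a cache keyed on the isolated test's digest plus the requested bot dimensions. This is a simplified illustration, not Swarming's actual implementation; the dimension names are examples drawn from the text above.

```python
# Sketch: Swarming-style result deduplication, keyed on the isolated test's
# digest plus the requested bot dimensions (OS version, bitness, etc.).
results = {}

def run_task(isolated_sha1, dimensions, execute):
    key = (isolated_sha1, frozenset(dimensions.items()))
    if key in results:
        return results[key]  # Same test, same bot features: reuse the known result.
    outcome = execute()      # Otherwise actually run the task on a bot.
    results[key] = outcome
    return outcome

first = run_task("abc123", {"os": "Windows-10", "bitness": "64"}, lambda: "PASS")
# An identical request is served from the cache; the task never runs again.
second = run_task("abc123", {"os": "Windows-10", "bitness": "64"}, lambda: "NEVER RUN")
assert first == second == "PASS"
```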

Google internally uses many tricks to achieve similar performance improvements at extremely high cache hit rates: [link]

Building the whole action graph would be wasteful (...), so we will skip executing an action unless one or more of its input files change compared to the previous build. In order to do that we keep track of the content digest of each input file whenever we execute an action. As we mentioned in the previous blog post, we keep track of the content digest of source files and we use the same content digest to track changes to files.
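The digest-based skipping described in the quote above can be sketched as follows. This is a minimal illustration of the idea, not Google's build system: an action re-runs only when the content digest of one of its inputs changed since the previous build.

```python
import hashlib

# Sketch of digest-based incremental builds: re-run an action only when the
# content digest of at least one input changed since the last build.
last_digests = {}  # action name -> {input name: digest}

def maybe_run(action, inputs, run):
    digests = {name: hashlib.sha256(data).hexdigest()
               for name, data in inputs.items()}
    if last_digests.get(action) == digests:
        return "skipped"  # No input changed: skip executing the action.
    run()
    last_digests[action] = digests
    return "ran"

assert maybe_run("compile foo", {"foo.cc": b"int main(){}"}, lambda: None) == "ran"
# Same inputs, same digests: the second build skips the action entirely.
assert maybe_run("compile foo", {"foo.cc": b"int main(){}"}, lambda: None) == "skipped"
```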

Non Goals

Making the build deterministic is not a goal in these conditions:

Testing plan

Enforced by bots on the public waterfall on Android, Linux, and Windows; FYI bots on Mac. See also


Documented at

OS Specific Challenges

Each toolset is non-deterministic in different ways, so the work has to be redone on each platform.
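A simple way to catch platform-specific non-determinism, and roughly what the deterministic bots below verify, is to build twice from identical inputs and diff the output digests. This is a hedged sketch; `build` is a hypothetical callable standing in for a real toolchain invocation.

```python
import hashlib
import itertools

def check_determinism(build):
    """Build twice and compare output digests; return names of unstable outputs.

    `build` is a hypothetical callable returning {output name: bytes}.
    """
    first = {n: hashlib.sha256(d).hexdigest() for n, d in build().items()}
    second = {n: hashlib.sha256(d).hexdigest() for n, d in build().items()}
    return sorted(n for n in first if first[n] != second[n])

# A toolchain that embeds something volatile (e.g. a timestamp) is caught at once.
tick = itertools.count()
flaky_build = lambda: {"bar_test.exe": b"code",
                       "foo.obj": str(next(tick)).encode()}
assert check_determinism(flaky_build) == ["foo.obj"]
```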


Tracking issue:

Builder: deterministic


Tracking issue:

Builder: deterministic


Tracking issue:

Builders: Linux, Linux (dbg)


Tracking issue:

Builders: Android, Android (dbg)


Tracking issue:

Builder: deterministic build

Example workflow on Windows

# Make a whitespace-only change to force a recompilation.
echo "" >>
# The compiler recreates the same foo_main.obj as before, since the code didn't change.
compile -> foo_main.obj
# This step could be saved by a content-addressed build system; see "Extension of the project" below.
link foo_main.obj -> bar_test.exe
# The binary didn't change, so it is not uploaded again, saving both I/O and latency.
isolate bar_test.exe -> isolateserver
# Swarming immediately returns the results of the last green build, saving both utilization and latency.
run <sha1 of bar_test.isolated>

Extension of the project

Getting the build system to be content-addressed, as described in the Google reference above, is out of scope for this project but is a natural extension. It saves work in the build process itself, at the cost of calculating hashes for each intermediate file. Since it builds directly on the build being deterministic, it is not worth doing before determinism is achieved and enforced; once it is, it would significantly reduce build time on the build machines.
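The extension above can be sketched as a build step whose cache key is the digest of the command plus the digests of all its inputs. This is a minimal illustration under the stated assumption that the build is already deterministic; the names and the `cached_step` helper are hypothetical.

```python
import hashlib

# Hypothetical sketch of a content-addressed build step: the cache key is the
# digest of the command plus the digests of all input files, so a deterministic
# build never re-runs work (e.g. a link) it has already done.
cache = {}

def cached_step(command, inputs, run):
    h = hashlib.sha256(command.encode())
    for name in sorted(inputs):  # Sort for a stable, order-independent key.
        h.update(name.encode())
        h.update(inputs[name])
    key = h.hexdigest()
    if key not in cache:
        cache[key] = run()  # Cache miss: execute the step and store its output.
    return cache[key]

out1 = cached_step("link", {"foo_main.obj": b"\x01\x02"}, lambda: b"bar_test.exe bytes")
# Identical command and inputs: the second link is served from the cache.
out2 = cached_step("link", {"foo_main.obj": b"\x01\x02"}, lambda: b"NEVER RUN")
assert out1 == out2 == b"bar_test.exe bytes"
```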