To reduce the size of the Chromium source checkout, Telemetry stores binaries in Cloud Storage. These include:
- Recordings of webpages, known as "page set archives" or "WPR archives."
- Support binaries, such as device/host_forwarder, ipfw, minidump_stackwalk, and crash_service.
- credentials.json, which is kept in Cloud Storage so that access to it can be controlled.
Many benchmarks require these files to run, and will fail without them.
Set Up Cloud Storage
Follow the depot_tools set-up instructions to install depot_tools.
Authenticate into Cloud Storage
Some files in Cloud Storage include data internal to Google or its partners. To run benchmarks that rely on this data, you need to authenticate.
If you're on Linux on a Google-internal network, prodaccess will authenticate you. Otherwise, run the command below and follow the instructions to authenticate with your corporate account.
$ depot_tools/third_party/gsutil/gsutil config
When prompted with “What is your project-id?”, just enter
Upload to Cloud Storage
Upload your files to the bucket “chromium-telemetry”. Put the target file at the path you want it to occupy when downloaded from Cloud Storage, say path/to/target, then upload it with:
$ depot_tools/upload_to_google_storage.py --bucket chromium-telemetry path/to/target
A SHA1 file, path/to/target.sha1, will be generated for each uploaded file.
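The .sha1 file contains only the hex SHA-1 digest of the uploaded file; Cloud Storage stores the file contents under that hash. You can reproduce the digest locally with sha1sum (the file name target here is just an example):

```shell
# Create a sample file and compute the digest that
# upload_to_google_storage.py writes into the .sha1 stub.
printf 'hello\n' > target
sha1sum target | cut -d' ' -f1 > target.sha1
cat target.sha1
# → f572d396fae9206628714fb2ce00f72e94f2258f
```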
Check the .sha1 files into the repository
$ git add path/to/target.sha1
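Only the .sha1 stub is committed; the binary itself stays out of the repository. A minimal illustration in a scratch repository (the paths and ignore rule here are hypothetical, not Chromium's actual configuration):

```shell
# In a scratch repo: stage only the .sha1 stub, not the binary.
git init -q demo && cd demo
mkdir -p path/to
printf 'binary-bytes' > path/to/target
sha1sum path/to/target | cut -d' ' -f1 > path/to/target.sha1
echo 'path/to/target' > .gitignore     # keep the binary untracked
git add path/to/target.sha1 .gitignore
git status --short
```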
Download the file in Python
from telemetry.page import cloud_storage
cloud_storage.GetIfChanged('path/to/target')  # fetches only if the local copy's SHA-1 differs; some Telemetry revisions also take a bucket argument
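Under the hood, the download step compares the checked-in hash against the local file and fetches from the bucket only on a mismatch. A minimal sketch of that check (a hypothetical helper, not Telemetry's actual implementation):

```python
import hashlib
import os

def needs_download(target_path):
    """Return True if the local file is absent or its SHA-1 digest
    differs from the one recorded in the checked-in .sha1 file."""
    with open(target_path + '.sha1') as f:
        expected = f.read().strip()
    if not os.path.exists(target_path):
        return True
    with open(target_path, 'rb') as f:
        actual = hashlib.sha1(f.read()).hexdigest()
    return actual != expected
```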