The appeal of Bazel for most adopters is its ability to memoize much of the build graph when nothing has changed. With remote caches, build results can also be shared across machines, making memoization even more effective.
This was a compelling reason to adopt Bazel, but many users soon noticed, especially on their CI systems, large amounts of unnecessary data transfer for larger codebases.
😲 If the network is poor, the benefits of remote caching (memoization) can be outweighed by the cost of downloading the artifacts.
By default, Bazel downloads every output file of every executed (or cached) action to the host machine. The total size of all output files in a build can be extremely large, especially if you are building OCI images.
Large repositories can produce more than 1 GiB of output files, and on a constrained network it may well be more cost-effective to rebuild them locally than to download them.
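As an illustration, Bazel's "Build without the Bytes" flags can limit which outputs are fetched from the remote cache. A minimal `.bazelrc` sketch (the cache endpoint here is a placeholder, not a real service):

```
# Use a remote cache (hypothetical endpoint).
build --remote_cache=grpcs://cache.example.com

# Download only the outputs of top-level targets,
# not every intermediate file:
build --remote_download_toplevel

# Or download almost nothing, fetching outputs on demand:
# build --remote_download_minimal
```

With `--remote_download_toplevel` or `--remote_download_minimal`, intermediate artifacts stay in the remote cache and only their metadata reaches the host, which can dramatically reduce transfer on slow networks.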