Cache deployment strategy

The main reason to use an MSR cache is so that users can pull images from a service that is geographically closer to them.

For example, a company has developers spread across three locations: the United States, Asia, and Europe. Developers working in the US office can pull their images from MSR without problems, but developers in the Asia and Europe offices complain that it takes them a long time to pull images.

To address this, you can deploy MSR caches in the Asia and Europe offices so that developers working there can pull images much faster.

Deployment overview

To deploy the MSR caches for the example scenario, you need three datacenters:

  • The US datacenter runs MSR configured for high availability.

  • The Asia datacenter runs an MSR cache.

  • The Europe datacenter runs another MSR cache.

Both caches are configured to fetch images from MSR.
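
As a rough sketch of what that configuration involves, the cache reads a Docker Registry-style YAML file that lists the upstream MSR to fetch from. The hostnames, certificate paths, TTL value, and field names below are illustrative; consult the cache configuration reference for your MSR version before relying on any of them.

```bash
# Sketch of a cache configuration file (field names and values are illustrative;
# verify them against the MSR cache configuration reference for your version).
cat > config.yml <<'EOF'
version: 0.1
storage:
  filesystem:
    rootdirectory: /var/lib/registry   # local storage for cached image layers
http:
  addr: 0.0.0.0:443
  tls:
    certificate: /certs/cache.cert     # TLS certificate presented to developers
    key: /certs/cache.key
middleware:
  registry:
    - name: downstream                 # fetch content missing locally from MSR
      options:
        blobttl: 24h                   # how long cached blobs are kept before re-fetching
        upstreams:
          - https://msr.example.com    # the MSR deployment in the US datacenter
        cas:
          - /certs/msr.cert            # certificate used to trust the upstream MSR
EOF
```

The Asia and Europe caches would use the same file, differing only in their own hostnames and TLS certificates.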

System requirements

Before deploying an MSR cache in a datacenter, make sure you:

  • Provision multiple nodes and install Docker on them.

  • Join the nodes into a Swarm.

  • Have one or more dedicated worker nodes just for running the MSR cache (see the sketch after this list).

  • Have TLS certificates to use for securing the cache.

  • Have a shared storage system, if you want the cache to be highly available.
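
Most of this preparation is ordinary Docker Engine and swarm administration. The commands below are a sketch of it for one cache datacenter; the node names and the msr.cache label are illustrative, not anything MSR requires.

```bash
# On the first node of the cache datacenter: create a swarm.
docker swarm init --advertise-addr <node-ip>

# On every additional node: join the swarm using the token printed by 'swarm init'.
docker swarm join --token <worker-token> <manager-ip>:2377

# On a manager: label the worker dedicated to the cache, so the cache service
# can later be constrained to run only there (label name is illustrative).
docker node update --label-add msr.cache=true <worker-node-name>
```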

Ports used

You can customize the port the MSR cache uses, so configure your firewall rules to make sure users can reach the cache on the port you chose.

By default, this documentation guides you through deploying caches exposed on port 443/TCP using the swarm routing mesh.
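
As an illustration of that default, a cache service can be published on 443/TCP through the routing mesh roughly as follows. The service name, image reference, constraint label, and mounted paths are placeholders; follow the deployment guide for your MSR version for the exact command.

```bash
# Sketch: publish the cache on port 443/TCP via the swarm routing mesh.
# Image name, constraint label, and mount paths are placeholders.
docker service create \
  --name msr-cache \
  --constraint 'node.labels.msr.cache == true' \
  --publish mode=ingress,published=443,target=443 \
  --mount type=bind,source=/etc/msr-cache,target=/config \
  --mount type=bind,source=/etc/msr-cache/certs,target=/certs \
  <msr-cache-image> /config/config.yml
```

If you change the published port, update your firewall rules to match.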