This guide introduces the caching capabilities of Edgio. While most CDNs only cache content on your asset URLs, Edgio caches content on your page URLs using EdgeJS, allowing you to control caching within your application code.

Environments and Caching

To begin caching responses, you need to create an environment. Each environment provides a separate edge cache for the most recent deployment. Older deployments will no longer have edge caching, but can always be redeployed to re-enable caching.

L1 and L2 Caches

Each edge point-of-presence (POP) has its own L1 cache. If a request cannot be fulfilled from the L1 cache, Edgio will attempt to fulfill the request from a single global L2 cache POP in order to maximize your effective cache hit ratio. There is very little difference in time to first byte (TTFB) for responses served from the L1 vs L2 cache. In either case, the response is served nearly instantly (typically 25-100ms). Concurrent requests for the same URL on different POPs that result in a cache miss will be coalesced at the L2 cache. This means that only one request at a time will be sent to your origin servers for each cacheable URL.
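The coalescing idea can be sketched as follows (an illustration of the concept, not Edgio's implementation): concurrent cache misses for the same URL share a single in-flight origin request.

```javascript
// Illustration of request coalescing: concurrent cache misses for the
// same URL share one in-flight origin request instead of each hitting origin.
const inFlight = new Map()
let originRequests = 0

async function fetchFromOrigin(url) {
  originRequests++ // count how many requests actually reach origin
  return `response for ${url}`
}

function coalescedFetch(url) {
  if (!inFlight.has(url)) {
    const promise = fetchFromOrigin(url).finally(() => inFlight.delete(url))
    inFlight.set(url, promise)
  }
  return inFlight.get(url)
}

async function demo() {
  // Three concurrent requests for the same URL...
  const results = await Promise.all([
    coalescedFetch('/products/1'),
    coalescedFetch('/products/1'),
    coalescedFetch('/products/1'),
  ])
  return { results, originRequests } // ...produce only one origin request
}
```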

Caching a Response

To cache a response, use the cache function in your route’s callback:
import { CustomCacheKey } from '@layer0/core/router'

router.get('/some/path', ({ cache }) => {
  cache({
    browser: {
      // Sets the cache-control: max-age=n header sent to the browser. To prevent the browser from caching this route,
      // set maxAgeSeconds: 0.
      maxAgeSeconds: 0,

      // Sends a non-standard header `x-sw-cache-control: n` that you can use to control caching in your service worker.
      // Note that service workers do not understand this header by default, so you would need to add code to your
      // service worker to support it.
      serviceWorkerSeconds: 60 * 60,
    },
    edge: {
      // Sets the TTL for a response in Edgio's edge cache
      maxAgeSeconds: 60 * 60 * 24,

      // Sets the amount of time a stale response may be served from the cache. When a stale response is sent, Edgio
      // simultaneously fetches a new response to serve subsequent requests.
      // Using stale-while-revalidate helps raise your effective cache hit rate to near 100%.
      staleWhileRevalidateSeconds: 60 * 60, // serve stale responses for up to 1 hour while fetching a new response

      // ... and many other options
    },
    // Optionally customizes the cache key for both edge and browser
    key: new CustomCacheKey()
      .addBrowser() // Split cache by browser type
      .addCookie('some-cookie'), // Split cache by the some-cookie cookie
  })
})
The cache function can be used in the same route as other functions such as serveStatic, proxy and render, or in a separate route prior to sending the response.
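For example, a sketch combining cache with proxy in one route (assuming a backend named origin is configured for your site):

```javascript
import { Router } from '@layer0/core/router'

new Router().get('/some/path', ({ cache, proxy }) => {
  // Set the caching policy, then send the response by proxying the "origin" backend
  cache({
    edge: {
      maxAgeSeconds: 60 * 60,
    },
  })
  proxy('origin')
})
```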

Cache Key

Edgio provides a default cache key out of the box. It is a broad cache key that ensures general correctness, and it can be further customized. The default cache key consists of:
  • Value of host request header
  • Complete request URL, including the query parameters (this can be customized)
  • Value of accept-encoding request header
  • Name of the destination when split testing is in effect
When caching is enabled for POST and other non-GET/HEAD methods, Edgio automatically adds the following to the cache key:
  • Request HTTP method
  • Request body
To ensure that your site is resilient to cache poisoning attacks, every request header that influences the rendering of the content must be included in your custom cache key.
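For instance, if a request header influences the rendered content, adding it to the key keeps one variant from overwriting another. A hedged sketch, assuming CustomCacheKey exposes an addHeader method and using accept-language purely as an example:

```javascript
import { CustomCacheKey } from '@layer0/core/router'

router.get('/some/path', ({ cache }) => {
  cache({
    // Other options...
    // Include a header that affects rendering so one variant cannot
    // poison the cache for users sending a different value.
    key: new CustomCacheKey().addHeader('accept-language'),
  })
})
```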

Customizing the Cache Key

It is often useful to customize the cache key, either to improve the cache hit ratio or to account for complexities of your site. As seen above, Edgio provides an easy way to customize the keys by using the CustomCacheKey class. Here we will focus on three common examples:
  • Increasing the cache hit ratio by excluding all query parameters except those specified from the cache key. Only the specified parameters can then fragment the cache (so you would add things like page, number per page, filters, product variants, etc.):
import { CustomCacheKey } from '@layer0/core/router'

router.get('/some/path', ({ cache }) => {
  cache({
    // Other options...
    key: new CustomCacheKey().excludeAllQueryParametersExcept('whitelisted-param-1', 'whitelisted-param-2'),
  })
})
We recommend this method over excludeQueryParameters, because it is difficult to know every query parameter your application might receive, and unexpected parameters can significantly lower your cache hit rate. excludeQueryParameters instead stops only the listed query parameters from fragmenting the cache (so you would add things like utm_medium, gclid, or other marketing parameters that you know do not alter the content on the page):
import { CustomCacheKey } from '@layer0/core/router'

router.get('/some/path', ({ cache }) => {
  cache({
    // Other options...
    key: new CustomCacheKey().excludeQueryParameters('to-be-excluded-1', 'to-be-excluded-2'),
  })
})
This removes the given query parameters from the URL before it is used in the cache key. On a cache miss, the transformed URL is passed to your code, with the original query string available in the x-0-original-qs request header.
  • Including other request parameters like cookies:
import { CustomCacheKey } from '@layer0/core/router'

router.get('/some/path', ({ cache }) => {
  cache({
    key: new CustomCacheKey().addCookie('language').addCookie('currency'),
    // Other options...
  })
})
This takes the values of the language and currency cookies from the cookie request header and adds them to the cache key, allowing you to cache different content for the same URL depending on the language and currency values.
  • Splitting the cache based on device type:
import { CustomCacheKey } from '@layer0/core/router'

router.get('/some/path', ({ cache }) => {
  cache({
    key: new CustomCacheKey().addDevice(),
    // Other options...
  })
})
This will take the value of the x-0-device request header and split based on the following devices:
  • smartphone
  • tablet
  • mobile (feature phones)
  • desktop
This allows you to cache different content for the same URL depending on the type of device.
Customizing cache keys is a powerful tool for making your site faster. At the same time, it is easy to apply too broadly, lowering your cache hit ratio and hurting performance. The key to using cache key customization correctly is to apply it judiciously and narrowly, for specific routes.

Caching Responses for POST and other non-GET/HEAD Requests

By default, Edgio only caches responses for GET and HEAD requests; caching for POST and other methods must be explicitly enabled. Some APIs, particularly those implemented with GraphQL, use POST requests by default, with queries sent in the request body. See Prefetching - GraphQL for more information on caching GraphQL with Edgio.

Caching Private Responses

By default, Edgio never caches responses that include the private directive in their cache-control header. Sometimes, though, it is desirable to cache such responses even though they are intended for a single user of your site:
router.get('/some/path', ({ cache }) => {
  cache({
    // Other options...
    edge: {
      // Other options...
      forcePrivateCaching: true, // Force caching of `private` responses
    },
  })
})
Note that this feature cannot be safely combined with caching of POST and similar requests. If private is your only signal that a response must not be cached, then forcing the caching of private responses means every response will be cached.

Achieving 100% Cache Hit Rates

The key to truly successful cache hit rates is leveraging staleWhileRevalidate in conjunction with maxAge. A detailed article from web.dev covers this concept in more depth. The main points to know are:
  • maxAge defines the hard cache limit: an asset will be served from the cache for this amount of time regardless.
  • staleWhileRevalidate defines an additional window past maxAge during which cached content is still returned to the client, while a network request is issued to origin in the background to check for new content.
  • Once maxAge + staleWhileRevalidate is exceeded, a blocking network request to origin is made.
Set these values using the edge key in your cache call:
edge: {
  // 24 hours
  maxAgeSeconds: 60 * 60 * 24,
  // serve stale responses for up to 1 hour while fetching a new response
  staleWhileRevalidateSeconds: 60 * 60,
},
With the following header set, the diagram below shows the age of the previously cached response at the time of the next request
1Cache-Control: max-age=1, stale-while-revalidate=59
[Diagram: response age relative to the maxAge and staleWhileRevalidate windows]
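The max-age=1, stale-while-revalidate=59 behavior can be sketched as a small classifier of a cached response's age (an illustration of the semantics, not Edgio's implementation):

```javascript
// Classify a cached response by its age in seconds, given
// Cache-Control: max-age=1, stale-while-revalidate=59
function cacheState(ageSeconds, maxAge = 1, staleWhileRevalidate = 59) {
  if (ageSeconds <= maxAge) {
    return 'fresh' // served from cache; no origin request
  }
  if (ageSeconds <= maxAge + staleWhileRevalidate) {
    return 'stale-while-revalidate' // served from cache; origin fetched in background
  }
  return 'expired' // a blocking origin request is made
}
```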

Preventing a Response from being Cached

By default, Edgio will cache responses that satisfy all of the following conditions:
  1. The response must correspond to a GET or HEAD request. To override this, see the POST and other non-GET/HEAD section.
  2. The response must have a status code of 1xx, 2xx or 3xx. You cannot override this.
  3. The response must not have any set-cookie headers. You cannot override this, but you can use removeUpstreamResponseHeader('set-cookie') to remove set-cookie headers.
  4. The response must have a valid cache-control header that includes a positive max-age or s-maxage and does not include a private clause. You can override this by using router caching and forcing private responses.
However, sometimes you may not want to cache anything, even if the upstream backend returns a max-age. Other times, you might want to improve the performance of pages that can never be cached at edge. In those cases, you can turn off caching:
router.get('/some/uncacheable/path', async ({ cache, proxy }) => {
  cache({
    // Other options...
    edge: false,
  })
  // The route needs to send a response to prevent the request from continuing on to subsequent routes.
  // This example sends the request through to a backend named "origin", which completes the request cycle.
  await proxy('origin')
})

How do I know if a response was served from the cache?

To determine whether a response was served from the cache, examine the x-0-t response header. Two components indicate caching status:
  • oc - The outer (level 1) cache
  • sc - The shield (level 2) cache
You will see one of the following values for these components:
  • pass - The response was not cached (aka a cache “miss”)
  • cached - The response was added to the cache, but was not served from the cache (aka a cache “miss” that may be a “hit” for the next request)
  • hit - The response was served from the cache
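From a script, these components can be parsed out of the header value. A sketch, assuming the header encodes the components as comma-separated key=value pairs (the exact wire format of x-0-t is an assumption here):

```javascript
// Parse the oc (outer/L1) and sc (shield/L2) components from an x-0-t
// header value, assuming comma-separated key=value pairs.
function parseCacheStatus(traceHeader) {
  const get = (key) => {
    const match = new RegExp(`(?:^|,)${key}=([^,]+)`).exec(traceHeader)
    return match ? match[1] : undefined
  }
  return { outerCache: get('oc'), shieldCache: get('sc') }
}
```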

Why is my response not being cached?

To understand why a response was not cached, examine the x-0-caching-status response header. Its value will indicate one of the following reasons:
  • The response was cached or served from the cache (see x-0-t).
  • The response was not cached because edge caching was explicitly disabled (see Preventing a Response from being Cached).
  • The response was not cached because there was no cache-control response header with a non-zero max-age or s-maxage value. To cache the response, call cache in your route handler with edge.maxAgeSeconds set. For example:

new Router().get('/', ({ cache }) => {
  cache({
    edge: {
      maxAgeSeconds: 60 * 60 * 24,
    },
  })
})

You can also cache the response by adding a cache-control header with a non-zero max-age or s-maxage value to the upstream response.
  • The response was not cached because it had a status code >= 400.
  • The response was not cached because it contained a cache-control header with private. To cache the response, use:

new Router().get('/', ({ cache }) => {
  cache({
    edge: {
      forcePrivateCaching: true,
    },
  })
})

You can also remove the private value from the upstream response's cache-control header.
  • The response was not cached because the request method was something other than HEAD or GET, and the route that set the caching behavior used match. To cache POST responses, for example, use router.post() instead of router.match().
  • The response was not cached because the request body was larger than 8000 bytes.
  • The response was not cached because it contained a set-cookie header. To cache the response, use removeUpstreamResponseHeader('set-cookie') to remove the set-cookie header.
  • The response was not cached because it was received during the brief time (less than 1 minute) that a new version of the app was being propagated through the global network of POPs. No action is needed; this status goes away as soon as the new version is completely propagated.
  • The response was not cached because the request was issued with the x-0-debug header set to 1. In debug mode, Edgio responds with additional data that is useful for troubleshooting. However, the increased header footprint may lead to header overflow and other failures, so this should only be used during active troubleshooting.
  • The response was not cached for unknown reasons. If you receive this status, please contact support.
If a "pass" is intermittent on an otherwise cacheable resource, you may want to explore whether "hit-for-pass" was triggered. Hit-for-pass can happen when the system remembers, for a brief time, that a typically cacheable resource was not cacheable as anticipated (for example, because of a set-cookie header or an HTTP error status code). The system caches the fact that the resource was not cacheable, typically for a couple of minutes, and does not coalesce requests for it.
Ordinarily, request coalescing (see L1 and L2 Caches) speeds up client requests by enqueueing all but the first request, anticipating that the first request will populate the cache and allow instant delivery of the already-cached object to the waiting clients. Temporarily disabling coalescing when the resource is known not to be cacheable, such as when the upstream resource is serving errors, lets clients avoid the waiting queue and helps relieve pressure at all stages of the request lifecycle.
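The hit-for-pass idea can be sketched as follows (a conceptual illustration, not Edgio's implementation; the TTL value is an assumption based on the "couple of minutes" mentioned above):

```javascript
// Hit-for-pass sketch: when a normally cacheable URL turns out to be
// uncacheable (e.g. it sets a cookie), remember that for a short time
// and skip request coalescing so clients are not queued behind each other.
const hitForPass = new Map() // url -> expiry timestamp (ms)
const HIT_FOR_PASS_TTL_MS = 2 * 60 * 1000 // assumed "couple of minutes"

function markUncacheable(url, now = Date.now()) {
  hitForPass.set(url, now + HIT_FOR_PASS_TTL_MS)
}

function shouldCoalesce(url, now = Date.now()) {
  const expiry = hitForPass.get(url)
  if (expiry !== undefined && now < expiry) {
    return false // known uncacheable: send requests straight to origin
  }
  hitForPass.delete(url) // entry missing or expired; resume coalescing
  return true
}
```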

Caching During Development

By default, caching is turned off during development. This is done to ensure that developers don’t see stale responses after making changes to their code or other upstream APIs. You can enable caching during development by running your app with:
0 dev --cache
The cache will automatically be cleared when you make changes to your router. A few aspects of caching are not yet supported during local development:
  • edge.staleWhileRevalidateSeconds is not yet implemented. Only edge.maxAgeSeconds is used to set the cache time to live.
  • edge.key is not supported. Cache keys are always based solely on the URL, method, body, and the accept-encoding and host headers.

Ensuring Versioned Browser Assets are Permanently Available

In order to ensure that users who are actively browsing your site do not experience issues during a deployment, developers can configure certain client-side assets to be permanently available, even after a new version of the site has been deployed. For example, browsers using an old version of the site may continue to request JavaScript chunks for the old version of the site for some time after a new version is deployed. Edgio automatically makes client-side scripts permanently available if you use Next.js, Nuxt.js, Angular, or Sapper.
If you are using another framework or would like to make sure a particular asset is permanently available, you can do so by setting the permanent option in serveStatic. For example:
router.get('/scripts/:file', ({ serveStatic }) => {
  serveStatic('path/to/scripts', {
    permanent: true, // ensure that files are permanently accessible, even after a new version of the site has been deployed
    exclude: ['some-non-versioned-file.js'], // exclude specific files from being served permanently; do this for any files that do not have a content hash in their name
  })
})
You should only make assets permanently available if they have a hash of the content or a version number in the filename, or are accessed via a globally unique URL. For example:
  • /assets/main-989b11c4c35bc9b6e505.js
  • /assets/v99/main.js