

Build an App that Handles Caching with Fastify and AWS CloudFront

We all want to build blazingly fast applications, and one of the most effective ways to speed them up is caching.

In large web applications it's common to adopt a Content Delivery Network (CDN). That introduces many advantages, from reducing latency to decreasing server load, to drastically improving performance by adopting HTTP caching.

We'll use fastify as the web server and Amazon CloudFront as the CDN, to optimize HTTP response times via HTTP caching headers.

The architecture of this setup is quite simple: the client -> the CDN -> the server(s).

You can access the examples referred to in this article at the following link: https://github.com/nearform/blog-fastify-cloudfront-example

HTTP cache

HTTP caching has been defined in the protocol since the beginning. It can be used to reduce server calls and data transfer, as well as to avoid repeating operations that produce the same results.

Two core concepts of caching are saving and updating operation results. Saving the results of operations is done to serve them quickly and avoid repeating computations that lead to the same outcome. Updating those saved results when they become stale is essential in a useful cache.

HTTP caching can be time-based or content-based, and of course, some content can’t be cached or is prohibitively difficult to cache.

Time-based caching

Time-based caching is quite simple to implement: set an arbitrary expiration on cache entries, and when that time passes, the content is reloaded from the source. This strategy is the most efficient from the client’s perspective: once the content is received, no more requests are made for a specific time. At the same time, consistently orchestrating client-side requests is not that easy.

Content-based caching

Content-based caching is different: the client gets Etag and/or Last-Modified headers that identify the response, and provides them on subsequent requests for the same resource. The server either responds with new content and updated headers if the resource has changed in the meantime, or with a “304 Not Modified” if not, without sending the content again.
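As a minimal sketch of the two strategies (the route paths and the hard-coded Etag are illustrative, not taken from the example repo), a time-based route simply sets max-age, while a content-based route validates If-None-Match and replies 304 when nothing has changed:

JavaScript
const fastify = require('fastify')()

// time-based: the client (and any CDN) may reuse the response for 60 seconds
fastify.get('/time-based', async (req, reply) => {
  reply.header('cache-control', 'max-age=60')
  return { now: new Date().toISOString() }
})

// content-based: validate with an Etag and reply 304 when the client already has it
fastify.get('/content-based', async (req, reply) => {
  const etag = '"v1"' // in a real app, derive this from the content
  if (req.headers['if-none-match'] === etag) {
    return reply.code(304).send()
  }
  reply.header('etag', etag)
  return { hello: 'world' }
})

fastify.listen(3000)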

This is just an overview of HTTP caching; for more information see HTTP caching on MDN.

Choosing the right strategy really depends on many factors, the very first of which is business requirements.

HTTP headers

Cache-Control is the main header for caching directives, both for requests and responses. In this use case, we focus on max-age and s-maxage, which tell the client and the CDN respectively the time-to-live (TTL) for a resource, and on no-cache to avoid caching. For more information see Cache-Control on MDN.

Vary

The Vary response header describes which parts of the request headers are involved in the cache key. For more information see Vary on MDN.

Etag (entity tag)

The Etag response header is an identifier for the response. If the server response includes the Etag header, the client should provide the value on the following requests to the same resource, in the If-None-Match header. For more information see Etag on MDN.

Last-Modified

Last-Modified is the same concept as Etag but based on a date. If it's contained in the response, the client should send it back later as If-Modified-Since. Etag and Last-Modified can be used together. For more information see Last-Modified on MDN.

Contents

We can configure different behaviours by combining the caching headers, but generally speaking, we can categorize content as public or private, and static or dynamic.

When we combine them, we get the following content types:

dynamic public

Dynamic content usually refers to server-rendered pages or API responses. For example, a home page optimized for the client device (mobile or desktop) or localized by the client origin, or an API that returns the content of an article from the company CMS.

dynamic private

The same as above, but private: the content is accessible under authorization, or the user data affects the content. For example, the rendered page may include “welcome ${user.name}”. Because of the many parameters involved, this is the trickiest type of content to cache. Failing to properly define the parameters here can be disastrous, such as serving private content to the wrong users.

static public

This type of content is usually application assets, often served efficiently by storage services like Amazon S3.

static private

This is content that is only accessible with authorization, for example an image sent in a chat app. The approach is similar to dynamic private content.
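As a rough guide only (the TTL values below are placeholders, not recommendations), the Cache-Control values for the four categories could look like this:

JavaScript
// illustrative Cache-Control values per content category (TTLs are arbitrary)
const cacheControlByCategory = {
  // shared caches (CDN) and browsers may store it
  dynamicPublic: 'public, s-maxage=60, max-age=30',
  // only the browser should store it, unless the CDN cache key includes the user (e.g. Vary: Authorization)
  dynamicPrivate: 'private, max-age=30',
  // immutable assets with hashed filenames can be cached for a very long time
  staticPublic: 'public, max-age=31536000, immutable',
  // authorized-only assets: keep them out of shared caches
  staticPrivate: 'private, max-age=300'
}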

To force a cache refresh, the simplest way is to remove the cache entries when new content is available (for example, on a new release of the frontend). Otherwise, entries are reloaded when they expire or when they no longer match the content identifiers Etag and/or Last-Modified.
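With CloudFront, removing entries is done through an invalidation. Here is a minimal sketch using the AWS SDK v3; the distribution ID and the '/*' path are placeholders:

JavaScript
const { CloudFrontClient, CreateInvalidationCommand } = require('@aws-sdk/client-cloudfront')

const client = new CloudFrontClient({})

async function invalidateAll (distributionId) {
  // invalidate every cached entry of the distribution
  return client.send(new CreateInvalidationCommand({
    DistributionId: distributionId,
    InvalidationBatch: {
      CallerReference: Date.now().toString(), // must be unique per invalidation request
      Paths: { Quantity: 1, Items: ['/*'] }
    }
  }))
}

invalidateAll('E1EXAMPLE') // placeholder distribution ID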

Amazon CloudFront

Amazon CloudFront is our CDN of choice. The capabilities of CloudFront are wide-ranging, including compressing responses, applying Lambda@Edge functions or CloudFront Functions to requests and responses, using streaming capabilities, adding encryption and much more. The feature set is so rich that caching is not even mentioned on the first documentation page.

For our purposes, we'll focus only on the caching features, using CloudFront as a reverse proxy in front of our fastify application.

CloudFront allows you to define very fine-grained policies for caching. The main concepts are:

  1. Define the “cache key” per path to identify requests, and therefore cache entries. Cache keys always include the URL and method, plus part or all of the query string; headers and cookies can be added to further identify the request
  2. Use the custom CloudFront HTTP headers to get client information such as the user device and location (see the full list), for example CloudFront-Is-Mobile-Viewer. Having that information ready to use on the server is very powerful!
  3. On the response to the client, CloudFront adds an x-cache header containing information about how the cache was used for the resource. There are four possible values: “Miss”, “Hit”, “RefreshHit” or “Error”
  4. It automatically adopts Etag and/or Last-Modified if present in the server response and handles them with If-None-Match and If-Modified-Since on further requests.

Fastify

Fastify works perfectly with CloudFront because it’s very easy to set HTTP headers, and fastify also has a very efficient Etag computation in a tiny, yet amazing, plugin: fastify-etag.

When implementing time-based strategies, Etags are not strictly needed, and it can be hard to determine when a resource was generated in order to set Last-Modified. However, since CloudFront manages Etags so efficiently out of the box, it’s very convenient to use them anyway.

The brilliant part of the Etag plugin is that it uses the fast fnv1a algorithm to hash the response, and it also automatically handles the request’s If-None-Match header, replying with 304 Not Modified when the Etag matches.

In our case, once the fastify server provides the Etag, CloudFront is able to handle it, serving the matching content itself or forwarding the request to the fastify server.

JavaScript
const fastify = require('fastify')
const etag = require('fastify-etag')

const app = fastify()
app.register(etag)

app.get('/hello', async (req, reply) => {
  // automatic etag generation
  return { hello: 'world' }
})

app.get('/word', async (req, reply) => {
  // set etag manually
  reply.header('etag', 'foobar')
  return { hello: 'world' }
})

app.listen(3000)

Looking at benchmarks, Etag generation with the fnv1a algorithm is only 10% slower than no Etag generation at all! Considering the benefits of adopting a validation-based strategy, this is an impressively low cost.

Example

Let’s build a cache for a dynamic private API.

The fastify app has just a simple route with pseudo-authentication that responds with the user info and the CloudFront information about the request origin.

JavaScript
const users = {
  one: { name: 'Alice', email: 'alice@email.com' },
  two: { name: 'Bob', email: 'bob@email.com' }
}

const fastify = require('fastify')()
fastify.register(require('fastify-etag'))

fastify.get('/my-info', async (request, reply) => {
  // this "authentication" is for educational purposes only!
  const [, token = ''] = (request.headers.authorization || '').split(' ')
  const user = users[token.slice('user:'.length)]

  if (!user) {
    return reply.code(403).send('UNAUTHORIZED')
  }

  reply.type('application/json')
  // it will be cached on the CDN for 30 seconds, and on the client for 60
  // the cache entry depends on the "Authorization" header content
  reply.headers({
    'cache-control': 's-maxage=30,max-age=60',
    vary: 'authorization'
  })

  reply.send({
    ...user,
    mobile: request.headers['cloudfront-is-mobile-viewer'],
    country: request.headers['cloudfront-viewer-country'],
    city: request.headers['cloudfront-viewer-city'],
    lat: request.headers['cloudfront-viewer-latitude'],
    lng: request.headers['cloudfront-viewer-longitude']
  })
})

fastify.listen(process.env.PORT || 3000, '0.0.0.0')

We’ll use CDK to set up a CloudFront Distribution named “example”, which uses a CachePolicy and an OriginRequestPolicy. The CachePolicy named “private-dynamic-content” has min and default TTL at zero, and max TTL at 1 day; the request is identified by the value of the Authorization header and the query string, if any.

The OriginRequestPolicy named “forward-all” forwards all cookie and query string values to the origin, and also includes the CloudFront headers that identify the client, such as “CloudFront-Is-Mobile-Viewer”, “CloudFront-Viewer-Country” and so on.

You can see the full code in the example repo.

JavaScript
const cachePolicyPrivateDynamic = new cloudfront.CachePolicy(this, 'private-dynamic-content', {
  cachePolicyName: 'private-dynamic-content',
  defaultTtl: cdk.Duration.seconds(0),
  minTtl: cdk.Duration.seconds(0),
  maxTtl: cdk.Duration.days(1),
  headerBehavior: cloudfront.CacheHeaderBehavior.allowList('Authorization'),
  queryStringBehavior: cloudfront.CacheQueryStringBehavior.all(),
  enableAcceptEncodingGzip: true,
  enableAcceptEncodingBrotli: true
})

const originRequestPolicy = new cloudfront.OriginRequestPolicy(this, 'forward-all', {
  originRequestPolicyName: 'forward-all',
  headerBehavior: cloudfront.OriginRequestHeaderBehavior.allowList(
    'CloudFront-Is-Mobile-Viewer',
    'CloudFront-Viewer-Country',
    'CloudFront-Viewer-City',
    'CloudFront-Viewer-Latitude',
    'CloudFront-Viewer-Longitude'
  ),
  cookieBehavior: cloudfront.OriginRequestCookieBehavior.all(),
  queryStringBehavior: cloudfront.OriginRequestQueryStringBehavior.all()
})

new cloudfront.Distribution(this, 'example', {
  defaultBehavior: {
    origin: new origins.HttpOrigin(SERVICE_HOST),
    allowedMethods: cloudfront.AllowedMethods.ALLOW_ALL,
    viewerProtocolPolicy: cloudfront.ViewerProtocolPolicy.ALLOW_ALL,
    cachePolicy: cachePolicyPrivateDynamic,
    originRequestPolicy: originRequestPolicy
  }
})

Let’s see them in action.

The first call is for the user identified by the token “user:one”. The “x-cache” response header says it is a miss from CloudFront, so the response is served by the fastify app.

Plain Text
curl -v -H "Authorization: token user:one" http://d1s9xu9rpb6qix.cloudfront.net/my-info

 < HTTP/1.1 200 OK
 < Content-Type: application/json; charset=utf-8
 < Content-Length: 122
 < Cache-Control: s-maxage=30,max-age=60
 < ETag: "qjrpq9"
 < Vary: authorization
 < X-Cache: Miss from cloudfront
 < Via: 1.1 c275031486c6f7b744b8d30847e98b14.cloudfront.net (CloudFront)
  {"name":"Alice","email":"alice@email.com","mobile":"false","country":"IT","city":"Rome","lat":"41.90080","lng":"12.48740"}

Making the same request again, CloudFront serves it from its cache using the previous response, without reaching the fastify app.

Plain Text
curl -v -H "Authorization: token user:one" http://d1s9xu9rpb6qix.cloudfront.net/my-info

 < HTTP/1.1 200 OK
 < Content-Type: application/json; charset=utf-8
 < Content-Length: 122
 < Cache-Control: s-maxage=30,max-age=60
 < ETag: "qjrpq9"
 < Vary: authorization
 < X-Cache: Hit from cloudfront
 < Via: 1.1 82e9051d8d41080bd3028731e0e8677e.cloudfront.net (CloudFront)


{"name":"Alice","email":"alice@email.com","mobile":"false","country":"IT","city":"Rome","lat":"41.90080","lng":"12.48740"}

Now let’s call user “two”. Everything works fine: the right data is served from the server (a cache miss, as expected).

Plain Text
curl -v -H "Authorization: token user:two" http://d1s9xu9rpb6qix.cloudfront.net/my-info

 < HTTP/1.1 200 OK
 < Content-Type: application/json; charset=utf-8
 < Content-Length: 118
 < Cache-Control: s-maxage=30,max-age=60
 < ETag: "59t9f5"
 < Vary: authorization
 < X-Cache: Miss from cloudfront
 < Via: 1.1 507b5edb20d0e1a0b73c8687f53defa8.cloudfront.net (CloudFront)

{"name":"Bob","email":"bob@email.com","mobile":"false","country":"IT","city":"Rome","lat":"41.90080","lng":"12.48740"}

Making the same request again, CloudFront handles it properly, serving the right content for user “two”.

Plain Text
curl -v -H "Authorization: token user:two" http://d1s9xu9rpb6qix.cloudfront.net/my-info

 < HTTP/1.1 200 OK
 < Content-Type: application/json; charset=utf-8
 < Content-Length: 118
 < Cache-Control: s-maxage=30,max-age=60
 < ETag: "59t9f5"
 < Vary: authorization
 < X-Cache: Hit from cloudfront
 < Via: 1.1 e7e7960d7731a7583cedd8f1ff1aca38.cloudfront.net (CloudFront)

 {"name":"Bob","email":"bob@email.com","mobile":"false","country":"IT","city":"Rome","lat":"41.90080","lng":"12.48740"}

Pitfalls & caveats

Since CloudFront manages the request and response between server and client, we must be aware of how it behaves with respect to the HTTP headers.

CloudFront policies control the cache on the CDN over the HTTP directives set by the server: the min/max/default TTL values of the cache policy cap Cache-Control’s max-age and s-maxage, and even override no-cache,must-revalidate. That means, for example, that setting a minimum TTL of 1 on CloudFront will cache the response for 1 second even with Cache-Control: no-cache,must-revalidate. The Cache-Control directives carry information for CloudFront and also for the final client, so they have to be set anyway, and they have to be consistent with the CloudFront settings.
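One way to stay on the safe side for routes that must never be cached is to give them a dedicated behaviour with the managed CachingDisabled policy. A sketch, reusing the policies from the example above; the '/auth/*' path pattern is hypothetical:

JavaScript
new cloudfront.Distribution(this, 'example', {
  defaultBehavior: {
    origin: new origins.HttpOrigin(SERVICE_HOST),
    cachePolicy: cachePolicyPrivateDynamic,
    originRequestPolicy
  },
  additionalBehaviors: {
    // never cache these paths, regardless of what the origin sends
    '/auth/*': {
      origin: new origins.HttpOrigin(SERVICE_HOST),
      cachePolicy: cloudfront.CachePolicy.CACHING_DISABLED,
      originRequestPolicy
    }
  }
})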

Takeaways

Caching is a powerful yet tricky technique to improve application performance. We should approach it wisely, analyzing what to cache in our applications and how to cache it. Using powerful tools like Amazon CloudFront and fastify makes it a little easier to implement and to stay in control of the outcome.
