Webpack + Express = Compressed Static Assets

Lately I've been obsessing over making my blog load less data and load it faster: I have analysed my dependencies and removed what wasn't needed, and I am planning to experiment with route-based code splitting and inlining critical assets.

In this post, we are going to look at an important part of bringing down the number of bytes transferred from our server to our client: compressing our assets. Algorithms like gzip or brotli promise to reduce file sizes by 50-80% (see Akamai's comparison)! This means the client has less to download and can potentially render your app sooner. In the case of this blog, brotli compression improved the time to first paint and the time to interactive by 25%.

In the following I am going to show how to set up gzip compression in webpack and how to serve the compressed files in express. At the end we'll extend the approach to include brotli as well!

If reading blog posts isn't your cup of tea and you prefer browsing the source, the webpack setup can be found here and the express server is available here.

The plan

Adding compression to your server is usually a fairly straightforward task taken care of by the likes of Apache or nginx. For express, there is the compression middleware; however, it can only create the compressed responses dynamically. This means the files get compressed anew every time a client asks for them.
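
For comparison, here is roughly what that dynamic approach looks like — a minimal sketch, assuming the compression package from npm:

const compression = require('compression')
const express = require('express')

const app = express()

// compresses every response on the fly, for every single request
app.use(compression())

app.use(express.static('./build'))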

We are going to go a different and slightly faster route by creating the compressed versions of the files ahead of time, while building and bundling our app code with webpack. The steps we are going to look at are:

  1. Create gzipped assets at build time
  2. Whenever a client asks for an asset:
    1. Check if client supports gzip encoding
    2. Check if gzipped file exists
    3. Add correct headers and return encoded file

Build setup with webpack

Since I'm already using webpack to bundle all my assets, I am going to add the compression-webpack-plugin to my client-side webpack setup:

const CompressionPlugin = require('compression-webpack-plugin')
// ...other imports...

module.exports = env => ({
  plugins : [
    // ... other plugins...
    new CompressionPlugin()
  ]
})

That is literally all it should take to add gzip-encoded assets to your build.
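
If you ever need more control, the plugin also accepts a few options to decide what gets compressed — a hedged sketch (option names may differ slightly between plugin versions):

new CompressionPlugin({
  test      : /\.(js|css|html|svg)$/, // only compress these asset types
  threshold : 10240,                  // skip assets smaller than ~10 KB
  minRatio  : 0.8                     // skip assets that barely shrink
})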

Serving with express

The more interesting part is how to serve these newly created files using express. On my server, I statically serve my entire ./build folder and I don't want to change that.

const express = require('express')
const app = express()

app.get('*.js', serveGzipped('text/javascript')) // gzip-aware middleware, defined below
app.get('*.css', serveGzipped('text/css')) // same for stylesheets

app.use(express.static('./build', {
  immutable : true,
  maxAge    : '1y' // caching!
}))

Before statically serving my built assets, I add a special piece of middleware for all .js and .css files: they all have to go through serveGzipped.

serveGzipped in turn checks whether the browser supports gzip by looking at the request's Accept-Encoding header. If it does, and the compressed file exists, it appends the gzip file extension (.gz) to the request's url, sets the headers accordingly and then calls next() to let the next handler in express' pipeline take care of the request (hint: this is going to be the static serving of the build folder we set up before).

const fs = require('fs')

const serveGzipped = contentType => (req, res, next) => {
  // does browser support gzip? does the file exist?
  const acceptedEncodings = req.acceptsEncodings()
  if (
    acceptedEncodings.indexOf('gzip') === -1
    || !fs.existsSync(`./build/${req.url}.gz`)
  ) {
    next()
    return
  }

  // update request's url
  req.url = `${req.url}.gz`

  // set correct headers
  res.set('Content-Encoding', 'gzip')
  res.set('Content-Type', contentType)

  // let express.static take care of the updated request
  next()
}

Et voilà: All CSS and JavaScript resources will now be served in their gzipped form.

Testing

You can test this by taking the name of one of your assets (in my case /vendor.6611208a4356885eee4a.js) and curling it with and without the Accept-Encoding header set:

$ curl -I http://localhost:3000/vendor.6611208a4356885eee4a.js

HTTP/1.1 200 OK
X-Powered-By: Express
Cache-Control: public, max-age=31536000
Content-Type: application/javascript
Content-Length: 657895 # Big :(
# ...
$ curl -I http://localhost:3000/vendor.6611208a4356885eee4a.js -H "Accept-Encoding: gzip"

HTTP/1.1 200 OK
X-Powered-By: Express
Content-Encoding: gzip # Encoded!
Content-Type: text/javascript; charset=utf-8
Cache-Control: public, max-age=31536000
Content-Length: 187349 # Smaller!
# ...

Adding brotli to the mix

Brotli, like gzip, is a lossless compression algorithm supported by most modern browsers (see can i use). Its compression ratio is higher than gzip's (meaning the files end up being smaller), and since we compress at build time, the extra compression work costs us nothing at request time. AWESOME!

Let's see what we have to update to get brotli into the mix as well!

The webpack part, again, is fairly simple: in addition to the compression-webpack-plugin, we'll also add the brotli-webpack-plugin:

const CompressionPlugin = require('compression-webpack-plugin')
const BrotliPlugin = require('brotli-webpack-plugin') // NEW!
// ...other imports...

module.exports = env => ({
  plugins : [
    // ... other plugins...
    new CompressionPlugin(), // gzip
    new BrotliPlugin() // brotli NEW!
  ]
})

On our express server, we have to do exactly the same thing as before with gzip. I rewrote the old serveGzipped function to work with an array of compressions instead.

First the array of supported compressions:

const compressions = [
  {
    encoding  : 'br',
    extension : 'br'
  },
  {
    encoding  : 'gzip',
    extension : 'gz'
  }
]

And then the updated serveCompressed function:

const serveCompressed = contentType => (req, res, next) => {
  const acceptedEncodings = req.acceptsEncodings()
  // use first compression which is supported
  // and where file exists
  const compression = compressions.find(comp => (
    acceptedEncodings.indexOf(comp.encoding) !== -1
    && fs.existsSync(`./build/${req.url}.${comp.extension}`)
  ))

  if (compression) {
    req.url = `${req.url}.${compression.extension}`
    res.set('Content-Encoding', compression.encoding)
    res.set('Content-Type', contentType)
  }

  next()
}
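
Wiring it up on the server looks exactly like before, only the middleware name changes (assuming the same ./build static setup from above):

app.get('*.js', serveCompressed('text/javascript'))
app.get('*.css', serveCompressed('text/css'))

app.use(express.static('./build', {
  immutable : true,
  maxAge    : '1y'
}))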

You can test that everything works correctly by curling your asset again and comparing the headers:

$ curl -I http://localhost:3000/vendor.6611208a4356885eee4a.js
$ curl -I http://localhost:3000/vendor.6611208a4356885eee4a.js -H "Accept-Encoding: gzip"
$ curl -I http://localhost:3000/vendor.6611208a4356885eee4a.js -H "Accept-Encoding: gzip, br"
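
For the last request, the response headers should look roughly like this — the exact sizes depend on your bundle, so I'm omitting Content-Length:

HTTP/1.1 200 OK
X-Powered-By: Express
Content-Encoding: br # brotli!
Content-Type: text/javascript; charset=utf-8
Cache-Control: public, max-age=31536000
# ...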

Results

And our favourite section: time for results!

Lighthouse report after enabling compression: 1,510 ms to first meaningful paint, 5,690 ms to first interactive, 1,630 perceptual speed index.

Using Lighthouse, I could measure an improvement of roughly 25% across the board:

                             Before      After       Diff
First meaningful paint       2,010 ms    1,510 ms    500 ms
First Interactive (beta)     7,910 ms    5,690 ms    2,220 ms
Perceptual Speed Index       2,170       1,630       540