The Speed Trifecta: 11ty, Brotli 11, and CSS Fingerprinting

So I've recently written two 11ty-related blog posts:

  1. Using an 11ty Shortcode to craft a custom CSS pipeline
  2. Cranking Brotli up to 11 with Cloudflare Pro and 11ty

If you haven't read them, no problem, they are always there if you're ever struggling to sleep!

TL;DR:

In the first post I look at how I've automated CSS fingerprinting on production (Cloudflare), resulting in the ability to use long-life Cache-Control directives like max-age=31536000 and the immutable browser hint.

In the second post, I look at how you can improve Brotli compression by manually compressing your assets to Brotli setting 11 (max) then serving them via a Cloudflare Pro plan. This gives a significant improvement in file size over the Brotli compression setting of 4 that Cloudflare uses for dynamic (on the fly) compression.

While working on both posts, it dawned on me that blending the approaches they cover would result in the ideal CSS configuration for my little blog. In this post, I'll show you exactly how I've done it. Let's get started!
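As a quick refresher before we start, the fingerprinting from the first post boils down to a short content hash of the CSS file. A minimal sketch of the idea (not the full shortcode, and with made-up CSS) looks like this:

```javascript
import crypto from 'crypto';

// Hash the CSS contents and keep the first 10 hex characters as the fingerprint
const css = 'body { margin: 0; }';
const hash = crypto.createHash('sha256').update(css).digest('hex').slice(0, 10);

// Any change to the CSS produces a different hash, and therefore a new filename
console.log(`index-${hash}.css`);
```

Because the filename changes whenever the contents do, browsers can cache the old file for as long as you like without ever serving stale styles.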

Original code

So I'm going to build on the code from the CSS fingerprinting blog post mentioned above. You can see it below, or in this Gist.

import dotenv from "dotenv";
import CleanCSS from 'clean-css';
import fs from 'fs';
import crypto from 'crypto';
import path from 'path';
dotenv.config();

// create a single CleanCSS instance to be reused across files.
// Add additional optimisation settings in here.
const cleanCSS = new CleanCSS({
	level: {
		2: {
			removeDuplicateRules: true // turns on removing duplicate rules
		}
	}
});

export function manipulateCSS(eleventyConfig) {
	eleventyConfig.addShortcode("customCSS", async function(cssPath) {
		// output the file with no fingerprinting if on the dev environment
		// (allows auto-reload when the CSS is modified)
		if (process.env.ELEVENTY_ENV === 'development') {
			return `<link rel="stylesheet" href="${cssPath}">`;
		}
		// Using path.join for better cross-platform compatibility
		const inputFile = path.join('./public', cssPath);
		const outputDirectory = path.join('./_site', 'css');
		const cacheDirectory = path.join('./.cache', 'css');

		try {
			// Check if input file exists first
			if (!fs.existsSync(inputFile)) {
				console.error(`Input CSS file not found: ${inputFile}`);
				return '';
			}
			// Ensure both cache and output directories exist
			for (const dir of [cacheDirectory, outputDirectory]) {
				if (!fs.existsSync(dir)) {
					fs.mkdirSync(dir, { recursive: true });
				}
			}
			// Read the input CSS file
			const inputCSS = await fs.promises.readFile(inputFile, 'utf8');
			// Initialises a new hashing instance
			const hash = crypto.createHash('sha256')
				// Feed CSS data into the hash function
				.update(inputCSS)
				// Specify the hash should be returned as a hexadecimal string
				.digest('hex')
				// Only take the first 10 characters of the hash
				.slice(0, 10);

			// Generate our CSS Cache name
			const cacheKey = `${hash}-${cssPath.replace(/[\/\\]/g, '-')}`;
			// This is where the file will be written
			const cachePath = path.join(cacheDirectory, cacheKey);

			// store our manipulated CSS in this variable
			let processedCSS;
			// check if a cached version of this file already exists
			if (fs.existsSync(cachePath)) {
				// read the cached CSS file
				processedCSS = await fs.promises.readFile(cachePath, 'utf8');
			} else {
				// Use the memoized cleanCSS instance to minify the input CSS
				processedCSS = cleanCSS.minify(inputCSS).styles;
				await fs.promises.writeFile(cachePath, processedCSS);
			}
			// Split the input file path into its components (directory, filename, extension)
			const parsedPath = path.parse(inputFile);
			// Use path.join for output paths
			const finalFilename = path.join(outputDirectory, `${parsedPath.name}-${hash}${parsedPath.ext}`);
			// Write the optimised CSS to the final output location with the hash in the filename
			await fs.promises.writeFile(finalFilename, processedCSS);

			// path manipulation for final URL
			const hashedPath = finalFilename.replace(path.join('./_site'), '').replace(/\\/g, '/');

			// return our final link element with optimised and fingerprinted CSS.
			return `<link rel="stylesheet" href="${hashedPath}">`;
		} catch (err) {
			console.error("Error processing CSS:", err);
			return "";
		}
	});
}

Import zlib

The first thing we shall do is bring in zlib, as this is what we are going to use to compress the CSS file all the way up to 11 on production. Handily, zlib is a built-in Node.js module, so there's nothing to install from npm. We simply import it into our ESM code above:

import zlib from 'zlib';

Don't worry, I'm not going to go this slow all the way through the code. Feel free to skip straight to the final code if you so wish!

Setting the default compression level

This is a completely optional step; it really depends on how and where you store your build variables. I'm going to store the Brotli compression level I want to use on production in two places: in the JS (as a default to fall back on if the environment variable doesn't exist), and in an environment variable on production. In the JavaScript below, the default is set to 6 to show it can be any value between 0 and 11 (3 or 4 is what most CDNs use for dynamic compression; Cloudflare's is set to 4):

// Default Brotli compression level if not set in the environment
const DEFAULT_BROTLI_COMPRESSION_LEVEL = 6;

And also in my .env file in development:

BROTLI_COMPRESSION_LEVEL=11

Now we set our compression level as a constant in our JavaScript for use later in the code:

const brotliCompressionLevel = parseInt(process.env.BROTLI_COMPRESSION_LEVEL || DEFAULT_BROTLI_COMPRESSION_LEVEL, 10);

Since our environment variable is currently set, compression will be set to 11, but if that variable isn't set, it will fall back to 6, the default value set above.
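To make that fallback behaviour concrete, here's a tiny sketch (the resolveBrotliLevel helper is hypothetical, purely for illustration):

```javascript
const DEFAULT_BROTLI_COMPRESSION_LEVEL = 6;

// Hypothetical helper: parse the environment value, falling back to the default
function resolveBrotliLevel(envValue) {
	return parseInt(envValue || DEFAULT_BROTLI_COMPRESSION_LEVEL, 10);
}

console.log(resolveBrotliLevel('11')); // env var set: 11
console.log(resolveBrotliLevel(undefined)); // env var missing: 6
```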

Cloudflare and Brotli

Before we go through the Brotli compression code, it's a good time to mention that Cloudflare does something automatically that is very useful for compressed files. Cloudflare Pages will automatically serve Brotli-compressed files (.br) if available, with no additional configuration required on the Cloudflare side. See the Content Compression Documentation on Cloudflare for more information.

This means we don't need to rename the compressed CSS file from index-hash.css.br back to index-hash.css for a user's browser to recognise it as a CSS file. Handy, huh? Don't worry if you don't use Cloudflare Pages: I also have a version written that renames the CSS file back to index-hash-compressed.css for non-Cloudflare Pages readers.

Important: While testing the documented method above, I could not find a way to confirm that my static compressed version was being served automatically. So in the code below I have removed this assumption and explicitly referenced my statically compressed version in the <link> tag. Please do let me know if I'm interpreting the documentation for this functionality wrong, as I'd love to see it in action, and to learn how you verify that the .br file is actually being served automatically!

Brotli Compression code

Now that I've covered that bit of (potential) Cloudflare Pages automation, let's have a look at the actual compression code:

// Brotli compression

// The output filename for the compressed file
const brotliFilename = `${finalFilename}.br`;
// Only compress to Brotli if the file doesn't exist
if (!fs.existsSync(brotliFilename)) {
	// Set our zlib Brotli options here. Note that Brotli quality is set via
	// params; the gzip-style "level" option is ignored by the Brotli APIs
	const brotliOptions = {
		params: {
			[zlib.constants.BROTLI_PARAM_QUALITY]: brotliCompressionLevel // Use the level specified in the environment
		}
	};
	// zlib does its compression magic!
	const brotliBuffer = zlib.brotliCompressSync(Buffer.from(processedCSS), brotliOptions);
	// Write the compressed code to the output filename defined above
	await fs.promises.writeFile(brotliFilename, brotliBuffer);
}

The last thing to do now is mostly the same as what I had in the original code set out above:

// path manipulation for final URL
const hashedPath = brotliFilename.replace(path.join('./_site'), '').replace(/\\/g, '/');
// return our final link element with optimised and fingerprinted CSS.
return `<link rel="stylesheet" href="${hashedPath}">`;

Only this time I'm modifying the path to the Brotli compressed filename rather than the original uncompressed CSS file! Simple!
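To show what that path manipulation actually produces, here's a standalone sketch using an example hash (the filename is made up):

```javascript
import path from 'path';

// A hypothetical output filename, as built earlier in the shortcode
const brotliFilename = path.join('./_site', 'css', 'index-b9fcfe85ef.css.br');

// Strip the build directory prefix and normalise any Windows backslashes
const hashedPath = brotliFilename.replace(path.join('./_site'), '').replace(/\\/g, '/');
console.log(hashedPath); // /css/index-b9fcfe85ef.css.br (on a POSIX system)
```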

On Development: Once added to the 11ty config as per my previous post, my development CSS looks like this:

// uncompressed CSS with no fingerprinting, auto-reload still functioning
<link rel="stylesheet" href="/css/index.css">

On Production: The CSS looks like this:

// Brotli 11 compressed, minified and fingerprinted CSS
<link rel="stylesheet" href="/css/index-b9fcfe85ef.css.br">

This is the version I have running on the site at the moment, so why not view the page source and take a look?

Finishing touches

That's the bulk of the work done, but if you were to run this code on production at present you'd get a page full of naked HTML and no styling. This is because the Cloudflare Pages _headers file needs to be modified, as Cloudflare Pages is now serving my Brotli 11 compressed CSS file, not the uncompressed version we had before.

Simply add the following to your _headers file:

/css/*
Content-Encoding: br
Vary: Accept-Encoding
Content-Type: text/css

It's important to remember that this headers file matches against the path the file is served from on the production website! It isn't (and can't be) an extension wildcard, as Cloudflare Pages doesn't support them. It took me a while to figure this out, I must admit!

So, for example, the following doesn't work:

*.css
Content-Encoding: br
Vary: Accept-Encoding
Content-Type: text/css

When I first migrated to Cloudflare Pages, I tried it and couldn't work out why it wasn't working. /css/* is basically saying: "Any file that resides in the css directory on production will be served with the following headers".

Lastly, remember to add the following to your Cloudflare Pages environment variables. This can be plaintext if you like, since the variable is only storing a single integer (not a secret API key):

BROTLI_COMPRESSION_LEVEL=11

Once added, you should be good to go!

Final code

As promised, below is the final code that I just described above:

import zlib from 'zlib';
import dotenv from "dotenv";
import CleanCSS from 'clean-css';
import fs from 'fs';
import crypto from 'crypto';
import path from 'path';
dotenv.config();

// An example of how you could add additional CleanCSS settings if required
const cleanCSS = new CleanCSS({
	level: {
		2: {
			removeDuplicateRules: true
		}
	}
});

// Default Brotli compression level if not set in the environment
const DEFAULT_BROTLI_COMPRESSION_LEVEL = 6;

export function manipulateCSS(eleventyConfig) {
	eleventyConfig.addShortcode("customCSS", async function(cssPath) {
		if (process.env.ELEVENTY_ENV === 'development') {
			return `<link rel="stylesheet" href="${cssPath}">`;
		}

		const inputFile = path.join('./public', cssPath);
		const outputDirectory = path.join('./_site', 'css');
		const cacheDirectory = path.join('./.cache', 'css');

		// Get compression level from the environment or use the default
		const brotliCompressionLevel = parseInt(process.env.BROTLI_COMPRESSION_LEVEL || DEFAULT_BROTLI_COMPRESSION_LEVEL, 10);

		try {
			if (!fs.existsSync(inputFile)) {
				console.error(`Input CSS file not found: ${inputFile}`);
				return '';
			}

			for (const dir of [cacheDirectory, outputDirectory]) {
				if (!fs.existsSync(dir)) {
					fs.mkdirSync(dir, { recursive: true });
				}
			}

			const inputCSS = await fs.promises.readFile(inputFile, 'utf8');
			const hash = crypto.createHash('sha256').update(inputCSS).digest('hex').slice(0, 10);

			const cacheKey = `${hash}-${cssPath.replace(/[\/\\]/g, '-')}`;
			const cachePath = path.join(cacheDirectory, cacheKey);

			let processedCSS;
			if (fs.existsSync(cachePath)) {
				processedCSS = await fs.promises.readFile(cachePath, 'utf8');
			} else {
				processedCSS = cleanCSS.minify(inputCSS).styles;
				await fs.promises.writeFile(cachePath, processedCSS);
			}

			const parsedPath = path.parse(inputFile);
			const finalFilename = path.join(outputDirectory, `${parsedPath.name}-${hash}${parsedPath.ext}`);
			await fs.promises.writeFile(finalFilename, processedCSS);

			// Brotli compression

			// The output filename for the compressed file
			const brotliFilename = `${finalFilename}.br`;

			// Only compress to Brotli if the file doesn't exist
			if (!fs.existsSync(brotliFilename)) {
				// Set our zlib Brotli options here. Note that Brotli quality is set via
				// params; the gzip-style "level" option is ignored by the Brotli APIs
				const brotliOptions = {
					params: {
						[zlib.constants.BROTLI_PARAM_QUALITY]: brotliCompressionLevel // Use the level specified in the environment
					}
				};
				// zlib does its compression magic!
				const brotliBuffer = zlib.brotliCompressSync(Buffer.from(processedCSS), brotliOptions);
				// Write the compressed code to the output filename defined above
				await fs.promises.writeFile(brotliFilename, brotliBuffer);
			}

			const hashedPath = brotliFilename.replace(path.join('./_site'), '').replace(/\\/g, '/');
			return `<link rel="stylesheet" href="${hashedPath}">`;
		} catch (err) {
			console.error("Error processing CSS:", err);
			return "";
		}
	});
}

And here's a Gist for the code as well!

With file renaming

Now, I understand not everyone will want to serve their CSS files with the .br extension. So below is a version that renames the file back to .css and adds a -compressed suffix to the filename. The rest of the code is identical to the version above:

import zlib from 'zlib';
import dotenv from "dotenv";
import CleanCSS from 'clean-css';
import fs from 'fs';
import crypto from 'crypto';
import path from 'path';
dotenv.config();

// An example of how you could add additional CleanCSS settings if required
const cleanCSS = new CleanCSS({
	level: {
		2: {
			removeDuplicateRules: true
		}
	}
});

// Default Brotli compression level if not set in the environment
const DEFAULT_BROTLI_COMPRESSION_LEVEL = 6;

export function manipulateCSS(eleventyConfig) {
	eleventyConfig.addShortcode("customCSS", async function(cssPath) {
		if (process.env.ELEVENTY_ENV === 'development') {
			return `<link rel="stylesheet" href="${cssPath}">`;
		}

		const inputFile = path.join('./public', cssPath);
		const outputDirectory = path.join('./_site', 'css');
		const cacheDirectory = path.join('./.cache', 'css');

		// Get compression level from the environment or use the default
		const brotliCompressionLevel = parseInt(process.env.BROTLI_COMPRESSION_LEVEL || DEFAULT_BROTLI_COMPRESSION_LEVEL, 10);

		try {
			if (!fs.existsSync(inputFile)) {
				console.error(`Input CSS file not found: ${inputFile}`);
				return '';
			}

			for (const dir of [cacheDirectory, outputDirectory]) {
				if (!fs.existsSync(dir)) {
					fs.mkdirSync(dir, { recursive: true });
				}
			}

			const inputCSS = await fs.promises.readFile(inputFile, 'utf8');
			const hash = crypto.createHash('sha256').update(inputCSS).digest('hex').slice(0, 10);

			const cacheKey = `${hash}-${cssPath.replace(/[\/\\]/g, '-')}`;
			const cachePath = path.join(cacheDirectory, cacheKey);

			let processedCSS;
			if (fs.existsSync(cachePath)) {
				processedCSS = await fs.promises.readFile(cachePath, 'utf8');
			} else {
				processedCSS = cleanCSS.minify(inputCSS).styles;
				await fs.promises.writeFile(cachePath, processedCSS);
			}

			const parsedPath = path.parse(inputFile);
			const finalFilename = path.join(outputDirectory, `${parsedPath.name}-${hash}${parsedPath.ext}`);
			await fs.promises.writeFile(finalFilename, processedCSS);

			// Brotli compression with renaming
			const compressedFilename = path.join(outputDirectory, `${parsedPath.name}-${hash}-compressed${parsedPath.ext}`);
			// Only compress to Brotli if the file doesn't exist
			if (!fs.existsSync(compressedFilename)) {
				// Set our zlib Brotli options here. Note that Brotli quality is set via
				// params; the gzip-style "level" option is ignored by the Brotli APIs
				const brotliOptions = {
					params: {
						[zlib.constants.BROTLI_PARAM_QUALITY]: brotliCompressionLevel
					}
				};
				// zlib does its compression magic!
				const brotliBuffer = zlib.brotliCompressSync(Buffer.from(processedCSS), brotliOptions);
				// Write the compressed code to the output filename defined above
				await fs.promises.writeFile(compressedFilename, brotliBuffer);
			}

			const hashedPath = compressedFilename.replace(path.join('./_site'), '').replace(/\\/g, '/');
			return `<link rel="stylesheet" href="${hashedPath}">`;
		} catch (err) {
			console.error("Error processing CSS:", err);
			return "";
		}
	});
}

The Gist for the code above is here.

The above code will do the following:

On Development: It will simply output:

<link rel="stylesheet" href="/css/index.css">

Exactly as it did in the previous version.

On Production: This is the output that will be seen:

<link rel="stylesheet" href="/css/index-b9fcfe85ef-compressed.css">

In this instance, the CSS file has been Brotli compressed to 11, but it has also been renamed with the -compressed suffix and the .br extension has been removed. Technically, you don't need the suffix, but I've added it to emphasise (and remind myself) that it isn't a "standard" CSS file in plain text.

As I mentioned in the previous CSS Shortcode blog post, you'll now be able to use long cache-life headers like:

/css/*
Cache-Control: public, max-age=31536000, immutable

Safe in the knowledge that updating your CSS will generate a brand-new filename, effectively nullifying the existing cache. Check out the previous post if you want more details on the above _headers file code for Cloudflare.

Summary

Fantastic! You made it to the end of yet another 11ty blog post of mine, congrats! I'm not sure there's much else I can do to optimise my CSS delivery to users at the moment.

Maybe preloading, or the Speculation Rules API? Cloudflare already supports the latter, and it's currently enabled on my site. This functionality is called "Speed Brain" in the Cloudflare dashboard (under "Speed", then "Content Optimization"). It's worth noting that it's currently in Beta, but I don't think I need to do anything manually to use it, since Cloudflare rolled it out to all plans (including the free plan!) back in September 2024.

But perhaps there are other optimisations I can make? Unknown unknowns, etc… As always, thanks for reading. I hope you found the post useful, and if you have any feedback or comments, let me know!


Post changelog:

  • 23/01/25: Initial post published.

Webmentions

Mentioned on:

  1. https://pressrex.com/6-css-snippets-audibles-ux-insights-350ms-lcp/
  2. https://pressrex.com/6-css-snippets-audibles-ux-insights-350ms-lcp-2/
  3. https://frontenddogma.com/

A webmention is a way to notify this site when you've linked to it from your own site.