Using an 11ty Shortcode to craft a custom CSS pipeline

I know I'm still a bit of an 11ty n00b, so I hope this isn't frowned upon in the community, but when rebuilding my blog with 11ty I decided not to use the standard Bundle plugin that was added in v3.0.0. Instead, I wrote a custom Shortcode to customise my CSS output. In this blog post I'll walk through the code in the hope that it helps others and, more importantly, gathers feedback from the community on any improvements that could be made. Please do let me know if you have any feedback.

My Requirements

First, what am I hoping to achieve with this Shortcode solution?

  • Maintain the auto-reload functionality that comes with 11ty when you modify your CSS locally (this is a super helpful feature I've been a big fan of since LiveReload was released back in February 2011, and later BrowserSync).
  • Filename fingerprinting, so I can use long-lived Cache-Control response headers (e.g. max-age=31536000, immutable) without having to worry about serving stale CSS from the cache.
  • The ability to optimise my CSS output using the excellent clean-css Node package. It does more than just minify! Check out all the optimisations it can help you with if you haven't seen them (there's a small example just after this list).
  • Minimal impact on the 11ty build process by leveraging build caching in some way.
  • Set it and forget it: once it's added, it requires minimal (or no) ongoing maintenance, apart from updating dependencies every so often!
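
To give a flavour of what clean-css can do beyond stripping whitespace, here's a tiny standalone example (not part of the pipeline below, and the exact output may vary slightly between clean-css versions) showing level 2 optimisations merging long-hand properties and adjacent rules:

import CleanCSS from "clean-css";

// Two rules for the same selector, with long-hand margins and a zero value with a unit
const input = `
.card { margin-top: 10px; margin-right: 10px; margin-bottom: 10px; margin-left: 10px; }
.card { padding: 0px; }
`;

// level: 2 enables rule/shorthand merging on top of the usual level 1 clean-ups
const { styles } = new CleanCSS({ level: 2 }).minify(input);

console.log(styles); // roughly: .card{margin:10px;padding:0}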

The Code

I'll quit the waffling and just show you the code I currently have building my CSS for this site. This code sits in its own file called css-manipulation.js in a _helpers directory in my 11ty root (i.e. the same level as _data).
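
For orientation, this is roughly how the relevant files sit in my project (the directory names come from the code and usage sections below; your layout may differ):

.
├── _data/
│   └── env.js
├── _helpers/
│   └── css-manipulation.js
├── public/
│   └── css/
│       └── index.css
├── .cache/              (disk cache for minified CSS, created by the shortcode)
├── _site/               (11ty build output)
├── .env
└── eleventy.config.js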

/**
 * CSS Manipulation Module for Eleventy
 *
 * This module provides CSS processing, minification, and compression functionality
 * for an Eleventy static site generator. It handles:
 * - CSS minification using CleanCSS
 * - Content-based file hashing for cache busting
 * - Brotli compression for optimized delivery
 * - Build-time caching to avoid redundant processing
 * - Concurrent request handling to prevent duplicate work
 */

// Node.js built-in modules for file system operations, path manipulation, and hashing
import crypto from "crypto"; // Used to create SHA-256 hashes of CSS content for cache busting
import fs from "fs"; // File system operations (reading/writing files, checking existence)
import path from "path"; // Cross-platform path manipulation and joining
// Node.js zlib module for Brotli compression
// Brotli is a modern compression algorithm that provides better compression ratios than gzip
// (constants is needed to set the Brotli quality level via BROTLI_PARAM_QUALITY)
import { brotliCompressSync, constants as zlibConstants } from "zlib";

// CleanCSS library for CSS minification and optimization
// This performs a wide range of optimizations to reduce file size
import CleanCSS from "clean-css";

// Environment configuration to check if we're in local development mode
import env from "../_data/env.js";

/**
 * CleanCSS Configuration
 *
 * CleanCSS performs multi-level CSS optimization. This configuration enables
 * comprehensive minification while maintaining CSS functionality.
 *
 * Level 1 optimizations (basic cleanups):
 * - cleanupCharsets: Remove unnecessary @charset declarations
 * - normalizeUrls: Normalize and optimize URL() references
 * - optimizeBackground: Optimize background properties
 * - optimizeBorderRadius: Shorten border-radius values
 * - optimizeFilter: Optimize filter() functions
 * - optimizeFont: Optimize font-family declarations
 * - optimizeFontWeight: Convert font-weight to numeric values
 * - optimizeOutline: Optimize outline properties
 * - removeEmpty: Remove empty CSS rules
 * - removeNegativePaddings: Remove invalid negative padding values
 * - removeQuotes: Remove unnecessary quotes from URLs and identifiers
 * - removeWhitespace: Remove unnecessary whitespace
 * - replaceMultipleZeros: Shorten multiple zero values
 * - replaceTimeUnits: Optimize time unit values (e.g., 0s → 0)
 * - replaceZeroUnits: Remove units from zero values (e.g., 0px → 0)
 * - roundingPrecision: Round numeric values to 2 decimal places
 * - selectorsSortingMethod: Sort selectors in a standard order
 * - specialComments: Remove all comments (set to "none")
 * - tidyAtRules: Clean up @-rules
 * - tidyBlockScopes: Clean up block scopes
 * - tidySelectors: Clean up and optimize selectors
 * - transform: Custom transformation function (empty, no custom transforms)
 *
 * Level 2 optimizations (advanced restructuring):
 * - mergeAdjacentRules: Combine adjacent CSS rules with same selectors
 * - mergeIntoShorthands: Convert long-form properties to shorthand
 * - mergeMedia: Combine @media rules with same conditions
 * - mergeNonAdjacentRules: Merge duplicate rules even if not adjacent
 * - mergeSemantically: Disabled to avoid potentially breaking semantic merges
 * - overrideProperties: Remove overridden properties
 * - removeEmpty: Remove empty rules at level 2
 * - reducePadding: Optimize padding values
 * - reducePositions: Optimize position values
 * - reduceTimingFunctions: Optimize animation timing functions
 * - reduceTransforms: Optimize transform functions
 * - restructureRules: Disabled to avoid aggressive restructuring that might break CSS
 * - skipProperties: Empty array means optimize all properties
 *
 * Format options (output formatting):
 * - All breaks set to false: Output as single line (no line breaks)
 * - indentBy: 0 (no indentation for minimal file size)
 * - indentWith: "space" (if indentation were enabled)
 * - All spaces set to false: Remove unnecessary spaces
 * - wrapAt: false (no line wrapping)
 *
 * Other options:
 * - inline: ["none"] - Don't inline any @import rules
 * - rebase: false - Don't rebase URLs (keep original paths)
 * - returnPromise: false - Use synchronous API (we handle async ourselves)
 */
const cleanCSS = new CleanCSS({
	level: {
		1: {
			cleanupCharsets: true,
			normalizeUrls: true,
			optimizeBackground: true,
			optimizeBorderRadius: true,
			optimizeFilter: true,
			optimizeFont: true,
			optimizeFontWeight: true,
			optimizeOutline: true,
			removeEmpty: true,
			removeNegativePaddings: true,
			removeQuotes: true,
			removeWhitespace: true,
			replaceMultipleZeros: true,
			replaceTimeUnits: true,
			replaceZeroUnits: true,
			roundingPrecision: 2,
			selectorsSortingMethod: "standard",
			specialComments: "none",
			tidyAtRules: true,
			tidyBlockScopes: true,
			tidySelectors: true,
			transform: function () {},
		},
		2: {
			mergeAdjacentRules: true,
			mergeIntoShorthands: true,
			mergeMedia: true,
			mergeNonAdjacentRules: true,
			mergeSemantically: false,
			overrideProperties: true,
			removeEmpty: true,
			reducePadding: true,
			reducePositions: true,
			reduceTimingFunctions: true,
			reduceTransforms: true,
			restructureRules: false,
			skipProperties: [],
		},
	},
	format: {
		breaks: {
			afterAtRule: false,
			afterBlockBegins: false,
			afterBlockEnds: false,
			afterComment: false,
			afterProperty: false,
			afterRuleBegins: false,
			afterRuleEnds: false,
			beforeBlockEnds: false,
			betweenSelectors: false,
		},
		indentBy: 0,
		indentWith: "space",
		spaces: {
			aroundSelectorRelation: false,
			beforeBlockBegins: false,
			beforeValue: false,
		},
		wrapAt: false,
	},
	inline: ["none"],
	rebase: false,
	returnPromise: false,
});

/**
 * Default Brotli Compression Level
 *
 * Brotli compression levels range from 0-11:
 * - 0: Fastest, least compression
 * - 11: Slowest, best compression (maximum)
 *
 * Level 11 is used by default for production builds where file size matters
 * more than compression time. This can be overridden via BROTLI_COMPRESSION_LEVEL
 * environment variable.
 */
const DEFAULT_BROTLI_COMPRESSION_LEVEL = 11;

/**
 * In-Memory Build Cache
 *
 * This Map caches CSS processing results during a single build run.
 * It serves two purposes:
 * 1. Prevents re-processing the same CSS file when multiple pages reference it
 * 2. Handles concurrent requests by storing Promises during processing
 *
 * Structure:
 * - Key: cssPath (the original CSS file path)
 * - Value: Either a Promise (if processing is in progress) or an object with:
 *   - hash: Content hash of the CSS file
 *   - processedCSS: The minified CSS string
 *   - htmlOutput: The final HTML <link> tag string
 *
 * This cache is cleared between builds (per-process, not persisted).
 */
const cssBuildCache = new Map();

/**
 * Directory Creation Tracking
 *
 * Tracks which directory combinations have been created during this build.
 * This prevents redundant fs.existsSync() and fs.mkdirSync() calls when
 * multiple CSS files are processed.
 *
 * Structure: Set of strings like "cacheDir:outputDir"
 */
const directoriesCreated = new Set();

/**
 * Main Function: manipulateCSS
 *
 * Registers a custom Eleventy shortcode that processes CSS files.
 * The shortcode can be used in templates via the "customCSS" tag (see the Usage section below).
 *
 * Processing Pipeline:
 * 1. Check if in local development mode (skip processing)
 * 2. Check in-memory cache (avoid duplicate processing)
 * 3. Read and hash the CSS file
 * 4. Check disk cache for minified CSS
 * 5. Minify CSS if not cached
 * 6. Write processed CSS with hash-based filename
 * 7. Compress with Brotli
 * 8. Return HTML <link> tag with hashed filename for cache busting
 *
 * @param {object} eleventyConfig - The Eleventy configuration object
 */
export function manipulateCSS(eleventyConfig) {
	/**
	 * Register the "customCSS" shortcode with Eleventy
	 * This makes it available in all templates as the "customCSS" shortcode tag.
	 *
	 * @param {string} cssPath - The relative path to the CSS file (from /public directory)
	 * @returns {Promise<string>} - HTML string containing a <link> tag pointing to the processed CSS
	 */
	eleventyConfig.addShortcode("customCSS", async function (cssPath) {
		/**
		 * Stage 1: Local Development Short-Circuit
		 *
		 * In local development, skip all processing to speed up builds.
		 * Just return a simple <link> tag pointing to the original file.
		 * This allows for faster iteration during development.
		 */
		if (env.isLocal) {
			return `<link rel="stylesheet" href="${cssPath}">`;
		}

		/**
		 * Stage 2: In-Memory Cache Check
		 *
		 * Check if we've already processed this CSS file during this build.
		 * This handles the common case where multiple pages reference the same CSS file.
		 *
		 * The cache can contain:
		 * - A Promise: Processing is currently in progress, wait for it
		 * - An object: Processing is complete, return the cached HTML output
		 */
		if (cssBuildCache.has(cssPath)) {
			const cached = cssBuildCache.get(cssPath);
			// If it's a promise (in progress), wait for it
			// This handles concurrent requests - multiple pages calling this shortcode
			// simultaneously for the same CSS file will all wait for the same processing
			if (cached instanceof Promise) {
				return await cached;
			}
			// Otherwise return cached result immediately (already HTML string)
			// This is the fast path for subsequent page builds
			return cached.htmlOutput;
		}

		/**
		 * Stage 3: Initialize Processing
		 *
		 * Set up file paths and configuration for processing.
		 */
		// Construct full paths for input/output/cache directories
		const inputFile = path.join("./public", cssPath); // Source CSS file location
		const outputDirectory = path.join("./_site", "css"); // Where processed CSS goes (build output)
		const cacheDirectory = path.join("./.cache", "css"); // Where minified CSS cache is stored

		/**
		 * Stage 4: Get Brotli Compression Level
		 *
		 * Read compression level from environment variable or use default.
		 * This allows fine-tuning compression vs. speed trade-off per environment.
		 * Higher levels = better compression but slower processing.
		 */
		const brotliCompressionLevel = parseInt(
			process.env.BROTLI_COMPRESSION_LEVEL || DEFAULT_BROTLI_COMPRESSION_LEVEL,
			10, // Base 10 parsing
		);

		/**
		 * Stage 5: Create Processing Promise
		 *
		 * Wrap all processing in an async IIFE (Immediately Invoked Function Expression).
		 * This allows us to:
		 * 1. Handle concurrent requests by caching the Promise itself
		 * 2. Catch errors at the processing level
		 * 3. Return the same Promise to multiple concurrent callers
		 */
		const processingPromise = (async () => {
			try {
				/**
				 * Stage 5.1: Validate Input File Exists
				 *
				 * Check if the source CSS file exists before attempting to process it.
				 * If missing, log an error and return empty string (fails gracefully).
				 */
				if (!fs.existsSync(inputFile)) {
					console.error(`[customCSS] CSS file not found: ${inputFile}`);
					return "";
				}

				/**
				 * Stage 5.2: Ensure Directories Exist
				 *
				 * Create output and cache directories if they don't exist.
				 * Uses a Set to track which directory combinations have been created
				 * during this build to avoid redundant checks and operations.
				 *
				 * recursive: true ensures parent directories are created if needed.
				 */
				const dirKey = `${cacheDirectory}:${outputDirectory}`;
				if (!directoriesCreated.has(dirKey)) {
					for (const dir of [cacheDirectory, outputDirectory]) {
						if (!fs.existsSync(dir)) {
							fs.mkdirSync(dir, { recursive: true });
						}
					}
					directoriesCreated.add(dirKey);
				}

				/**
				 * Stage 5.3: Read and Hash CSS File
				 *
				 * Read the source CSS file and generate a content-based hash.
				 * The hash is used for:
				 * - Cache busting (filename changes when content changes)
				 * - Disk cache key (identify if we've seen this content before)
				 *
				 * SHA-256 is used for strong collision resistance.
				 * Only the first 10 characters of the hash are used (sufficient for cache busting).
				 *
				 */
				const inputCSS = await fs.promises.readFile(inputFile, "utf8");
				const hash = crypto
					.createHash("sha256") // Use SHA-256 algorithm
					.update(inputCSS) // Hash the CSS content
					.digest("hex") // Get hexadecimal representation
					.slice(0, 10); // Take first 10 chars (sufficient for uniqueness)
				/**
				 * Stage 5.4: Generate Cache Key and Path
				 *
				 * Create a unique cache key combining:
				 * - The content hash (identifies file contents)
				 * - The file path (normalized to avoid path separator issues)
				 *
				 * Combining the two means a file is only re-minified when its content
				 * changes, while different files that happen to contain identical CSS
				 * still get their own cache entries.
				 */
				const cacheKey = `${hash}-${cssPath.replace(/[/\\]/g, "-")}`;
				const cachePath = path.join(cacheDirectory, cacheKey);

				/**
				 * Stage 5.5: Minify CSS (or Load from Disk Cache)
				 *
				 * Check if we've already minified this exact CSS content before.
				 * Disk cache persists between builds, so unchanged CSS files don't
				 * need re-minification even after restarting the build process.
				 *
				 * If cached:
				 * - Load the pre-minified CSS from disk
				 * - Skip the expensive minification step
				 *
				 * If not cached:
				 * - Run CleanCSS minification (expensive operation)
				 * - Save result to disk cache for next time
				 */
				let processedCSS;
				if (fs.existsSync(cachePath)) {
					// Cache hit - load pre-minified CSS
					processedCSS = await fs.promises.readFile(cachePath, "utf8");
				} else {
					// Cache miss - minify and save
					processedCSS = cleanCSS.minify(inputCSS).styles;
					await fs.promises.writeFile(cachePath, processedCSS);
				}

				/**
				 * Stage 5.6: Write Processed CSS to Output Directory
				 *
				 * Write the minified CSS to the build output directory with a
				 * hash-based filename. The hash in the filename enables:
				 * - Cache busting (browser cache invalidates when content changes)
				 * - Long-term caching (files with same hash are unchanged)
				 *
				 * Filename format: original-name-hash.css
				 * Example: main-a1b2c3d4e5.css
				 *
				 * Only write if file doesn't exist to avoid redundant disk I/O
				 * (useful when multiple pages reference the same CSS).
				 */
				const parsedPath = path.parse(inputFile); // Parse original filename
				const finalFilename = path.join(
					outputDirectory,
					`${parsedPath.name}-${hash}${parsedPath.ext}`, // name-hash.css
				);
				if (!fs.existsSync(finalFilename)) {
					await fs.promises.writeFile(finalFilename, processedCSS);
				}

				/**
				 * Stage 5.7: Brotli Compression
				 *
				 * Compress the minified CSS using Brotli algorithm.
				 * Brotli provides better compression than gzip, especially for text.
				 *
				 * The .br extension indicates Brotli-compressed files.
				 * Web servers can serve these directly to browsers that support Brotli
				 * (most modern browsers do via Accept-Encoding header).
				 *
				 * Compression happens synchronously (brotliCompressSync) because:
				 * - It's fast enough for build-time processing
				 * - Simplifies error handling
				 * - Build processes typically prefer synchronous operations
				 *
				 * Only compress if file doesn't exist (avoid redundant compression).
				 */
				const brotliFilename = `${finalFilename}.br`; // Add .br extension
				if (!fs.existsSync(brotliFilename)) {
					const brotliOptions = {
						params: {
							// Brotli quality (0-11) is set via the zlib constants API,
							// not a top-level "level" option (that is a gzip/deflate option)
							[zlibConstants.BROTLI_PARAM_QUALITY]: brotliCompressionLevel,
						},
					};
					const brotliBuffer = brotliCompressSync(
						Buffer.from(processedCSS), // Convert string to Buffer
						brotliOptions,
					);
					await fs.promises.writeFile(brotliFilename, brotliBuffer);
				}

				/**
				 * Stage 5.8: Generate Final HTML Output
				 *
				 * Create the HTML <link> tag that will be inserted into the page.
				 * The path is relative to the site root and uses forward slashes
				 * (normalized for web URLs, works on all platforms).
				 *
				 * Note: The link points to the uncompressed .css file. The .br file is
				 * written alongside it so that a web server which supports pre-compressed
				 * assets can serve it (via Accept-Encoding content negotiation) without
				 * any change to the markup.
				 */
				const hashedPath = finalFilename
					.replace(path.join("./_site"), "") // Remove build directory prefix
					.replace(/\\/g, "/"); // Normalize path separators for web URLs

				const result = `<link rel="stylesheet" href="${hashedPath}">`;

				/**
				 * Stage 5.9: Cache Result in Memory
				 *
				 * Store the processing result in the in-memory cache for subsequent
				 * calls during this build. This includes:
				 * - hash: For reference/debugging
				 * - processedCSS: In case we need the CSS string later
				 * - htmlOutput: The final HTML tag (what we return)
				 *
				 * Replace the Promise in cache with the actual result object.
				 */
				cssBuildCache.set(cssPath, { hash, processedCSS, htmlOutput: result });
				return result;
			} catch (error) {
				/**
				 * Error Handling
				 *
				 * Catch any errors during processing, log them, and fail gracefully.
				 * Return an empty string to prevent one CSS file error from breaking the entire build.
				 */
				console.error(`[customCSS] Failed to process ${cssPath}:`, error);
				return "";
			}
		})();

		/**
		 * Stage 6: Handle Concurrent Requests
		 *
		 * Store the processing Promise in the cache BEFORE awaiting it.
		 * This ensures that if multiple pages call this shortcode simultaneously
		 * for the same CSS file, they all get the same Promise and wait for
		 * the same processing to complete (no duplicate work).
		 *
		 * Once processing completes, the Promise in cache is replaced with
		 * the result object (see Stage 5.9).
		 */
		cssBuildCache.set(cssPath, processingPromise);
		return await processingPromise;
	});
}

Having the code only in a blog post is a little clumsy, so I've created a gist on GitHub here for easier reading and copying and pasting (should you wish to use or modify it yourself).

Usage

So how would you use this code in 11ty? Let's go through each of the files you'd need to modify.

eleventy.config.js

Just your standard ESM 11ty config file, with an import and a call to the function.

// Custom CSS manipulation
import { manipulateCSS } from './_helpers/css-manipulation.js';

export default async function(eleventyConfig) {
	// All your other 11ty config above...
	// execute the CSS manipulation
	manipulateCSS(eleventyConfig);
	// All your other 11ty config below...
}

.env

It sits in the 11ty root directory and is added to .gitignore.

ELEVENTY_ENV=development
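
The code above imports env from _data/env.js, which isn't shown in this post. As a rough sketch (not the exact file used on this site), it only needs to expose an isLocal flag derived from that environment variable, and it assumes ELEVENTY_ENV has already been loaded into process.env (for example via the dotenv package or Node's --env-file flag):

// _data/env.js — hypothetical sketch
const environment = process.env.ELEVENTY_ENV || "production";

export default {
	environment,
	// The shortcode short-circuits all processing when this is true
	isLocal: environment === "development",
};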

HTML template

I personally keep my HTML <head> in its own Nunjucks partial, but this will work with any template setup. The Shortcode below uses Nunjucks syntax because that's what I use on this blog (the spaces inside the tags are only there to stop 11ty processing the example and printing its output, so remove them in your own templates). index.css is the one and only CSS file I use on this blog, and it's passed into the Shortcode as its argument. It's either used unmodified (in development) or manipulated (in production).

<head>
	{ % customCSS "/css/index.css" % }
</head>

Output

For my development environment the HTML output is:

<head>
	<link rel="stylesheet" href="/css/index.css">
</head>

There's no fingerprinting until the site is built for production (in my case on Cloudflare Pages), where the output looks like this:

<head>
	<link rel="stylesheet" href="/css/index-b9fcfe85ef.css">
</head>
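
On disk, the production build output for that one CSS file ends up looking something like this (the .br file sits alongside the .css so a web server that supports pre-compressed assets can serve it to browsers that accept Brotli):

_site/
└── css/
    ├── index-b9fcfe85ef.css
    └── index-b9fcfe85ef.css.br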

As you can see, the index.css file that contains all the CSS for this site now has a unique 10-character hash suffix. If I were to change a single byte of data in the CSS file, it would generate an entirely new hash. For those wondering, a SHA-256 hash truncated to its first 10 hexadecimal characters still has 16^10 = 1,099,511,627,776 unique values, so the likelihood of a collision is pretty slim!
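
If you want to sanity-check that yourself, here's a tiny standalone snippet (separate from the pipeline) that mirrors the hashing step and the arithmetic:

import crypto from "crypto";

const css = "body { color: red; }"; // any CSS content
const hash = crypto.createHash("sha256").update(css).digest("hex").slice(0, 10);

console.log(hash);     // a 10-character hex prefix that changes whenever the CSS changes
console.log(16 ** 10); // 1099511627776 possible 10-character hex values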

Since my CSS file is now guaranteed to have a unique, content-based name, I can use the following response headers for my CSS:

_headers file

We can now start to use Cache-Control directives that maximise the time that the CSS is stored in a user's browser cache.

/css/*
  Cache-Control: public, max-age=31536000, immutable

Here we are telling the user's browser that:

  1. public: This response can be cached by any cache; it isn't restricted to a private cache (like a user's browser cache). Examples of other caches are CDNs and proxies.
  2. max-age=31536000: 31,536,000 is the number of seconds in a year, so this directive tells caches that they can store and reuse this response for up to a year without revalidating it.
  3. immutable: This is a hint to the browser or cache system that the content will never change for the lifetime of the cache. Even if the user reloads the page, the browser won't bother checking with the server for updates.

There's a lot involved in caching on the web, so if you want to know a lot more about this fascinating topic, check out Cache-Control for Civilians by Harry Roberts.

For those wondering what happens to the old, "orphaned" files in the cache, in this instance:

  • The old file version will stay in the cache for up to 1 year (until the cache duration expires).
  • But it's more than likely that the browser will clean up the cache before the year is up, to free storage space. This is especially likely on devices with limited resources, like older phones, which need to be more aggressive with their storage management.
  • In either case, this isn't anything a user has to worry about. Browsers take care of it automatically, although a user can manually clear their browser cache should they wish to.

Summary

There we go, another 11ty blog post! We've reviewed my current CSS setup, which works well for now, but as mentioned earlier, feel free to suggest improvements or point out anything that strays from the "11ty way". After all, the path to true enlightenment starts with uncovering the unseen gaps in our (my) understanding. I read that last sentence in Master Yoda's voice! As always, thanks for taking the time to read the post. I hope you found it useful and informative, and if you have any comments or feedback, please do let me know.


Post changelog:

  • 12/01/25: Initial post published.
  • 02/11/25: Updated post and code after the shortcode was found to run for every page generated in the build process. This was inefficient and it is now limited to a single run per build.
