How to generate blurry placeholders for your images

How I eliminated layout shift and improved perceived performance by generating 20px base64 placeholders automatically at build time.

Pixel art of a person sitting and watching an old television surrounded by plants.

When building a portfolio site heavily reliant on visuals, you hit a specific performance wall: images take time to load.

On a fast connection, you might not notice it. But on a spotty mobile network, the user experience degrades fast. You get empty white boxes, or worse, massive layout shifts as images pop in and push text around.

The standard solutions are lazy loading and explicit aspect ratios. They help with the layout shift, but they don’t solve the empty-box problem.

I wanted something better: instant visual feedback. A blurry, low-resolution version of the image that loads immediately with the HTML, then cross-fades into the real image. This is a pattern used by Medium, Unsplash, and Next.js.

Here is how I built an automated pipeline to handle it.

The Strategy: 20px Base64 Thumbnails

The logic is straightforward:

  1. Take every image in the project.
  2. Generate a tiny (20px) version of it.
  3. Convert that tiny version to a base64 string.
  4. Inject that string directly into the component’s CSS.

Because the data is inline (base64), it requires zero extra HTTP requests. Because it is 20px, the payload is negligible.

The Build Script

I needed a script that runs before the build, scans my assets, and generates a metadata file. I used a two-tier approach for generation: ffmpeg for speed and quality, falling back to sharp if that fails.

Here is the core generation logic:

import { execSync } from 'node:child_process';
import sharp from 'sharp';

async function generatePlaceholder(imagePath: string): Promise<string | null> {
  try {
    // Try ffmpeg first (better quality/speed): scale to 20px wide, emit JPEG, base64-encode
    const base64String = execSync(
      `ffmpeg -nostdin -i "${imagePath}" -vf "scale=20:-1" -f image2pipe -c:v mjpeg - 2>/dev/null | base64 -w 0`,
      { encoding: 'utf8', maxBuffer: 10 * 1024 * 1024 }
    ).trim();
    return `data:image/jpeg;base64,${base64String}`;
  } catch (error) {
    // Fall back to sharp if ffmpeg is missing or the command fails
    try {
      const buffer = await sharp(imagePath)
        .resize(20, 20, { fit: 'inside', withoutEnlargement: true })
        .jpeg({ quality: 50 })
        .toBuffer();
      const base64 = buffer.toString('base64');
      return `data:image/jpeg;base64,${base64}`;
    } catch (sharpError) {
      console.error(`Failed to generate placeholder for ${imagePath}`, sharpError);
      return null;
    }
  }
}

This output is saved to src/data/image-metadata.json. It maps filenames to their base64 placeholders and a content hash (more on that later).

{
  "sarthak-photo.avif": {
    "placeholder": "data:image/jpeg;base64,/9j/4AAQSkZJRg...",
    "hash": "22150ba448a81f2ea638a5f2a980268b061cb32758e1a818eb2c43cda8a76db4"
  }
}
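
The driver that produces this file is a loop over the image directory. Here’s a minimal sketch, assuming a single flat images folder for brevity (the real script also handles multiple content folders and the caching described below); the paths and the run() entry point are illustrative, not the exact ones from my project:

// generate-placeholders.ts — a minimal driver sketch (paths and structure are assumptions)
import fs from 'node:fs';
import path from 'node:path';

const IMAGE_DIR = 'src/assets/images';
const OUT_FILE = 'src/data/image-metadata.json';

async function run() {
  const metadata: Record<string, { placeholder: string; hash: string }> = {};

  const files = fs
    .readdirSync(IMAGE_DIR)
    .filter((f) => /\.(avif|jpe?g|png|webp)$/i.test(f));

  for (const filename of files) {
    const fullPath = path.join(IMAGE_DIR, filename);
    const placeholder = await generatePlaceholder(fullPath);
    if (!placeholder) continue;

    metadata[filename] = {
      placeholder,
      hash: getFileHash(fullPath), // covered in "Optimizing the Build Process" below
    };
  }

  fs.writeFileSync(OUT_FILE, JSON.stringify(metadata, null, 2));
}

run();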

Integrating with Astro Components

To render them, I updated my PostImage component to look up the placeholder for each image.

The trick is using the placeholder as a CSS background-image on the container div. This ensures it displays immediately, occupying the exact same space the final image will fill.

---
import imageMetadata from '@/data/image-metadata.json';
import { Picture } from 'astro:assets';

const { thumbnail, image, ...props } = Astro.props;

// Lookup metadata by filename
const filename = thumbnail;
const metadata = filename
  ? (imageMetadata as Record<string, { placeholder: string }>)[filename]
  : null;
const placeholder = metadata?.placeholder || '';
---

<div
  class="overflow-hidden rounded-xl shadow-lg"
  style={placeholder
    ? `background-image: url('${placeholder}'); background-size: cover; background-position: center;`
    : ''}
>
  <Picture src={image} class="transition-opacity duration-300" {...props} />
</div>

When the high-res Picture loads, it simply covers the background placeholder.
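
If you want a true cross-fade rather than an instant swap, you can also start the img transparent and remove the transparency once it has finished loading, which pairs with the transition-opacity class already on the Picture. A small client-side sketch of that idea (the opacity-0 class is an assumption here, not something the component above adds by default):

// Fade images in once they finish loading.
document.querySelectorAll<HTMLImageElement>('img.opacity-0').forEach((img) => {
  const reveal = () => img.classList.remove('opacity-0');
  if (img.complete) {
    reveal(); // already in cache, show immediately
  } else {
    img.addEventListener('load', reveal, { once: true });
  }
});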

Optimizing the Build Process

Running image processing on every build is slow. To keep my deployment times under control, I implemented a caching system using file hashing.

Before processing an image, the script calculates a SHA256 hash based on the file’s modification time, size, and a sample of its content.

import fs from 'node:fs';
import { createHash } from 'node:crypto';

function getFileHash(filePath: string): string {
  const stats = fs.statSync(filePath);
  const content = fs.readFileSync(filePath);
  // Combine mtime, size, and the first 500 bytes of content (1000 hex chars)
  return createHash('sha256')
    .update(`${stats.mtime.getTime()}-${stats.size}-${content.toString('hex').substring(0, 1000)}`)
    .digest('hex');
}

If the hash in image-metadata.json matches the file on disk, we skip it. In practice the cache hit rate is above 95%, which keeps processing of the entire image library under two seconds.
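
Inside the driver loop, the check is just a comparison against the previous run’s output. A sketch, assuming the existing image-metadata.json has been loaded into a previous object before the loop:

// Per-image: reuse the old entry when the content hash is unchanged.
const hash = getFileHash(fullPath);
const cached = previous[filename];

if (cached && cached.hash === hash) {
  metadata[filename] = cached; // cache hit: skip ffmpeg/sharp entirely
} else {
  const placeholder = await generatePlaceholder(fullPath);
  if (placeholder) metadata[filename] = { placeholder, hash };
}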

Handling Collisions

One real-world edge case I hit immediately: filename conflicts. If you have header.jpg in two different blog folders, a flat JSON map will break.

I added a pre-validation step to the script. It scans all directories and throws a hard error if it detects duplicate filenames, forcing me to name files descriptively (which is better for SEO anyway).
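
Building that map is a recursive walk over the content directories, keyed by bare filename. A sketch (the directory list is an assumption; adjust to your own layout):

import fs from 'node:fs';
import path from 'node:path';

// Where images can live in this project.
const CONTENT_DIRS = ['src/assets/images', 'src/content'];
const filenameMap = new Map<string, string[]>();

function collect(dir: string) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      collect(fullPath);
    } else if (/\.(avif|jpe?g|png|webp)$/i.test(entry.name)) {
      const paths = filenameMap.get(entry.name) ?? [];
      paths.push(fullPath);
      filenameMap.set(entry.name, paths);
    }
  }
}

CONTENT_DIRS.forEach(collect);

With the map built, detecting conflicts is a single pass: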

const conflicts: string[] = [];
for (const [filename, paths] of filenameMap.entries()) {
  if (paths.length > 1) {
    conflicts.push(`Duplicate: ${filename} in ${paths.join(', ')}`);
  }
}

if (conflicts.length > 0) {
  console.error(conflicts.join('\n'));
  process.exit(1); // Fail the build
}

The Result

The impact on user experience is immediate.

  1. Zero Layout Shift: The container is painted with the placeholder immediately, and the Picture component emits explicit width and height, so the space the image will occupy is reserved before the full-resolution file arrives.
  2. Perceived Speed: The user sees something instantly. The page feels responsive, even if the actual heavy assets are still coming down the pipe.
  3. Developer Experience: I don’t have to manually create thumbnails or configure lazy loading plugins. I drop an image in the folder, reference it in my markdown, and the build system handles the rest.

This is the kind of automation I love: high impact on the frontend, zero friction in the workflow.
