Why I Didn't Use Google Programmable Search
Google's search API sounds perfect for static sites. It isn't. Here's why the pricing model and indexing delays make it impractical for most blogs.
When I started researching search solutions for this blog, Google Programmable Search (formerly Custom Search Engine) seemed like the obvious choice.
Google already indexes my site. Their search is best-in-class. There’s a JSON API. What could go wrong?
Everything.
The Implementation
I’ll walk through the implementation first because it’s actually pretty clean. Then I’ll explain why I threw it away.
1. The Search Metadata Endpoint
Google’s API returns links and snippets, but we need richer metadata for our search results (images, formatted dates, descriptions). So we create a metadata endpoint that maps paths to content data.
```ts
// src/pages/search-metadata.json.ts
import type { APIContext } from 'astro';
import { getCollection } from 'astro:content';

type SearchMetadata = Record<
  string,
  {
    title: string;
    description: string;
    date: string;
    image?: string;
  }
>;

export async function GET(_context: APIContext) {
  const metadata: SearchMetadata = {};

  const blogPosts = await getCollection('blog', ({ data }) => !data.draft);
  for (const post of blogPosts) {
    const path = `/blog/${post.id}`;
    metadata[path] = {
      title: post.data.title,
      description: post.data.description,
      date: new Date(post.data.date).toLocaleDateString('en-US', {
        year: 'numeric',
        month: 'short',
        day: 'numeric',
      }),
      image: post.data.image?.src,
    };
  }

  // Add works, ventures, newsletters...

  return new Response(JSON.stringify(metadata), {
    headers: {
      'Content-Type': 'application/json',
      'Cache-Control': 'public, max-age=3600',
    },
  });
}
```
2. The Search Utility
The core logic is simple. Call the Google API, normalize the results, and merge with our local metadata.
```ts
// src/utils/google-search.ts
export type GoogleSearchItem = {
  title: string;
  link: string;
  snippet?: string;
};

export type SearchResult = {
  id: string;
  title: string;
  path: string;
  description?: string;
  date?: string;
  image?: string;
};

type MetadataEntry = {
  title: string;
  description: string;
  date: string;
  image?: string;
};

export async function searchGoogle(options: {
  apiKey: string;
  searchEngineId: string;
  query: string;
  limit?: number;
}): Promise<GoogleSearchItem[]> {
  const { apiKey, searchEngineId, query, limit = 8 } = options;

  const searchUrl = new URL('https://www.googleapis.com/customsearch/v1');
  searchUrl.searchParams.set('key', apiKey);
  searchUrl.searchParams.set('cx', searchEngineId);
  searchUrl.searchParams.set('q', query);
  searchUrl.searchParams.set('num', String(limit));

  const response = await fetch(searchUrl.toString());
  if (!response.ok) {
    throw new Error(`Google Search API responded with ${response.status}`);
  }
  const data = await response.json();
  return data.items ?? [];
}

export function normalizeSearchResults(
  items: GoogleSearchItem[],
  metadata: Record<string, MetadataEntry | undefined>,
): SearchResult[] {
  return items.map((item) => {
    const path = new URL(item.link).pathname;
    const meta = metadata[path];
    return {
      id: item.link.replace(/[^a-zA-Z0-9]/g, '_'),
      title: meta?.title ?? item.title,
      // Use the local pathname, not the absolute link, so navigation stays on-site
      path,
      description: meta?.description ?? item.snippet,
      date: meta?.date,
      image: meta?.image,
    };
  });
}
```
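The important behavior here is the fallback: local metadata wins when the path is known, and Google's own title and snippet fill in otherwise. A self-contained sketch with mock data (a trimmed-down copy of the merge logic, not the real API output):

```typescript
// Trimmed-down sketch of the merge behavior above, using mock data.
type GoogleSearchItem = { title: string; link: string; snippet?: string };

function normalize(
  items: GoogleSearchItem[],
  metadata: Record<string, { title: string; description: string } | undefined>,
) {
  return items.map((item) => {
    const path = new URL(item.link).pathname;
    const meta = metadata[path];
    return {
      // Local metadata wins; Google's fields are the fallback
      title: meta?.title ?? item.title,
      description: meta?.description ?? item.snippet,
    };
  });
}

const [hit, miss] = normalize(
  [
    { title: 'My Post - Blog', link: 'https://example.com/blog/my-post', snippet: 'Google snippet' },
    { title: 'Old Page', link: 'https://example.com/old-page', snippet: 'Google snippet' },
  ],
  { '/blog/my-post': { title: 'My Post', description: 'Local description' } },
);
// hit uses local metadata; miss falls back to Google's title and snippet
```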
3. The Search Component
The component fetches metadata on load, then calls the Google API on each search.
```tsx
// Inside SearchModal.tsx
const [results, setResults] = useState<SearchResult[]>([]);
const [metadata, setMetadata] = useState<Record<string, any>>({});

// Fetch the metadata map once, when the modal mounts
useEffect(() => {
  fetch('/search-metadata.json')
    .then((r) => r.json())
    .then(setMetadata);
}, []);

useEffect(() => {
  const term = query.trim();
  if (!term) {
    setResults([]);
    return;
  }

  const apiKey = import.meta.env.PUBLIC_GOOGLE_CUSTOM_SEARCH_API;
  const searchEngineId = import.meta.env.PUBLIC_SEARCH_ENGINE_ID;

  const search = async () => {
    const items = await searchGoogle({ apiKey, searchEngineId, query: term });
    setResults(normalizeSearchResults(items, metadata));
  };

  // Debounce so a (billed) query doesn't fire on every keystroke
  const timeoutId = setTimeout(search, 180);
  return () => clearTimeout(timeoutId);
}, [query, metadata]);
```
It works. The search feels native. Results are fast. So what’s the problem?
Problem 1: The Pricing
Google gives you 100 free queries per day. After that, it’s $5 per 1,000 queries.
Let’s do the math for a modest blog:
| Scenario | Daily queries | Monthly cost |
|---|---|---|
| 50 visitors/day, 2 searches each | 100 | $0 (barely within the free tier) |
| 200 visitors/day, 2 searches each | 400 | ~$45 |
| 1,000 visitors/day, 2 searches each | 2,000 | ~$285 |
For a personal blog, paying $285/month for search is absurd. That’s more than most people spend on hosting, domains, and email combined.
And this assumes 2 searches per visitor. If someone types “astro” and then refines to “astro search”, that’s 2 queries. Power users exploring your content will burn through queries fast.
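The arithmetic behind the table is simple enough to sketch (the function name and 30-day month are my own assumptions, not Google's billing code):

```typescript
// Estimate monthly Google Programmable Search cost from daily query volume.
const FREE_QUERIES_PER_DAY = 100;
const COST_PER_1000_QUERIES = 5; // USD, after the free tier

function monthlyCost(dailyQueries: number, days = 30): number {
  // Only queries beyond the daily free tier are billed
  const billablePerDay = Math.max(0, dailyQueries - FREE_QUERIES_PER_DAY);
  return (billablePerDay / 1000) * COST_PER_1000_QUERIES * days;
}

monthlyCost(100);  // 0   — stays inside the free tier
monthlyCost(400);  // 45  — 300 billable/day at $1.50/day
monthlyCost(2000); // 285 — 1,900 billable/day at $9.50/day
```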
Compare this to alternatives:
- Orama: Free. Zero API calls. Runs in the browser.
- Pagefind: Free. Zero API calls. Static files.
- Meilisearch (self-hosted): ~$5/month for a small VPS. Unlimited queries.
Google’s pricing only makes sense if you’re a large company with a search-heavy product and a marketing budget.
Problem 2: Indexing Delays
Google’s Programmable Search relies on Google’s main search index. This means:
- You publish a new blog post.
- Google’s crawler discovers it (hours to days).
- Google indexes it (more hours to days).
- Your search can finally find it.
For a static site where you control the content, this is backwards. Why wait for Google to crawl your own site when you can build the index yourself at deploy time?
I’ve published posts that didn’t appear in Google search for over a week. During that time, anyone searching my blog for the new content would find nothing.
With Orama or Pagefind, new content is searchable the moment you deploy.
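A deploy-time index can be as simple as serializing your content into a static JSON file during the build. A minimal sketch (the post shape and function are hypothetical, not my actual build script):

```typescript
// Sketch: build a search index at deploy time from content you already own.
type Post = { id: string; title: string; description: string };

function buildIndex(posts: Post[]): Record<string, Post> {
  const index: Record<string, Post> = {};
  for (const post of posts) {
    // Keyed by URL path, so lookups at search time are O(1)
    index[`/blog/${post.id}`] = post;
  }
  return index;
}

// At build time, write JSON.stringify(buildIndex(posts)) to a static file.
// The moment the deploy finishes, every new post is searchable — no crawler involved.
```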
Problem 3: No Control
Google decides what’s relevant. You can’t:
- Boost title matches over body matches
- Filter by tags or categories
- Exclude certain pages from results
- Customize fuzzy matching tolerance
With Orama, I can do `boost: { title: 2 }` and title matches rank higher. With Google, I get whatever their algorithm decides.
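To make the idea of boosting concrete, here is a hand-rolled sketch of the same principle (this is not Orama's API, just an illustration of weighting title matches over body matches):

```typescript
// Illustration only: score documents locally, weighting title matches higher.
type Doc = { title: string; body: string };

function score(doc: Doc, term: string, boost = { title: 2, body: 1 }): number {
  const t = term.toLowerCase();
  let s = 0;
  if (doc.title.toLowerCase().includes(t)) s += boost.title;
  if (doc.body.toLowerCase().includes(t)) s += boost.body;
  return s;
}

const docs: Doc[] = [
  { title: 'Astro search', body: 'How to add search.' },
  { title: 'Deploy guide', body: 'Ship your astro site.' },
];

// Title match outscores body match, so it ranks first
const ranked = [...docs].sort((a, b) => score(b, 'astro') - score(a, 'astro'));
```

With a hosted black box like Google, this knob simply doesn't exist.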
Problem 4: External Dependency
Your search stops working if:
- Google’s API has an outage
- Your API key gets rate-limited
- Google changes their pricing
- Google deprecates the API (they’ve done this before)
With a static search solution, your search is part of your site. It deploys together and fails together.
When Google Search Makes Sense
There are valid use cases:
- Large sites (10k+ pages) where building your own index is impractical
- Multi-site search where you need to search across multiple domains
- Enterprise budgets where $300/month is a rounding error
- Legacy content that’s already in Google’s index but not in a structured format
For a personal blog or portfolio? It’s overkill.
The Alternative I Actually Use
I went with Orama. Here’s why:
- Free: No API costs, ever
- Instant: New content is searchable on deploy
- Full control: I tune relevance exactly how I want
- No external dependencies: It’s just JSON and JavaScript
For sites under 500 pages, there’s no reason to use anything else.
Conclusion
Google Programmable Search is a solution looking for a problem. For static sites, you own the content. You can build the index yourself. Why pay Google to crawl your own site and then charge you to search it?
The implementation was a fun experiment. But when I calculated the costs and realized I’d be waiting days for new posts to appear in search, I deleted the branch and moved on.
What’s next?
This is part of a series of posts on implementing search for static sites:
- The Right Way to Add Orama Search to Astro — simple, zero-config search for small to medium sites
- Why I Switched from Orama to Pagefind — chunked index for better scalability
- Meilisearch is the Best Search You’ll Never Need — server-side search with advanced features
- Why I Didn’t Use Google Programmable Search (you are here) — the hidden costs and indexing delays that make it impractical
- I Tried 4 Search Engines So You Don’t Have To (coming soon) — comprehensive comparison from a small blog perspective
All with practical examples from a real production blog.