Build a Claude SEO Agent with Google Search Console MCP Integration
Connect Claude to Google Search Console API via MCP for live SEO diagnostics, URL inspection, and AI-powered ranking analysis without leaving your IDE.

Build Your Own Claude SEO Agent: Google Search Console Automation
Imagine asking Claude: "Why isn't this page ranking?" and getting an answer backed by live Google Search Console data instead of a guess. This guide shows you exactly how to build a Claude SEO agent that connects directly to your GSC account via MCP, giving your AI assistant the ability to run URL inspections, pull analytics, and diagnose ranking problems in real-time.
Instead of exporting CSVs or toggling between tabs, you'll have an AI assistant that can answer questions like:
- "Which of my pages have indexation errors?"
- "What's my average position for queries where I rank 5-10?"
- "Why isn't this URL showing up in Google?"
The Model Context Protocol (MCP) is the bridge that makes this possible.
Prerequisite: This guide assumes you have already set up the OAuth infrastructure we built in Persist Google OAuth Refresh Tokens. We'll be importing the `getValidAccessToken` helper from that project to handle the Google token handshake securely.
The Strategy: Beyond Simple Performance Metrics
Standard SEO tools give you dashboards, but dashboards require you to do the work of finding the pattern. When you give an LLM like Claude access to GSC via an MCP server, you're giving it the ability to perform diagnostics.
The core of this integration is a service layer that communicates with the Google Search Console API. While performance metrics are great, the real "killer feature" is the URL Inspection tool. It allows the AI to answer the most frustrating question in SEO: "Why isn't this page ranking?"
1. The Service Layer (src/lib/google/search-console.ts)
We need a robust file to handle the API communication. Note how we import our existing auth helper to handle the tokens.
```typescript
// File: src/lib/google/search-console.ts
import { getValidAccessToken } from '@/lib/auth/google'; // From our previous article

const BASE_URL = 'https://searchconsole.googleapis.com/v1';

// 1. The "Smart Resolver" - Handles the annoying sc-domain: prefix
export async function resolveSiteUrl(userUrl: string): Promise<string> {
  const token = await getValidAccessToken();

  // Fetch all verified sites
  const response = await fetch(`${BASE_URL}/sites`, {
    headers: { 'Authorization': `Bearer ${token}` }
  });
  const data = await response.json();

  // Logic: Find the best matching property in your GSC account
  const site = data.siteEntry?.find(
    (s: any) => userUrl.includes(s.siteUrl) || s.siteUrl.includes(userUrl)
  );

  if (!site) throw new Error(`Could not find a verified property for ${userUrl}`);
  return site.siteUrl;
}

// 2. The Inspector - Checks Index Status
export async function inspectUrl(siteUrl: string, inspectionUrl: string): Promise<any> {
  const accessToken = await getValidAccessToken();
  const resolvedSiteUrl = await resolveSiteUrl(siteUrl);

  const response = await fetch(`${BASE_URL}/urlInspection/index:inspect`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${accessToken}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      inspectionUrl: inspectionUrl,
      siteUrl: resolvedSiteUrl,
      languageCode: 'en-US'
    })
  });

  if (!response.ok) throw new Error('URL Inspection failed');
  return await response.json();
}

// 3. The Analytics Engine - Flexible Querying
export async function querySearchAnalytics(
  siteUrl: string,
  startDate: string,
  endDate: string,
  dimensions: string[]
) {
  const accessToken = await getValidAccessToken();
  const resolvedSite = await resolveSiteUrl(siteUrl);

  const response = await fetch(
    `${BASE_URL}/sites/${encodeURIComponent(resolvedSite)}/searchAnalytics/query`,
    {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${accessToken}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ startDate, endDate, dimensions })
    }
  );

  if (!response.ok) throw new Error('Search Analytics query failed');
  return await response.json();
}
```
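The simple `includes` check in the resolver can misfire on domain properties. If you want the matching logic to be explicit and unit-testable, it can be factored into a pure helper. The sketch below is illustrative: the function names and normalization rules are ours, not part of the GSC API.

```typescript
// Hypothetical helper illustrating how a user-supplied URL could be matched
// against a GSC property. Domain properties ("sc-domain:example.com") cover
// every protocol and subdomain; URL-prefix properties ("https://example.com/")
// must match the start of the page URL.
function stripProtocol(url: string): string {
  return url.replace(/^https?:\/\//, '').replace(/^www\./, '');
}

function hostOf(url: string): string {
  return stripProtocol(url).replace(/\/.*$/, '');
}

export function matchesProperty(userUrl: string, siteUrl: string): boolean {
  if (siteUrl.startsWith('sc-domain:')) {
    const domain = siteUrl.slice('sc-domain:'.length);
    const host = hostOf(userUrl);
    // Exact host or any subdomain counts as covered by the domain property
    return host === domain || host.endsWith('.' + domain);
  }
  // URL-prefix property: compare normalized prefixes
  return stripProtocol(userUrl).startsWith(stripProtocol(siteUrl));
}
```

Separating this out means you can test the `sc-domain:` edge cases without hitting the live API.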
The inspectUrl function lets the AI peek behind the curtain: it returns detailed information about whether a page is indexed, when Google last crawled it, and whether it is considered mobile-friendly.
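To help the model reason about that payload quickly, you can condense it before handing it over. A minimal sketch, assuming the response shape documented for the URL Inspection API (`inspectionResult.indexStatusResult` with `verdict` and `coverageState` fields); the `summarizeInspection` helper itself is hypothetical:

```typescript
// Hypothetical post-processor: condenses a URL Inspection response into a
// one-line diagnosis. Field names follow the URL Inspection API response
// (inspectionResult.indexStatusResult.verdict / coverageState).
interface InspectionResponse {
  inspectionResult?: {
    indexStatusResult?: {
      verdict?: string;       // e.g. "PASS", "FAIL", "NEUTRAL"
      coverageState?: string; // e.g. "Submitted and indexed"
      lastCrawlTime?: string;
    };
  };
}

export function summarizeInspection(result: InspectionResponse): string {
  const status = result.inspectionResult?.indexStatusResult;
  if (!status) return 'No index status returned for this URL.';
  const indexed = status.verdict === 'PASS';
  return `${indexed ? 'Indexed' : 'Not indexed'} (${status.coverageState ?? 'unknown state'})` +
    (status.lastCrawlTime ? `, last crawled ${status.lastCrawlTime}` : '');
}
```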
Designing the Marketing Tools
When building MCP tools for marketing teams, you have to think about flexibility. Marketing performance isn't just a single number; it's a mix of dates, devices, and search terms.
We will implement three specific categories of tools: Core Data, Deep Dive Analytics, and Diagnostics.
2. The MCP Server Definition (src/app/api/mcp/[transport]/route.ts)
```typescript
// File: src/app/api/mcp/[transport]/route.ts
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { z } from 'zod';
import { inspectUrl, querySearchAnalytics } from '@/lib/google/search-console';

const server = new McpServer({
  name: 'Google Search Console Agent',
  version: '1.0.0'
});

// Tool 1: Core Data - List Sites
server.tool(
  'get_search_console_sites',
  'List all verified GSC properties to see what we have access to.',
  {},
  async () => {
    // Implementation calling the sites endpoint (simplified)
    return { content: [{ type: 'text', text: 'List of sites...' }] };
  }
);

// Tool 2: Deep Dive - Flexible Analytics
server.tool(
  'get_search_console_analytics',
  'Advanced tool for flexible Search Console analytics. Supports filtering by page, query, etc.',
  {
    siteUrl: z.string().describe('The website URL (e.g., buildwithmatija.com)'),
    days: z.number().optional().default(30),
    dimensions: z.array(z.enum(['date', 'query', 'page', 'country', 'device'])).optional(),
  },
  async ({ siteUrl, days, dimensions }) => {
    const endDate = new Date().toISOString().split('T')[0];
    const startDate = new Date(Date.now() - days * 24 * 60 * 60 * 1000)
      .toISOString()
      .split('T')[0];

    const results = await querySearchAnalytics(siteUrl, startDate, endDate, dimensions || ['date']);
    return { content: [{ type: 'text', text: JSON.stringify(results) }] };
  }
);

// Tool 3: Diagnostics - Index Inspector
server.tool(
  'inspect_url',
  'Checks if a specific URL is indexed by Google.',
  {
    siteUrl: z.string(),
    pageUrl: z.string()
  },
  async ({ siteUrl, pageUrl }) => {
    const result = await inspectUrl(siteUrl, pageUrl);
    return { content: [{ type: 'text', text: JSON.stringify(result) }] };
  }
);

// Export the server handler (Next.js specific)
// export const POST = ...
```
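The commented-out export at the end depends on how you host the server. If you are using the `mcp-handler` package (the approach from our earlier MCP server guide), the wiring might look roughly like the sketch below; treat it as an assumption and check the package's current docs, since its API has changed between versions. Tools are registered inside the handler callback rather than on a standalone `McpServer`:

```typescript
// Sketch only: route wiring with the mcp-handler package (assumed API).
// Verify the exact signature against the package's current documentation.
import { createMcpHandler } from 'mcp-handler';
import { z } from 'zod';
import { inspectUrl } from '@/lib/google/search-console';

const handler = createMcpHandler((server) => {
  // Register tools inside the callback instead of on a standalone McpServer
  server.tool(
    'inspect_url',
    'Checks if a specific URL is indexed by Google.',
    { siteUrl: z.string(), pageUrl: z.string() },
    async ({ siteUrl, pageUrl }) => {
      const result = await inspectUrl(siteUrl, pageUrl);
      return { content: [{ type: 'text', text: JSON.stringify(result) }] };
    }
  );
});

export { handler as GET, handler as POST };
```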
The Business Case: 3 Powerful Workflows
Once deployed, this setup enables three distinct workflows that transform how you interact with SEO data.
1. Core Data Access
The Goal: A high-level health check. The Prompt: "Which sites do I have access to, and how is traffic looking for buildwithmatija.com?"
The AI uses get_search_console_sites to list your verified properties (e.g., your 38 sites) and get_search_console_summary (a wrapper around analytics) to pull high-level clicks, impressions, and CTR.
2. "Deep Dive" Analytics
The Goal: Finding specific opportunities or leaks. The Prompt: "Show me daily clicks for the URL 'https://www.buildwithmatija.com/blog/payload-nextjs' over the last 30 days."
The AI utilizes the flexibility of get_search_console_analytics. It automatically calculates the date range and sets dimensions: ["date"].
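That date-range arithmetic is worth pulling into a small helper so it is done the same way everywhere. A sketch mirroring the inline calculation the tool performs (the helper name is ours):

```typescript
// Mirrors the inline calculation in get_search_console_analytics:
// returns [startDate, endDate] as YYYY-MM-DD strings covering the last N days (UTC).
export function dateRange(days: number): [string, string] {
  const end = new Date();
  const start = new Date(end.getTime() - days * 24 * 60 * 60 * 1000);
  const iso = (d: Date) => d.toISOString().split('T')[0];
  return [iso(start), iso(end)];
}
```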
Example Output: The AI can render this data into clean markdown tables for you:
⏺ Here are the daily clicks for your Payload article. Summary: This page is performing better! It has 1 total click, with activity only on the last 2 days.

| Date | Clicks | Impressions | CTR | Position |
|------------|--------|-------------|-------|----------|
| 2025-12-18 | 0 | 23 | 0% | 6.65 |
| 2025-12-19 | 1 | 28 | 3.57% | 7.0 |
3. Diagnostics & Robustness (The "Gap" Fixes)
The Goal: Solving the "Why isn't it ranking?" mystery. The Prompt: "My page X isn't ranking. Can you check if it's indexed?"
This is where the Smart Resolver we built earlier shines. The AI:
- Uses `resolveSiteUrl` behind the scenes to handle the confusing `sc-domain:` prefix.
- Calls `inspect_url` to check the live Google Index.
- Reports back if the page is excluded due to a "Soft 404" or "Crawled - currently not indexed" status.
Complete Your SEO Agent Build
You now have a fully functional Claude SEO agent capable of:
- Real-time diagnostics — Check index status, crawl status, and mobile compatibility
- Flexible analytics — Query any dimension (date, query, page, device, country)
- Intelligent insights — Claude analyzes GSC data and provides actionable recommendations
Related MCP & SEO Tools:
- Build a Production MCP Server — Foundational MCP setup
- OAuth for MCP Server — Secure your MCP endpoints
- Persist Google OAuth Refresh Tokens — The auth pattern this guide relies on
- Custom JSON-RPC MCP Implementation — Alternative approach without mcp-handler
Next Steps:
- Deploy your MCP server to Vercel
- Configure OAuth in Claude Web or Claude CLI
- Start by asking Claude simple questions about your site's performance
- Expand with additional tools (Search Analytics, Sitemap analysis, ranking tracking)
Want to extend this? Consider adding tools for:
- Fetching your XML sitemap and checking coverage
- Tracking keyword rankings over time
- Finding content opportunities (queries where you rank 11-20)
- Bulk URL inspection for site migrations
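The content-opportunities idea on that list is essentially a filter over the analytics rows. A sketch, assuming the row shape returned by the Search Analytics API (`keys`, `clicks`, `impressions`, `position`) and a hypothetical helper name:

```typescript
// Search Analytics rows as returned when querying with dimensions: ['query'].
interface AnalyticsRow {
  keys: string[];      // the query string when dimensions = ['query']
  clicks: number;
  impressions: number;
  position: number;    // average position
}

// Hypothetical helper: queries ranking 11-20 with meaningful impressions are
// "striking distance" opportunities worth a content refresh.
export function findOpportunities(rows: AnalyticsRow[], minImpressions = 10): AnalyticsRow[] {
  return rows
    .filter((r) => r.position >= 11 && r.position <= 20 && r.impressions >= minImpressions)
    .sort((a, b) => b.impressions - a.impressions);
}
```

Exposed as an MCP tool, this lets Claude answer "what should I improve next?" directly from live data.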