Vercel / Next.js
One middleware. Every AI agent handled.
Add @inception-agents/vercel to your Next.js project. Detection and optimization at the edge.
Integration Guide
Complete setup instructions
Every code block below is complete and copy-paste-ready. Total time: ~10 minutes.
Prerequisites
- Node.js 18+ installed
- Next.js 14+ project deployed on Vercel
- Inception Agents API key from your dashboard
Step 1: Install the SDK
npm install @inception-agents/vercel
Step 2: Set environment variable
Add to your Vercel project (via CLI or dashboard):
vercel env add INCEPTION_API_KEY
# Paste your API key when prompted
Or add to .env.local for local development:
echo "INCEPTION_API_KEY=your_api_key_here" >> .env.local
Step 3: Create the middleware
Create middleware.ts in your project root (next to package.json):
// middleware.ts
// Inception Agents — AI agent detection and content optimization
// This middleware intercepts all requests, detects AI agents, and
// serves optimized content. Human visitors pass through with <2ms overhead.
import { createInceptionMiddleware } from '@inception-agents/vercel';
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';
const inception = createInceptionMiddleware({
apiKey: process.env.INCEPTION_API_KEY!,
// Optional configuration:
// debug: process.env.NODE_ENV === 'development',
// excludePaths: ['/api/', '/_next/', '/static/'],
// customDetectionRules: [],
});
export async function middleware(request: NextRequest) {
return inception(request);
}
export const config = {
// Match all paths except static assets and Next.js internals
matcher: ['/((?!_next/static|_next/image|favicon.ico|robots.txt|sitemap.xml).*)'],
};
Already have a middleware.ts?
Wrap your existing logic:
// middleware.ts — with existing middleware logic
import { createInceptionMiddleware } from '@inception-agents/vercel';
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';
const inception = createInceptionMiddleware({
apiKey: process.env.INCEPTION_API_KEY!,
});
export async function middleware(request: NextRequest) {
// Inception Agents runs first — if it detects an AI agent,
// it returns an optimized response. Otherwise, falls through.
const inceptionResponse = await inception(request);
if (inceptionResponse) return inceptionResponse;
// Your existing middleware logic here
return NextResponse.next();
}
export const config = {
matcher: ['/((?!_next/static|_next/image|favicon.ico).*)'],
};
Step 4: Deploy
vercel --prod
Or push to your connected Git repository. The middleware activates automatically.
Step 5: Verify
After deployment, test with curl:
# Test AI agent detection (should return enriched content)
curl -s -H "User-Agent: GPTBot/1.0" \
https://your-site.vercel.app/ | head -50
# Test human passthrough (should return normal site)
curl -s -H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)" \
https://your-site.vercel.app/ | head -50
# Test llms.txt
curl -s https://your-site.vercel.app/llms.txt
Configuration Options
| Option | Type | Default | Description |
|---|---|---|---|
| apiKey | string | (required) | Your Inception Agents API key |
| debug | boolean | false | Log detection results to console |
| excludePaths | string[] | ['/api/', '/_next/'] | Paths to skip detection on |
| detectionThreshold | number | 0.70 | Confidence threshold for agent detection |
| cacheMaxAge | number | 300 | Cache TTL for optimized content (seconds) |
| enableLlmsTxt | boolean | true | Serve /llms.txt and /llms-full.txt |
| enableJsonLd | boolean | true | Inject enhanced JSON-LD for agents |
| enableAgentCard | boolean | false | Serve /.well-known/agent.json |
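Putting the table together, a fully spelled-out configuration might look like the sketch below. The option names come from the table above; the exact shape of the options object is illustrative, so check the package's types before relying on it:

```typescript
// middleware.ts — every option from the table, set to its documented default
// (values shown are the defaults; adjust to taste)
import { createInceptionMiddleware } from '@inception-agents/vercel';

const inception = createInceptionMiddleware({
  apiKey: process.env.INCEPTION_API_KEY!, // required
  debug: false,                           // log detection results to console
  excludePaths: ['/api/', '/_next/'],     // paths to skip detection on
  detectionThreshold: 0.7,                // confidence cutoff for agent detection
  cacheMaxAge: 300,                       // cache TTL for optimized content (seconds)
  enableLlmsTxt: true,                    // serve /llms.txt and /llms-full.txt
  enableJsonLd: true,                     // inject enhanced JSON-LD for agents
  enableAgentCard: false,                 // serve /.well-known/agent.json
});
```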
What Happens Next
1. Visit your dashboard to see AI agent traffic
2. Your site content is automatically analyzed and optimized content variants are generated
3. The learning engine begins tracking which content strategies work best for each agent type
4. Check the Optimization tab for specific recommendations to improve your AI visibility
Troubleshooting
Middleware not running
- Ensure middleware.ts is in the project root (same level as package.json)
- Check that the matcher config isn't excluding the paths you're testing
- Run vercel logs to check for errors
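If the middleware is deployed but you're unsure whether it ever fires, turning on the debug option is a quick diagnostic. A minimal sketch, assuming the option behaves as described in the configuration table:

```typescript
// middleware.ts — temporary debug config while diagnosing
// (remove or gate on NODE_ENV before shipping to production)
import { createInceptionMiddleware } from '@inception-agents/vercel';

const inception = createInceptionMiddleware({
  apiKey: process.env.INCEPTION_API_KEY!,
  debug: true, // logs detection results; inspect them with `vercel logs`
});
```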
No agent traffic showing in dashboard
- It may take a few hours for real AI agents to visit your site
- Use the curl commands above to simulate agent visits and verify the integration works
- Check that your API key is correct in environment variables
Increased latency for human visitors
- Normal overhead is <2ms. If higher, check that excludePaths covers high-traffic API routes
- The middleware makes zero external network calls for human traffic — detection is local pattern matching
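As a concrete sketch: if your hottest routes live under /api/ and /webhooks/ (example paths, not defaults), listing them in excludePaths keeps them out of the detection path entirely:

```typescript
// middleware.ts — skip detection on high-traffic routes
// (the path list here is illustrative; use your own hot paths)
import { createInceptionMiddleware } from '@inception-agents/vercel';

const inception = createInceptionMiddleware({
  apiKey: process.env.INCEPTION_API_KEY!,
  excludePaths: ['/api/', '/_next/', '/static/', '/webhooks/'],
});
```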
Uninstalling
npm uninstall @inception-agents/vercel
# Delete middleware.ts (or remove the inception wrapper)
vercel env rm INCEPTION_API_KEY
Framework Support
Next.js 14+
Middleware runs at the edge on Vercel
Next.js 13
Supported with App Router or Pages Router
SvelteKit 2+
Via hooks.server.ts adapter (coming soon)
Nuxt 3+
Via server middleware adapter (coming soon)
npm install and deploy.
Your Next.js site is 10 minutes away from being optimized for every AI agent.
Inception Agents