Core Concepts

Understand the three systems that power Inception Agents: detection, optimization, and learning.

Inception Agents is built on three interconnected systems: Detection, Optimization, and Learning. Together, they ensure that when an AI agent evaluates your product or service, it sees the most relevant, well-structured content possible.

This page explains what each system does and introduces the key primitives you will encounter throughout the platform.


The Three Systems

┌──────────────┐     ┌──────────────┐     ┌──────────────┐
│  Detection   │────▶│ Optimization │────▶│  Learning    │
│              │     │              │     │              │
│  Identify    │     │  Shape what  │     │  Improve     │
│  agent visits│     │  agents see  │     │  over time   │
└──────────────┘     └──────────────┘     └──────────────┘
       ▲                                        │
       └────────────────────────────────────────┘
                   Feedback loop

  1. Detection identifies when an AI agent visits your site, which platform it belongs to, and what it is looking for.
  2. Optimization serves the right content in the right format for that specific agent and intent.
  3. Learning analyzes the outcome of every agent interaction and feeds improvements back into both detection and optimization.

The systems operate continuously. As agent behavior evolves, the platform adapts automatically.
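
The loop above can be sketched in code. This is a purely illustrative flow, assuming hypothetical `detect`, `optimize`, and `record` callables standing in for the three systems; it is not the platform's actual API.

```python
def handle_request(request, detect, optimize, record):
    """Illustrative sketch of the detection -> optimization -> learning loop.

    `detect`, `optimize`, and `record` are hypothetical callables standing in
    for the three systems; `request` is a plain dict for simplicity.
    """
    agent = detect(request)
    if agent is None:
        # Human traffic passes through to the origin untouched.
        return {"body": request["origin_body"], "optimized": False}
    # An agent was detected: serve the optimized variant instead.
    response = {"body": optimize(request, agent), "optimized": True}
    # Every agent interaction feeds back into the learning engine.
    record(agent, request, response)
    return response
```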


Detection

The detection layer identifies AI agent traffic and classifies it by platform, agent type, and inferred intent.

What It Covers

  • 40+ distinct AI agents across 7 platforms (OpenAI, Anthropic, Google, Microsoft, Amazon, xAI, Perplexity)
  • Crawler bots (GPTBot, ClaudeBot, PerplexityBot, Google-Extended, BingBot)
  • Real-time browsing agents (ChatGPT Browse, Copilot, Perplexity Search)
  • API-based agents (Shopify Agentic API consumers, ACP/UCP clients)
  • Embedded agents (Rufus within Amazon, Gemini within Google Shopping)

How It Works

Detection runs at the edge, before your origin server processes the request. For human visitors, the overhead is sub-2ms — effectively invisible. When an agent is detected, the request is routed through the optimization layer. When a human is detected, the request passes through untouched.

Detection uses a combination of:

  • User-Agent analysis — primary signal for known crawlers
  • TLS fingerprinting — identifies headless browsers and API clients
  • Behavioral patterns — request cadence, header signatures, navigation patterns
  • IP range correlation — validated against known agent infrastructure

Every detection event is logged with platform, agent type, requested URL, and timestamp.
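
The User-Agent signal alone can be sketched as follows. This is a simplified illustration built from the crawler tokens listed above; the real detection layer combines it with TLS, behavioral, and IP signals.

```python
import re

# Known crawler tokens and their platforms, taken from the list above.
# This sketch covers only the User-Agent signal.
KNOWN_CRAWLERS = {
    "GPTBot": "OpenAI",
    "ClaudeBot": "Anthropic",
    "PerplexityBot": "Perplexity",
    "Google-Extended": "Google",
    "BingBot": "Microsoft",
}

def classify_user_agent(user_agent: str):
    """Return (platform, agent) for a known crawler, else None (likely human)."""
    for token, platform in KNOWN_CRAWLERS.items():
        if re.search(re.escape(token), user_agent, re.IGNORECASE):
            return platform, token
    return None
```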


Optimization

The optimization layer transforms your content for agent consumption. It operates across three tiers, each targeting a different stage of agent evaluation.

The Three-Layer Content Architecture

| Layer | File/Format | Purpose | When It Is Used |
| --- | --- | --- | --- |
| Layer 1: llms.txt | /llms.txt, /llms-full.txt | Direct agent queries — “tell me about this site” | Agent requests site-level information |
| Layer 2: Structured Data | JSON-LD, schema.org markup | Context enrichment — prices, reviews, specs, availability | Agent evaluates specific products or services |
| Layer 3: HTML Enrichment | Enriched page content | Deep evaluation — full product comparisons, feature analysis | Agent performs detailed research or comparison |

Each layer is generated and maintained automatically. You can override or customize any layer from the dashboard.
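
As a rough illustration, layer selection might look like the following. The path checks come from the table above; the intent labels are hypothetical placeholders, not the platform's real taxonomy.

```python
def select_layer(path: str, intent: str) -> str:
    """Illustrative layer routing; `intent` values are hypothetical labels."""
    # Layer 1: direct site-level queries are answered from the llms.txt files.
    if path in ("/llms.txt", "/llms-full.txt"):
        return "layer1-llms-txt"
    # Layer 2: evaluation of a specific product or service leans on JSON-LD.
    if intent == "evaluate-item":
        return "layer2-structured-data"
    # Layer 3: detailed research or comparison gets fully enriched HTML.
    return "layer3-html-enrichment"
```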

Per-Platform Tailoring

Different agents process content differently. The optimization layer accounts for this:

  • OpenAI agents receive content emphasizing structured product attributes and direct answers
  • Anthropic agents receive longer-form content with detailed comparisons and evidence
  • Perplexity agents receive citation-friendly content with clear source attribution
  • Google agents receive schema.org-heavy markup with entity relationships
  • Microsoft Copilot receives API-compatible product catalog data
  • Amazon Rufus receives competitive positioning signals and availability data
  • Grok receives content enriched with recency signals and social proof

The platform selects the appropriate variant based on detection output. You do not need to manage per-platform content manually.
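
In spirit, the tailoring step reduces to a lookup keyed on the detected platform. The strategy names below are illustrative shorthand for the emphases listed above, not real configuration values.

```python
# Illustrative mapping from detected platform to content emphasis;
# the strategy names are shorthand for the bullets above, not real config keys.
PLATFORM_STRATEGY = {
    "OpenAI": "structured-attributes-direct-answers",
    "Anthropic": "long-form-comparisons-evidence",
    "Perplexity": "citation-friendly-attribution",
    "Google": "schema-org-entity-relationships",
    "Microsoft": "api-compatible-catalog",
    "Amazon": "competitive-positioning-availability",
    "xAI": "recency-signals-social-proof",
}

def strategy_for(platform: str) -> str:
    # Assumed fallback: unrecognized platforms get a generic variant.
    return PLATFORM_STRATEGY.get(platform, "generic")
```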


Learning Engine

Every agent interaction generates data. The learning engine analyzes this data to improve optimization over time.

What It Tracks

  • Agent visit patterns — which pages agents visit, in what order, how frequently
  • Content variant performance — which version of your content produces the best outcomes
  • Platform-specific behavior — how each agent type responds to different content strategies
  • Competitive signals — how your content performs relative to alternatives the agent evaluates in the same session

How It Improves

The learning engine runs continuously. It:

  1. Identifies which content variants correlate with positive agent outcomes (recommendations, citations, product inclusions)
  2. Adjusts content strategy weights per platform and per intent
  3. Generates new content variants when it detects underperforming segments
  4. Alerts you in the dashboard when manual intervention would be beneficial

You can review learning engine decisions in Analytics > Learning Log and override any automated adjustment.
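
Step 2 can be pictured as a simple per-variant weight nudge. The update rule below is an illustrative stand-in, not the engine's actual algorithm.

```python
def update_variant_weights(weights, success_rates, learning_rate=0.1):
    """Nudge each variant's serving weight toward its observed success rate.

    `weights` and `success_rates` map variant name -> float. This rule is an
    illustrative stand-in for the learning engine, not its real algorithm.
    """
    nudged = {
        variant: w + learning_rate * (success_rates.get(variant, 0.0) - w)
        for variant, w in weights.items()
    }
    # Renormalize so the weights remain a distribution over variants.
    total = sum(nudged.values())
    return {variant: w / total for variant, w in nudged.items()}
```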


Key Primitives

llms.txt

llms.txt is a standardized file — analogous to robots.txt — that provides AI agents with a machine-readable description of your site. It lives at the root of your domain:

https://example.com/llms.txt

The file is written in markdown and contains:

  • Site identity and purpose
  • Product or service catalog summary
  • Key pages and their descriptions
  • Structured links for deeper exploration
  • Contact and support information

Inception Agents generates and maintains your llms.txt automatically based on your site content. You can also edit it directly.
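
A minimal illustrative llms.txt might look like this. The generated file will differ based on your site content; the sketch follows the common llms.txt convention of a title, a summary blockquote, and link sections.

```markdown
# Example Store

> Premium outdoor gear and apparel.

## Key Pages

- [Products](https://example.com/products): Full catalog of gear and apparel
- [Support](https://example.com/support): Contact and support information
```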

A more detailed version is available at /llms-full.txt for agents that perform deep-dive analysis.

See the llms.txt Reference for the full specification and customization options.

Agent Card

The Agent Card is a JSON file at /.well-known/agent.json that declares your site’s capabilities to AI agents. Think of it as a machine-readable business card:

{
  "name": "Example Store",
  "url": "https://example.com",
  "description": "Premium outdoor gear and apparel",
  "capabilities": {
    "commerce": true,
    "acp": true,
    "product_feed": "https://example.com/feeds/products.json"
  },
  "contact": {
    "support": "support@example.com"
  },
  "llms_txt": "https://example.com/llms.txt"
}

Agents that support the .well-known/agent.json standard will read this file to understand what your site offers and how to interact with it programmatically.
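
From the consumer side, an agent might parse the card and check capabilities before attempting programmatic checkout. A minimal sketch against a trimmed copy of the example card above:

```python
import json

# Parse a trimmed copy of the example Agent Card and check whether
# agent-driven checkout (ACP) is declared. Field names follow the example.
card_json = """
{
  "name": "Example Store",
  "capabilities": {"commerce": true, "acp": true},
  "llms_txt": "https://example.com/llms.txt"
}
"""

card = json.loads(card_json)
supports_acp = card.get("capabilities", {}).get("acp", False)
```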

See the Agent Card Reference for the full schema.

Content Variants

A content variant is an alternative version of a page or product description optimized for a specific agent intent. Inception Agents generates five variant types:

| Variant | Optimized For | Example Use Case |
| --- | --- | --- |
| Feature-led | Agents comparing technical specifications | “What laptop has the best battery life?” |
| Outcome-led | Agents evaluating results and benefits | “What CRM will help me close more deals?” |
| Comparison-led | Agents building head-to-head analyses | “Compare Notion vs. Confluence” |
| Trust-led | Agents weighing credibility and social proof | “What is the most trusted project management tool?” |
| Concise-led | Agents generating quick answers or summaries | “Recommend a running shoe under $150” |

The platform selects the appropriate variant based on the detected agent’s intent signal. Variants are generated from your existing content — you do not need to write them manually.
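
The intent-to-variant step can be pictured as another lookup. The intent labels below are hypothetical, not the platform's real taxonomy.

```python
# Illustrative mapping from an inferred intent signal to a variant type;
# the intent labels are hypothetical, not the platform's real taxonomy.
INTENT_TO_VARIANT = {
    "compare-specs": "feature-led",
    "evaluate-outcomes": "outcome-led",
    "head-to-head": "comparison-led",
    "assess-credibility": "trust-led",
    "quick-answer": "concise-led",
}

def variant_for(intent: str) -> str:
    # Assumed fallback: the concise variant is the safest default for
    # unrecognized intents.
    return INTENT_TO_VARIANT.get(intent, "concise-led")
```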

See the Content Variants Guide for customization and manual overrides.

Commerce Protocols

Commerce protocols enable AI agents to complete purchases on behalf of users without leaving the agent interface. Inception Agents supports four protocols:

| Protocol | Backed By | What It Enables |
| --- | --- | --- |
| ACP (Agent Commerce Protocol) | Shopify | One-click checkout through AI agents for Shopify merchants |
| UCP (Universal Commerce Protocol) | Open standard | Platform-agnostic agent checkout for any commerce backend |
| Shopify Agentic API | Shopify | Full cart management, product search, and checkout via API |
| Microsoft Copilot Catalog | Microsoft | Product listing and purchase through Copilot and Bing Shopping |

When a commerce protocol is active, agents can present your products with a direct purchase action — reducing friction from recommendation to conversion.

See the Commerce Protocols Guide for setup and configuration.


Next Steps