AI Crawlers Blocked
What This Means
The robots.txt file blocks AI crawlers such as GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot (Perplexity), and Google-Extended (Google's AI training control). Blocking these crawlers prevents your content from appearing in AI-powered search results and answers, reducing your visibility on generative AI platforms.
What Triggers This Issue
This issue is triggered when the robots.txt file contains Disallow rules for AI-related user agents, including: GPTBot, ChatGPT-User, ClaudeBot, Claude-Web, Google-Extended, PerplexityBot, Anthropic-AI, CCBot, or Cohere-AI.
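For example, a robots.txt file containing rules like the following would trigger this issue (the user agent names are from the list above; the Disallow paths are illustrative):

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

A `Disallow: /` rule under a user agent blocks that crawler from every page on the site; a rule limited to a path (e.g. `Disallow: /private/`) would only block that section.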
How To Fix
Consider allowing AI crawlers if you want your content to be discoverable in AI-powered search and assistant platforms. Remove or modify the Disallow rules for AI user agents. Note: allowing AI crawlers may mean your content is used to generate AI responses.
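As a sketch of one possible fix, you could either delete the blocking rules entirely or replace them with explicit Allow rules. The `/private/` path below is a hypothetical example of keeping a sensitive section blocked while opening the rest of the site:

```
User-agent: GPTBot
Allow: /
Disallow: /private/

User-agent: ClaudeBot
Allow: /
Disallow: /private/
```

If no rules mention an AI user agent at all, most crawlers fall back to the generic `User-agent: *` rules, so simply removing the AI-specific Disallow blocks is often sufficient.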