
One robots.txt Change That 10x'd AI Visibility for a Fintech Startup

A fintech startup accidentally blocked every AI crawler during a security review. Removing the block alone produced a 10x jump in AI referral traffic within three weeks.

  • 10x: AI referral traffic in 3 weeks
  • 1 line: the actual fix
  • 72hr: first measurable lift

The mystery

A Series A fintech startup was doing everything right on content: question-shaped headings, answer capsules, FAQ schema, visible timestamps, a fresh publishing cadence. Their AEO scan scored 78/100 — above average. But their AI referral traffic was flat at roughly 50 visits per month, and manual ChatGPT audits showed zero citations for their target queries.

The team was stumped. Competitors with weaker content were appearing in AI answers while they were invisible.

The cause

During a security review six months earlier, the platform team had added this to robots.txt:

User-agent: *
Disallow: /

They had then layered specific Allow rules for Googlebot, Bingbot, and a few others. GPTBot, OAI-SearchBot, ClaudeBot, and PerplexityBot were not on the allow list — so the blanket Disallow applied to all of them.
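
Reconstructed from the description above (the original file isn't shown here), the broken configuration looked roughly like this:

# Blanket block added during the security review
User-agent: *
Disallow: /

# Selective allow-list for traditional search crawlers
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

# GPTBot, OAI-SearchBot, ClaudeBot, and PerplexityBot had no group of
# their own, so each fell back to the "User-agent: *" block above.

Under the robots exclusion protocol, a crawler obeys only the most specific User-agent group that matches it; any bot missing from the allow list therefore inherited Disallow: / by default.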

Server logs confirmed the impact: GPTBot had been hitting the site at 100+ requests/day before the change, then dropped to zero afterward. No one noticed because AI traffic was a small fraction of overall traffic at the time.
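
A regression like this is easy to catch if you trend crawler hits per day straight from the access logs. A minimal sketch in Python, assuming a combined-format log at the hypothetical path access.log:

# Count GPTBot requests per day from a combined-format access log.
# "access.log" and the timestamp parsing are assumptions, not the startup's setup.
from collections import Counter

daily_hits = Counter()
with open("access.log") as log:
    for line in log:
        if "GPTBot" in line:
            # combined-format timestamps look like [12/Mar/2025:08:14:07 +0000]
            day = line.split("[", 1)[1].split(":", 1)[0]
            daily_hits[day] += 1

# logs are already chronological, and Counter preserves insertion order
for day, count in daily_hits.items():
    print(day, count)

A sudden drop to zero for any major crawler is the signal that a robots.txt or firewall change locked it out.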

The fix

The fix took under a minute: add explicit Allow rules for the four major AI crawlers.

User-agent: GPTBot
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
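
Whether those rules actually take effect can be verified with Python's standard-library robots.txt parser; a quick sketch, with example.com standing in for the real domain:

# Check that the four AI crawlers may fetch the homepage under robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder domain
parser.read()

for bot in ("GPTBot", "OAI-SearchBot", "ClaudeBot", "PerplexityBot"):
    status = "allowed" if parser.can_fetch(bot, "https://example.com/") else "BLOCKED"
    print(f"{bot}: {status}")

Keep in mind that robots.txt is advisory; the fix worked here because all four crawlers respect it, as the original block demonstrated.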

What happened next

  • Hour 12: Server logs showed GPTBot crawling at pre-block rates.
  • Day 3: First ChatGPT citation for a target query, confirmed via manual audit.
  • Week 2: Perplexity citations appearing for branded queries.
  • Week 3: AI referral traffic up from the 50/month baseline to roughly 500/month, a 10x increase from a one-line change.

The lesson

Before spending weeks on content work, check your robots.txt. It's the single most common and most consequential AEO bug on mid-market and enterprise sites. Our internal data suggests roughly 18% of sites we scan are accidentally blocking at least one major AI crawler, often via a legacy rule nobody remembers adding.

Run our robots.txt checker on your site today. It's the cheapest AEO audit you'll ever run, and it's worth re-running quarterly, because robots.txt policies drift.


Want results like these?

Run a free AEO scan on your site and see how you stack up across all 12 factors.
