Robots.txt Is Blocking Your Roofing Leads (And You Have No Idea)
Your roofing company website has a file on it right now that tells ChatGPT and Perplexity to stay away. You've never heard of it. You don't know it's there. And it's costing you leads.
When a homeowner asks ChatGPT "best roofing contractor near me" after a storm, the AI can't recommend you if it can't read your website. It recommends a national chain instead. That's the lead that should have come to you. It went somewhere else, silently, with no explanation.
The file is called robots.txt. The fix takes under 10 minutes. And most contractors have no idea they need to do it.
What Robots.txt Is (And Why Your Website Probably Has One)
Think of robots.txt like a "No Soliciting" sign on your front door. It tells internet bots what they're allowed to read and what they should stay away from.
It was originally designed to keep spammy scrapers out. But here's the problem. When web developers built roofing company websites on WordPress, Wix, Squarespace, or ServiceTitan's website tools, they often left a default setting that blocks all bots — including ChatGPT (GPTBot), Perplexity (PerplexityBot), and Claude (ClaudeBot).
The developer wasn't trying to block AI. These tools didn't exist when the site was built in 2019. But the file is still there. And it's still blocking them today.
Here's what the blocking code looks like:
User-agent: GPTBot
Disallow: /

That "Disallow: /" means "block everything." GPTBot can't read any part of your website.
Here's what the fixed version looks like:
User-agent: GPTBot
Allow: /

One word changed. That's the entire fix.
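For reference, a robots.txt that explicitly welcomes all three AI crawlers named in this article might look like the example below. The user-agent strings (GPTBot, PerplexityBot, ClaudeBot) are the ones each company documents for its crawler; everything else here is a generic example, not a drop-in replacement for your site's existing rules:

```text
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: *
Allow: /
```

If your current file has other rules (for example, blocking an admin directory), keep those and only change the lines that block the AI crawlers.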
How to Check If Your Website Has This Problem Right Now
Do this right now while you're reading this. It takes 60 seconds.
Open a browser and go to: yourdomain.com/robots.txt — replacing "yourdomain.com" with your actual website address.
Scenario 1: You see text with "Disallow: /" next to GPTBot, PerplexityBot, or ClaudeBot. Your website is blocking those AI crawlers. That's the problem. Keep reading.
Scenario 2: You see "User-agent: *" followed by "Disallow: /". You're blocking every bot, including AI crawlers. Same fix.
Scenario 3: The page is blank or says "404 Not Found." You likely don't have a robots.txt file at all. That's fine — AI crawlers can read your site by default.
Scenario 4: You see text but no blocking entries for AI crawlers. Your website is probably fine.
If you found the problem: contact your web developer and ask them to update the robots.txt file to allow GPTBot, PerplexityBot, and ClaudeBot. If your site is on WordPress, the setting is usually in Settings > Reading or in the Yoast SEO plugin. Most developers can fix this in under 30 minutes.
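If you'd rather script the check than eyeball the file, Python's standard library includes a robots.txt parser you can point at any robots.txt body. This is a minimal sketch; the blocking file shown is a made-up example, and in practice you'd paste in (or fetch) the contents of your own domain's robots.txt:

```python
# Check whether a robots.txt body blocks a given AI crawler.
# Minimal sketch using Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

def crawler_allowed(robots_txt: str, user_agent: str, path: str = "/") -> bool:
    """Return True if `user_agent` may fetch `path` under these rules."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)

# Hypothetical robots.txt that blocks only GPTBot (Scenario 1 above).
blocking_example = """User-agent: GPTBot
Disallow: /
"""

for bot in ("GPTBot", "PerplexityBot", "ClaudeBot"):
    status = "allowed" if crawler_allowed(blocking_example, bot) else "BLOCKED"
    print(f"{bot}: {status}")
```

Running this against the example prints that GPTBot is blocked while the other two are allowed, which is exactly the Scenario 1 situation described above.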
Fixing Robots.txt Is the Start, Not the Finish
Opening the door matters. But AI needs something worth reading on the other side.
Even if you fix robots.txt today, ChatGPT still won't recommend you if your website doesn't answer the questions homeowners actually ask. "How do I know if my roof needs replacing or just repairing?" "How long does a roof replacement take?" "What roofing material works best for [region] weather?"
These are the questions homeowners ask ChatGPT. If those answers aren't clearly on your website, AI has nothing to cite.
Then there's listing consistency. Google Business Profile, Angi, HomeAdvisor, Better Business Bureau, your local chamber listing — all need the same business name, address, and phone number. AI cross-references these sources to decide if a business is legitimate.
A blocked robots.txt file is like a locked door. Content and consistency are what's on the other side of that door. Both matter.
Why This Matters Most Going Into Storm Season
Storm season is when AI search for local roofers spikes. A homeowner wakes up with roof damage. They don't Google "roofer near me" the way they used to.
They open ChatGPT on their phone and ask: "Best roofing contractor near me that handles storm damage insurance claims?"
The contractor who gets cited in that response gets the call. The one with a blocked robots.txt file doesn't get considered at all.
National chains like Storm Guard have full digital teams. They've already fixed this. Local contractors who move now still have time to own their market's AI search position before storm season peaks.
AI search visitors convert up to 4.4x better than traditional organic search visitors. These aren't casual browsers. These are homeowners who asked for a recommendation and got your name.
Frequently Asked Questions
What is robots.txt and why does it matter for AI search?
Robots.txt is a file on your website that tells bots what they're allowed to read. It was designed to block spam scrapers. The problem is that older websites often have settings that block AI crawlers like GPTBot (ChatGPT) and PerplexityBot. If your robots.txt blocks those crawlers, ChatGPT can't read your website and won't recommend you.
How do I check if my roofing website is blocking ChatGPT?
Go to yourdomain.com/robots.txt in a browser. Look for entries that say "User-agent: GPTBot" or "User-agent: PerplexityBot" followed by "Disallow: /". If you see those lines, your site is blocking AI crawlers. If the page is blank or says "404 Not Found," you likely don't have a blocking issue.
What is GPTBot and why should I allow it on my website?
GPTBot is ChatGPT's web crawler. It reads your website so ChatGPT can cite it in answers. If you block GPTBot with robots.txt, ChatGPT can't recommend your roofing company even if you're the best option in the area. Allowing it takes one line change in your robots.txt file.
Will fixing my robots.txt file make my roofing company show up in ChatGPT?
Fixing robots.txt opens the door so ChatGPT can read your site. But ChatGPT also needs content that directly answers the questions homeowners ask, and consistent business information across directories. Robots.txt is step one. Content and consistency are steps two and three.
How long does it take to fix a robots.txt file?
Most web developers can fix it in under 30 minutes. The change itself is one word. If your site is on WordPress, you might be able to fix it yourself through the settings panel. On hosted platforms like Wix or Squarespace, the robots.txt file is managed by the platform itself, so check the platform's SEO settings or ask your developer.
Can I fix robots.txt myself or do I need a developer?
If your site is on WordPress, you can usually fix it yourself through Settings > Reading or the Yoast SEO plugin. If you're on Wix, Squarespace, or a custom platform, contact your developer. Either way, the fix itself is simple once you know where to look.
Does robots.txt affect my Google rankings as well as AI search?
Robots.txt can affect both. If you block all bots, you're also blocking Googlebot. That's worse for Google rankings than for AI search. Check your robots.txt right now. If it contains "User-agent: *" followed by "Disallow: /" (block all bots), fix it immediately.
Ready to become the answer in AI search?
Start with an AI Visibility Audit. See exactly where you stand and what to fix.
Get Your AI Audit | $197