GEO: how to get your website found by ChatGPT, Perplexity and Gemini
More and more people search for answers via AI instead of Google. If your website isn't optimized for AI search engines, you simply won't be cited. In this article I explain what GEO (Generative Engine Optimization) is, why it matters, and how to get started today.
AI & automation consultant. Helps B2B companies with lead generation, workflow automation, and AI training.
What is GEO and why does it matter?
GEO (Generative Engine Optimization) is optimizing your website so that AI search engines can find, understand and cite your content in their answers. It's the next evolution after SEO: where SEO focuses on Google's ranking algorithm, GEO focuses on how AI systems select and cite sources.
The urgency is clear. Gartner predicts that traditional search traffic will drop by 50% by 2028 as consumers switch to AI-powered answers. AI-referred traffic is growing by 527% per year, according to BrightEdge research. Google itself is integrating AI Overviews into more and more search results.
This means that some of your potential customers no longer click through to your website, but get their answer directly from an AI. The question is: is your business being cited in that answer, or your competitor's?
If you're already investing in SEO with AI, then GEO is the logical next step. It's not a replacement for what you're already doing, but an expansion to a new channel that's growing fast.
SEO vs GEO: what's the difference?
SEO and GEO share the same goal (being found), but the mechanisms are fundamentally different.
| | Traditional SEO | GEO |
|---|---|---|
| Goal | Rank high in search results | Get cited in AI answers |
| Traffic | Clicks to your website | Brand awareness + clicks via citations |
| Signals | Backlinks, keywords, speed | Structured data, citability, entity strength |
| Content | Keyword-optimized | Factual, citable, structured |
| Measurement | Rankings, organic traffic | AI citations, referral traffic, brand mentions |
| Technical | Meta tags, sitemap, robots.txt | llms.txt, JSON-LD, AI crawler rules, FAQ schema |
Important: GEO does not replace SEO. The best strategy combines both. A site that scores well in Google and gets cited by AI systems has two traffic sources instead of one.
How AI search engines work
To understand how to optimize, you need to understand how AI search engines select sources. The process has three phases.
Phase 1: Crawling and indexing
AI companies send crawlers (GPTBot, ClaudeBot, PerplexityBot) that visit and index your website. They read your HTML, structured data, robots.txt and llms.txt. This is similar to how Googlebot works, but the crawlers are optimized for extracting factual information, not for evaluating backlinks.
Phase 2: Understanding and entity recognition
The AI model processes the crawled content and tries to understand who you are (entity), what you do (services), and how trustworthy you are (authority). Structured data, consistent NAP data across multiple sources, and external mentions strengthen this understanding. The stronger your entity, the greater the chance you'll be cited.
Phase 3: Citation and source selection
When a user asks a question, the AI model selects the most relevant and trustworthy sources to cite. It chooses sources with clear, factual statements, rich structured data, and demonstrable expertise. Vague marketing language and anonymous claims are ignored. Concrete numbers, comparison tables and FAQ answers get cited.
7 concrete steps to make your site GEO-proof
These are the steps we execute at our SEO service for every client. They work for any B2B website.
Step 1: Create an llms.txt file
This file gives AI crawlers a structured summary of your company, services and content. It's like a business card for AI systems. Place it in the root of your website and describe who you are, what you do, and which pages are important.
Tip: Use markdown format. Start with a one-liner about your company, followed by sections for services, articles and contact info.
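As an illustration, a minimal llms.txt could look like the sketch below. The company name, URLs and services are placeholders; adapt the structure to your own site.

```markdown
# Acme Consulting

> B2B consultancy for lead generation, workflow automation and AI training.

## Services
- [AI Training](https://example.com/services/ai-training): Hands-on AI training for B2B teams
- [Workflow Automation](https://example.com/services/automation): Automating repetitive processes

## Articles
- [GEO guide](https://example.com/blog/geo): How to get your site cited by AI search engines

## Contact
- Email: info@example.com
```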
Step 2: Enrich your structured data (JSON-LD)
AI systems rely heavily on Schema.org structured data. Add Organization, Person, Service, Article and FAQPage schema to your pages. The richer your schema, the better AI understands you. Add fields that most sites forget: hasOfferCatalog, knowsAbout, sameAs, areaServed.
Tip: Google's Rich Results Test shows whether your schema is technically correct, but for GEO it also needs to be rich in content. Add descriptions, price signals and concrete facts.
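A trimmed example of what an enriched Organization schema with those fields could look like (the name, URLs and services are placeholders, not a prescribed template):

```json
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "name": "Acme Consulting",
  "url": "https://example.com",
  "areaServed": "NL",
  "sameAs": ["https://www.linkedin.com/company/acme"],
  "knowsAbout": ["AI automation", "lead generation", "workflow automation"],
  "hasOfferCatalog": {
    "@type": "OfferCatalog",
    "name": "Services",
    "itemListElement": [
      {
        "@type": "Offer",
        "itemOffered": {
          "@type": "Service",
          "name": "AI Training",
          "description": "Hands-on AI training for B2B teams"
        }
      }
    ]
  }
}
```

Embed this in a `<script type="application/ld+json">` tag in the page head.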
Step 3: Configure robots.txt for AI crawlers
Add explicit Allow rules for GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot and Google-Extended. Also add a Sitemap directive. This gives AI crawlers a positive signal that your content is available.
Tip: Don't block AI crawlers unless you have a deliberate reason. Every blocked crawler is a missed citation opportunity.
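In practice those rules are a short addition to your existing robots.txt; the sitemap URL below is a placeholder:

```txt
# Explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://example.com/sitemap.xml
```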
Step 4: Make your content citable
AI systems cite content that contains clear, factual statements. Write definition sentences ('X is Y'), use comparison tables, provide concrete numbers and statistics, and structure your content with clear headings. Avoid vague marketing language.
Tip: Test your own content: if you were to cite the first sentence of each section, would it be a useful fact? If not, rewrite it.
Step 5: Build your brand entity
AI systems need to recognize you as a trusted entity. This requires consistent information across multiple sources: your website, LinkedIn, directories (Clutch, GoodFirms), Google Business Profile, and external mentions. The more places your name, contact details and description appear consistently, the stronger your entity.
Tip: Start with the basics: a LinkedIn Company Page, a Google Business Profile, and listings on 2-3 relevant directories. Ensure the same NAP (name, address, phone) appears everywhere.
Step 6: Add FAQ schema to every page
FAQPage schema is one of the most powerful GEO tools. AI systems use FAQ content directly in their answers. Add 5-10 substantive questions and answers to your most important pages. Give real, detailed answers, not one-liners.
Tip: Use the questions your customers actually ask. Also check 'People Also Ask' in Google and the questions ChatGPT suggests about your topic.
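A minimal FAQPage schema for one question looks like this; add one object per Q&A pair to `mainEntity` (the question and answer text here are just examples):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is GEO (Generative Engine Optimization)?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "GEO is optimizing your website so that AI search engines can find, understand and cite your content in their answers."
      }
    }
  ]
}
```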
Step 7: Build content authority
Publish in-depth, well-structured articles about your area of expertise. Use external sources and references. Add a visible author bio with credentials. Link internally between related pages. AI systems trust content from sources that demonstrate verifiable expertise.
Tip: An article of 3,000+ words with citations, an author bio and internal links scores better than ten short blog posts without depth.
Case: what we did on waibase.nl
We practice what we preach. This is exactly what we implemented on our own site.
Created llms.txt
A structured file with company description, all 6 services with one-line descriptions, all knowledge base articles with summaries, and contact details. AI crawlers can now understand what WaiBase is and does within seconds.
Extended robots.txt
Added explicit Allow rules for GPTBot, ClaudeBot, PerplexityBot, Google-Extended and CCBot. Plus a Sitemap directive so crawlers can find all pages.
Enriched Organization JSON-LD
From a bare Organization with just name and URL to a ProfessionalService with logo, foundingDate, areaServed, sameAs (LinkedIn), and a hasOfferCatalog with all 6 services including descriptions.
Enriched Person JSON-LD
Extended the Person schema for Nico Waiboer with a bio, image, 12 knowsAbout items (including specific tools like n8n, Clay and Claude AI), and sameAs links to LinkedIn and nicowaiboer.nl.
Added AuthorBox and Breadcrumb
All knowledge base articles now have a visible author bio with photo, publication date, and BreadcrumbList JSON-LD. This strengthens both E-E-A-T and AI citability.
Enriched Article schema
All articles now have image, inLanguage, wordCount, articleSection, keywords, and publisher.logo in their JSON-LD. Plus dateModified that is actually kept up to date.
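As a sketch, an Article schema with those fields could look like the fragment below (titles, dates and URLs are illustrative placeholders, not the actual values on waibase.nl):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "GEO: how to get your website found by AI search engines",
  "image": "https://example.com/images/geo-cover.jpg",
  "inLanguage": "en",
  "wordCount": 2400,
  "articleSection": "Knowledge base",
  "keywords": ["GEO", "AI search", "structured data"],
  "datePublished": "2025-01-15",
  "dateModified": "2025-03-01",
  "author": { "@type": "Person", "name": "Nico Waiboer" },
  "publisher": {
    "@type": "Organization",
    "name": "WaiBase",
    "logo": { "@type": "ImageObject", "url": "https://example.com/logo.png" }
  }
}
```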
Result
After this implementation, waibase.nl is fully visible and readable for all major AI crawlers. The structured data is complete enough for AI systems to recognize WaiBase as an entity and cite it. The next step is building external signals (directory listings, guest articles, reviews).
Tools you need
Google Search Console
Free. Monitor your indexing, crawl errors and search performance. Check if Google recognizes your structured data.
Google Rich Results Test
Free. Validate your JSON-LD schema. Shows which rich results your pages can generate.
Schema.org Validator
Free. Checks whether your structured data is correct according to the Schema.org specification.
Semrush
Paid (from $140/month). Tracks your visibility in AI Overviews, monitors brand mentions, and analyzes your competitors.
Screaming Frog
Free up to 500 URLs, then $259/year. Crawls your entire site and finds technical SEO and GEO issues.
Claude AI
Paid ($20/month for Pro). Use it to analyze your content for citability, generate your structured data, and write your llms.txt.
What does GEO optimization cost?
Transparent pricing. Also check out our AI consulting service and our article about the costs of AI implementation.
GEO Quick Scan
€750. Complete audit of your current GEO status: robots.txt, structured data, content citability, brand entity strength and llms.txt. You receive a priority list with concrete fixes.
GEO Project
€4,000 - 15,000. Implementation of all technical GEO improvements: llms.txt, enriched structured data, robots.txt, content optimization, FAQ schema and author markup on all pages.
GEO Retainer
€995 - 3,000/month. Ongoing management: monitoring of AI citations, content updates, structured data maintenance, optimizing new articles, and building external signals.
5 mistakes companies make with GEO
Mistake 1: Only focusing on SEO and ignoring GEO
Many companies invest thousands of euros in traditional SEO but completely ignore GEO. Meanwhile, 10-15% of search traffic is already shifting to AI answers. If your competitor is visible there and you're not, you're losing customers without realizing it.
Mistake 2: Thinking structured data is optional
For traditional SEO, you can still rank fine without structured data. For GEO, it's essential. AI systems depend on structured signals to understand who you are and what you do. Without schema, you're invisible.
Mistake 3: Publishing generic content without unique insights
AI systems cite sources that add unique value. If your content says the same thing as ten other sites, why would an AI cite you specifically? Add your own data, case studies, specific tools and personal experience.
Mistake 4: Anonymous testimonials and vague claims
AI systems can't verify anonymous quotes. 'A client says...' without a name or company gets ignored. Use real names, real companies, real numbers. That's what AI systems cite.
Mistake 5: Optimizing once and then stopping
GEO is not a one-time action. AI systems continuously update their knowledge. Your competitors are optimizing. New crawlers appear. Monitor your AI visibility regularly and keep improving your content and structured data.
Action plan: how to get started
Do a GEO audit of your current site
Check your robots.txt, structured data, llms.txt, content citability and brand entity signals. Use Google's Rich Results Test and the Schema.org validator. Search for your own brand in ChatGPT and Perplexity to see if you're being cited.
Implement the technical foundation
Create an llms.txt file. Update your robots.txt with AI crawler rules. Enrich your JSON-LD schema on all pages. This is the foundation everything builds on.
Optimize your most important pages
Start with your homepage, service pages and your best-performing content. Add citable definition sentences, enrich the structured data, and add FAQ schema.
Build external signals
Create a LinkedIn Company Page. List your company on relevant directories. Ensure consistent NAP data everywhere. Publish guest articles on relevant blogs.
Monitor and iterate
Track your AI referral traffic in Google Analytics. Check monthly whether you're being cited in AI answers. Update your content and structured data based on what works.
Frequently asked questions about GEO
What is GEO (Generative Engine Optimization)?
GEO is optimizing your website so that AI search engines like ChatGPT, Perplexity and Google AI Overviews can find, understand and cite your content. It goes beyond traditional SEO by focusing on structured data, citable content and signals that AI systems use to select sources.
Does GEO replace traditional SEO?
No. GEO is a complement to SEO, not a replacement. You still need good technical SEO, fast load times and relevant content. GEO adds a layer on top: content specifically optimized for how AI systems process and cite information.
Which AI search engines should I optimize for?
The most important ones are Google AI Overviews (integrated into Google Search), ChatGPT Search (OpenAI), Perplexity, and Gemini (Google). Additionally, there are specific crawlers like GPTBot, ClaudeBot and PerplexityBot that index your content.
What is llms.txt and do I need it?
llms.txt is a file in the root of your website that gives AI crawlers a summary of who you are, what you do and what content you have. It's comparable to robots.txt but for AI systems. It's not mandatory, but it helps AI systems understand your site faster and better.
How much does GEO optimization cost?
A GEO Quick Scan costs 750 euros. A complete GEO project including technical implementation, content optimization and structured data costs between 4,000 and 15,000 euros. Ongoing management and monitoring starts from 995 euros per month.
How long before I see results?
Technical GEO improvements like llms.txt, structured data and robots.txt can be implemented within 1 to 2 weeks. The impact on AI citations builds up over 2 to 6 months, depending on your current authority and competition in your niche.
Can I do GEO myself or do I need a specialist?
You can do the basics yourself: create llms.txt, update robots.txt, add FAQ schema. For enriched structured data, content strategy and ongoing optimization, a specialist is recommended. The technical implementation requires knowledge of JSON-LD, Schema.org and AI crawler behavior.
How do I measure if GEO is working?
Monitor your traffic from AI sources via Google Analytics (referral traffic from chat.openai.com, perplexity.ai, etc.). Use tools like Semrush to track your visibility in AI Overviews. Regularly check if your brand is cited in AI answers by asking the same questions your customers ask.
Do AI crawlers block my content if I don't explicitly allow them?
Most AI crawlers respect robots.txt. If you have no specific rules, your site is accessible by default. But explicit Allow rules for GPTBot, ClaudeBot and PerplexityBot send a positive trust signal. Some crawlers are more cautious when they don't see explicit permission.
What is the difference between SEO and GEO?
SEO optimizes for Google's ranking algorithm: you want to rank high in search results. GEO optimizes for AI citation: you want AI systems to cite your content in their answers. SEO is about links and rankings, GEO is about structured data, citable facts and entity recognition.
Want to know how your site scores?
Book a GEO Quick Scan and discover where the opportunities are. Or check out our SEO with AI service for a complete approach.
Book a strategy call