AI Visibility Scanner: How to Check and Improve Your GEO Score
AI search engines — Google AI Overviews, Perplexity, ChatGPT Search — extract answers directly from web pages and often cite only one or two sources. This guide explains what signals they look for and how to use the free AI Visibility Scanner to audit and improve any page in minutes.
What Is AI Visibility?
Traditional SEO optimises for ranking position — appearing in a list of ten blue links. AI visibility (also called Generative Engine Optimisation, or GEO) is about something different: being the source an AI engine extracts and cites when it synthesises an answer to a user's question.
The distinction matters because AI-generated answers often displace clicks entirely. If a user asks Perplexity “what is the best way to convert time zones?” and Perplexity answers from your page, you receive brand visibility — and sometimes a direct citation link — without the user performing a traditional search click at all.
Pages that rank well in traditional search do not automatically have high AI visibility. The signals are different. A page with excellent backlinks but no structured data, thin content, and no FAQ section may be largely invisible to AI engines even while appearing on page one of Google.
How to Use the AI Visibility Scanner
- Open the AI Visibility Scanner
- Enter the full URL of any publicly accessible page — homepage, blog post, product page, or tool page
- Click Scan — the tool fetches the page, checks robots.txt and sitemap.xml, and analyses 25+ signals
- Review your score out of 100 and the category breakdown across Technical, Meta & SEO, Content Signals, and Structured Data
- Expand each category to see individual check results with specific fix recommendations
Each check shows a pass (✓), warning (!), or fail (✗) status alongside the points earned and available. Fail items should be addressed first — fixing them offers the largest point gains. Warning items are quick wins that add incremental improvement.
The Four Scoring Categories
Technical (25 points)
Technical signals establish basic trust with AI crawlers. The five checks are HTTPS, page accessibility, canonical URL, robots.txt, and XML sitemap. These are table stakes — a page without a canonical URL or sitemap is harder for crawlers to include in their knowledge base. HTTPS is especially important because some AI crawler policies deprioritise or skip non-HTTPS pages entirely.
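As an illustration, a minimal robots.txt that permits crawling and declares the sitemap location might look like the following (example.com is a placeholder for your own domain, and the permissive policy is an assumption — adjust it to match your site's crawl rules):

```text
# /robots.txt — allow all crawlers, declare the sitemap
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Declaring the sitemap inside robots.txt means any crawler that fetches robots.txt can discover the sitemap without a separate submission.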
Meta & SEO (25 points)
Meta signals tell AI engines what a page is about before they parse the content. The checks cover title tag presence and length (50–70 characters), meta description presence and length (120–160 characters), Open Graph tags (og:title, og:description, og:image), and Twitter/X card. Even AI engines that do not display traditional search snippets use these fields to classify and summarise content.
Content Signals (25 points)
Content signals measure whether the page's visible text is well-structured and information-dense. The checks look for an H1 heading, multiple H2 headings, content depth (text character count), FAQ or question-and-answer patterns, and structured content such as lists or tables. AI engines extract answers by parsing these elements — a page with a clear heading hierarchy and explicit Q&A sections is far easier to mine than a wall of unstructured text.
Structured Data (25 points)
Structured data is the highest-impact category for AI visibility because it provides machine-readable facts in a standardised format. The checks look for any JSON-LD script blocks, FAQPage schema (the single highest-scoring check), Article or WebPage schema, BreadcrumbList, and Organization or WebSite schema. Pages with FAQPage schema are explicitly surfaced in Google rich results and are far more likely to be cited in AI-generated answers.
Score Interpretation
| Score | Grade | What It Means |
|---|---|---|
| 85–100 | Great | Strong AI visibility — focus on remaining fails to maximise reach |
| 65–84 | Good | Solid foundations with addressable gaps in structured data or content depth |
| 40–64 | Fair | Meaningful barriers that prevent AI engines from reliably citing the page |
| 0–39 | Poor | Significant issues — missing structured data and meta tags are the immediate priority |
The Fastest Fixes for Each Category
Technical fixes (fast)
Add a canonical tag if missing — one line of HTML in the <head>. Ensure /robots.txt exists and is not blocking crawlers. Submit your sitemap to Google Search Console and include the sitemap URL in robots.txt. These are typically one-time setup tasks that take under 30 minutes total.
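The canonical tag itself is the one-line fix described above; the URL shown here is a placeholder for the page's own preferred absolute URL:

```html
<!-- Inside <head>; point at the preferred, absolute URL for this page -->
<link rel="canonical" href="https://example.com/blog/your-post/">
```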
Meta fixes (30 minutes)
Run your title and description through the SERP Preview tool to verify lengths visually before saving. Aim for 50–60 characters on the title and 150–160 on the description. Add og:title, og:description, and og:image if missing — most CMS platforms have an SEO plugin that adds these in a single form. The Twitter card tag is a one-liner: <meta name="twitter:card" content="summary_large_image">.
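For sites without an SEO plugin, the three Open Graph tags can be added by hand in the page head; the content values below are placeholders to replace with your own:

```html
<meta property="og:title" content="Your page title">
<meta property="og:description" content="A one-sentence summary of the page.">
<meta property="og:image" content="https://example.com/og-image.png">
```

og:image should be an absolute URL, since social and AI crawlers fetch it without resolving relative paths against your page.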
Content fixes (1–2 hours)
Ensure every page has exactly one H1 and at least three H2 headings. If your page is content-thin (under 800 visible characters), expand it — AI engines favour pages with detailed, in-depth coverage of their topic. Add a FAQ section with 4–6 questions and answers. The questions should match what users actually ask; use Google's “People also ask” or your own Search Console query data as a source.
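One crawler-friendly pattern for the FAQ section is a heading per question with the answer as a plain paragraph directly beneath it. The markup below is a sketch; the questions and answers are illustrative:

```html
<h2>Frequently asked questions</h2>

<h3>What is AI visibility?</h3>
<p>Being the source an AI engine extracts and cites when it answers a user's question.</p>

<h3>How is GEO different from traditional SEO?</h3>
<p>SEO optimises ranking position; GEO optimises being extracted and cited in AI-generated answers.</p>
```

Keeping each answer self-contained in a single paragraph makes it easy for an engine to lift the question-and-answer pair verbatim.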
Structured data fixes (highest ROI)
Add FAQPage JSON-LD using the questions from your new FAQ section. Most frameworks accept a simple JSON-LD script block in the page head or body. If your site has a shared layout, add Organization and WebSite schema once — it covers every page. Add Article or WebPage schema to blog posts; WebApplication schema to tool pages. BreadcrumbList is typically auto-generated by SEO plugins, or can be emitted by your breadcrumb component — adding it to the shared layout template covers the whole site.
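A minimal FAQPage block, mirroring the questions from the FAQ section, looks like this (the question and answer text are placeholders to swap for your own):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is AI visibility?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Being the source an AI engine extracts and cites when it answers a user's question."
      }
    },
    {
      "@type": "Question",
      "name": "How is GEO different from traditional SEO?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "SEO optimises ranking position; GEO optimises being extracted and cited in AI-generated answers."
      }
    }
  ]
}
</script>
```

The JSON-LD text should match the visible FAQ content on the page — Google's structured data guidelines require that marked-up answers also appear to users.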
GEO vs Traditional SEO: Key Differences
| Signal | Traditional SEO weight | GEO / AI visibility weight |
|---|---|---|
| Backlinks | Very high | Low (not directly readable) |
| Page speed / Core Web Vitals | High | Low (not scored by AI parsers) |
| FAQPage schema | Medium (rich result) | Very high (direct answer extraction) |
| H1 / heading structure | Medium | High (topic classification) |
| Meta description length | Medium (CTR signal) | High (page summary for AI) |
| JSON-LD structured data | Medium | Very high |
| Content depth / word count | High | High |
| Canonical URL | High | High (preferred URL for citation) |
Common Questions
Why does my page score low despite ranking well in Google?
Traditional Google ranking relies heavily on signals the AI visibility scanner does not measure — backlinks, page speed, user engagement metrics, and domain authority. A page can rank on page one while still missing FAQPage schema, having a thin meta description, and having no JSON-LD at all. GEO optimisation addresses the subset of signals that AI engines use for content extraction, which overlaps only partially with ranking signals.
Can I scan pages behind a login?
No. The scanner fetches pages as a public crawler — it cannot authenticate. Only scan pages that are publicly accessible at the URL you enter. For private staging environments, consider running the scan after a public release or temporarily making the URL accessible.
How often should I re-scan?
After any significant content update, after adding or modifying schema markup, and after changing your title or meta description. It is also useful to scan competitor pages to benchmark your structured data coverage against theirs and identify gaps.
Scan Your Page's AI Visibility — Free
Enter any URL and get a scored report with specific fix recommendations. No signup, no limits.
Open AI Visibility Scanner →