30-Second Summary

  • 5 checks, all free: site: command, robots.txt, Google Search Console, Core Web Vitals, title tags.
  • Most common cause: a blocking robots.txt or a missing sitemap — fixable in under 30 minutes.
  • Tools needed: Google Search Console (free) + PageSpeed Insights (free).
  • After these 5 checks: you will know exactly what is blocking your site — and what to fix first.

Check #1 — The site: Command (2 minutes)

Before anything else, verify that Google has indexed your site at all. Open a new Google tab and type:

site:yourdomain.ca

Replace yourdomain.ca with your actual domain. Two possible results:

  • Results appear: Google knows your site exists and has indexed at least some pages. The problem is ranking, not indexing — continue with checks 2-5.
  • No results: Google has never indexed your site, or all your pages have been removed from the index. This is a critical issue — go directly to check #3 (Search Console).

Also note how many pages appear. A 50-page site showing only 3 results for a site: search means 47 pages are not indexed — a significant SEO problem.


Check #2 — robots.txt (5 minutes)

The robots.txt file tells Google's bots what they can and cannot crawl. A single misplaced line can block your entire site from Google. Open a new tab and go to:

https://yourdomain.ca/robots.txt

What to look for: a line like Disallow: / under User-agent: * means you are blocking ALL crawlers from your entire site. This is the most common mistake on newly launched sites — developers often forget to remove the development environment's blocking rules before going live.

A safe, recommended robots.txt for a Canadian SMB in 2026:

User-agent: *
Disallow:

User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

Sitemap: https://yourdomain.ca/sitemap.xml

This configuration allows Google, Bing and other search engines to crawl freely, while blocking AI training bots that do not bring traffic. Adjust based on your preferences around AI training.
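
To sanity-check a robots.txt file programmatically, Python's standard library ships a parser. A minimal sketch, assuming the file's contents have been pasted in (yourdomain.ca is a placeholder, not a real domain):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, fetch
# https://yourdomain.ca/robots.txt and paste it here.
robots_txt = """
User-agent: *
Disallow:

User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under "User-agent: *", whose empty Disallow allows everything.
print(parser.can_fetch("Googlebot", "https://yourdomain.ca/"))  # True
# GPTBot has its own group with "Disallow: /", so it is blocked.
print(parser.can_fetch("GPTBot", "https://yourdomain.ca/"))     # False
```

If the first call prints False for your real file, you have found the accidental Disallow: / described above.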


Check #3 — Install Google Search Console (15 minutes)

Google Search Console (GSC) is the most important free tool for any business with a website. If it is not installed, you are flying blind — you have no data on how Google sees your site. Go to search.google.com/search-console and add your property.

Once installed and verified, check these 3 things immediately:

  • Indexing → Pages (formerly Coverage): lists pages that should be indexed but have crawl errors (404s, redirect loops, server errors).
  • Sitemaps: submit your sitemap.xml if not already done. A sitemap tells Google exactly which pages exist and should be prioritized for crawling.
  • URL Inspection: paste any page URL to see if it is indexed, when it was last crawled, and whether any issues were detected.

Quick win: if your key pages are not indexed, use URL Inspection → "Request Indexing" for each one. This typically speeds up indexing by several days.
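
A sitemap is just an XML file listing your URLs. A minimal sketch of generating one with the Python standard library (the page URLs are placeholders, not real paths):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical page list; replace with your site's real URLs.
pages = [
    "https://yourdomain.ca/",
    "https://yourdomain.ca/services",
    "https://yourdomain.ca/contact",
]

# Root element with the namespace the sitemap protocol requires.
urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = page

sitemap_xml = tostring(urlset, encoding="unicode", xml_declaration=True)
print(sitemap_xml)
```

Save the output as sitemap.xml at your site root, then submit that URL in Search Console. Most CMSs (WordPress, Shopify, Wix) generate this file automatically, so check for an existing one first.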

Check #4 — Core Web Vitals (5 minutes)

Speed has been a direct Google ranking factor since June 2021. A site that passes the Core Web Vitals thresholds has a measurable advantage over slower competitors. Test your site at pagespeed.web.dev and check these 3 metrics:

Metric | What it measures                 | ✅ Good   | ⚠️ Needs Improvement | 🔴 Poor
LCP    | Main content load time           | ≤ 2.5 s  | 2.5–4.0 s            | > 4.0 s
INP    | Responsiveness to interactions   | ≤ 200 ms | 200–500 ms           | > 500 ms
CLS    | Visual stability (layout shifts) | ≤ 0.1    | 0.1–0.25             | > 0.25

Source: Google — web.dev/vitals

Focus on mobile results — Google has used mobile-first indexing since 2020. If all 3 numbers are red on mobile, fixing speed is a higher priority than any other SEO task.
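
The thresholds in the table above can be turned into a small classifier to triage your own PageSpeed numbers. A sketch (the measured values below are hypothetical, not real results):

```python
def rate_metric(value, good, poor):
    """Classify a Core Web Vitals value against Google's published thresholds."""
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

# (upper bound for "good", lower bound for "poor"), per web.dev/vitals.
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "INP": (200, 500),   # milliseconds
    "CLS": (0.1, 0.25),  # unitless score
}

# Hypothetical mobile measurements from PageSpeed Insights.
measured = {"LCP": 3.1, "INP": 180, "CLS": 0.32}

for metric, value in measured.items():
    good, poor = THRESHOLDS[metric]
    print(metric, "->", rate_metric(value, good, poor))
```

With these sample numbers, LCP needs improvement, INP is fine, and CLS is the red metric to fix first.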


Check #5 — Title Tags and Meta Descriptions (10 minutes)

Title tags are the clickable blue links in Google results. They are one of the strongest on-page signals. Check the top 5 pages of your site:

  • Is each page title unique? Duplicate title tags across multiple pages confuse Google and split ranking potential.
  • Are titles under 60 characters? Longer titles are truncated in Google results.
  • Does each title contain the target keyword? Specifically the term your clients actually type in Google.
  • Does the homepage title clearly describe your business and location? Example: "Plumber Montreal — Emergency Service 24/7 | CompanyName" outperforms "CompanyName — Welcome".

To check your titles: right-click any page → View Page Source → Ctrl+F → search for <title>. Or use the URL Inspection tool in Search Console.
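
The manual view-source check can also be scripted for a handful of pages. A sketch using Python's built-in HTML parser, with a hypothetical homepage as input:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside the page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical page source; in practice, fetch each page's real HTML.
html = ('<html><head><title>Plumber Montreal — Emergency Service 24/7 '
        '| CompanyName</title></head><body></body></html>')

parser = TitleParser()
parser.feed(html)
print(parser.title)
print("under 60 chars:", len(parser.title) <= 60)
```

Run it over your top 5 pages, then compare the collected titles: any duplicates or titles over 60 characters go on the fix list.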


You've run the 5 checks and found several issues. Prioritizing what to fix first is exactly what the audit is designed for.

See Our SEO Service →

Your Action Plan Checklist

  1. Run site:yourdomain.ca — confirm Google has indexed your site and count indexed pages.
  2. Open yourdomain.ca/robots.txt — verify no accidental Disallow: / is blocking crawlers.
  3. Install Google Search Console (if not done) — submit sitemap, fix Coverage errors, request indexing for key pages.
  4. Test Core Web Vitals at pagespeed.web.dev — fix any red mobile metrics.
  5. Audit title tags — ensure each page has a unique, keyword-rich title under 60 characters.

FAQ: 8 Questions About Indexing and Visibility

How do I know if my site is indexed by Google?

Type site:yourdomain.ca in Google search (no spaces). If results appear, your site is indexed. If Google returns "no results found", your site is either not yet indexed or has been excluded by a technical issue (robots.txt, noindex tag, or a crawl error in Search Console).

How long does it take Google to index a new site?

Between 1 day and 4 weeks for a new site. To accelerate indexing: install Google Search Console, submit your sitemap.xml, and use URL Inspection to manually request indexing for your key pages.

What is robots.txt and why does it matter?

robots.txt is a text file at the root of your site that tells search engine bots which pages to crawl or ignore. A misconfigured robots.txt (e.g. Disallow: /) can block all crawlers and make your entire site invisible to Google. Always check it at yourdomain.ca/robots.txt.

What is Google Search Console?

Google Search Console is Google's free tool for webmasters. It shows which pages are indexed, which keywords generate impressions and clicks, technical errors to fix, and lets you submit your sitemap. It is 100% free and essential for any business with a website.

Why does my site appear for my company name but not for my keywords?

This is normal for a new or young site. Google indexes brand name searches quickly but needs more time and authority signals (backlinks, content depth, E-E-A-T) to rank for competitive keywords. Check Search Console Impressions — if your target keywords appear with 0 clicks, you are crawled but not ranked well enough yet.

How do I check for a noindex tag?

Right-click on the page → View Page Source, then search for "noindex" with Ctrl+F. If you find <meta name="robots" content="noindex">, Google will crawl but not index the page. This is sometimes accidentally left on by developers after site launch.
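
This check can be scripted across several pages. A simple regex sketch — it assumes the common name-before-content attribute order and is not a full HTML parser:

```python
import re

def has_noindex(html: str) -> bool:
    """Detect a robots meta noindex directive in page HTML (regex sketch only)."""
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        re.IGNORECASE,
    )
    return bool(pattern.search(html))

print(has_noindex('<meta name="robots" content="noindex">'))        # True
print(has_noindex('<meta name="robots" content="index, follow">'))  # False
```

Any page where this returns True will be crawled but never indexed, no matter how good its content is.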

Why is my site live but not showing in Google?

A site can be technically live without being indexed. Common causes: robots.txt still blocking crawlers from staging, no sitemap submitted, Search Console not installed, or not enough time elapsed — Google can take 1 to 4 weeks to crawl a new domain.

Does a slow site hurt indexing?

A very slow site is indexed but ranked lower, because speed has been a ranking factor since 2021 (Core Web Vitals). It won't completely prevent indexing but makes it much harder to appear in the first 3 pages for competitive queries. Fix speed issues before investing in content.


Your Next Step

You now have 5 concrete checks and an action plan. Most indexing issues can be fixed in under an hour. If the problem persists after these checks, the issue is likely deeper — either in your content strategy, your authority profile, or your technical architecture.

Full SEO audit — 18-point technical + content + authority diagnosis, delivered within 48 h.

Get My Free Audit →

We don't sell you anything on the call — we start by helping you see clearly.