# The Complete Technical SEO Audit Checklist for 2026

## Why Technical SEO Is Still Misunderstood
Content gets the glory. Links get the arguments. Technical SEO quietly determines whether either of them matters.
A page with excellent content and authoritative backlinks can still fail to rank if Google can't crawl it, if it loads too slowly on mobile, or if its structured data is malformed.
This checklist covers every layer of technical SEO. Work through it systematically — section by section — and you'll find something fixable on almost every site.
[Image: Screenshot of a technical SEO audit dashboard showing critical issues, warnings, and passes across crawl, performance, and structured data categories]
## 1. Crawlability and Indexation
Google can only rank what it can find and index. This section catches the issues that silently remove pages from search entirely.
**Robots.txt**
- [ ] `robots.txt` exists and is accessible at `/robots.txt`
- [ ] Correct pages are explicitly disallowed (admin, staging, duplicate parameter URLs)
- [ ] Sitemap location is declared with `Sitemap:` directive
- [ ] No important pages are accidentally blocked
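A minimal `robots.txt` covering the items above might look like this (the paths and domain are illustrative placeholders, not recommendations for any specific site):

```text
User-agent: *
Disallow: /admin/
Disallow: /staging/
Disallow: /*?sort=

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` directive takes an absolute URL, and Google supports `*` wildcards in `Disallow` rules for matching parameter URLs.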
**XML Sitemap**
- [ ] Sitemap exists and is valid XML
- [ ] Submitted to Google Search Console and Bing Webmaster Tools
- [ ] Contains only canonical, indexable URLs (no 3xx/4xx/noindex pages)
- [ ] Updated dynamically or refreshed with each content publish
- [ ] Sitemap file size under 50MB / 50,000 URLs (split if necessary)
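A valid sitemap is plain XML in the `sitemaps.org` namespace — one `<url>` entry per canonical, indexable page (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/technical-seo</loc>
    <lastmod>2026-01-15</lastmod>
  </url>
  <!-- one <url> entry per canonical, indexable page -->
</urlset>
```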
**Meta Robots and Canonical Tags**
- [ ] No important pages have `noindex` set accidentally
- [ ] Canonical tags are self-referential on all primary content pages
- [ ] Paginated pages use canonical correctly (not all pointing to page 1)
- [ ] Thin or duplicate pages use canonical to consolidate equity
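In the page `<head>`, the patterns above look like this (URLs are hypothetical):

```html
<!-- On the primary version of a page: self-referential canonical -->
<link rel="canonical" href="https://www.example.com/blog/technical-seo-checklist">

<!-- On a thin or duplicate variant: canonical points at the primary URL instead -->
<link rel="canonical" href="https://www.example.com/blog/technical-seo-checklist">

<!-- Only on pages you genuinely want excluded from the index -->
<meta name="robots" content="noindex, follow">
```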
[Image: Google Search Console Coverage report showing indexed vs. excluded pages, highlighting common exclusion reasons]
## 2. Crawl Architecture and Internal Linking
**Site depth**
- [ ] All important pages reachable within 3 clicks from the homepage
- [ ] No orphan pages (pages with no internal links pointing to them)
- [ ] Pagination doesn't bury important content beyond crawl depth
**Internal link structure**
- [ ] Priority pages receive the highest volume and most authoritative internal links
- [ ] Anchor text is descriptive (not "click here" or "read more")
- [ ] No broken internal links (crawl with Screaming Frog or equivalent)
- [ ] Navigation structure reflects content hierarchy
**Redirects**
- [ ] No redirect chains longer than 2 hops
- [ ] No redirect loops
- [ ] Old URLs from migrations still redirect correctly
- [ ] 302 redirects used only for genuinely temporary purposes
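The chain and loop checks above can be sketched as a small script over a redirect map — old URL to new URL. The mapping shape is an assumption for illustration (e.g. built from a crawl export), not any tool's actual format:

```python
def trace_redirect(redirects, url, max_hops=10):
    """Follow a URL through the redirect map.

    Returns (final_url, hops, looped); looped is True if the
    chain revisits a URL before resolving.
    """
    seen = {url}
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
        if url in seen:
            return url, hops, True
        seen.add(url)
    return url, hops, False


def audit_redirects(redirects, max_chain=2):
    """Flag chains longer than max_chain hops, plus any loops."""
    issues = []
    for start in redirects:
        final, hops, looped = trace_redirect(redirects, start)
        if looped:
            issues.append((start, "loop"))
        elif hops > max_chain:
            issues.append((start, f"chain of {hops} hops"))
    return issues
```

For example, `audit_redirects({"/a": "/b", "/b": "/c", "/c": "/d"})` flags `/a` as a chain of 3 hops — the kind of chain worth collapsing into a single 301.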
## 3. Core Web Vitals and Page Speed
Core Web Vitals have been confirmed ranking signals since Google's 2021 page experience update, and INP replaced FID as the responsiveness metric in March 2024. Failing these thresholds has a measurable cost.
**Largest Contentful Paint (LCP)** — target: under 2.5 seconds
- [ ] Hero image or largest text element above fold is preloaded
- [ ] LCP element is not lazy-loaded
- [ ] Server response time (TTFB) under 800ms
- [ ] Critical CSS inlined or loaded without render-blocking
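Preloading the LCP image is a one-line fix in the `<head>` (the image path is a placeholder):

```html
<!-- Preload the hero image so the browser fetches it before layout completes -->
<link rel="preload" as="image" href="/images/hero.webp" fetchpriority="high">

<!-- The LCP element itself: explicit dimensions, never loading="lazy" -->
<img src="/images/hero.webp" width="1200" height="630"
     alt="Hero illustration" fetchpriority="high">
```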
**Interaction to Next Paint (INP)** — target: under 200ms
- [ ] Heavy JavaScript does not block main thread during user interaction
- [ ] Event handlers are debounced where appropriate
- [ ] Long tasks broken up with `scheduler.yield()` or equivalent
**Cumulative Layout Shift (CLS)** — target: under 0.1
- [ ] All images and embeds have explicit `width` and `height` attributes
- [ ] No content injected above existing content after page load
- [ ] Web fonts loaded with `font-display: swap` or `optional`
- [ ] No ads that shift layout on load or scroll
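The two most common CLS fixes — reserved image space and non-blocking font loading — look like this (file paths and font name are placeholders):

```html
<!-- Explicit dimensions let the browser reserve the image's space before it loads -->
<img src="/images/chart.png" width="800" height="450" alt="Audit results chart">

<style>
  /* font-display: swap shows fallback text immediately instead of hiding it */
  @font-face {
    font-family: "Inter";
    src: url("/fonts/inter.woff2") format("woff2");
    font-display: swap;
  }
</style>
```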
[Image: PageSpeed Insights report showing LCP, INP, and CLS scores for mobile and desktop with field data vs. lab data comparison]
## 4. Mobile-First Indexing
Google predominantly uses the mobile version of your site for indexing and ranking. If the mobile experience is degraded relative to desktop, that degraded version is what gets evaluated.
- [ ] Mobile and desktop content is identical (no content hidden on mobile only)
- [ ] Mobile version has the same structured data as desktop
- [ ] Viewport meta tag set correctly: `<meta name="viewport" content="width=device-width, initial-scale=1">`
- [ ] Touch targets minimum 44×44px
- [ ] No horizontal scrolling on mobile at standard viewport widths
- [ ] Font size readable without zooming (minimum 16px body)
## 5. Structured Data and Schema
Structured data doesn't directly improve rankings, but it enables rich results — which dramatically improve click-through rates.
**Validate all existing schema**
- [ ] No errors in Google's Rich Results Test for every schema type deployed
- [ ] JSON-LD format used (not Microdata or RDFa — JSON-LD is Google's preferred format)
- [ ] Schema only marks up content that is visibly present on the page
**Schema types to prioritise by site type**
*B2B / agency sites*: `Organization`, `LocalBusiness`, `FAQPage`, `BreadcrumbList`
*Blog / editorial*: `Article`, `Author`, `BreadcrumbList`
*E-commerce*: `Product`, `Offer`, `Review`, `BreadcrumbList`
*Local business*: `LocalBusiness` with full NAP (Name, Address, Phone)
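A minimal JSON-LD block for the `Organization` type looks like this — every value below is a placeholder to replace with your own details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Agency",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": ["https://www.linkedin.com/company/example-agency"]
}
</script>
```

Validate the result in Google's Rich Results Test before deploying.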
[Image: Google Search Console Enhancements report showing structured data coverage with valid items and error breakdown]
## 6. HTTPS and Security
- [ ] All pages served over HTTPS
- [ ] HTTP URLs redirect to HTTPS (301, not 302)
- [ ] No mixed content warnings (HTTP resources on HTTPS pages)
- [ ] HSTS header (`Strict-Transport-Security`) present on all HTTPS responses
- [ ] SSL certificate valid and not expiring within 30 days
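On nginx, the 301 redirect and HSTS header together look roughly like this — a sketch, assuming a standard setup with certificates already configured:

```nginx
# Redirect all HTTP traffic to HTTPS with a permanent 301
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com www.example.com;
    # ssl_certificate directives go here

    # HSTS: browsers use HTTPS only, for one year, including subdomains
    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
}
```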
## 7. Internationalisation (if applicable)
- [ ] `hreflang` tags correctly implemented for all language/region variants
- [ ] `hreflang` is bidirectional (every page variant references all others)
- [ ] No `hreflang` pointing to redirects or 404 pages
- [ ] x-default `hreflang` set for pages without a perfect language match
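Bidirectional `hreflang` means every variant carries the full set of annotations, including a reference to itself (URLs below are hypothetical):

```html
<!-- On https://www.example.com/en/pricing — each variant lists all the others -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/pricing">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/pricing">
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/pricing">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/pricing">
```

The same block, verbatim, goes on the `/de/` and `/fr/` pages — asymmetric annotations are ignored.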
## 8. JavaScript SEO
JavaScript-rendered content is indexable, but it adds latency and risk to the pipeline.
- [ ] Critical content is server-side rendered (SSR) or statically generated — not client-side only
- [ ] Navigation links are in the DOM at render time, not injected by JavaScript
- [ ] Use Google's URL Inspection tool to verify rendered HTML matches intended output
- [ ] Lazy-loaded content that needs indexing uses `IntersectionObserver` with appropriate thresholds
[Image: Side-by-side comparison of HTML source vs. rendered DOM in Chrome DevTools for a JavaScript-heavy page]
## Running This Audit Systematically
This checklist is most useful when paired with tooling:
- **Screaming Frog SEO Spider** — full site crawl, identifies most issues in sections 1-4
- **Google Search Console** — authoritative source for indexation, Core Web Vitals, and structured data issues
- **PageSpeed Insights** — field and lab data for Core Web Vitals
- **Ahrefs / Semrush** — backlink health, broken links, and organic ranking data
Run a full crawl quarterly. Check Search Console weekly. Treat Core Web Vitals as engineering metrics — not just SEO ones.
Technical SEO is not glamorous. It is, however, the foundation that makes everything else work.