Search Off the Record: Are websites getting "fat"? Page weight, HTML size & Googlebot limits explained
Mar 30, 2026
The hosts unpack what it means for sites and pages to get "fat," comparing raw bytes, transferred size, and perceived weight. They highlight the steep growth in mobile page sizes, Googlebot's per-URL fetch limits, and HTML bloat from inlined images and structured data, then discuss rendering costs, content parity between mobile and desktop, and practical ways to cut image and asset bloat.
Episode notes
Page Weight Depends On Your Definition
- Web page 'weight' varies by definition and can mean raw bytes, resources, or total download needed to view a page.
- Web Almanac measured median mobile homepages at 845 KB in 2015 and 2.3 MB in 2025, a ~3x growth in transferred bytes.
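The distinction between raw bytes and transferred bytes can be sketched with a toy example: HTML is typically gzip-compressed in transit, so the bytes on the wire are far fewer than the bytes the browser ultimately parses. (The payload below is a made-up stand-in for a homepage; real pages also pull in CSS, JS, and images counted separately.)

```python
import gzip

# Toy HTML payload standing in for a homepage (assumption: illustrative
# only; real homepages are far more varied than repeated markup).
html = ("<!doctype html><html><body>"
        + "<p>hello world</p>" * 5000
        + "</body></html>").encode("utf-8")

raw_bytes = len(html)                   # "raw bytes" the browser must parse
transferred = len(gzip.compress(html))  # what actually crosses the wire

print(f"raw: {raw_bytes} B, transferred (gzip): {transferred} B")
```

Highly repetitive markup compresses extremely well, which is one reason "page weight" numbers differ so much depending on whether a measurement counts transferred or decompressed bytes.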
From Floppy Disk Websites To Multi‑MB Homepages
- Martin recalled building sites in the dial-up era where whole sites (minus images) fit on floppy disks, so small HTML/CSS/JS bundles were common.
- He contrasted that with today's median homepage sizes and surprised reactions to 845 KB in 2015.
Googlebot Uses A 15 MB Per-URL Fetch Limit
- Googlebot fetches up to 15 MB of raw bytes per URL then stops, and each referenced resource has its own 15 MB limit.
- That crawl limit matters more for search engine access than for user experience.
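The truncation behavior described above can be sketched as a capped read: a crawler pulls bytes in chunks and simply stops once it hits the limit, so anything past the cap never reaches indexing. (This is a minimal illustration of the idea, not Googlebot's actual implementation; `fetch_capped` and the fake 20 MB response are invented for the example.)

```python
import io

FETCH_CAP = 15 * 1024 * 1024  # the stated 15 MB per-URL limit, in bytes

def fetch_capped(stream, cap=FETCH_CAP, chunk_size=64 * 1024):
    """Read at most `cap` bytes from a file-like object, then stop,
    mirroring how a crawler truncates an oversized response."""
    out = bytearray()
    while len(out) < cap:
        chunk = stream.read(min(chunk_size, cap - len(out)))
        if not chunk:  # response ended before hitting the cap
            break
        out.extend(chunk)
    return bytes(out)

# A fake 20 MB response body: only the first 15 MB survives the fetch.
body = fetch_capped(io.BytesIO(b"x" * (20 * 1024 * 1024)))
print(len(body) == FETCH_CAP)  # True
```

Because each referenced resource gets its own limit, the practical takeaway is to keep the critical content of the HTML itself well under the cap rather than inlining large blobs into it.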
