Google PageSpeed Insights is the free tool most small business owners get pointed at when someone says your site is slow. It is also the tool most often misread.
Two things people get wrong about it. The score at the top is not a grade Google uses to rank you. And the long list of recommendations is not a to-do list ranked by impact.
Reading the report correctly matters more than improving the number at the top.
This guide explains what PageSpeed Insights actually measures, which numbers matter for SEO, and which fixes move the needle most. Full disclosure: Stackra runs PageSpeed Insights on every audit and translates the output into platform-specific recommendations, which is why this guide is opinionated about which sections to act on and which to ignore.
- Google PageSpeed Insights ↗: Free Google tool. Enter any URL to get a performance, accessibility, SEO, and best-practices report.
What PageSpeed Insights actually measures
PageSpeed Insights reports two different kinds of data on the same page. Most of the confusion about the tool comes from not knowing which is which.
- Field data (real users): collected from actual visits to your site over the previous 28 days through the Chrome User Experience Report. This is what Google uses when it evaluates your page for ranking. If your page does not have enough real-user traffic, this section is missing or incomplete.
- Lab data (a controlled test environment): a single Lighthouse run from Google's servers. This is the score most people remember, but it is not what Google uses for ranking. It is useful for diagnosing specific problems and comparing before and after a change.
When SEO depends on speed, field data is the number that matters. The lab score is for debugging, not for grading.
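The split between field and lab data is visible in the raw output of the public PageSpeed Insights v5 API, which returns both in one response (`loadingExperience` for real-user CrUX data, `lighthouseResult` for the lab run). A minimal sketch of pulling the two apart, using a hand-built sample response; treat the exact field names as an assumption to verify against a live report:

```python
import json

def summarize_psi(response: dict) -> dict:
    """Split a PageSpeed Insights v5 API response into field and lab views."""
    field = response.get("loadingExperience", {})   # real-user (CrUX) data
    lab = response.get("lighthouseResult", {})      # single Lighthouse run
    return {
        "field_verdict": field.get("overall_category"),   # e.g. "FAST" / "AVERAGE" / "SLOW"
        "field_metrics": list(field.get("metrics", {}).keys()),
        "lab_score": lab.get("categories", {})
                        .get("performance", {})
                        .get("score"),                    # 0.0-1.0; not a ranking input
    }

# Minimal sample shaped like a real response, for illustration only.
sample = {
    "loadingExperience": {
        "overall_category": "AVERAGE",
        "metrics": {"LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 3100}},
    },
    "lighthouseResult": {"categories": {"performance": {"score": 0.62}}},
}

print(json.dumps(summarize_psi(sample), indent=2))
```

If the `loadingExperience` block is empty, you are looking at a page with no field data, which is covered later in this guide.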
The three Core Web Vitals: a quick reference
Core Web Vitals are the three real-user metrics Google uses to evaluate every website. A page passes when all three are in the good range at the 75th percentile of real visits. The thresholds:
| Metric | What it measures | Good | Needs improvement | Poor |
|---|---|---|---|---|
| LCP | How fast your main content loads | Under 2.5s | 2.5 to 4s | Over 4s |
| INP | How fast the page responds to a tap or click | Under 200ms | 200 to 500ms | Over 500ms |
| CLS | How much the layout jumps while loading | Under 0.1 | 0.1 to 0.25 | Over 0.25 |
Source: Google Core Web Vitals documentation. Thresholds apply to the 75th percentile of real-user data.
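The thresholds in the table translate directly into a simple classifier. A sketch for checking your own 75th-percentile numbers against the documented bands (the boundary handling below follows the table's "under / to / over" wording):

```python
# Core Web Vitals thresholds from the table above (Google's documented values).
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Classify a 75th-percentile value as good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if value < good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2.1))   # good
print(rate("INP", 350))   # needs improvement
print(rate("CLS", 0.3))   # poor
```

A page passes only when all three metrics rate "good" at once.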
- Google's Core Web Vitals documentation ↗: Official thresholds and evaluation criteria for Core Web Vitals.
Where most sites stand right now
Across the open web, more small business sites fail Core Web Vitals on mobile than pass them. The latest aggregate figures from real-user data:
- 48% of sites pass all three Core Web Vitals on mobile
- 56% of sites pass on desktop
- LCP is the primary bottleneck, with only 62% of sites passing the LCP threshold
If your site is failing one Core Web Vital, it is probably LCP, and it is probably an uncompressed hero image.
Platform matters more than the per-site score implies
Different platforms produce different baseline performance because the underlying infrastructure, image handling, and theme overhead differ. The latest real-user pass rates from the HTTP Archive Web Almanac and Chrome UX Report:
| Platform | Pass rate | Common bottleneck |
|---|---|---|
| Shopify | 78% | Uncompressed product images on legacy OS 1.0 themes |
| Wix | 71% | Uncompressed hero images (Wix converts to WebP but does not reduce source weight) |
| Squarespace | 65% | Theme JavaScript and image weight |
| WordPress | 45% | No automatic image compression, plugin overhead, slow shared hosting |
Source: HTTP Archive CrUX Technology Report. Pass rates are mobile real-user data, the share of visits where all three Core Web Vitals are in the good range. Latest figures range from June 2025 to February 2026.
- HTTP Archive Web Almanac ↗: Annual state-of-the-web report. Source for the platform pass-rate figures above.
The most common LCP problem on small business sites
An uncompressed hero image. The pattern repeats across WordPress, Wix, Shopify, and Squarespace: a 4 to 8 megabyte JPEG uploaded as the homepage banner. What each platform does and does not do with that file:
- Most platforms generate responsive image markup so browsers can pick an appropriate size for the viewport
- Most platforms convert images to WebP or AVIF at delivery, which reduces transfer size for the browser format that loaded
- None of them reduce the weight of your source file. An 8 MB upload is stored and served at 8 MB even after WebP conversion
Uncompressed hero images are the most common LCP failure mode Stackra flags on small business sites.
- How we reduced our logo image by 99 percent →
- The scroll animation that was hiding our LCP element →
Three fixes that move the score, in order
If you only have time for three fixes, do these. They apply to every platform.
- Compress every image above 200 KB before uploading. A typical 4 MB hero image compresses to 200 to 400 KB with no visible quality loss. This single change often moves LCP from failing to passing.
- Add lazy loading to images below the fold. Most modern platforms do this automatically. For older custom themes, view source and confirm offscreen images carry the loading="lazy" attribute.
- Remove or defer render-blocking third-party scripts. Chat widgets, analytics tags, ad pixels, and review widgets often load before your page can paint. Defer non-critical scripts to load after page interaction.
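Fix two above is easy to verify by hand: view source and look at the `loading` attribute on each image. For a longer page, the check can be scripted with Python's standard-library HTML parser. A rough sketch; note it cannot tell which images are below the fold, and your LCP hero image should not be lazy-loaded, so treat the output as a list to review rather than a list to fix blindly:

```python
from html.parser import HTMLParser

class LazyImageAudit(HTMLParser):
    """Collect <img> tags that lack loading="lazy"."""
    def __init__(self):
        super().__init__()
        self.missing_lazy = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if attrs.get("loading") != "lazy":
                self.missing_lazy.append(attrs.get("src", "(no src)"))

# Hypothetical page fragment: the hero correctly loads eagerly,
# the footer logo is correctly lazy-loaded.
page = """
<img src="/hero.jpg">
<img src="/footer-logo.png" loading="lazy">
"""

audit = LazyImageAudit()
audit.feed(page)
print(audit.missing_lazy)   # ['/hero.jpg']
```

Anything in the output that sits below the fold is a candidate for adding the attribute.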
- TinyPNG ↗: Free image compression tool. Drag in a JPEG or PNG and download a smaller version with no visible quality loss.
- GTmetrix ↗: Page speed test with a waterfall chart that shows which scripts and images are blocking your page from rendering.
What to ignore in the recommendations list
PageSpeed Insights lists every detected issue, regardless of impact. Most lists are 20 to 40 items long. The vast majority of those items will not move your score noticeably. What to skip:
- Recommendations under 50 ms estimated savings: too small to register in real-user data
- Third-party script issues you cannot control: GDPR consent banners, ad networks, analytics tags
- "Properly size images" warnings on responsive images that already have srcset (a parsing quirk, not a real problem)
- Diagnostics labelled as informational rather than as opportunities
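The 50 ms cutoff and the opportunity-versus-diagnostic distinction can both be applied programmatically to a Lighthouse report. A sketch, assuming the Lighthouse JSON shape where opportunity audits carry `details.type == "opportunity"` and `details.overallSavingsMs`; verify those field names against your own report before relying on them:

```python
def worthwhile_opportunities(lighthouse_result: dict, min_savings_ms: float = 50) -> list:
    """Keep only 'opportunity' audits whose estimated savings clear the cutoff."""
    keep = []
    for audit_id, audit in lighthouse_result.get("audits", {}).items():
        details = audit.get("details", {})
        if details.get("type") != "opportunity":
            continue                                  # skip informational diagnostics
        savings = details.get("overallSavingsMs", 0)
        if savings >= min_savings_ms:
            keep.append((audit_id, savings))
    return sorted(keep, key=lambda item: -item[1])    # biggest savings first

# Hand-built sample report fragment, for illustration only.
sample = {"audits": {
    "render-blocking-resources": {"details": {"type": "opportunity", "overallSavingsMs": 780}},
    "unused-css-rules": {"details": {"type": "opportunity", "overallSavingsMs": 30}},
    "uses-long-cache-ttl": {"details": {"type": "table"}},
}}

print(worthwhile_opportunities(sample))   # [('render-blocking-resources', 780)]
```

On a typical 20-to-40-item report, this kind of filter usually leaves a handful of items worth acting on.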
When the field data section is missing
If your page does not get enough real-user traffic, the field data section will be missing or partial. PageSpeed Insights falls back to lab data only. This is normal for new sites, low-traffic pages, and most internal pages on small business sites.
- Focus on the lab-data Core Web Vitals (LCP and CLS), since those are still measurable in a lab test; INP requires real user interactions, so Lighthouse reports Total Blocking Time as a rough proxy
- Treat the lab score as a directional indicator, not a grade
- As your traffic grows, field data starts to appear and eventually replaces lab data as the source of truth
Missing field data is not a problem to fix. It just means Google is not yet ranking the page on real-user performance, so use lab data as your guide until traffic catches up.
Frequently asked questions
Common questions about reading and acting on PageSpeed Insights results.
What is a good PageSpeed Insights score?
The lab score uses three colour bands:
- Above 90 (green): excellent
- 50 to 89 (orange): moderate
- Below 50 (red): poor
But which score actually matters for SEO?
The lab score is not what Google uses for ranking. The number that matters is the field data Core Web Vitals assessment at the top of the report, which is either Pass or Fail based on real-user data over the previous 28 days. Aim to pass Core Web Vitals first. Improve the lab score as a secondary goal.
Why does my mobile score differ from my desktop score?
Mobile tests are run on a simulated mid-range Android device with a slow 4G connection, while desktop tests use a fast wired connection. Mobile is harder to score well on, and Google ranks mobile and desktop separately. Mobile-first indexing means Google primarily uses your mobile page to rank, so mobile is the score that matters most for SEO.
Does PageSpeed Insights affect my Google ranking?
Indirectly, yes. Google confirmed Core Web Vitals as an official ranking signal in 2021. The signal is used as a tiebreaker when pages are otherwise closely matched in relevance, not as a primary ranking factor. The lab score from PageSpeed Insights does not directly affect ranking; the field-data Core Web Vitals assessment does. Failing Core Web Vitals will not crater your ranking on its own. Passing them removes a tiebreaker disadvantage and signals to Google that the page provides a good user experience.
How often should I run PageSpeed Insights?
Run it after every change that affects layout, images, scripts, or hosting. Re-run it monthly to spot regressions caused by plugin updates, theme updates, or new third-party scripts added by marketing tags. One important detail about field data: the underlying Chrome UX Report data is updated daily but reflects a 28-day rolling window. A fix takes about a month to fully appear in the field score.
What about the older Lighthouse metrics like FCP and Speed Index?
First Contentful Paint (FCP), Speed Index, Total Blocking Time, and Time to Interactive still appear in PageSpeed Insights and contribute to the lab score, but they are not Core Web Vitals and are not what Google uses for ranking. Treat them as diagnostics for understanding why your LCP is slow. If your FCP is high, your server response or render-blocking resources are likely the cause. If Total Blocking Time is high, JavaScript execution is the bottleneck.