Google Search Console Showing High CLS But PageSpeed Insights Shows Good Score

So you’ve asked a plugin provider/your theme designer/ads provider to fix CLS issues that you think their services are causing on your website. They’ve updated their software, run some tests with PageSpeed Insights, and you’re ecstatic – your CLS has dropped to 0.05, virtually non-existent!

A month goes by, and you log in to Google Search Console to check your performance and make sure there aren’t any warnings or errors you’ve missed. You head over to the Core Web Vitals tab and notice a bunch of errors for CLS! You were told the posts would be re-crawled and the errors would go away, but some of the listed posts were published in the past few days and should never have had CLS issues. You run another PageSpeed Insights report, yet it tells you that you have nothing to worry about for CLS. What gives, Google?

The reason

The primary difference between pretty much any Web Vitals reporting tool (including PageSpeed Insights) and Google Search Console is the way the Web Vitals data is collected.

Site speed scans work by launching a browser in the background on the tool’s own server, loading the URL you’ve given it, and measuring specific metrics. The browser instance only exists for as long as it takes the page to load and its scripts to execute, plus a few extra seconds. Then it closes, and the measurement results are saved and displayed to you.
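As a rough illustration of that workflow, here’s a minimal Node sketch using the open-source lighthouse and chrome-launcher packages. This is my own example of a typical lab scan setup, not what PageSpeed Insights literally runs, and the URL is a placeholder:

```ts
import { launch } from 'chrome-launcher';
import lighthouse from 'lighthouse';

async function scan(url: string): Promise<void> {
  // Spin up a short-lived headless Chrome, just like a lab tool does.
  const chrome = await launch({ chromeFlags: ['--headless'] });

  // Load the page once, collect the performance metrics, then shut down.
  const result = await lighthouse(url, {
    port: chrome.port,
    onlyCategories: ['performance'],
  });
  await chrome.kill();

  // The CLS reported here only covers the few seconds the page was open.
  console.log(result?.lhr.audits['cumulative-layout-shift'].numericValue);
}

scan('https://example.com'); // placeholder URL
```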

However, Google Search Console collects metrics in a different way. It uses the exact same measurement methods, but it runs them in the browsers of your real users (those using Chrome, at least), not in a crawl bot. This keeps the data as “real” as possible, reflecting the same content your visitors are seeing. And it doesn’t just collect data while the page is loading; it keeps collecting for the entire time the tab is open. According to Google:

The data for the Core Web Vitals report comes from the CrUX report. The CrUX report gathers anonymized metrics about performance times from actual users visiting your URL (called field data). The CrUX database gathers information about URLs whether or not the URL is part of a Search Console property.

https://support.google.com/webmasters/answer/9205520#about_data

Any layout shift, whether it happens in the first minute or after five minutes of browsing the same page, is tracked by Google Search Console. According to the Core Web Vitals documentation:

CLS measures the sum total of all individual layout shift scores for every unexpected layout shift that occurs during the entire lifespan of the page.

https://web.dev/cls/#what-is-cls
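To make the “entire lifespan” part concrete, here’s a simplified TypeScript sketch of how a layout shift score can be accumulated in the visitor’s own browser using the Layout Shift API. It’s an approximation of what field tooling does, not the actual CrUX code:

```ts
// Minimal typing for Layout Shift entries (not yet in TypeScript's lib.dom.d.ts).
interface LayoutShiftEntry extends PerformanceEntry {
  value: number;
  hadRecentInput: boolean;
}

let cumulativeLayoutShift = 0;

// Keep listening for the whole lifespan of the page, not just during load.
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries() as LayoutShiftEntry[]) {
    // Shifts that happen right after user input don't count towards CLS.
    if (!entry.hadRecentInput) {
      cumulativeLayoutShift += entry.value;
    }
  }
}).observe({ type: 'layout-shift', buffered: true });

// Field tools report the accumulated value when the tab is hidden or closed,
// so a shift five minutes in still ends up in the number Google sees.
document.addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden') {
    console.log('CLS for this page view:', cumulativeLayoutShift.toFixed(3));
  }
});
```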

You’ve probably realized by now that the reason GSC is showing different results is that visitors are not getting the same experience that site speed scans are seeing. The update that was made earlier to fix CLS issues only fixed them for site speed scanners; nothing changed for real visitors.

Now, I don’t think that companies like ads providers are doing this on purpose. Core Web Vitals are relatively new, and it hasn’t even been a year since they were first announced in May 2020. I believe they simply didn’t know how Google Search Console collects Core Web Vitals data and assumed it was collected by the same crawl bot that indexes webpages.

Real-world example of this happening

There are two ways that CLS issues get fixed for speed scans but not for visitors. The easiest and most commonly implemented one (and one that is actually still recommended by many people) is to simply delay the loading of ads scripts until 10-15 seconds after the page loads. Site scans end before the ads are loaded, so it looks like the issue is fixed (until you check Search Console a month later and still see it).
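Here’s a hypothetical sketch of what that workaround looks like (placeholder URL, made-up timing, not any specific provider’s code):

```ts
const AD_SCRIPT_URL = 'https://ads.example.com/loader.js'; // placeholder URL

window.addEventListener('load', () => {
  setTimeout(() => {
    // Inject the ad script long after a lab scan has already finished.
    const script = document.createElement('script');
    script.src = AD_SCRIPT_URL;
    document.head.appendChild(script);
    // Real visitors who are still on the page get the layout shift anyway,
    // and their Chrome browsers report it back to CrUX.
  }, 12_000); // somewhere in the 10-15 second range
});
```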

The second way I’ve seen CLS issues avoid detection by scans is by waiting for user input before displaying ads, or waiting for the visitor to start scrolling down the page. Similar to delaying the ads, this prevents site scans from ever triggering the JS that causes the CLS.
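A hypothetical sketch of that approach (again with a placeholder ad script URL):

```ts
let adsLoaded = false;

function loadAds(): void {
  if (adsLoaded) return;
  adsLoaded = true;
  const script = document.createElement('script');
  script.src = 'https://ads.example.com/loader.js'; // placeholder URL
  document.head.appendChild(script);
}

// Any real interaction triggers the ad load; a headless scan never scrolls,
// moves the mouse, or presses a key, so it never sees the resulting shift.
['scroll', 'mousemove', 'touchstart', 'keydown'].forEach((type) => {
  window.addEventListener(type, loadAds, { once: true, passive: true });
});
```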

I’ve created a page that showcases what that looks like. About eight seconds after the content loads, it will be shifted down by a new element loading onto the page, causing a layout shift of about 0.25. If you run any scanning tool, however, it will show a CLS score of virtually 0.
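The demo page’s actual source isn’t reproduced here, but something along these lines (a hypothetical reconstruction) is enough to produce the same behaviour:

```ts
window.addEventListener('load', () => {
  setTimeout(() => {
    // Insert a tall element above the existing content, pushing it down
    // and producing a layout shift of roughly 0.25 on a typical viewport.
    const banner = document.createElement('div');
    banner.textContent = 'Late-loading banner';
    banner.style.height = '250px';
    document.body.prepend(banner);
  }, 8_000); // about eight seconds after the content loads
});
```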

As long as any visitor stays on that page long enough for the layout shift to happen, Google will know about it and factor it into their ranking algorithm.

How can I know what’s causing CLS that doesn’t appear in scans?

I’ve recorded a video showcasing what happens and how I can see the layout shift score without using an external scanning tool, just the developer tools built into the latest versions of Chrome.
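If you want to poke at it yourself without the video, a small script along these lines will log each shift and the elements that moved. It’s a sketch using the Layout Shift API; drop the type annotations if you paste it straight into the DevTools console:

```ts
// Minimal typings for the Layout Shift API (not in TypeScript's lib.dom.d.ts).
interface LayoutShiftAttribution {
  node?: Node;
}
interface LayoutShift extends PerformanceEntry {
  value: number;
  hadRecentInput: boolean;
  sources?: LayoutShiftAttribution[];
}

// Log every unexpected layout shift along with the elements that moved,
// including shifts that happen long after the page has finished loading.
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries() as LayoutShift[]) {
    if (!entry.hadRecentInput) {
      console.log(
        'Layout shift of', entry.value.toFixed(3),
        'caused by', entry.sources?.map((source) => source.node),
      );
    }
  }
}).observe({ type: 'layout-shift', buffered: true });
```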