Nowadays it’s extremely common to find a website that has dynamically inserted content. Usually it appears in response to a user input. A good example would be a so-called “accordion dropdown,” where hidden content reveals itself after a visitor clicks on the section title.
Another example would be cases when some elements appear on the page during the loading process, unexpectedly moving page content down. This is an unwanted behavior. To track the issue and reflect it in numbers, Google came up with a metric called Cumulative Layout Shift.
Cumulative Layout Shift (CLS) is a metric that shows how “stable” your content is. It tracks unexpected layout shifts — those not preceded by user input within the last 500 ms — and combines them using a certain formula that we will get to down the road. As a result you get a dimensionless value that can be interpreted as follows:
Less than or equal to 0.1 — Good
More than 0.1, up to 0.25 — Needs improvement
More than 0.25 — Poor
When you break everything down by the numbers, the impact of CLS is quite small compared to other Lighthouse metrics. It affects 5% of the total Performance Score in the most recent Lighthouse version (v7). While that number may look small, I’d recommend you don’t neglect this metric to ensure that users don’t accidentally open the wrong thing simply because an image or ad loads just a moment before they tap or click a desired element.
In May 2021, Google is rolling out its page experience update. That means the search engine will update its ranking system and re-evaluate websites using page experience signals. Page experience is a set of signals that reflect how pleasant a page feels to a visitor beyond its pure information value. When talking about UX-related signals, Core Web Vitals comes to the rescue.
As of May 2021, it has three core user-centric metrics to track the main aspects of user interactions with a page. Cumulative Layout Shift is one of them, which means that improving CLS may almost directly affect how Google favors your page.
CLS can be measured in the lab (via synthetic tests) or in the field (using real user data).
Lab tools:
Lighthouse
Chrome DevTools (Performance tab)

Field tools:
Chrome User Experience Report
Core Web Vitals report from Search Console
Lab tools are often used during the development stage to make sure new features/improvements comply with the threshold set for CLS, whereas field tools report RUM (Real User Monitoring) data.
Results provided by these two methods may differ due to their nature. As for CLS, values reported by lab tools for a given page may be lower than field values. That’s because tools like Lighthouse measure layout shifts only during the page loading process, while field tools keep recording shifts throughout the whole user session.
Technically speaking, a layout shift happens when an element that is visible within the viewport changes its start position in the timespan of two consecutive frames. Simply put, a sudden movement of a page element without any user input triggering it is considered a layout shift.
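If you want to watch these shifts happen programmatically, the Layout Instability API exposes them as `layout-shift` performance entries. A minimal sketch to run in the browser (the running total mimics how individual shifts add up):

```html
<script>
  // Each layout-shift entry carries a score in entry.value.
  // Shifts preceded by user input within 500 ms set hadRecentInput
  // and are excluded from CLS.
  let cumulativeScore = 0;

  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      if (!entry.hadRecentInput) {
        cumulativeScore += entry.value;
        console.log('Layout shift:', entry.value, 'running total:', cumulativeScore);
      }
    }
  }).observe({ type: 'layout-shift', buffered: true });
</script>
```

The `buffered: true` option replays shifts that happened before the observer was registered, so you can paste this into DevTools after the page has loaded.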
While any dynamically added element may potentially cause content to change its starting position, here are some common reasons:
A classic culprit here is images without explicit dimensions. To fix this issue, include width and height attributes in the <img> tag. Alternatively, or in conjunction with this, you can use the aspect-ratio padding trick to keep element proportions intact. The latter approach is also well suited to other types of media, such as dynamically embedded videos.
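Here’s what both approaches can look like (the file name and embed URL are placeholders):

```html
<!-- width/height let the browser reserve space before the image loads -->
<img src="hero.jpg" width="1280" height="720" alt="Hero image">

<!-- The padding trick: 9 / 16 = 56.25% keeps a 16:9 box for embedded media -->
<div style="position: relative; padding-top: 56.25%;">
  <iframe src="https://example.com/embed"
          style="position: absolute; top: 0; left: 0; width: 100%; height: 100%;">
  </iframe>
</div>
```

The percentage padding is calculated relative to the element’s width, which is what keeps the box at a fixed aspect ratio at any viewport size.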
I intentionally combined these into one section, since most ads are in fact third-party code embedded in the page via various means. While you can’t control the size of the content placed inside an ad, you can track down how big the whole ad/widget is, or how large it may potentially become.
Using Chrome DevTools, you can find the ad dimensions and style the ad container upfront with those values.
It’s also worth mentioning that ads placed at the very beginning of the page (even higher than website navigation) may cause bigger perceived layout shifts than those that are placed in the middle of the page.
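Assuming you’ve measured, say, a 300×250 slot in DevTools (a common “medium rectangle” ad size; your values will differ), a sketch of styling the container upfront:

```html
<style>
  /* Reserve the slot's known dimensions so the injected ad
     doesn't push the surrounding content down */
  .ad-slot {
    min-width: 300px;
    min-height: 250px;
  }
</style>

<div class="ad-slot">
  <!-- third-party ad script injects its markup here -->
</div>
```

Using min-height rather than height lets the slot grow gracefully if the ad turns out slightly larger than expected.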
If a website uses a custom font that isn’t already present on the visitor’s system, the browser needs to download it before it can be used on the page. While the font is downloading, a fallback font is used in the meantime; it may be defined in CSS by the developer or come from the browser’s defaults. The takeaway here is that the target font and the fallback font may occupy different amounts of space for the same text.
To prevent layout shifts caused by font-related problems, you can utilize the following:
Use the font-display CSS property. There’s a lot to talk about here: you can refer to this page for a deep dive.
Preload key web fonts using the <link rel=preload> HTML tag instead of remotely fetching them from CSS.
These two methods may be used separately or jointly. You can use the font-display property to control the font-swapping behavior, and preload the font to increase the chances that it will be downloaded and ready to render before the page is first painted. No visible font change = no layout shift.
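Combined, the two techniques might look like this (the font file path and family name are placeholders):

```html
<head>
  <!-- Fetch the key font early; crossorigin is required for font preloads,
       even when the font is served from your own origin -->
  <link rel="preload" href="/fonts/brand.woff2" as="font"
        type="font/woff2" crossorigin>

  <style>
    @font-face {
      font-family: "Brand";
      src: url("/fonts/brand.woff2") format("woff2");
      /* swap: show the fallback immediately, switch when the font arrives */
      font-display: swap;
    }
    body { font-family: "Brand", Arial, sans-serif; }
  </style>
</head>
```

Note that font-display: swap still allows a visible swap if the font arrives late; preloading is what shrinks that window.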
If you have AJAX-dependent DOM changes on your website, consider reserving a space for results coming from the server. This solution is very similar to reserving space for ads and widgets, but in this case you may have more control over how big the upcoming content will be.
From my experience: I once had dynamic JS forms rendered on a website on demand. When a user opened a modal, the form configuration started downloading and was then displayed on the page. However, sometimes this process took longer than the 500-millisecond window, breaking the cause-and-effect chain and registering layout shifts. Since the forms were pre-defined on the back end, it was possible to set the container height upfront and eliminate the shifts.
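In markup, that fix boils down to a pre-sized container (the class name and height here are hypothetical):

```html
<style>
  /* The form's rendered height is known on the back end,
     so reserve it before the AJAX response arrives */
  .form-container {
    min-height: 420px;
  }
</style>

<div class="form-container">
  <!-- form markup is injected here once the configuration downloads -->
</div>
```

If the exact height varies, a skeleton placeholder of roughly the right size inside the container achieves the same goal while also signaling that content is on its way.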
I intentionally didn’t bring this up at the very beginning of the article for a simple reason: from my experience, you don’t need to know all the details of the way the metric is measured to make some initial improvements.
However, knowing the theory puts you ahead of the competition (even if you’re just competing with yourself), so let’s find out.
Then, the browser determines the impact fraction. Take the height of the area that’s been affected by unstable elements and divide it by the whole viewport height, and you’ll get the impact fraction.
In the example above, the text block has been shifted down by 25% of the viewport. However, this has affected 75% of the viewport, so the impact fraction is 0.75. To put it more precisely, the impact fraction is the fraction of the viewport taken up by the union of the areas an unstable element occupies before and after the shift.
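The impact fraction is only half of the formula. Per Google’s definition, each layout shift score is the impact fraction multiplied by the distance fraction: the largest distance an unstable element has moved, divided by the viewport’s largest dimension. For the example above, where the text block moved by 25% of the viewport:

```
layout shift score = impact fraction × distance fraction
                   = 0.75 × 0.25
                   = 0.1875
```

CLS is then the sum of these individual scores, which is why several small shifts can add up to a “Poor” result just as easily as one large shift.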
It’s important to optimize all your Core Web Vitals — not only LCP and FID, but also CLS. As you optimize them, keep in mind that all three may depend, to a greater or lesser extent, on how fast your website’s resources actually load.
I’m mentioning images in this article for a reason. The thing is that images load in a non-blocking fashion: they won’t block the page from rendering, but they do consume bandwidth, leaving less of it for scripts, fonts, and external ads, which can cause delays. By optimizing images, you can improve all three core metrics.
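Done by hand, that optimization usually means responsive markup along these lines (the file names and breakpoints are placeholders):

```html
<!-- srcset lets the browser pick the smallest file that fits the layout;
     loading="lazy" defers off-screen images entirely;
     width/height reserve space so the image can't cause a shift -->
<img src="photo-800.jpg"
     srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     width="800" height="600"
     loading="lazy"
     alt="Product photo">
```

Maintaining such variants for every image by hand quickly becomes tedious, which is where automation helps.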
To accomplish this in a convenient automated way, Uploadcare created a tool called Adaptive Delivery. It allows you to:
Defer image loading (aka “lazy loading”). Images load automatically once they enter the viewport.
Apply image transformations.
Optimize parameters like size, format, quality, and dimensions automatically based on the user’s device configuration.
With the last one, you won’t need to manually prepare different image versions. The tool is content-aware and AI-driven, so the optimal quality will be determined based on the image content.
Also, it’s a tiny (3.7 KB) compressed script, which can save you megabytes of image data and do all the heavy lifting on the front end.
If you want to look at the potential results Adaptive Delivery can provide in your particular case, test your website with PageDetox, another service from Uploadcare that can generate reports based on image size data.
To help you in the race for perfect performance results, throughout this Lighthouse-related article series I encourage you to stick to a fundamental, user-centric approach, and CLS is no exception. Modern web standards make it harder to trick search engines, which naturally pushes website owners to care more about their users.