Back in 2021 I wrote about measuring your website's performance with Lighthouse. At the time, Google had just started factoring Core Web Vitals into search rankings and the industry was scrambling to understand what these metrics meant. Five years on, the landscape has matured — some metrics have been replaced, new ones have been introduced and the bar for what Google considers "good" has moved. Let's take stock of where we are.
A quick refresher
Core Web Vitals are a set of real-world performance metrics that Google uses to evaluate how users experience your website. They've always focused on three dimensions: loading speed, interactivity and visual stability. What's changed over the years is which specific measurements Google uses to assess each of those dimensions.
What's changed since 2021
Interaction to Next Paint replaced First Input Delay
The most significant change came in 2024 when Google officially replaced First Input Delay (FID) with Interaction to Next Paint (INP). FID only measured the delay before the browser started processing the first user interaction. INP measures the full lifecycle — from the moment a user clicks, taps or presses a key, through the browser's processing, all the way to when the screen actually updates.
This is a much better reflection of how responsive a site feels. A page could pass FID easily while still feeling sluggish because the browser took ages to visually respond after processing the event. INP catches that.
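As a rough mental model of how the INP number is chosen (this is a simplification of the spec, and the function below is illustrative rather than Chrome's implementation): INP takes the worst interaction latency observed on the page, ignoring one outlier for every 50 interactions so that a single freak event on a busy page doesn't dominate.

```javascript
// Simplified sketch of how INP is derived from a page's interaction
// latencies (in milliseconds). Real INP is measured by the browser from
// PerformanceObserver 'event' entries; this only models the selection
// rule: take the worst latency, skipping one outlier per 50 interactions.
function approximateINP(latencies) {
  if (latencies.length === 0) return undefined;
  const sorted = [...latencies].sort((a, b) => b - a); // worst first
  const outliersToSkip = Math.floor(latencies.length / 50);
  return sorted[Math.min(outliersToSkip, sorted.length - 1)];
}
```

On a page with only a handful of interactions, `approximateINP([40, 60, 350, 80])` returns `350`: one slow tap is enough to set the score.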
What it means for you: Sites that relied on passing FID benchmarks may now fail on INP. If your site has heavy JavaScript execution, complex event handlers or long-running tasks on the main thread, INP will expose those issues.
Largest Contentful Paint: same threshold, higher expectations
LCP hasn't changed conceptually — it still measures how long it takes for the largest visible content element to render. But Google has gradually raised expectations. A score that was considered acceptable in 2021 may now be flagged as needing improvement.
The target remains under 2.5 seconds for a "good" rating, but with modern hosting, CDNs and image formats like AVIF, there's less excuse for missing it. Google's measurement tooling has also become better at detecting lazy-loaded content and deferred rendering tricks that artificially improved LCP scores.
Cumulative Layout Shift scoring refined
CLS still measures visual stability — how much the page layout shifts unexpectedly during loading. Google has refined how it's calculated, using a "session windowing" approach that groups layout shifts into bursts rather than accumulating them across the entire page lifecycle. This is fairer for long-lived pages like single-page applications where small shifts over a long session used to accumulate unfairly.
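The session-window model can be sketched as a small function. This is an illustrative approximation of the published rules, not the browser's code: shifts are grouped into a window, a new window starts when more than one second passes since the previous shift or the current window would exceed five seconds, and the reported CLS is the worst window rather than the lifetime total.

```javascript
// Approximation of session-window CLS scoring. Each shift has a
// timestamp (ms) and a layout-shift score. A new window opens after a
// >1s gap since the last shift, or when the window would exceed 5s.
// The page's CLS is the largest window sum, not the sum of everything.
function sessionWindowCLS(shifts) {
  let worst = 0;
  let windowSum = 0;
  let windowStart = -Infinity;
  let lastShift = -Infinity;
  for (const { time, value } of shifts) {
    const gapTooLong = time - lastShift > 1000;
    const windowTooLong = time - windowStart > 5000;
    if (gapTooLong || windowTooLong) {
      windowSum = 0; // start a new session window
      windowStart = time;
    }
    windowSum += value;
    lastShift = time;
    worst = Math.max(worst, windowSum);
  }
  return worst;
}
```

Two small shifts during load plus a tiny one a minute later now score as the worst burst (0.10) instead of the lifetime sum (0.12), which is exactly the fairness improvement for long-lived pages.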
In my original Lighthouse article I mentioned how CLS can cause users to miss critical calls to action — that Tesla example. The refined scoring model is better at catching genuinely disruptive shifts while being more forgiving of minor ones that happen well after the page has loaded.
The three Core Web Vitals in 2026
| Metric | What it measures | Good threshold |
|---|---|---|
| LCP (Largest Contentful Paint) | Loading speed | Under 2.5s |
| INP (Interaction to Next Paint) | Responsiveness | Under 200ms |
| CLS (Cumulative Layout Shift) | Visual stability | Under 0.1 |
Why it still matters
Some business owners I speak with assume that Core Web Vitals were a passing trend — something Google pushed for a couple of years before moving on. That hasn't happened. If anything, the weighting has increased. Page experience signals remain a confirmed ranking factor, and with AI-powered search results reducing the number of organic positions on the results page, every ranking advantage counts more than ever.
But the SEO impact is only part of the story. These metrics exist because they correlate with real business outcomes:
- Faster sites convert better. This has been demonstrated repeatedly across industries. Even a 100-millisecond improvement in load time can measurably improve conversion rates.
- Responsive interactions reduce bounce. When users click a button and nothing appears to happen, they leave. INP directly measures this.
- Stable layouts build trust. A page that jumps around while loading feels unfinished and unreliable. CLS captures exactly this experience.
For a deeper dive into how performance connects to business results, see our article on the real ROI of SEO.
Practical steps to improve your scores
For LCP
- Optimise images. Use modern formats (AVIF, WebP) and serve appropriately sized images for each device. Oversized, unoptimised images are the most common cause of poor LCP, so this step alone often brings a site under the threshold.
- Preload critical resources. Make sure the browser starts fetching your largest content element as early as possible.
- Evaluate your hosting. If your server response time is slow, no amount of front-end optimisation will save your LCP. A fast, well-located host is table stakes.
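The first two steps can be sketched in markup. The file names here are placeholders, but the attributes are standard: `rel="preload"` with `imagesrcset` hints the fetch early, and `fetchpriority="high"` tells the browser this image matters.

```html
<head>
  <!-- Hint the browser to fetch the hero image immediately,
       before it discovers the <img> tag during parsing -->
  <link rel="preload" as="image" href="hero.avif"
        imagesrcset="hero-800.avif 800w, hero-1600.avif 1600w">
</head>
<body>
  <!-- Serve a modern format, with a sized fallback for older browsers -->
  <picture>
    <source type="image/avif"
            srcset="hero-800.avif 800w, hero-1600.avif 1600w">
    <img src="hero.jpg" width="1600" height="900"
         fetchpriority="high" alt="Hero image">
  </picture>
</body>
```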
For INP
- Break up long tasks. If your JavaScript runs for more than 50 milliseconds without yielding, the browser can't respond to user input during that time. Split heavy operations into smaller chunks.
- Reduce third-party script impact. Analytics trackers, chat widgets, consent banners — each adds JavaScript that competes for the main thread. Audit what you actually need.
- Debounce expensive event handlers. If a scroll or resize event triggers layout recalculations, make sure it's not firing on every frame.
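The "break up long tasks" pattern can be sketched as follows. This is a minimal illustration with made-up names (`processInBatches`, `handleItem`): work runs in small batches, and control is handed back to the event loop between batches so queued interactions can be serviced.

```javascript
// Yield control back to the event loop so pending input can run.
const yieldToEventLoop = () => new Promise((resolve) => setTimeout(resolve, 0));

// Process a large list in small batches instead of one long task,
// yielding between batches so the main thread stays responsive.
async function processInBatches(items, handleItem, batchSize = 100) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    for (const item of items.slice(i, i + batchSize)) {
      results.push(handleItem(item));
    }
    await yieldToEventLoop(); // interactions queued so far run here
  }
  return results;
}
```

In browsers that support it, `scheduler.yield()` is a more direct way to yield between batches; the `setTimeout` fallback above works everywhere, including Node.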
For CLS
- Set explicit dimensions on images and videos. This tells the browser how much space to reserve before the content loads.
- Avoid injecting content above the fold after load. Cookie banners, promotional bars and late-loading ads are the most common CLS culprits.
- Use CSS containment where appropriate to isolate layout changes to specific areas of the page.
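The steps above map to a few lines of standard markup and CSS. The class names here are placeholders:

```html
<!-- Explicit dimensions let the browser compute the aspect ratio
     and reserve the slot before the image arrives -->
<img src="product.webp" width="800" height="600" alt="Product photo">

<style>
  /* Hold space for a late-loading embed instead of letting it
     push content down when it finally renders */
  .video-slot { aspect-ratio: 16 / 9; }

  /* Isolate layout work inside a self-contained widget so its
     internal changes don't reflow the rest of the page */
  .chat-widget { contain: layout; }
</style>
```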
Measuring in the real world
Lighthouse is still a valuable tool for lab-based testing, but the scores that matter to Google come from real users via the Chrome User Experience Report (CrUX). This means your Core Web Vitals performance is based on what your actual visitors experience on their actual devices and connections — not a synthetic test on your high-spec laptop.
You can check your real-world scores in Google Search Console under the Core Web Vitals report, or via PageSpeed Insights which shows both lab and field data side by side.
At Inlucent we collect Web Vitals data from real user sessions to identify exactly where performance degrades — not just what the averages say, but which pages, which devices and which interactions are causing problems. If your scores need attention, we can help.