WEBSITE PERFORMANCE

INP is quiet. It's the Core Web Vital that correlates with conversion.

Interaction to Next Paint replaced First Input Delay as a Core Web Vital on 12 March 2024. FID measured one number: the delay before the browser could start handling the first input on the page. INP measures the worst interaction across the whole page session (with one outlier discarded per 50 interactions, roughly the 98th percentile of interaction latencies). That is a materially different metric, and most teams have not yet felt the full consequences.

FID was a test almost everything passed. INP is not.

The HTTP Archive Web Almanac 2024 performance chapter, which draws on the CrUX dataset for the top million origins, put FID pass rate above 95% on mobile. INP pass rate on the same dataset dropped to roughly 62% on mobile and 91% on desktop. Sites that were green on every Core Web Vital in early 2024 were often amber or red by April.

Google's official targets are: good is 200ms or below, needs improvement is 200 to 500ms, poor is above 500ms. These are the thresholds that map to the green, amber, and red zones in PageSpeed Insights and Search Console's Core Web Vitals report.

What INP actually captures

INP sums three things for every interaction:

  1. Input delay. Time from the user action to the event handler starting. Usually small unless the main thread is already blocked.

  2. Processing time. How long the handler runs on the main thread. Typically the dominant component on sites with heavy client-side rendering.

  3. Presentation delay. Time from handler completion to the browser painting the next frame. Often underestimated, especially on sites with large layout costs.
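How the three components combine, and how the page-level value is selected, can be sketched as follows. The field names loosely mirror PerformanceEventTiming, but the objects here are hand-built for illustration, and interactionLatency and pageINP are hypothetical helper names:

```javascript
// Each interaction's latency is the sum of its three components.
function interactionLatency(i) {
  return i.inputDelay + i.processingTime + i.presentationDelay;
}

// The page-level INP is the worst interaction, ignoring one outlier
// per 50 interactions (which is why it approximates a high percentile).
function pageINP(interactions) {
  const durations = interactions.map(interactionLatency).sort((a, b) => b - a);
  const outliersToIgnore = Math.floor(interactions.length / 50);
  return durations[outliersToIgnore];
}

const clicks = [
  { inputDelay: 4, processingTime: 30, presentationDelay: 16 },  // 50ms
  { inputDelay: 2, processingTime: 310, presentationDelay: 68 }, // 380ms
  { inputDelay: 8, processingTime: 90, presentationDelay: 22 },  // 120ms
];
// pageINP(clicks) is 380: one slow handler sets the score for the session.
```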

A jQuery-era click handler that blocks the main thread for 380ms passes FID and fails INP. A React component that re-renders the entire component tree on every keystroke in a search box passes FID and fails INP. That is the practical shift: INP is sensitive to the middle and end of the session, not just the opening interaction.

The three regressions we keep finding

  1. Third-party tag scripts firing on first input. Marketing stacks that defer init to the first click, touch, or scroll concentrate main-thread work exactly where INP measures it. A common fix is moving consent SDK and tracker init off the input path and onto a later idle callback using requestIdleCallback or scheduler.postTask.

  2. React state updates inside input handlers. When a click handler sets four bits of state that each trigger re-renders on large component trees, processing time dominates INP. The fix is usually batching (React 18 automatic batching is not always enough across async boundaries), moving derived state out, or deferring with startTransition. We have seen p75 INP drop from 420ms to 180ms on a B2B SaaS dashboard by replacing a Redux cascade with a single useTransition.

  3. Synchronous work disguised as UX. Client-side sorting, filtering, currency formatting, large list renders, markdown parsing in real time. Virtualisation helps. Moving the work into a Web Worker helps more when it is non-trivial. content-visibility: auto on off-screen sections can cut presentation delay by more than half.
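And for regression 3, the content-visibility change is a one-rule stylesheet edit. The class name and the 800px intrinsic-size placeholder are assumptions to adapt per template:

```css
/* Sketch: let the browser skip layout and paint for off-screen sections.
   contain-intrinsic-size reserves an estimated height so the scrollbar
   does not jump as sections enter the viewport. */
.below-the-fold {
  content-visibility: auto;
  contain-intrinsic-size: auto 800px; /* assumed placeholder height */
}
```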
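The idle-deferral fix for regression 1 can be sketched as below. deferInit and initThirdParty are hypothetical names, the scheduler preference order is our own convention, and the return value exists only to make the chosen strategy observable:

```javascript
// Sketch: push third-party init off the input path and onto idle time.
// initThirdParty stands in for a consent SDK / tracker bootstrap.
function deferInit(initThirdParty) {
  const g = globalThis;
  if (typeof g.scheduler !== 'undefined' &&
      typeof g.scheduler.postTask === 'function') {
    g.scheduler.postTask(initThirdParty, { priority: 'background' });
    return 'postTask';
  }
  if (typeof g.requestIdleCallback === 'function') {
    // timeout guarantees the work still runs on pages that never go idle
    g.requestIdleCallback(initThirdParty, { timeout: 2000 });
    return 'requestIdleCallback';
  }
  setTimeout(initThirdParty, 0); // last-resort fallback (e.g. Safari)
  return 'setTimeout';
}
```

Either way the init no longer competes with the event handler for the main thread at the moment INP is measuring.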
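For regression 2, the batching idea, independent of any framework, reduces to coalescing several state writes into one render. This is a minimal hand-rolled sketch with hypothetical names (createStore, setState, batch); React's internal batching works differently but to the same effect:

```javascript
// Minimal sketch of update batching: several setState calls inside
// batch() produce a single render instead of one render per write.
function createStore(initialState, render) {
  let state = { ...initialState };
  let depth = 0;      // nesting depth of batch() calls
  let dirty = false;  // state changed while batching
  function setState(patch) {
    state = { ...state, ...patch };
    if (depth > 0) { dirty = true; } else { render(state); }
  }
  function batch(fn) {
    depth += 1;
    try { fn(); } finally {
      depth -= 1;
      if (depth === 0 && dirty) { dirty = false; render(state); }
    }
  }
  return { setState, batch, getState: () => state };
}
```

Four state writes in a click handler then cost one render's worth of processing time, not four.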

Mobile is the real battleground

The desktop INP pass rate is comfortable for most sites. Mobile is where budgets break. Median mobile INP across the CrUX dataset sits around 180ms; the 75th percentile (the one that matters for the CWV assessment) is above 200ms for most ecommerce and content categories. Low-end Android devices (Moto G, Samsung A-series) routinely produce INP values three to four times worse than an iPhone 15 on the same page. If your audience skews toward sub-$300 Android devices, assume your field INP is meaningfully worse than what your team measures on their own phones.

Why this is the CWV that moves conversion

Chrome's case-study collection on web.dev (Redbus, Tokopedia, Economic Times, RedBull) shows INP improvements mapping to measurable bounce and conversion lifts. LCP and CLS gate perception. INP gates the actual interaction the user came to do. It is the one that, when bad, produces the mouse-click-nothing-happens sensation users describe in interviews and never in analytics. We ran a simple correlation on 14 ecommerce domains, comparing monthly p75 INP against checkout-start rate. The domains whose p75 INP moved below 200ms showed a median 6.4% lift in checkout-start. LCP movements over the same period did not correlate.

Measurement is not optional

Lab tools under-report INP because they rarely replicate the session-long interaction pattern of a real user. Field measurement via the web-vitals JS library reported to GA4 or a warehouse is the only reliable source. Google's web.dev INP guide has the current recommended instrumentation. A common mistake is reporting INP only as an average; the p75 is the one Google scores against, and the tail is where the problem lives.
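As a concrete illustration of why the average misleads, here is a nearest-rank p75 over a small, invented set of field samples. With these numbers the mean sits mid-range while the p75 is deep into the amber zone:

```javascript
// Sketch: aggregate field INP samples and score the p75, not the mean.
// The sample values are invented; real data would come from the
// web-vitals library reporting into GA4 or a warehouse.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const idx = Math.ceil((p / 100) * sorted.length) - 1; // nearest-rank
  return sorted[Math.max(0, idx)];
}

const samples = [80, 90, 110, 120, 140, 150, 160, 480, 520, 900]; // ms
const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
const p75 = percentile(samples, 75);
// mean is 275ms; p75 is 480ms, brushing the 500ms "poor" threshold.
// An averages-only dashboard hides the tail where the problem lives.
```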

What to prioritise if you have one sprint

  • Instrument field INP first. Without the data you are guessing.

  • Find the worst-performing page templates by p75 INP. Not the homepage; usually the product-list or dashboard template.

  • Profile one real interaction per template in the Chrome DevTools Performance panel. The long tasks will be obvious.

  • Defer the third-party scripts that are the easiest to defer. Almost every marketing stack has two or three you can move off the input path inside a day.

Sources: HTTP Archive Web Almanac 2024; CrUX quarterly updates; web.dev INP guide; internal correlation across 14 ecommerce domains, Q2 2024.

© 2026 8LAB. All loops reserved.
