Monitor the Performance of Your Storefront
Monitor the performance of your PWA Kit-based storefront at different stages of the app lifecycle, from in-house lab testing during development to real-world usage in production. Create a performance benchmark to use as a basis for future comparisons.
Collect lab data when you develop the app, and field (real-world) data after the app is deployed.
- Lab data is collected from performance tests in a controlled environment. During app development, run performance tests and collect lab data to make sure the app's performance meets performance benchmarks before it's released to customers. For performance tools to use, see Performance Tools.
- Field data captures real-world user experiences. After the app is released to production, collect field data, also called Real User Monitoring (RUM) data, from active user sessions to understand how your app performs for real users. See the next section for more information about the Chrome User Experience Report and Google's Core Web Vitals.
To understand real user experience, measure field data using some common monitoring tools, such as Chrome User Experience Report (CrUX), or other third-party tools, such as Datadog or Akamai.
CrUX is a public dataset from Google that provides real-world metrics from actual Chrome users and serves as the official source for Google's Web Vitals program. CrUX data is a 28-day rolling average, so performance improvements take time to be reflected. CrUX evaluates site performance at the 75th percentile: a metric is rated "Good" only when at least 75% of pageviews meet the "Good" threshold. It primarily focuses on hard navigations (initial page loads), not soft navigations within Single-Page Applications (SPAs), and it doesn't include data from iOS devices. See Overview of CrUX in the Google Chrome documentation.
Google uses field data from CrUX for page experience ranking, not lab-based Lighthouse scores. The ultimate goal is to ensure at least 75% of real users pass Core Web Vitals thresholds.
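Since CrUX scores a metric at the 75th percentile, a quick sketch of that computation can make the pass/fail logic concrete. The sample values below are made up for illustration and this is a simplified nearest-rank calculation, not the official CrUX aggregation:

```javascript
// Hypothetical LCP samples in milliseconds, as if collected from user sessions.
const lcpSamples = [1800, 2100, 2300, 2400, 2600, 1900, 2200, 3100, 2000, 2450];

// Return the value at the given percentile (nearest-rank method).
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[rank];
}

const p75 = percentile(lcpSamples, 75);
// Compare the p75 value against the 2.5-second LCP goal.
console.log(p75, p75 <= 2500 ? 'Good' : 'Needs work'); // 2450 'Good'
```

In other words, optimizing only your fastest pageviews doesn't move the needle: the slowest quarter of experiences determines whether the metric passes.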
Users must meet certain criteria to have their experiences aggregated in the CrUX dataset. To learn more, see User in CrUX methodology in the Chrome for Developers documentation.
Core Web Vitals is a set of metrics that measure real-world user experience for three key aspects of the page: loading performance, interactivity, and visual stability.
- Largest Contentful Paint (LCP): Measures how long it takes for the largest image or text block to appear. The goal is 2.5 seconds or less. See Largest Contentful Paint.
- Interaction to Next Paint (INP): Measures page responsiveness to user input. The goal is 200 milliseconds or less. See Interaction to Next Paint (INP).
- Cumulative Layout Shift (CLS): Measures unexpected content shifts during loading. The goal is 0.1 or less. See Cumulative Layout Shift.
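The thresholds above can be captured in a small helper that classifies a measured value into Google's published good / needs-improvement / poor bands (the "poor" cut-offs of 4,000 ms for LCP, 500 ms for INP, and 0.25 for CLS are Google's documented values). In production, you would typically collect the raw values with a RUM library such as the open-source web-vitals package:

```javascript
// Google's published Core Web Vitals thresholds: [good, poor] cut-offs.
const THRESHOLDS = {
  LCP: [2500, 4000], // milliseconds
  INP: [200, 500],   // milliseconds
  CLS: [0.1, 0.25],  // unitless layout-shift score
};

// Classify a measured value into Google's rating bands.
function rate(metric, value) {
  const [good, poor] = THRESHOLDS[metric];
  if (value <= good) return 'good';
  if (value <= poor) return 'needs-improvement';
  return 'poor';
}

console.log(rate('LCP', 2300)); // good
console.log(rate('INP', 350));  // needs-improvement
console.log(rate('CLS', 0.3));  // poor
```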
During app development, perform initial performance testing to create a benchmark that you can compare future performance test results against.
To set up a performance test that simulates your customers' experience, learn your customers' typical devices and network conditions, and choose test settings that represent your average customer. Define performance goals, for example, LCP < 2.5 seconds, and track them throughout app development. If future test results show slower performance, your app has regressed, and you can investigate which change introduced the regression.
We recommend using continuous integration practices to catch and fix regressions before release. With continuous integration, code changes are merged frequently to a shared source code repository, triggering automated builds and tests. Integrate performance tools such as Google Chrome Lighthouse or WebPageTest into your continuous integration workflow.
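For example, if you adopt Lighthouse CI (the automation wrapper around Lighthouse), a minimal `lighthouserc.js` can fail the build when a test page misses your LCP or CLS goal. This is a sketch only: the staging URL and the choice of budgets are assumptions for your project, not PWA Kit defaults:

```javascript
// lighthouserc.js — minimal Lighthouse CI sketch (assumed project setup).
module.exports = {
  ci: {
    collect: {
      url: ['https://staging.example.com/'], // hypothetical staging URL
      numberOfRuns: 3,                       // multiple runs reduce noise
    },
    assert: {
      assertions: {
        // Fail the build if lab LCP exceeds the 2.5-second goal.
        'largest-contentful-paint': ['error', { maxNumericValue: 2500 }],
        // Fail the build if CLS exceeds the 0.1 goal.
        'cumulative-layout-shift': ['error', { maxNumericValue: 0.1 }],
      },
    },
  },
};
```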
Here are some commonly used tools to measure the performance of your app.
- (Recommended) WebPageTest: A web performance testing tool with AI-powered analytics, dashboards, and real user monitoring (RUM).
- Google Lighthouse: Open-source, automated tool to help you improve the quality of web pages.
- PageSpeed Insights: Reports on the user experience of a page on both mobile and desktop devices, and provides suggestions on how that page can be improved.
- Browser developer tools: For example, in Chrome DevTools, check out Coverage: Find unused JavaScript and CSS and Performance panel: Analyze your website's performance.
- Chrome User Experience Report (CrUX): Publicly available, aggregated performance data from real Chrome users.
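CrUX data can also be queried programmatically through Google's public CrUX API. The sketch below builds the JSON body for a `records:queryRecord` request; the origin is a hypothetical example, and you need an API key from a Google Cloud project to actually send the request:

```javascript
// Public CrUX API endpoint (append ?key=YOUR_API_KEY when sending).
const CRUX_ENDPOINT =
  'https://chromeuserexperiencereport.googleapis.com/v1/records:queryRecord';

// Build the JSON body for a phone-traffic query against an origin.
function buildCruxQuery(origin) {
  return {
    origin, // e.g. 'https://www.example.com' (hypothetical)
    formFactor: 'PHONE',
    metrics: [
      'largest_contentful_paint',
      'interaction_to_next_paint',
      'cumulative_layout_shift',
    ],
  };
}

const body = buildCruxQuery('https://www.example.com');
console.log(JSON.stringify(body));
// To send: fetch(`${CRUX_ENDPOINT}?key=${apiKey}`, { method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(body) });
```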
To effectively audit and monitor performance, use these metrics and debugging tips.
- Core Web Vitals
- Largest Contentful Paint (LCP)
- Interaction to Next Paint (INP)
- Cumulative Layout Shift (CLS)
- Time to First Byte (TTFB)
- Total Blocking Time (TBT)
- CDN cache hit ratio
For a description of performance metrics, see What are Core Web Vitals? and Time to First Byte (TTFB). To learn more about TBT, see Total Blocking Time (TBT) in web.dev.
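The CDN cache hit ratio in the list above is simply cache hits divided by total requests. A small sketch, using made-up request counts rather than real CDN log data:

```javascript
// Compute a CDN cache hit ratio from hypothetical edge-log counts.
function cacheHitRatio(hits, misses) {
  const total = hits + misses;
  return total === 0 ? 0 : hits / total;
}

const ratio = cacheHitRatio(920, 80); // made-up request counts
console.log(`${(ratio * 100).toFixed(1)}% of requests served from cache`);
// 92.0% of requests served from cache
```

A higher ratio means more pages are served from the CDN edge instead of your server-side rendering tier, which directly improves TTFB for those requests.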
Use the Metrics Dashboard in your B2C Commerce instance to gather field data and monitor the performance of the app while it's running. For example, you can check out these metrics.
- Managed Runtime metrics: Request Time and Cache Hits (Ratio)
- Overall Application Metrics: Storefront requests
To access the Metrics Dashboard, in App Launcher, select Log Center, and then click Metrics.
- Append `?__server_timing=true` to your URL to get a Server-Timing header breaking down SSR process times.
- Append `?__server_only` to view the pure, server-rendered version of a page without client-side JavaScript.
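To make sense of the Server-Timing header returned when you append the timing query parameter, a small parser can turn it into name/duration pairs. The header value below is a made-up example for illustration, not actual PWA Kit output:

```javascript
// Parse a Server-Timing header value into { name: durationMs } pairs.
function parseServerTiming(header) {
  const result = {};
  for (const entry of header.split(',')) {
    const parts = entry.trim().split(';');
    const name = parts[0].trim();
    for (const attr of parts.slice(1)) {
      const [key, value] = attr.trim().split('=');
      if (key === 'dur') result[name] = Number(value);
    }
  }
  return result;
}

// Made-up example header, not real PWA Kit output.
const timings = parseServerTiming('ssr;dur=182.4, fetch-data;dur=95.1');
console.log(timings); // { ssr: 182.4, 'fetch-data': 95.1 }
```

Sorting the parsed entries by duration quickly shows which server-side step dominates your SSR time.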