Introducing User eXperience Rating (UXR) score
- Aymen Loukil
To assess how well a web page performs in controlled environments (in the lab), we usually rely on the Google Lighthouse score. But how do we measure and assess real user experiences? Tracking the progress of every metric individually is challenging, and there is no unified rating score. That’s why we are excited to introduce User eXperience Rating (UXR): a score to assess real users’ experiences.
In this post, we’ll describe UXR, how to measure it, and how Speetals can help you get started.
What is UXR?
UXR (User eXperience Rating) is a new way to measure and assess real user experience using data from sessions recorded by a Real User Monitoring (RUM) tool. It’s a simple scoring system that measures the quality of your website users’ experiences. Inspired by Google Lighthouse, UXR combines all of your metrics into one value so you can easily see how they come together to form an overall rating at the domain and page levels.
Each metric has a weight associated with it: the more heavily a metric is weighted, the more it contributes to your overall UXR score (a weighted average).
While the Lighthouse score is a weighted average of metric values, UXR is based on the percentage of good experiences for each metric. For example, say that according to CrUX (Chrome User Experience Report), 76% of a website’s tracked users experienced a Good LCP (Largest Contentful Paint); the UXR weight for LCP is then applied to that 76%.
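As an illustration, here is a hypothetical TypeScript sketch of how the good share of a metric could be pulled from the CrUX API. The request fields and metric keys shown are assumptions and should be checked against the current CrUX API documentation; any RUM tool exposing good-percentage distributions works just as well.

```typescript
// Hypothetical sketch: fetch the share of "Good" experiences for one metric
// from the CrUX API. Verify field and metric names against the CrUX API docs.
const CRUX_ENDPOINT =
  "https://chromeuxreport.googleapis.com/v1/records:queryRecord";

async function goodShare(
  origin: string,
  metric: string,
  apiKey: string
): Promise<number> {
  const res = await fetch(`${CRUX_ENDPOINT}?key=${apiKey}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ origin, metrics: [metric] }),
  });
  const data = await res.json();
  // The first histogram bin holds the density of "Good" experiences (0..1).
  const goodDensity = data.record.metrics[metric].histogram[0].density;
  return goodDensity * 100; // e.g. 0.76 -> 76% of users with a Good experience
}

// Example usage (metric key assumed to be "largest_contentful_paint"):
// const goodLcp = await goodShare("https://www.example.com",
//                                 "largest_contentful_paint", "YOUR_API_KEY");
```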
A Good UXR means a website/webpage is offering an overall good user experience.
UXR Weighting and formula:
| Metric | Weight |
| --- | --- |
| Largest Contentful Paint (LCP) | 30% |
| First Input Delay (FID) | 10% |
| Cumulative Layout Shift (CLS) | 20% |
| First Contentful Paint (FCP) | 10% |
| Time To First Byte (TTFB) | 20% |
| Interaction to Next Paint (INP) [experimental] | 10% |
UXR = (Good_LCP x 30%) + (Good_FID x 10%) + (Good_CLS x 20%) + (Good_FCP x 10%) + (Good_TTFB x 20%) + (Good_INP x 10%).
To use the UXR formula, open this Google Sheets template and make your own copy.
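If you prefer code to a spreadsheet, here is a minimal TypeScript sketch of the same formula; the interface, function name, and example numbers are illustrative only.

```typescript
// Minimal sketch of the UXR formula. Inputs are the "good" percentages
// (0-100) reported by your RUM tool or CrUX for each metric.
interface GoodShares {
  lcp: number;  // % of users with Good LCP
  fid: number;  // % with Good FID
  cls: number;  // % with Good CLS
  fcp: number;  // % with Good FCP
  ttfb: number; // % with Good TTFB
  inp: number;  // % with Good INP
}

// Weights from the table above; they sum to 100%.
const WEIGHTS = { lcp: 0.30, fid: 0.10, cls: 0.20, fcp: 0.10, ttfb: 0.20, inp: 0.10 };

function uxr(good: GoodShares): number {
  const score =
    good.lcp * WEIGHTS.lcp +
    good.fid * WEIGHTS.fid +
    good.cls * WEIGHTS.cls +
    good.fcp * WEIGHTS.fcp +
    good.ttfb * WEIGHTS.ttfb +
    good.inp * WEIGHTS.inp;
  return Math.round(score); // UXR on a 0-100 scale
}

// Example with illustrative numbers: 76% Good LCP, 95% Good FID, 80% Good CLS,
// 70% Good FCP, 65% Good TTFB, 90% Good INP => UXR = 77
console.log(uxr({ lcp: 76, fid: 95, cls: 80, fcp: 70, ttfb: 65, inp: 90 }));
```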
Color code of UXR:
We follow the same color code as Lighthouse:
- 0 to 49 (red): Poor
- 50 to 89 (orange): Needs Improvement
- 90 to 100 (green): Good
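As a small illustration, the mapping from score to rating band could be expressed like this (names are hypothetical):

```typescript
// Maps a UXR score to its Lighthouse-style rating band.
type Rating = "Poor" | "Needs Improvement" | "Good";

function uxrRating(score: number): Rating {
  if (score < 50) return "Poor";              // 0-49: red
  if (score < 90) return "Needs Improvement"; // 50-89: orange
  return "Good";                              // 90-100: green
}
```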
UXR in Speetals:
Let’s take the example of the UXR score of the Vinted website for a France-based audience. The screenshot below from Speetals shows the distribution of each metric for January 2023 on desktop devices. UXR = 61/100, and compared to December 2022 data, the website improved by 2 points.
We can also see the progress of the UXR score over time by clicking the “UXR over time” button.
With this graph, we can easily compare Mobile versus Desktop UXR progress. In the above example, the Mobile UXR is poor (red) while the Desktop one needs improvement (orange).
Check your website’s UXR now
How is it different from the Lighthouse score?
UXR is a new score created to provide a holistic view of user experience. UXR scores are calculated from real user data collected in the field, not from lab experiments or synthetic tools.
The Lighthouse score applies to lab data (synthetic tools) and to a single controlled page load, not to a set of user sessions. It is also not possible to apply Lighthouse scoring to field data, because the two don’t share the same metrics (for example, Total Blocking Time (TBT) is not available in field data).
Differences in weighting:
| Metric | Lighthouse (v10) weight | UXR weight |
| --- | --- | --- |
| Largest Contentful Paint (LCP) | 25% | 30% |
| First Input Delay (FID) | NA | 10% |
| Cumulative Layout Shift (CLS) | 25% | 20% |
| First Contentful Paint (FCP) | 10% | 10% |
| Time To First Byte (TTFB) | NA | 20% |
| Interaction to Next Paint (INP) [experimental] | NA | 10% |
| Speed Index (SI) | 10% | NA |
| Total Blocking Time (TBT) | 30% | NA |
| Time To Interactive (TTI) | 10% | NA |
Final thoughts:
What we have described here is a new initiative to measure and assess real user experience. UXR aims to make Real User Monitoring data simpler to evaluate and compare over time, and to provide an all-in-one score for users’ data.
We encourage the web performance community and other tools providers to try the UXR score, review it, and give feedback.
Thanks to Dave Smart and all those who reviewed the formula and weights of the UXR score.