When Google announced the Chrome User Experience Report (CrUX), I was immediately excited and started digging in. It's a 30TB BigQuery dataset with a ton of insights into how real users experience the web.
But working with the raw data is hard and expensive ($5 to scan 1TB). So I built Treo Site Speed (https://treo.sh/sitespeed). It caches CrUX data and makes it accessible to everyone, for free. No signup: just enter a URL and explore Core Web Vitals, server response times, and other speed metrics for the top 8M websites on the web. You can also build custom metrics using percentiles and intervals (including p99).
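To illustrate what a custom percentile metric means here: CrUX publishes each metric as a histogram of bins (a start value, an end value, and a density, i.e. the fraction of page loads in that bin), and a percentile like p75 or p99 can be estimated by walking the bins and interpolating. This is a minimal sketch of that idea; the bin values below are made up for illustration, not real CrUX data, and the function name is my own:

```python
def percentile_from_histogram(bins, p):
    """Estimate the p-th percentile (0-100) from histogram bins,
    given as (start, end, density) tuples whose densities sum to 1.
    Walks bins until the cumulative density crosses the target,
    then linearly interpolates inside that bin."""
    target = p / 100.0
    cumulative = 0.0
    for start, end, density in bins:
        if cumulative + density >= target:
            # Fraction of this bin needed to reach the target density.
            fraction = (target - cumulative) / density if density else 0.0
            return start + (end - start) * fraction
        cumulative += density
    return bins[-1][1]  # target beyond recorded data: return last bin's end

# Hypothetical LCP distribution in milliseconds (not real CrUX numbers).
lcp_bins = [(0, 1000, 0.55), (1000, 2500, 0.30),
            (2500, 4000, 0.10), (4000, 8000, 0.05)]

p75 = percentile_from_histogram(lcp_bins, 75)  # → 2000.0 ms
p99 = percentile_from_histogram(lcp_bins, 99)  # → 7200.0 ms
```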
Hacker News itself is famously minimalistic, which makes its metrics unique:
• LCP is 0.7s
• CLS is 0 – 99% of the time
• Its metrics haven't changed at all over the past year
• It passes CWV in all countries (even on 3G)
I'd love to hear what the HN community thinks about it. Check out an example for HN vs Reddit to get a quick taste: https://treo.sh/sitespeed/news.ycombinator.com/vs/www.reddit...