We Analyzed 5.2 Million Webpages. Here's What We Learned About PageSpeed

Hi all,

I'm Daniel. We recently looked at various speed metrics and related page characteristics using the HTTP Archive and CrUX datasets. While most of the results may be familiar to you, we also ran a machine learning model to determine which page characteristics most strongly influence page speed.

The final piece can be seen here: https://backlinko.com/page-speed-stats. The more technical paper can be found here: https://frontpagedata.com/projects/backlinko/pagespeed/Analysis_final.html

I would also like to thank the community and the HTTP Archive team, especially Rick. Without the excellent resources on GitHub, the forum, and elsewhere, we wouldn't have been able to conduct the study.

I'd be curious to hear what you think of the results.


Neat research, @dancoup! Great job tracking down all the relevant stats!

One main concern here. Many of these conclusions appear to be based on correlations yet are presented as causal, and some would perpetuate performance anti-patterns. I'm thinking in particular of the conclusions that compression can be a net loss, and that specific CDNs are slow, based on overall page load times. Have the correlations been adjusted to compare only pages with similar total byte weight, main-thread execution time, resource type composition, etc.? I'd expect those features to be more responsible for some of the highlighted effects, and they likely correlate with compression rates and CDN usage in ways that would explain these relationships.
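To make the confounding concern concrete, here is a minimal sketch (entirely synthetic data; field names and thresholds are my own assumptions, not the study's actual schema) showing how a naive compressed-vs-slow comparison can flip once pages are stratified by total byte weight:

```python
# Hypothetical illustration: heavier pages are both slower and more likely to
# use compression, so a naive pooled comparison blames compression for slowness.
# Stratifying by byte weight largely removes that confound.
import random
import statistics

random.seed(0)

# Synthetic pages: load time is driven by byte weight; compression itself
# slightly *helps* (-50 ms), but heavy pages compress more often.
pages = []
for _ in range(10_000):
    kb = random.uniform(100, 5000)                  # total byte weight (KB)
    compressed = random.random() < kb / 5000        # heavier -> more compression
    load_ms = kb * 2 + (-50 if compressed else 0) + random.gauss(0, 100)
    pages.append({"kb": kb, "compressed": compressed, "load_ms": load_ms})

def median_load(rows):
    return statistics.median(r["load_ms"] for r in rows)

# Naive pooled comparison (confounded by byte weight): large positive gap,
# i.e. compressed pages look much slower.
naive_gap = (median_load([p for p in pages if p["compressed"]])
             - median_load([p for p in pages if not p["compressed"]]))

# Stratified comparison: bucket pages into 500 KB weight strata and compare
# compressed vs. uncompressed only within each stratum.
def bucket(p):
    return int(p["kb"] // 500)

gaps = []
for b in sorted({bucket(p) for p in pages}):
    comp = [p for p in pages if bucket(p) == b and p["compressed"]]
    uncomp = [p for p in pages if bucket(p) == b and not p["compressed"]]
    if len(comp) >= 30 and len(uncomp) >= 30:       # skip thin strata
        gaps.append(median_load(comp) - median_load(uncomp))

adjusted_gap = statistics.mean(gaps)
print(f"naive gap:    {naive_gap:+.0f} ms")   # large and positive
print(f"adjusted gap: {adjusted_gap:+.0f} ms")  # close to zero
```

The same idea applies to the CDN comparisons: within-stratum (or regression-adjusted) gaps are what would support a causal reading, not pooled medians.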


@dancoup how could the published results be replicated? Could you share the queries that were used for the analysis?
