Is recursion used to measure website size? Is content compressed?

Hey guys, I was looking at the “Average Bytes per Page by Content Type” chart and had a few questions.

Is this content already compressed? If yes, can it be compressed more?

Second, how many links deep do these content sizes represent?

I would like to know the total size of compressed website content 2 links deep.

Thanks!

Hi @mrlearnerguru! :wave:

> Is this content already compressed? If yes, can it be compressed more?

All transfer sizes are as they appear over the network. So if the servers compressed the responses, then we’re measuring the compressed response sizes. Whether there is opportunity to squeeze more out of it depends on the resource type, file format, compression level, etc. We do include the raw WebPageTest and Lighthouse results in BigQuery, so in some cases you can find out if images could have been compressed more, if text resources were uncompressed, etc.
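If it helps, here's a rough way to check a page of your own outside of BigQuery. This is just a sketch, assuming the server supports gzip: the URL is a placeholder, other encodings like Brotli aren't handled, and error handling is omitted.

```python
import gzip

import requests

# A minimal sketch: read a response's on-the-wire (compressed) bytes,
# then recompress the body at the maximum gzip level to estimate how
# much headroom is left. The URL below is a placeholder.
url = "https://example.com/"

resp = requests.get(url, headers={"Accept-Encoding": "gzip"}, stream=True)
wire = resp.raw.read(decode_content=False)   # bytes exactly as sent over the network
encoding = resp.headers.get("Content-Encoding", "none")

# Decompress if the server gzipped the body, then recompress as hard as gzip can.
body = gzip.decompress(wire) if encoding == "gzip" else wire
best = gzip.compress(body, compresslevel=9)

print(f"encoding={encoding}  wire={len(wire)}B  "
      f"uncompressed={len(body)}B  gzip-9={len(best)}B")
```

If `wire` is close to `best`, the server is already compressing well; a big gap suggests the compression level (or lack of compression) is leaving bytes on the table.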

> Second, how many links deep do these content sizes represent?

HTTP Archive does not “crawl” the web like a typical search engine; rather, it has a list of URLs and it simply loads each web page. So the depth is only 1 page.
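So a “2 links deep” total isn't something the dataset can answer directly; you'd have to fetch the pages yourself. Here's a naive sketch of what that could look like. Everything in it is illustrative: the start URL is a placeholder, `LinkExtractor` and `page_links` are hypothetical helpers, it only follows `<a href>` links on the same fetch, it ignores subresources, duplicates, and robots.txt, and `len(.content)` measures the decoded body rather than the transfer size (for transfer size you'd read the raw stream as in the sketch above).

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

import requests

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def page_links(url):
    """Return the absolute URLs linked from a page."""
    parser = LinkExtractor()
    parser.feed(requests.get(url, timeout=10).text)
    return {urljoin(url, href) for href in parser.hrefs}

start = "https://example.com/"               # placeholder start page
total = len(requests.get(start, timeout=10).content)
for link in sorted(page_links(start))[:20]:  # cap the fan-out for the sketch
    try:
        total += len(requests.get(link, timeout=10).content)
    except requests.RequestException:
        pass

print(f"approximate bytes within 2 links of {start}: {total}")
```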


Thanks @rviscomi! Great resource you guys have here.