Is this content already compressed? If yes, can it be compressed more?
All transfer sizes are as they appear over the network. So if the servers compressed the responses, then we’re measuring the compressed response sizes. Whether there is an opportunity to squeeze out more depends on the resource type, file format, compression level, etc. We do include the raw WebPageTest and Lighthouse results in BigQuery, so in some cases you can find out if images could have been compressed further, if text resources were served uncompressed, etc.
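As one illustration of the kind of check described above, the sketch below scans a HAR-style capture (the format WebPageTest produces, per the HAR 1.2 spec) for text responses that lack a `Content-Encoding` header, i.e. likely compression opportunities. The sample entries and URLs are hypothetical, not real HTTP Archive data:

```python
# Sketch: flag text responses in a HAR capture served without compression.
# The structure (log.entries, response.headers, response.content.mimeType)
# follows the HAR 1.2 spec; the sample data below is purely illustrative.

TEXT_TYPES = ("text/", "application/json", "application/javascript",
              "image/svg+xml")

def is_text(mime_type):
    """True for MIME types that generally benefit from gzip/brotli."""
    return any(mime_type.startswith(t) for t in TEXT_TYPES)

def uncompressed_text_entries(har):
    """Yield (url, mimeType, bodySize) for text responses that have no
    Content-Encoding header -- likely compression opportunities."""
    for entry in har["log"]["entries"]:
        resp = entry["response"]
        headers = {h["name"].lower(): h["value"] for h in resp["headers"]}
        mime = resp["content"].get("mimeType", "")
        if is_text(mime) and "content-encoding" not in headers:
            yield entry["request"]["url"], mime, resp.get("bodySize", -1)

# Hypothetical capture: one uncompressed JS file, one brotli-compressed CSS file.
sample = {"log": {"entries": [
    {"request": {"url": "https://example.com/app.js"},
     "response": {"headers": [],  # no Content-Encoding header
                  "content": {"mimeType": "application/javascript"},
                  "bodySize": 120000}},
    {"request": {"url": "https://example.com/style.css"},
     "response": {"headers": [{"name": "Content-Encoding", "value": "br"}],
                  "content": {"mimeType": "text/css"},
                  "bodySize": 8000}},
]}}

for url, mime, size in uncompressed_text_entries(sample):
    print(f"{url} ({mime}, {size} bytes) served uncompressed")
```

In practice you would run a query of this shape over the raw results tables in BigQuery rather than over a local HAR file, but the logic — match text-like MIME types, then look for a missing `Content-Encoding` header — is the same.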
Second, how many links deep do these content sizes represent?
HTTP Archive does not “crawl” the web like a typical search engine; rather, it starts from a list of URLs and simply loads each page. So the depth is only 1 page.