HTTP Archive has come a long way in the 7 years since Steve Souders first started tracking how the web is built. The first crawl ran on November 15, 2010 and covered ~20k desktop websites. Today, we track 25x more websites on both desktop and mobile, and capture much richer data from newer WebPageTest features and PWA metrics from Lighthouse.
A few years ago, we moved all of our data to BigQuery to make this goldmine of data easier to explore. Since then, this forum has been thriving with all of the great work by the HTTP Archive community.
So to give back to the community and encourage more awesome analysis, we’re giving away “research grants” in the form of $50 in Google Cloud credit (equivalent to 10 TB of query processing). Whether you’re a power user running close to your free BQ quota or a new developer eager to get your hands dirty, we encourage you to sign up at bit.ly/ha50 and tell us your use cases. And if you find something super interesting in the data, or even if you just need help writing your query, please open a thread on this forum and tell everyone what you’re working on.
To motivate and inspire you, here are some of the greatest hits from the forum over the years:
- M dot or RWD. Which is faster?
- Are Popular Websites Faster?
- What is the least common colour used on web pages?
- 1MB+ of HTTP overhead due to TowerData cookies
- Sites that deliver Images using gzip/deflate encoding
- Mobile requests that redirect to 'm.' or '.mobi'
- What is the distribution of 1st party vs 3rd party resources?
- Top Base Page CDNs for Top URLs
So thank you to the HTTP Archive community for keeping us going for these 7 years. We can’t wait to see what you discover next!
Rick, on behalf of the HTTP Archive team