I recently wanted to know whether 52 domains on a given page was too much, and what the general distribution of domains per page in the HTTP Archive data looked like.
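For context, one rough way to get that distribution is to count the unique request hostnames per page and histogram the counts. A minimal sketch over hypothetical HAR-style data (the `pages` dict below is made up for illustration, not real HTTP Archive records):

```python
from collections import Counter
from urllib.parse import urlparse

def domains_per_page(pages):
    """Map each page URL to the number of unique hostnames its requests hit."""
    return {page: len({urlparse(u).hostname for u in urls})
            for page, urls in pages.items()}

def distribution(pages):
    """Histogram: how many pages reference N distinct domains."""
    return Counter(domains_per_page(pages).values())

# Hypothetical sample data standing in for real crawl records
pages = {
    "http://example.com/": [
        "http://example.com/index.html",
        "http://cdn.example.com/app.js",
        "http://ads.tracker.net/pixel.gif",
    ],
    "http://other.org/": [
        "http://other.org/",
        "http://other.org/style.css",
    ],
}

print(domains_per_page(pages))  # {'http://example.com/': 3, 'http://other.org/': 1}
print(distribution(pages))      # Counter({3: 1, 1: 1})
```

On the real dataset you'd run the equivalent aggregation over the requests table rather than in-memory dicts, but the counting logic is the same.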
@aranjedeath before we answer that, it would be interesting to run a second pass of analysis on these domains to separate cases of third-party deps (e.g. widgets, ads, etc.) from domain sharding. In an HTTP/2 world, sharding is an anti-pattern, so that’s definitely part education, part (hopefully) automation to undo the damage. That said, I doubt the number of third-party deps will go down – my guess is it’ll continue going up (in fact, there’s another interesting graph that I’d like to see!)
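One crude way to flag sharding candidates in such a pass is to check whether a request hostname shares a registrable domain with the page. This sketch approximates the registrable domain by the last two hostname labels, which is wrong for suffixes like `.co.uk` – a real pass would use the Public Suffix List; all names in the sample data are invented:

```python
from urllib.parse import urlparse

def base_domain(host):
    """Naive registrable-domain guess: last two labels.
    (Breaks on suffixes like .co.uk; use the Public Suffix List for real work.)"""
    return ".".join(host.split(".")[-2:])

def classify(page_url, request_urls):
    """Split request hostnames into likely shards vs. third parties,
    skipping same-origin requests."""
    page_host = urlparse(page_url).hostname
    page_base = base_domain(page_host)
    shards, third_party = set(), set()
    for u in request_urls:
        host = urlparse(u).hostname
        if host == page_host:
            continue  # same origin, not interesting here
        (shards if base_domain(host) == page_base else third_party).add(host)
    return shards, third_party

shards, third = classify("http://example.com/", [
    "http://example.com/index.html",
    "http://img1.example.com/a.png",    # likely a shard
    "http://img2.example.com/b.png",    # likely a shard
    "http://ads.tracker.net/pixel.gif", # third party
])
print(sorted(shards))  # ['img1.example.com', 'img2.example.com']
print(sorted(third))   # ['ads.tracker.net']
```

A same-base-domain subdomain isn't automatically a shard (it could be a legitimately separate service), so this only narrows the candidate set for the kind of manual or automated follow-up described above.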
@cqueern I don’t think the sheer number is as relevant as when and which resources are in the critical path. If the bulk of these domains are loaded after the fact, outside of the critical path – great!
I’m guessing this data would also be useful for folks who work in webspam detection, because it’d make it easier to identify sites and pages with suspiciously high numbers of outbound links. I doubt it’s a novel idea to teams that work for search engines, but hosting companies hoping to detect compromises of their customers’ sites might find it helpful.