Hi everyone,
I’ve been digging into web performance lately using HTTP Archive data, and I started wondering how small but highly interactive pages compare to larger content sites. I was testing something like the Telenor quiz portal and noticed that even though the page looks simple, it makes a surprising number of network requests and loads quite a few scripts and other assets.
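For reference, this is roughly how I’ve been pulling per-host request counts out of BigQuery. It’s a minimal sketch: I’m assuming the `httparchive.all.requests` table with `date`, `client`, `page`, and `url` columns, and the page URL below is just a placeholder, so please correct me if I have the schema wrong.

```typescript
// Sketch: count requests per host for one page in the HTTP Archive dataset.
// Assumptions (not verified): the `httparchive.all.requests` table and its
// `date`, `client`, `page`, and `url` columns; the URL is a placeholder.
import { BigQuery } from "@google-cloud/bigquery";

async function requestBreakdown(pageUrl: string): Promise<void> {
  const bq = new BigQuery();
  const query = `
    SELECT NET.HOST(url) AS host, COUNT(*) AS requests
    FROM \`httparchive.all.requests\`
    WHERE date = '2024-06-01'   -- date filter keeps the partitioned scan cheap
      AND client = 'mobile'
      AND page = @page
    GROUP BY host
    ORDER BY requests DESC`;
  const [rows] = await bq.query({ query, params: { page: pageUrl } });
  for (const row of rows) {
    console.log(`${row.host}: ${row.requests} requests`);
  }
}

requestBreakdown("https://example.com/quiz/").catch(console.error);
```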
When I check metrics like LCP, CLS, and Total Blocking Time (TBT), the results are much worse than I expected for a single quiz page. Some of the scripts are loaded from different domains, and caching doesn’t always seem to kick in, which makes repeat visits slower than they should be.
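To get those numbers I’ve mostly been watching PerformanceObserver directly in the console rather than relying on lab tools. A rough sketch: it assumes Chromium’s largest-contentful-paint, layout-shift, and longtask entry types, and it only approximates TBT as long-task time over 50 ms rather than the formal FCP-to-TTI window.

```typescript
// Sketch: log LCP candidates, running CLS, and an approximate TBT.
// Chromium-only entry types; layout-shift fields aren't in lib.dom,
// so those entries are typed loosely here.
let cls = 0;
let tbt = 0;

new PerformanceObserver((list) => {
  const entries = list.getEntries();
  // The last buffered entry is the current LCP candidate.
  console.log("LCP candidate (ms):", entries[entries.length - 1].startTime);
}).observe({ type: "largest-contentful-paint", buffered: true });

new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    // Ignore shifts right after user input, per the CLS definition.
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log("CLS so far:", cls.toFixed(4));
}).observe({ type: "layout-shift", buffered: true });

new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // Approximate TBT: the blocking portion of each long task (> 50 ms).
    tbt += Math.max(0, entry.duration - 50);
  }
  console.log("Approx. TBT (ms):", tbt.toFixed(0));
}).observe({ type: "longtask", buffered: true });
```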
Is this a common pattern for interactive quiz or educational pages? And does HTTP Archive offer a good way to analyze how third-party scripts, dynamic content, and cross-domain assets contribute to problems like long load times or layout shifts?
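For what it’s worth, my current workaround for the cross-domain part is grouping Resource Timing entries by host in the console, like the sketch below. One caveat: transferSize reads as 0 for cross-origin responses without a Timing-Allow-Origin header, so the byte totals are only a lower bound.

```typescript
// Sketch: group loaded resources by host and flag third parties.
// Note: transferSize is 0 for cross-origin responses that don't send
// Timing-Allow-Origin, so treat the byte totals as a lower bound.
interface HostStats { requests: number; bytes: number; thirdParty: boolean }

const firstPartyHost = location.hostname;
const stats = new Map<string, HostStats>();

for (const entry of performance.getEntriesByType("resource") as PerformanceResourceTiming[]) {
  const host = new URL(entry.name).hostname;
  const s = stats.get(host) ??
    { requests: 0, bytes: 0, thirdParty: host !== firstPartyHost };
  s.requests += 1;
  s.bytes += entry.transferSize;
  stats.set(host, s);
}

// Print third-party hosts first, heaviest (by request count) at the top.
[...stats.entries()]
  .sort((a, b) =>
    Number(b[1].thirdParty) - Number(a[1].thirdParty) ||
    b[1].requests - a[1].requests)
  .forEach(([host, s]) =>
    console.log(`${s.thirdParty ? "3P" : "1P"} ${host}: ${s.requests} reqs, ${s.bytes} bytes`));
```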
I’d love to hear how others here approach measuring and optimizing this kind of setup.