React adoption over time

Here is a reusable query template for tracking adoption of any particular Wappalyzer technology over time since 2016 (the earliest time at which we had the response bodies to apply the detection logic). In this example we’re looking at React:

#standardSQL
SELECT
  -- Convert the table suffix (eg 2019_07_01_mobile) into a YYYY-MM-DD date.
  REPLACE(SUBSTR(_TABLE_SUFFIX, 1, 10), '_', '-') AS date,
  IF(ENDS_WITH(_TABLE_SUFFIX, 'desktop'), 'desktop', 'mobile') AS client,
  COUNT(DISTINCT url) AS pages,
  total,
  ROUND(COUNT(DISTINCT url) / total, 4) AS pct
FROM
  `httparchive.technologies.*`
JOIN
  -- Total pages tested per crawl, used to compute the relative percentage.
  (SELECT _TABLE_SUFFIX, COUNT(0) AS total FROM `httparchive.summary_pages.*` GROUP BY _TABLE_SUFFIX)
USING (_TABLE_SUFFIX)
WHERE
  app = 'React'
GROUP BY
  date,
  client,
  total
ORDER BY
  date,
  client
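If you just want to experiment with the template without scanning the full dataset, you can prune the wildcard tables by filtering on _TABLE_SUFFIX. Here's a minimal sketch that limits the scan to the 2019 crawls (the totals join is dropped for brevity, so it returns raw counts only):

#standardSQL
-- Sketch: same template, restricted to 2019 crawls to reduce bytes scanned.
-- The totals join is omitted for brevity, so this returns raw counts only.
SELECT
  REPLACE(SUBSTR(_TABLE_SUFFIX, 1, 10), '_', '-') AS date,
  IF(ENDS_WITH(_TABLE_SUFFIX, 'desktop'), 'desktop', 'mobile') AS client,
  COUNT(DISTINCT url) AS pages
FROM
  `httparchive.technologies.*`
WHERE
  app = 'React'
  AND _TABLE_SUFFIX LIKE '2019%'
GROUP BY
  date,
  client
ORDER BY
  date,
  client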

Raw results

When plotting the absolute count of React detections over time, here’s what we get:

Most recently we’ve detected React on 362,399 mobile pages and 200,404 desktop pages, and adoption is growing. But there are some misleading jumps in this chart. Due to a corpus change, we started ramping up to 10x more websites in July 2018 and completed that change in January 2019. So it’s helpful to look at adoption relative to the total number of sites tested:

The relative percentages tell another interesting story. There are three phases: before July 2018, between July 2018 and January 2019, and after January 2019. These boundaries correspond to the corpus changes, and the rollout of those changes tells us how React adoption is distributed across the web.

Before July 2018 we used the home pages of the Alexa top 500K domains and tested them on both mobile and desktop. The Alexa list had been discontinued, so our corpus was getting stale. Adoption during this time was pretty low, < 1%. After July 2018 we started using the Chrome UX Report, which gave us more granular origin-level info for popular websites. But we didn’t have the capacity to test all ~5M websites, so we had to take the intersection of the Alexa 1M domains and the CrUX origins. In other words, we were looking at the most popular 20% of origins. During that time React adoption was nearly 10% on mobile. After January 2019, we increased our capacity and included the remaining 80% of the tail. The adoption rate dropped to ~7% on mobile, but remained significantly higher than before July 2018.

In summary, switching to the CrUX-based corpus improved the accuracy of our detections because it represents websites real users actually visit. And because it’s updated monthly, we can track changes more closely.

React adoption is significantly higher on mobile than on desktop. And while the relative percentage of websites looks roughly flat through 2019, the absolute number of websites is still growing.


Are the httparchive.technologies detections based on Wappalyzer? Makes me wonder what the difference would be compared to the Lighthouse-derived detections, since we updated those earlier in the fall in a way that likely increased detections.

Yes, these are Wappalyzer-based detections.

Lighthouse is based on “Library Detector for Chrome”, which produced some questionable results last I looked. For example I think it showed Angular as more popular than React.

There are two ways to query that data: using the _js_libs HAR metric (or similar, off the top of my head) in the pages dataset, or in the Lighthouse results, which are only available for mobile websites. I was querying the HAR metric, so it’s possible it’s stale and not detecting correctly compared to the Lighthouse implementation. It may be worth querying the Lighthouse data to be sure, but being mobile-only limits our analysis.
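For reference, here’s a rough sketch of both approaches against one crawl. Note the $._js_libs key is an assumption (as mentioned, the exact metric name may differ), and the second query assumes the js-libraries audit stores detected libraries under details.items:

#standardSQL
-- Sketch 1: the HAR custom metric in the pages dataset.
-- NOTE: the $._js_libs key is an assumption; the exact key may differ.
SELECT
  url,
  JSON_EXTRACT(payload, '$._js_libs') AS js_libs
FROM
  `httparchive.pages.2019_07_01_mobile`
LIMIT 10

#standardSQL
-- Sketch 2: the Lighthouse js-libraries audit (mobile crawls only).
SELECT
  url,
  JSON_EXTRACT(report, "$.audits['js-libraries'].details.items") AS libs
FROM
  `httparchive.lighthouse.2019_07_01_mobile`
LIMIT 10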

Really neat analysis! I think those insights square with my intuition that the tail of the web uses React much less frequently than the most popular 20%. Fascinating that its usage has stagnated percentage-wise over the past year though, I wouldn’t have expected that.

Lighthouse is based on “Library Detector for Chrome”, which produced some questionable results last I looked. For example I think it showed Angular as more popular than React.

FWIW, if the last time you looked was some time ago, several updates were made to the detection so that it no longer relies solely on the global scope. When I ran a query ~2 weeks ago, the React numbers there and from the technologies table were very similar.
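A comparison along those lines could look something like this: a sketch against a single crawl, counting React detections from each source. The string match against the Lighthouse JSON is crude but serviceable, and again assumes the js-libraries audit shape:

#standardSQL
-- Wappalyzer-based count from the technologies dataset.
SELECT COUNT(DISTINCT url) AS react_pages
FROM `httparchive.technologies.2019_07_01_mobile`
WHERE app = 'React'

#standardSQL
-- Lighthouse-based count; crude string match against the js-libraries audit.
SELECT COUNT(DISTINCT url) AS react_pages
FROM `httparchive.lighthouse.2019_07_01_mobile`
WHERE JSON_EXTRACT(report, "$.audits['js-libraries'].details.items") LIKE '%"React"%'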


It was last week. I’ll rerun the analysis using both sources and post a new thread for comparison.

Have there been other changes in the Wappalyzer detection that might be driving these graphs? For example, say we plot Google Analytics using the same code, changing ‘React’ to ‘Google Analytics’ and looking only at the desktop results:

[Figure: Google Analytics detections over time, desktop]
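In case it helps with reproduction, here’s a minimal sketch of that variant: the same template as above with only the app filter swapped and a desktop-only condition added (totals join omitted for brevity):

#standardSQL
-- Sketch: the adoption template with the app swapped to Google Analytics
-- and only desktop crawls included.
SELECT
  REPLACE(SUBSTR(_TABLE_SUFFIX, 1, 10), '_', '-') AS date,
  COUNT(DISTINCT url) AS pages
FROM
  `httparchive.technologies.*`
WHERE
  app = 'Google Analytics'
  AND ENDS_WITH(_TABLE_SUFFIX, 'desktop')
GROUP BY
  date
ORDER BY
  date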

Does anyone know why the graph changes so dramatically?

Wappalyzer wasn’t run as part of the scan until around where the graph jumps. @rviscomi ran a backfill that applied the detection logic to the historical test results, but the backfill doesn’t have access to the DOM, so I wouldn’t be surprised if it missed a bunch.
