Just a heads up that there was an issue with the Lighthouse update script that prevented ~half of the tests from running the Lighthouse audits. Keep this in mind when querying the 2017_11_15 dataset.
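If you want to gauge how much of that run is affected before relying on it, a quick count of rows with and without Lighthouse reports works. This is a minimal sketch: the table name `httparchive.lighthouse.2017_11_15_mobile` and the `report` column are assumptions based on the usual HTTP Archive dataset layout, so adjust them to whatever table you are actually querying.

```sql
-- Rough check of Lighthouse coverage for the 2017_11_15 run.
-- Table name and `report` column are assumptions; verify against
-- the actual dataset schema before using.
SELECT
  COUNTIF(report IS NULL) AS missing_lighthouse,
  COUNTIF(report IS NOT NULL) AS has_lighthouse
FROM `httparchive.lighthouse.2017_11_15_mobile`;
```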
Just curious: is there a predictable schedule for posting crawl data publicly, both on this site and within Google BigQuery?
Apologies in advance if this has been discussed elsewhere.
Long answer: The crawls start on the 1st and 15th of every month and take 2 weeks to run. Theoretically, the data should be processed and available on BigQuery and the web UI by the 14th and 30th, respectively, but there is often some kind of hiccup that requires manual intervention. For example, I just found that the November 1 desktop HAR tables were not available on BigQuery and manually ran that processing job. I'm usually quicker to discover and rerun failed jobs, but holidays, blah blah blah.
Short answer: We're working to make data availability more predictable, but for now you can expect a ~2-3 week turnaround time.
Understood, thanks so much!