I’m remaking this website in the open. Follow along, if you wish: Writing, RSS, Source


A Helpful Baseline (Measuring Site Quality with Lighthouse CI)

There is much to improve about this site. (If you’re reading this close to the publish date, that will be obvious, as you view the nearly naked prose and utter lack of design.) But making improvements without measuring their impact risks making things worse. So, before I do any more work, I need a baseline against which to measure my changes.


Lighthouse is a helpful tool that audits a website across five main categories: performance, accessibility, best practices, SEO, and progressive web app (PWA). Those align nicely with my goals for the site, making it an ideal fit for my needs. Lighthouse also does something quite neat and very much in line with the “learn, then teach” spirit of this site. Reporting the audit successes & failures would be useful on its own, but the Lighthouse team went much further and provides the why behind each check, many with a “learn more” link. This greatly shortens the path from “oh no, not good” to “I learned a thing!”.

There are many ways to run Lighthouse. The easiest is probably the built-in Audits panel of Chrome’s DevTools. That’s fine for a manual workflow, but I want to automate the audits so they run against every change to the site and surface the results clearly. For that, I’ll use Lighthouse CI, which I can run within my GitHub Workflow. Doing so is fairly straightforward:

  1. Add Lighthouse step to workflow (commit)

    Though it’s all in one commit, this contains multiple tasks (sketches of the resulting workflow step and Lighthouse configuration follow this list).

    1. Change the Deploy step to always build a deploy preview, which can then be audited
    2. Add the Lighthouse step (following Lighthouse CI’s instructions), using the preview link from the previous step
    3. Add some basic Lighthouse configuration
      1. Add recommended assertions (using the “no-pwa” variety until I can begin addressing those criteria)
      2. Publish reports to a temporary, public location (for free!)
    4. Add a second deploy step (really, move the previous one) that conditionally deploys a preview or the production site, upon success of the prior steps
    5. Refactor the Create Deploy Message step, to build up slightly different messages for each deploy step
  2. Turn off failing assertions (commit)

    While I haven’t done anything to intentionally make the performance or accessibility of this site worse (to then improve in a helpful blog post), I haven’t done anything extra to improve those things either. Thus, the recommended assertions return [four failed checks]. For now, I’ll disable those checks (as in the configuration sketch below) to get a passing audit, and will address each of them as my very next tasks.
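
For reference, here’s a rough sketch of how the workflow fits together. This is an approximation under assumptions, not this site’s actual workflow: the step names, the `./deploy.sh` script, and its `url` output are placeholders, while the Lighthouse step follows the install-and-`lhci autorun` pattern from Lighthouse CI’s getting-started instructions.

```yaml
# Sketch only: deploy commands, step names, and the preview-URL output are
# placeholders; the Lighthouse step follows Lighthouse CI's documented pattern.
jobs:
  build-and-audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      # Always deploy a preview first, so there is a live URL to audit.
      - name: Deploy preview
        id: preview
        run: ./deploy.sh --preview   # placeholder; assumed to expose a `url` step output

      # Audit the preview with Lighthouse CI, using the URL from the previous step.
      - name: Lighthouse
        run: |
          npm install -g @lhci/cli@0.x
          lhci autorun --collect.url="${{ steps.preview.outputs.url }}"

      # Deploy to production only when everything above (Lighthouse assertions
      # included) succeeded and the push is to the default branch.
      - name: Deploy production
        if: success() && github.ref == 'refs/heads/main'
        run: ./deploy.sh --prod      # placeholder
```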
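
And here’s a minimal sketch of the Lighthouse configuration, assuming a `lighthouserc.js` at the project root. The `lighthouse:no-pwa` preset and the `temporary-public-storage` upload target come from Lighthouse CI’s documentation; the disabled audit is only a hypothetical stand-in, not necessarily one of the four checks that actually failed here.

```js
// Minimal lighthouserc.js sketch; the disabled audit below is illustrative only.
module.exports = {
  ci: {
    assert: {
      // Recommended assertions, minus the PWA category.
      preset: 'lighthouse:no-pwa',
      assertions: {
        // Hypothetical example of turning off a single failing check until it's fixed.
        'uses-responsive-images': 'off',
      },
    },
    upload: {
      // Publish each report to Lighthouse CI's free, temporary public storage.
      target: 'temporary-public-storage',
    },
  },
};
```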

Now, whenever I push a change to the site, a Lighthouse run kicks off as part of my CI/CD workflow, and the results are published as both a status check (when pushing to a branch with a pull request) and a detailed report (not that exact report, since the generated one is only hosted temporarily; that link is from a manual audit using https://web.dev/measure/, and even if it continues to work, it probably won’t reflect the state of the site when this post was published).

Most importantly, I can now continue to improve the site knowing that my changes are truly having a positive impact on the things I care about.

Follow-up

I’m happy with this initial implementation, but there are aspects to improve.