Lighthouse accessibility failures
Counts the specific accessibility audits Lighthouse flagged. The high-leverage list — most failures are one-line fixes in markup or CSS.
What this check does
Counts how many of Lighthouse’s accessibility audits failed on the page. Lighthouse runs axe-core under the hood and groups results into pass/fail/manual. This rule reports the fail count and lists them — the Lighthouse accessibility score gauge is the headline number, this rule is the to-do list.
Typical offenders MetricSpot surfaces:
- `color-contrast` — text below the 4.5:1 AA contrast ratio.
- `image-alt` — `<img>` without `alt` (different from image alt-text quality, which scores meaningfulness).
- `link-name` / `button-name` — interactive elements with no accessible name (icon-only buttons are the usual culprit).
- `label` — form controls without an associated `<label>`.
- `html-has-lang` / `html-lang-valid` — covered by declare page language.
- `document-title` — empty or missing `<title>`.
- `aria-*` — invalid roles, missing required attributes, or `aria-hidden` on focusable elements.
- `list` / `listitem` — `<li>` outside a `<ul>`/`<ol>` parent.
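The 4.5:1 threshold behind `color-contrast` comes straight from WCAG's relative-luminance math, which is small enough to sketch. The formula is WCAG's; the helper names here are ours:

```typescript
// WCAG 2.x contrast math: linearize each sRGB channel, weight into a
// relative luminance, then ratio the lighter against the darker color.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
}

function relativeLuminance(r: number, g: number, b: number): number {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number],
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

contrastRatio([0, 0, 0], [255, 255, 255]); // black on white: 21:1, the maximum
contrastRatio([119, 119, 119], [255, 255, 255]); // #777 on white sits just under 4.5:1 and fails AA
```

The `#777`-on-white case is why so many "light gray" designs trip this audit: it fails by a few hundredths.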
Why it matters
Lighthouse failures are the fixable half of accessibility. Every audit in this list is something a build agent can verify mechanically — no human judgment needed. They’re also the audits most likely to surface in a legal complaint: contrast and missing form labels are the two most-cited issues in ADA web-accessibility lawsuits in the US, and the European Accessibility Act (enforcement began June 2025) reads from the same checklist.
Getting this count to zero won’t make a site fully WCAG-compliant — about a third of WCAG 2.1 AA criteria require manual review — but it removes the lowest-effort objections.
How to fix it
Read the failures in order. Each Lighthouse failure includes a “Failing elements” snippet — that’s your starting point.
View the per-audit list:
- Lighthouse CLI / DevTools — run an audit, scroll to “Accessibility”, click into each red entry for the failing element list.
- PageSpeed Insights — `pagespeed.web.dev/?url=<your-url>` returns the same audits with element selectors.
- DevTools → Issues panel — Chrome 109+ surfaces axe-core issues live as you browse.
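The fail count this rule reports can also be derived from a raw Lighthouse JSON report (e.g. `lighthouse <url> --output=json`). A sketch, assuming Lighthouse's documented report shape (`categories.accessibility.auditRefs` plus the top-level `audits` map); the helper name is ours:

```typescript
// Pull the failing accessibility audits out of a parsed Lighthouse report.
type LhAudit = { id: string; score: number | null; title: string };
type LhReport = {
  categories: { accessibility: { auditRefs: { id: string }[] } };
  audits: Record<string, LhAudit>;
};

function failingA11yAudits(report: LhReport): LhAudit[] {
  return report.categories.accessibility.auditRefs
    .map((ref) => report.audits[ref.id])
    // score === null marks manual/informative audits; score < 1 marks a failure
    .filter((a) => a !== undefined && a.score !== null && a.score < 1);
}
```

Feed it `JSON.parse(readFileSync("report.json", "utf8"))` and log `id` + `title` of each entry to get the same to-do list this rule shows.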
Wire it into CI. Don’t rely on humans clicking Lighthouse. Add axe-core to your e2e suite:
```sh
bun add -D @axe-core/playwright
```

```ts
// e2e/a11y.spec.ts
import { test, expect } from "@playwright/test";
import AxeBuilder from "@axe-core/playwright";

test("home has no detectable a11y violations", async ({ page }) => {
  await page.goto("/");
  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa", "wcag21aa"])
    .analyze();
  expect(results.violations).toEqual([]);
});
```
For React/Next/Astro builds, also add eslint-plugin-jsx-a11y to catch the obvious ones at lint time.
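A minimal sketch of that lint-time setup, assuming `eslint-plugin-jsx-a11y` v6.9+ (which ships flat-config presets) — the file name and minimal shape are illustrative, and in most projects this file is plain JS:

```typescript
// eslint.config.js — flat-config sketch
import jsxA11y from "eslint-plugin-jsx-a11y";

export default [
  // The recommended preset flags missing alt text, unlabeled controls,
  // and invalid ARIA props as you type, before Lighthouse ever runs.
  jsxA11y.flatConfigs.recommended,
];
```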
The top 5 one-line fixes:
| Failure | Fix |
|---|---|
| `color-contrast` | Bump the foreground or background until the ratio clears 4.5:1 (3:1 for text 18pt+). Use DevTools’ color picker — it shows the ratio live. |
| `image-alt` | Add `alt="..."`. For purely decorative images, use `alt=""` (empty, not missing). See image alt-text. |
| `button-name` | Icon-only `<button>`s need `aria-label="Close menu"`. SVG icons inside need `aria-hidden="true"`. |
| `link-name` | Same as buttons. Wrap icon links with descriptive text or `aria-label`. See descriptive link text. |
| `label` | Wrap `<input>` in `<label>`, or use `<label for="id">` + `id="id"`. See form input labels. |
If the score is n/a instead of a count, see Lighthouse accessibility unavailable — the audit didn’t run.
Frequently asked questions
Will fixing every failure get me to 100?
You’ll usually land in the high 90s. Lighthouse’s accessibility score is weighted but not linear — color-contrast is heavy, others are light. The remaining points often come from manual checks (focus order, screen-reader narration) that the automated audit lists as “Additional items to manually check.”
Can I suppress audits I disagree with?
Yes, via the Lighthouse config’s skipAudits array, but think hard before suppressing. The common false-positive is color-contrast on text-over-image hero banners — fix the contrast (add an overlay or solid backdrop) rather than skipping the audit.
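A minimal sketch of that escape hatch — `extends` and `settings.skipAudits` are documented Lighthouse config fields, but the file name is illustrative:

```typescript
// custom-config.js — pass to the CLI with --config-path=./custom-config.js
export default {
  extends: "lighthouse:default",
  settings: {
    // A skipped audit vanishes from the report entirely, so the count
    // this rule shows drops too — prefer fixing the page where you can.
    skipAudits: ["color-contrast"],
  },
};
```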
How is this different from axe-core in DevTools?
It isn’t really. Lighthouse bundles axe-core and runs a subset of its rules. The DevTools axe extension runs the full set, so you may see more issues there — those extras are usually best-practice tagged and not WCAG-required.
Last updated 2026-05-11