Product Manager Compliance Pathway
Accessibility is now both a regulatory requirement and a product quality requirement. Both are real, both have deadlines, and both land on the PM's desk.
Last reviewed: 2026-04-07
Scoping the EAA against your product
Before you build a roadmap, you need to know which parts of the EAA apply to which parts of your product. Half the teams I see skip this and spend a year fixing the wrong things.
What the law says
EAA Article 2 sets the scope: products and services placed on the EU market after 28 June 2025. Annex I lists the accessibility requirements by product category. The ones that catch most software products are e-commerce services (Article 2(2)(f)), consumer banking services (2(2)(d)), e-books and dedicated reading software (2(2)(e)), and electronic communications services (2(2)(a)). Each category has its own clauses in Annex I and its own functional performance criteria. Some clauses apply to everyone — your product needs to support a keyboard, support a screen reader, scale its text. Others are sector-specific. Articles 31 and 32 set the timeline: the requirements apply from 28 June 2025, with transitional measures giving legacy products and existing service contracts a grace period that runs no later than 28 June 2030 in most cases.
What it means in practice
Sit down with the product, the marketing site, the mobile app, the help docs, and the contract templates. For each one, answer two questions: which EAA service or product category does it fall under, and which Annex I clauses apply to it? Write it down. That document is what every later decision rests on.

The usual mistake is jumping to 'we have a website, so the EAA applies to all of it' and stopping there. That conclusion is both too broad and too vague. A B2B SaaS sold into enterprises is largely out of scope. A B2C e-commerce flow inside the same app is fully in scope. A help centre with consumer self-service is in scope. An admin panel used only by your own staff is not.

The microenterprise exemption in Article 4 covers companies with fewer than 10 employees and annual turnover (or balance sheet total) under €2 million. If you qualify, the service obligations don't apply. The product obligations still do, though microenterprises handling products are relieved of some documentation duties and only have to supply the relevant facts if an authority asks. If you're sitting at 9 employees and growing, plan for the day you lose that exemption — you lose it the moment you cross the threshold.

Keep the scoping doc in version control next to the product spec. When you launch a new feature, the scoping conversation happens at design review, not in the audit afterwards.
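One way to keep the scoping document honest is to store it as structured data next to the product spec, so a trivial check can flag any surface that was marked in scope but never mapped to a category and clauses. A minimal sketch, with illustrative surface names and clause references (not drawn from any real audit):

```python
# Hypothetical scoping record: one entry per product surface.
# Category and Annex I clause references are illustrative examples only.
SCOPE = [
    {"surface": "checkout flow",  "category": "e-commerce services",
     "annex_i_clauses": ["Section III", "Section IV"], "in_scope": True},
    {"surface": "help centre",    "category": "e-commerce services",
     "annex_i_clauses": ["Section III"], "in_scope": True},
    {"surface": "internal admin", "category": None,
     "annex_i_clauses": [], "in_scope": False},
]

def unscoped(entries):
    """Surfaces marked in scope but missing a category or clause list."""
    return [e["surface"] for e in entries
            if e["in_scope"] and (not e["category"] or not e["annex_i_clauses"])]

print(unscoped(SCOPE))  # an empty list means every in-scope surface is mapped
```

A check like this can run in CI against the scoping file, so a new surface added without a scoping decision fails the build instead of surfacing in the audit.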
Common pitfalls
- Treating the EAA as 'a website thing' and missing that it covers contract terms, pre-contract information, payment flows, and support channels as well.
- Assuming the microenterprise exemption applies just because the company is small. It only applies to services, never to physical products, and only below specific size thresholds.
- Scoping once at the start of the project and never going back. Acquisitions, new features, market expansion — they all change the scope, often without anyone noticing.
How to verify it
Run the EAA Ready Quiz on this site for a quick first pass. It asks about your product type, your sectors, and your operator role, and tells you which clauses apply. Not legal advice, but a defensible starting point you can do in ten minutes. Then feed the scoping output into the Self-Assessment Pipeline. It turns the scope into a list of mandatory, recommended, and conditional tools to run, then tracks the results through to a final Word export. That's the document you'll attach to your Article 13 conformity assessment.
AccessibilityRef tools that help
- EAA Ready Quiz — quick scoping with a guided questionnaire
- Self-Assessment Pipeline — turns scope into a tracked test plan
- Disproportionate Burden Calculator — if you need to claim an Article 14 exemption for parts of your product
Accessibility in the Definition of Done
Accessibility is built in or it isn't built. Catching it at design review costs almost nothing. Catching it in a remediation sprint six months later costs a lot.
What the law says
EAA Article 4 says operators have to 'ensure' that products and services comply with the accessibility requirements. That verb is active. It's not enough to fix issues after someone reports them. The intent is integration: accessibility built into the design and development process, not bolted on at QA. In practice that means every feature ships with accessibility considered at requirements, at design, in the build, and in test. The accessibility acceptance criteria sit right next to the functional ones in your user stories.
What it means in practice
Put accessibility in your Definition of Done as a list of specific, verifiable items. 'Accessible' on its own doesn't work — it's not testable. You want concrete things: keyboard reachable, screen reader announces correctly, contrast verified, focus indicator visible, form fields labelled, error messages properly associated. The exact list depends on what your team ships, but keep it short and concrete.

Write the acceptance criteria into the user stories. 'As a user I can submit the contact form' becomes 'As a user I can submit the contact form using keyboard alone, with errors announced via the live region, and with all fields meeting the WCAG 2.2 target size minimum (2.5.8)'. That's the level of detail that gets the thing built right the first time.

Run accessibility design review at the same moment you run UX design review — before tickets get pointed. The cheapest place to catch a missing focus state is in the Figma file, not in code. Designers should be checking contrast as part of their handoff. PMs should be confirming the design has answered the four big questions: keyboard, screen reader, contrast, motion.

For existing products with no DoD discipline at all, start with the next sprint, not with a giant retroactive sweep. New features get the new bar. Legacy debt goes into a separate backlog with its own prioritisation.
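If the DoD lives as data rather than prose, a story can only be marked done when every item has an explicit pass recorded, which makes 'specific, verifiable' enforceable. A sketch, with illustrative item names (adjust the list to what your team actually ships):

```python
# Illustrative accessibility DoD items, one flag per verifiable check.
A11Y_DOD = [
    "keyboard_reachable",
    "screen_reader_announces",
    "contrast_verified",
    "focus_indicator_visible",
    "form_fields_labelled",
    "errors_associated",
]

def story_done(checks: dict) -> bool:
    """True only when every DoD item has an explicit pass recorded.

    An unrecorded check counts as a failure: 'nobody looked' is not 'done'.
    """
    return all(checks.get(item) is True for item in A11Y_DOD)

print(story_done({item: True for item in A11Y_DOD}))  # True
# One item left unverified (None, not False) still blocks sign-off:
print(story_done({**{i: True for i in A11Y_DOD}, "contrast_verified": None}))  # False
```

The design choice worth copying is that an unknown fails, not just an explicit failure: it forces someone to actually run each check before sign-off.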
Common pitfalls
- Writing 'accessible' in the DoD without saying what that means. The dev signs off thinking they did the right thing, the QA signs off because they don't know what to test, and the bug goes out.
- Saving accessibility for the QA pass at the end of the cycle. By then the architecture is locked in and fixing the bugs costs ten times what designing them out would have.
- Treating accessibility as a separate workstream owned by 'the accessibility person'. You get a debt backlog nobody else owns, plus a single point of failure the day that person leaves.
How to verify it
After each sprint, pick two or three stories at random and check whether the DoD was actually met. Did anyone run the keyboard test? Did anyone check contrast? Are the error messages associated? If most answers are no, the DoD is decoration and you need to either enforce it or simplify it down to something the team will actually do. Use the WCAG 2.2 Audit Checklist on this site as the canonical reference for what 'done' looks like at the criterion level. Not every story needs every criterion, but every team should agree on which ones apply to which kinds of work.
AccessibilityRef tools that help
- WCAG 2.2 Audit Checklist — the criterion-level reference for DoD definitions
- Component Library — vetted patterns that pass accessibility by default
Managing accessibility debt
Every product has accessibility debt. Knowing how much, where it lives, and what it costs to fix is what separates a real roadmap from wishful thinking.
What the law says
EAA Article 14 lets operators claim 'disproportionate burden' for specific accessibility requirements, but only if the burden has been formally assessed against the criteria in Annex VI. Those criteria weigh the net cost of compliance against the operator's overall costs and turnover, and the operator's estimated costs and benefits against the estimated benefit for people with disabilities, taking into account the frequency and duration of use of the product or service. The assessment has to be documented, renewed when the service changes or a regulator asks, reviewed at least every five years either way, and made available to a regulator on request. 'It's too expensive' is not a defence. 'We did the Annex VI assessment, here's the document' is.
What it means in practice
Inventory the debt before you plan the remediation. The Accessibility Debt Calculator on this site turns a list of audit findings into hours, cost, and timeline estimates. The output is a number you can take to a finance director and have an actual conversation about.

Prioritise by user impact, not by developer effort. A keyboard trap in checkout hits every blind user every time they try to buy something. A missing alt on a decorative icon hits nobody meaningfully. The 80/20 rule holds here — fixing the top 20% of issues knocks out 80% of the user-facing pain.

Keep the remediation backlog separate from the feature backlog. They compete for the same engineering hours, but they need to be tracked and reported separately so leadership can see both. 'How much accessibility work shipped this quarter' is a useful metric. Burying it inside the feature backlog hides it.

When you genuinely can't fix something within the timeline — for example, a critical legacy component that would need six months of rework — Article 14 disproportionate burden is the right tool. Document the assessment honestly. The Burden Calculator on this site walks you through the Annex VI factors and produces a defensible record. Just don't use it as an excuse to wave away things you don't want to fix. The regulator will see straight through that.
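Impact-over-effort prioritisation can be reduced to a score: how many users hit the issue, how often, how severe, divided by the estimated fix effort. The weighting below is an illustration of the idea, not a recommended formula, and the findings are invented:

```python
def priority(finding):
    """Illustrative impact-over-effort score: higher means fix sooner."""
    severity = {"critical": 4, "high": 3, "medium": 2, "low": 1}[finding["severity"]]
    impact = severity * finding["users_affected_pct"] * finding["frequency_per_visit"]
    return impact / finding["effort_hours"]

findings = [
    {"id": "keyboard-trap-checkout", "severity": "critical",
     "users_affected_pct": 100, "frequency_per_visit": 1.0, "effort_hours": 16},
    {"id": "decorative-icon-alt", "severity": "low",
     "users_affected_pct": 5, "frequency_per_visit": 0.1, "effort_hours": 1},
]

# The checkout trap scores far higher despite costing 16x the effort.
for f in sorted(findings, key=priority, reverse=True):
    print(f["id"], round(priority(f), 2))
```

Any scoring scheme like this is a sorting aid, not an oracle: its real value is forcing the severity and impact conversation to happen once, up front, instead of per ticket.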
Common pitfalls
- Dumping every audit finding into Jira as separate tickets with no prioritisation. The team works through them in random order and the high-impact bugs ship months later than the trivial ones.
- Using disproportionate burden as a general escape hatch. It applies to specific requirements, not to whole products, and it requires real evidence behind it.
- Reporting accessibility debt only as absolute numbers — 'we have 312 issues'. The number is meaningless without severity weighting and a trend line.
How to verify it
Your remediation backlog should have a trend. Month over month, the number of critical and high-severity issues should be falling. If it's not, either nothing is shipping at all or new debt is being introduced as fast as old debt is being fixed. The Metrics Dashboard on this site (or your own spreadsheet) is fine for tracking this. Two numbers matter most: critical issues fixed this period, critical issues introduced this period. The delta needs to be negative. If it isn't, you're treading water and the DoD bar needs to come up.
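The two headline numbers (critical issues fixed versus critical issues introduced per period) reduce to a single delta per month. A spreadsheet handles this fine; the same logic as a sketch, with invented data:

```python
# Per-period counts of critical accessibility issues (illustrative data).
periods = [
    {"month": "2026-01", "fixed": 12, "introduced": 5},
    {"month": "2026-02", "fixed": 9,  "introduced": 4},
    {"month": "2026-03", "fixed": 7,  "introduced": 9},
]

def deltas(rows):
    """introduced - fixed per period; negative means the backlog is shrinking."""
    return {r["month"]: r["introduced"] - r["fixed"] for r in rows}

for month, d in deltas(periods).items():
    status = "ok" if d < 0 else "treading water or worse"
    print(month, d, status)
```

In the sample data the first two months are negative and March flips positive, which is exactly the signal that the DoD bar needs to come up before the remediation budget gets spent twice.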
AccessibilityRef tools that help
- Accessibility Debt Calculator — convert audit findings into hours and cost
- Disproportionate Burden Calculator — Annex VI assessment for Article 14 exemption claims
- Metrics Dashboard — track accessibility scores over time
- ROI Calculator — build the business case for the remediation budget
The accessibility statement (Article 13 / Annex V)
The statement is the first thing a regulator will look at. Get it right and most enquiries end there.
What the law says
EAA Article 13 says operators have to provide information about how their products and services meet the accessibility requirements, in an accessible format. Annex V spells out the structure: identification of the product or service, a description of how the requirements are met, contact details for queries, and an enforcement procedure for users who think the requirements haven't been met. This is the EAA's equivalent of the accessibility statement under the Web Accessibility Directive. WAD imposed a similar requirement on public sector bodies. The EAA extends it to private sector products and services in the relevant categories.
What it means in practice
Link the statement from every page of every product surface — marketing site, web app, mobile app, help centre. Put it in the footer, not three menus deep where nobody can find it.

The content has to cover the Annex V points without sliding into marketing copy. Be honest about partial conformance. 'We conform to WCAG 2.1 AA except for the following known issues' is a stronger position than a blanket conformance claim that falls apart the moment anyone scrutinises it. List the issues, the workarounds, the remediation timeline. Then keep that list current.

Contact details and the enforcement procedure are the bits most teams skip. Give a real email address that someone actually monitors. Describe what happens when a user reports a problem — who acknowledges it, how quickly, who escalates if nothing happens. If you serve a multilingual market, link to the national enforcement body for the user's country. The Authorities directory on this site lists all of them.

Use the Statement Wizard for the first draft. It produces an Annex V-compliant statement in seven languages and lets you customise the conformance status, known issues, and contact details. Don't ship it untouched, though. The wizard gives you the structure. The content has to reflect your actual product.
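The Annex V content areas are few enough to check mechanically before a statement ships. A sketch that flags missing or empty sections; the field names are this sketch's own shorthand for the Annex V points, not an official schema:

```python
# The content areas Annex V asks for, as this sketch names them.
ANNEX_V_SECTIONS = [
    "service_identification",    # what product or service the statement covers
    "how_requirements_are_met",  # conformance description, incl. known issues
    "contact_details",           # a monitored address for accessibility queries
    "enforcement_procedure",     # what a user can do if requirements aren't met
]

def missing_sections(statement: dict) -> list:
    """Annex V areas that are absent or left empty in a draft statement."""
    return [s for s in ANNEX_V_SECTIONS if not statement.get(s, "").strip()]

draft = {
    "service_identification": "Example web shop, example.com",
    "how_requirements_are_met": "WCAG 2.1 AA with listed known exceptions",
    "contact_details": "",  # the bit most teams skip, per the text above
    "enforcement_procedure": "Complaints process plus national authority link",
}
print(missing_sections(draft))  # ['contact_details']
```

A completeness check like this catches structural gaps only; whether the conformance description matches your latest audit still needs a human.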
Common pitfalls
- Claiming full WCAG conformance when the audit clearly shows partial. It's dishonest, and it's the first thing a regulator will challenge.
- Putting a contact email on the statement that nobody monitors. Users report bugs, hear nothing back, and go straight to the regulator.
- Writing the statement once and forgetting about it. It needs reviewing at least annually, and any time you ship a major release.
How to verify it
Read the statement out loud. If it sounds like marketing copy, it's wrong. If it sounds like a legal document that admits specific limitations and points users to specific remedies, it's right. Check every link in the statement works. Check the conformance claim matches your latest test results, not last year's. Check the contact email actually gets monitored, with an SLA you can defend.
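'Check every link in the statement works' can be partly automated: extract the hrefs and flag anything empty or placeholder before a human clicks through the rest. A standard-library sketch; the statement HTML below is illustrative:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from anchor tags as the HTML is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.append(dict(attrs).get("href", ""))

def suspect_links(html: str) -> list:
    """Hrefs that are empty or obvious placeholders and need fixing."""
    parser = LinkCollector()
    parser.feed(html)
    return [h for h in parser.links
            if h in ("", "#", "TODO") or h.startswith("javascript:")]

statement_html = (
    '<p><a href="mailto:access@example.com">Contact us</a> '
    '<a href="#">National enforcement body</a></p>'
)
print(suspect_links(statement_html))  # ['#']
```

Live links still need an HTTP check (or a click-through) on top of this; the sketch only catches the links that were never filled in at all.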
AccessibilityRef tools that help
- Statement Wizard — Annex V-compliant statement generator in seven languages
- EU Authorities Directory — national enforcement bodies for all 27 member states
Release management and regression prevention
Accessibility regresses on every deploy. The only thing that holds the line is automation in CI plus a manual check before release.
What the law says
EAA Article 4 requires ongoing compliance, not just a point-in-time conformance claim. A product that was accessible at launch and broken six months later is still non-compliant. The Article 13 conformity assessment is a snapshot. The obligation behind it is continuous.
What it means in practice
Two layers hold the line. Layer one is automated accessibility tests in CI on every pull request — axe-core, Pa11y, or Lighthouse running against the changed pages, with the build failing on any new violation. That catches the easy 30% before code merges. Layer two is a manual pre-release checklist on the changed flows. Keyboard test, screen reader spot-check, focus order check, contrast check on any new colours. Ten minutes per major flow, and it catches everything CI can't see.

Design system changes carry amplified risk. A focus-style regression in the button component breaks every button on the site at once. Treat design system PRs as high-risk by default and run a fuller accessibility check on them. The Focus Order Pro and Headings Pro batch tools are good here — scan a sample of pages from different parts of the product and confirm nothing's regressed.

Keep a release log that records the accessibility check status for each release. When the regulator asks how you know your product was accessible on date X, the log is your answer. When something does ship broken, the log tells you when it broke and what changed.
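The 'fail on any new violation' gate amounts to diffing the current scan against a committed baseline. Tools like axe-core and Pa11y produce the violation lists; the gating logic itself is small. A sketch with invented violation identifiers (rule name plus page, a format this sketch assumes rather than any tool's native output):

```python
def new_violations(baseline: set, current: set) -> set:
    """Violations present in the current scan but not in the committed baseline."""
    return current - baseline

# Illustrative "rule:page" identifiers; real scanners emit richer records.
baseline = {"color-contrast:/pricing", "label:/signup"}
current  = {"color-contrast:/pricing", "label:/signup", "button-name:/checkout"}

introduced = new_violations(baseline, current)
if introduced:
    print("FAIL: new accessibility violations:", sorted(introduced))
    # in CI, exit non-zero here so the merge is blocked
else:
    print("PASS: no new violations (baseline debt still needs its own plan)")
```

The baseline file belongs in version control so that shrinking it is a visible, reviewable change, which is also the answer to the pitfall below about the baseline becoming a permanent exception.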
Common pitfalls
- Running accessibility tests once a quarter instead of on every release. By the time you find the regression, half the team has forgotten what they shipped.
- Failing CI on new violations but ignoring the existing baseline. The baseline represents real bugs hitting real users — it deserves a remediation plan, not a permanent exception.
- Skipping the manual check on 'small' changes. A one-line CSS fix can disable focus indicators across the entire site.
How to verify it
Look at your last three releases. For each one, can you point to evidence that an accessibility check actually ran? If not, the process exists on paper only. If yes, look at what was checked. Were they the changed pages, or the same five pages every time? The Self-Assessment Pipeline tracks results over time and lets you compare assessments across releases. Useful when you need to show leadership or a regulator that the trend is heading the right way.
AccessibilityRef tools that help
- Self-Assessment Pipeline — track conformance across releases
- Headings Pro (Batch) — regression scan on multiple URLs
- Focus Order Pro (Batch) — regression scan for focus order
Important Legal Disclaimer
This tool is a self-assessment aid only and does not constitute legal advice or a formally certified compliance assessment. Outputs — including reports, scores, checklists, and accessibility statements — are for internal use and should be reviewed by a qualified legal representative or independent accessibility auditor before being relied upon for regulatory, procurement, or public-disclosure purposes. All assessment risk lies with the internal assessor. AccessibilityRef, its developers, and staff accept no liability for losses arising from use of or reliance on these outputs. Always verify against official sources: the W3C WCAG 2.2 Recommendation, the European Accessibility Act (Directive 2019/882), and your national enforcement authority.