A veteran pastes a 12-page benefits letter into an AI tool and asks, "What does this mean?" For the first time, he understands his own disability rating.
A college student with dyslexia feeds her course syllabus into an AI assistant. It breaks down the jargon, highlights deadlines, and explains what each assignment requires. She passes the class.
A patient with a cognitive disability leaves the hospital with discharge instructions written at a college reading level. She uses AI to translate them into plain language. She takes her medications on time. She recovers.
These aren't power users or early adopters. They're people for whom AI is the difference between comprehension and confusion — between participating in daily life and being shut out by paperwork.
The comprehension gap
Fourteen percent of American adults — roughly 30 million people — score Below Basic in prose literacy. The average hospital discharge summary is written at a college reading level. Benefits letters from the VA and SSA are dense with bureaucratic jargon. University course materials assume academic fluency. The documents that govern daily life are written for people who already understand them.
For the 61 million Americans with disabilities — and particularly for those with cognitive, learning, and neurological conditions — this gap isn't an inconvenience. It's a barrier.
AI comprehension tools are breaking down that barrier. Not in theory. Right now, in millions of households, without anyone's permission or institutional support.
The blocking problem
But as people discover these tools, institutions are — often unknowingly — blocking access to them.
Hospitals deploy patient portals that prevent copy-paste, so patients can't use AI to understand their own discharge instructions. Universities ban AI campus-wide without disability exemptions, cutting off students who rely on AI as a comprehension aid. Government agencies produce benefits paperwork so complex that the people it is meant to serve can't navigate it, and offer no AI-compatible alternative.
Nobody set out to create these barriers. They emerged from legitimate concerns about data security and academic integrity. But the effect is the same: people who depend on AI to understand essential information are losing access to the tool that makes comprehension possible.
The law already covers this
Here's what most people don't know: the legal framework for protecting AI comprehension access already exists.
The Americans with Disabilities Act requires covered entities to provide "auxiliary aids and services" for effective communication with people who have disabilities. This category was deliberately designed to evolve with technology. In 1990, auxiliary aids meant TTY devices. In 2000, screen readers. In 2010, captioned video and accessible websites.
In 2026, AI tools that help people understand their own medical records, benefits letters, and lease agreements may be the next step in this same progression.
No new legislation is required. The legal principle hasn't changed: people with disabilities have the right to understand information that affects their lives. Only the tools have changed.
What AI Access Alliance does
AI Access Alliance exists to build the legal, policy, and public infrastructure that makes this recognition real. We are:
- Building the legal doctrine — a structured argument for why AI comprehension tools qualify as auxiliary aids under existing disability law
- Documenting the evidence — real stories from real people who depend on AI to access essential information
- Forming a coalition — disability-rights organizations, legal experts, accessibility practitioners, and allied institutions who share this position
- Engaging policymakers — working toward federal agency guidance that recognizes AI comprehension tools as a category of auxiliary aid
We are narrowly focused on one use case: using AI to understand information that already belongs to you. Not AI agents acting on your behalf. Not AI generating content. Not AI taking tests. Just comprehension.
Why now
The window for establishing this principle is open now. AI comprehension tools are widely used but legally unprotected. Institutions are writing AI policies today that will determine access for years. Federal agencies are developing AI guidance frameworks.
If disability-rights perspectives aren't part of this process, the default policies will be blanket restrictions — and the people who lose the most will be those who need these tools the most.
Join us
If you've relied on AI to understand something important — a medical document, a legal notice, a school assignment — your experience matters. We're building a case grounded in real stories from real people.
If your organization works on disability rights, accessibility, or inclusive technology, we'd like to talk.