If you have ever had trouble reading something because of the size or color of the font, or struggled to hear the person next to you at a loud concert, then you have had the same accessibility experience many people with disabilities deal with every day.
People with vision, hearing, and motor impairments have a harder time accessing information on the web. The main problem is that most websites are not designed with these use cases in mind.
According to the World Health Organization, approximately 1 billion people around the world have some form of disability. This means as much as 15% of the world's population is unable to effectively use a large number of websites.
How can we ensure all users have equal access to content? How can developers, as they create websites, express themselves inclusively on the web?
Accessibility work, often abbreviated a11y, strives to improve the web experience for those with disabilities in an effort to ensure web content and experiences are available and equal for everyone.
Rob Dodson, a Developer Advocate for Chrome at Google, is one of the advocates working to better web accessibility.
Fundamentally, accessibility is a person's ability to access something, whether that is information, a physical space, or even the ability to work comfortably. When we speak of accessibility for the web, we usually mean technical access: the ability to read a web page, book a plane ticket online, or order food from a website. The goal of accessibility work is to help people interact with their environment and to remove the impediments they may experience in doing so.
It is easy for those without disabilities or impairments to take abilities like vision for granted. If you can see your computer screen clearly and consume information, you rarely think about how your experience would change if you could not see the website you are viewing.
Most accessibility problems stem from user experience design, and can be as simple as color contrast on a website. If text on a page has low contrast with the background color, that is not just an access issue but also a usability issue: anyone viewing the page will have a hard time seeing the content clearly.
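To make this concrete, WCAG 2.0 asks for a contrast ratio of at least 4.5:1 between body text and its background (level AA). A minimal sketch, with illustrative color values (the approximate ratios below follow from the WCAG relative-luminance formula):

```html
<!-- Fails WCAG 2.0 AA: light gray on white is roughly 2.8:1 -->
<p style="color: #999; background: #fff;">Hard to read for many users.</p>

<!-- Passes AA (4.5:1 minimum for normal text): roughly 12.6:1 -->
<p style="color: #333; background: #fff;">Comfortable for everyone.</p>
```

Automated checkers flag exactly this kind of issue, and fixing it improves readability for every visitor, not only those with low vision.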
User efficiency also comes into play when designing websites for accessibility. Considering the experience of users with motor impairments, or a limited range of motion, makes it even clearer why accessibility matters. These users may only be able to move a finger to activate a switch device, or may navigate the page with a sip-and-puff device. For them, the ability to skip a menu and use the site with fewer activations is fundamental to efficient navigation.
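A common pattern for this is a "skip to main content" link placed first in the tab order, so the site's menu can be bypassed in a single activation. A minimal sketch (the element ids and class name here are illustrative):

```html
<body>
  <!-- First focusable element on the page: lets switch and keyboard
       users jump past the navigation in one activation -->
  <a href="#main-content" class="skip-link">Skip to main content</a>

  <nav>
    <!-- site menu with many links -->
  </nav>

  <main id="main-content">
    <!-- page content; activating the link above moves focus here -->
  </main>
</body>
```

The skip link is often visually hidden until it receives keyboard focus, so it helps switch and keyboard users without changing the visual design for everyone else.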
But we must look further than the simple goal of helping everyone have equal access to the web. Accessibility is one of the key pieces in building the future of the web. If we look at many of the cutting-edge products today, we realize that many of them, such as Amazon Alexa or Google Home, began as accessibility technology. As AI becomes more integrated into our everyday lives, we often take for granted that the text-to-speech capabilities we rely on were once used only in a calculator for the blind.
Following this idea that accessibility work accelerates the advancement of technology, we must note the significance of semantics when developing websites. Adding semantics to a web page will not only improve the experience for blind and visually impaired users today, by helping screen readers and braille displays work effectively, but may also lay a foundation for how future AIs understand and interact with our pages. As we build AIs, we must teach them what it means to be a web page, and that starts with semantics.
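For example, native landmark elements and headings give a screen reader (or any machine consumer) a structure it can announce and jump between, where generic containers give it nothing. A minimal sketch contrasting the two:

```html
<!-- Generic markup: assistive technology sees anonymous boxes -->
<div class="top">
  <div class="links"><!-- menu links --></div>
</div>
<div class="content"><!-- article text --></div>

<!-- Semantic markup: landmarks and headings users can navigate by -->
<header>
  <nav aria-label="Primary"><!-- menu links --></nav>
</header>
<main>
  <h1>Article title</h1>
  <article><!-- article text --></article>
</main>
```

Both versions can be styled to look identical; only the second one tells software what each region of the page actually is.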
Voice interaction, a technology increasingly relied on for communicating with digital assistants and AIs, is also an assistive technology. It allows users with motor or vision impairments to access content and perform actions more easily than directly manipulating a switch device or screen reader. And much as voice interaction can help a blind person navigate a world they cannot see, it can also guide users through a foreign country where they cannot read or understand the native language.
Soon, AI will be able to help us around our homes; ideas like this are being beta tested today. If Amazon Alexa could reliably walk a user through fixing a toilet or installing plumbing in the middle of a project, we would have successfully leaped into the future. With more semantic content available for AI to consume, this type of assistance could become possible sooner.
While developers work to better understand the importance of accessibility, the W3C's Web Accessibility Initiative continues to improve web standards such as WAI-ARIA (Accessible Rich Internet Applications) and WCAG 2.0 (Web Content Accessibility Guidelines). Tools like aXe and WAVE can be used today to audit a site and see how well it meets these standards.