Digital accessibility for people with hearing impairments is crucial for equitable access to audio and video content on the internet. This has become increasingly important as more web pages contain videos and other interactive elements.
Here are some easy ways to ensure that people with hearing impairments can understand all the digital content you publish on your websites or applications.
Video accessibility, or the lack of it, is generally the most significant access barrier for users who are deaf or have hearing loss. Web development teams must include captions on all videos that contain spoken content or other meaningful audio.
Captions are text versions of the dialogue and sounds in a video, displayed in sync with what is on screen. Captions must include both dialogue and meaningful non-speech sounds, such as music and sound effects.
Videos can have closed captions, which users can turn on or off, or open captions, which are always on the screen. While both options are acceptable, closed captions are generally preferred because they give users the choice.
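As a concrete illustration, closed captions on the web are typically attached to an HTML5 video with a track element that points at a WebVTT caption file. The sketch below builds that markup as a string so each piece is easy to see; the file names are invented for the example.

```typescript
// Sketch of closed-caption markup for an HTML5 video player.
// The source and caption file names are assumptions for illustration.
function captionedVideoMarkup(videoSrc: string, captionsSrc: string): string {
  return [
    `<video controls>`,
    // The video source itself:
    `  <source src="${videoSrc}" type="video/mp4">`,
    // The captions track: kind="captions" (not "subtitles") signals that
    // non-speech sounds are included; "default" preselects the track.
    `  <track kind="captions" src="${captionsSrc}" srclang="en" label="English" default>`,
    `</video>`,
  ].join("\n");
}

const markup = captionedVideoMarkup("intro.mp4", "intro.en.vtt");
console.log(markup);
```

Because the track is a separate toggleable layer rather than text burned into the video frames, these are closed captions: users can still turn them off through the player's caption menu.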
A relatively simple way to learn about adding captions is through YouTube, assuming the video is hosted on a YouTube account. When adding captions via YouTube, add them manually rather than relying on YouTube’s automated captioning feature (the same recommendation applies to all major video player tools and platforms).
While automated captioning has improved and continues to progress, it is still not nearly as accurate as manually adding captions. Inaccurate captions can lead to people with hearing loss (and potentially others) misunderstanding video content or having content completely misrepresented.
One strategy for producing accurate captions is to run the automated captioning feature and then manually edit the results. This approach can save teams a significant amount of time while still giving them confidence that their video content is accessible. In addition to prerecorded videos, all live videos must have captions.
Videos and/or audio that automatically play when webpages load are also accessibility barriers for people with hearing impairments, particularly people who are deafblind and use screen readers to navigate the internet. The auto-playing audio competes with the screen reader’s speech output, which can make a page confusing or impossible to navigate. Auto-playing content can also distract people with certain kinds of cognitive disabilities.
The best practice for developers is to remove all content that auto-plays on page load. If a website does include auto-playing content, that content should not play for more than three seconds.
If the content does play for more than three seconds, WCAG requires a mechanism to pause the content and/or control its volume. That mechanism must be easy to reach and able to receive keyboard focus for users of assistive technology, not just mouse users. If there is a volume control, it must be independent of the overall system volume level.
While text-based transcripts for video content are not explicitly required for WCAG A or AA conformance, they are critical for deafblind users. Users who are deafblind cannot see captions or hear audio descriptions, so they must rely on transcripts for equitable access to video content, using refreshable braille displays to convert the transcript text to readable braille.
There are three types of transcripts:
- Basic transcripts are text versions of speech and non-speech audio information.
- Descriptive transcripts also include descriptions of the visual information needed to understand the content.
- Interactive transcripts highlight phrases as they are spoken and allow users to select text in the transcript to skip to that point in the video.
Transcripts are also essential for users with cognitive disabilities or memory disorders.
Transcripts can also be helpful to anyone who wants to skim through video content or search for specific instances in videos.
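To make the interactive variant concrete, here is a small sketch of its two core behaviors: seeking the video when a user selects a transcript segment, and finding which segment to highlight at a given playback time. The cue data, timings, and function names are invented for illustration.

```typescript
// One transcript segment, stored with the time (in seconds) its audio starts.
interface Cue {
  start: number;
  text: string;
}

// "Skip ahead": when a user selects a transcript segment, return the time
// the player should seek to (e.g. by setting video.currentTime).
function seekTimeFor(cues: Cue[], selectedIndex: number): number {
  const cue = cues[selectedIndex];
  if (cue === undefined) throw new RangeError("no such transcript segment");
  return cue.start;
}

// "Highlight as spoken": given the current playback time, return the index
// of the segment being spoken (-1 before the first cue). Assumes cues are
// sorted by start time.
function activeCueIndex(cues: Cue[], playbackTime: number): number {
  let active = -1;
  for (let i = 0; i < cues.length; i++) {
    if (cues[i].start <= playbackTime) active = i;
  }
  return active;
}

// Invented example data; note the non-speech sound is transcribed too.
const cues: Cue[] = [
  { start: 0, text: "Welcome to the demo." },
  { start: 4.5, text: "[upbeat music]" },
  { start: 9, text: "First, open the settings panel." },
];
```

In a page, activeCueIndex would be called from the video’s timeupdate event to move the highlight, and seekTimeFor from a click or keypress on a transcript segment.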
Multiple feedback mechanisms
Providing multiple options for users to request help or give feedback, especially about accessibility issues, is also essential. Users who are deaf or hard of hearing often have difficulty using a telephone, or cannot use one at all, because they may not be able to hear the other person.
Providing an email address and/or an online chat feature lets these users get the support they need, or give meaningful feedback, as easily as possible.
We strongly encourage organizations that work with Allyant to post a strong Accessibility Statement that serves as a reference for these feedback mechanisms, and we assist them in doing so. We even have a recommended template we would be happy to discuss with your team.
Use universally recognized icons and symbols
People who are deaf often communicate via a form of sign language, such as American Sign Language (ASL), so written English may be a second language. Since language barriers can arise when interacting with deaf users, it’s strongly recommended to include universally recognized icons and symbols in the web design process.
For example, you could use a microphone icon to indicate a voice input field, or a magnifying glass to let users know they’re in a search field, rather than a ‘custom’ icon whose meaning is far less obvious.
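An icon-only control also needs a text alternative so its purpose is exposed to screen readers, not just conveyed visually. The sketch below builds such a button’s markup as a string; the class name and function are assumptions for illustration.

```typescript
// Sketch: pair a universally recognized icon with an accessible name.
// The icon span is hidden from assistive technology (it is decorative),
// while aria-label gives the button a spoken/braille name.
function iconButtonMarkup(iconClass: string, accessibleName: string): string {
  return (
    `<button type="button" aria-label="${accessibleName}">` +
    `<span class="${iconClass}" aria-hidden="true"></span>` +
    `</button>`
  );
}

// A search button using the familiar magnifying-glass symbol
// ("icon-magnifying-glass" is an invented CSS class):
const searchButton = iconButtonMarkup("icon-magnifying-glass", "Search");
console.log(searchButton);
```

Sighted users recognize the magnifying glass instantly, and screen reader users hear “Search” — the same meaning through two channels.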
Universal design benefits everyone
As with these and many other web accessibility strategies, universal design benefits all users, not only people with disabilities.
Contact our team of experts to get started on providing equitable access for all!