In my previous post, I spoke with Ruth MacMullen, an academic librarian and copyright specialist from York, about her experience of being deaf and how it affects how she uses the web. In this next post in the series, Ruth shares some of the things that make life easier for her on the web, and we offer some practical tips on how you can improve accessibility for deaf and hard of hearing people.
Provide subtitles/captions
“Subtitles, that’s the really obvious one”, remarks Ruth as we discuss the things that help her the most. In my previous post, she described the frustration of viewing a video posted on Facebook that lacked subtitles.
Subtitles or captions are the words shown at the bottom of videos that explain what’s being said or what’s happening. The term “subtitles” typically refers only to spoken content, whereas “captions” also includes descriptions of non-speech sounds, such as music, applause and laughter. Outside of North America, the terms are often used interchangeably, so when Ruth refers to subtitles on videos, she means captions in this broader sense.
This article by Tom Mitchell at JISC highlights the benefits of captions for accessibility, and he describes how you can add them to videos on YouTube and Facebook.
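For videos you host yourself, HTML lets you attach a caption track directly to the player. A minimal sketch (the filenames here are placeholders):

```html
<video controls>
  <source src="lecture.mp4" type="video/mp4">
  <!-- kind="captions" signals that the track also describes
       non-speech sounds, not just dialogue -->
  <track src="captions-en.vtt" kind="captions"
         srclang="en" label="English" default>
</video>
```

The `default` attribute makes the captions visible without the viewer having to switch them on, which is a sensible default for accessibility.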
Check the accuracy of captions
YouTube and Facebook (in the US) offer free automated video captioning—but since there aren’t any humans involved, the captions they produce can be wildly inaccurate. “I’ve seen ones where irrelevant and inappropriate words, including expletives, came up!” remarks Ruth. “I was quite surprised there was no protection or filtering to stop those. I’ve also seen one where ‘Nazi’ and the ‘S-word’ came up in relation to an information literacy video—all in one sentence!”
Google provides clear instructions on how you can review and edit automated captions on YouTube. If you rely on automated captions, take the time to review and correct them. “It makes such a difference to a deaf person if a little bit of effort has been made to clean up subtitles”, explains Ruth. “You can see they are so much more accurate.”
Make sure that captions are synchronised with the audio
One advantage of using automated captioning is that the captions are automatically synchronised to the audio. If you instead generate your captions from an existing transcript, make sure that each line appears on screen at roughly the same time as the corresponding audio.
“When you’re deaf, you want it to run in time” explains Ruth. She cites two examples of good practices: the Lynda.com training videos and the British Universities Film & Video Council’s BoB (Box of Broadcasts) service. “I sometimes notice a couple of seconds’ delay,” Ruth remarks, “but on the whole, it’s good.”
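The caption file itself carries the timing. In the WebVTT format used on the web, each cue states exactly when it should appear and disappear, so synchronisation is a matter of getting these timestamps right (the timings and wording below are invented for illustration):

```
WEBVTT

00:00:01.000 --> 00:00:04.500
Welcome to this week's lecture.

00:00:04.500 --> 00:00:08.000
[applause]
```

Note that the second cue describes a non-speech sound, which is what distinguishes captions from plain subtitles.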
Provide a summary of audio and video content
In my previous post, Ruth explained how a brief summary of what a video is about can be just as important as captions or a transcript. A video’s summary may be as simple as a list of topics or songs that the video includes, which Ruth likens to “alternative text for someone with a hearing impairment”.
Ruth recalls watching a concert on YouTube: “There’s a piano player playing Gershwin songs and I’ll think, ‘Which one do I want to listen to?’ I can’t skip through and listen to fragments, I need to know what’s in there before I make a choice to watch it. And I feel like I don’t have that choice if I don’t know what’s in it.”
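In markup terms, such a summary can simply accompany the embedded video as ordinary text. A sketch, with an invented video and track listing:

```html
<figure>
  <iframe src="https://www.youtube.com/embed/VIDEO_ID"
          title="Piano concert: Gershwin songs"></iframe>
  <!-- A plain-text summary lets viewers decide whether to
       watch before committing to the audio -->
  <figcaption>
    Songs in this concert: Summertime, I Got Rhythm,
    Embraceable You, Rhapsody in Blue (excerpt).
  </figcaption>
</figure>
```

Because the summary is plain text, it also benefits search engines and screen reader users, not only deaf and hard of hearing viewers.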
Make sure that audio doesn’t play automatically
Deaf and hard of hearing people may have a hard time gauging how loud videos are, particularly when they play automatically and unexpectedly. “You see a video on your Facebook feed and there’s no sound”, explains Ruth. “If you click on it to make it big, the sound plays, but you don’t realise. Sometimes I have my hearing aid off and I feel like the video is blaring out sound.”
Ruth describes how frustrating this can be when she doesn’t even know that the webpage includes an auto-playing video: “Sometimes you’re browsing the web and you get video adverts that play sound and you don’t realise. That’s caused a couple of embarrassing situations in the office.” So try to avoid playing audio automatically, but if it’s unavoidable, make sure that users have an easy way to turn it off.
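If autoplay genuinely can’t be avoided, starting the video muted and giving the viewer visible controls at least keeps the sound in their hands. A sketch, assuming a self-hosted video:

```html
<!-- Muted autoplay: the video moves, but no sound plays
     until the viewer explicitly opts in via the controls -->
<video autoplay muted controls>
  <source src="advert.mp4" type="video/mp4">
</video>
```

Most modern browsers already block unmuted autoplay for exactly the reasons Ruth describes, so this pattern is also the one most likely to work reliably.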
Structure your content
In my previous post, Ruth emphasised the importance of structured content, such as headings, paragraphs, and lists. “I rely so much on visual information”, she explains. “The more clearly it is structured and the more clear what it is, the better.” Ruth is currently undertaking a law degree through distance learning, which involves using a virtual learning environment. “They’re really good about their use of headings; they’re very descriptive. If they weren’t, I’d feel very lost because I completely rely on reading for understanding.”
Using semantic HTML helps websites remain flexible and extensible. It makes the content reusable and conveys more meaning to assistive technologies. As Ruth points out, “This is good practice generally, but for people who are completely reliant on visual information, even more important.”
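A sketch of what this looks like in practice, using semantic elements rather than generic containers (the headings and content are invented for illustration):

```html
<article>
  <h1>Copyright for distance learners</h1>
  <p>An overview of the licences that apply to recorded lectures.</p>

  <!-- Descriptive headings in a logical hierarchy let readers
       scan the page structure visually or via assistive tech -->
  <h2>What the law says</h2>
  <ul>
    <li>Fair dealing</li>
    <li>Educational exceptions</li>
  </ul>
</article>
```

The same structure that helps Ruth scan the page visually is what screen readers use to build a navigable outline, so one piece of markup serves both audiences.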
Keep your content flexible
Underscoring each of these suggestions is an emphasis on clear communication and flexibility of content. Mobile technology has proven particularly useful to Ruth in this respect. “I think if you speak to any person with a disability, they will tell you that an iPhone or an Apple Watch is a game changer.”
Ruth shares a blog post by Molly Watt, an inclusive technology advocate who is deaf-blind. Molly’s post describes her experience of using an Apple Watch and the way that its haptic feedback has proven particularly useful for navigating.
The benefit that Ruth sees in such technology is being able to curate information in a way that suits her. “I can’t hear a radio very well if it’s in a room, but if I plug in my special Bluetooth headphones to my iPhone, I can listen to a radio programme. For me, it’s a bit of a lifesaver in terms of being able to get information, get content, and modify it in a way that works for me.”
Content that’s flexible enough to be delivered by captions, indexed by transcripts, enlarged by screen magnifiers, and rendered by screen readers is a key principle of web accessibility. And while many of the tips in this post are useful for deaf and hard of hearing people, they ultimately benefit everyone.
Ruth is the copyright and licences officer at York St. John University in the UK, where she specialises in the legal aspects of information management. She blogs at librarianinlawland.com, and she tweets at @thehearinglib.
Comments
Hello David, very good article!
I just wanted to mention that the Action Plan for the implementation of the UN Convention on the rights of people with disabilities became a reality this year. Now that the plan is in place, work on accessibility in the Netherlands and across Europe can get started. The plan describes how the government, the ministries, municipalities and businesses can make concrete efforts to implement the UN Convention guidelines. As a subtitling company, we work actively to promote accessibility and the use of WCAG (Web Content Accessibility Guidelines of 28 February 2017) among businesses, governments and municipalities. But this is a slow process.
When I look at some of the large websites here in Holland, such as newspapers and government sites with a lot of video content, there are no subtitles. I think there is a need for legislation that makes accessibility mandatory for certain popular or widely used websites. Government websites here in Holland have been required to comply since 2010, but only about 10% really do. This does not set a good example for business.
Anyway, I enjoyed your article.