Disability as UX Error

A blind man walks into a bar. The bartender asks him what he’s having. The blind man points to the shelf of expired grape juice. The bartender, amazed by the man’s ability to point at an over-aged grape water, asks how the man is able to navigate the space so effortlessly.

The blind man tips his shades, revealing the glow of a screen showing whatever the lenses are aimed at. What he reveals is an emergent technology that allows many who are legally blind to see without much impairment. His VR device uses front-facing cameras to capture whatever his glasses are looking at from afar and display it right in front of him.

This works because not all who are legally blind are completely blind. To be legally blind, your vision with corrective lenses must be 20/200 or worse. And what does this mean? Well, if you have 20/20 vision, the smallest letters you can read from 20 feet away are the ones a person with normal vision reads at that distance. But if you have 20/200 vision, the standard for legal blindness, the letters you can just read from 20 feet can be read from 200 feet by someone with 20/20 vision.

Here’s that eye chart we are asked to peek at when we get our vision checked. The right side shows vision scores. What line can you usually read?

Vision is a weird thing. I use a -4.25 and a -4.00 prescription for my contacts, which means unaided my vision is roughly 20/425 and 20/400 respectively. With the magic of curved lenses I am able to navigate the world with ‘normal vision’, which we define as 20/20. Any time I want to step into the legally blind perspective, all I have to do is navigate a day without my lenses. And that’s incredibly challenging.
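That diopter-to-Snellen translation is only a rough rule of thumb, not a clinical conversion: treat the prescription strength times 100 as the Snellen denominator. As a quick sketch (the function name and the rule itself are my own illustration):

```python
def approx_snellen(diopters: float) -> str:
    """Rough rule-of-thumb conversion from a nearsighted contact
    prescription to an unaided Snellen score: the denominator is
    about |diopters| * 100. Illustrative only, not a clinical formula.
    """
    denominator = round(abs(diopters) * 100)
    return f"20/{denominator}"
```

So a -4.25 lens corresponds to roughly 20/425 under this rule.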

However, many vision-impaired and legally blind individuals can still see, as I can without my corrective lenses. And many of these individuals are nearsighted. When a VR display rests just inches from your eyes, being nearsighted isn’t much of a problem. In fact, many people who wear glasses find that they no longer need them when operating in VR spaces. Being nearsighted in VR is not typically noticeable. With the proper technology, nearsightedness is no longer a disability.

And with interactions trending toward the virtual world, there are more opportunities for user experience designers to create spaces where accessibility removes disability. While front-facing cameras on a VR device can provide an accessible user interface for more than just the blind, it is an even smaller technological burden to adapt the virtual spaces we already occupy so heavily into accessible environments.

Chart showing internet usage in January 2019. The worldwide average is 6 hours and 42 minutes per day. And this is pre-COVID era data. How much time do you spend on the internet per day?

Software devs and frequent internet users alike may already be familiar with the ADA and Section 508 of the Rehabilitation Act, which require closed captioning. Different facets of these laws apply to different styles of visual media. For example, live streams, even public-facing ones, are not required to have closed captioning, even though modern technology makes it easy to do with modest accuracy. While different media companies face different sides of these laws and regulations, it is recommended to use closed captioning whenever possible. And with the current increase in live video content, there are helpful guides for adding captions to Facebook Live, Livestream and video conference calls.

Just because it is not required does not mean it should not be. Law lags behind ethics, so make sure your industry or production team leads with ethics that anticipate laws. If you are designing web content, you can follow the WCAG guidelines (with WAI-ARIA for rich internet applications), which can be summed up with one premise: make all visual content equally accessible without visuals, and vice versa.
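The spirit of that premise is checkable in code. Here is a minimal sketch (my own illustration, not a WCAG conformance tool) that flags `<img>` tags shipped without any alt attribute, using only the Python standard library. Note that decorative images should still carry an explicit empty `alt=""` rather than none at all:

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects <img> tags that have no alt attribute at all.
    An explicit alt="" is allowed: WCAG treats it as a deliberate
    marker for decorative images."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "alt" not in attr_map:
                self.missing.append(attr_map.get("src", "<unknown>"))

def audit(html: str) -> list:
    """Return the src of every image missing an alt attribute."""
    auditor = AltTextAuditor()
    auditor.feed(html)
    return auditor.missing
```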

This simple summary of hundreds of pages of accessibility legalese extends beyond the written guidelines and into UX design. It is one thing to have captions for your content; it is another to do them well. FCC online video guidelines require 99 percent accuracy, exact wording, time synchronization, completeness, and placement that does not obscure other important content. But captions should strive for more. WCAG guidelines require all visuals to have text descriptions, but good UX writing can paint a more complete picture in words.
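Caption accuracy of the kind the FCC describes is typically measured word by word against a verbatim transcript. As a rough sketch of how that measurement works (my own illustration using word-level edit distance, not the FCC’s official metric):

```python
def word_accuracy(reference: str, hypothesis: str) -> float:
    """Word-level accuracy between a verbatim transcript and a
    caption track: 1 minus the word error rate, computed with a
    classic dynamic-programming edit distance over words."""
    ref = reference.lower().split()
    hyp = hypothesis.lower().split()
    # d[i][j] = edits to turn the first i ref words into the first j hyp words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return 1 - d[len(ref)][len(hyp)] / max(len(ref), 1)
```

One wrong word in a four-word line already drops accuracy to 75 percent, which is why a 99 percent bar is stricter than it sounds.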

Captions need to surpass the basic coverage of dialogue that so often goes wrong. Using different fonts and colors, captions can carry expression and emotion. Red font paired with proper priming tools, such as emojis or the scene itself, can show love, anger, or even signify a character. Motifs and themes can be enhanced with subtitles for all viewers. Have a scene with an alien language? Use a less readable font to show confusion. Have a dramatic scene in a cold climate? Attach captions to the visible clouds of breath as characters speak. Captions and artistic subtitles can enhance communication for everyone. Let’s examine some examples. The first I crafted in about two minutes; the other two are from the 2004 film Night Watch.

A narrated scene from Amelie, with one caption replaced by the same text in a different font to show emotion and align the protagonist with that emotion. In this scene the words ‘love’ and ‘her’ are the same color, while ‘love’ is in a different font.
A scene from Night Watch where a character transitions from an owl to a human. Feathers fall over the subtitles, obscuring them as they say, “Don’t look at me!”
A vampire calls to her victim as his nose bleeds into the water of a pool. The blood turns into the text “Come to me…”; when he emerges from the water, the text disappears.

There are plenty of opportunities to use captions and subtitles as visual art in film. They can enhance immersion, provide expression, create familiarity with characters, or add to motifs and themes. The same strategies can be invoked to enrich prose experiences for vision-impaired readers. Try adding sound files to embellish scenes. Have a web story that takes place in an airplane? Add ambient airplane noise. Following an epic quest to destroy a ring on the side of a volcano? Add rumbling sounds. Have your protagonists entered a loud club where they can barely hear each other? What’s playing?


With emergent technologies such as VR and haptic suits, there is the possibility to make experiences that simultaneously open up accessibility and enhance every user’s experience. Imagine watching Lord of the Rings without sound: rumble packs can create epic moments for charges into battle, or let you feel the cold of entering Shelob’s lair with Frodo. Haptic suits such as the Teslasuit can do all this and more. With temperature control, rumble packs and electric stimulation, Teslasuit does everything but add smells, which are also available, albeit in lower demand.

Image of the Teslasuit showing sensors for vibration (haptic feedback), electric stimulation, climate control, and motion capture. I had an amazing opportunity to try this suit in action while working for MXTreality, an all-things-mixed-reality company.

There are risks to adding more features through emergent technologies. Smell-o-vision can trigger allergies or, in acute cases, nausea. Electric stimulation may be bad news for people with pacemakers or preexisting conditions. However, risk in immersive media is nothing new for the entertainment industry, and we can mitigate it with proper warning labels and cautionary yet simple agreements.

From a user experience perspective, we must make sure this new tech adds to the enjoyment of the experience without creating fruitless barriers to entry that may even pose dangers. Media has never been without risk; whether artistic risks or the risks of breaking into emergent technologies, risk-taking drives trends. As early as 1997, an episode of Pokemon caused epileptic seizures in many viewers when Pikachu used “Thunderbolt” and the screen flashed red and blue rapidly. Today there are tools to catch possible seizure-triggering moments. The Photosensitive Epilepsy Analysis Tool (PEAT) is free for content producers everywhere to verify their content will not trigger seizures. Similar guidelines can be applied to VR experiences, but none yet exist for haptic bodysuits or smell-o-vision. It’s safe to say that as these products become more available to consumers we will have some growing pains to overcome.
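PEAT analyzes video directly, but the underlying idea is simple to state: WCAG’s general flash threshold forbids more than three flashes in any one-second window. Given per-frame luminance values, a counting sketch might look like this (my own simplification, not PEAT’s actual algorithm, which also weighs flash area and red saturation):

```python
def count_flashes(luminance, threshold=0.1):
    """Approximate flash count over a frame sequence: each large
    luminance swing is a transition, and a flash is roughly a
    pair of opposing transitions (dark -> bright -> dark)."""
    transitions = sum(
        1 for prev, cur in zip(luminance, luminance[1:])
        if abs(cur - prev) >= threshold
    )
    return transitions // 2

def exceeds_flash_threshold(luminance, fps):
    """Flag any one-second window containing more than 3 flashes,
    echoing WCAG's general flash threshold."""
    window = int(fps)  # frames spanning one second
    return any(
        count_flashes(luminance[i:i + window + 1]) > 3
        for i in range(max(len(luminance) - window, 0) + 1)
    )
```

A screen strobing every frame at 24 fps trips the check immediately; a steady image never does.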

Accessibility is greatly enhanced through emergent technologies that highlight how much disability depends on setting. Motor, visual and cognitive constraints can be eliminated through careful, conscientious user experience design. It just takes time. The payoff for that time enhances the value of your content’s community, the quality of the media, and your content’s overall reach. Whether you are producing VR games, software, video content or podcasts, taking the time to add sensory experiences to your content increases its quality and provides you a larger market audience.
