For many patients who suffer from a Traumatic Brain Injury (TBI), the environment can trigger painful and uncomfortable symptoms. So, is it possible to design software experiences in a way that avoids triggering those symptoms?
To find out, we interviewed Dr. Norm Vinson, a user experience and human-computer interaction researcher at the National Research Council of Canada.
What motivated you to become a User Experience (UX) and Human-Computer Interaction (HCI) researcher?
I have a Ph.D. in psychology from Carnegie Mellon University, with a particular focus on a subdiscipline called Cognitive Psychology, which involves studying the way people think. I studied how people try to understand complex sentences, and how their understanding is affected by their ability to hold information in short-term memory. I also looked at spatial memory and distortions in spatial memory, to see whether those distortions were affected by people's interpretation of the images that they were seeing.
This is very similar to human-computer interaction: when you're doing HCI, you're looking at how a person's thinking relates to the problem they're working on with the computer, and how that fits with the information the computer is presenting.
What are some challenges that people who suffer from a Traumatic Brain Injury (TBI) face when using computers?
If you have a TBI, you could have a cortical visual impairment, and that can manifest in many different ways. For example, you could have photosensitivity, trouble recognizing objects, or trouble recognizing faces. The impairment can be oddly specific.
You could also have auditory problems, or you could have motor problems. For user interface design, all these factors place some constraints on how you can design the system.
People who have photosensitivity are sensitive to intense light, and studies have found that about 60% of people with TBI have this problem (Armstrong, 2018). It's quite common.
Looking at a computer screen is even more problematic because it's a backlit device: the light is shooting out into your face. It can be very difficult.
The other issue with photosensitivity is that different users are sensitive to different colors. A black background is good just to reduce the overall amount of light reaching the eyes, but some people have trouble with red, some with blue, some with various other colors. It would be best if the user could set the text color easily.
Motion perception problems and difficulty tracking objects are another common symptom. If you've ever had motion sickness, you can imagine how scrolling or on-screen animations might be upsetting to people who have TBI.
What should software designers incorporate into their products to address some of the challenges that people with TBI face?
People with TBI are more sensitive to some colors than others, so software designers can let them set their own text and background colors to limit discomfort. There is no one-size-fits-all; this is true for people with disabilities in general.
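As a minimal sketch of what per-user display preferences might look like, assuming a simple JSON settings file (all names and defaults here are hypothetical, not from the interview):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class DisplayPrefs:
    """Per-user display settings; defaults favor low light output."""
    background: str = "#000000"   # black background reduces overall light
    text_color: str = "#00FF00"   # user-chosen; some users cannot tolerate red or blue
    reduce_motion: bool = True    # disable scrolling animations and transitions

def save_prefs(prefs: DisplayPrefs, path: str) -> None:
    """Persist the user's choices so they apply on every launch."""
    with open(path, "w") as f:
        json.dump(asdict(prefs), f)

def load_prefs(path: str) -> DisplayPrefs:
    """Load saved preferences, falling back to low-light defaults."""
    try:
        with open(path) as f:
            return DisplayPrefs(**json.load(f))
    except FileNotFoundError:
        return DisplayPrefs()
```

The point is only that the colors live in user-editable settings rather than being hard-coded by the designer.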
While some people with TBI might have vision issues, others have auditory issues, and those vary too. Some people are completely deaf; some are sensitive to certain sounds. There are all sorts of conditions, so designing for accessibility is very difficult and requires a lot of customization. My son is disabled; he has cerebral palsy. When he gets a wheelchair, it has to be designed specifically for him: they make a custom seat that provides support exactly where he needs it. So accessibility is an area where a lot of customization is needed.
Software designers should aim to minimize the time the user spends navigating the software, because the person can only use it for a limited time before feeling discomfort. They should also limit scrolling and possibly fade to black between screen changes.
There are all sorts of visual disturbances that can result from TBI in the parietal or temporal lobes of the brain. If you have damage in those areas, your reading might be completely disrupted, because reading is distributed throughout the brain. Speech output is a good alternative, and today text-to-speech is not a very complicated technology.
An easy way to avoid a login screen, where the user has to read and type in their name and so on, is to use facial or fingerprint recognition. A lot of phones have this now.
Speech recognition software sounds promising. Can you expand a bit more on how a user with TBI would utilize this technology?
We have good examples of how speech recognition works with Alexa and Siri. In the case of a particular app that's directed to users who have TBI, the speech input and output just allow them to avoid looking at the app.
Using speech input, you could make entries: you could say what your triggers are and what your symptoms are, and the phone would know what time it is. You could just say the app name, and the app would start. Then you would say "record trigger" followed by the trigger, or "record symptom" followed by the symptom, and the app would keep track that way. You could then ask for your weekly summary, and it would read back your symptoms and triggers through text-to-speech.
The speech recognition tools from both Google and Microsoft Azure are very good, and they have fairly good data privacy and security controls in the sense that you can localize the data. That matters because in Canada we have legislation requiring all patient data to stay within Canada. Localization lets you comply with the privacy regulations of whichever country you happen to be in.
I think speech recognition software would be quite good for people with photophobia. You could have sound icons and little indicators. But some people with TBI have problems with sound. They’re sound sensitive, so again it requires customization. It really has to be tailored to the specific conditions of the users.
I read an article on data processing where they were analyzing urine samples. They coded the data on a synthesizer that automatically outputs musical notes. They found that the analysts were very good at listening to the music generated by urine samples and identifying problems! A similar thing could be done with mobile apps. When you're reviewing all your various symptoms and triggers, rather than having to look at the data and move your eyes across the screen, you could hear the notes:
doo, doo, doo, doo, doo, doo, doo, doo, doo, doo
It's fairly easy to know that a high note is high, and the low note is lower. That's another possibility.
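The sonification idea above can be sketched by simply mapping each data value onto a pitch range (MIDI note numbers here; the function name and range are illustrative assumptions, and a real app would synthesize the tones):

```python
def to_notes(values: list[float], lo: int = 48, hi: int = 84) -> list[int]:
    """Map data values onto a pitch range: higher value -> higher note.

    lo/hi are MIDI note numbers (48 = C3, 84 = C6). A symptom-severity
    series rendered this way can be skimmed by ear instead of by eye.
    """
    vmin, vmax = min(values), max(values)
    span = (vmax - vmin) or 1  # avoid division by zero on flat data
    return [round(lo + (v - vmin) / span * (hi - lo)) for v in values]
```

An outlier in the series then stands out as a conspicuously high or low note in the melody, much like the urine-sample example.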
What does the future of UX and HCI look like?
Increasingly, people are working on brain-computer interfaces (BCIs). An interesting implementation by Fricke and colleagues (2014) allowed pilots to control the flight path of a simulated aircraft via EEG. For people who have motor disabilities, this is of course great.
There's a lot of interest in human-in-the-loop (HITL) artificial intelligence (AI): getting humans and AI to work together so you can verify that the AI is doing what you want it to do. There are a lot of issues in ethical AI as well.
Recently, we haven't seen any big change in the devices people use. When smartphones came out, for example, new devices were a very important driver of HCI. Virtual Reality (VR) is becoming more popular with Oculus and so on, and their technology is actually already pretty good.
Something that's always been in the background but has never really been done is using AI to customize a user interface through learning: the system learns how the user is using the interface and then customizes it accordingly. That means designing a self-revealing system that is good for novices and then gradually reveals more complicated functions as the person becomes more proficient. No one has really done anything with that; it's always something in the background that could be done. Think of Microsoft Word: so many functions, so many things. It's very complicated, and people don't even use 10% of it. Microsoft Word would be a good candidate for this kind of self-revealing system.
In conclusion, it is possible to create software experiences that accommodate patients suffering from TBI. The solution is to provide customization to address the vast array of triggers that affect different patients in different ways.
- Armstrong, R. A. (2018). Visual problems associated with traumatic brain injury. Clinical and Experimental Optometry, 101: 716-726.
- Fricke, T., Zander, T. O., Gramann, K., & Holzapfel, F. (2014, September 16-18). First pilot-in-the-loop simulator experiments on brain control of horizontal aircraft motion. Deutscher Luft- und Raumfahrtkongress 2014.