UMaine Researcher Developing Tool That Incorporates Sound and Touch to Aid Blind and Low-Vision Community

Nicholas Giudice knows what it’s like to be vision-impaired, and he’s using his personal experiences and research background to develop an affordable tool to help others in the blind and low-vision community in school, on the job and with independent travel.

Improving access to and the comprehension of visual material such as charts and maps is the focus of a National Science Foundation-sponsored project led by Giudice, an associate professor in the University of Maine’s School of Computing and Information Science. The research aims to develop and evaluate an intuitive, low-cost tool to aid the interpretation of graphic data for those who can’t rely on vision to do so.

The ability to effectively use and accurately understand graphs, figures and other visual representations of numeric data is critical for success in the classroom and at work, Giudice says. Spatial learning and navigating in and outside the home also frequently depend on the use of maps and other graphical aids, which can be challenging for blind people to use, he says.

The World Health Organization estimates vision impairment affects as many as 285 million people worldwide, with numbers expected to rise due to the aging population. About 11 percent of blind or low-vision people have a bachelor’s degree and 75 percent are unemployed, according to Giudice. He says providing blind people with a way to process graphics will boost their employability, as well as confidence, independence and overall quality of life.

The tool has the potential to reduce the information gap between blind people and their sighted peers, Giudice says, giving an example of a teacher displaying a diagram to a class. Instead of relying on descriptions from the teacher, a student who can’t see could pull up the same image on a handheld device and use touch and audio to comprehend what the other students see.

“Many jobs deal with graphics and interpreting them,” Giudice says. “If this tool is developed, deployed and broadly implemented, it would make blind people more confident. Employers would see it’s no big deal if someone can’t see a graphic as long as they can understand and interpret it, and can act upon it.”

Gaining access to these forms of information is often difficult and expensive, Giudice says, citing as an example a printer that costs thousands of dollars, produces tactile graphics and serves no other purpose.

By developing software that runs on commercial, multifunctional and portable hardware such as smartphones and tablets, Giudice and his team aim to make the tool readily available and comparatively inexpensive.

Screen-reading software that uses text-to-speech is helpful for written material but lacks the ability to convey graphic elements, Giudice says. His proposed tool would present graphics on the touchscreen of a device equipped with a vibration motor.

The tool would provide vibrotactile feedback, touch combined with vibration, when users’ fingers encounter graphic elements rendered as points, lines or regions, similar to feeling traditional hard-copy graphics. Sound would enhance the vibrotactile information, creating a vibro-audio approach to materials traditionally processed strictly by vision.
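On a smartphone or tablet, that pairing might be wired up roughly as follows. This is a minimal sketch in Kotlin using Android’s standard Vibrator, ToneGenerator and touch APIs; the GraphicModel interface and the specific pulse and tone values are illustrative assumptions, not the team’s actual design.

```kotlin
import android.content.Context
import android.media.AudioManager
import android.media.ToneGenerator
import android.os.Vibrator
import android.view.MotionEvent
import android.view.View

// Hypothetical model of the on-screen graphic: reports which element
// (point, line or region), if any, lies under a screen coordinate.
interface GraphicModel {
    fun elementAt(x: Float, y: Float): String?  // e.g. "line", "region", or null
}

class VibroAudioView(context: Context, private val model: GraphicModel) : View(context) {
    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    private val tones = ToneGenerator(AudioManager.STREAM_MUSIC, 80)
    private var lastElement: String? = null

    override fun onTouchEvent(event: MotionEvent): Boolean {
        if (event.actionMasked == MotionEvent.ACTION_DOWN ||
            event.actionMasked == MotionEvent.ACTION_MOVE) {
            val element = model.elementAt(event.x, event.y)
            if (element != null && element != lastElement) {
                // Vibrotactile cue: a short pulse as the finger crosses
                // onto a graphic element, mimicking a raised hard-copy line.
                @Suppress("DEPRECATION")
                vibrator.vibrate(40L)
                // Audio cue layered on top of the vibration: the
                // "vibro-audio" pairing described in the article.
                tones.startTone(ToneGenerator.TONE_PROP_BEEP, 50)
            }
            lastElement = element
        }
        return true
    }
}
```

In a sketch like this, the vibration conveys where an element is, while the tone (or, in a richer design, synthesized speech) could convey what it is, such as an axis label or a data series.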

Giudice would like to eventually pair the tool with a real-time map that automatically updates using GPS when the user moves, helping the 70 percent of people with little to no vision who don’t navigate independently outside their home.
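A rough sketch of how such live updating might look on Android, assuming a recent SDK and that location permission has already been granted; LocationManager and LocationListener are the standard platform APIs, while recenterMap and the update intervals are hypothetical placeholders:

```kotlin
import android.annotation.SuppressLint
import android.content.Context
import android.location.Location
import android.location.LocationListener
import android.location.LocationManager

// Hypothetical hook into the vibro-audio map view: re-centers the
// rendered map on a new latitude/longitude. Illustrative only.
fun recenterMap(latitude: Double, longitude: Double) { /* redraw map here */ }

@SuppressLint("MissingPermission")  // assumes location permission is already granted
fun startLiveMapUpdates(context: Context) {
    val manager = context.getSystemService(Context.LOCATION_SERVICE) as LocationManager
    val listener = object : LocationListener {
        override fun onLocationChanged(location: Location) {
            // Each GPS fix re-centers the tactile map, so the rendering
            // tracks the user as they move through the environment.
            recenterMap(location.latitude, location.longitude)
        }
    }
    // Ask for a fix at most every 2 seconds or every 5 meters of movement.
    manager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 2000L, 5f, listener)
}
```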

Giudice’s research in spatial informatics and cognitive neuroscience is guided by his own experiences of living with vision impairment. The core of his research is multimodal spatial cognition — how we learn about, think about and act in space using different senses. Through personal experiences and research, Giudice has found many spatial tasks done with vision can be completed equally well using other senses.

“If you touch a desk as opposed to seeing it, your brain processes the desk edges and recognizes it as a desk. It doesn’t care how it got the information,” says Giudice, who also directs the Virtual Environment and Multimodal Interaction (VEMI) Laboratory, which houses the university’s first, and Maine’s only, virtual reality research facility.

Giudice prefers working with the sense of touch because, he says, it is more closely related to vision than the other senses and shares many of the same properties.

The new project, “Non-visual access to graphical information using a vibro-audio display,” recently received $177,568 from the National Science Foundation — the first installment of a three-year $500,000 grant.

The research team is in the early stages of the project, developing a tool that fits how people process tactile information; discovering an intuitive approach is the team’s first task.

“We know this can work, but to make sure it can be used commercially, we need to understand the cognitive factors and how well it can work compared to hard-copy or traditional tactile approaches,” Giudice says.

Initial data has shown that, for graphs and shapes, a vibro-audio approach can produce learning similar to that achieved with printed lines, Giudice says, but the process still needs to be optimized.

“Early research has worked amazingly well; there’s a lot of potential here. But there’s still a lot we don’t know,” Giudice says, citing open questions such as the best alignment, vibration and resolution settings.

Preliminary work on map panning and zooming has also been done, he says, adding that his team plans to develop software for manipulating on-screen movement, a common practice, especially for reading maps, that is difficult without sight.
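One plausible way to handle that on-screen movement on Android is sketched below, using the standard ScaleGestureDetector and Matrix classes. The gesture conventions here are illustrative assumptions rather than the team’s design, since a real vibro-audio interface would need pan and zoom gestures that don’t conflict with one-finger exploration of the graphic:

```kotlin
import android.content.Context
import android.graphics.Canvas
import android.graphics.Matrix
import android.view.MotionEvent
import android.view.ScaleGestureDetector
import android.view.View

class PannableMapView(context: Context) : View(context) {
    // Accumulated pan/zoom transform applied to the rendered map.
    private val transform = Matrix()

    private val scaleDetector = ScaleGestureDetector(context,
        object : ScaleGestureDetector.SimpleOnScaleGestureListener() {
            override fun onScale(detector: ScaleGestureDetector): Boolean {
                // Zoom around the pinch focus so content stays under the fingers.
                transform.postScale(detector.scaleFactor, detector.scaleFactor,
                    detector.focusX, detector.focusY)
                invalidate()
                return true
            }
        })

    private var lastX = 0f
    private var lastY = 0f

    override fun onTouchEvent(event: MotionEvent): Boolean {
        scaleDetector.onTouchEvent(event)
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { lastX = event.x; lastY = event.y }
            MotionEvent.ACTION_MOVE -> if (!scaleDetector.isInProgress) {
                // Drag pans the map (an assumed convention; a real design
                // would reserve single-finger motion for exploration).
                transform.postTranslate(event.x - lastX, event.y - lastY)
                lastX = event.x; lastY = event.y
                invalidate()
            }
        }
        return true
    }

    override fun onDraw(canvas: Canvas) {
        canvas.concat(transform)  // draw map content under the pan/zoom transform
        // ... map rendering would go here ...
    }
}
```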

In the future, Giudice would like the tool to be available as an app, or multiple apps, that could be used to supplement existing apps, such as Google Maps.

Contact: Elyse Kahl, 207.581.3747