More like a prop from a James Bond movie than something that came out of a scientific lab, a new device developed by neuroscientists, which combines a pair of special glasses, a webcam and a smartphone, might allow the blind to see again by converting visual signals into auditory ones, transmitted through a headset.
The device, called "vOICe", will be demonstrated live this week in Washington at the American Psychological Association meeting by Michael Proulx, a neuroscientist at Queen Mary, University of London.
You're probably curious how something like this actually works in practice. The truth is, it was complicated to develop in the first place, and it may be just as complicated to use: training for vOICe lasts three months. In a nutshell, the device maps whatever the wearer sees through the webcam, detects whether one or more objects are in the field of view, and converts their coordinates into an audio signal. Vertical location is encoded in pitch: "up" is represented by high frequencies and "down" by low frequencies. Horizontal location is indicated by timing within a left-to-right scan of each image. Brightness maps to volume: bright white is heard at maximum volume, while dark is silent.
“The program takes visual input from the camera, then scans the image from left to right,” Proulx said. “Then you hear this soundscape where the changes in frequency and volume correspond to pixels in the image.”
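The mapping Proulx describes can be sketched in a few lines of code. The snippet below is a minimal illustration of this kind of image-to-sound encoding, not the actual vOICe software: the function name, frequency range, and scan duration are all illustrative assumptions.

```python
def image_to_soundscape(image, scan_duration=1.0, f_low=200.0, f_high=4000.0):
    """Sketch of a vOICe-style image-to-sound mapping.

    image: 2D list of grayscale pixel values in [0, 1], row 0 = top.
    Returns a list of (time, frequency, volume) triples:
      - column index -> time within the left-to-right scan
      - row index    -> pitch (top = high frequency, bottom = low)
      - brightness   -> volume (white = loudest, black = silent)
    """
    rows = len(image)
    cols = len(image[0])
    events = []
    for col in range(cols):
        t = scan_duration * col / cols  # horizontal position -> scan time
        for row in range(rows):
            # Map row to frequency on a log scale; top rows get high pitch.
            frac = 1.0 - row / (rows - 1) if rows > 1 else 1.0
            freq = f_low * (f_high / f_low) ** frac
            volume = image[row][col]    # brightness -> loudness
            if volume > 0:              # dark pixels stay silent
                events.append((t, freq, volume))
    return events

# Tiny 2x2 example: a bright pixel top-left, a dimmer one bottom-right.
demo = [[1.0, 0.0],
        [0.0, 0.5]]
for t, f, v in image_to_soundscape(demo):
    print(f"t={t:.2f}s  freq={f:.0f}Hz  vol={v:.1f}")
```

A real implementation would synthesize actual audio from these (time, frequency, volume) events rather than printing them, but the sketch shows how spatial and brightness information can survive the translation into sound.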
You can imagine that it's fairly difficult for a person to use, but once someone gets accustomed to it, life might become a lot more bearable. The device might actually prove to be, for some blind people, the ticket to independence they've been waiting for. There's nothing really novel about vOICe, though. The technology has been around for 15 years, but with the expansion of the smartphone market, mobile phones are now powerful enough to handle the required imaging software.
“The main thing is to work out how to make the brain and the technology meet in the middle,” said Peter Meijer, the inventor of vOICe. “The technology is mature, but we don’t know how the brain deals with complex sounds.”
Up to 30 percent of our brain is devoted to visual processing, and Proulx and colleagues believe these areas of the brain may be able to receive and decode sound and touch signals to a greater extent than was previously thought. The developers envision other applications as well: an infrared vision device for seeing at night, or an alternative to sonar for navigating underwater.
“You could use this to bring in ultraviolet or infrared as well,” Meijer said. “If you want to see like a snake sees or if police are looking for someone in a forest, they could use a device like this to augment their vision with infrared information.”
Tibi is a science journalist and co-founder of ZME Science. He writes mainly about emerging tech, physics, climate, and space. In his spare time, Tibi likes to make weird music on his computer and groom felines.