Researchers from the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a wearable system for the visually impaired that offers a more comprehensive view of their surroundings. Findings were presented at the International Conference on Robotics and Automation.
The new system combines a 3D camera, a belt with vibrating motors, and an electronically reconfigurable Braille interface. The researchers tested its effectiveness in a series of usability studies with visually impaired users.
"We did a couple of different tests with blind users," said Robert Katzschmann, a graduate student in mechanical engineering at MIT and one of the paper's two first authors. "Having something that didn't infringe on their other senses was important. So, we didn't want to have audio; we didn't want to have something around the head, vibrations on the neck—all of those things, we tried them out, but none of them were accepted. We found that the one area of the body that is the least used for other senses is around your abdomen."
The system consists of a 3D camera worn in a pouch around the neck, a sensor belt with five vibrating motors, and a Braille interface worn at the user's side. These components are controlled by a processing unit running an algorithm the researchers developed to identify surfaces in the 3D camera's output. By detecting obstacles and surfaces around the user, the processing unit sends signals to the belt and the Braille interface, giving the wearer a more extensive picture of the surroundings.
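To make the belt's role concrete, here is a minimal sketch of how a depth image might drive the five vibration motors. This is an illustrative assumption, not the CSAIL team's actual algorithm: the image is split into five vertical sectors (one per motor), and the nearest surface in each sector sets that motor's intensity, with closer obstacles producing stronger vibration. The sector split, the 4-meter range cap, and the linear intensity mapping are all hypothetical choices.

```python
import numpy as np

NUM_MOTORS = 5      # one vibration motor per belt sector (from the article)
MAX_RANGE_M = 4.0   # hypothetical maximum sensing range of the depth camera

def motor_intensities(depth_m: np.ndarray) -> np.ndarray:
    """Map a depth image (meters) to per-motor vibration intensities in [0, 1].

    The image is split into NUM_MOTORS vertical sectors; the nearest valid
    surface in each sector drives that sector's motor (closer = stronger).
    """
    sectors = np.array_split(depth_m, NUM_MOTORS, axis=1)
    intensities = np.zeros(NUM_MOTORS)
    for i, sector in enumerate(sectors):
        # Ignore invalid depth readings (zeros, NaN, infinity).
        valid = sector[(sector > 0) & np.isfinite(sector)]
        if valid.size == 0:
            continue  # no return in this sector: motor stays off
        nearest = valid.min()
        # Linear falloff: 1.0 at contact, 0.0 at or beyond MAX_RANGE_M.
        intensities[i] = max(0.0, 1.0 - nearest / MAX_RANGE_M)
    return intensities
```

For example, a wall one meter away on the wearer's far left would activate only the leftmost motor at roughly 75 percent strength, leaving the others silent.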
In testing, the wearable reduced instances of users colliding with objects by 80 percent and cane collisions with people in a hallway by 86 percent.