Depth Perception

    This project attempted to make a device to assist people who are blind. I used the Kinect's depth camera and an Arduino to send depth information to a vibrating tactile display that fits in the palm of your hand, letting the user essentially "feel" the objects and people around them. This could support object detection and avoidance when navigating a room, or sensing whether cars at loud intersections are moving or stationary.

    This first video shows my initial prototype and also demonstrates to sighted people how the final prototype works. The upper-right window is the video feed from the Kinect. The upper-left window is the depth video, where nearby objects appear light gray and distant objects dark gray. The bottom window is an 8x8 grid of LEDs controlled by an Arduino.

    As I move back and forth, an 8x8 grid of white boxes appears in the upper-left window over my silhouette, corresponding to the grid of LEDs. As I move, the computer detects my silhouette, overlays the white boxes, and turns on the matching LEDs. The LEDs were later replaced with the tactile display shown in the second video further down the page.
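The mapping described above can be sketched as follows. This is my reconstruction, not the project's actual code: the Kinect depth frame is downsampled to an 8x8 grid by averaging each cell's depth values, and any cell whose average falls within a "nearby" threshold switches its LED on.

```python
GRID = 8  # 8x8 display, as in the prototype

def depth_to_grid(frame, threshold):
    """frame: 2D list of depth values (mm); returns an 8x8 grid of booleans,
    True where something is closer than `threshold`."""
    rows, cols = len(frame), len(frame[0])
    cell_h, cell_w = rows // GRID, cols // GRID
    grid = []
    for gy in range(GRID):
        row = []
        for gx in range(GRID):
            # Average the depth values inside this cell.
            total, count = 0, 0
            for y in range(gy * cell_h, (gy + 1) * cell_h):
                for x in range(gx * cell_w, (gx + 1) * cell_w):
                    total += frame[y][x]
                    count += 1
            avg = total / count
            # LED on when the cell's average depth is a valid, nearby reading
            # (the Kinect reports 0 for "no reading").
            row.append(0 < avg < threshold)
        grid.append(row)
    return grid
```

The resulting boolean grid maps one-to-one onto the 64 LEDs (and later the 64 motors), so the Arduino only needs to receive 64 on/off states per frame.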

    Below are images and a video of my final working prototype. The left image shows the hard-foam housing for the vibrating motors, which supported them while still letting them vibrate freely, and the circuit board I assembled to interface with the Arduino. The board, which provides 64 output pins, was designed by my professors Tony Olsson and David Cuartielles of 1Scale1. The right image shows the tactile display's size relative to the hand. The vibrating motors rest 2 mm above the surface to improve tactile feedback.

    The video below demonstrates my final working prototype. A friend of mine walks back and forth in my apartment, and with my eyes closed I am able to literally "feel" where in the room she is.

   Rita, a member of the Danish blind community I interviewed, was comfortable with technology and used a PC (upper left) for web surfing and email, a braille typewriter (upper right), and a PDA (lower right) to manage her life and read e-books. She even gave me a lesson in reading braille (lower left). But mobility was her single greatest concern. She lived next to a park but never visited it for fear of getting lost. Busy intersections also posed a major challenge in her daily life.

   While a number of sonar-based devices are available today to assist with mobility, they sense information in only one dimension. A two-dimensional grid conveys substantially more information about the user's surrounding environment.

   Finally, other systems for assisting the blind use thresholding to filter video input and produce tactile output for the user. The image below illustrates how the Kinect's depth information offers significantly better imaging quality for this application. Additionally, the motors in a tactile display can vibrate based on depth: pixels corresponding to closer objects vibrate more strongly, and those for distant objects more weakly.
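The depth-to-intensity idea can be sketched like this. It is an illustrative sketch, not the project's firmware, and the near/far range constants are assumptions: each motor's PWM duty cycle scales linearly with depth so that closer objects vibrate more strongly.

```python
NEAR_MM = 500   # assumed closest distance of interest
FAR_MM = 4000   # assumed farthest distance of interest

def depth_to_pwm(depth_mm):
    """Map a depth reading in mm to a PWM value in 0..255 (255 = closest)."""
    if depth_mm <= 0:
        return 0  # the Kinect reports 0 for "no reading", so keep the motor off
    clamped = max(NEAR_MM, min(FAR_MM, depth_mm))
    # Linear falloff: full strength at NEAR_MM, silent at FAR_MM.
    scale = (FAR_MM - clamped) / (FAR_MM - NEAR_MM)
    return round(255 * scale)
```

On the Arduino side, each such value could drive a motor through a PWM output, giving the user graded rather than binary feedback about distance.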