Our Sixth Sense: Wearable Fluid Interfaces

This week at the TED conference, Pattie Maes from MIT Media Lab’s Fluid Interfaces group showcased the latest work of her students, “a wearable computing system that turns any surface into an interactive display screen. The wearer can summon virtual gadgets and internet data at will, then dispel them like smoke when they’re done.”

Maes said the research is aimed at creating a new digital “sixth sense” for humans.

In the physical world, we use our five senses to take in information about our environment and respond to it, Maes explained. But a lot of the information that helps us understand and respond to the world doesn’t come from these senses. Instead, it comes from computers and the internet. Maes’ goal is to harness computers to feed us information in an organic fashion, like our existing senses.

The prototype was built from an ordinary webcam, a battery-powered 3M projector with an attached mirror, and an internet-enabled mobile phone that ties the pieces together. The setup, which costs less than $350, lets the wearer project information from the phone onto any surface: walls, the body of another person, or even the wearer’s own hand.
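
The article doesn’t include any of the project’s code, but the data path is straightforward to picture: the camera feeds frames to the phone, which overlays whatever it has fetched and hands the result to the projector. Here is a minimal sketch of that loop, assuming OpenCV on a laptop with the projector attached as a second display rather than the actual phone hardware:

```python
import cv2

# Rough stand-in for the capture-process-project loop: the real rig runs on a
# phone driving a pocket projector, but a webcam plus a second display (the
# projector) reproduces the same data path on a laptop.
cap = cv2.VideoCapture(0)           # the ordinary webcam
cv2.namedWindow("projector", cv2.WINDOW_NORMAL)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Whatever information the phone would fetch gets drawn into the frame
    # before it is sent out; here we just stamp a placeholder label.
    cv2.putText(frame, "projected overlay", (20, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 255, 255), 2)

    cv2.imshow("projector", frame)  # this window lives on the projector output
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```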

Maes showed a video of her student Pranav Mistry, whom she describes as the brains behind the project. Mistry wore the device on a lanyard around his neck, and colored Magic Marker caps (red, blue, green and yellow) on four of his fingers helped the camera track each fingertip, so that software Mistry wrote could recognize his hand gestures.
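
Colored markers turn fingertip tracking into a simple thresholding problem: segment each marker color, take the largest blob of that color, and treat its centroid as the fingertip. The sketch below illustrates that idea with OpenCV; the HSV ranges and the function name are made up for the example and are not Mistry’s actual software.

```python
import cv2
import numpy as np

# Illustrative HSV ranges for the four marker caps; real thresholds would have
# to be calibrated to the actual markers and lighting.
MARKER_RANGES = {
    "red":    ((0, 120, 80),   (10, 255, 255)),
    "green":  ((45, 100, 80),  (75, 255, 255)),
    "blue":   ((100, 120, 80), (130, 255, 255)),
    "yellow": ((20, 120, 80),  (35, 255, 255)),
}

def find_fingertips(frame_bgr):
    """Return {color: (x, y)} for each marker cap visible in the frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    tips = {}
    for color, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        # The largest blob of that color is assumed to be the marker cap.
        cap_blob = max(contours, key=cv2.contourArea)
        m = cv2.moments(cap_blob)
        if m["m00"] > 0:
            tips[color] = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
    return tips
```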

The gestures can be as simple as framing a scene with his fingers and thumbs to tell the camera to snap a photo, which is saved to his mobile phone. When he gets back to the office, he projects the images onto a wall and begins resizing them.
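
The framing gesture reduces to a geometric test on the tracked fingertips: when they span a large enough rectangle, grab that region of the camera frame as the photo. A rough illustration building on the tracker sketched above; the thresholds and function names are hypothetical.

```python
import cv2

def framing_box(tips, min_size=120):
    """If the four fingertips roughly outline a rectangle, return it as
    (x, y, w, h); otherwise return None. `tips` is the dict produced by
    find_fingertips() above."""
    if len(tips) < 4:
        return None
    xs = [p[0] for p in tips.values()]
    ys = [p[1] for p in tips.values()]
    x, y = min(xs), min(ys)
    w, h = max(xs) - x, max(ys) - y
    # A "picture frame" gesture: the fingertips span a reasonably large box.
    if w >= min_size and h >= min_size:
        return (x, y, w, h)
    return None

def maybe_snap(frame, tips, path="snapshot.jpg"):
    """Save the framed region of the current camera frame, mimicking the
    frame-your-fingers-to-take-a-photo gesture."""
    box = framing_box(tips)
    if box is None:
        return False
    x, y, w, h = box
    cv2.imwrite(path, frame[y:y + h, x:x + w])
    return True
```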

Here’s the video of Pranav Mistry using the system.

[Via Wired Epicenter]