Just saw this video on TED featuring Pattie Maes and Pranav Mistry of the MIT Media Lab. She introduced a wearable device that can present meta information (that already exists) anywhere, anytime (assuming the phone has an Internet connection), just by looking at something. A few applications they showed include looking at your boarding card to know your flight status, clicking pictures, reading book reviews from Amazon just by looking at the barcode, and learning about a person just by looking at their face.
They use basic technology tools – a camera, a mirror, a rechargeable battery, pointers and a cellphone (for communication) – to bring to life possibilities that put the entire world literally at your fingertips. In the team’s words, SixthSense is
a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.
It’s more than QR codes, Microsoft Surface, the iPhone, and a digital personal assistant all put together. I wish it comes to life soon. And this is what I call the New New Thing.