Ahoy there! This is my personal blog which I use as my memory extension and a medium to share stuff that could be useful to others.

Tangible User Interfaces

How about manipulating digital information with your hands in a natural, instinctive way, as we do with other tangible things? Devices called Siftables make this possible. According to their creators, David Merrill (MIT Media Lab) and Jeevan Kalanithi (Taco Lab), “Siftables are compact electronic devices with motion sensing, graphical display, and wireless communication. One or more Siftables may be physically manipulated to interact with digital information and media.” So, while gestural user interfaces (like SixthSense and g-speak) let you manipulate digital information on surfaces using gestures, tangible user interfaces (like Siftables) let you manipulate information directly in your hands. Watch a demo video of Siftables below:

[Embedded video: Siftables demo]

Visit siftables.com for more details.


Spatial manipulation of digital information

Most jobs involving digital information are computer-centric. My job (IT), like those of millions of others, predominantly involves the exchange and manipulation of digital information, which ties us to our laptops/PCs/Macs and to I/O interfaces like keyboards, mice, and touch screens. How about doing the same work with digital information spatially? That would require a new type of I/O interface, and what could be simpler and more intuitive to use than hand gestures? So we have it: gestural I/O. The SixthSense uses gestural I/O. A lot of research in the field of gestural I/O and Human-Computer Interaction has been performed by Oblong Industries together with MIT’s Tangible Media Lab. Oblong Industries is the developer of the g-speak spatial operating environment. View the video below for an overview of g-speak:

[Embedded video: g-speak overview]

By the way, gestural I/O was not inspired by Minority Report; rather, the work done by Oblong Industries/MIT inspired Minority Report. The movie’s production designer visited the MIT labs to determine how to depict a plausible future, as Steven Spielberg wanted 2054 AD portrayed as something that could be made real one day. I guess Spielberg was rather pessimistic in choosing the year 2054, as it seems we’re only a few years or a decade away from doing the stuff depicted in Minority Report.

Read this interesting account on the development of gestural I/O and spatial operating environments at Oblong and MIT.


At times during my ritual household grocery shopping, I find myself standing in front of a supermarket shelf stocked with several brands of a product (perhaps a new addition to the shelf), trying to digest all the information on the product labels and decide which brand to pick. This task has become somewhat easier since I got my BlackBerry Bold with unlimited internet usage, as I can now at least check product reviews on the web, which facilitates my decision-making. But how nice it would be to have very quick access to the information I want with minimum fuss! SixthSense may be just the device to make this possible. Pranav Mistry, a Ph.D. student in the Fluid Interfaces Group at the MIT Media Lab, is the genius behind SixthSense. Watch the demo of SixthSense below:

[Embedded video: SixthSense demo]

Visit SixthSense for more details. When this device becomes generally available, it will revolutionize the way we humans interact with the world around us and will bridge the gap between science fiction and reality.
