Back in 2009 and 2010 I described how I felt touch screens needed to adopt something like the technology used for braille displays (so-called haptic displays) to provide tactile feedback about the current display's mode.
A company called Tactus Technology has now made that possible. I didn't really have phones in mind, though; I was thinking more of cases where you could expect not to be looking at the input area – tablets, secondary displays, etc.
But then again, couple a smart phone/tablet with Microsoft’s Second Screen for Xbox and, actually, tactile phone feedback would be handy: allowing your phone to serve as a context-morphing controller. Woot.
Only vaguely related, but you have seen http://www.leapmotion.com/ right?
I hadn't. Interesting :) Based on my experience with Kinect, though, I have concerns about how it would perform relative translation of interactions. E.g. the most natural thing to do is to reach for the things on the screen, but the Kinect doesn't perform any kind of clever mapping, so you have to fight the visual cues and try to calibrate yourself to the virtual sensor space in which you are interacting. Bleah.
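To make the complaint concrete: the "clever mapping" missing here is basically a calibration from raw sensor space to screen space, so that reaching toward a point on screen actually lands the cursor there. A minimal per-axis linear sketch (all function names, ranges, and numbers are invented for illustration; real devices would need a fuller 3D calibration):

```python
# Hypothetical sketch: map raw sensor-space hand coordinates to screen
# pixels via a simple per-axis linear calibration. None of this is a
# real Kinect or Leap Motion API; the values are made up.

def calibrate_axis(sensor_lo, sensor_hi, screen_lo, screen_hi):
    """Return a function mapping one sensor-space value to screen space."""
    scale = (screen_hi - screen_lo) / (sensor_hi - sensor_lo)
    return lambda s: screen_lo + (s - sensor_lo) * scale

# Pretend the user reached toward the screen corners during a
# calibration step and we recorded these raw sensor readings.
map_x = calibrate_axis(sensor_lo=-0.4, sensor_hi=0.4, screen_lo=0, screen_hi=1920)
map_y = calibrate_axis(sensor_lo=-0.3, sensor_hi=0.3, screen_lo=0, screen_hi=1080)

def to_screen(sensor_x, sensor_y):
    """Translate a raw hand position into clamped screen pixels."""
    x = min(max(map_x(sensor_x), 0), 1920)
    y = min(max(map_y(sensor_y), 0), 1080)
    return (round(x), round(y))

print(to_screen(0.0, 0.0))   # centre of the sensor range -> centre of screen
print(to_screen(0.4, 0.3))   # far corner of the sensor range -> far corner
```

With something like this in the driver, the visual cue (the thing you're reaching for) and the sensor space would line up, instead of the user having to do the calibration in their head.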