Interacting with an intelligent interface could soon be in the palm of your hand.
Scientists are taking the touchscreen to a whole new level by projecting it onto your hand, so you can “feel your way” to your wearable device. Why would you want to do that? Convenience (or handiness), of course.
Smart devices are getting smaller and smaller, making their interfaces difficult for even a toddler’s fingers to activate. But haptics, a technology that uses the sense of touch to control and interact with computers, can turn your hand into an extension of the display.
According to an article from Phys.org, a study led by the University of Sussex (and funded by the Nokia Research Centre and the European Research Council) is the first to find a way for users to feel what they are doing when interacting with displays projected onto their hand. The innovation, called SkinHaptics, sends ultrasound through the hand to precise points on the palm, turning the skin itself into the touchscreen.
Get it? I don’t either. But the study’s findings were recently presented at the IEEE Haptics Symposium in Philadelphia, so we could soon be literally telling our smart devices to “talk to the hand.”