By translating the shape and position of a user's hands into computer input, this system can turn any surface into a touch-screen.
The next step in touch-screens may be to ditch the actual screen, according to researchers at Purdue University. Engineers Karthik Ramani and Niklas Elmqvist and their colleagues recently unveiled a projector-based computer input system that can transform desks, office walls, or even kitchen islands into interactive surfaces.
The system, called extended multitouch, consists of a computer, a projector, and a Microsoft Kinect depth-sensing camera. (Also used for the Xbox 360, Kinect enables users to interact with games and devices optically, without touching anything.) The projector displays the contents of a typical computer screen onto a surface like your refrigerator or stone countertop, while the Kinect’s infrared laser and receiver estimate the distance to the surface.
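The researchers' code is not described in the article, but the calibration step implied here is easy to sketch: before it can sense touches, the system needs a depth map of the bare surface to compare hands against. The Python below is a minimal illustration, with a made-up get_depth_frame() standing in for the Kinect driver; the function names and numbers are assumptions, not the Purdue implementation.

```python
import numpy as np

# Hypothetical stand-in for a Kinect driver: a real depth camera delivers a
# per-pixel distance image (here in millimeters) roughly 30 times per second.
def get_depth_frame(height=480, width=640):
    # Simulate a flat surface about 1.2 m from the sensor, plus sensor noise.
    return 1200 + np.random.normal(0, 2, size=(height, width))

# Calibration: average several frames of the empty surface so the system
# knows how far away every point of the desk, wall, or countertop is.
def capture_surface_baseline(num_frames=30):
    frames = [get_depth_frame() for _ in range(num_frames)]
    return np.mean(frames, axis=0)

surface_depth = capture_surface_baseline()
print("Baseline surface depth (mm):", round(float(surface_depth.mean()), 1))
```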
As a user moves his hand into view, the Kinect snaps 30 frames per second, and an image-processing algorithm in the computer tracks and deciphers the changes from frame to frame. “Basically, we know where your hand is and what your hand is doing,” Ramani says. If the user’s finger comes within a few centimeters of the projected image (i.e., the surface), the system interprets this as a touch.
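The touch test itself can be pictured in a few lines. Continuing the sketch above, the snippet compares each new depth frame against the surface baseline and flags pixels that sit within a few centimeters of the surface; the thresholds and the simulated fingertip are illustrative guesses, not values from the Purdue system.

```python
# Anything closer to the camera than the surface, but by less than ~3 cm,
# is treated as a touch; a small noise margin filters out sensor jitter.
NOISE_MARGIN_MM = 10
TOUCH_THRESHOLD_MM = 30

def detect_touch_pixels(current_frame, surface_depth):
    gap = surface_depth - current_frame  # positive where something hovers over the surface
    return (gap > NOISE_MARGIN_MM) & (gap < TOUCH_THRESHOLD_MM)

# Simulate a fingertip hovering about 2 cm above a small patch of the surface.
frame = get_depth_frame()
frame[200:240, 300:340] = surface_depth[200:240, 300:340] - 20

touch_mask = detect_touch_pixels(frame, surface_depth)
print(f"{int(touch_mask.sum())} pixels register as a touch")
```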
Other researchers have demonstrated technology that can make use of a similar kind of virtual click, but the Purdue system is unique: It can distinguish between thumbs and fingers, pick out a right hand from a left, and even track the hands of several users at once. In one experiment, several subjects drew virtual pictures on a table, and the computer marked each individual’s rendering in a unique color.
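How the system tells thumbs, fingers, and different users apart is not spelled out in the article, but the per-user coloring from the drawing experiment is simple to imagine: once the tracker assigns a stable ID to each user's hand, each ID maps to its own color. The toy sketch below assumes such IDs exist and is not based on the researchers' code.

```python
from itertools import cycle

# Illustrative only: give every tracked user a distinct stroke color,
# mirroring the experiment where each person's drawing appeared in its own hue.
PALETTE = cycle(["red", "green", "blue", "orange", "purple"])
user_colors = {}

def color_for_user(user_id):
    # Hand tracking (not shown) would supply a stable ID per detected user.
    if user_id not in user_colors:
        user_colors[user_id] = next(PALETTE)
    return user_colors[user_id]

# Two users drawing at once each keep a consistent color across frames.
print(color_for_user("user_1"), color_for_user("user_2"), color_for_user("user_1"))
```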
Click here to view this article
iTech Solutions website