The touch interface goes back to the early days of mobile devices, with the Apple Newton and the Palm organiser. Elliptic Labs now add non-touch control to the mobile user interface with ultrasound gesture recognition. Todd gets a demo from Laila Danielson, CEO.
Elliptic Labs use an ultrasonic speaker to create a sonic field around the mobile device. Moving a hand in front of the device produces ultrasonic echoes which are picked up by microphones, and the changes in the echo patterns are converted into gestures. For example, waving a hand from left to right scrolls a picture gallery, and moving a hand closer to the phone brings up additional information about a movie.
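The interview doesn't go into how Elliptic Labs actually process the echoes, but a minimal sketch of the idea might look like the Python below, assuming hand positions have already been estimated from the echo delays. The class names, thresholds and gesture labels are purely illustrative, not part of any real SDK.

```python
# Hypothetical sketch (not Elliptic Labs' actual software): classify a simple
# swipe or approach gesture from a time series of hand positions that are
# assumed to have been derived from ultrasonic echo measurements.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class EchoSample:
    """One frame of estimated hand position (units assumed to be cm)."""
    x: float  # lateral position across the device
    z: float  # distance from the device

def classify_gesture(samples: List[EchoSample],
                     swipe_threshold_cm: float = 8.0,
                     approach_threshold_cm: float = 5.0) -> Optional[str]:
    """Return 'swipe_right', 'swipe_left', 'approach', or None."""
    if len(samples) < 2:
        return None
    dx = samples[-1].x - samples[0].x   # net lateral motion
    dz = samples[0].z - samples[-1].z   # net motion toward the device
    if abs(dx) >= swipe_threshold_cm and abs(dx) > dz:
        return "swipe_right" if dx > 0 else "swipe_left"
    if dz >= approach_threshold_cm:
        return "approach"
    return None

# Example: a hand sweeping left to right in front of the phone,
# the kind of motion that would scroll the picture gallery.
frames = [EchoSample(x=-6.0 + i * 2.0, z=10.0) for i in range(8)]
print(classify_gesture(frames))  # -> 'swipe_right'
```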
It’s currently a prototype being shown to OEMs, so expect the technology to appear in mobile devices in the next year or two.
Interview by Todd Cochrane of Geek News Central for the TechPodcast Network.