Elliptic Labs brings ultrasound gesturing SDK to Android

Ryan Daws is a senior editor at TechForge Media with over a decade of experience in crafting compelling narratives and making complex topics accessible. His articles and interviews with industry leaders have earned him recognition as a key influencer by organisations like Onalytica. Under his leadership, publications have been praised by analyst firms such as Forrester for their excellence and performance. Connect with him on X (@gadget_ry) or Mastodon (@gadgetry@techhub.social)

You may have heard of Elliptic Labs and their technology; the company previously brought touchless gesture control to Windows 8 – now they’re bringing that same unique input to Google’s smartphone OS, Android.

The 20-member team is showing off a prototype at CEATEC in Japan – after all, it’s not the most conventional of technologies! It’s also not the only solution on the market: Samsung has recently started including “Smart Scroll” in its handsets.

Don’t expect to simply find this as an app in Google’s Play Store, however; it requires one or two cheap chips (of the under-$1, non-edible kind) to be baked into the device, with Elliptic Labs’ software handling the rest.

Unlike current implementations, which require a camera, these chips detect gestures from various directions around the phone, without the hand having to be within the camera’s field of view. In fact, the system is comprehensive enough to detect multiple hands at once.
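Since the article describes an event-driven setup – chips detect gestures, and Elliptic Labs’ software hands them off to apps – here’s a minimal sketch of what a listener-based gesture API might look like. To be clear, Elliptic Labs has not published this interface: every name below (`GestureEvent`, `GestureListener`, `GestureDispatcher`) is invented purely for illustration.

```java
// Purely illustrative sketch: Elliptic Labs has not published its Android API,
// so every name below is hypothetical.
import java.util.ArrayList;
import java.util.List;

public class GestureDemo {
    // Rough direction of a detected hand relative to the device (hypothetical model).
    enum Direction { LEFT, RIGHT, ABOVE, BELOW }

    // One ultrasound-derived detection; handId lets multiple hands be tracked at once.
    record GestureEvent(int handId, Direction direction) {}

    interface GestureListener {
        void onGesture(GestureEvent event);
    }

    // Toy stand-in for the SDK's event pipeline: the chips plus the vendor's
    // software would produce events, and apps would simply register listeners.
    static class GestureDispatcher {
        private final List<GestureListener> listeners = new ArrayList<>();
        void register(GestureListener listener) { listeners.add(listener); }
        void emit(GestureEvent event) {
            for (GestureListener listener : listeners) listener.onGesture(event);
        }
    }

    // Feeds two simultaneous-hand events through the dispatcher and returns the log,
    // mirroring the multi-hand tracking the article mentions.
    static List<String> run() {
        GestureDispatcher dispatcher = new GestureDispatcher();
        List<String> log = new ArrayList<>();
        dispatcher.register(e -> log.add("hand " + e.handId() + " " + e.direction()));
        dispatcher.emit(new GestureEvent(1, Direction.LEFT));
        dispatcher.emit(new GestureEvent(2, Direction.RIGHT));
        return log;
    }

    public static void main(String[] args) {
        System.out.println(String.join(", ", run())); // prints "hand 1 LEFT, hand 2 RIGHT"
    }
}
```

The callback pattern mirrors how Android already exposes hardware sensors to apps, which is presumably why a baked-in chip plus a software SDK is a plausible delivery model.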

Whilst not quite a match for the Kinect 2 – which ships alongside the Xbox One home console later this year and can track 25 joints on each of up to six people – the combination of both forms of motion detection could spark consumer interest and development.

The only real similarity between the two technologies is that both can “see” in the dark, without the well-lit room a traditional camera would rely upon.

It’s this accessibility the team is focusing on getting right, making the technology a convenience, not a hindrance, to the end user. Consumers can flip through pictures or set a new Fruit Ninja high score, all without touching their precious high-definition displays.

As for the latency? The prototypes are already running at sub-100ms, compared to current touchscreens, which typically respond in around 120ms. This is a genuine improvement which we’ll hopefully soon see built into future Android devices.

Check out Elliptic Labs’ website for more details about the SDK release.
