CES 2015: Elliptic Labs Wants to Transform Your Smartphone UX with Ultrasonic Gesturing

Waving your hands in front of your phone like a madman usually isn’t something people associate with brilliant UX design, but the folks at Elliptic Labs want you to forget that association immediately. I spent time speaking with the company’s VP of Product Development, Guenael Strutt, at CES today, and after a mere 15-minute conversation he had me convinced that intuitive gestures may just be the way of the future.

Before you accuse Elliptic of aping existing tech found in the LG G3 or the latest Samsung Galaxy S, it’s important to get one thing straight – what the company is doing is a lot more advanced than what we’ve seen in current phones. Sure, you can gesture through your photos in TouchWiz, but is the movement really one-to-one? Does the device track the motion of your hand every step of the way? Hardly. It’s a fun diversion, but in its current commercially available form its overall usefulness is questionable.

What Elliptic really wants to tackle are the everyday inconveniences of smart devices that we don’t even realize we regularly put up with. Take Android’s YouTube app. While watching a video in full screen, you’ll need to tap the screen once to hide the play and pause controls. Or, if they’re already hidden, you’ll need to tap the screen when you want to access them. It sounds straightforward enough, but Elliptic asks a simple question: why?

The demo I saw used a case-like enclosure to imbue a standard Samsung smart device with souped-up sensing powers. As I moved my hand toward the screen to tap and reveal YouTube’s pause button, something awesome happened – the button appeared on its own and presented itself to me! This of course happened because the device sensed my approaching hand and acted accordingly.
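
To make the pattern concrete, here’s a rough Kotlin sketch of that hover-reveal behavior. Fair warning: Elliptic hasn’t published an SDK (more on that below), so the UltrasonicSensor interface and the 80 mm trigger distance here are stand-ins I invented purely to model the interaction I saw.

```kotlin
// Hypothetical throughout: Elliptic’s SDK isn’t public, so the
// UltrasonicSensor interface and the threshold below are invented
// stand-ins that only model the behavior of the demo.
interface UltrasonicSensor {
    // Fires whenever the estimated hand distance changes.
    fun setProximityListener(listener: (distanceMm: Int) -> Unit)
}

class PlaybackControls {
    var visible = false
        private set

    fun show() { visible = true; println("controls shown") }
    fun hide() { visible = false; println("controls hidden") }
}

// Assumed trigger distance; a real SDK would presumably let you tune this.
const val REVEAL_THRESHOLD_MM = 80

fun bindHoverReveal(sensor: UltrasonicSensor, controls: PlaybackControls) {
    sensor.setProximityListener { distanceMm ->
        when {
            // Hand is closing in and the controls are hidden: reveal them.
            distanceMm < REVEAL_THRESHOLD_MM && !controls.visible -> controls.show()
            // Hand has pulled away and the controls are showing: tuck them away.
            distanceMm >= REVEAL_THRESHOLD_MM && controls.visible -> controls.hide()
        }
    }
}
```

The appeal of this pattern is that the app never has to wait for a tap at all – the UI reacts to an approaching hand the same way a doorman reacts to someone walking up.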

Another compelling example was an idea for potential functionality in Google Chrome. In Chrome, when you scroll down, the app switches to full-screen mode, hiding the address bar. If you want to see the address bar again (or use it), you’ll need to scroll up first, even if just ever-so-slightly. In theory, Elliptic’s sensors could instead detect when your finger approaches the area where the address bar normally sits and present it to you just in time for a tap – no extraneous scrolling required. It’s a small change but an incredibly intuitive one, and it’s not hard to imagine dozens of little tweaks like this adding up to a massive overall boost in convenience.
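
And since the pitch is genuinely one-to-one tracking rather than a crude near/far trigger, the same idea extends to specific regions of the screen. Here’s an equally hypothetical Kotlin sketch; HandPosition and the top-ten-percent “strip” are assumptions of mine, not anything Elliptic showed me.

```kotlin
// Again hypothetical: a positional callback is assumed, not confirmed.
data class HandPosition(val x: Float, val y: Float, val distanceMm: Int)

class AddressBar {
    var visible = false
        private set

    fun reveal() { visible = true; println("address bar revealed") }
}

// Reveal the bar only when a finger closes in near the top of the screen.
fun onHandMoved(pos: HandPosition, bar: AddressBar, screenHeightPx: Float) {
    val nearTopStrip = pos.y < screenHeightPx * 0.1f  // top 10% of the screen (assumed)
    val closeEnough = pos.distanceMm < 80             // same invented threshold as above
    if (nearTopStrip && closeEnough && !bar.visible) {
        bar.reveal()
    }
}
```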

I asked Mr. Strutt if his company’s SDK had been made available or if any of its work was open source, and he seemed almost taken aback at the audacity of the question. It’s clear that Elliptic’s goal is to tweak, refine, and eventually sell its wares to a larger suitor (likely a major Android OEM), at which point the hardware can be developed in-house, miniaturized, and integrated directly into phones as they’re sold. I’m all for useful gestures that change the way we interact with our devices, but what happens if Google builds similar technology into a future version of Android? If it’s lucky, Elliptic will have already struck a deal with an OEM by then, or better yet, been bought outright by one.
