Quantum Interface (Qi) of Austin, Texas has developed a predictive navigation interface that responds to motion, analysing the speed and direction of any input to determine user intent. The company is offering beta versions of an app that uses the technology.
When a user moves a finger toward a menu selection on a touchpad, gestures in the air, or moves their eyes in a car’s heads-up display or under a VR hood, the app infers the choice and literally moves toward the user, opening up the next level of information.
This interface presents more information to the user, faster, than current point-and-click or tap-and-lift interfaces.
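Qi has not published implementation details, but the core idea of inferring a target from the speed and direction of motion can be sketched. The snippet below is a minimal, hypothetical illustration, not Qi's actual algorithm: it extrapolates the pointer's heading from recent samples and picks the on-screen target whose bearing best matches that heading. The target names and layout are invented for the example.

```python
import math

# Hypothetical menu targets (name -> (x, y) screen position);
# illustrative only, not Qi's actual launcher layout or API.
TARGETS = {"mail": (100.0, 0.0), "music": (0.0, 100.0), "maps": (-100.0, 0.0)}

def predict_target(samples, targets=TARGETS):
    """Guess which target the pointer is heading toward.

    `samples` is a list of (x, y) pointer positions, oldest first.
    Computes the net direction of motion and returns the target
    whose bearing from the latest position is closest to it.
    """
    if len(samples) < 2:
        return None  # not enough motion history to infer intent
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) == 0:
        return None  # pointer is stationary: no intent to infer
    heading = math.atan2(dy, dx)
    best, best_err = None, math.pi
    for name, (tx, ty) in targets.items():
        bearing = math.atan2(ty - y1, tx - x1)
        # smallest angular difference between heading and bearing
        err = abs(math.atan2(math.sin(bearing - heading),
                             math.cos(bearing - heading)))
        if err < best_err:
            best, best_err = name, err
    return best
```

A real system would also weigh speed (fast, committed motion versus hesitation) and open the predicted menu level early, but the direction-matching step above captures the basic "no lift and tap" idea.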
The first use of the technology is in Qi’s Android smartwatch launcher, QiLaunch Wear.
As a user puts a finger on the watch face and starts moving it toward a selection, the app launches, with no lift-and-tap or point-and-click required. This updated look streamlines the interface, speeds up engagement with apps, and lets the user see more content and choices on a smaller screen.
The only requirement to control any device is continuous motion: there is no special sign language or set of gestures to learn, and no special rings, remotes, or other hardware necessary.
Designed to be universal, the Qi predictive motion interface works with almost any type of device, regardless of the sensors installed or type of touch or touchless environment, and can be added to any application or operating system.
The interface consumes less power than current touchscreen, gesture, or other methods of communicating with devices.
“With QiLaunch Wear and all the coming versions of our interface, there are no ‘stop signs’ that slow down navigation so users have the fastest route to navigate and select apps,” says Qi founder and CTO Jonathan Josephson.
from News http://ift.tt/1QllXAW
via Yuichun