Saturday, March 15, 2014

Ideum’s GestureWorks Platform Will Let Developers Create Complex Gestures For Ultrabook Apps

Ideum has built a platform for developers to create complex gestures for apps that can run on 100-inch tabletop displays or the small screens of Ultrabooks running Windows 8.

Ideum's GestureWorks platform shows how developers will increasingly have the freedom to think beyond apps that require a keyboard or a mouse. The platform makes it easier to develop for the new Intel-based Ultrabooks that ship with the multitouch capabilities of Windows 8.

Here's a video of the GestureWorks platform that I shot at the Blur conference this week. It's significant because it demonstrates the new kinds of developer platforms that we can expect to emerge as complex gestures become a standard aspect of application development in the post-PC age.

PC manufacturers such as Samsung, Sony and Asus have made touch a primary feature of their Ultrabook offerings. But until recently the capability has gone relatively unexplored. Part of the reason may be the usability issues that can make touch less than optimal: when the hardware lags behind the software, touch becomes a broken experience, which turns off users.

Ideum Founder Jim Spadaccini said that the company worked with Intel to optimize its processors for gesture-based apps. You can see the responsiveness in the demos: the Google Earth demo he showed had no lag at all on the Ultrabook. For instance, you can use your fingers to touch the map and zoom in or out.

Ideum created GestureWorks as an ActionScript SDK for multitouch development starting in 2009. The latest version of the platform supports more than 300 gestures. As part of the platform, Ideum has also developed its own markup language, called GestureML, for describing multitouch gestures, which comes with an open-source gesture library.
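For a sense of what GestureML looks like, here is a trimmed sketch of a multi-finger drag definition. The element and attribute names follow Ideum's published GML examples as best I recall them, so treat the exact schema as illustrative rather than authoritative:

    <!-- Sketch of a GestureML (GML) definition: a 1-to-5-finger drag.
         Names follow Ideum's published GML examples from memory;
         the exact schema may differ. -->
    <Gesture id="n-drag" type="drag">
      <match>
        <action>
          <initial>
            <!-- Match any cluster of 1 to 5 touch points -->
            <cluster point_number_min="1" point_number_max="5"/>
          </initial>
        </action>
      </match>
      <analysis>
        <!-- Continuously measure the cluster's motion -->
        <algorithm class="kinemetric" type="continuous">
          <library module="drag"/>
          <returns>
            <property id="drag_dx" result="dx"/>
            <property id="drag_dy" result="dy"/>
          </returns>
        </algorithm>
      </analysis>
      <mapping>
        <!-- Dispatch a continuous "drag" event, feeding dx/dy to the target's x/y -->
        <update dispatch_type="continuous">
          <gesture_event type="drag">
            <property ref="drag_dx" target="x"/>
            <property ref="drag_dy" target="y"/>
          </gesture_event>
        </update>
      </mapping>
    </Gesture>

Because gestures are declared as data rather than hard-coded, the same definition can drive a 100-inch table or an Ultrabook touchscreen, and developers can tweak thresholds or compose new gestures without recompiling the app.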

With GestureWorks, Ideum has given Ultrabook users the ability to create their own gestures for the touch capabilities of Windows 8. At Blur, Spadaccini showed me how someone on an Ultrabook machine can use the companion GestureKey utility to attach mapped gestures to an application such as Google Earth.
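To make that concrete, the mapping might look something like the sketch below: a gesture-to-keystroke table that a GestureKey-style tool could apply to a legacy application. I'm inventing the schema purely for illustration; Google Earth's keyboard shortcuts are real as best I recall, but the element names are not Ideum's actual configuration format:

    <!-- Hypothetical GestureKey-style mapping (illustrative schema, not
         Ideum's actual format): translate touch gestures into keyboard
         shortcuts a legacy app like Google Earth already understands. -->
    <Application name="GoogleEarth" window_title="Google Earth">
      <GestureMap>
        <map gesture="pinch-out" key="PAGE_UP"/>              <!-- zoom in -->
        <map gesture="pinch-in"  key="PAGE_DOWN"/>            <!-- zoom out -->
        <map gesture="two-finger-rotate" key="SHIFT+RIGHT"/>  <!-- rotate the view -->
      </GestureMap>
    </Application>

The point is that the host application never has to know about touch at all: the platform recognizes the gesture, and the utility injects the equivalent keystroke.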

The Blur conference explored the "blur" between the real and virtual worlds: robots, and thought-control technologies that harness brain waves to turn down the lights or let a player drive a car in a game with their mind. These new technologies show how standard keyboard-and-mouse apps will increasingly blend with ones that let people touch a screen rather than type a command. These new apps also go beyond the single-touch interactions that have become the norm on mobile devices. (I'll explore these emerging technologies in a later post.)

I think of it this way: look at the tables and walls in any room, then consider how these physical objects will soon embed the compute capabilities that come with the increasing density of microprocessors. Someday these new kinds of tables will have thousands of sensors, letting us brush our hands across the surface or draw on it with a stylus.

At any coffee shop, people sit at tables with their laptops perched on top to do their work and interact online. More than ever, people now bring tablets to read or work on; they are smaller and less noticeable, but they are still separate hardware devices. In the future, the laptop will become part of the table itself, the hardware entirely abstracted and controlled by any variety of complex gestures as we co-exist in real and virtual worlds.
