Augmented Reality Apps: Making the Case for Smart Eyewear
17th October 2013
The publicity around Google Glass has helped draw attention to the market potential of wearable technology. Google Glass is merely the most visible entrant in what is quickly becoming the next big technological trend: smart eyewear. Other tech giants such as Sony and Microsoft, along with lesser-known but formidable competitors Vuzix and Recon Instruments, are developing their own wearable gadgets. As the consumer tech world awaits the advent of wearable tech, in which we all hope to live Jetsons-like, will augmented reality (AR) apps define the future of smart eyewear?
Forrester Research recently reported that over 20 million U.S. consumers are willing to wear augmented reality devices. Earlier this year, IHS forecast that the adoption of AR in smart-glass devices will drive volumes of 10 million units through 2016. Apps will be critical to driving the success of Glass; indeed, IHS stated that “Google is betting the house that developers will produce some compelling applications for Glass.” Topping the list of apps that will define the Glass experience are AR-related apps, which are designed to add an extra layer of information or experience to what is being seen. Without AR apps, the forecast sinks to just 1 million units, an indication of the extent to which AR will shape the wearable market.
Imagine the Possibilities
Consumer AR applications are dominating the gaming industry. By tapping into a user’s environment, AR builds a backdrop for the gamer’s world, opening the door to a new hybrid virtual/real construct. Wearables can provide a more immersive user experience, whether as a standalone point of entry to the game or as an enhancement to games played on a smartphone or tablet.
Other examples of useful apps for smart eyewear include foreign-language translation for tourists visiting Japan, as developed by DoCoMo, along with directions to local points of interest and virtual guides for museums and other attractions.
Yet consumers won’t be the only ones to adopt AR technology; wearables are quickly demonstrating their value in the enterprise. For example, with one in 10 workers operating without desks, smart eyewear offers the best way to provide the benefits of on-the-go access to computing and corporate data. Enterprise deployments of AR-enabled applications include stock picking in warehouses, virtual manuals and sales demonstrations that allow consumers to see products in their own homes before purchasing. These kinds of applications will generate additional revenue for early adopters.
AR Device Architecture
The evolution of AR from dedicated systems to open smartphone platforms is now migrating to wearable smart-glass devices. However, the challenge remains of delivering the always-on, always-augmented experiences that users expect. In particular, minimizing power consumption and heat dissipation is key to consumer adoption, especially given the small form factors that wearable systems require.
A short battery life limits the appeal of a wearable device. If consumers have to take off their watches or glasses more than once a day to charge them, that’s going to limit how often they’re willing to use them. Beyond the nuisance of recharging, AR-based applications are known to consume a significant amount of battery life. The camera needs to stay on so the vision system can identify, track and recognize objects; the application needs to run on the operating-system processor and render complex graphics onto a display that also has to run constantly. In addition, GPS, WiFi and cellular radios may need to run in parallel to allow accurate location tracking and data access. All of these elements consume power, so careful consideration needs to be given to each component and to the overall system architecture and design to avoid excessive power consumption.
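To see why this adds up so quickly, here is a rough back-of-the-envelope power budget for an always-on AR wearable; every figure in it is an illustrative assumption, not a measurement of any real device.

```python
# Rough back-of-the-envelope battery-life estimate for an always-on AR wearable.
# Every power figure below is an illustrative assumption, not a measured value
# for any real device.

component_power_mw = {
    "camera_sensor": 150,          # always-on image capture
    "vision_processing": 300,      # identify, track and recognize objects
    "application_cpu": 250,        # operating system and app logic
    "graphics_and_display": 400,   # rendering overlays on an always-on display
    "gps": 50,                     # location tracking
    "wifi_cellular": 200,          # data access
}

battery_capacity_mwh = 2000  # roughly a 540 mAh cell at 3.7 V (assumed)

total_draw_mw = sum(component_power_mw.values())
runtime_hours = battery_capacity_mwh / total_draw_mw

print(f"Total draw: {total_draw_mw} mW")
print(f"Estimated runtime: {runtime_hours:.1f} hours")
# With these assumed figures the device runs for well under two hours,
# which is why every component has to be optimized for power.
```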
A proven approach is to support the various computational activities with dedicated processing elements, giving rise to distributed heterogeneous processing systems. In such systems, each activity is assigned a processing element optimized for that function, which allows the system architect to tune overall power consumption by ensuring every part of the system does its particular job as efficiently as possible.
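As a purely illustrative sketch, the snippet below maps the stages of such a pipeline onto hypothetical processing elements; the stage names and assignments are assumptions for illustration, not a description of any particular chip.

```python
# Minimal sketch of a distributed heterogeneous AR pipeline: each activity is
# assigned to the processing element best suited to it. The element names and
# assignments here are hypothetical.

PIPELINE = [
    ("image_capture",      "camera ISP"),          # fixed-function image signal processing
    ("feature_tracking",   "vision co-processor"),
    ("object_recognition", "vision co-processor"),
    ("sensor_fusion",      "low-power MCU"),       # GPS/IMU fusion kept off the main CPU
    ("application_logic",  "application CPU"),
    ("overlay_rendering",  "GPU"),
]

def describe_pipeline(pipeline):
    """Print which processing element handles each stage of the AR pipeline."""
    for stage, element in pipeline:
        print(f"{stage:20s} -> {element}")

describe_pipeline(PIPELINE)
```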
Always On, Always Augmented
At InsideAR 2013 last week in Munich, Germany, Metaio demonstrated AR apps capable of visual search, which lets users identify objects in near real time and link to online information and purchasing options for those objects. Metaio also demonstrated IKEA’s AR-enabled catalog, which allows customers to view furniture from the catalog in their homes using a mobile app. Generating a lot of interest at the show were a Google Glass app for interactive car assistance as well as a variety of enterprise applications using wearable smart glasses for stock picking and service support.
Thomas Alt, Metaio’s founder and CEO, pointed to Gartner’s assessment that “Augmented Reality is one of the Top 10 IT technologies of our time”, caveating this with his view that device makers need to deliver “always-on, always augmented” experiences for users to gain real value and not only enjoy AR apps but demand them.
To make these apps possible, high-performance, ultralow-power multi-core and computational image-processor chips are critical to optimizing the AR experience. Specialized co-processors or dedicated hardware IP blocks can deliver major power and performance gains, enabling device designers to differentiate their products in a hyper-competitive market.
Making the Case
With the AR market expected to reach $659.98 million by 2018, many factors will contribute to its success, including design, performance and features. Ultimately, however, the AR industry’s vision is for the technology to become an invisible utility, a seamless bridge between the digital and physical worlds. That is a challenge, perhaps, but one that revolutionary developments in processor capabilities and vision-computing techniques should soon address.
Remi El-Ouazzane is CEO of Movidius.