
3 ways augmented reality will transform UX design



With Apple’s recent launch of ARKit and Google’s announcement of ARCore, the nascent augmented reality space is set for a major battle. AR presents a brand-new frontier of computing interfaces that will redefine how we interact with information. As such, the UX designers of tomorrow will need the right knowledge and tools to build immersive interfaces. Here are three significant ways that AR will change UX design.

The human interface

AR interfaces differ from 2D screen-based interfaces, like those of smartphones and laptops, in two key ways. The first is 360 degrees of potential context: the ability to organize information all around the user. The second is Z-depth: the capability to organize and interact with information along the Z-axis. In other words, we will no longer have to work within the confines of screens. Designers will be able to place objects, assets, and elements directly into space.
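To make these two properties concrete, here is a minimal sketch of what "placing an element in space" means in code. The coordinate convention and the `place_around_user` helper are assumptions made for illustration, not any particular AR framework's API:

```python
import math
from dataclasses import dataclass

@dataclass
class Placement:
    """A digital element anchored in 3D space rather than on a 2D screen."""
    x: float  # metres to the user's right
    y: float  # metres above eye level
    z: float  # metres along the user's forward axis (Z-depth)

def place_around_user(bearing_deg: float, elevation_m: float, depth_m: float) -> Placement:
    """Position an element anywhere in the user's 360-degree surroundings.

    bearing_deg: 0 = straight ahead, 90 = to the right, 180 = behind.
    depth_m:     distance from the user along the Z-axis.
    """
    rad = math.radians(bearing_deg)
    return Placement(
        x=depth_m * math.sin(rad),
        y=elevation_m,
        z=depth_m * math.cos(rad),
    )

# A notification panel 2 m away, 45 degrees to the user's right:
panel = place_around_user(bearing_deg=45, elevation_m=0.0, depth_m=2.0)
```

Note that a bearing of 180 degrees yields a negative Z: the element sits behind the user, something no screen-based layout system can express.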

By combining sources of information this way, AR will eliminate many of the interaction costs associated with screen-based interfaces. This includes actions such as scrolling, switching tabs, and waiting for pages to load. This will reduce the user’s cognitive load and make completing tasks more efficient: the ultimate goal of UX design. However, for all the potential benefits AR will bring to UX, screen-less interfaces also present challenges.

Input and interaction

One of the greatest challenges presented by these new interfaces will be defining new interaction standards. The personal computer and the smartphone introduced interaction standards that have become second nature over time: clicking through different windows, keying in text input, and zooming in with a finger-pinch gesture. These standards were designed for their corresponding hardware peripherals, like the mouse or touchscreen — but what’s the mouse or touchscreen of AR?

Perhaps one of the most exciting implications of AR interfaces is that they will move us toward a system of natural interactions — intuitive and well adapted to the way we engage with the physical world. In the real world, productivity and communication tasks are accomplished using our hands and voices.

AR is largely confined to the smartphone today. This means that, for now, we’ll be using a combination of traditional and immersive interaction standards. However, as we move toward AR wearables like the Microsoft HoloLens and Meta headsets, interactions will become increasingly gesture-based. One example is the “air tap” gesture on HoloLens.

With the air tap, users reach out to objects in front of them in order to manipulate them, mimicking natural human behavior. However, because digital objects in AR provide no haptic feedback, the gap between natural interaction and digital output will have to be identified and bridged by thoughtful UX designers.
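One common way to bridge that gap is to substitute explicit confirmation events for missing haptics. The sketch below is an illustrative pinch-style detector, not Microsoft's actual air-tap implementation; the distance values and thresholds are invented for the example:

```python
PRESS_THRESHOLD = 0.02    # metres: fingertips closer than this counts as a press
RELEASE_THRESHOLD = 0.04  # wider release threshold adds hysteresis against jitter

class AirTapDetector:
    """Turns a stream of raw fingertip distances into discrete tap events.

    In a real headset the distances would come from a hand-tracking sensor;
    here they are plain floats so the logic can be followed in isolation.
    """
    def __init__(self) -> None:
        self.pressed = False

    def update(self, pinch_distance_m: float) -> bool:
        """Return True exactly once per completed press-and-release gesture."""
        if not self.pressed and pinch_distance_m < PRESS_THRESHOLD:
            self.pressed = True
        elif self.pressed and pinch_distance_m > RELEASE_THRESHOLD:
            self.pressed = False
            return True  # gesture completed: fire the tap event
        return False

detector = AirTapDetector()
frames = [0.05, 0.03, 0.015, 0.01, 0.03, 0.05]  # hand closing, then opening
taps = [detector.update(d) for d in frames]
```

The single `True` event is where a designer would trigger a visual or audio cue — the stand-in for the physical click the user's fingers never feel.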

Contextual input and reactive interfaces

A core property of augmented reality is that AR devices automatically interpret contextual input from the real world in order to provide actionable digital output. A relatable example is Snapchat’s AR face filters: the filters provide output, in the form of a dog-filter overlay, based on a facial-recognition system. The contextual input, in this case, is the user’s face. The implication is that designers will have to adapt to creating UX flows, interactions, gestures, and animations for reactive interfaces.
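The essence of a reactive interface is an event pipeline from real-world context to digital output. Here is a minimal sketch under stated assumptions — the event names, handler signatures, and `ReactiveARInterface` class are all hypothetical; a real system would be driven by computer-vision callbacks from the AR framework:

```python
from typing import Callable

class ReactiveARInterface:
    """Maps contextual input events (e.g. 'face detected') to digital output."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[dict], str]]] = {}

    def on(self, event: str, handler: Callable[[dict], str]) -> None:
        """Register a piece of digital output to produce when context changes."""
        self._handlers.setdefault(event, []).append(handler)

    def context_changed(self, event: str, data: dict) -> list[str]:
        """Feed real-world context in; get the reactive output back."""
        return [handler(data) for handler in self._handlers.get(event, [])]

# Hypothetical Snapchat-style flow: a face appears, the filter reacts.
ui = ReactiveARInterface()
ui.on("face_detected", lambda d: f"overlay dog filter at {d['position']}")
outputs = ui.context_changed("face_detected", {"position": (120, 80)})
```

The design point is that the interface never waits for a deliberate user action: the environment itself is the input device.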

When designing for a 3D medium, UX designers also have to consider how to anchor the user’s work space or play space. It’s crucial that users are able to easily adjust and adapt to different environments. With every UX decision, designers must now consider the physical space onto which digital elements will be overlaid. UX research will involve testing software in different lighting and weather conditions, and in various kinds of exteriors and interiors.
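That kind of environmental adaptation can itself be reactive. As a toy illustration, an overlay might adjust its opacity to the ambient light level; the lux thresholds and opacity values below are invented for the sketch — real values would have to come from exactly the sort of UX research in varied conditions described here:

```python
def overlay_opacity(ambient_lux: float) -> float:
    """Pick an overlay opacity that stays legible as lighting changes.

    Thresholds are illustrative assumptions, not measured values.
    """
    if ambient_lux < 50:       # dim interior
        return 0.6             # lower opacity so the overlay doesn't glare
    if ambient_lux < 10_000:   # typical indoor / overcast outdoor
        return 0.8
    return 1.0                 # full daylight: maximum contrast

# The same element rendered in a dark room, an office, and direct sun:
opacities = [overlay_opacity(lux) for lux in (10, 500, 50_000)]
```

A screen-based app rarely has to care where it is; an AR interface must treat the environment as a first-class design variable.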

Conclusion

UX design for AR is still at an experimental stage. As such, it’s early enough to “break the rules” — or even to define what the rules are and will be. The technology will mature alongside design standards, ultimately shaping the next big thing in computing. In a decade, “air tapping” with AR smart glasses might be as intuitive as smartphone pinch-zooming is today.

Michael Park is the CEO and Founder of PostAR, a platform that lets you build, explore, and share augmented realities.





