
Eye-tracking is virtual reality’s next frontier



Eye-tracking technology has the potential to help VR bridge two gaps. It offers users additional control and a more intuitive experience, while future developments in foveated rendering will decrease the processing power necessary for rendering complex 3D environments, making VR more accessible overall.

Developers have talked a great deal about the importance of foveated rendering as display resolutions for future VR headsets increase and untethered headsets based on mobile systems-on-chip gain popularity. And once motion controls are supplemented with eye tracking, the ability to understand the user’s intent makes the system more natural and intuitive, opening the door to more immersive possibilities such as socially responsive VR worlds.

As part of our efforts to explore the best VR experience possible and to meet the growing interest in combining eye tracking and VR as a research methodology, Tobii recently released a VR developer kit for creating integrated eye-tracking content, as well as a development kit for research applications that enables the collection and recording of eye-tracking data. While we’ve only just begun to grasp the full potential of eye tracking in VR gaming and research, the near-term benefits include improvements to the following components of the VR experience.

Interaction

In the realm of VR, input methods face a trade-off: the most accurate controls tend to be the least fast and versatile, and oftentimes reliable accuracy wins out. Introducing eye tracking to VR platforms will enable controls that are intuitive, effortless and fast at the same time. Allowing a user’s gaze to select individual items in a VR world supports faster-paced and more nuanced experiences. For example, following and targeting a fast-moving object in VR is much easier with your eyes than by pointing at it with your forehead or with hand controllers.
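Gaze selection of the kind described above is often implemented as a cone test around the gaze ray: pick the candidate object whose center lies within a small angular threshold of where the user is looking. The following is a minimal illustrative sketch in Python — the function name, the object representation and the 3° threshold are our assumptions, not part of any Tobii API:

```python
import math

def select_gazed_object(gaze_origin, gaze_dir, objects, max_angle_deg=3.0):
    """Return the name of the object closest to the gaze ray, within an
    angular threshold, or None if nothing is near the gaze direction.

    gaze_origin / gaze_dir: 3-tuples; gaze_dir need not be normalized.
    objects: list of (name, position) pairs with positions as 3-tuples.
    """
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def normalize(a):
        length = math.sqrt(dot(a, a))
        return tuple(x / length for x in a)

    d = normalize(gaze_dir)
    best, best_angle = None, math.radians(max_angle_deg)
    for name, pos in objects:
        to_obj = normalize(sub(pos, gaze_origin))
        # Angle between the gaze ray and the direction to the object center.
        angle = math.acos(max(-1.0, min(1.0, dot(d, to_obj))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

Running this per frame against the tracker’s gaze ray is what makes targeting a fast-moving object feel effortless: the user only has to look at it.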

Navigation

Publicly available retail headsets can only determine where a user’s head is turned, not where the user is actually looking. This is a subtle yet important distinction that strips a layer of depth from virtual interactions. Moving within VR today typically requires four steps: look at where you wish to go, turn your head to point in that direction, use the motion controller to mark a location and then teleport there. Eye-tracking-compatible head-mounted displays, or HMDs, cut this to two steps: users simply look where they want to go, press a button and they’re there. Gaze navigation not only helps users move through VR worlds more quickly, but also allows for fast and simple menu navigation.
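The look-and-press teleport step can be reduced to a ray–plane intersection: project the gaze ray onto the ground to find the destination. A minimal sketch, assuming a flat ground plane at y = 0 and a maximum teleport range (both our simplifications — a real engine would raycast against level geometry):

```python
import math

def gaze_teleport_point(eye_pos, gaze_dir, max_distance=10.0):
    """Intersect the gaze ray with the ground plane y = 0.

    Returns the teleport destination (x, 0, z), or None when the user
    is looking level/upward or the point is beyond max_distance.
    """
    ox, oy, oz = eye_pos
    dx, dy, dz = gaze_dir
    if dy >= 0:          # looking level or upward: no ground hit
        return None
    t = -oy / dy         # ray parameter where y reaches 0
    x, z = ox + t * dx, oz + t * dz
    dist = math.sqrt((x - ox) ** 2 + oy ** 2 + (z - oz) ** 2)
    if dist > max_distance:
        return None
    return (x, 0.0, z)
```

On a button press, the application teleports the player to the returned point — no separate head turn or controller aim required.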

Above: A VR avatar using Tobii eye tracking

Social Cues

One of the most surprising yet significant benefits of eye tracking in VR is the addition of gaze-sensitive avatars and player models. Accurately tracking a user’s eye movements allows players’ avatars to reflect their shifting attention in-game. Interacting with the virtual world becomes more immersive when objects and characters can naturally react to a user’s gaze, offering a wink back at the player or perhaps a raised eyebrow in acknowledgement. We recently partnered with VR game developer Against Gravity to show how the subtle use of facial cues and gaze can affect players’ experience of poker in their social VR title Rec Room, and were impressed by the enhancements to both social interaction and in-game strategy.
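One practical ingredient behind gaze-reactive characters is dwell detection: a character should react only once the player’s gaze has rested on it for a moment, so that noisy per-frame samples and passing glances don’t trigger constant winks. A purely illustrative sketch (the function, sample format and 500 ms threshold are our assumptions):

```python
def gaze_dwell_trigger(samples, target, dwell_ms=500):
    """Return True once the player's gaze has rested on `target` for at
    least dwell_ms of consecutive samples.

    samples: chronological list of (timestamp_ms, gazed_object_or_None)
    pairs, e.g. as produced by per-frame gaze selection.
    """
    start = None
    for t, obj in samples:
        if obj == target:
            if start is None:
                start = t        # gaze just arrived on the target
            if t - start >= dwell_ms:
                return True      # sustained attention: fire the reaction
        else:
            start = None         # gaze left the target: reset the timer
    return False
```

In a game loop this would run incrementally rather than over a stored list, but the hysteresis idea is the same.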

Foveated Rendering

Foveated rendering refers to a rendering technique that draws in full definition only the area where the user is looking, exploiting the fact that visual acuity drops off sharply outside the fovea. By drastically reducing the processing power necessary for full VR experiences, foveated rendering has the potential to lower both the hardware requirements and the price of consumer VR HMDs. Additionally, accurate eye-position data would allow next-gen HMDs to automatically adjust the focal length and position of their screens to ensure the clearest possible image. In March, we showcased the current boundaries of eye tracking in VR with an early build of The Solus Project VR, which included foveated depth of field, gaze interaction and gaze navigation.
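The core decision in foveated rendering can be sketched as choosing a shading rate per pixel (or tile) from its angular distance to the gaze point. The thresholds, rates and uniform pixels-per-degree assumption below are illustrative only — real implementations use GPU variable-rate-shading features and lens-corrected angular mappings:

```python
import math

def shading_rate(pixel, gaze_pixel, ppd, inner_deg=5.0, mid_deg=15.0):
    """Pick a shading rate for a pixel (or tile) from its angular
    distance to the gaze point.

    ppd: display pixels per degree of visual angle (assumed uniform).
    Returns 1 (full rate), 2 (half) or 4 (quarter): shade every Nth
    pixel and upsample, mirroring how acuity falls off with eccentricity.
    """
    dx = pixel[0] - gaze_pixel[0]
    dy = pixel[1] - gaze_pixel[1]
    ecc_deg = math.hypot(dx, dy) / ppd   # eccentricity in degrees
    if ecc_deg <= inner_deg:
        return 1                         # foveal region: full detail
    if ecc_deg <= mid_deg:
        return 2                         # near periphery: half rate
    return 4                             # far periphery: quarter rate
```

Since the quarter-rate region covers most of the frame, the bulk of the shading work is cut while the region under the fovea stays sharp.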

The growing trend of major VR players and tech giants acquiring eye-tracking companies to enhance their current and future platforms shows how important the technology has become and how close it is to widespread adoption. Meanwhile, upstarts such as FOVE and 7invensun are charting their own path into the market by building their own HMDs and accessories. Tobii’s collaboration with other members of the OpenXR™ Working Group to establish an open standard for VR/AR will help ensure those breakthroughs extend to all future VR platforms and devices. As we build on our previous experience implementing eye tracking for PCs, tablets and wearable eye trackers, we foresee eye tracking in VR influencing future HMD designs through rich eye-tracking data and transforming behavioral research through programs such as Walmart’s VR instruction curriculum and the NFL’s new referee training program. These opportunities open the door to controlled study environments that would otherwise be too risky, inaccessible, cumbersome or expensive to set up.

Joakim Karlén is Director of Product Management, VR and Middleware for Tobii, spearheading the company’s efforts in expanding eye tracking technology into the VR industry.

The PC Gaming channel is presented by Intel®‘s Game Dev program.

