Feature

How Eye-Tracking May Change The Way We Play And Interact With Video Games

by Blake Hester on Aug 21, 2016 at 11:42 AM

Eye contact is one of our most basic means of communication and is usually the first way we start any in-person interaction. We look someone in the eyes, then we greet them. We make eye contact, then initiate conversation. It’s the same when playing video games. We see a title screen, then we engage. But the latter’s always been a one-sided experience. Video games have never looked back.

[Above image credit: Petr Novák, Wikipedia]

According to some with their fingers on the pulse of a new technology, however, this may soon change. Companies are starting to experiment with a middleware technology that allows games to track users’ eye movement and points of focus, opening new doors for how developers gather feedback from their players and interact with them.

Eye-tracking technology also has the potential to change the way we use virtual reality. VR has been on the tip of a lot of tongues recently, but it also has undeniable hurdles to overcome if it’s going to go mainstream. It’s expensive, it’s cumbersome, and developers haven’t settled on a standard input device, with some choosing motion controls and others sticking to conventional controllers. Eye-tracking, however, may hold the key to VR’s mainstream appeal. 

It can change the way we play video games, it may help us look into the eyes of non-playable characters, and it may help lower virtual reality’s financial barrier to entry. 

It’s All In The Eyes
The advanced study of eyes goes back to 1879, when the French ophthalmologist Louis Émile Javal noticed that when someone is reading, their eyes move in a series of short stops (now called fixations, separated by rapid jumps called saccades) rather than one fluid motion. This observation led to eye-tracking research, which proliferated in the 1970s, especially in the study of how we read.

A decade later, researchers began looking into how human eyes scan computer screens and search for command keys and menus, which led to web interfaces designed around where a person’s focus would be. More recently, more sophisticated eye-tracking technology has led to commercial applications, as well as studies on how viewers look at television shows, commercials, and web pages. Now, it may be video games’ turn to look at us.

Jesse Schell – a former Disney Imagineer, a Carnegie Mellon professor, a futurist, and a game developer whose team created the virtual reality experience I Expect You To Die – makes a bold claim: “We haven’t had anything interesting in video games happen in about five years. Nothing really new or novel has happened for quite a while, and suddenly we have these experiences [with virtual reality] that are really different, really powerful.”

Whether or not he’s right, Schell is a big proponent of virtual reality and the social possibilities of eye-tracking in VR headsets.

“Eye-tracking sensors are right around the corner,” Schell theorizes, “and they’ll be integrated into virtual reality systems, they’ll be integrated into game consoles, they’ll be integrated into television sets.” With eye-tracking built in, these devices will know where a viewer is looking and what they’re paying attention to.

Last year, Assassin’s Creed Rogue launched on PC with eye-tracking support. Partnering with the company SteelSeries, Ubisoft gave players using its Sentry Eye Tracker the ability to control the game’s camera with their eyes. 

The Sentry is one of a few eye-tracking peripherals on the gaming market right now. The thin, black-and-red bar attaches to a user’s computer and, using three infrared microprojectors, scans their eyes 50 times per second, tracking and recording where they are looking on screen.

SteelSeries didn’t make this peripheral strictly with the player in mind; it also considered the viewer. For those watching a streamer on Twitch, for example, the Sentry monitors the player’s eye movements and projects them back to viewers, giving them insight into the streamer’s strategies.


SteelSeries' Sentry Eye Tracker

Sony has a similar technology in the works, which it demoed at GDC 2014 but hasn’t showcased since. Other companies have gone further: Tobii has implemented eye-tracking with its Tobii EyeX in games such as The Division, Farming Simulator, and more recently Deus Ex: Mankind Divided. In most of these games, eye-tracking is used to aim the camera or the player’s gun, decide where to take cover, and direct where the character moves. Players are no longer aiming with just their thumbs, but with their eyes as well.
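To make that concrete, here’s a minimal sketch of how gaze-driven camera aiming tends to work, written in Python. Everything in it – the normalized gaze coordinates, the dead zone, the turn speed – is an invented illustration, not Tobii’s or SteelSeries’ actual API.

```python
import math

# Hypothetical sketch of gaze-driven camera aiming. Gaze arrives as a
# normalized screen coordinate (0.0-1.0 on each axis); every name and
# constant here is invented for illustration, not a real vendor API.

SCREEN_CENTER = (0.5, 0.5)
DEAD_ZONE = 0.08    # ignore gaze this close to center (normalized units)
TURN_SPEED = 90.0   # maximum camera turn rate, in degrees per second

def camera_turn(gaze_x, gaze_y, dt):
    """Map one gaze sample to a (yaw, pitch) camera delta for one frame."""
    dx = gaze_x - SCREEN_CENTER[0]
    dy = gaze_y - SCREEN_CENTER[1]
    dist = math.hypot(dx, dy)
    if dist < DEAD_ZONE:
        return (0.0, 0.0)   # looking near center: leave the camera alone
    # Turn faster the further the gaze point sits outside the dead zone.
    strength = min((dist - DEAD_ZONE) / (0.5 - DEAD_ZONE), 1.0)
    yaw = TURN_SPEED * strength * (dx / dist) * dt
    pitch = TURN_SPEED * strength * (dy / dist) * dt
    return (yaw, pitch)

# Example: gaze held at the right edge of the screen for one 60 fps frame.
print(camera_turn(0.95, 0.5, dt=1 / 60))   # noticeable yaw, no pitch
```

The dead zone is the important design choice: without it, the camera would drift every time the player glanced at the HUD or simply read on-screen text.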

In 2014, Game Informer got an opportunity to demo an eye-tracking development kit at CES, made by Tobii in partnership with SteelSeries. Playing World Of Warcraft, we were surprised to discover how natural it was to use our eyes to control the game’s camera, target enemies, and pull up menus.

It doesn’t radically change the way we play, but it does shift the way we interact. When it works, it gives the aiming reticle the same precision as the player’s eye. That said, it remains to be seen how well the technology will work with precision-based games like Counter-Strike: Global Offensive, and whether there’s noticeable latency. Third-party apps such as Project IRIS have been released in an attempt to smooth this over.
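What smoothing might look like, in a minimal sketch: the exponential moving average below is a generic technique for steadying jittery gaze samples, not Project IRIS’s documented method, and the class is hypothetical.

```python
# A minimal sketch of one common way to steady jittery gaze data: an
# exponential moving average. This is a generic smoothing technique,
# not Project IRIS's actual method; the class name is hypothetical.

class GazeSmoother:
    def __init__(self, alpha=0.3):
        # alpha near 1.0: responsive but jittery; near 0.0: smooth but laggy.
        self.alpha = alpha
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Blend a new raw sample into the running estimate."""
        if self.x is None:   # first sample: nothing to blend with yet
            self.x, self.y = raw_x, raw_y
        else:
            self.x += self.alpha * (raw_x - self.x)
            self.y += self.alpha * (raw_y - self.y)
        return (self.x, self.y)

# Feeding in samples as a 50 Hz tracker might report them:
smoother = GazeSmoother(alpha=0.3)
for raw in [(0.50, 0.50), (0.51, 0.49), (0.90, 0.52)]:
    print(smoother.update(*raw))   # the last, saccade-like jump is damped
```

The trade-off lives in the alpha constant: a value low enough to kill jitter also drags the cursor behind the eye, which is exactly the latency question raised above.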

It’s an exciting new technology, one that feels futuristic and one with the potential to radically improve virtual reality. The use of eye-tracking in VR could change the way we interact with characters and the way we think about engagement.

“Every video game platform starts out a single-player platform and if it has some success there, then somebody takes it social and they’re the ones who make all the money,” Schell says. “When it goes social, that’s when it goes big. And VR is the most social of all technologies that have ever been created, because it’s the only technology where I can actually make eye contact with another person, where I can actually see in three dimensions the gestures of another person.” But for us to make that eye contact, first we have to look at an eye-tracker.

“[Eye-tracking cameras] pay attention,” adds Mark Mento, director of the American division of SensoMotoric Instruments (SMI). “They look at the eyes and the head and they understand [what] someone is paying attention to at any given time.” SMI specializes in eye- and gaze-tracking technology, and it is the company working with Sony on the aforementioned prototype.


The HTC Vive

Mento, like Schell, points to the technology’s social implications. “In the real world, if I look at you and you look at me, we make eye contact. That has meaning from a social perspective,” he says, explaining that when eye contact doesn’t happen in the digital world, it can be unnerving. “This sort of passive awareness of your attention, which is something that we very much use in a social context in the real world, can be implemented in the virtual world if you happen to know where all of us are looking in some virtual environment.”

“You can pay attention [with] gaze position and model an avatar based on that, so there are a lot of applications that follow along those lines,” Mento continues. “I think we’re one giant step before the common consumer application, but in some sense we’re heading in that direction.”

With the use of eye-tracking in video games, already an immersive medium, we could for the first time make actual, intimate eye contact with a virtual character or a player on the other side of the world. You could see their expressions and see where they look when talking, and they could do the same to you – creating moments between player and character that feel natural rather than artificial or fabricated, something developers have struggled with in the past.

If the game could look into a player’s eyes and talk back, it could create a personal connection in an impersonal space, diminishing the separation between the person on-screen and the person off it. You’re no longer a third-person observer to a romance scene; you’re engaged with the character, in the moment. You’re jumping from the rooftop, looking as quickly as possible for the next handhold. You’re looking deep into someone’s darting eyes during an interrogation, noticing their quick glances. For the first time in a game, you could look into the eyes of a person or monster when you kill them, potentially making a real statement about the gravity of the action.

Eye-tracking may have the potential to pull us directly into game moments we’ve previously only observed from a distance. It may open up a bigger conversation about our actions in games – be they romantic, violent, or adventurous – and about how we engage and how we interact. But what good is any of this if the platform needed for these moments is unaffordable? A new rendering technique could actually help with that problem, instead of adding to it.

“I can slack off”
There’s a small depression in the center of the eye’s retina called the fovea where visual acuity is at its highest. Everything else we see around it is blurry. Look at the last letter of this word; it’s the only thing you see in complete focus. Foveated rendering, using eye-tracking cameras, works much the same way. 

“The idea is that you can aim processing power where the person’s gaze position is,” Mento explains. “So if they’re looking at a particular place, you render to the highest degree at that position because a person’s peripheral vision doesn’t see in detail.” 

Foveated rendering could benefit developers and consumers alike. By rendering full detail only at the center of someone’s vision, developers can cut the GPU cost of their games. Players using headsets such as the Oculus Rift and HTC Vive – both of which require fairly demanding computers – could ostensibly run games on lower-caliber machines, since nothing outside the gaze point needs to be rendered at full quality.
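As a toy illustration of the idea, the sketch below picks a shading resolution for each screen tile based on how far it sits from the gaze point. The zone radii and quality tiers are invented numbers; real implementations make this decision on the GPU, per region of pixels.

```python
import math

# Toy illustration of foveated rendering's core decision: choose a
# shading resolution for each screen tile based on its distance from
# the gaze point. The radii and quality tiers are invented numbers.

FOVEA_RADIUS = 0.10   # full detail within this normalized distance of gaze
MID_RADIUS = 0.25     # reduced detail out to here; cheapest beyond that

def tile_quality(tile_center, gaze):
    """Return the fraction of full shading resolution for one tile."""
    dist = math.hypot(tile_center[0] - gaze[0], tile_center[1] - gaze[1])
    if dist < FOVEA_RADIUS:
        return 1.0    # foveal zone: render at full resolution
    if dist < MID_RADIUS:
        return 0.5    # near periphery: half resolution on each axis
    return 0.25       # far periphery: "slack off" -- the player won't notice

# Rough estimate of shading work saved on a 16x9 tile grid when the
# player looks at the center of the screen:
gaze = (0.5, 0.5)
tiles = [((c + 0.5) / 16, (r + 0.5) / 9) for r in range(9) for c in range(16)]
cost = sum(tile_quality(t, gaze) ** 2 for t in tiles)  # pixels scale with quality squared
print(f"shading cost vs. full-detail render: {cost / len(tiles):.0%}")
```

The squared term is where the savings come from: halving the resolution in both dimensions quarters the pixel work, which is the GPU headroom Mento describes.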

“It certainly has the potential to really improve the experience, especially if you have somewhat limited resources. Which does seem to be the case,” Mento says of the affordability of virtual reality headsets. “Foveated rendering basically lowers the GPU requirements, so presumably you could buy a cheaper [graphics] card.”


An example of foveated rendering in action. The center point is where visual acuity and rendering are at their highest.

For foveated rendering to work, it needs accurate eye-tracking to rely on. “If I know exactly where on those graphics you’re looking at any given time, [I can cause] the highest detail on the graphics to be only at the point where you’re looking, [and if] you look away from that place, I can shift that high-rendered spot somewhere else,” Mento explains. “I can slack off, I can have a lower acuity, less detail, because you’ll never see it.”

Right now, foveated rendering is more of a luxury item than a logical next step. Eye-tracking needs to become a standard enhancement to virtual reality before foveated rendering has a chance at being the big advancement that follows, and developers need to see its benefits in action to understand why they should want it in their games. NVIDIA, however, recently took a step toward making that happen.

NVIDIA partnered with SMI, using SMI’s eye-tracking cameras in an HTC Vive to demo NVIDIA’s “perceptually-based foveated rendering technique.” It’s a small step toward a potentially big advancement in getting virtual reality into more homes.

Time Will Tell
All of these different eye-tracking technologies are still in their early stages of development. If eye-tracking becomes the next big thing in virtual reality, it won’t be natively adopted until future generations of headsets. But if that happens, foveated rendering may be able to follow. 

That said, the combination of virtual reality and eye-tracking has the potential to let us truly engage with a character, not just passively interact – to look into their eyes, not just at them.