GDC 2019: Valve Psychologist: Brain-computer Interfaces Are Coming & Could Be Built into VR Headsets

Valve’s resident experimental psychologist Dr. Mike Ambinder took the stage at GDC 2019 today for his talk on the state of brain-computer interfaces (BCIs) and how they’ll inform the future of game design. In his talk, Ambinder put forth a near future where VR/AR headsets are kitted out with non-invasive electroencephalography (EEG) sensors that could one day feed data to game designers, letting them create a new generation of smarter, more reactive games. The long-term trajectory definitely points toward neuronal implants, Ambinder says, but we’re not there yet.

Ambinder explained that traditional biofeedback measures, such as heart rate, galvanic skin response, eye movement, facial expression, muscle tension, and physical posture, are all important to understanding what’s happening to the player at any given moment, although he argues that the core of the matter, the brain itself, is the frontier that holds the most promise for game developers.

Brain-computer interfaces are essentially a communication method that translates neuronal signals into actionable input for computers. In the context of gaming, Ambinder’s envisioned goal is to acquire physiological data from a player and use it to tell the game whether that player is happy, sad, frustrated, bored, focused, or distracted; the idea is to figure out why each state arises in each circumstance, be it anger at a game-breaking bug or happiness when a specific goal is achieved, and build systems that leverage this data.

And it’s not as far-fetched as you may think, Ambinder maintains:

“We can measure responses to in-game stimuli. And we’re not always getting [data] reliably, but we’re starting to figure out how. Think about what you’d want to know about your players. There’s a long list of things we can get right now with current technology, current generation analysis, and current generation experimentation,” he said.

Moment-to-moment insight into the user’s cognitive state could enable a kind of adaptive, immersive gameplay: variable game difficulty, on-the-fly AI responses tailored to the player’s mindset, and even the replacement of traditional input systems altogether, all within the purview of BCI’s gaming future, Ambinder said.
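The adaptive-difficulty idea can be sketched in a few lines. Everything here is hypothetical: the `frustration` and `engagement` inputs stand in for estimates a future BCI pipeline might produce, and the thresholds and step size are invented for illustration, not taken from Valve.

```python
# Hypothetical sketch of BCI-driven dynamic difficulty. The inputs
# stand in for cognitive-state estimates a future EEG pipeline might
# produce; the thresholds and step size are invented for illustration.

def adjust_difficulty(difficulty: float, frustration: float,
                      engagement: float) -> float:
    """Nudge difficulty toward the player's estimated state.

    All values are normalized to [0, 1]. High frustration eases the
    game off; low engagement (boredom) ramps the challenge up.
    """
    step = 0.1
    if frustration > 0.7:       # player is struggling: back off
        difficulty -= step
    elif engagement < 0.3:      # player is bored: push harder
        difficulty += step
    return min(1.0, max(0.0, difficulty))  # clamp to valid range

# Example: a frustrated player gets an easier game on the next update.
easier = adjust_difficulty(0.5, frustration=0.8, engagement=0.6)
```

The interesting design problem isn’t the nudging logic, which any dynamic-difficulty system already has, but producing trustworthy estimates to feed it, which is exactly where Ambinder says current BCI data is still noisy.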

In the near term, Ambinder sees EEG as the easiest way to start collecting that sort of psychological data and organizing it into something actionable for developers building more immersive games. In the case of EEG, electrodes are placed non-invasively at various points on the scalp and used to measure voltage fluctuations produced by the brain’s neurons. Attention, learning, memory, and intention have all been measured to some degree using EEG, and Ambinder hopes these measures, previously the subject of scientific research, will make their way into mainstream game design at some point. But people aren’t going to wear a full 35-electrode EEG helmet, like the one from the OpenBCI initiative, on its own; that’s where AR/VR comes in.

“If you’re going to measure brain signals, you need a way to get people to wear a helmet. If only us as game designers had a way of doing that,” Ambinder said jokingly, showing a slide of an HTC Vive headset. “One advantage of AR and VR as well is you’re getting consistent contact with the source of brain activity. So you might be able to do interesting things if you could convince them to wear a helmet with EEG sensors.”

Companies like Neurable have already begun productizing EEG hardware built specifically to work with VR headsets, letting people “control software and devices using only their brain activity,” the company claims.

That said, EEG data isn’t a perfect solution. Ambinder compared it to sitting outside a football stadium and trying to figure out what’s happening on the field just by listening to the intensity of the crowd’s reaction. The current generation of BCI devices is noisy, and EEG is among the noisiest, since it must pick up neuronal signals through the skull, scalp, and hair.
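Noisy as it is, EEG analysis typically works by extracting coarse spectral features, for example power in the alpha band (roughly 8 to 12 Hz), which research has long associated with relaxed, wakeful states. A minimal pure-Python sketch on a synthetic trace; the “EEG” here is just a 10 Hz sine buried in random noise, with no real recording or signal-processing library assumed:

```python
import math
import random

def band_power(samples, fs, f_lo, f_hi):
    """Naive DFT power estimate over [f_lo, f_hi] Hz.

    O(n^2) and toy-sized on purpose; real EEG pipelines use FFTs
    plus careful filtering and artifact rejection.
    """
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * t / n)
                     for t, s in enumerate(samples))
            im = sum(s * math.sin(2 * math.pi * k * t / n)
                     for t, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

# One second of synthetic "EEG": a 10 Hz alpha rhythm buried in noise.
fs = 128
rng = random.Random(42)
signal = [math.sin(2 * math.pi * 10 * t / fs) + 0.5 * rng.gauss(0, 1)
          for t in range(fs)]

alpha = band_power(signal, fs, 8.0, 12.0)   # band containing the rhythm
beta = band_power(signal, fs, 13.0, 30.0)   # noise only
```

Even in this idealized case the alpha band stands out above the noise floor; the challenge Ambinder describes is that real scalp-recorded signals are far weaker and messier than a clean sine wave.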

There’s still plenty to learn from EEG, Ambinder says, although deeper insight and a push toward more immersive, adaptive games would likely require invasive brain implants. Before that happens, such implants would need to pass what he calls the ‘LASIK threshold’: like the eye surgery, a procedure that is technically invasive but confers enough potential benefit to the user while minimizing overall risk.

Ambinder had nothing to announce regarding the company’s plans for potentially kickstarting BCI with something as monumental as a consumer VR-EEG headset (or brain implants, for that matter). Still, he sees EEG as a definite early avenue for capturing the sort of large data sets needed to start building out interaction systems, spanning everything from better tutorials informed by an individual’s ability to learn, to games that learn how players want to play as opposed to how they should play.

Making this knowledge more granular (and accurate) is still in its early phases, and will require both sufficient time and a large enough user base to gather data across a wide set of people. There’s a long list of powerful uses, and equally powerful pitfalls, yet to come.

If you want to get into the nitty gritty, Ambinder’s full talk will be up on the GDC’s YouTube page in the coming days. We’ll toss the video down here when it comes, so check back soon.
