Unity acquires AI chat analysis platform OTO, launches toxicity in gaming report


Unity has acquired OTO, an AI-based audio chat analysis platform that figures out if humans need to intercede in a toxic multiplayer gaming environment. The terms were not disclosed.

We all know that online gamers can be toxic, trash-talking each other and bullying those who aren't as skillful. Some of that is OK and can be chalked up to the culture around a game. But some of it crosses the line, and that's where OTO comes in.

As a real-time acoustic intelligence platform, OTO (pronounced Otto) can analyze online gaming voice or text chat sessions for the tone and emotional weight of the words and conclude whether a session needs the attention of a human moderator. While it uses automation technology, it doesn't automatically ban people for being toxic, said Felix Thé, Unity's vice president of product management in the Operate Solutions division, in an interview with GamesBeat.

OTO used machine learning and acoustic neural networks to create its technology. It can detect tonal patterns, intonation, amplitude, and expressions of human emotion as people interact. A variety of other companies, such as Modulate, are working on similar AI technologies.

“The focus of the technology is less about the speech, less about the spoken words, and more on the sentiment and the latent expression that is being carried in the conversation,” Thé said.

Above: OTO analyzes voices for acoustic intelligence.

This is important because Unity’s own survey, conducted by the Harris Poll and being released today, found that nearly seven in 10 players said they have experienced toxic behavior.

Unity's OTO can detect conversations carrying heavy emotional weight and flag them for community moderators. Those moderators can track the players involved and monitor them for further violations. OTO can also analyze conversations that players have reported. In this way, OTO helps filter the flood of conversations so human moderators can keep up.

One of the problems is that if you analyze only the words that were typed or spoken, you may misinterpret a gamer's intent. A player can curse while actually praising another player, or dryly deliver a sarcastic comment in an attempt to bully someone. That's why AI has such a hard time automatically policing voice chat, Thé said.

“Certain toxic emotions or expressions can be spoken calmly, but they’re acidly aggressive,” Thé said. “By utilizing an acoustic tonal pattern detection, we believe we can be much more effective in detecting a toxic interaction between communities online. We will also be more effective in interacting and detecting good behaviors and enjoyable interactions online. We made the acquisition with the end goal of making online interactions safe and accessible for everyone.”


Above: The OTO team.

OTO was started in 2017 by a group of scientists from SRI International, the Silicon Valley research institute. Their goal was to explore the frontiers of speech understanding by combining their expertise in behavioral science and AI.

The founders include CEO Teo Borschberg, who was an entrepreneur in residence at SRI, and chief technology officer Nicolas Perony, who specialized in complex systems at ETH Zurich, where he researched modeling social behavior at scale. Before founding OTO, Perony led the AI team at Hyperloop Transportation Technologies and held various data-oriented roles in industries ranging from blockchain to sustainability.

Integrating OTO

Above: People are finding multiplayer gaming can be toxic.

OTO will be integrated into Unity's Vivox voice chat platform as a cornerstone for curbing the rise of toxic behavior that leads to poor player experiences and, ultimately, lost revenue for game creators.

The aim is to give game makers access to an acoustic intonation engine that operates 100 times faster than speech recognition, is language independent, and is able to detect a wider and more accurate range of disruptive behavior. Developers can then swiftly and efficiently
determine the appropriate courses of action to address possible toxic situations.

The system could be implemented in a variety of ways, depending on how a game company has set up its terms of service around privacy. If a player reports a conversation, the developer can override the privacy concern, analyze the conversation, and determine whether it needs to ban a player. The tech has a roughly 60-millisecond lag, so it can operate in real time. Rather than analyzing an actual recording, the system can detect a pattern of abuse that ultimately leads to action against a player. Such judgments involve the use of AI tools, but the ultimate decisions still have to be made by humans.

“We don’t want the AI to make a decision. What we would like this technology to be used for is to make moderation of interactions more scalable and easier,” Thé said.

Thé noted that many people found respite in gaming during the pandemic as a way to connect with family and friends. But players also felt there was a surge in toxic behavior, the survey found.

Key findings

  • The poll found that nearly seven in 10 (68%) players — defined as those who played multiplayer games in the past year — said they’ve experienced toxic behavior while playing multiplayer games (e.g., sexual harassment, hate speech, threats of violence, doxing, or having their personal information stolen and displayed).
  • Nearly half of players (46%) said that they at least sometimes experience toxic behavior while playing multiplayer video games, with 21% reporting it every time or often.
  • And 67% of players were very/somewhat likely to stop playing a multiplayer video game if another player were exhibiting toxic behavior.
  • About 92% of players think solutions should be implemented and enforced to reduce toxic behavior in multiplayer games.

Women are more likely than men (49% to 39%) to say they quit playing a game because of toxicity. Over two out of three multiplayer gamers (68%) believe there was a surge of toxic behavior among gamers during the COVID-19 pandemic, with more than one in four (26%) saying they strongly agree. These findings clearly spelled out the need for something like OTO, Thé said.

The Harris Poll conducted the survey for Unity from June 21 to June 23, polling 2,076 adults aged 18 and over. Of those, 1,167 had played multiplayer games in the past year.

