AI Voice Chat Monitor Startup Modulate Raises $30M

AI voice chat moderation startup Modulate has closed a $30 million Series A funding round led by Lakestar. Modulate is the developer of ToxMod, an artificial intelligence tool for detecting and addressing violent or otherwise offensive speech in real-time video game voice chat. ToxMod’s AI is designed to flag threats, racial slurs, or any other comments the developers designate, using context and emotional analysis to determine how to respond and when to alert human moderators.

ToxMod Talk

Game service platforms can embed ToxMod within their voice chat systems, where it passively listens to conversations without saving anything until it is activated by something from a list of problematic keywords and phrases, both universal and custom, devised by Modulate and the platform, including racist slurs, threats, or anything else the developers want to know about. The AI assists human moderators, expanding their reach and alerting them to potential problems with a real-time transcript and analysis.

“ToxMod is changing the way game developers attack toxic behavior in their communities and this funding is a real validation of our mission to make online communities safer,” Modulate CEO Mike Pappas said. “We’re thrilled to welcome Mika [Salmi, managing partner at Lakestar,] and his vast store of experience to the Board as we grow our team and ramp up the development and deployment of ToxMod.”

ToxMod was born out of Modulate’s work on creating custom voice changers for video game players. The enhanced machine learning models enable emotional and contextual evaluation. Modulate claims an accuracy rate of more than 98% in spotting problems and says ToxMod enables human moderators to respond more than 25 times faster. The moderators can then decide on a course of action based on the language used and the context in which it was said.

“[The model] understands nuance and if someone is being aggressive or defensive,” Modulate CTO Carter Huffman told Voicebot in an interview when ToxMod launched at the end of 2020. “Nuance and understanding terminology for that language and game, that’s where our technical innovation comes in.”

AI Moderation

The new $30 million round is five times the $6 million Modulate had previously raised and will go toward rapidly scaling the company and its tech. The market is starting to see more competition, especially after digital voice chat exploded in popularity when the COVID-19 pandemic began. AI content moderator Spectrum Labs raised $32 million in January for its text- and voice-based AI moderator, which it claims can recognize and categorize more than 40 kinds of toxic comments. Microsoft may be headed toward a similar product, having filed a patent last year for an AI that gauges how people are feeling from their voices and other signals in order to moderate voice chat on its game servers. And Intel has developed a voice AI tool called Bleep, which runs on Intel PC chips, to spot and censor offensive language uttered by one player before anyone else hears it.
