Rec Room reduces toxic chat incidents by 70% with intelligent voice moderation

by | Nov 14, 2024 | Technology

Gaming experiences can be undermined, even ruined, by bad behavior in text chat or forums. In voice chat and in VR, that bad experience is magnified and far more visceral, so toxicity is amplified.

But positive interactions can be similarly enhanced. It’s vital that developers dig into how their users are relating to one another to understand how to mitigate harm, improve safety and trust, and encourage the kind of experiences that help players build community and stay for the long haul.

To talk about the challenges and opportunities emerging as the game industry begins to address just how bad toxicity can be for business, Imran Khan, senior writer for game dev and tech at GamesBeat, welcomed Yasmin Hussain, chief of staff at Rec Room, and Mark Frumkin, director of account management at Modulate, to the GamesBeat Next stage.


Backing up the code of conduct with voice intelligence

Moderation is one of the most effective tools for detecting and combating bad behavior, but it’s a complex undertaking for humans alone. Voice intelligence platforms, such as Modulate’s ToxMod, can monitor every live conversation and file a report directly to the human moderation team for follow-up. That provides the evidence required to make educated decisions to mitigate harm, backed by a code of conduct, and offers broader insight into player interactions across the game.

Rec Room has seen a 70% reduction in toxic voice chat incidents over the past 18 months since rolling out ToxMod, as well as experimenting with moderation policies and procedures and making product changes, Hussain said. Consistency has been key, she added.

“We had to be consistent. We have a very clear code of conduct on what we expect from our players, then they needed to see that consistency in terms of how we were moderating and detecting,” she said. “ToxMod is on in all public rooms. It runs in real time. Then players were seeing that if they were to violate the code of conduct, we were detecting those instances of toxic speech.”

With the data behind those instances, t …
