Mike Pappas, CEO & Co-founder of Modulate.
2023 was a busy year for trust and safety. We've seen increased scrutiny of social media (especially in the lead-up to a major election year) and a new wave of regulation intended to increase user safety and transparency. But these conversations often neglect a growing part of the online world: online games.
These games are host to hundreds of millions of players from every demographic and form a social bedrock for many. This was especially true during the pandemic quarantines. But gaming, like social media, struggles with its own forms of toxicity, which can erode the experience of community and self-expression so many find value in.
As trust and safety becomes a more active space, I believe it's worth asking: How has trust and safety evolved in gaming this year, and what's coming in 2024?
Gaming Trends Throughout The Year
Some of 2023’s highlights include:
Increased Investment In Trust And Safety Overall
There has been a significant increase in AAA game studios creating trust and safety teams for the first time, or substantially growing existing teams, as user numbers have continued to swell. Examples include Niantic, Epic Games, Roblox, Blizzard Entertainment, Ubisoft, Riot Games and many more.
Legislative Focus On Gaming Safety
U.S. lawmakers kicked off the year with inquiries to several gaming platforms, looking to understand their defenses against extremism. As the year progressed, Australia's eSafety Commissioner conducted its own inquiries, and then the U.K.'s Online Safety Bill passed into law, joining the EU's DSA from 2022 in explicitly requiring online platforms (including games) to take their safety strategies to the next level.
Voice Moderation Hits The Scene
Call of Duty, one of the largest gaming franchises in the world, announced its use of Modulate's AI voice moderation system, ToxMod, sparking a new level of interest in voice safety. Shortly thereafter, Roblox acquired a speech tech startup to explore building similar solutions, and Xbox and Epic joined Riot and Sony in offering tools that let players report harmful voice content.
Rise Of Generative AI
Generative AI (GenAI) took the world by storm, and gaming was no exception. Moderating user-generated content (UGC) has always been difficult for platforms, but with the increased scale and speed enabled by GenAI, trust and safety teams have been forced to aggressively rethink their strategies to ensure their users encounter age-appropriate content.
Transparency Reporting Kicks Off
As new regulation is set to require additional reporting next year, some gaming platforms like Xbox and Discord are getting ahead of the curve. Complementing this valuable lens into the gaming ecosystem is an influx of surveys and reports from other industry insiders like Unity, highlighting the prevalence of toxicity across a variety of game genres.
Gamifying Trust And Safety
Games like Trust & Safety Tycoon and Moderator Mayhem showcased a growing interest in both trust and safety and content moderation, indicating increased awareness outside of direct decision-makers of the importance and challenges of achieving online safety.
What To Expect In 2024 And Beyond
With all this momentum behind us, what should we expect in the new year?
Increased Maturity In Gaming Trust And Safety
While social media platforms like X have been cutting their trust and safety teams, gaming has largely been expanding and refining its approach. We’re also seeing greater gaming representation at major industry events like TrustCon. Expect that in 2024, you’ll hear more about trust and safety challenges as they manifest in games and see more innovative advances in that space.
Continued Regulatory Scrutiny
The DSA and OSA are now in effect, and U.S. and Australian regulators and lawmakers are looking closely at how extremists and other bad actors exploit gaming's unique combination of deep community and a largely anonymous player base. Expect to see more pressure on platforms to speak transparently about what's happening online and what they are doing to stop it. This may also result in new fines when regulations are not followed.
Balancing Safety And Privacy
For a long time, online platforms (especially those offering anonymity, like most games) have leaned heavily toward the absolute minimum of data collection. This has been reinforced by privacy regulations like COPPA, which imposes intense fines on child-directed platforms, with games sitting in an awkward gray area. But as pressure from end users and regulators to curb online toxicity increases, platforms have been forced to get creative: collecting as little data as possible while still gathering enough to identify and take action against the worst offenses.
Blending Of Gaming And Social
Is Roblox a gaming platform? What about Rec Room or Fortnite? Many so-called “games” these days are more akin to a community gathering space and compete increasingly heavily with more “conventional” social platforms like TikTok, YouTube, X and Facebook.
Expect it to get harder and harder to tell where “games” end and “social” begins, as we move toward a future where all online platforms compete to immerse and engage the same communities with a similar breadth of options.
Conclusion
Online games are at the forefront of innovation, putting novel technologies like GenAI to work as they offer a dizzying array of spaces for community gathering, socialization and friendly competition.
But those same social games can also be the target of a few bad actors looking to ruin the experience of others. As the platforms and technology evolve, so must the trust and safety teams that protect those communities. It’s not an easy job by any means.
For years, trust and safety has been seen by platforms as a cost center. Users have viewed it as somewhere between a non-entity and a nuisance. But in recent years, folks have started to appreciate the critical work these teams put in to keep these platforms welcoming.
Expect to see that trend continue. Expect regulators, end users and employers alike to continuously recognize the incredible value of trust and safety. Finally, expect the platforms that foster the richest, most well-managed community experiences to pull further ahead.
Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.