How EA’s Positive Play initiative aims to move from policing toxic players to creating positive experiences


We’ve known for years that online gaming can be a minefield of toxicity and harassment, especially for women. And while moderation tools have been around for almost as long, it’s only in the last few years that we’ve started to see the major gaming companies really recognize their responsibility and power, not only to stop this behavior but to proactively create positive spaces.

Last month we saw Riot Games and Ubisoft partner on such a project, and Xbox has also recently started publishing data on its moderation efforts. But one company that has been publicly championing this approach for a few years now is EA, through its Positive Play program.

The Positive Play program is led by Chris Bruzzo, Chief Experience Officer at EA. He has been with the company for eight and a half years, taking on this new role after six years as EA’s chief marketing officer. It was while he was still in that former role that he and current CMO David Tinson began the discussions that led to Positive Play at EA.

“David and I talked for many years about the need to engage the community on this issue and address toxicity in gaming and some of the really challenging things that were happening in what have been rapidly growing social communities in or around games,” says Bruzzo. “And so, a few years ago [in 2019] we held a summit at E3 and started talking about the collective responsibility that game companies and everyone else, gamers and everyone involved, have to address hateful behavior and toxicity in games.”

Launch of Positive Play

EA’s Building Healthy Communities Summit featured content creators from 20 countries, EA employees and third-party experts in online communities and toxicity. There were talks and panel discussions, as well as opportunities to provide feedback on how to address the issues raised.

Bruzzo says that both before and after the summit, it became very clear to him that women were having a “generally bad experience” in social gaming. If they revealed their gender or if their voice was heard, women often reported harassment or intimidation. But the response from the summit convinced him that EA could do something about it. This is how Positive Play was born.

He sought out Rachel Franklin, former head of Maxis, who had left for Meta (then Facebook) in 2016 to be its head of social VR, where Bruzzo says she unfortunately gained some additional relevant experience on the matter.

“If you want to find an environment that’s more toxic than a gaming community, go to a social VR community,” says Bruzzo. “Because not only is there the same amount of toxicity, but my avatar can come right up and get in your avatar’s face, and that creates a whole other level of not feeling safe or included.”

With Franklin at the helm as EA’s Senior Vice President of Positive Play, the group got to work. In 2020 they published the Positive Play Charter, a rundown of the dos and don’ts of social play in EA games. Its pillars include treating others with respect, keeping things fair, sharing appropriate content, and following local laws, and it states that players who don’t follow these rules can have their EA accounts banned. Basic as it may sound, Bruzzo says it’s a framework that lets EA step up moderation of bad behavior, as well as begin to proactively create experiences that are more likely to be progressive and positive.

The Moderation Army

Regarding moderation, Bruzzo says they have tried to make it easier for players to flag problems in EA games, and that AI systems are increasingly being used and improved to identify patterns of misbehavior and issue warnings automatically. Of course, they can’t fully rely on AI: real people still need to review rare or edge cases and make the right decisions.

As an example of how AI is streamlining the process, Bruzzo points to player names. Player names are one of the most common toxicity issues they come across, he says. While it’s fairly easy to train the AI to ban certain inappropriate words, players who want to misbehave use symbols or other tricks to get past the ban filters. But with AI, they are getting better and better at identifying and blocking these workarounds. Last summer, he says, they ran 30 million Apex Legends club names through their AI checks and removed 145,000 that were in violation. No human could have done that.
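As a rough illustration of the evasion problem Bruzzo describes, here is a minimal sketch of why a naive banned-word check misses names that swap letters for look-alike symbols, and how a simple normalization pass catches many of those tricks. This is not EA’s system; the word list, substitution map, and function names are all hypothetical.

```python
# Illustrative sketch only: normalize look-alike characters before checking
# a name against a banned-word list, so simple symbol swaps don't slip through.
import re

BANNED_WORDS = {"badword", "slur"}  # hypothetical placeholder list

# Common character substitutions players use to dodge filters
SUBSTITUTIONS = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a", "5": "s",
    "7": "t", "@": "a", "$": "s", "!": "i",
})

def normalize(name: str) -> str:
    """Lowercase, map look-alike symbols to letters, drop separators."""
    name = name.lower().translate(SUBSTITUTIONS)
    return re.sub(r"[^a-z]", "", name)  # strip spaces, dots, underscores, digits

def is_allowed(name: str) -> bool:
    """Reject a name if any banned word appears after normalization."""
    cleaned = normalize(name)
    return not any(word in cleaned for word in BANNED_WORDS)

if __name__ == "__main__":
    for candidate in ["FriendlyFox", "B4dW0rd_99", "s.l.u.r"]:
        print(candidate, "->", "allowed" if is_allowed(candidate) else "rejected")
```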

And it’s not just about names. Since the Positive Play initiative launched, Bruzzo says EA has seen measurable reductions in hateful content across its platforms.

The moment your expression begins to infringe on another person’s ability to feel safe… that is the moment your ability to do so disappears.

“One of the reasons we’re in a better position than social media platforms is that we’re not a social media platform,” he says. “We’re a community of people who come together to have fun. So it’s not really a platform for all political speech. It’s not a platform where you can talk about whatever you want… The moment your expression starts to infringe on another person’s ability to feel safe and included, or on keeping the environment fair and fun for everyone, that’s when your ability to do it disappears. Go do it on another platform. This is a community of people, of players who come together to have fun. That gives us great advantages in terms of having very clear parameters. And so we can impose consequences and make real, material progress in reducing disruptive behavior.”

That covers the text, but what about voice chat? I ask Bruzzo how EA handles it, since it’s much harder to moderate what people are saying to each other over voice communications without violating privacy laws related to recorded conversations.

Bruzzo admits that it is more difficult. He says EA gets a lot of help from platform owners like Steam, Microsoft, Sony, and Epic whenever voice chat is hosted on their platforms, because those companies can bring their own tools to bear. But for the moment, the best solution is still, unfortunately, for players to block, mute, or walk away from toxic communications.

“In the case of voice, the most important and effective thing to do today is to make sure the player has easy access to turn things off,” he says. “That’s the best we can do.”

Another way EA is working to reduce toxicity in their games might seem a bit tangential: they are aggressively banning cheaters.

“We found that when games are buggy or have cheaters in them, when there isn’t a good anti-cheat or the anti-cheat is lagging, especially in competitive games, one of the root causes of a large percentage of toxicity is players feeling the environment is unfair,” says Bruzzo. “That they can’t compete fairly. And what happens is that it makes them angry. Because suddenly you realize that there are others who break the rules and the game is not controlling that behavior. But you love this game and you have spent a lot of time and energy on it. It’s very annoying. So we have prioritized tackling cheaters as one of the best ways to reduce toxicity in games.”

Good game

What Bruzzo wants to convey is that eliminating toxicity is as important as promoting positivity. And it’s not as if EA is working from scratch. As ubiquitous and memorable as bad behavior in games can be, the vast majority of gaming sessions are not toxic. At worst they are neutral, and often they are already positive without any additional help from EA.

“Less than 1% of our gaming sessions end with one player reporting another,” he says. “We have hundreds of millions of people playing our games, so it’s still massive, and we think… we have to get down to business now because the future of entertainment is interactive… But it’s important to remember that 99 out of 100 sessions do not result in a player having to report misconduct.”

So far in 2022, the most common text comment among gamers is “gg”.

“And another thing I was looking at the other day in Apex Legends: so far in 2022, the most common text comment among players is actually ‘gg.’ It’s not ‘I hate you.’ It’s not profanity, not even something competitive. It’s ‘good game.’ And also ‘thank you.’ In Apex Legends alone, ‘thank you’ has been used more than a billion times in 2022.

“And the last thing I’m going to say is that when we warn people that they’ve crossed the line, that they’ve broken a rule or done something harmful, 85% of the people we warn do not offend again. That gives me hope.”

It is that spirit of positivity that Bruzzo hopes to nurture in the future. I ask him what EA’s Positive Play initiative will look like ten years from now if it continues to be successful.

“I hope we’ve moved past the number one problem of trying to remove hateful and toxic content, and that we’re talking instead about how to make games as inclusive as possible. I think ten years from now we will see games with adaptive controls and even with different starting modes and different servers for different playstyles. We will see creation explode, with players making things, not just cosmetics but playable items in our games. And all of that is going to benefit from all of this work that we’re doing to create positive content, positive gaming environments, and positive social communities.”