Todie Profile Blog Joined March 2010 Sweden 46 Posts #1



This blog entry was inspired by recent developments in Blizzard's approach to staving off toxicity in Heroes of the Storm. While I was reading comments in the reddit thread, I found myself wanting to make more, longer, and more nuanced replies than would be worth the time invested. I also found myself looking back at related discussions in the months and years past.





At the end of it, I decided to put something together to collect my thoughts, and perhaps help provide a more comprehensive basis for discussion than what Blizzard's official blog is able to provide.



Before discussing the more nuanced and problematic aspects of the issue, it's important to keep in mind the purpose of the reporting system and the context at hand, and to - as far as possible - agree on the answers to some defining questions.





Answering some basic defining questions



Q: What is abusive chat? What about being "abusive" in chat is actually inherently negative, and in what way?

There seems to be significant skepticism or confusion about this in the mentioned reddit thread, among other places ("what's the big deal?").



As I understand it, toxicity / abusive chat is the use of chat for blaming / shaming / flaming other players in a persistently non-constructive and unwanted fashion. (The discussion about fringe cases gets complicated; for clarity, it will be left out of this blog for now.)



Q: Why can't Blizzard be more open and clear about how they define abusive chat?



From a business point of view, it's important to keep in mind that a game client's chat isn't a public space. It may appear so in a free-to-play game, but in such games it is actually even more apparent that it is not. Communicating in this space is not to be taken for granted - it can and will be taken away if the company perceives it to be notably poisonous. In 2018, people and players on the internet are increasingly diverse in how they interact; this is a complex context where gameplay decisions interact with lines of chat and pings. It's not free to police this space, and the players don't have much in the way of rights here.



Blizzard tries to appeal to its model customer. Chat functionality has value only when it is consistently more positive / constructive than it is negative. From a business point of view, they must try to achieve this at as low a cost as possible.



Being transparent about what constitutes abusive chat, and trying to prove consistency in the rationale for issued penalties, would be costly and create liabilities - especially considering the likely fluid and changing state of any definitions, language barriers, etc.



Q: What's the big deal?

When abusive chat behaviors are pervasive enough, and consistently go unpunished, they do in fact manifest as a culture in the player base of the game at hand (in "the HOTS community"). At that point, excuses for tolerating abusive chat become very attractive, almost mandatory, just to be able to enjoy the game without muting chat - even excuses as innocent as "me speaking up won't help" and "if I get involved, I'm not relaxing or having fun". At that point, one might say it's "too late", in a sense, for that subculture to become healthy.



In this way, the words "toxic" and "toxicity" are actually aptly metaphorical for contagious negative behavior among semi-autonomous players in a community.



Q: So what? What's the alternative?

Before a community becomes completely toxic, it can still be a reasonably rational approach for any given player NOT to stay silent when one player uses abusive chat against another. This is easier in a smaller community, and in one where abusive chat is an offense users are often punished for, like HOTS. (Telling someone, one time, to stop abusive behavior X and focus on the game may not seem like much, but if enough people do it often enough, it makes a difference.)



Q: Aren't other features much more important than chat-policing!?



As much flak as the reporting system in the game is getting, for the purpose of discussion it's key to agree on the importance of having an actual system in place, as compared to not having one. You may discuss this point as well, but that's not really relevant to the economic reality, nor to the average player of the game - for reasons described above.



What is changing



Fewer silence penalties for repeat offenders. Instead, temporary bans, and potentially permanent bans.



Reports with a logical fallacy in how they were issued are no longer counted. No mention of other kinds of false reports.



"Implementing machine learning technology that further enhances our ability to validate and empower accurate player reports."



Issues and prospects for discussion



1: Many players learn to not trust themselves with game chat, or not to trust the system; they stay quiet no matter what, to make sure nothing can be held against them 'in a court of Blizzard law'.



If the system is implemented and/or communicated in a way that makes some players who would be more positive than negative stop using chat, that's a problem and a failure on Blizzard's part. It might be good to discuss how they could most easily remedy this.



For example, be as clear as possible to owners of accounts with prior warnings and penalties on them: should they expect to be banned if they accumulate enough reports going forward? Without prior warning? Automatically? Without changes to their ability to appeal through CS?



2: Can we flesh out the description of how machine learning is used to parse reports?



Does it strictly use data about senders and recipients of reports and chat logs, or does it also use some replay-file or other gameplay data? Blizzard says it empowers accurate reports – can they clarify if it can also be used to depower inaccurate reports?



What do you think?

Vote and discuss!



Poll: Your feelings about abusive chat and reports in HOTS



The system solves bigger problems than it creates (24)

46%



The system creates bigger problems than it solves (18)

35%



I dont play HOTS LUL (10)

19%



52 total votes





That's all for now. If we manage to spark any notable discussion from this, I might make a follow-up entry to summarize in a few days.







Petter Rudberg @Todie #3264 on Discord /u/Todie on reddit

Chef Profile Blog Joined August 2005 10797 Posts #2 1: Many players learn to not trust themselves with game chat, or not to trust the system; they stay quiet no matter what, to make sure nothing can be held against them 'in a court of Blizzard law'.



If the system is implemented and/or communicated in a way that makes some players who would be more positive than negative stop using chat, that's a problem and a failure on Blizzard's part. It might be good to discuss how they could most easily remedy this.

I think that's crazy. It's pretty hard to misunderstand what is and isn't abusive. If you're really paranoid, you just don't curse or criticize other people (as those are the only things that really exist on a spectrum that could go from friendly to rude). I think "many" is greatly exaggerated here. I would hazard to guess almost no one reacts that way.



The machine learning idea is probably very flexible, you're not likely to learn exactly what data it uses. It's just a program that is going to get fed piles and piles of data, and guess which reports resulted in bans when a human inspected them. Then when it gets accurate enough, it can replace or aid humans to make sifting through reports much more efficient. Obviously this is just my guess.
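Chef's guess above - a model fed piles of labeled reports that learns to predict which ones a human reviewer would uphold - can be sketched as a toy supervised classifier. To be clear, everything in this sketch is hypothetical: the features, the data, and the weights are invented for illustration, and nothing here reflects what Blizzard's actual system uses.

```python
import math

def sigmoid(z):
    """Squash a score into a 0-1 probability."""
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, labels, epochs=2000, lr=0.1):
    """Fit logistic-regression weights by stochastic gradient descent."""
    w = [0.0] * len(examples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss w.r.t. the score
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability that a report would be upheld by a human reviewer."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Hypothetical features per report:
# [reporter's past accuracy (0-1),
#  fraction of teammates who also reported the player,
#  rate of flagged phrases in the chat log]
reports = [
    [0.9, 0.75, 0.40],  # credible reporter, corroborated, abusive chat
    [0.8, 0.50, 0.30],
    [0.2, 0.00, 0.00],  # habitual false reporter, no corroboration
    [0.1, 0.25, 0.05],
]
actioned = [1, 1, 0, 0]  # whether a human reviewer upheld each report

w, b = train(reports, actioned)
print(predict(w, b, [0.85, 0.75, 0.35]))  # high probability: likely upheld
print(predict(w, b, [0.15, 0.00, 0.02]))  # low probability: likely dismissed
```

Once a model like this is accurate enough on held-out reports, it could triage the queue - auto-dismissing low-probability reports and prioritizing high-probability ones for human review - which matches Chef's "replace or aid humans" framing. A real system would presumably use far richer features and a far larger model, but the training loop is the same idea.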



I think we've seen chat in online games handled in a few ways. In the old days of the late 90s, it was a kind of 'why would anyone be mean?' sort of feeling, with the technology being so nerdy that even if you were playing with strangers, you were all nerds anyway. Then when it turned out people cheated, it was just sort of "well play with your friends instead." Because it was hard enough to make online gaming work at all.



Then we've seen very naive approaches, like running a filter over all chat text to replace swear words, which people get around easily by just misspelling them. This was probably some half baked idea from someone involved in PR.



Finally, we've seen games that fully restrict what you are able to say by giving you a set of prefab options. And actually this is not horrible. As I understand it, Hearthstone does this, but when you add someone to your friend's list you're able to talk to them. So as long as you don't add someone to your friend's list, you can't be abused. But you can still have emoji-style conversations with strangers to let them know your emotion after something happened in the game. This also protects you from horribly awkward people who don't know the time and place for constructive criticism. But there are problems with this system, as it does isolate you and make it harder to find new people. Amid the rampant abuse and hate-speech on battle.net, lots of kids made long lasting friendships with each other by filtering for themselves who they wanted to talk to from the trolls.



I think the recent advances in machine learning are Blizzard going back and saying how can we have what we had before but without the worst parts? Because the prefab options and friends-only chat is the answer outside of machine learning.



But they're not trying to create a platform for free speech. They're trying to create a space that won't scare off potential customers, or worried parents, or damage their image as a company. Because of that, I think it's not an ethics discussion from their point of view. It is starting off from a place of censorship and control, and one would have to argue about that separately. It is continuing a motive of games-as-escape, and we don't want to be abused while we are trying to escape from the abuse of real life, and we don't want to associate ourselves with neo-nazis by playing games with them. So I think you have to evaluate it from the perspective of what they're trying to accomplish, even if in the grander perspective of our society it might not be healthy to create more and more deeply entrenched bubbles through which no foreign opinions can enter.

LEGEND!!

Pentay Profile Joined June 2018 15 Posts Last Edited: 2018-06-13 15:33:08 #3 Chef, you can't counter rational concepts arrived at through deductive reasoning and factual data with points built on optimism and faith in Blizzard. That's what fanboys do, and that's why they're called fanboys. It's known that people silence their chat to avoid conversation with hyper-sensitive reporters. People say they're scared to chat. Games are played with dead draft chats. Quietness is a commonly reported issue. Assuming the machine learning is flexible honestly adds nothing to the discussion. And all your counter points end in "I guess" anyway, so please stop.

Todie Profile Blog Joined March 2010 Sweden 46 Posts Last Edited: 2018-06-13 18:04:27 #4 On June 14 2018 00:03 Chef wrote:


1: Many players learn to not trust themselves with game chat, or not to trust the system; they stay quiet no matter what, to make sure nothing can be held against them 'in a court of Blizzard law'.



If the system is implemented and/or communicated in a way that makes some players who would be more positive than negative stop using chat, that's a problem and a failure on Blizzard's part. It might be good to discuss how they could most easily remedy this.



I think that's crazy. It's pretty hard to misunderstand what is and isn't abusive. If you're really paranoid, you just don't curse or criticize other people (as those are the only things that really exist on a spectrum that could go from friendly to rude). I think "many" is greatly exaggerated here. I would hazard to guess almost no one reacts that way.



This is a fair point. It's just a matter of word choice. They are quite vocal on reddit and whatnot, but it would probably be better to refer to them as "a group" or the like, rather than "many" - because I certainly share the perception that, in the grander scheme of things, they are NOT many.



Finally, we've seen games that fully restrict what you are able to say by giving you a set of prefab options. And actually this is not horrible. As I understand it, Hearthstone does this, but when you add someone to your friend's list you're able to talk to them. So as long as you don't add someone to your friend's list, you can't be abused. But you can still have emoji-style conversations with strangers to let them know your emotion after something happened in the game. This also protects you from horribly awkward people who don't know the time and place for constructive criticism. But there are problems with this system, as it does isolate you and make it harder to find new people. Amid the rampant abuse and hate-speech on battle.net, lots of kids made long lasting friendships with each other by filtering for themselves who they wanted to talk to from the trolls.

.. yeah, to be clear, I highly doubt the Hearthstone approach would ever be considered for a team strategy game like HOTS - unless it's on mobile or something.

I think the recent advances in machine learning are Blizzard going back and saying how can we have what we had before but without the worst parts? Because the prefab options and friends-only chat is the answer outside of machine learning.





Sure. Fingers crossed that it works out!



But they're not trying to create a platform for free speech. They're trying to create a space that won't scare off potential customers, or worried parents, or damage their image as a company. Because of that, I think it's not an ethics discussion from their point of view. It is starting off from a place of censorship and control, and one would have to argue about that separately. It is continuing a motive of games-as-escape, and we don't want to be abused while we are trying to escape from the abuse of real life, and we don't want to associate ourselves with neo-nazis by playing games with them. So I think you have to evaluate it from the perspective of what they're trying to accomplish, even if in the grander perspective of our society it might not be healthy to create more and more deeply entrenched bubbles through which no foreign opinions can enter.



I agree with this! I increasingly understand abusive communication online as a social / cultural question - it's not a matter that a gaming company can be expected to handle in a way that takes bullies by the hand and teaches them how not to be bullies... Another approach might be merited in the scheme of things that is larger than Blizzard, but it's not something they are responsible for, or even capable of doing, as it stands.



Thank you for your thoughtful reply Chef!


Pentay Profile Joined June 2018 15 Posts #5 The whole issue boils down to Blizzard using punitive measures on the assumption that they will teach people to change. It didn't work, so they increased the punishment. It's such an archaic ideology, one that has proven throughout history to fail, so it's a massive surprise that Blizzard goes this route. What happens when it fails again? Do they increase punishments a third time? Why does abusive chat equate to the same punishment as cheating? Blizzard needs to give tools for communication rather than collapse tools for communication. The report system is a very cruel form of communication that doesn't actually enhance player communication in the moment, per match. They should have overhauled the entire chat interface when they released the game, but instead they added features like blocking whispers from non-friends. They should have given the option to accept or decline whispers, almost like a party invite. Chat reports should kick people from normal queue into a pit match, like leaver queue, that has fun gameplay but is a nuisance to complete: forts rebuilding, losing teams instantly respawning, objectives moving, temporary handicaps, and bosses that aggro down lanes randomly. It should be like using a portapotty on a hot day, so that when you step outside it's a relief for many reasons, making you less likely to tilt when back in normal queue.

Todie Profile Blog Joined March 2010 Sweden 46 Posts #6 On June 14 2018 04:01 Pentay wrote:

The whole issue boils down to Blizzard using punitive measures in the assumption that they will teach people to change. It didn’t work, so they increased the punishment. It’s such an archaic ideology that has proven throughout history to fail, so it’s a massive surprise that Blizzard goes this route. [...]



We're talking about a private company here, one that effectively sells playtime in a digital space. The consumers have very limited rights, and the company has very limited responsibilities towards them (us, the players). Subsequently, they only invest just enough in the policing of communication in this space to ensure it's construed as more positive than negative. From this perspective, whether or not an offender becomes a repeat offender isn't very relevant.



[...] Blizzard needs to give tools for communication rather than collapse tools for communication. The report system is a very cruel form of communication that doesn't actually enhance player communication in the moment per match. They should have overhauled the entire chat interface when they released the game, but instead they added features like blocking whispers from non-friends. They should have given the option to accept or decline whispers almost like a party invite.

Fair point.

Chat reports should kick people from normal queue to a pit match like leaver queue that has fun gameplay but is a nuisance to complete, like forts rebuilding, losing teams instantly respawning, objectives moving, temporary handicaps and bosses that aggro down lanes randomly. It should be like using a portapotty on a hot day so when you step outside it’s a relief for many reasons, causing you to less likely tilt when back in normal queue.



This is a very creative suggestion!