On Wednesday, Valve, the company that operates the huge online video game store Steam, shared more details about how it plans to control and moderate the ever-increasing number of games published on its platform.

In May, Valve removed a school shooting simulator from Steam after it was widely criticized by parents of school shooting victims who spoke to publications like the New York Times and CNN. At the time, Valve called the developer of the game a "troll," and in June the company announced a new policy for how it will moderate its store: it would allow "everything" on Steam unless it's illegal or "straight up trolling."

In the post published Wednesday, Valve shared more details about how it determines what it considers "outright trolling."

"It is vague and we'll tell you why," Valve wrote. "You're a denizen of the internet so you know that trolls come in all forms. On Steam, some are simply trying to rile people up with something we call 'a game shaped object' (ie: a crudely made piece of software that technically and just barely passes our bar as a functioning video game but isn't what 99.9% of folks would say is "good.")

Valve goes on to explain that some trolls are trying to scam people out of their Steam inventory items (digital items that can be traded for real money); others are trying to generate small amounts of money through a variety of schemes involving the keys developers use to unlock Steam games; and still others are trying to "incite and sow discord."

"Trolls are figuring out new ways to be loathsome as we write this," Valve said. "But the thing these folks have in common is that they aren't actually interested in good faith efforts to make and sell games to you or anyone. When a developer's motives aren't that, they're probably a troll."

One interesting observation Valve shares in the blog post is that it rarely bans individual games from Steam, and more often bans developers and/or publishers entirely. That intuitively makes sense. It is extremely hard to make good video games, so it doesn't make much sense for a developer to spend the time and effort to create something people appreciate only to follow it up with a crude, designed-to-offend school shooting simulator or a straight up cryptocurrency miner. If a developer is on Steam to run some kind of scam, they're probably more in the scam business than the game business. Or as Valve wrote: "In the words of someone here in the office: 'it really does seem like bad games are made by bad people.'"

Valve said that its review process for determining that something may be a "troll game" is a "deep assessment" that involves investigating who the developer is, what they've done in the past, their behavior on Steam as both a developer and a customer, their banking information, the developers they associate with, and more.

Overall, this is more insight than we had previously about how Valve plans to tame Steam going forward, and more insight into how the company, which often operates as a black box, is thinking in general.

But the approach is not surprising. For years now, Valve's strategy has involved letting users and publishers do almost whatever they want while providing users with curation tools to hide the content they don't want to see. In the blog post, Valve also explained that it improved its tagging system so users can search for more specific types of games, introduced a feature that will prompt publishers to say exactly what kind of adult content their games include, and created a new "Adults Only" filter that will filter out games that feature explicit sexual content. The latter might help the developers of adult visual novels, who have been in limbo since May, when Valve started the process of taking down their games and ceasing their distribution. I've reached out to those developers to find out if they've received an update from Valve and will update this post if I hear back.