"Review-bombing" is a coordinated effort to damage the public profile of a company or product by inundating it with negative reviews. Actual problems with the product in question often have nothing to do with it: Dota 2 got hammered last month because Valve won't make Half-Life 3, and more recently Firewatch suffered because of Sean Vanaman's beef with Pewdiepie. As we reported earlier this summer, it happens because it works, and it's become enough of a problem that Valve has finally been forced to address the issue.

One of the big problems facing Valve, as Alden Kroll said in a new blog post, is that review bombers are "fulfilling the goal of User Reviews" by expressing their opinions on whether or not people should buy a particular game. "But one thing we've noticed is that the issue players are concerned about can often be outside the game itself. It might be that they're unhappy with something the developer has said online, or about choices the developer has made in the Steam version of their game relative to other platforms, or simply that they don't like the developer's political convictions," he wrote.

He acknowledged that those opinions can be relevant to a player's happiness with a game purchase, but said that relevance is less obvious when it comes to the actual review score. Valve's data indicates that in most cases, review scores recover to an appropriate level once the bombing campaign is over, and that there's no apparent correlation between a developer's response (or lack thereof) to complaints and the score's bounceback.

"In short, review bombs make it harder for the Review Score to achieve its goal of accurately representing the likelihood that you'd be happy with your purchase if you bought them," Kroll said.

Valve looked at a few possible solutions to the problem, including the elimination of review scores altogether, although that was pretty much a non-starter since they were added in response to user demand in the first place. Thought was also given to locking down reviews temporarily when "abnormal behavior," which is to say a review-bombing campaign, was detected. But that was rejected as well, because Valve doesn't want to "stop the community having a discussion about the issue they're unhappy about, even though there are probably better places to have that conversation than in Steam User Reviews."

Ultimately, the decision was to do nothing, at least with regard to the review scores themselves. Instead, Valve is now providing consumers with more information about the reviews by way of a histogram that compares the ratio of a game's positive to negative reviews over its lifetime.
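Conceptually, the new graph is nothing more exotic than the share of positive reviews bucketed by time period. A rough sketch of that idea (the data, function name, and monthly bucketing here are hypothetical, not Valve's actual schema):

```python
from collections import Counter
from datetime import date

# Hypothetical review data as (date, is_positive) pairs -- illustrative
# only, not Valve's real format. September contains a sudden dip.
reviews = [
    (date(2017, 7, 3), True), (date(2017, 7, 18), True),
    (date(2017, 8, 5), True), (date(2017, 8, 9), False),
    (date(2017, 9, 14), False), (date(2017, 9, 15), False),
    (date(2017, 9, 16), False), (date(2017, 9, 20), True),
]

def monthly_ratio(reviews):
    """Bucket reviews by (year, month) and return each bucket's positive share."""
    pos, total = Counter(), Counter()
    for day, is_positive in reviews:
        key = (day.year, day.month)
        total[key] += 1
        if is_positive:
            pos[key] += 1
    return {k: pos[k] / total[k] for k in sorted(total)}

print(monthly_ratio(reviews))
# A single bucket that craters relative to its neighbors (September here)
# is the kind of "temporary distortion" a potential buyer would click into.
```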

"As a potential purchaser, it's easy to spot temporary distortions in the reviews, to investigate why that distortion occurred, and decide for yourself whether it's something you care about. This approach has the advantage of never preventing anyone from submitting a review, but does require slightly more effort on the part of potential purchasers," Kroll wrote.

"It also has the benefit of allowing you to see how a game's reviews have evolved over time, which is great for games that are operating as services. One subtlety that's not obvious at first is that most games slowly trend downwards over time, even if they haven't changed in any way. We think this makes sense when you realize that, generally speaking, earlier purchasers of a game are more likely to enjoy it than later purchasers. In the pool of players who are interested in a game, the ones who are more confident that they'll like the game will buy it first, so as time goes on the potential purchasers left are less and less certain that they'll like the game. So if you see a game's reviews trending up over time, it may be an even more powerful statement about the quality of work its developers are doing."

It's easy enough to use: Clicking any bar on the graph will bring up a sample of reviews from the appropriate time period, so in theory it's relatively simple to tell what sparked a particular spate of positive or negative feedback. But it's also an added layer of complexity that some—quite likely many—consumers won't be interested in screwing around with. Someone who's really interested in Firewatch, for example, may be willing to dig into the cause of all those recent "mixed" reviews, but someone who's just browsing for something new is far less likely to go to the trouble.

And as Spartan Fist developer Megan Fox said on Twitter, the new system also won't do much to help older games that have fallen out of the spotlight. "This will do literally nothing to stem review bombing, especially on older games with lower buy/review rates," she wrote. "Assuming the reviews will recover assumes the game will keep selling well. On an older now-low-% game, that's a bad assumption ... They're making the implicit assumption that people will keep up-voting it post bomb, and there's not much data for that."

Most evidence from this and other stores is counter: when a product gets review bombed, it's a crater, unless someone positive review-bombs. (September 19, 2017)

Other developers seem similarly less-than-excited by Valve's solution:

Valve's changes to Steam reviews seem to only be addressing one issue. There are many issues. Sigh. (September 19, 2017)

Seems like a copout, putting the onus onto the new customer to figure out if the mob are angrily spamming or not. (September 19, 2017)

Not entirely sure about how "give review bombs more exposure, a permanent record, and gamify it with a graph they can min/max" fixes things. (September 19, 2017)

And that really is where this approach stumbles. Review bombs will still take place, they'll still bring down user review scores, and this change will only be of value to consumers who are willing to go to the effort to find out why—consumers who probably don't need a graph to figure out why a game's review scores have suddenly gone sideways anyway. More data is (almost) always a good thing, but I'm skeptical that this is going to be the solution that Valve hopes it will.