Who murdered Transformers: The Last Knight? The fifth movie in the series opened last weekend to numbers that weren't simply lower than expected—they amounted to the worst opening weekend haul of the entire franchise. Apparently, four visually nonsensical films about giant robots hitting each other had been plenty for moviegoers. This being a blame-loving industry, though, the search began for exactly who was responsible for this Floptimus Prime.

Don’t blame the actors; the fault lies not within the stars. Maybe blame director Michael Bay—or Akiva Goldsman, who assembled a writers' room to churn out more Transformers movies for Paramount and Hasbro. Or, hey, blame Paramount itself, which maybe should not have tried to squeeze more than meets the eye out of its only repeating cash crop besides Mission: Impossible.

Maybe it was data. After a spate of movies that failed to live up to commercial and critical expectations this summer, Hollywood’s evangelists have become increasingly likely to blame review aggregators like Rotten Tomatoes and Metacritic for harshing their buzz. Baywatch, The Mummy, Pirates of the Caribbean: Yarrrr Kidding, Right?—all of them perished, caught between the green-splatted Scylla of the Tomatometer and the sulfurous Charybdis of not-very-goodness.

“I want every movie to be good. I absolutely do. I hope every movie I sit down and see is good,” says Matt Atchity, editor in chief at Rotten Tomatoes. “Do I want to see people fail? No. I don’t want to see anybody fail.”

Yet fail they do. And that might be because of Rotten Tomatoes.

Founded in 1998, Rotten Tomatoes has had a series of corporate owners, most recently the online movie ticket site Fandango, itself jointly owned by Warner Bros. and Comcast (which also owns NBCUniversal). The simple concept: turn hundreds of movie reviews into binary pass/fail assessments—inspired by the thumbs up or down of critics Gene Siskel and Roger Ebert—and roll them up into a quantified selfie that captures a movie’s overall quality. You get a “fresh” or a “rotten.” If the site slurps up 100 reviews for a given movie and 10 are negative, that’s a 90 percent score.
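That arithmetic is simple enough to sketch in a few lines of Python. This is purely illustrative, assuming each review has already been coded fresh or rotten; the function and variable names are invented here, not Rotten Tomatoes' actual code.

```python
# Sketch of the Tomatometer arithmetic described above: the score is the
# share of reviews coded "fresh", expressed as a whole-number percentage.
# Illustrative only; names are invented, not Rotten Tomatoes' code.

def tomatometer(reviews: list[bool]) -> int:
    """reviews: True for each 'fresh' review, False for each 'rotten' one."""
    if not reviews:
        raise ValueError("need at least one review")
    return round(100 * sum(reviews) / len(reviews))

# The example from the text: 100 reviews, 10 of them negative.
print(tomatometer([True] * 90 + [False] * 10))  # 90
```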

The site maintains fairly straightforward rules about which reviewers and outlets it’ll draw from—about 2,000 critics contribute overall, though no movie has reviews from all of them. Some critics have adapted to the binary distinction, sending along word as to how Rotten Tomatoes should code their possibly more nuanced reviews. “Some days a 2.5 out of 5 from a particular critic might be fresh, and with a different movie might be a rotten, and that’s OK with us,” Atchity says.

Metacritic, founded the year after Rotten Tomatoes, grades more finely—and, perhaps as a result, is less influential. The site, which also aggregates scores for videogames, converts critics’ scores from 58 publications to a 100-point scale: a 3 out of 5 becomes a 60; a 7.2 on Paste becomes a 72. Reviews from outlets that don’t assign scores, like the New York Times, get coded by hand. Movies need at least four reviews to get a Metascore.
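The conversion itself is a straight linear rescale. A minimal sketch, assuming every outlet's scale runs from zero up to some maximum (the function name and the zero-based assumption are mine, not Metacritic's):

```python
# Rescale an outlet's review score onto a 0-100 scale by linear mapping.
# Assumes the outlet's scale starts at zero; purely illustrative.

def to_hundred_point(score: float, scale_max: float) -> float:
    return 100 * score / scale_max

print(to_hundred_point(3, 5))     # 60.0, the 3-out-of-5 example above
print(to_hundred_point(7.2, 10))  # 72.0, the Paste example above
```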

Unlike Rotten Tomatoes, though, Metacritic weights some reviewers to have a greater influence on the score. “That’s our little secret formula. I’ll just leave it at that,” says Keith Kimbell, Metacritic’s film editor. “It’s something that we keep to ourselves to keep our formula unique compared to just a straight average.”
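Because the actual weights are secret, any worked example has to make them up. What Kimbell describes amounts to a weighted average of the normalized scores rather than a straight mean; the weights below are invented purely to show the mechanics, and the four-review minimum comes from above.

```python
# Weighted average of normalized (0-100) review scores, with a minimum review
# count. The weights are invented for illustration; Metacritic does not
# publish its real formula.

def metascore(scored_reviews: list[tuple[float, float]]) -> int:
    """scored_reviews: (normalized score, weight) pairs, one per review."""
    if len(scored_reviews) < 4:
        raise ValueError("needs at least four reviews")
    total_weight = sum(weight for _, weight in scored_reviews)
    weighted_sum = sum(score * weight for score, weight in scored_reviews)
    return round(weighted_sum / total_weight)

# Four reviews; the hypothetical third critic counts double.
print(metascore([(60, 1.0), (72, 1.0), (85, 2.0), (40, 1.0)]))  # 68
```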

That all sounds reasonable—innocuous, even. But Rotten Tomatoes scores now show up not only on the site but also in reviews and articles about the movies they purport to assess, and next to ticket purchase options on Fandango. In advance of Wonder Woman’s release, the movie’s very high score itself became a story.