Late in the fall of 2014, an entity calling itself Guardians of Peace began leaking e-mails and other private material belonging to Sony Pictures. Sensational headlines quickly followed. The hack was reportedly the work of the North Korean government, possibly in retaliation for the portrayal of Kim Jong Un in the as-yet-unreleased Sony comedy “The Interview.” (North Korea denied this.) Among other things, the e-mails revealed that Jennifer Lawrence was paid significantly less than men whose stardom didn’t equal hers and that, before attending a Democratic fund-raiser featuring Barack Obama, Amy Pascal, the co-chair of Sony Pictures, had joked with the producer Scott Rudin about which recent movies starring black people the President might have liked. On the heels of these and other reports came opinion pieces arguing that journalists were abetting the hackers by publishing the stolen information.

Another story was getting lost. Sony, which from 2002 to 2012 had generally been one of the top earners at the box office, was failing as a studio. Under Pascal’s leadership, Sony released a mix of tentpole films—the latest James Bonds, “Da Vinci Code” sequels—and star-driven vehicles often featuring Will Smith or Adam Sandler. (“Will and Adam bought our houses,” Sony execs liked to say.) Sprinkled among these were mid-budget, low-concept movies aimed at adults. Pascal had taken pride in Sony’s reputation as a “relationship studio,” built on its connections with talent. She was literate and smart, and alive to what makes a story click. Sony owned the rights to Spider-Man, and Pascal made intelligent use of them—her choices for director (Sam Raimi, of “Evil Dead” fame) and star (dewy-eyed Tobey Maguire) were unexpected, and together they made a movie that honored fans and non-fans alike. (“Spider-Man 2” was good, too.)

In the long run, it didn’t matter. Sony did not own the intellectual property, or “I.P.,” necessary to build out Spider-Man into a “cinematic universe”—that is, a fictional world that transfers from picture to picture, so that, instead of a single story line with a new installment every few years, a studio can release two or three “quasi-sequels,” as one Marvel executive has put it, in the span of a single year. Marvel pioneered the cinematic universe, hatching a plan in 2005 that it launched with the release of “Iron Man,” three years later. Without the requisite I.P., Sony couldn’t compete. “I only have the spider universe not the marvel universe,” Pascal explained to a colleague, in a 2014 e-mail. (The studio had had a chance to buy nearly all of Marvel’s big characters, on the cheap, in the late nineties, but declined.) In another e-mail, Pascal suggested that she was trying to create an “un-marvel marvel world that is rooted in humanity.”

As Sony faltered, its rival, Disney, was enjoying an embarrassment of I.P. riches. First, it began remaking its animated classics as live-action features; then, in 2009, Disney bought Marvel, for four billion dollars. In 2012, it acquired Lucasfilm, the parent company of “Star Wars,” for another four billion. By 2015, Disney was releasing one new movie from the “Star Wars” universe and two or more movies from the Marvel universe every year.

Located nowhere in actual history or geography (or, maybe, human experience), a cinematic universe need not be limited by cultural specificity or nuance. What plays in Sioux City and Bayonne will play in Chongqing. The rise of the cinematic universe is inseparable from the rise of a truly global cinematic marketplace, dominated by China. In “The Big Picture: The Fight for the Future of the Movies” (Houghton Mifflin Harcourt), the Wall Street Journal reporter Ben Fritz shares a startling fact: in 2005, the highest-grossing film in China was “Harry Potter and the Goblet of Fire,” which took in just under twelve million dollars. In 2017, a “Fast and the Furious” sequel made almost four hundred million there.

To write “The Big Picture,” Fritz sifted through all the Sony e-mails made public by the hack. “This was, I realized, a way to embed myself inside a studio,” he writes. The surprising undersong to the story he tells is one of pathos—the pathos of an old-school studio head becoming an anomaly in a Hollywood increasingly overseen by brand managers. Fritz quotes at length from an extraordinary St. Crispin’s Day-like pep talk that Rudin delivered to Pascal, via e-mail, in 2014. Rudin had been trying to get Sony to back the movie “Steve Jobs,” with a screenplay by Aaron Sorkin, based on Walter Isaacson’s 2011 biography. David Fincher was going to direct, but then he dropped out and Danny Boyle took over; Christian Bale was going to star, then maybe Leonardo DiCaprio, or perhaps Bradley Cooper or Matt Damon or Ben Affleck. Now it was Michael Fassbender. Pascal had wavered, and let Rudin take the movie to Universal. When Rudin e-mailed her, she was trying to get it back.

“Why have the job if you can’t do this movie?” he asked her. “So you’re feeling wobbly in the job right now. Here’s the fact: nothing conventional you could do is going to change that, and there is no life-changing hit that is going to fall into your lap that is NOT a nervous decision, because the big obvious movies are going to go elsewhere and you don’t have the IP right now to create them from standard material. Force yourself to muster some confidence about it and do the exact thing right now for which your career will be known in movie history: be the person who makes the tough decisions and sticks with them and makes the unlikely things succeed.”

Universal kept the movie, and released it in October, 2015. It was the kind of nervy, mid-budget drama that Pascal lived to make. It was also the kind of movie that does not play in Sioux City, Bayonne, or Chongqing. “Steve Jobs” lost about fifty million dollars at the box office, according to Fritz. By then, Pascal had been eased out of her position at Sony in classic Hollywood style. When her contract expired, in February, 2015, she was given a “first-look” producing deal.

Aside from one person’s job, what was lost? Fritz sees a bleak future for the big studios, but is surprisingly upbeat about what’s in store for the rest of us. The decline of wide-release movies for grownups has coincided with the rise of ambitious, big-budget storytelling on television, a trade-off Fritz is fine with. “For those of us who simply want to sit down, turn off the lights, and be immersed in the magic of stories told in images on a screen,” he writes, “the future has never looked brighter.” True. But for a book that carefully delineates the causes and effects that have shaped the recent Hollywood past, the reduction of movies to “stories told in images on a screen” is surprisingly ahistorical. How and where the movies reach us has always contributed to the particular power they have to rearrange our moral furniture.

The story of Amy Pascal’s downfall at Sony is unsettling, but the period that preceded her tenure is not widely regarded as a golden age of American cinema. It was, instead, the age of the traditional blockbuster, when a “high concept” and a single A-list star could drive a project from pitch meeting into production and, finally, out to theatres. Until blockbusters arrived—starting in 1975, with Steven Spielberg’s “Jaws,” in its time the most commercially successful film in history—Hollywood released movies gradually, one set of theatres after another. In the “run-zone-clearance system,” a movie would begin with a heavily publicized first run in downtown theatres in major cities, continue on to smaller houses in less affluent or less fashionable parts of the city, and then move out to the suburbs, to smaller cities and towns, and, finally, to rural communities. A movie that was disliked by its first wave of viewers might not continue through the system, and the urban sophisticates who made the initial decision to see it were heavily influenced by the critics.