These movies fail for the same reason that most attempts at a specific vision of advanced beings fail: We have always been terrible at imagining the future. In his classic essay “The Hazards of Prophecy,” the science-fiction writer Arthur C. Clarke cataloged a list of such embarrassments, from predictions that car travel would suffocate passengers to physicists denying the possibility of space travel, and divided them into two categories: failures of imagination and failures of nerve. These movies fall into both traps. They have neither the wit nor the daring to tell us something new about our lives with machines.

The present, mundane in comparison to what the sci-fi of the past promised us, seems to have beggared our collective imagination. As Erik Brynjolfsson and Andrew McAfee explain in their recently published book, “The Second Machine Age,” digital progress moves exponentially, in a way that seems almost designed to elude our grasp. The progress of “Moore’s Law” — which holds, per an observation the Intel co-founder Gordon Moore first made in 1965, that the number of transistors on a chip (and with it processing power) doubles roughly every two years — has barely hiccuped in four decades. Engineers have continually found new ways to reshuffle and relayer silicon chips and to invent new methods of transmission. The most pressing threat to Moore’s Law has been silicon’s tendency to overheat, but chip makers and engineers may have cracked that as well: researchers recently built the first working computer out of carbon-nanotube transistors, potentially clearing the way for Moore’s Law to unfold for decades to come.

Sustained exponential progress, by its very nature, is mind-boggling. It’s because of this, perhaps, that we see cultural obsessions with growing brains everywhere we look, not just in blockbusters. In his best seller “The Future of the Mind,” the theoretical physicist Michio Kaku eagerly presents research demonstrating the possibility of telepathy, telekinesis and artificial memories. Joshua Foer’s “Moonwalking With Einstein” details the author’s journey into the world of competitive memory championships, where “mental athletes” square off, memorizing decks of cards and reciting 50,000 digits of pi with a stopwatch running. It’s a sweet, slightly Sisyphean impulse, rooted in a desire to reclaim some of our long-ago outsourced mental labor. “Forgotten phone numbers and birthdays represent minor erosions of our everyday memory,” Foer writes, “but they are part of a much larger story of how we’ve supplanted our own natural memory with a vast superstructure of technological crutches.” It’s hard not to feel that what Foer and the mental athletes long for is the ability to toss those crutches down and test those wobbly limbs again.

But this binary — freedom versus enslavement — is no longer a useful way to talk about machine intelligence. Machines and humans work better with their heads together than apart: jobs where workers and computers collaborate yield better results than either achieves alone. One example, cited in “The Second Machine Age,” comes from chess: In 1997, the IBM computer Deep Blue toppled Garry Kasparov, sending ripples of existential despair through the chess world. Chess was over; the machines had taken it. And yet in 2005, a supercomputer named Hydra, with capabilities comparable to Deep Blue’s, was defeated by a pair of amateur players running multiple chess engines on three laptops. As Kasparov himself wrote of that event in The New York Review of Books: “Human strategic guidance combined with the tactical acuity of a computer was overwhelming.”

Skeptics point to experiments like this as evidence that human interpretation still matters and that our greatest gift — creativity — remains behind a sealed door that machines can never breach. But the line between creativity and statistical analysis blurs the harder you look at it, and machines are looking hard: They will cross it eventually. Creativity is not some exalted milk that we alone drink; it is a chain of small, insignificant leaps leading to one small, significant one. Years of data processing in our memories smooth into the murmur we call intuition, a pleasing hum that is easy to mistake for divine inspiration.

I, personally, am incredibly grateful to my machine-intelligence friends — they make work like mine immeasurably easier. A few dedicated hours of dredging the depths, Googling names and then Googling the names those names mentioned, led me to a clutch of source texts representing precisely the gaps in my knowledge I hoped to fill.