It’s easy to praise robots and automation when it isn’t your ass on the line. I’ve done it lots. But I may have to eat my own Cheerios soon enough.

Software is writing news stories with increasing frequency. In a recent example, an LA Times writer-bot wrote and posted a snippet about an earthquake three minutes after the event. The LA Times claims they were first to publish anything on the quake, and outside the USGS, they probably were.

The LA Times example isn’t notable because it was the first algorithm to write a story on a major news site; it wasn’t. With the help of Narrative Science, a Chicago robot-writing startup, algorithms have basically been passing the Turing test online for the last few years.

This is possible because some kinds of reporting are formulaic. You take a publicly available data source, crunch it down to the highlights, and translate it for readers using a few boilerplate connectors. Ideally, the result is more digestible than the raw numbers.

Indeed, Kristian Hammond, cofounder and CTO of Narrative Science, thinks some 90% of the news could be written by computers by 2030.

I imagine the computer populating a Venn diagram. In one circle, it adds hard data (earnings, sports stats, earthquake readings), in another, a selection of journalistic clichés—and where the two intersect, an article is born.

In truth, it’s a little more complicated than that. Narrative’s engineers worked with trained journalists to teach the software how to determine an angle. In sports, for example, the algorithm answers key questions like: Who won the game, and by how much? Was it a comeback or a blowout? Any heroics or notable stats?

The program chooses an article template, strings together sentences, and spices them up with catchphrases: “It was a flawless day at the dish for the Giants.” The tone is colorfully prosaic, but human enough.
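Narrative’s actual pipeline is proprietary, but the angle-then-template approach it describes can be sketched in a few lines. Everything below — the field names, thresholds, and templates — is invented for illustration, not Narrative’s code:

```python
# Toy sketch of template-based sports recap generation:
# answer the key editorial questions from raw stats, pick an angle,
# then fill in a matching template. All names here are hypothetical.

def pick_angle(game):
    """Decide who won, by how much, and what kind of game it was."""
    winner = game["home"] if game["home_score"] > game["away_score"] else game["away"]
    loser = game["away"] if winner == game["home"] else game["home"]
    margin = abs(game["home_score"] - game["away_score"])
    if margin >= 10:
        angle = "blowout"
    elif game["trailed_late"]:
        angle = "comeback"
    else:
        angle = "close_game"
    return winner, loser, margin, angle

TEMPLATES = {
    "blowout": "{winner} crushed {loser} by {margin} runs on {date}.",
    "comeback": "{winner} rallied late to edge {loser} by {margin} on {date}.",
    "close_game": "{winner} held off {loser} in a {margin}-run nail-biter on {date}.",
}

def write_recap(game):
    winner, loser, margin, angle = pick_angle(game)
    return TEMPLATES[angle].format(
        winner=winner, loser=loser, margin=margin, date=game["date"]
    )

game = {"home": "Giants", "away": "Dodgers", "home_score": 11,
        "away_score": 1, "trailed_late": False, "date": "April 2"}
print(write_recap(game))
# Giants crushed Dodgers by 10 runs on April 2.
```

Real systems layer on many more angles, varied phrasings, and grammatical agreement, but the skeleton — structured data in, editorial questions answered, template out — is the same.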

Early on, Narrative applied its algorithms to Little League baseball games. Participating parents would enter game stats into an iPhone app called GameChanger and the app would spit out written game summaries.

Since then, they’ve supplied content to major news sites. Forbes is open about its use of Narrative’s software, including an explanation alongside each article. The LA Times earthquake story, written by an algorithm created by a staff member, carried a similar disclaimer. But many more big sites use algorithms to write simple stories without disclosing it.

Narrative’s approach can be applied elsewhere too. The firm recently launched an app that works with Google Analytics to transform raw website metrics (traffic, sources, referrals, demographics) into accessible, natural language reports. These could be useful in any business, a kind of automated analyst to help make sense of big data sets.
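The same trick works for analytics: compare the numbers, classify the trend, and narrate it. Here is a minimal sketch in that spirit — the metric, the 5% threshold, and the wording are assumptions for illustration, not Narrative’s app or the Google Analytics API:

```python
# Sketch of turning raw website metrics into plain-English reporting.
# Thresholds and phrasing are invented for illustration.

def describe_traffic(this_week, last_week):
    """Summarize a week-over-week visit count as a sentence."""
    change = (this_week - last_week) / last_week * 100
    if change > 5:
        trend = f"up {change:.0f}%"
    elif change < -5:
        trend = f"down {abs(change):.0f}%"
    else:
        trend = "roughly flat"
    return f"Traffic was {trend} versus last week ({this_week:,} visits)."

print(describe_traffic(12500, 10000))
# Traffic was up 25% versus last week (12,500 visits).
```

A dashboard shows you the 25%; the sentence tells a non-analyst what it means. That translation step is the product.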

The software clearly has some native advantages over the typical human.

For example, the LA earthquake hit at 6:25 am. I doubt many West Coast journalists were at their desks that early. And if they were, few would have cared to scoop what amounted to a fairly inconsequential earthquake. Even if someone had been on it, how many could have penned and published a typo-free article in three minutes?

Ken Schwencke, the journalist who created the algorithm, was awoken by the quake, rolled out of bed, found the article awaiting his approval—and simply hit “publish.”

If writers never had to compose another fifty-word earthquake report, few would complain. Better to leave the short, dry, purely informational articles to the bots.

In the perennially cash-strapped news business, unpaid algorithms could add lots of cheap content while (hopefully) freeing human writers to focus on and improve the quality of more in-depth, nuanced pieces.

“The way we use it, it’s supplemental,” Schwencke told the Huffington Post. “It saves people a lot of time, and for certain types of stories, it gets the information out there in usually about as good a way as anybody else would. The way I see it is, it doesn’t eliminate anybody’s job as much as it makes everybody’s job more interesting.”

But Narrative isn’t satisfied with Little League write-ups and data reports.

Hammond doesn’t mince words. He believes a computer could write stories worthy of a Pulitzer Prize by 2017. Not only would such a robot writer be fast and ever-wakeful, prowling the exponentially growing deluge of online information—it would know enough of the subtleties of human language and logic to write compelling stories too.

And the software needn’t be limited to the digital world. Such algorithms might one day find themselves a robot body, travel to war zones, and cover robot bullfights.

These robot-Hemingways might write existential think pieces that get to the heart (or emotional processor) of what it means to be a robot, and in the process, make us question what it means to be human—what sets us apart from the machines we make.

Image Credit: Mirko Tobias Schaefer/Flickr, patchtok/Flickr, Marcin Wichary/Flickr