Yahoo personalizes headlines for its audience of over 700 million people through its Content Optimization and Relevance Engine, an algorithmic system based on demographic data and reading behavior. As a researcher who studies digital media, I was aware that my news was filtered, but I had never noticed the filtering process in action, probably because, until now, Yahoo had guessed me right. (Or at least not so gruesomely wrong.) When the headlines first appeared, I thought they were an anomaly, but as the weeks went on, I noticed a pattern. Going through my search history, I could trace the emergence of stories to the day I read about the Bustamante murder.

What I could not determine was whether I was alone in receiving such lurid headlines. When television news airs stories on murdered children, people can complain to each other about the sordid exploitation, the manipulation of tragedy for ratings gain. A mass media event can be dissected, and protested, by the masses.

In my case, the focus could only turn inward: why was Yahoo feeding me such grisly material? Was it really because I had clicked on that one story? Was it because I am a mom in Missouri, and this is what they think moms in Missouri like to read? Yahoo had become a character assessment -- not one to be taken seriously, but one that turned news consumption into self-analysis, or at least an analysis of my algorithmic analogue: What did the headlines say about me?

* * *

In 2010, noted technophobe Jonathan Franzen revealed his fondness for AOL: "AOL's little box -- the welcome screen, they call it, I guess -- is so infuriating in its dopiness: 'Surprising Leader In The Masters! Find Out Who!' 'Ten Things To Think About When Choosing A Hotel!' 'What Smart Travelers Know About X!' It's all in compact form, and it kind of tells me everything I need to know about the larger stupidity. It helps keep me in touch."

Condescension aside, Franzen has a point. Portals like AOL or Yahoo, with their mix of gossip and politics and recipes and celebrity death watch masquerading as "trending now" (Larry Hagman? Ernest Borgnine?), feel like a throwback to a time of inclusive, if dim-witted, media. I am not alone in my taste for the larger stupidity: Yahoo is the most popular online news source in the world. Unlike a website whose sole purpose is news, Yahoo's headlines seem like too much of an afterthought to be pointed. Unlike Facebook or Google, with their mercurial platforms and pretense to philosophy, Yahoo seems too uncool to control you.

Yet that might be the reason it is effective in doing so. Few perusing Yahoo headlines would suspect that children murdering children is a reader category chosen by robots. This is disturbing on an epistemological level, but it may also have practical consequences. As we rely on internet media to give us a taste of what's going on, we don't realize we're consuming a particular flavor. A sudden uptick in stories on violence -- particularly by or against a specific demographic category -- can spur paranoia, prejudice and vigilante behavior. What a machine thinks we need to know can become what we fear. But because the algorithmic process is both secret and subjective, we have no way of tracking the ramifications.

Media organizations have long been accused of bias. Social media shifted that bias from the organization to the user -- the filter bubble of news chosen by friends, the friends themselves filtered by assumed similarities. But now we must contend with the bias of a false version of ourselves. Yahoo's murder feed exposes the algorithmic process for what it is: personalization without the person.