We tend to speak of the Internet as if it were a person—a needy, arresting, generous, pigheaded person. The exact phrase “the Internet hates” gets nearly two million hits (often followed by Beyoncé, Justin Bieber, Anne Hathaway, and rich people), while “the Internet loves” garners nearly a million (often followed by cats). We say that the Internet can be angry, scared, and confused. It is freaking out, but also, on occasion, nice.

Of course, not everybody refers to the Internet as if it were a demanding relative, but the tendency, fairly common in my circles, makes sense in a way. Because so few of the people we encounter online are actually real to us, we associate photos and videos and blog posts as much with the medium as with the messenger.

Perhaps we can be forgiven the confusion. A recent study published in the Journal of Experimental Psychology: General by E. J. Masicampo and Nalini Ambady (available on Ambady’s webpage) noted some surprising similarities between the Internet—that great aggregator of precocity and inanity—and the individual human mind.

The researchers—using Google Trends, which tallies how often particular search terms are used over time, thus gauging the Internet’s interest in a subject—investigated responses to two different types of current events. Incidental events are unpredictable, one-shot occurrences: food recalls, celebrity deaths, the naming of Nobel Prize winners. Goal-directed events, by contrast, are anticipated over time and have a resolution: a government election, say, or planning for a holiday meal.

Masicampo and Ambady found that the use of search terms for incidental events (e.g., “Al Gore” after he was awarded a Nobel Prize, or “Farrah Fawcett” after the actress’s death) shot up quickly, peaked, and then gradually declined. A few weeks later, the searches were still higher than they’d been before the triggering event. Searches for goal-directed events, however, increased steadily over time, peaking right around Easter, say, or election day, before plummeting quickly. A few weeks later, searches for those terms were lower than they’d been initially.

These patterns are interesting, the researchers argue, because they mirror how an individual experiences these events. After exposure to a stimulus, we confront a long, slow “forgetting curve” whereby the stimulus becomes less and less accessible to us. (Consider how faintly you recall a trigonometry lesson two months post-test—or how long the details of this post will stay sharp in your memory.) But the accessibility of goal-related information, such as an upcoming job interview or beach vacation, builds as the goal approaches. And once the goal has been met, and we have little need to dwell upon it, we don’t.
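The two trajectories described above can be sketched as simple curves: a sharp spike followed by a slow exponential decay that settles slightly above baseline (incidental), versus a gradual build toward a peak followed by a sharp drop slightly below baseline (goal-directed). The functions and all parameter values below are illustrative stand-ins, not figures from the Masicampo and Ambady paper:

```python
import math

def incidental_interest(t, t_event=10.0, baseline=1.0, spike=9.0,
                        decay=0.3, residual=0.5):
    """Toy model of search interest in a one-shot event: flat at baseline,
    a sharp spike at t_event, then a slow 'forgetting curve' that settles
    slightly ABOVE the original baseline (baseline + residual)."""
    if t < t_event:
        return baseline
    return (baseline + residual
            + (spike - residual) * math.exp(-decay * (t - t_event)))

def goal_directed_interest(t, t_goal=10.0, baseline=1.0, peak=10.0,
                           ramp=0.3, drop=1.5, deficit=0.3):
    """Toy model of search interest in an anticipated event: builds
    steadily toward t_goal, peaks there, then plummets to slightly
    BELOW the original baseline (baseline - deficit)."""
    if t <= t_goal:
        return baseline + (peak - baseline) * math.exp(-ramp * (t_goal - t))
    floor = baseline - deficit
    return floor + (peak - floor) * math.exp(-drop * (t - t_goal))

# Weeks after the event, the curves sit on opposite sides of baseline:
late = 30.0
print(incidental_interest(late) > 1.0)     # still elevated
print(goal_directed_interest(late) < 1.0)  # now depressed
```

The asymmetry is the whole point of the study: both curves peak at the event, but an incidental event leaves a lasting trace above the pre-event level, while a goal-directed one, once resolved, drops below it.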

I suspect that nothing I’ve written thus far has shocked you. Of course we stop obsessing over the beach-readiness of our bodies once we return to our everyday lives, and of course we collectively lose interest in an election after the votes have been counted. So is it really interesting that, as the researchers put it, the “subtle psychological features of an event can therefore determine how large populations will regard it over time”?

It interests me. Psychologists still understand more about the behaviors of individuals than those of groups—particularly groups of individuals, each acting of his own accord. This study offers a way in. Additionally, I find this example of self-similarity—where a phenomenon resembles itself on multiple scales—elegant. The natural and human worlds are filled with fractals and near-fractals, patterns that are self-similar or nearly so: the jagged path of lightning, the flow of blood through our veins, even the transmission of information over the Internet don’t look all that different whether the scale is inches or yards, seconds or minutes, one network or dozens. Should such patterns not be found in the way we—individually and collectively—engage with our world?