This week, as I was sitting in my father's hospital room, I glanced at my phone and saw something strange going on with my Twitter notifications. A story I wrote five years ago was suddenly exploding across social media. It was my tribute to Dennis Ritchie, the creator of the C programming language and co-creator of Unix, republished from Ars by Wired on October 12, 2011:

"By creating C, Ritchie gave birth to the concept of open systems." https://t.co/C1v9oE7sNG nice eulogy for dmr by @thepacketrat — ROBERT HACKETT (@rhhackett) October 13, 2016

I wasn't sure what had set this off. But the deluge built even further after this tweet from Om Malik:

Dennis Ritchie, Father of C and Co-Developer of Unix, Dies https://t.co/LxkCK6i1DE via @WIRED — Om Malik (@om) October 13, 2016

Malik later apologized for posting a five-year-old story—one that was perhaps overshadowed at the time by the attention paid to the passing of Steve Jobs a week earlier. But he clearly wasn't the only one who thought the story was fresh news, as my Twitter and Facebook timelines showed.

@JZdziarski wow, I didn't check the year. Someone reshared on FB and it started a mini avalanche. — Ryan Lackey (@octal) October 12, 2016

Dennis Ritchie, Father of C and Co-Developer of Unix, Dies https://t.co/jFpZjrtOpy via @WIRED — craignewmark (@craignewmark) October 13, 2016

Thank you for your immense contributions, RIP Dennis Ritchie https://t.co/LVacgkcXvl — sundarpichai (@sundarpichai) October 13, 2016

This is not a new phenomenon. Social media snap-posts have killed off celebrities hundreds of times before their actual deaths (to the point where dedicated websites now exist to constantly fact-check their mortality). Facebook is full of years-late "RIP" posts. The Internet may never forget, but the humans who use it have become increasingly absent-minded.

It wasn't even just my story that went viral—a similar Guardian story also resurfaced, probably thanks to Facebook's "memories" feature or some other mechanism that dredges up old content. Still, there was something personally unsettling about having words I had written in tribute to "dmr"—a man I personally credited with making possible my early exposure to computing and its potential—suddenly resurface five years later.

The first few times I spotted Twitter acting up, I thanked people for resurfacing the story after so much time. But reading the post again—partially to make sure I hadn't somehow written another tribute subconsciously from my perch at my dad's bedside—was affecting in ways I didn't expect. Maybe I got emotional because I was in a hospital room with my father, who was recovering from an other-than-routine knee replacement surgery, and I had spent the day before sitting in a surgical waiting room.

Whatever the reason, I felt things I still can't fully deconstruct. Social media is a machine that both stirs and feeds upon human emotion (as demonstrated by this particular presidential election cycle). Even though I knew my piece on Ritchie was five years old, all the emotions I had felt writing it came back in a strangely amplified and distorted way.

At the time, I noted, "Ritchie has shaped our world in much more fundamental ways than Steve Jobs or Bill Gates have. What sets him apart from them is that he did it all not in a quest for wealth or fame, but just out of intellectual curiosity." That sort of intellectual curiosity sometimes gets numbed by social media, just as it has been marginalized by popular culture along with so many other things of the intellectual variety.

Perhaps I mourn dmr even more today than I did five years ago—because I realize we live in an age when there are increasingly few opportunities for individuals like him to color outside the lines of purely profit-driven work. The opportunities to create things that could have the same ripple effect that C and Unix had are going away. And given the way social media works, I'll probably get the chance to revisit those feelings over and over.