1995 was a great year to be a college freshman. Our government was stable. The economy was strong. The closest thing to war was a few missiles fired here and there by remote control. History was at an end and the American-led West had won. It was only a matter of time before every nation on earth converted to democracy and plugged into the New York Stock Exchange.

And to top it all off, there was this new thing, a nascent network of computers bringing unprecedented connections for communication and access to information from all over the world.

The World Wide Web was born, and we were there to see it.

Every new college student received an email account. It was like the world had given us a driver's license for the future, and our parents didn't have one.

I still remember the first website I saw. A few of my cohorts and I walked to the campus computer lab one afternoon after a lunch trip to Taco Bell.

"Name a company," my friend said, sitting in front of a computer.

I looked at the cup in my hand.

"Taco Bell," I said.

He typed "http://www.tacobell.com" into the Netscape browser and there it was -- a crude site with a list of Taco Bell locations and an electronic taco you could "eat" by clicking on it.

Whoa.

Then my friend asked us to come up with the sickest sexually deviant behavior we could imagine. I don't remember the exact combination, but I do remember it involved hamsters. A few more clicks on a keyboard and there it was -- a colonoscopy via Habitrail.

Nasty.

The Internet was our genie, ready to grant us any wish.

At least until Congress tried to wrest control of it. Dirty pictures of naked people doing nasty things! And children could go on their computers and see this stuff! Somebody had to stop this, and Congress decided that somebody was them.

They failed, obviously, but one unintended side effect of their attempt lasts to this day -- a line of code, if you will, in federal law, that changed the world.

When Congress passed the Communications Decency Act of 1996, it meant to make indecency on the internet illegal. Civil libertarians howled. Lawsuits flew through the courts like flying monkeys, challenging the law as an unconstitutional encroachment on the First Amendment.

By the time my sophomore year began, my computer lab pal was now my college roommate. We did our part to resist the oppressive CDA: We changed the desktop backgrounds in the campus computer labs to a cartoon of a puppy peeing on the bill.

Take that, feds!

The lawsuits, it turned out, worked a lot better. By 1997, the federal courts had shredded the law. Only one significant provision of the CDA remained: Section 230.

A bug and a feature

About the same time I was vandalizing computer lab desktops, I took a course in computer science and learned to write a little code. One of the first things you learn when you write code is that a single misplaced punctuation mark can crash the whole program, corrupting your product or even negating its intent.

Section 230, codified in Title 47 of the United States Code, was such a line. It did anything but make the internet decent.

The critical portion says this:

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

What that means is this: If you have a platform on the internet and you allow users to publish information on that platform, you cannot be held liable for what those users put on your site.

Pretty much everything everyone hates about the internet today -- anonymous trolls in comment threads, Russian bots, revenge porn, fake news, videos of ISIS beheadings, ads for black market drugs, cookies that track your searches and then chase you around the internet with ads for the things you searched for -- all of it can be traced back to this line in our federal code. None of it would be possible without it.

But for some, it wasn't a bug. It was a feature.

Pretty much everything fantastic about the internet today -- Facebook, Twitter, YouTube, Instagram, etc. -- would not exist without it, either.

Freedom from responsibility

The key thing to understand here is that 230 separated the role of platforms from that of publishers. For instance, if you write a letter to the editor that libels somebody and the newspaper prints that letter, the newspaper can be held liable for publishing your defamation. It's a publisher.

But if you post those libelous things in a comment thread on a newspaper's website, the company can't be held liable. That's a platform.

Suddenly tech startups had all the freedom of the First Amendment and none of the responsibility. Without Section 230, Twitter would have to vet each tweet and YouTube would have to screen each video. (And before you say it, yes, we would have to sift every comment on AL.com.) User-generated content as we know it would be legally dangerous and cost-prohibitive.

Under 230, platforms can let users publish whatever, all while running ads next to that content and putting the profits in their pockets.

Freedom from responsibility is not what the authors of 230 intended.

Around the same time my pals and I were horsing around in the college computer lab, a New York court ruled that Prodigy (a competitor of AOL) had made itself liable for user-generated content. Prodigy, the court found, had moderated its user forums, and by doing so, had touched that content -- making it responsible for all content on its platform.

In other words, by trying to clean up some of the content on its platform, Prodigy had become liable for all of it. Exercising any editorial control had become a legal hazard.

Christopher Cox, a Republican from California, and Ron Wyden, a Democrat from Oregon, saw this disincentive as dangerous to the internet and the pioneers staking their claims there. They tried to fix the problem, first with a standalone bill. When that bill failed to pass, they tried again, inserting much the same language into the CDA.

Their amendment became Section 230.

As the other provisions of the CDA died in the federal courts, my internet libertarian pals and I celebrated those victories. The digital world would be free from government interference, we thought.

But ultimately, it was the CDA that made this new realm free from responsibility and densely populated with new hazards.

The End of the End of History

Mark Zuckerberg was 11 years old when the CDA passed into law. Eight years later -- as he reminded the world over and over again in his testimony before Congress this week -- he launched Facebook from his college dorm room. He didn't even have to walk to the computer lab.

What a crazy world.

I thought two things while watching Zuckerberg's testimony this week.

First, I had an eight-year lead on this kid but I'm not yet a billionaire. I haven't rattled markets with my pronouncements. I haven't hosted a party for my company's IPO. But also, I haven't poisoned democracy throughout the world, nor have I empowered genocide in Myanmar.

I call it a wash.

The second thing I thought was this: These wrinkly old men in Congress haven't learned a damn thing about the internet since 1995.

One after another in the Senate committee room Tuesday, lawmakers put their outrage on display, but also their ignorance. Not one of these buffoons could set up an iPhone without an intern's help.

Zuck sat there and took it. As one senator demanded to know when Facebook would give users control of their privacy and information, he patiently explained that users already could. He was ready with answers, but the old men weren't ready with questions. I kept waiting for one of them to demand Zuck explain how to program a VCR.

These are men (some women, but mostly men still) with incredible power. How many of them, do you figure, know they are the ones who made Zuckerberg's success possible? How many of them understand this unruly digital world exists, not because of Silicon Valley but because of Washington, D.C.?

All because of one line of code.

It's time for those wrinkly old men to smarten up, get with the times, and debug the law.

Doing so would, for a time, turn the world on its head. Facebook, Twitter, YouTube and other tech giants would have to evolve quickly and drastically. The disrupters would be the disrupted. Some would probably go bankrupt. Many newly rich people would suddenly be poor. There is a cost to be paid.

But there's a benefit, too. Each day, it becomes more apparent that the social media giants are the tobacco industry of the 21st century. Go on Facebook or Twitter today and you'll see one friend or another announcing they are deleting their accounts with all the boasting and bravado of a smoker who just threw a carton of cigarettes in the trash. But they'll be back. We know this stuff is poisonous, but we can't quit.

Zuck will cry. Folks like that 18-year-old me in the college computer lab will howl. They'll say deleting 230 will destroy freedom on the internet. That's nonsense.

Deleting 230 will restore responsibility.

The internet's birth and adolescence have been fast, painful and strange, but it's time for the internet, and for us, to grow up.

Kyle Whitmire is the state political columnist for the Alabama Media Group.
