In 1995, New York City psychiatrist Ivan Goldberg logged onto PsyCom.net, then a popular message board for shrinks, to describe a new disease he called "internet addiction disorder," symptoms of which, he wrote, included giving up important social activities because of internet use and "voluntary or involuntary typing movements of the fingers."

It was supposed to be a joke.

But to his surprise, many of his colleagues took him seriously. Their response led him to create an online support group for internet addicts—though he quickly downgraded the affliction, renaming it "pathological internet-use disorder." The word addiction "makes it sound as if one were dealing with heroin, a truly addicting substance," Goldberg told the New Yorker in 1997. "To medicalize every behavior by putting it into psychiatric nomenclature is ridiculous."


Today, more than two decades after Goldberg's joke fell flat, mental health professionals find themselves in a similar bind. Public anxiety over the side effects of screen time—the hours we spend staring at our various devices—is the highest it's been in years. That anxiety has manifested in the form of self-help books, social movements, major media outlets foretelling "the worst mental-health crisis in decades," and no shortage of guilt. (You let your kid play with an iPad at restaurants? You spent 30 minutes browsing Instagram when you could have been exercising? Or playing board games with your family? Or learning a second language? You sad/selfish/lonely monster!) And yet, there exists little clear evidence that we are locked in an unambiguously harmful relationship with our devices—let alone addicted to them in any clinical sense. "For the past twelve months, the narrative surrounding technology use and screen time has been consistently negative, but it's been driven more by fear than facts," says UC Irvine psychologist Candice Odgers.

Experts like Odgers say we'll never get good answers about the effects of screen time unless we start asking better questions. And that means being honest with ourselves about what we mean by "screen time" in the first place.

This year, the conversation around digital dependence entered a new phase when Facebook CEO Mark Zuckerberg resolved to spend 2018 fixing Facebook, vowing, among other things, to ensure that time spent on the social network would be "time well spent." (Zuckerberg borrowed the phrase from former Google design ethicist Tristan Harris, who has popularized the term in recent years by characterizing it as the opposite of time surrendered involuntarily to devices, apps, and algorithms designed to "hijack our minds.") A few days after Zuck's post went public, major Apple shareholders urged that company to study its products' effects on children and equip parents with better tools for managing their kids' screen time. The following month, Harris formed the Center for Humane Technology—an alliance of tech-giant turncoats united in opposition against the attention-grabbing products they helped create.

These events helped set the tone of the year to come. "For right or wrong, big tech companies have seen which way the wind is blowing and responded," says Andrew Przybylski, an experimental psychologist at the Oxford Internet Institute. Google led the charge, pledging its commitment to digital well-being and releasing new tools designed to help Android users monitor their tech habits. Apple followed suit, unveiling features designed to help users understand and manage the time they spend on their iOS devices. Then came Facebook and Instagram, each of which released features designed to help users track, and set limits on, the time they spend in-app.


None of those companies has shared whether its tools have been effective. It's possible they never will release any data—and even if they do, researchers say, taking the figures at face value could be hard. "There wasn't a good empirical basis for their creation in the first place, so there probably won't be good evidence for their effectiveness," Przybylski says.