At long last, Facebook might be made to face the music for consistently violating the privacy of its users. The Federal Trade Commission is reportedly considering a multibillion-dollar fine for the social network’s abuses — abuses that are at the core of its business model of harvesting people’s personal information and selling access to it on the open market.

We have little confidence, though, that even heavy fines will change Facebook’s relentless and opaque pursuit of your data, on and off its platform.

Founder Mark Zuckerberg told us as much in a recent defense of the company headlined “The Facts about Facebook.”

Zuckerberg acknowledged in that piece that Facebook must pursue personal data as part of its core mission to sell ads so it can continue “to offer services for free.”

Free is a loaded term here. Facebook is not free. Those of us using it without cost are its primary product.

In this regard, the FTC’s effort to force a settlement is a good step.

The problem is that the FTC can’t address Facebook’s primary problem, one that is shared across the spectrum of tech giants.

That problem is accurately identified in a recent lawsuit filed by a DART police officer who took a bullet on what was surely the worst night in our city's history. On July 7, 2016, four Dallas police officers and one DART officer were murdered by a gunman with a high-powered rifle in downtown Dallas. Nine other law enforcement officers and two civilians were wounded.

DART officer Jesus Retana claims that Facebook, Twitter and Google enabled the shooter because their platforms knowingly support terrorist groups that used social media to radicalize the shooter.


We aren’t ready to describe Retana’s claim as true or not true. But it points up the continued problem that the tech giants pose for our society.

We have insulated them from the legal boundaries that traditional publishers and broadcasters have historically worked under.

The tech giants do not face the same legal liability that traditional publishers do for publishing libelous material. They may also enjoy protection from liability for hosting the sort of activity that Retana claims radicalized the murderer in Dallas.

In the U.S., the tech giants are largely left to self-police. They have proven unequal to the job. And how can we be surprised? Policing against such abuses cuts against a core element of their business models. In the case of Facebook and Twitter, it even appears to cut against their founders’ false and naive belief that the truth will somehow rise out of the sea of lies, libel and hatred that pollutes social media.

That hasn’t happened. Quite the opposite. In many places where social media giants have become a more dominant source of information, democracy has been threatened and minority communities have been oppressed or worse.

So yes, the FTC’s decision to take on Facebook is a step in the right direction. But in the end, billions of dollars in fines will likely not be enough to deter privacy intrusions by companies with billions in profit. And even then, it’s not clear that the FTC is the right agency to regulate the tech giants.

What America needs is a clearer regulatory environment that acknowledges social media is the most powerful publisher in the history of humanity and treats it that way — just as prior generations of lawmakers recognized the power of the printing press and the radio signal.

This is not a question of conservative or liberal principles.

It is a question of making companies responsible for what they publish.

This editorial was written by the editorial board and serves as the voice and opinion of The Dallas Morning News.