The scandals that have plagued Facebook for more than two years have led a U.K. committee to a stark conclusion — social media companies pose unique problems that require a new type of regulation.

The U.K. Digital, Culture, Media and Sport Committee issued its final 108-page report on disinformation and fake news on Monday in London, detailing a variety of investigations that included Facebook’s data privacy practices, its content moderation and the company's data-based ad targeting platform.

It includes a strong rebuke of Facebook's actions, including the allegation that it "intentionally and knowingly violated both data privacy and anti-competition laws."

"Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law," the report added.

The report adds to growing calls from activists, academics and politicians for increased oversight of tech companies — calls that have percolated for years but gained widespread attention following the discovery of Russia’s effort to interfere in the 2016 U.S. election. Those calls also mark a significant shift for social media companies, which had previously avoided taking responsibility for the content posted on their networks or how their ad platforms were used.

(Image: A demo booth at Facebook's annual F8 developer conference on April 18, 2017, in San Jose, California. Noah Berger / AP file)

“Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites,” the committee wrote in its report.


The report also concluded that the U.K.’s existing laws are not capable of effectively regulating Facebook and other big tech platforms. It calls for the creation of a “new category” of company that would have greater legal responsibility for regulating the content that users upload and how its advertising systems are used.

The report recommended that social media companies should be held to a “compulsory Code of Ethics, overseen by an independent regulator” that would decide what kind of content should be disallowed. It would also establish a legal framework to fine companies if they are found not to be effectively enforcing those rules.

Facebook CEO Mark Zuckerberg declined requests to appear before the committee, a decision that the report said showed “contempt towards both the UK Parliament and the ‘International Grand Committee’, involving members from nine legislatures from around the world.”

The report also further details Facebook’s previous data privacy practices that have come under scrutiny in the U.K. and the U.S., where The Washington Post reported Facebook faces a record multi-billion dollar fine from the Federal Trade Commission. In October, the U.K. Information Commissioner’s Office hit Facebook with a £500,000 fine (about $644,000) for failing to ensure user privacy in relation to the Cambridge Analytica scandal.

Karim Palant, Facebook’s U.K. public policy manager, said in an emailed statement that Facebook agreed with the report’s calls for changes to election laws that would effectively address issues with digital advertising and pointed to changes that it had already made.

“We are open to meaningful regulation and support the committee's recommendation for electoral law reform. But we're not waiting,” Palant said. “We have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for 7 years. No other channel for political advertising is as transparent and offers the tools that we do.”

Palant also said Facebook supports data privacy laws.

“While we still have more to do, we are not the same company we were a year ago,” Palant wrote. “We have tripled the size of the team working to detect and protect users from bad content to 30,000 people and invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse.”

Jason Kint, CEO of Digital Content Next, a trade group that represents media organizations, said the takeaways from the report were important but should not be limited to the U.K.

“It’s critical to understand the key findings of the report are global challenges not limited to the U.K.,” Kint wrote in an email.

Kint, a vocal critic of Facebook, noted that some of Facebook’s data practices referenced in the report came after an agreement with the FTC — a detail he said highlighted that the company had not changed its ways.

“The inability for Facebook to prioritize security issues over profit despite being under a FTC Consent Decree in the U.S. speaks volumes,” Kint wrote.