The report further asked for 10 guiding "principles" to dictate rules, including accountability, privacy, transparency, human rights and protections for children. The lords also wanted to enforce a general "duty of care" among internet companies, requiring them to take "reasonable" steps to prevent harm. They'd also aim for clearer community content standards through a classification system similar to the one governing British movies.

The committee pushed for more specific regulations beyond this. Companies should enable the strictest privacy and safety settings by default, leaving it to users to loosen the controls. If a firm handles data, it would be required to publish yearly transparency reports showing how it develops, buys, uses and stores behavioral info. The ICO would conduct audits to explore the risks of algorithms, while the government should consider a "public-interest test" for mergers where data plays a role.

There's no guarantee this will lead to a Digital Authority, but the House of Lords contended that there might not be much choice. The current rule system is "out of date," according to Lord Gilbert of Panteg, and self-policing is "clearly failing." Instead of reacting to the news, Lord Gilbert argued the UK should be "looking ahead" and setting policies that can apply to services in the future.

We wouldn't count on internet giants taking this gracefully. While they haven't been completely opposed to regulation, they've generally tried to avoid it where possible. A Digital Authority and the proposed changes could leave them with no choice but to alter their practices in the UK, sometimes in drastic ways. There are also potential issue-specific problems, such as the possibility that a "duty of care" requirement could limit internet giants' safe harbor protections and hold them responsible for user content. The House of Lords is aware of that concern and doesn't want to eliminate the protections for fear of stifling free speech, but finding a balance between freedom and responsibility might be difficult, if not outright problematic.