This is ORG's Policy Update for the week beginning 02/10/2017.

If you are reading this online, you can also subscribe to the email version or unsubscribe.

ORG’s work

Save the date for ORGCon 2017 - it will take place on Saturday 4 November at Friends House on Euston Road in London. We have a second smaller event planned on Sunday 5 November in a different location (TBC). This year is all about the Digital Fightback. Confirmed speakers include Graham Linehan, Noel Sharkey, Helen Lewis, Jamie Bartlett and Nanjira Sambuli. Tickets are on sale now!

Planned local group events:

ORG London is organising an informal pub gathering on 10 October with Myles Jackman to discuss the current state of digital rights legislation in the UK, including the progress of the Digital Economy Act.

Join ORG Edinburgh for a free screening of The Internet’s Own Boy - the life story of programmer, writer, political organiser and internet activist Aaron Swartz - on 11 October. Following the screening, our Scotland Director Matthew Rice will be available to discuss ORG’s work.

Official meetings

Javier Ruiz attended a meeting organised by the Royal Society and the Government Digital Service (GDS) to review the Data Science Ethical Framework.

Jim Killock attended a roundtable meeting organised by the Global Network Initiative on how governments can most effectively address concerns about content and protect human rights.

UK Parliament

Data Protection Bill to have its Second Reading next week

The Data Protection Bill is due to be debated in the Second Reading in the House of Lords on 10 October.

Article 80(2)

We have previously raised concerns that the draft Bill does not contain provisions allowing independent privacy organisations to bring complaints without naming individual data subjects. Article 80(2) of the EU’s General Data Protection Regulation is a derogation (an optional provision) that would allow this, but the UK decided not to include it in the Bill. Such a provision would help organisations investigate harmful data processing practices.

The Bill will allow people to lodge a complaint themselves or to designate a qualifying organisation to file a complaint on their behalf. However, affected data subjects are not always willing to come forward, as they may not wish to be publicly associated with certain companies.

The missing “representative” requirement

Another issue arising from the draft Bill is the removal of the “representative” requirement from the text. The EU’s General Data Protection Regulation covers the processing of personal data of EU data subjects by data controllers (companies) not established in the EU. In such circumstances, the GDPR requires companies based outside the EU that wish to offer services to people in the EU to appoint a representative in a Member State.

The draft Data Protection Bill does not include the “representative” requirement. As a result, it will not be possible to enforce all rights and obligations against non-UK companies offering services to people in the UK if something goes wrong.

For more details read the blog post by Amberhawk.

Other national developments

Viewing websites or media streams of terror propaganda will be a criminal offence

The Home Office announced that counter-terrorism laws will be updated and will include an offence of repeatedly viewing terrorist content online. The offence could result in a 15-year jail sentence.

The updated law is the Government’s effort to tackle online radicalisation. The changes will strengthen the offence defined in Section 58 of the Terrorism Act 2000, which currently applies to online material that has been downloaded and stored on the offender’s computer, saved on another device or printed.

The new offence will apply to material that is viewed repeatedly or streamed online. Additionally, the maximum penalty will be increased from 10 to 15 years.

The new wording of the offence raises serious questions about safeguards for the general public and for people who view the material for legitimate reasons. Journalists, anti-terror campaigners, academics and others may need to view extremist content frequently. The law may also dissuade potential informants from coming forward, because viewing the material would already have criminalised them.

In its statement, the Home Office said the offence will retain a defence for viewing material by mistake or out of curiosity, without criminal intent, as well as a “reasonable excuse” defence that applies to journalists, academics and others with a legitimate interest.

Max Hill QC, the Independent Reviewer of Terrorism Legislation, regards the announcement by the Home Secretary Amber Rudd MP as merely an update to an already existing offence, not a new offence altogether.

However, Hill stresses that the amended Section 58 must not indict individuals based on the contents of the internet cache on their computers. Viewing terrorist video material online once can cause a website to store multiple news stories and images in the user’s cache, and the new offence should not encompass this situation.

He further identified that criminalising the “repeated” viewing of terrorist material will cause problems when attempting to define how many views constitute repeated viewing with criminal intent.

Home Office to crack down on online child sexual abuse with new technology

The Home Office made another announcement this week regarding a new technology they plan to use to tackle online child sexual abuse.

The Government has made an investment of £600,000 in technology that will allow Internet companies to identify and remove indecent images of children from websites at an “unprecedented rate”.

The technology, Project Arachnid, uses lists of digital fingerprints (hashes) to detect known illegal images and issue removal notices to the websites that host them. Internet and social media companies will be able to implement it in their systems via a plugin.
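As a rough illustration of the hash-list approach described above, the sketch below matches file contents against a list of known fingerprints. It is a deliberate simplification: it uses exact cryptographic (SHA-256) hashes, whereas systems like Project Arachnid reportedly use perceptual hashing so that resized or re-encoded copies still match. The `KNOWN_BAD_HASHES` set and the `should_remove` helper are hypothetical names for this example, not part of any real deployment.

```python
import hashlib

# Hypothetical hash list for illustration; real deployments use curated
# databases maintained by child-protection organisations. The entry below
# is the well-known SHA-256 digest of empty input, used purely as a demo.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 digest of the file contents as a hex string."""
    return hashlib.sha256(data).hexdigest()

def should_remove(data: bytes) -> bool:
    """Flag content whose fingerprint appears on the known-bad list."""
    return fingerprint(data) in KNOWN_BAD_HASHES

print(should_remove(b""))       # True: its hash is on the demo list
print(should_remove(b"hello"))  # False: unknown content
```

Because matching works on fingerprints rather than the images themselves, the hash list can be distributed to platforms without circulating the illegal material, which is the design choice that makes this approach workable at scale.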

The project could address part of the criticism made this week by the Home Secretary, Amber Rudd MP, who said that paedophiles use end-to-end encrypted services (such as WhatsApp) to communicate beyond the reach of law enforcement.

Rudd said she does not

“accept it is right that companies should allow them and other criminals to operate beyond the reach of law enforcement. We must require the industry to move faster and more aggressively. They have the resources and there must be greater urgency.”

She went on to criticise the attitude tech companies show when asked to compromise the encryption they use to protect their users. Rudd explained that she does not

"need to understand how encryption works to understand how it's helping the criminals.”

Investigatory Powers Tribunal consultation

The Investigatory Powers Tribunal launched a consultation on updated rules (pdf) governing proceedings and complaints at the Investigatory Powers Tribunal.

The Investigatory Powers Tribunal Rules 2000 set out the procedures the IPT should follow. The consultation closes on 10 November.

To respond to the consultation follow this link.

Europe

Germany’s new online hate speech code

New rules on policing online hate speech took effect in Germany on 1 October. The law is aimed at Internet companies such as Facebook, Twitter and Google.

Under the new law, Internet companies are required to remove illegal content from their platforms within 24 hours. If they consistently fail to do so, they could face fines of up to €50 million. The law places more responsibility for policing the Internet on companies, an approach in line with other initiatives in the UK and France.

According to reports, Facebook and Twitter failed to remove 70% of online hate speech within 24 hours of being notified by users. Google, on the other hand, met the criteria for removing illegal content.

Internet companies will need to hire representatives who will inform local authorities about the company’s efforts to remove potential hate speech material. The benchmark of removing content within 24 hours is supposed to apply only in the most egregious cases. If there is a doubt whether content should be removed, companies will have seven days to make their decision.

Additionally, Internet companies are expected to create avenues for their users to easily report online hate speech. The changes should be made public by early 2018.

ORG media coverage

See ORG Press Coverage for full details.

Staff page