The dangerous trend for automating censorship, and circumventing laws
Ruth Coustick-Deal – P2P Foundation, 28 February 2018


Deals between companies and governments to automate the policing of acceptable content online are becoming all too common. While content filtering is being openly proposed in EU copyright law, in other cases it is wrapped up in closed-door agreements.

Ruth Coustick-Deal, writing for OpenMedia.org, lays out the "shadow regulation" complementing the dubious legal propositions being drafted to curtail sharing.

Ruth Coustick-Deal: As the excitement over using automation and algorithms in tech to “disrupt” daily life grows, so too does governments’ desire to use it to solve social problems. They hope “automation” will disrupt piracy, online harassment, and even terrorism.

This is particularly true in the case of deploying automated bots for content moderation on the web. These autonomous programs are designed to detect certain categories of posts, and then take down or block them without any human intervention.

In the last few weeks:

1) The UK Government has announced it has developed an algorithmic tool to remove the ISIS presence from the web.
2) Copyright industries have called for similar programs to be installed in the United States that can remove unapproved creative content.
3) The European Commission has suggested that filters can be used to "proactively detect, identify, and remove" anything illegal – from comments sections on news sites to Facebook posts.
4) The Copyright in the Digital Single Market Directive, currently being debated by MEPs, proposes using technical filters to block copyrighted content from being posted.

There’s a recklessness to all of these proposals – because so many of them involve sidestepping legal processes.

EFF coined the term “shadow regulation” for rules that are made outside of the legislative process, and that’s what is happening here. A cosy relationship has developed between business and governments over limiting online speech – one that the public is being left out of.

Let’s take a look at Home Secretary Amber Rudd’s anti-terrorist propaganda tool. She claims it can identify “94% of IS propaganda with 99.995% accuracy.” Backed by this remarkably bold claim, the UK Government wants to make the tool available for installation on countless platforms across the web (including major platforms like Vimeo and YouTube), which would then detect and remove such content. However, this is likely to take the form of an unofficial “agreement”, rather than legislation scrutinised by parliament.

Similarly, regarding the European Commission’s communication on automating the blocking of illegal content, our friends at EDRi point out that “the draft reminds readers – twice – that the providers have “contractual freedom”, meaning that… safeguards will be purely optional.”

If these programs are installed without the necessary public debate, a legal framework, or political consensus – then who will they be accountable to? Who is going to be held responsible for censorship of the wrong content? Will it be the algorithm makers? Or the platforms that utilise them? How will people object to the changes?

Even when these ideas have been introduced through legal mechanisms, they still give considerable powers to the platforms themselves. For example, the proposed copyright law we have been campaigning on through Save the Link blocks content that has simply been identified by the media industry – not content that is actually illegal.

The European Commission has suggested using police to tell the companies when a post, image, or video is illegal. There is no consideration of using courts – which elsewhere are the institutions that decide questions of justice. Instead we are installing systems that bypass the rule of law, with only vague gestures towards due process.

Governments are essentially ignoring their human rights obligations by putting private companies in charge. Whether via vague laws or back-room agreements, automated filtering is putting huge amounts of power in the hands of a few companies, who are getting to decide what restrictions are appropriate.

The truth is, the biggest platforms on the web already have unprecedented control over what gets published online. These platforms have become public spaces, where we go to communicate with one another. With these algorithms however, there is an insidious element of control that the owners of the platforms have over us. We should be trying to reduce the global power of these companies, rather than hand over the latest tools for automated censorship to use freely.

It’s not just the handing over of power that is problematic. Once something has been identified by police or by the online platform as “illegal,” governments argue that it should never be seen again. What if that “illegal” content is being shown for criticism or news-worthy commentary? Should a witness to terrorism be censored for showing the situation to the world? Filters make mistakes. They cannot become our gods.

Content moderation is one of the trickiest subjects being debated by digital rights experts and academics at the moment. There have been many articles written, many conferences on the subject, and dozens of papers that have tried to consider how we can deal with the volumes of content on the web – and the horrific examples that surface.

Without a doubt, however content moderation happens online, there must be transparency. It must be specified in law exactly what gets blocked. And the right to free expression must be considered.

Article 13 will set back creativity. We let the artists speak for themselves.
Ruth Coustick-Deal – P2P Foundation, 26 February 2018


Content filtering, bots scanning for copyrighted content and then blocking what they find, will seriously harm creativity in Europe. That’s why artists are joining together to speak out against it.

Continuing our coverage of the European Parliament’s heinous proposition for filtering uploaded content, Ruth Coustick-Deal consults with the artistic community. Republished from OpenMedia.org.

Ruth Coustick-Deal: Last week we asked our community to let us know how people in their profession will be harmed by content filtering (Article 13) and the link tax (Article 11).[1] We’ve heard from more than 1,000 people already, and more responses are still coming in every day.

Now it’s time to take the message to the person at the centre of this decision. Axel Voss MEP is both in charge of the key committee and a leader in his party.

Axel Voss MEP published his “compromise” (in name only) today. Essentially, he kept Oettinger’s original flawed proposal. Despite public voices. Despite tens of thousands of people speaking up against it. Despite robust academic critique. We are still faced with unaccountable censorship machines.

Axel Voss MEP needs to directly hear why the public are so opposed to automated censorship machines. He has the most influence on this law. He has all this power, and is still clinging on to broken, unpopular proposals.[2]

If MEPs like Voss want the web to work for artists, they need to start listening to the individuals, not just the big industry groups.

They try to tell us that automated content filtering – bots scanning for copyrighted content and then blocking what they find – will help creativity flourish. We know that it won’t. Consider Adam Neely,[3] a YouTube-based jazz teacher who couldn’t play short snippets of music to analyse them, because the music was owned by Universal Music Group, which had the content blocked and taken down. We will see more and more of this kind of censorship across Europe if these filters are legally mandated.

That’s why we are working with the Create/Refresh coalition.[4] They are a network of artists from across Europe who are opposed to Article 13. These creators produced a video which illustrates all of their talents, and their unity against these rules. Watch the video to find out more.

We need Axel Voss to see this! Let’s make sure he can’t ignore it, and knows that the very people he claims to be speaking for oppose him.

This is just a small sample of art made possible because we don’t have these excessive restrictions that do nothing for creators.

Please give them a voice. Share the video with Axel Voss on Facebook and Twitter demanding he rejects content blocking and the link tax.

We know that tweeting at MEPs can be hugely effective if we raise a chorus too loud to ignore – MEPs pay attention to what people are saying on social media. Let’s show Axel Voss that artists are not asking for his “protection”; what they want is the freedom to create.

Footnotes

[1] Help our censorship impact research AND speak to your MEPs. Source: OpenMedia
[2] Green light for upload filters: EU Parliament’s copyright rapporteur has learned nothing from year-long debate. Source: Julia Reda
[3] When I want to teach but can’t, thanks to Universal Music Group. Source: Adam Neely
[4] Create Refresh Coalition website.
