Censorship – P2P Foundation: Researching, documenting and promoting peer to peer practices

The EU’s Copyright Proposal is Extremely Bad News for Everyone, Even (Especially!) Wikipedia (Thu, 14 Jun 2018)

The post The EU’s Copyright Proposal is Extremely Bad News for Everyone, Even (Especially!) Wikipedia appeared first on P2P Foundation.

Republished from EFF.org

Cory Doctorow: The pending update to the EU Copyright Directive is coming up for a committee vote on June 20 or 21 and a parliamentary vote either in early July or late September. While the directive fixes some longstanding problems with EU rules, it creates much, much larger ones: problems so big that they threaten to wreck the Internet itself.

Under Article 13 of the proposal, sites that allow users to post text, sounds, code, still or moving images, or other copyrighted works for public consumption will have to filter all their users’ submissions against a database of copyrighted works. Sites will have to pay to license the technology to match submissions to the database, and to identify near matches as well as exact ones. Sites will be required to have a process to allow rightsholders to update this list with more copyrighted works.
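Article 13 does not mandate a particular technology, but filters of this kind typically fingerprint each upload and compare it against a rightsholder database. A minimal sketch (our illustration, not any real system's code) using exact cryptographic hashes shows why the required near-match detection is the hard part:

```python
import hashlib

# Toy rightsholder database: fingerprints of "registered" works.
# Real filters (e.g. YouTube's Content ID) use perceptual
# fingerprints rather than cryptographic hashes; this sketch is
# ours, for illustration only.
rights_db = {hashlib.sha256(b"registered song data").hexdigest()}

def is_blocked(upload: bytes) -> bool:
    """Exact-match filter: blocks only byte-identical copies."""
    return hashlib.sha256(upload).hexdigest() in rights_db

print(is_blocked(b"registered song data"))   # True: exact copy is caught
print(is_blocked(b"registered song data!"))  # False: one extra byte evades it
```

Because exact matching is trivially evaded, any filter that satisfies Article 13 must match approximately, and it is precisely that fuzziness that produces false positives on legitimate uploads.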

Even under the best of circumstances, this presents huge problems. Algorithms that do content-matching are frankly terrible at it. The Made-in-the-USA version of this is YouTube’s Content ID system, which improperly flags legitimate works all the time, but still gets flak from entertainment companies for not doing more.

There are lots of legitimate reasons for Internet users to upload copyrighted works. You might upload a clip from a nightclub (or a protest, or a technical presentation) that includes some copyrighted music in the background. Or you might just be wearing a t-shirt with your favorite album cover in your Tinder profile. You might upload the cover of a book you’re selling on an online auction site, or you might want to post a photo of your sitting room in the rental listing for your flat, including the posters on the wall and the picture on the TV.

Wikipedians have even more specialised reasons to upload material: pictures of celebrities, photos taken at newsworthy events, and so on.

But the bots that Article 13 mandates will not be perfect. In fact, by design, they will be wildly imperfect.

Article 13 punishes any site that fails to block copyright infringement, but it won’t punish people who abuse the system. There are no penalties for falsely claiming copyright over someone else’s work, which means that someone could upload all of Wikipedia to a filter system (for instance, one of the many sites that incorporate Wikipedia’s content into their own databases) and then claim ownership over it on Twitter, Facebook and WordPress, and everyone else would be prevented from quoting Wikipedia on any of those services until they sorted out the false claims. It will be a lot easier to make these false claims than it will be to figure out which of the hundreds of millions of copyright claims are real and which ones are pranks or hoaxes or censorship attempts.

Article 13 also leaves you out in the cold when your own work is censored thanks to a malfunctioning copyright bot. Your only option when you get censored is to raise an objection with the platform and hope they see it your way—but if they fail to give real consideration to your petition, you have to go to court to plead your case.

Article 13 gets Wikipedia coming and going: not only does it create opportunities for unscrupulous or incompetent people to block the sharing of Wikipedia’s content beyond its bounds, it could also require Wikipedia to filter submissions to the encyclopedia and its surrounding projects, like Wikimedia Commons. The drafters of Article 13 have tried to carve Wikipedia out of the rule, but thanks to sloppy drafting, they have failed: the exemption is limited to “noncommercial activity”. Every file on Wikipedia is licensed for commercial use.

Then there’s the websites that Wikipedia relies on as references. The fragility and impermanence of links is already a serious problem for Wikipedia’s crucial footnotes, but after Article 13 becomes law, any information hosted in the EU might disappear—and links to US mirrors might become infringing—at any moment thanks to an overzealous copyright bot. For these reasons and many more, the Wikimedia Foundation has taken a public position condemning Article 13.

Speaking of references: the problems with the new copyright proposal don’t stop there. Under Article 11, each member state will get to create a new copyright in news. If it passes, in order to link to a news website, you will either have to do so in a way that satisfies the limitations and exceptions of the laws of all 28 member states, or you will have to get a license. This is fundamentally incompatible with any sort of wiki (obviously), much less Wikipedia.

It also means that the websites that Wikipedia relies on for its reference links may face licensing hurdles that would limit their ability to cite their own sources. In particular, news sites may seek to withhold linking licenses from critics who want to quote from them in order to analyze, correct and critique their articles, making it much harder for anyone else to figure out where the positions are in debates, especially years after the fact. This may not matter to people who only pay attention to news in the moment, but it’s a blow to projects that seek to present and preserve long-term records of noteworthy controversies. And since every member state will get to make its own rules for quotation and linking, Wikipedia posts will have to satisfy a patchwork of contradictory rules, some of which are already so severe that they’d ban any items in a “Further Reading” list unless the article directly referenced or criticized them.

The controversial measures in the new directive have been tried before. For example, link taxes were tried in Spain and Germany and they failed, and publishers don’t want them. Indeed, the only country to embrace this idea as workable is China, where mandatory copyright enforcement bots have become part of the national toolkit for controlling public discourse.

Articles 13 and 11 are poorly thought through, poorly drafted, unworkable—and dangerous. The collateral damage they will impose on every realm of public life can’t be overstated. The Internet, after all, is inextricably bound up in the daily lives of hundreds of millions of Europeans and an entire constellation of sites and services will be adversely affected by Article 13. Europe can’t afford to place education, employment, family life, creativity, entertainment, business, protest, politics, and a thousand other activities at the mercy of unaccountable algorithmic filters. If you’re a European concerned about these proposals, here’s a tool for contacting your MEP.

Photo by ccPixs.com


The dangerous trend for automating censorship, and circumventing laws (Wed, 28 Feb 2018)

The post The dangerous trend for automating censorship, and circumventing laws appeared first on P2P Foundation.


Deals in which companies and governments work together to automate what counts as acceptable content online have become all too common. While content filtering is being proposed openly in EU copyright law, in other situations it is wrapped up in closed-door agreements.

Ruth Coustick-Deal, writing for OpenMedia.org, lays out the “shadow regulation” that complements the dubious legal proposals being drafted to curtail sharing.

Ruth Coustick-Deal: As the excitement over using automation and algorithms in tech to “disrupt” daily life grows, so too does governments’ desire to use it to solve social problems. They hope “automation” will disrupt piracy, online harassment, and even terrorism.

This is particularly true in the case of deploying automated bots for content moderation on the web. These autonomous programs are designed to detect certain categories of posts, and then take down or block them without any human intervention.

In the last few weeks:

1) The UK Government has announced that it has developed an algorithmic tool to remove the ISIS presence from the web.
2) Copyright industries have called for similar programs to be installed that can remove unapproved creative content in the United States.
3) The European Commission has suggested that filters can be used to “proactively detect, identify, and remove” anything illegal – from comments sections on news sites to Facebook posts.
4) The Copyright in the Digital Single Market Directive, currently being debated by MEPs, proposes using technical filters to block copyrighted content from being posted.

There’s a recklessness to all of these proposals, because so many of them involve sidestepping legal processes.

EFF coined the term “shadow regulation” for rules that are made outside of the legislative process, and that’s what is happening here. When it comes to limiting online speech, a cosy relationship has developed between business and governments, one that leaves the public out.

Let’s take a look at Home Secretary Amber Rudd’s anti-terrorist propaganda tool. She claims it can identify “94% of IS propaganda with 99.995% accuracy.” Backed up by this amazingly bold claim, the UK Government wants to make the tool available for installation on countless platforms across the web (including major platforms like Vimeo and YouTube), where it would detect and then remove such content. However, this is likely to take the form of some unofficial “agreement”, rather than legislation scrutinised by parliament.
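Rudd's figures invite a base-rate check. A back-of-the-envelope calculation (the arithmetic and traffic figure are ours; we read "99.995% accuracy" as a 0.005% false-positive rate on legitimate content) shows how even that tiny rate scales:

```python
# Assumptions (ours, not the UK Government's): "99.995% accuracy"
# means a 0.005% false-positive rate on legitimate uploads, and a
# large platform receives 5 million legitimate uploads per day
# (an illustrative figure).
false_positive_rate = 1 - 0.99995        # 0.005%
legit_uploads_per_day = 5_000_000

wrongly_flagged_per_day = legit_uploads_per_day * false_positive_rate
print(round(wrongly_flagged_per_day))        # 250 per day
print(round(wrongly_flagged_per_day * 365))  # 91250 per year, on one platform
```

At web scale, "99.995% accuracy" still means hundreds of legitimate posts wrongly removed every day on a single large platform, with no court involved in any of those removals.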

Similarly, regarding the European Commission’s communication on automating the blocking of illegal content, our friends at EDRi point out, “the draft reminds readers – twice – that the providers have “contractual freedom”, meaning that… safeguards will be purely optional.”

If these programs are installed without the necessary public debate, a legal framework, or political consensus – then who will they be accountable to? Who is going to be held responsible for censorship of the wrong content? Will it be the algorithm makers? Or the platforms that utilise them? How will people object to the changes?

Even when these ideas have been introduced through legal mechanisms, they still give considerable powers to the platforms themselves. For example, the proposed copyright law we have been campaigning on through Save the Link blocks content simply because the media industry has identified it – not because it is illegal.

The European Commission has suggested using police to tell the companies when a post, image, or video is illegal. There is no consideration of using courts – which, everywhere else, are the ones who make calls about justice. Instead we are installing systems that bypass the rule of law, with only vague gestures towards due process.

Governments are essentially ignoring their human rights obligations by putting private companies in charge. Whether via vague laws or back-room agreements, automated filtering is putting huge amounts of power in the hands of a few companies, who are getting to decide what restrictions are appropriate.

The truth is, the biggest platforms on the web already have unprecedented control over what gets published online. These platforms have become public spaces, where we go to communicate with one another. With these algorithms, however, there is an insidious element of control that the owners of the platforms have over us. We should be trying to reduce the global power of these companies, rather than hand over the latest tools for automated censorship for them to use freely.

It’s not just the handing over of power that is problematic. Once something has been identified by police or by the online platform as “illegal,” governments argue that it should never be seen again.  What if that “illegal” content is being shown for criticism or news-worthy commentary? Should a witness to terrorism be censored for showing the situation to the world? Filters make mistakes. They cannot become our gods.

Content moderation is one of the trickiest subjects being debated by digital rights experts and academics at the moment. There have been many articles written, many conferences on the subject, and dozens of papers that have tried to consider how we can deal with the volumes of content on the web – and the horrific examples that surface.

Whatever form content moderation takes online, there must undoubtedly be transparency. It must be specified in law exactly what gets blocked. And the right to free expression must be considered.


Article 13 will set back creativity. We let the artists speak for themselves. (Mon, 26 Feb 2018)

The post Article 13 will set back creativity. We let the artists speak for themselves. appeared first on P2P Foundation.


Content filtering, bots scanning for copyrighted content and then blocking what they find, will seriously harm creativity in Europe. That’s why artists are joining together to speak out against it.

Continuing our coverage of the European Parliament’s heinous proposal for filtering uploaded content, Ruth Coustick-Deal consults with the artistic community. Republished from OpenMedia.org.

Ruth Coustick-Deal: Last week we asked our community to let us know how people in their profession will be harmed by content filtering (Article 13) and the link tax (Article 11).[1] We’ve heard from more than 1,000 people already, and more responses are still coming in every day.

Now it’s time to take that message to the person at the heart of this decision. Axel Voss MEP is both in charge of the key committee and a leader in his party.

Axel Voss MEP published his “compromise” (in name only) today. Essentially, he kept Oettinger’s original flawed proposal. Despite public outcry. Despite tens of thousands of people speaking up against it. Despite robust academic critique. We are still faced with unaccountable censorship machines.

Axel Voss MEP needs to directly hear why the public are so opposed to automated censorship machines. He has the most influence on this law. He has all this power, and is still clinging on to broken, unpopular proposals.[2]

If MEPs like Voss want the web to work for artists, they need to start listening to the individuals, not just the big industry groups.

They try to tell us that automated content filtering, bots scanning for copyrighted content and then blocking what they find, will help creativity flourish. We know that it won’t. Consider Adam Neely,[3] a YouTube-based jazz teacher who couldn’t play short snippets of music to analyse them, because the music was owned by Universal Music Group, which had it blocked and taken down. If these filters are legally mandated, we will see more and more of this kind of censorship across Europe.

That’s why we are working with the Create/Refresh coalition.[4] They are a network of artists from across Europe who are opposed to Article 13. These creators produced a video which illustrates all of their talents, and their unity against these rules. Watch the video to find out more.

We need Axel Voss to see this! Let’s make sure he can’t ignore it, and knows that the very people he claims to be speaking for oppose him.

This is just a small sample of art made possible because we don’t have these excessive restrictions that do nothing for creators.

Please give them a voice. Share the video with Axel Voss on Facebook and Twitter, demanding he reject content blocking and the link tax.

We know that tweeting at MEPs can be hugely effective if we raise a chorus too loud to ignore – MEPs pay attention to what people are saying on social media. Let’s show Axel Voss that artists are not asking for his “protection”; what they want is the freedom to create.

Footnotes

[1] Help our censorship impact research AND speak to your MEPs. Source: OpenMedia
[2] Green light for upload filters: EU Parliament’s copyright rapporteur has learned nothing from year-long debate. Source: Julia Reda
[3] When I want to teach but can’t, thanks to Universal Music Group. Source: Adam Neely
[4] Create Refresh Coalition website.


Green light for upload filters: EU Parliament’s copyright rapporteur has learned nothing from year-long debate (Sat, 24 Feb 2018)

The post Green light for upload filters: EU Parliament’s copyright rapporteur has learned nothing from year-long debate appeared first on P2P Foundation.

Julia Reda gives an update – and not a good one – on the forthcoming European Commission “censorship machine” proposal. The following is republished from Reda’s website.

Julia Reda: Ever since the European Commission presented its hugely controversial proposal to force internet platforms to employ censorship machines, the copyright world has been eagerly awaiting the position of the European Parliament. Today, the person tasked with steering the copyright reform through Parliament, rapporteur Axel Voss, has finally issued the text he wants the Parliament to go forward with.

It’s a green light for censorship machines: Mr. Voss has kept the proposal originally penned by his German party colleague, former Digital Commissioner Günther Oettinger, almost completely intact.

In doing so, he is dismissing calls from across the political spectrum to stop the censorship machines. He is ignoring one and a half years of intense academic and political debate pointing out the proposal’s many glaring flaws. He is discarding the work of several committees of the Parliament which came out against upload filters, and of his predecessor and party colleague MEP Comodini, who had correctly identified the problems almost a year ago. He is brushing off the concerns about the proposal’s legality several national governments have voiced in Council. And he is going against the recently published coalition agreement of the new German government – which is going to include Voss’ own Christian Democratic Party – where filtering obligations are rejected as disproportionate.

Photo © European Union (used with permission)

[Read Axel Voss’ compromise proposal PDF]

This is a “compromise” in name only. Mr. Voss’ proposal contains all the problematic elements of the original censorship machines idea, and adds several new ones. Here’s the proposal in detail:

1. Obligatory impossible-to-get licenses

The proposal says: All apps and websites where users can upload and publish media are required to get copyright licenses for all content. These platforms are considered to “communicate to the public” all those user uploads, which means that the platforms would be directly responsible for copyright infringements committed by their users, as if it were the platform’s employees themselves uploading these works.

This is a bizarre addition to the Commission proposal, which would be impossible to implement in practice: Who exactly are the platforms supposed to get those license agreements from? While there may be collecting societies representing professional authors in a few areas such as music or film, which may be able to issue a license covering the works of many individual authors, other sectors do not have collecting societies at all.

Imagine a platform dedicated to hosting software, such as GitHub. There is no collecting society for software developers and nobody has so far seen the need to found one. So where will GitHub, which undoubtedly hosts and gives access to (copyright-protected) software uploaded by users, get their copyright license from? They can’t enter into license negotiations with every single software developer out there, just because somebody might someday upload their software to GitHub without permission. And without that impossible-to-get license, this law says they will be directly liable as soon as somebody does upload copyrighted works. That’s a sure-fire way to kill the platforms economy in Europe.

And these impossible-to-get licenses cover only non-commercial use: If the platform acquires a license as prescribed, then non-commercial uploaders won’t be liable. Uploaders acting for commercial purposes however, such as companies with social media accounts, can still be sued by rightsholders.

2. The censorship machine is here to stay

The proposal says: All platforms hosting and providing public access to “significant amounts” of user-uploaded content have to prevent copyrighted content that rightsholders have identified from being uploaded in the first place.

There are only two ways to do this: (a) hire an army of trained monkeys to look at every individual user upload and compare it manually to the rightsholder information or (b) install upload filters. The article that creates this obligation no longer mentions content recognition technologies explicitly, but they are still mentioned in other parts of the text, making it clear that filters are what Voss has in mind.

There is no definition of what “significant amounts” is supposed to mean. The Commission was widely criticised for requiring censorship machines on platforms with “large amounts” of content, following the misguided idea that only large companies, with significant resources to dedicate to developing upload filters, host large amounts of content. This completely ignores the wide diversity of popular specialised platforms out there: community-run platforms like Wikipedia, niche platforms like MuseScore (for sheet music) and many startups host millions of uploads, but would struggle to implement or license expensive filtering technology.

Why Voss believes replacing the word “large” with the potentially even broader “significant” is supposed to improve anything remains completely unclear.

3. A tiny problem with fundamental rights

The proposal says: The filtering measures must not entail any processing of personal data, in order to protect users’ privacy.

The only indication that Mr. Voss has paid attention to any of the public criticism at all is that he acknowledges there may be a tiny problem with fundamental rights. Indeed, the European Court of Justice has ruled in the past that an obligation to filter all user uploads violates the fundamental rights to privacy, freedom of expression, freedom of information and the freedom to conduct a business. Voss picks one of those fundamental rights, seemingly at random, and adds a provision aimed at protecting it. Admirable as this may be, it is also in direct contradiction to what comes next:

Because filters will invariably delete content that is legal, for example under a copyright exception, users are supposed to have access to a redress mechanism to complain about overblocking. But how exactly is the platform supposed to offer the user that redress if it is not allowed to process any personal data? Simply recording which user’s uploads have fallen victim to the filter already requires processing of personal data. How can a user complain about a wrongful takedown if the platform is not allowed to keep records of what the filter deleted in the first place?
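The contradiction is visible in the bare minimum of data a redress mechanism would need to keep. A hypothetical record type (our sketch, not any platform's real schema) makes the point:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TakedownRecord:
    """Minimum a platform must retain to handle a later complaint.

    Hypothetical illustration; field names are ours, not from the
    proposal or any real platform.
    """
    uploader_id: str         # identifies the user: this is personal data
    upload_fingerprint: str  # which upload was removed
    matched_work_id: str     # which rightsholder claim triggered removal
    removed_at: datetime

# Even this minimal record links an identifiable user to a removal
# event, i.e. it constitutes "processing of personal data" under EU
# law, which is exactly what the same proposal forbids.
rec = TakedownRecord("user-1234", "fp-ab12", "work-987",
                     datetime(2018, 2, 24))
print(rec.uploader_id)  # user-1234
```

Without such a record the platform cannot even tell a complaining user what was removed or why; with it, the no-personal-data clause is violated.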

It gets better: Guess who should decide about what happens with the users’ complaints about wrongful takedowns? The rightsholders who asked for the content to be blocked in the first place. Surely they will turn out to be an impartial arbiter.

At least, users are supposed to be able to go to a court if the redress mechanism fails. However, this may end up being ineffective, because copyright exceptions do not constitute legal rights against the rightsholders, so a court may decide not to require a platform to reinstate previously deleted uploads, even if they were legal under a copyright exception.

What users need is a clear legal rule that the copyright exceptions constitute users’ rights – just like the previous copyright rapporteur Therese Comodini had suggested.

4. Very specific general monitoring

The proposal says: Checking all user uploads for whether they are identical to a particular rightsholder’s copyrighted work does not constitute forbidden “general“ monitoring, but is “specific“.

EU law forbids any laws that force hosting providers to do “general monitoring”, such as checking every single file uploaded by every user all of the time. Voss simply postulates that upload filters would not break that rule and writes that only “abstract monitoring” should be forbidden, which presumably means randomly looking at uploaded files without looking for anything in particular.

This argument has already been dismissed by the European Court of Justice: The European Commission tried making it in defense of upload filters in the past – and lost (Paragraph 58 of this French-language Commission contribution to the European Court of Justice case Scarlet vs. SABAM).

5. Few exceptions

The proposal says: The filtering obligation should not apply to Internet access services, online marketplaces such as eBay, research repositories where rightsholders mainly upload their own works such as arXiv, or cloud service providers where the uploads cannot be accessed publicly, such as Dropbox.

In a last-ditch attempt to redeem himself, Voss provides a welcome clarification that the obligation to filter does not extend to certain businesses. But this exception, not legally binding since it is in a recital rather than an article, does not apply to the obligation to license.

The listed platforms would still have to get licenses from rightsholders provided that the user uploads are publicly accessible, because they would still be considered to be communicating to the public. But how are these platforms supposed to shield themselves from lawsuits by rightsholders if they can’t get a license for all possible content that may be uploaded? They will have to resort to a filter anyway.

6. Critical parts remain unchanged

Large parts of the most widely criticised elements of the Commission proposal were left completely unchanged by rapporteur Voss, such as the infamous Recital 38 (2), where the Commission misrepresents the limited liability regime of the e-commerce directive, essentially stating that any platform that so much as uses an algorithm to sort the uploaded works alphabetically or provides a search function should be considered as “active” and therefore liable for its users’ actions. The only change that Mr. Voss has made to this section is cosmetic in nature.

* * *

It’s not too late to stop the Censorship Machines!

Fortunately, Axel Voss does not get to decide the Parliament position on his own. He will need to secure a majority in the Legal Affairs (JURI) committee, which will vote in late March or April. Two other committees have already come out strongly against filtering obligations, and several JURI members have tabled amendments to delete or significantly improve the Article.

Now it’s time to call upon your MEPs to reject Mr. Voss’ proposal! You can use tools such as SaveTheMeme.net by Digital Rights NGO Bits of Freedom or ChangeCopyright.org by Mozilla to call the Members of the Legal Affairs Committee free of charge. Or look for MEPs from your country and send them an email.

But most importantly, spread the word! Ask your local media to report on this law.


To the extent possible under law, the creator has waived all copyright and related or neighboring rights to this work.

Photo by Thomas Hawk


Digital repression and resistance during the #CatalanReferendum (Thu, 05 Oct 2017)

The post Digital repression and resistance during the #CatalanReferendum appeared first on P2P Foundation.

Successes and failures in the use of digital tools in Catalonia’s rebellion

The battle presently being fought in the streets and polling stations of towns and cities throughout Catalonia — before, during and after October 1 — has brought a diverse civil society together in huge numbers, putting their bodies and knowledge in the service of the shared goal of defending what they consider real democracy. The Internet has been one of its crucial battlegrounds.

September 7, 2017

On September 7, 2017, the Constitutional Court suspended the law of the referendum in Catalonia. From that point on, the Spanish government embarked on the legal, police and administrative persecution of any “device or instrument that is to be used for preparing or holding the referendum”, including ballot boxes and ballot papers, which were now criminal objects. Websites, apps and tools related to the referendum were shut down.

Independently of whether one agrees or disagrees with the decision of the Spanish courts to ban the referendum, the closing of many ordinary Internet spaces can be viewed, in a great number of cases, as a grave violation of freedom of expression — and especially freedom of political opinion — which is protected by international treaties and by Article 11 of the European Union’s Charter of Fundamental Rights on “Freedom of expression and information”. While some websites, apps and domains belonged to the Generalitat (Government) of Catalonia and were tools directly linked to organizing the referendum, many others belonged to private individuals or associations and basically expressed political opinions. Banning a referendum — arguable or not — is one thing; blocking, while they were at it, the right to express the political opinion that the referendum should be held is quite another.

In the last few days, Catalonia has been the testing ground for what we have always denounced: the Internet being subjected, yet again, to a state of exception that “democratic” governments would not dare to apply to physical space, because there the violation of rights would immediately be visible. Proof of this is that many of the shut-down websites belong to associations with physical premises, yet no authority has risked ordering those centers closed.

Internet access is essential for the exercise of our freedoms and should be considered in itself a fundamental right [#KeepItOn].

If we let the space of the Internet become the first casualty in the curtailment of basic rights, we can be sure that the next step will be to limit those rights in other spaces as well.

September 13, 2017

On September 13, a court order shut down the website referendum.cat. Thus began a game of cat and mouse between the Spanish government, with its apparatus of state repression, and the Catalan government.

Some citizens published the referendum website’s code on GitHub. Clones of the site then began to appear, created by volunteer citizens on domains with names like piolin.cat (Piolín being Tweetie Pie, painted on the boat accommodating the Spanish police), referendum.ninja or marianorajoy.cat, while the Generalitat itself also made alternative sites available. The police operation continued, with domains being shut down and access blocked to all these sites as well as to many other web pages expressing opinions about the referendum, including those of associations, sports clubs and private sites. All of this occurred against a background of politicians being arrested and presidents of civil society associations being charged with sedition.

In ten days, more than 140 websites were blocked. The Tor Project’s OONI maintains a non-exhaustive list of affected domains, with information on the type of block applied to each.
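Distinguishing the “type of block” means comparing what a probe in Spain observes against a trusted control measurement, in the spirit of OONI’s web-connectivity tests. The following sketch is purely illustrative — the function name, the resolver answers and the block-page marker are invented for the example, not real measurement data or OONI’s actual code.

```python
# Hypothetical sketch: classify how a domain is being interfered with,
# given one probe's observations versus a control measurement.
# All inputs below are invented for illustration.

BLOCKPAGE_MARKER = "dominio intervenido"  # text seen on Spanish seizure notices

def classify_block(dns_answer, control_answer, http_body):
    """Return a rough interference label for one probed domain."""
    if dns_answer is None:
        return "dns-nxdomain"      # the ISP resolver denies the name exists
    if dns_answer != control_answer:
        return "dns-tampering"     # the ISP resolver returns a bogus address
    if http_body and BLOCKPAGE_MARKER in http_body.lower():
        return "http-blockpage"    # a transparent proxy serves a notice page
    return "ok"

# Example observations (invented):
print(classify_block("203.0.113.9", "198.51.100.7", None))   # dns-tampering
print(classify_block("198.51.100.7", "198.51.100.7",
                     "<h1>Dominio intervenido</h1>"))        # http-blockpage
```

Real measurements also have to account for CDNs returning different legitimate addresses per region, which is why OONI combines several signals before declaring a block.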

[Image: seized-domain notice — “dominio intervenido a disposición judicial” (domain seized, at the disposition of the court)]

As part of this state operation, the Guardia Civil raided the headquarters of the .cat top-level domain registry, confiscating IT equipment and data and detaining one of its IT staff. This disproportionate measure, unprecedented in the European Union, risks opening the way to something we have been struggling against for years: domain registries being held responsible for content.

The UN Special Rapporteur on Freedom of Opinion and Expression, the Internet Society, the Electronic Frontier Foundation and many other organizations like our own have condemned this blocking of websites and the inordinate digital repression carried out by the Spanish government. Coming just days before the referendum, the blocks left no time to establish their validity, suitability or legality.

In this situation of persecution and very serious violation of rights, many people, moved by their convictions and without proper legal advice, have exposed themselves to risks that could in some cases have been avoided, and have left their identities at the mercy of a repressive apparatus that needs scapegoats to justify its actions. Because the alleged authors of the first mirror sites operated openly under their own names, the authorities are now boasting of having rounded up the young perpetrators (up to 14 of them). Some face very serious charges such as “heading a seditious organization”, which, as everyone knows, makes no sense at all in a free, open space like the Internet. These are measures that aim to inflict disproportionate punishment so as to bully and intimidate citizens, in an attempt to discourage their intense online activity.

One of the most common errors made by citizens has been the frequent use of servers offering the client few and poor legal guarantees. A case in point is the insistent use of .cat domains. These fall under Spanish jurisdiction, and thus under a state that shows no concern for civil rights, in contrast with generic domains (.net, .org, .com…), which are overseen by ICANN and other organizations that do respect basic rights.

We believe it is important to stress that we should not need martyrs to prove that our struggles are just. We must make every possible effort to ensure that people who are struggling for their rights do not suffer reprisals. To this end, Xnet has tried to give an overall explanation of how to avoid reprisals, together with other useful information, in a Guide that seeks to protect people who work on the Internet from unjust repression. This initiative is part of a set of actions designed by the lawyers and organizations of #SomDefensores to defend basic rights.

Net Democracy: Distributed Government

We have seen a Generalitat that is competent and farsighted in its online activity, but above all we note that the acceleration of events in Catalonia has catalyzed the population into massive use of digital tools in defense of their basic rights. Unlike in similar situations, such as Turkey’s, the Catalan institutions have in recent days agreed to cede and share, in a widely distributed manner, the responsibility for safeguarding freedoms — regularizing what we see as the embryo of a truly transversal democracy worthy of the digital age, as some of us have already proposed in our discussion of the methodology of the device Red Ciudadana Partido X.

The president of Catalonia, Carles Puigdemont — with help from experts who have been actively and continually engaged in the defense of rights (even people like Julian Assange and Peter Sunde publicly offered their counsel) — recommended the use of proxies on social networks in order to reach blocked websites. He subsequently announced that IPFS had also been used as a distributed tool for hosting the website that told citizens where to go to vote.

September 23, 2017

On September 23, the High Court of Justice of Catalonia ordered the “blocking of websites and domains [giving this information] which are publicized in any account or official social network of any kind”. This was not a specific list of sites but a general order, giving the security forces a free hand to instruct Internet providers to shut down websites.

With these new powers, the Guardia Civil blocked the domain gateway.ipfs.io, thereby cutting off connection not only to the referendum website but to all IPFS-hosted content reached from within the Spanish state through that gateway. The shutdown extended to the websites of nongovernmental organizations and movements in favor of the referendum, such as empaperem.cat, assemblea.cat and webdelsi.cat. Enforcement of the court order also reached Google Play, which was forced to withdraw the app that helped people find where to vote.

Nevertheless, the population of Catalonia was able at all times to keep informed about polling stations, thanks to continuous replication of the sites and the massive use of VPNs and anonymous browsing to reach sites blocked from Spain. This capacity for action, distributed between the government and organized citizens, has been the trend throughout the electoral process: large-scale use of chats, networks and other tools allowed swift circulation of information on the micro scale, among strangers working together to deal with hoaxes, leaks and infiltrations.

This networked action, through which people organized themselves polling station by polling station, was also manifest in physical space — for example, in protecting the ballot boxes from police seizure. For a month, the state security forces and their secret services searched all over Catalonia for the ballot boxes and voting papers. Although they raided printing houses, media offices and the headquarters of political parties and other organizations — sometimes without a court order — the ballot boxes were never found; yet they magically appeared in the polling stations. The ballot boxes and papers were there, everywhere, guarded by small groups, autonomous nodes, spread all around Catalonia.

October 1, referendum day

Finally, even as the referendum was taking place on October 1, the Spanish government tried by every available means to block access to the “universal census” app covering the entire electoral register.

The domain registremeses.com, where the app was hosted, was immediately blocked. The Generalitat quickly supplied the more than 1,000 polling stations throughout Catalonia with alternative IPs for access. We believe that in this case it would probably have been better to serve the app as a Tor hidden service, to avoid police harassment and the DDoS attacks launched by groups opposing the referendum.
Internet connections were also interrupted, and it is not yet known who was responsible. Could it have been Internet providers obeying state orders (although they deny it)?

Even so, the polling stations managed to function, almost all of them tethering volunteers’ smartphones for Internet access. In the street, people chorused “airplane mode!” to save network bandwidth for those working inside the polling stations. The operation lasted from 5 a.m. — when citizens began filling the streets to protect the polling stations — until midnight, when the vote count ended. All this was achieved amid violent charges by the National Police that left more than 800 people wounded. Despite everything, more than 2,200,000 people came out to vote.

Order is the people, equal to equal: disorder is this state and its violence

The citizens and government of Catalonia have learned, and witnessed, that on the front line of the defense of our democracy, digital resistance depends on technological tools that allow us to protect our rights autonomously and in a well-distributed manner.

We hope that the Catalan government will never forget this and that its administration will always resist the temptation of the usual kind of discourse that criminalizes tools protecting privacy, encryption and decentralization of the Internet.

Moreover, when repression was massively unleashed in the streets and villages of Catalonia, the social networks, and their intelligent use by citizens, were once again what broke the blocking and manipulation of information by the mainstream media in Spain, and what let international media outlets know what was really happening. Perhaps by 2017 many people were already used to this, but it is also quite possible that never before have so many videos and photographs documenting police violence been published (https://twitter.com/joncstone/status/914450692416397312). Without the widespread use of social networks to testify and inform, the people of Catalonia would have been totally isolated and crushed with absolute impunity.

From this point of view, what has happened in the last few days is historic. This acceleration toward a greater degree of democracy and more power in civil society is happening spontaneously, but most people’s ignorance of some aspects of the digital milieu is exposing them to risks — and remedying this is what we are putting every effort into achieving.

October 1, 2017 as a beginning

On October 1, 2017, the politicians were nowhere to be seen. Only Unidos Podemos could be heard now and then, trying to capitalize on our wounded for its own ends. Apart from this, there were only grassroots people organizing and acting, including some members of parliament and councilors who are people like anyone else. For 24 hours, civil society came together so that people could vote, and vote on a huge scale; moreover, it did not fall into the temptation of answering the state’s provocation with violence, even though hundreds of injured people needed medical attention. There was happiness, anger and fraternity among the most different kinds of people. It was incredibly moving. There were no slogans and no shouting, so that people could vote without coercion, in a display of valiant, stirring organizational capacity and desire for democracy.

On October 1, 2017 we proved that order is the people and disorder is this state.

 

Xnet

*Front picture by @Makrakas in Gràcia, Barcelona.
More info provided by the network:

Association for Progressive Communications (APC) press release:
https://www.apc.org/en/pubs/apc-calls-end-restrictions-freedom-expression-catalonia

Netcommons.eu press release:
https://netcommons.eu/?q=content/internet-censorship-and-blockade-catalonia-self-sovereign-internet-infrastructures

ISOC (Frederic Donc from Brussels) declaration:
https://www.internetsociety.org/news/statements/2017/internet-society-statement-internet-blocking-measures-catalonia-spain/

Wikipedia article describing the Spanish government’s operation:
https://en.wikipedia.org/wiki/Operation_Anubis

Official web site of the High Court of Justice of Catalonia:
http://www.poderjudicial.es/cgpj/es/Poder-Judicial/Tribunales-Superiores-de-Justicia/TSJ-Cataluna/

The legal basis in Catalonia comes from the Parliament of Catalonia (https://en.wikipedia.org/wiki/Parliament_of_Catalonia) and several laws it approved, such as the Law on the Referendum on Self-determination of Catalonia (https://en.wikipedia.org/wiki/Law_on_the_Referendum_on_Self-determination_of_Catalonia) and the Law of juridical transition and foundation of the Republic (https://en.wikipedia.org/wiki/Law_of_juridical_transition_and_foundation_of_the_Republic).

The informative website www.referendum.cat was blocked by police on September 13 with a judicial order to suspend the DNS domain:
https://blog.cdmon.com/ca/comunicat-oficial-referendum-cat/

Replicas of this website were hosted on Cloudflare, but ISPs were then ordered to apply filters or redirections.

The site was published in a code repository and replicated on the P2P file system IPFS. An IPFS gateway was blocked by some ISPs, but the content remained accessible from IPFS clients.
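Blocking one gateway is ineffective because IPFS content is addressed by its content identifier (CID), not by any particular host: the same CID can be fetched through any public gateway, or directly from a local IPFS node that no ISP-level domain block can touch. A minimal sketch of this equivalence — the CID and the gateway list are illustrative assumptions, not addresses used during the referendum:

```python
# Sketch: one piece of content-addressed data, many interchangeable URLs.
# The gateway list and CID below are illustrative, not an endorsement of
# any specific service.

GATEWAYS = [
    "https://gateway.ipfs.io",   # the gateway that was blocked in Spain
    "https://ipfs.io",
    "https://cloudflare-ipfs.com",
    "http://127.0.0.1:8080",     # a local IPFS daemon's own gateway
]

def gateway_urls(cid, path=""):
    """Build the equivalent URL for the same CID on every known gateway."""
    suffix = f"/ipfs/{cid}" + (f"/{path}" if path else "")
    return [base + suffix for base in GATEWAYS]

urls = gateway_urls("QmExampleCid123")
print(urls[0])  # https://gateway.ipfs.io/ipfs/QmExampleCid123
```

A censor would have to block every public gateway and the IPFS peer-to-peer protocol itself, which is a far harder target than a single domain name.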

Some people who cloned the sites were detained and interrogated, in some cases being forced to give up passwords (even for personal email and social media). The magistrate also ordered the blocking of “websites or domains that appear in any official account or social network of the members of the Government through which directly and indirectly, even referring to other accounts, was informed, through links, of how to access domains whose contents they keep in relation to which they are now blocked.”

On the DNS issues (the .cat TLD intervention and domain blockade) mentioned in the netcommons article, see this external discussion:
http://www.internetgovernance.org/2017/09/20/puntcat-under-fire-internet-vs-political-identities/

More than 140 sites were blocked, in different ways depending on the ISP; assemblea.cat, for instance, was blocked and moved to assemblea.eu.

On election day, many schools and polling stations found their Internet connections down. Police closed several polling stations, in several cases with violence, and a small portion of the votes was seized.

Citizens deployed their own mobile and Wi-Fi point-to-point links, even bringing batteries for power, to keep access to the census application that provided guarantees to the process, such as preventing double voting. VPNs, indirection mechanisms such as onion routing, and alternative ISPs were used to circumvent the traffic filters.

The census service itself, one of nearly 30 databases controlled from the Catalan government’s data center, had been stopped the day before by judicial order. Replicas were created. These servers were actively blocked and attacked throughout the day; some of the interruptions and delays in the voting process were the result, and required moving the servers to new IP addresses. People in polling stations used a web application to query the service for voter validation, and social media was used to share news about changes and events. Many people resorted to WhatsApp, Telegram and Signal groups, Twitter, etc.

Many diverse and powerful DDoS attacks have hit websites related to the process in recent weeks.

This is still ongoing. Today there is a general strike in Catalonia.

– Xnet has published a very good “Basic guide to preserve fundamental rights on the Internet”:
https://xnet-x.net/en/how-to-guide-for-preserving-fundamental-rights-internet/#how-to-guide

– Softcatala has also published a guide “autodefensa digital”:
https://autodefensa.softcatala.cat/

Stop #CensorshipMachine: EU copyright threatens our freedoms (8 March 2017)

The post Stop #CensorshipMachine: EU copyright threatens our freedoms appeared first on P2P Foundation.

Europe wants Internet companies to filter all of your uploads. (It is a censorship machine.)
An upload filter can’t recognize your legal use of copyrighted content. (Like parody, citations and – oh, noes! – memes.)
And you will have no meaningful protection from unfair deletion. (So, the proposed safeguards will not save you.)
This needs to change: we need to save the meme! (Article 13 of the proposed Copyright Directive must be deleted.)
And you can be of tremendous help!
Call a member of the European Parliament now – for free.

Visit SaveTheMeme.net for more!
