What to do once you admit that decentralizing everything never seems to work
P2P Foundation blog, 24 October 2018

Decentralization is the new disruption—the thing everything worth its salt (and a huge ICO) is supposed to be doing. Meanwhile, Internet progenitors like Vint Cerf, Brewster Kahle, and Tim Berners-Lee are trying to re-decentralize the Web. They respond to the rise of surveillance-based platform monopolies by simply redoubling their efforts to develop new and better decentralizing technologies. They seem not to notice the pattern: decentralized technology alone does not guarantee decentralized outcomes. When centralization arises elsewhere in an apparently decentralized system, it comes as a surprise or simply goes ignored.

Here are some traces of the persistent pattern that I’m talking about:

  • The early decentralized technologies of the Internet and Web relied on key points of centralization, such as the Domain Name System (which Berners-Lee called the Internet’s “centralized Achilles’ heel by which it can all be brought down or controlled”) and the World Wide Web Consortium (which Berners-Lee has led for its entire history)
  • The apparently free, participatory open-source software communities have frequently depended on the charismatic and arbitrary authority of a “benevolent dictator for life,” from Linus Torvalds of Linux (who is not always so benevolent) to Guido van Rossum of Python
  • Network effects and other economies of scale have meant that most Internet traffic flows through a tiny number of enormous platforms — a phenomenon aided and exploited by a venture-capital financing regime that must be fed by a steady supply of unicorns
  • The venture capital that fuels the online economy operates in highly concentrated regions of the non-virtual world, through networks that exhibit little gender or ethnic diversity, among both investors and recipients
  • While crypto-networks offer some novel disintermediation, they have produced some striking new intermediaries, from the mining cartels that dominate Bitcoin and other networks to Vitalik Buterin’s sweeping charismatic authority over Ethereum governance

This pattern shows no signs of going away. But the shortcomings of the decentralizing ideal need not serve as an indictment of it. The Internet and the Web made something so centralized as Facebook possible, but they also gave rise to millions of other publishing platforms, large and small, which might not have existed otherwise. And even while the wealth and power in many crypto-networks appears to be remarkably concentrated, blockchain technology offers distinct, potentially liberating opportunities for reinventing money systems, organizations, governance, supply chains, and more. Part of what makes the allure of decentralization so compelling to so many people is that its promise is real.

Yet it turns out that decentralizing one part of a system can and will have other kinds of effects. If one’s faith in decentralization is anywhere short of fundamentalism, this need not be a bad thing. Even among those who talk the talk of decentralization, many of the best practitioners are already seeking balance — between unleashing powerful, feral decentralization and ensuring that the inevitable centralization is accountable and functional. They just don’t brag about the latter. In what remains, I will review some strategies of thought and practice for responsible decentralization.

Hat from a 2013 event sponsored by Zambia’s central government celebrating a decentralization process. Source: courtesy of Elizabeth Sperber, a political scientist at the University of Denver

First, be more specific

Political scientists talk about decentralization, too—as a design feature of government institutions. They’ve noticed a pattern similar to the one we find in tech. Soon after something gets decentralized, it seems to cause new forms of centralization not far away. Privatize once-public infrastructure on open markets, and soon dominant companies will grow enough to lobby their way into regulatory capture; delegate authority from a national capital to subsidiary regions, and they could have more trouble than ever keeping warlords, or multinational corporations, from consolidating power. In the context of such political systems, one scholar recommends a decentralizing remedy for the discourse of decentralization — a step, as he puts it, “beyond the centralization-decentralization dichotomy.” Rather than embracing decentralization as a cure-all, policymakers can seek context-sensitive, appropriate institutional reforms according to the problem at hand. For instance, he makes a case for centralizing taxation alongside more distributed decisions about expenditures. Some forms of infrastructure lend themselves well to local or private control, while others require more centralized institutions.

Here’s a start: Try to be really, really clear about what particular features of a system a given design seeks to decentralize.

No system is simply decentralized, full-stop. We shouldn’t expect any to be. Rather than referring to TCP/IP or Bitcoin as self-evidently decentralized protocols, we might indicate more carefully what about them is decentralized, as opposed to what is not. Blockchains, for instance, enable permissionless entry, data storage, and computing, but with a propensity to concentration with respect to interfaces, governance, and wealth. Decentralizing interventions cannot expect to subdue every centralizing influence from the outside world. Proponents should be forthright about the limits of their enterprise (as Vitalik Buterin has sometimes been). They can resist overstating what their particular sort of decentralization might achieve, while pointing to how other interventions might complement their efforts.
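One way to make “decentralized with respect to what” concrete is to measure concentration subsystem by subsystem. A minimal sketch of the so-called Nakamoto coefficient, the smallest set of entities that together control a majority of some resource, applied to two hypothetical subsystems; all figures below are invented for illustration, not real network data:

```python
def nakamoto_coefficient(shares, threshold=0.5):
    """Smallest number of entities whose combined share exceeds the
    threshold: a crude, per-subsystem measure of concentration."""
    total = sum(shares.values())
    running = 0.0
    for count, share in enumerate(sorted(shares.values(), reverse=True), 1):
        running += share
        if running / total > threshold:
            return count
    return len(shares)

# Hypothetical figures for illustration only.
mining_power = {"pool_a": 28, "pool_b": 20, "pool_c": 14, "pool_d": 13,
                "pool_e": 10, "pool_f": 9, "others": 6}
client_software = {"client_x": 90, "client_y": 10}

print(nakamoto_coefficient(mining_power))     # 3: three pools can pass 50%
print(nakamoto_coefficient(client_software))  # 1: one client dominates
```

A network can score well on one subsystem (many mining pools) and badly on another (one dominant client or interface), which is exactly why “decentralized, full-stop” hides more than it reveals.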

Another approach might be to regard decentralization as a process, never a static state of being — to stick to active verbs like “decentralize” rather than the perfect-tense “decentralized,” which suggests the process is over and done, or that it ever could be.

Guidelines such as these may tempt us into a pedantic policing of language, which can lead to more harm than good, especially for those attempting not just to analyze but to build. Part of the appeal of decentralization-talk is the word’s role as a “floating signifier” capable of bearing various related meanings. Such capacious terminology isn’t just rhetoric; it can have analytical value as well. Yet people making strong claims about decentralization should be expected to make clear what distinct activities it encompasses. One way or another, decentralization must submit to specificity, or the resulting whack-a-mole centralization will forever surprise us.

A panel whose participants, at the time, represented the vast majority of the Bitcoin network’s mining power. Original source unknown

Second, find checks and balances

People enter into networks with diverse access to resources and skills. Recentralization often occurs because of imbalances of power that operate outside the given network. For instance, the rise of Facebook had to do with Mark Zuckerberg’s ingenuity and the technology of the Web, but it also had to do with Harvard University and Silicon Valley investors. Wealth in the Bitcoin network can correlate with such factors as propensity to early adoption of technology, wealth in the external economy, and proximity to low-cost electricity for mining. To counteract such concentration, the modes of decentralization can themselves be diverse. This is what political institutions have sought to do for centuries.

Those developing blockchain networks have tended to rely on rational-choice, game-theoretic models to inform their designs, such as in the discourse that has come to be known as “crypto-economics.” But relying on such models alone has been demonstrably inadequate. Already, protocol designers seem to be rediscovering notions like the separation of powers from old, institutional liberal political theory. As it works to “truly achieve decentralization,” the Civil journalism network ingeniously balances market-based governance and enforcement mechanisms with a central, mission-oriented foundation populated by elite journalists — a kind of supreme court. Colony, an Ethereum-based project “for open organizations,” balances stake-weighted and reputation-weighted power among users, so that neither factor alone dictates a user’s fate in the system. The jargon is fairly new, but the principle is old. Stake and reputation, in a sense, resemble the logic of the House of Lords and the House of Commons in British government — a balance between those who have a lot to lose and those who gain popular support.

As among those experimenting with “platform cooperativism,” protocols can also adapt lessons from the long and diverse legacy of cooperative economics. For instance, blockchain governance might balance market-based one-token-one-vote mechanisms with cooperative-like one-person-one-vote mechanisms to counteract concentrations of wealth. The developers of RChain, a computation protocol, have organized themselves in a series of cooperatives, so that the oversight of key resources is accountable to independent, member-elected boards. Even while crypto-economists adopt market-based lessons from Hayek, they can learn from the democratic economics of “common-pool resources” theorized by Elinor Ostrom and others.
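The balancing acts described above, stake against reputation, token-votes against person-votes, can be sketched as a toy weighting rule. The 50/50 blend and all numbers are invented for illustration; this is not Colony’s or any real protocol’s actual mechanism:

```python
def voting_weight(tokens, reputation, total_tokens, total_reputation,
                  stake_factor=0.5):
    """Blend one-token-one-vote (market power) with a reputation share,
    so that neither wealth nor standing alone dictates influence."""
    stake_share = tokens / total_tokens
    rep_share = reputation / total_reputation
    return stake_factor * stake_share + (1 - stake_factor) * rep_share

# A whale holds most of the tokens but little reputation; a long-time
# contributor holds few tokens but a large reputation share.
total_tokens, total_rep = 1000, 100
whale = voting_weight(800, 5, total_tokens, total_rep)
contributor = voting_weight(20, 40, total_tokens, total_rep)
print(round(whale, 3), round(contributor, 3))  # 0.425 0.21
```

The whale’s 80% token share is diluted to under half the voting power, which is the House-of-Lords-and-Commons logic in miniature: each constituency checks the other.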

Decentralizing systems should be as heterogeneous as their users. Incorporating multiple forms of decentralization, and multiple forms of participation, can enable each to check and counteract creeping centralization.

Headquarters of the Internet Archive, home of the Decentralized Web conferences: Wikimedia Commons

Third, make centralization accountable

More empowering strategies for decentralization, finally, may depend on not just noticing or squashing the emergence of centralized hierarchy, but embracing it. We should care less about whether something is centralized or decentralized than whether it is accountable. An accountable system is responsive to both the common good for participants and the needs of minorities; it sets consistent rules and can change them when they don’t meet users’ needs.

Antitrust policy is an example of centralization (through government bureaucracy) on behalf of decentralization (in private sector competition). When the government carrying out such a policy holds a democratic mandate, it can claim to be accountable, and aggressive antitrust enforcement frequently enjoys broad popularity. Such centralized government power, too, may be the only force capable of counteracting the centralized power of corporations that are less accountable to the people whose lives they affect. In ways like this, most effective forms of decentralization actually imply some form of balance between centralized and decentralized power.

While Internet discourses tend to emphasize their networks’ structural decentralization, well-centralized authorities have played critical roles in shaping those networks for the better. Internet progenitors like Vint Cerf and Tim Berners-Lee not only designed key protocols but also established multi-stakeholder organizations to govern them. Berners-Lee’s World Wide Web Consortium (W3C), for instance, has been a critical governance body for the Web’s technical standards, enabling similar user experience across servers and browsers. The W3C includes both enormously wealthy corporations and relatively low-budget advocacy organizations. Although its decisions have sometimes seemed to choose narrow business interests over the common good, these cases are noteworthy because they are more the exception than the rule. Brewster Kahle has modeled mission-grounded centralization in the design of the nonprofit Internet Archive, a piece of essential infrastructure, and has even attempted to create a cooperative credit union for the Internet. His centralizing achievements are at least as significant as his calls for decentralizing.

Blockchain protocols, similarly, have tended to spawn centralized organizations or companies to oversee their development, although in the name of decentralization their creators may regard such institutionalization as a merely temporary necessity. Crypto-enthusiasts might admit that such institutions can be a feature, not a bug, and design them accordingly. If they want to avoid a dictator for life, as in Linux, they could plan ahead for democracy, as in Debian. If they want to avoid excessive miner-power, they could develop a centralized node with the power to challenge such accretions.

The challenge that entrepreneurs undertake should be less a matter of How can I decentralize everything? than How can I make everything more accountable? Already, many people are doing this more than their decentralization rhetoric lets on; a startup’s critical stakeholders, from investors to developers, demand it. But more emphasis on the challenge of accountability, as opposed to just decentralization, could make the inevitable emergence of centralization less of a shock.

What’s so scary about trust?

In a February 2009 forum post introducing Bitcoin, Satoshi Nakamoto posited, “The root problem with conventional currency is all the trust that’s required to make it work.” This analysis, and the software accompanying it, has spurred a crusade for building “trustless” systems, in which institutional knowledge and authority can be supplanted with cryptographic software, pseudonymous markets, and game-theoretic incentives. It’s a crusade analogous to how global NGOs and financial giants advocated mechanisms to decentralize power in developing countries, so as to facilitate international investment and responsive government. Yet both crusades have produced new kinds of centralization, in some cases centralization less accountable than what came before.

For now, even the minimal electoral accountability over the despised Federal Reserve strikes me as preferable to whoever happens to be running the top Bitcoin miners.

Decentralization is not a one-way process. Decentralizing one aspect of a complex system can realign it toward complex outcomes. Tools meant to decentralize can introduce novel possibilities — even liberating ones. But they run the risk of enabling astonishingly unaccountable concentrations of power. Pursuing decentralization at the expense of all else is probably futile, and of questionable usefulness as well. The measure of a technology should be its capacity to engender more accountable forms of trust.

Learn more: ntnsndr.in/e4e

If you want to read more about the limits of decentralization, here’s a paper I’m working on about that. If you want to read about an important tradition of accountable, trust-based, cooperative business, here’s a book I just published about that.

Photo by CIFOR

The post What to do once you admit that decentralizing everything never seems to work appeared first on P2P Foundation.

Karissa McKelvey on the Web of Commons
P2P Foundation blog, 8 October 2018

Karissa McKelvey from the Dat Project provides an overview of the new decentralized Internet and the need to insert commons thinking and practices into this new space. This text is based on Karissa’s 2017 Full Stack Fest keynote and was originally published in the Dat Project’s blog.


Karissa McKelvey:  In the 18th, 19th centuries it was thought that property ownership was the only way to protect common resources such as grazing pastures. Garrett Hardin famously put it: “The individual benefits as an individual from his ability to deny the truth even though society as a whole, of which he is a part, suffers.”

It was thought that communities that only act in rational self interest destroy the common pool resource they are sharing. This is described as “the tragedy of the commons”: that isolated, autonomous individuals will always choose the path best for them as individuals.

Elinor Ostrom introduced a new body of research to challenge this. Over 40 years of research, she was able to prove that Hardin exaggerated the problems involved in managing a commons. In 2009, Elinor Ostrom was the first woman to win the Nobel Prize in economics. She talked about how people actually are able to come together to form norms and rules that sustain their mutual cooperation. For example, she went to Nepal and studied how people there were managing their own irrigation systems. She found that communities that follow a blueprint of eight principles can self-govern and sustain a shared resource without depleting it.

What about applying this to the internet? Before her death in 2012, Ostrom published a book with Charlotte Hess called Understanding Knowledge as a Commons. This book laid the groundwork for thinking of digital knowledge as a commons (that is, the digital artifacts in libraries, wikis, open source code, scientific articles, and everything in between).

The Internet as a Commons

Looking at the internet as a commons — as a shared resource — allows us to understand both its unlimited possibilities and also what threatens it.

What threatens the internet? Right now, private companies that control large parts of the internet are trying to prevent the internet of commons. If products fail or are deemed not economically viable (for example Vine or Google Reader), the commons as a whole suffers. Monopolies, like Google, are able to keep their power by influencing the political landscape. However, in the internet of commons, monopolies would no longer be in control, and users would be trusted to self-govern the commons.

Decentralization has been the most recent proposal as our technological means to get away from this and give the power to users. In a decentralized world, users get to control the contracts of the website, can choose to fork that website, re-host data to fix broken links, evade censorship, and overall take ownership of their data. Freedom of expression, privacy, and universal access to all knowledge should be inherent to the web. But right now, those values are not.

Locking the Web Open

Thinking of the internet as a commons allows us to think of different ways we can moderate and grow spaces, allow innovation to flourish, and improve the quality of knowledge and information sharing. As Brewster Kahle puts it, decentralization ‘Locks the Web Open.’

I’m not just dreaming with Brewster Kahle about a new future for the internet. The internet of commons is here today. Peer-to-peer (p2p) applications already exist and are being built and used by real users as we speak — you can build one too! Secure Scuttlebutt, for example, is a completely p2p protocol for syncing data. Patchwork is a social networking application built on top of the Secure Scuttlebutt protocol. People can join a public server and make friends, then use a gossip approach to find friends of friends. Many early adopters came from IRC and now use Patchwork instead. It’s immensely successful as a little protocol and you can build something with it today.
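The friend-of-friend discovery that gossip enables can be sketched as a bounded breadth-first walk over a follow graph. This is a toy model of the idea, not the actual Scuttlebutt replication logic:

```python
from collections import deque

def discover(follow_graph, start, max_hops=2):
    """Collect feeds reachable within max_hops follows -- roughly how a
    gossip client decides whose messages to replicate."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        peer = queue.popleft()
        if seen[peer] == max_hops:
            continue  # don't expand beyond the hop limit
        for friend in follow_graph.get(peer, []):
            if friend not in seen:
                seen[friend] = seen[peer] + 1
                queue.append(friend)
    return set(seen) - {start}

follows = {
    "alice": ["bob", "carol"],
    "bob": ["dan"],
    "dan": ["erin"],  # three hops from alice: outside the default radius
}
print(sorted(discover(follows, "alice")))  # ['bob', 'carol', 'dan']
```

No central server decides who sees what: each node replicates its own social neighborhood, and the hop limit keeps that neighborhood bounded.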

Dat is inspired by BitTorrent and built in a similar fashion to Scuttlebutt. It is a decentralized protocol for storing, versioning, and syncing streams of potentially very large datasets. We’re a non-profit, funded by grants and, so far, we’ve operated more like a research lab than a company.

A foundational part of what we’ve been doing for the past three years is working with university labs, libraries, and researchers to help them manage their scientific data. Scientific articles and their related data are a very specific yet compelling use case for a commons approach to the internet.

As companies privatize data they create silos or they put up paywalls, and prevent the growth of the commons — another kind of enclosure. This means that certain people with power close the pathways into the commons so that they can profit from it… but it actually detracts from everyone’s ability to use it and also prevents its ability to flourish. Innovation suffers, as fewer people have access to the knowledge and it is much harder to produce contributions that could improve that research. The rationale given for companies to create paywalls is that it is expensive to collect, store, organize, present, and provide bandwidth for the billions of pages of articles and datasets.

Decentralization is a clear way we can reduce the costs of this hosting and bandwidth — as more people come to download the articles and data from the journal or library or university, the faster it gets. The dream is that universities could turn their currently siloed servers into a common resource that is shared amongst many universities. This would cut costs for everyone, improve download speed, and reduce the likelihood that data is lost.

Decentralization of data produces challenges though — just like a torrent, data that is decentralized can go offline if there aren’t any stable and trusted peers. In the case of scientific data, this is an immense failure. To mitigate it, we invoke the human part of a commons — the data will be commonly managed. For example, we can detect how many copies are available in different places, just as with BitTorrent, and compute a health score for a dat — for example, a dat hosted at the Internet Archive, University of Pennsylvania, and UC Berkeley is probably very healthy and has low probability of ever going offline, while a dat hosted by 5 laptops might go down tomorrow — even though there are more peers. When a dat becomes less healthy, the community can be alerted and make sure the resource does not go down. Decentralized tech and decentralized humans working together to use commons methodology in practice.
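A toy version of such a health score might treat replicas as independently available hosts whose stability depends on who runs them. The host categories and probabilities below are invented for illustration, not Dat’s actual metric:

```python
def dat_health(replicas):
    """Score availability from the diversity and stability of hosts.
    replicas: list of (host_name, kind) pairs, where kind is one of
    'archive', 'university', or 'laptop' (illustrative categories)."""
    stability = {"archive": 0.99, "university": 0.95, "laptop": 0.40}
    # Probability that at least one replica stays online, assuming
    # independent hosts -- a simplification, but it captures why five
    # laptops can be less healthy than three institutions.
    p_all_down = 1.0
    for _host, kind in replicas:
        p_all_down *= 1.0 - stability[kind]
    return 1.0 - p_all_down

institutional = [("Internet Archive", "archive"),
                 ("U Penn", "university"), ("UC Berkeley", "university")]
laptops = [(f"laptop{i}", "laptop") for i in range(5)]

print(dat_health(institutional) > dat_health(laptops))  # True
```

More peers does not automatically mean more resilience: who the peers are matters, which is where the human, commonly-managed side of the commons comes back in.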

Along with this, what we get by design is that links can last forever, no matter what server they are hosted on — using a decentralized network based on cryptographic links and integrity checks allows many servers to host the same content without security risks, a property not present in HTTP.
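The lasting-links property comes from addressing content by a cryptographic hash rather than by a server name. A minimal sketch, using a SHA-256 digest as the link:

```python
import hashlib

def make_link(content: bytes) -> str:
    """Address content by its hash: the link names the bytes, not a host."""
    return hashlib.sha256(content).hexdigest()

def fetch(link: str, mirrors) -> bytes:
    """Accept a copy from any mirror whose bytes match the link's hash,
    so untrusted hosts can serve content without being able to tamper."""
    for copy in mirrors:
        if hashlib.sha256(copy).hexdigest() == link:
            return copy
    raise LookupError("no mirror had an intact copy")

article = b"results of the experiment"
link = make_link(article)
mirrors = [b"tampered copy", article]  # first host lies; second is honest
print(fetch(link, mirrors) == article)  # True
```

Because the link verifies the content itself, any server can host a copy and the link keeps working as long as one intact copy survives anywhere.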

This concept of decentralization isn’t new. The internet was built to be resilient: if a node failed, it would find another way to get information to other computers. The internet was originally decentralized, but over time it became clear that centralized parties were needed to fund and maintain websites. The move towards decentralization is almost a yearning for the past, a way to get around this highly centralized chapter of internet history.

Building the Future

A way we’ve been thinking about building protocols for decentralization is to look at how today’s popular protocols were developed and to mirror those methods. The most widely used transfer protocols were developed by people like Tim Berners-Lee (the Web, at CERN) and Vint Cerf (TCP/IP, at DARPA), who worked in research labs. They gave away their protocols for free to the public, as products of scientific inquiry. The secret sauce of what they did was to craft open standards that don’t need permission to use and reuse, to prioritize usability, and to involve no or low barriers to access. Even Google was founded by two folks in a university lab, who published their search algorithm, PageRank.

Today, I look at the decentralized landscape in the context of what these people were doing back in the day and wonder if we’re continuing their legacy. Ideally, new decentralized protocols could be built into browsers that people already use today. Alongside http://, we imagine dat:// links that serve websites or data from a distributed network (which you can now do with the Beaker Browser!).

I look at initial coin offerings (ICOs) and new blockchain companies that claim to be revolutionizing the way we work on the internet, and I’m not seeing this same model. I’m seeing white papers that are published, and sometimes even implemented in open source. But if you look at what they propose, many offer siloed networks that are privatized, with money being invested into specialized coins that create new enclosures. A big component of these ICOs is trust-less networks, which remove the human elements of trust and social groups from the network.

Decentralization then, is not just a technological problem, it is also a human one. Researchers at MIT have been looking into many of these decentralized tools and are reaching similar conclusions — the technical problems are hard but we must solve the social and people problems if we want to progress: “Decentralized web advocates have good intentions, but there is no silver-bullet technical solution for the challenges that lie ahead.”

To top it off, over $1.6 billion was invested in these ICOs in the past year alone. Where are we going? Is the future of decentralization going to be rooted in paywalls and coins, with the management of those coins and that technology trusted to a single individual or group? Is that really where we want to end up?

With a commons approach to the decentralized web, the most promising path is guided by where we came from. I am much more excited about creating protocols that are easy to use, develop with, and extend without asking for permission and without paying or having much money at all. That means that they are driven by the community, built for the public good, and given away for free. If the company or organization dies, the protocols should still be usable. Any blockchains involved should not be tied to a particular for-profit company. I should not be tying my data to any one coin or blockchain for fear of enclosure. The protocols should be optimizing for science (broadly speaking, as in developing knowledge) and mutual collaboration rather than optimizing for profit. Let us not recreate the problem we are trying to solve.

Photo by n.a.t.u.r.e

The post Karissa McKelvey on the Web of Commons appeared first on P2P Foundation.

Decentralising the web: The key takeaways
P2P Foundation blog, 14 September 2018

Republished with permission from UK technology site Computing

John Leonard: The Decentralized Web Summit is over – what’s next?

Earlier this month a rather unusual tech event took place in San Francisco.

The Decentralized Web Summit played host to a gathering of web luminaries such as Sir Tim Berners-Lee, Brewster Kahle and Vint Cerf. On top of that, activists and authors and screenwriters such as Jennifer Stisa Granick, Emili Jacobi, Mike Judge and Cory Doctorow put in an appearance, as did cryptocurrency pioneers like Zooko Wilcox, blockchain developers, and academics.

Then, there was what the Guardian‘s John Harris calls the Punk Rock Internet – companies like MaidSafe and Blockstack who play by their own decentralised rules.

Oh, and there was a sprinkling of techies from Microsoft, Google (Vint Cerf and others) and Mozilla in attendance too, along with a handful of venture capitalists looking for opportunities.

Uniting this diverse selection of delegates was the challenge of fixing the centralising tendencies of the internet and web.

Simply put, the internet’s reliance on centralised hubs of servers and data centres means that the more servers you control, the more power you have, with all the negative consequences that follow from the creation of data-haves and data-have-nots.

To redress the balance, data needs to be freed from silos with control handed back to users, but how to do that while retaining the convenience and ease-of-use of the current web?

Aside from the inevitable resistance by the powers that be, this turns out to be quite the technical challenge.

One task among a set of complex interlocking challenges is to separate data from the applications that use it. People could then store their personal data where they choose, granting or limiting access by applications as they please. For example, Berners-Lee’s Solid platform enables everyone to have multiple ‘pods’ for their data allowing for fine-grained control.
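Separating data from applications boils down to an access-control check that the user, not the app, owns. A toy sketch of a pod granting one app read access to one key; the API here is invented for illustration and is not Solid’s actual interface:

```python
class Pod:
    """User-owned data store: apps read only what the owner has granted."""
    def __init__(self, owner):
        self.owner = owner
        self._data = {}
        self._grants = {}  # app name -> set of readable keys

    def put(self, key, value):
        self._data[key] = value

    def grant(self, app, key):
        self._grants.setdefault(app, set()).add(key)

    def read(self, app, key):
        if key not in self._grants.get(app, set()):
            raise PermissionError(f"{app} may not read {key!r}")
        return self._data[key]

pod = Pod("alice")
pod.put("contacts", ["bob", "carol"])
pod.put("health_records", "private")
pod.grant("calendar-app", "contacts")

print(pod.read("calendar-app", "contacts"))  # ['bob', 'carol']
# pod.read("calendar-app", "health_records") would raise PermissionError
```

The inversion of control is the point: the application asks the pod, and the pod's owner decides, rather than the application's vendor holding the data and deciding who sees it.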

Another element is authentication, ensuring that the data owner really is who they say they are, while ensuring real identities remain private by default.

Networking needs to be peer-to-peer rather than hub-and-spoke, with copies of files stored across multiple machines for redundancy and speed of throughput in a manner that users of torrent-based file-sharing services will be familiar with, but adding far more control and performance features.

And above all it will need to be easy to use, low latency and simple for developers to create decentralised applications for.

Computing contacted a number of contributors to the Summit before and after the event and asked about their take on progress towards a viable decentralised web.

Pic credit Vitor Fontes. Things fall apart, the centre cannot hold (W.B. Yeats)

14/08/2018 – The key takeaways

With the summit now over and the participants returned to their basement labs (or shiny new offices) it’s time to consider the takeaways.

Interest in decentralisation is growing

While the 2016 Decentralized Web Summit attracted 350 enthusiasts, 2018 saw more than twice that number, with 800 attendees across 156 sessions. Not huge numbers as tech events in San Francisco go (the ‘big one’, Oracle OpenWorld, attracts an astonishing 60,000 delegates), but important nevertheless in that it brought together the founders of the connected world with those looking at new ways to reclaim the web’s original vision.

“There are dozens and dozens of new projects and protocols and our goal was to get them to a place where people could do real learning,” said Wendy Hanamura of the Internet Archive.

For Blockstack’s Patrick Stanley the seed planted two years ago is still growing strongly: “I was very impressed by the quality of attendees and felt that the spirit of the original vision of the web as a place where people can create was intact,” he said.

No project is an island

The web touches almost every aspect of modern life. Re-architecting such a system will be a huge undertaking, one far too big for disparate bunches of developers working alone. MaidSafe COO Nick Lambert was among many urging more collaboration.

“Certainly, there are some efforts to work together on problem solving, but this is not happening universally,” he said. “Everyone at the event was clearly united in a common purpose to make the internet more private and secure, but the key takeaway for me is how we foster greater cohesion among the different projects.”

Money: no longer too tight to mention

Concerns about attracting VC funding haunted 2016, but those worries have largely evaporated as a result of the crypto goldrush which has given a huge boost to the value of the tokens that support many projects. Booms can turn to busts, of course, and sudden wealth can bring challenges of its own, but for now the gloom has lifted.

While some fear an inevitable clampdown on cryptocurrencies by the authorities, OmiseGO’s Althea Allen, who chaired a debate on the issue, said the worst may not happen.

“What I took away from talking with those excellent thinkers was actually quite a hopeful picture for the future of decentralised finance,” she said. “By all their accounts, they have found regulators to be more open to the possibilities of crypto than we tend to assume, with less default bias toward corporate interests, and largely concerned with the same things that we are: security, privacy, consumer protections; generally speaking, making honest people’s lives easier and not harder.”

Awareness of the bigger picture

Mindful of the developing relationship with the authorities, governance was front and centre of many discussions, a sign of growing maturity in decentralised thinking. For Miriam Avery, director of strategic foresight at Mozilla’s Emerging Technologies department, valuable lessons can be learned from those working “in countries where corruption is blatant, regulation is ineffective, and centralised control points cause palpable harm.”

Their experiences may turn out to be more universal than some might think, she said.

“The threat model is changing such that these harms are relevant to people who are less acutely aware of their causes. For instance, the things Colombian Ethereum hackers are worried about are things that we should all be a little worried about.”

Avery continued: “At the same time, digging into these projects we can already see pitfalls in the ‘governance’ of the software projects themselves, from the prevalence of benevolent dictators to disagreements on the limits of moral relativism. There’s room to grow these technologies through healthy, inclusive open source communities, and I’m excited to see that growth.”

The door needs to be wedged open, or it will be slammed shut again

Another Mozillan, software engineer Irakli Gozalishvili, said: “It was reassuring to see that the community is actively thinking and talking about not only making decentralised web a place that serves people, but also how to create technology that can’t be turned into corporate silos or tools for empowering hate groups.”

Scaling up

Any decentralised web worthy of that name needs to be quick and responsive at scale, said MaidSafe’s Lambert. “There is a long way to go to create a user experience that will encourage everyone to adopt the decentralised approach.  For example, none of the demonstrations at the summit were able to show scalability to millions of users.”

Front-end focus

The decentralised web is still very ‘engineering-y’, with most of the effort going into the back-end rather than the user interface. The networking may be futuristic, but the front end is (with a few honourable exceptions) still Web 1.0. That is fine at the development stage, but projects will soon need to move on from demonstrating capabilities to making apps that people actually want to use.

Creating an easy onramp is an essential step. Mozilla is piloting decentralised web browsing via WebExtension APIs, the first of the ‘major’ vendors to do so, although others have been working in this area for a while, notably the Beaker browser for navigating DAT sites and ZeroNet.

A long list of necessary developments includes a human-readable decentralised replacement for the DNS system, search engines, and proof that crypto-based incentive systems for the supply and demand of resources can make for a scalable economy.

And the next Decentralized Web Summit? Hanamura wouldn’t be drawn on a date. “We’re still recovering from organising this one,” she said.

Enthusiasm is not sufficient fuel

06/08/2018 – Maintaining the momentum

If the 2016 Decentralized Web Summit was a call to action, in 2018 it’s all about working code. That’s according to Wendy Hanamura, director of partnerships at the Internet Archive, the organisation that hosted both events. However, there is still a fair way to go before the decentralised web gets anywhere near the mainstream.

The Internet Archive’s mission is to preserve the outputs of culture, turning analogue books, files and recordings into digital, storing digital materials for posterity and preserving web pages going back to 1996 in the Wayback Machine.

Unsurprisingly given its aims, the organisation is sitting on a mountain of data – more than 40 petabytes and rising fast. It has recently started experimenting with decentralised technologies as a way of spreading the load and ensuring persistence, including the file-sharing and storage protocols WebTorrent, DAT and IPFS, the database GUN and the P2P collaborative editor YJS.

And it’s open to looking at more in the future. “We’re glad to be in at the ground floor,” said Hanamura. “We have no horse in the race. We’re looking for all of them to succeed so we’re looking at different protocols for different functions.”

Wendy Hanamura

Despite some substantial progress, few decentralised projects could yet be described as ‘enterprise ready’. More work is required in many different areas, one of which is providing more straightforward ways for non-technical users to become involved.

Hanamura pointed to developments among big-name browsers including Firefox, Chrome and Brave as among the most promising for improved user experience. Mozilla demonstrated a Firefox API for decentralised systems at the event.

“Participants were able to talk to each other directly browser to browser without a server involved, and they thought that was tremendously exciting,” she said.

Collaborations

For Ruben Verborgh of the Solid project, the cross-pollination required to overcome some of the challenges is hampered by the diversity of approaches.

“Ironically, the decentralised community itself is also very decentralised, with several smaller groups doing highly similar things,” he said. “Finding common ground and interoperability will be a major challenge for the future since we can only each do our thing if we are sufficiently compatible with what others do.”

While it’s still too early for projects to merge or consolidate around standards, Hanamura said she witnessed “lots of meetings in corridors and deals being struck about how you could tweak things to work together.”

“That’s another way you can make it scale,” she added.

Maintaining momentum

The summit had strong ideological underpinnings; Hanamura described it as “an event for the heart. People came to share.”

The strength of small open-source projects with big ideas is that they can easily sustain shared ideals, but this can be hard to maintain as they evolve, she went on.

“Many founders said governance was their biggest worry. You need a team of developers who believe in you and are willing to work with you – if not they can fork the code and create something very different.”

In 2016 the main concern was very different: it was funding. The success of cryptocurrency token sales (ICOs) has removed many of these worries, at least for some. A lot of money has flowed into decentralised technologies: Filecoin, for example, recently raised $230m in an ICO and Blockstack $50m. But this can be a double-edged sword, as rapid expansion and bags of cash make team cohesion more challenging to maintain, Hanamura believes.

“It makes it a dangerous time. We came to this with a purpose, to make a web that’s better for everyone. So we need to keep our eye on the North Star.”

Once the technologies hit the mainstream, there will be other challenges too, including legal ones.

“As this ecosystem grows it has to be aware of the regulations on the books around the world but also those pending,” said Hanamura. “We have to have a strong voice for keeping areas where we can sandbox these technologies. We need a governance system to keep it decentralised otherwise it can get centralised again.”

It’s gonna take a lot of thinking through

01/08/2018 – Why is decentralising the web so hard to achieve?

Tim Berners-Lee and his colleagues faced a number of tough challenges when inventing the web, including having to build early browsers and protocols from scratch and overcoming initial scepticism (his original idea was labelled ‘vague but exciting’ by his boss at CERN). The nascent web also needed to be brought into being under the radar, and the terms for the release of its code carefully formulated to guarantee its free availability for all time. It took 18 months to persuade CERN that this was the right course.

“Had the technology been proprietary, and in my total control, it would probably not have taken off. The decision to make the web an open system was necessary for it to be universal. You can’t propose that something be a universal space and at the same time keep control of it,” said Berners-Lee in 1998.

The original web was designed to be decentralised, but over the course of time it has been largely fenced off by a small number of quasi-monopolistic powers we know as ‘the tech giants’. This makes designing a new decentralised internet – one that’s ‘locked open’ in the words of the Internet Archive’s Brewster Kahle – a challenge even more daunting than the one those pioneers faced. The problem is that the tech giants are very good at what they do, said Jamie Pitts, a member of the DevOps team at the Ethereum Foundation, speaking for himself rather than on behalf of his organisation.

“One of the key hurdles to decentralisation is the lock-in effect and current excellent user experience provided by the large, centralised web services,” he said.

“Decentralised web technology must enable developers to produce high-quality systems enabling users to search, to connect with each other, and to conduct all forms of business. Until that happens, users will continue to be satisfied with the current set of options.”

While a subset of users is worried about power imbalances, surveillance and lack of control and transparency, the fact is that most people don’t care so long as there are bells and whistles aplenty. A tipping point must be achieved, as Althea Allen of OmiseGO put it.

“The only thing that will force those centralised systems to change on a fundamental level is a mass shift by consumers toward decentralised systems.”

Selling ads and services through the centralisation and mining of data (‘surveillance capitalism’) has made the tech giants very powerful, and it can be hard to see beyond this model.

“The monopolisation that can occur in a rapidly-advancing technology space poses one of the greatest challenges to decentralisation,” said Pitts.

“Aggregation of capital and talent results from the network effect of a successful commercially-run service, and developers and users can become locked-in. While many of the needs of users may be met by the dominant content provider, search engine, or social network, the monopolised network becomes a silo.”

Moreover, the suck-up-all-the-data model has proven to be highly lucrative for the big boys, and while alternative economic methods for paying participants involving cryptocurrencies and micropayments are emerging, none has yet proved itself on the wider stage.

“There need to be viable business models for app developers that do not depend on advertisements or exploiting user behaviour and data,” said Blockstack’s Patrick Stanley.

On the systems side, the architecture needs to be rethought to avoid central hubs. One of the toughest problems is achieving reliable consensus: with nodes seeing different versions of the ‘truth’ (i.e. what events are happening and in what order), how can one ‘truth’ be agreed upon without reference to a central arbiter? And how can this consensus be secured against faults and bad actors?

This longstanding conundrum was finally solved by the bitcoin blockchain a decade ago, and many efforts are ongoing to make it more efficient and a better fit for the decentralised web, the IoT and other applications. However, other projects, such as IPFS and MaidSafe’s SAFE Network, don’t use a blockchain, arriving at different methods for achieving consensus.
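
The core of bitcoin's answer is to order events in a chain of hashes, where each block commits to its predecessor and costs real work to produce, so rewriting history means redoing all the work since the tampered block. A toy proof-of-work chain, purely for illustration and nothing like production scale:

```python
import hashlib
import json

def _hash(block: dict) -> str:
    # Hash only the canonical body, so the stored "hash" field can be checked.
    body = {k: block[k] for k in ("prev", "events", "nonce")}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def mine_block(prev_hash: str, events: list, difficulty: int = 3) -> dict:
    """Search for a nonce whose block hash starts with `difficulty` zero digits."""
    nonce = 0
    while True:
        block = {"prev": prev_hash, "events": events, "nonce": nonce}
        if _hash(block).startswith("0" * difficulty):
            block["hash"] = _hash(block)
            return block
        nonce += 1

def valid_chain(chain: list) -> bool:
    # Every block must link to its predecessor and match its own hash.
    return all(cur["prev"] == prev["hash"] and cur["hash"] == _hash(cur)
               for prev, cur in zip(chain, chain[1:]))
```

In real networks, peers adopt the chain carrying the most accumulated work, which is what lets thousands of mutually distrusting nodes converge on a single ordering of events.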

There are many ways to skin the decentralised cat – and that is another issue. What do people want: privacy, autonomy, security, an alternative economy, or all of the above? Where are the trade-offs and who decides the priorities? And how can the various strategies work together?

The problem is too big for one player to handle. MaidSafe’s David Irvine sees collaboration as key to any solution, which was one reason why the firm open-sourced all its code.

“We want to collaborate with other companies in this space. We have the scars of developing specific functionality and are happy to work with companies to integrate that functionality where it makes sense.”

Pic credit Rene Böhmer. A decentralised web can also be a place to hide

31/07/2018 – What might go wrong?

Technology is morally agnostic. Nuclear power provides the raw material for nuclear bombs. That new road can carry serial killers as well as saints. And while a decentralised web would redistribute power over personal data, it could also provide a convenient hiding place for the bad guys.

Danielle Robinson

It’s high time technologists started to see this issue in the round, said Danielle Robinson, co-executive director of Code for Science & Society, a non-profit supporting collaboration in public-interest technology.

“When technology is built, the biases of its creators are often embedded into the technology itself in ways that are very hard for the creators to see, until it’s used for a purpose you didn’t intend,” she said during an interview with Internet Archive. “So I think it’s really important that we talk about this stuff.”

The increased privacy and security built into decentralised web technologies makes it easier for anyone to collaborate in a secure fashion. And that includes hate groups.

“They’re on the current existing web, and they’re also on the decentralised web, and I think it’s important for our community to talk about that,” she said. “We need a deeper exploration that’s not just ‘oh you know, we can’t control that’.”

In a separate interview, Matt Zumwalt, program manager at Protocol Labs, creator of the InterPlanetary File System (IPFS), argued that proponents of the decentralised web need to think about how it might be gamed.

“We should be thinking, really proactively, about what are the ways in which these systems can be co-opted, or distorted, or gamed or hijacked, because people are going to try all of those things,” he said.

The decentralised web is still an early stage project, and many involved in its creation are motivated by idealism, he went on, drawing parallels with the early days of the World Wide Web. Lessons should be learned from that experience about how reality is likely to encroach on the early vision, he said.

“I think we need to be really careful, and really proactive about trying to understand, what are these ideals? What are the things we dream about seeing happen well here, and how can we protect those dreams?”

Mitra Ardron, technical lead for decentralisation at the Internet Archive, believes that one likely crunch point will be when large firms try to take control.

“I think that we may see tensions in the future, as companies try and own those APIs and what’s behind them,” he said. “Single, unified companies will try and own it.”

However, he does not think this will succeed because he believes people will not accept a monolith. Code can be forked and “other people will come up with their own approaches.”

30/07/2018 – Blockstack on identity and decoupling data

Authentication and identity are cornerstones of decentralised networking. Through cryptography, I as a user can prove who I am and what data I own without reference to any central registry. I can use my decentralised ID (DID) to log on securely, and perhaps anonymously, to services and applications with no third party involved.

Identity is bound up with another tenet of decentralisation: separating the data from the applications. Applications are now interfaces to shared data rather than controllers and manipulators of it. Without my express permission, apps can no longer use and retain data beyond my control.

Coupling data to ID rather than apps was the starting point for the Blockstack platform, as head of growth Patrick Stanley explained.

“Blockstack is creating a digital ecosystem of applications that let users fully own their identities and data on the Internet. User data – like photos and messages – are completely decoupled from the applications. Apps can no longer lock users and their social graph in, since they no longer store anything.”

Storage is taken care of elsewhere, in a decentralised storage system called Gaia. As apps are now ‘views’ or interfaces, you don’t need to log in to each one individually.

“People use applications on Blockstack just like they would with today’s Internet. But instead of signing up for each app one-by-one with an email address and password — or a Google/Facebook log-in — users have an identity that’s registered in the blockchain and a public key that permissions applications or other users to access pieces of data.”

That’s lots of positives so far from a user point of view, and also for developers who have a simpler architecture and fewer security vulnerabilities to worry about, but of course, there’s a catch. It’s the difference between shooting from the hip and running everything by a committee.

“Decentralisation increases coordination costs. High coordination costs make it hard to get some kinds of things done, but with the upside that the things that do get done are done with the consensus of all stakeholders.”

There are already privacy-centric social networks and messaging apps available on Blockstack, but asked about what remains on the to-do list, Stanley mentioned “the development of a killer app”. Simply replicating what’s gone before with a few tweaks won’t be enough.

A viable business model that doesn’t depend on tracking-based advertising is another crucial requirement – what would Facebook be without the data it controls? – as is interoperability with other systems, he said.

And the big picture? Why is Blockstack sponsoring the event? Ultimately it’s about securing digital freedom, said Stanley.

“If we’re going to live free lives online, there needs to be protocol-level safeguards to ensure your data stays under your control. Otherwise, the people who control your data ultimately control your digital life.”

Independent but interconnected

27/07/2018 – OmiseGO on the importance of UI

OmiseGO, a sponsor of the Decentralized Web Summit, is a subsidiary of Asia-Pacific regional fintech firm Omise. Omise is a payments gateway similar to PayPal or Stripe that’s doing brisk business in East Asia. Omise enables online and mobile fiat currency transactions between customers and participating vendors, and OmiseGO, a separate company and open source project, aims to do the same with cryptocurrencies too.

The backbone of OmiseGO is the OMG blockchain, which in turn is built on Ethereum. The goal is to provide seamless interoperability across all blockchains and providers. OMG uses Plasma, an enhancement designed to speed up transactions on the Ethereum blockchain, and the company counts Ethereum’s founders Vitalik Buterin and Gavin Wood among its advisors. While it’s very early days, in the long run OmiseGO wants to extend banking-type services to the billions of ‘unbanked’ people by cutting out the financial middlemen who don’t serve those people, while also giving the ‘banked’ an alternative.

The current Internet has too many middlemen of its own, meaning that equal access does not mean equal control, explained OmiseGO’s head of ecosystem growth Althea Allen in an email.

“The decentralised web is crucial in providing equitable agency within the systems that internet users are accessing. Sovereignty over your own data, money and communication; access to information that is not censored or manipulated; the ability to control what aspects of your identity are shared and with whom: these are essential freedoms that the centralised web simply will not provide.”

However, if the alternatives are awkward and clunky, they will never take off.

“It is difficult, though not impossible, to create a decentralised system that provides the kind of user experience that the average internet user has come to expect. Mass adoption is unlikely until we can provide decentralised platforms that are powerful, intuitive and require little or no change in users’ habits.”

Team OmiseGO

Blockchains are a powerful tool for decentralisation as they can keep track of events and processes across the network without a central controller, but that depends on how they are used. There’s a lot of ‘blockchain-washing’ out there, Allen warned.

“Blockchains are not intrinsically decentralised – they can absolutely be private and proprietary. Many institutions, old and new, are showing an interest in adopting new technologies such as blockchains, maintaining the same centres of power and influence, and putting an ‘I blockchained’ sticker on them – essentially, appropriating the rhetoric of decentralisation without actually adopting the principles.”

Asked about the plethora of competing decentralised approaches, Allen said she believes this is positive, but sharing ideas is vital too.

“Cooperation is crucial for us to move the space forward, while healthy competition encourages the exploration of many different possible solutions to the same problems. We work particularly closely with Ethereum, but the success of our project depends on a thriving ecosystem (which extends well beyond crypto or even blockchain technology). To this end, we make a concerted effort to work with projects and individuals in many fields who are contributing to building the decentralised web.”

26/07/2018 – MaidSafe on collaboration

As we mentioned in the introduction, a decentralised web will require a number of different interlocking components, including decentralised storage, decentralised networking, decentralised applications and decentralised identities.

MaidSafe, one of the event’s sponsors, is trying to cover all but one of these bases with its autonomous SAFE Network, replacing the Transport, Session and Presentation layers of the current seven-layer internet with decentralised alternatives to create a platform for applications. The project is currently at the alpha test stage.

So it’s all sewn up then, no need for further collaboration? Not at all, said CEO David Irvine, who will be speaking at the event, pointing to the firm’s open-sourcing of its PARSEC consensus algorithm and its invitation to other projects to help develop it. It’s just not always easy to organise joint ventures, he said. The summit will bring together many pioneers and innovators (70-plus projects are represented), each pushing their own ideas for redefining the web.

“[Everyone’s] so passionate about improving the internet experience, we are defining the rules for the future, and everyone has a point of view. That does mean there are some egos out there who are quite vocal about the merits of their approach versus others, which makes for good media stories and fuels hype, but it’s not what we’re really focused on.”

Within any movement dedicated to upending the status quo, there lurks the danger of a People’s Front of Judea-type scenario with infighting destroying the possibilities of cooperation. Amplifying the risk, many projects in this space are funded through cryptocurrency tokens, which adds profiteering to the mix. It’s easy to see how the whole thing could implode, but Irvine says he’s now starting to see real collaborations happen and hopes the summit will bring more opportunities.

“We’ve already been talking to Sir Tim Berners-Lee’s Solid project at MIT, and we have a growing number of developers experimenting with applications for the platform,” he said.

MaidSafe’s David Irvine

MaidSafe has been a fixture in the decentralised firmament for a while, predating even the blockchain that is the backbone of many other ventures. At one time it had the space almost to itself, but it has since been joined by a host of others. Asked about his company’s USP, Irvine came back with one word: “honesty”.

We asked him to expand.

“There is far too much hype in the wider blockchain crypto space and we have always tried to distance ourselves from that nonsense. We’re trying to build something hugely complex and radically different. That doesn’t happen overnight, so you have to be upfront with people so they are not misled. Sure we’ve learned along the way, got some things wrong, but whenever we have we’ve held our hands up and that has helped us.”

And the big-picture goal?

“In essence, privacy, security and freedom. The technology we are building will provide private and secure communications, as well as freedom through the unfettered access to all humanity’s data.”

25/07/2018 – Kahle and Berners-Lee on the need for decentralisation

Organiser the Internet Archive directed us to some recent statements by founder Brewster Kahle. Here Kahle outlines some of the problems with the existing web.

“Some of the problems with the World Wide Web that we’ve seen in the last few years are the surveillance structures that Snowden gave light to. There are the trolling problems that we saw in the last election. There’s privacy aspects, of people spilling their privacy into companies that sometimes aren’t the most trustworthy. There’s advertising technologies being used against users. There’s a lot of failings that we’ve seen in the World Wide Web.”

To be successful, the decentralised web will need to encourage “lots of winners, lots of participation, lots of voices” he said.

“So this is a time to join in, to find a place, get knee-deep in the technologies. Try some things out. Break some stuff. Invest some time and effort. Let’s build a better, open world, one that serves more of us.”

Open source principles are essential but not sufficient. There must be a focus on performance, functionality and new ideas.

“We’re only going to survive if the open world is more interesting than closed app worlds … what I would think of as a dystopian world of closed, segmented, siloed, corporately-owned little pieces of property. I’d much rather see an open, next-generation web succeed,” Kahle said.

Tim Berners-Lee

As ‘Father of the Web’ (Mk I), Tim Berners-Lee has become increasingly disillusioned with his offspring. Around the time of the previous Decentralized Web Summit in 2016, he said: “The web has got so big that if a company can control your access to the internet, if they can control which websites you go to, they have tremendous control over your life.

“If they can spy on what you’re doing they can understand a huge amount about you, and similarly if a government can block you going to, for example, the opposition’s political pages, they can give you a blinkered view of reality to keep themselves in power.”

Since then, of course, many of the things he warned about have become evident in increasingly obvious and frightening ways. And in the US, Congress recently scrapped net neutrality, doing away – in that country at least – with a longstanding principle of the internet, namely that ISPs should treat all data equally.

So, are there any positive developments to report over the last two years? Berners-Lee remains hopeful.

“There’s massive public awareness of the effects of social networks and the unintended consequences,” he told Computing. “There’s a huge backlash from people wanting to control their own data.”

In part this awareness is being driven by GDPR coming into effect, in part by news headlines.

Meanwhile, there’s the rise of “companies which respect user privacy and do not do anything at all with user data” (he namechecks the social network MeWe, which he advises), open-source collaborations like the Data Transfer Project (DTP) led by the tech giants, and his own project Solid, which is “turning from an experiment into a platform and the start of a movement”.

“These are exciting times,” said Berners-Lee.


John Leonard, Research Editor, Incisive Media

The post Decentralising the web: The key takeaways appeared first on P2P Foundation.

]]>
https://blog.p2pfoundation.net/decentralising-the-web-the-key-takeaways/2018/09/14/feed 0 72506