Beyond Bitcoin and Ethereum — a fairer and more just post-monetary sociopolitical economy

Taking the Bitcoin dream of ‘freedom as self-sovereignty’ beyond anything even Bitcoin maximalists ever dared to dream of.

Written by Hank Sohota. Originally posted on Good Audience on 28th January 2019.

A viable, sustainable and scalable P2P sociopolitical economy, which embraces digital and data sovereignty for all agents, is about to emerge. One in which, as a consequence, money and intermediaries — social, political and economic — will no longer play the central, and therefore controlling, role they play today.

Let us start where Bitcoin started

Whatever socially and politically legitimised ‘flavour’ of money operates within a given community, nation or civilisation, it fundamentally shapes the economic, social, and political potential — and therefore possibilities — within those domains.

The Problem with Money

Money — by its very nature a social construct, as it fundamentally relies on people’s confidence in it — has three defining core functions. These are:

  1. a store of value (i.e. sufficiently stable in value — and particularly, in people’s confidence in it — over time), which enables it to be …
  2. a medium of exchange (i.e. also sufficiently saleable across scales and space), all of which enables it to be …
  3. a unit of account

However, in order for money to work at scale, standardisation is also required, which, in reality, inevitably means centralisation. Unfortunately, this intrinsically undermines the hardness of money (i.e. its ‘uninflatability’*), and the level of confidence people can have in its core functions, because decision-making shifts into the hands of the few. One cannot have sound money without reliable and consistent long-term hardness, as well as confidence-maintaining monetary policy. Regrettably, the few — or, in the bygone case of a monarchy, an individual — have a long history of abusing their fiduciary responsibilities on both counts.

So, in order to solve the ‘hardness of money’ and the ‘level of confidence’ problems, we need to solve the centralisation problem, which — applied more broadly — asks:

How do we coordinate, cooperate and collaborate across space and time, at scale, without the need for intermediaries, representatives, executives or organisation-owners?

Arguably, this articulates, for some, the holy grail of anarchy (which should not be assumed to be synonymous with lawlessness, chaos or disorder).

Limitations of Bitcoin and Ethereum

Bitcoin is an attempt at solving the ‘centralisation of hard money’ problem, which in the bigger picture is a good place to start. It does this by using a distributed ledger of hashchain blocks (giving it immutability), hard-coding hardness (giving it uninflatability), and constructing a single network-wide timeline through a decentralised but not fully distributed Proof of Work (PoW) consensus mechanism (giving it ‘uncensorability’*). This provides a form of ‘trustlessness’ by trusting the network rather than any individual entity or actor. The approach, however, provides only a partial and impractical solution: it is not distributed enough, not fast enough, not cost-effective enough, and not scalable enough. Furthermore, its electrical power consumption could push climate change too far in the wrong direction to be worth it — unless, conversely, it turns out to be a boon for renewable forms of generating electricity by increasing the financial incentives for them, perhaps even leading to green energy infrastructure which otherwise would not be funded. These shortcomings will still apply even if all the near-to-medium-term solutions work out as proposed. Even so, the four key features of Bitcoin* in its current form, namely immutability, uninflatability, uncensorability and unconfiscatability, are an historic achievement.
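To make the mechanics concrete, here is a minimal sketch (ours, not Bitcoin’s actual code) of a hashchain with a toy proof-of-work loop; the difficulty target, payload strings and helper names are invented for illustration:

```python
# Toy hashchain with proof-of-work (illustrative only; not Bitcoin's
# implementation). Each block commits to its predecessor's hash, so
# tampering with any old block invalidates every block after it.
import hashlib
import json

DIFFICULTY = 4  # required number of leading zero hex digits (toy value)

def hash_block(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine_block(prev_hash: str, payload: str) -> dict:
    nonce = 0
    while True:
        block = {"prev_hash": prev_hash, "payload": payload, "nonce": nonce}
        if hash_block(block).startswith("0" * DIFFICULTY):
            return block  # this nonce satisfies the difficulty target
        nonce += 1

genesis = mine_block("0" * 64, "genesis")
block1 = mine_block(hash_block(genesis), "tx: alice pays bob 1")

# Editing an old block changes its hash and breaks the chain link:
genesis["payload"] = "tampered"
assert block1["prev_hash"] != hash_block(genesis)
```

The hash links give the immutability; the single network-wide timeline comes from nodes following whichever chain carries the most accumulated work.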

Ethereum, although not necessarily trying to solve the same problem — and not necessarily doing a good job of it — is fundamentally based on the same underlying ledger technology as Bitcoin, and so suffers from the same or similar limitations, even before we include its ‘centralisation of power’ issues, its shortcomings as a cryptocurrency relative to Bitcoin, the complications and disadvantages of smart contracts, and its attempt to move to a Proof of Stake (PoS) consensus mechanism*. What Ethereum has done is enable the launching of several thousand altcoins, none of which seem to make much sense; nor do their fundamentals give one confidence that they will ever achieve their stated goals. Given that this has taken place in a new asset class and an unregulated market, no one should be surprised by the emergence of a FOMO-FUD wild west, or the role played in it by market makers.


Mutual Self-sovereignty — the foundational core construct of a fair and just sociopolitical economy

In my view, economics should have a strong focus on thrivability in human social systems — viable, sustainable and inclusive thrivability, at scale.

Although I over-simplify, I believe that at the heart of thrivability lies a dialectic in human social systems: that of group solidarity vs. individual sovereignty (cf. the political philosophy divide of left vs. right). Both aspects of this dialectic provide tremendous benefits for the group and the individual, namely social cohesion leading to better survival odds, but this comes at a price: acquiescence, conformity and homogeneity.

However, I would suggest that solidarity and sovereignty are two sides of the same coin – they mutually and dynamically ‘co-form’ and ‘in-form’ each other, and so co-evolve symbiotically. They constitute a ‘dialectical singularity’ which is brought into ‘harmony’ through mutual self-sovereignty (cf. Yin-Yang; i.e. black and white dynamically interacting with each other at the same time, without either diminishing in identity or the two combining to become a ‘middle’ grey). In this dynamic, social cohesion and individual sovereignty are both ‘strong and fluid’ at the same time — a concept often referenced in Daoist philosophy using the metaphor of water. I believe it is this perpetual dynamic which leads to the anti-fragility of a human social system. I further believe it leads to the perpetual emergence of one’s sense of self and one’s sense of identity.


The mutual self-sovereignty challenge

Even solving the centralisation problem — of hard money or more broadly — would not be enough. We need to go further and address ‘the mutual self-sovereignty’ challenge, which can be thought of as:

Not only do I need a viable option of not having to participate in any particular socially mediated ‘game’ played by a particular set of rules, I also need to be able to, easily and permissionlessly, change the rules of the game (i.e. create a forked version — preferably not a sh*tty/scammy one) and invite others to play, or — just as easily and permissionlessly — be able to invent an entirely new game.

Furthermore, and equally importantly, in all such games the rules (i.e. voluntary and mutually enabling constraints) must be enforceable and policed in an emergent and self-organising manner by the participants — governance of the people, by the people, for the people — and the rules must respect relativity (i.e. multiple relativistic timelines); global consensus should not be necessary. Otherwise, we inexorably end up back at the centralisation problem.

All of which means that Bitcoin and Ethereum — specifically their underpinning blockchain technology — are not going to take us where we need to go in order to address our most pressing global and local challenges, because they are not sufficiently workable and do not go far enough, although they will have been critical and essential catalysts. Even those who were initially inspired by the distributed ledger technology (DLT) of Bitcoin as a means of enabling a radically new peer-to-peer (P2P) sociopolitical economy — a prospect which motivated some in the Bitcoin and Ethereum communities — are now having to recognise, and to concede, these limitations. Hence the sense of malaise and disillusionment among the Ethereum and Ethereum-esque developers who are not in it for the money.

Holochain and the Post-monetary Economy

Holochain, on the other hand, will take us where we need to get to. It is the first technology, in human history, which genuinely addresses the mutual self-sovereignty challenge, completely and at any scale — in fact, it is inversely scalable, its efficiency and efficacy improve as network size increases — and as an integral component of the MetaCurrency and Ceptr projects, it also pre-dates both Ethereum and Bitcoin.

Holochain provides a bio-mimicry inspired, software-based, enabling social technology — a pattern, if you will — from which can emerge anarchy — life without mass intermediation as a necessity. Thus empowering us to move to a post-monetary epoch with, for example, a multitude of asset-backed mutual credit (crypto)currencies — which on Holochain are natively inter-operable — using a much broader definition of currency (i.e. a formal symbol system for shaping, enabling, and measuring flows — e.g. of value, promises or reputation). A much more enlightened interpretation of Hayekian thinking, I would suggest, than the neo-liberalism version.
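As a rough illustration of the mutual credit pattern (a sketch under our own assumptions, not Holochain’s actual API; the names and credit limit are invented), currency units are created and extinguished at the moment of exchange, so balances always sum to zero and per-agent credit limits stand in for a central money supply:

```python
# Minimal mutual credit ledger (illustrative only). No tokens are
# pre-issued: a payment pushes the payer negative and the payee
# positive, so the total supply is always exactly zero.
class MutualCreditLedger:
    def __init__(self, credit_limit: int = 100):
        self.balances: dict[str, int] = {}
        self.credit_limit = credit_limit  # the agreed negative-balance floor

    def pay(self, payer: str, payee: str, amount: int) -> None:
        if self.balances.get(payer, 0) - amount < -self.credit_limit:
            raise ValueError(f"{payer} would exceed the agreed credit limit")
        self.balances[payer] = self.balances.get(payer, 0) - amount
        self.balances[payee] = self.balances.get(payee, 0) + amount

ledger = MutualCreditLedger()
ledger.pay("alice", "bob", 40)   # alice: -40, bob: +40
ledger.pay("bob", "carol", 10)
assert sum(ledger.balances.values()) == 0  # the defining invariant
```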

A value flow, of any kind, must first be acknowledged and recognised before it can be managed for the better — making visible only GDP-related flows has been a disaster for humanity and the planet, if not potentially catastrophic. Then, and only then, can we begin the work of reinforcing or amplifying interrelated positive flows and mitigating — hopefully eliminating — interrelated negative flows, in an emergent and self-organising way. Thus we can form the basis on which more meaningful, and more humane, wealth and prosperity can be created for the many, perhaps even, for all.

Mass Disintermediation

Despite its long history, for most people the economic and sociopolitical revolution Holochain will induce will seem like it happened overnight. This is because it is an open source software solution taking place in a digitalised world. It can be deployed at speed, at scale, and at zero marginal cost, using the full range of computational device types from a Raspberry Pi, to a smartphone, to a tablet, or a laptop — even a server — using software development languages and tools which produce secure, compact and fast web and native apps.

The first hApp (Holochain dApp) to be built — using Rust and WASM — is Holo, an hApp for hosting hApps, which includes the first ever mutual credit cryptocurrency, called Holo Fuel, to reimburse Holo hosts — who, with Holo, host hApps using the spare computation and storage capacity on their own devices. This enables hApps to be accessed using a standard browser — such as the Holochain-favoured Mozilla Firefox — through the web, without any change in the user experience. However, even this hosting can be avoided, since any device running Holochain is natively both a user and a host. Holo’s purpose, then, is to provide a bridge between the current server-based web and the potential longer-term server-less — because peer-to-peer — Holochain alternative. Ultimately, it should be possible to integrate mesh networking too, which would mean a genuinely and fully distributed internet and web.

Furthermore, Holochain’s data integrity model supports mutual self-sovereignty by having an agent-centric orientation — using sourcechains (think, agent-owned hashchains), digital signatures, and validating distributed hash tables (think, BitTorrent and GitHub) — rather than a data-centric orientation. It thus fully returns value realisation and ownership, as well as privacy and confidentiality, to those actually creating the value locally, rather than to intermediaries, representatives, executives or organisation-owners seeking to extract and monetise it.
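A toy rendering of that agent-centric pattern (our sketch, not Holochain’s real data structures; HMAC stands in for proper public-key signatures): each agent appends signed entries to her own hashchain, and peers run a shared validation rule before agreeing to hold an entry in the DHT:

```python
# Sketch of an agent-owned sourcechain with signed entries and a
# peer-side validation rule (illustrative only).
import hashlib
import hmac
import json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class Agent:
    def __init__(self, name: str, secret: bytes):
        self.name, self.secret = name, secret
        self.chain: list[dict] = []  # the agent-owned sourcechain

    def commit(self, payload: dict) -> dict:
        prev = entry_hash(self.chain[-1]) if self.chain else None
        entry = {"author": self.name, "prev": prev, "payload": payload}
        # real systems sign with the agent's private key; HMAC is a stand-in
        entry["sig"] = hmac.new(self.secret, entry_hash(entry).encode(),
                                hashlib.sha256).hexdigest()
        self.chain.append(entry)
        return entry

def validate(entry: dict) -> bool:
    # the shared app rule every peer checks before holding the entry in
    # the DHT; real rules are app-specific (here: payments must be positive)
    return entry["payload"].get("amount", 0) > 0

alice = Agent("alice", b"alice-secret")
entry = alice.commit({"type": "payment", "amount": 5})
assert validate(entry)
```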

The Ultimate Question

Once workable, practical and ubiquitous, mutual self-sovereignty — as a movement — will redefine every dimension of our lives: social, political, economic, artistic and cultural. Most profoundly, it will completely change the nature of the stories we tell ourselves and each other in order to navigate our lives, both intra- and inter-generationally. In doing so, along with the societal implications of advanced, model-free, deep reinforcement learning AI — not to mention Ceptr and Ceptr-based AI — we will ultimately re-conceive, and therefore redefine, what we believe it truly means to be human in the 21st century.

Disclosure: I am financially and philosophically invested in Ceptr/Holochain/Holo. I have never invested in Bitcoin, any alt-coin or crypto asset.

Photo by Freddie Collins on Unsplash

Thanks for reading.


Further reading:

Ceptr/Holochain/Holo Whitepapers

Antonopoulos, A. M. (2016). The Internet of Money: A Collection of Talks by Andreas M. Antonopoulos, Volume 1. Merkle Bloom.

Antonopoulos, A. M. (2017). The Internet of Money, Volume Two: A Collection of Talks by Andreas M. Antonopoulos. Merkle Bloom.

Ammous, S. (2018). The Bitcoin Standard: The Decentralized Alternative to Central Banking. John Wiley & Sons.

* Special thanks to Tone Vays and Murad Mahmudov for so freely sharing their intellectual musings with the public.

What to do once you admit that decentralizing everything never seems to work

Decentralization is the new disruption—the thing everything worth its salt (and a huge ICO) is supposed to be doing. Meanwhile, Internet progenitors like Vint Cerf, Brewster Kahle, and Tim Berners-Lee are trying to re-decentralize the Web. They respond to the rise of surveillance-based platform monopolies by simply redoubling their efforts to develop new and better decentralizing technologies. They seem not to notice the pattern: decentralized technology alone does not guarantee decentralized outcomes. When centralization arises elsewhere in an apparently decentralized system, it comes as a surprise or simply goes ignored.

Here are some traces of the persistent pattern that I’m talking about:

  • The early decentralized technologies of the Internet and Web relied on key points of centralization, such as the Domain Name System (which Berners-Lee called the Internet’s “centralized Achilles’ heel by which it can all be brought down or controlled”) and the World Wide Web Consortium (which Berners-Lee has led for its entire history)
  • The apparently free, participatory open-source software communities have frequently depended on the charismatic and arbitrary authority of a “benevolent dictator for life,” from Linus Torvalds of Linux (who is not always so benevolent) to Guido van Rossum of Python
  • Network effects and other economies of scale have meant that most Internet traffic flows through a tiny number of enormous platforms — a phenomenon aided and exploited by a venture-capital financing regime that must be fed by a steady supply of unicorns
  • The venture capital that fuels the online economy operates in highly concentrated regions of the non-virtual world, through networks that exhibit little gender or ethnic diversity, among both investors and recipients
  • While crypto-networks offer some novel disintermediation, they have produced some striking new intermediaries, from the mining cartels that dominate Bitcoin and other networks to Vitalik Buterin’s sweeping charismatic authority over Ethereum governance

This pattern shows no signs of going away. But the shortcomings of the decentralizing ideal need not serve as an indictment of it. The Internet and the Web made something so centralized as Facebook possible, but they also gave rise to millions of other publishing platforms, large and small, which might not have existed otherwise. And even while the wealth and power in many crypto-networks appear to be remarkably concentrated, blockchain technology offers distinct, potentially liberating opportunities for reinventing money systems, organizations, governance, supply chains, and more. Part of what makes the allure of decentralization so compelling to so many people is that its promise is real.

Yet it turns out that decentralizing one part of a system can and will have other kinds of effects. If one’s faith in decentralization is anywhere short of fundamentalism, this need not be a bad thing. Even among those who talk the talk of decentralization, many of the best practitioners are already seeking balance — between unleashing powerful, feral decentralization and ensuring that the inevitable centralization is accountable and functional. They just don’t brag about the latter. In what remains, I will review some strategies of thought and practice for responsible decentralization.

Hat from a 2013 event sponsored by Zambia’s central government celebrating a decentralization process. Source: courtesy of Elizabeth Sperber, a political scientist at the University of Denver

First, be more specific

Political scientists talk about decentralization, too—as a design feature of government institutions. They’ve noticed a pattern similar to the one we find in tech. Soon after something gets decentralized, it seems to cause new forms of centralization not far away. Privatize once-public infrastructure on open markets, and soon dominant companies will grow enough to lobby their way into regulatory capture; delegate authority from a national capital to subsidiary regions, and they could have more trouble than ever keeping warlords, or multinational corporations, from consolidating power. In the context of such political systems, one scholar recommends a decentralizing remedy for the discourse of decentralization — a step, as he puts it, “beyond the centralization-decentralization dichotomy.” Rather than embracing decentralization as a cure-all, policymakers can seek context-sensitive, appropriate institutional reforms according to the problem at hand. For instance, he makes a case for centralizing taxation alongside more distributed decisions about expenditures. Some forms of infrastructure lend themselves well to local or private control, while others require more centralized institutions.

Here’s a start: Try to be really, really clear about what particular features of a system a given design seeks to decentralize.

No system is simply decentralized, full-stop. We shouldn’t expect any to be. Rather than referring to TCP/IP or Bitcoin as self-evidently decentralized protocols, we might indicate more carefully what about them is decentralized, as opposed to what is not. Blockchains, for instance, enable permissionless entry, data storage, and computing, but with a propensity to concentration with respect to interfaces, governance, and wealth. Decentralizing interventions cannot expect to subdue every centralizing influence from the outside world. Proponents should be forthright about the limits of their enterprise (as Vitalik Buterin has sometimes been). They can resist overstating what their particular sort of decentralization might achieve, while pointing to how other interventions might complement their efforts.

Another approach might be to regard decentralization as a process, never a static state of being — to stick to active verbs like “decentralize” rather than the perfect-tense “decentralized,” which suggests the process is over and done, or that it ever could be.

Guidelines such as these may tempt us into a pedantic policing of language, which can lead to more harm than good, especially for those attempting not just to analyze but to build. Part of the appeal of decentralization-talk is the word’s role as a “floating signifier” capable of bearing various related meanings. Such capacious terminology isn’t just rhetoric; it can have analytical value as well. Yet people making strong claims about decentralization should be expected to make clear what distinct activities it encompasses. One way or another, decentralization must submit to specificity, or the resulting whack-a-mole centralization will forever surprise us.

A panel whose participants, at the time, represented the vast majority of the Bitcoin network’s mining power. Original source unknown

Second, find checks and balances

People enter into networks with diverse access to resources and skills. Recentralization often occurs because of imbalances of power that operate outside the given network. For instance, the rise of Facebook had to do with Mark Zuckerberg’s ingenuity and the technology of the Web, but it also had to do with Harvard University and Silicon Valley investors. Wealth in the Bitcoin network can correlate with such factors as propensity to early adoption of technology, wealth in the external economy, and proximity to low-cost electricity for mining. To counteract such concentration, the modes of decentralization can themselves be diverse. This is what political institutions have sought to do for centuries.

Those developing blockchain networks have tended to rely on rational-choice, game-theoretic models to inform their designs, such as in the discourse that has come to be known as “crypto-economics.” But relying on such models alone has been demonstrably inadequate. Already, protocol designers seem to be rediscovering notions like the separation of powers from old, institutional liberal political theory. As it works to “truly achieve decentralization,” the Civil journalism network ingeniously balances market-based governance and enforcement mechanisms with a central, mission-oriented foundation populated by elite journalists — a kind of supreme court. Colony, an Ethereum-based project “for open organizations,” balances stake-weighted and reputation-weighted power among users, so that neither factor alone dictates a user’s fate in the system. The jargon is fairly new, but the principle is old. Stake and reputation, in a sense, resemble the logic of the House of Lords and the House of Commons in British government — a balance between those who have a lot to lose and those who gain popular support.

As among those experimenting with “platform cooperativism,” protocols can also adapt lessons from the long and diverse legacy of cooperative economics. For instance, blockchain governance might balance market-based one-token-one-vote mechanisms with cooperative-like one-person-one-vote mechanisms to counteract concentrations of wealth. The developers of RChain, a computation protocol, have organized themselves in a series of cooperatives, so that the oversight of key resources is accountable to independent, member-elected boards. Even while crypto-economists adopt market-based lessons from Hayek, they can learn from the democratic economics of “common-pool resources” theorized by Elinor Ostrom and others.
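As a hedged illustration of such checks and balances (not Colony’s or RChain’s actual mechanism; the blend weights are arbitrary parameters chosen for the example), a vote weight might mix stake, reputation and a flat per-member share so that no single factor dictates outcomes:

```python
# Illustrative blended governance weight: part stake-weighted, part
# reputation-weighted, part one-person-one-vote.
def vote_weight(stake: float, reputation: float,
                total_stake: float, total_reputation: float,
                n_members: int,
                w_stake: float = 0.4, w_rep: float = 0.4,
                w_person: float = 0.2) -> float:
    return (w_stake * stake / total_stake
            + w_rep * reputation / total_reputation
            + w_person / n_members)

# A whale holding 90% of the stake but little reputation ends up with
# 40% of the voting weight, short of a majority on its own:
whale = vote_weight(stake=900, reputation=5, total_stake=1000,
                    total_reputation=100, n_members=10)
veteran = vote_weight(stake=10, reputation=40, total_stake=1000,
                      total_reputation=100, n_members=10)
print(f"whale: {whale:.2f}, veteran: {veteran:.2f}")  # whale: 0.40, veteran: 0.18
```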

Decentralizing systems should be as heterogeneous as their users. Incorporating multiple forms of decentralization, and multiple forms of participation, can enable each to check and counteract creeping centralization.

Headquarters of the Internet Archive, home of the Decentralized Web conferences: Wikimedia Commons

Third, make centralization accountable

More empowering strategies for decentralization, finally, may depend on not just noticing or squashing the emergence of centralized hierarchy, but embracing it. We should care less about whether something is centralized or decentralized than whether it is accountable. An accountable system is responsive to both the common good for participants and the needs of minorities; it sets consistent rules and can change them when they don’t meet users’ needs.

Antitrust policy is an example of centralization (through government bureaucracy) on behalf of decentralization (in private sector competition). When the government carrying out such a policy holds a democratic mandate, it can claim to be accountable, and aggressive antitrust enforcement frequently enjoys broad popularity. Such centralized government power, too, may be the only force capable of counteracting the centralized power of corporations that are less accountable to the people whose lives they affect. In ways like this, most effective forms of decentralization actually imply some form of balance between centralized and decentralized power.

While Internet discourses tend to emphasize their networks’ structural decentralization, well-centralized authorities have played critical roles in shaping those networks for the better. Internet progenitors like Vint Cerf and Tim Berners-Lee not only designed key protocols but also established multi-stakeholder organizations to govern them. Berners-Lee’s World Wide Web Consortium (W3C), for instance, has been a critical governance body for the Web’s technical standards, enabling similar user experience across servers and browsers. The W3C includes both enormously wealthy corporations and relatively low-budget advocacy organizations. Although its decisions have sometimes seemed to choose narrow business interests over the common good, these cases are noteworthy because they are more the exception than the rule. Brewster Kahle has modeled mission-grounded centralization in the design of the nonprofit Internet Archive, a piece of essential infrastructure, and has even attempted to create a cooperative credit union for the Internet. His centralizing achievements are at least as significant as his calls for decentralizing.

Blockchain protocols, similarly, have tended to spawn centralized organizations or companies to oversee their development, although in the name of decentralization their creators may regard such institutionalization as a merely temporary necessity. Crypto-enthusiasts might admit that such institutions can be a feature, not a bug, and design them accordingly. If they want to avoid a dictator for life, as in Linux, they could plan ahead for democracy, as in Debian. If they want to avoid excessive miner-power, they could develop a centralized node with the power to challenge such accretions.

The challenge that entrepreneurs undertake should be less a matter of “How can I decentralize everything?” than “How can I make everything more accountable?” Already, many people are doing this more than their decentralization rhetoric lets on; a startup’s critical stakeholders, from investors to developers, demand it. But more emphasis on the challenge of accountability, as opposed to just decentralization, could make the inevitable emergence of centralization less of a shock.

What’s so scary about trust?

In a February 2009 forum post introducing Bitcoin, Satoshi Nakamoto posited, “The root problem with conventional currency is all the trust that’s required to make it work.” This analysis, and the software accompanying it, has spurred a crusade for building “trustless” systems, in which institutional knowledge and authority can be supplanted with cryptographic software, pseudonymous markets, and game-theoretic incentives. It’s a crusade analogous to how global NGOs and financial giants advocated mechanisms to decentralize power in developing countries, so as to facilitate international investment and responsive government. Yet both crusades have produced new kinds of centralization, in some cases centralization less accountable than what came before.

For now, even the minimal electoral accountability over the despised Federal Reserve strikes me as preferable to whoever happens to be running the top Bitcoin miners.

Decentralization is not a one-way process. Decentralizing one aspect of a complex system can realign it toward complex outcomes. Tools meant to decentralize can introduce novel possibilities — even liberating ones. But they run the risk of enabling astonishingly unaccountable concentrations of power. Pursuing decentralization at the expense of all else is probably futile, and of questionable usefulness as well. The measure of a technology should be its capacity to engender more accountable forms of trust.

Learn more: ntnsndr.in/e4e

If you want to read more about the limits of decentralization, here’s a paper I’m working on about that. If you want to read about an important tradition of accountable, trust-based, cooperative business, here’s a book I just published about that.

Photo by CIFOR

Economics back into Cryptoeconomics

Republished from Medium.com

Dick Bryan, Benjamin Lee, Robert Wosnitzer, Akseli Virtanen*

The mounting literature on cryptoeconomics shows an interesting but also alarming characteristic: its underlying economics is remarkably conventional and conservative.

It is surely an anomaly that many people who have gone outside the mainstream to disrupt and develop new visions of the future and economy so readily adopt the conventions of the ‘dismal science’.

The problem is that this orthodox economics blocks the real potential of cryptoeconomics, and of cryptographically enabled distributed economic-social systems, to facilitate the building of a radically alternative politics and economics.

At the core of this view are two realizations:

1. In their money role, cryptotokens can be an alternative unit of account, not just a means of exchange. They can invoke a new measure of value, not just facilitate new processes of trade. As such, they can have a ‘backing’ in the value of the output they facilitate, rather than presenting simply as tools of speculative position-taking.

2. In their ownership role, they can be derivatives (purchases of risk exposure, not just asset ownership) designed so that people risk together, not individually. They can invoke collective approaches to dealing with risk and upside, not individualistic ones: they can enable risking together.

In this review we first follow the different framings of ‘economics’ and their implications to cryptoeconomics. The analysis then explores the notion of ‘fundamental value’: how it is utilised in the cryptoeconomics literature and how a different framing of ‘economics’ enables distinctive insights on how to creatively develop the idea of token value being founded in ‘fundamental value’. Our objective is to show how a critical reframing of economics enables token integrity to be explained and managed.

To help you navigate, here are the contents of what follows:

1. Which economics?

2. The limited working definitions of cryptoeconomics

3. The Hayekian turn: the integrity of market processes

4. Cryptotokens: means of exchange or units of account?

5. The valuation of cryptotokens

A. Developing the ECSA unit of account

B. A historical digression, but of some significance

6. Contemporary lessons from the historical digression

7. Once more on fundamental value: the ECSA approach

8. The derivative form: buying exposures to an exponential future and a big put

1. Which economics?

Economics is a broad and contested discipline. It is also an old one, with Adam Smith’s Wealth of Nations almost 250 years old, and Karl Marx’s economics 150 years old. Its dominant discourse is ‘neo-classical’ economics, dating from the late 19th century. It has, of course, significantly evolved over the past century, but the current dominant thought, directly based in that neo-classical turn, remains relatively coherent. The discipline is dominated by an orthodoxy, quite unlike the rest of the social sciences, which are conceived in theoretical and methodological debate.

The dominance of neoclassical economics is not unchallenged. There are criticisms from both the ‘right’, in the name of libertarianism (e.g. Hayek) and from the left (both a statist left like Keynesianism and an anti-capitalist left like Marxism). In that sense it is hardly surprising that there is no definition of ‘economics’ that is generally agreed.

Here are a couple of ‘standard’ definitions, of the kind that come from Economics 101 textbooks, which to some degree cover multiple positions in these debates:

1. Economics is the study of allocating scarce resources between alternative uses.

This is a definition that points to price formation and decision making. It opens up agendas of optimisation. It is framed to privilege ‘microeconomics’ (the workings of particular markets), not ‘macroeconomics’ (the totality of economic processes, understood as more than the sum of market processes).

Here is another one:

2. Economics is the study of the processes of production, distribution and consumption of goods and services.

This is a definition with wider social meaning and context. It is not specifically about markets and certainly not focussed on optimisation. Its focus is on the social and on systems, not the individual. It is more likely to be historical, and less mathematical, than the economics created under the first definition. It implicitly acknowledges that the economic is difficult to disentangle from other facets of social life.

By contrast, the first definition isolates ‘the economic’ to a greater extent by focussing on price formation and decision making. In most of the 20th century, this was achieved by treating economic agents as autonomously rational, later enhanced by game-theoretic strategic rationality and later still challenged somewhat by propositions of ‘systematic irrationality’ (behavioural economics). These developments have enabled economics to become mathematically advanced and subjectable to formal modelling.

Both definitions have something to say to the token economy community, where we can recognize concurrently a potential epochal change in the way of doing economic activity and the new potential of mathematical modelling. But the two emphases need to be kept consciously in balance.

Perhaps this recognition is part of the success of Ethereum, where we can see each style of economic focus in play.

Ethereum inventor Vitalik Buterin, consistent with the first definition of economics, has defined cryptoeconomics as about:

  • Building systems that have certain desired properties
  • Using cryptography to prove properties about messages that have happened in the past
  • Using economic incentives defined inside the system to encourage desired properties to hold into the future.

Ethereum developer Vlad Zamfir, embracing more the second definition (and citing Wikipedia), says that cryptoeconomics might be:

“A formal discipline that studies protocols that govern the production, distribution, and consumption of goods and services in a decentralized digital economy. Cryptoeconomics is a practical science that focuses on the design and characterization of these protocols.”

But it is apparent that, in the broad scoping of cryptoeconomics, it is the first definition that is the focus. Sometimes called ‘token engineering’, it scopes systems of incentives that can be applied to ‘rational’ agents. The cost of this framing is to both limit the social and economic significance of a cryptoeconomics, and to create greater possibility for token failure in practice.

2. The limited working definitions of cryptoeconomics

Specifically, the focus in cryptoeconomics on reducing transactions costs and creating individual incentives to operate optimally is in danger of not just neglecting wider social issues of production, distribution and consumption of goods and services, but of building a framework that actually makes impossible a systematic engagement with the wider issues.

If you google cryptoeconomy/cryptoeconomics, the sources that appear have a remarkable consistency. The various blogs/primers/newsletters start with almost the same sentence. They break the term ‘cryptoeconomics’ into its two component elements. They explain processes of cryptography with some precision, but when it comes to explaining the associate economics, the depiction is remarkably narrow. For example:

“Cryptoeconomics comes from two words: Cryptography and Economics. People tend to forget the “economics” part of this equation and that is the part that gives the blockchain its unique capabilities. . . . Like with any solid economic system, there should be incentives and rewards for people to get work done; similarly, there should be a punishment system for miners who do not act ethically or do not do a good job. We will see how the blockchain incorporates all these basic economic fundamentals.” (Ameer Rosic’s ‘What is Cryptoeconomics: The ultimate beginners guide’.)

Similarly:

“Cryptoeconomics . . . combines cryptography and economics in order to create huge decentralized peer-to-peer network. On the one side, the cryptography is what makes the peer-2-peer network secure, and on the other side, the economics is what motivates the people to participate in the network, because it gives the blockchain its unique characteristics.” (Introduction to Cryptoeconomics through Bitcoin)

The limited framing of economics is, perhaps, because the world lacks people with a background in both cryptography and (a broad) economics. Cryptoeconomics is, perhaps, frequently being projected by people who are highly qualified in programming and engineering, but often self-taught in economics. We thought it was funny when Nick Szabo tweeted some time ago about economists and programmers:

“An economist or programmer who hasn’t studied much computer science, including cryptography, but guesses about it, cannot design or build a long-term successful cryptocurrency. A computer scientist and programmer who hasn’t studied much economics, but applies common sense, can.”

In a sense he is absolutely right, but on the other hand, you do not create anything new from doxa (common sense); you merely repeat the same. The idea that the economy (society) is common sense will create an economy that looks like a computer, taking the existing power structures as given. In good economics, the issue of power, who holds it, and how its use is governed, is the key issue.

And it’s not just the bloggers and tweeters who advance this simple economics. Within the academy, there is the same sort of emphasis emerging, including from qualified economists.

The MIT Cryptoeconomics Lab presents a couple of papers that centre on transaction costs and networking. For example, in “Some Simple Economics of the Blockchain” Christian Catalini and Joshua Gans contend as their central proposition:

“In the paper, we rely on economic theory to explain how two key costs affected by blockchain technology — the cost of verification of transaction attributes, and the cost of networking — change the types of transactions that can be supported in the economy.

. . . The paper focuses on two key costs that are affected by blockchain technology: the cost of verification, and the cost of networking. For markets to thrive, participants need to be able to efficiently verify and audit transaction attributes, including the credentials and reputation of the parties involved, the characteristics of the goods and services exchanged, future events that have implications for contractual arrangements, etc.”

The Cryptoeconomics research team at Berkeley is another example. Zubin Koticha, Head of Research and Development at Blockchain at Berkeley, begins his ‘Introduction to blockchain through cryptoeconomics’ like this:

“Although Bitcoin’s protocol is often explained from a technological point of view, in this series, I will convey the incentives existing at every level that allow for its various comprising parties to interact with cohesion and security. This study of the incentives that secure blockchain systems is known as cryptoeconomics.”

It is important to be clear here. Our objective is not a critique of these specific contributions: they may well be exemplary expositions within their chosen agenda. Our objective is to say that if we limit the conception of cryptoeconomics to these framings, then we can imagine and theorise cryptoeconomics only in the language and grammar of optimized individual transactions and incentives. Programmers too should understand what that means. The issues of production, distribution and consumption of goods and services — the bigger picture issues — slide off the analytical agenda. They can’t even be expressed in this grammar.

3. The Hayekian turn: the integrity of market processes

For some, this slide is most welcome, for they see the world in terms of interacting individuals and markets as both an efficient and a moral mode for individuals to engage. If we attach an economics and philosophy to it, the most obvious is Friedrich von Hayek. Hayek was a relatively marginal figure in economic theory and policy until his ideas were embraced by UK prime minister Margaret Thatcher. Hayek was an admirer of markets and prices as modes of transmitting information, arguing they generate spontaneous self-organization. He was also an advocate of limited roles of government in money creation and management, and in social policy too, citing what Milton Friedman later depicted as ‘the tyranny of the majority’ as the danger of government interventions. In 1976 he published a book called The Denationalization of Money, arguing that governments messed up money systems when they intervene, and we would be better off with private, competitively driven monies.

There is certainly a strong tradition in the blockchain community that would confirm this Hayekian view. But it is important that we do not fall into this discourse by accident. It is not the role of this text to debate this or any specific philosophy of economics; the point is that there is a form of Hayekian economics, with its appeal to individuals and incentives, that seems to resonate with people in cryptography. But there are more complex, detailed versions of this theory that are not reducible to these populist framings. Recall in this context that while Hayek was an opponent of state money, he did not at all advocate that money should be freely issued. He believed that money should reflect the ‘real economy’, and that its quantity and value should be tied to it. In the 1930s and ’40s debates about the post-WWII global monetary system, Hayek, following von Mises and others, argued against the Keynesian proposal for a state-backed global money. The alternative he supported was that the system should be backed by reserves of basic commodities (lumber, coal, wheat, etc.). This requirement seems to be ignored by many cryptoeconomic commentators who invoke the relevance of Hayek to advocate non-state ‘currencies’ without material backing. Yet the issue of token backing is very important, and we consider it a little more below.

For the non-Hayekians, there remains the option of a tradition of neo-classical economics that embraces optimisation, transaction costs, and incentives, but also pays more attention to the limitations of market solutions. A significant number of Nobel Prizes for Economic Science in the past 30 years have been awarded for engagement with these sorts of problems. It all points to the proposition that markets do not work in a simple, idealised way.

Neoclassical economists identify two broad limitations of market solutions. One is ‘imperfect markets’, where the capacity to secure forms of control over a market generate returns above the norm. Historically, this issue has focussed on the inefficiencies of monopolies and oligopolies. More recently, attention has been paid to asymmetrical information, and especially the fact that sellers generally know more about a commodity than buyers. (Joel Monegro’s ‘Fat Protocols’ is in this tradition, engaging what sorts of control at what point of the stack/value chain generate best long-term returns.)

The other factor in the neo-classical approach is the condition of ‘market failure’: where markets cannot effectively allocate prices because collateral costs and benefits are not borne by individual producers and traders. Hence there is in neo-classical economics greater engagement with the roles of government in overcoming market failure than is found in Hayek, albeit that there is also debate whether the cure is worse than the disease. Whether the computational systems built on smart contracts can significantly diminish market failure stands as a moot point.

There is one more recent point of challenge to neo-classical economics that is relevant to cryptoeconomics. It is work brought to prominence by Michel Callon and, in English and in relation to finance, by Donald MacKenzie, who describe economic models as ‘performative’: essentially, they make the world, they don’t describe it. This approach casts economics as a prescriptive discipline, operating in the domain of ‘ought’ statements rather than ‘is’ statements. It warrants mentioning here because it already hints at the potentiality of cryptoeconomics: when used more radically (not only to repeat the most orthodox economic beliefs), as we try to show below, cryptoeconomics opens to us the economy itself as a design space. We need to recognize the complexity of social and economic dynamics in token design, and make sure that the ‘social’ receives as much analytical attention as the formal, technical issues. If we design ‘ought’ systems that understate the complexity of the social, or do not understand their effects, it is likely that governance processes will be inadequate.

Much of the rest of economics covers a wider range of views, but a smaller number of economists. But it is here that the broader, more cultural and socio-historical questions come to the fore.

We want to focus on two broader issues, still very ‘economic’ in framing, that may be sufficient to capture the flavour of these broader agendas. They both appeal to broader social perspectives in cryptoeconomic analysis, but embody rather different political agendas.

One comes from treating cryptotokens as not just a new means of exchange (the transaction view) but also a new unit of account (a production and distribution view); the other is played out through debates about the valuation of cryptotokens.

4. Cryptotokens: means of exchange or units of account?

Going back to our definitions of economics, the first one — about optimization, incentives and transaction costs — conceives of tokens as either means of exchange or, in the case of utility tokens, types of commodity futures contracts (rights to future conversion into commodities).

They are that, but they also are more than that, when framed in the context of the second definition of economics. From the perspective of the second definition we can see cryptotokens as providing the possibility for new units of account, and hence new ways to measure the economy.

In the first definition, the answer to the question ‘what counts’ in the economy is answered by reference to the discipline of market calculus. In the second definition, what gets counted as ‘production’ and ‘consumption’ is more open ended. What is counted — and what is valued — opens as a design question.

It has been well established that market criteria are blind to some critical economic processes. Roughly (for it is complex to specify) anything not produced for sale is systematically excluded.

In the mainstream capitalist world of fiat currencies, incorporating these excluded forms of production and consumption has been a virtually insurmountable challenge. There we see the dominance of a culture of production for profit and a history of data collection based on that principle. Beneficial things that do not make revenue are difficult to measure and hence to incorporate.

Of course we are not the first to recognise this limitation. The neglect of household production, both nurturing activities in the home and the economic activities of peasant economies, are widely-recognised limitations. And within neo-classical economics there is debate about how far into wider social analysis the notion of externalities extends (and how they might be priced). In a similar vein, the appeal of ideas like ‘triple-bottom-line accounting’ and ‘ethical investing’ also embrace alternative visions of counting. But, and this is critical, they all presume the ontological primacy of profit-centred measurement: they are critiques of and qualifiers to that system and rarely present alternative modes of calculation.

Cryptotokens enable us to re-open this measurement question. Cryptotokens as means of exchange enable us to trade in new ways. Cryptotokens as new units of account enable us to measure output (what is value and how is it produced) in new ways.

This points us already to one of our key insights, to which we will return a little later: we already know that the next value production layer — of the era of decentralized open source data — has to do with governance (more welcoming and better governed crypto networks will be valued at a premium due to their reliability of social inclusion in the decision-making process), but also with ways of belonging, ways of sharing stakes, risks and upside. The organization of ‘risking together’ — or what we call an economic space — now becomes the actual value creation layer: it is a new value form, very different to the commodity form, that basic economic cell of society which makes social relations between people (for capital is a social relation) appear as just relations between things. Basically, you will now compete with different community-economy-governances. This explicitly relational, social logic of the new value form is precisely what we try to capture by talking about it as a network derivative below. And it is to express such social-economic organizations that we are working on the Space organizational grammar and development environment (see the ECSA Tech Stack below).

The challenge in the cryptoeconomy is to open discussion about how we understand and measure this broader conception of ‘production’ and ‘consumption’, consistent with our aspiration of incubating not just new ways of organizing, but also the production of new things and new (social, political, aesthetic, organizational, environmental…) relations.

But it is critical here that this imagining of new ways of doing economy and economics is not just fuzzy and feelgood: we need ways to measure and to socially validate these new horizons. It means that we cannot, in the first instance, reduce all forms of production to a monetary price. We can treat monetary price as one index of measurement (for a price is merely an index, since the base unit of measure is arbitrary with respect to the thing being measured), but we will need other indices of production too, targeting measurement of the different ways in which goods, services and intangibles get acknowledged socially — or become socially quantifiable, as Gabriel Tarde, perhaps one of the most relevant economic thinkers for crypto, originally put it. Think of measurements of replication, imitation, iteration, social inclusion and recommendation.
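A speculative sketch of what such multi-index measurement could look like (the index names and data structure are our invention, in the Tardean spirit just described): the point is to return a vector of social acknowledgements rather than collapse everything into one monetary number:

```python
# Hypothetical multi-index measure of a contribution's social
# acknowledgement; price is just one index among several.
from dataclasses import dataclass

@dataclass
class Contribution:
    price_paid: float      # monetary index (one index, not the measure)
    replications: int      # how often the work was copied or forked
    imitations: int        # derivative works it inspired
    recommendations: int   # explicit social endorsements

def acknowledgement_indices(c: Contribution) -> dict[str, float]:
    # deliberately a vector, not a scalar: collapsing everything to a
    # single monetary number is exactly what is being resisted
    return {
        "monetary": c.price_paid,
        "replication": float(c.replications),
        "imitation": float(c.imitations),
        "recommendation": float(c.recommendations),
    }

essay = Contribution(price_paid=0.0, replications=120, imitations=14,
                     recommendations=300)
print(acknowledgement_indices(essay))  # valuable even at zero price
```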

In cryptoeconomics, especially where the focus is on wider social agendas, the question of what to measure and how to measure it opens up a critical research agenda which will require ongoing attention and resource allocation. It is readily apparent, even in the mainstream of economics and accounting, that the valuation of ‘intangible assets’ is a critical problem. It always has been a problem, for such assets can’t be measured like plant and equipment and real estate; but as intangible capital (brands, intellectual property, etc.) now makes up the overwhelming proportion of the assets of the world’s largest companies (Facebook, Apple, etc.), the lack of appropriate tools for valuation has emerged as a conspicuous accounting problem.

Most of the assets inside cryptoeconomics are also likely to be predominantly ‘intangibles’, so the problem is shared. The difference is that new tokens open up possibilities for new codes and agendas — new grammars — of measurement. It is quite conceivable that research on measurement in relation to crypto units of account may turn out to be invaluable to mainstream accounting too. (See Valuation Crisis and Crypto Economy)

Developing alternative measures is not a simple process, and there are certainly challenges, most notably of measurement across indices and of dealing with gaming of the measurement system. But we believe these challenges are significant and definitely worth taking up. It is important that resources are put into exploring new measurement agendas.

5. The valuation of cryptotokens

This issue is of importance because it gets to the heart of the question: can markets effectively price tokens as more than speculative objects? This was certainly an issue for popular debate in late 2017 when the price of bitcoin spiked. In this context, prominent crypto investor Fred Wilson said:

“In times like this, I like to turn to the fundamentals to figure out where things stand and how I should behave. . . You need to have some fundamental theory of value and then apply it rigorously.”

For those who believe that markets create spontaneous order, the search for something ’fundamental’ is a non-question: price captures all information, it is an expression of supply and demand and finds its own level. So the very act of posing the question of an ‘underlying’ or ‘fundamental’ value is to move outside the Hayekian view. It is to suggest that there is and can be a value to cryptotokens beyond current price. It takes us beyond that first definition of economics in terms of markets and incentives and into some wider, social and historical issues of economics. (See Valuation crisis and crypto economy; and Whose stability? Reframing stability in the crypto economy)

The issue under consideration here is not whether the measurement of ‘fundamentals’ is a good guide to trading strategy in cryptomarkets. (There is a standard debate in trading strategy about fundamentals vs. technical analysis of patterns of price movements. Warren Buffett stands out as an advocate of fundamentals analysis.) Nor is it about developing the capacity to forecast an income and expenditure model for the future, important though this is. (See for example Brett Winton, How to Value a Crypto Asset — A Model)

The issue here is how we might measure the crypto economy if not by current token price. It matters because fundamental value points to the longer-term viability of a token and the activities that underlie it, and potentially gives tokens an integrity that will be recognised in wider capital markets.

Fundamental value is not simply a long-term average price around which the spot price varies. It is a value that can be measured by criteria which link to the capacity of an asset to produce new value. In neoclassical equilibrium theory, the ‘efficient markets hypothesis’ postulates that long-term market price will spontaneously gravitate to the valuation of this capacity to generate new (future) value. But this is a view specific to the discourse of neoclassical economics.

The challenge of fundamental value is to find a mode in which to measure the current value of an asset, especially when that value requires projection of the future. It was once a relatively straightforward calculation: a staple (and stable) measure of accounting when manufacturing industry was the ‘model’ corporation and capital was physical (factory sites, machinery, stock). It has become a more challenging issue for accounting since corporate assets became increasingly intangible — intellectual property, brands, goodwill, etc. Some leading accountants claim that the profession is in a crisis because of an incapacity to measure the value of intangible assets. (See Valuation crisis and crypto economy)

So we should recognise that, in a cryptotoken context, the idea of calculating a fundamental value is experimental. We should also be aware that there is a propensity in cryptoeconomics to engage valuation via an over-simplified interpretation of what a token actually is and can do. That is, there may be consensus that tokens are complex, hybrid, novel things. But in the complexity of the valuation process, there is a proclivity to treat them as one thing; in particular as money or as equity, and especially the former.

Some of the debate here occurs via analogy. For example, it can be argued bitcoin could do the credit card provisioning of Mastercard or Visa, so we could estimate a corporate value for bitcoin based on the value of these companies. (See critical evaluation of this by Andy Kessler, The Bitcoin Valuation Bubble, Wall St Journal, August 27, 2017). But that doesn’t really work, because analogies don’t hold. We can’t claim the crypto economy to be different, yet benchmark its value to assets and processes we seek to disrupt.

Another approach, we think laudable in its desire to capture cryptotokens as derivatives (see more below), contends that the Black-Scholes option pricing model can be adapted to explain token values. (See J. Antos and R. McCreanor, ‘An Efficient-Markets Valuation Framework for Cryptoassets using Black-Scholes Option Theory’)

The essence of this proposition is that cryptotokens hold exposure to a future of potentially such monumental significance that cryptoassets themselves are call options on the utility value of what that cryptoasset might someday provision. The volatility of token values may then be seen as an efficient reflection of a rational estimation of the probability of realising some real utility value of a future — perhaps distant future — product that results from current cryptoassets.

Framed within the efficient markets hypothesis (Eugene Fama), this approach does carry the critical proposition that ‘efficient’ outcomes are a reflection of some ‘fundamental value’. Outside that assumption it is not really persuasive as an explanation of fundamental value. What is valuable in its approach, however, is that it focuses on changes in estimated variables, not on the static value of underlying assets. In this way the approach does capture a derivative dimension in valuation (an issue addressed more below).
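To make the option framing concrete, here is a minimal sketch (our illustration, not the authors’ own model; all numbers are hypothetical) of pricing a token as a European call on the utility value it might someday provision, using the standard Black-Scholes formula:

```python
from math import log, sqrt, exp
from statistics import NormalDist

def bs_call(spot, strike, vol, rate, years):
    """Black-Scholes price of a European call option."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * years) / (vol * sqrt(years))
    d2 = d1 - vol * sqrt(years)
    N = NormalDist().cdf
    return spot * N(d1) - strike * exp(-rate * years) * N(d2)

# Hypothetical numbers: today's estimate of the utility value a token
# could someday provision (spot), the cost of realising it (strike),
# and the high volatility typical of crypto markets.
print(bs_call(spot=1.0, strike=5.0, vol=1.2, rate=0.02, years=5))
```

On this framing, the extreme volatility typical of token markets is precisely what gives a deeply out-of-the-money ‘utility future’ a positive present value.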

Some innovative agendas of measurement are coming via the old quantity theory of money proposition, expressed as ‘the equation of exchange’. The formula states:

MV=PQ

where M = the quantity of money in circulation,

V = the velocity of circulation of money,

P = the general price level in the economy and

Q = the quantity of goods and services sold in the economy.
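As a quick numeric illustration of the identity (all numbers hypothetical):

```python
# The equation of exchange as a pure identity: money spent equals the
# money value of what is sold.
M = 1_000_000    # money in circulation
V = 5            # each unit changes hands 5 times a year
P = 2.5          # average price level
Q = (M * V) / P  # implied real output sold: 2,000,000 units
print(Q)
```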

It is worth spending a little time giving context to this formula, both because of its application in the existing literature on cryptoasset valuation and because it is where ECSA too looks to frame fundamental value, albeit in a way different from current debates.

The equation MV = PQ comes from a long economic lineage, mostly identified with the 18th century Scottish philosopher David Hume. It presents the ‘real economy’ on the right hand side and its money equivalent on the left hand side. Its lineage is long, but its functionality in economics is contested (e.g. is it merely an identity; can it be read causally, and if so from right to left as well as left to right?). In late 20th century policy application it was used to focus on the relationship of M and P, and to argue that states should be passive in economic management, creating just enough money to keep prices stable (with V constant, money supply should expand in proportion to Q). It became popular in the 1970s economics of Milton Friedman (broadly aligned with Hayek). It was called ‘monetarism’ and contended that state fiscal and monetary expansion were not solving recession, but causing inflation.

Monetarism became central bank policy orthodoxy in many Anglo countries for just a brief period in the early 1980s, expressed as ‘money supply targeting’. It was quickly abandoned and one of the reasons, with new significance for the world of cryptotokens, was that the state’s various definitions of money (cash, trading bank deposits, etc.) moved in different directions: there was no single state money to be targeted. More recently, the non-inflationary impact of US quantitative easing (so far) may be some further indication of the practical limits of the equation in state policy formation. QE also raises the challenge that the equation may only work for goods and services outputs and prices, but not for financial assets. Indeed, it is ambiguous as to which side of the equation liquid financial assets should be located: are they commodities (RHS) or money (LHS)?

With that brief background, let’s take a quick look at the use of the quantity theory of money in cryptotoken valuation.

The initial figure of note here is Chris Burniske, who observes that tokens have both asset and money attributes and are currencies in the context of the programs they support. But in that role they do not generate cash flow, so they can be valued by discounting not cash flows but a projected future fundamental value. So the analysis turns to utility values (Cryptoasset Valuations).

For this purpose Burniske re-defines the variables of the equation:

M = size of the asset base

V = velocity of the asset’s circulation

P = price of the digital resource being provisioned (note: not the price of crypto assets)

Q = quantity of the digital resource being provisioned (note: not the quantity of crypto assets).

He then solves for M, which enables an individual token valuation.
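A minimal sketch of this solve-for-M procedure (our illustration of the general method, with purely hypothetical numbers rather than Burniske’s own figures):

```python
# The token network provisions a digital resource; we solve MV = PQ
# for M and divide by the token float to get a per-token value.
resource_price = 0.05      # P: price per unit of the digital resource (USD)
resource_quantity = 2e9    # Q: units of the resource provisioned per year
velocity = 10              # V: times each token changes hands per year

monetary_base = resource_price * resource_quantity / velocity  # M = PQ / V
tokens_outstanding = 50e6  # hypothetical float

print(monetary_base / tokens_outstanding)  # implied value per token: $0.20
```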

This is indeed a novel approach, and for good reason it opened up a significant debate. But it does have some problems:

  • It ends up displacing the valuation problem from the token to the valuation of the digital resources being provisioned, and the ambiguity of how that is to be measured.
  • Moving V to the RHS so as to solve for M turns the equation from being a logical identity (that money values equal commodity values) into a historical proposition of individual variable valuation. (If, for example, velocity doubles, it doesn’t thereby halve the size of the asset base.)
  • In wider discussion there is recognition of the problematic valuation of V (or rather, of the volatility of V). There is a literature addressing the question of velocity. There are debates here, but, as summarised by Alex Evans, its common proposition is that:

“tokens that are not store-of-value assets will generally suffer from high velocity at scale as users avoid holding the asset for meaningful periods of time, suppressing ultimate value.”

A problem with the valuation literature seems to be that it conflates the equity dimension and the monetary dimension of tokens. The fact that (a) tokens will be turned over rapidly because they are not a good store of investor value is different from the issue of (b) tokens turning over in their use as a means of exchange inside projects/businesses. The latter matters; the former does not. To give attention to the former would be like saying the turnover of corporate equities impacts the long-term price of corporate equities. The point: we need to re-think the meaning of velocity in a token world. The underlying issue here is that tokens blur the categories of equity and money, and the velocities of these attributes have different drivers.
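A toy illustration of the velocity problem (hypothetical numbers): for a fixed level of provisioned resources PQ, the implied monetary base, and hence per-token value, shrinks as tokens turn over faster.

```python
# For fixed PQ, implied M = PQ / V falls as velocity V rises, which is
# the sense in which high velocity "suppresses ultimate value".
PQ = 1e8                    # hypothetical annual resource value (USD)
tokens_outstanding = 50e6
for V in (5, 10, 20, 40):
    M = PQ / V
    print(V, M / tokens_outstanding)  # per-token value halves as V doubles
```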

We are interested in this approach and its criticisms not for the purpose of disproving the approach — for probably any proposal in this domain is somewhat easy to critique. The point is that cryptoaccounting, like mainstream accounting, simply doesn’t have good tools to measure in this domain. But it is an area we should explore, and it requires creativity, such as that shown by Burniske and the debate to which his work has given rise. Indeed, we think the novelty of cryptoassets gives us opportunities to invent valuation procedures that could well be of benefit to the mainstream accounting profession.

We have been working intensively on the ECSA token valuation system, which can be posed as an engagement with this debate, returning to the original meaning of

MV=PQ

as the depiction of an economy which balances the ‘monetary side’ of an economy (MV) with the so-called ‘real economy side’ (PQ). From an ECSA perspective, the measurement process means that P is too limited a category. We want to treat price as just one index of ‘value’ measurement amongst a range. So we respecify:

MV=I(1-x)Q

where

M = the quantity of tokens issued in the new economic space (ECSA bootstrapped ecosystem of new value forms)

V = the velocity of circulation of tokens within new economic space (ECSA bootstrapped ecosystem of new value forms)

I(1-x) is the range of indices of valuation, of which price is just one

Q is the quantity of output (tangible and intangible) produced in the new economic space (ECSA bootstrapped ecosystem of new value forms).

If we measure the new economic space (ECSA bootstrapped economy) in the way described, we can make simple use of this formula, or at least of its underlying sentiment, both economic and social. For ECSA itself, this mode of measurement gives a means both to define fundamental value and to set some governance agendas.

As an identity, the LHS and RHS are always equal. The monetary policy position of ECSA is that the RHS (the total value of output within the new economic space/ECSA bootstrapped ecosystem) will drive the LHS (token issuance qualified by velocity). In the distributed system of offers and matches ECSA is building, we will be able to develop significant data sets of M (distributed token issuance) and V (where we can distinguish clearly between offers that are about acquisition of inputs for production and those used for mere token exchange), and we will develop empirical measurement to calculate I(1-x) and P. We think that in the offer-based new economic space, token issuance can be governed in a distributed way to ensure MV=IQ, and the economy can be run on non-inflationary tokens. The key here is to concurrently have internal ‘working tokens’ and a ‘market token’ traded on the capital market. The equation of exchange should only apply to the internal ‘working token’, not the ‘capital market token’, for it is the internal token which articulates with production and distribution. (We will return to the issue of how the capital market token and the value graph with its working tokens are bridged in the next texts of this series.)

A question is, of course: how do we reduce I to a single index number? The answer has two layers. One is that once we reach a critical mass of offers and matchings, market processes will themselves drive valuation. The second is that these same offers and acceptances will generate significant data which can then be transformed into indices for valuing goods and services (outputs and performances). How these indices will be compiled is one of our most interesting research areas at the moment — it will come out of an empirical processing of large data; it cannot be known a priori; nor should it be presumed to be fixed in value.
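Purely as an illustration of what ‘reducing I to a single index number’ might look like (the indices, weights and weighting scheme below are all hypothetical assumptions, since the weights cannot be fixed a priori):

```python
# Reducing the range of valuation indices I(1-x) to a single composite.
# All values are invented for illustration only.
indices = {
    "profit_price": 1.00,     # the conventional price index
    "replication": 1.35,      # how widely an output is copied/reused
    "social_inclusion": 0.90,
    "recommendation": 1.10,
}
weights = {
    "profit_price": 0.40,
    "replication": 0.25,
    "social_inclusion": 0.20,
    "recommendation": 0.15,
}

composite_I = sum(indices[k] * weights[k] for k in indices)
print(composite_I)  # a single index number standing in for P in MV = IQ
```

In practice the weights would themselves be re-estimated from offer and match data as they accumulate, rather than set by hand.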

There are some elements in this proposition that warrant further explanation. They will be taken up in a later section, but in the current context the critical point is that we believe the big data generated by agent offers and acceptances provide a critical information source for framing fundamental value within a (broadly) MV=PQ framework.

In contemporary society, ‘big data’ are often depicted, and rightly so, as socially intrusive and manipulative. But we think that in a token economy context of transparency and decentralized open source data, they are critical for organization and inclusiveness. Further, without a reliable runtime and a grammar that can help us navigate and operate this space, and build knowledge derivatives (indexes) of it, we are today without politics and economics: incapable of speaking about, grasping, or intervening in the processes and future of our lives. This is the technology development task we are taking on in ECSA (see the ECSA Tech Stack).

 

We’ve always been impressed by Galileo Galilei, Leonardo da Vinci and other renaissance scientists who invented the experiments and instruments to navigate, understand and measure the newly opening space-time reality — the microscopes and telescopes to reveal micro- and macrocosms, inclinometers to determine latitudes, thermoscopes to show change of temperature, barometers to reveal atmospheric pressure, nautical instruments, experimental methods to understand invisible phenomena, velocity, acceleration, gravity. We will need to do the same now for the new economic space-time. Just like the birth of perspective in art, we will introduce perspective in economy. It will be a renaissance.

We will return to the fundamental value issue shortly, for how fundamental value is to be indexed turns on the measuring role, and the governance role, it is called on to play within a token system. The focus is therefore on the requirements of the new economic space, but we get there in a way that has ramifications for wider token systems.

Galileo’s original telescopes and the lens he gave to the Medici, at the Museo della Storia della Scienza in Firenze (Photo by Akseli Virtanen)

6. Developing the ECSA unit of account

Part A: A historical digression, but of some significance

The problem for our radical measurement proposal is that all the language of money, markets, prices and profit is dominated by a grammar that equates money with the state, markets with a profit-centred mode of calculation, and profit with a surplus defined by reference to the extraction of individual benefit. Cryptoeconomics has been strong in challenging the first of these, but less effective with the latter two. They need to be challenged too.

John Maynard Keynes, the economist most associated with the principles of state issuance and management of fiat money in advanced capitalist economies, said in his 1930 Treatise on Money:

“The age of chartalist or State money was reached when the State claimed the right to declare what thing should answer as money to the current money of account — when it claimed the right not only to enforce the dictionary but also to write the dictionary. Today all civilised money is, beyond possibility of dispute, chartalist.”

Ninety years on, cryptotokens are the counterfactual to this proposition, but Keynes’ proposition about the State writing the dictionary is right. Part of cryptoeconomics is to challenge the dictionary: to open up new ways of thinking money and price.

That challenge is broad, but in this context, let us go to Keynes and Hayek, and read them via the innovation of cryptotokens. The object is quite specific: to note how certain of their categories that are now assumed ‘theoretical’, indeed axiomatic (the ‘dictionary’), are actually historically specific and contingent and should be challenged in the light of cryptotoken development. Moreover, within each of these significant economists, we can find the derivative dimension that is repressed in the interests of conveying a culture of theoretical certainty.

We start with Keynes.

His central proposition, conceived in the years either side of the Great Depression, was that nation states must manage national economies, for markets will not gravitate to full-employment stability. Central here is the idea that the state defines money and closely manages the financial system. In Chapter 17 of the General Theory he challenged his own premise and introduced the hypothetical idea that money may not be unique in its economic characteristics:

“The money-rate of interest — we may remind the reader — is nothing more than the percentage excess of a sum of money contracted for forward delivery, e.g. a year hence, over what we may call the “spot” or cash price of the sum thus contracted for forward delivery. …Thus for every durable commodity we have a rate of interest in terms of itself, — a wheat-rate of interest, a copper-rate of interest, a house-rate of interest, even a steel-plant-rate of interest.… Money is the greatest of the own-rates of interest (as we may call them) which rules the roost.”

Keynes, in essence, depicts money as the greatest own-rate of interest because (a) it does not itself produce a use value (like, say, wheat or copper), so it does not get diverted to those uses (it is exclusively ‘money’); (b) there is no issue of wastage; (c) it is the most liquid asset; and (d) its quantum is managed.
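A toy rendering of Keynes’s own-rates (numbers hypothetical): the rate of interest of any durable asset, measured in itself, is the percentage excess of its forward price over its spot price.

```python
def own_rate(spot, forward):
    """Own-rate of interest implied by spot and one-year-forward prices."""
    return (forward - spot) / spot

print(own_rate(spot=100.0, forward=103.0))  # a money-rate: 3%
print(own_rate(spot=5.00, forward=5.40))    # a "wheat-rate": 8%
```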

Two things are interesting in this depiction. First, these criteria identified by Keynes as integral to the ‘greatness’ of (state) money do not persuasively apply today: indeed, all financial derivatives have the liquidity and fungibility of state money, and cryptotokens are not constrained by nation- (or group-of-nation-) specific acceptability.

Second, Keynes uses the language now associated with derivatives to depict the rate of interest. Money is the underlying of which interest is the derivative. And for Keynes, money is axiomatically state money. Cryptotokens are in this context a put option on the state: the right to sell out of the state’s unit of account.

For Hayek, the origins of his thinking on the social and economic virtues of market processes lie in an early-to-mid 20th century debate with advocates of Soviet-inspired and other variants of central planning, known as the Socialist Calculation Debate. Hayek, following von Mises, argued that central planning, even at its best, has a range of insensitivities to detail: it can only work with highly aggregated, and outdated, data and impose these generalized decisions on individuals. The market, on the other hand, runs by processing decentralized information. It can synthesise complex forms of social and economic information into a single index, enabling economic relations to be conducted in simple and orderly processes.

It is worth quoting Hayek at some length, because what he says resonates with the capacities of a cryptoeconomy:

“It is in this connection that what I have called the “economic calculus” proper helps us, at least by analogy, to see how this problem can be solved, and in fact is being solved, by the price system. Even the single controlling mind [the central planner], in possession of all the data for some small, self-contained economic system, would not — every time some small adjustment in the allocation of resources had to be made — go explicitly through all the relations between ends and means which might possibly be affected. It is indeed the great contribution of the pure logic of choice that it has demonstrated conclusively that even such a single mind could solve this kind of problem only by constructing and constantly using rates of equivalence (or “values,” or “marginal rates of substitution”), i.e., by attaching to each kind of scarce resource a numerical index which cannot be derived from any property possessed by that particular thing, but which reflects, or in which is condensed, its significance in view of the whole means-end structure. In any small change he will have to consider only these quantitative indices (or “values”) in which all the relevant information is concentrated; and, by adjusting the quantities one by one, he can appropriately rearrange his dispositions without having to solve the whole puzzle ab initio or without needing at any stage to survey it at once in all its ramifications.

Fundamentally, in a system in which the knowledge of the relevant facts is dispersed among many people, prices can act to coördinate the separate actions of different people in the same way as subjective values help the individual to coördinate the parts of his plan.”

So price is the condensation of a multiplicity of determinations (to borrow from Althusser). The market can incorporate and process all different forms of information (create knowledge) to create spontaneous order.

“The most significant fact about this system is the economy of knowledge with which it operates, or how little the individual participants need to know in order to be able to take the right action. In abbreviated form, by a kind of symbol, only the most essential information is passed on, and passed on only to those concerned. It is more than a metaphor to describe the price system as a kind of machinery for registering change, or a system of telecommunications which enables individual producers to watch merely the movement of a few pointers, as an engineer might watch the hands of a few dials, in order to adjust their activities to changes of which they never know more than is reflected in the price movement.”

F.A. Hayek, ‘The Use of Knowledge in Society’, American Economic Review, XXXV, No. 4, 1945, pp. 519–30.

This 1940s advocacy of ‘the market’ may stand strong as an alternative to 1940s central planning, but 70 years on the argument is, and should be, different. There are now ‘big data’: access to vast amounts of information to inform individual decisions, and computational capacities to process this information instantly. The ‘imperative’ to have complex variables reduced to ‘price’ no longer holds. Decentralized decision making does not have to articulate simply via price formation in markets. Indeed, one potential of cryptomarkets is to challenge the use of Hayekian price as the decentralized object of calculation.

Hayek says price embodies complex information — it creates knowledge of society — and its great functionality is that it is a simple representation of that complexity. Blockchain and cryptotokens present us with other ways of processing complex information. To get to this 21st century engagement, we can frame Hayek’s analysis in the context of risk and derivatives. There are two dimensions here.

  • In the era of blockchain and big data, and in the language of Gilles Deleuze, we can dividuate knowledge: break it down into its underlying, determining elements (that Hayek thought were too complex to code), but without necessarily aspiring to see those elements combined so as to ontologically privilege the totalised category of ‘knowledge’. Knowledge is a synthetic asset; an assembly of information. Its purpose does not have to be the formation of market price.
  • It follows that, in the era of derivatives, we can think of price as itself a derivative on those underlying forms of information of which price is said to be the condensate. In Hayek’s analysis of ‘The Price System as a Mechanism for Using Knowledge’, ‘price’ is really the strike price on the option on a synthetic asset called ‘knowledge’. (Individuals in this framing hold out-of-the-money options where they are priced out of the market and the return on in-the-money options is what neoclassical economists call the ‘consumer or producer surplus’.)

It follows that if Hayek’s approach can now be framed as an excessive reduction of information to a single, totalising unit of measure (‘price’), we can ask what are the key dividuated forms of information for which price represents a derivative exposure? And once we identify what they are, we can ask how are they important beyond being the ‘underlyers’ of price? How are they important in their own right as knowledge and as social indicators for decision-making?

The significance of these forays into Keynes and Hayek is profound: more so than might at first appear.

For Hayek, if it is possible to deconstruct the information behind price, why is it assumed that the objective of this information is the formation of prices rather than some other unit of measure? We can open up radically different social modes of calculation.

For Keynes, if money is a derivative exposure to the state, we might ask what cryptotokens might be a derivative of? What social and economic modes of organization may be available here?

To take on this significance, we need to back up a bit: to challenge the dictionary. Price is no more than an index: it measures relative values (between products; over time). But it gets treated socially as an absolute social measure. This is central to the idea of trust in a (fiat) money system. But the absolute measure is a social construct, and it can be changed: Francs to Euros; ‘old’ British pounds to ‘new’ (decimal) pounds. The appearance of cryptocurrencies, offering potential for so many different benchmarks for valuation, makes that social construction stark.

So why is ‘price’ currently the privileged index of valuation? Why do we not use (for example) sociality (social impact) as the privileged index of valuation? Or environmental impact?

The answer is that price is a measure that expresses the social and cultural values of a capitalist society. In using price as the privileged measure we assume that (a) production for the market is valued over production for direct use (for the latter generates no price) and (b) we assume that profit is embedded within price (people take things to market so as to make a profit). In a capitalist society, those priorities seem appropriate: they capture the values of that society.

The above is no doubt something of an overstatement. There has long been critique of GDP data, for example that they do not adequately measure the environment, or that commodity output is not a measure of ‘happiness’. Exactly what sorts of indices we might adopt is not the specific issue here. The point is that for a measure to be not just an idealised alternative to GDP but a living indicator used in economic management, it has to have a material grounding in social organization: GDP as we now know it rules as an aggregate measure in a society that privileges production for profit.

Part B: Contemporary lessons from the historical digression

In an economy not driven by profit, but by a different framing of social contribution, we need different modes of measurement. If we stay with GDP-like measures, but add qualifiers (like pricing the environment or pricing care) we have to convert these qualifiers to profit-centred criteria, and then seek to justify their lack of profitability in terms of some unspecified social good. In this framing, non-profitable social goods are inevitably depicted as ‘concessions’ (virtuous loss-makers). We want to frame them not as concessions but as a purpose of doing economic activity. That framing requires the de-throning of profit as ruling the discourse of economic analysis. It is just a perspective — a very restricted perspective — on value.

We think we can borrow from a Hayekian method to re-think the different measures made possible by the creation of tokens as units of account. Price, in its abstracted meaning, is the valuation of something (output) by means of an index. So instead of allocating the generic word ‘price’ to our current (and Hayek’s preferred) index of measurement, let’s call that measure ‘profit price’, signalling the epistemological foundation of the index: it is just a profit-centred perspective on value.

But cryptoeconomics provides a means to measure also in terms of post-capitalist values — in terms of future value forms. And ‘profit price’ is not the index that best captures these value forms and their production. On the contrary, it has rather become a drag on their production. This is perhaps one of the most difficult things to understand about the transformation we are in: when capital becomes ‘intangible’ information and knowledge, there is an irreversible change in the nature of capital itself. It no longer follows the same laws, it does not behave in the same way, and it no longer produces and captures value in the same way.

Think of knowledge. How and where does knowledge get its value? How does knowledge become valuable? How does the value of knowledge appreciate? The value of knowledge appreciates (a) if it is used in multiple ways, if multiple ways of using it are invented: if it is shared, adopted, repeated, imitated, copied — which means that it is always a collective and social process; (b) if it gets subjectively interpreted, accepted, “owned” and invested into; there is nothing more valuable than a knowledge producer who “owns” the production, is capable of sharing in its risks and puts herself at stake in the production; (c) if its producers ‘risk together’, if they continuously self-regulate the relations of its production, share its risks, stakes and upside. These key features of knowledge production — the ability to multiply ways of use, the ability to interpret and give subjective meanings, the ability to self-regulate and continuously modify relationships between actors in its production — are precisely what the old value production did not have. It is a different grammar, where value gets created by (a) sharing, copying and inventing multiple uses (vs. restricting use by proprietary ownership); (b) many interpretations, iterations, variations, “owners” (vs. hiding the source code); (c) collective self-organization, self-governance and the right to fork (vs. external organization and control).

Would it not make sense, then, that another kind of index, say a ‘sociality’ index (‘sociality price’), would better capture such value production, and that markets could value in terms of sociality price rather than profit price? (The objective here is not to give precision to a sociality index — or indices — it is just to frame the credibility of their existence as a social alternative, a different perspective on value.)

Fanciful, many will say! Hayek would have us believe, and many passively accept, that profit price is ‘natural’ and society spontaneously gravitates to an order around the calculation of profit price. Sociality price does not exist, and it would be a complete overturning of social norms to engage it.

So how do you compile a sociality index that is socially recognised and used?

First, we should recognise that Hayek’s notion that price formation in markets is a spontaneous order which happens to society is simply wrong. As Karl Polanyi puts it in the context of the socialist calculation debate: markets were planned; planning wasn’t!

Hayek himself highlights just how much complex and decentralized information goes into compiling price as an index. It’s about costs and market power, regulations, etc. on the cost side and tastes, income, etc. on the demand side. They are different for every individual, for every commodity, and at every point in time. But, and here Hayek is right in relation to current social relations, our society does, in general, bring it all together to create prices and orderly markets.

We respond that this is not a natural order, but a socially organized order. Building alternative sociality indices, and having people value by reference to sociality price, involves a massive cultural as well as economic shift. It would be about production for value rather than production for profit. The key metrics would have to shift from conditions for profitability to conditions for value. It is no more or less logically feasible than valuing in terms of profit price.

Cryptotokens provide us with an opportunity to experiment, for example, with a sociality index of price: indeed, to develop multiple indices of valuation that reflect different social priorities, in the way that profit price reflects capitalism’s priorities. And if total income is determined by payments for contribution to sociality, we have the conditions for a different measure of value, for a different value calculus.

We are preparing to begin trialling this system. To quote ECSA developers:

“Think of the token as a propositional force, a sparkle of potentiality. It is a multi-dimensional docking port that can germinate new forms of relations and value sharing. The token is an occurrence, a virtual (time) crystal expecting its transductive associated milieu. It is an instance of value capture, but only insofar as it acts, simultaneously, as a fugitive relay of anarchic shares collectively modulating and amplifying values. Conceiving of tokens as speculative pragmatic relays is a way of entertaining them as generator of collective effervescence.”

(Erik Bordeleau et al. at the Economic Space Agency “We don’t know yet what a token can do”; see also “On intensive self-issuance”, Economic Space Agency in MoneyLab Reader, pp. 232–233)

7. Once more on fundamental value: the ECSA approach

‘Fundamental value’ is critical in cryptoeconomics for the simple reason that, in economic terms, it is the first benchmark of good governance. That is, it provides the framework whereby the integrity of a token on issue is to be found in the system of production that it enables. So there must be a clearly-stated relationship between token issuance and the level of production.

But that relationship is historically specific, and we need to recognise that old notions of ‘fundamental value’ may have to adapt, and not just in the context of cryptotokens, but emphatically in their context.

So what are the hallmarks of the traditional notion of fundamental value? It is that there can be an ‘underlying’ measure outside of (beneath) the vicissitudes of market exchange. Adam Smith called it ‘natural price’; Marx sought a unit of value in socially necessary labour time. Accountancy sought fundamental value in the long-term productive value of the various assets of the corporation, as distinct from stock price. In each of these cases ‘intangible capital’ breaks the modes of fundamental value measurement. That is important as intangible capital becomes increasingly prevalent on corporate balance sheets, and it is clearly central to cryptoeconomies.

But more broadly, as the nature of capital changes, so the mode of its measurement changes. If we look at the way financial markets have developed over the last 60 years to explain value — using tools like CAPM, VaR, EMH and Black-Scholes — we can see that practical valuation has gone ‘inside’ the market: the idea that ‘real value’ exists outside market transactions is no longer a reflection of how valuation actually occurs.

What it signals is that ‘fundamental value’ has been shifting from stock measures (hours of embedded labour time; machinery and factory sites) to flow measures: from measures ‘outside’ the market to ways of re-interpreting data generated within markets.

A flows approach to fundamental value focusses on sources and uses rather than the valuation of stocks (assets, portfolios, warehouses).

Professor Perry Mehrling, one of the key critics of modern finance in the light of the global financial crisis, quotes Hyman Minsky: “the cash flow approach looks at all units — be they households, corporations, state and municipal governments, or even national governments — as if they were banks.” Cash flows have their uses and sources: “for each agent, every use has a corresponding source and vice versa” and “each agent’s use is some other agent’s source, and vice versa.” If one thinks of agents as banks, then all their assets and liabilities are intertwined. Whatever assets a bank has (i.e. cash) come from elsewhere, just as its liabilities are also “social”. In a stock version of fundamental value, what is valued is assumed social because what is valued is the product of past social processes. Here, in measuring flows, we are focussing on the social-in-process.
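A toy flow-of-funds sketch of this sources-and-uses view (agents and amounts hypothetical): every agent’s use of funds is some other agent’s source, so net positions across the system always sum to zero.

```python
# Minsky-style: treat every agent as a bank and record flows between them.
flows = [
    # (from_agent, to_agent, amount)
    ("household", "firm", 120.0),  # purchases: household use, firm source
    ("firm", "household", 100.0),  # wages: firm use, household source
    ("firm", "bank", 20.0),        # debt service
    ("bank", "household", 5.0),    # interest on deposits
]

balance = {}
for src, dst, amt in flows:
    balance[src] = balance.get(src, 0.0) - amt  # a use of funds
    balance[dst] = balance.get(dst, 0.0) + amt  # the matching source

print(balance)                # net position per agent
print(sum(balance.values())) # always 0.0: uses equal sources system-wide
```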

This approach resonates directly with the interoperability grammar of offers and acceptances undertaken by agents on a blockchain. It suggests a new mode of framing fundamental value; one that is appropriate to the new form of economic interaction generated in a cryptoeconomy.

The MV=PQ approach to fundamental value must sit under this framing. This identity is one critical perspective on fundamental value.

ECSA thinks that this framing sets the conditions on which ECSA must build its diagnostic tools and indeed its units of account (for there need not be just one, albeit there will be commensuration between units, and they may be non-tokenised). We can draw from Hayek the idea that transactional data embody complex, detailed information, from which indices can be compiled: indices that will give access to ‘underlying’ trends and can be compiled in ways that perform the function of units of account.

But, and this is critical, these indices cannot be pre-defined nor locked in: they are themselves to be produced as a recursive exercise of data analysis, and as the data evolve, and as techniques of data analysis evolve, so the (synthetic) indices of fundamental value must evolve. The indices must be in harmony with underlying market processes; not stand in contradistinction.

In an MV=PQ framing, we have to build data from agents’ offers and matches to compile a way to measure Q and V, and ultimately thereby stabilise the relationship between M and P. The indices must emerge from the interrogation of data, and suit the specific purposes of the new economic spaces bootstrapped by ECSA. And those indices must be allowed to evolve over time, both by processes of refinement as data get richer and by processes of transformation as notions of social value evolve within the new economic space.
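As a sketch of what a recursively produced index could look like (the update rule here, a simple exponentially weighted blend, is our assumption for illustration, not a method ECSA has specified):

```python
def update_index(index, observation, alpha=0.1):
    """Blend the latest observation from offer/match data into the running index."""
    return (1 - alpha) * index + alpha * observation

index = 1.0
for obs in (1.02, 0.98, 1.10, 1.05):  # hypothetical offer/match readings
    index = update_index(index, obs)
print(index)  # the index evolves as the data evolve, never fixed a priori
```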

This on-going process of developing indices will be one of the critical performances of ECSA. It is critical to the integrity of new economic spaces, but we aspire also that it will be part of a wider social project to re-think value. We are familiar with the extensive ‘alternative’ measurement work around the world — in development studies, in human capacity measures, in health, in care, in the environment. We believe that ECSA’s indexing project can be part of this wider agenda; indeed framed within a token economy there will be new ways for this project to develop.

8. The derivative form: buying exposures to an exponential future and a big put

The above suggests to us that tokens in cryptoeconomics, and certainly in the new economic space, take the form of derivatives. We mean this not just in the sense that the purchase of an ECSA token in the capital market is a derivative, in the same way that company stocks are derivatives (exposure to company performance, without ownership of the underlying).

The proposition is deeper — in part material, but in part also symbolic.

First, the material expression. There is a widespread embrace within the cryptoeconomic community of a transformative potential: tokens give exposure to an unverified but exponential future. The various indices that ECSA will compile are themselves derivative formulations, in the sense that movements in the indices will determine individual token values as they exchange with other tokens and with ECSA’s mutual stakeholding fund. The purchase of tokens is thereby the acquisition of an exposure to these indices. The indices themselves are measures of the performance of economic spaces. So a token is an exposure to an index which is itself a representation of economic performances by token-issuing economic spaces.

Second, symbolically, the crypto economy involves taking risk positions on the established capitalist economy, its calculation of value, and the economic (and political) power structure around it.

  • Those of us engaged in building (and analysing) cryptoeconomics are holding a long position on its potential.
  • Those state regulators/commentators who decry the potential of crypto economies are using their regulatory and media power to short us.
  • Those who diversify from the conventional capital market and capitalist economy and invest in ECSA are taking a short position on capitalist systems of value calculation. Borrowing the great insight of ECSA Advisor prof. Robert Meister of UCSC, we offer them a ‘big put’: a capacity to short capitalism.

In the words of ECSA Advisor, NYU professor Robert Wosnitzer:

“If the current system/structure of capital “shorts” the qualitative dimensions of life and society, and/or the externalized costs of production (i.e., businesses “put” the costs of, say, pollution onto society), through going “long” the current system of production and circulation, then ECSA is going long the qualitative/intangible dimensions and shorting the current system.

Said differently, a put option “puts” back the cost of the spread between the current value of something and its strike price to a counterparty in equity options, or in the case of bonds that carry a put, the right to “put” back the par value at a specific moment. This “put” option also carries the logic of securitization — that is to say, the rationale and need to securitize mortgages is due to the fact that homeowners have a “put” option that they can exercise at any time, thereby rendering the cash flows unpredictable and therefore not suitable for investment and reducing liquidity. By securitizing mortgages, the “put” option is mitigated as the risk is spread across multiple mortgages in the pool, and then tranching allows for even more precise predictability. So the “put” option has always carried some threat to the current system of capital relations, and the need for securitization arose largely to address this “threat.” Of course, the put option in mortgages has been largely reduced to interest rate sensitivities (and hence the need for strong central banking operations), whilst ignoring the real, material social contexts that often drive interest rates — unemployment, price increases by producers, health care costs, etc. etc.”

And the politics is clear:

Creating “alternative” economic spaces allows the owners of tokens to own an option that could, under certain conditions, “put” back the cost of an externality to the owners of the means of production which created the externality (or injustice, if you will). It’s a long, directional play (buying a put) that is the dialectical opposition to the long, directional play of existing capital relations.
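The payoff logic of this ‘big put’ can be sketched in a few lines (numbers hypothetical): the option pays off precisely when the value assigned by the incumbent calculus falls below the strike.

```python
def put_payoff(strike, value_at_expiry):
    """Intrinsic value of a put option at expiry."""
    return max(strike - value_at_expiry, 0.0)

# The further the incumbent system's valuation falls, the more the
# holder of the put can "put back" to the counterparty.
for v in (120.0, 100.0, 80.0, 60.0):
    print(v, put_payoff(strike=100.0, value_at_expiry=v))
```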

Yet there is currently no derivative product that recognises the significance of this insight and enables it to be ‘played out’ financially, in a way that doesn’t express itself simply via the volatility of token values themselves.

ECSA is currently exploring ways we might engage this logic, for example by securitizing a certain part of revenue (perhaps a world-first collateralized equity note?), with the possibility both of hedging that revenue (in quantity and in currency) and of providing a liquid market tool to attract short positions on ECSA, but in a way that diverts this shorting activity away from the ECSA token itself.

It must be emphasised that this strategy is currently at the stage of exploration only, as part of engaging in a process of discovery of what cryptotokens might become. It is raised here simply to indicate the kind of exploration we feel needs to emerge.

The ECSA team will be again at NYU during the last week of September to carry on the work in the next series of Cryptoeconomic working sessions.

If you made it this far, and are interested in joining the work, let us know!

Credits:

Big thanks for comments and discussions on earlier drafts by Johnny Antos, Chris Burniske, Eden Dhaliwal, Anesu Machoko, Jessa Walden and some anonymous contributors. Thanks also to all participants at the Cryptoeconomics Working Sessions at NYU/Stern, Stockholm School of Economics and GCAS. ECSA team working on our cryptoeconomics project — Jonathan Beller, Erik Bordeleau, Fabian Bruder, Pekko Koskinen, Jorge Lopez, Joel Mason, Tere Vaden — rocks.

Dick Bryan is a prof. (emer.) of Political Economy (University of Sydney) and Chief Economist at Economic Space Agency. He is one of the key theorists of the derivative value form, and the author of Risking Together and Capitalism with Derivatives (together with Mike Rafferty).

Benjamin Lee is a prof. of Anthropology and Philosophy (The New School, NYC) and Advisor to Economic Space Agency. The author of Derivatives and the Wealth of Societies. Co-organizer of the Volatility Working Group.

Robert Wosnitzer is a prof. at New York University/Stern Business School and Advisor to Economic Space Agency. Credit instruments, derivatives, and cultures of finance specialist. Former debt instruments and options trader at Lehman Brothers and Wells Fargo Capital Markets.

Akseli Virtanen, PhD, is a political economist, the author of Arbitrary Power: A Critique of Biopolitical Economy and Economy and Social Theory (Vol. 1–3, with Risto Heiskala), co-founder at Economic Space Agency and at the decentralized hedge fund Robin Hood Minor Asset Management. Currently a visiting researcher at Stanford University.

Photo by rutty

The post Economics back into Cryptoeconomics appeared first on P2P Foundation.

There’s more to decentralisation than blockchains and bitcoin https://blog.p2pfoundation.net/theres-more-to-decentralisation-than-blockchains-and-bitcoin/2018/10/02 Tue, 02 Oct 2018 08:00:00 +0000

The post There’s more to decentralisation than blockchains and bitcoin appeared first on P2P Foundation.

Republished from Medium.com

As the decentralisation movement grows, I consider the characteristics of decentralisation, what decentralisation is a tactic for, why and what work still needs to happen to re-decentralize the digital world.

Decentralisation has gone mainstream

Between Tim Berners-Lee raising the call to arms to re-decentralize the web, Mozilla, Internet Archive and other institutions pledging support, and the incredible financial success of blockchain and cryptocurrency projects — decentralisation is increasingly sexy.

(If you haven’t seen the hype, mainstream coverage ranges from the New Yorker covering ‘the mission’ in 2013, to the Guardian calling decentralisation ‘the next big step’ earlier this month, and Make Use Of wondering if blockchains are the answer.)

Yet, what does decentralisation actually mean? Does it only apply to technology or is governance more important? Who gets to call themselves decentralised and does it matter?

The number of times I’ve heard ‘it’s decentralised’ as a reason to use or move to a particular application or platform recently is impressive. All kinds of crypto/blockchain companies are branding themselves as ‘decentralised’ — every day there’s a new decentralised social network, decentralised file storage solution, decentralised identity app, decentralised syncing, contract management, health data sharing, dating service, avocado delivery — all decentralised! As if decentralisation were something wonderful and worthwhile in and of itself. Yet when I ask ‘why does that matter?’ or ‘how are you decentralised?’, the answers tend to be very different and even inconsistent with the actual business proposition people are working on. How did we get here and what’s beyond the hype?

Decentralisation means different things to different people. When Francis and I picked Redecentralize as the name of our decentralisation-promoting side project six years ago, it was precisely because we cared about a number of things: privacy, competition and resilience. It wasn’t just one solution (such as encryption) that we wanted to promote; it was a set of values: freedom, autonomy, collaboration, experimentation. Those values were tied up with the original spirit of the open web and net — the sense of freedom and possibility that we wanted to remind people of, and protect.

As decentralisation becomes more popular, those values and goals are getting lost as the community fractures into various roles. We need a way to distinguish and assess decentralisation meaningfully.

First, what does decentralisation actually mean?

At its most basic level, it is a distinction between a centralised hub and spoke model and a distributed connected network:

I drew this myself. You’re welcome.

Some people distinguish between ‘decentralised’ and ‘distributed’ — I’m talking about the general idea of decentralisation that encompasses distributed, federated and decentralised systems. This post is about the characteristics of decentralisation and the outcomes and implications of those characteristics rather than the specific configuration. (For more discussion on types of decentralisation, Vitalik wrote a great post on ‘the meaning of decentralisation’ last year).

While the diagrams are a simplification, they do immediately suggest certain characteristics. The centralised system on the left obviously has one much more important or powerful node — the middle one. All the other nodes depend on it to reach each other. It will know about all communication in the network. It’s a central point of failure and a central point of control. If you contrast this with the diagram on the right — which nodes are more important there? It’s hard to tell. Most nodes have multiple routes to other nodes. It seems like a more resilient system, but it’s harder to know how you can quickly make sure all nodes have the same information at once.

What we need is a more formal way to assess if something counts as ‘decentralised’.

Characteristics of decentralisation

The key characteristic I propose is that a system is decentralised to the extent it distributes power. Specifically, the distribution of control, knowledge and capability between many users. What does this look like?

Control is about ensuring user choice — adapting to user preferences and giving users decision making power. It’s fundamentally about autonomy. Decentralised control looks like end-users having a choice between service providers and not being forced into accepting terms and conditions that exploit them due to a lack of alternatives (see Facebook). This also looks like users having the freedom to adapt and customise the products and services they use to their specific needs. It looks like being able to opt out of targeted advertising or choosing to store your data locally. It looks like having applications that don’t require an internet connection to work.

Knowledge is about access to data and information. Knowledge distribution avoids information asymmetry and helps people recognise dependencies and the consequences of their choices. Decentralised knowledge looks like users having local copies of their data, being able to export data or choose to store the authoritative copy of their data locally. It looks like users understanding how the services they use actually work and their business models (for example whether it is advertising based, personalised advertising, selling your profile and preferences to external advertisers, something else etc). It looks like users being able to have private conversations and share photos securely with end-to-end encryption where the content of communication cannot be accessed or deleted by external organisations. It can look like the company providing the service not knowing or storing the metadata of who contacts who and when.

Capability is about infrastructure — the storage, processing and computation power needed to run systems and services. In a centralised model these are either all in the same place or in a small number of places controlled by one company. This creates a central point of failure both in the event of natural disasters (hurricanes, floods, earthquakes) and attacks (whether virtual such as data breaches, data taps, denial of services attacks, or physical destruction and manipulation). Centralisation often means that people’s data, which we rely on and want to protect (such as our conversations, photos and work), can be compromised or even lost. Privacy can be easier to compromise in central systems. A decentralised approach tends to be more resilient, but also offers greater control and knowledge distribution. It looks like apps which work offline, users being able to communicate, collaborate or share data across devices without mobile networks or wifi through peer-to-peer networks or user data federating across a network (e.g. mastodon.social).

Why decentralise?

Importantly, decentralisation in and of itself is neither good nor bad. It depends on the context and what is being decentralised. Decentralisation can bring new capabilities, privacy and flexibility, or surveillance, inefficiency and waste. How and why it is done matters.

Not all things need decentralising. Unlike some, I don’t think code should be law. I like the law. It has been iterated on and developed and tested over thousands of years by millions of people. I would trust British Law above even a dozen smart contract developers. (Disclaimer: I’ve worked in tech for over 10 years, but never in law).

Institutions have value, and not all expertise can or should be replaced by an immutable list and algorithmic consensus. However, in many other respects, we desperately need to redecentralise and to serve people, not corporations, much better. Even so, simply decentralising in some fashion does not magically bring about utopia. Much of the rhetoric of blockchain and other ‘decentralisation’ startups offers no plausible way from where we are today to the autonomous, secure, empowered world of decentralisation via their service or application. Let’s be intentional and clear about what changes we want to realise and what exactly it might take to get there. If you’re not building all of it, then be clear on what else will need to happen. We will most likely succeed as an ecosystem, not as one ‘killer app’.

This brings me back to the point that how, and why, decentralisation is done matters. And for me, the meaning and value of decentralisation is closely related to its purpose and expected outcomes. That means understanding the problem, articulating an alternative and a roadmap for how we get there, and then testing the roadmap and showing it’s better by tracking the impact.

Everybody in the decentralisation space needs to do this.

Understanding the problem

Centralised systems lead to increasingly monopolistic and unaccountable power. Over time this encourages exploitation of, and disinterest in, user needs. Take Facebook for example: a platform that on the face of it is designed to help people digitally connect with their friends and family — share photos, talk, organise events and keep in touch. If my needs were a genuine priority, I would be able to share and showcase my photos from flickr or talk to my friends using my favourite app (such as telegram, signal or wire) — whichever were most convenient for me. If Facebook cared about connecting people, it would not have dropped xmpp support — an open instant messaging protocol that allowed people to choose their own interface (mine was pidgin!) and, from one place, talk to anyone using gchat, facebook, AIM, msn or jabber. Instead, Facebook’s interface and functionality are optimised around keeping me scrolling and in-app as long as possible, since their business model depends on selling my attention.

Amazon has become a near-monopoly for buying things online, with its brand recognition, economies of scale and great customer service. As real-world bookshops close down and everyone else sells through Amazon Marketplace, few have the infrastructure, supply chains, funds or brand to compete any more. When there are no alternatives, why be cheaper? Why have great customer service? Users have little choice or control, and Bezos (the owner of Amazon) is the richest person on the planet. Instead of thousands of independent flourishing businesses, we have one very very very rich man.

Centralisation makes it easy to undermine privacy and to use personal information in ways individuals cannot control. As the Snowden revelations showed us, governments tap network cables and can curtail freedom of speech. Digital monopolies now hold unbelievable amounts of data on us, which can be used to manipulate us into spending money, but potentially also to impersonate, blackmail or silence us.

An alternative

Keeping power accountable requires independent, competing sources of power. This could be government, assuming government is there to represent the interests of the many above the few. It could be alternative companies and services. It could be many people choosing together.

An alternative, decentralised world is one of:

  • Choice, diversity and competition — where many different business models and structures co-exist beyond the ‘winner takes all’ surveillance capitalism model (which depends on closed networks that don’t integrate or talk to each other). Centralised models, especially those built on data-selling and advertising, have been explored in depth, and within any new vertical one or two winners tend to take all and price out new competitors. This is uninspiring compared to the wealth of innovation that might be possible with local organisations tailoring their offerings to particular sectors, cultures, interests and preferences. The same open source software can be provided in different configurations and with alternative service standards to fit different user needs, budgets and cultural contexts. It’s a world where providing ethical and environmentally friendly products and delivery services is possible and discoverable.
  • Resilience — where our valuable data and services are persistent and safe from companies being bought, new management decisions, natural disaster or hacking. No more losing your journal or portfolio gallery when a company is bought up by a monopoly.
  • Autonomy and privacy — where we control what kinds of terms and conditions we’re willing to agree to. A world where people can opt out of data sharing or choose to pay for their social network — choosing security and no adverts while still being able to communicate with friends using different providers. A world where end-to-end encryption works seamlessly.

How do we make it happen?

We all can contribute!

At Redecentralize.org we’re encouraging viable alternatives that work together (‘small pieces loosely joined’). This means ensuring that decentralised products and services are usable and work well with other privacy-preserving, user-centred services and products. A key goal of Redecentralize is to promote decentralised projects and platforms, and to bring people working in this space together through events and discussion forums.

Secondly, open protocols — and regulation that incentivises or enforces their use — are vital. The beginnings of this already exist in the data portability requirements of GDPR. Open protocols allow for collaboration between different and competing products and services, giving the user maximum flexibility and control without losing access to others in their network. The forced exclusivity of closed, proprietary protocols over network-type services (such as social networks, or marketplaces like Amazon, Airbnb and Uber) has led to monopolies and a lack of innovation, and should be consigned to history.

Lastly, we all have a role to play in disrupting the surveillance capitalism business model, by choosing with our wallets and spending money on respectful software. A promising path may be to build payment into how things work (cryptocurrency-style), so that when you use IPFS and help store content you collect Filecoin, which you can then spend on the applications and services you value.
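
To make that concrete, here is a minimal sketch (in Python, assuming a local IPFS daemon exposing its HTTP API on the default port 5001) of the storage half of that loop: adding content so the network can store and serve it. The Filecoin reward side is deliberately left out, since the real network handles incentives very differently.

```python
# Minimal sketch: publish a file through a local IPFS daemon's HTTP API.
# Assumes `ipfs daemon` is running locally; error handling kept minimal.
import requests

def add_to_ipfs(path: str) -> str:
    """Add a file to IPFS and return its content identifier (CID)."""
    with open(path, "rb") as f:
        resp = requests.post("http://127.0.0.1:5001/api/v0/add", files={"file": f})
    resp.raise_for_status()
    return resp.json()["Hash"]  # the address is a hash of the content itself

cid = add_to_ipfs("photo.jpg")
print(f"any peer can now fetch this as /ipfs/{cid}")
```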

Conclusion

Decentralisation in and of itself is unlikely to achieve all the outcomes that many people in the decentralisation movement care about. Yet it does offer a powerful way to tackle the problems of digital monopolies, growing inequality and loss of autonomy in our societies. Decentralisation incentivises power to be distributed across users. It’s an alternative infrastructure and way of being that creates space for autonomy, collaboration and local control. So let’s be explicit about the change we want to see, and test the impact.

Decentralised governance (knowledge and control in this model) is vital and must be considered alongside infrastructure and capacity. Let’s assess projects on all three characteristics of decentralisation and treat technology as a powerful tool to get us to a better world, but by no means the only intervention needed!

Can I get involved?

Yes, of course. Join the discussion list and come chat in the #redecentralize Matrix channel. We’re about to start fundraising — shout if you’d like to sponsor our work or come contribute!

Photo by Thomas Hawk

The post There’s more to decentralisation than blockchains and bitcoin appeared first on P2P Foundation.

]]>
https://blog.p2pfoundation.net/theres-more-to-decentralisation-than-blockchains-and-bitcoin/2018/10/02/feed 0 72803
Decentralising the web: The key takeaways https://blog.p2pfoundation.net/decentralising-the-web-the-key-takeaways/2018/09/14 https://blog.p2pfoundation.net/decentralising-the-web-the-key-takeaways/2018/09/14#respond Fri, 14 Sep 2018 09:00:00 +0000 https://blog.p2pfoundation.net/?p=72506 Republished with permission from UK technology site Computing John Leonard: The Decentralized Web Summit is over – what’s next? Earlier this month a rather unusual tech event took place in San Francisco. The Decentralized Web Summit played host to a gathering of web luminaries such as Sir Tim Berners-Lee, Brewster Kahle and Vint Cerf. On... Continue reading

The post Decentralising the web: The key takeaways appeared first on P2P Foundation.

]]>
Republished with permission from UK technology site Computing

John Leonard: The Decentralized Web Summit is over – what’s next?

Earlier this month a rather unusual tech event took place in San Francisco.

The Decentralized Web Summit played host to a gathering of web luminaries such as Sir Tim Berners-Lee, Brewster Kahle and Vint Cerf. On top of that, activists, authors and screenwriters such as Jennifer Stisa Granick, Emili Jacobi, Mike Judge and Cory Doctorow put in an appearance, as did cryptocurrency pioneers like Zooko Wilcox, blockchain developers, and academics.

Then, there was what the Guardian‘s John Harris calls the Punk Rock Internet – companies like MaidSafe and Blockstack who play by their own decentralised rules.

Oh, and there was a sprinkling of techies from Microsoft, Google (Vint Cerf and others) and Mozilla in attendance too, along with a handful of venture capitalists looking for opportunities.

Uniting this diverse selection of delegates was the challenge of fixing the centralising tendencies of the internet and web.

Simply put, the internet’s reliance on centralised hubs of servers and data centres means that the more servers you control the more power you have, with all the negative consequences that follow from the creation of data-haves and data-have-nots.

To redress the balance, data needs to be freed from silos with control handed back to users, but how to do that while retaining the convenience and ease-of-use of the current web?

Aside from the inevitable resistance by the powers that be, this turns out to be quite the technical challenge.

One task among a set of complex interlocking challenges is to separate data from the applications that use it. People could then store their personal data where they choose, granting or limiting access by applications as they please. For example, Berners-Lee’s Solid platform enables everyone to have multiple ‘pods’ for their data allowing for fine-grained control.
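
As a rough illustration of that idea (with invented names and methods, not the actual Solid API), the sketch below models a pod that holds a user’s data and answers only to applications the owner has explicitly granted access:

```python
# Hypothetical sketch of pod-style access control, not real Solid code.
class Pod:
    """A user-controlled data store; apps get only the access the owner grants."""
    def __init__(self, owner: str):
        self.owner = owner
        self.data: dict[str, str] = {}          # resource path -> content
        self.grants: dict[str, set[str]] = {}   # app id -> readable resources

    def grant(self, app_id: str, resource: str) -> None:
        self.grants.setdefault(app_id, set()).add(resource)

    def read(self, app_id: str, resource: str) -> str:
        if resource not in self.grants.get(app_id, set()):
            raise PermissionError(f"{app_id} has no access to {resource}")
        return self.data[resource]

pod = Pod("alice")
pod.data["photos/2018.json"] = "[...]"
pod.grant("gallery-app", "photos/2018.json")
pod.read("gallery-app", "photos/2018.json")  # allowed
# pod.read("ad-tracker", "photos/2018.json") would raise PermissionError
```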

Another element is authentication, ensuring that the data owner really is who they say they are, while ensuring real identities remain private by default.

Networking needs to be peer-to-peer rather than hub-and-spoke, with copies of files stored across multiple machines for redundancy and speed of throughput, in a manner that users of torrent-based file-sharing services will be familiar with, but adding far more control and performance features.
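
A toy sketch of that torrent-style redundancy, assuming nothing about any particular protocol: a file is split into chunks, each chunk is addressed by its hash, and every chunk is placed on several peers so that no single machine is a point of failure.

```python
# Toy sketch of chunked, replicated storage; real systems (BitTorrent, DAT,
# IPFS) add peer discovery, verification and incentives on top of this idea.
import hashlib

def chunk(data: bytes, size: int = 256 * 1024) -> list[bytes]:
    """Split a blob into fixed-size pieces."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def place(chunks: list[bytes], peers: list[str], replicas: int = 3) -> dict[str, list[str]]:
    """Assign each chunk (keyed by its hash) to `replicas` peers, round-robin."""
    placement = {}
    for i, piece in enumerate(chunks):
        cid = hashlib.sha256(piece).hexdigest()
        placement[cid] = [peers[(i + r) % len(peers)] for r in range(replicas)]
    return placement

peers = ["node-a", "node-b", "node-c", "node-d"]
placement = place(chunk(b"some large file" * 100_000), peers)
```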

And above all it will need to be easy to use, low latency and simple for developers to create decentralised applications for.

Computing contacted a number of contributors to the Summit before and after the event and asked about their take on progress towards a viable decentralised web.

Pic credit Vitor Fontes. “Things fall apart; the centre cannot hold” (W.B. Yeats)

14/08/2018 – The key takeaways

With the summit now over and the participants returned to their basement labs (or shiny new offices) it’s time to consider the takeaways.

Interest in decentralisation is growing

While the 2016 Decentralized Web Summit attracted 350 enthusiasts, 2018 saw more than twice that number, with 800 attendees across 156 sessions. Not huge numbers as tech events in San Francisco go (the ‘big one’, Oracle OpenWorld, attracts an astonishing 60,000 delegates), but important nevertheless in that it brought together the founders of the connected world with those looking at new ways to reclaim the web’s original vision.

“There are dozens and dozens of new projects and protocols and our goal was to get them to a place where people could do real learning,” said Wendy Hanamura of the Internet Archive.

For Blockstack’s Patrick Stanley the seed planted two years ago is still growing strongly: “I was very impressed by the quality of attendees and felt that the spirit of the original vision of the web as a place where people can create was intact,” he said.

No project is an island

The web touches almost every aspect of modern life. Re-architecting such a system will be a huge undertaking, one far too big for disparate bunches of developers working alone. MaidSafe COO Nick Lambert was among many urging more collaboration.

“Certainly, there are some efforts to work together on problem solving, but this is not happening universally,” he said. “Everyone at the event was clearly united in a common purpose to make the internet more private and secure, but the key takeaway for me is how we foster greater cohesion among the different projects.”

Money: no longer too tight to mention

Concerns about attracting VC funding haunted 2016, but those worries have largely evaporated as a result of the crypto goldrush which has given a huge boost to the value of the tokens that support many projects. Booms can turn to busts, of course, and sudden wealth can bring challenges of its own, but for now the gloom has lifted.

While some fear an inevitable clampdown on cryptocurrencies by the authorities, OmiseGO’s Althea Allen, who chaired a debate on the issue, said the worst may not happen.

“What I took away from talking with those excellent thinkers was actually quite a hopeful picture for the future of decentralised finance,” she said. “By all their accounts, they have found regulators to be more open to the possibilities of crypto than we tend to assume, with less default bias toward corporate interests, and largely concerned with the same things that we are: security, privacy, consumer protections; generally speaking, making honest people’s lives easier and not harder.”

Awareness of the bigger picture

Mindful of the developing relationship with the authorities, governance was front and centre of many discussions, a sign of growing maturity in decentralised thinking. For Miriam Avery, director of strategic foresight at Mozilla’s Emerging Technologies department, valuable lessons can be learned from those working “in countries where corruption is blatant, regulation is ineffective, and centralised control points cause palpable harm.”

Their experiences may turn out to be more universal than some might think, she said.

“The threat model is changing such that these harms are relevant to people who are less acutely aware of their causes. For instance, the things Colombian Ethereum hackers are worried about are things that we should all be a little worried about.”

Avery continued: “At the same time, digging into these projects we can already see pitfalls in the ‘governance’ of the software projects themselves, from the prevalence of benevolent dictators to disagreements on the limits of moral relativism. There’s room to grow these technologies through healthy, inclusive open source communities, and I’m excited to see that growth.”

The door needs to be wedged open, or it will be slammed shut again

Another Mozillan, software engineer Irakli Gozalishvili, said: “It was reassuring to see that the community is actively thinking and talking about not only making decentralised web a place that serves people, but also how to create technology that can’t be turned into corporate silos or tools for empowering hate groups.”

Scaling up

Any decentralised web worthy of that name needs to be quick and responsive at scale, said MaidSafe’s Lambert. “There is a long way to go to create a user experience that will encourage everyone to adopt the decentralised approach.  For example, none of the demonstrations at the summit were able to show scalability to millions of users.”

Front-end focus

The decentralised web is still very ‘engineering-y’, with most of the effort going into the back-end rather than the user interface; the networking may be futuristic, but the front end is (with a few honourable exceptions) still Web 1.0. That is fine at the development stage, but projects will soon need to move on from demonstrating capabilities to making apps that people actually want to use.

Creating an easy onramp is an essential step. Mozilla is piloting decentralised web browsing via WebExtension APIs, the first of the ‘major’ vendors to do so, although others have been working in this area for a while, notably the Beaker browser for navigating DAT sites and ZeroNet.

A long list of necessary developments includes a human-readable decentralised replacement for the DNS system, search engines, and proof that crypto-based incentive systems for the supply and demand of resources can make for a scalable economy.

And the next Decentralized Web Summit? Hanamura wouldn’t be drawn on a date. “We’re still recovering from organising this one,” she said.

Enthusiasm is not sufficient fuel

06/08/2018 – Maintaining the momentum

If the 2016 Decentralized Web Summit was a call to action, in 2018 it’s all about working code. That’s according to Wendy Hanamura, director of partnerships at the Internet Archive, the organisation that hosted both events. However, there’s still a fair way to go before it goes anything like mainstream.

The Internet Archive’s mission is to preserve the outputs of culture, turning analogue books, files and recordings into digital, storing digital materials for posterity and preserving web pages going back to 1996 in the Wayback Machine.

Unsurprisingly given its aims, the organisation is sitting on a mountain of data – more than 40 petabytes and rising fast. It has recently started experimenting with decentralised technologies as a way of spreading the load and ensuring persistence, including file sharing and storage protocols WebTorrent, DAT and IPFS, the database GUN and P2P collaborative editor YJS.

And it’s open to looking at more in the future. “We’re glad to be in at the ground floor,” said Hanamura. “We have no horse in the race. We’re looking for all of them to succeed so we’re looking at different protocols for different functions.”

Wendy Hanamura

Despite some substantial progress, few decentralised projects could yet be described as ‘enterprise ready’. More work is required in many different areas, one of which is providing more straightforward ways for non-technical users to become involved.

Hanamura pointed to developments among big-name browsers including Firefox, Chrome and Brave as among the most promising for improved user experience. Mozilla demonstrated a Firefox API for decentralised systems at the event.

“Participants were able to talk to each other directly browser to browser without a server involved, and they thought that was tremendously exciting,” she said.

Collaborations

For Ruben Verborgh of the Solid project, the cross-pollination required to overcome some of the challenges is hampered by the diversity of approaches.

“Ironically, the decentralised community itself is also very decentralised, with several smaller groups doing highly similar things,” he said. “Finding common ground and interoperability will be a major challenge for the future since we can only each do our thing if we are sufficiently compatible with what others do.”

While it’s still too early for projects to merge or consolidate around standards, Hanamura said she witnessed “lots of meetings in corridors and deals being struck about how you could tweak things to work together.”

“That’s another way you can make it scale,” she added.

Maintaining momentum

The summit had strong ideological underpinnings. Hanamura described it as “an event for the heart. People came to share.”

The strength of small open-source projects with big ideas is that they can easily sustain shared ideals, but this can be hard to maintain as they evolve, she went on.

“Many founders said governance was their biggest worry. You need a team of developers who believe in you and are willing to work with you – if not they can fork the code and create something very different.”

In 2016 the main concern was very different: it was funding. The success of cryptocurrency token sales (ICOs) has removed many of these worries, at least for some. A lot of money has flowed into decentralised technologies; for example, Filecoin recently raised $230m in an ICO and Blockstack made $50m. But this can be a double-edged sword, as rapid expansion and bags of cash make team cohesion more challenging to maintain, Hanamura believes.

“It makes it a dangerous time. We came to this with a purpose, to make a web that’s better for everyone. So we need to keep our eye on the North Star.”

Once the technologies hit the mainstream, there will be other challenges too, including legal ones.

“As this ecosystem grows it has to be aware of the regulations on the books around the world but also those pending,” said Hanamura. “We have to have a strong voice for keeping areas where we can sandbox these technologies. We need a governance system to keep it decentralised otherwise it can get centralised again.”

It’s gonna take a lot of thinking through

01/08/2018 – Why is decentralising the web so hard to achieve?

Tim Berners-Lee and his colleagues faced a number of tough challenges when inventing the web, including having to build early browsers and protocols from scratch and overcoming initial scepticism (his original idea was labelled ‘vague but exciting’ by his boss at CERN). The nascent web also needed to be brought into being under the radar, and the terms for the release of its code carefully formulated to guarantee its free availability for all time. It took 18 months to persuade CERN that this was the right course.

“Had the technology been proprietary, and in my total control, it would probably not have taken off. The decision to make the web an open system was necessary for it to be universal. You can’t propose that something be a universal space and at the same time keep control of it,” said Berners-Lee in 1998.

The original web was designed to be decentralised, but over the course of time it has been largely fenced off by a small number of quasi-monopolistic powers we know as ‘the tech giants’. This makes designing a new decentralised internet – one that’s ‘locked open’ in the words of the Internet Archive’s Brewster Kahle – a challenge even more daunting than the one those pioneers faced. The problem is that the tech giants are very good at what they do, said Jamie Pitts, a member of the DevOps team at the Ethereum Foundation, speaking for himself rather than on behalf of his organisation.

“One of the key hurdles to decentralisation is the lock-in effect and current excellent user experience provided by the large, centralised web services,” he said.

“Decentralised web technology must enable developers to produce high-quality systems enabling users to search, to connect with each other, and to conduct all forms of business. Until that happens, users will continue to be satisfied with the current set of options.”

While a subset of users is worried about power imbalances, surveillance and lack of control and transparency, the fact is that most people don’t care so long as there are bells and whistles aplenty. A tipping point must be achieved, as Althea Allen of OmiseGO put it.

“The only thing that will force those decentralised systems to change on a fundamental level is a mass shift by consumers toward decentralised systems.”

Selling ads and services through the centralisation and mining of data (‘surveillance capitalism’) has made the tech giants very powerful, and it can be hard to see beyond this model.

“The monopolisation that can occur in a rapidly-advancing technology space poses one of the greatest challenges to decentralisation,” said Pitts.

“Aggregation of capital and talent results from the network effect of a successful commercially-run service, and developers and users can become locked in. While many of the needs of users may be met by the dominant content provider, search engine, or social network, the monopolised network becomes a silo.”

Moreover, the suck-up-all-the-data model has proven to be highly lucrative for the big boys, and while alternative economic methods for paying participants involving cryptocurrencies and micropayments are emerging, none has yet proved itself on the wider stage.

“There need to be viable business models for app developers that do not depend on advertisements or exploiting user behaviour and data,” said Blockstack’s Patrick Stanley.

On the systems side, there is a necessity to rethink the architecture to avoid central hubs. One of the toughest problems is achieving reliable consensus: with nodes seeing different versions of the ‘truth’ (i.e. what events are happening and in what order), how can one ‘truth’ be agreed upon without reference to a central arbiter? And how can this consensus be secured against faults and bad actors?

This longstanding conundrum was finally solved by the bitcoin blockchain a decade ago, and many efforts are ongoing to make it more efficient and a better fit for the decentralised web, the IoT and other applications. However, other projects, such as IPFS and MaidSafe’s SAFE Network, don’t use a blockchain, arriving at different methods for achieving consensus.
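
For readers wondering what ‘solved’ looks like, the sketch below is a deliberately toy rendering of the Nakamoto approach (nothing like production Bitcoin): making history expensive to write via proof-of-work lets every node independently converge on the same chain, with no central arbiter.

```python
# Toy proof-of-work consensus; difficulty, forks and validation are all
# drastically simplified compared to any real blockchain.
import hashlib

def mine(prev_hash: str, payload: str, difficulty: int = 4) -> dict:
    """Search for a nonce whose block hash starts with `difficulty` zeros."""
    nonce = 0
    while True:
        h = hashlib.sha256(f"{prev_hash}|{payload}|{nonce}".encode()).hexdigest()
        if h.startswith("0" * difficulty):
            return {"prev": prev_hash, "payload": payload, "nonce": nonce, "hash": h}
        nonce += 1

def resolve(competing_chains: list[list[dict]]) -> list[dict]:
    """Each node adopts the longest chain; at constant difficulty this stands
    in for 'the chain with the most accumulated work'."""
    return max(competing_chains, key=len)
```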

There are many ways to skin the decentralised cat – and that is another issue. What do people want: privacy, autonomy, security, an alternative economy, or all of the above? Where are the tradeoffs, and who decides the priorities? And how can the various strategies work together?

The problem is too big for one player to handle. MaidSafe’s David Irvine sees collaboration as key to any solution, which was one reason why the firm open-sourced all its code.

“We want to collaborate with other companies in this space. We have the scars of developing specific functionality and are happy to work with companies to integrate that functionality where it makes sense.”

Pic credit Rene Böhmer. A decentralised web can also be a place to hide

31/07/2018 – What might go wrong?

Technology is morally agnostic. Nuclear power provides the raw material for nuclear bombs. That new road can carry serial killers as well as saints. And while a decentralised web would redistribute power over personal data, it could also provide a convenient hiding place for the bad guys.

Danielle Robinson

It’s high time technologists started to see this issue in the round, said Danielle Robinson, co-executive director of Code for Science & Society, a non-profit supporting collaboration in public interest technology.

“When technology is built, the biases of its creators are often embedded into the technology itself in ways that are very hard for the creators to see, until it’s used for a purpose you didn’t intend,” she said during an interview with Internet Archive. “So I think it’s really important that we talk about this stuff.”

The increased privacy and security built into decentralised web technologies makes it easier for anyone to collaborate in a secure fashion. And that includes hate groups.

“They’re on the current existing web, and they’re also on the decentralised web, and I think it’s important for our community to talk about that,” she said. “We need a deeper exploration that’s not just ‘oh you know, we can’t control that’.”

In a separate interview, Matt Zumwalt, program manager at Protocol Labs, creator of the InterPlanetary File System (IPFS), argued that proponents of the decentralised web need to think about how it might be gamed.

“We should be thinking, really proactively, about what are the ways in which these systems can be co-opted, or distorted, or gamed or hijacked, because people are going to try all of those things,” he said.

The decentralised web is still an early stage project, and many involved in its creation are motivated by idealism, he went on, drawing parallels with the early days of the World Wide Web. Lessons should be learned from that experience about how reality is likely to encroach on the early vision, he said.

“I think we need to be really careful, and really proactive about trying to understand, what are these ideals? What are the things we dream about seeing happen well here, and how can we protect those dreams?”

Mitra Ardron, technical lead for decentralisation at the Internet Archive, believes that one likely crunch point will be when large firms try to take control.

“I think that we may see tensions in the future, as companies try and own those APIs and what’s behind them,” he said. “Single, unified companies will try and own it.”

However, he does not think this will succeed because he believes people will not accept a monolith. Code can be forked and “other people will come up with their own approaches.”

30/07/2018 – Blockstack on identity and decoupling data

Authentication and identity are cornerstones of decentralised networking. Through cryptography, I as a user can verify who I am and what data I own without reference to any central registry. I can use my decentralised ID (DID) to log on securely and perhaps anonymously to services and applications with no third party involved.
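
The cryptographic core of that claim is ordinary public-key signing. Below is a minimal sketch using the Python `cryptography` package (the DID document plumbing is omitted): the holder signs a challenge with a private key, and any service can verify it with the published public key alone.

```python
# Minimal sketch of registry-free authentication via Ed25519 signatures.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

holder_key = Ed25519PrivateKey.generate()  # never leaves the user's device
public_key = holder_key.public_key()       # published, e.g. in a DID document

challenge = b"login-nonce-1234"            # sent by the service
signature = holder_key.sign(challenge)     # user proves control of the key

try:
    public_key.verify(signature, challenge)
    print("identity proven; no central registry consulted")
except InvalidSignature:
    print("verification failed")
```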

Identity is bound up with another tenet of decentralisation: separating the data from the applications. Applications are now interfaces to shared data rather than controllers and manipulators of it. Without my express permission, apps can no longer use and retain data beyond my control.

Coupling data to ID rather than apps was the starting point for the Blockstack platform, as head of growth Patrick Stanley explained.

“Blockstack is creating a digital ecosystem of applications that let users fully own their identities and data on the Internet. User data – like photos and messages – are completely decoupled from the applications. Apps can no longer lock users and their social graph in, since they no longer store anything.”

Storage is taken care of elsewhere, in a decentralised storage system called Gaia. As apps are now ‘views’ or interfaces you don’t need to log in to each individually.

“People use applications on Blockstack just like they would with today’s Internet. But instead of signing up for each app one-by-one with an email address and password — or a Google/Facebook log-in — users have an identity that’s registered in the blockchain and a public key that permissions applications or other users to access pieces of data.”

That’s lots of positives so far from a user point of view, and also for developers who have a simpler architecture and fewer security vulnerabilities to worry about, but of course, there’s a catch. It’s the difference between shooting from the hip and running everything by a committee.

“Decentralisation increases coordination costs. High coordination costs make it hard to get some kinds of things done, but with the upside that the things that do get done are done with the consensus of all stakeholders.”

There are already privacy-centric social networks and messaging apps available on Blockstack, but asked about what remains on the to-do list, Stanley mentioned “the development of a killer app”. Simply replicating what’s gone before with a few tweaks won’t be enough.

A viable business model that doesn’t depend on tracking-based advertising is another crucial requirement – what would Facebook be without the data it controls? – as is interoperability with other systems, he said.

And the big picture? Why is Blockstack sponsoring the event? Ultimately it’s about securing digital freedom, said Stanley.

“If we’re going to live free lives online, there needs to be protocol-level safeguards to ensure your data stays under your control. Otherwise, the people who control your data ultimately control your digital life.”

Independent but interconnected

27/07/2018 – OmiseGO on the importance of UI

OmiseGO, a sponsor of the Decentralized Web Summit, is a subsidiary of Asia-Pacific regional fintech firm Omise. Omise is a payments gateway similar to PayPal or Stripe that’s doing brisk business in East Asia. Omise enables online and mobile fiat currency transactions between customers and participating vendors, and OmiseGO, a separate company and open source project, aims to do the same with cryptocurrencies too.

The backbone of OmiseGO is the OMG blockchain, which in turn is built on Ethereum. The goal is to provide seamless interoperability across all blockchains and providers. OMG uses Plasma, an enhancement designed to speed up transactions on the Ethereum blockchain, and the company counts Ethereum founders Vitalik Buterin and Gavin Wood among its advisors. While it’s very early days, in the long run OmiseGO wants to extend banking-type services to the billions of ‘unbanked’ people by cutting out the financial middlemen who don’t serve those people, and also to give the ‘banked’ an alternative.

The current Internet has too many middlemen of its own, meaning that equal access does not mean equal control, explained OmiseGO’s head of ecosystem growth Althea Allen in an email.

“The decentralised web is crucial in providing equitable agency within the systems that internet users are accessing. Sovereignty over your own data, money and communication; access to information that is not censored or manipulated; the ability to control what aspects of your identity are shared and with whom; these are essential freedoms that the centralised web simply will not provide.”

However, if the alternatives are awkward and clunky, they will never take off.

“It is difficult, though not impossible, to create a decentralised system that provides the kind of user experience that the average internet user has come to expect. Mass adoption is unlikely until we can provide decentralised platforms that are powerful, intuitive and require little or no change in users’ habits.”

Team OmiseGO

Blockchains are a powerful tool for decentralisation as they can help keep control of events and processes across the network, but that depends on how they are used. There’s a lot of ‘blockchain-washing’ out there, Allen warned.

“Blockchains are not intrinsically decentralised – they can absolutely be private and proprietary. Many institutions, old and new, are showing an interest in adopting new technologies such as blockchains, maintaining the same centres of power and influence, and putting an ‘I blockchained’ sticker on them – essentially, appropriating the rhetoric of decentralisation without actually adopting the principles.”

Asked about the plethora of competing decentralised approaches, Allen said she believes this is positive, but sharing ideas is vital too.

“Cooperation is crucial for us to move the space forward, while healthy competition encourages the exploration of many different possible solutions to the same problems. We work particularly closely with Ethereum, but the success of our project depends on a thriving ecosystem (which extends well beyond crypto or even blockchain technology). To this end, we make a concerted effort to work with projects and individuals in many fields who are contributing to building the decentralised web.”

26/07/2018 – MaidSafe on collaboration

As we mentioned in the introduction, a decentralised web will require a number of different interlocking components, including decentralised storage, decentralised networking, decentralised applications and decentralised identities.

MaidSafe, one of the event’s sponsors, is trying to cover all but one of these bases with its autonomous SAFE Network, replacing the Transport, Session and Presentation layers of the current seven-layer internet with decentralised alternatives to create a platform for applications. The project is currently at alpha test stage.

So it’s all sewn up then, no need for further collaboration? Not at all, said CEO David Irvine, who will be speaking at the event, pointing to the firm’s open-sourcing of its PARSEC consensus algorithm and its invitation to other projects to help develop it. It’s just not always easy to organise joint ventures, he said. The summit will bring together many pioneers and innovators (70-plus projects are represented), each pushing their own ideas for redefining the web.

“[Everyone’s] so passionate about improving the internet experience, we are defining the rules for the future, and everyone has a point of view. That does mean there are some egos out there who are quite vocal about the merits of their approach versus others, which makes for good media stories and fuels hype, but it’s not what we’re really focused on.”

Within any movement dedicated to upending the status quo, there lurks the danger of a People’s Front of Judea-type scenario with infighting destroying the possibilities of cooperation. Amplifying the risk, many projects in this space are funded through cryptocurrency tokens, which adds profiteering to the mix. It’s easy to see how the whole thing could implode, but Irvine says he’s now starting to see real collaborations happen and hopes the summit will bring more opportunities.

“We’ve already been talking to Sir Tim Berners-Lee’s Solid project at MIT, and we have a growing number of developers experimenting with applications for the platform,” he said.

MaidSafe’s David Irvine

MaidSafe has been a fixture in the decentralised firmament for a while, predating even the blockchain which is the backbone of many other ventures. At one time it had the space almost to itself but has since been joined by a host of others. Asked about his company’s USP, Irvine came back with one word: “honesty”.

We asked him to expand.

“There is far too much hype in the wider blockchain crypto space and we have always tried to distance ourselves from that nonsense. We’re trying to build something hugely complex and radically different. That doesn’t happen overnight, so you have to be upfront with people so they are not misled. Sure we’ve learned along the way, got some things wrong, but whenever we have we’ve held our hands up and that has helped us.”

And the big-picture goal?

“In essence, privacy, security and freedom. The technology we are building will provide private and secure communications, as well as freedom through the unfettered access to all humanity’s data.”

25/07/2018 – Kahle and Berners-Lee on the need for decentralisation

Organiser the Internet Archive directed us to some recent statements by founder Brewster Kahle. Here Kahle outlines some of the problems with the existing web.

“Some of the problems with the World Wide Web that we’ve seen in the last few years are the surveillance structures that Snowden gave light to. There are the trolling problems that we saw in the last election. There’s privacy aspects, of people spilling their privacy into companies that sometimes aren’t the most trustworthy. There’s advertising technologies being used against users. There’s a lot of failings that we’ve seen in the World Wide Web.”

To be successful, the decentralised web will need to encourage “lots of winners, lots of participation, lots of voices” he said.

“So this is a time to join in, to find a place, get knee-deep in the technologies. Try some things out. Break some stuff. Invest some time and effort. Let’s build a better, open world, one that serves more of us.”

Open source principles are essential but not sufficient. There must be a focus on performance, functionality and new ideas.

“We’re only going to survive if the open world is more interesting than closed app worlds … what I would think of as a dystopian world of closed, segmented, siloed, corporately-owned little pieces of property. I’d much rather see an open, next-generation web succeed,” Kahle said.

Tim Berners-Lee

As ‘Father of the Web’ (Mk I), Tim Berners-Lee has become increasingly disillusioned with his offspring. Around the time of the previous Decentralized Web Summit in 2016, he said: “The web has got so big that if a company can control your access to the internet, if they can control which websites you go to, they have tremendous control over your life.

“If they can spy on what you’re doing they can understand a huge amount about you, and similarly if a government can block you going to, for example, the opposition’s political pages, they can give you a blinkered view of reality to keep themselves in power.”

Since then, of course, many of the things he warned about have become evident in increasingly obvious and frightening ways. And in the US, regulators recently scrapped net neutrality, doing away – in that country at least – with a longstanding principle of the internet, namely that ISPs should treat all data equally.

So, are there any positive developments to report over the last two years? Berners-Lee remains hopeful.

“There’s massive public awareness of the effects of social networks and the unintended consequences,” he told Computing. “There’s a huge backlash from people wanting to control their own data.”

In part this awareness is being driven by GDPR coming into effect, in part by news headlines.

Meanwhile, there’s the rise of “companies which respect user privacy and do not do anything at all with user data” (he namechecks the social network MeWe, which he advises), open-source collaborations like the Data Transfer Project (DTP) led by the tech giants, and his own project Solid, which is “turning from an experiment into a platform and the start of a movement”.

“These are exciting times,” said Berners-Lee.


John Leonard, Research Editor, Incisive Media

The post Decentralising the web: The key takeaways appeared first on P2P Foundation.

]]>
https://blog.p2pfoundation.net/decentralising-the-web-the-key-takeaways/2018/09/14/feed 0 72506
Tokens as a Labor Model https://blog.p2pfoundation.net/tokens-as-a-labor-model/2018/08/16 https://blog.p2pfoundation.net/tokens-as-a-labor-model/2018/08/16#respond Thu, 16 Aug 2018 08:00:00 +0000 https://blog.p2pfoundation.net/?p=72273 Two years ago, we published a report on Value in the Commons Economy, in which we analyzed the value regime of a number of pioneering peer production projects such as Sensorica and Backfeed. In that report, we posited a sphere of ‘value sovereignty’, within the sphere of the commons, and a membrane between the commons... Continue reading

The post Tokens as a Labor Model appeared first on P2P Foundation.

]]>
Two years ago, we published a report on Value in the Commons Economy, in which we analyzed the value regime of a number of pioneering peer production projects such as Sensorica and Backfeed. In that report, we posited a sphere of ‘value sovereignty’, within the sphere of the commons, and a membrane between the commons and the market to govern its interaction.

In the meantime, the token economy has exploded and, despite its many faults and weaknesses, it has brought open and contributive accounting into the mainstream as a practice, via programmable tokens that are divided up exactly as the open source communities decide. We have moved from an economy based on capitalist enterprises, which extracted all the surplus value from the developers, to an ecosystem in which contributory competency networks prepare white papers, crowdfund through tokens, and distribute the value much more widely amongst the contributors.
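
As a purely illustrative sketch (made-up names and shares, no real project’s code), this is what ‘divided up exactly as the communities decide’ amounts to in practice: a fixed token supply whose split is programmed up front, rather than decided by a founder after the fact.

```python
# Illustrative only: a community-agreed token split, hard-coded at mint time.
TOTAL_SUPPLY = 1_000_000  # hypothetical fixed supply

agreed_shares = {          # shares settled by the community's own process
    "core-devs": 0.45,
    "docs-and-design": 0.20,
    "early-backers": 0.25,
    "community-fund": 0.10,
}

def allocate(shares: dict[str, float], supply: int) -> dict[str, int]:
    """Mint the supply and split it exactly as agreed."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 100%"
    return {group: int(supply * share) for group, share in shares.items()}

balances = allocate(agreed_shares, TOTAL_SUPPLY)
```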

While much remains to be done, this is a major milestone in showing a possible future for our work and reward systems. The two following extracts bring testimonies about how the ‘developer working class’ is looking at these advances.

The question now is whether other sections of workers, those who do not belong to the aristocracy of labor that does software work, can also learn from and benefit from these new systems. We will be working on these very questions this summer and will publish a report about it.


(excerpted from): How App Tokens Changed the Life of the Developer Working Class

Richard Burton: A month of work for the protocol (Ethereum) has completely changed my life. I am free to travel the world and work on whatever I want. It is hard to overstate the mental freedom afforded by having a cash buffer and not having to work all the time to make ends meet. It has had a profound effect on my mental health and freed me up to do the best work of my life. The people who built this protocol took a chance on me and I am incredibly grateful.

Vitalik and his team gave birth to a protocol that over 7,000 people committed to. They effectively held an IPO for their protocol at the start of the project. Since then, thousands more have got involved by trading Ether, writing code, and helping the protocol to flourish.

– “Bitcoin is not just a protocol or money, it’s a new business model for Open Source Software. Prior to Bitcoin, you had to raise money, write software, distribute your product, build a business model, and work towards liquidity. Angels, VCs, salespeople and bankers guided you the entire way, through a maze of tolls and controls.”

Naval Ravikant saw this coming months before the Ether sale. The coins that protocols distribute to contributors are like shares in a company. The key difference is that these shares are not locked up by startup founders and venture capitalists.

There are a thousand nightmarish stories about startup employees not being able to afford to exercise their stock options and missing out on millions of dollars. Alex MacCaw and I wrote about this problem in 2013 after seeing many of our friends go through the stressful process of trying to borrow money to buy the stock they had earnt.

The current stock option system is totally broken. It forces people to stay at companies longer than they want to in the hope that a liquidity event is just around the corner.

App Coins are totally different from stock options. I was paid for my month’s work and I was rewarded for my belief in the protocol at an early stage. There was no cliff, no vesting schedule, no liquidation preferences, no VC ratchets, no exercise window, just coins. I helped the Ethereum team when they had no money and they rewarded me for that.

The moment I decided to move on to a freelance job, I was free to do so. I didn’t have to stick around in the hope that I would make some huge pile of money in the future.

This model is going to completely change the war for talent. If you’re a smart engineer, you can go and join a rocketship startup and work crazy hours. Alternatively, you can head over to Thailand, live cheaply, and work for App Coins.

Protocol creators need your help: They need people to write clear documentation, teachers to help people learn, designers to work on the user interfaces, customer support staff to handle the swelling inboxes, investors to raise capital, and a whole range of other talent to help them build a successful protocol. It doesn’t matter if you don’t write code—you can still contribute.

Protocols will follow the startup power law: millions will be started and only a few hundred will change the world forever.

In the future, billions of people will be working for a protocol. They will define themselves by the protocols they work for and how much they can contribute.

Protocolism might be the solution we need. It harnesses human ingenuity and distributes the benefits far and wide. It can help us build an economy for the 99%.

When a startup succeeds, a handful of people get insanely wealthy. When a protocol succeeds, thousands of people profit. In the future, the great protocols could lift millions of people out of poverty.

(excerpted from): Decentralization as a Means for Developers and other Stakeholders to Take Back Control from Centralized Platforms

Chris Dixon: Let’s look at the problems with centralized platforms. Centralized platforms follow a predictable life cycle. When they start out, they do everything they can to recruit users and 3rd-party complements like developers, businesses, and media organizations. They do this to make their services more valuable, as platforms (by definition) are systems with multi-sided network effects. As platforms move up the adoption S-curve, their power over users and 3rd parties steadily grows.

When they hit the top of the S-curve, their relationships with network participants change from positive-sum to zero-sum. The easiest way to continue growing lies in extracting data from users and competing with complements over audiences and profits. Historical examples of this are Microsoft vs Netscape, Google vs Yelp, Facebook vs Zynga, and Twitter vs its 3rd-party clients. Operating systems like iOS and Android have behaved better, although they still take a healthy 30% tax, reject apps for seemingly arbitrary reasons, and subsume the functionality of 3rd-party apps at will.

For 3rd parties, this transition from cooperation to competition feels like a bait-and-switch. Over time, the best entrepreneurs, developers, and investors have become wary of building on top of centralized platforms. We now have decades of evidence that doing so will end in disappointment. In addition, users give up privacy, control of their data, and become vulnerable to security breaches. These problems with centralized platforms will likely become even more pronounced in the future.

Cryptonetworks are networks built on top of the internet that 1) use consensus mechanisms such as blockchains to maintain and update state, 2) use cryptocurrencies (coins/tokens) to incentivize consensus participants (miners/validators) and other network participants. Some cryptonetworks, such as Ethereum, are general programming platforms that can be used for almost any purpose. Other cryptonetworks are special purpose, for example Bitcoin is intended primarily for storing value, Golem for performing computations, and Filecoin for decentralized file storage.

Early internet protocols were technical specifications created by working groups or non-profit organizations that relied on the alignment of interests in the internet community to gain adoption. This method worked well during the very early stages of the internet, but since the early 1990s very few new protocols have gained widespread adoption. Cryptonetworks fix these problems by providing economic incentives to developers, maintainers, and other network participants in the form of tokens. They are also much more technically robust. For example, they are able to keep state and do arbitrary transformations on that state, something past protocols could never do.

Cryptonetworks use multiple mechanisms to ensure that they stay neutral as they grow, preventing the bait-and-switch of centralized platforms. First, the contract between cryptonetworks and their participants is enforced in open source code. Second, they are kept in check through mechanisms for “voice” and “exit.” Participants are given voice through community governance, both “on chain” (via the protocol) and “off chain” (via the social structures around the protocol). Participants can exit either by leaving the network and selling their coins, or in the extreme case by forking the protocol.

In short, cryptonetworks align network participants to work together toward a common goal — the growth of the network and the appreciation of the token. This alignment is one of the main reasons Bitcoin continues to defy skeptics and flourish, even while new cryptonetworks like Ethereum have grown alongside it.


Photo by Marco Verch

The post Tokens as a Labor Model appeared first on P2P Foundation.

]]>
https://blog.p2pfoundation.net/tokens-as-a-labor-model/2018/08/16/feed 0 72273
Essay of the Day: Coexistence of Decentralized Economies and Competitive Markets https://blog.p2pfoundation.net/essay-of-the-day-coexistence-of-decentralized-economies-and-competitive-markets/2018/07/13 https://blog.p2pfoundation.net/essay-of-the-day-coexistence-of-decentralized-economies-and-competitive-markets/2018/07/13#respond Fri, 13 Jul 2018 08:00:00 +0000 https://blog.p2pfoundation.net/?p=71792 In this paper, we explore novel ways for decentralized economies to protect themselves from, and coexist with, competitive markets at a global scale utilizing decentralized technologies such as Blockchain. Article: The Wolf and the Caribou: Coexistence of Decentralized Economies and Competitive Markets. By Andreas Freund and Danielle Stanko. J. Risk Financial Manag. 2018, 11(2), 26... Continue reading

The post Essay of the Day: Coexistence of Decentralized Economies and Competitive Markets appeared first on P2P Foundation.

]]>

In this paper, we explore novel ways for decentralized economies to protect themselves from, and coexist with, competitive markets at a global scale utilizing decentralized technologies such as Blockchain.

Article: The Wolf and the Caribou: Coexistence of Decentralized Economies and Competitive Markets. By Andreas Freund and Danielle Stanko. J. Risk Financial Manag. 2018, 11(2), 26

Michel Bauwens: There are several interesting aspects to this article. First, the authors are from Tata Industries, who are reportedly very concerned about the future of their industrial verticals and actively looking for successor systems. Second, this article is very liberal in quoting the work of the P2P Foundation about the institutionalization of peer production. And third, it examines how our models fit with the emerging possibilities of the blockchain, tokens and programmable organizations, i.e. ‘Distributed Autonomous Organizations’.

However, in doing so, it also does something very problematic in our view, i.e. it assumes that the current wave of crypto projects conforms to this generative, commons-based vision of our economy, while they clearly don’t. While current crypto-based projects are interesting new models, most of them are extractive towards people and the planet, and use the commons and open source in the context of achieving value capture. Without a clear distinction between generative and extractive economic modes, we justify the worst excesses of the current crypto-based economy.

Abstract

Andreas Freund and Danielle Stanko: “Starting with BitTorrent and then Bitcoin, decentralized technologies have been on the rise over the last 15+ years, gaining significant momentum in the last 2+ years with the advent of platform ecosystems such as the Blockchain platform Ethereum. New projects have evolved from decentralized games to marketplaces to open funding models to decentralized autonomous organizations. The hype around cryptocurrency and the valuation of innovative projects drove the market cap of cryptocurrencies to over a trillion dollars at one point in 2017. These high valued technologies are now enabling something new: globally scaled and decentralized business models. Despite their valuation and the hype, these new business ecosystems are frail. This is not only because the underlying technology is rapidly evolving, but also because competitive markets see a profit opportunity in exponential cryptocurrency returns. This extracts value from these ecosystems, which could lead to their collapse, if unchecked. In this paper, we explore novel ways for decentralized economies to protect themselves from, and coexist with, competitive markets at a global scale utilizing decentralized technologies such as Blockchain.”

Excerpt

“Based on available research, decentralized socioeconomic models, also often referred to as a Decentralized Autonomous Organization (DAO) or Commons structure, typically have three main components (Giotitsas and Ramos 2017; Filippi et al. 2007):

  • Entrepreneurial Common (EC): An EC is the commercial interface with external ecosystems and gives funds it raised from selling goods and services or other activities such as investments in other ecosystems in the form of tokens to the For-Benefit Common (FBC) and receives goods & services to market and sell from the Production Common (PC) in return. This requires exchange between an EC token and Fiat and a PC token that is governed by the FBC. Tokens generally represent a unit of value as defined by the participants of a DAO, and there may be many tokens within a DAO. In addition, the EC is responsible for financial and monetary policy in the DAO since issuing a token is effectively creating a currency with all the accompanying complexities. We will discuss this in more detail when we discuss our proposals for the coexistence of decentralized economies and competitive markets;
  • Production Common (PC): A p2p group that produces goods and services collaboratively based on the purpose of the ecosystem as established in the FBC. A participant’s contributions are valued in PC tokens which can be exchanged to EC tokens or other tokens through an exchange utility, as detailed out in one of our four proposals below. Assets created in this common are held in common by the FBC with claims rights by the contributors based on their value contribution to the asset in order to enable a fair sharing of value generated both commercially and through reputation;
  • For-Benefit Common (FBC): The FBC is the governance common that is responsible for setting the DAO vision and impact goals, sets consensus rules and incentives for the DAO commons, sets the exchange rules for the EC and PC Tokens within the commons and externally to other ecosystem tokens and fiat, sets the ownership/membership and sharing rules for the DAO commons, defines and enforces reputation also in relation to non-DAO reputation measurement and management models, sets collaboration and giving rules with internal and external entities, and acts as the interface to not-for-benefit entities etc.

This three-zone model is designed to

  • Insulate the economically vulnerable FBC and PC from extractive external markets through the EC commons by limiting token exchanges between the common markets that have direct interfaces to competitive markets.
  • Enable social impact results through the FBC without a strong dependency on market results. The FBC decides the use of funds coming from the EC. As a result it is independent of “shareholder value” as defined by external and extractive markets; therefore it is accountable to the PC and EC participants.
  • Allow the EC to focus on raising funds for both the FBC and PC, either through selling products and services or raising funds for future products and services and social impact efforts.
  • Enable the PC to focus on core competencies to create new products and services aligned with the overall DAO values, independent of the EC.

It is worth noting that decentralized economies as described above typically have three characteristics in common (Commons Transition Primer 2018, Commons-Based Peer Production Directory 2018, 2017):

  • Open Value Accounting: An accounting system that allows one to record not only tangible assets and assess their value in a unit of value measure, but also to record tangible and intangible value contributions from participants to an asset and subsequent value translation into a unit of value measure such as a token in a decentralized socioeconomic model;
  • Decentralized Commons Market: A decentralized marketplace for the free exchange of assets and services, governed by business rules established by a governing commons through participant consensus. The decentralized marketplace is transparent and has verifiable marketplace transactions. To motivate participation, a decentralized commons market (DCM) has an incentive model for both tangible and intangible value contributions, and open asset-ownership representation through a set of defined unit-of-value measures such as tokens. The rules for DAO membership and voting are normally based on consensus processes;
  • DCM Reciprocity: Reciprocity in this context means that the return on investment beyond a certain value is capped, but not frozen, by requiring reciprocity contributions to the DCM in the form of tangible or intangible value contributions that equal or exceed the excess returns. Example: if a 100-token investment in a DCM asset returns more than, say, 10 tokens, the actor must make a value contribution (after applying the token-to-utility-token exchange rate) equivalent to 1 utility or reputation token for every token earned beyond the first 10 (see the sketch after this list).
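A small sketch of that reciprocity rule, under the stated assumptions (a cap of 10 tokens and a 1:1 token-to-utility-token exchange rate; the function name is hypothetical):

```python
# A minimal sketch of the reciprocity rule, assuming a cap of 10 tokens
# and a 1:1 token-to-utility-token exchange rate. The function name is
# hypothetical.

def required_reciprocity(total_return, cap=10, exchange_rate=1.0):
    """Utility/reputation tokens owed for returns above the cap."""
    excess = max(0, total_return - cap)
    return excess * exchange_rate

# An actor whose 100-token investment returns 25 tokens keeps all 25,
# but owes a 15-token value contribution back to the DCM.
print(required_reciprocity(25))  # 15.0
print(required_reciprocity(8))   # 0.0 -- returns under the cap owe nothing
```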

As we will see below, this is in stark contrast to competitive markets.

Central to making the three-zone model work is effective governance. There is an evolving literature on the governance of decentralized markets, discussing issues and challenges that create inefficiencies and potential additional costs, as well as benefits and efficiencies.” (http://www.mdpi.com/1911-8074/11/2/26/htm)

Photo by reynermedia

Vinay Gupta returns to Meaning with his biggest vision yet for global systems change

Always provocative, always stimulating: the strategic visioning of Vinay Gupta.

By Emily Yates, reposted from Medium.com

From open source innovation to the vanguard of the blockchain movement; the ‘global resilience guru’ discusses the conflicts, dangers and opportunities of the world to come.

The future we are facing calls for new perspectives, new concepts and new guides. How, then, should we introduce Vinay Gupta — a man who more than any other speaker at Meaning challenges our basic assumptions about reality, and the extent of the problems we are facing? In a simpler era we might have called him an inventor, a philosopher, or a spiritual activist. But all of these definitions are breaking down, and perhaps they must. If we are facing a fourth industrial revolution, then all our beliefs and assumptions are due for a radical overhaul.

We first heard from Vinay at Meaning 2012, where he began with a nod to his reputation for apocalyptic thinking — identifying himself as a ‘merchant of doom’ confronting a whole spectrum of ‘plausible utopias’. If the source of our creativity is to be found in our limitations, then Vinay draws his from worst case scenarios; unafraid to depict the likely trajectory of climate disaster and hypercapitalism. I caught up with him last month to inquire about his current outlook.

“The world is dying, and we have a 30% chance of making it through the end of this century. Certainly, we’re likely to see a capitalist famine in which maybe a few hundred million or a few billion starve to death. The first time that global warming gets heavily intersected with the food supply is going to be a massive termination event. And everybody is going to turn around and say ‘Oh my god this is terrible, we never saw it coming!’”

In the five years since Vinay shared his hexayurt housing project with Meaning, the stakes have clearly got higher. At the time, Vinay described how the hexayurt’s simple, open source structure could change the game in the housing market, returning the commodity to its use-value and removing the banking and speculation aspects that keep the market artificially inflated and static. In this, as in other disruptive grassroots technologies, he has argued that the creation of abundance (or the removal of scarcity) is the route to breaking industrial stalemates. How much has his hexayurt mission progressed in the interim; given the prospect of looming climate disaster and increasing political volatility in the West?

“At this point, what I’m working towards is trying to redesign how we handle refugees — for example, climate refugees. I came to the conclusion that I’m going to have to do a lot of privately financed, fairly large-scale research and development, so that has taken me into a kind of indirect loop forward which is: go into the markets, make some money in technology, hopefully come back and follow the Elon Musk strategy of ‘pay for the change you want to see in the world’. I tried ‘being the change’ and it wasn’t working all that well, but ‘paying for the change’ — that seems like it might work.

“So, step one is to make about £800 million. Step two is to spend this money setting up charter cities that are designed to accept refugees, and finance the process by having the refugees export goods and services on preferential tax rates; which would basically be a subsidy provided by the first world countries as a way of getting the refugee problem solved. So, you have a jurisdiction where the refugees can export goods into Europe without paying taxes on them, and that encourages foreign direct investment. You basically set up free trade zones for the refugees to be able to take care of themselves; rather than us trying to find the budget to cover 300 million displaced people.”

Hexayurt communities at the Burning Man festival

While on course to realising his vision for the hexayurt project, Vinay has emerged as one of the leading thinkers in the second generation of blockchain, speaking and publishing prolifically on the revolutionary potential of crypto-currencies to cut out the middle man. Now an undisputed pioneer of the smart contracts platform Ethereum, he recently designed the Dubai blockchain strategy as well as presenting his new thesis — the Internet of Agreements — at the World Government Summit. Could Ethereum be the route to the £800 million he needs?

“I certainly ran into capitalism in a really dedicated way three years ago because I figured out that we were just screwed. It is time to run. If I was attempting to run now, I wouldn’t be at the head end of the blockchain as basically a late entrant. I’m in the position that I’m in because I started running early enough that I got a good position as I ran into the system. If you wait too late it’s quite hard to get a decent position inside of the next round. So, the awareness landscape is basically a sort of a stress network — I look in society for the places where stress has accumulated and I use that map to position myself forward, because I’m carrying this hexayurt thing. It’s going to require the investment of enormous sums of money to build hexayurt cities and then hexayurt countries for the climate refugees. If I get squashed now, none of that is going to get done.”

It seems that we are now entering a cultural explosion around the blockchain. This has come with a large amount of political baggage — with crypto-currencies claimed by libertarians, anarchists and survivalists as a revolutionary tool to break free of both the state and existing markets. I was interested to know to what extent Vinay would agree with their creed — that decentralisation is the key to political liberation:

“I’m running around with a view of the future which is far more realistic than almost anyone else in the blockchain space has. Therefore I’m continually three or four steps ahead because I don’t believe that decentralisation is utopian. I don’t think it’s going to produce a better world at all. Centralisation can be the FDA ensuring you don’t have dioxins in your food. Decentralisation can be people marrying their thirteen-year-old cousins in rural Utah. This all cuts both ways. There is getting it right and there is any particular given political dogma. And all of the political sides are wrong — all of them are wrong.

I think accountability could produce a better world, and you could get accountability from blockchain; but decentralisation in the mode that people are currently practising it is simply hypercapitalism with another set of fangs. I also believe that the state is not going anywhere because the nuclear weapon stockpiles are exactly the way they were when we started and they’re not going away. So, at that point whatever we’re building is going to end up interfacing with the state. I have a fundamentally different view of where cryptography fits into the future — and I take the risk of terrorists using this stuff completely seriously. These are all fundamentally anathema to the vast majority of people in the blockchain space. They think you’re going to get full decentralisation, they don’t want to think about the black state and its weapon stockpiles, they absolutely don’t want to think about environmental constraints. It’s just a ‘yeah it’s all going to work out’ kind of future. But it’s not all going to work out. It might work out for well-armed white people in rich countries, but it’s certainly not going to work out for everybody else.”

With his arguments for post-scarcity economics, Vinay has also become associated with ‘left-accelerationism’ and the development of simple, open source technologies — even setting down principles for how ‘open source appropriate technology’ should be ethically approached. His hexayurt falls into this category, along with water filters and solar panels; commodities with an economic rationale of “the lowest investment for biggest increase in quality of life”. Where this kind of technological progress is emancipatory, ‘right-accelerationism’ is considered its technocratic counterpart; further intensifying the concentration of wealth under capitalism. I asked Vinay if this is still a battleground on which he wishes to fight:

“I’m going to get back to that stuff in ten years if I’m still alive. The ‘if I’m still alive’ is important, right! Of the 1960s generation of leaders, the vast majority were dead by the 1990s. The Alan Watts and all the rest of that kind of crew, even the Robert Anton Wilsons of the world — he was broken down to a shadow of himself by the time the 90s came round. Over and over and over again we lose the top end of leadership because they just get crushed in history. People just stick to their guns and they carry the weight until the weight crushes them.

The mind-set should not be one of ‘stick to your guns, die with your boots on’. Every generation has tried that approach and it’s been completely ineffective. The activists keep getting suckered into that trap again and again — this is spiritually right, this is spiritually wrong, we’re only going to do the spiritually right — then they get materially broken and they get shoved off the wreck. Another generation of totally inept youngsters then stands up as the next round of spiritual leadership and then gets the shit kicked out of them again in the next round. Armies that go into battle with no general will lose, and the generals are dying in the streets — twenty years too young to actually have any real effectiveness. Forty-five is the age that you begin to enter structural power, and for the most part the hippy leadership never made it that far. It’s a recurring, inter-generational cycle.”

It seems significant that Vinay evokes the activism of the 1960s, and I get the feeling that it’s a cultural trajectory he’s spent a lifetime thinking about; one that could not be better expressed than by one of his favourite literary passages, Hunter S Thompson’s description of ‘the wave’ from Fear and Loathing in Las Vegas. I once heard him quip that the aftermath of the 60s would have been very different if the activists had emerged with something like blockchain. For Vinay, this is a time for building, not fighting.

Our discussion goes on to survey the contemporary political scene — the rise of the right and infighting of the left on both sides of the Atlantic. If activists have allowed themselves to be drained of their effectiveness, could this be because they have too often prioritised sensibility above strategy? I can’t help but raise that old bone of Marxist contention — does Vinay believe that the contemporary focus on identity politics has diverted the left from addressing the more urgent question of resources?

“This is why I think basic income is the next winnable fight — and the proper response to global hypercapitalism. Because you could potentially unite the shattered disparate wreckage of the left and indeed the former middle classes under basic income as a banner in the age of robot socialism. So, after the manufacturing economy is gutted by robots, after the drivers are gutted by self-driving cars, as all that stuff unfolds and the right promises the world to get into power and then completely fails to deliver, there will be an opportunity for large scale renegotiation. So, the objective is basically to keep the powder dry and keep the front line activists safe until that large scale renegotiation occurs. No one really understands the issues, you’ve got to wait until people are actively beginning to push for basic income before you start dropping everything to go and deliver basic income. It’s a waiting game.”

Vinay warns that we might be waiting ten years before the scene is set for the “next round” of activism for basic income. But, if activists are to leave the front lines and reinvent their strategies, is there really nothing to fight for in the meantime?

“The one thing that I think might be worth fighting for is laboratory grown meat. It’s now close enough that a fight for that might be really important. If the lab meat thing works and you wind up with the ability to get the population off cow, it will make an enormous difference to our global warming emissions. Enormous. Bigger than getting rid of cars. It takes all of the land use pressure off nature. So, you get the jungles beginning to grow back, you get the English countryside beginning to come back — you get a huge restoration of natural systems because you’re no longer grazing everything in sight to turn it into hamburgers, because the hamburgers are coming out of an enormous factory on the far side of Dundee for a pound a kilo! So, I actually think that beating the hell out of green resistance to lab meat — a ‘tech will save us’ kind of thing — is a really good idea. And getting into the lab meat industry — can you imagine how much money is going to come out of lab meat? Cutting greenhouse gas emissions by maybe 20%, hugely improving access to protein in the developing world, saves the lives of untold millions of cows by simply failing to have them exist. It’s something where the culture gets all up in arms about it, you can imagine the farming lobby now. But if they ban it we are screwed, because it’s the next big shift we could make technologically that could protect the ecosystem from our stupidity.”

The lab meat question is exactly the kind of pressure point that Vinay Gupta likes to home in on, for the extent to which it challenges our comfort levels and asks us to think through our contradictions. He will regularly remind you that it is impossible to confront the future without also tearing up your sensibilities; and it is clear that this is a deeply held existential position. As an advanced practitioner of Kriya Yoga, Vinay likens the task to the ancient principles of Tantric philosophy: “the continual pursuit of truth over social conformity.”

This November, Vinay will share his experiences at the vanguard of Ethereum — in particular The Internet of Agreements, his thesis on how blockchain can build the future of global trade and co-operation. In approaching how data and commerce should interface with the state in the era of blockchain, he is sure to be fearless in addressing the blind spots created by the blockchain craze; and in deconstructing the belief systems that have so strongly influenced its first wave. As with any topic on which Vinay holds forth — you pigeonhole him at your peril.

You can hear more about Ethereum at the Meaning conference in Brighton, UK on 16 November 2017 — where Vinay Gupta will join a line-up of diverse speakers exploring the role of business in creating a more sustainable, equitable and humane world. Find out more via the event website.

Photo by lotus8

Holo: The evolution of cloud computing

If you’re looking for good, accessible resources on Holo and Holochain, you’ve come to the right place. Up above you’ll find a video presentation by Nancy Giordano (the slides are below). Additionally, we’re republishing a post by Matthew Schutte on Holo’s impressive potential.

Holo: The evolution of cloud computing

Matthew Schutte: This is an attempt to communicate Holo in simple, clear language (with a bit of playfulness to keep it entertaining).

We will cover five areas:

1. Holo Value Proposition

2. Why you should Participate

3. Holo Currency Pricing

4. How HOT is related to Holo fuel

5. Matt’s Snarky Takeaway

Holo Value Proposition

Holo is launching a peer-to-peer app hosting marketplace.

Today, application developers usually pay Amazon or some other big corporation to serve their app or website to visitors.

Holo enables anyone to compete with Amazon for this business by offering the spare computing capacity on their own laptop, desktop or other computer. When their computer hosts an application, the developer pays them instead of Amazon.

Just like Airbnb enables people to rent spare bedrooms to help pay their mortgage, Holo enables people to rent their computer’s spare storage and processing power to help pay for their internet access, or even their computer itself.

Holo does to spare computing capacity what Airbnb did for spare bedrooms

And there is actually a LOT of spare capacity out there. In fact, globally, the idle storage and processing power sitting unused in our laptops and desktops dwarfs even the infrastructure of the largest cloud computing company.

Holo will do to that enormous spare computing capacity what Airbnb did to spare bedrooms.

Except, with Holo, you won’t find yourself cleaning sheets all the time. Here, your computer does the work and you reap the reward.

Application hosting is the cash cow of the third most valuable company on earth

And this isn’t a small market. For instance, Amazon is the third most valuable company on the planet, and though their app hosting division, AWS, accounts for just 10% of their revenues, it generates more profit than the entire rest of the company… combined. In other words, AWS is the cash cow of Amazon. And they are just one of several gigantic companies in the space. So… yes, hosting is a big business — and getting bigger.

Why you should Participate

If you are trying to decide whether you want to participate in this ecosystem, we can make it even more blunt:

Holo might do to the cash cow of the third largest company on the planet, what Uber did to Taxis.

Except, unlike Uber, with Holo, 99% of the money goes straight to the people whose machines are doing the work. That’s right. In exchange for orchestrating all of this, we take just a 1% cut.

Sound familiar? It might. People have dreamed of this for years. It was even the plot of an HBO show last season. But two new innovations from Holo are enabling us to, as Forbes recently put it, “turn internet fiction into reality.” Those two innovations are Holochain and Holo Fuel.

Holochain

First, Holochain is a new way of running truly peer-to-peer applications that makes it so that my computer doesn’t necessarily have to “store” all of the content in an application in order to be able to serve that content. Instead, my machine can quickly retrieve anything I need right when I need it. It’s “just in time” content delivery. And when my computer then serves that content to a visitor, I get paid.
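A toy sketch of that “just in time” pattern may help: content is addressed by its hash, and a host can serve an entry it never stored in advance by fetching it by key at request time. The in-memory dictionary below stands in for the distributed hash table, and all names are illustrative; a real DHT would route each request to the nodes responsible for that key.

```python
# A toy sketch of "just in time" content delivery: content is addressed
# by its hash, each node holds only a shard of the whole data set, and a
# host fetches missing entries from peers at serve time. The in-memory
# NETWORK dict stands in for many peers' shards.

import hashlib

NETWORK = {}  # hash -> content, standing in for the distributed hash table

def put(content: bytes) -> str:
    key = hashlib.sha256(content).hexdigest()
    NETWORK[key] = content       # in reality, stored on the nodes closest to the key
    return key

def serve(key: str) -> bytes:
    # The host need not have stored this content in advance;
    # it retrieves it by hash right when a visitor asks for it.
    return NETWORK[key]

key = put(b"<html>hello, visitor</html>")
print(serve(key))  # the host serves the page, and would be paid for doing so
```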

Folks around the globe have been building apps on Holochain Alpha (“the adventurer” release) since October. Holochain Beta is coming soon.

Holochain is live now and apps are actively being built and run on it. Holochain gives Holo a competitive advantage by giving it a collaborative advantage. And thanks to the care with which we designed Holochain, the World Economic Forum called Holochain one “of the most integral technology projects” in the blockchain space, pointing out that we “aren’t just putting lipstick on clones of existing projects” but have actually gone “back to the drawing board and created mission-driven roles for coders, entrepreneurs, investors, philanthropists, regulators and policymakers.”

We’ve taken some of the most widely used technologies of the last two decades and combined them in a novel way. Holochain combines the efficient peer-to-peer data storage model (DHT) that BitTorrent uses with the tamper-resistant logs (hash chains) that blockchains use and the agent-centric approach (each agent has their own perspective and signs their own actions) that Git uses. We think of Holochain as an evolution of blockchain, because it solves so many of the problems that have plagued blockchain over the past decade, including scale, speed, cost, adaptability and composability.

One thing to note is that unlike Holo, Holochain itself is not a platform. It is a pattern. Like HTML. We are giving it away to the world for free. It is open source. It does not require web servers. Or miners. Or cryptocurrency. Every user of a particular app runs that app, showing up as both user and host. Holo is making use of this “holochain” pattern and is reaping the efficiency and resilience benefits that result.
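A rough sketch of the agent-centric ingredient named above: each agent appends actions to its own tamper-evident hash chain and signs its own entries, rather than all agents competing to write one global ledger. The HMAC below is a stand-in for real public-key signatures, and all names are illustrative assumptions rather than Holochain's actual API.

```python
# A minimal sketch of the agent-centric pattern: each agent keeps its
# own hash chain and signs its own entries. HMAC with a shared secret
# stands in for public-key signatures purely for brevity.

import hashlib, hmac, json

class AgentChain:
    def __init__(self, agent_id: str, secret: bytes):
        self.agent_id = agent_id
        self.secret = secret          # stand-in for a private key
        self.entries = []
        self.head = "0" * 64          # hash of the previous entry

    def commit(self, action: dict):
        body = json.dumps({"agent": self.agent_id, "prev": self.head,
                           "action": action}, sort_keys=True).encode()
        sig = hmac.new(self.secret, body, hashlib.sha256).hexdigest()
        self.head = hashlib.sha256(body).hexdigest()  # chain the entries by hash
        entry = {"body": body.decode(), "sig": sig}
        self.entries.append(entry)
        return entry

alice = AgentChain("alice", secret=b"alice-key")
alice.commit({"type": "post", "text": "hello"})
alice.commit({"type": "host", "app": "demo"})
print(len(alice.entries), alice.head)  # two signed entries, chained by hash
```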

Holo fuel

Holo fuel. It isn’t actually fuel. It’s for fueling hosting.

Second, Holo fuel is a new crypto-accounting system that enables us to process transactions in parallel, rather than in sequence. That means that Holo can handle millions or billions of simultaneous transactions. Moreover, because Holo fuel is so efficient, we can process transactions for even very small amounts, such as a payment of a penny. And though this might look foreign if you are only familiar with blockchain “token”-based cryptocurrencies, it isn’t exactly untested. We’re applying a centuries-old double-entry accounting system called Mutual Credit. And now we’re getting to use Holochain and cryptography to distribute, secure and extend the capabilities of this tried and true accounting system.
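Since the article names mutual credit, here is a minimal sketch of how such a ledger behaves: no tokens are minted up front, a payment simply debits the payer and credits the payee, all balances always net to zero, and a credit limit bounds how negative any account may go. The limit and account names are assumptions, and a real system would sign and validate each entry cryptographically.

```python
# A minimal sketch of mutual credit double-entry accounting. The credit
# limit and account names are illustrative assumptions.

class MutualCreditLedger:
    def __init__(self, credit_limit=-100):
        self.balances = {}
        self.credit_limit = credit_limit

    def pay(self, payer, payee, amount):
        payer_bal = self.balances.get(payer, 0) - amount
        if payer_bal < self.credit_limit:
            raise ValueError("credit limit exceeded")
        self.balances[payer] = payer_bal
        self.balances[payee] = self.balances.get(payee, 0) + amount

ledger = MutualCreditLedger()
ledger.pay("app_owner", "host", 30)   # an app owner pays a host for serving
print(ledger.balances)                # {'app_owner': -30, 'host': 30}
print(sum(ledger.balances.values()))  # 0 -- double entry always nets out
```

Because each payment touches only the two accounts involved, such entries can be validated independently, which is one plausible reading of how transactions could be processed in parallel rather than forced into one global sequence.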

Holo Currency Pricing

We’ve been compared by others to Ethereum, the blockchain based computing network that is worth hundreds of billions of dollars at present. So we did some benchmark testing. We built and tested several applications so we could see how costly it was to perform computation on Ethereum vs. Holo.

The result: depending on the app, Holo is somewhere between one hundred thousand and one million times more efficient than Ethereum. For more details, check out our benchmarking walkthrough.

So when we decided to pre-sell Hosting services on Holo with an Initial Community Offering, we wanted to accomplish two things:

First, put our money where our mouth is. Second, create a margin gap to enable a two-sided marketplace to emerge.

PUTTING OUR MONEY WHERE OUR MOUTH IS

We have drawn a line in the sand and are offering to host applications WITH OUR OWN COMPUTERS for 10,000 times cheaper than Ethereum.

This makes visceral just how much more elegant the designs of Holo and Holochain are relative to Ethereum and blockchain.

If computing services were cars, for the same amount of money that it would take to buy a Remote Control car on Ethereum, you could buy a real car on Holo. And that car would be a Lamborghini. That is what a 10,000 times price difference looks like ($40 vs $400,000).

CREATING A TWO-SIDED MARKET

Second, it also makes visible that we are UNDER-PROMISING what our network can deliver. We wanted to ensure that there was room for those who step up to participate in Holo as developers and hosts to get rewarded for doing so. The gap between our “100,000x” or “1,000,000x” better benchmark performances and our “10,000x” offer leaves room for hosts to enter the market and underbid our price.

We expect that a competitive market will form, and because it will cost hosts five or fifteen or fifty times less than our price point to provide hosting, other hosts will be able to underbid us and win hosting contracts.

And when those hosts price their offerings competitively in order to attract more business, holders of Holo fuel will likely be able to get two, or ten or twenty times as much computing power as even the price at which we were offering to provide it ourselves.

In other words, that same amount of Holo fuel that would have bought you an RC car on Ethereum starts to deliver two, or five or ten Lamborghinis’ worth of value (not that we’d spend our money stockpiling Lambos, but you get the point).

That creates a win-win-win. A win for hosts. A win for developers. A win for us.

Of course, it won’t exactly be a win for everybody.

Ethereum, for instance. It probably won’t be a win for them. If a competitor enters the market and starts offering similar services for, let’s say, 50,000 times less than Ethereum can provide them, what do you think will happen to demand for its services? And what would a drop in demand for ETH services do to its currency?

How HOT is related to Holo fuel

Because Holo isn’t live yet (our ICO is focused on funding the software development for it), we needed to raise funds through an existing and established channel. Ethereum ERC20 tokens have been the standard way of running an ICO for the last year or two. The Holo Token, or HOT, is an ERC20 token on the Ethereum blockchain and will be redeemable for Holo fuel once Holo goes live. This redemption will happen at a conversion rate of 1 HOT = 1 unit of Holo fuel (HOLO). Holders of HOT will need to redeem HOT for HOLO within 6 months of the launch of the network. You can think of it like a coupon that expires if you don’t redeem it in time.
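As a toy illustration of the redemption rule just described (1 HOT = 1 HOLO, within six months of launch), here is a sketch; the launch date and the exact window length are placeholders, since only an estimate of "Q3 of this year" is given below.

```python
# A toy sketch of the redemption rule: HOT converts 1:1 into Holo fuel,
# but only within ~6 months of network launch. The launch date and the
# 183-day window are illustrative placeholders.

from datetime import date, timedelta

LAUNCH = date(2018, 9, 1)                    # assumed launch date
DEADLINE = LAUNCH + timedelta(days=183)      # roughly six months

def redeem(hot_amount: float, on: date) -> float:
    if on > DEADLINE:
        raise ValueError("redemption window closed; the HOT coupon has expired")
    return hot_amount * 1.0                  # 1 HOT = 1 unit of Holo fuel

print(redeem(500.0, on=date(2018, 10, 15)))  # 500.0 units of Holo fuel
```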

Again, Holo fuel is the utility credit currency that application owners can use to pay for hosting services on the Holo network.

We estimate that the Holo network will go live sometime in Q3 of this year. In other words, we are aiming to launch Holo in July, August or September.

When a host provides hosting services for an app, that app’s owner pays for that hosting using Holo fuel. So if you have an application and want it hosted (served to non-peer visitors) via Holo, you need to buy Holo fuel from somebody so you can pay your hosts. The Initial Community Offering we are currently running is a pre-sale of the currency that will be used in our system. The purchase and redemption of ERC20 HoloTokens is how people are acquiring Holo fuel. After the close of the pre-sale, people will buy Holo fuel from others who have purchased it from us, earned it through hosting, and so on. For more details, see our Green Paper. People holding Holo fuel will be able to use it themselves, or sell it to others who want hosting services, or, if they earned it through hosting, redeem it with the Holo organization in exchange for other currencies.

To be clear, Holochain applications do not need to use Holo when they are just interacting amongst peers (others who are also running the same application).

But not everyone is going to install Holochain on day one.

So how do you reach “the masses” when the masses have not yet installed the new empowering peer-to-peer apps of the future? You let them interact with those apps through a pattern they are familiar with: opening a browser and typing in a URL.

Holo hosts serve out websites to anyone with a browser, thus creating a bridge back to the “old” internet that everyone, even my grandparents are used to by now. (To be fair, Jerry, Jacqueline, George, and Algreta are fairly savvy when it comes to the internet, but I digress).

When a host creates this sort of bridge by hosting on behalf of an application, the app owner pays them, just like that app owner today might pay Amazon Web Services to host their app.

Some describe this hosting process as being similar to the mining that happens in Blockchain. However, whereas “mining” is an arms race to see who can waste the most electricity doing useless work in hopes of winning a lottery, hosting consists of serving applications or webpages on behalf of customers that are willing to pay for that hosting service. It is way more useful, way more cost effective and vastly more environmentally friendly. For instance, the HoloPorts that we have been making available through our top trending IndieGoGo campaign, use about as much electricity as a lightbulb.

Matt’s Snarky Takeaway

For those that don’t want to take the time to understand this evolution of cloud computing, no hard feelings. Seriously. We played with remote control cars when we were kids too. They were fun.

But you might want to ask yourself, “if these folks are on to something, and they do manage to cut the cost of distributed computing by 20,000 or 100,000 or 200,000 times…

do I really still want to be HODLing ETH?”

More info:

Holo Website

ICO Purchasing and Stats

Holochain Website

Holochain Code

More Technical Overview of Holo

Technical Walkthrough of Holochain

 

Blockchain is facing a backlash. Can it survive?

Not so long ago, the internet was hailed as the solution to humanity’s ills. It would shine a light on all corners of the globe, bringing new knowledge and exchange. But growing concerns about fake news, surveillance, cybercrime, social media addiction and monopolised power have tarnished that shine. Without ignoring the internet’s positive impact over the past few decades, these difficulties remind us that a technology-driven utopia – or technotopia – is a fiction. People and governance always shape the use and impact of a technology.

Today’s advocates of blockchain and digital currencies describe the potential for more privacy, transparency, accountability, efficiency and competition in all forms of commerce, finance and bureaucracy. Some see blockchain as providing technologies for democracy itself, from elections to budgeting. While some claims seem overblown or premature, there are already some fascinating applications in the fields of logistics, inventory and supply chain management.

Despite these advances, there has been a growing backlash from opinion leaders as the technology’s drawbacks become better known. Perhaps you’ve heard that Bitcoin alone uses 0.25% of the world’s electricity? Other blockchain systems, such as Ethereum, use similar approaches that require computers to burn electricity unnecessarily. Perhaps you are concerned about the number of accidents, hacks and scams possible in this new space, where the law has not yet found its feet? Or you may have heard that crime and terror networks could use these technologies to transfer funds. Blockchains and digital currencies pose important questions to both their advocates and regulators.

Pioneers in the industry are alert to such concerns and have attempted collective self-regulation. The Brooklyn Project, an industry-wide initiative to support investor and consumer protection, was launched in November 2017.

“By acting responsibly today, we can help make sure we are collectively able to reap the benefits of this powerful technology tomorrow,” explained co-founder of Ethereum Joseph Lubin. The following month, a coalition of cryptocurrency organizations and investors representing $650m in market capitalization established Project Transparency. It seeks to protect investors by enabling more disclosure within the digital currency sector.

These initiatives are welcome, but neither addresses how the technology affects wider society and the environment. If this sector is going to disrupt incumbent organisations (management-speak for people losing their jobs), then the general public will soon ask what the upsides really are.

In recent months, many blockchain projects that are explicit about their social mission have launched. Bflow.io offers a system for reporting corporate sustainability. Alice.si strives for greater accountability from charities. Provenance.org tracks tuna from shore to plate, giving consumers confidence in sustainability. BitLandGlobal is seeking a step change in land registration by the rural poor. The specialist think-tank Blockchain for Good has been established to promote blockchain’s benefits for worthy causes. Nevertheless, on closer analysis, many of these ‘4good’ projects miss a crucial factor — the impact of their code itself.

Is it appropriate for people apparently seeking economic justice and equal opportunity to use a blockchain in which only heavily invested actors receive new tokens? Is it appropriate for those seeking to put a new medium of exchange in the hands of the masses to use a blockchain whose tokens are mostly hoarded by speculators? Is it appropriate for a carbon emissions reduction project to use a blockchain which emits as much CO2 as a small country?

These are not hypothetical examples. Most blockchain projects bolt a purpose onto code and governance systems that were designed without such public interests in mind. Just as it would not be acceptable to clear the ancient Borneo jungle to raise money for homeless orangutans, it should not be acceptable for a project to deploy socially regressive or climate-toxic code.

Fortunately, there is a new wave of mission-driven blockchain projects conscious of their total social impact. Initiatives like Holochain, Faircoin, Yetta and LocalPay explicitly connect their code base to their social cause. Faircoin uses a codebase that requires little electricity and allows the distribution of coins to socially useful projects. Providing the same smart contract functionality as Ethereum, the new Yetta blockchain is intended to be sustainable by design, with the low energy requirements of its codebase moderated further by automated rewards for those nodes using renewable energy. It will also enable automated philanthropy to support the Sustainable Development Goals (SDGs).

Two of the most integral technology projects in this field take a post-blockchain approach. By sharing data and not using a single blockchain, Holochain reduces the energy and time involved, while avoiding being dependent on the decisions of unaccountable groups of computing “miners”, as so many blockchain projects are. Shunning digital tokens entirely, LocalPay runs on code that means its users in more than 300 local communities do not need to purchase or mine a currency to begin transacting. For them, currency is simply a unit that comes into being, for free, when they wish to trade.

These projects aren’t just putting lipstick on clones of existing projects. Their founders went back to the drawing board and created mission-driven roles for coders, entrepreneurs, investors, philanthropists, regulators and policymakers. They designed a technology to fit into an ecosystem, rather than to dominate it. They set up incentive structures for fair contributions and rewards. This generation of “integral blockchain” and digital currency initiatives aligns its codebase and internal governance with positive social and environmental outcomes. These projects strive to be an integral part of a healthy society, rather than ends-in-themselves.

Will blockchain technologies be killed in their infancy by regulators? Will they grow into monsters that consume energy while enabling tax evasion, crime and capital flight? Or could they provide meaningful services to humanity? Greater cross-sectoral dialogue and guidance is needed to help this last scenario emerge.


Originally posted on WeForum.org
