Republished from Medium.com

Dick Bryan, Benjamin Lee, Robert Wosnitzer, Akseli Virtanen*

The mounting literature on cryptoeconomics shows an interesting but also alarming characteristic: its underlying economics is remarkably conventional and conservative.

It is surely an anomaly that many people who have gone outside the mainstream to disrupt and develop new visions of the future and economy so readily adopt the conventions of the ‘dismal science’.

The problem is that orthodox economics blocks the real potential of cryptoeconomics and cryptographically enabled distributed economic-social systems to facilitate the building of a radically alternative politics and economics.

At the core of this view are two realizations:

1. In their money role, cryptotokens can be an alternative unit of account, not just a means of exchange. They can invoke a new measure of value, not just facilitate new processes of trade. As such, they can have a ‘backing’ in the value of the output they facilitate, rather than presenting simply as tools of speculative position-taking.

2. In their ownership role, they can be derivatives (purchases of risk exposure, not just asset ownership) designed so that people risk together, not individually. They can invoke collective approaches to dealing with risk and upside, not individualistic ones: they can enable risking together.

In this review we first follow the different framings of ‘economics’ and their implications for cryptoeconomics. The analysis then explores the notion of ‘fundamental value’: how it is utilised in the cryptoeconomics literature, and how a different framing of ‘economics’ enables distinctive insights into how the idea of token value being founded in ‘fundamental value’ can be creatively developed. Our objective is to show how a critical reframing of economics enables token integrity to be explained and managed.

To help you navigate, here is the contents of what follows:

1. Which economics?

2. The limited working definitions of cryptoeconomics

3. The Hayekian turn: the integrity of market processes

4. Cryptotokens: means of exchange or units of account?

5. The valuation of cryptotokens

6. Developing the ECSA unit of account

A. A historical digression, but of some significance

B. Contemporary lessons from the historical digression

7. Once more on fundamental value: the ECSA approach

8. The derivative form: buying exposures to an exponential future and a big put

1. Which economics?

Economics is a broad and contested discipline. It is also an old one: Adam Smith’s Wealth of Nations is almost 250 years old, and Karl Marx’s economics 150 years old. Its dominant discourse is ‘neo-classical’ economics, dating from the late 19th century. It has, of course, evolved significantly over the past century, but the currently dominant thought, directly based in that neo-classical turn, remains relatively coherent. It is dominated by an orthodoxy, quite unlike the rest of the social sciences, which are constituted in theoretical and methodological debate.

The dominance of neoclassical economics is not unchallenged. There are criticisms from both the ‘right’, in the name of libertarianism (e.g. Hayek) and from the left (both a statist left like Keynesianism and an anti-capitalist left like Marxism). In that sense it is hardly surprising that there is no definition of ‘economics’ that is generally agreed.

Here are a couple of ‘standard’ definitions, of the kind found in Economics 101 textbooks, that to some degree cover multiple positions in these debates:

1. Economics is the study of allocating scarce resources between alternative uses.

This is a definition that points to price formation and decision making. It opens up agendas of optimisation. It is framed to privilege ‘microeconomics’ (the workings of particular markets), not ‘macroeconomics’ (the totality of economic processes, understood as more than the sum of market processes).

Here is another one:

2. Economics is the study of the processes of production, distribution and consumption of goods and services.

This is a definition with wider social meaning and context. It is not specifically about markets and certainly not focussed on optimisation. Its focus is on the social and on systems, not the individual. It is more likely to be historical and less mathematical than the economics created under the first definition. It implicitly acknowledges that the economic is difficult to disentangle from other facets of social life.

By contrast, the first definition isolates ‘the economic’ to a greater extent by focussing on price formation and decision making. In most of the 20th century, this was achieved by treating economic agents as autonomously rational, later enhanced by game-theoretic strategic rationality and later still challenged somewhat by propositions of ‘systematic irrationality’ (behavioural economics). These developments have enabled economics to become mathematically advanced and subjectable to formal modelling.

Both definitions have something to say to the token economy community, where we can recognize concurrently a potential epochal change in the way of doing economic activity and the new potential of mathematical modelling. But the two emphases need to be kept consciously in balance.

Perhaps this recognition is part of the success of Ethereum, where we can see each style of economic focus in play.

Ethereum inventor Vitalik Buterin, consistent with the first definition of economics, has defined cryptoeconomics as about:

  • Building systems that have certain desired properties
  • Using cryptography to prove properties about messages that have happened in the past
  • Using economic incentives defined inside the system to encourage desired properties to hold into the future.

Ethereum developer Vlad Zamfir, embracing more the second definition (and citing Wikipedia), says that cryptoeconomics might be:

“A formal discipline that studies protocols that govern the production, distribution, and consumption of goods and services in a decentralized digital economy. Cryptoeconomics is a practical science that focuses on the design and characterization of these protocols.”

But it is apparent that, in the broad scoping of cryptoeconomics, it is the first definition that is the focus. Sometimes called ‘token engineering’, it scopes systems of incentives that can be applied to ‘rational’ agents. The cost of this framing is both to limit the social and economic significance of cryptoeconomics and to create a greater possibility of token failure in practice.

2. The limited working definitions of cryptoeconomics

Specifically, the focus in cryptoeconomics on reducing transaction costs and creating individual incentives to operate optimally is in danger not just of neglecting wider social issues of production, distribution and consumption of goods and services, but of building a framework that makes systematic engagement with those wider issues impossible.

If you google cryptoeconomy/cryptoeconomics, the sources that appear have a remarkable consistency. The various blogs/primers/newsletters start with almost the same sentence. They break the term ‘cryptoeconomics’ into its two component elements. They explain the processes of cryptography with some precision, but when it comes to explaining the associated economics, the depiction is remarkably narrow. For example:

“Cryptoeconomics comes from two words: Cryptography and Economics. People tend to forget the “economics” part of this equation and that is the part that gives the blockchain its unique capabilities. . . .Like with any solid economic system, there should be incentives and rewards for people to get work done, similarly, there should be a punishment system for miners who do not act ethically or do not do a good job. We will see how the blockchain incorporates all these basic economic fundamentals.” (Ameer Rosic’s ‘What is Cryptoeconomics: The Ultimate Beginners Guide’.)

Similarly:

“Cryptoeconomics . . . combines cryptography and economics in order to create huge decentralized peer-to-peer network. On the one side, the cryptography is what makes the peer-2-peer network secure, and on the other side, the economics is what motivates the people to participate in the network, because it gives the blockchain its unique characteristics.” (Introduction to Cryptoeconomics through Bitcoin)

The limited framing of economics is, perhaps, because the world lacks people with a background in both cryptography and (a broad) economics. Cryptoeconomics is, perhaps, frequently being projected by people who are highly qualified in programming and engineering, but often self-taught in economics. We thought it was funny when Nick Szabo tweeted some time ago about economists and programmers:

“An economist or programmer who hasn’t studied much computer science, including cryptography, but guesses about it, cannot design or build a long-term successful cryptocurrency. A computer scientist and programmer who hasn’t studied much economics, but applies common sense, can.”

In a sense he is absolutely right, but on the other hand, you do not create anything new from doxa (common sense); you just repeat the same. The idea that the economy (society) is common sense will create an economy that looks like a computer, taking the existing power structures as given. In good economics, the issue of power, who holds it and how its use is governed, is the key issue.

And it’s not just the bloggers and tweeters who advance this simple economics. Within the academy, there is the same sort of emphasis emerging, including from qualified economists.

The MIT Cryptoeconomics Lab presents a couple of papers that centre on transaction costs and networking. For example, in “Some Simple Economics of the Blockchain” Christian Catalini and Joshua Gans contend as their central proposition:

“In the paper, we rely on economic theory to explain how two key costs affected by blockchain technology — the cost of verification of transaction attributes, and the cost of networking — change the types of transactions that can be supported in the economy.

. . . The paper focuses on two key costs that are affected by blockchain technology: the cost of verification, and the cost of networking. For markets to thrive, participants need to be able to efficiently verify and audit transaction attributes, including the credentials and reputation of the parties involved, the characteristics of the goods and services exchanged, future events that have implications for contractual arrangements, etc.”

The Cryptoeconomics research team at Berkeley is another example. Zubin Koticha, Head of Research and Development at Blockchain at Berkeley, begins his ‘Introduction to blockchain through cryptoeconomics’ like this:

“Although Bitcoin’s protocol is often explained from a technological point of view, in this series, I will convey the incentives existing at every level that allow for its various comprising parties to interact with cohesion and security. This study of the incentives that secure blockchain systems is known as cryptoeconomics.”

It is important to be clear here. Our objective is not a critique of these specific contributions: they may well be exemplary expositions within their chosen agenda. Our objective is to say that if we limit the conception of cryptoeconomics to these framings, then we can imagine and theorise cryptoeconomics only in the language and grammar of optimized individual transactions and incentives. Programmers too should understand what that means. The issues of production, distribution and consumption of goods and services — the bigger picture issues — slide off the analytical agenda. They can’t even be expressed in this grammar.

3. The Hayekian turn: the integrity of market processes

For some, this slide is most welcome, for they see the world in terms of interacting individuals, and markets as both an efficient and a moral mode for individuals to engage. If we attach an economics and philosophy to it, the most obvious is Friedrich von Hayek. Hayek was a relatively marginal figure in economic theory and policy until his ideas were embraced by UK prime minister Margaret Thatcher. Hayek was an admirer of markets and prices as modes of transmitting information, arguing that they generate spontaneous self-organization. He was also an advocate of a limited role for government in money creation and management, and in social policy too, citing what Milton Friedman later depicted as the ‘tyranny of the majority’ as the danger of government interventions. In 1976 he published a book called The Denationalization of Money, arguing that governments mess up money systems when they intervene, and that we would be better off with private, competitively driven monies.

There is certainly a strong tradition in the blockchain community that would confirm this Hayekian view. But it is important that we do not fall into this discourse by accident. It is not the role of this text to debate this or any specific philosophy of economics; the point is that there is a form of Hayekian economics, with its appeal to individuals and incentives, that seems to resonate with people in cryptography. But there are more complex, detailed versions of this theory that are not reducible to these populist framings. Recall in this context that while Hayek was an opponent of state money, he did not at all advocate that money should be freely issued. He believed that money should reflect the ‘real economy’, and that its quantity and value should be tied to it. In the 1930s and 40s debates about the post-WWII global monetary system, Hayek, following von Mises and others, argued against the Keynesian proposal for a state-backed global money. The alternative he supported was that the system should be backed by reserves of basic commodities (lumber, coal, wheat, etc.). This requirement seems to be ignored by many cryptoeconomic commentators who invoke the relevance of Hayek to advocate non-state ‘currencies’ without material backing. Yet the issue of token backing is very important, and we consider it a little more below.

For the non-Hayekians, there remains the option of a tradition of neo-classical economics that embraces optimisation, transaction costs, and incentives, but also pays more attention to the limitations of market solutions. A significant number of Nobel Prizes for Economic Science in the past 30 years have been awarded for engagement with these sorts of problems. It all points to the proposition that markets do not work in a simple, idealised way.

Neoclassical economists identify two broad limitations of market solutions. One is ‘imperfect markets’, where the capacity to secure forms of control over a market generates returns above the norm. Historically, this issue has focussed on the inefficiencies of monopolies and oligopolies. More recently, attention has been paid to asymmetrical information, and especially the fact that sellers generally know more about a commodity than buyers. (Joel Monegro’s ‘Fat Protocols’ is in this tradition, engaging what sorts of control at what point of the stack/value chain generate the best long-term returns.)

The other factor in the neo-classical approach is the condition of ‘market failure’: where markets cannot effectively allocate prices because collateral costs and benefits are not borne by individual producers and traders. Hence there is in neo-classical economics greater engagement with the roles of government in overcoming market failure than is found in Hayek, albeit that there is also debate whether the cure is worse than the disease. Whether the computational systems built on smart contracts can significantly diminish market failure stands as a moot point.

There is one more recent point of challenge to neo-classical economics that is relevant to cryptoeconomics. It is work brought to prominence by Michel Callon and, in English and in relation to finance, by Donald MacKenzie, who describe economic models as ‘performative’: essentially, they make the world, they don’t describe it. This approach casts economics as a prescriptive discipline, operating in the domain of ‘ought’ statements rather than ‘is’ statements. It warrants mentioning here because it already hints at the potentiality of cryptoeconomics: when used more radically (not only to repeat the most orthodox economic beliefs), as we try to show below, cryptoeconomics opens up the economy itself as a design space. We need to recognize the complexity of social and economic dynamics in token design, and make sure that the ‘social’ receives as much analytical attention as the formal, technical issues. If we design ‘ought’ systems that understate the complexity of the social, or do not understand their effects, it is likely that governance processes will be inadequate.

Much of the rest of economics covers a wider range of views, but a smaller number of economists. But it is here that the broader, more cultural and socio-historical questions come to the fore.

We want to focus on two broader issues, still very ‘economic’ in framing, that maybe are sufficient to capture the flavour of these broader agendas. They both appeal to broader social perspectives in cryptoeconomic analysis, but embody rather different political agendas.

One comes from treating cryptotokens as not just a new means of exchange (the transaction view) but also a new unit of account (a production and distribution view); the other is played out through debates about the valuation of cryptotokens.

4. Cryptotokens: means of exchange or units of account?

Going back to our definitions of economics, the first one — about optimization, incentives and transaction costs — conceives of tokens as either means of exchange or, in the case of utility tokens, types of commodity futures contracts (rights to future conversion into commodities).

They are that, but they also are more than that, when framed in the context of the second definition of economics. From the perspective of the second definition we can see cryptotokens as providing the possibility for new units of account, and hence new ways to measure the economy.

In the first definition, the answer to the question ‘what counts’ in the economy is answered by reference to the discipline of market calculus. In the second definition, what gets counted as ‘production’ and ‘consumption’ is more open ended. What is counted — and what is valued — opens as a design question.

It has been well established that market criteria are blind to some critical economic processes. Roughly (for it is complex to specify) anything not produced for sale is systematically excluded.

In the mainstream capitalist world of fiat currencies, incorporating these excluded forms of production and consumption has been a virtually insurmountable challenge. There we see the dominance of a culture of production for profit and a history of data collection based on that principle. Beneficial things that do not make revenue are difficult to measure and hence to incorporate.

Of course we are not the first to recognise this limitation. The neglect of household production, both nurturing activities in the home and the economic activities of peasant economies, is a widely recognised limitation. And within neo-classical economics there is debate about how far into wider social analysis the notion of externalities extends (and how they might be priced). In a similar vein, the appeal of ideas like ‘triple-bottom-line accounting’ and ‘ethical investing’ also embraces alternative visions of counting. But, and this is critical, they all presume the ontological primacy of profit-centred measurement: they are critiques of and qualifiers to that system and rarely present alternative modes of calculation.

Cryptotokens enable us to re-open this measurement question. Cryptotokens as means of exchange enable us to trade in new ways. Cryptotokens as new units of account enable us to measure output (what is value and how is it produced) in new ways.

This points us already to one of our key insights, to which we will return a little later: we already know that the next value production layer, in the era of decentralized open source data, has to do with governance (more welcoming and better governed crypto networks will be valued at a premium because of the reliability of social inclusion in their decision making), but also with ways of belonging, ways of sharing stakes, risks and upside. The organization of ‘risking together’, or what we call an economic space, now becomes the actual value creation layer. It is a new value form, very different from the commodity form, that basic economic cell of society which makes social relations between people (for capital is a social relation) appear as mere relations between things. Basically, you will now compete with different community-economy-governances. This explicitly relational, social logic of the new value form is precisely what we try to capture by talking about it as a network derivative below. And it is to express such social-economic organizations that we are working on the Space organizational grammar and development environment (see the ECSA Tech Stack below).

The challenge in the cryptoeconomy is to open discussion about how we understand and measure this broader conception of ‘production’ and ‘consumption’, consistent with our aspiration of incubating not just new ways of organizing, but also the production of new things and new (social, political, aesthetic, organizational, environmental…) relations.

But it is critical here that this imagining of new ways of doing economy and economics is not just fuzzy and feelgood: we need ways to measure and to socially validate these new horizons. It means that we cannot, in the first instance, reduce all forms of production to a monetary price. We can treat monetary price as one index of measurement (for a price is merely an index, since the base unit of measure is arbitrary with respect to the thing being measured), but we will need other indices of production too, targeting measurement of the different ways in which goods, services and intangibles get acknowledged socially — or become socially quantifiable, as Gabriel Tarde, maybe one of the most relevant economic thinkers for crypto, originally put it. Think of measurements of replication, imitation, iteration, social inclusion and recommendation.

In cryptoeconomics, especially where the focus is on wider social agendas, the questions of what to measure and how to measure open up a critical research agenda which will require ongoing attention and resource allocation. It is readily apparent, even in the mainstream of economics and accounting, that the valuation of ‘intangible assets’ is a critical problem. It has always been a problem, for such assets can’t be measured like plant, equipment and real estate; but as intangible capital (brands, intellectual property, etc.) now makes up the overwhelming proportion of the assets of the world’s largest companies (Facebook, Apple, etc.), the lack of appropriate valuation tools has emerged as a conspicuous accounting problem.

Most of the assets inside cryptoeconomics are also likely to be predominantly ‘intangibles’, so the problem is shared. The difference is that new tokens open up possibilities for new codes and agendas —new grammars — of measurement. It is quite conceivable that research on measurement in relation to crypto units of account may turn out to be invaluable to mainstream accounting too. (See Valuation Crisis and Crypto Economy)

Developing alternative measures is not a simple process, and there are certainly challenges, most notably measurement across indices and dealing with gaming of the measurement system. But we believe these challenges are significant and definitely worth taking up. It is important that resources are put into exploring new measurement agendas.

5. The valuation of cryptotokens

This issue is of importance because it gets to the heart of the question: can markets effectively price tokens as more than speculative objects? This was certainly an issue for popular debate in late 2017 when the price of bitcoin spiked. In this context, prominent crypto investor Fred Wilson said:

“In times like this, I like to turn to the fundamentals to figure out where things stand and how I should behave. . . You need to have some fundamental theory of value and then apply it rigorously.”

For those who believe that markets create spontaneous order, the search for something ’fundamental’ is a non-question: price captures all information, it is an expression of supply and demand and finds its own level. So the very act of posing the question of an ‘underlying’ or ‘fundamental’ value is to move outside the Hayekian view. It is to suggest that there is and can be a value to cryptotokens beyond current price. It takes us beyond that first definition of economics in terms of markets and incentives and into some wider, social and historical issues of economics. (See Valuation crisis and crypto economy; and Whose stability? Reframing stability in the crypto economy)

The issue under consideration here is not whether the measurement of ‘fundamentals’ is a good guide to trading strategy in cryptomarkets. (There is a standard debate in trading strategy about fundamentals VS. technical analysis of patterns of price movements. Warren Buffett stands out as an advocate of fundamentals analysis.) Nor is it about developing the capacity to forecast an income and expenditure model for the future, important though this is. (See for example, Brett Winton, How to Value a Crypto Asset — A Model)

The issue here is how we might measure the crypto economy if not by current token price. It matters because fundamental value points to the longer-term viability of a token and the activities that underlie it and potentially gives tokens an integrity which will be recognised in wider capital markets.

Fundamental value is not simply a long-term average price around which the spot price varies. It is a value that can be measured by criteria which link to the capacity of an asset to produce new value. In neoclassical, equilibrium theory, the ‘efficient markets hypothesis’ postulates that long-term market price will spontaneously gravitate to the valuation of this capacity to generate new (future) value. But it is only a view specific to the discourse of neoclassical economics.

The challenge of fundamental value is to find a mode in which to measure the current value of an asset, especially when that value requires projection of the future. It was once a relatively straightforward calculation: a staple (and stable) measure of accounting when manufacturing industry was the ‘model’ corporation and capital was physical (factory sites, machinery, stock). It has become a more challenging issue for accounting since corporate assets became increasingly intangible — intellectual property, brands, goodwill, etc. Some leading accountants claim that the profession is in a crisis because of an incapacity to measure the value of intangible assets. (See Valuation crisis and crypto economy)

So we should recognise that, in a cryptotoken context, the idea of calculating a fundamental value is experimental. We should also be aware that there is a propensity in cryptoeconomics to engage valuation via an over-simplified interpretation of what a token actually is and can do. That is, there may be consensus that tokens are complex, hybrid, novel things. But in the complexity of the valuation process, there is a proclivity to treat them as one thing; in particular as money or as equity, and especially the former.

Some of the debate here occurs via analogy. For example, it can be argued that bitcoin could do the credit card provisioning of Mastercard or Visa, so we could estimate a corporate value for bitcoin based on the value of these companies. (See the critical evaluation of this by Andy Kessler, The Bitcoin Valuation Bubble, Wall St Journal, August 27, 2017). But that doesn’t really work, because the analogies don’t hold. We can’t claim the crypto economy to be different, yet benchmark its value to the assets and processes we seek to disrupt.

Another approach, we think laudable in its desire to capture cryptotokens as derivatives (see more below), contends that the Black Scholes options pricing model can be adapted to explain token values. (See J. Antos and R. McCreanor ‘An Efficient-Markets Valuation Framework for Cryptoassets using Black-Scholes Option Theory’)

The essence of this proposition is that cryptotokens hold exposure to a future of such potentially monumental significance that cryptoassets themselves are call options on the utility value of what the cryptoasset might someday provision. Volatility of token values may then be seen as an efficient reflection of a rational estimation of the probability of realising some real utility value of a future — perhaps distant future — product that results from current cryptoassets.

Framed within the efficient markets hypothesis (Eugene Fama), this approach does bear the critical proposition that ‘efficient’ outcomes are a reflection of some ‘fundamental value’. Outside that assumption, it is not really persuasive as an explanation of fundamental values. What is valuable in the approach, however, is that it focuses on changes in estimated variables, not on the static value of underlying assets. In this way the approach does capture a derivative dimension in valuation (an issue addressed more below).
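
To see what the analogy involves in practice, here is a minimal sketch (our own illustration, not Antos and McCreanor’s actual framework) of the standard Black-Scholes call formula applied to a token, reading S as the present value of the utility the network might someday provision and K as the cost of acquiring and holding the token until then. All parameter values are hypothetical.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """Standard Black-Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Hypothetical numbers, purely for illustration:
# S = present value of the utility the network might one day provision (per token),
# K = cost of acquiring and holding the token until that utility is realised,
# T = years until provisioning, sigma = volatility of the utility estimate.
token_value = black_scholes_call(S=12.0, K=10.0, T=5.0, r=0.03, sigma=0.9)
print(f"Illustrative option value of the token: {token_value:.2f}")
```

The point of the sketch is only to show why, under this reading, high volatility raises rather than destroys estimated token value: a more volatile utility estimate makes the call option worth more.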

Some innovative agendas of measurement are coming via the old quantity theory of money proposition, expressed as ‘the equation of exchange’. The formula states:

MV=PQ

where M = the quantity of money in circulation,

V = the velocity of circulation of money,

P = the general price level in the economy and

Q = the quantity of goods and services sold in the economy.

It is worth spending a little time giving context to this formula, both because of its application in the existing literature on cryptoasset valuation and because it is where ECSA too looks to frame fundamental value, albeit in a way different from current debates.

The equation MV = PQ comes from a long economic lineage, mostly identified with 18th century Scottish philosopher David Hume. It presents the ‘real economy’ on the right hand side and its money equivalent on the left hand side. Its lineage is long, but its functionality in economics is challenged (e.g. is it merely an identity; can it be read causally and if so from right to left as well as left to right?). In late 20th century policy application it has been used to focus on the relationship of M and P, and to argue that states should be passive in economic management, creating just enough money to keep prices stable (with V constant, money supply should expand in proportion to Q) . It became popular in the 1970s economics of Milton Friedman (broadly aligned to Hayek). It was called ‘monetarism’ and contended that state fiscal and monetary expansion were not solving recession, but causing inflation.

Monetarism became central bank policy orthodoxy in many Anglo countries for just a brief period in the early 1980s, expressed as ‘money supply targeting’. It was quickly abandoned and one of the reasons, with new significance for the world of cryptotokens, was that the state’s various definitions of money (cash, trading bank deposits, etc.) moved in different directions: there was no single state money to be targeted. More recently, the non-inflationary impact of US quantitative easing (so far) may be some further indication of the practical limits of the equation in state policy formation. QE also raises the challenge that the equation may only work for goods and services outputs and prices, but not for financial assets. Indeed, it is ambiguous as to which side of the equation liquid financial assets should be located: are they commodities (RHS) or money (LHS)?

With that brief background, let’s take a quick look at the use of the quantity theory of money in cryptotoken valuation.

The initial figure of note here is Chris Burniske, who observes that tokens have both asset and money attributes and are currencies in the context of the programs they support. But in that role they don’t generate cash flow, so they can be valued by discounted future value, with the discounting related not to cash flows but to a projected future fundamental value. So the analysis turns to utility values (Cryptoasset Valuations).

For this purpose Chris re-defines the variables of the equation:

M = size of the asset base

V = velocity of the asset’s circulation

P = price of the digital resource being provisioned (note: not the price of crypto assets)

Q = quantity of the digital resource being provisioned (note: not the quantity of crypto assets).

He then solves for M, which enables an individual token valuation.
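
To make the mechanics concrete, here is a minimal sketch of this style of calculation. It is our own illustration with invented numbers, not Burniske’s actual model (which also discounts a projected future utility value back to the present):

```python
def equation_of_exchange_valuation(p_resource: float, q_resource: float,
                                   velocity: float, token_supply: float) -> float:
    """Solve MV = PQ for M (the required size of the asset base), then divide
    by the number of tokens outstanding to get an implied per-token value.
    All inputs are projections, not observables."""
    network_gdp = p_resource * q_resource    # PQ: value of the digital resource provisioned
    asset_base = network_gdp / velocity      # M = PQ / V
    return asset_base / token_supply

# Hypothetical numbers: a network provisioning 10 million units of a digital
# resource at $0.50 each, tokens changing hands 20 times a year, and
# 1 million tokens outstanding.
per_token = equation_of_exchange_valuation(p_resource=0.50, q_resource=10_000_000,
                                           velocity=20.0, token_supply=1_000_000)
print(f"Implied utility value per token: ${per_token:.2f}")
```

Even this toy version makes the sensitivity visible: double the assumed velocity and the implied per-token value halves, which is why the treatment of V attracts so much of the debate below.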

This is indeed a novel approach and, for good reason, it has opened up a significant debate. But it does have some problems:

  • It finishes up displacing the valuation problem from the token to the valuation of digital resources being provisioned, and the ambiguity of how that is being measured.
  • Moving V to the RHS so as to solve for M turns the equation from being a logical identity (that money values equal commodity values) into a historical proposition of individual variable valuation. (If, for example, velocity doubles, it doesn’t thereby halve the size of the asset base.)
  • In wider discussion there is recognition of the problematic valuation of V (actually, it is the volatility of V). There is a literature addressing the question of velocity. There are debates here, but, as summarised by Alex Evans, its common proposition is that:

“tokens that are not store-of-value assets will generally suffer from high velocity at scale as users avoid holding the asset for meaningful periods of time, suppressing ultimate value.”

A problem with the valuation literature seems to be that it conflates the equity dimension and the monetary dimension of tokens. The fact that (a) tokens will be turned over rapidly because they are not a good store of investor value is a different issue from (b) tokens turning over in their use as a means of exchange inside projects/businesses. The latter matters; the former does not. To give attention to the former would be like saying that the turnover of corporate equities impacts the long-term price of corporate equities. The point: we need to re-think the meaning of velocity in a token world. The underlying issue is that tokens blur the categories of equity and money, and the velocities of these attributes have different drivers.

We are interested in this approach and its criticisms not for the purpose of disproving the approach — for probably any proposal in this domain is somewhat easy to critique. The point is that cryptoaccounting, like mainstream accounting, simply doesn’t have good tools to measure in this domain. But it is an area we should explore, and it requires creativity, such as that shown by Burniske and the debate to which his work has given rise. Indeed, we think that the novelty of cryptoassets gives us opportunities to invent valuation procedures that could well be of real benefit to the mainstream accounting profession.

We have been working intensively on the ECSA token valuation system. It can be posed as an engagement with this debate, returning to the original meaning of

MV=PQ

as the depiction of an economy which balances its ‘monetary side’ (MV) with its so-called ‘real economy side’ (PQ). From an ECSA perspective, the measurement process means that P is too limited a category. We want to treat price as just one index of ‘value’ measurement amongst a range. So we respecify:

MV=I(1-x)Q

where

M = the quantity of tokens issued in the new economic space (ECSA bootstrapped ecosystem of new value forms)

V = the velocity of circulation of tokens within new economic space (ECSA bootstrapped ecosystem of new value forms)

I(1-x) is the range of indices of valuation, of which price is just one

Q is the quantity of output (tangible and intangible) produced in the new economic space (ECSA bootstrapped ecosystem of new value forms).

If we measure the new economic space (ECSA bootstrapped economy) in the way described, we can make simple use of this formula, or at least of its underlying sentiment, both economic and social. For ECSA itself, this mode of measurement gives a means both to define fundamental value and to set some governance agendas.

As an identity, the LHS and RHS are always equal. The monetary policy position of ECSA is that the RHS (the total value of output within the new economic space/ECSA bootstrapped ecosystem) will drive the LHS (token issuance qualified by velocity). In the distributed system of offers and matches ECSA is building, we will be able to develop significant data sets of M (distributed token issuance) and V (where we can distinguish clearly between offers that are about the acquisition of inputs for production and those used for mere token exchange), and we will develop empirical measurement to calculate I(1-x) and P. We think that in the offer-based new economic space token issuance can be governed in a distributed way to ensure MV=IQ, and the economy can be run on non-inflationary tokens. The key here is to concurrently have internal ‘working tokens’ and a ‘market token’ traded on the capital market. The equation of exchange should only apply to the internal ‘working token’, not the ‘capital market token’, for it is the internal token which articulates with production and distribution. (We will return to the issue of how the capital market token and the value graph and its working tokens are bridged in the next texts of this series.)
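
As an illustration only, here is a minimal sketch of what one period of such an issuance rule might look like, assuming a single composite index I and a measured output Q. The function, the damping parameter and the numbers are hypothetical; they are not part of any ECSA specification.

```python
def issuance_adjustment(current_supply: float, velocity: float,
                        index_value: float, output_quantity: float,
                        damping: float = 0.1) -> float:
    """One step of a hypothetical issuance rule: move the working-token supply
    toward the level at which MV = IQ holds, where I is the composite valuation
    index and Q the measured output of the economic space. The damping factor
    limits how fast supply adjusts per period."""
    target_supply = (index_value * output_quantity) / velocity  # M* = IQ / V
    return current_supply + damping * (target_supply - current_supply)

# Illustration with made-up numbers: measured output grows, so supply expands
# gradually rather than letting the index (the 'price level') inflate.
new_supply = issuance_adjustment(current_supply=1_000_000, velocity=12.0,
                                 index_value=1.0, output_quantity=13_200_000)
print(f"Adjusted working-token supply: {new_supply:,.0f}")
```

In the ECSA setting the balancing would be enacted through the distributed system of offers and matches rather than computed by any single agent; the sketch only shows the MV=IQ balancing logic itself.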

A question is, of course, how do we reduce I to a single index number? The answer has two layers. One is that once we reach a critical mass of offers and matchings, market processes will themselves drive valuation. The second layer is that these same offers and acceptances will generate significant data which can then be transformed into indices for valuing goods and services (outputs and performances). How these indices will be compiled is one of our most interesting research areas at the moment — it will come out of an empirical processing of large data; it cannot be known a priori; nor should it be presumed to be fixed in value.
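
As a hedged illustration of what such empirical compilation might involve: the metric names and weights below are invented, and in practice they would be derived from offer and match data rather than fixed a priori.

```python
from statistics import mean, pstdev

def composite_index(observations: dict[str, list[float]],
                    weights: dict[str, float]) -> float:
    """Combine several valuation indices (price being only one of them) into a
    single composite value. Each index is z-score normalised against its own
    recent history so that no single unit of measure dominates; the weights are
    placeholders for what would be learned from offer/match data."""
    composite = 0.0
    for name, series in observations.items():
        mu, sigma = mean(series), pstdev(series) or 1.0
        latest_z = (series[-1] - mu) / sigma
        composite += weights.get(name, 0.0) * latest_z
    return composite

# Hypothetical indices for one output over recent periods:
observations = {
    "profit_price":     [1.00, 1.10, 1.05, 1.20],
    "replication":      [40, 55, 80, 130],        # e.g. copies, forks, reuses
    "social_inclusion": [0.30, 0.32, 0.40, 0.50],
}
weights = {"profit_price": 0.3, "replication": 0.4, "social_inclusion": 0.3}
print(f"Composite valuation index: {composite_index(observations, weights):.2f}")
```

The substantive question, of course, is which indices and which weights; as said above, that cannot be settled a priori and will come out of empirical processing of the data.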

There are some elements in this proposition that warrant further explanation. They will be taken up in a later section, but in the current context the critical point is that we believe the big data generated by agent offers and acceptances provide a critical information source for framing fundamental value within a (broadly) MV=PQ framework.

‘Big data’ is in contemporary society often depicted, and rightly so, as socially intrusive and manipulative. But we think that in a token economy context of transparency and decentralized open source data, it is critical for organization and inclusiveness. And even further: without a reliable runtime and a grammar that can help us navigate and operate this space, and build knowledge derivatives (indexes) of it, we are today without politics and economics, incapable of speaking of, grasping, and intervening in the processes and future of our life. This is the technology development task we are taking on in ECSA (See the ECSA Tech Stack).

 

We’ve always been impressed by Galileo Galilei, Leonardo da Vinci and other renaissance scientists who invented the experiments and instruments to navigate, understand and measure the newly opening space-time reality — the microscopes and telescopes to reveal micro- and macrocosms, inclinometers to determine latitudes, thermoscopes to show changes of temperature, barometers to reveal atmospheric pressure, nautical instruments, experimental methods to understand invisible phenomena, velocity, acceleration, gravity. We will need to do the same now for the new economic space-time. Just like the birth of perspective in art, we will introduce perspective into the economy. It will be a renaissance.

We will return to the fundamental value issue shortly, for how fundamental value is to be indexed depends on the measuring role, and the governance role, it is called on to play within a token system. The focus is therefore on the requirements of the new economic space, but we get there in a way with ramifications for wider token systems.

Galileo’s original telescopes and the lens he gave to Medici, at the Museo della Storia della Scienza in Firenze (Photo by Akseli Virtanen)

6. Developing the ECSA unit of account

Part A: A historical digression, but of some significance

The problem for our radical measurement proposal is that all the language of money, markets, prices and profit is dominated by a grammar that equates money with the state, markets with a profit-centred mode of calculation, and profit with a surplus defined by reference to the extraction of individual benefit. Cryptoeconomics has been strong in challenging the first of these, but less effective with the latter two. They need to be challenged too.

John Maynard Keynes, the economist most associated with the principles of state issuance and management of fiat money in advanced capitalist economies, said in his 1930 Treatise on Money:

“The age of chartalist or State money was reached when the State claimed the right to declare what thing should answer as money to the current money of account — when it claimed the right not only to enforce the dictionary but also to write the dictionary. Today all civilised money is, beyond possibility of dispute, chartalist.”

Ninety years on, cryptotokens are the counterfactual to this proposition, but Keynes’ proposition about the State writing the dictionary is right. Part of cryptoeconomics is to challenge the dictionary: to open up new ways of thinking money and price.

That challenge is broad, but in this context, let us go to Keynes and Hayek, and read them via the innovation of cryptotokens. The object is quite specific: to note how certain of their categories that are now assumed ‘theoretical’, indeed axiomatic (the ‘dictionary’), are actually historically specific and contingent and should be challenged in the light of cryptotoken development. Moreover, within each of these significant economists, we can find the derivative dimension that is repressed in the interests of conveying a culture of theoretical certainty.

We start with Keynes.

His central proposition, conceived in the years either side of the Great Depression, was that nation states must manage national economies, for markets will not gravitate to full-employment stability. Central here is the idea that the state defines money and closely manages the financial system. In Chapter 17 of the General Theory he challenged his own premise and introduced the hypothetical idea that money may not be unique in its economic characteristics:

“The money-rate of interest — we may remind the reader — is nothing more than the percentage excess of a sum of money contracted for forward delivery, e.g. a year hence, over what we may call the “spot” or cash price of the sum thus contracted for forward delivery. …Thus for every durable commodity we have a rate of interest in terms of itself, — a wheat-rate of interest, a copper-rate of interest, a house-rate of interest, even a steel-plant-rate of interest.… Money is the greatest of the own-rates of interest (as we may call them) which rules the roost.”

Keynes, in essence, depicts money as the greatest own-rate of interest because (a) it does not itself produce a use value (like, say, wheat or copper), so it does not get diverted to those uses (it is exclusively ‘money’); (b) there is no issue of wastage; (c) it is the most liquid asset; and (d) its quantum is managed.
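
Keynes’s definition lends itself to a small worked example (our own, with invented numbers): the own-rate of interest of any durable commodity is simply the percentage excess of the quantity contracted for forward delivery over the quantity it exchanges for spot, measured in the commodity itself.

```python
def own_rate_of_interest(spot_quantity: float, forward_quantity: float) -> float:
    """Keynes's own-rate of interest for a durable commodity: the percentage
    excess of the amount contracted for forward delivery over the amount it
    exchanges for spot, measured in the commodity itself."""
    return forward_quantity / spot_quantity - 1.0

# Illustrative, invented numbers:
# 100 bushels of wheat today exchange for a claim to 102 bushels a year hence,
# while $100 today exchanges for a claim to $103 a year hence.
print(f"wheat-rate of interest: {own_rate_of_interest(100, 102):.1%}")
print(f"money-rate of interest: {own_rate_of_interest(100, 103):.1%}")
```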

Two things are interesting here. First, these criteria identified by Keynes as integral to the ‘greatness’ of (state) money do not persuasively apply today: indeed, all financial derivatives have the liquidity and fungibility of state money, and cryptotokens are not constrained by nation- (or group-of-nation-) specific acceptability.

Second, Keynes uses the language now associated with derivatives to depict the rate of interest. Money is the underlying of which interest is the derivative. And for Keynes, money is axiomatically state money. Cryptotokens are in this context a put option on the state: the right to sell out of the state’s unit of account.

For Hayek, the origins of his thinking on the social and economic virtues of market processes lie in an early-to-mid-20th-century debate with advocates of Soviet-inspired and other variants of central planning. It is known as the Socialist Calculation Debate. Hayek, following von Mises, argued that central planning, even at its best, has a range of insensitivities to detail: it can only work with highly aggregated, and outdated, data and impose these generalized decisions on individuals. The market, on the other hand, runs by processing decentralized information. It can synthesise complex forms of social and economic information into a single index, enabling economic relations to be conducted in simple and orderly processes.

It is worth quoting Hayek at some length, because what he says resonates with the capacities of a cryptoeconomy:

“It is in this connection that what I have called the “economic calculus” proper helps us, at least by analogy, to see how this problem can be solved, and in fact is being solved, by the price system. Even the single controlling mind [the central planner], in possession of all the data for some small, self-contained economic system, would not — every time some small adjustment in the allocation of resources had to be made — go explicitly through all the relations between ends and means which might possibly be affected. It is indeed the great contribution of the pure logic of choice that it has demonstrated conclusively that even such a single mind could solve this kind of problem only by constructing and constantly using rates of equivalence (or “values,” or “marginal rates of substitution”), i.e., by attaching to each kind of scarce resource a numerical index which cannot be derived from any property possessed by that particular thing, but which reflects, or in which is condensed, its significance in view of the whole means-end structure. In any small change he will have to consider only these quantitative indices (or “values”) in which all the relevant information is concentrated; and, by adjusting the quantities one by one, he can appropriately rearrange his dispositions without having to solve the whole puzzle ab initio or without needing at any stage to survey it at once in all its ramifications.

Fundamentally, in a system in which the knowledge of the relevant facts is dispersed among many people, prices can act to coördinate the separate actions of different people in the same way as subjective values help the individual to coördinate the parts of his plan.”

So price is the condensation of a multiplicity of determinations (to borrow from Althusser). The market can incorporate and process all different forms of information (create knowledge) to create spontaneous order.

“The most significant fact about this system is the economy of knowledge with which it operates, or how little the individual participants need to know in order to be able to take the right action. In abbreviated form, by a kind of symbol, only the most essential information is passed on, and passed on only to those concerned. It is more than a metaphor to describe the price system as a kind of machinery for registering change, or a system of telecommunications which enables individual producers to watch merely the movement of a few pointers, as an engineer might watch the hands of a few dials, in order to adjust their activities to changes of which they never know more than is reflected in the price movement.”

F.A. Hayek, ‘The Use of Knowledge in Society’, American Economic Review, XXXV, no. 4, 1945, pp. 519–30.

This 1940s advocacy of ‘the market’ may stand strong as an alternative to 1940s central planning, but 70 years on the argument is and should be different. There are now ‘big data’, access to vast amounts of information to inform individual decisions, and computational capacities to process this information instantly. The ‘imperative’ to have complex variables reduced to ‘price’ no longer holds. Decentralized decision making does not have to articulate simply via price formation in markets. Indeed, one potential of cryptomarkets is to challenge the use of Hayekian price as the decentralized object of calculation.

Hayek says price embodies complex information — it creates knowledge of society — and its great functionality is that it is a simple representation of that complexity. Blockchain and cryptotokens present us with other ways of processing complex information. To get to this 21st century engagement, we can frame Hayek’s analysis in the context of risk and derivatives. There are two dimensions here.

  • In the era of blockchain and big data, and in the language of Gilles Deleuze, we can dividuate knowledge: break it down into its underlying, determining elements (that Hayek thought were too complex to code), but without necessarily aspiring to see those elements combined so as to ontologically privilege the totalised category of ‘knowledge’. Knowledge is a synthetic asset; an assembly of information. Its purpose does not have to be the formation of market price.
  • It follows that, in the era of derivatives, we can think of price as itself a derivative on those underlying forms of information of which price is said to be the condensate. In Hayek’s analysis of ‘The Price System as a Mechanism for Using Knowledge’, ‘price’ is really the strike price on the option on a synthetic asset called ‘knowledge’. (Individuals in this framing hold out-of-the-money options where they are priced out of the market and the return on in-the-money options is what neoclassical economists call the ‘consumer or producer surplus’.)

It follows that if Hayek’s approach can now be framed as an excessive reduction of information to a single, totalising unit of measure (‘price’), we can ask what are the key dividuated forms of information for which price represents a derivative exposure? And once we identify what they are, we can ask how are they important beyond being the ‘underlyers’ of price? How are they important in their own right as knowledge and as social indicators for decision-making?

The significance of these forays into Keynes and Hayek is profound: more so than might at first appear.

For Hayek, if it is possible to deconstruct the information behind price, why is it assumed that the objective of this information is the formation of prices rather than some other unit of measure? We can open up radically different social modes of calculation.

For Keynes, if money is a derivative exposure to the state, we might ask what cryptotokens might be a derivative of? What social and economic modes of organization may be available here?

To take on this significance, we need to back up a bit: to challenge the dictionary. Price is no more than an index: it measures relative values (between products; over time). But it gets treated socially as an absolute social measure. This is central to the idea of trust in a (fiat) money system. But the absolute measure is a social construct, and it can be changed: Francs to Euros; ‘old’ British pounds to ‘new’ (decimal) pounds. The appearance of cryptocurrencies, offering potential for so many different benchmarks for valuation, makes that social construction stark.

So why is ‘price’ currently the privileged index of valuation? Why do we not use (for example) sociality (social impact) as the privileged index of valuation? Or environmental impact?

The answer is that price is a measure that expresses the social and cultural values of a capitalist society. In using price as the privileged measure we assume that (a) production for the market is valued over production for direct use (for the latter generates no price) and (b) profit is embedded within price (people take things to market so as to make a profit). In a capitalist society, those priorities seem appropriate: they capture the values of that society.

The above is no doubt something of an overstatement. There has long been critique of GDP data, for example because they do not adequately measure the environment, or because commodity output is not a measure of ‘happiness’. Exactly what sorts of indices we might adopt is not the specific issue here. The point is that for a measure to be not just an idealised alternative to GDP but a living indicator to be used in economic management, it has to have a material grounding in social organization: GDP as we now know it rules as an aggregate measure in a society that privileges production for profit.

Part B: Contemporary lessons from the historical digression

In an economy not driven by profit, but by a different framing of social contribution, we need different modes of measurement. If we stay with GDP-like measures, but add qualifiers (like pricing the environment or pricing care) we have to convert these qualifiers to profit-centred criteria, and then seek to justify their lack of profitability in terms of some unspecified social good. In this framing, non-profitable social goods are inevitably depicted as ‘concessions’ (virtuous loss-makers). We want to frame them not as concessions but as a purpose of doing economic activity. That framing requires the de-throning of profit as ruling the discourse of economic analysis. It is just a perspective — a very restricted perspective — on value.

We think we can borrow from a Hayekian method to re-think the different measures made possible by the creation of tokens as units of account. Price, in its abstracted meaning, is the valuation of something (output) by means of an index. So instead of allocating the generic word ‘price’ to our current (and Hayek’s preferred) index of measurement, let’s call that measure ‘profit price’, signalling the epistemological foundation of the index: it is just a profit-centred perspective on value.

But cryptoeconomics provides a means to measure also in terms of post-capitalist values — in terms of future value forms. And ‘profit price’ is not the index that best captures these value forms and their production. On the contrary, it has rather become a drag on their production. This is perhaps one of the most difficult things to understand about the transformation we are in: when capital becomes ‘intangible’ information and knowledge, there is an irreversible change in the nature of capital itself. It no longer follows the same laws, it does not behave in the same way, and it no longer produces and captures value in the same way.

Think of knowledge. How and where does knowledge get its value? How does knowledge become valuable? How does the value of knowledge appreciate? The value of knowledge appreciates (a) if it is used in multiple ways, if multiple ways of using it are invented: if it is shared, adopted, repeated, imitated, copied, which means that it is always a collective and social process; (b) if it gets subjectively interpreted, accepted, “owned” and invested in: there is nothing more valuable than a knowledge producer who “owns” the production, is capable of sharing in its risks and puts herself at stake in the production; (c) if its producers ‘risk together’, if they continuously self-regulate the relations of its production, share its risks, stakes and upside. These key features of knowledge production — the ability to multiply ways of use, the ability to interpret and give subjective meanings, the ability to self-regulate and continuously modify the relationships between actors in its production — are precisely what the old value production did not have. It is a different grammar, where value gets created by (a) sharing, copying and inventing multiple uses (VS. restricting use by proprietary ownership); (b) many interpretations, iterations, variations, “owners” (VS. hiding the source code); (c) collective self-organization, self-governance and the right to fork (VS. external organization and control).

Would it not make sense that another kind of index, say a ‘sociality’ index (‘sociality price’), would better capture such value production, and that markets could value in terms of sociality price rather than profit price? (The objective here is not to give precision to a sociality index — or indices — it is just to frame the credibility of their existence as a social alternative, a different perspective on value.)

Fanciful, many will say! Hayek would have us believe, and many passively accept, that profit price is ‘natural’ and society spontaneously gravitates to an order around the calculation of profit price. Sociality price does not exist, and it would be a complete overturning of social norms to engage it.

So how do you compile a sociality index that is socially recognised and used?

First, we should recognise that Hayek’s notion that price formation in markets is a spontaneous order which happens to society is simply wrong. As Karl Polanyi puts it in the context of the socialist calculation debate: markets were planned; planning wasn’t!

Hayek himself highlights just how much complex and decentralized information goes into compiling price as an index. It is about costs, market power, regulations, etc. on the cost side, and tastes, incomes, etc. on the demand side. These differ for every individual, for every commodity, and at every point in time. But, and here Hayek is right in relation to current social relations, our society does, in general, bring it all together to create prices and orderly markets.

We respond that this is not a natural order, but a socially organized order. Building alternative sociality indices, and having people value by reference to sociality price, involves a massive cultural as well as economic shift. It would be about production for value rather than production for profit. The key metrics would have to shift from the conditions for profitability to the conditions for what is valuable. It is no more or less logically feasible than valuing in terms of profit price.

Cryptotokens provide us with an opportunity to experiment, for example, with a sociality index of price: indeed, to develop multiple indices of valuation that reflect different social priorities in the way that profit price reflects capitalism’s priorities. And if total income is determined by payments for contributions to sociality, we have the conditions for a different measure of value, for a different value calculus.
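
To make the idea a little more tangible, here is a minimal sketch of how a sociality index might be compiled from transactional data. It is purely illustrative: the field names, the weights and the aggregation rule are our assumptions for this example, not an ECSA specification, and in practice the weights would themselves be produced and revised from the network’s own data.

```python
# Minimal, illustrative sketch of compiling a 'sociality index' from records
# of contributions. Field names and weights are assumptions, not a spec.
from dataclasses import dataclass

@dataclass
class Contribution:
    agent: str           # who contributed
    adopters: int        # how many other agents took up / reused the output
    variations: int      # how many new uses or interpretations it spawned
    stake_shared: float  # share of the production risk the contributor carries (0..1)

def sociality_index(contributions, w_adopt=0.5, w_vary=0.3, w_stake=0.2):
    """Aggregate a period's contributions into a single index value."""
    if not contributions:
        return 0.0
    scores = [
        w_adopt * c.adopters + w_vary * c.variations + w_stake * c.stake_shared
        for c in contributions
    ]
    return sum(scores) / len(scores)

# Example: value a period of activity by sociality rather than profit.
period = [
    Contribution("alice", adopters=12, variations=3, stake_shared=0.8),
    Contribution("bob", adopters=4, variations=7, stake_shared=0.5),
]
print(sociality_index(period))
```

The point is not this particular formula, but that once offers and acceptances are recorded on a shared ledger, an index other than profit price can be computed from the same data.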

We are thinking of beginning to trial this system. To quote ECSA developers:

“Think of the token as a propositional force, a sparkle of potentiality. It is a multi-dimensional docking port that can germinate new forms of relations and value sharing. The token is an occurrence, a virtual (time) crystal expecting its transductive associated milieu. It is an instance of value capture, but only insofar as it acts, simultaneously, as a fugitive relay of anarchic shares collectively modulating and amplifying values. Conceiving of tokens as speculative pragmatic relays is a way of entertaining them as generator of collective effervescence.”

(Erik Bordeleau et al. at the Economic Space Agency, “We don’t know yet what a token can do”; see also “On intensive self-issuance”, Economic Space Agency, in MoneyLab Reader, pp. 232–233)

7. Once more on fundamental value: the ECSA approach

‘Fundamental value’ is critical in cryptoeconomics for the simple reason that, in economic terms, it is the first benchmark of good governance. That is, it provides the framework by which the integrity of a token on issue is to be found in the system of production that it enables. So there must be a clearly stated relationship between token issuance and the level of production.

But that relationship is historically specific, and we need to recognise that old notions of ‘fundamental value’ may have to adapt, and not just in the context of cryptotokens, but emphatically in their context.

So what are the hallmarks of the traditional notion of fundamental value? The core idea is that there can be an ‘underlying’ measure outside of (beneath) the vicissitudes of market exchange. Adam Smith called it ‘natural price’; Marx sought a unit of value in socially necessary labour time. Accountancy sought fundamental value in the long-term productive value of the various assets of the corporation, as distinct from the stock price. In each of these cases ‘intangible capital’ breaks the modes of fundamental value measurement. That is important as intangible capital becomes increasingly prevalent on corporate balance sheets, and it is clearly central to cryptoeconomies.

But more broadly, as the nature of capital changes, so the mode of its measurement changes. If we look at the way financial markets have developed over the last 60 years to explain value — using tools like CAPM, VaR, the EMH and Black-Scholes — we can see that practical valuation has gone ‘inside’ the market: the idea that ‘real value’ exists outside market transactions is no longer a reflection of how valuation actually occurs.
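
To illustrate what ‘inside the market’ means here, consider the textbook Black-Scholes valuation of a call option: every input (the spot price, the volatility, the interest rate) is itself market-generated data, and no measure from outside the market enters the calculation. The numbers below are purely illustrative.

```python
# Textbook Black-Scholes call price: a valuation built entirely from
# market-generated inputs (spot price, volatility, interest rate).
from math import exp, log, sqrt
from scipy.stats import norm

def black_scholes_call(spot, strike, t, rate, vol):
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * norm.cdf(d1) - strike * exp(-rate * t) * norm.cdf(d2)

# Illustrative numbers only.
print(black_scholes_call(spot=100, strike=105, t=0.5, rate=0.02, vol=0.3))
```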

What it signals is that ‘fundamental value’ has been shifting from stock measures (hours of embedded labour time; machinery and factory sites) to flow measures: from measures ‘outside’ the market to ways of re-interpreting data generated within markets.

A flows approach to fundamental value focusses on sources and uses rather than the valuation of stocks (assets, portfolios, warehouses).

Professor Perry Mehrling, one of the key critics of modern finance in the light of the global financial crisis, quotes Hyman Minsky that “the cash flow approach looks at all units — be they households, corporations, state and municipal governments, or even national governments — as if they were banks.” Cash flows have their uses and sources: “for each agent, every use has a corresponding source and vice versa” and “each agent’s use is some other agent’s source, and vice versa.” If one thinks of agents as banks, then all their assets and liabilities are intertwined. Whatever assets a bank has (e.g. cash) come from elsewhere, just as its liabilities are also “social”. In a stock version of fundamental value, what is valued is assumed to be social because what is valued is the product of past social processes. Here, in measuring flows, we are focussing on the social-in-process.
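
The bookkeeping behind this view can be shown in a few lines. The sketch below is only illustrative (the agent names and amounts are invented), but it captures the constraint that every use of funds for one agent is a source for another, so that flows net to zero across the system as a whole.

```python
# Minimal sources-and-uses ledger: every use (outflow) for one agent is a
# source (inflow) for another, so system-wide net flows sum to zero.
from collections import defaultdict

def flow_accounts(payments):
    """payments: list of (payer, payee, amount) tuples for a period."""
    net = defaultdict(float)
    for payer, payee, amount in payments:
        net[payer] -= amount   # a use of funds for the payer
        net[payee] += amount   # a source of funds for the payee
    return dict(net)

payments = [("alice", "bob", 50.0), ("bob", "carol", 30.0), ("carol", "alice", 20.0)]
accounts = flow_accounts(payments)
print(accounts)                                # per-agent net flows
assert abs(sum(accounts.values())) < 1e-9      # uses and sources mirror each other
```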

This approach resonates directly with the interoperability grammar of offers and acceptances undertaken by agents on a blockchain. It suggests a new mode of framing fundamental value; one that is appropriate to the new form of economic interaction generated in a cryptoeconomy.

The MV=PQ approach to fundamental value must sit under this framing. This identity is one critical perspective on fundamental value.
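
To spell the identity out (this is the textbook statement of the quantity equation, not an ECSA-specific formulation): M · V = P · Q, where M is the quantity of tokens on issue, V their velocity of circulation, P the price level at which output trades, and Q the volume of output transacted. Holding V and Q in view, the identity ties the level of token issuance M to the price level P.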

ECSA thinks that this framing sets the conditions on which ECSA must build its diagnostic tools and indeed its units of account (for there need not be just one; there will be commensuration between units, though some may be non-tokenised). We can draw from Hayek the idea that transactional data embody complex, detailed information, from which indices can be compiled: indices that will give access to ‘underlying’ trends and can be compiled in ways that perform the function of units of account.

But, and this is critical, these indices cannot be pre-defined nor locked in: they are themselves to be produced as a recursive exercise of data analysis, and as the data evolve, and as techniques of data analysis evolve, so the (synthetic) indices of fundamental value must evolve. The indices must be in harmony with underlying market processes; not stand in contradistinction.

In an MV=PQ framing, we have to build data from agents’ offers and matches to compile a way to measure Q and V, and ultimately thereby stabilise the relationship between M and P. These measures must emerge from the interrogation of data, and suit the specific purposes of the new economic spaces bootstrapped by ECSA. And those indices must be allowed to evolve over time, both by processes of refinement as the data get richer and by processes of transformation as notions of social value evolve within the new economic space.
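
As a minimal sketch of what such a diagnostic might look like (the record fields and the token supply figure are illustrative assumptions, not ECSA’s data model), one can compile the nominal value of matched offers in a period and back out the implied velocity from the identity:

```python
# Minimal sketch: compile P*Q from matched offers, then back out the implied
# velocity V from MV = PQ. Field names and figures are illustrative only.

def implied_velocity(matches, token_supply):
    """matches: accepted offers for a period, each with 'quantity' and 'unit_price';
    token_supply: tokens on issue (M)."""
    nominal_output = sum(m["quantity"] * m["unit_price"] for m in matches)  # P*Q
    return nominal_output / token_supply                                    # V = PQ / M

matches = [
    {"quantity": 10, "unit_price": 2.5},
    {"quantity": 4,  "unit_price": 7.0},
]
M = 500.0
print(f"implied velocity: {implied_velocity(matches, M):.3f}")

# If measured V and Q are stable, the identity implies that changes in token
# issuance M feed through to the price level P; tracking these flows is one
# way of keeping the relation between issuance and prices in view.
```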

This on-going process of developing indices will be one of the critical performances of ECSA. It is critical to the integrity of new economic spaces, but we aspire also that it will be part of a wider social project to re-think value. We are familiar with the extensive ‘alternative’ measurement work around the world — in development studies, in human capacity measures, in health, in care, in the environment. We believe that ECSA’s indexing project can be part of this wider agenda; indeed framed within a token economy there will be new ways for this project to develop.

8. The derivative form: buying exposures to an exponential future and a big put

The above suggests to us that tokens in cryptoeconomics, and certainly in the new economic space, take the form of derivatives. We mean this not just in the sense that the purchase of an ECSA token in the capital market is a derivative, in the same way that company stocks are derivatives (exposure to company performance, without ownership of the underlying).

The proposition is deeper — in part material, but in part also symbolic.

First, the material expression. There is a widespread embrace within the cryptoeconomic community of the idea of a transformative potential: tokens give exposure to an unverified but exponential future. The various indices that ECSA will compile are themselves derivative formulations, in the sense that movements in indices will determine individual token values as they exchange with other tokens and with ECSA’s mutual stakeholding fund. The purchase of tokens is thereby the acquisition of an exposure to these indices. The indices themselves are measures of the performance of economic spaces. So a token is an exposure to an index which is itself a representation of economic performances by token-issuing economic spaces.
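
A minimal sketch of the relation being described (the linear pass-through and the figures are assumptions for illustration only, not ECSA’s pricing rule): the token’s exchange value moves with an index of the performance of the economic space that issued it.

```python
# Illustrative sketch: a token revalued as an exposure to a performance index
# of its issuing economic space. The linear pass-through is an assumption.

def token_value(value_at_issue, index_now, index_at_issue):
    """Revalue a token in proportion to movements in its space's index."""
    return value_at_issue * (index_now / index_at_issue)

# A space issues at an index level of 100; the index later reads 130.
print(token_value(value_at_issue=1.0, index_now=130.0, index_at_issue=100.0))  # 1.3
```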

Second, symbolically, the crypto economy involves taking risk positions on the established capitalist economy, its calculation of value, and the economic (and political) power structure around it.

  • Those of us engaged in building (and analysing) cryptoeconomics are holding a long position on its potential.
  • Those state regulators/commentators who decry the potential of crypto economies are using their regulatory and media power to short us.
  • Those who diversify from the conventional capital market and capitalist economy and invest in ECSA are taking a short position on capitalist systems of value calculation. Borrowing the great insight of ECSA Advisor prof. Robert Meister of UCSC, we offer them a ‘big put’, a capacity to short capitalism.

In the words of ECSA Advisor, NYU professor Robert Wosnitzer:

“If the current system/structure of capital “shorts” the qualitative dimensions of life and society, and/or the externalized costs of production (i.e., businesses “put” the costs of, say, pollution onto society), through going “long” the current system of production and circulation, then ECSA is going long the qualitative/intangible dimensions and shorting the current system.

Said differently, a put option “puts” back the cost of the spread between the current value of something and its strike price to a counterparty in equity options, or in the case of bonds that carry a put, the right to “put” back the par value at a specific moment. This “put” option also carries the logic of securitization — that is to say, the rationale and need to securitize mortgages is due to the fact that homeowners have a “put” option that they can exercise at any time, thereby rendering the cash flows unpredictable and therefore not suitable for investment and reducing liquidity. By securitizing mortgages, the “put” option is mitigated as the risk is spread across multiple mortgages in the pool, and then tranching allows for even more precise predictability. So the “put” option has always carried some threat to the current system of capital relations, and the need for securitization arose largely to address this “threat.” Of course, the put option in mortgages has been largely reduced to interest rate sensitivities (and hence the need for strong central banking operations), whilst ignoring the real, material social contexts that often drive interest rates — unemployment, price increases by producers, health care costs, etc. etc.”

And the politics is clear:

Creating “alternative” economic spaces allows the owners of tokens to own an option that could, under certain conditions, “put” back the cost of an externality to the owners of the means of production which created the externality (or injustice, if you will). It’s a long, directional play (buying a put) that is the dialectical opposition to the long, directional play of existing capital relations.
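
For readers unfamiliar with the instrument, the payoff profile being invoked is the textbook one (the strike and the figures below are illustrative only): a put pays off when what is being shorted falls below the strike, and nothing otherwise.

```python
# Textbook put payoff: the holder can 'put' the underlying back at the strike,
# so the position pays off when the underlying falls below that level.

def put_payoff(strike, underlying):
    return max(strike - underlying, 0.0)

# Illustrative: a put struck at 100 pays 40 if the shorted valuation falls to 60,
# and nothing if it stays above the strike.
print(put_payoff(100.0, 60.0))   # 40.0
print(put_payoff(100.0, 120.0))  # 0.0
```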

Yet there is currently no derivative product that recognises the significance of this insight and enables it to be ‘played out’ financially, in a way that is not expressed simply via the volatility of token values themselves.

ECSA is currently exploring ways we might engage this logic, for example by securitizing a certain part of revenue (perhaps a world-first collateralized equity note?), with the possibility of hedging that revenue (in both quantity and currency) and of providing a liquid market tool to attract short positions on ECSA, but in a way that diverts this shorting activity away from the ECSA token itself.

It must be emphasised that this is a strategy currently at the stage of exploration only, as part of engaging in a process of discovery of what cryptotokens might become. It is raised here simply to indicate the kind of exploration we feel needs to emerge.

The ECSA team will again be at NYU during the last week of September to carry on the work in the next series of Cryptoeconomics working sessions.

If you made it this far, and are interested in joining the work, let us know!

Credits:

Big thanks for comments and discussions on earlier drafts to Johnny Antos, Chris Burniske, Eden Dhaliwal, Anesu Machoko, Jessa Walden and some anonymous contributors. Thanks also to all participants at the Cryptoeconomics Working Sessions at NYU/Stern, the Stockholm School of Economics and GCAS. The ECSA team working on our cryptoeconomics project — Jonathan Beller, Erik Bordeleau, Fabian Bruder, Pekko Koskinen, Jorge Lopez, Joel Mason, Tere Vaden — rocks.

Dick Bryan is a prof. (emer.) of Political Economy (University of Sydney) and Chief Economist at Economic Space Agency. He is one of the key theorists of the derivative value form, and the author of Risking Together and Capitalism with Derivatives (together with Mike Rafferty).

Benjamin Lee is a prof. of Anthropology and Philosophy (The New School, NYC) and Advisor to Economic Space Agency. The author of Derivatives and the Wealth of Societies. Co-organizer of the Volatility Working Group.

Robert Wosnitzer is a prof. at New York University/Stern Business School and Advisor to Economic Space Agency. Credit instruments, derivatives, and cultures of finance specialist. Former debt instruments and options trader at Lehman Brothers and Wells Fargo Capital Markets.

Akseli Virtanen, PhD, is a political economist, the author of Arbitrary Power. A Critique of Biopolitical Economy and Economy and Social Theory (Vol. 1–3, with Risto Heiskala), co-founder at Economic Space Agency and at the decentralized hedge fund Robin Hood Minor Asset Management. Currently a visiting researcher at Stanford University.

Photo by rutty
