Red Plenty Platforms (2): The ecological problem and contemporary cybernetic planning

We continue our treatment (second and final part) of Nick Dyer-Witheford’s important essay:

“An abundant communist society of high automation, free software, and in-home replicators might, however, as Fraise (2011) suggests, need planning more than ever – not to overcome scarcity but to address the problems of plenty, which perversely today threaten shortages of the very conditions for life itself. Global climate change and a host of interlinked ecological problems challenge all the positions we have discussed to this point. Bio-crisis brings planning back on stage, or indeed calculation – but calculation according to metrics measuring limits, thresholds and gradients of the survival of species, human and otherwise. Discussing the imperatives for such ecosocialist planning, Michael Lowy (2009) points out how this would require a far more comprehensive social steering than mere ‘workers control’, or even the negotiated reconciliation of worker and consumer interests suggested by schemes such as Parecon.

Rather, it implies a far-reaching remaking of the economic systems, including the discontinuation of certain industries, such as industrial fishing and destructive logging, the reshaping of transportation methods, ‘a revolution in the energy-system’ and the drive for a ‘solar communism’ (Lowy, 2009: np).

Such transformations would involve cybernetics along two major axes, as both contributors to the current bio-crisis and as potential means for its resolution. On the first of these axes, the ecological costs of nominally ‘clean’ digital technologies have become increasingly apparent: the electrical energy requirements of cloud computing data-centres; the demands of chip manufacture for fresh water and minerals, the latter from large-scale extractive enterprises; and the resulting prodigious quantities of toxic e-waste. Making every home a fab-lab mini-factory will only speed up planetary heat death. Contrary to all idealistic notions of virtual worlds, cybernetics are themselves inextricably part of the very industrial system whose operations have to be placed under scrutiny in a new system of metabolic regulation that aims for both red and green plenty.

However, cybernetic systems are also a potential part of any resolution of the bio-crisis – or, indeed, of even fully recognizing it.

Paul Edwards’ (2010) A Vast Machine analyzes the global system of climatological measurement and projection – the apparatus of weather stations, satellites, sensors, digitally archived records and massive computer simulations, which, like the Internet itself, originated in US Cold War planning – on which comprehension of global warming rests. This infrastructure generates information so vast in quantity and from data platforms so diverse in quality and form that it can be understood only on the basis of computer analysis. Knowledge about climate change is dependent on computer models: simulations of weather and climate; reanalysis models, which recreate climate history from historical data; and data models, combining and adjusting measurements from multiple sources.

By revealing the contingency of conditions for species survival, and the possibility for their anthropogenic change, such ‘knowledge infrastructures’ of people, artifacts, and institutions (Edwards, 2010: 17) – not just for climate measurement, but also for the monitoring of ocean acidification, deforestation, species loss, fresh water availability – reveal the blind spot of Hayek’s catallaxy in which the very grounds for human existence figure as an arbitrary ‘externality’. So-called ‘green capital’ attempts to subordinate such bio-data to price signals. It is easy to point to the fallacy of pricing non-linear and catastrophic events: what is the proper tag for the last tiger, or the carbon emission that triggers uncontrollable methane release? But bio-data and bio-simulations also now have to be included in any concept of communist collective planning. Insofar as that project aims at a realm of freedom that escapes the necessity of toil, the common goods it creates will have to be generated with cleaner energy, and the free knowledge it circulates have metabolic regulation as a priority. Issues of the proper remuneration of labor time require integration into ecological calculations. No bio-deal that does not recognize the aspirations of millions of planetary proletarians to escape inequality and immiseration will succeed, yet labour metrics themselves need to be rethought as part of a broader calculation of the energy expenditures compatible with collective survival.” (http://www.culturemachine.net/index.php/cm/article/view/511/526)
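To make concrete the kind of ‘data model’ Edwards describes (as distinct from the simulation and reanalysis models), here is a minimal illustrative sketch in Python of combining and bias-adjusting measurements from multiple sources into one reconciled series. The station names, bias values and readings are invented for illustration and are not taken from Edwards or Dyer-Witheford.

```python
# Toy "data model": reconcile temperature readings from heterogeneous sources.
# All sources, offsets and values below are hypothetical.
from statistics import mean

# Raw monthly-mean readings (degrees C) from three hypothetical sources.
readings = {
    "station_A": [14.1, 14.3, 14.0],   # well-calibrated land station
    "station_B": [13.2, 13.5, 13.1],   # older instrument with a known cold bias
    "satellite": [14.6, 14.8, 14.5],   # retrieval with a known warm bias
}

# Bias adjustments estimated from overlap periods (assumed values).
bias = {"station_A": 0.0, "station_B": +0.8, "satellite": -0.5}

# Homogenize each source, then merge the sources into a single combined series.
adjusted = {src: [v + bias[src] for v in vals] for src, vals in readings.items()}
combined = [round(mean(month), 2) for month in zip(*adjusted.values())]

print(combined)  # approximately [14.07, 14.3, 13.97]
```

The simulation and reanalysis models are far heavier computations, but it is this reconciliation step that turns diverse, uneven measurements into data those models can use.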

Conclusions: requirements for Cybernetic Communism

Nick Dyer-Witheford:

“A new cybernetic communism … would, we have seen, involve some of the following elements: use of the most advanced super-computing to algorithmically calculate labour time and resource requirements, at global, regional and local levels, of multiple possible paths of human development; selection from these paths by layered democratic discussion conducted across assemblies that include socialized digital networks and swarms of software agents; light-speed updating and constant revision of the selected plans by streams of big data from production and consumption sources; the passage of increasing numbers of goods and services into the realm of the free or of direct production as use values once automation, copy-left, peer-to-peer commons and other forms of micro-replication take hold; the informing of the entire process by parameters set from the simulations, sensors and satellite systems measuring and monitoring the species metabolic interchange with the planetary environment.

This would indeed be a communism heir to Lenin’s ‘soviets plus electricity’, with its roots in red futurism, constructivism, tektology and cybernetics, together with the left-science fiction imaginaries of authors such as Iain M. Banks, Ken McLeod and Chris Moriarty. It would be a social matrix encouraging increasingly sophisticated forms of artificial intelligence as allies of human emancipation. For those who fear the march of the machine it holds only this comfort: whatever singularities might spring from its networks would not be those of entities initially programmed for unconstrained profit expansion and the military defense of property, but rather for human welfare and ecological protection. Such a communism is consonant with a left accelerationist politic that, in place of anarchoprimitivisms, defensive localism and Fordist nostalgia, ‘pushes towards a future that is more modern, an alternative modernity that neoliberalism is inherently unable to generate’ (Williams & Srnicek, 2013). If it needs a name, one can take the K-prefix with which some designate ‘Kybernetic’ endeavors, and call it ‘K-ommunism’. The possible space for such a communism now exists only between the converging lines of civilizational collapse and capitalist consolidation. In this narrowing corridor, it would arise not out of any given, teleological logic, but piece by piece from countless societal breakdowns and conflicts; a post-capitalist mode of production emerging in a context of massive mid-twenty-first century crisis, assembling itself from a hundred years of non-linear computerized communist history to create the platforms of a future red plenty.”
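What might ‘algorithmically calculate labour time and resource requirements’ look like in the simplest case? One standard framing in the planning literature is a Leontief-style input-output calculation: given the technical coefficients of production and a planned basket of final goods, solve for gross outputs and the total (direct plus indirect) labour they embody. The sketch below is illustrative only, not Dyer-Witheford’s; all sectors and numbers are invented.

```python
# Toy labour-time and resource calculation via a Leontief input-output model.
# All coefficients, sectors and demands are hypothetical.
import numpy as np

# A[i][j]: units of sector i's output used up to produce one unit of sector j's output.
A = np.array([
    [0.1, 0.2, 0.0],   # energy inputs per unit of energy, food, electronics
    [0.0, 0.1, 0.1],   # food inputs
    [0.2, 0.0, 0.1],   # electronics inputs
])
direct_labour = np.array([2.0, 5.0, 10.0])      # hours of direct labour per unit of output
final_demand  = np.array([100.0, 300.0, 50.0])  # planned net output per sector

# Gross output x must satisfy x = A @ x + final_demand, i.e. x = (I - A)^-1 @ d.
gross_output = np.linalg.solve(np.eye(3) - A, final_demand)

# Total labour embodied in the plan, direct and indirect.
total_hours = direct_labour @ gross_output

print(gross_output.round(1))          # gross output needed in each sector
print(round(float(total_hours), 1))   # total labour hours the plan requires
```

Scaled to global, regional and local levels this becomes a problem in millions of variables, which is why the passage appeals to super-computing; but the structure of the calculation, from final demands back to labour and resource requirements, stays the same.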
