Algorithms, Capital, and the Automation of the Common

“autonomous ones not subsumed by or subjected to the capitalist drive to accumulation and exploitation.”

This essay was written by Tiziana Terranova and originally published on Euronomade.info.

Tiziana Terranova: This essay is the outcome of a research process involving a series of Italian institutions of autoformazione of post-autonomist inspiration ('free' universities engaged in the grassroots organization of public seminars, conferences, workshops, etc.) and anglophone social networks of scholars and researchers engaged with digital media theory and practice, officially affiliated with universities, journals and research centres, but also including artists, activists, precarious knowledge workers and the like. It refers to a workshop which took place in London in January 2014, hosted by the Digital Culture Unit at the Centre for Cultural Studies (Goldsmiths' College, University of London). The workshop was the outcome of a process of reflection and organization that started with the Italian free university collective Uninomade 2.0 in early 2013 and continued across mailing lists and websites such as Euronomade, Effimera, Commonware, I quaderni di San Precario and others. More than a traditional essay, then, it aims to be a synthetic but hopefully also inventive document which plunges into a distributed 'social research network' articulating a series of problems, theses and concerns at the crossing between political theory and research into science, technology and capitalism.

What is at stake in the following is the relationship between 'algorithms' and 'capital'—that is, the increasing centrality of algorithms 'to organizational practices arising out of the centrality of information and communication technologies stretching all the way from production to circulation, from industrial logistics to financial speculation, from urban planning and design to social communication'.1 These apparently esoteric mathematical structures have also become part of the daily life of users of contemporary digital and networked media. Most users of the Internet daily interface with, or are subjected to, the powers of algorithms such as Google's PageRank (which sorts the results of our search queries) or Facebook's EdgeRank (which automatically decides in which order we should get our news on our feed), not to mention the many other less well-known algorithms (Appinions, Klout, Hummingbird, PKC, Perlin noise, Cinematch, KDP Select and many more) which modulate our relationship with data, digital devices and each other. This widespread presence of algorithms in the daily life of digital culture, however, is only one of the expressions of the pervasiveness of computational techniques as they become increasingly co-extensive with processes of production, consumption and distribution displayed in logistics, finance, architecture, medicine, urban planning, infographics, advertising, dating, gaming, publishing and all kinds of creative expressions (music, graphics, dance etc).

The staging of the encounter between ‘algorithms’ and ‘capital’ as a political problem invokes the possibility of breaking with the spell of ‘capitalist realism’—that is, the idea that capitalism constitutes the only possible economy while at the same time claiming that new ways of organizing the production and distribution of wealth need to seize on scientific and technological developments2. Going beyond the opposition between state and market, public and private, the concept of the common is used here as a way to instigate the thought and practice of a possible post-capitalist mode of existence for networked digital media.

Algorithms, Capital and Automation

Looking at algorithms from a perspective that seeks the constitution of a new political rationality around the concept of the 'common' means engaging with the ways in which algorithms are deeply implicated in the changing nature of automation. Automation is described by Marx as a process of absorption into the machine of the 'general productive forces of the social brain' such as 'knowledge and skills'3, which hence appear as an attribute of capital rather than as the product of social labour. Looking at the history of the implication of capital and technology, it is clear how automation has evolved away from the thermo-mechanical model of the early industrial assembly line toward the electro-computational dispersed networks of contemporary capitalism. Hence it is possible to read algorithms as part of a genealogical line that, as Marx put it in the 'Fragment on Machines', starting with the adoption of technology by capitalism as fixed capital, pushes the former through several metamorphoses 'whose culmination is the machine, or rather, an automatic system of machinery…set in motion by an automaton, a moving power that moves itself'4. The industrial automaton was clearly thermodynamical, and gave rise to a system 'consisting of numerous mechanical and intellectual organs so that workers themselves are cast merely as its conscious linkages'5. The digital automaton, however, is electro-computational: it puts 'the soul to work', involves primarily the nervous system and the brain, and comprises 'possibilities of virtuality, simulation, abstraction, feedback and autonomous processes'6. The digital automaton unfolds in networks consisting of electronic and nervous connections so that users themselves are cast as quasi-automatic relays of a ceaseless information flow. It is in this wider assemblage, then, that algorithms need to be located when discussing the new modes of automation.

Quoting a textbook of computer science, Andrew Goffey describes algorithms as 'the unifying concept for all the activities which computer scientists engage in…and the fundamental entity with which computer scientists operate'7. An algorithm can be provisionally defined as the 'description of the method by which a task is to be accomplished' by means of sequences of steps or instructions, sets of ordered steps that operate on data and computational structures. As such, an algorithm is an abstraction, 'having an autonomous existence independent of what computer scientists like to refer to as "implementation details," that is, its embodiment in a particular programming language for a particular machine architecture'8. It can vary in complexity from the simplest set of rules described in natural language (such as those used to generate coordinated patterns of movement in smart mobs) to the most complex mathematical formulas involving all kinds of variables (as in the famous Monte Carlo algorithm used to solve problems in nuclear physics and later also applied to stock markets and now to the study of non-linear technological diffusion processes). At the same time, in order to work, algorithms must exist as part of assemblages that include hardware, data, data structures (such as lists, databases, memory, etc.), and the behaviours and actions of bodies. For the algorithm to become social software, in fact, 'it must gain its power as a social or cultural artifact and process by means of a better and better accommodation to behaviors and bodies which happen on its outside'.9
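To make the distinction between an algorithm and its 'implementation details' concrete, here is a minimal illustrative sketch (not drawn from the essay) of the Monte Carlo approach mentioned above, written in Python only as one possible embodiment: the abstract recipe (sample at random, test a condition, count, take a ratio) could equally be expressed in another language, in pseudocode, or by hand.

```python
import random

def monte_carlo_pi(n_samples: int = 100_000) -> float:
    """Estimate pi by random sampling. The algorithm is the abstract recipe
    (sample, test, count, divide); Python is only an 'implementation detail'."""
    inside = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()  # random point in the unit square
        if x * x + y * y <= 1.0:                 # does it fall inside the quarter circle?
            inside += 1
    return 4 * inside / n_samples                # ratio of areas approximates pi

if __name__ == "__main__":
    print(monte_carlo_pi())  # prints a value close to 3.14
```

The same abstract procedure, scaled up and fed with physical or market data, is what reappears in the scientific and financial applications the essay refers to.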

Furthermore, as contemporary algorithms become increasingly exposed to larger and larger data sets (and in general to a growing entropy in the flow of data, also known as Big Data), they are, according to Luciana Parisi, becoming something more than mere sets of instructions to be performed: 'infinite amounts of information interfere with and re-program algorithmic procedures…and data produce alien rules'10. It seems clear from this brief account, then, that algorithms are neither a homogeneous set of techniques, nor do they guarantee 'the infallible execution of automated order and control'11.

From the point of view of capitalism, however, algorithms are mainly a form of 'fixed capital'—that is, they are just means of production. They encode a certain quantity of social knowledge (abstracted from that elaborated by mathematicians, programmers, but also users' activities), but they are not valuable per se. In the current economy, they are valuable only in as much as they allow for the conversion of such knowledge into exchange value (monetization) and its (exponentially increasing) accumulation (the titanic quasi-monopolies of the social Internet). In as much as they constitute fixed capital, algorithms such as Google's PageRank and Facebook's EdgeRank appear 'as a presupposition against which the value-creating power of the individual labour capacity is an infinitesimal, vanishing magnitude'12. And that is why calls for the individual remuneration of users for their 'free labor' are misplaced. It is clear that for Marx what needs to be compensated is not the individual work of the user, but the much larger powers of social cooperation thus unleashed, and this compensation implies a profound transformation of the grip that the social relation we call the capitalist economy has on society.

From the point of view of capital, then, algorithms are just fixed capital, means of production aimed at achieving an economic return. But, as with all technologies and techniques, that does not mean that this is all they are. Marx explicitly states that even as capital appropriates technology as the most effective form of the subsumption of labor, that does not mean that this is all that can be said about it. Its existence as machinery, he insists, is not 'identical with its existence as capital… and therefore it does not follow that subsumption under the social relation of capital is the most appropriate and ultimate social relation of production for the application of machinery'.13 It is then essential to remember that the instrumental value that algorithms have for capital does not exhaust the 'value' of technology in general and algorithms in particular—that is, their capacity to express not just 'use value', as Marx put it, but also aesthetic, existential, social, and ethical values. Wasn't it this clash between the necessity for capital to reduce software development to exchange value, thus marginalizing the aesthetic and ethical values of software creation, that pushed Richard Stallman and countless hackers and engineers towards the Free and Open Source Movement? Isn't the enthusiasm that animates hack-meetings and hacker-spaces fueled by the energy liberated from the constraints of 'working' for a company in order to remain faithful to one's own aesthetics and ethics of coding?

Contrary to some variants of Marxism which tend to identify technology completely with 'dead labor', 'fixed capital' or 'instrumental rationality', and hence with control and capture, it seems important to remember how, for Marx, the evolution of machinery also indexes a level of development of productive powers that are unleashed but never totally contained by the capitalist economy. What interested Marx (and what makes his work still relevant to those who strive for a post-capitalist mode of existence) is the way in which, so he claims, the tendency of capital to invest in technology to automate and hence reduce its labor costs to a minimum potentially frees up a 'surplus' of time and energy (labor), or an excess of productive capacity, in relation to the basic, important and necessary labor of reproduction (a global economy, for example, should first of all produce enough wealth for all members of a planetary population to be adequately fed, clothed, cared for and sheltered). However, what characterizes a capitalist economy is that this surplus of time and energy is not simply released, but must be constantly reabsorbed in the cycle of production of exchange value, leading to the increasing accumulation of wealth by the few (the collective capitalist) at the expense of the many (the multitudes).

Automation, then, when seen from the point of view of capital, must always be balanced with new ways to control (that is, absorb and exhaust) the time and energy thus released. It must produce poverty and stress when there should be wealth and leisure. It must make direct labour the measure of value even when it is apparent that science, technology and social cooperation constitute the source of the wealth produced. It thus inevitably leads to the periodic and widespread destruction of this accumulated wealth, in the form of psychic burnout, environmental catastrophe and the physical destruction of wealth through war. It creates hunger where there should be satiety; it puts food banks next to the opulence of the super-rich. That is why the notion of a post-capitalist mode of existence must become believable, that is, it must become what Maurizio Lazzarato described as an enduring autonomous focus of subjectivation. What a post-capitalist commonism can then aim for is not only a better distribution of wealth compared to the unsustainable one that we have today, but also a reclaiming of 'disposable time'—that is, time and energy freed from work to be deployed in developing and complicating the very notion of what is 'necessary'.

The history of capitalism has shown that automation as such has not reduced the quantity and intensity of labor demanded by managers and capitalists. On the contrary, in as much as technology is for capital only a means of production, capital has not innovated where it has been able to deploy other means. For example, industrial technologies of automation in the factory do not seem to have recently experienced any significant technological breakthroughs. Most industrial labor today is still heavily manual, automated only in the sense of being hooked up to the speed of electronic networks of prototyping, marketing and distribution; and it is rendered economically sustainable only by political means—that is, by exploiting geo-political and economic differences (arbitrage) on a global scale and by controlling migration flows through new technologies of the border. The state of things in most industries today is intensified exploitation, which produces an impoverished mode of mass production and consumption that is damaging to the body, subjectivity, social relations and the environment. As Marx put it, the disposable time released by automation should allow for a change in the very essence of the 'human', so that the new subjectivity is allowed to return to the performing of necessary labor in such a way as to redefine what is necessary and what is needed.

It is not then simply about arguing for a 'return' to simpler times, but on the contrary a matter of acknowledging that growing food and feeding populations, constructing shelter and adequate housing, learning and researching, and caring for children, the sick and the elderly require the mobilization of social invention and cooperation. The whole process is thus transformed from a process of production by the many for the few, steeped in impoverishment and stress, to one where the many redefine the meaning of what is necessary and valuable, while inventing new ways of achieving it. This corresponds in a way to the notion of 'commonfare' as recently elaborated by Andrea Fumagalli and Carlo Vercellone, implying, in the latter's words, 'the socialization of investment and money and the question of the modes of management and organisation which allow for an authentic democratic reappropriation of the institutions of Welfare…and the ecologic re-structuring of our systems of production'13. We need to ask then not only how algorithmic automation works today (mainly in terms of control and monetization, feeding the debt economy) but also what kind of time and energy it subsumes and how it might be made to work once taken up by different social and political assemblages—autonomous ones not subsumed by or subjected to the capitalist drive to accumulation and exploitation.

The Red Stack: Virtual Money, Social Networks, Bio-Hypermedia

In a recent intervention, digital media and political theorist Benjamin H. Bratton has argued that we are witnessing the emergence of a new nomos of the earth, where older geopolitical divisions linked to territorial sovereign powers are intersecting with the new nomos of the Internet and new forms of sovereignty extending into electronic space14. This new heterogeneous nomos involves the overlapping of national governments (China, the United States, the European Union, Brazil, Egypt and the like), transnational bodies (the IMF, the WTO, the European banks and NGOs of various types), and corporations such as Google, Facebook, Apple, Amazon, etc., producing differentiated patterns of mutual accommodation marked by moments of conflict. Drawing on the organizational structure of computer networks, or 'the OSI network model, upon which the TCP/IP stack and the global internet itself is indirectly based', Bratton has developed the concept and/or prototype of the 'stack' to define the features of 'a possible new nomos of the earth linking technology, nature and the human.'15 The stack supports and modulates a kind of 'social cybernetics' able to compose 'both equilibrium and emergence'. As a 'megastructure', the stack implies a 'confluence of interoperable standards-based complex material-information systems of systems, organized according to a vertical section, topographic model of layers and protocols…composed equally of social, human and "analog" layers (chthonic energy sources, gestures, affects, user-actants, interfaces, cities and streets, rooms and buildings, organic and inorganic envelopes) and informational, non-human computational and "digital" layers (multiplexed fiber optic cables, datacenters, databases, data standards and protocols, urban-scale networks, embedded systems, universal addressing tables)'16.

In this section, drawing on Bratton’s political prototype, I would like to propose the concept of the ‘Red Stack’—that is, a new nomos for the post-capitalist common. Materializing the ‘red stack’ involves engaging with (at least) three levels of socio-technical innovation: virtual money, social networks, and bio-hypermedia. These three levels, although ‘stacked’, that is, layered, are to be understood at the same time as interacting transversally and nonlinearly. They constitute a possible way to think about an infrastructure of autonomization linking together technology and subjectivation.

Virtual Money

The contemporary economy, as Christian Marazzi and others have argued, is founded on a form of money which has been turned into a series of signs, with no fixed referent (such as gold) to anchor them, explicitly dependent on the computational automation of simulational models, screen media with automated displays of data (indexes, graphics etc) and algo-trading (bot-to-bot transactions) as its emerging mode of automation17. As Toni Negri also puts it, ‘money today—as abstract machine—has taken on the peculiar function of supreme measure of the values extracted out of society in the real subsumption of the latter under capital’18.

Since ownership and control of capital-money (different, as Maurizio Lazzarato reminds us, from wage-money, in its capacity to be used not only as a means of exchange, but as a means of investment empowering certain futures over others) is crucial to maintaining populations bonded to the current power relation, how can we turn financial money into the money of the common? An experiment such as Bitcoin demonstrates that in a way 'the taboo on money has been broken'19 and that, beyond the limits of this experience, forks are already developing in different directions. What kind of relationship can be established between the algorithms of money-creation and 'a constituent practice which affirms other criteria for the measurement of wealth, valorizing new and old collective needs outside the logic of finance'?20

Current attempts to develop new kinds of cryptocurrencies must be judged, valued and rethought on the basis of the simple question posed by Andrea Fumagalli: is the currency created limited solely to being a means of exchange, or can it also affect the entire cycle of money creation, from finance to exchange?21

Does it allow speculation and hoarding, or does it promote investment in post-capitalist projects and facilitate freedom from exploitation, autonomy of organization, etc.? What is becoming increasingly clear is that algorithms are an essential part of the process of creation of the money of the common, but also that algorithms have politics (what are the gendered politics of individual 'mining', for example, and of the complex technical knowledge and machinery implied in mining bitcoins?). Furthermore, the drive to completely automate money production in order to escape the fallacies of subjective factors and social relations might cause such relations to come back in the form of speculative trading. In the same way as financial capital is intrinsically linked to a certain kind of subjectivity (the financial predator narrated by Hollywood cinema), so an autonomous form of money needs to be both jacked into and productive of a new kind of subjectivity: not limited to the hacking milieu as such, but at the same time oriented not towards monetization and accumulation but towards the empowering of social cooperation. Other questions that the design of the money of the common might involve are: Is it possible to draw on the current financialization of the Internet by corporations such as Google (with its AdSense/AdWords programme) to subtract money from the circuit of capitalist accumulation and turn it into a money able to finance new forms of commonfare (education, research, health, environment, etc.)? What are the lessons to be learned from crowdfunding models and their limits in thinking about new forms of financing autonomous projects of social cooperation? How can we perfect and extend experiments such as that carried out by the Inter-Occupy movement during the Katrina hurricane, turning social networks into crowdfunding networks which can then be used as logistical infrastructure able to move not only information, but also physical goods?22
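To see why the essay treats money creation as an algorithmic, and therefore political, design choice, the hypothetical Python sketch below contrasts two issuance rules. It is not the protocol code of any existing cryptocurrency: the first function only mimics Bitcoin's published schedule (an initial block subsidy of 50 coins halving every 210,000 blocks, which produces a hard cap and arguably rewards hoarding), while the second is an illustrative anti-hoarding rule inspired by demurrage currencies, in which idle balances slowly lose value so that money circulates.

```python
def bitcoin_style_subsidy(block_height: int) -> float:
    """Deflationary issuance: a fixed reward that halves every 210,000 blocks,
    yielding a hard cap of roughly 21 million coins."""
    halvings = block_height // 210_000
    return 50.0 / (2 ** halvings) if halvings < 64 else 0.0

def demurrage_balance(balance: float, months_idle: int, rate: float = 0.01) -> float:
    """Hypothetical anti-hoarding rule: an idle balance loses `rate` per month,
    nudging holders to spend or invest rather than accumulate."""
    return balance * (1 - rate) ** months_idle

# Two different 'constitutions' for money, each a few lines of code:
print(bitcoin_style_subsidy(0))        # 50.0 coins per block at launch
print(bitcoin_style_subsidy(420_000))  # 12.5 coins after two halvings
print(demurrage_balance(100.0, 12))    # ~88.6 left after a year of hoarding
```

The arithmetic is trivial; the point is that each rule embeds a different answer to Fumagalli's question about whether the created money merely circulates as a means of exchange or also shapes who accumulates and who cooperates.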

Social Networks

Over the past ten years, digital media have undergone a process of becoming social that has introduced genuine innovation in relation to previous forms of social software (mailing lists, forums, multi-user domains, etc). If mailing lists, for example, drew on the communicational language of sending and receiving, social network sites and the diffusion of (proprietary) social plug-ins have turned the social relation itself into the content of new computational procedures. When sending and receiving a message, we can say that algorithms operate outside the social relation as such, in the space of the transmission and distribution of messages; but social network software intervenes directly in the social relationship. Indeed, digital technologies and social network sites 'cut into' the social relation as such—that is, they turn it into a discrete object and introduce a new supplementary relation.23

If, with Gabriel Tarde and Michel Foucault, we understand the social relation as an asymmetrical relation involving at least two poles (one active and the other receptive) and characterized by a certain degree of freedom, we can think of actions such as liking and being liked, writing and reading, looking and being looked at, tagging and being tagged, and even buying and selling as the kind of conducts that transindividuate the social (they induce the passage from the pre-individual through the individual to the collective). In social network sites and social plug-ins these actions become discrete technical objects (like buttons, comment boxes, tags etc) which are then linked to underlying data structures (for example the social graph) and subjected to the ranking power of algorithms. This produces the characteristic spatio-temporal modality of digital sociality today: the feed, an algorithmically customized flow of opinions, beliefs, statements and desires expressed in words, images, sounds etc. Much reviled in contemporary critical theory for their supposedly homogenizing effect, these new technologies of the social, however, also open the possibility of experimenting with many-to-many interaction and thus with the very processes of individuation. Political experiments (see the various internet-based parties such as the Five Star Movement, the Pirate Party and Partido X) draw on the powers of these new socio-technical structures in order to produce massive processes of participation and deliberation; but, as with Bitcoin, they also show the far from resolved processes that link political subjectivation to algorithmic automation. They can function, however, because they draw on widely socialized new knowledges and crafts (how to construct a profile, how to cultivate a public, how to share and comment, how to make and post photos, videos and notes, how to publicize events) and on 'soft skills' of expression and relation (humour, argumentation, sparring) which are not implicitly good or bad, but present a series of affordances or degrees of freedom of expression for political action that cannot be left to capitalist monopolies. However, it is not only a matter of using social networks to organize resistance and revolt, but also a question of constructing a social mode of self-information which can collect and reorganize existing drives towards autonomous and singular becomings. Given that algorithms, as we have said, cannot be unlinked from wider social assemblages, their materialization within the red stack involves hijacking social network technologies away from a mode of consumption, so that social networks can act as a distributed platform for learning about the world, nurturing new competences and skills, fostering planetary connections, and developing new ideas and values.
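To make concrete what it means for likes, comments and shares to become 'discrete technical objects' subjected to ranking, here is a deliberately simplified, hypothetical feed ranker in Python. It is not Facebook's actual EdgeRank, whose internals are proprietary; it only illustrates the commonly described pattern of scoring each post by viewer-author affinity, the weight of the interaction type, and time decay, then sorting the feed by that score.

```python
import math
import time

# Hypothetical weights: each social gesture is reduced to a number.
WEIGHTS = {"like": 1.0, "comment": 2.0, "share": 3.0}

def score(post: dict, affinity: dict, now: float) -> float:
    """Score = affinity(viewer, author) * weight(interaction type) * time decay."""
    decay = math.exp(-(now - post["timestamp"]) / 86_400)  # decay with a one-day time constant
    return affinity.get(post["author"], 0.1) * WEIGHTS[post["type"]] * decay

def build_feed(posts: list, affinity: dict) -> list:
    """Return posts ordered by score: the 'algorithmically customized flow'."""
    now = time.time()
    return sorted(posts, key=lambda p: score(p, affinity, now), reverse=True)

# A viewer who interacts often with 'ada' sees her older share ranked above bob's fresh like.
posts = [
    {"author": "ada", "type": "share", "timestamp": time.time() - 3_600},
    {"author": "bob", "type": "like", "timestamp": time.time() - 60},
]
print(build_feed(posts, affinity={"ada": 0.9, "bob": 0.2}))
```

Even in this toy version the politics is visible: the weights and the affinity table are design decisions normally fixed by the platform owner rather than by the people whose relations they rank.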

Bio-hypermedia

The term bio-hypermedia, coined by Giorgio Griziotti, identifies the ever more intimate relation between bodies and devices which is part of the diffusion of smartphones, tablet computers and ubiquitous computation. As digital networks shift away from the centrality of the desktop or even laptop machine towards smaller, portable devices, a new social and technical landscape emerges around 'apps' and 'clouds' which directly 'intervene in how we feel, perceive and understand the world'.24 Bratton defines the 'apps' for platforms such as Android and Apple as interfaces or membranes linking individual devices to large databases stored in the 'cloud' (massive data processing and storage centres owned by large corporations).25

This topological continuity has allowed for the diffusion of downloadable apps which increasingly modulate the relationship of bodies and space. Such technologies not only 'stick to the skin and respond to the touch' (as Bruce Sterling once put it), but create new 'zones' around bodies which now move through 'coded spaces' overlaid with information, able to locate other bodies and places within interactive, informational visual maps. New spatial ecosystems emerging at the crossing of the 'natural' and the artificial allow for the activation of a process of chaosmotic co-creation of urban life.26 Here again we can see how apps are, for capital, simply a means to 'monetize' and 'accumulate' data about the body's movement while subsuming it ever more tightly in networks of consumption and surveillance. However, this subsumption of the mobile body under capital does not necessarily imply that this is the only possible use of these new technological affordances. Turning bio-hypermedia into components of the red stack (the mode of reappropriation of fixed capital in the age of the networked social) implies drawing together current experimentation with hardware (Shenzhen phone-hacking technologies, maker movements, etc.) able to support a new breed of 'imaginary apps' (think for example of the apps devised by the artist collective Electronic Disturbance Theatre, which allow migrants to bypass border controls, or apps able to track the origin of commodities, their degrees of exploitation, etc.).

Conclusions

This short essay, a synthesis of a wider research process, aims to propose another strategy for the construction of a machinic infrastructure of the common. The basic idea is that information technologies, which comprise algorithms as a central component, do not simply constitute a tool of capital, but are simultaneously constructing new potentialities for postneoliberal modes of government and postcapitalist modes of production. It is a matter here of opening possible lines of contamination with the large movements of programmers, hackers and makers involved in a process of re-coding of network architectures and information technologies based on values other than exchange and speculation, but also of acknowledging the wide process of technosocial literacy that has recently affected large swathes of the world population. It is a matter, then, of producing a convergence able to extend the problem of the reprogramming of the Internet away from recent trends towards corporatisation and monetisation at the expense of users' freedom and control. Linking bio-informational communication to issues such as the production of a money of the commons able to socialize wealth, against current trends towards privatisation, accumulation and concentration, and saying that social networks and diffused communicational competences can also function as means to organize cooperation and produce new knowledges and values, means seeking a new political synthesis which moves us away from the neoliberal paradigm of debt, austerity and accumulation. This is not a utopia, but a program for the invention of constituent social algorithms of the common.

In addition to the sources cited above, and the texts contained in this volume, we offer the following expandable bibliographical toolkit or open desiring biblio-machine. (Instructions: pick, choose and subtract/add to form your own assemblage of self-formation for the purposes of materialization of the red stack):

— L. Baroniant and C. Vercellone, Moneta Del Comune e Reddito Sociale Garantito (2013), Uninomade.

— M. Bauwens, The Social Web and Its Social Contracts: Some Notes on Social Antagonism in Netarchical Capitalism (2008), Re-Public Re-Imaging Democracy.

— F. Berardi and G. Lovink, A call to the army of love and to the army of software (2011), Nettime.

— R. Braidotti, The posthuman (Cambridge: Polity Press, 2013).

— G. E. Coleman, Coding Freedom: The Ethics and Aesthetics of Hacking (Princeton and Oxford: Princeton University Press, 2012).

— A. Fumagalli, Trasformazione del lavoro e trasformazioni del welfare: precarietà e welfare del comune (commonfare) in Europa, in P. Leon and R. Realfonso (eds), L’Economia della precarietà (Rome: Manifestolibri, 2008), 159–74.

— G. Giannelli and A. Fumagalli, Il fenomeno Bitcoin: moneta alternativa o moneta speculativa? (2013), I Quaderni di San Precario.

— G. Griziotti, D. Lovaglio and T. Terranova, Netwar 2.0: Verso una convergenza della “calle” e della rete (2012), Uninomade 2.0.

— E. Grosz, Chaos, Territory, Art (New York: Columbia University Press, 2012).

— F. Guattari, Chaosmosis: An Ethico-Aesthetic Paradigm (Indianapolis, IN: Indiana University Press, 1995).

— S. Jourdan, Game-over Bitcoin: Where Is the Next Human-Based Digital Currency? (2014).

— M. Lazzarato, Les puissances de l’invention (Paris: L’empecheurs de penser ronde, 2004).

— M. Lazzarato, The Making of the Indebted Man (Los Angeles: Semiotext(e), 2013).

— G. Lovink and M. Rasch (eds), Unlike Us Reader: Social Media Monopolies and their Alternatives (Amsterdam: Institute of Network Culture, 2013).

— A. Mackenzie, Programming Subjects in the Regime of Anticipation: Software Studies and Subjectivity, Subjectivity 6 (2013), 391–405.

— L. Manovich, The Poetics of Augmented Space, Virtual Communication 5:2 (2006), 219–40.

— S. Mezzadra and B. Neilson, Border as Method or the Multiplication of Labor (Durham, NC: Duke University Press, 2013).

— P. D. Miller aka DJ Spooky and S. Matviyenko, The Imaginary App (Cambridge, MA: MIT Press, forthcoming).

— A. Negri, Acting in common and the limits of capital (2014), in Euronomade.

— A. Negri and M. Hardt, Commonwealth (Cambridge, MA: Belknap Press, 2009).

— M. Pasquinelli, Google’s Page Rank Algorithm: A Diagram of the Cognitive Capitalism and the Rentier of the Common Intellect (2009).

— B. Scott, Heretic’s Guide to Global Finance: Hacking the Future of Money (London: Pluto Press, 2013).

— G. Simondon, On the Mode of Existence of Technical Objects (1958), University of Western Ontario

— R. Stallman, Free Software: Free Society. Selected Essays of Richard M. Stallman (Free Software Foundation, 2002).

— A. Toscano, Gaming the Plumbing: High-Frequency Trading and the Spaces of Capital (2013), in Mute.

— I. Wilkins and B. Dragos, Destructive Distraction? An Ecological Study of High Frequency Trading, in Mute.



  1. In the words of the programme of the workshop from which this essay originated: http://quaderni.sanprecario.info/2014/01/workshop-algorithms/ ↩
  2. M. Fisher, Capitalist Realism: Is There No Alternative (London: Zer0 Books, 2009); A. Williams and N. Srnicek, ‘#Accelerate: Manifesto for an Accelerationist Politics’, this volume XXX-XXX. ↩
  3. K. Marx, ‘Fragment on Machines’, this volume, XXX–XXX. ↩
  4. Ibid., XXX. ↩
  5. Ibid., XXX. ↩
  6. M. Fuller, Software Studies: A Lexicon (Cambridge, MA: The MIT Press, 2008); F. Berardi, The Soul at Work: From Alienation to Autonomy (Cambridge, MA: MIT Press, 2009)  ↩
  7. A. Goffey, ‘Algorithm’, in Fuller (ed), Software Studies, 15–17: 15. ↩
  8. Ibid. ↩
  9. Fuller, Introduction to Fuller (ed), Software Studies, 5 ↩
  10. L. Parisi, Contagious Architecture: Computation, Aesthetics, Space (Cambridge, Mass. and Sidney: MIT Press, 2013), x. ↩
  11. Ibid., ix. ↩
  12. Marx, XXX. ↩
  13. C. Vercellone, ‘From the crisis to the “commonfare” as new mode of production’, in special section on Eurocrisis (ed. G. Amendola, S. Mezzadra and T. Terranova), Theory, Culture and Society, forthcoming; also A. Fumagalli, ‘Digital (Crypto) Money and Alternative Financial Circuits: Lead the Attack to the Heart of the State, sorry, of Financial Market’ ↩
  14. B. Bratton, On the Nomos of the Cloud (2012). ↩
  15. Ibid. ↩
  16. Ibid. ↩
  17. C. Marazzi, Money in the World Crisis: The New Basis of Capitalist Power ↩
  18. T. Negri, Reflections on the Manifesto for an Accelerationist Politics (2014), Euronomade ↩
  19. Jaromil Roio, Bitcoin, la fine del tabù della moneta (2014), in I Quaderni di San Precario. ↩
  20. S. Lucarelli, Il principio della liquidità e la sua corruzione. Un contributo alla discussione su algoritmi e capitale (2014), in I Quaderni di san Precario ↩
  21. A. Fumagalli, Commonfare: Per la riappropriazione del libero accesso ai beni comuni (2014), in Doppio Zero ↩
  22. Common Ground Collective, Common Ground Collective, Food, not Bombs and Occupy Movement form Coalition to help Isaac & Kathrina Victims (2012), Interoccupy.net  ↩
  23. B. Stiegler, The Most Precious Good in the Era of Social Technologies, in G. Lovink and M. Rasch (eds), Unlike Us Reader: Social Media Monopolies and Their Alternatives (Amsterdam: Institute of Network Culture, 2013), 16–30. ↩
  24. G. Griziotti, Biorank: algorithms and transformations in the bios of cognitive capitalism (2014), in I Quaderni di san Precario; also S. Portanova, Moving without a Body (Boston, MA: MIT Press, 2013) ↩
  25. B. Bratton, On Apps and Elementary Forms of Interfacial Life: Object, Image, Superimposition  ↩
  26. S. Iaconesi and O. Persico, The Co-Creation of the City: Re-programming Cities using Real-Time User-Generated Content ↩

Photo by ahisgett

Book of the day: Neurocapitalism: Technological Mediation and Vanishing Lines

Technological change is ridden with conflicts, bifurcations and unexpected developments. Neurocapitalism takes us on an extraordinarily original journey through the effects that cutting-edge technology has on cultural, anthropological, socio-economic and political dynamics. Today, neurocapitalism shapes the technological production of the commons, transforming them into tools for commercialization, automatic control, and crisis management.

But all is not lost: in highlighting the growing role of General Intellect’s autonomous and cooperative production through the development of the commons and alternative and antagonistic uses of new technologies, Giorgio Griziotti proposes new ideas for the organization of the multitudes of the new millennium.

Excerpt from “Neurocapitalism: Technological Mediation and Vanishing Lines”, by Giorgio Griziotti (with permission)

The archaic god and the technological Leviathan – sacred techne

What has been discussed regarding neo-nomadism and transient modes of being has a counterweight (or contradiction) in new forms of absolute belonging that are manifested through archaic religious fundamentalisms. These extremisms are, in their various facets, more and more present – even hegemonic – in vast areas of the South. A conspicuous part of the global population is walking the road back to a sense of belonging that is even more archaic and binding than those previously discussed.

From a superficial point of view, fundamentalist movements seem in some way to have replaced the Soviet-inspired national liberation movements of the Cold War era, a vision that does not, however, take into account the influence of the profound transformation that came with technological mediation. On the other hand, how can we explain the inconsistency of a North poised between the fascination and the threat of technological temptation and the archaic fundamentalism that, from the South, manifests itself even in western metropolitan suburbs? Simondon provides an interesting key for interpreting these profound contradictions.

In one of his main works, dedicated to the mode of existence of technical objects, Simondon maintains, similarly to what was written in the introduction, that the genesis of technical reality is part of human beings’ relation to the world.1 In addition, he adds that technicality is, along with religion, one of the two simultaneous phases2 that emerge in order to solve the problems presented in the magical, primitive original stage of our relation to the world. “Primitive unity,” writes Simondon, “appears as a reticulation of the universe in privileged key points where exchanges between the living and the environment take place.” These are places or magical moments3 that are distinguished as figures distinct from the background of the universe. At a certain moment in evolution, we pass from the magic unity of these reticulations to the development of technical and religious thought, which is “the organization of two symmetrical and opposite mediations.” In this doubling, or rather phase shift, key points in the world separate from the background to become a technicality that is crystallized in efficient and instrumental objects that function everywhere and at any given moment, while the background becomes abstract and is subjectified, personified in the divine, sacred forms of religion. What prevents us from grafting the contemporary condition of a technology-religion dualism onto Simondon’s vision? Simondon states that in the becoming of technical objects, the key points of the magical, prehistoric world lose “their capacity for creating network and their power to influence reality that surrounds them from a distance.” In this way, he refers to technological mediation as we knew it until very recently.

Today, however, the situation has changed so drastically that we have put forth the hypothesis of this volume based on the paradigmatic leap in said mediation. A leap characterized, to use Simondon’s terms, by the emergence of a context where today’s technical objects (for example ICT devices and networks) are integrated with the “background” (the space-time of the universe), restoring, in some way, the original unity. Such reconstitution obviously doesn’t take us back to a world populated by magical places and doesn’t entail transcendence but, contrary to what Simondon asserts, it can no longer be claimed that the technical object is “distinguished” from natural being in the sense that it is not part of the world. Quite the opposite: our hypothesis is that in the human’s “becoming machine,” the technical object becomes a part of the living, and this calls into question the vision of two mediations, the technical and the religious, counterposed as an indissoluble couple.

The basic framework from which technicality and religion were born at the dawn of human history is made brittle by a multiplicity of technologies that invade not only the political dimension of life, bios, but also the biological one: the vital breath of zoé. Evoking an extreme biopolitical case that acts upon the separation between bios and zoé and reduces life to “bare life,” we can refer to Nazi thanatopolitics. Agamben reminds us of the Euthanasia Program enacted by Hitler to eliminate incurable mental patients:

[T]he program, in the guise of a solution to a humanitarian problem, was an exercise of the sovereign power to decide on bare life in the horizon of the new biopolitical vocation of the National Socialist state. The concept of “life unworthy of being lived” is clearly not an ethical one, which would involve the expectations and legitimate desires of the individual. It is, rather, a political concept […] on which sovereign power is founded4.

In the span of fifteen months, 70,000 people were eliminated, 5,000 of them children. The program was later abandoned due to the growing protest of the bishops. The two doctors responsible for the program, condemned to death at Nuremberg, “declared they didn’t feel guilty because the question of euthanasia would come up again.” With the Aktion T4 program, the Nazis also widened their deadly action to all “lives unworthy of being lived.”5

Today, for the first time, technology allows us to operate within the complexity that binds and separates bios and zoé and that, until recently, was indecipherable. In fact, like all mysteries, what unites life and death was the exclusive prerogative of religion; in rendering it profane, we overstep the confines of religious thought and technical thought moves into the domain of the sacred. Paraphrasing Agamben, we could say we are facing a sacred techne that “is set outside of human jurisdiction without trespassing the divine.” Therefore, from an archaic point of view, the civilization of profaning technology can be killed with impunity, as homo sacer, but not sacrificed.

On the other hand, this capacity to act upon bios and zoé opens many prospects, including, in a positive sense, that of an era of hybridization which is not exclusively anthropocentric6 and which could give life to a non-capitalist, non-archaic ethics. Positive outcomes are not, however, obvious or to be taken for granted because, in this framework, technology is also the tool of the contemporary necropolitics practiced by biopower which, concentrated almost exclusively on the daily exploitation of life itself, creates inhumane forms of destruction. Inhumane are the new forms of remote-controlled algorithmic death, because they are delegated to automatons and robots: for example, the CIA’s drones which, in under eight years, killed thousands of people in Pakistan alone, including hundreds of women and children,7 or the automatic sensorial strafing systems able to activate themselves and shoot “intruders.”8

These new forms of asymmetrical warfare, of which remotely guided drones are only the tip of the iceberg, are subverting the praxis, theory and ethics of war – if not the very concept of war itself – as explained in the well-argued A Theory of the Drone.9 More generally, the ecological devastation of the Earth is literally inhumane in the sense that it takes out a dangerous mortgage on the possibility of human participation in the future. However, now we’d like to focus our attention on the macropolitical consequences of questioning a reality founded on a technological-religious bipolarity. If technical objects, born from the objectification of the magical places that emerged from the background of the primitive world, tend, pushed by cognitive capitalism, to reorganize themselves in networks and reconstitute a new unity, what are the consequences for religious thought?

The impulse of reticular technologies that reconstitute unity with the universe in the perspective of control and the commercialization of life and death calls into question the religious phase, breaking the previous balance. This condition influences all religions and, in particular, the three main monotheistic belief systems. Our hypothesis is thus that, by subjectivizing and rendering “profane” the role traditionally allocated to the divine, technical capitalist thought unconsciously pushes the latter to regress towards archaic values by any means necessary. It is as if religious subjectivation tries to recover its primitive vocation of total need that it feels slipping through its fingers. In this regard, it is enough to recall the anathema of Pope Ratzinger – a theologian little inclined to the populism in vogue – against the “dictatorship of relativism.” In looking for universal and absolute values, fundamentalist theologians are convinced they will find the original strength to counter the invasion of technical, profane thought by going back to archaic values and ethics. This obviously doesn’t mean that, for example, in Islamic theocracies the use of contemporary technology is denied, but that, maybe unconsciously, they react against the supposed danger of a society that no longer has divinities to refer back to for ethics. This is common both to fundamentalist instincts and to the three monotheistic religions.

That said, the force and effects of this phenomenon differ: in the areas of Christianity and Judaism, cradle of the new technological paradigm and where the decline of belonging strikes ideologies and religion alike, fundamentalism sometimes manifests with virulence,10 though without assuming a driving or central function. In the great swath of the postcolonial South, from Morocco to Indonesia, where one of the great monotheistic religions, Islam, prevails, the situation is quite different. It doesn’t seem surprising that, facing western techno-biopolitical expression, archaic religious calls gain strength and increasingly radicalize. If post-capitalist social movements had managed to rapidly trigger new political processes during the Arab Spring, today we probably wouldn’t be witnessing the wars that tear apart, disperse and take entire populations hostage, “collateral damage” of two asymmetric necropolitical blocs fighting in a downward spiral: biotechnological capitalism on the one hand and absolutist obscurantism on the other.

One of the expressions of the explosion of this antagonistic equilibrium between the technical thought of cognitive capitalism and fundamentalist religion originated in the Middle Eastern wars and then spread globally. Its two significant and rival arms are, on one side, suicide bombers and, on the other, Hellfire missiles launched from remotely controlled drones that annihilate any form of life within a twenty-meter range.11 The kamikaze and the technological angel of death are the incarnation of two deviations that attempt to destroy one another, and us, without any hope of victory.

If fundamentalist thought were not the archaic equivalent of the Western neo-colonialist biopower which it opposes, and if it had a minimal awareness of the impulses that animate it, it would have promoted Nineveh and Palmyra as symbols of resistance rather than destroying them with several tons of TNT. In conclusion, nothing good will come of this war that opposes a simulacrum of god to the technological Leviathan, origin of the supreme algorithms attempting to subjugate the entire planet. Only a third path, the construction of a common based on post-capitalist ethics, can effectively counter this trend. The rest is a question of time.

1 Simondon, 1958.

2 The phase must be understood, according to Simondon, not from a temporal point of view but from the point of view of the relation of phases to the physical, in which it must be conceived of as a relation to another or others and the whole of the phase constitutes a complete system (in our case, reality).

3 Many institutionalized and temporal vestiges of these figures remain today: holidays, vacations, justified with the excuse of the rest, “often compensate with a magical charge lost in contemporary urbanization.”

4 Agamben, 1995, 90.

5 Marco Paolini wrote and produced “Ausmerzen. Vite indegne di essere vissute.” [Ausmerzen. Lives unworthy of being lived], a play that deals with Nazi eugenic theories and Aktion T4. This play was performed at Milan’s ex-psychiatric hospital “Paolo Pini” in 2011.

“This is the story of mass extermination known as Aktion T4. T4 stands for Tiergartenstraße 4, an address in Berlin.

During Aktion T4, around 300,000 people, classified as ‘lives unworthy of being lived’, were killed.” Paolini, 2012, 5 [our translation].

6 Hybridization here isn’t intended to support any particular current of posthumanist or transhumanist thought.

7 Already in 2012, there were more than 2,400 dead according to London’s “Bureau of Investigative Journalism”: “March of the robots,” The Economist, 2/06/2012, http://www.economist.com/node/21556103.

8 For example, the automatic sensorial strafing systems like Rafael’s Samson Remote Weapon Station, installed in Israel along the border with the Gaza Strip.

9 Chamayou, 2013. For a realistic representation of drone piloting stations in the US, see Good Kill (Niccol, 2015).

10 For example, the somewhat ample social movement against the so-called “Mariage pour tous” [Marriage for all] (which extended matrimony to homosexual couples) in France in 2014.

11 Chamayou, 2015, 120.


Bio: Giorgio Griziotti was one of the first digital engineers to graduate from Milan’s Politecnico University. His participation in the autonomous movements in Italy in the 1970s forced him to gain most of his professional experience in exile. He has more than thirty years of experience in large international IT projects. Today he is an independent researcher and member of the collective Effimera.

Released by Minor Compositions 2019
Colchester / New York / Port Watson
Minor Compositions is a series of interventions & provocations drawing from autonomous politics, avant-garde aesthetics, and the revolutions of everyday life. Minor Compositions is an imprint of Autonomedia

www.minorcompositions.info / www.autonomedia.org

Photo by Antonio_Trogu
