Out of the Frying Pan and Into the Fire

Republished from Aral Balkan 

Mariana Mazzucato [1] has an article in MIT Technology Review titled "Let's make private data into a public good."

Let’s not.

While Mariana’s criticisms of surveillance capitalism are spot on, her proposed remedy is as far from the mark as it possibly could be.

Yes, surveillance capitalism is bad

Mariana starts off by making the case, and rightly so, that surveillance capitalists [2] like Google or Facebook "are making huge profits from technologies originally created with taxpayer money."

Google’s algorithm was developed with funding from the National Science Foundation, and the internet came from DARPA funding. The same is true for touch-screen displays, GPS, and Siri. From this the tech giants have created de facto monopolies while evading the type of regulation that would rein in monopolies in any other industry. And their business model is built on taking advantage of the habits and private information of the taxpayers who funded the technologies in the first place.

There’s nothing to argue with here. It’s a succinct summary of the tragedy of the commons that lies at the heart of surveillance capitalism and, indeed, that of neoliberalism itself.

Mariana also accurately describes the business model of these companies, albeit without focusing on the actual mechanism by which the data is gathered to begin with [3]:

Facebook’s and Google’s business models are built on the commodification of personal data, transforming our friendships, interests, beliefs, and preferences into sellable propositions. … The so-called sharing economy is based on the same idea.

So far, so good.

But then, things quickly take a very wrong turn:

There is indeed no reason why the public’s data should not be owned by a public repository that sells the data to the tech giants, rather than vice versa.

There is every reason why we shouldn’t do this.

Mariana’s analysis is fundamentally flawed in two respects: First, it ignores a core injustice in surveillance capitalism – violation of privacy – that her proposed recommendation would have the effect of normalising. Second, it perpetuates a fundamental false dichotomy ­– that there is no other way to design technology than the way Silicon Valley and surveillance capitalists design technology – which then means that there is no mention of the true alternatives: free and open, decentralised, interoperable ethical technologies.

No, we must not normalise violation of privacy

The core injustice that Mariana’s piece ignores is that the business model of surveillance capitalists like Google and Facebook is based on the violation of a fundamental human right. When she says “let’s not forget that a large part of the technology and necessary data was created by all of us” it sounds like we voluntarily got together to create a dataset for the common good by revealing the most intimate details of our lives through having our behaviour tracked and aggregated. In truth, we did no such thing.

We were farmed.

We might have resigned ourselves to being farmed by the likes of Google and Facebook because we have no other choice, but that's not a healthy definition of consent by any standard. If 99.99999% of all investment goes into funding surveillance-based technology (and it does), then people neither have a true choice nor can they be expected to give any meaningful consent to being tracked and profiled. Surveillance capitalism is the norm today. It is mainstream technology. It's what we funded and what we built.

It is also fundamentally unjust.

There is a very important reason why the public's data should not be owned by a public repository that sells the data to the tech giants: it is not the public's data. It is personal data, and it should never have been collected by a third party to begin with. You might hear the same argument from people who say that we must nationalise Google or Facebook.

No, no, no, no, no, no, no! The answer to the violation of personhood by corporations isn't violation of personhood by government; it's not violating personhood to begin with.

That’s not to say that we cannot have a data commons. In fact, we must. But we must learn to make a core distinction between data about people and data about the world around us.

Data about people ≠ data about rocks

Our fundamental error when talking about data is that we use a single term to refer both to information about people and to information about things. And yet there is a world of difference between data about a rock and data about a human being. I cannot deprive a rock of its freedom or its life, and I cannot emotionally or physically hurt a rock, yet I can do all those things to people. When we posit what is permissible to do with data, if we are not specific about whether we are talking about rocks or people, one of those two groups is going to get the short end of the stick – and it's not going to be the rocks.

Here is a simple rule of thumb:

Data about individuals must belong to the individuals themselves. Data about the commons must belong to the commons.

I implore anyone working in this area – especially professors writing books and looking to shape public policy – to understand and learn this core distinction.

There is an alternative

I mentioned above that the second fundamental flaw in Mariana’s article is that it perpetuates a false dichotomy. That false dichotomy is that the Silicon Valley/surveillance capitalist model of building modern/digital/networked technology is the only possible way to build modern/digital/networked technology and that we must accept it as a given.

This is patently false.

It’s true that all modern technology works by gathering data. That’s not the problem. The core question is “who owns and controls that data and the technology by which it is gathered?” The answer to that question today is “corporations do.” Corporations like Google and Facebook own and control our data not because of some inevitable characteristic of modern technology but because of how they designed their technology in line with the needs of their business model.

Specifically, surveillance capitalists like Google and Facebook design proprietary and centralised technologies to addict people and lock them in. In such systems, your data originates in a place you do not own. On “other people’s computers,” as the Free Software Foundation calls it. Or on “the cloud” as we colloquially reference it.

The crucial point here, however, is that this toxic way of building modern technology is not the only way to design and build modern technology.

We know how to build free and open, decentralised, and interoperable systems where your data originates in a place that you – as an individual – own and control.

In other words, we know how to build technology where the algorithms remain on your own devices and where you are not farmed for personal information to begin with.
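
To make the contrast concrete, here is a minimal illustrative sketch – not Balkan's own design, and with every name in it hypothetical – of what "the algorithm stays on your device" can mean in practice: the record of a person's interests lives in a local file they control, the computation runs locally, and nothing is transmitted to a server.

```python
import json
from collections import Counter
from pathlib import Path

# Hypothetical local store: a record of interests kept on the person's own device.
LOCAL_STORE = Path.home() / ".my_interests.json"

def record_interest(topic: str) -> None:
    """Append a topic to the on-device store; nothing is transmitted anywhere."""
    data = json.loads(LOCAL_STORE.read_text()) if LOCAL_STORE.exists() else []
    data.append(topic)
    LOCAL_STORE.write_text(json.dumps(data))

def recommend_locally(n: int = 3) -> list:
    """Run the 'algorithm' (here, a trivial frequency count) on the device itself."""
    if not LOCAL_STORE.exists():
        return []
    counts = Counter(json.loads(LOCAL_STORE.read_text()))
    return [topic for topic, _ in counts.most_common(n)]

if __name__ == "__main__":
    record_interest("p2p networks")
    record_interest("free software")
    record_interest("p2p networks")
    # The recommendation is computed locally; no profile is ever built on a remote server.
    print(recommend_locally())
```

The point of the sketch is purely architectural: whoever holds the store and runs the computation holds the power, so keeping both on the individual's own device is what makes the data theirs.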

To say that we must take as given that some third party will gather our personal data is to capitulate to surveillance capitalism. It is to accept the false dichotomy that either we have surveillance-based technology or we forego modern technology.

This is neither true, nor necessary, nor acceptable.

We can and we must build ethical technology instead.

Regulate and replace

As I’m increasingly hearing these defeatist arguments that inherently accept surveillance as a foregone conclusion of modern technology, I want to reiterate what a true solution looks like.

There are two things we must do to create an ethical alternative to surveillance capitalism:

    1. Regulate the shit out of surveillance capitalists. The goal here is to limit their abuses and harm. This includes limiting their ability to gather, process, and retain data, as well as fining them meaningful amounts and even breaking them up. [4]
    2. Fund and build ethical alternatives. In other words, replace them with ethical alternatives. Ethical alternatives do exist today, but they do so mainly thanks to the extraordinary personal efforts of disjointed bands of so-called DIY rebels.

Whether they are the punk rockers of the tech world or its ragamuffins – and perhaps a little bit of both – what is certain is that they lead a precarious existence on the fringes of mainstream technology. They rely on anything from personal finances to selling the things they make, to crowdfunding and donations – and usually combinations thereof – to eke out an existence that both challenges and hopes to alter the shape of mainstream technology (and thus society) to make it fairer, kinder, and more just.

While they build everything from computers and phones (Puri.sm) to federated social networks (Mastodon) and decentralised alternatives to the centralised Web (DAT), they do so usually with little or no funding whatsoever. And many are a single personal tragedy away from not existing at all.

Meanwhile, we use taxpayer money in the EU to fund surveillance-based startups – startups which, if they succeed, will most likely be bought by larger US-based surveillance capitalists like Google and Facebook. If they fail, on the other hand, the European taxpayer foots the bill. Europe, bamboozled by and living under the digital imperialism of Silicon Valley, has become its unpaid research and development department.

This must change.

Ethical technology does not grow on trees. Venture capitalists will not fund it. Silicon Valley will not build it.

A meaningful counterpoint to surveillance capitalism that protects human rights and democracy will not come from China. If we fail to create one in Europe then I’m afraid that humankind is destined for centuries of feudal strife. If it survives the unsustainable trajectory that this social system has set it upon, that is.

If we want ethical technological infrastructure – and we should, because the future of our human rights, democracy, and quite possibly that of the species depends on it – then we must fund and build it.

The answer to surveillance capitalism isn’t to better distribute the rewards of its injustices or to normalise its practices at the state level.

The answer to surveillance capitalism is a socio-techno-economic system that is just at its core. To create the technological infrastructure for such a system, we must fund independent organisations from the common purse to work for the common good to build ethical technology to protect individual sovereignty and nurture a healthy commons.


  1. According to the bio in the article: "Mariana Mazzucato is a professor in the economics of innovation and public value at University College London, where she directs the Institute for Innovation and Public Purpose." The article I'm referencing is an edited excerpt from her new book The Value of Everything: Making and Taking in the Global Economy.
  2. Although she never explicitly uses that term in the article.
  3. Centralised architectures based on surveillance.
  4. Break them up, by all means. But don't do anything silly like nationalising them (for all the reasons I mention in this post). Nationalising a surveillance-based corporation would simply shift the surveillance to the state. We must embrace the third alternative: funding and building technology that isn't based on surveillance to begin with. In other words, free and open, decentralised, interoperable technology.

Photo by JForth

Culture, Community, and Collaboration – New Directions for Protecting Indigenous Heritage

Questions about who “owns” or has the right to benefit from Indigenous heritage are at the core of ongoing political, economic, and ethical debates taking place at local, national, and international levels. When it comes to research in this area, Indigenous peoples have typically had little say in how studies related to their heritage are managed. Increasingly though, efforts are being made to decolonize research practices by fostering more equitable relationships between researchers and Indigenous peoples, based on mutual trust and collaboration.

In this presentation George Nicholas reviews debates over the “ownership” of Indigenous heritage and provides examples of new research practices that are both more ethical and more effective. These collaborative research models, in which the community leads the research, highlight important new directions in protecting Indigenous heritage.


Originally published on Remix the Commons

Science as Public Good and Commons as a Science

A discussion on the interweavings of science with the commons, the public and the private, and the ways we understand, produce and socialize it. By Antonio Lafuente and Adolfo Estalella, from the Instituto de Historia (CSIC) and the University of Manchester.


Proclaiming the public nature of science has become something as commonplace as it is controversial. At times, the consensus is overwhelming: more science and more research funding are universally in demand, taking it as given that science is not only economically necessary but also morally irreplaceable. However, the agreement was never absolute. There have always been those willing to blame a democratic deficit for the fact that there is so little discussion of the type of science we want, or for the fact that we still treat the environmental and health damages produced by the deployment of techno-science as externalities. The truth is that science isn't only public, it's also private, and the crossbreeding between academia, government and business is old, deep, and sometimes murky.

Science isn’t only semi-public: it couldn’t survive without the publics [a]. There is an extensive body of work that insists upon the social, urban, and collective nature of science. Within it we are shown how science has always maintained a complex, vibrant and dynamic relationship with the people: amateurs and artisans, witnesses and spectators, activists and consumers. Yes, it’s true that, for better or worse, the citizenry owes a lot to science; equally correct is the thesis that science owes much to the citizenry. A lot of what contributes to the knowledge we find so hard to accept is anonymous, invisible and tacit; our narratives insist upon scorning that knowledge. Consequently, it seems as if the entire world is complicit in creating this absurd and biased image of science.

To build our argument, we've divided this text into three parts. In the first we'll explore the historical origins of science's status as a public good. In the second we will highlight the problems derived from treating Commons science and open science as analogous, which is to say that the exigencies of the open access or open data movements, while necessary, are not sufficient. The third section argues that the condition of being a common good comes not from its being provided for all, but rather stems from being created among all. This places the common good in a third sector of its own, alongside the public and the private.

The truth is that science isn’t only public, it’s also private, and the crossbreeding between academia, government and business is old, deep, and sometimes murky. For decades, perhaps centuries, we’ve told the history of science as if it were akin to the planetary expansion of an oil stain, or the transmission of an epidemic. But there is nothing natural about the transmission of knowledge.

Science understood as a commons would not then be simply a public science seeking an outlet via open access, nor would it be an extramural non-commercialized science, or a formal science (as always) including the citizenry in the design and evaluation of projects and results. Science understood as a commons wouldn’t be the same old science in a democratic or post-modernist guise. Science doesn’t become a commons by being more functional, open or militant; instead it results from the application of contrasted, collective and recursive epistemic practices. The Commons would then be another approach, historically distinguished for producing knowledge, community, and commitment. Thus, in the third part, rather than discussing science as a commons, we will discuss the Commons as a science.

Science Commons as a Public Good

The notion of science being a public good is relatively recent. Philip Mirowski (2011) has devoted much effort to explaining this idea. In order to understand it, we must unavoidably admit that the pressures imposed upon scientists by the Church, Empires, and States bear a striking resemblance to those which today's industrial corporations seek to impose. We know that by the 19th century, university laboratories were being scrupulously combed by industrialists probing between test tubes and coils for some discovery upon which new monopolies could be built. It all seems to point to the communitarian conception of science gaining credit in part because it helped legitimize corporate-financed industrial laboratories claiming discoveries as their own: if the discovery was assumed to be collective, no one except the owner of the space where the knowledge was produced could claim the patent.

The Second World War drastically changed this panorama. During the second third of the 20th century, the state gave itself the right to manage science and create the necessary conditions for advancing innovation. The war economy brought about a techno-military complex where the public sector invested in basic sciences, in order to guarantee the free circulation of knowledge among entrepreneurs playing a game whose rules, fixed by the Army, served the national interest. Its condition as public goods meant the militarization and nationalization of so-called Big Science. Everything changed in a hurry from the 1980s on, creating conditions which were favorable for launching an accelerated process of knowledge privatization. Not just inventions but discoveries could be subjected to intellectual property rights and, consequently, be treated as assets circulating in the stock exchange to attract venture capital. In this new regime of academic capitalism the boundaries between the public and the private dissolve.

The transition, however, wasn't without resistance. What by now is obvious to all was only anticipated by some, and their arguments are still relevant. Paul A. David (2008) explains how, since the dawn of modern science, a perception emerged of scientists as unmanageable people, owing to the sophisticated nature of their knowledge. The truth is that the court, given that nobody could act as a counterweight, elected to open knowledge so that it could itself preside over the quality of scientific work. This would be the origin of prizes, academies, and periodical journals. The autonomy of science imbued its organization with the qualities of a meritocratic, open and cosmopolitan enterprise. Distinguishing between sages and charlatans required the concurrence of new spaces, different actors and different mediations that, as a whole, lead us to treat what is known as the Scientific Revolution not as an epistemic revolution but as an open science revolution.

Michael Polanyi also wanted to join the club of those who opposed treating knowledge as information in order to eventually, after uprooting it from its places of production, turn it into a profitable resource. The commodification of science is impossible when the only knowledge that can be patented is non-tacit. The aforementioned positions argue that science only prospers when it is kept as a collective enterprise whose products are not reduced to codified information, and whose organization won't overwhelm the attempts to confine it within a protected environment. The history of ideas, the anthropology of organizations and the economics of innovation coincided in demanding an active role for the state in the preservation of science as a public good. And this is the tradition assumed and inherited by Michel Callon (1994) in his provocative vision of science.

Callon’s reasoning begins by obliging his readers to accept that knowledge was always the most mundane of enterprises, never isolated from the surrounding interests. To say otherwise was tantamount to ignoring the ample work previously undertaken in the field of the study of science. For decades, perhaps centuries, we’ve told the history of science as if it were akin to the planetary expansion of an oil stain, or the transmission of an epidemic. But there is nothing natural about the transmission of knowledge. What STS has taught us is that verifying any natural law or testing the relevance of a scientific concept necessitates a plethora of machines, technicians and reagents, as well as the time and resources to produce, select, contrast, discuss, standardize and communicate the results. As such, the desire to have science as a commons is a utopian undertaking which obliges us to examine whether we can truly assume the potentially untenable cost of transmission.


Michel Callon has shown us that a robust science should promote the necessary Freedom of Association to operate different forms of organization; he also calls for a Freedom of Extension to prevent the network from allowing the obstruction or imposition of any type of orthodoxy or canon. Finally, he invites all participants to a Struggle against Irreversibility to prevent monopolies from creating standards that block innovation. The notion of public goods is explicitly related to the notion of diversity and not of open access. The importance lies not in the equitable distribution of knowledge, but rather in creating conditions that prevent interruptions in the process of knowledge production and diversification. The resource in need of protection isn't knowledge itself, but the plurality of forms of socialization it facilitates. We don't need the state to protect knowledge itself, but the networks it circulates within. It isn't about protecting ideas that are published or deserve a Nobel Prize, but the infrastructures supporting them, which are often as inscrutable as they are contrary to the Commons.

Science Commons as Open Science

Imagining science as a commons requires that we stop thinking of it as something separable from the market. We also have to disentangle this claim from notions of open access. Elinor Ostrom (1990) argued for this with memorable aplomb: there is nothing more contrary to the Commons than an open access system absent any form of governance. Confusing the two concepts is in fact what led Garrett Hardin to proclaim the tragedy of the Commons and to demand, as a survival strategy, the public or private patrimonial appropriation of those resources that really matter. The Commons, Ostrom remarked, are not a thing, but a management process that collapses when the community that sustains and is sustained by them doesn't incorporate efficient rules to protect itself from, among other threats, free riders.

Over the last decade we've witnessed the birth of various movements which demand that science be granted the status of an open enterprise. Though not all of their proponents use the same arguments, nor emphasize the same concepts, it seems reasonable to mention two central sets of motivations. On the one hand, there are those who question the generalized practice of externalizing the process of communication. They all share the critique of the current system as both profligate and paradoxical, as it incurs huge expenses to produce papers that we are later obliged to buy back from those who'd previously received them from us gratis.

The second motivation for demanding open access to scientific information has to do with the desideratum for well-informed politics, belief in free choice, and the strengthening of democracy. Debates on energy options, consumption of GMOs, air quality, food labeling, and the treatment of chronic diseases prompt processes that should be discussed openly. No less important is the fact that the exaggerated costs of scientific information or pharmaceuticals exclude poor institutions and countries from their use. This adds science to the list of factors contributing to the widening inequalities in the world.

The criticisms of extravagance, careerism, and opacity are justified and validate an orientation towards open access. The quality of our democracies and global justice are neither minor, nor likely deferrable, objectives. But it's an absolutely insufficient debate. While the politics of open science do correct some of the most heinous shortcomings of the current system, it is no less true that open, online and free distribution requires a set of conditions that, ultimately, benefit big corporations foremost – or, in other words, those able to capitalize on the information. Furthermore, it's not clear that accessibility corrects the role of science in our world in any decisive way. Making the information freely available is not tantamount to being able to use or do something with it, as it will remain invariably tied to the technologies and values through which it was produced.

Those who’ve studied open science invite us to consider cases such as SETI [1] or all the crowd-sourced projects related to the pioneering BOINC [2] platform. Voluntary computation has shown itself to be a powerful mechanism for solving problems that demand enormous calculating capacity. Wikipedia and Fold.it [3] are two very different projects that authoritatively demonstrate the emergent power available to interconnected multitudes. We are speaking of colossal mechanisms connecting millions of humans; we’re also referring to new ways of producing and validating knowledge. Examples that allow us to imagine an empowered citizenry capable of producing facts to counter official data do exist. We could be talking about environmental or food crises, or the production of new cartographies, different patterns, or different institutions. If that were the case, we would witness the birth of different systems of knowledge gestated through pioneering forms of coding, communicating, archiving, and validating the knowledge. Laboratory space, formerly reserved for experts, is turning into disputed territory. The experts have good reason to feel uneasy. It all indicates that their consolidated hegemony could be in danger. It wouldn’t be the first time that the needs of the disgruntled have provoked a broadening of the space where knowledge resides, including new agents and different questions. We must sign the peace treaty: we need a lasting agreement that doesn’t insist upon dividing the world between those who know and those who don’t, an armistice to liberate the world from the arrogance of experts. Stating that we need science to guarantee a prosperous future is a message that is as certain as it is exhausted. Moreover, it is the carrier of a plan that legitimizes exclusion while guaranteeing new wars for science.

If we had to put the term "citizen science" on the scale, we would have to acknowledge that it tips more heavily toward science, despite extending beyond academia. Effectively, citizen science is independent science, knowledge developed by virtuous communities who, radical in their political rhetoric, are more conservative than we imagine in scientific practice. But citizen science isn't monolithic, and we could stand to evoke that diversity in the plural. All the citizen sciences share a gesture of resistance. Some have also highlighted the existence of alternative means of relating to political, economic, scientific and environmental realities. Having reached this point, it's imperative to mention hacker culture. The truth is, we owe much to Pekka Himanen and his notion of hacker ethics as the expression of technological non-conformism, negating the idea that things can only be that for which they were designed. But the most radical of hacker gestures, as shown by McKenzie Wark, not only implies a questioning of functionalities but also a confrontation with their properties. Hacking the world, beyond the invention of new possibilities for inhabiting and transforming it, could return to the Commons all that has been abusively relegated to state and market patrimony. The first hackers, from the 60s onwards, invented the squaring of the circle: to be an author doesn't demand that you be a proprietor. Achieving the condition of author happens the moment the author gives away the thing that was authored. Thus, accreditation functions as an admirable way of opening knowledge.


Commons as a Science

None have been as radical in their approaches as the hacker movement. No one has managed to attain a better set of practical and sustainable protocols for a culture that is open, experimental, inalienable, horizontal and distributed. Moreover, as an abstract culture adept at generating new futures and imbued with a curiosity that generates hope, hacker culture has made its mark on other domains. It is no longer restricted to geeks, nor is it the domain of maladapted computer scientists. Nowadays we talk of hacking museums, academia and the city. We have hundreds of projects daring to examine the arts as ventures to be reformed according to non-mercantile principles, fighting to rescue music, painting or architecture from the clutches of the culture, tourism, or housing speculation industries. The city itself, its squares and abandoned lots, can be inhabited in other ways. From 2011 on, the Occupy movement has been the most visible manifestation of something that has been going on for decades all over the world. The city has been occupied, it must be occupied [b]; we must wrest it away from the entertainment, security and housing corporations (Harvey 2012). This is the origin of the open source urbanism that looks to and is inspired by free software hackers.

There is no city where citizens aren’t gathering in the squares and open lots; where, fed up with bowing to the ideal of individual consumerism, people aren’t stretching themselves a bit and getting reacquainted with the pleasures of group dances, sharing food, holding bazaars, fairs and other kinds of popular festivals. They’d almost convinced us that we’d do well to forget the old ways of socializing, to toss them out as old-fashioned and fossilized. However, now we see these things as our cultural patrimony, bringing out the best in us; that is to say, what we share and create among ourselves. This new urbanism is not new construction, but rather new relationships to one another experienced through material intervention in our own city. Madrid is an example of what we’re talking about right now in the second decade of our new century, but we also see it in Berlin, San Francisco, Buenos Aires, Cape Town and Mumbai. A city re-urbanization shooting up from the abandoned lots, urban gardens, collective bike routes, self-guided walking tours, neighbourhood assemblies, local markets, rescued festivals, recovered collective memory, and finally, in the thousand and one ways of redesigning the city and gathering where our common links are weakest, most sporadic, tentative, intermittent – yet still apparent and solid, in place and functional (Corsín Jiménez 2014).

People are learning to experience and experiment with their city in new ways, with or without architects, with or without designers, with or without anthropologists. The experiment has been granted new life beyond the confines of the lab, setting in motion a process where the places, parties and infrastructures needed to turn the city into both an object and a place for experimentation are being re-imagined. There's no scarcity of the credentialed in many of these projects, but theirs is not an expert role; they recognize themselves as part of a collective experiment to problematize established forms of authority. They all feel the importance of these communities of learning. Everyone experiments, everyone investigates, everyone interprets, everyone contrasts, everyone consents, everyone learns and everyone creates new knowledge: a commons science, to create a city of the Commons.

It’s not hard to find references from hacker culture in these projects, as they often invoke the free software community as a source of inspiration. In them, we find some of the characteristics that make free software a singular mode of production, knowledge, and sociability, among them rapid prototyping, recursive in its organization and granular in the distribution of its efforts (Kelty 2008). Rapid prototyping means that designs, objects and proposals are circulated before they are even finished. This is a vulnerable and precarious state of affairs that, nonetheless, compels others to partake in an effort that aspires to be collective. In this way, the widening of the design object is paralleled by the growth of the community surrounding it. But this ongoing beta state also allows for forking at any stage, the possibility of opting for another alternative, and separating from the dominant criteria. Free software, then, is always open to all its potential, always functions as a beta design, a prototype manifested in a non-niche community, a project that is always more than many and less than one.


Projects that learn from their errors are recursive, something that children do systematically and which, at times, also lies within the reach of adults. But in this context, our interest in recursive notions lies in their application to systems rather than to persons or simple projects. In such circumstances we detect a recursive nature when not only the functionality of the mechanism is preserved, but also its moral integrity — in other words, when the protocols and the code are responsible for the preservation of the values that sustain the project, that is to say the community. What makes the free software community vibrant isn't the intention of producing for everyone, but of involving everyone in its construction. Here is the reason why the distribution of its efforts is granular; anyone can contribute with his or her knowledge and available time and effort.

The Commons that hackers are working towards isn’t guaranteed by free access, but by a willingness to not exclude any form of collaboration that improves the result. This is, obviously, not a product, but a way of understanding our relationship with technology and other humans based on the principle that the language used by machines for communication must be open and communities must be peer-based in order to dissolve the artificial, imaginary boundaries society imposes between nationals and aliens, experts and amateurs, communicating and sharing — and between free as in “open”, and free as in “free of charge”. As we’ve mentioned, we are speaking of cosmopolitan, informal communities based in the gift economy.

Just as the city is reinvented, the body is also involved in a process of reconfiguration. The accelerated expansion of chronic ailments — coupled with the growing number of persons with severe conduct, nutritional, mental or addictive disorders, along with numerous collectives of victims assailed by allergies or intolerances — marks incurable ills as a new and disquieting phenomenon within our world. We’ve been educated in the conviction that every ill has a technical, scientific, and therefore, political solution. We weren’t ready to confront the obvious and admit that not all bodies are equal and that all react in different ways to the same therapies. Generalized solutions always produce minorities of sufferers. Many people — it’s hard to know if they’re the most lucid or the most disheartened — seriously doubt whether institutionalized wisdom can offer adequate consolation. And there are answers for everything, from those who’ve fallen into the arms of disciplines as alternative as they are hazy, and from those who’ve opened the floor to talk about what’s going on with them (and us).

Putting the pieces back together is difficult and very costly. But the Internet allows it at zero cost, as in the case of the mentally ill who communicate amongst themselves in Brain Talk Communities, or those affected by electro-hypersensitivity who don't even possess the words to describe their ailment. Dissatisfied with available diagnostics and treatments, they take on the task of identifying traits that might be recognized as symptoms, compelling them to manufacture a shared and contrasted language. These projects constitute a gigantic real-time clinical study, where the affected themselves have decided to take the reins of their own bodies. There are none more interested in finding a good answer than those who are risking their own lives searching for it. They know that they can only aspire to enhancing their quality of life: for them at least, the healing paradigm has been left behind.

The experiment is proven when they agree that they feel better, even though this recovery, as with drug addicts, is etched in words. It is an effect of a commitment held between all, not an individual solution. If participants get taken seriously by formal scientific institutions, or experience an improvement, there is no other choice but to admit that we’re speaking of knowledge constructed by all. The community that sustains it is recognized — diagnosed, even — in light of the fact that the knowledge it produces is validated by virtue of being functional. The community both serves and is created by this cognitive activity. Let’s summarize its nature: experimental, open, relational, distributed, horizontal, collaborative, inalienable and recursive. As conversationalists, they are producing a relational body based on the experiential, in all that the academic laboratory qualifies as collateral, irrelevant or useless (Lafuente e Ibáñez-Martín s/d). It’s the same experience we’ve previously described when referring to urbanism. From abandoned lots, from social practices long ignored for belonging to the poor, ignorant or marginal, we are reinventing the city. In the same way, we are creating a common body from all that’s left, that which was discarded as irrelevant by formal scientists.

We now have all we need to reach a conclusion. The Commons science that has set out to reinvent the body and the city is a knowledge enacted from the experiential, from which, consequently, none can be excluded. Commons science is not an alternative to academic science. Each has need of the other, although we'll occasionally see them contending for public space and, ever more frequently, for the publics.

References

  • Callon, M. (1994) “Is Science a Public Good? Fifth Mullins Lecture, Virginia Polytechnic Institute, 23 March 1993”, Science, Technology & Human Values, Vol.19/n.4: 395-424.
  • Corsín Jiménez, A. (2014) “The right to infrastructure: a prototype for open source urbanism”, Environment and Planning D: Society and Space, Vol.32 (advance online).
  • David, P. A. (2008) “The Historical Origins of ‘Open Science’. An Essay on Patronage, Reputation and Common Agency Contracting in the Scientific Revolution”, Capitalism and Society, Vol.3/n.2.
  • Harvey, D. (2012) Rebel Cities. From the Right to the City to the Urban Revolution, London, New York: Verso.
  • Kelty, C. (2008) Two Bits. The Cultural Significance of Free Software, Durham: Duke University Press.
  • Lafuente, A. and Ibáñez-Martín, R. (n.d.) “Cuerpo común, y cuerpos colaterales”, manuscript.
  • Mirowski, P. (2011) Science-Mart. Privatizing American Science, Cambridge, MA: Harvard University Press.
  • Ostrom, E. (1990) Governing the Commons: The Evolution of Institutions for Collective Action, Cambridge: Cambridge University Press.

Footnotes

[1] http://setiathome.ssl.berkeley.edu/

[2] http://boinc.berkeley.edu/

[3] http://fold.it/portal/


Translator’s notes

[a] We’ve elected to use “the publics” instead of “the public” in reference to Callon’s own nomenclature.

[b] “Okupar” in the original. “Okupar”, an alteration of the Spanish “Ocupar”, makes reference to the squatter movement, as well as the Occupy Movement.

Produced by Guerrilla Translation under a Peer Production License.


Article translated by Stacco Troncoso and Anne Marie Utratel – Guerrilla Translation

Originally published in Guerrilla Translation

Lead Image by Alison
