Christian Siefkes has updated his insights on contributory economics:
(original article with links, via Keimform, here)
“While in my book I describe what could be characterized as “open sharing communities requiring reciprocity” (you are required to contribute in order to benefit), my more recent work is about “open sharing communities facilitating reciprocity” – where contributing in some ways is easy and encouraged, but it is not required in order to benefit. When we look at existing successful peer communities, we see that they tend to follow the latter model, hence the change.
The old model was more concerned with lifting the Protestant work ethic into a context of peers (as opposed to the buyer/seller relation of capitalism). Work is considered a necessary evil that people will only do if required (forced) to do so, hence contributions must be required or they won’t be done. But actually, successful peer production is more about extending and generalizing the hacker ethic – turning work into something that is fun, pleasurable and rewarding in itself. If you manage to do this, requiring contributions or giving additional external rewards is no longer necessary and indeed often harmful (the crowding-out effect described by Benkler and others).
Michel Bauwens inquired if it’s not more reasonable to have an “integrative approach” which goes from “one special case, the need for reciprocity, to another, that doesn’t require it.” But actually, I never said anything against reciprocity per se. Reciprocity exists in either case, both in my old approach of “requiring reciprocity” as well as in the newer approach of “facilitating reciprocity.” I do something for others, and others do something for me. Indeed, that’s true of any society.
Also, it’s noteworthy that in both models, reciprocity is indirect – I do something for the community (= other community members) and the community (= other community members) does something for me. But the community members that do something for me will generally not be those that I do something for. In my old “task auctioning” model this indirect reciprocity was enforced and measured – I had to give back (in general) the same amount of labor that was needed to produce the goods I consume.
Society-wide, that relation will always hold – only the goods that have been produced can be consumed. But I now think it is no longer necessary to enforce this on an individual level. If you stop considering consumption as “the good” (that everybody wants to increase as much as possible) and production as “the bad” (that everybody wants to avoid), instead considering both as necessary, interwoven, and potentially pleasant aspects of life (as the hacker ethic does), then enforcement becomes much less important. After all, you wouldn’t force people to consume, so why force them to produce?
That still leaves the question of how to minimize possible mismatches between consumptive and productive desires. I think that stigmergy, automation and re-organization are the best responses here:
1. Announce the tasks necessary for your consumptive goals, and wait for volunteers.
2. If there aren’t enough, try to automate the task, i.e. let machines do it. Getting there will usually require other tasks, so go back to step 1 for them. (I think it will often be easier to find volunteers to automate something than to do it manually.)
3. If automation is not (or only partially) possible and there still aren’t enough volunteers, think about how to re-organize the task in such a way that it becomes more attractive for potential volunteers. (Indeed, potential volunteers will do this themselves and might decide to re-organize tasks in ways you didn’t foresee.)
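The three steps above form a simple decision loop. Here is a minimal sketch of that loop in Python; all names (`Task`, `allocate`, the callback parameters) are hypothetical illustrations, not part of any existing system:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A task announced to the community (hypothetical model)."""
    name: str
    needed: int = 1                              # volunteers required
    automatable: bool = False
    volunteers: list = field(default_factory=list)
    reorganized: bool = False

def allocate(task, find_volunteers, automate, reorganize):
    """Apply the three stigmergic steps: announce, automate, re-organize.

    The callbacks stand in for community processes:
    - find_volunteers(task) -> list of volunteers who stepped up
    - automate(task) -> True if machines can now do the task
    - reorganize(task) -> restructure the task to make it more attractive
    """
    # Step 1: announce the task and wait for volunteers.
    task.volunteers = find_volunteers(task)
    if len(task.volunteers) >= task.needed:
        return "staffed"
    # Step 2: try to automate it (in practice this may spawn further
    # tasks, which would themselves go back to step 1).
    if task.automatable and automate(task):
        return "automated"
    # Step 3: re-organize the task once, then announce it again.
    if not task.reorganized:
        reorganize(task)
        task.reorganized = True
        return allocate(task, find_volunteers, automate, reorganize)
    # Anything left over ends up in the residual pool of unwanted tasks.
    return "unassigned"
```

Note the deliberately loose coupling: the announcer never assigns anyone; volunteers self-select, and re-organization may happen in ways the announcer did not foresee.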
After all this, a pool of (apparently quite unpleasant) tasks which nobody wants to do might remain. For these, I would first consider voluntary distribution among the community members, where (more or less) everybody does a small part of them now and then, without something very bad happening if you don’t. (Though you could get some bad looks or nagging from other community members, if community expectation of doing these tasks is high and you refuse. Hence the line between voluntary and enforced can be quite blurred.) If that doesn’t work, i.e. if too many people opt out, the community would doubtlessly agree on more formal sanctions, such as restricting the consumptive options of those who refuse. In this case, the task pool would revert to my old model of required reciprocity.
I certainly won’t rule out that this can happen, so in that sense I haven’t “given up” my old model. I just don’t consider it the best, or even the most likely, scenario. Let’s see how far we get with solutions that follow the hacker ethic, or “peer spirit” – stigmergic, self-organized, voluntary. We can figure the rest out later if and when needed.”