How cooperation and collaboration may be “gamed”…

In a recent discussion, Venessa Miemis asked me whether I thought cooperation can be “gamed” or not.  While a mutual rating system (eBay’s, for example) may be tough to game, there are known examples of eBay ratings being “gamed” by groups of people colluding to give one another good ratings.
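To make the eBay example concrete, here is a minimal sketch (all account names and numbers are hypothetical, and real rating systems are more sophisticated than this) of how a small collusion ring inflates averages in a naive mutual rating system:

```python
# Hypothetical sketch of a collusion ring gaming a mutual rating system:
# ring members leave each other 5-star ratings with no real trades behind them.

from collections import defaultdict

ratings = defaultdict(list)  # seller -> list of 1-5 star ratings

def rate(seller, stars):
    ratings[seller].append(stars)

def average(seller):
    scores = ratings[seller]
    return sum(scores) / len(scores)

# Honest history: "ring_leader" has a poor real-world track record (2.0 average).
for stars in [2, 1, 3, 2]:
    rate("ring_leader", stars)

# Collusion: six sock-puppet accounts each leave a 5-star rating,
# and the favor is returned.
sock_puppets = ["puppet_a", "puppet_b", "puppet_c",
                "puppet_d", "puppet_e", "puppet_f"]
for puppet in sock_puppets:
    rate("ring_leader", 5)  # fake positive feedback for the leader
    rate(puppet, 5)         # reciprocal fake feedback for the puppet

print(round(average("ring_leader"), 2))  # honest 2.0 average now reads 3.8
```

Nothing here requires any real transaction, which is exactly why reputation systems that simply average mutual ratings are gameable by coordinated groups.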

Beyond mutual rating systems, there are a number of other ways cooperation and collaboration may be “gamed”:


Coercion

Of course, Douglas Rushkoff’s classic book by the same name (London: Little, Brown & Company, 2000) gives scores of examples of how human nature and its propensity to give, help, cooperate, and collaborate are used to get people to do the bidding of others.  Cooperation is “gamed” in part by pushing people to commit to relationships where most of the value flows away from them and towards you (perhaps with vague promises, or combined with other cooperation-gaming techniques). You can then “freeride” on their contributions once you’ve locked them into committing.  In P2P networks, I contend this coercive attitude is actually increasing as more people gravitate towards sharing ecologies, gifting cultures, and commons-based approaches.

Propaganda is also used in this way, on mass scales and on person-to-person scales. Classic propaganda techniques are employed in groups of all sizes, and the core point of employing them is to convince others to come to a pre-fabricated conclusion: appealing to emotion over reason, using convoluted logic, creating a “bandwagon” effect.


I wrote about this back in 2006 (incorrectly attributed to Paul Lamb on the Smart Mobs blog), and it was also discussed on ZDNet (the “Koolaid guy saga”).  Very much along the lines of hoaxes that spread around the internet, the idea is to take advantage of people’s desire to be included (really another form of coercion), combined with people’s reaction to the novel, unique, and polarizing.  Muzafer Sherif had an early perspective on this in his work on the “assimilation and contrast effect” (Muzafer Sherif, Daniel Taub, and Carl I. Hovland, “Assimilation and contrast effects of anchoring stimuli on judgments,” Journal of Experimental Psychology, 1958, 55, 150-155).  Giving people black-and-white arguments to take sides on will push them to one side or the other. Given access to a medium where they can relay their side-taking, resonance with wanting to cooperate with your “side” can see the message spread extremely fast and widely through networks. In this way, an “information cascade” is used to game people’s propensity to want to collaborate stigmergically, to cooperate with “like-minded” people, or to get support for opposing non-“like-minded” people.
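The cascade dynamic above can be sketched with a toy linear-threshold model (the network, seed, and threshold here are all hypothetical, chosen only for illustration): each person takes a “side” once enough of their neighbors have, so a single well-placed seed can sweep the whole network.

```python
# Toy "information cascade" on a network: a node adopts a side once the
# fraction of its adopting neighbors reaches a threshold.

def run_cascade(neighbors, seeds, threshold=0.5):
    """Linear-threshold cascade. `neighbors` maps each node to the nodes
    it listens to; `seeds` are the initial side-takers."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, links in neighbors.items():
            if node in adopted or not links:
                continue
            frac = sum(1 for n in links if n in adopted) / len(links)
            if frac >= threshold:
                adopted.add(node)
                changed = True
    return adopted

# A small ring-shaped network where each node listens to two neighbors.
neighbors = {
    "a": ["b", "f"], "b": ["a", "c"], "c": ["b", "d"],
    "d": ["c", "e"], "e": ["d", "f"], "f": ["e", "a"],
}

print(sorted(run_cascade(neighbors, seeds={"a"})))  # every node ends up adopting
```

In this sketch a single seed is enough to flip every node, which is the sense in which polarizing, side-taking messages can spread “extremely fast and widely” through networks.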

Co-opting, one-way use, and not giving attribution

This can range from simply taking from what is shared while contributing nothing back, to more subtle and artful forms of cooperation/collaboration gaming.  One of the very worst and most widespread variations takes the form of not giving credit or attribution to contributors or the sources of ideas, theories, and actual work.  Attribution is the currency of open and commons-based systems. If you are not giving credit and attribution (which usually takes literal seconds to do), you are definitely gaming the system, and destroying the commons where the value you co-opted and subtly represented as your own was originally freely shared by others.  Not giving attribution and co-opting value discourages future open contribution.  Not finding a way to contribute or reciprocate value back to the commons you are drawing from makes those commons unsustainable.

3 Comments

  1. Sam Rose

    Patrick Anderson emailed me and pointed out:

    “It is worth noting that the GNU GPL – which is currently the most
    popular Free Software license – does *not* require attribution, and
    yet has been instrumental in creating probably the most stable commons
    for virtual materials we have yet seen.”

    My reply:

    I think it is safe to say that a huge and significant amount of
    software, knowledge, and hardware released out there *does* present
    attribution (many times in automated ways via wikichanges, revision
    control, and listing contributors in license or other text).

    So, whether or not the GNU GPL or any other license *requires*
    attribution, I think it is worth noting that there is a culture which
    is *giving* attribution, and I think there is a reason why that is
    happening. I think it has to do in part with sustaining a commons
    where many participants expect it.

    Patrick then replied:

    “This is a very good point as far as the psychology (or is it
    sociology) of how those who collaborate under such licenses interact
    and are ‘sustained’ by the certain amount of pride and/or recognition
    that comes with fixing bugs or adding features, etc.

    So there is a sort of ‘internal’ attribution occurring between the
    ‘workers’ (I use that word with caution considering much of Free
    Software development would more likely be considered ‘play’ or at
    least “self-scratching” by those who contribute) that is not
    necessarily exposed to the end users or public in general except for
    the most famous…”

  2. paulbhartzog

    This is a great breakdown, Sam.

    This is also addressed in a forthcoming book from MIT Press, tentatively titled “Evidence-based Social Design: Mining the Social Sciences to Build Online Communities” by Robert E. Kraut and Paul Resnick, some chapters of which are available online. The chapter on rewards, for example, suggests that “Non-transparent eligibility criteria and unpredictable schedules will lead to less ‘gaming of the system’ than predictable rewards,” where “gaming” is defined as “doing useless or destructive actions just to get the rewards.”

    Also, “Rapid Decision Making for Complex Issues,” prepared by Andrea Saveri and Howard Rheingold for the Institute for the Future (August 2005, SR-935), notes that “Gaming the system creates distrust among members of a standards-setting group and acts as a disincentive to sharing. Both disincentives and a lack of trust prevent sharing information necessary to develop an effective, equitable standard.”

    As long as there are people, systems will be “gamed.” You cannot design or implement a “foolproof” system. Consequently, the best kind of system spreads oversight around among all the members, so the impact of the fools is lessened. The problem is that all too often the impact of the wise is also lessened. A good design doesn’t prevent gaming, but is resilient to being gamed, with the full knowledge and awareness that it WILL be gamed.

  3. Sam Rose

    Thanks Paul. Will check out that book. I think the system also becomes resilient and adaptable to the reality that it will be gamed by having as many participants as possible literate about how it is gamed (which is the point of my post).
