Abstract:
Cooperation among unrelated individuals can arise if decisions to help others can be based on reputation. While this mechanism works in dyadic interactions, using reputation in social dilemmas involving many individuals (e.g. public goods games) becomes increasingly difficult as groups grow larger and errors more frequent. Reputation is therefore believed to have played a minor role in the evolution of cooperation in collective action dilemmas such as those faced by early humans. Here, we show in computer simulations that a reputation system based on punitive actions can overcome these problems and, compared to a reputation system based on generous actions, (i) is more likely to lead to the evolution of cooperation in sizable groups, (ii) more effectively sustains cooperation within larger groups, and (iii) is more robust to errors in reputation assessment. Punishment and punishment reputation could therefore have played crucial roles in the evolution of cooperation within large groups of humans.
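To make the mechanism concrete, here is a minimal sketch of one round of a public goods game in which cooperators punish defectors and thereby accumulate a punishment reputation that deters future defection. All parameter values, the `play_round` function, and the crude "defect unless a known punisher is present" rule are illustrative assumptions for exposition, not the paper's actual model.

```python
import random

# Illustrative parameters (assumptions, not the paper's values)
MULTIPLIER = 3.0      # public good multiplication factor
PUNISH_COST = 1.0     # cost to the punisher per punitive act
PUNISH_FINE = 3.0     # fine imposed on each punished defector

def play_round(strategies, punisher_reputation):
    """One public goods round: contribute, share the pot, then punish.

    strategies: list of bools, True = contribute 1 unit.
    punisher_reputation: dict agent -> count of past punitive acts
    (the agent's 'punishment reputation'). As a crude stand-in for
    reputation-based deterrence, a defector cooperates whenever some
    other group member is a known punisher.
    """
    n = len(strategies)
    effective = [s or any(punisher_reputation[j] > 0
                          for j in range(n) if j != i)
                 for i, s in enumerate(strategies)]
    pot = sum(effective) * MULTIPLIER
    payoffs = [pot / n - (1.0 if c else 0.0) for c in effective]
    # Cooperators punish remaining defectors, building reputation
    for i, c in enumerate(effective):
        if not c:
            for j in range(n):
                if effective[j]:
                    payoffs[i] -= PUNISH_FINE
                    payoffs[j] -= PUNISH_COST
                    punisher_reputation[j] += 1
    return effective, payoffs
```

In a group with one defector, the first round leaves the defector heavily fined and marks the cooperators as punishers; in the next round the defector cooperates because punishers are present, so everyone earns the full cooperative payoff.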