Building successful online communities: Evidence-based social design

From AcaWiki

Citation: Kraut, R. E., & Resnick, P. (2012). Building successful online communities: Evidence-based social design. Cambridge, MA: MIT Press.


Tagged: Computer Science, NatematiasGenerals, online communities, newcomers, commitment, social psychology, regulating behavior, online harassment, wikipedia, reddit, ebay, open source, social computing, sociotechnical systems


In Building Successful Online Communities (2012), Robert Kraut, Paul Resnick, and their collaborators set out to draw links between the design of socio-technical systems and findings from social psychology and economics. Along the way, they set out a vision for the role of the social sciences in the design of systems like mailing lists, discussion forums, wikis, and social networks, offering a way that behavior on those platforms might inform our understanding of human behavior.

The book, which is organized around practitioner needs, focuses on:

  • attracting newcomers
  • encouraging commitment
  • encouraging contributions
  • regulating behavior
  • starting a new community (this chapter is not included in this review)

The authors make a fundamental assumption common in nudge theory[1][2], that "despite limited direct control of individual people's actions, online communities can be designed and managed to achieve the goals that their owners, managers or members desire."

What counts as a successful community? Kraut and Resnick's definition of success, a guiding force throughout the book, attempts to find common ground between the interests of platforms and the interests of individual communities. A successful community attracts people and maintains its size over time. Furthermore, "to be successful, online communities need the people who participate in them to contribute the resources on which the group's existence is built." Successful communities, as they define them, will also regulate behavior effectively, limiting the damage of behavior the community considers inappropriate.


This book is presented through a series of hundreds of design claims, many of which are substantiated through social psychology lab experiments, quantitative analyses of online behavior, and, in a few cases, behavioral economics. The authors offer a forceful argument "that social science findings can and should inform more directly the choices that online community designers make," an argument whose force is carried by hundreds of examples from social psychology.

The book focuses on positivist methods, usually experimental methods or quantitative, descriptive analyses. For example, when discussing the principle of homophily, the authors cite experimental research showing that people tend to like faces that look like their own. When discussing commitment and defection from a group, they cite an observational study of mailing lists that correlated defections with the number of outbound links to other communities.

Encouraging Contribution to Online Communities

This chapter is the heart of the book, exploring a range of methods for encouraging contribution to online communities in relation to theories of motivation in social psychology. Kraut and Resnick ground this section with examples from open source software communities, Wikipedia, and discussion on forums and mailing lists. Their discussion of the social psychology of "collective effort" is rooted in a 1993 review paper by Karau and Williams on social loafing, which examines how to encourage individuals to make substantive contributions in groups rather than depend on the efforts of others.[3] The overall emphasis of the chapter is on motivating individuals to make the maximum meaningful contribution they can, motivated through clever software design, task structures, and reward architectures:

  • How should one make requests? These design claims focus on how to show requests, make them searchable, how to define and structure them, who to address them to, who they should come from, what emotional tone they should have, and what kind of feedback to offer.
  • How can designers support intrinsic motivations? Drawing from Reiss's model of "16 basic desires,"[4] the authors discuss the idea of pairing "social interaction" with opportunities to contribute, offering opportunities to challenge one's skills, giving contributors feedback, and creating "a game-like atmosphere," in terms of their potential positive and negative effects on maximizing contributions from community members.
  • How can designers use rewards? The authors are especially concerned to prevent users from "gaming the system" when rewards are offered. They focus on defining the basis of the reward (effort, performance, etc), the nature of the reward itself, and the transparency of the reward criteria.
  • Combining intrinsic and extrinsic motivations: The authors are concerned that "tangible incentives seem to undermine intrinsic motivation." They suggest ways that this might happen and speculate on ways to overcome this problem.
  • Motivating Contribution Through Group Expectations. In this section, the authors discuss social motivations, including the belief that contributions will make a difference to a group, the role of the group size, the uniqueness of a contributor in a group, and the influence of indicators that others are also involved.

The chapter concludes with a set of suggestions for the design of contests, using this as an example that ties together a large number of the above factors in a practical case.

Encouraging Commitment to Online Communities

The motivation of this chapter is clear: since successful communities maintain their existence over time, they need people to stay and to commit to contribute to the community. The authors see commitment as a "building block" for enforcing norms and maintaining the community. In the social psychology literature, they pay special attention to Kurt Lewin's work on "field theory," and the idea that loyalty to a group is related to characteristics of the environment ("field") "that attracted them to a group and kept them loyal."[5] Building on this work, the authors identify three kinds of commitment to communities:

  • affective commitment, "based on feelings of closeness and attachment to a group or members of the group"
  • normative commitment, "based on feelings of rightness or obligations to the group"
  • need-based or continuance commitment, "based on an incentive structure in the group and alternatives available to members from outside that increase the net costs of leaving the group."

Within their discussion of these kinds of commitment, the authors set up a dichotomy between identity-based commitment and bonds-based commitment. In "identity-based" commitment, people participate in a group because "members feel a commitment to the online community's purpose or topic." In contrast, "bonds-based commitment" to a group "implies that members feel socially or emotionally attached to particular members of the online community." The authors then develop these ideas to support design claims that weigh the relative merits of supporting identity-based or bonds-based commitment for surviving turnover and encouraging compliance with norms. They then associate different kinds of design choices with identity-based commitment, with designs ranging from giving a group a name and setting group goals, to creating forms of group patriotism by highlighting outsiders "to intensify group commitment." Bonds-based commitment is supported by recruiting newcomers who have existing ties to community members, showing photos of users, supporting conversation, or supporting pseudonyms. They also discuss the relationship between group size and commitment, as well as the relationship between group diversity and commitment (diversity reduces commitment, according to them).

The discussion of normative commitment centers on the stated goals of the community, practices through which the community affirms and negotiates its goals, and the structures of obligation that invite participation towards those goals. Finally, the discussion of needs-based commitment considers the way that people participate in communities that benefit them in some way. Here, the discussion is straightforward: communities will attract and retain participants if they make clear how participants benefit and make it hard for them to defect to the group's competitors.

Dealing with Newcomers

This chapter, which includes Robert Kraut, Moira Burke, and John Riedl as authors, focuses on the process of recruiting newcomers, screening them, retaining them, socializing them to norms, and "protecting the group from newcomers." Newcomers deserve special attention because they can invigorate and sustain communities but are also inexperienced at the community's tasks and unfamiliar with its norms, introducing diversity that "may be off-putting to more experienced members of the community." Throughout the chapter, the authors take care to attend to two sides of the issue: the experiences of newcomers and the experiences of current members, as members move along a path from "reader-to-leader."[6] The chapter also focuses on issues under the control of designers rather than things outside a given platform that influence recruitment and the experience of newcomers.

The section on recruiting newcomers draws from literature on persuasion, breaking it down into:

  • "interpersonal recruiting," particularly Katz[7][8] and Coleman,[9] who argued that interpersonal appeals were more powerful than mass-media advertising, and Latane,[10] who described a model of the degree of social influence based on the number of sources of influence someone is exposed to. The authors also describe a thread of research on statistical models of diffusion starting with the work of Frank Bass,[11][12] which predicts diffusion based on:
  • "the number of people who might potentially adopt"
  • "the number of people who have adopted to that point"
  • "the parameter (α) representing the constant proportion of potential adopters who convert due to advertising"
  • "the parameter (β), representing the constant proportion of potential adopters who convert because of word of mouth influence from people who have already adopted."

The authors cite research finding that β, the impact of word of mouth, "is substantially higher than α, the impact of advertising." They then discuss the idea of recruiting along existing social networks and the idea of focusing recruitment on the most influential members of a community.
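The Bass model described above is easy to simulate in discrete time. The sketch below is my own illustration, not code from the book; the function name, parameter values, and period count are assumptions chosen for clarity:

```python
def bass_adopters(m, alpha, beta, periods):
    """Simulate cumulative adoption under a discrete-time Bass diffusion model.

    m       -- total number of potential adopters
    alpha   -- constant proportion of remaining adopters who convert due to advertising
    beta    -- constant proportion who convert due to word of mouth from prior adopters
    periods -- number of time steps to simulate
    """
    cumulative = 0.0
    history = []
    for _ in range(periods):
        # New adopters this period: advertising acts on everyone who has not
        # yet adopted, while word of mouth scales with the fraction who have.
        new = (alpha + beta * cumulative / m) * (m - cumulative)
        cumulative += new
        history.append(cumulative)
    return history

adoption = bass_adopters(1000, 0.01, 0.4, 30)
```

With β (here 0.4) much larger than α (here 0.01), as the cited research suggests is typical, adoption starts slowly and then accelerates as word of mouth compounds, producing the familiar S-shaped diffusion curve.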

The chapter's treatment of "impersonal advertising" offers a discussion of research on conventional advertising, appeals by celebrities, emotional appeals, social proof[13], and brand recognition.

When discussing the challenge of "ensuring a good fit between the newcomers and the community," the chapter focuses mostly on case studies. It focuses on two approaches: self-selection and screening. For self-selection, the authors focus on the design acts of publishing accurate information about the community and offering barriers to entry. In the discussion on screening, the authors discuss "diagnostic tasks" like CAPTCHA, credential checks, and referrals.

How can communities keep newcomers? The chapter discusses entry barriers and initiation rituals, welcoming practices, and efforts to discourage hostility towards newcomers, mostly in reference to research on online communities by Burke[14] and by Panciera, Halfaker, and Terveen.[15]

The authors frame the practice of "teaching the newcomers the ropes" in relation to Van Maanen and Schein's "dimensions of organizational socialization tactics."[16] The authors argue that while the evidence supports "institutionalized socialization tactics" to develop "commitment and appropriate behavior," these practices are not common.

Finally, the authors consider ways to protect a community from the unsocialized activity of newcomers, grounding this section with Lave and Wenger's theory of legitimate peripheral participation[17], "by which newcomers become more experienced members through small but productive actions in the community." Sandbox features on websites offer newcomers an opportunity to take those actions, argue the authors.

Regulating Behavior in Online Communities

In this chapter, the authors (Sara Kiesler, Kraut, Resnick, and Aniket Kittur) discuss the challenge of "regulating behavior" as illustrated in Julian Dibbell's A Rape in Cyberspace[18]. The chapter focuses on the regulation of behavior norms that receive "rough consensus" as a way to "help the community achieve its mission," differentiating between the mission of a health support community, open source communities, and Wikipedia[19][20], which might all have differing norms. In addition to defining non-normative behavior, the authors also distinguish between community insiders and outsiders like "trolls" and "manipulators" who "have no vested interest in the community functioning well."

The authors argue that problems of non-normative behavior are a version of the Tragedy of the Commons, where attention is a limited resource and "low quality contributions create a social dilemma wherein these contributions drown out the worthy contributions and exhaust the available attention." Drawing from the work of Ostrom on governance of commons[21][22], they put forward design claims focused on:

  • limiting the effects of bad behavior through moderation systems, making "inappropriate posts" less prominent, keeping moderation policies consistent, including community members in moderation, offering reversion tools, and introducing filters.
  • limiting bad behavior itself, through activity quotas (to prevent spam), suspensions and bans, consistent criteria for suspensions with due process, payment requirements that raise the cost of activity, and CAPTCHAs
  • encouraging voluntary compliance, by ensuring that people learn the norms, showing examples of appropriate or inappropriate behavior (but not too much) [23][24], displaying statistics on normative behavior, and reminding people about norms at the point of action. They also discuss the possibility of these interventions backfiring.

At this point, the chapter shifts more towards sociology, considering possible interactions between different kinds of communities and norm compliance. The authors consider whether norms might be stronger in more cohesive communities and suggest that communities should be given influence over rule-making.

The authors also suggest that ascribing blame or community sanctions may be less effective than offering community members a way to "save face" "without having to admit that they deliberately violated the community's norms." They describe a system called stopit designed at MIT to address computer-based harassment. When users reported harassment, the system sent a message to the alleged harasser claiming that the alleged harasser's account may have been compromised and urging them to change their password. Here is the rationale given by Gregory Jackson, the Director of Academic Computing at MIT in 1994:

recipients virtually never repeat the offending behavior. This is important: even though recipients concede no guilt, and receive no punishment, they stop. [this system has] drastically reduced the number of confrontational debates between us and perpetrators, while at the same time reducing the recurrence of misbehavior. When we accuse perpetrators directly, they often assert that their misbehavior was within their rights (which may well be true). They then repeat the misbehavior to make their point and challenge our authority. When we let them save face by pretending (if only to themselves) that they did not do what they did, they tend to become more responsible citizens with their pride intact [25]

Finally, the authors describe the use of rewards and sanctions. They argue that since anonymity prevents sanctions and de-links individuals from the likelihood of interacting with each other in the future or developing a reputation, "verified identities and pictures will reduce the incidence of norm violations." They discuss mechanisms that reward productive contributions to a community and make sanctions more consequential, such as reputation systems, "attention bonds" that "make undesirable actions more costly," costs associated with pseudonym switching, "bonds that may be forfeited if the newcomers misbehave," and membership referral systems that affect the reputation of a sponsoring member.

How can violations be detected and sanctions carried out? The authors discuss the idea of graduated sanctions (n-strikes policies) and peer reporting / automatic flagging mechanisms. They are especially concerned with the problem that initiating sanctions has a cost for the community members who initiate them, an issue Elster explores as the "free-rider problem:"

it may be better for all members if all punish non-members than if none do, but for each member it may be even better to remain passive. Punishment almost invariably is costly to the punisher, while the benefits from punishment are diffusely distributed over the members.[26]
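The graduated-sanctions ("n-strikes") idea mentioned above can be sketched as a small escalation ladder. This is a hypothetical illustration of the mechanism, not a system from the book; the specific thresholds and penalty names are my assumptions:

```python
# Illustrative n-strikes escalation: each recorded violation moves a member
# one step up a ladder of increasingly severe, predefined sanctions.
SANCTION_LADDER = ["warning", "24h suspension", "7d suspension", "permanent ban"]

strikes = {}  # member id -> number of recorded violations

def record_violation(member_id):
    """Record a norm violation and return the sanction for this offense."""
    strikes[member_id] = strikes.get(member_id, 0) + 1
    # Stay at the top of the ladder once strikes exceed its length.
    step = min(strikes[member_id], len(SANCTION_LADDER)) - 1
    return SANCTION_LADDER[step]
```

A predefined ladder like this makes sanctioning criteria consistent and transparent, echoing the authors' concern that suspensions follow consistent criteria with due process.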

The problem of encouraging meaningful peer interventions is a hard one, and the economics research cited on this question fills in only part of the picture. This discussion concludes the chapter on regulating behavior.

Starting A New Community

This chapter is not included in this review. If you can summarize this chapter, please consider joining AcaWiki to add it!

Theoretical and practical relevance:

Although some academics disagree with the approach taken in this work (its focus on social psychology, its concern with design principles, its positivist approach), this book offers a remarkable bridge between questions in social psychology and the concerns of social platform design. It also offers an excellent introduction to the practice of identifying narrow questions and hypotheses that can be tested in particular socio-technical contexts. The design claims in the book also offer a rich vein of questions for further testing.

I personally use the second chapter of Kraut & Resnick to introduce students to the practice of thinking through questions that they could investigate with data from socio-technical systems, in connection with theory and other research findings. Furthermore, the book's focus on retaining users and supporting productivity is aligned with the interests of platform owners in ways that reveal, for me, areas where my own interests and focus on civic values diverge. Natematias (talk) 17:18, 28 February 2015 (UTC)

References of Note

  1. John, Peter, Graham Smith, and Gerry Stoker. 2009. "Nudge Nudge, Think Think: Two Strategies for Changing Civic Behaviour." The Political Quarterly 80 (3): 361–70.
  2. Sunstein, Cass R. 2014. The Ethics of Nudging. SSRN Scholarly Paper ID 2526341. Rochester, NY: Social Science Research Network.
  3. Karau, S., & Williams, K. (1993). Social Loafing: A meta-analytic review and theoretical integration. Journal of Personality and Social Psychology, 65(4), 681-706.
  4. Reiss, S. (2004). Multifaceted nature of intrinsic motivation: The theory of 16 basic desires. Review of General Psychology, 8, 179-193
  5. Lewin, K. (1951). Field theory in social science: Selected theoretical papers (D. Cartwright, Ed.). New York: Harper & Row.
  6. Preece, J., & Shneiderman, B. (2009). The Reader-to-Leader Framework: Motivating technology-mediated social participation. AIS Transactions on Human-Computer Interaction, 1(1), 13-32.
  7. Katz, E., & Lazarsfeld, P. (1955). Personal influence: The part played by people in the flow of mass communications. New York: The Free Press.
  8. Katz, E. (1957). The two-step flow of communication: An up-to-date report on an hypothesis. Public Opinion Quarterly, 21(1), 61-78.
  9. Coleman, J., Katz, E., & Menzel, H. (1957). The diffusion of an innovation among physicians. Sociometry, 20(4), 253-270.
  10. Latane, B. (1981). The psychology of social impact. American Psychologist, 36, 343-356.
  11. Bass, F. M. (1969). A new product growth model for consumer durables. Management Science, 15(5), 215-227.
  12. Wikipedia. Bass diffusion model
  13. Cialdini, R. B., & Goldstein, N. J. (2004). Social influence: Compliance and Conformity. Annual Review of Psychology, 55(1), 591-621.
  14. Burke, M., Marlow, C., & Lento, T. (2009). Feed me: Motivating newcomer contribution in social network sites. In CHI 2009: Conference on Human Factors in Computing Systems (pp. 945-954). New York: ACM Press.
  15. Panciera, K., Halfaker, A., & Terveen, L. (2009). Wikipedians are born, not made: A study of power editors on Wikipedia. In Proceedings of the ACM 2009 International Conference on Supporting Group Work (pp. 51-60). New York: ACM Press.
  16. Van Maanen, J., & Schein, E. H. (1979). Toward a theory of organizational socialization. Research in Organizational Behavior, 1, 209–264.
  17. Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.
  18. Dibbell, J. (1993, Dec 23). A rape in cyberspace: How an evil clown, a Haitian trickster spirit, two wizards, and a cast of dozens turned a database into a society. The Village Voice.
  19. Halfaker, A., Kittur, A., Kraut, R., & Riedl, J. (2009). A jury of your peers: Quality, experience and ownership in Wikipedia. In WikiSym 2009: Proceedings of the 5th International Symposium on Wikis and Open Collaboration. New York: ACM Press.
  20. Viegas, F., Wattenberg, M., & Dave, K. (2004). Studying cooperation and conflict between authors with history flow visualizations. CHI 2004: ACM Conference on Human-Factors in Computing Systems. NY: ACM Press.
  21. Ostrom, E. (1990). Governing the commons: The evolution of institutions for collective action. Cambridge: Cambridge University Press.
  22. Ostrom, E. (2000). Collective action and the evolution of social norms. The Journal of Economic Perspectives, 14(3), 137-158.
  23. Cialdini, R. (2003). Crafting normative messages to protect the environment. Current Directions in Psychological Science, 12(4), 105.
  24. Cialdini, R., Kallgren, C., & Reno, R. (1991). A focus theory of normative conduct: A theoretical refinement and reevaluation of the role of norms in human behavior. Advances in Experimental Social Psychology, 24, 201-234.
  25. Jackson, G. A. (1994). Promoting network civility at MIT: Crime & punishment, or the golden rule? Retrieved October 4, 2010.
  26. Elster, J. (1989). The cement of society: A study of social order (p. 41). Cambridge: Cambridge University Press.