Codes of conduct (CoCs) have become a hot topic in open source software as contributors and projects increasingly discuss their messaging, presence, and importance. This chapter aims to provide a holistic overview of CoCs and research on them. We first provide the history, context, and controversies surrounding CoCs and demonstrate why CoCs are an important document and tool for OSS projects. We then showcase findings from the literature on CoCs and finally identify open research questions and call for their exploration.

Introduction of CoCs in OSS

Free/libre open source software (FLOSS/OSS) is an important form of digital infrastructure and a technical career pathway for many developers [4, 20]. Despite OSS’s origin as an alternative technology movement subverting privatization and commercial forces, its roots in libertarianism, masculine technologies, and techno-determinism have fostered an apathetic attitude toward social issues [7]. This philosophy has contributed to a notoriously antagonistic environment for women and minorities [14, 16, 23], with CoCs as a particular sore spot.

In 2014, Coraline Ada Ehmke created the Contributor Covenant (hereafter referred to simply as CC) after reports of sexualized language and assaults at events and a general lack of governance [15]. The CC was the first of many CoCs designed to govern projects and discourage what advocates refer to as “toxic” behavior. To OSS traditionalists, it was a threat to free speech; to increasingly diverse contributors, it was a needed provision of protections and accountability. The introduction of the CC thus began the first major, and ongoing, socio-cultural war of OSS.

Cisgender women and other underrepresented groups often bear the brunt of advocacy labor and thankless community-oriented tasks (e.g., documentation and organizing community events) [25]. Women’s technical contributions are often undervalued or nitpicked in comparison to men’s [17], and women often experience bias and harassment at in-person events. Unsurprisingly, women tend to leave projects earlier than men [13]. While advocacy for CoCs and inclusive efforts has increased over the past few years, there is still pushback among traditionalists who believe the inclusion of any guardrails represents an attack on cisgender, white, heterosexual men and their free speech; this reaction has even included contributors leaving projects over the inclusion of a CoC [2]. Having discussed the history of CoCs, we now present their content before turning to current research on their role and function in OSS projects.

Structures of Conduct

CoCs are community-wide governance documents that establish the core values, expected behaviors, and commitment to inclusivity within OSS projects. GitHub, the popular Git hosting platform, now offers CoC templates when a project is created. The most popular templates include the CC and the Python Code of Conduct [24]. While CoCs vary in prose and content, they all convey a sense of community values and expectations. We observe different CoC approaches by comparing six of the most popular: the Contributor Covenant, Mozilla’s Community Participation Guidelines, Google’s Code of Conduct, the Python Code of Conduct, the Django Code of Conduct, and the Rust Code of Conduct.
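As an aside on mechanics, GitHub recognizes a code of conduct file (conventionally named CODE_OF_CONDUCT.md) placed in a repository’s root, docs/, or .github/ directory. The following minimal Python sketch, assuming a local clone of a project, shows the kind of presence check a contributor or tool might perform; the function name and candidate locations are our own illustration, not part of any official API:

```python
from pathlib import Path

# Directories conventionally scanned for community health files on GitHub.
CANDIDATE_DIRS = ("", "docs", ".github")

def find_code_of_conduct(repo_root):
    """Return the path of a repository's code of conduct file, or None."""
    root = Path(repo_root)
    for subdir in CANDIDATE_DIRS:
        directory = root / subdir if subdir else root
        if not directory.is_dir():
            continue
        for entry in directory.iterdir():
            # Match CODE_OF_CONDUCT with any extension and in any case.
            if entry.is_file() and entry.stem.lower() == "code_of_conduct":
                return entry
    return None
```

A check like this is only a presence test; as the interviews later in this chapter suggest, the file’s content and its enforcement matter far more than its mere existence.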

The most significant CoC is the Contributor Covenant, which serves as a cornerstone of OSS governance documents. It contains a pledge, standards, enforcement responsibilities and scope, enforcement guidelines, and attribution. The CC pledges against discrimination of any form and is geared toward an “open, welcoming, diverse, inclusive, and healthy community.” It defines unacceptable behavior as any conduct that could “reasonably be considered inappropriate in a professional setting,” for example, doxxing and harassment. Enforcing these standards is the right and responsibility of community leaders. There are four levels of enforcement depending on the frequency and severity of infringements: a correction with a written warning, a warning with reduced interaction with the community, a temporary ban, and finally a permanent ban.

In the early stages of CoC adoption in OSS, larger, more successful, and more established projects (e.g., Mozilla and Django) were the first adopters. Mozilla’s Community Participation Guidelines predate the CC and helped shape it. Mozilla’s CoC is more verbose, offering additional clarity, details, and examples. It features a section titled “Be Inclusive,” which encourages people to “seek diverse perspectives” and to be open to new perspectives and ideas, fostering innovation. It also lists ways to be considerate toward others, for example, respecting and using someone’s self-identified pronouns. Google’s Code of Conduct borrows heavily from the CC, albeit in a terser and more direct style. The web framework Django is unique in separating its CoC from its enforcement manual, the former focusing on community ideals and the latter on enforcement. Django states that while its enforcement manual is internal to its Code of Conduct Committee, it is published in the interest of transparency, a key value of DEI efforts. Finally, the CoC of the Rust programming language has only two sections: Conduct and Moderation. This structure reflects the CoC’s ultimate purpose: to provide a community-wide baseline for expected behaviors and a warning of the consequences should inappropriate behavior occur.

Related Work on CoCs

To clarify how CoCs are used, we consider existing research and feature results from previous and ongoing work, concluding by highlighting open research questions. Previous research has found that the content of CoCs can be value-based, rule-based, or a mixture of both. Rule-based CoCs list concrete examples of unacceptable behaviors, while value-based codes lack explicit rules and instead define community values and ideals generally [24]. All CoCs studied by Tourani et al. championed diversity and a welcoming community and encouraged respectful and constructive collaboration [22, 24]. Finally, for a CoC to be effective, disciplinary actions must be clearly spelled out, and any enforcement should be visible to the community. Public enforcement serves two purposes: potential offenders know there are consequences to their actions, and marginalized community members feel safer knowing protections exist with the backing of the broader community [1].

CoC Community Conversations

To demonstrate how communities discuss and use CoCs, we look to a recent publication on the typology of these conversations, where we as researchers viewed thousands of GitHub issues and conducted a content analysis on over 400 OSS community conversations [15].

Adoption and Creation

Creating and adopting a CoC is an important step for communities as they commit to inclusivity and move to a centralized governance model. A proposal to include a CoC can incite reactions of disapproval, approval, or ambivalence among community members.

As CoCs grow in popularity, disapproval from the broader project community is becoming increasingly uncommon. Those protesting the inclusion of a CoC often consider it antithetical to OSS philosophy, claiming it limits free speech and free code (assuming that all code is neutral). Others believe CoCs are an ineffective moderation tool and are skeptical of their ability to curb negative behavior. Admittedly, communities cannot control the emergence of negative behavior, yet project leaders and advocates can exert authority through their reactions and by using CoCs for enforcement. CoCs provide rules that can penalize infringers with serious consequences (e.g., a temporary or permanent ban).

Most sampled projects had either neutral or positive stances toward a proposed CoC. Positive reactions were relayed through encouraging comments (see Figure 17-1), “+1”s, or emoji reactions, yet most CoC proposals received minimal interaction from the community, for example, zero comments or likes. In such cases, CoCs were often eventually merged anyway. This lack of community engagement could lay the groundwork for later drama, infringements, or miscommunication.

Figure 17-1. Support for the wording in the code of conduct (a screenshot of a CoC comment thread: four lines of wording and comments from two users with reaction icons)

Moderation and Enforcement

Moderation and enforcement vary between projects and depend on factors such as project leadership structure, the presence of a community manager, and the ratio of contributors to maintainers. We review our typology of CoC enforcement [15] through the lens of an online content moderation framework [12] based on environments comparable to OSS, for example, Reddit and Wikipedia.

The CoC was used proactively to enforce community guidelines and reactively to moderate unwelcome behaviors. Individual projects decide their own moderation schema, derived from several reflective values (e.g., leadership team transparency and driving moderation philosophies). Commonly moderated behaviors included complaints from disgruntled users and offenses arising from language barriers. Upon being “called out” by a moderator, offenders responded with both defiance and apology (shown in Figure 17-2).

Figure 17-2. Post-moderation, the infringer apologized and corrected their behavior (a screenshot of defiance and apology comments from two users, with edited text and dates)

There are many moderation styles in OSS, for example, human vs. automated moderation. Humans understand nuance and complex situations, while “automated systems offer the kind of moderation required by the massive scale of today’s online community” [12]. There are many OSS bots for detecting non-inclusive language (e.g., in-solidarity-bot and probot) in GitHub projects and forums where contributors connect synchronously. A trade-off exists between the efficiency and quality of moderation: maintainers and moderators can react quickly to infringements with less careful responses, or they can spend more time crafting careful responses, leaving open the possibility of harm spreading in the meantime.
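As a minimal sketch of how such wordlist-based bots work, the following Python fragment flags discouraged terms and suggests alternatives. The term map here is a hypothetical placeholder of our own, not the configuration of in-solidarity-bot or any real tool:

```python
import re

# Hypothetical flagged-term -> suggested-alternative map (illustrative only;
# real bots ship curated, community-maintained lists).
SUGGESTIONS = {
    "whitelist": "allowlist",
    "blacklist": "denylist",
    "sanity check": "confidence check",
}

# One case-insensitive pattern matching any flagged term on word boundaries.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(term) for term in SUGGESTIONS) + r")\b",
    re.IGNORECASE,
)

def flag_non_inclusive(text):
    """Return (matched term, suggested alternative) pairs found in text."""
    return [
        (match.group(0), SUGGESTIONS[match.group(0).lower()])
        for match in PATTERN.finditer(text)
    ]
```

A real bot would typically post such suggestions as a non-blocking review comment, preserving the human judgment that fully automated moderation lacks.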

OSS projects can be perceived as more transparent than other working settings, but does that apply to moderation? Our analysis suggests that CoC conversations often occur in private spaces without broader community involvement [15]. There was little public conversation around CoC additions, but deliberation lasted longer when moderation was perceived as unfair, likely because there was no shared understanding of the CoC and its usage. This connects with existing work on moderation techniques in online communities, for example, work examining the statistical association between types of moderation behaviors and future user activity [8, 11]. We aim to expand the research on CoCs by considering them a form of content moderation, discussed in the next section.

Contributor Experiences with CoCs

CoCs are generally considered helpful in attracting newcomers [18, 21], yet there is little to no empirical evidence to support this claim. In fact, there is evidence that a CoC’s presence has no bearing on a newcomer’s decision to join a project compared with other factors considered [9, 19].

Adding nuance to the evidence that CoCs are inconsequential in joining and contribution processes, we present results from two ongoing, unpublished studies: the first of trans contributors and the second of OSS maintainers. As part of a larger study, we aimed to understand the experiences of trans, non-binary, and genderqueer (hereafter referred to simply as “trans”) people in OSS and the impact of gender identity on OSS career trajectories. We interviewed 21 trans contributors to gather details and insights on their involvement in OSS, including their initial contributing experiences, their career trajectories, and positive and negative experiences.

When discussing their early joining experiences in OSS, many of our participants performed reconnaissance work before joining a project. They would screen projects for potential toxicity through a variety of methods, including the presence of a CoC. A participant described a general project culture assessment based on the CoC’s presence.

If they’ve got a code of conduct, they’re probably trans-friendly. If they don’t, they’re probably bigots <laugh>. That’s a very rough rule of thumb and there’s more nuance, but at a first glance, looking to see if they’ve got a code of conduct is a very good indicator of how trans-friendly the community will be.

While the presence of the CoC was a positive indicator, our participants also stressed its enforcement. Enforcement of a project’s CoC signals to vulnerable contributors whether they are truly welcomed and supported by the broader community [10]. Our participants hoped that communities would enforce community guidelines, rather than simply allowing and ignoring instances of bigotry.

If somebody says something transphobic, that’s a warning sign to me, and I need to pay attention to how it’s dealt with. Is the community, the project leadership, …are they laughing with that person or are they scolding and educating that person and correcting the situation? How does that community leadership respond? That response tells me whether or not it’s safe for me to be involved in that community.

We also interviewed 21 OSS maintainers to understand how they regard newcomers in their projects. Maintainers were aware of the CoC’s message and its role in community management and statements. A participant discussed the CoC’s connection to community statements, for example, working to be “actively anti-racist” in their messaging and encoding an actively welcoming mindset among community members.

I’m involved in conversations about code of conduct and, more recently, Black Lives Matter, making sure we actively use anti-racist language in our community and its statements … The community gets the message reaffirmed that this is an actively inclusive space … We expect you to make people feel welcome, and we use that explicit language because we really think that it needs the least amount of room for ambiguity.

Another participant spoke passionately about proactively creating an inclusive and welcoming environment for any contributors. They cited the CoC as a document for enforcement against “trolls” and negative behavior. As a maintainer, they expressed a sincere desire to ensure a welcoming environment for everyone and hoped that the CoC would be a meaningful and supporting document for contributors.

Having a visible code of conduct, we’re making a commitment to anybody who might want to be involved that you’re welcome and we’re not going to let anyone else be horrible to you and continue to participate … I meant it sincerely when I added it. And I hope it means something to someone who’s reading it.

Dismissal of CoCs also emerged in our interviews, including a participant who discussed the potentially performative nature of the CoC. They likened it to greenwashing, the practice of conveying a false impression that a company’s products are environmentally sound. To this participant, the rules of a CoC were redundant because negative interactions are “of course” discouraged; thus, a CoC only makes projects appear inclusive.

I’m going to be very honest, but for me, this looks like greenwashing, to say, “Oh, you see we’re very kind and nice people because that’s our standards and we don’t accept trolling, insulting, and derogatory comments.” I think that’s basic. You don’t need to say it.

Admittedly, the signal of inclusivity and community provided by a CoC can quickly devolve into a façade if projects do not enforce it. The strength of a CoC derives from the community’s backing of its values and project leaders’ enforcement of them. Withholding accountability or allowing negative behavior undermines the CoC’s authority.

Further Recommendations

CoCs are a place to express and define shared community values in OSS projects, an important governance document for moderation, and an indicator of project inclusivity and safety, especially for vulnerable contributors. In this section, we build on these understandings in the form of recommendations for practice in open projects.

For projects creating and adopting a CoC, community consensus is recommended to avoid later confusion and miscommunication. Communities should collaborate in CoC creation and define their values, expected behaviors, and the consequences of infringements. Creating a custom CoC is a nontrivial task requiring the input of many members, and therein lies its strength: the collaboration it demands fosters shared social learning and values. Projects can also adopt a CoC template and tweak its contents; however, the broader community is encouraged to be involved. If CoC conversations are held privately, as is common in large projects with hierarchical structures [3], leaders should make an effort to replicate that discussion publicly. A clear community consensus on unacceptable behaviors and their enforcement promotes mutual social learning [6] and can reduce later tension and divisions in the event of a serious infringement. Open collaboration platforms such as GitHub can help by considering ways in which their system designs can better facilitate community discussions at a broader level.

Project leaders should clearly designate a contact method for enforcement within the CoC, either a moderator’s contact info (e.g., the maintainer’s email) or a dedicated address for handling reports (e.g., moderation-team@project.com). As for moderation, our advice is to be firm and fair. Moderators must respond consistently to infringements, no matter who the infringer is. To help build shared social learning, we encourage moderators not to delete negative interactions and their subsequent moderation. Moderators should consider adding these (anonymized) examples to their CoCs to maintain a public record of past infringements, giving newcomers an opportunity to accurately assess the community [5]. We understand, however, that the choice not to delete comments is difficult, especially when members are subject to continuous trolling, harassment, or spamming. In such cases, comments should be deleted, especially if they target a specific individual or group.

We have seen CoCs viewed as a signal of project inclusivity and their enforcement as an indicator of the project leadership’s commitment to that message. If a project is not committed to enforcement, the CoC’s presence may lull contributors into a false sense of safety, which is especially harmful for vulnerable contributors. Projects should not include a CoC unless they are prepared to enforce it against even their “best” contributor. Future research should also include more diverse perspectives (i.e., beyond those of white, cisgender men) when considering participant experiences in OSS.

Open Research Questions

Many questions surrounding CoCs remain unanswered. How does the lack of a contact method impact the effectiveness of a CoC? Many negative interactions are also deleted; it is unclear how widespread the deletion of transgressions is among OSS projects, nor what its long-term implications are. Further research can explore the impact of hidden or removed negative interactions on a community’s sense of self.

As projects grow and opinions evolve, project leaders can revisit the concerns of the community and reflect these in the CoC. Yet there are still many questions on how best this can be achieved. Should all members participate in CoC discussions or just a representative sample – what would a representative sample look like for a project? How can project leaders best facilitate these discussions in an open and respectful way? When is deliberation on CoC content considered sufficient? What are indicators of a consensus on the contents of the CoC? These are questions to be explored in future research.

There are many moderation styles and philosophies driving enforcement of the CoC. Should projects nurture and educate infringers or be swift in their punishment? Kindness and respect toward others can encourage graceful acceptance of constructive feedback and accountability for one’s actions, but may leave the community open to trolling or attacks. Conversely, harsh punishments can effectively discourage repeat infringements but run the risk of fostering a “toxic” and exclusive environment. Studies are needed to understand the trade-off between contribution quality and level of activity, and between the quality of moderation and its efficiency. How can other analyses of moderation values, philosophies, and actions be applied to the decentralized projects of OSS?

We hope that future research can assist communities in addressing these questions.