In the emerging and growing field of software developer diversity and inclusion research, practical progress hinges on researchers asking and answering the right questions. To make progress on this task, we surveyed a diverse group of 903 software engineers at Google about their experiences surrounding inclusion and software development. We found that accessibility was frequently mentioned as an important area to address, that negative experiences sometimes impact certain groups more often, and that open and closed source developers report different patterns of help-seeking behavior.

Introduction

As evidenced by this book, a variety of researchers have studied inclusion in software engineering communities, typically through the lens of a specific group of historically marginalized developers such as women [14] or through a specific task such as code review [13]. Such research can provide compelling insights into the inordinate challenges imposed on these developers.

What such lenses lack is a bigger picture of inclusion in software engineering, beyond the challenges faced by a specific community or during a specific task. This lack of a big picture has posed a significant threat to our research group’s ability to fulfill our team’s mission at Google: to advance understanding of the diversity and inclusion challenges facing software developers and to evaluate interventions that move the needle on creating an inclusive developer culture for all. Without a big picture of inclusion challenges, neither our team nor the wider community that researches software developer inclusion can be certain that our current focal areas are the most important ones to focus on. For instance, perhaps our effort to study inequities in code review is misguided, because we could have a bigger impact by studying a topic we haven’t thought of yet that is pervasive in practice.

To widen the lens and build a bigger picture of inclusion in software engineering, in 2021, we ran a survey with 903 responses from Google engineers (18% response rate). We constructed the survey by combining newly created questions with questions adapted from the Stack Overflow and GitHub developer surveys to allow for comparisons. Survey topics included experiences with key collaborative development tasks; preferred channels for obtaining support off-team (i.e., from a team at Google that a developer is not a member of); help-seeking, help provision, and stuck behavior; and experiences while using asynchronous communication tools. Full survey details are in the section “Appendix.”

To select participants, we used a stratified sampling approach to ensure the representation of developers across diverse races/ethnicities, gender identities, and ages, as detailed in the section “Appendix.” We additionally invited developer members of LGBTQ+ (29 responses), transgender (8 responses), and disability-focused (4 responses) employee resource groups to respond through their email lists. Because there were relatively few respondents from the transgender and disability groups, we did not provide or analyze breakdowns for those groups, but we did include their responses within our aggregate quantitative and qualitative analysis. In figures throughout this chapter, response breakdowns from members of the LGBTQ+ employee resource group are referred to by the group’s name, “Pride at Google.”

The other demographic identifiers referenced in this chapter are reflective of the options that were available within employees’ internal human resources profiles at the time of data collection (which also include the option to not self-identify). In charts we present, we distinguish between US- and non-US-based employees when referring to race/ethnicity categories because analyzing race/ethnicity data is only permitted for US employees according to company policies based on international regulations. Response counts per group are detailed in the section “Appendix” alongside the sampling criteria. Note that most categories we provide breakdowns for are not mutually exclusive, that is, survey participants may appear in multiple demographic categories, such as those for age, race/ethnicity, and gender. The only mutually exclusive categories are Female/Male and US/Non-US. Our project was reviewed by Google’s employee privacy working group, helping ensure that we were using data congruent with employees’ expectations and following privacy best practices.

We summarized overall distributions of responses across questions with descriptive statistics and analyzed data for any significant differences in responses across groups when sample sizes were sufficient to provide statistical power to make such comparisons. The first author analyzed open-ended data to synthesize qualitative themes. We next summarize the results and discuss what those results mean.

Results

Overall Inclusion Sentiment

Overall, we found that engineering tools and processes are meeting most groups’ needs. Thirty-one percent of respondents reported that engineering processes and tools met all of their needs, 60% reported most of their needs were met, 8% reported some of their needs were met, and 2% said few of their needs were met. Figure 13-1 breaks down participant responses by demographic category.

Figure 13-1. Survey response breakdown for the software engineering inclusion question (two stacked bar graphs of response percentages — all, most, some, few, or none of your needs — by respondent group and by age group; each bar sums to 100%)

On the other hand, looking at Figure 13-1, more Black developers (18%) reported that only some or few of their needs were met when compared with White, Asian, and Hispanic or Latino developers (8%). Additionally, more developers aged 40 years and older (13%) reported that only some or few of their needs were met than those under 40 years of age (7%). Also, developer members of Pride at Google were the least likely to report all of their needs were met when compared with everyone else (21% vs. 31%).

Most Important Developer Inclusion Areas to Address

To add more depth to the previous quantitative data, we next asked respondents the following open-ended question, “For this question, we’re defining inclusion as the degree to which employees feel part of essential organizational processes, including influence over the decision-making process, involvement in critical work groups, and access to information and resources. In your opinion, what inclusion issues in engineering tools and processes do you view as most important to address, and why?” We coded the 177 responses into several categories, as shown in Figure 13-2. We next give examples of comments from the top four categories and the “better documentation, tutorials, mentorship” category.

Figure 13-2. Number of open-ended responses per emergent theme from the inclusion issues question (bars descend from “accessibility improvements,” with 43 responses, to “reward collaboration as much as individual achievement,” with 2)

The most common response was a desire for improved accessibility of internal tools. More than half of those comments specifically discussed enhanced supports for visual accessibility, for example:

[C]olor contrast. Not all text is legible. My eyesight is good but not perfect anymore.

Participants also desired improved product design processes. Specifically, the second most common theme was a desire for a more user-centered design process for internal engineering tools, similar to the structured design research approaches applied to consumer products. Respondents wanted the design of these tools to take into account the varied experiences and feedback of their users rather than basing decisions on the intuitions of the team building the product or the loudest voices off-team:

The typical metric that [internal] product teams seem to use for prioritizing features is “How many people are asking for this?” It often feels incredibly difficult to motivate their product decisions based on “How acutely is this affecting a smaller group of users?”

Respondents raised concerns that organizational decision-making was at times too hierarchical and opaque, failing to incorporate a diverse set of perspectives or to clarify the criteria used to reach decisions to the broader impacted groups. This category included concerns around when conversations are not facilitated to solicit input from the range of voices present:

Engineering decisions have become much more top-down than in the past …and the needs and concerns of the engineers on the receiving end are not really considered.

Respondents also had a desire for increased transparency and open information-sharing. Respondents shared examples of information being difficult to discover and inconveniences related to gaining access to documents in cases where the subject matter was not confidential yet default sharing permissions weren’t granted between different parts of the organization:

Inability to access documents in others’ verticals might affect ability to provide feedback.

Finally, respondents described how high-quality documentation, user guides, and mentorship can ensure all developers are equally empowered to be successful:

A lot of documentation on engineering tools assumes background or past experiences that may not be applicable to folks from diverse backgrounds … Certain documentation puts a lot of burden on the reader to dig around for background info.

Experiences During Collaborative Development Work

We next asked respondents to share their experiences with key collaborative development activities. The breakdown of their responses is shown in Figure 13-3. Most respondents reported frequently or always having positive experiences during these activities. About 80% said they always received respectful feedback during design and code reviews, while over half said they always experienced fair task distribution, appropriate credit, and receptiveness to their ideas in the context of their work.

In terms of collaboration, older developers reported statistically significantly worse outcomes than younger developers. Twenty-three percent of developers aged 60 years or older reported having experienced unfair distribution of development tasks, compared with only 6% of developers under 30. Nineteen percent of developers aged 50–59 and 35% of developers aged 60 or older reported instances of not receiving appropriate credit for their engineering contributions, compared with 6% of developers under 30. Apart from these age differences, no other statistically significant differences between demographic groups emerged.

Figure 13-3. Survey response breakdown for the collaborative development experiences question (stacked bars of never/rarely/sometimes/frequently/always; respectful feedback during code reviews has the largest “always” share, at 81%)

Respondents provided examples of negative experiences during these collaborative development activities, as a follow-up to the prior question. In the following are some of the examples of the 110 follow-up responses we received:

There is a lot of behind-the-scenes work that gets done (like task tracking, note-taking, organizational planning, mentoring, etc.) that I as a female was either asked to do or just did that helped the team run smoothly but was never acknowledged.

Even if I contributed heavily to the design through reviews, if my name isn’t listed as an author, I probably wouldn’t take credit. Might be nice to have guidelines about second [and] third authors and publicize more.

I was offended when my coworker responded that my idea was “not important” and that I was “missing the company’s priorities.” Even if that is 100% true, I would have appreciated him saying, “That’s an interesting idea, but we don’t have time for that right now. Maybe we can revisit this later.”

Work is not particularly proactively distributed or tracked on my team. Nobody has particular visibility into how much work anyone is doing …and at [performance review] it’s clear our manager has limited visibility too.

Level of Comfort Reaching Out Off-Team

Most participants reported feeling comfortable using various channels when reaching out for off-team support, but comfort varied by channel. More respondents reported being “comfortable” or “very comfortable” reaching out to an individual they were directly referred to or to listed code, document, or bug owners (88% and 84%, respectively) than using an internal question forum called YAQS or a mailing list (77% and 70%, respectively).

When asked to elaborate on reasons for discomfort, the most frequently cited reasons were fear of asking what others would perceive as an obvious or unnecessary question, doubts that the answers would be timely enough or sufficient to solve their problem, reticence to engage due to the tone in which previous questions were responded to within the forum, and concerns about spamming too wide of a group or disrupting others in their work. For example, one developer explained in an open response:

I have major imposter syndrome and I’m afraid of asking dumb questions. Questions on YAQS and mailing lists will be visible forever, and it’s open to a huge audience.

As suggested by Figure 13-1, how different demographic groups responded to questions was generally similar. However, the level of comfort seeking off-team technical support was one of the two areas of the survey where differences did exist, as evidenced by adjusted Wald confidence intervals.

Figure 13-4 shows these differences across groups. Using non-overlapping confidence intervals of non-favorable responses as the criterion, we observe the following differences:

Figure 13-4. Comfort reaching out off-team, broken down by demographics (two stacked bar graphs of not at all comfortable/somewhat comfortable/comfortable/very comfortable, for mailing lists, YAQS, listed code/doc/bug owners, and direct referrals)

  • Thirty-four percent of female developers reported being “not at all” or “somewhat” comfortable reaching out via YAQS compared with 20% of male developers.

  • Forty-four percent of female developers reported being “not at all” or “somewhat” comfortable reaching out via a mailing list compared with 24% of male developers.

  • Thirty-four percent of Black developers reported being “not at all” or “somewhat” comfortable using YAQS compared to 16% of Asian developers.

  • More Black (23%), White (18%), and female (18%) developers, as well as developer members of Pride at Google (30%), reported being “not at all” or “somewhat” comfortable reaching out to listed code/doc/bug owners when compared with Asian developers (8%).
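The non-overlap criterion described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the study’s analysis code, and the counts in the example are hypothetical, chosen only to show the mechanics of an adjusted Wald (Agresti–Coull) interval comparison:

```python
import math

def adjusted_wald_ci(successes, n, z=1.96):
    """Agresti-Coull (adjusted Wald) interval for a proportion.

    Adds z^2/2 pseudo-successes and z^2/2 pseudo-failures before
    computing a standard Wald interval, which improves coverage
    for small samples and extreme proportions.
    """
    n_adj = n + z * z
    p_adj = (successes + z * z / 2) / n_adj
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

def intervals_overlap(ci_a, ci_b):
    """True if two (low, high) intervals share any common point."""
    return ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]

# Hypothetical counts of non-favorable responses per group
# (not the study's raw data): 60 of 178 vs. 136 of 681.
group_a = adjusted_wald_ci(60, 178)
group_b = adjusted_wald_ci(136, 681)
print(group_a, group_b, intervals_overlap(group_a, group_b))
```

Under this criterion, a between-group difference is reported only when the two groups’ intervals do not overlap, which is a conservative screen: overlapping intervals do not prove equality, but non-overlap implies a significant difference.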

Comparing Google to Stack Overflow and GitHub

Because some of our questions were modified versions of inclusion questions asked by Stack Overflow and GitHub, we can compare those results. When asked about a recent experience finding help, directly reaching out to another person was the most common way developers at Google reported providing and seeking assistance: 63% of Googlers reported asking a specific person for help, while only 14% of GitHub users did. Google developers were also less likely to report receiving unsolicited help or asking for help in an external forum: no Googlers reported doing so, whereas 74% of GitHub users did.

Methods for getting un-stuck also differed between Google and Stack Overflow. When asked about what developers do when they get stuck

  • Eighty-eight percent of Google respondents “ask a teammate for help” at least sometimes, while only 50% of Stack Overflow respondents “call a coworker or friend” with the same frequency.

  • Seventy-one percent of Google respondents “investigate the issue using external documentation (Stack Overflow and others)” at least sometimes, while 91% of Stack Overflow respondents “visit Stack Overflow” with the same frequency.

  • Twenty-three percent of Google respondents “watch help/tutorial videos” at least sometimes, while 53% of Stack Overflow users do.

Many of the differences observed between Google and GitHub responses can likely be attributed to the inherent differences in developer workflows in a private company vs. an open source project. For example, within a private company, developers primarily collaborate with colleagues they know, whereas within an open source context, developers are more likely to collaborate with strangers whom they don’t have a professional relationship with.

Discussion

Returning to our original question, are we as researchers studying the right thing? There are many ways to answer this question, but turning to Figure 13-2, we can compare what the research community focuses on vs. what developers report as the most important to address.

For instance, many in the research community – ourselves included – have focused on inclusion in the code review process (e.g., [7, 9, 10, 11, 13]), but it was not mentioned by many respondents as the most important issue to address. On the other hand, vision and other accessibility issues were frequently mentioned, providing some support for research in the topic (e.g., [1, 2, 3, 8, 12]). As another example, a more structured design research process for internal tools was the next most mentioned inclusion issue, but we know of little research in that area. To the extent that there’s a disconnect between practitioners’ perceptions of inclusion issues and what the research community investigates, we speculate that part of the problem may be that the community tends to investigate highly structured and instrumented processes (like code reviews), which are easier to study than unstructured or un-instrumented processes (like design reviews).

Beyond asking respondents directly what they believe to be the most important inclusion issues to address, we also asked them about their experiences during common collaborative development activities and evaluated whether experiences varied across demographic groups. We found some groups were less comfortable reaching out for help using mailing lists and question forums. Prior research has also found evidence that experiences in public technical forums vary by gender (e.g., [4, 5, 6]). Scaling the exchange of information and expertise often requires diverting questions to common support channels. Further research aimed at enhancing the psychological safety of these online knowledge-sharing spaces could potentially reduce barriers to obtaining technical support that may be disproportionately experienced by underrepresented groups within those spaces.

We believe strongly in amplifying marginalized voices, as we have tried to do with this survey. But at the same time, we also believe that a research roadmap need not be constructed strictly using a histogram of frequently mentioned issues. Solving real inclusion issues is clearly critical, yet not every issue is salient and easily articulated when filling out a survey, so researchers should also apply other prioritization mechanisms. For instance, a researcher in a historically marginalized group may choose to study that group, bringing valuable personal motivation and experiences to the table.

Additionally, although we found few significant differences in terms of the frequency of positive experiences across demographic groups, we did find that a small proportion of engineers across all groups had negative experiences while collaborating on development work. While frequency of negative occurrences may be similar, the impact of those negative experiences may not be equal across all groups. For those who are less represented in the field of engineering, these experiences, like failing to get appropriate credit for their contributions or receiving disrespectful feedback, may have greater negative impact. Thus, research aimed at reducing the incidence of these unfavorable experiences in general, through more fair and transparent processes for distributing and tracking team work, for example, may still serve to enhance inclusion for marginalized groups.

We believe that Google is a reflection of the larger tech industry and that our survey is replicable in other organizations. At the same time, our results have very limited generalizability because our sample of survey respondents was stratified such that the respondent pool is purposefully not representative of the population of developers at Google. Given the differences we observed comparing our results to GitHub and Stack Overflow’s, results may be different in other development contexts as well. Nonetheless, the results presented in this chapter provide a unique examination of developer inclusion issues across a diverse sample of engineers.

Conclusion

As the field advances, researchers can use our findings to help guide decisions on what inclusion problems to study in software development. When asked to volunteer priorities to address in this space, developers frequently mentioned issues related to accessibility and product design processes, but since frequency isn’t the same as importance, we also encourage researchers to study other inclusion challenges, challenges that might not yet be salient to practitioners, but that negatively impact software developer diversity and inclusion nonetheless.

Acknowledgments

Thanks to Carolyn Egelman, Ciera Jaspan, Claire Taylor, Collin Green, Dalain Williams, Jill Dicker, Jingjing Chen, Liz Kammer, Madison Stamos, Sarah D’Angelo, Sarah Inman, Google’s Core Developer, and anonymous reviewers for their feedback and support.

Appendix

In this appendix, we list our survey design, sampling criteria, and survey questions.

Survey Design

Our survey was divided into three parts: the frontmatter, high-level inclusion experience questions, and questions about specific processes. In what follows, we briefly describe the design of each section; full questions appear later in this appendix. We asked six questions designed to assess developer sentiment toward engineering inclusion. The questions covered whether the company’s engineering tooling and processes meet engineers’ own needs, as well as to what extent the respondent believes the company is committed to product inclusion and accessibility when developing Google’s internal engineering tools. We designed these questions to assess software developer inclusion at a high level, with a focus on developer tools because our team’s place in the company enables it to provide research findings directly to teams that build and maintain such tools.

The remainder of the questions focused on specific development processes and tasks. The first two questions were about the nature of feedback during code and design reviews, whether engineering tasks are distributed fairly, peers’ receptiveness to new ideas, and receiving appropriate credit for engineering contributions. We chose these questions based on prior work in computer and social sciences, which, for instance, suggest that women are more likely to face more pushback during code review [9, 13].

The next three questions asked about respondents’ comfort reaching out off-team via various channels and the frequency with which they employed various methods for getting unstuck. These questions are based on similarly worded questions from Stack Overflow’s developer survey. The next four questions asked about a respondent’s most recent specific help-giving and help-receiving experience, digging into how help was obtained, the relationship between the help giver and help receiver, and the nature of the problem. These questions are based on similarly worded questions from GitHub. The final three questions asked about discouraging behaviors that respondents may have encountered, such as lack of responses to questions, dismissive responses, and unexplained delays. These questions were also adapted from GitHub’s survey.

Sampling

To ensure inclusion of diverse groups and perspectives in our sample, we invited

  • 300 developers who self-identify as Black or African American

  • 400 developers who self-identify as Hispanic or Latino

  • 500 developers who self-identify as Asian

  • 500 developers who self-identify as White

  • 10 developers who self-identify as American Indian or Alaskan Native

  • 10 developers who self-identify as Native Hawaiian or Other Pacific Islander

  • 500 developers who self-identify as male

  • 500 developers who self-identify as female

  • 500 developers who self-identify as less than 29 years of age

  • 500 developers who self-identify as between 30 and 39 years of age

  • 500 developers who self-identify as between 40 and 49 years of age

  • 400 developers who self-identify as between 50 and 59 years of age

  • 70 developers who self-identify as 60+ years of age

While we would have ideally sampled 500 software engineers from each group, for several groups listed here, we invited fewer engineers, so as not to overburden these communities with research requests.
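The invitation scheme above can be sketched as follows. This is a hypothetical reconstruction for illustration only: the roster fields (`id`, `race`, `gender`, `age`) and the helper name `stratified_invites` are invented here, and the study drew on internal HR data rather than a plain Python list:

```python
import random

def stratified_invites(roster, strata, seed=0):
    """Sample up to a per-stratum cap of developers, without replacement.

    `strata` maps a stratum name to a (predicate, cap) pair. One developer
    may match several strata (e.g., Black and female), so the final invite
    list is deduplicated by developer id.
    """
    rng = random.Random(seed)
    invited = {}
    for name, (predicate, cap) in strata.items():
        pool = [d for d in roster if predicate(d)]
        for d in rng.sample(pool, min(cap, len(pool))):
            invited[d["id"]] = d
    return list(invited.values())

# A few of the chapter's strata, with their invitation caps.
strata = {
    "Black or African American": (lambda d: d["race"] == "Black", 300),
    "female": (lambda d: d["gender"] == "female", 500),
    "60+": (lambda d: d["age"] is not None and d["age"] >= 60, 70),
}
```

Because strata overlap, the deduplicated invite list is smaller than the sum of the caps, and, as noted later in the appendix, response counts for a group can exceed that group’s own cap when respondents enter via other strata.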

We received 903 total responses, but 41 of these were in response to survey invitations sent by email to three employee resource groups and do not have demographic data connected to them. We received 862 responses from the stratified sample that we invited by individual email invites. Demographic data is available for these 862 responses, but we only report breakdowns in this chapter for the groups that had sufficient sample sizes. We received responses from

  • 171 developers based outside the United States (race/ethnicity data not available)

  • 197 developers who self-identify as Asian

  • 64 developers who self-identify as Black or African American

  • 105 developers who self-identify as Hispanic or Latino

  • 306 developers who self-identify as White

  • 2 developers who self-identify as Native Hawaiian or Other Pacific Islander

  • 1 developer who self-identifies as American Indian or Alaska Native

  • 12 developers who self-identify as more than one of the preceding races/ethnicities

  • 28 developers within the United States who chose not to self-identify a race/ethnicity

Breaking down responses by gender, we received responses from:

  • 178 developers who self-identify as female

  • 681 developers who self-identify as male

  • 3 developers who chose not to self-identify a gender

Across age groups, we received responses from

  • 252 developers under the age of 30

  • 283 developers aged 30–39 years

  • 175 developers aged 40–49 years

  • 127 developers aged 50–59 years

  • 25 developers aged 60 years or more

Responses from some groups (e.g., Male) exceed the original strata sample sizes because respondents invited under other strata also belong to those groups.

Survey Frontmatter

We designed the frontmatter to the survey to explain the context of the survey, how the results would be used and shared, and how to reach out to the researchers. At the end of the frontmatter, we asked two questions about where respondents’ code resides: in the company’s monolithic repository (e.g., where Google Maps and Docs reside), in its external git repository (e.g., where Android and Chromium reside), on private cloud git repositories (source.cloud.google.com), or on public git repositories on GitHub. These questions were not generalizable outside of Google, so we do not provide any further analysis of them here. More than 80% of respondents said that Google’s monolithic codebase is where the majority of their code is hosted.

Refinement and Analysis

To refine the survey’s validity and clarity, we piloted the survey by observing 11 developers filling it out. Pilot participants were recruited using a small stratified sample similar to the one used for the full survey. During the pilot, the first author observed the participants answer the survey questions over video conferencing during a 45-minute session. The first author occasionally probed with additional questions about how pilot participants selected their responses or how they interpreted the question and answer options.

Survey Questions

[Q1] Thinking about your background, experiences, and demographic characteristics, to what extent do engineering tooling and processes at Google meet your needs?

  • All of your needs

  • Most of your needs

  • Some of your needs

  • Few of your needs

  • None of your needs

  • Unsure

[Q2] Google/Alphabet is committed to product inclusion within internal engineering tools (i.e., intentionally incorporating underrepresented perspectives at key points in the product design process).

  • Strongly disagree

  • Disagree

  • Neither agree nor disagree

  • Agree

  • Strongly agree

  • Unsure

[Q3] Optional: Please explain your answer to the previous question and elaborate on any ways Google/Alphabet is or is not demonstrating commitment to product inclusion within internal engineering tools.

[Q4] Google/Alphabet incorporates accessibility (i.e., the needs of people with disabilities) into the design and development process of internal engineering tools.

  • Strongly disagree

  • Disagree

  • Neither agree nor disagree

  • Agree

  • Strongly agree

  • Unsure

[Q5] Please complete the following statement. The engineering tools I use regularly within my work at Google/Alphabet have

  • All of the accessibility supports I require

  • Most of the accessibility supports I require

  • Some of the accessibility supports I require

  • Few of the accessibility supports I require

  • None of the accessibility supports I require

[Q6] Optional: For this question, we’re defining inclusion as the degree to which employees feel part of essential processes, including influence over the decision-making process, involvement in critical work groups, and access to information and resources. In your opinion, which inclusion issue in engineering tools and processes at Google do you see as most important to address, and why?

[Q7] During the past three months, how often did each of the following things happen? (Options: Never, Rarely, Sometimes, Frequently, Always, Not applicable)

  • When I received feedback during design reviews, it was respectful.

  • When I received feedback during code reviews, it was respectful.

  • Engineering tasks were distributed fairly on my team.

  • Other Googlers were receptive when I proposed a new idea (feature, fixit, etc.).

  • I received appropriate credit for my engineering contributions.

[Q8] Optional: Please elaborate on any experiences in which you did not receive respectful feedback during design/code reviews, engineering tasks weren’t distributed fairly, other Googlers weren’t receptive when you proposed an idea related to your engineering work, or you did not receive appropriate credit for your engineering contributions.

[Q9] During the past three months, how comfortable did you feel reaching out for off-team help through the following channels? (Options: Not at all comfortable, Somewhat comfortable, Comfortable, Very comfortable, Not applicable)

  • YAQS (an internal Q&A system)

  • Mailing list

  • Listed code/doc/bug owners

  • Directly contacting another individual I was referred to

[Q10] Optional: Please describe any reasons you did not feel comfortable when using YAQS, mailing lists, listed owners, or referrals to reach out for assistance off-team.

[Q11] During the past three months, how frequently did you do each of the following when you got stuck during an engineering task? (Options: Never, Rarely, Sometimes, Frequently, Always)

  • Investigate the issue using internal documentation (YAQS, MoMA, intranet search, etc.).

  • Investigate the issue using external documentation (Stack Overflow and others).

  • Do other work and come back later.

  • Watch help/tutorial videos.

  • Ask a teammate for help.

  • Ask a colleague off-team for help.

  • Do something non-work related and come back later (e.g., take a walk).

  • Other:

[Q12] Thinking of the most recent case where someone helped you at work, how did you find someone to help you? (Choose one.)

  • I asked for help in an internal forum (e.g., in a YAQS thread, project mailing list, etc.) and someone responded.

  • I asked for help in an external forum (e.g., Stack Overflow) and someone responded.

  • I asked a specific person for help.

  • Someone offered me unsolicited help.

  • A standard or prescribed process led us to interact (e.g., code review).

  • Other:

[Q13] Which best describes your relationship with the person who helped you?

  • We knew each other well.

  • We knew each other a little.

  • I knew of them through their contributions to projects, but I didn’t know them personally.

  • Total strangers; I didn’t know of them previously.

[Q14] What kind of problem did they help you with?

  • Writing code or otherwise implementing ideas

  • Debugging code

  • Installing or using a tool, application, or piece of infrastructure

  • Understanding community norms (e.g., how to submit a change, how to communicate effectively)

  • Introductions to other people

  • Other:

[Q15] Thinking of the most recent case where you helped someone at work, how did you come to help this person?

  • They asked for help in an internal forum (e.g., in a YAQS thread, project mailing list, etc.) and I responded.

  • They asked me directly for help.

  • I reached out to them to offer unsolicited help.

  • A standard or prescribed process led us to interact (e.g., code review).

  • Other:

[Q16] Which best describes your relationship with the person you helped?

  • We knew each other well.

  • We knew each other a little.

  • I knew of them through their contributions to projects, but I didn’t know them personally.

  • Total strangers; I didn’t know of them previously.

[Q17] What kind of problem did you help them with?

  • Writing code or otherwise implementing ideas

  • Debugging code

  • Installing or using a tool, application, or piece of infrastructure

  • Understanding community norms (e.g., how to submit a change, how to communicate effectively)

  • Introductions to other people

  • Other:

[Q18] Optional: Please share any additional context about your answers to the previous questions regarding recent instances of helping others and receiving help or your general experience as an engineer of helping and receiving help.

[Q19] Thinking about the code review tools you’ve used in the past three months at Google/Alphabet, how often have you experienced the following? (Options: Never, Rarely, Sometimes, Frequently, Always)

  • Unexplained delay in getting a CL reviewed

  • Lack of response to questions

  • Withholding of LGTM on CLs without explanation

  • Dismissive responses to CLs

  • Dismissive responses to questions

  • Conflict or interpersonal tension with another engineer

  • Language or other content that made you feel uncomfortable (e.g., profanity, inappropriate jokes, etc.)

[Q20] Optional: Please elaborate on any of the preceding experiences that you indicated happened (“Rarely” through “Always”) while using code review tools.

[Q21] Thinking about the asynchronous communication tools you’ve used in the past three months at Google/Alphabet (e.g., chat, email, YAQS), how often have you experienced the following? (Options: Never, Rarely, Sometimes, Frequently, Always)

  • Unexplained delay in getting a response

  • Lack of response to questions

  • Dismissive responses to questions

  • Conflict or interpersonal tension with another engineer

  • Language or other content that made you feel uncomfortable (e.g., profanity, inappropriate jokes, etc.)

[Q22] Optional: Please elaborate on any of the preceding experiences that you indicated happened (“Rarely” through “Always”) while using asynchronous communication tools.