Through an interview study, we investigate users’ perceptions of how gender is represented in software, including how non-binary people are represented in different software systems. We aim to understand how users perceive the value of their gender being collected by the software they use, where capturing it is unnecessary, and how to make our systems more gender-inclusive. Through thematic coding, the study surfaces various themes related to gender representation in software. It is important that software engineers take the issues of gender representation seriously. This may mean not registering a user’s gender in the first place. Where it is relevant, it should be done in a way that ensures that all gender identities feel included in the digital world.

Introduction

There are multiple aspects of inclusivity that should be considered in software design, such as the user’s gender, age, ethnicity, and special abilities. This study focuses on the gender aspect, namely, how gender is represented in software – how software systems ask about a user’s gender and how that gender information is used by the software. The use of gender in software varies depending on the system, where it is not relevant in some systems and crucial in others. It is therefore in our interest to not only understand how gender is represented in software but also where gender is a relevant factor in a system [1].

Current literature looks at gender in software from various angles, such as conceptual modeling for gender-inclusive requirements [2], how a non-binary person was discriminated against by different websites when trying to register their gender [3], how the quality of requirements is affected by the stakeholder’s gender [4], the difference between men and women regarding privacy concerns in e-health applications [5], and how men and women felt about different gendered aesthetics on a website [6]. There is a gap in the literature with regard to how users feel their gender is represented in software, as well as what software engineers can do to make the systems they create more gender-inclusive.

In this study, the definition of gender is the gender a person identifies as, that is, a person’s gender identity. People may experience a gender identity on a mental and emotional level that cannot be captured by documenting a physical expression. The contribution is to understand how gender representation in software systems is perceived by users, to help practitioners understand how to make software more inclusive already at the stage of requirements elicitation.

Related Work

Spiel [3] writes about their experience as a non-binary person and how, over the course of 1.5 years, they faced discrimination in various digital interfaces, detailing their thoughts and feelings on how they were represented across various systems. They propose a number of solutions for including non-binary people in software (e.g., more fields in the databases) and call for computer science education to take these issues seriously and to ensure that systems are created more inclusively.

Nunes et al. [2] created a model for conceptualizing gender-inclusive software and helping developers understand how their software can be made more gender-inclusive, adapted to each organization’s and system’s goals. Their aim was to guide requirements engineers toward creating more gender-inclusive requirements, using a framework that helps practitioners recognize gender bias in their systems.

Ehrnberger et al. [7] recreated a drill and a hand blender to discuss and challenge gender norms surrounding everyday items. Clarkson et al. [8] discuss the various aspects of inclusive or universal design and how practitioners can work with them. Within the area of user experience (UX) design, many voices are calling for more inclusive software and designs [9, 10, 11], giving guidelines and suggestions for how software can become more gender-inclusive.

Rowan and Dehlinger [5] look into the difference between men and women with regard to privacy concerns for e-health mobile applications; women reported higher concerns than men. The authors suggest that health applications should provide different privacy-related services, as the concerns and behavior of users differ depending on their gender.

Metaxa-Kakavouli et al. [6] created two different web interfaces that displayed the same content for a computer science course, where one page had aesthetics that were perceived as gender-neutral and the other had aesthetics that were perceived as masculine. Results showed that women reacted negatively to the more masculine website.

Criado Perez [12] details how data bias and the gender data gap have affected and continue to affect women in various areas of life. The data bias can be seen in everything from software to city planning and the design of restrooms. The book is not strictly about software and software engineering, and the study at hand therefore complements it by looking at gender representation solely in software.

Further studies cover how gender differences impact building social goal models [13], how algorithm bias can affect the use of user feedback in app stores [14], and how the software engineers’ human aspects can affect requirements engineering [15].

Study Design

Purpose of the study: The aim of this interview study [16] is to explore and understand how gender representation in software is perceived. The study aims at understanding how users perceive the value of their gender being collected by the software they use, how it is currently represented and what effects that has, and what experiences users have with gender being represented in software systems. Informed consent was obtained by first explaining the study and the planned use of the data, after which the participants decided whether they wanted to take part and signed the consent form. The interviewees’ answers were anonymized in the interview transcripts, and the interviewer collected their email addresses only to send each interviewee the finished transcript to read over and approve for use. The research protocol was reviewed at the university level and was not required to go through further review at the national ethics authority.

Research questions: The two research questions are (RQ1) What is the perception of representing gender in software systems? and (RQ2) How are the options of representing non-binary gender in software systems perceived? RQ1 looks at what people (the interviewees) think of how gender is represented in different types of software systems. RQ2 looks at the perceptions of how non-binary people are represented in software systems. It is partially answered with the help of interviewees who do not identify as non-binary but put themselves in the shoes of someone who does. This design choice was made as (1) we were not able to recruit what we would have deemed a sufficient number of non-binary interviewees and (2) the ability of people who identify as female or male to put themselves into the perspective of a non-binary person might indicate the feasibility of a software engineer doing the same for a gender they do not identify with.

Piloting of interview questions: The interview questions (Table 5-1) were piloted with the help of interviewee 1, who completed the interview and then gave feedback. Interviewee 1 is a professor and equality representative at the university, and we therefore felt that their opinions and insights were valuable. As the feedback was positive, the rest of the interviews were conducted as planned, and interviewee 1’s answers could be used.

Ideally, a more extensive piloting of the interview would have been carried out, but because of the short time frame for this study and the already small pool of interviewees, this was not seen as feasible. The time frame was due to the project being a bachelor’s thesis, with limited time to find and carry out the research. The small interview pool resulted from a convenience sample. We reached out to students on a Discord server with around 400 members, as well as other personal acquaintances. As we were asking about non-binary gender, the interviewees either identified as non-binary (a minority) or had relevant expertise or experience that made them comfortable answering questions on the topic, for example, as equality representatives or researchers of gender studies. Some participants struggled to come up with examples of government software in Q4, and the question was therefore clarified to government or tax software instead.

Data collection: The subjects for the interviews were chosen through convenience sampling; they were either personal acquaintances or people who had some kind of interest in the subject of the study and wanted to take part in it. The majority of the participants were European, with a couple of interviewees from the United States and Brazil. The interviews were conducted in a semi-structured manner, either in person or through video calls. Semi-structured interviews were used to give the participants the option of speaking freely and to allow the researcher to ask follow-up questions on interesting points. The participants consented to recording, and their answers were then transcribed and sent back to them for feedback before thematic coding. All data was anonymized while creating the transcripts. The interviews lasted up to 30 minutes, averaging around 20 minutes.

The interview questions were derived from the research questions, as detailed in Table 5-1. Q3 was only asked if the participant did not identify as non-binary, because for non-binary participants, the answers to Q1 and Q2 were deemed sufficient to answer RQ2. We are aware that Q3 poses a threat to validity since we aim to understand the perceptions of non-binary people using software systems. In Q4, three examples of software systems were given: dating software, tax or government software, and medical software. These were chosen because they cover a range of different types of software systems where gender potentially matters and therefore give an overview of the issues with gender representation across different fields. The participants were first asked a set of demographic questions, including how they identified their gender. Table 5-2 details all the participants’ demographic information.

Table 5-1 Interview questions mapping to research questions
Table 5-2 Interviewees’ demographic information

Data analysis: The interview data was analyzed through thematic coding following the guidelines by Saldaña [17]. The coding was carried out through a mixed approach, using both inductive coding (creating the themes and codes as we read through the transcripts) and deductive coding (developing a set of themes and codes before starting the coding). The deductive themes were derived from the interview questions presented in Table 5-1. The coding was carried out by the first author and verified by the second author.

Results

Answering research questions:

RQ1: What is the perception of representing gender in software systems? Summarizing the results from the thematic coding, the state of the practice is that more can be done with regard to representing gender in software systems. Many of the interviewees said that the current gender options are inadequate and that software asks for gender when it is not necessary.

RQ2: How are the options of representing non-binary gender in software systems perceived? The non-binary people who were interviewed for this study said that they often felt that their gender was not accurately represented and that the option of “other” gender was unhelpful and discriminatory. They mentioned that certain types of software, namely medical software, should have other ways of representing their gender instead of the binary male and female. The responses from all subjects were considered when answering this research question. This could bias the results (see p. 16), but the answers were combined because there was a large overlap between the answers from the interviewees who identified as non-binary and those who were imagining that they did. The overlap was seen in the answers to Q3 from the interviewees who did not identify as non-binary and in the answers to Q1, Q2, and Q4 from those who did.

Participants: The subjects of the study were selected through connections and personal acquaintances. Out of the fifteen participants, seven worked in academia and seven were university students across different fields of study. One participant worked as a director of marketing. Four of the participants identified as male, eight identified as female, two identified as non-binary, and one identified as androgynous. This information can also be found in Table 5-2.

Results from thematic coding: The thematic coding was carried out by doing multiple iterations, reading over the transcripts, and assigning themes and codes to pieces of text. An overview of the deductive themes is presented in Figure 5-1, and an overview of the inductive themes is presented in Figure 5-2.

Figure 5-1 Overview of deductive themes and codes (radial diagrams covering non-binary discrimination, government or tax software, dating software, medical software, registering the user’s gender, and more gender-inclusive software)

Deductive themes: Non-binary discrimination often happens in language constructs in software, according to many interviewees. They explained that using gendered language when not necessary adds nothing and further excludes non-binary individuals. One interviewee mentioned that they might be forced to input a binary gender in places that do not offer other options. Interviewees also mentioned that this often happens at the user registration stage and that digitalization has always been very binary. Similarly, interviewees noted that the option of “other” as gender is seen as discriminatory. One interviewee explained that users who choose the “other” option feel unseen and unacknowledged, as if they are being put in an exception box; they are therefore less likely to choose that option in surveys and are forced to pick a gender option that may not represent the gender they identify as. Other interviewees mentioned design choices and uses of colors as a way of cementing binary gender norms.

For dating software, interviewees said that it is generally more inclusive and progressive compared with other software, although one interviewee pointed out that dating software is still very binary. For medical software, some interviewees said that it is generally not “up to date,” while others said that it is better than other types of software. One non-binary interviewee explained that information regarding a person’s gender is only relevant to the doctors and that they would work with a medical team if they were developing this kind of software. The personnummer in Sweden was mentioned in relation to medical software, where one interviewee noted that the medical system 1177 [18] assumes a person’s gender based on their personnummer when they sign in. Several interviewees mentioned that not enough data is collected for all communities of people, which leads to data bias, for example, for different diseases. One interviewee suggested that instead of asking for a person’s gender, medical software should ask about their genetic composition.

For government/tax software, interviewees mentioned that they often only register male or female and no other genders and are therefore not as up to date as other types of software. One interviewee mentioned that registering a person’s gender in governmental software should only be done for statistical purposes and that the gender information is not relevant in this type of software otherwise. Another interviewee mentioned that legislation regarding gender does not always match between countries, which can be an issue for people who live and work in one country but have a citizenship in another country.

Registering the user’s gender as a deductive theme was covered by Q5 in the interviews. Here, almost all interviewees said that it is generally not relevant to register users’ gender information. Some explained that it could be useful for companies to understand their audience. Other interviewees said that registering a user’s gender could be relevant in medical cases, for example, if a patient is unconscious when arriving at a medical institution. Registering a user’s gender should follow the laws and regulations of a country, one interviewee explained. Another interviewee said that registering gender could be good if the software is being customized to the user.

Disclosing the user’s gender should be a choice, interviewees said, or the software should ask for consent before disclosing it. Two interviewees said that gender should be completely removed from job applications, as it only feeds into prejudices. One interviewee said that software should offer personalization irrespective of a user’s gender and focus on other information the user has provided. Some interviewees said that disclosing a user’s gender should only be done in medical cases. Disclosing a user’s gender should not be done unless it is relevant to the service the software provides, according to one interviewee. Another said that it would be better for software to ask for and disclose gender than to try to assume people’s gender based on their names.

For more gender-inclusive software, the primary solutions according to the interviewees were to have more gender options and to make non-binary individuals feel seen. Interviewees suggested having diverse development teams and an inclusive software development process, bringing awareness to the gender perspective by asking people how they would prefer to have their gender represented in software, treating gender as a spectrum, and avoiding pronouns altogether. Another interviewee suggested presenting users with a list of pronouns to choose from and asking follow-up questions to people who indicate that they identify as trans. With regard to data bias, interviewees said that it is important to enrich the data sets to better represent all the different types of people who use the software. One interviewee said that if gender is relevant to collect, it should be made confidential:

Systems can be designed in such a way that this information is not directly accessible to anyone who just happens to be maintaining the systems. It can start with encrypting the data, what is sensitive, and storing the data without having a name attached to it, just ID number and references.

—Interviewee 13
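As a sketch of what Interviewee 13’s suggestion could look like in practice, the following Python example stores an optional gender field encrypted and keyed only by an opaque ID, separate from the user’s name. The storage layout, field names, and the use of the cryptography library’s Fernet are our illustrative assumptions, not part of the study.

```python
# Minimal sketch of pseudonymized, encrypted storage of an optional gender
# field, along the lines suggested by Interviewee 13. Assumes the
# `cryptography` package; the field names and in-memory "tables" are
# illustrative only.
import uuid
from cryptography.fernet import Fernet

# In a real system, the key would come from a key management service,
# not be generated at startup.
fernet = Fernet(Fernet.generate_key())

identities = {}  # user_id -> name and contact details (separate, access-controlled store)
sensitive = {}   # user_id -> encrypted gender; no name attached, just an ID and references

def register(name: str, gender: str | None = None) -> str:
    user_id = str(uuid.uuid4())  # opaque ID that reveals nothing about the person
    identities[user_id] = {"name": name}
    if gender is not None:       # the gender field stays optional
        sensitive[user_id] = fernet.encrypt(gender.encode("utf-8"))
    return user_id

def read_gender(user_id: str) -> str | None:
    token = sensitive.get(user_id)
    return fernet.decrypt(token).decode("utf-8") if token else None

uid = register("Kim", gender="non-binary")
print(read_gender(uid))  # only code holding the key can recover the value
```

In such a layout, someone maintaining the sensitive store would see only IDs and ciphertext, matching the interviewee’s point that the information should not be directly accessible to whoever happens to be maintaining the system.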

Inductive themes: Discrimination due to UI was one theme that appeared in the interviews. Four codes were found: use of colors, graphics, pronouns in messages, and marketing through colors. The interviewees explained that color choices were made based on gender, with darker tones generally seen as targeting men. Other aspects of the user interface, such as graphics and messages, could also be seen as discriminatory, for example, where pop-up messages used pronouns and the default pronouns were he/him.

Some interviewees mentioned discrimination due to anonymization, explaining that people using different forums and communities would have an easier time spreading harmful comments. One interviewee mentioned that if a non-binary person were to disclose their gender in an anonymized community, they would stand out more and be at a greater risk of receiving hurtful comments.

Figure 5-2 Overview of inductive themes and codes (block diagrams covering surveys, less inclusive regional apps, discrimination due to UI, Personnummer in Sweden, discrimination when building software, anonymization discrimination, data bias, targeted ads, gender representation in games, social media, and translation software)

Gender representation in games was a theme that was mentioned multiple times. Interviewees explained that there is a large number of male characters in games, that many characters are gendered in general, that female characters are often stereotyped, and that there are no non-binary characters in games. One interviewee mentioned that gendered characters in games could give people an opportunity to take on a gender different from the one they identify as and thus could lead to a better understanding of different genders.

Some interviewees mentioned discrimination when building software. One interviewee mentioned that the solution to overcoming gender discrimination in software lies in the requirements phase and that it is at that stage that inclusivity needs to be taken into account. This is in line with what another interviewee said: that engineers really can make a change regarding these issues. Other interviewees mentioned that men often make applications for men.

Less inclusive local/regional applications were mentioned a few times. The interviewees explained that it is more common in conservative countries for applications to offer only two gender options, male and female. They further explained that users feel unwelcome when using applications in which their gender is not accurately represented.

Another theme that came up in the interviews was data bias. Interviewees said that many systems came with creator biases, where data was not collected fairly from different user groups. One interviewee specifically mentioned that LGBTQ+ communities were often not identified in the process of collecting data. The same interviewee mentioned recommender systems as places where the data bias becomes clear. Another interviewee mentioned AI software as inherently biased.

Targeted ads were another theme. They were mentioned in relation to assumptions being made based on a person’s chosen gender, as well as people getting ads that did not correspond to them. One interviewee mentioned that when non-binary people are forced to choose a binary gender when signing up, the ads they get based on that gender are wrong. Some interviewees said that ads should not know people’s gender at all. Another interviewee explained that this type of discrimination could be either implicit or explicit and that it could go unnoticed where binary genders are still used in the back end:

For example, when Facebook had this multitude of options that you could provide, but in the back end for advertisers, it’s kind of slotted people into binary genders again.

—Interviewee 15

Surveys/questionnaires/sign-ups refer to the different places where users have to sign up. Interviewees mentioned that these forms often have a limited number of gender options. As a consequence, people may not exist in systems because they are hindered by the barrier of entry, as one of the interviewees mentioned.

Some of the interviewees mentioned social media as a place where technology reconstructs gender. They explained that the binary gender norms are present in the different social media platforms. One interviewee felt that the platforms should not ask for gender at all, while another said that if they do need to ask for gender, they should include more gender options for the users.

One interviewee mentioned translation software as an example of software that produces wrongful information as a result of data bias. They explained that the software changes the grammatical gender of a noun from female to male when translating it. For example, the word for “researcher” as a female-gendered noun in one language would be translated to the male-gendered form in another. The interviewee said that such software should give the user some indication that the meaning of the word has changed.

The theme Personnummer in Sweden refers to the system in Sweden where everyone registered in the Swedish Population Register receives a personal identity number [19]. The number consists of a person’s birth date in the form YYYYMMDD, followed by four digits, the third of which is determined by the person’s legal sex: odd if it is male and even if it is female. As of now, there is no option for a non-binary personnummer, but a motion has been sent to the Swedish government about creating a third legal sex as well as gender-neutral personal identity numbers [20]. As the information structure is insufficient in the first place, representation cannot be accurate either. The interviewees explained that because the system is inherently binary, it leads to assumptions:

And they have in medical areas the assumption that if the personal number says that you are this gender, we have to add it in your journal that you also can read, but it’s not really important actually.

—Interviewee 12
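To make the binary assumption concrete, the following sketch shows the inference a system can draw from a personnummer, here assuming the 12-digit YYYYMMDD-NNNN form. The example numbers are made up; the point is that the data format itself offers no third option.

```python
# Sketch of the binary encoding in Swedish personal identity numbers:
# the third digit of the four-digit suffix encodes legal sex.
# Assumes the 12-digit YYYYMMDD-NNNN form; the example numbers are fictitious.
def legal_sex_from_personnummer(pnr: str) -> str:
    digits = pnr.replace("-", "").replace("+", "")
    sex_digit = int(digits[10])  # third digit of the suffix
    # Odd means legal sex male, even means female; the format cannot express
    # a third value, which is exactly the limitation interviewees raised.
    return "male" if sex_digit % 2 == 1 else "female"

print(legal_sex_from_personnummer("19850407-2399"))  # male (9 is odd)
print(legal_sex_from_personnummer("19850407-2386"))  # female (8 is even)
```

A system such as 1177, which derives gender from the personnummer at sign-in, therefore inherits this binary structure regardless of how the user identifies.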

Discussion

Software development process: Discrimination when building the software is a theme that is very relevant to software engineers. The interviewees explained that men often make applications for men, and one specifically mentioned that discrimination can happen already at the stage of requirements writing:

[…] I think in the domain itself as well, there is so many different aspects of how we capture the requirements of the system that are not aware of the systematic discriminations that these systems can reinforce.

—Interviewee 1

One possible solution to this would be to use the GenderMag method [21] in the software development process. By using the process and the provided set of materials, that is, the developed personas, companies and software engineers could make sure that the software they are developing is gender-inclusive. GenderMag is a good solution because it readily provides developers with personas to consider for their software and can thus help them find where their software displays gender bias. Aside from using GenderMag, developers could consult Hidellaarachchi et al.’s [15] paper to understand how the human aspects of engineers can affect the requirements they write. They concluded that the gender of the developers was of lower importance than, for example, knowledge of the domain, but we think that making sure that both the process and the development teams are inclusive leads to more gender-inclusive software. More on this aspect can be found in Chapter 10, “Beyond Diversity: Computing for Inclusive Software,” and Chapter 11, “Gender Diversity on Software Development Teams: A Qualitative Study.”

Design of systems: The findings from Costanza-Chock [22] can be incorporated into software systems design. Specifically, we can apply the principles of the Design Justice Network that are put forth in the book.

Let’s take the first two principles as examples. Principle 1: We Use Design to Sustain, Heal, and Empower Our Communities, As Well As to Seek Liberation from Exploitative and Oppressive Systems [22, p. 190]. The Design Justice Network encourages designers to move beyond critiquing oppressive systems and to actively empower communities. Software engineers have agency here, visible in the design choices they make, whether fitting or not, and that agency comes with responsibility.

Principle 2: We Center the Voices of Those Who Are Directly Impacted by the Outcomes of the Design Process [22, p. 191]. The idea of “nothing about us without us” partially answers the question of who gets to do design work and is an appeal for more inclusivity in software engineering as a discipline. While female and male interviewees in our study provided insightful answers when prompted to put themselves into the shoes of a non-binary person, the participants of our study are not representative of software practitioners in general. Designing well for non-binary users means having non-binary people on the development team or as client stakeholders.

Further principles detail potential impacts in social movements and technological innovation as well as best practices for community-engaged research.

User interface and design: The theme discrimination due to UI showed that the choice of colors and other design elements can often make users of certain genders feel unwelcome, which is in line with the findings of Metaxa-Kakavouli et al. [6]. More focus needs to be put on recognizing these patterns of design-related discrimination, specifically when creating the software. If users don’t feel welcome when opening an application or a website, they will most likely not continue to use it, which means less profit for the company but also that the user misses out.

Medical software and data bias: With regard to the medical software theme, the most interesting findings are the ones related to data bias. Several interviewees mentioned that data is not collected fairly from different groups of people, which makes medical data biased in relation to gender. It can even be life-threatening, as stated by one interviewee:

If you think about heart attacks, a lot of the symptoms related to females are different to the ones related to males. And quite often women die from heart attacks because their symptoms are not recognized.

—Interviewee 14

Data bias is prevalent in most software today, and much can be done to address it. Our first suggestion, mentioned by multiple interviewees, is to enrich the data sets. Making sure that the collected data comes from a large and diverse sample of people would make systems that detect, for example, medical problems more inclusive and could in the long run save lives.
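As a small illustration of what acting on this suggestion could involve, the sketch below counts the gender distribution in a data set and flags groups that fall below a chosen share. The records, field names, and threshold are hypothetical, chosen only to show the idea of auditing a data set for representation before use.

```python
# Illustrative sketch: flagging under-represented gender groups in a data set
# before it is used, e.g., for training a diagnostic model. The records and
# the 20% threshold are hypothetical.
from collections import Counter

records = [
    {"gender": "male", "symptom": "chest pain"},
    {"gender": "male", "symptom": "arm pain"},
    {"gender": "male", "symptom": "chest pain"},
    {"gender": "female", "symptom": "fatigue"},
    {"gender": "female", "symptom": "nausea"},
    {"gender": "non-binary", "symptom": "chest pain"},
    # ... a real data set would hold thousands of records
]

def underrepresented(records: list[dict], threshold: float = 0.2) -> list[str]:
    counts = Counter(r["gender"] for r in records)
    total = sum(counts.values())
    return [g for g, n in counts.items() if n / total < threshold]

# Groups below the threshold signal that more data should be collected
# before the set can be considered representative.
print(underrepresented(records))  # ['non-binary'] with the sample above
```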

Data bias as a theme was mentioned multiple times, also in relation to other types of software besides medical, in line with Criado Perez [12]. She proposes several solutions for minimizing the gender data gap, the most prominent being that women should be included in the collection of data. For that, women need to be included in all aspects of life, and when more women are in positions of power, they remember that women and their needs exist.

Rowan and Dehlinger’s [5] results about privacy concerns are echoed by one interviewee, who talked about being wary of giving out their information to companies:

[…] I think that it’s important to break down barriers and to be inclusive. But I don’t necessarily think that part of being inclusive is to harvest and collect all the data we could possibly muster up because then I feel like that delves into a completely different territory.

—Interviewee 8

Non-binary discrimination, registering and disclosing users’ gender: Non-binary discrimination as a theme was related to RQ2. A result that was quite surprising was how many interviewees mentioned that language constructs contribute to the discrimination of non-binary people. One suggestion for making the language more inclusive is to follow the Gender Guidelines by Scheuerman et al. [23], and we encourage all practitioners to read Spiel’s paper [3], where they detail their experiences with different types of software as a non-binary person and propose several solutions.
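As one hedged illustration of our reading of such guidance, the sketch below models a gender question that is optional, allows multiple selections, and offers a free-text self-description rather than an “other” bucket. The option list and field names are our assumptions for the example, not prescriptions from [23].

```python
# Sketch of a more inclusive gender question: optional, multi-select,
# self-describable, with an explicit "prefer not to say" option.
# The option list and field names are illustrative assumptions.
from dataclasses import dataclass, field

GENDER_OPTIONS = ["woman", "man", "non-binary", "prefer not to say"]

@dataclass
class GenderResponse:
    selections: list[str] = field(default_factory=list)  # checkboxes, not radio buttons
    self_described: str | None = None                    # "let me describe it myself"

    def is_valid(self) -> bool:
        # An empty response is valid: answering the question is optional.
        return all(s in GENDER_OPTIONS for s in self.selections)

print(GenderResponse(selections=["non-binary"], self_described="genderfluid").is_valid())  # True
print(GenderResponse().is_valid())  # True: skipping the question is always allowed
```

The design choice worth noting is that there is no required “other” bucket, which interviewees described as discriminatory: users can combine options or describe their gender in their own words.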

The large majority of interviewees mentioned that it is not relevant to register a user’s gender at all. While it was not surprising that some thought it unnecessary, we were surprised that almost all of the interviewees did. In addition, there are examples where gender bias is a known fact, for example, in job application systems [24, 25] but also in tax systems [26], which may serve as an indicator not to capture gender when it is not strictly necessary. In light of the General Data Protection Regulation (GDPR), the guideline of not collecting unnecessary data might carry even more weight.

A motion has been sent to the Swedish government about including a third legal gender as well as gender-neutral personnummer. This is similar to what other countries, such as the United States [27], Germany [28], and Austria [29], have implemented. These countries show that it is possible to include non-binary genders in legal documents, and more countries should follow in their footsteps.

Limitations and threats to validity: The following threats to validity are of concern for the study at hand:

Internal validity: The interviewees were chosen based on whether they had an interest in the subject matter regarding their experiences of gender in software. While most of them were in academia, they were at different stages of their studies or were professors in different areas. It was also reasoned that if subjects who had no interest in the topic of the study were interviewed, they would not give any meaningful answers. This is a trade-off between having a slight bias in the results and receiving less meaningful data. Another aspect of internal validity is that Q3 gives results that may not be applicable to answering RQ2, because asking people to imagine a scenario does not give a truthful view of how non-binary people experience representation in software systems. This question remains in the study because the answers thematically overlapped significantly across all genders.

Credibility: The results of the interviews could threaten the credibility of the study. Interviewees could be biased toward giving answers that do not actually reflect their opinions on the subject because they wanted to be helpful to the research. The mitigation strategy for this threat is similar to the one for internal validity, namely, that the subjects were from a variety of backgrounds and that there is a trade-off between a slight bias, which makes it difficult to say whether the findings hold, and receiving less meaningful data.

Dependability: The findings within the study were fairly consistent, as many interviewees mentioned similar themes and codes. It is, however, difficult to say whether the findings could be repeated in a similar study, as this depends on the subjects. This threat cannot be mitigated at this stage, because only one study has been made on this topic thus far. More studies of a similar nature would have to be conducted to say whether the findings are congruent and, if so, what made it possible for the results to be similar.

Conclusion

The results of our study indicate that registering a user’s gender in software systems is often not relevant and can even be detrimental to the experience of non-binary users. Other important points were recognizing which design choices can be discriminatory for different types of users and that the data sets used for training algorithms need to be more diverse to reduce the data bias that is prevalent in so many software systems. If the software we create is inclusive, it makes our communities and society as a whole more inclusive and welcoming. Perhaps it is best summarized by the following quote from one interviewee:

[…] The software engineers, they really have the power in their hands to change a lot of assumptions and stigmas. […] People use software all the time, and you can really change the world in a different and positive way where people can feel included. I think that’s important.

—Interviewee 12

For how to ask about gender, we suggest reading Chapter 28, “How to Ask About Gender Identity of Software Engineers and “Guess” It from the Archival Data.”

Acknowledgment

Thank you to all the interviewees for your interesting perspectives. We wish to conclude this chapter by acknowledging the privilege that we have, including that we, as cis women, have been lucky enough to never have had our gender identity questioned. That we were able to conduct this study is further proof of our privilege, coming from good socioeconomic backgrounds as well as being born and raised in countries that offer free higher education. Despite these privileges, we are no strangers to discrimination and marginalization: we are women in the male-dominated field of software engineering as well as immigrants. We have both experienced racial discrimination, and one of us has experienced sexual harassment based on gender, in different countries we have lived in. This is a motivation for us to support others who experience discrimination and marginalization.