Abstract
This commentary examines the challenges metaverse platforms face in cross-border content moderation, focusing on the implications for freedom of expression and non-discrimination. It highlights the difficulty of determining what to remove, for which users, and how to do so, decisions with serious consequences for freedom of expression and our shared sense of reality. Proto-metaverse platforms such as Roblox and Minecraft face similar questions but have not yet encountered major cross-jurisdictional conflicts; as the experience of traditional social media platforms reveals, content moderation is not merely a question of law and policy, but also of geopolitics and government priorities. To avoid a “lowest common denominator effect” in which freedom of expression is infringed upon worldwide and discrimination is entrenched, this commentary argues that metaverse platforms must clarify their moderation policies, assess their entry into specific markets against local laws and their own values, and be prepared to exit overly restrictive markets.
1 Introduction
Artificial intelligence (AI) has overtaken the “metaverse”Footnote 1 as the technology topic of the day. However, despite headlines announcing the demise of the metaverse (such as Olinga (2023)), rumors of its death have been much exaggerated. Companies are certainly scaling back investments—as they are in many areas—but development continues. Mark Zuckerberg, CEO of the “metaverse company,” Meta, pushed back against the narrative “that we’re somehow moving away from focusing on the metaverse vision,” reiterating that Meta remains committed to investing in the metaverse as a long-term project (Dean 2023). Apple will launch a mixed reality (MR) headset in early 2024 (Apple 2023), and the government of China, which last year released a 5-year action plan on virtual reality (VR),Footnote 2 remains committed to following through on industrial applications even as consumer investment pulls back (Jiang 2023).
This cooling of investment may benefit governance efforts. The EU is preparing to regulate “virtual worlds” after several Citizens’ Panels (European Commission 2023). One issue that must be considered but that will be difficult to regulate domestically is cross-border content moderation, which has implications for freedom of expression worldwide. In general, online platforms remove content based on the laws of the countries in which they operate and have users, either proactively or in response to takedown requests from governments. This problem is theoretically well characterized: platforms remove or restrict locally illegal content for the users and/or the website version based in that country (Goldman 2021). However, this commentary will argue that, in reality, economic and political factors significantly complicate cross-border content moderation. In the metaverse, both the politics and the fundamental question of for whom and where to remove content will be much trickier: users from different jurisdictions will mingle in the same environment, and the immersive nature of the metaverse means that expression extends beyond speech to avatar representation and behavior, making the question of what is allowed even more political and pressing. The sense of presence created by immersion means that experiences trigger the same physiological and psychological responses as in the physical world (Parsons et al. 2009), and in platforms where users are represented by an avatar, they may identify with their avatar as an extension or representation of their own self (Freeman et al. 2020). Thus, experiences—including negative ones—impact users much like experiences in the physical world, meaning that, for example, hate speech in the metaverse could have an even more deleterious impact than hate speech on a traditional social media platform.
In this commentary, I will address the difficulties metaverse platforms will face in cross-border content moderation, especially in cases where required removals violate a platform’s stated values, and how they may impact individual freedoms of expression and non-discrimination guaranteed by the Universal Declaration of Human RightsFootnote 3 (UDHR). I will also consider the precedents (and lack thereof) of proto-metaverses such as Minecraft and Roblox, as well as traditional social media platforms.
2 The Difficulties of the Inevitable Cross-Border Moderation Conflicts
At the most fundamental level, content moderation is a question of what to remove, where, and for whom, as well as how. The metaverse presents difficulties across all four. I will first address the how. Moderating user-created worlds has already proven challenging for Meta’s metaverse platform, Horizon Worlds. In Horizon Worlds, journalists created a world called the “Qniverse” and filled it with misinformation and other content that violated Meta’s Community Guidelines. Meta did not remove it even after multiple users reported it; it was only removed when the journalists inquired directly with Meta’s communications department (Baker-White 2022). Horizon Worlds has only around 200,000 active users (Horwitz et al. 2022), but scale is already proving to be an issue. Automated moderation of immersive content, which encompasses not just posts but entire worlds, avatars, and other user-created content, is not as straightforward as moderating 2D audiovisual content. However, if metaverse platforms become as popular as they themselves predict, human moderation will not be feasible at the scale required. New processes will have to be developed to ensure that content is appropriate (to say nothing of user behavior).
This brings us to the question of what is to be removed. Both platform standards and local laws must be considered. Platform standards are not always easy to apply, though. One theory the “Qniverse” journalists put forth was that moderators may have thought the world was parody, something that is difficult to judge when the content in question is an immersive world rather than text, images, or videos, which may contain more obvious context clues identifying their parodic nature. If this is difficult for human moderators, it will be even more so for automated moderation tools. Metaverse and proto-metaverse platforms require their users and content to comply with local laws, but local laws may conflict with these platforms’ stated values, which include community diversity (Roblox Community Standards 2023), inclusion (Community Standards for Minecraft 2023), and the ability to express yourself (Customize Your Meta Avatar With New Body Shapes, Hair and Clothing Textures, and More Ways to Express Yourself 2023). Under laws restricting free expression around the world, not just content but also user behavior and expression may be subject to moderation.Footnote 4 Countries have always had different standards regarding what speech is acceptable online, and governments often issue takedown requests to platforms for violating content, but the embodied nature of metaverse platforms creates new avenues of restriction. For example, in 2022, Vladimir Putin signed into law an amendment to Russia’s existing “gay propaganda law,”Footnote 5 banning all depictions of LGBTQ+ “propaganda” in media and prohibiting positive depictions of LGBTQ+ relationships (Rajvanshi 2022)—which could include the mere existence of avatars presenting an LGBTQ+ identity. The law has already been used to fine TikTok for not taking down LGBTQ+ content (Marrow 2022) and to arrest two gay content creators (Padgett 2023).
This is one entry in a long list of laws restricting the expression of LGBTQ+ identity, including in Hungary (whose law was noted as similar to Russia’s in a European Parliament resolution condemning itFootnote 6) and Uganda (Bhandari 2023). Under Russia’s law, displaying LGBTQ+ identity in online spaces is illegal, meaning that metaverse platforms could be forced to either prohibit Russian users from accessing spaces where any portrayal whatsoever of LGBTQ+ content could be present (i.e., everywhere), restrict LGBTQ+ content to specific areas where Russian users are prohibited (infringing on the freedom of expression of all non-Russian users), or selectively alter the appearance of avatars expressing LGBTQ+ identities to appear “straight” to Russian users (another severe violation of non-Russian users’ autonomy and freedom of expression). Restricting LGBTQ+ expression on metaverse platforms would also be overtly discriminatory, violating Article 7 of the UDHR. These same questions apply to any case where some jurisdictions ban expression that others do not.
The question of what to remove overlaps with for whom content is to be banned. Metaverse platforms would have to restrict content for users based in the country where it is illegal. If they choose to remove content for only those users, the immersive nature of metaverse platforms would create situations where two users from different countries could be standing right next to each other and see completely different worlds, undermining any shared sense of reality. If they choose to remove such content for all users, other users’ autonomy and freedom of expression (which includes the right to “seek and impart information” according to Article 19 of the UDHR) will be infringed on. Many social media platforms wholesale ban content like pro-Nazi propaganda, which is illegal in GermanyFootnote 7 but not elsewhere. This is generally accepted as benefitting everyone’s online experience. Returning to the example of LGBTQ+ content, though, it would be discriminatory and severely curtail freedom of expression to ban it altogether. Despite this, platforms may face pressure from governments to broadly ban content. In January of 2023, the Indian government invoked emergency laws to block a BBC documentary unfavorable to Prime Minister Narendra Modi, ordering Twitter and YouTube to remove accounts sharing clips (Ellis-Petersen 2023). In an immersive metaverse, these actions would be akin to storming movie theaters and television stations and confiscating all copies of a film, impacting not merely the content available to users, but their very realities.
The final issue platforms face regarding content moderation is where content is to be removed, i.e., on what version of the site and whether different standards apply to public and private spaces. It has already been established that the very premise of the metaverse is that there are no different versions for different countries, but the public/private question remains. The journalists who created the “Qniverse” flagged the lack of clarity around Horizon Worlds community standards and content moderation in public versus private spaces, even though Facebook’s Community Standards apply to both public and private groups (Alison 2020). Issues of who can access what content are likely to persist in private spaces as well, but could be more egregious because users may expect that their private spaces are places where they can freely express themselves. Restricting specific content for users from specific countries in private spaces would violate that expectation and infringe on not only people’s social experiences, but their freedom of expression.
3 Proto-metaverse and Social Media Platform Precedent
One might assume that proto-metaverses like Roblox and Minecraft would provide precedent for how platforms can deal with these issues. After all, many of the above content moderation questions also apply to non-immersive platforms where users are depicted via avatars. However, neither metaverse nor proto-metaverse platforms have had to address significant cross-border content moderation issues. For metaverse platforms, part of the reason may be their operating area or scale. Meta’s Horizon Worlds is (as of May 2023) only available in seven countries, all in Europe and North America (Supported Countries for Meta Horizon Worlds 2022), and daily active VRChat users number in the tens of thousands (VRChat API Metrics, n.d.). Proto-metaverse platforms are larger, though. Roblox boasts millions of users in Russia and has games that would seem to fall afoul of the “gay propaganda law,” such as “LGBTQ+ Hangout” (Film Family 2023), so the government could theoretically take action at any time. However, this would likely provoke significant international outrage, and there is precedent for governments ignoring virtual worlds that technically violate their laws. A Minecraft server called the “Uncensored Library” hosts articles that have been banned by various governments and is accessible to Minecraft users in any country. It was widely celebrated in the media, which perhaps contributed to its security—any government forcing its takedown would face widespread condemnation. Still, Nick Feamster of the University of Chicago warned that “Governments will know about this - the articles are going across the internet. It’s not going to be foolproof against a determined adversary” (Gerken 2020). The unknown factor is at what point governments will consider it worthwhile to take on platforms over content they disagree with, and it is here that social media platforms provide some insight, revealing the complex political dynamics of cross-border content moderation.
Best practices documentation for cross-border content moderation states that restriction of locally illegal content should be “geographically proportionate” and preserve the broadest availability of “legitimate content” (Internet & Jurisdiction Policy Network 2021). When defining “legitimate content” and restrictions, platforms should look to Article 19 of the UDHR and Article 19 of the International Covenant on Civil and Political Rights (ICCPR)Footnote 8 (Internet & Jurisdiction Policy Network 2021), which protect the rights to freedom of opinion and expression, including the right to “seek, receive, and impart” information. Indeed, Meta states that if takedown requests are “inconsistent with international human rights standards” (Content Restrictions Based on Local Law, n.d.), it may take no action. However, moderation decisions appear to be influenced more by political and commercial interests than by the multi-step evaluation process Meta lays out. Meta’s transparency reports reveal that it is not acting on takedown requests from the Russian government regarding pro-Ukraine content (Case Studies, n.d.), which could indicate that it is upholding its commitment to the right to free expression. Contradicting this is a May 2023 case regarding so-called anti-state content, in which Meta blocked access to 110 items alleging crimes and corruption by the Turkish government (items that did not violate Meta’s Community Standards) under threat of having Facebook banned altogether (Case Studies, n.d.). Twitter similarly restricted anti-government accounts and Tweets, and owner Elon Musk offered insight into the decision when he responded to a critical Tweet: “The choice is to have Twitter throttled in its entirety or limit access to some tweets. Which do you want?” (Musk 2023).Footnote 9 Though the implication is that this was the trade-off necessary to protect free speech more broadly, commercial interests are no doubt at play, as no platform wants the bottom-line impact of being blocked (Gillespie 2018). Despite the similar nature of these cases, one reason Meta likely did not remove pro-Ukraine content at the behest of the Russian government is that Facebook is already blocked in Russia (Allyn and Selyukh 2022), so the government had limited leverage.Footnote 10 Furthermore, given the overwhelming pro-Ukraine sentiment in the West, complying with such requests would have created a public relations nightmare that would damage Meta’s business elsewhere.
Thus far, proto-metaverse and metaverse platforms have not faced the content moderation battles that traditional social media companies have, likely due to their comparatively smaller size and reach. However, as they grow, it is almost certain that they will clash with governments over content moderation, and what happens will depend on a variety of political factors independent of the human rights impact of banning the content. For platforms to truly avoid infringing on freedom of expression, they would have to concretely establish what values they want to uphold and stand by them, but this may come at a cost of their ability to operate in certain countries.
4 Conclusion
Metaverse platforms will be subject to the same political and economic pressures in content moderation as traditional social media platforms and proto-metaverses, but with higher stakes due to the implications for user identity and expression in an immersive context. And yet, they will probably be governed primarily by private companies, especially when it comes to content moderation. Any forthcoming EU metaverse regulation may spread to other jurisdictions via the Brussels Effect, but it will not be able to address cross-border content moderation. In an interview, Meta CEO Mark Zuckerberg was asked about how the metaverse will be governed and expressed a vision of a regime of industry-set standards (Newton 2021), which in reality would likely have to co-exist with some amount of hard law. However, unless a standards body can devise a universally accepted standard for content moderation, platforms will continue to set their own, which risks creating a “lowest common denominator effect” where, for technical and political ease, metaverse platforms adopt the most restrictive content regulations worldwide, severely limiting user expression. Platforms are poised to have enormous control over metaverse content and thus the realities of their users. Governments will have power as well, but only in restricting otherwise legitimate content, not in compelling certain forms of speech to be permitted outside their borders.
The status quo where traditional social media platforms are in constant battles over platform availability may spell doom for the vision of a universal Metaverse—unless it is one where freedom of expression is significantly restricted. Instead of this dystopic vision, platforms should concede that the Metaverse will be a set of “splinterverses” with different markets and values and (a) clarify how they will moderate content illegal in specific jurisdictions, (b) select and justify which markets they enter based on their laws and the platform’s professed values, and (c) be prepared to exit markets that become overly restrictive, rather than capitulate to restrictions over threats of being banned. So long as different countries have different standards for free expression online, disparities in platform and content availability will exist. In the absence of international regulations establishing global content norms, the metaverse is set to be a critical new battleground in the fight for freedom of expression.
Data Availability
Not applicable.
Notes
The definition of “metaverse” and whether there is one “Metaverse” or many “metaverses” have yet to be settled (Floridi 2022). In this piece, I use the term broadly to include any immersive virtual world accessible to users from many different places.
虚拟现实与行业应用融合发展行动计划 (2022—2026年) [Action Plan for the Integration and Development of Virtual Reality and Industrial Applications (2022-2026)], 2022
Universal Declaration of Human Rights, United Nations, 1948.
Some local laws may hinder any content moderation, such as a Texas law (currently blocked pending a Supreme Court case) attempting to ban political “censorship” that is written so broadly as to bar companies from blocking any content that is not explicitly illegal (Dwoskin 2023).
On the Introduction of Amendments into Article 5 of the Federal Law “On the Protection of Children from Information Liable to Be Injurious to Their Health and Development” and Individual Legislative Documents of the Russian Federation Aimed at Protecting Children from Information Promoting the Denial of Traditional Family Values No. 135-FZ, 2013.
European Parliament Resolution of 8 July 2021 on Breaches of EU Law and of the Rights of LGBTIQ Citizens in Hungary as a Result of the Legal Changes Adopted by the Hungarian Parliament 2021/2780(RSP), 2021.
Criminal Code in the version published on 13 November 1998 (Federal Law Gazette I, p. 3322), as last amended by Article 2 of the Act of 22 November 2021 (Federal Law Gazette I, p. 4906)
International Covenant on Civil and Political Rights, United Nations, 1966
Musk has been criticized for hypocrisy in his content moderation decisions after professing to be a “free speech absolutist” (Musk 2022). Even the most libertarian platforms are subject to laws and commercial pressures that force them to moderate content; the famously anti-moderation Parler still expanded its moderation policy when Apple and Google removed it from their app stores for allowing hate speech (Grant 2022), and all platforms are vulnerable to financial pressure from threats of blocking.
Other forms of leverage may include fines and “hostage-taking laws” requiring platforms to have domestic staff who can be pressured and even detained (Elliott 2021).
References
Alison, Tom. (2020). Our latest steps to keep Facebook groups safe. Meta Newsroom (blog). September 17, 2020. https://about.fb.com/news/2020/09/keeping-facebook-groups-safe/
Allyn, Bobby, and Alina Selyukh. (2022). Russia blocks access to Facebook. NPR, March 4, 2022, sec. Technology. https://www.npr.org/2022/03/04/1084580235/russia-blocks-facebook-twitter
Apple. (2023). WWDC 2023 — June 5 | Apple. https://www.youtube.com/watch?v=GYkq9Rgoj8E
Baker-White, Emily. (2022). Meta wouldn’t tell us how it enforces its rules in VR, so we ran a test to find out. Buzzfeed News, February 11, 2022. https://www.buzzfeednews.com/article/emilybakerwhite/meta-facebook-horizon-vr-content-rules-test
Bhandari, Aditi. (2023). Uganda’s anti-gay bill is the latest and worst to target LGBTQ Africans. Reuters, April 7, 2023. https://www.reuters.com/graphics/UGANDA-LGBT/movakykrjva/
Case Studies. (n.d.). Transparency Center. https://transparency.fb.com/data/content-restrictions/case-studies/. Accessed 17 May 2023
Community Standards for Minecraft. 2023. Minecraft. April 13, 2023. https://www.minecraft.net/en-us/community-standards
Content Restrictions Based on Local Law. (n.d.). Meta Transparency Center. https://transparency.fb.com/data/content-restrictions/content-violating-local-law/. Accessed 4 May 2023
Customize Your Meta Avatar With New Body Shapes, Hair and Clothing Textures, and More Ways to Express Yourself. (2023). Meta Quest Blog. April 27, 2023. https://www.meta.com/blog/quest/meta-avatar-puma-new-body-shapes-hair-clothing-textures
Dean, Grace. (2023). Mark Zuckerberg says he is absolutely not abandoning the metaverse as company division loses $4 billion. Business Insider. April 27, 2023. https://www.businessinsider.com/zuckerberg-says-he-isnt-abandoning-metaverse-as-division-loses-mount-2023-4
Dwoskin, Elizabeth. (2023). Tech companies are gaming out responses to the Texas social media law. Washington Post, January 21, 2023. https://www.washingtonpost.com/technology/2022/10/01/texas-social-media-impact/
Elliott, Vittoria. (2021). New Laws Requiring Social Media Platforms to Hire Local Staff Could Endanger Employees. Rest of World. May 14, 2021. https://restofworld.org/2021/social-media-laws-twitter-facebook/
Ellis-Petersen, Hannah. (2023). India invokes emergency laws to ban BBC Modi documentary. The Guardian, January 23, 2023, sec. World news. https://www.theguardian.com/world/2023/jan/23/india-emergency-laws-to-ban-bbc-narendra-modi-documentary
European Commission. (2023). European commission - have your say. Text. European Commission - Have Your Say. May 3, 2023. https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/13757-Virtual-worlds-metaverses-a-vision-for-openness-safety-and-respect_en
European Parliament Resolution of 8 July 2021 on Breaches of EU Law and of the Rights of LGBTIQ Citizens in Hungary as a Result of the Legal Changes Adopted by the Hungarian Parliament (2021/2780(RSP)). 2021
Freeman, Guo, Samaneh Zamanifard, Divine Maloney, and Alexandra Adkins. (2020). My Body, My Avatar: How People Perceive Their Avatars in Social Virtual Reality. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, 1–8. Honolulu HI USA: ACM. https://doi.org/10.1145/3334480.3382923
Film Family. (2023). LGBTQ+ Hangout. Roblox. May 1, 2023. https://www.roblox.com/games/5373028495/LGBTQ-Hangout
Floridi, L. (2022). Metaverse: A Matter of Experience. Philosophy & Technology, 35(3), 73. https://doi.org/10.1007/s13347-022-00568-6
German Criminal Code (Strafgesetzbuch – StGB). (n.d.). https://www.gesetze-im-internet.de/englisch_stgb/englisch_stgb.html. Accessed 28 Apr 2023
Gerken, Tom. (2020). Minecraft ‘Loophole’ Library of Banned Journalism. BBC News, March 13, 2020, sec. US & Canada. https://www.bbc.com/news/world-us-canada-51883247
Gillespie, Tarleton. (2018). Regulation of and by Platforms. In The SAGE Handbook of Social Media, by Jean Burgess, Alice Marwick, and Thomas Poell, 254–78. 1 Oliver’s Yard, 55 City Road London EC1Y 1SP: SAGE Publications Ltd. https://doi.org/10.4135/9781473984066.n15
Goldman, Eric. (2021). Content Moderation Remedies. Michigan Technology Law Review 28(1): 1–60. https://doi.org/10.36645/mtlr.28.1.content
Grant, Nico. (2022). Parler Returns to Google Play Store. The New York Times, September 2, 2022, sec. Technology. https://www.nytimes.com/2022/09/02/technology/parler-google-play.html
Horwitz, Jeff, Salvador Rodriguez, and Meghan Bobrowsky. (2022). Company Documents Show Meta’s Flagship Metaverse Falling Short. Wall Street Journal, October 15, 2022, sec. Tech. https://www.wsj.com/articles/meta-metaverse-horizon-worlds-zuckerberg-facebook-internal-documents-11665778961
Internet & Jurisdiction Policy Network. (2021). Toolkit Cross-Border Content Moderation. https://www.internetjurisdiction.net/uploads/pdfs/Internet-Jurisdiction-Policy-Network-21-104-Toolkit-Cross-border-Content-Moderation-2021.pdf
Jiang, Yaling. (2023). China’s Metaverse Is All About Work. Wired, April 25, 2023. https://www.wired.com/story/china-metaverse-work-health-care/
Marrow, Alexander. (2022). Russia Fines TikTok for ‘LGBT Propaganda’, Twitch over Ukraine Content. Reuters, October 4, 2022, sec. Technology. https://www.reuters.com/technology/russian-court-fines-tiktok-50000-over-refusal-delete-lgbt-content-2022-10-04/
Miller, Chance. (2023). Apple ‘Reality Pro’ Headset Will Toggle between AR/VR, Serve as Mac Display, 2-Hour External Battery Packs, More. 9to5Mac (blog). January 23, 2023. https://9to5mac.com/2023/01/23/apple-reality-pro-headset-features-details/
Ministry of Industry and Information Technology. (2022). 虚拟现实与行业应用融合发展行动计划 (2022—2026年) [Action Plan for the Integration and Development of Virtual Reality and Industrial Applications (2022-2026)]. https://www.miit.gov.cn/zwgk/zcwj/wjfb/tz/art/2022/art_775aaa3f77264817a5b41421a8b2ce22.html
Musk, Elon [@elonmusk]. (2022). Starlink Has Been Told by Some Governments (Not Ukraine) to Block Russian News Sources. We Will Not Do so Unless at Gunpoint. Sorry to Be a Free Speech Absolutist. Tweet. Twitter. https://twitter.com/elonmusk/status/1499976967105433600. Accessed 5 Mar 2022
Musk, Elon. (2023). @mattyglesias Did Your Brain Fall out of Your Head, Yglesias? The Choice Is Have Twitter Throttled in Its Entirety or Limit Access to Some Tweets. Which One Do You Want? Tweet. Twitter. https://twitter.com/elonmusk/status/1657422401754259461
Newton, Casey. (2021). Mark Zuckerberg Is Betting Facebook’s Future on the Metaverse. The Verge. July 22, 2021. https://www.theverge.com/22588022/mark-zuckerberg-facebook-ceo-metaverse-interview
Olinga, Luc. (2023). Mark Zuckerberg Quietly Buries the Metaverse. TheStreet. March 18, 2023. https://www.thestreet.com/technology/mark-zuckerberg-quietly-buries-the-metaverse
On the Introduction of Amendments into Article 5 of the Federal Law “On the Protection of Children from Information Liable to Be Injurious to Their Health and Development” and Individual Legislative Documents of the Russian Federation Aimed at Protecting Children from Information Promoting the Denial of Traditional Family Values. 2013
Padgett, Donald. (2023). Gay TikTok Couple Arrest Under Anti-LGBTQ+ Law. April 9, 2023. https://www.advocate.com/russia/tiktok-russian-gay-couple
Parsons, Thomas D., Christopher Courtney, Louise Cosand, Arvind Iyer, Albert A. Rizzo, and Kelvin Oie. (2009). Assessment of Psychophysiological Differences of West Point Cadets and Civilian Controls Immersed within a Virtual Environment. In Foundations of Augmented Cognition. Neuroergonomics and Operational Neuroscience, edited by Dylan D. Schmorrow, Ivy V. Estabrooke, and Marc Grootjen, 514–23. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer. https://doi.org/10.1007/978-3-642-02812-0_60
Rajvanshi, Astha. (2022). What to Know About Russia’s So-Called ‘Gay Propaganda’ Bill. Time. November 24, 2022. https://time.com/6236822/russia-gay-propaganda-law-discrimination/
Roblox Community Standards. (2023). Roblox Support. https://en.help.roblox.com/hc/en-us/articles/203313410-Roblox-Community-Standards. Accessed 4 May 2023
Supported Countries for Meta Horizon Worlds. (2022). Meta Store. September 2022. https://www.meta.com/en-gb/help/quest/articles/horizon/explore-horizon-worlds/horizon-supported-countries/
United Nations General Assembly. (1966). International Covenant on Civil and Political Rights. https://www.ohchr.org/en/instruments-mechanisms/instruments/international-covenant-civil-and-political-rights
United Nations. (1948). Universal Declaration of Human Rights. United Nations. United Nations. 1948. https://www.un.org/en/about-us/universal-declaration-of-human-rights
VRChat API Metrics. (n.d.). Grafana dashboard. https://metrics.vrchat.community/d/wnQj2Qjnk/vrchat-api-metrics?orgId=1&refresh=30s&from=now-5y&to=now. Accessed 21 Apr 2023
Acknowledgements
The author wishes to acknowledge the University of Bologna for funding her doctoral studies and her supervisor, Luciano Floridi.
Funding
Open access funding provided by Alma Mater Studiorum - Università di Bologna within the CRUI-CARE Agreement. The author’s doctoral research is supported by a scholarship co-funded by the University of Bologna Department of Legal Studies and general budget.
Author information
Contributions
EH conceptualized, wrote, and edited the article.
Ethics declarations
Ethics Approval and Consent to Participate
Not applicable.
Consent for Publication
Not applicable.
Competing Interests
The author declares no competing interests.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Hine, E. Content Moderation in the Metaverse Could Be a New Frontier to Attack Freedom of Expression. Philos. Technol. 36, 43 (2023). https://doi.org/10.1007/s13347-023-00645-4