Introduction
Virtual reality (VR) is “a computer-generated display that allows or compels the user (or users) to have a sense of being present in an environment other than the one they are actually in, and to interact with that environment” (Schroeder 1996, 25). VR social spaces are virtual worlds, or “environments that people experience as ongoing over time and that have large populations which they experience together with others as a world for social interaction” (Schroeder 2008, 2). Social spaces in VR are avenues for spontaneous creativity, association with other users, and shared transformative experiences spanning the realms of work, education, and social engagement.
Increased agency and a sense of physical presence within virtual spaces allow users to more intensely experience new types of content. With VR-mediated interaction, users physically represent their avatar in real time rather than manipulate an avatar through a controller. By harnessing these features, VR platforms have changed the way people interact with others in social spaces. In short, social VR enables empathy, engagement, and association over long distances more effectively than other platforms or consoles. Seventy-seven percent of VR users are interested in more social engagement in virtual reality (Koetsier 2018).
This paper explores how free-association-generated governance emerged in VRChat, an open-ended, user-generated platform consisting of over 25,000 community-created worlds. (For a glimpse into what is currently happening in VRChat, visit https://www.twitch.tv/directory/game/VRChat and click on one of the available livestreams.) It applies the lens of polycentric governance to a case study of the emergence of governance solutions to collective action problems in VRChat (Ostrom 2010). This paper finds that user-specified governance (that is, rulesets that are created by and tailored to users in the community), in contrast to platform-imposed governance, encourages cooperation in VR social spaces and is a critical addition to the rules the platform exogenously imposes. In short, polycentric governance in virtual social spaces is an interactive process where rules are coproduced by users, developers, and policymakers. (For the purposes of this paper, “developers” is used as a catch-all term for VRChat employees and professional staff as opposed to users; Castronova (2005) refers to them as the “code authority.” Coproduction is the process by which multiple individuals in the community provide input into a good or service (Ostrom 1996).)
Users are not passively consuming experiences in social VR; they are acting as designers by contributing content. Moreover, their feedback informs the mechanisms and tools the developers offer. Users import their perceptions, expectations, and visions into this virtual space. Developers curate, apply technical constraints, and guide users to “create and play in virtual worlds” with other people (VRChat, “Create and Play in Virtual Worlds,” December 28, 2019, https://www.vrchat.com/). In short, VRChat is a representation of coproduction in action, where users are the interpreters and joint creators of formal and informal laws. In this way, VRChat’s evolution serves as a case study for governance emerging on a new frontier. Game developers, policymakers, and users alike can harness the lessons from this analysis to continue to foster new sustainable social spaces in VR.
This paper is structured as follows. Sections 2 and 3 lay out a history of governance in online communities and a history of virtual social spaces. Section 4 reviews the case study methodology taken here. Section 5 details the extent of collective action problems in VRChat, and section 6 documents emerging solutions to those problems from formal codes of conduct, to US policy, to community governance and informal norms. Section 7 discusses the role of polycentric governance in moderating user behavior.
Previous Work on the Governance of Online Communities
Existing work on the governance of online communities provides a foundation from which to explore the rulemaking activity of communities in virtual reality. Post (2001) identified the emergent nature of rules governing online communities in cyberspace. He notes that although platforms can be governed from above, this is far from the likely norm across online platforms. In fact, Post predicts that competition between platforms will result in a market for rulesets compatible with individual preferences. He argues that “the outcome of the individual decisions within this market—the aggregated choices of individual users seeking particular network rule sets most to their liking—will therefore, to a significant extent, determine the contours of the ‘law of cyberspace’” (Post 2001, 208).
Community-led governance can only be sustained in the right institutional environment. Stivale (2001) documents how the creators of LambdaMOO, a text-based virtual experience released in 1990, turned social moderation and rule creation entirely over to its community; however, repeated incidences of sexual harassment and even “cyber rape” (wherein a player described forced sexual acts that he and others performed on each other virtually) led the creators to re-administer a system of top-down governance three years later. Stivale notes that this type of experimentation is part of a learning process in the pursuit of a mode of governance that avoids the pitfalls of both democratic and authoritarian systems. De Filippi et al. (2020) discuss bringing existing political mechanisms such as juries and political parties into online spaces and designing these modular political mechanisms to be interoperable across platforms.
Balkin (2006) recognized that the relationship between platform owners and players is complex, containing elements from both producer-consumer relationships and governor-citizen relationships. He argues two major governance mechanisms exist in social worlds: code and contract. Code refers to the technical limitations of the game itself. The contract takes the form of a legally binding terms-of-service agreement or end user licensing agreement (EULA); but because the end goal of many online worlds is to build a community following, developers often take the opinions of users into account when creating or updating the EULA. Balkin stresses the importance of “formalizing a consultation process that recognizes the interests of the player community” to allow for joint governance—a form of coproduction (Balkin 2006, 110). Joint production of rules can occur peaceably through continuous discourse between developers and users, or it can happen disjointedly as was the case in the first virtual world protest, the “tax revolt” in Second Life, where users persuaded developers to forgo a penalty for excessive building (Grimmelmann 2003).
Cavender (2015) analyzes the competitive world of EVE Online and finds that reputation mechanisms and group formation allow individuals to cooperate even in a state of war in a game with large populations. Dumont and Candler (2005) look at informal communal provision of rules in online spaces as a powerful delivery system for governance as compared to the traditional emphasis on public versus private provision or provision by formal organizations. They argue that internet-based technology expanded the range of services that can be provided through informal communities. Crawford (2006) argues that social groups can regulate player experiences and often do so better than formal guidelines or public policy because rules can be targeted to local context and culture.
Morrell (2014) draws on the work of Elinor Ostrom and defines governance in online creation communities as having eight characteristics: a collective goal, social norms, technologically embedded rules of participation, self-management of contributions, formal community guidelines, a licensing model, decision-making and conflict resolution systems, and infrastructure provision. Section 6 shows that VRChat exhibits all eight of these characteristics; in addition, governance in VRChat is user-specific, changes over time, and is coproduced. Morrell’s analysis of governance of “online creation communities” provides a lens by which to explore the institutional processes by which users overcome collective action problems in virtual reality spaces wherein they can physically embody their avatars.
The Social Frontier in Virtual Reality
Social virtual reality has its origins in “MUDs” (multi-user dungeons), real-time exercises of interactive story building and role-playing usually taking place in chat rooms. MUDs, often based on fantasy, took their inspiration from the tabletop role-playing game Dungeons & Dragons, which in turn was influenced by the works of J. R. R. Tolkien. One of the earliest visual MUDs, Habitat, released in 1986 by Lucasfilm, encountered an interesting governance challenge. At first, its developers felt they needed to pre-plan and set up events. “We were to be like the cruise director on an ocean voyage, but we were still thinking like game designers” (Morningstar and Farmer 1990). They created a treasure hunt called “D’nalsi Island Adventure” intended to occupy players for days; however, one person solved the puzzle in eight hours before other players even had the opportunity to start. Morningstar and Farmer (1990) reflected:
Again and again we found that activities based on often unconscious assumptions about player behavior had completely unexpected outcomes (when they were not simply outright failures). It was clear that we were not in control. The more people we involved in something, the less in control we were. We could influence things, we could set up interesting situations, we could provide opportunities for things to happen, but we could not dictate the outcome. Social engineering is, at best, an inexact science (or, as some wag once said, “in the most carefully constructed experiment under the most carefully controlled conditions, the organism will do whatever it damn well pleases”).
As a result, they shifted to an approach that let players design and host their own experiences and encouraged designers to allow for the widest range of activity possible (Lastowka and Hunter 2006, 22). Developers became facilitators rather than directors or dictators.
After MUDs came MMORPGs (massively multiplayer online role-playing games) like World of Warcraft, EVE Online, Ultima Online, Everquest, and Dark Age of Camelot. Linden Lab’s Second Life, released in 2003, is still one of the most active and persistent open-ended virtual worlds. Though similar to an MMORPG, Linden Lab insisted Second Life was not a game with an objective, but rather a virtual world in which users chose their own objectives. The vision of individuals fully embodying avatars in real time in an open-ended, virtual, immersive space took its modern form in science fiction. William Gibson depicted the VR dataspace called “the Matrix” in his 1984 thriller Neuromancer. In Snow Crash, Neal Stephenson (1992) formulated the concept of a metaverse—a virtual world that mimicked the real world and featured a downtown area that facilitated socialization and economic exchange. Stephenson posited a world of fluid, voluntary contracting in which individuals who enter a different physical jurisdiction (called a “burbclave”) are notified by sound or sight that the legal ruleset has changed. For example, nudity or swearing may be allowed in some spaces but not others. VRChat embodies this rule-based fractionalization of society.
Before users select a world, they are presented with a short description of its features. (A “world,” in the context of this paper, is a designed virtual space. There can be multiple “instances” of a world; for example, one can open up a private invite-only or friends-only instance of a public world.) Most worlds with defined purposes (such as scavenger hunts or escape rooms—games where users solve puzzles and reveal clues in order to leave the space) have welcome text on walls near the entry points describing the rules of engagement. Users jump from world to world discontinuously and must analyze the formal rules and informal conduct within a world to avoid being kicked or banned.
In Ready Player One, Ernest Cline conceptualized a whole parallel VR society on different planets where people worked, went to school, and played games (Cline 2012). Both Stephenson and Cline imagined that certain design features and fictional moderation would shape behavior, foreshadowing social VR as we know it today.
VRChat is a social platform in which users create content and define the purpose of gameplay. (An online platform is a service that facilitates interactions between multiple users who interact through the internet.) VRChat aims to be a place where users can create community, and it empowers its community to create worlds and avatars. The VR environment is unique in that the level of presence and agency (the ability to act on one’s own behalf) approximates reality while also going beyond limitations imposed by reality. Virtual worlds are not constrained by gravity or Newton’s laws of motion. Users design their avatars to have superhuman capabilities and features such as flaming hair or extra arms. In virtual reality, the digital domain is inhabited; in other words, individuals virtually embody their avatars rather than manipulating them from a distance on a 2D screen. When using a head-mounted display device (e.g., a VR headset), users are in two worlds at once: the material, or real world, and the virtual world (Hindmarsh, Heath, and Fraser 2006).
Real-world phenomena also influence how people interact in virtual spaces. For example, the presence of cords or other real-world distractions can draw users out of the experience. Materiality also affects expectations and immersion in virtual spaces—players expect gravity, expect material resistance, and expect objects to move or respond in certain ways. Yet the immateriality of VR is also a boon. Users can assume multiple new identities, fly, have superhuman strength, and manipulate rocks as if they are feathers. Immateriality provides unique ways to interact, develop communities, and find meaning. The consequences of developments in VR deserve emphasis: users can be the architects of a new society, no longer confined to planet Earth or a single corporeal body (Castronova 2005, 70).
VRChat’s popularity and challenges are due in part to its open-ended, social format. Users generate avatars, design private spaces, and create their own plans and goals. Compared to their two-dimensional or single-user counterparts, social spaces in VR approximate real human interaction more rigorously. VRChat allows users to use body language and exhibit a wider range of expression. People appear more human than in a traditional chat room, even though they may take on non-human forms. As one user recounted, “There are so many new ways to make people laugh.” (This quote was taken from an Endgame public lecture in VRChat. See: Endgame, “Herding Cats, from VR to Meatspace,” streamed live July 24, 2019, https://www.youtube.com/watch?v=wMKT2bl3sFY.) For example, in one instance, users draw shapes like hats and mustaches in the air, and other people walk up to them and pretend to wear them. In an instance made to look like a bowling alley, one user flings bowling balls comically against the wall and another shoves a virtual cupcake into his friend’s face, who responds by falling on the floor, pretending to spit it out. VRChat is a hotbed of creativity, featuring hundreds of concerts, meet-ups, and lectures each week. Users catalyze scenarios, solve problems, impose and break rules, and inform the development of new rules.
In open-ended virtual spaces, users have the freedom to interact with objects, people, and environments in creative ways. Picon (2004) points out that no matter the technical design of the virtual space, the reality of existing within it will not correspond to the designer’s expectations. Hindmarsh, Heath, and Fraser (2006) argue that social scientists overemphasize the importance of the design visions of developers over the empirical realities of the users who inhabit the virtual space. The reality users experience also informs the designer’s decision-making when it comes to rulemaking in virtual spaces and influences how the users and developers communicate to coproduce the governance structure in VR social spaces.
Methodology
Social VR platforms include VRChat, Bigscreen VR, Altspace, RecRoom, Mozilla Hubs, Facebook Horizon, and Sansar. This paper focuses on VRChat because it is the most active and popular. Released in February 2017, VRChat is a multiuser platform in which users design their own avatars and spaces. It hovers around 11,000 players on average per month according to the Steam game distribution service (Steam, “VRChat,” SteamCharts, accessed January 27, 2020, https://steamcharts.com/app/438100#1y). Steam is not the only platform for accessing VR social games like VRChat, but it may provide a fairly representative sample, as it is the most popular platform through which users access social VR games.
By comparison, competitors AltSpace, RecRoom, and BigscreenVR all have fewer than 600 players on average per month. Au (2019a) estimated that 30 percent of VRChat users are using VR-capable headsets (head-mounted displays, or HMDs), and many more are using traditional computer interfaces to connect to VRChat. Yet the main community members involved in shaping governance are repeated VRChat users and those who invest in HMDs such as the Oculus Rift or HTC Vive.
This paper uses in-game observations and analysis of primary and secondary sources to formulate case studies of emergent governance in VRChat. The case studies reflect a few developments that occurred in 2018: the Ugandan Knuckles harassment incident, the evolution of community-based public events such as the EndgameVR lecture series, and user reaction to a medical incident. These three incidents, discussed in more detail below, are moments that resulted in major governance decisions. The Ugandan Knuckles phenomenon led to new moderation tools (features users or developers can employ to edit or remove user-generated content such as worlds, avatars, or objects). The EndgameVR community developed rules, norms, and enforcement mechanisms to facilitate public lectures. An incident in which a player suffered a seizure while active in VRChat demonstrated the value of sanctioning in informal rule formation.
In order to understand how governance is emerging in social spaces in VR, I embedded myself in the social context in which the rules developed. I observed 30 hours total of VRChat streams on the Twitch platform and on YouTube in the fall of 2019 and spring of 2020. I also participated in several public events in VRChat (lectures, freeze tag, dance parties, and open mics) and hosted 22 private events with friends.
I attended one EndgameVR lecture in 2018 and watched recordings of six other lectures featuring conversations about the future of VR, rules regarding commerce on the platform, and the culture of VRChat communities. For analysis of evolving community discussion and public ruleset deliberation, I reviewed dozens of threads (a list of links to specific threads is available upon request) on online platforms such as Reddit, YouTube, Steam, Canny, and Twitch, which served as primary sources of past and ongoing deliberation about rules, community, and culture.
VRChat provides an ideal venue for understanding governance because of the potential for users and developers to experiment. Users self-select into different rule settings, and platforms like Canny (a comment board that allows users to post ideas, which can be voted up or down and commented on) reveal the dialogue between users and developers in shaping governance. (To read the list of requested features, please visit VRChat, “Feature Requests,” https://vrchat.canny.io/feature-requests.) That rules-level analysis is what social theorist Jon Elster focuses on in The Cement of Society (1989). He articulates a question of social theory that is central to this analysis: “What is it that glues societies together and prevents them from disintegrating into chaos and war?” (Elster 1989, 1). How rules will emerge or what the structure of governance will look like at any point in time is unpredictable, as decision-making is context-specific and institution-dependent. Nonetheless, one can seek to better understand the nature of governance solutions as they emerge in response to collective action problems.
Collective Action Problems in VR Social Spaces
In VR social spaces, users must interact in a space with other users, and this setting creates both problems and opportunities, many of which have corollaries in the physical world. Establishing norms of civility is enough of a problem in society today, and users have to engage in the coproduction of rules all over again in VR. Users log in and select a public or private virtual space to enter, each with its own societal context. Users may choose spaces based on where their friends are, based on the characteristics of the world, or based on the world’s designer. There are five types of worlds: public worlds, friends+ (friends can invite friends), friends only, invite+ (users can request an invite), and private instances. (Instances are occurrences of worlds; there can be multiple instances of the same world running simultaneously with different users in each. This paper does not analyze governance in purely private instances, in which the single individual who created the instance has the power to exclude bad actors in a dictatorial fashion.) Collective action problems can occur in all types of populated spaces when there is either an absence of clear rules or ineffective enforcement of rules; the issues enumerated in this paper occur mainly in public spaces, where complex interactions are difficult to moderate.
Enforcement in VRChat takes a few forms. Users can perform five technical actions toward other users: friend, mute, block, report user, or vote to kick another user. If enough members vote to kick an individual from an instance, the individual is forced to return to their original, in-game starting location and cannot enter the instance for a period of time. VRChat moderators (jokingly referred to as gods) are the only individuals with the ability to ban users or delete their accounts. Moderators can enter public and private worlds (the latter upon private request) and can make their avatars invisible, raising privacy concerns among users. Instance owners (whether of private or public instances) and world creators can warn, kick, and mute other users. Competing social spaces fulfill different needs, and users vote with their feet. However, it is impossible to moderate all behavior in real time in all worlds. Inconsistent enforcement means age-old problems of conflict and cooperation are present and evolving.
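To make this enforcement toolkit concrete, the sketch below models a vote-to-kick inside a single instance. It is a minimal illustration under assumed parameters: VRChat does not publish its vote threshold or cooldown length, so the majority rule, the five-minute lockout, and all names here are hypothetical.

```python
import time

# Assumed parameters; VRChat's real thresholds are not public.
KICK_FRACTION = 0.5        # share of other present users needed to kick
REENTRY_COOLDOWN = 5 * 60  # lockout in seconds before the kicked user may return

class Instance:
    """One running copy of a world, tracking who is present and any kick votes."""

    def __init__(self, world_name):
        self.world_name = world_name
        self.present = set()    # usernames currently in the instance
        self.kick_votes = {}    # target username -> set of voters
        self.locked_out = {}    # target username -> time when re-entry is allowed

    def join(self, user):
        if time.time() < self.locked_out.get(user, 0):
            return False        # still serving a vote-kick cooldown
        self.present.add(user)
        return True

    def vote_kick(self, voter, target):
        if voter not in self.present or target not in self.present:
            return False
        votes = self.kick_votes.setdefault(target, set())
        votes.add(voter)
        # Kick once an assumed majority of the other present users agree.
        if len(votes) > KICK_FRACTION * (len(self.present) - 1):
            self.present.discard(target)
            self.locked_out[target] = time.time() + REENTRY_COOLDOWN
            del self.kick_votes[target]
            return True         # target is sent back to their spawn point
        return False

# Usage: three users in a public instance vote out a disruptive fourth.
lobby = Instance("example-public-world")
for name in ["ada", "bo", "cal", "crasher42"]:
    lobby.join(name)
kicked = False
for voter in ["ada", "bo", "cal"]:
    kicked = lobby.vote_kick(voter, "crasher42") or kicked
print(kicked, lobby.join("crasher42"))  # True False: kicked and temporarily locked out
```

The point of the sketch is the governance logic rather than the engineering: the sanction is local to one instance, temporary, and triggered only by the collective judgment of the users present.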
The number of participants combined with near-endless possibilities for interaction means that virtual worlds are complex, dynamic orders. Users can construct their avatars any number of ways. Public worlds and private worlds often lack direct formal governance, as moderators and venue creators may not be present to enforce community guidelines or stated rules. As such, VRChat is home to a variety of collective action problems including “portalling,” malware, crashing, flash mobs, sexual harassment, and invisibility.
Players can “portal” other players by opening player-sized portals (doors to other instances) and pushing or tricking users into entering them, effectively kicking them out of an instance. Malware can take the form of distributed denial of service (DDoS) attacks using bots (which can crash a player), or software programs that steal private avatars. (A DDoS attack leverages multiple systems, such as systems compromised by a botnet, to overwhelm the bandwidth of a targeted system.) VRChat also cautions against malware introduced by downloading and using unofficial copies of the game (Alexander 2018b). Crashers are individuals who use bots, items with high polygon counts (which make them computationally expensive to render), or exploits in VRChat code (glitches or other features of the design used to gain an advantage in a way the designers did not intend) to intentionally freeze users’ computers or boot them from the game. One community member called malfeasant crashers the “cancer of society.” (For a discussion on crashers, see: Chiruchiru (original poster), “People and ‘crashing’ avatars,” VRChat Steam discussion thread, begun August 13, 2018, https://steamcommunity.com/app/438100/discussions/6/1746720717355580203/.)
VRChat is most well-known for its problems with flash mobs. In January 2018, VRChat became known for a flash mob of small echidna avatars known as “Ugandan Knuckles” who would repeat a single phrase phonetically spelled “do you know de way?” (Alexander 2018a). The mobs ruined users’ experience by bombarding them with noise and physically boxing them in. In addition, high-profile female gamers highlighted incidents of harassment in VR—specifically, users who would target female players (Lorenz 2016). Phantoms—characters who enter instances with invisible avatars in order to scare, harass, or listen to the private conversations of other users—remain a problem. Users are particularly concerned with the potential for abuse if moderators use their privileges to enter private instances and remain invisible. Solutions have been proposed, but this issue remains unresolved. (Suggested solutions and a discussion about the issue of invisibility can be found here: Poplopo (original poster), “Invisibility disallowed in private rooms,” VRChat Canny discussion thread, begun January 22, 2018, https://vrchat.canny.io/feature-requests/p/invisibility-disallowed-in-private-rooms.)
The early days of VRChat in particular were characterized by collective action problems that prevented cooperation. Norms likely weren’t pervasive or powerful enough to moderate hackers, crashers, and harassers in 2018, when VRChat spiked in popularity. In effect, the users involved in harassment were unsanctionable because users lacked the mechanisms to govern one another. Solutions to collective action problems emerged over time and are documented in the following section.
Emergent Solutions to Collective Action Problems in VRChat
The interaction of developers and users results in a number of emergent solutions including technical and design decisions, changes in the formal codes of conduct, applications of US public policy, the development of community governance, and informal codes of behavior. The following sections review each of these solutions to collective action problems in more detail.
Negotiated Governance by Technical Design
By virtue of their technical and design decisions, game designers and developers govern the virtual spaces they create. For example, the design decision to put a gun versus a pen in a VR social space will affect how users behave. VRChat has worlds designed to look like courtrooms, presentation halls, and stages that prompt users to do anything from singing or performing stand-up comedy to prosecuting other users in fake trials. Limits on the complexity of avatars and props (measured in polygon count) ensure that users with less-powerful computers don’t have a subpar experience. Technical decisions can also be made by users. For example, in EndgameVR’s public lectures, hosts will ask players to turn off graphic features with high polygon counts (e.g., an avatar’s actively flaming hairpiece) to reduce lagging performance for other users. The technical “coding authority” and the users are in a constant tug of war about what the rules are in practice. “[T]he actual and maddening fluidity of rules has become part of the daily life of those who design and operate synthetic worlds. Every rule they declare, even ones they code into the world as part of its physics, induces reactions by the user community that may subvert or amplify the rule’s effect” (Castronova 2005, 101).
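As a concrete illustration of this kind of governance-by-constraint, the sketch below checks an avatar against a polygon budget at upload time. The budget figure and field names are assumptions made for illustration; they are not VRChat's actual limits.

```python
from dataclasses import dataclass

MAX_POLYGONS = 70_000  # assumed budget for illustration; real limits vary by platform

@dataclass
class Avatar:
    name: str
    polygon_count: int  # number of triangles across the avatar's meshes

def validate_avatar(avatar: Avatar) -> list:
    """Return reasons an upload would be rejected (an empty list means it passes)."""
    problems = []
    if avatar.polygon_count > MAX_POLYGONS:
        problems.append(
            f"{avatar.name}: {avatar.polygon_count:,} polygons exceeds the "
            f"{MAX_POLYGONS:,}-polygon budget and could lag other users' machines"
        )
    return problems

# Usage: a heavyweight avatar is flagged before it can degrade anyone else's experience.
print(validate_avatar(Avatar("flaming-dragon", 250_000)))
print(validate_avatar(Avatar("simple-bust", 12_000)))  # [] -> accepted
```

The rule is enforced in code rather than by a moderator: users do not have to be persuaded to respect the budget, because non-compliant content simply cannot enter the space.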
Through user feedback and design changes, a combination of publicly and privately driven solutions emerges when users and game moderators interact. After the Ugandan Knuckles incident, users asked the game designers and developers to create a mute button and a block option that phases out the sound and avatar of malicious users. The block option remains the most powerful self-governance tool for users to control their environment. Developers also introduced safe mode, which mutes and greys out user avatars in one’s vicinity. (In safe mode, other users’ avatars appear as uniform grey busts, they are muted, and their nametags appear with a grey border.) Apart from updating their community guidelines and marshaling their moderators, game designers and developers made it so that users could pass through other users’ avatars to prevent mobs from pinning other users in place. Developers also implemented a user trust rank system, which changes the color of a user’s name based on how much experience they have and factors like contributing content, flagging bad behavior, or having friends. (For more on the trust ranking system, visit VRChat, “VRChat Safety and Trust System,” last modified November 2019, https://docs.vrchat.com/docs/vrchat-safety-and-trust-system.) The trust rank system and safe mode are examples of the coproduction of rules, since both involved input from users. VRChat’s reaction wasn’t to forcibly break up mobs or prevent free association, but rather to empower users with the tools to moderate their own experience.
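The trust-rank mechanism might be pictured with a sketch like the one below. VRChat does not publish its scoring formula, so the weights, thresholds, tier names, and colors here are illustrative assumptions; the factors loosely follow those named above (experience, contributed content, and friends), plus an assumed penalty for upheld reports against a user.

```python
# Illustrative only: the weights and tiers below are assumptions, not VRChat's formula.
TIERS = [               # (minimum score, tier name, nameplate color)
    (60, "Trusted User", "purple"),
    (35, "Known User", "orange"),
    (15, "User", "green"),
    (5, "New User", "blue"),
    (0, "Visitor", "grey"),
]

def trust_tier(hours_played, content_uploaded, friend_count, reports_against):
    """Map a user's activity to a coarse trust tier under a hypothetical scoring rule."""
    score = (
        0.2 * hours_played        # experience on the platform
        + 4.0 * content_uploaded  # worlds and avatars contributed
        + 0.5 * friend_count      # social ties
        - 10.0 * reports_against  # assumed sanction for reported misbehavior
    )
    for minimum, name, color in TIERS:
        if score >= minimum:
            return name, color
    return TIERS[-1][1], TIERS[-1][2]

# Usage: a long-time creator versus a brand-new account.
print(trust_tier(hours_played=300, content_uploaded=6, friend_count=40, reports_against=1))
print(trust_tier(hours_played=1, content_uploaded=0, friend_count=0, reports_against=0))
```

A visible rank of this kind lets other users and world owners calibrate how much trust to extend before any interaction occurs.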
Formal Codes of Conduct
VRChat’s formal community guidelines are an example of user-specified governance that has arisen over time (VRChat, “Community Guidelines: Rules That Keep You from Getting Banned,” last modified June 7, 2018, https://www.vrchat.com/community-guidelines). The guidelines reiterate that role-playing (acting out the part of a particular character, including violent or predatory personas) is not an excuse for violating the guidelines. Reverse engineering of digital assets (e.g., reproduction of someone else’s avatar without the user’s permission) and digital extraction of owned content (known as “ripping” content) are also prohibited. VRChat’s guidelines also detail proper microphone etiquette and include rules about profanity, sexual conduct, self-promotion, and discrimination.
After public accusations of harassment, VRChat re-envisioned their governance structure. They acknowledge their task in an open letter:
One of the biggest challenges with rapid growth is trying to maintain and shape a community that is fun and safe for everyone. We’re aware there’s a percentage of users that choose to engage in disrespectful or harmful behavior…we’re working on new systems to allow the community to better self-moderate and for our moderation team to be more effective. (VRChat 2018)
The memo detailed where users could provide feedback and ideas to improve VRChat, encouraging users to be actively involved in the rulemaking process. To that end, VRChat maintains a Discord channel (similar to a public chat room with text and voice capabilities) and Canny account. The Discord channel has its own code of conduct in which it asks users to follow the “golden rule” (do unto others as you would want them to do unto you) and bans excessive profanity, advertising, and the discussion of certain topics like piracy or ripped content. (To view VRChat’s Discord channel, you must log in to Discord: https://discordapp.com/channels/189511567539306508/344556143353397263.) On Canny, users suggest features they wish to see in VRChat, and VRChat developers can note which rules they are working on implementing. (To see the Canny platform specific to VRChat, visit “VRChat Roadmap,” accessed January 8, 2020, https://vrchat.canny.io/.) Suggested features include special moderation tools for world creators (such as the ability to ban individuals from certain worlds), haptic feedback (controller vibration) for in-game fist bumps, and the ability to send portals to friends to allow them to join the portal-sender. The formal rules become user-specified as a result of this feedback process.
After the Ugandan Knuckles incident, VRChat empowered their moderators to warn or ban users who abuse community guidelines or VRChat’s terms of service. There are two types of bans: public bans and full bans. Public bans allow moderators to remove users from public or friends+ instances temporarily; those experiencing public bans are taken to a different space to watch a video explaining the nature of public bans. Full bans disable all activity for a specified period of time. Ban recipients receive a message describing the reason for and length of the ban. VRChat sought these technical and formal solutions in the wake of community demand for change following wide-scale incidents of harassment (Alexander 2018a).
Public Policy
When the late John Perry Barlow, a cofounder of the Electronic Frontier Foundation (EFF), published his “Declaration of the Independence of Cyberspace” in 1996, humanity stood on the frontier of an online world devoid of physical borders and open to new emergent codes of conduct (Barlow 1996). He stressed the role of “culture, ethics and unwritten codes” in governing the new social society where the First Amendment served as the law of the virtual land. Barlow saw online communities as sovereign pockets of free association exempt from federal law. To a great extent, Barlow’s vision has come true. The Clinton administration’s 1997 Framework for Global Electronic Commerce carved out space for social and economic exchange on the internet (William J. Clinton and Albert Gore, Jr., A Framework for Global Electronic Commerce, Washington, DC: GPO, 1997).
At the highest level, US law and policies govern VRChat. Copyright laws, digital laws against pirating content, and laws against libel and slander are enforceable in US courts (Hobson 2016). The Children’s Online Privacy Protection Act (COPPA) requires apps and websites to notify parents of data collection from those under age 13 (Federal Trade Commission, Children’s Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312 (1998)). The Computer Fraud and Abuse Act prohibits unauthorized access to accounts. Certain types of content, such as child pornography, are illegal under federal law and expose users to lawsuits. (For Department of Justice guidance, see: US Department of Justice, “Citizen’s Guide to the US Federal Law on Child Pornography,” last modified May 28, 2020, https://www.justice.gov/criminal-ceos/citizens-guide-us-federal-law-child-pornography.)
Intellectual property laws are likely to continue to be a flashpoint in the future of VR experiences (Balkin 2006; Wassom 2014). Publicity rights govern the permissions process for using celebrities’ likenesses in avatars. Trademark and copyright laws set limitations on what words, phrases, symbols, logos, videos, or music can be reproduced in VR and what is considered “fair use.” Defamation of a user’s virtual or real-world identity can harm reputations. VRChat’s terms of service agreement contains a notification about the Digital Millennium Copyright Act (DMCA) for copyright-related complaints. A bad-faith DMCA takedown of an art project known as “Subcom” resulted in a VRChat world being offline for a few days while the claim of unauthorized use of copyrighted content was addressed (Diman 2018). The terms of service also delineate the property rights arrangement between the game designers and developers and the users (VRChat, “Terms of Use and End User License Agreement,” last modified June 7, 2018, https://www.vrchat.com/legal). Second Life was the first notable virtual world to enforce users’ intellectual property rights to their virtual creations through the terms of service (Harvey 2008). In VRChat’s case, users retain the copyright to their user-generated content; however, by posting or uploading that content, they grant VRChat a perpetual license to do whatever it wants with it. If users share their content in a public instance of a world, other users can access and use the content. By encouraging the ownership and creation of user-generated content, VRChat created the incentives for users to self-govern and invest in improving worlds and avatars. The creation of a private world in VRChat grants the creator additional freedoms of speech, association, and assembly, which are constrained in public worlds by community guidelines. A wider range of discourse is permitted in VRChat’s private spaces, where participants consent to discussion (VRChat, “Community Guidelines: Rules That Keep You from Getting Banned,” last modified June 7, 2018, https://www.vrchat.com/community-guidelines). One solution not yet enabled by developers is the ability to zone, geofence (create a virtual perimeter for a specific area that may impose different rules or technical features), or label instances as over-18, not-safe-for-work, invisibility-allowed, or anything-goes zones. Some users have requested this option in Canny, and Balkin (2006) suggests zoning is a mechanism to better allow communities to clarify social boundaries and expectations about rules.
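If zoning of this kind were implemented, instance metadata might look something like the sketch below. Because VRChat has not shipped the feature, the labels, fields, and entry check are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class InstanceRules:
    """Hypothetical rule labels attached to an instance; not a shipped VRChat feature."""
    over_18_only: bool = False
    nsfw_allowed: bool = False
    invisibility_allowed: bool = False
    extra_notices: list = field(default_factory=list)  # e.g., "mute your mic during talks"

def entry_notice(rules: InstanceRules, user_age: int) -> str:
    """Announce the local ruleset at the spawn point, burbclave-style."""
    if rules.over_18_only and user_age < 18:
        return "Entry refused: this instance is labeled 18+."
    lines = [
        f"NSFW content allowed: {'yes' if rules.nsfw_allowed else 'no'}",
        f"Invisible avatars allowed: {'yes' if rules.invisibility_allowed else 'no'}",
    ] + list(rules.extra_notices)
    return "Welcome. Local rules: " + "; ".join(lines)

# Usage: an anything-goes club versus an all-ages lecture hall.
club = InstanceRules(over_18_only=True, nsfw_allowed=True, invisibility_allowed=True)
lecture = InstanceRules(extra_notices=["mute your mic", "line up center stage for questions"])
print(entry_notice(club, user_age=16))
print(entry_notice(lecture, user_age=25))
```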
Not only does public policy inform what users can do in the game, but rulemaking in VR social spaces can also inform real-world policymaking. Virtual worlds can exhibit a form of cyber federalism in which experimentation with and competition between rulesets acts as a test for legal rules (from zoning laws to property rights to taxation) prior to implementing them in the real world (Bradley and Froomkin 2006). Not only can virtual testing of rulesets be more ethical, but VR also lowers the transaction costs to rulemaking experimentation and the formation of different communities as venues for rulesets.
While formal laws and regulations set some constraints on content and behavior in virtual spaces, real-time governance is largely left up to the inhabitants of VR communities. This situation is similar to the development of the Western legal tradition. In the early days of common law, coproduction and consensus of rules were the norm; law, in a sense, was socially devised and dependent (Berman 1985). A bad actor would be ejected from the group, hence the derivation of the term “outlaw” as one standing outside the protection of the group’s law. The same process is playing out in VRChat.
Community Governance and Informal Codes of Behavior
Informal, community-specific codes in virtual social spaces are not a novel phenomenon. In MUDs (i.e., text-based virtual realities in which users type what their characters are doing), users developed informal, in-game codes of conduct. For example, MUD rules included refraining from the following activities: using “reply all” excessively, moving other people’s items, and flooding the server with content (Curtis 2001, 333). Like its online text-based predecessors, VRChat welcomed the formation and emergence of dozens of smaller communities and rulesets based on shared interests or goals.
VRChat communities are groups of individuals who also develop rules by which they will cooperate to address collective action problems. (For a list of VR communities, see Fandom, Inc., “Lore of the Metaverse: Groups,” VRChat Fandom category page, https://vrchat-legends.fandom.com/wiki/Category:Groups.) Community leaders and members develop the informal rules by which individuals manage cooperation and conflict. VRChat communities are characterized by distinct informal codes of behavior. For example, the EndgameVR community regularly hosts lectures on VR phenomena. (The activities of the EndgameVR community can be found on its Facebook page: Endgame: A Talk Show in VRChat, “Home,” Facebook, page created December 4, 2017, https://www.facebook.com/endgamevr/.) During a virtual lecture, a dinosaur, a ghost, several anime characters, and I listened to the host on stage run through a set of ground rules. In order to minimize distraction during their public lectures, the EndgameVR community developed rules to ensure order (e.g., users should mute their mics). Any user found in noncompliance would first be hailed by a community member who would repeatedly jump into their field of vision to get their attention, similar to the way drivers flash their car’s high beams to get the attention of other drivers. If the user still did not comply, they would be kicked out of the virtual space.
Sometimes community rules are literally written on the wall. For example, a world designer who creates a world based on the game freeze tag might present written information at the spawn point (the place where individuals enter the instance) describing the rules of the game. A dance club world’s written guidelines might note that users who harass other users will be removed, or request that users turn off distracting avatar features. One feature suggested by users through Canny that VRChat developers plan to implement is the addition of player affiliations with communities. Like guild systems in video games, players would be able to signal under their name whether they are part of a community and at what level (an administrator, officer, or member). Not only would this lend credibility to existing communities and reputations, it would help players identify with others in their community in public spaces and encourage others to join communities and adhere to community rules. Critics of the change note that it may increase tensions between communities and force individuals to choose a primary community. (For details on this particular feature request and its subsequent discussion, see Tupper (original poster), “Player ‘Groups,’ ‘guilds,’ or ‘circles,’” VRChat Canny discussion thread, begun December 18, 2017, https://vrchat.canny.io/feature-requests/p/player-groups-guilds-or-circles.) This is the latest iteration in the process of negotiated rulemaking between developers and members of the VRChat community.
Reputation also moderates behavior by requiring users to have “skin in the game.” While the trust ranking is one such mechanism, well-known user names, regular players, or users with creative, complex, or memorable avatar builds can symbolize reputation. The actions of reputable players hold greater weight because of the likelihood of repeated interaction with that person and their friend base, which can be leveraged when voting to kick malicious users. Community members call out disreputable users by name on forums such as Steam and Reddit, advising other users to block known crashers and harassers.
Incidents of harassment and other collective action problems in VRChat have led to the emergence of norms to guard against bad behavior. Reputable users sometimes call on their friends and followers to help stop harassment by sanctioning the users who engage in it. For example, when one user had a seizure in a public world, users gave him space and stayed around to make sure he had recovered (Alexander 2018c). When another user started making fun of the victim, others shamed him for his comments. Since the incident, players have requested medical alert buttons to draw the attention of the moderator to medical issues. Although the proposal faltered due to concerns about false reporting, the credible threat of reporting or voting to kick an individual lent power to sanctioning behavior. (A discussion about the efficacy of adding a medical alert button can be found on Reddit. See: TheMotion (original poster), “Someone in VRChat has a seizure while playing, everyone stops what they’re doing to make sure he’s okay,” archived Reddit discussion thread, last modified 2018, https://www.reddit.com/r/videos/comments/7rg6iy/someone_in_vrchat_has_a_seizure_while_playing/dsx0m0k/.)
Discussion: Moderating User Behavior through Polycentric Governance in VRChat
Recognizing that social spaces in VR are dynamic and complex means that a single snapshot of rulesets in virtual worlds is insufficient to understand how social action and cooperation emerge in VR. Instead, the above case studies from VRChat are meant to demonstrate the emergent process of deliberation and coproduction of rules in response to collective action problems. VRChat’s community guidelines are revised and tweaked, reflecting an ongoing quasi-constitutional process that constrains antisocial behavior and promotes cooperation. Rules play a central role in social action. “Humans are other-regarding because we learn to follow rules of conduct” whether formal or informal (Smith and Wilson 2019, 11). Smith and Wilson argue that socially beneficial rule-following develops out of both self-interest and other-directed interests so that we can live in harmony with others (128).
The work of political scientist Elinor Ostrom provides a framework through which to understand the evolution of rules to solve social problems. In her research on governing common resources, she emphasizes the importance of including multiple stakeholders in the governing process, instituting a mechanism for dispute resolution and sanctioning, and making sure the rules and norms that emerge are tailored to the community of users. “Building trust in one another and developing institutional rules that are well matched to the ecological systems being used are of central importance for solving social dilemmas” (Ostrom 2010, 435). Ostrom advanced the concept of polycentricity, wherein multiple, overlapping centers of decision-making lead to a network of competing solutions to collective action problems. Polycentric governance can play a critical role when rights and enforcement are unclear or inconsistently enforced, as with VR instances where direct moderation is absent. Within a polycentric order, different venues can experiment with diverse rules and features, have access to local knowledge, obtain feedback, and learn from the experiences of others’ attempts at rule-setting. Competition between worlds in VRChat allows users to try different rule combinations.
In her Nobel Prize lecture, Ostrom criticizes the oft-made assumption that enlightened policymakers (or developers in this case) should be the ones “to impose an optimal set of rules on individuals involved.” Instead, she argued that the self-reflection and creativity of those local users could serve “to restructure their own patterns of interaction” (Ostrom 2010, 417). The resulting social norms are a form of governance at the most local level. In VRChat, game designers and developers deliberately left space for users to design the rules under which they operate. Self-governance occurs “where actors, who are major users of [the space], are involved over time in making and adapting rules within collective-choice arenas regarding the inclusion or exclusion of participants, appropriation strategies, obligations of participants, monitoring and sanctioning, and conflict resolution” (Ostrom 1999, 2).
As Ostrom notes, rulesets should match the setting in which they are being used. VRChat’s community guidelines do just that, exemplifying the coproduction and emergence of context-specific rules over time. (To see VRChat’s community guidelines, please visit VRChat, “Community Guidelines: Rules That Keep You from Getting Banned,” last modified June 7, 2018, https://www.vrchat.com/community-guidelines.) Community guidelines stipulate the rules of interaction in public spaces on the platform. While game designers and developers can create de jure codes of conduct, the de facto reality of how users respond may be different from the developers’ intent. For example, in VRChat, the use of derogatory terms as usernames is banned, but people may come up with creative misspellings to get around the ban. Nudity is also prohibited; however, users may understand nudity for non-human characters in different ways (is a gorilla avatar without clothes naked?). In open-ended VR social spaces such as VRChat, multiple layers of informal and formal rules, different venues, and moderation tools allow cooperation to emerge.
VRChat’s official Canny and Discord forums facilitate user input and reflect features of committee decision-making. Committees have explicit or implicit rules and procedures that foster deliberation (e.g., upvoting), as well as rules governing user names and commenting procedure on Canny and Discord. Tullock (2005) identifies multiple roles for committees in an organizational hierarchy: committees foster decision-making; they identify areas of consensus; and they generate ideas and rank them, narrowing down the option set for a superior to choose from. In short, committees result in “joint production of new ideas and winnowing” (Tullock 2005, 311). For clubs and voluntary associations in physical space, rules (notably motions, amendments, and quorums) often originate from customs captured in Robert’s Rules of Order, first published in 1876, which specified parliamentary procedures for meetings (Robert, Honemann, and Balch 2011). The idea of a parliamentary procedure conjures an image of political fanfare, with officials in wigs, suits, or robes laying out romantic arguments about justice and prosperity. Yet parliamentary procedures are commonplace in the digital space; they merely take a different form. Today’s quasi-committees take place on Reddit, Canny, and Discord. Rules for engagement on these platforms are posted publicly and forbid hate speech, doxing (deliberate distribution of someone’s personal information), discussion of how to rip content, and excessive profanity or self-promotion by streamers or content creators. (For an example of such rules for engagement, see Nukemarine, “VRChat: General Rules,” Reddit, last modified March 2020, https://www.reddit.com/r/VRchat/wiki/rules.) In public instances, rulesets are more the outcome of live interaction and negotiation than of deliberate choice. Rules are politics in action and represent a compromise among players (Grimmelmann 2006). The resulting frictions and tensions reflect the messy reality of solving collective action problems.
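Tullock’s “winnowing” function can be pictured as a simple ranking of proposals by community votes, with closed items filtered out before a shortlist reaches the developers. The sketch below is a generic illustration of that committee logic; the vote counts are invented, and it does not reflect Canny’s actual data model or scoring.

```python
# Generic illustration of committee-style winnowing; vote counts are invented.
feature_requests = [
    {"title": "World-creator ban tools", "upvotes": 412, "status": "planned"},
    {"title": "Haptic fist bumps", "upvotes": 95, "status": "open"},
    {"title": "Send portals to friends", "upvotes": 230, "status": "open"},
    {"title": "Medical alert button", "upvotes": 61, "status": "closed"},
]

def winnow(requests, top_n=3):
    """Rank live proposals by votes, producing a shortlist for the developers to consider."""
    live = [r for r in requests if r["status"] != "closed"]
    return sorted(live, key=lambda r: r["upvotes"], reverse=True)[:top_n]

for request in winnow(feature_requests):
    print(f"{request['upvotes']:>4}  {request['title']}")
```

The community generates and ranks the ideas; the developers, like Tullock’s superior, choose from the narrowed set.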
The success of EndgameVR in 2018 shows how the formation of communities has a moderating effect on behavior through the enforcement of social norms. Such voluntary private group formation parallels de Tocqueville’s observations of emergent association in the United States (de Tocqueville [1835] 2010). When permitted to form groups, people organize into social, political, and economic entities to address collective action problems, among other functions. In an online community, the “best government is going to be each other, because the man behind the curtain isn’t going to know you any more than you know him,” and this local knowledge allows communities to develop rules and norms tailored to the community’s needs (Crawford 2006, 210).
Social norms and reputational mechanisms are a critical source of governance. As social theorist Jon Elster recognizes, “the strategically rational actor takes account of the fact that the environment is made up of other actors, and that he is part of their environment and that they know this, etc.” (Elster 1979, 18). Individuals act within a social system and must come to terms with the social costs and benefits of their actions. In doing so, individuals see that acting in one’s self-interest means being considerate and conscientious. According to Elster, “altruism, envy, social norms and self-interest all contribute, in complex, interacting ways to order, stability and cooperation […] Each society and each community will be glued together, for better and for worse, by a particular, idiosyncratic mix of these motivations” (Elster 1989, 287). Elster recognized that cooperation doesn’t just depend on a rational understanding of legal and economic constraints; social action also derives from shame, honor, opportunism, envy, and spite.
Norms coordinate expectations and evolve to ensure predictability in new or complex settings (Elster 1989, 97). According to Elster, what makes norms social “is that other people are important for enforcing them by expressing their approval and, especially, disapproval” (1989, 99). To accuse someone in VRChat of having been involved in the Ugandan Knuckles mob behavior is to accuse them of being antisocial, cruel, and immature. Social norms are the grammar of society—the common practices that hold societies together (Bicchieri 2006). Importantly, norms are not the product of deliberate design but instead emerge within the social fabric alongside language and culture (Bicchieri 2006). Social norms specify what is acceptable behavior in a group. In the case of the EndgameVR lectures, users engage in the convention of standing in line in the center of the room to ask questions. Social sanctioning plays a role in constraining bad behavior throughout the process—users can be ejected from the space for failing to mute their headset or for making lewd gestures. The ability to banish violators from a community or world is key to enforcing social norms.
The political sphere, the economy, and society are not independent systems. These domains are entangled across virtual and real-world spaces, and developments in each sphere influence the others. Yet long-lasting and successful rule changes are produced from within a society, not imposed from without (Boettke and Fink 2011, 499). By contrast, rule changes that occur without the cultural buy-in of the people impacted may not last. This is why forums such as Canny and Discord that allow users and developers to communicate are critical to VRChat’s continued success as a platform.
Not all problems within VRChat have been resolved. Continuing challenges to social cooperation include differences in the level of investment of players, conflicting motivations for behavior, language barriers, demographics, and barriers to adoption of the technology necessary to participate in virtual social spaces. Community leaders and committed users take norms and sanctioning more seriously than users seeking to use the platform purely for their own entertainment rather than socialization. According to Au (2019b), VRChat’s gender distribution is 81.5 percent male and 18.5 percent female, with 75 percent of users under the age of 34. This means social dynamics are likely different from dynamics in a more diverse populace in the real world. The majority of laptops don’t meet the minimum hardware requirements for processors and graphics cards that VRChat requires; furthermore, VRChat is not available on Mac operating systems. (For VRChat system requirements, see Aev, “System Requirements,” VRChat, last modified May 10, 2020, https://help.vrchat.com/kb/article/19-system-requirements/.) There are also barriers to the emergence of solutions. The lack of continuity between users day-to-day means that norms have to be constantly learned by newcomers and reinforced by existing users. Moreover, the process of finding solutions is not always fast or easy. Debates on how to manage moderation in private worlds, enforce formal codes of conduct, mitigate the abuse of invisibility privileges, and deal with health emergencies and persistent toxic behavior are still in process. Undue suffering can occur in the meantime. Nonetheless, even in the face of such challenges, developers and users are engaging in the process of joint rulemaking.
Conclusion
The VRChat case studies show that a polycentric order of rules supporting cooperation emerged through the interactions of users and developers. VRChat employees, in conjunction with users, altered social circumstances through technical means, formal codes of conduct, applications of US public policy, community governance, and informal codes of behavior. VRChat’s current emphasis on improving the ability of users to self-moderate while using the platform is crucial to the long-term success of the social environment. Public policy and enforceable, formal codes of conduct are two institutional layers in a complex, evolving polycentric order that governs the VRChat experience. Ostrom’s framework for institutional analysis and design reveals the value over time of bottom-up governance that involves user-specified rules and mechanisms, and coproduction of those rules from multiple stakeholders. (Adjacent research areas not explored in this paper can address the following questions: What is the best way for virtual reality platforms to disclose formal rules? How much rulemaking deliberation should occur on 2D online platforms versus within the virtual worlds themselves?) When users are allowed to have a say in rule creation, communities can be oases of civility that flexibly adapt to support the needs of individuals that operate within them. Communities can also serve as competing venues for experimentation in governance solutions. The formation of fluid social groups, such as the EndgameVR community, contributed to governance and the creation of social norms that users carried with them as they traversed worlds.
Wherever we go, we bring the conflict inherent in collective action problems as well as our ability to muddle through them. As the Chinese science fiction author Cixin Liu writes, “I cannot escape and leave behind reality, just like I cannot leave behind my shadow. Reality brands each of us with its indelible mark. Every era puts invisible shackles on those who have lived through it, and I can only dance in my chains” (Liu 2014, afterword). Liu’s observation reflects two insights germane to this conversation on governance in virtual reality.36 Liu’s observation also raises questions worthy of more extensive exploration. For example, what happens to the economy, politics, and society when individuals choose to spend a majority of their time in social spaces? For an imaginative exploration of the effects of large-scale adoption of virtual existence, see the science fiction short story by Kurt Vonnegut, “Unready to Wear,” Galaxy Science Fiction, April 1953, 98–111, available at the Internet Archive, https://archive.org/stream/galaxymagazine-1953-04/Galaxy_1953_04#page/n99/mode/2up. First, our experiences, norms, and cultures in physical space are inevitably transported into the virtual spaces we inhabit. Second, if “reality” can be anywhere individuals find meaning, virtual social experiences can be as legitimate as physical ones. As Castronova recognizes, “the conceptual step of assuming that computer-generated content has less actuality, less genuineness than content from the ‘real world,’ was a mistake…one that has not helped the research paradigm and was, in a deep way, arrogant” (2005, 288). VR is increasingly where users find meaning by merging their external identities with their in-game personas. Thus, virtual worlds are not diversions from reality but real places in their own right. In these worlds, community helps create rulesets and a sense of reality, making the game more experiential (Castronova 2005, 292).
Quarrels and antagonism are part of the human condition. And yet, individuals within societies continually engage in the internally generated creation of rules and relationships. In fact, “the single most important action a world designer can take to improve a virtual world is to increase the bandwidth of social interaction” and encourage group formation, because the value of online platforms scales exponentially with the size of the social network (Crawford 2006, 207–8). The experiences of users and developers in VRChat demonstrate that expanding users’ freedom to engage in rulemaking does not lead to intractable, permanent problems of undesirable or antisocial conduct. The goal of game developers and others involved in social VR should be to foster platforms for the coproduction of rules and to allow multiple, competing informal and formal rulesets to bring out the best in humanity.
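To make the scaling claim concrete, consider the standard heuristics from the network-economics literature for how a platform’s value $V$ grows with its number of users $n$. These formulas are illustrative conventions only; they are not drawn from Crawford (2006) or from VRChat’s own data:

\[
V_{\text{broadcast}} \propto n, \qquad
V_{\text{pairwise}} \propto \binom{n}{2} \approx \frac{n^{2}}{2}, \qquad
V_{\text{groups}} \propto 2^{n} - n - 1.
\]

The first (Sarnoff) counts one-to-many reach, the second (Metcalfe) counts possible pairwise connections, and the third (Reed) counts possible subgroups. Only the last grows exponentially, which is the sense in which encouraging group formation, rather than simply adding users, multiplies the value of a social platform.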
References
Alexander, Julia. 2018a. “Ugandan Knuckles Is Overtaking VRChat: VRChat’s New Favorite Joke Is Problematic.” Polygon. January 8, 2018. https://www.polygon.com/2018/1/8/16863932/ugandan-knuckles-meme-vrchat.
———. 2018b. “VRChat Threatens Permanent Bans for Players in the Wake of Security Concerns.” Polygon. January 19, 2018. https://www.polygon.com/2018/1/19/16909324/vrchat-security-players-banned-steam.
———. 2018c. “After VRChat User Suffers Seizure, Exec Says There Are Ways to Help.” Polygon. January 19, 2018. https://www.polygon.com/2018/1/19/16911524/vrchat-seizure-mod-youtube-rogue-shadow.
Au, Wagner James. 2019a. “30% of VRChat’s Daily Users Wear HMDs.” New World Notes (blog). February 1, 2019. https://nwn.blogs.com/nwn/2019/02/vrchat-users-hours-stats.html.
———. 2019b. “VRChat Site User Demographics: 430,000 Uniques, Mostly Male and Over 25.” New World Notes (blog). May 23, 2019. https://nwn.blogs.com/nwn/2019/05/vrchat-user-numbers-demographics-social-vr.html.
Balkin, Jack M. 2006. “Law and Liberty in Virtual Worlds.” In The State of Play: Law, Games, and Virtual Worlds, edited by Jack M. Balkin and Beth Simone Noveck. New York: New York University Press.
Barlow, John Perry. 1996. “A Declaration of the Independence of Cyberspace.” Electronic Frontier Foundation. February 8, 1996. https://www.eff.org/cyberspace-independence.
Berman, Harold. 1985. Law and Revolution: The Formation of the Western Legal Tradition. Cambridge, MA: Harvard University Press.
Bicchieri, Cristina. 2006. The Grammar of Society: The Nature and Dynamics of Social Norms. Cambridge, UK: Cambridge University Press.
Boettke, Peter J., and Alexander Fink. 2011. “Institutions First.” Journal of Institutional Economics 7, no. 4 (December): 499–504.
Bradley, Caroline, and Michael A. Froomkin. 2006. “Virtual Worlds, Real Rules: Using Virtual Worlds to Test Legal Rules.” In The State of Play: Law, Games, and Virtual Worlds, edited by Jack M. Balkin and Beth Simone Noveck. New York: New York University Press.
Castronova, Edward. 2005. Synthetic Worlds: The Business and Culture of Online Games. Chicago: University of Chicago Press.
Cavender, Robert S. 2015. “The Economics of Self-Governance in Online Virtual Societies.” PhD dissertation, George Mason University. ProQuest (UMI 3706883).
Cline, Ernest. 2012. Ready Player One: A Novel. New York: Ballantine Books.
Crawford, Susan. 2006. “Who’s in Charge of Who I Am? Identity and Law Online.” In The State of Play: Law, Games, and Virtual Worlds, edited by Jack M. Balkin and Beth Simone Noveck. New York: New York University Press.
Curtis, Pavel. 2001. “Mudding: Social Phenomena in Text-Based Virtual Realities.” In Multimedia: From Wagner to Virtual Reality, edited by Randall Packer and Ken Jordan. New York: W. W. Norton & Company.
De Filippi, Primavera, Seth Frey, Nathan Schneider, Joshua Tan, and Amy Zhang. 2020. “Modular Politics: Toward a Governance Layer for Online Communities.” Working paper.
De Tocqueville, Alexis. (1835) 2010. Democracy in America: Historical-Critical Edition, vol. 3. Edited by Eduardo Nolla. Translated by James T. Schleifer. Indianapolis: Liberty Fund.
Diman, Pan. 2018. “‘Can We Copystrike VRChat?’: DMCA Claim against User Gets Resolved.” Virtual Week-ality. August 18, 2018. https://sites.google.com/view/virtual-week-ality/august-18th-2018/dmca-claim-against-user-gets-resolved.
Dumont, Georgette, and George Candler. 2005. “Virtual Jungles: Survival, Accountability, and Governance in Online Communities.” American Review of Public Administration 35, no. 3 (September): 287–99.
Elster, Jon. 1979. Ulysses and the Sirens: Studies in Rationality and Irrationality. New York: Cambridge University Press.
———. 1989. The Cement of Society: A Study of Social Order. New York: Cambridge University Press.
Grimmelmann, James. 2003. “The State of Play: On the Second Life Tax Revolt.” September 21, 2003. https://mail.dworkin.nl/pipermail/mud-dev-archive/2003-September/028336.html.
———. 2006. “Virtual Power Politics.” In The State of Play: Law, Games, and Virtual Worlds, edited by Jack M. Balkin and Beth Simone Noveck. New York: New York University Press.
Harvey, Ian. 2008. “Virtual Worlds Generate Real Litigation.” Law Times. February 11, 2008. https://www.lawtimesnews.com/news/general/virtual-worlds-generate-real-litigation/259438.
Hindmarsh, Jon, Christian Heath, and Mike Fraser. 2006. “(Im)materiality, Virtual Reality and Interaction: Grounding the ‘Virtual’ in Studies of Technology in Action.” Sociological Review 54, no. 4 (November): 795–817.
Hobson, Anne. 2016. “Reality Check: The Regulatory Landscape for Virtual and Augmented Reality.” R Street Institute Policy Study No. 69, R Street Institute, Washington, DC, September 2016.
Koetsier, John. 2018. “VR Needs More Social: 77% of Virtual Reality Users Want More Social Engagement.” Forbes. April 30, 2018. https://www.forbes.com/sites/johnkoetsier/2018/04/30/virtual-reality-77-of-vr-users-want-more-social-engagement-67-use-weekly-28-use-daily/#54ae2b5818fc.
Lastowka, F. Gregory, and Dan Hunter. 2006. “Virtual Worlds: A Primer.” In The State of Play: Law, Games, and Virtual Worlds, edited by Jack M. Balkin and Beth Simone Noveck. New York: New York University Press.
Liu, Cixin. 2014. The Three-Body Problem. Translated by Ken Liu. New York: Tor Books.
Lorenz, Taylor. 2016. “Virtual Reality Is Full of Harassers. Here’s Why I Keep Going Back.” Mic. May 26, 2016. https://www.mic.com/articles/144470/sexual-harassment-in-virtual-reality.
Morningstar, Chip, and F. Randall Farmer. 1990. “The Lessons of Lucasfilm’s Habitat.” In Cyberspace: First Steps, edited by Michael Benedikt. Cambridge, MA: MIT Press.
Morrell, Mayo Fuster. 2014. “Governance of Online Creation Communities for the Building of Digital Commons: Viewed through the Framework of Institutional Analysis and Development.” In Governing Knowledge Commons, edited by Brett M. Frischmann, Michael J. Madison, and Katherine J. Strandburg. New York: Oxford University Press.
Ostrom, Elinor. 1996. “Crossing the Great Divide: Coproduction, Synergy, and Development.” World Development 24, no. 6 (June): 1073–87.
———. 1999. “Self-Governance and Forest Resources.” CIFOR Occasional Paper No. 20, Center for International Forestry Research, Bogor, Indonesia.
———. 2010. “Beyond Markets and States: Polycentric Governance of Complex Economic Systems.” American Economic Review 100, no. 3 (June): 641–72.
Picon, Antoine. 2004. “Architecture and the Virtual: Towards a New Materiality?” PRAXIS: Journal of Writing + Building, no. 6, 141–21.
Post, David. 2001. “Anarchy, State, and the Internet: An Essay on Lawmaking in Cyberspace.” In Crypto Anarchy, Cyberstates, and Pirate Utopias, edited by Peter Ludlow. Cambridge, MA: MIT Press.
Robert, Henry M. III, Daniel H. Honemann, and Thomas J. Balch. 2011. Robert’s Rules of Order Newly Revised, 11th edition. Cambridge, MA: Da Capo Press.
Schroeder, Ralph. 1996. Possible Worlds: The Social Dynamic of Virtual Reality Technology. Boulder, CO: Westview Press.
———. 2008. “Defining Virtual Worlds and Virtual Environments.” Journal of Virtual Worlds Research 1, no. 1 (July): 1–3.
Smith, Vernon L., and Bart J. Wilson. 2019. Humanomics: Moral Sentiments and the Wealth of Nations for the Twenty-First Century. Cambridge, UK: Cambridge University Press.
Stephenson, Neal. 1992. Snow Crash. New York: Bantam Books.
Stivale, Charles J. 2001. “‘Help Manners’: Cyberdemocracy and Its Vicissitudes.” In Crypto Anarchy, Cyberstates, and Pirate Utopias, edited by Peter Ludlow. Cambridge, MA: MIT Press.
Tullock, Gordon. 2005. Bureaucracy. Edited by Charles K. Rowley. Indianapolis: Liberty Fund.
VRChat. 2018. “An Open Letter to Our Community.” Medium. January 9, 2018. https://medium.com/@vrchat/an-open-letter-to-our-community-1b7aa5d9026f.
Wassom, Brian. 2014. Augmented Reality Law, Privacy, and Ethics: Law, Society, and Emerging AR Technologies. Rockland, MA: Syngress Publishing.