key: cord-0113482-r84vv5hy authors: Bajpai, Tanvi; Asher, Drshika; Goswami, Anwesa; Chandrasekharan, Eshwar title: Harmonizing the Cacophony with MIC: An Affordance-aware Framework for Platform Moderation date: 2021-07-19 journal: nan DOI: nan sha: e9ab330aa79cf387b6060361f857ea16aa63e614 doc_id: 113482 cord_uid: r84vv5hy Social platforms are evolving at a rapid pace. With the addition of new features like real-time audio, the landscape of online communities and moderation work on these communities is being out-paced by platform development. In this paper, we present a novel framework that allows us to represent the dynamic moderation ecosystems of social platforms using a base set of 12 platform-level affordances, along with inter-affordance relationships. These affordances fall into three categories -- Members, Infrastructure, and Content. We call this the MIC framework, and apply MIC to analyze several social platforms in two case studies. First, we analyze individual platforms using MIC and demonstrate how MIC can be used to examine the effects of platform changes on the moderation ecosystem and identify potential new challenges in moderation. Next, we systematically compare three platforms using MIC and propose potential moderation mechanisms that platforms can adapt from one another. Moderation researchers and platform designers can use such comparisons to uncover where platforms can emulate established, successful, and better-studied platforms, as well as learn from the pitfalls other platforms have encountered. The moderation of online communities has been the focus of a large body of social computing research [11, 15, 21-26, 37, 39-41, 52]. Much of this research is unified by the use of Grimmelmann's taxonomy of moderation [17], which provides general terminology and strategies for moderating online communities. For instance, Grimmelmann broadly characterizes an online community using its three features: the community's members, the content that is shared among the members, and the infrastructure used to share it. Similarly, Grimmelmann's four techniques for moderation -- exclusion, pricing, organizing, and norm-setting -- are all defined in a way that is general enough for them to be applied to a variety of diverse communities and technologies. The generality of Grimmelmann's taxonomy is unequivocally useful for unifying moderation research. However, it is also true that the moderation of online communities is largely limited and enabled by the characteristics of the Social Networking Sites (SNSs), or social platforms, they use. As platforms are created and updated, so too are the moderation strategies, needs, and challenges of the online communities that use them. As such, more and more recent moderation research is centered around particular platforms (e.g., [22, 39]). However, the landscape of online communities and moderation work on these communities is being rapidly out-paced by the development of platforms. An example of this can be seen in the recent rise in popularity of audio-based social platforms: In March of 2020, the global COVID-19 pandemic forced people to self-isolate, work from home, and limit in-person interactions altogether; this allowed a new social platform called Clubhouse to surge into the mainstream [42]. Clubhouse's subsequent popularity was accompanied by the introduction of other audio-focused platforms and extensions to existing platforms [36], as shown in Figure 1.
Twitter launched an audio group-chat feature called Twitter Spaces in May 2021 [44]. Facebook announced the development of a similar feature in the Spring of 2021, with plans to launch sometime in the Summer of 2021 [33]. Spotify acquired the parent company of an audio-only, sports-centered app called Locker Room in March 2021 [4, 45], and re-branded and re-launched it as Spotify Greenroom two months later [9]. Sonar, an alternative voice-chatting app, launched in January 2021 [43]. Other popular platforms such as Reddit [35], Telegram [53], Slack [38], and Discord [3] quickly followed suit and launched their own Clubhouse-esque features to support audio. Similar to the development of any new social technology, questions about moderating such platforms continue to be of particular interest to the Computer-Supported Cooperative Work and Social Computing (CSCW) research community. We identify three key challenges that researchers face when studying moderation on this landscape of dynamically evolving social platforms. First, it may be tempting to choose one or two representative platforms to investigate in order to develop new insights into their moderation. However, in reality, these platforms are diverse in ways that affect moderation. For instance, Clubhouse is largely audio-only, while Spotify Greenroom allows users to enable a text-based chat box in their live audio rooms. Second, many of the new platforms or features might appear to be novel or unstudied, when they are in reality subtly reminiscent of older and more-established technologies. Spotify Greenroom's live chat box is similar to those that accompany live video streams on Twitch, 1 while Sonar's world-building concept resembles classic virtual world-building games such as Minecraft. 2 Finally, these platforms are rapidly evolving and adding features that impact moderation. Thus, research done on a platform might seem outdated or impractical by the time it gets published. For instance, Clubhouse added new text-based messaging features in the time between the submission of this manuscript and the release of its revisions. To address these challenges, and better enable the moderation research community to keep up with rapid platform development, we develop a new theoretical framework for representing the moderation ecosystems of social platforms. Our framework can benefit platform designers and online community owners by enabling them to identify potential moderation challenges they may face on a platform, as well as design moderation solutions to address them. In this paper, we present a novel theoretical framework that allows us to represent the moderation ecosystems of social platforms. By moderation ecosystem, we mean the physical attributes of a social platform that impact moderation. Our representation uses a base set of relevant platform-level affordances. These affordances fall into three categories that are derived from Grimmelmann's [17] definition of an online community: Members, Infrastructure, and Content. As such, we call our framework MIC. As is the case with any ecosystem, these moderation-related affordances likely impact each other. To represent this, we have also included in MIC a notion of inter-affordance relationships.
The MIC framework has key implications for moderation researchers, platform designers, and online community owners. Broadly, we argue that the advantages of using the MIC framework are three-fold: (1) The affordances and inter-affordance relationships in MIC provide a simple and explicit representation of potentially complex or subtle moderation ecosystems of social platforms. These components will also provide moderation researchers and community owners a convenient "checklist" to aid them in exploring and considering platforms to understand how moderation occurs on them. (2) MIC can be used to compare and contrast platforms' moderation ecosystems. Online community owners can use these comparisons to help decide which platforms would be more conducive to the moderation needs of their communities. Moderation researchers and platform designers can use these comparisons to uncover where platforms can adapt and learn from more established and better-studied platforms, as well as learn from the pitfalls these platforms have encountered. (3) MIC's representation of a platform's moderation ecosystem can be easily updated to reflect platform changes. Inter-affordance relationships can also be examined to catch potential moderation issues that new features could cause. This will make it easier for moderation researchers, platform designers, and online community owners to update their understanding of platforms, and re-evaluate and potentially update moderation strategies and tools that might be impacted by platform changes. To support the above claims, we will use MIC to analyze several social platforms in two case studies. Our first case study focuses on analyzing an individual platform using MIC, and shows how MIC can easily reflect platform changes as well as propagate such changes throughout the moderation ecosystem to account for potential new moderation challenges. In the second case study, we use MIC to systematically compare three platforms and use these MIC-based comparisons to propose potential moderation mechanisms that platforms can adapt from one another. Before detailing our framework, we introduce the platform affordances that we account for in MIC and review related work that motivated each of these affordances. First, we describe the high-level organization of these affordances, which was inspired by Grimmelmann's work [17]. Grimmelmann defines an online community using three features: the community's members, the content that is shared among the members, and the infrastructure used to share it [17]. We use these features to motivate the three main categories of affordances that we include in the MIC framework. Now we discuss how each of these categories impacts the four basic techniques for moderation listed by Grimmelmann. Exclusion is the act of excluding problematic or unwanted members from the community. Another closely related technique is pricing, which controls the participation of community members by introducing barriers to entry. Both exclusion and pricing are mediated by the infrastructure and members of the community: infrastructure provides the tools for exclusion or pricing, while members are involved in using these tools. Organizing is a technique that involves "shaping the flow of content from authors." This technique is closely tied to the nature of content within the community. It is also tied to infrastructure and the type of "shaping" capabilities that are provided to the members of the community.
Finally, the fourth technique listed by Grimmelmann is norm-setting, which involves the creation and articulation of community norms to establish the types of behavior that are acceptable within the community. Norm-setting can be done through the other techniques, and is therefore impacted by all three categories of community features and affordances. Next, we discuss each category of affordances included in our framework and review related work examining these affordances, with a particular emphasis on research related to moderation. The first member-related component in our framework is user roles. Through interviews with volunteer moderators of Discord servers, Jiang et al. [22] found that server owners create custom user roles to distinguish between various user types. The moderator role is a common facet of online communities and a role that is often assumed by volunteers on platforms relying on distributed moderation [15, 22, 41, 54]. The second member-related component in our framework is anonymity. Schlesinger et al. [39] studied how anonymity affects content on Yik Yak, a social media application that allowed users to make anonymous text posts that are grouped by location [39]. In general, anonymity has been found to have both positive and negative effects on social interactions [13]. Outside the context of online social spaces, anonymity was found to remove status markers that prevent members from participating in discussions on collaborative systems [18, 30, 51]. Prior work examining the role of anonymous voice-based interactions in online games found that in some cases anonymity was lost due to the nature of voice-based communication, and this caused some players to feel uncomfortable [50]. In fact, this loss of anonymity was deemed one of the main reasons behind gamers abandoning the game being studied. One of the main infrastructural affordances we consider is a platform's organization, i.e., how content and communities of the platform are situated. On Twitch, text-chats are associated with specific live streams, and live streams are separated by different Twitch channels; different channels have different moderators. In certain cases, the lack of certain organizational structures within platforms might force community members to use other platforms to overcome these deficiencies. This might lead to various inter-platform relationships, which can be seen in prior work studying how moderators of Reddit communities use both Reddit and Discord to host their communities and the resulting challenges moderators have to tackle in doing so [23]. Other integral parts of the infrastructure of audio-based social platforms (ABSPs) include the rules and guidelines of platforms and the communities they host. Prior work has examined the rules that moderators of both Reddit and Discord outline for their communities, as well as guidelines specified by the platform itself [22, 23]. Rules and guidelines, both community-defined and platform-specified, often describe the different roles members can play within the community (e.g., both Discord and Reddit have pages dedicated to defining what the role of a moderator entails). Rules and guidelines have also been shown to shape community norms [14, 24, 48]. Platforms also have different badges and markers, such as emojis to react or up- and down-vote content. In the context of audio-based social platforms, markers can provide relevant cues to indicate whether a user wishes to speak or not (a challenge that is often characteristic of video-based or voice-based communication [19, 32]).
Our infrastructural affordances include moderation mechanisms, i.e., the infrastructure that a platform provides specifically for moderation. Reddit has automated moderation tools, as well as an API that allows moderators to create moderation tools and bots that help human moderators review large volumes of content. Discord has similar tools for moderators, some of which have been found to cause unprecedented moderation issues [22]. Prior work has explored how volunteer moderators employ a variety of mechanisms for moderating content, and moderation typically involves a large amount of time and effort to keep up with the massive amounts of content generated within social platforms [23, 29]. As a result, automated and human-machine collaboration tools are being developed to assist moderators on text-based platforms like Reddit [10, 20]. Video-hosting platforms like YouTube use algorithmic moderation that allows them to have a larger moderation purview without burdening human moderators [16, 37]. Finally, platforms that have mechanisms allowing for monetization may face novel moderation problems, since monetization has been found to lead to controversial behavior online to achieve virality [7], and algorithmic moderation tools can negatively impact users who rely on the monetization of their content [31]. Our framework considers the various modalities platforms can support. As discussed in the previous subsections, the modality of content plays a role in how the content is viewed, organized, and moderated. Much of the communication that occurs in the audio-based social platforms discussed previously occurs in real-time. This has always been the case with voice-communication over telephone, and it is a common theme of the audio-based communication that occurs in group voice-chats for gaming [5, 46, 50]. Ackerman et al. [5] studied how users viewed and used Thunderwire, a collaborative audio-only real-time communication system modeled after telephone "party lines" of the late 19th century. Wadley et al. [50] studied real-time audio-communication in online multiplayer games and virtual worlds during game play. There has also been research on voice-based communities in India that use asynchronous audio for communication [34, 49]. From these works, it is clear that the synchronicity of audio content is a defining characteristic of audio-based social platforms and affects moderation capabilities. Ephemerality is often, but not always, a consequence of synchronous or real-time content. Both communities studied by Ackerman et al. [5] and Wadley et al. [50] used ephemeral content. Prior work on ephemerality in social platforms has largely focused on the ephemerality of text posts, links, or images [6, 39, 55]. Jiang et al. [22] studied the challenges of moderating voice on Discord and found that the ephemerality of audio-based content was a large factor contributing to the challenges that moderators face. Finally, social platforms can allow for certain access and restrictions imposed on either viewing or creating content. In the past, subreddit moderators have purposely restricted access to their content as a way to express dissatisfaction with certain platform changes [28]. Similarly, restrictions and access have been used to subdue antisocial behavior, though the efficacy of doing so is largely unclear [47]. In this section, we formally define MIC through its components: platform affordances and the relationships between them. Affordances are properties of platforms that play a role in moderation.
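To keep the twelve affordances and their tags easy to reference as they are introduced below, the following is a minimal sketch of the base set as data, using Python purely as illustrative pseudocode. The tag names match those used in the text (modalities, access, and so on); the category grouping reflects our reading of the discussion in this section and is not a formal part of the framework.

```python
from enum import Enum

class Category(Enum):
    MEMBERS = "Members"
    INFRASTRUCTURE = "Infrastructure"
    CONTENT = "Content"

# The base set of 12 MIC affordances, keyed by the tag used in the text.
AFFORDANCES = {
    # Members
    "users": Category.MEMBERS,              # user types and roles
    "anonymity": Category.MEMBERS,
    # Infrastructure
    "organization": Category.INFRASTRUCTURE,
    "rules": Category.INFRASTRUCTURE,       # rules and guidelines
    "badges": Category.INFRASTRUCTURE,      # badges and markers
    "inter-platform": Category.INFRASTRUCTURE,
    "mechanisms": Category.INFRASTRUCTURE,  # moderation mechanisms
    # Content
    "modalities": Category.CONTENT,
    "access": Category.CONTENT,             # access and restrictions
    "monetization": Category.CONTENT,
    "synchronicity": Category.CONTENT,
    "ephemerality": Category.CONTENT,
}
```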
We identified these affordances through observations provided by the first author, as well as some prior work. We will also construct MIC diagrams for Spotify (Figure 2) and Discord (Figure 3) using the framework. High-level descriptions of these platforms are provided below. Discord. A messaging platform that allows users to communicate via text, voice, or video. Discord's infrastructure is composed of "servers," which can be thought of as landing pages for individual communities that use the platform. Servers can contain topic-specific text-channels or voice/video channels. Server owners can create custom roles for server members, and can associate specific permissions with each role. Spotify. An audio-streaming service that hosts both music and podcasts. The main two types of Spotify users are listeners (those who use the service to stream content) and creators (those who use the service to upload content). Listeners are able to follow both creators and other listeners, and can view the latter's playlists and listening history. Creators must use other Spotify services, such as Spotify For Artists 3 for musicians and Anchor 4 for podcasters. SoundCloud. A music-sharing website that allows all users to post audio (which consists of music, podcasts, random noises, etc.). Users are able to comment on audio files and re-post others' audio posts on to their feed. We present twelve affordances that can be used to represent social platforms in the MIC framework. For each affordance, we provide a general description and identify variations of each affordance through our working examples. We will also discuss how these affordances may play a role in moderation on platforms. Modalities (modalities). Platforms that are centered around one type of modality are considered unimodal. Platforms that support multiple types of modalities are considered multimodal. Discord is multimodal since servers contain text- and voice/video-channels. Spotify is unimodal since audio is the primary type of content supported by the platform. The existence of multiple modalities will affect moderation on the platform, since having more than one modality typically requires a broader set of policies and tools for moderation [22, 23, 29]. Access and Restrictions (access). Platforms often have various access and permission settings that allow or prohibit content from being posted, viewed, or removed. Many of these settings are accessible by the content creator, while some are limited to the platform. Discord allows server-owners and moderators to limit access to the server itself and to channels; the ability to use certain messaging features can also be limited by owners or moderators. Spotify only allows creators (musicians or podcasters) to publish content. Since Anchor is a free service for users who wish to become podcasters, there are no restrictions on posting podcasts. However, users cannot publish music to Spotify directly; they must use a music distributor. Popular musicians are often signed to record companies or labels that will either act as or employ a distributor. Independent artists, those who do not have the backing of a record company, can use online music distribution services like DistroKid 5 to publish music on Spotify. These services are never free, and therefore access to publishing music on Spotify is restricted. SoundCloud, on the other hand, allows all of its users to post audio-content, and only limits the amount of audio-content a free user can upload before requiring a paid SoundCloud Pro account.
The types of barriers to access on Spotify and SoundCloud are examples of the pricing moderation technique outlined by Grimmelmann [17]. Monetization (monetization). Monetization on platforms refers to whether content is being used to generate revenue for both the platform and content creator. There is no content on Discord that can be monetized on the platform itself. Music and podcasts on Spotify are monetized, and creators receive profits based on the number of streams their content receives. SoundCloud content is not monetized. Monetization plays a role in moderation since content that is being monetized may be more heavily moderated than content that is not; monetization may also incentivize creators to generate more content, which could lead to moderation challenges. Synchronicity (synchronicity). Synchronicity refers to whether or not the content on a platform is being created in real-time. Voice chats on Discord can only occur synchronously, whereas text-based conversations may occur asynchronously. Audio on Spotify is asynchronous. Synchronous content often creates challenges for moderators, since not all moderators or moderation mechanisms can be present at the time the content is being created/shared. Asynchronous content provides a larger window of opportunity for moderation mechanisms to detect and report antisocial behavior. Ephemerality (ephemerality). Ephemerality refers to whether or not the content can be accessed after it is created/posted. On Discord, voice chats are ephemeral, since recording voice-channels can violate Discord's Terms of Service. On Spotify, audio is not ephemeral. Studies have shown that users behave differently when interactions are ephemeral and leave no record or trace [6, 39]. Furthermore, when content is ephemeral, it becomes difficult for moderators to collect robust evidence to prove that anti-social behavior occurred in order to remove bad actors [22]. User Types (users). Platforms may distinguish between types of users, and may even have designated types that allow users to act as moderators. Different user types are often associated with different permissions. On Discord, server owners and administrators can create custom roles for users, each with custom permission settings; one such role is typically assigned to "moderators". On Spotify, only users with Spotify for Artist accounts are able to publish music. All users are able to create Anchor accounts to publish podcasts. Spotify has no designated "Moderator"-like role assigned to users on the platform. Anonymity (anonymity). Users on platforms may be anonymous or use pseudonymous usernames to mask their identity. On Discord, users typically adopt custom usernames, handles, or pseudonyms. Thus, users in voice-channels might not be associated with any actual means of identification. On Spotify, listeners can, and often do, create account usernames with their actual identity (typically by linking Spotify to other social media accounts, such as Facebook). However, some users do adopt custom usernames that obscure their identity. Creators may publish audio-content under stage names or aliases. Anonymity has been found to both enable and discourage negative behavior in online social spaces [18], and anonymity appears to break down when using voice-based communication [50]. Organization (organization). The organization of a platform refers to the way in which content and communities are organized, situated, and discovered on the platform.
Discord is organized into servers, and each server has various channels in which community members interact and share content. Users can use Discord's Server Discovery feature or Explore page to look for popular public servers to join, or create their own public or private servers. Not all large servers are necessarily public or searchable using Discord's Server Discovery. The vast majority of audio-content on Spotify is indexed and publicly available to every user of the service. Typically, audio on Spotify is organized by artist, genre, podcast, or in user- or algorithmically-curated playlists (some of which are private). Users can search and discover all public audio-content via search or using Spotify's various discovery and recommendation mechanisms. A platform's organization impacts users' and moderators' ability to locate content and members of interest. Rules and Guidelines (rules). Most platforms utilize some combination of platform-wide terms of service (TOS) and community-specific guidelines to govern user behavior. These terms and guidelines establish high-level rules that all users are expected to abide by. In addition to community guidelines and TOS, Discord also has platform-level rules that clearly define the roles of moderators on servers. At the community-level, Discord servers can publish their own set of rules and guidelines that are typically more tailored to the type of community the server hosts. Spotify has separate guidelines and TOS for listeners and content creators who use Spotify for Artists and Anchor. The rules and guidelines help establish a baseline for both platform-wide and community-specific norms and conditions for exclusion (e.g., suspensions or bans [11]). Rules and guidelines play a key role in moderation, as seen in Grimmelmann's work: norm-setting and exclusion make up two of the four common techniques for moderation [17]. Inter-Platform Relationships (inter-platform). The way users of one social platform (audio-based or otherwise) utilize other platforms is an aspect that is often overlooked when discussing moderation on social platforms in general. Discord servers are known to be used alongside other platforms (such as Reddit [23]), but are also commonly used alone. Discord users will occasionally use other, more free-range platforms such as Twitter and Reddit to discover and advertise private servers. Spotify, on the other hand, is often used by other platforms to embed music. For instance, Instagram users can add music directly from Spotify to their story posts, or link to their Spotify playlists. As more SNSs become available, it will be more commonplace for online communities to use more than one platform. This affects moderation since bad actors can harass users over multiple platforms, making moderation more difficult [21]. Moderation Mechanisms (mechanisms). The moderation mechanisms of a platform refer to its built-in moderation tools and procedures. Discord allows users to use and create chat bots and tools to moderate text-channels. Discord also has a guide for moderators. However, not all interactions in a voice-channel can be moderated unless a moderator is present in the voice-channel every time there is activity or the voice-channels are being recorded. Discord has bots that enable recording, but depending on where users reside, consent must be granted in order for recording to be allowed.
On Spotify, all audio content can be moderated by the platform itself, since audio must first be uploaded to the platform and processed before it is hosted publicly. Spotify has mechanisms for algorithmic content moderation, 6 and the existence of such mechanisms leads us to believe that all audio-content is moderated in some way. Limited moderation mechanisms allow abusive and antisocial behavior to go unchecked on social platforms. Though we have defined a set of disjoint affordances, these affordances will often be linked to each other in the larger platform ecosystem. For instance, in both Spotify and Discord, access is directly linked to user roles, since different types of roles constitute different types of access. Inter-affordance relationships are important to highlight since any modification to one affordance could impact several others. Moreover, if a specific affordance has been identified as a contributor to moderation challenges, we can use inter-affordance relationships to identify other, less apparent affordances that also contribute to these challenges. Formally, we define an inter-affordance relationship from affordance A to affordance B if modifying affordance A impacts or changes the status of affordance B. For example, the asynchronous nature of content on Spotify (synchronicity) enables its non-ephemerality (ephemerality); indeed, if Spotify introduced synchronous content, then the ephemerality of certain content might change. 7 On Discord, the ephemerality and synchronicity of the voice interactions in voice-channels affect the moderation mechanisms that are available on the platform. In our MIC diagrams, these relationships are shown as directed arrows between affordances. A bi-directional arrow is used to indicate when a relationship exists in both "directions." For example, user types on both Spotify and Discord are tied to types of access and permissions. These relationships in a platform will likely change over time as the platform itself is updated. To further reinforce our notion of inter-affordance relationships, we list more of the relationships that exist among the affordances of Spotify and Discord. The non-ephemeral (ephemerality) and asynchronous (synchronicity) nature of content on Spotify affects the platform's moderation mechanisms. Similarly, the moderation mechanisms are enabled by Spotify's user agreement, which explicitly states that the platform is allowed to remove or edit any content that is uploaded if it violates community guidelines (rules). On Discord, user types change across each server; thus, the organization of Discord has an effect on user types.
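Read as a graph, this definition is straightforward to operationalize. The sketch below is a minimal Python illustration rather than part of the framework itself: it encodes the Spotify relationships named in this section as directed edges, and the class and method names (MICDiagram, relate, impacted_by) are our own illustrative choices.

```python
from collections import defaultdict

class MICDiagram:
    """A MIC diagram as a directed graph: an edge (a, b) records that
    modifying affordance a impacts or changes the status of b."""

    def __init__(self):
        self.edges = defaultdict(set)

    def relate(self, a, b, bidirectional=False):
        self.edges[a].add(b)
        if bidirectional:  # drawn as a bi-directional arrow in the diagram
            self.edges[b].add(a)

    def impacted_by(self, affordance):
        """Every affordance reachable from `affordance`: the candidates
        to re-examine when that affordance changes."""
        seen, stack = set(), [affordance]
        while stack:
            for nxt in self.edges[stack.pop()]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

# The Spotify relationships named in this section:
spotify = MICDiagram()
spotify.relate("synchronicity", "ephemerality")  # asynchrony enables non-ephemerality
spotify.relate("ephemerality", "mechanisms")     # non-ephemeral audio can be reviewed
spotify.relate("synchronicity", "mechanisms")
spotify.relate("rules", "mechanisms")            # user agreement permits removal/editing
spotify.relate("users", "access", bidirectional=True)
print(spotify.impacted_by("synchronicity"))      # {'ephemerality', 'mechanisms'}
```

Here impacted_by is a simple reachability search, mirroring how one might trace a platform change through a MIC diagram to surface less apparent affordances that share a moderation challenge.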
In this section, we will demonstrate how MIC can be used to represent and subsequently update our understanding of a particular platform's moderation ecosystem. We will use MIC to analyze the Clubhouse app, which has been rapidly evolving since its release in 2020, at two different points in time. First, we will describe the state of Clubhouse as of June of 2021 (Figure 4). We then describe the state of Clubhouse as of the time of writing this manuscript, and accordingly update the MIC diagram and discuss how these changes could affect potential moderation challenges and strategies (Figure 5). Finally, we will discuss how using MIC allows us to reason about moderation strategies and challenges that exist on Clubhouse in a more efficient and systematic way, and what insights MIC provides that may otherwise be overlooked. Clubhouse in June 2021. As of June 2021, Clubhouse was invite-only, so new users had to be invited to the app using their phone number (access). Users must use their real name, as per the platform's community guidelines (anonymity). Clubhouse users can only communicate with one another using audio in public or private voice rooms (modalities). Clubhouse is organized into topic-specific pages and groups called "clubs" (organization); only "the most active members of the Clubhouse Community" can create clubs (access). Each such page and club is made up of synchronous and ephemeral voice rooms (synchronicity, ephemerality). Every club has designated admins that have the ability to edit the club settings and name, and manage members (users). Public voice rooms can be accessed by any user on the app, regardless of their membership in its associated club or interest in the room's subject (access). Private rooms can only be joined by the followers of the room host or the members of the room's associated club (if it exists) (access). All participants of rooms are required to follow Clubhouse's Community Guidelines [2] (rules). However, established clubs can publish a list of club-specific rules that can be applied to participants of rooms hosted by the club (rules). Users can have one of three roles in a room on Clubhouse (users). The moderator role (denoted by a green star symbol) is given to the user who creates the room. This user has the ability to end the room, invite users to the stage to speak, mute speakers, and assign other users to be moderators as well. This means that every active room (i.e., every instance in which audio-content is generated on the app) has a "moderator" present (mechanisms). All other users that enter the room start out as listeners, and do not have the ability to speak in this role; they cannot unmute their microphone. As a listener, a user can press the "raise hand" button and ask to be a speaker. If a moderator accepts a listener's request to speak, that listener gets moved up to the "stage," where they now have the role of speaker. As a speaker, they can unmute their own microphone and be heard by everyone else in the room (access). All speakers inside a room have a marker to show whether their microphone is muted or not. Speakers often click this marker on and off to indicate that they want a turn to speak. When users enter a room, they have a celebratory emoji by their icon and name to indicate that they are new to the room (badges). Clubhouse also has a monetization feature that lets users send money to other Clubhouse users via their profile page (monetization). Clubhouse uses a block-list icon to indicate to a user that a specific user has been blocked by many people in their circle (mechanisms, badges). Much of the commentary about Clubhouse interactions happens on other platforms. One such platform that is heavily used by Clubhouse users for commentary is Twitter. Users often talk about what they are experiencing on Clubhouse on Twitter, and Clubhouse users will often link to their Twitter profiles in the Clubhouse app. There are even subreddits dedicated to talking about Clubhouse (e.g., r/Clubhouse). These other platforms are also used to announce and publicize rooms or clubs and invite new users to Clubhouse (inter-platform). Between June of 2021 and January of 2022, Clubhouse released close to 20 updates to its iOS app [1]. These releases included changes to the app's appearance, updates to the app's terms of service and privacy policy, as well as the addition of multiple new features.
Using MIC, we identified which of these updates to investigate further to understand moderation on Clubhouse. The relevant changes are as follows: Clubhouse is no longer invite-only, i.e., anyone with a smartphone is allowed to make an account and join the Clubhouse community (access). The platform also added a direct-messaging feature that lets users send text-messages to other users and create group chats (modalities). Clubs can now assign users a "Leader" role that gives them the ability to start and schedule rooms in a club, but does not allow them to alter the club settings or add/remove members (users). By far the largest change to Clubhouse is that it introduced non-ephemeral content, i.e., live audio rooms can be recorded for users to listen to later (ephemerality). Additionally, Clubhouse added an option that lets users block inappropriate or NSFW voice rooms from their feed (mechanisms). The observed affordances and relationships in MIC give us several insights into moderation on Clubhouse. First, the existence of the moderator role in every live audio room indicates that moderation on Clubhouse is done primarily by users as opposed to by the platform itself (mechanisms). The platform's requirement of using identifiable information (rules) will impact the types of interactions that users have on the platform, hopefully reducing the frequency of antisocial behavior. The organization of live audio rooms on Clubhouse makes it easy for users to find new rooms and interact with new people (organization). This organization also lets users abruptly leave rooms, which may make it difficult for room hosts and moderators to report disruptive or antisocial users. However, with Clubhouse's new record feature, room hosts can now have a record of which users engaged in disruptive behavior, and can then use this record to locate the disruptive user in question and report them after the room ends (ephemerality, synchronicity). Before Clubhouse added a text-based chat feature, users had to utilize other social platforms if they wanted to send asynchronous, text-based messages to other users. This could have also driven abusive users to several other platforms to harass individuals they initially encountered on Clubhouse [27], amplifying the amount of harassment a potential victim receives. The introduction of text-based messages (modalities) likely reduced the need for certain inter-platform relationships, making Clubhouse more self-contained. This could make moderating Clubhouse easier; at the very least, it could restrict the amount of harassment that victims of antisocial users receive, rather than amplify it. Finally, since Clubhouse is no longer invite-only (access), the user base of Clubhouse is likely to have expanded. This means more users, and more communities, would start using Clubhouse, resulting in a large influx of user and incident reports, thereby posing newer challenges to the platform.
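To illustrate how such updates propagate, the sketch below continues the hypothetical MICDiagram illustration from Section 3, encoding a few of the Clubhouse relationships described above and asking what the new recording feature (a change to ephemerality) could touch. The specific edges are our reading of this section's prose, not a complete diagram.

```python
clubhouse = MICDiagram()
clubhouse.relate("synchronicity", "ephemerality")  # live rooms can now be recorded
clubhouse.relate("ephemerality", "mechanisms")     # recordings enable post-hoc reporting
clubhouse.relate("access", "users")                # open signup expands the user base
clubhouse.relate("users", "mechanisms")            # more users -> more incident reports
clubhouse.relate("modalities", "inter-platform")   # in-app text reduces off-platform use

# Which affordances should be re-examined after the recording update?
print(clubhouse.impacted_by("ephemerality"))       # {'mechanisms'}
```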
So far, we have used MIC on three platforms, all of which are centered around audio. As discussed in the introduction, these audio platforms have many similarities and differences that could impact how moderation is accomplished. In this section, we will compare and contrast the platforms via the MIC framework. We will then use the comparisons to generate ideas for new moderation interventions. Before we use MIC affordances and relationships, we will first point out the obvious similarities and differences between the three platforms that can be determined without using MIC. First, Discord and Clubhouse both offer live audio features, whereas Spotify itself does not. Spotify also does not offer users a way to direct-message other users, while Discord and Clubhouse both have such features. In fact, Spotify users have no means to interact with one another on the platform apart from using posted audio, which is not the case on Discord or Clubhouse. In general, Spotify is used for listening to music and podcasts; Clubhouse is used for listening to and participating in live audio rooms; Discord is used to host communities and let community members interact with each other over text, voice, and video. MIC-guided Comparisons of Spotify, Discord, and Clubhouse. While the above observations do give us insights into how moderation strategies and challenges differ across these platforms, they do not give us as complete a view as comparisons using MIC would. We have already compared the affordances and relationships of Spotify and Discord in Section 3, so we will now focus on comparisons involving Clubhouse. Clubhouse is similar to Discord in that it allows users to communicate using text-messages and voice; Discord has video capabilities while Clubhouse does not. Clubhouse and Spotify both have features that enable sharing and posting audio content (modalities). Clubhouse and Discord allow all their users the ability to generate and post content, while Spotify limits this to only certain types of users (access). Clubhouse and Spotify both have monetization features that Discord lacks, but monetization on Spotify depends on streaming numbers and ad revenue, whereas on Clubhouse monetization occurs between users (i.e., one user sends another user money) (monetization). Audio on Discord is synchronous and ephemeral, while on Spotify it is asynchronous and non-ephemeral. Clubhouse has synchronous audio that can be made non-ephemeral. Text messaging on both Discord and Clubhouse is asynchronous and non-ephemeral (synchronicity, ephemerality). Discord and Clubhouse both offer ways to delineate specific communities (i.e., servers and clubs). However, Clubhouse is more openly structured, like Spotify, making it easier for users to explore and find more niche communities (organization). Clubhouse and Spotify have fixed user types, whereas Discord lets users create custom roles. Clubhouse and Discord have roles that can change between servers/rooms/clubs (users). Users on Clubhouse must have identifiable profiles, whereas pseudonyms are allowed on Discord and Spotify (anonymity). Clubhouse and Discord both use visual cues and markers to differentiate among user types. However, neither Clubhouse nor Discord uses the blue check verification marker to verify users' identities, like Spotify does for artists. Clubhouse shows how many listeners and speakers are in an active live room, similar to how Spotify shows the number of streams for songs (badges). All three platforms have robust terms of service and community guidelines, but only Discord and Clubhouse let users create rule sets for individual communities or rooms (rules). All platforms are used in tandem with other social media sites; however, Discord and Clubhouse can be used as an online community's sole meeting place (inter-platform). Finally, Clubhouse and Discord both have moderation roles that allow for users themselves to engage in moderation. Discord has tools that allow users to create and use automated moderation tools such as chat bots.
Both Clubhouse and Spotify keep recordings of audio, and both state in their Terms of Service that they are at liberty to remove any content that they feel violates their terms. Spotify is moderated using algorithmic tools. Spotify also uses curation and recommendation mechanisms to help users find the content they are interested in (mechanisms). Spotify and Clubhouse. One challenge we noticed while using Clubhouse to conduct the previous case study (Section 4) is that it was difficult to identify live rooms of interest among those that appear on the app's home page. Furthermore, some live rooms dealt with sensitive topics, such as sexual assault. Such rooms should likely not be shown to users who are insensitive to certain topics, since their participation in the room would have negative impacts on the members of such a space. In general, it seems difficult both for listeners to find interesting rooms on Clubhouse and for room hosts to find interested listeners and participants. To begin addressing this potential challenge, one can use MIC to observe that Clubhouse has an open organization similar to Spotify's. In particular, the room topic categories that users can browse on Clubhouse are reminiscent of the various genres users can use to browse content on Spotify. Likewise, as of Clubhouse's newer updates, both platforms host non-ephemeral content (ephemerality). One of Spotify's major services is its recommendation system for music and podcast discovery. Not only does this service aim to show users content that they would be inclined to listen to, but it also helps creators discover new listeners. 8 One way in which Spotify does this is by curating playlists. These playlists can be broadly defined, containing music from a genre, or from a specific musical artist. Many of these playlists are manually curated, and artists can submit music for consideration to be added to these curated playlists. Given Clubhouse and Spotify's organizational similarity, and the existence of non-ephemeral content, we could propose a moderation mechanism for Clubhouse that adopts a recommendation-via-curation mechanism similar to Spotify's, manually curating endorsed playlists of quality room recordings. We could even try to extend this idea to ephemeral content, i.e., playlist-type hubs of clubs or upcoming scheduled rooms that are hosted by trusted or experienced users. This could start to help clubs and rooms find relevant audiences, and could also help users find and build communities in a more strategic way, while limiting the number of potential bad actors that try to engage. Discord and Clubhouse. MIC also showed us that Clubhouse and Discord are very similar across many different affordances. Discord has been studied in the context of moderation research [22, 24], and researchers have found moderating voice channels on Discord to be a challenging feat. This is largely due to the fact that moderators in Discord servers find it difficult to monitor events and collect evidence of bad behavior in voice channels [22]. Clubhouse, like Discord, has a moderator role for users (users); however, on Clubhouse, every active room must have a moderator present. A feature, or moderation mechanism, that Discord could "borrow" from Clubhouse to help moderators handle voice-channels is a way to enable moderators to schedule when voice-channels can be made active. This way, moderators can ensure that they are present in public voice channels.
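The kind of affordance-by-affordance comparison used above can also be made mechanical. Below is a toy sketch in the same illustrative Python style; the short labels paraphrase this section's observations and are assumptions made for the sake of the example, not a complete MIC analysis.

```python
# Coarse per-affordance labels paraphrasing this section's observations.
PLATFORMS = {
    "Discord":   {"modalities": "text/voice/video", "anonymity": "pseudonyms allowed",
                  "monetization": "none",           "users": "custom roles"},
    "Clubhouse": {"modalities": "audio + text DMs", "anonymity": "identifiable profiles",
                  "monetization": "user-to-user",   "users": "fixed roles"},
    "Spotify":   {"modalities": "audio",            "anonymity": "pseudonyms allowed",
                  "monetization": "streams/ads",    "users": "fixed roles"},
}

def compare(p, q):
    """List where two platforms' moderation ecosystems agree and diverge."""
    for aff, a in PLATFORMS[p].items():
        b = PLATFORMS[q][aff]
        print(f"{'SAME' if a == b else 'DIFF'} {aff:12} {p}: {a} | {q}: {b}")

compare("Discord", "Clubhouse")  # flags modalities, anonymity, monetization, users
```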
Broader Effects of Affordance Changes. In Section 5.2, we discussed potential moderation mechanisms that Discord, Spotify, and Clubhouse could adapt from one another. One proposal we made involved adapting Clubhouse's rule of keeping all recordings for a short period of time to address voice moderation challenges found on Discord [22]. We briefly discussed that users of Discord may not be open to this platform change, largely due to the fact that Discord seems to allow its users more privacy than Clubhouse does. This conjecture was made by observing that Discord users are allowed to be pseudonymous, while Clubhouse users have always been required to be identifiable. Observations like this seem unimportant and, had we not used MIC, might have been overlooked. However, in some cases, overlooking these subtle nuances has inadvertently allowed for detrimental platform changes. An example of this can be seen with Yik Yak, a social platform that allowed users to post location-specific anonymous text-posts [39]. Yik Yak was a successful social platform that shut down in 2017 after platform changes were introduced. One such update was the removal of anonymity. As discussed in Section 2, existing research has explored the role anonymity played in voice-based interactions in online games [50]. In particular, Wadley et al. [50] found that voice seemed to remove a degree of anonymity in game-play, which made some players feel uncomfortable and, in some cases, caused the players to abandon the game. There is no way to prove that MIC-based analysis could have prevented this specific platform change, but MIC would have highlighted anonymity as an integral affordance, and one that was similar to that of the online games explored by Wadley et al. [50]. MIC-based analysis would have highlighted these connections to a seemingly unrelated platform and could have shed light on potential (and later realized) pitfalls that could result from modifying the anonymity affordance. As such, a MIC-based approach to moderation research and social platform design could be instrumental in designing and maintaining successful social platforms. Limitations of MIC. MIC's purpose is to capture the moderation ecosystems of social platforms to allow moderation researchers and platform stakeholders to better understand moderation. However, MIC does not capture every moderation-related property. In particular, the implicit norms that exist on a platform would not be represented by the affordances or relationships in MIC, since they are not tangible. Norms of online communities play a massive role in moderation on platforms, and norm-setting is identified as one of the four main moderation techniques by Grimmelmann [17]; there is also research that explores how norms play a role in moderating online communities, and how norms differ amongst various communities on the same platform [12, 40]. Another closely related limitation of MIC is that it is not currently designed for analyzing individual communities. However, studying individual online communities, such as specific subreddits, is beneficial for understanding moderation [15]. We posit that there might be a way to extend MIC to capture nuances of individual communities and their norms, but leave this for future work. Extending MIC. MIC's base set of affordances and relationships is likely to become non-exhaustive as technology advances. Luckily, the graphical nature of MIC allows us to extend it in an easy and straightforward way.
We can add new affordances to our original set when new types of affordances that affect moderation are uncovered or developed. Similarly, we could further granularize existing affordances. For instance, we may eventually find it useful to distinguish between automated moderation mechanisms and manual ones. We can also extend our set of relationships by defining new types of relationships. There is no real restriction on how one could go about defining new relationships. We could even forego the condition that relationships occur between only two affordances, and describe multi-affordance relationships that are analogous to hyper-edges 9 (see the sketch below). Another potentially useful, albeit more involved, extension of MIC, and in particular the MIC diagram, would be to use the inter-platform relationship affordance to link to the MIC diagrams of other platforms or services. This would be useful if there is a nearly symbiotic relationship between two separate platforms or services, but we still wish to consider the affordances of each separately. For instance, Discord introduced a new Clubhouse-like service called Discord Stages 10. It may be useful to consider Stages as a separate service from Discord's servers, since its use-case and set-up are different. We could analyze each of these services separately, and then build an extended MIC diagram to understand moderation on Discord in more detail.
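As one hypothetical realization of such multi-affordance relationships, a hyper-edge can map a set of source affordances jointly to a single impacted affordance, as in the following sketch (again purely illustrative, under the same conventions as the earlier MICDiagram examples):

```python
# A multi-affordance relationship as a hyper-edge: several affordances
# jointly impact another. Illustrative extension, not part of base MIC.
hyper_edges = {
    # On Discord, the ephemerality and synchronicity of voice-channels
    # together constrain the available moderation mechanisms.
    (frozenset({"ephemerality", "synchronicity"}), "mechanisms"),
}

def jointly_impacts(changed, edges):
    """Return affordances whose hyper-edge sources intersect `changed`."""
    return {target for sources, target in edges if sources & changed}

print(jointly_impacts({"ephemerality"}, hyper_edges))  # {'mechanisms'}
```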
In this paper, we introduced the MIC framework as an extension of existing theoretical frameworks for discussing moderation. MIC provides a standardized way to represent moderation ecosystems of social platforms via their affordances and the relationships between them. Using two case studies, we demonstrated how to use MIC to analyze growing individual platforms, as well as to compare and contrast platforms to generate ideas for moderation interventions. We believe that the MIC framework will help the moderation research community keep up with the fast-paced nature of social platform development and design updates.

REFERENCES
[1] Clubhouse iOS Release Notes
[2] Community Guidelines
[3] Discord introduces Clubhouse-like Stage Channels feature for live audio conversations - Technology News
[4] Spotify Acquires Locker Room and Announces Plans for a New Live Audio Experience
[5] Hanging on the 'Wire: A Field Study of an Audio-Only Media Space
[6] 4chan and /b/: An Analysis of Anonymity and Ephemerality in a Large Online Community
[7] Clout Chasing for the Sake of Content Monetization: Gaming Algorithmic Architectures with Self-moderation Strategies
[8] Remix's retreat? Content moderation, copyright law and mashup music
[9] Spotify is launching its own Clubhouse competitor
[10] Crossmod: A Cross-Community Learning-Based System to Assist Reddit Moderators
[11] You Can't Stay Here: The Efficacy of Reddit's 2015 Ban Examined Through Hate Speech
[12] The Internet's Hidden Rules: An Empirical Study of Reddit Norm Violations at Micro, Meso, and Macro Scales
[13] The positive and negative implications of anonymity in Internet social interactions
[14] Social influence: Social norms, conformity and compliance
[15] I run the world's largest historical outreach project and it's on a cesspool of a website
[16] Algorithmic content moderation: Technical and political challenges in the automation of platform governance
[17] The virtues of moderation
[18] Attribution accuracy when using anonymity in group support systems
[19] What Video Can and Can't Do for Collaboration: A Case Study
[20] Human-Machine Collaboration for Content Regulation: The Case of Reddit Automoderator
[21] Online Harassment and Content Moderation: The Case of Blocklists
[22] Moderation Challenges in Voice-Based Online Communities on Discord
[23] Technological Frames and User Innovation: Exploring Technological Change in Community Moderation Teams
[24] "Eternal September": How an Online Community Managed a Surge of Newcomers
[25] Building successful online communities: Evidence-based social design
[26] Slash(Dot) and Burn: Distributed Moderation in a Large Online Conversation Space
[27] Clubhouse Moderation Issues and Incidents
[28] Going Dark: Social Factors in Collective Action Against Platform Operators in the Reddit Blackout
[29] The Civic Labor of Volunteer Moderators Online
[30] A Comprehensive Model of Anonymity in Computer-Supported Group Decision Making
[31] What Do We Know about Algorithmic Literacy? The Status Quo and a Research Agenda for a Growing Field
[32] What mix of video and audio is useful for small groups doing remote real-time design work
[33] Facebook Announces Live Audio Rooms, Its Clubhouse Clone
[34] Avaaj Otalo: A Field Study of an Interactive Voice Forum for Small Farmers in Rural India
[35] Reddit Talk is a Clubhouse competitor for subreddits
[36] Audio Chatrooms like Clubhouse Have Become the Hot New Media by Tapping into the Age-Old Appeal of the Human Voice. The Conversation
[37] Commercial content moderation: Digital laborers' dirty work
[38] Slack is getting Clubhouse-like audio chatrooms, and I absolutely don
[39] Situated Anonymity: Impacts of Anonymity, Ephemerality, and Hyper-Locality on Social Media
[40] Shaping Pro and Anti-Social Behavior on Twitch Through Moderation and Example-Setting
[41] Moderator engagement and community development in the age of algorithms
[42] Julius Solans. 2020. The Rise of Audio in Virtual Events
[43] Sonar: Create worlds together
[44] Twitter Spaces, a Clubhouse-like feature, goes live in India for some users
[45] Spotify Acquires Sports-Talk App Locker Room
[46] A Mobile Voice Communication System in Medical Setting: Love It or Hate It?
[47] "YouTube's predator problem": Platform moderation as governance-washing, and user resistance
[48] Culture and social behavior
[49] Sangeet Swara: A Community-Moderated Voice Forum in Rural India
[50] Voice in Virtual Worlds: The Design, Use, and Influence of Voice Chat in Online Play. Human-Computer Interaction
[51] Overcoming social awareness in computer-supported groups
[52] Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms
[53] Telegram is stealing the best feature from Clubhouse - here's how
[54] Volunteer Moderators in Twitch Micro Communities: How They Get Involved, the Roles They Play, and the Emotional Labor They Experience
[55] Automatic Archiving versus Default Deletion: What Snapchat Tells Us About Ephemerality in Design