DSM Workshop Content Governance Panel Session
Princeton University, March 4, 2024
Panel: Creating trustworthy content governance in DSM
Among the urgent and unresolved concerns facing decentralized social media communities, protocols, and platforms today, issues related to content moderation, integrity, trust and safety, and trustworthy community governance stand out to many of us. The spectacular and ongoing failures of centralized social media systems in this regard have driven tens of millions of people away from the mainstream platforms in search of alternatives. Some proportion of these people have wound up in decentralized and federated systems, creating a unique moment of opportunity in which to envision, design, build, and refine a more trustworthy ecosystem of content and governance.
The goal of this panel is to pursue this conversation by prompting a small group of the DSM workshop participants to share their perspectives on these topics. The structure will be a short moderated discussion followed by time for interactive exchange with panelists and other workshop participants. As facilitator, I will try to keep the conversation moving, manage time as well as transitions between speakers and topics, and ensure compliance with the event code of conduct.
Below, I've proposed four broad questions / topics for discussion. It's very unlikely we'll cover all of these within the time constraints (especially if we want to leave time for more interactive conversation with non-panelists!), but hopefully they can help to spark your thinking and provoke conversation. Your comments/suggestions/modifications of the topics are very welcome!
1. Introductions and panelist context
Panelists briefly introduce themselves and summarize their experience and context relevant to the topic of content moderation and integrity in decentralized social media.
2. Opportunities and challenges
Several of you and others in the room have noted distinct opportunities and challenges involved with various forms of decentralized social media. What do these look like with specific regard to content moderation?
Potential follow-up: Structural/technical constraints and affordances
Arguments about these opportunities and challenges tend to emphasize the structural constraints and affordances of decentralized systems: server-level autonomy; the ease of user exit and data/network portability; and opportunities for server-level collective sanctions (defederation, blocks, partial blocks, etc.).
Another common emphasis falls on specific technical resources and approaches that can be built for these systems and that were perhaps infeasible under prior arrangements, whether narratives of composable moderation, moderation feeds/servers, or specific implementations of features like blocks and mutes.
How do you think about these elements? Any approaches/developments you find particularly promising/troubling?
3. The current state of the landscape
Share your assessment of the content moderation landscape in DSM systems right now. We know this has been a period of rapid growth in users and communities, and that this growth has renewed the urgency of content integrity and moderation considerations, but what else is going on? What's in play? Where are things moving?
Potential follow-up: What aspects of this landscape keep you up at night these days?
For a personal example, I know Bluesky just hired a new head of Trust and Safety, but my sense is that the federated and decentralized landscape as a whole is wildly unprepared to anticipate and address likely threats to content integrity, trust, and safety in a year when a large proportion of the world's population will have the opportunity to vote in consequential national elections.
For another example, I share the concerns made most recently visible in vanta black's Anti-Meta Fedi Pact about the prospects for trustworthy, accountable, collectively governed social media in the face of the entry of Meta and other large firms with troubling track records into the space.
What are we not worried about, or maybe not worried about enough?
4. Three key open challenges
A. Embedding values
Rabble, Nathan Schneider, and others have made eloquent arguments about the values embedded in technologies and platforms, and about how the basic structural constraints of federated systems can enact such values.
What content governance values can/should be embedded in moderation systems on federated social media? What values should we avoid enacting? How should we negotiate value conflicts?
B. Theories of change: Pathways to trustworthy content governance
A lot of the writing and discussion about DSM emphasizes grand visions of how things will work on decentralized and/or federated systems once we grow them a bit more, once we build out the ecosystem a bit more. But we (maybe always?) find ourselves not quite there yet. Given the state of things and the goals/values you aspire to work towards, what theory of change do you believe should be applied by people, communities, companies, regulators, and funders aiming to make impactful contributions in this space? Where are the windows of opportunity, the key leverage points that we should be attending to?
C. Collective meta-governance institutions and organizing
Decentralized and federated communities and platforms encounter several meta-governance challenges concerning the design, coordination, adoption, and enforcement of moderation, integrity, trust, and safety frameworks. To make this concrete, consider the potential mechanisms by which blocklists, moderation feeds, or standards might be developed and disseminated across servers/services. How should we pursue this class of challenges and problems? Can we rely on voluntary initiatives and adoption alone? What kinds of coordination and collaboration will be valuable and/or necessary to build collective action across decentralized and federated ecosystems?