Moderation & the design of social platforms

Recently @siderea wrote a fantastic thread about social homogeneity, moderation, the design of social platforms, and what they could be. They covered a lot of ground and I can’t respond to it all, so I’ll just pick out some highlights:

I cannot tell you how many conversations I have seen about the topic of “moderation” and how necessary it is in which nobody has ever bothered to set down what exactly it is that they think a moderator is supposed to accomplish.

I mean, it’s all of them. I’ve been on the internet since the 1980s, and I have never seen anyone stop and actually talk about what they thought moderators were trying to do or should try to do.


That sounds easy. I’ll take a shot at that below.

Also they draw a parallel between designing buildings and designing social platforms:

Why should our societies tolerate the existence of *irresponsibly* designed and operated social media platforms, that increase violence and other antisocial behavior?

Primarily buildings are built to be used, and as such they are tools, and we judge them, as we do all tools, by how fit they are for their purpose, whatever that might be.

And the purposes of buildings are to afford various ways of people interacting or avoiding interacting.

So architects think a lot about that. It’s a whole thing.

Those who put together social media platforms need to think about the same sort of thing.

Preach!

The upshot is that we can do better than what we have in the past. We can go beyond the bare minimum of “delete the spam, ban the nazis” moderation. When we build social software, the features it has determine what kind of moderation is possible and what kinds of interactions people will have. We should be intentional about that.

I’d like to share some of my ideas for how we can do that, but first let’s get the basics covered:

What I think a moderator is supposed to accomplish

Obviously every online space is different and has its own values and priorities. What follows is what I consider to be the minimum necessary to avoid devolving into 4chan as soon as the normies arrive.

The goal of moderators is to create a positive, inclusive, and constructive online community where users feel comfortable engaging in discussions and sharing their thoughts and ideas. To that end, their responsibilities include:

  1. Enforcing Community Guidelines:
    • Moderators ensure that users adhere to the forum’s rules and guidelines. This may involve removing or editing content that violates these rules.
  2. Fostering a Positive Atmosphere:
    • They work to create a welcoming and friendly atmosphere within the forum. This includes encouraging respectful communication and discouraging any form of harassment or bullying.
  3. Managing Conflict:
    • Moderators intervene when conflicts arise between users, helping to de-escalate situations and resolve disputes. This may involve mediating discussions or issuing warnings to users.
  4. Preventing Spam and Irrelevant Content:
    • They monitor the forum for spam, irrelevant content, or any form of disruptive behavior. This helps maintain the quality of discussions and keeps the forum focused on its intended topics.
  5. Addressing Technical Issues:
    • Moderators often assist users with technical issues related to the forum platform. This includes addressing bugs, helping users navigate the site, and forwarding technical problems to the appropriate channels.
  6. Encouraging Positive Contributions:
    • Moderators actively encourage users to contribute positively to discussions. This can involve highlighting valuable contributions, providing constructive feedback, and recognizing members for their positive engagement.
  7. Applying Consequences:
    • When necessary, moderators may apply consequences for rule violations, such as issuing warnings, temporary suspensions, or permanent bans. This ensures accountability and helps maintain a healthy community.
  8. Staying Informed:
    • Moderators stay informed about the forum’s community and culture, as well as any changes in policies or guidelines. This helps them address issues effectively and stay responsive to the evolving needs of the community.
  9. Collaborating with Community Members:
    • Moderators listen to concerns and feedback from the community. Taking a collaborative approach helps build trust and ensures that the moderation team understands the community’s needs.

Ok, cool. But:

We can and should accomplish more

When we think about moderation tools for a platform that serves millions of people, we are shaping the nature of social interactions on a grand scale. As we engineer these virtual societies, the question we need to ask ourselves is, “What is the nature of the society we want to create?” and within that, “What do we want moderation to accomplish that supports that nature?” and eventually “What software features do moderators need to do their work?”

The nature of the society

We want to create an ideal society where everyone is safe, respected, empowered, entertained, and encouraged to grow and find meaning according to their individual free choices. Members of this online society contribute meaningfully and positively to the rest of society, support the actualization of human rights for all, and work to help democracy live up to its promise.

Remember the 1990s, when the internet hadn’t been corrupted yet? Yeah. I do.

What we want moderation to accomplish to maintain this ideal society

Defining the Role of Moderation

Moderation should not be a passive, reactive role. Instead, it should be proactive, shaping the community’s social dynamics intentionally. The first step towards this is defining what our platforms aim to achieve. Do we want a space for free and open discussions, a supportive community, or a platform for specific interests? This vision will shape the guidelines we develop, the tools we use, and the strategies we implement.

Developing Clear Guidelines and Empowering Moderators

Once we have our vision, we need to create a set of rules that align with this vision. These guidelines should be clear, easily accessible, and comprehensive. Moreover, we need to empower our moderators with the right tools and authority to enforce these guidelines. This can include features for deleting posts, banning users, or moving discussions.
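
To make “the right tools” concrete, here is a minimal sketch of a moderation action log. The names (ModAction, apply_action) are hypothetical rather than taken from any real platform; the point is simply that every delete, ban, or move is recorded with who did it and why, which keeps enforcement accountable to the guidelines.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ActionType(Enum):
    DELETE_POST = "delete_post"
    BAN_USER = "ban_user"
    MOVE_DISCUSSION = "move_discussion"


@dataclass
class ModAction:
    """One entry in an append-only moderation log (hypothetical structure)."""
    moderator_id: int
    action: ActionType
    target_id: int      # the post, user, or thread being acted on
    reason: str         # which guideline was violated, in plain language
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


MOD_LOG: list[ModAction] = []


def apply_action(action: ModAction) -> None:
    """Record the action before carrying it out, so every decision is auditable."""
    MOD_LOG.append(action)
    # ...here the platform would actually delete the post, ban the user, etc.


apply_action(ModAction(moderator_id=1, action=ActionType.BAN_USER,
                       target_id=42, reason="Repeated harassment after a warning"))
```

An append-only log like this also gives the community a basis for appeals and for checking whether the guidelines are being applied consistently.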

Investing in Technology

Incorporating technology is crucial in supporting our moderators. Automated moderation tools can detect and remove inappropriate content, while algorithms can promote high-quality posts. Technology can also help in combating challenges like trolls who use new IP addresses to create accounts. Techniques like browser fingerprinting can identify users regardless of their IP, and restrictions on new accounts can deter trolls.
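
To illustrate the fingerprinting idea, a server can hash a handful of request attributes into an identifier that survives an IP change. This is only a sketch with a few illustrative headers; production fingerprinting combines many more signals, can still be spoofed, and raises privacy questions that deserve their own discussion.

```python
import hashlib


def request_fingerprint(headers: dict[str, str]) -> str:
    """Combine a few request attributes into an identifier that survives IP changes.

    The headers chosen here are illustrative only; none is reliable on its own,
    and the combination can still collide or be deliberately spoofed.
    """
    signals = [
        headers.get("User-Agent", ""),
        headers.get("Accept-Language", ""),
        headers.get("Accept-Encoding", ""),
    ]
    return hashlib.sha256("|".join(signals).encode("utf-8")).hexdigest()


# Two requests from different IPs but the same browser yield the same value.
print(request_fingerprint({"User-Agent": "Mozilla/5.0", "Accept-Language": "en-GB"}))
```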

Addressing Complex Issues

Online communities also need to grapple with complex issues such as the formation of high-control groups, disinformation propagation, social isolation, and internet addiction. Tackling these problems requires more advanced tools and strategies:

  • For high-control groups, we need to implement robust reporting systems and use AI tools to detect patterns of manipulation.
  • To combat disinformation, we need to establish strong fact-checking protocols, possibly collaborating with external fact-checking organizations.
  • To mitigate social isolation and internet addiction, platforms can implement features to promote healthier usage, like reminders to take breaks or limits on usage time.
  • To manage trolls, we can use advanced techniques that track users beyond their IP address and limit the activities of new accounts until they show they can be trusted (a sketch of this follows the list).
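
Here is one way that last point might look in practice: a hypothetical trust ladder that throttles posting for brand-new accounts and relaxes the limit as an account accumulates age and approved posts. The thresholds are invented for illustration; a real platform would tune them against its own abuse data.

```python
from dataclasses import dataclass


@dataclass
class Account:
    age_days: int        # how long ago the account was created
    approved_posts: int  # posts that survived moderation


def daily_post_limit(account: Account) -> int:
    """Illustrative trust ladder: tight limits for new accounts,
    effectively no limit once an account has a track record."""
    if account.age_days < 1:
        return 2
    if account.age_days < 7 or account.approved_posts < 10:
        return 10
    return 1000  # effectively unlimited


print(daily_post_limit(Account(age_days=0, approved_posts=0)))    # 2
print(daily_post_limit(Account(age_days=30, approved_posts=50)))  # 1000
```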

Continuous Evaluation and User Education

Finally, moderation should be an ongoing process of improvement and adaptation. We need to regularly review and update our strategies based on their effectiveness and changing conditions. Additionally, we need to educate our users about these issues and how to report them. An informed user base can greatly aid in maintaining a healthy community.
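
Education works best when reporting is low-friction. As a sketch, a report intake might capture who reported what and why, and push it onto a queue the moderators already watch. The structure below is hypothetical and deliberately minimal.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class Report:
    reporter_id: int
    target_post_id: int
    category: str   # e.g. "harassment", "disinformation", "spam"
    details: str


# A simple FIFO queue; a real platform would deduplicate reports on the same
# post and prioritise by category and severity.
REPORT_QUEUE: deque[Report] = deque()


def submit_report(report: Report) -> None:
    REPORT_QUEUE.append(report)


submit_report(Report(reporter_id=7, target_post_id=1234,
                     category="disinformation", details="Repeats a debunked claim"))
```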

In conclusion, moderation in online communities is not just about maintaining order but about intentionally shaping the dynamics of these spaces. As we navigate the digital age, we must recognize the power and responsibility we hold in engineering these virtual societies, and use it to create healthier, safer, and more inclusive communities.

6 Comments

  1. @piefedadmin
    Thank you for a good list of points to consider about online communities. I disagree with grouping all of these under "moderation", however. I believe it is more useful to split the responsibilities between moderators (who enforce rules) and "community managers" (CMs) who do most of the more positive/interactive roles with the community. A CM does not necessarily have more access/power than the typical user, although moderators may listen more closely to them.
    1/x

    • @piefedadmin
      As I see it, the main problem with merging moderation and CM roles is that one side will tend to be neglected, at least during a community crisis. People who work well in one role (like enforcing rules) may not be as skilled in others (like the political/discussion work of CMs). I prefer a layered approach like:

      Level 1: Legal requirements (copyright, CSAM) – dealing with events that will shut down a community.
      Level 2: Spam and technical attacks that make a site unusable.
      2/x
