
Cancel Technology

Noah Smith makes a good point in this article: ‘cancel culture’ has always existed; we just used to call it ‘social ostracism’. What has changed is the technology we interact with, and the intended and unintended audiences with which we communicate.

First let’s think about distribution. In the olden days, you could “read the room” and decide whether you were going to get a sympathetic ear before you said something. You knew who you were hanging out with — your relatives, or your coworkers, or your buddies, or your neighbors, or your cell of the Communist Party, etc. On the internet, that’s much less true. On Twitter, anyone can see what you write and retweet it or screenshot it to millions of strangers all over the globe. In a Facebook group, you probably don’t know exactly what kind of people are in the group unless it’s really small. If you put something up on a website, anyone can read it. Etc.

The internet also makes it much harder to maintain private spaces, because text can be screenshotted and distributed widely. In the old days, if you said something that would be cancel-worthy outside the group of people you were talking to, it was impossible for someone to verifiably transmit that information outside the group — they could snitch on you, but it would be hearsay and you could deny it. But when you write something down, the text of what you wrote can be screenshotted and shared with people you didn’t expect to be watching you.

Now, this broad distribution has a number of effects. It makes it a lot harder to get together with your buddies in private and say racist or sexist stuff, because now one of them can betray you with a screenshot. Lots of people are probably pleased with that outcome.

But it also means that everyone who talks on the internet must always worry about their words being shown to someone who’s going to interpret them in an uncharitable way.

[…]

Thus, the internet changes Cancel Culture by massively increasing the number of people who can target you for ostracism. It’s a bit like living in a gossipy small town where you don’t know any of your neighbors — you don’t know who’s going to read what you write, so you don’t know how people are going to take what you say.

Source: It’s not Cancel Culture, it’s Cancel Technology | Noahpinion

Nothing is repeated, and everything is unparalleled

🤔 We need more than deplatforming — “But as reprehensible as the actions of Donald Trump are, the rampant use of the internet to foment violence and hate, and reinforce white supremacy is about more than any one personality. Donald Trump is certainly not the first politician to exploit the architecture of the internet in this way, and he won’t be the last. We need solutions that don’t start after untold damage has been done.”

💪 Demands and Responsibilities — “If you demand rights for yourself, you have to demand those same rights for others. You have to take on the responsibility of collective action, and you yourself act in a way that benefits the collective. If you want credit, you have to give credit. If you want community, you have to be communal. If you want to be satiated, you have to allow others to be sated. If you want your vote to be respected, you have to respect the votes of others.”

🗯️ Parler Pitched Itself as Twitter Without Rules. Not Anymore, Apple and Google Said. — “Google said in a statement that it had pulled the app because Parler was not enforcing its own moderation policies, despite a recent reminder from Google, and because of continued posts on the app that sought to incite violence.”

🙅 Hello! You’ve Been Referred Here Because You’re Wrong About Section 230 Of The Communications Decency Act — “While this may all feel kind of mean, it’s not meant to be. Unless you’re one of the people who is purposefully saying wrong things about Section 230, like Senator Ted Cruz or Rep. Nancy Pelosi (being wrong about 230 is bipartisan). For them, it’s meant to be mean. For you, let’s just assume you made an honest mistake — perhaps because deliberately wrong people like Ted Cruz and Nancy Pelosi steered you wrong. So let’s correct that.”

🧐 What Wikipedia saw during election week in the U.S., and what we’re doing next — “To help meet this goal, we hope to invest in resources that we can share with international Wikipedia communities that will help mitigate future disinformation risks on the sites. We’re also looking to bring together administrators from different language Wikipedias for a global forum on disinformation. Together, we aim to build more tools to support our volunteer editors, and to combat disinformation.”


Quotation-as-title by the Goncourt Brothers. Image from top-linked post.

You can’t tech your way out of problems the tech didn’t create

The Electronic Frontier Foundation (EFF) is a US-based non-profit that exists to defend civil liberties in the digital world. They’ve been around for 30 years, and I support them financially on a monthly basis.

In this article by Corynne McSherry, EFF’s Legal Director, she outlines the futility of ‘Big Social’ attempting to do content moderation at scale:

[C]ontent moderation is a fundamentally broken system. It is inconsistent and confusing, and as layer upon layer of policy is added to a system that employs both human moderators and automated technologies, it is increasingly error-prone. Even well-meaning efforts to control misinformation inevitably end up silencing a range of dissenting voices and hindering the ability to challenge ingrained systems of oppression.

CORYNNE MCSHERRY, CONTENT MODERATION AND THE U.S. ELECTION: WHAT TO ASK, WHAT TO DEMAND (EFF)

Ultimately, these monolithic social networks have a problem with false positives: it’s in their interests to be overzealous, as they’re increasingly under the watchful eye of regulators and governments.

We have been watching closely as Facebook, YouTube, and Twitter, while disclaiming any interest in being “the arbiters of truth,” have all adjusted their policies over the past several months to try arbitrate lies—or at least flag them. And we’re worried, especially when we look abroad. Already this year, an attempt by Facebook to counter election misinformation targeting Tunisia, Togo, Côte d’Ivoire, and seven other African countries resulted in the accidental removal of accounts belonging to dozens of Tunisian journalists and activists, some of whom had used the platform during the country’s 2011 revolution. While some of those users’ accounts were restored, others—mostly belonging to artists—were not.

CORYNNE MCSHERRY, CONTENT MODERATION AND THE U.S. ELECTION: WHAT TO ASK, WHAT TO DEMAND (EFF)

McSherry’s analysis is spot-on: it’s the algorithms that are a problem here. Social networks employ these algorithms because of their size and structure, and because of the cost of human-based content moderation. After all, these are companies with shareholders.

Algorithms used by Facebook’s Newsfeed or Twitter’s timeline make decisions about which news items, ads, and user-generated content to promote and which to hide. That kind of curation can play an amplifying role for some types of incendiary content, despite the efforts of platforms like Facebook to tweak their algorithms to “disincentivize” or “downrank” it. Features designed to help people find content they’ll like can too easily funnel them into a rabbit hole of disinformation.

CORYNNE MCSHERRY, CONTENT MODERATION AND THE U.S. ELECTION: WHAT TO ASK, WHAT TO DEMAND (EFF)
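To make the idea of “downranking” concrete, here’s a toy sketch of a feed-ranking function. Everything in it — the field names, the engagement formula, the 0.2 multiplier — is invented for illustration; no real platform’s algorithm is this simple, and this is not how Facebook or Twitter actually score content.

```python
# Toy feed-ranking sketch: score posts by engagement, then apply a
# "downrank" multiplier to items flagged (e.g. by a classifier or a
# fact-checker) as likely misinformation. Flagged posts aren't removed,
# just pushed down the feed -- which is why high-engagement incendiary
# content can still surface if the multiplier is too gentle.

def rank_feed(posts, downrank_factor=0.2):
    """Return posts sorted by score, highest first."""
    def score(post):
        # Invented formula: shares count double because they spread content.
        engagement = post["likes"] + 2 * post["shares"]
        if post.get("flagged"):
            engagement *= downrank_factor
        return engagement
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "a", "likes": 10,  "shares": 1,  "flagged": False},
    {"id": "b", "likes": 100, "shares": 50, "flagged": True},  # viral but flagged
    {"id": "c", "likes": 30,  "shares": 6,  "flagged": False},
]

# Post "b" has by far the highest raw engagement (200), but after
# downranking (200 * 0.2 = 40) it drops below "c" (42).
print([p["id"] for p in rank_feed(posts)])  # → ['c', 'b', 'a']
```

The design tension McSherry points at is visible even here: the whole system hinges on who sets the `flagged` bit and how, and a false positive quietly buries legitimate speech with no visible trace.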

She includes useful questions for social networks to answer about content moderation:

  • Is the approach narrowly tailored or a categorical ban?
  • Does it empower users?
  • Is it transparent?
  • Is the policy consistent with human rights principles?

But, ultimately…

You can’t tech your way out of problems the tech didn’t create. And even where content moderation has a role to play, history tells us to be wary. Content moderation at scale is impossible to do perfectly, and nearly impossible to do well, even under the most transparent, sensible, and fair conditions.

CORYNNE MCSHERRY, CONTENT MODERATION AND THE U.S. ELECTION: WHAT TO ASK, WHAT TO DEMAND (EFF)

I’m so pleased that I don’t use Facebook products, and that I only use Twitter these days as a place to publish links to my writing.

Instead, I’m much happier on the Fediverse, a place where if you don’t like the content moderation approach of the instance you’re on, you can take your digital knapsack and decide to call another place home. You can find me here (for now!).