Tag: Facebook (page 2 of 13)

You can’t tech your way out of problems the tech didn’t create

The Electronic Frontier Foundation (EFF) is a US-based non-profit that exists to defend civil liberties in the digital world. They’ve been around for 30 years, and I support them financially on a monthly basis.

In this article, Corynne McSherry, EFF’s Legal Director, outlines the futility of attempts by ‘Big Social’ to do content moderation at scale:

[C]ontent moderation is a fundamentally broken system. It is inconsistent and confusing, and as layer upon layer of policy is added to a system that employs both human moderators and automated technologies, it is increasingly error-prone. Even well-meaning efforts to control misinformation inevitably end up silencing a range of dissenting voices and hindering the ability to challenge ingrained systems of oppression.

CORYNNE MCSHERRY, CONTENT MODERATION AND THE U.S. ELECTION: WHAT TO ASK, WHAT TO DEMAND (EFF)

Ultimately, these monolithic social networks have a false-positive problem. It’s in their interests to be over-zealous, as they’re increasingly under the watchful eye of regulators and governments.

We have been watching closely as Facebook, YouTube, and Twitter, while disclaiming any interest in being “the arbiters of truth,” have all adjusted their policies over the past several months to try to arbitrate lies—or at least flag them. And we’re worried, especially when we look abroad. Already this year, an attempt by Facebook to counter election misinformation targeting Tunisia, Togo, Côte d’Ivoire, and seven other African countries resulted in the accidental removal of accounts belonging to dozens of Tunisian journalists and activists, some of whom had used the platform during the country’s 2011 revolution. While some of those users’ accounts were restored, others—mostly belonging to artists—were not.

CORYNNE MCSHERRY, CONTENT MODERATION AND THE U.S. ELECTION: WHAT TO ASK, WHAT TO DEMAND (EFF)

McSherry’s analysis is spot-on: it’s the algorithms that are the problem here. Social networks employ these algorithms because of their size and structure, and because of the cost of human-based content moderation. After all, these are companies with shareholders.

Algorithms used by Facebook’s Newsfeed or Twitter’s timeline make decisions about which news items, ads, and user-generated content to promote and which to hide. That kind of curation can play an amplifying role for some types of incendiary content, despite the efforts of platforms like Facebook to tweak their algorithms to “disincentivize” or “downrank” it. Features designed to help people find content they’ll like can too easily funnel them into a rabbit hole of disinformation.

CORYNNE MCSHERRY, CONTENT MODERATION AND THE U.S. ELECTION: WHAT TO ASK, WHAT TO DEMAND (EFF)

She includes useful questions for social networks to answer about content moderation:

  • Is the approach narrowly tailored or a categorical ban?
  • Does it empower users?
  • Is it transparent?
  • Is the policy consistent with human rights principles?

But, ultimately…

You can’t tech your way out of problems the tech didn’t create. And even where content moderation has a role to play, history tells us to be wary. Content moderation at scale is impossible to do perfectly, and nearly impossible to do well, even under the most transparent, sensible, and fair conditions.

CORYNNE MCSHERRY, CONTENT MODERATION AND THE U.S. ELECTION: WHAT TO ASK, WHAT TO DEMAND (EFF)

I’m so pleased that I don’t use Facebook products, and that I only use Twitter these days as a place to publish links to my writing.

Instead, I’m much happier on the Fediverse, a place where if you don’t like the content moderation approach of the instance you’re on, you can take your digital knapsack and decide to call another place home. You can find me here (for now!).

Nothing will ever be attempted, if all possible objections must be first overcome

Facebook Accused of Watching Instagram Users Through Cameras (The Verge)

In the complaint filed Thursday in federal court in San Francisco, New Jersey Instagram user Brittany Conditi contends the app’s use of the camera is intentional and done for the purpose of collecting “lucrative and valuable data on its users that it would not otherwise have access to.”


Facebook Has Been a Disaster for the World (The New York Times)

Facebook has been incredibly lucrative for its founder, Mark Zuckerberg, who ranks among the wealthiest men in the world. But it’s been a disaster for the world itself, a powerful vector for paranoia, propaganda and conspiracy-theorizing as well as authoritarian crackdowns and vicious attacks on the free press. Wherever it goes, chaos and destabilization follow.


Kim Kardashian West joins Facebook and Instagram boycott (BBC News)

“I can’t sit by and stay silent while these platforms continue to allow the spreading of hate, propaganda and misinformation – created by groups to sow division and split America apart,” Kardashian West said.


Quotation-as-title from Dr Johnson.

One nation under Zuck

This image, from Grayson Perry, is incredible. As he points out in the accompanying article, he’s chosen the US due to an upcoming series of his, but geographically this could be anywhere, as culture wars these days happen mainly online.

I’ve added the emphasis in the quotation below:

When we experience a background hum of unfocused emotion, be it anxiety, sadness, fear, anger, we unconsciously look for something to attach it to. Social media is brilliant at supplying us with issues to which attach our free-floating feelings. We often look for nice, preformed boxes into which we can dump our inchoate feelings, we crave certainty. Social media constantly offers up neat solutions for our messy feelings, whether it be God, guns, Greta or gender identity.

In a battle-torn landscape governed by zeroes and ones, nuance, compromise and empathy are the first casualties. If I were to sum up the online culture war in one word it would be “diaphobia”, a term coined by the psychiatrist RD Laing meaning “fear of being influenced by other people”, the opposite of dialogue. Our ever-present underlying historical and enculturated emotions will nudge us to cherrypick and polish the nuggets of information that support a stance that may have been in our bodies from childhood. Once we have taken sides, the algorithms will supply us with a stream of content to entrench and confirm our beliefs.

Grayson Perry, Be it on God, guns or Greta, social media offers neat solutions for our messy feelings (The Guardian)