Tag: democracy

Cory Doctorow on the corruption at the heart of Facebook

I like Cory Doctorow. He’s a gifted communicator who wears his heart on his sleeve. In this article, he talks about Facebook and how what it’s wrought is a result of the corruption at its very heart.

It’s great that the privacy-matters message is finally reaching a wider audience, and it’s exciting to think that we’re approaching a tipping point for indifference to privacy and surveillance.

But while the acknowledgment of the problem of Big Tech is most welcome, I am worried that the diagnosis is wrong.

The problem is that we’re confusing automated persuasion with automated targeting. Laughable lies about Brexit, Mexican rapists, and creeping Sharia law didn’t convince otherwise sensible people that up was down and the sky was green.

Rather, the sophisticated targeting systems available through Facebook, Google, Twitter, and other Big Tech ad platforms made it easy to find the racist, xenophobic, fearful, angry people who wanted to believe that foreigners were destroying their country while being bankrolled by George Soros.

So, for example, people seem to think that Facebook advertising caused people to vote for Trump, as if they were going to vote for someone else and then changed their minds as a direct result of viewing ads. That’s not how it works.

Companies such as Cambridge Analytica might claim that they can rig elections and change people’s minds, but they’re not actually that sophisticated.

Cambridge Analytica are like stage mentalists: they’re doing something labor-intensive and pretending that it’s something supernatural. A stage mentalist will train for years to learn to quickly memorize a deck of cards and then claim that they can name your card thanks to their psychic powers. You never see the unglamorous, unimpressive memorization practice. Cambridge Analytica uses Facebook to find racist jerks and tell them to vote for Trump and then they claim that they’ve discovered a mystical way to get otherwise sensible people to vote for maniacs.

This isn’t to say that persuasion is impossible. Automated disinformation campaigns can flood the channel with contradictory, seemingly plausible accounts for the current state of affairs, making it hard for a casual observer to make sense of events. Long-term repetition of a consistent narrative, even a manifestly unhinged one, can create doubt and find adherents – think of climate change denial, or George Soros conspiracies, or the anti-vaccine movement.

These are long, slow processes, though, that make tiny changes in public opinion over the course of years, and they work best when there are other conditions that support them – for example, fascist, xenophobic, and nativist movements that are the handmaidens of austerity and privation. When you don’t have enough for a long time, you’re ripe for messages blaming your neighbors for having deprived you of your fair share.

Advertising and influencing works best when you provide a message that people already agree with in a way that they can easily share with others. The ‘long, slow processes’ that Doctorow refers to have been practised offline as well (think of Nazi propaganda, for example). Dark adverts on Facebook are tapping into feelings and reactions that aren’t peculiar to the digital world.

Facebook has thrived by providing ways for people to connect and communicate with one another. Unfortunately, because they’re so focused on profit over people, they’ve done a spectacularly bad job at making sure that the spaces in which people connect are healthy spaces that respect democracy.

There’s an old-fashioned word for this: corruption. In corrupt systems, a few bad actors cost everyone else billions in order to bring in millions – the savings a factory can realize from dumping pollution in the water supply are much smaller than the costs we all bear from being poisoned by effluent. But the costs are widely diffused while the gains are tightly concentrated, so the beneficiaries of corruption can always outspend their victims to stay clear.

Facebook doesn’t have a mind-control problem, it has a corruption problem. Cambridge Analytica didn’t convince decent people to become racists; they convinced racists to become voters.

That last phrase is right on the money.

Source: Locus magazine

Platform censorship and the threat to democracy

TorrentFreak reports that Science Hub (commonly referred to as ‘Sci-Hub’) has had its account with Cloudflare terminated. Sci-Hub is sometimes known as ‘the Pirate Bay of Science’ as, in the words of Wikipedia, it “bypasses publisher paywalls by allowing access through educational institution proxies”:

Cloudflare’s actions are significant because the company previously protested a similar order. When the RIAA used the permanent injunction in the MP3Skull case to compel Cloudflare to disconnect the site, the CDN provider refused.

The RIAA argued that Cloudflare was operating “in active concert or participation” with the pirates. The CDN provider objected, but the court eventually ordered Cloudflare to take action, although it did not rule on the “active concert or participation” part.

In the Sci-Hub case “active concert or participation” is also a requirement for the injunction to apply. While it specifically mentions ISPs and search engines, ACS Director Glenn Ruskin previously stressed that companies won’t be targeted for simply linking users to Sci-Hub.

Cloudflare is a Content Delivery Network (CDN), and I use their service on my own sites to improve web performance and security. They are the subject of some controversy at the moment, as the Electronic Frontier Foundation note:

From Cloudflare’s headline-making takedown of the Daily Stormer last autumn to YouTube’s summer restrictions on LGBTQ content, there’s been a surge in “voluntary” platform censorship. Companies—under pressure from lawmakers, shareholders, and the public alike—have ramped up restrictions on speech, adding new rules, adjusting their still-hidden algorithms and hiring more staff to moderate content. They have banned ads from certain sources and removed “offensive” but legal content.

It’s a big deal when intermediaries that large websites rely on for speed and security succumb to political pressure.

Given this history, we’re worried about how platforms are responding to new pressures. Not because there’s a slippery slope from judicious moderation to active censorship — but because we are already far down that slope. Regulation of our expression, thought, and association has already been ceded to unaccountable executives and enforced by minimally-trained, overworked staff, and hidden algorithms. Doubling down on this approach will not make it better. And yet, no amount of evidence has convinced the powers that be at major platforms like Facebook—or in governments around the world. Instead many, especially in policy circles, continue to push for companies to—magically and at scale—perfectly differentiate between speech that should be protected and speech that should be erased.

We live in contentious times, which are setting the course for a digitally mediated future. For every positive development (such as GDPR), there’s stuff like this…

Sources: TorrentFreak / EFF

Facebook is under attack

This year is a time of reckoning for the world’s most popular social network. From their own website (which I’ll link to via archive.org because I don’t link to Facebook). Note the use of the passive voice:

Facebook was originally designed to connect friends and family — and it has excelled at that. But as unprecedented numbers of people channel their political energy through this medium, it’s being used in unforeseen ways with societal repercussions that were never anticipated.

It’s pretty amazing that a Facebook spokesperson is saying things like this:

I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t. That’s why we have a moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible.

What they are careful to do is to paint a picture of Facebook as somehow ‘neutral’ and being ‘hijacked’ by bad actors. This isn’t actually the case.

As an article in The Guardian points out, executives at Facebook and Twitter aren’t exactly heavy users of their own platforms:

It is a pattern that holds true across the sector. For all the industry’s focus on “eating your own dog food”, the most diehard users of social media are rarely those sitting in a position of power.

These sites are designed to be addictive. So, just as drug dealers “don’t get high on their own supply”, those designing social networks know what they’re dealing with:

These addictions haven’t happened accidentally… Instead, they are a direct result of the intention of companies such as Facebook and Twitter to build “sticky” products, ones that we want to come back to over and over again. “The companies that are producing these products, the very large tech companies in particular, are producing them with the intent to hook. They’re doing their very best to ensure not that our wellbeing is preserved, but that we spend as much time on their products and on their programs and apps as possible. That’s their key goal: it’s not to make a product that people enjoy and therefore becomes profitable, but rather to make a product that people can’t stop using and therefore becomes profitable.”

The trouble is that this advertising-fuelled medium, which is built to be addictive, is the place where most people get their news these days. Facebook has realised that it has a problem in this regard, so it has made the decision to pass the buck to users. Instead of Facebook, or anyone else, deciding which news sources an individual should trust, it’s being left up to users.

While this sounds empowering and democratic, I can’t help but think it’s a bad move. As The Washington Post notes:

“They want to avoid making a judgment, but they are in a situation where you can’t avoid making a judgment,” said Jay Rosen, a journalism professor at New York University. “They are looking for a safe approach. But sometimes you can be in a situation where there is no safe route out.”

The article goes on to cite former Facebook executives who think that the problems are more than skin-deep:

They say that the changes the company is making are just tweaks when, in fact, the problems are a core feature of the Facebook product, said Sandy Parakilas, a former Facebook privacy operations manager.

“If they demote stories that get a lot of likes, but drive people toward posts that generate conversation, they may be driving people toward conversation that isn’t positive,” Parakilas said.

A final twist in the tale is that Rupert Murdoch, a guy who has no morals but certainly has a valid point here, has made a statement on all of this:

If Facebook wants to recognize ‘trusted’ publishers then it should pay those publishers a carriage fee similar to the model adopted by cable companies. The publishers are obviously enhancing the value and integrity of Facebook through their news and content but are not being adequately rewarded for those services. Carriage payments would have a minor impact on Facebook’s profits but a major impact on the prospects for publishers and journalists.

2018 is going to be an interesting year. If you want to quit Facebook and/or Twitter and be part of something better, why not join me on Mastodon via social.coop and help build Project MoodleNet?

Sources: Facebook newsroom / The Guardian / The Washington Post / News Corp

Social media short-circuits democracy

I’m wondering whether to delete all my social media accounts, or whether I should stay and fight. The trouble is that no technology is neutral; it always contains biases.

It’s interesting how the narrative has changed since the uprisings in Iran (2009) and Egypt (2011):

Because of the advent of social media, the story seemed to go, tyrants would fall and democracy would rule. Social media communications were supposed to translate into a political revolution, even though we don’t necessarily agree on what a positive revolution would look like. The process is overtly emotional: The outrage felt translates directly, thanks to the magic of social media, into a “rebellion” that becomes democratic governance.

But social media has not helped these revolutions turn into lasting democracies. Social media speaks directly to the most reactive, least reflective parts of our minds, demanding we pay attention even when our calmer selves might tell us not to. It is no surprise that this form of media is especially effective at promoting hate, white supremacy, and public humiliation.

In my new job at Moodle, I’m tasked with leading work around a new social network for educators focused on sharing Open Educational Resources and professional development. I think we’ll start to see more social networks based around content than people (think Pinterest rather than Facebook).

Source: Motherboard