Tag: social media

There is no exercise of the intellect which is not, in the final analysis, useless

A quotation from a short story in Jorge Luis Borges’ Labyrinths provides the title for today’s article. I want to dig into the work of danah boyd and the transcript of a talk she gave recently, entitled Agnotology and Epistemological Fragmentation. It helps us understand what’s going on behind the seemingly benign fascias of social networks and news media outlets.

She explains the title of her talk:

Epistemology is the term that describes how we know what we know. Most people who think about knowledge think about the processes of obtaining it. Ignorance is often assumed to be not-yet-knowledgeable. But what if ignorance is strategically manufactured? What if the tools of knowledge production are perverted to enable ignorance? In 1995, Robert Proctor and Iain Boal coined the term “agnotology” to describe the strategic and purposeful production of ignorance. In an edited volume called Agnotology, Proctor and Londa Schiebinger collect essays detailing how agnotology is achieved. Whether we’re talking about the erasure of history or the undoing of scientific knowledge, agnotology is a tool of oppression by the powerful.

danah boyd

Having already questioned ‘media literacy’ the way it’s currently taught through educational institutions and libraries, boyd explains how the alt-right are streets ahead of educators when it comes to pushing their agenda:

One of the best ways to seed agnotology is to make sure that doubtful and conspiratorial content is easier to reach than scientific material. And then to make sure that what scientific information is available, is undermined. One tactic is to exploit “data voids.” These are areas within a search ecosystem where there’s no relevant data; those who want to manipulate media purposefully exploit these. Breaking news is one example of this.

[…]

Today’s drumbeat happens online. The goal is no longer just to go straight to the news media. It’s to first create a world of content and then to push the term through to the news media at the right time so that people search for that term and receive specific content. Terms like caravan, incel, crisis actor. By exploiting the data void, or the lack of viable information, media manipulators can help fragment knowledge and seed doubt.

danah boyd

Harold Jarche uses McLuhan’s tetrads to understand this visually, commenting: “This is an information war. Understanding this is the first step in fighting for democracy.”

Harold Jarche on Agnotology

We can teach children sitting in classrooms all day about checking URLs and the provenance of the source, but how relevant is that when they’re using YouTube as their primary search engine? Returning to danah boyd:

YouTube has great scientific videos about the value of vaccination, but countless anti-vaxxers have systematically trained YouTube to make sure that people who watch the Center for Disease Control and Prevention’s videos also watch videos asking questions about vaccinations or videos of parents who are talking emotionally about what they believe to be the result of vaccination. They comment on both of these videos, they watch them together, they link them together. This is the structural manipulation of media.

danah boyd

It’s not just the new and the novel. Even things that are relatively obvious to those of us who have grown up as adults online are confusing to older generations. As this article by BuzzFeed News reporter Craig Silverman points out, conspiracy-believing retirees have disproportionate influence on our democratic processes:

Older people are also more likely to vote and to be politically active in other ways, such as making political contributions. They are wealthier and therefore wield tremendous economic power and all of the influence that comes with it. With more and more older people going online, and future 65-plus generations already there, the online behavior of older people, as well as their rising power, is incredibly important — yet often ignored.

Craig Silverman

So when David Buckingham asks ‘Who needs digital literacy?’ I think the answer is everyone. Having been a fan of his earlier work, it saddens me to realise that he hasn’t kept up with the networked era:

These days, I find the notion of digital literacy much less useful – and to some extent, positively misleading. The fundamental problem is that the idea is defined by technology itself. It makes little sense to distinguish between texts (or media) on the grounds of whether they are analogue or digital: almost all media (including print media) involve the use of digital technology at some stage or other. Fake news and disinformation operate as much in old, analogue media (like newspapers) as they do online. Meanwhile, news organisations based in old media make extensive and increasing use of online platforms. The boundaries between digital and analogue may still be significant in some situations, but they are becoming ever more blurred.

David Buckingham

Actually, as Howard Rheingold pointed out a number of years ago in Net Smart, and as boyd has done in her own work, networks change everything. You can’t seriously compare pre-networked and post-networked cultures except by way of contrast.

Buckingham suggests that, seeing as the (UK) National Literacy Trust are on the case, we “don’t need to reinvent the wheel”. The trouble is that the wheel has already been reinvented, and lots of people either didn’t notice, or are acting as though it hasn’t been.

There’s a related article by Anna McKie in the THE entitled Teaching intelligence: digital literacy in the ‘alternative facts’ era which, unfortunately, is now behind a paywall. It reports on a special issue of the journal Teaching in Higher Education where the editors have brought together papers on the contribution made by Higher Education to expertise and knowledge in the age of ‘alternative facts’:

[S]ocial media has changed the dynamic of information in our society, [editor] Professor Harrison added. “We’ve moved away from the idea of experts who assess information to one where the validity of a statement is based on the likes, retweets and shares it gets, rather than whether the information is valid.”

The first task of universities is to go back to basics and “help students to understand the difference between knowledge and information, and how knowledge is created, which is separate to how information is created”, Professor Harrison said. “Within [each] discipline, what are the skills needed to assess that?”

Many assume that schools or colleges are teaching this, but that is not the case, he added. “Academics should also be wary of the extent to which they themselves understand the new paradigms of knowledge creation,” Professor Harrison warned.

Anna McKie

One of the reasons I decided not to go into academia is that, certain notable exceptions aside, the focus is on explaining rather than changing. Or, to finish with another quotation, this time from Karl Marx, “Philosophers have hitherto only interpreted the world in various ways; the point is to change it.”


Also check out:

Rules for Online Sanity

It’s funny: we tell kids not to be mean to one another, and then immediately jump on social media to call people out and divide ourselves into various camps.

This list by Sean Blanda has been shared in several places, and rightly so. I’ve highlighted what I consider to be the top three.

I’ve started thinking about what are the “new rules” for navigating the online world? If you could get everyone to agree (implicitly or explicitly) to a set of rules, what would they be? Below is an early attempt at a “Rules for Online Sanity” list. I’d love to hear what you think I missed.

  • Reward your “enemies” when they agree with you, exhibit good behavior, or come around on an issue. Otherwise they have no incentive to ever meet you halfway.
  • Accept it when people apologize. People should be allowed to work through ideas and opinions online. And that can result in some messy outcomes. Be forgiving.
  • Sometimes people have differing opinions because they considered something you didn’t.
  • Take a second.
  • There’s always more to the story. You probably don’t know the full context of whatever you’re reading or watching.
  • If an online space makes more money the more time you spend on it, use sparingly.
  • Judge people on their actions, not their words. Don’t get outraged over what people said. Get outraged at what they actually do.
  • Try to give people the benefit of the doubt, be charitable in how you read people’s ideas.
  • Don’t treat one bad actor as representative of whatever group or demographic they belong to.
  • Create the kind of communities and ideas you want people to talk about.
  • Sometimes, there are bad actors that don’t play by the rules. They should be shunned, castigated, and banned.
  • You don’t always have the moral high ground. You are not always right.
  • Block and mute quickly. Worry about the bubbles that creates later.
  • There but for the grace of God go you.

Oh, and about “creating communities”: why not support Thought Shrapnel via Patreon and comment on these posts along with people you already know you have something in common with?

Source: The Discourse (via Read Write Collect)

Internalising the logic of social media

A few days ago, Twitter posted a photo of an early sketch that founder Jack Dorsey made for the initial user interface. It included settings to inform a user’s followers that they might not respond immediately because they were in the park or busy reading.

A day later, an article in The New Yorker about social media used a stark caption for its header image:

Social-media platforms know what you’re seeing, and they know how you acted in the immediate aftermath of seeing it, and they can decide what you will see next.

There’s no doubt in my mind that we’re like slow-boiled frogs when it comes to creeping dystopia. It’s not happening through the totalitarian lens of the 20th century, but instead in a much more problematic way.

One of the more insidious aspects of [social media’s business] model is the extent to which we, as social-media users, replicate its logic at the level of our own activity: we perform market analysis of our own utterances, calculating the reaction a particular post will generate and adjusting our output accordingly. Negative emotions like outrage and contempt and anxiety tend to drive significantly more engagement than positive ones.

No wonder Twitter’s such an angry place these days.

The article quotes James Bridle’s book New Dark Age, a book which is sitting waiting for me on my shelf when I get back home from this work trip.

We find ourselves today connected to vast repositories of knowledge and yet we have not learned to think. In fact, the opposite is true: that which was intended to enlighten the world in practice darkens it. The abundance of information and the plurality of worldviews now accessible to us through the internet are not producing a coherent consensus reality, but one riven by fundamentalist insistence on simplistic narratives, conspiracy theories, and post-factual politics. It is on this contradiction that the idea of a new dark age turns: an age in which the value we have placed upon knowledge is destroyed by the abundance of that profitable commodity, and in which we look about ourselves in search of new ways to understand the world.

This resonates with a quotation I posted to Thought Shrapnel this week from Jon Ronson’s So You’ve Been Publicly Shamed about how we’re actually creating a more conservative environment, despite thinking we’re all ‘non-conformist’.

To be alive and online in our time is to feel at once incensed and stultified by the onrush of information, helpless against the rising tide of bad news and worse opinions. Nobody understands anything: not the global economy governed by the unknowable whims of algorithms, not our increasingly volatile and fragile political systems, not the implications of the impending climate catastrophe that forms the backdrop of it all. We have created a world that defies our capacity to understand it—though not, of course, the capacity of a small number of people to profit from it. Deleting your social-media accounts might be a means of making it more bearable, and even of maintaining your sanity. But one way or another, the world being what it is, we are going to have to learn to live in it.

Last week, at the ALT conference, those in the audience were asked by the speaker to ‘stand up’ if they felt imposter syndrome. I didn’t get to my feet, but it wasn’t an act of arrogance or hubris. I may have no idea what I’m doing, but I’m pretty sure no-one else does either.

Source: The New Yorker

On living in public

In this post, Austin Kleon, backpedalling a little from the approach he seemed to promote in Show Your Work!, talks about the problems we all face with ‘living in public’.

It seems ridiculous to say, but 2013, the year I wrote the book, was a simpler time. Social media seemed much more benign to me. Back then, the worst I felt social media did was waste your time. Now, the worst social media does is cripple democracy and ruin your soul.

Kleon quotes Warren Ellis, who writes one of my favourite newsletters (his blog is pretty good, too):

You don’t have to live in public on the internet if you don’t want to. Even if you’re a public figure, or micro-famous like me. I don’t follow anyone on my public Instagram account. No shade on those who follow me there, I’m glad you give me your time – but I need to be in my own space to get my shit done. You want a “hack” for handling the internet? Create private social media accounts, follow who you want and sit back and let your bespoke media channels flow to you. These are tools, not requirements. Don’t let them make you miserable. Tune them until they bring you pleasure.

In May 2017, after being on Twitter for over a decade, I deleted my Twitter history, and now delete tweets on a weekly basis. These days, I hang out on a social network that I co-own called social.coop, which is powered by a federated, decentralised service called Mastodon.

I still publish my work, including Thought Shrapnel posts, to Twitter, LinkedIn, etc. It’s just not where I spend most of my time. On balance, I’m happier for it.

Source: Austin Kleon

The death of the newsfeed (is much exaggerated)

Benedict Evans is a venture capitalist who focuses on technology companies. He’s a smart guy with some important insights, and I thought his recent post about the ‘death of the newsfeed’ on social networks was particularly useful.

He points out that it’s pretty inevitable that the average person will, over the course of a few years, add a few hundred ‘friends’ to their connections on any given social network. Let’s say you’re connected with 300 people, and they all share five things each day. That’s 1,500 things you’ll be bombarded with, unless the social network does something about it.

This overload means it now makes little sense to ask for the ‘chronological feed’ back. If you have 1,500 or 3,000 items a day, then the chronological feed is actually just the items you can be bothered to scroll through before giving up, which can only be 10% or 20% of what’s actually there. This will be sorted by no logical order at all except whether your friends happened to post them within the last hour. It’s not so much chronological in any useful sense as a random sample, where the randomizer is simply whatever time you yourself happen to open the app. ’What did any of the 300 people that I friended in the last 5 years post between 16:32 and 17:03?’ Meanwhile, giving us detailed manual controls and filters makes little more sense – the entire history of the tech industry tells us that actual normal people would never use them, even if they worked. People don’t file.
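Evans’ arithmetic lends itself to a quick toy simulation (mine, not his; the function name and all the numbers are purely illustrative, borrowed from the 300-friends-times-five-posts example above). If posts land throughout the day and you only ever scroll the most recent couple of hundred when you happen to open the app, the ‘chronological feed’ you see really is just a slice determined by your arrival time:

```python
import random

def chronological_sample(posts_per_day=1500, items_scrolled=200):
    """Toy model of Evans' point: 300 friends x 5 posts/day = 1,500 items,
    but you only ever scroll the most recent `items_scrolled` of them."""
    # Each post lands at a random minute of the day.
    posts = sorted(random.uniform(0, 24 * 60) for _ in range(posts_per_day))
    # You open the app at an arbitrary moment...
    open_time = random.uniform(0, 24 * 60)
    # ...and see only the newest `items_scrolled` posts made so far.
    seen = [t for t in posts if t <= open_time][-items_scrolled:]
    # Fraction of the day's posts you actually saw.
    return len(seen) / posts_per_day

random.seed(42)
fractions = [chronological_sample() for _ in range(100)]
print(f"average fraction seen: {sum(fractions) / len(fractions):.0%}")
```

With these numbers you can never see more than 200/1,500 ≈ 13% of a day’s posts, and which 13% depends entirely on when you opened the app — which is the sense in which the chronological feed is a random sample rather than a record.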

So we end up with algorithmic feeds, which is an attempt by social networks to ensure that you see the stuff that you deem important. It is, of course, an almost impossible mission.

[T]here are a bunch of problems around getting the algorithmic newsfeed sample ‘right’, most of which have been discussed at length in the last few years. There are lots of incentives for people (Russians, game developers) to try to manipulate the feed. Using signals of what people seem to want to see risks over-fitting, circularity and filter bubbles. People’s desires change, and they get bored of things, so Facebook has to keep changing the mix to try to reflect that, and this has made it an unreliable partner for everyone from Zynga to newspapers. Facebook has to make subjective judgements about what it seems that people want, and about what metrics seem to capture that, and none of this is static or even in principle perfectible. Facebook surfs user behaviour.

Evans then goes on to raise the problem that what you want to see may be different from what your friends want you to see. So people solve the problem of algorithmic feeds not showing them what they really want by using messaging apps such as WhatsApp and Telegram to interact individually with people or small groups.

The problem with that, though?

The catch is that though these systems look like they reduce sharing overload, you really want group chats. And lots of groups. And when you have 10 WhatsApp groups with 50 people in each, then people will share to them pretty freely. And then you think ‘maybe there should be a screen with a feed of the new posts in all of my groups. You could call it a “news feed”. And maybe it should get some intelligence, to show the posts you care about most…’

So, to Evans’ mind (and I’m tempted to agree with him), we’re in a never-ending spiral. The only way I can see out of it is user education, particularly around owning one’s own data and IndieWeb approaches.

Source: Benedict Evans

Social internet vs social media

It’s good to see Cal Newport, whose book Deep Work I found unexpectedly great last year, add a bit more nuance to his position on social media:

The young progressives grew up in a time when platform monopolies like Facebook were so dominant that they seemed inextricably intertwined into the fabric of the internet. To criticize social media, therefore, was to criticize the internet’s general ability to do useful things like connect people, spread information, and support activism and expression.

The older progressives, however, remember the internet before the platform monopolies. They were concerned to observe a small number of companies attempt to consolidate much of the internet into their for-profit, walled gardens.

To them, social media is not the internet. It was instead a force that was co-opting the internet — including the powerful capabilities listed above — in ways that would almost certainly lead to trouble.

Newport has started talking about the difference between ‘social media’ and the ‘social internet’:

The social internet describes the general ways in which the global communication network and open protocols known as “the internet” enable good things like connecting people, spreading information, and supporting expression and activism.

Social media, by contrast, describes the attempt to privatize these capabilities by large companies within the newly emerged algorithmic attention economy, a particularly virulent strain of the attention sector that leverages personal data and sophisticated algorithms to ruthlessly siphon users’ cognitive capital.

If you’d asked people in 2005, they would have said that there was no way that people would leave MySpace in favour of a different platform.

People like Facebook. But if you could offer them a similar alternative that stripped away the most unsavory elements of Zuckerberg’s empire (perhaps funded by a Wikipedia-style nonprofit collective, or a modest subscription fee), many would happily jump ship.

Indeed.

Following up with another post this week, Newport writes:

My argument is that you can embrace the social internet without having to become a “gadget” inside the algorithmic attention economy machinations of the social media conglomerates. As noted previously, I think this is the right answer for those who are fed up with the dehumanizing aspects of social media, but are reluctant to give up altogether on the potential of the internet to bring people together.

He suggests several ways for this to happen:

  • Approach #1: The Slow Social Media Philosophy
  • Approach #2: Own Your Own Domain

This is, in effect, the IndieWeb approach. However, I still think that Newport and others who work in universities may be a special case. As Austin Kleon notes, there are already built-in ways for your career to advance in academia. Others have to show their work…

What I don’t see being discussed is that, as we collectively mature in our use of social media, we’re likely to use different networks for different purposes. Facebook, LinkedIn, and the like try to force us into a single online identity. It’s OK to look and act differently when you’re around different people in different environments.

Source: Cal Newport (On Social Media and Its Discontents / Beyond #DeleteFacebook: More Thoughts on Embracing the Social Internet Over Social Media)

Going deep

I don’t think the right term for this is ‘mobile blindness’ but Seth Godin’s analogy is nevertheless instructive.

He talks about the shift over the last 20 years or so from getting our news and information primarily via books and newspapers, to getting it via desktop computers, and now predominantly through our mobile devices. Things become bite-sized, and our attention field is wide but shallow.

Photokeratitis (snow blindness) happens when there’s too much ultraviolet–when the fuel for our eyes comes in too strong and we can’t absorb it all. Something similar is happening to each of us, to our entire culture, as a result of the tsunami of noise vying for our attention.

It’s possible you can find an edge by going even faster and focusing even more on breadth at the surface. But it’s far more satisfying and highly leveraged to go the other way instead. Even if it’s just for a few hours a day.

If you care about something, consider taking a moment to slow down and understand it. And if you don’t care, no need to even bother with the surface.

This isn’t a technology issue, it’s an attention issue. Yes, it’s possible to argue that these devices are designed to capture your attention. But we all still have a choice.

You can safely ignore what doesn’t align with your goals in life. First, of course, you have to have some goals…

Source: Seth Godin

Life in the outrage economy

Rafael Behr nails it when he says we live in an ‘outrage economy’:

Rage is contagious. It spreads from one sweaty digital crevice to the next, like a fungal infection. It itches like one too. When sitting at the keyboard, it is difficult to perceive wrongness without wanting to scratch it with a caustic retort. But that provides no sustained relief. One side’s scratch is the other side’s itch.

I’m just back from watching Star Wars: The Last Jedi. It’s an incredible film with plenty of social commentary. The Rebel Alliance is outraged at what the First Order is doing, just as we’re outraged with the order of our society, created by elites.

An outrage economy is lucrative only in an outraged society. Once stoked, the anger becomes self-sustaining, addictive. There is a physiological gratification in rage – a primitive adrenal response that overrides more sophisticated emotions. It can be perversely comforting. Politicised anger feels virtuous. It is the kick of moral purpose, but conveniently stripped of any obligation to consider nuance or alternative perspectives. Hatred of a proposition, or a party, removes interest in understanding why others like it. Self-righteous anger is an excuse not to even try to persuade. St Augustine’s invitation to “love the sinner, hate the sin” does not have much purchase on Twitter.

Perhaps we need to ‘use the force’ and come into a bit more balance, both individually and as a society. After all, more outrage just feeds the whole edifice from which the bad guys prosper.

Source: The Guardian

Sticks and stones

This article, originally given as a lecture, focuses on the worrying fact that we no longer seem to know how to disagree with one another. I’ve certainly witnessed this with the ‘hive mind’ on social networks, who are outraged if anyone so much as questions what keyboard warriors see as sacred tenets.

In other words, to disagree well you must first understand well. You have to read deeply, listen carefully, watch closely. You need to grant your adversary moral respect; give him the intellectual benefit of doubt; have sympathy for his motives and participate empathically with his line of reasoning. And you need to allow for the possibility that you might yet be persuaded of what he has to say.

I subscribe to the view that we should have strong opinions, weakly held. In other words, we should be neither embarrassed nor reticent to say what we think, but we should be ready to change our mind. This is why the EU ‘right to be forgotten’ legislation is so important. We grow up, emotionally, physically, and intellectually.

There’s no one answer. What’s clear is that the mis-education begins early. I was raised on the old-fashioned view that sticks and stones could break my bones but words would never hurt me. But today there’s a belief that since words can cause stress, and stress can have physiological effects, stressful words are tantamount to a form of violence. This is the age of protected feelings purchased at the cost of permanent infantilization.

Source: The New York Times