Tag: social media

Friday fermentations

I boiled the internet and this was what remained:

  • I Quit Social Media for a Year and Nothing Magical Happened (Josh C. Simmons) — “A lot of social media related aspects of my life are different now – I’m not sure they’re better, they’re just different, but I can confidently say that I prefer this normal to last year’s. There’s a bit of rain with all of the sunshine. I don’t see myself ever going back to social media. I don’t see the point of it, and after leaving for a while, and getting a good outside look, it seems like an abusive relationship – millions of workers generating data for tech-giants to crunch through and make money off of. I think that we tend to forget how we were getting along pretty well before social media – not everything was idyllic and better, but it was fine.”
  • Face recognition, bad people and bad data (Benedict Evans) — “My favourite example of what can go wrong here comes from a project for recognising cancer in photos of skin. The obvious problem is that you might not have an appropriate distribution of samples of skin in different tones. But another problem that can arise is that dermatologists tend to put rulers in the photo of cancer, for scale – so if all the examples of ‘cancer’ have a ruler and all the examples of ‘not-cancer’ do not, that might be a lot more statistically prominent than those small blemishes. You inadvertently built a ruler-recogniser instead of a cancer-recogniser.”
  • Would the Internet Be Healthier Without ‘Like’ Counts? (WIRED) ⁠— “Online, value is quantifiable. The worth of a person, idea, movement, meme, or tweet is often based on a tally of actions: likes, retweets, shares, followers, views, replies, claps, and swipes-up, among others. Each is an individual action. Together, though, they take on outsized meaning. A YouTube video with 100,000 views seems more valuable than one with 10, even though views—like nearly every form of online engagement—can be easily bought. It’s a paradoxical love affair. And it’s far from an accident.”
  • Are Platforms Commons? (On The Horizon) — “[W]hat if ecosystems were constructed so that they were governed by the participants, rather by the hypercapitalist strivings of the platform owners — such as Apple, Google, Amazon, Facebook — or the heavy-handed regulators? Is there a middle ground where the needs of the end user and those building, marketing, and shipping products and services can be balanced, and a fair share of the profits are distributed not just through common carrier laws but by the shared economics of a commons, and where the platform orchestrator gets a fair share, as well?”
  • Depression and anxiety threatened to kill my career. So I came clean about it (The Guardian) — “To my surprise, far from rejecting me, students stayed after class to tell me how sorry they were. They left condolence cards in my mailbox and sent emails to let me know they were praying for my family. They stopped by my office to check on me. Up to that point, I’d been so caught up in my despair that it never occurred to me that I might be worthy of concern and support. Being accepted despite my flaws touched me in ways that are hard to express.”
  • Absolute scale corrupts absolutely (apenwarr) — “Here’s what we’ve lost sight of, in a world where everything is Internet scale: most interactions should not be Internet scale. Most instances of most programs should be restricted to a small set of obviously trusted people. All those people, in all those foreign countries, should not be invited to read Equifax’s PII database in Argentina, no matter how stupid the password was. They shouldn’t even be able to connect to the database. They shouldn’t be able to see that it exists. It shouldn’t, in short, be on the Internet.”
  • The Automation Charade (Logic magazine) — “The problem is that the emphasis on technological factors alone, as though “disruptive innovation” comes from nowhere or is as natural as a cool breeze, casts an air of blameless inevitability over something that has deep roots in class conflict. The phrase “robots are taking our jobs” gives technology agency it doesn’t (yet?) possess, whereas “capitalists are making targeted investments in robots designed to weaken and replace human workers so they can get even richer” is less catchy but more accurate.”
  • The ambitious plan to reinvent how websites get their names (MIT Technology Review) — “The system would be based on blockchain technology, meaning it would be software that runs on a widely distributed network of computers. In theory, it would have no single point of failure and depend on no human-run organization that could be corrupted or co-opted.”
  • O whatever God or whatever ancestor that wins in the next life (The Main Event) — “And it begins to dawn on you that the stories were all myths and the epics were all narrated by the villains and the history books were written to rewrite the histories and that so much of what you thought defined excellence merely concealed grift.”
  • A Famous Argument Against Free Will Has Been Debunked (The Atlantic) — “In other words, people’s subjective experience of a decision—what Libet’s study seemed to suggest was just an illusion—appeared to match the actual moment their brains showed them making a decision.”
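The ‘ruler-recogniser’ story in the Benedict Evans link above is a classic case of a model latching onto a spurious cue instead of the signal you care about. Here’s a toy sketch of that failure mode; the data is entirely synthetic and the feature names (`blemish`, `ruler`) are invented for illustration:

```python
# A toy sketch (synthetic data, hypothetical feature names) of the
# "ruler-recogniser" failure mode: if a spurious cue correlates perfectly
# with the label in the training set, a model learns the cue rather than
# the weak real signal.
import math
import random

random.seed(42)

def example(cancer, ruler_correlated=True):
    # 'blemish' is the real but weak signal; 'ruler' is the spurious cue
    blemish = random.gauss(1.0 if cancer else 0.9, 0.5)
    ruler = float(cancer) if ruler_correlated else 0.0
    return (blemish, ruler), float(cancer)

train = [example(i % 2 == 0) for i in range(400)]
# Test set where photos no longer contain rulers
test = [example(i % 2 == 0, ruler_correlated=False) for i in range(400)]

# Plain logistic regression trained by stochastic gradient descent
w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(100):
    for (x1, x2), y in train:
        z = w[0] * x1 + w[1] * x2 + b
        p = 1 / (1 + math.exp(-max(min(z, 30), -30)))  # clamp to avoid overflow
        g = p - y
        w[0] -= lr * g * x1
        w[1] -= lr * g * x2
        b -= lr * g

def accuracy(data):
    correct = 0
    for (x1, x2), y in data:
        z = w[0] * x1 + w[1] * x2 + b
        correct += (z > 0) == (y == 1.0)
    return correct / len(data)

print(f"ruler weight {w[1]:.2f} vs blemish weight {w[0]:.2f}")
print(f"train accuracy {accuracy(train):.2f}, ruler-free test accuracy {accuracy(test):.2f}")
```

Because the ruler perfectly predicts the label during training, the model leans on it almost exclusively: training accuracy looks excellent, but once the rulers are removed, accuracy collapses towards chance.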

The best place to be is somewhere else?

So said Albarran Cabrera, except I added a cheeky question mark.

I have a theory. Not a grand, unifying theory of everything, but a theory nonetheless. I reckon that, despite common wisdom attributing the decline of blog comments to social media, something else is at least partly responsible.

Here’s an obvious point: there are more people online now than there were ten years ago. As a result, there’s more stuff being produced and shared and, because of that, more to miss out on. This is the root of the Fear Of Missing Out (or FOMO).

While I don’t think anyone realistically expects to keep up with everything produced online every day, people do expect to keep up with what their online friends are doing and thinking. As the number of people we follow in different places grows and grows, we have less and less time to engage meaningfully. Hence the rise of the retweet button.

Back in 2006, in the mists of internet time, Kathy Sierra wrote a great post entitled The myth of “keeping up”. Remember that this was before people were really using social networks such as Twitter. She talks about what we’re experiencing as ‘information anxiety’ and has some tips to combat it, which I think are still relevant:

  • Find the best aggregators
  • Get summaries
  • Cut the redundancy!
  • Unsubscribe to as many things as possible
  • Recognise that gossip and celebrity entertainment are black holes
  • Pick the categories you want for a balanced perspective, and include some from OUTSIDE your main field of interest
  • Be a LOT more realistic about what you’re likely to get to, and throw the rest out.
  • In anything you need to learn, find a person who can tell you what is:
    • Need to know
    • Should know
    • Nice to know
    • Edge case, only if it applies to you specifically
    • Useless

The interesting thing is that, done well, social media can actually be a massive force for good. It used to be set up for that, coming on the back of RSS. Now, it’s set up to drag you into arguments about politics and the kind of “black holes” of gossip and celebrity entertainment that Kathy mentions.

One of the problems is that we have a cult of ‘busy’, which people misattribute to a Protestant work ethic instead of rapacious late-stage capitalism. I’ve recently finished 24/7: Late Capitalism and the Ends of Sleep by Jonathan Crary, in which he makes this startlingly obvious, but nevertheless profound, point:

Because one’s bank account and one’s friendships can now be managed through identical machinic operations and gestures, there is a growing homogenization of what used to be entirely unrelated areas of experience.

Jonathan Crary

…and:

[S]ince no moment, place, or situation now exists in which one can not shop, consume, or exploit networked resources, there is a relentless incursion of the non-time of 24/7 into every aspect of social or personal life.

Jonathan Crary

In other words, you’re busy because of your smartphone, the apps you decide to install upon it, and the notifications that you then receive.

The solution to FOMO is to know who you are, what you care about, and the difference you’re trying to make in the world. As Gandhi famously said:

Happiness is when what you think, what you say, and what you do are in harmony.

Mahatma Gandhi

I’ve recently fallen into the trap of replying to work emails on my days off. It’s a slippery slope, as it sets up an expectation.

via xkcd

The same goes with social media, of course, except that it’s even more insidious, as an ‘action’ can just be liking or retweeting. It leads to slacktivism instead of making actual, meaningful change in the world.

People joke about life admin but one of those life admin tasks might be to write down (yes! with a pen and paper!) the things you’re trying to achieve with the ‘free’ apps that you’ve got installed. If you were being thorough, or teaching kids how to do this, perhaps you’d:

  1. List all of the perceived benefits
  2. List all of the perceived drawbacks
  3. List all of the ways that the people making the free app can make money

Tim Ferriss recently reposted an interview he did with Seth Godin back in 2016 about how he (Seth) manages his life. It’s an object lesson in focus: leading an intentional life without over-quantifying it. Oh, and he doesn’t use social media, other than auto-posting from his blog to Twitter.

For me, at least, because I spend so much time surrounded by technology, the decisions I make about tech are decisions I make about life. A couple of months ago I wrote a post entitled Change your launcher, change your life where I explained that even just changing how you access apps can make a material difference to your life.

So, to come full circle, the best place to be is actually where you are right now, not somewhere else. If you’re fully present in the situation (Tim Ferriss suggests taking three breaths), then ask yourself some hard questions about what success looks like for you, and perhaps whether what you say, what you think, and what you do are in harmony.

Only thoughts conceived while walking have any value

Philosopher and intrepid walker Friedrich Nietzsche is well known for today’s quotation-as-title. Fellow philosopher Immanuel Kant was a keen walker, too, along with Henry David Thoreau. There’s just something about big walks and big thoughts.

I spent a good part of yesterday walking about 30km, because I woke up wanting to see the sea. It has a calming effect on me, and my wife was at work with the car. Forty thousand steps later, I’d not only succeeded in my mission and taken the photo that accompanies this post, but had managed to think about all kinds of things that definitely wouldn’t have entered my mind had I stayed at home.

I want to focus the majority of this article on a single piece of writing by Craig Mod, whose walk across Japan I followed via SMS. Instead of sharing the details of his 620-mile, six-week trek on social media, he updated a server which then sent text messages (with photographs, so technically MMS) to everyone who’d signed up to receive them. Readers could reply, but he didn’t see their replies until he’d finished the walk and they’d been automatically curated into a book and sent to him.

Writing in WIRED, Mod talks of his “glorious, almost-disconnected walk” which was part experiment, part protest:

I have configured servers, written code, built web pages, helped design products used by millions of people. I am firmly in the camp that believes technology is generally bending the world in a positive direction. Yet, for me, Twitter foments neurosis, Facebook sadness, Google News a sense of foreboding. Instagram turns me covetous. All of them make me want to do it—whatever “it” may be—for the likes, the comments. I can’t help but feel that I am the worst version of myself, being performative on a very short, very depressing timeline. A timeline of seconds.

[…]

So, a month ago, when I started walking, I decided to conduct an experiment. Maybe even a protest. I wanted to test hypotheses. Our smartphones are incredible machines, and to throw them away entirely feels foolhardy. The idea was not to totally disconnect, but to test rational, metered uses of technology. I wanted to experience the walk as the walk, in all of its inevitably boring walkiness. To bask in serendipitous surrealism, not just as steps between reloading my streams. I wanted to experience time.

Craig Mod

I love this; it’s so inspiring. The most consecutive days I’ve ever walked is two, so I can’t really imagine what it must be like to walk for weeks at a time. It’s a form of meditation, I suppose, and a way to re-centre oneself.

The longness of an activity is important. Hours or even days don’t really cut it when it comes to long. “Long” begins with weeks. Weeks of day-after-day long walking days, 30- or 40-kilometer days. Days that leave you wilted and aware of all the neglect your joints and muscles have endured during the last decade of sedentary YouTubing.

[…]

In the context of a walk like this, “boredom” is a goal, the antipode of mindless connectivity, constant stimulation, anger and dissatisfaction. I put “boredom” in quotes because the boredom I’m talking about fosters a heightened sense of presence. To be “bored” is to be free of distraction.

Craig Mod

I find that when I walk for any period of time, certain songs start going through my head. Yesterday, for example, my brain put on repeat the song Good Enough by Dodgy from their album Free Peace Sweet. The time before it was We Can Do It from Jamiroquai’s latest album Automaton. I’m not sure where it comes from, although the beat does have something to do with my pace.

Walking by oneself seems to do something to the human brain akin to unlocking the subconscious. That’s why I’m not alone in calling it a ‘meditative’ activity. While I enjoy walking with others, the brain seems to work in a different way when you’re by yourself, propelled by your own two legs.

It’s easy to feel like we’re not ‘keeping up’ with work, with family and friends, and with the news. The truth is, however, that the most important person to ‘keep up’ with is yourself. Having a strong sense of self, I believe, is the best way to live a life with meaning.

It might sound ‘boring’ to go for a long walk, but as Alain de Botton notes in The News: a user’s manual, getting out of our routine is sometimes exactly what we need:

What we colloquially call ‘feeling bored’ is just the mind, acting out of a self-preserving reflex, ejecting information it has despaired of knowing where to place.

Alain de Botton

I’m not going to tell you what I thought about during my walk as, outside of the rich (inner and outer) context in which the thinking took place, whatever I write would probably sound banal.

To me, however, the thoughts I had yesterday will, like all of the thoughts I’ve had while doing some serious walking, help me organise my future actions. Perhaps that’s what Nietzsche meant when he said that only thoughts conceived while walking have any value.


Also check out:

  • One step ahead: how walking opens new horizons (The Guardian) — “Walking provides just enough diversion to occupy the conscious mind, but sets our subconscious free to roam. Trivial thoughts mingle with important ones, memories sharpen, ideas and insights drift to the surface.”
  • A Philosophy of Walking (Frédéric Gros) — “a bestseller in France, leading thinker Frédéric Gros charts the many different ways we get from A to B—the pilgrimage, the promenade, the protest march, the nature ramble—and reveals what they say about us.”
  • What 10,000 Steps Will Really Get You (The Atlantic) — “While basic guidelines can be helpful when they’re accurate, human health is far too complicated to be reduced to a long chain of numerical imperatives. For some people, these rules can even do more harm than good.”

There is no exercise of the intellect which is not, in the final analysis, useless

A quotation from a short story in Jorge Luis Borges’ Labyrinths provides the title for today’s article. I want to dig into the work of danah boyd and the transcript of a talk she gave recently, entitled Agnotology and Epistemological Fragmentation. It helps us understand what’s going on behind the seemingly benign fascias of social networks and news media outlets.

She explains the title of her talk:

Epistemology is the term that describes how we know what we know. Most people who think about knowledge think about the processes of obtaining it. Ignorance is often assumed to be not-yet-knowledgeable. But what if ignorance is strategically manufactured? What if the tools of knowledge production are perverted to enable ignorance? In 1995, Robert Proctor and Iain Boal coined the term “agnotology” to describe the strategic and purposeful production of ignorance. In an edited volume called Agnotology, Proctor and Londa Schiebinger collect essays detailing how agnotology is achieved. Whether we’re talking about the erasure of history or the undoing of scientific knowledge, agnotology is a tool of oppression by the powerful.

danah boyd

Having already questioned ‘media literacy’ the way it’s currently taught through educational institutions and libraries, boyd explains how the alt-right are streets ahead of educators when it comes to pushing their agenda:

One of the best ways to seed agnotology is to make sure that doubtful and conspiratorial content is easier to reach than scientific material. And then to make sure that what scientific information is available, is undermined. One tactic is to exploit “data voids.” These are areas within a search ecosystem where there’s no relevant data; those who want to manipulate media purposefully exploit these. Breaking news is one example of this.

[…]

Today’s drumbeat happens online. The goal is no longer just to go straight to the news media. It’s to first create a world of content and then to push the term through to the news media at the right time so that people search for that term and receive specific content. Terms like caravan, incel, crisis actor. By exploiting the data void, or the lack of viable information, media manipulators can help fragment knowledge and seed doubt.

danah boyd

Harold Jarche uses McLuhan’s tetrads to understand this visually, commenting: “This is an information war. Understanding this is the first step in fighting for democracy.”

Harold Jarche on Agnotology

We can teach children sitting in classrooms all day about checking URLs and the provenance of the source, but how relevant is that when they’re using YouTube as their primary search engine? Returning to danah boyd:

YouTube has great scientific videos about the value of vaccination, but countless anti-vaxxers have systematically trained YouTube to make sure that people who watch the Center for Disease Control and Prevention’s videos also watch videos asking questions about vaccinations or videos of parents who are talking emotionally about what they believe to be the result of vaccination. They comment on both of these videos, they watch them together, they link them together. This is the structural manipulation of media.

danah boyd

It’s not just the new and the novel. Even things that are relatively obvious to those of us who came of age online are confusing to older generations. As this article by BuzzFeed News reporter Craig Silverman points out, conspiracy-believing retirees have a disproportionate influence on our democratic processes:

Older people are also more likely to vote and to be politically active in other ways, such as making political contributions. They are wealthier and therefore wield tremendous economic power and all of the influence that comes with it. With more and more older people going online, and future 65-plus generations already there, the online behavior of older people, as well as their rising power, is incredibly important — yet often ignored.

Craig Silverman

So when David Buckingham asks ‘Who needs digital literacy?’ I think the answer is everyone. Having been a fan of his earlier work, it saddens me to realise that he hasn’t kept up with the networked era:

These days, I find the notion of digital literacy much less useful – and to some extent, positively misleading. The fundamental problem is that the idea is defined by technology itself. It makes little sense to distinguish between texts (or media) on the grounds of whether they are analogue or digital: almost all media (including print media) involve the use of digital technology at some stage or other. Fake news and disinformation operate as much in old, analogue media (like newspapers) as they do online. Meanwhile, news organisations based in old media make extensive and increasing use of online platforms. The boundaries between digital and analogue may still be significant in some situations, but they are becoming ever more blurred.

David Buckingham

Actually, as Howard Rheingold pointed out a number of years ago in Net Smart, and as boyd has done in her own work, networks change everything. You can’t seriously compare pre-networked and post-networked cultures in any way other than in contrast.

Buckingham suggests that, seeing as the (UK) National Literacy Trust are on the case, we “don’t need to reinvent the wheel”. The trouble is that the wheel has already been reinvented, and lots of people either didn’t notice, or are acting as though it hasn’t been.

There’s a related article by Anna McKie in the THE entitled Teaching intelligence: digital literacy in the ‘alternative facts’ era which, unfortunately, is now behind a paywall. It reports on a special issue of the journal Teaching in Higher Education in which the editors have brought together papers on the contribution made by Higher Education to expertise and knowledge in the age of ‘alternative facts’:

[S]ocial media has changed the dynamic of information in our society, [editor] Professor Harrison added. “We’ve moved away from the idea of experts who assess information to one where the validity of a statement is based on the likes, retweets and shares it gets, rather than whether the information is valid.”

The first task of universities is to go back to basics and “help students to understand the difference between knowledge and information, and how knowledge is created, which is separate to how information is created”, Professor Harrison said. “Within [each] discipline, what are the skills needed to assess that?”

Many assume that schools or colleges are teaching this, but that is not the case, he added. “Academics should also be wary of the extent to which they themselves understand the new paradigms of knowledge creation,” Professor Harrison warned.

Anna McKie

One of the reasons I decided not to go into academia is that, certain notable exceptions aside, the focus is on explaining rather than changing. Or, to finish with another quotation, this time from Karl Marx, “Philosophers have hitherto only interpreted the world in various ways; the point is to change it.”



Rules for Online Sanity

It’s funny: we tell kids not to be mean to one another, and then immediately jump on social media to call people out and divide ourselves into various camps.

This list by Sean Blanda has been shared in several places, and rightly so. I’ve highlighted what I consider to be the top three.

I’ve started thinking about what the “new rules” for navigating the online world might be. If you could get everyone to agree (implicitly or explicitly) to a set of rules, what would they be? Below is an early attempt at a “Rules for Online Sanity” list. I’d love to hear what you think I missed.

  • Reward your “enemies” when they agree with you, exhibit good behavior, or come around on an issue. Otherwise they have no incentive to ever meet you halfway.
  • Accept it when people apologize. People should be allowed to work through ideas and opinions online. And that can result in some messy outcomes. Be forgiving.
  • Sometimes people have differing opinions because they considered something you didn’t.
  • Take a second.
  • There’s always more to the story. You probably don’t know the full context of whatever you’re reading or watching.
  • If an online space makes more money the more time you spend on it, use sparingly.
  • Judge people on their actions, not their words. Don’t get outraged over what people said. Get outraged at what they actually do.
  • Try to give people the benefit of the doubt, be charitable in how you read people’s ideas.
  • Don’t treat one bad actor as representative of whatever group or demographic they belong to.
  • Create the kind of communities and ideas you want people to talk about.
  • Sometimes, there are bad actors that don’t play by the rules. They should be shunned, castigated, and banned.
  • You don’t always have the moral high ground. You are not always right.
  • Block and mute quickly. Worry about the bubbles that creates later.
  • There but for the grace of God go you.

Oh, and about “creating communities”: why not support Thought Shrapnel via Patreon and comment on these posts along with people you already know you have something in common with?

Source: The Discourse (via Read Write Collect)

Internalising the logic of social media

A few days ago, Twitter posted a photo of an early sketch that founder Jack Dorsey made for the initial user interface. It included settings to inform a user’s followers that they might not respond immediately because they were in the park or busy reading.

A day later, an article in The New Yorker about social media used a stark caption for its header image:

Social-media platforms know what you’re seeing, and they know how you acted in the immediate aftermath of seeing it, and they can decide what you will see next.

There’s no doubt in my mind that we’re like slow-boiled frogs when it comes to creeping dystopia. It’s not happening through the totalitarian lens of the 20th century, but instead in a much more problematic way.

One of the more insidious aspects of [social media’s business] model is the extent to which we, as social-media users, replicate its logic at the level of our own activity: we perform market analysis of our own utterances, calculating the reaction a particular post will generate and adjusting our output accordingly. Negative emotions like outrage and contempt and anxiety tend to drive significantly more engagement than positive ones.

No wonder Twitter’s such an angry place these days.

The article quotes James Bridle’s book New Dark Age, which is sitting on my shelf waiting for my return from this work trip.

We find ourselves today connected to vast repositories of knowledge and yet we have not learned to think. In fact, the opposite is true: that which was intended to enlighten the world in practice darkens it. The abundance of information and the plurality of worldviews now accessible to us through the internet are not producing a coherent consensus reality, but one riven by fundamentalist insistence on simplistic narratives, conspiracy theories, and post-factual politics. It is on this contradiction that the idea of a new dark age turns: an age in which the value we have placed upon knowledge is destroyed by the abundance of that profitable commodity, and in which we look about ourselves in search of new ways to understand the world.

This resonates with a quotation I posted to Thought Shrapnel this week from Jon Ronson’s So You’ve Been Publicly Shamed about how we’re actually creating a more conservative environment, despite thinking we’re all ‘non-conformist’.

To be alive and online in our time is to feel at once incensed and stultified by the onrush of information, helpless against the rising tide of bad news and worse opinions. Nobody understands anything: not the global economy governed by the unknowable whims of algorithms, not our increasingly volatile and fragile political systems, not the implications of the impending climate catastrophe that forms the backdrop of it all. We have created a world that defies our capacity to understand it—though not, of course, the capacity of a small number of people to profit from it. Deleting your social-media accounts might be a means of making it more bearable, and even of maintaining your sanity. But one way or another, the world being what it is, we are going to have to learn to live in it.

Last week, at the ALT conference, the speaker asked those in the audience to stand up if they felt imposter syndrome. I didn’t get to my feet, but it wasn’t out of arrogance or hubris. I may have no idea what I’m doing, but I’m pretty sure no-one else does either.

Source: The New Yorker

On living in public

In this post, Austin Kleon, backpedaling a little from the approach he seemed to promote in Show Your Work!, talks about the problems we all face with ‘living in public’.

It seems ridiculous to say, but 2013, the year I wrote the book, was a simpler time. Social media seemed much more benign to me. Back then, the worst I felt social media did was waste your time. Now, the worst social media does is cripple democracy and ruin your soul.

Kleon quotes Warren Ellis, who writes one of my favourite newsletters (his blog is pretty good, too):

You don’t have to live in public on the internet if you don’t want to. Even if you’re a public figure, or micro-famous like me. I don’t follow anyone on my public Instagram account. No shade on those who follow me there, I’m glad you give me your time – but I need to be in my own space to get my shit done. You want a “hack” for handling the internet? Create private social media accounts, follow who you want and sit back and let your bespoke media channels flow to you. These are tools, not requirements. Don’t let them make you miserable. Tune them until they bring you pleasure.

In May 2017, after being on Twitter for over a decade, I deleted my Twitter history, and I now delete tweets on a weekly basis. These days I hang out on a social network that I co-own called social.coop, which is powered by a federated, decentralised service called Mastodon.

I still publish my work, including Thought Shrapnel posts, to Twitter, LinkedIn, etc. It’s just not where I spend most of my time. On balance, I’m happier for it.

Source: Austin Kleon

The death of the newsfeed (is much exaggerated)

Benedict Evans is a venture capitalist who focuses on technology companies. He’s a smart guy with some important insights, and I thought his recent post about the ‘death of the newsfeed’ on social networks was particularly useful.

He points out that it’s pretty inevitable that the average person will, over the course of a few years, add a few hundred ‘friends’ on any given social network. Let’s say you’re connected with 300 people, and they each share five things per day. That’s 1,500 items you’ll be bombarded with, unless the social network does something about it.

This overload means it now makes little sense to ask for the ‘chronological feed’ back. If you have 1,500 or 3,000 items a day, then the chronological feed is actually just the items you can be bothered to scroll through before giving up, which can only be 10% or 20% of what’s actually there. This will be sorted by no logical order at all except whether your friends happened to post them within the last hour. It’s not so much chronological in any useful sense as a random sample, where the randomizer is simply whatever time you yourself happen to open the app. ’What did any of the 300 people that I friended in the last 5 years post between 16:32 and 17:03?’ Meanwhile, giving us detailed manual controls and filters makes little more sense – the entire history of the tech industry tells us that actual normal people would never use them, even if they worked. People don’t file.
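Making the arithmetic concrete (a toy sketch: the 300 friends and five shares a day are the illustrative numbers from above, and the number of items a reader actually scrolls through is my own assumption):

```python
# Back-of-the-envelope feed overload, using illustrative numbers only
friends = 300
posts_per_friend_per_day = 5
items_per_day = friends * posts_per_friend_per_day
print(items_per_day)  # 1500 candidate items every day

# Assume a typical reader only gets through ~200 items before giving up:
# a "chronological" feed is then really a small sample of what was posted,
# cut off at whatever moment they happened to open the app.
scrolled = 200  # assumed items actually seen per day
fraction_seen = scrolled / items_per_day
print(f"{fraction_seen:.0%} of the feed actually seen")  # 13%
```

Tweak the assumed numbers however you like; the point is that the fraction seen lands in the 10–20% range Evans describes, and which 10–20% you get is an accident of timing.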

So we end up with algorithmic feeds: an attempt by social networks to ensure that you see the stuff you deem important. It is, of course, an almost impossible mission.

[T]here are a bunch of problems around getting the algorithmic newsfeed sample ‘right’, most of which have been discussed at length in the last few years. There are lots of incentives for people (Russians, game developers) to try to manipulate the feed. Using signals of what people seem to want to see risks over-fitting, circularity and filter bubbles. People’s desires change, and they get bored of things, so Facebook has to keep changing the mix to try to reflect that, and this has made it an unreliable partner for everyone from Zynga to newspapers. Facebook has to make subjective judgements about what it seems that people want, and about what metrics seem to capture that, and none of this is static or even in principle perfectible. Facebook surfs user behaviour.

Evans then goes on to raise the problem that what you want to see may be different from what your friends want you to see. So people solve the problem of algorithmic feeds not showing them what they really want by using messaging apps such as WhatsApp and Telegram to interact individually with people or in small groups.

The problem with that, though?

The catch is that though these systems look like they reduce sharing overload, you really want group chats. And lots of groups. And when you have 10 WhatsApp groups with 50 people in each, then people will share to them pretty freely. And then you think ‘maybe there should be a screen with a feed of the new posts in all of my groups. You could call it a ‘news feed’. And maybe it should get some intelligence, to show the posts you care about most…

So, to Evans’ mind (and I’m tempted to agree with him), we’re in a never-ending spiral. The only way I can see out of it is user education, particularly around owning one’s own data and IndieWeb approaches.

Source: Benedict Evans

Social internet vs social media

It’s good to see Cal Newport, whose book Deep Work I found unexpectedly great last year, add a bit more nuance to his position on social media:

The young progressives grew up in a time when platform monopolies like Facebook were so dominant that they seemed inextricably intertwined into the fabric of the internet. To criticize social media, therefore, was to criticize the internet’s general ability to do useful things like connect people, spread information, and support activism and expression.

The older progressives, however, remember the internet before the platform monopolies. They were concerned to observe a small number of companies attempt to consolidate much of the internet into their for-profit, walled gardens.

To them, social media is not the internet. It was instead a force that was co-opting the internet — including the powerful capabilities listed above — in ways that would almost certainly lead to trouble.

Newport has started talking about the difference between ‘social media’ and the ‘social internet’:

The social internet describes the general ways in which the global communication network and open protocols known as “the internet” enable good things like connecting people, spreading information, and supporting expression and activism.

Social media, by contrast, describes the attempt to privatize these capabilities by large companies within the newly emerged algorithmic attention economy, a particularly virulent strain of the attention sector that leverages personal data and sophisticated algorithms to ruthlessly siphon users’ cognitive capital.

If you’d asked people in 2005, they would have said that there was no way that people would leave MySpace in favour of a different platform.

People like Facebook. But if you could offer them a similar alternative that stripped away the most unsavory elements of Zuckerberg’s empire (perhaps funded by a Wikipedia-style nonprofit collective, or a modest subscription fee), many would happily jump ship.

Indeed.

Following up with another post this week, Newport writes:

My argument is that you can embrace the social internet without having to become a “gadget” inside the algorithmic attention economy machinations of the social media conglomerates. As noted previously, I think this is the right answer for those who are fed up with the dehumanizing aspects of social media, but are reluctant to give up altogether on the potential of the internet to bring people together.

He suggests several ways for this to happen:

  • Approach #1: The Slow Social Media Philosophy
  • Approach #2: Own Your Own Domain

This is, in effect, the IndieWeb approach. However, I still think that Newport and others who work in universities may be a special case. As Austin Kleon notes, there are already built-in ways for your career to advance in academia. Others have to show their work…

What I don’t see being discussed is that, as we collectively mature in our use of social media, we’re likely to use different networks for different purposes. Facebook, LinkedIn, and the like try to force us into a single online identity. It’s OK to look and act differently when you’re around different people in different environments.

Source: Cal Newport (On Social Media and Its Discontents / Beyond #DeleteFacebook: More Thoughts on Embracing the Social Internet Over Social Media)

Going deep

I don’t think the right term for this is ‘mobile blindness’, but Seth Godin’s analogy is nevertheless instructive.

He talks about the shift over the last 20 years or so from getting our news and information primarily via books and newspapers, to getting it via desktop computers, and now predominantly through our mobile devices. Things become bite-sized, and our attention field is wide but shallow.

Photokeratitis (snow blindness) happens when there’s too much ultraviolet–when the fuel for our eyes comes in too strong and we can’t absorb it all. Something similar is happening to each of us, to our entire culture, as a result of the tsunami of noise vying for our attention.

It’s possible you can find an edge by going even faster and focusing even more on breadth at the surface. But it’s far more satisfying and highly leveraged to go the other way instead. Even if it’s just for a few hours a day.

If you care about something, consider taking a moment to slow down and understand it. And if you don’t care, no need to even bother with the surface.

This isn’t a technology issue, it’s an attention issue. Yes, it’s possible to argue that these devices are designed to capture your attention. But we all still have a choice.

You can safely ignore what doesn’t align with your goals in life. First, of course, you have to have some goals…

Source: Seth Godin