
Lessin’s five steps and the coming AI apocalypse

I’m not really on any of the big centralised social networks any more, but I’m interested in the effect they have on society. Apparently there have recently been complaints about, and resistance to, changes that Instagram has made.

In this post, Ben Thompson cites Sam Lessin, a former Facebook exec, who suggests we’re at step four of a five-step process.

  1. The Pre-Internet ‘People Magazine’ Era
  2. Content from ‘your friends’ kills People Magazine
  3. Kardashians/Professional ‘friends’ kill real friends
  4. Algorithmic everyone kills Kardashians
  5. Next is pure-AI content which beats ‘algorithmic everyone’

There’s a bit in this post which I think is a pretty deep insight about human behaviour, identity, and the story we like to tell ourselves. Again, it’s Thompson quoting Lessin:

I saw someone recently complaining that Facebook was recommending to them…a very crass but probably pretty hilarious video. Their indignant response [was that] “the ranking must be broken.” Here is the thing: the ranking probably isn’t broken. He probably would love that video, but the fact that in order to engage with it he would have to go proactively click makes him feel bad. He doesn’t want to see himself as the type of person that clicks on things like that, even if he would enjoy it.

So, by reducing the need for human interaction in order to deliver ‘engaging’ content, TikTok and other platforms have the capacity to fundamentally change the way we think about the world.

In another, related post, Charles Arthur scaremongers about how AI-created content will overwhelm us:

I suspect in the future there will be a premium on good, human-generated content and response, but that huge and growing amounts of the content that people watch and look at and read on content networks (“social networks” will become outdated) will be generated automatically, and the humans will be more and more happy about it.

In its way, it sounds like the society in Fahrenheit 451 (that’s 233°C for Europeans), though without the book burning. There’s no need: why read a book when there’s something fascinating you can watch instead?

Quite what effect this has on social warming is unclear. Possibly it accelerates polarisation, but, rather like with Facebook’s BlenderBot, people are just segmented into their own worlds and not shown things that will disturb them. Or, perhaps, they’re shown just enough to annoy them and engage them again if their attention seems to be flagging. After all, if you can generate unlimited content, you can do what you want. And, as we know, what the companies who do this want is your attention, all the time.

As ever, I don’t think we’re ready for this. Not even close.


Sources: Instagram, TikTok, and the Three Trends | Stratechery by Ben Thompson and The approaching tsunami of addictive AI-created content will overwhelm us | Social Warming by Charles Arthur

Good ideas become colonised and domesticated

I’ve got this thought about how every good idea becomes colonised and domesticated. While domestication can be a good thing, because it potentially makes an idea more accessible to all, it also robs that idea of its radical, transformatory power.

Colonisation, however, is never a positive thing. It’s about renegotiating existing relationships, often through the lens of power, capital, and hegemony.

How related the above two paragraphs are to this article in The New Yorker is questionable. But, to me, they’re related: centralised social media has been colonised and domesticated.

[Image: laptop with goo coming out]

Once upon a time, the Internet was predicated on user-generated content. The hope was that ordinary people would take advantage of the Web’s low barrier for publishing to post great things, motivated simply by the joy of open communication. We know now that it didn’t quite pan out that way. User-generated GeoCities pages or blogs gave way to monetized content. Google made the Internet more easily searchable, but, in the early two-thousands, it also began selling ads and allowed other Web sites to easily incorporate its advertising modules. That business model is still what most of the Internet relies on today. Revenue comes not necessarily from the value of content itself but from its ability to attract attention, to get eyeballs on ads, which are most often bought and sold through corporations like Google and Facebook. The rise of social networks in the twenty-tens made this model only more dominant. Our digital posting became concentrated on a few all-encompassing platforms, which relied increasingly on algorithmic feeds. The result for users was more exposure but a loss of agency. We generated content for free, and then Facebook mined it for profit.

“Clickbait” has long been the term for misleading, shallow online articles that exist only to sell ads. But on today’s Internet the term could describe content across every field, from the unmarked ads on an influencer’s Instagram page to pseudonymous pop music designed to game the Spotify algorithm. Eichhorn uses the potent term “content capital”—a riff on Pierre Bourdieu’s “cultural capital”—to describe the way in which a fluency in posting online can determine the success, or even the existence, of an artist’s work. Where “cultural capital” describes how particular tastes and reference points confer status, “content capital” connotes an aptitude for creating the kind of ancillary content that the Internet feeds upon. Since so much audience attention is funnelled through social media, the most direct path to success is to cultivate a large digital following. “Cultural producers who, in the past, may have focused on writing books or producing films or making art must now also spend considerable time producing (or paying someone else to produce) content about themselves and their work,” Eichhorn writes. Pop stars log their daily routines on TikTok. Journalists spout banal opinions on Twitter. The best-selling Instapoet Rupi Kaur posts reels and photos of her typewritten poems. All are trapped by the daily pressure to produce ancillary content—memes, selfies, shitposts—to fill an endless void.

Source: How the Internet Turned Us Into Content Machines | The New Yorker