Instagram has never been a place I’ve ever wanted to spend any time or attention. But its impact on physical spaces is undeniable.
This post (newsletter issue?) by Drew Austin cites a couple of other authors who perfectly skewer the Instagram aesthetic as a grammar that quickly conveys that somebody… did a thing.
The Blackbird Spyplane newsletter recently made a valuable contribution to the pantheon of essays about how the internet has transformed the physical world: a hopeful manifesto in praise of the “Un-Grammable Hang Zone,” the definition of which will be obvious if you’ve spent enough time in the Instagram-optimized settings that have proliferated in cities during the past decade—places that BBSP describes as a “high-efficiency, low-humanity kind of eatery where you point yr phone at a QR code and do contactless payment before eating a room-temp grain bowl under a pink neon sign that says ‘Living My Best Life’ in cursive.”
Affirming the interchangeability of “millennial” and “Instagrammable” as descriptors, Fischer pinpoints the force that really drives them: Instagrammable “does not mean ‘beautiful’ or even quite ‘photogenic’; it means something more like ‘readable.’ The viewer could scroll past an image and still grasp its meaning, e.g., ‘I saw fireworks,’ ‘I am on vacation,’ or ‘I have friends.’” If Instagram as a medium demands readability, in other words, it puts pressure on the physical environment to simplify itself accordingly, at least in the long run.
In the complaint filed Thursday in federal court in San Francisco, New Jersey Instagram user Brittany Conditi contends the app’s use of the camera is intentional and done for the purpose of collecting “lucrative and valuable data on its users that it would not otherwise have access to.”
Facebook has been incredibly lucrative for its founder, Mark Zuckerberg, who ranks among the wealthiest men in the world. But it’s been a disaster for the world itself, a powerful vector for paranoia, propaganda and conspiracy-theorizing as well as authoritarian crackdowns and vicious attacks on the free press. Wherever it goes, chaos and destabilization follow.
“I can’t sit by and stay silent while these platforms continue to allow the spreading of hate, propaganda and misinformation – created by groups to sow division and split America apart,” Kardashian West said.
People often ask me about my stance on Facebook products. They can understand that I don’t use Facebook itself, but what about Instagram? And surely I use WhatsApp? Nope.
Given that I don’t usually have a single place to point people who want to read about the problems with WhatsApp, I thought I’d create one.
WhatsApp is a messaging app that was acquired by Facebook for the eye-watering amount of $19 billion in 2014. Interestingly, a BuzzFeed News article from 2018 cites confidential documents from the time leading up to the acquisition that were obtained by the UK’s Department for Culture, Media, and Sport. They show the threat WhatsApp posed to Facebook at the time.
A chart in those documents shows that Facebook executives were told in 2013 that WhatsApp (8.6% reach) was growing rapidly and posed a huge threat to Facebook Messenger (13.7% reach).
So Facebook bought WhatsApp. But what did they buy? If, as we’re led to believe, WhatsApp is ‘end-to-end encrypted’ then Facebook don’t have access to the messages of users. So what’s so valuable?
Brian Acton, one of the founders of WhatsApp (and a man who got very rich through its sale) has gone on record saying that he feels like he sold his users’ privacy to Facebook.
Facebook, Acton says, had decided to pursue two ways of making money from WhatsApp. First, by showing targeted ads in WhatsApp’s new Status feature, which Acton felt broke a social compact with its users. “Targeted advertising is what makes me unhappy,” he says. His motto at WhatsApp had been “No ads, no games, no gimmicks”—a direct contrast with a parent company that derived 98% of its revenue from advertising. Another motto had been “Take the time to get it right,” a stark contrast to “Move fast and break things.”
Facebook also wanted to sell businesses tools to chat with WhatsApp users. Once businesses were on board, Facebook hoped to sell them analytics tools, too. The challenge was WhatsApp’s watertight end-to-end encryption, which stopped both WhatsApp and Facebook from reading messages. While Facebook didn’t plan to break the encryption, Acton says, its managers did question and “probe” ways to offer businesses analytical insights on WhatsApp users in an encrypted environment.
Parmy Olson (Forbes)
The other way Facebook wanted to make money was to sell tools to businesses allowing them to chat with WhatsApp users. These tools would also give “analytical insights” on how users interacted with WhatsApp.
Facebook was allowed to acquire WhatsApp (and Instagram) despite fears around monopolistic practices. This was because they made a promise not to combine data from various platforms. But, guess what happened next?
In 2014, Facebook bought WhatsApp for $19b, and promised users that it wouldn’t harvest their data and mix it with the surveillance troves it got from Facebook and Instagram. It lied. Years later, Facebook mixes data from all of its properties, mining it for data that ultimately helps advertisers, political campaigns and fraudsters find prospects for whatever they’re peddling. Today, Facebook is in the process of acquiring Giphy, and while Giphy currently doesn’t track users when they embed GIFs in messages, Facebook could start doing that anytime.
All of this creates a profile. So yes, because of end-to-end encryption, Facebook might not know the exact details of your messages. But they know that you’ve started messaging a particular user account around midnight every night. They know that you’ve started interacting with a bunch of stuff around anxiety. They know how the people you message most tend to vote.
Do I have to connect the dots here? This is a company that sells targeted adverts, the kind of adverts that can influence the outcome of elections. Of course, Facebook will never admit that its platforms are the problem; it’s always the responsibility of the user to be ‘vigilant’.
So you might think that you’re just messaging your friend or colleague on a platform that ‘everyone’ uses. But your decision to go with the flow has consequences. It has implications for democracy. It has implications for the creation of a de facto monopoly over our digital information. And it has implications for the dissemination of false information.
The features that would later allow WhatsApp to become a conduit for conspiracy theory and political conflict were ones never integral to SMS, and have more in common with email: the creation of groups and the ability to forward messages. The ability to forward messages from one group to another – recently limited in response to Covid-19-related misinformation – makes for a potent informational weapon. Groups were initially limited in size to 100 people, but this was later increased to 256. That’s small enough to feel exclusive, but if 256 people forward a message on to another 256 people, 65,536 will have received it.
A communication medium that connects groups of up to 256 people, without any public visibility, operating via the phones in their pockets, is, by its very nature, well-suited to supporting secrecy. Obviously not every group chat counts as a “conspiracy”. But it makes the question of how society coheres, who is associated with whom, into a matter of speculation – something that involves a trace of conspiracy theory. In that sense, WhatsApp is not just a channel for the circulation of conspiracy theories, but offers content for them as well. The medium is the message.
William Davies (The Guardian)
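The forwarding arithmetic Davies describes compounds quickly. A minimal sketch of that upper bound, assuming every group is at the full 256-person limit, everyone forwards, and no memberships overlap (all simplifying assumptions, not a claim about real behaviour):

```python
# Upper bound on how far a message can spread via group forwarding,
# under the (unrealistic) assumptions of full 256-person groups,
# universal forwarding, and no overlapping membership.
GROUP_LIMIT = 256

def potential_reach(forward_rounds: int) -> int:
    """Recipients after the original post plus `forward_rounds` rounds
    of each recipient forwarding to one more full group."""
    return GROUP_LIMIT ** (forward_rounds + 1)

print(potential_reach(0))  # 256: the original group
print(potential_reach(1))  # 65,536: the figure Davies cites
print(potential_reach(2))  # 16,777,216 after just one more hop
```

The point is not the exact numbers, which real-world overlap and apathy will shrink, but that the growth is exponential in the number of hops, which is why WhatsApp’s forwarding limits target the hop count rather than group size.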
I cannot control the decisions others make, nor have I forced my opinions on my two children, who (despite my warnings) both use WhatsApp to message their friends. But, for me, the risk to myself and society of using WhatsApp is not one I’m willing to take.