So said one of my favourite non-fiction authors, the 16th century proto-blogger Michel de Montaigne. There’s plenty of writing about why we should be anxious about the drift towards a future of surveillance states. Eventually, because it’s not affecting us here and now, we become blasé. We forget that it’s already the lived experience of hundreds of millions of people.
Take China, for example. In The Atlantic, Derek Thompson writes about the Chinese government’s brutality against the Muslim Uyghur population in the western region of Xinjiang:
[The] horrifying situation is built on the scaffolding of mass surveillance. Cameras fill the marketplaces and intersections of the key city of Kashgar. Recording devices are placed in homes and even in bathrooms. Checkpoints that limit the movement of Muslims are often outfitted with facial-recognition devices to vacuum up the population’s biometric data. As China seeks to export its suite of surveillance tech around the world, Xinjiang is a kind of R&D incubator, with the local Muslim population serving as guinea pigs in a laboratory for the deprivation of human rights.

Derek Thompson
As Ian Welsh points out, surveillance states usually involve us in the West pointing towards places like China and shaking our heads. However, if you step back a moment and remember that societies like the US and UK are becoming more unequal over time, then perhaps we’re the ones who should be worried:
The endgame, as I’ve been pointing out for years, is a society in which where you are, what you’re doing, and what you have done is always known, or at least knowable. And that information is kept forever, so the moment someone with power wants to take you out, they can go back through your life in minute detail. If laws or norms change so that what was OK 10 or 30 years ago isn’t OK now, well, they can get you on that.

Ian Welsh
As the world becomes more unequal, the position of elites becomes more perilous, hence Silicon Valley billionaires preparing boltholes in New Zealand. Ironically, they’re looking for places where they can’t be found, while making serious money from providing surveillance technology. Instead of addressing the inequality, they attempt to insulate themselves from its effects.
A lot of the crazy amounts of money earned in Silicon Valley comes at the price of infringing our privacy. I’ve spent a long time thinking about this quite nebulous concept. It’s not the easiest thing to understand when you examine it more closely.
Privacy is usually considered a freedom from rather than a freedom to, as in “freedom from surveillance”. The trouble is that there are many kinds of surveillance, and some of these we actively encourage. A quick example: I know of at least one family that share their location with one another all of the time. At the same time, of course, they’re sharing it with the company that provides that service.
There’s a lot of power in the ‘default’ privacy settings devices and applications come with. People tend to go with whatever comes as standard. Sidney Fussell writes in The Atlantic that:
Many apps and products are initially set up to be public: Instagram accounts are open to everyone until you lock them… Even when companies announce convenient shortcuts for enhancing security, their products can never become truly private. Strangers may not be able to see your selfies, but you have no way to untether yourself from the larger ad-targeting ecosystem.

Sidney Fussell
Some of us (including me) are willing to trade some of that privacy for more personalised services that somehow make our lives easier. Things get trickier when it comes to employer and state surveillance: in those cases there are coercive power relationships at play, rather than mere convenience.
Ellen Sheng, writing for CNBC, explains how employees in the US are at huge risk from workplace surveillance:
In the workplace, almost any consumer privacy law can be waived. Even if companies give employees a choice about whether or not they want to participate, it’s not hard to force employees to agree. That is, unless lawmakers introduce laws that explicitly state a company can’t make workers agree to a technology…
One example: Companies are increasingly interested in employee social media posts out of concern that employee posts could reflect poorly on the company. A teacher’s aide in Michigan was suspended in 2012 after refusing to share her Facebook page with the school’s superintendent following complaints about a photo she had posted. Since then, dozens of similar cases prompted lawmakers to take action. More than 16 states have passed social media protections for individuals.

Ellen Sheng
It’s not just workplaces, though. Schools are hotbeds for new surveillance technologies, as Benjamin Herold notes in an article for Education Week:
Social media monitoring companies track the posts of everyone in the areas surrounding schools, including adults. Other companies scan the private digital content of millions of students using district-issued computers and accounts. Those services are complemented with tip-reporting apps, facial-recognition software, and other new technology systems.
While schools are typically quiet about their monitoring of public social media posts, they generally disclose to students and parents when digital content created on district-issued devices and accounts will be monitored. Such surveillance is typically done in accordance with schools’ responsible-use policies, which students and parents must agree to in order to use districts’ devices, networks, and accounts.

Benjamin Herold
Hypothetically, students and families can opt out of using that technology. But doing so would make participating in the educational life of most schools exceedingly difficult.
In China, of course, a social credit system makes all of this a million times worse, but we in the West aren’t heading in a great direction either.
We’re entering a time where, by the point my children are my age, companies, employers, and the state could hold decades of data on them, from when they entered the school system through to finding jobs and becoming parents themselves.
There are upsides to all of this data, obviously. But I think that in the midst of privacy-focused conversations about Amazon’s smart speakers and Google location-sharing, we might be missing the bigger picture around surveillance by educational institutions, employers, and governments.
Returning to Ian Welsh to finish up, remember that it’s the coercive power relationships that make surveillance a bad thing:
Surveillance societies are sterile societies. Everyone does what they’re supposed to do all the time, and because we become what we do, it affects our personalities. It particularly affects our creativity, and is a large part of why Communist surveillance societies were less creative than the West, particularly as their police states ramped up.

Ian Welsh
We don’t want to think about all of this, though, do we?
Also check out:
- A New York School District Will Test Facial Recognition System On Students Even Though The State Asked It To Wait (BuzzFeed News) — “Rep. Alexandria Ocasio-Cortez has expressed concern in a congressional hearing on the technology last week that facial recognition could be used as a form of social control.”
- Amazon preparing a wearable that ‘reads human emotions,’ says report (The Verge) — “This is definitely one of those things that hasn’t yet been done because of how hard it is to do.”
- Google’s Sundar Pichai: Privacy Should Not Be a Luxury Good (The New York Times) — “Ideally, privacy legislation would require all businesses to accept responsibility for the impact of their data processing in a way that creates consistent and universal protections for individuals and society as a whole.”