
Friday forebodings

I think it’s alright to say that this was a week when my spirits dropped a little. Apologies if that’s not what you wanted to hear right now, and if it’s reflected in what follows.

For there to be good things there must also be bad. For there to be joy there must also be sorrow. And for there to be hope there must be despair. All of this will pass.


We’re Finding Out How Small Our Lives Really Are

But there’s no reason to put too sunny a spin on what’s happening. Research has shown that anticipation can be a linchpin of well-being and that looking ahead produces more intense emotions than retrospection. In a 2012 New York Times article on why people thirst for new experiences, one psychologist told the paper, “Novelty-seeking is one of the traits that keeps you healthy and happy and fosters personality growth as you age,” and another referred to human beings as a “neophilic species.” Of course, the current blankness in the place of what comes next is supposed to be temporary. Even so, lacking an ability to confidently say “see you later” is going to have its effects. Have you noticed the way in which conversations in this era can quickly become recursive? You talk about the virus. Or you talk about what you did together long ago. The interactions don’t always spark and generate as easily as they once did.

Spencer Kornhaber (The Atlantic)

Part of the problem with all of this is that we don’t know how long it’s going to last, so we can’t really make plans. It’s like an extended limbo where you’re supposed to just get on with it, whatever ‘it’ is…


Career Moats in a Recession

If you’re going after a career moat now, remember that the best skills to go after are the ones that the market will value after the recession ends. You can’t necessarily predict this — the world is complex and the future is uncertain, but you should certainly keep the general idea in mind.

A simpler version of this is to go after complementary skills to your current role. If you’ve been working for a bit, it’s likely that you’ll have a better understanding of your industry than most. So ask yourself: what complementary skills would make you more valuable to the employers in your job market?

Cedric James (Commonplace)

I’m fortunate to have switched from education to edtech at the right time. Elsewhere, James says that “job security is the ability to get your next job, not keep your current one” and that this depends on your network, luck, and having “rare and valuable skills”. Indeed.


Everything Is Innovative When You Ignore the Past

This is hard stuff, and acknowledging it comes with a corollary: We, as a society, are not particularly special. Vinsel, the historian at Virginia Tech, cautioned against “digital exceptionalism,” or the idea that everything is different now that the silicon chip has been harnessed for the controlled movement of electrons.

It’s a difficult thing for people to accept, especially those who have spent their lives building those chips or the software they run. “Just on a psychological level,” Vinsel said, “people want to live in an exciting moment. Students want to believe they’re part of a generation that’s going to change the world through digital technology or whatever.”

Aaron Gordon (VICE)

Everyone thinks they live in ‘unprecedented’ times, especially if they work in tech.


‘We can’t go back to normal’: how will coronavirus change the world?

But disasters and emergencies do not just throw light on the world as it is. They also rip open the fabric of normality. Through the hole that opens up, we glimpse possibilities of other worlds. Some thinkers who study disasters focus more on all that might go wrong. Others are more optimistic, framing crises not just in terms of what is lost but also what might be gained. Every disaster is different, of course, and it’s never just one or the other: loss and gain always coexist. Only in hindsight will the contours of the new world we’re entering become clear.

Peter C Baker (the Guardian)

An interesting read, outlining the optimistic and pessimistic scenarios. The coronavirus pandemic is a crisis, but of course what comes next (CLIMATE CHANGE) is even bigger.


The Terrible Impulse To Rally Around Bad Leaders In A Crisis

This tendency to rally around even incompetent leaders makes one despair for humanity. The correct response in all cases is contempt and an attempt, if possible, at removal of the corrupt and venal people in charge. Certainly no one should be approving of the terrible jobs they [Cuomo, Trump, Johnson] have done.

All three have or will use their increased power to do horrible things. The Coronavirus bailout bill passed by Congress and approved by Trump is a huge bailout of the rich, with crumbs for the poor and middle class. So little, in fact, that there may be widespread hunger soon. Cuomo is pushing forward with his cuts, and I’m sure Johnson will live down to expectations.

Ian Welsh

I’m genuinely shocked that the current UK government’s approval ratings are so high. Yes, they’re covering 80% of the salary of those laid off, but the TUC was pushing for an even higher figure. It’s like we’re congratulating neoliberal idiots for destroying our collective ability to respond to this crisis effectively.


As Coronavirus Surveillance Escalates, Personal Privacy Plummets

Yet ratcheting up surveillance to combat the pandemic now could permanently open the doors to more invasive forms of snooping later. It is a lesson Americans learned after the terrorist attacks of Sept. 11, 2001, civil liberties experts say.

Nearly two decades later, law enforcement agencies have access to higher-powered surveillance systems, like fine-grained location tracking and facial recognition — technologies that may be repurposed to further political agendas like anti-immigration policies. Civil liberties experts warn that the public has little recourse to challenge these digital exercises of state power.

Natasha Singer and Choe Sang-Hun (The New York Times)

I’ve seen a lot of suggestions around smartphone tracking to help with the pandemic response. How, exactly, when it’s trivial to spoof your location? It’s just more surveillance by the back door.


How to Resolve Any Conflict in Your Team

Have you ever noticed that when you argue with someone smart, if you manage to debunk their initial reasoning, they just shift to a new, logical-sounding reason?

Reasons are like a salamander’s legs — if you cut one off, another grows in its place.

When you’re dealing with a salamander, you need to get to the heart. Forget about reasoning and focus on what’s causing the emotions. According to [non-violent communication], every negative emotion is the result of an unmet, universal need.

Dave Bailey

Great advice here, especially for those who work in organisations (or who have clients) who lack emotional intelligence.


2026 – the year of the face to face pivot

When the current crisis is over in terms of infection, the social and economic impact will be felt for a long time. One such hangover is likely to be the shift to online for so much of work and interaction. As the cartoon goes “these meetings could’ve been emails all along”. So let’s jump forward then a few years when online is the norm.

Martin Weller (The Ed Techie)

Some of the examples given in this post gave me a much-needed chuckle.


Now’s the time – 15 epic video games for the socially isolated

However, now that many of us are finding we have time on our hands, it could be the opportunity we need to attempt some of the more chronologically demanding narrative video game masterpieces of the last decade.

Keith Stuart (The Guardian)

Well, yes, but what we probably need even more is multiplayer mode. Red Dead Redemption II is on this list, and it’s one of the best games ever made. However, it’s tinged with huge sadness for me as it’s a game I greatly enjoyed playing with the late, great Dai Barnes.


Enjoy this? Sign up for the weekly roundup, become a supporter, or download Thought Shrapnel Vol.1: Personal Productivity!


Header image by Alex Fu

Friday facings

This week’s links seem to have a theme about faces and looking at them through screens. I’m not sure what that says about either my network or my interests, but there we are…

As ever, let me know what resonates with you, and if you have any thoughts on what’s shared below!


The Age of Instagram Face

The human body is an unusual sort of Instagram subject: it can be adjusted, with the right kind of effort, to perform better and better over time. Art directors at magazines have long edited photos of celebrities to better match unrealistic beauty standards; now you can do that to pictures of yourself with just a few taps on your phone.

Jia Tolentino (The New Yorker)

People, especially women (though there’s increasing pressure on young men too), are literally going to see plastic surgeons with ‘Facetuned’ versions of themselves. It’s hard not to think that we’re heading for a kind of dystopia when people want to look like cartoonish versions of themselves.


What Makes A Good Person?

What I learned as a child is that most people don’t even meet the responsibilities of their positions (husband, wife, teacher, boss, politicians, whatever.) A few do their duty, and I honor them for it, because it is rare. But to go beyond that and actually be a man of honor is unbelievably rare.

Ian Welsh

This question, which I’ve been discussing with my therapist, is one I ask myself all the time. Recently, I’ve settled on Marcus Aurelius’ approach: “Waste no more time arguing about what a good man should be. Be one.”


Boredom is but a window to a sunny day beyond the gloom

Boredom can be our way of telling ourselves that we are not spending our time as well as we could, that we should be doing something more enjoyable, more useful, or more fulfilling. From this point of view, boredom is an agent of change and progress, a driver of ambition, shepherding us out into larger, greener pastures.

Neel Burton (Aeon)

As I’ve discussed before, I’m not so sure about the fetishisation of ‘boredom’. It’s good to be creative and let the mind wander. But boredom? Nah. There’s too much interesting stuff out there.


Resting Risk Face

Unlock your devices with a surgical mask that looks just like you.

I don’t usually link to products in this roundup, but I’m not sure this is 100% serious. Good idea, though!


The world’s biggest work-from-home experiment has been triggered by coronavirus

For some employees, like teachers who have conducted classes digitally for weeks, working from home can be a nightmare.
But in other sectors, this unexpected experiment has been so well received that employers are considering adopting it as a more permanent measure. For those who advocate more flexible working options, the past few weeks mark a possible step toward widespread — and long-awaited — reform.

Jessie Yeung (CNN)

Every cloud has a silver lining, I guess? Working from home is great, especially when you have a decent setup.


Setting Up Your Webcam, Lights, and Audio for Remote Work, Podcasting, Videos, and Streaming

Only you really know what level of clarity you want from each piece of your setup. Are you happy with what you have? Please, dear Lord, don’t spend any money. This is intended to be a resource if you want more and don’t know how to do it, not a stress or a judgment to anyone happy with their current setup

And while it’s a lot of fun to have a really high-quality webcam for my remote work, would I have bought it if I didn’t have a more intense need for high quality video for my YouTube stuff? Hell no. Get what you need, in your budget. This is just a resource.

This is a fantastic guide. I bought a great webcam when I saw it drop in price via CamelCamelCamel, and a decent mic when I recorded the TIDE podcast with Dai. It really does make a difference.


Large screen phones: a challenge for UX design (and human hands)

I know it might sound like I have more questions than answers, but it seems to me that we are missing out on a very basic solution for the screen size problem. Manufacturers did so much to increase the screen size, computational power and battery capacity whilst keeping phones thin, that switching the apps navigation to the bottom should have been the automatic response to this new paradigm.

Maria Grilo (Imaginary Cloud)

The struggle is real. I invested in a new phone this week (a OnePlus 7 Pro 5G) and, unlike the phone it replaced from 2017, it’s definitely a hold-with-two-hands device.


Society Desperately Needs An Alternative Web

What has also transpired is a web of unbridled opportunism and exploitation, uncertainty and disparity. We see increasing pockets of silos and echo chambers fueled by anxiety, misplaced trust, and confirmation bias. As the mainstream consumer lays witness to these intentions, we notice a growing marginalization that propels more to unplug from these communities and applications to safeguard their mental health. However, the addiction technology has produced cannot be easily remedied. In the meantime, people continue to suffer.

Hessie Jones (Forbes)

Another call to re-decentralise the web, this time based on arguments about centralised services not being able to handle the scale of abuse and fraudulent activity.


UK Google users could lose EU GDPR data protections

It is understood that Google decided to move its British users out of Irish jurisdiction because it is unclear whether Britain will follow GDPR or adopt other rules that could affect the handling of user data.

If British Google users have their data kept in Ireland, it would be more difficult for British authorities to recover it in criminal investigations.

The recent Cloud Act in the US, however, is expected to make it easier for British authorities to obtain data from US companies. Britain and the US are also on track to negotiate a broader trade agreement.

Samuel Gibbs (The Guardian)

I’m sure this is a business decision as well, but I guess it makes sense given post-Brexit uncertainty about privacy legislation. It’s a shame, though, and a little concerning.




Header image by Luc van Loon

To others we are not ourselves but a performer in their lives cast for a part we do not even know that we are playing

Surveillance, technology, and society

Last week, the London Metropolitan Police (‘the Met’) proudly announced that they’ve begun using ‘LFR’, which is their neutral-sounding acronym for something incredibly invasive to the privacy of everyday people in Britain’s capital: Live Facial Recognition.

It’s obvious that the Met expect some pushback here:

The Met will begin operationally deploying LFR at locations where intelligence suggests we are most likely to locate serious offenders. Each deployment will have a bespoke ‘watch list’, made up of images of wanted individuals, predominantly those wanted for serious and violent offences. 

At a deployment, cameras will be focused on a small, targeted area to scan passers-by. The cameras will be clearly signposted and officers deployed to the operation will hand out leaflets about the activity. The technology, which is a standalone system, is not linked to any other imaging system, such as CCTV, body worn video or ANPR.

London Metropolitan Police

Note the talk of ‘intelligence’ and ‘bespoke watch lists’, as well as promises that LFR will not be linked to any other systems. (ANPR, for those not familiar with it, is ‘Automatic Number Plate Recognition’.) This, of course, is the thin end of the wedge and how these things start — in a ‘targeted’ way. They’re expanded later, often when the fuss has died down.


Meanwhile, a lot of controversy surrounds an app called Clearview AI which scrapes publicly-available data (e.g. Twitter or YouTube profiles) and applies facial recognition algorithms. It’s already in use by law enforcement in the USA.

The size of the Clearview database dwarfs others in use by law enforcement. The FBI’s own database, which taps passport and driver’s license photos, is one of the largest, with over 641 million images of US citizens.

The Clearview app isn’t available to the public, but the Times says police officers and Clearview investors think it will be in the future.

The startup said in a statement Tuesday that its “technology is intended only for use by law enforcement and security personnel. It is not intended for use by the general public.” 

Edward Moyer (CNET)

So there we are again: the technology is ‘intended’ for one purpose, but the general feeling is that it will leak out into others. Imagine a situation in which anyone could identify almost anyone on the planet simply by pointing their smartphone at them for a few seconds.

This is a huge issue, and one that politicians and lawmakers on both sides of the Atlantic are particularly concerned about, yet ill-equipped to deal with. As the BBC reports, the European Commission is considering a five-year ban on facial recognition in public spaces while it figures out how to regulate the technology:

The Commission set out its plans in an 18-page document, suggesting that new rules will be introduced to bolster existing regulation surrounding privacy and data rights.

It proposed imposing obligations on both developers and users of artificial intelligence, and urged EU countries to create an authority to monitor the new rules.

During the ban, which would last between three and five years, “a sound methodology for assessing the impacts of this technology and possible risk management measures could be identified and developed”.

BBC News

I can’t see the genie going back in this particular bottle and, as Ian Welsh puts it, this is the end of public anonymity. He gives examples of the potential for all kinds of abuse, from an increase in rape, to abuse by corporations, to an increase in parental surveillance of children.

The larger issue is this: people who are constantly under surveillance become super conformers out of defense. Without true private time, the public persona and the private personality tend to collapse together. You need a backstage — by yourself and with a small group of friends to become yourself. You need anonymity.

When everything you do is open to criticism by everyone, you will become timid and conforming.

When governments, corporations, schools and parents know everything, they will try to control everything. This often won’t be for your benefit.

Ian Welsh

We already know that self-censorship is the worst kind of censorship, and live facial recognition means we’re going to have to do a whole lot more of it in the near future.

So what can we do about it? Welsh thinks this technology should be made illegal, which is one option. However, you can’t un-invent technologies, so live facial recognition is going to be used (lawfully) by some organisations, even if it’s restricted to state operatives. I’m not sure whether that would be better or worse than everyone having it.


At a recent workshop I ran, I was talking during one of the breaks to one person who couldn’t really see the problem I had raised about surveillance capitalism. I have to wonder if they would have a problem with live facial recognition? From our conversation, I’d suspect not.

Remember that facial recognition is not 100% accurate and (realistically) never can be. So there will be false positives. Let’s say your face ends up on a ‘watch list’ or a ‘bad actor’ database shared with many different agencies and retailers. All of a sudden, you’ve got yourself a very big problem.


As BuzzFeed News reports, around half of US retailers are either using live facial recognition, or have plans to use it. At the moment, companies like FaceFirst do not facilitate the sharing of data across their clients, but you can see what’s coming next:

[Peter Trepp, CEO of FaceFirst] said the database is not shared with other retailers or with FaceFirst directly. All retailers have their own policies, but Trepp said often stores will offer not to press charges against apprehended shoplifters if they agree to opt into the store’s shoplifter database. The files containing the images and identities of people on “the bad guy list” are encrypted and only accessible to retailers using their own systems, he said.

FaceFirst automatically purges visitor data that does not match information in a criminal database every 14 days, which is the company’s minimum recommendation for auto-purging data. It’s up to the retailer if apprehended shoplifters or people previously on the list can later opt out of the database.

Leticia Miranda (BuzzFeed News)

There is no opt-in, no consent sought or gathered by retailers. This is a perfect example of technology being light years ahead of lawmaking.


This is all well and good in situations where adults are going into public spaces, but what about schools, where children are often only one step above prisoners in terms of the rights they enjoy?

Recode reports that, in schools, the surveillance threat to students goes beyond facial recognition. So long as authorities know generally what a student looks like, they can track them everywhere they go:

Appearance Search can find people based on their age, gender, clothing, and facial characteristics, and it scans through videos like facial recognition tech — though the company that makes it, Avigilon, says it doesn’t technically count as a full-fledged facial recognition tool

Even so, privacy experts told Recode that, for students, the distinction doesn’t necessarily matter. Appearance Search allows school administrators to review where a person has traveled throughout campus — anywhere there’s a camera — using data the system collects about that person’s clothing, shape, size, and potentially their facial characteristics, among other factors. It also allows security officials to search through camera feeds using certain physical descriptions, like a person’s age, gender, and hair color. So while the tool can’t say who the person is, it can find where else they’ve likely been.

Rebecca Heilweil (Recode)

This is a good example of the boundaries of technology that may-or-may-not be banned at some point in the future. The makers of Appearance Search, Avigilon, claim that it’s not facial recognition technology because the images it captures and analyses are not tied to the identity of a particular person:

Avigilon’s surveillance tool exists in a gray area: Even privacy experts are conflicted over whether or not it would be accurate to call the system facial recognition. After looking at publicly available content about Avigilon, Leong said it would be fairer to call the system an advanced form of characterization, meaning that the system is making judgments about the attributes of that person, like what they’re wearing or their hair, but it’s not actually claiming to know their identity.

Rebecca Heilweil (Recode)

You can give as many examples of the technology being used for good as you want — there’s one in this article about how the system helped discover a girl was being bullied, for example — but it’s still intrusive surveillance. There are other ways of getting to the same outcome.


We do not live in a world of certainty. We live in a world where things are ambiguous, unsure, and sometimes a little dangerous. While we should seek to protect one another, and especially those who are most vulnerable in society, we should think about the harm we’re doing by forcing people to live the totality of their lives in public.

What does that do to our conceptions of self? To creativity? To activism? Live facial recognition technology, as well as those technologies that exist in a grey area around it, is the hot-button issue of the 2020s.


Image by Kirill Sharkovski. Quotation-as-title by Elizabeth Bibesco.

Friday festoonings

Check out these things I read and found interesting this week. Thanks to some positive feedback, I’ve carved out time for some commentary, and changed the way this link roundup is set out.

Let me know what you think! What did you find most interesting?


Maps Are Biased Against Animals

Critics may say that it is unreasonable to expect maps to reflect the communities or achievements of nonhumans. Maps are made by humans, for humans. When beavers start Googling directions to a neighbor’s dam, then their homes can be represented! For humans who use maps solely to navigate—something that nonhumans do without maps—man-made roads are indeed the only features that are relevant. Following a map that includes other information may inadvertently lead a human onto a trail made by and for deer.

But maps are not just tools to get from points A to B. They also relay new and learned information, document evolutionary changes, and inspire intrepid exploration. We operate on the assumption that our maps accurately reflect what a visitor would find if they traveled to a particular area. Maps have immense potential to illustrate the world around us, identifying all the important features of a given region. By that definition, the current maps that most humans use fall well short of being complete. Our definition of what is “important” is incredibly narrow.

Ryan Huling (WIRED)

Cartography is an incredibly powerful tool. We’ve known for a long time that “the map is not the territory” but perhaps this is another weapon in the fight against climate change and the decline in diversity of species?


Why Actually Principled People Are Difficult (Glenn Greenwald Edition)

Then you get people like Greenwald, Assange, Manning and Snowden. They are polarizing figures. They are loved or hated. They piss people off.

They piss people off precisely because they have principles they consider non-negotiable. They will not do the easy thing when it matters. They will not compromise on anything that really matters.

That’s breaking the actual social contract of “go along to get along”, “obey authority” and “don’t make people uncomfortable.” I recently talked to a senior activist who was uncomfortable even with the idea of yelling at powerful politicians. It struck them as close to violence.

So here’s the thing, people want men and women of principle to be like ordinary people.

They aren’t. They can’t be. If they were, they wouldn’t do what they do. Much of what you may not like about a Greenwald or Assange or Manning or Snowden is why they are what they are. Not just the principle, but the bravery verging on recklessness. The willingness to say exactly what they think, and do exactly what they believe is right even if others don’t.

Ian Welsh

Activists like Greta Thunberg and Edward Snowden are the closest we get to superheroes, to people who stand for the purest possible version of an idea. This is why we need them — and why we’re so disappointed when they turn out to be human after all.


Explicit education

Students’ not comprehending the value of engaging in certain ways is more likely to be a failure in our teaching than their willingness to learn (especially if we create a culture in which success becomes exclusively about marks and credentialization). The question we have to ask is if what we provide as ‘university’ goes beyond the value of what our students can engage with outside of our formal offer. 

Dave White

This is a great post by Dave, who I had the pleasure of collaborating with briefly during my stint at Jisc. I definitely agree that any organisation walks a dangerous path when it becomes overly-fixated on the ‘how’ instead of the ‘what’ and the ‘why’.


What Are Your Rules for Life? These 11 Expressions (from Ancient History) Might Help

The power of an epigram or one of these expressions is that they say a lot with a little. They help guide us through the complexity of life with their unswerving directness. Each person must, as the retired USMC general and former Secretary of Defense Jim Mattis, has said, “Know what you will stand for and, more important, what you won’t stand for.” “State your flat-ass rules and stick to them. They shouldn’t come as a surprise to anyone.”

Ryan Holiday

Of the 11 expressions here, I have to say that other than memento mori (“remember you will die”) I particularly like semper anticus (“always forward”) which I’m going to print out in a fancy font and stick on the wall of my home office.


Dark Horse Discord

In a hypothetical world, you could get a Discord (or whatever is next) link for your new job tomorrow – you read some wiki and meta info, sort yourself into your role, and then are grouped with the people who you need to collaborate with on a need-be basis. All wrapped in one platform. Maybe you have an HR complaint – drop it in #HR where you can’t read the messages but they can, so it’s a blind one-way conversation. Maybe there is a #help channel, where you write your problems and the bot pings people who have expertise based on keywords. There’s a lot of things you can do with this basic design.

Mule’s Musings

What is described in this post is a bit of a stretch, but I can see it: a world where work is organised a bit like how gamers organise in chat channels. Something to keep an eye on, as the interplay between what’s ‘normal’ and what’s possible with communications technology changes and evolves.


The Edu-Decade That Was: Unfounded Optimism?

What made the last decade so difficult is how education institutions let corporations control the definitions so that a lot of “study and ethical practice” gets left out of the work. With the promise of ease of use, low-cost, increased student retention (or insert unreasonable-metric-claim here), etc. institutions are willing to buy into technology without regard to accessibility, scalability, equity and inclusion, data privacy or student safety, in hope of solving problem X that will then get to be checked off of an accreditation list. Or worse, with the hope of not having to invest in actual people and local infrastructure.

Geoff Cain (Brainstorm in progress)

It’s nice to see a list of some positives that came out of the last decade, and for microcredentials and badging to be on that list.


When Is a Bird a ‘Birb’? An Extremely Important Guide

First, let’s consider the canonized usages. The subreddit r/birbs defines a birb as any bird that’s “being funny, cute, or silly in some way.” Urban Dictionary has a more varied set of definitions, many of which allude to a generalized smallness. A video on the youtube channel Lucidchart offers its own expansive suggestions: All birds are birbs, a chunky bird is a borb, and a fluffed-up bird is a floof. Yet some tension remains: How can all birds be birbs if smallness or cuteness are in the equation? Clearly some birds get more recognition for an innate birbness.

Asher Elbein (Audubon magazine)

A fun article, but also an interesting one when it comes to ambiguity, affinity groups, and internet culture.


Why So Many Things Cost Exactly Zero

“Now, why would Gmail or Facebook pay us? Because what we’re giving them in return is not money but data. We’re giving them lots of data about where we go, what we eat, what we buy. We let them read the contents of our email and determine that we’re about to go on vacation or we’ve just had a baby or we’re upset with our friend or it’s a difficult time at work. All of these things are in our email that can be read by the platform, and then the platform’s going to use that to sell us stuff.”

Fiona Scott Morton (Yale School of Management), quoted by Peter Coy (Bloomberg Businessweek)

Regular readers of Thought Shrapnel know all about surveillance capitalism, but it’s good to see these explainers making their way to the more mainstream business press.


Your online activity is now effectively a social ‘credit score’

The most famous social credit system in operation is that used by China’s government. It “monitors millions of individuals’ behavior (including social media and online shopping), determines how moral or immoral it is, and raises or lowers their ‘citizen score’ accordingly,” reported The Atlantic in 2018.

“Those with a high score are rewarded, while those with a low score are punished.” Now we know the same AI systems are used for predictive policing to round up Muslim Uighurs and other minorities into concentration camps under the guise of preventing extremism.

Violet Blue (Engadget)

Some (more prudish) people will write this article off because it discusses sex workers, porn, and gay rights. But the truth is that all kinds of censorship start with marginalised groups. To my mind, we’re already on a trajectory away from Silicon Valley and towards Chinese technology. Will we be able to separate the tech from the morality?


Panicking About Your Kids’ Phones? New Research Says Don’t

The researchers worry that the focus on keeping children away from screens is making it hard to have more productive conversations about topics like how to make phones more useful for low-income people, who tend to use them more, or how to protect the privacy of teenagers who share their lives online.

“Many of the people who are terrifying kids about screens, they have hit a vein of attention from society and they are going to ride that. But that is super bad for society,” said Andrew Przybylski, the director of research at the Oxford Internet Institute, who has published several studies on the topic.

Nathaniel Popper (The New York Times)

Kids and screen time is just the latest (extended) moral panic. Overuse of anything causes problems, smartphones, games consoles, and TV included. What we need to do is help our children find balance, which can be difficult for the first generation of parents navigating this on the frontline.


Gorgeous header art via the latest Facebook alternative, planetary.social

Wretched is a mind anxious about the future

So said one of my favourite non-fiction authors, the 16th-century proto-blogger Michel de Montaigne. There’s plenty of writing about how we need to be anxious because of the drift towards a future of surveillance states. Eventually, because it’s not currently affecting us here and now, we become blasé. We forget that it’s already the lived experience for hundreds of millions of people.

Take China, for example. In The Atlantic, Derek Thompson writes about the Chinese government’s brutality against the Muslim Uyghur population in the western province of Xinjiang:

[The] horrifying situation is built on the scaffolding of mass surveillance. Cameras fill the marketplaces and intersections of the key city of Kashgar. Recording devices are placed in homes and even in bathrooms. Checkpoints that limit the movement of Muslims are often outfitted with facial-recognition devices to vacuum up the population’s biometric data. As China seeks to export its suite of surveillance tech around the world, Xinjiang is a kind of R&D incubator, with the local Muslim population serving as guinea pigs in a laboratory for the deprivation of human rights.

Derek Thompson

As Ian Welsh points out, surveillance states usually involve us in the West pointing towards places like China and shaking our heads. However, if you step back a moment and remember that societies like the US and UK are becoming more unequal over time, then perhaps we’re the ones who should be worried:

The endgame, as I’ve been pointing out for years, is a society in which where you are and what you’re doing, and have done is, always known, or at least knowable. And that information is known forever, so the moment someone with power wants to take you out, they can go back thru your life in minute detail. If laws or norms change so that what was OK 10 or 30 years ago isn’t OK now, well they can get you on that.

Ian Welsh

As the world becomes more unequal, the position of elites becomes more perilous, hence Silicon Valley billionaires preparing boltholes in New Zealand. Ironically, they’re looking for places where they can’t be found, while making serious money from providing surveillance technology. Instead of solving the inequality, they attempt to insulate themselves from the effect of that inequality.

A lot of the crazy amounts of money earned in Silicon Valley comes at the price of infringing our privacy. I’ve spent a long time thinking about privacy, which is quite a nebulous concept. It’s not the easiest thing to understand when you examine it more closely.

Privacy is usually considered a freedom from rather than a freedom to, as in “freedom from surveillance”. The trouble is that there are many kinds of surveillance, and some of these we actively encourage. A quick example: I know of at least one family that share their location with one another all of the time. At the same time, of course, they’re sharing it with the company that provides that service.

There’s a lot of power in the ‘default’ privacy settings devices and applications come with. People tend to go with whatever comes as standard. Sidney Fussell writes in The Atlantic that:

Many apps and products are initially set up to be public: Instagram accounts are open to everyone until you lock them… Even when companies announce convenient shortcuts for enhancing security, their products can never become truly private. Strangers may not be able to see your selfies, but you have no way to untether yourself from the larger ad-targeting ecosystem.

Sidney Fussell

Some of us (including me) are willing to trade some of that privacy for more personalised services that somehow make our lives easier. The tricky thing is when it comes to employers and state surveillance. In these cases there are coercive power relationships at play, rather than just convenience.

Ellen Sheng, writing for CNBC, explains how employees in the US are at huge risk from workplace surveillance:

In the workplace, almost any consumer privacy law can be waived. Even if companies give employees a choice about whether or not they want to participate, it’s not hard to force employees to agree. That is, unless lawmakers introduce laws that explicitly state a company can’t make workers agree to a technology…

One example: Companies are increasingly interested in employee social media posts out of concern that employee posts could reflect poorly on the company. A teacher’s aide in Michigan was suspended in 2012 after refusing to share her Facebook page with the school’s superintendent following complaints about a photo she had posted. Since then, dozens of similar cases prompted lawmakers to take action. More than 16 states have passed social media protections for individuals.

Ellen Sheng

It’s not just workplaces, though. Schools are hotbeds for new surveillance technologies, as Benjamin Herold notes in an article for Education Week:

Social media monitoring companies track the posts of everyone in the areas surrounding schools, including adults. Other companies scan the private digital content of millions of students using district-issued computers and accounts. Those services are complemented with tip-reporting apps, facial-recognition software, and other new technology systems.

[…]

While schools are typically quiet about their monitoring of public social media posts, they generally disclose to students and parents when digital content created on district-issued devices and accounts will be monitored. Such surveillance is typically done in accordance with schools’ responsible-use policies, which students and parents must agree to in order to use districts’ devices, networks, and accounts.
Hypothetically, students and families can opt out of using that technology. But doing so would make participating in the educational life of most schools exceedingly difficult.

Benjamin Herold

In China, of course, a social credit system makes all of this a million times worse, but we in the West aren’t heading in a great direction either.

We’re entering a time where, by the time my children are my age, companies, employers, and the state could have decades of data on them, from when they entered the school system through to finding jobs and becoming parents themselves.

There are upsides to all of this data, obviously. But I think that in the midst of privacy-focused conversations about Amazon’s smart speakers and Google location-sharing, we might be missing the bigger picture around surveillance by educational institutions, employers, and governments.

Returning to Ian Welsh to finish up, remember that it’s the coercive power relationships that make surveillance a bad thing:

Surveillance societies are sterile societies. Everyone does what they’re supposed to do all the time, and because we become what we do, it affects our personalities. It particularly affects our creativity, and is a large part of why Communist surveillance societies were less creative than the West, particularly as their police states ramped up.

Ian Welsh

We don’t want to think about all of this, though, do we?

