
Friday festoonings

Check out these things I read and found interesting this week. Thanks to some positive feedback, I’ve carved out time for some commentary, and changed the way this link roundup is set out.

Let me know what you think! What did you find most interesting?


Maps Are Biased Against Animals

Critics may say that it is unreasonable to expect maps to reflect the communities or achievements of nonhumans. Maps are made by humans, for humans. When beavers start Googling directions to a neighbor’s dam, then their homes can be represented! For humans who use maps solely to navigate—something that nonhumans do without maps—man-made roads are indeed the only features that are relevant. Following a map that includes other information may inadvertently lead a human onto a trail made by and for deer.

But maps are not just tools to get from points A to B. They also relay new and learned information, document evolutionary changes, and inspire intrepid exploration. We operate on the assumption that our maps accurately reflect what a visitor would find if they traveled to a particular area. Maps have immense potential to illustrate the world around us, identifying all the important features of a given region. By that definition, the current maps that most humans use fall well short of being complete. Our definition of what is “important” is incredibly narrow.

Ryan Huling (WIRED)

Cartography is an incredibly powerful tool. We’ve known for a long time that “the map is not the territory”, but perhaps this is another weapon in the fight against climate change and the decline in species diversity?


Why Actually Principled People Are Difficult (Glenn Greenwald Edition)

Then you get people like Greenwald, Assange, Manning and Snowden. They are polarizing figures. They are loved or hated. They piss people off.

They piss people off precisely because they have principles they consider non-negotiable. They will not do the easy thing when it matters. They will not compromise on anything that really matters.

That’s breaking the actual social contract of “go along to get along”, “obey authority” and “don’t make people uncomfortable.” I recently talked to a senior activist who was uncomfortable even with the idea of yelling at powerful politicians. It struck them as close to violence.

So here’s the thing, people want men and women of principle to be like ordinary people.

They aren’t. They can’t be. If they were, they wouldn’t do what they do. Much of what you may not like about a Greenwald or Assange or Manning or Snowden is why they are what they are. Not just the principle, but the bravery verging on recklessness. The willingness to say exactly what they think, and do exactly what they believe is right even if others don’t.

Ian Welsh

Activists like Greta Thunberg and Edward Snowden are the closest we get to superheroes, to people who stand for the purest possible version of an idea. This is why we need them — and why we’re so disappointed when they turn out to be human after all.


Explicit education

Students’ not comprehending the value of engaging in certain ways is more likely to be a failure in our teaching than their willingness to learn (especially if we create a culture in which success becomes exclusively about marks and credentialization). The question we have to ask is if what we provide as ‘university’ goes beyond the value of what our students can engage with outside of our formal offer. 

Dave White

This is a great post by Dave, with whom I had the pleasure of collaborating briefly during my stint at Jisc. I definitely agree that any organisation walks a dangerous path when it becomes overly fixated on the ‘how’ instead of the ‘what’ and the ‘why’.


What Are Your Rules for Life? These 11 Expressions (from Ancient History) Might Help

The power of an epigram or one of these expressions is that they say a lot with a little. They help guide us through the complexity of life with their unswerving directness. Each person must, as the retired USMC general and former Secretary of Defense Jim Mattis has said, “Know what you will stand for and, more important, what you won’t stand for.” “State your flat-ass rules and stick to them. They shouldn’t come as a surprise to anyone.”

Ryan Holiday

Of the 11 expressions here, other than memento mori (“remember you will die”), I particularly like semper anticus (“always forward”), which I’m going to print out in a fancy font and stick on the wall of my home office.


Dark Horse Discord

In a hypothetical world, you could get a Discord (or whatever is next) link for your new job tomorrow – you read some wiki and meta info, sort yourself into the role you’d fill, and then are grouped with the people you need to collaborate with on an as-needed basis. All wrapped in one platform. Maybe you have an HR complaint – drop it in #HR, where you can’t read the messages but they can, so it’s a blind one-way conversation. Maybe there is a #help channel, where you write your problems and a bot pings people who have expertise based on keywords. There’s a lot you can do with this basic design.

Mule’s Musings

What is described in this post is a bit of a stretch, but I can see it: a world where work is organised a bit like how gamers organise themselves in chat channels. Something to keep an eye on, as the interplay between what’s ‘normal’ and what’s possible with communications technology changes and evolves.
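The keyword-routing bot in that excerpt is simple enough to sketch. Here’s a minimal, hypothetical version in Python — the expert directory and function names are my own invention for illustration, not any real Discord bot API:

```python
import re

# Hypothetical directory mapping topic keywords to people with that expertise.
EXPERTS = {
    "payroll": ["@dana"],
    "kubernetes": ["@sam", "@lee"],
    "onboarding": ["@kim"],
}

def route_question(message: str) -> list[str]:
    """Return the people to ping, based on keywords found in the message."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    pinged = []
    for keyword, people in EXPERTS.items():
        if keyword in words:
            for person in people:
                if person not in pinged:
                    pinged.append(person)
    return pinged

print(route_question("Anyone know why the kubernetes deploy fails during onboarding?"))
```

A real implementation would hang this off a chat platform’s message events and probably use fuzzier matching than exact keywords, but the basic design is just this: a lookup table plus a notification.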


The Edu-Decade That Was: Unfounded Optimism?

What made the last decade so difficult is how education institutions let corporations control the definitions so that a lot of “study and ethical practice” gets left out of the work. With the promise of ease of use, low-cost, increased student retention (or insert unreasonable-metric-claim here), etc. institutions are willing to buy into technology without regard to accessibility, scalability, equity and inclusion, data privacy or student safety, in hope of solving problem X that will then get to be checked off of an accreditation list. Or worse, with the hope of not having to invest in actual people and local infrastructure.

Geoff Cain (Brainstorm in progress)

It’s nice to see a list of some positives that came out of the last decade, and for microcredentials and badging to be on that list.


When Is a Bird a ‘Birb’? An Extremely Important Guide

First, let’s consider the canonized usages. The subreddit r/birbs defines a birb as any bird that’s “being funny, cute, or silly in some way.” Urban Dictionary has a more varied set of definitions, many of which allude to a generalized smallness. A video on the youtube channel Lucidchart offers its own expansive suggestions: All birds are birbs, a chunky bird is a borb, and a fluffed-up bird is a floof. Yet some tension remains: How can all birds be birbs if smallness or cuteness are in the equation? Clearly some birds get more recognition for an innate birbness.

Asher Elbein (Audubon magazine)

A fun article, but also an interesting one when it comes to ambiguity, affinity groups, and internet culture.


Why So Many Things Cost Exactly Zero

“Now, why would Gmail or Facebook pay us? Because what we’re giving them in return is not money but data. We’re giving them lots of data about where we go, what we eat, what we buy. We let them read the contents of our email and determine that we’re about to go on vacation or we’ve just had a baby or we’re upset with our friend or it’s a difficult time at work. All of these things are in our email that can be read by the platform, and then the platform’s going to use that to sell us stuff.”

Fiona Scott Morton (Yale School of Management), quoted by Peter Coy (Bloomberg Businessweek)

Regular readers of Thought Shrapnel know all about surveillance capitalism, but it’s good to see these explainers making their way to the more mainstream business press.


Your online activity is now effectively a social ‘credit score’

The most famous social credit system in operation is that used by China’s government. It “monitors millions of individuals’ behavior (including social media and online shopping), determines how moral or immoral it is, and raises or lowers their “citizen score” accordingly,” reported The Atlantic in 2018.

“Those with a high score are rewarded, while those with a low score are punished.” Now we know the same AI systems are used for predictive policing to round up Muslim Uighurs and other minorities into concentration camps under the guise of preventing extremism.

Violet Blue (Engadget)

Some (more prudish) people will write this article off because it discusses sex workers, porn, and gay rights. But the truth is that all kinds of censorship start with marginalised groups. To my mind, we’re already on a trajectory away from Silicon Valley and towards Chinese technology. Will we be able to separate the tech from the morality?


Panicking About Your Kids’ Phones? New Research Says Don’t

The researchers worry that the focus on keeping children away from screens is making it hard to have more productive conversations about topics like how to make phones more useful for low-income people, who tend to use them more, or how to protect the privacy of teenagers who share their lives online.

“Many of the people who are terrifying kids about screens, they have hit a vein of attention from society and they are going to ride that. But that is super bad for society,” said Andrew Przybylski, the director of research at the Oxford Internet Institute, who has published several studies on the topic.

Nathaniel Popper (The New York Times)

Kids and screentime is just the latest (extended) moral panic. Overuse of anything causes problems, smartphones, games consoles, and TV included. What we need to do is help our children find balance, which can be difficult for the first generation of parents navigating it on the frontline.



Wretched is a mind anxious about the future

So said one of my favourite non-fiction authors, the 16th century proto-blogger Michel de Montaigne. There’s plenty of writing about how we need to be anxious because of the drift towards a future of surveillance states. Eventually, because it’s not currently affecting us here and now, we become blasé. We forget that it’s already the lived experience for hundreds of millions of people.

Take China, for example. In The Atlantic, Derek Thompson writes about the Chinese government’s brutality against the Muslim Uyghur population in the western province of Xinjiang:

[The] horrifying situation is built on the scaffolding of mass surveillance. Cameras fill the marketplaces and intersections of the key city of Kashgar. Recording devices are placed in homes and even in bathrooms. Checkpoints that limit the movement of Muslims are often outfitted with facial-recognition devices to vacuum up the population’s biometric data. As China seeks to export its suite of surveillance tech around the world, Xinjiang is a kind of R&D incubator, with the local Muslim population serving as guinea pigs in a laboratory for the deprivation of human rights.

Derek Thompson

As Ian Welsh points out, surveillance states usually involve us in the West pointing towards places like China and shaking our heads. However, if you step back a moment and remember that societies like the US and UK are becoming more unequal over time, then perhaps we’re the ones who should be worried:

The endgame, as I’ve been pointing out for years, is a society in which where you are, what you’re doing, and what you have done is always known, or at least knowable. And that information is known forever, so the moment someone with power wants to take you out, they can go back thru your life in minute detail. If laws or norms change so that what was OK 10 or 30 years ago isn’t OK now, well, they can get you on that.

Ian Welsh

As the world becomes more unequal, the position of elites becomes more perilous, hence Silicon Valley billionaires preparing boltholes in New Zealand. Ironically, they’re looking for places where they can’t be found, while making serious money from providing surveillance technology. Instead of solving the inequality, they attempt to insulate themselves from the effect of that inequality.

A lot of the crazy amounts of money earned in Silicon Valley comes at the price of infringing our privacy. I’ve spent a long time thinking about this quite nebulous concept. It’s not the easiest thing to understand when you examine it more closely.

Privacy is usually considered a freedom from rather than a freedom to, as in “freedom from surveillance”. The trouble is that there are many kinds of surveillance, and some of these we actively encourage. A quick example: I know of at least one family that share their location with one another all of the time. At the same time, of course, they’re sharing it with the company that provides that service.

There’s a lot of power in the ‘default’ privacy settings devices and applications come with. People tend to go with whatever comes as standard. Sidney Fussell writes in The Atlantic that:

Many apps and products are initially set up to be public: Instagram accounts are open to everyone until you lock them… Even when companies announce convenient shortcuts for enhancing security, their products can never become truly private. Strangers may not be able to see your selfies, but you have no way to untether yourself from the larger ad-targeting ecosystem.

Sidney Fussell

Some of us (including me) are willing to trade some of that privacy for more personalised services that somehow make our lives easier. Things get trickier when it comes to employer and state surveillance. In those cases there are coercive power relationships at play, rather than just convenience.

Ellen Sheng, writing for CNBC, explains how employees in the US are at huge risk from workplace surveillance:

In the workplace, almost any consumer privacy law can be waived. Even if companies give employees a choice about whether or not they want to participate, it’s not hard to force employees to agree. That is, unless lawmakers introduce laws that explicitly state a company can’t make workers agree to a technology…

One example: Companies are increasingly interested in employee social media posts out of concern that employee posts could reflect poorly on the company. A teacher’s aide in Michigan was suspended in 2012 after refusing to share her Facebook page with the school’s superintendent following complaints about a photo she had posted. Since then, dozens of similar cases prompted lawmakers to take action. More than 16 states have passed social media protections for individuals.

Ellen Sheng

It’s not just workplaces, though. Schools are hotbeds for new surveillance technologies, as Benjamin Herold notes in an article for Education Week:

Social media monitoring companies track the posts of everyone in the areas surrounding schools, including adults. Other companies scan the private digital content of millions of students using district-issued computers and accounts. Those services are complemented with tip-reporting apps, facial-recognition software, and other new technology systems.

[…]

While schools are typically quiet about their monitoring of public social media posts, they generally disclose to students and parents when digital content created on district-issued devices and accounts will be monitored. Such surveillance is typically done in accordance with schools’ responsible-use policies, which students and parents must agree to in order to use districts’ devices, networks, and accounts.
Hypothetically, students and families can opt out of using that technology. But doing so would make participating in the educational life of most schools exceedingly difficult.

Benjamin Herold

In China, of course, a social credit system makes all of this a million times worse, but we in the West aren’t heading in a great direction either.

We’re entering a time where, by the time my children are my age, companies, employers, and the state could hold decades of data on them, from when they entered the school system through to finding jobs and becoming parents themselves.

There are upsides to all of this data, obviously. But I think that in the midst of privacy-focused conversations about Amazon’s smart speakers and Google location-sharing, we might be missing the bigger picture around surveillance by educational institutions, employers, and governments.

Returning to Ian Welsh to finish up, remember that it’s the coercive power relationships that make surveillance a bad thing:

Surveillance societies are sterile societies. Everyone does what they’re supposed to do all the time, and because we become what we do, it affects our personalities. It particularly affects our creativity, and is a large part of why Communist surveillance societies were less creative than the West, particularly as their police states ramped up.

Ian Welsh

We don’t want to think about all of this, though, do we?


Also check out: