You can’t tech your way out of problems the tech didn’t create
The Electronic Frontier Foundation (EFF) is a US-based non-profit that exists to defend civil liberties in the digital world. They've been around for 30 years, and I support them financially on a monthly basis.
In this article, Corynne McSherry, EFF's Legal Director, outlines the futility of attempts by 'Big Social' to do content moderation at scale:
[C]ontent moderation is a fundamentally broken system. It is inconsistent and confusing, and as layer upon layer of policy is added to a system that employs both human moderators and automated technologies, it is increasingly error-prone. Even well-meaning efforts to control misinformation inevitably end up silencing a range of dissenting voices and hindering the ability to challenge ingrained systems of oppression.
Corynne McSherry, Content Moderation and the U.S. Election: What to Ask, What to Demand (EFF)
Ultimately, these monolithic social networks have a problem with false positives. It's in their interests to be over-zealous, as they're increasingly under the watchful eye of regulators and governments.
We have been watching closely as Facebook, YouTube, and Twitter, while disclaiming any interest in being “the arbiters of truth,” have all adjusted their policies over the past several months to try to arbitrate lies—or at least flag them. And we’re worried, especially when we look abroad. Already this year, an attempt by Facebook to counter election misinformation targeting Tunisia, Togo, Côte d’Ivoire, and seven other African countries resulted in the accidental removal of accounts belonging to dozens of Tunisian journalists and activists, some of whom had used the platform during the country’s 2011 revolution. While some of those users’ accounts were restored, others—mostly belonging to artists—were not.
Corynne McSherry, Content Moderation and the U.S. Election: What to Ask, What to Demand (EFF)
McSherry's analysis is spot-on: it's the algorithms that are the problem here. Social networks employ these algorithms because of their size and structure, and because of the cost of human-based content moderation. After all, these are companies with shareholders.
Algorithms used by Facebook’s Newsfeed or Twitter’s timeline make decisions about which news items, ads, and user-generated content to promote and which to hide. That kind of curation can play an amplifying role for some types of incendiary content, despite the efforts of platforms like Facebook to tweak their algorithms to “disincentivize” or “downrank” it. Features designed to help people find content they’ll like can too easily funnel them into a rabbit hole of disinformation.
Corynne McSherry, Content Moderation and the U.S. Election: What to Ask, What to Demand (EFF)
She includes useful questions for social networks to answer about content moderation:
- Is the approach narrowly tailored or a categorical ban?
- Does it empower users?
- Is it transparent?
- Is the policy consistent with human rights principles?
But, ultimately...
You can’t tech your way out of problems the tech didn’t create. And even where content moderation has a role to play, history tells us to be wary. Content moderation at scale is impossible to do perfectly, and nearly impossible to do well, even under the most transparent, sensible, and fair conditions.
Corynne McSherry, Content Moderation and the U.S. Election: What to Ask, What to Demand (EFF)
I'm so pleased that I don't use Facebook products, and that I only use Twitter these days as a place to publish links to my writing.
Instead, I'm much happier on the Fediverse, a place where if you don't like the content moderation approach of the instance you're on, you can take your digital knapsack and decide to call another place home. You can find me here (for now!).
Even those of a harsh and unyielding nature will endure gentle treatment: no creature is fierce and frightening if it is stroked

🌼 Digital gardens let you cultivate your own little bit of the internet
⛏️ Dorset mega henge may be ‘last hurrah’ of stone-age builders
📺 Culture to cheer you up during the second lockdown: part one
Quotation-as-title by Seneca. Image from top-linked post.
When people are free to do as they please, they usually imitate each other

🙆 How I talk to the victims of conspiracy theories
🔒 The Github youtube-dl Takedown Isn't Just a Problem of American Law
🖥️ The Raspberry Pi 400 - Teardown and Review
🐧 As a former social media analyst, I'm quitting Twitter
Quotation-as-title by Eric Hoffer. Image from top-linked post.
Reafferent loops
In Peter Godfrey-Smith's book Other Minds, he cites work from 1950 by the German physiologists Erich von Holst and Horst Mittelstaedt.
They used the term afference to refer to everything you take in through the senses. Some of what comes in is due to the changes in the objects around you — that is exafference... — and some of what comes in is due to your own actions: that is reafference.
Peter Godfrey-Smith, Other Minds, p.154
Godfrey-Smith is talking about octopuses and other cephalopods, but I think what he's discussing is interesting from a digital note-taking point of view.
To write a note and read it is to create a reafferent loop. Rather than wanting to perceive only the things that are not due to you — finding the exafferent among the noise in the senses — you want what you read to be entirely due to your previous action. You want the contents of the note to be due to your acts rather than someone else's meddling, or the natural decay of the notepad. You want the loop between present action and future perception to be firm. This enables you to create a form of external memory — as was, almost certainly, the role of much early writing (which is full of records of goods and transactions), and perhaps also the role of some early pictures, though that is much less clear.
When a written message is directed at others, it's ordinary communication. When you write something for yourself to read, there's usually an essential role for time — the goal is memory, in a broad sense. But memory like this is a communicative phenomenon; it is communication between your present self and a future self. Diaries and notes-to-self are embedded in a sender/receiver system just like more standard forms of communication.
Peter Godfrey-Smith, Other Minds, p.154-155
Some people talk about digital note-taking as a form of 'second brain'. Given the type of distributed cognition that Godfrey-Smith highlights in Other Minds, it would appear that creating reafferent loops is exactly the kind of thing that's happening.
Very interesting.
Hiring is broken, but not in the ways you assume
Hacker News is a link aggregator for people who work in tech. There's a lot of very technical information on there, but also stuff interesting to the curious mind more generally.
As so many people visit the site every day, it can be very influential, especially given the threaded discussion about shared links.
There can be a bit of a 'hive mind' sometimes, with certain things being sacred cows or implicit assumptions held by those who post (and lurk) there.
In this blog post focusing on hiring practices there's a critique of four 'myths' that seem to be prevalent in Hacker News discussions. Some of it is almost exclusively focused on tech roles in Silicon Valley, but I wanted to pull out this nugget which outlines what is really wrong with hiring:
Diversity. We really, really suck at diversity. We’re getting better, but we have a long way to go. Most of the industry chases the same candidates and assesses them in the same way.
Generally unfair practices. In cases where companies have power and candidates don’t, things can get really unfair. Lack of diversity is just one side-effect of this, others include poor candidate experiences, unfair compensation, and many others.
Short-termism. Recruiters and hiring managers that just want to fill a role at any cost, without thinking about whether there really is a fit or not. Many recruiters work on contingency, and most of them suck. The really good ones are awesome, but most of the well is poison. Hiring managers can be the same, too, when they’re under pressure to hire.
General ineptitude. Sometimes companies don’t know what they’re looking for, or are not internally aligned on it. Sometimes they just have broken processes, where they can’t keep track of who they’re talking to and what stage they’re at. Sometimes the engineers doing the interviews couldn’t care two shits about the interview or the company they work at. And often, companies are just tremendously indecisive, which makes them really slow to decide, or to just reject candidates because they can’t make up their minds.
Ozzie, 4 Hiring Myths Common in HackerNews Discussions
I've hired people and, even with the latest talent management workflow software, it's not easy. It sucks up your time, and anything/everything you do can and will be criticised.
But that doesn't mean that we can't strive to make the whole process better, more equitable, and more enjoyable for all involved.
If you have been put in your place long enough, you begin to act like the place

📉 Of Flying Cars and the Declining Rate of Profit
💪 How to walk upright and stop living in a cave
🤔 It’s Not About Intention, It’s About Action
💭 Are we losing our ability to remember?
🇺🇸 How The Presidential Candidates Spy On Their Supporters
Quotation-as-title by Randall Jarrell. Image from top-linked post.
Why we can't have nice things
There's a phrase, mostly used by Americans, in relation to something bad happening: "this is why we can't have nice things".
I'd suggest that the reason things go south is usually because people don't care enough to fix, maintain, or otherwise look after them. That goes for everything from your garden, to a giant wiki-based encyclopedia that is used as the go-to place to check facts online.
The challenge for Wikipedia in 2020 is to maintain its status as one of the last objective places on the internet, and emerge from the insanity of a pandemic and a polarizing election without being twisted into yet another tool for misinformation. Or, to put it bluntly, Wikipedia must not end up like the great, negligent social networks who barely resist as their platforms are put to nefarious uses.
Noam Cohen, Wikipedia's Plan to Resist Election Day Misinformation (WIRED)
Wikipedia's approach is based on an evolving process, one that is the opposite of "move fast and break things".
Moving slowly has been a Wikipedia super-power. By boringly adhering to rules of fairness and sourcing, and often slowly deliberating over knotty questions of accuracy and fairness, the resource has become less interesting to those bent on campaigns of misinformation with immediate payoffs.
Noam Cohen, Wikipedia's Plan to Resist Election Day Misinformation (WIRED)
I'm in danger of sounding old, and even worse, old-fashioned, but everything isn't about entertainment. Someone or something has to be the keeper of the flame.
Being a stickler for accuracy is a drag. It requires making enemies and pushing aside people or institutions who don’t act in good faith.
Noam Cohen, Wikipedia's Plan to Resist Election Day Misinformation (WIRED)
Collaboration is our default operating system
One of the reasons I'm not active on Twitter any more is the endless, pointless arguments between progressives and traditionalists, between those on the left of politics and those on the right, and between those who think that watching reality TV is an acceptable thing to spend your life doing, and those who don't.
Interestingly, a new report which draws on data from 10,000 people, focus groups, and academic interviews suggests that half of the controversy on Twitter is generated by a small proportion of users:
[The report] states that 12% of voters accounted for 50% of all social-media and Twitter users – and are six times as active on social media as are other sections of the population. The two “tribes” most oriented towards politics, labelled “progressive activists” and “backbone Conservatives”, were least likely to agree with the need for compromise. However, two-thirds of respondents who identify with either the centre, centre-left or centre-right strongly prefer compromise over conflict, by a margin of three to one.
Michael Savage, ‘Culture wars’ are fought by tiny minority – UK study (The Observer)
Interestingly, the report also shows differences between the US and UK, as well as between attitudes before and after the pandemic started:
The research also suggested that the Covid-19 crisis had prompted an outburst of social solidarity. In February, 70% of voters agreed that “it’s everyone for themselves”, with 30% agreeing that “we look after each other”. By September, the proportion who opted for “we look after each other” had increased to 54%.
More than half (57%) reported an increased awareness of the living conditions of others, 77% feel that the pandemic has reminded us of our common humanity, and 62% feel they have the ability to change things around them – an increase of 15 points since February.
Michael Savage, ‘Culture wars’ are fought by tiny minority – UK study (The Observer)
As I keep on saying, those who believe in unfettered capitalism have to perpetuate a false narrative of competition in all things to justify their position. We have more things in common than differences, and I truly believe that collaboration is our default operating system.
Everything intercepts us from ourselves

🤝 Medieval English people used to pay their rent in eels
🤺 The Mad, Mad World of Niche Sports Among Ivy League–Obsessed Parents
📜 Archaeologists unearth 'huge number' of sealed Egyptian sarcophagi
🌉 3D model of how the Charles bridge in Prague was constructed
💪 Every Man Should Be Able to Save His Own Life: 5 Fitness Benchmarks a Man Must Master
Quotation-as-title by Ralph Waldo Emerson. Image from top-linked post.
Fighting health disinformation on Wikipedia
This is great to see:
As part of efforts to stop the spread of false information about the coronavirus pandemic, Wikipedia and the World Health Organization announced a collaboration on Thursday: The health agency will grant the online encyclopedia free use of its published information, graphics and videos.
Donald G. McNeil Jr., Wikipedia and W.H.O. Join to Combat Covid Misinformation (The New York Times)
Compared to Twitter's dismal efforts at fighting disinformation, the collaboration is welcome news.
The first W.H.O. items used under the agreement are its “Mythbusters” infographics, which debunk more than two dozen false notions about Covid-19. Future additions could include, for example, treatment guidelines for doctors, said Ryan Merkley, chief of staff at the Wikimedia Foundation, which produces Wikipedia.
Donald G. McNeil Jr., Wikipedia and W.H.O. Join to Combat Covid Misinformation (The New York Times)
More proof that the for-profit private sector is in no way more 'innovative' or effective than non-profits, NGOs, and government agencies.
Seeing through is rarely seeing into

♂️ What does it mean to be a man in 2020? Introducing our new series on masculinity
✏️ Your writing style is costly (Or, a case for using punctuation in Slack)
Quotation-as-title by Elizabeth Bransco. Image from top-linked post.
Perceptions of the past

The History teacher in me likes this simple photo quiz site that shows how your perception of the past can easily be manipulated by how photographs are presented.
Gatekeepers of opportunity and the lottery of privilege
Despite starting out as a pejorative term, 'meritocracy' is something that, until recently, few people seem to have had a problem with. One of the best explanations of why meritocracy is a problematic idea is in this Mozilla article from a couple of years ago. Basically, meritocracy ascribes agency to those who were given opportunities due to pre-existing privilege.
In an interview with The Chronicle of Higher Education, Michael Sandel makes some very good points about the American university system, which can be more broadly applied to other western nations, such as the UK, which have elite universities.
The meritocratic hubris of elites is the conviction by those who land on top that their success is their own doing, that they have risen through a fair competition, that they therefore deserve the material benefits that the market showers upon their talents. Meritocratic hubris is the tendency of the successful to inhale too deeply of their success, to forget the luck and good fortune that helped them on their way. It goes along with the tendency to look down on those less fortunate, and less credentialed, than themselves. That gives rise to the sense of humiliation and resentment of those who are left out.
Michael Sandel, quoted in 'The Insufferable Hubris of the Well-Credentialed'
As someone who is reasonably well-credentialed, I nevertheless see a fundamental problem with requiring a degree as an 'entry-level' qualification. That's why I first got interested in Open Badges nearly a decade ago.
Despite the best efforts of the community, elite universities have a vested interest in maintaining the status quo. Eventually, the whole edifice will come crashing down, but right now, those universities are the gatekeepers to opportunity.
Society as a whole has made a four-year university degree a necessary condition for dignified work and a decent life. This is a mistake. Those of us in higher education can easily forget that most Americans do not have a four-year college degree. Nearly two-thirds do not.
[...]
We also need to reconsider the steep hierarchy of prestige that we have created between four-year colleges and universities, especially brand-name ones, and other institutions of learning. This hierarchy of prestige both reflects and exacerbates the tendency at the top to denigrate or depreciate the contributions to the economy made by people whose work does not depend on having a university diploma.
So the role that universities have been assigned, sitting astride the gateway of opportunity and success, is not good for those who have been left behind. But I’m not sure it’s good for elite universities themselves, either.
Michael Sandel, quoted in 'The Insufferable Hubris of the Well-Credentialed'
Thankfully, Sandel has a rather delicious solution to decouple privilege from admission to elite universities. It's not a panacea, but I like it as a first step.
What might we do about it? I make a proposal in the book that may get me in a lot of trouble in my neighborhood. Part of the problem is that having survived this high-pressured meritocratic gauntlet, it’s almost impossible for the students who win admission not to believe that they achieved their admission as a result of their own strenuous efforts. One can hardly blame them. So I think we should gently invite students to challenge this idea. I propose that colleges and universities that have far more applicants than they have places should consider what I call a “lottery of the qualified.” Over 40,000 students apply to Stanford and to Harvard for about 2,000 places. The admissions officers tell us that the majority are well-qualified. Among those, fill the first-year class through a lottery. My hunch is that the quality of discussion in our classes would in no way be impaired.
The main reason for doing this is to emphasize to students and their parents the role of luck in admission, and more broadly in success. It’s not introducing luck where it doesn’t already exist. To the contrary, there’s an enormous amount of luck in the present system. The lottery would highlight what is already the case.
Michael Sandel, quoted in 'The Insufferable Hubris of the Well-Credentialed'
Would people like me be worse off in a more egalitarian system? Probably. But that's kind of the point.
Tedious sports
This made me smile:
You can divide most sports into those that take place in the real world (road cycling, sailing, cross country running) and those that are played on the artificial space of a court or pitch. Some (golf, croquet) occupy an uncertain middle ground, which may be one of the reasons they are so tedious to watch. Others (football, rugby) started as the former and, as they were codified, became the latter.
Jon Day, Better on TV (London Review of Books)
Man is equally incapable of seeing the nothingness from which he emerges and the infinity in which he is engulfed

👻 How to hide from a drone – the subtle art of ‘ghosting’ in the age of surveillance
♻️ How to Repurpose Your Old Gadgets
🎮 What Digital Doping Means for Esports—and Everything Else
💬 Sony clarifies PS5 voice chat recording feature following privacy panic
🚗 Split-Second ‘Phantom’ Images Can Fool Tesla’s Autopilot
Quotation-as-title from Pascal. Image from top-linked post.
Biometric surveillance in a post-pandemic future
I woke up today to the news that, in the UK, the police will get access to the data on people told to self-isolate on a 'case-by-case basis'. As someone pointed out on Mastodon, this was entirely predictable.
They pointed to this article by Yuval Noah Harari from March of this year, which also feels like a decade ago. In it, he talks about post-pandemic society being a surveillance nightmare:
You could, of course, make the case for biometric surveillance as a temporary measure taken during a state of emergency. It would go away once the emergency is over. But temporary measures have a nasty habit of outlasting emergencies, especially as there is always a new emergency lurking on the horizon. My home country of Israel, for example, declared a state of emergency during its 1948 War of Independence, which justified a range of temporary measures from press censorship and land confiscation to special regulations for making pudding (I kid you not). The War of Independence has long been won, but Israel never declared the emergency over, and has failed to abolish many of the “temporary” measures of 1948 (the emergency pudding decree was mercifully abolished in 2011).
Yuval Noah Harari: the world after coronavirus (The Financial Times)
Remember the US 'war on terror'? That led to an incredible level of domestic and foreign surveillance that was revealed by Edward Snowden a few years ago.
The trouble, though, is that health is a visible, clear and present danger, while the harms to privacy are more nebulous and often lie in the future. So the trade-off is between the here and now and, well, the opposite.
Even when infections from coronavirus are down to zero, some data-hungry governments could argue they needed to keep the biometric surveillance systems in place because they fear a second wave of coronavirus, or because there is a new Ebola strain evolving in central Africa, or because . . . you get the idea. A big battle has been raging in recent years over our privacy. The coronavirus crisis could be the battle’s tipping point. For when people are given a choice between privacy and health, they will usually choose health.
Yuval Noah Harari: the world after coronavirus (The Financial Times)
For me, just like Harari, the way that governments choose to deal with the pandemic shows their true colours.
The coronavirus epidemic is thus a major test of citizenship. In the days ahead, each one of us should choose to trust scientific data and healthcare experts over unfounded conspiracy theories and self-serving politicians. If we fail to make the right choice, we might find ourselves signing away our most precious freedoms, thinking that this is the only way to safeguard our health.
Yuval Noah Harari: the world after coronavirus (The Financial Times)
Ethics is the result of the human will
Sabelo Mhlambi is a computer scientist, researcher and Fellow at Harvard’s Berkman Klein Center for Internet & Society. He focuses on the ethical implications of technology in the developing world, particularly in Sub-Saharan Africa, and has written a great, concise essay on technological ethics in relation to the global north and south.
Ethics is not missing in technology, rather we are witnessing the ethics in technology – the ethics of the powerful. The ethics of individualism.
Mhlambi makes a number of important points, and I want to share three of them. First, he says that ethics is the result of human will, not algorithmic processes:
Ethics should not be left to algorithmic definitions and processes, ultimately ethics is a result of the human will. Technology won’t save us. The abdication of social and environmental responsibility by creators of technology should not be allowed to become the norm.
Second, technology is a driver of change in society, and, because technology is not neutral, we have individualism baked into the tools we use:
Ethics describes one’s relationship and responsibilities to others and the environment. Ethics is the protocol for human interaction, with each other and with the world. Different ethical systems may be described through this scale: Individualistic systems promote one’s self assertion through the limitation of one’s relationship and responsibilities to others and the environment. In contrast, a more communal ethics asserts the self through the encouragement of one’s relationship and responsibilities to the community and the environment.
This is, he says, a form of colonialism:
Technology designed and deployed beyond its ethical borders poses a threat to social stability in different regions with different ethical systems, norms and values. The imposition of a society’s beliefs on another is colonial. This relationship can be observed even amongst members of the South as the more economically developed nations extend their technology and influence into less developed nations, the East to Africa relationship being an example.
Third, over and above the individualism and colonialism, the technologies we use are unrepresentative because they do not take into account the lived experiences and views of marginalised groups:
In the development and funding of technology, marginalized groups are underrepresented. Their values and views are unaccounted for. In the software industry marginalized groups make a minority of the labor force and leadership roles. The digital divide continues to increase when technology is only accessible through the languages of the well developed nations.
It's an important essay, and one that I'll no doubt be returning to in the weeks and months to come.
Even while a thing is in the act of coming into existence, some part of it has already ceased to be

🤖 ‘Machines set loose to slaughter’: the dangerous rise of military AI
📏 Wittgenstein’s Ruler: When Our Opinions Speak More About Us Instead The Topic
🤨 Inside the strange new world of being a deepfake actor
🎡 Japanese Amusement Park Turns Ferris Wheel Into Wi-Fi Enabled Remote Workspace
Quotation-as-title from Marcus Aurelius. Image from top-linked post.
Forward momentum above all things
This page on a Brian Eno fan site was re-shared on Hacker News this week. It features text from an email from Eno himself, explaining why, although he's grateful that people want to discuss his work, he doesn't necessarily want to see it:
I think the reason I feel uncomfortable about such a thing is that it becomes a sort of weight on my shoulders. I start to feel an obligation to live up to something, instead of just following my nose wherever it wants to go at the moment. Of course success has many nice payoffs, but one of the disadvantages is that you start to be made to feel responsible for other people's feelings: what I'm always hearing are variations of "why don't you do more records like - (insert any album title)" or "why don't you do more work with - (insert any artist's name)?". I don't know why, these questions are unanswerable, why is it so bloody important to you, leave me alone....these are a few of my responses. But the most important reason is "If I'd followed your advice in the first place I'd never have got anywhere".
Eno goes on to explain that being constantly reminded of your 'exhaust', of what you've already done, isn't very conducive to future creative work:
I'm afraid to say that admirers can be a tremendous force for conservatism, for consolidation. Of course it's really wonderful to be acclaimed for things you've done - in fact it's the only serious reward, because it makes you think "it worked! I'm not isolated!" or something like that, and it makes you feel gratefully connected to your own culture. But on the other hand, there's a tremendously strong pressure to repeat yourself, to do more of that thing we all liked so much. I can't do that - I don't have the enthusiasm to push through projects that seem familiar to me ( - this isn't so much a question of artistic nobility or high ideals: I just get too bloody bored), but at the same time I do feel guilt for 'deserting my audience' by not doing the things they apparently wanted. I'd rather not feel this guilt, actually, so I avoid finding out about situations that could cause it.
Finally, Eno explains that, just like everyone else, there are days when he wonders where the creative spark comes from:
The problem is that people nearly always prefer what I was doing a few years earlier - this has always been true. The other problem is that so, often, do I! Discovering things is clumsy and sporadic, and the results don't at first compare well with the glossy and lauded works of the past. You have to keep reminding yourself that they went through that as well, otherwise they become frighteningly accomplished. That's another problem with being made to think about your own past - you forget its genesis and start to feel useless awe towards your earlier self: "How did I do it? Wherever did these ideas come from?". Now, the workaday everyday now, always looks relatively less glamorous than the rose-tinted then (except for those magic hours when your finger is right on the pulse, and those times only happen when you've abandoned the lifeline of your own history).
Being creative comes not from looking back, but looking forward. As the enigmatic Taylor, a character in the TV series Billions states in one episode, we should prize "forward momentum above all things".