What kind of world do we want? (or, why regulation matters)

I saw a thread on Mastodon recently, which included this image:

Three images with the title 'Space required to Transport 48 People'. Each image is the same, with cars backed up down a road. The caption for each image is 'Car', 'Electric Car' and 'Autonomous Car', respectively.

Someone else replied with a meme showing a series of images with the phrase "They feed us poison / so we buy their 'cures' / while they ban our medicine". The poison in this case being cars burning fossil fuels, the cures being electric and/or autonomous cars, and the medicine public transport.

There's a similar kind of thinking in the world of tech, with at least one interviewee in the documentary The Social Dilemma saying that people should be paid for their data. I've always been uneasy about this, so it's good to see the EFF come out strongly against it:

Let’s be clear: getting paid for your data—probably no more than a handful of dollars at most—isn’t going to fix what’s wrong with privacy today. Yes, a data dividend may sound at first blush like a way to get some extra money and stick it to tech companies. But that line of thinking is misguided, and falls apart quickly when applied to the reality of privacy today. In truth, the data dividend scheme hurts consumers, benefits companies, and frames privacy as a commodity rather than a right.

EFF strongly opposes data dividends and policies that lay the groundwork for people to think of the monetary value of their data rather than view it as a fundamental right. You wouldn’t place a price tag on your freedom to speak. We shouldn’t place one on our privacy, either.

Hayley Tsukayama, Why Getting Paid for Your Data Is a Bad Deal (EFF)

As the EFF points out, who would get to set the price of that data, anyway? Also, individual data is useful to companies, but so is data in aggregate. Is that covered by such plans?

Facebook makes around $7 per user, per quarter. Even if they gave you all of that, is that a fair exchange?

Those small checks in exchange for intimate details about you are not a fairer trade than we have now. The companies would still have nearly unlimited power to do what they want with your data. That would be a bargain for the companies, who could then wipe their hands of concerns about privacy. But it would leave users in the lurch.

All that adds up to a stark conclusion: if where we’ve been is any indication of where we’re going, there won’t be much benefit from a data dividend. What we really need is stronger privacy laws to protect how businesses process our data—which we can, and should do, as a separate and more protective measure.

Hayley Tsukayama, Why Getting Paid for Your Data Is a Bad Deal (EFF)

As the rest of the article goes on to explain, we're already in a world of 'pay for privacy' which is exacerbating the gulf between the haves and the have-nots. We need regulation and legislation to curb this before it gallops away from us.

A candour affected is a dagger concealed

Slowly-boiling frogs in Facebook's surveillance panopticon

I can't think of a worse company than Facebook to be creating an IRL surveillance panopticon. But, I have to say, it's entirely on-brand.

On Wednesday, the company announced a plan to map the entire world, beyond street view. The company is launching a set of glasses that contains cameras, microphones, and other sensors to build a constantly updating map of the world in an effort called Project Aria. That map will include the inside of buildings and homes and all the objects inside of them. It’s Google Street View, but for your entire life.

Dave Gershgorn, Facebook’s Project Aria Is Google Maps — For Your Entire Life (OneZero)

We're like slowly-boiling frogs with this stuff. Everything seems fine. Until it's not.

The company insists any faces and license plates captured by Aria glasses wearers will be anonymized. But that won’t protect the data from Facebook itself. Ostensibly, Facebook will possess a live map of your home, pictures of your loved ones, pictures of any sensitive documents or communications you might be looking at with the glasses on, passwords — literally your entire life. The employees and contractors who have agreed to wear the research glasses are already trusting the company with this data.

Dave Gershgorn, Facebook’s Project Aria Is Google Maps — For Your Entire Life (OneZero)

With Amazon cosying up to police departments in the US with its Ring cameras, we really are hurtling towards surveillance states in the West.

Who has access to see the data from this live 3D map, and what, precisely, constitutes private versus public data? And who makes that determination? Faces might be blurred, but people can be easily identified without their faces. What happens if law enforcement wants to subpoena a day’s worth of Facebook’s LiveMap? Might Facebook ever build a feature to try to, say, automatically detect domestic violence, and if so, what would it do if it detected it?

Dave Gershgorn, Facebook’s Project Aria Is Google Maps — For Your Entire Life (OneZero)

Judges already requisition Fitbit data to solve crimes. No matter what Facebook says its intentions are for Project Aria, this data will end up in the hands of law enforcement, too.



To pursue the unattainable is insanity, yet the thoughtless can never refrain from doing so

'Prepper' philosophy

This morning, I came across a long web page from 2016, presumably created as a reaction to everything that went down that year (little did we know!).

Ostensibly, it's about preparing for scenarios in life that are relatively likely. It's pretty epic. While I've converted it to PDF and printed all 68 pages out to read in more detail, there were some parts that jumped out at me, which I'll share here.

[T]he purpose of this guide is to combat the mindset of learned helplessness by promoting simple, level-headed, personal preparedness techniques that are easy to implement, don't cost much, and will probably help you cope with whatever life throws your way.

lcamtuf, Doomsday Prepping For Less Crazy Folk

Growing up, my mother was the kind of woman who always had extra tins in the cupboards 'just in case'. Recently, my wife has taken this to the next level, with documents containing details on our stash including best before dates, etc.

Effective preparedness can be simple, but it has to be rooted in an honest and systematic review of the risks you are likely to face. Plenty of excited newcomers begin by shopping for ballistic vests and night vision goggles; they would be better served by grabbing a fire extinguisher, some bottled water, and then putting the rest of their money in a rainy-day fund.

lcamtuf, Doomsday Prepping For Less Crazy Folk

I see this document, which goes into money, self-defence, hygiene, and even relationships with neighbours, as more of a philosophy of life.

Rational prepping is meant to give you confidence to go about your business, knowing that you are well-equipped to weather out adversities. But it should not be about convincing yourself that the collapse is just around the corner, and letting that thought consume and disrupt your life.

Stay positive: the world is probably not ending, and there is a good chance that it will be an even better place for our children than it is for us. But the universe is a harsh mistress, and there is only so much faith we should be putting in good fortune, in benevolent governments, or in the wonders of modern technology. So, always have a backup plan.

lcamtuf, Doomsday Prepping For Less Crazy Folk

Recommended reading 👍

(also check out the author's hyperinflation gallery)

Much will have more

Philosophical anxiety as a superpower

Anxiety is a funny thing. Some people are anxious over specific things, while others, like me, have a kind of general background anxiety. It's only recently that I've admitted that to myself.

Some might call this existential or philosophical anxiety and, to a greater or lesser extent, it's part of the human condition.

Humans are philosophising animals precisely because we are the anxious animal: not a creature of the present, but regretful about the past and fearful of the future. We philosophise to understand our past, to make our future more comprehensible... Philosophy is the path that we hope gets us there. Anxiety is our dogged, unpleasant and indispensable companion.

Samir Chopra, Anxiety isn’t a pathology. It drives us to push back the unknown (Psyche)

One of the things my therapist has been pushing me on recently is my tolerance for, and ability to sit with, uncertainty. We all want to know things for definite, but that's rarely possible.

We are anxious; we seek relief by enquiring, by asking questions, while not knowing the answers; greater or lesser anxieties might heave into view as a result. As we realise the dimensions of our ultimate concerns, we find our anxiety is irreducible, for our increasing bounties of knowledge – scientific, technical or conceptual – merely bring us greater burdens of uncertainty.

Samir Chopra, Anxiety isn’t a pathology. It drives us to push back the unknown (Psyche)

To be able to tolerate the philosophical anxiety of not knowing, then, is a form of superpower. It may not necessarily make us happy, but it does make us free.

Anxiety then, rather than being a pathology, is an essential human disposition that leads us to enquire into the great, unsolvable mysteries that confront us; to philosophise is to acknowledge a crucial and animating anxiety that drives enquiry onward. The philosophical temperament is a curious and melancholic one, aware of the incompleteness of human knowledge, and the incapacities that constrain our actions and resultant happiness.

Samir Chopra, Anxiety isn’t a pathology. It drives us to push back the unknown (Psyche)

Ultimately, it's OK to be anxious, as it makes us human and takes us beyond mere rationality to a deeper, more powerful understanding of who (and why) we are.

The most fundamental enquiry of all is into our selves; anxiety is the key to this sacred inner chamber, revealing which existential problematic – the ultimate concerns of death, meaning, isolation, freedom – we are most eager to resolve.

Samir Chopra, Anxiety isn’t a pathology. It drives us to push back the unknown (Psyche)

You can’t tech your way out of problems the tech didn’t create

The Electronic Frontier Foundation (EFF) is a US-based non-profit that exists to defend civil liberties in the digital world. They've been around for 30 years, and I support them financially on a monthly basis.

In this article by Corynne McSherry, EFF's Legal Director, she outlines the futility of attempts by 'Big Social' to do content moderation at scale:

[C]ontent moderation is a fundamentally broken system. It is inconsistent and confusing, and as layer upon layer of policy is added to a system that employs both human moderators and automated technologies, it is increasingly error-prone. Even well-meaning efforts to control misinformation inevitably end up silencing a range of dissenting voices and hindering the ability to challenge ingrained systems of oppression.

Corynne McSherry, Content Moderation and the U.S. Election: What to Ask, What to Demand (EFF)

Ultimately, these monolithic social networks have a problem around false positives. It's in their interests to be over-zealous, as they're increasingly under the watchful eye of regulators and governments.

We have been watching closely as Facebook, YouTube, and Twitter, while disclaiming any interest in being “the arbiters of truth,” have all adjusted their policies over the past several months to try to arbitrate lies—or at least flag them. And we’re worried, especially when we look abroad. Already this year, an attempt by Facebook to counter election misinformation targeting Tunisia, Togo, Côte d’Ivoire, and seven other African countries resulted in the accidental removal of accounts belonging to dozens of Tunisian journalists and activists, some of whom had used the platform during the country’s 2011 revolution. While some of those users’ accounts were restored, others—mostly belonging to artists—were not.

Corynne McSherry, Content Moderation and the U.S. Election: What to Ask, What to Demand (EFF)

McSherry's analysis is spot-on: it's the algorithms that are a problem here. Social networks employ these algorithms because of their size and structure, and because of the cost of human-based content moderation. After all, these are companies with shareholders.

Algorithms used by Facebook’s Newsfeed or Twitter’s timeline make decisions about which news items, ads, and user-generated content to promote and which to hide. That kind of curation can play an amplifying role for some types of incendiary content, despite the efforts of platforms like Facebook to tweak their algorithms to “disincentivize” or “downrank” it. Features designed to help people find content they’ll like can too easily funnel them into a rabbit hole of disinformation.

Corynne McSherry, Content Moderation and the U.S. Election: What to Ask, What to Demand (EFF)

She includes useful questions for social networks to answer about content moderation:

  • Is the approach narrowly tailored or a categorical ban?
  • Does it empower users?
  • Is it transparent?
  • Is the policy consistent with human rights principles?

But, ultimately...

You can’t tech your way out of problems the tech didn’t create. And even where content moderation has a role to play, history tells us to be wary. Content moderation at scale is impossible to do perfectly, and nearly impossible to do well, even under the most transparent, sensible, and fair conditions

Corynne McSherry, Content Moderation and the U.S. Election: What to Ask, What to Demand (EFF)

I'm so pleased that I don't use Facebook products, and that I only use Twitter these days as a place to publish links to my writing.

Instead, I'm much happier on the Fediverse, a place where if you don't like the content moderation approach of the instance you're on, you can take your digital knapsack and decide to call another place home. You can find me here (for now!).

Even those of a harsh and unyielding nature will endure gentle treatment: no creature is fierce and frightening if it is stroked

When people are free to do as they please, they usually imitate each other

Ethical living

Update: AI upscaled to larger resolution with more clarity

Mural which reads "You are personally responsible for becoming more ethical than the society you grew up in"

/via LinkedIn

Reafferent loops

In Peter Godfrey-Smith's book Other Minds, he cites work from 1950 by the German physiologists Erich von Holst and Horst Mittelstaedt.

They used the term afference to refer to everything you take in through the senses. Some of what comes in is due to the changes in the objects around you — that is exafference... — and some of what comes in is due to your own actions: that is reafference.

Peter Godfrey-Smith, Other Minds, p.154

Godfrey-Smith is talking about octopuses and other cephalopods, but I think what he's discussing is interesting from a digital note-taking point of view.

To write a note and read it is to create a reafferent loop. Rather than wanting to perceive only the things that are not due to you — finding the exafferent among the noise of the senses — you want what you read to be entirely due to your previous action. You want the contents of the note to be due to your acts rather than someone else's meddling, or the natural decay of the notepad. You want the loop between present action and future perception to be firm. This enables you to create a form of external memory — as was, almost certainly, the role of much early writing (which is full of records of goods and transactions), and perhaps also the role of some early pictures, though that is much less clear.

When a written message is directed at others, it's ordinary communication. When you write something for yourself to read, there's usually an essential role for time — the goal is memory, in a broad sense. But memory like this is a communicative phenomenon; it is communication between your present self and a future self. Diaries and notes-to-self are embedded in a sender/receiver system just like more standard forms of communication.

Peter Godfrey-Smith, Other Minds, p.154-155

Some people talk about digital note-taking as a form of 'second brain'. Given the type of distributed cognition that Godfrey-Smith highlights in Other Minds, it would appear that, by creating reafferent loops, that's exactly what's happening.

Very interesting.

Hiring is broken, but not in the ways you assume

Hacker News is a link aggregator for people who work in tech. There's a lot of very technical information on there, but also stuff interesting to the curious mind more generally.

As so many people visit the site every day, it can be very influential, especially given the threaded discussion about shared links.

There can be a bit of a 'hive mind' sometimes, with certain things being sacred cows or implicit assumptions held by those who post (and lurk) there.

In this blog post focusing on hiring practices, there's a critique of four 'myths' that seem to be prevalent in Hacker News discussions. Some of it is almost exclusively focused on tech roles in Silicon Valley, but I wanted to pull out this nugget, which outlines what is really wrong with hiring:

Diversity. We really, really suck at diversity. We’re getting better, but we have a long way to go. Most of the industry chases the same candidates and assesses them in the same way.

Generally unfair practices. In cases where companies have power and candidates don’t, things can get really unfair. Lack of diversity is just one side-effect of this, others include poor candidate experiences, unfair compensation, and many others.

Short-termism. Recruiters and hiring managers that just want to fill a role at any cost, without thinking about whether there really is a fit or not. Many recruiters work on contingency, and most of them suck. The really good ones are awesome, but most of the well is poison. Hiring managers can be the same, too, when they’re under pressure to hire.

General ineptitude. Sometimes companies don’t know what they’re looking for, or are not internally aligned on it. Sometimes they just have broken processes, where they can’t keep track of who they’re talking to and what stage they’re at. Sometimes the engineers doing the interviews couldn’t care two shits about the interview or the company they work at. And often, companies are just tremendously indecisive, which makes them really slow to decide, or to just reject candidates because they can’t make up their minds.

Ozzie, 4 Hiring Myths Common in HackerNews Discussions

I've hired people and, even with the latest talent management workflow software, it's not easy. It sucks up your time, and anything/everything you do can and will be criticised.

But that doesn't mean that we can't strive to make the whole process better, more equitable, and more enjoyable for all involved.

If you have been put in your place long enough, you begin to act like the place

Why we can't have nice things

There's a phrase, mostly used by Americans, in relation to something bad happening: "this is why we can't have nice things".

I'd suggest that the reason things go south is usually that people don't care enough to fix, maintain, or otherwise look after them. That goes for everything from your garden to a giant wiki-based encyclopedia that is used as the go-to place to check facts online.

The challenge for Wikipedia in 2020 is to maintain its status as one of the last objective places on the internet, and emerge from the insanity of a pandemic and a polarizing election without being twisted into yet another tool for misinformation. Or, to put it bluntly, Wikipedia must not end up like the great, negligent social networks who barely resist as their platforms are put to nefarious uses.

Noam Cohen, Wikipedia's Plan to Resist Election Day Misinformation (WIRED)

Wikipedia's approach is based on an evolving process, one that is the opposite of "move fast and break things".

Moving slowly has been a Wikipedia super-power. By boringly adhering to rules of fairness and sourcing, and often slowly deliberating over knotty questions of accuracy and fairness, the resource has become less interesting to those bent on campaigns of misinformation with immediate payoffs.

Noam Cohen, Wikipedia's Plan to Resist Election Day Misinformation (WIRED)

I'm in danger of sounding old, and even worse, old-fashioned, but everything isn't about entertainment. Someone or something has to be the keeper of the flame.

Being a stickler for accuracy is a drag. It requires making enemies and pushing aside people or institutions who don’t act in good faith.

Noam Cohen, Wikipedia's Plan to Resist Election Day Misinformation (WIRED)

Collaboration is our default operating system

One of the reasons I'm not active on Twitter any more is the endless, pointless arguments between progressives and traditionalists, between those on the left of politics and those on the right, and between those who think that watching reality TV is an acceptable thing to spend your life doing, and those who don't.

Interestingly, a new report, which draws on data from 10,000 people, focus groups, and academic interviews, suggests that half of the controversy on Twitter is generated by a small proportion of users:

[The report] states that 12% of voters accounted for 50% of all social-media and Twitter users – and are six times as active on social media as are other sections of the population. The two “tribes” most oriented towards politics, labelled “progressive activists” and “backbone Conservatives”, were least likely to agree with the need for compromise. However, two-thirds of respondents who identify with either the centre, centre-left or centre-right strongly prefer compromise over conflict, by a margin of three to one.

Michael Savage, ‘Culture wars’ are fought by tiny minority – UK study (The Observer)

Interestingly, the report also shows differences not only between the US and UK, but also between attitudes before and after the pandemic started:

The research also suggested that the Covid-19 crisis had prompted an outburst of social solidarity. In February, 70% of voters agreed that “it’s everyone for themselves”, with 30% agreeing that “we look after each other”. By September, the proportion who opted for “we look after each other” had increased to 54%.

More than half (57%) reported an increased awareness of the living conditions of others, 77% feel that the pandemic has reminded us of our common humanity, and 62% feel they have the ability to change things around them – an increase of 15 points since February.

Michael Savage, ‘Culture wars’ are fought by tiny minority – UK study (The Observer)

As I keep on saying, those who believe in unfettered capitalism have to perpetuate a false narrative of competition in all things to justify their position. We have more things in common than differences, and I truly believe that collaboration is our default operating system.

Everything intercepts us from ourselves

Fighting health disinformation on Wikipedia

This is great to see:

As part of efforts to stop the spread of false information about the coronavirus pandemic, Wikipedia and the World Health Organization announced a collaboration on Thursday: The health agency will grant the online encyclopedia free use of its published information, graphics and videos.

Donald G. McNeil Jr., Wikipedia and W.H.O. Join to Combat Covid Misinformation (The New York Times)

Compared to Twitter's dismal efforts at fighting disinformation, the collaboration is welcome news.

The first W.H.O. items used under the agreement are its “Mythbusters” infographics, which debunk more than two dozen false notions about Covid-19. Future additions could include, for example, treatment guidelines for doctors, said Ryan Merkley, chief of staff at the Wikimedia Foundation, which produces Wikipedia.

Donald G. McNeil Jr., Wikipedia and W.H.O. Join to Combat Covid Misinformation (The New York Times)

More proof that the for-profit private sector is in no way more 'innovative' or effective than non-profits, NGOs, and government agencies.

Seeing through is rarely seeing into

Perceptions of the past

The History teacher in me likes this simple photo quiz site that shows how your perception of the past can easily be manipulated by how photographs are presented.