Friday fumings

My bet is that you’ve spent most of this week reading news about the global pandemic. Me too. That’s why I decided to ensure it’s not mentioned at all in this week’s link roundup!

Let me know what resonates with you… 😷


Finding comfort in the chaos: How Cory Doctorow learned to write from literally anywhere

My writing epiphany — which arrived decades into my writing career — was that even though there were days when the writing felt unbearably awful, and some when it felt like I was mainlining some kind of powdered genius and sweating it out through my fingertips, there was no relation between the way I felt about the words I was writing and their objective quality, assessed in the cold light of day at a safe distance from the day I wrote them. The biggest predictor of how I felt about my writing was how I felt about me. If I was stressed, underslept, insecure, sad, hungry or hungover, my writing felt terrible. If I was brimming over with joy, the writing felt brilliant.

Cory Doctorow (CBC)

Such great advice in here from the prolific Cory Doctorow. Not only is he a great writer, he’s a great speaker, too. I think both come from practice and clarity of thought.


Slower News

Trends, micro-trends & edge cases.

This is a site that specialises in important and interesting news that is updated regularly, but not on an hour-by-hour (or even daily) basis. A wonderful antidote to staring at your social media feed for updates!


SCARF: The 5 key ingredients for psychological safety in your team

There’s actually a mountain of compelling evidence that the single most important ingredient for healthy, high-performing teams is simple: it’s trust. When Google famously crunched the data on hundreds of high-performing teams, they were surprised to find that one variable mattered more than any other: “emotional safety.” Also known as: “psychological security.” Also known as: trust.

Matt Thompson

I used to work with Matt at Mozilla, and he’s a pretty great person to work alongside. He’s got a book coming out this year, and Laura (another former Mozilla colleague, but also a current co-op colleague!) drew my attention to this.


I Illustrated National Parks In America Based On Their Worst Review And I Hope They Will Make You Laugh (16 Pics)

I’m an illustrator and I have always had a personal goal to draw all 62 US National Parks, but I wanted to find a unique twist for the project. When I found that there are one-star reviews for every single park, the idea for Subpar Parks was born. For each park, I hand-letter a line from the one-star reviews alongside my illustration of each park as my way of putting a fun and beautiful twist on the negativity.

Amber Share (Bored Panda)

I love this, especially as the illustrations are so beautiful and the comments so banal.


What Does a Screen Do?

We know, for instance, that smartphone use is associated with depression in teens. Smartphone use certainly could be the culprit, but it’s also possible the story is more complicated; perhaps the causal relationship works the other way around, and depression drives teenagers to spend more time on their devices. Or, perhaps other details about their life—say, their family background or level of physical activity—affect both their mental health and their screen time. In short: Human behavior is messy, and measuring that behavior is even messier.

Jane C. Hu (Slate)

This, via Ian O’Byrne, is a useful read for anyone who deals with kids, especially teenagers.


13 reads to save for later: An open organization roundup

For months, writers have been showering us with multiple, ongoing series of articles, all focused on different dimensions of open organizational theory and practice. That’s led to a real embarrassment of riches—so many great pieces, so little time to catch them all.

So let’s take a moment to reflect. If you missed one (or several), now’s your chance to catch up.

Bryan Behrenshausen (Opensource.com)

I’ve already shared some of the articles in this roundup, but I encourage you to check out the rest, and subscribe to opensource.com. It’s a great source of information and guidance.


It Doesn’t Matter If Anyone Exists or Not

Capitalism has always transformed people into latent resources, whether as labor to exploit for making products or as consumers to devour those products. But now, online services make ordinary people enact both roles: Twitter or Instagram followers for conversion into scrap income for an influencer side hustle; Facebook likes transformed into News Feed-delivery refinements; Tinder swipes that avoid the nuisance of the casual encounters that previously fueled urban delight. Every profile pic becomes a passerby—no need for an encounter, even.

Ian Bogost (The Atlantic)

An amazing piece of writing, in which Ian Bogost not only surveys our previous experiences with ‘strangers’ but also applies that thinking to the internet. As he points out, there is a huge convenience factor in not knowing who made your sandwich. I’ve pointed out before that capitalism is all about scale, and at the end of the day, caring doesn’t scale, and scaling doesn’t care.


You don’t want quality time, you want garbage time

We desire quality moments and to make quality memories. It’s tempting to think that we can create quality time just by designating it so, such as via a vacation. That generally ends up backfiring due to our raised expectations being let down by reality. If we expect that our vacation is going to be perfect, any single mistake ruins the experience.

In contrast, you are likely to get a positive surprise when you have low expectations, which is likely the case during a “normal day”. It’s hard to match perfection, and easy to beat normal. Because of this, it’s more likely quality moments come out of chance.

If you can’t engineer quality time, and it’s more a matter of random events, it follows that you want to increase how often such events happen. You can’t increase the probability, but you can increase the duration for such events to occur. Put another way, you want to increase quantity of time, and not engineer quality time.

Leon Lin (Avoid boring people)

There are a lot of other interesting-but-irrelevant things in this newsletter, so scroll to the bottom for the juicy bit. I’ve quoted the most pertinent point, which I definitely agree with. There’s wisdom in Gramsci’s quotation about having “pessimism of the intellect, optimism of the will”.


The Prodigal Techbro

The prodigal tech bro doesn’t want structural change. He is reassurance, not revolution. He’s invested in the status quo, if we can only restore the founders’ purity of intent. Sure, we got some things wrong, he says, but that’s because we were over-optimistic / moved too fast / have a growth mindset. Just put the engineers back in charge / refocus on the original mission / get marketing out of the c-suite. Government “needs to step up”, but just enough to level the playing field / tweak the incentives. Because the prodigal techbro is a moderate, centrist, regular guy. Dammit, he’s a Democrat. Those others who said years ago what he’s telling you right now? They’re troublemakers, disgruntled outsiders obsessed with scandal and grievance. He gets why you ignored them. Hey, he did, too. He knows you want to fix this stuff. But it’s complicated. It needs nuance. He knows you’ll listen to him. Dude, he’s just like you…

Maria Farrell (The Conversationalist)

Now that we’re experiencing something of a ‘techlash’, it’s unsurprising that those who created surveillance capitalism have had a ‘road to Damascus’ experience. That doesn’t mean, as Maria Farrell points out, that we should all of a sudden consider them to be moral authorities.


Technology is the name we give to stuff that doesn’t work properly yet

So said my namesake Douglas Adams. In fact, he said lots of wise things about technology, most of them too long to serve as a title.

I’m in a weird place, emotionally, at the moment, but sometimes this can be a good thing. Being taken out of your usual ‘autopilot’ can be a useful way to see things differently. So I’m going to take this opportunity to share three things that, to be honest, make me a bit concerned about the next few years…

Attempts to put microphones everywhere

Alexa-enabled EVERYTHING

In an article for Slate, Shannon Palus ranks all of Amazon’s new products by ‘creepiness’. The Echo Frames are, in her words:

A microphone that stays on your person all day and doesn’t look like anything resembling a microphone, nor follows any established social codes for wearable microphones? How is anyone around you supposed to have any idea that you are wearing a microphone?

Shannon Palus

When we’re not talking about weapons of mass destruction, it’s not the tech that concerns me, but the context in which the tech is used. As Palus points out, how are you going to be able to have a ‘quiet word’ with anyone wearing glasses ever again?

It’s not just Amazon, of course. Google and Facebook are at it, too.

Full-body deepfakes

Scary stuff

With the exception, perhaps, of populist politicians, I don’t think we’re ready for a post-truth society. Check out the video above, which shows Chinese technology that allows for ‘full body deepfakes’.

The video is embedded, along with a couple of others, in an article for Fast Company by DJ Pangburn, who also notes that AI is learning human body movements from videos. Not only will you be able to prank your friends by showing them a convincing video of your ability to do 100 pull-ups, but the fake news it engenders will mean we can’t trust anything any more.

Neuromarketing

If you clicked on the ‘super-secret link’ in Sunday’s newsletter, you will have come across STEALING UR FEELINGS, which is nothing short of incredible. As powerful as it is in showing you the kind of data that organisations have on us, it’s only the tip of the iceberg.

Kaveh Waddell, in an article for Axios, explains that brains are the last frontier for privacy:

“The sort of future we’re looking ahead toward is a world where our neural data — which we don’t even have access to — could be used” against us, says Tim Brown, a researcher at the University of Washington Center for Neurotechnology.

Kaveh Waddell

This would lead to ‘neuromarketing’, with advertisers knowing what triggers and influences you better than you know yourself. It will also, no doubt, be used for discriminatory purposes and, because the data comes directly from your brainwaves, there’s not much you can do about it short of literally wearing a tinfoil hat.


So there we are. Am I being too fearful here?

Is Google becoming more like Facebook?

I’m composing this post on ChromeOS, which is a little bit hypocritical, but yesterday I was shocked to discover how much data I was ‘accidentally’ sharing with Google. Check it out for yourself by going to your Google account’s activity controls page.

This article from Slate talks about how Google have become less trustworthy of late:

[Google] announced a forthcoming update last Wednesday: Chrome’s auto-sign-in feature will still be the default behavior of Chrome. But you’ll be able to turn it off through an optional switch buried in Chrome’s settings.

This pattern of behavior by tech companies is so routine that we take it for granted. Let’s call it “pulling a Facebook” in honor of the many times that Facebook has “accidentally” relaxed the privacy settings for user profile data, and then—following a bout of bad press coverage—apologized and quietly reversed course. A key feature of these episodes is that management rarely takes the blame: It’s usually laid at the feet of some anonymous engineer moving fast and breaking things. Maybe it’s just a coincidence that these changes consistently err in the direction of increasing “user engagement” and never make your experience more private.

What’s new here, and is a very recent development indeed, is that we’re finally starting to see that this approach has costs. For example, it now seems like Facebook executives spend an awful lot of time answering questions in front of Congress. In 2017, when Facebook announced it had handed more than 80 million user profiles to the sketchy election strategy firm Cambridge Analytica, Facebook received surprisingly little sympathy and a notable stock drop. Losing the trust of your users, we’re learning, does not immediately make them flee your business. But it does matter. It’s just that the consequences are cumulative, like spending too much time in the sun.

I’m certainly questioning my tech choices. And I’ve (re-)locked down my Google account.

Source: Slate