Tag: Harold Jarche

To pursue the unattainable is insanity, yet the thoughtless can never refrain from doing so

Two people talking to one another

💬 The Surprising Power of Simply Asking Coworkers How They’re Doing

🤔 Facebook Maybe Not Singlehandedly Undermining Democracy With Political Content, Says Facebook

🐑 What is the Zollman effect?

👂 Unnervingly good entry in the “what languages sound like to non-speakers” genre

⚔️ Could a Peasant defeat a Knight in Battle?

Quotation-as-title from Marcus Aurelius. Image from top-linked post.

The importance of co-operation

Quoting Stephen Downes in the introduction to his post, Harold Jarche goes on to explain:

Managing in complex adaptive systems means influencing possibilities rather than striving for predictability (good or best practices). Cooperation in our work is needed so that we can continuously develop emergent practices demanded by this complexity. What worked yesterday won’t work today. No one has the definitive answer any more, but we can use the intelligence of our networks to make sense together and see how we can influence desired results. This is cooperation and this is the future, which is already here, albeit unevenly distributed.

Harold Jarche, revisiting cooperation

It’s all very well having streamlined workflows, but that’s the way to get automated out of a job.

Software ate the world, so all the world’s problems get expressed in software

Benedict Evans recently posted his annual ‘macro trends’ slide deck. It’s incredibly insightful, and a work of (minimalist) art. This article’s title comes from his conclusion, and you can see below which of the 128 slides jumped out at me from the deck:

For me, what the deck as a whole does is place some of the issues I’ve been thinking about in a wider context.

My team is building a federated social network for educators, so I’m particularly tuned-in to conversations about the effect social media is having on society. A post by Harold Jarche where he writes about his experience of Twitter as a rage machine caught my attention, especially the part where he talks about how people are happy to comment based on the ‘preview’ presented to them in embedded tweets:

Research on the self-perception of knowledge shows how viewing previews without going to the original article gives an inflated sense of understanding on the subject, “audiences who only read article previews are overly confident in their knowledge, especially individuals who are motivated to experience strong emotions and, thus, tend to form strong opinions.” Social media have created a worldwide Dunning-Kruger effect. Our collective self-perception of knowledge acquired through social media is greater than it actually is.

Harold Jarche

I think our experiment with general-purpose social networks is slowly coming to an end, or at least will do over the next decade. What I mean is that, while we’ll still have places where you can broadcast anything to anyone, the digital environments where we’ll spend more time will be what Venkatesh Rao calls the ‘cozyweb’:

Unlike the main public internet, which runs on the (human) protocol of “users” clicking on links on public pages/apps maintained by “publishers”, the cozyweb works on the (human) protocol of everybody cutting-and-pasting bits of text, images, URLs, and screenshots across live streams. Much of this content is poorly addressable, poorly searchable, and very vulnerable to bitrot. It lives in a high-gatekeeping slum-like space comprising slacks, messaging apps, private groups, storage services like dropbox, and of course, email.

Venkatesh Rao

That’s on a personal level. I should imagine organisational spaces will be a bit more organised. Back to Jarche:

We need safe communities to take time for reflection, consideration, and testing out ideas without getting harassed. Professional social networks and communities of practice help us make sense of the world outside the workplace. They also enable each of us to bring to bear much more knowledge and insight than we could do on our own.

Harold Jarche

…or to use Rao’s diagram which is so-awful-it’s-useful:

Image by Venkatesh Rao

Of course, blockchain/crypto could come along and solve all of our problems. Except it won’t. Humans are humans (are humans).

Ever since Eli Pariser’s TED talk urging us to beware online “filter bubbles”, people have been wringing their hands about ensuring we have ‘balance’ in our networks.

Interestingly, some recent research by the Reuters Institute at Oxford University paints a slightly different picture. The researcher, Dr Richard Fletcher, begins by investigating how people access the news.

Preferred access to news
Diagram via the Reuters Institute, Oxford University

Fletcher draws a distinction between different types of personalisation:

Self-selected personalisation refers to the personalisations that we voluntarily do to ourselves, and this is particularly important when it comes to news use. People have always made decisions in order to personalise their news use. They make decisions about what newspapers to buy, what TV channels to watch, and at the same time which ones they would avoid

Academics call this selective exposure. We know that it’s influenced by a range of different things such as people’s interest levels in news, their political beliefs and so on. This is something that has pretty much always been true.

Pre-selected personalisation is the personalisation that is done to people, sometimes by algorithms, sometimes without their knowledge. And this relates directly to the idea of filter bubbles because algorithms are possibly making choices on behalf of people and they may not be aware of it.

The reason this distinction is particularly important is because we should avoid comparing pre-selected personalisation and its effects with a world where people do not do any kind of personalisation to themselves. We can’t assume that offline, or when people are self-selecting news online, they’re doing it in a completely random way. People are always engaging in personalisation to some extent and if we want to understand the extent of pre-selected personalisation, we have to compare it with the realistic alternative, not hypothetical ideals.

Dr Richard Fletcher

Read the article for the details, but the takeaways for me were twofold. First, that we might be blaming social media for wider and deeper divisions within society, and second, that teaching people to search for information (rather than stumble across it via feeds) might be the best strategy:

People who use search engines for news on average use more news sources than people who don’t. More importantly, they’re more likely to use sources from both the left and the right. 
People who rely mainly on self-selection tend to have fairly imbalanced news diets. They either have more right-leaning or more left-leaning sources. People who use search engines tend to have a more even split between the two.

Dr Richard Fletcher

Useful as it is, what I think this research misses out is the ‘black box’ algorithms that seek to keep people engaged and consuming content. YouTube is the poster child for this. As Jarche comments:

We are left in a state of constant doubt as conspiratorial content becomes easier to access on platforms like YouTube than accessing solid scientific information in a journal, much of which is behind a pay-wall and inaccessible to the general public.

Harold Jarche

This isn’t an easy problem to solve.

We might like to pretend that human beings are rational agents, but this isn’t actually true. Let’s take something like climate change. We’re not arguing about the facts here, we’re arguing about politics. As Adrian Bardon writes in Fast Company:

In theory, resolving factual disputes should be relatively easy: Just present evidence of a strong expert consensus. This approach succeeds most of the time, when the issue is, say, the atomic weight of hydrogen.

But things don’t work that way when the scientific consensus presents a picture that threatens someone’s ideological worldview. In practice, it turns out that one’s political, religious, or ethnic identity quite effectively predicts one’s willingness to accept expertise on any given politicized issue.

Adrian Bardon

This is pretty obvious when we stop to think about it for a moment; beliefs are bound up with identity, and that’s not something that’s so easy to change.

In ideologically charged situations, one’s prejudices end up affecting one’s factual beliefs. Insofar as you define yourself in terms of your cultural affiliations, information that threatens your belief system—say, information about the negative effects of industrial production on the environment—can threaten your sense of identity itself. If it’s part of your ideological community’s worldview that unnatural things are unhealthful, factual information about a scientific consensus on vaccine or GM food safety feels like a personal attack.

Adrian Bardon

So how do we change people’s minds when they’re objectively wrong? Brian Resnick, writing for Vox, suggests the best approach might be ‘deep canvassing’:

Giving grace. Listening to a political opponent’s concerns. Finding common humanity. In 2020, these seem like radical propositions. But when it comes to changing minds, they work.


The new research shows that if you want to change someone’s mind, you need to have patience with them, ask them to reflect on their life, and listen. It’s not about calling people out or labeling them fill-in-the-blank-phobic. Which makes it feel like a big departure from a lot of the current political dialogue.

Brian Resnick

This approach, it seems, works:

Diagram by Stanford University, via Vox

So it seems there is some hope of fixing the world’s problems. It’s just that the solutions point towards doing the hard work of talking to people, not just treating them as containers for opinions to shoot down at a distance.

Enjoy this? Sign up for the weekly roundup and/or become a supporter!

How you do anything is how you do everything

So said Derek Sivers, although I suspect that, originally, it’s probably a core principle of Zen Buddhism. In this article I want to talk about management and leadership. But also about emotional intelligence and integrity.

I currently spend part of my working life as a Product Manager. At some organisations, this means that you’re in charge of the budget, and pull in colleagues from different disciplines. For example, a designer you’re working with on a particular project might report to the Head of UX. Matrix-style management and internal budgeting keep track of everything.

This approach can get complicated so, at other companies (like the one I’m working with), the Product Manager manages both people and product. It’s a lot of work, as both can be complicated.

I think I’m OK at managing people, and other people say I’m good at it, but it’s not my favourite thing in the world to do.

That’s why, when hiring, I try to do so in one of three ways. Ideally, I want to hire people whom at least one member of the existing team has already worked with and can vouch for. If that doesn’t work, then I’m looking for people vouched for by the networks of which the team are part. Failing that, I’m trying to find people who don’t wait for direction, but know how to get on with things that need doing.

It’s an approach I’ve developed from the work of Laura Thomson. She’s a former colleague at Mozilla, and an advocate of a chaordic style of management and self-organising ducks:

Instead of having ‘all your ducks in a row’ the analogy in chaordic management is to have ‘self-organising ducks’. The idea is to give people enough autonomy, knowledge and skill to be able to do the management themselves.

As I’ve said before, the default way of organising human beings is hierarchy. That doesn’t mean it’s the best way. Hierarchy tends to lean on processes, paperwork and meetings to ‘get things done’, but even a cursory glance at Open Source projects shows that all of this isn’t strictly necessary.

Last week, a new-ish member of the team said that I can be “too nice”. I’m still processing that and digging into what they meant, but I then ended up reading an article by Roddy Millar for Fast Company entitled Here’s why being likable may make you a less effective leader.

It’s a slightly oddly-framed article that quotes Prof. Karen Cates from Northwestern’s Kellogg School of Management:

Leaders should not put likability above effectiveness. There are times when the humor and smiles need to go and a let’s-get-this-done approach is required. Cates goes further: “Even the ‘nasty boss approach’ can be really effective—but in short, small doses—to get everyone’s attention and say ‘Hey, we’ve got to make some changes around here.’ You can then create—with an earnest approach—that more likable persona as you move forward. Likability is a good thing to have in your leadership toolkit, but it shouldn’t be the biggest hammer in the box.”

Roddy Millar

I think there’s a difference between ‘trying to be likeable’ and ‘treating your colleagues with dignity and respect’.

If you’re being nice just to be liked by your team, you’re probably doing it wrong. It’s a bit like, back when I was teaching, those teachers who wanted to be liked by the kids they taught.

The other approach is to simply treat the people around you with dignity and respect, realising that all of human life involves suffering, so let’s not add to the burden through our everyday working lives.

If you want to build a ship, don’t drum up the men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea.

Antoine de Saint-Exupéry

The above is one of my favourite quotations. We don’t need to crack the whip or wield some kind of totem of hierarchical power over other people. We just need to ensure people are in the right place (physically and emotionally), with the right things (tools, skills, and information) to get things done.

In managers are for caring, Harold Jarche points a finger at hierarchical organisations, stating that they are “what we get when we use the blunt stick of economic consequences with financial quid pro quo as the prime motivator”.

Jarche wonders what would happen if, instead, they were structured more like communities of practice:

What would an organization look like with looser hierarchies and stronger networks? A lot more human, retrieving some of the intimacy and cooperation of tribal groups. We already have other ways of organizing work. Orchestras are not teams, and neither are jazz ensembles. There may be teamwork on a theatre production but the cast is not a team. It is more like a community of practice, with strong and weak social ties.

Harold Jarche

I think part of the problem, to be honest, is emotional intelligence, or rather the lack of it, in many organisations.

Unfortunately, the way to earn more money in organisations is to start managing people. Which is fine for the subset of people who have the skills to be able to handle this. For others, it’s a frustrating experience that takes them away from doing the work.

For TED Ideas, organisational psychologist Tomas Chamorro-Premuzic asks Why do so many incompetent men become leaders? And what can we do about it? He lists three reasons why we have so many incompetent (male) leaders:

  1. Our inability to distinguish between confidence and competence
  2. Our love of charismatic individuals
  3. The allure of “people with grandiose visions that tap into our own narcissism”

He suggests three ways to fix this. The other two are all well and good, but I want to focus on the first solution he suggests:

The first solution is to follow the signs and look for the qualities that actually make people better leaders. There is a pathological mismatch between the attributes that seduce us in a leader and those that are needed to be an effective leader. If we want to improve the performance of our leaders, we should focus on the right traits. Instead of falling for people who are confident, narcissistic and charismatic, we should promote people because of competence, humility and integrity. Incidentally, this would also lead to a higher proportion of female than male leaders — large-scale scientific studies show that women score higher than men on measures of competence, humility and integrity. But the point is that we would significantly improve the quality of our leaders.

Tomas Chamorro-Premuzic

The best leaders I’ve worked for exhibited high levels of emotional intelligence. Most of them were women.

Developing emotional intelligence is difficult and goodness knows I’m no expert. What I think we perhaps need to do is remove our corporate dependency on hierarchy. In hierarchies, emotion and trust are removed as impediments to action.

However, in my experience, hierarchy is inherently patriarchal and competitive. It’s not something that’s necessarily useful in every industry in the 21st century. And hierarchies are not places that I, and people like me, particularly thrive.

Instead, I think we require trust-based ways of organising — ways that emphasise human relationships. I think these are also more conducive to human flourishing.

Right now, approaches such as sociocracy take a while to get our collective heads around, as they run counter to our “default operating system” of hierarchy. However, over time I think we’ll see versions of this becoming the norm, as it becomes ever easier to co-ordinate people at a distance.

To sum up, what it means to be an effective leader is changing. Returning to the article cited above by Harold Jarche, he writes:

Hierarchical teams are what we get when we use the blunt stick of economic consequences with financial quid pro quo as the prime motivator. In a creative economy, the unity of hierarchical teams is counter-productive, as it shuts off opportunities for serendipity and innovation. In a complex and networked economy workers need more autonomy and managers should have less control.

Harold Jarche

Many people no longer live in a world of the ‘permanent job’ and ‘career ladder’. What counts as success for them is not necessarily a steadily-increasing paycheck, but measures such as social justice or ‘making a dent in the universe’. This is where hierarchy fails, and where emergent, emotionally-intelligent leaders with teams of self-organising ducks thrive.

Friday fawnings

On this week’s rollercoaster journey, I came across these nuggets:

  • Renata Ávila: “The Internet of creation disappeared. Now we have the Internet of surveillance and control” (CCCB Lab) — “This lawyer and activist talks with a global perspective about the movements that the power of “digital colonialism” is weaving. Her arguments are essential for preventing ourselves from being crushed by the technological world, from being carried away by the current of ephemeral divertemento. For being fully aware that, as individuals, our battle is not lost, but that we can control the use of our data, refuse to give away our facial recognition or demand that the privacy laws that protect us are obeyed.”
  • Everything Is Private Equity Now (Bloomberg) — “The basic idea is a little like house flipping: Take over a company that’s relatively cheap and spruce it up to make it more attractive to other buyers so you can sell it at a profit in a few years. The target might be a struggling public company or a small private business that can be combined—or “rolled up”—with others in the same industry.”
  • Forget STEM, We Need MESH (Our Human Family) — “I would suggest a renewed focus on MESH education, which stands for Media Literacy, Ethics, Sociology, and History. Because if these are not given equal attention, we could end up with incredibly bright and technically proficient people who lack all capacity for democratic citizenship.”
  • Connecting the curious (Harold Jarche) — “If we want to change the world, be curious. If we want to make the world a better place, promote curiosity in all aspects of learning and work. There are still a good number of curious people of all ages working in creative spaces or building communities around common interests. We need to connect them.”
  • Twitter: No, really, we’re very sorry we sold your security info for a boatload of cash (The Register) — “The social networking giant on Tuesday admitted to an “error” that let advertisers have access to the private information customers had given Twitter in order to place additional security protections on their accounts.”
  • Digital tools interrupt workers 14 times a day (CIO Dive) — “The constant chime of digital workplace tools including email, instant messaging or collaboration software interrupts knowledge workers 13.9 times on an average day, according to a survey of 3,750 global workers from Workfront.”
  • Book review – Curriculum: Athena versus the Machine (TES) — “Despite the hope that the book is a cure for our educational malaise, Curriculum is a morbid symptom of the current political and intellectual climate in English education.”
  • Fight for the planet: Building an open platform and open culture at Greenpeace (Opensource.com) — “Being as open as we can, pushing the boundaries of what it means to work openly, doesn’t just impact our work. It impacts our identity.”
  • Psychodata (Code Acts in Education) — “Social-emotional learning sounds like a progressive, child-centred agenda, but behind the scenes it’s primarily concerned with new forms of child measurement.”

Image via xkcd

There is no exercise of the intellect which is not, in the final analysis, useless

A quotation from a short story from Jorge Luis Borges’ Labyrinths provides the title for today’s article. I want to dig into the work of danah boyd and the transcript of a talk she gave recently, entitled Agnotology and Epistemological Fragmentation. It helps us understand what’s going on behind the seemingly-benign fascias of social networks and news media outlets.

She explains the title of her talk:

Epistemology is the term that describes how we know what we know. Most people who think about knowledge think about the processes of obtaining it. Ignorance is often assumed to be not-yet-knowledgeable. But what if ignorance is strategically manufactured? What if the tools of knowledge production are perverted to enable ignorance? In 1995, Robert Proctor and Iain Boal coined the term “agnotology” to describe the strategic and purposeful production of ignorance. In an edited volume called Agnotology, Proctor and Londa Schiebinger collect essays detailing how agnotology is achieved. Whether we’re talking about the erasure of history or the undoing of scientific knowledge, agnotology is a tool of oppression by the powerful.

danah boyd

Having already questioned ‘media literacy’ the way it’s currently taught through educational institutions and libraries, boyd explains how the alt-right are streets ahead of educators when it comes to pushing their agenda:

One of the best ways to seed agnotology is to make sure that doubtful and conspiratorial content is easier to reach than scientific material. And then to make sure that what scientific information is available, is undermined. One tactic is to exploit “data voids.” These are areas within a search ecosystem where there’s no relevant data; those who want to manipulate media purposefully exploit these. Breaking news is one example of this.


Today’s drumbeat happens online. The goal is no longer just to go straight to the news media. It’s to first create a world of content and then to push the term through to the news media at the right time so that people search for that term and receive specific content. Terms like caravan, incel, crisis actor. By exploiting the data void, or the lack of viable information, media manipulators can help fragment knowledge and seed doubt.

danah boyd

Harold Jarche uses McLuhan’s tetrads to understand this visually, commenting: “This is an information war. Understanding this is the first step in fighting for democracy.”

Harold Jarche on Agnotology

We can teach children sitting in classrooms all day about checking URLs and the provenance of the source, but how relevant is that when they’re using YouTube as their primary search engine? Returning to danah boyd:

YouTube has great scientific videos about the value of vaccination, but countless anti-vaxxers have systematically trained YouTube to make sure that people who watch the Center for Disease Control and Prevention’s videos also watch videos asking questions about vaccinations or videos of parents who are talking emotionally about what they believe to be the result of vaccination. They comment on both of these videos, they watch them together, they link them together. This is the structural manipulation of media.

danah boyd

It’s not just the new and the novel. Even things that are relatively obvious to those of us who have grown up as adults online are confusing to older generations. As this article by BuzzFeed News reporter Craig Silverman points out, conspiracy-believing retirees have disproportionate influence on our democratic processes:

Older people are also more likely to vote and to be politically active in other ways, such as making political contributions. They are wealthier and therefore wield tremendous economic power and all of the influence that comes with it. With more and more older people going online, and future 65-plus generations already there, the online behavior of older people, as well as their rising power, is incredibly important — yet often ignored.

Craig Silverman

So when David Buckingham asks ‘Who needs digital literacy?’ I think the answer is everyone. Having been a fan of his earlier work, it saddens me to realise that he hasn’t kept up with the networked era:

These days, I find the notion of digital literacy much less useful – and to some extent, positively misleading. The fundamental problem is that the idea is defined by technology itself. It makes little sense to distinguish between texts (or media) on the grounds of whether they are analogue or digital: almost all media (including print media) involve the use of digital technology at some stage or other. Fake news and disinformation operate as much in old, analogue media (like newspapers) as they do online. Meanwhile, news organisations based in old media make extensive and increasing use of online platforms. The boundaries between digital and analogue may still be significant in some situations, but they are becoming ever more blurred.

David Buckingham

Actually, as Howard Rheingold pointed out a number of years ago in Net Smart, and as boyd has done in her own work, networks change everything. You can’t seriously compare pre-networked and post-networked cultures in any way other than in contrast.

Buckingham suggests that, seeing as the (UK) National Literacy Trust are on the case, we “don’t need to reinvent the wheel”. The trouble is that the wheel has already been reinvented, and lots of people either didn’t notice, or are acting as though it hasn’t been.

There’s a related article by Anna Mckie in the THE entitled Teaching intelligence: digital literacy in the ‘alternative facts’ era which, unfortunately, is now behind a paywall. It reports on a special issue of the journal Teaching in Higher Education where the editors have brought together papers on the contribution made by Higher Education to expertise and knowledge in the age of ‘alternative facts’:

[S]ocial media has changed the dynamic of information in our society, [editor] Professor Harrison added. “We’ve moved away from the idea of experts who assess information to one where the validity of a statement is based on the likes, retweets and shares it gets, rather than whether the information is valid.”

The first task of universities is to go back to basics and “help students to understand the difference between knowledge and information, and how knowledge is created, which is separate to how information is created”, Professor Harrison said. “Within [each] discipline, what are the skills needed to assess that?”

Many assume that schools or colleges are teaching this, but that is not the case, he added. “Academics should also be wary of the extent to which they themselves understand the new paradigms of knowledge creation,” Professor Harrison warned.

Anna McKie

One of the reasons I decided not to go into academia is that, certain notable exceptions aside, the focus is on explaining rather than changing. Or, to finish with another quotation, this time from Karl Marx, “Philosophers have hitherto only interpreted the world in various ways; the point is to change it.”


Human societies, hierarchy, and networks

Human societies and cultures are complex and messy. That means if we want to even begin to start understanding them, we need to simplify. This approach from Harold Jarche, based on David Ronfeldt’s work, is interesting:

Our current triform society is based on families/communities, a public sector, and a private market sector. But this form, dominated by Markets is unable to deal with the complexities we face globally — climate change, pollution, populism/fanaticism, nuclear war, etc. A quadriform society would be primarily guided by the Network form of organizing. We are making some advances in that area but we still have challenges getting beyond nation states and financial markets.

This diagram sums up why I find it so difficult to work within hierarchies: while they’re our default form of organising, they’re just not very good at dealing with complexity.

Source: Harold Jarche

Amazon Go, talent and labour

I’ll try and explain what Amazon Go is without sounding a note of incredulity and rolling my eyes. It’s a shop where shoppers submit to constant surveillance for the slim reward of not having to line up to pay. Instead, they enter the shop by identifying themselves via the Amazon app on their smartphone, and their shopping is then charged to their account.

Ben Thompson zooms out from this to think about the ‘game’ Amazon is playing here:

The economics of Amazon Go define the tech industry; the strategy, though, is uniquely Amazon’s. Most of all, the implications of Amazon Go explain both the challenges and opportunities faced by society broadly by the rise of tech.

He goes on to explain that Amazon really really likes fixed costs, which is what their new store provides. Yes, R&D is expensive, but then afterwards you can predict your costs, and concentrate on throughput:

Fixed costs, on the other hand, have no relation to revenue. In the case of convenience stores, rent is a fixed cost; 7-11 has to pay its lease whether it serves 100 customers or serves 1,000 in any given month. Certainly the more it serves the better: that means the store is achieving more “leverage” on its fixed costs.

In the case of Amazon Go specifically, all of those cameras and sensors and smartphone-reading gates are fixed costs as well — two types, in fact. The first is the actual cost of buying and installing the equipment; those costs, like rent, are incurred regardless of how much revenue the store ultimately produces.
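Thompson’s point about leverage is just arithmetic: a fixed cost like rent stays the same regardless of throughput, so the cost borne per customer falls as customer numbers rise. A minimal sketch of that calculation, using made-up numbers (the $10,000 rent figure is illustrative, not from the article):

```python
# Toy illustration of "leverage" on fixed costs: the same lease cost
# spread over more customers means a lower fixed cost per customer.
def fixed_cost_per_customer(fixed_cost: float, customers: int) -> float:
    """Fixed cost borne per customer at a given level of throughput."""
    return fixed_cost / customers

rent = 10_000  # hypothetical monthly lease cost, in dollars

# 7-11 pays the same rent whether it serves 100 or 1,000 customers,
# but the per-customer burden differs by 10x:
print(fixed_cost_per_customer(rent, 100))    # 100.0
print(fixed_cost_per_customer(rent, 1_000))  # 10.0
```

The same logic applies to Amazon Go’s cameras and sensors: once installed, every additional customer served makes the up-front hardware spend look cheaper per transaction.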

Just as Amazon built amazingly scalable server technology and then opened it out as a platform for others to build websites and apps upon, so Thompson sees Amazon Go as the first move in the long game of providing technology to other shops/brands.

In market after market the company is leveraging software to build horizontal businesses that benefit from network effects: in e-commerce, more buyers lead to more suppliers lead to more buyers. In cloud services, more tenants lead to great economies of scale, not just in terms of servers and data centers but in the leverage gained by adding ever more esoteric features that both meet market needs and create lock-in… [T]he point of buying Whole Foods was to jump start a similar dynamic in groceries.

Thompson is no socialist, so I had a little chuckle at his reference to Marx towards the end of the article:

The political dilemma embedded in this analysis is hardly new: Karl Marx was born 200 years ago. Technology like Amazon Go is the ultimate expression of capital: invest massive amounts of money up front in order to reap effectively free returns at scale. What has fundamentally changed, though, is the role of labour: Marx saw a world where capital subjugated labour for its own return; technologies like Amazon Go have increasingly no need for labor at all.

He does have a point, though, and reading Inventing the Future: Postcapitalism and a World Without Work convinced me that even ardent socialists should be advocating for full automation.

This is all related to points made about the changing nature of work by Harold Jarche in a new article he’s written:

As routine and procedural work gets automated, human work will be increasingly complex, requiring permanent skills for continuous learning and adaptation. Creativity and empathy will be more important than compliance and intelligence. This requires a rethinking of jobs, employment, and organizational management.

Some people worry that there won’t be enough jobs to go around. However, the problem isn’t employment; the problem is neoliberalism, late-stage capitalism, and the fact that 1% of people own more than 55% of the rest of the planet.

Sources: Stratechery and Harold Jarche
