
Wretched is a mind anxious about the future

So said one of my favourite non-fiction authors, the 16th-century proto-blogger Michel de Montaigne. There’s plenty of writing about why we should be anxious about the drift towards a future of surveillance states. But because it’s not affecting us here and now, we eventually become blasé. We forget that it’s already the lived experience for hundreds of millions of people.

Take China, for example. In The Atlantic, Derek Thompson writes about the Chinese government’s brutality against the Muslim Uyghur population in the western province of Xinjiang:

[The] horrifying situation is built on the scaffolding of mass surveillance. Cameras fill the marketplaces and intersections of the key city of Kashgar. Recording devices are placed in homes and even in bathrooms. Checkpoints that limit the movement of Muslims are often outfitted with facial-recognition devices to vacuum up the population’s biometric data. As China seeks to export its suite of surveillance tech around the world, Xinjiang is a kind of R&D incubator, with the local Muslim population serving as guinea pigs in a laboratory for the deprivation of human rights.

Derek Thompson

As Ian Welsh points out, surveillance states usually involve us in the West pointing towards places like China and shaking our heads. However, if you step back a moment and remember that societies like the US and UK are becoming more unequal over time, then perhaps we’re the ones who should be worried:

The endgame, as I’ve been pointing out for years, is a society in which where you are and what you’re doing, and have done is, always known, or at least knowable. And that information is known forever, so the moment someone with power wants to take you out, they can go back thru your life in minute detail. If laws or norms change so that what was OK 10 or 30 years ago isn’t OK now, well they can get you on that.

Ian Welsh

As the world becomes more unequal, the position of elites becomes more perilous, hence Silicon Valley billionaires preparing boltholes in New Zealand. Ironically, they’re looking for places where they can’t be found, while making serious money from providing surveillance technology. Instead of solving inequality, they attempt to insulate themselves from its effects.

A lot of the crazy amounts of money earned in Silicon Valley comes at the price of infringing our privacy. I’ve spent a long time thinking about this quite nebulous concept. It’s not the easiest thing to understand when you examine it more closely.

Privacy is usually considered a freedom from rather than a freedom to, as in “freedom from surveillance”. The trouble is that there are many kinds of surveillance, and some of these we actively encourage. A quick example: I know of at least one family that share their location with one another all of the time. At the same time, of course, they’re sharing it with the company that provides that service.

There’s a lot of power in the ‘default’ privacy settings that devices and applications ship with. People tend to go with whatever comes as standard. Sidney Fussell writes in The Atlantic that:

Many apps and products are initially set up to be public: Instagram accounts are open to everyone until you lock them… Even when companies announce convenient shortcuts for enhancing security, their products can never become truly private. Strangers may not be able to see your selfies, but you have no way to untether yourself from the larger ad-targeting ecosystem.

Sidney Fussell

Some of us (including me) are willing to trade some of that privacy for more personalised services that somehow make our lives easier. The tricky thing is when it comes to employers and state surveillance. In these cases there are coercive power relationships at play, rather than just convenience.

Ellen Sheng, writing for CNBC explains how employees in the US are at huge risk from workplace surveillance:

In the workplace, almost any consumer privacy law can be waived. Even if companies give employees a choice about whether or not they want to participate, it’s not hard to force employees to agree. That is, unless lawmakers introduce laws that explicitly state a company can’t make workers agree to a technology…

One example: Companies are increasingly interested in employee social media posts out of concern that employee posts could reflect poorly on the company. A teacher’s aide in Michigan was suspended in 2012 after refusing to share her Facebook page with the school’s superintendent following complaints about a photo she had posted. Since then, dozens of similar cases prompted lawmakers to take action. More than 16 states have passed social media protections for individuals.

Ellen Sheng

It’s not just workplaces, though. Schools are hotbeds for new surveillance technologies, as Benjamin Herold notes in an article for Education Week:

Social media monitoring companies track the posts of everyone in the areas surrounding schools, including adults. Other companies scan the private digital content of millions of students using district-issued computers and accounts. Those services are complemented with tip-reporting apps, facial-recognition software, and other new technology systems.

[…]

While schools are typically quiet about their monitoring of public social media posts, they generally disclose to students and parents when digital content created on district-issued devices and accounts will be monitored. Such surveillance is typically done in accordance with schools’ responsible-use policies, which students and parents must agree to in order to use districts’ devices, networks, and accounts.
Hypothetically, students and families can opt out of using that technology. But doing so would make participating in the educational life of most schools exceedingly difficult.

Benjamin Herold

In China, of course, a social credit system makes all of this a million times worse, but we in the West aren’t heading in a great direction either.

By the time my children are my age, companies, employers, and the state could hold decades of data on them, from the moment they entered the school system through to finding jobs and becoming parents themselves.

There are upsides to all of this data, obviously. But I think that in the midst of privacy-focused conversations about Amazon’s smart speakers and Google location-sharing, we might be missing the bigger picture around surveillance by educational institutions, employers, and governments.

Returning to Ian Welsh to finish up, remember that it’s the coercive power relationships that make surveillance a bad thing:

Surveillance societies are sterile societies. Everyone does what they’re supposed to do all the time, and because we become what we do, it affects our personalities. It particularly affects our creativity, and is a large part of why Communist surveillance societies were less creative than the West, particularly as their police states ramped up.

Ian Welsh

We don’t want to think about all of this, though, do we?


Also check out:

Everyone hustles his life along, and is troubled by a longing for the future and weariness of the present

Thanks to Seneca for today’s quotation, taken from his still-all-too-relevant On the Shortness of Life. We’re constantly being told that we need to ‘hustle’ to make it in today’s society. However, as Dan Lyons points out in a book I’m currently reading, Lab Rats: How Silicon Valley Made Work Miserable for the Rest of Us, we’re actually being ‘immiserated’ for the benefit of venture capitalists.

As anyone who’s read Daniel Kahneman’s book Thinking, Fast and Slow will know, there are two dominant types of thinking:

The central thesis is a dichotomy between two modes of thought: “System 1” is fast, instinctive and emotional; “System 2” is slower, more deliberative, and more logical. The book delineates cognitive biases associated with each type of thinking, starting with Kahneman’s own research on loss aversion. From framing choices to people’s tendency to replace a difficult question with one which is easy to answer, the book highlights several decades of academic research to suggest that people place too much confidence in human judgement.

Wikipedia

Cal Newport, in his book of the same name, calls ‘System 2’ thinking something else: Deep Work. Seneca, Kahneman, and Newport are all basically saying the same thing, but with different emphasis. We need to allow ourselves time for the slower, more deliberative work that makes us uniquely human.

That kind of work doesn’t happen when you’re being constantly interrupted, nor when you’re in an environment that isn’t comfortable, nor when you’re fearful that your job may not exist next week. A post for the Nuclino blog entitled Slack Is Not Where ‘Deep Work’ Happens uses a potentially-apocryphal tale to illustrate the point:

On one morning in 1797, the English poet Samuel Taylor Coleridge was composing his famous poem Kubla Khan, which came to him in an opium-induced dream the night before. Upon waking, he set about writing until he was interrupted by an unknown person from Porlock. The interruption caused him to forget the rest of the lines, and Kubla Khan, only 54 lines long, was never completed.

Nuclino blog

What we’re actually doing by forcing everyone to use synchronous tools like Slack is imposing a kind of journalistic rhythm on work, but without everyone being synced up:

Diagram courtesy of the Nuclino blog

If you haven’t read Deep Work, never fear, because there’s an epic article by Fadeke Adegbuyi for Doist entitled The Complete Guide to Deep Work which is particularly useful:

This is an actionable guide based directly on Newport’s strategies in Deep Work. While we fully recommend reading the book in its entirety, this guide distills all of the research and recommendations into a single actionable resource that you can reference again and again as you build your deep work practice. You’ll learn how to integrate deep work into your life in order to execute at a higher level and discover the rewards that come with regularly losing yourself in meaningful work.

Fadeke Adegbuyi

Lots of articles and podcast episodes say they’re ‘actionable’ or provide ‘tactics’ for success. I have to say this one delivers. I’d still read Newport’s book, though.

Interestingly, despite all of the ridiculousness spouted by VCs, people are pretty clear about how they can do their best work. Following a Dropbox survey of 500 US-based workers in the knowledge economy, Ben Taylor outlines four ‘lessons’ learned:

  1. More workers want to slow down to get things right — “In reality, 61% of workers said they wanted to “slow down to get things right” while only 41% wanted to “go fast to achieve more.” The divide was even starker among older workers.”
  2. Workers strongly value uninterrupted focus at work, but most will make an exception to help others — “The results suggest we need to be more thoughtful about when we break our concentration, or ask others to do so. When people know they are helping others in a meaningful way, they tend to be okay with some distraction. But the busywork of meetings, alerts, and emails can quickly disrupt a person’s flow—one of the most important values we polled.”
  3. Most workers have slightly more trust in people closest to the work, rather than people in upper management — “Among all respondents, 53% trusted people “closest to the work,” while only 45% trusted “upper management.” You might assume that younger workers would be the most likely to trust peers over management, but in fact, the opposite was true.”
  4. Workers are torn between idealism and pragmatism — “It’s tempting to assume that addressing just one piece—like taking a stand on societal issues—will necessarily get in the way of the work itself. But our research suggests we can begin to solve the two in tandem, as more equality, inclusion, and diversity tends to come hand-in-hand with a healthier mindset about work.”

I think we need to reclaim workplace culture from the hustlers, shallow thinkers, and those focused on short-term profit. Let’s reflect on how things actually work in practice. As Nassim Nicholas Taleb says about being ‘antifragile’, let’s “look for habits and rules that have been around for a long time”.


Also check out:

  • Health effects of job insecurity (IZA) — “Workers’ health is not just a matter for employees and employers, but also for public policy. Governments should count the health cost of restrictive policies that generate unemployment and insecurity, while promoting employability through skills training.”
  • Will your organization change itself to death? (opensource.com) — “Sometimes, an organization returns to the same state after sensing a stimulus. Think about a kid’s balancing doll: You can push it and it’ll wobble around, but it always returns to its upright state… Resilient organizations undergo change, but they do so in the service of maintaining equilibrium.”
  • Your Brain Can Only Take So Much Focus (HBR) — “The problem is that excessive focus exhausts the focus circuits in your brain. It can drain your energy and make you lose self-control. This energy drain can also make you more impulsive and less helpful. As a result, decisions are poorly thought-out, and you become less collaborative.”

Charisma instead of hierarchy?

An interesting interview with Fred Turner, former journalist, Stanford professor, and someone who spends a lot of time studying the technology and culture of Silicon Valley.

Turner likens tech companies who try to do away with hierarchy to 1960s communes:

When you take away bureaucracy and hierarchy and politics, you take away the ability to negotiate the distribution of resources on explicit terms. And you replace it with charisma, with cool, with shared but unspoken perceptions of power. You replace it with the cultural forces that guide our behavior in the absence of rules.

It’s an interesting viewpoint, and one which chimes with works such as The Tyranny of Structurelessness. I still think less hierarchy is a good thing. But then I would say that, because I’m a white, privileged western man getting ever-closer to middle-age…

Source: Logic magazine

What would a version of Maslow’s Hierarchy of Needs for society look like?

I like the notion put forward by Susan Wu in this article — although Maslow’s framework was actually based on co-operation, so re-thinking it as a dynamic hierarchy may be all that’s required:

Perhaps it’s time for an updated version of Maslow’s hierarchy of needs, one that underscores what’s essential not just for individuals to flourish, but for the greater good of society. Startups and management executives universally invoke this theory as an accepted canon for framing the human problems they’re trying to solve.

The problem is that Maslow’s framework pertains to individual, not societal, well-being. The reality is that individual needs cannot be met without the social cohesion of belonging, connectedness, and symbiotic networks. A revised design focused on a thriving civilization would have at its root empathy and ethics, and acknowledge that if inequality continues to grow at its current pace, societal well-being becomes impossible to achieve.

Source: WIRED