- Blood money is fine with us, says GitLab: Vetting non-evil customers is 'time consuming, potentially distracting' (The Register)
- Revealed: Google made large contributions to climate change deniers (The Guardian)
- Lenovo X270 (Wikipedia)
- Debian Linux
- Asus VivoMini (Wikipedia)
- How does a computer ‘see’ gender? (Pew Research Center) — "Machine learning tools can bring substantial efficiency gains to analyzing large quantities of data, which is why we used this type of system to examine thousands of image search results in our own studies. But unlike traditional computer programs – which follow a highly prescribed set of steps to reach their conclusions – these systems make their decisions in ways that are largely hidden from public view, and highly dependent on the data used to train them. As such, they can be prone to systematic biases and can fail in ways that are difficult to understand and hard to predict in advance."
- The Communication We Share with Apes (Nautilus) — "Many primate species use gestures to communicate with others in their groups. Wild chimpanzees have been seen to use at least 66 different hand signals and movements to communicate with each other. Lifting a foot toward another chimp means “climb on me,” while stroking their mouth can mean “give me the object.” In the past, researchers have also successfully taught apes more than 100 words in sign language."
- Why degrowth is the only responsible way forward (openDemocracy) — "If we free our imagination from the liberal idea that well-being is best measured by the amount of stuff that we consume, we may discover that a good life could also be materially light. This is the idea of voluntary sufficiency. If we manage to decide collectively and democratically what is necessary and enough for a good life, then we could have plenty."
- 3 times when procrastination can be a good thing (Fast Company) — "It took Leonardo da Vinci years to finish painting the Mona Lisa. You could say the masterpiece was created by a master procrastinator. Sure, da Vinci wasn’t under a tight deadline, but his lengthy process demonstrates the idea that we need to work through a lot of bad ideas before we get down to the good ones."
- Why can’t we agree on what’s true any more? (The Guardian) — "What if, instead, we accepted the claim that all reports about the world are simply framings of one kind or another, which cannot but involve political and moral ideas about what counts as important? After all, reality becomes incoherent and overwhelming unless it is simplified and narrated in some way or other."
- A good teacher voice strikes fear into grown men (TES) — "A good teacher voice can cut glass if used with care. It can silence a class of children; it can strike fear into the hearts of grown men. A quiet, carefully placed “Excuse me”, with just the slightest emphasis on the “-se”, is more effective at stopping an argument between adults or children than any amount of reason."
- Freeing software (John Ohno) — "The only way to set software free is to unshackle it from the needs of capital. And, capital has become so dependent upon software that an independent ecosystem of anti-capitalist software, sufficiently popular, can starve it of access to the speed and violence it needs to consume ever-doubling quantities of to survive."
- Young People Are Going to Save Us All From Office Life (The New York Times) — "Today’s young workers have been called lazy and entitled. Could they, instead, be among the first to understand the proper role of work in life — and end up remaking work for everyone else?"
- Global climate strikes: Don’t say you’re sorry. We need people who can take action to TAKE ACTUAL ACTION (The Guardian) — "Brenda the civil disobedience penguin gives some handy dos and don’ts for your civil disobedience"
Happy New Year! It's good to be back.
This week's microcast answers a question from John Johnston about federation and the IndieWeb. I also discuss anarchism and left-libertarianism, for good measure.
For approximately the last decade, I've taken an annual hiatus from writing and social media, focusing on inputs rather than outputs. Sometimes that's lasted a month, sometimes two.
This year, I'm going to be sending out weekly newsletters (only) during November, and then nothing at all in December. As a result, there won't be any more posts on this site until January 2020.
I'd like to take this opportunity to thank everyone who has commented on my work this year, either publicly or privately. A special thanks goes to those who back Thought Shrapnel via Patreon. I really do appreciate your support!
I've decided to post these microcasts, which I previously made available only through Patreon, here instead.
Microcasts focus on what I've been up to and thinking about, and also provide a way to answer questions from supporters and other readers/listeners!
This microcast covers ethics in decision-making for technology companies and (related!) some recent purchases I've made.
So said Sydney Smith. Let's talk about surveillance. Let's talk about surveillance capitalism and surveillance humanitarianism. But first, let's talk about machine learning and algorithms; in other words, let's talk about what happens after all of that data is collected.
Writing in The Guardian, Sarah Marsh investigates local councils using "automated guidance systems" in an attempt to save money.
The systems are being deployed to provide automated guidance on benefit claims, prevent child abuse and allocate school places. But concerns have been raised about privacy and data security, the ability of council officials to understand how some of the systems work, and the difficulty for citizens in challenging automated decisions. (Sarah Marsh)
The trouble is, they're not particularly effective:
It has emerged North Tyneside council has dropped TransUnion, whose system it used to check housing and council tax benefit claims. Welfare payments to an unknown number of people were wrongly delayed when the computer’s “predictive analytics” erroneously identified low-risk claims as high risk.
Meanwhile, Hackney council in east London has dropped Xantura, another company, from a project to predict child abuse and intervene before it happens, saying it did not deliver the expected benefits. And Sunderland city council has not renewed a £4.5m data analytics contract for an “intelligence hub” provided by Palantir. (Sarah Marsh)
When I was at Mozilla, a number of my colleagues had worked on the OFA (Obama For America) campaign. I remember one of them, a DevOps guy, expressing his concern that the infrastructure being built was all well and good while someone 'friendly' was in the White House, but wondering what would come next.
Well, we now know what comes next, on both sides of the Atlantic, and we can't put that genie back in its bottle. Swingeing cuts by successive Conservative governments over here, coupled with the Brexit time-and-money pit means that there's no attention or cash left.
If we stop and think about things for a second, we probably wouldn't want to live in a world where machines make decisions for us, based on algorithms devised by nerds. As Rose Eveleth discusses in a scathing article for Vox, this stuff isn't 'inevitable' — nor does it constitute a process of 'natural selection':
Often consumers don’t have much power of selection at all. Those who run small businesses find it nearly impossible to walk away from Facebook, Instagram, Yelp, Etsy, even Amazon. Employers often mandate that their workers use certain apps or systems like Zoom, Slack, and Google Docs. “It is only the hyper-privileged who are now saying, ‘I’m not going to give my kids this,’ or, ‘I’m not on social media,’” says Rumman Chowdhury, a data scientist at Accenture. “You actually have to be so comfortable in your privilege that you can opt out of things.”
And so we’re left with a tech world claiming to be driven by our desires when those decisions aren’t ones that most consumers feel good about. There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the “natural evolution of technology.” But the “natural evolution of technology” was never a thing to begin with, and it’s time to question what “progress” actually means. (Rose Eveleth)
I suppose the thing that concerns me the most is people in dire need being subject to impersonal technology for vital and life-saving aid.
For example, Mark Latonero, writing in The New York Times, talks about the growing dangers around what he calls 'surveillance humanitarianism':
By surveillance humanitarianism, I mean the enormous data collection systems deployed by aid organizations that inadvertently increase the vulnerability of people in urgent need.
Despite the best intentions, the decision to deploy technology like biometrics is built on a number of unproven assumptions, such as, technology solutions can fix deeply embedded political problems. And that auditing for fraud requires entire populations to be tracked using their personal data. And that experimental technologies will work as planned in a chaotic conflict setting. And last, that the ethics of consent don’t apply for people who are starving. (Mark Latonero)
It's easy to think that this is an emergency, so we should just do whatever is necessary. But Latonero argues that collecting this data doesn't remove the danger; it merely shifts the risk to a later time:
If an individual or group’s data is compromised or leaked to a warring faction, it could result in violent retribution for those perceived to be on the wrong side of the conflict. When I spoke with officials providing medical aid to Syrian refugees in Greece, they were so concerned that the Syrian military might hack into their database that they simply treated patients without collecting any personal data. The fact that the Houthis are vying for access to civilian data only elevates the risk of collecting and storing biometrics in the first place. (Mark Latonero)
There was a rather startling article in last weekend's newspaper, which I've found online. Hannah Devlin, again writing in The Guardian (which is a good source of information for those concerned with surveillance) writes about a perfect storm of social media and improved processing speeds:
[I]n the past three years, the performance of facial recognition has stepped up dramatically. Independent tests by the US National Institute of Standards and Technology (Nist) found the failure rate for finding a target picture in a database of 12m faces had dropped from 5% in 2010 to 0.1% this year.
The rapid acceleration is thanks, in part, to the goldmine of face images that have been uploaded to Instagram, Facebook, LinkedIn and captioned news articles in the past decade. At one time, scientists would create bespoke databases by laboriously photographing hundreds of volunteers at different angles, in different lighting conditions. By 2016, Microsoft had published a dataset, MS Celeb, with 10m face images of 100,000 people harvested from search engines – they included celebrities, broadcasters, business people and anyone with multiple tagged pictures that had been uploaded under a Creative Commons licence, allowing them to be used for research. The dataset was quietly deleted in June, after it emerged that it may have aided the development of software used by the Chinese state to control its Uighur population.
In parallel, hardware companies have developed a new generation of powerful processing chips, called Graphics Processing Units (GPUs), uniquely adapted to crunch through a colossal number of calculations every second. The combination of big data and GPUs paved the way for an entirely new approach to facial recognition, called deep learning, which is powering a wider AI revolution. (Hannah Devlin)
Those of you who have read this far and are expecting some big reveal are going to be disappointed. I don't have any 'answers' to these problems. I guess I've been guilty, like many of us have, of the kind of 'privacy nihilism' mentioned by Ian Bogost in The Atlantic:
Online services are only accelerating the reach and impact of data-intelligence practices that stretch back decades. They have collected your personal data, with and without your permission, from employers, public records, purchases, banking activity, educational history, and hundreds more sources. They have connected it, recombined it, bought it, and sold it. Processed foods look wholesome compared to your processed data, scattered to the winds of a thousand databases. Everything you have done has been recorded, munged, and spat back at you to benefit sellers, advertisers, and the brokers who service them. It has been for a long time, and it’s not going to stop. The age of privacy nihilism is here, and it’s time to face the dark hollow of its pervasive void. (Ian Bogost)
The only forces we have to stop this are collective action and governmental action. My concern is that we don't have the digital savvy for the former, and there's definitely a lack of will in respect of the latter. Troubling times.
So said my namesake Douglas Adams. In fact, he said lots of wise things about technology, most of them too long to serve as a title.
I'm in a weird place, emotionally, at the moment, but sometimes this can be a good thing. Being taken out of your usual 'autopilot' can be a useful way to see things differently. So I'm going to take this opportunity to share three things that, to be honest, make me a bit concerned about the next few years...
Attempts to put microphones everywhere
In an article for Slate, Shannon Palus ranks all of Amazon's new products by 'creepiness'. The Echo Frames are, in her words:
A microphone that stays on your person all day and doesn’t look like anything resembling a microphone, nor follows any established social codes for wearable microphones? How is anyone around you supposed to have any idea that you are wearing a microphone? (Shannon Palus)
When we're not talking about weapons of mass destruction, it's not the tech that concerns me, but the context in which the tech is used. As Palus points out, how are you going to be able to have a 'quiet word' with anyone wearing glasses ever again?
It's not just Amazon, of course. Google and Facebook are at it, too.
With the exception, perhaps, of populist politicians, I don't think we're ready for a post-truth society. Check out the video above, which shows Chinese technology that allows for 'full body deepfakes'.
The video is embedded, along with a couple of others, in an article for Fast Company by DJ Pangburn, who also notes that AI is learning human body movements from videos. Not only will you be able to prank your friends by showing them a convincing video of your ability to do 100 pull-ups, but the fake news it engenders will mean we can't trust anything any more.
If you clicked on the 'super-secret link' in Sunday's newsletter, you will have come across STEALING UR FEELINGS which is nothing short of incredible. As powerful as it is in showing you the kind of data that organisations have on us, it's the tip of the iceberg.
Kaveh Waddell, in an article for Axios, explains that brains are the last frontier for privacy:
"The sort of future we're looking ahead toward is a world where our neural data — which we don't even have access to — could be used" against us, says Tim Brown, a researcher at the University of Washington Center for Neurotechnology.Kaveh Waddell
This would lead to 'neuromarketing', with advertisers knowing what triggers and influences you better than you know yourself. Also, it will no doubt be used for discriminatory purposes and, because it's coming directly from your brainwaves, short of literally wearing a tinfoil hat, there's nothing much you can do.
So there we are. Am I being too fearful here?
Thanks to Clay Shirky for today's title. It's true, isn't it? You can't claim something to be a true revolution unless someone, some organisation, or some group of people loses.
I'm happy to say that it's the turn of some older white men to be losing right now, and particularly delighted that those who have spent decades abusing and repressing people are getting their comeuppance.
Enough has been written about Epstein and the fallout from it. You can read about comments made by Richard Stallman, founder of the Free Software Foundation, in this Washington Post article. I've only met RMS (as he's known) in person once, at the Indie Tech Summit five years ago, but it wasn't a great experience. While I'm willing to cut visionary people some slack, he mostly acted like a jerk.
RMS is a revered figure in Free Software circles and it's actually quite difficult not to agree with his stance on many political and technological matters. That being said, he deserves everything he gets for the comments he made about child abuse, for the way he's treated women over the past few decades, and for his dictator-like approach to software projects.
In an article for WIRED entitled Richard Stallman’s Exit Heralds a New Era in Tech, Noam Cohen writes that we're entering a new age. I certainly hope so.
This is a lesson we are fast learning about freedom as it is promoted by the tech world. It is not about ensuring that everyone can express their views and feelings. Freedom, in this telling, is about exclusion. The freedom to drive others away. And, until recently, freedom from consequences.
After 40 years of excluding those who didn’t serve his purposes, however, Stallman finds himself excluded by his peers. Freedom.
Maybe freedom, defined in this crude, top-down way, isn’t the be-all, end-all. Creating a vibrant inclusive community, it turns out, is as important to a software project as a coding breakthrough. Or, to put it in more familiar terms—driving away women, investing your hopes in a single, unassailable leader is a critical bug. The best patch will be to start a movement that is respectful, inclusive, and democratic. (Noam Cohen)
One of the things that the next leaders of the Free Software Movement will have to address is how to take practical steps to guarantee our basic freedoms in a world where Big Tech provides surveillance to ever-more-powerful governments.
Cory Doctorow is an obvious person to look to in this regard. He has a history of understanding what's going on and writing about it in ways that people understand. In an article for The Globe and Mail, Doctorow notes that a decline in trust of political systems and experts more generally isn't because people are more gullible:
40 years of rising inequality and industry consolidation have turned our truth-seeking exercises into auctions, in which lawmakers, regulators and administrators are beholden to a small cohort of increasingly wealthy people who hold their financial and career futures in their hands.
To be in a world where the truth is up for auction is to be set adrift from rationality. No one is qualified to assess all the intensely technical truths required for survival: even if you can master media literacy and sort reputable scientific journals from junk pay-for-play ones; even if you can acquire the statistical literacy to evaluate studies for rigour; even if you can acquire the expertise to evaluate claims about the safety of opioids, you can’t do it all over again for your city’s building code, the aviation-safety standards governing your next flight, the food-safety standards governing the dinner you just ordered. (Cory Doctorow)
What's this got to do with technology, and in particular Free Software?
Big Tech is part of this problem... because they have monopolies, thanks to decades of buying nascent competitors and merging with their largest competitors, of cornering vertical markets and crushing rivals who won't sell. Big Tech means that one company is in charge of the social lives of 2.3 billion people; it means another company controls the way we answer every question it occurs to us to ask. It means that companies can assert the right to control which software your devices can run, who can fix them, and when they must be sent to a landfill.
These companies, with their tax evasion, labour abuses, cavalier attitudes toward our privacy and their completely ordinary human frailty and self-deception, are unfit to rule our lives. But no one is fit to be our ruler. We deserve technological self-determination, not a corporatized internet made up of five giant services each filled with screenshots from the other four. (Cory Doctorow)
Doctorow suggests breaking up these companies to end their de facto monopolies and level the playing field.
The problem of tech monopolies is something that Stowe Boyd explored in a recent article entitled Are Platforms Commons? Citing previous precedents around railroads, Boyd raises many questions, including whether successful platforms should be bound by the legal principles of 'common carriers', and finishes with this:
However, just one more question for today: what if ecosystems were constructed so that they were governed by the participants, rather by the hypercapitalist strivings of the platform owners — such as Apple, Google, Amazon, Facebook — or the heavy-handed regulators? Is there a middle ground where the needs of the end user and those building, marketing, and shipping products and services can be balanced, and a fair share of the profits are distributed not just through common carrier laws but by the shared economics of a commons, and where the platform orchestrator gets a fair share, as well? We may need to shift our thinking from common carrier to commons carrier, in the near future. (Stowe Boyd)
The trouble is, simply establishing a commons doesn't solve all of the problems. In fact, what tends to happen next is well known:
The tragedy of the commons is a situation in a shared-resource system where individual users, acting independently according to their own self-interest, behave contrary to the common good of all users, by depleting or spoiling that resource through their collective action. (Wikipedia)
An article in The Economist outlines the usual remedies to the 'tragedy of the commons': either governmental regulation (e.g. airspace), or property rights (e.g. land). However, the article cites the work of Elinor Ostrom, a Nobel prizewinning economist, showing that another way is possible:
An exclusive focus on states and markets as ways to control the use of commons neglects a varied menagerie of institutions throughout history. The information age provides modern examples, for example Wikipedia, a free, user-edited encyclopedia. The digital age would not have dawned without the private rewards that flowed to successful entrepreneurs. But vast swathes of the web that might function well as commons have been left in the hands of rich, relatively unaccountable tech firms.
A world rich in healthy commons would of necessity be one full of distributed, overlapping institutions of community governance. Cultivating these would be less politically rewarding than privatisation, which allows governments to trade responsibility for cash. But empowering commoners could mend rents in the civic fabric and alleviate frustration with out-of-touch elites. (The Economist)
I count myself as someone on the left of politics, if that's how we're measuring things today. However, I don't think we need representation at any higher level than is strictly necessary.
In a time when technology allows you, to a great extent, to represent yourself, perhaps we need ways of demonstrating how complex and multi-faceted some issues are? Perhaps we need to try 'liquid democracy':
Liquid democracy lies between direct and representative democracy. In direct democracy, participants must vote personally on all issues, while in representative democracy participants vote for representatives once in certain election cycles. Meanwhile, liquid democracy does not depend on representatives but rather on a weighted and transitory delegation of votes. Liquid democracy through elections can empower individuals to become sole interpreters of the interests of the nation. It allows for citizens to vote directly on policy issues, delegate their votes on one or multiple policy areas to delegates of their choosing, delegate votes to one or more people, delegated to them as a weighted voter, or get rid of their votes' delegations whenever they please. (Wikipedia)
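The delegation mechanism described above can be sketched in a few lines of Python. This is a minimal illustration under my own assumptions, not any particular liquid democracy implementation: each voter either votes directly or delegates to one other voter, weight accumulates along transitive delegation chains, and (my assumed rule) anyone caught in a delegation cycle falls back to voting directly.

```python
def resolve_weights(delegations, voters):
    """Resolve transitive delegations into a final vote weight per voter.

    `delegations` maps each voter to the voter they delegate to, or None
    if they vote directly. Every voter starts with a weight of 1, so a
    chain A -> B -> C gives C a final weight of 3.
    """
    weights = {v: 0 for v in voters}
    for voter in voters:
        # Follow the delegation chain to its end, guarding against cycles.
        seen = {voter}
        current = voter
        while delegations.get(current) is not None:
            nxt = delegations[current]
            if nxt in seen:
                # Cycle detected: fall back to this voter voting directly.
                current = voter
                break
            seen.add(nxt)
            current = nxt
        weights[current] += 1
    return weights

# A delegates to B, B delegates to C; C and D vote directly.
print(resolve_weights({"A": "B", "B": "C", "C": None, "D": None},
                      ["A", "B", "C", "D"]))
# → {'A': 0, 'B': 0, 'C': 3, 'D': 1}
```

The "transitory" part of the Wikipedia description would amount to simply editing the `delegations` mapping before the next resolution: delegation is recomputed per vote, not fixed per election cycle.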
I think, given the state that politics is in right now, it's well worth a try. The problem, of course, is that the losers would be the political elites, the current incumbents. But, hey, it's not a revolution if nobody loses, right?
This week's roundup is going out a day later than usual, as yesterday was the Global Climate Strike and Thought Shrapnel was striking too!
Here's what I've been paying attention to this week:
So said Marcus Aurelius. Today's short article is about what happens after you die. We're all aware of the importance of making a will, particularly if you have dependants. But that's primarily for your analogue, offline life. What about your digital life?
In a recent TechCrunch article, Jon Evans writes:
I really wish I hadn’t had cause to write this piece, but it recently came to my attention, in an especially unfortunate way, that death in the modern era can have a complex and difficult technical aftermath. You should make a will, of course. Of course you should make a will. But many wills only dictate the disposal of your assets. What will happen to the other digital aspects of your life, when you’re gone? (Jon Evans)
The article points to a template for a Digital Estate Planning Document which you can use to list all of the places that you're active. Interestingly, the suggestion is to have a 'digital executor', which makes sense: the more technical you are, the more likely it is that other members of your family won't be able to follow your instructions.
Interestingly, the Wikipedia article on digital wills has some very specific advice, of which the above-mentioned document is only a part:
- Appoint someone as online executor
- State in a formal document how profiles and accounts are handled
- Understand privacy policies
- Provide the online executor with a list of websites and logins
- State in the will that the online executor must have a copy of the death certificate
I hadn't really thought about this, but the chances of identity theft after someone has died are as great as, if not greater than, when they were alive:
An article by Magder in the newspaper The Gazette provides a reminder that identity theft can potentially continue to be a problem even after death if their information is released to the wrong people. This is why online networks and digital executors require proof of a death certificate from a family member of the deceased person in order to acquire access to accounts. There are instances when access may still be denied, because of the prevalence of false death certificates. (Wikipedia)
Zooming out a bit, and thinking about this from my own perspective, it's a good idea to insist on good security practices for your nearest and dearest. Ensure they know how to use password managers and use two-factor authentication on their accounts. If they do this for themselves, they'll understand how to do it with your accounts when you're gone.
One thing it's made me think about is the length of time for which I renew domain names. I tend to just renew mine (I have quite a few) on a yearly basis. But what if the worst happened? Those payment details would be declined, and my sites would be offline in a year or less.
All of this makes me think that the important thing here is to keep things as simple as possible. As I've discussed in another article, the way people remember us after we're gone is kind of important.
Most of us could, I think, divide our online life into three buckets:
- Really important to my legacy
- Kind of important
- Not important
So if, for example, I died tomorrow, the domain renewal for Thought Shrapnel lapsed next year, and a scammer took it over, that would be terrible. It's part of the reason why I still renew domains I don't use. So this would go in the 'really important to my legacy' bucket.
On the other hand, my experiments with various tools and platforms I'm less bothered about. They would probably go in the 'not important' bucket.
Ultimately, it's a conversation to have with those close to you. For me, it's on my mind after the death of a good friend and so something I should get to before life goes back to some version of normality. After all, figuring out someone else's digital life admin is the last thing people want when they're already dealing with grief.
What would you do if you knew you had 24 hours left to live? I suppose it would depend on context. Is this catastrophe going to affect everyone, or only you? I'm not sure I'd know what to do in the former case, but once I'd said my goodbyes to my family, I'm pretty sure I know what I'd do in the latter.
Yep, I would go somewhere by myself and write.
To me, the reason both reading and writing can feel so freeing is that they allow you to mentally escape your physical constraints. It almost doesn't matter what's happening to your body or anything around you while you lose yourself in someone else's words, or you create your own.
I came across an interesting blog recently. It had a single post, entitled Consume less, create more. In it, the author, 'Tom', explains that the 1,600 words he's shared were written over the course of a month after he realised that he was spending his life consuming instead of creating.
A lot of ink has been spilled about the perils of modern technology. How it distracts us, how it promotes unhealthy comparisons with others, how it makes us fat, how it limits social interaction, how it spies on us. And all of these things are probably true, to some extent.
But the real tragedy of modern technology is that it’s turned us into consumers. Our voracious consumption of media parallels our consumption of fossil fuels, corn syrup, and plastic straws. And although we’re starting to worry about our consumption of those physical goods, we seem less concerned about our consumption of information.
We treat information as necessarily good, and comfort ourselves with the feeling that whatever article or newsletter we waste our time with is actually good for us. We equate reading with self improvement, even though we forget most of what we’ve read, and what we remember isn’t useful. (TJCX)
I feel that at this juncture in history, we've perfected surveillance-via-smartphone as the perfect tool to maximise FOMO. For those growing up in the goldfish bowl of the modern world, this may feel as normal as the 'water' in which they are 'swimming'. But for the rest of us, it can still feel... odd.
This is going to sound pretty amazing, but I don't think there have been many days in my adult life when I've been able to go somewhere without anyone else knowing. As a kid? Absolutely. I can vividly remember, for example, cycling to a corn field and finding a place to lie down and look at the sky, knowing that no-one could see me. It was time spent with myself, unmediated and unfiltered.
This didn't use to be unusual. People had private inner lives that were manifested in private actions. In a recent column in The Guardian, Grace Dent expanded on this.
Yes life after iPhones is marvellous, but in the 90s I ran wild across London, up to all kinds of no good, staying out for days, keeping my own counsel entirely. My parents up north would not speak to me for weeks. Sometimes, life back in the days when we had one shit Nokia and a landline between five friends seems blissful. One was permitted lost weekends and periods of secret skulduggery or just to lie about reading a paperback without the sense six people were owed a text message. Yes, things took longer, and one needed to make plans and keep them, but being off the grid was normal. Today, not replying... is a truly radical act. (Grace Dent)
"Not replying... is a truly radical act". Wow. Let that sink in for a moment.
Given all this, it's no wonder that in our always-on culture we have so much 'life admin' to concern ourselves with. Previous generations may have had 'pay the bills' on their to-do lists, but it wasn't nudged down by 'inform a person I kind of know on Twitter that they have an incorrect view on Brexit'.
All of these things build up incrementally until they eventually become unsustainable. It's death by a thousand cuts. As I've quoted many times before, Jocelyn K. Glei's question is always worth asking: who are you without the doing?
Realistically, most of our days are likely to involve some use of digital communication tools. We can't always be throwing off our shackles to live the life of a flâneur. To facilitate space to create, therefore, it's important to draw some red lines. This is what Michael Bernstein talks about in Sorry, we can't join your Slack.
Saying yes to joining client Slack channels would mean that down the line we’d feel more exhausted but less accomplished. We’d have more superficial “friends,” but wouldn’t know how to deal with products much better than we did now. We’d be on the hook all the time, and have less of an opportunity to consider our responses. (Michael Bernstein)
In other words, being more available and more 'social' takes time away from more important pursuits. After all, time is the ultimate zero-sum game.
Ultimately, I guess it's about learning to see the world differently. There may very well be a 'new normal' that we've begun to internalise but, for now at least, we have a choice to use to our advantage that 'flexibility' we hear so much about.
This is why self-reflection is so important, as Wanda Thibodeaux explains in an article for Inc.
In sum, elimination of stress and the acceptance of peace comes not necessarily from changing the world, but rather from clearing away all the learned clutter that prevents us from changing our view of the world. Even the biggest systemic "realities" (e.g., work "HAS" to happen from 9 a.m. to 5 p.m.) are up for reinterpretation and rewriting, and arguably, inner calm and innovation both stem from the same challenge of perceptions. (Wanda Thibodeaux)
To do this, you have to have already decided the purpose for which you're using your tools, including the ones provided by your smartphone.
Need more specific advice on that? I suggest you go and read this really handy post by Ryan Holiday: A Radical Guide to Spending Less Time on Your Phone. The advice to be focused on which apps you need on your phone is excellent; I deleted over 100!
You may also find this post useful that I wrote over on my blog a few months ago about how changing the 'launcher' on your phone can change your life.
If you make some changes after reading this, I'd be interested in hearing how you get on. Let me know in the comments section below!
Quotation-as-title from Rajkummar Rao.