Handwriting, note-taking, and recall

    I write by hand every day, but not much. While I used to keep a diary in which I’d write several pages, I now keep one that encourages a tweet-sized reflection on the past 24 hours. Other than that, it’s mostly touch-typing on my laptop or desktop computer.

    Next month, I’ll start studying for my MSc and the university have already shipped me the books that form a core part of my study. I’ll be underlining and taking notes on them, which is interesting because I usually highlight things on my ereader.

    This article in The Economist is primarily about note-taking and the use of handwriting. It's probably beyond doubt that handwriting is more effective for deeper learning and recall. But for the work I do, which is more about synthesising multiple sources, I find digital more practical.

    A line of research shows the benefits of an “innovation” that predates computers: handwriting. Studies have found that writing on paper can improve everything from recalling a random series of words to imparting a better conceptual grasp of complicated ideas.

    For learning material by rote, from the shapes of letters to the quirks of English spelling, the benefits of using a pen or pencil lie in how the motor and sensory memory of putting words on paper reinforces that material. The arrangement of squiggles on a page feeds into visual memory: people might remember a word they wrote down in French class as being at the bottom-left on a page, par exemple.

    One of the best-demonstrated advantages of writing by hand seems to be in superior note-taking. In a 2014 study by Pam Mueller and Danny Oppenheimer, students who typed wrote down almost twice as many words, and copied more passages verbatim from lectures, suggesting they were not understanding the material so much as rapidly transcribing it.

    […]

    Many studies have confirmed handwriting’s benefits, and policymakers have taken note. Though America’s “Common Core” curriculum from 2010 does not require handwriting instruction past first grade (roughly age six), about half the states since then have mandated more teaching of it, thanks to campaigning by researchers and handwriting supporters. In Sweden there is a push for more handwriting and printed books and fewer devices. England’s national curriculum already prescribes teaching the rudiments of cursive by age seven.

    Source: The importance of handwriting is becoming better understood | The Economist

    The 'value' of a degree

    I’ve got two things to say about this article in The Economist. One is to do with alternative credentialing, and the other is to do with my first degree.

    1. The rhetoric around Open Badges in the early days was that it would mean the end of universities. Instead, they have co-opted them as 'microcredentials' in a way that unbundles chunky degrees into bitesize pieces. Universities are now more likely to work with employers on these microcredentials, which is a benefit to employability, at the expense of a rounded 'liberal' education.
    2. My first degree was in Philosophy, which most people would assume makes you entirely unemployable. The reverse is actually true, especially for knowledge work. I should imagine that in a world where we need, for example, more AI ethicists, this trend will only continue.
    The value of a university education, to my mind, isn't really how much you earn specifically because of the piece of paper you earn at the end of it. Instead, it's a way to broaden your mind by (hopefully) moving away from where you grew up and encountering people who think differently.

    Chart showing the (economic) "value" of different degrees

    In England 25% of male graduates and 15% of female ones will take home less money over their careers than peers who do not get a degree, according to the Institute for Fiscal Studies (IFS), a research outfit. America has less comprehensive data but has begun publishing the share of students at thousands of institutions who do not manage to earn more than the average high-school graduate early on. Six years after enrolment, 27% of students at a typical four-year university fail to do so, calculate researchers at Georgetown University in Washington, DC. In the long tail, comprising the worst 30% of America’s two- and four-year institutions, more than half of people who enroll lag this benchmark.

    […]

    Earnings data in Britain call into question the assumption that bright youngsters will necessarily benefit from being pushed towards very selective institutions, says Jack Britton of the IFS. In order to beat fierce competition for places, some youngsters apply for whatever subject seems easiest, even if it is not one that usually brings a high return. Parents fixated on getting their offspring into Oxford or Cambridge, regardless of subject, should take note. But there is also evidence that tackling a high-earning course for the sake of it can backfire. Norwegian research finds that students whose true desire is to study humanities, but who end up studying science, earn less after ten years than they probably otherwise would have.

    Source: Was your degree really worth it? | The Economist

    Woke, broke, and complicated

    The comments about young people’s desire for instant gratification struck me as nothing particularly new. However, it is worth thinking about how the desire for more ‘green’ options is coupled with the desire to get everything instantly. The two are somewhat in tension.

    Uncertainty about the future may be encouraging impulsive spending of limited resources in the present. The young were disrupted more by covid than other generations and are now enjoying the rebound. According to McKinsey, American millennials (born between 1980 and the late 1990s) spent 17% more in the year to March 2022 than they did in the year before. Despite this short-term recovery from the dark days of the pandemic, their long-term prospects are much less good.

    […]

    Youngsters’ appetite for instant gratification is also fuelling some distinctly ungreen consumer habits. The young have virtually invented quick commerce, observes Isabelle Allen of KPMG. And that convenience is affordable because it fails to price in all its externalities. The environmental benefits of eating plants rather than meat can be quickly undone if meals are delivered in small batches by a courier on a petrol-powered motorbike. Shein, a Chinese clothes retailer that is the fastest in fast fashion, tops surveys as a Gen Z favourite in the West, despite being criticised for waste; its fashionable garments are cheap enough to throw on once and then throw away. Like everyone else the young are, then, contradictory—because, like everyone else, they are only human.

    Source: How the young spend their money | The Economist

    Meetings and work theatre

    The way that you do something is almost as important as what you do. However, I’ve noticed that, during the pandemic, as people have got used to working remotely (as I’ve done for a decade now), there’s definitely been some, let’s say, ‘theatre’ added to it all.

    Meetings, the office’s answer to the theatre, have proliferated. They are harder to avoid now that invitations must be responded to and diaries are public. Even if you don’t say anything, cameras make meetings into a miming performance: an attentive expression and occasional nodding now count as a form of work. The chat function is a new way to project yourself. Satya Nadella, the boss of Microsoft, says that comments in chat help him to meet colleagues he would not otherwise hear from. Maybe so, but that is an irresistible incentive to pose questions that do not need answering and offer observations that are not worth making.

    Shared documents and messaging channels are also playgrounds of performativity. Colleagues can leave public comments in documents, and in the process notify their authors that something approximating work has been done. They can start new channels and invite anyone in; when no one uses them, they can archive them again and appear efficient. By assigning tasks to people or tagging them in a conversation, they can cast long shadows of faux-industriousness. It is telling that one recent research study found that members of high-performing teams are more likely to speak to each other on the phone, the very opposite of public communication.

    Performative celebration is another hallmark of the pandemic. Once one person has reacted to a message with a clapping emoji, others are likely to join in until a virtual ovation is under way. At least emojis are fun. The arrival of a round-robin email announcing a promotion is as welcome as a rifle shot in an avalanche zone. Someone responds with congratulations, and then another recipient adds their own well wishes. As more people pile in, pressure builds on the non-responders to reply as well. Within minutes colleagues are telling someone they have never met in person how richly they deserve their new job.

    Source: The rise of performative work | The Economist

    Remote workers clock up more hours, says one study

    It takes time and/or training to transition fully to remote working. If it’s not something you’ve chosen (say, because of the pandemic) then that’s doubly problematic.

    I really enjoy working remotely. I miss travelling for events and meetups, which I used to do probably 10-15 times per year, but the actual working from home part is great. As I type this I’m in my running stuff waiting for the Tesco delivery. Work happens around life, rather than the other way round.

    This article talks about one study, which I don’t think is illustrative of the wider picture. What I do recognise, however, is the temptation to work more hours when you live in your workplace. You have to be strict.

    Ultimately, it comes down to control. If you’re in control of your time, then eventually you spend it productively. For example, I work fewer than 30 hours per week in an average week, mainly because I don’t attend meetings I don’t have to.

    Early surveys of employees and employers found that remote work did not reduce productivity. But a new study* of more than 10,000 employees at an Asian technology company between April 2019 and August 2020 paints a different picture. The firm used software installed on employees’ computers that tracked which applications or websites were active, and whether the employee was using the keyboard or a mouse. (Shopping online didn’t count.)

    The research certainly concluded that the employees were working hard. Total hours worked were 30% higher than before the pandemic, including an 18% increase in working outside normal hours. But this extra effort did not translate into any rise in output. This may explain the earlier survey evidence; both employers and employees felt they were producing as much as before. But the correct way to measure productivity is output per working hour. With all that extra time on the job, this fell by 20%.

    Source: Remote workers work longer, not more efficiently | The Economist
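    As a back-of-envelope check on that figure (a sketch with illustrative numbers, not the study's own calculation): if output stays flat while hours rise 30%, output per hour falls to 1/1.3 ≈ 0.77.

    ```python
    # Back-of-envelope productivity arithmetic (illustrative numbers only):
    # productivity = output / hours, so flat output with 30% more hours
    # means output per hour falls to 1 / 1.3 of its former level.
    hours_before, output_before = 1.0, 1.0
    hours_after = hours_before * 1.30  # total hours worked up 30%
    output_after = output_before       # "no rise in output"

    productivity_change = (output_after / hours_after) / (output_before / hours_before) - 1
    print(f"Productivity change: {productivity_change:.1%}")  # about -23%
    ```

    The article's 20% figure presumably reflects the study's exact numbers (output may have crept up slightly), but the direction and rough magnitude follow directly from the arithmetic.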


    Saturday scrapings

    Every week, I go back through the links I've saved, pick out the best ones, and share them here. This week is perhaps even more eclectic than usual. Enjoy!


    Marcus Henderson

    Meet the Farmer Behind CHAZ's Vegetable Gardens

    Marcus was the first to start gardening in the park, though he was quickly joined by friends and strangers. This isn’t the work of a casual amateur; Henderson has an Energy Resources Engineering degree from Stanford University, a Master’s degree in Sustainability in the Urban Environment, and years of experience working in sustainable agriculture. His Instagram shows him hard at work on various construction and gardening projects, and he’s done community development at organic farms around the world.

    Matt Baume (The Stranger)

    I love this short article about Marcus Henderson, the first person to start planting in Seattle's Capitol Hill Autonomous Zone.


    The Rich Are 'Defunding' Our Democracy

    “Apparently,” comments [journalist David] Sirota, “we’re expected to be horrified by proposals to reduce funding for the militarized police forces that are violently attacking peaceful protesters — but we’re supposed to obediently accept the defunding of the police forces responsible for protecting the population from the wealthy and powerful.”

    Sam Pizzigati (Inequality.org)

    A lot of people have been shocked by the calls to 'defund the police' on the back of the Black Lives Matter protests. The situation is undoubtedly worse in the US, but I particularly liked this explainer image, which I came across via Mastodon:

    Teapot with label 'Defund the police' which has multiple spouts pouring into cups entitled 'Education', 'Universal healthcare', 'Youth services', 'Housing', and 'Other community investments'

    Peasants' Revolt

    Yet perhaps the most surprising feature of the revolt is that, in spite of the modern title (the name 'Peasants' Revolt' didn't gain usage until the late nineteenth century), the people who animated the movement weren't peasants at all. They were in many respects the village elite. True, they weren't noble magnates, but they were constables, stewards and jurors. In short, people who were on the up and saw an opportunity to press their agenda.

    Robert Winter

    I love reading about things I used to teach, especially when they're written by interesting people about whom I want to know more. This blog post is by Robert Winter, "philosopher and historian by training, Operations Director by pay cheque". I discovered it as part of the #100DaysToOffload challenge, largely happening on the Fediverse, and to which I'm contributing.


    Red blood cells

    Three people with inherited diseases successfully treated with CRISPR

    Two people with beta thalassaemia and one with sickle cell disease no longer require blood transfusions, which are normally used to treat severe forms of these inherited diseases, after their bone marrow stem cells were gene-edited with CRISPR.

    Michael Le Page (New Scientist)

    CRISPR is a way of doing gene editing within organisms. As far as I'm aware, this is one of the first times it's been used to treat conditions in humans. I'm sure it won't be the last.


    Choose Your Own Fake News

    Choose Your Own Fake News is an interactive "choose your own adventure" game. Play the game as Flora, Jo or Aida from East Africa, and navigate the world of disinformation and misinformation through the choices you make. Scrutinize news and information about job opportunities, vaccines and upcoming elections to make the right choices!

    This is the kind of thing that the Mozilla Foundation does particularly well: either producing in-house, or funding very specific web-based tools to teach people things. In this case, it's fake news. And it's really good.


    Why are Google and Apple dictating how European democracies fight coronavirus?

    The immediate goal for governments and tech companies is to strike the right balance between privacy and the effectiveness of an application to limit the spread of Covid-19. This requires continuous collaboration between the two with the private sector, learning from the experience of national health authorities and adjusting accordingly. Latvia, together with the rest of Europe, stands firm in defending privacy, and is committed to respecting both the individual’s right to privacy and health while applying its own solutions to combat Covid-19.

    Ieva Ilves (The Guardian)

    This is an article written by an adviser to the president of Latvia on information and digital policy. They explain some of the nuance behind the centralised vs decentralised contact tracing app models which I hadn't really thought about.


    Illustration of Lévy walks

    Random Search Wired Into Animals May Help Them Hunt

    Lévy walks are now seen as a movement pattern that a nervous system can produce in the absence of useful sensory or mnemonic information, when it is an animal’s most advantageous search strategy. Of course, many animals may never employ a Lévy walk: If a polar bear can smell a seal, or a cheetah can see a gazelle, the animals are unlikely to engage in a random search strategy. “We expect the adaptation for Lévy walks to have appeared only where they confer practical advantages,” Viswanathan said.

    Liam Drew (Quanta Magazine)

    If you've watched wildlife documentaries, you probably know about Lévy walks (or 'flights'). This longish article gives a fascinating insight into the origin of the theory and how it can be useful in protecting different species.
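    For the curious, a Lévy walk is easy to simulate: headings are uniform, but step lengths are drawn from a heavy-tailed power law, so many short steps are punctuated by occasional very long jumps. A minimal Python sketch (the function name and parameters are my own, not from the research described):

    ```python
    import math
    import random

    def levy_walk(n_steps, mu=2.0, min_step=1.0, seed=42):
        """Simulate a 2-D Lévy walk.

        Step lengths follow a power law p(l) ~ l^(-mu), sampled by
        inverse-transform from a Pareto distribution; headings are
        uniform on [0, 2*pi). Returns the list of visited points.
        """
        rng = random.Random(seed)
        x, y = 0.0, 0.0
        path = [(x, y)]
        for _ in range(n_steps):
            u = rng.random()  # u in [0, 1), so 1 - u is never zero
            length = min_step * (1.0 - u) ** (-1.0 / (mu - 1.0))  # Pareto sample
            heading = rng.uniform(0.0, 2.0 * math.pi)
            x += length * math.cos(heading)
            y += length * math.sin(heading)
            path.append((x, y))
        return path

    trail = levy_walk(1000)
    ```

    With mu near 2, most steps are short but a few are enormous, which is what makes the pattern an efficient random search: a plain random walk with a fixed step length covers far less ground for the same number of steps.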


    A plan to turn the atmosphere into one, enormous sensor

    One of AtmoSense’s first goals will be to locate and study phenomena at or close to Earth’s surface—storms, earthquakes, volcanic eruptions, mining operations and “mountain waves”, which are winds associated with mountain ranges. The aim is to see if atmospheric sensing can outperform existing methods: seismographs for earthquakes, Doppler weather radar for storms and so on.

    The Economist

    This sounds potentially game-changing. I can see the positives, but I wonder what the negatives will be?


    Paths of desire: lockdown has lent a new twist to the trails we leave behind

    Desire paths aren’t anything new – the term has been traced back to the French philosopher Gaston Bachelard, who wrote of “lignes de désir” in his 1958 book The Poetics of Space. Nature author Robert Macfarlane has written more recently about the inherent poetry of the paths. In his 2012 book The Old Ways: A Journey on Foot, Macfarlane calls them “elective easements” and says: “Paths are human; they are traces of our relationships.” Desire paths have been created by enthusiastic dogs in back gardens, by superstitious humans avoiding scaffolding and by students seeking shortcuts to class. Yet while illicit trails may have marked the easier (ie shorter) route for centuries, the pandemic has turned them into physical markers of our distance. Desire paths are no longer about making life easier for ourselves, but about preserving life for everyone.

    Amelia Tait (The Guardian)

    I've used desire paths as a metaphor many times in presentations and workshops over the last decade. This is an article that specifically talks about how they've sprung up during the pandemic.


    Header image by Hans Braxmeier

    We have it in our power to begin the world over again

    UBI, GDP, and Libertarian Municipalism

    It's sobering to think that, in years to come, historians will probably refer to the 75 years between the end of the Second World War and the start of this period we've just begun with a single name.

    Whatever we end up calling it, one thing is for sure: what comes next can't be a continuation of what went before. We need a sharp division of life pre- and post-pandemic.

    That's because our societies have been increasingly unequal since 2008, when the global financial crisis meant that the rich consolidated their position while the rest of us paid for the mistakes of bankers and the global elite.

    Image via Oxfam

    So what can we do about this? What should we be demanding once we're allowed back out of our houses? What should we organise against?

    I've been a proponent of Universal Basic Income over the last few years, but, I have to say that the closer it comes to being a reality, the more concerns I have about its implementation. Even if it's brought in by a left-leaning government, there's still the danger that it's subsequently used as a weapon against the poor by a new administration.

    That's why I was interested in this section from a book I'm reading at the moment. Writing in Future Histories, Lizzie O'Shea suggests that we need to think beyond UBI to include other approaches:

    Alongside this, we need to consider how productive, waged work could be more democratically organized to meet the needs of society rather than individual companies. To this end, one commonly suggested alternative to a basic income is a job guarantee. The idea is that the government offers a job to anyone who wants one and is able to work, in exchange for a minimum wage. Jobs could be created around infrastructure projects, for example, or care work. Government spending on this enlarged public sector would act like a kind of Keynesian expenditure, to stimulate the economy and buffer the population against the volatility of the private labor market. Modeling suggests that this would be more cost-effective than a basic income (often critiqued for being too expensive) and avoid many of the inflationary perils that might accompany basic income proposals. It could also be used to jump-start sections of the economy that are politically important, like green energy, carbon reduction and infrastructure. A job guarantee could help us collectively decide what kind of work is most urgent and necessary and to prioritize that through democratically accountable representatives.

    Lizzie O'Shea, Future Histories

    Of course, as she points out, there are a number of drawbacks to a job guarantee scheme:

    • Reinforcement of the connection between productivity and human value
    • Creation of 'bullshit jobs'
    • Could deny people chance to engage in other pursuits (if poorly implemented)
    • Potential to leave behind those who cannot work (disability / other health concerns)
    • Seems didactic and disciplinary

    Nevertheless, O'Shea believes that a combination of a job guarantee, UBI, and government-provided services is the way forward:

    Ultimately, we need a combination of these programs. We need the liberty offered by a basic income, the sustainability promised by the organization of a job guarantee, and the protection of dignity offered by centrally planned essential services. It is like a New Deal for the age of automation, a ground rent for the digital revolution, in which the benefits of accelerated productive capacity are shared among everyone. From each according to his ability, to each according to their need - a twenty-first-century vision of socialism. "We have it in our power to begin the world over again," wrote Thomas Paine in an appendix to Common Sense, just before one of the most revolutionary periods in human history. We have a similar opportunity today.

    Lizzie O'Shea, Future Histories

    While I don't disagree that we will continue to need "the protection of dignity offered by centrally planned essential services," I'm not so sure that giving the state so much power over our lives is a good thing. I think this approach papers over the cracks of neoliberalism, giving billionaires and capitalists a get-out-of-jail-free card.

    Instead, I'd like to see a post-pandemic breakup of mega corporations. While a de jure limit on how much one individual or one organisation can be worth is likely to be unworkable, there are ways to make de facto limits a reality.

    People respond to incentives, including how easy or hard it is to do something. I know from experience how easy it is to set up a straightforward limited company in the UK and how difficult it is to set up a co-operative. To get to where we need to be, we need to ensure collective decision-making is the norm within organisations owned by workers. And then these worker-owned organisations need to co-ordinate for the good of everyone.

    I'm a huge believer in decentralisation, not just technologically but politically and socially; we don't need governments, billionaires, or celebrities telling us what to do with our lives. We need to think wider and deeper. My current thinking aligns with this section on libertarian municipalism from the Wikipedia page on the political philosopher Murray Bookchin:

    Libertarian Municipalism constitutes the politics of social ecology, a revolutionary effort in which freedom is given institutional form in public assemblies that become decision-making bodies.

    Wikipedia

    ...or, in other words:

    The overriding problem is to change the structure of society so that people gain power. The best arena to do that is the municipality—the city, town, and village—where we have an opportunity to create a face-to-face democracy.

    Wikipedia

    Some people think that, in these days of super-fast connections to anyone on the planet, nation states are dead, and that we should be building communities on the blockchain. I have yet to see a proposal of how this would be workable in practice; everything I've seen so far extrapolates naïvely from what's technically possible to what should be socially desirable.

    Yes, we can and should have solidarity with people around the world with whom we work and socialise. But this does not negate the importance of decision-making at a local level. Gaming clans don't yet do bin collections, and colleagues in a different country can't fix the corruption riddling your local government.

    Ultimately, then, we're going to need a whole new politics and social contract after the pandemic. I sincerely hope we manage to grasp the nettle and do something radically different. I'm not sure how we'll all survive if the rich, once again, come out of all this even richer than before.


    BONUS: check out this 1978 speech from Murray Bookchin where he calls for utopia, not futurism.


    Enjoy this? Sign up for the weekly roundup, become a supporter, or download Thought Shrapnel Vol.1: Personal Productivity!


    Quotation-as-title from Thomas Paine. Header image by Stas Knop.

    It’s not a revolution if nobody loses

    Thanks to Clay Shirky for today's title. It's true, isn't it? You can't claim something to be a true revolution unless someone, some organisation, or some group of people loses.

    I'm happy to say that it's the turn of some older white men to be losing right now, and particularly delighted that those who have spent decades abusing and repressing people are getting their comeuppance.

    Enough has been written about Epstein and the fallout from it. You can read about comments made by Richard Stallman, founder of the Free Software Foundation, in this Washington Post article. I've only met RMS (as he's known) in person once, at the Indie Tech Summit five years ago, but it wasn't a great experience. While I'm willing to cut visionary people some slack, he mostly acted like a jerk.

    RMS is a revered figure in Free Software circles, and it's actually quite difficult not to agree with his stance on many political and technological matters. That being said, he deserves everything he gets for the comments he made about child abuse, for the way he's treated women over the past few decades, and for his dictator-like approach to software projects.

    In an article for WIRED entitled Richard Stallman’s Exit Heralds a New Era in Tech, Noam Cohen writes that we're entering a new age. I certainly hope so.

    This is a lesson we are fast learning about freedom as it is promoted by the tech world. It is not about ensuring that everyone can express their views and feelings. Freedom, in this telling, is about exclusion. The freedom to drive others away. And, until recently, freedom from consequences.

    After 40 years of excluding those who didn’t serve his purposes, however, Stallman finds himself excluded by his peers. Freedom.

    Maybe freedom, defined in this crude, top-down way, isn’t the be-all, end-all. Creating a vibrant inclusive community, it turns out, is as important to a software project as a coding breakthrough. Or, to put it in more familiar terms—driving away women, investing your hopes in a single, unassailable leader is a critical bug. The best patch will be to start a movement that is respectful, inclusive, and democratic.

    Noam Cohen

    One of the things that the next leaders of the Free Software Movement will have to address is how to take practical steps to guarantee our basic freedoms in a world where Big Tech provides surveillance to ever-more-powerful governments.

    Cory Doctorow is an obvious person to look to in this regard. He has a history of understanding what's going on and writing about it in ways that people understand. In an article for The Globe and Mail, Doctorow notes that a decline in trust of political systems and experts more generally isn't because people are more gullible:

    40 years of rising inequality and industry consolidation have turned our truth-seeking exercises into auctions, in which lawmakers, regulators and administrators are beholden to a small cohort of increasingly wealthy people who hold their financial and career futures in their hands.

    [...]

    To be in a world where the truth is up for auction is to be set adrift from rationality. No one is qualified to assess all the intensely technical truths required for survival: even if you can master media literacy and sort reputable scientific journals from junk pay-for-play ones; even if you can acquire the statistical literacy to evaluate studies for rigour; even if you can acquire the expertise to evaluate claims about the safety of opioids, you can’t do it all over again for your city’s building code, the aviation-safety standards governing your next flight, the food-safety standards governing the dinner you just ordered.

    Cory Doctorow

    What's this got to do with technology, and in particular Free Software?

    Big Tech is part of this problem... because they have monopolies, thanks to decades of buying nascent competitors and merging with their largest competitors, of cornering vertical markets and crushing rivals who won't sell. Big Tech means that one company is in charge of the social lives of 2.3 billion people; it means another company controls the way we answer every question it occurs to us to ask. It means that companies can assert the right to control which software your devices can run, who can fix them, and when they must be sent to a landfill.

    These companies, with their tax evasion, labour abuses, cavalier attitudes toward our privacy and their completely ordinary human frailty and self-deception, are unfit to rule our lives. But no one is fit to be our ruler. We deserve technological self-determination, not a corporatized internet made up of five giant services each filled with screenshots from the other four.

    Cory Doctorow

    Doctorow suggests breaking up these companies to end their de facto monopolies and level the playing field.

    The problem of tech monopolies is something that Stowe Boyd explored in a recent article entitled Are Platforms Commons? Citing precedents around railroads, Boyd asks many questions, including whether successful platforms should be bound by the legal principles of 'common carriers', and finishes with this:

    However, just one more question for today: what if ecosystems were constructed so that they were governed by the participants, rather than by the hypercapitalist strivings of the platform owners — such as Apple, Google, Amazon, Facebook — or the heavy-handed regulators? Is there a middle ground where the needs of the end user and those building, marketing, and shipping products and services can be balanced, and a fair share of the profits are distributed not just through common carrier laws but by the shared economics of a commons, and where the platform orchestrator gets a fair share, as well? We may need to shift our thinking from common carrier to commons carrier, in the near future.

    Stowe Boyd

    The trouble is, simply establishing a commons doesn't solve all of the problems. In fact, what tends to happen next is well known:

    The tragedy of the commons is a situation in a shared-resource system where individual users, acting independently according to their own self-interest, behave contrary to the common good of all users, by depleting or spoiling that resource through their collective action.

    Wikipedia

    An article in The Economist outlines the usual remedies to the 'tragedy of the commons': either governmental regulation (e.g. airspace), or property rights (e.g. land). However, the article cites the work of Elinor Ostrom, a Nobel prizewinning economist, showing that another way is possible:

    An exclusive focus on states and markets as ways to control the use of commons neglects a varied menagerie of institutions throughout history. The information age provides modern examples, for example Wikipedia, a free, user-edited encyclopedia. The digital age would not have dawned without the private rewards that flowed to successful entrepreneurs. But vast swathes of the web that might function well as commons have been left in the hands of rich, relatively unaccountable tech firms.

    [...]

    A world rich in healthy commons would of necessity be one full of distributed, overlapping institutions of community governance. Cultivating these would be less politically rewarding than privatisation, which allows governments to trade responsibility for cash. But empowering commoners could mend rents in the civic fabric and alleviate frustration with out-of-touch elites.

    The Economist

    I count myself as someone on the left of politics, if that's how we're measuring things today. However, I don't think we need representation at any higher level than is strictly necessary.

    In a time when technology allows you, to a great extent, to represent yourself, perhaps we need ways of demonstrating how complex and multi-faceted some issues are? Perhaps we need to try 'liquid democracy':

    Liquid democracy lies between direct and representative democracy. In direct democracy, participants must vote personally on all issues, while in representative democracy participants vote for representatives once in certain election cycles. Meanwhile, liquid democracy does not depend on representatives but rather on a weighted and transitory delegation of votes. Liquid democracy through elections can empower individuals to become sole interpreters of the interests of the nation. It allows for citizens to vote directly on policy issues, delegate their votes on one or multiple policy areas to delegates of their choosing, delegate votes to one or more people, delegated to them as a weighted voter, or get rid of their votes' delegations whenever they please.

    Wikipedia
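
    The delegation mechanism described above can be made concrete with a small sketch. This is a minimal, hypothetical model (the function and variable names are my own, not drawn from any real voting platform), assuming each voter either casts a direct vote or delegates to exactly one other voter, with delegations chaining transitively:

```python
def tally(direct_votes, delegations):
    """Count votes under a simple liquid-democracy model.

    direct_votes: dict mapping voter -> their chosen option
    delegations:  dict mapping voter -> the voter they delegate to
    A voter's weight flows along the delegation chain until it
    reaches someone who voted directly; cyclic delegations that
    never reach a direct vote are discarded.
    """
    totals = {}
    for voter in set(direct_votes) | set(delegations):
        current, seen = voter, set()
        # Follow the delegation chain until we hit a direct vote.
        while current in delegations and current not in direct_votes:
            if current in seen:      # delegation cycle: vote is lost
                current = None
                break
            seen.add(current)
            current = delegations[current]
        if current is not None and current in direct_votes:
            choice = direct_votes[current]
            totals[choice] = totals.get(choice, 0) + 1
    return totals
```

    So `tally({"alice": "yes", "bob": "no"}, {"carol": "alice", "dave": "carol"})` gives `{"yes": 3, "no": 1}`: Alice carries her own vote plus Carol's and Dave's delegated weight. The "transitory" part of the Wikipedia definition would simply mean voters can update the `delegations` mapping at any time before a tally.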

    I think, given the state that politics is in right now, it's well worth a try. The problem, of course, is that the losers would be the political elites, the current incumbents. But, hey, it's not a revolution if nobody loses, right?

    That which we do not bring to consciousness appears in our lives as fate

    Today's title is a quotation from Carl Jung, via a recent issue of New Philosopher magazine. I thought it was a useful frame for a discussion around a few things I've been reading recently, including an untranslatable Finnish word, music and teen internet culture, as well as whether life does indeed get better once you turn forty.

    Let's start with that Finnish word, discussed in Quartzy by Olivia Goldhill:

    At some point in life, all of us get that unexpected call on a Tuesday afternoon that distorts our world and makes everything else irrelevant: There’s been an accident. Or, you need surgery. Or, come home now, he’s dying. We get through that time, somehow, drawing on energy reserves we never knew we had and persevering, despite the exhaustion. There’s no word in English for the specific strength it takes to pull through, but there is a word in Finnish: sisu.

    Olivia Goldhill

    I'm guessing Goldhill is American, as we English have a term for that: Blitz spirit. It's even been invoked as a way of getting us through the vagaries of Brexit! 🙄

    Despite my flippancy, there are, of course, words that are pretty untranslatable between languages. But one thing that unites us no matter what language we speak is music. Interestingly, Alexis Petridis in The Guardian notes that there are teenage musicians making music in their bedrooms that really resonates across language barriers:

    For want of a better name, you might call it underground bedroom pop, an alternate musical universe that feels like a manifestation of a generation gap: big with teenagers – particularly girls – and invisible to anyone over the age of 20, because it exists largely in an online world that tweens and teens find easy to navigate, but anyone older finds baffling or risible. It doesn’t need Radio 1 or what is left of the music press to become popular because it exists in a self-contained community of YouTube videos and influencers; some bedroom pop artists found their music spread thanks to its use in the background of makeup tutorials or “aesthetic” videos, the latter a phenomenon whereby vloggers post atmospheric videos of, well, aesthetically pleasing things.

    Alexis Petridis

    Some people find this scary. I find it completely awesome, but I may be over-compensating now that I've passed 35 years of age. Who wants to listen to and like the same music as everyone else?

    Talking of getting older, there's a saying that "life begins at forty". Well, an article in The Economist would suggest that, on average, the happiness of males in Western Europe doesn't vary that much.

    The Economist: graph showing self-reported happiness levels

    I'd love to know what causes that decline in the former USSR states, and the uptick in the United States. The article isn't particularly forthcoming, which is a shame.

    Perhaps as you get to middle-age there's a realisation that this is pretty much going to be it for the rest of your life. In some places, if you have the respect of your family, friends, and culture, and are reasonably well-off, that's no bad thing. In other cultures, that might be a sobering thought.

    One of the great things about studying Philosophy since my teenage years is that I feel very prepared for getting old. Perhaps that's what's needed here? More philosophical thinking and training? I don't think it would go amiss.


    Also check out:

    • What your laptop-holding position says about you (Quartz at Work) — "Over the past few weeks, we’ve been observing Quartzians in their natural habitat and have tried to make sense of their odd office rituals in porting their laptops from one meeting to the next."
    • Meritocracy doesn’t exist, and believing it does is bad for you (Fast Company) — "Simply holding meritocracy as a value seems to promote discriminatory behavior."
    • Your Body as a Map (Sapiens) — "Reading the human body canvas is much like reading a map. But since we are social beings in complex contemporary situations, the “legend” changes depending on when and where a person looks at the map."