Author: Doug Belshaw

Fascinating Friday Facts

Here are some links I thought I’d share that struck me as interesting:


Header image: Keep out! The 100m² countries – in pictures (The Guardian)

There is no exercise of the intellect which is not, in the final analysis, useless

A quotation from a short story from Jorge Luis Borges’ Labyrinths provides the title for today’s article. I want to dig into the work of danah boyd and the transcript of a talk she gave recently, entitled Agnotology and Epistemological Fragmentation. It helps us understand what’s going on behind the seemingly-benign fascias of social networks and news media outlets.

She explains the title of her talk:

Epistemology is the term that describes how we know what we know. Most people who think about knowledge think about the processes of obtaining it. Ignorance is often assumed to be not-yet-knowledgeable. But what if ignorance is strategically manufactured? What if the tools of knowledge production are perverted to enable ignorance? In 1995, Robert Proctor and Iain Boal coined the term “agnotology” to describe the strategic and purposeful production of ignorance. In an edited volume called Agnotology, Proctor and Londa Schiebinger collect essays detailing how agnotology is achieved. Whether we’re talking about the erasure of history or the undoing of scientific knowledge, agnotology is a tool of oppression by the powerful.

danah boyd

Having already questioned ‘media literacy’ the way it’s currently taught through educational institutions and libraries, boyd explains how the alt-right are streets ahead of educators when it comes to pushing their agenda:

One of the best ways to seed agnotology is to make sure that doubtful and conspiratorial content is easier to reach than scientific material. And then to make sure that what scientific information is available, is undermined. One tactic is to exploit “data voids.” These are areas within a search ecosystem where there’s no relevant data; those who want to manipulate media purposefully exploit these. Breaking news is one example of this.

[…]

Today’s drumbeat happens online. The goal is no longer just to go straight to the news media. It’s to first create a world of content and then to push the term through to the news media at the right time so that people search for that term and receive specific content. Terms like caravan, incel, crisis actor. By exploiting the data void, or the lack of viable information, media manipulators can help fragment knowledge and seed doubt.

danah boyd

Harold Jarche uses McLuhan’s tetrads to understand this visually, commenting: “This is an information war. Understanding this is the first step in fighting for democracy.”

Harold Jarche on Agnotology

We can teach children sitting in classrooms all day about checking URLs and the provenance of the source, but how relevant is that when they’re using YouTube as their primary search engine? Returning to danah boyd:

YouTube has great scientific videos about the value of vaccination, but countless anti-vaxxers have systematically trained YouTube to make sure that people who watch the Center for Disease Control and Prevention’s videos also watch videos asking questions about vaccinations or videos of parents who are talking emotionally about what they believe to be the result of vaccination. They comment on both of these videos, they watch them together, they link them together. This is the structural manipulation of media.

danah boyd

It’s not just the new and the novel. Even things that are relatively obvious to those of us who have spent much of our adult lives online are confusing to older generations. As this article by BuzzFeed News reporter Craig Silverman points out, conspiracy-believing retirees have a disproportionate influence on our democratic processes:

Older people are also more likely to vote and to be politically active in other ways, such as making political contributions. They are wealthier and therefore wield tremendous economic power and all of the influence that comes with it. With more and more older people going online, and future 65-plus generations already there, the online behavior of older people, as well as their rising power, is incredibly important — yet often ignored.

Craig Silverman

So when David Buckingham asks ‘Who needs digital literacy?’ I think the answer is everyone. As a fan of his earlier work, I’m saddened to realise that he hasn’t kept up with the networked era:

These days, I find the notion of digital literacy much less useful – and to some extent, positively misleading. The fundamental problem is that the idea is defined by technology itself. It makes little sense to distinguish between texts (or media) on the grounds of whether they are analogue or digital: almost all media (including print media) involve the use of digital technology at some stage or other. Fake news and disinformation operate as much in old, analogue media (like newspapers) as they do online. Meanwhile, news organisations based in old media make extensive and increasing use of online platforms. The boundaries between digital and analogue may still be significant in some situations, but they are becoming ever more blurred.

David Buckingham

Actually, as Howard Rheingold pointed out a number of years ago in Net Smart, and as boyd has done in her own work, networks change everything. You can’t seriously compare pre-networked and post-networked cultures in any way other than in contrast.

Buckingham suggests that, seeing as the (UK) National Literacy Trust are on the case, we “don’t need to reinvent the wheel”. The trouble is that the wheel has already been reinvented, and lots of people either didn’t notice, or are acting as though it hasn’t been.

There’s a related article by Anna McKie in the THE entitled Teaching intelligence: digital literacy in the ‘alternative facts’ era which, unfortunately, is now behind a paywall. It reports on a special issue of the journal Teaching in Higher Education, in which the editors have brought together papers on the contribution made by Higher Education to expertise and knowledge in the age of ‘alternative facts’:

[S]ocial media has changed the dynamic of information in our society, [editor] Professor Harrison added. “We’ve moved away from the idea of experts who assess information to one where the validity of a statement is based on the likes, retweets and shares it gets, rather than whether the information is valid.”

The first task of universities is to go back to basics and “help students to understand the difference between knowledge and information, and how knowledge is created, which is separate to how information is created”, Professor Harrison said. “Within [each] discipline, what are the skills needed to assess that?”

Many assume that schools or colleges are teaching this, but that is not the case, he added. “Academics should also be wary of the extent to which they themselves understand the new paradigms of knowledge creation,” Professor Harrison warned.

Anna McKie

One of the reasons I decided not to go into academia is that, certain notable exceptions aside, the focus is on explaining rather than changing. Or, to finish with another quotation, this time from Karl Marx, “Philosophers have hitherto only interpreted the world in various ways; the point is to change it.”


Also check out:

Sometimes even to live is an act of courage

Thank you to Seneca for the quotation for today’s title, which sprang to mind after reading Rosie Spinks’ claim in Quartz that we’ve reached ‘peak influencer’.

Where once the social network was basically lunch and sunsets, it’s now a parade of strategically-crafted life updates, career achievements, and public vows to spend less time online (usually made by people who earn money from social media)—all framed with the carefully selected language of a press release. Everyone is striving, so very hard.

Thank goodness for that. The selfie-obsessed influencer brigade is an insidious effect of the neoliberalism that permeates western culture:

For the internet influencer, everything from their morning sun salutation to their coffee enema (really) is a potential money-making opportunity. Forget paying your dues, or working your way up—in fact, forget jobs. Work is life, and getting paid to live your best life is the ultimate aspiration.

[…]

“Selling out” is not just perfectly OK in the influencer economy—it’s the raison d’etre. Influencers generally do not have a craft or discipline to stay loyal to in the first place, and by definition their income comes from selling a version of themselves.

As Yascha Mounk, writing in The Atlantic, explains, the problem isn’t necessarily with social networks themselves; it’s how much we care about them. Social networks flatten everything into a never-ending stream. That stream makes it very difficult to differentiate between gossip and (for example) extremely important things that pose an existential threat to democratic institutions:

“When you’re on Twitter, every controversy feels like it’s at the same level of importance,” one influential Democratic strategist told me. Over time, he found it more and more difficult to tune Twitter out: “People whose perception of reality is shaped by Twitter live in a different world and a different country than those off Twitter.”

It’s easier for me to say these days that our obsession with Twitter and Instagram is unhealthy. While I’ve never used Instagram (because it’s owned by Facebook), a decade ago I was spending hours each week on Twitter. My relationship with the service has changed as I’ve grown older and as Twitter itself has changed, especially after it became a publicly-traded company in 2013.

Twitter, in particular, now feels like a never-ending soap opera, a bit like EastEnders: there’s always some outrage or drama running. Perhaps it’s better, as Catherine Price suggests in The New York Times, just to put down our smartphones?

Until now, most discussions of phones’ biochemical effects have focused on dopamine, a brain chemical that helps us form habits — and addictions. Like slot machines, smartphones and apps are explicitly designed to trigger dopamine’s release, with the goal of making our devices difficult to put down.

This manipulation of our dopamine systems is why many experts believe that we are developing behavioral addictions to our phones. But our phones’ effects on cortisol are potentially even more alarming.

Cortisol is our primary fight-or-flight hormone. Its release triggers physiological changes, such as spikes in blood pressure, heart rate and blood sugar, that help us react to and survive acute physical threats.

Depending on how we use them, social networks can stoke the worst feelings in us: jealousy, anger, and worry. This is not conducive to healthy outcomes, especially for children, in whom stress correlates directly with the take-up of addictive substances and with heart disease in later life.

I wonder how future generations will look back on this period.


Also check out:

Anything invented after you’re thirty-five is against the natural order of things

This post is locked for seven days, but accessible to supporters of Thought Shrapnel right now!

The smallest deed is better than the greatest intention

Thanks to John Burroughs for today’s title. For me, it’s an oblique reference to some of the situations I find myself in, both in my professional and personal life. After all, words are cheap and actions are difficult.

I’m going to take the unusual step of quoting someone who’s quoting me. In this case, it’s Stephen Downes picking up on a comment I made in the cc-openedu Google Group. I’d link directly to my comments, but for some reason a group about open education is… closed?

I’d like to echo a point David Kernohan made when I worked with him on the Jisc OER programme. He said: “OER is a supply-side term”. Let’s face it, there are very few educators specifically going out and looking for “Openly Licensed Resources”. What they actually want are resources that they can access for free (or at a low cost) and that they can legally use. We’ve invented OER as a term to describe that, but it may actually be unhelpfully ambiguous.

Shortly after posting that, I read this post from Sarah Lambert on the GO-GN (Global OER Graduate Network) blog. She says:

[W]hile we’re being all inclusive and expanding our “open” to encompass any collaborative digital practice, then our “open” seems to be getting less and less distinctive. To the point where it’s getting quite easily absorbed by the mainstream higher education digital learning (eLearning, Technology Enhanced Learning, ODL, call it what you will). Is it a win for higher education to absorb and assimilate “open” (and our gift labour) as the latest innovation feeding the hungry marketised university that Kate Bowles spoke so eloquently about? Is it a problem if not only the practice, but the research field of open education becomes inseparable with mainstream higher education digital learning research?

My gloss on this is that ‘open education’ may finally have moved into the area of productive ambiguity. I talked about this back in 2016 in a post on a blog I post to only very infrequently, so I might as well quote myself again:

Ideally, I’d like to see ‘open education’ move into the realm of what I term productive ambiguity. That is to say, we can do some work with the idea and start growing the movement beyond small pockets here and there. I’m greatly inspired by Douglas Rushkoff’s new Team Human podcast at the moment, feeling that it’s justified the stance that I and others have taken for using technology to make us more human (e.g. setting up a co-operative) and against the reverse (e.g. blockchain).

That’s going to make a lot of people uncomfortable, and hopefully uncomfortable enough to start exploring new, even better areas. ‘Open Education’ now belongs, for better or for worse, to the majority. Whether that’s ‘Early majority’ or ‘Late majority’ on the innovation adoption lifecycle curve probably depends where in the world you live.

Diffusion of innovation curve
CC BY Pnautilus (Wikipedia)

Things change and things move on. The reason I used that xkcd cartoon about IRC at the top of this post is that there has been much (OK, some) talk about Mozilla ending its use of IRC.

While we still use it heavily, IRC is an ongoing source of abuse and harassment for many of our colleagues and getting connected to this now-obscure forum is an unnecessary technical barrier for anyone finding their way to Mozilla via the web. Available interfaces really haven’t kept up with modern expectations, spambots and harassment are endemic to the platform, and in light of that it’s no coincidence that people trying to get in touch with us from inside schools, colleges or corporate networks are finding that often as not IRC traffic isn’t allowed past institutional firewalls at all.

Cue much hand-wringing from the die-hards in the Mozilla community. Unfortunately, Slack, which originally had a bridge/gateway for IRC, has pulled up the drawbridge on that front. Mozilla could go with something like Mattermost, but given recent history I bet they choose Discord (or similar).

As Seth Godin points out in his most recent podcast episode, everyone wants to be described as ‘supple’; nobody wants to be described as ‘brittle’. Yet the actions we take suggest otherwise. We expect that, just because the change we see in the world isn’t convenient, we can somehow slow it down. Nope: you just have to roll with it, whether that’s changing technologies or adopting different approaches to organising ideas and people.


Also check out:

  • Do Experts Listen to Other Experts? (Marginal Revolution) — “very little is known about how experts influence each others’ opinions, and how that influence affects final evaluations.”
  • Why Symbols Aren’t Forever (Sapiens) — “The shifting status of cultural symbols reveals a lot about who we are and what we value.”
  • Balanced Anarchy or Open Society? (Kottke.org) — “Personal computing and the internet changed (and continues to change) the balance of power in the world so much and with such speed that we still can’t comprehend it.”

A little Friday randomness

Not everything I read and bookmark to come back to is serious. So here, for the sake of a little levity, are some things I’ve discovered recently that either made me smile or made me think “that’s cool”:


Header image: xkcd

Educational institutions are at a crossroads of relevance

One of the things that attracted me to the world of Open Badges and digital credentialing back in 2011 was the question of relevance. As a Philosophy graduate, I’m absolutely down with the idea of a broad, balanced education, and learning as a means of human flourishing.

However, in a world where we measure schools, colleges, and universities through an economic lens, it’s inevitable that learners do so too. As I’ve said in presentations and to clients many times, I want my children to choose to go to university because it’s the right choice for them, not because they have to.

In an article in Forbes, Brandon Busteed notes that we’re on the verge of a huge change in Higher Education:

This shift will go down as the biggest disruption in higher education whereby colleges and universities will be disintermediated by employers and job seekers going direct. Higher education won’t be eliminated from the model; degrees and other credentials will remain valuable and desired, but for a growing number of young people they’ll be part of getting a job as opposed to college as its own discrete experience. This is already happening in the case of working adults and employers that offer college education as a benefit. But it will soon be true among traditional age students. Based on a Kaplan University Partners-QuestResearch study I led and which was released today, I predict as many as one-third of all traditional students in the next decade will “Go Pro Early” in work directly out of high school with the chance to earn a college degree as part of the package.

This is true to some degree in the UK as well, through Higher Apprenticeships. University study becomes a part-time deal with the ‘job’ paying for fees. It’s easy to see how this could quickly become a two-tier system for rich and poor.

A “job-first, college included model” could well become one of the biggest drivers of both increasing college completion rates in the U.S. and reducing the cost of college. In the examples of employers offering college degrees as benefits, a portion of the college expense will shift to the employer, who sees it as a valuable talent development and retention strategy with measurable return on investment benefits. This is further enhanced through bulk-rate tuition discounts offered by the higher educational institutions partnering with these employers. Students would still be eligible for federal financial aid, and they’d be making an income while going to college. To one degree or another, this model has the potential to make college more affordable for more people, while lowering or eliminating student loan debt and increasing college enrollments. It would certainly help bridge the career readiness gap that many of today’s college graduates encounter.

The ‘career readiness’ that Busteed discusses here is an interesting concept, and one that I think has been invented by employers who don’t want to foot the bill for training. Certainly, my parents’ generation weren’t supposed to be immediately ready for employment straight after their education — and, of course, they weren’t saddled with student debt, either.

Related, in my mind, is the way that we treat young people as data to be entered on a spreadsheet. This is managerialism at its worst. Back when I was a teacher and a form tutor, I remember how sorry I felt for the young people in my charge, who were effectively moved around a machine for ‘processing’ them.

Now, in an article for The Guardian, Jeremy Hannay tells it like it is for those who don’t have an insight into the Kafkaesque world of schools:

Let me clear up this edu-mess for you. It’s not Sats. It’s not workload. The elephant in the room is high-stakes accountability. And I’m calling bullshit. Our education system actively promotes holding schools, leaders and teachers at gunpoint for a very narrow set of test outcomes. This has long been proven to be one of the worst ways to bring about sustainable change. It is time to change this educational paradigm before we have no one left in the classroom except the children.

Just as our dog-eat-dog society in the UK could be much more collaborative, so our education system badly needs remodelling. We’ve deprofessionalised teaching and introduced a managerial culture. Things could be different, as they are elsewhere in the world.

In such systems – and they do exist in some countries, such as Finland and Canada, and even in some brave schools in this country – development isn’t centred on inspection, but rather professional collaboration. These schools don’t perform regular observations and monitoring, or fire out over-prescriptive performance policies. Instead, they discuss and design pedagogy, engage in action research, and regularly perform activities such as learning and lesson study. Everyone understands that growing great educators involves moments of brilliance and moments of mayhem.

That’s the key: “moments of brilliance and moments of mayhem”. Ironically, bureaucratic, hierarchical systems cannot cope with amazing teachers, because they’re to some extent unpredictable. You can’t put them in a box (on a spreadsheet).

Actually, perhaps it’s not the hierarchy per se, but the power dynamics, as Richard D. Bartlett points out in this post.

Yes, when a hierarchical shape is applied to a human group, it tends to encourage coercive power dynamics. Usually the people at the top are given more importance than the rest. But the problem is the power, not the shape. 

What we’re doing is retro-fitting the worst forms of corporate power dynamics onto education and expecting everything to be fine. Newsflash: learning is different to work, and always will be.

Interestingly, Bartlett defines three different forms of power dynamics, which I think is enlightening:

Follett coined the terms “power-over” and “power-with” in 1924. Starhawk adds a third category “power-from-within”. These labels provide three useful lenses for analysing the power dynamics of an organisation. With apologies to the original authors, here’s my definitions:

  • power-from-within or empowerment — the creative force you feel when you’re making art, or speaking up for something you believe in
  • power-with or social power — influence, status, rank, or reputation that determines how much you are listened to in a group
  • power-over or coercion — power used by one person to control another

The problem with educational institutions, I feel, is that we’ve largely done away with empowerment and social power, and put all of our eggs in the basket of coercion.


Also check out:

  • Working collaboratively and learning cooperatively (Harold Jarche) — “Two types of behaviours are necessary in the network era workplace — collaboration and cooperation. Cooperation is not the same as collaboration, though they are complementary.”
  • Learning Alignment Model (Tom Barrett) — “It is not a step by step process to design learning, but more of a high-level thinking model to engage with that uncovers some interesting potential tensions in our classroom work.”
  • A Definition of Academic Innovation (Inside Higher Ed) — “What if academic innovation was built upon the research and theory of our field, incorporating social constructivist, constructionist and activity theory?”

Remote work is a different beast

You might not work remotely right now, but the chances are that at some point in your career, and in some capacity, you will do. Remote work has its own challenges and benefits, which are alluded to in three articles in Fast Company that I want to highlight. The first is an article summarising a survey Google performed amongst 5,600 of its remote workers.

On the outset of the study, the team hypothesized that distributed teams might not be as productive as their centrally located counterparts. “We were a little nervous about that,” says [Veronica] Gilrane [manager of Google’s People Innovation Lab]. She was surprised to find that distributed teams performed just as well. Unfortunately, she also found that there is a lot more frustration involved in working remotely. Workers in other offices can sometimes feel burdened to sync up their schedules with the main office. They can also feel disconnected from the team.

That doesn’t surprise me at all. Even though I probably spend less time AFK (Away From Keyboard) as a remote worker than I would in an office, there’s not that performative element where you have to look like you’re working. Sometimes work doesn’t look like work; it looks like going for a run to think about a problem, or bouncing an idea off a neighbour as you walk back to your office with a cup of tea.

The main thing, as this article points out, is to have an approach that focuses on results rather than on time spent doing the work. You do have to have some process, though:

[I]t’s imperative that you stress disciplinary excellence; workers at home don’t have a manager peering over their shoulder, so they have to act as their own boss and maintain a strict schedule to get things done. Don’t try to dictate every aspect of their lives–remote work is effective because it offers workers flexibility, after all. Nonetheless, be sure that you’re requesting regular status updates, and that you have a system in place to measure productivity.

Fully-remote working is different to ‘working from home’ a day or two per week. It does take discipline, if only to stop raiding the biscuit tin. But it’s also a different mindset, including intentionally sharing your work much more than you’d do in a co-located setting.

Fundamentally, as Greg Galant, CEO of a fully-remote organisation, comments, it’s about trust:

“My friends always say to me, ‘How do you know if anyone is really working?’ and I always ask them, ‘How do you know if anybody is really working if they are at the office?’” says Galant. “Because the reality is, you can see somebody at their desk and they can stay late, but that doesn’t mean they’re really working.”

[…]

If managers are adhering to traditional management practices, they’re going to feel anxiety with remote teams. They’re going to want to check in constantly to make sure people are working. But checking in constantly prevents work from getting done.

Remote work is strange and difficult to describe to anyone who hasn’t experienced it. You can, for example, in the same day feel isolated and lonely, while simultaneously getting annoyed with all of the ‘pings’ and internal communication coming at you.

At the end of the day, companies need to set expectations, and remote workers need to set boundaries. It’s the only way to avoid burnout, and to ensure that what can be a wonderful experience doesn’t turn into a nightmare.


Also check out:

  • 5 Great Resources for Remote Workers (Product Hunt) — “If you’re a remote worker or spend part of your day working from outside of the office, the following tools will help you find jobs, discover the best cities for remote workers, and learn from people who have built successful freelance careers or location-independent companies.”
  • Stop Managing Your Remote Workers As If They Work Onsite (ThinkGrowth) — “Managers need to back away from their conventional views of what “working hard” looks like and instead set specific targets, explain what success looks like, and trust the team to get it done where, when, and however works best for them.”
  • 11 Tools That Allow us to Work from Anywhere on Earth as a Distributed Company (Ghost) — “In an office, the collaboration tools you use are akin to a simple device like a screwdriver. They assist with difficult tasks and lessen the amount of effort required to complete them. In a distributed team, the tools you use are more like life-support. Everything to do with distributed team tools is about clawing back some of that contextual awareness which you’ve lost by not being in the same space.”

Culture eats strategy for breakfast

The title of this post is a quotation from management consultant, educator, and author Peter Drucker. Having worked in a variety of organisations, I can attest to its truth.

That’s why, when someone shared this post by Grace Krause, which is basically a poem about work culture, I paid attention. Entitled Appropriate Channels, here’s a flavour:

We would like to remind you all
That we care deeply
About our staff and our students
And in no way do we wish to silence criticism
But please make use of the
Appropriate Channels

The Appropriate Channel is tears cried at home
And not in the workplace
Please refrain from crying at your desk
As it might lower the productivity of your colleagues

Organisational culture is difficult because of the patriarchy. I selected this part of the poem, as I’ve come to realise just how problematic it is to let people know (through words, actions, or policies) that it’s not OK to cry at work. If we’re to bring our full selves to work, then emotion is part of it.

Any organisation has a culture, and that culture can be changed, for better or for worse. Restaurants are notoriously toxic places to work, which is why this article in Quartz is interesting:

Since four-time James Beard award winner Gabrielle Hamilton opened Prune’s doors in 1999, she, along with her co-chef Ashley Merriman, have established a set of principles that help guide employees at the restaurant. According to Hamilton and Merriman, the code has a kind of transformative power. It’s helped the kitchen avoid becoming a hierarchical, top-down fiefdom—a concentration of power that innumerable chefs have abused in the past. It can turn obnoxious, entitled patrons into polite diners who are delighted to have a seat at the table. And it’s created the kind of environment where Hamilton and Merriman, along with their staff, want to spend much of their day.

The five core values of their restaurant, which I think you could apply to any organisation, are:

  1. Be thorough and excellent in everything that you do
  2. Be smart and funny
  3. Be disarmingly honest
  4. Work without division of any kind
  5. Practise servant leadership

We live in the ‘age of burnout’, according to another article in Quartz, but there’s no reason why we can’t love the work we do. It’s all about finding the meaning behind the stuff we get done on a daily basis:

Our freedom to make meaning is both a blessing and a curse. To get somewhat existential about it, “work,” and the problems associated with it as an amorphous whole, do not exist: For the individual, only his or her work exists, and the individual is in control of that, with the very real power radically to change the situation. You could start the process of changing your job right now, today. Yes, arguments about the practicality of that choice well up fast and high. Yes, you would have to find another way to pay the bills. That doesn’t negate the fact that, fundamentally, you are free.

It’s important to remember this, that we choose to do the work we do, that we don’t have to work for a single employer, and that we can tell a different story about ourselves at any point we choose. It might not be easy, but it’s certainly doable.


Also check out:

Things that people think are wrong (but aren’t)

I’ve collected a bunch of diverse articles that seem to be around the topic of things that people think are wrong, but aren’t really. Hence the title.

I’ll start with something that everyone over a certain age seems to have a problem with, except for me: sleep. BBC Health lists a number of common sleep myths:

  1. You can cope on less than five hours’ sleep
  2. Alcohol before bed boosts your sleep
  3. Watching TV in bed helps you relax
  4. If you’re struggling to sleep, stay in bed
  5. Hitting the snooze button
  6. Snoring is always harmless

My smartband regularly tells me that I sleep better than 93% of people, and I think that’s because of how much I prioritise sleep. I’ve also got a system, which I’ve written about before, for the times when I do have a rough night.

I like routine, but I also like mixing things up, which is why I appreciate chunks of time at home interspersed with travel. Oliver Burkeman, writing in The Guardian, suggests, however, that routines aren’t the be-all and end-all:

Some people are so disorganised that a strict routine is a lifesaver. But speaking as a recovering rigid-schedules addict, trust me: if you click excitedly on each new article promising the perfect morning routine, you’re almost certainly not one of those people. You’re one of the other kind – people who’d benefit from struggling less to control their day, responding a bit more intuitively to the needs of the moment. This is the self-help principle you might call the law of unwelcome advice: if you love the idea of implementing a new technique, it’s likely to be the opposite of what you need.

Expecting something new to solve an underlying problem is a symptom of our culture’s focus on the new and novel. While there’s so much stuff out there we haven’t experienced, should we spend our lives seeking it out to the detriment of the tried and tested, the things that we really enjoy?

On the recommendation of my wife, I recently listened to a great episode of the Off Menu podcast featuring Victoria Coren Mitchell. It’s not only extremely entertaining, but she mentions how, for her, a nice ploughman’s lunch is better than some fancy meal.

This brings me to an article in The Atlantic by Joe Pinsker, who writes that kids who watch and re-watch the same film might be on to something:

In general, psychological and behavioral-economics research has found that when people make decisions about what they think they’ll enjoy, they often assign priority to unfamiliar experiences—such as a new book or movie, or traveling somewhere they’ve never been before. They are not wrong to do so: People generally enjoy things less the more accustomed to them they become. As O’Brien [professor at the University of Chicago’s Booth School of Business] writes, “People may choose novelty not because they expect exceptionally positive reactions to the new option, but because they expect exceptionally dull reactions to the old option.” And sometimes, that expected dullness might be exaggerated.

So there’s something to be said for re-reading novels you read when you were younger instead of something shortlisted for a prize or discounted in the local bookshop. I recently found re-reading Dostoevsky’s Crime and Punishment exhilarating, as I probably hadn’t read it since becoming a parent. Different periods of your life put different spins on things that you think you already know.


Also check out:

  • The ‘Dark Ages’ Weren’t As Dark As We Thought (Literary Hub) — “At the back of our minds when thinking about the centuries when the Roman Empire mutated into medieval Europe we are unconsciously taking on the spurious guise of specific communities.”
  • An Easy Mode Has Never Ruined A Game (Kotaku) — “There are myriad ways video games can turn the dials on various systems to change our assessment of how “hard” they seem, and many developers have done as much without compromising the quality or integrity of their games.”
  • Millennials destroyed the rules of written English – and created something better (Mashable) — “For millennials who conduct so many of their conversations online, this creativity with written English allows us to express things that would previously only have been conveyed through volume, cadence, tone, or body language.”