Author: Doug Belshaw

To lose old styles of reading is to lose a part of ourselves

Sometimes I think we’re living in the end times:

Out for dinner with another writer, I said, “I think I’ve forgotten how to read.”

“Yes!” he replied, pointing his knife. “Everybody has.”

“No, really,” I said. “I mean I actually can’t do it any more.”

He nodded: “Nobody can read like they used to. But nobody wants to talk about it.”

I wrote my doctoral thesis on digital literacies. There was a real sense in the 1990s that reading on screen was very different to reading on paper. We’ve kind of lost that sense of difference, and I think perhaps we need to regain it:

For most of modern life, printed matter was, as the media critic Neil Postman put it, “the model, the metaphor, and the measure of all discourse.” The resonance of printed books – their lineal structure, the demands they make on our attention – touches every corner of the world we’ve inherited. But online life makes me into a different kind of reader – a cynical one. I scrounge, now, for the useful fact; I zero in on the shareable link. My attention – and thus my experience – fractures. Online reading is about clicks, and comments, and points. When I take that mindset and try to apply it to a beaten-up paperback, my mind bucks.

We don’t really talk about ‘hypertext’ any more, as it’s almost the default type of text we read. As such, reading on paper doesn’t really prepare us for it:

For a long time, I convinced myself that a childhood spent immersed in old-fashioned books would insulate me somehow from our new media climate – that I could keep on reading and writing in the old way because my mind was formed in pre-internet days. But the mind is plastic – and I have changed. I’m not the reader I was.

Me too. I train myself to read longer articles through mechanisms such as writing Thought Shrapnel posts and newsletters each week. But I don’t read like I used to; I read for utility rather than for pleasure, or just for the sake of it.

The suggestion that, in a few generations, our experience of media will be reinvented shouldn’t surprise us. We should, instead, marvel at the fact we ever read books at all. Great researchers such as Maryanne Wolf and Alison Gopnik remind us that the human brain was never designed to read. Rather, elements of the visual cortex – which evolved for other purposes – were hijacked in order to pull off the trick. The deep reading that a novel demands doesn’t come easy and it was never “natural.” Our default state is, if anything, one of distractedness. The gaze shifts, the attention flits; we scour the environment for clues. (Otherwise, that predator in the shadows might eat us.) How primed are we for distraction? One famous study found humans would rather give themselves electric shocks than sit alone with their thoughts for 10 minutes. We disobey those instincts every time we get lost in a book.

It’s funny. We’ve such a connection with books, but for most of human history we’ve done without them:

Literacy has only been common (outside the elite) since the 19th century. And it’s hardly been crystallized since then. Our habits of reading could easily become antiquated. The writer Clay Shirky even suggests that we’ve lately been “emptily praising” Tolstoy and Proust. Those old, solitary experiences with literature were “just a side-effect of living in an environment of impoverished access.” In our online world, we can move on. And our brains – only temporarily hijacked by books – will now be hijacked by whatever comes next.

There are several theses in all of this around fake news, the role of reading in a democracy, and how information spreads. For now, I continue to be amazed at the web’s effect on the fabric of societies.

Source: The Globe and Mail

Issue #292: Is there a cure for Tasmania? 🇦🇺

The latest issue of the newsletter hit inboxes earlier today!

💥 Read

🔗 Subscribe

Does the world need interactive emails?

I’m on the fence about this. On the one hand, email is an absolute bedrock of the internet, a common federated standard that we can rely upon independent of technological factionalism. On the other hand, so long as interactivity is built into a standard others can adopt, it could be pretty cool.

The author of this article really doesn’t like Google’s idea of extending AMP (Accelerated Mobile Pages) to the inbox:

See, email belongs to a special class. Nobody really likes it, but it’s the way nobody really likes sidewalks, or electrical outlets, or forks. It’s not that there’s something wrong with them. It’s that they’re mature, useful items that do exactly what they need to do. They’ve transcended the world of likes and dislikes.

Fair enough, but as a total convert to Google’s ‘Inbox’ app both on the web and on mobile, I don’t think we can stop innovation in this area:

Emails are static because messages are meant to be static. The entire concept of communication via the internet is based around the telegraphic model of exchanging one-way packets with static payloads, the way the entire concept of a fork is based around piercing a piece of food and allowing friction to hold it in place during transit.

Are messages ‘meant to be static’? I’m not so sure. Books were ‘meant to’ be paper-based until ebooks came along, and now there are all kinds of things we can do with ebooks that we can’t do with their dead-tree equivalents.

Why do this? Are we running out of tabs? Were people complaining that clicking “yes” on an RSVP email took them to the invitation site? Were they asking to have a video chat window open inside the email with the link? No. No one cares. No one is being inconvenienced by this aspect of email (inbox overload is a different problem), and no one will gain anything by changing it.

Although it’s an entertaining read, if ‘why do this?’ is the only argument the author, Devin Coldewey, has against an attempted innovation in this space, then my answer would be ‘why not?’ Coldewey points to the shutdown of Google Reader as an example of Google ‘forcing’ everyone onto algorithmic news feeds, but I’m not sure things were, or are, as simple as that.

It sounds a little simplistic to say so, but people either like and value something and therefore use it, or they don’t. We who like and uphold standards need to remember that, instead of thinking about what people and organisations should and shouldn’t do.

Source: TechCrunch

The Kano model

Using the example of customised home pages from the early days of Flickr, this article helps break down how to delight users:

Years ago, we came across the work of Noriaki Kano, a Japanese expert in customer satisfaction and quality management. In studying his writing, we learned about a model he created in the 1980s, known as the Kano Model.

The article does a great job of explaining how you can implement features well and still not get users excited:

Capabilities that users expect will frustrate those users when they don’t work. However, when they work well, they don’t delight those users. A basic expectation, at best, can reach a neutral satisfaction, a point where it, in essence, becomes invisible to the user.

Try as it might, Google’s development team can only reduce the file-save problems to the point of it working 100% of the time. However, users will never say, “Google Docs is an awesome product because it saves my documents so well.” They just expect files to always be saved correctly.

So it’s a process of continual improvement, and marginal gains in some areas:

One of the predictions that the Kano Model makes is that once customers become accustomed to excitement generator features, those features are not as delightful. The features initially become part of the performance payoff and then eventually migrate to basic expectations.
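
To make that concrete, here’s a toy sketch in Python (my own illustration, not from the article; the categories are Kano’s, but the numeric scales are assumptions) of how the three kinds of feature map implementation quality to satisfaction:

```python
# Toy sketch of the Kano model (my illustration, not from the article).
# `quality` is how well a feature works, 0.0 (absent/broken) to 1.0 (perfect);
# the result is satisfaction, -1.0 (frustrated) to +1.0 (delighted).

def satisfaction(category: str, quality: float) -> float:
    if category == "basic":
        # Broken basics frustrate; perfect basics merely reach
        # neutral and become invisible to the user.
        return quality - 1.0          # range: -1.0 .. 0.0
    if category == "performance":
        # The more, the better: satisfaction scales with quality.
        return 2.0 * quality - 1.0    # range: -1.0 .. +1.0
    if category == "excitement":
        # Unexpected delighters: absence costs nothing, and even
        # partial delivery generates delight.
        return quality                # range: 0.0 .. +1.0
    raise ValueError(f"unknown Kano category: {category!r}")

# Google Docs' file saving is a basic expectation: working 100% of
# the time still only earns a neutral shrug.
assert satisfaction("basic", 1.0) == 0.0
# Flickr's customised home page was an excitement generator.
assert satisfaction("excitement", 1.0) == 1.0
```

On this reading, the ‘migration’ the article predicts is a feature being re-scored over time: first as an excitement generator, then as performance payoff, and finally as a basic expectation.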

Lots to think about here, particularly with Project MoodleNet.

Source: UIE

Is the gig economy the mass exploitation of millennials?

The answer is, “yes, probably”.

If the living wage is a pay scale calculated to be that of an appropriate amount of money to pay a worker so they can live, how is it possible, in a legal or moral sense, to pay someone less? We are witnessing a concerted effort to devalue labour, where the primary concern of business is profit, not the economic wellbeing of its employees.

The ‘sharing economy’ and ‘gig economy’ are nothing of the sort. They’re a problematic and highly disingenuous way for employers to not care about the people who create value in their business.

The employer washes their hands of the worker. Their immediate utility is the sole concern. From a profit point of view, absolutely we can appreciate the logic. However, we forget that the worker also exists as a member of society, and when business is allowed to use and exploit people in this manner, we endanger societal cohesiveness.

The problem, of course, is late-stage capitalism:

The neoliberal project has encouraged us to adopt a hyper-individualistic approach to life and work. For all the speak of teamwork, in this economy the individual reigns supreme and it is destroying young workers. The present system has become unfeasible. The neoliberal project needs to be reeled back in. The free market needs a firm hand because the invisible one has lost its grip.

And the alternative? Co-operation.

Source: The Irish Times

Humans are not machines

Can we teach machines to be ‘fully human’? It’s a fascinating question, as it makes us think carefully about what it actually means to be a human being.

Humans aren’t just about inputs and outputs. There are some things that we ‘know’ in different ways. Take music, for example.

In philosophy, it’s common to describe the mind as a kind of machine that operates on a set of representations, which serve as proxies for worldly states of affairs, and get recombined ‘offline’ in a manner that’s not dictated by what’s happening in the immediate environment. So if you can’t consciously represent the finer details of a guitar solo, the way is surely barred to having any grasp of its nuances. Claiming that you have a ‘merely visceral’ grasp of music really amounts to saying that you don’t understand it at all. Right?

There are activities we do and actions we perform that aren’t the result of conscious thought. What status do we give them?

Getting swept up in a musical performance is just one among a whole host of familiar activities that seem less about computing information, and more about feeling our way as we go: selecting an outfit that’s chic without being fussy, avoiding collisions with other pedestrians on the pavement, or adding just a pinch of salt to the casserole. If we sometimes live in the world in a thoughtful and considered way, we go with the flow a lot, too.

What sets humans apart from animals is the ability to plan and to pay attention to abstract things and ideas:

Now, the world contains many things that we can’t perceive. I am unlikely to find a square root in my sock drawer, or to spot the categorical imperative lurking behind the couch. I can, however, perceive concrete things, and work out their approximate size, shape and colour just by paying attention to them. I can also perceive events occurring around me, and get a rough idea of their duration and how they relate to each other in time. I hear that the knock at the door came just before the cat leapt off the couch, and I have a sense of how long it took for the cat to sidle out of the room.

Time is one of the most abstract of the day-to-day things we deal with as humans:

Our conscious experience of time is philosophically puzzling. On the one hand, it’s intuitive to suppose that we perceive only what’s happening right now. But on the other, we seem to have immediate perceptual experiences of motion and change: I don’t need to infer from a series of ‘still’ impressions of your hand that it is waving, or work out a connection between isolated tones in order to hear a melody. These intuitions seem to contradict each other: how can I perceive motion and change if I am only really conscious of what’s occurring now? We face a choice: either we don’t really perceive motion and change, or the now of our perception encompasses more than the present instant – each of which seems problematic in its own way. Philosophers such as Franz Brentano and Edmund Husserl, as well as a host of more recent commentators, have debated how best to solve the dilemma.

So where does that leave us in terms of the differences between humans and machines?

Human attempts at making sense of the world often involve representing, calculating and deliberating. This isn’t the kind of thing that typically goes on in the 55 Bar, nor is it necessarily happening in the Lutheran church just down the block, or on a muddy football pitch in a remote Irish village. But gathering to make music, play games or engage in religious worship are far from being mindless activities. And making sense of the world is not necessarily just a matter of representing it.

To me, that last sentence is key: the world isn’t just representations. It’s deeper and more visceral than that.

Source: Aeon

Legislating against manipulated ‘facts’ is a slippery slope

In this day and age it’s hard to know who to trust. I was raised to trust in authority, but was particularly struck, when I did a deep-dive into Vinay Gupta’s blog, by the idea that the state is special only because it holds a monopoly on (legal) violence.

As an historian, I’m all too aware of the times that the state (usually represented by a monarch) has served to repress its citizens and subjects. At least it could pretend that it was protecting the majority of the people. As this article states:

Lies masquerading as news are as old as news itself. What is new today is not fake news but the purveyors of such news. In the past, only governments and powerful figures could manipulate public opinion. Today, it’s anyone with internet access. Just as elite institutions have lost their grip over the electorate, so their ability to act as gatekeepers to news, defining what is and is not true, has also been eroded.

So in the interaction between social networks such as Facebook, Twitter, and Instagram on the one hand, and various governments on the other, both sides are interested in power, not the people. Or even any notion of truth, it would seem:

This is why we should be wary of many of the solutions to fake news proposed by European politicians. Such solutions do little to challenge the culture of fragmented truths. They seek, rather, to restore more acceptable gatekeepers – for Facebook or governments to define what is and isn’t true. In Germany, a new law forces social media sites to take down posts spreading fake news or hate speech within 24 hours or face fines of up to €50m. The French president, Emmanuel Macron, has promised to ban fake news on the internet during election campaigns. Do we really want to rid ourselves of today’s fake news by returning to the days when the only fake news was official fake news?

We need to be vigilant. Those we trust today may not be trustworthy tomorrow.

Source: The Guardian

"Things always become obvious after the fact." (Nassim Nicholas Taleb)

Why we forget most of what we read

I read a lot of stuff, and I remember random bits of it. I used to be reasonably disciplined about bookmarking stuff, but then realised I hardly ever went back through my bookmarks. So, instead, I try to use what I read, which is kind of the reason for Thought Shrapnel…

Surely some people can read a book or watch a movie once and retain the plot perfectly. But for many, the experience of consuming culture is like filling up a bathtub, soaking in it, and then watching the water run down the drain. It might leave a film in the tub, but the rest is gone.

Well, indeed. Nice metaphor.

In the internet age, recall memory—the ability to spontaneously call information up in your mind—has become less necessary. It’s still good for bar trivia, or remembering your to-do list, but largely, [Jared Horvath, a research fellow at the University of Melbourne] says, what’s called recognition memory is more important. “So long as you know where that information is at and how to access it, then you don’t really need to recall it,” he says.

Exactly. You need to know how to find that article you read that backs up the argument you’re making. You don’t need to remember all of the details. Search skills are really important.

One study showed that people who binged a Netflix series recalled far fewer details about its episodes than those who spaced the episodes out. I guess that’s unsurprising.

People are binging on the written word, too. In 2009, the average American encountered 100,000 words a day, even if they didn’t “read” all of them. It’s hard to imagine that’s decreased in the nine years since. In “Binge-Reading Disorder,” an article for The Morning News, Nikkitha Bakshani analyzes the meaning of this statistic. “Reading is a nuanced word,” she writes, “but the most common kind of reading is likely reading as consumption: where we read, especially on the internet, merely to acquire information. Information that stands no chance of becoming knowledge unless it ‘sticks.’”

For anyone who knows about spaced learning, the conclusions are pretty obvious:

The lesson from his binge-watching study is that if you want to remember the things you watch and read, space them out. I used to get irritated in school when an English-class syllabus would have us read only three chapters a week, but there was a good reason for that. Memories get reinforced the more you recall them, Horvath says. If you read a book all in one stretch—on an airplane, say—you’re just holding the story in your working memory that whole time. “You’re never actually reaccessing it,” he says.
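
As an aside, the ‘space them out’ advice is the intuition behind expanding-interval review schedules. Here’s a minimal sketch (my own illustration, not from the article or Horvath’s study) of one such schedule, where each successful recall reinforces the memory so the next re-access can wait longer:

```python
from datetime import date, timedelta

# Illustrative expanding-interval scheduler (a sketch, not from the
# article): the gap before each review doubles, which is the intuition
# behind 'spacing out' what you read rather than bingeing it.

def review_dates(start: date, reviews: int, first_gap_days: int = 1) -> list[date]:
    """Dates on which to re-access the material, with doubling gaps."""
    dates = []
    gap = first_gap_days
    current = start
    for _ in range(reviews):
        current += timedelta(days=gap)
        dates.append(current)
        gap *= 2  # each successful recall lets the next review wait longer
    return dates

# Read something today, then re-access it with gaps of 1, 2, 4 and 8
# days between reviews, instead of holding the whole thing in working
# memory in one long stretch.
for d in review_dates(date.today(), reviews=4):
    print(d)
```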

So if you apply what you learn, you’re re-accessing it and putting it to work. Hence this post!

Source: The Atlantic (via e180)

Should you lower your expectations?

“Aim for the stars and maybe you’ll hit the treetops” was always the kind of advice I was given when I was younger. But extremely high expectations of oneself are not always a great thing. We have to learn that we’ve got limits. Some are physical, some are mental, and some are cultural:

The problem with placing too much emphasis on your expectations—especially when they are exceedingly high—is that if you don’t meet them, you’re liable to feel sad, perhaps even burned out. This isn’t to say that you shouldn’t strive for excellence, but there’s wisdom in not letting perfect be the enemy of good.

A (now famous) 2006 study found that people in Denmark are the happiest in the world. Researchers also found that they have remarkably low expectations. And then:

In a more recent study that included more than 18,000 participants and was published in 2014 in the Proceedings of the National Academy of Sciences, researchers from University College London examined people’s happiness from moment to moment. They found that “momentary happiness in response to outcomes of a probabilistic reward task is not explained by current task earnings, but by the combined influence of the recent reward expectations and prediction errors arising from those expectations.” In other words: Happiness at any given moment equals reality minus expectations.
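
In symbols (my own schematic of that finding, not the paper’s exact equation), with EV_j the reward a person expects at moment j and RPE_j the prediction error that expectation generates:

\[
\text{Happiness}(t) \;\approx\; w_0 \;+\; w_1 \sum_{j \le t} \gamma^{\,t-j}\,\mathrm{EV}_j \;+\; w_2 \sum_{j \le t} \gamma^{\,t-j}\,\mathrm{RPE}_j,
\qquad \mathrm{RPE}_j = \mathrm{outcome}_j - \mathrm{EV}_j
\]

Here γ is a forgetting factor that privileges recent events. Collapse the sums to the present moment and you get the pop version: happiness equals reality minus expectations.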

So if you’ve always got very high expectations that aren’t being met, that’s not a great situation to be in:

In the words of Jason Fried, founder and CEO of software company Basecamp and author of multiple books on workplace performance: “I used to set expectations in my head all day long. But constantly measuring reality against an imagined reality is taxing and tiring, [and] often wrings the joy out of experiencing something for what it is.”

Source: Outside