
Saturday scrubbings

This week on Thought Shrapnel I’ve been focused on messing about with using OBS to create videos. So much, in fact, that this weekend I’m building a new PC to improve the experience.

Sometimes in these link roundups I try to group similar kinds of things together. But this week, much as I did last week, I’ve just thrown them all in a pot like gumbo.

Tell me which links you find interesting, either in the comments or on Twitter or the Fediverse (feel free to use the hashtag #thoughtshrapnel).


Melting Ice Reveals a “Lost” Viking-Era Pass in Norway’s Mountains

About 60 artifacts have been radiocarbon dated, showing the Lendbreen pass was widely used from at least A.D. 300. “It probably served as both an artery for long-distance travel and for local travel between permanent farms in the valleys to summer farms higher in the mountains, where livestock grazed for part of the year,” says University of Cambridge archaeologist James Barrett, a co-author of the research.

Tom Metcalfe (Scientific American)

I love it when the scientific and history communities come together to find out new things about our past. Especially about the Vikings, who were straight-up amazing.


University proposes online-only degrees as part of radical restructuring

Confidential documents seen by Palatinate show that the University is planning “a radical restructure” of the Durham curriculum in order to permanently put online resources at the core of its educational offer, in response to the Covid-19 crisis and other ongoing changes in both national and international Higher Education.

The proposals seek to “invert Durham’s traditional educational model”, which revolves around residential study, replacing it with one that puts “online resources at the core enabling us to provide education at a distance.” 

Jack Taylor & Tom Mitchell (Palatinate)

I’m paying attention to this as Durham University is one of my alma maters*, but I think this is going to be a common story across a lot of UK institutions. They’ve relied for too long on the inflated fees brought in by overseas students and now, in the wake of the pandemic, need to rapidly find a different approach.

*I have a teaching qualification and two postgraduate degrees from Durham, despite a snooty professor telling me when I was 17 years old that I’d never get into the institution 😅


Abolish Silicon Valley: memoir of a driven startup founder who became an anti-capitalist activist

Liu grew up a true believer in “meritocracy” and its corollaries: that success implies worth, and thus failure is a moral judgment about the intellect, commitment and value of the failed.

Her tale — starting in her girlhood bedroom and stretching all the way to protests outside of tech giants in San Francisco — traces a journey of maturity and discovery. As Liu confronts the mounting evidence that her life’s philosophy is little more than the self-serving rhetoric of rich people defending their privilege, the chasm between her lived experience and her guiding philosophy widens until she can no longer straddle it.

Cory Doctorow (Boing Boing)

This book is next on my non-fiction reading list. If your library is closed and doesn’t have an online service, try this.


Cup, er, drying itself...

7 things ease the switch to remote-only workplaces

You want workers to post work as it’s underway—even when it’s rough, incomplete, imperfect. That requires a different mindset, though one that’s increasingly common in asynchronous companies. In traditional companies, people often hesitate to circulate projects or proposals that aren’t polished, pretty, and bullet-proofed. It’s a natural reflex, especially when people are disconnected from each other and don’t communicate casually. But it can lead to long delays, especially on projects in which each participant’s progress depends on the progress and feedback of others. Location-independent companies need a culture in which people recognize that a work-in-progress is likely to have gaps and flaws and don’t criticize each other for them. This is an issue of norms, not tools.

Edmund L. Andrews-Stanford (Futurity)

I discovered this via Stephen Downes, who highlights the fifth point in this article (‘single source of truth’). I’ve actually highlighted the sixth one (‘breaking down the barriers to sharing work’) as I’ve also seen that as an important thing to check for when hiring.


How the 5G coronavirus conspiracy theory tore through the internet

The level of interest in the coronavirus pandemic – and the fear and uncertainty that comes with it – has caused tired, fringe conspiracy theories to be pulled into the mainstream. From obscure YouTube channels and Facebook pages, to national news headlines, baseless claims that 5G causes or exacerbates coronavirus are now having real-world consequences. People are burning down 5G masts in protest. Government ministers and public health experts are now being forced to confront this dangerous balderdash head-on, giving further oxygen and airtime to views that, were it not for the major technology platforms, would remain on the fringe of the fringe. “Like anti-vax content, this messaging is spreading via platforms which have been designed explicitly to help propagate the content which people find most compelling; most irresistible to click on,” says Smith from Demos.

James Temperton (Wired)

The disinformation and plain bonkers-ness around this ‘theory’ of linking 5G and the coronavirus is a particularly difficult thing to deal with. I’ve avoided talking about it on social media as well as here on Thought Shrapnel, but I’m sharing this as it’s a great overview of how these things spread — and who’s fanning the flames.


A Manifesto Against EdTech© During an Emergency Online Pivot

The COVID-19 pandemic is an unprecedented moment in the history of social structures such as education. After all of the time spent creating emergency plans and three- or five-year road maps that include fail safe options, we find ourselves in the actual emergency. Yet not even a month into global orders of shelter in place, there are many education narratives attempting to frame the pandemic as an opportunity. Extreme situations can certainly create space for extraordinary opportunities, but that viewpoint is severely limited considering this moment in time. Perhaps if the move to distance/online/remote education had happened in a vacuum that did not involve a global pandemic, millions sick, tens of thousands dead, tens of millions unemployed, hundreds of millions hungry, billions anxious and uncertain of society’s next step…perhaps then this would be that opportunity moment. Instead, we have a global emergency where the stress is felt everywhere but it certainly is not evenly distributed, so learning/aligning/deploying/assessing new technology for the classroom is not universally feasible. You can’t teach someone to swim while they’re drowning.

Rolin Moe

Rolin Moe is a thoughtful commentator on educational technology. This post was obviously written quickly (note the typo in the URL when you click through, as well as some slightly awkward language) and I’m not a fan of the title Moe has settled on. That being said, the point about this not being an ‘opportunity’ for edtech is a good one.


Dishes washing themselves

NHS coronavirus app: memo discussed giving ministers power to ‘de-anonymise’ users

Produced in March, the memo explained how an NHS app could work, using Bluetooth LE, a standard feature that runs constantly and automatically on all mobile devices, to take “soundings” from other nearby phones through the day. People who have been in sustained proximity with someone who may have Covid-19 could then be warned and advised to self–isolate, without revealing the identity of the infected individual.

However, the memo stated that “more controversially” the app could use device IDs, which are unique to all smartphones, “to enable de-anonymisation if ministers judge that to be proportionate at some stage”. It did not say why ministers might want to identify app users, or under what circumstances doing so would be proportionate.

David Pegg & Paul Lewis (The Guardian)

This all really concerns me: not only is this kind of technology likely to be of marginal use in fighting the coronavirus, but once it’s out of the box, what else will it be used for? Also check out Vice’s coverage, including an interview with Edward Snowden, and this discussion at Edgeryders.
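To make the memo’s description a bit more concrete, here’s a minimal sketch of how Bluetooth-based proximity logging can work in principle. This is purely illustrative and not the NHS app’s actual design: the class, the device IDs, and the 15-minute threshold are all made up for the example.

```python
from collections import defaultdict

class ContactLog:
    """Hypothetical sketch: a phone accumulates how long it has
    'heard' each nearby anonymous device ID over Bluetooth LE."""

    def __init__(self):
        # anonymous device ID -> total minutes of proximity
        self.contacts = defaultdict(int)

    def record_sighting(self, other_id: str, minutes: int) -> None:
        """Accumulate time spent near another device."""
        self.contacts[other_id] += minutes

    def exposures(self, infected_ids: set, threshold: int = 15) -> list:
        """IDs of confirmed-case contacts seen for at least `threshold`
        minutes. The user is only told that *some* sustained contact
        tested positive, not who it was."""
        return [cid for cid, mins in self.contacts.items()
                if cid in infected_ids and mins >= threshold]

log = ContactLog()
log.record_sighting("device-a", 5)
log.record_sighting("device-a", 20)   # sustained contact: 25 min total
log.record_sighting("device-b", 2)    # brief passing contact

print(log.exposures({"device-a", "device-b"}))  # ['device-a']
```

Note that the de-anonymisation the memo floats would sit entirely outside this logic: it only requires a server-side mapping from those “anonymous” IDs back to real devices, which is exactly why the design choice matters.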


Is This the Most Virus-Proof Job in the World?

It’s hard to think of a job title more pandemic-proof than “superstar live streamer.” While the coronavirus has upended the working lives of hundreds of millions of people, Dr. Lupo, as he’s known to acolytes, has a basically unaltered routine. He has the same seven-second commute down a flight of stairs. He sits in the same seat, before the same configuration of lights, cameras and monitors. He keeps the same marathon hours, starting every morning at 8.

Social distancing? He’s been doing that since he went pro, three years ago.

For 11 hours a day, six days a week, he sits alone, hunting and being hunted on games like Call of Duty and Fortnite. With offline spectator sports canceled, he and other well-known gamers currently offer one of the only live contests that meet the standards of the Centers for Disease Control and Prevention.

David Segal (The New York Times)

It’s hard to argue with my son these days when he says he wants to be a ‘pro gamer’.

(A quick tip for those who want to avoid ‘free registration’ and some paywalls: use a service like Pocket to save the article and read it there.)


Capitalists or Cronyists?

To be clear, socialism may be a better way to go, as evidenced by the study showing 4 of the 5 happiest nations are socialist democracies. However, unless we’re going to provide universal healthcare and universal pre-K, let’s not embrace The Hunger Games for the working class on the way up, and the Hallmark Channel for the shareholder class on the way down. The current administration, the wealthy, and the media have embraced policies that bless the caching of power and wealth, creating a nation of brittle companies and government agencies.

Scott Galloway

A somewhat rambling post, but one which explains the difference between a form of capitalism that (theoretically) allows everyone to flourish and crony capitalism, which doesn’t.


Header image by Stephen Collins at The Guardian

Most human beings have an almost infinite capacity for taking things for granted

So said Aldous Huxley. Recently, I discovered an episode of the podcast The Science of Success in which Dan Carlin was interviewed. Now Dan is the host of one of my favourite podcasts, Hardcore History, as well as one he’s recently discontinued called Common Sense.

The reason the latter is on ‘indefinite hiatus’ was discussed on The Science of Success podcast. Dan feels that, after 30 years as a journalist, if he can’t get a grip on the current information landscape, then who can? It’s shaken him up a little.

One of the quotations he just gently lobbed into the conversation was from John Stuart Mill, who at one time or another was accused by someone of being ‘inconsistent’ in his views. Mill replied:

When the facts change, I change my mind. What do you do, sir?

John Stuart Mill

Now whether or not Mill said those exact words, the sentiment nevertheless stands. I reckon human beings have always made up their minds first and then chosen ‘facts’ to support their opinions. These days, I just think that it’s easier than ever to find ‘news’ outlets and people sharing social media posts to support your worldview. It’s as simple as that.


Last week I watched a stand-up comedy routine by Kevin Bridges on BBC iPlayer as part of his 2018 tour. As a Glaswegian, he made the (hilarious) analogy of social media as being like going into a pub.

(As an aside, this is interesting, as a decade ago people would often use the analogy of using social media as being like going to a café. The idea was that you could overhear, and perhaps join in with, interesting conversations. No-one uses that analogy any more.)

Bridges pointed out that if you entered a pub, sat down for a quiet pint, and the person next to you was trying to flog you Herbalife products, constantly talking about how #blessed they felt, or talking ambiguously for the sake of attention, you’d probably find another pub.

He was doing it for laughs, but I think he was also making a serious point. Online, we tolerate people ranting on and generally being obnoxious in ways we would never do offline.

The underlying problem of course is that any platform that takes some segment of the real world and brings it into software will also bring in all that segment’s problems. Amazon took products and so it has to deal with bad and fake products (whereas one might say that Facebook took people, and so has bad and fake people).

Benedict Evans

I met Clay Shirky at an event last month, which kind of blew my mind given that it was me speaking at it rather than him. After introducing myself, we spoke for a few minutes about everything from his choice of laptop to what he’s been working on recently. Curiously, he’s not writing a book at the moment. After a couple of very well-received books (Here Comes Everybody and Cognitive Surplus) Shirky has actually only published a slightly obscure book about Chinese smartphone manufacturing since 2010.

While I didn’t have time to dig into things there and then, and it would have been a bit presumptuous of me to do so, it feels to me like Shirky may have ‘walked back’ some of his pre-2010 thoughts. This doesn’t surprise me at all, given that many of the rest of us have, too. For example, in 2014 he published a Medium article explaining why he banned his students from using laptops in lectures. Such blog posts and news articles are common these days, but it felt like his was one of the first.


The last decade from 2010 to 2019, which Audrey Watters has done a great job of eviscerating, was, shall we say, somewhat problematic. The good news is that we connected 4.5 billion people to the internet. The bad news is that we didn’t really harness that for much good. So we went from people sharing pictures of cats, to people sharing pictures of cats and destroying western democracy.

Other than the ‘bad and fake people’ problem cited by Ben Evans above, another big problem was the rise of surveillance capitalism. In a similar way to climate change, this has been repackaged as a series of individual failures on the part of end users. But, as Lindsey Barrett explains for Fast Company, it’s not really our fault at all:

In some ways, the tendency to blame individuals simply reflects the mistakes of our existing privacy laws, which are built on a vision of privacy choices that generally considers the use of technology to be a purely rational decision, unconstrained by practical limitations such as the circumstances of the user or human fallibility. These laws are guided by the idea that providing people with information about data collection practices in a boilerplate policy statement is a sufficient safeguard. If people don’t like the practices described, they don’t have to use the service.

Lindsey Barrett

The problem is that we have monopolistic practices in the digital world. Fast Company also reports that the four most downloaded apps of the 2010s were all owned by Facebook.

I don’t actually think people really understand that their data from WhatsApp and Instagram is being hoovered up by Facebook. Nor do I think they understand what Facebook then does with that data. I tried to lift the veil on this a little at the event where I met Clay Shirky. I know at least one person who immediately deleted their Facebook account as a result. But I suspect everyone else will just keep on keeping on. And yes, I have been banging my drum about this for quite a while now. I’ll continue to do so.

The truth is, and this is something I’ll be focusing on in upcoming workshops I’m running on digital literacies, that to be an ‘informed citizen’ these days means reading things like the EFF’s report into the current state of corporate surveillance. It means deleting accounts as a result. It means slowing down, taking time, and reading stuff before sharing it on platforms that you know care for the many, not the few. It means actually caring about this stuff.

All of this might just look and feel like a series of preferences. I prefer decentralised social networks and you prefer Facebook. Or I like to use Signal and you like WhatsApp. But it’s more than that. It’s a whole lot more than that. Democracy as we know it is at stake here.


As Prof. Scott Galloway has discussed from an American point of view, we’re living in times of increasing inequality. The tools we’re using exacerbate that inequality. All of a sudden you have to be amazing at your job to even be able to have a decent quality of life:

The biggest losers of the decade are the unremarkables. Our society used to give remarkable opportunities to unremarkable kids and young adults. Some of the crowding out of unremarkable white males, including myself, is a good thing. More women are going to college, and remarkable kids from low-income neighborhoods get opportunities. But a middle-class kid who doesn’t learn to code Python or speak Mandarin can soon find she is not “tracking” and can’t catch up.

Prof. Scott Galloway

I shared an article last Friday, about how you shouldn’t have to be good at your job. The whole point of society is that we look after one another, not compete with one another to see which of us can ‘extract the most value’ and pile up more money than he or she can ever hope to spend. Yes, it would be nice if everyone was awesome at all they did, but the optimisation of everything isn’t the point of human existence.

So once we come down the stack from social networks, to surveillance capitalism, to economics and markets eating the world, we find the real problem behind all of this: decision-making. We’ve sacrificed stability for speed, and seem to be increasingly happy with dictator-like behaviour in both our public institutions and corporate lives.

Dictatorships can be more efficient than democracies because they don’t have to get many people on board to make a decision. Democracies, by contrast, are more robust, but at the cost of efficiency.

Taylor Pearson

A selectorate, according to Pearson, “represents the number of people who have influence in a government, and thus the degree to which power is distributed”. Aside from the fact that dictatorships tend to be corrupt and oppressive, they’re just not a good idea in terms of decision-making:

Said another way, much of what appears efficient in the short term may not be efficient but hiding risk somewhere, creating the potential for a blow-up. A large selectorate tends to appear to be working less efficiently in the short term, but can be more robust in the long term, making it more efficient in the long term as well. It is a story of the Tortoise and the Hare: slow and steady may lose the first leg, but win the race.

Taylor Pearson

I don’t think we should be optimising human beings for their role in markets. I think we should be optimising markets (if in fact we need them) for their role in human flourishing. The best way of doing that is to ensure that we distribute power and decision-making well.


So it might seem that my continual ragging on Facebook (in particular) is a small thing in the bigger picture. But it’s actually part of the whole deal. When we have super-powerful individuals whose companies have the ability to surveil us at will; who then share that data with corrupt regimes; who in turn reinforce the worst parts of the status quo; then I think we have a problem.

This year I’ve made a vow to be more radical. To speak my mind even more, and truth to power, especially when it’s inconvenient. I hope you’ll join me ✊

Clickbait and switch?

Should you design for addiction or for loyalty? That’s the question posed by Michelle Manafy in this post for Nieman Lab. It all depends, she says, on whether you’re trying to attract users or an audience.

With advertising as the primary driver of web revenue, many publishers have chased the click dragon. Seeking to meet marketers’ insatiable desire for impressions, publishers doubled down on quick clicks. Headlines became little more than a means to a clickthrough, often regardless of whether the article would pay off or even if the topic was worthy of coverage. And — since we all know there are still plenty of publications focusing on hot headlines over substance — this method pays off. In short-term revenue, that is.

However, the reader experience that shallow clicks deliver doesn’t develop brand affinity or customer loyalty. And the negative consumer experience has actually been shown to extend to any advertising placed in its context. Sure, there are still those seeking a quick buck — but these days, we all see clickbait for what it is.

Audiences mature over time and become wary of particular approaches. Remember “…and you’ll not believe what came next” approaches?

As Manafy notes, it’s much easier to design for addiction than to build an audience. The former just requires lots and lots of tracking — something the web has become spectacularly good at, due to advertising.

For example, many push notifications are specifically designed to leverage the desire for human interaction to generate clicks (such as when a user is alerted that their friend liked an article). Push notifications and alerts are also unpredictable (Will we have likes? Mentions? New followers? Negative comments?). And this unpredictability, or B.F. Skinner’s principle of variable rewards, is the same one used in those notoriously addictive slot machines. They’re also lucrative — generating more revenue in the U.S. than baseball, theme parks, and movies combined. A pull-to-refresh even smacks of a slot machine lever.
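The slot-machine comparison is easy to see in code. Here’s a hypothetical sketch of a variable-ratio reward schedule, the Skinner mechanic the quote describes: every “pull to refresh” has some fixed chance of paying off, but the user can never predict which one will. The 30% probability is made up for illustration.

```python
import random

def pull_to_refresh(reward_probability: float = 0.3,
                    rng: random.Random = None) -> bool:
    """Simulate one 'pull': a notification check that pays off
    unpredictably, like a variable-ratio reward schedule."""
    rng = rng or random.Random()
    return rng.random() < reward_probability

# Roughly 30% of refreshes deliver a 'reward' (a like, a mention, a new
# follower), but no individual pull is predictable -- which is exactly
# what makes the loop compulsive.
rng = random.Random(42)
rewards = sum(pull_to_refresh(0.3, rng) for _ in range(1000))
print(f"{rewards} rewarding refreshes out of 1000")
```

The design insight is that the unpredictability itself, not the size of the reward, drives the compulsive checking.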

The problem is that designing for addiction isn’t a long-term strategy. Who plays Farmville these days? And the makers of Candy Crush aren’t exactly crushing it with their share price.

Sure, an addict is “engaged” — clicking, liking, swiping — but what if they discover that your product is bad for them? Or that it’s not delivering as much value as it does harm? The only option for many addicts is to quit, cold turkey. Sure, many won’t have the willpower, and you can probably generate revenue off these users (yes, users). But is that a long-term strategy you can live with? And is it a growth strategy, should the philosophical, ethical, or regulatory tide turn against you?

The ‘regulatory tide’ referenced here is exemplified through GDPR, which is already causing a sea change in attitude towards user data. Compliance with teeth, it seems, gets results.

Designing for sustainability isn’t just good from a regulatory point of view, it’s good for long-term business, argues Manafy:

Where addiction relies on an imbalanced and unstable relationship, loyal customers will return willingly time and again. They’ll refer you to others. They’ll be interested in your new offerings, because they will already rely on you to deliver. And, as an added bonus, these feelings of goodwill will extend to any advertising you deliver too. Through the provision of quality content, delivered through excellent experiences at predictable and optimal times, content can become a trusted ally, not a fleeting infatuation or unhealthy compulsion.

Instead of thinking of your audience as ‘users’ waiting for their next hit, she suggests, think of them as your audience. That’s a much better approach and will help you make much better design decisions.

Source: Nieman Lab
