🏡 What can we learn from the great working-from-home experiment? — “A few knowledge jobs, such as IT support, are properly systematised to allow focused work without endless ad hoc emails. [Cal] Newport believes that others will follow once we all wise up. Or we may find that certain kinds of knowledge work are too unruly to systematise. Improvisation will remain the only mode of working — and, for that, face-to-face contact seems essential.”
I disagree with this, having spent almost a decade doing creative, improvisational work, mostly from my home office.
✊ They left Mozilla to make the internet better. Now they’re spreading its gospel for a new generation. — “Plenty of older tech companies spawned networks of industry leaders. Mozilla has, too, only it’s a different kind of group: a collection of values-driven engineers, marketers, program managers and founders. Most of them share a common story: Looking for a sense of purpose in tech, they took a financial hit for the chance to become part of the company’s cult-like obsession with openness and privacy. Though the company had its flaws, they left feeling deep loyalty to the mission, and a sense of betrayal from those who went on to work for the tech giants Mozilla has been battling.”
Some companies act as a filter for a certain type of person. Mozilla is like that, and while I was there I worked with some of the most ethical and awesome people I’ve ever come across.
🤪 Why It’s Usually Crazier Than You Expect — “The idea that people like (or hate) what other people like (or hate) is important, because it lets small ideas grow bigger than you’d guess if you assume everything is ranked by quality alone. Social momentum is hard to model on a spreadsheet, so it’s hard to predict or think about in terms that seem rational. But it’s so powerful.”
The standard economic model is that people act in their individual and group self-interest. But humans are much more complicated than that.
🎓 Academics Are Really, Really Worried About Their Freedom — “Some will process this as a kind of whining, supposing that all we should really be concerned about is whether people are outright dismissed. However, elsewhere a hostile work environment is considered a breach of civil rights, and as one correspondent wrote, “It isn’t just fear of firing that motivates professors and grad students to be quiet. It is a desire to have friends, to be part of a community. This is a fundamental part of human psychology. Indeed, experiments examining the effects of ostracism highlight what a powerful existential threat it is to be ignored, excluded, or rejected. This has been documented at the neurological level. Ostracism is a form of social death. It is a very potent threat.”
Given how conservative humanity has been for the past tens of thousands of years, and given how radical we need to be to fix the world, I don’t have lots of sympathy with this view. Especially when tenured professors have the kind of job security most people can only dream of.
👩💻 Where we are with digital learning adoption — “We should have less big bang summative exams sat in big rooms with invigilators, there are plenty of alternatives. Online assessment systems can at least allow for typing, which is more authentic, and why not also speaking, and drawing? And in the scenarios where an unseen timed assessment is the only option and it has to be online: sometimes proctoring might be useful. It shouldn’t be the default. But it might have a place, sometimes.”
I’m sharing this to +1,000,000 Amber’s suggestion that, for assessment purposes, speaking and drawing should be as authentic as typing and writing.
Quotation-as-title by Marquis de la Grange. Image: Changing the Letter, 1908, by Joseph Edward Southall
About 60 artifacts have been radiocarbon dated, showing the Lendbreen pass was widely used from at least A.D. 300. “It probably served as both an artery for long-distance travel and for local travel between permanent farms in the valleys to summer farms higher in the mountains, where livestock grazed for part of the year,” says University of Cambridge archaeologist James Barrett, a co-author of the research.
Tom Metcalfe (Scientific American)
I love it when the scientific and history communities come together to find out new things about our past. Especially about the Vikings, who were straight-up amazing.
Confidential documents seen by Palatinate show that the University is planning “a radical restructure” of the Durham curriculum in order to permanently put online resources at the core of its educational offer, in response to the Covid-19 crisis and other ongoing changes in both national and international Higher Education.
The proposals seek to “invert Durham’s traditional educational model”, which revolves around residential study, replacing it with one that puts “online resources at the core enabling us to provide education at a distance.”
Jack Taylor & Tom Mitchell (Palatinate)
I’m paying attention to this as Durham University is one of my alma maters* but I think this is going to be a common story across a lot of UK institutions. They’ve relied for too long on the inflated fees brought in by overseas students and now, in the wake of the pandemic, need to rapidly find a different approach.
*I have a teaching qualification and two postgraduate degrees from Durham, despite a snooty professor telling me when I was 17 years old that I’d never get in to the institution 😅
Liu grew up a true believer in “meritocracy” and its corollaries: that success implies worth, and thus failure is a moral judgment about the intellect, commitment and value of the failed.
Her tale — starting in her girlhood bedroom and stretching all the way to protests outside of tech giants in San Francisco — traces a journey of maturity and discovery, as Liu confronts the mounting evidence that her life’s philosophy is little more than the self-serving rhetoric of rich people defending their privilege. The chasm between her lived experience and her guiding philosophy widens until she can no longer straddle it.
Cory Doctorow (Boing Boing)
This book is next on my non-fiction reading list. If your library is closed and doesn’t have an online service, try this.
You want workers to post work as it’s underway—even when it’s rough, incomplete, imperfect. That requires a different mindset, though one that’s increasingly common in asynchronous companies. In traditional companies, people often hesitate to circulate projects or proposals that aren’t polished, pretty, and bullet-proofed. It’s a natural reflex, especially when people are disconnected from each other and don’t communicate casually. But it can lead to long delays, especially on projects in which each participant’s progress depends on the progress and feedback of others. Location-independent companies need a culture in which people recognize that a work-in-progress is likely to have gaps and flaws and don’t criticize each other for them. This is an issue of norms, not tools.
Edmund L. Andrews-Stanford (Futurity)
I discovered this via Stephen Downes, who highlights the fifth point in this article (‘single source of truth’). I’ve actually highlighted the sixth one (‘breaking down the barriers to sharing work’) as I’ve also seen that as an important thing to check for when hiring.
The level of interest in the coronavirus pandemic – and the fear and uncertainty that comes with it – has caused tired, fringe conspiracy theories to be pulled into the mainstream. From obscure YouTube channels and Facebook pages, to national news headlines, baseless claims that 5G causes or exacerbates coronavirus are now having real-world consequences. People are burning down 5G masts in protest. Government ministers and public health experts are now being forced to confront this dangerous balderdash head-on, giving further oxygen and airtime to views that, were it not for the major technology platforms, would remain on the fringe of the fringe. “Like anti-vax content, this messaging is spreading via platforms which have been designed explicitly to help propagate the content which people find most compelling; most irresistible to click on,” says Smith from Demos.
James Temperton (Wired)
The disinformation and plain bonkers-ness around this ‘theory’ of linking 5G and the coronavirus is a particularly difficult thing to deal with. I’ve avoided talking about it on social media as well as here on Thought Shrapnel, but I’m sharing this as it’s a great overview of how these things spread — and who’s fanning the flames.
The COVID-19 pandemic is an unprecedented moment in the history of social structures such as education. After all of the time spent creating emergency plans and three- or five-year road maps that include fail safe options, we find ourselves in the actual emergency. Yet not even a month into global orders of shelter in place, there are many education narratives attempting to frame the pandemic as an opportunity. Extreme situations can certainly create space for extraordinary opportunities, but that viewpoint is severely limited considering this moment in time. Perhaps if the move to distance/online/remote education had happened in a vacuum that did not involve a global pandemic, millions sick, tens of thousands dead, tens of millions unemployed, hundreds of millions hungry, billions anxious and uncertain of society’s next step…perhaps then this would be that opportunity moment. Instead, we have a global emergency where the stress is felt everywhere but it certainly is not evenly distributed, so learning/aligning/deploying/assessing new technology for the classroom is not universally feasible. You can’t teach someone to swim while they’re drowning.
Rolin Moe is a thoughtful commentator on educational technology. This post was obviously written quickly (note the typo in the URL when you click through, as well as some slightly awkward language) and I’m not a fan of the title Moe has settled on. That being said, the point about this not being an ‘opportunity’ for edtech is a good one.
Produced in March, the memo explained how an NHS app could work, using Bluetooth LE, a standard feature that runs constantly and automatically on all mobile devices, to take “soundings” from other nearby phones through the day. People who have been in sustained proximity with someone who may have Covid-19 could then be warned and advised to self–isolate, without revealing the identity of the infected individual.
However, the memo stated that “more controversially” the app could use device IDs, which are unique to all smartphones, “to enable de-anonymisation if ministers judge that to be proportionate at some stage”. It did not say why ministers might want to identify app users, or under what circumstances doing so would be proportionate.
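The “soundings” idea described above — phones quietly logging which other phones have been nearby, and flagging only sustained proximity — can be sketched in a few lines. To be clear, this is a purely illustrative assumption-laden sketch, not the NHS app’s actual design: the function names, the RSSI cutoff, and the 15-minute threshold are all invented for the example.

```python
from collections import defaultdict

# Assumed thresholds, purely for illustration
PROXIMITY_RSSI = -65      # signal strength at or above this counts as "nearby"
SUSTAINED_MINUTES = 15    # distinct minutes of proximity that count as a contact

def sustained_contacts(soundings):
    """soundings: list of (minute, other_id, rssi) tuples recorded by one phone.
    Returns the set of anonymous IDs seen nearby for at least SUSTAINED_MINUTES."""
    minutes_near = defaultdict(set)
    for minute, other_id, rssi in soundings:
        if rssi >= PROXIMITY_RSSI:
            minutes_near[other_id].add(minute)
    return {oid for oid, mins in minutes_near.items()
            if len(mins) >= SUSTAINED_MINUTES}

# Example: ID "abc" is nearby for 20 distinct minutes, "xyz" for only 3
log = [(m, "abc", -60) for m in range(20)] + [(m, "xyz", -60) for m in range(3)]
print(sustained_contacts(log))  # {'abc'}
```

Note that nothing in this matching step needs a real identity, which is exactly why the memo’s suggestion of de-anonymisation via device IDs is the controversial part.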
It’s hard to think of a job title more pandemic-proof than “superstar live streamer.” While the coronavirus has upended the working lives of hundreds of millions of people, Dr. Lupo, as he’s known to acolytes, has a basically unaltered routine. He has the same seven-second commute down a flight of stairs. He sits in the same seat, before the same configuration of lights, cameras and monitors. He keeps the same marathon hours, starting every morning at 8.
Social distancing? He’s been doing that since he went pro, three years ago.
For 11 hours a day, six days a week, he sits alone, hunting and being hunted on games like Call of Duty and Fortnite. With offline spectator sports canceled, he and other well-known gamers currently offer one of the only live contests that meet the standards of the Centers for Disease Control and Prevention.
David Segal (The New York Times)
It’s hard to argue with my son these days when he says he wants to be a ‘pro gamer’.
(a quick tip for those who want to avoid ‘free registration’ and some paywalls — use a service like Pocket to save the article and read it there)
To be clear, socialism may be a better way to go, as evidenced by the study showing 4 of the 5 happiest nations are socialist democracies. However, unless we’re going to provide universal healthcare and universal pre-K, let’s not embrace The Hunger Games for the working class on the way up, and the Hallmark Channel for the shareholder class on the way down. The current administration, the wealthy, and the media have embraced policies that bless the caching of power and wealth, creating a nation of brittle companies and government agencies.
A somewhat rambling post, but one which explains the difference between a form of capitalism that (theoretically) allows everyone to flourish, and crony capitalism, which doesn’t.
I don’t know… I think I’d like to say only that [young people] should learn to be alone and try to spend as much time as possible by themselves. I think one of the faults of young people today is that they try to come together around events that are noisy, almost aggressive at times. This desire to be together in order to not feel alone is an unfortunate symptom, in my opinion. Every person needs to learn from childhood how to spend time with oneself. That doesn’t mean he should be lonely, but that he shouldn’t grow bored with himself because people who grow bored in their own company seem to me in danger, from a self-esteem point of view.
This article in Open Culture quotes the film-maker Andrei Tarkovsky. Having just finished my first set of therapy sessions, I have to say that the metaphor of “putting on your own oxygen mask before helping others” would be a good takeaway from it. That sounds selfish, but as Tarkovsky points out here, other approaches can lead to the destruction of self-esteem.
[T]here are two sources of feeling like a noob: being stupid, and doing something novel. Our dislike of feeling like a noob is our brain telling us “Come on, come on, figure this out.” Which was the right thing to be thinking for most of human history. The life of hunter-gatherers was complex, but it didn’t change as much as life does now. They didn’t suddenly have to figure out what to do about cryptocurrency. So it made sense to be biased toward competence at existing problems over the discovery of new ones. It made sense for humans to dislike the feeling of being a noob, just as, in a world where food was scarce, it made sense for them to dislike the feeling of being hungry.
I’m not sure about the evolutionary framing, but there’s definitely something in this about having the confidence (and humility) to be a ‘noob’ and learn things as a beginner.
Imagine you were to take two identical twins and give them the same starter job, same manager, same skills, and the same personality. One competently does all of their work behind a veil of silence, not sharing good news, opportunities, or challenges, but just plugs away until asked for a status update. The other does the same level of work but communicates effectively, keeping their manager and stakeholders proactively informed. Which one is going to get the next opportunity for growth?
I absolutely love this post. As a Product Manager, I’ve been talking repeatedly recently about making our open-source project ‘legible’. As remote workers, that means over-communicating and, as pointed out in this post, being proactive in that communication. Highly recommended.
Consider packing a suitcase for a trip. It contains many different items – clothes, toiletries, books, electrical items, maybe food and drink or gifts. Some of these items bear a relationship to others, for example underwear, and others are seemingly unrelated, for example a hair dryer. Each brings their own function, which has a separate existence and relates to other items outside of the case, but within the case, they form a new category, that of “items I need for my trip.” In this sense the suitcase resembles the ed tech field, or at least a gathering of ed tech individuals, for example at a conference.
If you attend a chemistry conference and have lunch with strangers, it is highly likely they will nearly all have chemistry degrees and PhDs. This is not the case at an ed tech conference, where the lunch table might contain people with expertise in computer science, philosophy, psychology, art, history and engineering. This is a strength of the field. The chemistry conference suitcase then contains just socks (but of different types), but the ed tech suitcase contains many different items. In this perspective then the aim is not to make the items of the suitcase the same, but to find means by which they meet the overall aim of usefulness for your trip, and are not random items that won’t be needed. This suggests a different way of approaching ed tech beyond making it a discipline.
At the start of this year, it became (briefly) fashionable among ageing (mainly North American) men to state that they had “never been an edtech guy”. Followed by something something pedagogy or something something people. In this post, Martin Weller uses a handy metaphor to explain that edtech may not be a discipline, but it’s a useful field (or area of focus) nonetheless.
Backdoors are usually camouflaged as “accidental” security flaws. In the last year alone, 12 such flaws have been found in WhatsApp. Seven of them were critical – like the one that got Jeff Bezos. Some might tell you WhatsApp is still “very secure” despite having 7 backdoors exposed in the last 12 months, but that’s just statistically improbable.
Don’t let yourself be fooled by the tech equivalent of circus magicians who’d like to focus your attention on one isolated aspect all while performing their tricks elsewhere. They want you to think about end-to-end encryption as the only thing you have to look at for privacy. The reality is much more complicated.
Facebook products are bad for you, for society, and for the planet. Choose alternatives and encourage others to do likewise.
The current social-media model isn’t quite right for family sharing. Different generations tend to congregate in different places: Facebook is Boomer paradise, Instagram appeals to Millennials, TikTok is GenZ central. (WhatsApp has helped bridge the generational divide, but its focus on messaging is limiting.)
Updating family about a vacation across platforms—via Instagram stories or on Facebook, for example—might not always be appropriate. Do you really want your cubicle pal, your acquaintance from book club, and your high school frenemy to be looped in as well?
Some apps are just before their time. Take Path, for example, which my family used for almost the entire eight years it was around, from 2010 to 2018. The interface was great, the experience cosy, and the knowledge that you weren’t sharing with everyone outside of a close circle? Priceless.
While one data broker might only be able to tie my shopping behavior to something like my IP address, and another broker might only be able to tie it to my rough geolocation, that’s ultimately not much of an issue. What is an issue is what happens when those “anonymized” data points inevitably bleed out of the marketing ecosystem and someone even more nefarious uses it for, well, whatever—use your imagination. In other words, when one data broker springs a leak, it’s bad enough—but when dozens spring leaks over time, someone can piece that data together in a way that’s not only identifiable but chillingly accurate.
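The “piecing together” mechanism described above is just a database join: two datasets that are each individually “anonymous” become identifying once they share a quasi-identifier. A minimal sketch, where all the field names and data are invented examples:

```python
# Two leaked broker datasets, each harmless-looking on its own
broker_a = [  # ties shopping behaviour to an IP address
    {"ip": "203.0.113.7", "purchase": "pregnancy test"},
]
broker_b = [  # ties an IP address to a location and an account email
    {"ip": "203.0.113.7", "city": "Leeds", "email": "jane@example.com"},
]

def link_records(a_rows, b_rows, key):
    """Join two datasets on a shared quasi-identifier."""
    index = {row[key]: row for row in b_rows}
    return [{**a, **index[a[key]]} for a in a_rows if a[key] in index]

linked = link_records(broker_a, broker_b, "ip")
print(linked[0]["email"], "bought a", linked[0]["purchase"])
```

Neither broker ever held a name next to a purchase; the harm only appears when the leaks are combined, which is why it is cumulative.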
This idea of cumulative harm is a particularly difficult one to explain (and prove) not only in the world of data, but in every area of life.
Google recently invented a third way to track who you are and what you view on the web.
Each and every install of Chrome, since version 54, has generated a unique ID. Depending upon which settings you configure, the unique ID may be longer or shorter.
So every time you visit a Google web page or use a third party site which uses some Google resource, this ID is sent to Google and can be used to track which website or individual page you are viewing. As Google’s services such as scripts, captchas and fonts are used extensively on the most popular web sites, it’s likely that Google tracks most web pages you visit.
In the last year I have seen more and more researchers like danah boyd suggesting that digital literacies are not enough. Given that some on the Internet have weaponized these tools, I believe she is right. Moving beyond digital literacies means thinking about the epistemology behind digital literacies and helping to “build the capacity to truly hear and embrace someone else’s perspective and teaching people to understand another’s view while also holding their view firm” (boyd, March 9, 2018). We can still rely on social media for our news but we really owe it to ourselves to do better in further developing digital literacies, and knowing that just because we have discussions through screens that we should not be so narcissistic to believe that we MUST be right or that the other person is simply an idiot.
I’d argue, as I did recently in this talk, that what Young and boyd are talking about here is actually a central tenet of digital literacies.
So said Aldous Huxley. Recently, I discovered an episode of the podcast The Science of Success in which Dan Carlin was interviewed. Now Dan is the host of one of my favourite podcasts, Hardcore History, as well as one he’s recently discontinued called Common Sense.
The reason the latter is on ‘indefinite hiatus’ was discussed on The Science of Success podcast. Dan feels that, after 30 years as a journalist, if he can’t get a grip on the current information landscape, then who can? It’s shaken him up a little.
One of the quotations he just gently lobbed into the conversation was from John Stuart Mill, who at one time or another was accused by someone of being ‘inconsistent’ in his views. Mill replied:
When the facts change, I change my mind. What do you do, sir?
John Stuart Mill
Now whether or not Mill said those exact words, the sentiment nevertheless stands. I reckon human beings have always made up their minds first and then chosen ‘facts’ to support their opinions. These days, I just think that it’s easier than ever to find ‘news’ outlets and people sharing social media posts to support your worldview. It’s as simple as that.
Last week I watched a stand-up comedy routine by Kevin Bridges on BBC iPlayer as part of his 2018 tour. As a Glaswegian, he made the (hilarious) analogy of social media as being like going into a pub.
(As an aside, this is interesting, as a decade ago people would often use the analogy of social media as being like going to a café. The idea was that you could overhear, and perhaps join in with, interesting conversations happening around you. No-one uses that analogy any more.)
Bridges pointed out that if you entered a pub, sat down for a quiet pint, and the person next to you was trying to flog you Herbalife products, constantly talking about how #blessed they felt, or talking ambiguously for the sake of attention, you’d probably find another pub.
He was doing it for laughs, but I think he was also making a serious point. Online, we tolerate people ranting on and generally being obnoxious in ways we would never do offline.
The underlying problem of course is that any platform that takes some segment of the real world and brings it into software will also bring in all that segment’s problems. Amazon took products and so it has to deal with bad and fake products (whereas one might say that Facebook took people, and so has bad and fake people).
I met Clay Shirky at an event last month, which kind of blew my mind given that it was me speaking at it rather than him. After introducing myself, we spoke for a few minutes about everything from his choice of laptop to what he’s been working on recently. Curiously, he’s not writing a book at the moment. After a couple of very well-received books (Here Comes Everybody and Cognitive Surplus) Shirky has actually only published a slightly obscure book about Chinese smartphone manufacturing since 2010.
While I didn’t have time to dig into things there and then, and it would have been a bit presumptuous of me to do so, it feels to me like Shirky may have ‘walked back’ some of his pre-2010 thoughts. This doesn’t surprise me at all, given that many of the rest of us have, too. For example, in 2014 he published a Medium article explaining why he banned his students from using laptops in lectures. Such blog posts and news articles are common these days, but it felt like his was one of the first.
The last decade from 2010 to 2019, which Audrey Watters has done a great job of eviscerating, was, shall we say, somewhat problematic. The good news is that we connected 4.5 billion people to the internet. The bad news is that we didn’t really harness that for much good. So we went from people sharing pictures of cats, to people sharing pictures of cats and destroying western democracy.
Other than the ‘bad and fake people’ problem cited by Ben Evans above, another big problem was the rise of surveillance capitalism. In a similar way to climate change, this has been repackaged as a series of individual failures on the part of end users. But, as Lindsey Barrett explains for Fast Company, it’s not really our fault at all:
In some ways, the tendency to blame individuals simply reflects the mistakes of our existing privacy laws, which are built on a vision of privacy choices that generally considers the use of technology to be a purely rational decision, unconstrained by practical limitations such as the circumstances of the user or human fallibility. These laws are guided by the idea that providing people with information about data collection practices in a boilerplate policy statement is a sufficient safeguard. If people don’t like the practices described, they don’t have to use the service.
The problem is that we have monopolistic practices in the digital world. Fast Company also reports that the four most downloaded apps of the 2010s were all owned by Facebook:
I don’t actually think people really understand that their data from WhatsApp and Instagram is being hoovered up by Facebook. I don’t then think they understand what Facebook then do with that data. I tried to lift the veil on this a little bit at the event where I met Clay Shirky. I know at least one person who immediately deleted their Facebook account as a result of it. But I suspect everyone else will just keep on keeping on. And yes, I have been banging my drum about this for quite a while now. I’ll continue to do so.
The truth is, and this is something I’ll be focusing on in upcoming workshops I’m running on digital literacies, that to be an ‘informed citizen’ these days means reading things like the EFF’s report into the current state of corporate surveillance. It means deleting accounts as a result. It means slowing down, taking time, and reading stuff before sharing it on platforms that you know care for the many, not the few. It means actually caring about this stuff.
All of this might just look and feel like a series of preferences. I prefer decentralised social networks and you prefer Facebook. Or I like to use Signal and you like WhatsApp. But it’s more than that. It’s a whole lot more than that. Democracy as we know it is at stake here.
As Prof. Scott Galloway has discussed from an American point of view, we’re living in times of increasing inequality. The tools we’re using exacerbate that inequality. All of a sudden you have to be amazing at your job to even be able to have a decent quality of life:
The biggest losers of the decade are the unremarkables. Our society used to give remarkable opportunities to unremarkable kids and young adults. Some of the crowding out of unremarkable white males, including myself, is a good thing. More women are going to college, and remarkable kids from low-income neighborhoods get opportunities. But a middle-class kid who doesn’t learn to code Python or speak Mandarin can soon find she is not “tracking” and can’t catch up.
Prof. Scott Galloway
I shared an article last Friday, about how you shouldn’t have to be good at your job. The whole point of society is that we look after one another, not compete with one another to see which of us can ‘extract the most value’ and pile up more money than he or she can ever hope to spend. Yes, it would be nice if everyone was awesome at all they did, but the optimisation of everything isn’t the point of human existence.
So once we come down the stack from social networks, to surveillance capitalism, to economic and markets eating the world we find the real problem behind all of this: decision-making. We’ve sacrificed stability for speed, and seem to be increasingly happy with dictator-like behaviour in both our public institutions and corporate lives.
Dictatorships can be more efficient than democracies because they don’t have to get many people on board to make a decision. Democracies, by contrast, are more robust, but at the cost of efficiency.
A selectorate, according to Pearson, “represents the number of people who have influence in a government, and thus the degree to which power is distributed”. Aside from the fact that dictatorships tend to be corrupt and oppressive, they’re just not a good idea in terms of decision-making:
Said another way, much of what appears efficient in the short term may not be efficient but hiding risk somewhere, creating the potential for a blow-up. A large selectorate tends to appear to be working less efficiently in the short term, but can be more robust in the long term, making it more efficient in the long term as well. It is a story of the Tortoise and the Hare: slow and steady may lose the first leg, but win the race.
I don’t think we should be optimising human beings for their role in markets. I think we should be optimising markets (if in fact we need them) for their role in human flourishing. The best way of doing that is to ensure that we distribute power and decision-making well.
So it might seem that my continual ragging on Facebook (in particular) is a small thing in the bigger picture. But it’s actually part of the whole deal. When we have super-powerful individuals whose companies have the ability to surveil us at will; who then share that data to corrupt regimes; who in turn reinforce the worst parts of the status quo; then I think we have a problem.
This year I’ve made a vow to be more radical. To speak my mind even more, and truth to power, especially when it’s inconvenient. I hope you’ll join me ✊
I’ve read so much stuff over the past couple of months that it’s been a real job whittling down these links. In the end I gave up and shared a few more than usual!
You Shouldn’t Have to Be Good at Your Job (GEN) — “This is how the 1% justifies itself. They are not simply the best in terms of income, but in terms of humanity itself. They’re the people who get invited into the escape pods when the mega-asteroid is about to hit. They don’t want a fucking thing to do with the rest of the population and, in fact, they have exploited global economic models to suss out who deserves to be among them and who deserves to be obsolete. And, thanks to lax governments far and wide, they’re free to practice their own mass experiments in forced Darwinism. You currently have the privilege of witnessing a worm’s-eye view of this great culling. Fun, isn’t it?”
We’ve spent the decade letting our tech define us. It’s out of control (The Guardian) — “There is a way out, but it will mean abandoning our fear and contempt for those we have become convinced are our enemies. No one is in charge of this, and no amount of social science or monetary policy can correct for what is ultimately a spiritual deficit. We have surrendered to digital platforms that look at human individuality and variance as “noise” to be corrected, rather than signal to be cherished. Our leading technologists increasingly see human beings as a problem, and technology as the solution – and they use our behavior on their platforms as evidence of our essentially flawed nature.”
How headphones are changing the sound of music (Quartz) — “Another way headphones are changing music is in the production of bass-heavy music. Harding explains that on small speakers, like headphones or those in a laptop, low frequencies are harder to hear than when blasted from the big speakers you might encounter at a concert venue or club. If you ever wondered why the bass feels so powerful when you are out dancing, that’s why. In order for the bass to be heard well on headphones, music producers have to boost bass frequencies in the higher range, the part of the sound spectrum that small speakers handle well.”
The False Promise of Morning Routines (The Atlantic) — “Goat milk or no goat milk, the move toward ritualized morning self-care can seem like merely a palliative attempt to improve work-life balance. It makes sense to wake up 30 minutes earlier than usual because you want to fit in some yoga, an activity that you enjoy. But something sinister seems to be going on if you feel that you have to wake up 30 minutes earlier than usual to improve your well-being, so that you can also work 60 hours a week, cook dinner, run errands, and spend time with your family.”
Giant surveillance balloons are lurking at the edge of space (Ars Technica) — “The idea of a constellation of stratospheric balloons isn’t new—the US military floated the idea back in the ’90s—but technology has finally matured to the point that they’re actually possible. World View’s December launch marks the first time the company has had more than one balloon in the air at a time, if only for a few days. By the time you’re reading this, its other stratollite will have returned to the surface under a steerable parachute after nearly seven weeks in the stratosphere.”
The Unexpected Philosophy Icelanders Live By (BBC Travel) — “Maybe it makes sense, then, that in a place where people were – and still are – so often at the mercy of the weather, the land and the island’s unique geological forces, they’ve learned to give up control, leave things to fate and hope for the best. For these stoic and even-tempered Icelanders, þetta reddast is less a starry-eyed refusal to deal with problems and more an admission that sometimes you must make the best of the hand you’ve been dealt.”
What Happens When Your Career Becomes Your Whole Identity (HBR) — “While identifying closely with your career isn’t necessarily bad, it makes you vulnerable to a painful identity crisis if you burn out, get laid off, or retire. Individuals in these situations frequently suffer anxiety, depression, and despair. By claiming back some time for yourself and diversifying your activities and relationships, you can build a more balanced and robust identity in line with your values.”
Having fun is a virtue, not a guilty pleasure (Quartz) — “There are also, though, many high-status workers who can easily afford to take a break, but opt instead to toil relentlessly. Such widespread workaholism in part reflects the misguided notion that having fun is somehow an indulgence, an act of absconding from proper respectable behavior, rather than embracement of life.”
It’s Time to Get Personal (Laura Kalbag) — “As designers and developers, it’s easy to accept the status quo. The big tech platforms already exist and are easy to use. There are so many decisions to be made as part of our work, we tend to just go with what’s popular and convenient. But those little decisions can have a big impact, especially on the people using what we build.”
The 100 Worst Ed-Tech Debacles of the Decade (Hack Education) — “Oh yes, I’m sure you can come up with some rousing successes and some triumphant moments that made you thrilled about the 2010s and that give you hope for “the future of education.” Good for you. But that’s not my job. (And honestly, it’s probably not your job either.)”
Why so many Japanese children refuse to go to school (BBC News) — “Many schools in Japan control every aspect of their pupils’ appearance, forcing pupils to dye their brown hair black, or not allowing pupils to wear tights or coats, even in cold weather. In some cases they even decide on the colour of pupils’ underwear.”
The real scam of ‘influencer’ (Seth Godin) — “And a bigger part is that the things you need to do to be popular (the only metric the platforms share) aren’t the things you’d be doing if you were trying to be effective, or grounded, or proud of the work you’re doing.”
Have a quick skim through these links that I came across this week and found interesting:
Overrated: Ludwig Wittgenstein (Standpoint) — “Wittgenstein’s reputation for genius did not depend on incomprehensibility alone. He was also “tortured”, rude and unreliable. He had an intense gaze. He spent months in cold places like Norway to isolate himself. He temporarily quit philosophy, because he believed that he had solved all its problems in his 1922 Tractatus Logico-Philosophicus, and worked as a gardener. He gave away his family fortune. And, of course, he was Austrian, as so many of the best geniuses are.”
EdTech Resistance (Ben Williamson) — “We should not and cannot ignore these tensions and challenges. They are early signals of resistance ahead for edtech which need to be engaged with before they turn to public outrage. By paying attention to and acting on edtech resistances it may be possible to create education systems, curricula and practices that are fair and trustworthy. It is important not to allow edtech resistance to metamorphose into resistance to education itself.”
The Guardian view on machine learning: a computer cleverer than you? (The Guardian) — “The promise of AI is that it will imbue machines with the ability to spot patterns from data, and make decisions faster and better than humans do. What happens if they make worse decisions faster? Governments need to pause and take stock of the societal repercussions of allowing machines over a few decades to replicate human skills that have been evolving for millions of years.”
A nerdocratic oath (Scott Aaronson) — “I will never allow anyone else to make me a cog. I will never do what is stupid or horrible because “that’s what the regulations say” or “that’s what my supervisor said,” and then sleep soundly at night. I’ll never do my part for a project unless I’m satisfied that the project’s broader goals are, at worst, morally neutral. There’s no one on earth who gets to say: “I just solve technical problems. Moral implications are outside my scope”.”
Privacy is power (Aeon) — “The power that comes about as a result of knowing personal details about someone is a very particular kind of power. Like economic power and political power, privacy power is a distinct type of power, but it also allows those who hold it the possibility of transforming it into economic, political and other kinds of power. Power over others’ privacy is the quintessential kind of power in the digital age.”
The Symmetry and Chaos of the World’s Megacities (WIRED) — “Koopmans manages to create fresh-looking images by finding unique vantage points, often by scouting his locations on Google Earth. As a rule, he tries to get as high as he can—one of his favorite tricks is talking local work crews into letting him shoot from the cockpit of a construction crane.”
Dreaming of a day when we can drop the e from elearning and the m from mobile learning & just crack on.
Last week, I noticed that Stephen Downes, in reply to Scott Leslie on Mastodon, had mentioned that he didn’t even think that ‘e-learning’ or ‘edtech’ was really a thing any more, so perhaps Craig dropping that from his bio was symptomatic of a wider shift?
I’m not sure anyone has any status in online learning any more. I’m wondering, maybe it’s not even a discipline any more. There’s learning analytics and open pedagogy and experience design, etc., but I’m not sure there’s a cohesive community looking at what we used to call ed tech or e-learning.
His comments were part of a thread, so I didn’t want to take them out of context. However, Stephen has subsequently written his own post about it, so it’s obviously something on his mind.
Reflecting on what he covers in OLDaily, he notes that, while everything definitely falls within something broadly called ‘educational technology’, there are very few people working at that meta level — unlike, say, ten years ago:
[I]n 2019 there’s no community that encompasses all of these things. Indeed, each one of these topics has not only blossomed its own community, but each one of these communities is at least as complex as the entire field of education technology was some twenty years ago. It’s not simply that change is exponential or that change is moving more and more rapidly, it’s that change is combinatorial – with each generation, the piece that was previously simple gets more and more complex.
I think Stephen’s got what Venkatesh Rao might deem an ‘elder blog’:
The concept is derived from the idea of an elder game in gaming culture — a game where most players have completed a full playthrough and are focusing on second-order play.
In other words, Stephen has spent a long time exploring and mapping the emerging territory. What’s happening now, it could be argued, is that new infrastructure is being built on the same territory.
So, to continue the metaphor, a new community springs up around a new bridge or tunnel, but it’s not so different from what went before. It’s more convenient, maybe, and perhaps increases capacity, but it doesn’t really change the overall landscape.
So what is the value of OLDaily? I don’t know. In one sense, it’s the same value it always had – it’s a place for me to chronicle all those developments in my field, so I have a record of them, and am more likely to remember them. And I think it’s a way – as it always has been – for people who do look at the larger picture to stay literate. Not literate in the sense of “I could build an educational system from scratch” but literate in the sense of “I’ve heard that term before, and I know it refers to this part of the field.”
I find Stephen’s work invaluable. Along with the likes of Audrey Watters and Martin Weller, we need wise voices guiding us — whether or not we decide to call what we’re doing ‘edtech’.
As many people will be aware, the Open University (OU) is going through a pretty turbulent time in its history. As befitting the nature of the institution, a lot of conversations about its future are happening in public spaces.
Martin Weller, a professor at the university, has been vocal. In this post, a response to a keynote from Tony Bates, he offers a way forward.
I would like to… propose a new role: Sensible Ed Tech Advisor. Job role is as follows:
Ability to offer practical advice on adoption of ed tech that will benefit learners
Strong BS detector for ed tech hype
Interpreter of developing trends for particular context
Understanding of the intersection of tech and academic culture
Communicating benefits of any particular tech in terms that are valuable to educators and learners
Appreciation of ethical and social impact of ed tech
(Lest that sound like I’m creating a job description for myself, I didn’t add “interest in ice hockey” at the end, so you can tell that it isn’t)
Weller notes that Bates mentioned in his post-keynote write-up that the OU has a “fixation on print as the ‘core’ medium/technology”. He doesn’t think that’s correct.
I’m interested in this, because the view of an institution is formed not only by the people inside it, but by the press and those who have an opinion and an audience. Weller accuses Bates of being woefully out of date. I think he’s correct to call him out on it, as I’ve witnessed recently a whole host of middle-aged white guys lazily referencing things in presentations they haven’t bothered to research very well.
It is certainly true that some disciplines do have a print preference, and Tony is correct to say that often a print mentality is transferred to online. But what this outdated view (it was probably true 10-15 years ago) suggests is a ‘get digital or else’ mentality. Rather, I would argue, we need to acknowledge the very good digital foundation we have, but find ways to innovate on top of this.
If you are fighting an imaginary analogue beast, then this becomes difficult. For instance, Tony does rightly highlight how we don’t make enough use of social media to support students, but then ignores that there are pockets of very good practice, for example the OU PG Education account and the use of social media in the Cisco courses. Rolling these out across the university is not simple, but it is the type of project that we know how to realise. But by framing the problem as one of wholesale structural, cultural change starting from a zero base, it makes achieving practical, implementable projects difficult. You can’t do that small(ish) thing until we’ve done these twenty big things.
We seem to be living at a time when those who were massive, uncritical boosters of technology in education (and society in general) are realising the errors of their ways. I actually wouldn’t count Weller as an uncritical booster, but I welcome the fact that he is self-deprecating enough to include himself in that crowd.
I would also suggest that the sort of “get on the ed tech bus or else” argument that Tony puts forward is outdated, and ineffective (I’ve been guilty of it myself in the past). And as Audrey Watters highlights tirelessly, an unsceptical approach to ed tech is problematic for many reasons. Far more useful is to focus on specific problems staff have, or things they want to realise, than suggest they just ‘don’t get it’. Having an appreciation for this intersection between ed tech (coming from outside the institution and discipline often) and the internal values and culture is also an essential ingredient in implementing any technology successfully.
This is a particularly interesting time in the history of technology in education and society. I’m glad that conversations like this are happening in the open.
I’ve been to the Bett Show (formerly known as BETT, which is how the author refers to it in this article) in many different guises. I’ve been as a classroom teacher, a school senior leader, a researcher in Higher Education, in various roles at Mozilla, as a consultant, and now in my role at Moodle.
I go because it’s free, and because it’s a good place to meet up with people I see rarely. While I’ve changed and grown up, the Bett Show is still much the same. As Junaid Mubeen, the author of this article, notes:
The BETT show is emblematic of much that EdTech gets wrong. No show captures the hype of educational technology quite like the world’s largest education trade show. This week marked my fifth visit to BETT at London’s Excel arena. True to form, my two days at the show left me feeling overwhelmed with the number of products now available in the EdTech market, yet utterly underwhelmed with the educational value on offer.
It’s laughable, it really is. I saw all sorts of tat while I was there. I heard that a decent-sized stand can set you back around a million pounds.
One senses from these shows that exhibitors are floating from one fad to the next, desperately hoping to attach their technological innovations to education. In this sense, the EdTech world is hopelessly predictable; expect blockchain applications to emerge in not-too-distant future BETT shows.
But of course. I felt particularly sorry this year for educators I know who were effectively sales reps for the companies they’ve gone to work for. I spent about five hours there, wandering, talking, and catching up with people. I can only imagine the horror of being stuck there for four days straight.
I like the questions Mubeen comes up with. However, the edtech companies are playing a different game. While there’s some interest in pedagogical development, for most of them it’s just another vertical market.
In the meantime, there are four simple questions every self-professed education innovator should demand of themselves:
What is your pedagogy? At the very least, can you list your educational goals?
What does it mean for your solution to work and how will this be measured in a way that is meaningful and reliable?
How are your users supported to achieve their educational goals after the point of sale?
How do your solutions interact with other offerings in the marketplace?
Somewhat naïvely, the author says that he looks forward to the day when exhibitors are selected “not on their wallet size but on their ability to address these foundational questions”. As there’s a for-profit company behind Bett, I think he’d better not hold his breath.
Audrey Watters is delightfully blunt about the New Media Consortium, known for their regular ‘Horizon reports’, shutting down:
While I am sad for all the NMC employees who lost their jobs, I confess: I will not mourn an end to the Horizon Report project. (If we are lucky enough, that is, that it actually goes away.) I do not think the Horizon Report is an insightful or useful tool. Sorry. I recognize some people really love to read it. But perhaps part of the problem that education technology faces right now – as an industry, as a profession, what have you – is that many of its leaders believe that the Horizon Report is precisely that. Useful. Insightful.