There are major issues of transparency and authenticity here because the beliefs and opinions don’t actually belong to the digital models, they belong to the models’ creators. And if the creators can’t actually identify with the experiences and groups that these models claim to belong to (i.e., person of color, LGBTQ, etc.), then do they have the right to actually speak on those issues? Or is this a new form of robot cultural appropriation, one in which digital creators are dressing up in experiences that aren’t theirs?
Sinead Bovell (Vogue)
This is an incredible article that looks at machine learning and AI through the lens of an industry I hadn’t thought of as being on the brink of being massively disrupted by technology.
It is strange that “cancel culture” has become a project of the left, which spent the 20th century fighting against capricious firings of “troublesome” employees. A lack of due process does not become a moral good just because you sometimes agree with its targets. We all, I hope, want to see sexism, racism, and other forms of discrimination decrease. But we should be aware of the economic incentives here, particularly given the speed of social media, which can send a video viral, and see onlookers demand a response, before the basic facts have been established. Afraid of the reputational damage that can be incurred in minutes, companies are behaving in ways that range from thoughtless and uncaring to sadistic.
If you care about progressive causes, then woke capitalism is not your friend. It is actively impeding the cause, siphoning off energy, and deluding us into thinking that change is happening faster and deeper than it really is. When people talk about the “excesses of the left”—a phenomenon that blights the electoral prospects of progressive parties by alienating swing voters—in many cases they’re talking about the jumpy overreactions of corporations that aren’t left-wing at all.
Helen Lewis (The Atlantic)
Cancel culture is problematic, mainly because of the unequal power structures involved. This is an important read. See also this article by Albert Wenger, which has some suggestions towards the end in this regard.
The goal of productivity is to get the things you have to get done finished so you can spend more time on the things you want to do. Don’t fall into the busy trap, where you judge your self-worth by how productive you are or how much you’ve contributed to your company or manager. We’re all just trying to keep our heads above water. I hope these tips will help you do the same.
Alan Henry (WIRED)
As I wrote yesterday on my personal blog, I have a bit of an issue with perfectionism. So this, along with the other great advice in the article, was a timely reminder.
If you treat somebody with disdain, of course, you give that person a psychological incentive to diminish your opinion and to want you to be less powerful. Inversely, if you demonstrate understanding and appreciation of someone’s contribution, you create a psychological incentive in the individual to give greater weight to your opinion. And that person will want to strengthen the weight of your opinion in the eyes of others. Appreciation and gratitude breed appreciation and gratitude.
Bruce Tulgan (Fast Company)
Creating a productive, psychologically safe, and emotionally intelligent environment means thanking people for the work they do. That means for their day-to-day activities, not just when they put in a herculean effort. A paycheck is not thanks enough for the work we do and the value we provide.
More interesting still is that nostalgia can bring to mind time-periods we didn’t directly experience. In the film Midnight in Paris (2011), Gil is overwhelmed by nostalgic thoughts about 1920s Paris – which he, a modern-day screenwriter, hasn’t experienced – yet his feelings are nothing short of nostalgic. Indeed, feeling nostalgic for a time one didn’t actually live through appears to be a common phenomenon if all the chatrooms, Facebook pages and websites dedicated to it are anything to go by. In fact, a new word has been coined to capture this precise variant of nostalgia – anemoia, defined by the Urban Dictionary and the Dictionary of Obscure Sorrows as ‘nostalgia for a time you’ve never known’.
How can we make sense of the fact that people feel nostalgia not only for past experiences but also for generic time periods? My suggestion, inspired by recent evidence from cognitive psychology and neuroscience, is that the variety of nostalgia’s objects is explained by the fact that its cognitive component is not an autobiographical memory, but a mental simulation – an imagination, if you will – of which episodic recollections are a sub-class.
Nigel Warburton (Aeon)
In the UK at least, shows like Downton Abbey and Call The Midwife are popular. My view, which this article would seem to support, is that it’s a kind of nostalgia for a time that was imagined to be better.
There’s a sinister side to this, as well. This kind of nostalgia seems to be particularly prevalent among more conservative-leaning (white) people harking back to a time of greater divisions in society along race and class lines. I think it’s rather disturbing.
Quiet Parks International (QPI) is a nonprofit working to establish certification for quiet parks to raise awareness of and preserve quiet places. The fledgling organization—whose members include audio engineers, scientists, environmentalists, and musicians—has identified at least 262 sites worldwide, including 30 in the US, that it believes are quiet or could become so with management changes….
QPI has no regulatory authority, but like the International Dark Sky Association’s Dark Sky Parks initiative, the nonprofit believes its certification—granted only after a detailed, three-day sound analysis—can encourage public support of preservation efforts and provide guidelines for protection. “The places that are quiet today … are basically leftovers—places that are out of the way,” Quiet Parks cofounder Gordon Hempton says.
Jenny Morber (WIRED)
I live in a part of the world close to both a designated Dark Sky Park and mountains into which I can escape. Light and noise pollution threaten both of them, so I’m glad to hear of these efforts.
To Bharat Mediratta, chief technology officer at Dropbox, the quarantine experience has highlighted a huge gap in the market. “What we have right now is a bunch of different productivity and collaboration tools that are stitched together. So I will do my product design in Figma, and then I will submit the code change on GitHub, I will push the product out live on AWS, and then I will communicate with my team using Gmail and Slack and Zoom,” he says. “We have all that technology now, but we don’t yet have the ‘digital knowledge worker operating system’ to bring it all together.”
OK, so this is a sponsored post by Dropbox on the WIRED website, but what it highlights is interesting. For example, Monday.com (which our co-op uses) rebranded itself a few months ago as a ‘Work OS’. There’s definitely a lot of money to be made for whoever manages to build an integrated solution, although I think we’re a long way off something which is flexible enough for every use case.
Today, I don’t define success the way that I did when I was younger. I don’t measure it in copies sold or dollars earned. I measure it in what my days look like and the quality of my creative expression: Do I have time to write? Can I say what I think? Do I direct my schedule or does my schedule direct me? Is my life enjoyable or is it a chore?
Tim Ferriss has this question he asks podcast guests: “If you could have a gigantic billboard anywhere with anything on it what would it say and why?” I feel like the title of this blog post is one of the answers I would give to that question.
We are a small group of volunteers who met as members of the Higher Ed Learning Collective. We were inspired by the initial demand, and the idea of self-study, interracial groups. The initial decision to form this initiative is based on the myriad calls from people of color for white-bodied people to do internal work. To do the work, we are developing a space for all individuals to read, share, discuss, and interrogate perspectives on race, racism, anti-racism, identity in an educational setting. To ensure that the fight continues for justice, we need to participate in our own ongoing reflection of self and biases. We need to examine ourselves, ask questions, and learn to examine our own perspectives. We need to get uncomfortable in asking ourselves tough questions, with an understanding that this is a lifelong, ongoing process of learning.
This is a fantastic resource for people who, like me, are going on a learning journey at the moment. I’ve found the podcast Seeing White by Scene on Radio particularly enlightening, and at times mind-blowing. Also, the Netflix documentary 13th is excellent, and available on YouTube.
If we put a small amount of time into caring for our gadgets, they can last indefinitely. We’d also be doing the world a favor. By elongating the life of our gadgets, we put more use into the energy, materials and human labor invested in creating the product.
Brian X. Chen (The New York Times)
This is a pretty surface-level article that basically suggests people take their smartphone to a repair shop instead of buying a new one. What it doesn’t mention is that aftermarket operating systems such as the Android-based LineageOS can extend the lifetime of smartphones by providing security updates long beyond those provided by vendors.
EncroChat sold customized Android handsets with GPS, camera, and microphone functionality removed. They were loaded with encrypted messaging apps as well as a secure secondary operating system (in addition to Android). The phones also came with a self-destruct feature that wiped the device if you entered a particular PIN.
The service had customers in 140 countries. While it was billed as a legitimate platform, anonymous sources told Motherboard that it was widely used among criminal groups, including drug trafficking organizations, cartels, and gangs, as well as hitmen and assassins.
EncroChat didn’t become aware that its devices had been breached until May after some users noticed that the wipe function wasn’t working. After trying and failing to restore the features and monitor the malware, EncroChat cut its SIM service and shut down the network, advising customers to dispose of their devices.
Monica Chin (The Verge)
It goes without saying that I don’t want assassins, drug traffickers, and mafia types to be successful in life. However, I’m always a little concerned when there are attacks on encryption, as they’re compromising systems also potentially used by protesters, activists, and those who oppose the status quo.
The findings demonstrate how common it is for dialog in TV shows and other sources to produce false triggers that cause the devices to turn on, sometimes sending nearby sounds to Amazon, Apple, Google, or other manufacturers. In all, researchers uncovered more than 1,000 word sequences—including those from Game of Thrones, Modern Family, House of Cards, and news broadcasts—that incorrectly trigger the devices.
“The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans,” one of the researchers, Dorothea Kolossa, said. “Therefore, they are more likely to start up once too often rather than not at all.”
Dan Goodin (Ars Technica)
As anyone with voice assistant-enabled devices in their home will testify, the number of times they accidentally spin up or misunderstand what you’re saying can be amusing. But we can and should be wary of what’s being listened to, and why.
The Five Levels of Remote Work — and why you’re probably at Level 2
Effective written communication becomes critical the more companies embrace remote work. With an aversion to ‘jumping on calls’ at a whim, and a preference for asynchronous communication… [most] communications [are] text-based, and so clear and timely articulation becomes key.
Steve Glaveski (The Startup)
This is from March and pretty clickbait-y, but everyone wants to know how they can improve – especially if they didn’t work remotely before the pandemic. My experience is that most people are actually at Level 3 and, of course, I’d say that my co-op colleagues and I are at Level 5, given our experience…
All mammals, including us, breathe in through the same opening that we breathe out. Can you imagine if our digestive system worked the same way? What if the food we put in our mouths, after digestion, came out the same way? It doesn’t bear thinking about! Luckily, for digestion, we have a separate in and out. And that’s what the birds have with their lungs: an in point and an out point. They also have air sacs and hollow spaces in their bones. When they breathe in, half of the good air (with oxygen) goes into these hollow spaces, and the other half goes into their lungs through the rear entrance. When they breathe out, the good air that has been stored in the hollow places now also goes into their lungs through that rear entrance, and the bad air (carbon dioxide and water vapor) is pushed out the front exit. So it doesn’t matter whether birds are breathing in or out: Good air is always going in one direction through their lungs, pushing all the bad air out ahead of it.
Walter Murch (Nautilus)
Incredible. Birds are badass (and also basically dinosaurs).
In the many essays of his life he discovered the importance of the moderate life. In his final essay, “On Experience,” Montaigne reveals that “greatness of soul is not so much pressing upward and forward as knowing how to circumscribe and set oneself in order.” What he finds, quite simply, is the importance of the moderate life. We must then, he writes, “compose our character, not compose books.” There is nothing paradoxical about this because his literary essays helped him better essay his life. The lesson he takes from this trial might be relevant for our own trial: “Our great and glorious masterpiece is to live properly.”
Robert Zaretsky (The New York Times)
Every week, Bryan Alexander replies to the weekly Thought Shrapnel newsletter. Last week, he sent this article to both me and Chris Lott (who produces the excellent Notabilia).
Developing Questions
“(And) what kind of X (is that X)?”
“(And) is there anything else about X?”
“(And) where is X?” or “(And) whereabouts is X?”
“(And) that’s X like what?”
“(And) is there a relationship between X and Y?”
“(And) when X, what happens to Y?”
Sequence and Source Questions
“(And) then what happens?” or “(And) what happens next?”
“(And) what happens just before X?”
“(And) where could X come from?”
Intention Questions
“(And) what would X like to have happen?”
“(And) what needs to happen for X?”
“(And) can X (happen)?”
The first two questions, “What kind of X (is that X)?” and “Is there anything else about X?”, are the most commonly used.
As a general guide, these two questions account for around 50% of the questions asked in a typical Clean Language session.
I had a great chat with Kristian Still this week, for the first time in about a decade. Kristian was part of EdTechRoundUp back in the day, and early EduTwitter. Among the many things we discussed is his enthusiasm for “clean questioning” which I’m going to investigate further.
Even our throwaway habits can add up to a mountain of carbon. Consider all the little social emails we shoot back and forth—“thanks,” “got it,” “lol.” The UK energy firm Ovo examined email usage and—using data from Lancaster University professor Mike Berners-Lee, who analyzes carbon footprints—they found that if every adult in the UK just sent one less “thank you” email per day, it would cut 16 tons of carbon each year, equal to 22 round-trip flights between New York and London. They also found that 49 percent of us often send thank-you emails to people “within talking distance.” We can lower our carbon output if we’d just take the headphones off for a minute and stop behaving like a bunch of Morlocks.
Clive Thompson (WIRED)
Small differences all add up. Our design choices and the decisions we make about technology all have a part to play in fighting climate change.
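Claims like this are easy to sanity-check with rough arithmetic. Here’s a minimal back-of-envelope sketch, assuming roughly 1 g of CO2e per short email (a figure often attributed to Berners-Lee’s earlier estimates) and around 55 million UK adults. Both figures are my assumptions, not numbers from the article:

```python
# Back-of-envelope check of the "one less thank-you email per day" claim.
GRAMS_CO2_PER_EMAIL = 1          # assumed: ~1 g CO2e for a short email
UK_ADULTS = 55_000_000           # assumed: rough UK adult population

emails_saved_per_year = UK_ADULTS * 365
tonnes_co2_saved = emails_saved_per_year * GRAMS_CO2_PER_EMAIL / 1_000_000

print(f"{tonnes_co2_saved:,.0f} tonnes CO2e per year")  # → 20,075 tonnes
```

Under these assumptions the result lands in the tens of thousands of tonnes rather than “16 tons”, which suggests the quoted figure either uses very different assumptions or lost some digits along the way. The point stands either way: per-email costs are tiny, but multiplied by a population and a year they become measurable.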
When you boil it down, neumorphism is a focus on how light moves in three-dimensional space. Its predecessor, skeuomorphism, created realism in digital interfaces by simulating textures on surfaces like felt on a poker table or the brushed metal of a tape recorder. An ancillary — though under-developed — aspect of this design style was lighting that interacted realistically with the materials that were being represented; this is why shadows and darkness were so prevalent in those early interfaces.
Jack Koloskus (Input)
The dominant design language over the last five years, without doubt, has been Google’s Material Design. Will neumorphism take over? It’s certainly an interesting approach.
He called on those in the tech industry to look at the bigger picture regarding their work and its implications beyond simply a project—and to think deeply and take a stronger stand with regards to who their labor actually serves.
“It’s not enough to read, it’s not enough to believe in something, it’s not enough to write something, you have to eventually stand for something if you want things to change,” he said.
Kevin Truong (Motherboard)
The tech industry is interesting as it’s relatively new and immature, at least in its current guise. As a result, the ethics and the checks and balances aren’t quite there yet.
To my mind, things like unions and professional associations show maturity and the kind of coming together that puts moral decisions on the whole sector, rather than on the shoulders of individuals.
[T]here is a narrative chasm between the twee and borderless dreamscape of fantasy Britain and actual, material Britain, where rents are rising and racists are running brave. The chasm is wide, and a lot of people are falling into it. The omnishambles of British politics is what happens when you get scared and mean and retreat into the fairytales you tell about yourself. When you can no longer live within your own contradictions. When you want to hold on to the belief that Britain is the land of Jane Austen and John Lennon and Sir Winston Churchill, the war hero who has been repeatedly voted the greatest Englishman of all time. When you want to forget that Britain is also the land of Cecil Rhodes and Oswald Mosley and Sir Winston Churchill, the brutal colonial administrator who sanctioned the building of the first concentration camps and condemned millions of Indians to death by starvation. These are not contradictions, even though the drive to separate them is cracking the country apart. If you love your country and don’t own its difficulties and its violence, you don’t actually love your country. You’re just catcalling it as it goes by.
Laurie Penny (Longreads)
I always find looking at my country through the lens of foreigners cringe-inducing. I suppose it’s a narrative produced for tourists but, sadly, we seem to have believed our own rhetoric, and look where it’s gotten us…
The idea that Big Tech can mold discourse through bypassing our critical faculties by spying on and analyzing us is both self-serving (inasmuch as it helps Big Tech sell ads and influence services) and implausible, and should be viewed with extreme skepticism.
But you don’t have to accept extraordinary claims to find ways in which Big Tech is distorting and degrading our public discourse. The scale of Big Tech makes it opaque and error-prone, even as it makes the job of maintaining a civil and productive space for discussion and debate impossible.
Cory Doctorow (EFF)
A tour de force from Doctorow, who eviscerates the companies that make up ‘Big Tech’ and the role they have in hollowing-out civic society.
About 60 artifacts have been radiocarbon dated, showing the Lendbreen pass was widely used from at least A.D. 300. “It probably served as both an artery for long-distance travel and for local travel between permanent farms in the valleys to summer farms higher in the mountains, where livestock grazed for part of the year,” says University of Cambridge archaeologist James Barrett, a co-author of the research.
Tom Metcalfe (Scientific American)
I love it when the scientific and history communities come together to find out new things about our past. Especially about the Vikings, who were straight-up amazing.
Confidential documents seen by Palatinate show that the University is planning “a radical restructure” of the Durham curriculum in order to permanently put online resources at the core of its educational offer, in response to the Covid-19 crisis and other ongoing changes in both national and international Higher Education.
The proposals seek to “invert Durham’s traditional educational model”, which revolves around residential study, replacing it with one that puts “online resources at the core enabling us to provide education at a distance.”
Jack Taylor & Tom Mitchell (Palatinate)
I’m paying attention to this as Durham University is one of my alma maters* but I think this is going to be a common story across a lot of UK institutions. They’ve relied for too long on the inflated fees brought in by overseas students and now, in the wake of the pandemic, need to rapidly find a different approach.
*I have a teaching qualification and two postgraduate degrees from Durham, despite a snooty professor telling me when I was 17 years old that I’d never get in to the institution 😅
Liu grew up a true believer in “meritocracy” and its corollaries: that success implies worth, and thus failure is a moral judgment about the intellect, commitment and value of the failed.
Her tale — starting in her girlhood bedroom and stretching all the way to protests outside of tech giants in San Francisco — traces a journey of maturity and discovery. As Liu confronts the mounting evidence that her life’s philosophy is little more than the self-serving rhetoric of rich people defending their privilege, the chasm between her lived experience and her guiding philosophy widens until she can no longer straddle it.
Cory Doctorow (Boing Boing)
This book is next on my non-fiction reading list. If your library is closed and doesn’t have an online service, try this.
You want workers to post work as it’s underway—even when it’s rough, incomplete, imperfect. That requires a different mindset, though one that’s increasingly common in asynchronous companies. In traditional companies, people often hesitate to circulate projects or proposals that aren’t polished, pretty, and bullet-proofed. It’s a natural reflex, especially when people are disconnected from each other and don’t communicate casually. But it can lead to long delays, especially on projects in which each participant’s progress depends on the progress and feedback of others. Location-independent companies need a culture in which people recognize that a work-in-progress is likely to have gaps and flaws and don’t criticize each other for them. This is an issue of norms, not tools.
Edmund L. Andrews, Stanford University (Futurity)
I discovered this via Stephen Downes, who highlights the fifth point in this article (‘single source of truth’). I’ve actually highlighted the sixth one (‘breaking down the barriers to sharing work’) as I’ve also seen that as an important thing to check for when hiring.
The level of interest in the coronavirus pandemic – and the fear and uncertainty that comes with it – has caused tired, fringe conspiracy theories to be pulled into the mainstream. From obscure YouTube channels and Facebook pages, to national news headlines, baseless claims that 5G causes or exacerbates coronavirus are now having real-world consequences. People are burning down 5G masts in protest. Government ministers and public health experts are now being forced to confront this dangerous balderdash head-on, giving further oxygen and airtime to views that, were it not for the major technology platforms, would remain on the fringe of the fringe. “Like anti-vax content, this messaging is spreading via platforms which have been designed explicitly to help propagate the content which people find most compelling; most irresistible to click on,” says Smith from Demos.
James Temperton (WIRED)
The disinformation and plain bonkers-ness around this ‘theory’ of linking 5G and the coronavirus is a particularly difficult thing to deal with. I’ve avoided talking about it on social media as well as here on Thought Shrapnel, but I’m sharing this as it’s a great overview of how these things spread — and who’s fanning the flames.
The COVID-19 pandemic is an unprecedented moment in the history of social structures such as education. After all of the time spent creating emergency plans and three- or five-year road maps that include fail-safe options, we find ourselves in the actual emergency. Yet not even a month into global orders of shelter in place, there are many education narratives attempting to frame the pandemic as an opportunity. Extreme situations can certainly create space for extraordinary opportunities, but that viewpoint is severely limited considering this moment in time. Perhaps if the move to distance/online/remote education had happened in a vacuum that did not involve a global pandemic, millions sick, tens of thousands dead, tens of millions unemployed, hundreds of millions hungry, billions anxious and uncertain of society’s next step…perhaps then this would be that opportunity moment. Instead, we have a global emergency where the stress is felt everywhere but it certainly is not evenly distributed, so learning/aligning/deploying/assessing new technology for the classroom is not universally feasible. You can’t teach someone to swim while they’re drowning.
Rolin Moe is a thoughtful commentator on educational technology. This post was obviously written quickly (note the typo in the URL when you click through, as well as some slightly awkward language) and I’m not a fan of the title Moe has settled on. That being said, the point about this not being an ‘opportunity’ for edtech is a good one.
Produced in March, the memo explained how an NHS app could work, using Bluetooth LE, a standard feature that runs constantly and automatically on all mobile devices, to take “soundings” from other nearby phones through the day. People who have been in sustained proximity with someone who may have Covid-19 could then be warned and advised to self–isolate, without revealing the identity of the infected individual.
However, the memo stated that “more controversially” the app could use device IDs, which are unique to all smartphones, “to enable de-anonymisation if ministers judge that to be proportionate at some stage”. It did not say why ministers might want to identify app users, or under what circumstances doing so would be proportionate.
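The memo doesn’t specify a protocol, but the decentralised contact-tracing designs debated at the time work roughly like the toy sketch below. All names and data structures here are mine, not from the memo; the controversial de-anonymisation step would amount to swapping the rotating random IDs for stable device IDs.

```python
import secrets
from dataclasses import dataclass, field

@dataclass
class Phone:
    # Toy model: each phone broadcasts rotating random IDs over Bluetooth
    # and keeps a local log of IDs it has heard nearby (the "soundings").
    sent_ids: list = field(default_factory=list)
    heard_ids: set = field(default_factory=set)

    def broadcast(self) -> bytes:
        rid = secrets.token_bytes(16)  # random, so not linkable to a device
        self.sent_ids.append(rid)
        return rid

    def hear(self, rid: bytes) -> None:
        self.heard_ids.add(rid)

def is_exposed(phone: Phone, published_ids: list) -> bool:
    # After a positive test, the infected phone's broadcast IDs are
    # published; every other phone checks its own log locally, so the
    # infected individual's identity is never revealed.
    return any(rid in phone.heard_ids for rid in published_ids)

# Alice and Bob are in sustained proximity; Carol is elsewhere.
alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.broadcast())

# Alice tests positive and publishes the IDs she broadcast.
print(is_exposed(bob, alice.sent_ids))    # → True
print(is_exposed(carol, alice.sent_ids))  # → False
```

Because the broadcast IDs are random and rotate, knowing that you heard one tells you only that you were near *someone* infected. Replacing them with a phone’s unique device ID, as the memo floats, would let a central authority map soundings back to individuals.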
It’s hard to think of a job title more pandemic-proof than “superstar live streamer.” While the coronavirus has upended the working lives of hundreds of millions of people, Dr. Lupo, as he’s known to acolytes, has a basically unaltered routine. He has the same seven-second commute down a flight of stairs. He sits in the same seat, before the same configuration of lights, cameras and monitors. He keeps the same marathon hours, starting every morning at 8.
Social distancing? He’s been doing that since he went pro, three years ago.
For 11 hours a day, six days a week, he sits alone, hunting and being hunted on games like Call of Duty and Fortnite. With offline spectator sports canceled, he and other well-known gamers currently offer one of the only live contests that meet the standards of the Centers for Disease Control and Prevention.
David Segal (The New York Times)
It’s hard to argue with my son these days when he says he wants to be a ‘pro gamer’.
(a quick tip for those who want to avoid ‘free registration’ and some paywalls — use a service like Pocket to save the article and read it there)
To be clear, socialism may be a better way to go, as evidenced by the study showing 4 of the 5 happiest nations are socialist democracies. However, unless we’re going to provide universal healthcare and universal pre-K, let’s not embrace The Hunger Games for the working class on the way up, and the Hallmark Channel for the shareholder class on the way down. The current administration, the wealthy, and the media have embraced policies that bless the caching of power and wealth, creating a nation of brittle companies and government agencies.
A somewhat rambling post, but one which explains the difference between a form of capitalism that (theoretically) allows everyone to flourish, and crony capitalism, which doesn’t.
Weird Internet Careers are the kinds of jobs that are impossible to explain to your parents, people who somehow make a living from the internet, generally involving a changing mix of revenue streams. Weird Internet Career is a term I made up (it had no google results in quotes before I started using it), but once you start noticing them, you’ll see them everywhere.
Gretchen McCulloch (All Things Linguistic)
I love this phrase, which I came across via Dan Hon’s newsletter. This is the first in a whole series of posts, which I am yet to explore in its entirety. My aim in life is now to make my career progressively more (internet) weird.
The Outdoor Foundation’s 2019 Outdoor Participation Report showed that while a bit more than half of Americans went outside to play at least once in 2018, nearly half did not go outside for recreation at all. Americans went on 1 billion fewer outdoor outings in 2018 than they did in 2008. The number of adolescents ages 6 to 12 who recreate outdoors has fallen four years in a row, dropping more than 3% since 2007.
The number of outings for kids has fallen 15% since 2012. The number of moderate outdoor recreation participants declined, and only 18% of Americans played outside at least once a week.
Jason Blevins (The Colorado Sun)
One of Bruce Willis’ lesser-known films is Surrogates (2009). It’s a short, pretty average film with a really interesting central premise: most people stay at home and send their surrogates out into the world. Over a decade after the film was released, a combination of things (including virulent viruses, screen-focused leisure time, and safety fears) seem to suggest it might be a predictor of our medium-term future.
It’s also telling when you think about what lengths companies have had to go through to make the EU versions of their sites different. Complying with GDPR has not been cheap. Any online business could choose to follow GDPR by default across all regions and for all visitors. It would certainly simplify things. They don’t, though. The amount of money in data collection is too big.
Jill Duffy (OneZero)
This is a strangely titled article, but a decent explainer on what the web looks and feels like to those outside the EU. The author is spot-on when she talks about how GDPR and the recent California Privacy Law could be applied everywhere, but they’re not. Because surveillance capitalism.
The belief that privacy is private has left us careening toward a future that we did not choose, because it failed to reckon with the profound distinction between a society that insists upon sovereign individual rights and one that lives by the social relations of the one-way mirror. The lesson is that privacy is public — it is a collective good that is logically and morally inseparable from the values of human autonomy and self-determination upon which privacy depends and without which a democratic society is unimaginable.
Shoshana Zuboff (The New York Times)
I fear that the length of Zuboff’s (excellent) book on surveillance capitalism, her use of terms in this article such as ‘epistemic inequality’, and the subtlety of her arguments, may mean that she’s preaching to the choir here.
The next time you snap a photo together at the park or a restaurant, try asking your child if it’s all right that you post it to social media. Use the opportunity to talk about who can see that photo and show them your privacy settings. Or if a news story about the algorithms on YouTube comes on television, ask them if they’ve ever been directed to a video they didn’t want to see.
Meghan Herbst (WIRED)
There’s some useful advice in this WIRED article, especially that given by my friend Ian O’Byrne. The difficulty I’ve found is when one of your kids becomes a teenager and companies like Google contact them directly telling them they can have full control of their accounts, should they wish…
One reason the best lack conviction, though, is time. They don’t have the time to get to the level of conviction they need, and it’s a knotty problem, because that level of care is precisely what makes their participation in the network beneficial. (In fact, when I ask people who have unintentionally spread misinformation why they did so, the most common answer I hear is that they were either pressed for time, or had a scarcity of attention to give to that moment)
But what if — and hear me out here — what if there was a way for people to quickly check whether linked articles actually supported the points they claimed to? Actually quoted things correctly? Actually provided the context of the original from which they quoted?
And what if, by some miracle, that function was shipped with every laptop and tablet, and available in different versions for mobile devices?
This super-feature actually exists already, and it’s called control-f.
Roll the animated GIF!
Mike Caulfield (Hapgood)
I find it incredible, but absolutely believable, that only around 10% of internet users know how to use Ctrl-F to find something within a web page. On mobile, it’s just as easy, as there’s an option within most (all?) browsers to ‘search within page’. I like Mike’s work, as not only is it academic, it’s incredibly practical.
The MicroBachelors also mark a continued shift for EdX, which made its name as one of the first MOOC providers, to a wider variety of educational offerings.
In 2018, EdX announced several online master’s degrees with selective universities, including the Georgia Institute of Technology and the University of Texas at Austin.
Two years prior, it rolled out MicroMasters programs. Students can complete the series of graduate-level courses as a standalone credential or roll them into one of EdX’s master’s degrees.
That stackability was something EdX wanted to carry over into the MicroBachelors programs, Agarwal said. One key difference, however, is that the undergraduate programs will have an advising component, which the master’s programs do not.
Natalie Schwartz (Education Dive)
This is largely a rewritten press release with a few extra links, but I found it interesting as it’s a concrete example of a couple of things. First, the ongoing shift in Higher Education towards students-as-customers. Second, the viability of microcredentials as a ‘stackable’ way to build a portfolio of skills.
Note that, as a graduate of degrees in the Humanities, I’m not saying this approach can be used for everything, but for those using Higher Education as a means to an end, this is exactly what’s required.
Today, I still trust Google to not allow business dealings to affect the rankings of its organic results, but how much does that matter if most people can’t visually tell the difference at first glance? And how much does that matter when certain sections of Google, like hotels and flights, do use paid inclusion? And how much does that matter when business dealings very likely do affect the outcome of what you get when you use the next generation of search, the Google Assistant?
Dieter Bohn (The Verge)
I’ve used DuckDuckGo as my go-to search engine for years now. It used to be that I’d have to switch to Google for around 10% of my searches. That’s now down to zero.
One of the toughest situations for a product manager is when they spot a brewing ethical issue, but they’re not sure how they should handle the situation. Clearly this is going to be sensitive, and potentially emotional. Our best answer is to discover a solution that does not have these ethical concerns, but in some cases you won’t be able to, or may not have the time.
I rarely encourage people to leave their company, however, when it comes to those companies that are clearly ignoring the ethical implications of their work, I have and will continue to encourage people to leave.
Marty Cagan (SVPG)
As someone with a sensitive radar for these things, I’ve chosen to work with ethical people and for ethical organisations. As Cagan says in this post, if you’re working for a company that ignores the ethical implications of their work, then you should leave. End of story.
Critics may say that it is unreasonable to expect maps to reflect the communities or achievements of nonhumans. Maps are made by humans, for humans. When beavers start Googling directions to a neighbor’s dam, then their homes can be represented! For humans who use maps solely to navigate—something that nonhumans do without maps—man-made roads are indeed the only features that are relevant. Following a map that includes other information may inadvertently lead a human onto a trail made by and for deer.
But maps are not just tools to get from points A to B. They also relay new and learned information, document evolutionary changes, and inspire intrepid exploration. We operate on the assumption that our maps accurately reflect what a visitor would find if they traveled to a particular area. Maps have immense potential to illustrate the world around us, identifying all the important features of a given region. By that definition, the current maps that most humans use fall well short of being complete. Our definition of what is “important” is incredibly narrow.
Ryan Huling (WIRED)
Cartography is an incredibly powerful tool. We’ve known for a long time that “the map is not the territory” but perhaps this is another weapon in the fight against climate change and the decline in diversity of species?
Then you get people like Greenwald, Assange, Manning and Snowden. They are polarizing figures. They are loved or hated. They piss people off.
They piss people off precisely because they have principles they consider non-negotiable. They will not do the easy thing when it matters. They will not compromise on anything that really matters.
That’s breaking the actual social contract of “go along to get along”, “obey authority” and “don’t make people uncomfortable.” I recently talked to a senior activist who was uncomfortable even with the idea of yelling at powerful politicians. It struck them as close to violence.
So here’s the thing, people want men and women of principle to be like ordinary people.
They aren’t. They can’t be. If they were, they wouldn’t do what they do. Much of what you may not like about a Greenwald or Assange or Manning or Snowden is why they are what they are. Not just the principle, but the bravery verging on recklessness. The willingness to say exactly what they think, and do exactly what they believe is right even if others don’t.
Activists like Greta Thunberg and Edward Snowden are the closest we get to superheroes, to people who stand for the purest possible version of an idea. This is why we need them — and why we’re so disappointed when they turn out to be human after all.
Students’ not comprehending the value of engaging in certain ways is more likely to be a failure in our teaching than their willingness to learn (especially if we create a culture in which success becomes exclusively about marks and credentialization). The question we have to ask is if what we provide as ‘university’ goes beyond the value of what our students can engage with outside of our formal offer.
This is a great post by Dave, who I had the pleasure of collaborating with briefly during my stint at Jisc. I definitely agree that any organisation walks a dangerous path when it becomes overly-fixated on the ‘how’ instead of the ‘what’ and the ‘why’.
The power of an epigram or one of these expressions is that they say a lot with a little. They help guide us through the complexity of life with their unswerving directness. Each person must, as the retired USMC general and former Secretary of Defense Jim Mattis, has said, “Know what you will stand for and, more important, what you won’t stand for.” “State your flat-ass rules and stick to them. They shouldn’t come as a surprise to anyone.”
Of the 11 expressions here, I have to say that other than memento mori (“remember you will die”) I particularly like semper anticus (“always forward”) which I’m going to print out in a fancy font and stick on the wall of my home office.
In a hypothetical world, you could get a Discord (or whatever is next) link for your new job tomorrow – you read some wiki and meta info, sort yourself into your role, and then are grouped with the people who you need to collaborate with on a need-be basis. All wrapped in one platform. Maybe you have an HR complaint – drop it in #HR where you can’t read the messages but they can, so it’s a blind one-way conversation. Maybe there is a #help channel, where you write your problems and the bot pings people who have expertise based on keywords. There’s a lot of things you can do with this basic design.
What is described in this post is a bit of a stretch, but I can see it: a world where work is organised a bit like how gamers organise themselves in chat channels. Something to keep an eye on, as the interplay between what’s ‘normal’ and what’s possible with communications technology changes and evolves.
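The keyword-routing bot imagined in the quote above is straightforward to sketch. Everything in the snippet below is invented for illustration (the names, the `EXPERTISE` table, and the `route_help_request` function are all hypothetical, not from any real chat platform's API), but it shows the basic mechanism: match the words in a message against people's declared areas of expertise and ping whoever overlaps.

```python
# Hypothetical sketch of a #help channel bot: each person declares
# topics they know about, and a message is routed to anyone whose
# declared expertise matches a word in the message.

EXPERTISE = {
    "alice": {"kubernetes", "docker"},
    "bob": {"payroll", "hr"},
    "carol": {"docker", "ci"},
}

def route_help_request(message: str) -> set:
    """Return the set of people to ping, based on keyword overlap."""
    # Normalise the message into a set of lowercase words,
    # stripping trailing punctuation.
    words = {w.strip(".,!?").lower() for w in message.split()}
    return {
        person
        for person, topics in EXPERTISE.items()
        if topics & words  # non-empty intersection means a match
    }

# A Docker question in CI pings both alice (docker) and carol (docker, ci)
print(route_help_request("My Docker build fails in CI"))
```

A real bot would obviously need smarter matching than exact keywords (stemming, synonyms, or embeddings), but the design point stands: the routing logic is trivial once people's roles are declared up front.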
What made the last decade so difficult is how education institutions let corporations control the definitions so that a lot of “study and ethical practice” gets left out of the work. With the promise of ease of use, low-cost, increased student retention (or insert unreasonable-metric-claim here), etc. institutions are willing to buy into technology without regard to accessibility, scalability, equity and inclusion, data privacy or student safety, in hope of solving problem X that will then get to be checked off of an accreditation list. Or worse, with the hope of not having to invest in actual people and local infrastructure.
Geoff Cain (Brainstorm in progress)
It’s nice to see a list of some positives that came out of the last decade, and for microcredentials and badging to be on that list.
First, let’s consider the canonized usages. The subreddit r/birbs defines a birb as any bird that’s “being funny, cute, or silly in some way.” Urban Dictionary has a more varied set of definitions, many of which allude to a generalized smallness. A video on the youtube channel Lucidchart offers its own expansive suggestions: All birds are birbs, a chunky bird is a borb, and a fluffed-up bird is a floof. Yet some tension remains: How can all birds be birbs if smallness or cuteness are in the equation? Clearly some birds get more recognition for an innate birbness.
Asher Elbein (Audubon magazine)
A fun article, but also an interesting one when it comes to ambiguity, affinity groups, and internet culture.
“Now, why would Gmail or Facebook pay us? Because what we’re giving them in return is not money but data. We’re giving them lots of data about where we go, what we eat, what we buy. We let them read the contents of our email and determine that we’re about to go on vacation or we’ve just had a baby or we’re upset with our friend or it’s a difficult time at work. All of these things are in our email that can be read by the platform, and then the platform’s going to use that to sell us stuff.”
Fiona Scott Morton (Yale School of Management) quoted by Peter Coy (Bloomberg Businessweek)
Regular readers of Thought Shrapnel know all about surveillance capitalism, but it’s good to see these explainers making their way to the more mainstream business press.
The most famous social credit system in operation is that used by China’s government. It “monitors millions of individuals’ behavior (including social media and online shopping), determines how moral or immoral it is, and raises or lowers their “citizen score” accordingly,” reported The Atlantic in 2018.
“Those with a high score are rewarded, while those with a low score are punished.” Now we know the same AI systems are used for predictive policing to round up Muslim Uighurs and other minorities into concentration camps under the guise of preventing extremism.
Violet Blue (Engadget)
Some (more prudish) people will write this article off because it discusses sex workers, porn, and gay rights. But the truth is that all kinds of censorship start with marginalised groups. To my mind, we’re already on a trajectory away from Silicon Valley and towards Chinese technology. Will we be able to separate the tech from the morality?
The researchers worry that the focus on keeping children away from screens is making it hard to have more productive conversations about topics like how to make phones more useful for low-income people, who tend to use them more, or how to protect the privacy of teenagers who share their lives online.
“Many of the people who are terrifying kids about screens, they have hit a vein of attention from society and they are going to ride that. But that is super bad for society,” said Andrew Przybylski, the director of research at the Oxford Internet Institute, who has published several studies on the topic.
Nathaniel Popper (The New York Times)
Kids and screentime is just the latest (extended) moral panic. Overuse of anything causes problems, smartphones, games consoles, and TV included. What we need to do is to help our children find balance in all of this, which can be difficult for the first generation of parents navigating all of this on the frontline.
It’s been a busy week, but I’ve still found time to unearth these gems…
The Dark Psychology of Social Networks (The Atlantic) — “The philosophers Justin Tosi and Brandon Warmke have proposed the useful phrase moral grandstanding to describe what happens when people use moral talk to enhance their prestige in a public forum. Like a succession of orators speaking to a skeptical audience, each person strives to outdo previous speakers, leading to some common patterns. Grandstanders tend to “trump up moral charges, pile on in cases of public shaming, announce that anyone who disagrees with them is obviously wrong, or exaggerate emotional displays.” Nuance and truth are casualties in this competition to gain the approval of the audience. Grandstanders scrutinize every word spoken by their opponents—and sometimes even their friends—for the potential to evoke public outrage. Context collapses. The speaker’s intent is ignored.”
Live Your Best Life—On and Off Your Phone—in 2020 (WIRED) — “It’s your devices versus your best life. Just in time for a new decade, though, several fresh books offer a more measured approach to living in the age of technology. These are not self-help books, or even books that confront our relationship with technology head-on. Instead, they examine the realities of a tech-saturated world and offer a few simple ideas for rewriting bad habits, reviewing the devices we actually need, and relearning how to listen amid all the noise.”
People Who Are Obsessed With Success and Prestige (Bennett Notes) — “What does it look like to be obsessed with success and prestige? It probably looks a lot like me at the moment. A guy who starts many endeavors and side projects just because he wants to be known as the creator of something. This a guy who wants to build another social app, not because he has an unique problem that’s unaddressed, but because he wants to be the cool tech entrepreneur who everyone admires and envies. This is a guy who probably doesn’t care for much of what he does, but continues to do so for the eventual social validation of society and his peers.”
The Lesson to Unlearn (Paul Graham) — “Merely talking explicitly about this phenomenon is likely to make things better, because much of its power comes from the fact that we take it for granted. After you’ve noticed it, it seems the elephant in the room, but it’s a pretty well camouflaged elephant. The phenomenon is so old, and so pervasive. And it’s simply the result of neglect. No one meant things to be this way. This is just what happens when you combine learning with grades, competition, and the naive assumption of unhackability.”
The End of the Beginning (Stratechery) — “[In consumer-focused startups] few companies are pure “tech” companies seeking to disrupt the dominant cloud and mobile players; rather, they take their presence as an assumption, and seek to transform society in ways that were previously impossible when computing was a destination, not a given. That is exactly what happened with the automobile: its existence stopped being interesting in its own right, while the implications of its existence changed everything.”
Populism Is Morphing in Insidious Ways (The Atlantic) — “If the 2010s were the years in which predominantly far-right, populist parties permeated the political mainstream, then the 2020s will be when voters “are going to see the consequences of that,” Daphne Halikiopoulou, an associate professor of comparative politics at the University of Reading, in England, told me.”
It’s the network, stupid: Study offers fresh insight into why we’re so divided (Ars Technica) — “There is no easy answer when it comes to implementing structural changes that encourage diversity, but today’s extreme polarization need not become a permanent characteristic of our cultural landscape. “I think we need to adopt new skills as we are transitioning into a more complex, more globalized, and more interconnected world, where each of us can affect far-away parts of the world with our actions,” said Galesic.”
Memorizing Lists of Cognitive Biases Won’t Help (Hapgood) — “But if you want to change your own behavior, memorizing long lists of biases isn’t going to help you. If anything it’s likely to just become another weapon in your motivated reasoning arsenal. You can literally read the list of biases to see why reading the list won’t work.”
How to get more done by doing less (Fast Company) — “Sometimes, the secret to doing more isn’t optimizing every minute, but finding the things you can cull from your schedule. That way, you not only reduce the time you spend on non-essential tasks, but you can also find more time for yourself.”
So said Neil Postman (via Jay Springett). Jay is one of a small number of people whose work I find particularly thoughtful and challenging.
Another is Venkatesh Rao, who last week referenced a Twitter thread he posted earlier this year. It’s awkward to extract and quote the pertinent parts of such things, but I’ll give it a try:
Megatrend conclusion: if you do not build a second brain or go offline, you will BECOME the second brain.
Basically, there’s no way to actually handle the volume of information and news that all of us appear to be handling right now. Which means we are getting augmented cognition resources from somewhere. The default place is “social” media.
What those of us who are here are doing is making a deal with the devil (or an angel): in return for being 1-2 years ahead of curve, we play 2nd brain to a shared first brain. We’ve ceded control of executive attention not to evil companies, but… an emergent oracular brain.
I called it playing your part in the Global Social Computer in the Cloud (GSCITC).
Central trade-off in managing your participation in GSCITC is: The more you attempt to consciously curate your participation rather than letting it set your priorities, the less oracular power you get in return.
He reckons that being fully immersed in the firehose of social media is somewhat like reading the tea leaves or understanding the runes. You have to ‘go with the flow’.
Rao uses the example of the very Twitter thread he’s making. Constructing it that way versus, for example, writing a blog post or newsletter means he is in full-on ‘gonzo mode’ versus what he calls (after Henry David Thoreau) ‘Waldenponding’.
I have been generally very unimpressed with the work people seem to generate when they go waldenponding to work on supposedly important things. The comparable people who stay more plugged in seem to produce better work.
My kindest reading of people who retreat so far it actually compromises their work is that it is a mental health preservation move because they can’t handle the optimum GSCITC immersion for their project. Their work could be improved if they had the stomach for more gonzo-nausea.
My harshest reading is that they’re narcissistic snowflakes who overvalue their work simply because they did it.
Well, perhaps. But as someone who has attempted to drink from that firehose for over a decade, I think the time comes when you realise something else. Who’s setting the agenda here? It’s not ‘no-one’, but neither is it any one person in particular. Rather, the whole structure of what can happen within such a network depends on decisions made by people other than you.
For example, Dan Hon pointed (in a supporter-only newsletter) to an article by Louise Matsakis in WIRED that explains that the social network TikTok not only doesn’t add timestamps to user-generated content, but actively blocks the clock on your smartphone. These design decisions affect what can and can’t happen, and also the kinds of things that do end up happening.
In The Guardian, Leah McLaren writes about being part of the last generation to really remember life before the internet.
In this age of uncertainty, predictions have lost value, but here’s an irrefutable one: quite soon, no person on earth will remember what the world was like before the internet. There will be records, of course (stored in the intangibly limitless archive of the cloud), but the actual lived experience of what it was like to think and feel and be human before the emergence of big data will be gone. When that happens, what will be lost?
McLaren is evidently a few years older than me, as I’ve been online since I was about 15. However, I definitely reflect on a regular basis about what being hyper-connected does to my sense of self. She cites a recent study published in the official journal of the World Psychiatric Association. Part of the conclusion of that study reads:
As digital technologies become increasingly integrated with everyday life, the Internet is becoming highly proficient at capturing our attention, while producing a global shift in how people gather information, and connect with one another. In this review, we found emerging support for several hypotheses regarding the pathways through which the Internet is influencing our brains and cognitive processes, particularly with regards to: a) the multi‐faceted stream of incoming information encouraging us to engage in attentional‐switching and “multi‐tasking”, rather than sustained focus; b) the ubiquitous and rapid access to online factual information outcompeting previous transactive systems, and potentially even internal memory processes; c) the online social world paralleling “real world” cognitive processes, and becoming meshed with our offline sociality, introducing the possibility for the special properties of social media to impact on “real life” in unforeseen ways.
Firth, J., et al. (2019). The “online brain”: how the Internet may be changing our cognition. World Psychiatry, 18: 119-129.
In her Guardian article, McLaren cites the main author, Dr Joseph Firth:
“The problem with the internet,” Firth explained, “is that our brains seem to quickly figure out it’s there – and outsource.” This would be fine if we could rely on the internet for information the same way we rely on, say, the British Library. But what happens when we subconsciously outsource a complex cognitive function to an unreliable online world manipulated by capitalist interests and agents of distortion? “What happens to children born in a world where transactive memory is no longer as widely exercised as a cognitive function?” he asked.
I think this is the problem, isn’t it? I’ve got no issue with having an ‘outboard brain’ where I store things that I want to look up instead of remember. It’s also insanely useful to have a method by which the world can join together in a form of ‘hive mind’.
What is problematic is when this ‘hive mind’ (in the form of social media) is controlled by people and organisations whose interests are orthogonal to our own.
In that situation, there are three things we can do. The first is to seek out forms of nascent ‘hive mind’-like spaces which are not controlled by people focused on the problematic concept of ‘shareholder value’. Like Mastodon, for example, and other decentralised social networks.
The second is to spend time finding out the voices to which you want to pay particular attention. The chances are that they won’t only write down their thoughts via social networks. They are likely to have newsletters, blogs, and even podcasts.
Third, and apologies for the metaphor, but with such massive information consumption the chances are that we can become ‘constipated’. So if we don’t want that to happen, and we don’t want to go on an ‘information diet’, then we need to ensure a better throughput. One of the best things I’ve done is have a disciplined approach to writing (here on Thought Shrapnel, and elsewhere) about the things I’ve read and found interesting. That’s one way to extract the nutrients.
I’d love your thoughts on this. Do you agree with the above? What strategies do you have in place?