Saturday scrubbings

This week on Thought Shrapnel I’ve been focused on messing about with using OBS to create videos. So much, in fact, that this weekend I’m building a new PC to improve the experience.

Sometimes in these link roundups I try and group similar kinds of things together. But this week, much as I did last week, I’ve just thrown them all in a pot like Gumbo.

Tell me which links you find interesting, either in the comments or on Twitter or the Fediverse (feel free to use the hashtag #thoughtshrapnel).


Melting Ice Reveals a “Lost” Viking-Era Pass in Norway’s Mountains

About 60 artifacts have been radiocarbon dated, showing the Lendbreen pass was widely used from at least A.D. 300. “It probably served as both an artery for long-distance travel and for local travel between permanent farms in the valleys to summer farms higher in the mountains, where livestock grazed for part of the year,” says University of Cambridge archaeologist James Barrett, a co-author of the research.

Tom Metcalfe (Scientific American)

I love it when the scientific and history communities come together to find out new things about our past. Especially about the Vikings, who were straight-up amazing.


University proposes online-only degrees as part of radical restructuring

Confidential documents seen by Palatinate show that the University is planning “a radical restructure” of the Durham curriculum in order to permanently put online resources at the core of its educational offer, in response to the Covid-19 crisis and other ongoing changes in both national and international Higher Education.

The proposals seek to “invert Durham’s traditional educational model”, which revolves around residential study, replacing it with one that puts “online resources at the core enabling us to provide education at a distance.” 

Jack Taylor & Tom Mitchell (Palatinate)

I’m paying attention to this as Durham University is one of my alma maters*, but I think this is going to be a common story across a lot of UK institutions. They’ve relied for too long on the inflated fees brought in by overseas students and now, in the wake of the pandemic, need to rapidly find a different approach.

*I have a teaching qualification and two postgraduate degrees from Durham, despite a snooty professor telling me when I was 17 years old that I’d never get into the institution 😅


Abolish Silicon Valley: memoir of a driven startup founder who became an anti-capitalist activist

Liu grew up a true believer in “meritocracy” and its corollaries: that success implies worth, and thus failure is a moral judgment about the intellect, commitment and value of the failed.

Her tale — starting in her girlhood bedroom and stretching all the way to protests outside of tech giants in San Francisco — traces a journey of maturity and discovery, as Liu confronts the mounting evidence that her life’s philosophy is little more than the self-serving rhetoric of rich people defending their privilege, the chasm between her lived experience and her guiding philosophy widens until she can no longer straddle it.

Cory Doctorow (Boing Boing)

This book is next on my non-fiction reading list. If your library is closed and doesn’t have an online service, try this.


Cup, er, drying itself...

7 things ease the switch to remote-only workplaces

You want workers to post work as it’s underway—even when it’s rough, incomplete, imperfect. That requires a different mindset, though one that’s increasingly common in asynchronous companies. In traditional companies, people often hesitate to circulate projects or proposals that aren’t polished, pretty, and bullet-proofed. It’s a natural reflex, especially when people are disconnected from each other and don’t communicate casually. But it can lead to long delays, especially on projects in which each participant’s progress depends on the progress and feedback of others. Location-independent companies need a culture in which people recognize that a work-in-progress is likely to have gaps and flaws and don’t criticize each other for them. This is an issue of norms, not tools.

Edmund L. Andrews-Stanford (Futurity)

I discovered this via Stephen Downes, who highlights the fifth point in this article (‘single source of truth’). I’ve actually highlighted the sixth one (‘breaking down the barriers to sharing work’) as I’ve also seen that as an important thing to check for when hiring.


How the 5G coronavirus conspiracy theory tore through the internet

The level of interest in the coronavirus pandemic – and the fear and uncertainty that comes with it – has caused tired, fringe conspiracy theories to be pulled into the mainstream. From obscure YouTube channels and Facebook pages, to national news headlines, baseless claims that 5G causes or exacerbates coronavirus are now having real-world consequences. People are burning down 5G masts in protest. Government ministers and public health experts are now being forced to confront this dangerous balderdash head-on, giving further oxygen and airtime to views that, were it not for the major technology platforms, would remain on the fringe of the fringe. “Like anti-vax content, this messaging is spreading via platforms which have been designed explicitly to help propagate the content which people find most compelling; most irresistible to click on,” says Smith from Demos.

James Temperton (WIRED)

The disinformation and plain bonkers-ness around this ‘theory’ of linking 5G and the coronavirus is a particularly difficult thing to deal with. I’ve avoided talking about it on social media as well as here on Thought Shrapnel, but I’m sharing this as it’s a great overview of how these things spread — and who’s fanning the flames.


A Manifesto Against EdTech© During an Emergency Online Pivot

The COVID-19 pandemic is an unprecedented moment in the history of social structures such as education. After all of the time spent creating emergency plans and three- or five-year road maps that include fail safe options, we find ourselves in the actual emergency. Yet not even a month into global orders of shelter in place, there are many education narratives attempting to frame the pandemic as an opportunity. Extreme situations can certainly create space for extraordinary opportunities, but that viewpoint is severely limited considering this moment in time. Perhaps if the move to distance/online/remote education had happened in a vacuum that did not involve a global pandemic, millions sick, tens of thousands dead, tens of millions unemployed, hundreds of millions hungry, billions anxious and uncertain of society’s next step…perhaps then this would be that opportunity moment. Instead, we have a global emergency where the stress is felt everywhere but it certainly is not evenly distributed, so learning/aligning/deploying/assessing new technology for the classroom is not universally feasible. You can’t teach someone to swim while they’re drowning.

Rolin Moe

Rolin Moe is a thoughtful commentator on educational technology. This post was obviously written quickly (note the typo in the URL when you click through, as well as some slightly awkward language) and I’m not a fan of the title Moe has settled on. That being said, the point about this not being an ‘opportunity’ for edtech is a good one.


Dishes washing themselves

NHS coronavirus app: memo discussed giving ministers power to ‘de-anonymise’ users

Produced in March, the memo explained how an NHS app could work, using Bluetooth LE, a standard feature that runs constantly and automatically on all mobile devices, to take “soundings” from other nearby phones through the day. People who have been in sustained proximity with someone who may have Covid-19 could then be warned and advised to self–isolate, without revealing the identity of the infected individual.

However, the memo stated that “more controversially” the app could use device IDs, which are unique to all smartphones, “to enable de-anonymisation if ministers judge that to be proportionate at some stage”. It did not say why ministers might want to identify app users, or under what circumstances doing so would be proportionate.

David Pegg & Paul Lewis (The Guardian)

This all really concerns me, as not only is this kind of technology only going to be of marginal use in fighting the coronavirus, but once it’s out of the box, what else is it going to be used for? Also check out Vice’s coverage, including an interview with Edward Snowden, and this discussion at Edgeryders.
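To make the memo’s ‘soundings’ idea a little more concrete, here’s a toy sketch (in Python) of how Bluetooth-style proximity logging and exposure matching might work in principle. To be clear, this is not the NHSX app’s actual design: the rotating identifiers, the 15-minute threshold, and the data structures are all invented for illustration.

# Toy sketch of Bluetooth-style proximity 'soundings' and exposure matching.
# NOT the NHS app's real design; identifiers, threshold and structures are
# invented for illustration only.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Sounding:
    heard_id: str   # rotating anonymous ID broadcast by a nearby phone
    minute: int     # minute of the day when it was heard

EXPOSURE_MINUTES = 15  # assumed threshold for 'sustained proximity'

def exposed_contacts(soundings, infected_ids):
    """Return anonymous IDs heard for long enough to count as sustained
    proximity with someone who later reported symptoms."""
    minutes_heard = defaultdict(set)
    for s in soundings:
        minutes_heard[s.heard_id].add(s.minute)
    return {
        heard_id
        for heard_id, minutes in minutes_heard.items()
        if heard_id in infected_ids and len(minutes) >= EXPOSURE_MINUTES
    }

# Example: this phone heard 'a3f9' during 20 separate minutes, '77c2' only once.
log = [Sounding("a3f9", m) for m in range(20)] + [Sounding("77c2", 5)]
print(exposed_contacts(log, infected_ids={"a3f9"}))  # {'a3f9'}

Note that nothing in a sketch like this needs a hardware device ID, which is precisely why the memo’s ‘de-anonymisation’ clause is the controversial part.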


Is This the Most Virus-Proof Job in the World?

It’s hard to think of a job title more pandemic-proof than “superstar live streamer.” While the coronavirus has upended the working lives of hundreds of millions of people, Dr. Lupo, as he’s known to acolytes, has a basically unaltered routine. He has the same seven-second commute down a flight of stairs. He sits in the same seat, before the same configuration of lights, cameras and monitors. He keeps the same marathon hours, starting every morning at 8.

Social distancing? He’s been doing that since he went pro, three years ago.

For 11 hours a day, six days a week, he sits alone, hunting and being hunted on games like Call of Duty and Fortnite. With offline spectator sports canceled, he and other well-known gamers currently offer one of the only live contests that meet the standards of the Centers for Disease Control and Prevention.

David Segal (The New York Times)

It’s hard to argue with my son these days when he says he wants to be a ‘pro gamer’.

(a quick tip for those who want to avoid ‘free registration’ and some paywalls — use a service like Pocket to save the article and read it there)


Capitalists or Cronyists?

To be clear, socialism may be a better way to go, as evidenced by the study showing 4 of the 5 happiest nations are socialist democracies. However, unless we’re going to provide universal healthcare and universal pre-K, let’s not embrace The Hunger Games for the working class on the way up, and the Hallmark Channel for the shareholder class on the way down. The current administration, the wealthy, and the media have embraced policies that bless the caching of power and wealth, creating a nation of brittle companies and government agencies.

Scott Galloway

A somewhat rambling post, but one which explains the difference between a form of capitalism that (theoretically) allows everyone to flourish and crony capitalism, which doesn’t.


Header image by Stephen Collins at The Guardian

Friday featherings

Behold! The usual link round-up of interesting things I’ve read in the last week.

Feel free to let me know if anything particularly resonated with you via the comments section below…


Part I – What is a Weird Internet Career?

Weird Internet Careers are the kinds of jobs that are impossible to explain to your parents, people who somehow make a living from the internet, generally involving a changing mix of revenue streams. Weird Internet Career is a term I made up (it had no google results in quotes before I started using it), but once you start noticing them, you’ll see them everywhere. 

Gretchen McCulloch (All Things Linguistic)

I love this phrase, which I came across via Dan Hon’s newsletter. This is the first in a whole series of posts, which I am yet to explore in its entirety. My aim in life is now to make my career progressively more (internet) weird.


Nearly half of Americans didn’t go outside to recreate in 2018. That has the outdoor industry worried.

While the Outdoor Foundation’s 2019 Outdoor Participation Report showed that while a bit more than half of Americans went outside to play at least once in 2018, nearly half did not go outside for recreation at all. Americans went on 1 billion fewer outdoor outings in 2018 than they did in 2008. The number of adolescents ages 6 to 12 who recreate outdoors has fallen four years in a row, dropping more than 3% since 2007 

The number of outings for kids has fallen 15% since 2012. The number of moderate outdoor recreation participants declined, and only 18% of Americans played outside at least once a week. 

Jason Blevins (The Colorado Sun)

One of Bruce Willis’ lesser-known films is Surrogates (2009). It’s a short, pretty average film with a really interesting central premise: most people stay at home and send their surrogates out into the world. Over a decade after the film was released, a combination of things (including virulent viruses, screen-focused leisure time, and safety fears) seems to suggest it might be a predictor of our medium-term future.


I’ll Never Go Back to Life Before GDPR

It’s also telling when you think about what lengths companies have had to go through to make the EU versions of their sites different. Complying with GDPR has not been cheap. Any online business could choose to follow GDPR by default across all regions and for all visitors. It would certainly simplify things. They don’t, though. The amount of money in data collection is too big.

Jill Duffy (OneZero)

This is a strangely-titled article, but a decent explainer on what the web looks and feels like to those outside the EU. The author is spot-on when she talks about how GDPR and the recent California Privacy Law could be applied everywhere, but they’re not. Because surveillance capitalism.


You Are Now Remotely Controlled

The belief that privacy is private has left us careening toward a future that we did not choose, because it failed to reckon with the profound distinction between a society that insists upon sovereign individual rights and one that lives by the social relations of the one-way mirror. The lesson is that privacy is public — it is a collective good that is logically and morally inseparable from the values of human autonomy and self-determination upon which privacy depends and without which a democratic society is unimaginable.

Shoshana Zuboff (The New York Times)

I fear that the length of Zuboff’s (excellent) book on surveillance capitalism, her use in this article of terms such as ‘epistemic inequality’, and the subtlety of her arguments may mean that she’s preaching to the choir here.


How to Raise Media-Savvy Kids in the Digital Age

The next time you snap a photo together at the park or a restaurant, try asking your child if it’s all right that you post it to social media. Use the opportunity to talk about who can see that photo and show them your privacy settings. Or if a news story about the algorithms on YouTube comes on television, ask them if they’ve ever been directed to a video they didn’t want to see.

Meghan Herbst (WIRED)

There’s some useful advice in this WIRED article, especially that given by my friend Ian O’Byrne. The difficulty I’ve found is when one of your kids becomes a teenager and companies like Google contact them directly telling them they can have full control of their accounts, should they wish…


Control-F and Building Resilient Information Networks

One reason the best lack conviction, though, is time. They don’t have the time to get to the level of conviction they need, and it’s a knotty problem, because that level of care is precisely what makes their participation in the network beneficial. (In fact, when I ask people who have unintentionally spread misinformation why they did so, the most common answer I hear is that they were either pressed for time, or had a scarcity of attention to give to that moment)

But what if — and hear me out here — what if there was a way for people to quickly check whether linked articles actually supported the points they claimed to? Actually quoted things correctly? Actually provided the context of the original from which they quoted

And what if, by some miracle, that function was shipped with every laptop and tablet, and available in different versions for mobile devices?

This super-feature actually exists already, and it’s called control-f.

Roll the animated GIF!

Mike Caulfield (Hapgood)

I find it incredible, but absolutely believable, that only around 10% of internet users know how to use Ctrl-F to find something within a web page. On mobile, it’s just as easy, as there’s an option within most (all?) browsers to ‘search within page’. I like Mike’s work, as not only is it academic, it’s incredibly practical.
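Ctrl-F is the manual version of Mike’s check, but the same idea can be scripted. Below is a small, hypothetical sketch that fetches a linked page and reports whether a quoted phrase actually appears in it; the URL and quote in the usage line are placeholders, and pages behind paywalls or rendered with JavaScript would defeat such a naive approach.

# Hypothetical sketch of the 'does the linked article actually say that?' check.
# The URL and quote in the usage example are placeholders, not real references.
import re
import urllib.request

def page_text(url):
    """Fetch a page and crudely strip tags and extra whitespace."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    text = re.sub(r"<[^>]+>", " ", html)       # drop HTML tags
    return re.sub(r"\s+", " ", text).lower()   # normalise whitespace

def quote_appears(url, quote):
    """True if the whitespace-normalised quote occurs in the page text."""
    needle = re.sub(r"\s+", " ", quote).lower().strip()
    return needle in page_text(url)

# Usage (placeholder values):
# print(quote_appears("https://example.com/article", "the exact quoted phrase"))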


EdX launches for-credit credentials that stack into bachelor’s degrees

The MicroBachelors also mark a continued shift for EdX, which made its name as one of the first MOOC providers, to a wider variety of educational offerings 

In 2018, EdX announced several online master’s degrees with selective universities, including the Georgia Institute of Technology and the University of Texas at Austin.

Two years prior, it rolled out MicroMasters programs. Students can complete the series of graduate-level courses as a standalone credential or roll them into one of EdX’s master’s degrees.

That stackability was something EdX wanted to carry over into the MicroBachelors programs, Agarwal said. One key difference, however, is that the undergraduate programs will have an advising component, which the master’s programs do not. 

Natalie Schwartz (Education Dive)

This is largely a rewritten press release with a few extra links, but I found it interesting as it’s a concrete example of a couple of things. First, the ongoing shift in Higher Education towards students-as-customers. Second, the viability of microcredentials as a ‘stackable’ way to build a portfolio of skills.

Note that, as a graduate of degrees in the Humanities, I’m not saying this approach can be used for everything, but for those using Higher Education as a means to an end, this is exactly what’s required.


How much longer will we trust Google’s search results?

Today, I still trust Google to not allow business dealings to affect the rankings of its organic results, but how much does that matter if most people can’t visually tell the difference at first glance? And how much does that matter when certain sections of Google, like hotels and flights, do use paid inclusion? And how much does that matter when business dealings very likely do affect the outcome of what you get when you use the next generation of search, the Google Assistant?

Dieter Bohn (The Verge)

I’ve used DuckDuckGo as my go-to search engine for years now. It used to be that I’d have to switch to Google for around 10% of my searches. That’s now down to zero.


Coaching – Ethics

One of the toughest situations for a product manager is when they spot a brewing ethical issue, but they’re not sure how they should handle the situation.  Clearly this is going to be sensitive, and potentially emotional. Our best answer is to discover a solution that does not have these ethical concerns, but in some cases you won’t be able to, or may not have the time.

[…]

I rarely encourage people to leave their company, however, when it comes to those companies that are clearly ignoring the ethical implications of their work, I have and will continue to encourage people to leave.

Marty Cagan (SVPG)

As someone with a sensitive radar for these things, I’ve chosen to work with ethical people and for ethical organisations. As Cagan says in this post, if you’re working for a company that ignores the ethical implications of their work, then you should leave. End of story.


Image via webcomic.name

Friday festoonings

Check out these things I read and found interesting this week. Thanks to some positive feedback, I’ve carved out time for some commentary, and changed the way this link roundup is set out.

Let me know what you think! What did you find most interesting?


Maps Are Biased Against Animals

Critics may say that it is unreasonable to expect maps to reflect the communities or achievements of nonhumans. Maps are made by humans, for humans. When beavers start Googling directions to a neighbor’s dam, then their homes can be represented! For humans who use maps solely to navigate—something that nonhumans do without maps—man-made roads are indeed the only features that are relevant. Following a map that includes other information may inadvertently lead a human onto a trail made by and for deer.

But maps are not just tools to get from points A to B. They also relay new and learned information, document evolutionary changes, and inspire intrepid exploration. We operate on the assumption that our maps accurately reflect what a visitor would find if they traveled to a particular area. Maps have immense potential to illustrate the world around us, identifying all the important features of a given region. By that definition, the current maps that most humans use fall well short of being complete. Our definition of what is “important” is incredibly narrow.

Ryan Huling (WIRED)

Cartography is an incredibly powerful tool. We’ve known for a long time that “the map is not the territory” but perhaps this is another weapon in the fight against climate change and the decline in diversity of species?


Why Actually Principled People Are Difficult (Glenn Greenwald Edition)

Then you get people like Greenwald, Assange, Manning and Snowden. They are polarizing figures. They are loved or hated. They piss people off.

They piss people off precisely because they have principles they consider non-negotiable. They will not do the easy thing when it matters. They will not compromise on anything that really matters.

That’s breaking the actual social contract of “go along to get along”, “obey authority” and “don’t make people uncomfortable.” I recently talked to a senior activist who was uncomfortable even with the idea of yelling at powerful politicians. It struck them as close to violence.

So here’s the thing, people want men and women of principle to be like ordinary people.

They aren’t. They can’t be. If they were, they wouldn’t do what they do. Much of what you may not like about a Greenwald or Assange or Manning or Snowden is why they are what they are. Not just the principle, but the bravery verging on recklessness. The willingness to say exactly what they think, and do exactly what they believe is right even if others don’t.

Ian Welsh

Activists like Greta Thunberg and Edward Snowden are the closest we get to superheroes, to people who stand for the purest possible version of an idea. This is why we need them — and why we’re so disappointed when they turn out to be human after all.


Explicit education

Students’ not comprehending the value of engaging in certain ways is more likely to be a failure in our teaching than their willingness to learn (especially if we create a culture in which success becomes exclusively about marks and credentialization). The question we have to ask is if what we provide as ‘university’ goes beyond the value of what our students can engage with outside of our formal offer. 

Dave White

This is a great post by Dave, who I had the pleasure of collaborating with briefly during my stint at Jisc. I definitely agree that any organisation walks a dangerous path when it becomes overly-fixated on the ‘how’ instead of the ‘what’ and the ‘why’.


What Are Your Rules for Life? These 11 Expressions (from Ancient History) Might Help

The power of an epigram or one of these expressions is that they say a lot with a little. They help guide us through the complexity of life with their unswerving directness. Each person must, as the retired USMC general and former Secretary of Defense Jim Mattis, has said, “Know what you will stand for and, more important, what you won’t stand for.” “State your flat-ass rules and stick to them. They shouldn’t come as a surprise to anyone.”

Ryan Holiday

Of the 11 expressions here, I have to say that other than memento mori (“remember you will die”) I particularly like semper anticus (“always forward”) which I’m going to print out in a fancy font and stick on the wall of my home office.


Dark Horse Discord

In a hypothetical world, you could get a Discord (or whatever is next) link for your new job tomorrow – you read some wiki and meta info, sort yourself into your role you’d, and then are grouped with the people who you need to collaborate with on a need be basis. All wrapped in one platform. Maybe you have an HR complaint – drop it in #HR where you can’t read the messages but they can, so it’s a blind 1 way conversation. Maybe there is a #help channel, where you ping you write your problems and the bot pings people who have expertise based on keywords. There’s a lot of things you can do with this basic design.

Mule’s Musings

What is described in this post is a bit of a stretch, but I can see it: a world where work is organised a bit like how gamers organise themselves in chat channels. Something to keep an eye on, as the interplay between what’s ‘normal’ and what’s possible with communications technology changes and evolves.
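The ‘#help channel where a bot pings people with relevant expertise’ part of that vision is straightforward to imagine in code. Here’s a minimal sketch using the discord.py library; the expert map, channel name, and token are made up for illustration, and a real bot would need rather more care (permissions, rate limits, privacy).

# Hypothetical sketch of a keyword-routing #help bot, using discord.py (2.x).
# The expert map, channel name and token are invented for illustration only.
import discord

EXPERTS = {  # keyword -> Discord user ID of someone who can help
    "payroll": 111111111111111111,
    "deploys": 222222222222222222,
}

intents = discord.Intents.default()
intents.message_content = True
client = discord.Client(intents=intents)

@client.event
async def on_message(message):
    # Only react to humans posting in the #help channel.
    if message.author.bot or getattr(message.channel, "name", "") != "help":
        return
    text = message.content.lower()
    for keyword, user_id in EXPERTS.items():
        if keyword in text:
            await message.channel.send(
                f"<@{user_id}> might be able to help with '{keyword}'."
            )

client.run("YOUR_BOT_TOKEN")  # placeholder token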


The Edu-Decade That Was: Unfounded Optimism?

What made the last decade so difficult is how education institutions let corporations control the definitions so that a lot of “study and ethical practice” gets left out of the work. With the promise of ease of use, low-cost, increased student retention (or insert unreasonable-metric-claim here), etc. institutions are willing to buy into technology without regard to accessibility, scalability, equity and inclusion, data privacy or student safety, in hope of solving problem X that will then get to be checked off of an accreditation list. Or worse, with the hope of not having to invest in actual people and local infrastructure.

Geoff Cain (Brainstorm in progress)

It’s nice to see a list of some positives that came out of the last decade, and for microcredentials and badging to be on that list.


When Is a Bird a ‘Birb’? An Extremely Important Guide

First, let’s consider the canonized usages. The subreddit r/birbs defines a birb as any bird that’s “being funny, cute, or silly in some way.” Urban Dictionary has a more varied set of definitions, many of which allude to a generalized smallness. A video on the youtube channel Lucidchart offers its own expansive suggestions: All birds are birbs, a chunky bird is a borb, and a fluffed-up bird is a floof. Yet some tension remains: How can all birds be birbs if smallness or cuteness are in the equation? Clearly some birds get more recognition for an innate birbness.

Asher Elbein (Audubon magazine)

A fun article, but also an interesting one when it comes to ambiguity, affinity groups, and internet culture.


Why So Many Things Cost Exactly Zero

“Now, why would Gmail or Facebook pay us? Because what we’re giving them in return is not money but data. We’re giving them lots of data about where we go, what we eat, what we buy. We let them read the contents of our email and determine that we’re about to go on vacation or we’ve just had a baby or we’re upset with our friend or it’s a difficult time at work. All of these things are in our email that can be read by the platform, and then the platform’s going to use that to sell us stuff.”

Fiona Scott Morton (Yale School of Management), quoted by Peter Coy (Bloomberg Businessweek)

Regular readers of Thought Shrapnel know all about surveillance capitalism, but it’s good to see these explainers making their way to the more mainstream business press.


Your online activity is now effectively a social ‘credit score’

The most famous social credit system in operation is that used by China’s government. It “monitors millions of individuals’ behavior (including social media and online shopping), determines how moral or immoral it is, and raises or lowers their “citizen score” accordingly,” reported Atlantic in 2018.

“Those with a high score are rewarded, while those with a low score are punished.” Now we know the same AI systems are used for predictive policing to round up Muslim Uighurs and other minorities into concentration camps under the guise of preventing extremism.

Violet Blue (Engadget)

Some (more prudish) people will write this article off because it discusses sex workers, porn, and gay rights. But the truth is that all kinds of censorship start with marginalised groups. To my mind, we’re already on a trajectory away from Silicon Valley and towards Chinese technology. Will we be able to separate the tech from the morality?


Panicking About Your Kids’ Phones? New Research Says Don’t

The researchers worry that the focus on keeping children away from screens is making it hard to have more productive conversations about topics like how to make phones more useful for low-income people, who tend to use them more, or how to protect the privacy of teenagers who share their lives online.

“Many of the people who are terrifying kids about screens, they have hit a vein of attention from society and they are going to ride that. But that is super bad for society,” said Andrew Przybylski, the director of research at the Oxford Internet Institute, who has published several studies on the topic.

Nathaniel Popper (The New York Times)

Kids and screentime is just the latest (extended) moral panic. Overuse of anything causes problems, smartphones, games consoles, and TV included. What we need to do is to help our children find balance in all of this, which can be difficult for the first generation of parents navigating all of this on the frontline.


Gorgeous header art via the latest Facebook alternative, planetary.social

Friday flurries

It’s been a busy week, but I’ve still found time to unearth these gems…

  • The Dark Psychology of Social Networks (The Atlantic) — “The philosophers Justin Tosi and Brandon Warmke have proposed the useful phrase moral grandstanding to describe what happens when people use moral talk to enhance their prestige in a public forum. Like a succession of orators speaking to a skeptical audience, each person strives to outdo previous speakers, leading to some common patterns. Grandstanders tend to “trump up moral charges, pile on in cases of public shaming, announce that anyone who disagrees with them is obviously wrong, or exaggerate emotional displays.” Nuance and truth are casualties in this competition to gain the approval of the audience. Grandstanders scrutinize every word spoken by their opponents—and sometimes even their friends—for the potential to evoke public outrage. Context collapses. The speaker’s intent is ignored.”
  • Live Your Best Life—On and Off Your Phone—in 2020 (WIRED) — “It’s your devices versus your best life. Just in time for a new decade, though, several fresh books offer a more measured approach to living in the age of technology. These are not self-help books, or even books that confront our relationship with technology head-on. Instead, they examine the realities of a tech-saturated world and offer a few simple ideas for rewriting bad habits, reviewing the devices we actually need, and relearning how to listen amid all the noise.”
  • People Who Are Obsessed With Success and Prestige (Bennett Notes) — “What does it look like to be obsessed with success and prestige? It probably looks a lot like me at the moment. A guy who starts many endeavors and side projects just because he wants to be known as the creator of something. This a guy who wants to build another social app, not because he has an unique problem that’s unaddressed, but because he wants to be the cool tech entrepreneur who everyone admires and envies. This is a guy who probably doesn’t care for much of what he does, but continues to do so for the eventual social validation of society and his peers.”
  • The Lesson to Unlearn (Paul Graham) — “Merely talking explicitly about this phenomenon is likely to make things better, because much of its power comes from the fact that we take it for granted. After you’ve noticed it, it seems the elephant in the room, but it’s a pretty well camouflaged elephant. The phenomenon is so old, and so pervasive. And it’s simply the result of neglect. No one meant things to be this way. This is just what happens when you combine learning with grades, competition, and the naive assumption of unhackability.”
  • The End of the Beginning (Stratechery) — “[In consumer-focused startups] few companies are pure “tech” companies seeking to disrupt the dominant cloud and mobile players; rather, they take their presence as an assumption, and seek to transform society in ways that were previously impossible when computing was a destination, not a given. That is exactly what happened with the automobile: its existence stopped being interesting in its own right, while the implications of its existence changed everything.”
  • Populism Is Morphing in Insidious Ways (The Atlantic) — “If the 2010s were the years in which predominantly far-right, populist parties permeated the political mainstream, then the 2020s will be when voters “are going to see the consequences of that,” Daphne Halikiopoulou, an associate professor of comparative politics at the University of Reading, in England, told me.”
  • It’s the network, stupid: Study offers fresh insight into why we’re so divided (Ars Technica) — “There is no easy answer when it comes to implementing structural changes that encourage diversity, but today’s extreme polarization need not become a permanent characteristic of our cultural landscape. “I think we need to adopt new skills as we are transitioning into a more complex, more globalized, and more interconnected world, where each of us can affect far-away parts of the world with our actions,” said Galesic.”
  • Memorizing Lists of Cognitive Biases Won’t Help (Hapgood) — “But if you want to change your own behavior, memorizing long lists of biases isn’t going to help you. If anything it’s likely to just become another weapon in your motivated reasoning arsenal. You can literally read the list of biases to see why reading the list won’t work.”
  • How to get more done by doing less (Fast Company) — “Sometimes, the secret to doing more isn’t optimizing every minute, but finding the things you can cull from your schedule. That way, you not only reduce the time you spend on non-essential tasks, but you can also find more time for yourself.”

Image via xkcd

People will come to adore the technologies that undo their capacities to think

So said Neil Postman (via Jay Springett). Jay is one of a small number of people whose work I find particularly thoughtful and challenging.

Another is Venkatesh Rao, who last week referenced a Twitter thread he posted earlier this year. It’s awkward to excerpt and quote the pertinent parts of such things, but I’ll give it a try:

Megatrend conclusion: if you do not build a second brain or go offline, you will BECOME the second brain.

[…]

Basically, there’s no way to actually handle the volume of information and news that all of us appear to be handling right now. Which means we are getting augmented cognition resources from somewhere. The default place is “social” media.

[…]

What those of us who are here are doing is making a deal with the devil (or an angel): in return for being 1-2 years ahead of curve, we play 2nd brain to a shared first brain. We’ve ceded control of executive attention not to evil companies, but… an emergent oracular brain.

[…]

I called it playing your part in the Global Social Computer in the Cloud (GSCITC).

[…]

Central trade-off in managing your participation in GSCITC is: The more you attempt to consciously curate your participation rather than letting it set your priorities, the less oracular power you get in return.

Venkatesh Rao

He reckons that being fully immersed in the firehose of social media is somewhat like reading the tea leaves or understanding the runes. You have to ‘go with the flow’.

Rao uses the example of the very Twitter thread he’s posting. Constructing it that way rather than, for example, writing a blog post or newsletter means he is in full-on ‘gonzo mode’ as opposed to what he calls (after Henry David Thoreau) ‘Waldenponding’.

I have been generally very unimpressed with the work people seem to generate when they go waldenponding to work on supposedly important things. The comparable people who stay more plugged in seem to produce better work.

My kindest reading of people who retreat so far it actually compromises their work is that it is a mental health preservation move because they can’t handle the optimum GSCITC immersion for their project. Their work could be improved if they had the stomach for more gonzo-nausea.

My harshest reading is that they’re narcissistic snowflakes who overvalue their work simply because they did it.

Venkatesh Rao

Well, perhaps. But as someone who has attempted to drink from that firehose for over a decade, I think the time comes when you realise something else. Who’s setting the agenda here? It’s not ‘no-one’, but neither is it any one person in particular. Rather, the whole structure of what can happen within such a network depends on decisions made by people other than you.

For example, Dan Hon pointed (in a supporter-only newsletter) to an article by Louise Matsakis in WIRED that explains that the social network TikTok not only doesn’t add timestamps to user-generated content, but actively blocks the clock on your smartphone. These design decisions affect what can and can’t happen, and also the kinds of things that do end up happening.


Writing in The Guardian, Leah McLaren reflects on being part of the last generation to really remember life before the internet.

In this age of uncertainty, predictions have lost value, but here’s an irrefutable one: quite soon, no person on earth will remember what the world was like before the internet. There will be records, of course (stored in the intangibly limitless archive of the cloud), but the actual lived experience of what it was like to think and feel and be human before the emergence of big data will be gone. When that happens, what will be lost?

Leah McLaren

McLaren is evidently a few years older than me, as I’ve been online since I was about 15. However, I definitely reflect on a regular basis about what being hyper-connected does to my sense of self. She cites a recent study published in the official journal of the World Psychiatric Association. Part of the conclusion of that study reads:

As digital technologies become increasingly integrated with everyday life, the Internet is becoming highly proficient at capturing our attention, while producing a global shift in how people gather information, and connect with one another. In this review, we found emerging support for several hypotheses regarding the pathways through which the Internet is influencing our brains and cognitive processes, particularly with regards to: a) the multi‐faceted stream of incoming information encouraging us to engage in attentional‐switching and “multi‐tasking” , rather than sustained focus; b) the ubiquitous and rapid access to online factual information outcompeting previous transactive systems, and potentially even internal memory processes; c) the online social world paralleling “real world” cognitive processes, and becoming meshed with our offline sociality, introducing the possibility for the special properties of social media to impact on “real life” in unforeseen ways.

Firth, J., et al. (2019). The “online brain”: how the Internet may be changing our cognition. World Psychiatry, 18: 119-129.

In her Guardian article, McLaren cites the main author, Dr Joseph Firth:

“The problem with the internet,” Firth explained, “is that our brains seem to quickly figure out it’s there – and outsource.” This would be fine if we could rely on the internet for information the same way we rely on, say, the British Library. But what happens when we subconsciously outsource a complex cognitive function to an unreliable online world manipulated by capitalist interests and agents of distortion? “What happens to children born in a world where transactive memory is no longer as widely exercised as a cognitive function?” he asked.

Leah McLaren

I think this is the problem, isn’t it? I’ve got no issue with having an ‘outboard brain’ where I store things that I want to look up instead of remember. It’s also insanely useful to have a method by which the world can join together in a form of ‘hive mind’.

What is problematic is when this ‘hive mind’ (in the form of social media) is controlled by people and organisations whose interests are orthogonal to our own.

In that situation, there are three things we can do. The first is to seek out forms of nascent ‘hive mind’-like spaces which are not controlled by people focused on the problematic concept of ‘shareholder value’. Like Mastodon, for example, and other decentralised social networks.

The second is to spend time finding out the voices to which you want to pay particular attention. The chances are that they won’t only write down their thoughts via social networks. They are likely to have newsletters, blogs, and even podcasts.

Third, and apologies for the metaphor, but with such massive information consumption the chances are that we can become ‘constipated’. So if we don’t want that to happen, if we don’t want to go on an ‘information diet’, then we need to ensure a better throughput. One of the best things I’ve done is have a disciplined approach to writing (here on Thought Shrapnel, and elsewhere) about the things I’ve read and found interesting. That’s one way to extract the nutrients.


I’d love your thoughts on this. Do you agree with the above? What strategies do you have in place?

Friday fluctuations

Have a quick skim through these links that I came across this week and found interesting:

  • Overrated: Ludwig Wittgenstein (Standpoint) — “Wittgenstein’s reputation for genius did not depend on incomprehensibility alone. He was also “tortured”, rude and unreliable. He had an intense gaze. He spent months in cold places like Norway to isolate himself. He temporarily quit philosophy, because he believed that he had solved all its problems in his 1922 Tractatus Logico-Philosophicus, and worked as a gardener. He gave away his family fortune. And, of course, he was Austrian, as so many of the best geniuses are.”
  • EdTech Resistance (Ben Williamson) ⁠— “We should not and cannot ignore these tensions and challenges. They are early signals of resistance ahead for edtech which need to be engaged with before they turn to public outrage. By paying attention to and acting on edtech resistances it may be possible to create education systems, curricula and practices that are fair and trustworthy. It is important not to allow edtech resistance to metamorphose into resistance to education itself.”
  • The Guardian view on machine learning: a computer cleverer than you? (The Guardian) — “The promise of AI is that it will imbue machines with the ability to spot patterns from data, and make decisions faster and better than humans do. What happens if they make worse decisions faster? Governments need to pause and take stock of the societal repercussions of allowing machines over a few decades to replicate human skills that have been evolving for millions of years.”
  • A nerdocratic oath (Scott Aaronson) — “I will never allow anyone else to make me a cog. I will never do what is stupid or horrible because “that’s what the regulations say” or “that’s what my supervisor said,” and then sleep soundly at night. I’ll never do my part for a project unless I’m satisfied that the project’s broader goals are, at worst, morally neutral. There’s no one on earth who gets to say: “I just solve technical problems. Moral implications are outside my scope”.”
  • Privacy is power (Aeon) — “The power that comes about as a result of knowing personal details about someone is a very particular kind of power. Like economic power and political power, privacy power is a distinct type of power, but it also allows those who hold it the possibility of transforming it into economic, political and other kinds of power. Power over others’ privacy is the quintessential kind of power in the digital age.”
  • The Symmetry and Chaos of the World’s Megacities (WIRED) — “Koopmans manages to create fresh-looking images by finding unique vantage points, often by scouting his locations on Google Earth. As a rule, he tries to get as high as he can—one of his favorite tricks is talking local work crews into letting him shoot from the cockpit of a construction crane.”
  • Green cities of the future – what we can expect in 2050 (RNZ) — “In their lush vision of the future, a hyperloop monorail races past in the foreground and greenery drapes the sides of skyscrapers that house communal gardens and vertical farms.”
  • Wittgenstein Teaches Elementary School (Existential Comics) ⁠— “And I’ll have you all know, there is no crying in predicate logic.”
  • Ask Yourself These 5 Questions to Inspire a More Meaningful Career Move (Inc.) — “Introspection on the right things can lead to the life you want.”

Image from Do It Yurtself

It’s not a revolution if nobody loses

Thanks to Clay Shirky for today’s title. It’s true, isn’t it? You can’t claim something to be a true revolution unless someone, some organisation, or some group of people loses.

I’m happy to say that it’s the turn of some older white men to be losing right now, and particularly delighted that those who have spent decades abusing and repressing people are getting their comeuppance.

Enough has been written about Epstein and the fallout from it. You can read about comments made by Richard Stallman, founder of the Free Software Foundation, in this Washington Post article. I’ve only met RMS (as he’s known) in person once, at the Indie Tech Summit five years ago, but it wasn’t a great experience. While I’m willing to cut visionary people some slack, he mostly acted like a jerk.

RMS is a revered figure in Free Software circles and it’s actually quite difficult not to agree with his stance on many political and technological matters. That being said, he deserves everything he gets for the comments he made about child abuse, for the way he’s treated women over the past few decades, and for his dictator-like approach to software projects.

In an article for WIRED entitled Richard Stallman’s Exit Heralds a New Era in Tech, Noam Cohen writes that we’re entering a new age. I certainly hope so.

This is a lesson we are fast learning about freedom as it promoted by the tech world. It is not about ensuring that everyone can express their views and feelings. Freedom, in this telling, is about exclusion. The freedom to drive others away. And, until recently, freedom from consequences.

After 40 years of excluding those who didn’t serve his purposes, however, Stallman finds himself excluded by his peers. Freedom.

Maybe freedom, defined in this crude, top-down way, isn’t the be-all, end-all. Creating a vibrant inclusive community, it turns out, is as important to a software project as a coding breakthrough. Or, to put it in more familiar terms—driving away women, investing your hopes in a single, unassailable leader is a critical bug. The best patch will be to start a movement that is respectful, inclusive, and democratic.

Noam Cohen

One of the things that the next leaders of the Free Software Movement will have to address is how to take practical steps to guarantee our basic freedoms in a world where Big Tech provides surveillance to ever-more-powerful governments.

Cory Doctorow is an obvious person to look to in this regard. He has a history of understanding what’s going on and writing about it in ways that people understand. In an article for The Globe and Mail, Doctorow notes that a decline in trust of political systems and experts more generally isn’t because people are more gullible:

40 years of rising inequality and industry consolidation have turned our truth-seeking exercises into auctions, in which lawmakers, regulators and administrators are beholden to a small cohort of increasingly wealthy people who hold their financial and career futures in their hands.

[…]

To be in a world where the truth is up for auction is to be set adrift from rationality. No one is qualified to assess all the intensely technical truths required for survival: even if you can master media literacy and sort reputable scientific journals from junk pay-for-play ones; even if you can acquire the statistical literacy to evaluate studies for rigour; even if you can acquire the expertise to evaluate claims about the safety of opioids, you can’t do it all over again for your city’s building code, the aviation-safety standards governing your next flight, the food-safety standards governing the dinner you just ordered.

Cory Doctorow

What’s this got to do with technology, and in particular Free Software?

Big Tech is part of this problem… because they have monopolies, thanks to decades of buying nascent competitors and merging with their largest competitors, of cornering vertical markets and crushing rivals who won’t sell. Big Tech means that one company is in charge of the social lives of 2.3 billion people; it means another company controls the way we answer every question it occurs to us to ask. It means that companies can assert the right to control which software your devices can run, who can fix them, and when they must be sent to a landfill.

These companies, with their tax evasion, labour abuses, cavalier attitudes toward our privacy and their completely ordinary human frailty and self-deception, are unfit to rule our lives. But no one is fit to be our ruler. We deserve technological self-determination, not a corporatized internet made up of five giant services each filled with screenshots from the other four.

Cory Doctorow

Doctorow suggests breaking up these companies to end their de facto monopolies and level the playing field.

The problem of tech monopolies is something that Stowe Boyd explored in a recent article entitled Are Platforms Commons? Citing precedents around railroads, Boyd has many questions, including whether successful platforms should be bound by the legal principles of ‘common carriers’, and finishes with this:

However, just one more question for today: what if ecosystems were constructed so that they were governed by the participants, rather by the hypercapitalist strivings of the platform owners — such as Apple, Google, Amazon, Facebook — or the heavy-handed regulators? Is there a middle ground where the needs of the end user and those building, marketing, and shipping products and services can be balanced, and a fair share of the profits are distributed not just through common carrier laws but by the shared economics of a commons, and where the platform orchestrator gets a fair share, as well? We may need to shift our thinking from common carrier to commons carrier, in the near future.

Stowe Boyd

The trouble is, simply establishing a commons doesn’t solve all of the problems. In fact, what tends to happen next is well known:

The tragedy of the commons is a situation in a shared-resource system where individual users, acting independently according to their own self-interest, behave contrary to the common good of all users, by depleting or spoiling that resource through their collective action.

Wikipedia

An article in The Economist outlines the usual remedies to the ‘tragedy of the commons’: either governmental regulation (e.g. airspace), or property rights (e.g. land). However, the article cites the work of Elinor Ostrom, a Nobel prizewinning economist, showing that another way is possible:

An exclusive focus on states and markets as ways to control the use of commons neglects a varied menagerie of institutions throughout history. The information age provides modern examples, for example Wikipedia, a free, user-edited encyclopedia. The digital age would not have dawned without the private rewards that flowed to successful entrepreneurs. But vast swathes of the web that might function well as commons have been left in the hands of rich, relatively unaccountable tech firms.

[…]

A world rich in healthy commons would of necessity be one full of distributed, overlapping institutions of community governance. Cultivating these would be less politically rewarding than privatisation, which allows governments to trade responsibility for cash. But empowering commoners could mend rents in the civic fabric and alleviate frustration with out-of-touch elites.

The Economist

I count myself as someone on the left of politics, if that’s how we’re measuring things today. However, I don’t think we need representation at any higher level than is strictly necessary.

In a time when technology allows you, to a great extent, to represent yourself, perhaps we need ways of demonstrating how complex and multi-faceted some issues are? Perhaps we need to try ‘liquid democracy‘:

Liquid democracy lies between direct and representative democracy. In direct democracy, participants must vote personally on all issues, while in representative democracy participants vote for representatives once in certain election cycles. Meanwhile, liquid democracy does not depend on representatives but rather on a weighted and transitory delegation of votes. Liquid democracy through elections can empower individuals to become sole interpreters of the interests of the nation. It allows for citizens to vote directly on policy issues, delegate their votes on one or multiple policy areas to delegates of their choosing, delegate votes to one or more people, delegated to them as a weighted voter, or get rid of their votes’ delegations whenever they please.

Wikipedia
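To see how the delegation mechanics might actually resolve, here’s a toy sketch of transitive vote delegation. The voters, delegations, and ballots are invented, and a real system would need far more (secrecy, revocation, auditability); this just shows how delegated votes could flow to whoever ends the chain.

# Toy sketch of liquid-democracy tallying with transitive delegation.
# Voters, delegations and ballots are invented for illustration only.
from collections import Counter

def resolve(voter, delegations, seen=None):
    """Follow a delegation chain to whoever actually casts the vote.
    A cycle means the vote is effectively an abstention (None)."""
    seen = seen or set()
    if voter in seen:
        return None
    seen.add(voter)
    target = delegations.get(voter)
    return voter if target is None else resolve(target, delegations, seen)

def tally(voters, delegations, ballots):
    """Each voter's single vote flows to the end of their delegation chain."""
    counts = Counter()
    for voter in voters:
        final = resolve(voter, delegations)
        choice = ballots.get(final)  # how the final delegate actually voted
        if choice is not None:
            counts[choice] += 1
    return counts

voters = ["ann", "bob", "cat", "dev", "eli"]
delegations = {"bob": "ann", "cat": "ann", "dev": "eli"}  # who delegates to whom
ballots = {"ann": "yes", "eli": "no"}                     # votes actually cast
print(tally(voters, delegations, ballots))                # Counter({'yes': 3, 'no': 2})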

I think, given the state that politics is in right now, it’s well worth a try. The problem, of course, is that the losers would be the political elites, the current incumbents. But, hey, it’s not a revolution if nobody loses, right?

Friday fermentations

I boiled the internet and this was what remained:

  • I Quit Social Media for a Year and Nothing Magical Happened (Josh C. Simmons) — “A lot of social media related aspects of my life are different now – I’m not sure they’re better, they’re just different, but I can confidently say that I prefer this normal to last year’s. There’s a bit of rain with all of the sunshine. I don’t see myself ever going back to social media. I don’t see the point of it, and after leaving for a while, and getting a good outside look, it seems like an abusive relationship – millions of workers generating data for tech-giants to crunch through and make money off of. I think that we tend to forget how we were getting along pretty well before social media – not everything was idyllic and better, but it was fine.”
  • Face recognition, bad people and bad data (Benedict Evans) — “My favourite example of what can go wrong here comes from a project for recognising cancer in photos of skin. The obvious problem is that you might not have an appropriate distribution of samples of skin in different tones. But another problem that can arise is that dermatologists tend to put rulers in the photo of cancer, for scale – so if all the examples of ‘cancer’ have a ruler and all the examples of ‘not-cancer’ do not, that might be a lot more statistically prominent than those small blemishes. You inadvertently built a ruler-recogniser instead of a cancer-recogniser.”
  • Would the Internet Be Healthier Without ‘Like’ Counts? (WIRED) ⁠— “Online, value is quantifiable. The worth of a person, idea, movement, meme, or tweet is often based on a tally of actions: likes, retweets, shares, followers, views, replies, claps, and swipes-up, among others. Each is an individual action. Together, though, they take on outsized meaning. A YouTube video with 100,000 views seems more valuable than one with 10, even though views—like nearly every form of online engagement—can be easily bought. It’s a paradoxical love affair. And it’s far from an accident.”
  • Are Platforms Commons? (On The Horizon) — “[W]hat if ecosystems were constructed so that they were governed by the participants, rather by the hypercapitalist strivings of the platform owners — such as Apple, Google, Amazon, Facebook — or the heavy-handed regulators? Is there a middle ground where the needs of the end user and those building, marketing, and shipping products and services can be balanced, and a fair share of the profits are distributed not just through common carrier laws but by the shared economics of a commons, and where the platform orchestrator gets a fair share, as well?”
  • Depression and anxiety threatened to kill my career. So I came clean about it (The Guardian) — “To my surprise, far from rejecting me, students stayed after class to tell me how sorry they were. They left condolence cards in my mailbox and sent emails to let me know they were praying for my family. They stopped by my office to check on me. Up to that point, I’d been so caught up in my despair that it never occurred to me that I might be worthy of concern and support. Being accepted despite my flaws touched me in ways that are hard to express.”
  • Absolute scale corrupts absolutely (apenwarr) — “Here’s what we’ve lost sight of, in a world where everything is Internet scale: most interactions should not be Internet scale. Most instances of most programs should be restricted to a small set of obviously trusted people. All those people, in all those foreign countries, should not be invited to read Equifax’s PII database in Argentina, no matter how stupid the password was. They shouldn’t even be able to connect to the database. They shouldn’t be able to see that it exists. It shouldn’t, in short, be on the Internet.”
  • The Automation Charade (Logic magazine) — “The problem is that the emphasis on technological factors alone, as though “disruptive innovation” comes from nowhere or is as natural as a cool breeze, casts an air of blameless inevitability over something that has deep roots in class conflict. The phrase “robots are taking our jobs” gives technology agency it doesn’t (yet?) possess, whereas “capitalists are making targeted investments in robots designed to weaken and replace human workers so they can get even richer” is less catchy but more accurate.”
  • The ambitious plan to reinvent how websites get their names (MIT Technology Review) — “The system would be based on blockchain technology, meaning it would be software that runs on a widely distributed network of computers. In theory, it would have no single point of failure and depend on no human-run organization that could be corrupted or co-opted.”
  • O whatever God or whatever ancestor that wins in the next life (The Main Event) — “And it begins to dawn on you that the stories were all myths and the epics were all narrated by the villains and the history books were written to rewrite the histories and that so much of what you thought defined excellence merely concealed grift.”
  • A Famous Argument Against Free Will Has Been Debunked (The Atlantic) — “In other words, people’s subjective experience of a decision—what Libet’s study seemed to suggest was just an illusion—appeared to match the actual moment their brains showed them making a decision.”

Friday flinchings

Here’s a distillation of the best of what I’ve been reading over the last three weeks:

  • The new left economics: how a network of thinkers is transforming capitalism (The Guardian) — “The new leftwing economics wants to see the redistribution of economic power, so that it is held by everyone – just as political power is held by everyone in a healthy democracy. This redistribution of power could involve employees taking ownership of part of every company; or local politicians reshaping their city’s economy to favour local, ethical businesses over large corporations; or national politicians making co-operatives a capitalist norm.”
  • Dark web detectives and cannabis sommeliers: Here are some jobs that could exist in the future (CBC) — “In a report called Signs of the Times: Expert insights about employment in 2030, the Brookfield Institute for Innovation + Entrepreneurship — a policy institute set up to help Canadians navigate the innovation economy — brings together insights into the future of work gleaned from workshops held across the country.”
  • Art Spiegelman: golden age superheroes were shaped by the rise of fascism (The Guardian) — “The young Jewish creators of the first superheroes conjured up mythic – almost god-like – secular saviours to deal with the threatening economic dislocations that surrounded them in the great depression and gave shape to their premonitions of impending global war. Comics allowed readers to escape into fantasy by projecting themselves on to invulnerable heroes.”
  • We Have Ruined Childhood (The New York Times) — “I’ve come to believe that the problems with children’s mental and emotional health are caused not by any single change in kids’ environment but by a fundamental shift in the way we view children and child-rearing, and the way this shift has transformed our schools, our neighborhoods and our relationships to one another and our communities.”
  • Turning the Nintendo Switch into Android’s best gaming hardware (Ars Technica) — “The Nintendo Switch is, basically, a game console made out of smartphone parts…. Really, the only things that make the Switch a game console are the sweet slide-on controllers and the fact that it is blessed by Nintendo, with actually good AAA games, ecosystem support, and developer outreach.”
  • Actually, Gender-Neutral Pronouns Can Change a Culture (WIRED) — “Would native-speaker Swedes, seven years after getting a new pronoun plugged into their language, be more likely to assume this androgynous cartoon was a man? A woman? Either, or neither? Now that they had a word for it, a nonbinary option, would they think to use it?”
  • Don’t Blink! The Hazards of Confidence (The New York Times Magazine) — “Unfortunately, this advice is difficult to follow: overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.”
  • Why These Social Networks Failed So Badly (Gizmodo) — “It’s not to say that without Facebook, the whole internet would be more like a local farmer’s market or a punk venue or an art gallery or comedy club or a Narnia fanfic club, just that those places are harder to find these days.”
  • Every productivity thought I’ve ever had, as concisely as possible (Alexey Guzey) — “I combed through several years of my private notes and through everything I published on productivity before and tried to summarize all of it in this post.”

Header image via Jessica Hagy at Indexed

Friday federations

These things piqued my interest this week:

  • You Should Own Your Favorite Books in Hard Copy (Lifehacker) — “Most importantly, when you keep physical books around, the people who live with you can browse and try them out too.”
  • How Creative Commons drives collaboration (Vox) — “Although traditional copyright protects creators from others redistributing or repurposing their works entirely, it also restricts access, for both viewers and makers.”
  • Key Facilitation Skills: Distinguishing Weird from Seductive (Grassroots Economic Organizing) — “As a facilitation trainer the past 15 years, I’ve collected plenty of data about which lessons have been the most challenging for students to digest.”
  • Why Being Bored Is Good (The Walrus) — “Boredom, especially the species of it that I am going to label “neoliberal,” depends for its force on the workings of an attention economy in which we are mostly willing participants.”
  • 5: People having fun on the internet (Near Future Field Notes) — “The internet is still a really great place to explore. But you have to get back into Internet Nature instead of spending all your time in Internet Times Square wondering how everything got so loud and dehumanising.”
  • The work of a sleepwalking artist offers a glimpse into the fertile slumbering brain (Aeon) — “Lee Hadwin has been scribbling in his sleep since early childhood. By the time he was a teen, he was creating elaborate, accomplished drawings and paintings that he had no memory of making – a process that continues today. Even stranger perhaps is that, when he is awake, he has very little interest in or skill for art.”
  • The Power of One Push-Up (The Atlantic) — “Essentially, these quick metrics serve as surrogates that correlate with all kinds of factors that determine a person’s overall health—which can otherwise be totally impractical, invasive, and expensive to measure directly. If we had to choose a single, simple, universal number to define health, any of these functional metrics might be a better contender than BMI.”
  • How Wechat censors images in private chats (BoingBoing) — “Wechat maintains a massive index of the MD5 hashes of every image that Chinese censors have prohibited. When a user sends another user an image that matches one of these hashes, it’s recognized and blocked at the server before it is transmitted to the recipient, with neither the recipient or the sender being informed that the censorship has taken place.” (A minimal sketch of this hash-matching idea follows this list.)
  • It’s Never Too Late to Be Successful and Happy (Invincible Career) — “The “race” we are running is a one-person event. The most important comparison is to yourself. Are you doing better than you were last year? Are you a better person than you were yesterday? Are you learning and growing? Are you slowly figuring out what you really want, what makes you happy, and what fulfillment means for you?”
  • ‘Blitzscaling’ Is Choking Innovation—and Wasting Money (WIRED) — “If we learned anything from the dotcom bubble at the turn of the century, it’s that in an environment of abundant capital, money does not necessarily bestow competitive advantage. In fact, spending too much, too soon, on unproven business models only heightens the risk that a company’s race for global domination can become a race to oblivion.”
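
The WeChat item above describes a simple server-side mechanism: hash each image, compare the digest against an index of banned hashes, and silently drop matches before delivery. Here is a minimal sketch of that idea; the blocklist entry and function names are placeholders of my own, not WeChat's actual implementation.

```python
# A minimal sketch of hash-based image blocking as described above.
# The blocklist entry and names are hypothetical; real systems are far more
# involved (perceptual hashing, handling of re-encoded images, and so on).
import hashlib

BLOCKED_MD5S = {
    "5d41402abc4b2a76b9719d911017c592",  # placeholder hash, for illustration only
}

def is_blocked(image_bytes: bytes) -> bool:
    """Return True if the image's MD5 digest appears on the blocklist."""
    return hashlib.md5(image_bytes).hexdigest() in BLOCKED_MD5S

def relay_image(image_bytes: bytes, deliver) -> None:
    """Silently drop blocked images before they reach the recipient."""
    if is_blocked(image_bytes):
        return  # neither sender nor recipient is told anything happened
    deliver(image_bytes)
```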

Image: Federation Square by Julien used under a Creative Commons license
