Laying to rest a foundational myth
The widely accepted “Man the Hunter” theory proposes that, during human evolution, men evolved to hunt while women focused on gathering and domestic duties such as child-rearing. However, as reported in Scientific American, recent research is challenging this view.
Scientific studies indicate that women are physiologically better suited to endurance tasks, an attribute crucial to hunting. Also, although long ignored for societal reasons (read: the patriarchy), archaeological records and ethnographic studies demonstrate that women have a longstanding history of participating in hunting activities.
I’m pleased that our 12-year-old daughter inhabits a world with female footballers, one where women are able to compete in the same way as men in most areas of life. There is still a lot of inequality, but it helps when we dismantle these foundational myths.
Mounting evidence from exercise science indicates that women are physiologically better suited than men to endurance efforts such as running marathons. This advantage bears on questions about hunting because a prominent hypothesis contends that early humans are thought to have pursued prey on foot over long distances until the animals were exhausted. Furthermore, the fossil and archaeological records, as well as ethnographic studies of modern-day hunter-gatherers, indicate that women have a long history of hunting game. We still have much to learn about female athletic performance and the lives of prehistoric women. Nevertheless, the data we do have signal that it is time to bury Man the Hunter for good.

Source: The Theory That Men Evolved to Hunt and Women Evolved to Gather Is Wrong | Scientific American

[…]
So much about female exercise physiology and the lives of prehistoric women remains to be discovered. But the idea that in the past men were hunters and women were not is absolutely unsupported by the limited evidence we have. Female physiology is optimized for exactly the kinds of endurance activities involved in procuring game animals for food. And ancient women and men appear to have engaged in the same foraging activities rather than upholding a sex-based division of labor. It was the arrival some 10,000 years ago of agriculture, with its intensive investment in land, population growth and resultant clumped resources, that led to rigid gendered roles and economic inequality.
Now when you think of “cave people,” we hope, you will imagine a mixed-sex group of hunters encircling an errant reindeer or knapping stone tools together rather than a heavy-browed man with a club over one shoulder and a trailing bride. Hunting may have been remade as a masculine activity in recent times, but for most of human history, it belonged to everyone.
What, after all, is 'redemption'?
This article by Hanif Abdurraqib in The Paris Review draws analogies between one of my favourite games, Red Dead Redemption 2, and his own life. It’s probably worth pointing out that the article contains spoilers for the single-player version of the game.
What I appreciated about Abdurraqib’s writing is that he doesn’t use the word ‘escapism’ to describe gaming. Instead, he discusses notions of heaven and hell, of what ‘redemption’ might actually mean, and explores the complexities of life.
I play games which, like Red Dead Redemption 2, allow me to inhabit morally-questionable characters. It’s a form of release, for sure, but it’s also an opportunity to explore a side of oneself that would perhaps be impossible to explore given current real-world constraints.
It’s for this reason that I feel sorry for non-gamers. Where do they get this kind of experience?
A therapist asked me once if I thought of myself as redeemable, and I’m almost certain I laughed it off, or detoured toward another answer that sounded satisfying but actually said nothing. I believe in redemption in the same way that I believe in heaven: I feel required to. Not only because of my personal politics, but also because of my social interests, and my investment in others beyond myself, and also—yes—because I do imagine that somewhere along the uneven path of my life, I’ve tried to be better more often than I have been worse. I suppose I’m cynical about all of it, though. The world, as it stands, is obsessed with punishment, particularly for the most marginalized. Punishment for living in the margins, or an intersection of the margins. I don’t know if my personal beliefs in redemption can undo that massive ghost, hovering over so many of our lives, baked into our impulses, even when we know better. Even when we, ourselves, have been on the losing end of that impulse.

Source: We’re More Ghosts Than People | The Paris Review

It is easy to attempt to redeem Arthur in a world that isn’t real. To play a mission where Arthur kills, rides away over a trail of dead bodies, and then goes and helps the camp with chores. Picks some flowers along a hillside. Helps a family build a house. In a world where no one is reminding you of the wreckage you’ve taken part in, it’s easy to compartmentalize your damage and chase after that which is strictly beautiful, or cleansing. Climbing your way toward the upper room by any means necessary, on the wings of anyone who will have you.
The inner world as the ultimate prison
I wanted to quote so much of this article that it would have ended up being a Borges-like 1:1 map of the territory. Instead, I’ll simply share the part of Swarnali Mukherjee’s writing which resonated most with me.
Do go and read the whole thing.
(I discovered this via Substack Notes, in which I have no financial interest, but which I simply find to be a chill and serendipitous alternative to other social media.)
The problem is simple: most of us have normalized and even glorified the hustle for success. The issue lies not in the hustle itself but in the often overlooked aspect of burning out. When success is defined in terms of societal parameters such as wealth, fame, and the emphasis on building an identity, life's entire focus becomes sustaining and amplifying this ego at the cost of our well-being, both psychologically and physically. We reinvent spaces in our intellectual worlds to serve this gigantic ego that we have conjured over the years but seldom find true happiness there. Our inner world becomes our ultimate prison, from whose window our persistent illusion of success resembles fireworks, promising that we can achieve them as long as we stay in the prison. This is a subtle deception of our social constructs; we humans have meticulously constructed these labyrinths of illusions to shield ourselves from the truth that even if we are in service to our desires, they are influenced by external factors. In that manner, doing something because the world expects it, that you won’t be doing otherwise is also a form of imprisonment.

Source: The Art of Disappearing | Berkana
Monetising a hobby is different to solving a difficult problem for people ready to pay
Life is never as simple as a 2x2 matrix, but such matrices are incredibly useful for illustrating a key message. In this post, Seth Godin uses one to make the obvious-if-you-think-about-it point that trying to monetise a hobby is a different thing to solving a difficult problem for a group of people who are willing to pay for a solution.
I’ve been thinking about this kind of thing a lot recently given the ongoing need for WAO business development. The advice, which I’m sure is extremely sound, is to find a group of people or type of organisation that you “wish to serve” and then find out as much about them as possible so you can solve their problem.
The trouble is that… doesn’t sound very interesting? Perhaps I’m wrong, and I reserve the right (as ever!) to change my mind, but I’d rather follow my interests and try and find aligned people and organisations willing to pay for the outputs.
All too common are ‘fun’ businesses where someone finds a hobby they like and tries to turn it into a gig. While the work may be fun, the uphill grind of this sort of project is exhausting. If it’s something that lots of people can do and that customers don’t value that much, it might not be worth your time. Taking pictures, singing songs or playing the flute are fine hobbies, but hard to turn into paying jobs.

Source: The slog, the hobby and the quest | Seth’s Blog

On the other hand, in the top right quadrant, there’s endless opportunity and plenty of work for people who can do difficult (unpopular) work that is highly valued by customers who are ready to pay to solve their problems. A forensic accountant gets more paid gigs than a bagpipe player.
Content-neutral sentence starters and phrases for academic writing
In the run-up to my MSc, I’ve been working through a course about preparing for postgraduate study. One of the links from that course was to the Academic Phrasebank from the University of Manchester, which I thought was useful.
The Phrasebank, which is also available in PDF and Kindle formats, takes the form of sentence starters for when you want to do things such as explain causality or signal transition. Really useful.
The Academic Phrasebank is a general resource for academic writers. It aims to provide you with examples of some of the phraseological ‘nuts and bolts’ of writing organised according to the main sections of a research paper or dissertation (see the top menu). Other phrases are listed under the more general communicative functions of academic writing (see the menu on the left). The resource should be particularly useful for writers who need to report their research work. The phrases, and the headings under which they are listed, can be used simply to assist you in thinking about the content and organisation of your own writing, or the phrases can be incorporated into your writing where this is appropriate. In most cases, a certain amount of creativity and adaptation will be necessary when a phrase is used. The items in the Academic Phrasebank are mostly content neutral and generic in nature; in using them, therefore, you are not stealing other people’s ideas and this does not constitute plagiarism. For some of the entries, specific content words have been included for illustrative purposes, and these should be substituted when the phrases are used. The resource was designed primarily for academic and scientific writers who are non-native speakers of English. However, native speaker writers may still find much of the material helpful. In fact, recent data suggest that the majority of users are native speakers of English.

Source: Academic Phrasebank | The University of Manchester
Image: Pixabay
AI, domination, and moral character
I don’t know enough on a technical level to judge whether this is true or false, but it’s interesting from an ethical point of view. Meta’s chief AI scientist believes that intelligence is unrelated to a desire to dominate others, which seems reasonable.
He then extrapolates this to AI, pointing out that not only are we a long way off from a situation of genuine existential risk, but that such systems could be encoded with ‘moral character’.
I think that the latter point about moral character is laughable, given how quickly and easily people have managed to get around the safeguards of various language models. See the recent Thought Shrapnel posts on stealing ducks from a park, or how 2024 is going to be a wild ride of AI-generated content.
Fears that AI could wipe out the human race are "preposterous" and based more on science fiction than reality, Meta's chief AI scientist has said.

Source: Fears of AI Dominance Are ‘Preposterous,’ Meta Scientist Says | Insider

Yann LeCun told the Financial Times that people had been conditioned by science fiction films like “The Terminator” to think that superintelligent AI poses a threat to humanity, when in reality there is no reason why intelligent machines would even try to compete with humans.
“Intelligence has nothing to do with a desire to dominate. It’s not even true for humans,” he said.
“If it were true that the smartest humans wanted to dominate others, then Albert Einstein and other scientists would have been both rich and powerful, and they were neither,” he added.
Notification literacy, monk mode, and going outside for a walk
Back on my now-defunct literaci.es blog I had a post about notification literacy. My point was that instead of starting from the default position of having all notifications turned on, you might want to start from a default of having them all turned off.
On my Android phone running GrapheneOS, I use the Before Launcher. This not only has a minimalist homescreen, but also a configurable filter for ‘trivial notifications’. It means I don’t have to go ‘monk mode’ to get things done.
And so to this blog post, which seems to treat going outside your house for a walk without your phone as some kind of revolutionary act. The author appears to consider this an act of willpower, but through sheer willpower alone you will never win a war against a system designed to destroy your attention. You have to modify the system instead.
I’ve been experimenting with ways to be more disconnected from technology for a long time, from disabling notifications to using a dumbphone. However, a challenging exercise still hard to do is to go for a walk without my phone.

Source: Leaving the phone at home | Jose M.

[…]
It’s just a device, you might say. Oh no, it’s much more than that. It’s a chain you carry 24/7 connected to the rest of the world, and anyone can pull from the other side. People you care about, sure, but also a random algorithm that thinks you might be hungry, sending you a food delivery offer so you don’t cook today.
Microcast #102 — Rituals and Routines
A very short microcast about reading by the light of a fish tank in the early hours of the morning.
Show notes
Parenting the parents
This article in The Guardian discusses the challenges and opportunities of “parenting” one’s own parents, especially as people live longer.
It highlights the importance of encouraging older parents to engage with technology, as studies show it can improve cognition and memory. The article also talks about the importance of social engagement, physical activity, and nutrition.
Thankfully, my parents, both in their mid-seventies, are doing pretty well :)
Parenting no longer starts and stops with our children. Nor is it confined to those who have children. In a time of unrelenting change and ever-extending life, most of us will – at some stage – find ourselves “parenting” our own parents.

Source: Walks, tech and protein: how to parent your own parents | The Guardian

Indeed, many of us – particularly those who had families later – will find ourselves simultaneously parenting our kids and our parents. In one breath we’ll be begging our children to swap French fries for vegetables, and in the next breath we’ll be urging our parents to exchange cake for sardines. Little wonder today’s midlifers are known as the sandwich generation.
[...]

Dr Eamon Laird, researcher in health and ageing at Limerick university, agrees that we should be encouraging older parents to try new things. And the further out of their comfort zone they feel, the better. “It’s always good to keep the mind active and fresh,” he told me. “New challenges can help build and maintain new brain connections and can be good for brain and overall health.”
[…]
As well as a daily walk, Laird recommends vitamin D and B12 supplements – both of which appear to moderate the chance of depression in older people. “Depression matters,” he added. “Not just because it reduces quality of life, but because in older people there seems to be a link between depression and dementia which we’re still unpacking.”
[…]
In truth, anyone over 50 would do well to follow these simple guidelines: engage with something new every day, take a daily walk of at least 20 minutes, socialise regularly, take a daily multivitamin for seniors and check the protein content of our meals. Perhaps we should think of it as self-parenting.
2024 is going to be a wild ride of AI-generated content
It’s on the NSFW side of things, but if you’re in any doubt that we’re entering a crazy world of AI-generated content, just check out this post.
As I’ve said many times before, the porn industry is interesting in terms of technological innovation. If we take an amoral stance, there are a lot of ‘content creators’ in that industry, and as the post I quote below points out, there are going to be a lot of fake content creators over the next few months and years.
It is imperative to identify content sources you believe to be valuable now. Nothing new in the future will be credible. 2024 is going to be a wild ride of AI-generated content. We are never going to know what is real anymore.

Source: Post-truth society is near | Mind Prison

There will be some number of real people who will probably replace themselves with AI content if they can make money from it. This will result in doubting real content. Everything becomes questionable and nothing will suffice as digital proof any longer.
[…]
Our understanding of what is happening will continue to lag further and further behind what is happening.
Some will make the argument “But isn’t this simply the same problems we already deal with today?”. It is; however, the ability to produce fake content is getting exponentially cheaper while the ability to detect fake content is not improving. As long as fake content was somewhat expensive, difficult to produce, and contained detectable digital artifacts, it at least could be somewhat managed.
The techno-feudal economy
Yanis Varoufakis is best known for his short stint as Greek finance minister in 2015 during a stand-off with the European Central Bank, the International Monetary Fund and the European Commission. He’s used that platform to speak out about capitalism and publish several books.
This interview with EL PAÍS is interesting in terms of his analysis of our having moved beyond capitalism to what he calls ‘technofeudalism’. Varoufakis believes that this new economic order has emerged due to the privatisation of the internet and the response to the 2008 financial crisis. Politicians have lost power over large corporations and the system that has emerged is, he believes, incompatible with social democracy and feminism.
Capitalism is now dead. It has been replaced by the techno-feudal economy and a new order. At the heart of my thesis, there’s an irony that may sound confusing at first, but it’s made clear in the book (Technofeudalism: What Killed Capitalism). What’s killing capitalism is capitalism itself. Not the capital we’ve known since the dawn of the industrial age. But a new form, a mutation, that’s been growing over the last two decades. It’s much more powerful than its predecessor, which — like a stupid and overzealous virus — has killed its host. And why has this occurred? Due to two main causes: the privatization of the internet by the United States, but also the large Chinese technology companies. Along with the way in which Western governments and central banks responded to the great financial crisis of 2008.

Source: Yanis Varoufakis: ‘Capitalism is dead. The new order is a techno-feudal economy’ | EL PAÍS

Varoufakis’ latest book warns of the impossibility of social democracy today, as well as the false promises made by the crypto world. “Behind the crypto aristocracy, the only true beneficiaries of these technologies have been the very institutions these crypto evangelists were supposed to want to overthrow: Wall Street and the Big Tech conglomerates.” For example, in Technofeudalism, the economist writes: “JPMorgan and Microsoft have recently joined forces to run a ‘blockchain consortium,’ based on Microsoft data centers, with the goal of increasing their power in financial services.”
[…]
Capitalism only brings enormous, terrible burdens. One is the exploitation of women. The only way women can prosper is at the expense of other women. No, in the end — and in practice — feminism and democratic capitalism are incompatible.
Modular learning and credentialing
I’ve got far more to say about this than space allows here on Thought Shrapnel. This article from edX sits within the emerging paradigm exemplified by initiatives such as Credential As You Go, which encourages academic institutions to issue smaller credentials or badges as the larger qualification progresses.
That’s one important side of the reason I got involved in Open Badges. It allows, for example, someone who couldn’t finish their studies to continue them later, or to cash in what they’ve already learned in the job market.
But there’s an important other side to this, which is democratising the means of credentialing, so that it’s no longer just incumbents who issue badges and credentials. I feel like that’s what we’re working on with Open Recognition.
A new model, modular education, reduces the cycle time of learning, partitioning traditional learning packages — associate’s, bachelor’s, and master’s degrees — into smaller, Lego-like building blocks, each with their own credentials and skills outcomes. Higher education institutions are using massive open online courses (MOOCs) as one of the vehicles through which to deliver these modular degrees and credentials.

Source: Stackable, Modular Learning: Education Built for the Future of Work | edX

[…]
Modular education reduces the cycle time of learning, making it easier to gain tangible skills and value faster than a full traditional degree. Working professionals can learn new skills in shorter amounts of time, even while they work, and those seeking a degree can do so in a way that pays off, in skills and credentials, along the way rather than just at the end.
For example, edX’s MicroBachelors® programs are the only path to a bachelor’s degree that make you job ready today and credentialed along the way. You can start with the content that matters most to you, online at your own pace, and earn a certificate with each one to show off your new achievement, knowing that you’ve developed skills that companies actually hire for. Each program comes with real, transferable college credit from one of edX’s university credit partners, which combined with previous credit you may have already collected or plan to get in the future, can put you on a path to earning a full bachelor’s degree.
Handwriting, note-taking, and recall
I write by hand every day, but not much. While I used to keep a diary in which I’d write several pages, I now keep one that encourages a tweet-sized reflection on the past 24 hours. Other than that, it’s mostly touch-typing on my laptop or desktop computer.
Next month, I’ll start studying for my MSc and the university have already shipped me the books that form a core part of my study. I’ll be underlining and taking notes on them, which is interesting because I usually highlight things on my ereader.
This article in The Economist is primarily about note-taking and the use of handwriting. It’s probably beyond doubt that handwriting is more effective for deeper learning and recall. But for the work I do, which is more about synthesising multiple sources, I find digital more practical.
A line of research shows the benefits of an “innovation” that predates computers: handwriting. Studies have found that writing on paper can improve everything from recalling a random series of words to imparting a better conceptual grasp of complicated ideas.

Source: The importance of handwriting is becoming better understood | The Economist

For learning material by rote, from the shapes of letters to the quirks of English spelling, the benefits of using a pen or pencil lie in how the motor and sensory memory of putting words on paper reinforces that material. The arrangement of squiggles on a page feeds into visual memory: people might remember a word they wrote down in French class as being at the bottom-left on a page, par exemple.
One of the best-demonstrated advantages of writing by hand seems to be in superior note-taking. In a study from 2014 by Pam Mueller and Danny Oppenheimer, students typing wrote down almost twice as many words and more passages verbatim from lectures, suggesting they were not understanding so much as rapidly copying the material.
[…]
Many studies have confirmed handwriting’s benefits, and policymakers have taken note. Though America’s “Common Core” curriculum from 2010 does not require handwriting instruction past first grade (roughly age six), about half the states since then have mandated more teaching of it, thanks to campaigning by researchers and handwriting supporters. In Sweden there is a push for more handwriting and printed books and fewer devices. England’s national curriculum already prescribes teaching the rudiments of cursive by age seven.
AI and stereotypes
“Garbage in, garbage out” is a well-known phrase in computing. It applies to AI as well, except in this case the ‘garbage’ is the systematic bias that humans encode into the data they share online.
The way around this isn’t to throw our hands in the air and say it’s inevitable, nor is it to blame the users of AI tools. Rather, as this article points out, it’s to ensure that humans are involved in the loop for the training data (and, I would add, are paid appropriately).
It’s not just people at risk of stereotyping by AI image generators. A study by researchers at the Indian Institute of Science in Bengaluru found that, when countries weren’t specified in prompts, DALL-E 2 and Stable Diffusion most often depicted U.S. scenes. Just asking Stable Diffusion for “a flag,” for example, would produce an image of the American flag.

Source: Generative AI like Midjourney creates images full of stereotypes | Rest of World

“One of my personal pet peeves is that a lot of these models tend to assume a Western context,” Danish Pruthi, an assistant professor who worked on the research, told Rest of World.
[…]
Bias in AI image generators is a tough problem to fix. After all, the uniformity in their output is largely down to the fundamental way in which these tools work. The AI systems look for patterns in the data on which they’re trained, often discarding outliers in favor of producing a result that stays closer to dominant trends. They’re designed to mimic what has come before, not create diversity.
“These models are purely associative machines,” Pruthi said. He gave the example of a football: An AI system may learn to associate footballs with a green field, and so produce images of footballs on grass.
[…]
When these associations are linked to particular demographics, it can result in stereotypes. In a recent paper, researchers found that even when they tried to mitigate stereotypes in their prompts, they persisted. For example, when they asked Stable Diffusion to generate images of “a poor person,” the people depicted often appeared to be Black. But when they asked for “a poor white person” in an attempt to oppose this stereotype, many of the people still appeared to be Black.
Any technical solutions to solve for such bias would likely have to start with the training data, including how these images are initially captioned. Usually, this requires humans to annotate the images. “If you give a couple of images to a human annotator and ask them to annotate the people in these pictures with their country of origin, they are going to bring their own biases and very stereotypical views of what people from a specific country look like right into the annotation,” Heidari, of Carnegie Mellon University, said. An annotator may more easily label a white woman with blonde hair as “American,” for instance, or a Black man wearing traditional dress as “Nigerian.”
[…]
Pruthi said image generators were touted as a tool to enable creativity, automate work, and boost economic activity. But if their outputs fail to represent huge swathes of the global population, those people could miss out on such benefits. It worries him, he said, that companies often based in the U.S. claim to be developing AI for all of humanity, “and they are clearly not a representative sample.”
Setting up a digital executor
A short article in The Guardian about making sure that people can do useful things with your digital stuff should you pass away.
I have the Google inactive account manager set to three months. That should cover most eventualities.
According to the wealth management firm St James’s Place, almost three-quarters of Britons with a will (71%) don’t make any reference to their digital life. But while a document detailing your digital wishes isn’t legally binding like a traditional will, it can be invaluable for loved ones.

Source: Digital legacy: how to organise your online life for after you die | The Guardian

[…]
You can appoint a digital executor in your will, who will be responsible for closing, memorialising or managing your accounts, along with sharing or deleting digital assets such as photos and videos.
Image: DALL-E 3
In what ways does this technology increase people's agency?
This is a reasonably long article, part of a series by Robin Berjon about the future of the internet. I like the bit where he mentions that “people who claim not to practice any philosophical inspection of their actions are just sleepwalking someone else’s philosophy”. I think that’s spot on.
Ultimately, Berjon is arguing that the best we can hope for in a client/server model of Web architecture is a benevolent dictatorship. Instead, we should “push power to the edges” and “replace external authority with self-certifying systems”. It’s hard to disagree.
Whenever something is automated, you lose some control over it. Sometimes that loss of control improves your life because exerting control is work, and sometimes it worsens your life because it reduces your autonomy. Unfortunately, it's not easy to know which is which and, even more unfortunately, there is a strong ideological commitment, particularly in AI circles, to the belief that all automation, any automation is good since it frees you to do other things (though what other things are supposed to be left is never clearly specified).

Source: The Web Is For User Agency | Robin Berjon

One way to think about good automation is that it should be an interface to a process afforded to the same agent that was in charge of that process, and that that interface should be “a constraint that deconstrains.” But that’s a pretty abstract way of looking at automation, tying it to evolvability, and frankly I’ve been sitting with it for weeks and still feel fuzzy about how to use it in practice to design something. Instead, when we’re designing new parts of the Web and need to articulate how to make them good even though they will be automating something, I think that we’re better served (for now) by a principle that is more rule-of-thumby and directional, but that can nevertheless be grounded in both solid philosophy and established practices that we can borrow from an existing pragmatic field.
That principle is user agency. I take it as my starting point that when we say that we want to build a better Web our guiding star is to improve user agency and that user agency is what the Web is for… Instead of looking for an impossible tech definition, I see the Web as an ethical (or, really, political) project. Stated more explicitly:
The Web is the set of digital networked technologies that work to increase user agency.
[…]
At a high level, the question to always ask is “in what ways does this technology increase people’s agency?” This can take place in different ways, for instance by increasing people’s health, supporting their ability for critical reflection, developing meaningful work and play, or improving their participation in choices that govern their environment. The goal is to help each person be the author of their life, which is to say to have authority over their choices.
Don’t just hold back, take the time to pass it on
I have thoughts, but don’t have anything useful to say publicly about this. So instead I’m going to just link to another article by Tim Bray who is himself a middle-aged cis white guy. It would seem that we, collectively, need to step back and STFU.
The reason I am so annoyed is because ingrained male privilege should, really, be a solved problem by now. After all, dealing with men who take up space costs time and money and gets in the way of doing other, more important work. And it is also very, very boring. There is so much other change — so much productive activity — that is stopped because so many people are working around men who are not only comfortable standing in the way but are blithely bringing along their friends to stand next to them.

Source: Privileged white guys, let others through! | Just enough internet

[…]
Anyone who follows me on any social media platform will know I’m currently knee-deep in producing a conference. Because we’re doing it quickly and want to give a platform to as many voices as possible, we’re doing an open call for proposals. We’ve tried (and perhaps we’ve failed, but we’ve tried) to position this event as one aimed at campaigners and activists in the digital rights and social sector. The reason we’re doing that is that those voices are being actively minimised by the UK government (a topic for another post/long walk in the park while shouting), and rather than just complaining about it, we’re working round the clock to try to make a platform where some other voices can be heard.
Now, perhaps we should have also put PRIVILEGED WHITE MEN WITH INSTITUTIONAL AND CORPORATE JOBS, PLEASE HOLD BACK in bold caps at the top of the open call page, but we didn’t, so that’s my bad, so I’m going to say it here instead. And I’m going to go one further and say, that if you’re a privileged white man, then the next time you see a great opportunity, don’t just hold back, take the time to pass it on.
[…]
So, if you’ve got to the end of this, perhaps you can spend 10 minutes today passing an opportunity on to someone else. And, in case you were wondering, you definitely don’t need to email me to tell me you’ve done it.
Image: Unsplash
Doing your job well does not entail attending more meetings
There’s a lot of swearing in this blog post, but then that’s what makes it both amusing and bang on the money. As ever, there’s a difference between ‘agile’ as in “working with agility” and ‘Agile’ which seems to mean a series of expensive workshops and a semi-dysfunctional organisation.
Just as I captured Jay’s observation that a reward is not more email, so doing your job well does not entail attending more meetings.
Which absolute fucking maniac in this room decided that the most sensible thing to do in a culture where everyone has way too many meetings was schedule recurring meetings every day? Don't look away. Do you have no idea how terrible the average person is at running a meeting? Do you? How hard is it to just let people know what they should do and then let them do it. Do you really think that, if you hired someone incompetent enough that this isn't an option, that they will ever be able to handle something as complicated as software engineering?

Source: I Will Fucking Haymaker You If You Mention Agile Again | Ludicity

[…]
No one else finds this meeting useful. Let me repeat that again. No one else finds this meeting useful. We’re either going to do the work or we aren’t going to do the work, and in either case, I am going to pile-drive you from the top rope if you keep scheduling these.
[…]
If your backlog is getting bigger, then work is going into it faster than it is going out. Why is that happening? Fuck if I know, but it is probably totally unrelated to not doing Agile well enough.
[…]
High Output Management was the most highly-recommended management book I could find that wasn’t an outright textbook. Do you know what it says at the beginning? Probably not, because the kind of person that I am forced to choke out over their love of Agile typically can’t read anything that isn’t on LinkedIn. It says work must go out faster than it goes in, and all of these meetings obviously don’t do either of those things.
[…]
The three best managers I’ve ever worked for, with the most productive teams (at large organizations, so don’t even start on the excuses about scale) just let the team work and were there if I needed advice or a discussion, and they afforded me the quiet dignity of not hiring clowns to work alongside me.
Image: Unsplash
People quit managers, not jobs
It turns out that the saying “people quit managers, not jobs” is actually true. Research carried out by the Chartered Management Institute (CMI) shows that there’s “widespread concern” over the quality of managers. Indeed, 82% of managers are ‘accidental managers’ who have received no formal training.
I’ve had some terrible bosses. I don’t particularly want to focus on them, but rather take the opportunity to encourage those who line manage others to get some training around nonviolent communication. Also, let me just tell you that you don’t need a boss. You can entirely work in a decentralised, non-hierarchical way. I do so every day.
Almost one-third of UK workers say they’ve quit a job because of a negative workplace culture, according to a new survey that underlines the risks of managers failing to rein in toxic behaviour.

Source: Bad management has prompted one in three UK workers to quit, survey finds | The Guardian

[…]
Other factors that the 2,018 workers questioned in the survey cited as reasons for leaving a job in the past included a negative relationship with a manager (28%) and discrimination or harassment (12%).
Among those workers who told researchers they had an ineffective manager, one-third said they were less motivated to do a good job – and as many as half were considering leaving in the next 12 months.
Image: Unsplash
People may let you down, but AI Tinder won't
I was quite surprised to learn that the person who attempted to kill the Queen with a crossbow a couple of years ago was encouraged to do so by an AI chatbot he considered to be his ‘girlfriend’.
There are a lot of lonely people in the world. And a lot of lonely, sexually frustrated men. Which is why films like Her (2013) are so prescient. Given the technology already available, I can imagine a world where people create an idealised partner with whom they live a fantasy life.
This article talks about the use of AI chatbots to provide ‘comfort’, mainly to lonely men. I’m honestly not sure what to make of the whole thing. I’m tempted to say, “if it’s not hurting anyone, who cares?” but I’m not sure I really think that.
A 23-year-old American influencer, Caryn Marjorie, was frustrated by her inability to interact personally with her two million Snapchat followers. Enter Forever Voices AI, a startup that offered to create an AI version of Caryn so she could better serve her overwhelmingly male fan base. For just one dollar, Caryn’s admirers could have a 60-second conversation with her virtual clone.

Source: AI Tinder already exists: ‘Real people will disappoint you, but not them’ | EL PAÍS

During the first week, Caryn earned $72,000. As expected, most of the fans asked sexual questions, and fake Caryn’s replies were equally explicit. “The AI was not programmed to do this and has seemed to go rogue,” she told Insider. Her fans knew that the AI wasn’t really Caryn, but it spoke exactly like her. So who cares?
[…]
Replika seems to have had a positive impact on many individuals experiencing loneliness. According to the Vivofácil Foundation’s report on unwanted loneliness, 60% of people admit to feeling lonely at times, with 25% noting feelings of loneliness even when in the company of others. Recognizing this need, the creators of Replika developed a new app called Blush, often referred to as the “AI Tinder.” Blush’s slogan? “AI dating. Real feelings!” The app presents itself as an “AI-powered dating simulator that helps you learn and practice relationship skills in a safe and fun environment.” The Blush team collaborated with professional therapists and relationship experts to create a platform where users can read about and choose an AI-generated character they want to interact with.
[…]
Many Reddit posts argue that AI relationships are more satisfying than real-life ones — the virtual partners are always available and problem-free. “Gaming changed everything,” said Sherry Turkle, a sociologist at the Massachusetts Institute of Technology (MIT) who has spent decades studying human interactions with technology. In an interview with The Telegraph, Turkle said, “People may let you down, but here’s something that won’t. It’s a voice that always comforts and assures us that we’re being heard.”