AI and bullshit jobs
I had the pleasure of working with the large-brained Helen Beetham when I was at Jisc just over a decade ago. In this long-ish post, she covers quite a few areas, with plenty of links, and pulls the threads together around graduate jobs and an AI curriculum.
While I could have quoted a lot of this, especially around innovation, the stories being told to graduates, and the neo-colonial nature of AI companies, I’ve gone for the last three paragraphs in which Helen discusses bullshit jobs. I’d highly recommend reading the whole thing.
My hope is that, rather than a curriculum ‘for AI’, these conversations would create space for learning that addresses human challenges. Getting life on earth out of the mess that fossil fuels and rampant production have made of it will take all the graduate labour we can produce and more. Nobody is going to be without meaningful work - not climate scientists or green energy specialists or engineers or geologists or computer scientists or materials chemists or statisticians. Not a single person educated in the STEM subjects beloved of governments everywhere can be left idle. But nor are we getting out of this without social scientists to help us weather the social and economic and political storms, humanities graduates to develop new laws and policies, new philosophies and imagined futures, and professionals committed to a just transition in their own spheres of work. And there are other crises, entwined with the climate crisis, that graduates need and want to address, such as galloping economic inequality, crises of democracy and human rights, food and water shortages, and the crisis of care. Universities can offer fewer and fewer guarantees of secure employment and decent pay, but they can offer meaningful work, justifying students’ investment in the future.

The longer you look at the things ChatGPT can do, the more they resemble what David Graeber described as Bullshit Jobs - jobs that don’t need doing. While I don’t agree with the way he singles out specific job roles, Graeber is surely right that more and more work involves doing things with data and information and ‘content’ that has no value beyond maintaining those systems. And one claim he made that is borne out by workplace research is that meaningless work is bad for people’s mental health.

It’s a nice little aphorism that ‘if AI can do your job, AI should do your job’. But here’s a different one. If AI can ‘do’ your job, you deserve a better job. And if meaningless jobs are bad for workers’ mental health, how much worse are they for all our futures? The phrase ‘fiddling while Rome burns’ hardly begins to cover our present situation. As the polycrisis heats up, the crisis of not enough water-cooler text is not something any graduate should have to care about, nor any university curriculum either.

Source: ‘Luckily, we love tedious work’ | Helen Beetham