The jobs of the future will involve cleaning up environmental and political and epistemological disaster
I saw something recently which suggested that, in the US at least, the number of jobs for software developers peaked in 2019 and has been going down ever since. Good job everyone didn’t retrain as programmers, then.
There are any number of think tanks and policy outlets which tell you what they think the future of work, society, the economy, and so on will be like. Of course, none of these organisations is neutral and, at the end of the day, all have a worldview to foist upon the rest of us. The World Economic Forum is one of these bodies and, as Audrey Watters discusses in her latest missive, it predicts the most ridiculous things.
I remember reading Fully Automated Luxury Communism by Aaron Bastani when it came out, pre-pandemic. I was optimistic about the role of technology, including AI, as a way of providing for everyone’s needs. But the way that it’s actually being rolled out, especially post-pandemic, now that the hypercapitalists and neo-fascists have removed their masks, has left me somewhat more fearful.
It’s a broad generalisation, but you’ve essentially got two options in your working life: you can be part of the problem, or you can be part of the solution. Sadly, there’s a lot of money to be made in being part of the problem.
Reports issued by the World Economic Forum and the like are a kind of “futurology” – speculation, predictive modeling, planning for the future. “Futurology” and its version of “futurism” emerged in the mid-twentieth century as an attempt to control (and transform) the Cold War world through new kinds of knowledge production and social engineering, new technologies of knowledge production and social engineering to be more precise. (This futurism is different than the Marinetti version, the fascist version. Different-ish.) As Jenny Andersson writes in her history of post-war “future studies,” The Future of the World, these “predictive techniques rarely sought to produce objective representation of a probable future; they tried, rather, to find potential levers with which to influence human action.” These techniques, such as the Delphi method popularized by RAND, are highly technocratic — maybe even “cybernetic”? — and are deeply, deeply intertwined with not just economic forecasting, but with military scenario planning.
[…]
Futurology has always tried to sell a shiny, exciting vision for tomorrow — that is, as I argued above, what it was designed to do. But all this — all this — feels remarkably grim, despite the happy drumbeat. Without a radical adjustment to these plans for energy usage and for knowledge automation, the jobs of the future seem likely to entail things much less glamorous (or highly paid) than the invented work that gets touted in headlines (and here again, the call for this “masculine energy” sort of shit invoked the explicitly fascist elements of futurism).
[…]
The jobs of the future will involve cleaning up environmental and political and epistemological disaster. They will involve care, for the human and more-than-human world. Of course, that’s always been the work. That’s always been the consequence, always the fallout — the caretakers of the world already know.
Source: Second Breakfast (paywalled for members)
Image: Markus Spiske