Friday forebodings

    I think it's alright to say that this was a week when my spirits dropped a little. Apologies if that's not what you wanted to hear right now, and if it's reflected in what follows.

    For there to be good things there must also be bad. For there to be joy there must also be sorrow. And for there to be hope there must be despair. All of this will pass.


    We’re Finding Out How Small Our Lives Really Are

    But there’s no reason to put too sunny a spin on what’s happening. Research has shown that anticipation can be a linchpin of well-being and that looking ahead produces more intense emotions than retrospection. In a 2012 New York Times article on why people thirst for new experiences, one psychologist told the paper, “Novelty-seeking is one of the traits that keeps you healthy and happy and fosters personality growth as you age,” and another referred to human beings as a “neophilic species.” Of course, the current blankness in the place of what comes next is supposed to be temporary. Even so, lacking an ability to confidently say “see you later” is going to have its effects. Have you noticed the way in which conversations in this era can quickly become recursive? You talk about the virus. Or you talk about what you did together long ago. The interactions don’t always spark and generate as easily as they once did.

    Spencer Kornhaber (The Atlantic)

    Part of the problem with all of this is that we don't know how long it's going to last, so we can't really make plans. It's like an extended limbo where you're supposed to just get on with it, whatever 'it' is...


    Career Moats in a Recession

    If you're going after a career moat now, remember that the best skills to go after are the ones that the market will value after the recession ends. You can’t necessarily predict this — the world is complex and the future is uncertain, but you should certainly keep the general idea in mind.

    A simpler version of this is to go after complementary skills to your current role. If you've been working for a bit, it's likely that you'll have a better understanding of your industry than most. So ask yourself: what complementary skills would make you more valuable to the employers in your job market?

    Cedric James (Commonplace)

    I'm fortunate to have switched from education to edtech at the right time. Elsewhere, James says that "job security is the ability to get your next job, not keep your current one" and that this depends on your network, luck, and having "rare and valuable skills". Indeed.


    Everything Is Innovative When You Ignore the Past

    This is hard stuff, and acknowledging it comes with a corollary: We, as a society, are not particularly special. Vinsel, the historian at Virginia Tech, cautioned against “digital exceptionalism,” or the idea that everything is different now that the silicon chip has been harnessed for the controlled movement of electrons.

    It’s a difficult thing for people to accept, especially those who have spent their lives building those chips or the software they run. “Just on a psychological level,” Vinsel said, “people want to live in an exciting moment. Students want to believe they’re part of a generation that’s going to change the world through digital technology or whatever.”

    Aaron Gordon (VICE)

    Everyone thinks they live in 'unprecedented' times, especially if they work in tech.


    ‘We can’t go back to normal’: how will coronavirus change the world?

    But disasters and emergencies do not just throw light on the world as it is. They also rip open the fabric of normality. Through the hole that opens up, we glimpse possibilities of other worlds. Some thinkers who study disasters focus more on all that might go wrong. Others are more optimistic, framing crises not just in terms of what is lost but also what might be gained. Every disaster is different, of course, and it’s never just one or the other: loss and gain always coexist. Only in hindsight will the contours of the new world we’re entering become clear.

    Peter C Baker (The Guardian)

    An interesting read, outlining the optimistic and pessimistic scenarios. The coronavirus pandemic is a crisis, but of course what comes next (CLIMATE CHANGE) is even bigger.


    The Terrible Impulse To Rally Around Bad Leaders In A Crisis

    This tendency to rally around even incompetent leaders makes one despair for humanity. The correct response in all cases is contempt and an attempt, if possible, at removal of the corrupt and venal people in charge. Certainly no one should be approving of the terrible jobs they [Cuomo, Trump, Johnson] have done.

    All three have or will use their increased power to do horrible things. The Coronavirus bailout bill passed by Congress and approved by Trump is a huge bailout of the rich, with crumbs for the poor and middle class. So little, in fact, that there may be widespread hunger soon. Cuomo is pushing forward with his cuts, and I’m sure Johnson will live down to expectations.

    Ian Welsh

    I'm genuinely shocked that the current UK government's approval ratings are so high. Yes, they're covering 80% of the salary of those laid-off, but the TUC was pushing for an even higher figure. It's like we're congratulating neoliberal idiots for destroying our collective ability to respond to this crisis effectively.


    As Coronavirus Surveillance Escalates, Personal Privacy Plummets

    Yet ratcheting up surveillance to combat the pandemic now could permanently open the doors to more invasive forms of snooping later. It is a lesson Americans learned after the terrorist attacks of Sept. 11, 2001, civil liberties experts say.

    Nearly two decades later, law enforcement agencies have access to higher-powered surveillance systems, like fine-grained location tracking and facial recognition — technologies that may be repurposed to further political agendas like anti-immigration policies. Civil liberties experts warn that the public has little recourse to challenge these digital exercises of state power.

    Natasha Singer and Choe Sang-Hun (The New York Times)

    I've seen a lot of suggestions around smartphone tracking to help with the pandemic response. How, exactly, when it's trivial to spoof your location? It's just more surveillance by the back door.


    How to Resolve Any Conflict in Your Team

    Have you ever noticed that when you argue with someone smart, if you manage to debunk their initial reasoning, they just shift to a new, logical-sounding reason?

    Reasons are like a salamander’s legs — if you cut one off, another grows in its place.

    When you’re dealing with a salamander, you need to get to the heart. Forget about reasoning and focus on what’s causing the emotions. According to [non-violent communication], every negative emotion is the result of an unmet, universal need.

    Dave Bailey

    Great advice here, especially for those who work in organisations (or who have clients) who lack emotional intelligence.


    2026 – the year of the face to face pivot

    When the current crisis is over in terms of infection, the social and economic impact will be felt for a long time. One such hangover is likely to be the shift to online for so much of work and interaction. As the cartoon goes “these meetings could’ve been emails all along”. So let’s jump forward then a few years when online is the norm.

    Martin Weller (The Ed Techie)

    Some of the examples given in this post gave me a much-needed chuckle.


    Now's the time – 15 epic video games for the socially isolated

    However, now that many of us are finding we have time on our hands, it could be the opportunity we need to attempt some of the more chronologically demanding narrative video game masterpieces of the last decade.

    Keith Stuart (The Guardian)

    Well, yes, but what we probably need even more is multiplayer mode. Red Dead Redemption II is on this list, and it's one of the best games ever made. However, it's tinged with huge sadness for me, as it's a game I greatly enjoyed playing with the late, great Dai Barnes.


    Enjoy this? Sign up for the weekly roundup, become a supporter, or download Thought Shrapnel Vol.1: Personal Productivity!


    Header image by Alex Fu

    Friday flickerings

    I've tried to include links to other things here but, just like all roads lead to Rome, all links eventually point to the pandemic.

    I hope you and the people you care about are well. Stay safe, stay indoors, and let me know which of the following resonate with you!


    Supermensch

    Our stories about where inventiveness comes from, and how the future will be made, overwhelmingly focus on the power of the individual. Such stories appeal to the desire for human perfection (and redemption?) recast in technological language, and they were integral to the way that late-19th-century inventor-entrepreneurs, such as Tesla or Thomas Edison, presented themselves to their publics. They’re still very much part of the narrative of technological entrepreneurism now. Just as Tesla wanted to be seen as a kind of superhero of invention, unbound by conventional restraints, so too do his contemporary admirers at the cutting edge of the tech world. Superheroes resonate within that culture precisely because they embody in themselves the perception of technology as something that belongs to powerful and iconoclastic individuals. They epitomise the idea that technological culture is driven by outsiders. The character of Iron Man makes this very clear: after all, he really is a tech entrepreneur, his superpowers the product of the enhanced body armour he wears.

    Iwan Rhys Morus (Aeon)

    A really interesting read about the link between individualism, superheroes, technology, and innovation.


    The Second Golden Age of Blogging

    Blogging was then diffused into social media, but now social media is so tribal and algo-regulated that anybody with a real message today needs their own property. At the same time, professional institutions are increasingly suffocated by older, rent-seeking incumbents and politically-correct upstarts using moralism as a career strategy. In such a context, blogging — if it is intelligent, courageous, and consistent — is currently one of the most reliable methods for intellectually sophisticated individuals to accrue social and cultural capital outside of institutions. (Youtube for the videographic, Instagram for the photographic, podcasting for the loquacious, but writing and therefore blogging for the most intellectually sophisticated.)

    Justin Murphy (Other Life)

    I've been blogging since around 2004, so for sixteen years, and through all of my career to date. It's the best and most enjoyable thing about 'work'.


    NASA Fixes Mars Lander By Telling It to Hit Itself With a Shovel

    NASA expected its probe, dubbed “the mole,” to dig its way through sand-like terrain. But because the Martian soil clumped together, the whole apparatus got stuck in place.

    Programming InSight’s robotic arm to land down on the mole was a risky, last-resort maneuver, PopSci reports, because it risked damaging fragile power and communication lines that attached nearby. Thankfully, engineers spent a few months practicing in simulations before they made a real attempt.

    Dan Robitzski (Futurism)

    The idea of NASA engineers sending a signal to a distant probe to get it to hit itself, in the midst of a crisis on earth, made me chuckle this week.


    Act as if You’re Really There

    Don’t turn your office into a generic TV backdrop. Video is boring enough. The more you remove from the frame, the less visual data you are providing about who you are, where you live, how you work, and what you care about. If you were watching a remote interview with, say, Bong Joon-ho (the South Korean director of Parasite) would you want him sitting on a blank set with a ficus plant? Of course not. You would want to see him in his real office or studio. What are the posters on his wall? The books on his shelf? Who are his influences?

    Douglas Rushkoff (OneZero)

    Useful advice in this post from Douglas Rushkoff. I appreciate his reflection that, "every pixel is a chance to share information about your process and proclivities."


    People Are Looping Videos to Fake Paying Attention in Zoom Meetings

    On Twitter, people are finding ways to use the Zoom Rooms custom background feature to slap an image of themselves in their frames. You can record a short, looping video as your background, or take a photo of yourself looking particularly attentive, depending on the level of believability you're going for. Zoom says it isn't using any kind of video or audio analysis to track attention, so this is mostly for your human coworkers and boss' sake. With one of these images on your background, you're free to leave your seat and go make a sandwich while your boss thinks you're still there paying attention:

    Samantha Cole (Vice)

    As an amusing counterpoint to the above article, I find it funny that people are using video backgrounds in this way!


    A Guide to Hosting Virtual Events with Zoom

    There are lots of virtual event tools out there, like Google Hangouts, YouTube Live, Vimeo Live. For this guide I’ll delve into how to use Zoom specifically. However, a lot of the best practices explored here are broadly applicable to other tools. My goal is that reading this document will give you all the tools you need to be able to set up a meeting and host it on Zoom (or other platforms) in fun and interactive ways.

    Alexa Kutler (Google Docs)

    This is an incredible 28-page document that explains how to set up Zoom meetings for success. Highly recommended!


    The rise of the bio-surveillance state

    Elements of Asia’s bio-surveillance revolution may not be as far off as citizens of Western democracies assume. On 24 March an emergency bill, which would relax limits on urgent surveillance warrants, went before the House of Lords. In any case, Britain’s existing Investigatory Powers Act already allows the state to seize mobile data if national security justifies it. In another sign that a new era in data rights is dawning, the EU is reviewing its recent white paper on AI regulation and delaying a review of online privacy rules. Researchers in both Britain (Oxford) and the US (MIT) are developing virus-tracking apps inviting citizens to provide movement data voluntarily. How desperate would the search for “needles in haystacks” have to get for governments to make such submissions compulsory? Israel’s draconian new regulations – which allegedly include tapping phone cameras and microphones – show how far down this road even broadly Western democracies might go to save lives and economies.

    Jeremy Cliffe (New Statesman)

    We need urgent and immediate action around the current crisis. But we also need safeguards and failsafes so that we don't end up with post-pandemic authoritarian regimes.


    The economy v our lives? It's a false choice – and a deeply stupid one

    Soon enough, as hospitals around the world overflow with coronavirus patients, exhausting doctors, nurses, orderlies, custodians, medical supplies, ventilators and hospital cash accounts, doctors will have to make moral choices about who lives or dies. We should not supersede their judgment based on a false choice. Economic depression will come, regardless of how many we let die. The question is how long and devastating it will be.

    Siva Vaidhyanathan (The Guardian)

    Not exactly a fun read, but the truth is that the world's economy is shafted no matter which way we look at it. And as I tweeted the other day, there's no real thing, objectively speaking, called 'the economy' that exists separately from human relationships.


    How the Pandemic Will End

    Pandemics can also catalyze social change. People, businesses, and institutions have been remarkably quick to adopt or call for practices that they might once have dragged their heels on, including working from home, conference-calling to accommodate people with disabilities, proper sick leave, and flexible child-care arrangements. “This is the first time in my lifetime that I’ve heard someone say, ‘Oh, if you’re sick, stay home,’” says Adia Benton, an anthropologist at Northwestern University. Perhaps the nation will learn that preparedness isn’t just about masks, vaccines, and tests, but also about fair labor policies and a stable and equal health-care system. Perhaps it will appreciate that health-care workers and public-health specialists compose America’s social immune system, and that this system has been suppressed.

    Ed Yong (The Atlantic)

    Much of this is a bit depressing, but I've picked up on the more positive bit towards the end. See also the article I wrote earlier this week: 'People seem not to see that their opinion of the world is also a confession of character'.




    Header image by Sincerely Media.

    To others we are not ourselves but a performer in their lives cast for a part we do not even know that we are playing

    Surveillance, technology, and society

    Last week, the London Metropolitan Police ('the Met') proudly announced that they've begun using 'LFR', which is their neutral-sounding acronym for something incredibly invasive to the privacy of everyday people in Britain's capital: Live Facial Recognition.

    It's obvious that the Met expect some pushback here:

    The Met will begin operationally deploying LFR at locations where intelligence suggests we are most likely to locate serious offenders. Each deployment will have a bespoke ‘watch list’, made up of images of wanted individuals, predominantly those wanted for serious and violent offences. 

    At a deployment, cameras will be focused on a small, targeted area to scan passers-by. The cameras will be clearly signposted and officers deployed to the operation will hand out leaflets about the activity. The technology, which is a standalone system, is not linked to any other imaging system, such as CCTV, body worn video or ANPR.

    London Metropolitan Police

    Note the talk of 'intelligence' and 'bespoke watch lists', as well as promises that LFR will not be linked to any other systems. (ANPR, for those not familiar with it, is 'Automatic Number Plate Recognition'.) This, of course, is the thin end of the wedge and how these things start — in a 'targeted' way. They're expanded later, often when the fuss has died down.


    Meanwhile, a lot of controversy surrounds an app called Clearview AI which scrapes publicly-available data (e.g. Twitter or YouTube profiles) and applies facial recognition algorithms. It's already in use by law enforcement in the USA.

    The size of the Clearview database dwarfs others in use by law enforcement. The FBI's own database, which taps passport and driver's license photos, is one of the largest, with over 641 million images of US citizens.

    The Clearview app isn't available to the public, but the Times says police officers and Clearview investors think it will be in the future.

    The startup said in a statement Tuesday that its "technology is intended only for use by law enforcement and security personnel. It is not intended for use by the general public." 

    Edward Moyer (CNET)

    So there we are again: the technology is 'intended' for one purpose, but the general feeling is that it will leak out into others. Imagine if anyone could identify almost anyone on the planet simply by pointing their smartphone at them for a few seconds.

    This is a huge issue, and one that politicians and lawmakers on both sides of the Atlantic are ill-equipped to deal with, yet particularly concerned about. As the BBC reports, the European Commission is considering a five-year ban on facial recognition in public spaces while it figures out how to regulate the technology:

    The Commission set out its plans in an 18-page document, suggesting that new rules will be introduced to bolster existing regulation surrounding privacy and data rights.

    It proposed imposing obligations on both developers and users of artificial intelligence, and urged EU countries to create an authority to monitor the new rules.

    During the ban, which would last between three and five years, "a sound methodology for assessing the impacts of this technology and possible risk management measures could be identified and developed".

    BBC News

    I can't see the genie going back in this particular bottle and, as Ian Welsh puts it, this is the end of public anonymity. He gives examples of the potential for all kinds of abuse, from an increase in rape, to abuse by corporations, to an increase in parental surveillance of children.

    The larger issue is this: people who are constantly under surveillance become super conformers out of defense. Without true private time, the public persona and the private personality tend to collapse together. You need a backstage — by yourself and with a small group of friends to become yourself. You need anonymity.

    When everything you do is open to criticism by everyone, you will become timid and conforming.

    When governments, corporations, schools and parents know everything, they will try to control everything. This often won’t be for your benefit.

    Ian Welsh

    We already know that self-censorship is the worst kind of censorship, and live facial recognition means we're going to have to do a whole lot more of it in the near future.

    So what can we do about it? Welsh thinks that this technology should be made illegal, which is one option. However, you can't un-invent technologies, so live facial recognition is going to be used (lawfully) by some organisations, even if it were restricted to state operatives. I'm not sure whether that's better or worse than everyone having it.


    At a recent workshop I ran, I was talking during one of the breaks to someone who couldn't really see the problem I'd raised about surveillance capitalism. I have to wonder whether they'd have a problem with live facial recognition. From our conversation, I'd suspect not.

    Remember that facial recognition is not 100% accurate and (realistically) never can be. So there will be false positives. Let's say your face ends up on a 'watch list' or a 'bad actor' database shared with many different agencies and retailers. All of a sudden, you've got yourself a very big problem.
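    The arithmetic behind that worry is worth spelling out. Here's a quick base-rate sketch (the figures below are entirely made up, chosen only for illustration) showing that even a system with a very low false-positive rate generates mostly false alarms when genuine matches are rare:

    ```python
    # Base-rate arithmetic with hypothetical numbers: when the people a
    # system is looking for are rare, most of its alerts are mistakes.
    def false_positive_stats(population, watchlist_size,
                             true_positive_rate, false_positive_rate):
        """Return (false_alarms, true_hits, share of alerts that are wrong)."""
        innocents = population - watchlist_size
        false_alarms = innocents * false_positive_rate   # innocent people flagged
        true_hits = watchlist_size * true_positive_rate  # wanted people flagged
        return false_alarms, true_hits, false_alarms / (false_alarms + true_hits)

    # 100,000 faces scanned in a day, 10 genuinely on the watch list,
    # a 0.1% false-positive rate and a 90% chance of spotting a real match.
    false_alarms, true_hits, wrong_share = false_positive_stats(
        100_000, 10, true_positive_rate=0.90, false_positive_rate=0.001)
    print(false_alarms, true_hits, round(wrong_share, 3))
    ```

    With those invented figures, roughly 100 innocent passers-by get flagged for every 9 genuine matches, so over 90% of alerts point at the wrong person. That's how faces end up on watch lists by mistake.
    
    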


    As BuzzFeed News reports, around half of US retailers are either using live facial recognition, or have plans to use it. At the moment, companies like FaceFirst do not facilitate the sharing of data across their clients, but you can see what's coming next:

    [Peter Trepp, CEO of FaceFirst] said the database is not shared with other retailers or with FaceFirst directly. All retailers have their own policies, but Trepp said often stores will offer not to press charges against apprehended shoplifters if they agree to opt into the store’s shoplifter database. The files containing the images and identities of people on “the bad guy list” are encrypted and only accessible to retailers using their own systems, he said.

    FaceFirst automatically purges visitor data that does not match information in a criminal database every 14 days, which is the company’s minimum recommendation for auto-purging data. It’s up to the retailer if apprehended shoplifters or people previously on the list can later opt out of the database.

    Leticia Miranda (BuzzFeed News)

    There is no opt-in, no consent sought or gathered by retailers. This is a perfect example of technology being light years ahead of lawmaking.


    This is all well and good in situations where adults are going into public spaces, but what about schools, where children are often only one step above prisoners in terms of the rights they enjoy?

    Recode reports that, in schools, the surveillance threat to students goes beyond facial recognition. So long as authorities know generally what a student looks like, they can track them everywhere they go:

    Appearance Search can find people based on their age, gender, clothing, and facial characteristics, and it scans through videos like facial recognition tech — though the company that makes it, Avigilon, says it doesn’t technically count as a full-fledged facial recognition tool

    Even so, privacy experts told Recode that, for students, the distinction doesn’t necessarily matter. Appearance Search allows school administrators to review where a person has traveled throughout campus — anywhere there’s a camera — using data the system collects about that person’s clothing, shape, size, and potentially their facial characteristics, among other factors. It also allows security officials to search through camera feeds using certain physical descriptions, like a person’s age, gender, and hair color. So while the tool can’t say who the person is, it can find where else they’ve likely been.

    Rebecca Heilweil (Recode)

    This is a good example of the boundaries of technology that may-or-may-not be banned at some point in the future. The makers of Appearance Search, Avigilon, claim that it's not facial recognition technology because the images it captures and analyses are not tied to the identity of a particular person:

    Avigilon’s surveillance tool exists in a gray area: Even privacy experts are conflicted over whether or not it would be accurate to call the system facial recognition. After looking at publicly available content about Avigilon, Leong said it would be fairer to call the system an advanced form of characterization, meaning that the system is making judgments about the attributes of that person, like what they’re wearing or their hair, but it’s not actually claiming to know their identity.

    Rebecca Heilweil (Recode)

    You can give as many examples of the technology being used for good as you want — there's one in this article about how the system helped discover a girl was being bullied, for example — but it's still intrusive surveillance. There are other ways of getting to the same outcome.


    We do not live in a world of certainty. We live in a world where things are ambiguous, unsure, and sometimes a little dangerous. While we should seek to protect one another, and especially those who are most vulnerable in society, we should think about the harm we're doing by forcing people to live the totality of their lives in public.

    What does that do to our conceptions of self? To creativity? To activism? Live facial recognition technology, as well as those technologies that exist in a grey area around it, is the hot-button issue of the 2020s.


    Image by Kirill Sharkovski. Quotation-as-title by Elizabeth Bibesco.

    Friday foggings

    I've been travelling this week, so I've had plenty of time to read and digest a whole range of articles. In fact, because of the luxury of that extra time, I decided to write some comments about each link, as well as the usual quotation.

    Let me know what you think about this approach. I may not have the bandwidth to do it every week, but if it's useful, I'll try and prioritise it. As ever, I'm particularly interested in hearing from supporters!


    Education and Men without Work (National Affairs) — “Unlike the Great Depression, however, today's work crisis is not an unemployment crisis. Only a tiny fraction of workless American men nowadays are actually looking for employment. Instead we have witnessed a mass exodus of men from the workforce altogether. At this writing, nearly 7 million civilian non-institutionalized men between the ages of 25 and 54 are neither working nor looking for work — over four times as many as are formally unemployed.”

    This article argues that the conventional wisdom, that men are out of work because of a lack of education, may be based on false assumptions. In fact, a major driver seems to be the number of men (more than 50% of working-age men, apparently) who live in child-free homes. What do these men end up doing with their time? Many of them are self-medicating with drugs and screens.


    Fresh Cambridge Analytica leak ‘shows global manipulation is out of control’ (The Guardian) — “More than 100,000 documents relating to work in 68 countries that will lay bare the global infrastructure of an operation used to manipulate voters on “an industrial scale” are set to be released over the next months.”

    Sadly, I think the response to these documents will be one of apathy. Due to the 24-hour news cycle and the stream of 'news' on social networks, the voting public grow tired of scandals and news stories that last for months and years.


    Funding (Sussex Royals) — “The Sovereign Grant is the annual funding mechanism of the monarchy that covers the work of the Royal Family in support of HM The Queen including expenses to maintain official residences and workspaces. In this exchange, The Queen surrenders the revenue of the Crown Estate and in return, a portion of these public funds are granted to The Sovereign/The Queen for official expenditure.”

    I don't think I need to restate my opinions on the Royal Family, privilege, and hierarchies / coercive power relationships of all shapes and sizes. However, as someone pointed out on Mastodon, this page by 'Harry and Meghan' is quietly subversive.


    How to sell good ideas (New Statesman) — “It is true that [Malcolm] Gladwell sometimes presses his stories too militantly into the service of an overarching idea, and, at least in his books, can jam together materials too disparate to cohere (Poole referred to his “relentless montage”). The New Yorker essay, which constrains his itinerant curiosity, is where he does his finest work (the best of these are collected in 2009’s What The Dog Saw). For the most part, the work of his many imitators attests to how hard it is to do what he does. You have to be able to write lucid, propulsive prose capable of introducing complex ideas within a magnetic field of narrative. You have to leave your desk and talk to people (he never stopped being a reporter). Above all, you need to acquire an extraordinary eye for the overlooked story, the deceptively trivial incident, the minor genius. Gladwell shares the late Jonathan Miller’s belief that “it is in the negligible that the considerable is to be found”.”

    A friend took me to see Gladwell when he was in Newcastle-upon-Tyne touring with 'What The Dog Saw'. Like the author of this article, I soon realised that Gladwell is selling something quite different to 'science' or 'facts'. And so long as you're OK with that, you can enjoy (as I do) his podcasts and books.


    Just enough Internet: Why public service Internet should be a model of restraint (doteveryone) — “We have not yet done a good job of defining what good digital public service really looks like, of creating digital charters that match up to those of our great institutions, and it is these statements of values and ways of working – rather than any amount of shiny new technology – that will create essential building blocks for the public services of the future.”

    While I attended the main MozFest weekend event, I missed the presentation and other events that happened earlier in the week. I definitely agree with the sentiment behind the transcript of this talk by Rachel Coldicutt. I'm just not sure it's specific enough to be useful in practice.


    Places to go in 2020 (Marginal Revolution) — “Here is the mostly dull NYT list. Here is my personal list of recommendations for you, noting I have not been to all of the below, but I am in contact with many travelers and paw through a good deal of information.”

    Tyler Cowen's list is really interesting. I haven't been to any of the places on it, but I now really want to visit Eastern Bali and Baku in Azerbaijan.


    Reasons not to scoff at ghosts, visions and near-death experiences (Aeon) — “Sure, the dangers of gullibility are evident enough in the tragedies caused by religious fanatics, medical quacks and ruthless politicians. And, granted, spiritual worldviews are not good for everybody. Faith in the ultimate benevolence of the cosmos will strike many as hopelessly irrational. Yet, a century on from James’s pragmatic philosophy and psychology of transformative experiences, it might be time to restore a balanced perspective, to acknowledge the damage that has been caused by stigma, misdiagnoses and mis- or overmedication of individuals reporting ‘weird’ experiences. One can be personally skeptical of the ultimate validity of mystical beliefs and leave properly theological questions strictly aside, yet still investigate the salutary and prophylactic potential of these phenomena.”

    I'd happily read a full-length book on this subject, as it's a fascinating area. Knowing that much (or all) of the phenomena are reducible to materiality and mechanics may explain what's going on, but it doesn't explain it away...


    Surveillance Tech Is an Open Secret at CES 2020 (OneZero) — “Lowe offered one explanation for why these companies feel so comfortable marketing surveillance tech: He says that the genie can’t be put back in the bottle, so barring federal regulation that bans certain implementations, it’s increasingly likely that some company will fill the surveillance market. In other words, if Google isn’t going to work with the cops, Amazon will. And even if Amazon decides not to, smaller companies out of the spotlight still will.”

    I suppose it should come as no surprise that, in this day and age, companies like Cyberlink, previously known for their PowerDVD software, have moved into the very profitable world of surveillance capitalism. What's going to stop its inexorable rise? I can only think of government regulation (with teeth).


    ‘Techlash’ Hits College Campuses (New York Times) — “Some recent graduates are taking their technical skills to smaller social impact groups instead of the biggest firms. Ms. Dogru said that some of her peers are pursuing jobs at start-ups focused on health, education and privacy. Ms. Harbour said Berkeley offers a networking event called Tech for Good, where alumni from purpose-driven groups like Code for America and Khan Academy share career opportunities.”

    I'm not sure this is currently as big a 'movement' as suggested in the article, but I'm glad the wind is blowing in this direction. As with other ethically-dubious industries, companies involved in surveillance capitalism will have to pay people extraordinarily well to put aside their moral scruples.


    Tradition is Smarter Than You Are (The Scholar's Stage) — “To extract resources from a population the state must be able to understand that population. The state needs to make the people and things it rules legible to agents of the government. Legibility means uniformity. States dream up uniform weights and measures, impress national languages and ID numbers on their people, and divvy the country up into land plots and administrative districts, all to make the realm legible to the powers that be. The problem is that not all important things can be made legible. Much of what makes a society successful is knowledge of the tacit sort: rarely articulated, messy, and from the outside looking in, purposeless. These are the first things lost in the quest for legibility. Traditions, small cultural differences, odd and distinctive lifeways... are all swept aside by a rationalizing state that preserves (or in many cases, imposes) only what it can be understood and manipulated from the 2,000 foot view. The result... are many of the greatest catastrophes of human history.”

    One of the books that's been on my 'to-read' list for a while is 'Seeing Like a State', written by James C. Scott and referenced in this article. I'm no believer in tradition for its own sake but, I have to say, many of my maternal grandmother's superstitions, and many of the rituals that come with religion, are very practical in nature.


    Image by Michael Schlegel (via kottke.org)

    Friday fertilisations

    I've read so much stuff over the past couple of months that it's been a real job whittling down these links. In the end I gave up and shared a few more than usual!

    • You Shouldn’t Have to Be Good at Your Job (GEN) — "This is how the 1% justifies itself. They are not simply the best in terms of income, but in terms of humanity itself. They’re the people who get invited into the escape pods when the mega-asteroid is about to hit. They don’t want a fucking thing to do with the rest of the population and, in fact, they have exploited global economic models to suss out who deserves to be among them and who deserves to be obsolete. And, thanks to lax governments far and wide, they’re free to practice their own mass experiments in forced Darwinism. You currently have the privilege of witnessing a worm’s-eye view of this great culling. Fun, isn’t it?"
    • We've spent the decade letting our tech define us. It's out of control (The Guardian) — "There is a way out, but it will mean abandoning our fear and contempt for those we have become convinced are our enemies. No one is in charge of this, and no amount of social science or monetary policy can correct for what is ultimately a spiritual deficit. We have surrendered to digital platforms that look at human individuality and variance as “noise” to be corrected, rather than signal to be cherished. Our leading technologists increasingly see human beings as a problem, and technology as the solution – and they use our behavior on their platforms as evidence of our essentially flawed nature."
    • How headphones are changing the sound of music (Quartz) — "Another way headphones are changing music is in the production of bass-heavy music. Harding explains that on small speakers, like headphones or those in a laptop, low frequencies are harder to hear than when blasted from the big speakers you might encounter at a concert venue or club. If you ever wondered why the bass feels so powerful when you are out dancing, that’s why. In order for the bass to be heard well on headphones, music producers have to boost bass frequencies in the higher range, the part of the sound spectrum that small speakers handle well."
    • The False Promise of Morning Routines (The Atlantic) — "Goat milk or no goat milk, the move toward ritualized morning self-care can seem like merely a palliative attempt to improve work-life balance. It makes sense to wake up 30 minutes earlier than usual because you want to fit in some yoga, an activity that you enjoy. But something sinister seems to be going on if you feel that you have to wake up 30 minutes earlier than usual to improve your well-being, so that you can also work 60 hours a week, cook dinner, run errands, and spend time with your family."
    • Giant surveillance balloons are lurking at the edge of space (Ars Technica) — "The idea of a constellation of stratospheric balloons isn’t new—the US military floated the idea back in the ’90s—but technology has finally matured to the point that they’re actually possible. World View’s December launch marks the first time the company has had more than one balloon in the air at a time, if only for a few days. By the time you’re reading this, its other stratollite will have returned to the surface under a steerable parachute after nearly seven weeks in the stratosphere."
    • The Unexpected Philosophy Icelanders Live By (BBC Travel) — "Maybe it makes sense, then, that in a place where people were – and still are – so often at the mercy of the weather, the land and the island’s unique geological forces, they’ve learned to give up control, leave things to fate and hope for the best. For these stoic and even-tempered Icelanders, þetta reddast is less a starry-eyed refusal to deal with problems and more an admission that sometimes you must make the best of the hand you’ve been dealt."
    • What Happens When Your Career Becomes Your Whole Identity (HBR) — "While identifying closely with your career isn’t necessarily bad, it makes you vulnerable to a painful identity crisis if you burn out, get laid off, or retire. Individuals in these situations frequently suffer anxiety, depression, and despair. By claiming back some time for yourself and diversifying your activities and relationships, you can build a more balanced and robust identity in line with your values."
    • Having fun is a virtue, not a guilty pleasure (Quartz) — "There are also, though, many high-status workers who can easily afford to take a break, but opt instead to toil relentlessly. Such widespread workaholism in part reflects the misguided notion that having fun is somehow an indulgence, an act of absconding from proper respectable behavior, rather than embracement of life. "
    • It’s Time to Get Personal (Laura Kalbag) — "As designers and developers, it’s easy to accept the status quo. The big tech platforms already exist and are easy to use. There are so many decisions to be made as part of our work, we tend to just go with what’s popular and convenient. But those little decisions can have a big impact, especially on the people using what we build."
    • The 100 Worst Ed-Tech Debacles of the Decade (Hack Education) — "Oh yes, I’m sure you can come up with some rousing successes and some triumphant moments that made you thrilled about the 2010s and that give you hope for “the future of education.” Good for you. But that’s not my job. (And honestly, it’s probably not your job either.)"
    • Why so many Japanese children refuse to go to school (BBC News) — "Many schools in Japan control every aspect of their pupils' appearance, forcing pupils to dye their brown hair black, or not allowing pupils to wear tights or coats, even in cold weather. In some cases they even decide on the colour of pupils' underwear. "
    • The real scam of ‘influencer’ (Seth Godin) — "And a bigger part is that the things you need to do to be popular (the only metric the platforms share) aren’t the things you’d be doing if you were trying to be effective, or grounded, or proud of the work you’re doing."

    Image via Kottke.org

    I am not fond of expecting catastrophes, but there are cracks in the universe

    So said Sydney Smith. Let's talk about surveillance. Let's talk about surveillance capitalism and surveillance humanitarianism. But first, let's talk about machine learning and algorithms; in other words, let's talk about what happens after all of that data is collected.

    Writing in The Guardian, Sarah Marsh investigates local councils using "automated guidance systems" in an attempt to save money.

    The systems are being deployed to provide automated guidance on benefit claims, prevent child abuse and allocate school places. But concerns have been raised about privacy and data security, the ability of council officials to understand how some of the systems work, and the difficulty for citizens in challenging automated decisions.

    Sarah Marsh

    The trouble is, they're not particularly effective:

    It has emerged North Tyneside council has dropped TransUnion, whose system it used to check housing and council tax benefit claims. Welfare payments to an unknown number of people were wrongly delayed when the computer’s “predictive analytics” erroneously identified low-risk claims as high risk

    Meanwhile, Hackney council in east London has dropped Xantura, another company, from a project to predict child abuse and intervene before it happens, saying it did not deliver the expected benefits. And Sunderland city council has not renewed a £4.5m data analytics contract for an “intelligence hub” provided by Palantir.

    Sarah Marsh

    When I was at Mozilla there were a number of colleagues there who had worked on the OFA (Obama For America) campaign. I remember one of them, a DevOps guy, expressing his concern that the infrastructure being built was all well and good while there was someone 'friendly' in the White House, but wondering what would come next.

    Well, we now know what comes next, on both sides of the Atlantic, and we can't put that genie back in its bottle. Swingeing cuts by successive Conservative governments over here, coupled with the Brexit time-and-money pit means that there's no attention or cash left.

    If we stop and think about things for a second, we probably don't want to live in a world where machines make decisions for us, based on algorithms devised by nerds. As Rose Eveleth discusses in a scathing article for Vox, this stuff isn't 'inevitable' — nor does it constitute a process of 'natural selection':

    Often consumers don’t have much power of selection at all. Those who run small businesses find it nearly impossible to walk away from Facebook, Instagram, Yelp, Etsy, even Amazon. Employers often mandate that their workers use certain apps or systems like Zoom, Slack, and Google Docs. “It is only the hyper-privileged who are now saying, ‘I’m not going to give my kids this,’ or, ‘I’m not on social media,’” says Rumman Chowdhury, a data scientist at Accenture. “You actually have to be so comfortable in your privilege that you can opt out of things.”

    And so we’re left with a tech world claiming to be driven by our desires when those decisions aren’t ones that most consumers feel good about. There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the “natural evolution of technology.” But the “natural evolution of technology” was never a thing to begin with, and it’s time to question what “progress” actually means.

    Rose Eveleth

    I suppose the thing that concerns me the most is people in dire need being subject to impersonal technology for vital and life-saving aid.

    For example, Mark Latonero, writing in The New York Times, talks about the growing dangers around what he calls 'surveillance humanitarianism':

    By surveillance humanitarianism, I mean the enormous data collection systems deployed by aid organizations that inadvertently increase the vulnerability of people in urgent need.

    Despite the best intentions, the decision to deploy technology like biometrics is built on a number of unproven assumptions, such as, technology solutions can fix deeply embedded political problems. And that auditing for fraud requires entire populations to be tracked using their personal data. And that experimental technologies will work as planned in a chaotic conflict setting. And last, that the ethics of consent don’t apply for people who are starving.

    Mark Latonero

    It's easy to think that because this is an emergency, we should just do whatever is necessary. But Latonero explains that the risk isn't removed; it's simply shifted to a later time:

    If an individual or group’s data is compromised or leaked to a warring faction, it could result in violent retribution for those perceived to be on the wrong side of the conflict. When I spoke with officials providing medical aid to Syrian refugees in Greece, they were so concerned that the Syrian military might hack into their database that they simply treated patients without collecting any personal data. The fact that the Houthis are vying for access to civilian data only elevates the risk of collecting and storing biometrics in the first place.

    Mark Latonero

    There was a rather startling article in last weekend's newspaper, which I've found online. Hannah Devlin, again writing in The Guardian (which is a good source of information for those concerned with surveillance) writes about a perfect storm of social media and improved processing speeds:

    [I]n the past three years, the performance of facial recognition has stepped up dramatically. Independent tests by the US National Institute of Standards and Technology (Nist) found the failure rate for finding a target picture in a database of 12m faces had dropped from 5% in 2010 to 0.1% this year.

    The rapid acceleration is thanks, in part, to the goldmine of face images that have been uploaded to Instagram, Facebook, LinkedIn and captioned news articles in the past decade. At one time, scientists would create bespoke databases by laboriously photographing hundreds of volunteers at different angles, in different lighting conditions. By 2016, Microsoft had published a dataset, MS Celeb, with 10m face images of 100,000 people harvested from search engines – they included celebrities, broadcasters, business people and anyone with multiple tagged pictures that had been uploaded under a Creative Commons licence, allowing them to be used for research. The dataset was quietly deleted in June, after it emerged that it may have aided the development of software used by the Chinese state to control its Uighur population.

    In parallel, hardware companies have developed a new generation of powerful processing chips, called Graphics Processing Units (GPUs), uniquely adapted to crunch through a colossal number of calculations every second. The combination of big data and GPUs paved the way for an entirely new approach to facial recognition, called deep learning, which is powering a wider AI revolution.

    Hannah Devlin

    Those of you who have read this far and are expecting some big reveal are going to be disappointed. I don't have any 'answers' to these problems. I guess I've been guilty, like many of us have, of the kind of 'privacy nihilism' mentioned by Ian Bogost in The Atlantic:

    Online services are only accelerating the reach and impact of data-intelligence practices that stretch back decades. They have collected your personal data, with and without your permission, from employers, public records, purchases, banking activity, educational history, and hundreds more sources. They have connected it, recombined it, bought it, and sold it. Processed foods look wholesome compared to your processed data, scattered to the winds of a thousand databases. Everything you have done has been recorded, munged, and spat back at you to benefit sellers, advertisers, and the brokers who service them. It has been for a long time, and it’s not going to stop. The age of privacy nihilism is here, and it’s time to face the dark hollow of its pervasive void.

    Ian Bogost

    The only forces that can stop this are collective action and governmental action. My concern is that we lack the digital savvy for the former, and there's definitely a lack of will in respect of the latter. Troubling times.

    To be perfectly symmetrical is to be perfectly dead

    So said Igor Stravinsky. I'm a little behind on my writing, and prioritised writing up my experiences in the Lake District over the past couple of days.

    Today's update is therefore a list post:

    • Degrowth: a Call for Radical Abundance (Jason Hickel) — "In other words, the birth of capitalism required the creation of scarcity. The constant creation of scarcity is the engine of the juggernaut."
    • Hey, You Left Something Out (Cogito, Ergo Sumana) — "People who want to compliment work should probably learn to give compliments that sound encouraging."
    • The Problem is Capitalism (George Monbiot) — "A system based on perpetual growth cannot function without peripheries and externalities. There must always be an extraction zone, from which materials are taken without full payment, and a disposal zone, where costs are dumped in the form of waste and pollution."
    • In Stores, Secret Surveillance Tracks Your Every Move (The New York Times) — "For years, Apple and Google have allowed companies to bury surveillance features inside the apps offered in their app stores. And both companies conduct their own beacon surveillance through iOS and Android."
    • The Inevitable Same-ification of the Internet (Matthew Ström) — "Convergence is not the sign of a broken system, or a symptom of a more insidious disease. It is an emergent phenomenon that arises from a few simple rules."


    Wretched is a mind anxious about the future

    So said one of my favourite non-fiction authors, the 16th century proto-blogger Michel de Montaigne. There's plenty of writing about how we need to be anxious because of the drift towards a future of surveillance states. Eventually, because it's not currently affecting us here and now, we become blasé. We forget that it's already the lived experience for hundreds of millions of people.

    Take China, for example. In The Atlantic, Derek Thompson writes about the Chinese government's brutality against the Muslim Uyghur population in the western province of Xinjiang:

    [The] horrifying situation is built on the scaffolding of mass surveillance. Cameras fill the marketplaces and intersections of the key city of Kashgar. Recording devices are placed in homes and even in bathrooms. Checkpoints that limit the movement of Muslims are often outfitted with facial-recognition devices to vacuum up the population’s biometric data. As China seeks to export its suite of surveillance tech around the world, Xinjiang is a kind of R&D incubator, with the local Muslim population serving as guinea pigs in a laboratory for the deprivation of human rights.

    Derek Thompson

    As Ian Welsh points out, talk of surveillance states usually involves us in the West pointing at places like China and shaking our heads. However, if you step back a moment and remember that societies like the US and UK are becoming more unequal over time, then perhaps we're the ones who should be worried:

    The endgame, as I’ve been pointing out for years, is a society in which where you are and what you’re doing, and have done is, always known, or at least knowable. And that information is known forever, so the moment someone with power wants to take you out, they can go back thru your life in minute detail. If laws or norms change so that what was OK 10 or 30 years ago isn’t OK now, well they can get you on that.

    Ian Welsh

    As the world becomes more unequal, the position of elites becomes more perilous, hence Silicon Valley billionaires preparing boltholes in New Zealand. Ironically, they're looking for places where they can't be found, while making serious money from providing surveillance technology. Instead of solving the inequality, they attempt to insulate themselves from the effect of that inequality.

    A lot of the crazy money earned in Silicon Valley comes at the price of infringing our privacy. I've spent a long time thinking about privacy, which is quite a nebulous concept. It's not the easiest thing to understand when you examine it more closely.

    Privacy is usually considered a freedom from rather than a freedom to, as in "freedom from surveillance". The trouble is that there are many kinds of surveillance, and some of these we actively encourage. A quick example: I know of at least one family that share their location with one another all of the time. At the same time, of course, they're sharing it with the company that provides that service.

    There's a lot of power in the 'default' privacy settings devices and applications come with. People tend to go with whatever comes as standard. Sidney Fussell writes in The Atlantic that:

    Many apps and products are initially set up to be public: Instagram accounts are open to everyone until you lock them... Even when companies announce convenient shortcuts for enhancing security, their products can never become truly private. Strangers may not be able to see your selfies, but you have no way to untether yourself from the larger ad-targeting ecosystem.

    Sidney Fussell

    Some of us (including me) are willing to trade some of that privacy for more personalised services that somehow make our lives easier. The tricky thing is when it comes to employers and state surveillance. In these cases there are coercive power relationships at play, rather than just convenience.

    Ellen Sheng, writing for CNBC explains how employees in the US are at huge risk from workplace surveillance:

    In the workplace, almost any consumer privacy law can be waived. Even if companies give employees a choice about whether or not they want to participate, it’s not hard to force employees to agree. That is, unless lawmakers introduce laws that explicitly state a company can’t make workers agree to a technology...

    One example: Companies are increasingly interested in employee social media posts out of concern that employee posts could reflect poorly on the company. A teacher’s aide in Michigan was suspended in 2012 after refusing to share her Facebook page with the school’s superintendent following complaints about a photo she had posted. Since then, dozens of similar cases prompted lawmakers to take action. More than 16 states have passed social media protections for individuals.

    Ellen Sheng

    It's not just workplaces, though. Schools are hotbeds for new surveillance technologies, as Benjamin Herold notes in an article for Education Week:

    Social media monitoring companies track the posts of everyone in the areas surrounding schools, including adults. Other companies scan the private digital content of millions of students using district-issued computers and accounts. Those services are complemented with tip-reporting apps, facial-recognition software, and other new technology systems.

    [...]

    While schools are typically quiet about their monitoring of public social media posts, they generally disclose to students and parents when digital content created on district-issued devices and accounts will be monitored. Such surveillance is typically done in accordance with schools’ responsible-use policies, which students and parents must agree to in order to use districts’ devices, networks, and accounts.
    Hypothetically, students and families can opt out of using that technology. But doing so would make participating in the educational life of most schools exceedingly difficult.

    Benjamin Herold

    In China, of course, a social credit system makes all of this a million times worse, but we in the West aren't heading in a great direction either.

    We're entering an era in which, by the time my children are my age, companies, employers, and the state could have decades of data on them, from entering the school system right through to finding jobs and becoming parents themselves.

    There are upsides to all of this data, obviously. But I think that in the midst of privacy-focused conversations about Amazon's smart speakers and Google location-sharing, we might be missing the bigger picture around surveillance by educational institutions, employers, and governments.

    Returning to Ian Welsh to finish up, remember that it's the coercive power relationships that make surveillance a bad thing:

    Surveillance societies are sterile societies. Everyone does what they’re supposed to do all the time, and because we become what we do, it affects our personalities. It particularly affects our creativity, and is a large part of why Communist surveillance societies were less creative than the West, particularly as their police states ramped up.

    Ian Welsh

    We don't want to think about all of this, though, do we?



    Tracking vs advertising

    We tend to use a word to denote something right up to the point where the term becomes untenable and someone has to invent a better one. Take mobile phones, for example: they’re literally named after one of the least-used apps on them, so we’re crying out for a different way to refer to them. Perhaps a better name would be ‘trackers’.

    These days, most people use mobile devices for social networking. These are available free at the point of access, funded by what we’re currently calling ‘advertising’. However, as this author notes, it’s nothing of the sort:

    What we have today is not advertising. The amount of personally identifiable information companies have about their customers is absolutely perverse. Some of the world’s largest companies are in the business of selling your personal information for use in advertising. This might sound innocuous but the tracking efforts of these companies are so accurate that many people believe that Facebook listens to their conversations to serve them relevant ads. Even if it’s true that the microphone is not used, the sum of all other data collected is still enough to show creepily relevant advertising.

    Unfortunately, the author doesn’t seem to have come to the conclusion that it’s the logic of capitalism that got us here. Instead, he just points out that people’s privacy is being abused:

    [P]eople now get most of their information from social networks yet these networks dictate the order in which content is served to the user. Google makes the worlds most popular mobile operating system and it’s purpose is drive the company’s bottom line (ad blocking is forbidden). “Smart” devices are everywhere and companies are jumping over each other to put more shit in your house so they can record your movements and sell the information to advertisers. This is all a blatant abuse of privacy that is completely toxic to society.

    Agreed, and it's easy to feel a little helpless against this onslaught. While it's great to have a list of things that users can do, if those things are difficult to implement and/or hard to understand, then it's an uphill battle.

    That being said, here are the three steps he suggests:

    To combat this trend, I have taken the following steps and I think others should join the movement:
    • Aggressively block all online advertisements
    • Don’t succumb to the “curated” feeds
    • Not every device needs to be “smart”
    I feel I'm already way ahead of the author in this regard:
    • Aggressively block all online advertisements
    • Don’t succumb to the “curated” feeds
      • I quit Facebook years ago, haven't got an Instagram account, and pretty much only post links to my own spaces on Twitter and LinkedIn.
    • Not every device needs to be “smart”
      • I don't really use my Philips Hue lights, and don't have an Amazon Alexa — or even the Google Assistant on my phone.
    It's not easy to stand up to Big Tech. The amount of money they pour into things makes their 'innovations' seem inevitable. They can afford to make things cheap and frictionless so you get hooked.

    As an aside, it’s interesting to note that those who previously defended Apple as somehow ‘different’ on privacy, despite it being the world’s most profitable company, are starting to backtrack.

    Source: Nicholas Rempel

    It's called Echo for a reason

    That last-minute Christmas gift sounds like nothing but unadulterated fun after reading this, doesn’t it?

    It is a significant thing to allow a live microphone in your private space (just as it is to allow them in our public spaces). Once the hardware is in place, and receiving electricity, and connected to the Internet, then you’re reduced to placing your trust in the hands of two things that unfortunately are less than reliable these days: 1) software, and 2) policy.

    Software, once a mic is in place, governs when that microphone is live, when the audio it captures is transmitted over the Internet, and to whom it goes. Many devices are programmed to keep their microphones on at all times but only record and transmit audio after hearing a trigger phrase—in the case of the Echo, for example, “Alexa.” Any device that is to be activated by voice alone must work this way. There are a range of other systems. Samsung, after a privacy dust-up, assured the public that its smart televisions (like others) only record and transmit audio after the user presses a button on its remote control. The Hello Barbie toy only picks up and transmits audio when its user presses a button on the doll.

    Software is invisible, however. Most companies do not make their code available for public inspection, and it can be hacked, or unscrupulous executives can lie about what it does (think Volkswagen), or government agencies might try to order companies to activate them as a surveillance device.

    I sincerely hope that policy makers pay heed to the recommendations section, especially given the current ‘Wild West’ state of affairs described in the article.

    Source: ACLU
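    As an aside, the wake-word gating the ACLU describes can be sketched in a few lines. This is a purely hypothetical illustration of the general pattern (always-on detection, transmission only after a trigger), not any vendor's actual code; the function and variable names are my own invention:

```python
# Hypothetical sketch of wake-word gating: the microphone is always
# sampling, but audio is only captured for transmission after a
# trigger phrase is detected. All names here are illustrative.

TRIGGER = "alexa"

def handle_audio(frames):
    """Given a stream of (already-transcribed) audio frames, return
    only the frames that would leave the device."""
    sent = []
    listening = False
    for frame in frames:
        if not listening:
            # Always-on local detection: nothing leaves the device yet.
            if TRIGGER in frame.lower():
                listening = True
        else:
            # Post-trigger: this frame would be sent to the cloud.
            sent.append(frame)
            listening = False  # one utterance per trigger, for simplicity
    return sent
```

    The point of the sketch is that the gate lives entirely in software: change one branch and everything is transmitted, which is exactly why the article's trust concerns about code and policy matter.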
