Tag: surveillance capitalism

I am not fond of expecting catastrophes, but there are cracks in the universe.

So said Sydney Smith. Let’s talk about surveillance. Let’s talk about surveillance capitalism and surveillance humanitarianism. But first, let’s talk about machine learning and algorithms; in other words, let’s talk about what happens after all of that data is collected.

Writing in The Guardian, Sarah Marsh investigates local councils that are using “automated guidance systems” in an attempt to save money.

The systems are being deployed to provide automated guidance on benefit claims, prevent child abuse and allocate school places. But concerns have been raised about privacy and data security, the ability of council officials to understand how some of the systems work, and the difficulty for citizens in challenging automated decisions.

Sarah Marsh

The trouble is, they’re not particularly effective:

It has emerged North Tyneside council has dropped TransUnion, whose system it used to check housing and council tax benefit claims. Welfare payments to an unknown number of people were wrongly delayed when the computer’s “predictive analytics” erroneously identified low-risk claims as high risk.

Meanwhile, Hackney council in east London has dropped Xantura, another company, from a project to predict child abuse and intervene before it happens, saying it did not deliver the expected benefits. And Sunderland city council has not renewed a £4.5m data analytics contract for an “intelligence hub” provided by Palantir.

Sarah Marsh

When I was at Mozilla, a number of my colleagues had worked on the OFA (Obama For America) campaign. I remember one of them, a DevOps guy, expressing his concern that the infrastructure being built was all well and good while there was someone ‘friendly’ in the White House, but wondering what would come next.

Well, we now know what comes next, on both sides of the Atlantic, and we can’t put that genie back in its bottle. Swingeing cuts by successive Conservative governments over here, coupled with the Brexit time-and-money pit, mean that there’s no attention or cash left.

If we stop and think about things for a second, we probably wouldn’t want to live in a world where machines make decisions for us, based on algorithms devised by nerds. As Rose Eveleth discusses in a scathing article for Vox, this stuff isn’t ‘inevitable’ — nor does it constitute a process of ‘natural selection’:

Often consumers don’t have much power of selection at all. Those who run small businesses find it nearly impossible to walk away from Facebook, Instagram, Yelp, Etsy, even Amazon. Employers often mandate that their workers use certain apps or systems like Zoom, Slack, and Google Docs. “It is only the hyper-privileged who are now saying, ‘I’m not going to give my kids this,’ or, ‘I’m not on social media,’” says Rumman Chowdhury, a data scientist at Accenture. “You actually have to be so comfortable in your privilege that you can opt out of things.”

And so we’re left with a tech world claiming to be driven by our desires when those decisions aren’t ones that most consumers feel good about. There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the “natural evolution of technology.” But the “natural evolution of technology” was never a thing to begin with, and it’s time to question what “progress” actually means.

Rose Eveleth

I suppose the thing that concerns me the most is people in dire need being subjected to impersonal technology in order to receive vital, life-saving aid.

For example, Mark Latonero, writing in The New York Times, talks about the growing dangers around what he calls ‘surveillance humanitarianism’:

By surveillance humanitarianism, I mean the enormous data collection systems deployed by aid organizations that inadvertently increase the vulnerability of people in urgent need.

Despite the best intentions, the decision to deploy technology like biometrics is built on a number of unproven assumptions, such as, technology solutions can fix deeply embedded political problems. And that auditing for fraud requires entire populations to be tracked using their personal data. And that experimental technologies will work as planned in a chaotic conflict setting. And last, that the ethics of consent don’t apply for people who are starving.

Mark Latonero

It’s easy to think that this is an emergency, so we should just do whatever is necessary. But Latonero explains that doing so merely shifts the risk to a later time:

If an individual or group’s data is compromised or leaked to a warring faction, it could result in violent retribution for those perceived to be on the wrong side of the conflict. When I spoke with officials providing medical aid to Syrian refugees in Greece, they were so concerned that the Syrian military might hack into their database that they simply treated patients without collecting any personal data. The fact that the Houthis are vying for access to civilian data only elevates the risk of collecting and storing biometrics in the first place.

Mark Latonero

There was a rather startling article in last weekend’s newspaper, which I’ve found online. Hannah Devlin, again writing in The Guardian (which is a good source of information for those concerned with surveillance) writes about a perfect storm of social media and improved processing speeds:

[I]n the past three years, the performance of facial recognition has stepped up dramatically. Independent tests by the US National Institute of Standards and Technology (Nist) found the failure rate for finding a target picture in a database of 12m faces had dropped from 5% in 2010 to 0.1% this year.

The rapid acceleration is thanks, in part, to the goldmine of face images that have been uploaded to Instagram, Facebook, LinkedIn and captioned news articles in the past decade. At one time, scientists would create bespoke databases by laboriously photographing hundreds of volunteers at different angles, in different lighting conditions. By 2016, Microsoft had published a dataset, MS Celeb, with 10m face images of 100,000 people harvested from search engines – they included celebrities, broadcasters, business people and anyone with multiple tagged pictures that had been uploaded under a Creative Commons licence, allowing them to be used for research. The dataset was quietly deleted in June, after it emerged that it may have aided the development of software used by the Chinese state to control its Uighur population.

In parallel, hardware companies have developed a new generation of powerful processing chips, called Graphics Processing Units (GPUs), uniquely adapted to crunch through a colossal number of calculations every second. The combination of big data and GPUs paved the way for an entirely new approach to facial recognition, called deep learning, which is powering a wider AI revolution.

Hannah Devlin
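It’s worth pausing on how this actually works under the hood. Below is a minimal, illustrative sketch in Python: the embed() function is a stand-in for a trained deep network (a random projection, not a real model), the 64x64 image size and 0.8 matching threshold are arbitrary assumptions, and the pixel data is invented. What it demonstrates is that, once a network has been trained on those millions of harvested faces, recognition reduces to comparing vectors, which is precisely the bulk arithmetic that GPUs are built for.

```python
# Illustrative sketch of deep-learning face recognition at inference
# time. A trained network maps each face image to an 'embedding'
# vector; two faces match when their embeddings point the same way.
# PROJECTION stands in for that trained network (an assumption, not
# a real model).
import numpy as np

rng = np.random.default_rng(42)
PROJECTION = rng.normal(size=(128, 64 * 64))  # hypothetical 'trained' weights

def embed(face_image: np.ndarray) -> np.ndarray:
    """Map a 64x64 face image to a unit-length 128-dimensional embedding."""
    vec = PROJECTION @ face_image.ravel()
    return vec / np.linalg.norm(vec)

def same_person(a: np.ndarray, b: np.ndarray, threshold: float = 0.8) -> bool:
    """For unit vectors, the dot product is the cosine similarity."""
    return float(embed(a) @ embed(b)) >= threshold

# Stand-in pixel data: a 'face', a noisy re-capture of it, and a stranger.
face = rng.normal(size=(64, 64))
same_face_again = face + 0.05 * rng.normal(size=(64, 64))
stranger = rng.normal(size=(64, 64))

print(same_person(face, same_face_again))  # True: embeddings nearly parallel
print(same_person(face, stranger))         # False: embeddings uncorrelated

# Searching a database of 12m faces is then just 12m dot products,
# exactly the workload GPUs excel at.
```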

Those of you who have read this far and are expecting some big reveal are going to be disappointed. I don’t have any ‘answers’ to these problems. I guess I’ve been guilty, as many of us have, of the kind of ‘privacy nihilism’ mentioned by Ian Bogost in The Atlantic:

Online services are only accelerating the reach and impact of data-intelligence practices that stretch back decades. They have collected your personal data, with and without your permission, from employers, public records, purchases, banking activity, educational history, and hundreds more sources. They have connected it, recombined it, bought it, and sold it. Processed foods look wholesome compared to your processed data, scattered to the winds of a thousand databases. Everything you have done has been recorded, munged, and spat back at you to benefit sellers, advertisers, and the brokers who service them. It has been for a long time, and it’s not going to stop. The age of privacy nihilism is here, and it’s time to face the dark hollow of its pervasive void.

Ian Bogost

The only forces that we have to stop this are collective action and governmental action. My concern is that we don’t have the digital savvy to do the former, and there’s definitely a lack of will in respect of the latter. Troubling times.

Friday frustrations

I couldn’t help but notice these things this week:

  • Don’t ask forgiveness, radiate intent (Elizabeth Ayer) — “I certainly don’t need a reputation as being underhanded or an organizational problem. Especially as a repeat behavior, signalling builds me a track record of openness and predictability, even as I take risks or push boundaries.”
  • When will we have flying cars? Maybe sooner than you think. (MIT Technology Review) — “An automated air traffic management system in constant communication with every flying car could route them to prevent collisions, with human operators on the ground ready to take over by remote control in an emergency. Still, existing laws and public fears mean there’ll probably have to be pilots at least for a while, even if only as a backup to an autonomous system.”
  • For Smart Animals, Octopuses Are Very Weird (The Atlantic) — “Unencumbered by a shell, cephalopods became flexible in both body and mind… They could move faster, expand into new habitats, insinuate their arms into crevices in search of prey.”
  • Cannabidiol in Anxiety and Sleep: A Large Case Series. (PubMed) — “The final sample consisted of 72 adults presenting with primary concerns of anxiety (n = 47) or poor sleep (n = 25). Anxiety scores decreased within the first month in 57 patients (79.2%) and remained decreased during the study duration. Sleep scores improved within the first month in 48 patients (66.7%) but fluctuated over time. In this chart review, CBD was well tolerated in all but 3 patients.”
  • 22 Lessons I’m Still Learning at 82 (Coach George Raveling) — “We must always fill ourselves with more questions than answers. You should never retire your mind. After you retire mentally, then you are just taking up residence in society. I do not ever just want to be a resident of society. I want to be a contributor to our communities.”
  • How Boris Johnson’s “model bus hobby” non sequitur manipulated the public discourse and his search results (BoingBoing) — “Remember, any time a politician deliberately acts like an idiot in public, there’s a good chance that they’re doing it deliberately, and even if they’re not, public idiocy can be very useful indeed.”
  • It’s not that we’ve failed to rein in Facebook and Google. We’ve not even tried. (The Guardian) — “Surveillance capitalism is not the same as digital technology. It is an economic logic that has hijacked the digital for its own purposes. The logic of surveillance capitalism begins with unilaterally claiming private human experience as free raw material for production and sales.”
  • Choose Boring Technology (Dan McKinley) — “The nice thing about boringness (so constrained) is that the capabilities of these things are well understood. But more importantly, their failure modes are well understood.”
  • What makes a good excuse? A Cambridge philosopher may have the answer (University of Cambridge) — “Intentions are plans for action. To say that your intention was morally adequate is to say that your plan for action was morally sound. So when you make an excuse, you plead that your plan for action was morally fine – it’s just that something went awry in putting it into practice.”
  • Your Focus Is Priceless. Stop Giving It Away. (Forge) — “To virtually everyone who isn’t you, your focus is a commodity. It is being amassed, collected, repackaged and sold en masse. This makes your attention extremely valuable in aggregate. Collectively, audiences are worth a whole lot. But individually, your attention and my attention don’t mean anything to the eyeball aggregators. It’s a drop in their growing ocean. It’s essentially nothing.”

Image via @EffinBirds

Friday feeds

These things caught my eye this week:

  • Some of your talents and skills can cause burnout. Here’s how to identify them (Fast Company) — “You didn’t mess up somewhere along the way or miss an important lesson that the rest of us received. We’re all dealing with gifts that drain our energy, but up until now, it hasn’t been a topic of conversation. We aren’t discussing how we end up overusing our gifts and feeling depleted over time.”
  • Learning from surveillance capitalism (Code Acts in Education) — “Terms such as ‘behavioural surplus’, ‘prediction products’, ‘behavioural futures markets’, and ‘instrumentarian power’ provide a useful critical language for decoding what surveillance capitalism is, what it does, and at what cost.”
  • Facebook, Libra, and the Long Game (Stratechery) — “Certainly Facebook’s audacity and ambition should not be underestimated, and the company’s network is the biggest reason to believe Libra will work; Facebook’s brand is the biggest reason to believe it will not.”
  • The Pixar Theory (Jon Negroni) — “Every Pixar movie is connected. I explain how, and possibly why.”
  • Mario Royale (Kottke.org) — “Mario Royale (now renamed DMCA Royale to skirt around Nintendo’s intellectual property rights) is a battle royale game based on Super Mario Bros in which you compete against 74 other players to finish four levels in the top three.”
  • Your Professional Decline Is Coming (Much) Sooner Than You Think (The Atlantic) — “In The Happiness Curve: Why Life Gets Better After 50, Jonathan Rauch, a Brookings Institution scholar and an Atlantic contributing editor, reviews the strong evidence suggesting that the happiness of most adults declines through their 30s and 40s, then bottoms out in their early 50s.”
  • What Happens When Your Kids Develop Their Own Gaming Taste (Kotaku) — “It’s rewarding too, though, to see your kids forging their own path. I feel the same way when I watch my stepson dominate a round of Fortnite as I probably would if he were amazing at rugby: slightly baffled, but nonetheless proud.”
  • Whence the value of open? (Half an Hour) — “We will find, over time and as a society, that just as there is a sweet spot for connectivity, there is a sweet spot for openness. And that point will be where the default for openness meets the push-back from people on the basis of other values such as autonomy, diversity and interactivity. And where, exactly, this sweet spot is, needs to be defined by the community, and achieved as a consensus.”
  • How to Be Resilient in the Face of Harsh Criticism (HBR) — “Here are four steps you can try the next time harsh feedback catches you off-guard. I’ve organized them into an easy-to-remember acronym — CURE — to help you put these lessons in practice even when you’re under stress.”
  • Fans Are Better Than Tech at Organizing Information Online (WIRED) — “Tagging systems are a way of imposing order on the real world, and the world doesn’t just stop moving and changing once you’ve got your nice categories set up.”

Header image via Dilbert

Exit option democracy

This week saw the launch of a new book by Shoshana Zuboff entitled The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. It was featured in two of my favourite newspapers, The Observer and The New York Times, and is the kind of book I would have lapped up this time last year.

In 2019, though, I’m being a bit more pragmatic, taking heed of Stoic advice to focus on the things that you can change. Chiefly, that’s your own perceptions about the world. I can’t change the fact that, despite the Snowden revelations and everything that has come afterwards, most people don’t care one bit that they’re trading privacy for convenience.

That puts those who care about privacy in a bit of a predicament. You can use the most privacy-respecting email service in the world, but as soon as you communicate with someone using Gmail, Google has the entire conversation. Chances are, the organisation you work for has ‘gone Google’ too.

Then there are Facebook shadow profiles. You don’t even have to have an account on that platform for the company behind it to know all about you. The same goes for companies knowing who’s in your friendship group if your friends upload their contacts to WhatsApp. It makes no difference whether you use ridiculous third-party gadgets or not.
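To make the shadow-profile mechanism concrete, here’s a toy sketch in Python. Everything in it is invented (the names, the phone number, the field names), and it is in no way Facebook’s actual system; the point is simply that a platform can assemble a fairly complete record of a non-user, including their social graph, purely from the address books its actual users upload.

```python
# Toy sketch of how a 'shadow profile' can be assembled for someone
# who has never signed up: every user who uploads their address book
# contributes a fragment. All data here is invented.
from collections import defaultdict

# Hypothetical address-book uploads from three users of the app.
uploads = {
    "alice": [{"name": "Doug B", "phone": "+44700900001"}],
    "bob":   [{"name": "Doug Belshaw", "phone": "+44700900001",
               "email": "doug@example.com"}],
    "carol": [{"name": "Doug (work)", "phone": "+44700900001"}],
}

TARGET = "+44700900001"     # phone numbers make a convenient join key

profile = defaultdict(set)  # everything the platform learns about the non-user
knows_them = set()          # the non-user's social graph

for uploader, contacts in uploads.items():
    for contact in contacts:
        if contact["phone"] == TARGET:
            knows_them.add(uploader)
            for field, value in contact.items():
                profile[field].add(value)

print(dict(profile))   # names, number, email: all without an account
print(knows_them)      # {'alice', 'bob', 'carol'}
```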

In short, if you want to live in modern society, your privacy depends on your family and friends. Of course you have the option to choose not to participate in certain platforms (I don’t use Facebook products) but that comes at a significant cost. It’s the digital equivalent of Thoreau taking himself off to Walden Pond.

In a post from last month that I stumbled across this weekend, Nate Matias reflects on a talk he attended by Janet Vertesi at Princeton University’s Center for Information Technology Policy. Vertesi, says Matias, tried four different ways of opting out of technology companies gathering data on her:

  • Platform avoidance
  • Infrastructural avoidance
  • Hardware experiments
  • Digital homesteading

Interestingly, the starting point is Vertesi’s rejection of ‘exit option democracy’:

The basic assumption of markets is that people have choices. This idea that “you can just vote with your feet” is called an “exit option democracy” in organizational sociology (Weeks, 2004). Opt-out democracy is not really much of a democracy, says Janet. She should know–she’s been opting out of tech products for years.

The option Vertesi advocates for going Google-free is a pain in the backside. I know, because I’ve tried it:

To prevent Google from accessing her data, Janet practices “data balkanization,” spreading her traces across multiple systems. She’s used DuckDuckGo, sandstorm.io, ResilioSync, and youtube-dl to access key services. She’s used other services occasionally and non-exclusively, and varied it with open source alternatives like etherpad and open street map. It’s also important to pay attention to who is talking to whom and sharing data with whom. Data balkanization relies on knowing what companies hate each other and who’s about to get in bed with whom.

The time I’ve spent doing these things was time I was not being productive, nor was it time I was spending with my wife and kids. It’s easy to roll your eyes at people “trading privacy for convenience” but it all adds up.
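For what it’s worth, the youtube-dl mentioned in that quote is probably the least painful of those tools. Here’s a minimal example of its documented Python API, assuming the package is installed; the video URL is a placeholder, not a real ID.

```python
# Minimal example of the 'data balkanization' tactic from the quote
# above: fetching a video with youtube-dl's Python API rather than
# watching it while logged in to a Google account.
import youtube_dl  # pip install youtube_dl

options = {
    "format": "best",                 # best available single-file format
    "outtmpl": "%(title)s.%(ext)s",   # save as '<video title>.<extension>'
}

with youtube_dl.YoutubeDL(options) as ydl:
    ydl.download(["https://www.youtube.com/watch?v=EXAMPLE_ID"])  # placeholder URL
```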

Talking of family, straying too far from societal norms has, for better or worse, negative consequences. Just as Linux users were targeted for surveillance, so Vertesi and her husband were suspected of fraud for browsing the web using Tor and using cash for transactions:

Trying to de-link your identity from data storage has consequences. For example, when Janet and her husband tried to use cash for their purchases, they faced risks of being reported to the authorities for fraud, even though their actions were legal.

And then, of course, there’s the tinfoil hat options:

…Janet used parts from electronics kits to make her own 2g phone. After making the phone Janet quickly realized even a privacy-protecting phone can’t connect to the network without identifying the user to companies through the network itself.

I’m rolling my eyes at this point. The farthest I’ve gone down this route is using the now-defunct Firefox OS and LineageOS for microG. Although both had their upsides, they were too annoying to use for extended periods of time.

Finally, Vertesi goes down the route of trying to own all your own data. I’ll just point out that there’s a reason those of us who had huge CD and MP3 collections switched to Spotify. Looking after any collection takes time and effort. It’s also a lot more cost effective for someone like me to ‘rent’ my music instead of own it. The same goes for Netflix.

What I do accept, though, is that Vertesi’s findings show the ‘exit option’ isn’t really an option here, so the world of technology isn’t really democratic. My takeaway, and the reason for my pragmatic approach this year, is that it’s up to governments to do something about all this.

Western society teaches us that empowered individuals can change the world. But if you take a closer look, whether it’s surveillance capitalism or climate change, it’s legislation that’s going to make the biggest difference here. Just look at the shift that took place because of GDPR.

So whether or not I read Zuboff’s new book, I’m going to continue my pragmatic approach this year. Meanwhile, I’ll keep muting the microphone on the smart speakers in our house when they’re not being used, blocking trackers on my Android smartphone, and making my monthly donations to the work of the Electronic Frontier Foundation and the Open Rights Group.

Source: J. Nathan Matias