
Friday foggings

I’ve been travelling this week, so I’ve had plenty of time to read and digest a whole range of articles. In fact, because of the luxury of that extra time, I decided to write some comments about each link, as well as the usual quotation.

Let me know what you think about this approach. I may not have the bandwidth to do it every week, but if it’s useful, I’ll try to prioritise it. As ever, I’m particularly interested in hearing from supporters!


Education and Men without Work (National Affairs) — “Unlike the Great Depression, however, today’s work crisis is not an unemployment crisis. Only a tiny fraction of workless American men nowadays are actually looking for employment. Instead we have witnessed a mass exodus of men from the workforce altogether. At this writing, nearly 7 million civilian non-institutionalized men between the ages of 25 and 54 are neither working nor looking for work — over four times as many as are formally unemployed.”

This article argues that the conventional wisdom, that men are out of work because of a lack of education, may be based on false assumptions. In fact, a major driver seems to be the number of men (more than 50% of working-age men, apparently) who live in child-free homes. What do these men end up doing with their time? Many of them are self-medicating with drugs and screens.


Fresh Cambridge Analytica leak ‘shows global manipulation is out of control’ (The Guardian) — “More than 100,000 documents relating to work in 68 countries that will lay bare the global infrastructure of an operation used to manipulate voters on “an industrial scale” are set to be released over the next months.”

Sadly, I think the response to these documents will be one of apathy. Due to the 24-hour news cycle and the stream of ‘news’ on social networks, the voting public grow tired of scandals and news stories that last for months and years.


Funding (Sussex Royals) — “The Sovereign Grant is the annual funding mechanism of the monarchy that covers the work of the Royal Family in support of HM The Queen including expenses to maintain official residences and workspaces. In this exchange, The Queen surrenders the revenue of the Crown Estate and in return, a portion of these public funds are granted to The Sovereign/The Queen for official expenditure.”

I don’t think I need to restate my opinions on the Royal Family, privilege, and hierarchies / coercive power relationships of all shapes and sizes. However, as someone pointed out on Mastodon, this page by ‘Harry and Meghan’ is quietly subversive.


How to sell good ideas (New Statesman) — “It is true that [Malcolm] Gladwell sometimes presses his stories too militantly into the service of an overarching idea, and, at least in his books, can jam together materials too disparate to cohere (Poole referred to his “relentless montage”). The New Yorker essay, which constrains his itinerant curiosity, is where he does his finest work (the best of these are collected in 2009’s What The Dog Saw). For the most part, the work of his many imitators attests to how hard it is to do what he does. You have to be able to write lucid, propulsive prose capable of introducing complex ideas within a magnetic field of narrative. You have to leave your desk and talk to people (he never stopped being a reporter). Above all, you need to acquire an extraordinary eye for the overlooked story, the deceptively trivial incident, the minor genius. Gladwell shares the late Jonathan Miller’s belief that “it is in the negligible that the considerable is to be found”.”

A friend took me to see Gladwell when he was in Newcastle-upon-Tyne touring with ‘What The Dog Saw’. Like the author of this article, I soon realised that Gladwell is selling something quite different to ‘science’ or ‘facts’. And so long as you’re OK with that, you can enjoy (as I do) his podcasts and books.


Just enough Internet: Why public service Internet should be a model of restraint (doteveryone) — “We have not yet done a good job of defining what good digital public service really looks like, of creating digital charters that match up to those of our great institutions, and it is these statements of values and ways of working – rather than any amount of shiny new technology – that will create essential building blocks for the public services of the future.”

While I attended the main MozFest weekend event, I missed the presentation and other events that happened earlier in the week. I definitely agree with the sentiment behind the transcript of this talk by Rachel Coldicutt. I’m just not sure it’s specific enough to be useful in practice.


Places to go in 2020 (Marginal Revolution) — “Here is the mostly dull NYT list. Here is my personal list of recommendations for you, noting I have not been to all of the below, but I am in contact with many travelers and paw through a good deal of information.”

This list by Tyler Cowen is really interesting. I haven’t been to any of the places on it, but I now really want to visit Eastern Bali and Baku in Azerbaijan.


Reasons not to scoff at ghosts, visions and near-death experiences (Aeon) — “Sure, the dangers of gullibility are evident enough in the tragedies caused by religious fanatics, medical quacks and ruthless politicians. And, granted, spiritual worldviews are not good for everybody. Faith in the ultimate benevolence of the cosmos will strike many as hopelessly irrational. Yet, a century on from James’s pragmatic philosophy and psychology of transformative experiences, it might be time to restore a balanced perspective, to acknowledge the damage that has been caused by stigma, misdiagnoses and mis- or overmedication of individuals reporting ‘weird’ experiences. One can be personally skeptical of the ultimate validity of mystical beliefs and leave properly theological questions strictly aside, yet still investigate the salutary and prophylactic potential of these phenomena.”

I’d happily read a full-length book on this subject, as it’s a fascinating area. The tension is that reducing much, or all, of these phenomena to materiality and mechanics may explain what’s going on, but it doesn’t explain it away…


Surveillance Tech Is an Open Secret at CES 2020 (OneZero) — “Lowe offered one explanation for why these companies feel so comfortable marketing surveillance tech: He says that the genie can’t be put back in the bottle, so barring federal regulation that bans certain implementations, it’s increasingly likely that some company will fill the surveillance market. In other words, if Google isn’t going to work with the cops, Amazon will. And even if Amazon decides not to, smaller companies out of the spotlight still will.”

I suppose it should come as no surprise that, in this day and age, companies like CyberLink, previously known for their PowerDVD software, have moved into the very profitable world of surveillance capitalism. What’s going to stop its inexorable rise? I can only think of government regulation (with teeth).


‘Techlash’ Hits College Campuses (New York Times) — “Some recent graduates are taking their technical skills to smaller social impact groups instead of the biggest firms. Ms. Dogru said that some of her peers are pursuing jobs at start-ups focused on health, education and privacy. Ms. Harbour said Berkeley offers a networking event called Tech for Good, where alumni from purpose-driven groups like Code for America and Khan Academy share career opportunities.”

I’m not sure this is currently as big a ‘movement’ as suggested in the article, but I’m glad the wind is blowing in this direction. As with other ethically dubious industries, companies involved in surveillance capitalism will have to pay people extraordinarily well to put aside their moral scruples.


Tradition is Smarter Than You Are (The Scholar’s Stage) — “To extract resources from a population the state must be able to understand that population. The state needs to make the people and things it rules legible to agents of the government. Legibility means uniformity. States dream up uniform weights and measures, impress national languages and ID numbers on their people, and divvy the country up into land plots and administrative districts, all to make the realm legible to the powers that be. The problem is that not all important things can be made legible. Much of what makes a society successful is knowledge of the tacit sort: rarely articulated, messy, and from the outside looking in, purposeless. These are the first things lost in the quest for legibility. Traditions, small cultural differences, odd and distinctive lifeways… are all swept aside by a rationalizing state that preserves (or in many cases, imposes) only what can be understood and manipulated from the 2,000 foot view. The result… are many of the greatest catastrophes of human history.”

One of the books that’s been on my ‘to-read’ list for a while is ‘Seeing Like a State’, written by James C. Scott and referenced in this article. I’m no believer in tradition for its own sake but, I have to say, a lot of my maternal grandmother’s superstitions, and a lot of the rituals that come with religion, are very practical in nature.


Image by Michael Schlegel (via kottke.org)

Friday fertilisations

I’ve read so much stuff over the past couple of months that it’s been a real job whittling down these links. In the end I gave up and shared a few more than usual!

  • You Shouldn’t Have to Be Good at Your Job (GEN) — “This is how the 1% justifies itself. They are not simply the best in terms of income, but in terms of humanity itself. They’re the people who get invited into the escape pods when the mega-asteroid is about to hit. They don’t want a fucking thing to do with the rest of the population and, in fact, they have exploited global economic models to suss out who deserves to be among them and who deserves to be obsolete. And, thanks to lax governments far and wide, they’re free to practice their own mass experiments in forced Darwinism. You currently have the privilege of witnessing a worm’s-eye view of this great culling. Fun, isn’t it?”
  • We’ve spent the decade letting our tech define us. It’s out of control (The Guardian) — “There is a way out, but it will mean abandoning our fear and contempt for those we have become convinced are our enemies. No one is in charge of this, and no amount of social science or monetary policy can correct for what is ultimately a spiritual deficit. We have surrendered to digital platforms that look at human individuality and variance as “noise” to be corrected, rather than signal to be cherished. Our leading technologists increasingly see human beings as a problem, and technology as the solution – and they use our behavior on their platforms as evidence of our essentially flawed nature.”
  • How headphones are changing the sound of music (Quartz) — “Another way headphones are changing music is in the production of bass-heavy music. Harding explains that on small speakers, like headphones or those in a laptop, low frequencies are harder to hear than when blasted from the big speakers you might encounter at a concert venue or club. If you ever wondered why the bass feels so powerful when you are out dancing, that’s why. In order for the bass to be heard well on headphones, music producers have to boost bass frequencies in the higher range, the part of the sound spectrum that small speakers handle well.”
  • The False Promise of Morning Routines (The Atlantic) — “Goat milk or no goat milk, the move toward ritualized morning self-care can seem like merely a palliative attempt to improve work-life balance. It makes sense to wake up 30 minutes earlier than usual because you want to fit in some yoga, an activity that you enjoy. But something sinister seems to be going on if you feel that you have to wake up 30 minutes earlier than usual to improve your well-being, so that you can also work 60 hours a week, cook dinner, run errands, and spend time with your family.”
  • Giant surveillance balloons are lurking at the edge of space (Ars Technica) — “The idea of a constellation of stratospheric balloons isn’t new—the US military floated the idea back in the ’90s—but technology has finally matured to the point that they’re actually possible. World View’s December launch marks the first time the company has had more than one balloon in the air at a time, if only for a few days. By the time you’re reading this, its other stratollite will have returned to the surface under a steerable parachute after nearly seven weeks in the stratosphere.”
  • The Unexpected Philosophy Icelanders Live By (BBC Travel) — “Maybe it makes sense, then, that in a place where people were – and still are – so often at the mercy of the weather, the land and the island’s unique geological forces, they’ve learned to give up control, leave things to fate and hope for the best. For these stoic and even-tempered Icelanders, þetta reddast is less a starry-eyed refusal to deal with problems and more an admission that sometimes you must make the best of the hand you’ve been dealt.”
  • What Happens When Your Career Becomes Your Whole Identity (HBR) — “While identifying closely with your career isn’t necessarily bad, it makes you vulnerable to a painful identity crisis if you burn out, get laid off, or retire. Individuals in these situations frequently suffer anxiety, depression, and despair. By claiming back some time for yourself and diversifying your activities and relationships, you can build a more balanced and robust identity in line with your values.”
  • Having fun is a virtue, not a guilty pleasure (Quartz) — “There are also, though, many high-status workers who can easily afford to take a break, but opt instead to toil relentlessly. Such widespread workaholism in part reflects the misguided notion that having fun is somehow an indulgence, an act of absconding from proper respectable behavior, rather than embracement of life. “
  • It’s Time to Get Personal (Laura Kalbag) — “As designers and developers, it’s easy to accept the status quo. The big tech platforms already exist and are easy to use. There are so many decisions to be made as part of our work, we tend to just go with what’s popular and convenient. But those little decisions can have a big impact, especially on the people using what we build.”
  • The 100 Worst Ed-Tech Debacles of the Decade (Hack Education) — “Oh yes, I’m sure you can come up with some rousing successes and some triumphant moments that made you thrilled about the 2010s and that give you hope for “the future of education.” Good for you. But that’s not my job. (And honestly, it’s probably not your job either.)”
  • Why so many Japanese children refuse to go to school (BBC News) — “Many schools in Japan control every aspect of their pupils’ appearance, forcing pupils to dye their brown hair black, or not allowing pupils to wear tights or coats, even in cold weather. In some cases they even decide on the colour of pupils’ underwear. “
  • The real scam of ‘influencer’ (Seth Godin) — “And a bigger part is that the things you need to do to be popular (the only metric the platforms share) aren’t the things you’d be doing if you were trying to be effective, or grounded, or proud of the work you’re doing.”

Image via Kottke.org

I am not fond of expecting catastrophes, but there are cracks in the universe

So said Sydney Smith. Let’s talk about surveillance. Let’s talk about surveillance capitalism and surveillance humanitarianism. But first, let’s talk about machine learning and algorithms; in other words, let’s talk about what happens after all of that data is collected.

Writing in The Guardian, Sarah Marsh investigates local councils using “automated guidance systems” in an attempt to save money.

The systems are being deployed to provide automated guidance on benefit claims, prevent child abuse and allocate school places. But concerns have been raised about privacy and data security, the ability of council officials to understand how some of the systems work, and the difficulty for citizens in challenging automated decisions.

Sarah Marsh

The trouble is, they’re not particularly effective:

It has emerged North Tyneside council has dropped TransUnion, whose system it used to check housing and council tax benefit claims. Welfare payments to an unknown number of people were wrongly delayed when the computer’s “predictive analytics” erroneously identified low-risk claims as high risk.

Meanwhile, Hackney council in east London has dropped Xantura, another company, from a project to predict child abuse and intervene before it happens, saying it did not deliver the expected benefits. And Sunderland city council has not renewed a £4.5m data analytics contract for an “intelligence hub” provided by Palantir.

Sarah Marsh

When I was at Mozilla, a number of my colleagues had worked on the OFA (Obama For America) campaign. I remember one of them, a DevOps guy, expressing his concern that the infrastructure being built was all well and good while someone ‘friendly’ was in the White House, but asking: what comes next?

Well, we now know what comes next, on both sides of the Atlantic, and we can’t put that genie back in its bottle. Swingeing cuts by successive Conservative governments over here, coupled with the Brexit time-and-money pit, mean that there’s no attention or cash left.

If we stop and think about things for a second, we probably don’t want to live in a world where machines make decisions for us, based on algorithms devised by nerds. As Rose Eveleth discusses in a scathing article for Vox, this stuff isn’t ‘inevitable’ — nor does it constitute a process of ‘natural selection’:

Often consumers don’t have much power of selection at all. Those who run small businesses find it nearly impossible to walk away from Facebook, Instagram, Yelp, Etsy, even Amazon. Employers often mandate that their workers use certain apps or systems like Zoom, Slack, and Google Docs. “It is only the hyper-privileged who are now saying, ‘I’m not going to give my kids this,’ or, ‘I’m not on social media,’” says Rumman Chowdhury, a data scientist at Accenture. “You actually have to be so comfortable in your privilege that you can opt out of things.”

And so we’re left with a tech world claiming to be driven by our desires when those decisions aren’t ones that most consumers feel good about. There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the “natural evolution of technology.” But the “natural evolution of technology” was never a thing to begin with, and it’s time to question what “progress” actually means.

Rose Eveleth

I suppose the thing that concerns me the most is people in dire need being made subject to impersonal technology in order to receive vital, life-saving aid.

For example, Mark Latonero, writing in The New York Times, talks about the growing dangers around what he calls ‘surveillance humanitarianism’:

By surveillance humanitarianism, I mean the enormous data collection systems deployed by aid organizations that inadvertently increase the vulnerability of people in urgent need.

Despite the best intentions, the decision to deploy technology like biometrics is built on a number of unproven assumptions, such as, technology solutions can fix deeply embedded political problems. And that auditing for fraud requires entire populations to be tracked using their personal data. And that experimental technologies will work as planned in a chaotic conflict setting. And last, that the ethics of consent don’t apply for people who are starving.

Mark Latonero

It’s easy to think that this is an emergency, so we should just do whatever is necessary. But Latonero explains that the risk doesn’t disappear; it is merely shifted to a later time:

If an individual or group’s data is compromised or leaked to a warring faction, it could result in violent retribution for those perceived to be on the wrong side of the conflict. When I spoke with officials providing medical aid to Syrian refugees in Greece, they were so concerned that the Syrian military might hack into their database that they simply treated patients without collecting any personal data. The fact that the Houthis are vying for access to civilian data only elevates the risk of collecting and storing biometrics in the first place.

Mark Latonero

There was a rather startling article in last weekend’s newspaper, which I’ve found online. Hannah Devlin, again writing in The Guardian (which is a good source of information for those concerned with surveillance) writes about a perfect storm of social media and improved processing speeds:

[I]n the past three years, the performance of facial recognition has stepped up dramatically. Independent tests by the US National Institute of Standards and Technology (Nist) found the failure rate for finding a target picture in a database of 12m faces had dropped from 5% in 2010 to 0.1% this year.

The rapid acceleration is thanks, in part, to the goldmine of face images that have been uploaded to Instagram, Facebook, LinkedIn and captioned news articles in the past decade. At one time, scientists would create bespoke databases by laboriously photographing hundreds of volunteers at different angles, in different lighting conditions. By 2016, Microsoft had published a dataset, MS Celeb, with 10m face images of 100,000 people harvested from search engines – they included celebrities, broadcasters, business people and anyone with multiple tagged pictures that had been uploaded under a Creative Commons licence, allowing them to be used for research. The dataset was quietly deleted in June, after it emerged that it may have aided the development of software used by the Chinese state to control its Uighur population.

In parallel, hardware companies have developed a new generation of powerful processing chips, called Graphics Processing Units (GPUs), uniquely adapted to crunch through a colossal number of calculations every second. The combination of big data and GPUs paved the way for an entirely new approach to facial recognition, called deep learning, which is powering a wider AI revolution.

Hannah Devlin
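
To make the ‘deep learning’ part slightly more concrete: modern systems reduce each face to a fixed-length vector (an ‘embedding’) and match faces by comparing vectors, which is exactly the kind of bulk arithmetic GPUs excel at. Here’s a toy sketch in Python (a stand-in illustration, not any vendor’s actual pipeline; the trained network is replaced by a deterministic dummy):

```python
import numpy as np

def embed(face_image: np.ndarray) -> np.ndarray:
    # Stand-in for a deep network that maps a face to a 128-dimensional
    # vector; real systems learn this mapping from millions of labelled
    # images, like those scraped into the MS Celeb dataset.
    rng = np.random.default_rng(int(face_image.sum()) % (2**32))
    vec = rng.normal(size=128)
    return vec / np.linalg.norm(vec)

def best_match(probe: np.ndarray, database: dict) -> tuple:
    # Cosine similarity against every enrolled embedding; with vectors
    # pre-computed, searching millions of faces is one big matrix
    # multiply, which is the work GPUs make cheap.
    scores = {name: float(probe @ vec) for name, vec in database.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

# Enrol two (fake) face images, then search with a new probe image.
db = {name: embed(np.full((64, 64), i)) for i, name in enumerate(["alice", "bob"])}
probe = embed(np.full((64, 64), 1))  # same pixels as "bob"
print(best_match(probe, db))         # ('bob', ~1.0)
```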

Those of you who have read this far and are expecting some big reveal are going to be disappointed. I don’t have any ‘answers’ to these problems. I guess I’ve been guilty, like many of us have, of the kind of ‘privacy nihilism’ mentioned by Ian Bogost in The Atlantic:

Online services are only accelerating the reach and impact of data-intelligence practices that stretch back decades. They have collected your personal data, with and without your permission, from employers, public records, purchases, banking activity, educational history, and hundreds more sources. They have connected it, recombined it, bought it, and sold it. Processed foods look wholesome compared to your processed data, scattered to the winds of a thousand databases. Everything you have done has been recorded, munged, and spat back at you to benefit sellers, advertisers, and the brokers who service them. It has been for a long time, and it’s not going to stop. The age of privacy nihilism is here, and it’s time to face the dark hollow of its pervasive void.

Ian Bogost

The only forces that we have to stop this are collective action and governmental action. My concern is that we don’t have the digital savvy to do the former, and there’s definitely a lack of will in respect of the latter. Troubling times.

Friday fawnings

On this week’s rollercoaster journey, I came across these nuggets:

  • Renata Ávila: “The Internet of creation disappeared. Now we have the Internet of surveillance and control” (CCCB Lab) — “This lawyer and activist talks with a global perspective about the movements that the power of “digital colonialism” is weaving. Her arguments are essential for preventing ourselves from being crushed by the technological world, from being carried away by the current of ephemeral divertemento. For being fully aware that, as individuals, our battle is not lost, but that we can control the use of our data, refuse to give away our facial recognition or demand that the privacy laws that protect us are obeyed.”
  • Everything Is Private Equity Now (Bloomberg) — “The basic idea is a little like house flipping: Take over a company that’s relatively cheap and spruce it up to make it more attractive to other buyers so you can sell it at a profit in a few years. The target might be a struggling public company or a small private business that can be combined—or “rolled up”—with others in the same industry.”
  • Forget STEM, We Need MESH (Our Human Family) — “I would suggest a renewed focus on MESH education, which stands for Media Literacy, Ethics, Sociology, and History. Because if these are not given equal attention, we could end up with incredibly bright and technically proficient people who lack all capacity for democratic citizenship.”
  • Connecting the curious (Harold Jarche) — “If we want to change the world, be curious. If we want to make the world a better place, promote curiosity in all aspects of learning and work. There are still a good number of curious people of all ages working in creative spaces or building communities around common interests. We need to connect them.”
  • Twitter: No, really, we’re very sorry we sold your security info for a boatload of cash (The Register) — “The social networking giant on Tuesday admitted to an “error” that let advertisers have access to the private information customers had given Twitter in order to place additional security protections on their accounts.”
  • Digital tools interrupt workers 14 times a day (CIO Dive) — “The constant chime of digital workplace tools including email, instant messaging or collaboration software interrupts knowledge workers 13.9 times on an average day, according to a survey of 3,750 global workers from Workfront.”
  • Book review – Curriculum: Athena versus the Machine (TES) — “Despite the hope that the book is a cure for our educational malaise, Curriculum is a morbid symptom of the current political and intellectual climate in English education.”
  • Fight for the planet: Building an open platform and open culture at Greenpeace (Opensource.com) — “Being as open as we can, pushing the boundaries of what it means to work openly, doesn’t just impact our work. It impacts our identity.”
  • Psychodata (Code Acts in Education) — “Social-emotional learning sounds like a progressive, child-centred agenda, but behind the scenes it’s primarily concerned with new forms of child measurement.”

Image via xkcd

To be perfectly symmetrical is to be perfectly dead

So said Igor Stravinsky. I’m a little behind on my writing, and prioritised writing up my experiences in the Lake District over the past couple of days.

Today’s update is therefore a list post:

  • Degrowth: a Call for Radical Abundance (Jason Hickel) — “In other words, the birth of capitalism required the creation of scarcity. The constant creation of scarcity is the engine of the juggernaut.”
  • Hey, You Left Something Out (Cogito, Ergo Sumana) — “People who want to compliment work should probably learn to give compliments that sound encouraging.”
  • The Problem is Capitalism (George Monbiot) — “A system based on perpetual growth cannot function without peripheries and externalities. There must always be an extraction zone, from which materials are taken without full payment, and a disposal zone, where costs are dumped in the form of waste and pollution.”
  • In Stores, Secret Surveillance Tracks Your Every Move (The New York Times) — “For years, Apple and Google have allowed companies to bury surveillance features inside the apps offered in their app stores. And both companies conduct their own beacon surveillance through iOS and Android.”
  • The Inevitable Same-ification of the Internet (Matthew Ström) — “Convergence is not the sign of a broken system, or a symptom of a more insidious disease. It is an emergent phenomenon that arises from a few simple rules.”


Wretched is a mind anxious about the future

So said one of my favourite non-fiction authors, the 16th-century proto-blogger Michel de Montaigne. There’s plenty of writing about how anxious we should be about the drift towards a future of surveillance states. Eventually, because it’s not currently affecting us here and now, we become blasé. We forget that it’s already the lived experience for hundreds of millions of people.

Take China, for example. In The Atlantic, Derek Thompson writes about the Chinese government’s brutality against the Muslim Uyghur population in the western province of Xinjiang:

[The] horrifying situation is built on the scaffolding of mass surveillance. Cameras fill the marketplaces and intersections of the key city of Kashgar. Recording devices are placed in homes and even in bathrooms. Checkpoints that limit the movement of Muslims are often outfitted with facial-recognition devices to vacuum up the population’s biometric data. As China seeks to export its suite of surveillance tech around the world, Xinjiang is a kind of R&D incubator, with the local Muslim population serving as guinea pigs in a laboratory for the deprivation of human rights.

Derek Thompson

As Ian Welsh points out, talk of surveillance states usually involves us in the West pointing towards places like China and shaking our heads. However, if you step back a moment and remember that societies like the US and UK are becoming more unequal over time, then perhaps we’re the ones who should be worried:

The endgame, as I’ve been pointing out for years, is a society in which where you are and what you’re doing, and have done is, always known, or at least knowable. And that information is known forever, so the moment someone with power wants to take you out, they can go back thru your life in minute detail. If laws or norms change so that what was OK 10 or 30 years ago isn’t OK now, well they can get you on that.

Ian Welsh

As the world becomes more unequal, the position of elites becomes more perilous, hence Silicon Valley billionaires preparing boltholes in New Zealand. Ironically, they’re looking for places where they can’t be found, while making serious money from providing surveillance technology. Instead of solving the inequality, they attempt to insulate themselves from its effects.

A lot of the crazy money earned in Silicon Valley comes at the price of infringing our privacy. I’ve spent a long time thinking about privacy, and it’s quite a nebulous concept: not the easiest thing to understand when you examine it more closely.

Privacy is usually considered a freedom from rather than a freedom to, as in “freedom from surveillance”. The trouble is that there are many kinds of surveillance, and some of these we actively encourage. A quick example: I know of at least one family that share their location with one another all of the time. At the same time, of course, they’re sharing it with the company that provides that service.

There’s a lot of power in the ‘default’ privacy settings devices and applications come with. People tend to go with whatever comes as standard. Sidney Fussell writes in The Atlantic that:

Many apps and products are initially set up to be public: Instagram accounts are open to everyone until you lock them… Even when companies announce convenient shortcuts for enhancing security, their products can never become truly private. Strangers may not be able to see your selfies, but you have no way to untether yourself from the larger ad-targeting ecosystem.

Sidney Fussell

Some of us (including me) are willing to trade some of that privacy for more personalised services that somehow make our lives easier. The tricky thing is when it comes to employers and state surveillance. In these cases there are coercive power relationships at play, rather than just convenience.

Ellen Sheng, writing for CNBC, explains how employees in the US are at huge risk from workplace surveillance:

In the workplace, almost any consumer privacy law can be waived. Even if companies give employees a choice about whether or not they want to participate, it’s not hard to force employees to agree. That is, unless lawmakers introduce laws that explicitly state a company can’t make workers agree to a technology…

One example: Companies are increasingly interested in employee social media posts out of concern that employee posts could reflect poorly on the company. A teacher’s aide in Michigan was suspended in 2012 after refusing to share her Facebook page with the school’s superintendent following complaints about a photo she had posted. Since then, dozens of similar cases prompted lawmakers to take action. More than 16 states have passed social media protections for individuals.

Ellen Sheng

It’s not just workplaces, though. Schools are hotbeds for new surveillance technologies, as Benjamin Herold notes in an article for Education Week:

Social media monitoring companies track the posts of everyone in the areas surrounding schools, including adults. Other companies scan the private digital content of millions of students using district-issued computers and accounts. Those services are complemented with tip-reporting apps, facial-recognition software, and other new technology systems.

[…]

While schools are typically quiet about their monitoring of public social media posts, they generally disclose to students and parents when digital content created on district-issued devices and accounts will be monitored. Such surveillance is typically done in accordance with schools’ responsible-use policies, which students and parents must agree to in order to use districts’ devices, networks, and accounts.

Hypothetically, students and families can opt out of using that technology. But doing so would make participating in the educational life of most schools exceedingly difficult.

Benjamin Herold

In China, of course, a social credit system makes all of this a million times worse, but we in the West aren’t heading in a great direction either.

We’re entering an era in which, by the time my children are my age, companies, employers, and the state could hold decades of data on them: from entering the school system through to finding jobs and becoming parents themselves.

There are upsides to all of this data, obviously. But I think that in the midst of privacy-focused conversations about Amazon’s smart speakers and Google location-sharing, we might be missing the bigger picture around surveillance by educational institutions, employers, and governments.

Returning to Ian Welsh to finish up, remember that it’s the coercive power relationships that make surveillance a bad thing:

Surveillance societies are sterile societies. Everyone does what they’re supposed to do all the time, and because we become what we do, it affects our personalities. It particularly affects our creativity, and is a large part of why Communist surveillance societies were less creative than the West, particularly as their police states ramped up.

Ian Welsh

We don’t want to think about all of this, though, do we?



Tracking vs advertising

We tend to use a word to denote something right up to the time that term becomes untenable and someone has to invent a better one. Take mobile phones, for example. They’re literally named after the least-used app on them, so we’re crying out for a different way to refer to them. Perhaps a better name would be ‘trackers’.

These days, most people use mobile devices for social networking. These are available free at the point of access, funded by what we’re currently calling ‘advertising’. However, as this author notes, it’s nothing of the sort:

What we have today is not advertising. The amount of personally identifiable information companies have about their customers is absolutely perverse. Some of the world’s largest companies are in the business of selling your personal information for use in advertising. This might sound innocuous but the tracking efforts of these companies are so accurate that many people believe that Facebook listens to their conversations to serve them relevant ads. Even if it’s true that the microphone is not used, the sum of all other data collected is still enough to show creepily relevant advertising.

Unfortunately, the author doesn’t seem to have come to the conclusion yet that it’s the logic of capitalism that got us here. Instead, he just points out that people’s privacy is being abused.

[P]eople now get most of their information from social networks yet these networks dictate the order in which content is served to the user. Google makes the world’s most popular mobile operating system and its purpose is to drive the company’s bottom line (ad blocking is forbidden). “Smart” devices are everywhere and companies are jumping over each other to put more shit in your house so they can record your movements and sell the information to advertisers. This is all a blatant abuse of privacy that is completely toxic to society.

Agreed, and it’s easy to feel a little helpless against this onslaught. While it’s great to have a list of things that users can do, if those things are difficult to implement and/or hard to understand, then it’s an uphill battle.

That being said, the three suggestions he makes are useful:

To combat this trend, I have taken the following steps and I think others should join the movement:

  • Aggressively block all online advertisements
  • Don’t succumb to the “curated” feeds
  • Not every device needs to be “smart”

I feel I’m already way ahead of the author in this regard:

  • Aggressively block all online advertisements
  • Don’t succumb to the “curated” feeds
    • I quit Facebook years ago, haven’t got an Instagram account, and pretty much only post links to my own spaces on Twitter and LinkedIn.
  • Not every device needs to be “smart”
    • I don’t really use my Philips Hue lights, and don’t have an Amazon Alexa (or even the Google Assistant on my phone).
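
On the first of those steps, it’s worth knowing how little magic is involved: most ad blocking boils down to matching hostnames against large blocklists. Here’s a toy sketch in Python, with a made-up two-entry blocklist standing in for the EasyList-style or hosts-file lists (hundreds of thousands of entries) that real blockers consume:

```python
# Hypothetical two-entry blocklist; real blockers load EasyList-style or
# hosts-file lists containing hundreds of thousands of ad/tracker domains.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def is_blocked(hostname: str) -> bool:
    # Block the exact host, or anything served from a blocked parent
    # domain, which is how hosts-based and DNS-level blockers behave.
    parts = hostname.lower().split(".")
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

print(is_blocked("cdn.ads.example.com"))  # True: parent domain is listed
print(is_blocked("example.com"))          # False: not on the list
```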

It’s not easy to stand up to Big Tech. The amount of money they pour into things makes their ‘innovations’ seem inevitable. They can afford to make things cheap and frictionless so you get hooked.

As an aside, it’s interesting to note that those who previously defended Apple (the world’s most profitable company) as somehow ‘different’ on privacy are starting to backtrack.

Source: Nicholas Rempel

It’s called Echo for a reason

That last-minute Christmas gift sounds like nothing but unadulterated fun after reading this, doesn’t it?

It is a significant thing to allow a live microphone in your private space (just as it is to allow them in our public spaces). Once the hardware is in place, and receiving electricity, and connected to the Internet, then you’re reduced to placing your trust in the hands of two things that unfortunately are less than reliable these days: 1) software, and 2) policy.

Software, once a mic is in place, governs when that microphone is live, when the audio it captures is transmitted over the Internet, and to whom it goes. Many devices are programmed to keep their microphones on at all times but only record and transmit audio after hearing a trigger phrase—in the case of the Echo, for example, “Alexa.” Any device that is to be activated by voice alone must work this way. There are a range of other systems. Samsung, after a privacy dust-up, assured the public that its smart televisions (like others) only record and transmit audio after the user presses a button on its remote control. The Hello Barbie toy only picks up and transmits audio when its user presses a button on the doll.

Software is invisible, however. Most companies do not make their code available for public inspection, and it can be hacked, or unscrupulous executives can lie about what it does (think Volkswagen), or government agencies might try to order companies to activate them as a surveillance device.
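
Seeing that trigger-phrase logic in skeletal form makes the point sharper: the microphone is always listening, and software only gates the transmission. A minimal sketch in Python, with entirely hypothetical function names and strings standing in for audio frames:

```python
import collections

PRE_ROLL = 4  # frames of audio kept on-device so a command's start isn't lost

def detect_wake_word(frame: str) -> bool:
    # Hypothetical on-device detector; real devices run a small acoustic
    # model against the live audio, entirely locally.
    return "alexa" in frame.lower()

def transmit(frames: list) -> None:
    # Hypothetical network call: this is the moment audio leaves the home.
    print("uploading:", frames)

def listen_loop(microphone) -> None:
    # The microphone itself is always live: every frame passes through here.
    buffer = collections.deque(maxlen=PRE_ROLL)
    streaming = False
    for frame in microphone:
        buffer.append(frame)
        if not streaming and detect_wake_word(frame):
            streaming = True
            transmit(list(buffer))  # includes audio from just before the trigger
        elif streaming:
            transmit([frame])

# Simulated always-on microphone: everything is heard, but only what
# follows the trigger phrase is transmitted.
listen_loop(["birdsong", "dinner chat", "Alexa,", "play some jazz"])
```

Everything here, including whether transmit() could ever be called early, is invisible software behaviour of exactly the kind the article says we are forced to trust.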

I sincerely hope that policy makers pay heed to the recommendations section, especially given the current ‘Wild West’ state of affairs described in the article.

Source: ACLU