
The truth is too simple: one must always get there by a complicated route

😍 Drone Awards 2020: the world seen from above

😷 Adequate Vitamin D Levels Cuts Risk Of Dying From Covid-19 In Half, Study Finds

🔊 The BBC is releasing over 16,000 sound effects for free download

👍 Proposal would give EU power to boot tech giants out of European market

🎧 The Hidden Costs of Streaming Music


Quotation-as-title by George Sand. Image from top-linked post.

Saturday shiftings

I think this is the latest I’ve published my weekly roundup of links. That’s partly because of an epic family walk we did today, but also because of work, and because of the length and quality of the things I bookmarked to come back to…

Enjoy!


Graffiti in Hong Kong subway station (translation: “We can’t return to normal, because the normal that we had was precisely the problem.”)

FC97: Portal Economics

Most of us are still trapped in the mental coordinates of a world that isn’t waiting for us on the other side. You can see this in the language journalists are still using. The coronavirus is a ‘strategic surprise’ and we’re still very much in the ‘fog of war,’ dealing with the equivalent of an ‘alien invasion’ or an ‘unexpected asteroid strike.’ As I said back in March though, this is not a natural disaster, like an earthquake, a one-off event from which we can rebuild. It’s not a war or a financial crisis either. There are deaths, but no combatants, no physical resources have been destroyed, and there was no initial market crash, although obviously the markets are now reacting.

The crisis is of the entire system we’ve built. In another article, I described this as the bio-political straitjacket. We can’t reopen our economies, because if we do then more people will die. We can’t keep them closed either, because our entire way of life is built on growth, and without it, everything collapses. We can give up our civil liberties, submitting to more surveillance and control, but as Amartya Sen would say, what good is a society if the cost of our health and livelihoods is our hard fought for freedoms?

Gus Hervey (Future Crunch)

This is an incredible read, and if you click through to anything this week to sit down and consume with your favourite beverage, I highly recommend this one.


Coronavirus shows us it’s time to rethink everything. Let’s start with education

There’s nothing radical about the things we’re learning: it’s a matter of emphasis more than content – of centralising what is most important. Now, perhaps, we have an opportunity to rethink the entire basis of education. As local authorities in Scotland point out, outdoor learning could be the best means of getting children back to school, as it permits physical distancing. It lends itself to re-engagement with the living world. But, despite years of research demonstrating its many benefits, the funding for outdoor education and adventure learning has been cut to almost nothing.

George Monbiot (The Guardian)

To some extent, this is Monbiot using a different stick to bang the same drum, but he certainly has a point about the most important things to be teaching our young people as their future begins to look a lot different to ours.


The Machine Pauses

In 1909, following a watershed era of technological progress, but preceding the industrialized massacres of the Somme and Verdun, E.M. Forster imagined, in “The Machine Stops,” a future society in which the entirety of lived experience is administered by a kind of mechanical demiurge. The story is the perfect allegory for the moment, owing not least to its account of a society-wide sudden stop and its eerily prescient description of isolated lives experienced wholly through screens.

Stuart Whatley (The Hedgehog Review)

No, I didn’t know what a ‘demiurge’ was either. Apparently, it’s “an artisan-like figure responsible for fashioning and maintaining the physical universe”.

This article, which quotes not only E.M. Forster but also Heidegger and Nathaniel Hawthorne, discusses whether we really should be allowing technology to dictate the momentum of society.


Party in a spreadsheet

Party in a Shared Google Doc

The party has no communal chat log. Whilst I can enable edit permissions for those with the party link, shared google docs do not allow for chat between anonymous animals. Instead conversations are typed in cells. There are too many animals to keep track of who is who. I stop and type to someone in a nearby cell. My cursor is blue, theirs is orange. I have no idea if they are a close friend or a total stranger. How do you hold yourself and what do you say to someone when personal context is totally stripped away?

Marie Foulston

I love this so much.


Being messy when everything is clean

[T]o put it another way, people whose working lives can be mediated through technology — conducted from bedrooms and kitchen tables via Teams or Slack, email and video calls — are at much less risk. In fact, our laptops and smartphones might almost be said to be saving our lives. This is an unintended consequence of remote working, but it is certainly a new reality that needs to be confronted and understood.

And many people who can work from a laptop are also less likely to lose their jobs than people who work in the service and hospitality industries, especially those who have well-developed professional networks and high social capital. According to The Economist, this group are having a much better lockdown than most — homeschooling notwithstanding. But then, they probably also had a more comfortable life beforehand.

Rachel Coldicutt (Glimmers)

This post, “a scrapbook of links and questions that explore how civil society might be in a digital world,” is a really interesting look at the physicality of our increasingly-digital world and how the messiness of human life is being ‘cleaned up’ by technology.


Remote work worsens inequality by mostly helping high-income earners

Given its potential benefits, telecommuting is an attractive option to many. Studies have shown a substantial number of workers would even agree to a lower salary for a job that would allow them to work from home. The appeal of remote work can be especially strong during times of crisis, but also exists under more normal circumstances.

The ongoing crisis therefore amplifies inequalities when it comes to financial and work-life balance benefits. If there’s a broader future adoption of telecommuting, a likely result of the current situation, that would still mean a large portion of the working population, many of them low-income workers, would be disadvantaged

Georges A. Tanguay & Ugo Lachapelle (The Conversation)

There are some interesting graphs included in this Canadian study of remote work. While I’ve written plenty about remote work before, I don’t think I’ve really touched on how much it reinforces white, middle-class, male privilege.

The BBC has an article entitled Why are some people better at working from home than others? which suggests that succeeding and/or flourishing in a remote work situation is down to the individual, rather than the context. The truth is, it’s almost always easier to be a man in a work environment — remote or otherwise. This is something we need to change.


Unreal Engine

A first look at Unreal Engine 5

We’ve just released a first look at Unreal Engine 5. One of our goals in this next generation is to achieve photorealism on par with movie CG and real life, and put it within practical reach of development teams of all sizes through highly productive tools and content libraries.

I remember showing my late grandmother FIFA 18 and her not being able to tell the difference between it and the football she watched regularly on the television.

Even if you’re not a gamer, you’ll find this video incredible. It shows how, from early next year, cinematic-quality experiences will be within grasp of even small development teams.


Grand illusion: how the pandemic exposed we’re all just pretending

Our pretending we’re not drowning is the proof we have that we might still be worth saving. Our performing stability is one of the few ways that we hope we might navigate the narrow avenues that might still get us out.

A thing, though, about perpetuating misperceptions, about pretending – because you’re busy surviving, because you can’t stop playing the rigged game on the off-chance somehow that you might outsmart it, because you can’t help but feel like your circumstances must somehow be your fault – is that it makes it that much harder for any individual within the group to tell the truth.

Lynn Steger Strong (The Guardian)

Wouldn’t it be amazing if we turned to one another, recognised our collective desire not to play ‘the game’ any more, and decided to go after those who have rigged the system against us?


How to improve your walking technique

What research shows is that how we walk, our gait mechanics, isn’t as “natural” as we might believe. We learn to walk by observing our parents and the world around us. As we grow up, we embody the patterns we see. These can limit the full potential of our gait. Some of us unconsciously prevent the pelvis and arms from swinging because of cultural taboos that frown upon having a gait as being, for example, too free.

Suunto

My late, great, friend Dai Barnes was a barefoot runner. He used to talk a lot about how people walk and run incorrectly, partly because of the ‘unnatural’ cushioning of their feet. This article gives some advice on improving your walking gait, which I tried out today on a long family walk.


Header image via xkcd

To others we are not ourselves but a performer in their lives cast for a part we do not even know that we are playing

Surveillance, technology, and society

Last week, the London Metropolitan Police (‘the Met’) proudly announced that they’ve begun using ‘LFR’, which is their neutral-sounding acronym for something incredibly invasive to the privacy of everyday people in Britain’s capital: Live Facial Recognition.

It’s obvious that the Met expect some pushback here:

The Met will begin operationally deploying LFR at locations where intelligence suggests we are most likely to locate serious offenders. Each deployment will have a bespoke ‘watch list’, made up of images of wanted individuals, predominantly those wanted for serious and violent offences. 

At a deployment, cameras will be focused on a small, targeted area to scan passers-by. The cameras will be clearly signposted and officers deployed to the operation will hand out leaflets about the activity. The technology, which is a standalone system, is not linked to any other imaging system, such as CCTV, body worn video or ANPR.

London Metropolitan Police

Note the talk of ‘intelligence’ and ‘bespoke watch lists’, as well as promises that LFR will not be linked to any other systems. (ANPR, for those not familiar with it, is ‘Automatic Number Plate Recognition’.) This, of course, is the thin end of the wedge and how these things start — in a ‘targeted’ way. They’re expanded later, often when the fuss has died down.


Meanwhile, a lot of controversy surrounds an app called Clearview AI which scrapes publicly-available data (e.g. Twitter or YouTube profiles) and applies facial recognition algorithms. It’s already in use by law enforcement in the USA.

The size of the Clearview database dwarfs others in use by law enforcement. The FBI’s own database, which taps passport and driver’s license photos, is one of the largest, with over 641 million images of US citizens.

The Clearview app isn’t available to the public, but the Times says police officers and Clearview investors think it will be in the future.

The startup said in a statement Tuesday that its “technology is intended only for use by law enforcement and security personnel. It is not intended for use by the general public.” 

Edward Moyer (CNET)

So there we are again: the technology is ‘intended’ for one purpose, but the general feeling is that it will leak out into others. Imagine if anyone could identify almost anyone on the planet simply by pointing their smartphone at them for a few seconds.

This is a huge issue, and one that politicians and lawmakers on both sides of the Atlantic are both ill-equipped to deal with and particularly concerned about. As the BBC reports, the European Commission is considering a five-year ban on facial recognition in public spaces while it figures out how to regulate the technology:

The Commission set out its plans in an 18-page document, suggesting that new rules will be introduced to bolster existing regulation surrounding privacy and data rights.

It proposed imposing obligations on both developers and users of artificial intelligence, and urged EU countries to create an authority to monitor the new rules.

During the ban, which would last between three and five years, “a sound methodology for assessing the impacts of this technology and possible risk management measures could be identified and developed”.

BBC News

I can’t see the genie going back in this particular bottle and, as Ian Welsh puts it, this is the end of public anonymity. He gives examples of the potential for all kinds of abuse, from an increase in rape, to abuse by corporations, to an increase in parental surveillance of children.

The larger issue is this: people who are constantly under surveillance become super conformers out of defense. Without true private time, the public persona and the private personality tend to collapse together. You need a backstage — by yourself and with a small group of friends to become yourself. You need anonymity.

When everything you do is open to criticism by everyone, you will become timid and conforming.

When governments, corporations, schools and parents know everything, they will try to control everything. This often won’t be for your benefit.

Ian Welsh

We already know that self-censorship is the worst kind of censorship, and live facial recognition means we’re going to have to do a whole lot more of it in the near future.

So what can we do about it? Welsh thinks that this technology should be made illegal, which is one option. However, you can’t un-invent technologies, so live facial recognition is going to be used (lawfully) by some organisations even if it were restricted to state operatives. I’m not sure whether that’s better or worse than everyone having it.


At a recent workshop I ran, I was talking during one of the breaks to one person who couldn’t really see the problem I had raised about surveillance capitalism. I have to wonder whether they would have a problem with live facial recognition. From our conversation, I’d suspect not.

Remember that facial recognition is not 100% accurate and (realistically) never can be. So there will be false positives. Let’s say your face ends up on a ‘watch list’ or a ‘bad actor’ database shared with many different agencies and retailers. All of a sudden, you’ve got yourself a very big problem.
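
To put rough, purely illustrative numbers on that (these are my assumptions, not figures from any of the reports above): even a system that is “99% accurate” generates far more false alerts than genuine ones when almost everyone walking past the camera is innocent.

```python
# Illustrative base-rate arithmetic with made-up numbers, not figures from
# the Met or any vendor: 100,000 passers-by are scanned, 50 of whom really
# are on the watch list, using a system that is "99% accurate" both ways.
passers_by = 100_000
on_watch_list = 50
true_positive_rate = 0.99   # chance the system flags someone who is on the list
false_positive_rate = 0.01  # chance it flags someone who isn't

true_alerts = on_watch_list * true_positive_rate                   # ~50
false_alerts = (passers_by - on_watch_list) * false_positive_rate  # ~1,000

share_wrong = false_alerts / (true_alerts + false_alerts)
print(f"About {share_wrong:.0%} of alerts point at the wrong person")
```

On those assumptions, roughly 95% of alerts would point at the wrong person, and each of those people then has to deal with the consequences of being flagged.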


As BuzzFeed News reports, around half of US retailers either use live facial recognition or have plans to use it. At the moment, companies like FaceFirst do not facilitate the sharing of data across their clients, but you can see what’s coming next:

[Peter Trepp, CEO of FaceFirst] said the database is not shared with other retailers or with FaceFirst directly. All retailers have their own policies, but Trepp said often stores will offer not to press charges against apprehended shoplifters if they agree to opt into the store’s shoplifter database. The files containing the images and identities of people on “the bad guy list” are encrypted and only accessible to retailers using their own systems, he said.

FaceFirst automatically purges visitor data that does not match information in a criminal database every 14 days, which is the company’s minimum recommendation for auto-purging data. It’s up to the retailer if apprehended shoplifters or people previously on the list can later opt out of the database.

Leticia Miranda (BuzzFeed News)

There is no opt-in, no consent sought or gathered by retailers. This is a perfect example of technology being light years ahead of lawmaking.


This is all well and good in situations where adults are going into public spaces, but what about schools, where children are often only one step above prisoners in terms of the rights they enjoy?

Recode reports that, in schools, the surveillance threat to students goes beyond facial recognition. So long as authorities know generally what a student looks like, they can track them everywhere they go:

Appearance Search can find people based on their age, gender, clothing, and facial characteristics, and it scans through videos like facial recognition tech — though the company that makes it, Avigilon, says it doesn’t technically count as a full-fledged facial recognition tool

Even so, privacy experts told Recode that, for students, the distinction doesn’t necessarily matter. Appearance Search allows school administrators to review where a person has traveled throughout campus — anywhere there’s a camera — using data the system collects about that person’s clothing, shape, size, and potentially their facial characteristics, among other factors. It also allows security officials to search through camera feeds using certain physical descriptions, like a person’s age, gender, and hair color. So while the tool can’t say who the person is, it can find where else they’ve likely been.

Rebecca Heilweil (Recode)

This is a good example of the grey area around technology that may or may not be banned at some point in the future. Avigilon, the makers of Appearance Search, claim that it’s not facial recognition technology because the images it captures and analyses are not tied to the identity of a particular person:

Avigilon’s surveillance tool exists in a gray area: Even privacy experts are conflicted over whether or not it would be accurate to call the system facial recognition. After looking at publicly available content about Avigilon, Leong said it would be fairer to call the system an advanced form of characterization, meaning that the system is making judgments about the attributes of that person, like what they’re wearing or their hair, but it’s not actually claiming to know their identity.

Rebecca Heilweil (Recode)

You can give as many examples of the technology being used for good as you want — there’s one in this article about how the system helped discover a girl was being bullied, for example — but it’s still intrusive surveillance. There are other ways of getting to the same outcome.
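
To see why the “it’s not facial recognition” defence rings hollow, here is a deliberately simplified, hypothetical sketch of attribute-based search (invented field names, nothing to do with Avigilon’s actual system). Once every detection is logged with a camera location, a timestamp and a handful of descriptors, tracing someone’s movements across a site is just a filter and a sort:

```python
from dataclasses import dataclass

# Hypothetical detection record: no name, no identity, only descriptors.
@dataclass
class Detection:
    camera: str       # where the camera is
    timestamp: str    # when the person was seen (ISO 8601 for simplicity)
    clothing: str
    hair: str
    approx_age: str

detections = [
    Detection("library-entrance", "2020-05-16T09:02", "red hoodie", "brown", "teen"),
    Detection("cafeteria",        "2020-05-16T12:31", "red hoodie", "brown", "teen"),
    Detection("gym-corridor",     "2020-05-16T14:05", "blue jacket", "blond", "adult"),
]

def trace(records, **attrs):
    """Return every sighting matching the given descriptors, in time order."""
    matches = [r for r in records
               if all(getattr(r, key) == value for key, value in attrs.items())]
    return sorted(matches, key=lambda r: r.timestamp)

for sighting in trace(detections, clothing="red hoodie", hair="brown"):
    print(sighting.camera, sighting.timestamp)
```

No identity is ever asserted, yet the output is still a timeline of one person’s movements, which is exactly why the privacy experts quoted above say the distinction doesn’t necessarily matter.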


We do not live in a world of certainty. We live in a world where things are ambiguous, unsure, and sometimes a little dangerous. While we should seek to protect one another, and especially those who are most vulnerable in society, we should think about the harm we’re doing by forcing people to live the totality of their lives in public.

What does that do to our conceptions of self? To creativity? To activism? Live facial recognition technology, as well as those technologies that exist in a grey area around it, is the hot-button issue of the 2020s.


Image by Kirill Sharkovski. Quotation-as-title by Elizabeth Bibesco.