The mesmerising murmurations of Europe’s starlings

    Incredible. I highly recommend clicking through to watch the videos!

    A murmuration of starlings

    How the birds move together in such close proximity, as though one organism, is another mystery. One study found that each starling was responding instantly to the six or seven birds closest to it to maintain group cohesion.
    Source: ‘A fragment of eternity’: the mesmerising murmurations of Europe’s starlings | The Guardian
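
    The rule described here is essentially a 'topological' interaction: each bird reacts to a fixed number of nearest neighbours rather than to every bird within some radius. Here's a minimal toy sketch of that idea in Python; it's my own illustration, not the study's model, and all the parameter values are made up.

```python
# Toy sketch of topological flocking: each bird nudges its velocity
# towards the average velocity of its K nearest neighbours.
# Illustrative only; parameters are arbitrary assumptions.
import numpy as np

N_BIRDS = 200          # flock size (arbitrary)
K = 7                  # each bird attends to its ~7 closest neighbours
DT = 0.1               # time step
ALIGN_STRENGTH = 0.05  # how strongly a bird matches its neighbours

rng = np.random.default_rng(0)
positions = rng.uniform(0, 100, size=(N_BIRDS, 2))
velocities = rng.normal(0, 1, size=(N_BIRDS, 2))

def step(positions, velocities):
    """One update of the flock."""
    # Pairwise distances between all birds
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(dists, np.inf)              # ignore self
    # Indices of each bird's K nearest neighbours
    neighbours = np.argsort(dists, axis=1)[:, :K]
    # Align with the neighbours' mean velocity
    mean_neighbour_vel = velocities[neighbours].mean(axis=1)
    velocities = velocities + ALIGN_STRENGTH * (mean_neighbour_vel - velocities)
    positions = positions + velocities * DT
    return positions, velocities

for _ in range(100):
    positions, velocities = step(positions, velocities)
```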

    Moving air through a building more efficiently using a fan

    For those of you sweltering away inside a building, it might be better to be blowing air out of the window…

    [Video embed]

    This man reports that the best place to put a fan is about 2 ft from a window, facing the window, and he has numbers on a computer screen to prove it.
    Source: The best place to put a fan | Boing Boing

    Degrees of Uncertainty

    I rarely watch 24-minute online videos all the way through, but this is excellent and well worth everyone’s time, whatever your preconceptions about climate change or your political persuasion.

    [Video embed]

    A data-driven documentary by Neil Halloran.
    Source: Degrees of Uncertainty - A documentary about climate change and public trust in science by Neil Halloran

    If you don’t know what you’re doing, you can be very creative about it

    3 apps to help avoid post-pandemic surveillance culture [VIDEO]

    This is an experiment using a green screen and OBS. Let me know what you think!

    Briar
    Tor
    LibreTorrent
    F-Droid

    Technology is the name we give to stuff that doesn't work properly yet

    So said my namesake Douglas Adams. In fact, he said lots of wise things about technology, most of them too long to serve as a title.

    I'm in a weird place, emotionally, at the moment, but sometimes this can be a good thing. Being taken out of your usual 'autopilot' can be a useful way to see things differently. So I'm going to take this opportunity to share three things that, to be honest, make me a bit concerned about the next few years...

    Attempts to put microphones everywhere

    Alexa-enabled EVERYTHING

    In an article for Slate, Shannon Palus ranks all of Amazon's new products by 'creepiness'. The Echo Frames are, in her words:

    A microphone that stays on your person all day and doesn’t look like anything resembling a microphone, nor follows any established social codes for wearable microphones? How is anyone around you supposed to have any idea that you are wearing a microphone?

    Shannon Palus

    When we're not talking about weapons of mass destruction, it's not the tech that concerns me, but the context in which the tech is used. As Palus points out, how are you going to be able to have a 'quiet word' with anyone wearing glasses ever again?

    It's not just Amazon, of course. Google and Facebook are at it, too.

    Full-body deepfakes

    [Video: full-body deepfakes](https://www.youtube.com/watch?v=8siezzLXbNo)
    Scary stuff

    With the exception, perhaps, of populist politicians, I don't think we're ready for a post-truth society. Check out the video above, which shows Chinese technology that allows for 'full body deepfakes'.

    The video is embedded, along with a couple of others, in an article for Fast Company by DJ Pangburn, who also notes that AI is learning human body movements from videos. Not only will you be able to prank your friends by showing them a convincing video of your ability to do 100 pull-ups, but the fake news it engenders will mean we can't trust anything any more.

    Neuromarketing

    If you clicked on the 'super-secret link' in Sunday's newsletter, you will have come across STEALING UR FEELINGS, which is nothing short of incredible. As powerful as it is at showing the kind of data that organisations have on us, it's only the tip of the iceberg.

    Kaveh Waddell, in an article for Axios, explains that brains are the last frontier for privacy:

    "The sort of future we're looking ahead toward is a world where our neural data — which we don't even have access to — could be used" against us, says Tim Brown, a researcher at the University of Washington Center for Neurotechnology.

    Kaveh Waddell

    This would lead to 'neuromarketing', with advertisers knowing what triggers and influences you better than you know yourself. Also, it will no doubt be used for discriminatory purposes and, because it's coming directly from your brainwaves, short of literally wearing a tinfoil hat, there's nothing much you can do.


    So there we are. Am I being too fearful here?

    There’s no viagra for enlightenment

    This quotation from the enigmatic Russell Brand seemed appropriate for the subject of today's article: the impact of so-called 'deepfakes' on everything from porn to politics.

    First, what exactly are 'deepfakes'? Mark Wilson explains in an article for Fast Company:

    In early 2018, [an anonymous Reddit user named Deepfakes] uploaded a machine learning model that could swap one person’s face for another face in any video. Within weeks, low-fi celebrity-swapped porn ran rampant across the web. Reddit soon banned Deepfakes, but the technology had already taken root across the web–and sometimes the quality was more convincing. Everyday people showed that they could do a better job adding Princess Leia’s face to The Force Awakens than the Hollywood special effects studio Industrial Light and Magic did. Deepfakes had suddenly made it possible for anyone to master complex machine learning; you just needed the time to collect enough photographs of a person to train the model. You dragged these images into a folder, and the tool handled the convincing forgery from there.

    Mark Wilson
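
    As I understand it, the trick behind tools like this was an autoencoder with one encoder shared across both people and a separate decoder per face: train each decoder to reconstruct its own person, then encode person A and decode with person B's decoder. Below is a heavily simplified PyTorch sketch of that general technique; the architecture, layer sizes and names are my own assumptions, not the original code.

```python
# Simplified face-swap autoencoder sketch: shared encoder, one decoder
# per identity. Illustrative assumption of the general technique only.
import torch
import torch.nn as nn

class FaceSwapAE(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        # Shared encoder: 64x64 RGB face -> latent vector
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
        # One decoder per identity, mapping the latent back to an image
        self.decoder_a = self._make_decoder(latent_dim)
        self.decoder_b = self._make_decoder(latent_dim)

    def _make_decoder(self, latent_dim):
        return nn.Sequential(
            nn.Linear(latent_dim, 128 * 8 * 8),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def reconstruct(self, x, identity):
        z = self.encoder(x)
        return self.decoder_a(z) if identity == "a" else self.decoder_b(z)

    def swap(self, face_a):
        """Encode a frame of person A, decode with person B's decoder."""
        return self.decoder_b(self.encoder(face_a))

# Training sketch: minimise reconstruction loss separately per identity, so
# the shared encoder learns pose/expression while each decoder learns a face.
model = FaceSwapAE()
faces_a = torch.rand(8, 3, 64, 64)  # stand-in for real training photos
loss = nn.functional.mse_loss(model.reconstruct(faces_a, "a"), faces_a)
```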

    As you'd expect, deepfakes bring up huge ethical issues, as Jessica Lindsay reports for Metro. It's a classic case of our laws not being able to keep up with what's technologically possible:

    With the advent of deepfake porn, the possibilities have expanded even further, with people who have never starred in adult films looking as though they’re doing sexual acts on camera.

    Experts have warned that these videos enable all sorts of bad things to happen, from paedophilia to fabricated revenge porn.

    [...]

    This can be done to make a fake speech to misrepresent a politician’s views, or to create porn videos featuring people who did not star in them.

    Jessica Lindsay

    It's not just video, either, with Google's AI now able to translate speech from one language to another and keep the same voice. Karen Hao embeds examples in an article for MIT Technology Review demonstrating where this is all headed.

    The results aren’t perfect, but you can sort of hear how Google’s translator was able to retain the voice and tone of the original speaker. It can do this because it converts audio input directly to audio output without any intermediary steps. In contrast, traditional translational systems convert audio into text, translate the text, and then resynthesize the audio, losing the characteristics of the original voice along the way.

    Karen Hao
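
    To make Hao's contrast concrete, here's a schematic sketch in Python. Every function is a hypothetical stand-in rather than a real API; the point is simply where the text bottleneck sits in the traditional cascade versus the direct approach.

```python
# Schematic contrast between cascaded and direct speech translation.
# All functions are deliberately hypothetical stand-ins, not real library calls.

def transcribe(audio_es):               # speech recognition (stand-in)
    return "hola, ¿cómo estás?"

def translate_text(text_es):            # machine translation (stand-in)
    return "hello, how are you?"

def synthesise(text_en):                # text-to-speech (stand-in)
    return b"generic-synthetic-voice-audio"

def direct_speech_to_speech(audio_es):  # end-to-end model (stand-in)
    # Maps source audio straight to target audio, so voice and tone
    # characteristics can carry over instead of being discarded as text.
    return b"translated-audio-in-original-voice"

audio_es = b"spanish-speech-audio"

# Cascaded: audio -> text -> translated text -> audio (original voice is lost)
cascaded_output = synthesise(translate_text(transcribe(audio_es)))

# Direct: audio -> audio (no text bottleneck, voice can be retained)
direct_output = direct_speech_to_speech(audio_es)
```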

    The impact on democracy could be quite shocking, with the ability to create video and audio that feels real but is actually completely fake.

    However, as Mike Caulfield notes, the technology doesn't even have to be that sophisticated to create something that can be used in a political attack.

    There’s a video going around that purportedly shows Nancy Pelosi drunk or unwell, answering a question about Trump in a slow and slurred way. It turns out that it is slowed down, and that the original video shows her quite engaged and articulate.

    [...]

    In musical production there is a technique called double-tracking, and it’s not a perfect metaphor for what’s going on here but it’s instructive. In double tracking you record one part — a vocal or solo — and then you record that part again, with slight variations in timing and tone. Because the two tracks are close, they are perceived as a single track. Because they are different though, the track is “widened” feeling deeper, richer. The trick is for them to be different enough that it widens the track but similar enough that they blend.

    Mike Caulfield

    This is where blockchain could actually be a useful technology. Caulfield often talks about the importance of 'going back to the source' — in other words, checking the provenance of what it is you're reading, watching, or listening to. There's potential here for checking that something is actually the original document/video/audio.
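
    As a minimal sketch of what 'checking against the original' could look like in practice: if a publisher makes a cryptographic hash of the original file available somewhere tamper-evident (a blockchain being one option), anyone can verify that their copy is byte-for-byte identical. The file name and reference hash below are placeholders.

```python
# Verify that a downloaded file matches a published reference hash.
import hashlib

def sha256_of_file(path: str) -> str:
    """Hash a file in chunks so large videos don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder values for illustration only
published_hash = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
local_copy = "downloaded_video.mp4"

if sha256_of_file(local_copy) == published_hash:
    print("Matches the published original.")
else:
    print("Does not match: edited, re-encoded, or not the original.")
```

    Of course, this only proves a file is identical to what was published; any re-encoding breaks the match, and it says nothing about whether the original itself was honest.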

    Ultimately, however, people believe what they want to believe. If they want to believe Donald Trump is an idiot, they'll read and share things showing him in a negative light. It doesn't really matter if it's true or not.


    Also check out: