So said my namesake Douglas Adams. In fact, he said lots of wise things about technology, most of them too long to serve as a title.
I’m in a weird place, emotionally, at the moment, but sometimes this can be a good thing. Being taken out of your usual ‘autopilot’ can be a useful way to see things differently. So I’m going to take this opportunity to share three things that, to be honest, make me a bit concerned about the next few years…
Attempts to put microphones everywhere
In an article for Slate, Shannon Palus ranks all of Amazon’s new products by ‘creepiness’. The Echo Frames are, in her words:
A microphone that stays on your person all day and doesn’t look like anything resembling a microphone, nor follows any established social codes for wearable microphones? How is anyone around you supposed to have any idea that you are wearing a microphone?
Shannon Palus
When we’re not talking about weapons of mass destruction, it’s not the tech that concerns me, but the context in which the tech is used. As Palus points out, how are you going to be able to have a ‘quiet word’ with anyone wearing glasses ever again?
It’s not just Amazon, of course. Google and Facebook are at it, too.
Full-body deepfakes
With the exception, perhaps, of populist politicians, I don’t think we’re ready for a post-truth society. Check out the video above, which shows Chinese technology that allows for ‘full body deepfakes’.
The video is embedded, along with a couple of others, in an article for Fast Company by DJ Pangburn, who also notes that AI is learning human body movements from videos. Not only will you be able to prank your friends by showing them a convincing video of your ability to do 100 pull-ups, but the fake news this technology engenders will mean we can’t trust anything any more.
Neuromarketing
If you clicked on the ‘super-secret link’ in Sunday’s newsletter, you will have come across STEALING UR FEELINGS, which is nothing short of incredible. As powerful as it is in showing you the kind of data that organisations have on us, it’s only the tip of the iceberg.
Kaveh Waddell, in an article for Axios, explains that brains are the last frontier for privacy:
“The sort of future we’re looking ahead toward is a world where our neural data — which we don’t even have access to — could be used” against us, says Tim Brown, a researcher at the University of Washington Center for Neurotechnology.
Kaveh Waddell
This would lead to ‘neuromarketing’, with advertisers knowing what triggers and influences you better than you know yourself. It will no doubt also be used for discriminatory purposes and, because the data comes directly from your brainwaves, there’s not much you can do about it short of literally wearing a tinfoil hat.
So there we are. Am I being too fearful here?
Brent Flanders
30 September 2019 — 15:30
Good article Doug,
We live in a digital/connected world. You can either choose to live “in it” or attempt to “avoid it”. If you live “in it”, you must be educated about all of the above and much more.
There was a time a colleague wanted to make sure we had 100% connectivity from our organization to the world, with about 10% of the world having access into our organization. I had to explain that, no matter how sophisticated our infrastructure (DMZs, VPNs, firewalls, etc.), this was not really possible.
You cannot have your cake and eat it too.
Doug Belshaw
30 September 2019 — 20:16
Thanks Brent, it’s difficult not to imagine the dystopian aspects of all this!
Aaron Davis
6 October 2019 — 02:23
As a ‘technology coach’ (I think that is what I am), it is an interesting space to be in. There are so many educators out there praising some of these innovations and the affordances they bring. I think the least we can do is be better informed, even if our understanding is fallible and somewhat naive.
One of the challenges that really intrigues me is when someone else gives consent on your behalf, without asking you and often without even realising they have done so. In some ways shadow profiles touch upon this, but the worst example is DNA testing, where one relative’s sample reveals information about family members who never agreed to anything.
It is also kind of funny how in education the discussion seems to be about banning smartphones. However, as you touch upon with microphones and wearables, we will not even know what is and is not being captured. A part of me thinks that, as a teacher, you need to be mindful of this.
What concerns me most are those who feel that we should make the capturing of biometric data standard.
We live in wicked times.
Doug Belshaw
6 October 2019 — 08:37
Wow, that’s a powerful comment, Aaron, made even more so by having read Audrey Watters’ latest HEWN immediately beforehand: https://hewn.substack.com/p/hewn-no-324
The thing which is missing from a lot of these discussions is the ‘why’ – as in ‘why do we need biometric data to make society more secure?’ and ‘why do we need venture capital in education?’
Sadly, money talks and people listen 🙁
Phillip Long
6 October 2019 — 21:50
Your concerns are well founded. Most people don’t realize how much personal data they are sharing when using ‘free’ services (an oxymoron if ever there was one). Just read a few of the accounts from brave folks who tried to live without touching any of the big four (Google, Amazon, Apple, MS) for a month or more. It’s more than a filter change; it’s a refactoring of your sources of information, news, and what you can and cannot access. Most people I know who have attempted this have given up.
There are small points of hope: you can limit the personal data exhaust that otherwise leaks when you speak at home with a “smart” device nearby, or go on the web with a non-secure browser. Things like deleting the recordings Alexa or Siri have made of you help. I unplug my Echo Dot whenever I’m not proactively using it to listen to something, and there are only a few things I choose to listen to through it via my Jambox speaker: NPR, Folk Radio, and one or two others. The rest of the time it is unplugged.
Removing your browsing history from Google and others for real has recently become feasible, as Google is trying to act semi-responsibly.
The NY Times tech team has done a good job recently of giving instructions on how to be more attentive to personal privacy online. And Apple is starting to distinguish itself in its approach to customer privacy, mindful of course that it’s a very low bar by which one currently judges these moves. But acknowledgement should be given to even these attempts.
What is more concerning to me is that we haven’t had much conversation about a world in which nothing is forgotten. We’ve never lived in a social environment where one’s actions are discoverable by actors beyond our direct interpersonal engagements, by people anywhere on this deteriorating planet. Lapses of judgement from 10, 15, or 20 years ago, actions that may not have been wise at the time but were hardly criminal offenses, are starting to routinely pop up and haunt people today. The default assumption is that these actions are representative of those individuals today. But are they? Would you really say you haven’t changed your views on anything substantive in the past 25 years? THAT worries me, and the ill-prepared state of our communities confronting this scares the bejesus out of me.
Doug Belshaw
7 October 2019 — 07:23
Thanks Phillip, I’ve recently turned on that Google setting to delete my activity history after three months (and with it location history, etc.)
The problem, I think, is that it’s going to take a societal shift to make any of this ‘normal’ behaviour. For example, I declined to join a WhatsApp group created on the Mountain Leader training course at the weekend, because I don’t use Facebook products. At the moment, that’s seen as a kind of hardcore tech veganism, instead of a wise precaution…
Phillip Long
7 October 2019 — 16:19
FB analogous to “hardcore tech veganism” is a wonderful mixed metaphor! I deleted all my data and closed my FB account a few years ago; Cambridge Analytica was the last straw. You’re right that the trade-offs people make, either without thought or with poor appreciation of the actual consequences they enable, are a major problem, and addressing it is akin to a cultural reawakening.
We’re pondering this in relation to the idea of building a self-sovereign learner record: a lifelong, personally owned repository of verifiable assertions about one’s achievements, capabilities, and skills. But what does it mean to actually ‘own’ this record? What is the nature of the responsibilities involved? Can we design it, and the system around it, in such a way that it can be safely and usefully leveraged without a significant investment in time and deep understanding of the underlying design? Or to put it another way, can we make it usable and practical for the average person? I believe we can, but making things elegantly simple is generally devilishly difficult.
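To make that concrete, here is a minimal sketch of what a single entry in such a learner record might look like, loosely inspired by the W3C Verifiable Credentials data model. Every name and field below is illustrative, an assumption for the sake of discussion rather than part of any actual specification:

```typescript
// Illustrative sketch only: one self-sovereign, verifiable assertion
// about a learner. Field names are assumptions, not a real standard.
interface VerifiableAssertion {
  id: string;                 // URI identifying this assertion
  issuer: string;             // identifier (e.g. a DID) of the issuing body
  issued: string;             // ISO 8601 timestamp of issuance
  subject: {
    id: string;               // the learner's own identifier
    achievement: string;      // what is being asserted
    evidence?: string[];      // optional links to supporting work
  };
  proof: {
    type: string;             // signature scheme, e.g. "Ed25519Signature2018"
    signatureValue: string;   // issuer's signature over the assertion
  };
}

// The learner stores entries like this themselves; a verifier needs only
// the issuer's public key to check the proof, so no central silo is required.
const example: VerifiableAssertion = {
  id: "urn:example:assertion:42",
  issuer: "did:example:issuing-university",
  issued: "2019-10-07T16:19:00Z",
  subject: {
    id: "did:example:learner",
    achievement: "Introduction to Digital Ethics (hypothetical course)",
  },
  proof: {
    type: "Ed25519Signature2018",
    signatureValue: "z3FXQ...", // truncated placeholder, not a real signature
  },
};
```

The design point is that ownership lives with the holder of the data, while trust comes from the issuer’s signature rather than from a central database.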
Principles and ethics matter, never more so than in today’s digital world, and especially when coupled with compassion and social commitment. One of our biggest challenges is to continually frame these conversations in this context, lest others shift the arguments and, in doing so, change the context away from the things that really matter.
Cheers,
Phil
Doug Belshaw
7 October 2019 — 16:48
Right, exactly – and the problem is when people come along with resources that are tied to different ethical positions to your own. Always a balancing act!