Why the internet is less weird these days

I can remember sneakily accessing the web when I was about fifteen. It was a pretty crazy place, the likes of which you only really see these days in the far-flung corners of the regular internet or on the dark web.

Back then, there were conspiracy theories, there was porn, and there was all kinds of weirdness and wonderfulness that I wouldn’t otherwise have experienced growing up in a northern mining town. Some of it may have been inappropriate, but in the main it opened my eyes to the wider world.

In this Engadget article, Violet Blue points out that the demise of the open web means we’ve also lost meaningful free speech:

It’s critical… to understand that apps won, and the open internet lost. In 2013, most users accessing the internet went to mobile and stayed that way. People don’t actually browse the internet anymore, and we are in a free-speech nightmare.

Because of Steve Jobs, adult and sex apps are super-banned from Apple’s conservative walled garden. This, combined with Google’s censorious push to purge its Play Store of sex has quietly, insidiously formed a censored duopoly controlled by two companies that make Morality in Media very, very happy. Facebook, even though technically a darknet, rounded it out.

A very real problem for society at the moment is that we want to encourage free-thinking and diversity while also protecting people from distasteful content. I’m not sure what the answer is, but outsourcing the decision to tech companies probably isn’t it.

In 1997, Ann Powers wrote an essay called “In Defense of Nasty Art.” It took progressives to task for not defending rap music because it was “obscene” and sexually graphic. Powers puts it mildly when she states, “Their apprehension makes the fight to preserve freedom of expression seem hollow.” This is an old problem. So it’s no surprise that the same websites forbidding, banning, and blocking “sexually suggestive” art content also claim to care about free speech.

As a parent of a 12-year-old boy and an eight-year-old girl, I check the PEGI age ratings for the games they play. I also trust Common Sense Media to tell me about the content of films they want to watch, and I’m careful about what they can and can’t access on the web.

Violet Blue’s article is a short one, so it focuses on the tech companies, but the real issue here is one level down. The problem is neoliberalism. As Byung-Chul Han comments in Psychopolitics: Neoliberalism and New Technologies of Power, which I’m reading at the moment:

Neoliberalism represents a highly efficient, indeed an intelligent, system for exploiting freedom. Everything that belongs to practices and expressive forms of liberty – emotion, play and communication – comes to be exploited.

Almost everything is free at the point of access these days, which means, in the oft-repeated phrase, that we are the product. This means that in order to extract maximum value, nobody can be offended. I’m not so sure that I want to live in an inoffensive future.

Source: Engadget (via Noticing)

Why it’s so hard to quit Big Tech

I’m writing this on a Google Pixelbook. Earlier this evening I wiped it, fully intending to install Linux on it, and then… meh. Partly, that’s because the Pixelbook now supports Linux apps in a sandboxed environment (which is great!) but mostly because using ChromeOS on decent hardware is just a lovely user experience.

Writing for TechCrunch, Danny Crichton notes:

Privacy advocates will tell you that the lack of a wide boycott against Google and particularly Facebook is symptomatic of a lack of information: if people really understood what was happening with their data, they would galvanize immediately for other platforms. Indeed, this is the very foundation for the GDPR policy in Europe: users should have a choice about how their data is used, and be fully-informed on its uses in order to make the right decision for them.

This is true for all kinds of things. If people only knew about the real cost of Brexit, about what Donald Trump was really like, about the facts of global warming… and on, and on.

I think it’s interesting to compare climate change and Big Tech. We all know that we should probably change our actions, but the symptoms only affect us directly very occasionally. I’m just pleased that I’ve been able to stay off Facebook for the last nine years…

Alternatives exist for every feature and app offered by these companies, and they are not hard to find. You can use Signal for chatting, DuckDuckGo for search, FastMail for email, 500px or Flickr for photos, and on and on. Far from being shameless clones of their competitors, in many cases these products are even superior to their originals, with better designs and novel features.

It’s not enough just to create a moral choice and talk about privacy. Just look at the Firefox web browser from Mozilla, which now stands at less than 5% market share. That’s why I think we need to be looking at regulation (like GDPR!) to change things, rather than expecting individual users to make some kind of stand.

I mean, just look at things like this recent article that talks about building your own computer, sideloading APK files onto an Android device with a modified bootloader, and setting up your own ‘cloud’ service. It’s doable, and I’ve done it in the past, but it’s not fun. And it’s not a sustainable solution for 99% of the population.

Source: TechCrunch

Cal Newport on the dangers of ‘techno-maximalism’

I have to say that I was not expecting to enjoy Cal Newport’s book Deep Work when I read it a couple of years ago. As someone who’s always been fascinated by technology, and who has spent most of his career working in and around it, I assumed it was going to take the approach of a Luddite working in his academic ivory tower.

It turns out I was completely wrong in this assumption, and the book was one of the best I read in 2017. Newport is back with a new book that I’ve eagerly pre-ordered called Digital Minimalism: On Living Better with Less Technology. It comes out next week. Again, the title is something that would usually be off-putting to me, but it’s hard to argue with the points he makes in his blog posts since Deep Work.

As you would expect with a new book coming out, Newport is doing the rounds of interviews. In one with GQ magazine, he talks about the dangers of ‘digital maximalism’, which he defines in the following way:

The basic idea is that technological innovations can bring value and convenience into your life. So, you assess new technological tools with respect to what value or convenience it can bring into your life. And if you can find one, then the conclusion is, “If I can afford it, I should probably have this.” It just looks at the positives. And its view is “more is better than less,” because more things that bring you benefits means more total benefits. This is what maximalism is: “If there’s something that brings value, you should get it.”

That type of thinking is dangerous, as:

We see these tools, and we have this narrative that, “You can do this on Facebook,” or “This new feature on this device means you can do this, which would be convenient.” What you don’t factor in is, “Okay, well what’s the cost in terms of my time attention required to have this device in my life?” Facebook might have some particular thing that’s valuable, but then you have the average U.S. user spending something like 50 minutes a day on Facebook products. That’s actually a pretty big [amount of life] that you’re now trading in order to get whatever the potential small benefit is.

[Maximalism] ignores the opportunity cost. And as Thoreau pointed out hundreds of years ago, it’s actually in the opportunity cost that all the interesting math happens.

Newport calls for a new philosophy of technology which includes things like ‘digital minimalism’ (the subject of his new book):

Digital minimalism is a clear philosophy: you figure out what’s valuable to you. For each of these things you say, “What’s the best way I need to use technology to support that value?” And then you happily miss out on everything else. It’s about additively building up a digital life from scratch to be very specifically, intentionally designed to make your life much better.

There might be other philosophies, just like in health and fitness. More important to me than everyone becoming a digital minimalist is people in general getting used to this idea that, “I have a philosophy that’s really clear and grounded in my values that tells me how I approach technology.” Moving past this ad-hoc stage of like, “Whatever, I just kind of signed up for the maximalist stage,” and into something a little bit more intentional.

I’ve never really been the type of person to go to a book club, but what with this coming out and Company of One by Paul Jarvis arriving yesterday, perhaps I need to set up a virtual one?

Source: GQ

Through the looking-glass

Earlier this month, George Dyson, historian of technology and author of books including Darwin Among the Machines, published an article at Edge.org.

In it, he cites Childhood’s End, a novel by Arthur C. Clarke in which benevolent overlords arrive on Earth. “It does not end well”, he says. There’s lots of scaremongering in the world at the moment and, indeed, some people have said for a few years now that software is eating the world.

Dyson comments:

The genius — sometimes deliberate, sometimes accidental — of the enterprises now on such a steep ascent is that they have found their way through the looking-glass and emerged as something else. Their models are no longer models. The search engine is no longer a model of human knowledge, it is human knowledge. What began as a mapping of human meaning now defines human meaning, and has begun to control, rather than simply catalog or index, human thought. No one is at the controls. If enough drivers subscribe to a real-time map, traffic is controlled, with no central model except the traffic itself. The successful social network is no longer a model of the social graph, it is the social graph. This is why it is a winner-take-all game. Governments, with an allegiance to antiquated models and control systems, are being left behind.

I think that’s an insightful point: human knowledge is seen to be that which is indexed by Google, friendships are mediated by Facebook, Twitter and Instagram, and, to some extent, what’s possible/desirable/interesting is dictated to us rather than originating from us.

We imagine that individuals, or individual algorithms, are still behind the curtain somewhere, in control. We are fooling ourselves. The new gatekeepers, by controlling the flow of information, rule a growing sector of the world.

What deserves our full attention is not the success of a few companies that have harnessed the powers of hybrid analog/digital computing, but what is happening as these powers escape into the wild and consume the rest of the world.

Indeed. We need to raise our sights a little here and start asking governments to use their dwindling powers to break up mega corporations before Google, Amazon, Microsoft and Facebook are too powerful to stop. However, given how enmeshed they are in everyday life, I’m not sure at this point it’s reasonable to ask the general population to stop using their products and services.

Source: Edge.org

Exit option democracy

This week saw the launch of a new book by Shoshana Zuboff entitled The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. It was featured in two of my favourite newspapers, The Observer and The New York Times, and is the kind of book I would have lapped up this time last year.

In 2019, though, I’m being a bit more pragmatic, taking heed of the Stoic advice to focus on the things that you can change. Chiefly, that’s your own perceptions about the world. I can’t change the fact that, despite the Snowden revelations and everything that has come afterwards, most people don’t care one bit that they’re trading privacy for convenience.

That puts those who care about privacy in a bit of a predicament. You can use the most privacy-respecting email service in the world, but as soon as you communicate with someone using Gmail, then Google has got the entire conversation. Chances are, the organisation you work for has ‘gone Google’ too.

Then there are Facebook shadow profiles. You don’t even have to have an account on that platform for the company behind it to know all about you. The same goes for companies knowing who’s in your friendship group if your friends upload their contacts to WhatsApp. It makes no difference whether you use ridiculous third-party gadgets or not.

In short, if you want to live in modern society, your privacy depends on your family and friends. Of course you have the option to choose not to participate in certain platforms (I don’t use Facebook products) but that comes at a significant cost. It’s the digital equivalent of Thoreau taking himself off to Walden pond.

In a post from last month that I stumbled across this weekend, Nate Matias reflects on a talk he attended by Janet Vertesi at Princeton University’s Center for Information Technology Policy. Vertesi, says Matias, tried four different ways of opting out of technology companies gathering data on her:

  • Platform avoidance
  • Infrastructural avoidance
  • Hardware experiments
  • Digital homesteading

Interestingly, the starting point is Vertesi’s rejection of ‘exit option democracy’:

The basic assumption of markets is that people have choices. This idea that “you can just vote with your feet” is called an “exit option democracy” in organizational sociology (Weeks, 2004). Opt-out democracy is not really much of a democracy, says Janet. She should know–she’s been opting out of tech products for years.

The option Vertesi advocates for going Google-free is a pain in the backside. I know, because I’ve tried it:

To prevent Google from accessing her data, Janet practices “data balkanization,” spreading her traces across multiple systems. She’s used DuckDuckGo, sandstorm.io, ResilioSync, and youtube-dl to access key services. She’s used other services occasionally and non-exclusively, and varied it with open source alternatives like etherpad and open street map. It’s also important to pay attention to who is talking to whom and sharing data with whom. Data balkanization relies on knowing what companies hate each other and who’s about to get in bed with whom.

The time I’ve spent doing these things was time I was not being productive, nor was it time I was spending with my wife and kids. It’s easy to roll your eyes at people “trading privacy for convenience” but it all adds up.

Talking of family, straying too far from societal norms has, for better or worse, negative consequences. Just as Linux users were targeted for surveillance, so Vertesi and her husband were suspected of fraud for browsing the web using Tor and using cash for transactions:

Trying to de-link your identity from data storage has consequences. For example, when Janet and her husband tried to use cash for their purchases, they faced risks of being reported to the authorities for fraud, even though their actions were legal.

And then, of course, there are the tinfoil hat options:

…Janet used parts from electronics kits to make her own 2g phone. After making the phone Janet quickly realized even a privacy-protecting phone can’t connect to the network without identifying the user to companies through the network itself.

I’m rolling my eyes at this point. The farthest I’ve gone down this route is to use the now-defunct Firefox OS and LineageOS for microG. Although both had their upsides, they were too annoying to use for extended periods of time.

Finally, Vertesi goes down the route of trying to own all of her own data. I’ll just point out that there’s a reason those of us who had huge CD and MP3 collections switched to Spotify. Looking after any collection takes time and effort. It’s also a lot more cost-effective for someone like me to ‘rent’ my music instead of owning it. The same goes for Netflix.

What I do accept, though, is that Vertesi’s findings show that ‘exit option democracy’ isn’t really an option here, so the world of technology isn’t really democratic. My takeaway from all this, and the reason for my pragmatic approach this year, is that it’s up to governments to do something about it.

Western society teaches us that empowered individuals can change the world. But if you take a closer look, whether it’s surveillance capitalism or climate change, it’s legislation that’s going to make the biggest difference here. Just look at the shift that took place because of GDPR.

So whether or not I read Zuboff’s new book, I’m going to continue my pragmatic approach this year. Meanwhile, I’ll continue to mute the microphone on the smart speakers in our house when they’re not being used, block trackers on my Android smartphone, and keep up my monthly donations to the work of the Electronic Frontier Foundation and the Open Rights Group.

Source: J. Nathan Matias

Noise cancelling for cars is a no-brainer

We’re all familiar with noise cancelling headphones. I’ve got some that I use for transatlantic trips, and they’re great for minimising any repeating background noise.

Twenty years ago, when I was studying A-Level Physics, I was also building a new PC. I realised that, if I placed a microphone inside the computer case, and fed that into the audio input on the soundcard, I could use software to invert the sound wave and thus virtually eliminate fan noise. It worked a treat.

It doesn’t surprise me, therefore, to find that Bose, best known for its headphones, is offering car manufacturers something similar with “road noise control”.

With accelerometers, multiple microphones, and algorithms, it’s much more complicated than what I rigged up in my bedroom as a teenager. But the principle remains the same.
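
For what it’s worth, the principle is simple enough to sketch in a few lines of Python. This is a rough illustration of the idea rather than what I actually rigged up back then: it assumes the third-party sounddevice library and a sound card that can record and play at the same time, and it glosses over the latency and microphone-placement problems that make real systems hard.

    # A rough sketch of phase-inversion noise cancellation (illustrative only).
    # Assumes the third-party 'sounddevice' library and a full-duplex sound card.
    import sounddevice as sd

    SAMPLE_RATE = 44100  # samples per second
    BLOCK_SIZE = 256     # small blocks keep the input-to-output delay low

    def invert(indata, outdata, frames, time, status):
        # Flip the phase of the captured noise and send it straight back out;
        # the inverted wave plus the original noise should roughly sum to silence.
        if status:
            print(status)
        outdata[:] = -indata

    # Open a duplex stream: microphone in, 'anti-noise' out.
    with sd.Stream(samplerate=SAMPLE_RATE, blocksize=BLOCK_SIZE,
                   channels=1, callback=invert):
        print("Playing inverted signal... press Ctrl+C to stop.")
        sd.sleep(60_000)  # run for a minute

The delay between capturing and replaying the sound is what makes or breaks this approach, which is presumably why Bose’s system leans on accelerometers and multiple microphones rather than simply replaying an inverted feed.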

Source: The Next Web

Acoustic mirrors

On the beach at Druridge Bay in Northumberland, near where I live, there are large blocks at various intervals. These hulking pieces of concrete, now half-submerged, were deployed on seafronts up and down England to prevent the enemy successfully landing tanks during the Second World War.

I was fascinated to find out that these aren’t the only concrete blocks that protected Britain. BBC News reports that ‘acoustic mirrors’ were installed for a very specific purpose:

More than 100 years ago acoustic mirrors along the coast of England were built with the intention of using them to detect the sound of approaching German zeppelins.

The concave concrete structures were designed to pick up sound waves from enemy aircraft, making it possible to predict their flight trajectory, giving enough time for ground forces to be alerted to defend the towns and cities of Britain.

Some of these, which vary in size, still exist, and have been photographed by Joe Pettet-Smith.

The reason most of us haven’t heard of them is that the technology improved so quickly. Pettet-Smith comments:

The sound mirror experiment, this idea of having a chain of concrete structures facing the Channel using sound to detect the flight trajectory of enemy aircraft, was just that – an experiment. They tried many different sizes and designs before the project was scrapped when radar was introduced.

The science was solid, but aircraft kept getting faster and quieter, which made them obsolete.

Fascinating. The historian (and technologist) within me loves this.

Source: BBC News

Confusing tech questions

Today is the first day of the Consumer Electronics Show, or CES, in Las Vegas. Each year, tech companies showcase their latest offerings and concepts. Nilay Patel, Editor-in-Chief for The Verge, comments that, increasingly, the tech industry is built on a number of assumptions about consumers and human behaviour:

[T]hink of the tech industry as being built on an ever-increasing number of assumptions: that you know what a computer is, that saying “enter your Wi-Fi password” means something to you, that you understand what an app is, that you have the desire to manage your Bluetooth device list, that you’ll figure out what USB-C dongles you need, and on and on.

Lately, the tech industry is starting to make these assumptions faster than anyone can be expected to keep up. And after waves of privacy-related scandals in tech, the misconceptions and confusion about how things work are both greater and more reasonable than ever.

I think this is spot-on. At Mozilla, and now at Moodle, I spend a good deal of my time among people who are more technically-minded than me. And, in turn, I’m more technically-minded than the general population. So what’s ‘obvious’ or ‘easy’ to developers feels like magic to the man or woman on the street.

Patel keeps track of the questions his friends and family ask him, and has listed them in the post. The number one thing he says that everyone is talking about is how people assume their phones are listening to them, and then serving up advertising based on that. They don’t get that Facebook (and other platforms) use multiple data points to make inferences.

I’ll not reproduce his list here, but here are three questions which I, too, get a lot from friends and family:

“How do I make sure deleting photos from my iPhone won’t delete them from my computer?”

“How do I keep track of what my kid is watching on YouTube?”

“Why do I need to make another username and password?”

As I was discussing with the MoodleNet team just yesterday, there’s a difference between treating users as ‘stupid’ (which they’re not) and ensuring that they don’t have to think too much when they’re using your product.

Source: The Verge (via Orbital Operations)

Baseline levels of conscientiousness

As I mentioned on New Year’s Day, I’ve decided to trade some of my privacy for convenience, and am now using the Google Assistant on a regular basis. Unlike Randall Munroe, the author of xkcd, I have no compunction about outsourcing everything other than the Very Important Things That I’m Thinking About to other devices (and other people).

Source: xkcd

Looking back and forward in tech

Looking back at 2018, Amber Thomas commented that, for her, a few technologies became normalised over the course of the year:

  1. Phone payments
  2. Voice-controlled assistants
  3. Drones
  4. Facial recognition
  5. Fingerprints

Apart from drones, I’ve spent the last few years actively avoiding the above. In fact, I spent most of 2018 thinking about decentralised technology, privacy, and radical politics.

However, December is always an important month for me. I come off social media, stop blogging, and turn another year older just before Christmas. It’s a good time to reflect and think about what’s gone before, and what comes next.

Sometimes, it’s possible to identify a particular stimulus for a change in thinking. For me, it was while I was watching Have I Got News For You and the panellists were shown a photo of a fashion designer who had put a shoe in front of his face to avoid being recognisable. Paul Merton asked, “Doesn’t he have a passport?”

Obvious, of course, but I’d recently been travelling and using the biometric features of my passport. I’ve also relented this year and use the fingerprint scanner to unlock my phone. I realised that the genie isn’t going back in the bottle here, and that everyone else was using my data — biometric or otherwise — so I might as well benefit, too.

Long story short, I’ve bought a Google Pixelbook and a Lenovo Smart Display over the Christmas period, which I’ll be using in 2019 to make my life easier. I’m absolutely trading privacy for convenience, but it’s been a somewhat frustrating couple of years trying to use nothing but Open Source tools.

I’ll have more to say about all of this in due course, but it’s worth saying that I’m still committed to living and working openly. And, of course, I’m looking forward to continuing to work on MoodleNet.

Source: Fragments of Amber