Tag: Amazon

Technology is the name we give to stuff that doesn’t work properly yet

So said my namesake Douglas Adams. In fact, he said lots of wise things about technology, most of them too long to serve as a title.

I’m in a weird place, emotionally, at the moment, but sometimes this can be a good thing. Being taken out of your usual ‘autopilot’ can be a useful way to see things differently. So I’m going to take this opportunity to share three things that, to be honest, make me a bit concerned about the next few years…

Attempts to put microphones everywhere

Alexa-enabled EVERYTHING

In an article for Slate, Shannon Palus ranks all of Amazon’s new products by ‘creepiness’. The Echo Frames are, in her words:

A microphone that stays on your person all day and doesn’t look like anything resembling a microphone, nor follows any established social codes for wearable microphones? How is anyone around you supposed to have any idea that you are wearing a microphone?

Shannon Palus

When we’re not talking about weapons of mass destruction, it’s not the tech that concerns me, but the context in which the tech is used. As Palus points out, how are you going to be able to have a ‘quiet word’ with anyone wearing glasses ever again?

It’s not just Amazon, of course. Google and Facebook are at it, too.

Full-body deepfakes

Scary stuff

With the exception, perhaps, of populist politicians, I don’t think we’re ready for a post-truth society. Check out the video above, which shows Chinese technology that allows for ‘full body deepfakes’.

The video is embedded, along with a couple of others, in an article for Fast Company by DJ Pangburn, who also notes that AI is learning human body movements from videos. Not only will you be able to prank your friends by showing them a convincing video of your ability to do 100 pull-ups, but the fake news it engenders will mean we can’t trust anything any more.

Neuromarketing

If you clicked on the ‘super-secret link’ in Sunday’s newsletter, you will have come across STEALING UR FEELINGS, which is nothing short of incredible. As powerful as it is in showing you the kind of data that organisations have on us, it’s only the tip of the iceberg.

Kaveh Waddell, in an article for Axios, explains that brains are the last frontier for privacy:

“The sort of future we’re looking ahead toward is a world where our neural data — which we don’t even have access to — could be used” against us, says Tim Brown, a researcher at the University of Washington Center for Neurotechnology.

Kaveh Waddell

This would lead to ‘neuromarketing’, with advertisers knowing what triggers and influences you better than you know yourself. It will no doubt also be used for discriminatory purposes and, because the data comes directly from your brainwaves, short of literally wearing a tinfoil hat, there’s not much you can do about it.


So there we are. Am I being too fearful here?

The greatest obstacle to discovery is not ignorance—it is the illusion of knowledge

So said Daniel J. Boorstin. It’s been an interesting week for those, like me, who follow the development of interaction between humans and machines. Specifically, people seem shocked that voice assistants are being used for health questions, and that the companies who make them employ people to listen to samples of voice recordings in order to improve them.

Before diving into that, let’s just zoom out a bit and remind ourselves that the average level of digital literacies in the general population is pretty poor. Sometimes I wonder how on earth VC-backed companies manage to burn through so much cash. Then I remember the contortions that those who design visual interfaces go through so that people don’t have to think.

Discussing ‘fake news’ and our information literacy problem in Forbes, you can almost feel Kalev Leetaru’s eye-roll when he says:

It is the accepted truth of Silicon Valley that every problem has a technological solution.

Most importantly, in the eyes of the Valley, every problem can be solved exclusively through technology without requiring society to do anything on its own. A few algorithmic tweaks, a few extra lines of code and all the world’s problems can be simply coded out of existence.

Kalev Leetaru

It’s somewhat tangential to the point I want to make in this article, but Cory Doctorow makes a good point in this regard about fake news for Locus:

Fake news is an instrument for measuring trauma, and the epistemological incoherence that trauma creates – the justifiable mistrust of the establishment that has nearly murdered our planet and that insists that making the richest among us much, much richer will benefit everyone, eventually.

Cory Doctorow

Before continuing, I’d just like to say that I’ve got some skin in the voice assistant game, given that our home has no fewer than six devices that use the Google Assistant (ten if you count smartphones and tablets).

Voice assistants are pretty amazing when you know exactly what you want and can form a coherent query. It’s essentially just clicking the top link on a Google search result, without any of the effort of pointing and clicking. “Hey Google, do I need an umbrella today?”
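To illustrate that point, here’s a minimal, purely illustrative sketch of the kind of pipeline such a query passes through. Every function name here is my own assumption for the sake of the example, not how any real assistant is implemented.

```python
# A toy sketch of a voice-assistant query: transcribe speech, run a
# search, read out the single best result. Entirely hypothetical names;
# real assistants are far more sophisticated than this.

def transcribe(audio_clip: bytes) -> str:
    """Stand-in for speech-to-text (in reality a large ML model)."""
    return "do I need an umbrella today"

def search(query: str) -> list:
    """Stand-in for a search backend returning ranked answers."""
    return ["Rain is forecast this afternoon, so yes, take an umbrella."]

def speak(text: str) -> None:
    """Stand-in for text-to-speech."""
    print(f"Assistant: {text}")

def handle_voice_query(audio_clip: bytes) -> None:
    query = transcribe(audio_clip)
    results = search(query)
    # The assistant effectively 'clicks the top link' for you:
    speak(results[0] if results else "Sorry, I don't know.")

handle_voice_query(b"<audio bytes>")
```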

However, some people are suspicious of voice assistants to a degree that borders on the superstitious. There are perhaps some valid reasons for wariness if you know your tech, but if you’re of the opinion that your voice assistant is ‘always recording’ and literally sending everything to Amazon, Google, Apple, and/or Donald Trump, then we need to have words. Just think about that for a moment, realise how ridiculous it is, and move on.

This week an article by VRT NWS stoked fears like these. It was cleverly written so that those who read it quickly could easily draw the conclusion that Google is listening to everything you say. However, let me carve out the key paragraphs:

Why is Google storing these recordings and why does it have employees listening to them? They are not interested in what you are saying, but the way you are saying it. Google’s computer system consists of smart, self-learning algorithms. And in order to understand the subtle differences and characteristics of the Dutch language, it still needs to learn a lot.

[…]

Speech recognition automatically generates a script of the recordings. Employees then have to double check to describe the excerpt as accurately as possible: is it a woman’s voice, a man’s voice or a child? What do they say? They write out every cough and every audible comma. These descriptions are constantly improving Google’s search engines, which results in better reactions to commands. One of our sources explains how this works.

VRT NWS

Every other provider of speech recognition products does this. Obviously. How else would you manage to improve voice recognition in real-world situations? What VRT NWS did was to get a sub-contractor to break a Non-Disclosure Agreement (and violate GDPR) to share recordings.

Google responded on their blog The Keyword, saying:

As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language. These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant.

We just learned that one of these language reviewers has violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.

We apply a wide range of safeguards to protect user privacy throughout the entire review process. Language experts only review around 0.2 percent of all audio snippets. Audio snippets are not associated with user accounts as part of the review process, and reviewers are directed not to transcribe background conversations or other noises, and only to transcribe snippets that are directed to Google.

The Keyword
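For what it’s worth, here’s a rough sketch of what sampling a small, de-identified fraction of snippets for human review could look like. The 0.2 percent figure comes from the post above; everything else (field names, the structure of a snippet) is my own assumption, not a description of Google’s actual pipeline.

```python
import random

# Rough, hypothetical sketch: sample roughly 0.2% of snippets for human
# review, stripping anything that links a snippet back to a user account.

REVIEW_RATE = 0.002  # the proportion mentioned in Google's post

def select_for_review(snippets):
    """Return de-identified copies of a small random sample of snippets."""
    sample = [s for s in snippets if random.random() < REVIEW_RATE]
    return [
        {
            "snippet_id": s["snippet_id"],  # random identifier, not an account ID
            "audio": s["audio"],
            "language": s["language"],
            # deliberately no user_id, device_id or location fields
        }
        for s in sample
    ]
```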

As I’ve said before, due to the GDPR actually having teeth (British Airways was fined £183m last week) I’m a lot happier to share my data with large companies than I was before the legislation came in. That’s the whole point.

The other big voice assistant story, in the UK at least, was that the National Health Service (NHS) is partnering with Amazon Alexa to offer health advice. The BBC reports:

From this week, the voice-assisted technology is automatically searching the official NHS website when UK users ask for health-related advice.

The government in England said it could reduce demand on the NHS.

Privacy campaigners have raised data protection concerns but Amazon say all information will be kept confidential.

The partnership was first announced last year and now talks are under way with other companies, including Microsoft, to set up similar arrangements.

Previously the device provided health information based on a variety of popular responses.

The use of voice search is on the increase and is seen as particularly beneficial to vulnerable patients, such as elderly people and those with visual impairment, who may struggle to access the internet through more traditional means.

The BBC

So long as this is available to all types of voice assistants, this is great news. The number of people I know, including family members, who have convinced themselves they’ve got serious problems by spending ages searching their symptoms, is quite frightening. Getting sensible, prosaic advice is much better.

Iliana Magra writes in The New York Times that privacy campaigners are concerned about Amazon setting up a health care division, but that there are tangible benefits to certain sections of the population.

The British health secretary, Matt Hancock, said Alexa could help reduce strain on doctors and pharmacists. “We want to empower every patient to take better control of their health care,” he said in a statement, “and technology like this is a great example of how people can access reliable, world-leading N.H.S. advice from the comfort of their home.”

His department added that voice-assistant advice would be particularly useful for “the elderly, blind and those who cannot access the internet through traditional means.”

Iliana Magra

I’m not dismissing the privacy issues, of course not. But what I’ve found, especially recently, is that the knowledge, skills, and expertise required to be truly ‘Google-free’ (or the equivalent) are an order of magnitude greater than we can realistically expect of the general population.

It might be fatalistic to ask the following question, but I’ll do it anyway: who exactly do we expect to be building these things? Mozilla, one of the world’s largest tech non-profits, is conspicuously absent from these conversations, and somehow I don’t think people are going to trust governments to get involved.

For years, techies have talked about ‘personal data vaults’ that would let you share information in a granular way without being tracked. The BBC is currently trialling BBC Box, which could help with some of this:

With a secure Databox at its heart, BBC Box offers something very unusual and potentially important: it is a physical device in the person’s home onto which personal data is gathered from a range of sources, although of course (and as mentioned above) it is only collected with the participants explicit permission, and processed under the person’s control.

Personal data is stored locally on the box’s hardware and once there, it can be processed and added to by other programmes running on the box – much like apps on a smartphone. The results of this processing might, for example be a profile of the sort of TV programmes someone might like or the sort of theatre they would enjoy. This is stored locally on the box – unless the person explicitly chooses to share it. No third party, not even the BBC itself, can access any data in ‘the box’ unless it is authorised by the person using it, offering a secure alternative to existing services which rely on bringing large quantities of personal data together in one place – with limited control by the person using it.

The BBC

It’s an interesting concept and, if they can get the user experience right, a potentially groundbreaking one. Eventually, of course, it will be in your smartphone, which means that device really will be a ‘digital self’.
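To make the gatekeeping idea concrete, here’s a minimal sketch of how a ‘personal data vault’ might mediate access: data stays local, and nothing leaves without an explicit, per-requester grant. This is a toy illustration of the principle, not how Databox or BBC Box is actually implemented.

```python
# Toy sketch of a personal data vault: data lives locally and a third
# party only ever sees what the owner has explicitly granted.

class PersonalDataVault:
    def __init__(self):
        self._store = {}      # local data, never shared by default
        self._grants = set()  # (requester, key) pairs the owner has approved

    def add(self, key, value):
        self._store[key] = value

    def grant(self, requester, key):
        """The owner explicitly permits one requester to see one item."""
        self._grants.add((requester, key))

    def request(self, requester, key):
        if (requester, key) not in self._grants:
            raise PermissionError(f"{requester} has no grant for {key!r}")
        return self._store[key]

vault = PersonalDataVault()
vault.add("tv_profile", {"likes": ["documentaries", "drama"]})
vault.grant("bbc_recommender", "tv_profile")
print(vault.request("bbc_recommender", "tv_profile"))  # allowed
# vault.request("advertiser", "tv_profile")  would raise PermissionError
```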

You can absolutely opt out of whatever you want. For example, I opt out of Facebook’s products (including WhatsApp and Instagram). You can point out to others the reasons for that, but at some point you have to realise it’s an opinion, a lifestyle choice, an ideology. Not everyone wants to be a tech vegan, or live their lives by the rules of those who act as though they are one.

Friday ferretings

These things jumped out at me this week:

  • Deepfakes will influence the 2020 election—and our economy, and our prison system (Quartz) ⁠— “The problem doesn’t stop at the elections, however. Deepfakes can alter the very fabric of our economic and legal systems. Recently, we saw a deepfake video of Facebook CEO Mark Zuckerberg bragging about abusing data collected from users circulated on the internet. The creators of this video said it was produced to demonstrate the power of manipulation and had no malicious intent—yet it revealed how deceptively realistic deepfakes can be.”
  • The Slackification of the American Home (The Atlantic) — “Despite these tools’ utility in home life, it’s work where most people first become comfortable with them. ‘The membrane that divides work and family life is more porous than it’s ever been before,’ says Bruce Feiler, a dad and the author of The Secrets of Happy Families. ‘So it makes total sense that these systems built for team building, problem solving, productivity, and communication that were invented in the workplace are migrating to the family space’.”
  • You probably don’t know what your coworkers think of you. Here’s how to change that (Fast Company) — “[T]he higher you rise in an organization, the less likely you are to get an accurate picture of how other people view you. Most people want to be viewed favorably by others in a position of power. Once you move up to a supervisory role (or even higher), it is difficult to get people to give you a straight answer about their concerns.”
  • Sharing, Generosity and Gratitude (Cable Green, Creative Commons) — “David is home recovering and growing his liver back to full size. I will be at the Mayo Clinic through the end of July. After the Mayo surgeons skillfully transplanted ⅔ of David’s liver into me, he and I laughed about organ remixes, if he should receive attribution, and wished we’d have asked for a CC tattoo on my new liver.”
  • Flexibility as a key benefit of open (The Ed Techie) — “As I chatted to Dames and Lords and fiddled with my tie, I reflected on that what is needed for many of these future employment scenarios is flexibility. This comes in various forms, and people often talk about personalisation but it is more about institutional and opportunity flexibility that is important.”
  • Abolish Eton: Labour groups aim to strip elite schools of privileges (The Guardian) — “Private schools are anachronistic engines of privilege that simply have no place in the 21st century,” said Lewis. “We cannot claim to have an education system that is socially just when children in private schools continue to have 300% more spent on their education than children in state schools.”
  • I Can’t Stop Winning! (Pinboard blog) – “A one-person business is an exercise in long-term anxiety management, so I would say if you are already an anxious person, go ahead and start a business. You’re not going to feel any worse. You’ve already got the main skill set of staying up and worrying, so you might as well make some money.”
  • How To Be The Remote Employee That Proves The Stereotypes Aren’t True (Trello blog) — “I am a big fan of over-communicating in general, and I truly believe that this is a rule all remote employees should swear by.”
  • I Used Google Ads for Social Engineering. It Worked. (The New York Times) — “Ad campaigns that manipulate searchers’ behavior are frighteningly easy for anyone to run.”
  • Road-tripping with the Amazon Nomads (The Verge) — “To stock Amazon’s shelves, merchants travel the backroads of America in search of rare soap and coveted toys.”

Image from Guillermo Acuña fronts his remote Chilean retreat with large wooden staircase (Dezeen)

Charity is no substitute for justice

The always-brilliant Audrey Watters eviscerates the latest project from a white, male billionaire to ‘fix education’. Citing Amazon CEO Jeff Bezos’ plan to open a series of “Montessori-inspired preschools in underserved communities” where “the child will be the customer”, Audrey comments:

The assurance that “the child will be the customer” underscores the belief – shared by many in and out of education reform and education technology – that education is simply a transaction: an individual’s decision-making in a “marketplace of ideas.” (There is no community, no public responsibility, no larger civic impulse for early childhood education here. It’s all about private schools offering private, individual benefits.)

As I’ve said on many occasions, everyone wakes up with cool ideas to change the world. The difference is that you or I would have to run them through many, many filters to get the funding to implement them. Those filters, hopefully, kill 99% of batshit-crazy ideas. Billionaires, on the other hand, can just speak and fund things into existence, no matter how damaging and ill thought-out the ideas behind them happen to be.

[Teaching] is a field in which a third of employees already qualify for government assistance. And now Jeff Bezos, a man whose own workers also rely on these same low-income programs, wants to step in – not as a taxpayer, oh no, but as a philanthropist. Honestly, he could have a more positive impact here by just giving those workers a raise. (Or, you know, by paying taxes.)

This is the thing. We can do more and better together than we can do apart. The ideas of the many, honed over years, lead to better outcomes than the few thinking alone.

For all the flaws in the public school system, it’s important to remember: there is no accountability in billionaires’ educational philanthropy.

And, as W. B. Yeats famously never said, charity is no substitute for justice.

Whatever your moral and political views, accountability is something that cuts across the divide. I should imagine there are some reading this who send their kids to private schools and don’t particularly see the problem with this. Isn’t it just another example of competition within ‘the market’?

The trouble with that kind of thinking, at least from my perspective, is twofold. First, it assumes that education is a private instead of a public good. Second, that it’s OK to withhold money from society and then use that to subsidise the education of the already-privileged.

Source: Hack Education

Our irresistible screens of splendour

Apple is touting a new feature in the latest version of iOS that helps you reduce the amount of time you spend on your smartphone. Facebook are doing something similar. As this article in The New York Times notes, that’s no accident:

There’s a reason tech companies are feeling this tension between making phones better and worrying they are already too addictive. We’ve hit what I call Peak Screen.

For much of the last decade, a technology industry ruled by smartphones has pursued a singular goal of completely conquering our eyes. It has given us phones with ever-bigger screens and phones with unbelievable cameras, not to mention virtual reality goggles and several attempts at camera-glasses.

The article even gives the example of Augmented Reality LEGO play sets which actively encourage you to stop building and spend more time on screens!

Tech has now captured pretty much all visual capacity. Americans spend three to four hours a day looking at their phones, and about 11 hours a day looking at screens of any kind.

So tech giants are building the beginning of something new: a less insistently visual tech world, a digital landscape that relies on voice assistants, headphones, watches and other wearables to take some pressure off our eyes.

[…]

Screens are insatiable. At a cognitive level, they are voracious vampires for your attention, and as soon as you look at one, you are basically toast.

It’s not enough to tell people not to do things. Technology can be addictive, just like anything else, so we need to find better ways of achieving similar ends.

But in addition to helping us resist phones, the tech industry will need to come up with other, less immersive ways to interact with digital world. Three technologies may help with this: voice assistants, of which Amazon’s Alexa and Google Assistant are the best, and Apple’s two innovations, AirPods and the Apple Watch.

All of these technologies share a common idea. Without big screens, they are far less immersive than a phone, allowing for quick digital hits: You can buy a movie ticket, add a task to a to-do list, glance at a text message or ask about the weather without going anywhere near your Irresistible Screen of Splendors.

The issue I have is that it’s going to take tightly-integrated systems to do this well, at least at first. So the chances are that Apple or Google will create an ecosystem that only works with their products, providing another way to achieve vendor lock-in.

Source: The New York Times

The disappearing computer and the future of AI

I was at the Thinking Digital conference yesterday, which is always an inspiring event. It kicked off with a presentation from a representative of Amazon’s Alexa programme, who cited an article by Walt Mossberg from this time last year. I’m pretty sure I read about it, but didn’t necessarily write about it, at the time.

Mossberg talks about how computing will increasingly become invisible:

Let me start by revising the oft-quoted first line of my first Personal Technology column in the Journal on October 17th, 1991: “Personal computers are just too hard to use, and it’s not your fault.” It was true then, and for many, many years thereafter. Not only were the interfaces confusing, but most tech products demanded frequent tweaking and fixing of a type that required more technical skill than most people had, or cared to acquire. The whole field was new, and engineers weren’t designing products for normal people who had other talents and interests.

Things are different now, of course. We expect even small children to be able to use things like iPads with minimal help.

When the internet first arrived, it was a discrete activity you performed on a discrete hunk of metal and plastic called a PC, using a discrete software program called a browser. Even now, though the net is like the electrical grid, powering many things, you still use a discrete device — a smartphone, say — to access it. Sure, you can summon some internet smarts through an Echo, but there’s still a device there, and you still have to know the magic words to say. We are a long way from the invisible, omnipresent computer in Starship Enterprise.

The Amazon representative on-stage at the conference obviously believes that voice is the next frontier in computing. That’s his job. Nevertheless, he marshalled some pretty compelling, if anecdotal, evidence for that. A couple of videos showed older people, who had been completely bypassed by the smartphone revolution, interacting naturally with Alexa.

I expect that one end result of all this work will be that the technology, the computer inside all these things, will fade into the background. In some cases, it may entirely disappear, waiting to be activated by a voice command, a person entering the room, a change in blood chemistry, a shift in temperature, a motion. Maybe even just a thought.

In the same way that the front end of a website like Facebook, the user interface, is the tip of the iceberg, so voice assistants are the front end for artificial intelligence. Who gets to process the data harvested by these devices, and for what purposes, is an important issue — both now and in the future.

And, if ambient technology is to become as integrated into our lives as previous technological revolutions like wood joists, steel beams, and engine blocks, we need to subject it to the digital equivalent of enforceable building codes and auto safety standards. Nothing less will do. And health? The current medical device standards will have to be even tougher, while still allowing for innovation.

This was the last article Mossberg wrote anywhere, having been a tech journalist since 1991. In signing off, he became a little wistful about the age of gadgetry we’re leaving behind, but it’s hopefully for the wider good.

We’ve all had a hell of a ride for the last few decades, no matter when you got on the roller coaster. It’s been exciting, enriching, and transformative. But it’s also been about objects and processes. Soon, after a brief slowdown, the roller coaster will be accelerating faster than ever, only this time it’ll be about actual experiences, with much less emphasis on the way those experiences get made.

This is an important touchstone article, and one I’ll be returning to in future, no doubt.

Source: The Verge

Work-life balance is actually a circle, according to Jeff Bezos

Whatever your thoughts about Amazon, it’s hard to disagree that they’ve changed the world. Their CEO, Jeff Bezos, has some thoughts about what’s usually termed ‘work-life balance’:

This work-life harmony thing is what I try to teach young employees and actually senior executives at Amazon too. But especially the people coming in. I get asked about work-life balance all the time. And my view is, that’s a debilitating phrase because it implies there’s a strict trade-off. And the reality is, if I am happy at home, I come into the office with tremendous energy. And if I am happy at work, I come home with tremendous energy.

Of course, if you work from home (as I do) being happy at home is crucial to being happy at work.

I like his metaphor of a circle, about it not being a trade-off or ‘balance’:

It actually is a circle; it’s not a balance. And I think that is worth everybody paying attention to it. You never want to be that guy — and we all have a coworker who’s that person — who as soon as they come into a meeting they drain all the energy out of the room. You can just feel the energy go whoosh! You don’t want to be that guy. You want to come into the office and give everyone a kick in their step.

All of the most awesome people I know have nothing like a work-life ‘balance’. Instead, they work hard, play hard, and tie that to a mission bigger than themselves.

Whether that’s true for the staff working to targets in Amazon warehouses is a different matter, of course. But for knowledge workers, I think it’s spot-on.

Source: Chicago Tribune

Alexa for Kids as babysitter?

I’m just on my way out of the house to head for Scotland to climb some mountains with my wife.

But while she does (what I call) her ‘last-minute faffing’, I read Dan Hon’s newsletter. I’ll just quote the relevant section without any attempt at comment or analysis.

He includes references in his newsletter, but you’ll just have to click through for those.

Mat Honan reminded me that Amazon have made an Alexa for Kids (during the course of which Tom Simonite had a great story about Alexa diligently and non-plussedly educating a group of preschoolers about the history of FARC after misunderstanding their requests for farts) and Honan has a great article about it. There are now enough Alexa (plural?) out there that the phenomenon of “the funny things kids say to Alexa” is pretty well documented as well as the earlier “Alexa is teaching my kid to be rude” observation. This isn’t to say that Amazon haven’t done *any* work thinking about how Alexa works in a kid context (Honan’s article shows that they’ve demonstrably thought about how Alexa might work and that they’ve made changes to the product to accommodate children as a specific class of user) but the overwhelming impression I had after reading Honan’s piece was that, as a parent, I still don’t think Amazon have gone far enough in making Alexa kid-friendly.

They’ve made some executive decisions like coming down hard on curation versus algorithmic selection of content (see James Bridle’s excellent earlier essay on YouTube, that something is wrong on the internet and recent coverage of YouTube Kids’ content selection method still finding ways to recommend, shall we say, videos espousing extreme views). And Amazon have addressed one of the core reported issues of having an Alexa in the house (the rudeness) by designing in support for a “magic word” Easter Egg that will reward kids for saying “please”. But that seems rather tactical and dealing with a specific issue and not, well, foundational. I think that the foundational issue is something more like this: parenting is a *very* personal subject. As I have become a parent, I have discovered (and validated through experimental data) that parents have very specific views about how to do things! Many parents do not agree with each other! Parents who agree with each other on some things do not agree on other things! In families where there are two parents there is much scope for disagreement on both desired outcome and method!

All of which is to say that the current design, architecture and strategy of Alexa for Kids indicates one sort of one-size-fits-all method and that there’s not much room for parental customization. This isn’t to say that Amazon are actively preventing it and might not add it down the line – it’s just that it doesn’t really exist right now. Honan’s got a great point that:

“[For example,] take the magic word we mentioned earlier. There is no universal norm when it comes to what’s polite or rude. Manners vary by family, culture, and even region. While “yes, sir” may be de rigueur in Alabama, for example, it might be viewed as an element of the patriarchy in parts of California.”

Some parents may have very specific views on how they want to teach their kids to be polite. This kind of thinking leads me down the path of: well, are we imagining a world where Alexa or something like it is a sort of universal basic babysitter, with default norms and those who can get, well, customization? Or what someone else might call: attentive, individualized parenting?

When Alexa for Kids came out, I did about 10 seconds’ worth of thinking and, based on how Alexa gets used in our house (two parents, a five year old and a 19 month old) and how our preschooler is behaving, I was pretty convinced that I’m in no way ready or willing to leave him alone with an Alexa for Kids in his room. My family is, in what some might see as that tedious middle class way, pretty strict about the amount of screen time our kids get (unsupervised and supervised) and suffice it to say that there’s considerable difference of opinion between my wife and myself on what we’re both comfortable with and at what point what level of exposure or usage might be appropriate.

And here’s where I reinforce that point again: are you okay with leaving your kids with a default babysitter, or are you the kind of person who has opinions about how you want your babysitter to act with your kids? (Yes, I imagine people reading this and clutching their pearls at the mere *thought* of an Alexa “babysitting” a kid but need I remind you that books are a technological object too and the issue here is in the degree of interactivity and access). At least with a babysitter I can set some parameters and I’ve got an idea of how the babysitter might interact with the kids because, well, that’s part of the babysitter screening process.

Source: Things That Have Caught My Attention s5e11

Microcast #005

Thinking through an approach to building Project MoodleNet that came to me this weekend, using Google search, Amazon filtering, and the Pinterest browser button as mental models.

Links:

Web Trends Map 2018 (or ‘why we can’t have nice things’)

My son, who’s now 11 years old, used to have iA’s Web Trends Map v4 on his wall. It was produced in 2009, when he was two:

iA Web Trends Map 4 (2009)

I used it to explain the web to him, as the subway map was a metaphor he could grasp. I’d wondered why iA hadn’t produced more in subsequent years.

Well, the answer is clear in a recent post:

Don’t get too excited. We don’t have it. We tried. We really tried. Many times. The most important ingredient for a Web Trend Map is missing: The Web. Time to bring some of it back.

Basically, the web has been taken over by capitalist interests:

The Web has lost its spirit. The Web is no longer a distributed Web. It is, ironically, a couple of big tubes that belong to a handful of companies. Mainly Google (search), Facebook (social) and Amazon (e-commerce). There is an impressive Chinese line and there are some local players in Russia, Japan, here and there. Overall it has become monotonous and dull. What can we do?

It’s difficult. Although I support the aims, objectives, and ideals of the IndieWeb, I can’t help but think it’s looking backwards instead of forwards. I’m hoping that newer approaches such as federated social networks, distributed ledgers and databases, and regulation such as GDPR have some impact.

Source: iA