Spy windows?

    No technology is neutral, and vendors are only ever going to tout the positive qualities. Take this example: it’s a way to create a camera out of any window. Huge benefits, as the article says, but also some rather large (and dystopian) downsides.

    The image depicts a futuristic glass door on the front of a modern corporate building, reflecting a cityscape with skyscrapers under a sky with clouds. The glass features a holographic facial recognition system with a green circle and lock icon surrounding the reflection of a woman's face with short hair and glasses, indicating access has been granted.

    Zeiss is bringing its remarkable Holocam technology to CES 2024, which can turn any glass screen into a camera. This means that everything from the window in your car to the screen on your laptop to the glass on your front door can now possess an invisible image sensor.

    […]

    The Holocam technology “uses holographic in-coupling, light guiding and de-coupling elements to redirect the incoming light of a transparent medium to a hidden image sensor.”

    […]

    Using an entire pane of glass as a camera lens also opens some fascinating optical possibilities. Some of Zeiss' bullet points include “large aperture invisible camera” and “individual adjustment of orientation and size of the field of views.” Which makes me wonder: what are the maximum aperture and focal range of a camera like this?
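    For a sense of what “large aperture” could mean here, the standard photographic f-number relationship (f-number = focal length ÷ aperture diameter) gives a feel for the numbers. This is a purely illustrative sketch: the focal lengths and pane sizes below are my own guesses, not Zeiss specifications.

```python
# Back-of-the-envelope f-numbers for a window-sized aperture.
# The dimensions here are hypothetical, not from Zeiss.

def f_number(focal_length_mm: float, aperture_diameter_mm: float) -> float:
    """Standard photographic f-number: N = f / D."""
    return focal_length_mm / aperture_diameter_mm

# A typical full-frame lens: 50mm focal length, 25mm aperture diameter.
print(f_number(50, 25))  # 2.0, i.e. f/2

# A (hypothetical) window-as-lens: even with a 500mm effective focal
# length, a 1000mm-wide pane would give f/0.5 -- far "faster" than
# any conventional lens, if the optics could actually deliver it.
print(f_number(500, 1000))  # 0.5, i.e. f/0.5
```

    Whether holographic light-guiding can exploit the full pane that way is exactly the open question.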

    Of course, there’s a darker potential for such technology. Given the current fear around hidden cameras in Airbnbs, the idea of every single window (or even shower door) in a rental property being able to spy on you is a little disconcerting.

    Source: This holographic camera turns any window into an invisible camera | Digital Camera World

    Remember distinct music scenes and culinary traditions? Yeah, they're coming back.

    Anything that Anil Dash writes is worth reading and this, his first article for Rolling Stone, is no different. I haven’t quoted it here, but I love the first paragraph. What goes around, comes around, eh?

    This is a vibrant and highly detailed image depicting a fantastical scene reminiscent of a stage set for an imaginary play. The artwork is rich with various elements and layers, featuring multiple colorful structures that resemble different themed areas or sets. On the left, there's a golden-yellow structure with green accents, platforms, and staircases that evoke a bustling market or social hub, with tiny figures that appear to be people engaging in various activities. Centered in the image is a towering cityscape with blue and black skyscrapers rising among white, fluffy clouds against a clear sky. To the right, the scene turns darker with red and black twisted trees and buildings that have a more ominous vibe, including some structures that are on fire and surrounded by dark birds. The entire image is a blend of whimsy and chaos, with numerous birds in flight throughout, some carrying symbols like hearts and crosses. There are also splashes of paint and abstract elements scattered across the image, contributing to the surreal, dreamlike atmosphere. The overall color scheme includes bright red, yellow, blue, and varying shades of dark gray, all set against a light blue background that suggests a waterside setting at the bottom edge of the image.
    [T]his new year offers many echoes of a moment we haven’t seen in a quarter-century. Some of the most dominant companies on the internet are at risk of losing their relevance, and the rest of us are rethinking our daily habits in ways that will shift the digital landscape as we know it. Though the specifics are hard to predict, we can look to historical precedents to understand the changes that are about to come, and even to predict how regular internet users — not just the world’s tech tycoons — may be the ones who decide how it goes.

    […]

    We are about to see the biggest reshuffling of power on the internet in 25 years, in a way that most of the internet’s current users have never seen before. And while some of the drivers of this change have been hyped up, or even over-hyped, a few of the most important changes haven’t gotten any discussion at all.

    […]

    Consider the dramatic power shift happening right now in social media. Twitter’s slide into irrelevance and extremism as it decays into X has hastened the explosive growth of a whole host of newer social networks. There’s the nerdy vibes of the noncommercial Mastodon communities (each one with its own set of Dungeons and Dragons rules to play by), the raucous hedonism of Bluesky (like your old Tumblr timeline at its most scandalous), and the at-least-it’s-not-LinkedIn noisiness of Threads, brought to you by Instagram, meaning Facebook, meaning Meta. There are lots more, of course, and probably another new one popping up tomorrow, but that’s what’s great about it. A generation ago, we saw early social networks like LiveJournal and Xanga and Black Planet and Friendster and many others come and go, each finding their own specific audience and focus. For those who remember a time in the last century when things were less homogenous, and different geographic regions might have their own distinct music scenes or culinary traditions, it’s easy to understand the appeal of an online equivalent to different, connected neighborhoods that each have their own vibe. While this new, more diffuse set of social networks sometimes requires a little more tinkering to get started, they epitomize the complexity and multiplicity of the weirder and more open web that’s flourishing today.

    [...]

    I’m not a Pollyanna about the fact that there are still going to be lots of horrible things on the internet, and that too many of the tycoons who rule the tech industry are trying to make the bad things worse. (After all, look what the last wild era online led to.) There’s not going to be some new killer app that displaces Google or Facebook or Twitter with a love-powered alternative. But that’s because there shouldn’t be. There should be lots of different, human-scale alternative experiences on the internet that offer up home-cooked, locally-grown, ethically-sourced, code-to-table alternatives to the factory-farmed junk food of the internet. And they should be weird.

    Source: The Internet Is About to Get Weird Again | Rolling Stone

    Image: DALL-E 3

    Accepting and trying to deal with climate as an overriding priority

    I need to dig into this BBC R&D report, but it looks fascinating at first glance. I recognise the names of some of the people who were interviewed in the process of creating it. What’s interesting to me is that, instead of the ‘next big thing’ in terms of technology, they found “a complex set of factors that we believe will enable and catalyse one another, sometimes in surprising and unpredictable ways”.

    The most important of these, of course, was “accepting and trying to deal with climate as an overriding priority” but also identifying two types of complexity. The first is “a sense that in order to simply go about your day as a person, it’s necessary to interact with, and understand, many complex sources of information”. The second is “a sense that the overarching systems of the world like politics, finance, economics, and healthcare, are becoming more complex and difficult to understand”.

    Late in 2022, we began a straightforward-sounding research project: compile a list of technologies that we should be paying attention to in BBC Research & Development over the next few years and make some recommendations about their adoption to the wider BBC. As I’m sure you’ve already guessed, things didn’t turn out quite so straightforward.

    By the end of the project, we’d interviewed twenty-two people from the fields of science, economics, education, technology, design, business leadership, research, activism, journalism, and many points between. We spoke to people from both inside and outside the BBC and around the world. All of these people have a unique view on the future, and our report teases out the common themes from the interviews and compiles their ideas about how things might come to be in the near future.

    We grouped the themes we identified into five sections. The first, A complex world, outlines sources of complexity and uncertainty our interviewees see in their worlds. Climate change is by far the largest and most significant of these. The next section, A divided world, also covers big-picture context and outlines some of the social and economic drivers our interviewees see playing out over the next few years. The AI boom and New interactions go into detail on specific technologies and use cases our interviewees think will be significant. Finally, The case for hope bundles up some of the reasons our interviewees see to be hopeful about the future — provided we are willing to act to bring about the changes we’d like to see in the world.

    Source: Projections: Things are not normal | BBC R&D

    Tech typologisation

    People love being typologised. I’m no different, although my result as an ‘Abstract Explorer’ in IBM’s Tech Type quiz wasn’t exactly a surprise.

    Abstract Explorer tech type

    Consider this: a quiz to guide you to your unique fit for tech skills based on your strengths and interests. Find your future with this personalized assessment, bringing you one step closer to new skills to enhance your career in tech and key skills like artificial intelligence (AI). And it takes less than 5 minutes.

    Source: Tech Type Quiz | IBM SkillsBuild

    'Personalisation' is something that humans do

    Audrey Watters, formerly the ‘Cassandra’ of edtech, is now writing about health, nutrition, and fitness technologies at Second Breakfast. It’s great, I’m a paid subscriber.

    In this article, she looks at the overlap between her former and current fields, comparing and contrasting coaches and educators with algorithms. While I don’t share her loathing of ChatGPT, as an educator and a parent I’d definitely agree that motivation and attention are things to which a human is (currently) best suited.

    How well does a teacher or trainer or coach know how you feel, how well you performed, or what you should do or learn next? How well does an app know how you feel, how well you performed, or what you should do next? Digital apps insist that, thanks to the data they collect, they can make better, more precise recommendations than humans ever can — dismissing what humans do as “one size fits all.” Yet it's impossible to scrutinize their algorithmic decision-making. Ideally, at least, you can always ask your coach, "Why the hell am I doing bulgarian split squats?! These suck." And she will tell you precisely why. (Love you, Coach KB.)

    And then (ideally) she’ll say, “If you don’t want to do them, you don’t have to.” And (ideally), she’ll ask you what’s going on. Maybe you feel like shit that day. Maybe you don’t have time. Maybe they hurt your hamstrings. Maybe you’d like to hear some options — other exercises you can do instead. Maybe you’d like to know why she prescribed this exercise in the first place — “it’s a unilateral exercise, and as a runner,” she says, “we want to work on single-leg strength, with a focus on your glute medius and adductors because I’ve noticed, by watching your barbell squats, that those areas are your weak spots.” This is how things get “personalized” — not by some massive data extraction and analysis, but by humans asking each other questions and then tailoring our responses and recommendations accordingly. Teachers and coaches do this every. goddamn. day. Sure, there’s a training template or a textbook that one is supposed to follow; but good teachers and coaches check in, and they switch things up when they’re not really working.

    […]

    If we privilege these algorithms, we’re not only adopting their lousy recommendations; we’re undermining the expertise of professionals in the field. And we’re not only undermining the expertise of professionals in the field, we’re undermining our own ability to think and learn and understand our own bodies. We’re undermining our own expertise about ourselves. (ChatGPT is such a bad bad bad idea.)

    Source: Teacher/Coach as Algorithm | Second Breakfast

    What people are really using generative AI for

    As I’ve written several times before here on Thought Shrapnel, society seems to act as though the giant, monolithic, hugely profitable porn industry just doesn’t… exist? This despite the fact it tends to be a driver of technical innovation. I won’t get into details, but feel free to search for phrases such as ‘teledildonics’.

    So this article from the new (and absolutely excellent) 404 Media on a venture capital firm’s overview of the emerging generative AI industry shouldn’t come as too much of a surprise. As a society and as an industry, we don’t make progress on policy, ethics, and safety by pretending things aren’t happening.

    As a father, I find this kind of news more than a little disturbing. But we don’t deal with any of it by burying our heads in the sand, shaking our heads, or crossing our fingers.

    The Andreessen Horowitz (also called a16z) analysis is derived from crude but telling data—internet traffic. Using website traffic tracking company Similarweb, a16z ranks the top 50 generative AI websites on the internet by monthly visits, as of June 2023. This data provides an incomplete picture of what people are doing with AI because it’s not tracking use of popular AI apps like Replika (where people sext with virtual companions) or Telegram chatbots like Forever Companion, which allows users to talk to chatbots trained on the voices of influencers like Amouranth and Caryn Marjorie (who just want to talk about sex).

    […]

    What I can tell you without a doubt by looking at this list of the top 50 generative AI websites is that, as has always been the case online and with technology generally, porn is a major driving force in how people use generative AI in their day to day lives.

    […]

    Even if we put ethical questions aside, it is absurd that a tech industry kingmaker like a16z can look at this data, write a blog titled “How Are Consumers Using Generative AI?” and not come to the obvious conclusion that people are using it to jerk off. If you are actually interested in the generative AI boom and you are not identifying porn as core use for the technology, you are either not paying attention or intentionally pretending it’s not happening.

    Source: 404 Media Generative AI Market Analysis: People Love to Cum

    Microcast #096 — Getting back in the saddle


    Explaining what I've been up to and the difference between being a hedgehog and a fox.

    Show notes


    Image: Pexels

    The only way to outlaw encryption is to outlaw encryption

    An enjoyable take by The Register on the UK’s Online Safety Bill. I was particularly interested by the link to Veilid, a new secure peer-to-peer network for apps which is like the offspring of IPFS and Tor.

    Many others have made the point about how much government ministers like the end-to-end encryption of their own WhatsApp communications. But they’d also like to break into, well… everyone else’s.

    The official madness over data security is particularly bad in the UK. The British state is a world class incompetent at protecting its own data. In the past couple of weeks alone, we have seen the hacking of the Electoral Commission, the state body in charge of elections, the mass exposure of birth, marriage and death data, and the bulk release of confidential personnel information of a number of police forces, most notably the Police Service Northern Ireland. This was immediately picked up by terrorists who like killing police. It doesn't get worse than that.

    This same state is, of course, the one demanding that to “protect children,” it should get access to whatever encrypted citizen communication it likes via the Online Safety Bill, which is now rumored to be going through British Parliament in October. This is akin to giving an alcoholic uncle the keys to every booze shop in town to “protect children”: you will find Uncle in a drunken coma with the doors wide open and the stock disappearing by the vanload.

    […]

    It is just stupidity stacked on incompetence balanced on political Dunning Krugerism, and the advent of Veilid drowns the lot in a tidal wave of foetid futility. What can a government do about a framework? What can it do about open source?

    […]

    The only way to outlaw encryption is to outlaw encryption. Anything less will fail, as it is always possible in software to create kits of parts, all legal by themselves, that can be linked together to provide encryption with no single entity to legislate against. Our industry is fully aware of this. Criminals know it too. Ordinary people will learn it as well, if they have to. This information is free to everyone – except the politicians, it seems. For them, reality is far too expensive.

    Source: Last rites for UK’s ridiculous Online Safety Bill | The Register

    The future of AI will always be more than six months away

    A remarkably sober look at the need for regulation, transparency around how models are trained, and costs in the world of AI. It makes a really good point about the UX required for machine learning to be useful at scale.

    “I have learned from experience that leaving tools completely open-ended tends to confuse users more than assist,” says Kirk. “Think of it like a hall of doors that is infinite. Most humans would stand there perplexed with what to do. We have a lot of work to do to determine the optimal doors to present to users.” Mason has a similar observation, adding that “in the same way that ChatGPT was mainly a UX improvement over GPT-3, I think that we’re just at the beginning of inventing the UI metaphors we’ll need to effectively use AI models in products.”

    […]

    Augmenting work with AI could be worthwhile despite these problems. This was certainly true of the computing revolution: Many people need training to use Word and Excel, but few would propose typewriters or graph paper as a better alternative. Still, it’s clear that a future in which “we automate away all the jobs, including the fulfilling ones,” is more than six months away, as the Future of Life Institute’s letter frets. The AI revolution is unfolding right now—and will still be unfolding a decade from today.

    Source: AI Can’t Take Over Everyone’s Jobs Soon (If Ever) | IEEE Spectrum

    Fitting LLMs to the phenomena

    The author of this post really needs to read Thomas Kuhn’s The Structure of Scientific Revolutions and some Marshall McLuhan (especially on tetrads).

    What he’s describing here is to do with mindsets: the attempts we make to fit ‘the phenomena’ into our existing mental models. When that doesn’t work, there’s a crisis, and we have to come up with new paradigms.

    But, more than that, to use McLuhan’s phrase, we “march backwards into the future” always looking to the past to make sense of the present — and future.

    AI image. Midjourney prompt: "tree in shape of brain | ladder resting against trunk of tree --aspect 16:9 --v 5 --no text words letters signatures"

    I have a theory that technological cycles are like the stages of Squid Game: Each one is almost entirely disconnected from the last, and you never know what the next game is going to be until you’re in the arena.

    For example, some new technology, like the automobile, the internet, or mobile computing, gets introduced. We first try to fit it into the world as it currently exists: The car is a mechanical horse; the mobile internet is the desktop internet on a smaller screen. But we very quickly figure out that this new technology enables some completely new way of living. The geography of lives can be completely different; we can design an internet that is exclusively built for our phones. Before the technology arrived, we wanted improvements on what we had, like the proverbial faster horse. After, we invent things that were unimaginable before—how would you explain everything about TikTok to someone from the eighties? Each new breakthrough is a discontinuity, and teleports us to a new world—and, for companies, into a new competitive game—that would’ve been nearly impossible to anticipate from our current world.

    Artificial intelligence, it seems, will be the next discontinuity. That means it won’t tack itself onto our lives as they are today, and tweak them around the edges; it will yank us towards something that is entirely different and unfamiliar.

    AI will have the same effect on the data ecosystem. We’ll initially try to insert LLMs into the game we’re currently playing, by using them to help us write SQL, create documentation, find old dashboards, or summarize queries.

    But these changes will be short-lived. Over time, we’ll find novel things to do with AI, just as we did with the cloud and cloud data warehouses. Our data models won’t be augmented by LLMs; they’ll be built for LLMs. We won’t glue natural language inputs on top of our existing interfaces; natural language will become the default way we interact with computers. If a bot can write data documentation on demand for us, what’s the point of writing it down at all? And we’re finally going to deliver on the promise of self-serve BI in ways that are profoundly different than what we’ve tried in the past.

    Source: The new philosophers | Benn Stancil

    The madman is the man who has lost everything except his reason

    I always enjoy reading L.M. Sacasas' thoughts on the intersection of technology, society, and ethics. This article is no different. In addition to the quotation from G.K. Chesterton which provides the title for this post, Sacasas also quotes Wendell Berry as saying, “It is easy for me to imagine that the next great division of the world will be between people who wish to live as creatures and people who wish to live as machines."

    While I’ve chosen to highlight the part riffing off David Noble’s discussion of technology as religion, I’d highly recommend reading the last three paragraphs of Sacasas' article. In it, he talks about AI as being “the culmination of a longstanding trajectory… [towards] the eclipse of the human person”.

    AI created with Midjourney prompt: "religion of technology | manga | hypnotic --aspect 3:2 --no text words letters signatures"

    The late David Noble’s The Religion of Technology: The Divinity of Man and the Spirit of Invention, first published in 1997, is a book that I turn to often. Noble was adamant about the sense in which readers should understand the phrase “religion of technology.” “Modern technology and modern faith are neither complements nor opposites,” Noble argued, “nor do they represent succeeding stages of human development. They are merged, and always have been, the technological enterprise being, at the same time, an essentially religious endeavor.”

    […]

    The Enlightenment did not, as it turns out, vanquish Religion, driving it far from the pure realms of Science and Technology. In fact, to the degree that the radical Enlightenment’s assault on religious faith was successful, it empowered the religion of technology. To put this another way, the Enlightenment—and, yes, we are painting with broad strokes here—did not do away with the notions of Providence, Heaven, and Grace. Rather, the Enlightenment re-framed these as Progress, Utopia, and Technology respectively. If heaven had been understood as a transcendent goal achieved with the aid of divine grace within the context of the providentially ordered unfolding of human history, it became a Utopian vision, a heaven on earth, achieved by the ministrations of Science and Technology within the context of Progress, an inexorable force driving history toward its Utopian consummation.

    […]

    In other words, we might frame the religion of technology not so much as a Christian heresy, but rather as (post-)Christian fan-fiction, an elaborate imagining of how the hopes articulated by the Christian faith will materialize as a consequence of human ingenuity in the absence of divine action.

    Source: Apocalyptic AI | The Convivial Society

    Image: Midjourney (see alt text for prompt)

    Censorship and the porn tech stack

    They say that technical innovation often comes from the porn industry, but the same is true of new forms of censorship.

    For those who don’t know or remember, Tumblr used to have a policy around porn that was literally “Go nuts, show nuts. Whatever.” That was memorable and hilarious, and for many people, Tumblr both hosted and helped with the discovery of a unique type of adult content.

    […]

    [N]o modern internet service in 2022 can have the rules that Tumblr did in 2007. I am personally extremely libertarian in terms of what consenting adults should be able to share, and I agree with “go nuts, show nuts” in principle, but the casually porn-friendly era of the early internet is currently impossible….

    […]

    If you wanted to start an adult social network in 2022, you’d need to be web-only on iOS and side load on Android, take payment in crypto, have a way to convert crypto to fiat for business operations without being blocked, do a ton of work in age and identity verification and compliance so you don’t go to jail, protect all of that identity information so you don’t dox your users, and make a ton of money. I estimate you’d need at least $7 million a year for every 1 million daily active users to support server storage and bandwidth (the GIFs and videos shared on Tumblr use a ton of both) in addition to hosting, moderation, compliance, and developer costs.

    Source: Matt on Tumblr | Why “Go Nuts, Show Nuts” Doesn’t Work in 2022
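    Matt's quoted estimate boils down to a single ratio — at least $7 million a year per 1 million daily active users for storage and bandwidth — so it's easy to sketch. The linear scaling and the example user counts below are my own assumptions; the post only supplies the ratio.

```python
# Back-of-the-envelope annual infrastructure cost at the quoted rate.
# Assumes cost scales linearly with daily active users (my assumption).

COST_PER_MILLION_DAU_USD = 7_000_000  # annual, from the quoted estimate

def annual_infra_cost(daily_active_users: int) -> float:
    """Annual storage + bandwidth cost in USD at $7M per 1M DAU."""
    return daily_active_users / 1_000_000 * COST_PER_MILLION_DAU_USD

print(annual_infra_cost(1_000_000))   # 7000000.0
print(annual_infra_cost(10_000_000))  # 70000000.0
```

    And that's before hosting, moderation, compliance, and developer costs — which is rather the point of the quote.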

    Image: Alexander Grey on Unsplash

    The 2022 Drone Photo Awards

    I had a conversation with my neighbour this week about drones. They were pointing out how invasive drones can be, while I was talking about the amazing photographs they can take.

    Sure enough, later that day I came across this year’s Drone Photo Awards, and there are some absolute stunners in there. The ones of nature are, of course, amazing, but for some reason this one of a Dutch suburb grabbed me as my favourite.

    The annual Drone Photo Awards announced its 2022 winners earlier this month, releasing a remarkable collection of images that frame the world’s most alluring landscapes from a rarely-seen view. This year’s contest garnered submissions from 2,624 participants hailing from 116 countries, and the aerial photos capture a vast array of life on Earth, including a caravan of camel shadows crossing the Arabian Desert, a waterlily harvest in West Bengal, and the veiny trails of lava emerging from a fissure near Iceland’s Fagradalsfjall volcano.

    Source: From a Volcanic Fissure to a Waterlily Harvest, the 2022 Drone Photo Awards Captures Earth’s Stunning Sights from Above | Colossal

    Technological Liturgies

    A typically thoughtful article from L. M. Sacasas in which they “explore a somewhat eccentric frame by which to consider how we relate to our technologies, particularly those we hold close to our bodies.” It’s worth reading the whole thing, especially if you grew up in a church environment as it will have particular resonance.

    Pastoral scene

    I would propose that we take a liturgical perspective on our use of technology. (You can imagine the word “liturgical” in quotation marks, if you like.) The point of taking such a perspective is to perceive the formative power of the practices, habits, and rhythms that emerge from our use of certain technologies, hour by hour, day by day, month after month, year in and year out. The underlying idea here is relatively simple but perhaps for that reason easy to forget. We all have certain aspirations about the kind of person we want to be, the kind of relationships we want to enjoy, how we would like our days to be ordered, the sort of society we want to inhabit. These aspirations can be thwarted in any number of ways, of course, and often by forces outside of our control. But I suspect that on occasion our aspirations might also be thwarted by the unnoticed patterns of thought, perception, and action that arise from our technologically mediated liturgies. I don’t call them liturgies as a gimmick, but rather to cast a different, hopefully revealing light on the mundane and commonplace. The image to bear in mind is that of the person who finds themselves handling their smartphone as others might their rosary beads.

    […]

    Say, for example, that I desire to be a more patient person. This is a fine and noble desire. I suspect some of you have desired the same for yourselves at various points. But patience is hard to come by. I find myself lacking patience in the crucial moments regardless of how ardently I have desired it. Why might this be the case? I’m sure there’s more than one answer to this question, but we should at least consider the possibility that my failure to cultivate patience stems from the nature of the technological liturgies that structure my experience. Because speed and efficiency are so often the very reason why I turn to technologies of various sorts, I have been conditioning myself to expect something approaching instantaneity in the way the world responds to my demands. If at every possible point I have adopted tools and devices which promise to make things faster and more efficient, I should not be surprised that I have come to be the sort of person who cannot abide delay and frustration.

    […]

    The point of the exercise is not to divest ourselves of such liturgies altogether. Like certain low church congregations that claim they have no liturgies, we would only deepen the power of the unnoticed patterns shaping our thought and actions. And, more to the point, we would be ceding this power not to the liturgies themselves, but to the interests served by those who have crafted and designed those liturgies. My loneliness is not assuaged by my habitual use of social media. My anxiety is not meaningfully relieved by the habit of consumption engendered by the liturgies crafted for me by Amazon. My health is not necessarily improved by compulsive use of health tracking apps. Indeed, in the latter case, the relevant liturgies will tempt me to reduce health and flourishing to what the apps can measure and quantify.

    Source: Taking Stock of Our Technological Liturgies | The Convivial Society

    Amazon as a dumb pipe

    I like this idea from Cory Doctorow, but monopolies tend to exploit their monopoly position. Still, it might be a way for Amazon to avoid closer scrutiny from regulators?

    But what if buying local was as easy as shopping on Amazon? What if you could buy local while shopping on Amazon?

    Source: Let’s Make Amazon Into a Dumb Pipe | by Cory Doctorow | Medium
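
    Doctorow’s “dumb pipe” idea is essentially an interoperability layer: the platform’s product identifier becomes a neutral key that any seller, including a local shop, can answer to. As a purely illustrative sketch (all seller names, prices, and the catalogue structure here are hypothetical, not any real Amazon API):

    ```python
    # Hypothetical sketch of a "dumb pipe": a product ID from one platform
    # is used to look up offers from many sellers, preferring local ones.
    from dataclasses import dataclass

    @dataclass
    class Offer:
        seller: str
        price: float
        local: bool  # True if the seller is a nearby independent shop

    # Toy interoperability layer: product ID -> offers from multiple sellers.
    CATALOGUE = {
        "B000EXAMPLE": [
            Offer("Amazon", 12.99, local=False),
            Offer("Corner Bookshop", 13.50, local=True),
        ],
    }

    def buy_local_if_possible(product_id: str) -> Offer:
        """Prefer the cheapest local seller; fall back to any seller."""
        offers = CATALOGUE.get(product_id, [])
        local_offers = [o for o in offers if o.local]
        if local_offers:
            return min(local_offers, key=lambda o: o.price)
        return min(offers, key=lambda o: o.price)

    print(buy_local_if_possible("B000EXAMPLE").seller)  # -> Corner Bookshop
    ```

    The point of the sketch is only that, once the identifier is shared, “buy local” becomes a routing decision rather than a separate shopping trip.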

    Recalling generative and liberating uses of technology

    I found myself using the phrase “the night is darkest before dawn” today. This post from Anne-Marie Scott is certainly an example of that, and I too look forward to a world beyond “today’s dogpile of an internet”.

    I remember a time when I got excited about generative and liberating uses of technology, enabling people to bring their whole selves to learning, being able to incorporate their world, their context, their knowledge, and in turn develop new connections, new communities, and new knowledge to further explore and build on these things. I think this is still possible, and I think work around open practices, open pedagogies, ethics of care, and decolonisation point the way towards how to do it in today’s dogpile of an internet.

    Source: Hitting the wall and maybe working out how to get back up again | A placid island of ignorance…

    Optimising for feelings, ceding control to the individual

    It would be easy to dismiss this as the musings of a small company before they get to scale. However, what I like about it is that the three things they suggest for software developers (look inward, look away from your screen, cede control to the individual) actually constitute very good advice.

    So, if not numbers, what might we optimize for when crafting software?

    If we’ve learned anything, it’s that all numerical metrics will be gamed, and that by default these numbers lack soul. After all, a life well-lived means something a little different to almost everyone. So it seems a little funny that the software we use almost every waking hour has the same predetermined goals for all of us in mind.

    In the end, we decided that we didn’t want to optimize for numbers at all. We wanted to optimize for feelings.

    While this may seem idealistic at best or naive at worst, the truth is that we already know how to do this. The most profound craftsmanship in our world across art, design, and media has long revolved around feelings.

    […]

    You see — if software is to have soul, it must feel more like the world around it. Which is the biggest clue of all that feeling is what’s missing from today’s software. Because the value of the tools, objects, and artworks that we as humans have surrounded ourselves with for thousands of years goes so far beyond their functionality. In many ways, their primary value might often come from how they make us feel by triggering a memory, helping us carry on a tradition, stimulating our senses, or just creating a moment of peace.

    Source: Optimizing For Feelings | The Browser Company

    Assume that your devices are compromised

    I was in Catalonia in 2017 during the independence referendum. I still believe the way people were treated when trying to exercise democratic power was shameful.

    These days, I run the most secure open operating system I can on my mobile device. And yet I still need to assume it's been compromised.

    In Catalonia, more than sixty phones—owned by Catalan politicians, lawyers, and activists in Spain and across Europe—have been targeted using Pegasus. This is the largest forensically documented cluster of such attacks and infections on record. Among the victims are three members of the European Parliament, including Solé. Catalan politicians believe that the likely perpetrators of the hacking campaign are Spanish officials, and the Citizen Lab’s analysis suggests that the Spanish government has used Pegasus. A former NSO employee confirmed that the company has an account in Spain. (Government agencies did not respond to requests for comment.) The results of the Citizen Lab’s investigation are being disclosed for the first time in this article. I spoke with more than forty of the targeted individuals, and the conversations revealed an atmosphere of paranoia and mistrust. Solé said, “That kind of surveillance in democratic countries and democratic states—I mean, it’s unbelievable.”

    […]

    [T]here is evidence that Pegasus is being used in at least forty-five countries, and it and similar tools have been purchased by law-enforcement agencies in the United States and across Europe. Cristin Flynn Goodwin, a Microsoft executive who has led the company’s efforts to fight spyware, told me, “The big, dirty secret is that governments are buying this stuff—not just authoritarian governments but all types of governments.”

    […]

    The Citizen Lab’s researchers concluded that, on July 7, 2020, Pegasus was used to infect a device connected to the network at 10 Downing Street, the office of Boris Johnson, the Prime Minister of the United Kingdom. A government official confirmed to me that the network was compromised, without specifying the spyware used. “When we found the No. 10 case, my jaw dropped,” John Scott-Railton, a senior researcher at the Citizen Lab, recalled. “We suspect this included the exfiltration of data,” Bill Marczak, another senior researcher there, added. The official told me that the National Cyber Security Centre, a branch of British intelligence, tested several phones at Downing Street, including Johnson’s. It was difficult to conduct a thorough search of phones—“It’s a bloody hard job,” the official said—and the agency was unable to locate the infected device. The nature of any data that may have been taken was never determined.

    Source: How Democracies Spy On Their Citizens | The New Yorker

    The future of the web, according to Mozilla

    There’s nothing particularly wrong with this document. It’s just not very exciting. Maybe that’s OK.

    Mozilla's mission is to ensure that the Internet is a global public resource, open and accessible to all. We believe in an Internet that puts people first, where individuals can shape their own experience and are empowered, safe, and independent.

    The Internet itself is low-level infrastructure — a connective backbone upon which other things are built. It’s essential that this backbone remains healthy, but it’s also not enough. People don’t experience the Internet directly. Rather, they experience it through the technology, products, and ecosystems built on top of it. The most important such system is the Web, which is by far the largest open communication system ever built.

    This document describes our vision for the Web and how we intend to pursue that vision. We don’t have all the answers today, and we expect this vision to evolve over time as we identify new challenges and opportunities. We welcome collaboration — both in realizing this vision, and in expanding it in service of our mission.

    Source: Mozilla’s vision for the evolution of the Web
