Thought Shrapnel

Jun 11, 2024

In the English language, a human alone has distinction while all other living beings are lumped with the nonliving “its.”

Two jaguars lounging on a moss-covered tree branch in a misty tropical forest, surrounded by dense vegetation.

I posted on social media recently that I want more verbs and fewer nouns in my life. This article, via Dense Discovery, backs this sentiment up, with reference to the indigenous heritage of the author of Braiding Sweetgrass.

Grammar, especially our use of pronouns, is the way we chart relationships in language and, as it happens, how we relate to each other and to the natural world.

[...]

[...]

We have a special grammar for personhood. We would never say of our late neighbor, “It is buried in Oakwood Cemetery.” Such language would be deeply disrespectful and would rob him of his humanity. We use instead a special grammar for humans: we distinguish them with the use of he or she, a grammar of personhood for both living and dead Homo sapiens. Yet we say of the oriole warbling comfort to mourners from the treetops or the oak tree herself beneath whom we stand, “It lives in Oakwood Cemetery.” In the English language, a human alone has distinction while all other living beings are lumped with the nonliving “its.”

There are words for states of being that have no equivalent in English. The language that my grandfather was forbidden to speak is composed primarily of verbs, ways to describe the vital beingness of the world. Both nouns and verbs come in two forms, the animate and the inanimate. You hear a blue jay with a different verb than you hear an airplane, distinguishing that which possesses the quality of life from that which is merely an object.

[...]

Linguistic imperialism has always been a tool of colonization, meant to obliterate history and the visibility of the people who were displaced along with their languages... Because we speak and live with this language every day, our minds have also been colonized by this notion that the nonhuman living world and the world of inanimate objects have equal status. Bulldozers, buttons, berries, and butterflies are all referred to as it, as things, whether they are inanimate industrial products or living beings.

Source: Orion Magazine

Jun 11, 2024

The logical conclusion of rich, isolated computer programmers having ketamine orgies with each other

Geometric crystal with a sparkling eruption of pink particles on a soft pink background, resembling a stylized, fantastical display.

Ryan Broderick with a reality check about OpenAI and GenAI in general:

I think this [Effective Altruists vs effective accelerationists debate] is all very silly. I also think this is the logical conclusion of rich, isolated computer programmers having ketamine orgies with each other. But it does, unfortunately, underpin every debate you’re probably seeing about the future of AI. Silicon Valley’s elite believe in these ideas so devoutly that Google is comfortable sacrificing its own business in pursuit of them. Even though EA and e/acc are effectively just competing cargo cults for a fancy autocorrect. Though, they also help alleviate some of the intense pressure huge tech companies are under to stay afloat in the AI arms race. Here’s how it works.

[...]

Analysts told The Information last year that OpenAI’s ChatGPT is possibly costing the company up to $700,000 a day to operate. Sure, Microsoft invested $13 billion in the company and, as of February, OpenAI was reportedly projecting $2 billion in revenue, but it’s not just about maintaining what you’ve built. The weird nerds I mentioned above have all decided that the finish line here is “artificial general intelligence,” or AGI, a sentient AI model. Which is actually very funny because now every major tech company has to burn all of their money — and their reputations — indefinitely, as they compete to build something that is, in my opinion, likely impossible (don’t @ me). This has largely manifested as a monthly drum beat of new AI products no one wants rolling out with increased desperation. But you know what’s cheaper than churning out new models? “Scaring” investors.

[...]

This is why OpenAI lets CEO Sam Altman walk out on stages every few weeks and tell everyone that its product will soon destroy the economy forever. Because every manager and executive in America hears that and thinks, “well, everyone will lose their jobs but me,” and continues paying for their ChatGPT subscription. As my friend Katie Notopoulos wrote in Business Insider last week, it’s likely this is the majority of what Altman’s role is at OpenAI. Doomer in chief.

[...]

I’ve written this before, but I’m going to keep repeating it until the god computer sends me to cyber hell: The “two” “sides” of the AI “debate” are not real. They both result in the same outcome — an entire world run by automations owned by the ultra-wealthy. Which is why the most important question right now is not, “how safe is this AI model?” It’s, “do we even need it?”

Source: Garbage Day

Image: Google DeepMind

Jun 11, 2024

The latest Hardcore History just dropped

A digitally composed image featuring a woman holding a white owl, dressed in an ancient gold-toned attire, beside a goblet, with mystical forest elements and a lightning strike in the background, evoking a mythical atmosphere. The text 'Dan Carlin's Mania for Subjugation' floats above in distressed red lettering.

I could listen to Dan Carlin read the phone book all day, so the announcement that his latest multi-part (and multi-hour!) series for the Hardcore History podcast has started is great news!

So, after almost two decades of teasing it, we finally begin the Alexander the Great saga.

I have no idea how many parts it will turn out to be, but we are calling the series “Mania for Subjugation” and you can get the first installment HERE. (of course you can also auto-download it through your regular podcast app).

[...]

And what a story it is! My go-to example in any discussion about how truth is better than fiction. It is such a good tale and so mind blowing that more than 2,300 years after it happened our 21st century people still eagerly consume books, movies, television shows and podcasts about it. Alexander is one of the great apex predators of history, and he has become a metaphor for all sorts of Aesop fables-like morals-to-the-story about how power can corrupt and how too much ambition can be a poison.

Source: Look Behind You!

Jun 12, 2024

The iPhone effect, if it was ever real in the first place, is certainly not real now.

Apple logo over an Apple store

It's announcement time at Apple's WWDC. And apart from trying to rebrand AI as "Apple Intelligence", I haven't seen many people get very excited about it. MKBHD has an overview if you want to get into the details. I just use macOS without iCloud because everything works and my Mac Studio is super-fast.

Ryan Broderick has a word for Apple fanboys, who seem to think that everything Apple touches turns to gold. It seems the Vision Pro hasn't brought VR mainstream and, after the more innovative Steve Jobs era, the company appears happier being a luxury brand that plays it relatively safe.

If you press Apple fanboys about their weird revisionist history, they usually pivot to the argument that while iOS’s marketshare has essentially remained flat for a decade, their competitors copy what they do and that trickles down into popular culture from there. Which I’m not even sure is true either. Android had mobile payments three years before Apple, had a smartwatch a year before, a smart speaker a year before, and launched a tablet around the same time as the iPad. We could go on and on here.

And, I should say, I don’t actually think Apple sees themselves as the great innovator their Gen X blogger diehards do. In the 2010s, they shifted comfortably from a visionary tastemaker, at least aesthetically, into something closer to an airport lounge or a country club for consumer technology. They’ll eventually have a version of the new thing you’ve heard about, once they can rebrand it as something uniquely theirs. It’s not VR, it’s “spatial computing,” it’s not AI, it’s “Apple Intelligence”. But they’re not going to shake the boat. They make efficiently-bundled software that’s easy to use (excluding iPadOS) and works well across their nice-looking and easy-to-use devices (excluding the iPad). Which is why Apple Intelligence is not going to be the revolution the AI industry has been hoping for. The same way the Vision Pro wasn’t. The iPhone effect, if it was ever real in the first place, is certainly not real now.

Source: Garbage Day

Jun 12, 2024

A shepherd surrounded by a densely packed flock of multi-colored sheep, viewed from above, as he uses a long staff to manage them.

Source: 2024 Drone Photo Awards Nominees

Jun 14, 2024

Dividers tell the story of how they’ve renovated their houses, becoming architects along the way. Continuers tell the story of an august property that will remain itself regardless of what gets built.

Four-panel illustration showing different life stages on a tree: a child sitting, a teenager climbing, an adult leaning, and an older person sitting.

This long article in The New Yorker is based around the author wondering whether the fun he's had playing with his four-year-old will be remembered by his son when he grows up.

Wondering whether you are the same person at the start and end of your life was a central theme of a 'Mind, Brain, and Personal Identity' course I did as part of my Philosophy degree around 22 years ago. I still think about it. On the one hand is the Ship of Theseus argument, where you can one-by-one replace all of the planks of a ship, but it's still the same ship. If you believe it's the same ship, and believe that you're the same person as when you were younger, then the author of this article would call you a 'Continuer'.

On the other hand, you might think that there are important differences between the person you are now and the person you were when you were younger. If, for example, the general can't remember 'going over the top' as a young man, despite still having the medal to prove it, is he the same person? If you don't think so, then perhaps you are a 'Divider'.

I don't consider it so clean cut. We tell stories about ourselves and others, and these shape how we think. For example, going to therapy five years ago helped me 'remove the mask' and reconsider who I am. That involved reframing some of the experiences in my life and realising that I am this kind of person rather than that kind of person.

It's absolutely fine to have seasons in your life. In fact, I'm pretty sure there's some ancient wisdom to that effect?

Are we the same people at four that we will be at twenty-four, forty-four, or seventy-four? Or will we change substantially through time? Is the fix already in, or will our stories have surprising twists and turns? Some people feel that they’ve altered profoundly through the years, and to them the past seems like a foreign country, characterized by peculiar customs, values, and tastes. (Those boyfriends! That music! Those outfits!) But others have a strong sense of connection with their younger selves, and for them the past remains a home. My mother-in-law, who lives not far from her parents’ house in the same town where she grew up, insists that she is the same as she’s always been, and recalls with fresh indignation her sixth birthday, when she was promised a pony but didn’t get one. Her brother holds the opposite view: he looks back on several distinct epochs in his life, each with its own set of attitudes, circumstances, and friends. “I’ve walked through many doorways,” he’s told me. I feel this way, too, although most people who know me well say that I’ve been the same person forever.

[...]

The philosopher Galen Strawson believes that some people are simply more “episodic” than others; they’re fine living day to day, without regard to the broader plot arc. “I’m somewhere down towards the episodic end of this spectrum,” Strawson writes in an essay called “The Sense of the Self.” “I have no sense of my life as a narrative with form, and little interest in my own past.”

[...]

John Stuart Mill once wrote that a young person is like “a tree, which requires to grow and develop itself on all sides, according to the tendency of the inward forces which make it a living thing.” The image suggests a generalized spreading out and reaching up, which is bound to be affected by soil and climate, and might be aided by a little judicious pruning here and there.

Source: The New Yorker


Jun 14, 2024

'Wet streets cause rain' stories

Digital artwork of a brain surrounded by a network of interconnected nodes and icons, including social media and technology symbols.

First things first: the George Orwell quotation below is spurious, as the author of this article, David Cain, points out at the end of it. The point is that it sounds plausible, so we take it on trust. It confirms our worldview.

We live in a web of belief, as W.V. Quine put it, meaning that we easily accept things that confirm our core beliefs. More peripheral beliefs, by contrast, we pick up and put down at no great cost. Finding out that the capital of Burkina Faso is Ouagadougou and not Bobo-Dioulasso makes no practical difference to my life. It would make a huge difference to the residents of either city, however.

I don't like misinformation, and I think we're in quite a dangerous time in terms of how it might affect democratic elections. However, it has always been so: gossip, rumour, and straight-up lies have swayed human history. The thing is, just as we are able to refute poor journalism and false statements on social networks about issues we know a lot about, so we need to be a bit sceptical about things outside our immediate knowledge.

After all, as Cain quotes Michael Crichton as saying, there are plenty of 'wet streets cause rain' stories out there, getting causality exactly backwards — intentionally or otherwise.

Consider the possibility that most of the information being passed around, on whatever topic, is bad information, even where there’s no intentional deception. As George Orwell said, “The most fundamental mistake of man is that he thinks he knows what’s going on. Nobody knows what’s going on.”

Technology may have made this state of affairs inevitable. Today, the vast majority of a person’s worldview is assembled from second-hand sources, not from their own experience. Second-hand knowledge, from “reliable” sources or not, usually functions as hearsay – if it seems true, it is immediately incorporated into one’s worldview, usually without any attempt to substantiate it. Most of what you “know” is just something you heard somewhere.

[...]

It makes perfect sense, if you think about it, that reporting is so reliably unreliable. Why do we expect reporters to learn about a suddenly newsworthy situation, gather information about it under deadline, then confidently explain the subject to the rest of the nation after having known about it for all of a week? People form their entire worldviews out of this stuff.

[...]

People do know things though. We have airplanes and phones and spaceships. Clearly somebody knows something. Human beings can be reliable sources of knowledge, but only about small slivers of the whole of what’s going on. They know things because they deal with their sliver every day, and they’re personally invested in how well they know their sliver, which gives them constant feedback on the quality of their beliefs.

Source: Raptitude

Jun 15, 2024

It's impossible to 'hang out' on the internet, because it is not a place

Two young people sitting on the ground with their backs to a car, sharing an earbud each

I spend a lot of time online, but do I 'hang out' there? I certainly hang out with people playing video games, but that's online rather than on the internet. Drew Austin argues that because of the amount of money and algorithms on the internet, it's impossible to hang out there.

I'm not sure. It depends on your definition of 'hanging out' and it also depends whether you're just focusing on mainstream services, or whether you're including the Fediverse and niche things such as School of the Possible. The latter, held every Friday by Dave Grey, absolutely is 'hanging out', but whether Zoom calls with breakout rooms count as the internet depends on semantics, I guess.

Is “hanging out” on the internet truly possible? I will argue: no it’s not. We’re bombarded with constant thinkpieces about various social crises—young people are sad and lonely; culture is empty or flat or simply too fragmented to incubate any shared meaning; algorithms determine too much of what we see. Some of these essays even note our failure to hang out. The internet is almost always an implicit or explicit villain in such writing but it’s increasingly tedious to keep blaming it for our cultural woes.

Perhaps we could frame the problem differently: The internet doesn’t have to demand our presence the way it currently does. It shouldn’t be something we have to look at all the time. If it wasn’t, maybe we’d finally be free to hang out.

[...]

How many hours have been stolen from us? With TV, we at least understood ourselves to be passive observers of the screen, but the interactive nature of the internet fostered the illusion that message boards, Discord servers, and Twitter feeds are digital “places” where we can in fact hang out. If nothing else, this is a trick that gets us to stick around longer. A better analogy for online interaction, however, is sitting down to write a letter to a friend—something no one ever mistook for face-to-face interaction—with the letters going back and forth so rapidly that they start to resemble a real-time conversation, like a pixelated image. Despite all the spatial metaphors in which its interfaces have been dressed up, the internet is not a place.

Source: Kneeling Bus

Image: Wesley Tingey

Jun 15, 2024

The writer’s equivalent of what in computer architecture is called speculative execution

'World as it actually is' with multiple lines and blobs. Slightly chaotic.

As ever with Venkatesh Rao's posts, there's a lot going on with this one. Ostensibly, it's about the third anniversary of the most recent iteration of his newsletter, but along the way he discusses the current state of the world. It's a post worth reading for the latter reason, but I'm focused here on the role of writing and publishing online, which I do quite a lot.

Rao tries to fit his posts into one of five narrative scaffoldings, which cover different time spans; everything else falls by the wayside. I do the opposite: I just publish everything so that I've got a URL for each thought, and I can weave it all together on demand later. I think Cory Doctorow is a bit more like that, too (although more organised and a much better writer than me!).

As a writer, you cannot both react to the world and participate in writing it into existence — the bit role in “inventing the future” writers get to play — at the same time. My own approach to resolving this tension has been to use narratives at multiple time scales as scaffolding for sense-making. Events that conform (but not necessarily confirm) to one or more of my narratives leave me with room to develop my ab initio creative projects. Events that either do not conform to my narratives or simply fall outside of them entirely tend to derail or drain my creative momentum. It is the writer’s equivalent of what in computer architecture is called speculative execution. If you’re right enough, often enough, as a writer, you can have your cake and eat it too — react to the world, and say what you want to say at the same time.

[...]

Writing seemed like a more culturally significant, personally satisfying, aesthetically appropriate, and existentially penetrating thing to be doing in 2014 than it does now in 2024. I think we live in times when writing has less of a role to play in inventing the future, for a variety of reasons. You have to work harder at it, for less reward, in a smaller role. Fortunately for my sanity, writing is not the only thing I do with my life.

[...]

Maybe we’re just at the end of a long arc of 25 years or so, when writing online was exceptionally culturally significant and happened to line up with my most productive writing years, and the other shoe has dropped on the story of “blogging.”

Source: Ribbonfarm Studio

Jun 15, 2024

It's all just one big ocean

A watercolour image showing how all of the world's oceans are connected

Source: Fix The News

Credit: Natalie Renier/Woods Hole Oceanographic Institution