Tag: memory

Reafferent loops

In his book Other Minds, Peter Godfrey-Smith cites work from 1950 by the German physiologists Erich von Holst and Horst Mittelstaedt.

They used the term afference to refer to everything you take in through the senses. Some of what comes in is due to the changes in the objects around you — that is exafference… — and some of what comes in is due to your own actions: that is reafference.

Peter Godfrey-Smith, Other Minds, p.154

Godfrey-Smith is talking about octopuses and other cephalopods, but I think what he’s discussing is interesting from a digital note-taking point of view.

To write a note and read it is to create a reafferent loop. Rather than wanting to perceive only the things that are not due to you — finding the exafferent among the noise in the senses — you want what you read to be entirely due to your previous action. You want the contents of the note to be due to your acts rather than someone else’s meddling, or the natural decay of the notepad. You want the loop between present action and future perception to be firm. This enables you to create a form of external memory — as was, almost certainly, the role of much early writing (which is full of records of goods and transactions), and perhaps also the role of some early pictures, though that is much less clear.

When a written message is directed at others, it’s ordinary communication. When you write something for yourself to read, there’s usually an essential role for time — the goal is memory, in a broad sense. But memory like this is a communicative phenomenon; it is communication between your present self and a future self. Diaries and notes-to-self are embedded in a sender/receiver system just like more standard forms of communication.

Peter Godfrey-Smith, Other Minds, pp.154-155

Some people talk about digital note-taking as a form of ‘second brain’. Given the kind of distributed cognition that Godfrey-Smith highlights in Other Minds, that seems to be exactly what’s happening: by writing and re-reading notes, we create reafferent loops that serve as external memory.

Very interesting.

If you have been put in your place long enough, you begin to act like the place

Image: an astronaut on the moon with an anarchist flag planted

📉 Of Flying Cars and the Declining Rate of Profit

💪 How to walk upright and stop living in a cave

🤔 It’s Not About Intention, It’s About Action

💭 Are we losing our ability to remember?

🇺🇸 How The Presidential Candidates Spy On Their Supporters


Quotation-as-title by Randall Jarrell. Image from top-linked post.

Remembering the past through photos

A few weeks ago, I bought a Google Assistant-powered smart display and put it in our kitchen in place of the DAB radio. It has the added bonus of cycling through all of my Google Photos, which stretch back as far as when my wife and I were married, 15 years ago.

This part of its functionality makes it, of course, just a cloud-powered digital photo frame. But I think it’s easy to underestimate the power that these things have. About an hour before composing this post, for example, my wife took a photo of a photo(!) that appeared on the display, showing me on the beach with our two children when they were very small.

An article by Giuliana Mazzoni in The Conversation points out that our ability to whip out a smartphone at any given moment and take a photo changes our relationship to the past:

We use smart phones and new technologies as memory repositories. This is nothing new – humans have always used external devices as an aid when acquiring knowledge and remembering.

[…]

Nowadays we tend to commit very little to memory – we entrust a huge amount to the cloud. Not only is it almost unheard of to recite poems, even the most personal events are generally recorded on our cellphones. Rather than remembering what we ate at someone’s wedding, we scroll back to look at all the images we took of the food.

Mazzoni points out that this can be problematic, as memory is important for learning. However, there may be a “silver lining”:

Even if some studies claim that all this makes us more stupid, what happens is actually shifting skills from purely being able to remember to being able to manage the way we remember more efficiently. This is called metacognition, and it is an overarching skill that is also essential for students – for example when planning what and how to study. There is also substantial and reliable evidence that external memories, selfies included, can help individuals with memory impairments.

But while photos can in some instances help people to remember, the quality of the memories may be limited. We may remember what something looked like more clearly, but this could be at the expense of other types of information. One study showed that while photos could help people remember what they saw during some event, they reduced their memory of what was said.

She goes on to discuss the impact that viewing many photos from your past has on a malleable sense of self:

Research shows that we often create false memories about the past. We do this in order to maintain the identity that we want to have over time – and avoid conflicting narratives about who we are. So if you have always been rather soft and kind – but through some significant life experience decide you are tough – you may dig up memories of being aggressive in the past or even completely make them up.

I’m not so sure that it’s a good thing to tell yourself the wrong story about who you are. For example, although I grew up in, and identified with, a macho ex-mining town environment, I’ve become happier by realising that my identity is separate to that.

I suppose it’s a bit different for me, as most of the photos I’m looking at are of me with my children and/or my wife. However, I still have to tell myself a story of who I am as a husband and a father, so in many ways it’s the same.

All in all, I love the fact that we can take photos anywhere and at any time. We may need to evolve social norms around the most appropriate ways of capturing images in crowded situations, but that’s separate to the very great benefit which I believe they bring us.

Source: The Conversation