'Personalisation' is something that humans do

    Audrey Watters, formerly the ‘Cassandra’ of edtech, is now writing about health, nutrition, and fitness technologies at Second Breakfast. It’s great, I’m a paid subscriber.

    In this article, she looks at the overlap between her former and current fields, comparing and contrasting coaches and educators with algorithms. While I don’t share her loathing of ChatGPT, as an educator and a parent I’d definitely agree that motivation and attention are things to which a human is (currently) best suited.

    How well does a teacher or trainer or coach know how you feel, how well you performed, or what you should do or learn next? How well does an app know how you feel, how well you performed, or what you should do next? Digital apps insist that, thanks to the data they collect, they can make better, more precise recommendations than humans ever can — dismissing what humans do as “one size fits all.” Yet it's impossible to scrutinize their algorithmic decision-making. Ideally, at least, you can always ask your coach, "Why the hell am I doing Bulgarian split squats?! These suck." And she will tell you precisely why. (Love you, Coach KB.)

    And then (ideally) she’ll say, “If you don’t want to do them, you don’t have to.” And (ideally), she’ll ask you what’s going on. Maybe you feel like shit that day. Maybe you don’t have time. Maybe they hurt your hamstrings. Maybe you’d like to hear some options — other exercises you can do instead. Maybe you’d like to know why she prescribed this exercise in the first place — “it’s a unilateral exercise, and as a runner,” she says, “we want to work on single-leg strength, with a focus on your glute medius and adductors because I’ve noticed, by watching your barbell squats, that those areas are your weak spots.” This is how things get “personalized” — not by some massive data extraction and analysis, but by humans asking each other questions and then tailoring our responses and recommendations accordingly. Teachers and coaches do this every. goddamn. day. Sure, there’s a training template or a textbook that one is supposed to follow; but good teachers and coaches check in, and they switch things up when they’re not really working.

    […]

    If we privilege these algorithms, we’re not only adopting their lousy recommendations; we’re undermining the expertise of professionals in the field. And we’re not only undermining the expertise of professionals in the field, we’re undermining our own ability to think and learn and understand our own bodies. We’re undermining our own expertise about ourselves. (ChatGPT is such a bad bad bad idea.)

    Source: Teacher/Coach as Algorithm | Second Breakfast