Fighting health disinformation on Wikipedia

    This is great to see:

    As part of efforts to stop the spread of false information about the coronavirus pandemic, Wikipedia and the World Health Organization announced a collaboration on Thursday: The health agency will grant the online encyclopedia free use of its published information, graphics and videos.

    Donald G. McNeil Jr., Wikipedia and W.H.O. Join to Combat Covid Misinformation (The New York Times)

    Compared to Twitter's dismal efforts at fighting disinformation, the collaboration is welcome news.

    The first W.H.O. items used under the agreement are its “Mythbusters” infographics, which debunk more than two dozen false notions about Covid-19. Future additions could include, for example, treatment guidelines for doctors, said Ryan Merkley, chief of staff at the Wikimedia Foundation, which produces Wikipedia.

    Donald G. McNeil Jr., Wikipedia and W.H.O. Join to Combat Covid Misinformation (The New York Times)

    More proof that the for-profit private sector is in no way more 'innovative' or effective than non-profits, NGOs, and government agencies.

    Seeing through is rarely seeing into

    Perceptions of the past

    The History teacher in me likes this simple photo quiz site that shows how your perception of the past can easily be manipulated by how photographs are presented.

    Gatekeepers of opportunity and the lottery of privilege

    Despite starting out as a pejorative term, 'meritocracy' is something that, until recently, few people seemed to have a problem with. One of the best explanations of why it's a problematic idea is in this Mozilla article from a couple of years ago. In short, meritocracy ascribes agency to those who were simply given opportunities because of pre-existing privilege.

    In an interview with The Chronicle of Higher Education, Michael Sandel makes some very good points about the American university system, points which apply more broadly to other western nations with elite universities, such as the UK.

    The meritocratic hubris of elites is the conviction by those who land on top that their success is their own doing, that they have risen through a fair competition, that they therefore deserve the material benefits that the market showers upon their talents. Meritocratic hubris is the tendency of the successful to inhale too deeply of their success, to forget the luck and good fortune that helped them on their way. It goes along with the tendency to look down on those less fortunate, and less credentialed, than themselves. That gives rise to the sense of humiliation and resentment of those who are left out.

    Michael Sandel, quoted in 'The Insufferable Hubris of the Well-Credentialed'

    As someone who is reasonably well-credentialed, I nevertheless see a fundamental problem with requiring a degree as an 'entry-level' qualification. That's why I first got interested in Open Badges nearly a decade ago.

    Despite the best efforts of the community, elite universities have a vested interest in maintaining the status quo. Eventually, the whole edifice will come crashing down, but right now, those universities are the gatekeepers to opportunity.

    Society as a whole has made a four-year university degree a necessary condition for dignified work and a decent life. This is a mistake. Those of us in higher education can easily forget that most Americans do not have a four-year college degree. Nearly two-thirds do not.

    [...]

    We also need to reconsider the steep hierarchy of prestige that we have created between four-year colleges and universities, especially brand-name ones, and other institutions of learning. This hierarchy of prestige both reflects and exacerbates the tendency at the top to denigrate or depreciate the contributions to the economy made by people whose work does not depend on having a university diploma.

    So the role that universities have been assigned, sitting astride the gateway of opportunity and success, is not good for those who have been left behind. But I’m not sure it’s good for elite universities themselves, either.

    Michael Sandel, quoted in 'The Insufferable Hubris of the Well-Credentialed'

    Thankfully, Sandel has a rather delicious solution to decouple privilege from admission to elite universities. It's not a panacea, but I like it as a first step.

    What might we do about it? I make a proposal in the book that may get me in a lot of trouble in my neighborhood. Part of the problem is that having survived this high-pressured meritocratic gauntlet, it’s almost impossible for the students who win admission not to believe that they achieved their admission as a result of their own strenuous efforts. One can hardly blame them. So I think we should gently invite students to challenge this idea. I propose that colleges and universities that have far more applicants than they have places should consider what I call a “lottery of the qualified.” Over 40,000 students apply to Stanford and to Harvard for about 2,000 places. The admissions officers tell us that the majority are well-qualified. Among those, fill the first-year class through a lottery. My hunch is that the quality of discussion in our classes would in no way be impaired.

    The main reason for doing this is to emphasize to students and their parents the role of luck in admission, and more broadly in success. It’s not introducing luck where it doesn’t already exist. To the contrary, there’s an enormous amount of luck in the present system. The lottery would highlight what is already the case.

    Michael Sandel, quoted in 'The Insufferable Hubris of the Well-Credentialed'
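
    Sandel's proposal is simple enough to sketch in a few lines of code. Here's a minimal, hypothetical illustration in Python; the qualification test and the numbers are stand-ins for whatever bar admissions officers would actually set:

        import random

        def lottery_of_the_qualified(applicants, is_qualified, places, seed=None):
            # Filter to those who meet the bar, then fill the places by random draw.
            qualified = [a for a in applicants if is_qualified(a)]
            rng = random.Random(seed)  # seedable, so a draw could be audited or re-run
            return rng.sample(qualified, min(places, len(qualified)))

        # Roughly the numbers Sandel quotes: ~40,000 applicants for ~2,000 places,
        # with a made-up qualification test (pretend ~90% are well-qualified).
        admitted = lottery_of_the_qualified(
            applicants=range(40_000),
            is_qualified=lambda a: a % 10 != 0,
            places=2_000,
            seed=42,
        )
        print(len(admitted))  # 2000

    The point, of course, isn't the mechanism itself but what it makes explicit: once the qualification bar is met, the rest is luck.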

    Would people like me be worse off in a more egalitarian system? Probably. But that's kind of the point.

    Tedious sports

    This made me smile:

    You can divide most sports into those that take place in the real world (road cycling, sailing, cross country running) and those that are played on the artificial space of a court or pitch. Some (golf, croquet) occupy an uncertain middle ground, which may be one of the reasons they are so tedious to watch. Others (football, rugby) started as the former and, as they were codified, became the latter.

    Jon Day, Better on TV (London Review of Books)

    Man is equally incapable of seeing the nothingness from which he emerges and the infinity in which he is engulfed

    Biometric surveillance in a post-pandemic future

    I woke up today to the news that, in the UK, the police will get access to the data on people told to self-isolate on a 'case-by-case basis'. As someone pointed out on Mastodon, this was entirely predictable.

    They pointed to this article by Yuval Noah Harari from March of this year, which already feels like a decade ago. In it, he talks about post-pandemic society being a surveillance nightmare:

    You could, of course, make the case for biometric surveillance as a temporary measure taken during a state of emergency. It would go away once the emergency is over. But temporary measures have a nasty habit of outlasting emergencies, especially as there is always a new emergency lurking on the horizon. My home country of Israel, for example, declared a state of emergency during its 1948 War of Independence, which justified a range of temporary measures from press censorship and land confiscation to special regulations for making pudding (I kid you not). The War of Independence has long been won, but Israel never declared the emergency over, and has failed to abolish many of the “temporary” measures of 1948 (the emergency pudding decree was mercifully abolished in 2011). 

    Yuval Noah Harari: the world after coronavirus (The Financial Times)

    Remember the US 'war on terror'? That led to the incredible level of domestic and foreign surveillance revealed by Edward Snowden in 2013.

    The trouble, though, is that a threat to health is clear, visible, and present, whereas harms to privacy are nebulous and tend to lie in the future. So the trade-off is between the here and now and, well, the opposite.

    Even when infections from coronavirus are down to zero, some data-hungry governments could argue they needed to keep the biometric surveillance systems in place because they fear a second wave of coronavirus, or because there is a new Ebola strain evolving in central Africa, or because . . . you get the idea. A big battle has been raging in recent years over our privacy. The coronavirus crisis could be the battle’s tipping point. For when people are given a choice between privacy and health, they will usually choose health.

    Yuval Noah Harari: the world after coronavirus (The Financial Times)

    For me, as for Harari, the way that governments choose to deal with the pandemic shows their true colours.

    The coronavirus epidemic is thus a major test of citizenship. In the days ahead, each one of us should choose to trust scientific data and healthcare experts over unfounded conspiracy theories and self-serving politicians. If we fail to make the right choice, we might find ourselves signing away our most precious freedoms, thinking that this is the only way to safeguard our health.

    Yuval Noah Harari: the world after coronavirus (The Financial Times)

    Ethics is the result of the human will

    Sabelo Mhlambi is a computer scientist, researcher and Fellow at Harvard’s Berkman Klein Center for Internet & Society. He focuses on the ethical implications of technology in the developing world, particularly in Sub-Saharan Africa, and has written a great, concise essay on technological ethics in relation to the global north and south.

    Ethics is not missing in technology, rather we are witnessing the ethics in technology – the ethics of the powerful. The ethics of individualism.

    Mhlambi makes a number of important points, and I want to share three of them. First, he says that ethics is the result of human will, not algorithmic processes:

    Ethics should not be left to algorithmic definitions and processes, ultimately ethics is a result of the human will. Technology won’t save us. The abdication of social and environmental responsibility by creators of technology should not be allowed to become the norm.

    Second, technology is a driver of change in society, and, because technology is not neutral, we have individualism baked into the tools we use:

    Ethics describes one’s relationship and responsibilities to others and the environment. Ethics is the protocol for human interaction, with each other and with the world. Different ethical systems may be described through this scale: Individualistic systems promote one’s self assertion through the limitation of one’s relationship and responsibilities to others and the environment. In contrast, a more communal ethics asserts the self through the encouragement of one’s relationship and responsibilities to the community and the environment.

    This is, he says, a form of colonialism:

    Technology designed and deployed beyond its ethical borders poses a threat to social stability in different regions with different ethical systems, norms and values. The imposition of a society’s beliefs on another is colonial. This relationship can be observed even amongst members of the South as the more economically developed nations extend their technology and influence into less developed nations, the East to Africa relationship being an example.

    Third, over and above the individualism and colonialism, the technologies we use are unrepresentative because they do not take into account the lived experiences and views of marginalised groups:

    In the development and funding of technology, marginalized groups are underrepresented. Their values and views are unaccounted for. In the software industry marginalized groups make a minority of the labor force and leadership roles. The digital divide continues to increase when technology is only accessible through the languages of the well developed nations. 

    It's an important essay, and one that I'll no doubt be returning to in the weeks and months to come.

    Even while a thing is in the act of coming into existence, some part of it has already ceased to be

    Forward momentum above all things

    This page on a Brian Eno fan site was re-shared on Hacker News this week. It features text from an email from Eno himself, explaining why, although he's grateful that people want to discuss his work, he doesn't necessarily want to see that discussion:

    I think the reason I feel uncomfortable about such a thing is that it becomes a sort of weight on my shoulders. I start to feel an obligation to live up to something, instead of just following my nose wherever it wants to go at the moment. Of course success has many nice payoffs, but one of the disadvantages is that you start to be made to feel responsible for other people's feelings: what I'm always hearing are variations of "why don't you do more records like - (insert any album title)" or "why don't you do more work with - (insert any artist's name)?". I don't know why, these questions are unanswerable, why is it so bloody important to you, leave me alone....these are a few of my responses. But the most important reason is "If I'd followed your advice in the first place I'd never have got anywhere".

    Eno goes on to explain that being constantly reminded of your 'exhaust' (what you've already done) isn't very conducive to future creative work:

    I'm afraid to say that admirers can be a tremendous force for conservatism, for consolidation. Of course it's really wonderful to be acclaimed for things you've done - in fact it's the only serious reward, because it makes you think "it worked! I'm not isolated!" or something like that, and it makes you feel gratefully connected to your own culture. But on the other hand, there's a tremendously strong pressure to repeat yourself, to do more of that thing we all liked so much. I can't do that - I don't have the enthusiasm to push through projects that seem familiar to me (this isn't so much a question of artistic nobility or high ideals: I just get too bloody bored), but at the same time I do feel guilt for 'deserting my audience' by not doing the things they apparently wanted. I'd rather not feel this guilt, actually, so I avoid finding out about situations that could cause it.

    Finally, Eno explains that, just like everyone else, he has days when he wonders where the creative spark comes from:

    The problem is that people nearly always prefer what I was doing a few years earlier - this has always been true. The other problem is that so, often, do I! Discovering things is clumsy and sporadic, and the results don't at first compare well with the glossy and lauded works of the past. You have to keep reminding yourself that they went through that as well, otherwise they become frighteningly accomplished. That's another problem with being made to think about your own past - you forget its genesis and start to feel useless awe towards your earlier self: "How did I do it? Wherever did these ideas come from?". Now, the workaday everyday now, always looks relatively less glamorous than the rose-tinted then (except for those magic hours when your finger is right on the pulse, and those times only happen when you've abandoned the lifeline of your own history).

    Being creative comes not from looking back, but from looking forward. As the enigmatic Taylor, a character in the TV series Billions, states in one episode, we should prize "forward momentum above all things".

    We all think we are exceptional, and are surprised to find ourselves criticised just like anyone else

    Scenario planning, climate change, and the pandemic

    Tim O'Reilly is a funny character: massively talented and influential, but his political views (broadly right-libertarian) seem to mean he misses things, even when he's otherwise heading in the right direction.

    In a long article published recently, O'Reilly introduces his readers to scenario planning from a very US-centric point of view. It's also a position that, on first reading at least, is a bit techno-solutionist.

    He starts by explaining that, although we label decades and centuries in particular ways ("the 90s", "the twentieth century"), it's actually cataclysmic events that define the start and end of eras:

    So, when you read stories—and there are many—speculating or predicting when and how we will return to “normal”, discount them heavily. The future will not be like the past. The comfortable Victorian and Georgian world complete with grand country houses, a globe-spanning British empire, and lords and commoners each knowing their place, was swept away by the events that began in the summer of 1914 (and that with Britain on the “winning” side of both world wars.) So too, our comfortable “American century” of conspicuous consumer consumption, global tourism, and ever-increasing stock and home prices may be gone forever.

    Tim O'Reilly, Welcome to the 21st Century: How To Plan For The Post-Covid Future

    For me, the 21st century began on September 11th, 2001 with the twin towers attack. The aftermath of that, including the curtailing of our civil liberties in the west, has been a defining feature of the century so far.

    O'Reilly, however, points to the financial crisis:

    Our failure to make deep, systemic changes after the financial collapse of 2009, and our choice instead to spend the last decade cutting taxes and spending profusely to prop up financial markets while ignoring deep, underlying problems has only made responding to the current crisis that much more difficult. Our failure to build back creatively and productively from the global financial crisis is necessary context for the challenge to do so now.

    Tim O'Reilly, Welcome to the 21st Century: How To Plan For The Post-Covid Future

    All of these things compound one another, with financial uncertainty leading to political instability and the election of populist leaders. That meant we were less prepared for the pandemic than we could have been, and when it hit, we suffered (in the UK and US at least) from an incompetent response.

    I recently finished reading A Distant Mirror: The Calamitous 14th Century by Barbara E. Tuchman, which discusses at length something that O'Reilly picks up on:

    If you are a student of history, you know that the massive reduction of the workforce in post-Black Death Europe forced lords to give better terms of tenure—serfdom all but disappeared, and the rise of a mercantile middle class set the stage for the artistic and scientific progress of the Renaissance. Temporary, but catastrophic, events often usher in permanent economic changes. Sometimes the changes appear to be reversed but it just takes time for them to stick. World War II brought women into the workforce, and then victory ushered them back out. But the wine of opportunity, once tasted, was not left undrunk forever.

    Tim O'Reilly, Welcome to the 21st Century: How To Plan For The Post-Covid Future

    Like O'Reilly, I'm hoping that there are silver linings related to climate change to come out of the pandemic. Unlike him, I don't think the answer is more consumption. As an article I shared recently points out, not only can we not have billionaires and solve climate change, but the whole edifice of over-consumption needs to collapse under its own weight.

    At times, our strengths propel us so far forward we can no longer endure our weaknesses and perish from them

    Reducing exam stress by removing pointless exams

    In the UK, it used to be the case that children could leave school at 16. This was the reason for 'O' levels (which my parents took) and GCSEs (which I sat at that age).

    However, these days, young people must remain in education or training until they are 18 years old. What, then, is the point of sitting exams at both 16 and 18?

    A group of Tory MPs has written a report, with one of the authors, Flick Drummond, making some good points:

    The paper argues that preparation for GCSE exams means that pupils miss a large chunk of valuable learning because of the time taken up with mock exams and revision, followed by the exams themselves. “That’s almost six months out of a whole year spent preparing for exams,” said Drummond.


    She said she was particularly concerned by the impact of exams on mental health, citing a report backed by the Children’s Society in August that ranked England 36th out of 45 countries in Europe and North America for wellbeing.


    Instead, the new report says, the exams should be replaced by a baccalaureate, which would cover several years’ study and would allow children more time from the age of 15 to settle on the subjects they wanted to study in the sixth form for A-levels or vocational qualifications such as T-levels and apprenticeships, and to explore potential careers in a structured way.

    Richard Adams, Tory MPs back ditching GCSE exams in English school system overhaul (The Guardian)

    As a parent of children who could be affected by this, I actually think this should be trialled first in the private sector and then rolled out in the state sector. Too often, the private sector benefits from treating state school pupils as guinea pigs, and then cherry-picking what works.

    The clever man often worries; the loyal person is often overworked

    Like the flight of a sparrow through a lighted hall, from darkness into darkness

    Face-to-face university classes during a pandemic? Why?

    Earlier in my career, when I worked for Jisc, I was based at Northumbria University in Newcastle. It's just been announced that 770 students there have been infected with COVID-19.

    As Lorna Finlayson, a philosophy lecturer at the University of Essex, points out, the desire to get students on campus for face-to-face teaching is driven by economics. Universities are businesses, and some of them are likely to fail this academic year.

    [A]fter years of pushing to expand online learning and “lecture capture” on the basis that it is what students want, university managers have decided that what students really want now, during a global pandemic, is face-to-face contact. This sudden-onset fetish reached its most perverse extreme in the case of Boston University, which, realising that many teaching rooms lack good ventilation or even windows, decided to order “giant air circulators”, only to discover that the air circulators were very noisy. Apparently unable to source enough “mufflers” for the air circulators, the university ordered Bluetooth headsets to enable students and teachers to communicate over the roar of machinery.

    All of which raises the question: why? The determination to bring students back to campus at any cost doesn’t stem from a dewy-eyed appreciation of in-person pedagogy, nor from concerns about the impact of isolation on students’ mental health. If university managers had any interest in such things, they would not have spent years cutting back on study skills support and counselling services.

    Lorna Finlayson, How universities tricked students into returning to campus (The Guardian)

    I know people who work in universities in various positions. What they tell me astounds me: a callous disregard for human life in the pursuit of either economic survival or profit.

    This is, as usual, all about the money. With student fees and rents now their main source of revenue, universities will do anything to recruit and retain. When the pandemic hit, university managers warned of a potentially catastrophic loss of income from international student fees in particular. Many used this as an excuse to cut jobs and freeze pay, even as vice-chancellors and senior management continued to rake in huge salaries. As it turned out, international student admissions reached a record high this year, with domestic undergraduate numbers also up – perhaps less due to the irresistibility of universities’ “offer” than to the lack of other options (needless to say, staff jobs and pay have yet to be reinstated).


    But students are more than just fee-payers. They are rent-payers too. Rightly or wrongly, most of those in charge of universities have assumed that only the promise of face-to-face classes would tempt students back to their accommodation. That promise can be safely broken only once rental contracts are signed and income streams flowing.

    Lorna Finlayson, How universities tricked students into returning to campus (The Guardian)

    I predict legal action at some point in the near future.

    'Rulesy' people

    Some people want to fit into the world. Others want to change it. Still others want to fit in by changing it. Robin Hanson has a theory about how paternalism appears in a culture, linking it to a pattern of behaviours that bestows a form of prestige on those creating and enforcing rules.

    The key idea is that there are many “rulesy” people in the world who specialize in learning of and even creating rules, so that they can then find and reveal violations of these rules around them. This allows them to beat on their rivals, and also to raise their own status. It obviously raises their dominance via the power they wield, but they prefer to be instead seen as prestigious, enforcing rules whose purpose is more clearly altruistic. And what could be more altruistic than keeping people from hurting themselves?

    So many people who are especially good at noticing and applying rules, good at finding potential violations, good at framing situations as rule violations, and willing to at least gossip about violators, are eager for a supply of apparently-paternalism-motived rules they can enforce. So they take suggestions by elites regarding what is good behavior and work to turn them into rules they can enforce. They push to turn norms into laws, and to make norms out of the weak behavior patterns of elites, or common sorts of praise and criticism.

    Robin Hanson, Rulesy Folks Push Paternalism (Overcoming Bias)

    I like Hanson's explanation of how this can work in practice:

    For example, maybe at first some elites sometimes wear hats. Then they and others start to praise hat-wearers. Then more folks start to wear hats, and get proud of how they are good hat people. Good candidates for promotion to elite they are. Then hat fans start to insinuate that people who don’t wear hats are not the best sort of people in various ways, and are only hurting themselves. They say that word needs to get out about the advantages of hats. And those irresponsible people arguing against hats really need to be dealt with – everyone should be told that their arguments don’t meet the highest possible standards of scientific rigor. (Though neither do pro-hat arguments.)


    It becomes a matter of pride to teach your children to wear hats. And to have hats taught in school. And to include the lack of hats in lists of problems that problem people have. Hat fans start to push the orgs of which they are part to promote hats, sometimes even requiring hats at org functions. Finally it is suggested that wouldn’t it be simpler and more efficient to just have the government require hats. Then foreigners who visit us won’t think we are such backward non-hat people. And its really for their own good, as we all know.

    At every step along this path, people can gain by pushing for stricter and stronger hat norms and rules. They are good people, pushing a good thing, which just happens to let them dump harder on rivals. Which is plausibly why we tend to end up with just too many overly restrictive rules. Rules rise with the ratchet of crises that can be blamed on problems said to be fixed by adding new rules. But between the crises, we rarely take away or weaken our rules.

    Robin Hanson, Rulesy Folks Push Paternalism (Overcoming Bias)

    The importance of co-operation

    Quoting Stephen Downes in the introduction to his post, Harold Jarche goes on to explain:

    Managing in complex adaptive systems means influencing possibilities rather than striving for predictability (good or best practices). Cooperation in our work is needed so that we can continuously develop emergent practices demanded by this complexity. What worked yesterday won’t work today. No one has the definitive answer any more, but we can use the intelligence of our networks to make sense together and see how we can influence desired results. This is cooperation and this is the future, which is already here, albeit unevenly distributed.

    Harold Jarche, revisiting cooperation

    It's all very well having streamlined workflows, but that's the way to get automated out of a job.

    One is not superior merely because one sees the world in an odious light
