Remembering the past through photos

A few weeks ago, I bought a Google Assistant-powered smart display and put it in our kitchen in place of the DAB radio. It has the added bonus of cycling through all of my Google Photos, which stretch back as far as when my wife and I were married, 15 years ago.

This part of its functionality makes it, of course, just a cloud-powered digital photo frame. But I think it’s possible to underestimate the power that these things have. About an hour before composing this post, for example, my wife took a photo of a photo(!) that appeared on the display showing me on the beach with our two children when they were very small.

An article by Giuliana Mazzoni in The Conversation points out that our ability to whip out a smartphone at any given moment and take a photo changes our relationship to the past:

We use smart phones and new technologies as memory repositories. This is nothing new – humans have always used external devices as an aid when acquiring knowledge and remembering.

[…]

Nowadays we tend to commit very little to memory – we entrust a huge amount to the cloud. Not only is it almost unheard of to recite poems, even the most personal events are generally recorded on our cellphones. Rather than remembering what we ate at someone’s wedding, we scroll back to look at all the images we took of the food.

Mazzoni points out that this can be problematic, as memory is important for learning. However, there may be a “silver lining”:

Even if some studies claim that all this makes us more stupid, what happens is actually shifting skills from purely being able to remember to being able to manage the way we remember more efficiently. This is called metacognition, and it is an overarching skill that is also essential for students – for example when planning what and how to study. There is also substantial and reliable evidence that external memories, selfies included, can help individuals with memory impairments.

But while photos can in some instances help people to remember, the quality of the memories may be limited. We may remember what something looked like more clearly, but this could be at the expense of other types of information. One study showed that while photos could help people remember what they saw during some event, they reduced their memory of what was said.

She goes on to discuss the impact that viewing many photos from your past has on a malleable sense of self:

Research shows that we often create false memories about the past. We do this in order to maintain the identity that we want to have over time – and avoid conflicting narratives about who we are. So if you have always been rather soft and kind – but through some significant life experience decide you are tough – you may dig up memories of being aggressive in the past or even completely make them up.

I'm not so sure that it's a good thing to tell yourself the wrong story about who you are. For example, although I grew up in, and identified with, a macho ex-mining town environment, I've become happier by realising that my identity is separate from that.

I suppose it’s a bit different for me, as most of the photos I’m looking at are of me with my children and/or my wife. However, I still have to tell myself a story of who I am as a husband and a father, so in many ways it’s the same.

All in all, I love the fact that we can take photos anywhere and at any time. We may need to evolve social norms around the most appropriate ways of capturing images in crowded situations, but that's separate from the great benefit I believe they bring us.

Source: The Conversation

Acoustic mirrors

On the beach at Druridge Bay in Northumberland, near where I live, there are large blocks at various intervals. These hulking pieces of concrete, now half-submerged, were deployed on seafronts up and down England to prevent the enemy successfully landing tanks during the Second World War.

I was fascinated to find out that these aren’t the only concrete blocks that protected Britain. BBC News reports that ‘acoustic mirrors’ were installed for a very specific purpose:

More than 100 years ago acoustic mirrors along the coast of England were built with the intention of using them to detect the sound of approaching German zeppelins.

The concave concrete structures were designed to pick up sound waves from enemy aircraft, making it possible to predict their flight trajectory, giving enough time for ground forces to be alerted to defend the towns and cities of Britain.

Some of these, which vary in size, still exist, and have been photographed by Joe Pettet-Smith.

The reason most of us haven’t heard of them is that the technology improved so quickly. Pettet-Smith comments:

The sound mirror experiment, this idea of having a chain of concrete structures facing the Channel using sound to detect the flight trajectory of enemy aircraft, was just that - an experiment. They tried many different sizes and designs before the project was scrapped when radar was introduced.

The science was solid, but aircraft kept getting faster and quieter, which made them obsolete.
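
The arithmetic behind that obsolescence is easy to check. Here's a minimal back-of-the-envelope sketch in Python; the 15 km detection range and the aircraft speeds are my own illustrative assumptions, not historical figures:

SPEED_OF_SOUND_KMH = 1235  # approximate speed of sound at sea level

def warning_time_minutes(detection_range_km, aircraft_speed_kmh):
    # The noise heard at the mirror was emitted when the aircraft was
    # detection_range_km away; the aircraft keeps closing while the sound
    # travels, so subtract the ground lost during that delay.
    sound_delay_h = detection_range_km / SPEED_OF_SOUND_KMH
    remaining_km = detection_range_km - aircraft_speed_kmh * sound_delay_h
    return max(remaining_km, 0) / aircraft_speed_kmh * 60

print(f"Zeppelin at ~100 km/h: {warning_time_minutes(15, 100):.0f} minutes' warning")
print(f"Bomber at ~400 km/h: {warning_time_minutes(15, 400):.0f} minutes' warning")

On those assumptions a zeppelin gives defenders roughly eight minutes' warning, while a 1930s bomber gives barely two; quieter engines shrink the detection range itself, too.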

Fascinating. The historian (and technologist) within me loves this.

Source: BBC News

Unpopular opinions on personal productivity

Before Christmas, I stumbled upon an interesting Twitter thread. It was started by Andrew Chen, General Partner at a16z, who asked:

What is your least popular but deeply held opinion on personal productivity?

He replied to his own tweet to get things started, commenting:

Being super organized is a bad thing. Means there's no room for serendipity, deep thought, can make you overly passive on other peoples' use of your time, as opposed to being focused on outbound. (Sorry to all my super Type A friends)

I'd definitely agree with that. Some of the others in the thread that I agree with are:
  • 9hour workdays are a byproduct of the industrial age. Personal productivity takes a deep fall after grinding on work for 5hours. Office hours kill personal time and productivity (@lpuchii)
  • Going on a run in the middle of the workday (@envarli)
  • Use pen and paper for scribbling notes (@uneeb123)
  • No one else has my job nor are they me, so I can’t simply follow the prescriptions of others. To be more productive, I need to look for new ideas and test. What works for someone else may be antithetical to my work. (@bguenther)
  • Great ideas rarely come from brainstorming sessions. It comes from pondering over a problem for a significant amount of time and coupling it with lots of experiments (@rajathkedi)

As ever, about half-way down the lengthy thread, it devolves into general productivity advice rather than 'unpopular opinions'. Still worth a browse!

Source: Andrew Chen (Twitter)

Confusing tech questions

Today is the first day of the Consumer Electronics Show, or CES, in Las Vegas. Each year, tech companies showcase their latest offerings and concepts. Nilay Patel, Editor-in-Chief for The Verge, comments that, increasingly, the tech industry is built on a number of assumptions about consumers and human behaviour:

[T]hink of the tech industry as being built on an ever-increasing number of assumptions: that you know what a computer is, that saying “enter your Wi-Fi password” means something to you, that you understand what an app is, that you have the desire to manage your Bluetooth device list, that you’ll figure out what USB-C dongles you need, and on and on.

Lately, the tech industry is starting to make these assumptions faster than anyone can be expected to keep up. And after waves of privacy-related scandals in tech, the misconceptions and confusion about how things work are both greater and more reasonable than ever.

I think this is spot-on. At Mozilla, and now at Moodle, I spend a good deal of my time among people who are more technically-minded than me. And, in turn, I’m more technically-minded than the general population. So what’s ‘obvious’ or ‘easy’ to developers feels like magic to the man or woman on the street.

Patel keeps track of the questions his friends and family ask him, and has listed them in the post. The number one thing he says everyone is talking about is how people assume their phones are listening to them and serving up advertising based on what they say. They don't realise that Facebook (and other platforms) use multiple data points to make inferences.

I’ll not reproduce his list here, but here are three questions which I, too, get a lot from friends and family:

“How do I make sure deleting photos from my iPhone won’t delete them from my computer?”

“How do I keep track of what my kid is watching on YouTube?”

“Why do I need to make another username and password?”

As I was discussing with the MoodleNet team just yesterday, there’s a difference between treating users as ‘stupid’ (which they’re not) and ensuring that they don’t have to think too much when they’re using your product.

Source: The Verge (via Orbital Operations)

Feeling good (quote)

“You can’t get much done in life if you only work on the days when you feel good.”

(Jerry West)

Creativity as an ongoing experiment

It’s hard not to be inspired by the career of the Icelandic artist Björk. She really does seem to be single-minded and determined to express herself however she chooses.

This interview with her in The Creative Independent is from 2017 but was brought to my attention recently in their (excellent) newsletter. On being asked whether it’s OK to ever abandon a project, Björk replies:

If there isn’t the next step, and it doesn’t feel right, there will definitely be times where I don’t do it. But in my mind, I don’t look at it that way. It’s more like maybe it could happen in 10 years time. Maybe it could happen in 50 years time. That’s the next step. Or somebody else will take it, somebody else will look at it, and it will inspire them to write a poem. I look at it more like that, like it’s something that I don’t own.

[…]

The minute your expectations harden or crystallize, you jinx it. I’m not saying I can always do this, but if I can stay more in the moment and be grateful for every step of the way, then because I’m not expecting anything, nothing was ever abandoned.

Creativity isn’t something that can be forced, she says:

It’s like, the moments that I’ve gone to an island, and I’m supposed to write a whole album in a month, I could never, ever do that. I write one song a month, or two months, whatever happens… If there is a happy period or if there’s a sad period, or I have all the time in the world or no time in the world, it’s just something that’s kind of a bubbling underneath.

Perhaps my favourite part of the interview, however, is where Björk says that she likes leaving things open for growth and new possibilities:

I like things when they're not completely finished. I like it when albums come out. Maybe it's got something to do with being in bands. We spent too long… There were at least one or two albums we made all the songs too perfect, and then we overcooked it in the studio, and then we go and play them live and they're kind of dead. I think there's something in me, like an instinct, that doesn't want the final, cooked version on the album. I want to leave ends open or other versions, which is probably why I end up still having people do remixes, and when I play them live, I feel different and the songs can grow.
Well worth reading in full, especially at this time of the year when everything seems full of new possibilities!

Source: The Creative Independent (via their newsletter)

Image by Maddie

Murmurations

Starlings where I live in Northumberland, England, also swarm like this, but not in such large numbers.

I love the way that we give interesting names to groups of animals in English (e.g. a 'murder' of crows). There's a whole list of them on Wikipedia.

Source: The Atlantic

Fanatics (quote)

“A fanatic is one who can’t change his mind and won’t change the subject.”

(Winston Churchill)

The problem with business schools

This article is from April 2018, but was brought to my attention via Harold Jarche’s excellent end-of-year roundup.

Business schools have huge influence, yet they are also widely regarded to be intellectually fraudulent places, fostering a culture of short-termism and greed. (There is a whole genre of jokes about what MBA – Master of Business Administration – really stands for: “Mediocre But Arrogant”, “Management by Accident”, “More Bad Advice”, “Master Bullshit Artist” and so on.) Critics of business schools come in many shapes and sizes: employers complain that graduates lack practical skills, conservative voices scorn the arriviste MBA, Europeans moan about Americanisation, radicals wail about the concentration of power in the hands of the running dogs of capital. Since 2008, many commentators have also suggested that business schools were complicit in producing the crash.

When I finished my Ed.D., my Dad jokingly (but not-jokingly) said that I should next aim for an MBA. At the time, eight years ago, I didn't have the words to explain why I had no desire to do so. Now, however, understanding a little bit more about economics, and a lot more about co-operatives, I can see that the default operating system of organisations is fundamentally flawed.

If we educate our graduates in the inevitability of tooth-and-claw capitalism, it is hardly surprising that we end up with justifications for massive salary payments to people who take huge risks with other people’s money. If we teach that there is nothing else below the bottom line, then ideas about sustainability, diversity, responsibility and so on become mere decoration. The message that management research and teaching often provides is that capitalism is inevitable, and that the financial and legal techniques for running capitalism are a form of science. This combination of ideology and technocracy is what has made the business school into such an effective, and dangerous, institution.

I'm pretty sure that forming a co-op isn't on the curriculum of 99% of business schools. As Martin Parker, the author of this long article, points out after teaching in 'B-schools' for 20 years, ethical practices are covered almost reluctantly.

The problem is that business ethics and corporate social responsibility are subjects used as window dressing in the marketing of the business school, and as a fig leaf to cover the conscience of B-school deans – as if talking about ethics and responsibility were the same as doing something about it. They almost never systematically address the simple idea that since current social and economic relations produce the problems that ethics and corporate social responsibility courses treat as subjects to be studied, it is those social and economic relations that need to be changed.

So my advice to someone who's thinking of doing an MBA? Don't bother. You're not going to be learning things that make the world a better place. Save your money and do something more worthwhile. If you want to study something useful, try researching different ways of structuring organisations — perhaps starting by using this page as a portal to a Wikipedia rabbit hole?

Source: The Guardian (via Harold Jarche)

Working and leading remotely

As MoodleNet Lead, I’m part of a remote team. If you look at the org chart, I’m nominally the manager of the other three members of my team, but it doesn’t feel like that (at least to me). We’re all working on our areas of expertise and mine happens to be strategy, making sure the team’s OK, and interfacing with the rest of the organisation.

I’m always looking to get better at what I do, so a ‘crash course’ for managing remote teams by Andreas Klinger piqued my interest. There’s a lot of overlap with John O’Duinn’s book on distributed teams, especially in its emphasis on the difference between various types of remote working:

There is a bunch of different setups people call “remote teams”.
  • Satellite teams
    • 2 or more teams are in different offices.
  • Remote employees
    • most of the team is in an office, but a few single employees are remote
  • Fully distributed teams
    • everybody is remote
  • Remote first teams
    • which are “basically” fully distributed
    • but have a non-critical-mass office
    • they focus on remote-friendly communication
When i speak of remote teams, i mean fully distributed teams and, if done right, remote-first teams. I consider all the other one’s hybrid setups.

Using these terms, the Open Badges team at Mozilla was 'Remote first'; when I joined Moodle I was a 'Remote employee'; and now the MoodleNet team is 'Fully distributed'.

Some things are easier when you work remotely, and some things are harder. One thing that’s definitely more difficult is running effective meetings:

Everybody loves meetings… right? But especially for remote teams, they are expensive, take effort and are – frankly – exhausting.

If you are 5 people, remote team:

  • You need to announce meetings upfront
  • You need to take notes b/c not everyone needs to join
  • Be on time
  • Have a meeting agenda
  • Make sure it’s not overtime
  • Communicate further related information in slack
  • etc
[...]

And this is not only about meetings. Meetings are just a straightforward example here. It’s true for any aspect of communication or teamwork. Remote teams need 5x the process.

I’m a big believer in working openly and documenting all the things. It saves hassle, it makes community contributions easier, and it builds trust. When everything’s out in the open, there’s nowhere to hide.

Working remotely is difficult because you have to be emotionally mature to do it effectively. You’re dealing with people who aren’t physically co-present, meaning you have to over-communicate intention, provide empathy at a distance, and not over-react by reading something into a communication that wasn’t intended. This takes time and practice.

Ideally, as remote team lead, you want what Laura Thomson at Mozilla calls Minimum Viable Bureaucracy, meaning that you don’t just get your ducks in a row, you have self-organising ducks. As Klinger points out:

In remote teams, you need to set up in a way people can be as autonomously as they need. Autonomously doesn’t mean “left alone” it means “be able to run alone” (when needed).

Think of people as “fast decision maker units” and team communication as “slow input/output”. Both are needed to function efficiently, but you want to avoid the slow part when it’s not essential.

At the basis of remote work is trust. There’s no way I can see what my colleagues are doing 99% of the time while they’re working on the same project as me, and they can’t see what I’m doing either. Some people talk about having to ‘earn’ trust, but once you’ve taken someone through the hiring process, it’s better just to give them your trust until they act in a way which makes you question it.

Source: Klinger.io (via Dense Discovery)

Rules for Online Sanity

It’s funny: we tell kids not to be mean to one another, and then immediately jump on social media to call people out and divide ourselves into various camps.

This list by Sean Blanda has been shared in several places, and rightly so. I’ve highlighted what I consider to be the top three.

I’ve started thinking about what are the “new rules” for navigating the online world? If you could get everyone to agree (implicitly or explicitly) to a set of rules, what would they be? Below is an early attempt at a “Rules for Online Sanity” list. I’d love to hear what you think I missed.

  • Reward your “enemies” when they agree with you, exhibit good behavior, or come around on an issue. Otherwise they have no incentive to ever meet you halfway.
  • Accept it when people apologize. People should be allowed to work through ideas and opinions online. And that can result in some messy outcomes. Be forgiving.
  • Sometimes people have differing opinions because they considered something you didn’t.
  • Take a second.
  • There's always more to the story. You probably don't know the full context of whatever you're reading or watching.
  • If an online space makes more money the more time you spend on it, use sparingly.
  • Judge people on their actions, not their words. Don’t get outraged over what people said. Get outraged at what they actually do.
  • Try to give people the benefit of the doubt, be charitable in how you read people’s ideas.
  • Don’t treat one bad actor as representative of whatever group or demographic they belong to.
  • Create the kind of communities and ideas you want people to talk about.
  • Sometimes, there are bad actors that don’t play by the rules. They should be shunned, castigated, and banned.
  • You don’t always have the moral high ground. You are not always right.
  • Block and mute quickly. Worry about the bubbles that creates later.
  • There but for the grace of God go you.

Oh, and about "creating communities": why not support Thought Shrapnel via Patreon and comment on these posts along with people you already know you have something in common with?

Source: The Discourse (via Read Write Collect)

Baseline levels of conscientiousness

As I mentioned on New Year's Day, I’ve decided to trade some of my privacy for convenience, and am now using the Google Assistant on a regular basis. Unlike Randall Munroe, the author of xkcd, I have no compunction about outsourcing everything other than the Very Important Things That I’m Thinking About to other devices (and other people).

Source: xkcd

The endless Black Friday of the soul

This article by Ruth Whippman appears in the New York Times, so focuses on the US, but the main thrust is applicable on a global scale:

When we think “gig economy,” we tend to picture an Uber driver or a TaskRabbit tasker rather than a lawyer or a doctor, but in reality, this scrappy economic model — grubbing around for work, all big dreams and bad health insurance — will soon catch up with the bulk of America’s middle class.

Apparently, 94% of the jobs created in the last decade are freelance or contract positions. That's the trajectory we're on.

Almost everyone I know now has some kind of hustle, whether job, hobby, or side or vanity project. Share my blog post, buy my book, click on my link, follow me on Instagram, visit my Etsy shop, donate to my Kickstarter, crowdfund my heart surgery. It’s as though we are all working in Walmart on an endless Black Friday of the soul.

[...]

Kudos to whichever neoliberal masterminds came up with this system. They sell this infinitely seductive torture to us as “flexible working” or “being the C.E.O. of You!” and we jump at it, salivating, because on its best days, the freelance life really can be all of that.

I don't think this is a neoliberal conspiracy; it's just the logic of capitalism seeping into every area of society. As we all jockey for position in the new-ish landscape of social media, everything becomes mediated by the market.

What I think’s missing from this piece, though, is any mention of the longer-term trend towards working less. We seem to be endlessly concerned about how the nature of work is changing rather than the huge opportunities for us to do more than waste away in bullshit jobs.

I’ve been advising anyone who’ll listen over the last few years that reducing the number of days you work has a greater impact on your happiness than earning more money. Once you reach a reasonable salary, there are diminishing returns in any case.

Source: The New York Times (via Dense Discovery)

Blockchain bullshit

I’m sure blockchain technologies are going to revolutionise some sectors. But it’s not a consumer-facing solution; its applications are mainly back-office.

Of course, a lot of the hype around blockchain came through the link between it and cryptocurrencies like Bitcoin.

There’s a very real problem here, though. People with decision-making power read predictions by consultants and marketers. Then, without understanding what the tech really is or does, they make it a requirement in tendering processes. This means that vendors either have to start offering that tech, or lie about the fact that they are able to do so.

We documented 43 blockchain use-cases through internet searches, most of which were described with glowing claims like “operational costs… reduced up to 90%,” or with the assurance of “accurate and secure data capture and storage.” We found a proliferation of press releases, white papers, and persuasively written articles. However, we found no documentation or evidence of the results blockchain was purported to have achieved in these claims. We also did not find lessons learned or practical insights, as are available for other technologies in development.

We fared no better when we reached out directly to several blockchain firms, via email, phone, and in person. Not one was willing to share data on program results, MERL processes, or adaptive management for potential scale-up. Despite all the hype about how blockchain will bring unheralded transparency to processes and operations in low-trust environments, the industry is itself opaque. From this, we determined the lack of evidence supporting value claims of blockchain in the international development space is a critical gap for potential adopters.

There’s a simple lesson here: if you don’t understand something, don’t say it’s going to change the world.

Source: MERL Tech (via The Register)

Social mobility

This diagram by Jessica Hagy is a fantastic visual reminder to stay curious:

Source: Indexed

Looking back and forward in tech

Looking back at 2018, Amber Thomas commented that, for her, a few technologies became normalised over the course of the year:

  1. Phone payments
  2. Voice-controlled assistants
  3. Drones
  4. Facial recognition
  5. Fingerprints

Apart from drones, I've spent the last few years actively avoiding the above. In fact, I spent most of 2018 thinking about decentralised technology, privacy, and radical politics.

However, December is always an important month for me. I come off social media, stop blogging, and turn another year older just before Christmas. It’s a good time to reflect and think about what’s gone before, and what comes next.

Sometimes, it’s possible to identify a particular stimulus for a change in thinking. For me, it was while watching Have I Got News For You, when the panellists were shown a photo of a fashion designer who holds a shoe in front of his face to avoid being recognisable. Paul Merton asked, “doesn’t he have a passport?”

Obvious, of course, but I’d recently been travelling and using the biometric features of my passport. I’ve also relented this year and now use the fingerprint scanner to unlock my phone. I realised that the genie isn’t going back in the bottle here, and that everyone else was using my data — biometric or otherwise — so I might as well benefit, too.

Long story short, I’ve bought a Google Pixelbook and a Lenovo Smart Display over the Christmas period, which I’ll be using in 2019 to make my life easier. I’m absolutely trading privacy for convenience, but it’s been a somewhat frustrating couple of years trying to use nothing but Open Source tools.

I’ll have more to say about all of this in due course, but it’s worth saying that I’m still committed to living and working openly. And, of course, I’m looking forward to continuing to work on MoodleNet.

Source: Fragments of Amber

See you in 2019!

Thought Shrapnel will be back next year. Until then, unless you’re a supporter, that’s it for 2018.

Thanks for reading, and have a good break.

Routine and ambition (quote)

“Routine, in an intelligent man, is a sign of ambition.”

(W.H. Auden)

Is the unbundling and rebundling of Higher Education actually a bad thing?

Until I received my doctorate and joined the Mozilla Foundation in 2012, I’d spent fully 27 years in formal education. Whether as a student, a teacher, or a researcher, I was invested in the Way Things Currently Are®.

Over the past six years, I’ve come to realise that a lot of the scaremongering about education is exactly that — fears about what might happen, based on not a lot of evidence. Look around; there are a lot of doom-mongers about.

It was surprising, therefore, to read a remarkably balanced article in EDUCAUSE Review. Laura Czerniewicz, Director of the Centre for Innovation in Learning and Teaching (CILT), at the University of Cape Town, looks at the current state of play around the ‘unbundling’ and ‘rebundling’ of Higher Education.

Very simply, I'm using the term unbundling to mean the process of disaggregating educational provision into its component parts, very often with external actors. And I'm using the term rebundling to mean the reaggregation of those parts into new components and models. Both are happening in different parts of college and university education, and in different parts of the degree path, in every dimension and aspect—creating an extraordinarily complicated environment in an educational sector that is already in a state of disequilibrium.

Unbundling doesn’t simply happen. Aspects of the higher education experience disaggregate and fragment, and then they get re-created—rebundled—in different forms. And it’s the re-creating that is especially of interest.

Although it’s largely true that increasing marketisation is a stimulus for the unbundling of Higher Education, I’m of the opinion that what we’re seeing has been accelerated primarily by the internet. The end of capitalism wouldn’t necessarily remove the drive towards this unbundling and rebundling. In fact, I wonder what it would look like if it were solely non-profits, charities, and co-operatives doing this?

Czerniewicz identifies seven main aspects of Higher Education that are being unbundled:

  1. Curriculum
  2. Resources
  3. Flexible pathways
  4. Academic expertise
  5. Opportunities
    • Support
    • Credentials
    • Networks
  6. Graduateness (i.e. 'the status of being a graduate')
  7. Experience
    • Mode (e.g. online, blended)
    • Place

As a white male with a terminal degree sitting outside academia, I guess I have a great deal of privilege to check. That being said, I do (as ever) have some opinions about all of this.

As Czerniewicz points out, there isn’t anything inherently wrong with unbundling and rebundling. It’s potentially a form of creative destruction, followed by some Hegelian synthesis.

But I'd like to conclude on a hopeful note. Unbundling and rebundling can be part of the solution and can offer opportunities for reasonable and affordable access and education for all. Unbundling and rebundling are opening spaces, relationships, and opportunities that did not exist even five years ago. These processes can be harnessed and utilized for the good. We need to critically engage with these issues to ensure that the new possibilities of provision for teaching and learning can be fully exploited for democratic ends for all.

Goodness knows that, as a sector, Higher Education can do a much better job of the three main things I'd say we want of universities in 2018:
  • Developing well-rounded citizens ready to participate fully in democratic society.
  • Sending granular signals to the job market about the talents and competencies of individuals.
  • Enabling extremely flexible provision for those in work, or who want to take different learning pathways.

That's not even to mention universities as places of academic freedom and resistance to forms of oppression (including the State).

I think the main reason I’m interested in all of this is mainly through the lens of new forms of credentialing. Czerniewicz writes:

Certification is an equity issue. For most people, getting verifiable accreditation and certification right is at the heart of why they are invested in higher education. Credentials may prove to be the real equalizers in the world of work, but they do raise critical questions about the function and the reputation of the higher education institution. They also raise questions about value, stigma, and legitimacy. A key question is, how can new forms of credentials increase access both to formal education and to working opportunities?

I agree. The main reason I got involved in Open Badges was that I saw the inequity as a teacher. I want our eldest child, by the time he reaches the age where he's got the choice to go to university (2025), to be able to make an informed choice not to go — and still be OK. Credentialing is an arms race that I've done alright at, but one I don't really want him to be involved in escalating.

So, to conclude, I’m actually all for the unbundling and rebundling of education. As Audrey Watters has commented many times before, it all depends on who is doing the rebundling. Is it solely for a profit motive? Is it improving things for the individual? For society? Who gains? Who loses?

Ultimately, this isn’t something that can be particularly ‘controlled’, only observed and critiqued. No-one is secretly controlling how this is playing out worldwide. That’s not to say, though, that we shouldn’t call out and resist the worst excesses (I’m looking at you, Facebook). There’s plenty of pedagogical progress we can make as this all unfolds.

Source: Educause

Credentials and standardisation

Someone pinch me, because I must be dreaming. It’s 2018, right? So why are we still seeing this kind of article about Open Badges and digital credentials?

“We do have a little bit of a Wild West situation right now with alternative credentials,” said Alana Dunagan, a senior research fellow at the nonprofit Clayton Christensen Institute, which researches education innovation. The U.S. higher education system “doesn’t do a good job of separating the wheat from the chaff.”

You'd think by now we'd realise that we have a huge opportunity to do something different here, rather than just replicate the existing system. Let's credential stuff that matters rather than some ridiculous notion of 'employability skills'. Open Badges and digital credentials shouldn't be just another stick with which to beat educational institutions.

Nor do they need to be ‘standardised’. One person’s ‘wild west’ is another person’s landscape of huge opportunity. We’re not living in a world of 1950s career pathways.

“Everybody is scrambling to create microcredentials or badges,” Cheney said. “This has never been a precise marketplace, and we’re just speeding up that imprecision.”

Arizona State University, for example, is rapidly increasing the number of online courses in its continuing and professional education division, which confers both badges and certificates. According to staff, the division offers 200 courses and programs in a slew of categories, including art, history, education, health and law, and plans to provide more than 500 by next year.

My eyes are rolling out of my head at this point. Thankfully, I’ve already written about misguided notions around ‘quality’ and ‘rigour’, as well as thinking through in a bit more detail what earning a ‘credential’ actually means.

Source: The Hechinger Report