The moon is getting 4G
Yep, you read that headline correctly. Vodafone and Nokia are getting huge amounts of publicity for partnering with scientists to put a 4G network on the moon.
Why? Because it takes too much power to beam back high-definition video directly from the lunar rovers to the earth. So, instead, it’ll be relayed over a data network on the moon and then transmitted back to earth.
It’s totally a marketing thing for Vodafone and Nokia, but it also sounds totally cool…
Source: BBC News
Possible - impossible
“The only way of finding the limits of the possible is by going beyond them into the impossible.”
(Arthur C. Clarke)
Your best decisions don't come when you demand them
As with every episode so far, I greatly enjoyed listening to a recent episode of the Hurry Slowly podcast, this time with interviewee Bill Duggan. He had some great words of wisdom to share, including:
If we’re talking about the creative side, you certainly can’t force it, and a very simple thing is you can’t solve every problem in one day. You can’t solve every problem in one week. You can’t solve every problem in one year. Some problems you just can’t solve, and you don’t know you can’t solve it until you give up trying to solve it.

He makes the point during the episode that if you know what you're doing, and have done something similar before, then there's no problem in pushing on until midnight to get stuff done. However, if you're working overtime to try and solve a problem, a lot of research suggests that you'd be better off doing something else to allow your subconscious to work on it, and spark those 'aha!' moments.
Source: Hurry Slowly
Some great links for Product Managers
As I’ve mentioned before, my new role at Moodle is essentially one of a product manager. I’ve done things which overlap the different elements of the role before but never had them in this combination:
Product managers are responsible for guiding the success of a product and leading the cross-functional team that is responsible for improving it. It is an important organizational role — especially in technology companies — that sets the strategy, roadmap, and feature definition for a product or product line. The position may also include marketing, forecasting, and profit and loss (P&L) responsibilities. In many ways, the role of a product manager is similar in concept to a brand manager at a consumer packaged goods company.

As a result, I found this list of resources from Product Manager HQ very useful. I reckon I'd come across about 50% of the tools and apps listed before, so I'm looking forward to exploring the other half!
Here are a few that I hadn’t heard of before:
- Mockingbird: Helps you create and share clickable wireframes. Use it to make mockups of your website or application in minutes.
- TinyPM: Lightweight and smart agile collaboration tool with product management, backlog, taskboard, user stories and wiki.
- Roadmunk: Visual roadmap software for product management.
- Sprint.ly: Agile project management software for your whole team.
- UXCam: Allows you to eliminate customer struggle and improve user experience by capturing and visualizing screen video and user interaction data.

The definition at the top of this post comes from a whole guide put together for new Product Managers by Aha!
Sources: Aha! / Product Manager HQ
Firefox OS lives on in The Matrix
I still have a couple of Firefox OS phones from my time at Mozilla. The idea was brilliant: using the web as the platform for smartphones. The execution, in terms of the partnership and messaging to the market… not so great.
Last weekend, I actually booted up a device as my daughter was asking about ‘that orange phone you used to let me play with sometimes’. I noticed that Mozilla are discontinuing the app marketplace next month.
All is not lost, however, as open source projects can never truly die. This article reports on a ‘fork’ of Firefox OS being used to resurrect one of my favourite-ever phones, which was used in the film The Matrix:
Quietly, a company called KaiOS, built on a fork of Firefox OS, launched a new version of the OS built specifically for feature phones, and today at MWC in Barcelona the company announced a new wave of milestones around the effort that includes access to apps from Facebook, Twitter and Google in the form of its Voice Assistant, Google Maps, and Google Search; as well as a list of handset makers who will be using the OS in their phones, including HMD/Nokia (which announced its 8110 yesterday), Bullitt, Doro and Micromax; and Qualcomm and Spreadtrum for processing on the inside.

I think I'm going to have to buy the new version of the Nokia 8110 just... because.
Source: TechCrunch
The 'loudness' of our thoughts affects how we judge external sounds
This is really interesting:
No-one but you knows what it's like to be inside your head and be subject to the constant barrage of hopes, fears, dreams — and thoughts:

The "loudness" of our thoughts -- or how we imagine saying something -- influences how we judge the loudness of real, external sounds, a team of researchers from NYU Shanghai and NYU has found.
"Our 'thoughts' are silent to others -- but not to ourselves, in our own heads -- so the loudness in our thoughts influences the loudness of what we hear," says Poeppel, a professor of psychology and neural science.

This is why meditation, both in terms of trying to still your mind, and meditating on positive things you read, is such a useful activity.

Using an imagery-perception repetition paradigm, the team found that auditory imagery will decrease the sensitivity of actual loudness perception, with support from both behavioural loudness ratings and human electrophysiological (EEG and MEG) results.
“That is, after imagined speaking in your mind, the actual sounds you hear will become softer – the louder the volume during imagery, the softer perception will be,” explains Tian, assistant professor of neural and cognitive sciences at NYU Shanghai. “This is because imagery and perception activate the same auditory brain areas. The preceding imagery already activates the auditory areas once, and when the same brain regions are needed for perception, they are ‘tired’ and will respond less."
As anyone who’s studied philosophy, psychology, and/or neuroscience knows, we don’t experience the world directly, but find ways to interpret the “bloomin' buzzin' confusion”:
According to Tian, the study demonstrates that perception is a result of interaction between top-down (e.g. our cognition) and bottom-up (e.g. sensory processing of external stimulation) processes. This is because human beings not only receive and analyze upcoming external signals passively, but also interpret and manipulate them actively to form perception.

Source: Science Daily
Arbitrary deadlines are the enemy of creativity
People like deadlines because people like accountability. There’s nothing wrong with that, apart from the fact that sometimes it’s impossible to know how long something will take, or cost, or even look like in advance. Creativity, in other words, is at odds with arbitrary deadlines:
We may tease them for their diva-like behaviors when they feel persecuted by a deadline, but we have to admit that “develop an amazing new idea” is not something that slides into your schedule, like pick up lunch or respond to new clients. Nor can systems be tweaked and extra hands hired to help hit a goal that requires innovation, the way they can when mundane busy work is piling up. And yet deadlines are a fact of life for any company that wants to stay competitive.

Time is a human construct, not something that's objectively 'out there' in the world. As a result it can be interpreted differently in various situations:
Creative work operates on “event time,” meaning it always requires as much time as needed to organically get the job done. (Think of novel writers or other artists.) Other types of work operate on “clock time,” and are aligned with scheduled events. (A teacher obeys classroom hours and the semester calendar, for instance. An Amazon warehouse manager knows the number of customer orders that can be fulfilled in an hour.)

I don't particularly like the phrase 'creative people' in this article, as I believe everyone is (or at least can be) creative. Having said that, I agree with the sentiment:
Creative people need another scarce commodity: mental space. Working in a large team and constantly collaborating as a group doesn’t allow a person the clarity of mind to solve problems with fresh ingenious ideas. “Alone time or working with just one close collaborator seemed to be the key under the low time pressure conditions,” says Amabile.

Creative people, she adds, “have to be protected. They have to be isolated in a way, from all the other stuff that comes up during a work day. They can’t be called into meetings that are unrelated to this serious problem that they’re trying to address.”

Source: Quartz
Small 'b' blogging
I’ve been a blogger for around 13 years now. What the author of this post says about its value really resonates with me:
Small b blogging is learning to write and think with the network. Small b blogging is writing content designed for small deliberate audiences and showing it to them. Small b blogging is deliberately chasing interesting ideas over pageviews and scale. An attempt at genuine connection vs the gloss and polish and mass market of most “content marketing”.

He talks about the 'topology' of blogging changing over the years:
Crucially, these entry points to the network were very big and very accessible. What do I mean by that? Well - in those early days they were very big in the sense that if you got your content on the Digg homepage a lot of people would see it (relative to the total size of the network at the time). And they were very accessible in the sense that it wasn’t that hard to get your content there! I recall having a bunch of Digg homepage hits and Hacker News homepage hits.

I once had 15,000 people read a post of mine within a 24 hour period via a link from Hacker News. Yet the number of people who did something measurable (got in touch, subscribed to my newsletter, etc.) was effectively zero.
Every community now has a fragmented number of communities, homepages, entry points, tinyletters, influencers and networks. They overlap in weird and wonderful ways - and it means that it’s harder than ever to feel like you got a “homepage” success on these networks. To create a moment that has the whole audience looking at the same thing at the same time.

We shouldn't write for page views and fame, but instead to create value. Just this week I've had people cite back to me posts I wrote years ago. It's a great thing.
So I challenge you to think clearly about the many disparate networks you’re part of and think about the ideas you might want to offer those networks that you don’t want to get lost in the feed. Ideas you might want to return to. Think about how writing with and for the network might enable you to start blogging. Forget the big B blogging model. Forget Medium’s promise of page views and claps. Forget the guest post on Inc, Forbes and Entrepreneur. Forget Fast Company. Forget fast content.

Source: Tom Critchlow
What we can learn from Seneca about dying well
As I’ve shared before, next to my bed at home I have a memento mori, an object to remind me before I go to sleep and when I get up that one day I will die. It kind of puts things in perspective.
“Study death always,” Seneca counseled his friend Lucilius, and he took his own advice. From what is likely his earliest work, the Consolation to Marcia (written around AD 40), to the magnum opus of his last years (63–65), the Moral Epistles, Seneca returned again and again to this theme. It crops up in the midst of unrelated discussions, as though never far from his mind; a ringing endorsement of rational suicide, for example, intrudes without warning into advice about keeping one’s temper, in On Anger. Examined together, Seneca’s thoughts organize themselves around a few key themes: the universality of death; its importance as life’s final and most defining rite of passage; its part in purely natural cycles and processes; and its ability to liberate us, by freeing souls from bodies or, in the case of suicide, to give us an escape from pain, from the degradation of enslavement, or from cruel kings and tyrants who might otherwise destroy our moral integrity.

Seneca was forced to take his own life by his own pupil, the more-than-a-little-crazy Roman Emperor, Nero. However, his whole life had been a preparation for such an eventuality.
Seneca, like many leading Romans of his day, found that larger moral framework in Stoicism, a Greek school of thought that had been imported to Rome in the preceding century and had begun to flourish there. The Stoics taught their followers to seek an inner kingdom, the kingdom of the mind, where adherence to virtue and contemplation of nature could bring happiness even to an abused slave, an impoverished exile, or a prisoner on the rack. Wealth and position were regarded by the Stoics as adiaphora, “indifferents,” conducing neither to happiness nor to its opposite. Freedom and health were desirable only in that they allowed one to keep one’s thoughts and ethical choices in harmony with Logos, the divine Reason that, in the Stoic view, ruled the cosmos and gave rise to all true happiness. If freedom were destroyed by a tyrant or health were forever compromised, such that the promptings of Reason could no longer be obeyed, then death might be preferable to life, and suicide, or self-euthanasia, might be justified.

Given that death is the last taboo in our society, it's an interesting way to live your life. Being ready at any time to die, having lived a life that you're satisfied with, seems like the right approach to me.
“Study death,” “rehearse for death,” “practice death”—this constant refrain in his writings did not, in Seneca’s eyes, spring from a morbid fixation but rather from a recognition of how much was at stake in navigating this essential, and final, rite of passage. As he wrote in On the Shortness of Life, “A whole lifetime is needed to learn how to live, and—perhaps you’ll find this more surprising—a whole lifetime is needed to learn how to die.”

Source: Lapham's Quarterly
Anonymity vs accountability
As this article points out, organisational culture is a delicate balance between many things, including accountability and anonymity:
Though some assurance of anonymity is necessary in a few sensitive and exceptional scenarios, dependence on anonymous feedback channels within an organization may stunt the normalization of a culture that encourages diversity and community.

Anonymity can be helpful and positive:
For example, an anonymous suggestion program to garner ideas from members or employees in an organization may strengthen inclusivity and enhance the diversity of suggestions the organization receives. It would also make for a more meritocratic decision-making process, as anonymity would ensure that the quality of the articulated idea, rather than the rank and reputation of the articulator, is what's under evaluation. Allowing members to anonymously vote for anonymously-submitted ideas would help curb the influence of office politics in decisions affecting the organization's growth.

...but also problematic:
Reliance on anonymous speech for serious organizational decision-making may also contribute to complacency in an organizational culture that falls short of openness. Outlets for anonymous speech may be as similar to open as crowdsourcing is—or rather, is not. Like efforts to crowdsource creative ideas, anonymous suggestion programs may create an organizational environment in which diverse perspectives are only valued when an organization's leaders find it convenient to take advantage of members' ideas.

The author gives some advice to leaders under five sub-headings:

- Availability of additional communication mechanisms
- Failure of other communication avenues
- Consequences of anonymity
- Designing the anonymous communication channel
- Long-term considerations

There's some great advice in here, and I'll certainly be reflecting on it with the organisations of which I'm part.
Source: opensource.com
On your deathbed, you're not going to wish that you'd spent more time on Facebook
As many readers of my work will know, I don’t have a Facebook account. This article uses Facebook as a proxy for something that, whether you’ve got an account on the world’s largest social network or not, will be familiar:
An increasing number of us are coming to realize that our relationships with our phones are not exactly what a couples therapist would describe as “healthy.” According to data from Moment, a time-tracking app with nearly five million users, the average person spends four hours a day interacting with his or her phone.

The trick, like anything to which you're psychologically addicted, is to reframe what you're doing:

Many people equate spending less time on their phones with denying themselves pleasure — and who likes to do that? Instead, think of it this way: The time you spend on your phone is time you’re not spending doing other pleasurable things, like hanging out with a friend or pursuing a hobby. Instead of thinking of it as “spending less time on your phone,” think of it as “spending more time on your life.”

The thing I find hardest is to leave my phone in a different room, or not take it with me when I go out. There's always a reason for this (usually 'being contactable') but not having it constantly alongside you is probably a good idea:

Leave your phone at home while you go for a walk. Stare out of a window during your commute instead of checking your email. At first, you may be surprised by how powerfully you crave your phone. Pay attention to your craving. What does it feel like in your body? What’s happening in your mind? Keep observing it, and eventually, you may find that it fades away on its own.

There's a great re-adjustment happening with our attitude towards devices and the services we use on them. In a separate BBC News article, Amol Rajan outlines some reasons why Facebook usage may have actually peaked:

- A drop in users
- A drop in engagement
- Advertiser enmity
- Disinformation and fake news
- Former executives speak out
- Regulatory mood is hardening
- GDPR
- Antagonism with the news industry

Interesting times.
Source: The New York Times / BBC News
The Goldilocks Rule
In this article from 2016, James Clear investigates motivation:
Why do we stay motivated to reach some goals, but not others? Why do we say we want something, but give up on it after a few days? What is the difference between the areas where we naturally stay motivated and those where we give up?

The answer, which is obvious when we think about it, is that we need appropriate challenges in our lives:
Tasks that are significantly below your current abilities are boring. Tasks that are significantly beyond your current abilities are discouraging. But tasks that are right on the border of success and failure are incredibly motivating to our human brains. We want nothing more than to master a skill just beyond our current horizon.

We can call this phenomenon The Goldilocks Rule. The Goldilocks Rule states that humans experience peak motivation when working on tasks that are right on the edge of their current abilities. Not too hard. Not too easy. Just right.

But he doesn’t stop there. He goes on to talk about Mihaly Csikszentmihalyi’s notion of peak performance, or ‘flow’ states:
In order to reach this state of peak performance... you not only need to work on challenges at the right degree of difficulty, but also measure your immediate progress. As psychologist Jonathan Haidt explains, one of the keys to reaching a flow state is that “you get immediate feedback about how you are doing at each step.”

Video games are great at inducing flow states; traditional classroom-based learning experiences, not so much. The key is to create these experiences yourself by finding optimum challenge and immediate feedback.
Source: Lifehacker
On the death of Google/Apache Wave (and the lessons we can learn from it)
This article is entitled ‘How not to replace email’ and details both the demise of Google Wave and its open source continuation, Apache Wave:
As of a month ago, the Apache Wave project is “retired”. Few people noticed; in the seven years that Wave was an Apache Incubator open source project, it never had an official release, and was stuck at version 0.4-rc10 for the last three years.

Yes, I know! There's been a couple of times over the last few years when I've thought that Wave would have been perfect for a project I was working on. But the open source version never seemed to be 'ready'.
The world wasn't ready for it in 2010, but now would seem to be the perfect time for something like Wave:
2017 was a year of rapidly growing interest in federated communications tools such as Mastodon, which is an alternative to Twitter that doesn’t rely on a single central corporation. So this seems like a good time to revisit an early federated attempt to reinvent how we use the internet to communicate with each other.

As the author notes, the problem was the overblown hype around it, causing Google to pull it after just three months. He quoted a friend of his who at one time was an active user:
We’d start sending messages with lots of diagrams, sketches, and stuff cribbed from Google Images, and then be able to turn those sort of longer-than-IM-shorter-than-email messages into actual design documents gradually.

In fact, I’d argue that even having a system that’s a messaging system designed for “a paragraph or two” was on its own worthwhile: even Slack isn’t quite geared toward that, and contrariwise, email […] felt more heavyweight than that. Wave felt like it encouraged the right amount of information per message.

I feel this too, and it’s actually something we’ve been talking about for internal communications at Moodle. Telegram (which we use kind of like Slack) is good for short, sharp communication, but there’s a gulf between that and, say, an email conversation or threaded forum discussion.
Perhaps this is the sweet spot for the ‘social networking’ aspect of Project MoodleNet?
Helpfully, the author outlines some projects he’s been part of, after stating (my emphasis):

Wave’s failure didn’t have anything to do with the ideas that went into it. Those ideas and goals are sound, and this failure even provided good evidence that there’s a real need for something kind of like Wave: fifty thousand people signed a petition to “Save Google Wave” after Google announced they were shutting Wave down. Like so many petitions, it didn’t help (obviously), but if a mediocre implementation got tens of thousands of passionate fans, what could a good implementation do?
I’d say the single most important lesson to take away here, for a technology project at least, is that interoperability is key.

- Assume that no matter how amazing your new tech is, people are going to adopt it slowly.
- Give your early adopters every chance you can to use your offering together with the existing tools that they will continue to need in order to work with people who haven’t caught up yet.
- And if you’re building a communication tool, make it as simple as possible for others to build compatible tools, because they will expand the network of people your users can communicate with to populations you haven’t thought of and probably don’t understand.

It's a really useful article with many practical applications (well, for me at least...)
Source: Jamey Sharp
To lose old styles of reading is to lose a part of ourselves
Sometimes I think we’re living in the end times:
Out for dinner with another writer, I said, "I think I've forgotten how to read."

"Yes!" he replied, pointing his knife. "Everybody has."

"No, really," I said. "I mean I actually can't do it any more."

He nodded: "Nobody can read like they used to. But nobody wants to talk about it."

I wrote my doctoral thesis on digital literacies. There was a real sense in the 1990s that reading on screen was very different to reading on paper. We've kind of lost that sense of difference, and I think perhaps we need to regain it:
We don't really talk about 'hypertext' any more, as it's almost the default type of text that we read. As such, reading on paper doesn't really prepare us for it:

For most of modern life, printed matter was, as the media critic Neil Postman put it, "the model, the metaphor, and the measure of all discourse." The resonance of printed books – their lineal structure, the demands they make on our attention – touches every corner of the world we've inherited. But online life makes me into a different kind of reader – a cynical one. I scrounge, now, for the useful fact; I zero in on the shareable link. My attention – and thus my experience – fractures. Online reading is about clicks, and comments, and points. When I take that mindset and try to apply it to a beaten-up paperback, my mind bucks.
Me too. I train myself to read longer articles through mechanisms such as writing Thought Shrapnel posts and newsletters each week. But I don't read like I used to; I read for utility rather than for pleasure or just for the sake of it.

For a long time, I convinced myself that a childhood spent immersed in old-fashioned books would insulate me somehow from our new media climate – that I could keep on reading and writing in the old way because my mind was formed in pre-internet days. But the mind is plastic – and I have changed. I'm not the reader I was.
It's funny. We've such a connection with books, but for most of human history we've done without them:

The suggestion that, in a few generations, our experience of media will be reinvented shouldn't surprise us. We should, instead, marvel at the fact we ever read books at all. Great researchers such as Maryanne Wolf and Alison Gopnik remind us that the human brain was never designed to read. Rather, elements of the visual cortex – which evolved for other purposes – were hijacked in order to pull off the trick. The deep reading that a novel demands doesn't come easy and it was never "natural." Our default state is, if anything, one of distractedness. The gaze shifts, the attention flits; we scour the environment for clues. (Otherwise, that predator in the shadows might eat us.) How primed are we for distraction? One famous study found humans would rather give themselves electric shocks than sit alone with their thoughts for 10 minutes. We disobey those instincts every time we get lost in a book.
Literacy has only been common (outside the elite) since the 19th century. And it's hardly been crystallized since then. Our habits of reading could easily become antiquated. The writer Clay Shirky even suggests that we've lately been "emptily praising" Tolstoy and Proust. Those old, solitary experiences with literature were "just a side-effect of living in an environment of impoverished access." In our online world, we can move on. And our brains – only temporarily hijacked by books – will now be hijacked by whatever comes next.

There are several theses in all of this around fake news, the role of reading in a democracy, and how information spreads. For now, I continue to be amazed at the power of the web on the fabric of societies.
Source: The Globe and Mail