Reduce your costs, retain your focus
The older I get, the more I realise that things I deemed important earlier in life matter less than I thought. For example, the main thing in life seems to be to find something you find interesting enough to work on for a long period of time. That’s unlikely to be a ‘job’ and more like a problem to be solved, or values to exemplify and share.
Jason Fried writes on his company’s blog about the journey that they’ve taken over the last 19 years. Everyone knows Basecamp because it’s been around for as long as you’ve been on the web.
What is true in business is true in your personal life. I'm writing this out in the garden of our terraced property. It's approximately the size of a postage stamp. No matter, it's big enough for what we need, and living here means my wife doesn't have to work (unless she wants to) and I'm not under pressure to earn some huge salary.

2018 will be our 19th year in business. That means we’ve survived a couple of major downturns — 2001, and 2008, specifically. I’ve been asked how. It’s simple: It didn’t cost us much to stay in business. In 2001 we had 4 employees. We were competing against companies that had 40, 400, even 4000. We had 4. We made it through, many did not. In 2008 we had around 20. We had millions in revenue coming in, but we still didn’t spend money on marketing, and we still sublet a corner of someone else’s office. Business was amazing, but we continued to keep our costs low. Keeping a handle on your costs must be a habit, not an occasion. Diets don’t work, eating responsibly does.
These days we have huge expectations of what life should give us. The funny thing is that, if you stand back a moment and ask what you actually need, there's never been a time in history when the baseline that society provides has been so high.

So keep your costs as low as possible. And it’s likely that true number is even lower than you think possible. That’s how you last through the leanest times. The leanest times are often the earliest times, when you don’t have customers yet, when you don’t have revenue yet. Why would you tank your odds of survival by spending money you don’t have on things you don’t need? Beats me, but people do it all the time. ALL THE TIME. Dreaming of all the amazing things you’ll do in year three doesn’t matter if you can’t get past year two.
We rush around the place trying to be like other people and organisations, when we need to think about who and what we’re trying to be. The way to ‘win’ at life and business is to still be doing what you enjoy and deem important when everyone else has crashed and burned.
Source: Signal v. Noise
The link between sleep and creativity
I’m a big fan of sleep. Since buying a smartwatch earlier this year, I’ve been wearing it all of the time, including in bed at night. What I’ve found is that I’m actually a good sleeper, regularly sleeping better than 95% of other people who use the same Mi Fit app.
Like most people, after a poor night’s sleep I’m not at my best the next day. This article by Ed Yong in The Atlantic helps explain why.
As you start to fall asleep, you enter non-REM sleep. That includes a light phase that takes up most of the night, and a period of much heavier slumber called slow-wave sleep, or SWS, when millions of neurons fire simultaneously and strongly, like a cellular Greek chorus. “It’s something you don’t see in a wakeful state at all,” says Lewis. “You’re in a deep physiological state of sleep and you’d be unhappy if you were woken up.”

We’ve known for generations that, if we’ve got a problem to solve or a decision to make, it’s a good idea to ‘sleep on it’. Science is catching up with folk wisdom.

During that state, the brain replays memories. For example, the same neurons that fired when a rat ran through a maze during the day will spontaneously fire while it sleeps at night, in roughly the same order. These reruns help to consolidate and strengthen newly formed memories, integrating them into existing knowledge. But Lewis explains that they also help the brain extract generalities from specifics—an idea that others have also supported.
The other phase of sleep—REM, which stands for rapid eye movement—is very different. That Greek chorus of neurons that sang so synchronously during non-REM sleep descends into a cacophonous din, as various parts of the neocortex become activated, seemingly at random. Meanwhile, a chemical called acetylcholine—the same one that Loewi identified in his sleep-inspired work—floods the brain, disrupting the connection between the hippocampus and the neocortex, and placing both in an especially flexible state, where connections between neurons can be more easily formed, strengthened, or weakened.

The difficulty is that our sleep quality is affected by blue light confusing the brain as to what time of day it is. That's why we're seeing increasing numbers of devices changing your screen colour towards the red end of the spectrum in the evening. If you have disrupted sleep, you miss out on an important phase of your sleep cycle.
Crucially, they build on one another. The sleeping brain goes through one cycle of non-REM and REM sleep every 90 minutes or so. Over the course of a night—or several nights—the hippocampus and neocortex repeatedly sync up and decouple, and the sequence of abstraction and connection repeats itself. “An analogy would be two researchers who initially work on the same problem together, then go away and each think about it separately, then come back together to work on it further,” Lewis writes.

As the article states, there’s further research to be done here. But, given that sleep (along with exercise and nutrition) is one of the three ‘pillars’ of productivity, this certainly chimes with my experience.

“The obvious implication is that if you’re working on a difficult problem, allow yourself enough nights of sleep,” she adds. “Particularly if you’re trying to work on something that requires thinking outside the box, maybe don’t do it in too much of a rush.”
Source: The Atlantic
Attention scarcity as an existential threat
This post is from Albert Wenger, a partner at a New York-based early-stage VC firm focused on investing in disruptive networks. It’s taken from his book World After Capital, currently in draft form.
In this section, Wenger is concerned with attention scarcity, which he believes to be both a threat to humanity, and an opportunity for us.
On the threat side, for example, we are not working nearly hard enough on how to recapture CO2 and other greenhouse gases from the atmosphere. Or on monitoring asteroids that could strike earth, and coming up with ways of deflecting them. Or containing the outbreak of the next avian flu: we should have a lot more collective attention dedicated to early detection and coming up with vaccines and treatments.

The reason the world's population is so high is almost entirely due to the technological progress we've made. We're simply better at keeping human beings alive.
On the opportunity side, far too little human attention is spent on environmental cleanup, free educational resources, and basic research (including the foundations of science), to name just a few examples. There are so many opportunities we could dedicate attention to that over time have the potential to dramatically improve quality of life here on Earth not just for humans but also for other species.

Interestingly, he comes up with a theory as to why we haven't heard from any alien species yet:
I am proposing this as a (possibly new) explanation for the Fermi Paradox, which famously asks why we have not yet detected any signs of intelligent life elsewhere in our rather large universe. We now even know that there are plenty of goldilocks planets available that could harbor life forms similar to those on Earth. Maybe what happens is that all civilizations get far enough to where they generate huge amounts of information, but then they get done in by attention scarcity. They collectively take their eye off the ball of progress and are not prepared when something really bad happens such as a global pandemic.

Attention scarcity, then, has the potential to become an existential threat to our species. Pay attention to the wrong things and we could either fail to avoid a disaster, or cause one of our own making.
Source: Continuations
Our irresistible screens of splendour
Apple is touting a new feature in the latest version of iOS that helps you reduce the amount of time you spend on your smartphone. Facebook are doing something similar. As this article in The New York Times notes, that’s no accident:
There’s a reason tech companies are feeling this tension between making phones better and worrying they are already too addictive. We’ve hit what I call Peak Screen.

The article even gives the example of Augmented Reality LEGO play sets which actively encourage you to stop building and spend more time on screens!
For much of the last decade, a technology industry ruled by smartphones has pursued a singular goal of completely conquering our eyes. It has given us phones with ever-bigger screens and phones with unbelievable cameras, not to mention virtual reality goggles and several attempts at camera-glasses.
It’s not enough to tell people not to do things. Technology can be addictive, just like anything else, so we need to find better ways of achieving similar ends.

Tech has now captured pretty much all visual capacity. Americans spend three to four hours a day looking at their phones, and about 11 hours a day looking at screens of any kind.
So tech giants are building the beginning of something new: a less insistently visual tech world, a digital landscape that relies on voice assistants, headphones, watches and other wearables to take some pressure off our eyes.
[...]

Screens are insatiable. At a cognitive level, they are voracious vampires for your attention, and as soon as you look at one, you are basically toast.
The issue I have is that it's going to take tightly-integrated systems to do this well, at least at first. So the chances are that Apple or Google will create an ecosystem that only works with their products, providing another way to achieve vendor lock-in.

But in addition to helping us resist phones, the tech industry will need to come up with other, less immersive ways to interact with the digital world. Three technologies may help with this: voice assistants, of which Amazon’s Alexa and Google Assistant are the best, and Apple’s two innovations, AirPods and the Apple Watch.
All of these technologies share a common idea. Without big screens, they are far less immersive than a phone, allowing for quick digital hits: You can buy a movie ticket, add a task to a to-do list, glance at a text message or ask about the weather without going anywhere near your Irresistible Screen of Splendors.
Source: The New York Times
Rethinking hierarchy
This study featured on the blog of the Stanford Graduate School of Business talks about the difference between hierarchical and non-hierarchical structures. It cites work by Lisanne van Bunderen of the University of Amsterdam, who found that egalitarianism seemed to lead to better performance:
“The egalitarian teams were more focused on the group because they felt like ‘we’re in the same boat, we have a common fate,’” says van Bunderen. “They were able to work together, while the hierarchical team members felt a need to fend for themselves, likely at the expense of others.”

Context, of course, is vital. One place where hierarchy and a command-and-control approach seems important is in high-stakes situations such as the battlefield or hospital operating theatres during delicate operations. Lindred Greer, a professor of organizational behavior at Stanford Graduate School of Business, nevertheless believes that, even in these situations, the amount of hierarchy can be reduced:
In some cases, hierarchy is an unavoidable part of the work. Greer is currently studying the interaction between surgeons and nurses, and surgeons lead by necessity. “If you took the surgeon out of the operating room, you would have some issues,” she says. But surgeons’ dominance in the operating room can also be problematic, creating dysfunctional power dynamics. To help solve this problem, Greer believes that the expression of hierarchy can be moderated. That is, surgeons can learn to behave in a way that’s less hierarchical.

While hierarchy is necessary in some situations, what we need is a more fluid approach to organising, as I've written about recently. The article gives the very practical example of Navy SEALs:
Navy SEALS exemplify this idea. Strict hierarchy dominates out in the field: When a leader says go left, they go left. But when the team returns for debrief, “they literally leave their stripes at the door,” says Greer. The hierarchy disappears; nobody is a leader, nobody a follower. “They fluidly shift out of these hierarchical structures,” she says. “It would be great if business leaders could do this too: Shift from top-down command to a position in which everyone has a say.” Importantly, she reiterated, this kind of change is not only about keeping employees happy, but also about enhancing performance and benefiting the bottom line.

Like the article's author, I'm still looking for something that's going to gain more traction than Holacracy. Perhaps the sociocratic approach could work well, but it does require people to be inducted into it. After all, hierarchy and capitalism are what we're born into these days. It feels 'natural' to people.
Source: Stanford Graduate School of Business (via Stowe Boyd)
Crawling before you walk
Alberto Corado, Moodle’s UX Lead, sent me an article by Rebecca Guthrie entitled Crawl, Walk, Run. It contains good, concise advice in three parts:
Crawl. Do things that don’t scale at the beginning. Talk to 50 potential customers, listen, discover pain points, and then begin to form a product to solve that pain. Use this feedback to develop your MVP. Don’t fall in love with your solution. Fall in love with their problem. I’ve mentioned this before, read Lean Startup.

This is what we've been doing so far with the MoodleNet project. I must have spoken to around 50 people all told, running the idea past them, getting their feedback, and iterating towards the prototype we came up with during the design sprint. I'd link to the records I have of those conversations, but I had to take down my notes on the wiki, along with community call stuff, due to GDPR.
Walk. Create mock-ups. Start to develop your product. Go back to your early potential customers and ask them if your MVP (or mockups) solve their problem. Pre-sell it. If you really are solving a problem, they will pay you for the software. Don’t give it away for free, but do give them an incentive to participate. If you can’t get one person to buy before it is ready, do not move onto the next stage with building your product. Or, you will launch to crickets. Go back to your mock-ups and keep going until you create something at least one person wants to buy. The one person should not be a family member or acquaintance. Once you have the pre-sale(s), conduct a Beta round where those paying users test out what you’ve built. Stay in Beta until you can leverage testimonials from your users. Leverage this time to plan for what comes next, an influx of customers based of your client’s testimonials.

I'm not sure this completely applies to what we're doing with MoodleNet. It's effectively a version of what Tim Ferriss outlines in The 4-Hour Work Week when he suggests creating a page for a product that doesn't exist and taking sign-ups after someone presses the 'Buy' button.
What I think we can do is create clickable prototypes using something like Adobe XD, which allows users to give feedback on specific features. We can use this UX feedback to create an approach ready for when the technical architecture is built.
Run. Once your Beta is proven, RUN! Run as fast as you can and get Sales. The founder (or one of the founders) must be willing to hustle for sales. I recommend downloading the startup course from Close.io. Steli gives amazing advice.

While MoodleNet needs to be sustainable, this isn't about huge sales growth but about serving educators. We do want as many people to use the platform as possible, and we want to grow in a way where there's a feedback loop. So we may end up doing something like giving our initial cohort a certain number of invites to encourage their friends/colleagues to join.
Food for thought, certainly.
Source: Rebecca Guthrie
Higher Education and blockchain
I’ve said it before, and I’ll say it again: the most useful applications of blockchain technologies are incredibly boring. That goes in education, too.
This post by Chris Fellingham considers blockchain in the context of Higher Education, and in particular credentialing:
The short pitch is that as jobs and education go digital, we need digital credentials for our education and those need to be trustworthy and automisable. Decentralised trust systems may well be the future but I don’t see that it solves a core problem. Namely that the main premium market for Higher Education Edtech is geared towards graduates in developed countries and that market — does not have a problem of trust in its credentials — it has a problem of credibility in its courses. People don’t know what it means to have done a MOOC/Specialization/MicroMasters in X which undermines the market system for it. Shoring up the credential is a second order problem to proving the intrinsic value of the course itself.

"Decentralised trust systems" are what blockchain aficionados talk about, but what they actually mean is removing trust from the equation. So, in hiring decisions, for example, trust is removed in favour of cryptographic proof.
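To make 'cryptographic proof' concrete, here's a toy sketch in Python (not the Blockcerts protocol, and with made-up credential data): hash the credential document and check it against the digest an issuer would have anchored somewhere tamper-proof.

```python
# Toy illustration of verifying a credential by hash rather than by trusting
# an institution. Not the Blockcerts protocol; all data below is made up.
import hashlib
import json

credential = {
    "recipient": "alice@example.com",   # placeholder recipient
    "award": "MicroMasters in X",       # placeholder award
    "issued": "2018-06-01",
}

# Canonicalise the document (stable key order) before hashing it.
digest = hashlib.sha256(
    json.dumps(credential, sort_keys=True).encode("utf-8")
).hexdigest()

# In a blockchain-based scheme, something like this digest would have been
# anchored on-chain at issuance; here it's just a stand-in value.
anchored_digest = digest

# Verification is then recomputing the hash and comparing.
print("Credential verifies:", digest == anchored_digest)
```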
Fellingham mentions someone called ‘Smolenski’ who, after a little bit of digging, must be Natalie Smolenski, who works for Learning Machine. That organisation is a driving force, with MIT, behind the Blockcerts standard for blockchain-based digital credentialing.
Smolenski however, is a believer, and in numerous elegant essays has argued blockchain is the latest paradigm shift in trust-based technologies. The thesis puts trust based technologies as a central driver of human development. Kinship was the first ‘trust technology’, followed by language and cultural development. Things really got going with organised religion which was the early modern driver — enabling proto-legal systems and financial systems to emerge. Total strangers could now conduct economic transactions by putting their trust in local laws (a mutually understood system for transactions) in the knowledge that it would be enforced by a trusted third party — the state. Out of this emerged market economies and currencies.

Like Fellingham, I'm not particularly enamoured with this teleological 'grand narrative' approach to history, of which blockchain believers do tend to be overly fond. I'm pretty sure that human history hasn't been 'building' in any way towards anything, particularly something that involves less trust between human beings.
Blockchain at this moment is a kind of religion. It’s based on a hope of things to come:
Blockchain — be it in credential or currency form ... could well be a major — if not paradigmatic technology — but it has its own logic and fundamentally suits those who use it best — much as social networks turned out to be fertile grounds for fake news. For that reason alone, we should be far more cautious about a shift to blockchain in Higher Education — lest like fake news — it takes an imperfect system and makes it worse.

Indeed. Who on earth wants to hard code the way things are right now in Higher Education? If your answer is 'blockchain-based credentials', then I'm not sure you really understand what the question is.
Source: Chris Fellingham (via Stephen Downes)
On 'instagrammability'
“We shape our tools and thereafter our tools shape us.”

(John M. Culkin)

I choose not to use or link to Facebook services, and that includes Instagram and WhatsApp. I do, however, recognise the huge power that Instagram has over some people's lives which, of course, trickles down to businesses and those looking to "live the Instagram lifestyle".
The design blog Dezeen picks up on a report from an Australian firm of architects, demonstrating that ‘Instagrammable moments’ are now part of their brief.
I’m all for user stories and creating personas, but one case looks like grounds for divorce: Bob is seen as the servant of Michelle, who wants to be photographed doing things she’s seen others doing.
One case study features Bob and Michelle, a couple with "very different ideas about what their holiday should look like."

It’s easy to roll your eyes at this (and trust me, mine are almost rotating out of their sockets) but the historian in me finds this fascinating. I wonder if future generations will realise that architectural details were a result of photos being taken for a particular service?

While Bob wants to surf, drink beer and spend quality time with Michelle, she wants to “be pampered and live the Instagram life of fresh coconuts and lounging by the pool.”
In response to this type of user, designers should focus on providing what Michelle wants, since “Bob’s main job this holiday is to take pictures of Michelle.”
“Michelle wants pictures of herself in the pool, of bright colours, and of fresh attractive food,” the report says. “You’ll also find her taking pictures of remarkable indoor and outdoor artwork like murals or inspirational signage."
Other designers taking users' Instagram preferences into account include Coordination Asia, whose recent project for restaurant chain Gaga in Shanghai has been optimised so design elements fit in a photo frame and maximise the potential for selfies.

Of course, architects and designers have to start somewhere and perhaps ‘instagrammability’ is a useful creative constraint.

Instagram co-founder Mike Krieger told Dezeen that he had noticed that the platform was influencing interior design.
"Hopefully it leads to a creative spark and things feeling different over time," [Krieger] said. "I think a bad effect would be that same definition of instagrammability in every single space. But instead, if you can make it yours, it can add something to the building."Now that I’ve read this, I’ll be noticing this everywhere, no doubt.Instagram was placed at number 66 in the latest Dezeen Hot List of the most newsworthy forces in world design.
Source: Dezeen
F*** off Google
This is interesting, given that Google was welcomed with open arms in London:
Google plans to implant a "Google Campus" in Kreuzberg, Berlin. We, as a decentralized network of people are committed to not letting our beloved city be taken over by this law- and tax-evading company that is building a dystopian future. Let's kick Google out of our neighborhood and lives!

What I find interesting is that not only are people organising against Google, they've also got a wiki to inform people and help wean them off Google services.
The problem that I have with ‘replacing’ Google services is that it’s usually non-trivial for less technical users to achieve. As the authors of the wiki point out:
It is though dangerous to think in terms of "alternatives", like the goal was to reach equivalence to what Google offers (and risk to always lag behind). In reality what we want is *better* services than the ones of Google, because they would rest on *better* principles, such as decentralization/distribution of services, end-to-end encryption, uncompromising free/libre software, etc.

The two biggest problems with the project of removing big companies such as Google from our lives are: (i) using web services is a social thing, and (ii) they provide such high-quality services for so little financial cost.

While presenting these “alternatives” or “replacements” here, we must keep in mind that the true goal is to achieve proper distribution/decentralization of information and communication, and empower people to understand and control where their information goes.
Whether you’re using a social network to connect with friends or working with colleagues on a collaborative document, your choices aren’t solely yours. We negotiate the topography of the web at the same time as weaving the social fabric of society. It’s not enough to give people alternatives, there has to be some leadership to go with it.
Source: Fuck off Google wiki
Seed of good (quote)
“Search for the seed of good in every adversity. Master that principle and you will own a precious shield that will guard you well through all the darkest valleys you must traverse. Stars may be seen from the bottom of a deep well, when they cannot be discerned from the mountaintop. So will you learn things in adversity that you would never have discovered without trouble. There is always a seed of good. Find it and prosper.”
(Og Mandino)
Where memes come from
In my TEDx talk six years ago, I explained how the understanding and remixing of memes was a great way to develop digital literacies. At that time, they were beginning to be used in advertisements. Now, as we saw with Brexit and the most recent US Presidential election, they’ve become weaponised.
This article in the MIT Technology Review references one of my favourite websites, knowyourmeme.com, which tracks the origin and influence of various memes across the web. Researchers have taken 700,000 images from this site and used an algorithm to track their spread and development. In addition, they gathered 100 million images from other sources.
Spotting visually similar images is relatively straightforward with a technique known as perceptual hashing, or pHashing. This uses an algorithm to convert an image into a set of vectors that describe it in numbers. Visually similar images have similar sets of vectors or pHashes.

Whereas some things ‘go viral’ by accident and catch the original author(s) off-guard, some communities are very good at making memes that spread quickly.

The team let their algorithm loose on a database of over 100 million images gathered from communities known to generate memes, such as Reddit and its subgroup The_Donald, Twitter, 4chan’s politically incorrect forum known as /pol/, and a relatively new social network called Gab that was set up to accommodate users who had been banned from other communities.
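If you've not come across perceptual hashing before, here's a minimal sketch of the comparison step. It assumes the third-party Pillow and imagehash packages and two placeholder image files; the similarity threshold is purely illustrative.

```python
# Compare two images by perceptual hash. Filenames and threshold are
# placeholders; requires the third-party 'Pillow' and 'imagehash' packages.
from PIL import Image
import imagehash

hash_a = imagehash.phash(Image.open("meme_a.png"))
hash_b = imagehash.phash(Image.open("meme_b.png"))

# Subtracting two ImageHash objects gives the Hamming distance between them:
# the smaller the distance, the more visually similar the images.
distance = hash_a - hash_b
print(f"Hamming distance: {distance}")

if distance <= 10:  # illustrative threshold, not from the paper
    print("Probably variants of the same image")
```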
Two relatively small communities stand out as being particularly effective at spreading memes. “We find that /pol/ substantially influences the meme ecosystem by posting a large number of memes, while The Donald is the most efficient community in pushing memes to both fringe and mainstream Web communities,” say Stringhini and co.

It turns out that, just like in evolutionary biology, creating a large number of variants is likely to lead to an optimal solution for a given environment.

They also point out that “/pol/ and Gab share hateful and racist memes at a higher rate than mainstream communities,” including large numbers of anti-Semitic and pro-Nazi memes.
Seemingly neutral memes can also be “weaponized” by mixing them with other messages. For example, the “Pepe the Frog” meme has been used in this way to create politically active, racist, and anti-Semitic messages.
The researchers, who have made their technique available to others to promote further analysis, are even able to throw light on the question of why some memes spread widely while others quickly die away. “One of the key components to ensuring they are disseminated is ensuring that new ‘offspring’ are continuously produced,” they say.

As the article states, right now it’s humans creating these memes. However, it won’t be long until we have machines doing this automatically. After all, it’s been five years since the controversy about the algorithmically-created “Keep Calm and…” t-shirts for sale on Amazon.

That immediately suggests a strategy for anybody wanting to become more influential: set up a meme factory that produces large numbers of variants of other memes. Every now and again, this process is bound to produce a hit.
For any evolutionary biologist, that may sound familiar. Indeed, it’s not hard to imagine a process that treats pHashes like genomes and allows them to evolve through mutation, reproduction, and selection.
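To make the analogy concrete, here's a toy sketch (emphatically not the researchers' method) of a mutation, reproduction, and selection loop over hash-like bit strings, with a made-up fitness function standing in for 'how well a variant spreads':

```python
# Toy evolutionary loop over 64-bit "genomes" (the length of a typical pHash).
# The fitness function is invented purely for illustration.
import random

GENOME_LENGTH = 64
POPULATION_SIZE = 50
GENERATIONS = 100

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LENGTH)]

def mutate(genome, rate=0.02):
    # Flip each bit with a small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def fitness(genome):
    # Made-up stand-in for "spreads well": closeness to an arbitrary target.
    target = [i % 2 for i in range(GENOME_LENGTH)]
    return sum(1 for a, b in zip(genome, target) if a == b)

population = [random_genome() for _ in range(POPULATION_SIZE)]
for _ in range(GENERATIONS):
    offspring = [mutate(g) for g in population]           # reproduction + mutation
    population = sorted(population + offspring,           # selection of the fittest
                        key=fitness, reverse=True)[:POPULATION_SIZE]

print("Best fitness after selection:", fitness(population[0]))
```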
It’s an interesting space to watch, particularly for those interested in digital literacies (and democracy).
Source: MIT Technology Review
The seductive logic of technology (quote)
"Whenever we get swept up in the self-reinforcing momentum and seductive logic of some new technology, we forget to ask what else it might be doing, how else it might be working, and who ultimately benefits most from its appearance. Why time has been diced into the segments between notifications, why we feel so inadequate to the parade of images that reach us through our devices, just why it is that we feel so often feel hollow and spent. What might connect our choices and the processes that are stripping the planet, filthing the atmosphere, and impoverishing human and nonhuman lives beyond number. Whether and in what way our actions might be laying the groundwork for an oppression that is grimmer yet and still more total. And finally we forget to ask whether, in our aspiration to overcome the human, we are discarding a gift we already have at hand and barely know what to do with."
(Adam Greenfield)

Inequality, anarchy, and the course of human history
Sometimes I’m reminded of the fact that I haven’t checked in with someone’s work for a few weeks, months, or even years. I’m continually impressed with the work of my near-namesake Dougald Hine. I hope to meet him in person one day.
Going back through his recent work led me to a long article in Eurozine by David Graeber and David Wengrow about how we tend to frame history incorrectly.
Overwhelming evidence from archaeology, anthropology, and kindred disciplines is beginning to give us a fairly clear idea of what the last 40,000 years of human history really looked like, and in almost no way does it resemble the conventional narrative. Our species did not, in fact, spend most of its history in tiny bands; agriculture did not mark an irreversible threshold in social evolution; the first cities were often robustly egalitarian. Still, even as researchers have gradually come to a consensus on such questions, they remain strangely reluctant to announce their findings to the public – or even scholars in other disciplines – let alone reflect on the larger political implications. As a result, those writers who are reflecting on the ‘big questions’ of human history – Jared Diamond, Francis Fukuyama, Ian Morris, and others – still take Rousseau’s question (‘what is the origin of social inequality?’) as their starting point, and assume the larger story will begin with some kind of fall from primordial innocence.

Graeber and Wengrow essentially argue that most people start from the assumption that we have a choice between a life that is 'nasty, brutish, and short' (i.e. most of human history) or one that is more civilised (i.e. today). If we want the latter, we have to put up with inequality.
‘Inequality’ is a way of framing social problems appropriate to technocratic reformers, the kind of people who assume from the outset that any real vision of social transformation has long since been taken off the political table. It allows one to tinker with the numbers, argue about Gini coefficients and thresholds of dysfunction, readjust tax regimes or social welfare mechanisms, even shock the public with figures showing just how bad things have become (‘can you imagine? 0.1% of the world’s population controls over 50% of the wealth!’), all without addressing any of the factors that people actually object to about such ‘unequal’ social arrangements: for instance, that some manage to turn their wealth into power over others; or that other people end up being told their needs are not important, and their lives have no intrinsic worth. The latter, we are supposed to believe, is just the inevitable effect of inequality, and inequality, the inevitable result of living in any large, complex, urban, technologically sophisticated society.

But inequality is not the inevitable result of living in a civilised society, as they point out with some in-depth examples. I haven't got space to go through them here, but suffice to say that it seems a classic case of historians cherry-picking their evidence.
As Claude Lévi-Strauss often pointed out, early Homo sapiens were not just physically the same as modern humans, they were our intellectual peers as well. In fact, most were probably more conscious of society’s potential than people generally are today, switching back and forth between different forms of organization every year. Rather than idling in some primordial innocence, until the genie of inequality was somehow uncorked, our prehistoric ancestors seem to have successfully opened and shut the bottle on a regular basis, confining inequality to ritual costume dramas, constructing gods and kingdoms as they did their monuments, then cheerfully disassembling them once again.

If so, then the real question is not ‘what are the origins of social inequality?’, but, having lived so much of our history moving back and forth between different political systems, ‘how did we get so stuck?’

Definitely worth a read, particularly if you think that ‘anarchy’ is the opposite of ‘civilisation’.
Source: Eurozine (via Dougald Hine)
Image CC BY-NC-SA xina
Mediocrity (quote)
“You needn’t settle for a mediocre life just because the people around you did.”
(Joshua Fields Millburn)
Git yourself off that platform!
This week, tens of thousands of open source projects migrated their codebase away from GitHub to alternatives such as GitLab. Why? Because Microsoft announced that they’ve bought GitHub for $7.5 billion.
For those who don’t spend time in the heady world of software and web development, that sounds like a lot of money for something with a silly name. It will hopefully make things a little clearer to explain that Git is described by Wikipedia in the following way:
Git is a version control system for tracking changes in computer files and coordinating work on those files among multiple people. It is primarily used for source code management in software development, but it can be used to keep track of changes in any set of files. As a distributed revision control system it is aimed at speed, data integrity, and support for distributed, non-linear workflows.

Despite GitHub not being open source, it did, until this week, host most of the world's open source projects. You can currently use GitHub for free if your project's code is public, and the company sells the ability to create private repositories. As far as I'm aware it's never turned a profit.
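As an aside, part of the reason projects can move so quickly is that mirroring a repository from one host to another only takes a couple of standard git commands. A minimal sketch, with placeholder URLs, assuming the destination project already exists on GitLab:

```python
# Mirror a repository from GitHub to GitLab using plain git commands.
# The URLs are placeholders and the GitLab project is assumed to exist.
import subprocess

OLD_URL = "https://github.com/example/project.git"
NEW_URL = "https://gitlab.com/example/project.git"

# 'git clone --mirror' copies every ref (branches, tags), not a working tree.
subprocess.run(["git", "clone", "--mirror", OLD_URL, "project.git"], check=True)

# 'git push --mirror' then makes the new remote an exact copy.
subprocess.run(["git", "push", "--mirror", NEW_URL], cwd="project.git", check=True)
```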
I’ve seen lots of reactions to the Microsoft acquisition news, but one of the more insightful posts comes from Louis-Philippe Véronneau. Like me, he doesn’t trust Microsoft at all.
Some people might be fine with Microsoft's takeover, but to me it's the straw that breaks the camel's back. For a few years now, MS has been running a large marketing campaign on how they love Linux and suddenly decided to embrace Free Software in all of its forms. More like MS BS to me.

Yep.

Let us take a moment to remind ourselves that:

- Windows is still a huge proprietary monster that rips billions of people from their privacy and rights every day.
- Microsoft is known for spreading FUD about "the dangers" of Free Software in order to keep governments and schools from dropping Windows in favour of FOSS.
- To secure their monopoly, Microsoft hooks up kids on Windows by giving out "free" licences to primary schools around the world. Drug dealers use the same tactics and give out free samples to secure new clients.
- Microsoft's Azure platform - even though it can run Linux VMs - is still a giant proprietary hypervisor.
I’m thankful that we’re now starting the MoodleNet project in a post-GDPR and post-GitHub world. We’ll be using GitLab — initially via their hosted service, but longer-term as a self-hosted solution — and as many open-source products and services as possible.
Interestingly, Véronneau notes that you can use Debian’s infrastructure (terms) or RiseUp’s infrastructure (terms) if your project aligns with their ethos.
Source: Louis-Philippe Véronneau
All the questions (quote)
“One who knows all the answers has not been asked all the questions.”
(Confucius)