👀 First Look: Meet the New Linux Distro Inspired by the iPad — “This distro is designed to be a tablet first and a “laptop-lite” experience second. And I do mean “lite”; this is not trying to be a desktop Linux distro that runs tablet apps, but a tablet Linux distro that can run desktop ones – a distinction that’s worth keeping in mind.”
🤯 DALL·E: Creating Images from Text — “GPT-3 showed that language can be used to instruct a large neural network to perform a variety of text generation tasks. Image GPT showed that the same type of neural network can also be used to generate images with high fidelity. We extend these findings to show that manipulating visual concepts through language is now within reach.”
👩💻 You can now run Linux on Apple M1 devices — “While Linux, and even Windows, were already usable on Apple Silicon thanks to virtualization, this is the first instance of a non-macOS operating system running natively on the hardware.”
Quotation-as-title by Montesquieu. Image from top-linked post.
✨ The World’s Oldest Story? Astronomers Say Global Myths About ‘Seven Sisters’ Stars May Reach Back 100,000 Years — “Why are the Australian Aboriginal stories so similar to the Greek ones? Anthropologists used to think Europeans might have brought the Greek story to Australia, where it was adapted by Aboriginal people for their own purposes. But the Aboriginal stories seem to be much, much older than European contact. And there was little contact between most Australian Aboriginal cultures and the rest of the world for at least 50,000 years. So why do they share the same stories?”
🚶♂️ The joy of steps: 20 ways to give purpose to your daily walk — “We need to gallivant around outside in daylight so that our circadian rhythms can regulate sleep and alertness. (Yes, even when the sky is resolutely leaden, it is still technically daylight.) Walking warms you up, too; when you get back indoors, it will feel positively tropical.”
🔐 How Law Enforcement Gets Around Your Smartphone’s Encryption — “Cryptographers at Johns Hopkins University used publicly available documentation from Apple and Google as well as their own analysis to assess the robustness of Android and iOS encryption. They also studied more than a decade’s worth of reports about which of these mobile security features law enforcement and criminals have previously bypassed, or can currently, using special hacking tools.”
😲 The Ethics of Emotion in AI Systems (Research Summary) — “There will always be a gap between the emotions modelled and the experience of EAI systems. Addressing this gap also implies recognizing the implicit norms and values integrated into these systems in ways that cannot always be foreseen by the original designers. With EAI, it is not just a matter of deciding between the right emotional models and proxy variables, but what the responses collected signify in terms of human beings’ inner feelings, judgments, and future actions.”
Quotation-as-title by Baltasar Gracián. Image from top-linked post.
There are major issues of transparency and authenticity here because the beliefs and opinions don’t actually belong to the digital models, they belong to the models’ creators. And if the creators can’t actually identify with the experiences and groups that these models claim to belong to (i.e., person of color, LGBTQ, etc.), then do they have the right to actually speak on those issues? Or is this a new form of robot cultural appropriation, one in which digital creators are dressing up in experiences that aren’t theirs?
Sinead Bovell (Vogue)
This is an incredible article that looks at machine learning and AI through the lens of an industry I hadn’t thought of as being on the brink of being massively disrupted by technology.
It is strange that “cancel culture” has become a project of the left, which spent the 20th century fighting against capricious firings of “troublesome” employees. A lack of due process does not become a moral good just because you sometimes agree with its targets. We all, I hope, want to see sexism, racism, and other forms of discrimination decrease. But we should be aware of the economic incentives here, particularly given the speed of social media, which can send a video viral, and see onlookers demand a response, before the basic facts have been established. Afraid of the reputational damage that can be incurred in minutes, companies are behaving in ways that range from thoughtless and uncaring to sadistic.
If you care about progressive causes, then woke capitalism is not your friend. It is actively impeding the cause, siphoning off energy, and deluding us into thinking that change is happening faster and deeper than it really is. When people talk about the “excesses of the left”—a phenomenon that blights the electoral prospects of progressive parties by alienating swing voters—in many cases they’re talking about the jumpy overreactions of corporations that aren’t left-wing at all.
Helen Lewis (The Atlantic)
Cancel culture is problematic, mainly because of the unequal power structures involved. This is an important read. See also this article by Albert Wenger, which has some suggestions towards the end in this regard.
The goal of productivity is to get the things you have to get done finished so you can spend more time on the things you want to do. Don’t fall into the busy trap, where you judge your self-worth by how productive you are or how much you’ve contributed to your company or manager. We’re all just trying to keep our heads above water. I hope these tips will help you do the same.
Alan Henry (WIRED)
As I wrote yesterday on my personal blog, I have a bit of an issue with perfectionism. So this, along with the other great advice in the article, was a timely reminder.
If you treat somebody with disdain, of course, you give that person a psychological incentive to diminish your opinion and to want you to be less powerful. Inversely, if you demonstrate understanding and appreciation of someone’s contribution, you create a psychological incentive in the individual to give greater weight to your opinion. And that person will want to strengthen the weight of your opinion in the eyes of others. Appreciation and gratitude breed appreciation and gratitude.
Bruce Tulgan (Fast Company)
Creating a productive, psychologically safe, and emotionally intelligent environment means thanking people for the work they do. That means for their day-to-day activities, not just when they put in a herculean effort. A paycheck is not thanks enough for the work we do and the value we provide.
More interesting still is that nostalgia can bring to mind time-periods we didn’t directly experience. In the film Midnight in Paris (2011), Gil is overwhelmed by nostalgic thoughts about 1920s Paris – which he, a modern-day screenwriter, hasn’t experienced – yet his feelings are nothing short of nostalgic. Indeed, feeling nostalgic for a time one didn’t actually live through appears to be a common phenomenon if all the chatrooms, Facebook pages and websites dedicated to it are anything to go by. In fact, a new word has been coined to capture this precise variant of nostalgia – anemoia, defined by the Urban Dictionary and the Dictionary of Obscure Sorrows as ‘nostalgia for a time you’ve never known’.
How can we make sense of the fact that people feel nostalgia not only for past experiences but also for generic time periods? My suggestion, inspired by recent evidence from cognitive psychology and neuroscience, is that the variety of nostalgia’s objects is explained by the fact that its cognitive component is not an autobiographical memory, but a mental simulation – an imagination, if you will – of which episodic recollections are a sub-class.
Nigel Warburton (Aeon)
In the UK at least, shows like Downton Abbey and Call The Midwife are popular. My view, which this article would seem to support, is that their appeal is a kind of nostalgia for a time imagined to be better.
There’s a sinister side to this, as well. This kind of nostalgia seems to be particularly prevalent among more conservative-leaning (white) people harking back to a time of greater divisions in society along race and class lines. I think it’s rather disturbing.
Quiet Parks International (QPI) is a nonprofit working to establish certification for quiet parks to raise awareness of and preserve quiet places. The fledgling organization—whose members include audio engineers, scientists, environmentalists, and musicians—has identified at least 262 sites worldwide, including 30 in the US, that it believes are quiet or could become so with management changes….
QPI has no regulatory authority, but like the International Dark Sky Association’s Dark Sky Parks initiative, the nonprofit believes its certification—granted only after a detailed, three-day sound analysis—can encourage public support of preservation efforts and provide guidelines for protection. “The places that are quiet today … are basically leftovers—places that are out of the way,” Quiet Parks cofounder Gordon Hempton says.
Jenny Morber (WIRED)
I live in a part of the world close to both a designated Dark Sky Park and mountains into which I can escape. Light and noise pollution threaten both of them, so I’m glad to hear of these efforts.
Today’s title comes from Edward Snowden, and is a pithy rebuttal of the ‘nothing to hide’ argument that I’ve struggled to answer over the years. I’m usually so shocked that an intelligent person would say something to that effect that I’m not sure how to reply.
When you say, ‘I have nothing to hide,’ you’re saying, ‘I don’t care about this right.’ You’re saying, ‘I don’t have this right, because I’ve got to the point where I have to justify it.’ The way rights work is, the government has to justify its intrusion into your rights.
This, then, is the fifth article in my ongoing blogchain about post-pandemic society, which already includes:
It does not surprise me that those with either a loose grip on how the world works, or those who need to believe that someone, somewhere has ‘a plan’, believe in conspiracy theories around the pandemic.
What is true, and what can easily be mistaken for ‘planning’ is the preparedness of those with a strong ideology to double-down on it during a crisis. People and organisations reveal their true colours under stress. What was previously a long game now becomes a short-term priority.
For example, this week, the US Senate “voted to give law enforcement agencies access to web browsing data without a warrant”, reports VICE. What’s interesting, and concerning to me, is that Big Tech and governments are acting like they’ve already won the war on harvesting our online life, and now they’re after our offline life, too.
I have huge reservations about the speed at which Covid-19 apps for contact tracing are being launched when, ultimately, they’re likely to be largely ineffective.
We already know how to do contact tracing well, and how to train people to do it. But, of course, it costs money and is an investment in people instead of technology, and privacy instead of surveillance.
There are plenty of articles out there on the difference between the types of contact tracing apps that are being developed, and this BBC News article has a useful diagram showing the differences between the two.
TL;DR: there is no way that kind of app is going on my phone. I can’t imagine anyone I know who understands tech even a little bit installing it either.
Whatever the mechanics, the whole point of a contact tracing app is to alert you and the authorities when you have been in contact with someone who has the virus. Depending on the wider context, that may or may not be useful to you and society.
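As a rough sketch of how the decentralised, more privacy-preserving variety of these apps is meant to work — this is a toy simplification inspired by protocols like DP-3T and the Apple/Google exposure notification framework, not the real specification, and all names and details here are illustrative:

```python
import secrets

class Phone:
    """Toy model of one handset in a decentralised contact-tracing scheme."""

    def __init__(self):
        self.sent_tokens = []      # random tokens this phone has broadcast
        self.heard_tokens = set()  # tokens picked up from nearby phones

    def broadcast(self):
        # Real protocols derive rotating identifiers from daily keys;
        # a fresh random value stands in for that here.
        token = secrets.token_hex(16)
        self.sent_tokens.append(token)
        return token

    def hear(self, token):
        self.heard_tokens.add(token)

    def exposed(self, published_tokens):
        # Matching happens on-device: the server only ever sees tokens
        # voluntarily uploaded by users who test positive, and never
        # learns who was near whom.
        return bool(self.heard_tokens & set(published_tokens))

# Two phones pass each other; Alice later tests positive and uploads
# her tokens, which Bob's phone checks against locally.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())
published = alice.sent_tokens      # uploaded after a positive test
print(bob.exposed(published))      # True: Bob's phone alerts him
print(Phone().exposed(published))  # False: no contact, no alert
```

The centralised variants differ in exactly that last step: matching happens on a government server, which therefore accumulates a record of who met whom rather than leaving it on people’s devices.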
However, such apps have wider applications. One of the most important things to do with any technology is to think about the effects it could have. What other effects could an app like this have, especially if it’s baked into the operating systems of devices used by 99% of smartphone users worldwide?
The above diagram is Marshall McLuhan’s tetrad of media effects, which is a useful frame for thinking about the impact of technology on society.
Big Tech and governments have our online social graphs, a global map of how everyone relates to everyone else in digital spaces. Now they’re going after our offline social graphs too.
When this was reported back in January, the general reaction seemed to be one of eye-rolling and a kind of Chinese exceptionalism.
I think it’s drones that concern me most of all. Places like Baltimore were already planning overhead surveillance pre-pandemic, and our current situation has only accelerated and exacerbated that trend.
In that case, it’s US Predator drones, previously used to monitor and bomb places in the Middle East, that are being deployed on the civilian population. These drones operate from a great height, unlike the kind of consumer drones that anyone can buy.
However, as was reported last year, we’re on the cusp of thermophotovoltaic-powered drones that can fly for days at a time:
This breakthrough has big implications for technologies that currently rely on heavy batteries for power. Thermophotovoltaics are an ultralight alternative power source that could allow drones and other unmanned aerial vehicles to operate continuously for days. It could also be used to power deep space probes for centuries and eventually an entire house with a generator the size of an envelope.
Linda Vu (TechXplore)
Not only will the government be able to fly thousands of low-cost drones to monitor the population, but it will also be able to buy technology, like this example from DefendTex, to take down other drones.
That is, of course, if civilian drones continue to be allowed, especially given the ‘security risk’ of Chinese-made drones flying around.
It’s interesting times for those who keep a watchful eye on their civil liberties and government invasion of privacy. Bear that in mind when tech bros tell you not to fear robots because they’re dumb. The people behind them aren’t, and they have an agenda.
Critics may say that it is unreasonable to expect maps to reflect the communities or achievements of nonhumans. Maps are made by humans, for humans. When beavers start Googling directions to a neighbor’s dam, then their homes can be represented! For humans who use maps solely to navigate—something that nonhumans do without maps—man-made roads are indeed the only features that are relevant. Following a map that includes other information may inadvertently lead a human onto a trail made by and for deer.
But maps are not just tools to get from points A to B. They also relay new and learned information, document evolutionary changes, and inspire intrepid exploration. We operate on the assumption that our maps accurately reflect what a visitor would find if they traveled to a particular area. Maps have immense potential to illustrate the world around us, identifying all the important features of a given region. By that definition, the current maps that most humans use fall well short of being complete. Our definition of what is “important” is incredibly narrow.
Ryan Huling (WIRED)
Cartography is an incredibly powerful tool. We’ve known for a long time that “the map is not the territory” but perhaps this is another weapon in the fight against climate change and the decline in diversity of species?
Then you get people like Greenwald, Assange, Manning and Snowden. They are polarizing figures. They are loved or hated. They piss people off.
They piss people off precisely because they have principles they consider non-negotiable. They will not do the easy thing when it matters. They will not compromise on anything that really matters.
That’s breaking the actual social contract of “go along to get along”, “obey authority” and “don’t make people uncomfortable.” I recently talked to a senior activist who was uncomfortable even with the idea of yelling at powerful politicians. It struck them as close to violence.
So here’s the thing: people want men and women of principle to be like ordinary people.
They aren’t. They can’t be. If they were, they wouldn’t do what they do. Much of what you may not like about a Greenwald or Assange or Manning or Snowden is why they are what they are. Not just the principle, but the bravery verging on recklessness. The willingness to say exactly what they think, and do exactly what they believe is right even if others don’t.
Activists like Greta Thunberg and Edward Snowden are the closest we get to superheroes, to people who stand for the purest possible version of an idea. This is why we need them — and why we’re so disappointed when they turn out to be human after all.
Students’ not comprehending the value of engaging in certain ways is more likely to be a failure in our teaching than their willingness to learn (especially if we create a culture in which success becomes exclusively about marks and credentialization). The question we have to ask is if what we provide as ‘university’ goes beyond the value of what our students can engage with outside of our formal offer.
This is a great post by Dave, who I had the pleasure of collaborating with briefly during my stint at Jisc. I definitely agree that any organisation walks a dangerous path when it becomes overly-fixated on the ‘how’ instead of the ‘what’ and the ‘why’.
The power of an epigram or one of these expressions is that they say a lot with a little. They help guide us through the complexity of life with their unswerving directness. Each person must, as the retired USMC general and former Secretary of Defense Jim Mattis has said, “Know what you will stand for and, more important, what you won’t stand for.” “State your flat-ass rules and stick to them. They shouldn’t come as a surprise to anyone.”
Of the 11 expressions here, I have to say that other than memento mori (“remember you will die”) I particularly like semper anticus (“always forward”) which I’m going to print out in a fancy font and stick on the wall of my home office.
In a hypothetical world, you could get a Discord (or whatever is next) link for your new job tomorrow – you read some wiki and meta info, sort yourself into your role, and then are grouped with the people you need to collaborate with on an as-needed basis. All wrapped in one platform. Maybe you have an HR complaint – drop it in #HR, where you can’t read the messages but they can, so it’s a blind one-way conversation. Maybe there is a #help channel, where you write your problems and a bot pings people who have expertise based on keywords. There’s a lot of things you can do with this basic design.
What is described in this post is a bit of a stretch, but I can see it: a world where work is organised a bit like how gamers organise in chat channels. Something to keep an eye on, as the interplay between what’s ‘normal’ and what’s possible with communications technology changes and evolves.
What made the last decade so difficult is how education institutions let corporations control the definitions so that a lot of “study and ethical practice” gets left out of the work. With the promise of ease of use, low-cost, increased student retention (or insert unreasonable-metric-claim here), etc. institutions are willing to buy into technology without regard to accessibility, scalability, equity and inclusion, data privacy or student safety, in hope of solving problem X that will then get to be checked off of an accreditation list. Or worse, with the hope of not having to invest in actual people and local infrastructure.
Geoff Cain (Brainstorm in progress)
It’s nice to see a list of some positives that came out of the last decade, and for microcredentials and badging to be on that list.
First, let’s consider the canonized usages. The subreddit r/birbs defines a birb as any bird that’s “being funny, cute, or silly in some way.” Urban Dictionary has a more varied set of definitions, many of which allude to a generalized smallness. A video on the YouTube channel Lucidchart offers its own expansive suggestions: All birds are birbs, a chunky bird is a borb, and a fluffed-up bird is a floof. Yet some tension remains: How can all birds be birbs if smallness or cuteness are in the equation? Clearly some birds get more recognition for an innate birbness.
Asher Elbein (Audubon magazine)
A fun article, but also an interesting one when it comes to ambiguity, affinity groups, and internet culture.
“Now, why would Gmail or Facebook pay us? Because what we’re giving them in return is not money but data. We’re giving them lots of data about where we go, what we eat, what we buy. We let them read the contents of our email and determine that we’re about to go on vacation or we’ve just had a baby or we’re upset with our friend or it’s a difficult time at work. All of these things are in our email that can be read by the platform, and then the platform’s going to use that to sell us stuff.”
Fiona Scott Morton (Yale School of Management), quoted by Peter Coy (Bloomberg Businessweek)
Regular readers of Thought Shrapnel know all about surveillance capitalism, but it’s good to see these explainers making their way to the more mainstream business press.
The most famous social credit system in operation is that used by China’s government. It “monitors millions of individuals’ behavior (including social media and online shopping), determines how moral or immoral it is, and raises or lowers their ‘citizen score’ accordingly,” reported The Atlantic in 2018.
“Those with a high score are rewarded, while those with a low score are punished.” Now we know the same AI systems are used for predictive policing to round up Muslim Uighurs and other minorities into concentration camps under the guise of preventing extremism.
Violet Blue (Engadget)
Some (more prudish) people will write this article off because it discusses sex workers, porn, and gay rights. But the truth is that all kinds of censorship start with marginalised groups. To my mind, we’re already on a trajectory away from Silicon Valley and towards Chinese technology. Will we be able to separate the tech from the morality?
The researchers worry that the focus on keeping children away from screens is making it hard to have more productive conversations about topics like how to make phones more useful for low-income people, who tend to use them more, or how to protect the privacy of teenagers who share their lives online.
“Many of the people who are terrifying kids about screens, they have hit a vein of attention from society and they are going to ride that. But that is super bad for society,” said Andrew Przybylski, the director of research at the Oxford Internet Institute, who has published several studies on the topic.
Nathaniel Popper (The New York Times)
Kids and screen time is just the latest (extended) moral panic. Overuse of anything causes problems, smartphones, games consoles, and TV included. What we need to do is help our children find balance, which can be difficult for the first generation of parents navigating it all on the frontline.
Happy 25th year, blogging. You’ve grown up, but social media is still having a brawl (The Guardian) — “The furore over social media and its impact on democracy has obscured the fact that the blogosphere not only continues to exist, but also to fulfil many of the functions of a functioning public sphere. And it’s massive. One source, for example, estimates that more than 409 million people view more than 20bn blog pages each month and that users post 70m new posts and 77m new comments each month. Another source claims that of the 1.7 bn websites in the world, about 500m are blogs. And WordPress.com alone hosts blogs in 120 languages, 71% of them in English.”
Emmanuel Macron Wants to Scan Your Face (The Washington Post) — “President Emmanuel Macron’s administration is set to be the first in Europe to use facial recognition when providing citizens with a secure digital identity for accessing more than 500 public services online… The roll-out is tainted by opposition from France’s data regulator, which argues the electronic ID breaches European Union rules on consent – one of the building blocks of the bloc’s General Data Protection Regulation laws – by forcing everyone signing up to the service to use the facial recognition, whether they like it or not.”
This is your phone on feminism (The Conversationalist) — “Our devices are basically gaslighting us. They tell us they work for and care about us, and if we just treat them right then we can learn to trust them. But all the evidence shows the opposite is true. This cognitive dissonance confuses and paralyses us. And look around. Everyone has a smartphone. So it’s probably not so bad, and anyway, that’s just how things work. Right?”
Google’s auto-delete tools are practically worthless for privacy (Fast Company) — “In reality, these auto-delete tools accomplish little for users, even as they generate positive PR for Google. Experts say that by the time three months rolls around, Google has already extracted nearly all the potential value from users’ data, and from an advertising standpoint, data becomes practically worthless when it’s more than a few months old.”
Audrey Watters (Uses This) — “For me, the ideal set-up is much less about the hardware or software I am using. It’s about the ideas that I’m thinking through and whether or not I can sort them out and shape them up in ways that make for a good piece of writing. Ideally, that does require some comfort — a space for sustained concentration. (I know better than to require an ideal set up in order to write. I’d never get anything done.)”
Computer Files Are Going Extinct (OneZero) — “Files are skeuomorphic. That’s a fancy word that just means they’re a digital concept that mirrors a physical item. A Word document, for example, is like a piece of paper, sitting on your desk(top). A JPEG is like a painting, and so on. They each have a little icon that looks like the physical thing they represent. A pile of paper, a picture frame, a manila folder. It’s kind of charming really.”
Inside Mozilla’s 18-month effort to market without Facebook (Digiday) — “The decision to focus on data privacy in marketing the Mozilla brand came from research conducted by the company four years ago into the rise of consumers who make values-based decisions on not only what they purchase but where they spend their time.”
Core human values not eyeballs (Cubic Garden) — “There’s so much more to do, but the aims are high and important for not just the BBC, but all public service entities around the world. Measuring the impact and quality on people’s lives beyond the shallow, meaningless metrics for public service is critical.”
So said my namesake Douglas Adams. In fact, he said lots of wise things about technology, most of them too long to serve as a title.
I’m in a weird place, emotionally, at the moment, but sometimes this can be a good thing. Being taken out of your usual ‘autopilot’ can be a useful way to see things differently. So I’m going to take this opportunity to share three things that, to be honest, make me a bit concerned about the next few years…
Attempts to put microphones everywhere
In an article for Slate, Shannon Palus ranks all of Amazon’s new products by ‘creepiness’. The Echo Frames are, in her words:
A microphone that stays on your person all day and doesn’t look like anything resembling a microphone, nor follows any established social codes for wearable microphones? How is anyone around you supposed to have any idea that you are wearing a microphone?
When we’re not talking about weapons of mass destruction, it’s not the tech that concerns me, but the context in which the tech is used. As Palus points out, how are you going to be able to have a ‘quiet word’ with anyone wearing glasses ever again?
It’s not just Amazon, of course. Google and Facebook are at it, too.
With the exception, perhaps, of populist politicians, I don’t think we’re ready for a post-truth society. Check out the video above, which shows Chinese technology that allows for ‘full body deepfakes’.
The video is embedded, along with a couple of others, in an article for Fast Company by DJ Pangburn, who also notes that AI is learning human body movements from videos. Not only will you be able to prank your friends by showing them a convincing video of your ability to do 100 pull-ups, but the fake news it engenders will mean we can’t trust anything any more.
If you clicked on the ‘super-secret link’ in Sunday’s newsletter, you will have come across STEALING UR FEELINGS which is nothing short of incredible. As powerful as it is in showing you the kind of data that organisations have on us, it’s the tip of the iceberg.
Kaveh Waddell, in an article for Axios, explains that brains are the last frontier for privacy:
“The sort of future we’re looking ahead toward is a world where our neural data — which we don’t even have access to — could be used” against us, says Tim Brown, a researcher at the University of Washington Center for Neurotechnology.
This would lead to ‘neuromarketing’, with advertisers knowing what triggers and influences you better than you know yourself. Also, it will no doubt be used for discriminatory purposes and, because it’s coming directly from your brainwaves, short of literally wearing a tinfoil hat, there’s nothing much you can do.
Have a quick skim through these links that I came across this week and found interesting:
Overrated: Ludwig Wittgenstein (Standpoint) — “Wittgenstein’s reputation for genius did not depend on incomprehensibility alone. He was also “tortured”, rude and unreliable. He had an intense gaze. He spent months in cold places like Norway to isolate himself. He temporarily quit philosophy, because he believed that he had solved all its problems in his 1922 Tractatus Logico-Philosophicus, and worked as a gardener. He gave away his family fortune. And, of course, he was Austrian, as so many of the best geniuses are.”
EdTech Resistance (Ben Williamson) — “We should not and cannot ignore these tensions and challenges. They are early signals of resistance ahead for edtech which need to be engaged with before they turn to public outrage. By paying attention to and acting on edtech resistances it may be possible to create education systems, curricula and practices that are fair and trustworthy. It is important not to allow edtech resistance to metamorphose into resistance to education itself.”
The Guardian view on machine learning: a computer cleverer than you? (The Guardian) — “The promise of AI is that it will imbue machines with the ability to spot patterns from data, and make decisions faster and better than humans do. What happens if they make worse decisions faster? Governments need to pause and take stock of the societal repercussions of allowing machines over a few decades to replicate human skills that have been evolving for millions of years.”
A nerdocratic oath (Scott Aaronson) — “I will never allow anyone else to make me a cog. I will never do what is stupid or horrible because “that’s what the regulations say” or “that’s what my supervisor said,” and then sleep soundly at night. I’ll never do my part for a project unless I’m satisfied that the project’s broader goals are, at worst, morally neutral. There’s no one on earth who gets to say: “I just solve technical problems. Moral implications are outside my scope”.”
Privacy is power (Aeon) — “The power that comes about as a result of knowing personal details about someone is a very particular kind of power. Like economic power and political power, privacy power is a distinct type of power, but it also allows those who hold it the possibility of transforming it into economic, political and other kinds of power. Power over others’ privacy is the quintessential kind of power in the digital age.”
The Symmetry and Chaos of the World’s Megacities (WIRED) — “Koopmans manages to create fresh-looking images by finding unique vantage points, often by scouting his locations on Google Earth. As a rule, he tries to get as high as he can—one of his favorite tricks is talking local work crews into letting him shoot from the cockpit of a construction crane.”