Tag: privacy

Designing for privacy

Someone described the act of watching Mark Zuckerberg, CEO of Facebook, testifying before Congress as “low level self-harm”. In this post, Joe Edelman explains why:

Zuckerberg and the politicians—they imagine privacy as if it were a software feature. They imagine a system has “good privacy” if it’s consensual and configurable; that is, if people explicitly agree to something, and understand what they agree to, that’s somehow “good for privacy”. Even usually-sophisticated-analysts like Zeynep Tufekci are missing all the nuance here.

Giving the example of a cocktail party where you’re talking to a friend about something confidential when someone you don’t know comes along, Edelman introduces this definition of privacy:

Privacy, n. Maintaining a sense of what to show in each environment; Locating social spaces for aspects of yourself which aren’t ready for public display, where you can grow those parts of yourself until they can be more public.

I really like this definition, especially the part around “locating social spaces for aspects of yourself which aren’t ready for public display”. I think educators in particular should note this.

Referencing his HSC1 Curriculum, which forms the basis of workshops he runs for staff from major tech companies, Edelman includes a graphic on the structural features of privacy. I’ll type this out here for the sake of legibility:

  • Relational depth (close friends / acquaintances / strangers / anonymous / mixed)
  • Presentation (crafted / basic / disheveled)
  • Connectivity (transient / pairwise / whole-group)
  • Stakes (high / low)
  • Status levels (celebrities / rank / flat)
  • Reliance (interdependent / independent)
  • Time together (none / brief / slow)
  • Audience size (big / small / unclear)
  • Audience loyalty (loyal / transient / unclear)
  • Participation (invited / uninvited)
  • Pretext (shared goal / shared values / shared topic / many goals (exchange) / emergent)
  • Social Gestures (like / friend / follow / thank / review / comment / join / commit / request / buy)

The post is, of course, both an expert response to the zeitgeist, and a not-too-subtle hint that people should take his course. I’m sure Edelman goes into more depth about each of these structural features in his workshops.

Nevertheless, even without attending his sessions (which I’m sure are great), there’s value in thinking through each of these elements for the work I’m doing around the MoodleNet project. I’ve probably done some thinking around 70% of these, but it’s great to have a list that helps me organise my thinking a little more.

Source: Joe Edelman

Every part of your digital life is being tracked, packaged up, and sold

I’ve just installed Lumen Privacy Monitor on my Android smartphone after reading this blog post from Mozilla:

New research co-authored by Mozilla Fellow Rishab Nithyanand explores just this: The opaque realm of third-party trackers and what they know about us. The research is titled “Apps, Trackers, Privacy, and Regulators: A Global Study of the Mobile Tracking Ecosystem,” and is authored by researchers at Stony Brook University, Data & Society, IMDEA Networks, ICSI, Princeton University, Corelight, and the University of Massachusetts Amherst.

[…]

In all, the team identified 2,121 trackers — 233 of which were previously unknown to popular advertising and tracking blacklists. These trackers collected personal data like Android IDs, phone numbers, device fingerprints, and MAC addresses.

The full report is linked in the quotation above, but the high-level findings were:

» Most trackers are owned by just a few parent organizations. The authors report that sixteen of the 20 most pervasive trackers are owned by Alphabet. Other parent organizations include Facebook and Verizon. “There is a clear oligopoly happening in the ecosystem,” Nithyanand says.

» Mobile games and educational apps are the two categories with the highest number of trackers. Users of news and entertainment apps are also exposed to a wide range of trackers. In a separate paper co-authored by Vallina-Rodriguez, he explores the intersection of mobile tracking and apps for youngsters: “Is Our Children’s Apps Learning?”

» Cross-device tracking is widespread. The vast majority of mobile trackers are also active on the desktop web, allowing companies to link together personal data produced in both ecosystems. “Cross-platform tracking is already happening everywhere,” Nithyanand says. “Fifteen of the top 20 organizations active in the mobile advertising space also have a presence in the web advertising space.”

We’re finally getting to the stage where a large portion of the population can’t really ignore the fact that they’re using free services in return for pervasive and always-on surveillance.

Source: Mozilla: Read, Write, Participate

Survival in the age of surveillance

The Guardian has a list of 18 tips to ‘survive’ (i.e. be safe) in an age where everyone wants to know everything about you — so that they can package up your data and sell it to the highest bidder.

On the internet, the adage goes, nobody knows you’re a dog. That joke is only 15 years old, but seems as if it is from an entirely different era. Once upon a time the internet was associated with anonymity; today it is synonymous with surveillance. Not only do modern technology companies know full well you’re not a dog (not even an extremely precocious poodle), they know whether you own a dog and what sort of dog it is. And, based on your preferred category of canine, they can go a long way to inferring – and influencing – your political views.

Mozilla has pointed out in a recent blog post that the containers feature in Firefox can increase your privacy and prevent ‘leakage’ between tabs as you navigate the web. But there’s more to privacy and security than just that.

Here’s the Guardian’s list:

  1. Download all the information Google has on you.
  2. Try not to let your smart toaster take down the internet.
  3. Ensure your AirDrop settings are dick-pic-proof.
  4. Secure your old Yahoo account.
  5. 1234 is not an acceptable password.
  6. Check if you have been pwned.
  7. Be aware of personalised pricing.
  8. Say hi to the NSA guy spying on you via your webcam.
  9. Turn off notifications for anything that’s not another person speaking directly to you.
  10. Never put your kids on the public internet.
  11. Leave your phone in your pocket or face down on the table when you’re with friends.
  12. Sometimes it’s worth just wiping everything and starting over.
  13. An Echo is fine, but don’t put a camera in your bedroom.
  14. Have as many social-media-free days in the week as you have alcohol-free days.
  15. Retrain your brain to focus.
  16. Don’t let the algorithms pick what you do.
  17. Do what you want with your data, but guard your friends’ info with your life.
  18. Finally, remember your privacy is worth protecting.

A bit of a random list in places, but useful all the same.

Source: The Guardian

The only privacy policy that matters is your own

Dave Pell writes NextDraft, a daily newsletter that’s one of the most popular on the web. I used to subscribe, and it’s undeniably brilliant, but a little US-centric for my liking.

My newsletter, Thought Shrapnel, doesn’t track you. In fact, I have to keep battling MailChimp (the platform I use to send it out) as it thinks I’ve made a mistake. Tracking is so pervasive that disabling it looks like an error, but I have no need to know exactly how many people clicked on a particular link. It’s an inexact science, anyway.

Pell has written a great post about online privacy:

The story of Cambridge Analytica accessing your personal data on Facebook, supposedly creating a spot-on psychographic profile, and then weaponizing your own personality against you with a series of well-worded messages is now sweeping the media. And it will get louder. And it will pass. And then, I promise, there will be another story about your data being stolen, borrowed, hacked, misused, shared, bought, sold and on and on.

He points out the disconnect between rich people such as Mark Zuckerberg, CEO of Facebook, going to “great lengths” to protect their own privacy while simultaneously depriving Facebook users of theirs.

They are right to want privacy. They are right to want to keep their personal lives walled off from anyone from nosy neighbors to potential thieves to, well, Matt Richtel. They should lock their doors and lock down their information. They are right not to want you to know where they live, with whom they live, or how much they spend. They’re right to want to plug a cork in the social media champagne bottle we’ve shaken up in our blind celebration of glass houses.

They are right not to want to toss the floor planks that represent their last hint of personal privacy into the social media wood chipper. They are right in their unwillingness to give in to the seeming inevitability of the internet sharing machine. Do you really think it’s a coincidence that most of the buttons you press on the web are labeled with the word submit?

Non-Disclosure Agreements (NDAs) have been in the news recently as Donald Trump has taken his shady business practices to the White House. Pell notes that the principle behind them is nevertheless sound: you don’t get to divulge my personal details without my permission.

So you should follow their lead. Don’t do what they say. Do what they do. Better yet, do what they NDA.

[…]

There’s a pretty simple rule: never share anything on any site anywhere on the internet regardless of any privacy settings unless you are willing to accept that the data might one day be public.

The only privacy policy that matters is your own.

Source: Dave Pell

GDPR, blockchain, and privacy

I’m taking an online course about the impending General Data Protection Regulation (GDPR), which I’ve been writing about on my personal blog. An article in WIRED talks about the potential it will have, along with technologies such as blockchain.

People have talked for years about everyone having ‘private data accounts’ which they then choose to hook up to service providers. GDPR might just force that to happen:

A new generation of apps and websites will arise that use private-data accounts instead of conventional user accounts. Internet applications in 2018 will attach themselves to these, gaining access to a smart data account rich with privately held contextual information such as stress levels (combining sleep patterns, for example, with how busy a user’s calendar is) or motivation to exercise (comparing historical exercise patterns to infer about the day ahead). All of this will be possible without the burden on the app supplier of undue sensitive data liability or any violation of consumers’ personal rights.

As the article points out, when we know what’s going to happen with our data, we’re probably more likely to share it. For example, I’m much more likely to invest in voice-assisted technologies once GDPR hits in May:

Paradoxically, the internet will become more private at a moment when we individuals begin to exchange more data. We will then wield a collective economic power that could make 2018 the year we rebalance the digital economy.

This will have a huge effect on our everyday information landscape:

The more we share data on our terms, the more the internet will evolve to emulate the physical domain where private spaces, commercial spaces and community spaces can exist separately, but side by side. Indeed, private-data accounts may be the first step towards the internet as a civil society, paving the way for a governing system where digital citizens, in the form of their private micro-server data account, do not merely have to depend on legislation to champion their private rights, but also have the economic power to enforce them as well.

I have to say, the more I discover about the provisions of GDPR, the more excited and optimistic I am about the future.

Source: WIRED

No cash, no freedom?

The ‘cashless’ society, eh?

Every time someone talks about getting rid of cash, they are talking about getting rid of your freedom. Every time they actually limit cash, they are limiting your freedom. It does not matter if the people doing it are wonderful Scandinavians or Hindu supremacist Indians, they are people who want to know and control what you do to an unprecedentedly fine-grained scale.

Yep, just because someone cool is doing it doesn’t mean it won’t have bad consequences. In the rush to add technology to things, we create future dystopias.

Cash isn’t completely anonymous. There’s a reason why old fashioned crooks with huge cash flows had to money-launder: Governments are actually pretty good at saying, “Where’d you get that from?” and getting an explanation. Still, it offers freedom, and the poorer you are, the more freedom it offers. It also is very hard to track specifically, i.e., who made what purchase.

Blockchains won’t be untaxable. The ones which truly are unbreakable will be made illegal; the ones that remain, well, it’s a ledger with every transaction on it, for goodness sakes.

It’s this bit that concerns me:

We are creating a society where even much of what you say, will be knowable and indeed, may eventually be tracked and stored permanently.

If you do not understand why this is not just bad, but terrible, I cannot explain it to you. You have some sort of mental impairment of imagination and ethics.

Source: Ian Welsh

The NSA (and GCHQ) can find you by your ‘voiceprint’ even if you’re speaking a foreign language on a burner phone

This is pretty incredible:

Americans most regularly encounter this technology, known as speaker recognition, or speaker identification, when they wake up Amazon’s Alexa or call their bank. But a decade before voice commands like “Hello Siri” and “OK Google” became common household phrases, the NSA was using speaker recognition to monitor terrorists, politicians, drug lords, spies, and even agency employees.

The technology works by analyzing the physical and behavioral features that make each person’s voice distinctive, such as the pitch, shape of the mouth, and length of the larynx. An algorithm then creates a dynamic computer model of the individual’s vocal characteristics. This is what’s popularly referred to as a “voiceprint.” The entire process — capturing a few spoken words, turning those words into a voiceprint, and comparing that representation to other “voiceprints” already stored in the database — can happen almost instantaneously. Although the NSA is known to rely on finger and face prints to identify targets, voiceprints, according to a 2008 agency document, are “where NSA reigns supreme.”

Hmmm….

The voice is a unique and readily accessible biometric: Unlike DNA, it can be collected passively and from a great distance, without a subject’s knowledge or consent. Accuracy varies considerably depending on how closely the conditions of the collected voice match those of previous recordings. But in controlled settings — with low background noise, a familiar acoustic environment, and good signal quality — the technology can use a few spoken sentences to precisely match individuals. And the more samples of a given voice that are fed into the computer’s model, the stronger and more “mature” that model becomes.
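
The pipeline described above (reduce a few seconds of speech to a compact representation of vocal characteristics, then compare it against a database of stored voiceprints) is easy to sketch in outline. Here’s a minimal, hypothetical illustration in Python; it assumes each utterance has already been turned into an embedding vector, and every name and threshold below is mine, not anything from the article:

```python
# Conceptual sketch of speaker identification by voiceprint comparison.
# Assumes each utterance has already been reduced to a fixed-length embedding
# of vocal characteristics; all names and values here are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two voiceprint vectors, 1.0 meaning identical."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class VoiceprintDatabase:
    def __init__(self):
        self.voiceprints = {}  # speaker id -> list of sample embeddings

    def enroll(self, speaker_id: str, embedding: np.ndarray) -> None:
        """Add another sample; more samples make the stored model more 'mature'."""
        self.voiceprints.setdefault(speaker_id, []).append(embedding)

    def identify(self, embedding: np.ndarray, threshold: float = 0.8):
        """Return the best-matching speaker, or None if nothing is close enough."""
        best_id, best_score = None, threshold
        for speaker_id, samples in self.voiceprints.items():
            model = np.mean(samples, axis=0)  # average samples into one voiceprint
            score = cosine_similarity(embedding, model)
            if score > best_score:
                best_id, best_score = speaker_id, score
        return best_id

# Usage with made-up three-dimensional embeddings:
db = VoiceprintDatabase()
db.enroll("target", np.array([0.9, 0.1, 0.3]))
db.enroll("target", np.array([0.8, 0.2, 0.25]))
print(db.identify(np.array([0.85, 0.15, 0.28])))  # -> "target"
```

Real systems obviously use far richer acoustic models, but the enrol-then-compare shape, with extra samples ‘maturing’ the stored model, is what the quotation describes.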

So yeah, let’s put a microphone in every room of our house so that we can tell Alexa to turn off the lights. What could possibly go wrong?

Source: The Intercept

DuckDuckGo moves beyond search

This is excellent news:

Today we’re taking a major step to simplify online privacy with the launch of fully revamped versions of our browser extension and mobile app, now with built-in tracker network blocking, smarter encryption, and, of course, private search – all designed to operate seamlessly together while you search and browse the web. Our updated app and extension are now available across all major platforms – Firefox, Safari, Chrome, iOS, and Android – so that you can easily get all the privacy essentials you need on any device with just one download.

I have a multitude of blockers installed, which makes it difficult to recommend just one to people. Hopefully this will simplify things:

For the last decade, DuckDuckGo has been giving you the ability to search privately, but that privacy was only limited to our search box. Now, when you also use the DuckDuckGo browser extension or mobile app, we will provide you with seamless privacy protection on the websites you visit. Our goal is to expand this privacy protection over time by adding even more privacy features into this single package. While not all privacy protection can be as seamless, the essentials available today and those that we will be adding will go a long way to protecting your privacy online, without compromising your Internet experience.

It looks like the code is all open source, too! 👏 👏 👏

Source: DuckDuckGo blog

WTF is GDPR?

I have to say, I was quite dismissive of the impact of the EU’s General Data Protection Regulation (GDPR) when I first heard about it. I thought it was going to be another debacle like the ‘this website uses cookies’ thing.

However, I have to say I’m impressed with what’s going to happen in May. It’s going to have a worldwide impact, too — as this article explains:

For an even shorter tl;dr the [European Commission’s] theory is that consumer trust is essential to fostering growth in the digital economy. And it thinks trust can be won by giving users of digital services more information and greater control over how their data is used. Which is — frankly speaking — a pretty refreshing idea when you consider the clandestine data brokering that pervades the tech industry. Mass surveillance isn’t just something governments do.

It’s a big deal:

[GDPR is] set to apply across the 28-Member State bloc as of May 25, 2018. That means EU countries are busy transposing it into national law via their own legislative updates (such as the UK’s new Data Protection Bill — yes, despite the fact the country is currently in the process of (br)exiting the EU, the government has nonetheless committed to implementing the regulation because it needs to keep EU-UK data flowing freely in the post-brexit future). Which gives an early indication of the pulling power of GDPR.

…and unlike other regulations, actually has some teeth:

The maximum fine that organizations can be hit with for the most serious infringements of the regulation is 4% of their global annual turnover (or €20M, whichever is greater). Though data protection agencies will of course be able to impose smaller fines too. And, indeed, there’s a tiered system of fines — with a lower level of penalties of up to 2% of global turnover (or €10M).
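
For a sense of scale, the tiered fines quoted above reduce to simple arithmetic. A minimal sketch using only the figures in the quotation (the turnover number in the example is made up, and this is an illustration, not legal advice):

```python
# Rough sketch of the GDPR maximum-fine tiers quoted above (illustrative only).
def max_gdpr_fine(annual_turnover_eur: float, serious_infringement: bool) -> float:
    """Return the fine ceiling: the greater of a flat cap or a % of turnover."""
    if serious_infringement:
        return max(20_000_000, 0.04 * annual_turnover_eur)  # €20M or 4%
    return max(10_000_000, 0.02 * annual_turnover_eur)      # €10M or 2%

# A company turning over €2bn could face up to €80M for a serious infringement:
print(max_gdpr_fine(2_000_000_000, serious_infringement=True))  # 80000000.0
```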

I’m having conversations about it wherever I go, from my work at Moodle (a company headquartered in Australia) to the local Scouts.

Source: TechCrunch

Choose your connected silo

The Verge reports back from CES, the yearly gathering where people usually get excited about shiny things. This year, however, people are a bit more wary…

And it’s not just privacy and security that people need to think about. There’s also lock-in. You can’t just buy a connected gadget, you have to choose an ecosystem to live in. Does it work with HomeKit? Will it work with Alexa? Will some tech company get into a spat with another tech company and pull its services from that hardware thing you just bought?

In other words, the kind of digital literacies required by the average consumer just went up a notch.

Here’s the thing: it’s unlikely that the connected toothpaste will go back in the tube at this point. Consumer products will be more connected, not less. Some day not long from now, the average person’s stroll down the aisle at Target or Best Buy will be just like our experiences at futuristic trade shows: everything is connected, and not all of it makes sense.

It won’t be long before we’ll be inviting techies around to debug our houses…

Source: The Verge