African experts gathered for two days (19-20 February 2018) in Addis Ababa, Ethiopia to contribute to the development of the African Privacy and Personal Data Protection Guidelines. The meeting, facilitated by the African Union Commission (AUC) and supported by the Internet Society, explored the future of privacy and data protection and offered practical suggestions that African states can consider when implementing the Malabo Convention provisions related to online privacy. The guidelines aim to empower citizens and establish legal certainty for stakeholders through clear, uniform personal data protection rules for the region.
The expert meeting comes amid growing concern around the world about the need to prepare for the EU General Data Protection Regulation (GDPR), which takes effect on 25 May 2018. The expert meeting, however, is focused on creating general principles to guide African member states in developing good practices now and in the future. The project, a partnership between the AUC and the Internet Society, follows up on the recommendations of the Africa Infrastructure Security Guidelines, developed in 2017, and aims to help speed up the adoption and subsequent ratification of the Malabo Convention.
Both the Heads of States Summit in January 2018 and Specialized Technical Committee Ministerial meeting endorsed the development of these guidelines as a way to strengthen the capacity of African states to deal with emerging issues in the digital space.
The African privacy and data protection landscape is still nascent, with only 16 of the 55 countries having adopted comprehensive privacy laws regulating the collection and use of personal information (C Fichet, 2015). The African Union Convention on Cyber Security and Personal Data Protection is considered an important first step: it aims to create a uniform system of data processing and a common set of rules governing cross-border transfer of personal data at the continental (African) level, avoiding divergent regulatory approaches between the Member States of the African Union. Now that a continental framework is in place, there is a need for more detailed best-practice guidelines on personal data protection to assist countries in domesticating the Malabo Convention into national law.
Did you know your television is watching you? Specifically, most smart TVs send data back to their makers and, in certain cases, to marketers. Consumer Reports showcased the security flaws and the lack of privacy inherent in connected TVs in a report last week, while over at Gizmodo, Kashmir Hill has a new article out about privacy in the smart home that puts a big focus on televisions.
It’s no secret that internet-connected TVs share data with others, nor is it remarkable that most TVs available today are smart. That’s what allows you to watch Netflix, YouTube, or Amazon Prime shows. But the rest of our appliances are also going the way of TV. Samsung and Kenmore both say that, going forward, all of their appliances will have some kind of connectivity built into them.
And for many, the features enabled by connected devices will mostly outweigh the fears of data surveillance. I’m not talking about connected light bulbs and home automation here, but about adding truly innovative and helpful features to once-dumb appliances, letting them become truly smart.
An example of this is a washing machine that can tell how dirty your clothes are and select the proper cycle. Or a fridge that can offer you a remote camera feed to the inside so you can see what’s on the shelf. Maybe the fridge could reorder your water filter when it’s getting old. Even better, maybe that same filter could report back on the purity of the water to environmental agencies and consumers as a way to ensure public health.
Smarter products will have to be connected in order to create information exchanges that benefit the consumer, the manufacturer, and maybe even society. However, the industry so far is screwing this up with an ineptitude driven by greed, short-term thinking, and a desire to act first and beg forgiveness later.
This is emblematic of the culture built up over the last two decades in technology, where we took the internet and used it to turn users into the product. The current backlash against Silicon Valley companies is a reaction to this exchange of personal data for services. Especially as the services became more about keeping the person engaged to the exclusion of their well-being or the well-being of society.
This may sound like hippie dippie stuff, but there is a direct link from Google and Facebook’s behavior to the privacy concerns that people have with regard to connected devices. That those concerns are completely justified only makes it worse.
I’ve spent years trying to tell the industry and the government that privacy matters. Not just because it’s a basic right, but because if you respect people’s privacy and offer them agency over controlling their data, they are more likely to buy the product. And if you offer them a compelling reason to share their data while still offering them some control, you actually build a model where the data you collect has to benefit the user or the larger society.
We are starting to see some momentum on this front, and I am hopeful that 2018 will be a turning point in the U.S. The General Data Protection Regulation in the EU has already established a framework for treating data privacy as a human right. What’s even more promising is that many of the GDPR’s provisions are difficult or impossible to implement today, and the EU realizes that.
The hope is that the EU will guide technologists in developing tools that match the regulatory framework, while the regulatory stick offers an incentive for companies to build a market for the tools required to meet the law. Meanwhile, here in the U.S., technologists are increasingly asking themselves how to get and use data responsibly.
While this entire essay is focused on the importance of managing user privacy and the intentional gathering and sharing of consumer data, security is also related to the topic: specifically, what happens to consumer data when security is breached. As it stands, consumers are worried about losing their privacy not only to companies, but also to hackers in the all-too-frequent security breaches.
Until the tech companies get their priorities in order and the government steps up with rules that give consumers some control over their information, I believe the promise of the smart home will never take off, because consumers won’t trust it.
This week’s big news had to do with a heat map published back in November by a fitness tracking application called Strava. A 20-year-old Australian noticed that the running data from U.S. military personnel indicated where clandestine bases were in Syria. His insights percolated through security analysts on Twitter, and then to the U.S. Department of Defense.
Now the DOD is re-evaluating its policies around wearables and mobile phones, and will likely look at the social media habits of its soldiers as well. What happened with Strava is nothing new, exactly. On a smaller scale, hackers and spies have used public social media profiles to get all kinds of information on targets.
But there are two things that are different about the Strava case—and worth noting. The first is the scale of it. The second is how two types of data were combined to create new insights. Strava helpfully showed data from more than a billion activities which, when combined with the map, created a clear picture for those who knew what they were looking for, and disclosed more than Strava intended.
Inadvertently disclosing new information will be the new challenge of our age as we connect ourselves and our things to the internet. Each of us will leave ever-larger digital footprints, which can be combined in various ways to provide new information, all of which will be searchable to anyone with an internet connection and an interest.
Short of hiding in a bunker, wrapping your phone in foil, and ditching social media, what is a person — or a concerned employer — to do? The short answer is we don’t know. Even fully grasping the problem is tough. There are several aspects to it.
Most importantly, there’s an increasing amount of data about individuals online that’s fairly easy to get. Then there’s an increasing amount of data about that data, so-called metadata, that’s also easy to find (or subpoena). For example, if your tweets are data, then the location data attached to them are metadata. And this data can now be combined in new ways. In this week’s podcast, privacy analyst Chiara Rustici called this a “toxic combination.”
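To make that "toxic combination" concrete, here is a sketch in Python using entirely invented records: none of the posts states where the author lives, yet combining the post content with its location metadata gives that away.

```python
from collections import Counter

# Hypothetical posts; "place" is the location metadata attached to each one.
posts = [
    {"text": "good morning!", "hour": 8,  "place": "suburb A"},
    {"text": "lunch meeting", "hour": 12, "place": "downtown"},
    {"text": "great run",     "hour": 18, "place": "riverside park"},
    {"text": "movie night",   "hour": 23, "place": "suburb A"},
    {"text": "can't sleep",   "hour": 1,  "place": "suburb A"},
]

# No post says where the author lives, but late-night location metadata
# clusters around one place: a new fact inferred from the combination.
night_places = [p["place"] for p in posts if p["hour"] >= 22 or p["hour"] <= 5]
likely_home = Counter(night_places).most_common(1)[0][0]
print(likely_home)  # suburb A
```

A few lines of code and two innocuous data streams are enough to produce information the author never disclosed, which is exactly the Strava problem at an individual scale.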
Finally, once data is out there, it can be reused, repurposed, and reformulated to help draw new conclusions and meanings that were never intended. Imagine if that permanent record your teachers threatened you with back in school were real. In this new era it effectively is.
That’s just the data challenge. There’s also an economic challenge. Data is incredibly cheap. Which means getting data and metadata and creating these toxic combinations is also incredibly cheap. It’s also seen as incredibly valuable to corporations, which is why everything from your toothbrush maker to your coffeepot is trying to snag as much information as it can.
Data may be cheap to get and hold economic value, but it’s also expensive and difficult to secure, which means bad actors can get a hold of your social security number and credit cards with what feels like relative ease. And yet, when data breaches happen the individual is left to pay the inevitable costs as they try to restore their credit, deal with financial fallout, or recover embarrassing secrets.
There’s a link from Strava’s disclosure of military secrets to revenge porn, and it runs through the internet and its ability to make getting information easier than ever. And it relies on our increasing ability to digitize anything from our running routes to our photos.
We’re intellectually aware of all this, but whenever it comes time to do something about it, we throw up our collective hands and keep snapping our naked pics. There are few existing weapons to solve this problem, so let’s take a look at what they are and where they fall short.
Opt-ins and transparency: Many of our apps and devices come with a variety of privacy settings that can range from simple — share or do not share — to byzantine. Strava’s were apparently byzantine, which didn’t help folks who wanted to stay off the heat map. But good privacy settings can only go so far. They don’t stop hackers from accessing data, and they also don’t stop toxic combinations of data.
Differential Privacy: Apple made this privacy concept famous. Essentially, all collected data is anonymized and injected with random noise, making it hard to recombine and determine to whom the data refers. This is good for individuals, but it requires technical overhead, and the company has to do it correctly. Apple’s talked a good game, but researchers looking at its implementation say it left a lot to be desired. The other challenge is that you can still glean a lot of information from anonymized data. Note that none of the Strava users were individually identified, yet the heat map still disclosed sensitive locations.
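The noise-injection idea can be illustrated with randomized response, one of the oldest and simplest differentially private mechanisms. This is an illustrative sketch, not Apple's actual implementation; the probability and population numbers are my own choices.

```python
import random

def randomized_response(truth, p=0.75):
    """Report the truth with probability p; otherwise flip a fair coin.
    Any single answer is deniable, yet the aggregate remains recoverable."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_rate(responses, p=0.75):
    """Invert observed = p * true + (1 - p) * 0.5 to estimate the true rate."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p) * 0.5) / p

random.seed(0)
# 10,000 simulated users, 30% of whom have some sensitive attribute.
truths = [i < 3000 for i in range(10_000)]
noisy = [randomized_response(t) for t in truths]
print(round(estimate_rate(noisy), 2))  # close to 0.30
```

No individual's noisy answer reveals their true one, but the aggregate statistic survives — which is the trade-off differential privacy formalizes, and why getting the noise calibration right matters so much.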
Collect only what you need: This idea is simple. If you are making a device or app, don’t collect more data than you need. For example, the Skybell doorbell doesn’t keep a user’s Wi-Fi credentials after getting set up on the network because it’s not information the company needs. Most other connected devices don’t share that view, however, which led to LIFX bulbs leaking a bunch of Wi-Fi credentials a few years back. Whoops.
This is a tough issue because in many cases companies collect all this extra data in case they might need it someday. And thanks to improvements in machine learning, they may not be wrong. Applying machine learning to random data sets can yield new insights that could improve the service.
Regulations: All of the above are voluntary things that companies can do as a step toward protecting user privacy, or letting users have more say in how their data is used. But the strongest tools to protect privacy will come from regulatory pressure. This year, the world is about to get a massive amount of regulatory pressure in the form of the General Data Protection Regulation. This regulation was passed by the EU in 2016 and goes into effect in May. It acts as a safeguard for data. It enshrines some of the above items, such as needing a reason to collect a piece of data and providing transparency, but it also goes a lot further.
For example, it allows an individual to ask what a company knows about them, forces the company to correct wrong information, and requires the company to dump the user’s data upon request. It also prohibits profiling on the basis of data. These are only some of the regulation’s provisions, but in my conversation with Rustici, it became clear that the GDPR is so forward-looking that from a technical standpoint, we don’t have ways to actually implement some of these provisions yet.
For example, the ability to retract your permission to use data sounds good, but once that data is sold to a third party or combined to create new insights, how can that data be controlled? How can the new knowledge go away?
So while privacy is a huge challenge and one that we’re still wrapping our arms around, we also need to build tools to track each piece of data about us. Maybe even each piece of metadata. Then we need ways to claw that data back. All of this has to be scalable, which leads me to look to something like the blockchain as a way to track data.
We also need to develop a far more sophisticated understanding of what is known about us and how that knowledge can be applied. Which means that companies creating fun blog posts or heat maps based on a wide array of anonymized data should carefully consider how that information could be used.
We keep saying that data is the new oil, but oil is not a wholly harmless substance. We need to accept that data isn’t, either.
I consider myself a high-functioning lazy person. I do my laundry regularly, but leave clean clothes in a pile on the floor. I make it to work on time, but have to set my alarm for an hour earlier than I’d like because I hit the snooze button so many times. I will wear a blazer to my business casual office, but only to cover up my terribly wrinkled shirts… which I pick up off my bedroom floor each morning.

At the Internet Society, I work primarily on topics related to security and privacy. Through my work, I have the pleasure of learning about new vulnerabilities or computer viruses, how different apps and devices can or already are spying on me and selling my data, and all other manner of scary online threats. As you can imagine, I’ve become increasingly paranoid about my online privacy.

Yet, when it comes to online privacy, lazy and paranoid is a terrible combination. I know what I should be doing to better protect my online privacy. I know I should update my devices regularly. I know I should be using two factor authentication when it’s available. But, like the clothes I know I should be folding, I never take the time to do so.

So, this Data Privacy Day, I’m making a change. When it comes to online privacy, for too long I’ve just been lazy and paranoid. Now, it’s time for me to become the paranoid, high-functioning lazy person I know I can be. Like overdressing to hide a wrinkled shirt, it’s time to take my laziness and turn it into a strength. Here are some actions I’m taking on Data Privacy Day to improve my online privacy. All of them are easy, and a few don’t even require follow-up.
Learn how to “shop smart” for connected devices. You don’t want to have to return a connected device because it is spying on you. Returning things is a pain. Learn how to “shop smart” and buy privacy-respecting connected devices so you won’t have to.* My post on shopping for connected toys and Mozilla’s guide to shopping for connected gifts are both great places to start.
Update your devices and their applications. If a device or app has an auto-update feature, turn it on! Are you really going to want to take the time to update it later? Often this is as easy as a couple of clicks. And don’t forget to update the less obvious devices. Anything that’s Internet connected, from your light bulbs to your thermostat, should be updated.
Turn on strong encryption. Some devices and services have the capability to use encryption, but don’t turn it on by default. This is like owning a safe, but leaving it unlocked. Take a few minutes to see if your devices or services are already using encryption or if you need to turn it on.
Review the permissions on your mobile device. No flashlight app ever needs to track your location or your calendar. So, don’t let them! Seriously, do this, it takes less than five minutes. Review your permissions settings and turn off the permissions for apps to gather more data than you’d like.
Review the privacy settings on your social media and store accounts. You may be sharing a lot more than you intended through your social media and store accounts. Review your privacy settings to determine who can see what you write, the pictures you post, or your other activity on the platform. Ask yourself: who do I want to see this sort of information, and who do I not want to see it? When possible, avoid linking your social media accounts with other third-party services. Your social media platform does not need to know what music you listen to, so don’t tie your music streaming service to your social media account!
Boost the privacy protections on your favorite browser. There are lots of great browser extensions or plug-ins that can increase your privacy when browsing the web. One browser plugin, HTTPS Everywhere, will ensure that if a website offers an encrypted HTTPS connection, your browser uses it. Others, like Ghostery and Privacy Badger, will block tracking cookies or web beacons that companies use to track your browsing habits. Installing privacy-protecting browser plugins is a quick and easy way to improve your privacy.
Stop reusing passwords. It is tempting to reuse a password for multiple devices or services. How are you supposed to remember different passwords for everything? But while a reused password may be easier for you to remember, if it is hacked or stolen, it also makes it easier for criminals to gain access to your other devices or services. Take a few minutes to get a secure password manager and learn how to use it, or, for home devices, write down your passwords in a securely stored notebook.
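If you do go the password-manager route, its core trick is simple: one unique, random password per service. A minimal sketch using Python's standard secrets module (the 16-character length and full alphabet are my own choices, not any standard):

```python
import secrets
import string

def make_password(length=16):
    """Draw each character from a cryptographically secure random source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One distinct password per service, so a breach at one site
# can't be replayed against the others.
passwords = {site: make_password() for site in ("bank", "email", "social")}
```

The point of `secrets` over `random` is that its output is unpredictable even to someone who has seen previous outputs, which is exactly the property a password needs.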
Turn on two factor authentication (2FA) for your applications and services. Okay, this one is a bit of a stretch goal, but hear me out. When you think of 2FA, think of something you know (e.g., a password) and something you have (e.g., a security token). 2FA means that someone with only your username and password can’t log in as you, and that’s really important, because companies lose databases of their users’ passwords all the time. The Two Factor Auth site will walk you through how to set it up for almost every website that supports it. Banks, social media, everything.
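The "something you have" in most authenticator apps is a shared secret used to mint time-based one-time passwords. A sketch of the underlying algorithm (HOTP from RFC 4226, extended to TOTP per RFC 6238) using only Python's standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(secret_b32, counter, digits=6):
    """HMAC-based one-time password (RFC 4226): HMAC-SHA-1, dynamic truncation."""
    key = base64.b32decode(secret_b32)
    msg = struct.pack(">Q", counter)             # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                   # low nibble picks the offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32, period=30, at=None):
    """Time-based one-time password (RFC 6238): HOTP over a 30-second counter."""
    counter = int((time.time() if at is None else at) // period)
    return hotp(secret_b32, counter)
```

Your phone and the server both hold the secret, so both can compute the same six-digit code for the current 30-second window, while an attacker with only your stolen password cannot.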
This Data Privacy Day, let’s take action to better protect our privacy online. We might not clean our rooms and dust our furniture, fold our clothes, or wake up on the first alarm, but we’d rather not have our devices show the world just how lazy we can be.
*Strong security and privacy takes time and effort, and device manufacturers can be lazy too. So sometimes, there isn’t going to be a privacy respecting option. At the Internet Society we’re working hard to make it easier for device manufacturers to do the right thing when it comes to security and privacy. The OTA IoT Trust Framework provides manufacturers and others with a simple risk assessment guide for connected devices and systems.
Last year, I was invited to contribute a paper to a special edition of the Health and Technology Journal published by Springer/Nature. The special issue addressed privacy and security, with a particular focus on healthcare and medical data. I’m happy to announce that now, for four weeks only, the publishers have made the whole issue available free.
The paper, “Trust and ethical data handling in the healthcare context,” examines the issues associated with healthcare data in terms of ethics, privacy, and trust, and makes recommendations about what we, as individuals, should ask for and expect from the organisations we entrust with our most sensitive personal data.
Although we can find several comprehensive and mature data protection frameworks around the world, current legal safeguards do not seem to prevent data controllers from indulging in:
insufficient care of personal data
unexpected or unwelcome use
In my paper, I argue that a narrow focus on regulatory compliance can lead to a “checklist” mentality, obscure the real reasons why organisations should treat data with care and respect, and lead to poor outcomes for both the organisation and the individual. I suggest that we should be encouraging organisations to develop a more collaborative approach, in which data subjects’ interests are better respected, and organisations find that, as a consequence, their risks are lowered and their reputations enhanced.
This also dovetails with the Online Trust Alliance’s new Cyber Incidents & Breach Trends Report that recommends, in part: “By establishing a culture of stewardship (vs just compliance) and implementing policies that take a proactive approach to proper handling and safeguarding of data, organizations can minimize exposure to the cyber incident tsunami and actually thrive by building and maintaining trust with their customers.”
I didn’t know it at the time, but I had some illustrious co-contributors to this special issue, including:
Giovanni Buttarelli, European Data Protection Supervisor and former Secretary General of the Italian Data Protection Authority
Ann Cavoukian, former Ontario Privacy Commissioner, and the architect of the “Privacy by Design” concept
Luca Belli, leader of the Internet Governance project at the FGV Law School, Rio de Janeiro
Julia Powles, tech law researcher at Cambridge University
… and many others.
If I’d been aware of the lineup, I doubt I’d have had the nerve to put pen to paper. So, whether or not you read my piece, do seize the opportunity to learn from these experienced practitioners and thought leaders on data protection and privacy. Here’s where to find the journal: https://link.springer.com/journal/12553/7/4/page/1
Data Privacy Day, an international effort to create awareness about the importance of respecting privacy, safeguarding data and enabling trust, is this Sunday, 28 January. Now is a great time to read these articles and reports and take a look at your own data privacy and protection practices.