Saturday, 21 April 2018

Data Privacy Is a Human Right. Europe Is Moving Toward Recognizing That.

Published on April 20, 2018, by Common Dreams

Mark Zuckerberg played dumb when Congress pressed him on privacy protections. But he should know better — the EU is already forcing his hand.


"The UN’s top human rights office concluded years ago that in order to respect the right to privacy, governments should regulate how private companies — not just police and spy agencies — treat personal data." (Photo: TY Lim / Shutterstock )
"The UN’s top human rights office concluded years ago that in order to respect the right to privacy, governments should regulate how private companies — not just police and spy agencies — treat personal data." (Photo: TY Lim / Shutterstock )
During Mark Zuckerberg’s testimony before the House of Representatives, Representative Anna Eshoo (D-CA) asked the Facebook CEO bluntly if he would be willing to change the company’s business model “in the interest of protecting individual privacy.”

Zuckerberg responded, “I’m not sure what that means.”

While “privacy” may sound like a fuzzy concept, it’s not at all a new idea in either human rights law or the rules that apply to Facebook in some of its largest markets. 

The company has also had to defend its practices before courts and regulatory bodies that have examined the issue — which makes Zuckerberg’s answer unsettling.


The UN’s top human rights office concluded years ago that in order to respect the right to privacy, governments should regulate how private companies — not just police and spy agencies — treat personal data. Although the human rights treaties only strictly apply to governments, there is a long-established norm that businesses should respect rights even if a government doesn’t force them to do so — and that’s as true for Facebook as for more usual suspects such as the diamond, oil, and tobacco industries. The same UN body has specifically urged web-based companies to make sure their practices don’t facilitate inappropriate government surveillance or otherwise harm human rights.

To achieve this, companies should first recognize that simply because a user has “shared” a piece of information with a platform or others doesn’t mean he or she has lost any privacy interest in it. If one looks closely at Facebook executives’ responses to the scandal surrounding data analysis firm Cambridge Analytica’s access to users’ data, one will find repeated mentions of the idea that this was data the users themselves had shared or made public.

However, as the European Court of Human Rights has recognized, data about us can still raise privacy concerns even if it isn’t something we’ve kept secret. And European Union law acknowledges even more explicitly that personal information we can’t — or shouldn’t have to — keep to ourselves, such as our race and religious beliefs, can still be sensitive and need protection by both governments and companies.

One reason this is important is that when a company gathers, analyzes, or shares data that can identify personal characteristics such as race, this can lead to discrimination — as the ongoing controversy over allegedly biased housing advertisements on Facebook shows.

Companies such as Facebook also create vast pools of personal data where intelligence agencies, police, hackers, and fraudsters could go fishing. This makes adherence to human rights principles essential for these companies, including when users have knowingly shared information about themselves.

Human rights courts have also recognized that nearly every step in the handling of personal data — from the initial gathering to use, retention, and sharing — can interfere with privacy. This means those actions should be limited to what is truly necessary and proportionate to a legitimate goal.

UN experts have further stated that if data a company holds about you is wrong, you should be able to get that data corrected or deleted — and under the European Union’s new General Data Protection Regulation (GDPR), this will be an even broader right. In many circumstances, the regulation will also require companies to obtain EU users’ specific and informed consent before gathering their data in the first place.

Given that Facebook’s handling of personal data has been the subject of major rights-based challenges in Europe as well as a 2011 settlement with the U.S. Federal Trade Commission about consumer privacy, it seems highly likely that the company is aware of these human rights principles. It simply needs to be willing to act on them.

Under the new EU regulation, Facebook will need to do this for its millions of users in the European Union by May 25. During the hearing, Zuckerberg indicated that the company would extend those new user protections “to the world.” This was encouraging, although Zuckerberg did not fully explain the details or offer a timeline.

There is no reason for anyone to be a second-class citizen when it comes to data privacy. The company should establish these protections worldwide by the May GDPR deadline, or explain why it can’t and set an expeditious timeline.

The new European requirements will not be a magic bullet, and may not provide a model for solving every problem related to protecting people’s data. But they do embrace some useful ideas about user empowerment that members of Congress were right to raise during the hearing.

They also show that strong rights protections in this area are possible. Both Facebook and Congress should make sure that possibility becomes a reality for users in the United States.
Sarah St. Vincent is CDT’s Human Rights and Surveillance Fellow. Her work focuses on ensuring that surveillance practices comply with international and regional human rights laws.

https://www.commondreams.org/views/2018/04/20/data-privacy-human-right-europe-moving-toward-recognizing
