Wednesday, April 30, 2008

Privacy: Whose Problem Is It?

Last week, I was interviewed by a professor who was researching the 'economics of privacy'. He wanted to know why organizations invest in privacy protection. Are there economic drivers for establishing privacy policies? Or is privacy just a 'Lawyers' Paradise'?

It is an interesting question. In her PhD thesis, Professor Overkleeft, a Dutch law professor, stated that "privacy is a mandarin science" (G. Overkleeft-Verburg, De Wet Persoonsregistraties; Norm, Toepassing en Evaluatie, 1995). What she meant by that remark is that only a handful of people (lawyers, commissioners, academics) are privy to the details of privacy theory and privacy practice. For everybody else, privacy is just a vague feeling, a buzzword that is often misunderstood and sometimes misused or inappropriately applied. More than ten years later, it is a safe bet that privacy is still a mandarin science... at least in Europe.

In Europe, privacy is considered a fundamental right, laid down accordingly in the EU Charter of Fundamental Rights. But Europe's fundamental problem is its inability to turn the fundamental right of privacy into a people's issue. Europe claims that it has the best privacy laws in the world. Or at least, the eurocrats in Brussels and the Data Protection Authorities in Europe think so. This belief is even enshrined in Article 25 of Directive 95/46. This "Digital Fortress Europe" rule prohibits data export to countries that do not have an "adequate level of protection". Europe fiercely protects the personal data of its citizens, both in Europe and abroad. But in Europe's streets nobody knows those laws, and worse... hardly anybody cares. The bureaucrats have reduced privacy protection to a legal compliance problem. For the average man, it is hardly an issue. But in the Information Age, privacy should be a matter of concern to everybody, not only to a "happy few" such as the data protection authorities, the lawyers, and a handful of privacy-rights activists. Privacy should not be driven by law and supervisory authorities, but by the internal values of organizations and governments and the need to earn the trust of their stakeholders. Privacy should be something that people care about.

Take The Netherlands, for instance. There are no privacy rights groups anymore. Bits of Freedom, a small Dutch digital rights movement, decided to liquidate itself a few years ago. Even worse, the "Dutch People" got the Big Brother Award 2007 because of their lack of interest in privacy. The average Dutchman likes to say that he has "nothing to hide (except his PIN code)". On the other hand, there are plenty of law firms around that have some sort of privacy practice. And an increasing number of organizations, companies as well as government agencies, are appointing privacy officers or building some other form of in-house privacy expertise.

So, the professor's question seems to be a valid one. WHY do organizations invest in privacy if their stakeholders don't seem to care much about it? The answer is probably a complex one. Some organizations seem to have a privacy program because they want to be seen as acting responsibly towards society. They invest in privacy because of their internal values. For others, it is an issue that they think may give them a competitive edge in the market. They invest in privacy to win, but are probably willing to drop it if it doesn't pay off. And for many others it is just a matter of compliance with a set of very difficult and vague laws, and they do their best to be compliant. They invest in privacy because of risk avoidance and legal compliance.
However, I am also afraid that for most organizations it is just another law with which they, consciously or unconsciously, do not comply. They don't invest in privacy simply because their stakeholders, internal or external, don't care about it. So why bother?

So what did I tell the professor in the end? I told him that as long as we don't have good privacy metrics that enable us to show how much organizations lose if they don't protect the privacy of their customers and employees (whether in actual cost, in opportunity cost, or in lost sales because of lack of trust), the primary reason for investing in privacy will remain legal compliance and avoidance of legal risk. As long as people remain indifferent about their privacy, there will be no incentive for organizations to invest in stronger privacy protection. Thus, privacy will remain the problem of the legal department: it is seen as a cost rather than a benefit.

When we think of how to shape Privacy 2.0 in the 21st century, we also have to figure out how to make privacy a people's issue.

Wednesday, April 16, 2008

Rethinking Privacy 1.0

Ever since the OECD published its Privacy Principles in 1980, privacy laws around the world have focused primarily on protecting personal data in databases. And all those years, the fundamentals of privacy have remained the same. The European Data Protection Directive 95/46, which allows personal data to move across borders in Europe, came into force 15 years after the OECD Privacy Principles on which it is based were published, but meanwhile the world had already changed: the Internet had arrived! Only eight years after the Directive was introduced, on November 6, 2003, the European Court of Justice, Europe's highest court, was confronted with this change in only its second case ever under the Data Protection Directive: Lindqvist vs. Sweden. This case demonstrated the built-in weakness of the European system of data protection: despite the EU's claim to the contrary, the Directive proved to be not very technology-neutral, and therefore not very future-proof. The various actors in the Lindqvist case, such as the Advocate-General and the Member States, did their best to get a legal grip on the facts while trying to preserve the essential elements of the Directive, but eventually the Court reached a surprising and far-reaching conclusion: posting personal data on the Internet, despite the fact that everybody in the world with an internet connection could potentially read the information, does NOT violate one of the key elements of the Directive: the international data transfer rules...!

A second example of why today's law is not technology-neutral is Directive 2002/58, a.k.a. the e-Privacy Directive, which specifically addresses privacy in electronic communications. The very first version of this Directive was published in 1997, but by 2002 it had already received a complete overhaul. And only six years later, the EU is putting Directive 2002/58 up for discussion again, as it tries, among other things, to address the privacy concerns around Radio Frequency Identification (RFID) technology. So what's next? Bluetooth? GPS? WiMAX? Ubiquitous Computing? Body Area Networks?

In the meantime, governments are trying to broaden their powers to collect information about their citizens and non-citizens in order to prevent terrorism and combat crime. This creates a disconnect between the private sector and the public sector, and gives the public a false impression. Strict privacy rules for the private sector (where the privacy risk is relatively low) versus weak privacy rules for the government give the impression that the private sector cannot be trusted. Which is strange, considering that getting and keeping customer trust is a basic element of doing business in the private sector. Screw your customers and you are out of business in no time. Weak privacy rules in the public sector, on the other hand, are especially damaging when inaccurate or incomplete information is rapidly shared between government agencies or used out of context. Yet such weak rules give the citizen the false impression that governments have these risks under control. George Orwell's "Big Brother" state may not have arrived yet, but "Little Sister" is already here, and she brought her whole family...!

We need to rethink privacy in the 21st century!
The world has changed since the OECD introduced its Privacy Principles in 1980. What does privacy mean for us if at the same time we want no terrorism, less crime, better and personalized services, and more convenience? How do we protect privacy in a world that becomes ever more globalized, so that our data end up in data systems on the other side of the world? What does privacy mean for people who come from different cultures and backgrounds? How do we protect our privacy if computers, sensors and communication devices become invisible and ubiquitous? How can we build trust into the technologies that we use? How do we make ourselves feel protected against the risk of identity theft and malicious attacks on our private life? And how do we protect the privacy of people who are vulnerable, such as the elderly, minors and the mentally handicapped, in an inclusive Information Society?

Unlike some other people, I am not saying that privacy is dead, or that it is an illusion in the Information Age in which we live. No, I am saying that we have to go back to the privacy drawing board, re-define the privacy principles for the 21st century, and come up with a new set of privacy principles that fit the new realities of our global society and are robust enough to survive technological and social change. Principles that enhance trust with consumers and citizens, stimulate innovation and societal development, and protect democratic principles and the rule of law. What we need is Privacy 2.0!

All this and more is the main topic of this blog. I welcome you to comment on my thoughts, so we can get a global discussion started on how to protect privacy in the 30 years to come.