Defining Privacy

In this post I'll attempt to give a brief but thorough introduction to privacy. This ambitious task is motivated in part by the staggering number of people I've dealt with who lack a proper understanding of the subject – whether they be politicians, laymen, or even tech professionals.

An attempted definition

At its core, privacy is about being able to control your own information (i.e. information about you).

Controlling your own information allows you to express yourself selectively, and to be able to seclude yourself or information about yourself – according to your wishes and your own will.

This entails, amongst other things, the ability to keep certain things private. Maybe something you find particularly special, sensitive or secret.

While the domain of privacy partially overlaps with security, the two are distinct fields. Relevant concepts from the security domain include the appropriate use and protection of information. A definition of privacy may include "bodily integrity" (i.e. the inviolability of the physical body), but that is a different thing from ensuring somebody's physical safety – the latter falls under the domain of security.

Though information pertaining to you belongs to you (i.e. is under your ownership), other individuals and organizations may see value in it. You may then elect to trade personal details for some benefit in return – such as a product or a service. Additionally, some information about some people is subject to public-interest rules, e.g. the dealings of politicians or other pillars of society.

Additionally, privacy – like other fundamental rights – is not necessarily absolute. If a matter of greater importance is at stake, a fundamental right (including privacy) may be encroached upon to a certain extent (though never entirely eroded – they are fundamental rights, after all), provided that the actions taken are necessary and proportionate. Things obviously get complicated at this point, but it illustrates the contextual nature of privacy to a certain extent.

Merriam-Webster (yes, I'm doing this...) defines privacy as

“The quality or state of being apart from company or observation” or “freedom from unauthorized intrusion”.

That may sum it up, though it might be a little limiting...


[Image: Lady Justice]

While most cultures recognize individuals' ability to withhold certain parts of their personal information from society, privacy as we know it today is a relatively modern construct, mainly associated with western culture.

Think personal autonomy, self-ownership, and self-determination – all virtually universally recognized as inalienable rights, springing from the historical project of establishing individual rights and values, as opposed to individuals being property of the state. Privacy protects the individual from government, majority, or other forms of power, and is an essential enabler of an individual's autonomy, dignity, and freedom of expression. In a sense, one could argue privacy is at the core of classical liberalism, and consequently a cornerstone of liberal democracy.

Privacy is thus integral to the concept of liberty – and a prerequisite of freedom. Because of this, it has been defined as a fundamental right by both nation-states (via their constitutions), as well as political unions and intergovernmental organizations (e.g. the European Convention on Human Rights and the Universal Declaration of Human Rights).

Privacy's importance

Privacy is not really something we can trade off for something else. We have an intrinsic need for privacy as human beings, which is why it is defined as a fundamental right – so that we can live dignified lives in civilized society. As such, it is a foundational principle, much like freedom of speech.

To illustrate why we really need privacy, consider this banal example: You probably want privacy in the restroom, even though you're doing nothing wrong.

"If you aren't doing anything wrong, what do you have to hide?"

Some clever answers: "If I'm not doing anything wrong, then you have no cause to watch me." "Because the government gets to define what's wrong, and they keep changing the definition." "Because you might do something wrong with my information." My problem with quips like these -- as right as they are -- is that they accept the premise that privacy is about hiding a wrong. It's not. Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect.

– Bruce Schneier

As Edward Snowden says in his book Permanent Record, “Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.”

There's also the fact that, on the level of rights, laws and government, you're effectively giving away the rights of other people (who might need them more than you do at the moment). Let's not forget those less fortunate – even in relatively liberal societies – who are stigmatized, discriminated against, threatened, or otherwise live within cultural and personal structures that pose tangible threats to their safety and wellbeing in their day-to-day lives. These people have a real, practical need for privacy to protect themselves, be it because of their ethnicity, religion, sexuality, political affiliation, disability, surrounding cultural confines, or otherwise.

And then there are special cases that need it even more – such as lawyers communicating with clients, reporters meeting sources, etc.

As we consider how we establish and protect the boundaries around the individual, and the ability of the individual to have a say in what happens to him or her, we are equally trying to decide:

  • the ethics of modern life;
  • the rules governing the conduct of commerce; and,
  • the restraints we place upon the power of the state.

– Privacy International

Privacy is what allows us to have anything for ourselves – to keep something to ourselves. This makes us able to experiment with who we are and what we think – and thereby develop our personalities.

When things go wrong

[Image: a surveillance camera]

We needn't think for long to come up with historical examples of how the undermining of privacy has enabled atrocities and violations of human rights.

The Gestapo (Nazi Germany), KGB (Soviet Union), Stasi (DDR), and the Ministry of State Security and the Ministry of Foreign Affairs (PRC) all spring to mind as ruthless, secretive organizations acting on behalf of authoritarian regimes, systematically undermining the rights of their own populations through actions ranging from indiscriminate surveillance to politically motivated internment and extermination.

At the same time, many western countries are challenging individual and international notions of privacy as they implement variations on mass surveillance (such as bulk collection of metadata) on a national scale, usually framing it as a question of security vs. freedom – a trade-off upon which Benjamin Franklin famously once opined.

Then there is, of course, the modern digital private sector and so-called surveillance capitalism. Private actors such as Facebook and Google famously hold a lot of data on a lot of people. They get some of it from people registering on their platforms, who often have to consent to these companies' use of their personal data in order to use their services. This is why some critics say "if you're not paying for the product, you are the product": a cynical interpretation is that the use (and misuse) of users' personal data is the price they pay to use the products – even though the case is never framed as such for the users.

Additionally, many of these same actors track their users – and in some cases anyone else – across other sites on the internet. They do this, for instance, by having other sites embed their social media platforms' "share" buttons or their analytics solutions, through strategic partnerships (such as obtaining purchase data from credit card companies), and by buying data from so-called data brokers – companies that trade private information and personal data with anyone willing to buy or sell, e.g. for use in so-called real-time bidding to deliver targeted advertisements.

The obviously problematic cases here include, for instance, Grindr, OkCupid and Tinder sharing things like location data, sexual orientation, drug history, and HIV status with advertising and marketing companies.

Standing in the way of progress?

Privacy is often presented as an impediment to positive outcomes. Overly stringent privacy controls allegedly lead to worse outcomes in healthcare, research and national security. But this need not be the case. In many cases, we don't even need personal information to solve a given problem – though we may find it nice to have, or want to be able to go one step further. And even in cases where we would seemingly have a technical need for personal information to solve an issue, there are plenty of mechanisms, properties and systems that give us a certain level of guarantee of privacy protection – such as k-anonymity or differential privacy.
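To make the k-anonymity property a little more concrete, here's a minimal sketch in Python. The idea is that a released dataset is k-anonymous if every combination of quasi-identifiers (attributes like ZIP code and age bracket that could re-identify someone when combined) is shared by at least k records, so each individual "hides" among at least k−1 others. The dataset, attribute names, and function below are all hypothetical, purely for illustration:

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    occurs in at least k records."""
    groups = Counter(
        tuple(record[attr] for attr in quasi_identifiers)
        for record in records
    )
    return all(count >= k for count in groups.values())

# Hypothetical, already-generalized dataset: ZIP code and age bracket
# are the quasi-identifiers; the diagnosis is the sensitive attribute.
records = [
    {"zip": "130**", "age": "20-29", "diagnosis": "flu"},
    {"zip": "130**", "age": "20-29", "diagnosis": "asthma"},
    {"zip": "148**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "148**", "age": "30-39", "diagnosis": "diabetes"},
]

print(is_k_anonymous(records, ["zip", "age"], k=2))  # True: every group has 2 records
print(is_k_anonymous(records, ["zip", "age"], k=3))  # False: no group has 3 records
```

Note that k-anonymity only checks group sizes – it says nothing about the sensitive values within a group (if everyone in a group shares the same diagnosis, the release still leaks it), which is part of why stronger notions like differential privacy exist.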

A technical sidenote: Privacy does not equal data protection, nor compliance.

In architecture and solution design, privacy must be weighed against other needs, much like other non-functional requirements (such as security and performance) – with the caveat that evaluating privacy implications means explicitly weighing other people's rights and interests against your own wants, and considering just how you might go about reaching your goal.

And this is some of what privacy by design (a systems engineering approach best practice, as required by the GDPR) entails. Its seven foundational principles are:

  1. Proactive not reactive; preventive not remedial
  2. Privacy as the default setting
  3. Privacy embedded into design
  4. Full functionality – positive-sum, not zero-sum
  5. End-to-end security – full lifecycle protection
  6. Visibility and transparency – keep it open
  7. Respect for user privacy – keep it user-centric

As for how we can achieve this in practice in software development... Maybe privacy engineers have something to learn from the DevSecOps movement and the security field here, and can become enablers rather than naysayers – answering "Yes, and here's how" rather than "No" when someone asks whether a certain thing can be done. Privacy would then need to be represented at every step of the software development lifecycle.

There is no doubt that we have a need for privacy, i.e. there is a "lower boundary", or "floor", that defines the minimum amount necessary; we're now trying to define the "upper boundary", or "ceiling", of which infractions are acceptable and in which cases.

For a little more on privacy engineering, check out this article.
