We see elaborate privacy policies, bills, laws and rights in place. Yet, it is difficult to build a privacy-aware system that is right for everyone, given that the concept of privacy is highly subjective and open to interpretation.
Before we dig deeper, let us take a look at a few manifestations of privacy in the domain of data transmission. Consider this:
When we want to share information with others using technology, we depend on that technology to ensure the information is delivered only to the intended recipient(s) and not shared, intentionally or unintentionally, with other non-intended recipients.
If we ask under which conditions we can call such a system “privacy-aware”, there can be multiple interpretations. Some people would say that as long as their message reaches its recipients, they do not care about anything else. Let us keep this set of people aside for a while and consider the other possibilities. For instance, some others would say that as long as no one else (governments, corporations, parents, spouses, Sauron) sees the message and it reaches its recipients, it should be alright. By those terms, when the information is delivered only to the intended recipients, we often call it a “privacy-aware” system. When the information is shared with non-intended recipients, we term it a “privacy breach”. Some others would argue that this is not good enough: if the system stores the data they sent through it, it violates privacy, irrespective of whether the system makes use of that stored data now or in the future. An edge group would further add that if such a system stores data beyond the duration of the actual transmission, that alone is a violation of privacy, regardless of intention.
That is a lot of opinions to take in, and it does not even represent the full spectrum of opinions regarding privacy in one domain. We see three patterns emerging out of this example:
- Interpretation of privacy is personal and subjective; one size does not fit all.
- Many privacy-related issues can get intertwined once you dig deeper; as when data usage became a concern within data transmission in the scenario above.
- People often trade privacy for comfort to varying degrees, depending on the facilities offered.
I’ll leave the solution for a later exercise. For now, let us focus on the issues at hand and try to understand them in detail.
When does privacy manifest?
That instinctive uncomfortable feeling we get when someone watches us behind our back perhaps ties back to our survival traits. In an environment where we don’t have to worry about a predator pouncing on us, our instincts evolve to figure out who may be watching us, tracking our moves. The demand for privacy arises as our base reaction to protect information about our personal state from such external entities. That is, we like to be able to define the terms on which we would prefer to remain unobserved by others. This can be for anything — from trying to focus on a problem at hand, to having secret conversations, to going on a trip without someone tracking us. The ask changes from “Don’t eat me” to “Leave me alone”. We still remain wary of the predator; except the nature of the predator changes and becomes increasingly harder to uncover.
In technology, the manifestation of privacy appears whenever there is a component of user data involved. We can loosely describe “user data” as any information either actively produced by a user or passively generated by the actions and interactions of the user. User data can be either anonymous or identifiable. Some of these manifestations are apparent and easy to pinpoint; others are veiled and seem harmless.
For example, if someone gets unauthorized access to our mobile phones and secretly listens to our conversations, it is easily pinpointed as a privacy issue. Yet, when we are under surveillance in public places, it tends to seem alright. In either case, these systems can access information about us in the form of audio conversations or audio/video feeds showing our actions. Both of these systems access what I described above as “user data”. Hence, both should raise privacy concerns. But, we seem to be less perturbed about public surveillance systems. Why? Maybe because we don’t perceive such observation as a threat. Some of us don’t mind being under surveillance, some of us are uncomfortable about it and some of us get downright upset about it.
The point to note here is that even when user data is involved and crucial privacy issues arise, the perception of how much privacy is good privacy remains very much subjective.
Why is privacy optional today?
The nature of the technology we see around us today almost always poses a trade-off between sharing user data and the facilities offered. You cannot have your intelligent calendar auto-record travel dates as calendar events without it reading and understanding your emails. You cannot browse an item on an e-commerce website without every other website showing you targeted advertisements urging you to buy similar products from them.
This is not to say today’s technology is “evil”. It simply means that it offers an exchange. These products and services need human effort and computing cycles to keep them up and running, so they need money to build and operate. Plus, many of them need surplus money to pay dividends to their shareholders. Where does all this money come from? Either the user pays directly for the services received, or the user pays indirectly by letting the service sell their user data. The first kind tends to respect user data. The second kind, where the users do not pay immediately out of their pockets, becomes the rabbit hole of privacy leaks. You never know where your data will end up once it has been packaged and sold by one party.
This makes privacy an optional feature in the technology we have today. The more privacy you seek, the fewer the services that match your requirements. You either give up your personal data or opt out of those systems. Is this a good enough model? Certainly not. But, trying to fix it will require a good amount of forethought and is a big topic in itself, outside the scope of the current discussion.
Does security guarantee privacy?
The term “security” in the context of technology roughly expands to the concept of disallowing unauthorized access. Encryption is the first stepping stone in that direction. A lot of nuances of security implementations follow. I like to see security as a fundamental requirement of privacy. But security alone does not guarantee privacy.
Consider this: A social networking system can have industry-leading encryption, multi-factor authentication, and militarized data centres to serve from, all to ensure that only its authorized members can get through and that no member can access or modify another member’s personal information. This does not guarantee that the data each member generates will remain within the system. There may be terms stating that user data will be shared with third parties as long as a user continues to be a part of the system. So, even with the highest grade of security, you cannot control where your user data ends up from a system like this.
On the other hand, consider a system that does not store user data, does not share or sell user data to third parties, does not track a user through other systems, but uses no encryption and makes all data exchanges over open networks in plain text. Any agency looking to gather user data can simply snoop on such a system and copy all the data it wants. Here, even though the system tried to be privacy-aware, the lack of security mechanisms prevented it from effectively enforcing its goals.
So, a system that wants to respect privacy must have strong security mechanisms. But, it does not mean that any system with strong security mechanisms is built with user privacy in mind.
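To make this contrast concrete, here is a toy Python sketch. Everything in it is hypothetical: the class names, the stand-in XOR “cipher” (a placeholder for real encryption such as TLS, never something to use in practice), and the third-party list exist only for illustration. One service encrypts every message yet shares user data onward by design; the other shares nothing but transmits in the clear:

```python
def toy_encrypt(plaintext: str, key: int = 42) -> str:
    """Stand-in for real transport encryption; illustrative only."""
    return "".join(chr(ord(c) ^ key) for c in plaintext)

class SecureButLeaky:
    """Strong security, weak privacy: encrypts in transit,
    then forwards a plain copy to third parties, per its terms."""
    def __init__(self, third_parties):
        self.third_parties = third_parties

    def send(self, message: str, recipient: list) -> None:
        recipient.append(toy_encrypt(message))   # secure delivery
        for party in self.third_parties:
            party.append(message)                # privacy breach by design

class PrivateButInsecure:
    """Weak security, good intentions: stores and shares nothing,
    but transmits in plain text over an open network."""
    def send(self, message: str, recipient: list, eavesdropper=None) -> None:
        recipient.append(message)                # plain text on the wire
        if eavesdropper is not None:
            eavesdropper.append(message)         # anyone snooping gets a copy
```

Run both and the asymmetry shows: the first leaks despite perfect transport encryption, the second leaks despite perfect intentions.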
Blurring the boundaries: Privacy in a connected world
Now, let us consider a scenario where all the issues with privacy identified so far become all the more serious: a collective intelligence.
A collective intelligence forms through constant collaboration and coordination between different entities. It is built on the principles of openness, peering (instead of hierarchies), sharing and removing regional boundaries. All of these require members of the collective to share what they know, in real-time, to make conscious decisions as a group and to improve the collective itself. It is a constant flow of user data, a constant flux of state, as it progresses through individual contributions. When seen as a whole, a collective intelligence can undermine individuality in favour of the larger goal.
Given the internet we have today, creating a collective intelligence will be a matter of building better real-time data sharing systems and making the internet a more seamless part of our physical existence. It is not a distant future. Can we still ensure there is enough room for privacy in a system like that? Enough room to assert one’s individuality?
Theoretically, the simplest implementation of privacy in a collective intelligence is a switch that lets you isolate thoughts that you do not wish to share with the collective. This can be a part of a thought-driven interface or even a manual hardware switch. Think of it like an active-state toggle. Without such a device, there would be little room to retain individuality. Without individuality, you lose the ability to think independently. Without independent thinking, you lose the capability to make progress at an individual level. And a system that does not make progress, however advanced it may be, eventually regresses.
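The toggle described above can be sketched as a tiny state machine. This is purely a thought experiment in code: the `Mind` class, the shared pool, and the `privacy_switch` context manager are all hypothetical names invented for this sketch. Thoughts flow to the collective by default; engaging the switch keeps them local for the duration:

```python
from contextlib import contextmanager

class Mind:
    """Hypothetical member of a collective: thoughts are shared
    by default, unless the privacy switch is engaged."""
    def __init__(self, collective: list):
        self.collective = collective   # pool of thoughts shared by all members
        self.private = []              # thoughts kept to oneself
        self._isolated = False

    def think(self, thought: str) -> None:
        if self._isolated:
            self.private.append(thought)
        else:
            self.collective.append(thought)

    @contextmanager
    def privacy_switch(self):
        """Isolate thoughts for the duration of the block, then reconnect."""
        self._isolated = True
        try:
            yield self
        finally:
            self._isolated = False
```

The design choice worth noting is that the switch is scoped and reversible: isolation is an active state you enter and leave, not a permanent exit from the collective.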
For a world that is getting more connected each day, it is very important to implement privacy-conscious traits as part of its basic routine, as a core part of the technology that is shaping this connected state. Without such measures, present and future innovations may not make full use of the analytical aspect of the human mind. We need open yet private spaces, and technology can be the enabler.
Image credit: Open Grid Scheduler / Grid Engine on Flickr.