Rahul Matthan is a partner at law firm Trilegal. He recently published a discussion paper at Takshashila Institution, a Bengaluru-based think tank, offering a new model for data protection that’s based on rights and accountability.
With this paper and the recent Supreme Court ruling on the right to privacy as the context, Founding Fuel invited a host of well-informed people for a discussion on the various nuances of privacy in the age of Big Data.
Here are edited excerpts from the conversation hosted on the Founding Fuel Slack community on September 1. For the full transcript and to participate in the debate, sign in at https://bit.ly/joinprivacy.
Executive summary of Matthan’s discussion paper
Data protection today is anchored in consent (the consent model). Once a data subject’s consent is obtained, a data controller is free to collect, process and use such data for the specified purpose and will not be liable for any consequences that might result from its actions. This places the onus on the individual to understand the terms of data access to which he is consenting, and it clearly benefits data controllers more than data subjects.
This is inadequate in the interconnected, data-reliant world of today. Given that India will be working on a formal data protection law in the near future, it is imperative that it rely on an alternative to the consent model in order to protect the interests of data subjects.
We believe that a rights-based model (the rights model) will better secure the interests of a data subject sharing his data with data controllers. The rights model assures every individual an inalienable right over his personal data. Any data controller that wishes to access a data subject's personal data must do so in a manner that does not violate this inherent data right. This discussion document sets out the contours of such a model as a substitute for the consent model. The rights model has the following features:
- It assures a set of data rights that are available to everyone.
- It shifts the burden of evaluating the privacy risk to personal data away from the data subject and onto the data controller, forcing the data controller to be mindful of its processes for data collection, processing, transfer and storage. The model applies equally to the State when it collects or processes personal data.
You can see the discussion paper here.
Legal framework for a newly data-rich nation
charles: A question playing on my mind is that all the precedents we have are of societies following a certain trajectory—first become economically prosperous, after which all the other so-called intangibles come into play. What we are witnessing now is an “economically poor” nation migrating to being a “data-rich” nation. This is unprecedented in history. It is widely argued that data is the new oil. What is the new metric by which this “oil” can be measured in economic terms? People may simply be unaware of what they are trading away. What kind of legal frameworks can be put in place when leaps like these happen, for those who do not understand the implications of such leaps?
Since technology changes so fast, we might need to regulate in small but regular intervals
rahul: I think it is hard to predict what the harms or benefits can be. It is true that this is unprecedented but this is not the first time that humanity has had to deal with transformation like this. The Industrial Age transformed production and at the time was seen to be incredibly beneficial, but it also resulted in huge transformations in the job market. They say we are about to witness something similar due to the growth of artificial intelligence (AI).
I don’t think regulations can ever be imagined in advance. Instead, they need to assess the imbalance that is being created by a new technological disruption and then intervene to right that imbalance. And since technology changes so fast, we might need to regulate in small but regular intervals.
charles: Historically though, has the nature of the discourse been as fragmented and ferocious as it is now?
rahul: We are in an age of incredible closeness. Technology has made it such that we can watch what’s happening in Houston [in the wake of Hurricane Harvey] (down to Melania’s high heels) just as Houston can see what’s happening in Mumbai [in the wake of flooding following heavy rains on August 29, 2017]. So the discussion is much more ferocious than it has ever been.
As a result, the legal system has less opportunity to get things wrong. In the past you could frame a law and wait for decades to see if you had made a mistake. Today, you get a hundred opinions before you even float the consultation paper, and immediate feedback once you do. It is actually very hard to regulate in the traditional way in this sort of environment.
Because, let’s face it, there is no right answer to these questions. Just suggested solutions.
charles: As a corollary to that question, do any precedents from history come to your mind when legal frameworks have had to wrestle with huge shifts of this kind?
rahul: The historical example I always use is the impact of the Industrial Revolution on the job market. I wrote a piece about it a couple of weeks ago, reflecting on the impact of AI on jobs today.
There will be a period of instability as we re-adjust to the new realities, but as has happened in the past, water eventually finds its own level and we will have new regulations that apply to new circumstances.
The only lens we need to apply is an assessment of the harm that is being caused as a result of being data rich. With every form of wealth those who have it can use it to the detriment of those who do not. Society—and the law—needs to step in to correct that imbalance.
Can consent have a time duration?
msharmas: 1. Can consent have a time duration depending on context? For instance, [giving consent to a] website in the form of a cookie, [or to a] bank or mobile company till the time I am their customer, [or to] online services like Gmail and Amazon, [up till the time] I have an account with them.
2. How can we try and reset the privacy and Aadhaar debate? How can we quantify for the audience the amount of current data we have already given away in form of “consent” in the past—for example financial data (CIBIL), or via services like Gmail and Facebook—which is much more than what Aadhaar data contains!
On the Aadhaar vs. privacy debate, it is hard to reset it given the time that has passed and all that has been done
rahul: 1. On the time duration on consent: most laws around the world require that data should be collected for a purpose and only retained for as long as it is required to achieve that purpose. So technically, the law imposes such a time limit.
2. On the Aadhaar vs. privacy debate, I think it is hard to reset it given the time that has passed and all that has been done. I do think the Right to Privacy judgment has done one sort of a reset and now we will need to see what the other cases hold. I think comparisons to other forms of data collection may not work given that this is the government collecting data—which is always viewed with more suspicion.
Right to privacy by common law vs by the Constitution
ramnath: What is the difference between having the right to privacy by common law, and by the Constitution? If I can go to the court to claim my right to privacy in any case, what did the Supreme Court judgment really change?
The Fundamental Right to Privacy is a right that we have against actions by the state that violate our personal privacy
rahul: The Fundamental Right to Privacy is a right that we have against actions by the state that violate our personal privacy. It does not apply to private parties (except in some very exceptional circumstances where the party is so large as to take on the characteristics of a public service—but we do not need to get into that).
To question a violation of privacy by a private party we need to have a privacy law that stipulates the obligations of each of us in relation to each other and the manner in which we deal with the personal information of our clients and business partners. We currently do not have a privacy law—just the Privacy Rules.
Can private entities like Google restrict access if I don’t consent to share my data?
tanujb: Now that the Right to Privacy is a fundamental right, it will uncover the relationships between data and services that we’ve taken for granted. What does the recent judgment mean for these relationships?
Specifically, can a Google refuse to show me search results if I don’t consent to it knowing about me? Conversely, can LinkedIn continue to keep profile data behind a walled garden that requires logging in to access it? (There was a recent US case on this.) And if they do so, can I demand that they not condition my access on whether I share my data?
rahul: The Right to Privacy judgment was a decision that clarified that we all have a fundamental right to privacy against state action. So, now it is clear that the government cannot violate my privacy without a law that demonstrates a State need and in a manner which is proportional to the need that is sought to be achieved.
This does not apply to private corporations. However, in the judgment, the Supreme Court recognised that the government has appointed the J. Srikrishna Committee to look into the issue and has instructed the government to enact a law on privacy that will apply to private parties. Once that happens—and depending on what the law contains—I will have a better answer to your question.
Will knowing who is accessing my data change the dynamics?
prsahu: I believe that everybody has a concern for privacy mainly because they don’t want anyone else to know certain facts about them which could be used against them and make them vulnerable to exploitation. Do you think having knowledge about who is accessing the information about you will change the dynamics? I believe that making some personal data available can lead to many services that would benefit the whole of society in the long term, preferably with the data being available only in machine-readable form or through the use of homomorphic encryption techniques.
rahul: Let me try and address the issues you raise separately.
I don’t necessarily think knowing WHO is accessing the data is going to set our mind at rest. It might to some extent, but in my experience most people tend to worry about the fact that the data is being accessed at all.
I agree that there is a lot of benefit that can come from judicious use of data. Particularly when it is used in a responsible manner without causing harm. And this is why I have based the argument in the discussion paper on a finding of harm to address the privacy concerns.
Certainly one of the techniques we should encourage is the use of homomorphic encryption since that allows us to use the database for whatever analytical purpose we might devise without accessing the personal information.
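The property Matthan points to can be illustrated with a toy sketch. The code below is a deliberately insecure, tiny-parameter implementation of the Paillier cryptosystem (an additively homomorphic scheme, named here as one concrete example; the transcript does not specify a scheme). An analyst can combine ciphertexts to compute an aggregate, and only the key holder ever sees a plaintext—and then only the total, never the individual values.

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# Parameters are tiny and insecure -- illustration only; real
# deployments use 2048-bit primes and a vetted library.
import math
import random

def keygen(p=293, q=433):
    # p and q are small primes chosen for this demo.
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    x = pow(g, lam, n * n)
    mu = pow((x - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n

pub, priv = keygen()
salaries = [52, 61, 47]                     # sensitive individual values
ciphertexts = [encrypt(pub, s) for s in salaries]

# Multiplying Paillier ciphertexts adds the underlying plaintexts,
# so the analyst computes the aggregate without any decryption key.
total_ct = 1
for c in ciphertexts:
    total_ct = (total_ct * c) % (pub[0] ** 2)

print(decrypt(pub, priv, total_ct))         # 160 -- the sum, nothing more
```

This is the sense in which a database could be used "for whatever analytical purpose we might devise without accessing the personal information": the computation runs over ciphertexts, and decryption reveals only the result.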
On the right against your data being processed
sethia: I just glanced through the Takshashila paper. It appears that the right to processing is a tough one to implement, for having firms disclose all the granular ways data is processed is way too much to ask. Are there any precedents anywhere on this?
rahul: The right against processing is quite common in most data protection laws around the world. Essentially it ensures that the data subject has the right to say that he no longer wishes for his data to be processed and the data controller must stop processing. However, the data subject needs to understand that there could be a diminution in the quality or the nature of the service provided because the data is no longer being processed.
For instance, if the data subject tells Google that he no longer wants it to track his location, then he needs to understand that Google Maps will no longer be able to give him turn-by-turn directions.
This construct is quite common in data protection laws around the world.
sethia: But isn’t it about denying consent to use the location rather than knowing how his location data is being processed?
rahul: So, if you look at the Discussion Paper, the Right Against Processing is exactly that. I quote:
“Data subjects who do not want their data to be processed shall have the right to require that the data controller stop processing their data forthwith. The statute could be implemented in such a manner as to offer the data subject granular control rather than presenting just binary options.”
sethia: From what I understood of the word ‘processing’ is: Saying to an app that I don’t want to give you access to my location vs. saying I am happy to give you location, but you cannot process it for other purpose like generating my timeline history of places I visit or showing me ads due to it.
rahul: That is correct.
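The distinction in this exchange—granular, per-purpose control rather than a single binary opt-in—can be sketched as a data structure. Everything below is a hypothetical illustration: the purpose names echo the Google Maps example in the conversation but are not drawn from the paper or any real API.

```python
# Hypothetical sketch of granular processing consent, contrasting with
# a single all-or-nothing switch. Purpose names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class ProcessingConsent:
    purposes: dict = field(default_factory=lambda: {
        "navigation": False,        # e.g. turn-by-turn directions
        "location_history": False,  # e.g. a timeline of places visited
        "ad_targeting": False,      # e.g. location-based ads
    })

    def grant(self, purpose: str) -> None:
        self.purposes[purpose] = True

    def revoke(self, purpose: str) -> None:
        # The right against processing: on revocation, that use must stop.
        self.purposes[purpose] = False

    def may_process(self, purpose: str) -> bool:
        # Unknown purposes default to "no consent".
        return self.purposes.get(purpose, False)

consent = ProcessingConsent()
consent.grant("navigation")                      # location may power directions...
assert consent.may_process("navigation")
assert not consent.may_process("ad_targeting")   # ...but not ads
```

The point of the sketch is that consent becomes a set of independently revocable flags keyed by purpose, which is what lets a user say "use my location for directions, but not for a timeline or ads."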
Is there any divergence between your framework and the Supreme Court ruling?
nitin.srivastava: Are there conceptual aspects of your framework that you think might be less aligned with the Supreme Court ruling on privacy and the direction it is pointing towards?
rahul: The Supreme Court judgment seems to suggest that we look to the EU model to frame our law. That is a consent-based privacy model that I believe is not useful in the context of the machine learning future we are about to enter. So, I have suggested that we do away with consent and focus on accountability.
To that extent there might be a disconnect in my thinking compared to the apparent language that the Supreme Court has used.
That said, the Supreme Court has emphasised autonomy in the judgment, saying that it is important that we retain autonomy over our personal data. Most have interpreted that to mean that we should have the right to be asked for our consent before the data is collected. I think that autonomy can equally be achieved if we have the right to prevent the data controller from using the data when we were unable to provide consent before it was collected.
On children’s online privacy protection
krayker: Is the Children’s Online Privacy Protection Act (COPPA) or an equivalent being discussed, or is it in the ambit of the proposed law? (Given that Facebook and Google already follow the law in the US at least.)
Given that adults themselves are unaware of the inherent complexities and risks posed by privacy breaches, I guess it's just as important, because a lot of kids join Facebook, WhatsApp, etc. and overshare. By the time they realise, the proverbial horse would have already bolted from the stable!
rahul: I have not specifically covered it in the framework document but I do think this is very important. We will need a special framework to address children and if there is one exception I can make to my principle about replacing consent with accountability, it is in relation to children.
How will right to privacy be implemented for regular citizens?
niloferm: From a corporate/legal standpoint, the right to privacy is great. However, as a regular citizen, I have wondered how this will be implemented on a daily basis. Given Indian society and the past joint family system, privacy was never really taken seriously. How will this change an individual’s life?
Now the government has to demonstrate that it really needs to collect information that it is collecting from you
rahul: It is going to change an individual’s life in its interactions with the government. Now the government has to demonstrate that it really needs to collect information that it is currently collecting from you for the purposes that it does. At present it is likely that the government is collecting far more information than it needs on the basis that it might need the information at some later point in time. This can no longer happen.
Secondly, there is a big push for a privacy law that will have a far greater impact on non-state actors.
Is a consent-based approach counter-productive?
krayker: Regarding this debate of a rights-based vs. a consent-based approach, in the case of the rights-based approach, the onus of compliance seems to lie with the data controllers. Whereas with the consent-based approach, the onus of security lies with the user/ individual.
The former is trust-based and convenient for users; the latter, although it seems better with granular control, may be too frequent and counter-productive due to consent fatigue. Much of the populace is ill-equipped to understand the nuances. For example, Aadhaar gives an option to lock/unlock biometric authentication, but what percentage of the population can really operate or toggle the same?
What is your opinion regarding this?
rahul: This is a really articulate enunciation of the problem with consent. We instinctively feel that so long as we can consent to the manner in which data is collected from us and so long as we decide the uses that it is put to, we have the ability to decide for ourselves what is good for our personal privacy. But few of us actually go through the process of making an informed decision.
That being the case, despite our instinctive need for autonomy in decision making about personal privacy, we are not really doing all that much to safeguard our privacy. In this context, consent is not really an effective safeguard.
The rights-based approach relies on holding corporations accountable for the harms they cause. This, I believe, is a far more effective way to safeguard privacy.