
Naavi.org

Building a Responsible Cyber Society…Since 1998

Theory of Dynamic Personal Data

Posted by Vijayashankar Na on March 31, 2018
Posted in Cyber Law | 1 Comment

“Personal Data” is the object of data protection regulations such as the upcoming Data Protection Act of India and DISHA 2018, as well as existing laws such as the GDPR and ITA 2008. “Protecting Personal Data” is what the Indian judiciary, which declared “Privacy a fundamental right”, considers “Information Privacy”. In all data protection regulations, data is classified as “Personal Data” and “Sensitive Personal Data” (or special-category personal data), and different responsibilities are prescribed for Data Controllers and Data Processors.

The current global controversy over Facebook users’ profiles being used to influence the US elections is an interesting case study for examining the efficacy of the current data protection laws: it shows where the laws have failed to capture the real nature of data and are therefore failing in implementation. If laws are failing in the current scenario, they will fail more often in the emerging era of Big Data analytics, Artificial Intelligence and Quantum Computing.

While ITA 2008 and the GDPR are already frozen, India has two data protection regulations in the pipeline, namely DISHA 2018 (Digital Information Security in Healthcare Act) and the Data Protection Act of India being drafted by the Justice Srikrishna Committee. It is therefore a great opportunity for the Indian legislators to incorporate certain new data protection provisions that other legislation, including the GDPR, may have missed. Naavi has already provided some inputs on the proposed laws in earlier articles.

Theory of Dynamic Personal Data

This article, however, introduces a new “Theory of Dynamic Personal Data” which, if recognized and brought into our regulations, may resolve some of the anomalies we presently face.

The basic concept of this theory is that “Personal Data” is dynamic. It is not a static object that one entity collects under a “Consent”, uses for a stated purpose and then simply destroys.

Data, once created, cannot be easily destroyed. It can only be converted into another form where it looks different. It is therefore like “Energy”, which cannot be destroyed in the universe but only converted from one state to another.

Energy can even manifest as a “Particle” or a “Wave”. Similarly, data can be rendered as a tangible “Document” or seen as “binary impressions on magnetic or optical media”.

In quantum computing theory, “Data” can be in the form of qubits, in an uncertain state of being either a zero or a one but assuming a probabilistically determinable value when measured. The same “Uncertainty” can exist in the state of “Personal Data” even in the classical computing environment.
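The measurement analogy can be sketched in a few lines of Python (an illustrative toy for the argument, not a quantum simulation; the class and attribute names are hypothetical): a data attribute carries a probability distribution over possible states and yields a concrete value only when it is “measured”.

```python
import random

# Toy model: a personal-data attribute holds a distribution over
# possible states and "collapses" to one value only when read.
class UncertainAttribute:
    def __init__(self, states):
        # states: mapping of possible value -> probability
        self.states = states

    def measure(self):
        # Collapse to one concrete value, weighted by probability.
        values = list(self.states)
        weights = [self.states[v] for v in values]
        return random.choices(values, weights=weights)[0]

affiliation = UncertainAttribute({"BJP": 0.6, "Congress": 0.4})
snapshot = affiliation.measure()  # a value frozen at this instant only
```

Each call to `measure()` is one snapshot; two controllers measuring at different times may legitimately record different values.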

Data has a life cycle in which it is generated, re-generated, processed into a value-added form, fused and fissioned, deleted and undeleted, forgotten and remembered, used and misused, de-identified (anonymized or pseudonymized) and re-identified.

Hence any data protection law built on the assumption that an “Informed Consent” from the data subject to the first data collector will solve all the problems of “Information Privacy” rests on a complete myth.

What is required is for the law to recognize that “Personal Data is dynamic in nature” and at any given point of time exists in a certain state of uncertainty. It can, however, be measured at a specific point of time, when it shows up in a certain form. This is analogous to the “Uncertainty Principle” embedded in the “Superposition” concept of quantum computing.

Three Fundamental Rules of Dynamic Data Theory

We can define three fundamental rules of the Dynamic Data Theory for further discussion:

The first rule is that

“Personal Data does not exist in isolation but exists in the Data Universe”.

Such a data universe consists of:

- the data subject’s data in different forms with different data controllers, collected at different times, along with

- many versions of the personal data processed by different processors for different purposes, and

- combined with the data of other data subjects.

The second rule is that

“Personal Data exists in an uncertain state, where it may be personal or non-personal, sensitive or otherwise, and assumes a certain state only at the time of its measurement.”

The third rule is that

“Personal Data is not “Absolute” in truth and accuracy and always exists in a form dependent on the context of its collection and use.”

How these rules should be integrated into law making

Let us now elaborate on these three rules and discuss why a data protection law that does not consider them is defective ab initio.

We define “Personal Data” as information that is identifiable with a living person. Obviously, the name is the primary identifier of an individual in the physical world. In the cyber world, it is the e-mail ID or an avatar ID that substitutes for the name as the real identity of a netizen.

The address in the physical world and the IP address in the cyber world are also identifiers.

Additionally, there could be other parameters such as the mobile number, the Aadhaar number, the PAN number, the Voter’s ID etc., which are all different identity parameters.

There are also further parameters such as age, sex, political affiliation, sexual preference, health information, financial information etc. that are considered “Personal Information” when they are identifiable with a living individual.

The basic or “Primary” personal information is not the health or financial information but the physical identity information, such as the name and address, or the cyber identity parameters, such as a biometric or password. Other information may be important, but it is “Secondary Personal Information”.

So far, no law has defined “Primary Personal Information” and “Secondary Personal Information”. We have jumped from “Personal Information” to “Sensitive Personal Information” without clearly defining which is “Primary” and which is “Secondary”.

In the personal data cycle, “Personal Information” starts with the “Birth Certificate”, which records the name of an individual along with those of his parents, the place of birth, and the date and time of birth. This is the “Primary Personal Information” at the atomic state. Within this, it is difficult to determine which comes first and which comes later.

In olden days, birth certificates used to be issued as “son/daughter of X, the father, and Y, the mother”. The name actually came later, assigned by the parents in the naming ceremony. However, the convention today is to issue a “Birth Certificate” incorporating the assigned name. Hence the parameters of the birth certificate, namely the name, date of birth, place of birth and names of the father and mother, constitute the atomic-level personal information that needs to be defined as “Primary Personal Information”.

Subsequently, other information about the data subject gets added, including the DNA profile, blood group etc. Further, the education, employment particulars, bank particulars and other health parameters all get added to the “Personal Information”.

What we need to recognize here is that “personal data changes its state on a continuous basis”, though it may appear from time to time in the form of a snapshot, an electronic document such as a PAN card, Aadhaar card, medical report, bank statement etc.

Hence the law has to define “Personal Information” as an “evolving set of data that gets tagged to the primary personal information created with the birth certificate parameters”. It is only the birth certificate parameters that can be frozen as an “electronic document defining the personal information of an individual”, and this record gets extinguished with the “Death Certificate”.
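This two-tier definition can be sketched as a data structure (an illustrative model with hypothetical field names, not a statutory schema): a frozen primary record created from the birth certificate parameters, plus an evolving set of timestamped secondary attributes tagged to it.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass(frozen=True)
class PrimaryPersonalInfo:
    # Birth-certificate parameters: frozen for the subject's lifetime.
    name: str
    date_of_birth: str
    place_of_birth: str
    father_name: str
    mother_name: str

@dataclass
class PersonalDataRecord:
    primary: PrimaryPersonalInfo
    # Secondary information evolves continuously; each entry is a
    # timestamped snapshot, never an absolute current truth.
    secondary: list = field(default_factory=list)

    def add(self, attribute, value, when=None):
        self.secondary.append((when or datetime.now(), attribute, value))

record = PersonalDataRecord(
    PrimaryPersonalInfo("A", "1990-01-01", "Bangalore", "X", "Y"))
record.add("blood_group", "O+")
record.add("employer", "ExampleCo")
```

The frozen dataclass captures the idea that only the birth-certificate parameters can be treated as a fixed electronic document; everything else is an append-only history of snapshots.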

In between, even the name of the person may change if he so desires. His age ticks every second; his health data and financial data change every moment. If, therefore, we want “Personal Data to be accurate” as a legal requirement, the personal data record has to be updated every moment, which is not feasible.

It is in this context that I say that “Personal Data is in an uncertain state”: only when you want to measure it do you obtain a health report or a bank report in which the personal data is frozen at a given point of time and place. This is the “superposition nature of personal data”, similar to the quantum computing scenario. While a quantum superposition collapses to either a zero or a one, Personal Data is a “Continuum” of many states and is changing all along.

In this context, the personal data of a person exists in a “Data Universe” where new data gets generated, some of it gets tagged with the personal ID of the data subject, and we say that “the Personal Data has changed”. But this change can be recognized only if the Data Controller becomes aware of it.

If a data subject shares his data with one data controller on 1st January 2018 and with another on 31st January 2018, the two data sets will be different. Each controller will use the data based on the consent obtained, processing it and deriving inferences as if he knows the truth. If the data subject says on 1st January 2018 that he will vote for the BJP, he will be classified as a BJP-oriented person. If on 31st January he says he will vote for Mr X from the Congress, the local candidate in the forthcoming election, the data changes colour and makes him a Congress supporter.

If both data sets are available to a single data processor, he will compute a data analytics report showing that this voter is changing his profile and trending from the BJP to the Congress. If, before the election, Modi makes a speech, the trend may change again.

In such a scenario, the “Profiling” remains uncertain. Hence the so-called “personal data” that includes political affiliation is just an interpretation by a data processor, based on the information on hand and his own skill in interpretation; it is not an absolute truth that the person is either a BJP supporter or a Congress supporter.
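The point can be illustrated with a toy trend computation (the data and the interpretation logic are hypothetical): the “profile” a processor derives depends entirely on which timestamped snapshots it happens to hold, not on any absolute truth about the voter.

```python
from datetime import date

# Timestamped declarations held by a processor; each is a snapshot,
# not an absolute truth about the data subject.
snapshots = [
    (date(2018, 1, 1), "BJP"),
    (date(2018, 1, 31), "Congress"),
]

def profile(snapshots):
    # A naive interpretation: the latest declaration wins, and any
    # change between snapshots is read as a "trend".
    ordered = sorted(snapshots)
    latest = ordered[-1][1]
    trend = "shifting" if len({s[1] for s in ordered}) > 1 else "stable"
    return {"affiliation": latest, "trend": trend}

print(profile(snapshots))
```

A processor holding only the January 1st snapshot would report a stable BJP supporter; one holding both reports a shifting Congress supporter. Same person, different “personal data”.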

Without understanding the three rules, if the law tries to say “nobody shall use personal data except as provided by a consent”, then one has to ask: “Which data are we talking about?”

Is it the direct data provided by the individual, once as a BJP supporter and another time as a Congress supporter? Or

is it the “Processed Data” that says the person is an undecided voter who may change his preference based on the stimuli he receives closer to the election?

If an analyst like Cambridge Analytica comes to a conclusion and develops a “Profile Report”, should the law consider this “Primary or Secondary Personal Data” provided by the data subject, or “Derived Information” that is not guaranteed to be true and absolute personal information but is only an expert’s view of the analysis?

If so, should the data analytics firm be punished for a data breach if it shares its analysis with a candidate who is trying to finalize his communication strategy? This is a question the law makers need to answer.

Today, the law makers say that all these decisions will be governed by the “Consent”. Their view seems to be: “Let the person collecting the data obtain consent for processing it, deriving meanings and then sharing it with somebody else, for profit or for a cause.”

However, at the time of obtaining consent, the data controller has only a limited view of what information he is getting and how he may use it. But due to the “dynamic nature of the data”, after collection the data in the hands of the data controller “grows” and “metamorphoses” into a different form, and he discovers that he can now make new uses of it.

What he bought was perhaps a caterpillar and now it has become a butterfly.

Should the law now say: go back to the data subject and ask him if you can use the butterfly instead of the caterpillar? Of course the law can say so, because the law can be an ass.

But what we need to ask the law makers is whether we can create a law which recognizes that data which looks like a caterpillar today may die as a caterpillar or change into a beautiful butterfly, and which encourages the data holder to nurse the caterpillar in any way he wants and make it more valuable than it was when handed over by the data subject. This is the business of data mining and data analytics on which a huge part of the IT industry stands today.

Another complication in the data scenario is that data may be processed by a number of downstream data processors, while today we define due diligence at each level in the form of a “Consent” or “Processing Contract”, which can only capture known information about the data and not what can be “Discovered”. Also, downstream data processors are not aware of the original consent and have to proceed with their processing only on the basis of the data processing contract provided to them by the data collector.

If data protection laws try to curb the “Discovery” of new uses of data, we will be curbing scientific development, and the concept “Data is the New Oil” would be killed to the detriment of the progress of society.

If, therefore, Mr Aleksandr Kogan created some inferences based on the data he obtained from Facebook users under a separate consent given on his app, then the inference he derived was “Derived Data” and not the “Absolute Personal Data” of the data subjects.

Presently the community is fighting over the issue as if “Personal Data” has been breached. But actually what has happened is that somebody created a notional value addition and somebody paid money to buy it. It is pure speculation whether it was beneficial to Mr Trump, and whether similar analysis in India would benefit the BJP or the Congress.

The “Dynamic Personal Data” theory cuts the Gordian knot and releases the “Processed Data” from the constraints of the “consent on the raw data”.

In other words, the consent obtained for transferring the caterpillar is not allowed to restrain the use of the butterfly.

Quantum States of Personal Data

When personal data is in the hands of one data processor, it is in a certain state of certainty defined by the information obtained under the “consent”. But while the data is being put to use, it slowly gathers energy and becomes more and more useful as additional information flows in from different sources and different persons.

For example, one person at a certain street address says that he likes to vote for the BJP. Then, let us say, another piece of data, that the person attended a BJP rally or that a BJP team visited him at his house and had a discussion, gets added to the database. Now the first piece of information gets hotter and hotter until the analyst concludes with his algorithm that this is a BJP voter and profiles him as such.

In this example, we can see that a piece of “Personal Information” attains the status of “Sensitive Personal Information” without the data subject doing anything or providing any additional information himself. The same thing may happen when Google Maps adds data showing that the data subject visits a dialysis centre every week, and the inference is that he is a kidney patient. If this data is read along with the financial information of the data subject, one can infer whether he is a prospective candidate for accepting a kidney donation.

This movement of personal information from one state to another, after the accumulation of additional data from the Big Data platform or through the subject’s own contribution, is like the “quantum jump” of an electron changing the state of the atom. If the incoming data energy is less than the quantum requirement, the entropy increases but the data remains mere “personal information”. But when the entropy crosses a certain quantum level, the data changes its status. If the data energy is strong enough, then it is not only the electron that makes a quantum jump: the nucleus itself may go into fission and change the entire profile of the data. In the Cambridge Analytica case, if the advertising input is strong enough, the profile of the data subject may alter from a BJP supporter to a Congress supporter or vice versa.

Now, according to present data protection laws, the information which was earlier only “personal information” got fused with other information, such as “BJP party activity in the area”, and the result was a “political profiling” of the data subject, which is “Sensitive Personal Information”. As is now happening, data privacy activists will say this is an inappropriate use of the consent, manipulating voter behaviour, and should not be allowed.

But is this change of status “controllable” by a law stating that you cannot bombard the data subject with additional information? If so, are we trying to curb the business of advertising and communication itself? These are points that the data protection laws need to address before jumping to introduce stringent provisions in the light of the Facebook-Cambridge Analytica issue.

Thus we need to remember that Data is not static. It grows with the accumulation of additional data from the surroundings. In the process data changes colour and renders the earlier consent meaningless in the new scenario.

Similarly, non-personal data can become personal data when there is a fusion of identification parameters, and identifiable personal information may become de-identified if the identity parameters are removed.
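This two-way transition can be sketched as follows (an illustrative toy; real de-identification is considerably harder, since quasi-identifiers such as location and age can still enable re-identification):

```python
# Toy illustration: removing direct identifiers turns a personal
# record into a (nominally) de-identified one; fusing an identifier
# back in makes it personal data again.
IDENTIFIERS = {"name", "email", "aadhaar"}

def de_identify(record):
    return {k: v for k, v in record.items() if k not in IDENTIFIERS}

def re_identify(record, identity):
    # Fusion of identity parameters converts it back to personal data.
    return {**record, **identity}

personal = {"name": "A", "city": "Bangalore", "blood_group": "O+"}
anonymous = de_identify(personal)            # no direct identifiers left
personal_again = re_identify(anonymous, {"name": "A"})
```

The same payload thus flips between “personal” and “non-personal” depending on what it is fused with, which is exactly why a one-time consent classification cannot be static.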

The next generation of data protection law cannot be blind to this “Dynamic State of Personal Data” and should not be built on the assumption that personal data always remains static until the data subject himself provides new data inputs with new consents.

Is it a Diamond or Charcoal?

In this process of data transition through its life cycle, the value of data may change substantially. Just as carbon can exist as charcoal or as diamond based on how it is processed, data can remain worthless or become valuable depending on the processing.

If data processing creates a diamond,

should we stop such processing because the charcoal supplier supplied it at a throwaway price, thinking it would be used for burning, and gave his consent for that use, while the processor applied technology to compress the charcoal and discovered a means of converting it into a diamond?

or

should we mandate that all data subjects get a royalty when their personal data is used to create profits for the downstream industries?

This is a challenge that the data protection law makers of India need to consider when they draft the new laws.

Naavi

 

Reference Articles:

Data Protection Law in India… Three Big Ideas …. Data Trust, Jurisdictional Umbrella and Reciprocal Enforcement Rights

Uphold the “Right to Know” against “Right to Privacy” in the new Data Protection Law

We should forget the “Right to Forget” in Indian Data Protection Act

Cambridge Analytica and Indian Cyber Laws

Personal Data should be considered a personal Property

Public Consultation on Data Protection Legislation

Public Consultation on Data Protection Law…. Some points of discussion: Part I, Part II, Part III

Why We need a Data Breach Protection Act rather than Data Protection Act

CCTV footages.. Whose property is it any way?

Impact of Supreme Court’s Order on Right to Privacy on Cyber Space and Data Protection

Concatenating the individual Conclusions of the Privacy Judgement

Data Protection Act.. We should aim at Compliance with Pleasure not Compliance with Pain.

Look beyond GDPR and Create Personal Data Trusts to manage Privacy of data subjects

Privacy law cannot be only a tool for hiding oneself

Earlier innovative theoretical thoughts of Naavi

Theory of IS Motivation

“Theory of Secure Technology Adoption”… what it is..

Section 65B and its relation to the Theory of Soul and Body, rebirth and past life memory

The Three Plus One dimensions of Information Security

Fighting susceptibility for “Cyber Hypnotism” with Ulysses Contracts

Compulsive Cyber Offence Syndrome

In the midst of the discussions on Privacy and Data Protection following the Cambridge Analytica controversy, the Ministry of Health and Family Welfare, Government of India has released the long pending draft of the Health Care sector law on Privacy and Information Security.

Earlier, a discussion on this had been started at Naavi.org under the title Health Care Data Privacy and Security Act (HDPSA). Now the Act has been renamed the Digital Information Security in Healthcare Act (DISHA).

A copy of the draft is available here:

Public comments have been invited up to 21st April 2018 and may be sent to egov-mohfw@nic.in

Naavi.org will also provide its own comments in the next fortnight.

This law will be in addition to ITA 2008 and the proposed Privacy and Data Protection Act which the Srikrishna Committee is drafting. We also know that the T K Vishwanathan committee has been drafting an amendment to ITA 2008.

With the undue attention that Cambridge Analytica is getting, there is complete chaos in the domain of Privacy and Data Protection, and now this additional law will add further spice to the discussions.

Coinciding with this spurt of activity, the GDPR is being implemented by many Indian companies ahead of the deadline of 25th May 2018.

It is therefore a very active period for Privacy professionals in India. Hopefully we will be able to avoid overlapping legislation and conflicts between different laws that make the work of compliance difficult.

Naavi

Suddenly, politicians have become experts in data protection and in how mobile apps may collect data without your consent. Mr Rahul Gandhi, who once gave us “Gyan” about “Jupiter Escape Velocity”, is about to give us “Gyan” about “Privacy and Data Protection” and how consent should be obtained.

We have not forgotten that his close aide Ms Ramya, who manages RG’s Twitter account, openly asked Congress workers not long ago to open fake accounts and increase the party’s social media footprint so that they could compete with Modi’s popularity on social media. The Congress also used foreign bots to post “likes” and “retweets”, so that the fake followers and fake retweets could create “fake news” on social media, magnifying every lie uttered and fooling more dumb voters.

Unfortunately, in these efforts, RG forgets that it is his “Ivnarva” speech and his “Vish….Vish…..” stumbling that are more popular in Karnataka on social media than his more erudite utterances on Data Privacy.

Now both the Congress and the BJP have started trading charges based on the privacy policies of their respective apps. It is funny that the spokespersons of political parties are suddenly talking like Privacy experts.

My sincere advice to them is to stop talking on this subject even if media tries to needle them. They should say that some experts are looking into whatever allegations are being made and corrective action would be taken as necessary.

The vulnerabilities of mobile apps are known, and even big companies have not addressed them adequately. I therefore do not expect political party apps to be more privacy compliant than the apps of the MNCs.

The Cambridge Analytica issue has become the focus of both the national and the international media only because Donald Trump was connected to the incident as a beneficiary of the election campaign based on the profiling of people provided by Cambridge Analytica. The fact that a subsidiary of the same company could have been involved in managing the election campaigns of the BJP, the Congress and the JDU brought the focus to India. The media would not have been interested if this were only an issue related to the privacy of the public.

What is actually disgusting is that some of the security professionals are joining hands with Rahul Gandhi and trying to spread disinformation. Some of them are even testifying against the Government in the Aadhaar issue. Perhaps they are doing it because they hate Mr Modi or because they have been bought by Congress and the Communists to defame Mr Modi.

It is disappointing to note that such professionals are also criticising the NaMo App issue as if they agree with Mr Rahul Gandhi’s view that Mr Modi is spying on Indian citizens through the App. Mr Rahul Gandhi’s intelligence level is known, and nobody is surprised at his utterances. But security professionals will come out as hypocrites if they do not understand that we cannot expect the PM to check the source code of the app or its privacy policy just because the App is named after him. He has to depend on technology specialists and, if they have made a mistake, take suitable action.

These security professionals should also understand that the App is branded “NaMo” because of the brand value attached to the name of Mr Modi. It is not, however, the personal property of Mr Modi and must be considered as belonging to the Government. The Government would have subcontracted the development and maintenance of the App, and it is that organization which is actually responsible for the privacy policy not being followed.

Hence the talk of Mr Modi being responsible is incorrect, no different from holding celebrities accountable when the products they endorse fail.

In fact, when the Congress lost power in 2014, it was guilty of deleting the Twitter handle of the PMO before Mr Manmohan Singh demitted office, as if he personally owned the handle and had to remove it on leaving office.

Now many are also criticizing the fact that the privacy policy of the NaMo App was modified after the controversy. In fact, it is a matter to be appreciated that when a vulnerability was brought to the notice of the App owner, he tried to correct it. This is the “Risk Mitigation” effort expected of the owner. On the other hand, the Congress removed its app altogether. This is also fine, since it is a “Risk Avoidance” strategy, and since the Congress did not consider the App successful anyway, it was a wise move to withdraw it.

The public should also remember that certain technical information about any app or internet user, such as the browser or mobile used, is always tracked, because it is essential for presenting content in a proper manner. Hence processing the behaviour and preferences of users to a certain extent is perfectly legitimate. If this back-end processing is done online by a company abroad, the data may have to be sent there. In most cases this would be de-identified information, and hence there is no privacy stake here. This is not “stashing away the data of Indian citizens abroad” the way people stash away their black money.

RG cannot understand this, and hence he may say something like “spying”, “recording audio or video” etc., and this has to be ignored. Even if the BJP tries to pursue a defamation case, it may fail, since the Court may come to the conclusion that nobody takes RG seriously and hence no “defamation” can be attributed to his utterances.

But security professionals should be more responsible and not make loose comments. If they have suggestions to improve the App, they should provide those suggestions.

The summary of this discussion is that while we wish political parties were more careful in drafting the privacy policies and terms of use of their apps, we also wish the public would check whether they want to consent to the sharing of their personal data before any app is installed. Beyond this, it is not correct to use terms such as “spying”, except to exhibit one’s ignorance. It is OK for Rahul Gandhi and Ramya, but not for security professionals.

Naavi

Smart Cities in India and ITA 2008

Posted by Vijayashankar Na on March 25, 2018
Posted in Cyber Law | No Comments yet

When ITA 2000 was drafted, the concepts of “Smart Cities”, “Driverless Cars”, “Artificial Intelligence” or “Humanoid Robots” were not very much in the realm of the law makers’ vision. The main objective was to facilitate E-Commerce.

In 2008, ITA 2000 was amended to provide some additional security against Cyber Crimes. At this time the focus was on “Intermediary Liability”, but the vision was still restricted to liability arising out of crimes occurring on E-Commerce platforms and the extent to which the owner of a platform should be held liable for offences committed by third-party users.

In the context of Smart Cities, where the infrastructure depends heavily on “automated sensors” which collect data and pass it on to a central processor, and where the central processor is programmed to take automated decisions based on the data input and send back operational instructions to decision enforcement mechanisms, there is a debate on whether ITA 2008 can address the new challenges thrown up by the smart city ecosystem.

In this process, we have legal queries on whether we are violating “Privacy” while our sensors collect information, and whether mistakes committed by our “central processors”, armed with big data analytic capabilities and artificial intelligence, are punishable as cyber crimes.

The recent Uber autonomous car accident in Arizona has highlighted the consequence of failures by the sensors or the processing systems.

Also, Big Data Analysis which takes raw data from some source and adds intelligence to it to make it more useful information for third parties has raised issues of “Ethics” as we see in the Cambridge Analytica case.

It is interesting to note that, without any inkling of such possibilities, ITA 2000 provided that “an action by an automated system is attributable to the person who caused it to behave automatically”. By this one provision, all actions of automated systems are brought under legal scrutiny just as if a human were sitting there and operating the system, even though he might have used an algorithm as a tool. Such a person could be the owner of the system, like Uber in the Arizona case.

It is open to Uber to hold the software developer or the sensor manufacturer liable for their part in the failure of the warranty, depending on the contractual obligations. Under Section 79 of ITA 2000/8, read with Section 85, criminal punishments can also be imposed on intermediaries and their executives for adverse actions by automated systems.

If, therefore, automated systems in a smart city cause an accident, Indian law has somebody who can be held accountable.

As regards the Big Data analytics, current practice is to depend on the “Consent” obtained by the “Data Collector” who collects the personal data.

If the data collector adds value to the information then the right over the value addition is claimed by the person who added the value. This is recognized under the IPR.

The value-added information is different from the raw data handed over by the data subject, and hence the data collection contract has to specify whether the data subject permits the creation of value over the raw data provided by him and whether he is entitled to any benefits thereof. Otherwise, he may not be able to object to the value creation.

Naavi has recommended earlier that personal data should be treated as property and made transferable for a consideration, with a royalty payable to the data subject if the value is encashed by the data collector. However, a proper mechanism does not exist for this purpose, and hence the value adder is free to profit from the raw data supplied by the data subject.

However, when the value-addition processing of personal data leads to the creation of any “Profile Data” that is used in such a manner as to defame the data subject, it may be considered punishable, whether or not there was a consent and whether the data was collected from the data subject or from a third party.

The “permission to transfer” and the “Conditionalities of such transfer” inherent in the consent determine whether the Data analytics becomes a “privacy issue” or not.

The damage created by an aggregator or processor of data to the data subject is not much different from the damage that may be created by a malicious person who hacks into CCTVs or other devices of another owner and uses them for unauthorized surveillance or DDoS attacks. With smart cities using CCTV and other monitoring devices in plenty, there is fertile ground for misuse by hackers if the security is weak. The legal implication of such damages (e.g., the Dyn attack) is determined under Section 43A of ITA 2008, which imposes “Reasonable Security Practices” on the owner of a device.

The data aggregators or value processors are however in the nature of “Intermediaries” and their liabilities will be determined with the application of the “Due Diligence” principles.

One due diligence aspect that can be considered when personal data is transferred to another person is to transfer the data along with the consent, so that the downstream data processor is aware of the consent restrictions. This is not yet an established practice, but it can be considered.

Hence a “self-imposed ethical standard” as due diligence is the only available means through which the downstream user of data can be expected to protect the privacy of a data subject with whom he has no direct contractual relationship.

Also, when data is transferred from one data collector to another data processor, if the data is pseudonymized, then the obligations of both the data collector and the downstream processor would be either absent or substantially reduced. This can happen in many instances of research, but not when the processing is intended for marketing. “Marketing”, however, is almost always a category of use prohibited in any consent, and hence its prohibition can be treated as a presumption unless the contrary is proved by an “Explicit Consent”.
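A minimal sketch of what such pseudonymization before transfer could look like, assuming the data collector holds a secret key so that only the collector can re-identify records (the key and field names are hypothetical):

```python
import hashlib
import hmac

# Assumption: this key is held only by the data collector; the downstream
# processor sees only keyed hashes and cannot reverse them.
SECRET_KEY = b"held-only-by-the-data-collector"

def pseudonymize(record: dict, identifier_fields=("name", "email", "phone")) -> dict:
    """Replace direct identifiers with deterministic keyed hashes, leaving
    non-identifying attributes available for research processing."""
    out = dict(record)
    for field in identifier_fields:
        if field in out:
            digest = hmac.new(SECRET_KEY, str(out[field]).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out
```

Because the keyed hash is deterministic, records of the same person remain linkable for research purposes without revealing who that person is.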

When “Artificial Intelligence” is used in a Smart City scenario, the sensors (including CCTVs equipped with face recognition or gait recognition) are “machines” which collect the personal data. The “privacy breach” therefore is not evident unless the data is disclosed to a human being. As long as the data is being processed within the system, it is difficult to say whether privacy has been breached, though it could be a step towards an eventual breach of privacy.

Again, this is a grey area for law, and just as we say “Privacy” is a right available only to “identifiable, living individuals”, we can define that a “breach of privacy” is recognized only when a living individual accesses identifiable personal data without the consent of the data subject.

With such a definition, Smart City processing can be largely relieved of privacy obligations, as any data collected can be filtered into “suspect persons’ personal data” and “non-suspect persons’ personal data”, with the non-suspect persons’ personal data being de-identified by the machine itself.

Only the “suspect persons’ personal data” may be escalated for human intervention, and as long as the machine (or the person who owns its actions) can justify a “reasonable doubt” as to why the data subject should be considered a “suspect”, a privacy breach may not be considered to have occurred.
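The machine-side filtering described above can be sketched as a simple triage step. The suspicion rule and the identifying fields below are assumptions for illustration, not a real Smart City specification:

```python
def is_suspect(record: dict) -> bool:
    # Hypothetical "reasonable doubt" rule; in practice this is the
    # justification the system owner must be able to defend.
    return record.get("matched_watchlist", False)

def deidentify(record: dict, identity_fields=("name", "face_id")) -> dict:
    # The machine strips identity before the record is retained.
    return {k: v for k, v in record.items() if k not in identity_fields}

def triage(records):
    """Escalate suspect records to human review; de-identify the rest
    so that no living individual ever sees identifiable data of
    non-suspects."""
    escalated, retained = [], []
    for record in records:
        if is_suspect(record):
            escalated.append(record)
        else:
            retained.append(deidentify(record))
    return escalated, retained
```

Under the definition proposed above, only the escalated records could ever give rise to a privacy breach, since only they reach a human in identifiable form.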

Presently, these thoughts are being presented as an extension of the present laws. If this is universally accepted, then we may not need a separate Cyber law for smart cities. If not, we may consider some amendments to ITA 2008 to add the clarifications necessary to expand some of its provisions as may be required.

Naavi


Dr Pratap Reddy, Executive Chairman of Apollo Hospital, has stated that Apollo Hospital had turned off the CCTV cameras placed in the ICU when the late Tamil Nadu Chief Minister J. Jayalalitha was undergoing treatment. (Refer report here).

In the light of a strong suspicion that Ms Jayalalitha could have been murdered through a political conspiracy, the action of Apollo Hospital in deliberately switching off the CCTV cameras raises the question whether Apollo Hospital and Dr Pratap Reddy should face criminal charges of abetting a murder. If there was a CCTV facility in the hospital, there must have been a reason for it. Dr Pratap Reddy should explain why the CCTV was running when every other ordinary patient was there, without regard to their privacy, but was switched off only when Ms Jayalalitha was in the hospital.

Similar issues came to the fore in the suspected murder case of Sunanda Pushkar, where CCTV footage at Hotel Leela Palace went missing. There are many other instances where either the Police seized the CCTV device and later said that they did not find anything in the DVD, or the private establishment which maintained the CCTV itself said that the CCTV was not functioning when a VVIP crime took place right under its nose.

As a result, the ubiquitous CCTV is relied upon when its footage is wanted, and claimed to be unavailable when there is VVIP pressure to suppress the truth.

This incident highlights an important policy issue in the country about the privacy implications of installing CCTVs in public and semi-public places. The Srikrishna Committee working on the new Data Protection law needs to take this into consideration and make a specific provision: if CCTV, with or without face recognition or gait recognition capability, is a tool of security for the community and is permitted to be installed in public and semi-public places without being considered a “privacy breach”, then there has to be accountability for the footage captured.

We should not allow CCTV footage to be selectively used as evidence in some cases and selectively ignored in others, without the owner being prima facie suspected of having erased evidence when he claims that the footage in a particular instance is not available. At the least, he should be made liable, under the “Due Diligence” concept, to provide a proper explanation of why the device was not functioning in that specific instance.

If any person provides a “Consent” (express or deemed) to being monitored in a given situation, then the data collected about him and his behaviour should be treated as the property of the data subject. He should have the right to ask for a copy if required. Privacy laws such as the GDPR provide a right to erasure, a right to rectification and a right to portability of personal data, and CCTV footage must be treated as “personal data” of the data subject. The CCTV data collector cannot be allowed to arbitrarily state that in some cases data is available and in other cases it is not.

This principle should be tested now by subjecting Apollo Hospital to a rigorous criminal investigation in respect of the suspected murder of J. Jayalalitha. Simultaneously, I draw the attention of the Justice Srikrishna Committee to the need to incorporate such provisions in the new Data Protection Act as will make CCTV managers accountable for the data they collect while claiming exemption from general privacy principles, either for “National Security” reasons or under the cover of “Consent”.

Naavi

The news report that the personal profiles of 50 million Face Book users were collected and used without authorization to help Trump win an election has opened a new debate on Privacy and Data Protection in India. The BJP and Congress parties are fighting on TV, blaming each other for indulging in similar misuse of personal data, while the local subsidiary of Cambridge Analytica (CA), the firm accused of the misuse, claims to have served both the BJP and Congress in different elections.

Much of the debate happening in this connection appears to be dishonest and hypocritical, and the bluff has to be called.

We must first recognize that CA is supposed to have collected the data through an app which was voluntarily downloaded by users who gave consent for access to their personal information. The person who collected the information based on this consent used it for some kind of research for targeted advertising. The research was bought by Trump’s campaign managers, and presumably he benefited.

Just as in India anything done by Modi is objected to, the anti-Trump brigade is alleging that the US election was tampered with because of the profiling done by the consumer research company and the targeted advertising for which it was used. Even if the firm had done “psychological profiling” from the data available, as long as the data was in the public domain or obtained through informed consent, there is no breach of privacy. There are FinTech companies which do data analytics for fixing credit limits, and if data analytics is used to create innovative advertising, it is neither a surprise nor something to be scoffed at.

This sort of data collection from public resources or through informed consent cannot be objected to just because we do not like Mr Trump winning.

If there is any real objection, one has to go into the question of whether the “informed consent” was actually obtained through fraud, and if so, the data collector, namely the British academic Aleksandr Kogan, has to be brought to book.

Presently, all privacy laws place faith in such consents. But if the data collector breaches the agreement and sells the data to another person who uses it for a purpose other than the purpose for which it was provided, it has to be objected to on grounds of “breach of contract”, “breach of trust”, etc.

As regards the third party who bought the data, data protection acts need to impose a “Due Diligence” obligation to disclose the intended use to the data vendor and obtain confirmation that the purchased data can be used for that specific purpose. Since “advertising” is a legitimate purpose, if the data collector offers data to an advertiser, the advertiser may buy it under the premise that the data subject must have provided the necessary consent.

Whether the secondary data user is expected to check if the original consent provided to the data collector permits such use is a matter yet to be clearly defined in law, though it could be an ethical and moral issue. Also, in many cases, even the buyer may not be aware of how exactly he is going to use the data and how he can benefit from it. He may simply be buying it speculatively, hoping to discover some value-added derivatives which he may trade.

It is therefore hypocritical for us to express surprise that FB data could be used for profiling, that profiled information could be used for advertising, and that such advertising could be for political campaigns. All this has to be expected in the era of Big Data analytics and Artificial Intelligence.

In fact, the privacy laws so far have missed the need to impose “Due Diligence” on the secondary user of personal data, and this can be taken note of and included in the Indian data protection laws. We can also draw attention to Section 66B of ITA 2008, which provides a possibility of “stretching the legislative intent indicated in the section” to cover the misuse of data. Section 66B is actually meant for punishing the use of stolen computers and mobiles and uses the term “dishonestly receives and retains any stolen Computer Resource”. If we can consider data as a computer resource, and the act of using data for a purpose other than what it was meant for as “stealing”, then Section 66B can be stretched to the data misuse scenario, though this is not recommended.

Perhaps the Justice Srikrishna panel may include a clause that

“Any user of personal data shall exercise due diligence to ensure that the purpose for which it may be used is consistent with the consent provided”
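Operationally, such a clause amounts to a purpose check against the consent that accompanies the data. A minimal sketch, assuming the consent’s permitted purposes are somehow made available to the secondary user (nothing here reflects any actual statutory mechanism):

```python
def due_diligence_check(consented_purposes: set, intended_use: str) -> bool:
    """Return True only if the intended use is consistent with the
    purposes for which consent was originally provided."""
    return intended_use in consented_purposes
```

For instance, data consented for “academic research” would fail this check when the intended use is “political advertising”.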

Perhaps this is the lesson we can take from this incident, apart from what we have already discussed regarding the need for an intermediary called a “Data Trust” in the data protection environment.

Naavi