The Shape of Things to Come..The New Data Protection Act of India-9 (Definitions-Roles)

(Continued from the previous article)

P.S: This series of articles is an attempt to place some issues before the Government of India which promises to bring a new Data Protection Law that is futuristic, comprehensive and Perfect. 

We have so far discussed the definitions of “Privacy” and “Data” in the previous two articles. In this article, let us discuss the definitions of the different entities and their roles.

In GDPR, the important roles for data handlers are (a) Data Controller, (b) Data Processor, (c) Recipient and (d) Joint Data Controller.

On the other hand, PDPB 2018/2019 defined the roles as “Data Fiduciary”, “Data Processor” and “Consent Manager”.

In the NDPAI, it is suggested that the Data Fiduciary should be renamed the “Data Manager”. The reason we are suggesting this change is that while casting the entity which determines the purpose and means of personal data processing in the role of a “Trustee” is a good measure, this “Trusteeship” responsibility is not very practical, and it is difficult to expect commercially minded “Data Controllers” to faithfully discharge the responsibility of a “Trustee”. The conflict of interest is too strong for the concept to work efficiently.

At the same time, the term “Data Controller” reduces the importance of the Data Subject/Data Principal, as if he is enslaved by the Data Controller. It is therefore necessary to identify a more balanced role for the entity which is today referred to as the Data Controller or Data Fiduciary.

I therefore suggest that the entity which determines the purpose and scope of personal data processing be designated the “Data Manager”. This retains the superior position of the Data Principal, who appoints the “Data Manager” for a specific task.

The GDPR also uses the “Means of Processing” and “Purpose of Processing” as the criteria for identifying “Controller” status. This needs a re-look. “Collection” and “Purpose” would be two better parameters for fixing the responsibility of an entity as a “Controller”. “Collection” is a key criterion because it is only the “Collector of Personal Data” who has a relationship with the data subject and can obtain a proper consent where required. It is not feasible for a “Controller” to delegate the collection of personal data to an entity appointed as a “Processor”.

Secondly, once the “Controller” specifies the “Purpose” and hands over the personal data to a processor, the “Means of Processing” can be left to the processor to determine. In many practical instances, we find that Cloud Service providers offer services for processing personal data using proprietary technology. They would like to offer their service with a commitment on the required output, but would be reluctant to pass on the technology secrets in which they may hold intellectual property rights. Presently, all such processors need to be treated as “Joint Data Controllers” and not “Data Processors”.

A “Data Processing Contract” specifies the purpose for which the data has to be processed and also specifies the “Security” requirements. Security would automatically include the provision that the data cannot be used for any “Unauthorized purpose”. Hence, with control over “Purpose” and “Security” under a contractual obligation, the processor can be given the freedom to preserve his intellectual property rights.

Under these considerations, the definition of a “Data Manager”, which replaces the term “Data Fiduciary”, would be:

Data Manager

Data Manager in the context of personal data is any person who collects personal data and determines the purpose of processing.

Data Processor

Data Processor in the context of personal data is any person who processes the personal data received from a Data Manager strictly in accordance with the specified purpose for which the personal data was collected from the data subjects.

The associated definition would be that of a “Person”. The term “Person” may refer both to an “Individual”, who could be a data subject, and to an entity acting as a Data Manager or Data Processor.

The definition of a “Person” could be

Person

A “Person” in the context of personal data means

a) the individual whose personal data is collected by a Data Manager for an agreed purpose. 

b) the entity of any description which processes the personal data as a data manager or a data processor and includes an individual, corporate entity, partnership firm, society, association of persons, a Government department or any other juridical entity recognized under law.

The role of a “Consent Manager” is recognized in PDPB 2019 but not in GDPR. It is an excellent proposition in the context of the Indian environment, where data principals are less educated, have to grapple with language issues in understanding consent requests, and would benefit from the assistance that a “Consent Manager” can provide. A “Consent Manager” is always the “Collector of the personal data”, and hence, under the above definition of a “Data Manager”, the “Consent Manager” is also a Data Manager. However, the “Consent Manager” is a specialized Data Manager, since the only purpose for which he collects personal data is to act on behalf of the data subject in providing consent to other Data Managers and in exercising the rights of the Data Principal.

His role is therefore more that of a “Privacy Protection Advisor” to the Data Principal. This role can be created by a “Power of Attorney” document without the need for a provision in the law. However, in order to ensure responsibility and accountability for this important function, it is better for the law to declare this role under the term “Privacy Protection Advisor” instead of “Consent Manager”. This will avoid a clash with the similar term used under the “Account Aggregator” concept of the RBI, besides covering the function of exercising rights on behalf of the data principal.

Considering the needs of Indian society, it is suggested that the Act should encourage both corporate entities and individuals to take on a license as “Privacy Protection Advisors” (PPAs) under a suitable accreditation system regulated by the data protection authority. In this system, the PPAs could be classified as Category I, Category II or Category III advisors, where the lowest category (Category III) would be professionals with the necessary knowledge and commitment, Category II would be firms of Category III advisors, and Category I would be independent corporate entities with a specified capital base and larger responsibilities to technically safeguard the data principals.

The Category III advisors would be like Chartered Accountants or advocates who act individually within their respective professional responsibilities, and Category II advisors would be like a CA firm or law firm in which individual professionals who are Category III advisors work together as a loose association.

This will enable the development of professionals who can act not only as Privacy Protection Advisors for individuals but also as “Data Auditors”, and who would be required to fulfill the accreditation criteria of the regulator.

Under this premise, the Consent Manager could be defined as follows.

Consent Manager

Consent Manager is any person, association of persons, company or other juridical entity recognized under any law and capable of suing or being sued, which is authorized by the Data Protection Authority and may offer services as an advisor to assist individual data principals in providing informed consent to data managers and in exercising their rights guaranteed under the Act.

Joint Data Manager

Joint Data Manager in the context of personal data means any combination of two or more data managers who have agreed to share the responsibilities jointly and severally under this Act.

The GDPR defines a role of “Recipient” who is neither a Data Controller nor a Data Processor. However, since “Storing” of personal data is also considered “Processing” under GDPR, every recipient of identified personal data will automatically be a “Data Processor” or a “Data Controller”.

In the definition of a “Data Processor” which we used yesterday (article 8) we did not specifically include “Data Storing” as a “Processing activity”.

We defined “Processing” as follows.

“Processing” will be defined as any alteration of a binary sequence of data elements and includes data aggregation, data modification, data deletion, data disclosure, data publishing etc.

In this definition, we captured only such processes that alter the data as “Processing”.

In GDPR and PDPB 2019, “Storing” is also considered “Processing”. However, considering that there are many service providers who only store data, sometimes holding containers of data in safe custody without any access to the data itself, it may be better to carve out “Storage of Data” as a separate activity not amounting to “processing”.

We therefore suggest that under the “Roles”, we can define a “Data Storage Agent” as a separate entity with a definition as follows.

Data Storage Agent

A Data Storage Agent in the context of personal or non personal data management means any person who is entrusted with the custody of data, whether in a data container or otherwise, for the purpose of safe custody only, who does not have the right to access the data but will nevertheless be responsible for its secure storage.

…Discussions will continue…. Comments and suggestions are welcome.

Naavi

P.S: These discussions are presently for debate and are a work in progress awaiting more inputs for further refinement. It is understood that the Government may already have a draft and may completely ignore all these recommendations. However, it is considered that these suggestions will assist in the development of “Jurisprudence” in the field of Data Governance in India and hence these discussions will continue until the Government releases its own version for further debate. Other professionals who are interested in participating in this exercise, and particularly Research and Academic organizations, are invited to participate. Since this exercise is too complex to institutionalize, it is being presented at this stage as only the thoughts of Naavi. Views expressed here may be considered as personal views of Naavi and not that of FDPPI or any other organization that Naavi may be associated with.

1. Introduction
2. Preamble
3. Regulators
4. Chapterization
5. Privacy Definition
6. Clarifications-Binary
7. Clarifications-Privacy
8. Definitions-Data
9. Definitions-Roles
10. Exemptions-Privacy
11. Advertising
12. Dropping of Central Regulatory authority
13. Regulation of Monetization of Data
14. Automated means ..

 


The Shape of Things to Come..The New Data Protection Act of India-8 (Definitions-Data)

(Continued from the previous article)

P.S: This series of articles is an attempt to place some issues before the Government of India which promises to bring a new Data Protection Law that is futuristic, comprehensive and Perfect. 

We have discussed the definition of “Privacy” in some detail in the last few articles. In particular, the suggestion that “Sharing” of identifiable data for processing within an algorithm in such a manner that the identified data is not exposed to a human being has evoked a long debate and I have tried to provide clarifications as required.

One residual query was in respect of what would happen if the anonymised processed data is shared as “Non Personal Data” and the recipient later de-anonymizes it. Obviously, when data which was previously identifiable personal data is anonymized by one processor and then released as “Non Personal Data”, the processor is expected to use an acceptable standard of anonymization, which becomes the “Due Diligence” or “Reasonable Security Practice” on his part. When this is voluntarily shared by Processor A with a Recipient B, Recipient B is not expected to de-anonymize the data. If Recipient B does de-anonymize it, then Recipient B would be guilty of a criminal offence (we shall discuss this later under penalties). At that time, Processor A would have to establish that it had followed “Due Diligence” and claim protection as an intermediary under Section 79.

I reiterate that restricting “Sharing”, for the purpose of defining “Privacy”, to disclosure to a human being and not to an algorithm is a concept that differs from GDPR jurisprudence. It is suggested because we feel there is an opportunity for India to set new standards by designing a Data Protection Act which can be better than GDPR.

In order to recall all the discussions we had in this regard, we reproduce the definition of Privacy as suggested by us to be included in the NDPAI.

Privacy

Privacy is a fundamental right under the Constitution of India, an independent right under the Right to Life and Liberty, that guarantees to an individual a right which shall not be infringed except under due process of law as defined in this Act, and includes the following:

(a) “Physical Privacy” means the choice of an individual to determine to what extent the individual may choose to share his physical space with others.

(b) “Mental Privacy” means the choice of an individual to determine to what extent the individual may choose to share his mind space with others.

(c) “Neuro Privacy” means the choice of an individual to determine to what extent the individual may share his neuro space with others.

(d) “Information Privacy” means the choice of an individual to determine to what extent the individual may share data about the individual with others.

Explanation:

“Sharing” in the context above means “making the information available to another human being in such form that it can be experienced by the receiver through any of the senses of seeing, hearing, touching, smelling or tasting of a human in such a manner that the identity  of the individual to whom the data belongs may become recognizable to the receiver with ordinary efforts”.

We also reiterate that it is the responsibility of the Government to define “Privacy” before placing a responsibility on the industry to protect the Right to Privacy.


Definition of Data

Having defined that “Privacy” is related to the protection of personal data, we now need to define:

i) Data, Computer, Processing

ii) Personal Data

iii) Non Personal Data

iv) Sensitive Personal Data

v) Neuro data

vi) Harm to individuals

vii) Harm to Entities

viii) Critical Personal Data

ix) Sensitive non personal data

x) Critical Non Personal Data

xi) Significant Harm

xii) Joint  Data

xiii) Corporate Data

xiv) Business Data

xv) Minor Data

xvi) Personal Data of non citizens

xvii) De-identified Personal Data

xviii) Pseudonymized Personal Data

xix) Anonymized Personal Data

xx) Encrypted Data

Let us first place before the audience the proposed definitions of these different categories and later debate whether all of them need to be defined.

We have consciously decided to pursue the development of a “Privacy Code” that is a combination of the present-day PDPB 2019, ITA 2000/8 and the proposed Non Personal Data Governance Act (NPDGA).

Though in the definition of Privacy we have included instruments of personal information contained in “Oral” and “Paper” form, most of our discussions will revolve around “Data”, which is the electronic form of storage of information.

When PDPB 2019 was contemplated, we already had ITA 2000 which had defined Data, Personal data,  Sensitive Personal data and the legal recognition to Data. Hence PDPB 2019 adopted the same definition of data and only made some changes to the definition of sensitive personal information. Presently there is an opportunity to find an improved definition and hence we are proceeding to suggest definitions which may be slightly at variance with the current ITA 2000/8.

i) Data, Computer, Processing

“Data” means information which is expressed or is capable of being expressed in a binary language and includes data in raw form where the binary elements are distributed in a chaotic state and data which is organized into bytes and sequence of bytes.

Correspondingly, “Computer” will be defined as any device that can generate, process, store or transmit data, or delete data by destroying the organized form of binary distribution back to a chaotic form, and includes all hardware devices and applications which provide the functionality of generating an organized set of binary expressions, processing them, storing them, transmitting them or handling them in any other form.

Further, “Processing” will be defined as any alteration of a binary sequence of data elements and includes data aggregation, data modification, data deletion, data disclosure, data publishing etc.
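To make the intent of these definitions concrete, here is a minimal sketch in Python (purely illustrative; the function names are my own and are not drawn from any draft law) of information expressed in binary form and of “processing” as an alteration of that binary sequence:

```python
def to_binary(text: str) -> str:
    """Express information in binary form ('data' under the proposed definition)."""
    return " ".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def process(data: str) -> str:
    """'Processing': any alteration of the binary sequence; here, a
    simple modification that blanks the first byte."""
    octets = data.split(" ")
    octets[0] = "00000000"  # altering the sequence is "processing"
    return " ".join(octets)

record = to_binary("Data")
print(record)           # 01000100 01100001 01110100 01100001
print(process(record))  # 00000000 01100001 01110100 01100001
```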

ii) Personal data

Personal data means any data that can with reasonable assurance be associated by the receiver with an identifiable living natural person and includes combination of different elements of personal data which in combination create a reasonably assured identity though the different elements might have been acquired from different sources and at different points of time.

iii) Non Personal Data

Any data which is not “Personal Data” is “Non Personal Data” and includes raw data in a chaotic distribution of binary, Corporate Data, business transaction data, environmental data etc., which do not contain an association with the identity of any specific living natural person.

iv) Sensitive Personal Data

Personal data which may reasonably cause a significant harm to the individual in the hands of an unauthorized person is classified as “Sensitive Personal Data” and includes

a) Credentials for accessing restricted data

b) Health data

c) Financial data

d) Sex related data

e) Biometric data

f) Genetic data

An associated definition for Sensitive Personal Information would be the definition of “Harm” and “Significant Harm”.

v) Neuro data

Neuro data means the electromagnetic signals that are collected from or fed into the human brain by a Brain Computer Interface in binary form.

vi) Harm to Individuals

“Harm” means any wrongful and adverse impact on the body, mind or property of an individual and includes 

a) Physical or Mental injury

b) Loss, distortion or theft of identity 

c) financial loss or loss of property

d) Loss of reputation or humiliation 

e) Loss of Employment or source of income 

f)  Threat to life and property including causing harassment or subjecting to extortion

g) Causing discriminatory treatment in the society.

h) Psychological or Neurological manipulation which alters the ability of an individual to take autonomous decisions

vii) Harm To Entities

“Harm” in the context of entities means any wrongful and adverse impact on the entity in terms of its property, reputation, business continuity, operational impairment or cost escalation.

viii) Critical Personal Data

Critical Personal Data means such personal data, the deprivation, incapacitation or destruction of which would cause significant harm to an individual, and includes biometric data, genetic data or unique official identifiers, as well as personal data under the control of such entities or computer resources whose activities, if incapacitated or impaired, may have a debilitating impact on national security, the economy, public health or safety.

ix) Sensitive Non Personal Data 

Sensitive Non Personal Data means such non personal data, the deprivation, modification, deletion or wrongful sharing of which may reasonably cause a significant harm to any organization, including

a) Loss of Business

b) Loss of Money or Property

c) Loss of Reputation

d) Disruption of Business Continuity

e) Unreasonable increase in cost of operation

x) Critical Non Personal Data

Critical Non Personal Data means such non personal data, the deprivation, incapacitation or destruction of which would cause significant harm to an entity, and includes non personal data under the control of such entities or computer resources whose activities, if incapacitated or impaired, may have a debilitating impact on national security, the economy, public health or safety.

xi) Significant Harm

Significant Harm means such harm caused to an individual or any other entity, which is irreversible or is reasonably difficult to correct once caused.

xii) Joint Data

Joint Data, whether personal or non personal, means such data that is generated during a transaction involving more than one individual or entity.

xiii) Corporate Data

Corporate Data means data that can with reasonable assurance be associated with an identifiable entity other than a living natural person, including Government agencies, partnership firms, proprietary concerns, associations of individuals and not-for-profit entities, and further includes combinations of different elements of data which together create a reasonably assured identity, though the different elements might have been acquired from different sources and at different points of time.

xiv) Business Data

Business Data means any data related to a business or Governance transaction whether inclusive of elements of personal data or corporate data or not.

xv) Minor Data

Minor Data means any personal data associated with an individual who is of age less than 18 years.

xvi) Personal Data of Non Citizens

Personal Data of Non Citizens means any personal data of an individual who is not a Citizen of India as per the Citizenship Act of India.

xvii) De-Identified Personal Data

De-Identified Personal Data means such personal data from which all parameters of identity that may with reasonable assurance determine the association of the data with a living natural individual are removed and made inaccessible to the person to whom the data is disclosed.

xviii) Pseudonymized Personal Data

Pseudonymized Personal Data means such personal data in which all parameters of identity that may with reasonable assurance determine the association of the data with a living natural individual are replaced with comparable but randomly altered data elements and made inaccessible to the person to whom the data is disclosed.

xix) Anonymized Personal Data

Anonymized Personal Data means such personal data from which all parameters of identity that may with reasonable assurance determine the association of the data with a living natural individual are removed and irrevocably destroyed, so that the identity of the individual is rendered indeterminate to any person in possession of the residual data, including the entity or person who caused the anonymization.
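The operative difference between these last three definitions lies less in the removal step itself than in whether a re-identification mapping survives anywhere. A minimal sketch, with record fields and helper names that are my own illustrative assumptions:

```python
import secrets

record = {"name": "Asha", "city": "Mysuru", "purchase": "book"}
IDENTIFIERS = ("name",)

def de_identify(rec: dict) -> dict:
    """Identifiers removed from the disclosed copy; a key map retained
    elsewhere could still restore them, so they are only inaccessible
    to the person receiving the data."""
    return {k: v for k, v in rec.items() if k not in IDENTIFIERS}

def pseudonymize(rec: dict):
    """Identifiers replaced with comparable but random stand-ins; the
    mapping, held separately, can re-link the data."""
    out, key_map = dict(rec), {}
    for k in IDENTIFIERS:
        out[k] = "P-" + secrets.token_hex(4)
        key_map[out[k]] = rec[k]
    return out, key_map

def anonymize(rec: dict) -> dict:
    """Same removal as de-identification, but crucially no mapping is
    kept anywhere, so identity is indeterminate even to the anonymizer."""
    return {k: v for k, v in rec.items() if k not in IDENTIFIERS}

print(de_identify(record))
print(pseudonymize(record)[0])
print(anonymize(record))
```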

xx) Encrypted Data

Encrypted Data means such data that has been converted into a different form and rendered unusable and unreadable by unauthorized persons.
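As a purely illustrative sketch of this definition (assuming the third-party Python cryptography package is available; any standard cipher would serve equally), the data is converted into a different form that is unreadable and unusable without the key:

```python
from cryptography.fernet import Fernet  # assumes: pip install cryptography

key = Fernet.generate_key()  # held only by authorized persons
cipher = Fernet(key)

token = cipher.encrypt(b"account-no: 1042")  # converted into a different form
print(token)                  # unreadable/unusable to unauthorized persons
print(cipher.decrypt(token))  # the authorized key holder recovers the data
```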

The above definitions have been provided for specific reasons that will become clearer as we go ahead and advocate the provisions of the Act.

However, definitions are very critical to the designing of the laws and hence I invite intense debate on the above definitions.

P.S: These discussions are presently for debate and are a work in progress awaiting more inputs for further refinement. It is understood that the Government may already have a draft and may completely ignore all these recommendations. However, it is considered that these suggestions will assist in the development of “Jurisprudence” in the field of Data Governance in India and hence these discussions will continue until the Government releases its own version for further debate. Other professionals who are interested in participating in this exercise, and particularly Research and Academic organizations, are invited to participate. Since this exercise is too complex to institutionalize, it is being presented at this stage as only the thoughts of Naavi. Views expressed here may be considered as personal views of Naavi and not that of FDPPI or any other organization that Naavi may be associated with.

Naavi

(This is not an exhaustive list of definitions. More will follow)

1. Introduction
2. Preamble
3. Regulators
4. Chapterization
5. Privacy Definition
6. Clarifications-Binary
7. Clarifications-Privacy
8. Definitions-Data
9. Definitions-Roles
10. Exemptions-Privacy
11. Advertising
12. Dropping of Central Regulatory authority
13. Regulation of Monetization of Data
14. Automated means ..

 


Happy Independence Day


The Shape of Things to Come..The New Data Protection Act of India-7 (Clarifications-Privacy)

(Continued from the previous article)

P.S: This series of articles is an attempt to place some issues before the Government of India which promises to bring a new Data Protection Law that is futuristic, comprehensive and Perfect. 

In our previous article, we discussed a new concept of “Privacy” in which I advocated that “Data is in the beholder’s eyes”, and hence that data in binary form which is accessed by an algorithm and processed without being released from the algorithm as “identified personal data” should not be considered as “accessing” personal data for the purpose of personal data protection.

I am aware that supervisory authorities under GDPR would consider that when identified data is accessed by an algorithm without consent, it amounts to “Automated Processing” and constitutes a breach of identifiable personal data.

What I am advocating is that this approach needs to be changed.

“Data” is “Data” only when it is in a form in which a human can understand it.

Identifiable Personal Data in binary form which is not accessible by a human being but is accessible only by an algorithm which ensures that even the admin of the algorithm cannot view it in identified form

and

subsequently released for human consumption only in anonymized form, should be considered as not having been “accessed” and hence not an infringement of Privacy.

In GDPR, this view is not supported. However, GDPR recognizes “Anonymisation” and considers “Anonymized personal data” as “Non Personal Data”. If viewing by an algorithm in identified form is considered a “Breach”, then every anonymization process is itself a processing of personal information, and hence must be considered an “Access” of identified personal data requiring consent.

I am in agreement with the views expressed in the article “Does anonymization or de-identification require consent?”.

According to this article, in Opinion 05/2014 of the Article 29 Working Party on Anonymisation Techniques, the Working Party stated:

“The Working Party considers that anonymisation as an instance of further processing of personal data can be considered to be compatible with the original purposes of the processing but only on condition the anonymisation process is such as to reliably produce anonymised information in the sense described in this paper.”

But if this view is correct then access of identified personal data by an automated processing algorithm is not an objectionable access. There is therefore an inherent conflict in GDPR.

This principle is extended in the concept which I am trying to advocate as Privacy 2.0, drawing the principle that whenever a process accesses identifiable personal data and returns anonymised personal data, such that even the algorithm administrator has no access to identified personal data, then the process is compatible with the view that there is no infringement of privacy. Such a process will require that after the algorithm removes all identifiers, the identifiers are irrevocably destroyed and are not associated with the output of the process.
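A minimal sketch of such a process, under the assumptions above (the field names are illustrative): identifiers are destroyed inside the algorithm, and only anonymized aggregates are ever released for human consumption.

```python
def identity_killing_pipeline(records: list) -> dict:
    """Consume identified records; return only an anonymized aggregate.
    No identifier is stored, logged or attached to the output, so even
    the administrator never sees the data in identified form."""
    city_counts: dict = {}
    for rec in records:
        rec.pop("name", None)   # identifiers irrevocably destroyed
        rec.pop("phone", None)  # inside the algorithm itself
        city_counts[rec["city"]] = city_counts.get(rec["city"], 0) + 1
    return city_counts          # anonymized output only

batch = [
    {"name": "Asha", "phone": "98xxxxxx10", "city": "Mysuru"},
    {"name": "Ravi", "phone": "99xxxxxx22", "city": "Mysuru"},
]
print(identity_killing_pipeline(batch))  # {'Mysuru': 2}
```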

This is an important  clarification I am advocating in the New Data Protection Act as part of the definition of “Privacy” and the definition of “Sharing”.

In the current versions of both GDPR and PDPB 2019, the law does not define “Privacy” and proceeds to speak of various measures to protect “Information Privacy”. It is felt that it is not fair on the data processing industry that it is required to protect a “Right to Privacy” which even experts in the Judiciary are not confident of defining. We therefore strongly feel that a definition of privacy is essential in this data protection law, though most of the law relates to “Information Privacy”.

The Privacy definition clause defines “Physical Privacy”, which is the old concept that the Supreme Court upheld in the Kharak Singh case. “Mental Privacy” is what Justice Chandrachud defined in the Puttaswamy judgement. The same Puttaswamy judgement also simplified the definition of Privacy into “Protection of the Right of Choice” as expressed, which leads to the “Consent” and “Lawful basis for processing” requirements. Further, the Puttaswamy judgement as well as GDPR mostly addressed issues of protecting “Information Privacy”, which is the protection of the data subject’s right of choice about the use of personal information when that information is in electronic form. Though stray articles/sections state that the principles of the law are also applicable to manual processing or file systems, the essence of the law is the protection of personal information in electronic form.

We have tried to remove these attempts of law makers to hide behind vague concepts of “Privacy” and catch the industry for non-compliance at the appropriate time, just like the traffic cop who prefers to hide behind a bend and catch violators rather than stand in the middle of the road and guide the traffic towards compliant driving.

In GDPR, we do differentiate automated processing leading to “Profiling” from “automated decision making” without human involvement, but both require a lawful basis. This view, extended to other data protection laws, has resulted in a general belief that access to identifiable personal data by any automated process requires consent or another lawful basis and will otherwise be considered a data breach.

We are challenging this interpretation and seeking  validation in the new data protection law.

(For abundant caution, I clarify that what is suggested here is for the forthcoming new law in India and does not alter the established position in GDPR compliance, where automated processing/decision making requires a consent/lawful basis.)

Naavi

P.S: These discussions are presently for debate and are a work in progress awaiting more inputs for further refinement. It is understood that the Government may already have a draft and may completely ignore all these recommendations. However, it is considered that these suggestions will assist in the development of “Jurisprudence” in the field of Data Governance in India and hence these discussions will continue until the Government releases its own version for further debate. Other professionals who are interested in participating in this exercise, and particularly Research and Academic organizations, are invited to participate. Since this exercise is too complex to institutionalize, it is being presented at this stage as only the thoughts of Naavi. Views expressed here may be considered as personal views of Naavi and not that of FDPPI or any other organization that Naavi may be associated with.

1. Introduction
2. Preamble
3. Regulators
4. Chapterization
5. Privacy Definition
6. Clarifications-Binary
7. Clarifications-Privacy
8. Definitions-Data
9. Definitions-Roles
10. Exemptions-Privacy
11. Advertising
12. Dropping of Central Regulatory authority
13. Regulation of Monetization of Data
14. Automated means ..

 


The Shape of Things to Come..The New Data Protection Act of India-6 (Clarifications-binary)

(Continued from the previous article)

P.S: This series of articles is an attempt to place some issues before the Government of India which promises to bring a new Data Protection Law that is futuristic, comprehensive and Perfect.

In our previous article, we discussed the definition of Privacy. One of the new concepts we tried to bring out is that “Sharing” should be recognized only when identified personal data is made accessible to a human being.

In other words, if personally identified data is visible to an algorithm and not to a human, it is not considered sharing of identified data if, after the processing of the personal data by the algorithm, the identity is killed within the algorithm and the output contains only anonymised information.

Typically such a situation arises when a CCTV captures a video. Obviously the video captures the face of a person and therefore captures critical personal data. However, if the algorithm does not have access to a database of faces against which the captured picture is compared and identified, the captured picture is only “Orphan Data” which has an “Identity parameter” but is not “Identifiable”. The output, say a report of how many people passed a particular point as captured by the camera, is devoid of identity and is therefore not personal information.

The algorithm may have an AI element where the captured data is compared to a database of known criminals; if there is a match, the data is escalated to a human being, whereas if there is no match, it is discarded. In such a case, the discarded information does not constitute personal data access, while only the smaller set of identified data passed on to human attention constitutes “Data Access” or “Data Sharing”.
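A minimal sketch of this escalation gate, with a hypothetical watch list and a trivial match check standing in for a real face-recognition model:

```python
WATCH_LIST = {"face-signature-17": "Known offender A"}  # hypothetical database

def handle_frame(face_signature: str) -> tuple:
    """Escalate to a human only on a match; otherwise discard, so the
    unmatched capture is never shared with a human in identified form."""
    if face_signature in WATCH_LIST:
        return ("ESCALATE", WATCH_LIST[face_signature])  # data sharing occurs
    return ("DISCARD", None)                             # no personal data access

print(handle_frame("face-signature-17"))  # ('ESCALATE', 'Known offender A')
print(handle_frame("face-signature-99"))  # ('DISCARD', None)
```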

Further, the definition provided yesterday used a strange-looking explanation of “Sharing”:

“..making the information available to another human being in such form that it can be experienced by the receiver through any of the senses of seeing, hearing, touching, smelling or tasting of a human..”

This goes with my proposition that “Data is in the beholder’s eyes” and “Data” is “Data” only when a human being is able to perceive it through his senses.

For example, let us see the adjoining document which represents a binary stream.

A normal human being cannot make any meaning out of this binary expression. If it is accessed by a human being therefore, it is “Un-identifiable” information.

A computing device may however be able to make a meaning out of this.

For example, if the device uses a binary-to-ASCII converter, it will read the binary stream as “Data is in the beholder’s eyes”. Alternatively, if the device uses a binary-to-decimal converter, it could be read as a huge number. If the AI decides to consider each set separated by a space as a separate readable binary stream, it will read this as a series of numbers.
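To illustrate, here is a short sketch (my own, using the sentence above as the sample text) in which the very same binary stream yields a sentence under one converter and a series of numbers under another:

```python
text = "Data is in the beholder's eyes"
stream = " ".join(f"{b:08b}" for b in text.encode("ascii"))

# Binary-to-ASCII converter: the stream reads as the sentence again.
as_text = "".join(chr(int(chunk, 2)) for chunk in stream.split(" "))

# Binary-to-decimal converter: the same stream reads as numbers.
as_numbers = [int(chunk, 2) for chunk in stream.split(" ")]

print(as_text)         # Data is in the beholder's eyes
print(as_numbers[:4])  # [68, 97, 116, 97]
```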

Similarly, if the binary stream were a name, the human cannot “experience” it as a name because he is not a binary reader. Hence the determination whether a binary stream is “simple data”, “a name” or “a number” is made by the human being to whom it becomes visible. In this context, we are treating the sentence in English, or the number in decimal form, as what becomes “visible”. If the reader is illiterate, even the converted information may be considered “not identifiable”. At the same time, if the person receiving the information is a binary expert who can visualize binary values, he is a computer in himself and may consider the information “identifiable”.

It is for these reasons that in Naavi’s Theory of Data, one of the hypotheses is that “Data is in the beholder’s eyes”.

The “Experience” in this case is “Readability” through the sensory perception of sight. A similar “Experience” can be recognized if the data can be converted into sound through an appropriate processing and output device. Theoretically it can also be converted into a sense of touch, smell or taste if there are appropriate devices to convert it into such forms.

If there is a “Neuro input device” associated, then the binary stream can be directly input into the human brain by a thought and it can be perceived as either a sentence or number as the person decides.

These thoughts have been incorporated in the definition of “Privacy” and “Sharing” which was briefly put out in the previous article.

The thought is definitely beyond the “GDPR limits” and requires some deep thinking before the scope of the definition can be understood.

In summary, the thought process is

If an AI algorithm can be designed so that identifiable data is processed in such a manner that the identity is killed within the algorithm, then there is no privacy concern. In fact, a normal “anonymizing” algorithm is one such device, which takes in identifiable information and puts out anonymous information. In this school of thought, such processing does not require consent and does not constitute viewing of identifiable data even by the owner of the algorithm (as long as there is no admin override).

I request all of you to read this article and the previous article once again and send me a feedback.

P.S: These discussions are presently for debate and are a work in progress awaiting more inputs for further refinement. It is understood that the Government may already have a draft and may completely ignore all these recommendations. However, it is considered that these suggestions will assist in the development of “Jurisprudence” in the field of Data Governance in India and hence these discussions will continue until the Government releases its own version for further debate. Other professionals who are interested in participating in this exercise, and particularly Research and Academic organizations, are invited to participate. Since this exercise is too complex to institutionalize, it is being presented at this stage as only the thoughts of Naavi. Views expressed here may be considered as personal views of Naavi and not that of FDPPI or any other organization that Naavi may be associated with.

Next article

Naavi

I am adding a reply to one of the comments received on LinkedIn:

Question:

Consider the situation of google processing your personal data from cookies or server and providing you specific ad. Google claims this automatic processing and output is anonymous.
So your suggestion to allow this?

Answer

It is a good question. It may require a long answer.
In such cases we first need to check through a DPIA what is the harm caused to the individual and arrive at a decision.
In the example cited, there are three levels of processing.
At the first level, there is collection of personal information. If the cookies are persistent cookies linked to a known customer, this could be personal data and consent is required. If the entire cookie data collected is anonymous and the collector is not reasonably capable of identifying the individual with other data on hand, it is processing of non personal data.
At the second level, profiling occurs and the users are categorised into different market segments, possibly without individual identity.
For example, if we say category A users would be interested in buying a computer, this analysis does not cause harm to the user. Usually this is done by an intermediary market research company. This company need not know the identity of the user, and hence it only processes anonymised personal data, which is outside the privacy protection regime.
At the third level, the advertisement is served. If the ad server is aware of the identity of the recipient and targets the ads, then it is an activity which could cause harm to privacy.
Let us also take the example of TV ads or hoardings on the street. They are not specifically targeted ads and hence don’t infringe privacy.
Similarly, if there are ads on the web space which are not targeted, it would be difficult to call them an infringement. If the ads are targeted by identity, without doubt it would be an infringement.
What you are indicating is a case which falls in between the above two extreme cases of targeted ads to identified individuals and generic ad serving just like the hoarding on the street which is open to everybody.
The privacy impact is determined more by the platform where the advertisement is placed. If it is a private space like your email inbox, you may say that there was an infringement. But if it is on a website which you go and visit, the ads may be treated like hoardings and not infringing.
Hence the platform on which the ads are served may determine whether there is harm or not.
What I have suggested would basically apply to intermediaries who only process data without any idea of the data subject and give out the results to another recipient. This is what an “Algorithm” would do.
But if Google is able to identify who has provided the data and who is getting the ads, they may not have the status of an “Intermediary” and there could be infringement of privacy.
Hence we may have to take a view based on the context. 

1. Introduction
2. Preamble
3. Regulators
4. Chapterization
5. Privacy Definition
6. Clarifications-Binary
7. Clarifications-Privacy
8. Definitions-Data
9. Definitions-Roles
10. Exemptions-Privacy
11. Advertising
12. Dropping of Central Regulatory authority
13. Regulation of Monetization of Data
14. Automated means ..

 


75th Independence Day of India celebrated at every home
