The Shape of Things to Come..The New Data Protection Act of India-8 (Definitions-Data)

(Continued from the previous article)

P.S: This series of articles is an attempt to place some issues before the Government of India which promises to bring a new Data Protection Law that is futuristic, comprehensive and Perfect. 

We have discussed the definition of “Privacy” in some detail in the last few articles. In particular, the suggestion that “Sharing” of identifiable data for processing within an algorithm in such a manner that the identified data is not exposed to a human being has evoked a long debate and I have tried to provide clarifications as required.

One residual query was in respect of what would happen if the anonymised processed data is shared as "Non Personal Data" and the recipient later de-anonymizes it. When data which was previously identifiable personal data is anonymized by one processor and then released as "Non Personal Data", the processor is expected to use an acceptable standard of anonymization, which becomes the "Due Diligence" or "Reasonable Security Practice" on his part. When this data is voluntarily shared by Processor A with a recipient B, Recipient B is not expected to de-anonymize it. If Recipient B does de-anonymize the data, Recipient B would be guilty of a criminal offence (we shall discuss this later under penalties). At that time, Processor A would have to show that it had followed "Due Diligence" and claim protection as an intermediary under Section 79.

I reiterate that restricting the definition of "Sharing", for the purpose of defining "Privacy", to disclosure to a human being and not to an algorithm is a concept different from GDPR jurisprudence. It is nevertheless suggested because we feel that there is an opportunity for India to set new standards by designing a Data Protection Act which can be better than GDPR.

In order to recall all the discussions we had in this regard, we reproduce the definition of Privacy as suggested by us to be included in the NDPAI.

Privacy

Privacy is a fundamental right under the Constitution of India, an independent right under the Right to life and liberty, that guarantees an individual a right that shall not be infringed except under due process of law as defined in this Act, and includes the following.

(a) “Physical Privacy” means the choice of an individual to determine to what extent the individual may choose to share his physical space with others.

(b) “Mental Privacy” means the choice of an individual to determine to what extent the individual may choose to share his mind space with others.

(c) “Neuro Privacy” means the choice of an individual to determine to what extent the individual may share his neuro space with others.

(d) “Information Privacy” means the choice of an individual to determine to what extent the individual may share data about the individual with others.

Explanation:

“Sharing” in the context above means “making the information available to another human being in such form that it can be experienced by the receiver through any of the senses of seeing, hearing, touching, smelling or tasting of a human in such a manner that the identity  of the individual to whom the data belongs may become recognizable to the receiver with ordinary efforts”.

We also reiterate that it is the responsibility of the Government to define "Privacy" before placing a responsibility on the industry to protect the Right to Privacy.


Definition of Data

Having defined that "Privacy" is related to the protection of personal data, we now need to define:

i) Data, Computer, Processing

ii) Personal Data

iii) Non Personal Data

iv) Sensitive Personal Data

v) Harm to individuals

vi) Harm to Entities

vii) Critical Personal Data

viii) Sensitive non personal data

ix) Critical Non Personal Data

x) Significant Harm

xi) Joint  Data

xii) Corporate Data

xiii) Business Data

xiv) Minor Data

xv) Personal Data of non citizens

xvi) De-identified Personal Data

xvii) Pseudonymized Personal Data

xviii) Anonymized Personal Data

xix) Encrypted Data

Let us first place before the audience the proposed definitions of these nineteen different kinds of data and later debate whether all these categories of data need to be defined.

We have consciously decided to pursue the development of a "Privacy Code" that is a combination of the present-day PDPB 2019 and ITA 2000/8 and the proposed Non Personal Data Governance Act (NPDGA).

Though in the definition of Privacy we have included instruments of personal information contained in  “Oral” and “Paper” form, most of our discussions will revolve around “Data” which is the electronic form of storage of information.

When PDPB 2019 was contemplated, we already had ITA 2000, which had defined Data, Personal Data and Sensitive Personal Data and had given legal recognition to data. Hence PDPB 2019 adopted the same definition of data and only made some changes to the definition of sensitive personal information. Presently there is an opportunity to find an improved definition, and hence we are proceeding to suggest definitions which may be slightly at variance with the current ITA 2000/8.

i) Data, Computer, Processing

“Data” means information which is expressed or is capable of being expressed in a binary language and includes data in raw form where the binary elements are distributed in a chaotic state and data which is organized into bytes and sequence of bytes.

Correspondingly, “Computer” will be defined as any device that can generate, process, store or transmit data, or delete data by destroying the organized form of binary distribution back to a chaotic form, and includes all hardware devices and applications which provide the functionality of generating an organized set of binary expressions, processing them, storing them, transmitting them or handling them in any other form.

Further, “Processing” will be defined as any alteration of a binary sequence of data elements and includes data aggregation, data modification, data deletion, data disclosure, data publishing etc.

ii) Personal data

Personal data means any data that can with reasonable assurance be associated by the receiver with an identifiable living natural person and includes a combination of different elements of personal data which in combination create a reasonably assured identity, though the different elements might have been acquired from different sources and at different points of time.
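The "combination of elements" clause can be illustrated with a small sketch in Python. All records, field names and values below are hypothetical: the point is only that fragments which are individually ambiguous can, when merged, single out one person with reasonable assurance.

```python
# Illustrative sketch (hypothetical data): elements that are harmless in
# isolation can, in combination, identify one individual -- which is why the
# suggested definition treats such combinations as personal data.

# Three fragments acquired from different sources at different times.
from_source_a = {"pincode": "560001", "gender": "M"}
from_source_b = {"pincode": "560001", "birth_year": 1975}
from_source_c = {"gender": "M", "birth_year": 1975, "employer": "Acme Ltd"}

# A hypothetical population register, used only to count matching persons.
population = [
    {"name": "Person 1", "pincode": "560001", "gender": "M",
     "birth_year": 1975, "employer": "Acme Ltd"},
    {"name": "Person 2", "pincode": "560001", "gender": "M",
     "birth_year": 1982, "employer": "Beta Inc"},
    {"name": "Person 3", "pincode": "560034", "gender": "F",
     "birth_year": 1975, "employer": "Acme Ltd"},
]

def matches(profile: dict) -> list:
    """Return every person in the register consistent with the profile."""
    return [p for p in population
            if all(p.get(k) == v for k, v in profile.items())]

# Each fragment alone is ambiguous; merged, the profile is unique.
merged = {**from_source_a, **from_source_b, **from_source_c}
print(len(matches(from_source_a)))  # 2 -- ambiguous on its own
print(len(matches(merged)))         # 1 -- singles out one person
```

Under the suggested definition, the merged profile would be personal data even though no single source handed over a name.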

iii) Non Personal Data

Any data which is not "Personal Data" is "Non Personal Data" and includes raw data in a chaotic binary distribution, Corporate Data, business transaction data, environmental data etc., which do not contain an association with the identity of any specific living natural person.

iv) Sensitive Personal Data

Personal data which may reasonably cause a significant harm to the individual in the hands of an unauthorized person is classified as "Sensitive Personal Data" and includes

a) Credentials for accessing restricted data

b) Health data

c) Financial data

d) Sex related data

e) Biometric data

f) Genetic data

Associated with the definition of Sensitive Personal Data would be the definitions of "Harm" and "Significant Harm".

v) Harm to Individuals

“Harm” means any wrongful and adverse impact on the body, mind or property of an individual and includes 

a) Physical or Mental injury

b) Loss, distortion or theft of identity 

c) financial loss or loss of property

d) Loss of reputation or humiliation 

e) Loss of Employment or source of income 

f)  Threat to life and property including causing harassment or subjecting to extortion

g) Causing discriminatory treatment in the society.

h) Psychological or Neurological manipulation which alters the ability of an individual to take autonomous decisions

vi) Harm To Entities

“Harm” in the context of entities means any wrongful and adverse impact on the entity in terms of its property, reputation, business continuity, impairment or cost escalation.

vii) Critical Personal Data

Critical Personal Data means such personal data, the deprivation, incapacitation or destruction of which would cause significant harm to an individual, and includes biometric data, genetic data or unique official identifiers, and personal data under the control of such entities or computer resources whose activities, if incapacitated or impaired, may have a debilitating impact on national security, economy, public health or safety.

viii) Sensitive Non Personal Data 

Sensitive Non Personal Data means such non personal data, the deprivation, modification, deletion or wrongful sharing of which may reasonably cause a significant harm to any organization, including

a) Loss of Business

b) Loss of Money or Property

c) Loss of Reputation

d) Disruption of Business Continuity

e) Unreasonable increase in cost of operation

ix) Critical Non Personal Data

Critical Non Personal Data means such non personal data, the deprivation, incapacitation or destruction of which would cause significant harm to an entity, and includes non personal data under the control of such entities or computer resources whose activities, if incapacitated or impaired, may have a debilitating impact on national security, economy, public health or safety.

x) Significant Harm

Significant Harm means such harm caused to an individual or any other entity, which is irreversible or is reasonably difficult to correct once caused.

xi) Joint Data

Joint Data, whether personal or non personal, means such data that is generated during a transaction involving more than one individual or entity.

xii) Corporate Data

Corporate Data means data that can with reasonable assurance be associated with an identifiable non-living legal entity, including Government agencies, partnership firms, proprietary concerns, associations of individuals and not-for-profit entities, and further includes a combination of different elements of data which in combination create a reasonably assured identity, though the different elements might have been acquired from different sources and at different points of time.

xiii) Business Data

Business Data means any data related to a business or Governance transaction whether inclusive of elements of personal data or corporate data or not.

xiv) Minor Data

Minor Data means any personal data associated with an individual who is of age less than 18 years.

xv) Personal Data of Non Citizens

Personal Data of Non Citizens means any personal data of an individual who is not a Citizen of India as per the Citizenship Act of India.

xvi) De-Identified Personal Data

De-Identified Personal Data means such personal data from which all parameters of identity that may with reasonable assurance determine the association of the data with a living natural individual are removed and made inaccessible to the person to whom the data is disclosed.

xvii) Pseudonymized Personal Data

Pseudonymized  personal Data means such personal data in which all parameters of identity that may with reasonable assurance determine the association of the data with a living natural individual are replaced with comparable but randomly altered data elements and made inaccessible to the person to whom the data is disclosed. 

xviii) Anonymized Personal Data

Anonymized personal Data means such personal data from which all parameters of identity that may with reasonable assurance determine the association of the data with a living natural individual are removed and irrevocably destroyed so that the identity of the individual is rendered indeterminate to any person who is in possession of the residual data including the entity or person who caused the anonymization.
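The operational difference between definitions xvi, xvii and xviii can be sketched in a few lines of Python. This is a minimal illustration of the three proposed definitions, not an implementation of the law: the record, field names and token format are assumptions of the sketch.

```python
import secrets

# Hypothetical record; "name" stands in for all identifying parameters.
record = {"name": "Asha Rao", "city": "Mumbai", "diagnosis": "diabetes"}
IDENTIFIERS = ("name",)

def de_identify(rec):
    # Identifiers removed from the shared copy, but a key table is retained,
    # so the holder of the table can re-associate the data (definition xvi).
    key_table = {k: rec[k] for k in IDENTIFIERS}
    shared = {k: v for k, v in rec.items() if k not in IDENTIFIERS}
    return shared, key_table

def pseudonymize(rec):
    # Identifiers replaced with comparable but random stand-ins; the mapping
    # allows reversal by whoever holds it (definition xvii).
    shared, mapping = dict(rec), {}
    for k in IDENTIFIERS:
        token = "P-" + secrets.token_hex(4)
        mapping[token] = rec[k]
        shared[k] = token
    return shared, mapping

def anonymize(rec):
    # Identifiers removed and no mapping retained: nothing is returned that
    # could restore the association, not even to the caller (definition xviii).
    return {k: v for k, v in rec.items() if k not in IDENTIFIERS}

shared, key_table = de_identify(record)   # reversible via key_table
pseudo, mapping = pseudonymize(record)    # reversible via mapping
anonymous = anonymize(record)             # irreversible by construction
print(anonymous)  # {'city': 'Mumbai', 'diagnosis': 'diabetes'}
```

The three functions differ only in what survives the call: a key table, a pseudonym mapping, or nothing at all, which mirrors the escalating strength of the three definitions.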

xix) Encrypted Data

Encrypted Data means such data that has been converted into a different form and thereby rendered unusable and unreadable by unauthorized persons.
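As a toy illustration of the property this definition describes (unreadable without the key, fully recoverable with it), here is a one-time XOR pad in Python. This is deliberately simplistic, is not production cryptography, and is not part of the proposed definition.

```python
import secrets

# Toy illustration only: a one-time XOR pad, NOT a recommended cipher.
plaintext = b"Data is in the beholder's eyes"
key = secrets.token_bytes(len(plaintext))  # held by authorized persons only

ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))   # unreadable form
recovered  = bytes(c ^ k for c, k in zip(ciphertext, key))  # usable again

assert recovered == plaintext   # the conversion is reversible with the key
```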

The above definitions have been provided with some specific reasons that would be clearer as we go ahead and advocate the provisions of the Act.

However, definitions are very critical to the designing of the laws and hence I invite intense debate on the above definitions.

P.S: These discussions are presently for debate and are a work in progress awaiting more inputs for further refinement. It is understood that the Government may already have a draft and may completely ignore all these recommendations. However, it is considered that these suggestions will assist in the development of "Jurisprudence" in the field of Data Governance in India, and hence these discussions will continue until the Government releases its own version for further debate. Other professionals who are interested in participating in this exercise, and particularly Research and Academic organizations, are invited to participate. Since this exercise is too complex to institutionalize, it is being presented at this stage as only the thoughts of Naavi. Views expressed here may be considered as personal views of Naavi and not of FDPPI or any other organization that Naavi may be associated with.

Naavi

(This is not an exhaustive list of definitions. More will follow)


The Shape of Things to Come..The New Data Protection Act of India-7 (Clarifications-Privacy)

(Continued from the previous article)

In our previous article, we discussed a new concept of "Privacy" where I advocated that "Data is in the beholder's eyes", and hence data in binary form which is accessed by an algorithm and processed without being released out of the algorithm as "Identified personal data" should not be considered as "Accessing" personal data for the purpose of personal data protection.

I am aware that supervisory authorities under GDPR would consider that when identified data is accessed by an algorithm without consent, it amounts to "Automatic Processing" and is considered a breach of identifiable personal data.

What I am advocating is that this approach needs to be changed.

“Data” is “Data” only when it is in a form in which a human can understand it.

Identifiable Personal Data in binary form which is not accessible by a human being but is accessible only by an algorithm which ensures that even the admin of the algorithm cannot view it in identified form

and

subsequently released for human consumption only in anonymized form should be considered as not having been “accessed” and hence not considered as an infringement of Privacy.

In GDPR, this view is not supported. However, GDPR recognizes "Anonymisation" and considers "Anonymized personal data" as "Non Personal Data". If viewing by an algorithm in identified form is considered a "Breach", then all anonymization processes are actually processing of personal information and hence must be considered an "Access" of identified personal data requiring consent.

I am in agreement with the views expressed in the article "Does anonymization or de-identification require consent?".

According to this article, in Opinion 05/2014 of the Article 29 Working Party on Anonymisation Techniques, the Working Party stated:

“The Working Party considers that anonymisation as an instance of further processing of personal data can be considered to be compatible with the original purposes of the processing but only on condition the anonymisation process is such as to reliably produce anonymised information in the sense described in this paper.”

But if this view is correct then access of identified personal data by an automated processing algorithm is not an objectionable access. There is therefore an inherent conflict in GDPR.

This principle is extended in the concept which I am trying to advocate as Privacy 2.0, drawing the principle that whenever any process accesses identifiable personal data and returns anonymised personal data, where even the algorithm administrator has no access to the identified personal data, the process is compatible with the view that there is no infringement of privacy. Such a process will require that after the algorithm removes all identifiers, the identifiers are irrevocably destroyed and are not associated with the output of the process.

This is an important  clarification I am advocating in the New Data Protection Act as part of the definition of “Privacy” and the definition of “Sharing”.

In the current versions of both GDPR and PDPB 2019, the law does not define "Privacy" and proceeds to speak of various measures to protect "Information Privacy". It is felt that it is not fair on the data processing industry to be required to protect a "Right of Privacy" which even experts in the Judiciary are not confident of defining. We therefore strongly feel that a definition of privacy is essential in this data protection law, though most of the law is related to "Information Privacy".

The Privacy definition clause defines "Physical Privacy", which is the old concept which the Supreme Court upheld in the Kharak Singh Case. "Mental Privacy" is what Justice Chandrachud defined in the Puttaswamy judgement. The same Puttaswamy judgement also simplified the definition of Privacy into "Protection of Right of Choice" as expressed, which leads to the "Consent" and "Lawful basis for processing" requirements. Further, the Puttaswamy judgement as well as GDPR mostly addressed issues of protecting "Information Privacy", which is protection of the data subject's Right of Choice about the use of personal information when the personal information is in electronic form. Though in stray articles/sections it is stated that the principles of the law are also applicable to manual processing or file systems, the essence of the law is protection of personal information in electronic form.

We have tried to remove these attempts of lawmakers to hide behind vague concepts of "Privacy" and catch the industry for non-compliance at a convenient time, just like the traffic cop who prefers to hide behind a bend and catch violators rather than stand in the middle of the road and guide the traffic towards compliant driving.

In GDPR we do differentiate between automated processing leading to "Profiling" and "automated decision making" without human involvement, but both require a lawful basis. This view, extended to other data protection laws, has resulted in a general belief that access of identifiable personal data by any automated process requires consent/a lawful basis and will otherwise be considered a data breach.

We are challenging this interpretation and seeking  validation in the new data protection law.

(For abundant caution, I clarify that what is suggested is for the forthcoming new law in India and does not alter the earlier view that, for GDPR compliance, automated processing/decision making requires a consent/lawful basis.)

Naavi


The Shape of Things to Come..The New Data Protection Act of India-6 (Clarifications-binary)

(Continued from the previous article)

In our previous article, we discussed the definition of Privacy. One of the new concepts we tried to bring out is that “Sharing” should be recognized only when identified personal data is made accessible to a human being.

In other words, if personally identified data is visible to an algorithm and not a human, it is not considered as sharing of identified data if after the processing of personal data by the algorithm, the identity is killed within the algorithm and the output contains only anonymised information.

Typically such a situation arises when a CCTV captures a video. Obviously the video captures the face of a person and therefore captures a critical personal data. However, if the algorithm does not have access to a database of faces against which the captured picture can be compared and identified, the captured picture is only "Orphan Data" which has an "Identity parameter" but is not "Identifiable". The output, let us say a report of how many people passed a particular point as captured by the camera, is devoid of the identity and is therefore not personal information.

The algorithm may have an AI element where the captured data is compared to a database of known criminals; if there is any match, the data is escalated to a human being, whereas if there is no match, it is discarded. In such a case the discarded information does not constitute personal data access, while only the smaller set of identified data passed on to human attention constitutes "Data Access" or "Data Sharing".
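The match-or-discard logic described above can be sketched as follows. The face hashes and watch-list are hypothetical placeholders for a real face-recognition pipeline; the sketch only shows where, on this school of thought, "sharing" actually occurs.

```python
# Sketch of the escalation logic (all data hypothetical): frames matching a
# watch-list are escalated to a human; everything else contributes only to an
# anonymous count and is then discarded.

watch_list = {"face-hash-113", "face-hash-207"}   # known-criminal database

captured_frames = ["face-hash-001", "face-hash-113", "face-hash-002"]

escalated_to_human = []   # the only data a human ever sees in identified form
people_counted = 0        # anonymous aggregate output of the camera

for face in captured_frames:
    people_counted += 1
    if face in watch_list:
        escalated_to_human.append(face)   # "Data Sharing" occurs only here
    # non-matching frames are dropped: orphan data, never identified

print(people_counted)        # 3
print(escalated_to_human)    # ['face-hash-113']
```

The aggregate count leaves the algorithm with no identity attached; only the watch-list match is ever exposed to a human being.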

Further, the definition provided yesterday used some strange looking explanation of “Sharing” as

“..making the information available to another human being in such form that it can be experienced by the receiver through any of the senses of seeing, hearing, touching, smelling or tasting of a human..”

This goes with my proposition that “Data is in the beholder’s eyes” and “Data” is “Data” only when a human being is able to perceive it through his senses.

For example, let us see the adjoining document which represents a binary stream.

A normal human being cannot make any meaning out of this binary expression. If it is accessed by a human being therefore, it is “Un-identifiable” information.

A computing device may however be able to make a meaning out of this.

For example, if the device uses a binary to ascii converter, it will read the binary stream as ” Data is in the beholder’s eyes”. Alternatively, if the device uses a binary to decimal converter, it could be read as a huge number. If the AI decides to consider each set separated by a space as a separate readable binary stream, it will read this as a series of numbers.
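The three readings of the same stream can be reproduced in a few lines of Python. Since the original image of the binary stream is not available here, the stream is reconstructed from the quoted sentence for illustration.

```python
# The same binary stream yields different meanings depending on the converter
# applied -- "Data is in the beholder's eyes". The stream is built here from
# the sentence the article quotes.

sentence = "Data is in the beholder's eyes"
stream = " ".join(format(b, "08b") for b in sentence.encode("ascii"))

# Reading 1: binary -> ASCII text
as_text = "".join(chr(int(group, 2)) for group in stream.split())

# Reading 2: each 8-bit group as a separate decimal number
as_numbers = [int(group, 2) for group in stream.split()]

# Reading 3: the whole stream as one huge integer
as_one_number = int(stream.replace(" ", ""), 2)

print(as_text)          # Data is in the beholder's eyes
print(as_numbers[:4])   # [68, 97, 116, 97]
```

Nothing in the bits themselves dictates which reading is "the" data; that choice belongs to the receiver.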

Similarly, if the binary stream were a name, the human cannot "Experience" it as a name because he is not a binary reader. Hence whether a binary stream is "Simple data", "a Name" or a "Number" is determined by the human being to whom it becomes visible. In this context we are treating the sentence in English, or the number in decimal form, as the "visible" form. If the reader is illiterate, even the converted information may be considered "Not identifiable". At the same time, if the person receiving the information is a binary expert who can visualize the binary values, he may be a computer in himself and consider the information "Identifiable".

It is for these reasons that in Naavi's Theory of Data, one of the hypotheses is that "Data is in the beholder's eyes".

The "Experience" in this case is "Readability" through the sensory perception of "Sight". A similar "Experience" can be recognized if the data can be converted into "Sound" through an appropriate processing and output device. Theoretically it can also be converted into a sense of touch, smell or taste if there are appropriate devices to convert it into such forms.

If there is a “Neuro input device” associated, then the binary stream can be directly input into the human brain by a thought and it can be perceived as either a sentence or number as the person decides.

These thoughts have been incorporated in the definition of “Privacy” and “Sharing” which was briefly put out in the previous article.

The thought is definitely beyond the “GDPR limits” and requires some deep thinking before the scope of the definition can be understood.

In summary, the thought process is

If an AI algorithm can be designed so that identifiable data is processed in such a manner that identity is killed within the algorithm, then there is no privacy concern. In fact a normal "Anonymizing" algorithm is one such device, which takes in identifiable information and puts out anonymous information. In this school of thought, such processing does not require consent and does not constitute viewing of identifiable data even by the owner of the algorithm (as long as there is no admin override).
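A minimal sketch of such an identity-killing process, assuming a hypothetical record layout: identified records enter the function, only anonymous aggregates leave it, and no identifier or mapping is retained anywhere.

```python
from collections import Counter

# Sketch of "identity is killed within the algorithm": identified personal
# data goes in, only anonymous aggregate output comes out, and no identifier
# survives the function -- there is no admin override or retained mapping.

def anonymous_summary(identified_records):
    """Consume identified personal data; emit only anonymous aggregates."""
    city_counts = Counter()
    for rec in identified_records:
        city_counts[rec["city"]] += 1
        # 'name' is read only inside this loop and is never copied out;
        # once the function returns, the identifiers are unreachable here.
    return dict(city_counts)

records = [
    {"name": "A. Kumar", "city": "Delhi"},
    {"name": "B. Mehta", "city": "Delhi"},
    {"name": "C. Iyer",  "city": "Chennai"},
]

print(anonymous_summary(records))  # {'Delhi': 2, 'Chennai': 1}
```

Whether such a design satisfies a legal standard is precisely the question this series puts up for debate; the code only shows that the design is technically straightforward.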

I request all of you to read this article and the previous article once again and send me a feedback.

Naavi

I am adding a reply to one of the comments received on Linked In:

Question:

Consider the situation of Google processing your personal data from cookies or a server and providing you a specific ad. Google claims this automatic processing and its output are anonymous.
So is your suggestion to allow this?

Answer

It is a good question. It may require a long answer.
In such cases we first need to check through a DPIA what is the harm caused to the individual and arrive at a decision.
In the example cited,  there are three levels of processing. 
At the first level there is collection of personal information. If the cookies are persistent cookies and linked to a known customer, it could be personal data and consent is required. If the entire cookie data collected is anonymous and the collector is not reasonably capable of identifying the individual with other data on hand, it is processing of non personal data.
At the second level, profiling occurs and the users are categorised into different market segments, possibly without individual identity.
For example, if we say category A user’s would be interested in buying a computer, this analysis is not causing harm to the user. Usually this is done by an intermediary market research company. This company need not know the identity of the user and hence it only processes anonymised personal data which is outside the privacy protection regime.
At the third level the advertisement is served. If the ad server is aware of the identity of the recipient and targets the ads, then it is an activity which could cause harm to privacy.
Let us also take the example of TV ads or hoardings on the street. They are not specifically targeted ads and hence don’t infringe privacy.
Similarly, if there are ads on the web space which are not targeted, it would be difficult to call it an infringement. If the ads are targeted by identity, without doubt it would be an infringement.
What you are indicating is a case which falls in between the above two extreme cases of targeted ads to identified individuals and generic ad serving just like the hoarding on the street which is open to everybody.
The privacy impact is determined more by the platform where the advertisement is placed. If it is a private space like your email inbox, you may say that there was an infringement. But if it is on a website which you go and visit, the ads may be treated like hoardings and not infringing.
Hence the platform on which the ads are served may determine whether there is harm or not.
What I have suggested would basically apply to intermediaries who only process data without any idea of the data subject and give out the results to another recipient. This is what an "Algorithm" would do.
But if Google is able to identify who has provided the data and who is getting the ads, they may not have the status of an “Intermediary” and there could be infringement of privacy.
Hence we may have to take a view based on the context. 


The Shape of Things to Come..The New Data Protection Act of India-5 (Privacy Definition)

(Continued from the previous article)

In the earlier articles in this series, we have discussed the requirements of the New Data Protection Act regarding the basic objectives, the regulatory structure and the chapterization, all of which give a framework of the desired legislation.

In this article we shall discuss some definitional aspects.

We are presently discussing the possibility of one Mega Act which will replace both ITA 2000 and PDPB 2019, though the Government may ultimately choose to keep the two laws separate. We shall go ahead with the concept of the "Unified Act" for the time being and, if necessary, it can be bifurcated later on the basis of the different chapters we may create.

The first important definition to be addressed is the “Definition of Privacy” which needs to be protected.

The second but most critical definition of the Act is the definition of “Data” since it is central to all our discussions. The definition has to be further expanded to “Sensitive personal data”, “Critical personal data”, “Neuro data”, “Non Personal-Corporate Data”, “Non Personal Sovereign Data”, “Non Personal Community data”, “Shared Personal Data” etc.

Definition of Privacy

The first definition of Privacy is the one which is required for protection of what Supreme Court has declared as the “Fundamental Right” under Article 21 of the Constitution.

We presently have some understanding of what kind of privacy is protected by data protection laws such as GDPR, which is "Information Privacy". The current definition of "Information Privacy" as popularly used is "Privacy 1.0", whereas a need has come to look at two further levels of definition, which can be called "Privacy 2.0" and "Privacy 3.0". We may or may not use this software-type versioning of 1.0, 2.0 and 3.0 and may have to find other names that can be used in the Act. But let us first try to understand the differentiation that can be brought between these three types of Privacy.

Privacy 1.0 means the fundamental right guaranteed under the Indian Constitution under Article 21 as part of the "Right to Life". We had earlier discussed this subject in our article "The Privacy Judgement… Conclusion.. Need for Definition of Privacy". We know that the Puttaswamy judgement did not include a definition of "Privacy" in its final order, though it was discussed by the judges in their individual obiter dicta.

Privacy can be discussed as “Physical Privacy”, “Mental Privacy”, “Neuro Privacy” and “Information/Data Privacy”.

The requirement of the NDPAI can be served by defining “Privacy” as “Information Privacy” only, and then proceeding to discuss how “Autonomy and Freedom of Choice” can be imparted to an individual in directing others about how his personal information may be collected, processed and disposed of.

We must appreciate that the “Right of Privacy” is the “Right of Choice” of an individual to determine how he prefers to share his personal data with others. The difficulty, however, lies in capturing this “Right of Choice”, managing the changes in a person’s “Choice” over time, and managing the differences in the “Choices” of one individual and another.

Let us therefore frame the first definition of Privacy as follows:

Privacy:

“Privacy is a fundamental right under the Constitution of India, an independent right under the Right to Life and Liberty, that guarantees an individual a right that shall not be infringed except under due process of law as defined in this Act, and includes the following:

(a) “Physical Privacy” means the choice of an individual to determine to what extent the individual may choose to share his physical space with others.

(b) “Mental Privacy” means the choice of an individual to determine to what extent the individual may choose to share his mind space with others.

(c) “Neuro Privacy” means the choice of an individual to determine to what extent the individual may share his neuro space with others.

(d) “Information Privacy” means the choice of an individual to determine to what extent the individual may share data about the individual with others.

Explanation:

“Sharing” in the context above means “making the information available to another human being in such form that it can be experienced by the receiver through any of the senses of seeing, hearing, touching, smelling or tasting, in such a manner that the identity of the individual to whom the data belongs may become recognizable to the receiver with ordinary efforts”.

P.S: In the above definition, infringement of privacy is recognized only when the personal data becomes accessible to another human being. If the personal data is accessible only to a device and not to any human being, the data is not considered “Shared”. When data is processed by an algorithm without being accessed by any human being, and no human can access the identified personal data with any reasonable effort (similar to anonymisation), it is not considered an “infringement”.

This definition, which recognizes only visibility to humans as infringement, is the concept of Privacy 2.0. The inclusion of neuro privacy is the concept of Privacy 3.0. Both of these are included in the above definition. Privacy 1.0 is the current definition used in GDPR, where visibility of personal data to a device is also considered a potential data disclosure.

We shall discuss the definition of “Data” in the following article. In the meantime, I invite comments on the above.

Naavi

P.S: These discussions are presently intended for debate and are a work in progress awaiting more inputs for further refinement. It is understood that the Government may already have a draft and may completely ignore all these recommendations. However, it is considered that these suggestions will assist in the development of “Jurisprudence” in the field of Data Governance in India, and hence these discussions will continue until the Government releases its own version for further debate. Other professionals interested in participating in this exercise, and particularly Research and Academic organizations, are invited to participate. Since this exercise is too complex to institutionalize, it is being presented at this stage only as the thoughts of Naavi. Views expressed here may be considered the personal views of Naavi and not those of FDPPI or any other organization that Naavi may be associated with.
