

Building a Responsible Cyber Society…Since 1998

Theory of Dynamic Personal Data

Posted by Vijayashankar Na on March 31, 2018
Posted in Cyber Law | 1 Comment

“Personal Data” is the object of data protection regulations such as the upcoming Data Protection Act of India and DISHA 2018, as well as other laws such as GDPR and ITA 2008. “Protecting Personal Data” is considered “Information Privacy” by the Indian judiciary, which declared “Privacy” a fundamental Right. In all the data protection regulations, data is classified as “Personal Data” and “Sensitive Personal Data” (or special category personal data), and different responsibilities are prescribed for Data Controllers and Data Processors.

The current global controversy over Facebook being responsible for its customers’ profiles being used to influence the US elections is an interesting case study for examining the efficacy of current data protection laws: it shows where the laws have failed to capture the real nature of data and are therefore failing in implementation. If laws are failing in the current scenario, they will fail more often as we enter the emerging era of Big Data Analytics, Artificial Intelligence and Quantum Computing.

While ITA 2008 and GDPR are already frozen, India has two data protection regulations in the pipeline, namely DISHA 2018 (Digital Information Security in Healthcare Act) and the Data Protection Act of India being drafted by the Justice Srikrishna Committee. It is therefore a great opportunity for Indian legislators to incorporate certain new provisions of data protection that other legislation, including GDPR, might have missed. Naavi has already provided some inputs on the proposed laws in earlier articles.

Theory of Dynamic Personal Data

This article will however introduce a new “Theory of Dynamic Personal Data” which if recognized and brought into our regulations may resolve some of the anomalies which we are presently facing.

The basic concept of this theory is that “Personal Data” is dynamic. It is not a static object that one entity collects under a “Consent”, uses for a stated purpose and then destroys.

Data once created cannot be easily destroyed. It can only be converted into another form where it looks different. It is therefore like “Energy” that cannot be destroyed in the universe but can only be converted from one state to another.

Energy can manifest either as a “Particle” or as a “Wave”. Similarly, Data can be rendered as a tangible “Document” or seen as binary impressions on a magnetic or optical medium.

In Quantum computing theory, “Data” can be in the form of Qubits with an uncertain state of being either a Zero or a One, assuming a probabilistically determinable value only when measured. The same “Uncertainty” can exist in the state of “Personal Data”, even in the classical computing environment.
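The qubit analogy can be made concrete with a toy simulation. The code below is purely illustrative (it is not part of any data protection regime): it shows how a qubit in superposition yields a definite 0 or 1 only at the moment of measurement, with the outcome determined probabilistically.

```python
import random

def measure_qubit(p_one, trials=10_000, seed=42):
    """Simulate repeated measurement of a qubit that collapses to 1
    with probability p_one and to 0 otherwise. Before measurement the
    state is 'uncertain'; each measurement freezes it into a value."""
    rng = random.Random(seed)
    ones = sum(1 for _ in range(trials) if rng.random() < p_one)
    return ones / trials

# An equal superposition yields roughly half ones across many measurements.
print(measure_qubit(0.5))
```

The same picture applies, loosely, to personal data: any single report or record is one “measurement” of a continuously changing underlying state.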

Data has a life cycle in which it is generated, re-generated, processed into a value-added form, fused and fissioned, deleted and undeleted, forgotten and remembered, used and misused, de-identified or anonymized or pseudonymized, and re-identified.
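The de-identification and re-identification steps of that life cycle can be sketched in a few lines. This is an illustrative sketch (the record fields and token scheme are invented); it shows why pseudonymized data remains reversible for anyone holding the lookup table, which is why such data is still treated as personal data.

```python
import hashlib

def pseudonymize(record, key_field="name"):
    """Replace the direct identifier with a hash token and return both
    the masked record and the lookup table that enables re-identification."""
    token = hashlib.sha256(record[key_field].encode()).hexdigest()[:12]
    masked = {**record, key_field: token}
    lookup = {token: record[key_field]}
    return masked, lookup

record = {"name": "A. Voter", "city": "Bengaluru", "ailment": "diabetes"}
masked, lookup = pseudonymize(record)

# Anyone holding the lookup table can reverse the masking.
reidentified = {**masked, "name": lookup[masked["name"]]}
print(reidentified == record)
```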

Hence the assumption, built into any data protection law, that an “Informed Consent” from the data subject to the first data collector will solve all the problems of “Information Privacy” is a complete myth.

What is required is for the law to recognize that “Personal Data is dynamic in nature” and at any given point of time exists in a certain state of uncertainty. It can however be measured at a specific point of time, when it shows up in a certain form. This is similar to the “Uncertainty Principle” embedded in the “Superposition” concept of Quantum computing.

Three Fundamental Rules of Dynamic Data Theory

We can define three fundamental rules of the Dynamic Data Theory for further discussion.

The first rule is that

“Personal Data does not exist in isolation but exists in the Data Universe”.

Such a data universe consists of

-the data subject’s data in different forms with different data controllers, collected at different times, along with

-many versions of the personal data processed by different processors for different purposes and

-combined with the data of other data subjects.

The second rule is that

“Personal Data exists in an uncertain state where it may be personal or non personal, sensitive or otherwise and assumes a certain state at the time of its measurement.”

The third rule is that

“Personal Data is not “Absolute” in truth and accuracy and always exists in a form dependent on the context of its collection and use.”

How these rules should be integrated into law making

Let us now elaborate on these three rules and discuss why a data protection law that does not consider them is defective ab initio.

We define “Personal Data” as information that is identifiable with a living person. Obviously, the name is the primary identifier of an individual in the physical world. In the Cyber world, it is the e-mail ID or an avatar ID that substitutes for the name as the real identity of a Netizen.

The address in the physical world and the IP address in the cyber world are also identifiers.

Additionally, there could be other parameters such as the Mobile Number, the Aadhaar number, the PAN number, the Voter’s ID etc., which are all different identity parameters.

There are also additional parameters such as age, sex, political affiliation, sexual preference, health information, financial information etc. that are also considered “Personal Information” when they are identifiable with a living individual.

The basic or “Primary” personal information is not the health or financial information but the physical identity information such as the name and address, or the cyber identity parameters such as the biometric or password. Other information may be important, but it is “Secondary Personal Information”.

So far, no law has defined “Primary Personal Information” and “Secondary Personal Information”. We have jumped from “Personal Information” to “Sensitive Personal Information” without clearly defining which is “Primary” and which is “Secondary”.

In the personal data cycle, “Personal Information” starts with the “Birth Certificate”, which records the name of an individual along with those of his parents, the place of birth and the date and time of birth. This is the “Primary Personal Information” in its atomic state. Within this, it is difficult to determine which comes first and which comes later.

In olden days, birth certificates used to be issued as “Son/daughter of X, the father, and Y, the mother”. The name actually came later, as an assignment by the parents in the naming ceremony. However, the convention today is to issue a “Birth Certificate” incorporating the assigned name. Hence the parameters of the birth certificate, namely the Name, date of birth, place of birth and names of the father and mother, are the atomic-level personal information that needs to be defined as “Primary Personal Information”.

Subsequently, other information about the data subject gets added, including the record of the DNA profile or blood group. Further, the education, employment particulars, bank particulars and other health parameters all get added to the “Personal Information”.

What we need to recognize here is that “Personal data changes its state on a continuous basis”, though it may appear from time to time in the form of a snapshot, which is an electronic document such as the PAN card, Aadhaar card, a medical report, a bank statement etc.

Hence law has to define “Personal Information” as an “evolving set of data that gets tagged to the Primary Personal Information created with the birth certificate parameters”. It is only the birth certificate parameters that can be frozen as an “Electronic Document defining the Personal Information of an individual”, and this gets extinguished with the “Death Certificate”.

In between, even the name of the person may change if the person so desires. His age ticks every second; his health data and financial data change every moment. If, therefore, we want “Personal Data to be accurate” as a legal requirement, the personal data record has to be updated every moment, which is not feasible.

It is in this context that I say that “Personal Data is in an uncertain state”, and only when you want to measure it do you get a health report or a bank report in which the personal data is frozen at a given point of time and place. This is the “Superposition Nature of Personal Data”, similar to the Quantum computing scenario. While a quantum superposition collapses to either Zero or One, Personal Data is a “Continuum” of many states and is changing all along.

In this context, the personal data of a person exists in a “Data Universe” where new data gets generated, some of which gets tagged with the Personal ID of the data subject, and we say that “the Personal Data has changed”. But this change can be recognized only if the Data Controller becomes aware of it.

If a data subject shares his data with one data controller on 1st January 2018 and with another data controller on 31st January 2018, the two data sets will be different. Each controller will be using the data based on the consent obtained, processing it and deriving inferences as if they know the truth. If the data subject says on 1st January 2018 that he will vote for BJP, he will be classified as a BJP-oriented person. If on 31st January he says he will vote for Mr X from Congress, the local candidate in the forthcoming election, the data changes colour and makes him a Congress supporter.

If both data sets are available to a single data processor, he will compute a data analytics report showing the trend that this voter is changing his profile and moving from BJP to Congress. If Modi makes a speech before the election, the trend may change again.
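The point about two consent-time snapshots producing a “trend” rather than a fact can be illustrated with a toy sketch. The field names and inference rule below are invented for illustration only:

```python
from datetime import date

# Two snapshots of the same data subject, held by different controllers
# and collected under separate consents.
snapshots = [
    {"date": date(2018, 1, 1), "stated_preference": "BJP"},
    {"date": date(2018, 1, 31), "stated_preference": "Congress"},
]

def infer_trend(snaps):
    """What a processor holding both snapshots would report: an
    interpretation frozen at measurement time, not an absolute truth."""
    ordered = sorted(snaps, key=lambda s: s["date"])
    first, last = ordered[0], ordered[-1]
    if first["stated_preference"] == last["stated_preference"]:
        return "stable " + last["stated_preference"] + " supporter"
    return ("shifting from " + first["stated_preference"]
            + " to " + last["stated_preference"])

print(infer_trend(snapshots))
```

Add a third, newer snapshot and the “profile” changes again, which is exactly the uncertainty the article describes.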

In such a scenario, the “Profiling” remains uncertain. Hence the so-called “personal data” which includes the political affiliation is just an interpretation by a data processor based on the information available and his own skill in interpretation; it is not an absolute truth that the person is either a BJP supporter or a Congress supporter.

Without understanding the three rules, if the law tries to say “Nobody shall use personal data except as provided by a consent”, then one has to ask: “Which data are we talking about?”

Is it the direct data provided by the individual, who said at one time that he is a BJP supporter and at another time that he is a Congress supporter? Or

Is it the “Processed Data” that says that the person is an undecided voter and may change his preference based on the stimuli he receives closer to the election?

If an analyst like Cambridge Analytica comes to a conclusion and develops a “Profile Report”, should the law consider this as “Primary or Secondary Personal Data” provided by the data subject, or as “Derived Information” that is not necessarily guaranteed to be true and absolute personal information but is only an expert’s view of the analysis?

If so, should the data analytics firm be punished for data breach if it shares its analysis with a candidate who is trying to finalize his communication strategy? This is a question which the law makers need to answer.

Today, the law makers say that all these decisions will be settled by the “Consent”. Their view is: “Let the person collecting the data get the consent for processing it, deriving meanings and then sharing it with somebody else, for profit or for a cause.”

However, at the time of obtaining a consent, the data controller has only a limited view of what information he is getting and how he may use it. Due to the “Dynamic nature of the data”, after collection the data in the hands of the data controller “grows” and “metamorphoses” into a different form, and he discovers that he can now make new uses of it.

What he bought was perhaps a caterpillar and now it has become a butterfly.

Should the law now say: go back to the data subject and ask him if you can use the butterfly instead of the caterpillar? Of course law can say so… because law can be an ass.

But what we need to ask the law makers is whether we can create a law which recognizes that the data which looks to be a caterpillar today may die as a caterpillar or change into a beautiful butterfly, and whether we should encourage the data holder to nurse the caterpillar in any way he wants and make it more valuable than it was when it was handed over by the data subject. This is the business of data mining and data analytics, on which a huge part of the IT industry stands today.

Another complication in the data scenario is that data may be processed by a number of downstream data processors. Today we define due diligence at each level in the form of a “Consent” or “Processing Contract”, which can only capture known information about the data and not what can be “Discovered”. Also, downstream data processors are not aware of the original consent and have to proceed with their processing only on the basis of the data processing contract as provided to them by the data collector.

If data protection laws try to curb the “Discovery” of new uses of data, we will be curbing scientific development, and the concept “Data is the New Oil” would be killed to the detriment of the progress of society.

If, therefore, Mr Aleksandr Kogan created some inferences based on the data he obtained from Facebook users under a separate consent given on his app, then the inference he derived was “Derived Data” and not the “Absolute Personal Data” of the data subject.

Presently the community is fighting over the issue as if “Personal Data” has been breached. But what has actually happened is that somebody created a notional value addition and somebody else paid money to buy it. Whether it was beneficial to Mr Trump, and whether a similar analysis in India would benefit BJP or Congress, is pure speculation.

The “Dynamic Personal Data” theory cuts the Gordian knot and releases the “Processed Data” from the constraints of the “Consent on the raw data”.

In other words, the consent obtained for transferring the caterpillar is not allowed to restrain the use of the butterfly.

Quantum States of Personal Data

When personal data is in the hands of one data processor, it is in a certain state of certainty defined by the information obtained under the “consent”. But while the data is being put to use, it slowly gathers energy and becomes more and more useful, with additional information flowing in from different sources and different persons.

For example, one person at a certain street address says that he likes to vote for BJP. Then let us say another piece of data, that the person attended a BJP rally or that a BJP team visited him at his house and had a discussion, gets added to the database. Now the first information gets hotter and hotter until the analyst of the data concludes with his algorithm that this is a BJP voter, and profiles him as such.

In this example, we can see that a piece of “Personal Information” attains the status of “Sensitive Personal Information” without the data subject doing anything or providing any additional information himself. The same thing may happen when Google Maps adds data that this data subject visits a dialysis center every week, and the inference is that he is a kidney patient. If this data is viewed alongside the financial information of the data subject, one can infer whether he is a prospective candidate for accepting a kidney donation.

This sort of movement of Personal Information from one state to another, after accumulation of additional data from the Big Data platform or by the subject’s own contribution, is like the “Quantum Jump” of an electron changing the state of the atom. If the incoming data energy is less than the quantum requirement, the data increases its entropy but remains mere “personal information”. But when the entropy crosses a certain quantum level, the data changes its status. If the data energy is strong enough, then it is not only the electron that makes a quantum jump: the nucleus itself may go into fission and change the entire profile of the data. In the Cambridge Analytica case, if the advertising input is strong enough, the profile of the data subject may alter from a BJP supporter to a Congress supporter or vice versa.
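The “quantum jump” metaphor can be restated as a threshold rule: accumulated signals eventually flip a record from plain personal data to sensitive personal data. The signals and weights below are invented purely for illustration; in practice the “energy” each signal adds would come from an analyst's model, not a fixed table.

```python
# Hypothetical signal weights for the accumulation metaphor.
SIGNAL_WEIGHTS = {
    "street_address": 1,
    "stated_voting_intent": 3,
    "attended_party_rally": 3,
    "party_home_visit": 2,
}
SENSITIVE_THRESHOLD = 6

def classify(signals):
    """Below the threshold the record stays 'personal data'; once the
    accumulated weight crosses it, the status of the data changes."""
    score = sum(SIGNAL_WEIGHTS.get(s, 0) for s in signals)
    return ("sensitive personal data" if score >= SENSITIVE_THRESHOLD
            else "personal data")

print(classify(["street_address"]))
print(classify(["stated_voting_intent", "attended_party_rally"]))
```

No single signal here is sensitive on its own; it is the fusion of signals that changes the classification, which is the point the article makes.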

Now, according to present data protection laws, the information which was earlier only “personal information” got fused with other information, such as “BJP party activity in the area”, and the result was a “political profiling” of the data subject, which is “Sensitive Personal Information”. As is now happening, data privacy activists will say this is an inappropriate use of the consent, meant for manipulating voter behaviour, and should not be allowed.

But is this change of status “controllable” by a law stating that you cannot bombard the data subject with additional information? If so, are we trying to curb the business of advertising and communication itself? This is a point that data protection laws need to address before jumping to stringent provisions in the light of the Facebook-Cambridge Analytica issue.

Thus we need to remember that Data is not static. It grows with the accumulation of additional data from the surroundings. In the process data changes colour and renders the earlier consent meaningless in the new scenario.

Similarly, non personal data can become personal data when there is fusion of identification parameters and an identifiable personal information may become de-identified personal information if the identity parameters are removed.

The next generation of Data Protection law cannot be blind to this “Dynamic State of Personal Data” and should not be built on the assumption that personal data always remains in a static form until the data subject himself provides new data inputs with new consents.

Is it a Diamond or Charcoal?

In this process of data transition through its life cycle, the value of data may change substantially. Just as carbon can exist either as charcoal or as diamond depending on how it is processed, data can remain worthless or become valuable depending on the processing.

If data processing creates a diamond,

should we stop such processing because the charcoal supplier supplied it at a throwaway price, thinking it would be used for burning, and gave his consent for that use, while the processor applied technology to compress the charcoal and discovered a means of converting it into diamond?


should we mandate that all data subjects get a royalty when their personal data is used to create profits for downstream industries?

These are challenges that the data protection law makers of India need to consider when they draft the new laws.



Reference Articles:

Data Protection Law in India… Three Big Ideas …. Data Trust, Jurisdictional Umbrella and Reciprocal Enforcement Rights

Uphold the “Right to Know” against “Right to Privacy” in the new Data Protection Law

We should forget the “Right to Forget” in Indian Data Protection Act

Cambridge Analytica and Indian Cyber Laws

Personal Data should be considered a personal Property

Public Consultation on Data Protection Legislation

Public Consultation on Data Protection Law…. Some points of discussion: Part I, Part II, Part III

Why We need a Data Breach Protection Act rather than Data Protection Act

CCTV footages.. Whose property is it any way?

Impact of Supreme Court’s Order on Right to Privacy on Cyber Space and Data Protection

Concatenating the individual Conclusions of the Privacy Judgement

Data Protection Act.. We should aim at Compliance with Pleasure not Compliance with Pain.

Look beyond GDPR and Create Personal Data Trusts to manage Privacy of data subjects

Privacy law cannot be only a tool for hiding oneself

Earlier innovative theoretical Thoughts of Naavi

Theory of IS Motivation

“Theory of Secure Technology Adoption”… what it is..

Section 65B and its relation to the Theory of Soul and Body, rebirth and past life memory

The Three Plus One dimensions of Information Security

Fighting susceptibility for “Cyber Hypnotism” with Ulysses Contracts

Compulsive Cyber Offence Syndrome

The news report that the personal profiles of 50 million Facebook users were collected and used without authorization to help Trump win an election has opened a new debate on Privacy and Data Protection in India. The BJP and Congress parties are fighting on TV, blaming each other for indulging in a similar misuse of personal data, while the local subsidiary of Cambridge Analytica (CA), the firm accused of the misuse, claims to have served both BJP and Congress in different elections.

Much of the debate that is happening in this connection appears to be dishonest and hypocritical and the bluff has to be called.

We must first recognize that CA is supposed to have collected the data through an app which was voluntarily downloaded by users who gave consent for access to their personal information. The person who collected the information based on that consent used it as data for some kind of research for targeted advertising. The research was bought by Trump’s campaign managers, and presumably he benefited.

Just as in India anything done by Modi is objected to, the anti-Trump brigade is alleging that the US election was tampered with because of the profiling done by the consumer research company and the targeted advertising for which it was used. Even if the firm had done a “Psychological Profiling” from the data available, as long as the data was in the public domain or obtained under an informed consent, there is no breach of Privacy. There are FinTech companies who do data analytics for fixing credit limits, and if data analytics is used to create innovative advertising, it is neither a surprise nor something to be scoffed at.

This sort of data collection, from public resources or under informed consent, cannot be objected to just because we do not like Mr Trump winning.

If there is any real objection, one has to go into the question of whether the “Informed Consent” was actually obtained through fraud, and if so, the data collector, namely the British academic Aleksandr Kogan, has to be brought to book.

Presently all Privacy laws place faith in such consents. But if the data collector breaches the agreement and sells the data to another person who uses it for a purpose other than the one for which it was provided, it has to be objected to on grounds of “Breach of Contract”, “Breach of Trust” etc.

As regards the third party who bought the data, data protection acts need to impose a “Due Diligence” obligation to disclose the intended use to the data vendor and obtain confirmation that the purchased data can be used for that specific purpose. Since “Advertising” is a legitimate purpose, if the data collector offers data for advertising, the advertiser may buy it under the premise that the data subject must have provided the necessary consent.

Whether the secondary data user is expected to check if the original consent provided to the data collector permits such use is a matter yet to be clearly defined in law, though it could be an ethical and moral issue. Also, in many cases, even the buyer may not be aware how exactly he is going to use the data and how he can benefit from it. He may be simply buying it speculatively, hoping to discover some value-added derivatives out of it which he may trade.

It is therefore hypocritical for us to express surprise that FB data could be used for profiling, that profiled information could be used for advertising, and that such advertising could be used for political campaigns. All this has to be expected in the era of Big Data Analytics and Artificial Intelligence.

In fact, while the privacy laws so far have missed the need to impose “Due Diligence” on the secondary user of personal data (something that can be taken note of and included in the Indian Data Protection laws), we can draw attention to Section 66B of ITA 2008, which provides a possibility of “stretching the legislative intent indicated in the section” to cover the misuse of data. Section 66B is actually meant for punishing the use of stolen computers and mobiles and uses the term “dishonestly receives and retains any stolen computer resource”. If we can consider data a computer resource, and the act of using data for a purpose other than what it was meant for as “stealing”, then Section 66B can be stretched to the data misuse scenario, though this is not recommended.

The Justice Srikrishna panel may perhaps include a clause that

“Any user of personal data shall exercise due diligence to ensure that the purpose for which it may be used is consistent with the consent provided”
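A clause like this reduces, at its simplest, to a purpose-consistency check by the data user before processing. The sketch below is purely illustrative: the purpose labels are hypothetical, and a real consent record would be far richer than a set of strings.

```python
def use_is_consented(consented_purposes, intended_use):
    """Minimal due-diligence check: is the intended use among the
    purposes the data subject originally consented to?"""
    return intended_use in consented_purposes

# Hypothetical consent record attached to a purchased data set.
consented = {"academic_research", "app_personalisation"}

print(use_is_consented(consented, "academic_research"))      # within consent
print(use_is_consented(consented, "political_advertising"))  # outside consent
```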

Perhaps this is the lesson we can take from this incident, apart from what we have already discussed as to the need for an intermediary called a “Data Trust” in the Data Protection environment.


Currently, GDPR and Aadhaar are both hot subjects for discussion among professionals, whether they are Privacy activists, Information Security professionals or Lawyers.

GDPR is at one end of the spectrum often looked upon by Privacy activists as the ultimate in Privacy Protection legislation. Aadhaar on the other hand is at the other end of the spectrum often looked upon as the greatest villain in Privacy breach in India.

The Supreme Court of India continues to hear the petitions of Privacy activists who are more concerned about the political damage they can inflict on the Government by attacking Aadhaar than about any public good.

There appear to be some foreign technical persons calling themselves “Ethical Hackers” who are camping in India to hack into Aadhaar data and prove that Aadhaar is the epitome of Privacy invasion in India. It is not clear where the motivation of these persons comes from: whether they are committed to the Privacy of the Indian citizen, or to the political advantages that would accrue to black money owners in India if the present intention of the Government to link Aadhaar to mobile numbers and bank accounts is frustrated through intervention from the Supreme Court.

We Indians are aware that even the Supreme Court has its own agenda and many times takes decisions which are “TRP oriented”. The Privacy judgement and the scrapping of Section 66A are examples of decisions where the Court has shown its inclination to come to conclusions based on the public perception that can be created about the “progressive views of the Judiciary”.

In this context it is essential for us to examine how GDPR tries to address the issues of Privacy in the context of public interest, national security and journalistic freedom.

Chapter IX of GDPR refers to “Provisions relating to specific processing situations” and sets out the rules regarding processing of personal data in the context of the right to freedom of expression and other issues, including “Processing of the national identification number”.

Article 85 of GDPR leaves it to member states to reconcile, by law, the right to protection of personal data pursuant to GDPR with the right to freedom of expression and information, including processing for journalistic purposes and for the purposes of academic, artistic or literary expression.

Article 86 refers to personal data in official documents held by a public authority or a private body for the performance of a task carried out in the public interest, which may be disclosed under a Right to Information kind of law.

As one can appreciate, the canvas for defining exclusions under Articles 85 and 86 is fairly wide, and if we take this as a guide for the Indian context, where we are waiting for our own data protection law, there is enough scope to consider our existing laws, including the Right to Information Act, as an automatic exclusion to GDPR.

Article 87 is interesting since it directly relates to a situation similar to Aadhaar. It states as under:

Article 87: Processing of the national identification number

Member States may further determine the specific conditions for the processing of a national identification number or any other identifier of general application. In that case the national identification number or any other identifier of general application shall be used only under appropriate safeguards for the rights and freedoms of the data subject pursuant to this Regulation.

This article provides complete rights to member states to overrule GDPR when it comes to processing of the national identification number or any other identifier of general application, subject obviously to “appropriate safeguards” being prescribed.

This article provides guidance to Indian companies, who often overreact to GDPR by imposing non-existent restrictions on themselves, on the extent to which local regulations may override GDPR while the arrangement can still be considered “GDPR compliant”.

If the member states of the EU themselves have the freedom to enact laws that may override GDPR, it is obvious that in an independent sovereign country like India, where in most cases GDPR applies through contracts between a Data Controller in the EU and a Data Processor in India, local laws such as the Information Technology Act 2000/8 will have paramount priority over and above GDPR.

I therefore caution Indian Companies that in their eagerness to be GDPR compliant, they should not ignore the need to be ITA 2008 compliant.

We need to build GDPR compliance within the parameters of ITA 2008 compliance. Fortunately, ITA 2008 is eminently designed for this, since Section 43A and the definition of “Reasonable Security Practice” accommodate such contracts as defining the security requirements for compliance. The only difference is that the remedy may have to be sought under ITA 2000/8, read along with the international treaties and laws applicable to international contracts. GDPR cannot be superimposed in derogation of these other remedial options.

The second aspect we need to take note of from Article 87 is that even the rigorous GDPR regulation on Privacy provides an exception for the national identification number in EU member countries. Hence the Indian Data Protection Act can also exempt the processing of Aadhaar data from its restrictions.

The Supreme Court should therefore take cognizance of this fact and not repeat, while ruling on Aadhaar, the mistake it committed in scrapping Section 66A of ITA 2008.

Linking Aadhaar to bank accounts and mobile numbers is a requirement of public interest to prevent black money, benami transactions, terrorism and crime. The right of the Government to use a national identification number such as Aadhaar for such purposes cannot be curtailed by the Court without taking on the blame that the decision is meant to please those who advocate that Aadhaar has to be scrapped.

The above support for Aadhaar is, however, not in derogation of the requirement that there must be adequate safeguards to secure Aadhaar usage so that it cannot be misused to commit crimes. It is in this context that the “Virtual Aadhaar” becomes most important as a security measure, so that at least in the future “stored biometric attacks” through the Aadhaar user agencies do not occur.

My support for Aadhaar also does not mean that the Aadhaar authorities are taking all the steps necessary for securing the Aadhaar infrastructure, or that they are not arrogant and dismissive of the risks.

It is however considered that linking Aadhaar to financial information and to the identity of individuals across several activities is essential to build a safe India, and no legal hurdle should be placed to prevent this honest effort of the Government. The security concerns are real, but they can be addressed if UIDAI makes full efforts in this regard.

The first thing UIDAI needs to check is the progress of the Virtual Aadhaar implementation. The system should be in trial operation by 1st of April and in mandatory operation by 1st of July.

While some data security organizations in India are busy conducting surveys on our GDPR preparedness, UIDAI itself or other data security organizations should also focus on surveying our preparedness for the implementation of Virtual Aadhaar as an identity to replace the Aadhaar identity used by Banks and Mobile operators.


I must admit here my excitement about Quantum Computing and about discussing the impact of a principle of Physics on Cyber Law development. I left my formal college education as a student of Physics when Quantum Mechanics was in its infancy, and this feels like being “Back to the Past”.

Though I did my post-graduation in Nuclear Physics, studied Particle Physics to some depth and specialized in subjects such as Nuclear Forces, the subject of Quantum Physics was still new and not properly understood at that time. I had even baffled everybody, including myself, in an interview at the Physical Research Laboratory (PRL) in Ahmedabad, when I solved in real time a quantum physics question put to me by the interviewers for the post of “Scientific Assistant”, which most other interviewees had failed to do.

Though I declined the offer despite repeated requests to join and turned my back on pure science, I never imagined that after 40 years I would return to study the impact of Quantum Mechanics on the present domain of my specialization, which happens to be the techno-legal aspects of law.

But it appears that Cyber Law in India and elsewhere will be deeply impacted by emerging technologies, of which Quantum Computing is one that will overturn many of the present concepts of law.

Hence the study of “Cyber Laws in the Emerging Technology Scenario” will be the new focus, which we may term “Quantum Cyber Law Specialization” or “Futuristic Techno Legal Specialization”.


Today I have taken up one topic for discussion, namely the interpretation of Section 65B of the Indian Evidence Act (IEA), to examine whether Naavi’s interpretation of Sec 65B survives the Superpositioning concept of Quantum Computing.

The legal and judicial community has struggled to interpret the section even after 18 years of its existence, and it will be a further challenge to interpret Sec 65B in the emerging quantum computing age. For a large part of the 18 years since Section 65B (IEA) came into existence, few recognized its existence and hence there was not much debate on the topic. It is only in the recent past that the community has started discussing the issue, many times with a wrong perspective.

During most of this time, Naavi’s interpretation of Section 65B was not seriously challenged. In recent days a few law professionals would like to interpret things differently. They may draw support from some Judges who are dishing out judgements without fully understanding the impact of their wrong decisions on society. This tendency comes from the inability of some to unlearn what they have learnt over the last three or four decades of their legal careers. They are therefore uncomfortable with what the Supreme Court stated unambiguously in the Basheer judgement and want to interpret things in their own way.

Naavi has been saying: wait… it took 14 years for the Supreme Court to realize the existence of Sec 65B, and it may take a few more years for the entire community to come to the understanding that Naavi has been advocating since 2000.

In this connection, I have tried to give some thought to what will happen to my interpretations of Section 65B when Quantum Computing comes into play.

Quantum Computing is not an easy concept to understand even for specialists in Physics. Hence for lawyers and judges, understanding Quantum Computing would be understandably challenging. It is possible that I too may have to refine some of my own interpretations presented here, and I reserve my right to do so. I will however explore all the Cyber Law challenges presented by Quantum Computing. For the time being, I am only looking at the concept of “SuperPositioning” and its impact on Section 65B interpretation.

What is SuperPositioning?

SuperPositioning is a concept in Quantum Computing. In the classical computing scenario, a Bit can have a value of either 0 or 1. A Quantum Bit, or Qubit, can however have a value of 0 and 1 at the same time. When you measure it, it will show either 0 or 1, but while it is not being measured it holds both values simultaneously.
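The behaviour can be illustrated with a toy simulation in Python. This is only a sketch of the statistics of measurement, not of real quantum hardware: the qubit’s two amplitudes coexist until a measurement forces a single classical bit out.

```python
import math
import random

# Toy model of one qubit in an equal superposition of |0> and |1>.
amp0 = amp1 = 1 / math.sqrt(2)          # amplitudes; their squares sum to 1

def measure():
    # Measurement collapses the state: a definite 0 or 1 is observed,
    # with probability equal to the squared amplitude.
    return 0 if random.random() < amp0 ** 2 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1
# Any single measurement looks classical; only the statistics over many
# runs reveal that both values were "held" before measurement.
```

This is exactly the judge’s difficulty: each observation yields a definite answer, while the underlying state was neither one nor the other.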

This “Dual State capability” of a Qubit may be fascinating for the scientist who swears by concepts such as Heisenberg’s uncertainty principle, the multiple quantum energy levels of the electron in a hydrogen atom, the quantum energy state of the nucleus of a Phosphorous atom, the direction of spin of a subatomic particle, light being both a wave and a particle at the same time, parallel universes, time as another dimension, worm-holes as tunnels to the future, etc.

But to a judge who is looking for “evidence beyond reasonable doubt”, and for a criminal justice system where a witness is expected to answer only in binary, “Yes” or “No”, the uncertainty inherent in Quantum Computing will be a huge challenge.

In fact, at present we can state without batting an eyelid that if I stand in the witness box and start talking of “SuperPositioning”, and more specifically of the “Entanglement” aspects of Quantum Computing and how they require a re-interpretation of Section 65B, I will be thrown out of the Court as somebody who has lost his mind.

Since nobody can throw me out of this blog, let me take the courage to proceed further and raise some issues which may be academic discussion points as of now but will be important for the Cyber Lawyers of the future.

But in the days to come, Cyber Law will be revised to accommodate the “Uncertainty Principle of an Electronic Document”. The time to recognize this concept has already come in respect of Section 65B.

The Current Dilemma in Section 65B, Yet to be Resolved

From the time ITA 2000 came into being until the Supreme Court judgement in the P.K. Basheer case on 18th September 2014, there was little discussion of Section 65B of the Indian Evidence Act (IEA) in the higher echelons of the Indian judiciary.

The decision of the Chennai AMM Court accepting the first Section 65B certificate issued by Naavi and convicting the accused in the historic Suhas Katti case (Refer here) was perhaps too insignificant in the eyes of many senior advocates, and hence went unnoticed.

Since there were no debates in the august Supreme Court about Section 65B, “Eminent Advocates” who had gained their eminence through expertise and years of work in non-cyber-law domains such as Constitutional Law or the Law of Evidence did not take time off to discuss the implications of Section 65B in right earnest. One opportunity, presented in the Afsan Guru case in 2005, was lost because it was a high-profile case of a terrorist attack against the Nation in which technical issues could not be given too much importance. Hence, when Mr Prashant Bhushan raised the technical issue of non-availability of a Section 65B certificate for some of the evidence, the Court considered the other evidence before it and proceeded with the case.

This was interpreted as a rejection of the “mandatory requirement of a Section 65B certificate” and became a precedent that prevailed until the Supreme Court overturned it in the P.K. Basheer case.

However, Naavi continued to hold his ground and did not accept as correct the Afsan Guru judgement in respect of the mandatory requirement of a Section 65B certificate for the admissibility of electronic evidence.

We have discussed several of the issues arising out of the P.K. Basheer judgement, both on naavi.org and ceac.in, and readers may refer to them for more clarity.

We have held that the P.K. Basheer judgement provided judicial support to most of Naavi’s views regarding Section 65B. There was only one aspect of the judgement where, as we have pointed out, clarity remained to be provided. It was the view expressed in the judgement as follows:

“The situation would have been different had the appellant adduced primary evidence, by making available in evidence, the CDs used for announcement and songs. Had those CDs used for objectionable songs or announcements been duly got seized through the police or Election Commission and had the same been used as primary evidence, the High Court could have played the same in court to see whether the allegations were true. That is not the situation in this case. The speeches, songs and announcements were recorded using other instruments and by feeding them into a computer, CDs were made therefrom which were produced in court, without due certification.”

Naavi has consistently held that an “Electronic Record” is a third type of evidentiary object, different from the “Oral” and “Documentary” evidence provided for in Section 17 of the IEA, and should be considered a special category whose admissibility is governed by the provisions of Section 65B alone.

While interpreting Section 65B, some of the “Eminent Non Cyber Law Jurists” have still not reconciled themselves to unlearning the concepts of “Primary Evidence” and “Secondary Evidence”, where the “Primary Evidence” lies inside a CD or a hard disk and the “Secondary Evidence” is a copy produced because the primary evidence cannot be brought to the court.

In the electronic document scenario, the original document is a “Binary Expression”. The binary expression which we call an “Electronic Document” is a sequence of bits, present either as the magnetic states of units of a magnetic surface or as depressions on a CD surface which reflect light differently from their neighbouring units. When this stream of bits is read by a reading device, a software running on hardware interprets the sequence of binary expressions as “Text”, “Audio” or “Video”, which we humans call “Electronic Documents” and then debate whether it is “Primary Evidence” or “Secondary Evidence”.

The “Original Electronic Document” is an expression that can only refer to the first creation of the given sequence of bits constituting the electronic document being interpreted as evidence. For example, when a digital camera captures a picture, it first creates a sequence of bits in the RAM space. This is however not a recognized electronic document, since it is in a state not “meant to be accessible so as to be usable for a subsequent reference” (Sec 4 of ITA 2008).

When this sequence of bits gets transferred to “Stored Memory” in a device such as a “memory card” or a “hard disk”, that represents the first instance of the electronic document coming into existence. Before this, the magnetic/optical surface on which the document is recorded was in a “Zero State”: every bit on the surface was designated “Zero”. When the electronic document is etched on the surface, some of these “Zero”s are converted into “One”s, and the unique sequence created is subject to a “Protocol”. This sequence of bits, stored subject to a “Protocol”, is what we call the “Original Document”.

But this “Original Document” has no meaning without being read by devices which understand the protocol and render the information in a human-understandable form. For example, if the content has been captured in a .txt, .doc, .mp3, .avi or .mp4 format, then the electronic document is a sequence of zeros and ones conforming to the respective protocol. It is not possible to separate the protocol information from the electronic document itself, and hence the document remains in a given format along with the protocol information.

When a reading device is presented with the electric/electronic impulses generated by such a sequence of bits, and the device is capable of interpreting the protocol, it converts them into a humanly experienceable document, which we may call Text, Audio or Video, and which a judge can view and act upon. If the device is not capable of understanding the protocol, the document is rendered in an unintelligible form: if it is text, it appears as gibberish; if it is audio, we may hear a meaningless echo; if it is video, we may see only lines on the screen. If a sequence of bits needs to be experienced by a human being, we must use a device which understands the protocol and converts the bits in a specific manner into a humanly readable/hearable/viewable form on a computer screen or a speaker.
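This dependence on the protocol is easy to demonstrate. In the sketch below (the two text encodings are chosen purely for illustration), the very same stored bytes read under the right and the wrong protocol produce faithful text in one case and gibberish in the other:

```python
# One and the same sequence of bits, stored under a known protocol.
raw = "Section 65B certificate".encode("utf-16-le")

# A device/software that understands the protocol renders it faithfully.
readable = raw.decode("utf-16-le")

# A device that applies the wrong protocol renders the same bits as
# gibberish, even though not a single bit has changed.
gibberish = raw.decode("latin-1")

print(readable)          # the human-experienceable "electronic document"
print(repr(gibberish))   # the same bits, unintelligibly rendered
```

Not one bit differs between the two readings; only the protocol applied by the reading device differs.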

So, even if in the Basheer case the original CD had been produced, or in the Suhas Katti case the hard disk with Yahoo Inc. had been produced, or in other cases the memory card of a video camera is produced as “Original Evidence”, the judge can view it only by using a device configured to the protocol to which the sequence of bits corresponds. If the judge takes a view of the document as he sees it on a computer, he becomes responsible for the protocols that have been used in rendering the sequence of bits into a humanly understandable document.

In a comparable environment, if a “forged” signature is questioned before a Court, the judge can himself view the signature and form his own opinion on whether it is forged or not. But prudence requires that the Court ask an expert to certify whether it is forged, so that the Judge does not become the witness and only interprets the evidence with reference to the law.

The same principle applies to electronic documents viewed by a Judge without insisting on a Section 65B certificate from another person.

This aspect was recognized by the magistrate Thiru Arul Raj of the Chennai AMM court in the Trisha defamation case, referred to in my article “Arul Raj, the Unsung Hero” (Refer here), in which the principle was laid down that even when the so-called “original” electronic document is before the Court, it has to be Section 65B certified by a third party.

In this background we can now appreciate why the Section 65B certificate has to be produced in a specified manner, namely:

“Identifying the electronic documents rendered in the computer output”,

“Indicating the process by which the computer output was produced”,

“Providing certain warranties on the production of the computer output”, and

then considering the “Computer Output” as “Admissible Evidence” without the need for producing the original.

In this process the Certifier states that when he followed a certain protocol indicated in the certificate, he was able to view the electronic document in the form in which it has been presented in the computer output, and he is responsible for the faithful reproduction of what he himself saw or heard into the format in which he has rendered the computer output.
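The certifier’s role described above can be modelled as a simple record. This is only my illustrative sketch (the field names and the use of a SHA-256 hash are my assumptions, not anything prescribed by Section 65B): it identifies the computer output, describes the viewing process, and binds the certificate to the exact rendition that was viewed.

```python
import hashlib
import json
from datetime import datetime, timezone

def make_65b_certificate(output_path, process_description, certifier):
    # Hash the computer output so the certificate identifies this
    # exact rendition of the electronic document.
    with open(output_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "computer_output": output_path,   # what is being certified
        "sha256": digest,                 # binds certificate to the content
        "process": process_description,   # device/software/protocol used to view
        "certifier": certifier,           # the third party who viewed it
        "certified_at": datetime.now(timezone.utc).isoformat(),
    }

# Example (illustrative names):
# cert = make_65b_certificate("output.pdf",
#     "Viewed on a Windows 10 PC, rendered by a PDF reader", "certifier name")
# print(json.dumps(cert, indent=2))
```

Because the hash changes if even one bit of the output changes, the certified copy can later be shown to be the very copy the certifier viewed.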

I wish all eminent jurists, including the Judges of the Supreme Court, would go through the above multiple times to appreciate why I have been stating that a Section 65B certificate can be produced by any third party (subject to a level of credibility) who has viewed the document, and not necessarily by the administrator of the device (as wrongly indicated in the SLP order in the Shafhi Mohammad case).

This also underscores my view that in the case of an electronic document we always deal with a “Secondary Document”, which is a rendition of the original etching of the binary sequence; humans are incapable of viewing the “Original”, which is a binary expression mixed with the viewing protocol. We should stop comparing the “Computer Output” under Section 65B with a photocopy of a paper document and talking as if both are the same.

Quantum Computing Era

Now let us turn our attention to the main object of this post, which was to look at Section 65B in the context of emerging technologies such as “Quantum Computing”.

Legal professionals may find the earlier paragraphs hard enough to digest and may not have the stomach to start debating what the Section 65B interpretation would be in the Quantum Computing era. Maybe it is too early to discuss the Cyber Law requirements for the emerging technologies, since even scientists have only recently begun to understand Quantum Computing.

But a “Futuristic Cyber Law Specialist” (whom we may also call a “Quantum Cyber Law Specialist” or a “Futuristic Techno Legal Specialist”) needs to tread a path which nobody else has trodden, and therefore we shall continue our exploration.

We must realize that Quantum Computers are expected to work along with classical computers, and hence the current concept of data storage in bits with a “0 or 1” state may not vanish with the advent of Qubits holding “0 and 1”. Rather, data may be processed in an “Artificial Intelligence Environment” using “Quantum Computing” and presented in a classical computing environment.

In view of the above, quantum computing will be part of the process, but the human interaction with the electronic document, which will be certified as a computer output in a Section 65B certificate, would be on a classical computer.

Additionally, “Quantum Computing” may sit between two classical computing scenarios. For example, data may be captured by a classical computing system and become part of the “Big Data” that is processed by a quantum computing system, with the results rendered back in a classical computing environment.

Though the journey of the “Electronic Evidence” from its birth as the original binary impression on the first classical computing device passes through a “worm-hole-like” quantum computing environment, it comes back into the classical computing environment when the Sec 65B certifier views it and converts it into a computer output.

I therefore consider that Naavi’s interpretation of Section 65B certification will survive the Quantum Computing age. Lawyers may however raise certain forensic doubts regarding the reliability of an electronic document certified under Section 65B, and forensic witnesses under Section 79A may need to answer them to the satisfaction of the Court.

However, Section 65B certification, being a matter-of-fact certification of what is viewed as a computer output on the classical computer of the observer, will not be vitiated by the complexities of the processes that go on behind the scenes.

Courts should understand that they are not entitled to subject the Section 65B certifier to cross-examination on the reliability of the back-end processing systems, as long as those systems follow the standards the computing industry adopts as technology.

I look forward to views from both my legal and technology friends regarding the above.


This is for general information of the public:

One Day Training Programme on Information Security for Industry Managers

Wednesday: 21 March 2018: Hotel Accord, Puducherry

CII Puducherry is organizing a One Day Training Programme on Information Security for Industry Managers on Wednesday, 21 March 2018, at Hotel Accord, Puducherry.


This session is meant for all Business, IT and IS managers.

The workshop will be conducted by Na.Vijayashankar, Information Assurance Consultant, popularly known as Naavi, a pioneer in Cyber Laws in India (https://in.linkedin.com/in/naavi).

Date & Timing :    Wednesday, 21 March 2018 – Starting from 0900 to 1700 hrs.

Venue :   Hotel Accord, No. 1, Thilagar Nagar, Ellaipillaichavady, (Near Rajiv Gandhi Statue & Opp to Muruga Theatre).

Those who are interested may contact CII, Puducherry. (www.cii.in)


After the PNB Fraud, in which over Rs 11,400 crore is suspected to have been lost, came to light, many other frauds are slowly tumbling out of the closets of E-Banking.

Leaving aside the fact that the lenders of different Banks who lent money to Mr Nirav Modi and Mehul Chokshi failed to check the “end use” of funds and allowed renewal of LoUs without checking the previous utilization and the need for extension, it was also realised that PNB had even allowed Nirav Modi’s employees to directly access the SWIFT messaging system of the Bank.

The Bank’s system was so configured that the SWIFT system could be accessed from outside the banking network, and the operating officials of the Bank gave away the passwords of multiple officials to the Nirav fraud team.

The system had no control that could detect that the log-in was from outside the Bank’s network, that multiple passwords were entered from the same computer, or that the messages were not reflected in the CBS system and created no vouchers for commission or margin collection.
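Each of the missed signals above is mechanically checkable. The sketch below is illustrative only (the event fields and the internal network prefix are my assumptions, not FINACLE’s or SWIFT’s actual schema):

```python
# Assumed internal address range of the Bank's network (illustrative).
BANK_NETWORK_PREFIX = "10.0."

def flag_anomalies(login_events):
    """Rule-based checks for the three missed signals: outside-network
    log-ins, many officials' credentials used from one machine, and
    SWIFT messages with no matching CBS voucher."""
    alerts = []
    users_per_host = {}
    for event in login_events:
        # Signal 1: log-in originating outside the Bank's network.
        if not event["source_ip"].startswith(BANK_NETWORK_PREFIX):
            alerts.append(("OUTSIDE_NETWORK", event["user"]))
        # Signal 2: multiple officials' passwords entered from one computer.
        users = users_per_host.setdefault(event["host"], set())
        users.add(event["user"])
        if len(users) > 1:
            alerts.append(("MULTIPLE_USERS_ONE_HOST", event["host"]))
        # Signal 3: message raised without a corresponding CBS voucher.
        if not event.get("cbs_voucher"):
            alerts.append(("NO_CBS_VOUCHER", event["user"]))
    return alerts
```

Even these three crude rules, run daily over log data, would have flagged the pattern described above long before it reached crores.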

This was a gross failure of the Bank staff and the information security configuration of the systems.

It is true that any IS control can be defeated if the employees are dishonest. But still, the system design should be such that even if some of the employees are dishonest, the fraud is detected, if not the first time, then in subsequent instances.

Unfortunately, the creators of the software at Infosys, who sell FINACLE and supply it to a number of Indian Banks, are not aware of the intricacies of Banking transactions and how frauds can be committed. Hence theirs is a faulty design, and Banks are saddled with this defective product.

Now yet another fraud has come into the open at State Bank of India, Chennai, where it also appears that the passwords of Bank staff have been used by an outsider to divert over Rs 3.2 crore (Refer article here) meant for the purchase of cars as an unsecured cash advance, which was used for funding a film production. Here again, the security configuration of the CBS software failed to recognize that cars were not purchased, money was not credited to a car dealer’s account, documents such as the RC book were not submitted, asset inspection did not take place, etc.

In all such cases, it is clear that it is not only the Software that failed, but also the internal audit system.

It is high time that Indian Banks rethink how their “Internal Auditors” are equipped to conduct audits in the computerized environment.

If internal audit cannot identify this new generation of Bank frauds, where the customer himself is given access to the Bank’s systems to design his own loan sanctions, create the approvals of several layers of bank officers and take the money out, then there is no need for such audits.

Where such “Self Loans” are used in “kite flying” mode and repaid with a roll-over loan, it is very difficult for normal audit processes to find the anomaly. There is definitely a need for Computer Assisted Audit Techniques, either as in-built features of the core banking software or through specialised audit tools.

FINACLE Strengths and Weaknesses

Banking software like FINACLE, which costs the Banks a fortune, should have an in-built, non-tamperable audit module that is effective in preventing such frauds from continuing beyond the first couple of occurrences, if not the first time.

FINACLE boasts of an audit module as part of its system, but it is clear that it has failed in the context of not only the PNB Brady branch but also the SBI Chennai branch and the many other similar cases that have come to light now.

If the Indian Banking system is in the doldrums today, a large part of that responsibility should be borne by the CBS software suppliers who have supplied defective products to the industry.

RBI has failed to subject the software itself to the audit by IDRBT which is mandatory, and hence part of the responsibility for the use of defective software lies with the RBI also.

While checking on the audit capabilities of FINACLE, I came across an article describing them.

Some key FINACLE menus and their uses for an auditor are described in that article. Some of them are briefly reproduced here.

  1. Account Ledger Enquiry (ACLI)
  2. Customer Account Ledger Print and Office Account Ledger Print (ACLPCA and ACLPOA)
  3. Audit File Inquiry (AFI)
  4. Average Balance (AVGBAL)
  5. Customer Master Inquiry (CUMI)
  6. Report on Expiring Documentary Credits (DCEXPLST)
  7. Query on Documentary Credit (DCQRY)
  8. Exception Report (EXCPRPT)
  9. Generate Report (GR)
  10. Financial Transaction Inquiry (FTI)
  11. Accounts Due for Review (ACDREV)
  12. Inward/Outward Remittance Maintenance (IRM/ORM)
  13. Outstanding Items Report (MSGOIRP)
  14. NPA Report (NPARPT)
  15. Letter of Acknowledgement of Debt Report (LADRPT)
  16. Loan Overdue Position Inquiry (LAOPI)/Temporary OD Report (TODRP)
  17. Print Reports (PR)
  18. Guarantee Issued Liability Register (GILR)
  19. Partywise Overdue Packing Credit (POVDPC)

The above list indicates that several reports should have thrown up audit queries in respect of the PNB Fraud as well as the SBI Fraud.

Now what we need to check is why the discrepancies were not thrown up by the audits.

The reasons could be many.

  1. The first reason could be that no audit was conducted at all. In PNB’s case we are told that RBI did not audit the branch for more than 9 years. It is not clear whether the internal audit was also bypassed. If so, was there any declaration in the annual reports to the shareholders providing the list of branches which had not been audited for the last 1/2/3 or more years?
  2. If an audit was conducted, it is possible that the auditors were not aware of all these modules and how to use them appropriately.
  3. Perhaps there was a lack of adequate training of the auditors.
  4. It is also possible that FINACLE comes with a base module that does not include all features and a higher-priced version that includes additional modules, and the Bank may not have taken the full module for cost considerations.
  5. It is also possible that the FINACLE system itself may not be able to properly analyze the data in the above modules, even though it may create some printable reports.

Need for Data Analytics in Audit process

Computer Assisted Audit Techniques, essential for the proper auditing of any computerized data environment, require the capability to:

a) Acquire data of different types from across the network available in different platforms and collate it into a common platform for analysis

b) Extract, Classify and Re-classify data into different groups which create new meanings not visible in the direct report

c) Search data across multiple categories and filter them against some specific risk identifying algorithms

d) Use known statistical methods such as Benford’s Law to check for potential frauds

e) Use Forensic audit tools to discover evidence that has been buried by the fraudsters

f) Check controls as part of the audit, including Information Security controls such as “Access Control”, “Log Analysis”, “Incident Management System”, etc.
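As one concrete illustration of point (d) above, a first-digit test against Benford’s Law can be scripted in a few lines; a large deviation is a signal worth a deeper audit, not proof of fraud:

```python
import math
from collections import Counter

def benford_deviation(amounts):
    """Sum of absolute differences between the observed first-digit
    frequencies of the amounts and those predicted by Benford's Law."""
    first_digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(first_digits)
    n = len(first_digits)
    deviation = 0.0
    for d in range(1, 10):
        expected = math.log10(1 + 1 / d)   # Benford's predicted frequency
        observed = counts.get(d, 0) / n
        deviation += abs(observed - expected)
    return deviation
```

Run over, say, all loan disbursal amounts of a branch: figures invented by a fraudster tend to drift away from the Benford distribution that naturally occurring financial data follows.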

It is clear that the current internal audit process in Banks is not equipped to conduct an audit beyond the reports submitted by the Branch to the auditor. If the auditor audits only what the auditee wants him to see, then the value of such an audit is low. Perhaps that is what statutory auditors do. But internal auditors have to go beyond checking the arithmetic accuracy of transactions and go into an in-depth fraud-possibility analysis.

Cost and Training Hurdle

In examining the solutions Auditors could use, it was observed that the tools normally considered reputed “Computer Assisted Audit Tools” or CAATs are prohibitively expensive and require rigorous training, both of which seem to create a hurdle for Banks.

However, it is possible for RBI to equip itself with such tools (ACL, IDEA, ARBUTUS etc.) and use them in its audits as a starting point. Other Banks may start using them depending on their size. Obviously the larger Banks have no constraint on budget nor on the ability to train their auditors, but smaller Banks may have a problem.

I therefore suggest that smaller Banks create a “Technology Resource Pool” in a “Centralized Fraud Investigation Center”, which should be equipped with such tools and talent and should conduct audits of member Banks as a service.

I hope RBI will take such steps to ensure that in future the audit system is strengthened to such an extent that frauds such as the ones we are now seeing do not go undetected until they balloon into huge scams.


(P.S.: I am an ex-Banker and therefore may not be fully aware of the current situation in Banks regarding how audits are conducted in the computerised environment.

But looking at the frauds that are surfacing, it is clear that the system is not working properly, and hence some of the observations made above may be true even though I may not be able to give evidence of the same. If we want to clean up the Banking system, Bankers need to do a self-evaluation of their systems and check whether some of the points made here are relevant.

I invite comments and suggestions on how to improve audit systems in Banks in the computerized environment… Naavi)