Can Courts Declare a Death Sentence on a Humanoid Robot?… A Criminal Jurisprudential Challenge

India is in the process of replacing its age-old criminal laws, namely the Indian Penal Code 1860 and the Criminal Procedure Code 1973, with the new Bharatiya Nyaya Samhita and Bharatiya Nagarik Suraksha Samhita, 2023, drafts of which have already been presented in Parliament.

In the meantime, India is also expected to replace the ITA 2000 with the Digital India Act, which may alter the Cyber Jurisprudence that has been developing over the two decades of ITA 2000's existence.

Artificial Intelligence itself is growing as a technology, along with developments in Neuroscience, the Metaverse, etc.

Society will soon see many confrontations between AI and law, and the most complicated aspect of these will be in Criminal Jurisprudence.

We have seen that the evidentiary provisions introduced by ITA 2000 (Section 65B of the Indian Evidence Act) have not been absorbed by the judicial community till date, since unlearning the past is that difficult. To now unlearn criminal jurisprudence and think through the changes arising out of Artificial Intelligence is an even bigger challenge.

How the Higher Judiciary will react to this need and come up with its own jurisprudential guidelines is for future society to witness.

However, we can try to highlight some of the issues that need to be sorted out immediately to avoid a blackout when the new DIA becomes operative.

The essence of Criminal Jurisprudence lies in the definition of a crime, the definition of a criminal and the definition of justice.

A crime can be defined as “an act that is deemed by statute or by the common law to be a public wrong and is therefore punishable by the state in criminal proceedings”.

Law and Justice do not always converge, and experts define Justice as “a moral ideal that the law seeks to uphold in the protection of rights and punishment of wrongs”.

Many times, Justice has to be an interpretation of the written law, and herein lies the domain of “Jurisprudence”.

Jurisprudence has to interpret what “Ethics” is, which can be considered an extension of written law. The distinction between what is a crime in written law and what is a crime in the mind of a victim is always a tough challenge for the Judiciary.

Most of the time, criticism of the judiciary arises because the Judiciary may stick to the law in letter and ignore the law in spirit. At other times, the Judiciary goes to the other extreme and interprets the law as it considers necessary, invoking principles such as the “Basic Structure of the Constitution”, taking complete control of defining what the law is irrespective of what is written in the statute and what the public considers ethical.

If we look at Criminal Jurisprudence in the light of emerging technologies such as Artificial Intelligence, humanoid robots, Virtual Reality and Augmented Reality, there is a basic problem of identifying the “Actor” who has committed a crime and the “Act” which constitutes the crime.

The “Act” which constitutes a “Crime” is defined in the law. For example, Section 66 of ITA 2000 defines an offence punishable with three years of imprisonment as:

“if any person, dishonestly or fraudulently, does any act referred to in section 43, he shall be punishable with imprisonment for a term which may extend to three years or with fine which may extend to five lakh rupees or with both”

Section 43, associated with this section, is a compendium of 10 subsections, and whoever commits any of these 10 acts without the “permission of the owner or any person who is in charge of a computer, computer system or computer network” shall be liable ….

Determining an offence under Section 66 therefore involves the interpretation of “dishonestly” and “fraudulently”, and also of phrases such as “diminishing the value of information”, “affecting information injuriously”, etc.

The 10 acts represented by the subsections of Section 43 of ITA 2000 are:

(1) accessing or securing access to such computer, computer system or computer network or computer resource

(2) downloading, copying or extracting any data, computer data base or information from such computer, computer system or computer network including information or data held or stored in any removable storage medium;

(3) introducing or causing to be introduced any computer contaminant or computer virus into any computer, computer system or computer network;

(4) damaging or causing to be damaged any computer, computer system or computer network, data, computer data base or any other programmes residing in such computer, computer system or computer network;

(5) disrupting or causing disruption of any computer, computer system or computer network;

(6) denying or causing the denial of access to any person authorised to access any computer, computer system or computer network by any means;

(7) providing any assistance to any person to facilitate access to a computer, computer system or computer network in contravention of the provisions of this Act, rules or regulations made thereunder,

(8) charging the services availed of by a person to the account of another person by tampering with or manipulating any computer, computer system, or computer network,

(9) destroying, deleting or altering any information residing in a computer resource or diminishing its value or utility or affecting it injuriously by any means

(10) Stealing, concealing, destroying or altering or causing any person to steal, conceal, destroy or alter any computer source code used for a computer resource with an intention to cause damage,

Herein lie the jurisprudential requirements to be taken into account in defining an act as a crime.

The second aspect of Cyber Crime jurisprudence is the interpretation of who the “Person” responsible for the offence is.

In the Artificial Intelligence scenario, an attempt is made to make computer programs so sophisticated that decisions appear to be taken “automatically”.

When a computer output follows directly from an input processed through a program, the output follows the principle of GIGO (Garbage In, Garbage Out), and the programmer takes responsibility for determining the “means of processing”. The person who provides the input is the user of the software, who takes the output as the result of a computer-based automated decision and acts further on the basis of that decision.
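This attribution logic can be illustrated with a minimal sketch (the function and figures below are invented for illustration, not drawn from any statute or real system): a conventional program is a fixed mapping from input to output, so the programmer answers for the rule and the user answers for the input.

```python
# Illustrative sketch of GIGO attribution: the rule is fixed by the
# programmer; the output is fully determined by whatever input the
# user supplies. (A hypothetical flat-rate computation, invented here.)

def tax_due(income: float) -> float:
    """Rule chosen by the programmer: a flat 10% levy."""
    return income * 0.10

# The user supplies the input; the output follows mechanically.
print(tax_due(50000))    # sensible input  -> sensible output: 5000.0
print(tax_due(-50000))   # garbage input   -> garbage output: -5000.0
```

Because the mapping never changes on its own, responsibility for a wrong output is traceable either to the rule (programmer) or to the input (user).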

We shall take the example of an industrial process in which a chemical process takes into account the temperature, the composition of the processed material, etc., and determines the time up to which the process should run to produce the required finished product. If the input parameters of such a process are dishonestly altered, the process would result in a loss, or may even lead to an accident and cause death or injury.
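A toy sketch of such a process controller shows how tampering propagates (the function name, the formula and all the numbers here are hypothetical, invented purely to illustrate the point):

```python
# Hypothetical process controller: run time is computed from sensor
# readings. The formula is invented for illustration — hotter material
# and purer feedstock are assumed to finish sooner.

def process_run_minutes(temperature_c: float, purity: float) -> float:
    base = 120.0
    return base - (temperature_c - 100.0) * 0.5 - purity * 20.0

honest   = process_run_minutes(temperature_c=150.0, purity=0.9)  # true reading
tampered = process_run_minutes(temperature_c=250.0, purity=0.9)  # falsified sensor

print(honest, tampered)  # 77.0 27.0 — the falsified reading cuts the run short
```

A dishonestly inflated temperature reading silently halves the run time, spoiling the batch; the code itself executed exactly as written, which is what makes the attribution question below so hard.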

Is this a Section 66 offence? If so, who is responsible for it? Is it the programmer? Or the process owner who provided the input? Or is it the fault of the sensors which gave a certain reading, based on which the operator pressed a button to continue the process? What if the operator wanted to stop the process, but the buttons were so mis-wired that the process was triggered instead of being stopped?

These are the issues which Cyber Jurisprudence has to resolve.

When we term certain software “Artificial Intelligence”, whether ANI (Artificial Narrow Intelligence) or even AGI (Artificial General Intelligence), it still follows instructions already in its library, and hence the actions of the AI depend entirely on the owner or creator of that library. In such circumstances, criminal jurisprudence requires the owner of the software to take responsibility for the actions of the software, and if the creator of the software has not provided the necessary disclosures, the creator (developer) may also have a back-to-back responsibility. This is clear even in ITA 2000 by virtue of Section 11 (attribution of electronic records, including those sent by an automated system).

When we enter the realm of “Generative AI” or ASI (Artificial Super Intelligence), where by design the creator of the algorithm has enabled the software to hallucinate, predict and give out decisions, and also to learn from its own decisions and modify its next set of outputs on similar inputs, we are looking at a system that behaves beyond the original instructions input by the developer.
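The drift from the developer's original instructions can be shown with a deliberately trivial sketch (the class and thresholds below are invented for illustration and are nothing like a real learning system): after feedback, the same input produces a different output from the one the original code would have given.

```python
# Toy self-modifying decision rule: the developer sets an initial
# threshold, but the system rewrites its own rule from feedback.

class LearningThreshold:
    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold  # the developer's original rule

    def decide(self, score: float) -> bool:
        return score > self.threshold

    def learn(self, score: float, was_correct: bool) -> None:
        # On a reported mistake, the rule shifts toward the score.
        if not was_correct:
            self.threshold = (self.threshold + score) / 2

model = LearningThreshold()
print(model.decide(0.6))              # True under the developer's rule (0.6 > 0.5)
model.learn(0.7, was_correct=False)   # feedback moves the threshold to 0.6
print(model.decide(0.6))              # False: same input, different output
```

Even in this ten-line toy, the rule that produced the second decision was written by the system, not the developer — which is exactly the attribution question the next paragraph poses.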

It is in such circumstances that Cyber Jurisprudence has to interpret whether even the modification of code based on such learning is to be attributed to the original creator of the algorithm, or whether the AI itself should be considered a juridical person.

With the emergence of humanoid robots, at least one of which is presently acting as Chief Executive of a company whose products carry health risks, the consequences of the malfunctioning of AI have to be determined in law. Will you put a humanoid robot acting as CEO of a company, whose bad decision causes death and destruction, in jail for 10 years or for life? Or will you give it a death sentence? … That is the Criminal Jurisprudence challenge.

I welcome a debate on this aspect, so that MeitY and the MHA may take these issues into account while framing the new criminal laws and the Digital India Act.


About Vijayashankar Na

Naavi is a veteran Cyber Law specialist in India, presently working from Bangalore as an Information Assurance Consultant. Having pioneered concepts such as ITA 2008 compliance, Naavi is also the founder of Cyber Law College, a virtual Cyber Law education institution. He is now focusing on projects such as Secure Digital India and Cyber Insurance.
This entry was posted in Cyber Law.
