One of the challenges the cyber world faces is maintaining the trustworthiness of Internet content. In the coming days there will be increased use of ChatGPT-like tools by consumers, and it is essential to retain the integrity of these applications to the extent possible by adopting appropriate regulatory oversight.
We have already discussed the need for “Accountability and Transparency” of AI algorithms, which includes a declaration of the owner of the algorithm in all its outputs. The main responsibility for this has to be taken up by the AI-based service providers, since the algorithm developers may hide behind the service and cannot be easily located. Hence AI-based service providers would be held liable for any bias inherent in the algorithm, and it would be their responsibility to demand accountability from the AI developers.
Similarly, the Digital Media of the day, which creates the Internet content used as a training base by ChatGPT, Bard, etc., also needs to show some accountability. It is well understood that the “Hallucination Error” of AI is the responsibility of the code developer, but “Bias” is created by the training data input. This is easily manipulated by creating an ecosystem of motivated news spread through the Internet, either in the form of Digital Media or individual blogs.
We are aware that Bitcoin authentication frauds can be committed by fraud syndicates gaining control of a majority of the network (the so-called “51% attack”). Similarly, by controlling the narrative in more than 50% of the Internet content on a specific topic, it is possible to inject bias into AI algorithms that pick up training data from the Internet for reinforcement learning. While it may be difficult or impossible to poison 50% of all web content, it is possible to create such a biased mass of content on a specific issue.
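The mechanism described above can be illustrated with a minimal sketch. The toy corpus, the stance labels and the “model” below are all hypothetical stand-ins, not a real training pipeline: the point is only that a learner which echoes the most frequent view in its training data will reproduce whichever stance holds the majority, so pumping in content past the 50% mark flips the answer.

```python
from collections import Counter

def majority_stance(docs):
    """A crude stand-in for a model that learns the most frequent
    view in its training data and reproduces it as 'the' answer."""
    return Counter(docs).most_common(1)[0][0]

# Hypothetical corpus on one topic: 60 pumped-in articles pushing a
# motivated stance vs. 40 organic articles with the counter-view.
poisoned_corpus = ["motivated"] * 60 + ["organic"] * 40

# Before the injection campaign, the organic view was the majority.
original_corpus = ["motivated"] * 30 + ["organic"] * 40

print(majority_stance(original_corpus))  # the organic view prevails
print(majority_stance(poisoned_corpus))  # the pumped-in view now wins
```

Real language models do not take a literal majority vote, but the frequency of a viewpoint in the training data still shifts what the model treats as the default answer, which is why topic-specific flooding can work without poisoning the whole web.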
For example, it is possible to create a mass of content on “Adani” or “Khalistan” or “Islamic obligations”, where more than 50% of the content argues that “Adani” is a stock market manipulator, that Khalistan is a popular freedom movement, and so on, by pumping articles of a specific nature into the training data/Internet data.
In all such cases, the motivated actions of interested groups cannot be countered by a sufficient number of counter-views. Hence it is inevitable that the output of AI algorithms like ChatGPT will eventually get corrupted, and the corrupted outputs will in due course become the most accepted world view.
If a tool like ChatGPT had been relied upon when Socrates said the Earth is round while everybody else (other than the ancient Indians) believed it to be flat, then science would have had to struggle even harder than it did to establish its credibility.
Currently, a large part of the Digital Media is supported by motivated persons like George Soros, who invest large sums of money in maintaining a horde of organizations and journalists to spread a prejudiced view. Such networks have the capability of introducing bias into ChatGPT-4/5 or Bard.
I therefore advocate that, as a part of the Intermediary responsibility in India, all Digital Media should be made to declare, through a disclaimer, any association with a funding agency, whether it is George Soros or others.
In 2001, Naavi had suggested a service called “Lookalikes Disclosure” (visit lookalikes.in for more details) to address domain name disputes arising out of clashing domain names. Similarly, the time has come to suggest that every website provide a disclosure such as “I am not associated with George Soros” or, more generically, “This website provides independent views and is not funded by vested interests” (or something similar).
Such disclaimers should be considered as “Due Diligence”. Ideally, every website expressing “Opinions” should declare its ownership and its alignment, if any, with specific national, political, religious or racial interests.
Just as products are certified for country of origin, vegetarian or non-vegetarian status, etc., websites, blogs, YouTube channels and the like can carry Trust Seals indicating their affiliation or neutrality, which would be subject to review by the public.
I hope MeitY considers this suggestion and suitably includes it in the due diligence requirements for Digital Media.