An interesting question has been raised by the Indian corporate sector regarding the applicability of the new Intermediary Guidelines to corporate interaction platforms such as Zoom, GoTo Meeting, Webex, Google Meet and Microsoft Teams, and to other companies facilitating streaming of content and messaging among users and the public at large.
A doubt has arisen whether, with more than 5 million registered users, such platforms become "significant social media intermediaries", and whether they will then need to moderate content effectively and have rules for moderation in place.
It is also pointed out that such platforms may not be able to exercise control, as in the case of the Christchurch attack of March 2019, which was live streamed on Facebook, though Facebook tightened its rules for live streaming after the event.
After this incident, 31 countries and several tech companies came together under a pledge called the "Christchurch Call" initiative, to which India is also a signatory. Zoom is believed to have joined this pledge in 2020.
Since Microsoft Teams and similar technologies are also used in schools for online education, the need to moderate live streaming is relevant in certain circumstances as well.
The Christchurch Call to Action was an initiative that included voluntary commitments from governments and online service providers, intended to address the issue of terrorist and violent extremist content online and to prevent the abuse of the internet.
We must remember that all terrorist activities are also considered freedom movements or religious commitments by the section of people who are called terrorists. Hence there will always be differences of opinion on whether an act is "terrorism" or "religious action". Between these two extremes stand the "freedom of speech" protagonists, some of whom lean towards one of these sides or have their own political agenda and try to create misrepresentative narratives. It is this misrepresentation on digital media that the Intermediary Rules try to address.
As regards live streaming, it is news, and it is the journalist who has to show maturity and discretion. Some facts may not be known to the reporter in advance, and hence some events may get broadcast unknowingly.
What needs to be regulated, however, is the "conspiracy" and "planning" to commit a terrorist activity. The Disha Ravi incident, in which a Zoom meeting is reported to have been held to discuss the "Terror Plan", is an example of what may have to be regulated.
However, in such cases, it is difficult to blame the intermediary unless the title of the meeting gave a clear indication of the intention. We have discussed this in the past in the Bazee.com case (2004-2008), where the title "DPS MMS Video" of the video offered for sale on the platform was itself the "obscene" content on the basis of which action was taken against the executives of Bazee.com under Section 79.
The 180-day data retention rule may also be applicable to these platforms.
However, streaming video publishers are like YouTube: they are platforms used by other publishers. In the case of YouTube, they become "curated content publishers". But Zoom and the others do not "publish" subsequently and hence do not become the "media".
At best, the role of Zoom and the like is that of a CCTV camera which faithfully captures events and transmits them to the server, and may capture events that need to be regulated. But here the platform is a "pure intermediary" like an ISP, and hence the "social media" responsibilities do not attach to it.
Platforms such as Zoom therefore need not be worried about the new Intermediary Guidelines. Further, sharing such videos or content within a community of employees may not come under the definition of "publication", since no "public" is involved; hence the entire set of responsibilities does not apply to the companies. However, if the content is leaked to the outside world and creates problems (e.g., when WhatsApp messages are forwarded outside the original group), the person responsible for making the controversial content public should bear the responsibility, and the company should be in a position to identify that person through the meta information about viewing, recording, downloading, etc.
(This debate may continue. Comments are welcome.)