New guidelines for Social Media, OTT Platforms

The Indian government has come up with a new set of rules and regulations to regulate social media platforms, messaging services, OTT platforms and news portals. These will require Big Tech platforms to set up stronger grievance redressal mechanisms, and appoint executives to coordinate with law enforcement in India.


Under the new rules, OTT platforms and digital media will be required to disclose information about where they publish, the nature of the content they share, and their subscriber data. They will also be required to set up a grievance redressal mechanism.


If you are doing business in India, you need to be set up in India 

All significant social media intermediaries are required to appoint:

  • a Chief Compliance Officer

The Chief Compliance Officer, who will have to be present in India, shall be responsible for ensuring the platform’s compliance with the IT Act and the rules notified under it.

  • a Nodal Contact Person

Social media intermediaries will also have to appoint a nodal contact person who is available round-the-clock for “coordination with law enforcement agencies”.

  • and, a Resident Grievance Officer

The Resident Grievance Officer will be responsible for handling user complaints. Intermediaries will also have to publish a monthly compliance report detailing the complaints received, the action taken, and the redressal of such complaints.

Each of the above officers is required to be a resident of India.

The Rules also require significant social media intermediaries to have a physical contact address in India. This mandatory physical presence will have significant implications for foreign players in terms of setting up infrastructure, deployment of resources, and taxation.


However, the absence of a registration or mandatory licensing framework for digital media businesses should continue to attract foreign players to set up operations in India.


Active monitoring of harmful content — shifting the responsibility to ‘intermediaries’

In a departure from the previously applicable Information Technology (Intermediary Guidelines) Rules, 2011 (the 2011 Rules), significant social media intermediaries are now required to endeavor to deploy technology-based measures, including automated tools, to identify information that depicts rape or child sexual abuse, or information that is identical to content previously removed.

The rules also require maintenance of appropriate human oversight, and periodic review of automated tools. Such active monitoring by intermediaries dilutes the safe harbor protection that was available to intermediaries under the 2011 Rules.


Verification of users — security or risk to privacy?

Significant social media intermediaries are to provide a mechanism for verification of user accounts, e.g. through mobile numbers. Verified users are to be identified by a demonstrable mark such as a tick or dot. Though such verification is voluntary for users, it may still be a setback for user privacy. A platform’s use of personal information provided by users is subject to the platform’s privacy policy, to which users typically have no option but to consent. User information provided for verification will simply become part of the data that platforms are entitled to collect and use under that policy.


Identification of ‘first originator’ of Information

Messaging services (with more than 50 lakh users) will be required to enable identification of the first originator of information if required by a court order or an order of the Government under Section 69 of the IT Act.

Such identification of a user calls into question the end-to-end encryption offered by services such as WhatsApp, Telegram, and Signal, and raises the question of whether a platform can, in practice, accurately identify a user as the “first originator” of mischievous information.