
Key Points
- Government plans to expand Section 69(A) powers, allowing multiple ministries to issue emergency content blocking orders directly.
- Currently, only MeitY can issue such orders, with other ministries and agencies required to route requests through MeitY.
- Move could significantly increase takedown requests to social media platforms and internet intermediaries operating in India.
The central government is considering amendments to Section 69(A) of the Information Technology Act, 2000, to empower key ministries such as information & broadcasting, home affairs, external affairs, and defence to issue emergency content-blocking and takedown orders, sources told Business Standard.
Currently, the ministry of electronics and information technology (MeitY) is the only ministry that can order emergency content takedowns and blocking of web sites or platforms.
"We are considering this to ensure swift action on content for which the respective ministries have domain experts and can better determine the illegality of the content or the web site that needs to be taken down," said a senior government official.
The move to expand these powers to other key ministries is likely to affect social media and Internet intermediaries like Facebook, Instagram, Snapchat, YouTube and X, as the number of blocking orders issued to them could increase significantly.
At present, requests originating from other ministries, regulators, or law enforcement agencies for content takedown or web site blocking under Section 69(A) of the IT Act, 2000, must be routed through MeitY, which then issues a formal notice to the intermediary or Internet service provider.
The proposal to expand the scope of Section 69(A) of the IT Act is currently being discussed by senior government officials from key ministries and other stakeholders, and a decision on the best way to bring the amendment is likely soon, the official said.
Faster Takedown Decision Framework
Apart from these key ministries, the government is also likely to widen the scope of the amendment to allow some regulators to exercise these powers, with a strict threshold on which content can be taken down, another official said, adding that the ministries and regulators would need to be "sensitised" to the "nature and need" of emergency blocking powers.
Other than Section 69(A) of the IT Act, content takedown notices can also be issued to social media and internet intermediaries under Section 79(3)(b) of the Act.
Under Section 69(A), the central government, or any other authorised officer on its behalf, can order a piece of content or a web site to be taken down if they believe that the action is 'necessary or expedient' to protect the sovereignty and integrity of India, the defence of India, security of the State, friendly relations with foreign States or public order, or for preventing incitement to the commission of any cognizable offence related to the above.
Intermediaries that fail to comply with emergency blocking orders can be punished with up to seven years' imprisonment and a fine, as determined by courts.
By contrast, content takedown notices under Section 79(3)(b) of the IT Act can be issued by any ministry, regulator, or law enforcement agency through the ministry of home affairs' Sahyog Portal.
Intermediaries have a maximum of three hours to respond to such notices and remove the content.
In February this year, MeitY had amended the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, to cut the content takedown response timeline for intermediaries to three hours from the earlier 24-36 hours.
The amended rules, which came into effect on February 20, require intermediaries to remove non-consensual intimate imagery from their platforms within two hours, instead of a 24-hour window provided earlier.
Similarly, objectionable and unlawful content is now mandated to be removed within three hours of it being flagged either by the ministry or a law enforcement agency.
Govt may shrink content takedown window to 1 hour
MeitY is considering cutting the timeline for content takedown under Section 79(3)(b) of the Information Technology Act to one hour from the three-hour window currently allowed, sources told Business Standard.
Such a move will, however, be made only if social media and internet intermediaries continue to comply with the three-hour window without any major hiccups, an official said.
"This is under consideration, but there are no immediate plans to roll this out. Currently, we have a three-hour window in place, and we are receiving constant feedback from all intermediaries on the implementation," the official said.
The IT ministry will soon start reaching out to all stakeholders, including executives from social media and Internet intermediaries, to discuss the feasibility and viability of the proposed one-hour window, another official said.
"Given the size and spread of the Indian Internet, the possibility of virality of content here is much higher, and therefore, it is important to act on illegal content that much faster," the second official said.
In February this year, the amendments notified to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, stated that objectionable and unlawful content must be taken down within three hours of the intermediary being notified of its presence on the platform.
Further, non-consensual intimate imagery must be removed within one hour of the intermediary being notified, according to the new amended rules, which came into effect on February 20.
The push to make compliance timelines stricter for social media intermediaries has gained ground in India, especially after the rise of synthetic content.
In February, shortly after the amended IT rules came into effect, Union Electronics and Information Technology Minister Ashwini Vaishnaw had reiterated that social media and Internet intermediaries must take responsibility for the content hosted on their platforms to make them safer for children, women, and other online users.
'Platforms must wake up, must understand the importance of reinforcing trust in the institutions that human society has created over thousands of years,' Vaishnaw had said, adding that social media platforms that do not adopt adequate safety measures to protect their users from such harmful content would be held liable.
Feature Presentation: Aslam Hunani/Rediff