
No child sexual abuse material detected on our platform: YouTube after MeitY notice


Author: NewsGram Desk

Google-owned YouTube on Monday said that, after multiple investigations, it has not detected any child sexual abuse material (CSAM) on its platform. The company has submitted its formal response to the IT Ministry, which last week served notices to YouTube and other social media intermediaries directing them to remove any CSAM from their platforms.

In a statement to IANS, a YouTube spokesperson said that based on “multiple thorough investigations, we did not detect CSAM on our platform, nor did we receive examples or evidence of CSAM on YouTube from regulators”.

The spokesperson added that no form of content that endangers minors is allowed on YouTube.

“We will continue to heavily invest in the teams and technologies that detect, remove and deter the spread of this content. We are committed to work with all collaborators in the industry-wide fight to stop the spread of CSAM,” said the company spokesperson.

According to the platform, the majority of videos featuring minors on YouTube do not violate its policies, but when it comes to kids, YouTube takes an “extra cautious approach towards our enforcement”.


The Ministry of Electronics and IT had issued notices to social media intermediaries X (formerly Twitter), YouTube and Telegram, warning them to remove any child sexual abuse material from their platforms on the Indian internet or face action.

"The rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms. If they do not act swiftly, their safe harbour under section 79 of the IT Act would be withdrawn and consequences under the Indian law will follow,” said Union Minister of State for Electronics & IT, Rajeev Chandrasekhar.

The Information Technology (IT) Act, 2000, provides the legal framework for addressing pornographic content, including CSAM.

Sections 66E, 67, 67A, and 67B of the IT Act impose stringent penalties and fines for the online transmission of obscene or pornographic content.

According to YouTube, in India “we surface a warning at the top of search results for specific search queries related to CSAM”.

This warning states that child sexual abuse imagery is illegal and links to the National Cyber Crime Reporting Portal.

IANS/VB

