
Child porn charge follows corruption probe for Karnataka IAS officer


Bangalore: “Thousands of child pornography videos” were discovered on a 250 GB external hard disk found during a Karnataka CID raid on August 5, 2015. The hard disk lay alongside the unaccounted wealth seized by the CID from a sixth-floor apartment in the plush Golden Grand property in north Bengaluru: Rs 4.37 crore in cash, 2.5 kg of gold and diamonds, and documents for dozens of properties. The flat is held in the name of PSK Finance Solutions Pvt Ltd, a firm run by Naresh Mohan and Ahaan Mohan, the father and son of 1990-batch Karnataka IAS officer Kapil Mohan.

The Lokayukta police, who are investigating the corruption charges against Mohan and his family, have handed over the hard disk to the Bangalore police for an FIR to be filed under the sections of the Information Technology Act that deal with child pornography.

“We have registered a case under Section 67B of the IT Act against the owners of the apartment where the material was seized,” deputy commissioner of police (north) T R Suresh told The Indian Express.

The pornographic material was suspected to belong to Mohan since the confiscated hard drive also contained photographs, marks cards and personal documents of the IAS officer. “The hard drive has been forensically examined and the material found on it includes marks cards, documents, pictures of the official and thousands of child porn videos,” sources involved in the investigation claimed.

“Possession of child pornography is an offence under Section 67B of the Information Technology Act. The amount of child pornography material found indicates the presence of a perverted mind. It needs further investigation,” said police sources.

Mohan, who is also the principal secretary in the youth empowerment and sports ministry of the Karnataka government, has denied any involvement with the firms owned by his father and his son.

Upon investigation, it was found that companies like PSK Finance Solutions, run by Mohan’s family, existed only in name. Jaishiv Saxena, an associate of the IAS officer, was arrested by the Lokayukta police for falsely claiming ownership of the wealth seized at the Golden Grand flat. Saxena claimed that the seized money came from stake sales in an IT company. However, the Lokayukta investigation found that the Rs 4.37 crore had been withdrawn from different bank accounts.

The firms linked to Mohan and his family received large loans from companies that were allotted contracts by the state-run Karnataka Renewable Energy Development Ltd (KREDL), where Mohan was the managing director. Dishaa Power Corporation Pvt Ltd, a Bangalore firm, was awarded multiple small hydro projects around the state to generate 35 MW. Documents from KREDL and the RoC revealed that the promoters of this company deposited a total of Rs 1.75 crore with PSK Finance.

In March this year, PSK Finance Solutions allotted shares valued at Rs 1.25 crore to Gemini Shares and Stocks Private Limited. Dishaa Power Corporation director Naveen Patil was a director in these companies as well. PSK claimed that the shares were transferred against loans.

Apart from PSK Finance, Millenium Vinimay in Kolkata, also run by Mohan’s father and son, received large loans (Rs 5 crore) from firms such as Divyasree Infrastructure Projects. One of the Divyasree directors, Bhaskar N Raju, is also a shareholder in Dishaa Power Corporation, where the majority stake is held by a Hong Kong firm called Rainbow Energy Ltd.

(Inputs from Indian Express)

 


Facebook Tackles Child Nudity By Removing Posts

NCMEC said it is working with Facebook to develop software to decide which tips to assess first.

A man is silhouetted against a video screen with a Facebook logo in this photo illustration. VOA

Facebook Inc said on Wednesday that company moderators removed 8.7 million user images of child nudity during the last quarter, with the help of previously undisclosed software that automatically flags such photos.

The machine learning tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook’s ban on photos that show minors in a sexualized context.

A similar system also disclosed Wednesday catches users engaged in “grooming,” or befriending minors for sexual exploitation.

Facebook’s global head of safety Antigone Davis told Reuters in an interview that the “machine helps us prioritize” and “more efficiently queue” problematic content for the company’s trained team of reviewers.

This photo shows a Facebook app icon on a smartphone in New York. VOA

The company is exploring applying the same technology to its Instagram app.

Under pressure from regulators and lawmakers, Facebook has vowed to speed up removal of extremist and illicit material.

Machine learning programs that sift through the billions of pieces of content users post each day are essential to its plan.

Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook’s automated systems wrongly blocking their posts.

Davis said the child safety systems would make mistakes but users could appeal.

“We’d rather err on the side of caution with children,” she said.

A protester wearing a mask with the face of Facebook founder Mark Zuckerberg is flanked by two fellow activists wearing angry face emoji masks, during a protest against Facebook policies, in London, Britain (From archives) VOA

Facebook’s rules for years have banned even family photos of lightly clothed children uploaded with “good intentions,” concerned about how others might abuse such images.

Before the new software, Facebook relied on users or its adult nudity filters to catch child images. A separate system blocks child pornography or child nudity that has previously been reported to authorities.

Facebook has not previously disclosed data on child nudity removals, though some would have been counted among the 21 million posts and comments it removed in the first quarter for sexual activity and adult nudity.

Facebook said the program, which learned from its collection of nude adult photos and clothed children photos, has led to more removals. It makes exceptions for art and history, such as the Pulitzer Prize-winning photo of a naked girl fleeing a Vietnam War napalm attack.

Facebook’s head of global safety policy Antigone Davis speaks during an event at the White House. VOA

Protecting minors

The child grooming system evaluates factors such as how many people have blocked a particular user and whether that user quickly attempts to contact many children, Davis said.

Michelle DeLaune, chief operating officer at the National Center for Missing and Exploited Children (NCMEC), said the organization expects to receive about 16 million child porn tips worldwide this year from Facebook and other tech companies, up from 10 million last year.

With the increase, NCMEC said it is working with Facebook to develop software to decide which tips to assess first.

Still, DeLaune acknowledged that a crucial blind spot is encrypted chat apps and secretive “dark web” sites where much of new child pornography originates.


Encryption of messages on Facebook-owned WhatsApp, for example, prevents machine learning from analyzing them.

DeLaune said NCMEC would educate tech companies and “hope they use creativity” to address the issue. (VOA)