Islamabad, April 13, 2017: A Pakistani man arrested for selling child pornography online has confessed that he lured some 25 children on the pretext of teaching them computer skills, the media reported on Thursday.
The Federal Investigation Agency’s (FIA) cyber crime wing on Tuesday arrested Saadat Amin, 45, from Sargodha in Punjab province and seized his computer and laptop, reports Dawn online.
FIA cyber crime head Deputy Director Shahid Hasan said the case is the "first of its kind" in Pakistan.
“During interrogation Amin revealed that he had been selling child pornographic content online for the last few years. Amin used to lure children on the pretext of imparting computer education. He even paid between 3,000 and 5,000 Pakistani rupees to the parents of the victims, saying that their children would learn computer hardware and software (skills) at his one-room rented workshop in Sargodha,” an FIA official told Dawn.
The FIA cyber crime wing launched a probe into the matter after the Norwegian Embassy informed it through a letter that the country's police had arrested a man in connection with child pornographic content, and that Saadat Amin was one of his accomplices in Pakistan.
According to Amin, he not only sold his own recordings but also “video clips hacked from the servers of Russian and Bangladeshi porn websites to buyers in Norway and Sweden.”
The Norwegian man paid Amin between $100 and $400 for different videos involving young boys, the official said.
So far, the FIA has recovered some 65,000 child pornography video clips from Amin's possession, hacked from foreign websites. (IANS)
Facebook Inc said on Wednesday that its moderators removed 8.7 million user images of child nudity during the last quarter with the help of previously undisclosed software that automatically flags such photos.
The machine learning tool rolled out over the last year identifies images that contain both nudity and a child, allowing increased enforcement of Facebook’s ban on photos that show minors in a sexualized context.
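Facebook has not published the tool's internals; purely as an illustration, a system like the one described could combine two independent signals and flag an image only when both are present. The class names, thresholds, and scoring logic below are hypothetical, not Facebook's actual implementation.

```python
# Hypothetical sketch: queue an image for human review only when
# separate "nudity" and "minor present" classifier scores both fire.
# Thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class ModerationScores:
    nudity: float         # probability the image contains nudity, 0..1
    minor_present: float  # probability the image contains a child, 0..1

def should_flag(scores: ModerationScores,
                nudity_threshold: float = 0.8,
                minor_threshold: float = 0.8) -> bool:
    """Flag for review only when both signals are strong."""
    return (scores.nudity >= nudity_threshold
            and scores.minor_present >= minor_threshold)

# A strong nudity signal alone is handled by the adult-nudity rules;
# only the combination routes into the child-safety queue.
print(should_flag(ModerationScores(nudity=0.95, minor_present=0.10)))  # False
print(should_flag(ModerationScores(nudity=0.90, minor_present=0.92)))  # True
```

Requiring both signals mirrors the article's description of a tool that targets the intersection of nudity and a detected minor, rather than either condition alone.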
A similar system, also disclosed on Wednesday, catches users engaged in "grooming," or befriending minors for sexual exploitation.
Facebook’s global head of safety Antigone Davis told Reuters in an interview that the “machine helps us prioritize” and “more efficiently queue” problematic content for the company’s trained team of reviewers.
The company is exploring applying the same technology to its Instagram app.
Under pressure from regulators and lawmakers, Facebook has vowed to speed up removal of extremist and illicit material.
Machine learning programs that sift through the billions of pieces of content users post each day are essential to its plan.
Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook’s automated systems wrongly blocking their posts.
Davis said the child safety systems would make mistakes but users could appeal.
“We’d rather err on the side of caution with children,” she said.
Facebook's rules have for years banned even family photos of lightly clothed children uploaded with "good intentions," out of concern about how others might abuse such images.
Before the new software, Facebook relied on users or its adult nudity filters to catch child images. A separate system blocks child pornography or child nudity that has previously been reported to authorities.
Facebook has not previously disclosed data on child nudity removals, though some would have been counted among the 21 million posts and comments it removed in the first quarter for sexual activity and adult nudity.
Facebook said the program, which learned from its collection of nude adult photos and clothed children photos, has led to more removals. It makes exceptions for art and history, such as the Pulitzer Prize-winning photo of a naked girl fleeing a Vietnam War napalm attack.
The child grooming system evaluates factors such as how many people have blocked a particular user and whether that user quickly attempts to contact many children, Davis said.
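The article names only two of the signals the grooming system evaluates. As a hedged toy sketch, such signals could feed a simple additive risk score; the weights and threshold below are invented, and the real system is presumably a trained model rather than a hand-tuned rule.

```python
# Hypothetical grooming-risk heuristic built from the two signals the
# article mentions: how many users have blocked an account, and how
# quickly it tries to contact many minors. Weights/threshold are made up.
def grooming_risk(blocks_received: int,
                  minors_contacted_last_day: int) -> float:
    """Crude additive score; higher means riskier."""
    return 1.5 * blocks_received + 2.0 * minors_contacted_last_day

def needs_review(blocks_received: int,
                 minors_contacted_last_day: int,
                 threshold: float = 20.0) -> bool:
    """Route the account to the human review queue above the threshold."""
    return grooming_risk(blocks_received, minors_contacted_last_day) >= threshold

print(needs_review(2, 1))  # False: score 1.5*2 + 2.0*1 = 5.0
print(needs_review(8, 6))  # True:  score 1.5*8 + 2.0*6 = 24.0
```

As Davis notes in the article, a scoring step like this would prioritize and queue accounts for trained human reviewers rather than act on its own.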
Michelle DeLaune, chief operating officer at the National Center for Missing and Exploited Children (NCMEC), said the organization expects to receive about 16 million child porn tips worldwide this year from Facebook and other tech companies, up from 10 million last year.
With the increase, NCMEC said it is working with Facebook to develop software to decide which tips to assess first.
Still, DeLaune acknowledged that a crucial blind spot is encrypted chat apps and secretive “dark web” sites where much of new child pornography originates.