
Facebook Builds AI Tool that Tricks Facial Recognition System to Wrongly Identify Person in Video

Such de-identification technology previously worked mostly on still images, The Verge reported


Facebook’s artificial intelligence (AI) research team has developed a tool that tricks facial recognition systems into wrongly identifying a person in a video, the media reported.

The “de-identification” system, which also works in live videos, uses machine learning to change key facial features of a subject in a video, according to a report in VentureBeat on Friday.

“Face recognition can lead to loss of privacy and face replacement technology may be misused to create misleading videos,” reads a paper explaining the company’s approach, as cited by VentureBeat.

This de-identification technology previously worked mostly on still images, The Verge reported.


“Recent world events concerning advances in, and abuse of, face recognition technology invoke the need to understand methods that deal with de-identification. Our contribution is the only one suitable for video, including live video, and presents quality that far surpasses the literature methods,” said the paper.

The work is scheduled to be presented at the International Conference on Computer Vision (ICCV) in Seoul, South Korea, next week.

The development comes at a time when Facebook is facing a $35 billion class-action lawsuit for alleged misuse of facial recognition data in Illinois. A US court has denied Facebook’s request to quash the lawsuit.


A three-judge panel of the Ninth Circuit in San Francisco rejected Facebook’s plea to quash the lawsuit. The case would now go to trial unless the Supreme Court intervened, TechCrunch reported last week. (IANS)


Content Moderators on Facebook and YouTube Asked to Sign PTSD Forms

Content moderators at Facebook and YouTube in Europe and in the US have been asked to sign PTSD forms


Content moderators at Facebook and YouTube in Europe and in the US have been asked to sign forms detailing that the job may cause post-traumatic stress disorder (PTSD).

According to The Financial Times and The Verge, global professional services firm Accenture, which provides content moderators for big tech firms, has asked them to sign a form explicitly acknowledging that their job could cause post-traumatic stress disorder.

Accenture runs at least three content moderation sites for Facebook in Europe, including in Warsaw, Lisbon and Dublin. A similar document was also provided by Accenture to workers at a YouTube content moderation facility in Austin, Texas. Accenture said the wellbeing of workers was a “top priority”.


“We regularly update the information we give our people to ensure that they have a clear understanding of the work they do,” the company said in a statement.

“According to an employee who signed one of these acknowledgment forms, every moderator at the facility was emailed a link and asked to sign immediately,” the report said.

The Accenture form says workers might review “disturbing” videos and that moderating “such content may impact my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD)”. Both Facebook and Google said they did not review Accenture’s new form.

The Verge’s probe last month into Accenture’s Austin site described hundreds of low-paid immigrants toiling to remove videos flagged for extreme violence and terrorist content.


“The moment they quit Accenture or get fired, they lose access to all mental health services. One former moderator for Google said she was still experiencing symptoms of PTSD two years after leaving,” the report claimed.

Last year, The Verge published a report on Facebook moderators, one of whom said he “sleeps with a gun by his side” as a result of the job. (IANS)