Wednesday October 23, 2019

Facebook Set up a War Room to Fight Election Interference

With the new ad architecture in place, people would be able to see who paid for a particular political ad

Facebook releases Messenger redesign on Android, iOS. Pixabay

In line with its efforts to prevent misuse of its platform during elections, Facebook has set up a War Room to reduce the spread of potentially harmful content.

Facebook faced flak for not doing enough to prevent the spread of misinformation by Russia-linked accounts during the 2016 US presidential election. Since then, the social networking giant has rolled out several initiatives to fight fake news and bring more transparency and accountability to its advertising.

The launch of the first War Room at its headquarters in Menlo Park, California, is part of the social network’s new initiatives to fight election interference on its platform.

Although Facebook opened the doors of the War Room ahead of the general elections in Brazil and mid-term elections in the US, it revealed the details only this week.

The goal behind setting up the War Room was to get the right subject-matter experts from across the company in one place so they can address potential problems identified by its technology in real time and respond quickly.

Facebook, social media. Pixabay

“The War Room has over two dozen experts from across the company – including from our threat intelligence, data science, software engineering, research, community operations and legal teams,” Samidh Chakrabarti, Facebook’s Director of Product Management, Civic Engagement, said in a statement on Thursday.

“These employees represent and are supported by the more than 20,000 people working on safety and security across Facebook,” Chakrabarti added.

Facebook said its dashboards offer real-time monitoring of key election issues, such as efforts to prevent people from voting, increases in spam, potential foreign interference, or reports of content that violates its policies.

The War Room team also monitors news coverage and election-related activity across other social networks and traditional media in order to identify what type of content may go viral.

These preparations helped a lot during the first round of Brazil’s presidential elections, Facebook claimed.

The social networking giant said its technology detected a false post claiming that Brazil’s Election Day had been moved from October 7 to October 8 due to national protests.

Though false, the message began to go viral. The team quickly detected the problem, determined that the post violated Facebook’s policies, and removed it in under an hour.

“And within two hours, we’d removed other versions of the same fake news post,” Chakrabarti said.

Facebook App on a smartphone device. (VOA)

The team in the War Room, Facebook said, also helped quickly remove hate speech posts that were designed to whip up violence against people from northeast Brazil after the first round of election results were called.

“The work we are doing in the War Room builds on almost two years of hard work and significant investments, in both people and technology, to improve security on Facebook, including during elections,” Chakrabarti said.

Earlier this month Facebook said that it was planning to set up a task force comprising “hundreds of people” ahead of the 2019 general elections in India.


“With the 2019 elections coming, we are pulling together a group of specialists to work together with political parties,” Richard Allan, Facebook’s Vice President for Global Policy Solutions, told the media in New Delhi.

Facebook has also set a goal of bringing a transparency feature for political ads — now available in the US and Brazil — to India by March next year, Allan said.

With the new ad architecture in place, people would be able to see who paid for a particular political ad. (IANS)


Social Media Giant Facebook to Bring New Tools to Protect 2020 US Elections

Facebook also announced an initial investment of $2 million to support projects that empower people to determine what to read and share - both on Facebook and elsewhere

The Facebook mobile app on an Android smartphone. Wikimedia Commons

Stung by the spread of fake news and privacy violations, Facebook on Monday announced several new tools to protect the 2020 US elections from being manipulated by nation-state bad actors and to avoid a repeat of the 2016 presidential election, which was hit by Russian interference.

The social networking giant launched “Facebook Protect” to secure the accounts of elected officials, candidates, their staff and others who may be particularly vulnerable to targeting by hackers and foreign adversaries.

“Beginning today, Page admins can enroll their organization’s Facebook and Instagram accounts in ‘Facebook Protect’ and invite members of their organization to participate in the programme as well,” said three top Facebook executives in a lengthy blog post.

Participants will be required to turn on two-factor authentication, and their accounts will be monitored for hacking, such as login attempts from unusual locations or unverified devices.

“If we discover an attack against one account, we can review and protect other accounts affiliated with that same organization that are enrolled in our programme,” said Guy Rosen, VP of Integrity at Facebook.

The company said it has seen people fail to disclose the organization behind their Page in order to make it appear that the Page is run independently.

To address this, Facebook is adding more information about who is behind a Page, including a new “Organizations That Manage This Page” tab that will feature the Page’s “Confirmed Page Owner”, including the organization’s legal name and verified city, phone number or website.

Initially, this information will only appear on Pages with large US audiences that have gone through Facebook’s business verification.

A new US Presidential candidate spend tracker will share ad details across national, state and regional levels.

FILE – Attendees walk past a Facebook logo during Facebook Inc’s F8 developers conference in San Jose, California, United States. VOA

“We’ll also make it clear if an ad ran on Facebook, Instagram, Messenger or the Audience Network,” said Facebook.

Next month, Facebook will begin labelling media outlets that are wholly or partially under the editorial control of their government as state-controlled media.

This label will be on both their Page and in Facebook Ad Library.

“We will hold these Pages to a higher standard of transparency because they combine the opinion-making influence of a media organization with the strategic backing of a state,” said Katie Harbath, Public Policy Director, Global Elections.

Facebook said it will update the list of state-controlled media on a rolling basis beginning in November.

In early 2020, Facebook plans to expand its labelling to specific posts and apply these labels on Instagram as well.


The company said that over the next month, content across Facebook and Instagram that has been rated false or partly false by a third-party fact-checker will start to be more prominently labeled so that people can better decide for themselves what to read, trust and share.

“The labels will be shown on top of false and partly false photos and videos, including on top of Stories content on Instagram, and will link out to the assessment from the fact-checker,” said Nathaniel Gleicher, Head of Cybersecurity Policy, and Rob Leathern, Director of Product Management.

Facebook also announced an initial investment of $2 million to support projects that empower people to determine what to read and share – both on Facebook and elsewhere. (IANS)