Facebook has removed several accounts owned by the Russia-based Internet Research Agency (IRA), the group accused of meddling in the 2016 US presidential election.
“We removed 70 Facebook and 65 Instagram accounts — as well as 138 Facebook Pages — that were controlled by the Russia-based Internet Research Agency (IRA),” Alex Stamos, Chief Security Officer at Facebook, said in a blog post late on Tuesday.
Many of the Pages also ran ads, all of which have been removed. “Of the Pages that had content, the vast majority of them (95 per cent) were in Russian — targeted either at people living in Russia or Russian-speakers around the world, including from neighbouring countries like Azerbaijan, Uzbekistan and Ukraine,” said Stamos, who is reportedly planning to leave the company by August.
The IRA has repeatedly used complex networks of inauthentic accounts to deceive and manipulate people who use Facebook, including before, during and after the 2016 US presidential elections. “We removed this latest set of Pages and accounts solely because they were controlled by the IRA — not based on the content,” said Stamos.
This included commentary on domestic and international political issues, the promotion of Russian culture and tourism, as well as debate on more everyday issues. US Special Counsel Robert Mueller’s team is investigating Russian interference in the 2016 presidential election.
Facebook is also facing a backlash after political data analytics firm Cambridge Analytica, which worked with Donald Trump’s election team, allegedly harvested 50 million Facebook profiles of US voters to influence their choices at the ballot box.
The social network later suspended Cambridge Analytica for violating its policies and commitments. Cambridge Analytica received user data from a Facebook app years ago that purported to be a psychological research tool, though the firm was not authorised to have that information. IANS
Facebook says it is getting better at proactively removing hate speech and changing the incentives that result in the most sensational and provocative content becoming the most popular on the site.
The company has done so, it says, by ramping up its operations so that computers can review and make quick decisions on large amounts of content with thousands of reviewers making more nuanced decisions.
In the future, if a person disagrees with Facebook’s decision, he or she will be able to appeal to an independent review board.
Facebook “shouldn’t be making so many important decisions about free expression and safety on our own,” Facebook CEO Mark Zuckerberg said in a call with reporters Thursday.
But as Zuckerberg detailed what the company has accomplished in recent months to crack down on spam, hate speech and violent content, he also acknowledged that Facebook has far to go.
“There are issues you never fix,” he said. “There’s going to be ongoing content issues.”
In the call, Zuckerberg addressed a recent story in The New York Times that detailed how the company fought back during some of its biggest controversies over the past two years, such as the revelation of how the network was used by Russian operatives in the 2016 U.S. presidential election.
The Times story suggested that company executives first dismissed early concerns about foreign operatives, then tried to deflect public attention away from Facebook once the news came out.
Zuckerberg said the firm made mistakes and was slow to understand the enormity of the issues it faced. “But to suggest that we didn’t want to know is simply untrue,” he said.
Zuckerberg also said he didn’t know the firm had hired Definers Public Affairs, a Washington, D.C., consulting firm that spread negative information about Facebook competitors as the social networking firm was in the midst of one scandal after another. Facebook severed its relationship with the firm.
“It may be normal in Washington, but it’s not the kind of thing I want Facebook associated with, which is why we won’t be doing it,” Zuckerberg said.
The firm posted a rebuttal to the Times story.
Facebook said it is getting better at proactively finding and removing content such as spam, violent posts and hate speech. The company said it removed or took other action on 15.4 million pieces of violent content between June and September of this year, about double what it removed in the prior three months.
But Zuckerberg and other executives said Facebook still has more work to do in places such as Myanmar. In the third quarter, the firm said, it proactively identified 63 percent of the hate speech it removed, up from 13 percent in the last quarter of 2017. At least 100 Burmese-language experts are reviewing content, the firm said.
One issue that continues to dog Facebook is that some of the most popular content is also the most sensational and provocative. Facebook said it now penalizes what it calls “borderline content” so it gets less distribution and engagement.
“By fixing this incentive problem in our services, we believe it’ll create a virtuous cycle: by reducing sensationalism of all forms, we’ll create a healthier, less-polarized discourse where more people feel safe participating,” Zuckerberg wrote in a post.
Critics of the company, however, said Zuckerberg hasn’t gone far enough to address the inherent problems of Facebook, which has 2 billion users.
“We have a man-made, for-profit, simultaneous communication space, marketplace and battle space, and it is, as a result, designed not to reward veracity or morality but virality,” said Peter W. Singer, strategist and senior fellow at New America, a nonpartisan think tank, at an event Thursday in Washington, D.C. (VOA)