Facebook has removed hundreds of accounts and pages linked to Indian political parties or the Pakistani military for what the company described as “coordinated inauthentic behavior or spam.” The Facebook or Instagram accounts, pages or groups were detected through internal investigations into account activity in the region before upcoming elections in India.
“These Pages and accounts were engaging in behaviors that expressly violate our policies. This included using fake accounts or multiple accounts with the same names; impersonating someone else; posting links to malware; and posting massive amounts of content across a network of Groups and Pages in order to drive traffic to websites they are affiliated with in order to make money,” Facebook’s head of cybersecurity policy, Nathaniel Gleicher, said in a statement.
The social media giant has become much more conscious of user activity after a scandal in which data mining firm Cambridge Analytica used information from tens of millions of Facebook users to manipulate political campaigns in multiple countries, including the United States.
Indian political parties are relying heavily on social media to push forward their agenda in a tough general election that begins April 11, and the issue of fake news remains a major concern.
Facebook says 687 pages and accounts that were detected and suspended by its automated system were linked to India’s main opposition party, the Indian National Congress, or INC. The Facebook statement also said the company removed 15 pages, groups and accounts tied to officials associated with Indian IT firm Silver Touch. The information technology firm is linked to the ruling Bharatiya Janata Party. One Silver Touch Facebook page was followed by 2.6 million accounts, compared to 206,000 followers of the INC-linked pages.
The INC tweeted that no official pages run by the party had been taken down. “Additionally, all pages run by our verified volunteers are also unaffected,” it said.
A party official who did not want to be named told VOA that Facebook has not shared further information with the party about the pages in question or provided a list of them.
Pratik Sinha, who runs fact-checking website AltNews.in, said Facebook’s announcement gives a “lopsided” view that only the opposition INC has been engaged in pushing spam. Sinha pointed out that Silver Touch, whose accounts were taken down, had spent much more on advertising on the social media platform compared to the pages created by the INC’s IT cell.
In neighboring Pakistan, 103 pages or accounts linked to the media cell of that country’s military have been removed.
“Although the people behind this activity attempted to conceal their identities, our investigation found that it was linked to employees of the ISPR (Inter-Services Public Relations) of the Pakistani military,” the Facebook statement said.
These individuals, according to the statement, were operating “military fan Pages; general Pakistani interest Pages; Kashmir community Pages; and hobby and news Pages” with posts on politics and the military.
The ISPR declined to comment for this story.
Journalists or rights activists in Pakistan often complain of online trolling or harassment from fake accounts.
Journalist Gharidah Farooqi said she regularly faces threats and harassment online from accounts that appear to be military fan pages. She has complained to the military’s media wing but has been told the institution has nothing to do with the issue.
Another journalist, Asma Shirazi, told VOA she has faced an “organized and institutionalized” campaign against her online for her coverage of opposition leaders, particularly ousted Prime Minister Nawaz Sharif. Shirazi added that she has been accused of being “anti-Pakistan” and taking bribes from Sharif’s (Pakistan Muslim League) party.
Last week, several Facebook accounts posted pictures and personal details — such as home address and contact details — of rights activist Marvi Sirmed and incited people to kill her after falsely accusing her of acting against Islam and promoting a “free sex, incestuous society.”
Sirmed is a regular critic of the military, as well as the current administration of Prime Minister Imran Khan. Facebook has already taken down at least one account, but Sirmed said several others remain. Sirmed says she has complained to local authorities and is awaiting a response.
In the face of criticism that Facebook is not doing enough to combat extremist messaging, the company likes to say that its automated systems remove the vast majority of prohibited content glorifying the Islamic State group and al-Qaida before it’s reported.
But a whistleblower’s complaint shows that Facebook itself has inadvertently provided the two extremist groups with a networking and recruitment tool by producing dozens of pages in their names.
The social networking company appears to have made little progress on the issue in the four months since The Associated Press detailed how pages that Facebook auto-generates for businesses are aiding Middle East extremists and white supremacists in the United States.
On Wednesday, U.S. senators on the Committee on Commerce, Science, and Transportation will be questioning representatives from social media companies, including Monika Bickert, who heads Facebook’s efforts to stem extremist messaging.
The new details come from an update of a complaint to the Securities and Exchange Commission that the National Whistleblower Center plans to file this week. The filing obtained by the AP identifies almost 200 auto-generated pages, some for businesses, others for schools or other categories, that directly reference the Islamic State group and dozens more representing al-Qaida and other known groups. One page listed as a “political ideology” is titled “I love Islamic state.” It features an IS logo inside the outlines of Facebook’s famous thumbs-up icon.
In response to a request for comment, a Facebook spokesperson told the AP: “Our priority is detecting and removing content posted by people that violates our policy against dangerous individuals and organizations to stay ahead of bad actors. Auto-generated pages are not like normal Facebook pages as people can’t comment or post on them and we remove any that violate our policies. While we cannot catch every one, we remain vigilant in this effort.”
Facebook has a number of functions that auto-generate pages from content posted by users. The updated complaint scrutinizes one function that is meant to help business networking. It scrapes employment information from users’ pages to create pages for businesses. In this case, it may be helping the extremist groups because it allows users to like the pages, potentially providing a list of sympathizers for recruiters.
The new filing also found that users’ pages promoting extremist groups remain easy to find with simple searches using their names. Researchers uncovered one page for “Mohammed Atta” with an iconic photo of one of the al-Qaida adherents, who was a hijacker in the Sept. 11 attacks. The page lists the user’s work as “Al Qaidah” and education as “University Master Bin Laden” and “School Terrorist Afghanistan.”
Facebook has been working to limit the spread of extremist material on its service, so far with mixed success. In March, it expanded its definition of prohibited content to include U.S. white nationalist and white separatist material as well as that from international extremist groups. It says it has banned 200 white supremacist organizations and removed 26 million pieces of content related to global extremist groups like IS and al-Qaida.
It also expanded its definition of terrorism to include not just acts of violence intended to achieve a political or ideological aim, but also attempts at violence, especially when aimed at civilians with the intent to coerce and intimidate. It’s unclear, though, how well enforcement works if the company is still having trouble ridding its platform of well-known extremist organizations’ supporters.
But as the report shows, plenty of material slips through the cracks and is auto-generated.
The AP story in May highlighted the auto-generation problem, but the new content identified in the report suggests that Facebook has not solved it.
The report also says that researchers found that many of the pages referenced in the AP report were removed more than six weeks later on June 25, the day before Bickert was questioned for another congressional hearing.
The issue was flagged in the initial SEC complaint filed by the center’s executive director, John Kostyack, which alleges that the social media company has exaggerated its success in combatting extremist messaging.
“Facebook would like us to believe that its magical algorithms are somehow scrubbing its website of extremist content,” Kostyack said. “Yet those very same algorithms are auto-generating pages with titles like ‘I Love Islamic State,’ which are ideal for terrorists to use for networking and recruiting.” (VOA)