
Her Son Died for ISIS: This British Woman Wants to Work to Prevent Radicalization

A government worker whitewashes an IS flag painted on a wall in Surakarta (Solo), Indonesia, Aug. 5, 2014. BenarNews

– by Ellen Wulfhorst

New York, November 22, 2016: When Nicola Benyahia’s teenage son slipped away one day to join the Islamic State in Syria, the frantic mother anguished over his disappearance for months while keeping it secret from her friends and most of her family.


“I kept it secret because of the shame of it,” she told the Thomson Reuters Foundation. “We didn’t know how to answer people because we couldn’t even make sense of it ourselves. One minute we were just doing our daily life and the next day he was gone.”

Hoping to spare other families such loneliness and despair, Benyahia this week launched Families for Life, a counseling service to help cope with the complexities of radicalization.

Thousands of fighters from the West have joined ISIS and other radical militant groups in Syria and Iraq, according to the New York-based Soufan Group, which provides strategic security services to governments and multinational organizations.

Some 850 of those fighters and supporters came from Britain, according to authorities, and about 700 came from France.

They include teenagers like Rasheed Benyahia, who became radicalized and, at age 19, made the drastic and, in his case, irreversible decision to leave home and fight.

Families for Life will help those worried about their vulnerable children and those grappling with children they have lost to violent radicalization, said Benyahia, 46, who lives in Birmingham, Britain’s second-largest city.

Her son, who was working at an engineering apprenticeship, left home on May 29, 2015, a day etched in her memory.

“That particular morning I missed him,” she said. “He used to come down and give me a quick kiss and go out the door, but that morning I was a little bit late getting up and missed him.”

The Benyahia family did not know where he was, or if he was dead or alive, until weeks later when he sent a message from Raqqa, a city in northern Syria, where the ultra-hardline Sunni Muslim Islamic State runs training camps and directs operations.


The family corresponded with him sporadically by text and telephone in the months that followed.

WARNING SIGNS

That ended with a telephone call saying Rasheed Benyahia had been killed in a drone strike on Nov. 10 last year on the border of Syria and Iraq.

Before her son left, Benyahia said she saw no signs that could have predicted his fatal move.

But now in hindsight, she said she sees the warning signs and hopes her insight and experience will help families in similarly precarious situations.

For example, her son had switched to a different mosque from the one his family attended, and he refused to cut his hair, she said.

He also asked her to shorten his trousers to above his ankle, which she now realizes is a style worn by some strict Muslims.

With Families for Life, Benyahia, a trained mental health counselor and therapist, also plans to work in prevention, such as speaking to school students.

But its most critical task may be helping families wrestling with feelings of shame, guilt and responsibility, she said.

Rasheed Benyahia had been convinced by someone – she still does not know who – that he was not a good Muslim if he did not join the jihadists, she said.

“He was vulnerable, and somebody swooped in,” she said.

While he was missing, she sought help from the Berlin-based German Institute on Radicalization and De-radicalization Studies (GIRDS) and Mothers for Life, a global network of women who have experienced violent jihadist radicalization in their families.

There was no such support in Birmingham, she said.

The city in central England, however, was the site of a bitter controversy two years ago over allegations of a hardline Muslim conspiracy to impose extreme cultural norms and values in some schools.


“When I speak to people and they realize I lost my son through this, they start opening up and start disclosing their concerns,” said Benyahia, who will join a panel next week on radicalization at Trust Women, an annual women’s rights and trafficking conference run by the Thomson Reuters Foundation.

“I’ve decided to fill in a gap that seems to be there.” (Thomson Reuters Foundation)

(Reporting by Ellen Wulfhorst, Editing by Belinda Goldsmith)


Facebook Asked to Take Down Auto-Generated Al-Qaida Pages

Facebook likes to say that its automated systems remove the vast majority of prohibited content glorifying the Islamic State group and al-Qaida before it's reported

Monika Bickert, head of global policy management at Facebook, joined at right by Nick Pickles, public policy director for Twitter, testifies before the Senate Commerce, Science and Transportation Committee, Sept. 18, 2019. VOA

In the face of criticism that Facebook is not doing enough to combat extremist messaging, the company likes to say that its automated systems remove the vast majority of prohibited content glorifying the Islamic State group and al-Qaida before it’s reported.

But a whistleblower’s complaint shows that Facebook itself has inadvertently provided the two extremist groups with a networking and recruitment tool by producing dozens of pages in their names.

The social networking company appears to have made little progress on the issue in the four months since The Associated Press detailed how pages that Facebook auto-generates for businesses are aiding Middle East extremists and white supremacists in the United States.

On Wednesday, U.S. senators on the Committee on Commerce, Science, and Transportation will be questioning representatives from social media companies, including Monika Bickert, who heads Facebook’s efforts to stem extremist messaging.

The new details come from an update of a complaint to the Securities and Exchange Commission that the National Whistleblower Center plans to file this week. The filing obtained by the AP identifies almost 200 auto-generated pages, some for businesses, others for schools or other categories, that directly reference the Islamic State group and dozens more representing al-Qaida and other known groups. One page listed as a “political ideology” is titled “I love Islamic state.” It features an IS logo inside the outlines of Facebook’s famous thumbs-up icon.

Facebook has been auto-generating pages for al-Qaida and other terror groups. Pixabay

In response to a request for comment, a Facebook spokesperson told the AP: “Our priority is detecting and removing content posted by people that violates our policy against dangerous individuals and organizations to stay ahead of bad actors. Auto-generated pages are not like normal Facebook pages as people can’t comment or post on them and we remove any that violate our policies. While we cannot catch every one, we remain vigilant in this effort.”

Facebook has a number of functions that auto-generate pages from content posted by users. The updated complaint scrutinizes one function that is meant to help business networking. It scrapes employment information from users’ pages to create pages for businesses. In this case, it may be helping the extremist groups because it allows users to like the pages, potentially providing a list of sympathizers for recruiters.

The new filing also found that users’ pages promoting extremist groups remain easy to find with simple searches using their names. Researchers uncovered one page for “Mohammed Atta” with an iconic photo of the al-Qaida adherent who was a hijacker in the Sept. 11 attacks. The page lists the user’s work as “Al Qaidah” and education as “University Master Bin Laden” and “School Terrorist Afghanistan.”

Facebook has been working to limit the spread of extremist material on its service, so far with mixed success. In March, it expanded its definition of prohibited content to include U.S. white nationalist and white separatist material as well as that from international extremist groups. It says it has banned 200 white supremacist organizations and 26 million pieces of content related to global extremist groups like IS and al-Qaida.

An Islamic State flag is captured in this photo illustration. VOA

It also expanded its definition of terrorism to include not just acts of violence intended to achieve a political or ideological aim, but also attempts at violence, especially when aimed at civilians with the intent to coerce and intimidate. It’s unclear, though, how well enforcement works if the company is still having trouble ridding its platform of well-known extremist organizations’ supporters.

But as the report shows, plenty of material slips through the cracks and gets auto-generated.

The AP story in May highlighted the auto-generation problem, but the new content identified in the report suggests that Facebook has not solved it.


The report also says that researchers found that many of the pages referenced in the AP report were removed more than six weeks later on June 25, the day before Bickert was questioned for another congressional hearing.

The issue was flagged in the initial SEC complaint filed by the center’s executive director, John Kostyack, that alleges the social media company has exaggerated its success combatting extremist messaging.

“Facebook would like us to believe that its magical algorithms are somehow scrubbing its website of extremist content,” Kostyack said. “Yet those very same algorithms are auto-generating pages with titles like ‘I Love Islamic State,’ which are ideal for terrorists to use for networking and recruiting.” (VOA)