
Life Under Islamic State (ISIS) Terrorist Group: Soldier Hussien Hamza’s Story of Poison Bomb

Private Hussien Hamza of the Iraqi Army's 73rd Brigade tells VOA what it was like to be attacked with a homemade chemical weapon as Iraqi forces drove IS out of one of its last strongholds. VOA

– by Heather Murdock

For months, soldiers, civilians and aid workers in Iraq have been reporting Islamic State attacks using homemade chemical weapons. On June 4 in Mosul, militants fired a poisoned mortar round at Iraqi forces in a failed attempt to rescue other IS fighters trapped in the battle.

Thirty-year-old Private Hussien Hamza of the Iraqi Army’s 73rd Brigade told VOA about his experience in the battle. He spoke in Arabic.


The operation began at 3:30 a.m. We moved quickly on foot, keeping low and running through the neighborhood. The roads in this part of Mosul are too narrow for Humvees, so we stayed as close to the walls of the houses as possible for cover.

Islamic State militants mostly retreated, but they fired at us as they fled. They were mainly using grenades and mortars, but also plenty of machine guns and sniper rifles. We saw militants prepare to detonate a car bomb and fired at them as we ran. We defused the bomb.

Iraqi forces move towards the battle in Mosul, June 4, just hours before IS threw a homemade chemical weapon at 73rd Brigade Army soldiers on the frontlines. VOA

By early afternoon, we had moved nearly a kilometer in, and seven militants were trapped in a house just behind our new frontline. About 10 of us went in, and we fought for hours using hand grenades in close quarters.

The militants wore all black. It looked like traditional Afghan or Pakistani clothes, but they didn’t appear to be Afghan or Pakistani. They cursed us when they saw us. We cursed them back. That really pissed them off.


By 5 p.m., we had killed all seven of them. One of them looked Chinese.

We hadn’t stopped for food or water since the operation began, but we weren’t tired.

Other militants were still firing at the house, trying to rescue their team. A missile hit a car outside the house and set it on fire. Three mortars dropped, and poison gas blew into our faces.

After Iraqi forces force IS out of an area, homes and cars are often destroyed and IS mortars continue to fall in Mosul, June 4, 2017. VOA

Within seconds I was blinded by the tears streaming down my face and liquid poured out of my nose. White bubbles came from my mouth. I could hear other soldiers coughing.

Some of the men grabbed me and two other soldiers and brought us to a field clinic. Medics said our cases were too advanced to be treated there and they transferred us to a hospital, where they told us we were hit with a chemical attack. I passed out.

After two days of oxygen and injections, we were released back to the frontlines, but my eyes still hurt and my body is shaking. They say I need to see an eye specialist.

The militants are cowards in the way they fight. They know it’s their last days. They know we will end them. (VOA)



Facebook Asked to Take Down Auto-Generated Al-Qaida Pages

Facebook likes to say that its automated systems remove the vast majority of prohibited content glorifying the Islamic State group and al-Qaida before it's reported

Monika Bickert, head of global policy management at Facebook, joined at right by Nick Pickles, public policy director for Twitter, testifies before the Senate Commerce, Science and Transportation Committee, Sept. 18, 2019. VOA

In the face of criticism that Facebook is not doing enough to combat extremist messaging, the company likes to say that its automated systems remove the vast majority of prohibited content glorifying the Islamic State group and al-Qaida before it’s reported.

But a whistleblower’s complaint shows that Facebook itself has inadvertently provided the two extremist groups with a networking and recruitment tool by producing dozens of pages in their names.

The social networking company appears to have made little progress on the issue in the four months since The Associated Press detailed how pages that Facebook auto-generates for businesses are aiding Middle East extremists and white supremacists in the United States.

On Wednesday, U.S. senators on the Committee on Commerce, Science, and Transportation will be questioning representatives from social media companies, including Monika Bickert, who heads Facebook's efforts to stem extremist messaging.

The new details come from an update of a complaint to the Securities and Exchange Commission that the National Whistleblower Center plans to file this week. The filing obtained by the AP identifies almost 200 auto-generated pages, some for businesses, others for schools or other categories, that directly reference the Islamic State group and dozens more representing al-Qaida and other known groups. One page listed as a “political ideology” is titled “I love Islamic state.” It features an IS logo inside the outlines of Facebook’s famous thumbs-up icon.

Facebook has been auto-generating pages for al-Qaida and other terror groups. Pixabay

In response to a request for comment, a Facebook spokesperson told the AP: “Our priority is detecting and removing content posted by people that violates our policy against dangerous individuals and organizations to stay ahead of bad actors. Auto-generated pages are not like normal Facebook pages as people can’t comment or post on them and we remove any that violate our policies. While we cannot catch every one, we remain vigilant in this effort.”

Facebook has a number of functions that auto-generate pages from content posted by users. The updated complaint scrutinizes one function that is meant to help business networking. It scrapes employment information from users’ pages to create pages for businesses. In this case, it may be helping the extremist groups because it allows users to like the pages, potentially providing a list of sympathizers for recruiters.

The new filing also found that users’ pages promoting extremist groups remain easy to find with simple searches using their names. Researchers uncovered one page for “Mohammed Atta” with an iconic photo of the al-Qaida adherent who was a hijacker in the Sept. 11 attacks. The page lists the user’s work as “Al Qaidah” and education as “University Master Bin Laden” and “School Terrorist Afghanistan.”

Facebook has been working to limit the spread of extremist material on its service, so far with mixed success. In March, it expanded its definition of prohibited content to include U.S. white nationalist and white separatist material as well as that from international extremist groups. It says it has banned 200 white supremacist organizations and 26 million pieces of content related to global extremist groups like IS and al-Qaida.

An Islamic State flag is captured in this photo illustration. VOA

It also expanded its definition of terrorism to include not just acts of violence intended to achieve a political or ideological aim, but also attempts at violence, especially when aimed at civilians with the intent to coerce and intimidate. It’s unclear, though, how well enforcement works if the company is still having trouble ridding its platform of well-known extremist organizations’ supporters.

But as the report shows, plenty of material slips through the cracks and gets auto-generated.

The AP story in May highlighted the auto-generation problem, but the new content identified in the report suggests that Facebook has not solved it.


The report also says that researchers found that many of the pages referenced in the AP report were removed more than six weeks later on June 25, the day before Bickert was questioned for another congressional hearing.

The issue was flagged in the initial SEC complaint filed by the center’s executive director, John Kostyack, which alleges the social media company has exaggerated its success combating extremist messaging.

“Facebook would like us to believe that its magical algorithms are somehow scrubbing its website of extremist content,” Kostyack said. “Yet those very same algorithms are auto-generating pages with titles like ‘I Love Islamic State,’ which are ideal for terrorists to use for networking and recruiting.” (VOA)