Alarmed by the growth of hateful content on their platforms, Facebook and Instagram are building teams to investigate and address potential racial bias in their algorithms.
The Wall Street Journal reported that the social network will study its AI and machine-learning-driven systems to determine whether they promote racial bias.
“The racial justice movement is a moment of real significance for our company. Any bias in our systems and policies runs counter to providing a platform for everyone to express themselves,” Vishal Shah, Instagram’s VP of Product, said in a statement.
Instagram will create an “equity team” to examine its algorithms and the enforcement of its harassment policies; Facebook will establish a similar team.
“While we’re always working to create a more equitable experience, we are setting up additional efforts to continue this progress,” Shah was quoted as saying in the report on Wednesday.
Details about the teams will be announced in the coming months.
The move comes in the wake of the #StopHateforProfit campaign that has begun hurting Facebook financially as over 400 advertisers have pulled out or paused advertising on its platforms.
Leaders of US civil rights groups were left disappointed after meeting Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg this month over concerns about the spread of hateful content on the company’s platforms.
Sandberg, Zuckerberg and other Facebook executives met with the online racial justice group Color of Change, the Leadership Conference on Civil and Human Rights, the NAACP Legal Defense Fund and other organizations that launched the #StopHateforProfit campaign in June.
Several major brands, including Disney, Coca-Cola, Adidas, Walgreens and Starbucks, have pulled their ads from the social network, and Microsoft has suspended its advertising on Facebook and Instagram through August.
Facebook has reiterated that it has “more work to do” on curbing hate speech. (IANS)