New York: Ever wondered how Facebook determines what you see in your news feed every time you log in? The social networking site has devised a set of algorithms for the job.
Apart from signals like where you live and the pages you follow, it primarily looks for two broader ones — topics that are being mentioned a lot and topics that suddenly see an increase in mentions, Re/code reported.
For example, singer Justin Bieber is mentioned often on Facebook. So the total volume of mentions is always high and is not a good indicator of whether or not he is part of a trending topic.
So Facebook would look for a spike in mentions relative to the normal chatter around Bieber.
“This means that things that trend are not just the most highly mentioned people or topics, they have to be tied to some kind of relevant event,” the report said.
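The spike-versus-baseline idea described above can be sketched as a simple heuristic. This is a hypothetical illustration of the general technique, not Facebook's actual code; the function name, thresholds, and numbers are all invented for the example:

```python
def is_trending(recent_mentions, baseline_window, spike_factor=3.0, min_mentions=100):
    """Flag a topic as trending when its current mention count spikes
    well above its own historical baseline, rather than judging by raw
    volume alone (a perennially popular topic has a high baseline)."""
    if not baseline_window:
        return False
    baseline = sum(baseline_window) / len(baseline_window)
    # Require both an absolute floor and a relative spike over the baseline.
    return recent_mentions >= min_mentions and recent_mentions > spike_factor * baseline

# A celebrity mentioned constantly: huge volume, but no spike over baseline.
bieber_baseline = [10_000, 11_000, 9_500, 10_500]
print(is_trending(10_800, bieber_baseline))   # False: volume alone is not a trend

# A quieter topic whose mentions suddenly jump relative to its own baseline.
event_baseline = [120, 90, 110, 100]
print(is_trending(900, event_baseline))       # True: spike tied to some event
```

The key design point mirrors the Bieber example: the comparison is against each topic's own baseline, so a high-volume topic does not trend unless something unusual happens.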
Once a topic is identified as trending, it is approved by a human reviewer, who also writes a short description for the story.
These people don’t get to pick what Facebook adds to the trending section.
“That’s done automatically by the algorithm. They just get to pick the headline,” the report added.
In the wake of the Christchurch attack, New Zealand said on Wednesday that it would work with France in an effort to stop social media from being used to promote terrorism and violent extremism.
Prime Minister Jacinda Ardern said in a statement that she will co-chair a meeting with French President Emmanuel Macron in Paris on May 15 that will seek to have world leaders and CEOs of tech companies agree to a pledge, called the Christchurch Call, to eliminate terrorist and violent extremist content online.
A lone gunman killed 50 people at two mosques in Christchurch on March 15, while livestreaming the massacre on Facebook.
Brenton Tarrant, 28, a suspected white supremacist, has been charged with 50 counts of murder for the mass shooting.
“It’s critical that technology platforms like Facebook are not perverted as a tool for terrorism, and instead become part of a global solution to countering extremism,” Ardern said in the statement.
“This meeting presents an opportunity for an act of unity between governments and the tech companies,” she added.
The meeting will be held alongside the Tech for Humanity meeting of G7 digital ministers, of which France is the chair, and France’s separate Tech for Good summit, both on May 15, the statement said.
Ardern said at a press conference later on Wednesday that she has spoken with executives from a number of tech firms, including Facebook, Twitter, Microsoft, Google and a few others.
“The response I’ve received has been positive. No tech company, just like no government, would like to see violent extremism and terrorism online,” Ardern said at the media briefing, adding that she had also spoken with Facebook’s Mark Zuckerberg directly on the topic.
A Facebook spokesman said the company looks forward to collaborating with government, industry and safety experts on a clear framework of rules.
“We’re evaluating how we can best support this effort and who among top Facebook executives will attend,” the spokesman said in a statement sent by email.
Facebook, the world’s largest social network with 2.7 billion users, has faced criticism since the Christchurch attack that it failed to tackle extremism.
One of the main groups representing Muslims in France has said it was suing Facebook and YouTube, a unit of Alphabet’s Google, accusing them of inciting violence by allowing the streaming of the Christchurch massacre on their platforms.
Facebook Chief Operating Officer Sheryl Sandberg said last month that the company was looking to place restrictions on who can go live on its platform based on certain criteria. (VOA)