Wednesday July 18, 2018

Facebook rolls out fresh changes with new signals to ‘better identify and rank authentic content’

Facebook considers signals like your proximity to the person or page posting, or likes, comments and shares to rank content


New York, Feb 1, 2017: In a bid to display more relevant stories on its News Feed, Facebook has rolled out fresh changes with new signals to ‘better identify and rank authentic content’.

The changes will also have a new real-time prediction algorithm to spot stories that might be relevant to you faster.


According to a report in Next Web on Wednesday, Facebook’s new signals tap one of its core values — authentic communication — to bring stories to your News Feed that have a higher chance of resonating, and not those considered “misleading, sensational, or spammy”.

Facebook considers signals like your proximity to the person or page posting, or likes, comments and shares to rank content.
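As a purely illustrative sketch (not Facebook’s actual algorithm), ranking by a weighted combination of such signals could look like this; the signal names and weights here are assumptions invented for the example:

```python
# Hypothetical illustration of signal-based ranking (not Facebook's real
# code): each post gets a score from weighted engagement signals, and the
# feed is sorted by that score, highest first.

def rank_posts(posts, weights=None):
    """Sort posts by a weighted sum of their signals, highest first."""
    if weights is None:
        # Weights are arbitrary choices for this sketch.
        weights = {"proximity": 3.0, "likes": 1.0, "comments": 2.0, "shares": 2.5}

    def score(post):
        # Missing signals default to 0 so partial posts still rank.
        return sum(weights[k] * post.get(k, 0) for k in weights)

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "a", "proximity": 0.2, "likes": 50, "comments": 3, "shares": 1},
    {"id": "b", "proximity": 0.9, "likes": 10, "comments": 8, "shares": 4},
]
ranked = rank_posts(posts)
```

In this toy example, post “a” outranks post “b” because its raw like count outweighs “b”’s higher proximity and share signals under the assumed weights.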


To do this, “Facebook first attempts to identify pages known for posting spam or trying to game the algorithm through means it deems inappropriate, like asking for likes, shares, or comments. This data is then used to train a model to continually identify these types of posts in an attempt to keep them out of your News Feed,” the report said.
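As a rough sketch of the idea (the report describes a trained model; a simple rule-based filter stands in for it here, and the bait phrases are invented for illustration):

```python
# Hypothetical sketch, not Facebook's actual pipeline: flag posts that
# explicitly ask for likes, shares, or comments ("engagement bait") and
# keep them out of the feed.

BAIT_PHRASES = ("like this", "share this", "comment below")  # assumed phrases

def looks_like_engagement_bait(text):
    """Return True if the post text asks for likes, shares, or comments."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in BAIT_PHRASES)

def filter_feed(posts):
    """Keep only posts that are not flagged as engagement bait."""
    return [p for p in posts if not looks_like_engagement_bait(p["text"])]
```

A production system would replace the phrase list with a classifier trained on posts from pages known for this behavior, as the report describes.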

When users hide certain posts, that signals the content is not relevant to them; content deemed authentic, by contrast, will appear higher in the News Feed.

Facebook is also trying to be faster at spotting authentic content and making it appear on the user’s News Feed.


The update also tracks how these signals change in real time.

“For example, if an article from The Washington Post (a page you subscribe to) is generating a lot of buzz, the algorithm will deem this important and place it higher in your feed, quicker,” the report added. (IANS)

 

Copyright 2017 NewsGram


Facebook Accused of Protecting Far-Right Activists Who Broke the Site’s Rules

Moderators at Facebook are protecting far-right activists, preventing their Pages from being deleted even after they violate the rules set up by the social media giant, the media reported.


The process, called “shielded review”, was uncovered by Channel 4’s Dispatches – a documentary series that sent an undercover reporter to work as a content moderator at a Dublin-based Facebook contractor.

“In the documentary, a moderator tells the ‘Dispatches’ reporter that Britain First’s pages were left up, even though they repeatedly broke Facebook’s rules, because ‘they have a lot of followers so they’re generating a lot of revenue for Facebook’,” the Guardian reported on Tuesday.

Similarly, popular pages, including those of activists like Tommy Robinson, are protected from Facebook rules.

Robinson is currently in jail, serving a 13-month sentence for contempt of court.

Richard Allan, Facebook’s Head of Public Policy, was quoted as saying in the documentary that the company’s rules are based on revenue.

“If the content is indeed violating it will go,” Allan said.

Facebook, however, said it will remove Robinson’s page if he repeatedly violates the site’s community standards. Britain First’s Facebook page was eventually banned in March 2018.

“It’s clear that some of what is shown in the programme does not reflect Facebook’s policies or values, and falls short of the high standards we expect.


“We take these mistakes in some of our training processes and enforcement incredibly seriously and are grateful to the journalists who brought them to our attention,” Allan said.

The documentary also showed that Facebook moderators have turned a blind eye to under-age accounts.

“Moderators are told they can only take action to close down the account of a child who clearly looks 10 years old if the child actually admits in posts that they are under age,” The Telegraph reported, citing the documentary.

“We have to have an admission that the person is under-age. If not, we just pretend that we are blind and we don’t know what underage looks like,” a trainer told the undercover reporter.

Facebook is also facing flak for launching Messenger Kids, an app that encourages children under 13 to join social media.

British Health Secretary Jeremy Hunt in December warned the social media giant to stay away from his children.

Also read: Facebook Joins Skill India Mission to Train, Empower Youth

Earlier this year, more than 100 child health experts urged Facebook to withdraw the app.

Despite the experts’ calls for its withdrawal, Facebook has decided to expand the reach of Messenger Kids, introducing the video calling and messaging app designed for children under 13 to families in Canada and Peru.

Facebook said it will also introduce Spanish and French versions of the app. (IANS)