The government on Tuesday called for an all-party meeting to discuss a proposal to give NRIs the right to exercise their franchise through e-postal ballots or proxy voting.
Law Minister DV Sadananda Gowda acknowledged the demand of Opposition parties in the Rajya Sabha that their views be taken into consideration while enacting legislation to grant voting rights to NRIs and domestic migrant labourers.
Leader of Opposition Ghulam Nabi Azad moved a motion in the House on the issue. Responding to the motion, Gowda said the government was acting on the Election Commission's report on the voting rights of over one crore NRIs, not on any direction from the Supreme Court.
The debate heated up as Opposition members accused the government of acting against the Supreme Court's directions. Azad said the Opposition was not against voting rights for NRIs, but objected to the way the government had moved the proposal without even discussing it with the political parties.
“Today government has said goodbye to all consultation process. Parliament is being made aware of developments from newspaper reports. It has become the habit of the government to bypass the Parliament and the standing committees. When we object, we are being criticised,” said Azad.
Gowda stated that the report advocated the options of an e-postal ballot system and proxy voting. “E-postal and postal ballot voting methodologies are being worked out for the voting rights of the NRIs,” he said.
The minister also agreed to an all-party meeting, as insisted upon by the DMK, BJD, CPI, AIADMK, SP and JD-U. Gowda said, “I (will) request Election Commission to hold all party meeting to get their feedback.”
As U.S. voters prepare to head to the polls Tuesday, the election will also be a referendum on Facebook.
In recent months, the social networking giant has beefed up scrutiny of what is posted on its site, looking for fake accounts, misinformation and hate speech, while encouraging people to go on Facebook to express their views.
“A lot of the work of content moderation for us begins with our company mission, which is to build community and bring the world closer together,” Peter Stern, who works on product policy stakeholder engagement at Facebook, said at a recent event at St. John’s University in New York City.
Facebook wants people to feel safe when they visit the site, Stern said. To that end, it is on track to hire 20,000 people to tackle safety and security on the platform.
As part of its stepped-up effort, Facebook works with third-party fact-checkers and takes down misinformation that contributes to violence, according to a blog post by Mark Zuckerberg, Facebook’s CEO.
But the most popular content, often dubbed “viral,” is frequently the most extreme. Facebook devalues posts it deems incorrect, reducing their virality, or future views, by 80 percent, Zuckerberg said.
Recently, Facebook removed accounts followed by more than 1 million people that it said were linked to Iran but made to look as though they had been created by people in the U.S. Some focused on the upcoming midterm elections.
The firm also removed hundreds of American accounts that it said were spamming political misinformation.
Still, Facebook is criticized for what at times appear to be flaws in its processes.
Vice News recently posed as all 100 U.S. senators and bought fake political ads on the site. Facebook approved them all and later said it had made a mistake.
Politicians in Britain and Canada have asked Zuckerberg to testify on Facebook’s role in spreading disinformation.
“I think they are really struggling and that’s not surprising, because it’s a very hard problem,” said Daphne Keller, who used to be on Google’s legal team and is now with Stanford University.
“If you think about it, they get millions, billions of new posts a day, most of them some factual claim or sentiment that nobody has ever posted before, so to go through these and figure out which are misinformation, which are false, which are intending to affect an electoral outcome, that is a huge challenge,” Keller said. “There isn’t a human team that can do that in the world, there isn’t a machine that can do that in the world.”
While it has been purging its site of accounts that violate its policies, the company has also revealed more about how decisions are made in removing posts. In a 27-page document, Facebook described in detail what content it removes and why, and updated its appeals process.
Stern, of Facebook, supports the company’s efforts at transparency.
“Having a system that people view as legitimate and basically fair even when they don’t agree with any individual decision that we’ve made is extremely important,” he said.