Facebook has TRUST ratings for users – but it won’t tell you your score
August 22, 2018 in News by RBN Staff
Source: The Sun
The social network is predicting your trustworthiness in a bid to fight fake news
By Sean Keach, Digital Technology and Science Editor
FACEBOOK is rating users based on how “trustworthy” it thinks they are.
Users receive a score on a scale from zero to one that reflects whether they have a good or bad reputation – but it’s completely hidden.
The rating system was revealed in a report by the Washington Post – and later confirmed by Facebook to The Sun – which says it’s in place to “help identify malicious actors”.
Facebook tracks your behaviour across its site and uses that info to assign you a rating.
Tessa Lyons, who heads up Facebook’s fight against fake news, said: “One of the signals we use is how people interact with articles.
“For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true.”
Earlier this year, Facebook admitted it was rolling out trust ratings for media outlets.
This involved ranking news websites based on the quality of the news they were reporting.
This rating would then be used to decide which posts should be promoted higher in users’ News Feeds.
User ratings are employed in a similar way – helping Facebook judge the quality of each user’s reports on posts.
According to Lyons, a user’s rating “isn’t meant to be an absolute indicator of a person’s credibility”.
Instead, it’s intended as a way of gauging how risky a user’s actions may be.
How does Facebook’s user rating system work?
Facebook told The Sun that this is how the system works…
- Facebook works to fight fake news by using machine learning systems
- These automated systems predict articles that its human fact-checkers should review
- Facebook developed a process that protects against people “indiscriminately flagging news as fake” in an attempt to game the system
- One of the indicators used in this process is how people report articles as false
- For instance, if someone previously gave Facebook feedback that an article was false, and then that article was confirmed false by a fact-checker, that person’s future feedback would be weighted more positively
- This is reflected in an invisible score or rating, which rises or falls depending on the accuracy of a person’s reports
- So if someone regularly reports news as false and that news is later rated true, their future reports will carry less weight than reports from users with a better track record (see the sketch after this list)
- Facebook says this is an effective way to fight misinformation
- Facebook says that people often report something as false because they disagree with a story, or are trying to target a particular publisher
- Such attempts to game the system are why Facebook can’t rely on user reports alone as a totally accurate indicator
- Facebook told The Sun that the rating is specific to its fake news team, and that there is no unified, credit-rating-style score used across the site
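To make that weighting concrete, here is a minimal sketch of how a reputation-weighted flagging system could work. Facebook has not published its algorithm, so everything below – the names Flagger, record_verdict and review_priority, the Laplace smoothing, and the example numbers – is an illustrative assumption, not Facebook’s actual code.

```python
# Illustrative sketch only: all names, constants and formulas here are
# assumptions; Facebook has not published how its rating actually works.
from dataclasses import dataclass


@dataclass
class Flagger:
    """One user's history of 'this article is false' reports."""
    confirmed: int = 0  # flags later confirmed false by a fact-checker
    total: int = 0      # all of the user's flags that received a verdict

    @property
    def reputation(self) -> float:
        # Laplace-smoothed accuracy in (0, 1): a new user starts near 0.5,
        # and the score only moves as fact-check verdicts accumulate.
        return (self.confirmed + 1) / (self.total + 2)


def record_verdict(user: Flagger, was_actually_false: bool) -> None:
    """Update a user's history once a fact-checker rules on an article they flagged."""
    user.total += 1
    if was_actually_false:
        user.confirmed += 1


def review_priority(flaggers: list[Flagger]) -> float:
    """Reputation-weighted sum of flags on one article: many reports from
    indiscriminate flaggers count for less than a few accurate ones."""
    return sum(user.reputation for user in flaggers)


careful = Flagger(confirmed=9, total=10)  # reputation ~ 0.83
spammer = Flagger(confirmed=1, total=20)  # reputation ~ 0.09
print(review_priority([careful]))      # ~ 0.83: one trusted report
print(review_priority([spammer] * 3))  # ~ 0.27: three noisy reports
```

Under this kind of scheme the score is exactly what Lyons describes: not a verdict on a person’s credibility, just a weight on how seriously their next report should be taken.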
A Facebook spokesperson told The Sun: “The idea that we have a centralised ‘reputation’ score for people that use Facebook is just plain wrong and the headline in the Washington Post is misleading.
“What we’re actually doing: we developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system.
“The reason we do this is to make sure that our fight against misinformation is as effective as possible.”
Online commentators are already comparing the system to China’s creepy “social credit” system.
The Chinese government analyses users’ social media habits and online shopping purchases, assigning citizens a score.
Jaywalking or skipping train fares can lower a citizen’s score.
This score is then used to determine whether people can take loans, and even travel on public transport.
Some citizens with very low ratings become “blacklisted”, making it impossible to book a flight, rent or buy property, or stay in a luxury hotel.
The system is currently being piloted, but will become mandatory in China by 2020.
Facebook’s own rating system is the latest drive in its bid to tackle fake news, a growing problem for the social network.
The site, which sees 2.23 billion users log on every single month, has become a hotbed for falsified news coverage.
Earlier this year, billionaire Facebook boss Mark Zuckerberg vowed to fight fake news.
“The world feels anxious and divided, and Facebook has a lot of work to do,” the 34-year-old Harvard drop-out explained.
Who is Mark Zuckerberg, the founder of Facebook?
Here’s what you need to know…
- Mark Zuckerberg is the chairman, CEO and co-founder of social networking giant Facebook
- Born in New York in 1984, Zuckerberg already had a “reputation as a programming prodigy” when he started college
- While at Harvard, Zuckerberg launched a site called Facemash, on which students ranked the attractiveness of their classmates
- Harvard shut the site down after its popularity crashed the university’s network, and Zuckerberg later apologised, saying it was “completely improper”
- The following term he began working on an early version of Facebook
- Zuckerberg launched the social network from his dorm room on February 4, 2004, with the help of fellow students
- The friends would end up embroiled in legal disputes as they challenged Zuckerberg for shares in the company
- Zuckerberg also faced action from Cameron and Tyler Winklevoss, as well as Divya Narendra, who claimed he had stolen their idea – the dispute was later dramatised in the film The Social Network
- The tech prodigy dropped out of Harvard to focus on Facebook, but received an honorary degree in 2017
- Speaking about the site to Wired magazine in 2010 he said: “The thing I really care about is the mission, making the world open”
- By 2012, Facebook had one billion users; by June 2017, it was reaching two billion users every month
Facebook has admitted that its site has been the subject of political fakery campaigns from Russia.
After initially denying it had been complacent, the social network admitted that more than 126 million US users had viewed some form of Russian propaganda.
A congressional hearing followed, with Facebook, Twitter, and Google in the dock.
And Facebook’s been grappling with the problem ever since.