Facebook has revealed that it rates how trustworthy you are, and it's doing this to help combat the spread of fake news on its platform.

It sounds like an episode of Black Mirror, right? But the social network said it started scoring its users’ trustworthiness in order to better determine if a story might be false after a user reports it. Here's what you need to know about your Facebook trustworthiness score.

What is the Facebook trustworthiness score?

The Washington Post reported, with confirmation from Facebook, that Facebook developed a new tool as part of its ongoing effort to stop disinformation. This tool, or system, consists of trust ratings that were put in place over the past year. Facebook partially relies on reports from users to help identify fake news stories. When enough people report a story as false, Facebook's fact-checking team checks it out.

But looking into every story that's reported as “fake news” is a daunting task, so Facebook uses other information, including trust ratings, to help it determine whether it should even look into the report. When speaking to The Washington Post, Facebook didn’t detail everything that goes into a user's so-called trustworthiness score, but it did admit that it considers a user’s track record with reporting stories as false.

Currently, it's unclear if the trustworthiness score is being used for purposes other than reports on news stories.

How does Facebook determine a user's score?

If someone often reports stories as false, and Facebook's fact-checking team then determines those stories are indeed false, the user's trustworthiness score will essentially rise. On the other hand, if a person continually reports stories as false when they are in fact true, their score will go down. “People often report things that they just disagree with,” Facebook’s product manager told The Washington Post.
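Facebook hasn't published how the score is actually calculated, but the behaviour it describes, rising when your reports match the fact-checkers' verdicts and falling when they don't, can be pictured with a rough sketch. Everything below (the class name, starting value and step sizes) is an illustrative assumption, not Facebook's formula:

```python
# Purely illustrative sketch of the behaviour described above.
# Facebook has not published its actual formula; the class name,
# starting value and step sizes here are assumptions.

class ReporterTrust:
    def __init__(self, score: float = 0.5):
        # Hypothetical score kept between 0 and 1, echoing the
        # zero-to-1 scale The Washington Post described.
        self.score = score

    def record_report(self, report_said_false: bool, fact_check_said_false: bool) -> None:
        """Nudge the score up when a report matches the fact-checkers'
        verdict, and down when it doesn't."""
        if report_said_false == fact_check_said_false:
            self.score = min(1.0, self.score + 0.05)   # accurate report
        else:
            self.score = max(0.0, self.score - 0.05)   # inaccurate report


reporter = ReporterTrust()
reporter.record_report(report_said_false=True, fact_check_said_false=True)
print(round(reporter.score, 2))  # 0.55 in this toy example
```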

How to see your Facebook trustworthiness score

A user's trustworthiness score is strictly an internal rating that only Facebook employees see. If someone regularly and wrongly flags news stories as false, they'll get a low rating, which later helps Facebook's fact-checking team decide whether to even review their flagged stories in the first place. The whole idea is that Facebook fact-checkers don't want to waste time investigating stories that are obviously true.
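To picture how such a rating might feed into that triage, here's a minimal sketch assuming a simple review queue sorted by reporter score. The threshold and field names are invented for illustration; Facebook hasn't disclosed how its queue actually works:

```python
# Minimal triage sketch: prioritise flagged stories whose reporters have
# historically accurate track records. All thresholds and field names are
# hypothetical.

flagged_stories = [
    {"story": "Story A", "reporter_score": 0.9},
    {"story": "Story B", "reporter_score": 0.2},
    {"story": "Story C", "reporter_score": 0.7},
]

REVIEW_THRESHOLD = 0.5  # invented cut-off for this example

# Review the most credible reports first, and deprioritise the least credible.
for flag in sorted(flagged_stories, key=lambda f: f["reporter_score"], reverse=True):
    if flag["reporter_score"] >= REVIEW_THRESHOLD:
        print(f"Send {flag['story']} to fact-checkers")
    else:
        print(f"Deprioritise {flag['story']}")
```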

In other words, you can't see your score. Facebook has also pushed back on The Washington Post's reporting, which described a user's score as both a "trustworthiness score" and a "reputation score" that "ranges on a scale from zero to 1". It attempted to clarify things to Gizmodo:

“The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong and the headline in The Washington Post is misleading. What we’re actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system. The reason we do this is to make sure that our fight against misinformation is as effective as possible.”

What is N.E.Q.?

In the last year or so, Facebook has begun relying more heavily on another type of trustworthiness marker, except this time it’s exclusively for news organisations. The feature is dubbed N.E.Q., which stands for "news ecosystem quality", and it works to measure the legitimacy of news organisations in an effort to separate big names like the BBC and the New York Times from smaller, less substantiated sources.

Facebook has long rated the journalistic quality of news sources. However, following the tumultuous 2020 US presidential election, CEO Mark Zuckerberg decided to increase the weight the News Feed algorithm gives to the N.E.Q. score when deciding which stories appear in your feed. As a result, "untrustworthy" news sources have a harder time making your News Feed than stories from more trusted, established outlets. All of these measures come in the name of trying to help combat the spread of disinformation.
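Conceptually, increasing the weight given to N.E.Q. just means a source-quality term counts for more in the ranking calculation. Here's a toy sketch of that idea; the weights, scores and formula are invented for illustration, and Facebook's real News Feed ranking model is far more complex and not public:

```python
# Toy illustration of weighting a feed-ranking score by source quality.
# The weights, scores and formula are invented; Facebook's actual
# News Feed ranking model is not public.

def rank_score(engagement: float, neq: float, neq_weight: float) -> float:
    """Blend predicted engagement with a source-quality (N.E.Q.-style) score."""
    return (1 - neq_weight) * engagement + neq_weight * neq

story_from_established_outlet = rank_score(engagement=0.6, neq=0.9, neq_weight=0.5)
story_from_fringe_site = rank_score(engagement=0.8, neq=0.2, neq_weight=0.5)

# With a heavier weight on N.E.Q., the fringe site's engagement advantage
# no longer wins out.
print(story_from_established_outlet > story_from_fringe_site)  # True
```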

As with personal Facebook trustworthiness scores, news trustworthiness rankings are hidden from public view and are only available to a small, very specific group of Facebook employees.
