Facebook has revealed that it rates how trustworthy its users are, and it's doing so to help combat the spread of fake news on its platform.

It sounds like an episode of Black Mirror, right? But the social network said it started scoring its users’ trustworthiness in order to better determine if a story might be false after a user reports it. Here's what you need to know about your Facebook trustworthiness score.

What is the Facebook trustworthiness score?

The Washington Post reported, and Facebook confirmed, that the company developed a new tool as part of its ongoing effort to stop disinformation: a system of trust ratings, put into place over the last year. Facebook partially relies on user reports to help identify fake news stories; when enough people report a story as false, Facebook's fact-checking team investigates it.

But looking into every story that's reported as “fake news” is a daunting task, so Facebook uses other information, including trust ratings, to help it decide whether a report is worth investigating at all. Speaking to The Washington Post, Facebook didn't detail everything that goes into a user's so-called trustworthiness score, but it did admit that it considers a user's track record of reporting stories as false.

Currently, it's unclear if the trustworthiness score is being used for purposes other than reports on news stories.

How does Facebook determine a user's score?

If someone often reports stories as false, and Facebook's fact-checking team then determines those stories are indeed false, that user's trustworthiness score will essentially rise. On the other hand, if a person continually reports stories as false when they are in fact true, their score will go down. “People often report things that they just disagree with,” Facebook's product manager told The Washington Post.

How to see your Facebook trustworthiness score

A user's trustworthiness score is strictly an internal rating that only Facebook employees see. If someone regularly and wrongly flags news stories as false, they'll get a low rating, which later helps Facebook's fact-checking team decide whether to even review their flagged stories in the first place. The whole idea is that Facebook fact-checkers don't want to waste time investigating stories that are obviously true.

In other words, you can't see your score. Facebook has also pushed back on The Washington Post's reporting, which described a user's score as both a "trustworthiness score" and a "reputation score" that "ranges on a scale from zero to 1". It attempted to clarify things to Gizmodo:

“The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong and the headline in The Washington Post is misleading. What we’re actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system. The reason we do this is to make sure that our fight against misinformation is as effective as possible.”
