Facebook announced this week that it is making a concerted effort to examine the potential racial bias in the algorithms that power much of its social media network.

It has set up a new team to study the issue and to prioritise fairness and equality as new products and features are developed, tested and deployed. However, a new report from NBC's Olivia Solon is calling the sincerity of that change into question.

It reveals that researchers within Facebook's own ranks were raising red flags about this issue as early as mid-2019. The researchers found that when auto-moderation tools assessed accounts reported for infractions, Black users' accounts were 50% more likely to be automatically disabled than white users' accounts.

That's a huge disparity, and the researchers' concern was compounded when their superiors apparently sought to quash the results and keep them secret, rather than escalating the findings and acting on them. The report alleges that this was part of a pattern in how similar research was handled.

Facebook's defence has largely been to call into question the methods used in these research efforts, and now to set up its own official taskforce on the matter.

However, the report adds more grist to the mill for those who feel the social media giant has simply been too slow to acknowledge the issue, and has allowed too much harm to occur before acting.