(Pocket-lint) - Google has criticised other companies for selling facial recognition tech - and says it "has long been committed to the responsible development of AI (artificial intelligence)".
Google stopped short of saying it would never sell the tech, but says there are a lot of issues to sort out first - and that it's working with others to do so.
The comments were posted on an official Google blog by Kent Walker, Google's senior vice president of global affairs - the executive who helps shape Google's policies on issues like privacy and security.
Walker doesn't mince words, saying that "unlike some other companies, Google Cloud has chosen not to offer general-purpose facial recognition [for use by others] before working through important technology and policy questions."
It's pretty clear Walker is commenting on the work of Amazon and Microsoft. Employees of both companies have previously protested against those companies' partnerships with organisations such as law enforcement and customs agencies in the US. In particular, Amazon's Rekognition software has come under intense criticism from its own employees, shareholders and civil liberties groups.
Amazon's head of AI Matt Wood commented back in June that: "There has been no reported law enforcement abuse of Amazon Rekognition... There have always been and will always be risks with new technology capabilities. Each organization choosing to employ technology must act responsibly or risk legal penalties and public condemnation."
For Google's part, Walker adds: "Like many technologies with multiple uses, facial recognition merits careful consideration to ensure its use is aligned with our principles and values and avoids abuse and harmful outcomes.
"We continue to work with many organisations to identify and address these challenges."
We're sure to hear a lot more about the ethics of facial recognition over the next few years as it becomes more integrated into the everyday fabric of society. Expect plenty more stories like this one, as well as debates over the rights and wrongs of using facial recognition in public places - such as the recent revelation that Taylor Swift used the tech to identify potential stalkers at her concerts.