Show and Tell is an Alexa feature designed to help blind and partially sighted people, and it has just launched in the UK. It uses the Echo Show's camera to help identify common household grocery items.

To start, just say to your Echo Show, "Alexa, what am I holding?" or "Alexa, what's in my hand?" The idea is that it will help identify items that are hard to distinguish by touch, such as canned or boxed foods. The feature is part of the ever-evolving Alexa Accessibility Hub.

Amazon originally developed the feature with the Vista Center for the Blind and Visually Impaired in California. To get started, hold the item around 30cm from the camera and give the command. Alexa will then prompt you to turn the item around so all sides of the packaging are shown.

Commenting on the new feature, Robin Spinks, senior innovation manager at the Royal National Institute of Blind People (RNIB), says: "Computer vision and artificial intelligence are game-changers in tech that are increasingly being used to help blind and partially sighted people identify everyday products. Amazon's Show and Tell uses these features to great effect, helping blind and partially sighted people quickly identify items with ease.

"For example, using Show and Tell in my kitchen has allowed me to easily and independently differentiate between the jars, tins and packets in my cupboards."

Show and Tell was previously available only in the US. All versions of the Echo Show are supported.