If you've ever opened the wonderful glossy pages of Vogue or scrolled through Instagram and thought a lipstick was a fantastic colour, you'll also know that trying to find the shade in MAC or at a makeup counter in a department store is no easy feat.

First world problems, we know. But this is one problem that technology could fix. At least it is if you like Chanel lipstick. The Parisian fashion brand's Lipscanner app uses a combination of patented AI and AR technology to enable you to match a lipstick colour and texture from any physical or digital image and try the Chanel equivalent on in real time.

The app isn't the first time beauty and technology have come together - and it won't be the last - but it's a glimpse into what the future of beauty looks like.

We spoke to Cédric Begon, director of the Connected Experience Lab at Chanel Fragrances and Beauty, about the Lipscanner app and where technology could take the beauty industry.

"Machine learning experts in same room as makeup experts"

Chanel has over 400 lip products in various finishes and all of them are part of the Lipscanner app's catalogue. The app doesn't just recognise the colour you scan, though; it identifies the texture and finish too. It should therefore be able to detect whether the look you've scanned and want to match is matt, satin or gloss, and then suggest the equivalent Chanel lip product options.
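To picture the catalogue-matching idea, here's a minimal sketch in Swift of how a scanned colour and finish might be compared against a product list. Everything here is hypothetical - the product names, shades and matching logic are illustrative only, not Chanel's actual data or Lipscanner's patented algorithm.

```swift
import Foundation

// Hypothetical catalogue entry: the names, shades and finishes are
// illustrative only, not Chanel's actual product data.
struct LipProduct {
    let name: String
    let finish: String                          // e.g. "matt", "satin" or "gloss"
    let rgb: (r: Double, g: Double, b: Double)
}

// Squared distance between two RGB colours. A production matcher would more
// likely compare in a perceptual colour space such as CIELAB.
func colourDistance(_ a: (r: Double, g: Double, b: Double),
                    _ b: (r: Double, g: Double, b: Double)) -> Double {
    let dr = a.r - b.r
    let dg = a.g - b.g
    let db = a.b - b.b
    return dr * dr + dg * dg + db * db
}

// Return the closest product in the requested finish, falling back to any
// finish if nothing in the catalogue has that finish.
func closestMatch(to scanned: (r: Double, g: Double, b: Double),
                  finish: String,
                  in catalogue: [LipProduct]) -> LipProduct? {
    let sameFinish = catalogue.filter { $0.finish == finish }
    let pool = sameFinish.isEmpty ? catalogue : sameFinish
    return pool.min { colourDistance($0.rgb, scanned) < colourDistance($1.rgb, scanned) }
}

// Example usage with made-up shades.
let catalogue = [
    LipProduct(name: "Shade A", finish: "matt",  rgb: (178, 34, 52)),
    LipProduct(name: "Shade B", finish: "satin", rgb: (200, 60, 90)),
    LipProduct(name: "Shade C", finish: "gloss", rgb: (225, 95, 120))
]
if let match = closestMatch(to: (190, 50, 70), finish: "satin", in: catalogue) {
    print("Closest match: \(match.name) (\(match.finish))")
}
```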

Cédric told us: "It's quite a significant challenge to identify colours and finish with a smartphone."

"We had our machine learning experts in the same room as the make-up experts and they worked jointly to craft the algorithm, and train it on tens of thousands of images."

"Made the choice to stay on the smartphone"

At the moment, the Chanel Lipscanner app is only available on iOS devices, though an Android version is expected in the "next few months" depending on market feedback, we were told. The app has also been designed for smartphones rather than tablets.

According to Cédric: "We made the choice to really stay on the smartphone because [the Lipscanner app] connects to an instinctive gesture that you want to use wherever you are and that's probably more suitable for smartphones."

You don't need the latest model of iPhone with the most up-to-date camera either. Older models will still be able to detect colours and finishes.

Cédric added: "The reason why we started with iPhones is that iOS and iPhones have a very well defined framework. So, we managed to run tests and validate the experience from iPhone 6S to the latest [models]."

"Don't collect a single piece of personal data"

For those concerned about privacy - and rightly so, given the Chanel Lipscanner app involves capturing and importing images - we were told the app doesn't collect any personal data. When you download the app, you don't need to sign up with an email address or provide any information, and none of your pictures stay in the app - only the scans and their results do.

Any images you take through the app will only go into your iPhone's Photos app.

Cédric said: "The design of the AI engine was something that we worked very hard to have run on the iPhone itself. The calculation takes place on the phone so we don't need to upload the images and that was a strong prerequisite for us.

"We don't collect a single piece of personal data. It's very important to us. [Lipscanner] works in full autonomy on its own on the phone."

"Potential of AI for beauty is amazing"

The Chanel Lipscanner app is excellent for those who love Chanel lipsticks and are looking for new shades, or who want to find the Chanel equivalent of a favourite shade from another brand. But what if you don't really know what might suit you?

Could the technology be developed to scan a person's face and recommend lipstick colours based on their skin tone, for example? Or go further still and match a foundation? Brands like Il Makiage use a quiz to find the perfect foundation match from their 40 available shades, but what if AI and AR could do this instead?

Cédric said: "The potential of AI for beauty is amazing. [Lipscanner] is the result of more than 18 months of efforts to get a good grasp on machine learning and learn how to collaborate intensely and jointly with make-up artists and experts.

"We would like to leverage this capability for other categories. Leverage tech where it's useful. It's fast, can be useful and it can be playful. The Lipscanner app is very compact, and it saves time. So that's something which we are very excited about and we hope we can we can further the experience with other categories." 

"Lipstick is a very powerful product"

We'd love to see the technology within the Lipscanner app develop into other beauty categories, such as foundations, and we'd definitely use a similar app for Chanel's nail polishes. There's probably a reason lipstick was the starting point though, just as it was for Pinterest's "Try On" feature, which offers something similar to Lipscanner but without the image scanning.

Cédric said: "There is something very joyful in trying makeup and particularly trying lipstick which is a very powerful product. You know by experience I'm sure, how fast you can change your identity, modify your femininity with a single line of lipstick. It's amazingly powerful.

"We hope [Lipscanner] is going to be interesting for all makeup lovers [not just Chanel fans] and encourage them to try Chanel lipstick. More than that, we hope and we think [Lipscanner] will trigger some desire to try the physical product because as you know, the sensory aspect is absolutely essential.

"We start the story in a sense with what we see on the screen. The ambition is to bring the product in the hands of the customer."

The Chanel Lipscanner app is available now on iOS. You can read how it works in our separate feature, but as a hint, it's super simple and easy to use, and as Coco Chanel herself said: "Simplicity is the keynote of all true elegance".