(Pocket-lint) - With more and more smartphones launching with a multi-lens camera system, we're taking a look at where this has all come from and romp through the history of dual, triple and (gasp) quad lens smartphone cameras.
Dual lenses on smartphones aren't new: a number of models have offered a range of unique features using this camera setup, going as far back as 2011 in formats you'll recognise - not forgetting the Samsung B710, which offered a dual lens back in 2007! (Thanks for that tip, Leo.)
Getting a phone with a single lens might now be a rarity, but follow us as we walk you through key moments in smartphone multi-lens camera systems of the past, present ... and into the future with the Galaxy S10, Nokia 9, Huawei P30 Pro and others.
LG Optimus 3D and HTC Evo 3D: Another dimension
In 2011 3D was a thing. The world's TV manufacturers were lining up 3D TV sets, there were 3D films being produced and we were being told that 3D was the next big thing (again).
Both these smartphones (and there were some others) used dual lenses to take 3D video and 3D photos. They used the same technique as regular 3D cameras, with the two lenses creating a sense of depth in images. This was boosted with a 3D display to view those images, no glasses required.
But 3D was just a passing phase, and although we could capture 3D, ultimately, that was only the start of the story for dual lens cameras.
HTC One M8: Making sense
It was the HTC One M8 that really introduced dual lens cameras to the world and saw HTC trying to do something different. The HTC One M8 was launched in April 2014 and used two sensors in the same way that modern smartphone cameras do.
With a 4-megapixel UltraPixel main image sensor and a secondary 2-megapixel sensor capturing extra data, the dual lens camera was used, like 3D, to create a sense of depth in photos. The idea was that the second lens could create a depth map and feed it into the final image.
That meant you could create bokeh/background blur effects, you could refocus the image with a tap and you could easily manipulate photos, keeping the subject sharp and changing the backgrounds, even after you'd taken the photo.
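The depth-map idea is simple enough to sketch. Here's a minimal, hypothetical Python illustration (not HTC's actual pipeline): pixels the depth map marks as close to the camera stay sharp, while everything behind a chosen distance threshold gets blurred.

```python
import numpy as np

def box_blur(img, k=5):
    """Naive box blur: average each pixel over a k x k neighbourhood (edge-padded)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fake_bokeh(image, depth, threshold):
    """Keep pixels nearer than `threshold` sharp; blur everything behind them."""
    blurred = box_blur(image)
    foreground = depth < threshold  # True where the subject is
    return np.where(foreground, image, blurred)

# Toy 8x8 scene: bright subject on the left, dark background on the right.
image = np.zeros((8, 8)); image[:, :4] = 100.0
depth = np.full((8, 8), 5.0); depth[:, :4] = 1.0  # subject ~1m, background ~5m
result = fake_bokeh(image, depth, threshold=2.0)
```

Because the mask comes from depth rather than from the image itself, the same trick lets you "refocus" after the fact by simply changing the threshold.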
The One M8 was clever, but the camera wasn't that impressive. The effects were rather gimmicky and the benefits of having a dual camera didn't really make an impact - even if the full metal body did.
There are still plenty of devices that have a second lens for "depth" and nothing else - but that's often seen as a method of getting background blurring on portrait shots.
HTC might have started this whole second lens thing, but it was about two years ahead of the rest of the pack - and it was 2016 that really saw the industry change.
LG G5: Going wide
Step forward a few years and LG announced the LG G5 in February 2016. There were two interesting things about it. Firstly, it attempted to integrate modular accessories - which was a flop - and secondly, LG equipped it with dual cameras, making it one of the first dual camera phones to launch in 2016.
There was a main 16-megapixel sensor and a second 8-megapixel sensor. Rather than combining information to create effects, the second lens was ultra wide-angle.
With a 135-degree lens on the rear for that 8-megapixel camera, the LG G5 could shoot wide-angle photos to great effect. You could simply switch from one camera to the other - perfect for tight spots or landscapes - and get the chance to create something you can't do with software.
LG added the wide-angle to the V20 and subsequent models in the G and V series, but it wasn't until the Huawei Mate 20 triple camera that we saw big moves in wide-angle from other manufacturers. That's all changing in 2019, as everyone realises that wide-angle is a creatively sound proposition.
Huawei P9: Leica's monochrome mark
In April 2016 Huawei launched the P9 in partnership with Leica, with two cameras sitting on the back. Huawei's big selling point wasn't about depth sensing or wide-angle, it was about monochrome and it was the start of some influential work in multi-camera systems from Huawei.
Leveraging Leica's classic monochrome skills, the Huawei P9 presented two cameras on the rear: one lens captured RGB colour, the second captured monochrome detail. This resulted in some great black and white photos, but working together, the P9 also attempted to combine information from both sensors to make all your photos better - and, generally speaking, it all seemed to work well.
Honor used the same system in a number of devices - without the Leica branding - adding a monochrome sensor on the Honor 8 and subsequent devices, until we hit the Honor View 20. It's not just Huawei and Honor - Nokia adopted the same system on the Nokia 8, but with Zeiss lenses.
Apple iPhone 7 Plus: A play to zoom
As 2016 continued, one of the big launches was the Apple iPhone 7 Plus with two cameras on the rear, both 12-megapixels, but offering different focal lengths. The first camera had a 23mm-equivalent lens, while the second was 56mm, and we entered the realms of telephoto on phones.
The idea was to let you zoom without losing as much quality. Switch to the 56mm camera to get closer, and any digital zooming then starts from that closer position, so the loss in quality is lessened compared to a regular single-camera smartphone. Apple wanted to address what it saw as a significant problem with smartphone photography and came up with a solution that matched user behaviour.
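The maths behind that is straightforward. As a rough sketch (hypothetical numbers, not Apple's processing): digitally zooming is cropping, and cropping to a longer equivalent focal length keeps only (base/target)² of the sensor's pixels, so starting from the 56mm camera rather than the 23mm one leaves far more resolution at the same framing.

```python
def digital_zoom_pixels(sensor_mp, base_focal_mm, target_focal_mm):
    """Megapixels left after cropping from base_focal to a longer equivalent focal length."""
    if target_focal_mm <= base_focal_mm:
        return sensor_mp  # no crop needed
    crop = base_focal_mm / target_focal_mm
    return sensor_mp * crop * crop

# Hypothetical 100mm-equivalent shot from a 12-megapixel sensor:
wide_only = digital_zoom_pixels(12, 23, 100)   # cropping from the 23mm camera
from_tele = digital_zoom_pixels(12, 56, 100)   # cropping from the 56mm camera
```

Under these assumed figures, the crop from the 56mm camera retains roughly six times as many pixels as the crop from the wide camera, which is the whole point of the second lens.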
Apple also played HTC's game by offering bokeh effects thanks to a depth map drawn from both lenses.
Since the launch of the iPhone 7 Plus, Apple has continued to offer zoom on its phones and many others have adopted a zoomed lens too - in 2017 OnePlus added it to the OnePlus 5, for example, and Samsung debuted it on its first dual camera phone, the Note 8, a system it has continued with since.
Huawei P20 and Mate 20 Pro: Three is the magic number
When the Huawei P20 Pro was announced in early 2018, everything was poured into the camera, with a new triple camera system. This added a zoom lens to the existing system of RGB and monochrome sensors, but there was a lot more happening with AI - and the birth of an impressive Night Mode.
The Huawei P20 Pro was a great success, a camera that justified its excesses with results and proved the critics wrong. It seemed to do everything.
What was a little suspicious, however, was the evolution in the Huawei Mate 20 later in 2018. Again using a triple camera system, Huawei switched it up, dropping the monochrome sensor and swapping in a wide-angle lens instead, effectively turning its back on the previous two years of marketing. The results, though, give very little to complain about, adding that desirable wide-angle with seemingly no quality downside for losing that monochrome lens - so did it ever actually do anything?
Samsung also offered a three-lens camera in the Samsung Galaxy A7 in 2018, but opted for regular, wide-angle and a dubious third for "depth information" and nothing else. Oppo graced the R17 Pro with three cameras, but perhaps more confusingly, offered a main camera, a depth camera and a final (disabled) time of flight camera, which at some point in the future should do 3D scanning - but we'll get to that in a second.
Three is likely to be popular in 2019: the Xiaomi Mi 9 will have three cameras, and there are rumours of the iPhone adopting three cameras too - as well as the suggestion that the Samsung Galaxy S10 will have three cameras.
Galaxy A9: Samsung shoots four the stars
Samsung likes "world firsts" and having lost out to Huawei on the triple camera front and been fairly slow to adopt dual camera systems, the Samsung Galaxy A9 strode out with four cameras on the back in 2018.
In this arrangement, you get the culmination of all the approaches since 2014 - depth, wide-angle, zoom - and a normal camera. (We still think the "depth" camera is a little dubious, as most phones can derive that information from one of the other lenses, but ho-hum, it's a four camera system.)
The obvious downside is the space it takes up on the back of the phone - but rumours have suggested we'll see four cameras on the Samsung Galaxy S10 X and it has been confirmed that it will be on the Huawei P30 Pro too.
Nokia 9's got five on it
Moving into the realms of rumours, there's long been talk that Nokia is going to put five cameras on the rear of the Nokia 9. That phone might be launched at Mobile World Congress 2019 and start a new chapter in smartphone cameras.
We've seen the Nokia penta-lens arrangement a number of times through leaks, but what's not clear at the moment is how it will actually work. Will this be an amalgamation of all the previous camera options, or will it be something totally unique, gathering light through all the lenses and combining data? Hopefully, we'll know real soon.
And finally: Time of Flight hits AR notes
With all the different cameras going on, there's growth in a new direction: something called Time of Flight. We mentioned that it was on the Oppo R17 Pro - but not doing anything yet - and it's also on the Honor View 20. Time of Flight seems to be in place to provide more depth information, but plays its part in 3D scene mapping, the sort of thing you'll use for augmented reality (AR) applications.
So is this just a throwback all the way to 2014 and the HTC One M8? Almost. Time of Flight uses infrared to measure ranges, but can do it at 160fps, basically 3D scanning what it can see. LG has already confirmed that the LG G8 ThinQ will be getting a Time of Flight camera and there's muttering across the industry that more will be using this technology.
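The principle is back-of-envelope physics: the sensor times an infrared pulse's round trip, and distance is half that time multiplied by the speed of light. A tiny sketch (illustrative only, not any vendor's implementation):

```python
# Speed of light in metres per second.
C = 299_792_458.0

def tof_distance_m(round_trip_seconds):
    """Distance implied by an infrared pulse's round-trip time (out and back)."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~10 nanoseconds bounced off something ~1.5m away.
d = tof_distance_m(10e-9)
```

Doing this per pixel, many times a second, is what turns a Time of Flight sensor into a live 3D map of the scene.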
But why? Well, LG Innotek, which manufactures the hardware (amongst others), says that these cameras can be used in lots of different ways - for biometrics, virtual reality and AR - while being slim and power efficient, so great for smartphone applications.
It's not the first time we've seen 3D scanning or AR cameras on a phone. Google Tango was a whole project dedicated to exactly that, with phones like the Asus Zenfone AR and the Lenovo Phab 2 Pro. The project has now closed, but it played its part in laying the foundations for an AR future.