Apple is reportedly planning to beef up the camera unit on the back of its next iPhone, with a 3D depth-sensing lens said to be the tool of choice.
Fast Company reports that this new sensor will allow for improved augmented reality performance, as well as better photo and video effects - you'd imagine it would further refine the iPhone's already impressive Portrait mode.
The camera unit apparently uses a laser sensor, in a similar way to the iPhone's front-facing array, to scan the scene in front of it, and Apple reportedly plans to source the system from the same manufacturer that supplies that existing laser.
Apple would be catching up, in some ways, to the likes of Samsung's Galaxy S20 Ultra and other S20 models, which already have depth sensors on their rear cameras - although those phones are so new that it's hardly surprising they're ahead of the game.
Of course, if augmented reality is the biggest likely winner from the improved depth sensing, it does raise the question of how many people will actually take advantage of that benefit.
While front-facing augmented reality effects, such as selfie filters, are popular in messaging apps and among younger users, there's less demand for main-camera AR, and AR games are similarly limited in their appeal.
Regardless, according to Fast Company's source, Apple currently plans to implement the new laser system in at least one of its next wave of iPhones, so we could see it unveiled later this year - unless it ends up on the cutting room floor between now and then.