It sounds as though Apple's Siri may introduce room mapping to improve the accuracy of its responses.
The move would be similar to the technology already used in some speakers, including Apple's own HomePod, to work out their position within a room and adjust the sound accordingly.
Apple has been working with academics from Carnegie Mellon University's Human-Computer Interaction Institute and the group has published a paper on how listening to the surroundings could improve the effectiveness of the voice assistant on a HomePod.
One of the next innovations in devices like the HomePod, Google Home and Amazon Echo could be directing audio to people within the room - towards the sofa you're sitting on, say, or the group of you at the dining table - and patents have already been filed to this effect.
That's not actually a big leap from where these devices are presently.
But the research goes further, suggesting that smart speakers with integrated microphones could recognise where ambient sounds - such as a doorbell - are coming from and alert the user when that happens. This could be especially helpful for the hard of hearing, for example.
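The paper itself isn't quoted here, but working out where a sound came from with multiple microphones is classically done by measuring the tiny time difference between when the sound reaches each microphone. Below is a minimal, illustrative sketch of that idea - time difference of arrival estimated via cross-correlation - with a simulated two-microphone array. The sample rate, microphone spacing and signal values are all made up for the example and aren't taken from Apple's research.

```python
import numpy as np

def estimate_delay(mic_a, mic_b):
    """Estimate how many samples later a sound reached mic_b than mic_a,
    by finding the peak of the cross-correlation between the two signals."""
    corr = np.correlate(mic_b, mic_a, mode="full")
    return int(np.argmax(corr)) - (len(mic_a) - 1)

# Synthetic example: a short noise burst (a "doorbell") that arrives
# 5 samples later at the second microphone.
rng = np.random.default_rng(0)
burst = rng.standard_normal(256)
mic_a = np.concatenate([burst, np.zeros(16)])
mic_b = np.concatenate([np.zeros(5), burst, np.zeros(11)])

delay = estimate_delay(mic_a, mic_b)  # → 5

# Converting the delay to a rough bearing: with an assumed 48 kHz sample
# rate, 10 cm microphone spacing and 343 m/s speed of sound,
# angle = arcsin(delay / fs * c / spacing).
fs, spacing, c = 48_000, 0.10, 343.0
angle_deg = np.degrees(np.arcsin(np.clip(delay / fs * c / spacing, -1.0, 1.0)))
```

A real device would do this continuously across many microphones and combine the bearings over time, but the cross-correlation peak above is the core trick that tells a speaker which direction a doorbell or knock came from.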