Nestled in Silicon Valley, an innocuous office unit houses the BMW Group Technology Office. Like so many technology companies, it sits off Route 101 in Mountain View, neighbouring one of the biggest names in town - Google.

It's no accident that BMW has an office here in California. Not only does the State of California allow the testing of autonomous vehicles on the road, but many of the technology partners that BMW might look to work with are also located in California.

The aim of the Technology Office is to research and develop technologies before passing them back to Munich for wider roll-out into BMW's future cars - one such example is the development of wireless Apple CarPlay, which made its debut on the BMW 5 Series in 2016.

With Klaus Fröhlich, member of the BMW Board of Management responsible for development, stating on a number of occasions that BMW's autonomous technology will be ready by 2021, there's a lot going on in this particular office.

How is BMW developing its autonomous driving technology? 

Grant Mahler, PhD, is the engineer leading the R&D programme in BMW's California office, and he introduces us to one of the test vehicles. It's a BMW 7 Series fitted with eight cameras, five lidar systems and other sensors, with the boot packed with hardware to capture and process the information that the car collects.

The lidar and camera systems mean that the car can see the environment it is travelling through. It can identify and track the motion of other vehicles and map the 3D world around it, so it knows where it is, where it's going and where everything else is going too.
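To give a flavour of what that tracking and mapping involves, here's a deliberately simple sketch in Python of folding individual camera or lidar detections into tracked objects with an estimated velocity. It's our own illustration - the class names, fields and finite-difference velocity estimate are assumptions made for readability, not BMW's software.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object picked out of a single camera or lidar frame (hypothetical format)."""
    x: float          # metres ahead of the car
    y: float          # metres to the left (+) or right (-) of the car
    timestamp: float  # seconds

@dataclass
class TrackedObject:
    """An object the car follows across frames, with an estimated velocity."""
    x: float
    y: float
    last_seen: float
    vx: float = 0.0
    vy: float = 0.0

    def update(self, det: Detection) -> None:
        """Fold a new detection into the track using a naive finite-difference velocity."""
        dt = det.timestamp - self.last_seen
        if dt > 0:
            self.vx = (det.x - self.x) / dt
            self.vy = (det.y - self.y) / dt
        self.x, self.y, self.last_seen = det.x, det.y, det.timestamp

    def predict(self, horizon: float) -> tuple[float, float]:
        """Where the object would be in `horizon` seconds at its current velocity."""
        return self.x + self.vx * horizon, self.y + self.vy * horizon
```

A production system fuses all of those sensors with far more sophisticated filtering, but the principle - detect, track, predict - is the same.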

Not only does the car have all the wiring and sensors that the regular 7 Series would offer, but it has the full rig for BMW iNext too, because one of the things that characterises iNext is autonomous driving.

We spy a full x86 PC wired into the boot, along with a tonne of other hardware. The back seats don't get a screen for entertainment; instead there's a monitor for engineers to see what's happening in real time.

That's all there for development, as BMW has already shown off what it expects the final in-car module to look like, packing in the hardware from Intel and Mobileye, in a water-cooled unit.

There are some 80 engineering test cars around the globe, gathering data on the roads and learning about driving in different locations. With so many sensors gathering data, it comes in at "a couple of terabytes per hour per car". But it's data that's essential for developing an autonomous driving policy and it's data that informs the algorithm that will ultimately see the car making decisions.
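To put that figure in perspective, a back-of-the-envelope calculation shows how quickly it adds up. The driving hours per day here are our assumption, not a BMW figure:

```python
cars = 80                 # engineering test cars worldwide
tb_per_hour_per_car = 2   # "a couple of terabytes per hour per car"
hours_per_day = 6         # assumed time on the road per car per day

daily_tb = cars * tb_per_hour_per_car * hours_per_day
print(f"~{daily_tb} TB per day across the fleet")   # ~960 TB
print(f"~{daily_tb / 1000:.2f} PB per day")         # ~0.96 PB
```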

The human and AI interaction 

Much of what BMW is doing to develop its autonomous driving policy starts in simulation. You can't just have sentient cars careening all over the place learning by trial and error, so much of the work comes from a simulator. That gives BMW the opportunity to see what the car does when left to its own devices.
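In broad strokes, that means running the driving policy in a closed loop against a virtual world and logging what it chooses to do. The sketch below is purely illustrative - the simulator and policy objects are invented for this example and don't reflect BMW's actual tooling:

```python
# Hypothetical closed-loop simulation of a driving policy.
# `simulator` and `policy` are stand-ins, not real BMW software.

def run_episode(simulator, policy, max_steps=1000):
    """Let the driving policy control a virtual car and record what it does."""
    observation = simulator.reset()          # drop the car into a virtual scene
    log = []
    for _ in range(max_steps):
        action = policy.decide(observation)  # steer / brake / accelerate
        observation, done = simulator.step(action)
        log.append((observation, action))    # kept so engineers can review the run
        if done:                             # collision, goal reached or timeout
            break
    return log
```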

Actually driving the car is the easy part: many of those technologies are well established and available through existing systems like adaptive cruise control, lane guidance or automatic parking. But at level 3 autonomy the car becomes "highly autonomous", in that it can basically do everything for you, which means negotiating things like junctions, overtaking and changing lanes safely.

For these jobs, the car needs to monitor all the other vehicles around it and be able to make predictions about what those vehicles will do. Watching the data captured from the car, you can see the potential routes that other vehicles could take displayed and changing in real time as the car negotiates a junction. That's something the human driver does by scanning the mirrors and developing a sense, through experience, of how a particular driver might behave, before deciding what action to take.
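A toy version of that prediction step might enumerate a few plausible manoeuvres for each tracked vehicle and project them forward in time. Again, this is our own simplified sketch - real systems weigh many more hypotheses against learned behaviour models:

```python
import math

def predict_routes(x, y, speed, heading, horizon=3.0, turn_rate=0.3):
    """Project a vehicle forward under three simple hypotheses:
    keep straight, drift left or drift right, all at constant speed."""
    routes = {}
    for label, omega in (("straight", 0.0), ("left", turn_rate), ("right", -turn_rate)):
        px, py, h = x, y, heading
        path = []
        for _ in range(int(horizon * 10)):      # simulate in 0.1 s steps
            h += omega * 0.1
            px += speed * math.cos(h) * 0.1
            py += speed * math.sin(h) * 0.1
            path.append((px, py))
        routes[label] = path
    return routes

# e.g. another car approaching a junction at 8 m/s, heading straight ahead (0 rad)
candidates = predict_routes(x=0.0, y=0.0, speed=8.0, heading=0.0)
```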

This is where the data collected from those test cars becomes so important. Putting a human driver into the car in everyday situations means that "normal" driving behaviour can be analysed and compared to what the AI would do in the same conditions. Once there's alignment between the human driver and the AI simulation you're closer to a driving policy that will see autonomous cars behaving in a "normal" manner - and that forms the basis for the self-driving algorithm.
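Conceptually, checking that alignment boils down to measuring how far the AI's choices diverge from what the human driver actually did in the same situations. A minimal, hypothetical way to express it:

```python
def policy_divergence(human_log, policy):
    """Average gap between the human driver's recorded actions and what the
    driving policy would have done given the same observations.
    `human_log` is a list of (observation, human_action) pairs; actions are
    treated as simple numbers (e.g. a steering angle) purely for illustration."""
    gaps = []
    for observation, human_action in human_log:
        ai_action = policy.decide(observation)
        gaps.append(abs(ai_action - human_action))
    return sum(gaps) / len(gaps) if gaps else 0.0
```

The closer that number gets to zero across enough recorded driving, the more "normal" the policy's behaviour looks.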

And autonomous cars will have to make decisions on the road. While there's a system of rules for driving on the road - and computers love rules - other drivers very often do unpredictable things, like pulling an illegal U-turn or suddenly changing speed or direction with no warning. It's these challenges that autonomous cars will have to deal with to be able to drive safely.

Machine learning means that cars could, potentially, learn for themselves - based on driver behaviour - but there's always the risk that a car could learn bad behaviours. For this reason, data has to be processed centrally by BMW before an algorithm change gets pushed out to the fleet, and this is how autonomous cars will get smarter, as behaviours and responses get increasingly refined.
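In practice, that central gatekeeping might look like a validation step between retraining and deployment: a new version of the policy only goes out to the fleet if it still passes a fixed suite of checks. The sketch below is hypothetical - the scenario objects and the pass criterion are assumptions, not BMW's process:

```python
def release_policy_update(candidate_policy, regression_scenarios, baseline_pass_rate):
    """Hypothetical gate: only push a retrained driving policy to the fleet if it
    passes at least as many fixed test scenarios as the current version does."""
    passed = sum(1 for scenario in regression_scenarios
                 if scenario.run(candidate_policy).is_safe)
    pass_rate = passed / len(regression_scenarios)
    if pass_rate >= baseline_pass_rate:
        return True    # safe to roll out over the air to the test fleet
    return False       # back to central analysis - nothing is learned "in the wild"
```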

So if you spot test cars out on the road, they aren't necessarily driving autonomously and waiting for human intervention; instead they're capturing the human driver's data, learning from the decisions that are being made.

Although the group sits in California, it's acutely aware that driving conditions aren't the same everywhere. The four-way stop is characteristic of the US, while the roundabout is more common on European roads, for example. With test vehicles in a number of different locations, BMW is developing a solution that's aware of driving differences across the globe.

Launching in 2021 with BMW iNext - or a Fiat?

Level 2 autonomy is getting rather common. Tesla's Autopilot is typical of level 2, taking some of the load off the driver, although legally you still have to be in control of the vehicle at all times. Lesser known is Nissan's ProPilot - probably because it doesn't grab headlines in the same way that Tesla does.

Level 3 autonomy is something that's very close. The SAE - which defines what these different levels mean - classes level 3 as conditional automation, often described as "eyes-off" driving: you still have to be in the driving seat, but not necessarily paying attention all the time.

When talking to us about these systems, BMW made the interesting point that there's a lot of research and development going on across the industry - perhaps some 30+ systems at the moment - but there's a likelihood that not all will survive.

That's one of the reasons that BMW is open to the idea of collaboration and partnerships. Currently, FCA (Fiat Chrysler Automobiles) is the only vehicle partner that BMW has (although that covers a lot of brands, from Fiat through to Maserati).

What that potentially means is that BMW's research won't just be limited to the luxury cars it's known for, but might one day be taking the load off the driver in a Fiat 500. BMW is open to wider collaboration, but there have yet to be any further announcements.

What we know for certain is that BMW's launch plan for level 3 autonomy is through iNext, and it has repeatedly said that it will be ready in 2021. Whether we'll see it introduced through one new vehicle or across the whole group remains to be seen, but before we can let these cars drive themselves, local regulations will need to be established.

While we're confident that the technology will all be in place within the next few years, whether the legal framework will be is a totally different question.