“We create the ‘eyes’ that will help self-driving cars work”
The function you work with helps autonomous cars drive themselves. How does that work?
“A car that drives itself needs to make decisions. To do that it needs accurate and robust information. My function – sensor aggregation – supplies just that. Think of it as two parts of a brain. One part scans the environment and passes this data on to the other part of the brain, which then decides what to do and when, for example, when to execute a lane change.”
What kind of information do you feed this other half of the ‘brain’?
“We give the car environmental descriptions – a 360-degree view of what is around us. So, where exactly are the other road users? What kind of road user are they: a car, a bus, a truck? How fast are they going and in which direction? And, just as importantly, we also need to tell the decision maker exactly where the road is, and where we are in relation to it.”
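To make the idea of an “environmental description” concrete, here is a minimal sketch of the kind of data structure such a description could take. All names and fields are illustrative assumptions for this article, not Volvo’s actual data model.

```python
from dataclasses import dataclass, field

# Hypothetical types for illustration only -- not a real production interface.
@dataclass
class RoadUser:
    kind: str              # "car", "bus", "truck", ...
    position: tuple        # (x, y) in metres, relative to our own car
    speed: float           # metres per second
    heading: float         # direction of travel, radians

@dataclass
class EnvironmentDescription:
    road_users: list = field(default_factory=list)  # the 360-degree view
    lane_offset: float = 0.0       # where we are relative to the lane centre, metres
    road_curvature: float = 0.0    # geometry of the road ahead, 1/metres

# One frame handed to the decision-making half of the "brain":
scene = EnvironmentDescription(
    road_users=[RoadUser("truck", (40.0, 3.5), 22.0, 0.0)],
    lane_offset=0.1,
    road_curvature=0.002,
)
```

The decision maker would consume a fresh `EnvironmentDescription` many times per second and, from it, choose manoeuvres such as a lane change.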
So you use GPS?
“It’s not that simple. We need accuracy to be within 10cm. That’s very challenging, but it is essential for safety. So we use our sensor data and compare it to special, incredibly detailed high-definition maps. And that’s how we know where we are.”
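The map-matching idea described here can be sketched very simply: start from a coarse satellite-positioning estimate, observe known landmarks with the car’s sensors, and shift the estimate so the observations line up with the high-definition map. The function below is a toy illustration of that principle under assumed names and numbers, not the production localisation algorithm.

```python
def refine_position(gps_estimate, observed, hd_map):
    """Refine a coarse GPS position by aligning landmark observations
    with a high-definition map.

    gps_estimate: coarse (x, y) of the car from GPS.
    observed:     landmark positions as measured relative to the car.
    hd_map:       the same landmarks' true (x, y) from the HD map.
    Returns a corrected (x, y) for the car.
    """
    # Where each landmark would sit if the GPS estimate were exact.
    predicted = [(gps_estimate[0] + dx, gps_estimate[1] + dy)
                 for dx, dy in observed]
    # Average the discrepancy between the map and our prediction...
    ex = sum(m[0] - p[0] for m, p in zip(hd_map, predicted)) / len(hd_map)
    ey = sum(m[1] - p[1] for m, p in zip(hd_map, predicted)) / len(hd_map)
    # ...and shift the GPS estimate by that offset.
    return (gps_estimate[0] + ex, gps_estimate[1] + ey)

# GPS says we are at (100.0, 50.0); landmark matching pulls us ~1.2 m along x.
pos = refine_position(
    gps_estimate=(100.0, 50.0),
    observed=[(10.0, 0.0), (0.0, 8.0)],      # landmarks seen by the sensors
    hd_map=[(111.2, 50.0), (101.2, 58.0)],   # same landmarks in the HD map
)
```

A real system would match many landmarks statistically and fuse the result over time; the point of the sketch is only that the map, not GPS alone, supplies the final centimetre-level answer.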
What kind of sensors are you working with?
“We have come a long way since the first Adaptive Cruise Control system, which used just one sensor – a radar. In our current prototypes, we have more than 20 sensors all over the car. That’s cameras, radars and laser scanners (also known as LIDARs). They’re all very good at slightly different things. My job is to aggregate all the data they produce and continuously provide a robust picture. We can’t afford to miss anything, so we set up sensors in such a way that they cover for each other’s weaknesses. Some are great for detecting speed, others better at making out shapes, some are sensitive to light conditions while others aren’t and so on.”
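The “cover for each other’s weaknesses” idea can be illustrated with one simple fusion scheme: weight each sensor’s contribution to each attribute by how reliable that sensor is for it. The weights and sensor characteristics below are assumptions for the sake of the example; this is one textbook approach, not Volvo’s actual aggregation algorithm.

```python
def fuse_detections(detections):
    """Fuse several sensors' detections of the same object.

    Each detection is a dict with 'position' (x, y), 'speed', and
    per-attribute confidences 'pos_conf' and 'speed_conf' in [0, 1].
    Returns a confidence-weighted average for each attribute, so the
    sensor that is best at an attribute dominates that attribute.
    """
    wp = sum(d["pos_conf"] for d in detections)
    ws = sum(d["speed_conf"] for d in detections)
    x = sum(d["position"][0] * d["pos_conf"] for d in detections) / wp
    y = sum(d["position"][1] * d["pos_conf"] for d in detections) / wp
    v = sum(d["speed"] * d["speed_conf"] for d in detections) / ws
    return {"position": (x, y), "speed": v}

fused = fuse_detections([
    # Camera: precise position and shape, weak speed estimate.
    {"position": (50.0, 2.0), "speed": 20.0, "pos_conf": 0.9, "speed_conf": 0.2},
    # Radar: coarser position, excellent speed measurement.
    {"position": (52.0, 2.4), "speed": 24.0, "pos_conf": 0.3, "speed_conf": 0.9},
])
```

Here the fused position stays close to the camera’s value while the fused speed leans towards the radar’s, which is exactly the complementarity the interview describes.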
It sounds full of complexities – which will only increase when installing it in a production vehicle, surely?
“Yes. It’s relatively easy to build a system that works in a demo, but creating a system that works very well all the time is much harder. And not only that, we also want to be able to hide the sensors so we can design a good-looking car.”
Is this something you’re testing as part of the Drive Me pilot project in Sweden?
“Indeed. The Drive Me trial allows us to expose the system to real-life traffic scenarios. This kind of development is very data intensive. We have already fed huge amounts of data into it, and Drive Me will give us valuable data of a new quality.”
Does that mean self-driving cars are just around the corner?
“Not exactly. Before we put a fully autonomous car into production we need to prove that it is safe. We will verify our AD technology within the Drive Me trial, starting next year, and take it from there. Our aim is to have a commercial offer around 2020. But, as I said, first we need to prove that it is safe. I like to think of my two kids sitting on the back seat while I am using this system. I always ask myself: is this really safe to be released? Only when we are all absolutely confident that this system is robust and meets the most demanding safety standards will we see this technology in a Volvo.”