Audi systems adjusted to mimic human behaviour

Derek Fung, CarAdvice.com.au

Audi’s self-driving semi-autonomous A7. Picture / Supplied

The latest iteration of Audi’s self-driving software has been programmed to be more like a human in its driving style.

Among the updates, Audi’s autonomous test vehicles will now drive more like people do — for example, the latest piloted driving software more clearly signals its intentions when changing lanes by activating the indicators and moving to the edge of the lane before attempting the manoeuvre.

When overtaking trucks and other large vehicles, Audi’s self-driving cars leave a much larger lateral gap than they do for smaller vehicles.
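That gap-selection behaviour can be pictured as a simple lookup keyed on vehicle class. The base margin and per-class offsets below are invented for this sketch; they are not Audi’s actual calibration.

```python
# Illustrative lateral-gap rule: the car gives large vehicles a wider berth.
# All values here are assumptions for the sketch, not Audi's numbers.
BASE_GAP_M = 0.9
EXTRA_GAP_M = {"car": 0.0, "motorcycle": 0.3, "truck": 0.6, "bus": 0.6}

def lateral_gap(vehicle_class: str) -> float:
    """Return the lateral clearance (metres) to hold while passing."""
    return BASE_GAP_M + EXTRA_GAP_M.get(vehicle_class, 0.3)
```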
The satellite navigation system is also able to compute a route that maximises time spent in self-driving mode.
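One way such a route computation could work is a shortest-path search in which human-driven stretches are penalised, so the planner prefers roads where piloted mode is available even at the cost of a slightly longer drive. The road graph, names, and penalty factor below are invented for illustration.

```python
import heapq

# Hypothetical road graph: edges are (neighbour, distance_km, piloted_capable).
# Road names and distances are made up for this sketch.
ROADS = {
    "Stanford": [("I-280", 10, False)],
    "I-280":    [("I-580", 60, True)],
    "I-580":    [("I-15", 600, True), ("US-95", 550, False)],
    "US-95":    [("Las Vegas", 200, True)],
    "I-15":     [("Las Vegas", 215, True)],
}

MANUAL_PENALTY = 3.0  # weight applied to km that must be human-driven

def best_route(start, goal):
    """Dijkstra over penalised distance; returns (cost, path)."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, km, piloted in ROADS.get(node, []):
            weight = km if piloted else km * MANUAL_PENALTY
            heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return float("inf"), []

cost, path = best_route("Stanford", "Las Vegas")
```

Here the search picks the I-15 branch, since the shorter US-95 leg would have to be driven by hand and is weighted accordingly.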

CarAdvice saw an earlier version of Audi’s self-driving car at CES 2015, where it drove itself most of the way from Silicon Valley, California, to Las Vegas, Nevada.

At the 2015 show, Daniel Lipinski, project leader for Audi’s automated driving program, talked about some of the ways his team had engineered Audi’s self-driving cars to feel more natural to those inside the car.

Compared to a regular A7, the semi-autonomous A7, nicknamed Jack, was equipped with extra sensors: a laser scanner embedded in both the front and rear bumpers, and mid-range radars at each corner.

The lasers monitor the road in front of and behind the car, while the mid-range radars keep track of what’s happening to the sides. Jack also uses the latest megapixel camera from the second-generation Q7, and several other devices have been upgraded to the latest available versions.

Unlike production vehicles, the network of sensors in Jack has been engineered to provide a level of redundancy. Lipinski says that “this ensures that even if one sensor fails [the car] can operate for the next 15 to 20 seconds, and give the driver enough time to take over”.
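A failover of that kind can be sketched as a watchdog that notices when a sensor goes silent and starts a handover countdown. The timeout values and status names below are assumptions for illustration, not Audi’s implementation.

```python
TAKEOVER_WINDOW_S = 15.0  # grace period drawn from the article's "15 to 20 seconds"
SENSOR_TIMEOUT_S = 0.2    # assumed: a sensor is "failed" after 200 ms of silence

class RedundancyMonitor:
    """Toy watchdog: degrade gracefully when a sensor stops reporting."""

    def __init__(self, sensors):
        self.last_seen = {s: 0.0 for s in sensors}
        self.failure_time = None

    def report(self, sensor, now):
        # Record the timestamp of the latest reading from this sensor.
        self.last_seen[sensor] = now

    def status(self, now):
        failed = [s for s, t in self.last_seen.items()
                  if now - t > SENSOR_TIMEOUT_S]
        if not failed:
            self.failure_time = None
            return "NOMINAL"
        if self.failure_time is None:
            self.failure_time = now  # start the takeover countdown
        remaining = TAKEOVER_WINDOW_S - (now - self.failure_time)
        return "HANDOVER_REQUIRED" if remaining <= 0 else "DEGRADED"
```

In the degraded state the car keeps driving on its remaining sensors; once the window expires, control must pass back to the human.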

All up, these perception units provide the car with a 360-degree view of its surroundings and data from them is fed into the car’s decision-making module.

Housed in computers in the boot, the decision-making engine knows where the car wants to go and what the car is dynamically capable of. It continuously determines whether to stay in the current lane, plan a trajectory around another vehicle, or pull into the slow lane.
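A toy version of that decision loop might look like the rule-based policy below. The thresholds and action names are assumptions for the sketch, not Audi’s logic.

```python
def choose_action(gap_ahead_m: float, left_lane_clear: bool,
                  in_overtaking_lane: bool) -> str:
    """Toy lane-decision policy; thresholds are invented for illustration."""
    if gap_ahead_m < 50 and left_lane_clear and not in_overtaking_lane:
        return "CHANGE_LEFT"   # plan a trajectory around the slower vehicle
    if in_overtaking_lane and gap_ahead_m > 150:
        return "RETURN_RIGHT"  # pull back into the slow lane once clear
    return "KEEP_LANE"
```

A real planner would score continuous trajectories rather than pick from three discrete actions, but the structure of the choice is the same.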

The last link in Jack’s chain of command was the execution module, which altered the steering angle and worked the brakes and accelerator.

Lipinski said that finding the overall route to CES was as simple as going to Google Maps, but to ensure that the 885km trip between Stanford and Las Vegas went smoothly, the team had to do a bit of extra planning.

As a Level 3 autonomous vehicle, the A7 Piloted Driving Concept was designed to handle only regular highway conditions; the driver was responsible for getting the car to the highway initially.

So, the first task was to mark all the stretches of the route that weren’t on the Interstate Highway System. The team then noted the number of lanes for each highway section and flagged all the human-driven parts of the route.
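That annotation pass could be as simple as tagging each segment by whether it sits on the Interstate. The segment names and the "I-" prefix test below are assumptions for illustration.

```python
# Hypothetical route export: road names and lane counts are made up.
route = [
    {"road": "Sand Hill Rd", "lanes": 2},
    {"road": "I-280",        "lanes": 4},
    {"road": "I-580",        "lanes": 4},
    {"road": "US-95",        "lanes": 2},
    {"road": "I-15",         "lanes": 3},
]

def annotate(segments):
    """Mark each segment as piloted-capable only if it is on the Interstate."""
    for seg in segments:
        seg["piloted"] = seg["road"].startswith("I-")
    return segments

annotated = annotate(route)
manual_parts = [s["road"] for s in annotated if not s["piloted"]]
```

The `manual_parts` list is the set of stretches flagged for the human driver, mirroring the team’s manual marking of the route.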

Lipinski claimed Jack was able to pilot itself for 97 per cent of the journey, and that was two years ago.
Audi also announced that it is working together with the government of its hometown Ingolstadt to develop and install infrastructure designed to help self-driving vehicles.

The company is collaborating with the government to upgrade sections of the A9 autobahn with new roadside posts that can be detected from further away, and new signs that allow vehicles to precisely determine their position within a lane.

-CarAdvice.com.au