Cars aren’t clever enough to go into full hands-off mode yet, as some drivers have fatally discovered. Picture: REUTERS

Fully autonomous cars are still some time away, but that hasn’t stopped some drunk drivers from treating their partially self-driving cars as designated drivers.

Netherlands police last week nabbed a Tesla driving closely behind a slow-moving truck on a mostly empty freeway. When they pulled alongside, they saw the driver was asleep at the wheel while the car’s Autopilot system did the driving.

The police woke the driver with their sirens and found he was drunk.

This isn’t the first time cops have pulled over a sleeping, drunk driver in a Tesla. Last November it took police in California, US, 11km to stop one; in that case, police had to overtake the vehicle and slow down in front of it to force the car to stop.

Like other cars with semiautonomous driving features such as active cruise control and self-steering in certain conditions, the Tesla is designed to prevent drivers from handing over full driving control to the car.

Such cars are meant to warn their drivers to resume driving control with visual and audible alerts, but Tesla’s fail-safes don’t always seem to work.

This raises road-safety issues, and there have been much-publicised crashes involving Tesla owners who left the driving to the car’s Autopilot system, perhaps thinking it can operate the car by itself. It can’t.

Cars have not yet reached level 5 autonomy, where they can manage all the complexities of driving in traffic without human input.

The most recent incident was a fatal crash of a Tesla Model 3 on March 1, at least the third deadly US crash reported to involve the driver-assistance system. The driver turned on Autopilot about 10 seconds before the car collided with a truck.

Tesla CEO Elon Musk said recently that the company’s robotaxis with no human drivers would be available in some US markets as early as 2020, and that “probably two years from now we’ll make a car with no steering wheels or pedals.”

That seems like an optimistic target after the most recent crash, and it is too early for human road users to trust the technology totally.

There have been other incidents with artificial-intelligence cars, including in 2018 when an Uber self-driving test vehicle struck and killed a pedestrian at night in Arizona, US.

The Uber car was in self-driving mode at the time, but the company, like other autonomous-car developers, requires a back-up driver inside to take over when the autonomous system fails or a complex driving situation arises. The Uber back-up driver was reportedly watching videos on her smartphone.