The US agency responsible for road safety is opening a formal investigation into Tesla’s “self-driving” Autopilot system.
The National Highway Traffic Safety Administration (NHTSA) said it was acting after 11 Tesla crashes since 2018 involving emergency vehicles.
In some cases, the Tesla vehicles “crashed directly into the vehicles of first responders”, it said.
The investigation will cover roughly 765,000 Tesla cars made since 2014.
That includes the Model Y, Model X, Model S and Model 3, the NHTSA said – the whole current range.
‘Control at all times’
The agency was primarily concerned with an apparent inability of Tesla vehicles to deal with vehicles stopped in the road – specifically emergency vehicles attending an incident.
Among the list of cases was one where a Tesla “ploughed into the rear” of a parked fire truck attending an accident, and another in which a parked police cruiser was struck.
The NHTSA said it was opening its preliminary investigation into “the technologies and methods used to monitor, assist, and enforce the driver’s engagement” while using Autopilot.
It said that in the 11 crashes that prompted its investigation, either Autopilot or a system called Traffic Aware Cruise Control had been active “just prior” to the collisions.
Tesla Autopilot ‘tricked’ to run without driver
Tesla crash driver ‘was playing video game’
The assistive technology allows the car to automatically steer, accelerate and brake.
But it has come under fire for being misleadingly named, because it does not automatically drive the car – drivers are required to maintain control and attention at all times.
Tesla has marketed the feature as “Autopilot” and promised “full self-driving”, which is now available to some users in a beta version.
Users have frequently abused the system in the past, with examples ranging from using their phones while the car drives unattended to switching seats and leaving no driver at the wheel.
In a statement, an NHTSA spokesperson said: “No commercially available motor vehicles today are capable of driving themselves. Every available vehicle requires a human driver to be in control at all times.”
The investigation’s supporting documents do, however, note the challenging circumstances involved in many of the collisions.
“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” it reads.
It comes days before an event to showcase the car company’s software.
Chief executive Elon Musk had previously announced 19 August as “Tesla AI Day”, which he said would showcase the progress of the firm’s AI systems – with a view to recruiting AI experts to the firm.
Tesla disbanded its PR team in October 2020 and could not be reached for comment.