Earlier this week, an accident near Culver City, California, reignited the debate surrounding the autonomous capabilities of Tesla vehicles. On a clear, sunny morning this past Monday, a black Tesla Model S rear-ended a fire truck parked on Interstate 405 around 8:30 a.m., according to the California Highway Patrol (CHP). The vehicle’s owner, who remains unidentified, claims the Tesla’s Autopilot features were in use when the crash occurred.
The fire truck was stopped in the left emergency and carpool lanes to block off an earlier accident. The Model S struck the back of the fire truck at 65 mph, its low front end sliding under the truck and crumpling the hood. The passenger compartment remained intact, and remarkably, no one was injured.
Culver City Fire Chief Ken Powell described the crash as “a pretty big hit,” and noted that two CHP vehicles were parked alongside the fire truck. Both had their emergency lights flashing, which the driver should have seen well before the crash. Autopilot should have detected the stopped vehicles and either slowed the car or prompted the driver to take back the wheel.
While the Model S owner blames the vehicle’s computer system for the crash, that might not be the case.
In a statement sent to Bloomberg, Tesla said Autopilot is intended for use with a fully attentive driver. The company stressed that, despite its name, the technology doesn’t make the Model S fully autonomous, and the driver needs to remain alert at all times. The CHP hasn’t been able to determine whether the Model S was in fact traveling on Autopilot at the time of the accident, though Tesla is expected to examine the vehicle’s data logs in the coming days.
The incident caught the attention of the National Transportation Safety Board (NTSB), which has begun collecting information on the accident. According to NTSB spokesman Chris O’Neil, the agency will launch a formal investigation similar to the one it opened in 2016, when a driver was killed after their Model S, also on Autopilot, drove under a semi that the vehicle’s sensors didn’t detect.