Tesla, Inc.’s TSLA Autopilot driver-assistance feature has come under criticism from several quarters for not being safe enough.
What Happened: Late Sunday, Tesla’s AI Director Andrej Karpathy, who is currently on a sabbatical, had an interesting anecdote to share that testifies to the feature’s safety.
The incident, which Karpathy shared on Twitter apparently minutes after it happened, was about how Autopilot helped him avert an “almost certain collision.” The Tesla executive said he was changing lanes when a motorcyclist very aggressively shifted lanes and accelerated from behind his car.
Autopilot quickly aborted the lane change, he noted, adding that he has seen similar events in “chip telemetry” many times.
“Experiencing it in real life is something else,” Karpathy said.
Why It’s Important: A judge ruled last week that a Tesla Model S driver will stand trial for allegedly slamming into another vehicle on a Los Angeles County freeway, killing two people.
The driver was charged with manslaughter, marking the first U.S. prosecution involving self-driving technology.
Tesla, meanwhile, is inching closer to rolling out its Full Self-Driving (FSD) software, which is currently in beta testing. FSD is an improved version of Autopilot that includes more advanced assisted and semi-autonomous driving capabilities.
Tesla chief executive officer Elon Musk reportedly told media in Brazil that the company would have self-driving cars that obviate the need for human intervention by this time next year.
In premarket trading on Monday, Tesla stock was down by 0.81% at $658.50, according to Benzinga Pro data.