Last month, we published an article about the death of Joshua Brown, who had Tesla's Autopilot activated in his car at the time of the crash. This, of course, generated quite a lot of controversy for Tesla. And it's hardly the only report of a problem with Tesla's "autonomous" feature; there have been quite a few headlines over the last few months about accidents involving it. That's striking, considering it feels like only yesterday that the feature was released to the public. Still, it's worth noting that Joshua Brown's death has been the only fatality linked to the feature thus far.
If you’ve been keeping up with these stories at all, then you may have noticed something new popping up last week. In Beijing, China, another accident occurred involving Tesla’s Autopilot. Video footage taken from the dashboard of the car is available online. It should be noted that this accident was more of a fender bender than anything. The Tesla, driving on Autopilot, hit a car that was parked on the side of a busy highway. The damage was minimal and no one was hurt.
So what’s with all the fuss? Well, it’s because it’s yet another case that presents problems for autonomous driving in public. It’s also highlighted some of the tricky legal areas.
Some people may be wondering who is actually responsible for an accident involving Tesla’s Autopilot. Many would be tempted to blame every case on Tesla. After all, isn’t Autopilot supposed to turn a car into a self-driving machine? Well, not exactly, and that’s part of the problem. Tesla Autopilot isn’t designed to take control away from the driver completely; the driver is still expected to stay engaged.
The problem is that the phrase used in Tesla’s marketing campaigns in China translates to “self-driving.” It isn’t so much a problem with the technology itself as it is with the terminology used. We’re looking at a translation problem more than we are a portent of doom for autonomous technology.
Tesla has noted that, in many of these cases, the driver had their hands off the wheel, or wasn’t paying enough attention to the road or to what Autopilot was telling them. For example, Autopilot is good at detecting when the car in front of you slows down. It can begin slowing your car for you and alert you to the problem. But the onus, legally, is still on you to prevent the accident. You’re still expected to be in more control of the vehicle than Autopilot is. So if an Autopilot driver crashes into you and injures you, it may not be Tesla you should be suing; the fault is probably still with the driver. Of course, you would have to consult with an auto accident attorney before proceeding with either scenario.
There are an estimated 70,000 Tesla Autopilot vehicles on American roads, and many more across the world. If Autopilot were as unsafe as some are making out, we would be seeing a lot more accidents. It should also be noted that Autopilot is only supposed to be used on long stretches of road, like highways. Tesla has already confirmed that it’s not suitable for urban driving. Some of the accidents have taken place in urban areas, so the fault there would lie with the driver.
The bottom line? Know the limitations of the technology you’re using.