Executives didn’t want to make Autopilot ‘naggy.’
At the moment, American electric car maker Tesla is facing a multitude of issues surrounding its Autopilot technology, mostly down to a recent spate of accidents caused by drivers being over-reliant on the advanced driver assistance system. People have asked whether the company is doing as much as it can to prevent that over-reliance on Autopilot, which Tesla frequently reminds people is still in ‘beta’ testing, and it seems we now have an answer to the question.
An article by The Wall Street Journal suggests that when engineers within the company proposed adding features like driver eye-tracking and more steering wheel sensors to ensure that drivers are paying attention to the road, executives shot down the proposals on various grounds, including ineffectiveness, expense, and concerns that people might not be too enamoured with Autopilot if it were set up to be an overbearing nanny.
Despite what its name suggests, Autopilot is not (supposed to be) a hands-off driving feature. In reality, Autopilot sits alongside various other advanced driver assistance systems from the likes of Volvo, Mercedes-Benz, and Audi, each of which offers assistance to drivers with the goal of ‘taking the edge off’ driving.
However, while the Volvo and Mercedes-Benz systems use various sensors to ensure that your attention is on the road at all times (Volvos will disengage their Pilot Assist system if your hands are off the wheel for more than five seconds, or if it detects that you’re not in your seat, which happened to this writer once when he reached for his wallet), Autopilot appears to be easily fooled. There are plenty of videos online showing drivers at their most reckless, watching movies, playing with their phones, or even vacating the driver’s seat while Autopilot is engaged.
Things like driver eye-tracking would have prevented such hijinks, which is why General Motors has included the feature in vehicles that offer its new ‘Super Cruise’ semi-autonomous driving system. Autopilot’s lax monitoring of drivers has, at least partially, contributed to the deaths of three people who had the system engaged at the time of their demise.
“When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is one of complacency. They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do.” — Elon Musk, Chief Executive, Tesla Inc.
What is clear to us (at least) is that more could have been done to prevent the abuse of and over-reliance on Autopilot. What Tesla really needs to stop doing is acting like a tech company, blaming users for not being as careful as they ought to have been with its products, and start taking responsibility when the fault lands on it. Both Musk and Tesla have ‘oversold’ Autopilot in the past, making bold claims about the system and its capabilities, and now it’s the drivers’ fault for being careless? Doesn’t work like that, mate.
For more information on Tesla, check out our Showroom.