The feature, which is part of Tesla’s optional $7,000 Full Self-Driving add-on, is something Musk has been teasing as coming “soon” since December 2018, and it has been highly anticipated by Tesla owners looking to reap the rewards of a very expensive promise. While a vehicle recognizing these traffic control devices is a significant advancement in semi-autonomous driving, it’s far from the zero-touch commute the CEO promised many months ago, and even further from a fleet of one million robotaxis by year’s end.
As indicated in the original notice shown when enabling Traffic Light and Stop Sign Control, the feature is expected to improve over time thanks to Tesla’s machine learning implementation, its neural network.
This raises several pressing questions about the ethics of beta testing software on public roads. Recall that Boeing once certified its Maneuvering Characteristics Augmentation System for prime time before it brought down Lion Air Flight 610, exposing deep flaws in the software. For the feature to approach perfection, it needs seat time, but at what cost?
Is it fair to have similar concerns about Tesla’s new feature missing a beat? As documented in the video above, the feature already attempted to cross a divided highway because it misread the layout of an intersection, a mistake that could have resulted in an accident. While the functionality is expected to improve and learn from the fleet as more drivers use it, that doesn’t mean it will handle every situation correctly every single time.
Despite paying for a feature called Full Self-Driving, drivers must remember that they still can’t buy a self-driving car today. Anyone behind the wheel of a Tesla with Autopilot engaged is required to pay attention at all times, yet as the National Transportation Safety Board has previously noted in its reports, drivers often over-rely on partial automation, meaning a perfectly supervised beta test exists only in theory.