Tesla’s “Full Self-Driving” Beta software (the quotes are there because, despite the name, it is not fully self-driving and still requires the driver’s attention) is on public streets, and it has been since at least October. The company has been expanding the rollout of the prototype software to small groups of Tesla owners, as CEO Elon Musk stated on Twitter, with further expansion coming in April. We’re not so sure that rapid expansion of the program is such a good idea after seeing how badly the feature performed in the above video, which was also reported on by Road & Track and Jalopnik.
Throughout the sunny drive that YouTube user AI Addict takes, the car shows many signs that the advanced driving assist has a long, long way to go before it’s ready for prime time. The more minor issues include hesitancy and confusion when picking a lane, stopping well short of the line, and getting stuck behind parked cars. Far more concerning are a couple of near collisions: one while crossing an intersection with no stop sign for cross traffic, and another moment when the car seemed to want to drive through a fence. In many of these situations, the driver had to take over manually to continue the drive or to avoid a crash. In another video taken a few days before the one above, AI Addict ran into many of the same issues on an evening drive.
Of course, no one expects a beta version of anything to be perfect, and having people test it is a way to find and iron out the issues. As shown in the video, users of the “Full Self-Driving” Beta report when things go wrong so that Tesla has that information and can work to correct it. But a self-driving-car beta is unlike most others in that it could do serious harm if something goes wrong and the driver fails to catch it in time. It seems to us that this software needs to spend more time in the hands of Tesla employees rather than being unleashed on the public, even in the currently limited numbers. If something goes wrong with what are effectively amateur testers behind the wheel, it could have very serious consequences, from killing or seriously injuring someone to simply setting back the cause of automated driving.