Tesla 'Self-Driving' NDA Hopes To Hide The Reality Of An Unfinished Product
from the I'm-sorry-Dave-I-can't-do-that dept
There isn't a day that goes by without Tesla finding itself in the news for all the wrong reasons. Like last week, when Texas police sued Tesla because one of the company's vehicles, traveling 70 miles per hour in "self-driving" mode, failed to function properly, injuring five officers.
Five Montgomery County deputy constables were injured Saturday when the driver of a Tesla rear-ended a cruiser during a traffic stop, causing a chain-reaction crash, authorities said. https://t.co/FfteMQQ4zL
— Pooja Lodhia (@PoojaOnTV) February 27, 2021
If you haven't been paying attention, Teslas in "self-driving" mode crashing into emergency vehicles is kind of a thing that happens more than it should. In this latest episode of "let's test unfinished products on public streets," a Tesla operating in "self-driving" mode completely failed to detect not only the five officers, but their dog, according to the lawsuit filed against Tesla:
“The Tesla was completely unable to detect the existence of at least four vehicles, six people and a German Shepherd fully stopped in the lane of traffic,” reads the suit. “The Tahoes were declared a total loss. The police officers and the civilian were taken to the hospital, and Canine Officer Kodiak had to visit the vet.”
Of course, for Musk fans, a persecution complex is required for club membership, resulting in the belief that this is all one elaborate plot to ruin their good time. That belief structure extends to Musk himself, who can't fathom that the public criticism and media scrutiny in the wake of repeated self-driving scandals are his own fault. It also extends to the NDAs the company apparently forces Tesla owners to sign if they want to be included in the Early Access Program (EAP), a community of Tesla fans selected by the company to beta test its unfinished "self-driving" system (technically a Level 2 driver-assistance system) on public city streets.
The NDA frames the press and transparency as enemies, and urges participants not to share any content online that could make the company look bad, even if it's, you know, true:
"This NDA, the language of which Motherboard confirmed with multiple beta testers, specifically prohibits EAP members from speaking to the media or giving test rides to the media. It also says: "Do remember that there are a lot of people that want Tesla to fail; Don't let them mischaracterize your feedback and media posts." It also encourages EAP members to "share on social media responsibly and selectively...consider sharing fewer videos, and only the ones that you think are interesting or worthy of being shared."
Here's the thing: you don't need to worry about this kind of stuff if you're fielding a quality, finished product. And contrary to what Musk fans think, people concerned about letting fanboys test 5,000-pound automated robots that clearly don't work very well are coming from a valid place of concern. Clips like this one, for example, which show the Tesla self-driving system failing to perform basic navigational functions, aren't part of some elaborate conspiracy to make Tesla self-driving look bad and dangerous. There's plenty of evidence now clearly showing that Tesla self-driving, at least in its current incarnation, is often bad and dangerous:
Not sure why FSD is such a safety hazard - especially for pedestrians and cyclists?
Check out this video, posted last week. pic.twitter.com/Hg0tGfxXDT
— David Zipper (@DavidZipper) September 19, 2021
Ever since the 2018 Uber fatality in Arizona (which revealed the company had few, if any, meaningful safety protocols in place), it's been clear that current "self-driving" technology is extremely undercooked. It's also become increasingly clear that widely testing it on public streets (where other human beings have not consented to being used as guinea pigs) is not a great idea. Especially if you're going to replace trained testers with criticism-averse fanboys you've carefully selected in the hopes they'll showcase only the most positive aspects of your products.
We've been so bedazzled by purported innovation that we've buried common sense deep in the backyard. Wanting products to work, and executives to behave ethically, is not some grand conspiracy. It's a reasonable reaction to the reckless testing of an unfinished, over-marketed product on public streets.
Filed Under: cars, nda, self-driving, transparency
Companies: tesla