Tesla 'Self-Driving' NDA Hopes To Hide The Reality Of An Unfinished Product

from the I'm-sorry-Dave-I-can't-do-that dept

Hardly a day goes by without Tesla finding itself in the news for all the wrong reasons. Like last week, when Texas police sued Tesla because one of the company's vehicles, traveling 70 miles per hour in self-driving mode, failed to function properly and injured five officers.

If you haven't been paying attention, Teslas in self-driving mode crashing into emergency vehicles is kind of a thing that happens more than it should. In this latest episode of "let's test unfinished products on public streets," the systems of a Tesla in "self-driving" mode completely failed to detect not only the five officers but also their dog, according to the lawsuit filed against Tesla:

“The Tesla was completely unable to detect the existence of at least four vehicles, six people and a German Shepherd fully stopped in the lane of traffic,” reads the suit. “The Tahoes were declared a total loss. The police officers and the civilian were taken to the hospital, and Canine Officer Kodiak had to visit the vet.”

Of course for Musk fans, a persecution complex is required for club membership, resulting in the belief that this is all one elaborate plot to ruin their good time. That belief structure extends to Musk himself, who can't fathom that public criticism and media scrutiny in the wake of repeated self-driving scandals is his own fault. It also extends to the NDAs the company apparently forces Tesla owners to sign if they want to be included in the Early Access Program (EAP), a community of Tesla fans the company selects to beta test its unfinished "self-driving" system (technically a Level 2 driver-assistance system) on public city streets.

The NDA frames the press and transparency as enemies, and urges participants not to share any content online that could make the company look bad, even if it's, you know, true:

"This NDA, the language of which Motherboard confirmed with multiple beta testers, specifically prohibits EAP members from speaking to the media or giving test rides to the media. It also says: "Do remember that there are a lot of people that want Tesla to fail; Don't let them mischaracterize your feedback and media posts." It also encourages EAP members to "share on social media responsibly and selectively...consider sharing fewer videos, and only the ones that you think are interesting or worthy of being shared."

Here's the thing: you don't need to worry about this kind of stuff if you're fielding a quality, finished product. And contrary to what Musk fans think, people concerned about letting fanboys test 5,000-pound automated robots that clearly don't work very well are coming from a valid place. Clips like this one, for example, which show the Tesla system failing to perform basic navigational functions in self-driving mode, aren't part of some elaborate conspiracy to make Tesla self-driving look bad and dangerous. There's plenty of evidence now clearly showing that Tesla self-driving, at least in its current incarnation, often is bad and dangerous:

Ever since the 2018 Uber fatality in Arizona (which revealed the company had few, if any, meaningful safety protocols in place), it's been clear that current "self-driving" technology is extremely undercooked. It's also become increasingly clear that widely testing it on public streets (where other human beings have not consented to being used as guinea pigs) is not a great idea. Especially if you're going to replace trained testers with criticism-averse fanboys you've carefully selected in the hopes they'll showcase only the most positive aspects of your products.

We've been so bedazzled by purported innovation that we've buried common sense deep in the backyard. Wanting products to work, and executives to behave ethically, is not some grand conspiracy. It's a reasonable reaction to the reckless testing of an unfinished, over-marketed product on public streets.

Filed Under: cars, nda, self-driving, transparency
Companies: tesla


Reader Comments

  1. That One Guy (profile), 4 Oct 2021 @ 3:58pm

    An NDA that encourages you to share good information but strongly discourages sharing anything negative? Yeah, that's not an NDA so much as people paying to become PR shills for the company.

  2. Anonymous Coward, 4 Oct 2021 @ 4:12pm

    Nothing really to worry about.

    They haven't even started scoping out the turret controls. And you would think the engineers would not be testing impact assault techniques until they developed at least placeholder armor.

    So far, I'm unimpressed by the Bolo mark 0.31.

  3. That Anonymous Coward (profile), 4 Oct 2021 @ 4:20pm

    Something something: the "hold my beer" race is once again unable to learn from its history.
    Someone wanna mail Elon a copy of 'Unsafe at Any Speed'?

    The NDA might keep drivers from talking, up until they find out that they are being personally held responsible for harming/killing people.
    Pretty sure auto insurance companies aren't listing shitty AI as an authorized driver, which will leave the fanbois in the very, very poor house.

    If only we had some sort of agency charged with protecting the public on the roads who could do something about this.
    But obviously this isn't as important as the liberal conspiracy to silence conservatives online, so it's not important enough to pursue.
    I mean, they already killed 700K people who didn't need to die; how many can Teslas manage to kill before they get it fixed?

  4. Ehud Gavron (profile), 4 Oct 2021 @ 4:27pm

    Your right to talk

    Tesla made a deal: we'll give you FSD to beta... you don't talk about it publicly.

    It's a fair deal.

    FSD isn't there yet. We all know this. What's the big deal when the beta testers want to say "Hi, this beta test isn't working right"?
    None. Don't sign it, or if you do, live by the contract you signed.

    Hitting 5 cops -- definite no-no. But was it FSD or a DUI in progress? The cops hate releasing any exculpatory information, ever, so we won't find out until it hits a courthouse...

    E

  5. Anonymous Coward, 4 Oct 2021 @ 4:27pm

    It seems the self-driving software does not understand flashing police lights, warning signs, or police stop signs; it simply drives into police cars parked on a road. There's no way the self-driving software is ready for thousands of new drivers to use on public roads. It just looks for other cars moving, or maybe pedestrians crossing; it has no understanding of a police checkpoint or of emergency vehicles parked in the middle of a road. The flashing lights may even make it harder to determine whether that's a car or just a traffic light on the road.

  6. Anonymous Coward, 4 Oct 2021 @ 5:02pm

    Re:

    It seems the self-driving software does not understand flashing police lights, warning signs, or police stop signs [...]

    It doesn't seem to understand "object detected."

  7. Anonymous Coward, 4 Oct 2021 @ 5:46pm

    Re:

    Oh, for the insurance companies it's a godsend.

    "Oh, you were letting the AI drive? Thanks for admitting negligence!"

  8. Anonymous Coward, 4 Oct 2021 @ 5:55pm

    Re: Your right to talk

    It's not just 5 cops (one was under a vehicle) but 5 police vehicles. That's a pretty big blunder for any collision.

  9. Anonymous Coward, 4 Oct 2021 @ 5:57pm

    Re: Re:

    There's a reason that most self-driving vehicles ignore stopped objects: any roadside debris would cause the vehicle to stop, hence they program the vehicles to ignore them.

  10. Blake C. Stacey (profile), 4 Oct 2021 @ 6:24pm

    Re: Re:

    That's what they get for not using object-oriented programming.

  11. Anonymous Coward, 4 Oct 2021 @ 6:56pm

    Re: Re: Re:

    What does that even mean? "Not running into objects" is the single most important aspect of safe driving. Even without any knowledge of traffic laws or conventions, you could at least survive most situations by just obeying that rule.

    That anyone would program vehicles to outright ignore objects in their path is batshit insane.

  12. That Anonymous Coward (profile), 4 Oct 2021 @ 7:02pm

    Re:

    The AI deputized itself and did what cops do when they see a dog.

  13. Anonymous Coward, 4 Oct 2021 @ 7:13pm

    The AI should be programmed to stop, or to slowly drive past any parked cars while avoiding any pedestrians, e.g. do not drive into any cars parked in the way, wherever they may be. It could also sound a horn to warn any people in front of it if it's not sure how to proceed, the way reversing trucks have audio alarms to warn people of possible collisions.

  14. Ehud Gavron (profile), 4 Oct 2021 @ 7:16pm

    Re: Re:

    "Thanks for admitting negligence!" <-- perhaps there was a "to" missing there.

    In the race car, the first thing we do after starting is turn off TCS, ABS, and other "helpful computer bits" that mess with track driving.

    In the street cars we turn off TCS when four-wheeling or going through deep areas of water or loose sand.

    Negligence (failure to use reasonable care) is not "letting the AI drive." It would be failing to consider whether the AI should drive or not.

    Sorry, your weird assertion that using FSD is negligent... fails.

    E

  15. That Anonymous Coward (profile), 4 Oct 2021 @ 7:18pm

    Re: Re:

    You turned control of the vehicle over to a "beta" AI, that there are a few known issues with... after you signed an NDA that you wouldn't/couldn't tell anyone outside of Tesla if the AI decided to plow through a kindergarten playground???

    That's a nice Tesla you have... pity it has to sit in your backyard since no insurance company will cover it.
    And even if they improve the AI, it's going to be a while before any of them will want to touch them.

    It ran down 6 people, vehicles, and a dog...
    In the other clip the AI decided to pursue humans trying to legally cross the street.

    Given the number of assholes doing the sleep thing while the car drives on the freeway, you can't convince me everyone trying out the full self driving will actually be paying full attention. Even knowing that other drivers have been screwed over by it, they will assume they'll be okay (because it won't happen to them).

  17. Anonymous Coward, 4 Oct 2021 @ 9:22pm

    Someone Has To

    So yes, I own a Tesla, 2 in fact. However, someone has to lead and pioneer this field. Tesla takes all the heat for self driving because they are so far out in front; the spotlight is on them. Not because they have a bad product. If it were GM or Mercedes leading FSD, it would be them under the microscope. It's a double-edged sword.

  18. Slow Joe Crow (profile), 4 Oct 2021 @ 10:01pm

    Bovine excrement

    Tesla takes the heat because Tesla calls its SAE Level 2 driver assist "Full Self Driving" while GM calls its Level 2 system Blue Cruise, because Level 2 is basically a fancy cruise control, while actual self-driving is SAE Level 5, which is several orders of magnitude beyond anything Tesla has released.

  19. Slow Joe Crow (profile), 4 Oct 2021 @ 10:14pm

    Re: Bovine excrement

    Minor correction: GM's system is Super Cruise; Blue Cruise is Ford's Level 2 offering. Note that neither company so much as whispers "self driving".

  20. That One Guy (profile), 4 Oct 2021 @ 11:23pm

    ... cause five car pileups?

    Tesla takes all the heat for self driving because they are so far out in front; the spotlight is on them.

    Well, that and their cars seem to have a distressing tendency to plow into other vehicles more often than they should (that number being 'never'), but I'm sure you're right and it's just their super-duper advanced tech that's causing the criticism.

  21. Anonymous Coward, 5 Oct 2021 @ 12:31am

    Though the crash that injured the five police officers didn't have anything to do with "Self-Driving."

    The linked article states the car was in Autopilot mode, which is the name of their cruise control system, and which explicitly warns drivers when it's enabled that it doesn't detect stationary objects.

  22. Sabroni, 5 Oct 2021 @ 12:42am

    Re: Negligence is not "letting the AI drive"

    An ABS system is not a self-driving system. A mechanical system that distributes power or manages braking does not decide when to brake, or when to turn. The AI "drives" these cars; it doesn't shuffle power from one wheel to another.
    You don't know what you're talking about.

  23. Ehud Gavron (profile), 5 Oct 2021 @ 12:48am

    Re: Bovine excrement

    While you were out, you missed the day in school during math class when it was explained what "orders of magnitude" means.

    5 is not "orders of magnitude beyond" anything Tesla has released.

    Google search is that way --> Go educate yourself and the people you are mis-educating.

  24. Anonymous Coward, 5 Oct 2021 @ 2:25am

    Re: Re: Re:

    OOPs

  25. Anonymous Coward, 5 Oct 2021 @ 2:50am

    Re: ... cause five car pileups?

    Well, that and their cars seem to have a distressing tendency to plow into other vehicles more often than they should

    That is down to the human monitor not taking action when the autopilot is not doing its job. Failure to detect stationary objects is a well-known problem with the software, and a reason human monitoring is required.

  26. Jeremy Lyman (profile), 5 Oct 2021 @ 5:24am

    Re:

    Yes, there's valid criticism to be found, though it's generally a good idea to understand the systems you're criticizing. This article is written with such a chip on its shoulder that it doesn't bother to distinguish between the limited-access "FSD Beta" it wants to blame and the standard "Autopilot" that was active during this crash. You also didn't seem to notice that they're also suing the bar which served alcohol to the driver, so he was drunk and not driving responsibly. But drunk drivers hitting things is so common it barely qualifies as news any more.

  27. Anonymous Coward, 5 Oct 2021 @ 6:18am

    Re: Re: Bovine excrement

    "Level" is an arbitrary label, especially when applied to software concepts, and "orders of magnitude" is a good metaphor for how software complexity, and the effort required to solve software problems, scale.

  28. Paul B, 5 Oct 2021 @ 6:22am

    Re: Your right to talk

    An NDA does not apply when a government agency requests or demands information. Plow through 5 cop cars and wave an NDA flag and see how far that gets you.

  29. Rocky, 5 Oct 2021 @ 6:55am

    Re: Re:

    Agreed. Karl's coverage of the incident is far from fair; it reads more like a hit-piece that conflates things to come to a conclusion that's not supported by the facts.

    There is no doubt that Tesla has some problems with its self-driving which can result in fatalities, but most of the cases so far have shown us that some drivers are shockingly negligent and/or stupid. Sadly, you can't cure stupid with even the most advanced AI - even if it happens to be a beta version.

    And when Karl wrote "Of course for Musk fans, a persecution complex is required for club membership", I read that as "I don't like people who don't agree with my views on Musk, therefore I'll denigrate them".

    It's one thing to have a strong opinion about something, but when someone goes down the rabbit hole of conflating things to come up with ways to attack what they dislike, they've strayed into asshole territory, and I have no problem at all calling them out for that behavior.

    So Karl, I think you are a bit of an asshole when you write about Musk and his associated companies, as evidenced by your conflation of a limited beta program and its NDA, plus not mentioning the real circumstances and context of the crash compared with every other Tesla car/driver not enrolled in that program.

    On the whole, I think it's good to highlight the risks and pitfalls of self-driving cars and the manufacturers' responsibility to market the features correctly, and you could have easily written a better piece with all the included facts, but you chose not to.

    If you want to paint something in indignant faux black and white, don't be surprised if people call you out on it. Do better.

  30. Jeremy Lyman (profile), 5 Oct 2021 @ 7:51am

    Re: Re: Re:

    When cars "ignore" stopped objects, it's generally because the forward facing radar tells the system that there's nothing there. Radar is great for judging how fast objects are moving or differences in velocity, but when an object completely stops it blends into the background. Systems that rely too much on radar, trusting it over conflicting sensor inputs, are prone to these types of crash. They were designed to follow highway traffic at speed where, generally speaking, there aren't parked cars in the road.

  31. JBDragon, 5 Oct 2021 @ 7:55am

    Having self-driving rely only on cameras is flawed from the start. I think LIDAR should be a requirement.

    Tesla seems to have issues with running into the back of stopped cars for whatever reason. That should be the easiest thing NOT to do.

  32. Ed (profile), 5 Oct 2021 @ 8:11am

    Re: Re: Re: Re:

    Oddly enough, though, my Ford Escape with cameras and radar can and does see non-moving objects in the roadway just fine and slows/stops accordingly. It isn't advertised or referred to as Full Self Driving but simply an "assist", which it is. Yet in thousands of miles of using it to drive across the country several times, it hasn't hit a stopped vehicle or a pedestrian, or even come close to doing so. Funny that.

  33. Ed (profile), 5 Oct 2021 @ 8:18am

    Re: Re:

    Teslas are supposedly so advanced, yet their ironically-named "Autopilot" can't detect stationary objects? The assist system in my Ford Escape has no problem detecting stationary objects, or people, or dogs. It has stopped for those objects numerous times in my use.

  34. Jeremy Lyman (profile), 5 Oct 2021 @ 8:40am

    Re: Re: Re: Re: Re:

    Right, the system works reasonably well and doesn't always ignore all immobile objects. That's why they're generally beneficial. But it has the same kinds of warnings Tesla uses:

    WARNING: You are responsible for controlling your vehicle at all times. The system is designed to be an aid and does not relieve you of your responsibility to drive with due care and attention. Failure to follow this instruction could result in the loss of control of your vehicle, personal injury or death.
    WARNING: The system only warns of vehicles detected by the radar sensor. In some cases there may be no warning or a delayed warning. Apply the brakes when necessary. Failure to follow this instruction could result in personal injury or death.
    WARNING: The system may not detect stationary or slow moving vehicles below 10 km/h.

    Ford Manual

    WARNING: Traffic-Aware Cruise Control is designed for your driving comfort and convenience and is not a collision warning or avoidance system. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times. Never depend on Traffic-Aware Cruise Control to adequately slow down Model 3. Always watch the road in front of you at all time. Failure to do so can result in serious injury or death.

    Tesla Manual

  35. Anonymous Coward, 5 Oct 2021 @ 9:55am

    Re: Re: Re:

    Why did it need to stop for stationary objects? Ford has no self-driving general-release products, so you were using a driver-assist product and YOU were legally obliged to drive the car. YOU should have been stopping for stationary objects.

  36. Anonymous Coward, 5 Oct 2021 @ 10:35am

    Re: Re: Re:

    their ironically-named "Autopilot" can't detect stationary objects?

    Why's it ironic? Actual autopilot systems (as used in aviation) don't do that either. But most people don't have enough aviation knowledge to know that, which makes it an unfortunate name.

  37. Anonymous Coward, 5 Oct 2021 @ 12:47pm

    Musk keeps rebranding other ideas

    Looks like Musk jacked Steam's early access model and ran with it.

  38. Clandestine (profile), 5 Oct 2021 @ 1:14pm

    happens more than it should?

    So, how often should it happen to self-driving cars, compared to how often non-self-driving cars rear-end another car?
    Only then can we say whether self-driving cars are more dangerous than idiot-driven cars.

  39. Anonymous Coward, 5 Oct 2021 @ 1:16pm

    Re: A Dance As old As 2005

    Behold, the majestic fall migration of the Tesla fanbois has begun, as they work their way from one tech website to another defending their progenitor from perceived insults. Their displays of rhetoric ratchet up as they attempt to outdo each other in feats of logical fallacy, vying to impress an extremely limited pool of available mates.

  40. Lostinlodos (profile), 5 Oct 2021 @ 2:56pm

    Re:

    It’s not much different than the NDA for most beta level testing.

  41. Lostinlodos (profile), 5 Oct 2021 @ 3:10pm

    Oh, wait…what?

    For starters, the title of the article is generally ignored in favor of a rant about crashes.
    This type of NDA is par for the course in beta anything: do not publicly disclose problems during the beta period.
    It's why Windows beta testers were long locked inside MSDN discussions closed to the public. It's why Apple dev beta (pre-public) discussion is behind the ADS program.
    I've tested software, and hardware devices, some of them dangerous on error. This is nothing unique.

    Crashes?
    This reeks of not-my-fault-ism!

    How is Tesla at fault for the driver not retaking control?

    Be it an OS or a car, the point of a beta test is to gauge reactions to real-world situations, make manual corrections, and report flaws ("bugs").

    So regardless of whether it's alt-esc when a random /rm * starts for no reason, or buses, cars, SUVs, and people in the road... ?
    Why did the driver not do something?
    Unless self-drive or Level 2 or whatever disengaged the steering, MIB-style, the driver is responsible. To some degree.

    The damn manual/guide/agreement says the driver is responsible for controlling the vehicle!

    Beta testing is self-explanatory to anyone who reads the agreement they sign! It's not finished!!!!!

    The problem here, on Tesla's end, is failing to better vet the idiots who agree to beta testing. If the drivers were doing the job they agreed to, actively testing an assistive technology, they wouldn't be plowing into things while sleeping, reading, screwing, etc.

  42. Anonymous Coward, 5 Oct 2021 @ 3:49pm

    They sue Tesla and some bar the driver had been drinking in... but what about the driver? He's apparently not named in the suit. Was he at least arrested for DUI?

    Or was the automation a convenient excuse to shift blame to someone with more money than an ordinary drunk driver?

  43. Scary Devil Monastery (profile), 6 Oct 2021 @ 3:32am

    Re: Re:

    "It’s not much different than the NDA for most beta level testing."

    ...where unethical companies are concerned, yes. That both these statements seem correct just tells us the distance between the bar of acceptability and Situation Normal.

  44. Jeremy Lyman (profile), 6 Oct 2021 @ 5:03am

    Re: Oh, wait…what?

    The drunk driver mentioned at the start was not part of the FSD Beta test group. The article didn't do you any favors in explaining the situation. There have been NO accidents involving the ~2,000 Beta testers since they started in October 2020.

  45. Lostinlodos (profile), 6 Oct 2021 @ 11:26am

    Re: Re: Oh, wait…what?

    I think we're agreeing?

    When you choose to beta test something, it's not finished. It isn't expected to work correctly.
    It's prerelease.

    The responsibility falls on the tester to maintain a safe operating environment.

  46. Jeremy Lyman (profile), 7 Oct 2021 @ 10:57am

    Re: Re: Re: Oh, wait…what?

    Yes, we're agreeing. I just didn't want you to put too much effort into arguing the FUD in the article. Tesla is carefully vetting the beta testers and they've been very safe. Drunk drivers hitting cop cars while failing to supervise standard release Autopilot are less safe.


