Breaking: Self-Driving Cars Avoid Accident, Do Exactly What They Were Programmed To Do
from the I-can-and-will-do-that,-Dave dept
We just got done talking about how, after logging 1,011,338 autonomous miles since 2009, Google's automated cars have had just thirteen accidents -- none of which were the fault of the Google vehicles. By and large the technology appears to be working incredibly well, with most of the accidents the fault of inattentive human drivers rear-ending Google's specially-equipped Lexus SUVs at stop lights. But apparently, the fact that this technology is working well isn't quite interesting enough for the nation's technology press.

A Reuters report making the rounds earlier today proclaimed that two self-driving cars from Google and Delphi Automotive almost got into an accident this week in California. According to the Reuters report, Google's self-driving Lexus "cut off" Delphi's self-driving Audi, forcing the Audi to take "appropriate action." This apparently got the nation's technology media in a bit of a heated lather, with countless headlines detailing the "almost crash." The Washington Post was even quick to inform readers that the almost-crash "is now raising concerns over the technology."
Except it's not. Because not only did the cars not crash, it apparently wasn't even a close call. Both Delphi and Google spokespeople told Ars Technica that both cars did exactly what they were programmed to do and Reuters apparently made an automated mountain out of a molehill:
"I was there for the discussion with Reuters about automated vehicles," she told Ars by e-mail. "The story was taken completely out of context when describing a type of complex driving scenario that can occur in the real world. Our expert provided an example of a lane change scenario that our car recently experienced which, coincidentally, was with one of the Google cars also on the road at that time. It wasn’t a 'near miss' as described in the Reuters story."In other words, As Twitter's Nu Wexler observed, the two cars did exactly what they were programmed to do, though that's obviously a notably less sexy story than Reuters' apparently hallucinated tale of automated automotive incompetence.
Instead, she explained how this was a normal scenario, and the Delphi car performed admirably.
"Our car did exactly what it was supposed to," she wrote. "Our car saw the Google car move into the same lane as our car was planning to move into, but upon detecting that the lane was no longer open it decided to terminate the move and wait until it was clear again."
Breaking: Self-driving cars avoid accident, doing exactly what they are programmed to do
— Nu Wexler (@wexler) June 26, 2015
Filed Under: accidents, autonomous vehicles, cars, driving, near miss, self-driving
Companies: delphi, google
Reader Comments
This doesn't generate as much hype as
BREAKING: Skynet over us, self-driving cars nearly cause apocalypse. John Connor called for help.
As much as I like driving, I look forward to a future where humans won't touch the wheel. Or a word processor, it seems, given the media's idiocy in this case :)
Re:
If people cannot be trusted to operate a complex piece of machinery, perhaps the machine can run itself better. No question in the case of cars. I doubt more than one person out of four driving today actually knows how to drive in compliance with traffic laws.
That area is mapped out electronically in far more detail than you find elsewhere. That includes mapping the location of traffic lights so that the car will notice them. Temporary traffic lights are a problem, and the car will not notice a police officer signaling the car to stop.
And with much of the processing done in the cloud, be sure to limit your driving to areas with decent cellular coverage.
The average person won't be able to operate a self-driving car in the next few years, but it IS wonderful progress.
Taxi services will soon be able to use them. The dispatcher can check to see if the pick-up location, destination and points in between are suitable for a self-driving car. If not, they can send one of their remaining human drivers.
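To make that concrete, here is a rough sketch of what such a dispatcher-side suitability check could look like. Every name and data structure in it is invented for illustration; none of this comes from any real dispatch system:

```python
# Rough, illustrative sketch of a dispatcher deciding whether a trip can go
# to a self-driving car or must fall back to a human driver. All helper
# names and inputs are made up for this sketch.

def route_is_autonomy_ready(waypoints, detailed_map_areas, coverage_map):
    """True only if every point on the trip is mapped in detail and has
    decent cellular coverage, per the two caveats raised above."""
    for point in waypoints:
        if point not in detailed_map_areas:
            return False            # not mapped in enough detail
        if not coverage_map.get(point, False):
            return False            # no reliable cellular coverage
    return True


def dispatch(pickup, destination, via, detailed_map_areas, coverage_map,
             autonomous_fleet, human_drivers):
    """Send a self-driving car only when the whole route qualifies;
    otherwise send one of the remaining human drivers."""
    waypoints = [pickup, *via, destination]
    if autonomous_fleet and route_is_autonomy_ready(
            waypoints, detailed_map_areas, coverage_map):
        return autonomous_fleet.pop()
    return human_drivers.pop()


# Example: both endpoints are mapped, but one waypoint lacks coverage,
# so the trip falls back to a human driver.
mapped = {"A", "B", "C"}
covered = {"A": True, "B": False, "C": True}
print(dispatch("A", "C", ["B"], mapped, covered,
               autonomous_fleet=["AV-1"], human_drivers=["driver-7"]))  # driver-7
```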
I assume that this is Uber's business model. Use humans to build up their business using their own cars. They'll be established in many cities just in time for a self-driving fleet to become practical.
Re:
They can also recognize stoplights, though maybe not a police officer's hand signals - but that's a problem that's easily solved.
And the areas may be mapped out - but they're mapped out by the cars themselves. Any area where self-driving cars spend any considerable time would also get mapped out in great detail. It's one of those situations where the more it's implemented, the better it works.
Re: Re:
Technology Review: Impressive progress hides major limitations of Google’s quest for automated driving.
Re: Re: Re:
If they're scared to even test the thing in heavy rains, then it's nowhere near ready for actual deployment.
Re: Re: Re: Re: Re:
Which NOW need testing on ACTUAL roads, not controlled test tracks.
Supposedly Nevada has approved some of its highways for such testing, albeit with caveats, such as requiring a licensed driver to be in the vehicle even if they are not controlling it. And the 'tests' have their own caveats: IIRC the vehicles to be tested currently do NOT have the ability to enter or exit the highway, nor can they do a passing maneuver.
Re: Re: Re:
As I said, the cars themselves help map the terrain, so it's not like they can't see what's around them and operate independently rather than relying solely on maps in the cloud. And luckily, stoplights don't appear overnight. It would be easy for the city to work with Google to incorporate road changes and construction zones (or for the city to maintain the maps themselves).
Snow and ice are definitely serious issues though. Right now that means in certain types of weather, you'll just have to drive yourself.
And these are today's cars - not cars 10 years from now. Go back and look at where they were in 2004, and I'm not too worried that these issues won't be overcome.
Re: Re: Re: Re:
Ever hear of ABS?
Re: Re: Re:
Spotting a stop light is no harder than spotting a cyclist or pedestrian or other unmapped obstruction. All such processing has to be done on-board, as otherwise an Internet dropout could cause an accident.
Re: Re: Re:
Yes, just as the commenter said, for maps and such. Not for processing. If processing is done in the cloud, then the whole system would be impractical for the foreseeable future.
Re:
https://youtu.be/IjRXyWFLkEY?t=109
Re:
I've seen a Google car recognize hand signals from bicyclists. If you mean flashing lights and whatnot, then I'm sure they'll have a kill switch to tell the car to pull over.
robocars to use time-tested CSMA/CD "collision detection" protocol...
Someone is going to make a Johnny Cab. I can feel it.
Wait til the autonomous vehicles get into the mix...
Self
Contained
Autonomous
Vehicle
Programming makes bad decision for driver!
2. A driver was killed because they could not exit the vehicle in time when railroad crossing gates trapped them. A news story showed how a driver can easily drive through and break off a railroad crossing arm, because the arms are designed to be easily broken off. If a driver is caught between crossing arms and the vehicle's programming will not allow them to drive into and break off an arm to escape a collision with a train without first going through a series of steps to override the collision avoidance system, that is dangerous.
The bear rule.
The push-button ignition issue only indicates that the system needs to check that the transmission is out of gear before engaging the engine. That's a fixable bug.
The second one sounds avoidable by having the guidance system detect the position of the train and not proceed onto the tracks if it is too close.
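As a rough illustration of that kind of gate-keeping check (not anything from an actual vehicle; all thresholds and sensor inputs below are assumptions for the sketch), it might look something like this:

```python
# Illustrative check before committing to a grade crossing: only enter if the
# car can fully clear the tracks, with margin, before an approaching train
# arrives. Numbers and inputs are invented for this sketch.

CAR_LENGTH_M = 5.0      # assumed vehicle length
SAFETY_MARGIN_S = 5.0   # extra buffer beyond the time needed to clear the tracks

def safe_to_enter_crossing(crossing_length_m, clear_space_beyond_m,
                           car_speed_mps, train_distance_m, train_speed_mps):
    if clear_space_beyond_m < CAR_LENGTH_M:
        return False                     # no room to fully exit on the far side
    if car_speed_mps <= 0:
        return False                     # don't creep onto the tracks while stopped
    time_to_clear = (crossing_length_m + CAR_LENGTH_M) / car_speed_mps
    if train_speed_mps <= 0:
        return True                      # no train approaching
    time_until_train = train_distance_m / train_speed_mps
    return time_until_train > time_to_clear + SAFETY_MARGIN_S

# Example: 20 m crossing, 10 m of clear road beyond it, car at 5 m/s,
# train 400 m away closing at 30 m/s: ~13 s until the train, 5 s to clear.
print(safe_to_enter_crossing(20, 10, 5, 400, 30))   # True
```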
I expect that, because people are ambitious and creative in their idiocy, self-driven cars will sometimes get into accidents, some fatal. But they don't have to be casualty-free, just safer than human-driven cars.
Granted this may create a liability problem, but that's a different issue. There are similar liability problems with public transit.
Re: The bear rule.
http://abcnews.go.com/Blotter/toyota-pay-12b-hiding-deadly-unintended-acceleration/story?id=22972214
ABC News first reported the potential dangers of unintended acceleration in an investigation broadcast in November 2009. The report said hundreds of Toyota customers were in “rebellion” after a series of accidents were apparently caused by the unintended acceleration. Two months before, Highway Patrolman Mark Saylor and three members of his family had been killed after the accelerator in his Lexus had become stuck on an incompatible floor mat. Saylor was able to call 911 while his car was speeding over 100 miles per hour and explain his harrowing ordeal right up until the crash that ended his life.
EXCLUSIVE INVESTIGATION: Runaway Toyotas
2) Driver killed when vehicle became trapped between crossing arms. Pages 1 to 4 here:
http://www.cbsnews.com/pictures/metronorth-train-accidents/
Re: Re: The bear rule.
The first one is a problem with controls. It sounds like it's not a problem with the powered acceleration (fly-by-wire) but with a foot pedal getting stuck. That's actually a situation in which a smart automatic driving system could help, especially if the acceleration control were fly-by-wire, as it could then override the stuck control and slow the car down regardless.
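As a purely illustrative sketch of that kind of override, assuming a drive-by-wire throttle and made-up thresholds and sample formats (nothing here comes from any manufacturer's actual controller), the arbitration might look roughly like this:

```python
# Illustrative drive-by-wire throttle arbitration for a stuck-pedal scenario.
# Thresholds, sample format and names are assumptions for this sketch.

STUCK_WINDOW_S = 3.0      # pedal pinned near full throttle this long looks stuck
STUCK_THRESHOLD = 0.95    # pedal fraction treated as "pinned"

def commanded_throttle(pedal_samples, brake_pressed, auto_throttle):
    """pedal_samples: list of (timestamp_s, pedal_fraction), newest last.
    Returns the throttle fraction actually sent to the engine."""
    if brake_pressed:
        return 0.0                       # brake always wins over the accelerator

    latest_t, latest_pedal = pedal_samples[-1]
    recent = [p for t, p in pedal_samples if latest_t - t <= STUCK_WINDOW_S]

    if len(recent) > 1 and min(recent) >= STUCK_THRESHOLD:
        # The pedal has read near full throttle for the whole window: treat it
        # as stuck and defer to the automated system, which can slow the car.
        return auto_throttle

    return latest_pedal                  # otherwise the driver's input stands

# Example: pedal pinned at 0.99 for several seconds while the automation wants 0.1
samples = [(0.0, 0.99), (1.0, 0.99), (2.0, 0.99), (3.0, 0.99), (4.0, 0.99)]
print(commanded_throttle(samples, brake_pressed=False, auto_throttle=0.1))  # 0.1
```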
Re: Programming makes bad decision for driver!
Question asked and answered, sigh: https://www.azatrax.com/controller.html
"To reduce collisions at grade crossings, railroads are installing four quadrant gate systems on high speed rail corridors, commuter lines, light rail systems and in areas with high concentrations of foolish drivers."
At least once people stop driving their own cars, we can return to more sane two quadrant crossing gates.
Warning! You Could Be In Danger...stay tuned
Click to read more...but first, Is something in your garage planning to kill you?
http://www.harmonymotorworks.com/