If We're Not Careful, Self-Driving Cars Will Be The Cornerstone Of The DRM'd, Surveillance Dystopias Of Tomorrow
from the who-controls-the-code dept
We've talked a lot about the ethical and programming problems currently facing those designing self-driving cars. Some are less complicated, such as how to program cars to bend the rules slightly and behave more like human drivers. Others get more complex, including whether or not cars should be programmed to kill the occupant -- if it means saving a school bus full of children (aka the trolley problem). And once automated cars are commonplace, should law enforcement have access to the car's code to automatically pull a driver over? There's an ocean of questions we're not really ready to answer.

But as we accelerate down the evolutionary highway of self-driving technology, the biggest question of all becomes: who gets to control this code? Will the automotive update process be transparent? Will the driver retain the ability to modify their car's code? Will automakers adapt and stop implementing the kind of papier-mâché-level security that has resulted in an endless parade of stories about hacked automobiles whose vulnerabilities take automakers five years to patch?
Trying to force the issue before there's a hacker-induced automotive mass fatality, plaintiffs hit Ford, GM and Toyota with a class action lawsuit earlier this year, claiming the car companies failed to adequately disclose the problems caused by abysmal auto security:
"Among other things, the lawsuit alleges Toyota, Ford and GM concealed or suppressed material facts concerning the safety, quality and functionality of vehicles equipped with these systems. It charges the companies with fraud, false advertising and violation of consumer protections statutes. Stanley continued, "We shouldn't need to wait for a hacker or terrorist to prove exactly how dangerous this is before requiring car makers to fix the defect. Just as Honda has been forced to recall cars to repair potentially deadly airbags, Toyota, Ford and GM should be required to recall cars with these dangerous electronic systems."This month a court ruled that yes, we will have to probably wait for someone to die before automakers are held liable for lagging automotive security. The case was ultimately dismissed (pdf), the court ruling that the plaintiffs have yet to prove sufficiently concrete harms, and that potential damage (to the driver and to others) remains speculative. At the pace self-driving and smart car technology is advancing, one gets the sneaking suspicion we won't have long to wait before harms become notably more concrete.
But however complicated these legal, ethical, and technical questions are, they become immeasurably more complex once you realize that smart cars will ultimately form the backbone of the smart cities of tomorrow, working in concert with city infrastructure to build a living urban organism designed to be as efficient as mathematically possible. As Cory Doctorow noted last week, this makes ensuring code transparency and consumer power more important than ever:
"The major attraction of autonomous vehicles for city planners is the possibility that they’ll reduce the number of cars on the road, by changing the norm from private ownership to a kind of driverless Uber. Uber can even be seen as a dry-run for autonomous, ever-circling, point-to-point fleet vehicles in which humans stand in for the robots to come – just as globalism and competition paved the way for exploitative overseas labour arrangements that in turn led to greater automation and the elimination of workers from many industrial processes.You'd hate to wander too casually into the hyperbole territory traditionally reserved for hysterical Luddites, but there's a laundry list of reasons to be worried about the trajectory of the lowly automobile. If we don't demand code transparency and consumer empowerment in automotive standards now, your car may find itself the cornerstone of a future in which DRM, encryption backdoors, lax security standards, eroded consumer legal rights, insurance companies and government power combine to create a supernova of dystopian dysfunction.
If Uber is a morally ambiguous proposition now that it’s in the business of exploiting its workforce, that ambiguity will not vanish when the workers go. Your relationship to the car you ride in, but do not own, makes all the problems mentioned even harder. You won’t have the right to change (or even monitor, or certify) the software in an Autonom-uber. It will be designed to let third parties (the fleet’s owner) override it. It may have a user override (Tube trains have passenger-operated emergency brakes), possibly mandated by the insurer, but you can just as easily see how an insurer would prohibit such a thing altogether."
Filed Under: autonomous vehicles, cars, drm, privacy, self-driving cars, surveillance
Reader Comments
The good old classics
Re: The good old classics
Why anyone would eschew decades of advancement in car safety technologies is beyond me. Old cars are death traps.
Re: Re: The good old classics
Re: Re: Re: The good old classics
Re: Re: The good old classics
Re: Re: Re: The good old classics
Also note that driving in a caged car without a helmet makes serious injury more likely than if you had no cage. Side impacts are far more common than rollovers, and smacking your head against a cage during a crash is never pretty...
Re: Re: Re: The good old classics
Then have fun dying when you hit something. Those crumple zones are there to protect you--not whatever you hit--by helping to absorb the impact.
Since deceleration trauma is deceleration trauma no matter which direction it occurs in, think about it vertically. If you do the math, you find that getting in a crash at 60 MPH is almost exactly equivalent to falling off of a 14-story building. If that happened, would you rather land directly on the sidewalk, or on a pile of cushions?
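A rough sanity check of that math, as a minimal sketch in Python (the simple free-fall model v^2 = 2*g*h and the roughly 2.7 m story height are illustrative assumptions, not figures from the comment above):

# Rough check: from what height does a free fall reach 60 MPH?
# Assumes v^2 = 2 * g * h and about 2.7 m per story (illustrative values).
MPH_TO_MS = 0.44704          # meters per second in one mile per hour
g = 9.81                     # gravitational acceleration, m/s^2

v = 60 * MPH_TO_MS           # 60 MPH is about 26.8 m/s
height = v ** 2 / (2 * g)    # equivalent free-fall height in meters
stories = height / 2.7       # convert to an approximate story count

print(f"{height:.1f} m, roughly {stories:.0f} stories")
# prints: 36.7 m, roughly 14 stories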
Re: Re: Re: Re: The good old classics
Either way, I'm taking on certain risks by driving older vehicles. They are risks I'm willing to accept though. I do not base my decision to buy a car from the standpoint of planning to crash it. I do base it on things like my ability to control the car and avoid a crash. Things like not having a computer inserted between my controls and the car.
There is something nice about knowing that I have control and no computer glitch can crash the car for me.
Re: Re: Re: Re: Re: The good old classics
Yes, computers glitch, but there would be far fewer overall crashes once our evil self-driving cars take over.
Do you do anything financial online? This is done by computers.
Do you fly commercial? Most of the work is done by computers.
Do you trust your traffic cameras? Those are run by computers too!
People falsely assume that a malfunction is an instant death sentence, and that is hardly the case at all. Crumple zones, computers, and updated physical mechanics are all deliberated upon by professionals who dedicate more time to these things than you could possibly even understand.
You are a Darwin Awardee in waiting!
Re: Re: Re: Re: Re: Re: The good old classics
I especially don't like the idea of trusting a computer when some moron insists on connecting that same computer up to the in-dash entertainment system that has Bluetooth and WiFi enabled.
Even more so when the code is locked away and I'm not allowed to look at it. This is a big concern. Knowing what I know about computers I would rather not blindly trust some programmer without having the right to check his work.
As for your questions.
Yes I do financial stuff online, and by doing so I risk someone stealing my money, but no physical harm is done.
I don't fly if I can avoid it. I would also like to point out that those computer systems have been hacked before.
Do I trust traffic cameras? What does that have to do with anything? And no, I don't really trust them; I mean, come on. Most of them are not secure and are open for anyone to watch if they like. They're also wide open for abusive use in tracking people's movements.
I do not fear all technology. In fact I love technology and I am very excited about the advancements that I am seeing in technology. What I fear is the fact that time after time after time people have shown that technology will be abused. I do not fear the tech, I fear the people who are already drooling over the new ways they can abuse it.
Re: Re: Re: Re: Re: The good old classics
Modern stability controls can do a much better job of allowing you to control the car and avoid a crash than most drivers are able to. You say you're in IT, not a professional driver, so it's safe to assume that includes you. I'm not criticising your car choices, just your rationalisation for them. Claiming you can do a better job of avoiding a crash on your own implies a skill level that's probably higher than the reality.
Re: Re: Re: Re: Re: Re: The good old classics
I'm not saying that I don't see the benefit of a lot of these advancements. In fact I find ABS to be pretty awesome, especially in the rain. I just don't like the direction things are going, where there is no redundancy and the computer has far too much control.
Also, my dislike of these computer systems is far from the only reason I like older cars. One of the biggest reasons is that most modern cars look like shit.
Re: Re: The good old classics
"Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."
- Benjamin Franklin
Re: The good old classics
1. When was the last time you heard of a car built in the '60s or '70s going over 100,000 miles? Wasn't too often. However, with a modern car, it's totally routine.
2. Remember the yearly tune-ups you had to do in order to get ready for winter? They don't seem to be all that common with modern cars.
Re: Re: The good old classics
Re: Re: Re: The good old classics
Re: Re: The good old classics
Also, what is any new car worth once it has 100,000 miles on it? Built-in obsolescence.
Time to Unplug and Go Dark
Mr. N
Re: Mr. N
(Actually, telcos were lucky not to be sued, mostly. They advertised DTMF as being useful for phone menus, but tone dialling service had nothing to do with that. The only thing they got into trouble for IIRC was adding DTMF service to pulse-grandfathered lines without consent.)
I grew up on the Internet while constantly hearing promises of an amazing future where freedom and user empowerment were paramount.
Instead I find the freedoms I routinely enjoyed while growing up are being taken away from me and the future is moving towards a dystopian nightmare. The present ubiquitous surveillance already is a nightmare in and of itself.
I hate that the world that was promised by so many tech luminaries is being perverted into the exact opposite. I hate that I had so much hope for the future, only to find that hope continually being dashed and kicked while it's down.
Why do you keep giving hope only to take it away? Over and over and over and over again. This is cruelty.
Re:
The future is the one you choose.
Re: Re:
I no longer have intellectual privacy online, and that loss has replaced much of what I enjoyed about the Internet with anxiety. I do not know what to do about that short of no longer using the Internet, which is rapidly becoming a less feasible option.
I can't make a future where I'm free to create what comes to mind without worrying about threats to my freedom via law enforcement misinterpreting my words (look up what happened to Justin Carter). This is not something I can do anything about.
I can't enjoy many cultural works anymore because of arcane, arbitrary, and vexatious copyright law enforcement on the Internet. This is not something I can do anything about.
Worst of all, I am seeing many tech luminaries *encourage* censorship when once upon a time the concept was akin to heresy! It's gotten so bad that I double-check everything I type to ensure that it can't easily be taken out of context to ruin me later in life. I don't want my career to be the victim of the next Twitter mob egged on by Silicon Valley activists and CEOs! I've been trying to do something about that for a year and a half and I haven't been able to make any change.
The future that I want is not one that is in my power to have. Once upon a time I wanted to be like the hackers of olde, creating more software to underpin more of the internet and creating awesome shit. Now? The future that I am getting is so frequently causing pain that I am seriously considering dropping my career in tech.
Re: Re: Re:
We live in an unprecedented age of human freedom, access to knowledge and communication. To deny this reality and victimize ourselves as helpless does a disservice to our ancestors, who struggled and died through much worse to make a society that, though not perfect, is the best it has ever been.
Re: Re: Re: Re:
What a worthless tool. Did you even read about any single war in humanity's history, or slavery, or Egypt, Mao, Stalin, Pol Pot...
I mean, what the fuck? You deserve every last miserable negative thing that happens in your lifetime!
Re: Re: Re: Re:
I don't know the potential consequences of any given action online anymore, because I don't trust that the watchmen know the difference between right and wrong in the online world.
Re: Re: Re:
Re:
Re: Re:
The people I trusted deeply to keep the Internet safe were incapable of doing so. That failure has been so severe that random non-tech activist groups are making more headway moving tech away from freedom of speech than tech is making toward promoting online freedom!
I hate that the people who promised so many great things turned out to be so incredibly weak and powerless. I pray that I have to eat these words eventually. But for now, I feel little more than pain at what has happened to the open Internet and hacker culture.
Re: Re: Re:
'If we're not careful'?
Re: 'If we're not careful'?
Re: Re: 'If we're not careful'?
Re: Re: Re: 'If we're not careful'?
> code to automatically pull a driver over?
...and how long will it take before the black hats post a crack on the internet, and random gangbangers can play real-life demolition derby with their phones?
Re:
Why is it always all or nothing?
Do you ever get afraid that the motor on an elevator, many of which are computerized nowadays, will be hacked? Maybe ISIS will hack your local elevator and never let you leave it! The Google database is one of the most secure databases on the planet; not even the best hackers in the world could crack it, and most terrorist hackers are amateurs who get into a website or two. Big deal, ISIS hacks into the little league baseball website.
Law enforcement will clearly have access to the cars if needed; if the car is doing something illegal, then yes, the officer will intervene. They can write up a citation requiring the car to be recertified by the manufacturer and the DMV as once again operating legally, and failure to do so should rest on the shoulders of the car's owner.
Re: Why is it always all or nothing?
As far as the issue of finding out about software problems goes, it will be the same as finding out about badly engineered designs in autos. Think the ignition switch or Toyota's unwanted acceleration. With the DRM and arbitration, finding out it's a problem with a particular make will have to be proved by deaths, and by those who refuse to sign a settlement with an NDC in it. So it won't be one death that leads you to find out about it, but a lot of deaths, before it can't be covered up any more. You can be sure that, just like today, these big automakers will do anything in their power to keep the fact that there is anything wrong with their products under wraps for as long as possible.
Today I'm quite happy to be driving the same vehicle I bought 20 years ago. It's not nickel-and-diming me to death, and pretty much all it takes is gas, oil, tires, and rarely anything else. The best part is that it doesn't have all the geolocation crap in it.
Re: Re: Why is it always all or nothing?
Nobody wants a future where people are forced to do something; in fact, many auto manufacturers building self-driving cars will offer the self-driving part as more of an option. I don't really see the point in self-driving cars for myself, due to the fact that I'm an able-bodied 22-year-old electrician.
I don't even own a car, but I get around with Lyft or public transportation. I've actually carved out a nice little niche where I, as an electrician, never have to drive. At the moment I work as a construction electrician, where I just put my tools into a Lyft car and bring them right over. Soon enough I'll have my AS and will be doing lights-out manufacturing. The Constitution has made it clear that they cannot take away your right to be happy. The automated car, though, is for people who, e.g., may not have legs, lost an eye in a refinery, or are deaf.
As for the software problems, it's not like these cars are just going to hit the road and crash. There will be a lot of safety tests for each and every car. There will be certifications granted to each car, as if the car were a driver itself. If there is a software problem, chances are they will find it in the factory or wherever they sell these things. And as for the comparison to your smartphone, no, the checks for these will be a lot more intense. If the head of the NHTSA said that we should not get in the way of this type of innovation, then these cars are probably vetted extremely well.
Software problems, though, are being looked at as we speak. Tesla was the first to try this software, and yes, they had heavy issues with it in 2013, but it's almost 2016 now, and three years ago is ancient compared to what the software is today. If these cars were considered a danger to society, then they wouldn't even be looked at by the Feds.
I still believe that a lot of people tend to feed into an idea that really isn't there. I remember talking to a friend of mine who said the same thing you're saying. He hates the idea because "they will force everyone to use it." When I asked why they would, all he said back was "control."
Secondary profit center
Whew! Good thing you followed this up with a qualifier, because my first thought was murder for hire.
And wouldn't murder for hire be easy for a self-driving car manufacturer to accomplish? "We don't know what happened...everything was going well and suddenly the car swerved into the path of that monster truck. Must have been a software glitch."
Re: who gets to control this code?
The short answer is: the person who wrote it. Which presents a rather large problem for a society that is increasingly dominated by heavily leveraged aristocrats that don't code.
The reality is that if software developers were guarding the products of their labor as jealously as the RIAA/MPAA mafia, we would still be using punch cards. The significance of that fact is lost on the armies of nodders blithely tapping away on their pocket leashes.
In this century, coding is to literacy what reading and writing were to literacy in the 17th century. Software developers are the modern journeymen, hawking their wares not because they want to, but because fair compensation is rare and fleeting.
There are dozens of farces being used to villainize coders. The cyberterrorism-oriented TV shows are not unlike the anti-Jewish propaganda posters of 1930s Germany. They are a fear tactic manufactured by the socialite barons of the corporate state in order to keep the newly literate in corporate concentration camps. We are contained by the barbed wire of our vanity, and by an utterly corrupt and meaningless financial system.
The reality is that many software developers have written code that was used to do things the developer deplores. And it is up to us to mitigate the effect of those who would abuse our services. And yes, they _will_ come for us before they are forced to let go of the reins. Fortunately for us, the order to round us up will be transmitted over computer networks.
I am not so concerned about the morality of a self-driving car. The moral issues of code in modern society go far beyond that. The question for us is whether software developers will permit themselves to be turned against each other by the social wranglings of lesser men.
As the saying goes: "You can always hire one half of the poor to kill the other half." Whether that is true in the modern era is a question every coder needs to ask himself. For it is in our power to suspend that formerly inevitable conclusion. And that is what the social aristocracy truly fears.
There are only two answers to this question that make any sense at all.
First, the literal one: Don't be ridiculous; do you have any idea how solidly built a bus is? It's practically inconceivable that there could be any way you could crash a modern car into one that would put the inhabitants of the bus at any serious risk of death or severe injury.
Second, the "in the spirit it was asked" one: No, it absolutely should not be programmed to sacrifice the driver under any circumstances whatsoever. For two reasons: first, because if people know that such functionality exists, they'll never want to buy it, and second (partially feeding into the first) because if such functionality exists, it becomes a target for hackers; someone will inevitably try to find a way to cause it to activate incorrectly, just because it's there. The very real danger of this overwhelmingly outweighs the hypothetical danger of a "trolley problem" incident.
Re:
Buses of all types have notoriously poor safety features and a poor crash history when things go wrong. Check some news clippings next time you find yourself online and in reality at the same time!
But you are right: a computer system should never be programmed with an "acceptable to kill in this circumstance" logic. Each system should only ever focus on the safety of its current occupant, while secondarily focusing on minimizing any and all collateral damage to its surroundings.
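To make that priority concrete, here is a minimal, hypothetical sketch of such a rule in Python; the Maneuver type, the risk numbers, and choose_maneuver are invented for illustration and are not anyone's actual control logic. The choice is lexicographic: occupant risk is compared first, and collateral damage only breaks ties.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_risk: float    # estimated risk to the occupant, 0.0 (safe) to 1.0 (fatal)
    collateral_risk: float  # estimated risk to everyone and everything else, same scale

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    # Lexicographic ordering: occupant risk is compared first;
    # collateral risk only matters when occupant risk is tied.
    return min(options, key=lambda m: (m.occupant_risk, m.collateral_risk))

options = [
    Maneuver("brake hard in lane", occupant_risk=0.2, collateral_risk=0.1),
    Maneuver("swerve onto shoulder", occupant_risk=0.2, collateral_risk=0.4),
    Maneuver("swerve into oncoming traffic", occupant_risk=0.6, collateral_risk=0.9),
]
print(choose_maneuver(options).name)   # prints: brake hard in lane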
Re: Re:
Yeah, if you're driving a dump truck. Buses are made with very heavy-duty steel frames, because they're designed to take safety very seriously. When a car made of aluminum crashes into one, the car gets crunched and the bus just shakes a little.
Software freedom is a must
Kill the Terrorists?
Re: Kill the Terrorists?
Motorcycles
Re: Motorcycles
Self-Driving Tanks
Will the President's Self-Driving Limo Have an Override Code?