Luddite Redux: Don't Kill The Robots Just Because They Replace Some Jobs

from the first,-do-no-harm dept

Here are a couple points to ponder:
Fun fact #1: California prison guards are expensive.

Fun fact #2: South Korea's getting robot prison guards.
I'm sure the prisoners welcome their new robot overlords, but I bet the prison guards' union doesn't. Or any other union, for that matter. And they're not alone. Over the past few weeks, tech industry commentators spent slightly more time than usual wringing their hands over whether technology was killing jobs. I think this video captures the debate pretty well.
It might sound paradoxical, but this replacement of humans by machines is actually a good reason to limit secondary liability for the robotics industry. And I'm not just referring to secondary liability in the copyright sense, but to any liability incurred by robot manufacturers because of how others use their robots.

This isn't a theoretical issue. Automation and efficiency have always threatened certain jobs and industries -- and one of the standard reactions is to somehow blame the technology itself and seek to hinder it, quite frequently by over-regulation. The extreme version of this is where the term "Luddite" came from: an organized effort to attack more efficient technology, which escalated into violence against the machines. More typical were overly burdensome regulations, such as "red flag laws," which said automobiles could only be driven if someone walked in front of them waving a red flag to "warn people" of the coming automobile. Supporters of such laws, like supporters of secondary liability laws for robots, can and will claim that there are "legitimate safety reasons" for them, and that holding back innovation and extending the lifetime of obsolete jobs is just a side benefit. But like those red flag laws, applying secondary liability to robotics would significantly hinder a key area of economic growth.

Techdirt has covered the question of a secondary liability safe harbor for robots before, and Ryan Calo's written a great paper about the legal issues coming out of the robotics arena, but an even more important (and specific) point is exactly why these safe harbors matter for job creation -- even as some continue to argue the other way (that such safe harbors will destroy jobs).

Technology has been replacing human labor since humans invented, well, technology. But while technology may get rid of inefficient jobs, it eventually creates replacements. To cite one commonly-used example, the switched telephone network put operators out of a job, but it created plentiful new jobs for telemarketers (and other businesses that relied upon the switched phone network... including everything built on and around the internet today). The problem is that while it was obvious how many operators would be out of a job, it wasn't immediately clear how lucrative (or annoying) telemarketing could be, let alone the eventual transformation of the phone lines into a vast global information sharing network, and the hundreds of millions of new jobs created because of it.

Erik Brynjolfsson and Andrew McAfee examine this problem in detail in their book, which I recommend. But much of it boils down to this. Technology creates jobs, yet it's not obvious where the new jobs are, so we need bold, persistent experimentation to find them:
Parallel experimentation by millions of entrepreneurs is the best and fastest way to do that. As Thomas Edison once said when trying to find the right combination of materials for a working lightbulb: "I have not failed. I've just found 10,000 ways that won't work." Multiply that by 10 million entrepreneurs and you can begin to see the scale of the economy's innovation potential.
This is especially important for robotics. It's obvious how robots make certain jobs obsolete -- e.g. driverless cars don't need drivers -- but it's less clear what new job opportunities they open up. We need to try different things.

Unfortunately, secondary liability creates problems for robot manufacturers who open up their products for experimentation. Ryan Calo explains this in more detail, but the basic problem is that, unlike computers, robots can easily cause physical harm. And under product liability law in most states, when there's physical harm to person or property, everyone involved in the manufacturing and distribution of that product is legally liable.

Ideally, we'd want something like a robot app store. But robot manufacturers would be unwilling to embrace commercial distribution of third-party apps if it increased their chances of being sued. There's evidence that Section 230's safe harbors (and, to some extent, the DMCA's safe harbors) play a key role in facilitating third-party content on the web. Absent a similar provision for robots, manufacturers are more likely to limit their liability by sticking to single-purpose robots or simply locking down key systems. That's fine, if we know exactly what we want our robots to do -- e.g. replace workers. But if we want robots to create jobs, it'd help to limit secondary liability for the robotics industry, open things up, and let widespread experiments happen freely.


Filed Under: automation, human labor, jobs, robots, secondary liability, unions


Reader Comments



  1. Anonymous Coward, 29 Nov 2011 @ 8:14pm

    Get off my lawn, you damn robots!


  2. Jay (profile), 29 Nov 2011 @ 8:20pm

    No story here...

    Fun fact #1: California prison guards are expensive.

    Another link

    One more

    It looks like the WSJ post is gone.


  3. Jake, 29 Nov 2011 @ 8:47pm

    I'm pretty sure that this is one job that really, really can't be done more effectively by a robot. More cheaply, yes, in the same way that a CCTV camera in every cell with one guy in an office at the other end of the building monitoring them could do the same job more cheaply. But in terms of preventing inmates harming themselves or each other, this is not going to be an improvement over boots on the ground.


  4. Anonymous Coward, 29 Nov 2011 @ 8:55pm

    I will try to find that on the new distributed open source search engine created to take away the power governments have to censor the current players in the game.
    http://search.yacy.net/


  5. Anonymous Coward, 29 Nov 2011 @ 9:19pm

    Re:

    I have to agree. I would not trust a robot with the force needed to subdue a prisoner. It sounds like the robots just monitor the prisoners... but if this is going to result in fewer actual guards, that isn't going to be a good thing when you get an eventual large-scale riot.


  6. Atkray (profile), 29 Nov 2011 @ 9:36pm

    It will never work

    I can't see them programming the robots to smuggle contraband into the prisons.


  7. Anonymous Coward, 29 Nov 2011 @ 9:42pm

    Actually, I believe robots would be better suited to keep an eye on prisoners, or even subdue them. Robots don't get angry, robots don't hold a grudge, and robots don't use more force than they are supposed to because they lack emotions. That could greatly reduce one of the primary causes of prison riots: mistreatment.

    You can't scare robots, you can argue with them, you can pound and pound and they will be just there in front of you until your anger goes away.

    Of course, it depends on how you program them; they can also be ruthless and use deadly force for no good reason at all, which could increase aggressive behavior levels inside an enclosed environment that is already highly stressful.

    The thing is, robots are nowhere near those capabilities yet. They can be teleoperated, though; we have the hardware to do it, but we don't have the AI to make it a reality.

    It should be possible to do those things eventually, because if it were impossible, humans wouldn't be able to do them either.


  8. Anonymous Coward, 29 Nov 2011 @ 9:50pm

    "but this replacement of humans by machines is actually a good reason to limit secondary liability for the robotics industry."

    I would not support limiting liability for a single industry like that. They can play by the same rules as everyone else.


  9. Anonymous Coward, 29 Nov 2011 @ 10:22pm

    Re:

    "robots don't hold a grudge,"

    Are you sure? You wouldn't program the robot to act a little differently towards someone who is a chronic problem?

    "robots don't use more force than they are supposed to because they lack emotions,"

    Well, they won't use more force because of emotions. But they might use more force because they LACK emotions such as compassion. They might also use more force due to a less intuitive grasp of the situation, programming errors, malfunctions, etc.

    "You can't scare robots, you can argue with them, you can pound and pound and they will be just there in front of you until your anger goes away."

    Somehow I don't think this would be good for the prisoners on a psychological level.

    You can't assume a perfect AI. You can say it's "possible" to make one, but that doesn't mean we're actually going to be able to do that in the next hundred years...


  10. Michael Ho (profile), 29 Nov 2011 @ 11:09pm

    Re: No story here...

    tried to fix that link... wsj might put that article behind its paywall, tho.


  11. Andrew F (profile), 29 Nov 2011 @ 11:10pm

    Re:

    "Everyone else" isn't homogeneous. If you get a lemon of a car, you can often go after everyone from the dealer to the car manufacturer to the guy the manufacturer bought screws from.

    In contrast, Apple isn't liable when an iOS app wipes out all your data. And Internet companies get special safe harbors under Section 230 and the DMCA.

    Under existing law, robots are treated more like cars than smartphones. The proposal is that, once you start installing apps on your robot, it makes more sense to flip that.


  12. rosspruden (profile), 29 Nov 2011 @ 11:12pm

    It's excellent to see a Techdirt piece on how safe harbor protections apply to more than just art. Well done, Andrew.

    I once heard Isaac Asimov speak in the early '80s. At the time, the Japanese had just started introducing robots into the auto assembly line, and reporters were calling up Asimov for comment since he had coined the term "robotics". The article above hints at the future Asimov wrote about in all his books, and Asimov even alluded to it in his lecture: when robots can replace humans, humans can finally move on to do more important things... but wait, robots are replacing humans! It's the paradox of efficiency: the more work you have taken away, the less work you have to do.

    What makes this article so interesting to me is that it shows why secondary liability protection is so important when the "worker" wades closer into tort law. A telephone switchboard can't hurt anyone, and it put many people out of a job. But a robot worker whose laser can slice you in half? Yeah, problem.

    Do those automated Predator drones have secondary liability protections?


  13. Andrew F (profile), 29 Nov 2011 @ 11:22pm

    Re: Re:

    There'll be things humans are better at for a long time. But we are starting to see exponential growth in certain areas of robotics. Seven years ago, researchers couldn't make an autonomous car drive eight miles across a desert. Last year, Google was testing them in city streets.

    One plus for robot prison guards is that they're easier to fix. Suppose a robot does make a mistake and uses excessive force. Once a programmer identifies what went wrong, the fix can easily be pushed out to all of the other robots very quickly. In contrast, remedying police brutality requires extensive training. And a lot of what appears as excessive force may really be a gut self-protective instinct on the part of the officer that's very hard to figure out.

    Will we replace all cops with machines? Probably not; you want a human to have final say over the use of force, for Isaac Asimov-type reasons. But I wouldn't be surprised if, in 30 years, we saw a 3-to-1 ratio of robots to humans in corrections and law enforcement.


  14. Andrew F (profile), 29 Nov 2011 @ 11:28pm

    Re:

    >> Do those automated Predator drones have secondary liability protections?

    Doubtful.

    But suppose the cops took a military-grade Predator drone and installed their own custom software on it, and ... bad things happen. I wouldn't hold the Predator manufacturer liable for that, just because they let people install custom software on the drones. It might be a different story if the manufacturer was actively involved in making the custom software.


  15. Michael Ho (profile), 30 Nov 2011 @ 12:14am

    Re: Re: Civilian Drones

    ask and ye shall receive:
    http://seattletimes.nwsource.com/html/nationworld/2016882681_drones29.html

    Police agencies want drones for air support to find runaway criminals. Utility companies expect they can help monitor oil, gas and water pipelines. Farmers believe drones could aid in spraying crops with pesticides.

    "It's going to happen," said Dan Elwell, vice president of civil aviation at the Aerospace Industries Association. "Now it's about figuring out how to safely assimilate the technology into national airspace."


  16. ethorad (profile), 30 Nov 2011 @ 1:00am

    not defective?

    And under product liability law in most states, when there's physical harm to person or property, everyone involved in the manufacturing and distribution of that product is legally liable.

    I think the link isn't quite the right one - it goes to a section on liability where the product is defective, which isn't quite the point you're making?

    In any case for non-defective robots I would hope that legal suits focus more on the user than the manufacturer. If they didn't I'm amazed that you are still able to buy guns, cars and even hammers in the US - after all they are surely used in causing harm every year.


  17. Richard (profile), 30 Nov 2011 @ 4:29am

    Re: Re:

    It would be even worse if the technology actually worked. At present the cost is one of the main things keeping the US prison population down to only moderately unreasonable levels. If it became cheaper because the guards were no longer human then the end result would be the incarceration of the entire US population.


  18. alternatives(), 30 Nov 2011 @ 5:03am

    I'd rather interact with a robot

    I'm sure the prisoners welcome their new robot overlords

    One would hope the robots are not programmed to be sadistic.

    With humans you get variability, and some of the guards are going to be worse human beings than the best humans who are locked up. And robots are not going to have emotions get in the way of their interactions with the prisoners. I.e., Prisoner #6 interacts with Prisoner 571-AZ, and Prisoner #6 spits in the face of prison guard Zimbardo. Guard Zimbardo then abuses Prisoner 571-AZ out of frustration, because he can't get to Prisoner #6. Prisoner 571-AZ is being abused because he knows #6 and #6 did something to Guard Zimbardo - where is the justice in this situation?


  19. nasch (profile), 30 Nov 2011 @ 6:08am

    Re: Re:

    And Internet companies get special safe harbors under Section 230 and the DMCA.

    You're not wrong, but my understanding is the safe harbors are not there to remove any liability service providers would have ordinarily, but to ensure that the liability is placed where it ought to have been anyway: on the user actually performing the illegal act. I think it's more of a defense against technologically illiterate judges and juries than anything else.


  20. nasch (profile), 30 Nov 2011 @ 6:16am

    Re: not defective?

    In any case for non-defective robots I would hope that legal suits focus more on the user than the manufacturer. If they didn't I'm amazed that you are still able to buy guns, cars and even hammers in the US - after all they are surely used in causing harm every year.

    I think the difference is the difficulty in correctly determining liability. If the robot was modified and then harmed someone, how do you determine why it harmed someone? Sometimes it might be clear, but at other times an analysis of the customer's modifications won't lead to an obvious conclusion.

    This uncertainty may lead some manufacturers to stay out of the business, or to try to prevent modifications. Possibly products will just be a lot more expensive because of all the insurance and lawyers. I guess the best case scenario would be complicated waivers you have to sign to buy a robot, and by "sign" I don't mean tick a box online.

    All these drawbacks make it worth considering some kind of rule to attempt to draw the liability line more clearly. That won't be easy, though, since any simple rule (e.g. "manufacturers are not liable for anything that happens, no matter what") will almost certainly be a bad one.


  21. rosspruden, 30 Nov 2011 @ 11:30am

    Re: Re: Re: Civilian Drones

    Thanks, Michael! :)


  22. Mike42 (profile), 30 Nov 2011 @ 11:45am

    Prison Guards...

    Hey, anyone out there know any prison guards? I do. My best friend is one. I can't tell you how happy he is to be supervising 20 or 30 inmates, many of whom are armed, and his defensive weapon is... a button. If anything bad happens, he presses it, and hopefully no one sticks him before the armed guards show. And when a fight between two gangs DID break out on his watch, the only thing that saved him was the respect he showed the prisoners on a daily basis. One of the gang-bangers actually started for him, but another gang-banger stopped it.

    In another situation, a prisoner asked for a second helping of lunch. A (new) surly guard told him no. The prisoner said, "I'm in for life. I've got no reason to take this," and proceeded to beat the guard to a pulp before someone pulled him off. The guard found out the hard way that respect is the best policy.

    My buddy says there ARE guards with bad attitudes. They also have very short life expectancies.


  23. Anonymous Coward, 30 Nov 2011 @ 3:58pm

    Re: Re:

    And then the car companies claim that the lemon of a car is actually a robot, because it can parallel park itself.

    "Under existing law, robots are treated more like cars than smartphones. The proposal is that, once you start installing apps on your robot, it makes more sense to flip that."

    But as was mentioned in the article, "the basic problem is that, unlike computers, robots can easily cause physical harm."

    (Even though I'm arguing on this side, I'm not totally convinced I'm right, by the way.)


  24. Frost (profile), 1 Dec 2011 @ 8:45am

    All non-creative work will be automated.

    There is no theoretical reason why we won't eventually automate everything that doesn't explicitly require human ingenuity and creativity. There simply aren't enough of those jobs to go around, or rather there aren't enough employers to hire the entire world population to be creative.

    This, of course, is only a problem for as long as we cling to the outmoded idea that you need a "job" to get "money" so you can get everything else. If we can the idea of money and start running the world on sensible real-world premises and just provide people with what they need, then automation isn't a threat to us - it's the single greatest thing that has ever happened to humanity. 100% unemployment for all - and all the housing, clothing, food etc everyone could possibly need in spite of or even because of that.

    Society is broken right now. It's not the fault of the one thing that has ever been necessary to and instrumental in raising human standards of living - technological progress.


  25. nasch (profile), 1 Dec 2011 @ 12:49pm

    Re: All non-creative work will be automated.

    There is no theoretical reason why we won't eventually automate everything that doesn't explicitly require human ingenuity and creativity.

    Robots have already painted pictures. Someday computers will create music, literature, and other pieces of art that will be indistinguishable from human handiwork.

    If we can the idea of money and start running the world on sensible real-world premises and just provide people with what they need, then automation isn't a threat to us - it's the single greatest thing that has ever happened to humanity. 100% unemployment for all - and all the housing, clothing, food etc everyone could possibly need in spite of or even because of that.

    Sounds beautiful. It will be a rocky road to get there, though.


