US Government Making Another Attempt To Regulate Code Like It Regulates International Weapons Sales

from the a-zero-day-may-now-last-20-years dept

When code is treated like weapons, bad things happen. Governing bodies have previously treated encryption as weaponry, ensuring that only the powerful will have access to strong encryption while the general public must make do with weaker or compromised variants.

More recently, the US government went after the creator of a 3D-printed gun, claiming the very existence of printing instructions violated international arms regulations. So, it's not just the end result (the actual weapon) that's potentially covered under this ban, but the underlying data and code itself. That's currently being fought in court, carrying with it some potentially disturbing implications for several Constitutional rights.

Now, it appears the conflation of physical weapons and weaponized code is going to make things much, much worse. The EFF notes that the US government's adoption of recommended changes to an international arms trafficking agreement (the Wassenaar Arrangement) will likely cause very serious problems for security researchers and analysts in the future.

The BIS's version of the Wassenaar Arrangement's 2013 amendments contains none of the recommended security research exceptions and vastly expands the amount of technology subject to government control.

Specifically, the BIS proposal would add to the list of controlled technology:

Systems, equipment, components and software specially designed for the generation, operation or delivery of, or communication with, intrusion software include network penetration testing products that use intrusion software to identify vulnerabilities of computers and network-capable devices.

And:


Technology for the development of intrusion software includes proprietary research on the vulnerabilities and exploitation of computers and network-capable devices.

On its face, it appears that BIS has just proposed prohibiting the sharing of vulnerability research without a license.
As if things weren't already dangerous enough for security researchers, what with companies responding with threats and lawyers -- rather than apologies and appreciation -- when informed of security holes and the US government always resting its finger on the CFAA trigger. Violating the terms of this agreement could see researchers facing fines of up to $1 million and/or 20 years in prison.

Wassenaar was originally limited to physical items used in conventional weapons, like guns, landmines and missiles. It was amended in December 2013 to include surveillance tech, mainly in response to stories leaking out about Western companies like Gamma (FinFisher) and Hacking Team selling exploits and malware to oppressive governments, which then used these tools to track down dissidents and journalists.

The push to regulate the distribution of these tools had its heart in the right place, but the unintended consequences will keep good people from doing good things, while doing very little to prevent bad people from acquiring and deploying weaponized software.

The Wassenaar Arrangement's attempt to wrestle a mostly ethereal problem into a regulatable form was, for the most part, handled well. It defined the software it intended to control very narrowly and provided some essential exceptions:
Notably, the controls are not intended to apply to software or technology that is generally available to the public, in the public domain, or part of basic scientific research.
But, even so, it still contained the potential to do more harm than good.
We have significant problems with even the narrow Wassenaar language; the definition risks sweeping up many of the common and perfectly legitimate tools used in security research.
Either interpretation (Wassenaar, BIS) is a problem. The BIS version is much worse, but both will result in a less-secure computing world, despite being implemented with an eye on doing the opposite, as Robert Graham at Errata Security points out.
[G]ood and evil products are often indistinguishable from each other. The best way to secure your stuff is for you to attack yourself.

That means things like bug bounties that encourage people to find 0-days in your software, so that you can fix them before hackers (or the NSA) exploit them. That means scanning tools that hunt for any exploitable conditions in your computers, to find those bugs before hackers do. Likewise, companies use surveillance tools on their own networks (like intrusion prevention systems) to monitor activity and find hackers.

Thus, while Wassenaar targets evil products, they inadvertently catch the bulk of defensive products in their rules as well.
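Graham's point about "good and evil products" being indistinguishable is easy to demonstrate. The sketch below (a hypothetical helper, not code from any real product) is about the simplest possible TCP port scanner in Python; the very same lines serve a defender auditing their own machines and an attacker hunting for targets.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept TCP connections on `host`.

    Illustrative sketch only: these same few lines power both defensive
    audit tools and the kind of "intrusion software" delivery systems
    the BIS proposal would sweep up.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the TCP handshake succeeds
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

Nothing in the code marks it as "good" or "evil"; the intent lives entirely in who runs it, and against what.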
And the results will disproportionately harm those who need these protections the most. This is the end result of controls written with physical items in mind (items that originate from physical manufacturing plants and travel on physical means of conveyance) but copy-pasted to handle "items" that can traverse the internet with no known originating point.
That's not to say export controls would have no leverage. For example, these products usually require an abnormally high degree of training and technical support that can be tracked. However, the little good export controls provide is probably outweighed by the harm -- such as preventing dissidents in the affected countries from being able to defend themselves. We know they do little good now because we watch Bashar Al Assad brandish the latest iPhone that his wife picked up in Paris. Such restrictions may stop the little people in his country from getting things -- but they won't stop him.
The "open-source" exception in Wassenaar can be useful, up to a point. Researchers could post their findings to Github, as Graham points out, to ensure they're still protected. This, of course, means the Arrangement is still mostly useless, as the moment it's put into the public domain, any entity cut out of the distribution loop by this agreement can immediately make use of posted vulnerabilities and exploits. It also makes research destined to be open-sourced forbidden weaponry until the point it's actually made public. So, a laptop full of research is a prohibited weapon, while a Github post containing the same is not.
When security researchers discover a 0-day, they typically write a proof-of-concept exploit, then present their findings at the next conference. That means they have unpublished code on their laptop, code that they may make public later, but which is not yet technically open-source. If they travel outside the country, they have technically violated both the letter and the spirit of the export restrictions, and can go to jail for 20 years and be forced to pay a $1 million fine.
Pro tip:
Thus, make sure you always commit your latest changes to GitHub before getting on a plane.
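Graham's pro tip is easy to automate. Here's a minimal sketch in Python; the `snapshot` helper and its commit message are hypothetical, and it assumes `git` is on the PATH and that a remote named `origin` is configured if you ask it to push.

```python
import subprocess

def snapshot(repo_dir, message="WIP: research snapshot", push=False):
    """Stage and commit everything in repo_dir; optionally push to origin.

    Hypothetical helper: assumes git is installed, and that a remote
    named 'origin' exists when push=True.
    """
    subprocess.run(["git", "-C", repo_dir, "add", "-A"], check=True)
    # `git diff --cached --quiet` exits non-zero when something is staged,
    # so only commit when there is actually a change to record.
    staged = subprocess.run(["git", "-C", repo_dir, "diff", "--cached", "--quiet"])
    if staged.returncode != 0:
        subprocess.run(["git", "-C", repo_dir, "commit", "-m", message], check=True)
    if push:
        subprocess.run(["git", "-C", repo_dir, "push", "origin", "HEAD"], check=True)
```

Run `snapshot("/path/to/research", push=True)` before heading to the airport and the "unpublished code on a laptop" problem, at least as Graham frames it, goes away.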
Statements made by the BIS aren't exactly comforting. The BIS's implementation doesn't include an open-source exception, but supposedly, this will still be taken into consideration when the US government starts throwing around fines and prison sentences. Randy Wheeler of the BIS:
"We generally agree that vulnerability research is not controlled, nor is the technology related to choosing a target or finding a target, controlled." However, she undermined her message by stating that any software that is used to help develop 0-day exploits for sale would be covered by the proposal.
Again, bad for researchers. This gives the government leeway to imply intent when prosecuting, because the allowed and the forbidden look very similar while still in their formative stages.
[T]he only difference between an academic proof of concept and a 0-day for sale is the existence of a price tag.
Even if the exploit is not on the market at the point the government steps in, it would take very little to insinuate that it would have been headed to market, if not for the speedy intervention of regulators.

There is some good news, however. The BIS is accepting comments on its proposed adoption (and partial rewrite) of the amendments to the Wassenaar Arrangement. The comment period ends on July 20, 2015, so sooner rather than later would be good if you're interested in steering the government away from doing further damage to the livelihoods of security researchers.



Filed Under: code, regulations, security research, wassenaar, weapons


Reader Comments



  • neghvar (profile), 1 Jun 2015 @ 12:44pm

    I don't see this any differently than the posting of instructions to make bombs, poisons, and other means of death and destruction online. One may have the know-how, but one also needs the motive and resources to make the devices.


    • Anonymous Coward, 1 Jun 2015 @ 5:06pm

      Re:

      So, according to you, a malicious program is the same thing as a bomb, poison, or other means of death and destruction.

      Brilliant!


      • John Fenderson (profile), 2 Jun 2015 @ 7:53am

        Re: Re:

        That's not what he said. What he said was it was the same as instructions for how to build such things. It's a very strained analogy, but not totally wrong.


      • Uriel-238 (profile), 2 Jun 2015 @ 11:44am

        To some degree you are correct.

        Malicious programs are as dangerous as the range of damage done by their payloads. Explosives in an open field make for a spectacle and nothing more. Put them in the middle of a hospital, however...

        But you can plant a bomb pretty much anywhere and it will function. Malware has to be compatible with the system it affects, and has to get injected somehow, which is harder to do.

        And malware usually just destroys data (which can cause damage on its own, if it's the database of a bank or a hospital), but in order to effect real damage, it has to rely on the resources available, and on the programmer knowing how to sabotage a facility from within (so to sabotage a nuclear power plant, you'd need a programmer and a nuclear engineer familiar with the target plant).

        In that way, making targeted malware is a lot harder than mounting a targeted sabotage raid.

        And then, in the aftermath it's highly likely the enemy will obtain your code (much like Iran now having Stuxnet) and they will use it against you and to inform their response.


  • DannyB (profile), 1 Jun 2015 @ 12:49pm

    Fun memories from the early 1990's.

    You could not export encryption.

    But what about a textbook on encryption? What about a really good textbook? A book that includes source code examples and listings?

    Last time, the US wasn't willing to stop people from exiting the country with a textbook in their hand. Maybe this time that will change.


    • Anonymous Coward, 1 Jun 2015 @ 5:08pm

      Re: Fun memories from the early 1990's.

      Because disallowing the spread of knowledge and censoring facts has always worked out for the best in the past - right?


      • DannyB (profile), 2 Jun 2015 @ 5:31am

        Re: Re: Fun memories from the early 1990's.

        I hear public book burnings might be making a comeback. Throw Applied Cryptography on the top of the pile.


  • DannyB (profile), 1 Jun 2015 @ 12:54pm

    Security Researchers

    If security researchers in the US are in danger because of the code they work with, then using my special psychic powers, I can foresee what is going to happen.

    Yes.

    The fog is clearing, I am getting a vision . . .


    Security research will be done outside the US. Other countries will have all the good hacking tools. The US will also not have done any real research into how to defend against attacks, because knowing how to defend against attacks requires understanding how the attack will work. Otherwise you end up with software like . . .

    MickeySoft Maginot Line Defender ! Professional Edition

    The US has no penetration tools and weak defenses . . . and . . . something major is happening now . . .

    . . . oh darn, the vision is going dark . . .


    • MadAsASnake (profile), 1 Jun 2015 @ 1:21pm

      Re: Security Researchers

      Pretty sure they'll have a secret law that allows the NSA to keep doing it, which they'll do anyway... thinking that nobody will know.


      • DannyB (profile), 1 Jun 2015 @ 2:09pm

        Re: Re: Security Researchers

        That doesn't mean Your Bank and Your Money will be safe from a Cyber Pearl Harbor.


        • Anonymous Coward, 1 Jun 2015 @ 5:10pm

          Re: Re: Re: Security Researchers

          That's because they don't care about you - the little guy.


          • DannyB (profile), 2 Jun 2015 @ 5:34am

            Re: Re: Re: Re: Security Researchers

            While it is true that they don't care about the little guy, they may foolishly fail to recognize that digital bits do not recognize the difference between the nobility and the peasants. A cyber Pearl Harbor might wipe out the wealth of the rich and poor alike. Equal opportunity hacking.


      • Anonymous Coward, 1 Jun 2015 @ 2:14pm

        Re: Re: Security Researchers

        Why bother having a law that says the NSA can do it? It is pretty clear the NSA does what they want, then finds a way to misinterpret the law to say it was probably legitimate, and was in good faith even if it was illegal. Not having a secret law saves everyone the time and trouble of pretending to comply with it.


        • Anonymous Coward, 1 Jun 2015 @ 5:11pm

          Re: Re: Re: Security Researchers

          "Why bother having a law that says the NSA can do it?"

          So that the evidence can be used in court?


          • Anonymous Coward, 1 Jun 2015 @ 5:41pm

            Re: Re: Re: Re: Security Researchers

            "So that the evidence can be used in court?"

            They can just lie about where the evidence came from. It's called parallel construction.


            • That One Guy (profile), 1 Jun 2015 @ 6:16pm

              Re: Re: Re: Re: Re: Security Researchers

              'Evidence laundering' if you would, call it what it really is, not the 'nice' sounding term they like to use to make it seem less a travesty of justice than it really is.


          • DannyB (profile), 2 Jun 2015 @ 5:37am

            Re: Re: Re: Re: Security Researchers

            > So that evidence can be used in court


            Hey, man, get with the times. We now have secret courts. It fits the pattern that follows from the spying.


            Massive spying on citizens
            Secret laws
            Secret interpretations of laws
            Secret courts
            Secret court orders
            Secret arrests (in the middle of the night)
            Secret evidence (that the defense cannot access)
            Secret trials
            Secret convictions
            Secret incarceration
            Widespread police brutality condoned, maybe even encouraged
            Government torture programs

            Sound like what we were fighting in the previous century?


  • Anonymous Coward, 1 Jun 2015 @ 1:15pm

    However, she undermined her message by stating that any software that is used to help develop 0-day exploits for sale would be covered by the proposal.
    I find the automatic code completion on Microsoft Visual Studio (which is itself proprietary, although you can use it to write open-source code) to be very helpful in developing all kinds of programs, including 0-day exploits. Does that now mean that Microsoft Visual Studio is a controlled program? :)


    • Anonymous Coward, 1 Jun 2015 @ 1:24pm

      Re:

      "Does that now mean that Microsoft Visual Studio is a controlled program? :)"

      It all depends on who you are, citizen. Step out of line and it will be, for you. Selective enforcement.


  • Anonymous Coward, 1 Jun 2015 @ 1:26pm

    Politicians!

    Politicians are determined to regulate technology that they have no chance of understanding. They sound like raving fundamentalists, standing at their own altar, claiming only they know the right path when they can't even find the exit from their church to see the real world. GOD save us from the narcissistic idiots.


  • Anonymous Coward, 1 Jun 2015 @ 1:28pm

    I'm sick and tired of everything needing a license. At this rate it won't be long before we need a license to breathe. Sure makes a bunch of licensing boards very wealthy but it comes at a huge social cost.


  • dfed (profile), 1 Jun 2015 @ 3:01pm

    This is ironic, since a lot of exploits now rely on fallout from the '90s crypto export laws: i.e., the ability to request crypto strength downgrades...


  • Nickweller (profile), 1 Jun 2015 @ 3:40pm

    Regulation of International Weapons Sales?

    "US Government Making Another Attempt To Regulate Code Like It Regulates International Weapons Sales"

    You do realize that the international arms trade is a total free-for-all, as they'll sell weapons to anyone who can afford them.

    http://www.rottentomatoes.com/m/lord_of_war/


    • Anonymous Coward, 1 Jun 2015 @ 5:13pm

      Re: Regulation of International Weapons Sales?

      The regulation is for their competition.


  • Anonymous Coward, 1 Jun 2015 @ 4:18pm

    >Politicians determined to regulate technology that have no chance of understanding. They sound like raving fundamentalists, ...

    I appreciate the attempt at an expression of sanity. But at the end you let slip a level of religious bigotry that might offend some people. You might want to be more careful when you aren't trolling.

    But this isn't about regulating technology. This is an attempt to regulate _thought_. It's like, say, taxing paper--and for the same reason: to avoid people being infected with the alien and seditious ideas written thereon. And it's likely to have the same direct effect.

    Tea Party. Bring your own hatchet and USB. Paper is optional.


    • Anonymous Coward, 1 Jun 2015 @ 5:17pm

      Re:

      I think the comparison is apt.

      For example:

      1) "This is an attempt to regulate _thought_"
      ---- check

      2) "to avoid people being infected with the alien and seditious ideas "
      ---- check

      3) "Tea Party"
      ---- check

      btw, raving fundamentalists of all flavors are a very real threat, much more so than some code.


  • Lawrence D’Oliveiro, 1 Jun 2015 @ 7:55pm

    Why Code Is Not Like A Weapon

    Encryption code lets you shop online and pay for things safely. It preserves your privacy. It lets you be sure the person you are communicating with is who they say they are.

    In short, code lets you do constructive things.

    A weapon does none of these things. It has only destructive uses. It is only useful for causing damage to people or property, nothing more.

    This is why code is not like a weapon, and should not be regulated like one.


    • DannyB (profile), 2 Jun 2015 @ 5:45am

      Re: Why Code Is Not Like A Weapon

      Don't you think 3D printed guns are like a weapon?

      The new 3D printing must be heavily regulated. There are terrible threats to society that you may not have considered.

      Here is only one example of the danger. An unregulated 3D printer could be used, in secret, to circumvent Arizona's legal limit of two dildos per household. Regulating 3D printers will help keep us all safe from such dangers.


      • Lawrence D’Oliveiro, 2 Jun 2015 @ 4:50pm

        Re: Why Code Is Not Like A Weapon

        Does a 3D-printed gun have any constructive use? No. It may not be very good as a weapon, but it is still a weapon, and should be regulated as such.

        As John “Danger Man” Drake said: “I don’t like guns. They’re noisy, and they hurt people.”


        • Uriel-238 (profile), 2 Jun 2015 @ 10:49pm

          The constructive use of printed Guns

          Part of the advantage of the great 3D shape library (which includes all the parts for numerous common weapons) is not fully recognized here in the US; rather, it's well understood in Africa, where small and brutal asymmetrical conflicts take place and one faction is often grossly oppressing the other.

          Printed gun parts can serve as prototypes for making the components for real: You use the prototype to craft the pouring mold and from that make the parts out of metal. Stamp and machine as you require.

          Factions that couldn't previously afford to arm their people, when guns were controlled by dealers, will now be able to. And factions that before had a massive advantage will quickly see cause not to capture, rape, torture, enslave and massacre their opponents at their pleasure, since their upper hand won't be so certain and eventual comeuppance is assured.

          So the international community has very good cause to expand, maintain and sustain an open-source library of printable guns. And they will stop only when we develop something better than firearms with which to fight wars.

          So unless your own western government who couldn't care less about African people and their petty wars (because Africa and dark skin and no oil) plans to build a great censorship firewall around your nation to keep out those gun shapes, they won't be able to stop people from printing gun parts and occasionally making printed guns.

          Also, since the US is full of gun enthusiasts, some of whom like to customize and accessorize their weapons, and a 3D printer is an amazing tool for such a hobby, criminalizing the printing of gun parts will just make criminals out of those hobbyists. Because they're not going to stop when the state tells them they don't get to use 3D printers for their hobby.

          PS: Replace Africa with South America if you like and it's all still true.


          • Lawrence D’Oliveiro, 3 Jun 2015 @ 12:23am

            Re: Why Code Is Not Like A Weapon

            Wow, how much more fanciful do you want to get? Is that really how you folks in the US see things—that the solution to the world’s problems is “more guns”? You don’t think the world’s trouble spots are already so awash with cheap, plentiful AK-47s, RPGs and the like, you think they really need this wonderful new “3D printing” technology to make the crucial difference?

            Even the world’s most unpopular governments have no trouble shooting back at people who shoot at them. Just call the attackers “terrorists”, and public opinion will raise nary a whisper.

            But if government troops shoot at a peaceful protest--now that can make all the difference...


            • Uriel-238 (profile), 3 Jun 2015 @ 10:02am

              "Fanciful"

              My statement was descriptive, not prescriptive. I was talking of circumstances that are already changing, not how they should be changed.

              But you seem to have a strong notion of how to fix some terrible situations in the world's badlands, so do tell. I'm sure that those suffering from the hard realities of oppression would love to better understand how your preference for pacifistic principle should override their desires for life and liberty.

              Granted the current US regime is a dick and a bully regarding how it chooses where to apply military intervention. But the theaters about which I speak are ones in which the United States officially couldn't care less.


  • Rekrul, 1 Jun 2015 @ 8:25pm

    "We generally agree that vulnerability research is not controlled, nor is the technology related to choosing a target or finding a target, controlled." However, she undermined her message by stating that any software that is used to help develop 0-day exploits for sale would be covered by the proposal.


    So that means that compilers and hex editors are now controlled software...


  • Anonymous Coward, 1 Jun 2015 @ 8:52pm

    Code don't kill people, people kill people; hold them responsible.


  • toyotabedzrock (profile), 2 Jun 2015 @ 1:31am

    I think you are reading the changes in a way that allows you to push an anti-regulation argument, and you don't offer an alternative.

    You further seem to think researchers being bothered by government outweighs people being killed because they protested or wrote about a protest.


    • Rekrul, 2 Jun 2015 @ 8:08pm

      Re:

      I think you are reading the changes in a way that allows you to push an anti regulation argument and you don't offer an alternative.


      History has shown that if a law can be abused, it will be abused.

      Look at how the CFAA (Computer Fraud and Abuse Act) was used to try and bully Aaron Swartz even though he hadn't actually committed a real crime. Look at how the laws against child pornography are being used to punish the same children that they're supposed to protect. Look at how activist groups are now being labeled as "domestic terrorists".

      They don't use vague language in laws because they're too shortsighted to see how it could be abused, they make the language intentionally vague so that they can have the freedom to abuse it however they want.


  • Anonymous Coward, 2 Jun 2015 @ 7:33am

    Like the researchers state, the difference between an exploit and valid software is very hazy. This would also outlaw jailbreaking software, as most use 0-days that are not published, so as not to be immediately circumvented by the hardware vendor. All in all, this is just a stupid agreement that will most likely be ignored, and if petty violations are prosecuted, it will probably cause the global community more damage than it prevents.


  • Uriel-238 (profile), 2 Jun 2015 @ 11:51am

    The Nth country experiment

    demonstrated to us that the enemy will acquire the knowledge they need to make the weapons they want. Thank goodness nuclear weapons are kinda tricky.

    Terrorists, revolutionaries, saboteurs and industrial spies will get their gun-parts designs and code anyway, and quickly. The better thing to do is develop security protocols that defend against all known attacks.

    But the FBI loves its backdoors.

    And big corporations love their pristine publicity.

    If they criminalize the security engineers, the security engineers will go criminal. I'm sure others will pay them well, and treat them better, to ply their craft.


