from the nice-one,-idiots dept
The Australian Parliament has passed a law mandating compelled access to encrypted devices and communications. The legislation was floated months ago and opened for public comment, but the Australian government appears to have ignored the numerous warnings that such a law would violate civil liberties and otherwise be an all-around bad idea. But that's OK. It's completely justified, according to the Prime Minister.
Scott Morrison, Australia’s prime minister, told local radio on Thursday that encryption laws were necessary to target Islamist terrorism, paedophile networks and organised crime. “These laws are used to catch the scum that try to bring our country down and we can’t give them a leave pass,” he said.
Sure, and if innocent people find their communications compromised by government-mandated holes, so be it. The law was rushed through Parliament in a late evening session, since every moment wasted was just one more leave pass for scum. Legislators promise to review the law in 18 months to ensure it hasn't been abused or created more problems than it solves, but let's be honest here: how often does legislation like this get clawed back after a periodic review? It has never happened in the history of the laws governing our surveillance programs, even after leaked docs exposed unconstitutional practices and widespread abuse of surveillance authorities.
Here's a short summary of the new powers the legislation hands over to law enforcement and national security agencies:
The law enables Australia’s attorney-general to order the likes of Apple, Facebook, and WhatsApp to build capability, such as software code, which enables police to access a particular device or service.
Companies may also have to provide the design specifications of their technology to police, facilitate access to a device or service, help authorities develop their own capabilities and conceal the fact that an agency has undertaken a covert operation.
This law will go into effect before the end of the year. How it will work in practice is anyone's guess. The law provides for compelled access -- including the creation of new code -- but no one seems to have any idea what that will look like on the ground. The new backdoors-in-everything-but-name will be put in place by developers and manufacturers at the drop of a court order, with the onus on the smart people in the tech business to iron out all of the problems.
The law's only restriction prevents the government from demanding that "systemic weaknesses" be built into devices or programs. Everything else is left to the imagination, including the actual process of introducing code changes into multi-user platforms or targeted devices.
An actual software developer, Alfie John, has put together a splendid Twitter thread pointing out the flaws in the government's assumptions about software development. Since compelled participants are forbidden from discussing surveillance court orders with anyone (which would include coworkers, supervisors, and the general public), these requested alterations would have to be implemented in secret. The problem is that code changes pass through a number of hands before they go live. Either everyone involved must be sworn to secrecy (which also means being threatened with jail time) or the process falls apart. Changes ordered by a court could be rejected by someone higher up the chain who can't be told why they exist. Worse, a colleague who spots the planned encryption hole could mistake the compelled coder for a data thief or a foreign operative.
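To make the thread's point concrete, here's a toy sketch of a bog-standard deployment gate -- hypothetical names, invented rules, nobody's actual pipeline -- showing why a change its author can't explain simply doesn't ship:

```python
# Toy model of a multi-approver review gate, illustrating the thread's
# point: code can't ship until several people have seen and approved it.
# All names and rules here are hypothetical, for illustration only.

from dataclasses import dataclass, field

@dataclass
class Change:
    author: str
    description: str                 # reviewers see this; a gag order forbids the truth
    approvals: set = field(default_factory=set)

REQUIRED_APPROVALS = 2               # a typical branch-protection rule on a large codebase

def review(change: Change, reviewer: str, looks_legitimate: bool) -> None:
    """A reviewer approves only if the change makes sense to them."""
    if not looks_legitimate:
        raise ValueError(f"{reviewer} rejected: unexplained change to crypto code")
    change.approvals.add(reviewer)

def deploy(change: Change) -> str:
    """Deployment refuses anything that hasn't collected enough approvals."""
    if len(change.approvals) < REQUIRED_APPROVALS:
        raise PermissionError("cannot deploy: not enough approvals")
    return f"shipped: {change.description}"

# The compelled engineer can't say "court order" -- so the description
# lies, and any competent reviewer flags the diff.
backdoor = Change(author="compelled_dev", description="minor logging tweak")
try:
    review(backdoor, "teammate", looks_legitimate=False)
except ValueError as err:
    print(err)                       # teammate rejected: unexplained change to crypto code

try:
    deploy(backdoor)
except PermissionError as err:
    print(err)                       # cannot deploy: not enough approvals
```

The only ways out of this toy model are the ones the thread describes: read every reviewer into the secret (and the prison threat), or bypass review entirely and hope nobody notices.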
Law enforcement will have to make everyone involved in the product or device complicit, and covered under the same prison threat, for this to work. The more people the order is exposed to, the higher the chance of leakage. And if the new code will break other code -- or the request simply can't be met due to any number of concerns -- the government may ask the court to hold the company and its personnel in contempt for failing to achieve the impossible.
To make matters worse, the company targeted with a compelled access request may be monitored for leaks before and after the request is submitted, putting employees under surveillance simply because of their profession.
In some cases, the only weakness that can be introduced will be systemic, which would run contrary to the law. How will the government handle it when this inevitably happens? Will it respect the law, or will it simply redefine the term to codify its unlawful actions?
Even if all of this somehow works flawlessly, users of devices and communications platforms will be put at risk. Sure, the compelled access might be targeted, but it will teach users to distrust the software and firmware updates that would otherwise keep them safe. The government may even encourage the forging of credentials or security certificates to ensure its compelled exploits reach their targets. And just because these backdoors theoretically only allow one government agent in at a time doesn't mean they aren't backdoors. They may be slightly more difficult for malicious actors to exploit, but once compelled access shatters that trust, other attack vectors will present themselves.
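To see why that trust matters, here's a minimal sketch of the standard code-signing check that update systems rely on -- written with the third-party Python 'cryptography' package, using an invented key and payload -- and what compelling or forging the signing credential would defeat:

```python
# Sketch of why update signing matters, and what a forged or compelled
# signing key breaks. Requires the third-party 'cryptography' package
# (pip install cryptography); the key and payload are invented examples.

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The vendor's signing key; its public half ships inside every device.
vendor_key = Ed25519PrivateKey.generate()
trusted_public_key = vendor_key.public_key()

update = b"firmware v2.1: patches remote-code-execution bug"
signature = vendor_key.sign(update)

def install(payload: bytes, sig: bytes) -> None:
    """A device installs an update only if the trusted vendor key signed it."""
    try:
        trusted_public_key.verify(sig, payload)
        print("signature valid -- installing:", payload.decode())
    except InvalidSignature:
        print("signature invalid -- update refused")

install(update, signature)                     # legitimate update: accepted

# An update signed by anyone else -- even a government agency -- is refused.
attacker_key = Ed25519PrivateKey.generate()
install(update, attacker_key.sign(update))     # forged signature: rejected

# Compelling the vendor to sign a tampered update (or to hand over the
# key) defeats this check entirely -- which is exactly why users would
# learn to distrust the updates themselves.
```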
It's a terrible law justified by the spoken equivalent of a bumper sticker. And it's going to end up doing serious damage -- not just in Australia, but all over the world. Bad legislation spreads like a communicable disease. If one democracy says this is acceptable, other free-world leaders will use its passage as a permission slip for encryption-targeting mandates of their own.
Filed Under: australia, backdoors, compelled access, encryption, moral panics, secrecy, software development, terrorism