Dissecting And Dismantling The Myths Of The DOJ's Motion To Compel Apple To Build A Backdoor
from the dishonest-doj dept
While everyone's waiting for Apple's response (due late next week) to the order to create a backdoor that would help the FBI brute force Syed Farook's work iPhone, the DOJ wasted no time in further pleading its own case, with a motion to compel. I've gone through it and it's one of the most dishonest and misleading filings I've seen from the DOJ -- and that's saying something. Let's dig in a bit:Rather than assist the effort to fully investigate a deadly terrorist attack by obeying this Court's Order of February 16, 2016, Apple has responded by publicly repudiating that Order. Apple has attempted to design and market its products to allow technology, rather than the law, to control access to data which has been found by this Court to be warranted for an important investigation. Despite its efforts, Apple nonetheless retains the technical ability to comply with the Order, and so should be required to obey it.This part is only marginally misleading. The key point: of course Apple has designed a product that allows technology to control access because that's how encryption works. It's as if the DOJ still doesn't understand that. Here's a simple, if unfortunate, fact for the DOJ: there are always going to be some forms of communications that it doesn't get to scoop up. Already we know that Farook and his wife destroyed their two personal iPhones. Why not just recognize that fully encrypted phones are the equivalent of that? No one seems to be whining about the destroyed iPhones and what may have been lost even though the very fact that they were destroyed, and this one was not, suggests that if there was anything important on any of his phones, it wasn't this one. There are also things like communications between, say, a husband and wife in their own home. The DOJ can never get access to those because the two people are dead. Think of that like their brains were encrypted and their death made the key get tossed.
There are lots of situations where the physical reality is that the DOJ cannot recover communications. It's not the end of the world. It's never been the end of the world.
Apple, now (finally) designing encryption systems that no one else can get into, sees this as the best way to protect the American public, because it means everyone's own information is much safer. It means fewer phones get stolen. It means fewer people are likely to have their information hacked. It means much more safety for the vast majority of the public. And I won't even get into the fact that it was the US government's own hacking of private data that pushed many companies to move more quickly towards stronger encryption.
The government has reason to believe that Farook used that iPhone to communicate with some of the very people whom he and Malik murdered. The phone may contain critical communications and data prior to and around the time of the shooting that, thus far: (1) has not been accessed; (2) may reside solely on the phone; and (3) cannot be accessed by any other means known to either the government or Apple. The FBI obtained a warrant to search the iPhone, and the owner of the iPhone, Farook's employer, also gave the FBI its consent to the search. Because the iPhone was locked, the government subsequently sought Apple's help in its efforts to execute the lawfully issued search warrant. Apple refused.

"May contain" is a pretty weak standard, especially given what I said above. Furthermore, if there were communications with Farook's victims, shouldn't that information also be accessible via those individuals' phones? And if the FBI already knows there was communication between them, much of that data should be available elsewhere -- the metadata of a phone call, for example.
Apple left the government with no option other than to apply to this Court for the Order issued on February 16, 2016.

Actually, there are plenty of other options, including traditional detective work, looking for information from other sources, or just recognizing that sometimes you don't get every piece of data that exists. And that's okay.
The Order requires Apple to assist the FBI with respect to this single iPhone used by Farook by providing the FBI with the opportunity to determine the passcode. The Order does not, as Apple's public statement alleges, require Apple to create or provide a "back door" to every iPhone; it does not provide "hackers and criminals" access to iPhones; it does not require Apple to "hack [its] own users" or to "decrypt" its own phones; it does not give the government "the power to reach into anyone's device" without a warrant or court authorization; and it does not compromise the security of personal information. To the contrary, the Order allows Apple to retain custody of its software at all times, and it gives Apple flexibility in the manner in which it provides assistance. In fact, the software never has to come into the government's custody.

And here's where the misleading stuff really starts flowing. It absolutely is a backdoor. Anything that makes it easier for a third party to decrypt data without knowing the key is a backdoor. That's the definition of a backdoor. That it comes in the form of making it substantially easier to brute force the passcode doesn't change the fact that it's still a backdoor.
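Just to make concrete how much those protections matter, here's some rough back-of-the-envelope math -- my own sketch, not anything in the filing. The ~80ms per guess is the hardware key-derivation cost that has been widely reported for this model; everything else is arithmetic:

```python
# Hypothetical back-of-the-envelope math, not code from the filing.
# Assumption: ~80ms per passcode attempt (the widely reported hardware
# key-derivation cost) once the software-imposed retry limit and
# escalating delays are stripped out.

ATTEMPT_COST_S = 0.080

def worst_case(keyspace: int) -> str:
    """Time to exhaust the whole keyspace, in a readable unit."""
    seconds = keyspace * ATTEMPT_COST_S
    if seconds < 3600:
        return f"{seconds / 60:.1f} minutes"
    if seconds < 86400:
        return f"{seconds / 3600:.1f} hours"
    return f"{seconds / 86400:.1f} days"

for digits in (4, 6):
    print(f"{digits}-digit PIN: {worst_case(10 ** digits)} to try every code")
# 4-digit PIN: 13.3 minutes to try every code
# 6-digit PIN: 22.2 hours to try every code
```

In other words, with the retry limit and delays gone, a short numeric passcode falls in minutes to hours. That is the entire point of the software being demanded.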
And, yes, this impacts "every" iPhone. As Senator Ron Wyden correctly notes, if the precedent is set that Apple can be forced to do this for this one iPhone, it means it can be forced to do it for all iPhones. No, this single piece of code may not be the issue -- though there are some concerns that even creating this code could lead to some problems if the phone connects to a server -- but forcing a company to hack its own customers puts everyone at risk.
And yes, there is no legitimate way to describe this other than as hacking Apple's own customers. The whole point of the system is to get around the fact that they don't have the key, and building a tool to disable security features so a brute force attack can be run against the passcode is very much exactly "hacking" Apple's own customers. Sure, this one still requires a warrant, but once Apple is pushed to create that kind of code -- and other companies are forced to build similar backdoors -- the technology itself is being designed with extra vulnerabilities that will put many more people at risk. It's not just about the DOJ seeing what's on this damn phone.
The fact that Apple can retain control over the software is a total red herring. No one cares about that. It's about the precedent of a court requiring a company to hack its own customers, as well as forcing it to create a backdoor that can be used in the future -- possibly even to the point of requiring such backdoors in future products.
In the past, Apple has consistently complied with a significant number of orders issued pursuant to the All Writs Act to facilitate the execution of search warrants on Apple devices running earlier versions of iOS. The use of the All Writs Act to facilitate a warrant is therefore not unprecedented; Apple itself has recognized it for years. Based on Apple's recent public statement and other statements by Apple, Apple's current refusal to comply with the Court's Order, despite the technical feasibility of doing so, instead appears to be based on its concern for its business model and public brand marketing strategy.

And the misleading bullshit gets ratcheted up a notch. First of all, we already went through why the "Apple helped us in the past" story is wrong. This is totally different. One is giving access to unencrypted information that Apple had full access to. The other is building a system to strip away security features in order to break into an encrypted device. Very, very different. Second, the whole idea that better protecting its customers is nothing more than "a brand marketing strategy" is insulting. Shouldn't the US government want the American public to be protected from criminals and malicious hackers and attacks? The best way to do that is with encryption. The fact that consumers are demanding that they be made safer is not an "Apple marketing strategy"; it's Apple looking out for the best interests of its customers.
And I won't even dig deep into the fact that one of the big reasons why the public is clamoring for more protection these days is because the US government ran roughshod over the Constitution over the past few years to suck up all kinds of information it shouldn't have.
Later in the motion, the DOJ again argues that there's no "unreasonable burden" on Apple to hack its own customers. It trots out a similar line that was in the original application for the order, saying "what's the big deal -- we're just asking for software, and Apple makes software, so no burden."
While the Order in this case requires Apple to provide or employ modified software, modifying an operating system -- which is essentially writing software code in a discrete and limited manner -- is not an unreasonable burden for a company that writes software code as part of its regular business. The simple fact of having to create code that may not now exist in the exact form required does not an undue burden make. In fact, providers of electronic communications services and remote computing services are sometimes required to write some amount of code in order to gather information in response to subpoenas or other process. Additionally, assistance under the All Writs Act has been compelled to provide something that did not previously exist -- the decryption of the contents of devices seized pursuant to a search warrant. In United States v. Fricosu..., a defendant's computer, whose contents were encrypted, was seized, and the defendant was ordered pursuant to the All Writs Act to assist the government in producing a copy of the contents of the computer. Here, the type of assistance does not even require Apple to assist in producing the contents; the assistance is rather to facilitate the FBI's attempts to test passcodes.

Again, this is both ridiculous and extremely misleading. Creating brand new software -- a brand new firmware/operating system -- is fraught with challenging questions and potential security issues. It's not just something someone whips off. If done incorrectly, it could even brick the device entirely -- and can you imagine how the FBI would react then? This is something that would require a lot of engineering and a lot of testing, and it still might create additional problems, because software is funny that way. Saying "you guys write software, so writing a whole new bit of software isn't a burden" is profoundly ignorant of the technological issues. Update: If you want a long and detailed post from someone who absolutely knows how iPhone forensics works, and how incredibly involved creating this software would be, go read this blog post right now. In it, Jonathan Zdziarski notes that the DOJ is flat-out lying in the way it describes what it's asking Apple to do; it would be incredibly involved, and it would create all sorts of risks of the code getting out.
Second, the Fricosu case is quite different. That was compelling someone to give up their own encryption key -- something that not all courts agree with by the way, as some view it as a 5th Amendment or 1st Amendment violation. That's quite different than "write a whole new software thing that works perfectly the way we want it to."
As noted above, Apple designs and implements all of the features discussed, writes and signs the software, routinely patches security or functionality issues in its operating system, and releases new versions of its operating system to address issues. By comparison, writing a program that turns off features that Apple was responsible for writing to begin with would not be unduly burdensome.

This shows a profound technological ignorance. Yes, Apple updates its operating system all the time, but yanking out security features is a very different issue, and could have a much wider impact. It might not, but simply assuming that it's easy seems profoundly ignorant of how software and interdependencies work. Again, the DOJ just pretends it's easy, as if Apple can check some boxes that say "turn off these features." That's not how it works.
Moreover, contrary to Apple's recent public statement that the assistance ordered by the Court "could be used over and over again, on any number of devices" and that "[t]he government is asking Apple to hack our own users," the Order is tailored for and limited to this particular phone. And the Order will facilitate only the FBI's efforts to search the phone; it does not require Apple to conduct the search or access any content on the phone. Nor is compliance with the Order a threat to other users of Apple products. Apple may maintain custody of the software, destroy it after its purpose under the Order has been served, refuse to disseminate it outside of Apple, and make clear to the world that it does not apply to other devices or users without lawful court orders. As such, compliance with the Order presents no danger for any other phone and is not "the equivalent of a master key, capable of opening hundreds of millions of locks."

We discussed some of this above, but the issue is not the specific code that Apple will be forced to write, but rather the very fact that it will be (contrary to the DOJ's claim) forced to hack its own phones to eliminate key security features, in order to allow the FBI to get around the security of the phone and access encrypted content. If the court can order it for this phone, then yes, it can order it for any iPhone, and that's the key concern. Furthermore, having Apple tinker with the software can introduce security vulnerabilities -- and already this discussion has revealed a lot about how attackers might go after the iPhone. I'm all for full disclosure of how systems work, so that part is okay. But the real issue is what happens next. If Apple moves to close this "loophole" in how its security works in the next iPhone update, will the court then use the All Writs Act to stop it from doing so? That's the bigger issue here, and one the DOJ completely pretends doesn't exist.
To the extent that Apple claims that the Order is unreasonably burdensome because it undermines Apple's marketing strategies or because it fears criticism for providing lawful access to the government, these concerns do not establish an undue burden. The principle that "private citizens have a duty to provide assistance to law enforcement officials when it is required is by no means foreign to our traditions."

Again, this is a made-up talking point. Protecting user privacy, as users demand it, is not a "marketing strategy." It's a safety and security strategy. You'd think the FBI, of all agencies, would appreciate that.
Anyway, you can go through the entire 35-page filing yourself, but these were the key points, and almost all of them are misleading. It should be interesting to see Apple's response next week.
Filed Under: all writs act, backdoors, doj, encryption, fbi, going dark, hacking, motion to compel, safety, security
Companies: apple
Reader Comments
Apple messed up
IF it is retrievable then Apple messed up when they designed the phone in the first place.
If Apple was working to open source rules then they could simply turn round and say to the DOJ "The information is available - you do it - if you can".
The problem here is that Apple seems to have relied on Security by Obscurity.
Laughable claim by the FBI
Farook destroyed his personal phone. Why did he not destroy this phone? Obviously, it has no data that is useful to investigators. As pointed out in the article, he used the phone to communicate with work colleagues, so why not get the data off those phones?
The FBI knows all this, so why waste resources on this phone? The only reason is that the FBI thinks that this is a case where people will sympathise with them and this will help the FBI establish a precedent.
Re: Apple messed up
Not really. Mobile device firmware updates -- at least for iOS and Android -- need to be digitally signed by the vendor. The DOJ and NSA do not have Apple's signing key. Even if law enforcement had the skills to write an operating system, the hardware would reject it as not being properly signed.
Not really. What law enforcement is looking to do is to have Apple make it easier for them to attempt to brute-force the PIN. Apple cannot retrieve the data any more than law enforcement can. About the only thing you can argue here is that the 10-failures-and-it's-wiped rule could be burned into hardware, such that it could not be removed. That's certainly possible, and Apple might do that in future devices.
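Conceptually, the signing gate described above works something like this sketch -- illustrative only, and nothing like Apple's actual boot chain, which has its own keys, image formats, and hardware checks. The device carries only the vendor's public key, so it can verify updates but nobody without the private key can produce one that passes:

```python
# Illustrative sketch of a signature-gated update, NOT Apple's real
# boot chain. Uses the widely available `cryptography` package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the private key never leaves the vendor.
vendor_key = Ed25519PrivateKey.generate()
firmware = b"...new OS image bytes..."
signature = vendor_key.sign(firmware)

# Device side: the baked-in public key rejects anything else.
device_pubkey = vendor_key.public_key()

def accept_update(image: bytes, sig: bytes) -> bool:
    try:
        device_pubkey.verify(sig, image)
        return True
    except InvalidSignature:
        return False

assert accept_update(firmware, signature)                     # vendor-signed: OK
assert not accept_update(b"third-party OS image", signature)  # rejected
```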
Re: Laughable claim by the FBI
I'm also curious about how you'd go about creating this software that only works on this one phone - either you write it and the first test is on the actual device (maybe deleting the data the FBI are after due to a bug), or you write it to run on some other phone (or all phones), test it there, then add in the "only on this particular phone" code.
Re: Re: Apple messed up
I'd bet money they'll implement that.
all of this seems to me.......
And if they -do- brick the phone, what can the FBI actually do besides wag a finger at Apple?
Fifth Amendment, and Corporations rights of Individuals, too! ;)
Re:
China won't allow Windows later than XP because of the spying issue involving Microsoft and the US government. That's a whole country that won't be considering Cisco's equipment either, and I haven't heard the government stepping up to say it will cover Cisco's losses due to its own actions.

Nor in all this do I hear the FBI saying that it knows this will cause a major hit to Apple's bottom line and that it would be willing to pay for it. That part seems rather mysteriously silent.
The answer, of course, is that they still feel like they own your phone as much as you do. This gets back to the issue, discussed here often, of companies remote-deleting content on your device because you don't really "own" the content or even the device.

Maybe stuff like this will make these companies give some control back to their customers. Of course, Stallman would say that until everything is free software, you'll never own your device.
Circular Investigation
Compelled Commercial Speech
This is the FBI compelling Apple to write software that does what the FBI wants. That is nothing more than government-mandated compelled speech.
Why isn't this being framed as a 1st amendment violation, forcing Apple to adopt the FBI's utterances?
Procedurally Bizarre Fallacious Preemptive Motion to Compel
But Apple only objected to the order in Tim Cook's post. Apple has not said, "We will refuse to obey this Order."
And more importantly, Apple doesn't have to comply with anything yet. It has until next week to file its opposition.
So the entire motion is procedurally suspect at the least, and based on DOJ saying Apple said something it didn't say, and therefore ... (butthurt or something)
and yet the Feds........
https://gma.yahoo.com/san-bernardino-shooters-apple-id-passcode-changed-while-234003785--abc-news-topstories.html
Apple fight the fight don't give in
Question!
I agree that backdoors to encryption are bad, but in this instance, is the DoJ in the right with the warrant?
Of course, the DoJ's initials lend themselves to something even more accurate about that agency.
Re: Procedurally Bizarre Fallacious Preemptive Motion to Compel
Either way, this is an extraordinary situation where the All Writs Act applies. A number of people died because of one of these county workers, and everyone should be helping solve the crime.
Re: Question!
On balance, though, anecdotally (I haven't really counted) somewhere around 95% of all the comments I have read about it think it is wrong, legal or not.
Re: Re: Procedurally Bizarre Fallacious Preemptive Motion to Compel
Re: Procedurally Bizarre Fallacious Preemptive Motion to Compel
So the real reason must be that the DOJ wanted to get its version of the story into the press and before the public to counter Apple's statements (and it reads like it; the press certainly ate it up). And if so, that's not a legit use of our tax dollars in my book.
Re: Apple messed up
"he key point: of course Apple has designed a product that allows technology to control access because that's how encryption works."
The problem is that Apple claims to have the secure chip thing for encryption, but for all that it depends on a pincode from the user as the true security - because the user has to be able to access the encrypted device, right?
The problem is that Apple's security boils down to the pin code, plus two very primitive protections: limit the attempts, and delay each attempt. Both of those features are set up in iOS or other firmware, and pretty much everyone (including people who worked on it) agrees that they could be disabled without harming the secure key itself.

Good encryption would require a pincode long enough to make brute forcing the system meaningless. Even at 8 digits (just numbers), the process would take over three years at one attempt per second. Force it to be letters rather than numbers and an 8-letter passcode requires 208,827,064,576 attempts. Since it must be manually entered on the device (i.e., they cannot send thousands per second or operate in parallel), the security would be intact and there would be little or no discussion.

Apple appears to have made it too easy to brute force, and put the anti-brute-force protections in firmware or OS code that can be overwritten under the right circumstances. That choice makes the FBI request and the court order possible, and Apple hates it.
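For anyone who wants to check that arithmetic, a quick sketch, assuming (as above) one manual attempt per second:

```python
# Checking the arithmetic above, assuming one manual attempt per second.
SECONDS_PER_YEAR = 86400 * 365

for desc, keyspace in (("8 digits", 10 ** 8), ("8 letters", 26 ** 8)):
    years = keyspace / SECONDS_PER_YEAR
    print(f"{desc}: {keyspace:,} combinations, ~{years:,.1f} years at 1/sec")
# 8 digits: 100,000,000 combinations, ~3.2 years at 1/sec
# 8 letters: 208,827,064,576 combinations, ~6,621.9 years at 1/sec
```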
Actually, it's not a backdoor. Apple would not be unlocking the device or providing a method by which it could access the device without knowing the pincode. It would only be removing artificial blocks which stop a brute force effort from obtaining the pincode in less than a month. This situation exists because Apple chose the way its system works, and allowed these features to be set in re-writable firmware or at the OS level, rather than being attached in a manner that would kill the key if changed.

So, no, there is no backdoor. Apple is being asked to remove the bubble wrap from its (relatively) easily picked lock. Nobody is asking for a master key to this lock. The feds will force the lock once the bubble wrap is out of the way.
Two can play at that game
Anyone calling for crippling encryption is calling for something that would be a massive boon to criminals, and put everyone at risk, and they deserve to be called out on it any time they do so.
Re: Re: Question!
On the other hand, they followed the 4th Amendment at least.
Re: Re: Apple messed up
Apple is a consumer product manufacturer, and so usability concerns play a role in decision-making. Life is a series of trade-offs.
Part of what the DOJ is requesting is the ability to test passwords without manual input, to drive the time per test down to the ~80ms obtained from the PBKDF2 rounds.
Personally, I am surprised that they use as few PBKDF2 rounds as they do, such that it introduces only ~80ms delay.
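Key stretching of this general kind is easy to sketch with Python's standard library -- illustrative only; iOS's real derivation is entangled with a per-device hardware key, which is why the guessing has to run on the phone itself. The iteration count is what sets the fixed CPU cost of each guess:

```python
# Key stretching in miniature (illustrative only, not iOS's scheme).
import hashlib
import os
import time

salt = os.urandom(16)

def derive(passcode: str, iterations: int) -> bytes:
    """Stretch a short passcode into a key; cost scales with iterations."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

iterations = 100_000  # tune until one guess costs the delay you want
start = time.perf_counter()
derive("1234", iterations)
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"{iterations} rounds -> {elapsed_ms:.0f}ms per passcode guess")
```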
Re:
Sure it is.

Computer programmers and security analysts would describe that as a backdoor. A backdoor simply means the ability to access secured communications in a reasonable timeframe, where such access would not be possible in the absence of the backdoor.
Even with code signing certs, imo this is a flaw in Apple's approach and I'm glad the FBI brought it to our attention.
I don't really care how this case turns out as long as there's no new legislation. Apple messed up and needs to build their phone so it's impossible for them to help unlock or decrypt it.
Any device that can be updated without an active authenticated user logged in has a huge security flaw. A simple fix would be to only allow software updates to be triggered if a user is active. If the phone is on the lock screen, all updates should be pended until the user enters their pass code and starts actively using the phone. The update could be automatically triggered by the unlock so it doesn't require user intervention.
IMO this should be the standard on all devices, including Windows and Android. Automatic updates are fine as long as they're only triggered if a user is active. Although personally, I'd take it one step further and give the user an option to approve all updates.
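The gating policy proposed above might look something like this sketch -- all names here are hypothetical, not any real OS API. Updates that arrive while the device is locked are queued and applied only after a successful unlock:

```python
# Sketch of a "no updates while locked" policy; hypothetical API.
from dataclasses import dataclass, field

@dataclass
class Device:
    unlocked: bool = False
    pending: list = field(default_factory=list)

    def receive_update(self, image: str) -> None:
        if self.unlocked:
            self._apply(image)
        else:
            self.pending.append(image)  # never applied while locked

    def unlock(self, passcode_ok: bool) -> None:
        if passcode_ok:
            self.unlocked = True
            while self.pending:
                self._apply(self.pending.pop(0))

    def _apply(self, image: str) -> None:
        print(f"applying {image}")

phone = Device()
phone.receive_update("update-9.3")  # locked: queued, not applied
phone.unlock(passcode_ok=True)      # queued update applied after unlock
```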
The phone they want access to is encrypted. If they want access to it, they'll need to try to brute force the encryption. Apple can't change that at this point.
Re: Re: Apple messed up
But the existence of such a key as a single point of failure for the entire Apple ecosystem is a really bad mistake - one that we criticise mercilessly here when DRM vendors make it.
Has anyone actually checked that this case is different? I wouldn't know where to look in the licensing agreement for iOS, but somewhere in there would be the ownership rights retained by Apple, and it's completely possible that the rights retained are sufficient to declare the software to be Apple's property for this purpose.
Re: Re: Re: Apple messed up
The criticism of DRM is that the intended recipient and (potential) attacker are the same person. That is not the case here.
Re: Re: Laughable claim by the FBI
It absolutely is a backdoor... And, yes, this impacts "every" iPhone.
The quote is, "The Order does not, as Apple's public statement alleges, require Apple to create or provide a "back door" to every iPhone". I don't see anything in that that's not true. You may believe that this means there will be orders in the future that will apply to every iPhone. But the DOJ is talking about this order. Does it apply to every iPhone?
If the court can order it for this phone, then yes, it can order it for any iPhone, and that's the key concern.
Is it appropriate for the court to consider that? Isn't the judge supposed to rule on the facts of this case? It seems to me that if this warrant is supportable under the Constitution and the All Writs Act (I don't know if it is or not), then it should be issued. I can't see how the judge could legitimately decide that it would be an appropriate warrant to issue but she's going to deny it because of fears of what future warrants might be requested. And inversely if it's not an appropriate warrant based on the law and the facts of the case, then one need not consider the possible future effects to deny it. So either way, what law enforcement might do with this in the future is certainly worth considering, but I don't see how it's relevant to the question of the validity of the warrant.
Re: Re: Re: Laughable claim by the FBI
The alternative is to install and run a special installer, which checks the device's ID and then runs the rest of the install.

All approaches require Apple to do extra work on installing software to enable them to target a particular machine. Therefore targeting a specific machine is not particularly easy. The easiest way to do what is requested is to use a Mac to install the software via USB to the targeted phone.
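The "only this one phone" check being debated here presumably amounts to something like the following sketch -- a hypothetical structure, not the real mechanism. The point is that the target ID would live inside the signed payload, so retargeting the image to another phone would mean re-signing it:

```python
# Hypothetical shape of a device-locked image: the target's unique
# hardware ID is embedded in the signed payload.
TARGET_ID = "unique-hardware-id-of-the-subject-phone"  # placeholder

def run_image(embedded_id: str, this_device_id: str) -> None:
    if embedded_id != this_device_id:
        raise RuntimeError("image not personalized for this device")
    print("image accepted; passcode testing enabled")

run_image(TARGET_ID, "unique-hardware-id-of-the-subject-phone")  # matches
# run_image(TARGET_ID, "some-other-phone")  # would raise
```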
Snowden FYI
Journalists: Crucial details in the @FBI v. #Apple case are being obscured by officials. Skepticism here is fair:
Re:
Whether the order explicitly says so or not doesn't terribly matter, actually, as effectively it does via precedent, and both sides know it (and in fact at this point I'm almost completely convinced that the DOJ/FBI's entire interest in the case is solely in setting that precedent). Once they've got precedent that companies can be forced to bypass their own encryption, it will be used for other devices; to pretend otherwise is just absurd.
Is it appropriate for the court to consider that? Isn't the judge supposed to rule on the facts of this case? It seems to me that if this warrant is supportable under the Constitution and the All Writs Act (I don't know if it is or not), then it should be issued.
And that's where the details of the case become important. If the DOJ were just asking Apple to hand over data from the device, as it has done before, this wouldn't be an issue. The order, however, goes significantly farther than that in ordering Apple to undermine its own security in order to provide that data, and that is where the real problem comes into play, and where the order steps over the line of what a warrant should be able to compel.

This is not just a demand for data as a normal warrant would cover; it's forcing someone to go out of their way, working on their own dime, to undermine their own security in order to provide that data.
Tactical error
Re: Tactical error
*often
**DNA
ELI5
Re:
Trying to claim two opposite things at the same time, while downplaying the more important of them.

On the one hand, it's "only" removing a minor, nearly insignificant part of the security. Something "every expert" agrees should not even be part of the security.

On the other hand, the FBI is waiting on this insignificant security feature before it can get the data... because this negligible feature is completely blocking them out.

That's quite a contradiction. It seems this unimportant feature is actually quite useful, and a way to bypass it clearly matches the definition of a backdoor. The fact that you despise this feature doesn't make it irrelevant.
Re:
If what the FBI is asking for is actually feasible, then that's a major security flaw -- one I'd think Apple will be quick to fix in the very near future. But even then, two questions will remain. First, the trust in Apple will be completely broken, so what happens when all the criminals move to an uncrackable solution? Second, if Apple can actually comply with the court order now (costs be damned) and closes this gap in an upgrade to all iPhones, what will the FBI do? Will the courts order the impossible?

Oh, and you are spewing a lot of bullshit. If it were just bubble wrap, the feds could get through it themselves for sure. It's not that simple. Besides, if the owner set a decently long password with special characters and all, it doesn't matter; it will be close to uncrackable even using brute force (as a reminder, the hardware imposes a physically limited minimum interval of a few dozen milliseconds per attempt). What then? If they were careful enough to destroy two phones to erase data, you'd imagine they were careful enough to do their homework.
In the end, everybody loses, feasible or not.
Re: Re:
I already do. I literally cannot think of a commercial network-connected product that I trust to protect my data, even though most claim they do. Not iThings, not Android, not my car, nothing. So pretty much every connected device I buy fits your category.
This does mean that I buy fewer things, because I only buy things that I am confident I can adequately secure or operate without a net connection.
Parallel Construction?