The 3rd Party Doctrine: Or Why Lawyers May Not Ethically Be Able To Use Whatsapp
from the metadata-matters dept
In December I went to install the Flywheel app on my new phone. Flywheel, for those unfamiliar, is a service that applies the app-based dispatching and backend payment services typical of Uber and Lyft to the local medallion-based taxi business. I'd used it before on my old phone, but as I was installing it on my new one it asked for two specific permissions I didn't remember seeing before. The first was fine and unmemorable, but the second was a show-stopper: "Allow Flywheel access to your contacts?" Saying no made the app exit with a passive-aggressive flourish ("You have forcefully denied some of the required permissions.") but I could not for the life of me figure out why I should say yes. Why on Earth would a taxi-summoning app require access to my contacts? Tweets to the company went unanswered, so it was impossible to know whether Flywheel wanted that permission for some minor, reasonable purpose that in no way actually disclosed my contact data to the company, or whether it was trying to slurp up information about who I know for some other purpose. Its privacy policy, which on the surface seems both reasonable and readable, was last updated in 2013 and makes no reference to why it would now want access to my contacts.
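For readers curious about the mechanics, this exit-on-denial behavior is easy to produce under Android's runtime permission model. What follows is only a minimal sketch of that general pattern, not Flywheel's actual code; the activity name and the bail-out message are hypothetical.

```kotlin
// Minimal sketch of an Android app that demands READ_CONTACTS and quits when refused.
// (READ_CONTACTS must also be declared with a <uses-permission> entry in AndroidManifest.xml.)
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import android.widget.Toast
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val CONTACTS_REQUEST_CODE = 42  // arbitrary value matched in the callback below

class RideRequestActivity : AppCompatActivity() {  // hypothetical activity name

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Ask for READ_CONTACTS up front if the user hasn't already granted it.
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_CONTACTS)
            != PackageManager.PERMISSION_GRANTED
        ) {
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.READ_CONTACTS), CONTACTS_REQUEST_CODE
            )
        }
    }

    override fun onRequestPermissionsResult(
        requestCode: Int, permissions: Array<out String>, grantResults: IntArray
    ) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults)
        val denied = grantResults.isEmpty() ||
            grantResults[0] != PackageManager.PERMISSION_GRANTED
        if (requestCode == CONTACTS_REQUEST_CODE && denied) {
            // Treating an in-principle optional permission as mandatory and quitting,
            // which is what produces the exit-on-denial behavior described above.
            Toast.makeText(this, "Required permission denied.", Toast.LENGTH_LONG).show()
            finish()
        }
    }
}
```

Nothing in the platform forces an app to do this; treating the permission as "required" is a design choice by the developer.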
So I didn't finish installing it, although to Flywheel's credit, a January update to the app seems to have re-architected it so that it no longer demands that permission. (On the other hand, the privacy policy appears to still be from 2013.) But the same cannot be said for other apps that insist on reading all my contacts, including, conspicuously, Whatsapp.
Whatsapp has been in the news a lot lately, particularly in light of Facebook's announcement that it planned to merge it with its Messenger service. But the problem described here is a problem even as the app stands on its own. True, unlike the old Flywheel app, Whatsapp can currently be installed without demanding to see the contact information stored on my phone. But it can't be used effectively. It can receive an inbound message from someone else who already knows my Whatsapp number, but it refuses to send an outbound message to a new contact unless I first let Whatsapp slurp up all my contacts. Whatsapp is candid in its privacy policy (last updated in 2016) that it collects this information (in fact it says you agree to "provide us the phone numbers in your mobile address book on a regular basis, including those of both the users of our Services and your other contacts."), which is good, but it never explains why it needs to, which is not good. Given that Signal, another encrypted communications app, does not require slurping up all contacts in order to run, it does not seem like something Whatsapp should need to do in order to provide its essential communications service. The only hint the privacy policy provides is that Whatsapp "may create a favorites list of your contacts for you" as part of its service, but it still isn't obvious why it would need to slurp up your entire address book, including non-Whatsapp user contact information, even for that.
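To make that contrast concrete, here is a minimal, hypothetical sketch of two ways a messaging service could learn which of your contacts are also users. Neither is WhatsApp's or Signal's actual protocol; the function names and the generic "which of these are registered?" lookup are invented for illustration, purely to show the difference in what leaves the phone and what the provider could end up retaining.

```kotlin
import java.security.MessageDigest

// Option 1: upload the entire address book and let the server keep it.
// The provider now holds everyone you know, users and non-users alike.
fun discoverByFullUpload(
    addressBook: List<String>,
    server: (List<String>) -> Set<String>  // hypothetical "which of these are registered?" lookup
): Set<String> = server(addressBook)

// Option 2: send only per-number digests for matching, so the raw address book
// never leaves the phone. (Phone numbers are low-entropy, so plain hashing is
// not real protection by itself; serious designs layer on stronger
// private-contact-discovery techniques. This is only an illustration.)
fun discoverByHashedLookup(
    addressBook: List<String>,
    server: (List<String>) -> Set<String>  // answers which digests are registered, keeps nothing
): Set<String> {
    val sha256 = MessageDigest.getInstance("SHA-256")
    val digests = addressBook.map { number ->
        sha256.digest(number.toByteArray()).joinToString("") { "%02x".format(it) }
    }
    return server(digests)
}
```

The point of the second pattern is simply that a matching service can answer the one question it actually needs answered without the provider warehousing your whole address book, which is why a blanket "give us all your contacts on a regular basis" requirement deserves an explanation.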
The irony is that an app like Whatsapp should be exactly the sort of app that lawyers use. We are duty-bound to protect our clients' confidences, and encrypted communications are often necessary tools for maintaining a meaningful attorney-client relationship because they should allow us to protect the communications secrecy upon which the relationship depends. But that's exactly why I can't use it, why I didn't finish installing the old Flywheel app, and why I refuse to use any other app that insists on reading all my contacts for no good, disclosed, or proportionally narrow reason: I am a lawyer, and I can't let this information out. Our responsibility to protect client confidences may very well extend to the actual identity of our clients. There are too many situations where, if others can learn who we are talking to, it will be devastating to our clients' ability to seek the counsel to which they are constitutionally entitled.
I wrote about this problem a few years ago in an amicus brief on behalf of the National Association of Criminal Defense Lawyers for the appeal of Smith v. Obama. This case brought a constitutional challenge to the US government's practice of collecting bulk metadata from Verizon Wireless without warrants, and thus without their attendant requirements of probable cause and specificity. Unfortunately the constitutional challenge failed at the district court level, but not because the court couldn't see how it offended the Fourth Amendment for so much personal information to be so readily available to the government. Instead the district court dismissed the case because it believed it was hamstrung by the earlier Supreme Court ruling in Smith v. Maryland. Smith v. Maryland is the 1979 case that gave us the third-party doctrine, the idea that once you have disclosed certain information to a third party (such as the numbers you dialed, which the phone company necessarily saw) you can no longer have a reasonable expectation of privacy in it that the Fourth Amendment will protect, and so the government no longer needs a warrant to access it. Even in its time Smith v. Maryland was rather casual about the constitutionally protected privacy interests at stake. But as applied to the metadata surrounding our digital communications, it eviscerates the personal privacy the Fourth Amendment exists to protect.
Sen. McConnell argues that 215 spying is not a problem since its 'just metadata.' Wrong - metadata matters. pic.twitter.com/XsSa0en1XE
— Kurt Opsahl (@kurtopsahl) May 31, 2015
The reality is that metadata is revealing. And as I wrote in this amicus brief, the way it is revealing for lawyers violates not only the Fourth Amendment but also the Sixth Amendment right to counsel our clients rely upon. True, it is not always a secret who our clients are. But sometimes the entire representation hinges on keeping that information private.
Thus metadata matters because, even though it is not communications "content," it can nevertheless be deeply descriptive of the details of a life. And when it comes to lawyers' lives, it ends up being descriptive of their clients' lives as well. And that's a huge problem.
As the brief explained, lawyers get inquiries from uncharged people all the time. Perhaps they simply need advice on how to conduct themselves. Or perhaps they fear they may be charged with a crime and need to make the responsible choice to speak with counsel as early as possible to ensure they will have the best defense. The Sixth Amendment guarantees them the right to counsel, and this right has been found to be meaningful only when the client can feel assured of enough privacy in their communications to speak candidly with their counsel. Without that candor, counsel cannot be as effective as the Constitution requires. But if the government can easily find out who lawyers have been talking to by accessing their metadata, then that needed privacy evaporates. Who a lawyer has been communicating with, especially a criminal defense lawyer, starts to look like a handy list of potential suspects for the government to go investigate.
And it's not just criminal defense counsel that is affected by metadata vulnerability. Consider the situation we've talked about many times before, where an anonymous speaker may need to try to quash some sort of discovery instrument (including one issued by the government) seeking to unmask them. We've discussed how important it is to have procedural protections so that an anonymous speaker can find a lawyer to fight the unmasking. Getting counsel of course means that there is going to be communication between the speaker and the lawyer. And even though the contents of those communications may remain private, the metadata related to them may not be. Thus even though the representation may be all about protecting a person's identity, there may be no way to accomplish that if the lawyer has no way to protect the metadata evincing the attorney-client relationship, whether from the government helping itself to it or from greedy software slurping it up, which in turn makes the app maker yet another third party the government can demand this information from.
Unfortunately there is no easy answer to this problem. First, just as it's not really possible for lawyers to avoid using the phone, it is simply not viable for lawyers to avoid using digital technology. Indeed, much of it actually makes our work more productive and cost effective, which is ultimately good for clients. And especially given how unprotected our call records are, it may even be particularly important to use digital technology as an alternative to standard telephony. To some extent lawyers can refuse to use certain apps or services that don't seem to handle data responsibly (I installed Lyft and use Signal instead), but sometimes it's hard to tell the exact contours of an app's behavior, and sometimes, even when we can tell, abstaining from a particular technology or service can still be an extremely costly decision. What we need, what everyone needs, is to be able to use technology secure in the knowledge that information shared with it travels no farther, and for no other purpose, than we expect it to.
Towards that end, we – lawyers and others – should absolutely pressure technology makers into (a) being more transparent about how and why they access metadata in the first place, (b) enabling more granular levels of access to it, and use of it, so that we never have to tell any app or service more about our lives than it needs to know in order to run, and (c) being more principled in both their data sharing practices and their resistance to government data demands. Market pressure is one way to effect this outcome (there are a lot of lawyers, and few technologies can afford to be off-limits to us), and perhaps it is also appropriate for some of this pressure to come from regulatory sources.
But before we turn to regulators in outrage we need to aim our ire carefully. Things like the GDPR and CCPA deserve criticism because they tend to be like doing pest control with a flame thrower, seeking to ameliorate harm while being indifferent to any new harm they invite. But the general idea of encouraging clear, nuanced disclosures of how software interacts with personal data, as well as discouraging casual data sharing, is a good one, and one that at the very least the market should demand.
The reality of course is that sometimes data sharing does need to happen – certain useful services will not be useful without data access, and even without data sharing among the partners who together supply the service. It would be a mistake to ask regulators to prevent it altogether. Also, it is not necessarily private actors who are the biggest threat to the privacy interests we lawyers need to protect. Even the most responsible tech company is still at the mercy of a voracious government that sees itself as entitled to all the data these private actors have collected. Someday, hopefully, the courts will recognize what an assault on our constitutional rights it is for metadata access not to be subject to a warrant requirement. But until that day comes, we should not have to remain so vulnerable. When we turn to the government to help ensure our privacy, our top demand needs to be for the government to better protect us from itself.
Filed Under: 3rd party doctrine, confidentiality, contact info, lawyers, metadata, privacy
Companies: facebook, whatsapp