from the transparency-and-control dept
I've discussed in the past how many people mistake privacy for some sort of absolute "thing" rather than a spectrum of trade-offs. Leaving your home to go to the store involves giving up a small amount of privacy, but it's a trade-off most people feel is worth it (not so much for some uber-celebrities, who then choose other options). Sharing information with a website is often seen as a reasonable trade-off for the services/information that website provides. The real problem is often just that the true trade-offs aren't clear. What you're giving up and what you're getting back aren't always laid out transparently, and that's where people feel their privacy is being violated. When they make the decision consciously and the trade-off seems worth it, almost no one feels that their privacy has been violated. Yet when they don't fully understand the deal, or when the deal they made is unilaterally changed, that's when they feel their privacy has been violated, because the deal someone thought they were striking is not what actually happened.
And, unfortunately, it often seems like people are increasingly being
pressured into deals they don't fully understand and don't have full control over. Michael Price, over at the Brennan Center for Justice, took the time to actually read through the "privacy policy" on his new "smart" TV, and it terrified him. Just the fact that a TV even
has a privacy policy seems oddly terrifying, but it makes sense, given that at least some information goes outbound as part of the "smarts." But how much? Potentially a lot more than people would expect:
The amount of data this thing collects is staggering. It logs where, when, how, and for how long you use the TV. It sets tracking cookies and beacons designed to detect “when you have viewed particular content or a particular email message.” It records “the apps you use, the websites you visit, and how you interact with content.” It ignores “do-not-track” requests as a considered matter of policy.
To some extent, that's not really all that different from a regular computer. But then it begins to get creepier:
It also has a built-in camera — with facial recognition. The purpose is to provide “gesture control” for the TV and enable you to log in to a personalized account using your face. On the upside, the images are saved on the TV instead of uploaded to a corporate server. On the downside, the Internet connection makes the whole TV vulnerable to hackers who have demonstrated the ability to take complete control of the machine.
More troubling is the microphone. The TV boasts a “voice recognition” feature that allows viewers to control the screen with voice commands. But the service comes with a rather ominous warning: “Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party.” Got that? Don’t say personal or sensitive stuff in front of the TV.
You may not be watching, but the telescreen is listening.
Now, yes, some of that certainly can be useful in creating interesting features and services. And, frankly, almost all of the same things can be said about the smartphone in your pocket, with Siri or Google Now ready to listen in on anything you say at any moment. But at the very least, with those smartphone systems people tend to see and understand the immediate benefits: they use those tools to get information, and they're fairly easy to turn off without creating other problems. With the TV, it seems to be more the promise of potentially providing some future service -- but it's still willing and ready to listen in the meantime.
This is certainly not to argue that the technology is bad, but that these sorts of things shouldn't be hidden in a 46-page privacy policy that no one is going to read. People should be fully aware of what the deal is, and they should have granular control over how their data is used: let people set the times when the TV's "ears" are on -- so that it only listens during prime time, when you're actually likely to be using the TV. Or let people have access to the logs and data that it's snarfing up, so they can see for themselves how it's being used. Make sure that the people using it have both transparency and control, and suddenly this becomes somewhat less scary (well, until the NSA goes to the FISA court to use Section 215 to get all the "metadata" from all your smart TVs).
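To make that concrete, here's a rough, purely hypothetical sketch of what user-facing granular controls might look like if a TV maker exposed them as a simple settings structure. Every field name and value here is an illustrative assumption, not any real product's API:

```typescript
// Hypothetical privacy settings a smart TV could expose to its owner.
// This is only an illustration of "granular controls" -- per-sensor
// switches, time windows, and owner-viewable logs -- not a real vendor API.

interface TimeWindow {
  start: string; // 24-hour local time, e.g. "19:00"
  end: string;   // e.g. "23:00"
}

interface PrivacySettings {
  microphone: {
    enabled: boolean;
    activeWindows: TimeWindow[];          // only listen during these hours
    sendTranscriptsToThirdParty: boolean; // off-device sharing is opt-in
  };
  camera: {
    enabled: boolean;
    faceDataStaysOnDevice: boolean;       // never upload face images
  };
  tracking: {
    honorDoNotTrack: boolean;
    viewingHistoryLog: "viewable" | "disabled"; // owner can inspect or disable logging
  };
}

// Example: mic only during prime time, nothing shared off-device,
// and the owner can review whatever the TV has logged.
const mySettings: PrivacySettings = {
  microphone: {
    enabled: true,
    activeWindows: [{ start: "19:00", end: "23:00" }],
    sendTranscriptsToThirdParty: false,
  },
  camera: { enabled: false, faceDataStaysOnDevice: true },
  tracking: { honorDoNotTrack: true, viewingHistoryLog: "viewable" },
};
```

The point isn't this particular structure; it's that choices like these are simple enough to surface in a settings menu rather than bury on page 30 of a privacy policy.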
And, of course, just as I was finishing up with that article, I came across a report on a Sony patent from a few years ago. It actually got some attention back in 2012 for describing a system in which your TV may ask you to say the advertiser's name to end a commercial. One figure in the patent, illustrating that scenario, is the one that quite reasonably got plenty of attention.
Perhaps it's no surprise that some companies are considering something like this. In fact, some of the underlying ideas aren't totally crazy. We've long argued that
good advertising is about making the ad itself good content, and making ads that are interactive and
fun is one way to do that. Of course, I don't quite see how the above scenario is very much fun. To me, it sounds horrifying, but others may disagree.
Either way, it's become quite clear that while the world is becoming more connected -- between our computers, our phones, our TVs and much more -- people are increasingly going to run into challenges around privacy. And, while some are going to jump to the conclusion that any information gathering and sharing is automatically bad and dangerous (or just crazy), it's going to be important to recognize the trade-offs inherent in these new devices and services. If companies don't want the public to totally freak out, they'd do well to make these processes much more transparent, clear and controllable by the users themselves. Unfortunately, we're not quite there yet. The focus is still on hiding these things, out of a fear that no one would use them if they knew what they were giving up. That seems like a recipe doomed to create privacy panics, rather than one that actually enables innovation to advance and lets the public be comfortable with the choices they're making.
Filed Under: advertising, future, privacy, smart tvs, transparency