Netflix Moving To Encrypted Streams, As Mozilla Moves To Deprecate Unencrypted Web Pages As Insecure
from the yay-encryption dept
We've been pretty vocal about supporting the encryption of more and more web traffic. It's important for a variety of reasons, not the least of which is your privacy and security. A few months back, we were excited to see the Chrome security team suggest that it should start marking unencrypted web pages as non-secure. It appears that Mozilla is now joining in on the fun, proposing to deprecate unencrypted HTTP web pages to encourage more web developers to go all in on encrypted HTTPS:

In order to encourage web developers to move from HTTP to HTTPS, I would like to propose establishing a deprecation plan for HTTP without security. Broadly speaking, this plan would entail limiting new features to secure contexts, followed by gradually removing legacy features from insecure contexts. Having an overall program for HTTP deprecation makes a clear statement to the web community that the time for plaintext is over -- it tells the world that the new web uses HTTPS, so if you want to use new things, you need to provide security.

It's a clever setup. Basically, if you want to take advantage of new features on the web, you'll have to encrypt.
Meanwhile, it appears that Netflix has separately announced that it is moving forward with plans to encrypt all of its infrastructure with HTTPS to better protect your privacy as well:
...with our existing server infrastructure and the up to 50% capacity hit we had observed, driven by our traffic mix.

In short, yes, deploying HTTPS at that scale is expensive, but the benefit to users is tremendous and worth it.
At that time, we were uncertain of the gains we could achieve with software and hardware optimization and of the timescale for those. I'm pleased to report we have made good progress on that and we presented our FreeBSD work at the Asia BSD conference. We now believe we can deploy HTTPS at a cost that, whilst significant, is well justified by the privacy returns for our users.
So, as we mention today in our investor letter, we intend to roll out HTTPS support over the coming year - for both our site and the content itself - starting with desktop browser tests at scale this quarter.
It's still going to take a while, but we're getting closer to reaching that tipping point where an unencrypted web is a historical anomaly and that's a very good thing.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: encryption, https, security
Companies: mozilla, netflix
Reader Comments
Development
Especially when I'm developing software, I don't want to add SSL and its complications to the mix yet. I have enough bugs without adding SSL certificate issues (including such fun as "I can't get real SSL certificates for the domain, security policies on the systems prevent me from adding a local root CA certificate, and bits of software don't have the ability to handle self-signed certificates without errors") and having to correctly configure SSL on both ends before I can even start seeing output.
I'm strongly of the opinion that protocol layers should be independent. HTML shouldn't depend on features of HTTP nor require that it only be served over HTTP. HTTP likewise shouldn't care whether it's running over TCP or SSL or SNA for that matter (yes, even in this decade good old LU6.2 and SNA over bisync is alive and well despite all attempts to correct the situation).
Re: Development
Unfortunately, there are a lot of developers that would prefer not to add SSL and HTTPS. That's why Mozilla is proposing what it is.
Re: Re: Development
And what are they going to do with IPv6 and built-in IPsec, where authentication and encryption are handled at the IP level, rendering SSL/TLS redundant? IPsec is an RFC-level standard, after all.
Re: Development
Yes. They should. That's arguably one of the reasons why the Internet's protocol layers are what they are and not something else. It is a serious architectural error to introduce dependencies between them -- or between network data transport protocols and content.
It's also a dubious idea to push for even more reliance on the CA model when (nearly) every day new research results show that it's coming apart at the seams.
There are far more pressing things for Mozilla to work on than this. The functionality of add-ons like AdBlock Edge, NoScript, BetterPrivacy, Disconnect, etc. all need to be in the browser -- because those address some of the most significant threats. Reliance on Adobe Flash needs to be phased out. Ports to other architectures need to be prioritized. (One of the best ways to find bugs in your code, security and otherwise, is to get it running on another CPU/operating system.)
And geez, PLEASE stop the endless, pointless, silly tinkering with the UI - which was perfectly fine 25 revisions ago.
Re: Re: Development
By that logic, HTTPS should be removed from browsers.
Re: Re: Development
And geez. PLEASE stop the endless, pointless, silly tinkering with wireless technology - my home phone was perfectly fine 40 years ago.
The most important innovation comes from people doing pointless tinkering.
Re: Re: Re: Development
But when it's applied to UI design of production software and inflicted on hundreds of millions of people, it's not. Mozilla's developers have only succeeded in making the UI far less useful than it was and in penalizing competent users. Meanwhile, serious security and performance bugs remain unaddressed -- have you looked lately? (where "lately" could be any time in the past several years)
Re: Development
Phone tech by its very nature needs to be relatively slow to change. If anything, it's probably a good example of the OPPOSITE of the attitude you are trying to express there.
Now I am talking about the actual telecom tech rather than all of those bells and whistles and distractions that get added to a modern phone.
In many ways, wireless still SUCKS. It's slow, unreliable, and insecure. It allows for easy mass surveillance. Wireless is good for convenience (sometimes) but is inferior for just about anything else.
Re: Re: Re: Development
If this is what happens when Mozilla is obsessed with memory usage, I would hate to see what it would be like if they didn't care. Or did you mean they're obsessed with using as much memory as possible?
Re: Development
I work at a company governed by PCI-DSS, so maybe my perspective is skewed, but there is always the possibility that a new temp worker is going to try to snoop your internal network. No network traffic is safe unless you have endpoint encryption.
BTW, for local SSL/TLS, go to a free cert provider and get a cert for something like localdev.[your domain].com, then in your hosts file (or internal DNS) point localdev.[your domain].com to 127.0.0.1. Now you can make requests to your local system with a cert signed by a trusted root.
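That hosts-file redirect is a single line (the domain name here is illustrative; it would be whatever name is on the cert you obtained):

```
# /etc/hosts on Linux/macOS, or C:\Windows\System32\drivers\etc\hosts on Windows
127.0.0.1    localdev.example.com
```

Requests to https://localdev.example.com then hit your local server, and the browser sees a hostname that matches a cert chained to a trusted root.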
Re:
Otherwise, this is really the best solution visible at the moment. Don't trust the CAs? Fine, then your default position is "do not trust", which is what it already should be for unencrypted sites now. Literally nothing has changed for you if you don't trust the CAs.
Re:
Given there is competition in the browser market and people care about privacy, CA's that can't be trusted are probably an issue that will be resolved by market pressure.
Re:
These preloaded root CAs are a security compromise. They weaken the trust mechanism quite a lot, in exchange for the convenience of not having to verify the trust chain yourself. So your concern is quite valid.
My answer to the problem basically boils down to... yes, it's suboptimal, but it's the best we have right now. If you require a greater level of security, nothing stops you from doing it the proper way: remove the root CAs and validate the site certs yourself. You can then sign those certs with your own root cert (that you've installed in your OS and/or the browser) and everything will work as normal.
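The "validate the site certs yourself" approach described above amounts to certificate pinning: you trust one specific certificate rather than whatever a preloaded CA will vouch for. A minimal sketch in Python (the pinned fingerprint is a placeholder you would record out-of-band, e.g. via `openssl x509 -fingerprint`):

```python
import hashlib
import ssl

# Hypothetical fingerprint of the one cert you trust, recorded out-of-band.
PINNED_SHA256 = "0f1e2d3c4b5a...placeholder..."

def fingerprint(pem_cert: str) -> str:
    """SHA-256 fingerprint of a PEM certificate's DER encoding."""
    der = ssl.PEM_cert_to_DER_cert(pem_cert)
    return hashlib.sha256(der).hexdigest()

def is_pinned(pem_cert: str) -> bool:
    """Trust the cert only if it matches the fingerprint we verified ourselves."""
    return fingerprint(pem_cert) == PINNED_SHA256
```

Unlike the preloaded-CA model, a check like this trusts exactly one certificate, so a compromised CA elsewhere cannot mint a substitute that passes it. The cost, as the comment notes, is that you carry the verification burden yourself.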
But then again, if SSL becomes more and more the standard, maybe there's greater incentive to fix this issue. I guess we'll see.
Same for TKnarr's points. Most web developers work by running their application server on localhost. So now I need SSL and certificates for that too? Come on.
Re:
Not really, depending on what you actually want or need. Basic SSL certs can be bought for less than $10/year, and don't run into three-figure sums until you start adding a lot of subdomains or features. The cheap ones aren't suitable for e-commerce, but if you're doing that without HTTPS because you can't afford a few hundred in basic overhead, you deserve to lose that business anyway.
"So now I need SSL and certificates for that too? Come on."
I would hope a competent admin knows how to self cert their own server, and services exist to provide free SSL certs for testing purposes if you need something externally for some reason. These really aren't excuses in 2015.
Re: Re: Re:
Uh, no?
1) Certificates can be had for free;
2) If you're just 'placeholding' the domains but are not publishing, then don't get certs;
3) If you're just hanging on to the domains and don't care about their Mozilla and Google ranking for now, then just don't get certs;
and lastly but most importantly, just wait a few months...
4) Certificates will be available for free from EFF's Let's Encrypt project.
Press release:
https://www.eff.org/deeplinks/2014/11/certificate-authority-encrypt-entire-web
Let's Encrypt:
https://letsencrypt.org/
Re: Re: Re:
Such people are far less likely to be hosting their own servers. Many hosting providers provide shared SSL for free, and prices for dedicated certs are already being driven down significantly by the greater demand for certs and their increasing non-business usage. I have no doubt that competitive hosting packages will drive the prices down further, as they have made things like limited email addresses and paying premiums for more than 20MB of disk space a thing of the past.
We're not talking about forcing people to pay hundreds of dollars just to stay online. We're talking about something that the market is already making steps to make as effortless and inexpensive as hosting itself.
"They're served as plaintext because that's what they are"
Cool. That doesn't mean that communications are immune from man in the middle attacks and other things that SSL is designed to prevent, but it's certainly less likely that you'll be a target. But, should security be reduced for everyone just because you don't think you'll be a target?
"some as placeholders for my nieces and nephews when they come of age"
So you're now arguing that the security of the web should be compromised for people who aren't even using their domains? What's wrong with domain parking, forwarding or other services that are available for free?
"Are my domain costs now going to double"
Depends on your hosting provider. Shop around; you have time, it's not suddenly going to be mandatory tomorrow.
Re: Re: Re: Re:
I don't want to make anyone else's communications less secure, but it still seems like using certified mail when I just want to send a "wish you were here" postcard. Postcards are still a thing, right?
Re: Re: Re: Re: Re:
As for analogies, I can't really think of a good one. The postcard one is flawed, since using certified mail involves extra time and effort on both sides, whereas if everything's set up properly the person visiting your web site won't have to do anything.
Re:
In addition to what PaulT said, you can also self-sign your certs and have people using your site manually install your root cert to use your site.
This is unworkable for a publicly-facing site (who's going to bother to install your special cert, even if they know how?) but can work quite well for sites that are not intended for the general public.
Also, if you're talking about internal sites that aren't going out to the internet at large, then you can ignore all of this HTTPS stuff if you wish without any problem (aside from the obvious security one).
Re: Re:
To which I say: too bad, because my privacy is worth a slight uptick in your internal production costs.
Re:
Counterpoint: why should insecure standards be retained because some blog owners who don't have a lot of visitors or content don't want to put in the work?
I understand that implementing SSL can be pain if you're not used to it, but the web is also much bigger than your page, and the needs of the majority are what will always win out. Standards are deprecated all the time, and there's rarely one that isn't being used or preferred by someone. That's just the way it is.
For the record, I also have a blog that's not on SSL as yet, but I don't expect inferior standards to be adhered to for my sake.
You have multiple options:
- Continue using HTTP, but risk losing visitors as secure standards are prioritised.
- Obtain a cheap (less than $10/year) or even free cert that gives basic SSL capability.
- Rather than host your own content, move your blog to a (usually free) service that provides SSL as part of their standard account package, freeing you from the need to admin the server.
Re: Re:
This adds another cost and administrative overhead to be carried by an individual who wants to put up a simple web site. An unintended consequence of all such rules and regulations is that they tip the table towards corporations and away from individuals when it comes to all interactions with the general public. It may not be by much, but every little bit that a barrier to entry is raised, some individuals are put off from entering an area or activity.
Re: Re: Re:
If you choose to administer your own server, you've chosen the admin overhead, and the web is better off if you're forced to obey basic security rules. As with everything security related, there's a balance between ease of use and security, and I'm happy with the pendulum swinging back toward security. The web is full of compromised sites and servers run by people who wanted the freedom without the responsibility. Which is exactly why we're having this discussion to begin with.
Re: Re: Re: Re:
For example (1) having functional role addresses and paying attention to them is one of the best security tactics available. After all, if the entire rest of the Internet is willing to provide you with free consulting, why would you turn it down?
For example (2) following BCP 38.
For example (3) setting up your web server on as secure an OS as possible with as minimal a software footprint as possible with as feature-poor a web server as possible.
Those things are easier to do and don't require understanding of HTTPS, certificates, etc. I'm not saying that they're the whole list -- of course they're not. And I'm not saying that HTTPS shouldn't be on the list: for a lot of sites, it should. But I think it's important to start with fundamentals and work up to more sophisticated measures.
Re: Re: Re: Re: Re:
I used to work tech support for a hosting company, and I can tell you that a depressingly large number of people fail miserably at the latter point. If it's not a focus of their job role, most people tend to ignore things if they're running properly.
"setting up your web server on as secure an OS as possible with as minimal a software footprint as possible with as feature-poor a web server as possible."
If someone is too lazy/stupid to learn how to set up an SSL certificate, they're certainly not competent to do that effectively. Why is it not a good thing to weed out those people before they have a functioning site accessible by everyone?
Re: Re: Re:
And to go to PaulT's point "...if you're doing that without HTTPS because you can't afford a few hundred in basic overhead, you deserve to lose that business anyway" -- a few hundred might as well be tens of thousands to some. One of the great things about the internet is that it lowers the entry cost for many businesses to near zero, allowing individuals to start up with sweat equity and compete with the big boys. I know, you have to pay for hosting, buy a computer, etc. -- which is kinda the point. Any individual expense might be relatively small, but they add up.
And what about people (or non-profits) who don't *want* to make any money off their website, labors of love and/or art and/or social change? Those sites are as important as (maybe sometimes more important than) money-making sites.
Breaking what isn't broken.
There is no value in imposing a "Brazil"-style bureaucracy on everyone. All it does is retard creativity and stifle innovation. Only bother that actually matters should be tolerated. BS for its own sake should not be encouraged.
Re: Breaking what isn't broken.
Other than the fact that it can take a little more admin at present, how is imposing HTTPS any more a Brazil-style bureaucracy than making everyone adhere to the other existing standards upon which the web is built? Standards are deprecated, protocols no longer supported, version upgrades forced, etc. all the time. What makes this one different, other than the fact that some people might have to do a little work rather than depend on a version upgrade of some software or other?
"BS for its own sake should not be encouraged."
Agreed. However, the push for secure connectivity is nothing of the sort. Unless someone has a real counterargument that doesn't boil down to "I don't want to do the work", "I'm assuming I'll be safe because nobody reads my blog" or "it was expensive when I checked the price in 2005", I fail to see the actual problem.
Any plans for Techdirt to support encryption fully?
But: "Your connection to www.techdirt.com is encrypted with modern cryptography. However this page includes other resources which are not secure." - warning from latest Google Chrome on fully-patched Windows 7, via LAN in UK.
Seems to be due to using http: for Google Analytics, LinkedIn sharing, and a handful of other plugins.
no
Your gov't can't be trusted
Re: Re: no
The US government doesn't make Cisco hardware or encryption standards either, but they're responsible for borking those up. It seems entirely plausible that the NSA has compromised major certificate authorities in some way, and if they haven't yet I'm sure they're working on it.
sidebar: Most VoIP is unencrypted
However, most carriers use unencrypted RTP to carry the content of the call.
Re: Internal web sites?
A half-hour's work is an overly burdensome cost?
Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
Re: Re: Re: Re: Internal web sites?
[ link to this | view in chronology ]
[ link to this | view in chronology ]
Re:
And if you can't see the difference between Verizon purposefully allowing their connections to Level 3 to degrade with the intent to force Netflix to move off Level 3 to a direct connection to Verizon for Verizon customer (with corresponding payments to Verizon), and Mozilla stating that implementing one technical feature should depend on another technical feature to help ensure the security of the first feature, then I don't know what I could say to change your mind.
Re:
Well, yeah, if you ignore the free options available, incorrectly define net neutrality and add a bit of paranoia then you can come to all sorts of crazy conclusions!
How are the benefits of encrypting Netflix streams "tremendous and worth it"? Sounds like a faith-based claim to me. Care to share your scientific cost-benefit analysis?
[ link to this | view in chronology ]
Re:
Maybe you are asking the wrong entity that question. It's obvious that Netflix has done a cost-benefit analysis and feels the privacy benefit is worth the cost, as stated by Mark Watson of Netflix himself:
Re:
https://people.freebsd.org/~rrs/asiabsd_2015_tls.pdf
Re: Re:
https://people.freebsd.org/~rrs/asiabsd_2015_tls.pdf
That paper isn't about privacy. Try again?
However, once 99% of the web is encrypted it will be much easier to actually change the infrastructure of the Internet and make it encrypted by default at a much lower level (such as at the transport or IP level). Asking for that now would probably be impossible.
Besides, SSL is not as secure as people think. Spy organizations like the NSA likely have keys for the most significant HTTPS websites (popular search engines, webmail providers, social networks, etc.), while workplaces have SSL-hijacking firewalls like Palo Alto. This means SSL will only protect against random man-in-the-middle attacks (which are rare) and ISPs (admittedly an effective measure against deep packet inspection).
Re: Re:
Also, the opposite of an "e-commerce site" is not a website "for your own personal use" as implied in your post. Websites can be legitimately both public and anonymous. If I need to provide my personal info to obtain an SSL certificate, the need for such a certificate becomes a problem.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re:
Wrong.
The level of encryption is independent of the certificate. It depends solely on the negotiation between the browser and the server.
The certificate is used only to prove to the browser that it's talking to the real server.
At work, we have our servers configured to use a high level of encryption... and we use a domain-validated certificate. Check with Qualys if you doubt me.
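That separation is visible in any TLS server setup: the cipher policy lives on the context, independent of whichever certificate gets loaded into it. A sketch with Python's `ssl` module (the cipher string is one reasonable choice for illustration, not a recommendation; the cert filenames are hypothetical):

```python
import ssl

# Encryption strength comes from the negotiation policy on the context...
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.set_ciphers("ECDHE+AESGCM")  # forward-secret AES-GCM suites for TLS 1.2

# ...while the certificate is loaded separately and only proves identity.
# Any cert works here, domain-validated or otherwise:
# ctx.load_cert_chain("server.crt", "server.key")

enabled = [c["name"] for c in ctx.get_ciphers()]
```

Swapping a domain-validated cert for an extended-validation one changes nothing in `enabled`; only the `set_ciphers`/`minimum_version` policy does, which is exactly the commenter's point.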
Moby's Dictum
If the CIAFBINSA does NOT raise the roof with their tantrums decrying HTTPS to be the End of the World and the inevitable cause of the Deaths of a Million Babies a Day, then we can be absolutely certain HTTPS has been fully compromised and is an open book to the USG and its minions.
---
Re: Moby's Dictum
1. They can just pressure the hosting company to give them the logs when they want something.
2. They know that it will take a lot to get compliance from servers everywhere.
Re: Re: Moby's Dictum
That assumes the host keeps logs of the contents of the https traffic, which seems unlikely to be a reliable assumption. Or maybe CIAFBINSA is satisfied with metadata, like what IP connected to the server when? Also doesn't seem quite right, I think they want access to EVERYTHING.
They know that it will take a lot to get compliance from servers everywhere.
Everywhere, yes, but any reduction in their ability to snoop is cause for dire alarm from their perspective.
Re: Mindless over-reaction
What's actually happening: for the first time in the web's history, the security of non-e-commerce sites has actually been a real point of public discussion, and what's on the table is something that a lot of people feel should have been implemented years ago. HTTPS was being implemented by large sites for other reasons long before the Snowden revelations (e.g. Facebook making HTTPS mandatory following the vulnerability exposed by Firesheep). It's just that it wasn't in the general public awareness before Snowden. Since there's now more demand for security, more sites are implementing it, and it makes sense for it to become an overall standard.
Now, quit the scaremongering hyperbole yourself and deal with the facts, OK?
Re: Mindless over-reaction
What's interesting is that if it hadn't been for the hysterical overreaction to 9/11, perhaps we wouldn't have such a great need for encryption.
The movies are a known length and a known size given a particular connection speed. The ISP will always be able to tell what you are watching.
Further, because video encoding does not produce a stream of bits at a constant rate, the variation during the stream would quickly tell them what you started watching.
It is pointless.
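The fingerprinting idea above is easy to sketch. Assuming an observer has built a catalog of per-segment byte sizes for each title (the catalog below is invented for illustration), matching an observed encrypted stream is just a tolerance comparison, since encryption hides the bytes but not their sizes:

```python
# Invented catalog: title -> sequence of video segment sizes in bytes.
CATALOG = {
    "Title A": [512_300, 498_100, 505_700, 520_900],
    "Title B": [612_800, 630_200, 598_400, 641_000],
}

def identify(observed_sizes, tolerance=0.02):
    """Return catalog titles whose segment-size pattern matches within tolerance."""
    matches = []
    for title, sizes in CATALOG.items():
        if len(sizes) != len(observed_sizes):
            continue  # different segment count can't be the same title
        if all(abs(o - s) <= s * tolerance
               for o, s in zip(observed_sizes, sizes)):
            matches.append(title)
    return matches
```

Countermeasures exist, though: padding segments to fixed sizes or quantizing bitrates blunts exactly this kind of matching, so HTTPS is not automatically pointless for streams.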
Re:
You're claiming that each Netflix title has a unique length, and that ISPs know exactly how long each title is? How do they have this information, and how do you know that they do?
Re: Re:
Actually, the lengths of movies in hundredths of a second are very likely different for almost every movie, even if they are all, generally speaking, one and a half hours long.
If he is talking about the movie's "exact" duration, from start to finish, I would think that this information would be readily available to anyone who is hosting those movies in file form and has software that can measure the exact length of each - something I would assume is available to anyone like Netflix who has to know such time lengths in order to do broadcast scheduling.
I would also assume that entities such as Netflix would also own software that could add or subtract a few hundred milliseconds to the length of any movie they were hosting, or speed up/slow down the movie's running speed.
I think it was just mentioned here on Techdirt recently that some legacy networks have considered speeding movies up in order to insert more commercials, so such time control is obviously possible.
While I doubt that Netflix does any of these things, I do not see any of it as being technically difficult to accomplish, or implement as an automatic process.
I am curious as to why you consider this sort of simple measurement and length comparison to be technically difficult.
Please note I am not agreeing with the poster that Netflix or anyone else does these things - just disagreeing with your apparent claim that many or most movies are the exact same length and that automated measurement comparison would be difficult to implement by entities such as Netflix.
---