I so wish the EU would tell us developers to Nerd Harder on stopping mass surveillance, alongside protecting privacy otherwise. But I'm guessing they have their own intelligence agencies who want technology that lets them through.
Re: Re: Re: Re: Re: Yeah, of course... and they're lying through their teeth.
To be clear, I don't trust anything (at least when it comes to computers) that I can't verify for myself. Privacy is too important for anything less than paranoia. I can't verify what code Yahoo et al. are running on their computers, so I don't trust what they say about it. What I would trust is Yahoo letting native clients encrypt messages in a way (say, using DIME) that makes this scanning impossible.
All I really know about the Snowden leaks is that they are far too plausible.
That said, today we sometimes have to trust a company's assertions, but it's my goal in life to get away from that. Plus I've found prettier software this way, and the only inconvenience I'm facing is telling people I'm not on Facebook.
Re: Yeah, of course... and they're lying through their teeth.
To be clear, all the companies mentioned in the PRISM leak (many of the same companies) denied it then too.
And as Christopher Soghoian of the ACLU said in response to that, either the companies are lying through their teeth OR the government has cracked into their server farms. That is, if you believe the PRISM leak, as the author of this article does.
O.K. then, go ahead and reengineer the Internet to better realize this vision.
Because we've already done this to the fullest degree possible with current protocols. But in the meantime IANA will continue (not take over) handling international top-level domains (which US corporations treat as their own) while giving every country its own top-level domain to manage (e.g. .us, .ca, .au, .nz, .jp).
Well, this someone already controls IP-address allocation.
Sure, it's right not to trust whoever controls the allocation of domains, be it a government, a non-profit, or (even more so) a for-profit, and maybe if the Internet were built using modern computer science we wouldn't be having this discussion.
But that said, IANA, not the US government, has always been the one handing out domain names, and they've been doing a great job of it. They remain in the background and the Internet just seems to work.
It's a sad thing to hope for, but given how disliked the candidates are and how close the polling is, I don't think we'd lose much from it. And as the researchers suggested, this might be what it takes to push industry to fix the security holes throughout the Internet's wiring, applications, & "Things".
Besides, all I want out of this election is chaos, and that would bring it while showcasing an important issue.
To be clear, I'm not calling on anyone to be fired, but I will argue my view of the web. Because what we've called incompetence is really a conflicting vision of how the web should work.
In my normal browsing, I find most of the scripts that want to run are trackers and advertisers, and the scripts I do want to let run are on a small handful of sites (e.g. online mapping).
Furthermore, I find what makes a website succeed is its content (its HTML), not its client-side behaviour (its JavaScript). And sticking predominantly to HTML gives the site better behaviour in corner cases, as well as helping it render quicker even on slow networks.
For these and other reasons I think it's appropriate to block scripts by default (if you've got the patience), and for developers to focus on HTML/CSS & add sprinkles of JavaScript where it helps.
As for Unpatent, I see you think you're saving effort by using Meteor, but in this case I posit that, with the result you got, you might as well have used HTML, CSS, one line of jQuery, and some simple server-side templates (Handlebars, Jinja, PHP - take your pick).
tl;dr Don't tell me to accept JavaScript as a core building block. What it provides is not core to the large majority of websites, and certainly not core to Unpatent.
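To show how little machinery the "dynamic" part of such a page needs, here's a toy server-side template renderer in the spirit of Handlebars/Jinja (a sketch only: the function, template syntax, and names are made up for illustration, not any real library's API):

```javascript
// Toy server-side template renderer: replace each {{name}} with the
// matching value, HTML-escaped. A real site would use Handlebars etc.,
// but the principle is this small.
function render(template, data) {
  return template.replace(/\{\{(\w+)\}\}/g, (_, name) =>
    String(data[name] ?? '')
      .replace(/&/g, '&amp;')
      .replace(/</g, '&lt;')
      .replace(/>/g, '&gt;'));
}

// A hypothetical campaign page rendered entirely on the server:
const page = render('<h1>{{title}}</h1><p>Raised: {{raised}}</p>',
                    { title: 'Invalidate this patent', raised: '$1,024' });
console.log(page);
// -> <h1>Invalidate this patent</h1><p>Raised: $1,024</p>
```

The browser then gets plain HTML, which degrades gracefully and renders fast even with scripts blocked.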
It's worse than that: the page is almost entirely static.
Not only does this web designer not understand graceful degradation, he's using JavaScript where HTML is better suited. That is, he's making things harder for himself.
Nevertheless, I wish this project luck. It won't replace legal reform, but it's an important stopgap.
It's not like the issues I mentioned don't have solutions, just that those solutions can slow down development or hold up adoption.
The most interesting of these solutions I've seen is a plan to design a peer-to-peer Web where, if the client browser doesn't support it, a central server will serve it JavaScript instructing it how to access the page.
As for the issue of identifiers I've seen a couple of solutions, and introducing a semi-centralized translation service is certainly one of them. But given the mindset behind these projects I find QR codes are a more common one.
Still, lock-in (thank you Thad, I forgot to mention it) is a big issue I haven't seen addressed well, and as for the political angle, we just need to review the new protocols and code for security flaws.
I've been following the developments on rebuilding the Internet, and let's see if I can summarise why we aren't there yet.
Most commenters on this topic are pointing out the threat of political pressure on a redesigned Internet, but there are other issues at play.
The biggest problem (whether it's IPv6, mesh networking, or a peer-to-peer Web built on a DHT) is that before end-users see value in running the protocol, it must already be popular. As such it's actually not that hard to build a stronger alternative to the Internet; the issue is navigating that catch-22 in order to get it used.
Furthermore, there's the issue that any purely peer-to-peer identifier (AKA a "pubkeyhash") is inherently unreadable and harder to communicate to friends than a phone number, but an open-minded UI designer should be able to help solve this problem.
In short, we have been at this task of building a stronger, better Internet, but to one extent or another we can only do so incrementally. This is due not only to political pressure, but also to marketing.
Yet another point the co-host is missing (though I think I heard it touched on) is that while distributed technologies like blockchains can keep things internally consistent, they do nothing to ensure the accuracy of that data.
This is an inherent restriction of any sort of database, and credit companies (badly) approximate a solution by providing someone to blame for collecting bad data. So if these companies are driven to black-market peer-to-peer technologies*, they would still exist to say (under an anonymous identity) "yes, I guarantee this data is correct".
* Not that I have anything against peer-to-peer; done right (not like Bitcoin) it's what we need to avoid this censorship.
Now if only they'd turn most of their focus to breaking and investigating ISIS's shoddy encryption rather than the rest of ours, I might have a shred of respect for them.
That's certainly the big question, but I'll tell you who won't be next.
Of course the US won't; they've been pushing these "deals" and are in the pockets of the corporations. And it won't be New Zealand, at least until the election next year, as John Key considers this "good economics".
On the post: Basically All Big Tech Companies Deny Scanning Communications For NSA Like Yahoo Is Doing
Re: Re: Re: Re: Re: Re: Re: Yeah, of course... and they're lying through their teeth.
On the post: Yahoo Email Scanning May Sink EU Privacy Shield Agreement
On the post: Basically All Big Tech Companies Deny Scanning Communications For NSA Like Yahoo Is Doing
Re: Re: Re: Re: It's why I implemented my own email servers.
That said, I don't think the comment I was replying to was taking this nuance into account.
On the post: Basically All Big Tech Companies Deny Scanning Communications For NSA Like Yahoo Is Doing
Re: Re: Re: Re: Re: Yeah, of course... and they're lying through their teeth.
On the post: Basically All Big Tech Companies Deny Scanning Communications For NSA Like Yahoo Is Doing
Re: Re: Re: Re: Yeah, of course... and they're lying through their teeth.
On the post: Basically All Big Tech Companies Deny Scanning Communications For NSA Like Yahoo Is Doing
Re: Re: It's why I implemented my own email servers.
On the post: Basically All Big Tech Companies Deny Scanning Communications For NSA Like Yahoo Is Doing
Re: Yeah, of course... and they're lying through their teeth.
On the post: Ridiculously Stupid: 4 State Attorneys General File Totally Bogus Lawsuit Against Internet Transition
Re: Re:
On the post: Ridiculously Stupid: 4 State Attorneys General File Totally Bogus Lawsuit Against Internet Transition
Re: Mesnick Missing the Point
On the post: The Internet Of Poorly Secured Things Is Fueling Unprecedented, Massive New DDoS Attacks
Could we have an attack this election?
On the post: Unpatent Launches Combination Crowdfunding/Crowdsourcing Platform To Invalidate Stupid Patents
Re: Re: Re: Re: Nope. Can't be bothered.
On the post: Unpatent Launches Combination Crowdfunding/Crowdsourcing Platform To Invalidate Stupid Patents
Re: Nope. Can't be bothered.
On the post: If Someone Is Testing Ways To Take Down The Internet, Perhaps It's Time To Build A Stronger Internet
Re: Re: Why hasn't this been done before?
On the post: This Bill Could Stop Protectionist State Broadband Laws, But ISP Control Over Congress Means It Won't Pass
Re: Congress...
On the post: If Someone Is Testing Ways To Take Down The Internet, Perhaps It's Time To Build A Stronger Internet
Why hasn't this been done before?
On the post: Supreme Court Just Made It Easier For Patent Trolls
Re: Hillary's book
On the post: Techdirt Podcast Episode 77: The Link Between Credit And Surveillance
Re: (Mason Wheeler)
On the post: Snowden Docs Show GCHQ, MI5 To Be All Haystack, No Needle
SHOCKER! WOW!
On the post: India Seeks To Renegotiate 47 Investment Treaties Because Of Their Corporate Sovereignty Clauses
Who'll be next?
Anyone else want to chip in?
On the post: It's Official: US International Trade Commission Predicts Negligible Economic Benefits From TPP
Re: Yup, dollars speak louder than percentages