He was right... mostly. It was a slightly different subset of social media that orchestrated this: online activism groups. I'm subscribed to mailing lists for several of them, because they do a lot of good, but... sometimes they also do something like this.
I saw the story as it unfolded, a few days before it became national news, and what happened was very deliberate: It was Trayvon Martin all over again. What should have been a very simple, local case of a thug attacking a guy with a gun, and the inevitable tragic consequences befalling him, got spun into a national media circus by people who want to incite another Rodney King-style race riot to call attention to their cause célèbre.
Yes, the heavy-handed police response was bad, but considering the riots already breaking out, it was certainly the lesser of two evils. (If you don't believe me, look up the statistics on the death tolls and property damage in the Rodney King riots!) They made the least bad choice they could have in a bad situation. Let's hope this finally gets put to rest now, at least until the next overly aggressive thug gets himself killed in a stupid way while happening to be black.
Exactly. Why are we complaining about it being a "show trial," when the entire thing was all for show in the first place?
The "injustice" crowd (goodname for them) failed to get the race riot they were trying for in the Trayvon Martin case, so now they're trying again. This grand jury hearing wasn't for the benefit of smart people like the author here, who understand the law; it was a bit of political theatre to clearly consider the evidence and show the agitators that they have no case.
I've also got to say your insults of C are offensive. It was designed to be as close to the metal as possible.
See the quote by Tony Hoare, above. Saying "it's only doing what it's designed to do" is no excuse when it was already well known, decades earlier, that this was a bad design.
It was supposed to alleviate the reliance on Assembly.
...and so was every other high level language in existence at the time. What makes C so special, except that it did such a bad job of it?
Of course it's dangerous when misused. It's low-level capable, and you ought to feel like you're walking around with lit sticks of dynamite in both hands while using it.
Yeah, think about that for a second. Granted, I've never actually used dynamite, but it seems to me, just based on common sense, that holding a lit stick in your hand, for any reason, is Doing It Wrong. And if you're supposed to feel like you're doing something very wrong every time you use it... well, yeah, that's actually a pretty apt description. It's wrong to use C. Period. ;)
Just because some people use it to write crap is no reason to damn the language.
Sure, and I don't damn it for that reason. My reason is that people who have taken the time to learn it right keep making the same mistakes anyway.
34 years ago, Tony Hoare gave a very interesting, and somewhat prophetic, Turing Award lecture. He talks about his work on ALGOL compilers, and one of the things he said has been on my mind recently:
In that design I adopted certain basic principles that I believe to be as valid today as they were back then. The first principle was security: The principle that every syntactically incorrect program should be rejected by the compiler and that every syntactically correct program should give a result or an error message that was predictable and comprehensible in terms of the source language program itself. Thus no core dumps should ever be necessary. It was logically impossible for any source language program to cause the computer to run wild, either at compile time or at run time.
A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interest of efficiency on production runs. Unanimously, they urged us not to—they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson.
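To put that in concrete terms for the C crowd, here's a made-up little sketch (nobody's real code): the loop below writes one element past the end of the array. The bounds-checked compilers Hoare is describing would have stopped it cold at run time, with an error message that actually makes sense; a C compiler, at its default settings, will typically accept it without a word, and the program will quietly trample whatever happens to live next to the array.

#include <stdio.h>

int main(void)
{
    int scores[10];

    /* Off-by-one: i runs from 0 through 10, so the final iteration
       writes scores[10], one element past the end of the array.
       C performs no bounds checking, so the stray write is silent. */
    for (int i = 0; i <= 10; i++)
        scores[i] = i * i;

    printf("done: %d\n", scores[0]);
    return 0;
}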
He said this in 1980, about work he had done in 1960, so this was known and understood to be a good idea as far back as 50 years ago. But, of course, the programming community in general didn’t listen. Several years later, the consequences came back to bite us, in the form of the Morris Worm.
It rampaged throughout the fledgling Internet of the day, crashing an estimated 10% of all systems connected to the Internet by exploiting buffer overruns in a handful of specific UNIX programs. The author, a sleazebag by the name of Robert Morris, later claimed that he just wanted to find a way to “count the number of computers on the Internet,” but his actions put the lie to that statement. He encrypted the Worm and used rootkit techniques to hide it from the file system, and he released it from a different university than the one he attended, in an attempt to cover his tracks. A person who believes they aren’t doing anything wrong doesn’t try to hide what they’re doing, and comments in his original source code make it clear that his intention was anything but benign; he was trying to build what we call a botnet today.
And all because of buffer exploits in a handful of C programs. That really should have put us all on notice. Hoare was right, and in any sane world, the C language would have been dead by 1990. But it didn’t happen, and those who refuse to learn from history are doomed to repeat it, so once the Internet started becoming a big thing among the general public, in the early 2000s, we ended up with a bunch of new worms that snuck into Windows systems through buffer exploits. Remember Slammer? Blaster? Code Red?
Hoare was right. We should have listened.
But we didn't, and now we get stuff like Heartbleed. Every few weeks, we get new security patches coming out for major software, fixing buffer overrun vulnerabilities. We get the same problems popping up over and over and over again, not because someone didn't learn how to do it the right way, but because someone who did know... was human, and made a mistake.
The guy responsible for the Heartbleed vulnerability isn’t a bad programmer. Have a look at the commit where the bug was introduced. See if you can find the problem without being told where it is.
It’s clear that this is not the work of an incompetent n00b; this is someone who really knows his way around the language. But he made a mistake, and it’s a subtle enough one that most people, even knowing beforehand that the changeset contains a severe bug, and knowing what class of bug it is (a buffer exploit vulnerability), won’t be able to find it.
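For anyone who doesn't feel like digging through the OpenSSL diff, here's the general shape of that class of bug, in a deliberately simplified sketch with made-up names (this is not the actual OpenSSL code): the length comes from the attacker's own message, and nothing checks it against how much data actually arrived.

#include <stdlib.h>
#include <string.h>

/* Sketch of a Heartbleed-class bug: echo back a payload whose length
   is taken from the request itself. Nothing verifies that the claimed
   length fits inside the data we actually received, so memcpy reads
   past the end of msg and leaks adjacent memory to the peer. */
unsigned char *build_echo_reply(const unsigned char *msg, size_t msg_len,
                                size_t *reply_len)
{
    /* The first two bytes of the message claim how long the payload is. */
    size_t claimed = ((size_t)msg[0] << 8) | msg[1];

    unsigned char *reply = malloc(claimed);
    if (reply == NULL)
        return NULL;

    /* BUG: missing the check "claimed <= msg_len - 2" before copying. */
    memcpy(reply, msg + 2, claimed);

    *reply_len = claimed;
    return reply;
}

The fix is a single bounds check, and that's exactly the point: the language never forced anyone to write it, and nothing complains when it's missing.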
To err is human, to forgive divine, but to continue to make the same error and willfully refuse to learn from it... that's unforgivable, at least by a non-divine mere mortal such as myself. And when a mistake can have consequences of this magnitude, that’s also unforgivable. That puts the language, which forgives such mistakes all too easily, fundamentally at odds with reality vis-à-vis human nature. That means something’s gotta give, and it’s not going to be reality… and this is what happens when it does.
Remember when Steve Jobs died, the minor kerfuffle over Richard Stallman’s quoting Chicago Mayor Harold Washington WRT the corrupt former Mayor Daley: “I’m not glad he’s dead, but I’m glad he’s gone”? It was just a few days later that Dennis Ritchie, the creator of C, died, and that’s exactly how I felt about him. As one of my former coworkers put it, Ritchie’s true legacy to the world is the buffer overflow.
There’s really no excuse left for C, other than inertia. (Which, if you recall, is ultimately what ran the Titanic into that iceberg.) Can we let it and its entire misbegotten family die already? It’s 25 years overdue for its own funeral.
And WRT your snark in the next post, it’s worth noting that at the time the Morris Worm first brought the Internet to its knees by exploiting buffer overruns in C, Apple was already five years into its Macintosh project that ended up defining the entire future of operating system design… in Pascal.
And yes, "everything is a file" was definitely a good idea. But as a developer, surely you're well aware of the distinction between a bad idea and a good idea implemented badly, even when the result is crap in both cases? ;)
And before that, he served in the military. He was the first example that came to mind. He's hardly the only example I could name. Military service does not automatically make someone some sort of paragon whose honor is beyond question, and it's a bit silly of you to imply that it does.
Well, as a programmer, I don't have nearly as much respect for Unix as you do. I believe that the highest virtue in coding is clarity, whereas Unix saddled a good portion of the world with a culture of intentional obfuscation. ("UNIX is basically a simple operating system, but you have to be a genius to understand the simplicity.")
The problem isn't that it's "too advanced for mere pedestrians to grasp;" it's that a lot of it is too advanced for *NIX wizards to grasp! When Eric freaking Raymond himself posted a blog post about running into some serious printing problems, one blogger responded:
The “I thought I was the only one” letters that Raymond found so interesting aren’t coming from the [mere pedestrian]-set; they’re coming from Linux geeks who read essays written by Eric Raymond. And they’re frustrated by open source software’s terrible usability. The problem isn’t just that [mere pedestrians] can’t use desktop Linux — the problem is that even Linux geeks have trouble figuring it out.
(Emphasis added)
UNIX is a horrible system, and the fact that it's less horrible than all of the other things it obsoleted doesn't change that. Not to mention that if it weren't for UNIX, we would never have been saddled with the C language (created specifically to build UNIX) and the myriad buffer overflow vulnerabilities that have plagued the Internet ever since the Morris Worm. Tony Hoare called NULL "the billion-dollar mistake," but by his scale, how many billions' worth of mistake are C (and C++ and every other abomination that sprang from its roots)?
Linus Torvalds: Created Linux, an OS that spent decades never getting above 1% market share, until an American company (Google) found a way to build an interface on it that doesn't suck. Outside of Android, it still has below 1% market share.
Alan Cox: Created Smalltalk, whose market share makes Linux look like a killer app. It was highly influential in the development of Objective-C, which, likewise, no one ever used, until an American company (Apple) made it the new standard. Outside of the Apple ecosystem, which crams Objective-C down all of the iDiots' throats, there is still nobody using Objective-C for anything.
Tim Berners-Lee: Created HTML, which nobody used until an American company (Netscape) created a useful interface to it. Then it became a massive worldwide phenomenon, so the guy gets some credit; at least he designed a halfway decent product that became popular with a bit of outside marketing and exposure.
Alan Turing: Did a bunch of research that duplicated the work of an American researcher (Alonzo Church). Church's work is generally considered to be of higher quality, but somehow Turing is the one everyone remembers.
Minitel: Never heard of it. A bit of Googling turns up that it was a really big BBS system that was popular for a while before the World Wide Web came around and made it obsolete. (See Netscape, above.)

Sorry, but the guy's got a point.
If I were an employer, and I saw one of my employees post "No Justice No Peace" online, particularly in connection with a volatile situation very close by in which there has already been widespread rioting, it's not much of a leap of logic to imagine that this employee is making a threat: "there will be No Peace as long as there is No Justice (by my definition of justice, of course, which means 'getting what I want')."
I'd have fired him over that too, with or without the DHS getting involved. Considering the circumstances, it's not the least bit unrealistic to consider that "making terroristic threats."
I still don't understand how retransmission fees make any sense in the first place. If a broadcaster is giving something away for free, by broadcasting it unencrypted over the air where anyone with the proper equipment can tune in at no cost, how do they then get to say "no, you have to pay us to use it"?
If I ran a performance venue, and I had drinking fountains there, but I hung a sign at the gate saying "no taking water out of here without paying for it", how enforceable would that be?
That case decided that a burrito was not a sandwich, but food experts don't all agree on that point.
Seriously? A sandwich is filling served between two slices of bread. A burrito does not have slices of bread; it has the filling wrapped in a tortilla. What room is there for disagreement on something so simple?
He's pushing new federalized trade secret laws. This is a really bad idea, which we'll be discussing in more detail later.
Here's a good federal trade secret law: Thou Shalt Not Have Trade Secrets. Anything less than that, I would oppose.
Patents and copyrights are both based on good ideas that did very useful things for our society. We owe the Industrial Revolution, almost in its entirety, to patents. Since then, those who profit from patents and copyrights have taken the concepts and corrupted them, but they could still be reformed, restored to their original purity, and be something good.
Trade Secrets, on the other hand, have no redeeming virtues and need to be done away with.
Thank you! I've been saying this for years: my property is my property! When I buy it, I own it, and the company that made it has no more claim on it. This is just another of the many reasons why DRM needs to be outlawed.
Re: Re: Sure, just like Uber has "legitimate business practices"
Left field? This is exactly the sort of stuff people with their feet firmly on the ground have been warning about ever since a few people started getting carried away singing Uber's praises. The company's run by an Objectivist, which alone is essentially prima facie evidence that it's going to be sleazy and abusive. We've known about their illegal price gouging in crises (coming down straight from the top!) since Sandy hit New York, and now this.
What actually surprises me about that is that he made a serious technical error: if you are sanitizing inputs at all, you are Doing It Wrong. (Just look at the myriad iterations of PHP's escape_sql_properly_no_really_we_swear_we_got_it_right_this_time functions!)
The only way to do it right is with parametrized queries, which don't require any escaping.
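For anyone who wants to see it in practice, here's a sketch in C against SQLite (the table and column names are made up for the example): the user's input gets bound as a value, so it can never be reinterpreted as SQL, and no escaping function ever enters the picture.

#include <stdio.h>
#include <sqlite3.h>

/* Look up a user's email with a parametrized query. The name is bound
   as a value via sqlite3_bind_text, so input like little Bobby Tables'
   "Robert'); DROP TABLE Students;--" is just an oddly spelled name,
   not executable SQL. */
int print_user_email(sqlite3 *db, const char *name)
{
    sqlite3_stmt *stmt;
    const char *sql = "SELECT email FROM users WHERE name = ?;";

    if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) != SQLITE_OK)
        return -1;

    sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);

    while (sqlite3_step(stmt) == SQLITE_ROW)
        printf("%s\n", (const char *)sqlite3_column_text(stmt, 0));

    sqlite3_finalize(stmt);
    return 0;
}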
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Jesse Jackson finally got something right
So what's your position, then? You freely admit that this won't, and essentially can't, ever be a free market with healthy competition, so why object that regulation would stifle healthy competition in a free market, when no such market exists in this space?
Title II is the best way to handle this particular situation, where a free market simply has no chance anyway.
On the post: Jesse Jackson Insists He's Lobbying For Weaker Net Neutrality Rules To Help Protect The Poor
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Jesse Jackson finally got something right
Turn down the ideology a few notches and have a look at the facts, and all this makes a lot more sense.