The Death Of Moore's Law Is Greatly Exaggerated
from the the-myth-will-live-on dept
Every so often, Intel co-founder Gordon Moore shows up for an interview, and it's only a matter of time before someone asks him about Moore's Law and he says something along the lines of how shrinking chips is going to reach its physical limits at some point. This has actually been going on for some time, but every time it happens, someone gets excited and writes up an article with a headline about the "death of Moore's Law." Do some Google searches on the topic and you'll find hundreds of articles on it stretching back years. The latest such article comes from ExtremeTech, claiming that Moore Sees 'Moore's Law' Dead in a Decade, as if that's something new. Go back a few years, and you can find nearly identical articles. However, the larger point is that it doesn't matter.
There are a few reasons for this. First of all, Moore's Law isn't what most people seem to think it is. It's not (and never has been) a "law." Even worse, what it means and how people interpret it has continued to change over time. Even Moore himself hasn't been consistent about what it means -- and the parts most often attributed to Moore aren't accurate at all. In fact, this latest article from ExtremeTech gets the basic facts wrong, saying that Moore's Law was originally about the number of transistors on a chip doubling every 18 months and was later pushed back to two years. Actually, Moore's original statement was about doubling every 12 months, and he was the one who later revised it to two years. At some point, others seemed to average the two and say it was 18 months. The more important point, however, is that it doesn't really matter. The specifics of Moore's Law have long since lost their significance, and its true importance today is simply as a shorthand way of saying that technology gets better and cheaper at a rather rapid pace -- and that's likely to continue for quite some time whether or not chip makers figure out how to keep squeezing more transistors onto a chip.
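(An illustrative aside, not from the original article: the choice of doubling period makes a huge difference in absolute numbers -- which is partly why the 12/18/24-month confusion persists -- but under any of them the qualitative point of rapid exponential improvement holds. A quick back-of-the-envelope calculation in C, starting from the roughly 2,300 transistors of the 1971 Intel 4004:)

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double start = 2300.0;                  /* transistors on the Intel 4004 (1971) */
        double periods_months[] = {12, 18, 24}; /* the three commonly quoted doubling periods */
        double years = 10;

        for (int i = 0; i < 3; i++) {
            double doublings = years * 12.0 / periods_months[i];
            printf("doubling every %2.0f months: ~%.0f transistors after %.0f years\n",
                   periods_months[i], start * pow(2.0, doublings), years);
        }
        return 0;
    }

Ten years of 12-month doublings is roughly a 1,000x increase; 18-month doublings give about 100x; 24-month doublings give about 32x.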
Filed Under: gordon moore, moore's law
Reader Comments
Clarification
It means that it gets better and cheaper at the same pace or slower than before.
Just applying some logic...
Not just
It could well be rapid linear growth, but it's not; it's exponential. And that kind of growth eventually becomes a problem for human perception, as we just aren't equipped to think that way. Perhaps it would be better for us all if the rate of growth did slow a little. See here for some more musings on this.
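(An illustrative aside, not part of the comment: the gap between linear and exponential growth is easy to state but hard to intuit. A tiny C snippet, assuming an arbitrary starting value of 100, makes the contrast concrete:)

    #include <stdio.h>

    int main(void) {
        /* Linear growth adds a fixed amount each year;
           exponential growth doubles each year. */
        long linear = 100, exponential = 100;
        for (int year = 1; year <= 10; year++) {
            linear += 100;
            exponential *= 2;
            printf("year %2d: linear = %5ld  exponential = %7ld\n",
                   year, linear, exponential);
        }
        return 0;
    }

After ten years the linear series has grown 11x while the doubling series has grown more than 1,000x.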
Re: Not just
I agree. It's hard for businesses to keep up. Many companies have adopted a "2 year" replacement schedule for electronics, which means a lot of (usually perfectly good) equipment being sent away to rot. If technology is growing exponentially, so are the landfills. It would be nice for technology to slow a little so we can figure out a way to dispose of all this wasted equipment... and save some money on the side.
Re: Not just
Except that the marginal utility of those improvements is falling exponentially, so most people do not "see" exponential growth. For myself, I am impatient for the future; I want faster progress, much faster progress - pour it on...
"The Death Of Moore's Law Is Exponentially Exaggerated"
Slow news day?
I hardly mean to sound cynical, but something with a tad Moore depth (sorry, couldn't resist) would be interesting.
And for the guy who's replacing his tech every 2 yrs, I might'a bought that argument 10+ yrs ago when PCs were very slow, and memory expensive, and any improvement was appreciated, but the only way that argument works now is if you're buying the bargain basement product w/no speed, memory or accessories.
I don't think that's all that alarmist though. What we're seeing in correlation with this trend is that parallel programming abstractions are becoming more mature (and important) and that people are learning how to construct their software to exploit this. Or well, at least that part's starting to happen anyhow.
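(An illustrative sketch, not from the comment itself: the kind of abstraction being described boils down to splitting work across threads so that extra cores, rather than a faster single core, provide the speedup. A minimal example in C using POSIX threads, with the thread count and array size chosen arbitrarily:)

    #include <pthread.h>
    #include <stdio.h>

    #define NTHREADS 4
    #define N 1000000

    /* Each thread sums one contiguous slice of the array. */
    struct slice { const int *data; long start, end, sum; };

    static void *sum_slice(void *arg) {
        struct slice *s = arg;
        s->sum = 0;
        for (long i = s->start; i < s->end; i++)
            s->sum += s->data[i];
        return NULL;
    }

    int main(void) {
        static int data[N];
        for (long i = 0; i < N; i++) data[i] = 1;

        pthread_t threads[NTHREADS];
        struct slice slices[NTHREADS];
        long chunk = N / NTHREADS;

        for (int t = 0; t < NTHREADS; t++) {
            long end = (t == NTHREADS - 1) ? N : (t + 1) * chunk;
            slices[t] = (struct slice){ data, t * chunk, end, 0 };
            pthread_create(&threads[t], NULL, sum_slice, &slices[t]);
        }

        long total = 0;
        for (int t = 0; t < NTHREADS; t++) {
            pthread_join(threads[t], NULL);
            total += slices[t].sum;
        }
        printf("total = %ld\n", total);
        return 0;
    }

(Compile with "cc -pthread"; the same pattern shows up, dressed up, in OpenMP, thread pools, and map/reduce-style frameworks.)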
Re:
Re:
Even MIPS isn't that meaningful a measure (several clock "ticks" go by for every instruction that gets "executed"), because instructions themselves can do very different amounts of work.
To explain this:
I am currently writing code for a PIC microcontroller. It is a good chip, and it runs on a 4 MHz "clock": the oscillator generates roughly 4 million pulses a second. For every 4 such pulses, a single instruction in the processor is executed. Hence, the chip operates at one MIPS: one million instructions per second.
But a single instruction in an advanced processor can be quite complex, some badass one-instruction calculation, so chips can't really be compared on either a Hz or a MIPS scale.
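(To spell out that arithmetic -- my own sketch, not the commenter's code:)

    #include <stdio.h>

    int main(void) {
        /* PIC example from the comment above: a 4 MHz clock and
           4 clock pulses per executed instruction give 1 MIPS. */
        double clock_hz = 4e6;
        double cycles_per_instruction = 4;

        double ips = clock_hz / cycles_per_instruction;
        printf("%.0f Hz / %.0f cycles per instruction = %.0f instructions/s (%.0f MIPS)\n",
               clock_hz, cycles_per_instruction, ips, ips / 1e6);
        return 0;
    }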
It's become Bill's Law anyhow....
And you have no choice. If you don't keep up in both software versions and the CPU power to run them, you will find it increasingly difficult to communicate with your suppliers and customers as they are forced to update by their suppliers and/or customers.
I predict Moore's Law may start to level off when or if the industry starts to adopt real Open Standards (not necessarily Open Source) and finally steps off the upgrade treadmill.
It's about transistors
The number of "transistors on a chip" will be an obsolete measure of technology and Moore's Law will be an obsolete measure of technology. So - will Moore's Law be obsolete in 10 years? Maybe it will - but not because transistors no longer double - maybe because the rule itself is an obsolete measure of technology.
Continuation of thought..
That would be nice.
Two major advances not yet achieved...
1) room-temperature superconductors, and
2) true 3-dimensional semiconductors
Either of these would accelerate the rate of increase of computing power vs. cost. A combination of the two would be a major revolution.
But then, just 16 years ago, a 1-gigabyte hard drive cost over $1000. And semiconductor memory was around $3-5/Mb. I now carry on my person more hard drive space AND more random-access memory than existed in the whole world the year I was born.
Hardware is superb but the software still sucks
There should be a Moore's Law for software: Application size doubles every 2 years but the software isn't much better.
How about
moore