The Death Of Moore's Law Is Greatly Exaggerated

from the the-myth-will-live-on dept

Every so often, Intel co-founder Gordon Moore shows up for an interview, and it's only a matter of time before someone asks him about Moore's Law and he says something along the lines of how shrinking chips will reach its physical limits at some point. This has been going on for years, but every time it happens, someone gets excited and writes up an article with a headline about the "death of Moore's Law." Do a few Google searches on the topic and you'll find hundreds of such articles stretching back years. The latest comes from ExtremeTech, claiming that Moore Sees 'Moore's Law' Dead in a Decade, as if that's something new. Go back a few years and you can find nearly identical articles. The larger point, however, is that it doesn't matter.

There are a few reasons for this. First of all, Moore's Law isn't what most people seem to think it is. It's not (and never has been) a "law." Even worse, what it means and how people interpret it has continued to change over time. Even Moore himself hasn't been consistent about what it means -- and the parts most often attributed to him aren't accurate at all. In fact, this latest article from ExtremeTech gets the basic facts wrong, saying that Moore's Law was originally about the number of transistors on a chip doubling every 18 months, and that this was later pushed back to two years. Actually, Moore's original statement was about doubling every 12 months, and he was the one who later revised it to two years. At some point, others seemed to average the two and call it 18 months. The more important point, however, is that it doesn't really matter. The specifics of Moore's Law have long since lost their significance, and its true importance today is simply as a shorthand way of saying that technology gets better and cheaper at a rather rapid pace -- and that's likely to continue for quite some time, whether or not chip makers figure out how to squeeze ever more transistors onto a chip.
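To see how much the choice of doubling period actually matters, here's a quick back-of-the-envelope sketch in Python (the starting transistor count is made up; only the ratios matter):

    # How much does the assumed doubling period matter over ten years?
    start = 1_000_000  # illustrative transistor count, not a real chip
    years = 10

    for months_per_doubling in (12, 18, 24):
        doublings = years * 12 / months_per_doubling
        growth = 2 ** doublings
        print(f"{months_per_doubling}-month doubling: "
              f"{start * growth:,.0f} transistors ({growth:,.0f}x)")

    # 12 months -> 1,024x; 18 months -> ~102x; 24 months -> 32x.
    # Same "law," wildly different predictions over a single decade.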

Filed Under: gordon moore, moore's law


Reader Comments



  1. Shohat, 19 Sep 2007 @ 6:57am

    Clarification

    "...way of saying that technology gets better and cheaper at a rather rapid pace..."


    It means that it gets better and cheaper at the same pace or slower than before.

    Just applying some logic...


  2. name, 19 Sep 2007 @ 7:03am

    yay for wasted space


  3. seumas, 19 Sep 2007 @ 7:04am

    Not just

    Moore's Law is indeed about the rate of growth of technology, which you might as well measure by the growth in computing power as by anything else, but to say that all it means is "rapid growth" kind of misses the point.

    It could well be rapid linear growth, but it's not; it's exponential. And that kind of growth becomes a problem for human perception, in time, as we just aren't equipped to think that way. Perhaps it would be better for us all if the rate of growth did slow a little. See here for some more musings on this.
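    To put toy numbers on that linear-vs-exponential gap (a sketch, purely illustrative):

        # Linear growth that matches the exponential curve in year one,
        # then falls hopelessly behind. Numbers are made up.
        base = 100
        for year in range(0, 11, 2):
            linear = base + base * year       # add 100% of the base each year
            exponential = base * 2 ** year    # double each year
            print(f"year {year:2}: linear {linear:>6,}  exponential {exponential:>10,}")

        # By year 10 the exponential curve is ~93x the linear one --
        # exactly the kind of gap human intuition misses.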


  4. TheDock22, 19 Sep 2007 @ 7:19am

    Re: Not just

    "Perhaps it would be better for us all if the rate of growth did slow a little."

    I agree. It's hard for businesses to keep up. Many companies have adopted a "2 year" replacement schedule for electronics, which means a lot of (usually perfectly good) equipment is sent away to rot. If technology is growing exponentially, so are the landfills. It would be nice for technology to slow a little so we can figure out a way to dispose of all this wasted equipment... and save some money on the side.


  5. RandomThoughts, 19 Sep 2007 @ 7:24am

    Wasted equipment and landfills? Most technology is disposed of not because of new technology, but because of the write-off once it's fully depreciated. You don't replace that old computer because of Moore's Law; you replace it because the bean counters decide you do.


  6. Joe Schmoe, 19 Sep 2007 @ 7:39am

    ; ) Oh Mike, a better title would have been

    "The Death Of Moore's Law Is Exponentially Exaggerated"


  7. James, 19 Sep 2007 @ 7:55am

    Slow news day?

    Mike... while you expounded a bit, the whole point of this seems odd. So there are articles preaching the death of "Moore's Law"... who cares?

    I hardly mean to sound cynical, but something with a tad Moore depth (sorry, couldn't resist) would be interesting.

    And for the guy who's replacing his tech every 2 yrs: I might'a bought that argument 10+ yrs ago, when PCs were very slow, memory was expensive, and any improvement was appreciated. But the only way that argument works now is if you're buying the bargain-basement product w/ no speed, memory, or accessories.


  8. Danno, 19 Sep 2007 @ 7:58am

    Hasn't the law already fallen through, though? We're not seeing clock speed increases in chips much anymore; instead we're getting more processors on a die.

    I don't think that's all that alarmist, though. What we're seeing in correlation with this trend is that parallel programming abstractions are becoming more mature (and more important), and that people are learning how to structure their software to exploit them. Or, well, at least that part's starting to happen.
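    A tiny sketch of that shift (hypothetical workload; the point is spreading independent work across cores instead of waiting on one):

        from concurrent.futures import ProcessPoolExecutor

        # "More cores, not more clock": farm independent chunks of work
        # out to a pool of worker processes.
        def crunch(n):
            return sum(i * i for i in range(n))

        if __name__ == "__main__":
            jobs = [2_000_000] * 8                 # eight independent chunks
            with ProcessPoolExecutor() as pool:    # defaults to one worker per core
                results = list(pool.map(crunch, jobs))
            print(len(results), "chunks computed in parallel")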


  9. seumas, 19 Sep 2007 @ 8:10am

    Re:

    Moore's Law was never about clock speed, though; it was about transistor density. And machines aren't getting any less powerful. The last time I saw a comparative benchmark, it showed a recent Core 2 Duo running 15 times faster than a 1 GHz P3. Things haven't slowed down!
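    For what it's worth, that 15x figure fits the classic doubling rate. A rough sketch (the ~6.5-year gap between a 1 GHz P3 and a Core 2 Duo is my assumption):

        import math

        # Does a 15x speedup fit Moore's-Law-style doubling?
        speedup = 15
        years = 6.5  # assumed gap: 1 GHz P3 (early 2000) to Core 2 Duo (2006)

        doublings = math.log2(speedup)
        months = years * 12 / doublings
        print(f"{doublings:.1f} doublings in {years} years "
              f"= one doubling every {months:.0f} months")
        # ~3.9 doublings -> roughly one doubling every 20 months.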


  10. Anonymous Coward, 19 Sep 2007 @ 8:30am

    You are wrong. Moore's Law is about transistor density, and when it stops we have somewhat more scary and unknown territory to traverse. Moore's Law says, "You can rely on me, until you can't anymore." When you can't anymore, it is time to find new technology and new laws. Or, if you're smarter, have the new tech ready for when Moore's Law fails.


  11. Xiera, 19 Sep 2007 @ 8:47am

    Mike, you were bored, weren't you?


  12. Superfli95, 19 Sep 2007 @ 8:49am

    People make a big deal about the "Death of Moore's Law" as if it is something new and astonishing. It's not. We always knew that someday we would hit physical barriers in our current transistor technology: parasitic capacitance, quantum effects, etc. It is just a matter of finding other ways of storing information: phase-change materials, quantum mechanics, quantum optics, holographic storage, etc. This is how the world works, people: develop a technology until it reaches its limit, and then develop new technology.


  13. Shohat, 19 Sep 2007 @ 8:50am

    Re:

    We are seeing speed increase, but I guess you are referring to clock speed (in Hz), which is not so important.

    Even MIPS isn't so important (for every several clock "ticks," a single instruction is "executed"), because instructions themselves can be of different importance.

    To explain this: I am currently writing for a PIC microcontroller. It is a good chip, and it works on a 4 MHz clock: the oscillator generates ~4 million pulses a second, and for every 4 such pulses, a single instruction in the processor is executed. Hence, the chip operates at one MIPS -- one million instructions per second. But a single instruction in an advanced processor can be quite complex -- some badass one-instruction calculation -- so chips cannot be compared on either a Hz or a MIPS scale.
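    The arithmetic, as a sketch:

        # PIC-style MIPS math from the example above:
        clock_hz = 4_000_000         # 4 MHz oscillator
        cycles_per_instruction = 4   # 4 clock pulses per instruction
        mips = clock_hz / cycles_per_instruction / 1_000_000
        print(f"{mips:.0f} MIPS")    # -> 1 MIPS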


  14. Joe Smith, 19 Sep 2007 @ 9:27am

    Re: Not just

    "...it's exponential. And that kind of growth becomes a problem for human perception, in time, as we just aren't equipped to think that way. Perhaps it would be better for us all if the rate of growth did slow a little."

    Except that the marginal utility of those improvements is falling exponentially, so most people do not "see" exponential growth. For myself, I am impatient for the future. I want faster progress, much faster progress -- pour it on...


  15. zcat, 19 Sep 2007 @ 9:44am

    It's become Bill's Law anyhow....

    Windows will require twice the processing power and storage capacity approximately every 18 months, even when it goes years between major updates, thanks to a combination of constant patch-on-patch updates and accumulated cruft.

    And you have no choice. If you don't keep up in both software versions and the CPU power to run them, you will find it increasingly difficult to communicate with your suppliers and customers, as they are forced to update by their own suppliers and/or customers.

    I predict Moore's Law may start to level off when (or if) the industry adopts real Open Standards (not necessarily Open Source) and finally steps off the upgrade treadmill.


  16. Duodave, 19 Sep 2007 @ 9:47am

    It's about transistors

    Moore's law may be a simple yardstick of technology, but the fact is, it's just about transistors. What happens when chips themselves are obsolete and we've discovered some new technology that's better than silicon wafers?

    The number of "transistors on a chip" will be an obsolete measure of technology, and so will Moore's Law. So, will Moore's Law be obsolete in 10 years? Maybe it will -- but not because transistors no longer double; rather, because the rule itself has become an obsolete measure of technology.


  17. timtimtim, 19 Sep 2007 @ 9:48am

    Moore's Law is just a specific variation of the Law of Accelerating Returns (more appropriately called accelerating change). Whether the transistor count doubles every 12 months or every two years, the principle remains the same: computers become exponentially faster because technology improves exponentially.


  18. zcat, 19 Sep 2007 @ 9:50am

    Continuation of thought..

    Perhaps if we can get to a point where software no longer demands exponentially increasing amounts of storage and processing power, Moore's Law might get replaced by Negroponte's Law: the same amount of storage and processing power becomes half as expensive approximately every 18 months.

    That would be nice.


  19. Instructor, 19 Sep 2007 @ 11:37am

    Two major advances not yet achieved...

    Two major advances were expected some time ago, but have not yet been realized:

    1) room-temperature superconductors, and
    2) true 3-dimensional semiconductors

    Either of these would accelerate the rate of increase of computing power vs. cost. A combination of the two would be a major revolution.

    But then, just 16 years ago, a 1-gigabyte hard drive cost over $1000. And semiconductor memory was around $3-5/Mb. I now carry on my person more hard drive space AND more random-access memory than existed in the whole world the year I was born.
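    Running the numbers on that hard drive price (the ~$0.30/GB figure for today is my rough assumption, not a quote):

        import math

        # How fast did $/GB fall over those 16 years?
        old_price, new_price = 1000.0, 0.30   # $/GB then vs. (assumed) now
        years = 16

        halvings = math.log2(old_price / new_price)
        months = years * 12 / halvings
        print(f"{old_price / new_price:,.0f}x cheaper: {halvings:.1f} halvings, "
              f"one roughly every {months:.0f} months")
        # ~3,333x cheaper: ~11.7 halvings, about one every 16 months.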


  20. Anonymous Coward, 19 Sep 2007 @ 12:02pm

    u r dum


  21. Anonymous Coward, 19 Sep 2007 @ 12:02pm

    u r dum


  22. TW Burger, 19 Sep 2007 @ 12:03pm

    Hardware is superb but the software still sucks

    I have seen 25 years of impressive hardware advances, but very little corresponding advance in software. Excel came out with a limit of 65,536 rows in DOS, and Excel in Office 2003 still has this limit. The only increase I ever see in software is in its hard drive and memory footprint.

    There should be a Moore's Law for software: application size doubles every 2 years, but the software isn't much better.


  23. a step backwards, 19 Sep 2007 @ 12:57pm

    How about

    we, instead of making chips smaller and incrementally faster, make the current size really frikin' fast? Smaller always seems to me to be more of a marketing bullet point. Could they (Intel or AMD) make a 10 GHz single processor with a massive cache on a larger die, or is smaller always better? I don't really care too much about how much power it would use (as something that awesome could, and possibly would, only be used by institutions and gov'ts) -- it would be the jet engine of computing: huge and energy-hungry, but fast as all get-out. I want to see the perfection of the tech, not the constant superseding for incrementally lower overall gain we have been getting.


  24. Moore, 27 Jan 2010 @ 4:33am

    moore

    Chips always taste better smaller. Moore was a genius.


