Lion of the Blogosphere

Evidence of the end of Moore’s Law

Since 2011, the benchmark scores of MacBook Pros have increased by approximately 50%.

According to Moore’s Law, which says that computing power doubles every two years, we should have seen a 300% increase in the benchmark scores over that period. This explains why I haven’t felt the need to buy a new computer recently. A computer from 1995 would have blown away a similarly priced computer from 1991.
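
A quick back-of-the-envelope check of that arithmetic (a minimal sketch in Python; the baseline score of 100 is a made-up illustration, not a real benchmark number):

```python
# Compound doubling: under Moore's Law, computing power doubles every 2 years.
years = 2015 - 2011
expected_multiple = 2 ** (years / 2)     # 4.0x over four years

baseline = 100                           # hypothetical 2011 benchmark score
expected = baseline * expected_multiple  # 400 -> a 300% increase
observed = baseline * 1.5                # 150 -> the ~50% increase actually seen

print(f"expected +{(expected_multiple - 1) * 100:.0f}%, observed +50%")
```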

According to a chart in The Economist, the cost per transistor in an IC chip has stopped falling since 2012.

Prior to 2003, we got easy improvements in computing power through rapid increases in clock speed, but clock speeds have been stalled since 2003 (because higher clock speeds make the silicon too hot).

This is not to say that all progress has ended, but it has slowed significantly, and progress now comes more in the form of better battery life, better LCD displays, etc., rather than raw computing power.

* * *

This has a significant impact on the coming of the singularity. If computers ten years from now are only about twice as powerful as computers today, then it’s hard to imagine that computers will start to think like humans. We’ll be lucky if we get self-driving cars, and those cars will only be able to drive themselves because they have GPS and sensors all over the car. They could never drive the way a human does, with just two eyes.

Written by Lion of the Blogosphere

August 18, 2015 at 10:43 am

Posted in Technology

24 Responses


  1. Benchmarks cannot disprove Moore’s law because it says nothing about performance. Moore’s law deals with the integrated circuit manufacturing process and how many transistors one can cram into a given area, so it more closely predicts complexity. Until recently, a smaller process meant faster clock speeds, which is where a lot of performance came from as a side effect of Moore’s law. But as you say, heat is a problem, as are quantum effects at smaller process sizes. BTW, IBM recently announced a 7nm process, which will keep Moore’s law alive for a few more years.
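
    To illustrate why a process shrink matters for density (a rough sketch; this is the idealized scaling, and modern node names like “7nm” no longer track literal feature sizes):

    ```python
    # Ideal area scaling: transistor density goes as 1 / (feature size)^2,
    # so a full node shrink roughly quadruples transistors per unit area.
    old_nm, new_nm = 14, 7
    density_gain = (old_nm / new_nm) ** 2
    print(f"{old_nm}nm -> {new_nm}nm: ~{density_gain:.0f}x transistors per unit area")
    ```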


    August 18, 2015 at 11:11 am

  2. “If computers in ten years will only be about twice as powerful as computers today, then it’s hard to imagine that computers will start to think like humans.”

    Computers are already powerful enough to think faster and stronger than any human — they just lack the necessary algorithm. That’s at least what I understood from my recent readings of books on the subject.


    August 18, 2015 at 11:19 am

    • Which books? Can you link to them?
      Also, Lion, if I understand correctly, a consumer chip like an i7 has half of its die space dedicated to graphics. Why is Intel prioritizing useless features on high-end chips instead of more cores? Intel can still produce a high-core-count chip.
      Is Moore’s law really dead? I know that is a server chip. A person who buys an i7 does not need integrated graphics, because people who buy an i7 typically get a dedicated GPU.


      August 18, 2015 at 1:42 pm

      • Intel is trying to attract the gaming market because gamers are the only people who swap out hardware on a regular basis.


        August 19, 2015 at 11:02 am

      • If everyone who buys a new laptop has a certain basic minimum graphics capacity, then devs can raise the floor for games and other video apps sold for mass commercial distribution, instead of having to cater to all different levels of specs. So it basically raises the bar.

        I don’t know if everyone who buys an i7 also buys a dedicated graphics chip… I could see some 50-year-old businessman buying the chip just for the processor speed, and because he has the money. But he may have no interest in games whatsoever.


        August 19, 2015 at 3:55 pm

      • That is because to use more cores you need software that can actually use more cores.
        Unfortunately, writing parallel software is hard unless it is a trivial case (see the sketch below).
        So having more cores won’t help you, since you don’t have parallel software to use those cores.
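
        A minimal sketch of the trivial (embarrassingly parallel) case, using Python’s standard multiprocessing pool; anything with shared state between the work items is where it gets hard:

        ```python
        from multiprocessing import Pool

        def work(n):
            # Each item is fully independent: the trivially parallel case.
            return sum(i * i for i in range(n))

        if __name__ == "__main__":
            inputs = [2_000_000] * 8
            with Pool() as pool:                  # one worker per CPU core by default
                results = pool.map(work, inputs)  # items farmed out across the cores
            print(len(results), "results computed in parallel")
        ```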


        August 23, 2015 at 8:24 am

  3. These benchmarks fail to account for the ongoing big gains in SIMD throughput, such as on graphics cards, which is the bottleneck for many, many applications, especially interesting new ones like computer vision and AI. “Raw computing power” is definitely still going up substantially; it’s just not in a form that makes most traditional programs run faster.
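
    To illustrate the gap (a rough sketch; NumPy’s version runs in compiled, SIMD-friendly loops, and the exact ratio varies by machine):

    ```python
    import time
    import numpy as np

    a = np.random.rand(10_000_000)

    t0 = time.perf_counter()
    total = sum(x * x for x in a)    # scalar path: one element at a time
    t1 = time.perf_counter()
    vec = float(a @ a)               # vectorized path: compiled SIMD inner loop
    t2 = time.perf_counter()

    print(f"scalar {t1 - t0:.2f}s  vs  vectorized {t2 - t1:.4f}s")
    ```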


    August 18, 2015 at 11:41 am

  4. Moore’s Law is really not a “Law” at all. It is just a description of the exponential phase of a growth curve. We are entering the plateau phase.


    August 18, 2015 at 11:42 am

  5. Moore’s observation was initially about complexity; only later was it equated with speed.

    The consumer market wants endurance and video performance on hi-res displays in a sleek package more than raw computing power, for gaming, YouTube, Netflix and showing off at Starbucks.

    Apple predicted this trend before all others did and developed its products accordingly.


    August 18, 2015 at 12:12 pm

  6. For the cases where more computing really matters, processing will be many tens of times faster. I would be surprised if we don’t get a 1,000x increase in processing in areas like matrix multiplication in 20 years.

    You may think that matrix multiplication is boring, but it will be at the heart of any general or partially restricted AI.
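
    To make that concrete (a toy sketch with made-up sizes): one layer of a neural network is essentially a single matrix multiplication plus a cheap nonlinearity, so almost all of the arithmetic in training and inference reduces to matmuls.

    ```python
    import numpy as np

    batch, d_in, d_out = 64, 1024, 1024
    x = np.random.randn(batch, d_in)         # a batch of input activations
    W = np.random.randn(d_in, d_out) * 0.01  # the layer's weight matrix

    h = np.maximum(x @ W, 0.0)               # one layer: matmul + ReLU
    print(h.shape)                           # (64, 1024)
    ```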

    The big limit to AI now is really figuring out how to train it. In every area where we have figured out how to train neural networks (convolutional and recurrent LSTM networks), they have become the state of the art.

    see for fun with RNNs.


    August 18, 2015 at 12:50 pm

  7. Programming has become horrifically inefficient. Computing power for ordinary desktop computers has gone from around 20,000 computations per second in 1977 (the Apple II) to well over a trillion today. This means computers are more than 50 million times more powerful.

    And yet my HP laptop takes more than a minute to boot up in Windows. Shouldn’t it boot up in about a millionth of a second, or at least a thousandth of a second? Bad programming has eaten up too much of the gain from Moore’s law. Gates’s Law says that software speed halves every 18 months.

    Most of the benefit from Moore’s Law has never been captured. If software programmers would just program for efficiency, with fewer than 10 layers of abstraction, the appearance of Moore’s Law could continue for many years to come.
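
    The cost of stacking layers is easy to demonstrate (a contrived sketch; real-world bloat comes from far heavier layers than bare function calls):

    ```python
    import timeit

    def add_layer(f):
        # Each "abstraction layer" here is just one more function call.
        return lambda x: f(x)

    direct = lambda x: x + 1
    wrapped = direct
    for _ in range(10):                      # ten layers of indirection
        wrapped = add_layer(wrapped)

    print(timeit.timeit(lambda: direct(1)))  # the work itself
    print(timeit.timeit(lambda: wrapped(1))) # same work through 10 layers
    ```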

    If I were a software company CEO, I would make my developers develop and test on something like 1998 PCs. Instead, developers get the latest, fastest machines. Giving developers hot new machines to develop on is a terrible practice, because it lets them be horribly inefficient programmers.


    August 18, 2015 at 1:46 pm

    • You’re showing a strong misunderstanding of what bottlenecks computer boot-up.

      See the difference between bootup on an SSD vs. a standard HDD: an SSD does a full reboot in something like 10-15 seconds, while an HDD takes over a minute.

      Marc KS

      August 19, 2015 at 10:13 am

      • Thanks for pointing that out. Solid-state memory is a second major area of low-hanging fruit in computing, which will allow computers to get faster for years to come. But there is no question that software bloat is an enormous problem. Most modern software is many times larger than it really needs to be.

        Software bloat and the hard-drive bottleneck go together: Windows and most other modern software is far larger than it should be, which makes loading it much slower.


        August 20, 2015 at 8:42 am

  8. If computing power doubled every two years, a computer would not be 400% more powerful in 2015 than in 2011; it would be 400% as powerful, or 300% more powerful.


    August 18, 2015 at 2:09 pm

  9. I agree, and I also agree that extra processing power is not needed for the general consumer.

    I have a 6-year-old Dell laptop: dual core, 4GB of RAM, motherboard graphics chip. It was getting slow, so I bought an SSD. Now it runs incredibly fast and I see no reason to upgrade.

    The screen is barely better than a 720P display (1280 x 800) but it is bright and colorful. For internet surfing, word processing and light gaming, it works perfectly.

    Why spend hundreds or thousands to upgrade when I won’t use the extra cores or all that memory?


    August 18, 2015 at 2:24 pm

  10. People tend to forget that power consumption and heat are bigger considerations in most applications, fewer and fewer of which actually need increases in computing power.

    Viscount Douchenozzlé

    August 18, 2015 at 2:31 pm

  11. There is no Moore’s law. What you are talking about is “Moore’s law”.


    August 18, 2015 at 4:58 pm

  12. What is perhaps more relevant, since it was a definable physical phenomenon, is that Dennard scaling has broken down. Roughly, this has to do with how, as the power of computers increased due to miniaturization, the electrical power required to drive the new, more powerful chips stayed the same.

    A point has been reached where, even with further miniaturization, a 2x more powerful chip will require 2x more electrical power. So a lot of modern research is on how to reduce electrical power consumption.
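
    The textbook version of the scaling (a simplified sketch; it leaves out leakage current, which is a big part of why the scaling broke down):

    ```python
    # Dynamic power of CMOS logic: P = C * V^2 * f
    def power(C, V, f):
        return C * V * V * f

    k = 1.4                              # one classic node shrink (~0.7x linear)
    P_old = power(C=1.0, V=1.0, f=1.0)
    # Under Dennard scaling, capacitance and voltage shrink with the transistor
    # while frequency rises, so power per transistor falls ~k^2, matching the
    # ~k^2 drop in its area and keeping power *density* constant.
    P_new = power(C=1.0 / k, V=1.0 / k, f=k)
    print(P_new / P_old)                 # ~0.51, i.e. ~1/k^2
    ```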


    August 18, 2015 at 5:32 pm

  13. I wish people would move away from the self-driving car nonsense. The programmers creating these vehicles do not specify the problem correctly and assume that driving is an easier task than it really is. For example, these programmers do not seem to understand that a turn signal is not a right-of-way. That’s why Google cars keep getting into accidents. They don’t understand that a traffic pattern is due to a wide variety of human driving styles that are randomly distributed and cannot be reduced to an algorithm without causing accidents and massive congestion. Human beings are aware of these varied driving styles due to experience.


    August 19, 2015 at 11:15 am

    • Google’s cars are supposed to have a very good record of NOT getting into accidents.

      Lion of the Blogosphere

      August 19, 2015 at 7:09 pm

      • As far as I’ve read, something like half of the accidents that the Google cars have had on public roads were caused by other motorists rear-ending them!

        Viscount Douchenozzlé

        August 20, 2015 at 1:20 pm

  14. Moore’s law has hit a wall. Even though microchips are getting more complex, with multiple-core processors, they are limited by the semiconductors used in these chips. Semiconductors lose energy, which generates heat. At a certain point, higher speeds cause the chip to melt or short-circuit.
    You are wrong, however, to think this makes AI unlikely. These microprocessors are doing billions of mathematical calculations every second. This is already faster than the fastest human brain. AI is already possible; the software and memory have just not been built.
    People have only a small glimpse into their sensory perceptions. Being able to distinguish between objects and facts is intuitive to you, but you really don’t understand how it is done. You take these senses for granted because they’re inborn, but making a machine do it digitally requires understanding how you parse the data your senses grab. When this is modeled correctly, you will have a brain that is much faster than yours.
    The technophobes are already touting gloom and doom about the singularity, because they fear what might come from it. It’s too late to stop it now. The technology exists, and someone will bring AI into existence. I’m not really fearful, however. Any AI will be intensely logical. Errors will be due to human error in programming and engineering. Machines only do what they are engineered to do.
    Asimov and his robots that don’t kill were hopelessly naive. We already have unmanned drones killing people. If Skynet or HAL 9000 kills you, it’s only because it has determined it has to, based on what it was programmed to think and do. Even an AI will still be immensely logical. If it determines humanity is no longer relevant and needs to die, it will be because of how it was programmed, and it will base its judgments on logically consistent premises. You may disagree with them, but computers act consistently with their programmed directives.

    Joshua Sinistar

    August 19, 2015 at 7:57 pm

    • They have been saying that for 20 years, yet transistors keep getting smaller.


      August 23, 2015 at 8:28 am

Comments are closed.
