Moore’s Law thoughts
According to Wikipedia, Moore’s Law, which is an observation and not a natural law of physics, states that the number of transistors on a chip doubles roughly every 18 months. So when engineers from Intel, for example, say that Moore’s Law is still going strong, they mean that the number of transistors on Intel’s most powerful chips continues to increase at that rate.
Throughout the second half of the 20th century, the consumer experience tended to parallel the increasing number of transistors on a chip. That is, chips with twice as many transistors led to personal computers with twice as much apparent computing power (in other words, they ran software twice as fast) for the same price as, or even less than, computers 18 months older.
The year 2004 marks the start of the significant slowdown in personal computer performance. That’s the year when clock speeds hit a plateau at which they have been stuck ever since. Until 2004, smaller transistors allowed for faster clock speeds, which directly determine how many instructions a microprocessor can execute per second. But in 2004, clock speeds reached a ceiling: pushing them any higher caused the chips to overheat.
Modern computer chips have multiple cores. Two cores are now standard even in low-end computers, and more powerful personal computers have quad-core chips. However, only specialized software can use multiple cores at the same time. The vast majority of tasks you do on your computer [though commenters pointed out various examples of programs that can do parallel processing] use only one core at a time, and there isn’t any practical benefit to putting more than four cores in a personal computer. So it’s impossible to make PCs run current software any faster in the future by making 8-core or 16-core chips.
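The limit described above is usually summarized as Amdahl’s Law (a standard name I’m supplying here, not one from the text): if only a small fraction of a program’s work can run in parallel, extra cores barely help. A minimal sketch, where the 95%-serial workload is an illustrative assumption of my own:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Theoretical speedup (Amdahl's Law) for a program in which
    only `parallel_fraction` of the work can be spread over `cores`."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A typical desktop task that is 95% serial gains almost nothing
# from extra cores:
print(round(amdahl_speedup(0.05, 4), 2))   # 4 cores  -> 1.04
print(round(amdahl_speedup(0.05, 16), 2))  # 16 cores -> 1.05
```

Going from 4 cores to 16 moves such a task from a 4% speedup to a 5% speedup, which is why more cores don’t make everyday software feel faster.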
It also seems to me that the rate of price decreases has slowed significantly in the recent past. No longer is the newest chip the same price as, or even cheaper than, an 18-month-old chip with half as many transistors.
The end result is that between 2004 and 2012, the yearly increase in the apparent computing power of personal computers slowed down a lot. It’s my observation that around 2012, the rate slowed even further. It seems to me that performance has been increasing at less than 10% per year since 2012, whereas during the 1980s and 1990s we got used to something like a 60% performance increase per year. (When you compound 60% per year, you get more than a 10,000-fold performance increase over a 20-year period, which is what happened between 1980 and 2000.)
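The compounding arithmetic above is easy to check; the rates themselves are my estimates, not measured figures:

```python
def compound(rate_per_year, years):
    """Total performance multiple after compounding a yearly growth rate."""
    return (1 + rate_per_year) ** years

# ~60%/yr, as in 1980-2000: more than a 10,000x increase over 20 years.
print(round(compound(0.60, 20)))  # -> 12089

# 10%/yr, the post-2012 pace, projected over the same 20 years:
print(round(compound(0.10, 20), 1))  # -> 6.7
```

The contrast is the whole point: the same two decades deliver a 12,000x improvement at the old pace but less than 7x at the new one.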
That explains why I’m still using a six-year-old computer. It also explains why computer sales are in the doldrums: there’s no longer a huge benefit to upgrading.
* * *
It’s my advice that if you have a computer with an i3 or better chip, you will get a much bigger apparent performance boost from upgrading your HDD to an SSD than from buying a new computer that still has an HDD.
* * *
The apparent performance of iPads increased ten times between 2012 and 2016 based on my tests, which works out to a 78% increase per year. However, that’s going to come to an end, because iPad performance will always trail that of a desktop computer (given the iPad’s power and size constraints), and the current iPad Pro 9.7 is already only 29% slower than a similarly priced desktop computer.
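For the curious, the 78% figure comes from annualizing a total multiple over a number of years; the 10x-over-4-years input is my own measurement from above:

```python
def annualized_rate(total_multiple, years):
    """Convert a total performance multiple over `years` into the
    equivalent compound yearly growth rate."""
    return total_multiple ** (1.0 / years) - 1.0

# 10x improvement between 2012 and 2016 (4 years):
print(round(annualized_rate(10, 4) * 100))  # -> 78 (percent per year)
```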
Apple has made a lot of money because consumers felt the need to upgrade their iPhones every two years, but I predict that’s coming to an end, because their iOS devices have caught up to desktop computers and will only become about 10% more powerful per year going forward. (Maybe Apple’s secret weapon is that you can’t replace the lithium-ion batteries, which have a limited lifetime before they no longer hold a charge.)