Lion of the Blogosphere

Moore’s Law thoughts

According to Wikipedia, Moore’s Law, which is an observation and not a natural law of physics, states that the number of transistors on a chip doubles roughly every two years (the popular version of the law says every 18 months). So when engineers from Intel, for example, say that Moore’s Law is still going strong, they mean that the number of transistors on Intel’s most powerful chips continues to increase at that rate.

Throughout the second half of the 20th century, the consumer experience tended to parallel the increasing number of transistors on a chip. That is, chips with twice as many transistors led to personal computers with twice as much apparent computing power (in other words, they ran software twice as fast) for the same price as, or even less than, computers 18 months older.

The year 2004 is a key year in the significant slowing of personal computer performance. That’s the year when clock speeds reached a plateau at which they have been stuck ever since. Up until 2004, smaller transistors allowed for faster clock speeds, which correlate directly with how many instructions the microprocessor can execute per second. But in 2004, clock speeds hit a ceiling beyond which the chips simply overheated.
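
A rough way to see what the clock plateau cost: per-core performance is roughly the clock rate times the instructions completed per cycle (IPC), so once the clock froze, per-core gains had to come from IPC alone. A minimal sketch; the IPC numbers below are made up purely for illustration:

    # Rough per-core throughput: instructions per second ~= clock rate * instructions per cycle (IPC).
    # The IPC figures below are invented for illustration; real values vary by chip and workload.
    def instructions_per_second(clock_hz, ipc):
        return clock_hz * ipc

    per_core_2004 = instructions_per_second(3.2e9, 1.0)   # hypothetical Pentium 4-era IPC
    per_core_now = instructions_per_second(3.5e9, 2.5)    # hypothetical modern IPC
    print(round(per_core_now / per_core_2004, 1), "x per-core gain without raising the clock")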

Modern computer chips have multiple cores. Two cores are now standard even in low-end computers, and more powerful personal computers have quad-core chips. However, only specialized software can run on multiple cores at the same time. The vast majority [commenters pointed out various examples of programs that can parallel process stuff] of tasks you do on your computer only use one core at a time, and there isn’t any practical benefit to putting more than four cores on a personal computer. So it’s impossible to make PCs run current software any faster in the future by making 8-core or 16-core chips.
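
The intuition behind the diminishing returns is Amdahl’s law: if only part of a task can be spread across cores, extra cores quickly stop helping. A minimal sketch, assuming (purely for illustration) that half of a typical desktop task can run in parallel:

    # Amdahl's law: speedup on n cores when only a fraction p of the work runs in parallel.
    def amdahl_speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    # Assume, for illustration only, a task where half the work parallelizes (p = 0.5).
    for cores in (1, 2, 4, 8, 16):
        print(cores, "cores:", round(amdahl_speedup(0.5, cores), 2), "x")
    # 1 core: 1.0x, 2: 1.33x, 4: 1.6x, 8: 1.78x, 16: 1.88x
    # Quadrupling the core count from 4 to 16 buys less than 20% extra speed for such a task.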

It also seems to me that the rate of price decreases has significantly slowed in the recent past. No longer is the newest chip the same price as, or even less expensive than, an 18-month-old chip with half as many transistors.

The end result is that between 2004 and 2012, the yearly increase in apparent computing power of personal computers slowed down a lot. It’s my observation that around 2012, the rate slowed down even more. It seems to me that it has been increasing at less than 10% per year since 2012, whereas during the 1980s and 1990s we got used to something like a 60% performance increase per year. (When you compound 60% per year, you get more than a 10,000-fold performance increase over a 20-year period, which is what happened between 1980 and 2000.)
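
The compounding is easy to check; a quick sketch using the growth figures quoted above:

    # Compounding the growth rates quoted above over a 20-year span.
    years = 20
    print(f"60% per year for {years} years: {1.60 ** years:,.0f}x")   # ~12,089x -- the ">10,000x" of 1980-2000
    print(f"10% per year for {years} years: {1.10 ** years:.1f}x")    # ~6.7x at the post-2012 pace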

That explains why I’m still using a six-year-old computer. It also explains why computer sales are in the doldrums: there’s no longer a huge benefit to upgrading.

* * *

It’s my advice that if you have a computer with an i3 or higher chip, you will get a much bigger apparent performance boost from upgrading your HDD to an SSD than from buying a new computer that still has an HDD.

* * *

The apparent performance of iPads increased ten times between 2012 and 2016 based on my tests, which works out to about a 78% increase per year. However, that’s going to come to an end, because iPad performance is always going to be slower than a desktop computer (given that the iPad has power and size constraints), and the current iPad Pro 9.7 is already only 29% slower than a similarly priced desktop computer.
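
The 78% figure is just the annualized rate implied by a tenfold improvement over four years; a one-line check:

    # Annual growth rate implied by a 10x improvement over 4 years (2012-2016).
    total_factor, years = 10, 4
    print(f"{total_factor ** (1 / years) - 1:.0%} per year")   # ~78%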

Apple has made a lot of money because consumers felt the need to upgrade their iPhones every two years, but I predict that’s coming to an end because their iOS devices have caught up to desktop computers and will only become 10% more powerful per year going forward. (Maybe Apple’s secret weapon is that you can’t replace the lithium-ion batteries, which have a limited lifetime before they no longer hold a charge.)

Written by Lion of the Blogosphere

March 2, 2017 at 1:59 pm

Posted in Technology

77 Responses

  1. L1 cache size matters a ton and has continued to increase thanks to Moore’s law. And the other thing that’s advanced a lot is the SIMD features on the cores. Stuff like high quality video playback is way better now because CPUs are much better at matrix operations, not just more cores or whatever.

    bobbybobbob

    March 2, 2017 at 2:08 pm

    • All true, but I think the current result of these improvements is a 10% increase per year in apparent performance rather than the crazily fast increases we saw in the 20th century.

      Lion of the Blogosphere

      March 2, 2017 at 2:13 pm

      • You’re right that 3-4 GHz was the theoretical limit given heat and electricity constraints. That’s why, after the Prescott chip disaster and being overtaken by AMD’s chip designs of the time, Intel stopped focusing on clock speed (GHz) and started focusing on pipeline/core architecture. They actually went back to 2 GHz chips for a while, which substantially outperformed the 3 GHz models.

        The clock-speed model is a bad way to measure processor performance. It’s true that car travel is limited by the physics of being on the ground, but that doesn’t mean a car’s performance, once it has a good engine, can’t still be improved with aerodynamic design and so on. Same thing with CPUs. Yes, the theoretical limit might be 500 mph given wind resistance, tire design, etc., but that doesn’t mean getting to 500 mph can’t be improved. And, similar to a car, people notice acceleration more than absolute velocity.

        Likewise, the CPU might take 10 hours to encode a video at 3 GHz, but there are many improvements we can make to cut that processing time without increasing the GHz.

        The real problem with noticing speed differences is that computers just “work” now. It’s the same reason it doesn’t matter whether you play a game at 60 frames per second or 120: if the human eye (or any of our five senses) can’t detect it, it’s irrelevant. Same thing with high-resolution displays. Beyond a certain DPI (dots per inch), it’s irrelevant because we can’t notice the improvement.

        It was relevant in the 90s because we hadn’t maxed out our sensory capabilities. So once again it is humans that are the true limitation. This is why the robots will get rid of us all…

        Paul Ryan's Sickly Old Lapdog

        March 2, 2017 at 4:27 pm

      • People definitely notice a difference between 60 FPS and 120 FPS if given the equipment. VR headsets recommend 90 FPS minimum to avoid nausea. Similarly, we’ve gone from 1080p being ‘High Definition’ to hardware manufacturers rolling out 4K displays. Again, people notice (though we can agree these are subject to diminishing returns).
        Whether the more expensive hardware needed for higher framerates/resolutions is worth the additional cost is the limiting factor.

        Panther of the Blogocube

        March 3, 2017 at 4:21 am

      • It’s physically impossible for you to notice, because it’s beyond the naked eye’s ability to process. Why do you think movies are 30 fps and have stayed that way for decades?

        Paul Ryan's Sickly Old Lapdog

        March 3, 2017 at 2:28 pm

      • Didn’t they change the method for measuring TV resolution so that “4K” is actually only 2K using the old method?

        CamelCaseRob

        March 3, 2017 at 3:25 pm

      • There’s a large body of evidence from psychologists/neuroscientists and the military showing empirically that humans can perceive and comprehend images flashed for less than 1/200th of a second. Human attention being what it is, the effect probably gets even stronger in high-stress situations.

        Panther of the Blogocube

        March 4, 2017 at 12:08 am

      • Do you have any research links? This is honestly the first time I’ve heard this. Very interesting from a gaming (computer type not PUA) perspective.

        Paul Ryan's Sickly Old Lap Dog

        March 4, 2017 at 12:55 am

  2. The vast majority of tasks you do on your computer only use one core at a time, and there isn’t any practical benefit to putting more than four cores on a personal computer.

    An unappreciated benefit of multicore processors is that multitasking of processes is a lot less sluggish than it used to be. It’s not something that people notice as much as one process becoming faster, but it has an effect on the overall “snappiness” of your experience.

    IHTG

    March 2, 2017 at 2:10 pm

    • He’s also completely wrong to say that the vast majority of tasks don’t use multiple cores. First of all, there is the OS itself, which is basically what you’re describing. And then modern games, internet browsers, photo editing software, etc. are all written to take advantage of multiple cores.

      All you have to do is open a program that analyzes CPU usage by process and by processor and you can see that Lion is wrong. I have a 4 core system and the OS and the various programs I run do an excellent job keeping the load balanced between all four cores.

      Although I do agree that CPU upgrades aren’t really necessary for everyday computing. I have a pretty old computer and the only problems I run into are related to the 4GB memory limit and not the CPU (although 2 cores wouldn’t be enough).

      Magnavox

      March 2, 2017 at 2:51 pm

      • Javascript is single threaded. Intel and modern ARM do crazy tricks like OOO to get around stalled clock speed.

        JW Bell

        March 2, 2017 at 9:29 pm

      • Please elaborate on these tricks. What is OOO?

        Paul Ryan's Sickly Old Lap Dog

        March 4, 2017 at 12:55 am

  3. I’ve had several computer traumas, which I won’t describe here (too long & agonizing).

    Upshot of computer trauma #1: I took to hand writing drafts. That’s how I got into fountain pens – and my source was Lion!

    Upshot of computer trauma #2: I am going to buy a manual typewriter at some point. I am not joking. I love the digitization of knowledge and think that it’s been a boon to mankind, but nothing beats the physical act of hand writing, and creating work on a typewriter.

    gothamette

    March 2, 2017 at 2:21 pm

    • Eh… I don’t recommend manual typewriters.

      But glad you like the fountain pens! Too bad I don’t have any need to take handwritten notes.

      Lion of the Blogosphere

      March 2, 2017 at 2:26 pm

      • I write shopping lists, etc. I fill out forms with them. I love the feel of them. I feel as if I am drawing. Oddly enough the best ones I own are the cheapest: two Pilot Preppies, fine & medium, costing $4 apiece. I bought converters but I mostly reuse the old cartridges, filling them with a reusable syringe. It cost me $10 altogether for the supplies. With 2 bottles of ink, one a gift, I am in business for a year.

        gothamette

        March 2, 2017 at 4:40 pm

      • Don’t be surprised if typewriters make a comeback as a new SWPL toy, like phonographs and turntables!

        JS

        March 3, 2017 at 9:05 pm

    • I actually typed my bar exam on a manual in 1983. But I recommend getting a Selectric II or III. The tactile feeling is exquisite, far more satisfying than any manual.

      Explainer 21

      March 2, 2017 at 3:26 pm

      • In the 1980s they had lightweight portable electric typewriters (but they still required electricity, unlike laptop computers).

        Lion of the Blogosphere

        March 2, 2017 at 4:33 pm

      • There’s quite an enthusiastic market/audience for old Olympias, Smith Coronas, Hermes, Olivettis.

        gothamette

        March 2, 2017 at 4:38 pm

      • Nostalgia thing?

        Lion of the Blogosphere

        March 2, 2017 at 4:39 pm

      • Typewriters have been popular among hipsters and enthusiasts recently. I don’t think hipsters actually use them much though. I think they just keep them around for decoration.

        Tom

        March 2, 2017 at 5:14 pm

      • “Nostalgia”

        Partly. I guess the manual thing would be nostalgia, but the fact is – some of them are really well made! I’m talking about an item made in the 1960s that still functions perfectly. How cool is that? And you don’t need electricity – which means if the juice fails (which it has), you can still type.

        The practical side is that when you’ve written something you’ve written it. It can’t get corrupted. That’s happened to me. I back up obsessively, but it’s a horrible feeling when you try to open an important document and it’s corrupted.

        About the revising, in my little n of 1, computers make it easier to correct, but they also make it easy to fuss & fiddle. (In my handwriting, it’s easy to revise. I just draw a line through words I want to delete.)

        Some people think the tactile experience of these vintage typewriters is superior to electrics. I’d have to compare. I don’t know.

        gothamette

        March 2, 2017 at 7:03 pm

      • Use OneNote https://www.onenote.com/

        It’s free, and it stores your information in the cloud, as you type, so you can’t lose anything.

        Lion of the Blogosphere

        March 2, 2017 at 8:29 pm

      • OK! Thank you.

        gothamette

        March 2, 2017 at 9:37 pm

    • The problem with old typewriters is that revising and editing are very tedious.

      grey enlightenment

      March 2, 2017 at 4:20 pm

      • Just for drafts that can’t get eaten up by hard drive failure, electrical failure, OS failure. I’m a fast typist so typing up a draft even with mistakes would be easy.

        gothamette

        March 2, 2017 at 4:37 pm

      • Just get a second hard drive (or thumb drive) to back up your files and/or a printer to make hard copies.

        destructure

        March 2, 2017 at 7:00 pm

    • “Upshot of computer trauma #1: I took to hand writing drafts. That’s how I got into fountain pens – and my source was Lion!”

      Always handwrite drafts. Even if your first pass is clunky and disorganized, you’ll get much more done in one sitting. Then you can type it in the computer, set it aside for awhile, and edit later to your satisfaction. It’s a much more efficient and enjoyable way to write; I’ve done it that way since college. Otherwise, you’ll get stuck for two hours endlessly polishing your first paragraph.

      hard9bf

      March 3, 2017 at 5:48 pm

  4. 1. how long does even the 10% performance per year increase hold?

    2. I’m not much of a gamer, but for games wouldn’t even 10% be a big deal?

    3. If performance is increasing at such a low rate, does that mean that new generations of gaming consoles will be released every 20 years instead of every 7 years?

    Otis the Sweaty

    March 2, 2017 at 2:25 pm

    • 3. Yes, I think that for all computer hardware we will see the slowing of release dates.

      Lion of the Blogosphere

      March 2, 2017 at 4:00 pm

      • The problem is not hardware; it’s that we’ve maxed out our sensory limitations. Basically, video games are approaching real life, or as close as they will get to it.

        The next major advance in computing that our senses will actually notice will be AI. Every cognizant being has the ability to discern smart from stupid.

        Paul Ryan's Sickly Old Lapdog

        March 2, 2017 at 4:30 pm

      • We haven’t maxed out our senses at all. We just can’t find a way to use CGI to credibly duplicate real life, at least not on a large scale within a reasonable budget. If the gaming companies could make a game look like Hardcore Harry, surely they would.

        But even on the best gaming PCs, the graphics during actual gameplay are not as good as a Pixar movie. Or, in many cases, as good as the cutscenes that are included in the games. There’s probably a software limitation there (putting in the time/money to include as much detail in an explorable game world as exists in the fixed limitations of a movie), but I suspect there’s also a hardware limitation.

        But Lion’s point is well illustrated by looking at the PlayStation upgrades.

        If you compare graphics between the PS3 and PS4 (2006 – 2013), the improvement is barely visible, the main difference being loading times. The upgrade from PS2 to PS3 (2000 – 2006), by contrast, was huge, and the upgrade from PS1 to PS2 (1994 – 2000) was bigger still.

        In 12 years, 3D video game graphics went from weird, blocky, semi-abstract shapes to a decent but rough approximation of a CGI film. In the 11 years since, they’ve slightly improved that rough representation.

        My question is: seeing that computing power has reached a plateau, why is Lion still an optimist on AI? I don’t see it happening. We’ll hit an automation plateau there as well, compounded by dysgenic breeding that means we’ll no longer have enough engineers smart enough to figure out the advances of the past.

        Wency

        March 2, 2017 at 10:39 pm

      • “My question is: seeing that computing power has reached a plateau, why is Lion still an optimist on AI? I don’t see it happening. We’ll hit an automation plateau there as well, compounded by dysgenic breeding that means we’ll no longer have enough engineers smart enough to figure out the advances of the past.”

        That’s interesting.

      • Games (or, in the sense we’re talking about here, simulators) generally have to run in real time. This significantly reduces what can be done, because the computer is doing it on the fly. There are also tricks to precalculate a lot of things in games; this is done as much as possible and is why things like lighting and animation are mostly static or only happen in predetermined ways.

        AI is a different kind of problem. It doesn’t matter nearly as much because intelligence that is slow can still be intelligence. If we could create an AI with the problem solving ability of an average person, but hundreds of times slower, it would still be a tremendous breakthrough.
        I’m being optimistic but something like Watson is already proving that our current technology is, in some ways, sufficiently advanced to be better than human intelligence. I sincerely doubt the dysgenic cultural issues will hinder it, barring some kind of extreme die-off.

        Panther of the Blogocube

        March 3, 2017 at 5:11 am

  5. I agree about the SSD, and I added one to a nearly 10-year-old PC to run the operating system. With that single upgrade, most users couldn’t distinguish its performance from a brand-new machine, at least for the things most home users would use it for, such as internet, word processing, spreadsheets, ebook management, video streaming and sound. Of course, if you want top-of-the-line audio you’d need to buy a separate DAC even for a new PC. But it still wouldn’t sound any better with a newer PC.

    destructure

    March 2, 2017 at 3:13 pm

  6. It’s true that having more cores doesn’t lead to higher performance for most consumer and office uses. Word processing is limited by how fast your fingers can move, not by how powerful your computer is. And what would PowerPoint do with more cores?

    For Scientific and Research purposes, however, more cores can lead to a very real increase in performance. Programs like R and SAS can, with the proper techniques, take full advantage of parallel processing which can greatly improve speed. This is especially noticeable when working with very large data sets.
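
    For what it’s worth, here is a minimal Python sketch of the same idea, splitting a CPU-bound job across worker processes; the workload below is made up purely for illustration:

        # Split a CPU-bound job over a pool of worker processes, one chunk per core.
        from multiprocessing import Pool
        import math

        def crunch(chunk):
            # Stand-in for real per-chunk work on a large data set.
            return sum(math.sqrt(x) for x in chunk)

        if __name__ == "__main__":
            chunks = [range(i, i + 2_500_000) for i in range(0, 10_000_000, 2_500_000)]
            with Pool(processes=4) as pool:        # up to four cores working in parallel
                total = sum(pool.map(crunch, chunks))
            print(total)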

    Gerald

    March 2, 2017 at 4:23 pm

  7. What about the new AMD chips?

    Tom

    March 2, 2017 at 5:16 pm

  8. Lion, what do you say about that “Girls” episode (6×03) that all the SJWs are talking about now?

    eyaldavid

    March 2, 2017 at 5:48 pm

  9. Someone in Southeast Asia will make replacement batteries. There are plenty of modding and repair videos on YouTube. Though, Apple is notorious for planned obsolescence. They’ll try hard to force people to buy.

    Squish

    March 2, 2017 at 6:10 pm

  10. If Apple sticks to non-removable batteries as the tech approaches maturity and the incentives to upgrade frequently diminish, they will lose the market to their many competitors that do have removable batteries.
    Same goes for any other planned-obsolescence games. People will have less patience with them.

    Giovanni Dannato

    March 2, 2017 at 6:12 pm

  11. I’ve been feeling this way for quite some time as well. In 2003 I bought a computer with a 3.2 GHz clock (Pentium 4). In 2008 I bought a Core 2 Duo with a 3.33 GHz processor. In 2014 I bought a 4.0 GHz quad core. So in more than 11 years the clock only increased by 0.8 GHz. Not really that much of a difference. Obviously the cores do some good, but many programs are not optimized to use multiple cores. I sometimes think the only reason any application needs more power these days is lazy programming/optimization. Since people have so much RAM/processing power, programmers don’t bother as much with optimization, and I’ve noticed with many computer games I play, each patch runs worse and worse without any improvement in the graphics. They sometimes release an optimization update for these programs, and it makes a big difference. But optimization usually takes a back seat to making other small tweaks to the games.

    Also, it’s worth noting regarding the overheating problem that when I bought this computer (quad-core 4.0 GHz processor), with the stock cooler it came with it was overheating when I played certain processor-intensive games. With a liquid cooler, however, it runs absolutely fine. It wasn’t just me either; many who had the processor reported it was getting so hot it would throttle down. Maybe the solution for getting faster PCs on the market is coming up with reliable liquid coolers?

    Yup

    March 2, 2017 at 7:03 pm

    • “I sometimes think the only reason that any application needs more power these days is because of lazy programming/optimization”

      Indeed, unless you are solving super-huge math/data problems, or doing super high res graphics, if your computer seems to do everything slowly it’s because of lazy programming.

      Lion of the Blogosphere

      March 2, 2017 at 8:31 pm

      • Yes, I think programmers got lazier around 2009. It felt like every computer was coming with 8 GB of RAM, even though the processor was barely an upgrade from 2004.

        Yup

        March 2, 2017 at 9:24 pm

  12. What I really would like to know is what the buzz is on Neogaf about Moore’s law.

    sabril

    March 2, 2017 at 8:03 pm

    • LOL. I personally don’t mind the NeoGaf updates, but I’m not sure why Otis thinks it’s Ground Zero of liberalism. I just went over there to check it out. Out of 40 threads on the first page of the Off-Topic forum, only 8, based on their titles, were about political topics. I checked a few of them out, and there’s virtually no substantive discussion. The signal-to-noise ratio is very low. Lots of one-line, incomplete sentence, or even 2-3 word posts. The average LotB comment looks like Tolstoy by comparison. And as for the other 32 threads on the first page, here’s a representative sample:

      Who was your fictional crush when you were younger?
      Toby Kebbell claims he is “done” as Doctor Doom unless Marvel Studios gets him back
      Disney XD Renews ‘DuckTales’ Ahead Of Launch, Unveils First-Look Video
      A locksmith saved my job today
      should I get a haircut? (help self confidence is low)

      I don’t see what the big deal is.

      Hermes

      March 2, 2017 at 9:05 pm

      • It’s self-bias. Much like what people are saying on LotB blog is only relevant if you’re already a reader.
        To be honest I often think Otis is just a false flag poster, trying to egg people on to weird extremes by his own bizarre behavior.

        Panther of the Blogocube

        March 3, 2017 at 4:00 am

      • I have to admit, I take guilty pleasure in his phrase “destroy the immigrant community.” I fantasize about being around the Thanksgiving dinner table with my liberal relatives as they rail against Trump, blurting out that I like him because he’s “destroying the immigrant community,” and seeing the looks on their faces.

        Hermes

        March 3, 2017 at 9:28 am

  13. My 2-year-old laptop is already falling apart from so much daily usage.

    I might try a Chromebook for $150. Do you think it will be fast enough to surf the web and watch YouTube, etc.? That’s all I really use it for. Or should I get a $300 laptop? I don’t play video games.

    ttgy

    March 2, 2017 at 8:19 pm

    • Chromebook is absolutely fast enough for web and video. Especially the newer ones. Make sure you research the benchmarks before you buy.

      Lion of the Blogosphere

      March 2, 2017 at 8:34 pm

      • I bought Chromebooks for my wife and me after Lion blogged about getting one for his father. I love mine and it is just as fast as my desktop for web usage. The only reasons not to have one would be if you have spotty internet access, play games, or want to store lots of documents locally.

        CamelCaseRob

        March 3, 2017 at 3:46 pm

      • This web page has the Octane scores of Chromebooks, so you can see how they compare. Some are as fast as a mid-level Windows desktop computer, I think even the slowest are still fast enough. Look for 4GB of RAM if you want to have multiple high-bandwidth pages open at the same time.

        https://zipso.net/chromebook-specs-comparison-table/

        Lion of the Blogosphere

        March 3, 2017 at 4:10 pm

    • About three years ago my ’08 Gateway MT6840 was sputtering (1GB RAM, 160GB hard drive, 2GHz Intel Core 2 Duo processor), so I bought a $170 Acer C720 Chromebook at Best Buy. The Chromebook has been great for web and email. I do all my computing & printing at the office. I keep and play about 300 songs on the Gateway.

      E. Rekshun

      March 3, 2017 at 5:26 pm

  14. Moore’s Law has hit a wall. It’s called semiconductors. Semiconductors aren’t superconductors; they bleed heat and power when you use them. That’s why heat sinks have not been optional on any chip past the first generation of 8088s. Semiconductors can only go so fast before they melt down. 2 to 3 GHz seems to be the maximum they can get from our X-ray-laser-etched silicon chips. Multi-core chips are just trying to get around this by putting more than one circuit on the microchip, but they hit a wall here too. All programs work on a circuit because that’s how electricity works. Multi-core chips have to have operating systems that split the data, and that means glitches.
    Most of you have slow computers because your system picks up spyware and adware as you surf. You can zoom back up to speed by copying out your data and wiping and reinstalling your system. That’ll zap all the spyware and adware that’s gumming up your computer. Your spyware and internet security programs cannot get it all out. You have to redo it regularly to zap it all.
    You can also speed up your computer by adding memory. Windows 7 and later can use up to 4 GB. If you up your memory to the maximum of 4 GB, your computer will run faster. SSDs do not have spinning platters like other hard drives; they use flash memory like a flash card or USB stick. These SSDs are noticeably faster when they run programs. You can also add memory to your system by plugging in a USB drive and using a feature like ReadyBoost, which lets your computer use some of the USB drive’s storage as extra cache and makes it faster.
    Clean it up with a fresh install (save your data on external backups), up your system memory to 4 GB, add USB drives with ReadyBoost, and see how fast you can go.

    Joshua Sinistar

    March 2, 2017 at 8:19 pm

    • Moore’s law isn’t about clock speed; it’s about transistor density.

      Magnavox

      March 2, 2017 at 9:10 pm

    • You can definitely get more out of a chip than 3 GHz; I think the problem, like you said, is the heat. My 4.0 GHz (not overclocked) i7 was going up to 85°C, and with a liquid cooler it typically stays at about 50°C.

      Also, are most people still running 32-bit OSes? I’ve been running 64-bit for a long time, so it felt to me like they were ancient history, but maybe since I’m a computer nerd I’m a bit disconnected from the public. I mention this because you are saying 4GB is the maximum RAM amount (on most 64-bit OSes it’s 192GB).

      Yup

      March 2, 2017 at 9:36 pm

      • My 6-year-old PC has 8GB of RAM and is running 64-bit Windows 10.

      • Windows 7 comes in both 32-bit and 64-bit versions. I run 32-bit myself, because it’s more stable on my PC than the 64-bit that came with it. The 64-bit OS was crashing, but the 32-bit Windows 7 is far more stable. Windows 7 can only use 4 GB. I’m not sure what the max for Windows 10 is, though. Linux systems can use 8 GB, but I hate the setup.

        Joshua Sinistar

        March 3, 2017 at 8:58 pm

      • Chips cooled by liquid nitrogen have hit stable speeds of close to 8 GHz, so it is a pure physics/heat issue…

        Paul Ryan's Sickly Old Lap Dog

        March 4, 2017 at 12:59 am

  15. GPU performance has continued to see very strong gains, unlike the CPU, because graphics work is inherently easy to parallelize. This translates to tangible performance gains in gaming and graphics-heavy apps.

    Gs

    March 3, 2017 at 12:51 am

  16. A big part of the problem is that the hardware engineers who design chips are extremely intelligent and/or autistic, and so they have a hard time understanding how to optimize their designs for the real world.

    They optimize processors for maximum possible throughput; how fast they can emit 1’s and 0’s. Double the number of cores (to be more accurate, the number of simultaneous threads) and you double the rate at which it can process data — in theory. Along with increasing the simultaneous thread count, the biggest efforts recently have been focused on SIMD vector instructions that perform operations like multiplication or division on several variables simultaneously.

    The problem is that it’s extremely hard for programmers to write multithreaded code. It’s so much more work and causes so many bugs that most don’t even bother, and many popular languages don’t support it well. Plus, many, arguably most tasks do not parallelize well and so do not benefit from simultaneous multithreading. As for SIMD instructions, compilers, even Intel’s own compiler, simply aren’t smart enough to know how to use them. So to use them you have to either handwrite assembly or use the intrinsics in C.
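
    For a rough feel of the operate-on-many-values-at-once idea without dropping into assembly, here is a Python/NumPy sketch; NumPy’s compiled inner loops are only an analogy for hardware SIMD, not actual intrinsics:

        # One multiply-add expressed over a whole array at once, versus an element-at-a-time loop.
        # (Analogy only; whether NumPy's compiled loops use SIMD underneath depends on the build and CPU.)
        import numpy as np

        a = np.arange(1_000_000, dtype=np.float64)
        vectorized = a * 2.0 + 1.0                            # one call over a million elements
        looped = [x * 2.0 + 1.0 for x in range(1_000_000)]    # scalar work, one element at a time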

    Apple has benefited greatly because they optimize their mobile processors for practical workloads rather than autistic theoretical parallel vector-processing ones. Their processors, like the A10 are 1) optimized for single-threaded performance (since 99.9% of all code is single-threaded, especially on mobile), 2) have only 2 cores and 3) have huge caches (a theoretically worse but practically far better use of silicon real estate than adding more cores or SIMD). Apple’s processors accordingly obliterate the quad-core, higher-gigahertz Android competition in all even remotely real-world performance metrics. It’s not even close.

    The DEC Alpha, certainly the most ahead-of-its-time (oh, what could have been) and probably still the conceptually best-ever CPU design — it famously could run (Intel) x86 programs in emulation faster than Intel chips could run them natively — followed similar principles. The Alphas had enormous caches for the 90’s, not far off from today’s processors. It’s not a coincidence that Apple’s top CPU engineers are largely Alpha veterans.

    Peak autism, on the other hand, was achieved with Itanium, a processor so optimized for parallel workloads that nobody including the manufacturer Intel was able to write an efficient or even sane compiler for it. It was theoretically capable of breathtaking speed but practically slow as a dog, and it darn near killed Intel.

    One of the genius business decisions we can thank Carly Fiorina for (HP having bought Compaq, which bought Digital) is killing Alpha in favor of… Itanium. Although it lives yet again; the Chinese supercomputers that recently took the #1 and 2 spots are both using unauthorized, improved copies of Alpha chips.

    snorlaxwp

    March 3, 2017 at 1:22 am

    • I obviously screwed up the link there; should just be on “Itanium” if you can fix.

      snorlaxwp

      March 3, 2017 at 1:24 am

    • This is a really interesting and insightful post. Get a blog dude!

      Paul Ryan's Sickly Old Lapdog

      March 3, 2017 at 2:57 pm

      • Thanks!

        I’ve considered it often but I worry I wouldn’t have many readers. Or if I did I would lose them by not producing output at a steady enough rate. Or, worse, I would become popular, and I’d get doxxed and be rendered unemployable for life.

        snorlaxwp

        March 3, 2017 at 6:00 pm

      • Ok, but please submit a guest post to Lion, then, elaborating more on your thoughts of today’s microprocessor design. The public demands more!

        Paul Ryan's Sickly Old Lap Dog

        March 4, 2017 at 1:01 am

      • I’d do one if asked — does Lion do guest posts? Does he find my computer-related postings interesting?

        snorlaxwp

        March 5, 2017 at 1:16 am

  17. “However, only specialized software can run on multiple cores at the same time.”

    We’ve had threaded software since the late 90s, before multi-core CPUs were available to consumers. These days virtually all software can use multiple cores and threads. Windows XP already supported using more than 1 core per process. In Windows 7 and onwards the task manager shows how many threads each process (program) has spawned. Task Manager > Details pane > Right click on the columns > Add Columns > Scroll and tick “Threads”.
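
    The same check can be scripted; a small sketch that assumes the third-party psutil package is installed:

        # Print each running process and how many threads it has spawned.
        # Requires the third-party psutil package (pip install psutil).
        import psutil

        for proc in psutil.process_iter(["name", "num_threads"]):
            print(proc.info["name"], proc.info["num_threads"])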

    Rn 48

    March 3, 2017 at 6:59 am

  18. One more reason why “a career in computer programming sucks”! First Indians, now Cubans…

    Miami Herald, 02/27/17 – Cuba has ‘largest pool of untapped IT talent in the Americas’

    http://www.miamiherald.com/news/nation-world/world/americas/cuba/article135249259.html#storylink=hpdigest

    In Havana’s iconic Bacardí building, teams of computer programmers are working for U.S. companies with the tacit permission of the Cuban government. Could the island become the next international hotspot for software development?

    …Most of the U.S. companies hiring computer engineers and programmers in Cuba put them to work programming or designing applications for cell phones and Internet sites, as well as more complex coding with open source software…Cuba has many highly educated programmers who are currently “underemployed.”

    …With salaries of approximately $5 per hour — a more “competitive” rate than at other programming centers in the region — and in the same time zone as the United States, contracting Cuban programmers “looks very promising…Cuba offers the lowest wages with the highest concentration of professional grade software developers anywhere in the Americas …Cuba’s large and multiskilled pool of software programmers and Internet researchers represents perhaps the largest pool of untapped IT talent in the Americas.”

    “…there is a lot of justifiable reasons for this excitement because of the quality of the Cuban workforce,” adding that many multinational companies would “very much like to tap into that workforce.”

    …The former Obama administration, as part of its campaign to ease sanctions on Cuba, allowed U.S. companies to hire Cuban programmers in late 2015…

    E. Rekshun

    March 3, 2017 at 1:31 pm

  19. update from Vox on the ongoing destruction of the immigrant community: http://www.vox.com/policy-and-politics/2017/3/3/14801474/ice-detain-deport-immigrant

    Otis the Sweaty

    March 3, 2017 at 3:27 pm

  20. Lion, a bit OT but possibly usable for another article—

    Libertarian black mayor of Jo-burg in South Africa opposes invasion by pseudo-immigrants, upholds free market, rights, transparency. Extreme left goes spastic: https://newpittsburghcourieronline.com/2017/03/03/the-mayor-of-johannesburg-a-trump-type-reality-tale/2/

    <>

    Also:

    <>

    Robert

    March 3, 2017 at 6:18 pm

  21. What are you trying to say, that there is no Moore’s Law?

    Dave

    March 3, 2017 at 6:54 pm

    • No, the number of transistors they can put on a single chip continues to increase, for now.

      Lion of the Blogosphere

      March 3, 2017 at 8:00 pm

      • I think they’ve about maxed out what they can etch with X-ray lasers on silicon. They can make the chips bigger, but they’ll heat up like crazy. Those water-cooling systems show how much heat is being generated now. Until superconductors start being used, heat is a huge problem.

        Joshua Sinistar

        March 3, 2017 at 9:02 pm

      • Either superconductors or quantum computers.

        Paul Ryan's Sickly Old Lap Dog

        March 4, 2017 at 1:02 am

  22. The only reason that I ever upgrade my computer to a new one is due to a motherboard failure. Sometimes capacitors will fry after five years or so.

    When that occurs, the CPU is likely not to be supported by newer motherboards. Hence, unless you want to buy an older-generation motherboard, of which there are likely to be few on the market, you would have to upgrade the CPU as well. At that point, you essentially have a new computer.

    All hard drives, including SSDs, will eventually fail. I don’t keep any critical information on my SSD because of the lesser ability to recover it should the drive fail. Only the OS goes on the SSD. Files that I cannot lose go onto server-grade HDDs, never onto consumer HDDs (too many failures over the years).

    It has slowly become necessary to upgrade the amount of RAM one has over the past decade, but as long as you buy sticks with standard voltages, they should remain compatible with most new motherboards.

    What’s left? A video card? If you play video games, that needs to be upgraded every five years or so unless you need to have the best at all times.

    So, essentially, once you have some decent RAM sticks and some better hard drives, your need for computer replacement basically boils down to how long you can keep your motherboard and OS drive (SSD or HDD) going. I can’t envision any computer advances on the horizon that will cause anyone to need to up their performance for what most people use computers for.

    That is, I suppose, until VR becomes more integrated into business and everyday communication.

    I’ve also done stupid shit like break a motherboard while switching them out (applying pressure with my fingers while concentrating on something else). If you build your own computers, you have to calculate in an occasional accident.

    The other issue is the impending OS choice coming in 2020 when Windows 7 support runs out. When that happens, computer users will have to switch to Windows 10, which will likely require continually updated hardware (by design), switch to Mac, or switch to Linux (very few will do this). Except for Linux, which is too user-unfriendly, these choices essentially remove one’s ability to keep outdated but adequate computer hardware, or to choose one’s hardware and continuously update it as needed.

    In other words, the days of the casual PC builder who can sit on a stable system indefinitely are likely over. The solution to lagging PC sales is to force upgrades through continuously evolving and mandatory OS hardware requirements.

    Dave

    March 5, 2017 at 12:05 pm

    • I installed Windows 10 on my 6-year-old i7 computer without any performance problems. At least not after I upgraded the main drive to an SSD (using the larger HDD for file storage).

      • For sure. What I was referring to is the following:

        an inter-company hardware/software development strategy which requires newer chips to run features of the OS (and possibly, eventually, to run the OS at all) and requires the newest OS (Windows 10 vs. 7) if one wants to reliably run the chip (a new chip being the reality if your motherboard fails).

        In the link below, you see that you are locked out of features without a newer chip and browser, the latter only being available in newer versions of Windows. I’d guess that the set of features you are locked out of with older chips will grow over time.

        In addition, this strategy is perfect for periodically mandating hardware updates to reinvigorate the declining PC hardware industry. It’s probably somewhat good for innovation and the health of the industry, but you can count on them eventually getting too enthusiastic and letting the OS requirements outpace older chips too soon.

        In the case of the incompatibility below, I have my doubts that it’s about the chip. It looks like they are preventing access through both browser and chip compatibility requirements, but they admit that the older chips can perform more or less as required:

        http://www.pcmag.com/news/349792/4k-netflix-on-windows-10-requires-kaby-lake-chip-and-microso

        Below, you see that Microsoft will not support newer processors on older Windows versions, nor will it support older processors on Windows 10; and the soon-to-be-unsupported processor range will begin with a relatively new processor, Skylake, in July 2017:

        http://www.anandtech.com/show/9964/microsoft-to-only-support-new-processors-on-windows-10

        To punctuate my implied point with an anecdote: my 4-core AMD processor from 2010 still runs flawlessly, at nearly the performance of a 2015 8-core processor (the 8-core is a slight improvement in one or two performance measures, but slightly worse in others).

        Dave

        March 5, 2017 at 4:22 pm


Comments are closed.
