I recently took a look to see whether it was worth replacing my 5-year-old desktop. To my surprise, the answer was "no." Looking at the CPU benchmarks, a "modern" i7-4770 would clock in at less than twice the performance of my 5-year-old i7-920. In the old days, 5 years would have been enough to get at least a quadrupling of performance; not even getting a doubling in 5 years would have been unthinkable. Part of it is that Intel is no longer getting any competition from AMD. Part of it is that getting much past 4GHz would overheat a PC, so the easy route of simply raising clock speeds is closed. And increasing the number of cores has already hit diminishing returns as far as most PC users are concerned (I'm an exception: I regularly process video).
The flip side of this is that the base operating system hasn't been demanding more hardware resources recently. Windows 8 is actually less resource-hungry than Windows 7, which would have been unthinkable in the old days. Thanks to Microsoft's desire to compete in the tablet space with Apple and Google, Windows 8 runs decently on a tablet with just 2GB of RAM. This gave me the courage to replace my wife's 4-year-old X201 with a Microsoft Surface Pro with half the RAM. My wife didn't even notice the missing RAM, despite running the resource-hungry Android Studio, which is enough to spin up my desktop PC's fans.
This has several implications for users and developers:
- Rather than buying a mid-range machine and planning to replace it every few years, it might be cheaper to build a high-end machine and upgrade components. Given that CPUs and motherboards no longer need to be trashed every few years, you might as well get a chassis that supports easy hard drive and SSD replacements/expansions, and GPU upgrades if you run GPU-intensive workloads.
- I/O standards do make a big difference, but any PC with a free expansion slot will let you add USB 3 and other new standards, so again, expandability might be more important than planning to throw the machine away.
- An adequately high-end machine will probably last a good 10 years in this environment (i.e., an i7-4770K wouldn't be obsolete for 10 years), which means it makes sense to put money into a high-quality power supply: a more efficient unit saves enough electricity to pay for itself over that long a run (see the sketch after this list). This is in contrast to the "buy-and-replace" strategy, where spending $20 more on a better power supply wouldn't pay for itself in power savings.
- This also seems to apply to laptops, though laptops do benefit from the power-efficiency gains of the latest processors, so if battery life matters to you, an upgrade every 4-5 years might make sense. Given the way most people actually use laptops (constantly plugged in and never really treated as mobile devices), I think most people could keep a laptop for 10 years, replacing the battery every 3-4 years or so, assuming their device supports it.
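To make the power-supply argument concrete, here's a rough payback sketch in Python. Every number in it (average draw, usage hours, electricity price, the efficiency figures) is an assumption picked for illustration; plug in your own. Under these assumptions the $20 premium pays for itself in roughly 3-4 years, which only works out if you keep the machine much longer than a typical buy-and-replace cycle.

```python
# Back-of-the-envelope payback estimate for a more efficient power supply.
# All numbers below are illustrative assumptions, not measurements.

AVG_DRAW_W = 150        # assumed average DC load of the machine, in watts
HOURS_PER_DAY = 8       # assumed daily usage
PRICE_PER_KWH = 0.12    # assumed electricity price, USD per kWh
EFF_CHEAP = 0.82        # assumed efficiency of a budget PSU (~80 Plus Bronze)
EFF_GOOD = 0.90         # assumed efficiency of a better PSU (~80 Plus Gold)
PRICE_PREMIUM = 20.0    # extra up-front cost of the better PSU, in USD

def annual_wall_cost(efficiency: float) -> float:
    """Yearly electricity cost: power drawn at the wall = DC load / efficiency."""
    wall_watts = AVG_DRAW_W / efficiency
    kwh_per_year = wall_watts * HOURS_PER_DAY * 365 / 1000
    return kwh_per_year * PRICE_PER_KWH

savings_per_year = annual_wall_cost(EFF_CHEAP) - annual_wall_cost(EFF_GOOD)
print(f"Savings per year: ${savings_per_year:.2f}")          # ~$5.70/year
print(f"Payback time: {PRICE_PREMIUM / savings_per_year:.1f} years")  # ~3.5 years
```

Over a 3-year ownership cycle the better unit barely breaks even; over 10 years it's clearly ahead, and that's before counting the reliability benefits of a quality supply.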