Computers Are Not Getting Faster in a Meaningful Way, GPU Is Half-Baked Tech, Too Many Software Developers Suck
See also: Reader Comment: “Bloomberg piece goes hand-in-hand with today’s article that you posted”.
I happened to be thinking about how well my 2013 Mac Pro serves my needs—very well now for nearly three years, edging out the late 2015 iMac 5K most of the time, or being edged out slightly some of the time.
On the other hand, my most frequent task of creating lens rendering aperture series takes up to 5 minutes at a go, and I do that a lot. I would really like to see that process run 2X to 3X faster (well, 10X faster). But after three years, the improvement in CPU speed is little better than a rounding error, laptop or desktop (and Apple has made zero speed improvement on the Mac Pro, not even bothering to offer the incrementally faster CPUs).
SSD speed now exceeds the needs of virtually all programs, memory capacity is rarely a real limitation for almost all tasks, and memory speed is not a constraining factor for most tasks. The CPU is too often the limit.
Which brings me to two general points.
First, CPU performance is maxed out with little sign of any major gains on the horizon. While incremental gains are seen in dribs and drabs in the laptop and desktop space, performance has hit a wall and it sure doesn’t look like that’s going to change any time soon. Gone are the days of a 25% boost in speed when a new Mac comes out. While the GPU has contributed some benefits, those benefits range from non-existent to spotty and sporadic (certain specialty programs excepted). And programs using the GPU tend to crash; GPU support is too often a science fair project that cannot be used reliably*. The lack of speed gains is not just a Mac thing; PCs might be overclocked for some incremental gain (4.4 GHz vs 4 GHz is a yawner), but there are no 10 GHz or 20 GHz CPUs in any Mac or Windows PC.
* Photoshop crashes with the GPU enabled (I was unable to complete recent MPG testing); Sigma Photo Pro has crashed 100% of the time with GPU support enabled, for several years now; display quality is degraded with the GPU enabled at certain scaling factors. Ad nauseam: the GPU is a half-baked technology.
Second, many if not most programs could run much faster if software developers did not suck. Some don’t suck, but many do. For example, why can’t I sharpen 10 layers in parallel in Photoshop with one command? Instead, the sharpening has to be invoked layer by layer, which is even worse: the effort is serialized by human-driven actions (even my scripts cannot run in parallel, given Photoshop’s serialized operational design). There is something irritating about highly intelligent people being dumb as hell when it comes to obvious things.
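The point is that layers are independent work items, so this is an embarrassingly parallel job. A minimal Python sketch of the idea (everything here is an invented stand-in, not a Photoshop API: `sharpen_layer` is a toy filter and the “layers” are toy pixel lists):

```python
# Hypothetical sketch: sharpening N independent layers in parallel.
# sharpen_layer stands in for a real filter; layers are toy pixel lists.
from concurrent.futures import ProcessPoolExecutor

def sharpen_layer(layer):
    # Toy "sharpen": boost each pixel by 10%, clamped to 255.
    # (Real sharpening is far more involved; this just represents work.)
    return [min(255, int(p * 1.1)) for p in layer]

def sharpen_all(layers, workers=4):
    # Each layer is independent of the others, so all of them can be
    # processed at once instead of one human-driven invocation at a time.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(sharpen_layer, layers))

if __name__ == "__main__":
    layers = [[10, 100, 200]] * 10   # ten identical toy layers
    results = sharpen_all(layers)
    print(len(results))              # one sharpened result per layer
```

With a design like this, ten layers take roughly as long as ten-divided-by-core-count layers, rather than ten serialized round trips through the UI.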
In at least some cases, 15-year-old inertial thinking is killing software performance.
In other cases, it’s just design by nitwits—to take a more mundane example, why does Apple Mail lock me out with no response and/or a rainbow beachball, pegging one CPU core at 100% for 3 to 20 seconds, or prevent typing for 3 to 5 seconds while searching? It’s brain-dead incompetence in software engineering.
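The fix for this kind of lock-up has been standard practice for decades: do the slow work off the main thread so input handling never stalls. A minimal Python sketch (I have no knowledge of Mail’s internals; `slow_search` and the timing are invented stand-ins):

```python
# Hypothetical sketch: run a slow search on a background thread so the
# "main thread" (which handles typing) is never blocked while it runs.
import queue
import threading
import time

results = queue.Queue()

def slow_search(term):
    time.sleep(0.2)  # stand-in for a multi-second mailbox index scan
    results.put(f"hits for {term!r}")

if __name__ == "__main__":
    # Kick off the search in the background...
    worker = threading.Thread(target=slow_search, args=("invoice",))
    worker.start()
    # ...while the main thread keeps accepting keystrokes (simulated).
    for ch in "typing continues":
        pass  # input handling proceeds uninterrupted
    worker.join()
    print(results.get())
```

The search result arrives via the queue when ready; meanwhile the loop standing in for the event loop never waited on it. Blocking the UI thread on a search is a choice, not a necessity.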
There is a whole range of poorly done software out there (in performance terms), but it all boils down to a failure by developers to see what’s right in front of them. A curious blindness. So maybe it is a Russian plot!