And hidden therein is my actual point: complexity. There has long been a trend in computing of endlessly piling on abstractions, heedless of the consequences. The web is an ever-growing mess of complexity, with larger and larger blobs of inscrutable JavaScript being shoved down pipes with no regard for the pipe’s size or the bridge toll charged by the end-user’s telecom. Electron apps are so far removed from the hardware that their jarring, non-native UIs can take seconds to respond and eat up the better part of your RAM merely to show a text editor or chat application.
The PC in front of me is literally five thousand times faster than the graphing calculator in my closet, yet the latter can boot to a useful system in a fraction of a second, while my PC takes almost a minute. Productivity per CPU cycle per watt is the lowest it’s been in decades, and is orders of magnitude (plural) beneath its potential. So far as most end-users are concerned, computers haven’t improved in meaningful ways in the past 10 years, and in many respects have become worse. The cause is well-known: programmers have spent the entire lifetime of our field recklessly piling abstraction on top of abstraction on top of abstraction. We’re more concerned with throwing more hardware at the problem than we are with optimization, outside of a small number of high-value problems like video decoding.
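To put rough numbers on that gap, here’s a back-of-the-envelope sketch in C. The clock speeds and boot times are illustrative assumptions (a ~15 MHz calculator CPU reaching a usable prompt in half a second, a single ~3 GHz PC core taking a minute), not measurements of my actual hardware:

```c
/* Back-of-the-envelope only: the clock speeds and boot times below are
 * assumed round numbers, not measurements. */
#include <stdio.h>

int main(void) {
    double calc_hz     = 15e6;  /* hypothetical calculator CPU: ~15 MHz  */
    double calc_boot_s = 0.5;   /* assumed time to a usable prompt       */
    double pc_hz       = 3e9;   /* one ~3 GHz PC core, ignoring the rest */
    double pc_boot_s   = 60.0;  /* assumed time to a usable desktop      */

    double calc_cycles = calc_hz * calc_boot_s; /* ~7.5 million cycles   */
    double pc_cycles   = pc_hz * pc_boot_s;     /* ~180 billion cycles   */

    printf("calculator: %.2g cycles to boot\n", calc_cycles);
    printf("PC:         %.2g cycles to boot\n", pc_cycles);
    printf("the PC burns %.0fx more cycles to reach a usable state\n",
           pc_cycles / calc_cycles);
    return 0;
}
```

Even under these generous assumptions, the PC spends tens of thousands of times more cycles just to become usable. That is the distance between what the hardware can do and what the software lets it do.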
Programs have grown fat and reckless in scope, and it affects literally everything, even down to the last bastion of low-level programming: C.