Krokodil wrote:[...] I do agree modern computing is rather boring. You don't see the groundbreaking innovations anymore. It seems the days of expensive, exotic systems, with groundbreaking and mind blowing performance - have come to an end [...]
I agree with what you have observed, but I think the homogenization of computing (converging to x86 / Windows / Linux) is the sign of a mature technology. That isn't to say that x86 (or, say, Linux) was the best of what was technologically available -- far from it -- but a combination of luck and market forces caused things to settle out. Generally, economics favors the lowest-cost tool that gets the job done most of the time.
When technologies are in their infancy you see a much wider ecosystem of competing products and unique offerings, but as they mature, things homogenize and are treated as a foundation. The details are shoved under the hood and people move on to the next level of abstraction. All the excitement of early electrical delivery and the "War of the Currents" has ended, and now we just expect to get our watts at 120V AC and 60Hz (at least in this country). Or in telecom, everything from Strowger and crossbar switches to TDM circuits and ATM has given way to boring old IP transit and Metro Ethernet.
It is hard to find good examples because the pace of innovation in computing has been so quick. I don't think there has been anything similar in recorded history where humans were able to witness the birth and maturity of a world-changing technology (and all the upheaval in between) within a single generation. But the gist of it is that units of computing power are now as generic as watts from the wall. And there are, of course, the equivalent "power companies" of computing: Amazon Web Services, Google Cloud, Microsoft Azure, etc. Unless you are an engineer at Intel or ARM, there probably isn't a lot to get excited about in hardware any more. Computing has become a commodity.
With the base layers "solved" to some extent, I think the innovation has moved into the software domain. I have been very impressed with the developments in image recognition. As another example, AWS has made massively parallel computing accessible to everyone, and there is still a long way to go in taking advantage of it, with a lot of the foundational pieces still missing. Horizontally scalable databases (Cassandra, Voldemort, DynamoDB, etc.) are still in their infancy. I don't think people are very good at programming for truly horizontal/parallelized environments yet, but in any case, I think software is where the excitement is these days. The focus is no longer on how the watts are generated (which I say knowing that this still fascinates me, as it does most others here), but rather on what they are used for.
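To make "horizontally scalable" a bit more concrete: Dynamo-style stores like Cassandra and Voldemort partition data across nodes with consistent hashing, so a node can join or leave without reshuffling most of the data. Here's a minimal Python sketch of the idea -- the class and node names are just made up for illustration, not any real library's API:

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Toy consistent hash ring (illustrative only). Maps keys to nodes
    so that adding or removing a node only remaps ~1/N of the keys,
    instead of rehashing everything."""

    def __init__(self, nodes=(), vnodes=100):
        self.vnodes = vnodes   # virtual points per node smooth out the load
        self._ring = []        # sorted list of (position, node) tuples
        for node in nodes:
            self.add_node(node)

    @staticmethod
    def _position(key):
        # Stable 64-bit position on the ring, derived from MD5
        return int.from_bytes(hashlib.md5(key.encode()).digest()[:8], "big")

    def add_node(self, node):
        for i in range(self.vnodes):
            bisect.insort(self._ring, (self._position(f"{node}#{i}"), node))

    def remove_node(self, node):
        self._ring = [(pos, n) for pos, n in self._ring if n != node]

    def get_node(self, key):
        # Walk clockwise from the key's position to the next node point
        if not self._ring:
            raise LookupError("ring is empty")
        idx = bisect.bisect_left(self._ring, (self._position(key),))
        return self._ring[idx % len(self._ring)][1]

# Example: keys spread across three nodes; adding a fourth only
# moves roughly a quarter of them to the new node.
ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
owner = ring.get_node("user:42")
ring.add_node("node-d")
```

The virtual nodes keep the key distribution roughly even; real systems layer replication, failure detection, and rebalancing on top of this, all of which the sketch leaves out -- which is part of why I say the foundational pieces are still missing.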