There's a bit of a 'reboot', as it were (I forget what it's generally called), that contributes to this. By that I mean the way that microcomputers reset progress in software.
When low-power microcomputers hit the masses, the advances that had happened in software in the minicomputer / mainframe world couldn't follow along. So in the first instance you had computers that could only barely run BASIC programs and where you were programming in assembly close to the bare metal if you really wanted to do much more advanced things with the machine.
Now, the kids growing up on these computers in the 70s and 80s (I'm an 80s kid myself) had no idea about things like GUIs or the latest in virtualization technology on minicomputers or all the many problems that had been solved in big-mainframe world but were yet to hit consumer devices. One factor is that consumer hardware of the time wouldn't have run the software efficiently anyway; another one is that there was no Internet where you could just Google anything and find out.
So many of us then grew up in this era and became software engineers who would go on to write the operating systems and software of the 90s and beyond. We'd never seen or heard of, for example, Lisp Machines or what they could do. Which is, I think, why you end up with this weird generational gap, almost like a chasm of knowledge born in the 80s and 90s.
Whenever I watch an Alan Kay video I'm blown away by how much was possible 'back then'. My mental map of technological progress starts with 8-bit micros in the 1980s, which we all thought were 'cutting edge technology' except that they weren't, in the broadest scheme of things. It's this amazement that I think leads to the feeling of a 'Golden Lost Age'.
Remember when Windows 95 touted preemptive multitasking as a groundbreaking new feature? Or when DMA for hard disks was a 'new' thing? (except that the Mother Of All Demos basically showed the concept, from what I vaguely remember from watching it many years back).
We see the same cycle in mobile computing - phones were once close-to-the-metal devices; now they're running full multi-tasking, essentially desktop-class operating systems. The difference these days is that we have the Internet and we have the lessons of history actually available to us.
I remember the awkward thrill of seeing a student thesis from the 60s that was more thoughtful than the latest vector drawing program from A. And of all the advanced packages I've toyed with, not one has that tiny geometry solver in it.
Time is surely not an arrow of progress. Lots of collateral and inherited subcultures are sucking energy. As you mentioned, a new 'market' is also a potentially huge drawback, but that's such a common thing. People will see the world their way, not as a PhD who knows the history and the state of the art (even researchers don't know it all).
It happens in programming languages too. The web started as a freeform, joyful environment, unlike C++ or Java with their heavyweight specs and standards. But then complexity hits, and suddenly they're bringing in a lot of structure, types, conventions, etc. It's like a child who can't enjoy his parents' universe; he needs something compatible with his new mind.
ps: my latest 'the newest is less than the old' moment was realizing Haskell was specified in 1990. At that time mainstream users were given Windows 3.0 and DOS :)
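To make that Haskell point concrete: lazy evaluation and higher-order functions were already part of the language around its 1990 specification, while mainstream desktops were still on DOS. A toy sketch of my own (not taken from the report itself):

```haskell
-- Lazy evaluation lets us define an *infinite* list of Fibonacci
-- numbers; only the elements we actually demand get computed.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

main :: IO ()
main = print (take 10 fibs)  -- prints [0,1,1,2,3,5,8,13,21,34]
```

The whole thing hinges on `fibs` referring to itself while being consumed lazily - an idiom that was already old news in the functional-programming world decades ago.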