
There's an old quote along the lines of "why would I pay to have the code written more efficiently when processors are constantly getting faster and hard drives are constantly getting bigger?" that always comes to mind with MS software. I can't vouch for that quote's validity any more than the 640k memory one, but it has always had the feel of authenticity, given everything you see as circumstantial evidence.


It feels like they’ve always taken the approach: “Why rewrite anything when we can just add more virtualization?” In the short term, that might help ensure compatibility with older versions while requiring minimal testing. But after 40-something years, it’s clear that it’s become a mountain of technical debt, one that Microsoft has no real plans to tackle any time soon.


The underlying issue is that MS software runs on customers’ machines, so its inefficiency isn’t part of their bottom line. They have little incentive to care as long as it’s not so slow that their monopoly breaks.


My tinfoil hat tells me that they're in cahoots with the big PC manufacturers and use it as part of planned obsolescence.


Additionally, I suspect there are four decades of legacy backward-compatibility hacks, such that doing anything intelligent to help UX is impossible. It might break some peanut butter factory in Indiana that is paying for support.


They have been breaking things left and right for quite some time now; I don't think they care about this anymore.



