Let me start by acknowledging your good experience, Jaclaz, and by saying that I generally agree with your viewpoints.
And we have to recognize that different people do different things with Windows, that the same people do different things with it now than they did in the past, and that different computers vary in quality.
I've been striving to create the "perfect engineering workstation" since the 1970s. At the engineering offices in which I've worked (think "Dilbert"), we never had a Windows OS before XP that would run for long under heavy interactive use, no matter how well set up or tweaked. No need to even mention pre-NT systems, which often needed multiple reboots a day, though we did find ways to use them effectively.
Our collective experience was that, though MUCH better designed, NT and then Windows 2000 would simply corrupt themselves within hours under heavy use: they would either crash or need a reboot after some resource was exhausted and things just stopped working right. By contrast, our servers, running NT Server or 2000 Server, would actually run and run for months reliably. Why? Because hardly anyone ever logged into them interactively, and they were really expensive hardware.
But our workstations, quite high-quality $5K to $10K Windows machines, when used for serious engineering and pushed hard, would inevitably fail and need a fresh boot in as little as a few hours. One could barely get through really big operations with them without some kind of glitch (e.g., processing or transferring hundreds of megabytes of data, which was a lot back then). We had to invent incremental file copy programs with read-back verification just so big multi-directory copies could be retried and retried until the sets of data were copied without error. Bad old days indeed, compared with today. And yes, we always used NTFS once it came out.
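To illustrate the idea, here's a minimal sketch in Python; the names and retry count are my own choosing for the example, and our actual tools were different (and worked incrementally over whole directory trees), but the copy/read-back/retry loop is the essence of it:

    import hashlib
    import shutil
    import time
    from pathlib import Path

    def file_hash(path):
        # Read the file back in chunks and hash it for verification.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def copy_with_verify(src, dst, retries=5):
        # Copy the file, then read both copies back; retry until the hashes match.
        src, dst = Path(src), Path(dst)
        for _ in range(retries):
            shutil.copy2(src, dst)
            if file_hash(src) == file_hash(dst):
                return True   # read-back verification passed
            time.sleep(1)     # brief pause before retrying the copy
        return False          # gave up after the allotted retries

Run something like that over a directory tree, skipping files whose destination already verifies, and you get the "incremental, retry until it's right" behavior we relied on back then.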
Sure, XP came along and made that light years better - systems could go days without a reboot, and sure enough you could often process hundreds of megabytes of data without problems, BUT... a reboot was still inevitably needed when resource leaks led to failure. And occasionally it would just blue screen. XP was simply not a system on which to work reliably for days and process gigabytes of data. Even XP x64, though much better (I used it for years), was limited.
But with later Vista, then Windows 7 and newer, and the advent of 64-bit processing, one can just work and crunch through terabytes of data without any trouble. The newest systems are fundamentally more reliable. It's a simple truth.
It's inevitable that stuff nowadays has to be better programmed and tested - machines are simply so much faster, and capable of storing so much more data, than their predecessors that the same software from yesteryear that ran for days would be run into the ground in minutes on a modern computer. It says a lot when a machine runs for weeks or months now. And I'm still talking about under hard use - not just sitting there.
Oh, and I still run XP almost daily, in virtual machines; I regularly test software in that environment. Even well tweaked and kept fully updated right up to Microsoft's abandonment of it, it's simply not as stable a system as Windows 7 or 8.
People often diss the most recent Windows versions, and for a lot of good reasons - Microsoft has made many blunders with the UI and with their choices of what to package into it, all in pursuit of our wallets. They seem to want it to be a toy. But one thing they have done is make the kernel stable. Perhaps that's just because they haven't really changed the NT core all that much in recent times.
If set up well, Windows 8.1 can be super stable - a base for quite a useful workstation.