Long Term System Stability?


NoelC


... If compacting (please read as defragmenting) the Registry is a bad thing, then compacting a database is also a very bad thing to do, as well as defragmenting a filesystem. :unsure: ...

 

Any takers for this gauntlet? :whistle:



Naah, that is not at all a gauntlet, it is just a set of statements.

 

This (point 7, specifically) may be interesting to challenge:

http://rwmj.wordpress.com/2010/02/18/why-the-windows-registry-sucks-technically/

As well as this one :whistle::

http://reboot.pro/topic/7681-the-registry-as-a-filesystem/

 

Though they may make it even harder to choose an approach....

 

I mean, choose one ;):

  1. IF the Registry is a database, then it makes sense to compact (please read as defragment) it periodically.
  2. IF the Registry is BOTH a database AND a filesystem, it makes sense to compact (please read as defragment) it periodically.
  3. IF the Registry is NOT a database, BUT ONLY a filesystem, then it makes even more sense to defragment it periodically.
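Whichever branch you pick, the mechanical meaning of "compact (please read as defragment)" is the same: pack the live records together and drop the dead space. A toy sketch of that principle (nothing like the real hive or database format, just an illustration):

```python
# Toy illustration (not the real hive format): a "hive" as a list of
# fixed-size cells, where deleting a key leaves a free hole (None) behind.
def compact(cells):
    """Drop the holes (None) and pack live cells contiguously."""
    return [c for c in cells if c is not None]

hive = ["key1", None, "key2", None, None, "key3"]  # fragmented: 3 live cells, 3 holes
packed = compact(hive)

print(len(hive), len(packed))  # 6 cells before, 3 after compaction
```

The question in the thread is not whether this shrinks the file (it does) but whether the shrinking is worth the risk and effort.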

 

jaclaz


Heh heh heh, nope.  Haven't had a floppy drive - even a 3-1/2 incher - since 2006.

 

I take that back...  One day maybe I'll dig my old Sol 20 (an 8080 system I built in the '70s) out of the closet and see if it still runs.  I have a couple of 8 inch floppies for that to boot CP/M.  I certainly won't need any more though.  :angel:

 

Ran Visual Studio 2013 with Update 2 all day today, collaborating with a colleague long distance via Skype and RAdmin, and we blazed through a few thousand lines of C code (including some slick shader code; it's fun to make a GPU sing).  Everything's running great.  Not a glitch, not a lost data byte.  When I needed the computer to work, it worked.  I don't worry whether something's going to go wrong.  It's nice to be able to really rely on a system like this.

 

[Image: Uptime11Days.png]

 

Oh, and a note to MikeRL:  You do NOT need to (nor should you) manipulate your registry via a "defragger" application.

 

-Noel

 

I can't help but notice that you have 48GB of RAM and dual Intel Xeon processors. I assume you're running a workstation of some kind?


MrMaguire, I have a Dell Precision T5500.  It's a simply awesome machine.  The Xeons are Westmeres, which are getting a bit old by Intel's Tick/Tock standards, but they're the top-of-the-line Westmeres, so they're as powerful as all but the very top of the newest chip lineups.  The RAM is ECC.  This is rock solid hardware, and it complements the solid OS reliability nicely.

 

My points regarding running defragmentation programs on your registry are as follows:

 

1.  It's not a regular maintenance task that Windows provides - even today.

 

2.  Running something that reorganizes a database is arguably more likely to corrupt that database than leaving it alone.

 

If you find that defragging your registry is helpful, and you've developed practices that include doing so, more power to you.  I have never defragged a registry, and my systems simply do not slow down over time.  I've never sensed a need, and I'd argue that I do as much with Windows as anyone - though clearly different stuff.  I don't make a practice of installing and uninstalling stuff on my main workstation - I do that kind of work in VMs - though to be honest I have test VMs that are 5+ years old, in which I HAVE installed tons of stuff in the process of testing, and they haven't shown any signs of loading up either.

 

-Noel


... I have a Dell Precision T5500.  It's a simply awesome machine. ...

 

Ooh. I would have bet that it was a Dell Precision Workstation of some kind. I have a Dell Precision 380 myself, with a 2.8GHz Pentium D and 2GB of ECC DDR2 (which I'm hoping to upgrade to 8GB). That's what I'm running XP Pro x64 on, and I love it. I got this computer used, and quite frankly I'd rather have old quality hardware than something more powerful but perhaps less durable and rugged. After a full day of messing with VMware Player and other things, Windows still seems to be perfectly happy, but I do shut it down every night.


By the way, I've had two old Precision 470s just die in the past year - one after a 1 hour power outage (longer than its UPS could run it) and the other just while running.  Apparently there was some capacitor problem in that model.  I'm not sure whether that same issue might be in your 380 - it's one model newer.

 

I noticed that when I took apart the 470s all the plastic parts were extremely brittle, and most catches, hold-downs, etc., just broke.  I would have thought they were built better, but I got a reality check from that.

 

Oh, and it's been another day of computer usage on this boot and still no glitches.  It's a magical time for Windows 8.1 stability.

 

[Image: Uptime_05_30_2014.png]

 

-Noel


Thanks for the heads-up, Noel. I haven't looked to see what brand and series of capacitors my Precision 380 has, but I'll at least do some looking around on the internet, so I can see whether it was a real problem.

 

A Google search regarding the Precision 470 returned this thread on badcaps.net. Apparently Dell used a bad series of Nichicon capacitors and supposedly redesigned the 'board with solid state capacitors later on.

 

It's interesting that the plastic parts became brittle. I've heard of that happening to (mostly) old beige coloured plastics that are exposed to lots of heat and UV light, such as the Super Nintendo.


  • 2 weeks later...

27 days uptime since the May Windows Updates, zero glitches.

 

But it won't see 28 - the June Windows Updates are here and a reboot will be required.

 

[Image: Uptime27Days.png]

 

-Noel


... my Precision 380 ...

 

Interesting relic, looks like BTX form factor:

 

[Image: FabdSvaF.jpg]

 

To prevent the plastics from melting and the electrolytic caps from busting, I'd boost cooling by fitting a 92~120mm extractor fan (the largest that fits) at the rear grill, cutting the grill inside its swept area to maximize flow. Something like this:

 

[Image: D4CoIctJ.png]


Yes. Why? Defragging - or Disk Optimizer, as it's called now - is also a no-no if you use an SSD.

Why not?

http://www.msfn.org/board/topic/171875-long-term-system-stability/#entry1078900

http://www.msfn.org/board/topic/171875-long-term-system-stability/#entry1078910

An interesting point would be the "practical effects" of compacting a database (or a filesystem): i.e., the difference in performance between two identical machines used for the same number of hours - let's say 2,000 hours, by two twin brothers/sisters doing exactly the same things - where on one of them routine compacting/defragging is carried out regularly (without overdoing it) and on the other nothing is done.

Possibly on modern (fastish) hardware there wouldn't be a noticeable difference, but - as an example - IF the database (or filesystem) becomes corrupted (for any reason), the difference in the outcome of an attempted recovery would likely be the difference between success and failure. :ph34r:

jaclaz


"Rules of Thumb" change over time.  Technology changes.  SSDs don't "wear out" anywhere near as fast as they used to.  The difference came about when wear-leveling flash controller logic was introduced.  Plus, they're quite often larger today than they were just a few years ago (i.e., you might buy a 512 GB disk instead of a 128 GB disk, which will give you 4x the lifespan under the same write load).
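The 4x figure follows from simple arithmetic; here's a sketch with assumed endurance numbers (the P/E-cycle and daily-write figures below are illustrative, not specs for any real drive):

```python
# Back-of-envelope SSD endurance math (assumed figures, not any drive's spec):
# at a fixed write load, lifespan scales with capacity, because wear-leveling
# spreads the same writes over proportionally more flash cells.
def lifespan_years(capacity_gb, pe_cycles, gb_written_per_day):
    total_writes_gb = capacity_gb * pe_cycles        # total endurance budget
    return total_writes_gb / gb_written_per_day / 365

small = lifespan_years(128, 3000, 50)   # 128 GB drive, 50 GB written per day
large = lifespan_years(512, 3000, 50)   # 512 GB drive, same daily write load
print(round(large / small))  # 4 -> 4x the capacity gives ~4x the lifespan
```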

 

And practical best practices are usually not extreme.  Thus "never" do this or "always" do that aren't usually as good advice as "do this or that in moderation" or "when it makes sense".

 

To this day SSDs perform sequential reads and writes (i.e., reads/writes of multiple contiguous logical blocks of data) more efficiently than random accesses of single blocks - simply because there's overhead in preparing the commands and awaiting the responses.  Fewer commands are better.  You could make the point that, hypothetically, a defragged database or file system will perform better.
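That command-overhead argument can be put in toy-model form; the per-command and per-megabyte costs below are made-up constants, chosen only to show how the fixed cost per request dominates when requests are small:

```python
# Toy model of per-command overhead (constants are assumptions, not
# measurements of any real SSD): every I/O request pays a fixed setup cost,
# so reading the same data in fewer, larger commands finishes sooner.
def total_time_us(bytes_total, request_size, per_cmd_overhead_us=10, us_per_mb=2):
    commands = bytes_total // request_size               # number of I/O requests
    transfer_us = bytes_total / (1024 * 1024) * us_per_mb  # raw transfer time
    return commands * per_cmd_overhead_us + transfer_us

data = 64 * 1024 * 1024                            # 64 MiB to read either way
random_4k = total_time_us(data, 4 * 1024)          # 16384 small commands
sequential_1m = total_time_us(data, 1024 * 1024)   # 64 large commands
print(random_4k > sequential_1m)  # True: same data, far more command overhead
```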

 

That being said, practically speaking one simply does not have to worry about defragging anything on a modern Windows system.  It WILL work fine for years without doing so, and it will retain substantially all of its performance.  What you save is the risk of corrupting something by monkeying with data that the system doesn't expect you to monkey with.  Plus Windows DOES do a fair bit of self-maintenance - arguably all that it needs.

 

I'm always surprised at how focused people can be - and how much effort they'll spend - on eking some hypothetical little bit of performance out of their registry accesses, yet the same folks may be completely oblivious to the fact that Windows 8.1 is actually a good bit slower - measurably slower across multiple systems - at doing most operations in the file system than all of its predecessors (including Windows 8).  It seems a bit penny wise and pound foolish if you ask me.

 

But people have to feel good about their computers, and there's certainly a "feel good" factor in doing system maintenance.

 

-Noel

 

 

P.S.,

 

For what it's worth, Windows 8.1 file system access appears to be about 5% faster after the June Windows Updates as compared to a fully updated system with the May updates - no defragging required.  :)   I've measured this on multiple systems.


 

... Defragging or Disk Optimizer, as it's called now, is also a no-no if you use an SSD. ...

 

Why not?

 

"You should never defrag an SSD. Don't even think about it. The reason is that physical data placement on an SSD is handled solely by the SSD's firmware, and what it reports to Windows is NOT how the data is actually stored on the SSD.

 

This means that the physical data placement a defragger shows in its fancy sector chart has nothing to do with reality. The data is NOT where Windows thinks it is, and Windows has no control over where the data is actually placed.

 

To even out usage on its internal memory chips SSD firmware intentionally splits data up across all of the SSD's memory chips, and it also moves data around on these chips when it isn't busy reading or writing (in an attempt to even out chip usage.)

 

Windows never sees any of this, so if you do a defrag Windows will simply cause a whole bunch of needless I/O to the SSD and this will do nothing except decrease the useful life of the SSD."
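A toy model of the remapping described in that quote - purely illustrative, no real controller works like this - shows why the logical address Windows sees says nothing about physical placement:

```python
# Simplified sketch of a flash translation layer (FTL): each logical block
# address (LBA) is remapped on every write to whichever physical page has
# the least wear, so logical order says nothing about physical placement.
# (Toy model, not any real controller's algorithm.)
class ToyFTL:
    def __init__(self, n_pages):
        self.mapping = {}             # logical block -> current physical page
        self.wear = [0] * n_pages     # program/erase counts per physical page

    def write(self, lba, _data):
        # Pick the least-worn physical page, ignoring the old location.
        page = min(range(len(self.wear)), key=lambda p: self.wear[p])
        self.wear[page] += 1
        self.mapping[lba] = page

ftl = ToyFTL(8)
for lba in [0, 1, 2, 0, 0]:           # rewrite LBA 0 repeatedly
    ftl.write(lba, b"x")
print(ftl.mapping)   # LBA 0 lands on a different physical page each rewrite
```

A defragger moving LBAs around would just trigger more of these remapping writes without controlling where anything physically lands.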


Where the data is stored inside the SSD really has no bearing on whether the file system is fragmented, except that an SSD's I/O accesses vary between blindingly fast (e.g., when blocks are contiguous and the OS can ask for more of them sequentially in one I/O request), and blazingly fast (when blocks have to be requested from disparate addresses with separate "random" I/O operations). 

 

In short, it's a little slower to do things via random small I/O requests vs. bigger sequential requests. 

 

Wasn't it you who was concerned about the best speed of 4K random I/Os being only in the 40 MB/second range, TELVM?  Consider that a 100% fragmented file system on an SSD would be using all random I/O operations.  Practically speaking, a file system doesn't tend to get to 100% fragmented, and there's always a mix of small and large operations.  But for a given I/O load, a greater percentage of large operations does yield better performance.
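For a sense of scale, here's the arithmetic behind that figure (the 40 MB/s throughput number comes from the discussion above; the 4 KiB request size is the usual benchmarking convention):

```python
# How many I/O requests per second a drive must complete to sustain a given
# throughput at 4 KiB per request (numbers from the discussion above).
throughput_mb_s = 40
request_kib = 4
iops = throughput_mb_s * 1024 // request_kib
print(iops)  # 10240 commands/second just to sustain 40 MB/s at 4 KiB each
```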

 

Keep in mind that since some time in Windows 7's lifetime, Microsoft's Disk Optimizer software won't "defrag" an SSD in the traditional sense (of placing logical blocks in order) anyway.  It "optimizes" the disk by sending TRIM commands for all blocks that are not in use, thus ensuring it has given the SSD controller all the proper indications about what's in use and what's not.  This is actually a Good Thing, though not strictly necessary.  Modern SSDs manage their free space autonomously, without direct input from the OS, though they can do so more effectively when the OS does give that input.
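Conceptually, such a "retrim" pass just walks the free-space map and reports each contiguous run of unused blocks to the controller. A minimal sketch (hypothetical helper, not the actual Disk Optimizer logic):

```python
# Sketch of a conceptual "retrim" pass: walk a free-space bitmap and collect
# contiguous runs of free blocks, one TRIM-style notification per run.
# (Hypothetical illustration, not the real Windows Disk Optimizer code.)
def free_runs(bitmap):
    """Return (start, length) for each contiguous run of free (0) blocks."""
    runs, start = [], None
    for i, used in enumerate(bitmap + [1]):   # sentinel closes a trailing run
        if not used and start is None:
            start = i                         # run begins
        elif used and start is not None:
            runs.append((start, i - start))   # run ends
            start = None
    return runs

# 1 = in use, 0 = free; one TRIM command would cover each run
print(free_runs([1, 0, 0, 1, 1, 0, 0, 0, 1]))  # [(1, 2), (5, 3)]
```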

 

Again, it's not something the average Joe needs to worry about.  The OS has already got SSD management covered.  Without a complete understanding of how things work at multiple levels, a lot of people screw things up more than they help.  I'm sensitive to this because I see a lot of folks (not specifically here, but in general) who think they know how things work but really don't, and they tend to mess things up or get duped into thinking they need useless "cleaner" applications.  To use Marketing-speak, the "system maintenance / cleaner" genre has made its own market by making people believe they need/want it.

 

-Noel

