Yet, as shown here, NO ONE knows just when a defrag's cost will result
in a savings (wear, electricity, or by whatever measurement) over
continuing to access fragmented files.
Who said anything about cost? I just made the simple statement that the
more you defrag, the more wear and tear you inflict on the hard drive.
Saying that you know when to defrag is no more steeped in science or
proof than someone who admits
they don't know and just goes ahead to schedule the defrag anyway. You
don't know "when it's needed". You just wait until the percentage of
fragmentation gets above some arbitrary threshold upon which you've
chosen (by the way, some defraggers can "schedule" themself to defrag
when fragmentation gets above some threshold, again, arbitrarily chosen
by the user). The other user admits they don't know "when it's needed"
so figure they'll just run it or schedule it at short intervals since
they don't perceive the added wear (which may not exceed the added wear
to access fragmented files) as a risk to the survival of their device.
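For what it's worth, that kind of threshold check is easy to script
yourself. Below is a minimal sketch in Python that runs Windows' built-in
defrag tool in analyze-only mode and only suggests a real defrag when the
reported fragmentation crosses a (still arbitrary) cutoff. The way I parse
the report is an assumption on my part; the exact wording of defrag's
output differs between Windows versions, so the pattern may need tweaking.

    import re
    import subprocess

    THRESHOLD_PERCENT = 10  # arbitrary cutoff, exactly as discussed above

    def fragmentation_percent(volume="C:"):
        # "defrag <volume> /A" analyzes only and moves nothing; /V asks for
        # a verbose report. Needs an elevated (administrator) prompt.
        report = subprocess.run(
            ["defrag", volume, "/A", "/V"],
            capture_output=True, text=True, check=True,
        ).stdout
        # Assumption: the report contains a line along the lines of
        # "Total fragmented space = 12%". Adjust the pattern for your version.
        match = re.search(r"fragmented space\s*=\s*(\d+)\s*%", report,
                          re.IGNORECASE)
        return int(match.group(1)) if match else None

    pct = fragmentation_percent("C:")
    if pct is None:
        print("Could not parse the analysis report.")
    elif pct >= THRESHOLD_PERCENT:
        print(f"{pct}% fragmented -- over the cutoff, a defrag may be worth it.")
    else:
        print(f"{pct}% fragmented -- under the cutoff, leave it alone.")

None of that settles the argument, of course; the cutoff is still a
number somebody pulled out of the air.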
I let the defragger tell me when to defrag. If it says to, I do. Most
times not immediately, but within a few hours or few days.
When I analyze a drive is subjective, based on experience: if I use the
system a lot (installing/deleting software, saving/loading lots of
files, etc.), I analyze more frequently; with less use, less often.
But, in any case, I check at least once a month.
Your measure is just as arbitrary, and just as lacking in valid
premises, as someone else's arbitrary choice of when to schedule a
defrag.
I've been told by those who deal with system performance for a living
that hard drive performance, in general, at least with NTFS, takes a hit
with as little as 10% fragmentation, and definitely becomes a problem
when over 20%, which this link seems to verify:
http://www.condusiv.com/disk-defrag/fragmentation-impact/
[snip]
Why do you think defraggers have an analyzer tool separate from the
defragging part? It's there so the user can test the drive for the
degree of fragmentation, and decide if they want to defrag.
As I stated, showing the current level of fragmentation does not tell
you how long it takes to get an ROI, in wear, from defragging those
files. You will add more disk wear with the defrag (several small ones
or a few big ones), but at some point you are hoping that the wear saved
by accessing the now-defragmented files exceeds the wear added by the
defrag plus the wear of accessing the files while they were fragmented.
Showing the current level of fragmentation gives you no usable
information about when that ROI point will be reached.
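To put numbers on what that ROI point would even require, here's a
back-of-the-envelope sketch in Python. Every figure in it is a made-up
placeholder: you would need to know how much extra head movement the
defrag pass itself costs, how much is saved per access to a
now-contiguous file, and how often those particular files actually get
read, and nobody has those numbers for a given drive.

    # Break-even point for a defrag pass, measured in drive "wear" (seeks).
    # All three inputs are hypothetical placeholders, not measurements.
    defrag_extra_seeks = 50_000         # seeks spent moving data during the defrag
    seeks_saved_per_access = 4          # seeks avoided per read of a formerly fragmented file
    fragmented_accesses_per_day = 500   # how often those files are actually touched

    break_even_accesses = defrag_extra_seeks / seeks_saved_per_access
    break_even_days = break_even_accesses / fragmented_accesses_per_day

    print(f"The defrag 'pays for itself' after about {break_even_accesses:.0f} accesses,")
    print(f"or roughly {break_even_days:.1f} days at the assumed access rate.")

Change any one of those guesses and the break-even point moves by weeks,
which is exactly why the current fragmentation percentage alone can't
tell you when it's reached.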
I agree.
I gave up a long time ago on using the percentage of fragmentation or
[snip]
The level of fragmentation (percentage or fragment count) really doesn't
provide you a decent gauge to figure out when you should defrag.
Optimum defragging is a balancing act based on how the system is used.
Using percent fragmentation is not the best (or only) criterion, but for
reading, loading, and saving files (and applications), which is what
most users do most of the time, it's a simple one, easily determined,
that is adequate most of the time.
Geez, what defragger do you use? The ones that I use do a scan before
defrag so they can build up a list of eligible files for the type of
defrag operation that you choose to run. If you use the same defrag
algorithm each time, it shouldn't be moving all files but just the ones
that are fragmented (and perhaps have more fragments than a threshold
you configure in the defragger, if an available option). The only
non-fragmented files that should get moved are to make room, say, for
the MFT reserve area or pagefile.
I use the one provided by Microsoft.
And I was referring to the initial scan, not the defragging operation
itself. I said it badly. That last sentence should have said "...files'
info..." which is not always a quick operation, depending on how big the
Master File Table is and how much of the needed file info is resident.
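If you want to see how big that table actually is on a given volume,
Windows can report it. Here's a small sketch (Python again, shelling out
to the built-in fsutil tool) that pulls the MFT's valid data length out
of the ntfsinfo report; the exact label I match on is an assumption and
may differ slightly between Windows versions.

    import re
    import subprocess

    def mft_size_bytes(volume="C:"):
        # "fsutil fsinfo ntfsinfo <volume>" dumps NTFS metadata for the
        # volume. Needs an elevated prompt on most systems.
        report = subprocess.run(
            ["fsutil", "fsinfo", "ntfsinfo", volume],
            capture_output=True, text=True, check=True,
        ).stdout
        # Assumption: a line like "Mft Valid Data Length : 0x0000000023f40000".
        match = re.search(r"Mft Valid Data Length\s*:\s*(0x[0-9A-Fa-f]+)", report)
        return int(match.group(1), 16) if match else None

    size = mft_size_bytes("C:")
    if size is not None:
        print(f"MFT is roughly {size / (1024 * 1024):.0f} MB; the bigger it is,")
        print("the longer an analysis pass has to grind through it.")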
Whether only fragmented files, and not contiguous files, are moved, even
if you're defragging daily, really depends on a lot of things, most
importantly, the defrag algorithm. For example, is it set to make
contiguous everything, except system areas, of course, from as close as
possible to the beginning of the drive? That is, to leave no empty gaps
between files. This will maximize reads and saves, yes, but it's the
major cause of file fragmentation.
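To illustrate the difference, here's a toy sketch of the two selection
policies. It is not how any real defragger is implemented; it just shows
that a "defrag only what's fragmented" pass touches far fewer files than
a "pack everything toward the front with no gaps" pass, which is why the
latter costs more wear up front (and, as noted above, leaves files
nowhere to grow, so they fragment again sooner).

    # Toy model of a volume: (name, fragment count, starting cluster, size in clusters).
    files = [
        ("notes.txt",      1,  10_000,     50),
        ("big_video.mkv", 37,  50_000, 80_000),
        ("database.db",    9, 160_000, 20_000),
        ("readme.txt",     1, 400_000,      4),
    ]

    FRAGMENT_THRESHOLD = 2   # only rewrite files split into more pieces than this

    # Policy 1: "defrag only" -- rewrite just the badly fragmented files.
    defrag_only = [name for name, frags, _, _ in files if frags > FRAGMENT_THRESHOLD]

    # Policy 2: "consolidate" -- pack every file toward the front of the
    # drive, leaving no gap, so anything not already sitting at its packed
    # position gets moved too, fragmented or not.
    consolidate, next_free = [], 0
    for name, frags, start, size in sorted(files, key=lambda f: f[2]):
        if frags > 1 or start != next_free:
            consolidate.append(name)
        next_free += size

    print("defrag-only pass moves:  ", defrag_only)   # just the two fragmented files
    print("consolidation pass moves:", consolidate)   # every file in this toy example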
Microsoft needs to release a new, more efficient and SMARTER file system
that's less prone to fragmentation. Other OSes have them. Why not
Windows?
Stef