OREALLY said:
When trying to defragment my Seabright 500GB drive...Windows 7 never
completes the process and the drive remains 1% fragmented. So I 'check
files' and scan for bad sectors. The message seems to stay at 'Processing
517 files.' The memory usage keeps going up to the point where I have to
close the program. (I have 8GB installed). Something is not right!
Any ideas?
Thanks,
Oreally
Doing a search, I can find plenty of evidence this is "by design"
and it's stupid. They try to use as much memory as possible, to make
stage 4 checking faster or something. Some people report
chkdsk even causes pageouts, so it isn't even stopping when
the memory is nominally full.
One user suggested starting something like a Virtual Machine,
a program with a known large memory allocation, *then* starting
chkdsk. chkdsk will then take note of the amount of memory
available on the system. Then, exit the Virtual Machine
or other memory-hogging program. chkdsk will then
stay within its originally computed memory footprint,
avoiding running the machine into a paging situation.
(I find it hard to believe the bad chkdsk code actually
runs that way, but experiment and see if it is true.)
I'm on a 32 bit system right now, so the Testlimit program
here works. If you have a 64 bit system, you may be able to use
either of these programs. testlimit64 is a 64 bit program and
only runs on a 64 bit OS.
http://live.sysinternals.com/Tools/WindowsInternals/
Friday, November 12, 2010 10:53 AM 76152 Testlimit.exe
Monday, November 15, 2010 2:18 PM 79224 testlimit64.exe
Go to Start, and type in "cmd.exe". Basically, start a Command Prompt,
navigate to the directory holding the downloaded copy of Testlimit,
then try something like this.
testlimit -d 1900
What that does is leak and touch 1900MB of memory. I'm on a 4GB
machine with a 32 bit OS, and a single process can't go higher
than 1900MB. (I think I noticed Photoshop suffering in a similar
way, unable to go higher than roughly that value.) As long as
testlimit is running, with that leaked allocation, chkdsk will be
fooled into thinking it can't have all the memory. Start
your chkdsk run as you normally would. Then go back to the
MS-DOS command prompt window and press <control>-c to kill
the testlimit program. It will release the 1900 MB of memory back
to the OS. Now, the theory is, chkdsk can be a pig, but it
will stop when it gets within 1900MB of the end of memory.
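To make "leak and touch" concrete, here is a minimal sketch in Python (my own illustration, not Sysinternals Testlimit itself): allocate a buffer, then write to every page so the OS must actually commit the memory rather than just reserving address space.

```python
# Sketch of what "leak and touch" means. This is NOT testlimit; it is a
# hypothetical stand-in to show the mechanism.

PAGE = 4096              # typical x86 page size
MB = 1024 * 1024

def leak_and_touch(megabytes):
    buf = bytearray(megabytes * MB)     # get the address space
    for offset in range(0, len(buf), PAGE):
        buf[offset] = 1                 # touch each page so it gets committed
    return buf                          # keep a reference so it is never freed

if __name__ == "__main__":
    hog = leak_and_touch(8)             # small demo; testlimit -d 1900 holds 1900MB
    print("holding", len(hog) // MB, "MB until the process exits")
    # a real hog like testlimit would now sit idle until Ctrl-C releases it
```

Killing the process (or, in testlimit's case, pressing control-c) is what hands the committed pages back to the OS.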
If you use testlimit64, the command would look like this:
testlimit64 -d 4096
That would allocate 4GB of memory, which you could then "give back"
once chkdsk is running. The amount of memory you allocate could be
set to, say, half your installed memory.
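If you want to compute that "half of installed memory" figure rather than guess it, here is a hypothetical helper (my own, not part of testlimit); os.sysconf works on POSIX systems, while on Windows you would simply read the installed-memory figure from Task Manager.

```python
# Hypothetical sizing helper: pick a testlimit -d argument equal to half
# the installed RAM. os.sysconf is POSIX-only; the arithmetic is the point.
import os

total_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
half_mb = total_bytes // (2 * 1024 * 1024)
print("testlimit64 -d", half_mb)        # e.g. 4096 on an 8GB machine
```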
The reason I'm suggesting this is to see whether this "workaround"
actually works or not. I don't see a reason why it should, but you
have the incentive to give it a try.
You can see more fun with "testlimit" program here, which is where
I first read about it.
http://blogs.technet.com/b/markrussinovich/archive/2008/11/17/3155406.aspx
The last time I experimented with "testlimit", I wasn't able to
convince myself that *any* option caused it to actually
commit (and not just reserve) memory. The -d option seems to be
doing the right thing now. If I look at the list of processes
in Task Manager, the Mem Usage and VM Size fields are
roughly the same size, which means the program has committed
the memory. So it seems to be working well enough to be used
to test the proposed "chkdsk workaround".
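The reserve-versus-commit distinction can also be seen directly. Here is a rough POSIX demonstration (an assumption of mine, not how Task Manager measures things): an anonymous mmap only reserves address space, and resident memory grows once the pages are actually touched.

```python
# Reserve vs commit, sketched on POSIX: mapping 256MB anonymously costs
# almost no resident memory until each page is written to.
import mmap
import resource

MB = 1024 * 1024

def max_rss_kb():
    # peak resident set size; reported in kilobytes on Linux
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

m = mmap.mmap(-1, 256 * MB)             # reserve 256MB of address space
before = max_rss_kb()
for offset in range(0, len(m), 4096):
    m[offset] = 1                       # touching each page commits it
after = max_rss_kb()
print("resident size grew by roughly", (after - before) // 1024, "MB")
```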
Paul