or in Win2k Pro?
I mean disable it and not have any swapfile whatsoever
-
Right click My Computer.
Properties
Advanced
Performance Settings
Advanced
Virtual Memory Change
No Paging File
Set
Then OK and APPLY your way out of there. -
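For what it's worth, those GUI steps just edit the pagefile list stored in the registry. A sketch of a .reg file that clears it (assuming the standard Memory Management key; an empty PagingFiles value means no paging file, and a reboot is needed for it to take effect - back up the registry before importing anything like this):

```reg
Windows Registry Editor Version 5.00

; Clears the pagefile list; an empty REG_MULTI_SZ here means "no paging file".
; Takes effect on the next reboot.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"PagingFiles"=hex(7):00,00
```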
Thanks, but if I remember correctly 2k Pro automatically makes a swapfile on each boot even if you try to disable it
-
Why would you want to disable it? Even if you have enough physical RAM to load everything you might possibly need, Windows XP will still want to write to its cache file and will run miserably without one.
-
Win ME, 98 and 95 ran blazingly fast with 512 MB of RAM and no swapfile.
On file encoding it sped things up probably 10%; the results were very noticeable.
And I have always bought the fastest hard drives I could get, but that swap file thing really blows -
If you ever run out of physical RAM you could be in big trouble though...
Regards, Michael Tam
w: Morsels of Evidence -
It only happened once in 6 months, and all that happened was that the program I opened crashed.
The only side effects other than that were: the ASUS monitor utility would give a floating point error, and the Blair Witch games would not run with sound. Turn virtual memory back on and they all worked.
But for encoding MPEGs I switched virtual memory off and there was a very noticeable speed increase -
With any NT-based system you're better off managing your pagefile properly than trying to shut it off altogether (you are with 95/98 too, but it's at least possible to run without one). The system will be more stable and in some cases may perform better than it would without one. This advice is kind of tailored toward servers or professional workstations, but it's all based on practical experience (mine and others') trying to get acceptable performance out of NT/2000 machines.
You want your paging space to be at least 2.5x physical memory. That may sound excessive, but it seems to be a good rule of thumb for everything from little desktop machines running nothing but Office up through network servers. It's right for NT4 and 2000, and works for the 9x line, so it's reasonable to assume that it applies to XP. A pagefile that's too small will be a problem, one that's too big will be fine.
Always set minimum and maximum to the same size. Letting the pagefile size change is a very bad thing - never, ever allow Windows to change the size on the fly. It will always try to start at the minimum and then have to expand it, which is a horrible performance hit. Most of the problems people have with the entire machine grinding to a dead halt during paging are caused by this.
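Those two rules are just arithmetic; here's a minimal Python sketch (the 2.5x factor and the min = max sizing are the rule of thumb above, not anything Windows-defined):

```python
def fixed_pagefile_mb(physical_ram_mb, factor=2.5):
    """Recommended pagefile size per the 2.5x rule of thumb.

    Returns (minimum, maximum) in MB; they are deliberately equal so
    Windows never has to grow the file on the fly.
    """
    size = int(physical_ram_mb * factor)
    return size, size

# A 512 MB machine would get a fixed 1280 MB pagefile:
minimum, maximum = fixed_pagefile_mb(512)
print(minimum, maximum)  # 1280 1280
```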
Avoid pagefile fragmentation. This is another reason to set min/max the same. You can use a dedicated partition for your pagefile, which is the simple solution. But if you only have one drive to use for paging you're better off completely defragmenting C: and putting it there. You may have to do some juggling with the location of the pagefile to do this; since it's always in use, most tools can't defragment it while you're running.
If you have multiple drives you should spread the paging space across different physical devices (dedicated drives are even better, but impractical for a workstation). But if you're using a second drive for capture you shouldn't put the pagefile (or anything else) there, for obvious reasons. -
sterno - I have two drives. My OS is on the master and I usually cap to my secondary/slave drive. I've been setting my pagefile on my secondary drive for a while. Is it better to set the pagefile on the secondary hard drive, even though I cap with that drive?
I know that the system is supposed to perform better when the pagefile is set on a separate physical drive than the system, but continuous writing to that drive kind of throws a monkey wrench into the equation. It works fine for me, but I'd like to hear any technical answers, if possible. -
The idea is that your drive is more or less limited to doing one thing at a time. If it's paging, the drive heads have to seek to the right spot for the pagefile, do their thing, seek back to the next file the system needs to access, maybe then seek back to the pagefile, etc. The system will often perform better with the pagefile and system on different drives because your OS and application files can be accessed independently of the pagefile (at least with SCSI or reasonably modern UDMA systems), which is also the reason that captures are likely to run better with a dedicated drive.
In theory, the pagefile being on the same drive as the capture file could cause you more dropped frames. In practice, I have a machine running Windows 2000 with 128MB and a single 80GB drive and I can still capture HuffyUV in VirtualDub at 480x480 or 352x480 from a WinTV card without enough dropped frames to be noticeable.
Bottom line - if it works, go with it. If you find you have more dropped frames than you like, try shifting some or all of the pagefile to C: and see if you have any less. Like I said, those are just guidelines from seeing what worked best in server/workstation environments, capture is a little different.