Question

I have several applications that need real memory, and they consume a lot of it. One solution would be to have every piece of memory inside the applications LOCKED (VirtualLock), but that would require many hours to do right, since the apps are in .NET.
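
For reference, locking a single buffer from C# looks roughly like this (a minimal P/Invoke sketch; the 1 MB size is illustrative, and a real app would have to do this for every allocation that matters, which is where the hours go):

    using System;
    using System.Runtime.InteropServices;

    class LockDemo
    {
        // VirtualLock pins a range of pages into physical RAM;
        // VirtualUnlock releases them again.
        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool VirtualLock(IntPtr lpAddress, UIntPtr dwSize);

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool VirtualUnlock(IntPtr lpAddress, UIntPtr dwSize);

        static void Main()
        {
            int size = 1024 * 1024;                     // 1 MB, illustrative
            IntPtr buffer = Marshal.AllocHGlobal(size); // unmanaged, so the GC won't move it

            // Fails (e.g. ERROR_WORKING_SET_QUOTA) if the process working set
            // minimum is too small for the locked range.
            if (!VirtualLock(buffer, (UIntPtr)(uint)size))
                throw new System.ComponentModel.Win32Exception();

            // ... use the buffer; these pages cannot be written to the page file ...

            VirtualUnlock(buffer, (UIntPtr)(uint)size);
            Marshal.FreeHGlobal(buffer);
        }
    }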

So, my question is: is DISABLING the swap file (I have PLENTY of RAM on the machine) a valid strategy to ensure that everything will really be in memory?

Update:

Let me repeat - I know that this might be a very DIRTY way of doing things and may break the operation of the whole OS, but I'll take full responsibility and cope with the consequences. I just want to know what I might run into, problem-wise.

Here's what other StackExchangers think about it: https://serverfault.com/questions/23621/any-benefit-or-detriment-from-removing-a-pagefile-on-an-8gb-ram-machine

Solution

There is one thing that you cannot force into memory by doing that: executable images and mapped files. Those are each a "page file" of their own. When memory pressure occurs, Windows detects that their in-memory pages have not been modified and simply discards them, because they can be reloaded from the backing file later.
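
To make that concrete, here is a small C# illustration (the file name data.bin is hypothetical): pages of a file-backed view behave exactly like the code pages of an EXE or DLL.

    using System.IO;
    using System.IO.MemoryMappedFiles;

    class MappedDemo
    {
        static void Main()
        {
            // Pages of this view are backed by data.bin itself, not by the
            // page file. Under memory pressure Windows can drop the clean
            // pages and re-read them from the file later - exactly what
            // happens to the code pages of an EXE or DLL.
            using (var mmf = MemoryMappedFile.CreateFromFile(
                       "data.bin", FileMode.OpenOrCreate, null, 1024 * 1024))
            using (var view = mmf.CreateViewAccessor())
            {
                view.Write(0, 42L); // a dirty page; this one is written back to data.bin
            }
        }
    }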

Everything that is not file-backed cannot be paged out (there is just no place to put it). So I guess your technique would work in practice.

You won't see a lot of problems. I run without a paging file all the time (16 GB RAM). You lose the ability to capture full memory dumps in case of a blue screen, but most likely you don't need that.

Just make sure that you never hit the physical memory limit, or else a lot of programs will crash hard. Hardly anybody writes programs to cope with OOM situations (except core Windows components, which I have never seen crash from that - good job by them).
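
If you do run without a paging file, it is worth watching the headroom programmatically. Here is a minimal C# sketch around the Win32 GlobalMemoryStatusEx call (the 10% threshold is an arbitrary example):

    using System;
    using System.Runtime.InteropServices;

    class MemoryWatch
    {
        [StructLayout(LayoutKind.Sequential)]
        struct MEMORYSTATUSEX
        {
            public uint dwLength;
            public uint dwMemoryLoad;   // percent of physical memory in use
            public ulong ullTotalPhys;
            public ulong ullAvailPhys;
            public ulong ullTotalPageFile;
            public ulong ullAvailPageFile;
            public ulong ullTotalVirtual;
            public ulong ullAvailVirtual;
            public ulong ullAvailExtendedVirtual;
        }

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool GlobalMemoryStatusEx(ref MEMORYSTATUSEX lpBuffer);

        static void Main()
        {
            var status = new MEMORYSTATUSEX();
            status.dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
            GlobalMemoryStatusEx(ref status);

            // With no page file, available physical memory is all the headroom there is.
            if (status.ullAvailPhys < status.ullTotalPhys / 10)
                Console.WriteLine("WARNING: less than 10% of physical memory left");
        }
    }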

OTHER TIPS

I just need to ensure that my data is in memory at all times

But what good is having the data in memory without the code needed to access it? You can be sure your data is in RAM, but you can't be sure the code will be, and you'll incur the exact same kind of delay when code needs to be paged back in. The first set of candidates are the .NET Framework assemblies: they are prejitted ("ngen-ed") and backed by their *.ni.dll files. This is something you can do something about: you could delete them from the GAC directories (the gac_64 and gac_msil directories under c:\windows\assembly). The price you'll pay is a significantly slower warm start and no sharing when you run more than one .NET program. The next set of candidates are the Windows operating system DLLs. There is nothing you can do about those.
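
For what it's worth, you can see which prejitted native images your process actually mapped by listing its loaded modules - on a .NET Framework process they carry the *.ni.dll suffix (a small diagnostic sketch):

    using System;
    using System.Diagnostics;

    class NativeImageList
    {
        static void Main()
        {
            // Every module listed here is a mapped file: its unmodified pages
            // can be discarded under memory pressure and reloaded from disk.
            foreach (ProcessModule module in Process.GetCurrentProcess().Modules)
            {
                if (module.FileName.EndsWith(".ni.dll", StringComparison.OrdinalIgnoreCase))
                    Console.WriteLine(module.FileName);
            }
        }
    }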

Another thing to fret about is the file system cache. As RAM usage goes up, the cache shrinks, which significantly slows down disk access. Writes in particular will be horribly slow if they can't be cached, going from a few microseconds to seconds.

It may well be that none of this is an actual concern if your RAM usage is highly predictable, stays well below the limit, and you only ever run one program on the machine. You'll have to monitor it to make sure that's the case. Having to worry about RAM significantly defeats the purpose of having bought a lot of it, particularly when you have no guarantee that disabling the paging file actually buys you anything.

I tried disabling the page-file. It didn't end well...

I'll just say that it turned into a wild goose chase. Disabling the page file put EXTREMELY big pressure on disk I/O, which now took all the punches.

Summary: in my case, having Windows balance file cache/application memory usage is the clear winner.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow