Question

We have a command-line exe that takes input from a text file and produces an output text file. It is used for complex industrial simulations.

The source code for this exe is long gone. It was easy enough to create a .NET wrapper that controls the execution of this exe and links to an external app via a web service.

Unfortunately, a new requirement is to run optimization over this black-box model. There are various methods for black-box optimization, but they all require calling the executable thousands (millions?) of times. It's obvious that creating and parsing disk-based text files is the bottleneck of the simulation process.

Is there any way I can trick this executable into not writing to a physical disk? If we were on Unix, I suppose pipes would do the trick, but our deployment server runs Windows Server 2003.

It just occurred to me that a RAM drive might solve this problem, but I haven't played with one of those since MS-DOS 6. Are there any commercial products worth looking at? Does anyone have other ideas for emulating a physical drive through code? We are on .NET 3.5.

OTHER TIPS

If you're running on Vista, there's a commercial RAM disk product that may suit. The x64 version may be needed if your system is already using most of its memory, to make sure you don't end up doing too much page swapping.

Another option is to spend a bit of cash on a 15,000 RPM disk or an SSD (solid-state drive), although either will still be slower than a RAM disk.

In the long term, though, it may be cheaper to reverse-engineer the processing tool and rewrite it from scratch to avoid the bottlenecks.

Piping output is possible in Windows as well: if the executable writes its output only to standard out, you can capture that stream directly.
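
Here's a minimal sketch of what that could look like from the .NET wrapper, assuming the legacy tool will read its input from standard in and write results to standard out; the name simulate.exe and that console behaviour are assumptions, not known properties of the original exe:

    using System;
    using System.Diagnostics;

    class PipedRunner
    {
        // Runs the hypothetical simulate.exe once, feeding the input text
        // through stdin and collecting the result from stdout, so no
        // temporary files ever touch the disk.
        static string RunSimulation(string inputText)
        {
            ProcessStartInfo psi = new ProcessStartInfo("simulate.exe");
            psi.UseShellExecute = false;        // required for redirection
            psi.RedirectStandardInput = true;
            psi.RedirectStandardOutput = true;
            psi.CreateNoWindow = true;

            using (Process p = Process.Start(psi))
            {
                p.StandardInput.Write(inputText);
                p.StandardInput.Close();        // signal end-of-input

                // Read stdout to the end before waiting; waiting first can
                // deadlock if the child fills its output buffer.
                string output = p.StandardOutput.ReadToEnd();
                p.WaitForExit();
                return output;
            }
        }
    }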

If it actually writes to a file: since Windows caches filesystem writes, writing and then promptly deleting small files may be almost as fast as a RAM disk. Have you actually tried running the program at a realistic rate and deleting all output after each run? If you keep an eye on CPU usage and the disk queue, you should get an idea of whether plain old disk caching suffices.
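
As a rough way to test that, you could time a loop that writes the input, runs the tool, then reads and immediately deletes the output while Performance Monitor shows the disk queue. A sketch, with simulate.exe, the file names, and the input-file-as-argument convention all assumed for illustration:

    using System;
    using System.Diagnostics;
    using System.IO;

    class CacheBenchmark
    {
        static void Main()
        {
            const string inputPath = "run.in";      // hypothetical paths
            const string outputPath = "run.out";
            string inputText = File.ReadAllText("template.in");

            long totalOutputChars = 0;
            Stopwatch sw = Stopwatch.StartNew();

            for (int i = 0; i < 1000; i++)
            {
                File.WriteAllText(inputPath, inputText);

                ProcessStartInfo psi = new ProcessStartInfo("simulate.exe", inputPath);
                psi.UseShellExecute = false;
                psi.CreateNoWindow = true;
                using (Process p = Process.Start(psi))
                {
                    p.WaitForExit();
                }

                string result = File.ReadAllText(outputPath);
                totalOutputChars += result.Length;

                // Delete immediately so the small files live and die in the
                // OS write cache instead of accumulating on disk.
                File.Delete(outputPath);
                File.Delete(inputPath);
            }

            sw.Stop();
            Console.WriteLine("Average per run: {0:F2} ms ({1} output chars)",
                              sw.ElapsedMilliseconds / 1000.0, totalOutputChars);
        }
    }

If the disk queue stays near zero while CPU usage stays high during such a run, the cache is already absorbing the file I/O and a RAM disk would buy you little.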

If you run it in a virtual machine, caching for its virtual disks is the responsibility of the host OS, which means you can run your Windows server inside a VM on more or less anything and get an extra layer of caching. Would that work in your environment?

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow