Question

We have a .NET Framework 4 software solution with numerous .dll files.

Those files are hosted on a network server, and run from clients on a common remote folder.

We want to reduce the number of .dll files in this server folder.

Some questions arise:

  • Will the bigger merged .dll be slower or faster to start or to execute than numerous smaller .dll files?
  • Is there a benefit to using NGen on network-hosted library files, in order to optimize them for each client?

Most of these DLLs are in fact called from 32-bit unmanaged code, via a COM-visible interface.


Solution

A single larger DLL shouldn't be any slower to start or execute than multiple smaller DLLs. It could potentially be faster to start, since the operating system wouldn't have to do as much initialization work. I doubt, however, that you'd notice the difference. Having fewer DLLs will reduce the memory footprint of your program by a little bit. Again, not a whole lot.
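If the goal is simply to reduce the number of DLLs on the share, tools such as ILMerge (or the open-source ILRepack) can combine several managed assemblies into one. A hedged sketch of an ILMerge invocation follows; the assembly names are hypothetical, and note that merging changes assembly identity, so any COM-visible types would need to be re-registered (e.g. with regasm) against the merged assembly:

```bat
rem Fold Helper1.dll and Helper2.dll into Primary.dll, producing Merged.dll.
rem /targetplatform:v4 targets the .NET Framework 4 runtime.
ILMerge.exe /out:Merged.dll /targetplatform:v4 Primary.dll Helper1.dll Helper2.dll
```

Verify the merged assembly's COM registration and strong-name signing afterwards, since both are tied to the assembly's identity.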

I would not recommend relying on NGen for DLLs that are served over the network. NGen compiles and optimizes for the processor of the machine on which it runs, and it installs the resulting native images into that machine's local native image cache, so it would have to be run separately on every client. A native image generated for a different architecture would be suboptimal or simply fail to load.
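If you did want per-client native images, NGen would have to be run on each client from an elevated prompt. A sketch, with a hypothetical local path; whether NGen accepts a UNC path directly can depend on trust settings, so copying the assembly locally first is the safer assumption:

```bat
rem Run on each client machine, elevated. Generates a native image for the
rem assembly (and its dependencies) and installs it into this machine's
rem local native image cache.
ngen.exe install "C:\App\MyLibrary.dll"
```

Keep in mind the cached image is invalidated whenever the assembly or its dependencies change, so every update to the server copy means re-running NGen on every client.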

Additions after comments:

See Improving Application Startup Time for more info on improving startup time. Also Writing High-Performance Managed Applications : A Primer.

Also note that the loader doesn't JIT the entire assembly. It JITs on an as-needed basis. If your program doesn't use a class that's in the assembly, that class's code will never be JITted. Furthermore, a method isn't JITted until first use. So if you never call the method Foo.Bar(), then it will never be JITted.
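This as-needed behavior also means you can warm up known hot paths yourself at startup. RuntimeHelpers.PrepareMethod forces a method to be JIT-compiled ahead of its first call; below is a minimal sketch (the JitWarmup class and its usage are illustrative, not part of any library):

```csharp
using System;
using System.Reflection;
using System.Runtime.CompilerServices;

static class JitWarmup
{
    // Force JIT compilation of every concrete, non-generic method declared
    // on a type, so the cost is paid at startup rather than on first call.
    public static void PrepareAllMethods(Type type)
    {
        foreach (var method in type.GetMethods(
            BindingFlags.DeclaredOnly | BindingFlags.Public |
            BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.Static))
        {
            if (!method.IsAbstract && !method.ContainsGenericParameters)
                RuntimeHelpers.PrepareMethod(method.MethodHandle);
        }
    }
}
```

This is usually only worthwhile for a handful of types on a latency-sensitive startup path; JIT-compiling everything up front just moves the cost around.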

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow