Problem

I have a common ASP.NET web application that is shared by multiple sites. I used NuGet to package this common web application and distribute it across the multiple sites, following this idea: Multiple ASP.NET web applications depending on a common base application.

By doing this, I ran into some issues. When updating to a new version, NuGet becomes really slow, because of the 800 content files the package contains. NuGet takes around 1~2 seconds for each content file it needs to uninstall, which adds up to approximately 25 minutes for the uninstall and 5 minutes for the install, especially with a TFS binding. Looking at the NuGet source code, I figured out that the Visual Studio API that NuGet talks to is the bottleneck, causing Visual Studio to peak at around 100% CPU usage for the whole process.

So I thought: if Visual Studio is that slow, maybe I can do it without it. To my disappointment, the NuGet command line (which runs without Visual Studio) only downloads a package and unzips it. It won't update the project file, due to the fact that some PowerShell scripts could reference the DTE object... (although mine don't).

Now I'm wondering: what are my options?

  1. Doing some XML magic on the project file to add the content items (see the sketch after this list)? What are the drawbacks of doing this? Is there already a tool for it?
  2. Doing some magic with build scripts?
  3. Throwing the content files out of the package and using another tool like Bower for them? How would that be integrated into the project? Because eventually I want to see the content files in my project.
  4. Not using NuGet at all but something else? OpenWrap? Horn (which seems to no longer be active)? Or maybe no package manager at all?
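
For option 1, I imagine something like the sketch below (untested; the project path, namespace, and content folder are hypothetical), which injects <Content> items into the .csproj with System.Xml.Linq:

using System.IO;
using System.Linq;
using System.Xml.Linq;

namespace CsprojContentInjector
{
    public class Program
    {
        public static void Main()
        {
            // MSBuild project files use this XML namespace.
            XNamespace ns = "http://schemas.microsoft.com/developer/msbuild/2003";
            var projectPath = @"MyWebApp\MyWebApp.csproj";   // hypothetical web project
            var contentRoot = @"MyWebApp\Content\Shared";    // hypothetical content folder
            var doc = XDocument.Load(projectPath);

            // Collect the new <Content Include="..."/> items in their own ItemGroup.
            var itemGroup = new XElement(ns + "ItemGroup");
            doc.Root.Add(itemGroup);

            foreach (var file in Directory.EnumerateFiles(contentRoot, "*.*", SearchOption.AllDirectories))
            {
                // Make the path relative to the project directory.
                var relative = file.Substring(@"MyWebApp\".Length);

                // Skip files the project already references.
                var exists = doc.Descendants(ns + "Content")
                                .Any(c => (string)c.Attribute("Include") == relative);
                if (!exists)
                    itemGroup.Add(new XElement(ns + "Content", new XAttribute("Include", relative)));
            }

            doc.Save(projectPath);
        }
    }
}

The one drawback I can already see is that Visual Studio prompts to reload the project whenever the file changes on disk.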

Please help me find the best solution :)


One other thing that frustrates me is that when performing a NuGet update, it does an uninstall followed by an install. Why, when the changes between versions could be minimal?


Solution

One option: instead of adding 800 content files, you can embed them as resources in your shared library.

Then use something like EmbeddedResourceVirtualPathProvider (https://github.com/mcintyre321/EmbeddedResourceVirtualPathProvider) to serve them up.

That way NuGet is simply replacing the .dll instead of 800 individual files.

There are plenty of articles online about how to use VirtualPathProvider.

EDIT: Questions about performance

The solution seems relatively simple, so I would just try it and see if the performance is acceptable. The nice thing is that you don't have to do anything special with paths in your web app.

Here are the steps you need to do:

  1. In your resource library, set the content files to Embedded Resource. You can do that quickly by opening the .csproj file in a text editor and replacing <Content with <EmbeddedResource.
  2. Add a reference to your resource library to your web app.
  3. Add the EmbeddedResourceVirtualPathProvider NuGet package to your web app.
  4. Register the virtual path provider. Here's an example that registers it via an AppInitialize() method stored in the App_Code folder:
using System.Web.Hosting;
using TestResourceLibrary; // assumed namespace of the Marker type in the resource library

namespace TestWebProject.App_Code
{
    public class RegisterVirtualPathProvider
    {
        // ASP.NET calls a public static AppInitialize() method found in App_Code
        // once at application startup, so this is a convenient place to register.
        public static void AppInitialize()
        {
            HostingEnvironment.RegisterVirtualPathProvider(new EmbeddedResourceVirtualPathProvider.Vpp()
            {
                // Map the resource library assembly; the path points at that
                // project's source folder relative to the web application.
                {typeof (Marker).Assembly, @"..\TestResourceLibrary"},
            });
        }
    }
}
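
For completeness: Marker is assumed to be nothing more than an empty class in the resource library, so typeof(Marker).Assembly resolves to that assembly:

namespace TestResourceLibrary
{
    // Empty type whose only purpose is to identify the resource library
    // assembly via typeof(Marker).Assembly in the registration above.
    public class Marker
    {
    }
}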