Question

I have a project using SubSonic that I developed in Visual Studio 2008 on the C: drive. No problem there. I've just upgraded to Visual Studio 2010 (and as my computer coincidentally died, I'm now running Windows XP virtualised with VirtualBox).

The project runs without complaint on the C: drive, but if I run it from G: (a mapped drive which points to a partition on the base PC), I can't run the custom tools SubSonic uses (error shown below), OR run the web application ('start without debugging' gives me: Failed to start monitoring changes to 'G:\GPNNT\GpnntApp\GpnntApp').

This is a .net 3.5 solution.

(screenshot of the custom tool error)

This would seem to be a well-documented and straightforward problem. I have taken the following actions:

(1) BATCH FILE

c:
cd "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727"
caspol -all -reset
caspol -q -machine -addgroup 1 -url file:////g:\* FullTrust -name "G Drive"
caspol -q -machine -addgroup 1 -url g:\* FullTrust -name "G Drive 1"


c:
cd "C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319"
caspol -all -reset
caspol -q -machine -addgroup 1 -url file:////g:\* FullTrust -name "G Drive"
caspol -q -machine -addgroup 1 -url g:\* FullTrust -name "G Drive 1"

pause

(I have tried a zillion different url formats, all to no avail)
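
For anyone trying the same thing, this is a sketch of the two URL forms caspol most commonly accepts for a drive, plus a verification step; the parent group `1` assumes the default machine-level policy tree, and the group names are illustrative:

```shell
rem Sketch only: common caspol URL forms for granting FullTrust to a drive.
rem Group "1" assumes the default machine policy tree; names are illustrative.
caspol -q -machine -addgroup 1 -url "file://g:/*" FullTrust -name "G Drive (file URL)"
caspol -q -machine -addgroup 1 -url "g:\*" FullTrust -name "G Drive (path)"

rem List the resulting code groups to confirm the entries took effect:
caspol -machine -listgroups
```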

(2) The .Net 2.0 Configuration utility (Control Panel > Admin Tools)

Using the analysis tool, both settings made in the batch file above appear to apply to files on the drive.
I also tried setting the intranet group to FullTrust (something I'd rather not do!). No difference.

(3) loadFromRemoteSources

It is reasonable to assume that although the project itself only uses the .NET 2 runtime, VS2010 itself might use .NET 4 internally. After some more googling (eg here), I added

<runtime>
  <loadFromRemoteSources enabled="true"/>
</runtime>

to both .net versions' machine.config files.
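
For context, the element belongs directly under the root `configuration` element; a minimal fragment is sketched below. Note that `loadFromRemoteSources` was introduced with the .NET 4 CLR, so the entry added to the 2.0 machine.config is simply ignored by that runtime:

```xml
<!-- Placement sketch: <loadFromRemoteSources> sits under <configuration>/<runtime>.
     Only the .NET 4 runtime honours it; the 2.0 CLR ignores the element. -->
<configuration>
  <runtime>
    <loadFromRemoteSources enabled="true"/>
  </runtime>
</configuration>
```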

(4) Upgrading to VS2010 SP1

None of these has made an iota of difference. Can anyone shed light on this before my blood pressure reaches dangerously high levels? I suppose I can go back to running everything off C:, but it does seem a bit ridiculous in this age of virtualisation. I really want the data in a different place to the VM.

I note this SO post has the same problem, and blames Test projects, which is not terribly satisfactory. I also don't have a test project, although there may be test references buried in the SubSonic dlls somewhere I suppose.

LAST MINUTE ADDITION: I also notice SQL Server 2005/8 won't talk to G: (eg. recover a backup from there), and assume any solution would also allow this to occur. That would be another great-to-have.


Solution

For posterity, here are some results of my investigations.

Mapped drives in VS 2010:

  • there is an initial message about loading a project from an unsafe location. This is fixed using CASPOL as described. CASPOL is pretty flexible with its URLs and accepts both of the formats shown. However, CAS policy is disabled in .NET 4 by default, so the settings make no difference there (see the reasons here).
  • after that, several more problems appear; I didn't document them, but after fixing each one another sprang up. loadFromRemoteSources fixed one message, but nothing I did could touch 'Failed to start monitoring changes ...'. On this point, Christoph's answer is probably correct (at least for .NET 2) in that you'd have to set up each assembly on the drive individually, which is totally impractical for a VS project drive.
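
For completeness, legacy CAS policy can be re-enabled per-application on the .NET 4 CLR with a config switch, sketched below. It's of limited use for this scenario, since Visual Studio itself (devenv.exe) would need the switch, but it explains why the caspol settings are ignored under .NET 4 by default:

```xml
<!-- Per-application switch to re-enable legacy CAS policy on the .NET 4 CLR.
     Goes in the application's .config file; devenv.exe would need it for the
     VS-side errors, so this is included for completeness only. -->
<configuration>
  <runtime>
    <NetFx40_LegacySecurityPolicy enabled="true"/>
  </runtime>
</configuration>
```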

So, not surprisingly, I think it is just going to be too painful to store VS projects on a mapped drive. Source control and a local project are the way to go. Frankly, the lack of undo on a networked drive is a pain for dev work as well IMO.

BUT

the original issue was not so much that I needed a networked drive as that I didn't want to store the project on C: of my VM (ie. I wanted to be able to back up the data separately from the VM based drive image).

The answer staring me in the face the whole time, was to create a second virtual disk and attach it to the VM as G:. It's a local drive, so I don't get all the trust issues, but I get full data separation. I keep all the data on that drive in a Dropbox folder, giving me full realtime offsite backup as well, and that's me happy.
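
If it helps anyone, creating and attaching that second disk can be scripted from the host's command line. The VM name, controller name, path and size below are placeholders for your own setup:

```shell
rem Sketch (run on the VirtualBox host): create a 20 GB data disk and attach it.
rem "WinXP-Dev" and "IDE Controller" are placeholder names - check yours with:
rem   VBoxManage showvminfo "WinXP-Dev"
VBoxManage createhd --filename "D:\VMs\gpnnt-data.vdi" --size 20480
VBoxManage storageattach "WinXP-Dev" --storagectl "IDE Controller" ^
    --port 1 --device 0 --type hdd --medium "D:\VMs\gpnnt-data.vdi"
```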

OTHER TIPS

Maybe I'm wrong (I did not look this up), but I think you need to work with strong names when working on mapped drives. For example:

caspol -m -q -ag My_Machine_Zone -url g:\* Nothing -n "GDrive"

caspol -m -q -ag "GDrive" -strong -file "<pathToFile>" "<assemblyStrongName>" "<assemblyVersion>" FullTrust -n "GDriveFileX" -d "Code Group for fileX"

I would always create a top-level group so you can easily modify or remove the policy later on. I think it makes sense not to grant full trust to a mapped drive based on the URI alone, even though this is painful.
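
To get the strong-name details that caspol's -strong option needs, sn.exe (from the SDK) can read them from the assembly itself; the path and assembly name below are purely illustrative:

```shell
rem Sketch: print the public key and public key token of an assembly.
rem The path is illustrative - point it at the actual DLL you want to trust.
sn -Tp "G:\GPNNT\GpnntApp\bin\SubSonic.dll"
```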

If possible, use something more modern.

Licensed under: CC-BY-SA with attribution