Question

I have a WCF service hosted in Azure.

I have a Spatialite database file (1.1 GB) that I'm going to keep in Azure blob storage. Compressed, it is 500 KB.

I would like to copy it to local storage when my service starts, and then use Spatialite to run various spatial functions against the database file. The spatial data is static.

Does anybody have a C# code snippet to copy a file from Azure blob storage to local storage?

(also, I think this approach makes sense - does it?)

(also, should I bother compressing the file for blob storage?)
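
For context on the compression question: if the blob is stored compressed, the download step just needs a decompress on arrival. A minimal sketch, assuming a gzip-compressed public blob whose URL is in blobUrl and a local resource named "MyLocalStorage" (both names illustrative):

using System.IO;
using System.IO.Compression;
using System.Net;
using Microsoft.WindowsAzure.ServiceRuntime;

// Download the gzipped blob, then inflate it into local storage
LocalResource local = RoleEnvironment.GetLocalResource("MyLocalStorage");
string gzPath = Path.Combine(local.RootPath, "mySpatialiteDB.sqlite.gz");
string dbPath = Path.Combine(local.RootPath, "mySpatialiteDB.sqlite");

new WebClient().DownloadFile(blobUrl, gzPath); // blobUrl assumed defined elsewhere

using (FileStream compressed = File.OpenRead(gzPath))
using (GZipStream gzip = new GZipStream(compressed, CompressionMode.Decompress))
using (FileStream output = File.Create(dbPath))
{
    gzip.CopyTo(output); // CopyTo requires .NET 4+
}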

Thanks

EDIT: Thanks for the first two responses. I was hoping for some code snippets to use, and I could use a little more explanation of which route is better: just code it all, or use this bootstrap idea.

SOLUTION: I'm marking smarx's as the answer because it should work for any protected Azure file. Since my file is a publicly available blob, though, I skipped the CloudStorageAccount route he suggested in favor of simple web access. I'm still wondering whether smarx's approach has any speed advantages; comments would be appreciated.

// Retrieve an object that points to the local storage resource
LocalResource localResource = RoleEnvironment.GetLocalResource("MyLocalStorage");

// Download the (public) blob over plain HTTP into local storage.
// Path.Combine avoids depending on whether RootPath ends with a separator.
WebClient webClient = new WebClient();
webClient.DownloadFile(blobUrl, Path.Combine(localResource.RootPath, "mySpatialiteDB.sqlite"));

NOTE: You have to configure the local storage resource through your web role's properties; the setting ends up in ServiceDefinition.csdef.
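
For reference, a minimal sketch of that configuration as it appears in ServiceDefinition.csdef (the role name and size are illustrative, and the exact element layout varies slightly by SDK version):

<ServiceDefinition name="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MyWebRole">
    <LocalResources>
      <!-- 2048 MB leaves headroom above the 1.1 GB database file -->
      <LocalStorage name="MyLocalStorage" cleanOnRoleRecycle="false" sizeInMB="2048" />
    </LocalResources>
  </WebRole>
</ServiceDefinition>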


Solution

How about: CloudStorageAccount.Parse(...).CreateCloudBlobClient().GetBlobReference("path/of/blob").DownloadToFile(RoleEnvironment.GetLocalResource("nameOfLocalResource").RootPath + "filename");

Do that in your RoleEntryPoint's OnStart, before you do anything else.
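
Put together, a minimal sketch of that (assuming the v1 Microsoft.WindowsAzure.StorageClient library; the connection-string setting name, container/blob path, and resource name are illustrative):

using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Pull the storage connection string from the role's configuration
        CloudStorageAccount account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));

        // Resolve the blob and the local storage target
        CloudBlob blob = account.CreateCloudBlobClient()
            .GetBlobReference("mycontainer/mySpatialiteDB.sqlite");
        LocalResource local = RoleEnvironment.GetLocalResource("MyLocalStorage");

        // Download before the role starts serving requests
        blob.DownloadToFile(Path.Combine(local.RootPath, "mySpatialiteDB.sqlite"));

        return base.OnStart();
    }
}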
