Question

I am looking for a robust way to copy files over a Windows network share that is tolerant of intermittent connectivity. The application is often used on wireless, mobile workstations in large hospitals, and I'm assuming connectivity can be lost either momentarily or for several minutes at a time. The files involved are typically about 200KB - 500KB in size. The application is written in VB6 (ugh), but we frequently end up using Windows DLL calls.

Thanks!

Solution

I'm unclear as to what your actual problem is, so I'll throw out a few thoughts.

  • Do you want restartable copies (with such small file sizes, that doesn't seem like it'd be that big of a deal)? If so, look at CopyFileEx with COPY_FILE_RESTARTABLE.
  • Do you want verifiable copies? Sounds like you already have that by verifying hashes.
  • Do you want better performance? It's going to be tough, as it sounds like you can't run anything on the server. Otherwise, TransmitFile may help.
  • Do you just want a fire-and-forget operation? I suppose shelling out to robocopy, TeraCopy, or something similar would work, but it seems a bit hacky to me.
  • Do you want to know when the network comes back? IsNetworkAlive has your answer.

Based on what I know so far, I think the following pseudo-code would be my approach:

sourceFile = Compress("*.*");
destFile = "X:\files.zip";

int copyFlags = COPY_FILE_FAIL_IF_EXISTS | COPY_FILE_RESTARTABLE;
while (CopyFileEx(sourceFile, destFile, null, null, false, copyFlags) == 0) {
   do {
     // optionally, increment a failed counter to break out at some point
     Sleep(1000);
   } while (!IsNetworkAlive(NETWORK_ALIVE_LAN));
}

Compressing the files first saves you from tracking which files you've successfully copied and which you need to restart. It should also make the copy go faster (smaller total size, and one larger file instead of many small ones), at the expense of some CPU time on both sides. A simple batch file can decompress it on the server side.
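If it helps to see the pseudo-code fleshed out, here's a rough, untested C-style sketch using the real Win32 signatures. The paths, the retry limit, and the helper name are mine; note that IsNetworkAlive (sensapi.dll) actually reports the connection type through an out parameter rather than taking a flag. A VB6 caller would do the same thing through Declare statements.

// Untested sketch of the retry loop above with real Win32 signatures.
// Paths and the failure limit are placeholders.
#include <windows.h>
#include <sensapi.h>
#include <stdio.h>

#pragma comment(lib, "sensapi.lib")

static BOOL LanIsAlive(void)
{
    DWORD flags = 0;
    // IsNetworkAlive reports the kind of connectivity via the out flag.
    return IsNetworkAlive(&flags) && (flags & NETWORK_ALIVE_LAN);
}

int main(void)
{
    const char *sourceFile = "C:\\outbox\\files.zip";        // produced by the compress step
    const char *destFile   = "\\\\server\\share\\files.zip";
    DWORD copyFlags = COPY_FILE_FAIL_IF_EXISTS | COPY_FILE_RESTARTABLE;
    int failures = 0;

    while (!CopyFileExA(sourceFile, destFile, NULL, NULL, NULL, copyFlags)) {
        if (++failures > 120) {                              // eventually give up
            fprintf(stderr, "copy failed: %lu\n", GetLastError());
            return 1;
        }
        do {
            Sleep(1000);                                     // wait for the network to come back
        } while (!LanIsAlive());
    }
    return 0;
}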

OTHER TIPS

I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across.

Try using BITS (Background Intelligent Transfer Service). It's the infrastructure that Windows Update uses, is accessible via the Win32 API, and is built specifically to address this.

It's usually used for application updates, but should work well in any file moving situation.

http://www.codeproject.com/KB/IP/bitsman.aspx
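For a rough idea of what that looks like in code, here's an untested C++ sketch that queues a BITS download of one file from a UNC share via the IBackgroundCopyManager COM interface; the job name and paths are placeholders, and real code would register an IBackgroundCopyCallback rather than polling. (If I remember correctly, upload jobs additionally need the BITS server extension on IIS at the receiving end.)

// Untested sketch: queue a BITS download of one file from a UNC share.
// The job name and paths are placeholders.
#include <windows.h>
#include <bits.h>
#include <stdio.h>

#pragma comment(lib, "ole32.lib")

int main(void)
{
    if (FAILED(CoInitializeEx(NULL, COINIT_APARTMENTTHREADED)))
        return 1;

    IBackgroundCopyManager *mgr = NULL;
    HRESULT hr = CoCreateInstance(__uuidof(BackgroundCopyManager), NULL,
                                  CLSCTX_LOCAL_SERVER,
                                  __uuidof(IBackgroundCopyManager), (void **)&mgr);
    if (SUCCEEDED(hr)) {
        IBackgroundCopyJob *job = NULL;
        GUID jobId;
        hr = mgr->CreateJob(L"WardFileSync", BG_JOB_TYPE_DOWNLOAD, &jobId, &job);
        if (SUCCEEDED(hr)) {
            // Remote name first, local name second. BITS persists the job and
            // keeps retrying across network drops, logoffs, and reboots.
            hr = job->AddFile(L"\\\\server\\share\\files.zip",
                              L"C:\\inbox\\files.zip");
            if (SUCCEEDED(hr))
                hr = job->Resume();

            // Crude polling; real code would use IBackgroundCopyCallback.
            BG_JOB_STATE state = BG_JOB_STATE_QUEUED;
            while (SUCCEEDED(hr) && state != BG_JOB_STATE_TRANSFERRED &&
                   state != BG_JOB_STATE_ERROR) {
                Sleep(1000);
                hr = job->GetState(&state);
            }
            if (state == BG_JOB_STATE_TRANSFERRED)
                job->Complete();   // commit the temp file to its final name
            else
                job->Cancel();
            job->Release();
        }
        mgr->Release();
    }
    CoUninitialize();
    return SUCCEEDED(hr) ? 0 : 1;
}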

I agree with Robocopy as a solution... that's why the utility is called "Robust File Copy".

By default it will retry a million times, which should be plenty for your intermittent connection.

It also does restartable transfers, and you can even throttle the transfer with a gap between packets (the /IPG switch), assuming you don't want to use all the bandwidth because other programs are using the same connection.
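For illustration, an invocation along these lines should do it; the paths are placeholders, and the /R and /W values shown are just the defaults made explicit:

robocopy C:\outbox \\server\share *.* /Z /R:1000000 /W:30 /IPG:50

/Z turns on restartable copies, /R and /W control the retry count and the wait between retries, and /IPG adds an inter-packet gap in milliseconds to throttle the transfer.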

How about simply sending a hash before or after you send the file, and comparing it against a hash of the file that arrived? That should at least make sure you received an intact copy.

If you want to go all out, you could do the same process for small parts of the file, then join the pieces on the receiving end once you have them all.
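Here's a rough, untested C-style sketch of the hashing side using the legacy CryptoAPI, which fits the VB6/Win32 era; the path is a placeholder, and you'd compute the same digest on both ends and compare the hex strings.

// Untested sketch: SHA-1 of a file via the legacy CryptoAPI. Path is a placeholder.
#include <windows.h>
#include <wincrypt.h>
#include <stdio.h>

#pragma comment(lib, "advapi32.lib")

int main(void)
{
    HCRYPTPROV prov = 0;
    HCRYPTHASH hash = 0;
    BYTE digest[20];                 // SHA-1 digest is 20 bytes
    DWORD digestLen = sizeof(digest);
    BYTE buf[64 * 1024];
    DWORD read = 0;

    HANDLE file = CreateFileA("C:\\data\\report.dat", GENERIC_READ, FILE_SHARE_READ,
                              NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (file == INVALID_HANDLE_VALUE) return 1;

    if (CryptAcquireContext(&prov, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT) &&
        CryptCreateHash(prov, CALG_SHA1, 0, 0, &hash)) {
        // Feed the file through the hash in chunks.
        while (ReadFile(file, buf, sizeof(buf), &read, NULL) && read > 0)
            CryptHashData(hash, buf, read, 0);

        if (CryptGetHashParam(hash, HP_HASHVAL, digest, &digestLen, 0)) {
            for (DWORD i = 0; i < digestLen; i++)
                printf("%02x", digest[i]);   // send this alongside the file
            printf("\n");
        }
    }

    if (hash) CryptDestroyHash(hash);
    if (prov) CryptReleaseContext(prov, 0);
    CloseHandle(file);
    return 0;
}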

Hm, it seems rsync can do this, and it doesn't need the server/daemon install I thought it did: just $ rsync src dst.

SMS, if it's available, works.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow