Question

I often have to transfer large files (>50 GB, sometimes >100 GB) between drives, both internal and external, during backups of our network's email servers. What is the best method of transferring these files? A command-line tool such as XCOPY? Ideally something robust enough to continue the transfer if interrupted due to time limits or network issues.


Solution

Check out robocopy. From Wikipedia:

robocopy, or "Robust File Copy", is a command-line directory replication command. It was available as part of the Windows Resource Kit, and introduced as a standard feature of Windows Vista and Windows Server 2008.
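For a resumable large-file copy, the key robocopy switch is /Z (restartable mode), which lets an interrupted file transfer pick up where it left off. A minimal sketch (the source and destination paths here are just placeholders):

```shell
:: Copy all subdirectories (including empty ones) in restartable mode,
:: retrying 5 times with a 10-second wait, and appending to a log file.
robocopy D:\Mail E:\Backup\Mail /E /Z /R:5 /W:10 /LOG+:C:\logs\mailcopy.log
```

On Windows 7 / Server 2008 R2 and later, adding /MT can speed things up with multithreaded copying, though /MT and /Z cannot be combined on some versions, so test on your setup.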

OTHER TIPS

For free, I use SyncToy (from Microsoft). That way, if something fails, it doesn't abort the whole transfer.

The next best option for non-repetitive tasks, IMHO, is XCopy.
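XCopy also has a restartable mode via the /Z flag, along with /C to keep going past errors. A minimal sketch (paths are placeholders):

```shell
:: /E copy subdirectories including empty ones, /C continue on errors,
:: /H include hidden/system files, /Z restartable mode, /Y suppress overwrite prompts
xcopy D:\Mail E:\Backup\Mail /E /C /H /Z /Y
```

Note that /Z makes XCopy noticeably slower, since it tracks progress so the copy can resume after an interruption.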

I have used Teracopy with good success.

I get asked this question every now and again, and I always give the same answer: Microsoft Background Intelligent Transfer Service (BITS). This is the same technology used to deliver large service packs and updates to workstations. Some of its features:

  • Network Throttling
  • Asynchronous Transfers
  • Auto-Resume
  • Priority Levels for Downloads
  • Proven Transfer Mechanism

For those not wanting to deal with the command-line syntax, you can explore wrapper applications, such as SharpBITS.NET, that provide a graphical interface.
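On recent Windows versions, BITS jobs can also be driven from PowerShell via the BitsTransfer module; a rough sketch (the share and file names are placeholders):

```powershell
# Import the BITS cmdlets and queue an asynchronous, low-priority transfer
# that BITS will throttle and auto-resume across interruptions.
Import-Module BitsTransfer
Start-BitsTransfer -Source \\server\share\mail.bak -Destination D:\Backup\mail.bak `
    -Asynchronous -Priority Low -DisplayName "MailBackup"

# Later, finalize any jobs that have finished transferring.
Get-BitsTransfer | Where-Object { $_.JobState -eq "Transferred" } | Complete-BitsTransfer
```

An asynchronous BITS job survives reboots and network drops, which is exactly the resumability the question asks for.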

I use CopyHandler and find it does the job well.

I use http://itrnsfr.com to transfer my big files online. I wish they would extend the quota beyond the 2 GB they currently offer to free users.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow