Question

I want to move about 800 GB of data from an NTFS storage device to a FAT32 device (both are external hard drives) on a Windows system.

What is the best way to achieve this?

  1. Simply using cut and paste?
  2. Using the command prompt? (`move`)
  3. Writing a batch file to copy small chunks of data at a given interval?
  4. Using some specific application that does the job for me?
  5. Or any better idea?

What is the safest, most efficient, and fastest way to carry out such a time-consuming process?

Solution

Robocopy

You can restart the command and it will resume. I use it all the time over the network; it works on large files as well.
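A minimal sketch of such a restartable transfer (the drive letters and folder names `E:\data` and `F:\data` are placeholders, substitute your own):

```shell
:: /E copies subdirectories, including empty ones.
:: /Z copies files in restartable mode, so re-running the same
:: command after an interruption resumes where it left off.
robocopy E:\data F:\data /E /Z
```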

OTHER TIPS

I would physically move the hard disk if possible.

I've found FastCopy to be quite good for this sort of thing. It's a GUI tool:

http://www.ipmsg.org/tools/fastcopy.html.en

If you have to move it over a network, use FTP between the servers. Windows file sharing will get bogged down with chatty protocols.

I've found TeraCopy to be pretty fast and handy. Allegedly FastCopy (as suggested by benlumley) is even faster, but I don't have any experience with it.

Try using WinRAR or a zipping tool. Big files move quicker than lots of small ones, and most zipping tools let you split the archive (zip) into multiple volumes.

You might even reduce the size a bit when you turn on compression.
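As a sketch with 7-Zip's command-line tool (an assumption; WinRAR has an equivalent volume-split option), keeping each volume below FAT32's 4 GB per-file limit:

```shell
:: a = add to archive, -v3900m = split into 3900 MB volumes
:: (safely under FAT32's 4 GB file-size cap), -mx0 = store without
:: compression for speed; drop -mx0 to compress as suggested above.
7z a -v3900m -mx0 F:\backup\data.7z E:\data\
```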

Command Line: xcopy is probably your best bet

Command Reference: http://www.computerhope.com/xcopyhlp.htm
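A typical invocation might look like this (the paths are hypothetical):

```shell
:: /E copies subdirectories including empty ones, /H includes hidden
:: and system files, /C continues after errors, /Y suppresses
:: overwrite prompts.
xcopy E:\data F:\data /E /H /C /Y
```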

One of the fastest ways to copy files is robocopy, as pointed out by Pyrolistical above. It's very flexible and powerful. If the command doesn't work directly from your command prompt, try invoking it through PowerShell, as in the example below.

Check the documentation for this command before using it: `robocopy /?`.

powershell "robocopy 'Source' 'Destination' /E /R:3 /W:10 /FP /MT:25 /V"

/E - copy subdirectories, including empty ones.
/R:3 - retry 3 times on a failed copy.
/W:10 - wait 10 seconds between retries.
/FP - include full path names in the output.
/MT:25 - multi-threaded copy, here with 25 threads.
/V - verbose output.

I wanted to comment on a comment about multithreading from @hello_earth, 201510131124, but I don't have enough reputation points on Stack Overflow (I've mostly posted on Super User up until now):
Multithreading is typically not efficient when copying files from one storage device to another, because the fastest throughput is achieved with sequential reads. Using multiple threads makes a HDD rattle and grind like crazy as it reads or writes several files at once: since the drive can only access one location at a time, it must read a chunk from one file, then seek to a chunk of another file in a different area, which slows the process down considerably (I don't know how an SSD would behave in such a case). It is both inefficient and potentially harmful: the mechanical stress is considerably higher when the heads repeatedly sweep across the platters to reach several areas in short succession, rather than staying in one spot to read a large contiguous file.

I discovered this when batch-checking the MD5 checksums of a very large folder of video files with md5deep: with the default options the analysis was multithreaded (8 threads on an i7 6700K CPU) and excruciatingly slow. When I added the -j1 option, meaning one thread, it proceeded much faster, since the files were now read sequentially.
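The single-threaded run described above looks like this (the folder path is a placeholder):

```shell
:: -j1 limits md5deep to one thread so files are read sequentially;
:: -r recurses into subdirectories.
md5deep -j1 -r D:\videos > hashes.md5
```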

Another consideration that follows from this is that transfer speed will be significantly higher if the files are not fragmented, and also, more marginally, if they are located at the beginning of a hard disk drive, corresponding to the outermost parts of the platters, where the linear velocity is highest (that aspect is irrelevant for a solid-state drive or other flash-based device).

Also, the original poster wanted "the most safe, efficient and fast way to achieve such a time consuming process". I'd say one has to choose a compromise favoring either speed/efficiency or safety. If you want safety, you have to verify that each file was copied flawlessly (by checking MD5 checksums, or with something like WinMerge); if you don't, you can never be 100% sure there wasn't some SNAFU along the way (a hardware or software issue); if you do, you have to spend twice as much time on the task.
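One way to do that verification on Windows, sketched in PowerShell (the paths are placeholders, and this assumes both trees have identical relative layouts so the files enumerate in the same order):

```shell
# Hash every file under the source and the copy, then compare;
# no output from Compare-Object means every hash matched.
$src = Get-ChildItem E:\data -Recurse -File | Get-FileHash -Algorithm MD5
$dst = Get-ChildItem F:\data -Recurse -File | Get-FileHash -Algorithm MD5
Compare-Object ($src.Hash) ($dst.Hash)
```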

For instance: I relied on a little tool called SynchronizeIt! for my file copying, because it has the huge advantage over most similar tools of preserving all timestamps (including directory timestamps, like Robocopy does with the /DCOPY:T switch), and it has a streamlined interface with just the options I need. But I discovered that some files were always corrupted after a copy, truncated after exactly 25000 bytes (the copy of a 1 GB video, for instance, had 25000 good bytes followed by 1 GB of zeroes; the copy was abnormally fast, taking only a split second, which is what triggered my suspicion in the first place).

I reported this issue to the author in 2010, but he chalked it up to a hardware malfunction and didn't think twice about it. I still used SynchronizeIt!, but started checking files thoroughly after every copy (with WinMerge or Total Commander); when files ended up corrupted I used Robocopy instead. Files which were corrupted by SynchronizeIt!, once copied with Robocopy and then copied again with SynchronizeIt!, came out flawless, so something about the way they were recorded on the NTFS partition confused that software, and Robocopy somehow fixed it.

Then in 2015 I reported it again, after identifying a pattern in which files were corrupted: they had all been downloaded with particular download managers. This time the author did some digging and found the explanation: his tool had trouble copying files with the little-known "sparse" attribute, which some download managers set to save space when downloading files in multiple chunks.

He provided me with an updated version which copies sparse files correctly, but hasn't released it on his website (the currently available version is 3.5 from 2009; the version I now use is a 3.6 beta from October 2015). So if you want to try that otherwise excellent software, be aware of that bug, and whenever you copy important files, thoroughly verify (with a different tool) that each copy is identical to the source before deleting anything from the source.

I used TeraCopy to copy 50+ GB to a 128 GB flash drive.
It took almost 48 hours, and I had to do it twice because of a power hiccup: I had to re-format and start over. Not my favorite thing to do...

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow