I’ve been using a handy little Microsoft tool for many years now called ROBOCOPY, a great command-line tool for copying large sets of folders and files from one place to another, especially over a network. It has a nifty network-restartable mode (the /Z option), which means that if a copy fails, you can simply re-run the command and it will pick up where it left off – essential when you’re moving gigabytes or terabytes of data. It’s also great for mirroring data between file shares on servers.
From a command prompt (usually with Administrator rights), here is how I use it:
robocopy /Z /E /R:0 /W:0 <source-path> <dest-path> /XD "Temp" "Temporary Internet Files" /XF pagefile.sys hiberfil.sys
- /Z – this is the network restartable mode: if a copy is interrupted part-way through a file, re-running the command resumes that file from where it stopped rather than copying it again from the start (skipping files that already exist at the destination, and showing a % counter as each file copies, are standard Robocopy behaviour)
- /E – this tells it to copy sub-folders, including empty ones
- /R:0 – by default, if Robocopy can’t get a lock on a file it retries one million times; but if it can’t lock a file for reading the first time, it usually never will, so I set the retry count to zero.
- /W:0 – this is the number of seconds to wait between retries (the default is 30); because we don’t retry, I just set this to zero as well.
- <source-path> and <dest-path> should speak for themselves
- /XD – this excludes folders such as “Temp” and “Temporary Internet Files”, which I never bother backing up; they should never contain anything useful.
- /XF – this excludes specific files, such as the system paging file (pagefile.sys) and hibernation file (hiberfil.sys), which are really big and are re-created automatically if you ever restore a system.
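As a concrete sketch (the drive letter and server share below are placeholders, not from the original post), a backup batch file wrapping this command might look like the following. Robocopy’s exit codes are bit flags: values below 8 mean success (possibly with some files skipped), while 8 and above indicate that something failed to copy:

```shell
@echo off
rem Mirror a data drive to a server share (placeholder paths).
robocopy /Z /E /R:0 /W:0 "D:\Data" "\\backupserver\backups\Data" ^
    /XD "Temp" "Temporary Internet Files" /XF pagefile.sys hiberfil.sys

rem Robocopy exit codes below 8 are success; 8 and above mean failures.
if %ERRORLEVEL% GEQ 8 (
    echo Backup FAILED with code %ERRORLEVEL%
) else (
    echo Backup finished OK, code %ERRORLEVEL%
)
```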
Why not use XCOPY instead?
Good question, I just got so used to using ROBOCOPY and the few times I tried XCOPY it didn’t seem to do what I wanted as reliably. And as it turns out Microsoft agrees with me, and has actually deprecated Xcopy in favour of Robocopy – so stop using Xcopy today! Robocopy is now shipped by default with Windows Vista/7 and Server 2008 and onwards; it used to have to be installed separately from a Resource Kit.
Here’s a big gotcha you might come across
There is a known issue with some Windows Vista/7 machines (usually ones that have been upgraded from a Windows XP install) – the “Application Data” junction point loops back on itself, so during backup the folder gets nested so many times that you eventually hit a “file name too long” error on the NTFS path.
The fix is one extra switch:
- /XJ – Exclude NTFS junction points, to avoid the endless “Application Data” folder nesting on some Windows Vista/7 machines.
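Putting that together with the earlier command (the paths here are placeholders), the backup line becomes:

```shell
robocopy /Z /E /XJ /R:0 /W:0 "C:\Users" "\\backupserver\backups\Users" ^
    /XD "Temp" "Temporary Internet Files" /XF pagefile.sys hiberfil.sys
```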
If you already have a backup set with this nesting problem, the following command renames each nested “Application Data” folder to the short name “ad”, shortening the paths enough that the set can be deleted (run it straight from a command prompt; inside a batch file you’d need %%X instead of %X):

for /f "delims=" %X IN ('dir /x /b /s "Application Data"') do ren "%X" "ad"

Because renaming a parent folder changes the paths of everything beneath it, you may need to run the command a few times until no “Application Data” folders remain.
Note: the command above is quite destructive, so please only run it over a backup set that you’re trying to delete but can’t, because of the “file name too long” error.
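Once the rename loop has flattened the names, the stuck backup set can usually be removed normally (the path below is a placeholder for your old backup folder):

```shell
rd /s /q "E:\OldBackup"
```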