Question

I have written a shell script which tries to pull a tar file from an FTP server and untar it locally. I need to extract specific files from the tar archive. The filename of the tar file contains a date, and I need to be able to select a tar file based on that date.

abc_myfile_$date.tar is the format of the file I am pulling from the ftp server.

My current code looks like this:

for host in ftpserver
do
ftp -inv $host <<END_SCRIPT
user username password
prompt
cd remotepath
lcd localpath
mget *myfile_$date*.tar
quit
END_SCRIPT
done

for next in `ls localpath/*.tar`
do
tar xvf $next *required_file_in_tar_file*.dat
done

When I run the script, I am not able to untar the files.

I am able to get a single tar file from the FTP server only if I specify its exact name. I would like to get any file which has myfile_$date in its name, and then extract it to a local path to obtain the specific required files from that tar archive.

Solution

You get the .tar file, but decompress it with the z option. Compressed files (those that require z) normally have a .tar.gz suffix. Try

tar xvf $next *required_file_in_tar_file*.dat
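Note that matching archive member names with wildcards is a GNU tar extension, and the pattern must be quoted so the shell does not expand it before tar sees it. A sketch of the corrected loop, using placeholder paths and file names standing in for the question's; the sample archive is created only so the loop has something to extract from:

```shell
#!/bin/sh
set -e
cd "$(mktemp -d)"
mkdir localpath

# Build a sample archive so the extraction loop below has work to do.
echo data  > required_file_in_tar_file_1.dat
echo noise > unrelated.txt
tar cf localpath/abc_myfile_20240101.tar \
    required_file_in_tar_file_1.dat unrelated.txt
rm required_file_in_tar_file_1.dat unrelated.txt

for next in localpath/*.tar; do
    # --wildcards (GNU tar) enables glob matching against member names;
    # the single quotes keep the shell from expanding the pattern.
    tar xvf "$next" --wildcards '*required_file_in_tar_file*.dat'
done
```

Only the matching .dat member is extracted; unrelated.txt stays inside the archive. (BSD tar globs member patterns by default and may not accept the `--wildcards` flag.)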

OTHER TIPS

Firstly, if you want to use wildcards in the file name that you're getting from the server, you need to use mget instead of get: wildcard expansion (the *) does not work with the get command.
Once you have pulled the file, the tar operation will work as expected. Most modern versions of Linux/BSD have a 'smart' tar, which doesn't need the 'z' flag to be told that the tar file is compressed — it figures out that the tarball is compressed on its own and decompresses it automatically, provided the appropriate compression/decompression tool is on the system (bzip2 for .bz2 files, gzip for .gz files).
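That auto-detection is easy to verify: create a gzip-compressed tarball, then extract it without the z flag. A small self-contained demonstration, assuming GNU or BSD tar and gzip are installed:

```shell
#!/bin/sh
set -e
demo=$(mktemp -d)
cd "$demo"

echo "hello" > sample.dat
tar czf archive.tar.gz sample.dat   # create a gzip-compressed tarball
rm sample.dat

tar xf archive.tar.gz               # no `z` needed: tar sniffs the gzip header
cat sample.dat                      # prints "hello"
```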

I'm not quite sure, but doesn't FTP have an mget command for downloading multiple files (instead of get)?
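It does: with globbing, mget fetches every matching file in one transfer. A sketch of such a batch session — the host name, credentials, paths, and date value are all placeholders. Note that the -i flag already turns off per-file prompting, so adding a `prompt` command on top of it would toggle prompting back on:

```shell
#!/bin/sh
# Sketch only: fetch all tarballs matching the date pattern in one session.
# ftpserver, username, password, remotepath, and localpath are placeholders.
date=20240101

ftp -inv ftpserver <<END_SCRIPT
user username password
cd remotepath
lcd localpath
mget *myfile_${date}*.tar
quit
END_SCRIPT
```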

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow