Question

I am trying Get-S3Object -BucketName pilot, but the bucket has a lot of sub-directories. How can I copy the objects in them down to my local machine?


Solution

You probably want to use Copy-S3Object for the actual copying. Get-S3Object returns a collection of S3Object metadata, so use Get-S3Object to isolate the objects you want to copy, then pass the important information from each of them (the Key) into Copy-S3Object.

You can use the KeyPrefix parameter to filter the results down to a specific sub-directory. For example:

$srcBucketName = "myBucket"
$objects = Get-S3Object -BucketName $srcBucketName -KeyPrefix "mySubdirectory"

If you need to copy data from all of your directories, you can break the overall operation into smaller chunks by calling Get-S3Object once per directory in your bucket, as in the sketch below.
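A minimal sketch of that per-directory approach; the prefix list here is hypothetical, standing in for your bucket's actual top-level directories:

# Hypothetical list of the bucket's top-level prefixes ("directories").
$prefixes = "logs/", "images/", "reports/"

# Gather the object metadata one prefix at a time; PowerShell collects
# the output of each Get-S3Object call into $objects.
$objects = foreach ($prefix in $prefixes) {
    Get-S3Object -BucketName $srcBucketName -KeyPrefix $prefix
}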

If the individual directories are still too large, you can chunk the operation further by using the MaxKeys parameter.
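For example (a sketch that assumes the parameter is spelled -MaxKey, as the Get-S3Object cmdlet names it, and an arbitrary batch size of 500):

# Return at most 500 keys from the sub-directory in one call.
$batch = Get-S3Object -BucketName $srcBucketName -KeyPrefix "mySubdirectory" -MaxKey 500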


Once you're ready to copy, you can do something like this:

$objects |
    % { Copy-S3Object -BucketName $srcBucketName -Key $_.Key -LocalFile "somePath" }

...where "somePath" is a local path you've defined. You can even mirror the sub-directories and keys under your local path by resolving the key inside a string: "C:/someDir/$($_.Key)"
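Putting the two together (a sketch, with C:/someDir standing in for your local root):

# Copy each object, recreating its S3 key structure under C:/someDir.
$objects |
    % { Copy-S3Object -BucketName $srcBucketName -Key $_.Key -LocalFile "C:/someDir/$($_.Key)" }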

OTHER TIPS

You want to use Read-S3Object to copy the files to your local machine. Read-S3Object calls Get-S3Object under the hood, so you simply need this:

Read-S3Object -BucketName $myS3Bucket -KeyPrefix $myS3SubDirectory -Folder $myLocalFolder

The version of AWS Tools for PowerShell that I have doesn't allow piping into Copy-S3Object, so you can use Read-S3Object instead.

Also, oddly, the objects that Get-S3Object emits don't carry the bucket name through the pipeline, so you have to pass -BucketName to Read-S3Object again:

Get-S3Object -BucketName mybucket -KeyPrefix path/to/files |
    % { Read-S3Object -BucketName mybucket -Key $_.Key -File ("C:\" + $_.Key) }
Licensed under: CC-BY-SA with attribution