It turns out there is also a cmdlet called Read-S3Object that produces the same result, so I had to use that instead. I didn't see anything about Copy-S3Object being deprecated or having its functionality changed, so that's unfortunate.
Assuming you have:
- PowerShell v3
- AWS Tools for PowerShell v2.x
- appropriate AWS credentials
then the following script should work:
Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
### SET ONLY THE VARIABLES BELOW ###
$accessKey = "" # Amazon access key.
$secretKey = "" # Amazon secret key.
$fileContainingAmazonKeysSeparatedByNewLine = "" # Full path to a file, e.g. "C:\users\killeens\desktop\myfile.txt"
$existingFolderToPlaceDownloadedFilesIn = "" # Path to a folder, including a trailing slash, such as "C:\MyDownloadedFiles\" NOTE: This folder must already exist.
$amazonBucketName = "" # The name of the S3 bucket you'll be retrieving the keys from.
### SET ONLY THE VARIABLES ABOVE ###
$creds = New-AWSCredentials -AccessKey $accessKey -SecretKey $secretKey
Set-AWSCredentials -Credentials $creds
$amazonKeysToDownload = Get-Content $fileContainingAmazonKeysSeparatedByNewLine
$uniqueAmazonKeys = $amazonKeysToDownload | Sort-Object | Get-Unique # Get-Unique is case-sensitive, which matches S3 key semantics.
$startingpath = $existingFolderToPlaceDownloadedFilesIn
$uniqueAmazonKeys | ForEach-Object {
    $keyname = $_
    $fullpath = $startingpath + $keyname # Relies on the trailing slash set above.
    Read-S3Object -BucketName $amazonBucketName -Key $keyname -File $fullpath
}
Obviously there are better ways to do this (as a function that accepts parameters, in a PowerShell v4 workflow with parallel loops and a throttle count, with better credential handling, etc.), but this gets it done in its most basic form.
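As a rough illustration of the "function that accepts parameters" idea, the script above could be wrapped along these lines. This is an untested sketch; the function name and parameter names are my own invention, and it assumes credentials have already been set via Set-AWSCredentials as in the script above:

```powershell
function Get-S3ObjectsFromKeyFile {
    param(
        [Parameter(Mandatory=$true)][string]$BucketName,         # S3 bucket to download from
        [Parameter(Mandatory=$true)][string]$KeyFile,            # file with one S3 key per line
        [Parameter(Mandatory=$true)][string]$DestinationFolder   # existing local folder
    )

    # Deduplicate the keys (case-sensitively, matching S3 key semantics),
    # then download each one. Join-Path avoids the trailing-slash requirement.
    Get-Content $KeyFile | Sort-Object | Get-Unique | ForEach-Object {
        Read-S3Object -BucketName $BucketName -Key $_ -File (Join-Path $DestinationFolder $_)
    }
}

# Hypothetical example call:
# Get-S3ObjectsFromKeyFile -BucketName "my-bucket" -KeyFile "C:\keys.txt" -DestinationFolder "C:\MyDownloadedFiles"
```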