Question

AWS_ACCESS_KEY_ID = '<access key>'
AWS_SECRET_ACCESS_KEY = '<my secret key>'
Bucketname = 'Bucket-name' 
import boto
from boto.s3.key import Key
import boto.s3.connection
conn = boto.connect_s3(AWS_ACCESS_KEY_ID,AWS_SECRET_ACCESS_KEY,
        host ='s3.ap-southeast-1.amazonaws.com',
        is_secure=True,               # set to False if you are not using SSL
        calling_format=boto.s3.connection.OrdinaryCallingFormat(),
        )
bucket = conn.get_bucket(Bucketname)

Error:

  Traceback (most recent call last):
   File "uploads3.py", line 69, in <module>
    upload_hello_file_s3()
  File "uploads3.py", line 25, in upload_hello_file_s3
    bucket = conn.get_bucket(Bucketname)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 431, in get_bucket
    bucket.get_all_keys(headers, maxkeys=0)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/bucket.py", line 364, in get_all_keys
    '', headers, **params)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/bucket.py", line 321, in _get_all
    query_args=s)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 543, in make_request
    override_num_retries=override_num_retries)
  File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 937, in make_request
    return self._mexe(http_request, sender, override_num_retries)
  File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 899, in _mexe
    raise e
socket.gaierror: [Errno -2] Name or service not known

Please help me solve this problem; there is no problem with the bucket name, access key, or secret key.


Solution

You can also use the following (boto.s3.connect_to_region):

import boto
from boto.s3.key import Key
import boto.s3.connection

AWS_ACCESS_KEY_ID = '<access key>'
AWS_SECRET_ACCESS_KEY = '<my secret key>'
Bucketname = 'Bucket-name' 


conn = boto.s3.connect_to_region('ap-southeast-1',
       aws_access_key_id=AWS_ACCESS_KEY_ID,
       aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
       is_secure=True,               # set to False if you are not using SSL
       calling_format=boto.s3.connection.OrdinaryCallingFormat(),
       )
bucket = conn.get_bucket(Bucketname)

This way you don't have to care about the exact endpoint with its full hostname. And yes, as @garnaat mentioned, use the latest boto API.
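Once get_bucket() succeeds, the upload that the OP's upload_hello_file_s3() was attempting works through boto.s3.key.Key. A minimal sketch, reusing the bucket from above (the object name and contents here are hypothetical):

from boto.s3.key import Key

k = Key(bucket)
k.key = 'hello.txt'                          # hypothetical object name
k.set_contents_from_string('Hello, S3!')     # upload a string as the object body
# k.set_contents_from_filename('hello.txt')  # or upload a local file instead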

OTHER TIPS

from boto3.session import Session

ACCESS_KEY = 'your_access_key'
SECRET_KEY = 'your_secret_key'

session = Session(aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)

s3 = session.resource('s3')
my_bucket = s3.Bucket('bucket_name')

for s3_file in my_bucket.objects.all():
    print(s3_file.key)

The request to the host s3.ap-southeast-1.amazonaws.com is failing. I also cannot resolve it from my end. Check your bucket settings for the correct host.

There might also be a problem with your internet connection or the DNS server. Try pinging the host manually from the command line and see if it resolves. Alternatively, try a different DNS server.

Edit: Quick googling suggests that the host might be s3-ap-southeast-1.amazonaws.com.
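To check the resolution from Python directly, here is a quick sketch using only the standard library (both hostname variants shown for comparison):

import socket

for host in ('s3.ap-southeast-1.amazonaws.com',
             's3-ap-southeast-1.amazonaws.com'):
    try:
        print(host, '->', socket.gethostbyname(host))
    except socket.gaierror as e:
        # The same error as in the traceback: [Errno -2] Name or service not known
        print(host, 'does not resolve:', e)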

There is a typo in the host parameter. The right one is: s3-ap-southeast-1.amazonaws.com

References: Amazon Regions and Endpoints

The question is answered, but I wanted to include some additional info that helped me. Keep in mind the latest boto is boto3, but I was stuck with Python 2.7 in a legacy environment.

Authentication

There are at least three ways to authenticate with boto:

1. Include the credentials (access key, secret key) directly in the connect_to_region() call.
2. Define the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY and supply no credentials in the connect_to_region() call.
3. With boto 2.5.1 or later, let boto use the instance's IAM role to create temporary credentials.

For the first two, use the AWS console to create a user with access to the bucket. For the third, create an IAM role with access to the bucket and assign it to the instance. The third way is often the best, because you don't have to store credentials in source control or manage them in the environment. All three are sketched below.
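A minimal sketch of the three options (boto 2 API; the region is the OP's, the credentials are placeholders):

import boto.s3

# 1. Credentials passed explicitly in the call
conn = boto.s3.connect_to_region('ap-southeast-1',
       aws_access_key_id='<access key>',
       aws_secret_access_key='<secret key>')

# 2. Credentials from the environment: export AWS_ACCESS_KEY_ID and
#    AWS_SECRET_ACCESS_KEY first, then call without credential arguments
conn = boto.s3.connect_to_region('ap-southeast-1')

# 3. No credentials anywhere: on an EC2 instance with an IAM role
#    (boto 2.5.1+), the same bare call picks up temporary role credentials
conn = boto.s3.connect_to_region('ap-southeast-1')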

Accessing the Bucket

Now on to the mistake I made that caused the same message as the OP. The top-level objects in S3 are buckets, and everything below them are keys. In my case the object I wanted to access was at s3://top-level/next-level/object. I tried to access it like this:

bucket = conn.get_bucket('top-level/next-level')

The point is that next-level is not part of the bucket name but part of the key, and you get the "Name or service not known" message when the bucket you name doesn't exist.
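The correct split is the bucket first, then the full path inside it as the key. A sketch, reusing the conn from the accepted answer:

bucket = conn.get_bucket('top-level')        # only the first path component is the bucket
key = bucket.get_key('next-level/object')    # the rest is the key; returns None if missing
if key is not None:
    print(key.get_contents_as_string())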

Gotcha: capture traffic on your network link and make sure the hostnames in DNS queries do NOT contain a '\r' character, e.g. from a stray carriage return in the bucket name.
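If the bucket name is read from a file or user input, stripping whitespace before connecting avoids this. A one-line precaution, assuming the name arrives in the Bucketname variable from the question:

Bucketname = Bucketname.strip()   # a stray '\r' or '\n' here would end up in the DNS query
bucket = conn.get_bucket(Bucketname)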

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow