Question

I recently got an SSL certificate for my site:

https://ram.rachum.com/

It works great in browsers, but it fails with Python's requests library:

>>> import requests
>>> requests.get('https://ram.rachum.com')
Traceback (most recent call last):
  File "<pyshell#1>", line 1, in <module>
    requests.get('https://ram.rachum.com')
  File "C:\Python27\lib\site-packages\requests\api.py", line 55, in get
    return request('get', url, **kwargs)
  File "C:\Python27\lib\site-packages\requests\api.py", line 44, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 354, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 460, in send
    r = adapter.send(request, **kwargs)
  File "C:\Python27\lib\site-packages\requests\adapters.py", line 250, in send
    raise SSLError(e)
SSLError: hostname 'ram.rachum.com' doesn't match either of '*.webfaction.com', 'webfaction.com'

Why? Why does requests look at the webfaction certificate rather than my own certificate, which is valid for ram.rachum.com?


Solution

You are using a version of the requests library without support for SNI (Server Name Indication), but the server hosts multiple SSL certificates behind the same IP address, which requires SNI. You can verify this with openssl s_client. When no name is given for SNI, the server just serves the default certificate for that IP, which is *.webfaction.com:

openssl s_client -connect ram.rachum.com:443
...
 0 ...CN=*.webfaction.com

But if you specify a hostname for SNI, it returns the expected certificate:

openssl s_client -connect ram.rachum.com:443 -servername ram.rachum.com
...
 0 ...CN=ram.rachum.com...
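
The same with-and-without-SNI comparison can be done from Python's standard library. This is a Python 3 sketch (not the Python 2.7 from the traceback, whose sockets lack context-manager support); the function name is mine, not part of any library:

```python
import socket
import ssl

def fetch_server_cert(host, port=443, sni=True):
    """Return the PEM certificate the server presents, with or without SNI."""
    ctx = ssl.create_default_context()
    # For diagnosis we want to see whatever certificate the server offers,
    # even one whose name doesn't match, so disable verification here only.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, port), timeout=10) as sock:
        # server_hostname is what puts the name into the TLS ClientHello (SNI).
        with ctx.wrap_socket(sock, server_hostname=host if sni else None) as tls:
            der = tls.getpeercert(binary_form=True)
    return ssl.DER_cert_to_PEM_cert(der)

# Comparing fetch_server_cert('ram.rachum.com') with
# fetch_server_cert('ram.rachum.com', sni=False) should show two different
# certificates, mirroring the two openssl runs above.
```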

You probably need to upgrade your requests library, or install the optional dependencies that give requests SNI support on older Pythons (pyOpenSSL, ndg-httpsclient and pyasn1); see "Using requests with TLS doesn't give SNI support".
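
Before upgrading, you can also check whether your Python build can send SNI at all (on Python 3, or on 2.7.9+ where the improved ssl module was backported):

```python
import ssl
import sys

# ssl.HAS_SNI is True when the underlying OpenSSL supports Server Name
# Indication; older Python 2.7 builds (pre-2.7.9) commonly report False,
# which is exactly the situation that produces the mismatch error above.
print('Python %s.%s, SNI support: %s'
      % (sys.version_info[0], sys.version_info[1], ssl.HAS_SNI))
```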

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow