Question

I was wondering if anyone could help me check whether "get_redditor" returns an error or not. I have used the "fetch=True" argument and it still returns without raising anything, even though the user "Alaska88" does not exist if you go to that page. The error happens when the program reaches the "for comment in comments" line, and I am assuming the try/except doesn't catch it because the Redditor is a lazy object. Thank you in advance for any time or help.

import praw
import urllib2
import time
r = praw.Reddit('testing scraper')
r.login()
account = r.get_redditor('Alaska88',fetch=True)
comments = account.get_comments(sort ='new',time ='all')        
print 'before comment loop'
try:
        for comment in comments:
                print 'in comment loop' 
                print(comment.body.encode('utf-8'))
        print('/////////////////////////')
except urllib2.HTTPError:
        print 'In Except'       
        time.sleep(60)
        pass 

The error begins here =>

File "reddit_bot.py", line 9, in for comment in comments: File "/usr/local/lib/python2.7/dist-packages/praw-

The error then ends here =>

    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found


Solution

You're most likely catching the wrong exception.

Your except clause names urllib2.HTTPError, but the exception in your traceback is requests.exceptions.HTTPError: PRAW makes its HTTP calls through the requests library, not urllib2. The lazy object isn't the problem; the 404 is simply raised the first time the comment generator fetches data, which happens inside the for loop, so your try/except is in the right place but names the wrong exception class.
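As a minimal sketch, assuming the same PRAW-2-era calls from your question (r.login(), get_redditor(..., fetch=True), get_comments()), the only substantive change is the exception class in the except clause:

import time

import praw
import requests

r = praw.Reddit('testing scraper')
r.login()
account = r.get_redditor('Alaska88', fetch=True)
comments = account.get_comments(sort='new', time='all')

try:
    for comment in comments:
        # The HTTP request (and any 404) fires here, on the first iteration.
        print(comment.body.encode('utf-8'))
except requests.exceptions.HTTPError as err:
    # PRAW issues its HTTP calls through the requests library, so a missing
    # user surfaces as requests.exceptions.HTTPError, not urllib2.HTTPError.
    print 'In Except:', err
    time.sleep(60)

Catching requests.exceptions.HTTPError keeps the handler specific to the 404 in your traceback; a bare except Exception would also stop the crash, but it would hide unrelated errors.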

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow