Question

I am having some trouble with a Scrapy pipeline. My information is being scraped from sites fine and the process_item method is being called correctly. However, the spider_opened and spider_closed methods are not being called.

class MyPipeline(object):

    def __init__(self):
        log.msg("Initializing Pipeline")
        self.conn = None
        self.cur = None

    def spider_opened(self, spider):
        log.msg("Pipeline.spider_opened called", level=log.DEBUG)

    def spider_closed(self, spider):
        log.msg("Pipeline.spider_closed called", level=log.DEBUG)

    def process_item(self, item, spider):
        log.msg("Processing item " + item['title'], level=log.DEBUG)
        return item

Both the __init__ and process_item logging messages are displayed in the log, but the spider_opened and spider_closed logging messages are not.

I need the spider_opened and spider_closed methods to open and close a connection to a database, but nothing is showing up in the log for them.

Any suggestions would be very useful.


Solution

Sorry, found it just after I posted this. You have to add:

dispatcher.connect(self.spider_opened, signals.spider_opened)
dispatcher.connect(self.spider_closed, signals.spider_closed)

in __init__ (with dispatcher imported from scrapy.xlib.pydispatch and signals imported from scrapy at the top of the module); otherwise the pipeline never receives the signals that would trigger those methods.
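The mechanism behind this fix is just a publish/subscribe registry: handlers register interest in a named signal, and the framework fires the signal at the appropriate moment. A minimal stand-alone illustration of what dispatcher.connect does (all names here are hypothetical; no Scrapy required):

```python
class Dispatcher:
    """Tiny signal registry illustrating the dispatcher pattern."""

    def __init__(self):
        self._handlers = {}

    def connect(self, handler, signal):
        # Register a handler for a given signal.
        self._handlers.setdefault(signal, []).append(handler)

    def send(self, signal, **kwargs):
        # Fire the signal: call every registered handler.
        for handler in self._handlers.get(signal, []):
            handler(**kwargs)


dispatcher = Dispatcher()
calls = []

def spider_opened(spider):
    calls.append(("opened", spider))

# Without this connect() call, spider_opened would never fire --
# exactly the problem in the original question.
dispatcher.connect(spider_opened, "spider_opened")
dispatcher.send("spider_opened", spider="myspider")
# calls is now [("opened", "myspider")]
```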

OTHER TIPS

The proper method names are open_spider and close_spider, not spider_opened and spider_closed. Scrapy calls these automatically on any enabled pipeline, so no signal wiring is needed. This is documented here: http://doc.scrapy.org/en/latest/topics/item-pipeline.html#writing-your-own-item-pipeline.
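Using those hooks, the pipeline from the question could be reworked along these lines (a sketch: sqlite3 stands in for the asker's unspecified database, and the standard logging module replaces the old scrapy.log API):

```python
import logging
import sqlite3


class MyPipeline:
    # Scrapy calls open_spider/close_spider automatically once the
    # pipeline is enabled in ITEM_PIPELINES -- no dispatcher needed.

    def open_spider(self, spider):
        # Called once when the spider starts: open the DB connection.
        logging.debug("Pipeline.open_spider called")
        self.conn = sqlite3.connect(":memory:")  # stand-in database
        self.cur = self.conn.cursor()
        self.cur.execute("CREATE TABLE IF NOT EXISTS items (title TEXT)")

    def close_spider(self, spider):
        # Called once when the spider finishes: commit and clean up.
        logging.debug("Pipeline.close_spider called")
        self.conn.commit()
        self.conn.close()

    def process_item(self, item, spider):
        logging.debug("Processing item %s", item["title"])
        self.cur.execute("INSERT INTO items VALUES (?)", (item["title"],))
        return item  # pipelines must return the item (or raise DropItem)
```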

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow