Problem

I am trying to deploy a Scrapy project using scrapyd, but it is giving me an error:

sudo scrapy deploy default -p eScraper
Building egg of eScraper-1371463750
'build/scripts-2.7' does not exist -- can't clean it
zip_safe flag not set; analyzing archive contents...
eScraperInterface.settings: module references __file__
eScraper.settings: module references __file__
Deploying eScraper-1371463750 to http://localhost:6800/addversion.json
Server response (200):
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/webservice.py", line 18, in render
    return JsonResource.render(self, txrequest)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/txweb.py", line 10, in render
    r = resource.Resource.render(self, txrequest)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web/resource.py", line 250, in render
    return m(request)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/webservice.py", line 66, in render_POST
    spiders = get_spider_list(project)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/utils.py", line 65, in get_spider_list
    raise RuntimeError(msg.splitlines()[-1])
RuntimeError: OSError: [Errno 20] Not a directory: '/tmp/eScraper-1371463750-Lm8HLh.egg/images'

Earlier I was able to deploy the project properly, but not now. However, if I run the spider directly with scrapy crawl spiderName, there is no problem. Can someone help me, please?


Solution

Try these two things (see the sketch after this list):

1. Maybe you have deployed too many versions; try deleting some of the older ones.
2. Before deploying, delete the build folder and the setup file.
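A minimal sketch of both steps, using scrapyd's listversions.json and delversion.json endpoints to prune old versions and then removing the local build artifacts. The target URL, project name, number of versions to keep, and the artifact names (build, project.egg-info, setup.py) are assumptions taken from the question and from scrapy deploy defaults; adjust them to your setup.

import os
import shutil
import requests

TARGET = "http://localhost:6800"   # scrapyd instance from scrapy.cfg (assumed)
PROJECT = "eScraper"               # project name from the question
KEEP = 3                           # how many recent versions to keep (assumed)

# 1. List the versions scrapyd currently holds for the project.
resp = requests.get("%s/listversions.json" % TARGET, params={"project": PROJECT})
versions = resp.json().get("versions", [])
print("deployed versions: %s" % versions)

# 2. Delete all but the most recent KEEP versions
#    (scrapyd returns them oldest-first).
for version in versions[:-KEEP]:
    requests.post("%s/delversion.json" % TARGET,
                  data={"project": PROJECT, "version": version})
    print("deleted version: %s" % version)

# 3. Clean local build artifacts so scrapy deploy rebuilds the egg from scratch.
#    The exact names may differ in your project layout.
for path in ("build", "project.egg-info", "setup.py"):
    if os.path.isdir(path):
        shutil.rmtree(path)
    elif os.path.isfile(path):
        os.remove(path)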

As far as running the crawler is concerned: if you schedule a crawl for any arbitrary spider name, even one you have never deployed, scrapyd will still return an 'ok' response along with a job id.
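Because of that, the 'ok' response alone does not prove the spider exists. A sketch of guarding the schedule call by checking listspiders.json first, assuming the same localhost:6800 instance and project name as above; the spider name here is hypothetical.

import requests

TARGET = "http://localhost:6800"
PROJECT = "eScraper"
SPIDER = "someSpiderName"   # hypothetical spider name

# Ask scrapyd which spiders it actually knows for this project.
spiders = requests.get("%s/listspiders.json" % TARGET,
                       params={"project": PROJECT}).json().get("spiders", [])

if SPIDER not in spiders:
    print("spider %r is not deployed in project %r" % (SPIDER, PROJECT))
else:
    # Only now schedule the job; scrapyd replies with a job id.
    job = requests.post("%s/schedule.json" % TARGET,
                        data={"project": PROJECT, "spider": SPIDER}).json()
    print("scheduled, jobid = %s" % job.get("jobid"))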
