Question

I have multiple spiders in my project, so I decided to run them by uploading the project to a Scrapyd server. I uploaded my project successfully, and I can see all the spiders when I run the command

curl http://localhost:6800/listspiders.json?project=myproject

When I run the following command

curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider2

only one spider runs, because only one spider is given. But I want to run multiple spiders here, so is the following command the right way to run multiple spiders in Scrapyd?

curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider1,spider2,spider3........

Later, I will run this command using a cron job; that is, I will schedule it to run frequently.


Solution

If you want to run multiple spiders using Scrapyd, schedule them one by one, each with its own schedule.json request. Scrapyd will run them in the order they were scheduled, but not at the same time.
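As an illustration (the project and spider names below are just the placeholders from the question), one way to send a separate request per spider is a small shell loop:

#!/bin/sh
# Send one schedule.json request per spider; Scrapyd queues the jobs
# and runs them in the order they were scheduled.
for spider in spider1 spider2 spider3; do
    curl http://localhost:6800/schedule.json -d project=myproject -d "spider=$spider"
done

Saved as a script (for example schedule_spiders.sh, a hypothetical name), it can be called from a cron entry in place of a single curl command, which fits the cron-based scheduling mentioned in the question.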

See also: Scrapy's Scrapyd too slow with scheduling spiders

Licensed under: CC-BY-SA with attribution