Question

I installed scrapyd with pip, and I don't have a '/var/log/scrapyd' dir. I'm trying to find out what's happening to my HTTP call, since I get an 'OK' status when I initiate it, but no log is generated in 'logs/project/spider/' (and according to listjobs.json, the job is marked as finished after one second, but I don't see an error).
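
For reference, the schedule.json call and the listjobs.json check described above look roughly like the following sketch; "myproject" and "myspider" are placeholder names and the default port 6800 is assumed:

import requests

# Schedule a run; returns something like {"status": "ok", "jobid": "..."}.
resp = requests.post(
    "http://localhost:6800/schedule.json",
    data={"project": "myproject", "spider": "myspider"},
)
print(resp.json())

# Check where the job ended up: "pending", "running" or "finished".
jobs = requests.get(
    "http://localhost:6800/listjobs.json",
    params={"project": "myproject"},
).json()
print(jobs)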


Solution

You must manually create a config file at this path:

/etc/scrapyd/scrapyd.conf

In it you can specify the folder where the logs are stored. For example, this is my config file:

[scrapyd]
eggs_dir    = /usr/scrapyd/eggs
logs_dir    = /usr/scrapyd/logs
items_dir   = /usr/scrapyd/items
jobs_to_keep = 100
dbs_dir     = /usr/scrapyd/dbs
max_proc    = 0
max_proc_per_cpu = 4
finished_to_keep = 100
poll_interval = 10
http_port   = 6800
debug       = off
runner      = scrapyd.runner
application = scrapyd.app.application
launcher    = scrapyd.launcher.Launcher

[services]
schedule.json     = scrapyd.webservice.Schedule
cancel.json       = scrapyd.webservice.Cancel
addversion.json   = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json  = scrapyd.webservice.ListSpiders
delproject.json   = scrapyd.webservice.DeleteProject
delversion.json   = scrapyd.webservice.DeleteVersion
listjobs.json     = scrapyd.webservice.ListJobs

Make sure that all of the folders have the correct permissions.
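
A minimal sketch for creating those folders, assuming scrapyd runs as the user who owns them (adjust the paths if your config differs):

import os

# These paths must match the *_dir settings in /etc/scrapyd/scrapyd.conf.
dirs = [
    "/usr/scrapyd/eggs",
    "/usr/scrapyd/logs",
    "/usr/scrapyd/items",
    "/usr/scrapyd/dbs",
]

for path in dirs:
    os.makedirs(path, exist_ok=True)
    # 0o755 assumes the scrapyd process runs as the owner of these
    # directories; otherwise grant it write access some other way.
    os.chmod(path, 0o755)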

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow