Scrapyd configuration file: if no configuration file is present, scrapyd falls back to its built-in defaults, e.g. by default it runs at most 4 scrapy processes per CPU. Environment: CentOS 6.5 64-bit, scrapy 1.3.3, scrapyd 1.1.1.
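That "4 per CPU" default comes from how scrapyd's launcher derives its process cap: a max_proc of 0 means "compute the limit from the CPU count times max_proc_per_cpu (default 4)". A minimal sketch of that rule (the function name is mine, not scrapyd's):

```python
import multiprocessing

def effective_max_proc(max_proc, max_proc_per_cpu=4, cpu_count=None):
    """Mirror scrapyd's launcher rule: max_proc = 0 means
    'derive the cap from the CPU count'."""
    if max_proc:  # an explicit non-zero cap wins
        return max_proc
    if cpu_count is None:
        cpu_count = multiprocessing.cpu_count()
    return cpu_count * max_proc_per_cpu

# With scrapyd defaults, a 4-core box runs at most 16 scrapy processes.
print(effective_max_proc(0, cpu_count=4))        # 16
# With max_proc_per_cpu = 800 (as in the config in this post), the cap
# is effectively unlimited on a small box.
print(effective_max_proc(0, 800, cpu_count=4))   # 3200
```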
When looking for a configuration file, scrapyd searches the following paths:
• /etc/scrapyd/scrapyd.conf (Unix)
• c:\scrapyd\scrapyd.conf (Windows)
• /etc/scrapyd/conf.d/* (in alphabetical order, Unix)
• scrapyd.conf
• ~/.scrapyd.conf (user's home directory)
My configuration file is at /etc/scrapyd/scrapyd.conf:
[scrapyd]
eggs_dir = /usr/scrapyd/eggs
logs_dir = /usr/scrapyd/logs
jobs_to_keep = 100
dbs_dir = /usr/scrapyd/dbs
max_proc = 0
max_proc_per_cpu = 800
finished_to_keep = 100
poll_interval = 5.0
bind_address = 192.168.17.30
http_port = 6800
debug = off
runner = scrapyd.runner
application = scrapyd.app.application
launcher = scrapyd.launcher.Launcher
webroot = scrapyd.website.Root

[services]
schedule.json = scrapyd.webservice.Schedule
cancel.json = scrapyd.webservice.Cancel
addversion.json = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json = scrapyd.webservice.ListSpiders
delproject.json = scrapyd.webservice.DeleteProject
delversion.json = scrapyd.webservice.DeleteVersion
listjobs.json = scrapyd.webservice.ListJobs
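This file is standard INI syntax, so you can sanity-check it with Python's ConfigParser before restarting the daemon. A small sketch (the inline string stands in for the real file; values are taken from the config above):

```python
from configparser import ConfigParser

# Inline copy of the relevant keys; for the real file you would call
# parser.read('/etc/scrapyd/scrapyd.conf') instead.
CONF = """\
[scrapyd]
bind_address = 192.168.17.30
http_port = 6800
max_proc = 0
max_proc_per_cpu = 800
"""

parser = ConfigParser()
parser.read_string(CONF)

scrapyd = parser['scrapyd']
base_url = "http://{}:{}".format(scrapyd['bind_address'],
                                 scrapyd.getint('http_port'))
print(base_url)  # http://192.168.17.30:6800
```

The endpoints registered under [services] are then served under that base URL, e.g. a POST to base_url + "/schedule.json" with project and spider form fields queues a crawl.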
Note: if the web UI is left open without any activity for a long time, the backend logs "Timing out..".