Issue
I am trying to run a bash script that launches many spiders in my Docker container.
My supervisor.conf, placed in "/etc/supervisor/conf.d/", looks like this:
[program:scrapy]
command=/tmp/start_spider.sh
autorestart=false
startretries=0
stderr_logfile=/tmp/start_spider.err.log
stdout_logfile=/tmp/start_spider.out.log
but supervisor returns these errors:
2015-08-21 10:50:30,466 CRIT Supervisor running as root (no user in config file)
2015-08-21 10:50:30,466 WARN Included extra file "/etc/supervisor/conf.d/tor.conf" during parsing
2015-08-21 10:50:30,478 INFO RPC interface 'supervisor' initialized
2015-08-21 10:50:30,478 CRIT Server 'unix_http_server' running without any HTTP authentication checking
2015-08-21 10:50:30,478 INFO supervisord started with pid 5
2015-08-21 10:50:31,481 INFO spawned: 'scrapy' with pid 8
2015-08-21 10:50:31,555 INFO exited: scrapy (exit status 0; not expected)
2015-08-21 10:50:32,557 INFO gave up: scrapy entered FATAL state, too many start retries too quickly
And my program stops running. But if I run the script manually, it works very well...
How can I resolve this? Any ideas?
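For reference, the start script described above might look roughly like the sketch below; the real /tmp/start_spider.sh is not shown in the question, so the project path and spider names are purely hypothetical. If such a script backgrounds its spiders and exits immediately, the process finishes within supervisord's default startsecs window, and with startretries=0 supervisord gives up and marks the program FATAL, which matches the log above.

#!/bin/bash
# Hypothetical /tmp/start_spider.sh (the real script is not shown in the question)
cd /app/myproject   # assumed Scrapy project directory

# Each "&" backgrounds a spider, so the wrapper script itself returns immediately
scrapy crawl spider_one &
scrapy crawl spider_two &
scrapy crawl spider_three &

# Without a trailing "wait", supervisord sees the wrapper exit with status 0
# and reports "exited: scrapy (exit status 0; not expected)"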
Solution
I found the solution to my problem. In supervisor.conf, change
[program:scrapy]
command=/tmp/start_spider.sh
autorestart=false
startretries=0
to:
[program:scrapy]
command=/bin/bash -c "exec /tmp/start_spider.sh > /dev/null 2>&1 -DFOREGROUND"
autostart=true
autorestart=false
startretries=0
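After editing the file, one way to apply and check the change (assuming supervisord is already running inside the container and supervisorctl can reach it) is:

supervisorctl reread
supervisorctl update
supervisorctl status scrapy
tail -f /tmp/start_spider.out.log /tmp/start_spider.err.log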
Answered By - pi-2r