Issue
Just like the title says. I use -s LOG_FILE=mylog.txt to save the log to an external file, but I would also like to see the log while the spider is running. Is there a way to do that? I'm using Windows 10 and would prefer an answer that works there.
I'm an amateur developer without a computing background, so please go easy on me.
Solution
Use GNU's tee tool:

scrapy crawl myspider 2>&1 | tee crawl.log
2>&1 redirects stderr to stdout; most likely you want errors and info in the same file. | tee crawl.log pipes that combined output to tee, which splits it between the crawl.log file and stdout.
There's a tee implementation for Windows too: a Win32 port of the Unix tee command that does exactly this. See http://unxutils.sourceforge.net/ or http://getgnuwin32.sourceforge.net/
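To see the pattern in isolation, here is a minimal sketch that simulates a program logging to stderr (as Scrapy does) and duplicates the stream to both the console and a file; the filename demo.log and the sample log line are placeholders, not from the original answer:

```shell
# Simulate a program that writes a log line to stderr,
# then merge stderr into stdout and split the stream with tee:
sh -c 'echo "INFO: spider opened" >&2' 2>&1 | tee demo.log

# The line appeared on the terminal above, and tee also wrote it to demo.log:
cat demo.log
```

On Windows 10 specifically, PowerShell ships with a built-in Tee-Object cmdlet that behaves similarly, e.g. `scrapy crawl myspider 2>&1 | Tee-Object -FilePath crawl.log`, so a separate Win32 tee port is not strictly required there.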
Taken from: https://stackoverflow.com/a/796492/3737009
Answered By - Granitosaurus