Issue
Is there a way in Scrapy to dynamically set a spider's settings at runtime?
I want to add an isDebug
variable to my spider and, depending on its value, adjust the log level, the pipelines, and various other settings.
When I try to manipulate the settings as described in the manual, like this:
def from_crawler(cls, crawler):
    settings = crawler.settings
    settings['USER_AGENT'] = 'Overwridden-UA'
I always get: TypeError: Trying to modify an immutable Settings object
Solution
The Settings object is immutable by itself, but it has a number of setter methods, for example settings.set:
https://github.com/scrapy/scrapy/blob/129421c7e31b89b9b0f9c5f7d8ae59e47df36091/scrapy/settings/__init__.py#L234
In recent versions of Scrapy (since 1.0), spiders have a class method update_settings:
@classmethod
def update_settings(cls, settings):
    settings.setdict(cls.custom_settings or {}, priority='spider')
which is intended to override settings with the ones defined in the custom_settings
property of the spider. So to reach your goal you can override that method, like this:
class TheSpider(scrapy.Spider):
    name = 'thespider'
    is_debug = True
    custom_debug_settings = {
        # Put your debug settings here
    }

    @classmethod
    def update_settings(cls, settings):
        settings.setdict(
            getattr(cls,
                    'custom_debug_settings' if getattr(cls, 'is_debug', False)
                    else 'custom_settings',
                    None) or {},
            priority='spider')
And of course there is the project-wide, 'Two Scoops of Django' way: keep a separate settings file for debug purposes. It could look like this:
settings.py (add to the end of the file):
try:
    from dev_settings import *
except ImportError:
    pass
Then you can create dev_settings.py next to settings.py and put there the settings you'd like to customize for development: they will override the defaults if dev_settings.py exists, or the import will simply be ignored if it does not.
Answered By - mizhgun