Issue
I want to read a command-line argument outside of a spider. Before Scrapy 1.7, I could do this:
from scrapy.conf import settings
But that import has since been removed; now you are supposed to use:
from scrapy.utils.project import get_project_settings
However, with the latter approach I can't read the settings that I pass on the command line with -s, like this:
scrapy parse --spider=result_spider -c _parse_results -d 3 --nocolour "http://sss.com" -s ACT=grab_result
The ACT setting can be accessed inside the spider, but not in other files, even when get_project_settings() is used:
from scrapy.utils.project import get_project_settings
from sqlalchemy import Column, String, Integer
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()
settings = get_project_settings()

class Keyword(Base):
    __tablename__ = settings.get('KEYWORD_TABLE', 'hwords')
    print('ACT value is: %s' % settings['ACT'])  # prints None, even though -s ACT=... was passed

    id = Column(Integer, primary_key=True)
    word = Column(String(256), unique=True)

    def __repr__(self):
        return "<%s(word='%s')>" % (self.__tablename__, self.word)
Solution
I couldn't find an answer anywhere, and the usual examples of get_project_settings() don't cover my case. The reason is that get_project_settings() only loads the project's settings module; overrides passed with -s are applied to the crawler's own settings object at startup and never reach a module-level get_project_settings() call. It seems I have to parse the command line myself, e.g. with argparse, to get those values.
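As a minimal sketch of the argparse approach: the helper below (parse_cli_settings is a hypothetical name, not part of Scrapy) scans the raw command line for -s NAME=VALUE pairs and ignores everything else, so it can be called from any module that needs the overrides.

```python
import argparse
import sys

def parse_cli_settings(argv=None):
    """Extract -s NAME=VALUE overrides from a Scrapy-style command line.

    Hypothetical helper: only the -s options are parsed; all other
    arguments (subcommand, --spider, URLs, ...) are left alone.
    """
    if argv is None:
        argv = sys.argv[1:]
    parser = argparse.ArgumentParser(add_help=False)
    parser.add_argument('-s', '--set', action='append', default=[],
                        metavar='NAME=VALUE')
    # parse_known_args tolerates the options argparse was not told about
    args, _unknown = parser.parse_known_args(argv)
    overrides = {}
    for item in args.set:
        name, _, value = item.partition('=')
        overrides[name] = value
    return overrides
```

For the command line in the question, parse_cli_settings(['parse', '--spider=result_spider', '-c', '_parse_results', '-d', '3', '--nocolour', 'http://sss.com', '-s', 'ACT=grab_result']) returns {'ACT': 'grab_result'}, which the model module can read instead of the settings object.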
Answered By - Jett