Issue
I am getting the error below while running my Scrapy project. I have tried everything suggested on Stack Overflow, but nothing has solved the problem.
Feel free to ask for more information. I am looking forward to any help.
(venv) [kalpesh@localhost scraper]$ scrapy crawl mrdeepfakes -a output=db
Traceback (most recent call last):
  File "/home/kalpesh/venv/bin/scrapy", line 8, in <module>
    sys.exit(execute())
  File "/home/kalpesh/venv/lib/python3.6/site-packages/scrapy/cmdline.py", line 113, in execute
    settings = get_project_settings()
  File "/home/kalpesh/venv/lib/python3.6/site-packages/scrapy/utils/project.py", line 69, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "/home/kalpesh/venv/lib/python3.6/site-packages/scrapy/settings/__init__.py", line 287, in setmodule
    module = import_module(module)
  File "/home/kalpesh/venv/lib64/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'scraper.settings'
Solution
Make sure the default and project entries in your scrapy.cfg use the same name as your spider crawler inside the spiders folder. In my case, I renamed the spider crawler in the Python file, updated scrapy.cfg to match, and the crawl works.
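For reference, here is a minimal sketch of the layout these names have to line up with. The error "No module named 'scraper.settings'" means the default entry in scrapy.cfg points at scraper.settings, but Python cannot import that package, so the package folder name and the scrapy.cfg entries must agree. The project name "scraper" and the spider file name are taken from the traceback above; your own names may differ.

scraper/                  <- project root; run "scrapy crawl ..." from here
    scrapy.cfg
    scraper/              <- project package; this folder name must match scrapy.cfg
        __init__.py
        settings.py       <- the module that scrapy.cfg's "default" entry imports
        items.py
        pipelines.py
        spiders/
            __init__.py
            mrdeepfakes.py

scrapy.cfg (in the project root):

    [settings]
    default = scraper.settings

    [deploy]
    project = scraper

If the package folder is renamed (or the command is run from a directory without scrapy.cfg), the import of scraper.settings fails with exactly this ModuleNotFoundError.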
Answered By - Erika Aldisa