Error log:
2020-03-17 11:04:54,639 INFO sqlalchemy.engine.base.Engine select count(*) from proxy;
2020-03-17 11:04:54 [sqlalchemy.engine.base.Engine] INFO: select count(*) from proxy;
2020-03-17 11:04:54,640 INFO sqlalchemy.engine.base.Engine {}
2020-03-17 11:04:54 [sqlalchemy.engine.base.Engine] INFO: {}
Traceback (most recent call last):
  File "c:\users\user\appdata\local\programs\python\python37\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\user\appdata\local\programs\python\python37\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\cmdline.py", line 114, in execute
    settings = get_project_settings()
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\utils\project.py", line 69, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\settings\__init__.py", line 294, in setmodule
    module = import_module(module)
  File "c:\users\user\appdata\local\programs\python\python37\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'case1090110'
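The crash happens inside get_project_settings(): Scrapy passes the settings module path (presumably case1090110.settings) to importlib, so the import only succeeds when the directory containing the case1090110 package is on sys.path. The failing call boils down to this one-liner, which reproduces the same error when run from outside the project root:

from importlib import import_module

# Roughly what Scrapy's settings.setmodule() does under the hood (see the
# traceback above); raises ModuleNotFoundError when the case1090110
# package directory is not on sys.path.
import_module('case1090110.settings')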
settings.py:
BOT_NAME = 'case1090110'
SPIDER_MODULES = ['case1090110.spiders']
NEWSPIDER_MODULE = 'case1090110.spiders'
DB_CON_STR = 'postgresql://postgres:postgres@XXX:1234/OOO'
DOWNLOADER_MIDDLEWARES = {
# 'case1090110.middlewares.Case1090110DownloaderMiddleware': 543,
# 'case1090110.middlewares.ProxyMongoMiddleware': 542,
'case1090110.middlewares.ProxyMiddleware': 542,
'case1090110.middlewares.HeaderMiddleware': 543
}
ITEM_PIPELINES = {
# 'case1090110.pipelines.MongoDBPipeline': 300,
'case1090110.pipelines.NewsSaveToPostgresPipeline': 400,
}
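For context, the SQLAlchemy log lines at the top suggest the proxy middleware builds an engine from DB_CON_STR, roughly along these lines (a hypothetical sketch, since middlewares.py isn't shown here; the method bodies are assumptions):

from sqlalchemy import create_engine

class ProxyMiddleware:
    # Hypothetical sketch: read proxies from the remote "proxy" table.
    def __init__(self, con_str):
        self.engine = create_engine(con_str)

    @classmethod
    def from_crawler(cls, crawler):
        # DB_CON_STR is the setting defined in settings.py above
        return cls(crawler.settings.get('DB_CON_STR'))

    def process_request(self, request, spider):
        with self.engine.connect() as conn:
            count = conn.execute('select count(*) from proxy;').scalar()
        # ...pick one of the `count` proxies and set request.meta['proxy']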
Main script that runs the spider:
from scrapy.utils.log import configure_logging
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

def run_spider(name):
    # Keep Scrapy from installing its own root log handler
    configure_logging(install_root_handler=False)
    process = CrawlerProcess(get_project_settings())
    process.crawl(name)
    process.start()

if __name__ == '__main__':
    run_spider('google_news')
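Note that get_project_settings() resolves the settings module from the SCRAPY_SETTINGS_MODULE environment variable, falling back to the scrapy.cfg found from the current working directory; if neither points at the project, the import fails as in the traceback above. An explicit variant of the runner that pins both (the project-root path below is a placeholder):

import os
import sys

# Placeholder path: the folder containing scrapy.cfg and the case1090110/ package
sys.path.insert(0, r'C:\path\to\project_root')
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'case1090110.settings')

from scrapy.crawler import CrawlerProcess
from scrapy.utils.log import configure_logging
from scrapy.utils.project import get_project_settings

def run_spider(name):
    configure_logging(install_root_handler=False)
    process = CrawlerProcess(get_project_settings())
    process.crawl(name)
    process.start()

if __name__ == '__main__':
    run_spider('google_news')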
Connecting to the local SQL server works fine, but connecting to the remote SQL server raises No module named 'case1090110'.
How can I fix this?