
ModuleNotFoundError when connecting Scrapy to a remote SQL database


Error log:

2020-03-17 11:04:54,639 INFO sqlalchemy.engine.base.Engine select count(*) from proxy;
2020-03-17 11:04:54 [sqlalchemy.engine.base.Engine] INFO: select count(*) from proxy;
2020-03-17 11:04:54,640 INFO sqlalchemy.engine.base.Engine {}
2020-03-17 11:04:54 [sqlalchemy.engine.base.Engine] INFO: {}
Traceback (most recent call last):
  File "c:\users\user\appdata\local\programs\python\python37\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\user\appdata\local\programs\python\python37\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\cmdline.py", line 114, in execute
    settings = get_project_settings()
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\utils\project.py", line 69, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\settings\__init__.py", line 294, in setmodule
    module = import_module(module)
  File "c:\users\user\appdata\local\programs\python\python37\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'case1090110'

settings.py

BOT_NAME = 'case1090110'

SPIDER_MODULES = ['case1090110.spiders']
NEWSPIDER_MODULE = 'case1090110.spiders'

DB_CON_STR = 'postgresql://postgres:postgres@XXX:1234/OOO'

DOWNLOADER_MIDDLEWARES = {
    # 'case1090110.middlewares.Case1090110DownloaderMiddleware': 543,
    # 'case1090110.middlewares.ProxyMongoMiddleware': 542,
    'case1090110.middlewares.ProxyMiddleware': 542,
    'case1090110.middlewares.HeaderMiddleware': 543,
}

ITEM_PIPELINES = {
    # 'case1090110.pipelines.MongoDBPipeline': 300,
    'case1090110.pipelines.NewsSaveToPostgresPipeline': 400,
}
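
For context, ITEM_PIPELINES points at case1090110.pipelines.NewsSaveToPostgresPipeline, which presumably reads DB_CON_STR and writes through SQLAlchemy (the engine logging in the traceback above comes from SQLAlchemy). A minimal sketch of what such a pipeline could look like; this is an assumption for illustration, not the original pipelines.py, and the news table and columns are placeholders:

# Hypothetical sketch of case1090110/pipelines.py, not the original code.
from sqlalchemy import create_engine, text


class NewsSaveToPostgresPipeline:
    def __init__(self, db_con_str):
        self.db_con_str = db_con_str
        self.engine = None

    @classmethod
    def from_crawler(cls, crawler):
        # DB_CON_STR is read from the project settings shown above.
        return cls(db_con_str=crawler.settings.get('DB_CON_STR'))

    def open_spider(self, spider):
        self.engine = create_engine(self.db_con_str)

    def process_item(self, item, spider):
        # Insert the scraped item; table and column names are placeholders.
        with self.engine.begin() as conn:
            conn.execute(
                text('INSERT INTO news (title, url) VALUES (:title, :url)'),
                {'title': item.get('title'), 'url': item.get('url')},
            )
        return item

    def close_spider(self, spider):
        if self.engine is not None:
            self.engine.dispose()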

Main script that runs the spider:

from scrapy.utils.log import configure_logging
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

def run_spider(name):
    # Let Scrapy configure logging instead of installing the root handler
    configure_logging(install_root_handler=False)
    # get_project_settings() imports the project settings module
    # (case1090110.settings); this is where the ModuleNotFoundError is raised
    process = CrawlerProcess(get_project_settings())
    process.crawl(name)
    process.start()


if __name__ == '__main__':
    run_spider('google_news')
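
For reference, get_project_settings() resolves the project settings module from the SCRAPY_SETTINGS_MODULE environment variable, and if that is not set it falls back to the project entry in the scrapy.cfg found from the current working directory. So "No module named 'case1090110'" usually means the package cannot be imported from wherever the script is launched. A minimal sketch of pinning both things down explicitly; the project root path below is an assumption based on the directory mentioned in the discussion further down:

# Hypothetical workaround sketch, not the original script.
import os
import sys

# Make the directory that contains scrapy.cfg and the case1090110/ package
# importable regardless of the working directory (assumed path).
sys.path.insert(0, r'D:\【程式設計】\kyc-crawler\case1090110')

# Tell Scrapy which settings module to load instead of relying on scrapy.cfg.
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'case1090110.settings')

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

if __name__ == '__main__':
    process = CrawlerProcess(get_project_settings())
    process.crawl('google_news')
    process.start()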

Connecting to the local SQL database works fine, but connecting to the remote SQL database raises "No module named 'case1090110'".
How can I fix this?

Rex Chien (iT邦 Newbie Level 4) ‧ 2020-03-17 11:57:03
It looks like this is caused by the working directory being different when scrapy is launched?
When you connect to the local database versus the remote one, what working directory are you in, and what command do you use to start the crawl?
Huiicat (iT邦 Newbie Level 4) ‧ 2020-03-17 13:37:40
The local working directory is D:\【程式設計】\kyc-crawler\case1090110\case1090110.
For the remote case I only connect to 'postgresql://postgres:postgres@XXX:1234/OOO'; the actual data is still local. Would that cause any problem?
Rex Chien (iT邦 Newbie Level 4) ‧ 2020-03-17 14:38:05
When you connect to the local database and to the remote one, do you run the main script from the same working directory?
Huiicat (iT邦 Newbie Level 4) ‧ 2020-03-18 13:43:37
Yes. It later turned out the cause seemed to be that the Scrapy project had been renamed, so the module could no longer be resolved. Even after updating every place that referenced the project name it still didn't work, and in the end I solved it by creating a brand-new project.
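
After renaming a Scrapy project, a quick way to check whether the renamed package is still resolvable, mirroring the import that fails in the traceback above, is to import it directly from the directory used to launch the crawl; a minimal sketch (nothing here is taken from the original project):

# Hypothetical check, not from the thread: run this from the same working
# directory used to launch the crawl.
import importlib
import sys

print(sys.path[0])  # the first place Python looks for the package
importlib.import_module('case1090110')            # should not raise
importlib.import_module('case1090110.settings')   # what Scrapy itself imports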

No answers have been posted yet.
