Scrapy: calling a spider with os.system() inside middlewares raises "No module named <project name>.settings"


The problem is as in the title. I'm using MongoDB; everything works when I test locally, but as soon as the spider has to connect to the server's MongoDB this error appears, while the server's PostgreSQL works fine. Could the server MongoDB be the problem?

Error traceback:

Traceback (most recent call last):
  File "c:\users\user\appdata\local\programs\python\python37\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\user\appdata\local\programs\python\python37\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\user\AppData\Local\Programs\Python\Python37\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\cmdline.py", line 114, in execute
    settings = get_project_settings()
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\utils\project.py", line 69, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "c:\users\user\appdata\local\programs\python\python37\lib\site-packages\scrapy\settings\__init__.py", line 294, in setmodule
    module = import_module(module)
  File "c:\users\user\appdata\local\programs\python\python37\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'searchnews.settings'
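From the traceback, get_project_settings() fails while importing the project settings module. Scrapy resolves that module from the SCRAPY_SETTINGS_MODULE environment variable, which it sets from the scrapy.cfg it finds by walking up from the current working directory. A minimal diagnostic sketch, run from the process that launches "scrapy crawl", shows what the child process will see:

import os

# Where Scrapy will start looking for scrapy.cfg, and which settings module
# (if any) is already configured for this process.
print("cwd:", os.getcwd())
print("SCRAPY_SETTINGS_MODULE:", os.environ.get("SCRAPY_SETTINGS_MODULE"))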

settings.py

DOWNLOADER_MIDDLEWARES = {
    'searchnews.middlewares.ProxyMongoMiddleware': 542,
    # 'searchnews.middlewares.ProxyMiddleware': 542,
    'searchnews.middlewares.HeaderMiddleware': 543
}

middlewares.py

class ProxyMongoMiddleware(object):

...
    def proxy_count(self):
        import os
        # Connect to MongoDB and open the proxy collection
        self.client = pymongo.MongoClient(self.DB_URI)
        self.db_collection = self.client[self.DB_NAME][self.DBCOL_PROXY]
        ip_num = self.db_collection.find().count()
        temp = 1
        if ip_num < 2:
            # Too few proxies left: keep launching the proxy spiders
            # until the collection holds at least 10 documents
            while temp < 10:
                # temp = 10
                os.system("scrapy crawl proxy_example")
                os.system("scrapy crawl xici")
                temp = self.db_collection.find().count()
        else:
            return False
        return True
...
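One thing worth checking here: os.system() runs "scrapy crawl" in whatever directory the parent process happens to be in, and if that directory is not inside the project, Scrapy cannot find scrapy.cfg and therefore cannot import searchnews.settings. A minimal sketch, assuming the hypothetical project root path below, of launching the child crawl with an explicit working directory and settings module:

import os
import subprocess

PROJECT_ROOT = r"C:\projects\searchnews"   # hypothetical path; use the real project directory

env = os.environ.copy()
env["SCRAPY_SETTINGS_MODULE"] = "searchnews.settings"   # module get_project_settings() will import

# Run the crawls from the project root so scrapy.cfg is found by walking up from cwd
subprocess.run(["scrapy", "crawl", "proxy_example"], cwd=PROJECT_ROOT, env=env, check=False)
subprocess.run(["scrapy", "crawl", "xici"], cwd=PROJECT_ROOT, env=env, check=False)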

1 answer

bingtw
iT邦 Newbie level 5 ‧ 2020-03-26 16:24:19

You should first check whether the machine you run scrapy on can actually connect to the server MongoDB...
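A minimal connectivity check along these lines, assuming DB_URI below is a placeholder for the same connection string the middleware uses:

import pymongo

DB_URI = "mongodb://user:password@your-server:27017/"   # hypothetical placeholder

client = pymongo.MongoClient(DB_URI, serverSelectionTimeoutMS=5000)
try:
    client.admin.command("ping")          # raises if the server cannot be reached
    print("server MongoDB reachable")
except pymongo.errors.PyMongoError as exc:
    print("server MongoDB not reachable:", exc)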
