I've been working on a web crawler recently. The URL I'm scraping is https://www.jkf.net/lady/2021/rank.php. I managed to scrape the data successfully, but when I try to write it out as a CSV file I get this error:
ValueError: I/O operation on closed file.
My code is as follows:
import csv
import requests
from bs4 import BeautifulSoup

url = 'https://www.jkf.net/lady/2021/rank.php'
response = requests.get(url=url)
soup = BeautifulSoup(response.text, 'lxml')
info_items = soup.find_all('li', 'in_fade')

with open('JKF.csv', 'w', encoding='utf-8', newline='') as csv_file:
    csv_writer = csv.writer(csv_file)
    csv_writer.writerow(['號碼', '暱稱', '網址'])

for item in info_items:
    num = item.find('div', 'num').text.strip()
    name = item.find('div', 'tx').text.strip()
    link = item.find('a').get('href')
    csv_writer.writerow([num, name, link])
    print('{} {} 網址:{}'.format(num, name, link))
Why is this happening? Can anyone help me figure it out? Thanks!
In Python you have to pay attention to indentation — it is what defines a code block. The for loop needs to be inside the scope of the with.
In your version, the with is followed by only two indented lines, and you even left a blank line after them. Past that point you have exited the with block, so Python automatically closed the file for you. That is the convenience of with, but if you are not used to it you run into exactly this situation.
Once you have hit this a few times you will start watching for it. Many editors let you fold and expand code blocks in the margin — make good use of that. I have not run the code below; I only fixed the indentation:
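Here is a minimal standalone sketch (no scraping involved, just a throwaway demo.csv file) that reproduces the same behavior: once the with block ends, the file is closed, and any further write raises exactly that ValueError:

```python
import csv

# The file is open and writable only inside the with block.
with open('demo.csv', 'w', encoding='utf-8', newline='') as csv_file:
    csv_writer = csv.writer(csv_file)
    csv_writer.writerow(['header'])

# Leaving the with block closes the file automatically.
print(csv_file.closed)  # True

# Writing after that point raises the error from the question.
try:
    csv_writer.writerow(['too late'])
except ValueError as err:
    print(err)  # I/O operation on closed file.
```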
import csv
import requests
from bs4 import BeautifulSoup

url = 'https://www.jkf.net/lady/2021/rank.php'
response = requests.get(url=url)
soup = BeautifulSoup(response.text, 'lxml')
info_items = soup.find_all('li', 'in_fade')

with open('JKF.csv', 'w', encoding='utf-8', newline='') as csv_file:
    csv_writer = csv.writer(csv_file)
    csv_writer.writerow(['號碼', '暱稱', '網址'])
    for item in info_items:
        num = item.find('div', 'num').text.strip()
        name = item.find('div', 'tx').text.strip()
        link = item.find('a').get('href')
        csv_writer.writerow([num, name, link])
        print('{} {} 網址:{}'.format(num, name, link))
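One more thing worth guarding against: item.find() returns None when an element is missing, and calling .text on None crashes. Below is a sketch of the same parsing logic with that guard, run against a small hardcoded HTML sample (the sample structure is my assumption based on the class names in your code, not the real page). It uses the stdlib html.parser so there is no lxml dependency, and writes to an in-memory buffer instead of a file just for demonstration:

```python
import csv
from io import StringIO
from bs4 import BeautifulSoup

# Tiny hardcoded sample mimicking the page structure (an assumption).
html = '''
<ul>
  <li class="in_fade">
    <div class="num">1</div>
    <div class="tx">Alice</div>
    <a href="https://example.com/1">link</a>
  </li>
  <li class="in_fade"><div class="num">2</div></li>
</ul>
'''

soup = BeautifulSoup(html, 'html.parser')
buf = StringIO()
csv_writer = csv.writer(buf)
for item in soup.find_all('li', 'in_fade'):
    num_div = item.find('div', 'num')
    name_div = item.find('div', 'tx')
    link_a = item.find('a')
    if not (num_div and name_div and link_a):
        continue  # skip items missing an expected element
    csv_writer.writerow([num_div.text.strip(),
                         name_div.text.strip(),
                         link_a.get('href')])

print(buf.getvalue())
```

The second, incomplete <li> is skipped instead of raising AttributeError; only the complete row ends up in the output.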