
2024 iThome 鐵人賽

DAY 25

Start Training

On some systems the CUDA installation path has to be added to the environment manually:

export PATH=/usr/local/cuda/bin${PATH:+:${PATH}}  # add the CUDA binaries to PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}  # add the CUDA libraries to the loader path
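
After setting these variables, it is worth confirming that PyTorch can actually see the GPU. A minimal check, assuming PyTorch is already installed in the current environment:

import torch

print(torch.version.cuda)          # CUDA version this PyTorch build was compiled against
print(torch.cuda.is_available())   # True means the driver and GPU are visible
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # name of the first GPU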

Run the following command to load the JSON config file. The libnvinfer.so.7: cannot open shared object file warning can be ignored for now.

accelerate launch main.py --load_json_path "/home/user/trainingconfig.json"

The following is a simplified test of the training loop: it freezes the VAE and text encoder and fine-tunes the UNet on the usual noise-prediction objective.

import torch
import torch.nn.functional as F
from torchvision import transforms
from diffusers import StableDiffusionPipeline, DDPMScheduler
from datasets import load_dataset

# Configuration
model_name = "CompVis/stable-diffusion-v1-4"
output_dir = "/path/to/save/lora_model"
train_data_dir = "/path/to/your/dataset"

# Load the dataset (imagefolder layout, with a metadata file providing a "text" caption column)
dataset = load_dataset("imagefolder", data_dir=train_data_dir, split="train")

# Load the pipeline and pull out the components needed for training
pipe = StableDiffusionPipeline.from_pretrained(model_name)
tokenizer, text_encoder = pipe.tokenizer, pipe.text_encoder
vae, unet = pipe.vae, pipe.unet
noise_scheduler = DDPMScheduler.from_config(pipe.scheduler.config)

device = "cuda" if torch.cuda.is_available() else "cpu"
vae.to(device)
text_encoder.to(device)
unet.to(device)

# Training hyperparameters (batch size 1 for simplicity, one sample per step)
num_train_epochs = 3
learning_rate = 5e-5

# Image preprocessing: resize to 512x512 and normalize to [-1, 1]
preprocess = transforms.Compose([
    transforms.Resize((512, 512)),
    transforms.ToTensor(),
    transforms.Normalize([0.5], [0.5]),
])

# Training loop: freeze the VAE and text encoder, fine-tune the UNet
def train(unet, vae, text_encoder, tokenizer, dataset, output_dir, num_train_epochs, learning_rate):
    vae.requires_grad_(False)
    text_encoder.requires_grad_(False)
    optimizer = torch.optim.AdamW(unet.parameters(), lr=learning_rate)
    unet.train()
    for epoch in range(num_train_epochs):
        for step, example in enumerate(dataset):
            # Encode the image into latent space (0.18215 is the SD v1 latent scaling factor)
            pixel_values = preprocess(example["image"].convert("RGB")).unsqueeze(0).to(device)
            latents = vae.encode(pixel_values).latent_dist.sample() * 0.18215
            # Encode the caption
            input_ids = tokenizer(example["text"], padding="max_length", truncation=True,
                                  max_length=tokenizer.model_max_length,
                                  return_tensors="pt").input_ids.to(device)
            encoder_hidden_states = text_encoder(input_ids)[0]
            # Add noise at a random timestep and ask the UNet to predict it
            noise = torch.randn_like(latents)
            timesteps = torch.randint(0, noise_scheduler.config.num_train_timesteps,
                                      (latents.shape[0],), device=device)
            noisy_latents = noise_scheduler.add_noise(latents, noise, timesteps)
            noise_pred = unet(noisy_latents, timesteps, encoder_hidden_states).sample
            loss = F.mse_loss(noise_pred, noise)
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
            if step % 10 == 0:
                print(f"Epoch {epoch}, Step {step}, Loss {loss.item()}")
    unet.save_pretrained(output_dir)

# Start training
train(unet, vae, text_encoder, tokenizer, dataset, output_dir, num_train_epochs, learning_rate)
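
Note that this simplified loop fine-tunes the full UNet as a sanity check; the LoRA run launched with accelerate above trains only the low-rank adapter weights, which is why the resulting .safetensors file stays small.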

Training then starts automatically. The trained model is written to the output_dir specified in the training config file. Move the .safetensors file into the /models/Lora folder under the SD WebUI root directory.
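
A small helper for that last step; the file name and both paths below are placeholders, so replace them with your own output_dir and SD WebUI install location:

import shutil
from pathlib import Path

# Placeholder paths; adjust to your own setup
lora_file = Path("/path/to/save/lora_model") / "my_lora.safetensors"   # hypothetical output file name
webui_lora_dir = Path("/path/to/stable-diffusion-webui/models/Lora")

webui_lora_dir.mkdir(parents=True, exist_ok=True)
shutil.copy2(lora_file, webui_lora_dir / lora_file.name)
print(f"Copied {lora_file.name} to {webui_lora_dir}")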

