
[From Zero to AI Dream-Interpretation Master] Day 25 - I Ching Dream Interpretation with Artificial Intelligence (6)


Artificial Intelligence (6)

Preface

About this series

Hello everyone, we are the AI . FREE Team. In this Ironman Contest the team will take readers from 0 to 1, step by step, through (1) basic Python syntax, (2) the Python web framework Django, (3) Python web scraping - building the I Ching dream-interpretation site, (4) the basics of AI language models in TensorFlow and training an LSTM, and (5) actually deploying the AI dream-interpretation model onto the web framework.

Why we write the technical material from zero

The team was founded to build learning resources for AI and new technologies, so that learners from every field can cross over into data science, develop a portfolio ("slash") career through self-directed learning, and apply intelligent technologies to their own domains. Whether you come from the humanities, law, business, management, or medicine, you are free to learn AI.

Resources

AI . FREE Team reader-exclusive bonus → free Python Basics learning resources

This article reviews the earlier material once more, spells out a few details, and then walks through a hands-on multi-class/multi-label implementation.

Hyperparameters

When training a model there are parameters and hyperparameters. Parameters are things like the model's weights and biases, which are updated on their own through forward and backward propagation. Hyperparameters are the knobs we set by hand: the learning rate, the number of iterations, the number of hidden layers (L), the number of units in each hidden layer (n^[1], n^[2], ...), and the batch size covered in an earlier article.

Of course, the ordering of importance above is just my own habit when building models; after a few rounds of practice you will find your own habits and methods for tuning hyperparameters.
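As a quick illustration only (a minimal sketch with placeholder values, not the model we build later in this article), here is where each of those hyperparameters typically shows up in Keras code:

from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam

# Hyperparameters we choose by hand (placeholder values for illustration)
learning_rate = 0.01      # learning rate
epochs = 50               # iterations over the data
batch_size = 32           # batch size
hidden_units = [64, 32]   # units per hidden layer (n^[1], n^[2]); the list length is L

# The weights and biases inside the Dense layers are the *parameters*,
# updated automatically by forward/backward propagation during fit()
model = Sequential()
model.add(Dense(hidden_units[0], activation='relu', input_dim=2))
model.add(Dense(hidden_units[1], activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(Adam(lr=learning_rate), 'binary_crossentropy', metrics=['accuracy'])
# model.fit(X, y, epochs=epochs, batch_size=batch_size)  # X, y would come from your data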

Local minima

Gradient descent can get stuck at a local minimum and never find the global optimum.

As gradient descent runs and the loss gets smaller, the slope flattens out (as in the figure above): the derivative shrinks closer and closer to 0, the updates become extremely slow, and the weights effectively stop changing.
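Each step updates a weight by roughly w := w - learning_rate * dL/dw, so once the slope is nearly 0 the step size collapses no matter what the learning rate is; a toy sketch (the slope value is made up):

# Toy illustration: on an almost-flat region the updates shrink to nearly nothing
learning_rate = 0.1
w = 2.0
for step in range(5):
    grad = 1e-4 * w                  # pretend the local slope is almost zero
    w = w - learning_rate * grad     # gradient descent update
    print(step, w)                   # w barely moves between steps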

There are many optimizers that help with this problem: Adam, RMSprop.

Gradient descent is a complicated problem; the harder the classification task, the more complex the peaks and valleys of the loss surface become.
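Switching optimizers in Keras is a one-line change in compile(); a minimal sketch (the tiny model and the learning rates here are only placeholders):

from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam, RMSprop

model = Sequential([Dense(3, activation='softmax', input_dim=2)])

# Try Adam first; RMSprop is the one-line swap if training stalls
model.compile(optimizer=Adam(lr=0.001), loss='categorical_crossentropy', metrics=['accuracy'])
# model.compile(optimizer=RMSprop(lr=0.001), loss='categorical_crossentropy', metrics=['accuracy'])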

Softmax regression

Softmax regression is an approach for handling decision boundaries over multiple classes (multi-class decision boundaries).

Before we get to softmax we first need to explain what multi-label and multi-class mean.

The figure above should make this quite obvious XD (I think, anyway): in short, multi-class means a sample belongs to exactly one class at a time, whereas multi-label means a sample can carry several labels.

OK, so here's the question: how do we classify multiple labels? Remember that for the earlier binary classification we used 0.5 as the threshold: above 0.5 is 1, otherwise 0.

The multi setup works the same way, except that the final output layer has n neurons for your n classes.

Each output neuron represents one class: cloud, sun, and moon make 3 classes, so there are 3 neurons in the output layer. If the task is multi-label, then, as described below, every output above 0.5 is True and the rest are 0.
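A minimal multi-label sketch, assuming 3 labels (cloud, sun, moon) and 2 input features: each output neuron gets its own sigmoid and is thresholded at 0.5 independently (the random inputs are just for illustration):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Multi-label: one sigmoid per label instead of a single softmax
model = Sequential()
model.add(Dense(32, activation='relu', input_dim=2))
model.add(Dense(3, activation='sigmoid'))            # cloud, sun, moon
model.compile('adam', 'binary_crossentropy', metrics=['accuracy'])

probs = model.predict(np.random.rand(4, 2))          # untrained, just to show the shapes
labels = (probs > 0.5).astype(int)                   # e.g. [1, 0, 1] = cloud and moon, no sun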

For a multi-class problem we usually pick the neuron with the largest output as the answer; later on softmax was introduced to activate the output layer.

Softmax is a normalization: it turns each class score into a value between 0 and 1, and the values across all classes sum to 1.

As the figure above shows, the output of each neuron then becomes a probability.
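A minimal NumPy sketch of what softmax does to a vector of class scores (the scores are made-up numbers):

import numpy as np

def softmax(z):
    # subtract the max for numerical stability, exponentiate, then normalize
    e = np.exp(z - np.max(z))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])   # raw outputs for cloud, sun, moon
probs = softmax(scores)
print(probs)          # every value lies between 0 and 1
print(probs.sum())    # and they sum to 1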

OK, if you're interested, feel free to dig deeper on your own; let's move on to the implementation~

Implementation

Import packages

%matplotlib inline
%config InlineBackend.figure_format = 'retina'
import matplotlib.pyplot as plt
import pandas as pd

import numpy as np
import seaborn as sns
import warnings
from datetime import datetime
from matplotlib.colors import ListedColormap
from sklearn.datasets import make_classification, make_moons, make_circles
from sklearn.metrics import confusion_matrix, classification_report, mean_squared_error, mean_absolute_error, r2_score
from sklearn.linear_model import LogisticRegression
from sklearn.utils import shuffle
from keras.models import Sequential
from keras.layers import Dense, Dropout, BatchNormalization, Activation
from keras.optimizers import Adam
from keras.utils.np_utils import to_categorical

import keras.backend as K
from keras.wrappers.scikit_learn import KerasClassifier

Same as in the previous article.

Dataset

def make_multiclass(N=500, D=2, K=3):
    np.random.seed(1)
    X = np.zeros((N*K, D))
    y = np.zeros(N*K)
    for j in range(K):
        ix = range(N*j, N*(j+1))
        # radius
        r = np.linspace(0.0,1,N)
        # theta
        t = np.linspace(j*4,(j+1)*4,N) + np.random.randn(N)*0.2
        X[ix] = np.c_[r*np.sin(t), r*np.cos(t)]
        y[ix] = j
    fig = plt.figure(figsize=(6, 6))
    plt.scatter(X[:, 0], X[:, 1], c=y, s=40, cmap=plt.cm.RdYlGn, alpha=0.8)
    plt.xlim([-1,1])
    plt.ylim([-1,1])
    return X, y
X, y = make_multiclass(K=3)

There are 3 classes here; let's print the data to take a look.

print(y)

Output
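The screenshot of the output is not reproduced here, but since make_multiclass fills y with 500 zeros, 500 ones, and 500 twos, print(y) should show roughly:

[0. 0. 0. ... 2. 2. 2.]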

You can see there is now a 2 as well, giving 3 classes (0, 1, 2).

Build the model, define the hyperparameters, the loss function, and the optimizer

model = Sequential()
model.add(Dense(128, activation='relu'))
model.add(Dense(64, activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(16, activation='relu'))
model.add(Dense(3, activation='softmax'))

model.compile(Adam(lr=0.01), 'categorical_crossentropy', metrics=['accuracy'])

Since we covered all of this before, I'll move through it a bit faster here.

y_cat = to_categorical(y)

Print it out to take a look.

print(y_cat,y_cat.shape)

shape gives the dimensions of the matrix (only arrays and np.array have it; a plain list does not).

[[1. 0. 0.]
 [1. 0. 0.]
 [1. 0. 0.]
 ...
 [0. 0. 1.]
 [0. 0. 1.]
 [0. 0. 1.]] (1500, 3)

Because the output layer has neurons for 3 classes, the labels must be one-hot encoded (see the cloud/sun/moon format above).

Start training

history = model.fit(X, y_cat, verbose=1, epochs=50)
plot_loss_accuracy(history)

Output

Epoch 1/50
47/47 [==============================] - 0s 2ms/step - loss: 0.6006 - accuracy: 0.6880
Epoch 2/50
47/47 [==============================] - 0s 2ms/step - loss: 0.2525 - accuracy: 0.8940
Epoch 3/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0524 - accuracy: 0.9800
Epoch 4/50
47/47 [==============================] - 0s 2ms/step - loss: 0.1231 - accuracy: 0.9633
Epoch 5/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0394 - accuracy: 0.9833
Epoch 6/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0781 - accuracy: 0.9793
Epoch 7/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0388 - accuracy: 0.9880
Epoch 8/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0435 - accuracy: 0.9867
Epoch 9/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0323 - accuracy: 0.9893
Epoch 10/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0315 - accuracy: 0.9900
Epoch 11/50
47/47 [==============================] - 0s 2ms/step - loss: 0.1469 - accuracy: 0.9700
Epoch 12/50
47/47 [==============================] - 0s 3ms/step - loss: 0.0308 - accuracy: 0.9867
Epoch 13/50
47/47 [==============================] - 0s 3ms/step - loss: 0.0355 - accuracy: 0.9860
Epoch 14/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0619 - accuracy: 0.9807
Epoch 15/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0405 - accuracy: 0.9873
Epoch 16/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0213 - accuracy: 0.9927
Epoch 17/50
47/47 [==============================] - 0s 3ms/step - loss: 0.0368 - accuracy: 0.9853
Epoch 18/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0516 - accuracy: 0.9807
Epoch 19/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0408 - accuracy: 0.9840
Epoch 20/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0156 - accuracy: 0.9953
Epoch 21/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0412 - accuracy: 0.9853
Epoch 22/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0233 - accuracy: 0.9913
Epoch 23/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0192 - accuracy: 0.9927
Epoch 24/50
47/47 [==============================] - 0s 3ms/step - loss: 0.0271 - accuracy: 0.9907
Epoch 25/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0218 - accuracy: 0.9900
Epoch 26/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0225 - accuracy: 0.9927
Epoch 27/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0153 - accuracy: 0.9953
Epoch 28/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0326 - accuracy: 0.9873
Epoch 29/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0152 - accuracy: 0.9940
Epoch 30/50
47/47 [==============================] - 0s 3ms/step - loss: 0.0148 - accuracy: 0.9960
Epoch 31/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0333 - accuracy: 0.9887
Epoch 32/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0462 - accuracy: 0.9880
Epoch 33/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0302 - accuracy: 0.9893
Epoch 34/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0830 - accuracy: 0.9713
Epoch 35/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0188 - accuracy: 0.9933
Epoch 36/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0278 - accuracy: 0.9893
Epoch 37/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0139 - accuracy: 0.9947
Epoch 38/50
47/47 [==============================] - 0s 3ms/step - loss: 0.0321 - accuracy: 0.9873
Epoch 39/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0131 - accuracy: 0.9947
Epoch 40/50
47/47 [==============================] - 0s 3ms/step - loss: 0.1445 - accuracy: 0.9707
Epoch 41/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0757 - accuracy: 0.9807
Epoch 42/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0712 - accuracy: 0.9793
Epoch 43/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0398 - accuracy: 0.9860
Epoch 44/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0157 - accuracy: 0.9940
Epoch 45/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0113 - accuracy: 0.9953
Epoch 46/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0107 - accuracy: 0.9960
Epoch 47/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0213 - accuracy: 0.9927
Epoch 48/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0127 - accuracy: 0.9947
Epoch 49/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0128 - accuracy: 0.9940
Epoch 50/50
47/47 [==============================] - 0s 2ms/step - loss: 0.0150 - accuracy: 0.9947

If the progress bar during training annoys you, set verbose to 0.
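plot_loss_accuracy is the helper carried over from the previous article and is not redefined here; a minimal sketch of what it does, assuming the usual Keras History object (layout and styling are my own):

import matplotlib.pyplot as plt

def plot_loss_accuracy(history):
    # Plot training loss and accuracy per epoch from the Keras History object
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
    ax1.plot(history.history['loss'])
    ax1.set_title('loss')
    ax1.set_xlabel('epoch')
    ax2.plot(history.history['accuracy'])
    ax2.set_title('accuracy')
    ax2.set_xlabel('epoch')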

Visualize the decision boundary

def plot_multiclass_decision_boundary(model, X, y):
    x_min, x_max = X[:, 0].min() - 0.1, X[:, 0].max() + 0.1
    y_min, y_max = X[:, 1].min() - 0.1, X[:, 1].max() + 0.1
    xx, yy = np.meshgrid(np.linspace(x_min, x_max, 101), np.linspace(y_min, y_max, 101))
    cmap = ListedColormap(['#FF0000', '#00FF00', '#0000FF'])

    Z = model.predict_classes(np.c_[xx.ravel(), yy.ravel()], verbose=0)
    Z = Z.reshape(xx.shape)
    fig = plt.figure(figsize=(8, 8))
    plt.contourf(xx, yy, Z, cmap=plt.cm.RdYlGn, alpha=0.8)
    plt.scatter(X[:, 0], X[:, 1], c=y, s=40, cmap=plt.cm.RdYlGn)
    plt.xlim(xx.min(), xx.max())
    plt.ylim(yy.min(), yy.max())
    
plot_multiclass_decision_boundary(model, X, y)
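One caveat: predict_classes only exists on Sequential models in older Keras versions and has been removed in newer TensorFlow/Keras releases; if it raises an AttributeError for you, an equivalent sketch for a softmax output is:

import numpy as np

# Same result as model.predict_classes(X) for a softmax output layer
y_pred = np.argmax(model.predict(X), axis=-1)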

classification_report

y_pred = model.predict_classes(X, verbose=0)
print(classification_report(y, y_pred))
              precision    recall  f1-score   support

         0.0       0.98      1.00      0.99       500
         1.0       1.00      0.99      0.99       500
         2.0       1.00      0.99      0.99       500

    accuracy                           0.99      1500
   macro avg       0.99      0.99      0.99      1500
weighted avg       0.99      0.99      0.99      1500

Want to get to know AI . FREE Team better?

AI . FREE Team official website: https://aifreeblog.herokuapp.com/
AI . FREE Team GitHub: https://github.com/AI-FREE-Team/
AI . FREE Team Facebook page: https://www.facebook.com/AI.Free.Team/
AI . FREE Team Instagram: https://www.instagram.com/aifreeteam/
AI . FREE Team YouTube: https://www.youtube.com/channel/UCjw6Kuw3kwM_il39NTBJVTg/

This article is also published on the AI . FREE Team blog.
(Want more articles and more AI knowledge? Stay tuned to the team's channels!)

