
2021 iThome 鐵人賽


「開學之前....」series, part 13

Day14-Machine Learning : Self-attention

Since I'll need this later, today I did a quick read-through of the topic.

The slides and content all come from the YouTube lecture by Prof. Hung-yi Lee of National Taiwan University:
https://www.youtube.com/watch?v=gmsMY5kc-zw&ab_channel=Hung-yiLee

  • Network input: a vector, or a set of vectors
    Output, three cases:
    1. every vector has its own label (e.g. sequence labeling)
    2. the whole sequence has one label
    3. the model decides how many labels to output (e.g. seq2seq)

First, the case where every output vector gets its own label: Sequence Labeling.

https://ithelp.ithome.com.tw/upload/images/20210929/20140843luSKXumr35.png
One idea is to feed a fully-connected network a whole window of data at a time,
but a window cannot cover the entire sequence,
so if the whole sequence should be taken into account, use self-attention.
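As a minimal sketch of the window-based approach described above (all names, dimensions, and the random weights are my own illustrative assumptions, not from the lecture): each position concatenates its window of neighbours and passes it through one fully-connected layer.

```python
import numpy as np

# Hypothetical toy setup: a sequence of 6 vectors, each of dimension 4.
seq_len, d = 6, 4
rng = np.random.default_rng(0)
x = rng.standard_normal((seq_len, d))

# Window = the vector itself plus one neighbour on each side,
# concatenated and fed to a fully-connected layer.
window = 1                               # neighbours on each side
W = rng.standard_normal((3 * d, d))      # FC weights (random here, learned in practice)

outputs = []
for i in range(seq_len):
    # Gather the window, padding with zeros at the sequence boundaries.
    parts = [x[j] if 0 <= j < seq_len else np.zeros(d)
             for j in range(i - window, i + window + 1)]
    outputs.append(np.concatenate(parts) @ W)
outputs = np.stack(outputs)
print(outputs.shape)  # (6, 4): one output vector per input vector
```

Each output only ever sees its 3-vector window, which is exactly the limitation that motivates self-attention.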

Self-attention

https://ithelp.ithome.com.tw/upload/images/20210929/20140843EwcTBakDPs.png
Self-attention takes in the information of the entire sequence, and outputs as many vectors as it receives.

https://ithelp.ithome.com.tw/upload/images/20210929/201408430Z6NeCyISa.png
Input: a sequence of vectors
Output: each vector is produced only after considering the whole input sequence
b1 through b4 are all computed at the same time (in parallel)

https://ithelp.ithome.com.tw/upload/images/20210929/201408432bnpZvImGf.png
α: the degree of relevance between a1 and each of the other inputs, also called the attention score
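A small sketch of how one such score can be computed with the dot-product method from the lecture (the matrices Wq and Wk are learned in practice; here they are random stand-ins, and all dimensions are my own toy choices):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
a = rng.standard_normal((3, d))   # three input vectors a1, a2, a3
Wq = rng.standard_normal((d, d))  # query projection (learned in practice)
Wk = rng.standard_normal((d, d))  # key projection (learned in practice)

q1 = a[0] @ Wq                    # query from a1
k = a @ Wk                        # a key from every input (a1 included)
alpha = k @ q1                    # dot-product attention scores of a1 vs each input
print(alpha.shape)  # (3,): one score per input vector
```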

https://ithelp.ithome.com.tw/upload/images/20210929/20140843bimlnauDAP.png
https://ithelp.ithome.com.tw/upload/images/20210929/201408431xMnWMVRxa.png
https://ithelp.ithome.com.tw/upload/images/20210929/20140843Bx084UZS4l.png
https://ithelp.ithome.com.tw/upload/images/20210929/20140843S533StlAJG.png
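The q/k/v computation in the slides above can be sketched end to end as follows; this is a plain NumPy toy version (function name, dimensions, and random weights are my own assumptions), showing that every output b_i is a softmax-weighted sum of the values, and that all outputs come from one batch of matrix multiplies, i.e. in parallel:

```python
import numpy as np

def self_attention(a, Wq, Wk, Wv):
    """Dot-product self-attention over a whole sequence.

    a: (n, d) input vectors; returns (n, d) output vectors,
    each a softmax-weighted sum of the value vectors v.
    """
    q, k, v = a @ Wq, a @ Wk, a @ Wv
    scores = q @ k.T                              # α: every query against every key
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over each row → α'
    return attn @ v                               # weighted sum of values

rng = np.random.default_rng(0)
n, d = 4, 8
a = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
b = self_attention(a, Wq, Wk, Wv)
print(b.shape)  # (4, 8): b1..b4, each computed from the whole sequence at once
```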

https://ithelp.ithome.com.tw/upload/images/20210929/20140843epxHpzvPEJ.png
Truncated self-attention: instead of attending over the whole sentence, each position only looks at a small range (the range size is set by a human, e.g. only the neighbours immediately before and after), which speeds up the computation.
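One common way to realize this truncation is to mask out scores outside the window before the softmax; the sketch below (my own illustrative implementation, with toy dimensions and random weights) does exactly that:

```python
import numpy as np

def truncated_self_attention(a, Wq, Wk, Wv, window):
    """Self-attention where each position attends only to positions
    within `window` steps of itself (window size chosen by a human)."""
    n = a.shape[0]
    q, k, v = a @ Wq, a @ Wk, a @ Wv
    scores = q @ k.T
    idx = np.arange(n)
    mask = np.abs(idx[:, None] - idx[None, :]) > window
    scores[mask] = -np.inf                        # drop scores outside the window
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over the surviving scores
    return attn @ v

rng = np.random.default_rng(0)
a = rng.standard_normal((6, 4))
Wq, Wk, Wv = (rng.standard_normal((4, 4)) for _ in range(3))
b = truncated_self_attention(a, Wq, Wk, Wv, window=1)
print(b.shape)  # (6, 4)
```

With `window=1` each output depends only on its two neighbours and itself, so the per-position cost no longer grows with sequence length.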

Self-attention is the complex version of CNN:
a CNN only attends within a receptive field whose range is set by humans, whereas self-attention effectively lets the network learn which parts of the sequence to attend to.

