Day 22 Art and Science of Machine Learning (cont.)


Neural Networks

A linear model can be represented as a graph of nodes (features and output) connected by weighted edges.
Adding a non-linear transformation (a.k.a. an activation function) between layers is what lets the network learn non-linear relationships.
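
As a rough illustration (my own sketch, not code from the course), a minimal Keras comparison of a purely linear stack versus one with a non-linear activation; the layer sizes and input shape are arbitrary examples:

```python
import tensorflow as tf

# Stacking Dense layers with no activation is still a linear model:
# a composition of linear transformations is itself linear.
linear_model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),           # 4 input features (arbitrary)
    tf.keras.layers.Dense(8, activation=None),
    tf.keras.layers.Dense(1),
])

# Inserting a non-linear activation (ReLU here) between the layers gives
# the network the capacity to model non-linear relationships.
nonlinear_model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
```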

Training

Three common failure modes for gradient descent, and their typical mitigations (a code sketch follows the list):

  • Gradients can vanish: use ReLU instead of sigmoid/tanh in hidden layers.
  • Gradients can explode: use batch normalization.
  • ReLU layers can die: lower the learning rate.
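
A minimal sketch (again my own illustration in Keras, assuming a regression setup with made-up layer sizes) of where each mitigation shows up in code:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    # Vanishing gradients: prefer ReLU over sigmoid/tanh in hidden layers.
    tf.keras.layers.Dense(64, activation="relu"),
    # Exploding gradients: batch normalization keeps layer inputs in a stable range.
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Dying ReLUs (units stuck outputting zero): a lower learning rate makes it
# less likely that one large update pushes a unit permanently into the
# inactive region.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="mse",
)
```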
