2019 iT 邦幫忙鐵人賽

DAY 10

Playing with Machine Learning and Math Every Day series, Part 10

Day 10 - Playing with Machine Learning and Math Every Day - PCA: Transformation of Vectors in Spaces

We are now at station 2 of the PCA journey.

  1. Statistical Introduction
  2. Transformation of Vectors in Spaces
  3. Orthogonal Projection

We learned the concepts of mean, variance, covariance, and linear transformation in the statistical introduction. Building on these, the course now turns to a new topic: the application of vectors.

To compare the differences or similarities among data points across datasets, it is efficient to use vector computations. For example, to measure how much two people resemble each other, each person can be described simply by a feature vector such as [length of left eye (cm), length of right eye (cm), distance between the eyebrows (cm)]. Suppose Iron Man has the feature vector [9, 8.5, 5] and Superman has [10, 10, 4]; then "how similar are Iron Man and Superman?" becomes a question we can analyze.
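As a quick sketch of the idea, here is a minimal Python example that compares the two feature vectors above. The dot product is the tool the course introduces next; the cosine similarity shown alongside it is one common way to turn that product into a similarity score between -1 and 1, and is an assumption of mine rather than something from the course:

```python
import math

# Feature vectors from the example: [left eye length, right eye length,
# distance between eyebrows], all in cm.
iron_man = [9.0, 8.5, 5.0]
superman = [10.0, 10.0, 4.0]

def dot(u, v):
    """Dot product: sum of component-wise products."""
    return sum(ui * vi for ui, vi in zip(u, v))

def cosine_similarity(u, v):
    """Cosine of the angle between u and v; values near 1 mean
    the vectors point in nearly the same direction."""
    return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

print(dot(iron_man, superman))               # 195.0
print(cosine_similarity(iron_man, superman)) # close to 1: very similar
```

A cosine similarity close to 1 suggests the two feature vectors are nearly proportional, i.e. the two faces have similar proportions even if their absolute measurements differ.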

In the following days, I will show how to analyze questions like this via the dot product and the inner product.
