Understanding Principal Component Analysis and its Application in Data Science — Part 2
Learn the mathematical intuition behind PCA
In the first part of this article, I introduced the covariance matrix and discussed the mathematical foundation of PCA. In this part, we first discuss the application of singular value decomposition (SVD) in PCA, and then we see a case study of PCA on the MNIST dataset. Finally, we discuss the multivariate normal distribution and the limitations of PCA. If I refer to a figure, equation, or listing which cannot be found here, you should look for it in the first part of the article. Eqs. 1–69, Figures 1–20, and Listings 1–17 are given in the first part.
You can find the first part of this article here: Understanding Principal Component Analysis and its Application in Data Science — Part 1
PCA using Singular Value Decomposition (SVD)
Suppose that A is an m×n matrix. Then AᵀA will be a square n×n matrix. We can easily show that this matrix is symmetric. We only need to show that it is equal to its transpose: (AᵀA)ᵀ = Aᵀ(Aᵀ)ᵀ = AᵀA.
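We can verify these two properties numerically. The sketch below (the matrix dimensions and random seed are arbitrary choices for illustration) builds a rectangular matrix A and checks that AᵀA is square with n rows and columns, and equal to its own transpose:

```python
import numpy as np

# A hypothetical 4x3 matrix A (m=4 rows, n=3 columns)
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3))

# A^T A is a square n x n matrix (here 3x3)
G = A.T @ A
print(G.shape)  # (3, 3)

# Symmetry: (A^T A)^T = A^T (A^T)^T = A^T A
print(np.allclose(G, G.T))  # True
```

The same check works for any m and n, since the transpose identity used above does not depend on the shape of A.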
This result also shows why the covariance matrix in Eq. 68 is always symmetric. Eigenvectors and eigenvalues are only defined for square matrices, so we cannot define the eigenvalues of A if m≠n. However, we…