Linear Algebra v2: Regression and SVD?
Date:
Building on my first talk, I will couple linear algebra and calculus to introduce the idea of a cost function along with the simple technique of linear regression. We'll then revisit eigenvectors and eigenvalues, briefly describing how to find them systematically. The Google PageRank algorithm will also make a return appearance, this time with the numerical calculation of the eigenvector associated with $\lambda = 1$. From there, we'll touch on the singular value decomposition (SVD), one of the most important tools in linear algebra given its utility for dimensionality reduction, and look at image compression using the technique.
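As a taste of what the notebook covers, here is a minimal sketch (not the notebook's exact code) of linear regression viewed through a cost function: we fit a line $y \approx mx + b$ by minimizing the mean squared error over the data. The synthetic data and variable names below are illustrative assumptions.

```python
import numpy as np

# Hypothetical noisy data along a line (illustrative, not from the talk).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.5 * x + 1.0 + rng.normal(scale=2.0, size=x.shape)

# Design matrix [x, 1]; lstsq minimizes the squared-error cost ||X @ [m, b] - y||^2.
X = np.column_stack([x, np.ones_like(x)])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
m, b = coeffs

cost = np.mean((y - X @ coeffs) ** 2)  # value of the cost at the minimum
print(f"slope={m:.3f}, intercept={b:.3f}, cost={cost:.3f}")
```

And for the PageRank piece, a small sketch of finding the eigenvector associated with $\lambda = 1$ for a toy column-stochastic link matrix. The 3-page graph here is a made-up example, not the one from the talk.

```python
import numpy as np

# Hypothetical 3-page link matrix (each column sums to 1).
A = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# np.linalg.eig numerically solves det(A - lambda*I) = 0 and returns the eigenpairs.
eigvals, eigvecs = np.linalg.eig(A)

# Pick the eigenvector whose eigenvalue is (numerically) 1 and normalize it.
idx = np.argmin(np.abs(eigvals - 1.0))
rank = np.real(eigvecs[:, idx])
rank = rank / rank.sum()
print("PageRank vector:", rank)
```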
Associated Jupyter Notebook Here
Note: for the code to work, download the image titled tundy.jpeg and place it in the same directory as linalg_talk.ipynb.
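For reference, a rank-$k$ SVD image-compression sketch along the lines of what the notebook demonstrates; this is an illustrative reconstruction, not necessarily the notebook's exact code, and it assumes matplotlib (with Pillow) can read tundy.jpeg from the same directory.

```python
import numpy as np
import matplotlib.pyplot as plt

# Load the image and collapse RGB to grayscale so it is a single matrix.
img = plt.imread("tundy.jpeg").astype(float)
gray = img.mean(axis=2) if img.ndim == 3 else img

# Full SVD of the image matrix.
U, s, Vt = np.linalg.svd(gray, full_matrices=False)

# Keep only the k largest singular values: the best rank-k approximation.
k = 50  # illustrative choice
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
axes[0].imshow(gray, cmap="gray"); axes[0].set_title("original")
axes[1].imshow(approx, cmap="gray"); axes[1].set_title(f"rank-{k} approximation")
for ax in axes:
    ax.axis("off")
plt.show()
```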