Linear Algebra for Machine Learning

In this course, we will study the matrix theory behind the basic concepts of machine learning. We will present matrix algebra together with derivatives of, and with respect to, matrices; these definitions are crucial for understanding the optimization methods used in machine learning. We will start the course with a general introduction, and then we will proceed to cover these topics: trace, norm, distance, angle, and orthogonality; the Kronecker product, vec operator, and Hadamard product; linear systems and generalized inverses; the Moore-Penrose inverse; determinants; linear, bilinear, and quadratic forms; eigenvalues and eigenvectors of matrices; matrix differentiation; the polar decomposition; the Hessian; and minimization of a second-degree n-variable polynomial subject to linear constraints. We will conclude the course with some applications to machine learning.
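To give a flavor of the topics listed above, here is a minimal sketch (not part of the official course materials) showing how a few of them, such as the trace, a matrix norm, eigenvalues, and the Moore-Penrose inverse, can be computed with NumPy; the matrix `A` is an arbitrary example chosen for illustration.

```python
import numpy as np

# A small symmetric matrix chosen purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

trace = np.trace(A)                   # sum of the diagonal entries
fro_norm = np.linalg.norm(A, 'fro')   # Frobenius norm
eigvals, eigvecs = np.linalg.eigh(A)  # eigenvalues/eigenvectors (symmetric A)
A_pinv = np.linalg.pinv(A)            # Moore-Penrose pseudoinverse

# For an invertible matrix, the Moore-Penrose inverse coincides
# with the ordinary inverse.
assert np.allclose(A_pinv, np.linalg.inv(A))
```

Optimization methods in machine learning rely on exactly these primitives; for instance, the pseudoinverse solves least-squares problems even when the system matrix is not invertible.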