
Math for Machine Learning

Jhalak Patel edited this page Nov 9, 2017 · 16 revisions

Math Basics

  1. Basic Tensor Operations

  2. Principal Component Analysis (PCA): Useful when data lives in a high-dimensional space but its information can be captured in far fewer dimensions, i.e. the data has low intrinsic dimensionality. Reducing the dimensionality makes algorithms more scalable and mitigates the curse of dimensionality.

    Intuition: Principal components are the directions along which the data has the highest variance. The first principal component is the eigenvector of the covariance matrix with the largest eigenvalue. PCA does not select some features and discard the others; instead, it constructs new features (linear combinations of the originals) that summarize the data well.
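    The intuition above can be sketched in a few lines of NumPy: center the data, eigendecompose its covariance matrix, and project onto the top eigenvectors. The random dataset and the choice of 2 components are made up for illustration.

    ```python
    import numpy as np

    # Minimal PCA sketch via eigendecomposition of the covariance matrix.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))           # 100 samples, 5 features (toy data)
    Xc = X - X.mean(axis=0)                 # center each feature
    cov = np.cov(Xc, rowvar=False)          # 5x5 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: symmetric matrix, ascending order
    order = np.argsort(eigvals)[::-1]       # sort eigenvalues descending
    components = eigvecs[:, order[:2]]      # top-2 principal directions
    X_reduced = Xc @ components             # project onto 2 dimensions
    print(X_reduced.shape)                  # (100, 2)
    ```

    The columns of `components` are the principal directions; the projected data keeps the directions of highest variance and drops the rest.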

  3. Singular Value Decomposition (SVD)

  4. Eigenvalues and Eigenvectors: The eigenvectors of a matrix can serve as a basis for representing any vector in N-dimensional space. In a general setting, a system takes an N-dimensional input "x", applies a linear transformation "A", and we want the output y = Ax. Matrix multiplication is computationally expensive, and eigenvectors help here. An eigenvector is a vector whose direction is unchanged when the linear transformation "A" is applied: A · v = λ · v for some scalar λ (the eigenvalue), so "A" only scales v. If the eigenvectors V form a basis, any input x can be written as a linear combination x = Vb. Then y = Ax = A(Vb) = (AV)b = VΛb, so applying "A" reduces to scaling the coordinates b by the eigenvalues, which is much easier to compute. Useful links for intuition: 1 2
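    The two facts above (A only scales its eigenvectors, and the eigenbasis turns Ax into elementwise scaling) can be checked numerically. The small symmetric matrix and the test vector are arbitrary examples.

    ```python
    import numpy as np

    # The defining property: A @ v = lambda * v.
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    eigvals, eigvecs = np.linalg.eig(A)
    v = eigvecs[:, 0]
    lam = eigvals[0]
    assert np.allclose(A @ v, lam * v)      # direction unchanged, only scaled

    # Any x in the eigenbasis: x = V b, so A x = V (eigvals * b).
    x = np.array([3.0, -1.0])
    b = np.linalg.solve(eigvecs, x)         # coordinates of x in the eigenbasis
    y_direct = A @ x                        # ordinary matrix multiplication
    y_via_eigs = eigvecs @ (eigvals * b)    # scale coordinates, map back
    assert np.allclose(y_direct, y_via_eigs)
    ```

    The second assertion is exactly the y = Ax = VΛb identity from the text: once x is expressed in the eigenbasis, the transformation is just a per-coordinate scaling.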

  5. Euclidean Distance

  6. Affine Transformation

  7. Kronecker Product - Outer Product of Matrices
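    A quick illustration of the Kronecker product with `np.kron` (the 2x2 matrices are arbitrary examples): each entry a_ij of A is replaced by the block a_ij · B, so two 2x2 matrices produce a 4x4 result. For column vectors, the Kronecker product contains the same entries as the outer product.

    ```python
    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])
    B = np.array([[0, 1],
                  [1, 0]])
    K = np.kron(A, B)   # 4x4: block (i, j) equals A[i, j] * B
    print(K.shape)      # (4, 4)
    ```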

  8. Covariance Matrix

  9. Winograd Algorithm

Blogs

  1. Matrix Factorization

Videos

  1. Essence of Linear Algebra

CheatSheet

  1. Linear Algebra CheatSheet

Aside

  1. Time Complexity of Mathematical-Arithmetic Operations