Machine Learning Zero to Hero - Linear Algebra, Part 1

Vishal Kumar
2 min readMar 24, 2021
Image Courtesy: dev.to

Just as the alphabet is the basic building block of the English language, linear algebra is the basic building block of machine learning. This series covers the linear algebra prerequisites you need to jump into machine learning.

The 4 topics we will cover in linear algebra are:

  1. Introduction to Vectors and Matrices
  2. Matrix Operations
  3. Eigenvalues and Eigenvectors
  4. Principal Component Analysis

Introduction to Vectors:

A vector is an ordered tuple (or list) of numbers, e.g. X = [1, 2, 3, 4], that has both a magnitude and a direction.
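As a quick sketch of this definition, here is the vector X above in NumPy, with its magnitude (Euclidean norm) and its direction (the corresponding unit vector); the variable names are just illustrative:

```python
import numpy as np

# A vector: an ordered list of numbers
x = np.array([1, 2, 3, 4])

# Magnitude (Euclidean norm): sqrt(1 + 4 + 9 + 16) = sqrt(30)
magnitude = np.linalg.norm(x)

# Direction: the unit vector pointing the same way as x
direction = x / magnitude

print(magnitude)   # about 5.477
print(direction)
```

Dividing a vector by its magnitude always yields a unit vector, which captures the direction while discarding the length.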

Introduction to Matrices:

A matrix is a set of numbers arranged in rows and columns to form a rectangular array. The numbers are called the elements, or entries, of the matrix. The size of a matrix is given by its number of rows and columns (MxN). For example:
A = [1,2,3,4]
Here A is a matrix of size (1, 4): it has 1 row and 4 columns.
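A minimal sketch of matrix sizes in NumPy, using the (1, 4) matrix A from above plus a second illustrative (2, 3) matrix B:

```python
import numpy as np

# A 1x4 matrix: 1 row, 4 columns (note the double brackets)
A = np.array([[1, 2, 3, 4]])
print(A.shape)  # (1, 4)

# A 2x3 matrix: 2 rows, 3 columns
B = np.array([[1, 2, 3],
              [4, 5, 6]])
print(B.shape)  # (2, 3)
```

The `.shape` attribute returns the (rows, columns) pair, which is exactly the MxN size described above.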

Image Courtesy: khanacademy.org

Matrices are widely used in machine learning for data representation and data handling. Image processing tasks operate directly on matrices, because an image is nothing but a matrix of pixels.

Image Pixels to Matrix
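To make the image-as-matrix idea concrete, here is a toy 3x3 grayscale "image" (the pixel values are made up for illustration); inverting it is a simple elementwise matrix operation:

```python
import numpy as np

# A tiny 3x3 grayscale image: each entry is a pixel intensity (0-255)
image = np.array([[  0, 128, 255],
                  [ 64, 192,  32],
                  [255,   0, 128]], dtype=np.uint8)

print(image.shape)  # (3, 3): 3 rows and 3 columns of pixels

# Invert the image, an elementwise operation on the matrix
inverted = 255 - image
print(inverted[0, 2])  # 0: the brightest pixel becomes the darkest
```

Real images work the same way, only larger: a 1080p grayscale photo is a 1080x1920 matrix, and a color photo adds a third dimension for the color channels.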

In the next part, we will learn about some important matrix operations for machine learning.


Vishal Kumar

Data scientist and data science enthusiast working on NLP, knowledge graphs, and deep learning.