Linear Algebra
Linear algebra is an important topic to understand because many deep learning algorithms rely on it, so this chapter covers the concepts needed to follow what comes next.
Scalar: A single number
Vector: A 1D array of numbers, where each element is identified by a single index
Matrix: A 2D array of numbers; below we have a 2-row by 3-column matrix. In a matrix, a single element is identified by two indices instead of one.
Here we show how to create them in MATLAB and Python (NumPy).
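For example, a minimal NumPy sketch (the MATLAB version is analogous):

```python
import numpy as np

# Scalar: just a single number
s = 3.0

# Vector: a 1D array, each element addressed by one index
v = np.array([1, 2, 3])
print(v[0])          # 1

# Matrix: a 2D array (2 rows x 3 columns), each element addressed by two indices
M = np.array([[1, 2, 3],
              [4, 5, 6]])
print(M.shape)       # (2, 3)
print(M[1, 2])       # 6
```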
Here we will show some important matrix operations.
If you have an image (a 2D matrix) and multiply it by a rotation matrix, you get a rotated image. If you then multiply this rotated image by the transpose of the rotation matrix, the image is "un-rotated". Basically, to transpose a matrix means to swap its rows and columns, or in other words, to flip the matrix around its main diagonal.
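A small NumPy sketch of the transpose, where rows and columns are swapped:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2x3 matrix

At = A.T                    # transpose: 3x2 matrix
print(At)
# [[1 4]
#  [2 5]
#  [3 6]]
```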
Basically, we add two matrices by adding their elements one by one (element-wise). Both matrices need to have the same dimensions.
Multiplying a matrix by a scalar multiplies every element of the matrix by that scalar.
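A quick NumPy sketch of both element-wise addition and scalar multiplication (the shapes must match for the addition):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[10, 20],
              [30, 40]])

print(A + B)      # element-wise sum: [[11 22] [33 44]]
print(2 * A)      # every element multiplied by the scalar 2: [[2 4] [6 8]]
```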
The matrix product of an n×m matrix with an m×ℓ matrix is an n×ℓ matrix. The (i, j) entry of the product AB is the dot product of the i-th row of A with the j-th column of B. The number of columns of the first matrix must match the number of rows of the second matrix, and the result is a matrix whose number of rows comes from the first matrix and whose number of columns comes from the second (a 1×1 result is effectively a scalar).
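For instance, multiplying a 2×3 matrix by a 3×2 matrix gives a 2×2 matrix; a NumPy sketch:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])        # 2x3
B = np.array([[ 7,  8],
              [ 9, 10],
              [11, 12]])         # 3x2

C = A @ B                        # matrix product: 2x2
print(C.shape)                   # (2, 2)
print(C)
# [[ 58  64]
#  [139 154]]
```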
There are some special matrices that are interesting to know.
Identity: The diagonal of the identity matrix is filled with ones and all other entries are zeros. If you multiply a matrix B by the identity matrix, the result is the matrix B itself.
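A quick NumPy check of this property:

```python
import numpy as np

I = np.eye(3)                    # 3x3 identity matrix
B = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

print(np.allclose(I @ B, B))     # True: multiplying by the identity leaves B unchanged
```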
Inverse: The inverse A⁻¹ of a matrix A satisfies A⁻¹A = I. It plays the role of matrix "division" and is used to solve linear systems: if Ax = b, then x = A⁻¹b.
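A minimal NumPy sketch, using a small made-up system as an example:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

A_inv = np.linalg.inv(A)
print(A_inv @ A)                 # approximately the identity matrix

x = np.linalg.solve(A, b)        # numerically preferred over A_inv @ b
print(x)                         # [2. 3.] since 3*2 + 1*3 = 9 and 1*2 + 2*3 = 8
```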
Sometimes we need to organize information with more than two dimensions; we call an n-dimensional array a tensor. For example, a 1D tensor is a vector, a 2D tensor is a matrix, a 3D tensor is a cube of numbers, a 4D tensor can be seen as a vector of cubes, and a 5D tensor as a matrix of cubes.
In Matlab
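The MATLAB snippet is not reproduced here; as a rough NumPy stand-in (a sketch, not the original code), higher-dimensional tensors can be created like this:

```python
import numpy as np

# 3D tensor: a "cube" of numbers, e.g. 2 matrices of shape 3x4 stacked together
t3 = np.zeros((2, 3, 4))
print(t3.ndim, t3.shape)         # 3 (2, 3, 4)

# 4D tensor: e.g. a batch of RGB images (batch, height, width, channels)
t4 = np.zeros((10, 32, 32, 3))
print(t4.ndim, t4.shape)         # 4 (10, 32, 32, 3)
```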
In the next chapter we will learn about Linear Classification.
Basically, the operation is to take the dot product of each row of the first matrix with each column of the second matrix. Some examples:
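For example, a NumPy sketch checking one entry of the product against the corresponding row/column dot product:

```python
import numpy as np

A = np.array([[1, 0, 2],
              [3, 1, 1]])          # 2x3
B = np.array([[2, 1],
              [0, 4],
              [1, 3]])             # 3x2

C = A @ B                          # 2x2 result
# Entry (1, 0) is the dot product of row 1 of A with column 0 of B
print(C[1, 0])                     # 7
print(np.dot(A[1, :], B[:, 0]))    # 7 as well
```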
Matrix multiplication is not always commutative (in general AB ≠ BA), but the dot product between two vectors is commutative (x·y = y·x).
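A quick NumPy illustration of both claims:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

print(np.array_equal(A @ B, B @ A))   # False: matrix multiplication is not commutative here

x = np.array([1, 2, 3])
y = np.array([4, 5, 6])
print(np.dot(x, y) == np.dot(y, x))   # True: the dot product is commutative
```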
Here we will show how to use matrix multiplication to implement a linear classifier. Don't worry for now about what the linear classifier does; just note that we use our linear algebra to compute it.
We can merge the weights and the bias (a technique called the bias trick) so that the linear classification becomes a single matrix multiplication.
In Python
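The original snippet is not reproduced here; below is a minimal NumPy sketch of the bias trick, where the weights, bias, and input values are made up for illustration:

```python
import numpy as np

# Scores without the trick: s = W x + b
W = np.array([[0.2, -0.5, 0.1],      # one row of weights per class (3 classes, 3 input features)
              [1.5,  1.3, 2.1],
              [0.0,  0.3, 0.4]])
b = np.array([1.0, -2.0, 0.5])
x = np.array([2.0, 1.0, 3.0])        # input feature vector

scores = W @ x + b

# Bias trick: append the bias as an extra column of W and a constant 1 to x,
# so the whole classifier is a single matrix multiplication.
W_ext = np.hstack([W, b.reshape(-1, 1)])   # 3x4
x_ext = np.append(x, 1.0)                  # length-4 vector

scores_trick = W_ext @ x_ext
print(np.allclose(scores, scores_trick))   # True: both give the same class scores
```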