Matrices (Pt. 5): Additional

Foreword

I believe this page may not be particularly crucial for data scientists, as the topics here come largely from pure Mathematics. 🐱‍👤

Linear Transformation

I stumbled upon the "3Blue1Brown" YouTube channel, which explains vectors, matrices, and linear transformations in an extremely visual and narrative way (it is truly an awesome channel for Mathematics in general).

The main idea is that a transformation is equivalent to a function, and a function basically takes in inputs and returns outputs.

Thus, if a vector (2 elements, as it is 2-D) is given as input, the output will also be a vector. Hence, for the output to be another 2-D vector, the rules of matrix multiplication require the 'transformation' to be a square (2×2) matrix.
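As a minimal sketch of this (the matrix and vector values below are arbitrary examples, not from the page), here is a 2×2 matrix acting as a "function" on a 2-D vector with NumPy:

```python
import numpy as np

# A 2x2 matrix plays the role of the "function" (linear transformation):
# it takes a 2-D vector as input and returns a 2-D vector as output.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # arbitrary example transformation

v = np.array([1.0, 2.0])     # input vector

w = A @ v                    # matrix-vector product applies the transformation
print(w)                     # [4. 6.] -- the output is again a 2-D vector
```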

Rotation Matrix

A rotation matrix in 2-D (the idea is similar in 3-D) has the formula

$$R(\theta) = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$$

The idea is to rotate a given vector by "theta" degrees, and the values of the linear transformation matrix follow accordingly, built from the cosine and sine of that angle.
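As a small sketch of this formula (the `rotate` helper and the example angle are my own choices, not from the page), building the matrix from an angle and applying it to a vector:

```python
import numpy as np

def rotate(theta_deg):
    """Build the 2-D rotation matrix for an angle given in degrees."""
    t = np.radians(theta_deg)
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

v = np.array([1.0, 0.0])   # a vector pointing along the x-axis
print(rotate(90) @ v)      # ~[0. 1.] -- the vector rotated 90 degrees anti-clockwise
```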

The sign of "theta" denotes the direction of rotation as well (there is no deep scientific reason behind this; it is simply the convention that follows the right-hand rule), as checked numerically after the list below:

  • Rotating anti-clockwise is positive, and

  • Rotating clockwise is negative
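A quick numerical check of this sign convention (again an assumed example, rotating the unit x-axis vector by ±90 degrees):

```python
import numpy as np

v = np.array([1.0, 0.0])                 # unit vector along the x-axis
for theta in (+90, -90):
    t = np.radians(theta)
    R = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])
    print(theta, np.round(R @ v))
# 90  -> [ 0.  1.]  anti-clockwise: v now points up
# -90 -> [ 0. -1.]  clockwise: v now points down
```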

Thus, common rotations are:
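For example, assuming the anti-clockwise (positive) convention and substituting 90°, 180°, and 270° into the formula above:

$$R(90^\circ) = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}, \quad R(180^\circ) = \begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix}, \quad R(270^\circ) = \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}$$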

Maybe you can fancy the matrix for a 360-degree rotation 🤔.

More on the next page ⏭
