Matrices (Pt. 4): Types

Types of Matrices

On this page, we will discuss the commonly used types of matrices, which leads into our broader discussion of Linear Algebra. 🎨

Identity Matrices

An identity is like a copy of something.

  • It may be helpful to think of identity matrices as the "value 1", as any value multiplied by 1 returns itself

  • Similarly, multiplying a matrix by an identity matrix returns the same matrix

An identity matrix is simply a matrix with the value 1 running diagonally from the upper-left to the bottom-right, and 0 everywhere else.

It also has to be n x n sized, where n is chosen so that the matrix multiplication operation is valid.

In this case, identity matrices are commutative (refer to matrix multiplication properties), in which for a square matrix A and identity matrix I of the same size:

AI = IA

There exists an identity matrix for every possible matrix.

The order in which an identity matrix is applied should not be a big challenge. Since the non-commutative property causes no issues here, as long as the matrix multiplication operation is valid, the result is the same.

In the example following this, the identity matrix acts as the left matrix in the multiplication operation, and thus the multiplication will work as long as the number of rows of the matrix it is applied to is the same as n.
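A quick sketch of this idea (assuming NumPy is available; `np.eye(n)` builds an n x n identity matrix):

```python
import numpy as np

# A 2x3 matrix: multiplying on the left needs a 2x2 identity,
# while multiplying on the right needs a 3x3 identity.
A = np.array([[1, 2, 3],
              [4, 5, 6]])

I_left = np.eye(2)   # n = number of rows of A for left multiplication
I_right = np.eye(3)  # n = number of columns of A for right multiplication

# Both products return A unchanged
assert np.array_equal(I_left @ A, A)
assert np.array_equal(A @ I_right, A)
```

Note that for a non-square matrix, the left and right identities have different sizes; only for a square matrix is it literally the same I on both sides.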

Permutation Matrix

To permute means simply to change the order or arrangement. Hence, a permutation matrix acts primarily to swap the rows or columns of a matrix.

With this, a permutation matrix is extremely similar to an identity matrix, as the interest is only in changing the order of entries, not their values.

As is to be expected, a permutation matrix contains only the values 0 and 1, but it need not have the diagonal structure an identity matrix has. However, for the structure of a permutation matrix:

  • There should be exactly one "1" in every row and column

The matrix above is a valid permutation matrix, as it can be seen that the value 1 appears exactly once in each row and column.

It may help to think of this like Sudoku, where the value "1" can appear only once in each row and each column.
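As a small sketch (assuming NumPy is available), here is a permutation matrix that swaps the first two rows of a matrix when applied on the left:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])

# Exactly one 1 in every row and column (the Sudoku rule);
# this P swaps the first two rows and leaves the third alone.
P = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 1]])

B = P @ A  # left multiplication re-orders the rows
print(B)   # rows 1 and 2 of A are swapped
```

Each row of P "picks out" one row of A, which is why the single-1-per-row-and-column structure is exactly what re-ordering requires.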

Column Permutation

As the name suggests, a column permutation matrix acts to re-order a matrix's columns. To differentiate it from a row permutation, and because positioning matters (non-commutative), it is positioned as the right matrix.

The big question is: what column permutation matrix is needed for a particular desired re-ordering of columns? An arbitrary scenario of re-ordering columns (1, 2, 3) to (2, 3, 1) will demonstrate this concept.

For column permutation, by simply identifying the desired change, we can then draft out the permutation matrix required for it.

Hence, to put everything together, we can perform a re-ordering of columns by placing the permutation matrix as the right-matrix of the operation.
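The (1, 2, 3) → (2, 3, 1) scenario above can be sketched as follows (assuming NumPy is available):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

# Right-multiplying by P sends old column 2 -> new column 1,
# old column 3 -> new column 2, and old column 1 -> new column 3.
# P[i, j] = 1 means: new column j is old column i.
P = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])

B = A @ P  # right multiplication re-orders the columns
print(B)   # columns now appear in the order (2, 3, 1)
```

The same P applied on the left instead would permute rows, which is why the side the permutation matrix sits on matters.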

Transpose Matrix

Transposition is very much like permutation, except instead of re-ordering rows or columns, it switches rows for columns (it is not wrong if you think of it as switching columns with rows as well).

The definition of transpose is to "switch places". In this case, transposition of a matrix refers to interchanging its rows with its columns.

  • A matrix of size (m x n) will have size (n x m) after transposition

  • Elements in the same row are represented as a new column (or columns are represented as a new row 💁‍♀️)

Another important thing to know: due to the non-commutative property of matrices, the transpose of a product of matrices is the product of its transposed factors in the reverse order, i.e. (AB)^{T} = B^{T}A^{T}.

Looking at the RHS, through the combination of both

  • transposing each factor, and

  • reversing the order of the factors,

this ensures that the number of columns of the left matrix and the number of rows of the right matrix are the same (n), so the product is well-defined.
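The reverse-order property, and the fact that the shapes only line up in the reversed order, can be checked numerically (a sketch assuming NumPy is available):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(0, 10, size=(2, 3))  # 2x3
B = rng.integers(0, 10, size=(3, 4))  # 3x4

# (AB)^T equals B^T A^T. Note that B^T @ A^T is (4x3) @ (3x2),
# which is valid, whereas A^T @ B^T would be (3x2) @ (4x3) --
# not even a defined product.
lhs = (A @ B).T   # shape (4, 2)
rhs = B.T @ A.T   # shape (4, 2)

assert np.array_equal(lhs, rhs)
```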

Inverse Matrix

The inverse of a variable x looks something like this:

\frac{1}{x}

The above expression would be similarly re-written as:

x^{-1}

Now, we know for a fact that

x^{-1} \times x = 1

This idea extends to matrices as well, where

A^{-1} \times A = I

  • A^{-1} is the inverse of Matrix A

  • A is Matrix A

  • I is an identity matrix
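This defining property can be checked numerically; a minimal sketch assuming NumPy is available (`np.linalg.inv` computes the inverse):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

A_inv = np.linalg.inv(A)

# The defining property: A^{-1} A = I (up to floating-point rounding)
assert np.allclose(A_inv @ A, np.eye(2))
assert np.allclose(A @ A_inv, np.eye(2))
```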

Firstly, it might be good to mention that not all matrices are invertible. A matrix is non-invertible (singular) when no matrix A^{-1} exists such that A^{-1} \times A = I; this happens exactly when the determinant of the matrix is 0.
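As a concrete example of a non-invertible (singular) matrix, here is one whose determinant is 0 because its second row is a multiple of its first (a sketch assuming NumPy is available):

```python
import numpy as np

# Row 2 is exactly 2x row 1, so the determinant is 0
# and no inverse exists.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])

assert np.isclose(np.linalg.det(S), 0.0)

try:
    np.linalg.inv(S)
except np.linalg.LinAlgError:
    print("S is singular: no inverse exists")
```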

However, assuming that Matrix A is indeed invertible, how would we then find A1A^{\mathrm{-1}}?

We will discuss this when we get to elimination in Linear Algebra 😎 (stay tuned!!).
