Linear Algebra (Pt. 4): Gauss-Jordan Elimination
Last updated
We've previously solved a single system of linear equations. It looked something like
However, what if the question asks us to solve 2 systems of linear equations at once 🤠?
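Numerically, two systems that share the same coefficient matrix can be solved in one shot by stacking the right-hand sides as columns. A minimal sketch, with made-up matrices since the article's example isn't reproduced here:

```python
import numpy as np

# Hypothetical coefficient matrix A and two right-hand sides b1, b2
# (placeholder numbers, not the article's actual example).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b1 = np.array([3.0, 5.0])
b2 = np.array([1.0, 4.0])

# Stacking b1 and b2 as columns of B solves A x = b1 and A x = b2
# in a single call: each column of X is the solution of one system.
B = np.column_stack([b1, b2])
X = np.linalg.solve(A, B)

print(X[:, 0])  # solution of A x = b1
print(X[:, 1])  # solution of A x = b2
```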
Given a new example of:
where
What if we were to arbitrarily tweak "b" and also state that
As you might remember from the "ways to think about MM" section, for:
The entire Matrix A, multiplied with
the columns of Matrix B, will
return the columns of Matrix C
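That column view of matrix multiplication can be checked directly: the j-th column of C is A times the j-th column of B. A small sketch with placeholder matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])
C = A @ B

# Column view: column j of C equals A times column j of B.
for j in range(B.shape[1]):
    assert np.allclose(C[:, j], A @ B[:, j])
print(C)
```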
The only difference between Gauss and Gauss-Jordan elimination is that Gauss stops at REF (Row Echelon Form), while Gauss-Jordan continues on to RREF (Reduced Row Echelon Form). As mentioned, this also makes Gauss-Jordan capable of solving multiple systems of linear equations at once.
To recap, REF is an upper-triangular form (U) in which all elements below each pivot element have the value 0. For example,
The above is in REF, as all elements below each possible pivot element are 0.
However, RREF requires a matrix to first be in REF, and
have all pivots equal to the value 1, and
have all elements above each pivot equal to the value 0
Thankfully, both possible pivots, at (1, 1) and (2, 2), already have the value 1. If not, for example if (1, 1) had the value 3, we could simply divide the entire row by 3 (this may leave fractions in the other elements of the row 💀).
We then carry on to eliminate (1, 2), as it sits above the pivot (2, 2) and is not of the value 0.
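That row-scaling step can be sketched numerically (with made-up numbers, not the article's matrix):

```python
import numpy as np

# Hypothetical row whose pivot at (1, 1) is 3 rather than 1.
M = np.array([[3.0, 6.0, 9.0],
              [0.0, 2.0, 4.0]])

# Divide the entire first row by its pivot so the pivot becomes 1.
M[0] /= M[0, 0]
print(M[0])  # row becomes [1, 2, 3]
```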
Since all pivots are 1, and elements both below and above the pivots are 0, this satisfies the RREF condition.
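The whole Gauss-Jordan procedure can be sketched as a short Python routine. This is a minimal implementation with partial pivoting, using placeholder matrices rather than the article's example:

```python
import numpy as np

def rref(M, tol=1e-12):
    """Reduce matrix M to reduced row echelon form via Gauss-Jordan."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Pick the row (at or below pivot_row) with the largest entry
        # in this column, for numerical stability.
        pivot = pivot_row + np.argmax(np.abs(M[pivot_row:, col]))
        if abs(M[pivot, col]) < tol:
            continue  # no pivot in this column
        M[[pivot_row, pivot]] = M[[pivot, pivot_row]]  # swap rows
        M[pivot_row] /= M[pivot_row, col]              # scale pivot to 1
        # Eliminate entries both below AND above the pivot (the
        # "Jordan" step that goes beyond plain Gaussian elimination).
        for r in range(rows):
            if r != pivot_row:
                M[r] -= M[r, col] * M[pivot_row]
        pivot_row += 1
    return M

# Example: a hypothetical invertible 2x2 matrix reduces to the identity.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(rref(A))
```

For an invertible A, calling `rref(np.column_stack([A, np.eye(2)]))` performs the [A | I] to [I | A inverse] reduction described below.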
If you are observant, you might have realized that we started off with Matrix A (LHS) and an identity matrix (RHS), and reached an RREF with an identity matrix (LHS) and the final Matrix (RHS).
Looking only at the LHS, we have transformed A into I through a series of eliminations (only 2 steps, actually). Hence, reusing the notation from before,
"Dividing" both sides by A (that is, multiplying both sides on the right by the inverse of A), we essentially get
Thus, the product of all the elimination matrices is equal to the inverse of Matrix A 🙉.
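We can verify this numerically with a hypothetical 2x2 matrix (not the article's example): multiplying together the elimination and scaling matrices reproduces the inverse of A.

```python
import numpy as np

# Hypothetical 2x2 example matrix.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

E1 = np.array([[ 1.0, 0.0],   # R2 <- R2 - 3*R1  (clear entry below pivot)
               [-3.0, 1.0]])
D  = np.array([[ 1.0, 0.0],   # R2 <- R2 / (-2)  (scale pivot to 1)
               [ 0.0, -0.5]])
E2 = np.array([[ 1.0, -2.0],  # R1 <- R1 - 2*R2  (clear entry above pivot)
               [ 0.0, 1.0]])

product = E2 @ D @ E1
assert np.allclose(product @ A, np.eye(2))     # the product turns A into I...
assert np.allclose(product, np.linalg.inv(A))  # ...so it equals A's inverse
print(product)
```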
The first row is, while the second row is, and
Hence, it is legal to propose changing Matrix C from to .
This final matrix is essentially , the inverse matrix of A. Notably, we have also found the inverse matrix along the journey to RREF.