Solution of simultaneous linear equations
- Intuitive Visual Matrices Table of Contents TOC
- Math: Area of a Parallelogram equals geometric mean of triangles
- Math: Derivation of Matrix Determinant
- Two dimensional determinant of a matrix: animation showing it is equal to the area of the parallelogram
- Interpretation of Matrix determinant as hyper-volume
- Intuitive Matrix Inverse
- Solution of simultaneous linear equations
- Matrices, Eigenvalues, Eigenvectors
End of TOC
Consider the simultaneous linear equations written in matrix form as [pmath] A X = b [/pmath]:
[pmath] A X = (matrix{3}{3}{a_11 a_12 a_13 a_21 a_22 a_23 a_31 a_32 a_33}) (matrix{3}{1}{X_1 X_2 X_3 }) = (matrix{3}{1}{b_1 b_2 b_3 }) [/pmath]
Multiplying the first column by [pmath] X_1 [/pmath] alters the determinant as follows:
[pmath] det (matrix{3}{3}{a_11*X_1 a_12 a_13 a_21*X_1 a_22 a_23 a_31*X_1 a_32 a_33}) = X_1 * det(A) [/pmath]
Now you can multiply the other columns by a scalar and add them to the first column to your heart's content, as this does not alter the determinant: each added piece is a multiple of an existing column, and adding a linearly dependent column contributes nothing to the determinant.
- Multiply column 2 by [pmath] X_2 [/pmath] and add to column 1
- Multiply column 3 by [pmath] X_3 [/pmath] and add to column 1
[pmath] det (matrix{3}{3}{ a_11*X_1+a_12*X_2+a_13*X_3 a_12 a_13 a_21*X_1+a_22*X_2+a_23*X_3 a_22 a_23 a_31*X_1+a_32*X_2+a_33*X_3 a_32 a_33}) = X_1 * det(A) [/pmath]
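These column operations are easy to check numerically. Here is a minimal sketch (using NumPy; the coefficient matrix and solution vector are made-up examples, not from the text) verifying that scaling column 1 by [pmath] X_1 [/pmath] multiplies the determinant by [pmath] X_1 [/pmath], and that then adding the scaled columns 2 and 3 leaves that value unchanged:

```python
import numpy as np

# Made-up example system: A is the coefficient matrix, X a chosen solution.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 0.0, 0.0]])
X = np.array([1.0, 2.0, 3.0])

M = A.copy()
M[:, 0] *= X[0]                  # scale column 1 by X_1
assert np.isclose(np.linalg.det(M), X[0] * np.linalg.det(A))

# Add X_2 * column 2 and X_3 * column 3 to column 1: determinant unchanged.
M[:, 0] += X[1] * A[:, 1] + X[2] * A[:, 2]
assert np.isclose(np.linalg.det(M), X[0] * np.linalg.det(A))
```

Both assertions pass because scaling a column scales the determinant, while adding a multiple of one column to another does not change it.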
Woot! That first column is exactly the left-hand side of the original equations, i.e. the b vector! Substitute:
[pmath] det (matrix{3}{3}{ b_1 a_12 a_13 b_2 a_22 a_23 b_3 a_32 a_33}) = X_1 * det(A) [/pmath]
Dividing both sides by [pmath] det(A) [/pmath] (which must be nonzero) yields
[pmath] {det (matrix{3}{3}{ b_1 a_12 a_13 b_2 a_22 a_23 b_3 a_32 a_33})}/ {det(A)} = X_1 [/pmath]
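This is Cramer's rule for [pmath] X_1 [/pmath]. As a quick numerical sanity check (NumPy, with an illustrative 3×3 system chosen here, not taken from the text), replacing column 1 of A with b and dividing determinants reproduces the first component of the solution:

```python
import numpy as np

# Illustrative system with a known solution, so b = A X by construction.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 0.0, 0.0]])
x_true = np.array([1.0, 2.0, 3.0])
b = A @ x_true

A1 = A.copy()
A1[:, 0] = b                                 # replace column 1 with b
x1 = np.linalg.det(A1) / np.linalg.det(A)    # Cramer's rule for X_1
print(x1)                                    # agrees with x_true[0]
```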
This lends itself to the following form:
[pmath] (matrix{1}{3}{b_1 b_2 b_3}) * (matrix{3}{1}{{A_11}^c {A_21}^c {A_31}^c}) = X_1*det(A) [/pmath]
where the superscript c denotes the cofactor of the associated matrix element.
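In other words, [pmath] X_1 [/pmath] is the dot product of b with the vector of column-1 cofactors, divided by [pmath] det(A) [/pmath]. A small sketch (NumPy; the `cofactor` helper and example matrix are illustrative, not from the text):

```python
import numpy as np

def cofactor(M, i, j):
    """Signed minor: delete row i and column j, take the determinant."""
    minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 0.0, 0.0]])
b = np.array([7.0, 13.0, 1.0])

cof = np.array([cofactor(A, i, 0) for i in range(3)])  # column-1 cofactors
x1 = cof @ b / np.linalg.det(A)
print(x1)                         # agrees with np.linalg.solve(A, b)[0]
```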
Observations
- The numerator determinant does not contain the [pmath] X_1 [/pmath] coefficients (the first column of A) at all, so the solution does not depend on their direction
- The solution is scaled by the [pmath] X_1 [/pmath] coefficients only through [pmath] det(A) [/pmath] in the denominator
- There is a lot here that can be said to be "vector like". The columns scaled and added to column 1 point in the same directions as the other columns (basis vectors?), and thus their cross product with those columns yields zero!
- The dot product of the b vector with the cofactor vector (generated by the cross product of the other columns) shows the solution to be the projection of b onto the cofactor vector, scaled by [pmath] 1/det(A) [/pmath]
- The cofactor vector is orthogonal to all the other column vectors, which shows that only the component of b in that orthogonal direction counts in the solution
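The last two observations can be checked directly: in three dimensions the column-1 cofactor vector is exactly the cross product of columns 2 and 3, so it is orthogonal to both, and only b's component along it contributes to the solution. A sketch with a made-up example matrix (not from the text):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 0.0, 0.0]])
b = np.array([7.0, 13.0, 1.0])

# In 3D, the column-1 cofactor vector is the cross product of columns 2 and 3.
cof = np.cross(A[:, 1], A[:, 2])
print(cof @ A[:, 1], cof @ A[:, 2])   # both 0: orthogonal to the other columns

x1 = cof @ b / np.linalg.det(A)       # only b's component along cof matters
print(x1)
```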