One easy consequence of our definition of determinant from last time is that any singular matrix must have determinant zero. Suppose $A$ is a singular matrix and that $E$ is the matrix which puts $A$ into row reduced form. Then we have

$$EA = R,$$

where $R$ denotes the row reduced form of $A$.
If $A$ is singular, then once we put it in row reduced form it must have a row of zeros. We can now break $A$ up into a product of elementary-type matrices, one of which will be of the form $E_i(0)$, the matrix that scales row $i$ by zero. We know that this matrix has determinant zero, so the product of the determinants is zero, and thus $\det A = 0$. Likewise, if $\det A = 0$, we can write $A$ as a product of elementary row matrices together with one factor of the form $E_i(0)$ (otherwise $A$ would be a product of invertible matrices, and hence invertible), so $A$ is singular. Now we know that a matrix is singular if and only if its determinant is zero.
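This equivalence is easy to spot-check numerically. A minimal sketch with NumPy (the matrices below are arbitrary examples chosen for illustration, not from the post):

```python
import numpy as np

# A singular matrix: its second row is twice its first,
# so the rows are linearly dependent.
singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])

# An invertible matrix: its rows are linearly independent.
invertible = np.array([[1.0, 2.0],
                       [3.0, 4.0]])

print(np.linalg.det(singular))    # ~0, since the matrix is singular
print(np.linalg.det(invertible))  # nonzero: 1*4 - 2*3 = -2
```

Floating-point row reduction makes the determinant of the singular matrix come out as a tiny number rather than exactly zero, which is why the comparison is "approximately zero."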

Suppose now that the first row of $A$ can be written as $a_1 = b + c$ for some vectors $b$ and $c$. We wish to show that

$$\det\begin{pmatrix} b + c \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = \det\begin{pmatrix} b \\ a_2 \\ \vdots \\ a_n \end{pmatrix} + \det\begin{pmatrix} c \\ a_2 \\ \vdots \\ a_n \end{pmatrix}$$
First suppose that the rows $a_2$ through $a_n$ form a linearly dependent set. Then our matrix is singular and so has determinant zero. The determinants on the right in the above equation are zero too, since those matrices contain the same dependent rows, so we have our result.

Suppose now that $a_2$ through $a_n$ are linearly independent. We can then extend these to a basis for $\mathbb{R}^n$ by adding a vector, call it $u$. Then there exist scalars $\beta, \gamma$ and $\beta_i, \gamma_i$ for $i = 2, \ldots, n$ such that

$$b = \beta u + \sum_{i=2}^{n} \beta_i a_i, \qquad c = \gamma u + \sum_{i=2}^{n} \gamma_i a_i.$$
Some simple manipulations from last time (adding a multiple of one row to another leaves the determinant unchanged, and scaling a row scales the determinant) give us the following.

$$\det\begin{pmatrix} b \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = \beta \det\begin{pmatrix} u \\ a_2 \\ \vdots \\ a_n \end{pmatrix}$$
And likewise,

$$\det\begin{pmatrix} c \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = \gamma \det\begin{pmatrix} u \\ a_2 \\ \vdots \\ a_n \end{pmatrix}$$
Now we combine these results, noting that $b + c = (\beta + \gamma)u + \sum_{i=2}^{n}(\beta_i + \gamma_i)a_i$, so the same manipulations give

$$\det\begin{pmatrix} b + c \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = (\beta + \gamma)\det\begin{pmatrix} u \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = \det\begin{pmatrix} b \\ a_2 \\ \vdots \\ a_n \end{pmatrix} + \det\begin{pmatrix} c \\ a_2 \\ \vdots \\ a_n \end{pmatrix}$$
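This additivity in the first row can be spot-checked numerically. A quick sketch with NumPy, using arbitrary example vectors of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example data: two candidate first rows b and c,
# and fixed remaining rows a2, a3 of a 3x3 matrix.
b = rng.standard_normal(3)
c = rng.standard_normal(3)
rest = rng.standard_normal((2, 3))  # rows a2 and a3

det_sum = np.linalg.det(np.vstack([b + c, rest]))
det_b = np.linalg.det(np.vstack([b, rest]))
det_c = np.linalg.det(np.vstack([c, rest]))

# The determinant with first row b + c equals the sum of the
# determinants with first rows b and c.
print(np.isclose(det_sum, det_b + det_c))  # True
```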
Since swapping two rows merely changes the sign of the determinant, and that sign change hits every term of the identity equally, the result holds for any row, not just the first. A similar argument shows the result also holds for columns.
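The same numeric check works in an arbitrary row, and in a column. A short sketch, again with arbitrary random data and an arbitrary choice of index:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 4, 2  # 4x4 matrix; split row (and column) k; arbitrary choices

A = rng.standard_normal((n, n))
b = rng.standard_normal(n)
c = rng.standard_normal(n)

# Additivity in row k: build three copies differing only in row k.
Ab, Ac, Asum = A.copy(), A.copy(), A.copy()
Ab[k], Ac[k], Asum[k] = b, c, b + c
print(np.isclose(np.linalg.det(Asum),
                 np.linalg.det(Ab) + np.linalg.det(Ac)))  # True

# Additivity in column k: same idea, differing only in column k.
Bb, Bc, Bsum = A.copy(), A.copy(), A.copy()
Bb[:, k], Bc[:, k], Bsum[:, k] = b, c, b + c
print(np.isclose(np.linalg.det(Bsum),
                 np.linalg.det(Bb) + np.linalg.det(Bc)))  # True
```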

