Suppose we have a system of equations with $m$ equations and $n$ unknowns. How can we tell if this system has a solution? Writing the system as $Ax = b$, we see that the left-hand side gives us a particular linear combination of the columns of $A$. That is, letting $a_j$ denote the $j$-th column of $A$, the system can be thought of as

$\displaystyle x_1 a_1 + x_2 a_2 + \cdots + x_n a_n = b.$

And so we see that the system has a solution if and only if $b$ can be expressed as a linear combination of the columns of $A$. In fact, taking the set of all possible vectors $b$ for which $Ax = b$ has a solution, we have a vector space called the *column space* of $A$, which we’ll normally denote $\mathcal{C}(A)$. There are a couple of ways we can think of the column space:

- The set of all linear combinations of the columns of $A$
- The set of all vectors $b$ such that $Ax = b$ has a solution
- The set of all products $Ax$ for any vector $x$

These are all equivalent, but sometimes it may be more helpful to think of the column space in one way rather than another.
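To make the first and third characterizations concrete, here is a small numeric sanity check in plain Python (a made-up $3 \times 2$ example, not anything from the text) that the product $Ax$ really is the linear combination $x_1 a_1 + x_2 a_2$ of the columns:

```python
# Hypothetical 3x2 example: verify that A x equals the linear
# combination x_1 a_1 + x_2 a_2 of the columns of A.
A = [[1, 2],
     [3, 4],
     [5, 6]]
x = [10, -7]

# Ordinary matrix-vector product, computed row by row.
Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(3)]

# The same vector assembled column by column: x_1 a_1 + x_2 a_2.
combo = [x[0] * A[i][0] + x[1] * A[i][1] for i in range(3)]

print(Ax)            # [-4, 2, 8]
print(Ax == combo)   # True
```

In particular, this vector $Ax$ is a member of $\mathcal{C}(A)$ by any of the three descriptions above.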

A similar concept is that of the *row space*, which is the set of all linear combinations of the rows of $A$ and denoted $\mathcal{R}(A)$. Notice that $\mathcal{R}(A) = \mathcal{C}(A^T)$.

Now suppose that we multiply $A$ by a matrix $B$ on the right. How might the column space of $AB$ differ from $\mathcal{C}(A)$? Using the third characterization of a column space above, and assuming $A$ is $m \times n$ and $B$ is $n \times p$, we have that

$\displaystyle \mathcal{C}(AB) = \{ (AB)x : x \in \mathbb{R}^p \} = \{ A(Bx) : x \in \mathbb{R}^p \} \subseteq \{ Ay : y \in \mathbb{R}^n \} = \mathcal{C}(A).$

So the column space of $AB$ is contained in the column space of $A$, and in particular $\dim \mathcal{C}(AB) \leq \dim \mathcal{C}(A)$. The above is telling us that linear combinations of the columns of $AB$ are actually linear combinations of the columns of $A$. One way to see this is to realize that each column of $AB$ is actually a linear combination of the columns of $A$. Letting $b_i$ denote the $i$-th column of $B$, we can write $AB$ as follows.

$\displaystyle AB = A \begin{bmatrix} b_1 & b_2 & \cdots & b_p \end{bmatrix} = \begin{bmatrix} Ab_1 & Ab_2 & \cdots & Ab_p \end{bmatrix}$

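As a quick numeric check of this column-by-column picture, here is a sketch in plain Python (a hypothetical $2 \times 2$ example) verifying that the $j$-th column of $AB$ is $A$ times the $j$-th column of $B$:

```python
# Hypothetical 2x2 example: each column of AB is A times the
# corresponding column of B, i.e. AB = [Ab_1  Ab_2].
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

AB = [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

for j in range(2):
    b_j = [B[0][j], B[1][j]]                                    # j-th column of B
    Ab_j = [sum(A[i][k] * b_j[k] for k in range(2)) for i in range(2)]
    col_j = [AB[0][j], AB[1][j]]                                # j-th column of AB
    print(Ab_j == col_j)                                        # True
```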
Now if we take a linear combination of these columns we have

$\displaystyle c_1 (Ab_1) + c_2 (Ab_2) + \cdots + c_p (Ab_p) = A(c_1 b_1 + c_2 b_2 + \cdots + c_p b_p).$

Our choice of the values $c_i$ is arbitrary (they can be whatever we’d like them to be), but the $b_i$ are fixed since they’re entries in $B$. This restricts the possible linear combinations of the columns of $A$, so $\mathcal{C}(AB) \subseteq \mathcal{C}(A)$. Repeating the same argument but using rows instead of columns, or just using the above argument with $(AB)^T = B^T A^T$, we have that $\mathcal{R}(AB) \subseteq \mathcal{R}(B)$. Notice if $B$ is non-singular, then we also have $\mathcal{C}(A) = \mathcal{C}((AB)B^{-1}) \subseteq \mathcal{C}(AB)$, and similarly $\mathcal{R}(B) = \mathcal{R}(A^{-1}(AB)) \subseteq \mathcal{R}(AB)$ when $A$ is non-singular. Combined with the above, we have $\mathcal{C}(AB) = \mathcal{C}(A)$ when $B$ is non-singular, and $\mathcal{R}(AB) = \mathcal{R}(B)$ when $A$ is non-singular.
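To see both the containment and the possibility of a strict drop in dimension numerically, here is a sketch in plain Python with a small rank routine (Gaussian elimination over the rationals via `fractions.Fraction`); the matrices are made-up examples, not from the text:

```python
from fractions import Fraction

def rank(M):
    """Rank of M, computed by Gaussian elimination over the rationals."""
    M = [[Fraction(v) for v in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 0, 0],
     [0, 1, 0]]          # rank 2

B_sing = [[1, 0, 0],
          [0, 0, 0],
          [0, 0, 0]]     # singular: collapses most of the column space

B_inv = [[2, 1, 0],
         [0, 1, 0],
         [0, 0, 3]]      # non-singular (triangular with nonzero diagonal)

print(rank(A))                    # 2
print(rank(matmul(A, B_sing)))    # 1: strictly smaller than rank(A)
print(rank(matmul(A, B_inv)))     # 2: equality, since B_inv is invertible
```

Multiplying by the singular $B$ shrinks the column space, while multiplying by the invertible $B$ leaves its dimension unchanged, matching the argument above.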

Now what we’d like to show is that the dimension of the row space and the dimension of the column space are actually equal for any matrix $A$. Simply pick $P$ to be the nonsingular matrix so that $PA$ is in row-reduced echelon form, and pick $Q$ to be the nonsingular matrix so that $(PA)Q$ is in column-reduced echelon form. Notice then

$\displaystyle PAQ = \begin{bmatrix} I_r & 0 \\ 0 & 0 \end{bmatrix}$

where $I_r$ is the $r \times r$ identity matrix for some $r$. Since matrix multiplication is associative, $PAQ = (PA)Q = P(AQ)$. Putting the matrix into row-reduced echelon form with the $P$ moves all the non-zero rows to the top, with each row having a leading one and the rest of that column zeroed out. When we multiply this by $Q$ we put the matrix into column-reduced form, which moves all the non-zero columns to the left and zeros out everything in the rows with leading ones. With this matrix we clearly have $\dim \mathcal{R}(PAQ) = \dim \mathcal{C}(PAQ) = r$. We can show that when $P$ and $Q$ are both nonsingular, $\operatorname{rank}(PAQ) = \operatorname{rank}(A)$, using the *rank-nullity theorem*, which we’ve yet to prove.
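The conclusion (row rank equals column rank) can be checked numerically. Below is a sketch in plain Python, reusing the same exact-arithmetic rank routine on a made-up $3 \times 4$ matrix whose third row is the sum of the first two:

```python
from fractions import Fraction

def rank(M):
    """Rank of M, computed by Gaussian elimination over the rationals."""
    M = [[Fraction(v) for v in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def transpose(M):
    return [list(col) for col in zip(*M)]

# Made-up example: the third row is the sum of the first two, so the
# row space is 2-dimensional.
A = [[1, 2, 0, 3],
     [0, 1, 1, 1],
     [1, 3, 1, 4]]

print(rank(A))             # 2  (dimension of the row space)
print(rank(transpose(A)))  # 2  (dimension of the column space)
```

Since $\mathcal{R}(A) = \mathcal{C}(A^T)$, computing the rank of $A^T$ is one way to read off the dimension of the column space of $A$ with the same routine.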
