
linear algebra question?

suppose A is an m by n matrix with the property that for every vector b in R^m (m-dimensional space),

the equation Ax = b has at most one solution

use the definition of linear independence to explain why the columns of A must be linearly independent.

i don't even know where to begin

thanks!

1 Answer

  • 7 years ago

    With some things in linear algebra you can develop an intuition for what ought to go on in a proof based on your intuition about geometric things, or some other mental picture of what is going on. With problems like this, there is absolutely nothing to go from except the algebraic definitions of "linear independence" and matrix multiplication. I think the algebraic definition of "linear independence" is fairly subtle: until you've studied linear algebra for a long while, it is hard to understand why the algebraic definition is the way it is (and, in particular, why it is more convenient for algebraic reasoning than a lot of other formulations of the intuition of linear independence). So it might be that the proof I am about to write will seem like a bizarre recitation of equations with no connection to each other. It takes a while to see how to work with the definition of linear independence in proofs. I hope that seeing this example helps.

    Let a_1, a_2, ..., a_n denote the columns of A. To show from the definition that the columns of A are linearly independent, you must show that for any scalars c_1, c_2, ..., c_n with the property that c_1 a_1 + c_2 a_2 + ... + c_n a_n = 0, it must be the case that c_1 = c_2 = ... = c_n = 0.

    So let's fix any scalars c_1, c_2, ..., c_n with c_1 a_1 + c_2 a_2 + ... + c_n a_n = 0. If we let c denote the nx1 column vector with entries c_1, c_2, ..., c_n, then from the definition of matrix multiplication, we have

    c_1 a_1 + c_2 a_2 + ... + c_n a_n = Ac.

    [You should think about this a while until it makes sense to you. For example a special case of this statement is that if A is a 2x2 matrix with columns a_1 and a_2, then the column vector 7 a_1 - 5 a_2 is the same as the matrix product of A with the column vector (7, -5).]
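
    If it helps, here is a minimal numerical sketch of that special case in Python/NumPy (the matrix entries are made-up example values, not anything from your problem): the product Ac really is the same vector as the corresponding combination of the columns of A.

        import numpy as np

        # A hypothetical 2x2 matrix with columns a_1 and a_2 (example values only).
        A = np.array([[1.0, 3.0],
                      [2.0, 4.0]])
        a_1, a_2 = A[:, 0], A[:, 1]

        # The coefficients 7 and -5 from the special case above, as a column vector c.
        c = np.array([7.0, -5.0])

        # The linear combination of the columns ...
        combo = 7.0 * a_1 - 5.0 * a_2
        # ... equals the matrix-vector product Ac.
        print(combo)                       # [-8. -6.]
        print(A @ c)                       # [-8. -6.]
        print(np.allclose(combo, A @ c))   # True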

    Because of this, our assumption that c_1 a_1 + c_2 a_2 + ... + c_n a_n = 0 implies that Ac = 0. But if we let z denote the nx1 zero vector, then clearly Az = 0 also. So c and z are both solutions to the matrix equation Ax = 0. By hypothesis, Ax = b has at most one solution for every b in R^m; applying this with b = 0, the equation Ax = 0 has at most one solution. So it must be that c = z, i.e. c is the nx1 zero vector. This means that c_1 = c_2 = ... = c_n = 0.

    Conclusion: it is indeed the case that whenever c_1, c_2, ..., c_n are scalars and c_1 a_1 + c_2 a_2 + ... + c_n a_n = 0, it must be that c_1 = c_2 = ... = c_n = 0. This shows from the definition that the columns of A are linearly independent.
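
    If you want to see the whole argument in action numerically, here is another small sketch (again with made-up matrices, purely for illustration). When the columns of a concrete matrix are linearly independent, the only solution to Ax = 0 is the zero vector, which you can check by comparing the rank of the matrix with its number of columns; a matrix with a repeated column fails this.

        import numpy as np

        # A hypothetical 3x2 matrix (m = 3, n = 2) with linearly independent columns.
        A = np.array([[1.0, 0.0],
                      [2.0, 1.0],
                      [0.0, 3.0]])
        m, n = A.shape

        # Ac = 0 forces c = 0 exactly when the rank of A equals n.
        print(np.linalg.matrix_rank(A) == n)    # True: columns independent

        # By contrast, a matrix with a repeated column admits a nonzero c with Bc = 0.
        B = np.array([[1.0, 1.0],
                      [2.0, 2.0],
                      [0.0, 0.0]])
        c = np.array([1.0, -1.0])               # c_1 a_1 + c_2 a_2 = 0 with c != 0
        print(B @ c)                            # [0. 0. 0.]
        print(np.linalg.matrix_rank(B) == n)    # False: columns dependent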
