
Does size really matter?

Specifically, does eigenvalue size really matter?

If you look at the eigenvalues computed from Mann's hockey-stick data, you'll see that the leading eigenvalue (associated with the hockey-stick PC) is about 0.4. (Google is your friend here.) The remaining eigenvalues rapidly decay to zero; only two or three other eigenvalues are more than a tiny fraction of the size of the first eigenvalue.

Now if you look at the eigenvalues of random noise data that has been "mined" for a hockey-stick, you will see that the leading eigenvalue is on the order of 0.05. The remaining eigenvalues decay away slowly enough so that you have 30 or 40 eigenvalues that are a significant fraction of the size of the first eigenvalue.

What does that tell you about the differences between Mann's data and random noise data?
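A rough sketch of the contrast, in Python/NumPy with made-up numbers (not Mann's proxy data or his "mined" red-noise procedure): a panel of series sharing one common signal concentrates variance in a single dominant eigenvalue, while independent noise spreads it across many comparable eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_series = 600, 70  # arbitrary sizes for illustration

# Pure noise "proxies": no shared signal, so variance spreads out.
noise = rng.standard_normal((n_years, n_series))

# Proxies that all share one hypothetical common trend: one
# component should dominate the eigenvalue spectrum.
trend = np.linspace(0.0, 3.0, n_years)[:, None]
structured = trend + 0.2 * rng.standard_normal((n_years, n_series))

def spectrum(x):
    # Eigenvalues of the correlation matrix, largest first,
    # normalized to sum to 1 (fraction of variance explained).
    c = np.corrcoef(x, rowvar=False)
    vals = np.linalg.eigvalsh(c)[::-1]
    return vals / vals.sum()

print("noise, leading fraction:     ", spectrum(noise)[0])
print("structured, leading fraction:", spectrum(structured)[0])
```

The exact numbers depend on the sizes and noise level chosen here, but the qualitative picture matches the question: the shared-signal panel puts a large fraction of the variance in its first eigenvalue, while the noise panel's leading eigenvalue is a small fraction, with dozens of others nearly as big.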

Update:

"The values did change, but only as minor blips and bumps. It probably surprised him more than anyone, so I'm not completely tossing the notion."

--Say, What????

3 Answers

  • 1 decade ago
    Favorite Answer

    That sounds significant to me, but honestly I'm not enough of an expert on EOF analysis to say.

    Rio, you make eigenvalues and eigenvectors sound so easy--they're just proportions! So, do you think you could help me find them for a simple matrix? Try this one on for size:

    [3 2;-1 0]

    This is a 2 x 2 matrix; the second row is after the semicolon. I think you should be able to do this "by hand."
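For the record, the by-hand answer for this particular matrix can be checked mechanically. The characteristic polynomial is det(A - lambda*I) = (3 - lambda)(-lambda) + 2 = lambda^2 - 3*lambda + 2 = (lambda - 1)(lambda - 2), so the eigenvalues are 1 and 2. A quick NumPy verification:

```python
import numpy as np

# The 2x2 matrix from the question: first row [3, 2], second row [-1, 0].
A = np.array([[3.0, 2.0],
              [-1.0, 0.0]])

# By hand: det(A - lI) = (3-l)(-l) + 2 = l^2 - 3l + 2 = (l-1)(l-2),
# so the eigenvalues are l = 1 and l = 2.
vals, vecs = np.linalg.eig(A)
print(sorted(vals))  # approximately [1.0, 2.0]

# Check the defining relation A x = lambda x for each pair.
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
```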

  • Rio
    Lv 6
    1 decade ago

    You don't really have to use the nomenclature (eigenvalues) or (eigenvectors). It's just another name for proportional values and corresponding characteristics. To make it true, the values are substituted to satisfy Ax = lambda(x), with A being the matrix. However, the vectors cannot equal zero. Are you still with me? That's going to tell you quite a bit about proportions. As long as they all remain consistent there is no problem. I'm wondering why you're stuck on eigenvalues? Mann refuses to let anyone know his code/values. So I see no point in arguing.
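The condition stated here, Ax = lambda(x) with x nonzero, is easy to check numerically. A minimal illustration on a toy diagonal matrix of my own choosing (nothing to do with Mann's data):

```python
import numpy as np

# Toy example of the defining relation A x = lambda * x, x != 0.
A = np.diag([2.0, 3.0])
x = np.array([1.0, 0.0])  # an eigenvector of A for lambda = 2
lam = 2.0

print(np.allclose(A @ x, lam * x))  # True: x is scaled by lam, not rotated
```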

    ed: I take the last part back; he was forced into divulging his code. The values did change, but only as minor blips and bumps. It probably surprised him more than anyone, so I'm not completely tossing the notion.

  • bob326
    Lv 5
    1 decade ago

    "To make it true, the values are substituted to satisfy Ax = lambda(x), with A being the matrix. However, the vectors cannot equal zero."

    Ah, the definition of an eigenvector. How trivial.

    I'm assuming you've never studied linear algebra.
