
? asked in Science & Mathematics › Mathematics · 8 years ago

Linear Algebra Question?

Hi,

I have a question regarding Orthogonal sets and projections.

Say we have the vectors y = [3, -1, 1, 13], v(1) = [1, -2, -1, 2] and v(2) = [-4, 1, 0, 3]

The projection of y, or y hat, would be found using the equation:

y hat = (y • v(1))/(v(1) • v(1)) * (v(1)) + (y • v(2))/(v(2) • v(2)) * (v(2))

IF the set is orthogonal.
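As it happens, the two vectors in this example are already orthogonal (v(1) • v(2) = −4 − 2 + 0 + 6 = 0), so the formula applies directly to them. A quick NumPy check (using NumPy here is my own choice, not part of the question):

```python
import numpy as np

y  = np.array([3.0, -1.0, 1.0, 13.0])
v1 = np.array([1.0, -2.0, -1.0, 2.0])
v2 = np.array([-4.0, 1.0, 0.0, 3.0])

# The two-term formula is valid because v1 @ v2 == 0 for this pair
y_hat = (y @ v1) / (v1 @ v1) * v1 + (y @ v2) / (v2 @ v2) * v2
print(y_hat)  # [-1. -5. -3.  9.]
```

Here (y • v₁)/(v₁ • v₁) = 30/10 = 3 and (y • v₂)/(v₂ • v₂) = 26/26 = 1, so ŷ = 3v₁ + v₂.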

But what do we do if it isn't orthogonal? How would we find the projection of y then?

I read through my textbook today but cannot seem to find what I'm looking for...

Any help would be greatly appreciated!

1 Answer

  • Indica
    Lv 7
    8 years ago
    Favorite Answer

    Here are two ways …

    Let V = span(v₁,v₂). You want the projection of y onto the subspace V.

    (i) Find an orthogonal basis for V, say u₁, u₂. Do this using the Gram–Schmidt method. For a pair of vectors this is particularly easy: u₁ = v₁, u₂ = v₂ − ((v₂•u₁)/|u₁|²) u₁

    Then carry on as before.
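    Method (i) can be sketched in NumPy as follows. The helper names `gram_schmidt_pair` and `project` are my own, chosen for illustration, and the pair w₁, w₂ is deliberately non-orthogonal to match the question actually asked:

```python
import numpy as np

def gram_schmidt_pair(v1, v2):
    """Orthogonalize a pair: u1 = v1, u2 = v2 minus its component along u1."""
    u1 = v1
    u2 = v2 - (v2 @ u1) / (u1 @ u1) * u1
    return u1, u2

def project(y, basis):
    """Project y onto span(basis); assumes the basis vectors are mutually orthogonal."""
    return sum((y @ u) / (u @ u) * u for u in basis)

# A deliberately NON-orthogonal pair (w1 @ w2 = 3, not 0)
w1 = np.array([1.0, -2.0, -1.0, 2.0])
w2 = np.array([1.0, 0.0, 0.0, 1.0])
y  = np.array([3.0, -1.0, 1.0, 13.0])

u1, u2 = gram_schmidt_pair(w1, w2)  # after this, u1 @ u2 == 0
y_hat = project(y, [u1, u2])
# the residual y - y_hat is orthogonal to both w1 and w2
```

    The check that y − ŷ is orthogonal to the original spanning vectors is exactly what makes ŷ the projection onto span(w₁, w₂).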

    (ii) Assemble the 4×4 projection matrix P; then the projection of y is Py.

    Find P using P = A(AᵀA)⁻¹Aᵀ, where A is the 4×2 matrix whose columns are v₁, v₂ (a standard quotable result). This formula needs no orthogonality, only that the columns of A be linearly independent so that AᵀA is invertible.
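    Method (ii), checked numerically on the vectors from the question (NumPy sketch; a useful sanity check is that P is idempotent, P² = P):

```python
import numpy as np

v1 = np.array([1.0, -2.0, -1.0, 2.0])
v2 = np.array([-4.0, 1.0, 0.0, 3.0])
y  = np.array([3.0, -1.0, 1.0, 13.0])

A = np.column_stack([v1, v2])         # 4x2; columns need not be orthogonal
P = A @ np.linalg.inv(A.T @ A) @ A.T  # 4x4 projection matrix onto col(A)
print(P @ y)  # [-1. -5. -3.  9.]  -- agrees with the orthogonal-basis method
```

    Since these particular v₁, v₂ happen to be orthogonal, Py reproduces the same ŷ as the two-term formula; the point of this method is that it works unchanged when they are not.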
