Show that the determinant of the matrix C does not depend on x (Linear Algebra)?
How do I solve this problem? (Hints are appreciated, Thanks)
Show that the determinant of C, |C|, does not depend on x.
the matrix C =
sin(x), cos(x), 0
-cos(x), sin(x), 0
(sin(x) - cos(x)), (sin(x) + cos(x)), 1
C is a 3x3 matrix with each element separated by a comma
Thank you!! I had already worked it out to get sin^2(x) + cos^2(x), but leave it to me to forget my trig identities! Haha.
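For anyone who wants a quick sanity check, here is a minimal symbolic sketch (assuming SymPy is available) that builds C, takes its determinant, and simplifies it:

```python
# Symbolic check that det(C) simplifies to 1, i.e. does not depend on x.
# A sketch assuming SymPy is installed.
import sympy as sp

x = sp.symbols('x')
C = sp.Matrix([
    [sp.sin(x),             sp.cos(x),             0],
    [-sp.cos(x),            sp.sin(x),             0],
    [sp.sin(x) - sp.cos(x), sp.sin(x) + sp.cos(x), 1],
])

det_C = sp.trigsimp(C.det())
print(det_C)  # prints 1, so |C| is independent of x
```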
6 Answers
- PeterT, Lv 5, 1 decade ago (Favorite Answer)
Determinant of a 3*3 matrix
a,b,c
d,e,f
g,h,i is
(aei + bfg + cdh) - (gec + hfa + idb)
aei + bfg + cdh = sin^2(x) + 0 + 0 = sin^2(x)
gec + hfa + idb = 0 + 0 - cos^2(x) = -cos^2(x)
so det is sin^2(x) - (-cos^2(x)) = sin^2(x) + cos^2(x) = 1
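Plugging C's entries into that formula and spot-checking a few values of x, as a minimal sketch using only Python's standard library:

```python
# Numeric spot check of the (aei + bfg + cdh) - (gec + hfa + idb) formula.
from math import sin, cos, isclose

def det3(m):
    (a, b, c), (d, e, f), (g, h, i) = m
    return (a*e*i + b*f*g + c*d*h) - (g*e*c + h*f*a + i*d*b)

for x in (0.0, 0.7, 1.9, 3.14):
    C = [[sin(x),          cos(x),          0],
         [-cos(x),         sin(x),          0],
         [sin(x) - cos(x), sin(x) + cos(x), 1]]
    assert isclose(det3(C), 1.0)  # det(C) = 1 no matter which x we pick
```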
- ben e, Lv 7, 1 decade ago
Calculate the determinant and you will find some cancellation and an expression like cos^2(x) + sin^2(x), which is equal to 1.
- 1 decade ago
Very simple: you can calculate it explicitly by cofactor expansion along the last column:
|C| = |sin(x), cos(x); -cos(x), sin(x)| * 1 + |.| * 0 + |.| * 0 = sin^2(x) + cos^2(x) = 1
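Since the entries above the 1 in the last column are both zero, only that top-left 2x2 minor survives; a short sketch (assuming SymPy) that simplifies it:

```python
# Only the top-left 2x2 minor survives the expansion; simplify it symbolically.
# A sketch assuming SymPy is available.
import sympy as sp

x = sp.symbols('x')
minor = sp.Matrix([[sp.sin(x),  sp.cos(x)],
                   [-sp.cos(x), sp.sin(x)]]).det()
print(minor)               # sin(x)**2 + cos(x)**2
print(sp.trigsimp(minor))  # 1
```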
- 1 decade ago
To evaluate a determinant you can expand by minors along any row or column. In this case it seems simplest to take the last column. Indeed,
det(C) = 0*det(C_13) - 0*det(C_23) + 1*det(C_33)
Here C_13 means remove the first row and 3rd column.
This means det(C) = det(C_33) = sin(x)*sin(x) - (-cos(x))*cos(x) = sin^2(x) + cos^2(x) = 1,
so det(C) = 1, which is independent of x.
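Written out term by term, the same expansion down the third column, as a sketch assuming SymPy (C.minor_submatrix(i, 2) deletes row i and the third column, i.e. it is the C_(i+1)3 above with zero-based row index):

```python
# Expansion by minors down the third column, one signed term per row.
# A sketch assuming SymPy is available.
import sympy as sp

x = sp.symbols('x')
C = sp.Matrix([
    [sp.sin(x),             sp.cos(x),             0],
    [-sp.cos(x),            sp.sin(x),             0],
    [sp.sin(x) - sp.cos(x), sp.sin(x) + sp.cos(x), 1],
])

# (-1)**(i + 2) is the cofactor sign for column 3 (zero-based j = 2).
det_C = sum((-1)**(i + 2) * C[i, 2] * C.minor_submatrix(i, 2).det()
            for i in range(3))
print(sp.trigsimp(det_C))  # 1
```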
- read, Lv 4, 4 years ago
Let A_lambda denote the lambda-eigenspace of A. (c) follows immediately from (b): if dim A_lambda = k, that means there is a basis (v_1, ..., v_k) for A_lambda; in that case (P^(-1)v_1, ..., P^(-1)v_k) is a basis for B_lambda, so dim B_lambda = k. To prove (b), use the fact that P is invertible. If (v_1, ..., v_k) is a basis for A_lambda, then you know that (P^(-1)v_1, ..., P^(-1)v_k) is a linearly independent list of vectors (otherwise the v_j would themselves be linearly dependent), and you know that all of the P^(-1)v_j lie in B_lambda. So this is at least a partial basis. Suppose it weren't a complete basis for B_lambda; let w in B_lambda be linearly independent from all of the P^(-1)v_j. But then, by (a), Pw is in A_lambda and linearly independent from all of the v_j, contradicting the choice of (v_1, ..., v_k) as a basis for A_lambda. This completes the proof.
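Assuming the setup here is B = P^(-1)AP (parts (a)-(c) aren't shown), the fact being used is that eigenvectors of A map under P^(-1) to eigenvectors of B with the same eigenvalue; a small numerical sketch with arbitrary example matrices (assuming NumPy):

```python
# Numerical illustration: if B = P^-1 A P and A v = lam*v, then
# B (P^-1 v) = lam * (P^-1 v).  Example matrices are arbitrary; assumes NumPy.
import numpy as np

A = np.diag([2.0, 2.0, 5.0])            # eigenvalue 2 has a 2-dimensional eigenspace
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])          # any invertible P
B = np.linalg.inv(P) @ A @ P

lam = 2.0
v = np.array([1.0, -1.0, 0.0])           # A v = 2 v
w = np.linalg.inv(P) @ v
assert np.allclose(A @ v, lam * v)
assert np.allclose(B @ w, lam * w)       # P^-1 v lies in the lambda-eigenspace of B
```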
- something crazy, Lv 4, 1 decade ago
The determinant is 1 and is thus independent of x:
0[(-cos)(sin+cos) - (sin)(sin-cos)] - 0[(sin)(sin+cos) - (cos)(sin-cos)] + 1[(sin)(sin) - (cos)(-cos)]
= sin^2 + cos^2 = 1