Problem on algebra with alpha and beta?

If A and B are the roots of the quadratic ax² + bx + c = 0, obtain the equation whose roots are 1/A² and 1/B². If, in the above equation, AB² = 1, prove that a³ + c³ + abc = 0.

POMPOM (2011-07-01)

Favorite Answer

As A & B are roots of ax² + bx + c = 0, we must have
ax² + bx + c = a.(x - A)(x - B)
=> x² + (b/a)x + (c/a) = x² - (A + B)x + AB [a ≠ 0]
=> A + B = - b/a.....(1) & AB = c/a.........(2)
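
If you want to double-check (1) and (2) with a computer algebra system, here is a small SymPy sketch (my own addition, not part of the working above): it solves the general quadratic symbolically and simplifies the sum and product of the roots.

from sympy import symbols, solve, simplify

a, b, c, x = symbols('a b c x')

# The two roots of the general quadratic ax² + bx + c = 0
A, B = solve(a*x**2 + b*x + c, x)

print(simplify(A + B))   # prints -b/a, matching (1)
print(simplify(A*B))     # prints c/a, matching (2)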

Now, the equation whose roots are 1/A² and 1/B² is
(x - 1/A²).(x - 1/B²) = 0
=> x² - (1/A² + 1/B²).x + (1/A²).(1/B²) = 0
=> x² - ((A² + B²)/A²B²).x + 1/A²B² = 0
=> A².B².x² - (A² + B²).x + 1 = 0 [A, B ≠ 0]
=> A²B².x² - [(A + B)² - 2AB].x + 1 = 0
Putting A + B = - b/a & AB = c/a
(c/a)².x² - [(-b/a)² - 2.c/a].x + 1 = 0
=> (c²/a²).x² - [b²/a² - 2.c/a].x + 1 = 0
=> c².x² - (b² - 2ca).x + a² = 0
This is the required equation.
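
As a quick numeric sanity check (the numbers are my own example, not from the question): 2x² + 7x + 3 = 0 has roots A = -3 and B = -1/2, so 1/A² = 1/9 and 1/B² = 4, and both should satisfy c²x² - (b² - 2ca)x + a² = 0.

a, b, c = 2, 7, 3                  # roots of 2x² + 7x + 3 = 0 are -3 and -1/2
for r in (1/(-3.0)**2, 1/(-0.5)**2):                # r = 1/A², then 1/B²
    print(c**2 * r**2 - (b**2 - 2*c*a) * r + a**2)  # ≈ 0 both times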


Now, for the second part,
AB² = 1 => AB = 1/B [dividing both sides by B, B ≠ 0] => 1/B = c/a [as AB = c/a by (2)]
=> B = a/c
So, A = - B - b/a [as A + B = - b/a by (1)]
=> A = - a/c - b/a = - (a² + bc)/(ac)

Putting A = - (a² + bc)/(ac) & B = a/c in (2)
[- (a² + bc)/(ac)].(a/c) = c/a
=> - (a³ + abc)/(ac²) = c/a
=> - (a³ + abc) = c³ [multiplying both sides by ac²]
=> a³ + c³ + abc = 0
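
The identity can also be checked the other way round (again with my own made-up numbers): pick a = 1 and c = 2, force a³ + c³ + abc = 0 by setting b = -(a³ + c³)/(ac) = -4.5, and the resulting quadratic x² - 4.5x + 2 = 0 should then have roots satisfying AB² = 1.

import math

a, c = 1.0, 2.0
b = -(a**3 + c**3) / (a*c)    # forces a³ + c³ + abc = 0, so b = -4.5

d = math.sqrt(b**2 - 4*a*c)   # discriminant is 12.25, so d = 3.5
A = (-b + d) / (2*a)          # A = 4.0
B = (-b - d) / (2*a)          # B = 0.5 (= a/c, as derived above)

print(A * B**2)               # prints 1.0, confirming AB² = 1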