We can prove this by contradiction.
Suppose that the equation (a^2+b^2)x^2 + 2(a+b)x+2 = 0 has real roots when a is not equal to b.
Then we can use the quadratic formula to find the roots:
x = (-2(a+b) +/- sqrt((2(a+b))^2 - 4(a^2+b^2)(2)))/(2(a^2+b^2))
Simplifying this expression, we get:
x = (-(a+b) +/- sqrt(a^2 + 2ab + b^2 - 2a^2 - 2b^2))/(a^2 + b^2)
x = (-(a+b) +/- sqrt(-a^2 + 2ab - b^2))/(a^2 + b^2)
Now, since a is not equal to b, we have (a - b)^2 > 0, i.e. a^2 + b^2 > 2ab. So the expression inside the square root, -a^2 + 2ab - b^2 = -(a - b)^2, is negative, making the roots imaginary.
This contradicts our assumption. Therefore, the equation has no real roots when a is not equal to b.
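If you want to double-check the algebra, here is a short symbolic sketch in Python with SymPy (not part of the argument above; it assumes SymPy is installed, and the sample values a = 1, b = 3 are just my own pick):

```python
import sympy as sp

a, b, x = sp.symbols('a b x', real=True)

# The quadratic from the problem: (a^2 + b^2) x^2 + 2(a + b) x + 2
quadratic = (a**2 + b**2) * x**2 + 2 * (a + b) * x + 2

# Discriminant with respect to x; factoring should give -4*(a - b)**2
print(sp.factor(sp.discriminant(quadratic, x)))

# Sample case with a = 1, b = 3 (so a != b): the roots come out as a complex pair
print(sp.solve(quadratic.subs({a: 1, b: 3}), x))
```

For a = 1, b = 3 the equation is 10x^2 + 8x + 2 = 0, whose roots are -2/5 +/- i/5, consistent with the proof.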
Prove that (a^2+b^2)x^2 + 2(a+b)x+2 = 0 has no real roots if a is not equal to b
Seems like a lot of work.
The discriminant
(2(a+b))^2 - 4*2(a^2+b^2)
= 4a^2+8ab+4b^2 - 8a^2-8b^2
= -4(a-b)^2
This is negative unless a = b, so for a not equal to b there are no real roots.
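A quick numeric spot-check of that discriminant, as a minimal Python sketch (the helper name disc and the sample pairs are my own, not from the thread):

```python
# Discriminant of (a^2 + b^2)x^2 + 2(a + b)x + 2, i.e. (2(a+b))^2 - 4*2*(a^2 + b^2)
def disc(a, b):
    return (2 * (a + b)) ** 2 - 8 * (a ** 2 + b ** 2)

# Negative for every pair with a != b; exactly zero when a == b (last pair)
for a, b in [(1, 2), (-3, 5), (0.5, 0.25), (4, 4)]:
    print(a, b, disc(a, b))
```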