Having a little problem.

Prove that the roots of
ax^2 + (a + b)x + b = 0 are real for all values of a and b.

Note: the "x"s aren't multiplication signs.

The quadratic ax^2 + bx + c has discriminant

D = b^2 - 4ac.

If D is nonnegative then the function has real roots.
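That follows from the quadratic formula: the roots are x = (-b ± sqrt(D)) / (2a), and sqrt(D) is a real number exactly when D >= 0.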

In this case the coefficients are a, a + b, and b, so

D = (a + b)^2 - 4ab = a^2 + 2ab + b^2 - 4ab = a^2 - 2ab + b^2 = (a - b)^2,

which is greater than or equal to zero because it is a square. Hence the roots are real.
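If you want to sanity-check this numerically, here is a minimal Python sketch (the helper name roots, the random sampling range, and the 1e-6 tolerance are just illustrative choices, not part of the problem): it computes the roots via the quadratic formula for many random a and b and confirms the imaginary parts vanish.

import cmath
import random

def roots(a, b):
    """Roots of a*x^2 + (a + b)*x + b = 0 via the quadratic formula."""
    d = (a + b) ** 2 - 4 * a * b      # discriminant; algebraically (a - b)^2
    s = cmath.sqrt(d)                 # cmath tolerates tiny negative rounding error
    return (-(a + b) + s) / (2 * a), (-(a + b) - s) / (2 * a)

# Spot-check random nonzero leading coefficients: imaginary parts should vanish.
random.seed(0)
for _ in range(1000):
    a = random.uniform(1, 10) * random.choice([-1, 1])   # keep a away from 0
    b = random.uniform(-10, 10)
    r1, r2 = roots(a, b)
    assert abs(r1.imag) < 1e-6 and abs(r2.imag) < 1e-6

print("all sampled roots were real")

This only samples finitely many cases, of course; the algebraic argument above (D = (a - b)^2 >= 0) is what actually proves the claim for all a and b.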
