We can actually use the Zeros Theorem and the Conjugate Zeros Theorem together to conclude that an odd-degree polynomial with real coefficients must have at least one real root (since the non-real roots must come in conjugate pairs, an odd root count forces at least one root to be real). But how can we reach the same conclusion just by considering the end behavior of an odd-degree polynomial?
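A quick numerical sketch of the conjugate-pairs counting argument (the polynomials below are arbitrary examples, not from the question): compute all complex roots and check whether at least one has a negligible imaginary part.

```python
import numpy as np

def has_real_root(coeffs):
    """Return True if the polynomial with the given coefficients
    (highest degree first) has at least one numerically real root."""
    roots = np.roots(coeffs)
    # a root counts as real when its imaginary part is negligible
    return any(abs(r.imag) < 1e-8 for r in roots)

# odd degree (5): x^5 + x + 1 -> must have a real root
print(has_real_root([1, 0, 0, 0, 1, 1]))  # True

# even degree (2): x^2 + 1 -> roots are the conjugate pair ±i
print(has_real_root([1, 0, 1]))           # False
```

Since non-real roots come in conjugate pairs, the five roots of the degree-5 example cannot all be non-real, so at least one is real; the degree-2 example shows an even-degree polynomial can avoid the real axis entirely.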

Hint: Think about how the root(s) of a polynomial show up on its graph. Then, look at what we know about the end behavior of an odd-degree polynomial, and note what has to happen between the ends.

2 answers

Doesn't the graph of an odd-degree polynomial with a positive leading coefficient eventually get lost way up in the first quadrant?
And get lost way down in the third quadrant?

How did it get from way down there in the third quadrant to way up there in the first?
My guess would be that it must have crossed the x-axis somewhere, at least once, meaning there has to be at least one real root.

If the leading coefficient is negative, just reverse the logic: the graph goes from the second quadrant down to the fourth.
Odd-degree polynomials graph with the - and + ends on opposite sides of the x-axis, don't they? So they have to cross that real axis at least once.