According to a 1990 survey by Independent Sector, the percent of their income that Americans give to charity is related to their household income. For families with annual incomes between $5,000 and $100,000, the percent is modeled by

P = 0.0014x^2 − 0.1529x + 5.855

where P is the percentage of annual income given and x is the annual income in thousands of dollars. What is the larger of the two annual incomes at which Americans give 3.2% of their income to charity (P = 3.2)?
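For concreteness, the model is easy to evaluate numerically; here is a minimal sketch in Python (the function name charity_percent is my own, not from the survey):

```python
# Model from the question: percent of income given to charity,
# with income x measured in thousands of dollars.
def charity_percent(x):
    return 0.0014 * x**2 - 0.1529 * x + 5.855

print(charity_percent(50))   # e.g. a $50,000 income (x = 50) gives about 1.71%
```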

1 answer

When P = 3.2,

3.2 = 0.0014x^2 − 0.1529x + 5.855
=> 0.0014x^2 − 0.1529x + 2.655 = 0

Applying the quadratic formula with a = 0.0014, b = −0.1529, c = 2.655:

x = (0.1529 ± sqrt(0.1529^2 − 4(0.0014)(2.655))) / (2(0.0014))
x ≈ 87.554 or x ≈ 21.660

Since x is in thousands of dollars, the two incomes are approximately:
$87,554 and $21,660
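As a check, the two roots can be reproduced with the quadratic formula in a short Python sketch (variable names are my own):

```python
import math

# Solve 0.0014x^2 - 0.1529x + 2.655 = 0 via the quadratic formula.
a, b, c = 0.0014, -0.1529, 2.655
disc = b**2 - 4 * a * c                     # discriminant
x1 = (-b + math.sqrt(disc)) / (2 * a)       # larger root
x2 = (-b - math.sqrt(disc)) / (2 * a)       # smaller root

print(round(x1, 3), round(x2, 3))           # ~87.554 and ~21.66 (thousands)
print(f"Larger income: ${x1 * 1000:,.0f}")  # about $87,554
```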

The question asks for the larger of the two incomes, so the answer is $87,554.