Asked by Liz
According to a survey conducted in 1990 by
Independent Sector, the percentage of income
that Americans give to charities is related
to their household income. For families
with annual incomes between $5,000 and
$100,000, the percentage is modeled by
P = 0.0014x^2 − 0.1529x + 5.855
where P is the percentage of annual income
given and x is the annual income in thousands of dollars.
What is the larger of the two annual incomes
at which Americans give 3.2% (P = 3.2) of their income to charity?
Answers
Answered by
Arora
When P = 3.2:
3.2 = 0.0014x^2 − 0.1529x + 5.855
=> 0.0014x^2 − 0.1529x + 2.655 = 0
Solving the quadratic gives:
x = 87.554 or x = 21.660
Since x is in thousands of dollars, the two incomes are:
$87,554 and $21,660
The question asks for the larger of the two, so the answer is $87,554.
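As a quick sketch of the arithmetic above, here is how the two roots can be checked with the quadratic formula in Python (the coefficients come straight from the rearranged equation 0.0014x^2 − 0.1529x + 2.655 = 0):

```python
import math

# Coefficients of 0.0014*x**2 - 0.1529*x + 2.655 = 0
a, b, c = 0.0014, -0.1529, 2.655

# Quadratic formula: x = (-b ± sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c
x_large = (-b + math.sqrt(disc)) / (2*a)  # larger root
x_small = (-b - math.sqrt(disc)) / (2*a)  # smaller root

# Both roots lie inside the model's stated range of 5 to 100
# (thousands of dollars), so both are valid incomes.
print(round(x_large, 3))  # approx 87.554
print(round(x_small, 3))  # approx 21.660
```

Multiplying each root by 1,000 (since x is in thousands of dollars) recovers the two incomes, $87,554 and $21,660.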