We're doing optimization problems, and this is one I'm having trouble with:
Suppose a business can sell x gadgets for p = 250 - 0.01x dollars apiece, and it costs the business c(x) = 100 + 25x dollars to produce x gadgets. Determine the production level and cost per gadget required to maximize profit.
In one worked problem in the book, they find the production level by setting the derivative of the cost equal to the derivative of the revenue. (I don't know whether it always has to be done this way; I don't really understand how to find the production level.)
Going on this, I know that
p(x) = r(x) - c(x)
where p = profit, r = revenue, and c = cost,
which means that p(x) + c(x) = r(x), and therefore:
r(x) = 250 - 0.01x + 100 + 25x = 350 + 24.99x
But if we set c'(x) = r'(x), we get 25 = 24.99, which doesn't work.
Also, if we set p'(x) = 0 (to maximize profit), we end up with -0.01 = 0, which doesn't work either. (I even checked this numerically; see the sketch below.)
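Here's the quick numerical check I mentioned, a plain-Python sketch. The names price, cost, revenue, and the finite-difference helper deriv are just my own labels, and revenue encodes my (possibly wrong) assumption from above that r(x) = p + c(x):

```python
# Sanity check of my setup. NOTE: revenue() uses my guess that
# r(x) = price + cost, which may be exactly where I'm going wrong.

def price(x):
    return 250 - 0.01 * x        # dollars per gadget, as given

def cost(x):
    return 100 + 25 * x          # c(x), as given

def revenue(x):
    return price(x) + cost(x)    # my r(x) = 350 + 24.99x

def deriv(f, x, h=1e-6):
    # centered finite-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 1000
print(deriv(cost, x0))     # ~25.0
print(deriv(revenue, x0))  # ~24.99
# Both derivatives are constants, so c'(x) = r'(x) has no solution:
# the same 25 = 24.99 dead end I got by hand.
```

So the numbers agree with my algebra, which makes me think my setup (not my arithmetic) is the problem.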
How should I solve this?