The wording of the question is somewhat confusing.
You state that the value declines at an "even" rate; I am taking that to mean the rate of depreciation is the same for each year, which I am sure is what you meant.
Under that interpretation the value can mathematically never reach zero, but since we are dealing with money I arbitrarily took the final value to be $0.004.
So let the annual rate of depreciation be r. Then
85000(1 - r)^27.5 = 0.004
(1 - r)^27.5 = 0.004/85000
[(1 - r)^27.5]^(1/27.5) = (0.004/85000)^(1/27.5)
1 - r = 0.54144
r = 0.45856
so the value depreciates at a rate of 45.856% per year.
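If you want to check the algebra numerically, here is a minimal sketch in plain Python, assuming the same arbitrary $0.004 final value chosen above:

```python
# Plain-Python sketch (no external libraries) verifying the rate solved above,
# assuming the arbitrary $0.004 residual value used in the answer.
final_value = 0.004        # "essentially zero" residual value, in dollars
initial_value = 85000.0
years = 27.5

one_minus_r = (final_value / initial_value) ** (1 / years)
r = 1 - one_minus_r

print(f"1 - r = {one_minus_r:.5f}")   # about 0.54144
print(f"r     = {r:.5f}")             # about 0.45856, i.e. 45.856% per year

# Sanity check: applying the rate for 27.5 years should recover about $0.004.
print(f"check = {initial_value * one_minus_r ** years:.4f}")
```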
For a), change the percentage to a fraction (45.856% = 0.45856).
For b), take 45.856% of $85,000.
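Continuing the sketch above, parts a) and b) then follow directly from that rate; under this interpretation, the first year's depreciation is simply 45.856% of $85,000:

```python
# Continuation of the sketch above: apply the yearly rate to answer a) and b).
r = 0.45856                          # annual depreciation rate from the work above

fraction_first_year = r              # a) fraction of the value lost in year 1
first_year_depreciation = r * 85000  # b) dollar amount of the first year's depreciation

print(f"a) fraction lost in year 1: {fraction_first_year:.5f}")
print(f"b) first year's depreciation: ${first_year_depreciation:,.2f}")  # about $38,977.60
```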
For reference, here is the original question:

The owner of a rental house can depreciate its value over a period of 27 1/2 years, meaning that the value of the house declines at an even rate over that period of time until the value is $0.

a. By what fraction does the value of the house depreciate the first year?
b. If the house is judged to be worth $85,000, what is the value of the first year's depreciation?