There are nearly (pi) x (10^7) seconds in one year. Find the percentage error in this approximation, where percentage error is defined as

| (assumed value - true value) / true value | x 100%

To find the true value I did:

(365 days/yr) x (24 hours/day) x (60 minutes/hour) x (60 seconds/minute)

but this, when plugged in, didn't give me the correct answer... help!

1 answer

Assume an average year of 365.24 days. That is 31,556,736 seconds, compared with the approximation (pi) x (10^7) ≈ 31,415,927 seconds.
Percentage error is the difference divided by the true value, converted to a percentage.
I get 0.45%. (A plain 365-day year gives 31,536,000 seconds and an error of about 0.38%, which is why your 365-day calculation came out different.)
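If you want to double-check both figures, the whole calculation is short enough to script. Here is a minimal sketch (Python is just an illustrative choice; it uses only the numbers quoted above):

```python
import math

# The approximation under discussion: pi x 10^7 seconds per year.
approx = math.pi * 1e7  # about 31,415,927 s

for days in (365.0, 365.24):
    true_value = days * 24 * 60 * 60  # seconds in a year of that length
    pct_error = abs(approx - true_value) / true_value * 100
    print(f"{days} days: {true_value:,.0f} s, error = {pct_error:.2f}%")

# Prints:
# 365.0 days: 31,536,000 s, error = 0.38%
# 365.24 days: 31,556,736 s, error = 0.45%
```

The 365-day row reproduces the mismatch described in the question, and the 365.24-day row matches the 0.45% answer above.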