Please help me with this math problem.

A woman is given a loan of $20,000 for 1 year. If the interest charged is $800, what was the interest rate on the loan?

To get the answer I set the problem up like this:

R = 800/20,000 = 25(1/100) = 25/100 = 25%

but I know this can't be right. When I go back and multiply the rate times the base ($20,000), I get $5,000. Please tell me what I have done wrong.

2 answers

I got 4%. I divided 800 by 20,000 and got 0.04, which = 4%... but I could be very wrong.
Angela -- you set up the problem correctly -- but somehow didn't get the right answer. You probably divided 20,000 by 800 instead, which gives 25.

Anonymous is right. 800/20,000 = 0.04 = 4%.
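If it helps, here is a short Python sketch of the same arithmetic, including the multiply-back check the original poster tried (the variable names are just illustrative):

```python
# Simple-interest rate: rate = interest / principal
principal = 20_000   # loan amount in dollars
interest = 800       # interest charged over 1 year

rate = interest / principal
print(f"rate = {rate:.2%}")   # prints: rate = 4.00%

# Check: multiplying the rate back by the principal
# should recover the interest charged.
assert rate * principal == interest
```

Note that dividing the other way, 20,000 / 800, gives 25, which is where the 25% came from.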