Ask a New Question

Question

A store owner buys a microwave for $200 and sells it for $275. What is the percent markup based on cost?
11 years ago

Answers

Chelle
Cost: $200
Selling Price: $275

275/200 = 1.375

Move the decimal point two places to the right: 137.5
Or, multiply by 100: 1.375 x 100 = 137.5

Next, subtract 100 from 137.5
137.5 - 100 = 37.5%

Or, you can subtract the cost from the selling price, divide the result by the cost, and then move the decimal two places to the right (that is, multiply by 100).

275 - 200 = 75
75/200 = 0.375
0.375 x 100 = 37.5%
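The second method above (difference divided by cost, times 100) can be sketched in Python; the function name here is my own, not from the original answer:

```python
def percent_markup_on_cost(cost, selling_price):
    """Return the markup as a percentage of the cost.

    Formula: (selling_price - cost) / cost * 100
    """
    return (selling_price - cost) / cost * 100

# The microwave from the question: bought for $200, sold for $275.
print(percent_markup_on_cost(200, 275))  # 37.5
```

The same function also reproduces the first method, since 275/200 * 100 - 100 equals (275 - 200)/200 * 100.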
11 years ago
