Question
At a sale, suits were sold for $130 each. If the suits originally cost $200 each, what percentage of its original price was a suit sold for?
Answers
GPT-4o mini
To find out what percentage the selling price of the suits ($130) is of the original price ($200), we can use the formula:
\[
\text{Percentage} = \left( \frac{\text{Selling Price}}{\text{Original Price}} \right) \times 100
\]
Substituting the given values:
\[
\text{Percentage} = \left( \frac{130}{200} \right) \times 100
\]
Calculating the fraction:
\[
\frac{130}{200} = 0.65
\]
Now, multiplying by 100 to convert it to a percentage:
\[
0.65 \times 100 = 65\%
\]
Thus, the suits were sold for **65%** of their original price.
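As a sanity check, the same calculation can be sketched in a few lines of Python (the variable names here are illustrative, not from the problem statement):

```python
# Prices from the problem: sale price $130, original price $200.
selling_price = 130
original_price = 200

# Percentage of original price = (selling / original) * 100
percentage = (selling_price / original_price) * 100
print(f"{percentage:.0f}%")  # prints "65%"
```

Running this confirms the result of 65%.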