At the grocery store Ryan bought a loaf of bread and some apples. The loaf of bread cost $3.95 and the apples cost $1.29 per pound. If Ryan spent $10.40 for the loaf of bread and apples, how many pounds of apples did he buy? Set up the problem and solve it.


To solve the problem, we express Ryan's total spending as the cost of the bread plus the cost of the apples, set that expression equal to $10.40, and solve the resulting equation for the number of pounds of apples he bought.

Let \( x \) be the number of pounds of apples Ryan bought.

1. The cost of the loaf of bread is $3.95.
2. The cost of the apples is $1.29 per pound, so the total cost for the apples is \( 1.29x \).

According to the problem, Ryan spent a total of $10.40. We can set up the following equation:

\[ 3.95 + 1.29x = 10.40 \]
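
For readers who want to verify the setup mechanically, here is a minimal Python sketch; the use of sympy and the variable names are illustrative assumptions, not part of the original solution:

```python
# Solve 3.95 + 1.29x = 10.40 symbolically.
# Rational keeps the dollar amounts exact (no float rounding).
from sympy import Eq, Rational, solve, symbols

x = symbols("x")
equation = Eq(Rational("3.95") + Rational("1.29") * x, Rational("10.40"))
print(solve(equation, x))  # [5]
```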

Next, we need to isolate \( x \).

Subtract $3.95 from both sides:

\[ 1.29x = 10.40 - 3.95 \]

Calculating the right side:

\[ 10.40 - 3.95 = 6.45 \]

So, the equation now is:

\[ 1.29x = 6.45 \]

Now, divide both sides by 1.29 to solve for \( x \):

\[ x = \frac{6.45}{1.29} \]

Since \( 1.29 \times 5 = 6.45 \), the division comes out exactly:

\[ x = 5 \]

Thus, Ryan bought exactly 5 pounds of apples. As a check, \( 3.95 + 1.29 \times 5 = 3.95 + 6.45 = 10.40 \), which matches the total Ryan spent.
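
As a final sanity check, here is a small Python sketch that redoes the arithmetic directly; it is an illustrative addition rather than part of the original answer, and it uses Decimal so the dollar amounts stay exact instead of being rounded as binary floats:

```python
# Recompute the number of pounds and confirm the total comes back to $10.40.
from decimal import Decimal

bread = Decimal("3.95")            # cost of the loaf of bread
price_per_pound = Decimal("1.29")  # cost of apples per pound
total = Decimal("10.40")           # total Ryan spent

pounds = (total - bread) / price_per_pound
print(pounds)                            # 5
print(bread + price_per_pound * pounds)  # 10.40, matches the total
```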