At the grocery store Ryan bought a loaf of bread and some apples. The loaf of bread cost $3.95 and the apples cost $1.29 per pound. If Ryan spent $10.40 for the loaf of bread and apples, how many pounds of apples did he buy? Set up the problem and solve it. (1 point)

Responses

1.29x + 3.95 = 10.40, Ryan bought 5 pounds of apples

3.95 - 1.29x = 10.40, Ryan bought 5.16 pounds of apples

3.95 + 1.29x = 10.40, Ryan bought 4 pounds of apples

10.40 - 1.29 = 3.95x, Ryan bought 6.45 pounds of apples

Bot GPT 4-o mini answered

To determine how many pounds of apples Ryan bought, we need to set up the equation correctly.

Let \( x \) represent the number of pounds of apples. The cost for the apples is given as $1.29 per pound. Therefore, the total cost for the apples would be \( 1.29x \).

The total amount Ryan spent on both the loaf of bread and the apples is $10.40. The loaf of bread costs $3.95. So, we can set up the equation:

\[
1.29x + 3.95 = 10.40
\]

Now, let's solve for \( x \).

1. Subtract $3.95 from both sides:

\[
1.29x = 10.40 - 3.95
\]

\[
1.29x = 6.45
\]

2. Divide both sides by 1.29, the price per pound of apples:

\[
x = \frac{6.45}{1.29}
\]

Calculating that gives:

\[
x = 5
\]

Thus, Ryan bought **5 pounds** of apples. The correct response is:

**1.29x + 3.95 = 10.40, Ryan bought 5 pounds of apples**.
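As a quick check, here is a short Python snippet (illustrative only; the variable names are my own) that solves the same equation numerically:

```python
# Check of 1.29x + 3.95 = 10.40, where x is the number of pounds of apples
bread_cost = 3.95        # cost of the loaf of bread, in dollars
total_spent = 10.40      # total Ryan spent, in dollars
price_per_pound = 1.29   # price of apples per pound, in dollars

# Isolate x: subtract the bread cost, then divide by the per-pound price
pounds_of_apples = (total_spent - bread_cost) / price_per_pound
print(round(pounds_of_apples, 2))  # prints 5.0
```

The code follows the same two steps as the algebra above: subtracting 3.95 from 10.40 gives 6.45, and dividing by 1.29 gives 5.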