At the grocery store Ryan bought a loaf of bread and some apples. The loaf of bread cost $3.95 and the apples cost $1.29 per pound. If Ryan spent $10.40 for the loaf of bread and the apples, how many pounds of apples did he buy? Set up the problem and solve it. (1 point)

1 answer

Let x be the number of pounds of apples Ryan bought.
The apples cost 1.29x dollars and the bread costs 3.95 dollars, so the total cost is 1.29x + 3.95 = 10.40.
Subtracting 3.95 from both sides gives 1.29x = 10.40 - 3.95 = 6.45.
Dividing both sides by 1.29 gives x = 6.45 / 1.29 = 5, so Ryan bought 5 pounds of apples. Answer: \boxed{5}.
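If you want to double-check the arithmetic, here is a minimal Python sketch (not part of the original answer) that solves the same equation 1.29x + 3.95 = 10.40 using exact decimal values, so floating-point rounding does not obscure the result. The variable names are just illustrative.

```python
from decimal import Decimal

# Prices and total from the problem statement, as exact decimals.
bread_cost = Decimal("3.95")       # cost of the loaf of bread, in dollars
price_per_pound = Decimal("1.29")  # cost of apples per pound, in dollars
total_spent = Decimal("10.40")     # total Ryan spent, in dollars

# Solve 1.29*x + 3.95 = 10.40 for x: subtract the bread, divide by the per-pound price.
pounds_of_apples = (total_spent - bread_cost) / price_per_pound
print(pounds_of_apples)  # prints 5
```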