To solve this problem, we need to set up an equation to find how many pounds of apples Ryan bought.
Let x be the number of pounds of apples Ryan bought.
According to the problem, the cost of the loaf of bread is $3.95 and the cost of apples is $1.29 per pound.
Therefore, the cost of the apples that Ryan bought can be calculated using the equation:
Cost of apples = Price per pound × Number of pounds of apples
Ryan paid $10.40 in total, so the cost of the apples is $10.40 − $3.95 (the cost of the loaf of bread) = $6.45.
Equating this with the equation above, we get:
$6.45 = $1.29 × x
To solve for x, we can divide both sides of the equation by $1.29:
x = $6.45 / $1.29
Now, let's evaluate the division. Since $1.29 × 5 = $6.45, the quotient is exact:
x = 5
Therefore, Ryan bought exactly 5 pounds of apples at the grocery store.
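As a quick sanity check, the arithmetic above can be reproduced in a few lines of Python. Using exact fractions (rather than floating-point numbers) shows that $6.45 ÷ $1.29 is exactly 5, not an approximation:

```python
from fractions import Fraction

# Prices from the problem, as exact fractions to avoid float rounding.
total = Fraction("10.40")        # total Ryan paid
bread = Fraction("3.95")         # cost of the loaf of bread
per_pound = Fraction("1.29")     # price of apples per pound

apple_cost = total - bread       # cost of the apples alone
pounds = apple_cost / per_pound  # number of pounds of apples

print(float(apple_cost))  # 6.45
print(pounds)             # 5
```

Using `Fraction` here is deliberate: dividing the raw floats 6.45 / 1.29 can pick up a tiny rounding error, while exact rational arithmetic confirms the answer is the whole number 5.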