An initial-value problem is given by the differential equation
dy/dx = f(x, y) = x + y, with y(0) = 1.64.
The Euler-midpoint method is used to find an approximate value of y(0.1) with a step size of h = 0.1.
Then the integrating factor method is used to find the exact value of y(0.1).
Hence, determine the global error, giving your answer to 5 decimal places.
I got stuck with this. Can anybody help?
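For anyone answering, here is a quick numerical sketch of what I think the two methods should give (my own attempt in Python; the function and variable names are my own, so please correct me if the setup is wrong):

```python
import math

# Right-hand side of the ODE: dy/dx = f(x, y) = x + y
def f(x, y):
    return x + y

x0, y0, h = 0.0, 1.64, 0.1

# Euler-midpoint (RK2) step: evaluate the slope at the midpoint,
# then take one full step with that slope.
k1 = f(x0, y0)                 # slope at the start  = 1.64
y_mid = y0 + (h / 2) * k1      # midpoint estimate   = 1.722
k2 = f(x0 + h / 2, y_mid)      # slope at midpoint   = 1.772
y_approx = y0 + h * k2         # y(0.1) approx       = 1.8172

# Integrating factor: rewrite as y' - y = x, multiply by e^{-x},
# integrate to get y(x) = C e^x - x - 1, and y(0) = 1.64 gives C = 2.64.
y_exact = 2.64 * math.exp(0.1) - 0.1 - 1.0

global_error = abs(y_exact - y_approx)
print(f"midpoint approximation: {y_approx:.5f}")
print(f"exact value:            {y_exact:.5f}")
print(f"global error:           {global_error:.5f}")
```

If I haven't slipped up, the global error rounds to 0.00045 at 5 decimal places, but I'd appreciate someone checking the midpoint step.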