A double-slit diffraction pattern is formed on a distant screen. If the separation between the slits decreases, what happens to the distance between interference fringes? Assume the angles involved remain small.
The distance between interference fringes increases.
The distance between interference fringes remains the same.
The effect cannot be determined unless the distance between the slits and the screen is known.
The distance between interference fringes also decreases.

5 answers

The first off-center maximum occurs when the path lengths to the screen from the two slits differ by one wavelength. If you move the slits closer together, the angle to the screen must increase to make the path lengths differ by a wavelength.
Therefore, if you decrease the spacing between the slits, you WIDEN the interference pattern on the screen.
At a fixed angle from the slits, if you decrease the distance between the slits, the path-length difference decreases. So the angle needs to increase to get back to the same path-length difference (half a wavelength for the first dark fringe, a whole wavelength for the first bright one).
I had answered that the distance between interference fringes also decreases, but I got it wrong.
This is the relevant equation:

d sin(theta) = m (lambda)
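In the small-angle limit this relation gives the fringe spacing directly. A short sketch of the standard derivation, writing L for the slit-to-screen distance and y_m for the position of the m-th bright fringe (symbols introduced here for illustration, not used elsewhere in the thread):

\[
  d\sin\theta_m = m\lambda, \qquad
  \sin\theta_m \approx \tan\theta_m = \frac{y_m}{L}
  \;\Longrightarrow\;
  y_m \approx \frac{m\lambda L}{d}, \qquad
  \Delta y = y_{m+1} - y_m \approx \frac{\lambda L}{d}
\]

Since the spacing Δy is proportional to 1/d, shrinking d increases the distance between fringes.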

I am confused about what it's asking now,
isn't it d?
yes
For the same m (lambda)
for smaller d, you need BIGGER theta
so the peaks on the screen will be farther apart as d shrinks.
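As a quick numerical sanity check of that conclusion, here is a minimal Python sketch; the wavelength, screen distance, and the two slit separations are made-up values chosen only for illustration:

import math

wavelength = 500e-9   # assumed wavelength: 500 nm
L = 1.0               # assumed slit-to-screen distance: 1 m

for d in (250e-6, 125e-6):                 # two assumed slit separations
    theta1 = math.asin(wavelength / d)     # angle of first off-center maximum (m = 1)
    y1 = L * math.tan(theta1)              # its position on the screen
    spacing = wavelength * L / d           # small-angle fringe spacing, lambda*L/d
    print(f"d = {d*1e6:5.0f} um: theta1 = {math.degrees(theta1):.3f} deg, "
          f"y1 = {y1*1e3:.2f} mm, fringe spacing ~ {spacing*1e3:.2f} mm")

Halving d from 250 um to 125 um doubles both the first-order angle (in the small-angle regime) and the fringe spacing, so the correct choice is that the distance between interference fringes increases.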