Suppose that the separation between speakers A and B is 5.80 m and the speakers are vibrating in phase. They are playing identical 135 Hz tones, and the speed of sound is 343 m/s. What is the largest possible distance between speaker B and the observer at C, such that he observes destructive interference?
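For the numbers involved: with v = 343 m/s and f = 135 Hz, the wavelength works out to

\[
\lambda = \frac{v}{f} = \frac{343\ \text{m/s}}{135\ \text{Hz}} \approx 2.54\ \text{m},
\qquad
\frac{\lambda}{2} \approx 1.27\ \text{m}.
\]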

Destructive interference? Unless you put C on the line between A and B, the answer is at infinity.

Now, on the line between A and B, the distance from A to C has to be an odd multiple of half a wavelength greater than the distance from C to B. The distance between the speakers is roughly two wavelengths. So if AC - BC is n lambda/2 with n odd, then n must be one, so
AC = BC + 2.54/2, where BC = 5.8 - AC.

Solve for AC.
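Taking those two relations together (a sketch of this collinear case only, using lambda/2 ≈ 1.27 m from above):

\[
AC = (5.8 - AC) + 1.27
\quad\Longrightarrow\quad
2\,AC \approx 7.07
\quad\Longrightarrow\quad
AC \approx 3.54\ \text{m},\qquad BC \approx 2.27\ \text{m}.
\]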

Please tell me where I am going wrong.

AC = BC + 2.54/2

AC = 7.366

How come it's BC = 5.8 - AC? Shouldn't it be BC = AC - 5.8?

Because

AC = sqrt[(AB)^2 + (BC)^2]
AC = 5.8 + BC
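If C sits off to the side so that ABC is a right angle (which is what the square-root relation above assumes), then the largest BC corresponds to the smallest path difference, namely lambda/2 ≈ 1.27 m. As a rough sketch of that case:

\[
\sqrt{(5.8)^2 + BC^2} - BC = 1.27
\quad\Longrightarrow\quad
33.64 = 2(1.27)\,BC + (1.27)^2
\quad\Longrightarrow\quad
BC \approx 12.6\ \text{m}.
\]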
