Why is an antenna designed to be one-half of the wavelength of the wave it is supposed to receive?

All my book says is that the voltage is larger when the antenna is 1/2 the wavelength of the wave.


It is explained in detail here

http://farside.ph.utexas.edu/teaching/jk1/lectures/node82.html

The important point is that an antenna that is much shorter than the wavelength will have a radiation resistance that is far less than its internal ohmic resistance.

Then, for every joule of energy radiated as radio waves, many more joules are necessarily dissipated as heat in the antenna.
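To put a rough number on that, here is a small sketch (with assumed, illustrative values) using the textbook Hertzian-dipole formula R_rad ≈ 80π²(L/λ)² together with an assumed conductor loss resistance; the efficiency is R_rad / (R_rad + R_loss).

```python
# Sketch: efficiency of an electrically short dipole (assumed values, not measurements).
# Hertzian-dipole radiation resistance for L << lambda:  R_rad ≈ 80 * pi^2 * (L/lambda)^2  [ohm]
import math

wavelength = 10.0   # m (30 MHz, for illustration)
length     = 0.5    # m -> L/lambda = 0.05, "much shorter than the wavelength"
r_loss     = 2.0    # ohm, assumed ohmic (conductor) resistance

r_rad = 80 * math.pi**2 * (length / wavelength)**2
efficiency = r_rad / (r_rad + r_loss)

print(f"R_rad = {r_rad:.2f} ohm, efficiency = {efficiency:.1%}")
# R_rad ≈ 1.97 ohm, efficiency ≈ 50% -- half the power just heats the antenna.
```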

If an antenna is of the order of a wavelength, then the radiation resistance will be quite large. To use an antenna for transmission, the impedance of the antenna needs to match that of the transmitter (to be precise, it has to be the complex conjugate); otherwise part of the transmitter's output will be reflected back into the transmitter, potentially damaging it.
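A quick way to see how bad a mismatch is: the fraction of power reflected is |Γ|², with Γ = (Z_ant − Z_src*)/(Z_ant + Z_src). A minimal sketch, with illustrative impedances (the 5 − 500j Ω value is a made-up stand-in for a very short antenna):

```python
# Sketch: fraction of transmitter power reflected by an impedance mismatch.
# Gamma = (Z_ant - conj(Z_src)) / (Z_ant + Z_src); |Gamma|^2 is the reflected fraction.

def reflected_fraction(z_ant: complex, z_src: complex) -> float:
    gamma = (z_ant - z_src.conjugate()) / (z_ant + z_src)
    return abs(gamma) ** 2

print(reflected_fraction(50 + 0j, 50 + 0j))   # 0.0   -> perfect (conjugate) match
print(reflected_fraction(5 - 500j, 50 + 0j))  # ~0.996 -> nearly all power reflected
```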

Now, since the impedance of a half-wavelength antenna (roughly 73 Ω at resonance) is of the same order as that of typical transmitters and coax cables, such antennas are easy to use.
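As a quick check of "same order", here is how an assumed 73 Ω resonant half-wave dipole looks on ordinary 50 Ω coax:

```python
# Sketch: mismatch of an assumed 73-ohm resonant half-wave dipole on 50-ohm coax.
z0, z_ant = 50.0, 73.0

gamma = abs((z_ant - z0) / (z_ant + z0))
vswr = (1 + gamma) / (1 - gamma)
reflected = gamma ** 2

print(f"VSWR = {vswr:.2f}, reflected power = {reflected:.1%}")
# VSWR ≈ 1.46, reflected power ≈ 3.5% -- usable without any matching heroics.
```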

But in practice an antenna tuner is often used to match the antenna's impedance to that of the transmitter, so it is not necessary to have an antenna that is exactly the "right length".
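For illustration only, here is a sketch of the simplest kind of tuner, an L-network, matching an assumed purely resistive 10 Ω antenna to a 50 Ω transmitter; real tuners also have to cancel the antenna's reactance, which this ignores.

```python
# Sketch: L-network (series L on the low-resistance side, shunt C on the high side)
# for a purely resistive transformation. All values are assumptions for illustration.
import math

def l_match(r_source: float, r_load: float, freq_hz: float):
    """Return (L in henries, C in farads) to transform r_load to r_source."""
    r_low, r_high = sorted((r_source, r_load))
    q = math.sqrt(r_high / r_low - 1)   # loaded Q of the network
    x_series = q * r_low                # series inductive reactance, ohm
    x_shunt = r_high / q                # shunt capacitive reactance, ohm
    w = 2 * math.pi * freq_hz
    return x_series / w, 1 / (w * x_shunt)

# Example: assumed 10-ohm antenna resistance matched to 50 ohms at 7 MHz.
L, C = l_match(50.0, 10.0, 7e6)
print(f"L = {L*1e6:.2f} uH, C = {C*1e12:.0f} pF")   # ~0.45 uH, ~909 pF
```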