For a given power output, the higher the voltage, the lower the required current, and a lower current means lower power (I^2*R) loss in the transmission line. To deliver 1000 watts at 100 volts requires 10 amps of current (P = V*I); to deliver 1000 watts at 1000 volts requires only 1 amp.
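For illustration, here is a small Python sketch comparing the I^2*R loss for the same 1000 W delivered at 100 V and at 1000 V. The line resistance of 1 ohm is an assumed value chosen just to make the comparison concrete.

```python
# Compare transmission (I^2 * R) loss for the same delivered power
# at two different voltages, using an assumed line resistance.

def line_loss(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Return the I^2 * R loss in watts for a given power and line voltage."""
    current_a = power_w / voltage_v      # P = V * I  ->  I = P / V
    return current_a ** 2 * resistance_ohm

R_LINE = 1.0  # assumed line resistance in ohms (illustrative only)

for volts in (100.0, 1000.0):
    amps = 1000.0 / volts
    loss = line_loss(1000.0, volts, R_LINE)
    print(f"1000 W at {volts:.0f} V -> {amps:.0f} A, loss = {loss:.0f} W")

# Output: 10 A gives 100 W of loss at 100 V, while 1 A gives only 1 W at 1000 V,
# i.e. raising the voltage tenfold cuts the line loss by a factor of 100.
```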
When the electricity reaches its destination, the voltage is stepped down to the required level by transformers.
Why is the voltage of the electricity produced by power stations increased before it is transmitted through the national grid?
Please help
Thanks