Assume direct current (DC), so the voltage is constant.
P = EI
P = 180 W
E = 120 V
Solve for I (answer in amperes)
The power (P) required to run a motor equals the voltage (E) applied to the motor times the current (I) supplied to it. If the motor's data states that it uses 180 watts of power and the applied voltage is 120 volts, how much current will the motor require?
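Rearranging the power formula for current gives a short worked solution under the stated assumptions (ideal DC operation, constant voltage):

\[
I = \frac{P}{E} = \frac{180\,\mathrm{W}}{120\,\mathrm{V}} = 1.5\,\mathrm{A}
\]

So a motor drawing 180 W from a 120 V supply requires 1.5 amperes of current.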