A circuit employs a silicon solar cell to detect flashes of light lasting 0.25 s. The smallest current the circuit can detect reliably is 0.32 μA.
Assuming that all photons reaching the solar cell give their energy to a charge carrier, what is the minimum power of a flash of light of wavelength 570 nm that can be detected?
Attempt at solution:
Using Equation 1: f = c/λ
f = (3.00*10^8 m/s)/(570*10^-9 m) = 5.26*10^14 Hz
And Equation 2: E=hf
Planck's Constant= h=6.63*10^-34J*s= 4.14*10^-15eV*s
E = (6.63*10^-34 J*s)(5.26*10^14 Hz) = 3.49*10^-19 J
Then, dividing by the flash duration: (3.49*10^-19 J)/(0.25 s) = 1.39*10^-18 W
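The frequency and photon-energy steps can be sanity-checked numerically (a quick sketch using the constants quoted above):

```python
h = 6.63e-34   # Planck's constant, J*s
c = 3.00e8     # speed of light, m/s
lam = 570e-9   # wavelength, m

f = c / lam        # photon frequency, Hz
E_photon = h * f   # energy carried by one photon, J

print(f"f = {f:.3g} Hz")        # ~5.26e14 Hz
print(f"E = {E_photon:.3g} J")  # ~3.49e-19 J
```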
I'm not sure where to factor in the 0.32 μA, but I know the answer is supposed to be in W.
Answer:
P = E/t only helps if E is the total energy delivered by the flash. The current is where the missing link is: since every photon frees one charge carrier, the minimum detectable current fixes the minimum photon arrival rate, n = I/e. The minimum power is then

P = (I/e) * hf = (0.32*10^-6 A / 1.60*10^-19 C)(3.49*10^-19 J) ≈ 7.0*10^-7 W ≈ 0.70 μW

Note that the 0.25 s duration is not needed for the power itself; it would only enter if you wanted the total energy of the flash, E = P*t.
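The full chain of reasoning can be sketched in a few lines (assuming one charge carrier per photon and e = 1.602*10^-19 C; variable names are mine):

```python
# Minimum detectable flash power: each absorbed photon frees one charge
# carrier, so the minimum current fixes the minimum photon arrival rate.
h = 6.63e-34      # Planck's constant, J*s
c = 3.00e8        # speed of light, m/s
e = 1.602e-19     # elementary charge, C
lam = 570e-9      # wavelength, m
I_min = 0.32e-6   # minimum detectable current, A

f = c / lam               # photon frequency, Hz
E_photon = h * f          # energy per photon, J
rate = I_min / e          # charge carriers (= photons) per second
P_min = rate * E_photon   # minimum detectable power, W

print(f"P_min = {P_min:.3g} W")  # ~7.0e-7 W, i.e. about 0.70 microwatts
```

The flash duration never appears: power is energy per second, and the photon rate already carries the per-second part.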