Imagine that you are in the lab and you can decide the thickness of the Si layer of your solar cell. You want to optimize the solar cell performance for a wavelength of $\lambda = 1000\,\mathrm{nm}$, for which the absorption coefficient is $\alpha(1000\,\mathrm{nm}) = 10^2\,\mathrm{cm^{-1}}$.

Which of the following thicknesses $d_\mathrm{Si}$ would give the best performance? Take into account that you already know two things:

(1) Beer-Lambert's law.

(2) For silicon, the minority carrier diffusivity is around $D = 27\,\mathrm{cm^2/s}$ and the minority carrier lifetime is around $\tau = 15\,\mu\mathrm{s}$.

a) 100μm
b) 180μm
c) 300μm


Using Beer-Lambert's law we can calculate the fraction of the incident light that is transmitted through the silicon layer:

$$\frac{I}{I_0} = e^{-\alpha(1000\,\mathrm{nm})\, d_\mathrm{Si}}$$
For $d_\mathrm{Si} = 100\,\mu\mathrm{m}$, we have $I/I_0 = 0.37$, so 63% of the light is absorbed in the silicon layer.

For $d_\mathrm{Si} = 180\,\mu\mathrm{m}$, we have $I/I_0 = 0.17$, so 83% of the light is absorbed in the silicon layer.

For $d_\mathrm{Si} = 300\,\mu\mathrm{m}$, we have $I/I_0 = 0.05$, so 95% of the light is absorbed in the silicon layer.
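
These numbers can be reproduced with a minimal Python sketch of the Beer-Lambert calculation (the variable names are illustrative; the values are those given in the problem statement):

```python
import math

alpha = 1e2                        # absorption coefficient at 1000 nm, in cm^-1
thicknesses_um = [100, 180, 300]   # candidate Si thicknesses, in micrometers

for d_um in thicknesses_um:
    d_cm = d_um * 1e-4                      # micrometers -> centimeters
    transmitted = math.exp(-alpha * d_cm)   # Beer-Lambert: I/I0 = exp(-alpha * d)
    absorbed = 1.0 - transmitted
    print(f"d = {d_um:3d} um: I/I0 = {transmitted:.2f}, absorbed = {absorbed:.0%}")
```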

It would seem that $d_\mathrm{Si} = 300\,\mu\mathrm{m}$ is the best option. However, we can calculate the diffusion length as

$$L_d = \sqrt{D\tau} = \sqrt{27 \times 15 \times 10^{-6}}\,\mathrm{cm} = 0.02\,\mathrm{cm} = 200\,\mu\mathrm{m}$$
If we choose $d_\mathrm{Si} = 300\,\mu\mathrm{m}$, the silicon layer would be thicker than the diffusion length, and hence the collection of the charge carriers would be inefficient. Therefore the best option is $d_\mathrm{Si} = 180\,\mu\mathrm{m}$, which achieves slightly lower absorption but better collection, since it is smaller than the diffusion length.
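
The diffusion length follows directly from the given $D$ and $\tau$; a minimal sketch of that one-line calculation (again with illustrative variable names):

```python
import math

D = 27.0        # minority carrier diffusivity, cm^2/s
tau = 15e-6     # minority carrier lifetime, s

L_d_cm = math.sqrt(D * tau)   # diffusion length L_d = sqrt(D * tau), in cm
L_d_um = L_d_cm * 1e4         # centimeters -> micrometers
print(f"L_d = {L_d_cm:.3f} cm = {L_d_um:.0f} um")   # ~0.020 cm, i.e. ~200 um
```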
Answer: b) 180 μm