3 answers
The Richter Scale is logarithmic, and the difference in magnitudes is 5 - 2 = 3.
10^3 = 1000
The Richter scale measures the magnitude of an earthquake based on the amplitude of the seismic waves it produces. It is a logarithmic scale, meaning that each whole-number increase on the Richter scale represents a tenfold increase in the amplitude of the seismic waves.
The formula for the ratio of intensities between two earthquake magnitudes is:
Intensity ratio = 10^(M2 - M1)
Where M2 is the magnitude of the more intense earthquake (in this case, 5.0) and M1 is the magnitude of the less intense earthquake (2.0).
Using this formula, we can calculate the intensity ratio:
Intensity ratio = 10^(5.0 - 2.0) = 10^3 = 1,000
Therefore, the 5.0 magnitude earthquake is 1,000 times more intense than a magnitude 2.0 quake.
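For anyone who wants to check this with code, here is a minimal Python sketch of the same calculation; the function name intensity_ratio and the example magnitudes are just illustrative, not from any particular library.

def intensity_ratio(m1, m2):
    # Each whole-number step on the Richter scale is a tenfold
    # increase in seismic-wave amplitude, so the ratio is 10
    # raised to the difference in magnitudes.
    return 10 ** (m2 - m1)

print(intensity_ratio(2.0, 5.0))  # 1000.0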