Developed by Charles F. Richter in 1935, the scale measures the magnitude of an earthquake: a magnitude 0 event is so small it is generally not felt, while the scale has no fixed upper limit, and a magnitude 12 would correspond to something like a large, miles-wide meteor striking the earth's surface. The measurement formula in mathematical terms is

ML = log10(A) − log10(A0)
"A is the amplitude, in millimeters, measured directly from the photographic paper record of the Wood-Anderson seismometer, a special type of instrument. The distance factor comes from a table that can be found in Richter's (1958) book Elementary Seismology," (Louie, J., 1996). The solution is the local magnitude of the earthquake.
Because the scale is logarithmic, a magnitude 7 earthquake produces ground motion ten times larger than a magnitude 6, and a magnitude 8 produces ground motion one hundred times larger than a magnitude 6. "Each whole number increase in magnitude represents a tenfold increase in measured amplitude; as an estimate of energy, each whole number step in the magnitude scale corresponds to the release of about 31 times more energy than the amount associated with the preceding whole number value" (USGS, 1989).
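The amplitude and energy comparisons above reduce to simple powers of ten, as this short sketch shows. The factor of about 31 comes from the conventional scaling of radiated energy with 10^(1.5·M).

```python
def amplitude_ratio(m_big, m_small):
    # Each whole-number step is a tenfold increase in measured amplitude
    return 10 ** (m_big - m_small)

def energy_ratio(m_big, m_small):
    # Radiated energy scales roughly as 10^(1.5 * M), so each whole
    # step releases about 10^1.5, or roughly 31.6, times more energy
    return 10 ** (1.5 * (m_big - m_small))

print(amplitude_ratio(7, 6))        # 10 (magnitude 7 vs. 6)
print(amplitude_ratio(8, 6))        # 100 (magnitude 8 vs. 6)
print(round(energy_ratio(7, 6), 1)) # 31.6
```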
In America, many earthquakes tend to occur on the West Coast. ...