The magnitude of an earthquake, $R$, is defined by $R = \log_{10}\left[\frac{I_c}{I_n}\right]$, where $I_c$ is the intensity of the earthquake (measured by the amplitude of a seismograph reading taken 100 km from the epicenter of the earthquake) and $I_n$ is the intensity of a "standard earthquake" (whose amplitude is 1 micron $= 10^{-4}$ cm).

The 1906 San Francisco earthquake had a magnitude of 8.3 on the Richter scale. At the same time, there was an earthquake in South America with magnitude 4.6 that caused only minor damage. How many times more intense was the San Francisco earthquake than the South American one? Round your answer to two decimal places.

The San Francisco earthquake was ________ times more intense than the South American earthquake.
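One way to check an answer: since $R = \log_{10}(I_c/I_n)$, each intensity is $I = I_n \cdot 10^{R}$, so the ratio of two intensities is $10^{R_1 - R_2}$. A minimal sketch of that computation (variable names are illustrative, not from the original problem):

```python
# Intensity ratio between two earthquakes from their Richter magnitudes.
# Because R = log10(I / I_n), the intensity ratio is 10 ** (R1 - R2).

def intensity_ratio(magnitude_1: float, magnitude_2: float) -> float:
    """Return how many times more intense earthquake 1 is than earthquake 2."""
    return 10 ** (magnitude_1 - magnitude_2)

# San Francisco (8.3) vs. the South American earthquake (4.6):
ratio = intensity_ratio(8.3, 4.6)
print(round(ratio, 2))  # 10 ** 3.7, rounded to two decimal places
```

The difference of 3.7 magnitude units corresponds to a factor of $10^{3.7} \approx 5011.87$.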