The magnitude `R` of an earthquake is defined by `R = log_10(I_C/I_N)`, where `I_C` is the intensity of the earthquake (measured by the amplitude of a seismograph reading taken 100 km from the epicenter of the earthquake) and `I_N` is the intensity of a "standard earthquake" (whose amplitude is 1 micron = 10^-4 cm).

If one earthquake is 38 times as intense as another, how much larger is its magnitude on the Richter scale?
Your answer is   .
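Because the scale is logarithmic, the difference in magnitudes depends only on the ratio of the two intensities: if `I_1 = 38 I_2`, then `R_1 - R_2 = log_10(I_1/I_N) - log_10(I_2/I_N) = log_10(I_1/I_2) = log_10(38)`. A minimal Python sketch of this computation (the function name is illustrative, not from the problem):

```python
import math

def magnitude_difference(intensity_ratio):
    # Delta R = log10(I_1 / I_2): the standard intensity I_N cancels out.
    return math.log10(intensity_ratio)

delta = magnitude_difference(38)
print(round(delta, 4))  # → 1.5798
```

So one earthquake 38 times as intense as another is about 1.58 units larger on the Richter scale.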