Significant Digits

Let's begin by helping you get a feel for what it means to have ten-digit
floating point numbers. Keep in mind that when we say a floating point number
has ten digits, we mean that, when written in scientific notation, its mantissa
has ten digits.

Suppose you are doing a calculation that depends upon knowing the surface area
of the earth. Suppose further that your calculation must be accurate to ten
significant digits. If your measure of the earth's surface area is not
accurate to ten digits, then there's no way your calculation can be accurate to
ten digits. In other words, your calculations can be no more accurate than
your data.

Is it reasonable to expect to be able to measure the surface area of the earth
to ten digits of accuracy? Let's see exactly what that would entail. First of
all, use Mathematica to calculate the surface area of the earth subject to the
following two assumptions:

 The radius of the earth is 6371 kilometers.
 The earth is a perfect sphere.

Recall that the surface area of a sphere of radius $r$ is $4 \pi r^2$.
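The text asks you to do this in Mathematica; as a rough sketch of the same
computation, here it is in Python, using the radius of 6371 kilometers assumed
above and printing the result to ten significant digits:

```python
import math

radius_km = 6371  # assumed radius of a perfectly spherical earth, in kilometers
area_km2 = 4 * math.pi * radius_km**2  # surface area of a sphere: 4 pi r^2
print(f"{area_km2:.10g} km^2")  # ten significant digits
```

The `.10g` format displays exactly ten significant digits, which is the
precision this section is asking you to think about.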

Thanks to Brigham Young, a city block in Salt Lake City is a square one-seventh
of a mile on a side. What is the surface area of the earth expressed in city
blocks? (In case you've forgotten, there are 5280 feet in a mile, 12 inches in
a foot, 2.54 centimeters in an inch, and 100,000 centimeters in a kilometer.)
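Again, the exercise is meant to be done in Mathematica; the following Python
sketch strings together the conversion factors given above (feet per mile,
inches per foot, centimeters per inch, centimeters per kilometer) to express
the spherical-earth surface area in Salt Lake City blocks:

```python
import math

# One city block is one-seventh of a mile on a side; convert that side
# length to kilometers: feet -> inches -> centimeters -> kilometers.
block_side_km = (5280 / 7) * 12 * 2.54 / 100_000
block_area_km2 = block_side_km**2          # area of one square city block, in km^2

earth_area_km2 = 4 * math.pi * 6371**2     # spherical-earth surface area, in km^2
blocks = earth_area_km2 / block_area_km2   # surface area expressed in city blocks
print(f"{blocks:.10g} city blocks")
```

The answer comes out on the order of ten billion blocks, so writing it to ten
significant digits resolves it down to roughly a single block.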

(1) At a minimum, how many city blocks would you have to overlook in order
to change the last of the ten digits that express your estimate of the surface
area of the earth?

Of course, the earth is not a perfect sphere and the radius that we have given
is not entirely precise either. Furthermore, the surface of the earth is
extremely wrinkled. All of this makes your estimate of the earth's surface
area, with its ten digits of ``accuracy,'' rather suspect. Suppose that your
estimate is accurate to one-tenth of one percent.

(2) In that case, how many of the ten digits can you trust, and how many
are just garbage? Explain your answer.
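To get a feel for the sizes involved in question (2), here is a small Python
sketch of the relevant arithmetic: one-tenth of one percent is a relative
error of $10^{-3}$, and multiplying the estimate by that relative error gives
the absolute uncertainty in square kilometers. (This only sets up the
arithmetic; the digit counting in question (2) is still yours to do.)

```python
import math

area_km2 = 4 * math.pi * 6371**2       # the ten-digit estimate from before
relative_error = 1e-3                  # one-tenth of one percent
uncertainty_km2 = area_km2 * relative_error  # absolute uncertainty, in km^2
print(f"estimate:    {area_km2:.10g} km^2")
print(f"uncertainty: {uncertainty_km2:.3g} km^2")
```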