Tina Rand, age 13, of Glendale, Ariz., for her question:
HOW DO WE MEASURE THE SPEED OF LIGHT?
One of the most exact measurements of the speed of light was made in 1926 by the American physicist Albert Michelson of the University of Chicago. He used a rapidly rotating mirror with many faces and a distant fixed mirror.
A flash of light from one face of the rotating mirror traveled to the fixed mirror, which returned it just in time to be reflected from the next face of the rotating mirror. Michelson could then find the speed of light simply by dividing the distance covered (twice the distance between the rotating mirror and the fixed mirror) by the elapsed time, that is, the time it took the mirror to turn from one face to the next.
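As a rough sketch of that arithmetic, here is the calculation in Python. The baseline, the number of mirror faces, and the rotation rate are the approximate figures usually quoted for the 1926 experiment; they are illustrative values, not Michelson's exact data:

```python
# Rough sketch of Michelson's rotating-mirror arithmetic.
# These are the approximate figures usually quoted for the 1926
# experiment; they are illustrative, not his exact data.

baseline_miles = 22              # rotating mirror to fixed mirror, one way
faces = 8                        # eight-sided rotating mirror
turns_per_second = 528           # approximate rotation rate

distance_miles = 2 * baseline_miles               # light goes out and back
elapsed_seconds = 1 / (faces * turns_per_second)  # time to turn one face

speed = distance_miles / elapsed_seconds
print(f"speed of light ~ {speed:,.0f} miles per second")
# prints about 185,856 -- close to 186,000 miles per second
```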
Michelson's experiments led to a value of 186,272 miles per second for the speed of light. Later studies by other investigators showed that the speed of light in a vacuum was slightly greater: 186,282 miles per second. Light travels more slowly through the Earth's atmosphere.
In the early 1600s, Galileo, the great Italian astronomer, was one of the first to try to measure the speed of light. He and an assistant stationed themselves on two hills in Tuscany 20 miles apart. By exchanging signal flashes between the two hilltops, Galileo tried to decide whether light travels instantaneously. However, his method was too crude, and he was unable to give a definite answer.
In 1676, more than 30 years after Galileo's death, a Danish astronomer named Olaus Roemer noticed that the eclipses of one of Jupiter's moons occurred later and later as the Earth moved in its orbit to the opposite side of the sun from Jupiter. Then, as the Earth moved back into its former position, the eclipses came on schedule again.
The total delay in the time of the eclipses was about 1,000 seconds, or nearly 17 minutes. Roemer decided this must mean that it takes about 17 minutes for light to travel across the Earth's orbit. Since the diameter of the orbit was known to be very nearly 186 million miles, Roemer was sure the speed of light had to be about 186,000 miles a second.
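Roemer's estimate is a single division, sketched below in Python with the rounded figures from the paragraph above:

```python
# Roemer's estimate: divide the orbit's diameter by the light delay.
orbit_diameter_miles = 186_000_000   # diameter of Earth's orbit (rounded)
delay_seconds = 1_000                # total observed eclipse delay

speed = orbit_diameter_miles / delay_seconds
print(f"{speed:,.0f} miles per second")   # prints 186,000
```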
Galileo's attempt to measure the speed of light between two nearby points had been correct in principle, but he did not have accurate enough equipment and methods.
In the 1850s, two French scientists, working independently, measured the speed of light over short distances. Armand Fizeau used a path of about six miles, and Jean Foucault used even shorter paths. Both arrived at answers close to 186,000 miles a second.
Scientists call the speed of light a "fundamental constant" of nature. That is, the measured value is the same whether the observers who measure it are moving or not. The German-born American physicist Albert Einstein made this constancy a cornerstone of his theory of relativity.
The wavelengths of light waves are measured in a unit called the angstrom, which equals 0.0000001 millimeter. About 254 million angstroms equal one inch. The frequency of light waves is measured in units called hertz. One hertz equals one wave passing a point in one second.
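Speed, wavelength, and frequency are tied together: speed equals wavelength times frequency. Here is a minimal sketch in Python; the 5,500-angstrom wavelength (green light) is only an example value:

```python
# Frequency = speed of light / wavelength.
# The 5,500-angstrom wavelength (green light) is just an example.

SPEED_OF_LIGHT_M_PER_S = 299_792_458   # speed of light in a vacuum
ANGSTROM_IN_METERS = 1e-10             # 1 angstrom = 0.0000001 millimeter

wavelength_m = 5_500 * ANGSTROM_IN_METERS
frequency_hz = SPEED_OF_LIGHT_M_PER_S / wavelength_m
print(f"{frequency_hz:.2e} hertz")     # about 5.45e+14 waves per second
```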