Sandage (1995) likens the measurement of H0 to a game of chess. In chess, only a grand master ``experiences a compelling sense of the issue and the best move. This player `knows' by intuition which clues are relevant... In other words, his or her intuition judges what is real in the game, what will or will not lead to contradiction, and what aspects of the data to ignore.''
Although there are differences in philosophy and many different techniques for measuring H0, its importance cannot be overstated. Knowledge of H0 is required to constrain estimates of the baryon density from nucleosynthesis at early epochs in the Universe. The larger the value of H0, the larger the required component of non-baryonic dark matter, especially if the Universe has a critical density. The Hubble constant also specifies the time and length scales at the epoch of equality of the energy densities of matter and radiation. The horizon scale at equality and the matter density together determine the peak in the perturbation spectrum of the early Universe. Hence, an accurate knowledge of the Hubble constant provides powerful constraints on theories of the large-scale structure of galaxies. At present, large values of H0 are problematic for the currently most successful models, those dominated by cold dark matter.
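The dependence of the equality epoch on H0 can be made explicit with the standard textbook relation (the numerical coefficient is approximate; h denotes the Hubble constant in units of 100 km/sec/Mpc):

```latex
% Redshift of matter--radiation equality (approximate standard coefficient):
1 + z_{\rm eq} \simeq 2.4 \times 10^{4}\,\Omega_{m} h^{2},
\qquad h \equiv H_{0}/(100~{\rm km\,s^{-1}\,Mpc^{-1}})
```

Since the comoving horizon scale at equality varies inversely with Omega_m h^2, a larger H0 pushes equality to earlier epochs and shifts the peak of the perturbation spectrum to smaller scales, which is the origin of the constraint described above.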
A value of H0 to ± 1% accuracy is still a goal far beyond currently available measurement techniques. However, if, for example, a value of H0 = 70 km/sec/Mpc were confirmed at ± 1% (95% confidence), and the ages of the oldest objects in the Universe were confirmed to be > 12 Gyr, then a number of issues would be brought into sharp focus (and corresponding new problems raised!). A cosmological constant would be required, there would be no further debate over the need for non-baryonic dark matter, and at least the standard version of cold dark matter would be conclusively ruled out.
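The age conflict driving that conclusion is simple arithmetic: in a matter-dominated Einstein-de Sitter universe (critical density, no cosmological constant) the expansion age is (2/3)/H0, which for H0 = 70 km/sec/Mpc falls below 12 Gyr. A minimal sketch of the unit conversion (the constants are standard conversion factors; the function names are illustrative only):

```python
# Hubble time and matter-dominated expansion age for a given H0 (km/s/Mpc).
KM_PER_MPC = 3.0857e19   # kilometres in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one gigayear

def hubble_time_gyr(h0_km_s_mpc):
    """Hubble time 1/H0, converted from (km/s/Mpc)^-1 to Gyr."""
    return KM_PER_MPC / h0_km_s_mpc / SEC_PER_GYR

def eds_age_gyr(h0_km_s_mpc):
    """Age of an Einstein-de Sitter (Omega_m = 1, Lambda = 0) universe: (2/3)/H0."""
    return (2.0 / 3.0) * hubble_time_gyr(h0_km_s_mpc)

print(round(hubble_time_gyr(70.0), 1))  # ~14.0 Gyr
print(round(eds_age_gyr(70.0), 1))      # ~9.3 Gyr, well below ages > 12 Gyr
```

Reconciling stellar ages greater than 12 Gyr with H0 = 70 km/sec/Mpc therefore requires lengthening the expansion age, either through a low matter density or through a cosmological constant.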
The requirements for measuring an accurate value of H0 are simple to list in principle, but extremely difficult to meet in practice. As discussed in more detail in Freedman (1997), there are, in general, four criteria that any method must meet. First, the method should be based on well-understood physics; second, it should operate well into the smooth Hubble flow (velocity-distances greater than 10,000, and preferably 20,000 km/sec); third, it should be applied to a statistically significant sample of objects, empirically established to have high internal accuracy; and finally, it must be demonstrated empirically to be free of systematic errors. This list of criteria applies to classical distance indicators as well as to other physical methods (in the latter case, for example, the Sunyaev-Zel'dovich effect or gravitational lenses). The last point requires that several distance indicators meeting the first three criteria be available; unfortunately, no ideal distance indicator or other method meeting all of the above criteria currently exists. The measurement of H0 to ± 1% is not yet possible; however, recent progress (reviewed below) shows that a measurement to ± 10% is now feasible.