A. Discrepancies At All Scales
When Fritz Zwicky proposed in 1933 that dark matter might be responsible for the high velocity dispersion of galaxies in the Coma Cluster (see Chapter III A), he was familiar with the concept of dark matter, and with earlier attempts to dynamically measure the density of dark matter in the Galaxy. Over the decades that followed, however, dark matter in clusters and dark matter in galaxies were discussed largely independently of each other. It wasn't until the 1960s that mass discrepancies on multiple scales began again to be considered within a common context.
In his pioneering paper of 1963, Arrigo Finzi cited Zwicky's 1933 work on galaxy clusters, the 1957 observation of M31's rotation curve from van de Hulst et al., as well as more recent determinations of the mass of the Milky Way, and argued in favor of a common interpretation for these phenomena. He then went on to consider various possible forms of what we would today call “baryonic” dark matter, ruling them out one by one. He even went as far as to suggest that these phenomena might be explained by modifying Newton's gravitational force law, so that it scaled as r^(-3/2) at large distances.
Despite the highly original and prescient nature of Finzi's work, it was largely ignored by the scientific community, attracting only 50 citations over the past 50 years. Although it is impossible to unambiguously identify the precise reasons for this, the very bold nature of Finzi's conclusions may have been difficult for many of his colleagues to accept, or even seriously consider. In any case, this work had little impact, and it would be another decade before other scientists began to pursue similar lines of inquiry.
As mentioned in Chapter IV, two independent groups published groundbreaking papers in 1974, each presenting a strong case for the existence of large amounts of mass in the outer parts of galaxies. The first of these papers, by the Estonian astronomers Jaan Einasto, Ants Kaasik and Enn Saar, was submitted on April 10 and was entitled “Dynamic evidence on massive coronas of galaxies”. These authors began with a discussion of galactic rotation curves, citing the work of Roberts that would eventually be published in Ref. , and presented an analysis of rotation curve data that included estimates for the contributions from stars for five galaxies of different mass. They argued that the discrepancy between the total mass and the stellar mass implied the existence of a “corona”, consisting of a “previously unrecognised, massive population”. They then used 105 pairs of galaxies to estimate the total mass and dimensions of their galactic coronas, concluding that the total mass of galaxies exceeded that in stars by an order of magnitude. Finally, the authors argued that these new mass estimates could also explain the mass discrepancy that had been observed in clusters. Similar arguments had also been sketched earlier by Einasto, including at the 1972 IAU meeting in Athens.
On May 28 – about six weeks after Einasto et al. – Jerry Ostriker, Jim Peebles and Amos Yahil submitted a paper of similar content and scope, entitled “The size and mass of galaxies, and the mass of the universe”. This paper did not present any new observations, but instead compiled existing estimates for the masses of (mostly giant spiral) galaxies. They began with galactic rotation curves, citing the papers of Roberts and Rots and of Rogstad and Shostak as evidence for their flatness in the outer parts of galaxies. The authors then went on to build a case for the existence of large amounts of dark matter in the outer parts of galaxies, based on mass estimates from galaxy pairs, the dynamics of dwarf galaxies, and the so-called timing argument for the Local Group. And although the observations presented in this paper were not new, and were subject to large uncertainties, the authors appear to have been confident in their conclusions, stating that the trend of increasing mass with increasing radius is “almost certainly real”, and arguing that this trend was in line with the “virial discrepancy” that had been observed in clusters and groups of galaxies [264, 124, 265]. The first sentences of this paper's body summarize well the sentiment of the authors:
“There are reasons, increasing in number and quality, to believe that the masses of ordinary galaxies may have been underestimated by a factor of 10 or more. Since the mean density of the Universe is computed by multiplying the observed number density of galaxies by the typical mass per galaxy, the mean density of the Universe would have been underestimated by the same factor.”
In 1979, Sandra Faber and John Gallagher published an influential review, “Masses and mass-to-light ratios of galaxies”, which played an important role in crystallizing the opinion among cosmologists and astronomers that dark matter was indeed abundant in the Universe. Interestingly, they chose not to use the terms “corona” or “halo”, as suggested by the two above-mentioned papers, but instead adopted the phrase “massive envelope” to describe the distribution of dark matter in astrophysical systems.13
As astronomers continued to gather information on the masses of galaxies, and on other observables of cosmological relevance, cosmologists began to increasingly reflect upon the implications of those findings for the formation of structure and the evolution of the Universe. In 1974, the same year as the two key papers described above [109, 229], Richard Gott, James Gunn, David Schramm and Beatrice Tinsley published a paper that provides us with an illuminating snapshot of the status of cosmology at that time. The conclusions of this paper, entitled “An unbound universe”, appear within the original abstract:
“A variety of arguments strongly suggest that the density of the universe is no more than a tenth of the value required for closure. Loopholes in this reasoning may exist, but if so, they are primordial and invisible, or perhaps just black.”
In this paper, the authors argued that the body of astronomical data indicated that there was simply not enough matter in the Universe – even accounting for the large mass-to-light ratios observed among galaxies – to equal or exceed the critical density of the Universe. Among other caveats to this conclusion, they considered possible contributions from low-mass neutrinos, as had been suggested by Cowsik and McClelland, but ultimately ruled out this possibility as well.
In the early 1980s, the introduction of the theory of inflation profoundly changed the thinking of the cosmological community, and allowed one for the first time to make specific predictions for the total cosmological density and for the spectrum of density perturbations [145, 198, 302, 151, 147, 34]. This began a decade-long struggle to reconcile models of structure formation with what had by then become the “theoretical imperative” of a flat Universe. This struggle was exacerbated by estimates of the cosmological matter density arising from galaxy clusters, which pointed toward a total abundance of matter – including dark matter, by then accepted by most cosmologists – that was clearly insufficient to close the Universe. The resolution to this problem had to await the discovery of the accelerating expansion rate of the Universe, and the contribution to the total energy density arising from a cosmological constant or “dark energy”.
Meanwhile, Jim Peebles had pointed out that the absence of fluctuations in the cosmic microwave background at a level of ∼10^(-4) was incompatible with a Universe that was composed of only baryonic matter, and argued that this problem would be relieved if the Universe was instead dominated by massive, weakly interacting particles, whose density fluctuations could begin to grow prior to decoupling (see also Ref. ). This and other papers that will be discussed in the next section received enormous attention from the scientific community, and rapidly led to the establishment of cold dark matter as the leading paradigm to describe the structure and evolution of the Universe at all scales.
C. Numerical Simulations
Much of our current understanding of the structure and evolution of dark matter halos in the Universe is based on the results of computer simulations. Such explorations have a longer history than one might expect. Working in the 1940s, the ingenious Swedish scientist Erik Holmberg exploited the fact that light follows the same inverse square law as the gravitational force, and performed the first simulation of the interaction between two galaxies on an analog computer that consisted of 74 light bulbs, photocells and galvanometers. He then calculated the amount of light received by each cell, and manually moved the light bulbs in the direction that received the most light.
Holmberg published his paper in November of 1941, shortly before the United States entered World War II. In the following years, the work of many research institutes ground to a halt, but science meanwhile continued to make progress, thanks in large part to the enormous resources made available to military research, especially at the Los Alamos National Laboratory, at which computers and advanced numerical techniques were developed within the context of the Manhattan Project. The first application of such computers to gravitational systems was arguably performed by John Pasta and Stanislaw Ulam in 1953. Their numerical experiments were performed on the Los Alamos computer, which by then had been applied to a variety of other problems, including early attempts to decode DNA sequences and the first chess-playing program. A number of other pioneering studies exploring the evolution of a system of gravitationally interacting massive particles appeared throughout the 1960s and 1970s, gradually increasing the number of simulated particles from N ∼ 100 in the early works to N ∼ 1000 [320, 321, 319, 237, 331].
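The essence of those early N ∼ 100 experiments is easy to reproduce today. The following is a minimal sketch (not a reconstruction of any particular historical code) of direct-summation gravity with leapfrog time integration, in units where G = 1; the softening length `eps`, used to tame close encounters, is our choice:

```python
import numpy as np

def accelerations(pos, mass, eps=0.05):
    """Direct-summation gravitational accelerations (G = 1), with a
    softening length eps to avoid divergences in close encounters."""
    diff = pos[None, :, :] - pos[:, None, :]      # diff[i, j] = pos[j] - pos[i]
    dist2 = (diff ** 2).sum(axis=-1) + eps ** 2   # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                 # no self-force
    # a_i = sum_j m_j (pos_j - pos_i) / (|r_ij|^2 + eps^2)^(3/2)
    return (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

def leapfrog(pos, vel, mass, dt=0.01, steps=1000):
    """Kick-drift-kick leapfrog integration of the N-body system."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel
```

Because every pairwise force appears with its equal and opposite partner, the total momentum is conserved to round-off error; the O(N²) cost of the direct sum is precisely what limited those early works to a few hundred particles.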
By the early 1970s, it had become possible to numerically simulate the dynamics of galaxies. Simulations carried out by Richard Miller, Kevin Prendergast and William Quirk, as well as by Frank Hohl, each found rotationally supported galaxies consisting of a stellar disk to be unstable, in contradiction with observations. Instead of reaching an equilibrium configuration, such systems were found to change rapidly, forming bars and evolving toward a more elliptical and pressure-supported configuration. The solution to this problem was proposed in 1973 by Jerry Ostriker and Jim Peebles, who recognized that a rotationally supported stellar disk could be stable if embedded within a massive spherical halo.
The first attempt to numerically solve the formation and evolution of cosmological structures in an expanding universe was presented in a famous paper published in 1974 by William Press and Paul Schechter . This was followed by a number of developments in the late 1970s and early 1980s that significantly advanced the power of such endeavors (see, for example, Refs. [4, 312, 142, 183, 77, 106, 184]). First, a combination of improvements in processor speed and in numerical techniques made it possible for the first time to simultaneously simulate millions of particles. Second, the newly proposed theory of inflation [146, 199] offered a physical means by which initial density perturbations could be generated, providing the initial conditions for cosmological simulations. And third, the results of the first large 3D survey of galaxies (the CfA redshift survey) were published in 1982, providing a distribution that could be directly compared with the output of simulations.
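Beyond the simulations themselves, the Press-Schechter argument yielded an analytic halo mass function that is still in wide use. In its standard form (with δ_c the critical linear overdensity for collapse, σ(M) the rms linear density fluctuation smoothed on mass scale M, and ρ̄ the mean matter density) it reads:

```latex
\frac{dn}{dM}\, dM = \sqrt{\frac{2}{\pi}}\,\frac{\bar{\rho}}{M^{2}}\,
\frac{\delta_{c}}{\sigma(M)}\,
\left|\frac{d\ln\sigma}{d\ln M}\right|\,
\exp\!\left(-\frac{\delta_{c}^{2}}{2\sigma^{2}(M)}\right) dM
```

The exponential cut-off at the mass scale where σ(M) ∼ δ_c encodes the hierarchical picture described below: at any epoch, halos much rarer than typical fluctuations are exponentially suppressed.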
In some ways, the results of cosmological simulations do not depend much on what the dark matter consists of. In particular, they are largely insensitive to the electroweak or other non-gravitational interactions that may (or may not) be experienced by dark matter particles – for the purposes of structure formation, such particles are effectively “collisionless”. What does impact the results of such simulations, however, is the initial velocity distribution of the dark matter particles [99, 58, 59]. Importantly, this provides cosmologists with a way to discriminate between different classes of dark matter candidates. Standard Model neutrinos, for example, decoupled from thermal equilibrium in the early Universe at a temperature that is much greater than their mass, and thus remained highly relativistic throughout the epoch of structure formation. In contrast, supersymmetric neutralinos are predicted to freeze out of thermal equilibrium at a temperature below their mass, and are thus non-relativistic throughout cosmic history. Axions generated through misalignment production are also predicted to be produced with non-relativistic velocities.
At the largest scales – those associated with galaxy clusters and superclusters – cosmological simulations predict a pattern of structure that is largely insensitive to the initial velocities of the dark matter. At smaller scales, however, density fluctuations can be washed out by the random thermal motion of individual dark matter particles. As a result, the growth of small scale structure is predicted to be suppressed if the dark matter is relativistic, or “hot” [281, 240].
Non-relativistic, or “cold”, dark matter particles undergo a very different sequence of structure formation. The much shorter free-streaming length of such particles allows them to form very low-mass halos, roughly in the range of ∼10^(-3) to ∼10^(-9) M⊙ for a typical neutralino, for example. These very small halos form very early in the Universe's history, and then go on to merge with one another, gradually building up larger and larger dark matter structures. This bottom-up, or hierarchical, process of structure formation is in stark contrast to the top-down sequence predicted for hot dark matter.
Simulations of large scale structure are, of course, only useful if their results can be compared to the actual patterns of structure found in the Universe. This was made possible with the CfA survey, which was the first extensive 3D survey of galaxies in the local Universe. Among other features, CfA revealed the first indications of the “cosmic web”, which described the distribution of matter on the largest scales. This survey also identified the presence of significant structure on sub-cluster scales, in conflict with the predictions of hot dark matter simulations.
In the wake of the failures of hot dark matter, it was quickly becoming appreciated that cold dark matter could do a much better job of accounting for the observed patterns of large scale structure. To quote the 1984 paper by George Blumenthal, Sandra Faber, Joel Primack, and Martin Rees:
“We have shown that a universe with ∼ 10 times as much cold dark matter as baryonic matter provides a remarkably good fit to the observed universe. This model predicts roughly the observed mass range of galaxies, the dissipational nature of galaxy collapse, and the observed Faber-Jackson and Tully-Fisher relations. It also gives dissipationless galactic halos and clusters. In addition, it may also provide natural explanations for galaxy-environment correlations and for the differences in angular momenta between ellipticals and spiral galaxies.”
The first simulations of cold dark matter were carried out by Marc Davis, George Efstathiou, Carlos Frenk, and Simon White, who published their results in 1985. The resemblance of their simulated distribution of dark matter halos to that of the galaxies in the CfA survey was clear, serving to further elevate the status of cold dark matter within the cosmological community.
By the middle of the 1980s, the paradigm of cold dark matter was well on its way to becoming firmly established. And although scenarios involving mixed dark matter (containing significant quantities of both cold and hot dark matter) and warm dark matter (suppressing structure only on the scale of dwarf galaxies and below) would each continue to be discussed in the literature, the possibility that the dark matter was dominated by neutrinos or other relativistic particles was quickly abandoned.
A decade later, the predictions of cosmological simulations had shifted in focus from the distribution of cold dark matter halos to the shapes of those halos. In 1996, Julio Navarro, Carlos Frenk and Simon White published a remarkable result, based on an analysis of the halos generated in their high-resolution cold dark matter simulations:
“The spherically averaged density profiles of all our halos can be fit over two decades in radius by scaling a simple universal profile. The characteristic overdensity of a halo, or equivalently its concentration, correlates strongly with halo mass in a way which reflects the mass dependence of the epoch of halo formation.”
The simple fitting formula derived by the authors became known as the Navarro-Frenk-White profile. This parametrization is still widely used today, and represents the primary benchmark for most dark matter detection studies, despite the fact that it is expected to be inaccurate in the innermost regions of galaxies, where baryons dominate the gravitational potential.
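The profile itself is compactly stated: ρ(r) = ρ_s / [(r/r_s)(1 + r/r_s)²], falling as r^(-1) in the inner regions and as r^(-3) at large radii. A minimal sketch in Python, with the enclosed mass obtained by integrating the profile (the scale parameters rho_s and r_s are free, and must be fit to a given halo):

```python
import numpy as np

def nfw_density(r, rho_s, r_s):
    """NFW density profile: rho(r) = rho_s / [(r/r_s) * (1 + r/r_s)**2]."""
    x = np.asarray(r, dtype=float) / r_s
    return rho_s / (x * (1.0 + x) ** 2)

def nfw_enclosed_mass(r, rho_s, r_s):
    """Mass enclosed within radius r, from integrating the density profile:
    M(r) = 4 pi rho_s r_s**3 * [ln(1 + x) - x / (1 + x)], with x = r / r_s."""
    x = np.asarray(r, dtype=float) / r_s
    return 4.0 * np.pi * rho_s * r_s ** 3 * (np.log1p(x) - x / (1.0 + x))
```

Note that the enclosed mass grows logarithmically at large radii, so the total mass is formally divergent; in practice halos are truncated at a virial radius, and the concentration mentioned in the quote above is the ratio of that radius to r_s.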
In more recent years, the frontier for cosmological simulations has focused on the implementation of baryonic physics, including the hydrodynamical evolution of gas in astrophysical structures, star formation, and feedback from supernova explosions and black holes. Current simulations are not yet able to resolve all relevant scales – which range from sub-parsec distances for star formation to Gpc scales for cosmological structures – but implement baryonic physics through the introduction of suitable “sub-grid” parameters which attempt to encode the collective behaviour of large amounts of gas and stars. Such parameters are generally tuned to match observable quantities, such as the galaxy mass function and the galaxy-central black hole mass relation, as in, e.g., the recent suite of EAGLE simulations.
13 In the discussions that took place as part of our research for this historical review, we encountered a considerable range of opinions regarding the relative importance of galactic rotation curves in establishing the existence of dark matter. This supports a picture in which different groups of scientists found quite different lines of evidence to be compelling during this period of time. Despite these disagreements regarding the strengths and weaknesses of the various observations and arguments, a consensus nonetheless began to emerge in favor of dark matter's existence.