Operationally, the problem of the cosmic large-scale structure can be divided into two parts: understanding the clustering evolution of the dark matter and understanding the gaseous and radiative processes that lead to the formation of galaxies. Specialized simulation techniques have been developed to tackle both aspects of the problem. The evolution of the dark matter is most often calculated using N-body techniques, implemented through a variety of efficient algorithms, such as P3M (particle-particle/particle-mesh; Efstathiou et al. 1985), AP3M (the adaptive mesh version of P3M; Couchman et al. 1995) and hierarchical trees (Barnes & Hut 1986, Springel et al. 2001, Stadel 2000). Gaseous and radiative processes are followed by combining a hydrodynamics code with an N-body code. Numerical hydrodynamic techniques used in cosmology include Eulerian methods (Cen 1992), Lagrangian codes based on Smoothed Particle Hydrodynamics (SPH; Gingold & Monaghan 1977), and hybrid codes (e.g. Gnedin 1995, Pen 1998). These techniques have different strengths and weaknesses, but they all give similar results in the simplest cosmological problems where a detailed comparison has been performed (Frenk et al. 1999).
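The gravitational half of the problem can be illustrated with a minimal sketch. The code below is a hypothetical direct-summation N-body integrator (in units where G = 1, with Plummer softening) using the kick-drift-kick leapfrog scheme common in cosmological codes; it is not any of the production algorithms named above, whose whole purpose is to avoid this O(N^2) force sum via mesh (P3M, AP3M) or tree (Barnes & Hut) approximations.

```python
import numpy as np

def accelerations(pos, mass, eps=0.05):
    """Direct-summation gravitational accelerations with Plummer softening.
    Cost is O(N^2) per step; P3M and tree codes exist to reduce this."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        dr = pos - pos[i]                      # separation vectors to all particles
        r2 = (dr ** 2).sum(axis=1) + eps ** 2  # softened squared separations
        r2[i] = 1.0                            # placeholder to avoid divide-by-zero
        inv_r3 = r2 ** -1.5
        inv_r3[i] = 0.0                        # exclude self-interaction
        acc[i] = (mass[:, None] * dr * inv_r3[:, None]).sum(axis=0)
    return acc

def leapfrog(pos, vel, mass, dt, n_steps, eps=0.05):
    """Kick-drift-kick leapfrog: the standard symplectic integrator
    for collisionless N-body dynamics."""
    acc = accelerations(pos, mass, eps)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc              # half kick
        pos += dt * vel                    # drift
        acc = accelerations(pos, mass, eps)
        vel += 0.5 * dt * acc              # half kick
    return pos, vel
```

A two-body circular orbit (equal unit masses at unit separation, tangential speed sqrt(0.5)) stays on its orbit under this integrator to good accuracy, which is the basic sanity check one applies before scaling up the particle number.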
There has been a rapid growth in the size and power of cosmological simulations in the two and a half decades since this technique was introduced into the subject by Peebles (1970). One way to measure this growth is by the number of particles employed in the simulations. The size of the largest simulations has grown exponentially, in a manner reminiscent of the well-known "Moore's law" that describes the increase in CPU speed with time, except that the advent of massively parallel supercomputers led to a sudden order-of-magnitude jump in size towards the end of the past decade. The largest simulations carried out to date are the 1-billion particle "Hubble volume" N-body simulations performed by the Virgo consortium, an international collaboration of researchers in the UK, Germany and Canada.