John Peacock's Research
I am a cosmologist, and my research is concerned with understanding the overall properties of the universe. This page summarises some of my past and current research projects. My textbook Cosmological Physics (CUP) gives plenty of background detail.
Power-spectrum clustering analysis
The large-scale structure in the galaxy distribution is one of the most important relics of the early universe. The power spectrum of density fluctuations contains critical information about the amount of matter in the universe, and its physical nature. I have pursued methods for direct measurement of this clustering, using Fourier techniques. In the early 1990s, I carried out and analysed an all-sky redshift survey of radio galaxies [1]. This was one of the first measurements of the clustering power spectrum in the "linear" regime (where the density fluctuations are small). This work correctly predicted the level of microwave-background fluctuations seen in 1992 by the COBE satellite.
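The Fourier approach can be illustrated with a toy estimator: grid the galaxies onto a mesh, transform the overdensity field, and average the mode powers in spherical shells of wavenumber. The sketch below is plain NumPy; the function name, grid size and binning are my own illustrative choices, and it omits the survey mask, shot-noise subtraction and window deconvolution that a real analysis such as [1] must handle.

```python
import numpy as np

def power_spectrum(positions, boxsize, ngrid=64):
    """Toy estimate of the isotropic power spectrum P(k) of a point set.

    Grid the galaxies onto a cubic mesh, Fourier transform the
    overdensity delta = rho/rho_mean - 1, and average |delta_k|^2
    in spherical shells of k.
    """
    # Nearest-grid-point assignment of galaxies to the mesh
    grid, _ = np.histogramdd(positions, bins=(ngrid,) * 3,
                             range=[(0, boxsize)] * 3)
    delta = grid / grid.mean() - 1.0

    # Fourier transform; normalize so P(k) has units of volume
    delta_k = np.fft.rfftn(delta)
    power = np.abs(delta_k)**2 * (boxsize**3 / ngrid**6)

    # Wavenumber of each mesh mode
    k1d = 2 * np.pi * np.fft.fftfreq(ngrid, d=boxsize / ngrid)
    kz1d = 2 * np.pi * np.fft.rfftfreq(ngrid, d=boxsize / ngrid)
    kx, ky, kz = np.meshgrid(k1d, k1d, kz1d, indexing='ij')
    kmag = np.sqrt(kx**2 + ky**2 + kz**2)

    # Spherical average in shells of |k|
    kbins = np.linspace(0.0, kmag.max(), 16)
    which = np.digitize(kmag.ravel(), kbins)
    pk = np.array([power.ravel()[which == i].mean()
                   for i in range(1, len(kbins))])
    kcen = 0.5 * (kbins[1:] + kbins[:-1])
    return kcen, pk
```

For an unclustered (Poisson) catalogue this estimator should return roughly the flat shot-noise level, P = V/N, which is a quick sanity check before applying it to clustered data.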
The 2dF Galaxy Redshift Survey
Measurements of galaxy clustering were advanced into a new regime of precision by this collaboration of UK and Australian astronomers. The 2dFGRS measured redshifts for about 230,000 galaxies, and yielded definitive measurements of many key statistical properties of the galaxy distribution [3], [4]. Many 2dFGRS results received wide publicity, but the most important was probably the power-spectrum analysis [4]. This measurement is sufficiently precise that it reveals not only the overall shape (from which the mean density of dark matter is deduced), but also low-level inflections that result from acoustic oscillations of the baryon-photon fluid. These results tell us that the total density of the universe is about 30% of critical, with about 15% of this matter being normal (baryonic). When combined with measurements of anisotropies in the cosmic microwave background, the 2dFGRS results determined most of the key cosmological parameters to about 5% accuracy [5].
Nonlinear evolution of clustering
One of the essential steps in using galaxy clustering for quantitative cosmology is to understand the nonlinear development of gravitational instability. This requires numerical simulation, and I am a core member of the Virgo Consortium, which has provided the world's largest numerical datasets of gravitational clustering incorporating dark matter and gas [6], including the billion-particle "Hubble volume", which simulates the majority of the visible universe. Such calculations are nevertheless no substitute for analytical understanding, and I have devoted much effort to developing approximations by which the simulation results can be understood [7], [8], [9]. Using these methods, I was able to show in 1994 that different tracers of large-scale structure were all consistent with a single underlying linear power spectrum [7].
High-redshift galaxies
In the expanding universe, galaxies seen at large distances (high redshift) are seen in the past, and so allow a study of cosmological evolution. I have concentrated on studies of radio-loud active galaxies, which are especially massive and luminous. I have used statistical samples of radio galaxies and quasars to detect the "redshift cutoff": a decline in the comoving density of these active galaxies beyond a redshift z = 2--3 [10], [11]. In addition, I was one of the co-inventors of the now-standard "unified model" for active galaxies, in which radio galaxies contain a hidden quasar nucleus, so that the appearance of active galaxies depends on orientation [12]. The stellar populations in young high-redshift galaxies are of great interest, and can contain very old stars [13]; this was an early argument against universes of critical density. Most recently, I have worked as part of the UK Submillimetre Survey Consortium to study high-redshift galaxies via the emission from dust-enshrouded starbursts, which reveals galaxies in the process of formation [14]. It appears that only about 20% of the energy output from young stars is seen directly.
Density peaks and biased galaxy formation
Understanding the data on distant galaxies requires a theory of galaxy formation. Analytic insight into the properties of the first galaxies can be gained by treating them as rare high peaks in the cosmic density field, and Alan Heavens & I wrote the first paper on the statistics of cosmological density maxima [15]. Later extensions covered angular momentum, clustering, and the mass function of these proto-objects [16]. One of the most important predictions of the peak model was that galaxies can be "biased", meaning that their clustering may not follow that of the mass field. However, this model applies only to rare early-forming galaxies, and cannot deal with present-day clustering. This problem is solved by a newer approach known as the "halo model" [17], which has been rapidly adopted since its proposal in 2000. The halo model reduces many complex features of the galaxy distribution to a single universal function, which specifies how the number of galaxies per clump of dark matter varies with clump mass.
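The universal function at the heart of the halo model is an occupation number: the mean count of galaxies hosted by a halo of given mass. A minimal sketch of the idea follows; the functional form and all parameter values (`m_min`, `m1`, `alpha`, and the toy mass function in the test) are hypothetical choices for illustration, not the fitted function of [17].

```python
import numpy as np

def mean_occupation(mass, m_min=1e12, m1=1e13, alpha=0.9):
    """Illustrative halo occupation: mean galaxies per halo of mass `mass`
    (solar masses).  Zero below a threshold m_min; above it, one central
    galaxy plus a power-law number of satellites.  Parameters are
    hypothetical placeholders."""
    return np.where(mass < m_min, 0.0, 1.0 + (mass / m1)**alpha)

def galaxy_density(m_grid, dndlnm):
    """Mean galaxy number density: integrate occupation times the halo
    mass function dn/dlnM over ln M (simple trapezoidal sum)."""
    integrand = mean_occupation(m_grid) * dndlnm
    lnm = np.log(m_grid)
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(lnm))
```

Given a halo mass function from theory or simulation, the same occupation function then also fixes the clustering: pairs of galaxies within one halo control small scales, and pairs in separate halos control large scales.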
Gravitational lensing
Because the mass in the universe is clumped, light rays from distant objects suffer gravitational deflection, thus distorting the view to high redshifts. This gravitational lensing is now a field of its own in cosmology. I performed the first calculation of the statistics of gravitational lensing, showing that the universe has a small "optical depth" due to lensing by galaxies [18]. I am actively pursuing collaborations in the area of lensing, with the aim of making maps of the dark matter, and testing some of the above ideas about the nonlinear evolution of the mass distribution.
Statistical cosmology
Finally, astronomers frequently use advanced statistical methods in their research. I invented a new statistical goodness-of-fit test, which is a two-dimensional analogue of the well-known Kolmogorov-Smirnov test [19]. This test has become widely known and used through its inclusion in the standard text on numerical software, "Numerical Recipes".
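The core idea of the two-dimensional test can be sketched simply: since there is no unique ordering in 2D, the cumulative comparison is replaced by comparing the fractions of each sample falling in the four quadrants around reference points. The code below is my own simplified illustration of that quadrant statistic for two samples; assessing significance requires the test's calibrated distribution from [19] (or "Numerical Recipes"), which is omitted here.

```python
import numpy as np

def ks2d_statistic(a, b):
    """Simplified two-sample 2D KS-style statistic.

    For quadrants centred on every data point of both samples `a` and
    `b` (arrays of shape (n, 2)), compare the fraction of each sample
    in each of the four quadrants and return the largest discrepancy.
    """
    d = 0.0
    for pts in (a, b):
        for x, y in pts:
            # The four quadrants around (x, y)
            for sx in (np.less, np.greater_equal):
                for sy in (np.less, np.greater_equal):
                    fa = np.mean(sx(a[:, 0], x) & sy(a[:, 1], y))
                    fb = np.mean(sx(b[:, 0], x) & sy(b[:, 1], y))
                    d = max(d, abs(fa - fb))
    return d
```

Identical samples give a statistic of zero, while well-separated samples drive it towards one, mirroring the behaviour of the familiar one-dimensional statistic.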