Cosmological structure formation in the multiverse

COSFORM was an Advanced Grant from the European Research Council (number 670193), which ran for 6 years from October 2015 to October 2021.

The programme of research in the COSFORM grant was directed at the outstanding puzzle of modern cosmology: the strangely small non-zero value of the vacuum density, which is observed to be at least 60 powers of 10 smaller than the expected zero-point contribution of quantum fields. This puzzle can be approached in three ways: (1) evolution of dark energy; (2) revision of gravity; (3) observer selection in the multiverse. The first two of these can be addressed by ongoing and future large galaxy surveys. Part of the research programme is directed at new ways of ensuring robust measurements from these surveys of the effective equation of state of dark energy and the growth rate of density fluctuations. But so far such tests show no deviation from standard gravity and a cosmological constant, Lambda. In any case, even some form of dynamical dark energy could be supplemented by the zero-point energy density of the vacuum, so the Lambda 'scale problem' remains.

This fact drives interest in a multiverse solution, in which different causally disconnected domains may be able to possess different effective cosmological constants. Such a multiverse arises from the bubbles predicted in those inflationary cosmological models that display stochastic behaviour driven by quantum fluctuations of the inflaton field. The astrophysically interesting aspect of this approach is to ask how galaxy formation would be affected by different levels of vacuum energy. Such a question has previously been addressed only by oversimplified analytic arguments, and there are many reasons for attempting a more realistic treatment, not least because it is important to see if the predicted exponential sensitivity of galaxy formation efficiency to Lambda holds up. In any case, there is much of interest to be learned regarding the robustness of current theories of galaxy formation by 'stress-testing' them outside the rather restricted parameter regimes normally considered.

The state of affairs summarized above motivates two distinct strands of future research:

• 1: Exploitation of new galaxy redshift surveys, and planning for future surveys. The need here is to develop new methods for testing robustness of BAO and RSD measurements. The intention is to investigate these fundamental-cosmology signatures using the galaxy population split according to its geometrical environment within the cosmic web, since this yields systematically different populations and formation histories.

• 2: Theoretical investigation of galaxy formation in the multiverse. This involves semianalytic modelling of the long-term history of star formation, as well as more detailed hydrodynamical simulations, to act as a cross-check on the semianalytic results and to understand in more detail how the formation of cosmic structure is expected to proceed in non-standard cosmologies.

Project 1 — Dissecting large-scale structure by environment

The current standard LambdaCDM cosmology owes much to input from large-scale structure, with the first precise evidence for a low matter density coming from the APM and QDOT surveys in the early 1990s. By 2003, the 2dF Galaxy Redshift Survey had advanced such 3D studies of the galaxy density field tenfold, to 220,000 galaxies. The 2dFGRS made precise measurements of two key signatures relating to the matter density: the power spectrum, including the detection of BAO features; and redshift-space distortions, caused by the peculiar velocity field associated with the growth of inhomogeneities.

The geometrical standard ruler provided by Baryon Acoustic Oscillations (BAO) allows the distance-redshift relation to be measured empirically, yielding the expansion history and hence the ability to measure whether dark energy has evolved. BAO have become a hugely important tool in geometrical cosmology, with the signal-to-noise of the detection increasing greatly as the SDSS increased the size of the dataset. The results from the SDSS-III BOSS project have extended the detection to z=0.57 using Luminous Red Galaxies as tracers.
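To illustrate how the BAO standard ruler constrains the expansion history: the comoving distance to a given redshift follows from a single integral over the assumed expansion rate, so comparing the measured BAO scale with this prediction tests the cosmological model. A minimal sketch for flat LambdaCDM (the function name and parameter values are illustrative, not those of any particular survey analysis):

```python
import scipy.integrate as integrate

C_KMS = 299792.458  # speed of light in km/s

def comoving_distance(z, h0=70.0, omega_m=0.3):
    """Comoving distance in Mpc for flat LambdaCDM:
    D_C = (c/H0) * integral of dz'/E(z'), with E(z) = H(z)/H0."""
    ez = lambda zp: (omega_m * (1 + zp)**3 + (1 - omega_m))**0.5
    integral, _ = integrate.quad(lambda zp: 1.0 / ez(zp), 0.0, z)
    return (C_KMS / h0) * integral
```

A survey measurement of the apparent BAO scale at, say, z=0.57 can then be confronted with this distance for different dark-energy histories; an evolving equation of state modifies E(z) and hence the predicted ruler size on the sky.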

The rate of growth of density fluctuations can be measured by the Redshift-Space Distortion (RSD) signature of anisotropic clustering, arising from the velocities associated with the development of cosmological structure. The RSD signal reflects directly the rate of growth of density fluctuations - which in turn depends on both the evolution of dark energy and on the strength of gravity on 10-100 Mpc scales. Measurement of the RSD signature with new generations of survey has continued to be an Edinburgh strength, yielding the highest-redshift measurement of the fluctuation growth rate, and a growth rate consistent with Einstein gravity. An aim of this project is to improve the precision of such work, while retaining control over systematics.
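In standard gravity, the growth rate probed by RSD is well approximated by f(z) = Omega_m(z)^gamma with gamma close to 0.55; a measured departure from this value would signal modified gravity. A minimal sketch of this standard approximation (parameter values are illustrative):

```python
def growth_rate(z, omega_m0=0.3, gamma=0.55):
    """Approximate linear growth rate f = dlnD/dlna ~ Omega_m(z)^gamma.
    gamma ~ 0.55 corresponds to Einstein gravity; modified-gravity models
    generally predict a different effective gamma."""
    e2 = omega_m0 * (1 + z)**3 + (1 - omega_m0)  # E(z)^2, flat LambdaCDM
    omega_m_z = omega_m0 * (1 + z)**3 / e2
    return omega_m_z**gamma
```

At high redshift Omega_m(z) approaches unity and so does f, which is why low- and intermediate-redshift surveys carry most of the discriminating power between gravity models.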

Two major surveys are highly suitable for this work. GAMA is a 2-mag deeper successor to 2dFGRS, having assembled 250,000 redshifts over 200 deg^2 between 2006 and 2014; the SDSS-IV eBOSS took data for 6 years from July 2014, assembling a sample of 640k galaxies at redshifts z > 0.6.

In the longer term, attention in galaxy redshift surveys will shift to DESI and Euclid. DESI plans a 5000-fibre spectrograph on the Kitt Peak 4m, measuring >20 million galaxy redshifts over 2019-2023, and allowing an order-of-magnitude improvement in the distance scale over 0.8 < z < 1.5. ESA's Euclid satellite will produce similarly sized galaxy samples up to z=2 during its 2020-2025 mission.

Galaxy clustering by large-scale environment

In order to have confidence in any observational claim of a deviation from Lambda plus Einstein gravity, the result will need to demonstrate consistency from a number of independent lines of evidence. For RSD, this means dissecting the peculiar velocity field into its component parts, following possibilities such as infall onto clusters, and also RSD shape distortions of voids.

To approach this systematically, we need a quantitative tool for dividing a redshift survey into its fundamentally distinct geometrical environments - i.e. voids, filaments, sheets and nodes. This can conveniently be done using the quasi-potential generated by the galaxy number density field, classifying each point via the eigenvalues of the Hessian matrix of second derivatives of this potential. This is a framework with attractive properties: e.g. the halo mass function split by environment in this way can be calculated analytically in an extension of Press-Schechter theory. A code of this sort has been successfully applied to GAMA. It is intended to take the same approach with the other surveys in this project; identification of the cosmic web with this approach in sparser surveys (BOSS; eBOSS) remains feasible with appropriate adjustment of the smoothing kernel. This lays the foundation for a new approach to RSD, determining the signature in each geometrical environment separately. This can be viewed as a generalization of the 'clipping' approach, in which high-density regions are downweighted.
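The eigenvalue classification described above can be sketched compactly: solve the Poisson equation for the quasi-potential in Fourier space, smooth, form the Hessian, and count eigenvalues above a threshold (0 = void, 1 = sheet, 2 = filament, 3 = node). This is a minimal illustrative implementation assuming a periodic density-contrast grid, not the project's actual code; function names and defaults are hypothetical:

```python
import numpy as np

def classify_web(delta, boxsize, smooth=2.0, lam_th=0.0):
    """Classify each cell of a periodic density-contrast grid delta by the
    eigenvalues of the Hessian of the quasi-potential phi, where
    nabla^2 phi = delta. Returns the count of eigenvalues above lam_th:
    0 -> void, 1 -> sheet, 2 -> filament, 3 -> node."""
    n = delta.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing='ij')
    k2 = kx**2 + ky**2 + kz**2
    # FFT of delta, with Gaussian smoothing of scale `smooth` (box units)
    dk = np.fft.fftn(delta) * np.exp(-0.5 * k2 * smooth**2)
    with np.errstate(divide='ignore', invalid='ignore'):
        phik = np.where(k2 > 0, -dk / k2, 0.0)  # Poisson: phi_k = -delta_k/k^2
    kvec = (kx, ky, kz)
    hess = np.empty((n, n, n, 3, 3))
    for i in range(3):
        for j in range(3):
            # d^2 phi / dx_i dx_j  <->  -k_i k_j phi_k in Fourier space
            hess[..., i, j] = np.real(np.fft.ifftn(-kvec[i] * kvec[j] * phik))
    lam = np.linalg.eigvalsh(hess.reshape(-1, 3, 3))
    return (lam > lam_th).sum(axis=1).reshape(n, n, n)
```

In practice the smoothing scale plays the role mentioned in the text: for sparser surveys such as BOSS/eBOSS a broader kernel compensates for shot noise before the eigenvalues are evaluated.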

The parallel goals for this research direction are to measure the empirical redshift-space clustering of galaxies in the different environmental regions, while setting up realistic simulations to make robust theoretical predictions for a given model. This exploits a new method in which the spatial distribution of haloes below the resolution limit of the simulation is reconstructed from the distribution of resolved haloes. Rapid coverage of parameter space is achieved by a complementary method in which the dark-matter halo catalogue from a single cosmology can be 'recast' to generate an accurate halo catalogue characteristic of a different model. Observational studies include both auto- and cross-correlation studies (e.g. the galaxy-group cross-correlation function in redshift space).

Project 2 — Galaxy formation in the multiverse

The importance of continued empirical measurements of dark energy does not remove the central challenge of understanding the scale of the vacuum energy, and the most radical approach to this is to apply anthropic selection within a multiverse. This route has been controversial, with detractors dismissing 'the A-word' as an unscientific content-free tautology. But although difficult procedural issues undeniably exist, there can be no doubt that well-posed and interesting scientific problems are generated by the anthropic viewpoint. The biggest hint that observer selection should be included in the terms of discussion is the 'why-now problem': why do we observe the universe at almost exactly the unique time when this strangely small vacuum density first comes to dominate the cosmos? A plausible answer is that larger values of the vacuum density would have inhibited structure formation.

A significant further step was taken by Weinberg in 1989, who used the anthropic argument to make the impressively bold prediction that Lambda would indeed turn out to be non-zero at about the observed level. The argument is Bayesian, with a prior that is uniform in Lambda, weighted by the collapse fraction: the proportion of mass in the universe that has become incorporated into sufficiently large nonlinear objects. The uniform prior in vacuum density is defensible, since in practice we are interested in only a small range of values around zero, and zero is not a special value. The collapse fraction can be well approximated using simple analytic arguments based on extensions of Press-Schechter theory. It therefore seems plausible that a higher Lambda would have suppressed the asymptotic efficiency of star formation even below its current low level, in which only about 2% of the cosmic baryon content is in the form of stars.
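The Press-Schechter collapse fraction that enters this anthropic weighting has a simple closed form, F = erfc(delta_c / (sqrt(2) sigma)), with delta_c = 1.686 the linear-theory collapse threshold. A minimal sketch, with illustrative sigma values, showing why the weighting is so sharp:

```python
from math import erfc, sqrt

DELTA_C = 1.686  # linear-theory threshold for spherical collapse

def collapse_fraction(sigma):
    """Press-Schechter fraction of mass in collapsed haloes, given the rms
    linear density fluctuation sigma on the relevant mass scale."""
    return erfc(DELTA_C / (sqrt(2.0) * sigma))
```

Since a larger Lambda freezes the growth of fluctuations earlier and so lowers the asymptotic sigma, even a factor-two reduction in sigma cuts the collapsed fraction by orders of magnitude; this erfc tail is the source of the exponential sensitivity of structure formation to Lambda invoked in the anthropic argument.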

More realistic treatment of galaxy formation

Given the importance of this argument, it seems highly unsatisfactory that the analysis of structure formation is carried out in such a back-of-the-envelope fashion. Galaxy formation is not completely understood, but there exists a sophisticated body of relevant theory, which has been claimed to give good quantitative predictions of the build-up of galaxies and the stars within them. Therefore, we should make our best attempt to apply this physical framework in the context of cosmological parameters beyond the standard model. This opens up a number of novel research directions - particularly the issue of star formation in the indefinite future, and how this ultimate asymptotic efficiency of star formation may be affected by changes in cosmological parameters.

We do not know what physical parameters might vary within the multiverse; Weinberg's prediction of the observed vacuum density used the simplest ensemble, with only a variation in vacuum energy. Once this calculation has been repeated for the first time including realistic galaxy-formation physics, the conclusion may alter. More complex ensembles may then be investigated, building up to the most radical view: that of the string-theory landscape, in which all of physics is free to vary. The approach to this question is experimental: try out different classes of ensemble and see which ones fail to match observation.

This programme of research also has benefits on the more purely astrophysical front. All current galaxy formation models include adjustable sub-grid degrees of freedom that are inferred by matching LambdaCDM to observations. By 'stress-testing' theories outside their normal regime of application, we hope to gain insight into the robustness of current predictions.

Semianalytic approach

Galaxy formation needs to follow the history of gas within dark-matter haloes, adding prescriptions for how cold gas turns into stars, followed by feedback of energy from the stars and from central black holes. All this must be calculated while following the hierarchical merging of dark-matter haloes. This can be computed within an explicit N-body simulation, but there are disadvantages in resolution and volume. The alternative is to generate 'semianalytic' merger trees of haloes via a rapid Monte Carlo algorithm: this is much faster than direct simulation and evades the problems of mass resolution and limited statistics - but at the price of losing the spatial relation between haloes.
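The Monte Carlo merger-tree idea can be illustrated with a deliberately simplified toy: a binary tree in which each halo, stepping back in time, splits into two progenitors with a random mass ratio, terminating at a resolution limit. This is a hypothetical stand-in for the real algorithm, which draws progenitor masses from the extended-Press-Schechter conditional mass function; the function and its parameters are illustrative only:

```python
import random

def toy_merger_tree(mass, m_res, depth, rng=None):
    """Toy binary halo merger tree: at each step back in time, split a halo
    into two progenitors with a random mass fraction; stop at the resolution
    limit m_res or at the requested depth. (Illustrative stand-in for
    sampling the EPS conditional mass function.)"""
    rng = rng or random.Random(0)
    if depth == 0 or mass < 2.0 * m_res:
        return {"mass": mass}               # leaf: unresolved progenitor
    f = rng.uniform(m_res / mass, 0.5)      # smaller progenitor's mass fraction
    return {"mass": mass,
            "progenitors": [toy_merger_tree(f * mass, m_res, depth - 1, rng),
                            toy_merger_tree((1 - f) * mass, m_res, depth - 1, rng)]}
```

Even this toy shows the key trade-off in the text: trees are essentially free to generate at any mass resolution, but the progenitors carry no spatial positions, so environmental information is lost.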

An initial research aim is to use this approach to explore a broad parameter space of non-standard cosmologies, following which a selection of models can be simulated directly to check the robustness of the results. Significant modifications of the code are needed, allowing integration into the far future and coverage of unusual parameter regimes. But at extreme future times it is necessary to resort to purely theoretical arguments. By this stage, gas that has failed to form stars should be in a highly diffuse form, and it should be possible to make arguments about its asymptotic state. But if it should turn out that (e.g.) the majority of the gas in the universe is able to form stars over timescales of a trillion years, we would face the puzzle that we are not typical observers even in our own universe. Furthermore, if the majority of the gas always cools and fragments given sufficient time, independent of Lambda, Weinberg's anthropic argument will fail.

Detailed simulations

The semianalytic work must be compared with detailed simulations where the gas history can be followed directly down to the resolution limit. A number of groups exist worldwide that have poured years of effort into developing codes for cosmological hydrodynamics, into which the necessary subgrid prescriptions for star formation can be inserted. Using such tools, a number of distinct workpackages can be envisaged:

(1) Exploring the growth of dark matter haloes in different cosmologies. The halo rescaling method can be extended to allow predictions for the future evolution of dark matter haloes beyond z=0, as well as to cover non-standard cosmologies, and this can be calibrated against direct dark-matter simulations of the far future.

(2) Using the full galaxy-formation code in zoom simulations, the future of star formation in Milky Way-type haloes can be assessed. These are the galaxies that dominate the present stellar content of the universe, and it is interesting to identify whether the present day marks a unique episode during their evolution in terms of the conversion of baryons into stars.

(3) Using the best-matching set of physical subgrid models that describe the Milky Way, the investigation is then extended to different cosmological models. Initial runs are medium-resolution simulations, identifying the galaxies with the highest conversion efficiency. Once these are identified, zoom runs at high resolution yield more robust measures of the star formation history and future of the galaxies.

Selected publications