The current year, 2016, will remain a landmark in the history of science and, even more, of the physical sciences. It is the year in which the first direct detection of gravitational waves was made, by a vast international consortium employing the Advanced LIGO ground-based interferometers. The LISA Pathfinder space probe – designed to test the drag-free technology necessary for LISA, the future space-borne gravitational wave detector planned by ESA with international partners – has not only flown successfully after being launched at the end of 2015, but has reported performance greatly exceeding expectations, with a noise level already of the order of what will be needed for LISA.
The ability to detect gravitational waves pushes our current technology to the limits in a number of areas, but opens the window to a completely new way of looking at the Universe. Until now astronomy, astrophysics and cosmology have been based on some form of electromagnetic information, coming from any of the known emitting sources in our cosmos, from individual stars to entire galaxies. Astronomers have created communities specialised in the detection, analysis and interpretation of photons received from such sources in diverse regions of the electromagnetic spectrum, from visible optical frequencies to radio, and to the X-rays or gamma rays associated with the most violent phenomena of the Universe, such as quasars or supernova explosions. But, from the early astronomers of the ancient Egyptian or Sumerian civilizations, to the modern astronomers using the Hubble Space Telescope, the Chandra X-ray Observatory or the Very Large Telescope, our Universe has always been studied by means of electromagnetic signals.
With gravitational waves we truly face a transformative step in our way of looking at the sky. One of the many testable, and now tested, predictions of our theory of gravity, General Relativity, gravitational waves will become our new tool to reveal the nature of the Universe, probing for the first time the fabric of spacetime.
What are the prime sources in this new era, those that replace, for instance, stars in conventional astronomy?
The answer is binaries of compact objects resulting from the ultimate fate of stars. Among these, binaries of massive black holes living at the centres of galaxies are the loudest sources, giving the strongest and most easily detectable signals once LISA is operative. The black holes in these binaries can weigh from just shy of a million solar masses to more than ten billion solar masses; our own Milky Way, for example, hosts a (single) black hole weighing a few million solar masses. LISA will preferentially detect massive black hole mergers happening in the early stages of the Universe, ten or more billion years before our time, when galaxies were still young and often collided with each other, as the Universe was much denser than it is today.
As a scientist I like to think I should understand as deeply as possible the tools I need to carry out my research and go after the most challenging problems. Modern astronomy came about because we first elaborated a beautiful, coherent theory of stellar structure and evolution. With that, stars became astronomy’s prime tool; without it, most of what we know now would not have been possible. Cosmology itself, as a quantitative, verifiable science, began in the 1920s because it became possible to measure the distances of objects such as galaxies, and this too is done using stars. Now the question is: in our time, do we understand the nature of our new sources, massive black hole binaries, in the same way as we understood stars in the late 1800s? The answer is no. But this is an exciting time to bring the knowledge of such objects to a new level.
It all begins when two galaxies, each with its own massive black hole sitting at its centre, collide and then gradually merge into a single galaxy, as their large haloes of dark matter exert a mutual, irresistible gravitational pull. Massive black hole binaries are then thought to evolve across an enormous range of spatial scales. This was clear already at the time of the first major theoretical work on massive black hole binaries (Begelman, Blandford & Rees 1980). For typical massive black holes, weighing 10–100 million solar masses, the stage at which gravitational wave emission becomes the dominant mechanism draining the orbital energy of the binary and bringing it to coalescence is reached only when the two black holes are separated by about a milliparsec. But they start their journey tens of thousands of light years apart, when they still sit in the nuclei of their merging host galaxies.
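To get a rough sense of why the milliparsec scale matters, one can evaluate the classic Peters (1964) formula for the coalescence time of a circular binary driven by gravitational wave emission alone. The sketch below is purely illustrative: the masses and separations are round assumed numbers, not values from the simulations discussed in this article.

```python
# Illustrative estimate of the gravitational-wave coalescence time for a
# circular black hole binary, using the Peters (1964) formula:
#   t_gw = (5/256) * c^5 * a^4 / (G^3 * m1 * m2 * (m1 + m2))
# All input values below are assumed round numbers for illustration.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
PC = 3.086e16        # parsec, m
YR = 3.156e7         # year, s

def t_coalesce_yr(m1_msun, m2_msun, a_mpc):
    """Coalescence time in years for a circular binary of masses m1, m2
    (solar masses) at separation a (milliparsec), per Peters (1964)."""
    m1, m2 = m1_msun * M_SUN, m2_msun * M_SUN
    a = a_mpc * 1e-3 * PC
    t = (5.0 / 256.0) * c**5 * a**4 / (G**3 * m1 * m2 * (m1 + m2))
    return t / YR

# Two 5e7 solar-mass black holes at 1 milliparsec merge within a few
# thousand years, so gravitational wave emission clearly dominates there.
# Since t_gw scales as a^4, at 10 milliparsec the time is 10,000x longer,
# and other drag mechanisms must do the work of shrinking the orbit.
print(f"{t_coalesce_yr(5e7, 5e7, 1.0):.2e} yr")
print(f"{t_coalesce_yr(5e7, 5e7, 10.0):.2e} yr")
```

The steep a⁴ dependence is the whole story: gravitational radiation is utterly negligible at galactic separations and overwhelmingly dominant below the milliparsec scale.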
Ideas about the physical processes governing the evolution of the orbit of a pair of massive black holes have been around for a while, but modelling them correctly requires complex computer models that solve, at the very least, the set of coupled partial differential equations for gravity, pressure forces and radiation. The early part of the journey, down to roughly the milliparsec scale, can be described by Newtonian equations, while the latter part requires general relativistic calculations, solving Einstein’s equations or at least some approximation of them in the form of the so-called ‘post-Newtonian expansion’ (Blanchet 2006). Calculations of this type require supercomputers: solving even the simplest of these models involves so many operations that it would take a thousand years on a conventional notebook or workstation.
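A quick back-of-envelope calculation shows what "a thousand years on a notebook" implies. The sustained throughput figures below are assumptions for illustration, not measurements from any particular machine or code.

```python
# Back-of-envelope check of the 'thousand years on a notebook' claim.
# Assumed (not from the article): a notebook sustains ~1e10 floating-point
# operations per second on this kind of workload.

notebook_flops = 1e10            # sustained operations per second, assumed
seconds_per_year = 3.156e7

# Total work implied by 1000 years of continuous computing:
total_ops = notebook_flops * seconds_per_year * 1000
print(f"~{total_ops:.0e} operations")

# A supercomputer sustaining 1 PFLOP/s (1e15 op/s) on the same problem
# would chew through that work in a few days:
print(f"~{total_ops / 1e15 / 86400:.0f} days")
```

The gap between these two numbers is exactly why this science lives on supercomputers.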
A critical stage is reached when the two black holes become close enough to be mutually bound by gravity: at this point we say that the binary has formed. Supercomputer calculations through the years have shown that in this phase the drag by the dense, cold interstellar gas in galactic nuclei is the dominant process (Mayer et al. 2007; Chapon, Mayer et al. 2013). After the binary has formed, the jury is still out on the main source of drag, and it may well depend on the type of galaxy in which the binary is evolving. If there is plenty of cold gas down to the heart of the nucleus, it will torque the binary as long as there are asymmetries in its distribution, in a process similar to planet migration (Mayer 2013). Alternatively, stars flying close to the binary can rob it of kinetic energy and angular momentum through their gravitational pull, and bring it into the gravitational wave regime (Milosavljević & Merritt 2001; Khan et al. 2011; Vasiliev et al. 2015).
Traditionally, computer models able to describe the effect of encounters with stars were not able to model friction and torques by gas, nor was it possible to follow the whole binary shrinking process from the galaxy-merger stage to the onset of relativistic effects. Recently we used some of the fastest supercomputers in the world, located in Switzerland, China and Germany, to carry out the first simulation that follows all the phases of the evolution of the binary, up to the point when gravitational wave radiation begins (Khan, Fiacconi, Mayer et al. 2016). We started from a galaxy merger extracted from a state-of-the-art simulation of galaxy formation, called ARGO (Feldmann & Mayer 2015), which was previously run on the Piz Daint supercomputer in Switzerland. The result was unexpected: the two black holes, which weigh more than 100 million solar masses, fuse into one with a gravitational wave burst less than ten million years after the galaxy collision. We also demonstrated that the emitted waves fall into the LISA band before the black holes coalesce. The timescale of the process is almost 100 times shorter than usually assumed in forecasts of how many black hole merger events LISA should detect.
This is exciting news, and it is also well understood: it is simply a consequence of the fact that galaxies were about 100 times denser several billion years ago than they are today, and the key processes determining the shrinking of the binary all depend on density.
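The density argument can be made concrete with the standard Chandrasekhar dynamical-friction estimate, in which the orbital decay time of a massive body moving through a stellar or gaseous background scales inversely with the background density. The numbers below are assumed round values chosen only to exhibit that scaling, not outputs of the simulations described above.

```python
# Illustrative scaling: the Chandrasekhar dynamical-friction decay time for
# a body of mass M moving at speed v through a background of density rho
# goes as  t_df ~ v^3 / (G^2 * M * rho * ln_Lambda),  i.e. as 1/rho.
# All input numbers are assumed round values for illustration only.

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def t_df_seconds(v, M, rho, ln_lambda=10.0):
    """Order-of-magnitude dynamical-friction timescale (Chandrasekhar)."""
    return v**3 / (G**2 * M * rho * ln_lambda)

# Same black hole (~1e8 solar masses) and orbital speed (~200 km/s),
# but an early-Universe nucleus assumed 100x denser than one today:
t_today = t_df_seconds(v=2e5, M=2e38, rho=1e-18)
t_early = t_df_seconds(v=2e5, M=2e38, rho=1e-16)
print(t_today / t_early)   # 100x faster orbital decay in the denser nucleus
```

Since every drag mechanism at play, gaseous or stellar, carries this density dependence, a 100-fold denser environment translates directly into a roughly 100-fold shorter path to coalescence.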
Now the challenge ahead of us is mostly computational. This simulation is the first of its kind, and it required more than a year of nearly continuous computing, despite harnessing the power of such big machines. But there is a catch: even the best simulation programs we currently have can exploit less than 10% of the total computing power of these supercomputers at once. Such inefficient usage will become even more evident when the bigger and more powerful exascale supercomputers appear in a couple of years. Yet computer science offers us new techniques to improve the so-called ‘scalability’ of simulation codes, namely their ability to run in parallel on a large number of processing units, from traditional CPUs to Graphics Processing Units (GPUs).
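Why a code struggles to exploit a whole supercomputer is captured by Amdahl's law: any fraction of the work that cannot be parallelised caps the achievable speedup. The serial fractions below are assumed values for illustration, not measurements of any real simulation code.

```python
# Amdahl's law: if a fraction s of the work is serial, the speedup on
# N processors is 1 / (s + (1 - s) / N), and the parallel efficiency
# (fraction of the machine actually exploited) is speedup / N.
# The serial fractions used here are assumed, for illustration only.

def efficiency(serial_fraction, n_procs):
    speedup = 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)
    return speedup / n_procs

for s in (0.001, 0.0001):
    for n in (1_000, 100_000):
        print(f"serial={s:.4%}  procs={n:>7}  efficiency={efficiency(s, n):.1%}")

# Even a 0.1% serial fraction limits efficiency to ~50% on 1,000 cores
# and to under 1% on 100,000 cores -- hence the push for better scalability.
```

Shrinking the serial and communication-bound fraction of the work, rather than buying more processors, is what turns a bigger machine into a faster simulation.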
If we advance our codes to approach 100% efficiency on the new supercomputers, we could run tens of simulations simultaneously. This will be the way to provide the theoretical support necessary to produce realistic forecasts for LISA, and to help with the interpretation of the data afterwards. We can envision a supercomputer entirely dedicated to black hole merger simulations, including those focusing on the final phase of coalescence in full general relativity.
This may seem ambitious, but it may also be the only way. The parameter space is huge and has to be explored with an ambitious simulation campaign. Supercomputers dedicated to very important tasks, such as weather forecasting, already exist. The endeavour of looking at the Universe through the new window of gravitational waves might be a revolutionary step in mankind’s knowledge; it might make history, just as the first astronomical observations of Copernicus, Kepler and Galileo did centuries before us. It definitely deserves an unprecedented effort in dedicating computational resources, and any other necessary resources, to it.
Begelman, M. C.; Blandford, R. D.; Rees, M. J., 1980, Nature, 287, 307,
“Massive black hole binaries in active galactic nuclei”
Blanchet, Luc, 2006, Living Reviews in Relativity, Volume 9,
“Gravitational Radiation from Post-Newtonian Sources and Inspiralling Compact Binaries”
Chapon, Damien; Mayer, Lucio; Teyssier, Romain, 2013, Monthly Notices of the Royal Astronomical Society, 429, 3114,
“Hydrodynamics of galaxy mergers with supermassive black holes: is there a last parsec problem?”
Feldmann, Robert; Mayer, Lucio, 2015, Monthly Notices of the Royal Astronomical Society, 446, 1939,
“The Argo simulation – I. Quenching of massive galaxies at high redshift as a result of cosmological starvation”
Khan, Fazeel Mahmood; Just, Andreas; Merritt, David, 2011, Astrophysical Journal, 732, 89,
“Efficient Merger of Binary Supermassive Black Holes in Merging Galaxies”
Mayer, L.; Kazantzidis, S.; Madau, P.; Colpi, M.; Quinn, T.; Wadsley, J., 2007, Science, 316, 1874,
“Rapid Formation of Supermassive Black Hole Binaries in Galaxy Mergers with Gas”
Mayer, Lucio, 2013, Classical and Quantum Gravity, Volume 30, Issue 24,
“Massive black hole binaries in gas-rich galaxy mergers: multiple regimes of orbital decay and interplay with gas inflows”
Milosavljević, Miloš; Merritt, David, 2001, Astrophysical Journal, 563, 34,
“Formation of Galactic Nuclei”
Vasiliev, Eugene; Antonini, Fabio; Merritt, David, 2015, Astrophysical Journal, 810, 49,
“The Final-parsec Problem in the Collisionless Limit”
Professor Lucio Mayer
Center for Theoretical Astrophysics and Cosmology
Computational Astrophysics Group
+41 446 356 198