vendredi 30 décembre 2016

Hoping and believing are different things for physicists

The monster, the second sister and Cinderella: three Magi announcing the era of direct gravitational wave astronomy

On February 11, 2016 the LIGO-Virgo collaboration announced the detection of Gravitational Waves (GW). They were emitted about one billion years ago by a Binary Black Hole (BBH) merger and reached Earth on September 14, 2015. The claim, as it appears in the 'discovery paper' [1] and as stressed in press releases and seminars, was based on a "> 5.1 σ significance." Ironically, shortly after, on March 7 the American Statistical Association (ASA) came out (independently) with a strong statement warning scientists about the interpretation and misuse of p-values [2]...
In June we finally learned [4] that another 'one and a half' gravitational waves from Binary Black Hole mergers had also been observed in 2015, where by the 'half' I refer to the October 12 event, believed with high confidence by the collaboration to be a gravitational wave, although having only 1.7 σ significance and therefore classified just as LVT (LIGO-Virgo Trigger) instead of GW. However, another figure of merit has been provided by the collaboration for each event, a number based on probability theory that tells us how much we must modify the relative beliefs of two alternative hypotheses in the light of the experimental information. This number, to my knowledge never even mentioned in press releases or seminars to large audiences, is the Bayes factor (BF), whose meaning is easily explained: if you considered a priori two alternative hypotheses equally likely, a BF of 100 changes your odds to 100 to 1; if instead you considered one hypothesis rather unlikely, let us say your odds were 1 to 100, a BF of 10^4 turns them the other way around, that is 100 to 1. You will be amazed to learn that even the "1.7 sigma" LVT151012 has a BF of order 10^10, considered very strong evidence in favor of the hypothesis "Binary Black Hole merger" against the alternative hypothesis "Noise". (Alan Turing would have called the evidence provided by such a huge Bayes factor, or what I. J. Good would have preferred to call the "Bayes-Turing factor" [5], 100 deciban, well above the 17 deciban threshold considered by the team at Bletchley Park during World War II to be reasonably confident of having cracked the daily Enigma key [7].)...
Figure 3: The Monster (GW150914), Cinderella (LVT151012) and the third sister (GW151226), visiting us in 2015 (Fig. 1 of [4] – see text for the reason for the names). The published 'significance' of the three events (Table 1 of [4]) is, in order, "> 5.3 σ", "1.7 σ" and "> 5.3 σ", corresponding to the following p-values: 7.5 × 10^-8, 0.045, 7.5 × 10^-8. The natural logs of the Bayes factors are instead (Table 4 of [4]) approximately 289, 23 and 60, corresponding to Bayes factors of about 3 × 10^125, 10^10 and 10^26.

... even if at first sight it does not look dissimilar from GW151226 (but remember that the waves in Fig. 3 do not show raw data!), the October 12 event, hereafter referred to as Cinderella, is not ranked as a GW but, more modestly, as an LVT, for LIGO-Virgo Trigger. The reason for the downgrade is that 'she' cannot wear a "> 5σ dress" to go with the 'sisters' to the 'sumptuous ball of the Establishment.' In fact Chance has assigned 'her' only a poor, unpresentable 1.7 σ ranking, usually considered in the Particle Physics community not even worth a mention in a parallel session of a minor conference by an undergraduate student. But, despite the modest 'statistical significance', experts are highly confident, for physics reasons* (and because of their understanding of the background), that this is also a gravitational wave radiated by a BBH merger, much more confident than the 87% quoted in [4]. [Detecting something that has good reason to exist, because of our understanding of the Physical World (related to a network of other experimental facts and theories connecting them!), is quite different from just observing an unexpected bump, possibly due to background, even if with small probability, as already commented in footnote 15. And remember that whatever we observe in real life, if seen with high enough resolution in the N-dimensional phase space, had a very small probability of occurring! (Imagine, as a simplified example, the pixel content of any picture you take walking down the road, in which N is equal to five, i.e. two coordinates plus the RGB code of each pixel.)]
Giulio D'Agostini (Submitted on 6 Sep 2016)
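The odds-updating arithmetic in the excerpt above is easy to verify; the following is a minimal stdlib-only sketch (the numerical values are those quoted in the text, and the function names are mine):

```python
import math

def update_odds(prior_odds, bayes_factor):
    """Posterior odds = prior odds x Bayes factor."""
    return prior_odds * bayes_factor

def deciban(bayes_factor):
    """Evidence in decibans (Turing/Good): 10 * log10(BF)."""
    return 10 * math.log10(bayes_factor)

# Even prior odds (1:1) with BF = 100 give posterior odds of 100:1
print(update_odds(1.0, 100))      # 100.0
# Sceptical prior odds of 1:100 with BF = 10^4 turn into 100:1
print(update_odds(1 / 100, 1e4))  # 100.0
# LVT151012: ln(BF) ~ 23 (Table 4 of [4]) corresponds to BF ~ 10^10 ...
print(math.exp(23))
# ... i.e. roughly 100 decibans, far above the 17-deciban Bletchley Park threshold
print(deciban(math.exp(23)))
```

Note that the Bayes factor only rescales the odds; the posterior belief still depends on the prior, which is exactly why a BF alone is not a "discovery probability".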


Will the first 5-sigma claim from LHC Run2 be a fluke?
In the meanwhile it seems that particle physicists are slow to learn the lesson, and the number of graves in the Cemetery of Physics ... has increased ..., the last funeral having recently been celebrated in Chicago on August 5, with the following obituary for the dear departed: "The intriguing hint of a possible resonance at 750 GeV decaying into photon pairs, which caused considerable interest from the 2015 data, has not reappeared in the much larger 2016 data set and thus appears to be a statistical fluctuation" [57]. And de Rujula's dictum gets corroborated. [If you disbelieve every result presented as having a 3 sigma, or 'equivalently' a 99.7% chance of being correct, you will turn out to be right 99.7% of the time. ('Equivalently' within quote marks is de Rujula's original, because he knows very well that there is no equivalence at all.)] Someone would argue that this incident happened because the sigmas were only about three and not five. But it is not a question of sigmas, but of Physics, as can be understood from those who in 2012 incorrectly turned the 5σ into a 99.99994% "discovery probability" for the Higgs [58], while in 2016 they are sceptical of a 6σ claim ("if I have to bet, my money is on the fact that the result will not survive the verifications" [59]): the famous "du sublime au ridicule, il n'y a qu'un pas" ("from the sublime to the ridiculous there is but one step") seems really appropriate! ...
Seriously, the question is indeed that, now that predictions of New Physics around what should have been a natural scale have substantially all failed, the only 'sure' scale I can see seems to be Planck's scale. I really hope that the LHC will surprise us, but hoping and believing are different things. And, since I have the impression that there are too many nervous people around, both among experimentalists and theorists, and because the number of possible histograms to look at is quite large, after the easy bets of the past years (against the CDF peak and against superluminal neutrinos in 2011; in favor of the Higgs boson in 2011; against the 750 GeV di-photon in 2015; not to mention that against Supersymmetry, going on since it failed to predict new phenomenology below the Z0 – or the W? – mass at LEP, thus inducing me more than twenty years ago to give away all the SUSY Monte Carlo generators I had developed in order to optimize the performance of the HERA detectors), I can serenely bet, as I have kept saying since July 2012, that the first 5-sigma claim from the LHC will be a fluke. (I have instead little to comment on the sociology of the Particle Physics theory community and on the validity of 'objective' criteria for ranking scientific value and productivity, the situation being self-evident from the hundreds of references in a review paper which even had on its front page a fake PDG entry for the particle [60], and other amenities you can find on the web, like [61].)
Id.
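The scare quotes around de Rujula's 'equivalently' can be made concrete: a significance in σ maps to a tail probability of a standard normal, which is a statement about P(data | noise), not about the probability that the claim is correct. A small stdlib-only sketch (the two-sided convention used here is my assumption; conventions vary between analyses):

```python
import math

def p_value(sigma, two_sided=True):
    """Tail probability of a standard normal beyond +/- sigma."""
    p = math.erfc(sigma / math.sqrt(2))  # two-sided tail
    return p if two_sided else p / 2

# "3 sigma": two-sided tail of ~0.27%, the source of the "99.7%" figure
print(p_value(3))
# "5 sigma": ~5.7e-7, the number behind the misread "99.99994% discovery probability"
print(p_value(5))
```

Turning 1 − p into a "probability the result is correct" is precisely the p-value misuse the ASA statement warns about.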

Bayesian anatomy of the 750 GeV fluke 
The statistical anomalies at about 750 GeV in ATLAS [1, 2] and CMS [3, 4] searches for a diphoton resonance (denoted in this text as F {for digamma}) at √s = 13 TeV with about 3/fb caused considerable activity (see e.g., Refs. [5, 6, 7]). The experiments reported local significances, which neglect the look-elsewhere effect (LEE, see e.g., Refs. [8, 9]), of 3.9σ and 3.4σ, respectively, and global significances, which incorporate a LEE in the production cross section, mass and width of the F, of 2.1σ and 1.6σ, respectively. There was concern, however, that an overall LEE, accounting for the numerous hypothesis tests of the SM at the LHC, could not be incorporated, and that the plausibility of the F was difficult to gauge.
Whilst ultimately the F was disfavoured by searches with about 15/fb [10, 11], we directly calculate the relative plausibility of the SM versus the SM plus F in light of the ATLAS data available during the excitement, matching, wherever possible, the parameter ranges and parameterisations in the frequentist analyses. The relative plausibility sidesteps technicalities about the LEE and the frequentist formalism required to interpret significances. We calculate the Bayes factor (see e.g., Ref. [12]) in light of the ATLAS data.
Our main result is that, at its peak, the Bayes factor was about 7.7 in favour of the F. In other words, in light of the ATLAS 13 TeV 3.2/fb and 8 TeV 20.3/fb diphoton searches, the relative plausibility of the F versus the SM alone increased by about eight. This was "substantial" on Jeffreys' scale [13], lying between "not worth more than a bare mention" and "strong evidence." For completeness, we calculated that this preference was reversed by the ATLAS 13 TeV 15.4/fb search [11], resulting in a Bayes factor of about 0.7. Nevertheless, the interest in F models in the interim was, to some degree, supported by Bayesian and frequentist analyses. Unfortunately, CMS performed searches in numerous event categories, resulting in a proliferation of background nuisance parameters and making replication difficult without cutting corners or considerable computing power.
Andrew Fowlie  (Submitted on 22 Jul 2016 (v1), last revised 6 Dec 2016 (this version, v2))
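The Jeffreys-scale wording quoted in the excerpt can be encoded directly. The half-decade thresholds below are a commonly used variant of Jeffreys' (1961) table, so treat the exact cut points and labels as an assumption rather than the paper's own convention:

```python
def jeffreys_label(bf):
    """Qualitative grade of a Bayes factor B on a common variant of
    Jeffreys' scale (thresholds in half-decades of log10 B)."""
    if bf < 1:
        # Evidence favours the other hypothesis; grade 1/B instead.
        return jeffreys_label(1 / bf) + " (against)"
    if bf < 10 ** 0.5:
        return "not worth more than a bare mention"
    if bf < 10:
        return "substantial"
    if bf < 10 ** 1.5:
        return "strong"
    if bf < 100:
        return "very strong"
    return "decisive"

# The peak 750 GeV Bayes factor quoted by Fowlie
print(jeffreys_label(7.7))  # substantial
# After the 15.4/fb search the BF fell to ~0.7, mildly favouring the SM
print(jeffreys_label(0.7))  # not worth more than a bare mention (against)
```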



mercredi 9 novembre 2016

[Today the world is trumper than yesterday! Was yesterday's world any less deceptive ("trompeur") than today's?]

Yesterday's forecast for the 2016 American presidential election
from projects.fivethirtyeight.com/2016-election-forecast

Today projection after the vote
Beware: the color labels for Trump and Clinton in the following graphic are the opposite of those in the previous one!

from uselectionatlas.org/RESULTS (November 9)

Last comment (November 19)
Will Trump victory make the USA a more obvious plutocracy?
Here are the latest (final?) results:

from uselectionatlas.org/RESULTS (November 19)



dimanche 6 novembre 2016

[There, is] plenty of room for new phases at high pressure [!,?]

No comment

Evidence for a new phase of dense hydrogen above 325 gigapascals
Philip Dalladay-Simpson, Ross T. Howie & Eugene Gregoryanz
Nature 529, 63–67 (07 January 2016)
Almost 80 years ago it was predicted that, under sufficient compression, the H–H bond in molecular hydrogen (H2) would break, forming a new, atomic, metallic, solid state of hydrogen. Reaching this predicted state experimentally has been one of the principal goals in high-pressure research for the past 30 years. Here, using in situ high-pressure Raman spectroscopy, we present evidence that at pressures greater than 325 gigapascals at 300 kelvin, H2 and hydrogen deuteride (HD) transform to a new phase—phase V. This new phase of hydrogen is characterized by substantial weakening of the vibrational Raman activity, a change in pressure dependence of the fundamental vibrational frequency and partial loss of the low-frequency excitations. We map out the domain in pressure–temperature space of the suggested phase V in H2 and HD up to 388 gigapascals at 300 kelvin, and up to 465 kelvin at 350 gigapascals; we do not observe phase V in deuterium (D2). However, we show that the transformation to phase IV′ in D2 occurs above 310 gigapascals and 300 kelvin. These values represent the largest known isotopic shift in pressure, and hence the largest possible pressure difference between the H2 and D2 phases, which implies that the appearance of phase V of D2 must occur at a pressure of above 380 gigapascals. These experimental data provide a glimpse of the physical properties of dense hydrogen above 325 gigapascals and constrain the pressure and temperature conditions at which the new phase exists. We speculate that phase V may be the precursor to the non-molecular (atomic and metallic) state of hydrogen that was predicted 80 years ago.


New low temperature phase in dense hydrogen: The phase diagram to 421 GPa
Ranga Dias, Ori Noked, Isaac F. Silvera
(Submitted on 7 Mar 2016 (v1), last revised 26 May 2016 (this version, v2))
In the quest to make metallic hydrogen at low temperatures a rich number of new phases have been found and the highest pressure ones have somewhat flat phase lines, around room temperature. We have studied hydrogen to static pressures of GPa in a diamond anvil cell and down to liquid helium temperatures, using infrared spectroscopy. We report a new phase at a pressure of GPa and T=5 K. Although we observe strong darkening of the sample in the visible, we have no evidence that this phase is metallic hydrogen.


No "Evidence for a new phase of dense hydrogen above 325 GPa"
Ranga P. Dias, Ori Noked, Isaac F. Silvera
(Submitted on 18 May 2016)
In recent years there has been intense experimental activity to observe solid metallic hydrogen. Wigner and Huntington predicted that under extreme pressures insulating molecular hydrogen would dissociate and transition to atomic metallic hydrogen. Recently Dalladay-Simpson, Howie, and Gregoryanz reported a phase transition to an insulating phase in molecular hydrogen at a pressure of 325 GPa and 300 K. Because of its scientific importance we have scrutinized their experimental evidence to determine if their claim is justified. Based on our analysis, we conclude that they have misinterpreted their data: there is no evidence for a phase transition at 325 GPa.




Nature of the Metallization Transition in Solid Hydrogen
Sam Azadi, N. D. Drummond, W. M. C. Foulkes
(Submitted on 2 Aug 2016)
Determining the metalization pressure of solid hydrogen is one of the great challenges of high-pressure physics. Since 1935, when it was predicted that molecular solid hydrogen would become a metallic atomic crystal at 25 GPa [1], compressed hydrogen has been studied intensively. Additional interest arises from the possible existence of room-temperature superconductivity [2], a metallic liquid ground state [3], and the relevance of solid hydrogen to astrophysics [4, 5].  
Early spectroscopic measurements at low temperature suggested the existence of three solid-hydrogen phases [4]. Phase I, which is stable up to 110 GPa, is a molecular solid composed of quantum rotors arranged in a hexagonal close-packed structure. Changes in the low-frequency regions of the Raman and infrared spectra imply the existence of phase II, also known as the broken-symmetry phase, above 110 GPa. The appearance of phase III at 150 GPa is accompanied by a large discontinuity in the Raman spectrum and a strong rise in the spectral weight of molecular vibrons. Phase IV, characterized by the two vibrons in its Raman spectrum, was discovered at 300 K and pressures above 230 GPa [6–8]. Another new phase has been claimed to exist at pressures above 200 GPa and higher temperatures (for example, 480 K at 255 GPa) [9]. This phase is thought to meet phases I and IV at a triple point, near which hydrogen retains its molecular character. The most recent experimental results [10] indicate that H2 and hydrogen deuteride at 300 K and pressures greater than 325 GPa transform to a new phase V, characterized by substantial weakening of the vibrational Raman activity. Other features include a change in the pressure dependence of the fundamental vibrational frequency and the partial loss of the low-frequency excitations.  
Although it is very difficult to reach the hydrostatic pressure of more than 400 GPa at which hydrogen is normally expected to metalize, some experimental results have been interpreted as indicating metalization at room temperature below 300 GPa [6]. However, other experiments show no evidence of the optical conductivity expected of a metal at any temperature up to the highest pressures explored [11]. Experimentally, it remains unclear whether or not the molecular phases III and IV are metallic, although it has been suggested that phase V may be non-molecular (atomic) [10]. Metalization is believed to occur either via the dissociation of hydrogen molecules and a structural transformation to an atomic metallic phase [6, 12], or via band-gap closure within the molecular phase [13, 14]. In this work we investigate the latter possibility using advanced computational electronic structure methods.
Structures of crystalline materials are normally determined by X-ray or neutron diffraction methods. These techniques are very challenging for low-atomic-number elements such as hydrogen [15]. Fortunately optical phonon modes disappear, appear, or experience sudden shifts in frequency when the crystal structure changes. It is therefore possible to identify the transitions between phases using optical methods.


Observation of the Wigner-Huntington transition to metallic hydrogen
Ranga P. Dias, Isaac F. Silvera
(Submitted on 5 Oct 2016)
We have studied solid hydrogen under pressure at low temperatures. With increasing pressure we observe changes in the sample, going from transparent, to black, to a reflective metal, the latter studied at a pressure of 495 GPa. We have measured the reflectance as a function of wavelength in the visible spectrum, finding values as high as 0.90 for the metallic hydrogen. We have fit the reflectance using a Drude free-electron model to determine the plasma frequency of 30.1 eV at T = 5.5 K, with a corresponding electron carrier density of 6.7 × 10^23 particles/cm^3, consistent with theoretical estimates. The properties are those of a metal. Solid metallic hydrogen has been produced in the laboratory.
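The quoted carrier density can be checked against the quoted plasma energy with the textbook free-electron (Drude) relation n = ε0 m_e ω_p² / e², with ω_p = E_p/ħ. This is my own consistency check, not the authors' fit:

```python
# Physical constants (SI, CODATA values)
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
M_E  = 9.1093837015e-31   # electron mass, kg
Q_E  = 1.602176634e-19    # elementary charge, C
HBAR = 1.054571817e-34    # reduced Planck constant, J*s

def carrier_density_from_plasma_energy(E_p_eV):
    """Drude model: n = eps0 * m_e * omega_p^2 / e^2,
    with omega_p = E_p / hbar for a plasma energy E_p in eV."""
    omega_p = E_p_eV * Q_E / HBAR           # rad/s
    n_m3 = EPS0 * M_E * omega_p ** 2 / Q_E ** 2
    return n_m3 * 1e-6                      # convert m^-3 -> cm^-3

# Plasma energy of 30.1 eV quoted in the abstract
print(f"{carrier_density_from_plasma_energy(30.1):.2e}")  # ~6.6e23 cm^-3
```

The result is indeed close to the 6.7 × 10^23 cm^-3 quoted, so the two numbers in the abstract are mutually consistent.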

dimanche 2 octobre 2016

Solar neutrinos: Oscillations or (almost) No-oscillations (?)

Neutrino oscillations disentangled from adiabatic flavor conversion: always mind your terminology!
Next Tuesday the 2016 Nobel prize in physics will be announced. That leaves two days to think one more time about the interesting physics of the previous year, learning some lessons from the past:
The Nobel prize in physics 2015 was awarded "... for the discovery of neutrino oscillations which show that neutrinos have mass". While Super-Kamiokande (SK), indeed, discovered oscillations, the Sudbury Neutrino Observatory (SNO) observed the effect of the adiabatic (almost non-oscillatory) flavor conversion of neutrinos in the matter of the Sun. Oscillations are irrelevant for solar neutrinos apart from the small electron-neutrino regeneration inside the Earth. Neither oscillations nor adiabatic conversion implies masses uniquely, and further studies were required to show that non-zero neutrino masses are behind the SNO results. The phenomena of oscillations (a phase effect) and adiabatic conversion (the Mikheïev-Smirnov-Wolfenstein (MSW) effect, driven by the change of mixing in matter) are described in a pedagogical way.

In the figure above we show graphic representations of neutrino oscillations and adiabatic conversion, based on an analogy with electron spin precession in a magnetic field. The neutrino polarization vector in flavor space (the "spin") moves around the "eigenstate axis" (the magnetic field), whose direction is determined by the mixing angle through 2θm. Oscillations are equivalent to precession of the neutrino polarization vector around a fixed axis, Fig. a. The oscillation probability is determined by the projection of the neutrino vector on the z axis: the vector pointing up corresponds to νe, pointing down to νa. Adiabatic conversion is driven by rotation of the cone itself, i.e. a change of direction of the magnetic field (the cone axis) following the change of the mixing angle, Fig. b. Owing to adiabaticity the cone opening angle does not change, and therefore the neutrino vector follows the rotation of the axis.
...
Oscillations do not require mass. Recall that it was the subject of Wolfenstein's classic paper [9] to show that oscillations can proceed for massless neutrinos. This requires, however, the introduction of non-standard interactions of neutrinos, which lead to non-diagonal potentials in the flavor basis and therefore produce mixing.
In oscillations we test the dispersion relations, that is, the relations between energy and momentum, and not the masses directly. Oscillations are induced by the difference in dispersion between the neutrino components that compose a mixed state...
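This point is visible directly in the standard two-flavour vacuum formula, where only the squared-mass difference Δm² enters the phase, never an absolute mass. A generic textbook sketch (vacuum oscillations only, not the solar matter-conversion case discussed in the paper):

```python
import math

def survival_probability(sin2_2theta, dm2_eV2, L_km, E_GeV):
    """Two-flavour vacuum survival probability P(nu -> nu).
    The factor 1.27 absorbs hbar*c for dm2 in eV^2, L in km, E in GeV.
    Note only the squared-mass *difference* dm2 enters the phase."""
    phase = 1.27 * dm2_eV2 * L_km / E_GeV
    return 1 - sin2_2theta * math.sin(phase) ** 2

# At L = 0 no phase has accumulated, so nothing oscillates
print(survival_probability(0.85, 7.5e-5, 0.0, 0.001))  # 1.0
```

Shifting both masses by the same amount leaves dm2_eV2, and hence every prediction of this formula, unchanged; that is why oscillation experiments alone cannot pin down the absolute mass scale.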
It is the consistency of the results of many experiments, over wide energy ranges and in different environments (vacuum, matter with different density profiles), that makes an explanation of the data without mass almost impossible. In this connection one may wonder which type of experiment or measurement can uniquely identify the true mass. Let us mention three possibilities:
• Kinematical measurements: distortion of the beta-decay spectrum near the end point. Notice that a similar effect can be produced if a degenerate sea of neutrinos exists which blocks neutrino emission near the end point.
• Detection of neutrinoless double beta decay, which is a test of the Majorana neutrino mass. Here the complications are related to possible contributions to the decay from new L-violating interactions.
• Cosmology, which is sensitive to the sum of the neutrino masses and in the future will be sensitive even to individual masses. Here the problem is the degeneracy between neutrino mass and cosmological parameters.
...
In January 1986 at the Moriond workshop A. Messiah (he gave the talk [16]) asked me: "Why do you call the effect that happens in the Sun resonance oscillations? It has nothing to do with oscillations; I will call it the MSW effect." My reply was: "Yes, I agree, we simply did not know what to call it. I will explain and correct this in my future talks and publications." Messiah's answer was surprising: "No way... now this confusion will stay forever." At the time I could not believe him. I have published a series of papers and delivered review talks and lectures in which I was trying to explain, fix the terminology, etc. All this has been described in detail in the talk at the Nobel symposium [17]; for a recent review see [8].
Ideally, terminology should reflect and follow our understanding of the subject. Deeper understanding may require a change or modification of terminology. At the same time, changing terminology is a very delicate thing and must be done with great care.
In conclusion, the answer to the question in the title of the paper is 
“Solar neutrinos: Almost No-oscillations”.
The SNO experiment discovered the effect of adiabatic flavor conversion (the MSW effect). Oscillations (the effect of the phase) are irrelevant. The evolution of solar neutrinos can be considered as the independent (incoherent) propagation of the produced eigenstates in matter. The flavors of these eigenstates (described by the mixing angle) change according to the density change. At high energies (SNO) the adiabatic conversion is close to the non-oscillatory transition, which corresponds to the production of a single eigenstate. Oscillations with small depth occur in the matter of the Earth.
A. Yu. Smirnov (Submitted on 8 Sep 2016)

lundi 29 août 2016

High (energy physics exploration) by East-west (collaboration on heavy ion collision experiments)

There is more to understanding fundamental interactions than potential new elementary particles
This short note describes the long collaborative effort between Arizona and Kraków, showing some of the key strangeness signatures of quark-gluon plasma. It further presents an annotated catalog of foundational questions defining the research frontiers which I believe can be addressed in the foreseeable future in the context of relativistic heavy ion collision experiments. The list includes topics that are specific to the field, and ventures towards the known-to-be-unknown that may have a better chance with ions as compared to elementary interactions.
Some 70 years ago the development of relativistic particle accelerators heralded a new era of laboratory-based systematic exploration and study of elementary particle interactions...  
The outcomes of this long quest are, on one hand, the standard model (SM) of particle physics and, on the other, the discovery of the primordial deconfined quark-gluon plasma (QGP). These two foundational insights arose in the context of our understanding of the models of particle production and, more specifically, the in-depth understanding of strong-interaction processes. On this point we recall that in the context of the SM discovery we track decay products of e.g. the Higgs particle in the dense cloud of newly formed strongly interacting particles. In the context of QGP we need to understand the gas cloud of hadrons into which QGP decays and hadronizes. Hadrons are always all we see at the end. They are the messengers, and we must learn to decipher the message.
Jan Rafelski   (Submitted on 25 Aug 2016)


Exotic states of nuclear matter matter too
The year 1964/65 saw the rise of several new ideas which in the following 50 years shaped the discoveries in fundamental subatomic physics: 1. The Hagedorn temperature TH, later recognized as the melting point of hadrons into 2. Quarks as building blocks of hadrons; and 3. The Higgs particle and field escape from the Goldstone theorem, allowing the understanding of weak interactions, the source of the inertial mass of the elementary particles. The topic of this paper is the Hagedorn temperature TH and the strong-interaction phenomena near TH. I present an overview of 50 years of effort with emphasis on: a) Hot nuclear and hadronic matter; b) Critical behavior near TH; c) Quark-gluon plasma (QGP); d) Relativistic heavy ion (RHI) collisions; e) The hadronization process of QGP; f) Abundant production of strangeness flavor...
A report on ‘Melting Hadrons, Boiling Quarks and TH’ relates strongly to quantum chromodynamics (QCD), the theory of quarks and gluons, the building blocks of hadrons, and its lattice numerical solutions; QCD is the quantum (Q) theory of color-charged (C) quark and gluon dynamics (D); for numerical study the space-time continuum is discretized on a ‘lattice’. Telling the story of how we learned that strong interactions are a gauge theory involving two types of particles, quarks and gluons, and the working of the lattice numerical method would entirely change the contents of this article, and be beyond the expertise of the author. I recommend instead the book by Weinberg [8], which also shows the historical path to QCD... 
Our conviction that we achieved in laboratory experiments the conditions required for melting (we can also say, dissolution) of hadrons into a soup of boiling quarks and gluons became firmer in the past 15-20 years. Now we can ask, what are the ‘applications’ of the quark-gluon plasma physics? Here is a short wish list:  
1) Nucleons dominate the mass of matter by a factor of 1000. The mass of the three 'elementary' quarks found in nucleons is about 50 times smaller than the nucleon mass. Whatever compresses and keeps the quarks within the nucleon volume is thus the source of nearly all the mass of matter. This clarifies that the Higgs field provides the mass scale to all particles that we view today as elementary; therefore only a small, percent-sized fraction of the mass of matter originates directly in the Higgs field (see Section 7.1 for further discussion). The question What is mass? can be studied by melting hadrons into quarks in RHI collisions.
2) Quarks are kept inside hadrons by the 'vacuum' properties which abhor the color charge of quarks. This explanation of 1) means that there must be at least two different forms of the modern æther that we call 'vacuum': the world around us, and the holes in it that are called hadrons. The question Can we form arbitrarily big holes filled with almost free quarks and gluons? was and remains the existential issue for the laboratory study of hot matter made of quarks and gluons, the QGP. Aficionados of lattice-QCD should take note that the presentation of two phases of matter in numerical simulations does not answer this question, as the lattice method studies the entire Universe, showing hadron properties at low temperature and QGP properties at high temperature.
3) We all agree that QGP was the primordial Big-Bang stuff that filled the Universe before ‘normal’ matter formed. Thus any laboratory exploration of the QGP properties solidifies our models of the Big Bang and allows us to ask these questions: What are the properties of the primordial matter content of the Universe? and How does ‘normal’ matter formation in early Universe work?  
4) What is flavor? In elementary-particle collisions we deal with a few, and in most cases only one, pair of newly created 2nd- or 3rd-family particles at a time. A new situation arises in the QGP formed in relativistic heavy ion collisions. QGP includes a large number of particles from the second family: the strange quarks and also the yet heavier charmed quarks; and from the third family at the LHC we expect an appreciable abundance of bottom quarks. The novel ability to study a large number of these 2nd- and 3rd-generation particles offers a new opportunity to approach experimentally the riddle of flavor.
5) In relativistic heavy ion collisions the kinetic energy of the ions feeds the growth of the quark population. These quarks ultimately turn into final-state material particles. This means that we study experimentally the mechanisms leading to the conversion of the colliding ions' kinetic energy into the mass of matter. One can wonder aloud whether this sheds some light on the reverse process: Is it possible to convert matter into energy in the laboratory? The last two points show the potential of 'applications' of QGP physics to change both our understanding of, and our place in, the world. For the present we keep these questions in mind. This review will address all the other challenges listed under points 1), 2), and 3) above; however, see also thoughts along comparable foundational lines presented in Subsections 7.3 and 7.4.
(Submitted on 13 Aug 2015 (v1), last revised 16 Sep 2015 (this version, v2))

 Snapshot of two colliding lead ions just after impact (simulation).

At a special seminar on 10 February 2000, spokespersons from the experiments on CERN's Heavy Ion programme presented compelling evidence for the existence of a new state of matter in which quarks, instead of being bound up into more complex particles such as protons and neutrons, are liberated to roam freely.
Theory predicts that this state must have existed at about 10 microseconds after the Big Bang, before the formation of matter, as we know it today, but until now it had not been confirmed experimentally. Our understanding of how the universe was created, which was previously unverified theory for any point in time before the formation of ordinary atomic nuclei, about three minutes after the Big Bang, has with these results now been experimentally tested back to a point only a few microseconds after the Big Bang. (CERN Bulletin 07/00; 14 February 2000)

jeudi 25 août 2016

The bumpy road to the discovery of quasars and massive black holes

Invitation to a Midsummer Night's Dream Reading

My intention in this talk is to share a story in which I was very fortunate to participate almost from the beginning, but which is often ignored by the young generation of astronomers. It is the story of the discovery of Massive Black Holes (MBHs). Since everybody in this assembly knows the subject well in its present stage of development, I thought it could be interesting to show how the ideas that people now take for granted had such difficulties emerging and gaining credence. I think that this subject allows one to observe, better than any other, that research is not "a long quiet river" but on the contrary evolves in a non-linear and erratic way, full of mistakes and dead ends, and that it gives rise to passionate controversies. We will see that the story of MBHs is made of fruitless searches opening onto unexpected discoveries, comebacks of visionary models which were first neglected, temporarily very fashionable but wrong models, strong debates involving even new physical laws, and misinterpretations responsible for decades of stagnation, thousands of papers and nights of the largest telescopes. But finally it led to a coherent physical model and a new vision of galaxy evolution. Since it is a long story, I have selected only a few fragments...
Suzy Collin (Submitted on 27 Apr 2006 (v1), last revised 1 Sep 2006 (this version, v3))

The article is not long in fact. I reproduce below its two figures as lobby cards to advertise its reading!

Figure 1. This radio map of NGC 6251, as published by Readhead, Cohen & Blandford in 1978, shows that a small jet 5 light-years long is aligned with a larger jet of 600 000 light-years, itself aligned with the direction of the radio lobes, separated by 9 million light-years. The fact that the two jets at the small and intermediate scales are seen only on one side, while the lobes at large scale are almost symmetrical with respect to the galaxy, proves that the side of the jet directed towards us is relativistically boosted, and therefore that the bulk velocity of the jet is very close to the velocity of light. {It was a fundamental discovery to support the cosmological distance hypothesis of quasars}
Figure 2. Cartoon produced by McCray at the Cambridge summer school in 1977 and called “Response of astrophysicists to a fashionable new idea”. I extract a few lines from his paper: “Beyond the accretion radius, r, astrophysicists are sufficiently busy to not be influenced by the fashionable new idea. But others, within r, begin a headlong plunge towards it... In their rush to be the first, they almost invariably miss the central point, and fly off on some tangent... In the vicinity of the idea, communication must finally occur, but it does so in violent collisions... Some individuals may have crossed the rationality horizon rs beyond which the fashionable idea has become an article of faith. These unfortunate souls never escape. Examples of this latter phenomenon are also familiar to all of us.”
For French-speaking readers, I warmly recommend another text by Suzy Collin-Zahn, equally pedagogical and informative, on the controversies around quasars and, more generally, around the standard cosmological model: La théorie du Big Bang rend bien compte des décalages observés, published on the rich website Science... et pseudo-sciences.

Wednesday, August 24, 2016

The seven pillars of (heuristical) wisdom


... it helps to recall the definition of an expert as a man who knows all the mistakes possible in his field*. Our whole problem is to make the mistakes as fast as possible - my part - and recognize them** - your part! Can a unifying concept in one field be applied in another? Let me call on a septet of sibyls to say yes if they will.
Sayings of the seven sibyls
(1) The Unknown is Knowable
(2) Advance by Trial and Error 
(3) Measurement and Theory are Inseparable
(4) Analogy Gives Insight
(5) New Truth Connects with Old Truth
(6) Complementarity Guards against Contradiction
(7) Great Consequences Spring from Lowly Sources
 John Archibald Wheeler

* An approximate quote, possibly attributable to Niels Bohr; the actual one is "An expert is a person who has found out by his own painful experience all the mistakes that one can make in a very narrow field", according to Edward Teller.


Addendum on August 25, 2016:
Fluctuat [et,nec] mergitur
This post is dedicated to the whole community of physicists at the LHC and is also a continuation of my comment on the blog Résonaances about the huge number of research articles devoted to the now notorious excess of events at an invariant mass around 750 GeV in pp → γγ collisions, first reported [1a, 1b] in the preliminary 2015 LHC data collected at 13 TeV, which later faded away in the new 2016 LHC data [2a, 2b].

** Some psychologists have explained why it can be difficult to admit mistakes with the theory of cognitive dissonance.

Monday, July 4, 2016

Neutrino physics cold[ol] case{s}

The 20th century (story of the) neutrino
In the recent past, two Nobel Prizes were given to neutrino physics. In 2002 Ray Davis of the USA and Masatoshi Koshiba of Japan received the Nobel Prize for Physics, while last year (2015) Arthur McDonald of Canada and Takaaki Kajita of Japan received it. To understand the importance of neutrino research it is necessary to go through the story of the neutrino in some detail.
Starting with Pauli and Fermi, the early history of the neutrino is described culminating in its experimental detection by Cowan and Reines. Because of its historical importance the genesis of the solar neutrino problem and its solution in terms of neutrino oscillation are described in greater detail. In particular, we trace the story of the 90-year-old thermonuclear hypothesis which states that the Sun and the stars are powered by thermonuclear fusion reactions and the attempts to prove this hypothesis experimentally. We go through Davis’s pioneering experiments to detect the neutrinos emitted from these reactions in the Sun and describe how the Sudbury Neutrino Observatory in Canada was finally able to give a direct experimental proof of this hypothesis in 2002 and how, in the process, a fundamental discovery i.e. the discovery of neutrino oscillation and neutrino mass was made. 
We next describe the parallel story of cosmic-ray-produced neutrinos and how their study by the Super-Kamiokande experiment in Japan won the race by discovering neutrino oscillations in 1998. 
Many other important issues are briefly discussed at the end...
Milestones in the neutrino story

  • 1930 Birth of the neutrino: Pauli 
  • 1932 Theory of beta decay, "neutrino" named: Fermi
  • 1956 First detection of the neutrino: Cowan and Reines 
  • 1962 Discovery of the muon neutrino: Lederman, Schwartz and Steinberger 
  • 1965 Detection of atmospheric neutrinos: KGF 
  • 1970 Start of the solar neutrino experiment: Davis 
  • 1987 Detection of neutrinos from a supernova: Kamiokande 
  • 1998 Discovery of neutrino oscillation and mass: Super-Kamiokande
  • 2000 Discovery of the tau neutrino: DONUT 
  • 2002 Solution of the solar neutrino puzzle: SNO 
  • 2005 Detection of geoneutrinos: KamLAND 
  • 2013 Detection of ultra-high-energy neutrinos from space: IceCube 

G Rajasekaran (Institute of Mathematical Sciences, Chennai & Chennai Mathematical Institute) (Submitted on 22 Jun 2016)
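As an aside: both the 1998 atmospheric and the 2002 solar milestones above rest on the same two-flavour oscillation formula, P(ν→ν) = 1 − sin²(2θ)·sin²(1.27 Δm²[eV²]·L[km]/E[GeV]). A minimal numerical sketch (the parameter values below are merely illustrative, close to the atmospheric best fit):

```python
import math

def survival_probability(sin2_2theta, dm2_ev2, L_km, E_GeV):
    """Two-flavour survival probability P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)."""
    phase = 1.27 * dm2_ev2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Illustrative atmospheric-sector parameters:
sin2_2theta = 1.0   # near-maximal mixing
dm2 = 2.5e-3        # eV^2

# First oscillation maximum: phase = pi/2, i.e. L/E = pi / (2 * 1.27 * dm2)
L_over_E = math.pi / (2 * 1.27 * dm2)
p = survival_probability(sin2_2theta, dm2, L_over_E, 1.0)  # L in km, for a 1 GeV neutrino

print(L_over_E)  # ~495 km per GeV
print(p)         # ~0: almost complete disappearance at the first maximum
```

For atmospheric neutrinos crossing the Earth (L up to ~13 000 km), many oscillation lengths fit in the baseline, which is why Super-Kamiokande saw the up/down asymmetry so clearly.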

A potential 21st century counterpart...
What exactly is Dark Matter? New theories for what really constitutes Dark Matter appear to make the news headlines every week. At a slower pace, these theories are slowly being eliminated. We revisit this scientific thriller and make the case that condensed neutrino matter is a leading suspect. We provide a forensic discussion of some subtle evidence and show that independent experimental results due out in 2019 from the KATRIN experiment [1] will either be the definitive result or eliminate condensed neutrinos as a Dark Matter candidate... The ... experiment ... will have the sensitivity to determine the mass of the electron antineutrino down to 0.35 eV/c² ... This mass range for the electron antineutrino is in direct contradiction to the upper bound claimed by the Planck satellite consortium. If KATRIN discovers a neutrino mass in this range, we contend that the cosmological blackbody radiation raw data analysis must be revisited and that it would be a major finding endorsing condensed neutrinos as the so-called Dark Matter, which everyone has been looking for.
(Submitted on 27 Jun 2016)
... and another speculative (rival?) one here



Thursday, June 16, 2016

Gravitational wave astronomy stays pitch-black (up to now)

No electromagnetic counterparts from optical wavelengths...
We present a search for an electromagnetic counterpart of the gravitational wave source GW151226. Using the Pan-STARRS1 telescope we mapped out 290 square degrees in the optical i_ps filter over a period starting 11.45hr after the LIGO information release (49.48hr after the GW trigger) and lasting for a further 28 days. We typically reached sensitivity limits of i_ps=20.3-20.8 and covered 26.5% of the LIGO probability skymap. We supplemented this with ATLAS survey data, reaching 31% of the probability region to shallower depths of m~19. We found 49 extragalactic transients (that are not obviously AGN), including a faint transient in a galaxy at 7Mpc (a luminous blue variable outburst) plus a rapidly decaying M-dwarf flare. Spectral classification of 20 other transient events showed them all to be supernovae. We found an unusual transient, PS15dpn, with an explosion date temporally coincident with GW151226 which evolved into a type Ibn supernova. The redshift of the transient is secure at z=0.1747 +/- 0.0001 and we find it unlikely to be linked, since the luminosity distance has a negligible probability of being consistent with that of GW151226. In the 290 square degrees surveyed we therefore do not find a likely counterpart. However we show that our survey strategy would be sensitive to Neutron Star-Neutron Star mergers producing kilonovae at D < 100 Mpc, which is promising for future LIGO/Virgo searches.
S. J. Smartt et al, (Submitted on 15 Jun 2016)


 ... to gamma-ray ones
We present the Fermi Gamma-ray Burst Monitor (GBM) and Large Area Telescope (LAT) observations of the LIGO binary black hole merger event GW151226 and candidate LVT151012. No candidate electromagnetic counterparts were detected by either the GBM or LAT. We present a detailed analysis of the GBM and LAT data over a range of timescales from seconds to years, using automated pipelines and new techniques for characterizing the upper limits across a large area of the sky. Due to the partial GBM and LAT coverage of the large LIGO localization regions at the trigger times for both events, differences in source distances and masses, as well as the uncertain degree to which emission from these sources could be beamed, these non-detections cannot be used to constrain the variety of theoretical models recently applied to explain the candidate GBM counterpart to GW150914.
J. L. Racusin et al, (Submitted on 15 Jun 2016)

Wednesday, June 15, 2016

GW151226: Second direct detection (first replication) of a gravitational wave!

Live : https://iframe.dacast.com/b/59062/c/268750 !






The first available animation !




The Paper !
We report the observation of a gravitational-wave signal produced by the coalescence of two stellar-mass black holes. The signal, GW151226, was observed by the twin detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) on December 26, 2015 at 03:38:53 UTC. The signal was initially identified within 70 s by an online matched-filter search targeting binary coalescences. Subsequent off-line analyses recovered GW151226 with a network signal-to-noise ratio of 13 and a significance greater than 5σ. The signal persisted in the LIGO frequency band for approximately 1 s, increasing in frequency and amplitude over about 55 cycles from 35 to 450 Hz, and reached a peak gravitational strain of 3.4 (+0.7/-0.9) × 10^-22. The inferred source-frame initial black hole masses are 14.2 (+8.3/-3.7) M⊙ and 7.5 (+2.3/-2.3) M⊙, and the final black hole mass is 20.8 (+6.1/-1.7) M⊙. We find that at least one of the component black holes has spin greater than 0.2. This source is located at a luminosity distance of 440 (+180/-190) Mpc, corresponding to a redshift of 0.09 (+0.03/-0.04). All uncertainties define a 90% credible interval. This second gravitational-wave observation provides improved constraints on stellar populations and on deviations from general relativity.
GW151226: Observation of Gravitational Waves from a 22-Solar-Mass Binary Black Hole Coalescence B. P. Abbott et al.* (LIGO Scientific Collaboration and Virgo Collaboration) (Received 31 May 2016; published 15 June 2016) 
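As a quick sanity check, the quoted central values tie together through the chirp mass, M_c = (m₁m₂)^(3/5)/(m₁+m₂)^(1/5), the mass combination that governs the inspiral's frequency evolution. A back-of-the-envelope sketch (central values only, ignoring the asymmetric uncertainties quoted in the abstract):

```python
# Chirp mass and radiated energy from the GW151226 central values quoted above.
m1, m2 = 14.2, 7.5   # source-frame component masses, in solar masses
m_final = 20.8       # final black hole mass, in solar masses

chirp_mass = (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2
e_radiated = (m1 + m2) - m_final   # mass-energy radiated as gravitational waves

print(round(chirp_mass, 1))  # ~8.9 solar masses
print(round(e_radiated, 1))  # ~0.9 solar masses radiated away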

Wednesday, June 8, 2016

Higgs times seven (minus one) / sept moins une fois le boson de Higgs

750 GeV = 6×125 GeV!

The first LHC data on pp collisions at √s = 13 TeV agree with the Standard Model (SM), except for a hint of an excess in pp → γγ peaked at an invariant mass around 750 GeV [1]. We denote the new resonance with the symbol {digamma}, used in archaic Greek as the digamma letter and later as the number 6 ≈ M{digamma}/Mh, but which disappeared twice... Unlike many other anomalies that disappeared, the γγ excess cannot be caused by a systematic issue, neither experimental nor theoretical. Theoretically, the SM background is dominated by tree-level qq̄ → γγ scatterings, which cannot make a γγ resonance [see {below} for an attempt at a Standard Model interpretation]. Experimentally, one just needs to identify two photons and measure their energy and direction. The γγ excess is either the biggest statistical fluctuation in decades, or a major discovery.
(Submitted on 30 May 2016)


750 GeV scalar boson = (6 top quarks + 6 antitop quarks) bound state?
We shall here explore the possibility that the diphoton excess in the inclusive γγ spectrum, recently found by the ATLAS and CMS collaborations [1, 2], with a mass of 750 GeV can be a bound state of particles already present in the Standard Model, namely a bound state of 6 top + 6 antitop quarks. Thus we would need no new fundamental particles, interactions or free parameters beyond the Standard Model to explain this peak, which otherwise looks like “new physics”!  
For several years we have worked on the somewhat controversial idea [3, 4, 5, 6, 7, 8] that the exchange of Higgses and gluons between 6 top and 6 antitop quarks provides sufficiently strong attraction between these quarks for a very light (compared to the mass of 12 top quarks) bound state S to be formed. The 6 tops + 6 antitops are all supposed to be in the 1s state in the atomic physics notation and, because of there being just 3 colors and 2 spin states for a top-quark, this is the maximum number allowed in the 1s shell. 
Further speculations around this bound state were mostly built up under the assumption of a hoped-for new principle – the multiple point principle [9, 10, 11] – from which we actually predicted the mass of the Higgs boson long before it was found [12]. This principle says that there shall be several phases of space (i.e. several vacua) with the same energy density. One of these should have a condensate of the bound states S. It was even speculated then that such a condensate – or new vacuum – could form the interior of balls, containing highly compressed ordinary matter, which make up the dark matter [13, 14, 15]. Thus the discovery, if confirmed, of the bound state S could support a theory in which dark matter is incorporated into a pure Standard Model theory, only adding the multiple point principle, which predicts the values of coupling constants but otherwise involves no new physics.
(Submitted on 12 May 2016)

Tuesday, April 12, 2016

Riding on a laser beam ...

...  to chase the Starshot interstellar flight dream?


In the nearly 60 years of spaceflight we have accomplished wonderful feats of exploration that have shown the incredible spirit of the human drive to explore and understand our universe. Yet in those 60 years we have barely left our solar system, with the Voyager 1 spacecraft, launched in 1977, finally leaving it after 37 years of flight at a speed of 17 km/s, or less than 0.006% of the speed of light. As remarkable as this is, we will never reach even the nearest stars with our current propulsion technology in even 10 millennia. We have to radically rethink our strategy, give up our dreams of reaching the stars, or wait for technology that does not currently exist. While we all dream of human spaceflight to the stars in a way romanticized in books and movies, it is not within our power to do so, nor is it clear that this is the path we should choose. We posit a technological path forward that, while not simple, is within our technological reach. We propose a roadmap to a program that will lead to sending relativistic probes to the nearest stars and will open up a vast array of possibilities of flight both within our solar system and far beyond: spacecraft ranging from gram-level complete spacecraft on a wafer ("wafersats") that reach more than 1/4 c and the nearest star in 20 years, to spacecraft with masses of more than 10^5 kg (100 tons) that can reach speeds greater than 1000 km/s. These systems can be propelled to speeds currently unimaginable with existing propulsion technologies. To do so requires a fundamental change in our thinking of both propulsion and, in many cases, of what a spacecraft is. In addition to larger spacecraft, some capable of transporting humans, we consider functional spacecraft on a wafer, including integrated optical communications, imaging systems, photon thrusters, power and sensors combined with directed energy propulsion.

...Directed energy systems are ubiquitous, used throughout science and industry to melt or vaporize solid objects, such as for laser welding & cutting, as well as in defense. Recent advances in photonics now allow for a 2D array of phase-locked laser amplifiers fed by a common low-power seed laser that have already achieved nearly 50% wall-plug conversion efficiency. This is known as a MOPA (Master Oscillator Power Amplifier) design. 
Schematic design of phased array laser driver. Wavefront sensing from both local and extended systems combined with the system metrology are critical to forming the final beam.

The technology is proceeding at a "Moore's Law"-like pace, with a mass-to-power ratio of 5 kg/kW and a 1 kW amplifier not much larger than a textbook. There is already a roadmap to reduce this to 1 kg/kW in the next 5 years, and discussions about advancing key aspects of the technology to higher TRL are beginning. These devices are revolutionizing directed-energy applications and have the potential to revolutionize many related applications. Thanks to the phased-array technology, the system can simultaneously send out multiple beams and is thus inherently capable of simultaneous multitasking as well as multi-modal operation.
The laser system can be built and tested at any level, from desktop to extremely large platforms. This is radically different from the older "use a huge laser" approach to photon propulsion; it is the equivalent of modern parallel processing versus an older single-processor supercomputer.




Parameters for a full class 4 [a 10 km array] system with a 1 g wafer SC and a 1 m sail. The craft achieves 0.2 c in about 10 min (assuming an extended illumination) and takes about 20 years to get to Alpha Centauri. The communications rate assumes the class 4 drive array is also used for reception, with a 1 watt short burst from a 100 mm wafer SC. Here we use the 1 meter drive reflector as the transmit and receive optical system on the spacecraft, and we assume a photon/bit ratio near unity; in this case we get a data rate at Alpha Centauri of about 65 kb/s. In the previous figure, for the same wafer-scale spacecraft, the only optical system on the spacecraft was the 100 mm wafer: the data rate received at the Earth from Alpha Centauri is then about 0.65 kb/s during the burst, assuming we can use the DE-STAR 4 driver as the receiver and only the wafer itself as the transmission optic. The plot above shows a much more conservative photon/bit ratio of 40; unity has been achieved, but never over the extremely long distances discussed here.
(Submitted on 5 Apr 2016 (v1), last revised 7 Apr 2016 (this version, v2))
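A back-of-the-envelope check of the numbers in the caption above (assuming the idealized values stated there, and nonrelativistic kinematics for simplicity):

```python
C = 2.998e8      # speed of light, m/s
G_ACCEL = 9.81   # standard gravity, m/s^2
LY = 9.461e15    # one light-year, m

v = 0.2 * C                         # target speed: 0.2 c
t_boost = 10 * 60                   # ~10 min of laser illumination, s
accel_g = (v / t_boost) / G_ACCEL   # boost acceleration in units of g

t_cruise_yr = (4.37 * LY / v) / 3.156e7  # coast time to Alpha Centauri (4.37 ly away)

# At fixed range and receiver, the received photon rate (hence data rate)
# scales as the transmit-aperture diameter squared:
rate_gain = (1.0 / 0.1) ** 2   # 1 m sail optic vs the bare 100 mm wafer

print(round(accel_g))      # ~10000 g during the boost
print(round(t_cruise_yr))  # ~22 years, consistent with "about 20 years"
print(rate_gain)           # 100: why 65 kb/s vs 0.65 kb/s in the two configurations
```

The ~10⁴ g boost is, of course, only survivable by wafer-scale electronics, which is precisely the point of the wafersat concept.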


//added on April 13, 2016: I guess some of the advances in photonics are spin-off applications of the former Strategic Defense Initiative, but it is fascinating that they could be the backbone of the new Yuri Milner scientific/speculative Breakthrough initiative publicized yesterday.
Anyway, the future of large phased-array lasers in Earth orbit could definitely lie with the Directed Energy System for Targeting of Asteroids and exploRation, at a time when nuclear ballistic missiles are likely no longer the most pressing threat...

Sunday, April 3, 2016

After astrophysics and condensed matter theory, holography meets biology

No comment // almost
Holographic Biology 

There are successful applications of the holographic AdS/CFT correspondence to high energy and condensed matter physics. We apply the holographic approach to photosynthesis, an important example of nontrivial quantum phenomena relevant for life, which is being studied in the emerging field of quantum biology. Light-harvesting complexes of photosynthetic organisms are many-body quantum systems, in which quantum coherence has recently been experimentally shown to survive for relatively long time scales, even at physiological temperature, despite the decohering effects of their environments. We use the holographic approach to evaluate the time dependence of entanglement entropy and quantum mutual information in the Fenna-Matthews-Olson (FMO) protein-pigment complex in green sulfur bacteria during the transfer of an excitation from a chlorosome antenna to a reaction center. It is demonstrated that the time evolution of the mutual information simulating the Lindblad master equation can in some cases be obtained by means of a dual gravity description of black hole formation in the AdS-Vaidya spacetime. The wake-up and scrambling times for various partitions of the FMO complex are discussed.
(Submitted on 30 Mar 2016)

Holographic Mystique

Thus far, in spite of many interesting developments, the overall progress towards a systematic study and classification of various ’strange’ metallic states of matter has been rather limited. To that end, it was argued that a recent proliferation of the ideas of holographic correspondence originating from string theory might offer a possible way out of the stalemate. However, after almost a decade of intensive studies into the proposed extensions of the holographic conjecture to a variety of condensed matter problems, the validity of this intriguing approach remains largely unknown. This discussion aims at ascertaining its true status and elucidating the conditions under which some of its predictions may indeed be right (albeit, possibly, for a wrong reason).

... some limited form of a bulk-boundary relationship might, in fact, be quite robust and hold regardless of whether or not the systems in question possess any particular symmetries, unlike in the original AdS/CFT construction. Naively, this form of correspondence can even be related to Einstein's equivalence principle (i.e., ’curvature equals force’), according to which free motion in a curved space should be indistinguishable from the effect of a physical interaction (only, this time around, in the tangential direction). 
...
Together with the systematic comparison between the predictions of the condensed matter theory holography and other, more traditional, approaches and/or experimental data it would be a necessary step towards vetting the intriguing, yet speculative, holographic ideas before the latter can be rightfully called a novel technique for tackling the ’strange metals’ and other strongly correlated materials. In any event, though, it would seem rather unlikely that a hands-on expertise in string theory will become a mandatory prerequisite for understanding the cuprates.
(Submitted on 31 Mar 2016)

Holographic Universe
... the AdS/CFT correspondence and holographic dualities have aroused immense enthusiasm in the string theory community. This constitutes, after all, normal scientific research. The phenomenon is still puzzling [45]. At a minimum, the holographic duality is an interesting tool for calculating fundamental physics. The dictionary the duality offers—between a world in flat space-time and a curved world with gravity—works in both directions. Some calculations are simpler with supergravity than in dual gauge theory. 
Gauge/gravity duality has enhanced the stature of Albert Einstein’s own theory...But, despite its recognized elegance, general relativity has been used by only a small portion of the scientific community. This is not surprising. After all, general relativity seemed confined to cases of strongly curved space-time: compact stars, the big bang, gravitational waves. Its effects were utterly negligible at the scales at work in condensed matter physics and nuclear physics. Why should gravity play a role in the quantum world? Yet, over the last twenty years general relativity has finally penetrated the world of modern physics. Specialists in condensed matter, nuclear physics, fluid turbulence, and quantum information are actively interested in general relativity. 
Why this dramatic turnaround? 
As science progresses, the virtues of cross-fertilization between different areas of knowledge have become widely appreciated. But this is the result rather than the cause. The key factor has been the AdS/CFT correspondence. Thanks to general relativity and its string theory extensions, one can now describe phenomena that have nothing to do with gravity in strong fields. 
On the other hand, the AdS/CFT correspondence has not been mathematically demonstrated. The holographic principle remains a conjecture. 
Its degree of experimental verification is zero. 
String theorists believe in it because their theory supports a specific version of holography, and, under certain important restrictions, black hole thermodynamics suggests it as well. But to conclude that it is a correct representation of nature is an enormous leap. 
We still do not know if string theory is correct. Let us suppose it is. Different formulations of the holographic principle have been tested only in situations that do not correspond to our world. The ensuing equations describe possible worlds that are similar, but not identical to our own. Extant solutions have allowed us to test the principles of string theory in limiting cases and to show the consistency of the theory, but never in situations corresponding exactly to the world in which we live. 
Suppose, on the other hand, that string theory proves false. What would become of the AdS/CFT correspondence? In loop quantum gravity, space-time emerges as the coarse-graining of fundamental structures made discrete, the atoms of space. The formation of a black hole and its ultimate evaporation are described by a unitary process that respects the laws of quantum mechanics [46]. Similarly, the black holes described in the approach to non-commutative geometry do not evaporate completely and therefore escape the information paradox [47]. The holographic conjecture has nonetheless improved the epistemological status of black holes. 
Their status was not always so elevated. The first exact solution of general relativity describing the space-time of a black hole was discovered by Karl Schwarzschild in 1916. Until the 1950s, general relativity theorists were embarrassed by black holes because of their singularities. Later they seemed esoteric objects, hardly believable. Then, in the decades between 1960 and 1990, they became both relevant to astrophysics, and fascinating in their own right. As we have seen, black holes have proven key for understanding quantum gravity, and the deep dualities between distant fields of theoretical physics. Perhaps someday they will become ubiquitous, because they have become useful in the description of everyday systems.  
Volume Two, Issue One, published February 9, 2016




Friday, February 26, 2016

Oh, gravitational waves, jingle, jingle in all the bands!

Dashing through space and time...
We show that the black hole binary (BHB) coalescence rates inferred from the advanced LIGO (aLIGO) detection of GW150914 imply an unexpectedly loud GW sky at milli-Hz frequencies accessible to the evolving Laser Interferometer Space Antenna (eLISA), with several outstanding consequences. First, up to thousands of BHBs will be individually resolvable by eLISA; second, millions of non-resolvable BHBs will build a confusion noise detectable with a signal-to-noise ratio of a few to hundreds; third – and perhaps most importantly – up to hundreds of BHBs individually resolvable by eLISA will coalesce in the aLIGO band within ten years. eLISA observations will tell aLIGO and all electromagnetic probes weeks in advance when and where these BHB coalescences are going to occur, with uncertainties of <10 s and <1 deg². This will allow the pre-pointing of telescopes to realize coincident GW and multi-wavelength electromagnetic observations of BHB mergers. Time coincidence is critical because prompt emission associated with a BHB merger will likely have a duration comparable to the dynamical time-scale of the system, and is only possible with low-frequency GW alerts.



The multi-band GW astronomy concept. The violet lines are the total sensitivity curves (assuming two Michelson) of three eLISA configurations; from top to bottom N2A1, N2A2, N2A5 (from [11]). The orange lines are the current (dashed) and design (solid) aLIGO sensitivity curves. The lines in different blue flavours represent characteristic amplitude tracks of BHB sources for a realization of the flat population model (see main text) seen with S/N> 1 in the N2A2 configuration (highlighted as the thick eLISA middle curve), integrated assuming a five year mission lifetime. The light turquoise lines clustering around 0.01Hz are sources seen in eLISA with S/N< 5 (for clarity, we down-sampled them by a factor of 20 and we removed sources extending to the aLIGO band); the light and dark blue curves crossing to the aLIGO band are sources with S/N> 5 and S/N> 8 respectively in eLISA; the dark blue marks in the upper left corner are other sources with S/N> 8 in eLISA but not crossing to the aLIGO band within the mission lifetime. For comparison, the characteristic amplitude track completed by GW150914 is shown as a black solid line, and the chart at the top of the figure indicates the frequency progression of this particular source in the last 10 years before coalescence. The shaded area at the bottom left marks the expected confusion noise level produced by the same population model (median, 68% and 95% intervals are shown). The waveforms shown are second order post-Newtonian inspirals phenomenologically adjusted with a Lorentzian function to describe the ringdown.
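The frequency chart mentioned in the caption can be checked against the leading-order (quadrupole) time-to-coalescence formula, t = (5/256) (G·M_c/c³)^(-5/3) (πf)^(-8/3), where f is the gravitational-wave frequency. A rough sketch with GW150914-like masses (36 and 29 M⊙; redshift and higher post-Newtonian orders ignored):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg
YEAR = 3.156e7    # seconds per year

def time_to_coalescence(m1_sun, m2_sun, f_gw_hz):
    """Leading-order (quadrupole) time to merger from the GW frequency f_gw."""
    m1, m2 = m1_sun * M_SUN, m2_sun * M_SUN
    mc = (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2   # chirp mass, kg
    tau = G * mc / C ** 3                      # chirp mass expressed in seconds
    return (5.0 / 256.0) * tau ** (-5.0 / 3.0) * (math.pi * f_gw_hz) ** (-8.0 / 3.0)

# A GW150914-like binary sitting in the heart of the eLISA band, at 10 mHz:
t_years = time_to_coalescence(36.0, 29.0, 0.01) / YEAR
print(round(t_years))  # ~17 years from 10 mHz to merger
```

So a GW150914-like source emitting at ~10 mHz today merges in the aLIGO band within a couple of decades, which is the whole multi-band argument in a nutshell.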

...  in a three-arm LISA!
The observation of GW150914 brings unexpected prospects in multi-band GW astronomy, providing even more compelling evidence that a milli-Hz GW observatory will not only open a new window on the Universe, but will also naturally complete and enhance the payoffs of the high-frequency window probed by aLIGO. The scientific potential of multi-band GW astronomy is enormous, ranging from multimessenger astronomy, cosmology and ultra-precise gravity tests with BHBs, to the study of the cosmological BHB merger rate, and to the mutual validation of the calibration of the two GW instruments. This is a unique new opportunity for the future of GW astronomy, and how much of this potential will be realized in practice depends on the choice of the eLISA baseline. Should an extremely de-scoped design like the New Gravitational Observatory (NGO) [27] be adopted, all the spectacular scientific prospects outlined above will likely be lost. Re-introducing the third arm (i.e. six laser links) and increasing the arm-length to at least two million kilometres (A2) will allow observation of more than 50 resolved BHBs with both eLISA and aLIGO, and the detection of the unresolved confusion noise with S/N > 30. We also stress that the most interesting systems emit at f > 10^-2 Hz, a band essentially ’clean’ of other sources. There, the eLISA sensitivity critically depends on the shot noise, which is determined by the number of photons collected at the detector mirrors. It is therefore important to reconsider the designed mirror size and laser power in light of the novel appealing prospect of observing more of these BHBs, and with a higher S/N.
(Submitted on 22 Feb 2016)
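The mirror-size argument at the end of the quote can be made quantitative with a simple scaling sketch (not the collaboration's noise budget, just the leading dependence): in a diffraction-limited laser link with equal telescope diameters D, the received power scales as P_rx ∝ P_tx·D⁴/(λL)², and the shot-noise-limited sensitivity scales as 1/√P_rx.

```python
def shot_noise_gain(d_ratio, p_ratio):
    """Relative improvement in shot-noise-limited sensitivity.

    Assumes a diffraction-limited link with equal telescope diameters:
    received power P_rx ~ P_tx * D**4 / (lambda * L)**2, and shot-noise
    amplitude ~ 1 / sqrt(P_rx).
    d_ratio: new/old mirror diameter, p_ratio: new/old laser power.
    """
    p_rx_gain = p_ratio * d_ratio ** 4
    return p_rx_gain ** 0.5

# Doubling the mirror diameter alone: 16x more photons, 4x lower shot noise.
print(shot_noise_gain(2.0, 1.0))            # 4.0
# Doubling the laser power alone only buys sqrt(2) ~ 1.41.
print(round(shot_noise_gain(1.0, 2.0), 2))  # 1.41
```

This is why the quote singles out the mirror size: aperture enters the photon budget to the fourth power, laser power only linearly.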

Given its tremendous potential for fundamental physics and astrophysics, the European Space Agency (ESA) has selected the observation of the Universe at GW frequencies around one mHz as one of the three main science themes of the “Cosmic Vision Program” [42]. Indeed, a call for mission proposals for the “Gravitational Universe” science theme is expected for late 2016, and the L3 launch slot in 2034 has been reserved for the selected mission. The main candidate mission for this call (for which a decision will be made by 2018-19, so as to allow sufficient time for industrial production before the nominal 2034 launch date) is the evolving Laser Interferometer Space Antenna (eLISA) [43], named after the “classic LISA” concept of the late 90’s and early 2000s [44]. The eLISA mission concept consists of a constellation of three spacecraft, trailing the Earth around the Sun at a distance of about fifteen degrees. Each spacecraft will contain one or two test masses in almost perfect free fall, and laser transponders which will allow measurements of the relative proper distances of the test masses in different spacecraft via laser interferometry. This will allow the detection of the effect of possible GW signals (which would change the distance between the test masses). The most technically challenging aspect of the mission will be to maintain the test masses in almost perfect free fall. For this reason, a scaled-down version of one of eLISA’s laser links will be tested by the “LISA Pathfinder” mission. Pathfinder was launched by ESA in December 2015, and it will provide crucial tests of how well eLISA’s low frequency acceleration noise can be suppressed. 
... 
There are, however, other aspects of the eLISA mission that are yet to be evaluated and decided upon by ESA, within the constraints imposed by the allocated budget for the “Gravitational Universe” science theme. A “Gravitational Observatory Advisory Team” (GOAT) [45] has been established by ESA to advise on the scientific and technological issues pertaining to an eLISA-like mission. Variables that affect the cost of the mission include:
(i) the already mentioned low-frequency acceleration noise;
(ii) the mission lifetime, expected to range between one and several years. Longer durations involve higher costs, because each component has to be thoroughly tested for the minimum duration of the mission, and may also require higher fuel consumption: the orbital stability of the triangular constellation sets an upper limit on the mission duration, so achieving a longer mission may require placing the constellation further from the Earth;
(iii) the length L of the constellation arms, which may range from one to several million km, with longer arms involving higher costs to put the constellation into place and to maintain a stable orbit with slowly varying distances between the spacecraft;
(iv) the number of laser links between the spacecraft, i.e. the number of “arms” of the interferometer (four links correspond to two arms, i.e. a single interferometer; six links to three arms, i.e. two independent interferometers at low frequencies [46]). Giving up the third arm would cut costs (mainly laser power and industrial production costs), while possibly hurting science capabilities (especially source localization) and leaving no redundancy in case of technical faults in one of the laser links.

All the black-hole mergers we take?
Gravitational waves penetrate all of cosmic history, which allows eLISA to explore scales, epochs, and new physical effects not accessible in any other way (see the figure below). Indeed, a detectable gravitational-wave background in the eLISA band is predicted by a number of new physical ideas for early cosmological evolution (Hogan, 2006; Maggiore, 2000). Two important mechanisms for generating stochastic backgrounds are phase transitions in the early Universe and cosmic strings.
... the eLISA frequency band of about 0.1 mHz to 100 mHz today corresponds to the horizon at and beyond the Terascale frontier of fundamental physics. This allows eLISA to probe bulk motions at times about 3×10^-18 – 3×10^-10 seconds after the Big Bang, a period not directly accessible with any other technique. Taking a typical broad spectrum into account, eLISA has the sensitivity to detect cosmological backgrounds caused by new physics active in the range of energy from 0.1 TeV to 1000 TeV, if more than a modest fraction Ω_GW of about 10^-5 of the energy density is converted to gravitational radiation at the time of production.
Various sources of gravitational-wave background of cosmological origin are presented in detail in Binétruy et al. (2012)...
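The time and energy figures quoted above can be sanity-checked with standard radiation-era relations. This is my own sketch, not from the quoted text; g* = 100 relativistic degrees of freedom and a horizon-scale frequency f ~ H/2π are simplifying assumptions, so agreement is only expected at the order-of-magnitude level.

```python
import math

# Radiation-dominated era bookkeeping (hbar = c = k_B = 1 internally):
# map a temperature T to (a) the cosmic time t(T) and (b) the present-day,
# redshifted frequency of a horizon-scale gravitational-wave signal.

M_PL   = 1.22e19    # Planck mass [GeV]
T0     = 2.35e-13   # CMB temperature today (2.725 K) in GeV
HBAR_S = 6.58e-25   # hbar [GeV s], converts energies to rates
G_STAR = 100.0      # relativistic degrees of freedom at T (assumed)
G_S0   = 3.91       # entropy degrees of freedom today

def hubble(T_gev):
    """Hubble rate H(T) in GeV (radiation era)."""
    return 1.66 * math.sqrt(G_STAR) * T_gev**2 / M_PL

def cosmic_time(T_gev):
    """Cosmic time t = 1/(2H) in seconds."""
    return HBAR_S / (2.0 * hubble(T_gev))

def f0_horizon(T_gev):
    """Horizon-scale frequency at T, redshifted to today, in Hz."""
    redshift = (G_S0 / G_STAR) ** (1.0 / 3.0) * T0 / T_gev   # a(T)/a(today)
    return hubble(T_gev) / (2.0 * math.pi) * redshift / HBAR_S

for T in (1e2, 1e3, 1e6):   # 0.1 TeV, 1 TeV, 1000 TeV
    print(f"T = {T:.0e} GeV:  t ~ {cosmic_time(T):.1e} s,  f0 ~ {f0_horizon(T):.1e} Hz")
```

For T around a TeV this gives t ~ 10^-13 s and f0 of a few times 10^-5 Hz; realistic sources (e.g. a phase transition) typically peak well above the horizon-scale frequency, which brings the Terascale squarely into the 0.1–100 mHz band.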

The observed (redshifted) frequency of wave-generating phenomena is shown as a function of cosmic scale factor a, with the present epoch at the right. The redshifted Hubble rate (horizon scale) is shown in black for a standard Grand Unified Theory (GUT) and a lower temperature Terascale (TeV) inflationary cosmology. Blue regions are accessible to electromagnetic (EM) observations: the Universe since recombination (right box) and cosmic microwave background (CMB) fluctuations (left box). The red bar shows the range of cosmic history accessible through eLISA from processes within the horizon up to about 1000 TeV.

Among the most promising science targets of the mission are supermassive black holes, which appear to be a key component of galaxies. They are ubiquitous in nearby bright galaxies and share a common evolution. The intense accretion phase that supermassive black holes experience when shining as quasi-stellar objects and active galactic nuclei erases information on how and when the black holes formed. eLISA will unravel precisely this information. Very massive black holes are expected to transit into the mass interval to which eLISA is sensitive over the course of their cosmic evolution. eLISA will then map and mark the loci where galaxies form and cluster, using black holes as clean tracers of their assembly, by capturing gravitational waves emitted during their coalescence that travelled undisturbed from the sites where they originated. On the other hand, middleweight black holes of ~10^5 M⊙ are observed in the nearby universe, but our knowledge of these systems is rather incomplete. eLISA will investigate a mass interval that is not accessible to current electromagnetic techniques, which is fundamental to understanding the origin and growth of supermassive black holes. Thanks to the transparency of the universe to gravitational waves at any redshift, eLISA will explore black holes of 10^5 – 10^7 M⊙ out to redshift z ≤ 20, tracing the growth of the black hole population.
eLISA will also shed light on the path of black holes to coalescence in a galaxy merger. This is a complex process, as various physical mechanisms involving the interaction of the black holes with stars and gas need to be at play and work effectively, acting on different scales (from kpc down to 10^-3 pc). Only at the smallest scales are gravitational waves the dominant dissipative process driving the binary to coalescence. eLISA will trace this last phase of the evolution. Dual active galactic nuclei (AGN), i.e. active black holes observed during their pairing phase, offer a view of what we may call the galactic precursors of black-hole binary coalescences. They are now being discovered in increasing numbers in large surveys. By contrast, evidence for binary and recoiling AGN is scarce, as the true nature of a number of candidates is not yet fully established. Only eLISA will offer the unique view of an imminent binary merger by capturing its loud gravitational-wave signal...
Current electromagnetic observations are probing only the tip of the massive black hole distribution in the universe, targeting black holes with large masses, between 10^7 and 10^9 M⊙. Conversely, eLISA will be able to detect the gravitational waves emitted by black-hole binaries with total mass (in the source rest frame) as small as 10^4 M⊙ and up to 10^7 M⊙, out to a redshift as remote as z ∼ 20. eLISA will detect fiducial sources out to redshift z ∼ 10 with SNR ∼ 10, and so it will explore almost all of the mass–redshift parameter space relevant for addressing scientific questions on the evolution of the black hole population. Redshifted masses will be measured to an unprecedented accuracy, up to the 0.1–1% level, whereas absolute errors in the spin determination are expected to be in the range 0.01–0.1, allowing us to reconstruct the cosmic evolution of massive black holes. eLISA observations hence have the potential of constraining the astrophysics of massive black holes along their entire cosmic history, in a mass and redshift range inaccessible to conventional electromagnetic observations.
On smaller scales, eLISA will also bring a revolutionary new perspective, in this case on the study of galactic nuclei. eLISA will offer the deepest view of galactic nuclei, exploring regions to which we are blind with current electromagnetic techniques, and probing the dynamics of stars in the space-time of a Kerr black hole by capturing the gravitational waves emitted by stellar black holes orbiting the massive black hole. Extreme mass ratio inspiral (EMRI) detections will allow us to infer properties of the stellar environment around a massive black hole, so that our understanding of stellar dynamics in galactic nuclei will be greatly improved. Detection of EMRIs from black holes in the eLISA mass range, which includes black holes similar to the Milky Way's, will enable us to probe the population of central black holes in an interval of masses where electromagnetic observations are challenging...
General Relativity has been extensively tested in the weak-field regime, both in the solar system and with binary pulsars. eLISA will provide a unique opportunity to confront GR in the highly dynamical strong-field regime of massive black holes. eLISA will be capable of detecting the inspiral and/or merger plus ring-down parts of the gravitational-wave signal from coalescing massive black hole binaries of comparable mass. For nearby events (z ∼ 1) the last several hours of the gravitational-wave signal will be clearly seen in the data, allowing direct comparison with the waveforms predicted by GR. The inspiral phase could be observed by eLISA for up to a year before the final merger, with relatively large SNR. Comparison of the observed inspiral rate with the predictions of GR will provide a valuable test of the theory in the regime of strong, dynamical gravitational fields.
The merger of two black holes could be observed by eLISA anywhere in the Universe, provided the signal falls into the detector band.
Pau Amaro-Seoane et al.
(Submitted on 17 Jan 2012)
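The "up to a year before the final merger" remark above is easy to check with the leading-order (Newtonian quadrupole) chirp formula. The sketch below is mine, with round illustrative numbers: two 5×10^5 M⊙ black holes picked up at 0.1 mHz.

```python
import math

# Leading-order time to coalescence from GW frequency f:
#   tau = (5/256) * (G*Mc/c^3)^(-5/3) * (pi*f)^(-8/3)
# where Mc is the chirp mass. Standard textbook result, not from the paper.

G_MSUN_C3 = 4.925e-6   # G * M_sun / c^3 in seconds

def time_to_merger(m1_msun, m2_msun, f_hz):
    """Seconds from GW frequency f_hz until coalescence (Newtonian chirp)."""
    mc = (m1_msun * m2_msun) ** 0.6 / (m1_msun + m2_msun) ** 0.2  # chirp mass
    return (5.0 / 256.0) * (G_MSUN_C3 * mc) ** (-5.0 / 3.0) \
        * (math.pi * f_hz) ** (-8.0 / 3.0)

tau = time_to_merger(5e5, 5e5, 1e-4)
print(f"{tau / 86400.0:.0f} days")   # of order months for these masses
```

For these illustrative masses the binary spends a few months above 0.1 mHz; lighter (or later-caught) systems chirp correspondingly faster, so the quoted "up to a year" depends on the masses and on where in the band the signal is picked up.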