Friday, October 31, 2014

Scare me (or Do you want to reassure me)...

//Today the blogger celebrates Halloween in his own way, by calling on a few physicists who do not hesitate to discuss the nightmare scenario of high-energy physics, namely that no phenomena beyond those predicted by the Standard Model are observable at the LHC. The aim, of course, is to reassure ourselves by showing that these same physicists are thinking about what could move their discipline forward.


... Mr. Shifman
String theory appeared as an extension of the dual resonance model of hadrons in the early 1970s, and by the mid-1980s it raised expectations for the advent of “the theory of everything” to Olympic heights. Now we see that these heights are unsustainable. Perhaps this was the greatest mistake of the string-theory practitioners. They cornered themselves by promising to give answers to each and every question that arises in the realm of fundamental physics, including the hierarchy problem, the incredible smallness of the cosmological constant, and the diversity of the mixing angles. I think by now the “theory-of-everything-doers” are in disarray, and a less formal branch of string theory is in crisis [a more formal branch evolved to become a part of mathematics or (on certain occasions) mathematical physics]. 
At the same time, leaving aside the extreme and unsupported hype of the previous decades, we should say that string theory, as a qualitative extension of field theory, exhibits a very rich mathematical structure and provides us with a new, and in a sense superior, understanding of mathematical physics and quantum field theory. It would be a shame not to explore this structure. And, sure enough, it was explored by serious string theorists. 
The lessons we learned are quite illuminating. First and foremost we learned that physics does not end in four dimensions: in certain instances it is advantageous to look at four dimensional physics from a higher-dimensional perspective... A significant number of advances in field theory, including miracles in N = 4 super-Yang-Mills... came from the string-theory side...
... since the 1980s Polyakov was insisting that QCD had to be reducible to a string theory in 4+1 dimensions. He followed this road... arriving at the conclusion that confinement in QCD could be described as a problem in quantum gravity. This paradigm culminated in Maldacena’s observation (in the late 1990s) that the dynamics of N=4 super-Yang-Mills in four dimensions (viewed as a boundary of a multidimensional bulk) at large N can be read off from the solution of a string theory in the bulk... 
Unfortunately (a usual story when fashion permeates physics), people in search of quick and easy paths to Olympus tend to overdo themselves. For instance, much effort is being invested in holographic description in condensed matter dynamics (at strong coupling). People pick up a supergravity solution in higher dimensions and try to find out whether or not it corresponds to any sensible physical problem which may or may not arise in a condensed matter system. To my mind, this strategy, known as the “solution in search of a problem” is again a dead end. Attempts to replace deep insights into relevant dynamics with guesses very rarely lead to success.
(Submitted on 31 Oct 2012 (v1), last revised 22 Nov 2012 (this version, v3))

... Mr. White
In his overview talk[1] at Strings 2013, David Gross discussed the “nightmare scenario” in which the Standard Model Higgs boson is discovered at the LHC but no other new short-distance physics, in particular no signal for SUSY, is seen. He called it the “extreme pessimistic scenario” but also said it was looking more and more likely and that, if it is established, then, he acknowledged:
“We got it wrong.” “How did we misread the signals?” “What to do?”.
He said that if it comes about definitively, the field, and string theorists in particular, will suffer badly. He said that it will be essential for theorists who entered the field most recently to figure out where previous generations went wrong and also to determine what experimenters should now look for.
In the following, I will argue that a root cause has been the exaggeration of the significance of the discovery of asymptotic freedom that has led to the historically profound mistake of trying to go forward by simply formulating new short-distance theories, supersymmetric or otherwise, while simultaneously ignoring both deep infrared problems and fundamental long-distance physics.
In his recent “Welcome” speech[2] at the Perimeter Institute, Neil Turok expressed similar concerns to those expressed by Gross. He said that
“All the {beyond the Standard Model} theories have failed ... Theoretical physics is at a crossroads right now ... {there is} a very deep crisis.”
He argued that nature has turned out to be simpler than all the models - grand unified, supersymmetric, superstring, loop quantum gravity, etc. - and that string theorists, especially, are now utterly confused, with no predictions at all. The models have failed, in his opinion, because they have no new, simplifying, underlying principle. They have complicated the physics by adding extra parameters, without introducing any simplifying concepts.

(Submitted on 5 Jun 2014)

Friday, October 24, 2014

On the art of measuring the Hubble constant while searching for our place in the middle of nowhere

The long march toward "precision cosmology"

The plots below show the time evolution of our knowledge of the Hubble Constant H0, the scaling between radial velocity and distance in kilometers per second per Megaparsec, since it was first determined by Lemaître, Robertson and Hubble in the late 1920s. The first major revision to Hubble's value was made in the 1950s, due to the discovery of Population II stars by W. Baade. That was followed by other corrections, for confusion, etc., that pretty much dropped the accepted value down to around 100 km/s/Mpc by the early 1960s.

[Plots: time evolution of published determinations of H0 since the late 1920s]

The last plot shows modern (post-Hubble Space Telescope) determinations, including results from gravitational lensing and applications of the Sunyaev-Zeldovich effect. Note the very recent convergence to values near 65 ± 10 km/s/Mpc (about 13 miles per second per million light-years)... Currently, the old factor-of-two discrepancy in the determination of the cosmic distance scale has been reduced to a dispersion of the order of 10 km/s/Mpc out of 65-70, or 15-20%. Quite an improvement!
One major additional change in the debate since the end of the 20th century has been the discovery of the accelerating universe (cf. Perlmutter et al. 1998 and Riess et al. 1998) and the development of "Concordance" Cosmology. In the early 1990s, one of the strongest arguments for a low (~50 km/s/Mpc) value of the Hubble Constant was the need to derive an expansion age of the universe that was older than the oldest stars, those found in globular star clusters. The best globular cluster ages in 1990 were in the range 16-18 Gyr. The expansion age of the Universe depends primarily on the Hubble constant but also on the values of various other cosmological parameters, most notably then the ratio of the mean mass density to the closure density, ΩM. For an "empty" universe, the age is just 1/H0, or 9.7 Gyr for H0=100 km/s/Mpc and 19.4 Gyr for H0=50 km/s/Mpc. For a universe with ΩM=1.000, the theorists' favorite because that is what is predicted by inflation, the age is 2/3 of that for the empty universe. So if the Hubble Constant was 70 km/s/Mpc, the age of an empty universe was 13.5 Gyr, less than the globular cluster ages, and if ΩM was 1.000 as favored by the theorists, the expansion age would only be 9 Gyr, much, much less than the globular cluster ages. Conversely, if H0 was 50 km/s/Mpc and ΩM was the observers' favorite value of 0.25, the age came out just about right. Note that this still ruled out ΩM=1.000 though, inspiring at least one theorist to proclaim that H0 must be 35!
The discovery of acceleration enabled the removal of much of this major remaining discrepancy in timescales, that between the expansion age of the Universe and the ages of the oldest stars, those in globular clusters. The introduction of a cosmological constant, Λ, one of the most probable causes of acceleration, changes the computation of the Universe's expansion age: a positive ΩΛ increases the age. The Concordance model has H0=72 km/s/Mpc and Ω=1.0000... made up of ΩΛ=0.73 and ΩM=0.27. Those values yield an age for the Universe of ~13.7 Gyr. This alone would not have solved the timescale problem, but a revision of the subdwarf distance scale, based on significantly improved parallaxes to nearby subdwarfs from the ESA Hipparcos mission, increased the distances to galactic globular clusters and thus decreased their estimated ages. The most recent fits of observed Hertzsprung-Russell diagrams to theoretical stellar models (isochrones) by the Yale group (Demarque, Pinsonneault and others) indicate that the mean age of galactic globulars is more like 12.5 Gyr, comfortably smaller than the expansion age.
John P. Huchra, Copyright 2008
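To make the timescale arithmetic in Huchra's account concrete, here is a minimal Python sketch (my own illustration, not part of the quoted text). The flat-ΛCDM age formula t0 = (2/(3·H0·√ΩΛ))·asinh(√(ΩΛ/ΩM)) is the standard closed form for a flat universe containing matter and a cosmological constant; the constants are the usual conversion factors:

```python
import math

KM_PER_MPC = 3.0857e19    # kilometres in one megaparsec
SEC_PER_GYR = 3.156e16    # seconds in a gigayear (1e9 years)
LY_PER_MPC = 3.2616e6     # light-years in one megaparsec
KM_PER_MILE = 1.609344    # kilometres in one mile

def miles_per_s_per_mly(h0):
    """Convert H0 from km/s/Mpc to miles per second per million light-years."""
    return h0 / KM_PER_MILE / (LY_PER_MPC / 1e6)

def hubble_time_gyr(h0):
    """1/H0 in Gyr: the expansion age of an 'empty' universe."""
    return KM_PER_MPC / h0 / SEC_PER_GYR

def age_flat_lcdm_gyr(h0, omega_m):
    """Expansion age of a flat universe (Omega_M + Omega_Lambda = 1):
    t0 = (2 / (3 H0 sqrt(Omega_L))) * asinh(sqrt(Omega_L / Omega_M))."""
    omega_l = 1.0 - omega_m
    return (2.0 / (3.0 * math.sqrt(omega_l))) * math.asinh(
        math.sqrt(omega_l / omega_m)) * hubble_time_gyr(h0)

print(miles_per_s_per_mly(65))      # ~12.4: the "about 13 miles/s per Mly" above
print(hubble_time_gyr(100))         # ~9.8 Gyr  (empty universe, H0 = 100)
print(hubble_time_gyr(50))          # ~19.6 Gyr (empty universe, H0 = 50)
print(2 / 3 * hubble_time_gyr(70))  # ~9.3 Gyr  (Omega_M = 1, the inflation case)
print(age_flat_lcdm_gyr(72, 0.27))  # ~13.5 Gyr (the Concordance values quoted above)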
The final steps...
The recent Planck observations of the cosmic microwave background (CMB) lead to a Hubble constant of H0=67.3±1.2 km/s/Mpc for the base six-parameter ΛCDM model (Planck Collaboration 2013, hereafter P13). This value is in tension, at about the 2.5σ level, with the direct measurement of H0=73.8±2.4 km/s/Mpc reported by Riess et al. (2011, hereafter R11). If these numbers are taken at face value, they suggest evidence for new physics at about the 2.5σ level (for example, exotic physics in the neutrino or dark energy sectors...). The exciting possibility of discovering new physics provides strong motivation to subject both the CMB and H0 measurements to intense scrutiny. This paper presents a reanalysis of the R11 Cepheid data. The H0 measurement from these data has the smallest error and has been used widely in combination with CMB measurements for cosmological parameter analysis (e.g. Hinshaw et al. 2012; Hou et al. 2012; Sievers et al. 2013). The study reported here was motivated by certain aspects of the R11 analysis: the R11 outlier rejection algorithm (which rejects a large fraction, ∼20%, of the Cepheids), the low reduced χ2 values of their fits, and the variations of some of the parameter values with different distance anchors, particularly the metallicity dependence of the period-luminosity relation... 
[The] figure [below] compares these two estimates of H0 with the P13 results from the [Planck+WP+highL (ACT+South Pole Telescope)+BAO (2dF Galaxy Redshift and SDSS redshift surveys)] likelihood for the base ΛCDM cosmology and some extended ΛCDM models. I show the combination of CMB and Baryon Acoustic Oscillations [BAO] data since H0 is poorly constrained for some of these extended models using CMB temperature data alone. (For reference, for this data combination H0=67.80±0.77 km/s/Mpc in the base ΛCDM model.) The combination of CMB and BAO data is certainly not prejudiced against new physics, yet the H0 values for the extended ΛCDM models shown in this figure all lie within 1σ of the best fit value for the base ΛCDM model. For example, in the models exploring new physics in the neutrino sector, the central value of H0 never exceeds 69.3 km/s/Mpc. If the true value of H0 lies closer to, say, H0=74 km/s/Mpc, the dark energy sector, which is poorly constrained by the combination of CMB and BAO data, seems a more promising place to search for new physics. In summary, the discrepancies between the Planck results and the direct H0 measurements... are not large enough to provide compelling evidence for new physics beyond the base ΛCDM cosmology.

The direct estimates (red) of H0 (together with 1σ error bars) for the NGC 4258 distance anchor and for all three distance anchors. The remaining (blue) points show the constraints from P13 for the base ΛCDM cosmology and some extended models combining CMB data with data from baryon acoustic oscillation surveys. The extensions are as follows: mν, the mass of a single neutrino species; mν+Ωk, allowing a massive neutrino species and spatial curvature; Neff, allowing additional relativistic neutrino-like particles; Neff+m_sterile, adding a massive sterile neutrino and additional relativistic particles; Neff+mν, allowing a massive neutrino and additional relativistic particles; w, dark energy with a constant equation of state w = p/ρ; w+wa, dark energy with a time-varying equation of state. I give the 1σ upper limit on mν and the 1σ range for Neff. 
(Submitted on 14 Nov 2013 (v1), last revised 8 Feb 2014 (this version, v2))
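As an aside, the "2.5σ" tension quoted above is simply the difference between the two central values measured in units of the quadrature-combined error bars. A minimal sketch, assuming independent Gaussian errors (my illustration, not from the paper):

```python
import math

def tension_sigma(x1, err1, x2, err2):
    """Separation of two independent Gaussian measurements, in units of sigma."""
    return abs(x1 - x2) / math.sqrt(err1**2 + err2**2)

# Planck 2013 (67.3 +/- 1.2) vs Riess et al. 2011 (73.8 +/- 2.4), in km/s/Mpc
print(tension_sigma(67.3, 1.2, 73.8, 2.4))  # ~2.4, i.e. "about the 2.5 sigma level"
```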

"cosmologie de précision" : un terme à prendre avec des pincettes 




Searching for our place in the middle of nowhere...
Such could be the purpose of cosmology from an anthropological perspective. But this blog is not the place for that kind of debate. The blogger prefers to give the final word to a grande dame of astronomy teaching in France, Lucienne Gougenheim, in the hope that the foregoing illustrates how topical the general conclusion of her 1996 pedagogical talk on the Hubble constant and the age of the Universe remains:

  • Distance is not the only parameter that determines the value of H0...
  • The nature of the standard candle is complex; even when we have a good theoretical understanding of the property that serves as a distance criterion, the importance of the various parameters on which it depends must be examined.
  • Knowledge of H0 can only be turned into an age of the universe within the framework of a cosmological model.
  • ...a complex problem can only be understood (and consequently solved) by taking into account all of the parameters on which it depends...