Tuesday, August 26, 2014

Can we bring some order to the process of selecting physical theories?

From observational consistency to mathematical consistency...
My first point is that the conditions of theory choice should be ordered. Frequently we see the criteria for theory choice listed in a flat manner, where no one criterion is given precedence over another a priori. We see consilience, simplicity, falsifiability, naturalness, consistency, economy, all together in an unordered list of factors for judging a theory. However, consistency must take precedence over all other factors. Observational consistency is obviously central to everyone, most especially our experimental colleagues, when judging the relevance of a theory for describing nature. Despite some subtleties that can arise with observational consistency (there can be circumstances where a theory is observationally consistent for a vast number of observables yet gets a few wrong, while no other decent theory is around to replace it; in other words, observational consistency is still the top criterion, but the best theory may not be 100% consistent), it is a criterion that all would say is at the top of the list.
Mathematical consistency, on the other hand, is not as fully appreciated... Mathematical consistency has a preeminent role right up there with observational consistency, and can be just as subtle, time-consuming and difficult to establish. We have seen that in the case of effective theories it trumps other theory-choice considerations such as simplicity, predictivity, testability, etc.
My second point builds on the first. Since consistency is preeminent, it must have highest priority of establishment compared to other conditions. Deep, thoughtful reflection and work to establish the underlying self-consistency of a theory takes precedence over finding ways to make it more natural or to have fewer parameters (i.e., simpler). Highest priority must equally go into understanding all of its observational implications. A theory should not be able to get away with being fuzzy on either of these two counts before the higher-order issues of simplicity and naturalness and economy take center stage. That this effort might take considerable time and effort should not be correlated with a theory’s value, just as it is not a theory’s fault if it takes humans decades to build a collider of sufficiently high energy and luminosity to test it.
Additionally, dedicated effort on mathematical consistency of the theory, or class of theories, can have enormous payoffs in helping us understand and interpret the implications of various theory proposals and data in broad terms. An excellent example of that in recent years is by Adams et al. [15], who showed that some theories in the infrared with a cutoff cannot be self-consistently embedded in an ultraviolet complete theory without violating standard assumptions regarding superluminality or causality. The temptation can be high to start manipulating uninteresting theories into simpler and more beautiful versions before due diligence is applied to determine if they are sick at their cores. This should not be rewarded... 
Finally, I would like to make a comment about the implications of this discussion for the LHC and other colliders that may come in the future...  
In the years since the charm quark was discovered in the mid 1970’s there has been tremendous progress experimentally and important new discoveries, including the recent discovery of a Higgs boson-like state [20], but no dramatic new discovery that can put us on a straight and narrow path beyond the SM. That may change soon at the LHC. Nevertheless, it is expensive in time and money to build higher energy colliders, our main reliable transporter into the high energy frontier. This limits the prospects for fast experimental progress. 
In the meantime though, hundreds of theories have been born and have died. Some have died due to incompatibility with new data (e.g., simplistic technicolor theories, or simpleminded no-scale supersymmetry theories), but others have died under their own self-consistency problems (e.g., some extra-dimensional models, some string phenomenology models, etc.). In both cases, it was care in establishing consistency with past data and mathematical rigor that doomed them. In that sense, progress is made. Models come to the fore and fall under the spotlight or survive. When attempting to really explain everything, the consistency issues are stretched to the maximum. For example, it is not fully appreciated in the supersymmetry community that it may even be difficult to find a “natural” supersymmetric model that has a high enough reheat temperature to enable baryogenesis without causing problems elsewhere [21a, 21b]. There are many examples of ideas falling apart when they are pushed very hard to stand up to the full body of evidence of what we already know.
Relatively speaking, theoretical research is inexpensive. It is natural that a shift develop in fundamental science. The code of values in theoretical research will likely alter in time, as experimental input slows. Ideas will be pursued more rigorously and analysed critically. Great ideas will always be welcome. However, soft model-building tweaks for simplicity and naturalness will become less valuable than rigorous tests of mathematical consistency. Distant future experimental implications identified for theories not fully vetted will become less valuable than rigorous computations of observational consistency across the board of all currently known data. One can hope that unsparing devotion to full consistency, both observational and mathematical, will be the hallmark of the future era.

James D. Wells (Submitted on 3 Nov 2012)

(Yet another) philosopher in the physicist's soup

Let us not forget that physics stems from natural philosophy...
"... theoretical physics has not done great in the last decades. Why? Well, one of the reasons, I think, is that it got trapped in a wrong philosophy: the idea that you can make progress by guessing new theory and disregarding the qualitative content of previous theories. This is the physics of the “why not?” Why not studying this theory, or the other? Why not another dimension, another field, another universe? Science has never advanced in this manner in the past. Science does not advance by guessing. It advances by new data or by a deep investigation of the content and the apparent contradictions of previous empirically successful theories. "
By John Horgan | August 21, 2014

Wednesday, June 18, 2014

A philosopher (adds his grain of salt) at the physicist's table

The spectacle of Nature is a banquet where the phenomenological soup must be rich in varied mathematical models
In which the blogger tries to argue for the necessity of comparing the different mathematical models proposed by physicists in order to understand and further explore reality, doing so in his usual way*, that is, by quoting a text:
From the times of Niels Bohr, many physicists, mathematicians and biologists have been attentive to philosophical aspects of our doing. Most of us are convinced that the frontier situation of our research can point to aspects of some philosophical relevance - if only the professional philosophers would take the necessary time to become familiar with our thinking. Seldom, however, do we read something from the philosophers which can inspire us. The US-American philosopher Charles Sanders Peirce (1839-1914) is an admirable exception. In his semiotics and pragmaticist (he avoids the word “pragmatic”) thinking, he provides a wealth of ideas, spread over an immense life work. It seems to me that many of his ideas, comments, and concepts can shed light on the why and how of mathematization...
 The quality of a mathematical model is not how similar it is to the segment of reality under consideration, but whether it provides a flexible and goal-oriented approach, opening for doubts and indicating ways for the removal of doubts (later trivialized by Popper’s falsification claim). More precisely, Peirce claims
  •  Be aware of differences between different approaches! 
  • Try to distinguish different goals (different priorities) of modelling as precisely as possible! 
  • Investigate whether different goals are mutually compatible, i.e., can be reached simultaneously!
  • Behave realistically! Don’t ask: How well does the model reflect a given segment of the world? But ask: Does this model of a given segment of the world support the wanted and possibly wider activities / goals better than other models?
I may add: we have to strike a balance between Abstraction vs. construction, Top-down vs. bottom-up, and Unification vs. specificity. We had better keep aware of the variety of Modelling purposes and the multifaceted relations between Theory - model - experiment. Our admiration for the Power of mathematization, the Unreasonable effectiveness of mathematics (Wigner), should not blind us to the persistent and deepening limitations of mathematization in the face of new tasks.

*Transtextual remarks (or a portrait of the blogger in metacognition)
Somewhere deep in his inner Self, the transcyberphysicist dreams of himself as an unknown soldier in the epistemological war waged between the defenders of the various scientific models of quantum gravitation (superstring theories, loop quantum gravity, the tensorial track, noncommutative spectral geometry...); but through his discourse, based essentially on an immoderate use of excerpts from his own readings, he also sees himself as a kind of Sancho Panza (his Id, in short ;-), unfaithful virtual traveling companion of a famous, polemical science blogger (and sometimes sorry fellow) whose tribulations he occasionally relates in the metatext of this blog.

Sunday, May 25, 2014

Simple as the new (but already old) minimal standard model

Column: No comment //or almost

A model simple before being beautiful...
//In early May we discussed the apparent simplicity of the laws of Nature, which are today almost all condensed in the Minimal Standard Model (MSM) of particle physics and the theory of general relativity (see this post for a glimpse of the mathematical expression of this relative simplicity). We do say almost, because since the completion of the MSM in the early 1970s, new experimental facts have come to light in the late 1990s and early 2000s that are not predicted by it...
There exist many possible directions to go beyond the Minimal Standard Model (MSM): supersymmetry, extra dimensions, extra gauge symmetries (e.g., grand unification), etc. They are motivated to solve aesthetic and theoretical problems of the MSM, but not necessarily to address empirical problems. It is embarrassing that all currently proposed frameworks have some phenomenological problems, e.g., excessive flavor-changing effects, CP violation, too-rapid proton decay, disagreement with electroweak precision data, and unwanted cosmological relics. In this letter, we advocate a different and conservative approach to physics beyond the MSM. We include the minimal number of new degrees of freedom to accommodate convincing (e.g., > 5σ) evidence for physics beyond the MSM. We do not pay attention to aesthetic problems, such as fine-tuning, the hierarchy problem, etc. We stick to the principle of minimality seriously to write down the Lagrangian that explains everything we know. We call such a model the New Minimal Standard Model (NMSM). In fact, the MSM itself had been constructed in this spirit, and it is a useful exercise to follow through with the same logic at the advent of the major discoveries we have witnessed. Of course, we require it to be a consistent Lorentz-invariant renormalizable four-dimensional quantum field theory, the way the MSM was constructed.


Hooman Davoudiasl, Ryuichiro Kitano, Tianjun Li, Hitoshi Murayama, The New Minimal Standard Model, 12/05/2004


...to cosmological predictions already 10 years old...
... The spectral index of the φ² chaotic inflation model is predicted to be 0.96. This may be confirmed by improved cosmic microwave background anisotropy data, with more years of WMAP, and Planck. The tensor-to-scalar ratio is 0.16.

... and still robust today, until proven otherwise
The Planck nominal mission temperature anisotropy measurements, combined with the WMAP large-angle polarization, constrain the scalar spectral index to n_s = 0.9603 ± 0.0073.
Subtracting the various dust models and re-deriving the r constraint still results in a high significance of detection. For the model which is perhaps the most likely to be close to reality (DDM2 cross) the maximum likelihood value shifts to r = 0.16 (+0.06/−0.05), with r = 0 disfavored at 5.9σ.
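The two numbers quoted above, n_s ≈ 0.96 and r ≈ 0.16, follow from the standard textbook slow-roll formulas for a monomial potential V ∝ φⁿ, namely n_s ≈ 1 − (n+2)/(2N) and r ≈ 4n/N. A minimal sketch (these are the generic relations, not taken from the quoted papers; N = 50 e-folds is an assumed, typical value):

```python
# Slow-roll predictions for chaotic inflation with V(phi) ~ phi^n,
# evaluated N e-folds before the end of inflation.
def slow_roll_predictions(n, N):
    """Return (n_s, r) for a monomial potential V ~ phi^n."""
    n_s = 1.0 - (n + 2.0) / (2.0 * N)  # scalar spectral index
    r = 4.0 * n / N                    # tensor-to-scalar ratio
    return n_s, r

n_s, r = slow_roll_predictions(n=2, N=50)
print(n_s, r)  # phi^2 with 50 e-folds: n_s = 0.96, r = 0.16
```

With N = 60 instead, the same formulas give n_s ≈ 0.967 and r ≈ 0.133, which shows how sensitive the comparison with data is to the assumed number of e-folds.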

What about a next (extension of the) new minimal standard model?
 //Still being drafted

Sunday, May 18, 2014

(A journey) inside the head of Gerard 't Hooft

Column: Curiositêtes (1)
The blogger is starting a new column whose aim will be to present physicists more or less well known to the science-loving public, with an emphasis on original work more or less recognized by their peers.

The Head, squared
Gerardus 't Hooft is a Dutch theoretical physicist whose surname could be rendered as "The Head" (in Dutch, the head is het hoofd). Although he was awarded the prestigious Nobel Prize fifteen years ago already, he is still scientifically active (*) and does not hesitate to take an active part in the visibility and defense of his ideas on the internet:
  • his latest articles are always posted in open access on arxiv;
  • he is still invited to give seminars at prestigious scientific institutions, as can be seen here; the web is also rich in other videos of his talks, and we particularly recommend this one, aimed at the general public;
  • his personal website is a gold mine for the curious mind who wants to get inside the head of a physicist as generous in sharing his work and ideas as he is great in the importance of his scientific contributions and in the clarity with which he expounds contemporary physics and his more original ideas;
  • let us finally point out that he is, to our knowledge, the only physics Nobel laureate to have an account on, and take part in, the public collaborative site Physics Stack Exchange.
(* receiving a Nobel Prize is not only a reward; it is also a workload, with numerous and varied social obligations that can hamper the creativity and productivity of the happy recipient.)

The last hero (of the Standard Model and of) quantum (field theory)?
As his Nobel Prize attests, G. 't Hooft has already secured his place in the history of science through his decisive contribution to the completion of the Standard Model.
Recall that this model, an essential part of which, the electroweak unification theory, was essentially already built by the end of the sixties (thanks in particular to the work of Glashow, Salam and Weinberg), was still awaiting genuine recognition from the physics community at large in the early seventies. That recognition came thanks to the demonstration by 't Hooft, then still a student, well supported by his thesis advisor Veltman, of the renormalizability of this theory: a fundamental property that makes it possible to "tame" the infinities that systematically appear in quantum field theory calculations, which constantly threaten their predictive power and cast doubt on their internal consistency.
One could go on to mention the prominent place 't Hooft also occupies in the other sector of the Standard Model: the strong interaction, modeled by quantum chromodynamics, a theory whose topological and non-perturbative nature he was among the first to understand, but whose phenomenological aspects he perhaps did not master well enough to appreciate the importance of his own results, or the soundness of another physicist's advice urging him to publish his work quickly...
I announced at that meeting my finding that the coefficient determining the running of the coupling strength, that he called β(g^2), for non-Abelian gauge theories is negative, and I wrote down Eq. (5.3) on the blackboard. [Kurt] Symanzik was surprised and skeptical. “If this is true, it will be very important, and you should publish this result quickly, and if you won’t, somebody else will,” he said. I did not follow his advice. A long calculation on quantum gravity with Veltman had to be finished first.
The last sentence of the preceding quotation shows that 't Hooft was already involved in an even broader research program, aiming to bring the last known fundamental interaction, gravitation, into the framework of quantum field theory.

One of the founders of the quantum view of black holes, and father of the holographic principle
So what have 't Hooft's contributions to the program of unifying fundamental physics been since then? The written transcript of a lecture given in 1993 in honor of Abdus Salam (another hero of the Standard Model), revised and corrected in 2009, gives us part of the answer:
I am given the opportunity to contemplate some very deep questions concerning the ultimate unification that may perhaps be achieved when all aspects of quantum theory, particle theory and general relativity are combined. One of these questions is the dimensionality of space and time... When we quantize gravity perturbatively we start by postulating a Fock space in which basically free particles roam in a three plus one dimensional world. Naturally, when people discuss possible cut-off mechanisms, they think of some sort of lattice scheme either in 3+1 dimensional Minkowski space or in 4 dimensional Euclidean space. The cut-off distance scale is then suspected to be the Planck scale. Unfortunately any such lattice scheme seems to be in conflict with local Lorentz invariance or Euclidean invariance, as the case may be, and most of all also with coordinate reparametrization invariance. It seems to be virtually impossible to recover these symmetries at large distance scales, where we want them. So the details of the cut-off are kept necessarily vague. The most direct and obvious physical cut-off does not come from non-renormalizability alone, but from the formation of microscopic black holes as soon as too much energy would be accumulated into too small a region. From a physical point of view it is the black holes that should provide for a natural cut-off all by themselves. This has been this author’s main subject of research for over a decade. 
't Hooft, Dimensional Reduction in Quantum Gravity,1993-2009

To contemplate more simply 't Hooft's original quantum vision of the black hole (an elementary particle like any other at the Planck scale?), one can turn to the following excerpt:
For an intuitive understanding of our world, the Hawking effect seems to be quite welcome. It appears to imply that black holes are just like ordinary forms of matter: they absorb and emit things, they have a finite temperature, and they have a finite lifetime. One would have to admit that there are still important aspects of their internal dynamics that are not yet quite understood, but this could perhaps be considered to be of later concern. Important conclusions could already be drawn: the Hawking effect implies that black holes come in a denumerable set of distinct quantum states. This also adds to a useful and attractive picture of what the dynamical properties of space, time and matter may be like at the Planck scale: black holes seem to be a natural extension of the spectrum of elementary physical objects, which starts from photons, neutrinos, electrons and all other elementary particles.
 't Hooft, Quantum gravity without space-time singularities or horizons, 18/09/2009

In the end, these reflections led him to formulate several conjectures, the best known and most widely recognized of which seems to be the holographic principle:
What is known for sure is that Quantum Mechanics works, that the gravitational force exists, and that General Relativity works. The approach advocated by me during the last decades is to consider in a direct way the problems that arise when one tries to combine these theories, in particular the problem of gravitational instability. These considerations have now led to what is called “the Holographic Principle”, and it in turn led to the more speculative idea of deterministic quantum gravity ... 

't Hooft, The Holographic Principle, 2000

The last sentence of the preceding excerpt ends on another, much more controversial idea: that of a deterministic theory underlying quantum mechanics.

(Understanding physics at) the Planck scale is well worth an (unorthodox) 't Hooft conjecture
Let us look in a little more detail at what can lead a specialist in quantum theory to call its standard interpretation into question:
It is argued that the so-called holographic principle will obstruct attempts to produce physically realistic models for the unification of general relativity with quantum mechanics, unless determinism in the latter is restored. The notion of time in GR is so different from the usual one in elementary particle physics that we believe that certain versions of hidden variable theories can – and must – be revived. A completely natural procedure is proposed, in which the dissipation of information plays an essential role.
't Hooft, Quantum Gravity as a Dissipative Deterministic System, 03-04/1999
Beneath Quantum Mechanics, there may be a deterministic theory with (local) information loss. This may lead to a sufficiently complex vacuum state, and to an apparent non-locality in the relation between the deterministic (“ontological”) states and the quantum states, of the kind needed to explain away the Bell inequalities. Theories of this kind would not only be appealing from a philosophical point of view, but may also be essential for understanding causality at Planckian distance scales.

Naturally, this 't Hooft conjecture seems to meet with quite a lot of skepticism, all the more so since it calls into question nothing less than the program of quantum computing, as we shall see in the next paragraph. The most explicit criticisms are, of course, voiced on the blogosphere. To get an idea of the more official reaction of his peers, one can read this recent interview with 't Hooft, in which he recounts a brief discussion with the physicist John Bell; but that meeting dates from the eighties, and Bell has since passed away, while the model developed by 't Hooft (based on cellular automata) has been refined since.

A fine example of online scientific exchange on Physics Stack Exchange
Fortunately, the internet offers us another interesting virtual venue for exchanging points of view, through a question-answer-and-comment site where one can follow an online dialogue between 't Hooft and a leading figure of quantum information, namely Peter Shor (father of the algorithm of the same name):

The problem with these blogs is that people are inclined to start yelling at each other. (I admit, I got infected and it's difficult not to raise one's electronic voice.) I want to ask my question without an entourage of polemics.
My recent papers were greeted with scepticism. I've no problem with that. What disturbs me is the general reaction that they are "wrong". My question is summarised as follows:
Did any of these people actually read the work and can anyone tell me where a mistake was made?
... A revised version of my latest paper was now sent to the arXiv ... Thanks to you all. My conclusion did not change, but I now have more precise arguments concerning Bell's inequalities and what vacuum fluctuations can do to them.
asked Aug 15 '12 at 9:35 G. 't Hooft

Answer: I can tell you why I don't believe in it. I think my reasons are different from most physicists' reasons, however. Regular quantum mechanics implies the existence of quantum computation. If you believe in the difficulty of factoring (and a number of other classical problems), then a deterministic underpinning for quantum mechanics would seem to imply one of the following.

  • There is a classical polynomial-time algorithm for factoring and other problems which can be solved on a quantum computer.
  • The deterministic underpinnings of quantum mechanics require 2^n resources for a system of size O(n).
  • Quantum computation doesn't actually work in practice.
None of these seem at all likely to me ... For the third, I haven't seen any reasonable way you could make quantum computation impossible while still maintaining consistency with current experimental results.
answered Aug 17 '12 at 14:11 Peter Shor
Comment: @Peter Shor: I have always opted for your 3rd possibility: the "error correcting codes" will eventually fail. The quantum computer will not work perfectly (it will be beaten by a classical computer, but only if the latter would be scaled to Planckian dimensions). This certainly has not yet been contradicted by experiment. – G. 't Hooft Aug 17 '12 at 20:45
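Shor's second possibility above refers to the exponential bookkeeping that a classical simulation of quantum mechanics seems to require: a general n-qubit state vector carries 2^n complex amplitudes. A back-of-the-envelope sketch (the 16 bytes per amplitude assumes double-precision complex storage):

```python
# Classical memory needed to store a full n-qubit state vector:
# 2**n complex amplitudes at 16 bytes each (two 64-bit floats).
def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(n, state_vector_bytes(n))
# 10 qubits fit in ~16 KB, 30 qubits already need ~17 GB,
# and 50 qubits would require ~18 petabytes.
```

This exponential growth is exactly why a deterministic underpinning with only polynomial resources would be so surprising, and why Shor lists it as one of the three uncomfortable alternatives.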

Let us note that, for the time being, experimental physics does not yet seem to have managed to refute 't Hooft's prediction.

(Is) Nature crazier at the Planck scale than string theorists can imagine(?)
Here is a formula borrowed almost literally from a passage of the founding 1993 article by 't Hooft quoted earlier. The reader may see it as a nod to a famous blogger, as critical today as ever of the deterministic model that 't Hooft proposes to underlie quantum mechanics. Lubos Motl, not to name him, takes advantage of the preprint of a long review article by the Dutch physicist on this subject to attack one of its postulates: the existence of an ontological basis in the Hilbert space describing the possible states of a quantum system. As usual, Lubos develops an argument drawing on examples of great pedagogical value for anyone wanting to understand quantum physics; but his analysis of the thesis he criticizes seems to us too superficial for the reader to form a precise idea of its heuristic power and of its epistemological stakes.
For our part, we shall content ourselves (for the moment) with highlighting the following points, which seem to us interesting:
... I do find that local deterministic models reproducing quantum mechanics, do exist; they can easily be constructed. The difficulty signalled by Bell and his followers, is actually quite a subtle one. The question we do address is: where exactly is the discrepancy? If we take one of our classical models, what goes wrong in a Bell experiment with entangled particles? Were assumptions made that do not hold? Or do particles in our models refuse to get entangled? ...
The evolution is deterministic. However, this term must be used with caution. “Deterministic” cannot imply that the outcome of the evolution process can be foreseen. No human, nor even any other imaginable intelligent being, will be able to compute faster than Nature itself. The reason for this is obvious: our intelligent being would also have to employ Nature’s laws, and we have no reason to expect that Nature can duplicate its own actions more efficiently than itself. ...
... There are some difficulties with our theories that have not yet been settled. A recurring mystery is that, more often than not, we get quantum mechanics alright, but a hamiltonian emerges that is not bounded from below. In the real world there is a lower bound, so that there is a vacuum state. A theory without such a lower bound not only has no vacuum state, but it also does not allow a description of thermodynamics using statistical physics. Such a theory would not be suitable for describing our world. How serious do we have to take this difficulty? We suspect that there will be several ways to overcome it, the theory is not yet complete, but a reader strongly opposed to what we are trying to do here, may well be able to find a stick that seems suitable to destroy our ideas. Others, I hope, will be inspired to continue along this path. There are many things to be further investigated, one of them being superstring theory. This theory seems to be ideally suited for the approach we are advocating.  
G. 't Hooft, The Cellular Automaton Interpretation of Quantum Mechanics, 7/05/2014
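The Bell-inequality tension discussed in the excerpt can be made concrete with the CHSH quantity: any local deterministic (hidden-variable) model obeys |S| ≤ 2, while quantum mechanics on the singlet state reaches 2√2 (Tsirelson's bound). A minimal sketch using the textbook singlet correlation E(a, b) = −cos(a − b), with the standard angle settings:

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements along angles a and b
    performed on the two halves of a singlet state."""
    return -math.cos(a - b)

# Standard CHSH measurement settings (in radians).
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination; local deterministic models satisfy |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828, violating the classical bound of 2
```

This is precisely the gap that 't Hooft's cellular-automaton models must explain away, for instance through the role he attributes to vacuum fluctuations and to the choice of ontological states.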

A prophecy about an exact local conformal symmetry of Nature (spontaneously broken below the Planck scale)
Beyond this debate on the interpretation of quantum mechanics, 't Hooft's work offers the opportunity to watch a researcher in action, ready to elaborate and defend bold hypotheses by building models as precise as possible (and, to some extent, falsifiable), to see how far, in his view, the concepts that have served physics so well, such as causality and locality, can guide him.
Here, to conclude, is one last conjecture of his, which may perhaps have a greater future, as presented on his personal website:
I claim to have found how to put quantum gravity back in line so as to restore quantum mechanics for pure black holes. It does not happen automatically, you need a new symmetry. It is called local conformal invariance. This symmetry is often used in superstring and supergravity theories, but very often the symmetry is broken by what we call “anomalies”. These anomalies are often looked upon as a nuisance but a fact of life. I now claim that black holes only behave as required in a consistent theory if all conformal anomalies cancel out. This is a very restrictive condition, and, very surprisingly, this condition also affects the Standard Model itself. All particles are only allowed to interact with gravity and with each other in very special ways. Conformal symmetry must be an exact local symmetry, which is spontaneously broken by the vacuum, exactly like in the Higgs mechanism.

This leads to the prediction that models exist where all unknown parameters of the Standard Model, such as the fine-structure constant, the proton-electron mass ratio, and in fact all other such parameters are computable. Up till now these have been freely adjustable parameters of the theory, to be determined by experiment but they were not yet predicted by any theory.
I am not able to compute these numbers today because the high energy end of the elementary particle properties is not known. There is one firm prediction: constants of Nature are truly constant. All attempts to detect possible space and time dependence of the Standard Model parameters will give negative results. This is why I am highly interested in precision measurements of possible space-time dependence of constants of Nature, such as the ones done by using a so-called "frequency comb". These are high precision comparisons between different spectral frequencies in atoms and molecules. They tell us something very special about the world we live in. 
't Hooft

//Written, with final editorial touches, on Thursday, May 22, 2014.

Thursday, May 15, 2014

Seeking: beauty (in retrospect), simplicity (now) and surprise (to come)

S for simplicity?
In retrospect, one can say that twentieth-century theoretical physics was marked by the explicit assertion of beauty as a heuristic criterion, through the discoveries of researchers such as Albert Einstein and Paul Dirac in the first half of the century, then those of Chen Ning Yang, for example, in the second half. Yet at the beginning of the twenty-first century, what now preoccupies theorists is to better understand the simplicity of the laws of Nature; this is shown by the following summary of the program of a conference that ended today at Princeton, whose title inspired this post:
... recent data from Cosmic Microwave Background measurements, the Large Hadron Collider at CERN and Dark Matter experiments that show the universe to be surprisingly simple on both the microphysical and macrophysical scale: there is a striking absence -- thus far -- of new particles, WIMP dark matter or non-gaussianity. The recent report by BICEP2 of the detection of primordial gravitational waves produces some tension with current results from the Planck and WMAP satellites that may indicate unexpected complexity. However, this workshop will occur at a time when the results have yet to be confirmed, so we are free to imagine various scenarios. At the same time, there is the intriguing fact (clue?) that the measured Higgs and top quark mass lie within the narrow range corresponding to a metastable Higgs vacuum in the standard model. What could all this mean? What ideas need to be jettisoned, revised or created to naturally explain this simplicity?

Now, if the simplicity of Nature intrigues theorists, it is because, oddly, it is not as "natural" as they would like it to be. But behind this idea of naturalness lie aesthetic criteria, forged by past theoretical successes, which perhaps must be abandoned today.

S for surprise!
Nevertheless, to challenge the overly narrow theoretical conceptions of the previous century, the physicist must also rely on them to imagine their boldest extrapolations and test their most tenuous consequences, in the secret hope of seeing the unexpected. To arrive at a surprising discovery like those that marked the beginnings of quantum physics (the structure of the atom, the photoelectric effect, electron spin): that is the secret quest of every physicist:
During a visit to the Super Proton Synchrotron [Margaret Thatcher] spoke to John Ellis, who introduced himself as a theoretical physicist. The conversation continued:

Thatcher: “What do you do?”
Ellis: “Think of things for the experiments to look for, and hope they find something different.”
Thatcher: “Wouldn’t it be better if they found what you predicted?”
Ellis: “Then we would not learn how to go further!”
Aidan Randle-Conde, Blog Quantum Diaries, Margaret Thatcher, politician, scientist, 15/04/2013

Tuesday, 13 May 2014

No retraction (but more uncertainty) regarding recent experimental evidence for cosmological inflation

From a rumour that inflates to one that deflates it (or welcome to the era of Open Science)
A few weeks ago, the transcyberphysicien relayed here the media hype orchestrated around the announcement of a possible major astrophysical discovery. Peter Coles, an English cosmologist who is also a well-known blogger, has spoken very well about this event:
When the BICEP2 team announced that a “major astrophysics discovery” would be announced this Monday I have to admit that I was quite a bit uncomfortable about the way things were being done. I’ve never been keen on “Science by Press Release” and when it became clear that the press conference would be announcing results that hadn’t yet been peer-reviewed my concerns deepened.
However, the BICEP2 team immediately made available not only the “discovery” paper but also the data products, so people with sufficient expertise (and time) could try to unpick the content. This is fully in the spirit of open science and I applaud them for it. Indeed one could argue that putting everything out in the open the way they have is ensuring that their work is being peer-reviewed in the open by the entire cosmological community not secretly and by one or two anonymous individuals. The more I think about it the more convinced I am becoming that this is a better way of doing peer review than the traditional method, although before I decide that for sure I’d like to know whether the BICEP2 actually does stand up!
One of the particularly interesting developments in this case is the role social media are playing in the BICEP2 story. A Facebook Group was set up in advance of Monday’s announcement and live discussion started immediately the press conference started. The group now has well over 700 members, including many eminent cosmologists. And me. There’s a very healthy scientific discussion going on there which may well prove to be a model of how such things happen in the future. Is this a sign of a major change in the way science is done, the use of digital technology allowing science to break free from the shackles placed on it by traditional publication processes? Maybe.
Telescoper (alias Peter Coles), Blog In the dark, Bicep2, Social Media and Open Science, 14/05/2014

In another of his posts he also discussed, precisely and clearly, the uncertainties weighing on the veracity of the discovery in question, or more precisely on the cosmological origin of the signal detected by the Bicep2 experiment. But the technical discussions around the experimental uncertainties seemed to be taking place quietly in the blogosphere, until the situation changed three days ago with the publication of a post by Adam Falkowski, particle physicist (and "mad" blogger, alias Jester ;-), claiming to relay a rumour according to which the researchers of the Bicep2 collaboration had admitted making an error in the evaluation of a spurious signal.

Over to a sceptical scrutineer (whistleblower or rumour-monger?)
Here is, among other things, what the post in question says:
The BICEP claim of detecting the primordial B-mode in the polarization of the Cosmic Microwave Background was huge news. If confirmed, it would be evidence of gravity waves produced during cosmic inflation, and open a window on physics at an incredibly high energy scale of order 10^16 GeV.
Barring a loose cable, the biggest worry about the BICEP signal is that the collaboration may have underestimated the galactic foreground emission. BICEP2 performed the observations at only one frequency of 150 GHz which is very well suited to study the CMB, but less so for polarized dust or synchrotron emission. As for the latter, more can be learned by going to higher frequencies, while combining maps at different frequencies allows one to separate the galactic and the CMB component. Although the patch of the sky studied by BICEP is well away from the galactic plane, the recently published 353 GHz polarized map from Planck demonstrates that there may be significant emission from these parts of the sky.
... New data from Planck, POLARBEAR, ACTpole, and Keck Array should clarify the situation within a year from now.  However, at this point, there seems to be no statistically significant evidence for the primordial B-modes of inflationary origin in the CMB.
Jester (alias Adam Falkowski), Blog Résonaances, Is BICEP wrong?, 12/05/14
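The multi-frequency separation Jester mentions ("combining maps at different frequencies allows one to separate the galactic and the CMB component") can be illustrated with a toy linear model. This is a minimal sketch with made-up maps and an assumed dust spectral index, not the actual BICEP2 or Planck pipeline: in thermodynamic units the CMB is the same at every frequency, while dust follows a power law, so two frequency maps suffice to invert exactly for the two components.

```python
import numpy as np

# Toy two-component separation, a minimal sketch: the CMB is
# frequency-independent in thermodynamic units, while galactic dust
# scales as a power law (nu/nu0)^beta. All numbers are illustrative,
# not real BICEP2/Planck values.
rng = np.random.default_rng(0)
npix = 1000
beta = 1.6                          # assumed dust spectral index
cmb  = rng.normal(0.0, 1.0, npix)   # mock CMB map (arbitrary units)
dust = rng.normal(0.0, 0.5, npix)   # mock dust map, referenced to 353 GHz

def dust_scale(nu, nu0=353.0):
    """Power-law scaling of the dust amplitude from nu0 to nu."""
    return (nu / nu0) ** beta

# Mock single-frequency observations: each map mixes both components
m150 = cmb + dust_scale(150.0) * dust
m353 = cmb + dust_scale(353.0) * dust

# Mixing matrix: row i maps (cmb, dust) -> the map at frequency i.
# With two frequencies and two components the system inverts exactly.
A = np.array([[1.0, dust_scale(150.0)],
              [1.0, dust_scale(353.0)]])
cmb_hat, dust_hat = np.linalg.solve(A, np.vstack([m150, m353]))
```

This is why a single-frequency experiment like BICEP2 could not settle the foreground question on its own: with one row of the mixing matrix the system is underdetermined. In the presence of noise the exact inversion would of course be replaced by a least-squares or Bayesian fit.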

The response of the scrutinized scientists (seekers of truth or promoters of a premature discovery?)
Jester's somewhat provocative post immediately drew many reactions, as well as the attention of the popular Anglo-Saxon science media, which sought responses from the researchers whose work was being called into question:
Clement Pryke, a cosmologist at the University of Minnesota, Twin Cities, and a co-principal investigator for the BICEP team, acknowledges that the foreground map is an important and thorny issue. Part of the problem is that the Planck team has not made the raw foreground data available, he says. Instead, BICEP researchers had to do the best they could with a PDF file of that map that the Planck team presented at a conference.
... The BICEP team will not be revising or retracting its work, which it posted to the arXiv preprint server, Pryke says: "We stand by our paper."

On 12 May, a rumour emerged on the physics blog Résonaances that the BICEP2 team has already admitted defeat. The blogger, particle physicist Adam Falkowski at CERN, says he has heard through the scientific grapevine that the BICEP2 collaboration misinterpreted a preliminary Planck map in its analysis.
... "We tried to do a careful job in the paper of addressing what public information there was, and also being upfront about the uncertainties. We are quite comfortable with the approach we have taken." [says principal investigator John Kovac at Harvard University]
Lisa Grossman, Rumours swirl over credibility of big bang ripple find, 13/05/2014

As these lines are being written (15 May, 11:50 Paris time), we learn that the prestigious Princeton Center for Theoretical Science is organizing this very day (on short notice, it seems) a special event to discuss precisely the latest advances in understanding the experimental uncertainties at issue:

Special Event: May 15, 2014
Towards an Understanding of Foregrounds in the BICEP2 Region
Speakers: Raphael Flauger (IAS and NYU) with discussion by Lyman Page (Princeton)
Thursday, May 15 at 9:30 am
PCTS Seminar Room, Room 407
(video recording and slides will be made available after the talk)


Bicep2 versus Planck?
While awaiting new first-hand scientific information on the question, we can try to highlight what is at stake in the debate by cross-checking sources. Let us begin by comparing these two opinions from bloggers more or less skilled in the art of provocation:
... the BICEP2 experiment announced a significant detection of the primordial B-mode in the CMB power spectrum... 
  • If this holds up, it's huge, comparable in magnitude to the discovery of the Higgs boson. Probably even more exciting because of the surprise element...
  • If you hear a sledgehammer in the corridor of your lab, that may be your local Planck member banging his head on the wall. Yeah, apart from many noble aspects, science also has this lowly competition side. A billion dollar experiment that misses a Nobel-prize-worth low-hanging fruit... I wouldn't wish to be in their skin if BICEP is right. 
Jester, Blog Résonaances, Curly impressions, 17/03/2014

The total price of the Planck satellite was €700 million, almost a billion of dollars. On the other hand, BICEP2's expenses are comparable to $10 million, about one hundred times smaller. That's why Planck is the Goliath and BICEP2 was the David ...
If your budget is 100 times smaller than the budget of someone else, it may be and feel more likely that you won't win, but it simply does not imply that you can't discover something important before the Goliath does. Even though "chance" could be enough to explain all these unexpected events, the victories of the underdogs, there are also detailed reasons why BICEP2 has apparently done the discovery before Planck.
BICEP2 had a vision. They were focusing on the B-modes, assuming from the beginning that there could be something new over there. Their devices were sufficiently optimized to do the job and they have apparently succeeded in the job. In comparison, Planck has gotten into a pessimistic mode in which people assume that "they can't discover something really new, anyway" which is why they don't even try so hard.
Lubos Motl, Blog The Reference Frame, BICEP2 vs Planck: nothing wrong with screen scraping, 14/05/2014

But let us leave this sociologically tinged polemic aside and return to matters more centred on scientific knowledge.

Do Bicep2's results depend on those of Planck?(!)
Let us look through blog readers' comments for interesting questions, remarks and information:

Nick said... 13 May 2014 07:29 (blog Résonaances)
Is it correct then that BICEP2 was systematically unable to precede PLANCK? Surely in their methodology before getting funded someone would have asked about how they would measure the foreground, the answer would have been "We will have to wait for PLANCK!" So in this sense this very modest telescope was never designed to compete for a Nobel Prize but was always complementary to PLANCK? 

Jester said... 13 May 2014 08:29  (blog Résonaances)
Nick, my understanding is that the amount of polarized foreground in the BICEP patch is a bit of a surprise. That region is rather clean in temperature maps, apparently it is less clean in polarization. But, right, it was always clear that for a fully reliable estimation of the foregrounds we need measurements at several frequencies, and Planck is by far best suited to do that.

pion says: May 14, 2014 at 2:40 am (blog Not Even Wrong)

It seems to me that Planck is not our best hope to settle this issue mainly due to the fact that it is a satellite, and information from certain ground-based telescopes might be more credible.
Since the CMB polarization level is obtained from differencing two intensity measurements toward the same direction on the sky, any optical imperfection of the detectors can potentially leak the dominant intensity to the faint B-mode polarization. A few of these spurious signals can be potentially mitigated by the telescope’s scanning strategy; basically, each pixel in the field is observed multiple times with (ideally) different orientations of the polarimeter. Ideally, this would suppress a large fraction of the systematic but not entirely.
Most ground-based telescopes benefit from the earth rotation, others use half waveplate, and in general the scanning strategy could be optimized for minimizing the intensity-to-B-mode leakage. Satellites in orbit, however, are limited and their typical scanning strategy is sub-optimal if not poor. We know, as a fact, that Planck never published their B-maps, and I guess that this is partially due to the issue of B-mode systematics which is always a challenge, for satellites in particular...
My best bet is that a joint effort of ground-based instruments (which are located off the pole) might ultimately provide a conclusive answer to this thorny issue. The problem with this alternative, though, is that ground-based experiments are limited to a relatively narrow frequency window. Hopefully, two or three frequency bands will suffice for the construction of a reliable polarized dust model, but this is not a priori guaranteed.
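The mitigation mechanism pion describes, revisiting each pixel at many polarimeter orientations to suppress intensity-to-B-mode leakage, can be sketched numerically. This is a toy model with invented numbers, not any real instrument's characteristics: a constant leakage term eps*I sits in the detector frame, so a least-squares fit of Q and U over uniformly distributed orientations projects it out, whereas a single orientation absorbs it entirely.

```python
import numpy as np

# Toy model of intensity-to-polarization leakage (invented numbers).
# The leakage term eps*I lives in the detector frame, so it does not
# rotate with the sky signal as the polarimeter angle psi changes.
I_sky, Q_sky, U_sky = 100.0, 1.0, -0.5   # true sky (arbitrary units)
eps = 0.01                                # fractional I -> P leakage

def observe(psi):
    """Detector-frame polarization signal at polarimeter angle psi."""
    return Q_sky * np.cos(2 * psi) + U_sky * np.sin(2 * psi) + eps * I_sky

# One orientation only: the leakage is fully absorbed into the Q estimate
q_single = observe(0.0)                   # = Q_sky + eps*I_sky, badly biased

# Many uniformly distributed orientations: fit Q and U by least squares.
# The constant leakage is orthogonal to cos(2*psi) and sin(2*psi) over a
# uniform set of angles, so it drops out of the fit.
psis = np.linspace(0.0, np.pi, 360, endpoint=False)
design = np.column_stack([np.cos(2 * psis), np.sin(2 * psis)])
(q_hat, u_hat), *_ = np.linalg.lstsq(design, observe(psis), rcond=None)
```

This is the sense in which a well-chosen scanning strategy "rotates away" part of the systematic, and why the more constrained scanning geometry of a satellite can leave more of it behind.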

The last word goes to a still-doubtful cosmologist
I repeat what I’ve said before in response to the BICEP2 analysis, namely that the discussion of foregrounds in their paper is disappointing. I’d also say that I think the foreground emission at these frequencies is so complicated that none of the simple approaches that were available to the BICEP2 team are reliable enough to be convincing. ... I think BICEP2 has definitely detected something at 150 GHz but we simply have no firm evidence at the moment that it is primordial. That will change shortly, with the possibility of other experiments (specifically Planck, but also possibly SPTPol) supplying the missing evidence.

I’m not particularly keen on the rumour-mongering that has gone on, but then I’m not very keen either on the way the BICEP2 result has been presented in some quarters as being beyond reasonable doubt when it clearly doesn’t have that status. Yet.

Rational scepticism is a very good thing. It’s one of the things that makes science what it is. But it all too easily turns into mudslinging. 

Telescoper (alias Peter Coles), Blog In the dark, That BICEP Rumour…, 14/05/2014

Conclusion and moral of this story
The highly publicized announcement of the first experimental sign of the existence of primordial gravitational waves, and of proof of the veracity of cosmological inflation, was inevitably a bit inflated ...
Le transcyberphysicien, 15/05/2014

So much for the tongue-in-cheek conclusion; as for the moral, there is one that imposes itself again and again, at least every time an extraordinary scientific discovery is publicly announced:
Extraordinary claims require extraordinary evidence
Carl Sagan, Cosmos, Episode 12 - Encyclopaedia Galactica, 14/12/1980

//The writing of this post was completed on Thursday, 15/05/2014