No comment // or almost
//The blogger continues his dogged inquiry into supersymmetry and the naturalness problem, already discussed several times before (such as
here and
there), pausing today on the recent contributions of two physicists who are experienced yet still young
and hungry for discoveries!
... before the second "round" of LHC running at higher energies has exhausted the field of possibilities down to the 10⁻³ level?
//This is the view defended by H. Murayama, a Japanese theoretical physicist, in a recent article surveying the prospects opened up for experimental high-energy physics by the latest advances:
The discovery of a “Higgs-like particle” on July 4, 2012 was a truly historic moment in the history of science ... So far, what we’ve seen looks minimal ... It took a whopping eighty years to come to the point where we now have a UV-complete theory of strong, weak, and electromagnetic forces with all of the parameters measured ...
Despite this achievement, or rather because of it, there is a building anxiety in the community. How come we don't see anything else? Will the progress stop? There is no sign of physics beyond the Standard Model in the LHC data. For a typical search for supersymmetric particles, for example, squarks and gluinos are excluded up to 1.3 TeV or so. On the other hand, the conventional arguments based on the naturalness concept suggested that we should have new particles that stabilize the electroweak scale below TeV. It appears that "natural and simple" models have been excluded ...
I have to point out, however, that certain levels of fine-tuning do occur in nature. All examples I'm aware of, with the glaring exception of the cosmological constant, are at most at the level of a few per-mille. The current LHC limit has not quite reached that level; the next runs at 13-14 TeV may well reveal new physics as we hoped for.
Hitoshi Murayama, Future experimental programs, 01/11/2013
Don't give up on [the best(?) potential route to] solving a naturalness problem that has already led to several fundamental discoveries
//Here is the line of argument developed in the rest of Murayama's article:
... the best argument we have right now to expect new physics in the TeV range is the naturalness: we would like to avoid fine-tuning between the bare m_h² and the radiative correction ... Even though many in the community are ditching the naturalness altogether, I still take the argument seriously because it has worked many times before.
One example I always bring up is the discovery of the positron [24, 25]. In classical electrodynamics, the Coulomb self-energy of the electron is linearly divergent, ... It would have required a fine cancellation between the “bare” mass of the electron ... and the correction to yield a small mass [of] 0.511 MeV. However, the discovery of the positron and quantum mechanics told us that the vacuum is always fluctuating, producing a pair of e+e−, that annihilates back to the vacuum within the time allowed by the uncertainty principle ... When you place an electron in this fluctuating vacuum, it may find a (virtual) positron near it and decide to annihilate it. Then the other electron that was originally in the vacuum fluctuation is now left out and becomes a “real” particle. It turns out that this process cancels the linear divergence exactly, leaving only a logarithmic divergence. Even for an electron as small as the Planck distance, it amounts to only 9% correction. The cancellation is guaranteed by a (softly broken) chiral symmetry. You can see that the naturalness problem was solved by doubling the number of particles!
The idea of supersymmetry was pushed to repeat the history. Because the Higgs boson must repel itself, it also has a divergent self-repulsion energy ... But by doubling the number of particles (namely introducing superpartners), there is a cancellation between the self-repulsion among Higgs bosons, and the induced attraction arising from the loop of higgsinos (fermionic partner of the Higgs boson). Again, the correction is down to a logarithmic divergence ...
In the case of the electron, new physics (positron) appears “early” at [a] Compton wave length [of] 400 fm well before we get down to the smaller “classical radius of electron” r_c = e²/m_e c² ≈ 1 fm where the theory becomes fine-tuned. In another well-known case, however, nature did fine-tune it so that the discovery was delayed.
The example is COBE (Cosmic Background Explorer) that discovered the CMB anisotropy. People expected anisotropy at the level of 10⁻⁵ so that the observed large-scale structure can be explained. But the search went on, so much so that people started writing articles questioning the inflationary cosmology itself. When COBE discovered the quadrupole moment, it was small. Actually, compared to our best prediction today based on the WMAP data, it was nearly an order of magnitude smaller than theory. This is usually understood today as a consequence of cosmic variance, namely that the quadrupole moment has only 2l + 1 = 5 numbers to be measured and hence is subject to a statistical uncertainty of O(1/√5). I find the observed quadrupole moment to be fine-tuned at the 2% level. Note that the inflation was invented to solve the naturalness problems, horizon problem and flatness problem of the standard Big Bang cosmology. It worked: the current data beautifully confirm predictions of inflation. But it was a little fine-tuned and it required patience and more work. So the moral I draw from these examples is that the naturalness argument generally does work. But there are cases where fine-tuning at the level of a few percent or even [a] few per-mille [occurs] (some examples in nuclear physics are well-known, see [26]). ... we have not fully explored down to that level of not-that-fine-tuning yet. And it took ten years for Tevatron to discover [the] top [quark]. Patience pays, hence my optimism.
Ibid.
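Murayama's "only 9% correction" for a Planck-size electron can be checked with a quick back-of-the-envelope computation, using the standard one-loop QED result δm/m ≃ (3α/4π) ln(M_Pl/m_e). The numerical inputs below are standard values assumed for illustration, not taken from the article:

```python
import math

# One-loop QED self-energy correction to the electron mass:
# delta_m / m ~ (3 * alpha / (4 * pi)) * ln(M_Planck / m_e),
# the logarithm running from the electron mass up to the Planck scale.
alpha = 1 / 137.036      # fine-structure constant
m_e = 0.511e-3           # electron mass in GeV
M_planck = 1.22e19       # Planck mass in GeV

correction = 3 * alpha / (4 * math.pi) * math.log(M_planck / m_e)
print(f"relative mass correction ~ {correction:.0%}")  # → roughly 9%
```

The logarithm tames what would classically be a linear divergence: twenty-two orders of magnitude in scale produce only a ~9% shift.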
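By contrast, without the superpartner cancellation the Higgs correction grows with the square of the cutoff. A minimal sketch of the usual top-loop estimate |δm_h²| ≃ (3y_t²/8π²)Λ², with a 10 TeV cutoff assumed purely for illustration:

```python
import math

# Dominant top-quark loop contribution to the Higgs mass parameter (magnitude):
# |delta_mh2| ~ (3 * y_t**2 / (8 * pi**2)) * Lambda**2
y_t = 1.0        # top Yukawa coupling (close to 1)
Lambda = 10e3    # cutoff scale in GeV (assumed 10 TeV)
m_h = 125.0      # Higgs boson mass in GeV

delta_mh2 = 3 * y_t**2 / (8 * math.pi**2) * Lambda**2
tuning = m_h**2 / delta_mh2  # required cancellation between bare term and loop
print(f"fine-tuning ~ {tuning:.1%}")  # → about 0.4%
```

A 10 TeV cutoff already demands a per-mille-level cancellation, which is exactly the "not-that-fine-tuning" territory Murayama argues the next LHC runs should probe.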
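The COBE quadrupole argument rests on a simple statistical fact: C_2 is estimated from only 2l + 1 = 5 Gaussian coefficients a_2m. The standard cosmic-variance formula ΔC_l/C_l = √(2/(2l+1)) (the factor of 2 is my addition, from C_l being a variance estimator) makes the O(1/√5) uncertainty explicit:

```python
import math

def cosmic_variance(l):
    """Relative statistical uncertainty on C_l from having only 2l+1 modes."""
    return math.sqrt(2 / (2 * l + 1))

# Quadrupole: only 5 independent a_2m coefficients are observable.
print(f"l=2: {cosmic_variance(2):.0%} uncertainty")  # → 63%, same order as 1/sqrt(5)
```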
History cannot serve as scientific justification, but retrospective analysis teaches us a great deal about past errors and missed opportunities
//This is what N. Arkani-Hamed, a young (he is the same age as the blogger ;-) and brilliant American theoretical physicist, shows in an article that lays out, in clear and condensed form, the theoretical challenge posed by the experimental discovery of the Higgs boson:
As is well known, for massive particles of spin one like the W and Z bosons, merely ignoring mass at high energies is missing one piece of physics, given the mismatch in the degrees of freedom between the three massive polarization states and the two helicities of the massless spin 1 particles. Furthermore, the interactions of the third ‘longitudinal’ component become large at high energies. Something new is needed to unitarize scattering processes involving the longitudinal W’s and Z’s, and from the list of consistent possibilities for particle interactions, only spin 0 particles—the simplest possibility being a single Higgs particle—can do the job. It is remarkable that this simplest possibility is actually realized in Nature ... why is the Higgs mass so much smaller than ultraviolet scales like the Planck scale? For massless particles with spin, we have a satisfying answer to this question: as we have already remarked, the number of spin degrees of freedom are discretely different for massless versus massive particles: 3≠2 for spin 1, and 5 ≠ 2 for spin 2. Therefore, interactions cannot change massless particles into massive ones. But the situation is different for the Higgs: 1 = 1, there is no difference in degrees of freedom between massive and massless spin 0 particles, and thus no direct understanding for the lightness of the Higgs, absent new physics at the weak scale.
This is not the first time naturalness arguments have arisen in the development of fundamental physics. Indeed, issues similar to the Higgs tuning problem arose three times in the 20th century, and were resolved in the expected way, with ‘natural’ new physics showing up where expected... naturalness issues arose in the context of [(i) the linearly divergent energy stored in the electric field surrounding a point-like electron, softened by the quantum field effects of the positron, (ii)] the mass splitting between the charged and neutral pions, where the ρ meson cuts off the quadratically divergent contribution to the charged pion mass from the photon loop at the required scale, [(iii)] as well as with K–K̄ mixing, where the charm quark appeared where needed to cut off the quadratically divergent contribution to Δm_K.
Of course arguments from history are suspect, and can be used to illustrate any polemical point one wishes to make. If we go back further in time, naturalness arguments have also failed spectacularly. A good example relates to the heliocentric model of the solar system, which was already advanced in ancient times by Aristarchos. He had already brilliantly determined the distance from the Earth to the Sun, which was thought to be enormously large. But by putting the Sun at the center of the solar system, his theory made a prediction of parallax for the distant stars. This is too small to be seen by the naked eye. Of course he knew a way out—he had to declare that the distant stars were even more ridiculously far away than the Sun. This struck most of his contemporaries as obviously wrong, and special pleading for this model: why should the earth be so close to one object, and so far away from all the others, conveniently far enough for the model not to be false? Their notion of naturalness thus led them to reject the simple and correct model of the solar system.
Nima Arkani-Hamed, Beyond the Standard Model theory, 01/10/2013
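The degree-of-freedom counting behind Arkani-Hamed's argument is simple enough to tabulate. A minimal sketch (the massless rule below encodes the fact that a massless particle of nonzero spin carries only two helicity states):

```python
def dof(spin, massive):
    """Physical polarization states of a particle of a given (integer) spin."""
    if massive:
        return 2 * spin + 1          # 2s+1 polarizations for a massive particle
    return 1 if spin == 0 else 2     # massless: 1 state for spin 0, else 2 helicities

for s, name in [(0, "Higgs (spin 0)"), (1, "W/Z (spin 1)"), (2, "graviton (spin 2)")]:
    print(f"{name}: massive {dof(s, True)} vs massless {dof(s, False)}")
# spin 1: 3 != 2 and spin 2: 5 != 2, so interactions cannot turn massless into
# massive; but spin 0: 1 == 1, so nothing protects the lightness of the Higgs.
```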
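The quantitative gap Aristarchos was up against can be illustrated with modern numbers (assumed here, not in the article): the nearest star sits at about 1.3 parsecs, and the naked eye resolves roughly one arcminute at best.

```python
# Annual parallax in arcseconds: p = 1 / d, with d in parsecs (by definition
# of the parsec). Modern values, used only to illustrate the ancient argument.
d_proxima_pc = 1.3            # distance to the nearest star, Proxima Centauri
naked_eye_arcsec = 60.0       # ~1 arcminute angular resolution of the naked eye

parallax = 1 / d_proxima_pc   # → ~0.77 arcsec
print(f"parallax ~ {parallax:.2f} arcsec, "
      f"{naked_eye_arcsec / parallax:.0f}x below naked-eye resolution")
```

Even the largest stellar parallax is nearly two orders of magnitude below what the unaided eye can resolve, which is why Aristarchos's "special pleading" was in fact correct.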
//It remains to be seen whether today's physicists are not, in a way, repeating the past error of Aristarchos's contemporaries by neglecting, for the moment, certain models that do not fit within the perhaps too narrow framework of their notion of naturalness ... but that is another story, debated
elsewhere by the blogger.
Addendum, October 30, 2015
Within the scope of this blog, I find it worthwhile to supplement Nima Arkani-Hamed's remarks by spelling out the only written source on the work of Aristarchos at issue here, all the more so as it is an excerpt from an extraordinary text,
l'Arénaire (The Sand Reckoner), written by (one of) the first physicist(s) of antiquity, Archimedes:
You are aware the 'universe' is the name given by most astronomers to the sphere the centre of which is the centre of the earth, while its radius is equal to the straight line between the centre of the sun and the centre of the earth. This is the common account as you have heard from astronomers. But Aristarchus has brought out a book consisting of certain hypotheses, wherein it appears, as a consequence of the assumptions made, that the universe is many times greater than the 'universe' just mentioned. His hypotheses are that the fixed stars and the sun remain unmoved, that the earth revolves about the sun on the circumference of a circle, the sun lying in the middle of the orbit, and that the sphere of fixed stars, situated about the same centre as the sun, is so great that the circle in which he supposes the earth to revolve bears such a proportion to the distance of the fixed stars as the centre of the sphere bears to its surface.
Archimedes
3rd century B.C.