From observational consistency to mathematical consistency...
My first point is that the conditions of theory choice should be ordered. Criteria for theory choice are frequently listed in a flat manner, with no criterion given precedence over another a priori: consilience, simplicity, falsifiability, naturalness, consistency, and economy appear together in an unordered list of factors for judging a theory. However, consistency must take precedence over all other factors. Observational consistency is obviously central to everyone, most especially our experimental colleagues, when judging the relevance of a theory for describing nature. There are subtleties: a theory may be observationally consistent across a vast number of observables yet fail on a few, with no other decent theory around to replace it. In other words, observational consistency is still the top criterion, but the best available theory may not be 100% consistent. Nevertheless, it is a criterion that all would place at the top of the list.

Mathematical consistency, on the other hand, is not as fully appreciated... Mathematical consistency has a preeminent role right up there with observational consistency, and can be just as subtle, time-consuming, and difficult to establish. We have seen that in the case of effective theories it trumps other theory-choice considerations such as simplicity, predictivity, testability, etc.
My second point builds on the first. Since consistency is preeminent, establishing it must take the highest priority among all conditions. Deep, thoughtful reflection and work to establish the underlying self-consistency of a theory takes precedence over finding ways to make it more natural or to give it fewer parameters (i.e., make it simpler). Equally high priority must go to understanding all of its observational implications. A theory should not be allowed to remain fuzzy on either of these two counts before the higher-order issues of simplicity, naturalness, and economy take center stage. That this work might take considerable time and effort should not be held against a theory's value, just as it is not a theory's fault if it takes humans decades to build a collider of sufficiently high energy and luminosity to test it.
Additionally, dedicated effort on the mathematical consistency of a theory, or class of theories, can have enormous payoffs in helping us understand and interpret the implications of various theory proposals and data in broad terms. An excellent example from recent years is the work of Adams et al. [15], who showed that some theories in the infrared with a cutoff cannot be self-consistently embedded in an ultraviolet-complete theory without violating standard assumptions regarding superluminality or causality. The temptation can be high to start manipulating theories into simpler and more beautiful versions before due diligence is applied to determine whether they are sick at their cores. This should not be rewarded...
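To give the flavor of such a constraint, consider a minimal sketch drawn from the positivity-bound literature (the specific Lagrangian and sign condition below are the standard illustrative example, not a summary of the full argument of [15]). For a single scalar with the infrared effective Lagrangian

\[
\mathcal{L} \;=\; \tfrac{1}{2}(\partial\phi)^2 \;+\; \frac{c}{\Lambda^4}\left[(\partial\phi)^2\right]^2 \;+\; \cdots ,
\]

a standard ultraviolet completion (local, Lorentz-invariant, causal, with an analytic S-matrix) requires $c > 0$; for $c < 0$, small fluctuations around suitable backgrounds propagate superluminally, signaling that no such embedding exists. The instructive point is that a single sign, invisible to casual low-energy model building, can decide whether any consistent ultraviolet parent theory exists at all.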
Finally, I would like to make a comment about the implications of this discussion for the LHC and other colliders that may come in the future...
In the years since the charm quark was discovered in the mid-1970s there has been tremendous experimental progress and important new discoveries, including the recent discovery of a Higgs-boson-like state [20], but no dramatic new discovery that can put us on a straight and narrow path beyond the SM. That may change soon at the LHC. Nevertheless, it is expensive in time and money to build higher-energy colliders, our main reliable transporter into the high-energy frontier. This limits the prospects for fast experimental progress.
In the meantime, though, hundreds of theories have been born and have died. Some have died from incompatibility with new data (e.g., simplistic technicolor theories, or simpleminded no-scale supersymmetry theories), while others have died from their own self-consistency problems (e.g., some extra-dimensional models, some string phenomenology models, etc.). In both cases, it was care in establishing consistency, both with past data and with mathematical rigor, that doomed them. In that sense, progress is made. Models come to the fore, fall under the spotlight, and either fall or survive. When a theory attempts to explain everything, its consistency is stretched to the maximum. For example, it is not fully appreciated in the supersymmetry community that it may be difficult even to find a "natural" supersymmetric model with a reheat temperature high enough to enable baryogenesis without causing problems elsewhere [21a, 21b]. There are many examples of ideas falling apart when they are pushed hard to stand up to the full body of evidence of what we already know.
Relatively speaking, theoretical research is inexpensive. It is natural, then, that a shift should develop in fundamental science. The code of values in theoretical research will likely alter over time as experimental input slows. Ideas will be pursued more rigorously and analysed more critically. Great ideas will always be welcome. However, soft model-building tweaks for simplicity and naturalness will become less valuable than rigorous tests of mathematical consistency. Identifying distant-future experimental implications of theories not yet fully vetted will become less valuable than rigorous computations of observational consistency across the full body of currently known data. One can hope that unsparing devotion to full consistency, both observational and mathematical, will be the hallmark of the future era.

James D. Wells (Submitted on 3 Nov 2012)