Science is unabashedly radical, willing to toss aside established wisdom and ideas to embrace mind-warping new concepts (if the data backs them up). Science is relentlessly conservative, deeply suspicious of new claims and determined to hold firm to cherished truths that have stood the test of time. As strange as it may seem, both of these contradictory statements have held true throughout the 450-year history of "modern" science. In fact, they are the root of the stability and creativity that give science its cultural power.
But when a research field faces a crisis, how do scientists know which side of the radical/conservative divide to embrace? When should scientists hold to their conservatism and when should they jump ship for the promise of radicalism?
These questions underpin the enigma of science. If you are looking for a modern example of this tension, you need look no further than the field that brought you the Higgs boson. That is because particle physics has a problem and, for some, the solution means a radical rewriting of the discipline's fundamental rules.
For a century, the goal of particle physics (also called high-energy physics) has been to find the fundamental stuff of reality. More important still was finding the mathematical rules — the laws — governing that stuff. Through decades of painstaking effort, particle physicists created the grand Standard Model, which neatly describes a cosmos of particles called quarks and leptons. The interactions between those particles are mediated by other particles called bosons. Together, the description of particles and interactions makes the Standard Model a work of staggering power. There is only one problem.
The Standard Model is not natural.
When physicists use the term "naturalness" they are speaking specifically about the constants of nature that feed into the mathematical laws. These constants describe things like the strength of interactions between different classes of particles. They are numbers that have to be measured directly in experiments.
In a "natural" theory the size of these numbers should eventually "make sense." That means their different values would eventually be explainable within the context of the next level of mathematical laws. Having some constants be wildly tiny and others wildly large and never having an explanation for why nature "chose" those values would be ... well ... unnatural.
The Standard Model is, unfortunately, pretty unnatural. It contains a mess of constants whose values don't neatly fit any coherent explanation. Worse, if some of those values were to be tweaked, even slightly, we would end up with a very different kind of universe. This is part of the well-known problem called "the fine-tuned universe."
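To give one standard illustration of what "unnatural" means here (this example is my gloss, not part of the text above): the measured Higgs boson mass, about 125 GeV, sits wildly far below the Planck scale, and quantum corrections tend to drag it toward that much larger scale. Keeping it small requires an apparently unexplained cancellation. Schematically, under the usual effective-field-theory assumptions:

```latex
% Quantum corrections push the Higgs mass-squared toward the cutoff scale \Lambda:
\[
  m_H^{2,\mathrm{obs}} \;=\; m_H^{2,\mathrm{bare}} \;+\; \Delta m_H^2,
  \qquad
  \Delta m_H^2 \;\sim\; \frac{\lambda^2}{16\pi^2}\,\Lambda^2 .
\]
% If \Lambda is near the Planck scale, the observed value demands that two huge
% numbers cancel to dozens of decimal places, since
\[
  \frac{m_H^2}{M_{\mathrm{Pl}}^2}
  \;\sim\;
  \left(\frac{125~\mathrm{GeV}}{10^{19}~\mathrm{GeV}}\right)^{2}
  \;\sim\; 10^{-34}.
\]
```

That kind of unexplained, razor-fine cancellation is exactly the sort of "coincidence" a natural theory is supposed to forbid.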
For decades physicists have searched for a deeper level of theory that would make sense of all the "coincidences" in these constants. The hope was that a single, all-embracing law would be found that would explicitly and uniquely tell us why this one universe looks the way it does.
Alas, the search seems to have failed and in the wake of that failure some researchers are ready to throw in the towel on naturalness. These physicists ask if, perhaps, those constants of nature are not set by some deeper level of law but are just the result of blind randomness. Since this move works only in a cosmos composed of many distinct "pocket" universes, stepping away from naturalness demands the acceptance of a multiverse.
Marcelo and I have written about the multiverse many times before, and the point today is NOT its ultimate veracity. No. Today I want to dwell on the choice. Do physicists hold to their Dreams of a Final Theory (as Steven Weinberg puts it) for this universe we live in? Do they stand with conservatism? Or, in the face of theoretical blocks at every turn, do they abandon that goal and embrace radicalism in the form of a new, dizzying possibility known as the multiverse?
Here is the important point to consider. What is occurring in particle physics now is not unique. In particular, as science has pushed forward it has often had to adapt to, and embrace, ideas that a generation before would have deemed heresy. There was resistance from purely mechanically minded physicists when the concepts of fields — entities that extend through all space — were first introduced in the study of electricity and magnetism. And don't even get me started on quantum mechanics!
Look at the history of science and you will see that some pretty essential elements of the enterprise have been bent and stretched to accommodate its push into new territory. But the big question, the burning question, is this: Before the data is clear (or before there is any data at all, as with multiverse models), how do you know when to conserve and when to radicalize?