
Reductionism in quantum physics – a naturalness mire?

[Draft] [“Beyond the Standard Model” series]

Background on the “crisis”

This Quanta Magazine article (below) has an eye-catching title, but its gist relates to the hierarchy problem, which I discussed in a prior post. That 2017 post (and additional commentary) used quotes by physicists – Sean Carroll, Leon Lederman, Fermilab’s Don Lincoln (video) – and Wiki to cover the context and vocabulary of the “crisis” discussed here, as well as the common mathematical techniques involved.

Rather than adding notes on this recent article as a comment to my old post, I decided that a new post was appropriate, as a follow-up to the recent post “Beyond the Standard Model – sliver of reality?”

The rethink

• Quanta Magazine > “A Deepening Crisis Forces Physicists to Rethink Structure of Nature’s Laws” by Natalie Wolchover, Senior Writer/Editor (March 1, 2022) – In a slew of recent papers, researchers have thrown reductionism to the wind.

I’m not sure that starting her article by discussing Thomas Kuhn’s book helps Wolchover’s case.[1] However, expectations that the world’s great particle colliders (the Large Hadron Collider) and mathematical models (supersymmetry, string theory, etc.) would resolve some major puzzles have yet to be fulfilled.

That missing resolution particularly concerns the concepts of “naturalness” and “fine-tuning” of the values which define fundamental parameters.

Wiki: The heuristic rule that parameters in a fundamental physical theory should not be too fine-tuned is called naturalness.

So, revisiting assumptions makes sense. Seeking something more profound. Particularly regarding the mass (energy scale) of the Higgs boson. Tallying self-interactions. (There’s a useful graphic in the article.[2]) And so-called vacuum energy – its energy scale also.
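
To put rough numbers on the Higgs puzzle (my own back-of-envelope, not from the article): the measured Higgs mass sits some seventeen orders of magnitude below the Planck scale.

\[
m_H \approx 125\ \text{GeV}, \qquad
M_{\text{Planck}} \approx 1.2 \times 10^{19}\ \text{GeV}, \qquad
\left( m_H / M_{\text{Planck}} \right)^2 \approx 10^{-34}
\]

If the quantum corrections to m_H^2 run all the way up to the Planck scale, the bare value and the corrections must cancel to roughly one part in 10^34 – the fine-tuning that “naturalness” objects to.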

Researchers are increasingly zeroing in on what they see as a weakness in the conventional reasoning about naturalness. It rests on a seemingly benign assumption, one that has been baked into scientific outlooks since ancient Greece: Big stuff consists of smaller, more fundamental stuff — an idea known as reductionism.

They’re exploring novel ways in which big and small distance scales might conspire, producing values of parameters that look unnaturally fine-tuned from a reductionist perspective.

Maybe looking at the problem in a new way … building on the notion of effective theories (effective field theory) and a practical ignorance of detail.

Physicists refer to low-energy, long-distance physics as “the IR,” and high-energy, short-distance physics as “the UV,” drawing an analogy with infrared and ultraviolet wavelengths of light.

You can, for instance, model water with a hydrodynamic equation that treats it as a smooth fluid, glossing over the complicated dynamics of its H2O molecules. The hydrodynamic equation includes a term representing water’s viscosity — a single number, which can be measured at IR scales, that summarizes all those molecular interactions happening in the UV.
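
As a concrete instance of that quote (a standard equation, my addition): in the Navier–Stokes momentum equation, the single IR-measurable parameter ν (the kinematic viscosity) stands in for all the UV molecular detail.

\[
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\,\mathbf{u}
= -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^2 \mathbf{u}
\]

Nothing in the equation knows about individual H2O molecules; whatever they do at short distances is summarized by ν.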

Wolchover cites the success of applying effective field theory (EFT) to predicting the mass of the charm quark. Cutoff energies for applying a model. High vs. low-energy corrections (to avoid infinities).

A cutoff not far above the mass of the Higgs boson itself would make the Higgs about as heavy as the corrections coming from the cutoff, and everything would look natural.
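
Schematically (a textbook form, my addition, not from the article), the EFT tally behind that statement is:

\[
m_H^2(\text{physical}) \;=\; m_H^2(\text{bare}) \;+\; c\,\Lambda_{\text{cutoff}}^2
\]

where c is a loop-suppressed (order 1/16π²) combination of couplings. If Λ_cutoff is near the Planck scale, the two terms on the right must cancel almost exactly; if Λ_cutoff sits only a little above the Higgs mass – the quoted scenario – no delicate cancellation is needed.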

And then revisiting gravity, which “doesn’t play by the usual reductionist rules.” As in black holes, where: “More energy no longer lets you see shorter distances.”

quantum gravity seems to toy with nature’s architecture, making a mockery of the neat system of nested scales that EFT-wielding physicists have grown accustomed to.

UV-IR mixing potentially resolves naturalness problems by breaking EFT’s reductionist scheme.

[Nathaniel Craig, a theoretical physicist at UCSB] “Gravity violates the normal EFT reasoning because it mixes physics at all length scales — short distances, long distances.”

Another perspective: the observable universe as a particle box with a limited number of high-energy particle states. That number depends on the surface area rather than the volume, so there’s “far less high-energy activity than the EFT calculation assumes.”

That means the usual EFT calculation of the cosmological constant [an IR property of the whole universe] is too naive.
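
Hedged numbers for the vacuum-energy version (standard order-of-magnitude statements, my addition): the naive EFT estimate sums zero-point energies up to the cutoff, while the observed value is famously about 120 orders of magnitude smaller.

\[
\rho_{\text{vac}}^{\text{EFT}} \sim \Lambda_{\text{cutoff}}^4 \sim M_{\text{Planck}}^4,
\qquad
\rho_{\text{vac}}^{\text{observed}} \sim (10^{-3}\ \text{eV})^4
\]

And the Bekenstein–Hawking entropy bounds the number of states in a region by its surface area rather than its volume,

\[
N_{\text{states}} \lesssim e^{S}, \qquad S = \frac{A}{4\,l_P^2}
\]

which is why the quote calls the usual volume-based EFT count of high-energy activity too naive.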

Of course, another approach uses string theory – for UV-IR mixing. Correlations which cancel infinities.

The hierarchy problem, in this context, asks why corrections from these string states don’t inflate the Higgs, if there’s nothing like supersymmetry to protect it.

The new models represent a growing grab bag of UV-IR mixing ideas.

And the quest for a quantum gravity model continues. Experimental evidence? Testable predictions? Perhaps “the whole UV-IR mixing concept lacks promise” – a promise of a paradigm shift, eh.

The Standard Model stands.


My take

So, is there a crisis in physics? An impending paradigm shift?

What qualifies as a paradigm shift? Changes in core concepts. A new conceptual and mathematical framework. Reconsideration of exemplars and shared preconceptions. A new way of viewing reality which reconciles anomalies. Something akin to a (gestalt-like) flip in perception of an ambiguous image.

Wilczek’s parable of intelligent deepwater fish might qualify – figuring out that they are not living in empty space but in a medium called water.

Otherwise, Wiki cites some of the “classical cases” of Kuhnian paradigm shifts in science.

The Standard Model of physics is cited as an example of a currently accepted paradigm. As well as moving beyond Newtonian gravity.

The many contributions problem

Historically, the adoption of a heliocentric model of the solar system over a geocentric one is cited as a paradigm shift. Happening over decades “through a complex social process.” Not merely a shift in coordinate systems. Not merely a fine tuning of predictive calculations. Not merely an alternative explanation for everyday experience.

The geocentric Ptolemaic system entailed epicycles, circles moving on other circles, in order to align with astronomical observables – motions in the heavens. Predictions involved tallying those additional geometric contributions.

Wiki: Epicycles worked very well and were highly accurate, because, as Fourier analysis later showed, any smooth curve can be approximated to arbitrary accuracy with a sufficient number of epicycles.
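
The Fourier point is easy to demonstrate. Below is a minimal sketch (my own illustration, assuming NumPy; not from Wiki): a planet on a Kepler ellipse, tracked in uniform time, is reproduced to increasing accuracy by a sum of uniformly rotating circles – deferent and epicycles as a truncated complex Fourier series.

  import numpy as np

  def kepler_position(M, a=1.0, e=0.6):
      # Solve Kepler's equation M = E - e*sin(E) by Newton iteration,
      # then return the orbital position as a complex number x + i*y.
      E = M.copy()
      for _ in range(20):
          E -= (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
      return a * (np.cos(E) - e) + 1j * a * np.sqrt(1.0 - e * e) * np.sin(E)

  n = 1024
  M = 2.0 * np.pi * np.arange(n) / n     # uniform "time" (mean anomaly)
  z = kepler_position(M)                 # sampled positions along the orbit

  # Truncated complex Fourier series: each term c_k * exp(i*k*M) is one
  # uniformly rotating circle ("epicycle"); more circles, better accuracy.
  for n_circles in (1, 3, 5, 9, 17):
      k = np.arange(-(n_circles // 2), n_circles // 2 + 1)
      coeffs = np.array([np.mean(z * np.exp(-1j * kk * M)) for kk in k])
      approx = (coeffs * np.exp(1j * np.outer(M, k))).sum(axis=1)
      print(f"{n_circles:2d} epicycles: max error = {np.max(np.abs(approx - z)):.1e}")

Adding circles drives the error down – no physics required, exactly the Wiki point that epicycles could match any smooth motion given enough of them.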

But computational complexity was not the only factor in competing models. And the “purity” of circular motion persisted.

As Wiki notes:

The geocentric model was eventually replaced by the heliocentric model. Copernican heliocentrism could remove Ptolemy’s epicycles because the retrograde motion could be seen to be the result of the combination of Earth and planet movement and speeds. Copernicus felt strongly that equants were a violation of Aristotelian purity, and proved that replacement of the equant with a pair of new epicycles was entirely equivalent. Astronomers often continued using the equants instead of the epicycles because the former was easier to calculate, and gave the same result.

It has been determined [by whom?], in fact, that the Copernican, Ptolemaic and even the Tychonic models provided identical results to identical inputs. They are computationally equivalent. It wasn’t until Kepler demonstrated a physical observation that could show that the physical sun is directly involved in determining an orbit that a new model was required.

The geocentric system was still held for many years afterwards, as at the time the Copernican system did not offer better predictions than the geocentric system, and it posed problems for both natural philosophy and scripture. The Copernican system was no more accurate than Ptolemy’s system, because it still used circular orbits. This was not altered until Johannes Kepler postulated that they were elliptical … .

Perhaps a paradigm shift – a new model – in quantum physics might entail something corresponding to:

  • Supplanting “epicycles” (loop diagrams?),
  • Supplanting purity / naturalness of “circles” [3],
  • Introducing another factor (“the physical sun” in the above quote) directly involved in determining quantum energies.

Some of my posts speculate that puzzles about the Higgs “mass” and vacuum energy might have to do with the vocabulary of “mass” and “charge,” for example. The notion that so-called point particles have such reducible aspects or properties. That such terms define intrinsic properties of so-called fundamental particles rather than topologically induced effects in energy fields and fluxes – effects which are measured as those properties.

Is “UV-IR mixing” a similar drift? Or a revised regularization scheme? Perspective flip or alternative reckoning? I’m not sure if “mixing” contains a sense of interactions between layers of an energy structure (Wilczek’s Grid), within a gestalt-like topology.

So, I’m revisiting Wiki’s article on Naturalness (physics) regarding “Naturalness and the gauge hierarchy problem” for the Higgs boson mass (a schematic form of the correction follows the list):

  • Effective field theory
  • Cutoff scale (“cut-off to the divergent loop integrals”)
  • Tallying independent contributions for an observable
  • The divergent radiative correction (“blow up” or managing “divergences to all orders in perturbation theory”)
  • Mixing and loop contributions
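
For reference, the dominant “divergent radiative correction” there is the top-quark loop; schematically (a standard one-loop estimate, my addition):

\[
\delta m_H^2 \;\approx\; -\frac{3\,y_t^2}{8\pi^2}\,\Lambda^2 \;+\; (\text{gauge and Higgs self-coupling terms})
\]

Each independent contribution grows with the square of the cutoff Λ – the tally that the naturalness argument says should not require delicate cancellations.
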
Notes

[1] Wolchover starts her article with a not uncommon go-to: Thomas Kuhn’s seminal 1962 book The Structure of Scientific Revolutions. The book was a favorite study when I was in graduate school. It is still often cited (through succeeding editions as well), but my impression is that the evolving critique of the book – particularly of the notion of paradigm shift – has muted its domain of applicability, raised the possibility that a more complete consideration of the history of science might not require Kuhn’s “sharp distinction between paradigmatic and non-paradigmatic science,” and left a more elusive sense of progress.

Wiki: According to a report released in 2014 by the National Science Foundation, 26% of Americans surveyed believe that the sun revolves around the Earth.

Wiki: According to Max Planck, “a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Wiki: Many philosophers and historians of science, including Kuhn himself, ultimately accepted a modified version of Kuhn’s model, which synthesizes his original view with the gradualist model that preceded it. Kuhn’s original model is now generally seen as too limited.

Personally, I wonder about paradigms and physics in the context of language – the habit of applying everyday language to discuss concepts and mathematical models wildly disconnected from human experience. Among others, Sean Carroll talks a lot about this context of emergent layered descriptions of reality, each using its own appropriate vocabulary.

My issue is that “stacking” layers of vocabulary likely compromises our understanding at some point – and makes the only shared framework a mathematical one, accessible only to an analytical elite.

So, I’m not sure that a deeper understanding of quantum theory, even replacing the vocabulary of “particle, force, mass, charge,” etc., will resolve outstanding problems in modern physics. Let alone entail a so-called paradigm shift. Even with higher-dimensional topologies.

[2] The graphic includes, as a possible solution, the idea that: “The Higgs scale and Planck scale are connected through a complex set of push-and-pull effects.” Replacing the “push & pull” metaphor with “interactions” between Grid layers (e.g., quantum field energy and vacuum energy) is more to my liking.

[3] Also, other departures from “perfection” akin to celestial flaws, mutability, and chaotic dynamics.

Related posts

Sisyphean hierarchy (April 13, 2017)

Effective theory

How stiff is space-time?

3 thoughts on “Reductionism in quantum physics – a naturalness mire?”

  1. As an example of a potential paradigm shift, here’s another use of advanced simulations in contemporary physics research.

    • Phys.org > “A solar illusion: Coronal loops may not be what they seem” by National Center for Atmospheric Research (March 2, 2022)

    The research, led by the National Center for Atmospheric Research (NCAR) and published in The Astrophysical Journal, relied on a cutting-edge, realistic 3D simulation of the solar corona.

    As sheets of bright plasma fold over themselves, the folds look like bright thin lines, mimicking the look of distinct and self-contained strands of plasma. … which the research team is calling the “coronal veil” hypothesis …

    “I have spent my entire career studying coronal loops,” said NCAR scientist Anna Malanushenko, who led the study. “I was excited that this simulation would give me the opportunity to study them in more detail. I never expected this. When I saw the results, my mind exploded. This is an entirely new paradigm of understanding the Sun’s atmosphere.”

    However, the coronal loops seen on the Sun have never behaved exactly as they should, based on our understanding of magnets. …

    The possibility that these loops are instead wrinkles in a coronal veil helps explain this and other discrepancies with our expectations of the loops …

    “This study reminds us as scientists that we must always question our assumptions and that sometimes our intuition can work against us,” Malanushenko said.

    While the MURaM simulation is one of the most realistic ever created of the solar corona, it’s still just a model. Understanding how many coronal loops are actually optical illusions will require carefully designed observational methods that probe the corona and new data analysis techniques.

    “We know that designing such techniques would be extremely challenging, but this study demonstrates that the way we currently interpret the observations of the Sun may not be adequate for us to truly understand the physics of our star,” Malanushenko said.

  2. I was reading a Science.org article [1] and encountered a reference to lattice quantum chromodynamics.

    Wiki: Lattice QCD is a well-established non-perturbative approach to solving the quantum chromodynamics (QCD) theory of quarks and gluons. It is a lattice gauge theory formulated on a grid or lattice of points in space and time. When the size of the lattice is taken infinitely large and its sites infinitesimally close to each other, the continuum QCD is recovered.

    Numerical lattice QCD calculations using Monte Carlo methods can be extremely computationally intensive, requiring the use of the largest available supercomputers. To reduce the computational burden, the so-called quenched approximation can be used, in which the quark fields are treated as non-dynamic “frozen” variables. While this was common in early lattice QCD calculations, “dynamical” fermions are now standard. These simulations typically utilize algorithms based upon molecular dynamics or microcanonical ensemble algorithms.
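
    To make the “grid of points” concrete, here is a toy sketch (my own, and vastly simpler than real lattice QCD, assuming NumPy): Metropolis Monte Carlo for compact U(1) gauge theory on a 2D periodic lattice. Real QCD uses SU(3) matrices on a 4D lattice, but the structure – fields living on links, an action built from plaquettes, Monte Carlo updates – is the same.

      import numpy as np

      rng = np.random.default_rng(0)
      L, beta, sweeps = 16, 2.0, 200
      theta = rng.uniform(-np.pi, np.pi, size=(L, L, 2))  # link angles theta[x, y, mu]

      def plaquette(x, y):
          # Oriented sum of the four link angles around the square at (x, y).
          return (theta[x, y, 0] + theta[(x + 1) % L, y, 1]
                  - theta[x, (y + 1) % L, 0] - theta[x, y, 1])

      def local_action(x, y, mu):
          # Wilson action restricted to the two plaquettes containing link (x, y, mu).
          if mu == 0:
              return -beta * (np.cos(plaquette(x, y)) + np.cos(plaquette(x, (y - 1) % L)))
          return -beta * (np.cos(plaquette(x, y)) + np.cos(plaquette((x - 1) % L, y)))

      for _ in range(sweeps):                # Metropolis sweeps over every link
          for x in range(L):
              for y in range(L):
                  for mu in range(2):
                      old, s_old = theta[x, y, mu], local_action(x, y, mu)
                      theta[x, y, mu] = old + rng.uniform(-0.5, 0.5)
                      if rng.random() >= np.exp(s_old - local_action(x, y, mu)):
                          theta[x, y, mu] = old  # reject: restore the old link

      avg = np.mean([np.cos(plaquette(x, y)) for x in range(L) for y in range(L)])
      print(f"<cos(plaquette)> at beta = {beta}: {avg:.3f}")

    A pure-gauge update like this – with no dynamical quark fields in the update – is essentially what the “quenched approximation” above refers to.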

    Notes

    [1] That article (below) was interesting regarding diversity initiatives in college physics and Caltech’s positioning on this problem (as witnessed in publications and policy statements over the last decade) – efforts complementing more general initiatives in science, technology, engineering, and math (STEM).

    The article also connected with a recently read book: The Aristocracy of Talent: How Meritocracy Made the Modern World.

    But the essence of the American experiment remained the same: create equality of opportunity but expect that equality of opportunity to lead to a highly unequal outcome as people sorted themselves out according to their abilities and energies. – Wooldridge, Adrian. The Aristocracy of Talent: How Meritocracy Made the Modern World (p. 19). Skyhorse. Kindle Edition.

    • Science.org > “Fix the System, Not the Students” by Jeffrey Mervis (2 Mar 2022) – Change requires building bridges, removing barriers.

    A supportive environment:

    • A welcoming, nurturing departmental “climate” which looks beyond GPAs and standardized test scores, recognizing “other things besides their coursework that affect someone’s ability to succeed in school.”
    • Funded grad-undergrad mentoring – mentoring networks (not just the traditional teaching assistant or lone advisor).
    • Hands-on learning (a full range of learning modalities).
    • Student-led clubs.
    • Free tutoring.
    • Research internships, paid boot camps.
    • Bridge programs.
  3. Another article re the use of “paradigm shift.” In this case, revisiting the esoteric black hole information paradox (an active field of research within quantum gravity) – “that black holes are more complex than originally thought.”

    • Phys.org > “Scientists may have solved Stephen Hawking’s black hole paradox” (March 18, 2022)

    In the 1960s, physicist John Archibald Wheeler, discussing black holes’ lack of observable features beyond their total mass, spin, and charge, coined the phrase “black holes have no hair” – known as the no-hair theorem.

    However, the newly discovered “quantum hair” provides a way for information to be preserved as a black hole collapses and, as such, resolves one of modern science’s most famous quandaries, experts say.

    “It was generally assumed within the scientific community that resolving this paradox would require a huge paradigm shift in physics, forcing the potential reformulation of either quantum mechanics or general relativity.”

    [Roberto Casadio, professor of Theoretical Physics at the University of Bologna] “However, in the quantum theory [vs. the classical theory based on general relativity], the state of the matter that collapses and forms the black hole continues to affect the state of the exterior [of the black hole], albeit in a way that is compatible with present experimental bounds. This is what is known as ‘quantum hair.'”
