• Question: Is high performance computing going to be the majority of work in particle physics until we can start to make sense of the patterns?

    Asked by tulloche to Philippe on 18 Nov 2019.
    • Photo: Philippe Gambron

      Philippe Gambron answered on 18 Nov 2019:


      High performance computing is useful for processing the data and doing certain numerical calculations (like lattice QCD, which tries to describe the interactions between quarks).

      However, it will be of little use in understanding why particle physics is the way it is. Those questions do not involve heavy calculations at all; they would just require a very bright idea.

      We observe patterns among particles that we can’t explain. For example, particles come in three copies of increasing mass. The electron and its little brother, the neutrino, have “cousins”: identical particles, only heavier, called the muon and the tau (along with their corresponding neutrinos).

      The picture of particle physics that we have was established in the early 1970s and came to be known as the “Standard Model”. Since then, many attempts have been made to go further and explain those patterns. As far as I know, all of those attempts were based on symmetries. We would postulate a new symmetry that explained some features, but that symmetry would imply interactions and particles that we do not observe, which, in turn, required another explanation.

      It’s as if an archaeologist were trying to explain the shape of a pile of bricks that had been unearthed. A possible explanation is that it was part of a house, but then where is the rest of it? In that case, the answer is easy: we could say that the house collapsed and someone stole the bricks. In particle physics, however, it is not so simple, and the explanations required increasingly convoluted theories.

      The first attempt was based on the group SU(5), which extended the symmetry of the Standard Model. This extended symmetry implied the existence of new particles that would allow the proton to decay, but proton decay was never observed. This is when things started to go wrong. The establishment of the Standard Model had been the culmination of decades of successes but, since then, none of the predictions going beyond it has been verified. Of course, there have been many experimental successes since then, such as the discovery of heavier quarks and leptons, of the W and Z gauge bosons (the particles mediating the weak interaction) and, more recently, of the Higgs boson. But all of these were predictions of the Standard Model itself.

      Since the 1970s, many theories have been proposed in the hope that their new particles would be observed at the LHC. The LHC came and found nothing unexpected: it confirmed the Standard Model by discovering the Higgs boson, but it excluded some of the theoretical attempts to go further. So now, without new experimental input, it will be difficult to make progress, unless someone has a very bright idea.

      However, many puzzles are staring us in the face without the need for a larger particle accelerator. Some measurements do not give exactly the result predicted by the theory. Neutrinos can change flavour: we can describe this, but there is no explanation for their mixing. We also do not know why there is so little anti-matter, despite the fact that, in the lab, matter and anti-matter behave almost identically.
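      To illustrate the point about description versus explanation (this formula is my addition, not part of the original answer): in the simplest two-flavour approximation, the probability that a neutrino changes flavour after travelling a distance L with energy E is the standard oscillation formula, written here in natural units with mixing angle θ and squared-mass difference Δm²:

      ```latex
      P(\nu_\mu \to \nu_e) \;=\; \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2\, L}{4E}\right)
      ```

      The theory lets us measure θ and Δm² and predict the oscillation pattern, but it says nothing about why θ and Δm² have the values they do — that is exactly the kind of unexplained parameter the answer refers to.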

      So there has been no overwhelming discovery to show us the way, but there are many puzzles whose solutions could bring us forward.
