Accelerating the discovery of magnetic field imperfections

  • Author: Hubert Niewiadomski
  • Date: November 4, 2022
  • Read time: 11 min


This blog post is part of the ‘From CERN to Cledar’ series by Cledar founders Hubert Niewiadomski and Piotr Nyczyk, about how their combined 15 years of experience at CERN – one of the world’s largest research institutes – shaped the vision and values that make Cledar unique.

When I learned that, despite everybody’s best efforts, some of the measurements from the Large Hadron Collider (LHC for short) were still months away from being genuinely useful to the science community, I realized that I had to act, and to think differently, to find a solution. This is the story of how we (my colleagues at the time and I) anticipated physics that had never been measured before to accelerate scientific advancement at CERN.

The LHC is at the heart of CERN. Its purpose is to accelerate particles to nearly the speed of light in opposite directions. These particles then collide, and the data about the collisions is captured so that scientists can better understand the forces that drive and shape the universe.

When explaining this to non-scientific colleagues, such experiments can seem abstract and detached from our daily lives. And yet the findings they generate are incredibly relevant to how the world works today, and to the scientific and medical advances that we take for granted and will rely on in the future (in ways we don’t yet know). Tumor detection, the X-ray, CT scans, nuclear … we rely on the science behind these technologies every day, and they will continue to contribute to the health of mankind and, potentially, the planet. Yep, it’s pretty important stuff.

The TOTEM project at CERN

Within the LHC, beams of protons circulate within two pipes that cross over at key collision points. Thanks to Einstein’s theory of relativity, we know that particles, usually protons, gain ultra-high momenta and, on collision, can create new particles or demonstrate internal interactions between particles at a very deep level (normally not visible to the naked eye). These deep-level interactions reveal the nature of strong and weak interactions, and this is fundamental to understanding particle and nuclear physics, as well as cosmology and the origins of the universe.

Back in 2003, I was recruited to work on the TOTEM experiment at CERN – regarded as the ‘longest’ experiment at CERN because its detectors were spread across a distance of around half a kilometer. The goal of the TOTEM project was to study the deep interactions taking place between protons in the LHC. However, to even ‘see’ what was happening required an extremely powerful level of magnification. At CERN, the LHC was configured to work as a giant microscope, emitting proton beams to serve as probing waves that reveal the internal structure of other protons. To give an idea of scale, this configuration allowed us to achieve a level of detail 1,000 billion times greater than would be possible with a typical optical microscope.

Obstacles standing in the way of scientific progress

There were several key challenges that we needed to overcome in order for the LHC to deliver on its potential and generate useful, accurate data for the science community:

  • How to configure the magnetic fields of the LHC to bend the beams in order to provide the optimal magnifications and reduce the impact of external perturbations on the measurement.
  • How to control all the settings that could negatively influence the experiment, such as misalignments of the magnets, imprecision of the electric currents driving the magnets, mechanical imperfections of instrumentation, as well as other external factors. All of this needed to be controlled in order to improve the accuracy of any tests conducted.

But perhaps the greatest challenge was the cost and time associated with getting accurate results from the collider. The TOTEM experiment had been designed with certain assumptions, namely that it would deliver a high level of accuracy. But to achieve this state would require large amounts of ‘run time’ in order to optimize all of the settings and calibrate the array of instruments. The problem was that this required significant budget and it was difficult to find sponsors to fund this run time without the promise of accurate results immediately. Worse still, even with lengthy testing, the level of accuracy that could be delivered at the end of this process – accurate to within a few percent – was not sufficient for the ambitious physics programs planned by scientists reliant on the LHC.

In a way, this situation is not dissimilar to dilemmas faced by many businesses today. They know that data, data science, and technology can help them to interpret and make sense of the world around them, as well as predict future behavior with confidence and accuracy. And yet to achieve this, they need to invest, test, refine, and repeat. In rapidly changing markets and with ever-growing pressure to deliver on stakeholder expectations, it’s natural that some companies hesitate. You need to have a solid business case for your project and you need to be confident that you have the right people and skills in order to deliver on your vision. In an age of dizzying possibilities but also skill shortages, many companies struggle.

Machine Learning and data science deliver a solution

The TOTEM project was at risk. I took it upon myself to find or develop a solution to this impasse. In my role as Deputy Analysis Coordinator, I proposed that we apply Monte Carlo simulations to various LHC imperfections and see how they affected the proton tracks. I had observed that the proton tracks themselves carried information that could help us understand the optics imperfections.
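To make the idea concrete, here is a minimal toy sketch of the Monte Carlo step: draw machine imperfections at random within assumed tolerances, then propagate simulated protons through a simple linear optics model to see how those imperfections distort the tracks at the detector. Every parameter name, tolerance, and matrix value below is invented for illustration – this is not the real LHC optics, nor the actual TOTEM code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tolerances for two imperfection parameters:
# a relative magnet-current error and a beam-momentum offset.
TOLERANCES = {"dI_over_I": 1e-3, "dp_over_p": 1e-3}

def transport_matrix(dI_over_I, dp_over_p):
    """Toy 2x2 linear transport for a proton state (x, x'),
    perturbed by the two imperfection parameters.

    The nominal effective length and focusing term are made-up
    numbers, standing in for the real machine optics.
    """
    L_eff = 270.0 * (1.0 + dI_over_I)        # effective length [m]
    focus = -3.1e-3 * (1.0 + dp_over_p)      # focusing strength
    return np.array([[1.0 + dI_over_I, L_eff],
                     [focus, 1.0 + dp_over_p]])

def simulate_tracks(n, dI_over_I, dp_over_p):
    """Propagate n random proton states to the detector plane."""
    M = transport_matrix(dI_over_I, dp_over_p)
    # Random initial positions [m] and angles [rad] at the vertex
    states = rng.normal(0.0, [1e-4, 1e-4], size=(n, 2))
    return states @ M.T

# One Monte Carlo draw of machine imperfections within tolerances
draw = {k: rng.uniform(-v, v) for k, v in TOLERANCES.items()}
tracks = simulate_tracks(10_000, **draw)
print(tracks.shape)  # (10000, 2)
```

Repeating the draw many times yields an ensemble of track distributions, one per hypothetical machine configuration – the raw material for training the reconstruction algorithm.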

I then decided to develop a proprietary Machine Learning (ML) algorithm that would infer the actual LHC running conditions from the observed distribution of proton tracks. The algorithm was an extension of Principal Component Analysis (PCA), adjusted to allow for the small non-linearity caused by the forward momentum loss protons experience at collision. To train the algorithm, sets of proton tracks were simulated for various LHC running conditions, covering all possible collider imperfections within their tolerances. This included magnet current imperfections, magnet tilts and displacements, beam momentum and position uncertainty, magnetic field non-linearities, and more.
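A hedged sketch of the reconstruction step, under simplifying assumptions: train on simulated (imperfection, track-statistics) pairs, extract the leading principal components of the track features, and fit a linear map from those components back to the imperfection parameters. The “optics” here is a made-up linear response with a small non-linear term – a stand-in for the real simulation, not the actual TOTEM algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def track_features(params, n=5000):
    """Summary statistics of a simulated track distribution for a
    given imperfection vector: a linear response plus a mild
    non-linear term and statistical noise (all values invented).
    """
    response = np.array([[1.0, 0.4], [0.2, 1.3], [0.5, -0.7]])
    quad = 0.05 * np.outer(params, params).sum()   # small non-linearity
    noise = rng.normal(0.0, 1e-3 / np.sqrt(n), size=3)
    return response @ np.asarray(params) + quad + noise

# Training set: imperfection vectors drawn within tolerances
train_params = rng.uniform(-1e-3, 1e-3, size=(500, 2))
train_feats = np.array([track_features(p) for p in train_params])

# PCA on the feature matrix, then a least-squares map from the
# leading components back to the imperfection parameters.
mean = train_feats.mean(axis=0)
U, S, Vt = np.linalg.svd(train_feats - mean, full_matrices=False)
components = Vt[:2]                        # leading principal directions
scores = (train_feats - mean) @ components.T
W, *_ = np.linalg.lstsq(scores, train_params, rcond=None)

def estimate(feats):
    """Estimate the imperfections behind observed track statistics."""
    return ((feats - mean) @ components.T) @ W

true = np.array([4e-4, -6e-4])
est = estimate(track_features(true))
```

On this toy model the estimate lands within a few parts in 10⁵ of the true imperfection vector; the real algorithm had to cope with far more parameters and a genuinely non-linear machine.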

The results? Astonishing accuracy

The algorithm, together with the nominal settings, was able to estimate the actual running conditions with an unimaginably high accuracy – at the 2.5 permille level (ten times better than the originally planned procedure, which was based on direct beam measurements). It transpired that, although theory-based approaches helped, they did not take all possible permutations into consideration. Once theory was enhanced with partial information from the track level, the picture became much clearer. We then needed to identify which components affected the final optics estimation, and how. This was achieved through a set of derivative matrices that demonstrated the sensitivity of the model to various inputs [1].

This was probably the most accurate accelerator optics measurement ever achieved. It enabled some of the most precise proton internal structure measurements to date and led to the discovery of a new particle – the Odderon [2].

In short, the development and application of this proprietary ML algorithm, combined with the use of relevant real-world data, enabled CERN and the science community to dramatically accelerate their research programs (from a year or more to weeks), save hundreds of thousands of euros, and achieve unprecedented levels of accuracy in their data.

CERN experiences inspire Cledar values

Cledar co-founder Piotr Nyczyk and I both place tremendous value on the experience we gained during our combined 15 years at CERN. These experiences have shaped the values that we live and work by at Cledar: adopt a bird’s eye view, question the fundamentals, and seek expert opinion.

Adopt a bird’s eye view
We always strive to adopt a big-picture, holistic view of every problem or challenge we face. In the case of the LHC, this meant delving into the sources of imperfections and their possible impact on the LHC’s performance. It also meant considering all the different data and events that could be observed, and how they could be combined to provide real, valuable insight. In addition, we took on board a wide variety of perspectives to enrich our view of the problem and make us aware of potential solutions. Often, it’s only when you put all of these pieces and perspectives together that a solution emerges.

Question the fundamentals
At first, it was believed that there was no alternative but to invest in run time, learn, iterate, and live with the results – whether sufficiently accurate or not. The TOTEM project was at risk of stagnation or, at best, not delivering on its lofty expectations. I firmly believed that a solution could be found. Although my ideas were initially met with resistance and doubt, I knew that in order to make a breakthrough, we would need to question everything we had learned thus far and be prepared to try something new.

Despite our collective experience, we still take time at Cledar to question the fundamentals on each project or solution, and this regularly inspires completely new approaches or streams of innovation that deliver real, tangible value to the project or our client’s business.

Seek expert opinion
I was truly privileged to be surrounded by so many great minds at CERN. It allowed me to seek opinion from experts of all different backgrounds and domains. This included scientists, technologists, and engineers, all of whom – in their own unique way – contributed to the development of a game-changing solution for the TOTEM project and LHC.

We apply the same principles at Cledar today – we leverage the various domains within our team, as well as our global network, to see problems and solutions from a variety of perspectives. We do this in a collaborative way, with each perspective afforded the respect it deserves (i.e. no egos). This helps us, and our clients, proceed with greater confidence than if informed by individual domains alone.

Translating CERN experience to business benefits for Cledar clients

At Cledar, we are able to provide a unique perspective on a variety of challenges, ranging from business and technology, through to societal, economic, and even climatological. We use our academic and scientific background, but we also apply mathematical expertise and our deep understanding of how to deploy the right technology in the right way. This holistic and unique perspective empowers us to see and deploy solutions to challenges in a way that many other companies simply can’t.

Examples of this include:

  • We have built models that can predict the likelihood of loan defaults with 97% accuracy. We did this by taking data from existing loans and enriching it with third-party data about global and local economic conditions to understand how they increase or decrease the risk of default. Our work in this area can help investors optimize their risk-reward ratio when investing.
  • We combined 20 years’ worth of data and more than 2 million data points to help a Caribbean island reform its economic policy in the aftermath of a natural disaster. Using data modelling and predictive analytics, Cledar helped inform a policy that would generate the income needed to support recovery, without overburdening local businesses and citizens.
  • We used proprietary AI methodology to significantly increase the accuracy of object detection in video data. This can be used to enhance levels of security for surveillance systems and to enhance traffic management capabilities for vehicles, complexes (commercial, industrial, or residential), towns or cities.

Learn more about Cledar’s experiences at CERN

Before reuniting at Cledar, Hubert Niewiadomski and Piotr Nyczyk spent a combined 15 years at CERN – one of the world’s largest and most-renowned scientific research institutes. Learn more about their CERN story and how their CERN experiences shaped Cledar values.


1. LHC Optics Measurement with Proton Tracks Detected by the Roman Pots of the TOTEM Experiment, New J. Phys. 16 (2014) 103041.

2. Odderon Exchange from Elastic Scattering Differences between pp and p̄p Data at 1.96 TeV and from pp Forward Scattering Measurements.


Curious about the latest tech insights? Follow us on our social media.



Want to predict and measure future behaviors with accuracy?

Let’s talk to understand how you can benefit from our CERN experience with machine learning, AI, data science, and algorithms.