Fermilab-led team tests Azure AI for particle physics data challenge

The Fermi National Accelerator Laboratory, known as Fermilab, is America's particle physics and accelerator laboratory, and it is also among the world's biggest data processors. In the quest to understand the mysteries of the universe, a team of particle physicists is conducting experiments expected to generate close to an exabyte (one billion gigabytes) of data over the next decade.
These scientists from Fermilab and other research institutes have a front-row seat to a classic big data problem. Do they choose faster but less accurate event selections? Slower processing with more accuracy? Or both, which would require expensive hardware upgrades and more power to process the extra data?
Enter Moore's Law. It has fueled progress in computing since the 1960s with its observation that the number of transistors on a chip doubles roughly every two years. But Moore's Law is no longer delivering as processors approach their physical limits. As the scale of particle physics experiments continues to grow, so will their computing demands: the next generation of experiments is expected to generate ten times more data, with more complex events for the team's machine learning algorithms to analyze.
The team wondered whether offloading inference to external AI services could speed up their machine learning models. To answer that question, a Fermilab-led team of physicists worked with Microsoft.