AI Colloquium

ML meets Particle Physics

by Christoph Langenbruch (University of Heidelberg)

Europe/Berlin
E04 (OH14)

Description

The particle physics experiments at the Large Hadron Collider at CERN produce some of the largest data samples in science to date. While the analysis of these data samples offers a unique opportunity to study the fundamental building blocks of nature with unprecedented precision, the ever-increasing volume of data also represents an enormous challenge. To address this challenge, and to optimally exploit the data, the use of modern machine-learning methods is essential. These machine-learning methods need to be accurate and robust, as well as resource-efficient.

In this talk, I will give an overview of machine-learning techniques in particle physics, with particular focus on applications in the LHCb experiment, one of the four large experiments located at the LHC. The LHCb experiment recently moved its first event-filter stage to a flexible, fully software-based system running on a heterogeneous GPU+CPU architecture. This makes LHCb an ideal environment for the development and deployment of novel machine-learning methods. I will give examples of the current usage of modern machine-learning techniques within LHCb, discuss ongoing developments, and give a brief outlook on future improvements.
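
To give a flavour of the kind of model meant here, the following is a minimal, purely illustrative sketch (not LHCb code) of a small neural-network classifier for signal/background event selection that can run on either CPU or GPU; the feature count, layer sizes, and threshold are invented for the example.

```python
# Toy illustration: a small fully connected classifier of the kind used for
# signal/background separation in trigger-level event selection.
# All names and dimensions here are assumptions made for the example.
import torch
import torch.nn as nn

class EventClassifier(nn.Module):
    def __init__(self, n_features: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, 32),
            nn.ReLU(),
            nn.Linear(32, 1),  # single logit: signal vs. background
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# The same model runs on CPU or GPU, mirroring a heterogeneous CPU+GPU setup.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = EventClassifier().to(device)

features = torch.randn(4, 8, device=device)         # 4 dummy events, 8 features each
scores = torch.sigmoid(model(features)).squeeze(1)  # per-event signal probability
keep = scores > 0.5                                  # simple selection threshold
print(keep)
```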