Abstract
Of all scientific fields, few rival high-energy physics in the degree to which symmetries dictate its concepts, methods, and discoveries. Surprisingly, although particle-physics researchers were early adopters of machine learning, they paid limited attention to models engineered around the discipline's built-in symmetries. Transformers implement the permutation symmetry of particles efficiently and scalably, yet their systematic application in high-energy physics is a recent development. We establish autoregressive transformers as event generators that learn the autoregressive dynamics of QCD jet radiation and reliably extrapolate to jet multiplicities beyond the limits of the training data. Although various Lorentz-equivariant graph networks have been introduced for jet tagging, none employed a Lorentz-equivariant transformer architecture. Our Lorentz-equivariant Geometric Algebra Transformer (L-GATr) closes this gap as the first Lorentz-equivariant transformer, matching the performance of graph networks on small-scale datasets and outscaling them on large-scale datasets. Building on this foundation, we construct the first Lorentz-equivariant generative network.
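The abstract's statement that transformers implement the permutation symmetry of particles can be illustrated with a minimal toy sketch (not code from the thesis): self-attention without positional encodings is permutation-equivariant, so permuting the input particles simply permutes the output rows in the same way.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Single-head self-attention without positional encodings.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    # Row-wise softmax over the attention scores.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 5, 4                      # n "particles", d features each
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
perm = rng.permutation(n)

out = self_attention(X, Wq, Wk, Wv)
out_perm = self_attention(X[perm], Wq, Wk, Wv)

# Permuting the inputs permutes the outputs identically:
assert np.allclose(out_perm, out[perm])
```

Because no component of the computation depends on the ordering of the rows of `X`, the network treats a collision event as a set of particles rather than a sequence, which is exactly the symmetry the abstract refers to.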
| Document type: | Dissertation |
|---|---|
| Supervisor: | Plehn, Prof. Dr. Tilman |
| Place of Publication: | Heidelberg |
| Date of thesis defense: | 9 July 2025 |
| Date Deposited: | 17 Jul 2025 12:02 |
| Date: | 2025 |
| Faculties / Institutes: | The Faculty of Physics and Astronomy > Institute for Theoretical Physics |
| DDC-classification: | 530 Physics |
| Controlled Keywords: | Machine Learning, High-Energy Physics |