Lattice Field Theory and Machine Learning
from Thursday, December 5, 2024 (10:00 AM) to Friday, December 6, 2024 (5:00 PM)
Thursday, December 5, 2024
10:30 AM - 11:30 AM
Self-learning Monte Carlo with equivariant Transformer
Yuki Nagai (Information Technology Center)
Room: Auditorium
Machine learning and deep learning have significantly advanced computational physics, particularly in simulating complex systems. Equivariance is critical for modeling physical systems, as it introduces a strong inductive bias into the probability distribution learned by the machine learning model. However, imposing symmetry often results in reduced acceptance rates in self-learning Monte Carlo (SLMC). We propose a symmetry-equivariant attention mechanism for SLMC, designed for systematic improvement. The proposed architecture is evaluated on the spin-fermion model (double exchange model) on a two-dimensional lattice [1]. Our results demonstrate that the method overcomes the limitations of linear models, achieving improved acceptance rates. Furthermore, the model quality follows a scaling law akin to large language models, showing consistent improvement with increased model depth. Our work paves the way for the development of more accurate and efficient Monte Carlo algorithms with machine learning for simulating complex physical systems.
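The core inductive-bias idea of the abstract, building symmetry directly into the learned effective model, can be illustrated with a minimal sketch (not the authors' code; the toy 1D model and function names are illustrative): averaging any base model over the lattice symmetry group yields an exactly invariant effective energy, the same principle a symmetry-equivariant attention layer enforces architecturally.

```python
import numpy as np

def base_energy(spins, w):
    """Toy non-symmetric effective model: weighted sum of spins."""
    return float(np.sum(w * spins))

def symmetrized_energy(spins, w):
    """Average base_energy over all translations of a 1D periodic
    lattice, yielding an exactly translation-invariant effective energy."""
    L = spins.shape[0]
    return float(np.mean([base_energy(np.roll(spins, s), w) for s in range(L)]))
```

An equivariant architecture achieves the same invariance by construction rather than by explicit averaging, which is what makes deeper, systematically improvable models practical.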
11:30 AM - 12:00 PM
Tea Break
Room: Auditorium
12:00 PM - 1:00 PM
Gauge covariant Transformer
Akio Tomiya (Tokyo Woman's Christian University)
Room: Auditorium
1:00 PM - 2:30 PM
Lunch Break
Room: Auditorium
2:30 PM - 3:30 PM
Renormalization Group and Flow-Based Generative Models
Yi-Zhuang You (University of California, San Diego)
Room: Auditorium
Generative modeling in machine learning can be understood as the inverse process of renormalization, establishing a profound connection between lattice field theory and modern AI techniques. This perspective allows the development of unsupervised learning algorithms that inherently learn to perform renormalization from the field theory action. In this talk, I will explore how flow-based generative models, inspired by AdS/CFT duality, can accelerate the sampling of conformal field theories (CFTs) using network architectures defined on the AdS bulk geometry. Furthermore, I will discuss how these concepts extend to image generation, enabling more efficient error correction through holographic principles.
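The flow mechanism underlying this line of work can be sketched in miniature (an illustrative toy, not the talk's architecture): an invertible map turns easy-to-sample latent noise into configurations while tracking an exact log-density via the change of variables log p_x(x) = log p_z(z) - log|det df/dz|.

```python
import numpy as np

def flow_forward(z, scale):
    """Elementwise affine flow x = scale * z, with the log-determinant
    of its (diagonal) Jacobian tracked for exact density evaluation."""
    x = scale * z
    log_det = np.sum(np.log(np.abs(scale)))
    return x, log_det

def log_prob(x, scale):
    """Exact model log-density of x under a standard-normal latent."""
    z = x / scale
    log_pz = -0.5 * np.sum(z**2) - 0.5 * z.size * np.log(2 * np.pi)
    log_det = np.sum(np.log(np.abs(scale)))
    return log_pz - log_det
```

Stacking such layers scale by scale mirrors the coarse-to-fine structure of renormalization that the talk connects to generative modeling.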
3:30 PM - 4:00 PM
Tea Break
Room: Auditorium
4:00 PM - 5:00 PM
Hands-on NeuLat: A Toolbox for generative neural samplers in Lattice Field Theory
Kim Nicoli (University of Bonn)
Room: Auditorium
Friday, December 6, 2024
10:30 AM - 11:30 AM
What is the flow-based sampling endgame?
Gurtej Kanwar (The University of Edinburgh)
Room: Auditorium
Flow-based sampling has emerged as an effective way to parameterize sampling from many known and unknown distributions. Within the context of lattice quantum field theory, there are several ongoing efforts to incorporate these and related methods into practical calculations. I will first discuss our recent work in this area and then bring up the question and some proposed answers of what the "endgame" of this research direction might be.
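How flow-based sampling yields exact results in practice can be sketched as follows (a hedged toy, not the speaker's code; the 1D Gaussian target and model stand in for exp(-S) and a trained flow): proposals drawn independently from the flow q are filtered by an independence Metropolis step, so the chain samples the true distribution p even when q is imperfect.

```python
import numpy as np

def log_p(x):
    """Toy target density: standard normal (stands in for exp(-S[x]))."""
    return -0.5 * x**2

def log_q(x):
    """Toy 'flow' density: normal with std 1.5 (a deliberately imperfect model)."""
    return -0.5 * (x / 1.5) ** 2 - np.log(1.5)

def flow_mcmc(n_steps, rng):
    """Independence Metropolis with flow proposals: accept x' with
    probability min(1, p(x') q(x) / (p(x) q(x')))."""
    x = rng.normal(scale=1.5)
    samples = []
    for _ in range(n_steps):
        x_new = rng.normal(scale=1.5)  # independent draw from the flow
        log_a = (log_p(x_new) + log_q(x)) - (log_p(x) + log_q(x_new))
        if np.log(rng.uniform()) < log_a:
            x = x_new
        samples.append(x)
    return np.array(samples)
```

The acceptance rate of this step is the standard figure of merit for how well the flow has learned the target.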
11:30 AM - 12:00 PM
Tea Break
Room: Auditorium
12:00 PM - 1:00 PM
Continuous normalizing flows for gauge theories
Mathis Gerdes (Universiteit van Amsterdam)
Room: Auditorium
We define an expressive continuous normalizing flow architecture for matrix Lie groups that is equivariant under group transformations. We apply this to lattice gauge theories in two dimensions as a proof-of-principle and demonstrate competitive performance, showing its potential as a tool for future lattice sampling tasks.
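The equivariance requirement in the abstract can be checked concretely in a toy example (illustrative only, not the paper's architecture): any map built from matrix polynomials of a link variable U, such as f(U) = U @ U, automatically commutes with the group transformation U -> g U g^{-1} that a flow on a matrix Lie group must respect.

```python
import numpy as np

def f(U):
    """Toy conjugation-equivariant map on a matrix group:
    f(g U g^-1) = g U g^-1 g U g^-1 = g f(U) g^-1."""
    return U @ U

def random_su2(rng):
    """Random SU(2) matrix from a normalized quaternion (a, b, c, d)."""
    a, b, c, d = rng.normal(size=4)
    n = np.sqrt(a*a + b*b + c*c + d*d)
    a, b, c, d = a/n, b/n, c/n, d/n
    return np.array([[a + 1j*b,  c + 1j*d],
                     [-c + 1j*d, a - 1j*b]])
```

Equivariant continuous flows generalize this by integrating a vector field with the same transformation property, keeping the map invertible and its Jacobian tractable.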
1:00 PM - 2:30 PM
Lunch Break
Room: Auditorium
2:30 PM - 3:30 PM
Diffusion models: cumulants and lattice field theory
Gert Aarts (Swansea University)
Room: Auditorium
3:30 PM - 4:00 PM
Tea Break
Room: Auditorium
4:00 PM - 5:00 PM
Applications of flow models to the generation of correlated lattice QCD ensembles (online)
Fernando Romero-López (Bern University)
Room: Auditorium