Neural networks that respect local gauge symmetry learn nonlocal patterns on a lattice
The paper introduces a graph neural network that builds local gauge symmetry into its core: non-Abelian gauge symmetry is embedded directly into message passing through matrix-valued, gauge-covariant features and symmetry-compatible update rules. In plain terms, the network is designed so that its internal signals transform correctly when independent symmetry changes are made at each lattice site. This lets it learn intrinsically nonlocal quantities without first compressing the input into a few handcrafted invariant descriptors.
Local gauge symmetry means that physically equivalent configurations can differ by independent symmetry operations at every site of a lattice. On the lattice used here, the fundamental variables live on links between sites. Those link variables U_ij take values in a compact gauge group (for example U_ij in SU(N_f)) and act like discrete “parallel transporters.” Under a site-dependent change g_i the link transforms as U_ij -> g_i U_ij g_j^†. Physical observables must be invariant under these changes and are often formed from closed products of links called Wilson loops. For non-Abelian groups (where order matters), Wilson loops are matrix-valued and path-dependent, so simple lists of small loops do not capture all the information.
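The transformation law and loop invariance described above can be checked numerically. Below is a minimal NumPy sketch (not code from the paper; SU(2) is used for concreteness, and all names are illustrative) showing that the trace of a closed product of links around an elementary square, a plaquette, is unchanged when independent gauge transformations g_i act at each site:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_su2():
    # Random SU(2) matrix from a unit quaternion (a, b, c, d):
    # [[a+ib, c+id], [-c+id, a-ib]] has determinant a^2+b^2+c^2+d^2 = 1.
    a, b, c, d = rng.normal(size=4)
    n = np.sqrt(a*a + b*b + c*c + d*d)
    a, b, c, d = a/n, b/n, c/n, d/n
    return np.array([[a + 1j*b,  c + 1j*d],
                     [-c + 1j*d, a - 1j*b]])

dag = lambda M: M.conj().T  # Hermitian conjugate (dagger)

# Links of one plaquette visiting sites i -> j -> k -> l -> i.
U_ij, U_jk, U_kl, U_li = (random_su2() for _ in range(4))

def plaquette_trace(U1, U2, U3, U4):
    # Trace of the ordered product of links around the closed loop.
    return np.trace(U1 @ U2 @ U3 @ U4)

before = plaquette_trace(U_ij, U_jk, U_kl, U_li)

# Independent gauge transformation g at each of the four sites:
# each link transforms as U_ij -> g_i U_ij g_j^dagger.
g_i, g_j, g_k, g_l = (random_su2() for _ in range(4))
V_ij = g_i @ U_ij @ dag(g_j)
V_jk = g_j @ U_jk @ dag(g_k)
V_kl = g_k @ U_kl @ dag(g_l)
V_li = g_l @ U_li @ dag(g_i)

after = plaquette_trace(V_ij, V_jk, V_kl, V_li)

# The g's cancel pairwise around the closed loop, so the trace is invariant.
print(np.allclose(before, after))  # prints True
```

Note that a single open product of links (a Wilson line) would instead pick up g factors at its two endpoints, which is exactly why open, matrix-valued objects are covariant rather than invariant.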
To address this, the authors recast learning as gauge-equivariant message passing on a graph that represents the lattice. Instead of first building gauge-invariant inputs, the network keeps matrix-valued, gauge-covariant features on nodes and edges and enforces the local symmetry at every layer. Message passing then implements gauge-covariant transport: local updates move and combine these matrix features along paths on the lattice in a way that respects the transformation law. As the network iterates, nonlocal correlations and extended path structures such as Wilson lines can emerge implicitly from repeated local operations.
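One covariant transport step can be sketched as follows. This is a minimal NumPy illustration, not the paper's architecture: it assumes a matrix-valued node feature W_i that transforms as W_i -> g_i W_i g_i^†, and forms a message at node 0 by transporting each neighbor's feature along the connecting link before summing. The check at the end verifies equivariance: gauge-transforming all inputs transforms the output by g_0 on the outside.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_su2():
    # Random SU(2) matrix from a unit quaternion.
    a, b, c, d = rng.normal(size=4)
    n = np.sqrt(a*a + b*b + c*c + d*d)
    a, b, c, d = a/n, b/n, c/n, d/n
    return np.array([[a + 1j*b,  c + 1j*d],
                     [-c + 1j*d, a - 1j*b]])

dag = lambda M: M.conj().T

# Tiny graph: node 0 with neighbors 1 and 2, one link variable per edge.
U = {(0, 1): random_su2(), (0, 2): random_su2()}
# Matrix-valued node features (generic complex 2x2 matrices).
W = {i: rng.normal(size=(2, 2)) + 1j*rng.normal(size=(2, 2)) for i in range(3)}

def message(U, W):
    # Gauge-covariant aggregation at node 0: parallel-transport each
    # neighbor feature through the connecting link, then sum.
    return sum(U[(0, j)] @ W[j] @ dag(U[(0, j)]) for j in (1, 2))

m0 = message(U, W)

# Apply an independent gauge transformation g_i at every site:
# links transform as U_0j -> g_0 U_0j g_j^†, features as W_i -> g_i W_i g_i^†.
g = {i: random_su2() for i in range(3)}
U_g = {(0, j): g[0] @ U[(0, j)] @ dag(g[j]) for j in (1, 2)}
W_g = {i: g[i] @ W[i] @ dag(g[i]) for i in W}

m0_g = message(U_g, W_g)

# Equivariance: the interior g_j factors cancel, leaving g_0 m0 g_0^†.
print(np.allclose(m0_g, g[0] @ m0 @ dag(g[0])))  # prints True
```

Stacking such layers composes transports along longer and longer paths, which is how loop-like, nonlocal structure can build up from purely local updates.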