Theory of Equivariant Machine Learning I – Invited Special Session
Session Type: Lecture
Session Code: A1L-D
Location: Room 4
Date & Time: Wednesday, March 22, 2023 (09:00 - 10:00)
Chairs: Soledad Villar, David Hogg
Track: 12
| Paper ID | Paper Name | Authors | Abstract |
| --- | --- | --- | --- |
| 3227 | Modeling Active Symmetries on Graphs with Graph Neural Networks | Teresa Huang | Typical equivariant machine learning models focus on active symmetries (e.g., rotational symmetries of molecules). Graph neural networks, on the other hand, are commonly designed to be permutation equivariant, which satisfies a passive symmetry: the order of the nodes in the graph. In this work we translate techniques from equivariant machine learning to graph neural networks, with a focus on modeling active symmetries on graphs. To this end we consider fixed graphs with non-trivial automorphisms and show how to parameterize equivariant functions using irreducible representations. Real-world graphs are typically asymmetric, so we first coarsen them, which can introduce non-trivial automorphisms, as a way to approximate (active) symmetries. We demonstrate a bias-variance tradeoff in the performance of equivariant graph models as a function of the group of symmetries considered. The talk is based on joint work with Ron Levie (Technion – Israel Institute of Technology) and Soledad Villar (Johns Hopkins University). [A toy illustration of graph automorphism equivariance appears after this table.] |
| 3223 | Necessary and Sufficient Conditions for Equivariant Pointwise Nonlinearities | Andrew Sands, Risi Kondor, Erik Thiede | Group equivariant neural networks have seen success in machine learning for the physical sciences. By ensuring that the inputs, outputs, and all activations transform according to a linear representation of the group, equivariant neural networks preserve crucial symmetries in a learning problem. However, designing these neural networks requires special considerations. In particular, the choice of nonlinearity is nontrivial: while one might wish to use common pointwise nonlinearities such as ReLU or the hyperbolic tangent, for many group representations these nonlinearities break equivariance. Here, we characterize the class of group representations where these nonlinearities can be used, by giving necessary and sufficient conditions that the group representation must obey. [A toy numerical check of this equivariance-breaking appears after this table.] |
| 3228 | Data-Driven Discovery of Invariant and Equivariant Coordinates | George Kevrekidis | We develop neural network architectures that produce symmetry-respecting parametrizations of sampled smooth manifolds. In particular, given a manifold M and a (known) group G, imposing orthogonality conditions allows the identification of a G-invariant submanifold along with its orthogonal complement, and their parametrization allows for further estimation of invariant and equivariant functions. We discuss the well-studied theoretical aspects and regularity conditions required for this decomposition, along with some computational considerations. [A toy invariant/equivariant coordinate pair appears after this table.] |
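
For paper 3227, a minimal numerical sketch of what an active graph symmetry looks like (our own NumPy illustration, not the authors' code; the graph, layer, and weights are hypothetical): a 4-cycle admits a non-trivial automorphism, and a standard message-passing layer commutes with it.

```python
import numpy as np

# Toy illustration (not the authors' code): a fixed graph with a
# non-trivial automorphism, and a message-passing layer that is
# equivariant to that automorphism.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)  # adjacency of a 4-cycle

P = np.eye(4)[[1, 2, 3, 0]]                # rotate the cycle by one node
assert np.allclose(P @ A @ P.T, A)         # P is a graph automorphism

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                # node features
W1, W2 = rng.normal(size=(2, 3, 3))        # hypothetical layer weights

def layer(X):
    # One message-passing step: self term plus neighbor aggregation.
    return np.maximum(X @ W1 + A @ X @ W2, 0.0)

# Equivariance: relabeling nodes by the automorphism permutes outputs.
assert np.allclose(layer(P @ X), P @ layer(X))
```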
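The equivariance-breaking phenomenon in paper 3223 is easy to observe numerically. A minimal sketch, assuming only NumPy: pointwise ReLU commutes with a permutation representation but not with the standard 2-D rotation representation of SO(2).

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)

# Permutation representation: pointwise ReLU stays equivariant.
P = np.eye(3)[[2, 0, 1]]
x = rng.normal(size=3)
assert np.allclose(relu(P @ x), P @ relu(x))

# Rotation representation of SO(2): pointwise ReLU breaks equivariance.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
y = rng.normal(size=2)
print(relu(R @ y) - R @ relu(y))  # generally nonzero
```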
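For paper 3228, a toy instance of a symmetry-respecting parametrization (our own choice of example, assuming G = SO(2) acting on R^2; the authors' method is more general): the radius is a G-invariant coordinate and the unit direction is a G-equivariant one, together decomposing the plane into invariant and equivariant coordinates.

```python
import numpy as np

# Toy example (ours, not the authors'): for G = SO(2) acting on R^2,
# the radius is invariant and the unit direction is equivariant.
def rot(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def invariant(x):    # r = |x| is unchanged by every g in G
    return np.linalg.norm(x)

def equivariant(x):  # x / |x| rotates along with x
    return x / np.linalg.norm(x)

rng = np.random.default_rng(0)
x, g = rng.normal(size=2), rot(1.3)

assert np.isclose(invariant(g @ x), invariant(x))
assert np.allclose(equivariant(g @ x), g @ equivariant(x))
```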