escnn documentation

escnn is a PyTorch-based library for equivariant deep learning.

Equivariant neural networks guarantee a prespecified transformation behavior of their features under transformations of their input. This package provides functionality for the equivariant processing of signals over Euclidean spaces (e.g. planar images or 3D signals). It implements the most general convolutional maps which are equivariant under the isometries of the Euclidean space, that is, under translations, rotations and reflections. Currently, it supports 2- and 3-dimensional Euclidean spaces. The library also supports compact-group equivariant linear maps (interpreted as a special case of equivariant maps on a 0-dimensional Euclidean space), which can be used to construct equivariant MLPs.
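As an illustration of this guarantee, the minimal sketch below builds a single steerable convolution with scalar input and output fields and checks that convolving a rotated image gives the rotated output. It is only a sketch based on the high-level interface of escnn.gspaces and escnn.nn described below; the chosen group (rotations by multiples of 90 degrees), field types and layer parameters are arbitrary examples.

    import torch
    from escnn import gspaces, nn

    # symmetries: planar rotations by multiples of 90 degrees
    r2_act = gspaces.rot2dOnR2(N=4)

    # one scalar field in input and output (both transform trivially under rotations)
    in_type = nn.FieldType(r2_act, [r2_act.trivial_repr])
    out_type = nn.FieldType(r2_act, [r2_act.trivial_repr])

    conv = nn.R2Conv(in_type, out_type, kernel_size=3, padding=1)

    x = torch.randn(1, 1, 33, 33)

    # convolving the rotated input ...
    y1 = conv(nn.GeometricTensor(torch.rot90(x, 1, dims=(2, 3)), in_type)).tensor
    # ... matches rotating the output of the original input (up to numerical error)
    y2 = torch.rot90(conv(nn.GeometricTensor(x, in_type)).tensor, 1, dims=(2, 3))

    assert torch.allclose(y1, y2, atol=1e-5)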

Warning

escnn.kernels has been largely refactored in version 1.0.0. While the interface of the other sub-packages is not affected, weights trained with an older version of the library might not be compatible with newer instantiations of your models. For backward compatibility, we recommend using version 0.1.9 of the library.

Package Reference

The library is structured into four subpackages with different high-level features:

  • escnn.group implements basic concepts of group and representation theory

  • escnn.kernels solves for spaces of equivariant convolution kernels

  • escnn.gspaces defines the Euclidean spaces and their symmetries

  • escnn.nn contains equivariant modules to build deep neural networks

Typically, only the high-level functionality provided in escnn.gspaces and escnn.nn is needed to build an equivariant model.
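For example, the sketch below assembles a small rotation-equivariant model from these two subpackages alone. The specific layers (a steerable convolution, an equivariant batch normalization, a pointwise ReLU and a group pooling that extracts rotation-invariant channels) are just one possible combination; see the API reference for the full set of available modules.

    import torch
    from escnn import gspaces, nn

    # the symmetry group: rotations by multiples of 45 degrees acting on the plane
    r2_act = gspaces.rot2dOnR2(N=8)

    # a grayscale input image transforms under the trivial representation
    in_type = nn.FieldType(r2_act, [r2_act.trivial_repr])
    # hidden features: 16 copies of the regular representation of C8
    hid_type = nn.FieldType(r2_act, 16 * [r2_act.regular_repr])

    model = nn.SequentialModule(
        nn.R2Conv(in_type, hid_type, kernel_size=5, padding=2),
        nn.InnerBatchNorm(hid_type),
        nn.ReLU(hid_type),
        nn.GroupPooling(hid_type),  # pool over the group to obtain rotation-invariant channels
    )

    x = nn.GeometricTensor(torch.randn(4, 1, 29, 29), in_type)
    y = model(x)  # GeometricTensor with 16 rotation-invariant channels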

Getting Started and Useful References

To get started, we provide an introductory tutorial covering the basic functionality of the library. A second tutorial goes through building and training an equivariant model on the rotated MNIST dataset. Note that escnn also supports equivariant MLPs; see these examples.
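For illustration, an equivariant MLP might be sketched as below: feature vectors live on a 0-dimensional base space (gspaces.no_base_space) and are processed by equivariant linear layers. The choice of a finite cyclic group with regular-representation features (which admit a pointwise ReLU) is just one simple option assumed here; please refer to the linked examples for the recommended constructions, in particular for continuous groups.

    import torch
    from escnn import group, gspaces, nn

    # symmetry group: the cyclic group C8 (rotations by multiples of 45 degrees)
    G = group.cyclic_group(8)

    # a 0-dimensional base space: plain feature vectors acted on by G
    gspace = gspaces.no_base_space(G)

    # field types built from representations of C8; the output is invariant
    in_type = nn.FieldType(gspace, 3 * [G.regular_representation])
    hid_type = nn.FieldType(gspace, 8 * [G.regular_representation])
    out_type = nn.FieldType(gspace, [G.trivial_representation])

    mlp = nn.SequentialModule(
        nn.Linear(in_type, hid_type),
        nn.ReLU(hid_type),
        nn.Linear(hid_type, out_type),
    )

    x = nn.GeometricTensor(torch.randn(10, in_type.size), in_type)
    y = mlp(x)  # shape (10, 1), invariant to the action of C8 on the input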

Check also the tutorial on Steerable CNNs using our library in the Deep Learning 2 course at the University of Amsterdam.

If you want to better understand the theory behind equivariant and steerable neural networks, you can check these references:

  • Erik Bekkers’ lectures on Geometric Deep Learning in the Deep Learning 2 course at the University of Amsterdam

  • The course material also includes a tutorial on group convolution and another on Steerable CNNs, using this library.

  • My thesis provides a brief overview of the essential mathematical ingredients needed to understand Steerable CNNs.

Cite Us

The development of this library was part of the work done in our ICLR 22 paper and is an extension of the e2cnn library developed in our previous NeurIPS 19 paper. Please cite us if you use this code in your own work:

@inproceedings{cesa2022a,
    title={A Program to Build {E(N)}-Equivariant Steerable {CNN}s},
    author={Gabriele Cesa and Leon Lang and Maurice Weiler},
    booktitle={International Conference on Learning Representations (ICLR)},
    year={2022},
}

@inproceedings{e2cnn,
    title={{General {E(2)}-Equivariant Steerable CNNs}},
    author={Weiler, Maurice and Cesa, Gabriele},
    booktitle={Conference on Neural Information Processing Systems (NeurIPS)},
    year={2019},
}
