InvertibleNetworks.jl documentation

About

InvertibleNetworks.jl is a package of invertible layers and networks for machine learning. Invertibility makes it possible to backpropagate through the layers and networks without storing the forward state: the state is recomputed on the fly by propagating back through the inverse. This package is the first of its kind in Julia to provide memory-efficient invertible layers, networks, and activation functions for machine learning.

Installation

This package is registered in the Julia General registry and can be installed from the REPL package manager (press ] to enter Pkg mode):

] add InvertibleNetworks
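
After installation, the following minimal sketch illustrates the invertibility described above: an activation-normalization layer is applied in the forward direction and the input is then recovered from the output, which is what lets the backward pass recompute states instead of storing them. The ActNorm constructor, the glorot_uniform utility, and the forward/inverse methods are assumed to behave as in the usage example of the package README.

using InvertibleNetworks, LinearAlgebra

# Sketch following the package README usage example (layer API assumed from there).
# Random input: a 64x64 "image" with 10 channels and a batch size of 4
nx, ny, n_channel, batchsize = 64, 64, 10, 4
X = glorot_uniform(nx, ny, n_channel, batchsize)

# Invertible activation-normalization layer with log-determinant tracking
AN = ActNorm(n_channel; logdet=true)

# Forward pass: returns the output and the log-determinant term
Y, logdet = AN.forward(X)

# Inverse pass: recovers the input from the output, so activations need not be stored
X_ = AN.inverse(Y)
norm(X - X_) / norm(X)   # ≈ 0 up to floating-point error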

Authors

This package is developed and maintained by Felix J. Herrmann's SlimGroup at Georgia Institute of Technology. The main contributors to this package are:

  • Rafael Orozco, Georgia Institute of Technology (rorozco@gatech.edu)
  • Philipp Witte, Microsoft Corporation (pwitte@microsoft.com)
  • Gabrio Rizzuti, Utrecht University (g.rizzuti@umcutrecht.nl)
  • Mathias Louboutin, Georgia Institute of Technology (mlouboutin3@gatech.edu)
  • Ali Siahkoohi, Georgia Institute of Technology (alisk@gatech.edu)

References

  • Yann Dauphin, Angela Fan, Michael Auli and David Grangier, "Language modeling with gated convolutional networks", Proceedings of the 34th International Conference on Machine Learning, 2017. ArXiv

  • Laurent Dinh, Jascha Sohl-Dickstein and Samy Bengio, "Density estimation using Real NVP", International Conference on Learning Representations, 2017. ArXiv

  • Diederik P. Kingma and Prafulla Dhariwal, "Glow: Generative Flow with Invertible 1x1 Convolutions", Conference on Neural Information Processing Systems, 2018. ArXiv

  • Keegan Lensink, Eldad Haber and Bas Peters, "Fully Hyperbolic Convolutional Neural Networks", arXiv Computer Vision and Pattern Recognition, 2019. ArXiv

  • Patrick Putzky and Max Welling, "Invert to learn to invert", Advances in Neural Information Processing Systems, 2019. ArXiv

  • Jakob Kruse, Gianluca Detommaso, Robert Scheichl and Ullrich Köthe, "HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference", arXiv Statistics and Machine Learning, 2020. ArXiv


Acknowledgments

This package uses functions from NNlib.jl, Flux.jl, and Wavelets.jl.