Project author: AWehenkel

Project description:
Implementation of Unconstrained Monotonic Neural Networks and the related experiments. These architectures are particularly useful for modelling monotonic transformations in normalizing flows.
Language: Python
Repository: git://github.com/AWehenkel/UMNN.git
Created: 2019-05-13T08:44:07Z
Project community: https://github.com/AWehenkel/UMNN

License: BSD 3-Clause "New" or "Revised" License



Unconstrained Monotonic Neural Networks (UMNN)


Official implementation of Unconstrained Monotonic Neural Networks (UMNN) and the experiments presented in the paper:

Antoine Wehenkel and Gilles Louppe. “Unconstrained Monotonic Neural Networks.” (2019).
[arxiv]

Other implementations:

  • Check the Zuko library for a clean and complete implementation of UMNN-based normalizing flows.
  • Check here if you are interested in modeling functions that are monotonic with respect to more than one input variable. (Do not hesitate to contact me for more details.)

Dependencies

The code has been tested with PyTorch 1.1 and Python 3.6.
Some of the code for drawing figures and loading datasets is taken from
FFJORD
and Sylvester normalizing flows for variational inference.

Usage

Simple Monotonic Function

This experiment is not described in the paper. We create the following dataset:
x = [x_1, x_2, x_3] is drawn from a multivariate Gaussian, and y = 0.001(x_1^3 + x_1) + x_2 + sin(x_3).
We assume we are given the information that y is monotonic with respect to x_1.

  1. python MonotonicMLP.py

In this experiment we show that a classical MLP is unable to model a function
that is monotonic with respect to x_1, because the effect of x_1 is small
compared to that of the other variables. The UMNN performs better than an MLP while
guaranteeing that its output is monotonic with respect to x_1.
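For reference, the dataset described above can be sketched as follows (a minimal NumPy sketch; the function and variable names are ours, not the repository's):

```python
import numpy as np

# Hypothetical sketch of the toy dataset described above:
# x ~ N(0, I_3), y = 0.001*(x1^3 + x1) + x2 + sin(x3).
def make_dataset(n_samples=1000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, 3))
    y = 0.001 * (x[:, 0] ** 3 + x[:, 0]) + x[:, 1] + np.sin(x[:, 2])
    return x, y

x, y = make_dataset()
# y is monotonically increasing in x_1: its partial derivative with
# respect to x_1 is 0.001 * (3*x1^2 + 1), which is strictly positive.
```

Note that the x_1 term contributes far less to y than x_2 or sin(x_3) for typical Gaussian samples, which is exactly why an unconstrained MLP tends to ignore the monotonicity in x_1.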

Toy Experiments

  1. python ToyExperiments.py

See ToyExperiments.py for optional arguments.

MNIST

  1. python MNISTExperiment.py

See MNISTExperiment.py for optional arguments.

UCI Dataset

You have to download the datasets with the following command:

  1. python datasets/download_datasets.py

Then you can execute:

  1. python UCIExperiments.py --data ['power', 'gas', 'hepmass', 'miniboone', 'bsds300']

See UCIExperiments.py for optional arguments.

VAE

You have to download the datasets:

  • MNIST:
    1. python datasets/download_datasets.py
  • OMNIGLOT: the dataset can be downloaded from link;
  • Caltech 101 Silhouettes: the dataset can be downloaded from link;
  • Frey Faces: the dataset can be downloaded from link.

Then you can execute:

  1. python TrainVaeFlow.py -d ['mnist', 'freyfaces', 'omniglot', 'caltech']

Other Usage

All the files related to the implementation of UMNN (Conditioner Network, Integrand Network and Integral)
are located in the folder models/UMNN.

  • NeuralIntegral.py computes the integral of a neural network
    (with 1d output) using Clenshaw-Curtis (CC) quadrature; it sequentially evaluates the integrand at the points required by CC.
  • ParallelNeuralIntegral.py processes all the evaluation points at once, making the computation almost as fast as a single forward
    evaluation of the net, but at the price of a higher memory cost.
  • UMNNMAF.py contains the implementation of the different networks required by UMNN.
  • UMNNMAFFlow.py contains the implementation of flows made of UMNNs.
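The core idea these files implement can be illustrated with a minimal sketch (ours, not the repository's API): a function F(x) = F(0) + integral of g from 0 to x is monotonic whenever the integrand g is strictly positive. The repository integrates a neural network g with Clenshaw-Curtis quadrature; here a fixed positive integrand and the trapezoidal rule stand in for both:

```python
import numpy as np

def g(t):
    # Stand-in for the integrand network; strictly positive by construction,
    # as a UMNN integrand would be (e.g. via ELU(x) + 1).
    return np.exp(-t ** 2) + 0.1

def monotone_f(x, bias=0.0, n_points=200):
    # F(x) = bias + integral_0^x g(t) dt, approximated with the trapezoidal
    # rule (the repository uses Clenshaw-Curtis quadrature instead).
    t = np.linspace(0.0, x, n_points)
    vals = g(t)
    dt = t[1] - t[0]  # negative when x < 0, giving a signed integral
    return bias + (vals[0] / 2 + vals[1:-1].sum() + vals[-1] / 2) * dt

xs = np.linspace(-2.0, 2.0, 21)
ys = np.array([monotone_f(x) for x in xs])
# ys is strictly increasing because g > 0 everywhere
```

Since monotonicity follows from the positivity of the integrand alone, the integrand network itself is unconstrained, which is the point of the architecture.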

Cite

If you make use of this code in your own work, please cite our paper:

  @inproceedings{wehenkel2019unconstrained,
    title={Unconstrained monotonic neural networks},
    author={Wehenkel, Antoine and Louppe, Gilles},
    booktitle={Advances in Neural Information Processing Systems},
    pages={1543--1553},
    year={2019}
  }