nessai.flows.realnvp

Implementation of Real Non-Volume Preserving (RealNVP) flows.

Module Contents

Classes

RealNVP

Implementation of RealNVP.

class nessai.flows.realnvp.RealNVP(features, hidden_features, num_layers, num_blocks_per_layer, mask=None, context_features=None, net='resnet', use_volume_preserving=False, activation=F.relu, dropout_probability=0.0, batch_norm_within_layers=False, batch_norm_between_layers=False, linear_transform=None, pre_transform=None, pre_transform_kwargs=None, actnorm=False, distribution=None, **kwargs)

Bases: nessai.flows.base.NFlow

Implementation of RealNVP.

This class modifies SimpleRealNVP from nflows to allow a custom mask to be passed as a numpy array and to allow an MLP to be used instead of a ResNet.

> L. Dinh et al., Density estimation using Real NVP, ICLR 2017.

Parameters:
features : int

Number of features (dimensions) in the data space.

hidden_features : int

Number of neurons per layer in each neural network.

num_layers : int

Number of coupling transformations.

num_blocks_per_layer : int

Number of layers (or blocks for a ResNet) in the neural network for each coupling transform.

mask : array_like, optional

Custom mask to use between coupling transforms. Can either be a single array with the same length as the number of features or a two-dimensional array of shape (# features, # num_layers). Must use -1 and 1 to indicate features that are not updated and updated, respectively.

context_features : int, optional

Number of context (conditional) parameters.

net : {‘resnet’, ‘mlp’}

Type of neural network to use.

use_volume_preserving : bool, optional (False)

Use volume preserving flows, which use only addition and no scaling.

activation : function

Activation function implemented in torch.

dropout_probability : float, optional (0.0)

Dropout probability used in each layer of the neural network.

batch_norm_within_layers : bool, optional (False)

Enable or disable batch norm within the neural network for each coupling transform.

batch_norm_between_layers : bool, optional (False)

Enable or disable batch norm between coupling transforms.

linear_transform : {‘permutation’, ‘lu’, ‘svd’, None}

Linear transform to use between coupling layers. Not recommended when using a custom mask.

pre_transform : str

Linear transform to use before the first transform.

pre_transform_kwargs : dict

Keyword arguments to pass to the transform class used for the pre-transform.

actnorm : bool

Include activation normalisation as described in arXiv:1807.03039. Batch norm between layers must be disabled if using this option.

kwargs

Keyword arguments are passed to the coupling class.
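As a sketch of the mask convention described above, the simplest case is a 1-D mask with one entry per feature, where 1 marks a feature that is updated by the coupling transform and -1 marks one that is not. The values chosen here (four features, an alternating pattern) are illustrative only:

```python
import numpy as np

# Illustrative 4-dimensional data space
features = 4

# One entry per feature: 1 = updated, -1 = not updated
mask = np.array([1, -1, 1, -1])

# The mask must have the same length as the number of features
# and contain only -1 and 1
print(mask.shape)          # (4,)
print(sorted(set(mask)))   # [-1, 1]
```

Such an array could then be passed via the `mask` keyword when constructing the flow, e.g. `RealNVP(features=4, hidden_features=32, num_layers=2, num_blocks_per_layer=2, mask=mask)`; the hidden-layer sizes in that call are arbitrary choices, not defaults from this class.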