This is the official repo for the paper "Universal Neural Optimal Transport" (Geuter et al., 2025). To get started, install the requirements via

```bash
pip install -r requirements.txt
```
The pretrained model used for all our experiments is uploaded to the `Models` folder. Make sure to run `git lfs pull` instead of `git pull` to pull the model files as well (if you don't want to use the pretrained model, `git pull` suffices). To use the pretrained FNO (Fourier Neural Operator), simply run

```python
import torch

from src.evaluation.import_models import load_fno

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = load_fno("unot_fno", device=device)

mu = ...  # first flattened input measure, shape (batch_size, resolution**2)
nu = ...  # second flattened input measure, same shape
g = model(mu, nu)  # shape (batch_size, resolution**2)
```
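
For a quick smoke test, the inputs can be any batch of flattened, non-negative histograms that each sum to one. The snippet below is a minimal sketch (not taken from the repo) using random measures at an assumed resolution of 28; adjust the resolution to whatever the pretrained model expects.

```python
import torch

from src.evaluation.import_models import load_fno

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = load_fno("unot_fno", device=device)

resolution = 28   # assumed grid resolution; match the model's training resolution
batch_size = 4

# Two batches of random non-negative images, normalized to sum to 1 so that
# each row is a valid flattened probability measure on the grid.
mu = torch.rand(batch_size, resolution**2, device=device)
nu = torch.rand(batch_size, resolution**2, device=device)
mu = mu / mu.sum(dim=1, keepdim=True)
nu = nu / nu.sum(dim=1, keepdim=True)

with torch.no_grad():
    g = model(mu, nu)   # shape (batch_size, resolution**2)
```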
To use the FNO trained on variable ε (entropic regularization), run

```python
from src.evaluation.import_models import load_fno_var_epsilon

model = load_fno_var_epsilon("unot_fno_var_eps")
```
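
A predicted potential `g` can be used, for instance, to warm-start Sinkhorn iterations. The function below is a rough sketch rather than code from this repo: it assumes a squared-Euclidean cost on a regular grid over [0, 1]², the convention P = exp((f + g − C)/ε) for the transport plan, and that `eps` matches the value the model was trained with; if the repo uses a different cost or potential convention, adapt accordingly.

```python
import torch

def sinkhorn_refine(mu, nu, g, resolution, eps=0.01, n_iters=100):
    """Log-domain Sinkhorn warm-started from a predicted potential g.

    All conventions here (cost, grid, plan parametrization, eps) are
    assumptions, not taken from the repo.
    """
    # Pairwise squared-Euclidean cost between points of a regular grid on [0, 1]^2.
    ticks = torch.linspace(0, 1, resolution, device=mu.device)
    X, Y = torch.meshgrid(ticks, ticks, indexing='ij')
    grid = torch.stack([X.flatten(), Y.flatten()], dim=1)      # (n, 2), n = resolution**2
    C = torch.cdist(grid, grid) ** 2                            # (n, n)

    log_mu = torch.log(mu + 1e-30)                              # guard against log(0)
    log_nu = torch.log(nu + 1e-30)
    f = torch.zeros_like(mu)
    for _ in range(n_iters):
        # Alternating exact updates of the two dual potentials (log-domain Sinkhorn).
        f = eps * log_mu - eps * torch.logsumexp((g.unsqueeze(1) - C) / eps, dim=2)
        g = eps * log_nu - eps * torch.logsumexp((f.unsqueeze(2) - C) / eps, dim=1)

    # Transport plan and entropic transport cost for each batch element.
    P = torch.exp((f.unsqueeze(2) + g.unsqueeze(1) - C) / eps)  # (batch, n, n)
    return (P * C).sum(dim=(1, 2)), P
```

Starting from the model's prediction instead of a zero potential should cut down the number of iterations needed; a convergence criterion is omitted for brevity.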
If you want to train your own model, you first need to prepare the test datasets; you can then run a training script as outlined below. To download the datasets, run

```bash
python scripts/make_data.py
```

Then, create the test sets with

```bash
python scripts/create_test_set.py
```
To train the model, run

```bash
python scripts/main_neural_operator.py
```

Various training hyperparameters as well as other (boolean) flags can be passed to this script; e.g., to train without wandb logging, run

```bash
python scripts/main_neural_operator.py --no-wandb
```
The `scripts` folder also contains training files to train a model with variable ε.
If you find this repository helpful, please consider citing our paper:

```bibtex
@article{geuter2025universal,
  title={Universal Neural Optimal Transport},
  author={Geuter, J. and Kornhardt, G. and Tomasson, I. and Laschos, V.},
  year={2025},
  url={https://arxiv.org/abs/2212.00133v5}
}
```