HINTS: This repository contains the code used to generate results in the paper:
"Blending Neural Operators and Relaxation Methods in PDE Numerical Solvers" by Zhang, Kahana, Kopanicakova, Turkel, Ranade, Pathak, and Karniadakis.
If you use the developed code or its components for your research, please use the following bibtex entry (or an equivalent) to cite us:
@article{zktrpkk_24,
title = {Blending Neural Operators and Relaxation Methods in PDE Numerical Solvers},
author = {Enrui Zhang and Adar Kahana and Alena Kopani{\v{c}}{\'a}kov{\'a} and Eli Turkel and Rishikesh Ranade and Jay Pathak and George Em Karniadakis},
journal = {Nature Machine Intelligence},
issn = {2522-5839},
year = {2024},
doi = {10.1038/s42256-024-00910-x},
URL = {https://www.nature.com/articles/s42256-024-00910-x},
note={https://arxiv.org/abs/2208.13273},
}
Requirements:
scipy>=1.4.1
torch>=1.8
tqdm>=4.46.0
numpy>=1.21
matplotlib>=3.1.2
A GPU is not necessary, but it is highly desirable for training the DeepONets efficiently. In that case, a proper installation of the GPU drivers (CUDA integration, etc.) is expected.
- cd HINTS_numpy
- Make sure you have Python 3.7 or newer installed (the newest versions may be incompatible).
- Make sure you have pip installed.
- Open a command line interface and switch to the project folder.
- Run 'pip install -r requirements.txt' from within the project folder. (Typical installation time on a standard desktop computer is < 2 mins.)
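In short, the installation amounts to the following (run from the repository root; this assumes pip targets your Python 3 installation):

    cd HINTS_numpy
    pip install -r requirements.txt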
The code is used through a file called 'configs.py', located in the HINTS_numpy project folder.
- To train a DeepONet, choose the desired parameters in configs.py (the dimension, the problem, the domain size, etc.).
- Set the variable 'ITERATION_METHOD' to 'DeepONet' and execute: python3 main.py
- This will automatically create data, train a DeepONet, and finally exit with an error.
This is expected: 'DeepONet' is an 'only DeepONet' mode, which is not the intended mode of operation in most HINTS use cases.
- During training, the folder 'debug_figs' is populated with images at the plotting interval set in the configs.
These help monitor the training and verify that the network trains well.
- After training is done, the model is saved into the 'models' folder.
- Set the variable 'ITERATION_METHOD' to 'Numerical_DeepONet_Hybrid' to run HINTS: python3 main.py
- The outputs are logged into the outputs folder. (Typical run time for the 1D examples is < 10 mins, including training of the DeepONet. The expected output is reported in the manuscript; see, for example, the convergence plots in Figure 3B for the 1D Poisson equation and the HINTS-Jacobi solver.)
- To train another DeepONet (i.e., to change the problem/scenario), the results.npz file in the outputs folder needs to be deleted.
Alternatively, the flag 'FORCE_RETRAIN' can be set to True.
- We recommend changing the names of the models and the datasets at the bottom of the configs file to match the simulation you wish to run (see the sketch below).
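For orientation, here is a minimal sketch of the relevant configs.py settings. Only ITERATION_METHOD and FORCE_RETRAIN are named above; the model/dataset name variables are illustrative assumptions and may be named differently in the actual file:

    # configs.py (excerpt; a sketch, not the literal file contents)
    ITERATION_METHOD = 'DeepONet'        # 'DeepONet' trains only; 'Numerical_DeepONet_Hybrid' runs HINTS
    FORCE_RETRAIN = False                # True retrains even if outputs/results.npz already exists
    MODEL_NAME = 'model_1D_Poisson'      # illustrative name; set per simulation
    DATASET_NAME = 'dataset_1D_Poisson'  # illustrative name; set per simulation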
We provide examples of config files, datasets, and pre-trained models in order to reproduce the results reported in the manuscript.
In particular:
example_configs/configs_1D_Helmholtz_HINTS_Jacobi.py can be used to run the HINTS-Jacobi solver for the Helmholtz problem and reproduce Figure 3 in the manuscript.
example_configs/configs_1D_Poisson_HINTS_Jacobi.py can be used to run the HINTS-Jacobi solver for the Poisson problem and reproduce Figure 2 in the manuscript.
example_configs/configs_1D_Poisson_MG_HINTS_GS_smoother.py can be used to run the MG solver with the HINTS-GS (Gauss-Seidel) smoother for the Poisson problem and reproduce Figure 5 in the manuscript.
To run the desired experiments, replace configs.py with any of these files.
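For example, to run the 1D Poisson HINTS-Jacobi experiment (assuming example_configs sits next to configs.py):

    cp example_configs/configs_1D_Poisson_HINTS_Jacobi.py configs.py
    python3 main.py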
PETSc-based code: large-scale hybrid preconditioning for Krylov methods and a demonstration of interfacing with state-of-the-art linear algebra libraries
This part of the HINTS code uses Firedrake for the assembly of finite element systems and PETSc for the linear algebra, including standard stationary and Krylov methods.
Requirements:
Firedrake=0.13.0+6118.g149f8fda6
petsc=3.20.5
torch=2.2.2
numpy=1.24.0
matplotlib=3.9.0
pandas=2.2.2
A GPU is not necessary, but it is highly desirable for training the DeepONets efficiently.
1. Make sure to deactivate any conda environment you might have!
2. Install Firedrake - official guidance can be found at https://www.firedrakeproject.org/download.html.
We have followed these steps:
2.1. mkdir my_firedrake
2.2. cd my_firedrake
2.3. curl -O https://raw.githubusercontent.com/firedrakeproject/firedrake/master/scripts/firedrake-install
2.4. Add support for Exodus meshes, i.e., add the line petsc_options.add("--download-exodusii") at line 745 of the firedrake-install script.
2.5. python3 firedrake-install --disable-ssh --no-package-manager
3.1. Source the Firedrake environment, i.e.,
. /my_path/my_firedrake/firedrake/bin/activate
3.2. cd HINTS_petsc
3.3. Export path to HINTS_petsc code, i.e.,
export PYTHONPATH=$PYTHONPATH:my_path/HINTS_petsc
3.4. cd example
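Putting steps 3.1-3.4 together (with my_path replaced by your actual installation path):

    . /my_path/my_firedrake/firedrake/bin/activate
    cd HINTS_petsc
    export PYTHONPATH=$PYTHONPATH:my_path/HINTS_petsc
    cd example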
We have uploaded to Zenodo an instance of the dataset used to generate the large-scale results reported in the paper.
This dataset can be downloaded using the following link: https://zenodo.org/records/10904349/files/NonNestedHelm3D_5000_1_32_1_0.0001.pkl?download=1.
Using the provided code, you can generate larger or smaller datasets. The HINTS preconditioner typically performs better if the DeepONet is trained using a larger number of samples.
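To take a first look at the downloaded dataset, a minimal sketch is given below (assuming the file is a standard Python pickle; its internal layout is defined by the dataset-generation scripts in this repository):

    import pickle

    # Load the Zenodo dataset and inspect its top-level structure before use;
    # the exact contents depend on the dataset-generation scripts.
    with open('NonNestedHelm3D_5000_1_32_1_0.0001.pkl', 'rb') as f:
        dataset = pickle.load(f)
    print(type(dataset))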
To execute the different experiments, one can use the following commands:
3.5.1. HYPRE-AMG preconditioner:
- Using the uploaded samples for k_sigma = 6, you can test the code as:
python3 -u hints_test_hypre_sampled_k.py --num_samples_total 10000 --num_samples 100000 --k_sigma 6.0
3.5.2. HINTS-MG preconditioner, e.g., for the example where the DeepONet is trained with only 10,000 samples on an 8x8x8 cube:
- python3 -u hints_test_HINTSgmg_sampled_k.py --epochs 50000000 --force_retrain false --recreate_data false --only_train false --num_samples_total 10000 --num_samples 10000 --dofs_don 8 --num_basis_functions 128 --k_sigma 6
To obtain alternative configurations of HINTS-Jacobi-MG, please explore different values of the parameters --dofs_don, --num_samples, --num_basis_functions, and --k_sigma. This will require you to create different datasets as well as to train the corresponding DeepONets.
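For example (the parameter values below are illustrative and do not correspond to a configuration reported in the paper):

    python3 -u hints_test_HINTSgmg_sampled_k.py --epochs 50000000 --force_retrain true --recreate_data true --only_train false --num_samples_total 20000 --num_samples 20000 --dofs_don 16 --num_basis_functions 256 --k_sigma 3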
This code was developed for research purposes only. The authors make no warranties, express or implied, regarding its suitability for any particular purpose or its performance.
The software is released with NO WARRANTY and is licensed under the BSD 3-Clause license.
Copyright (c) 2024 Brown University
Contact: Alena Kopaničáková ([email protected]), Adar Kahana ([email protected])
Developers:
- Alena Kopaničáková (Brown, Providence; USI, Lugano; ANITI, Toulouse)
- Adar Kahana (Brown, Providence)
- Enrui Zhang (Brown, Providence)