kopanicakova/HINTS_precond


HINTS: This repository contains the code used to generate results in the paper:
"Blending Neural Operators and Relaxation Methods in PDE Numerical Solvers" by Zhang, Kahana, Kopanicakova, Turkel, Ranade, Pathak, and Karniadakis.

If you use the developed code or its components for your research, please use the following BibTeX entry (or equivalent) to cite us:

@article{zktrpkk_24,
title = {Blending Neural Operators and Relaxation Methods in PDE Numerical Solvers},
author = {Enrui Zhang and Adar Kahana and Alena Kopani{\v{c}}{\'a}kov{\'a} and Eli Turkel and Rishikesh Ranade and Jay Pathak and George Em Karniadakis},
journal = {Nature Machine Intelligence},
issn = {2522-5839},
year = {2024},
doi = {10.1038/s42256-024-00910-x},
URL = {https://www.nature.com/articles/s42256-024-00910-x},
note={https://arxiv.org/abs/2208.13273},
}

NumPy-based code: HINTS methodology and small-scale examples

Dependencies

scipy>=1.4.1
torch>=1.8
tqdm>=4.46.0
numpy>=1.21
matplotlib>=3.1.2

Hardware requirements

A GPU is not necessary, but it is highly desirable for training the DeepONets efficiently. In that case, a proper installation of the GPU drivers (such as CUDA integration) is expected.

Installation guide

  1. cd HINTS_numpy
  2. Make sure you have Python 3.7 or newer installed (the newest versions may be inconsistent with the pinned dependencies).
  3. Make sure you have pip installed.
  4. Open a command line interface and switch to the project folder.
  5. Run 'pip install -r requirements.txt' from within the project folder. (Typical installation time on a normal desktop computer is < 2 mins.)

Instructions to run the code

The code is used through a file called 'configs.py', located in the HINTS_NP project folder.

  1. To train a DeepONet, choose the desired parameters in 'configs.py' (such as the dimension, the problem, the domain size, etc.).
  2. Set the variable 'ITERATION_METHOD' to 'DeepONet' and execute python3 main.py
  • This will automatically create the data, train a DeepONet, and finally throw an error.
    The error is expected: it marks the end of the 'Only DeepONet' mode, which is not the intended mode for most uses of HINTS.
  3. During training, the folder 'debug_figs' logs images at the plotting interval set in the config.
    These help monitor the training and verify that the network trains well.
  4. After training is done, the model is saved into the models folder.
  5. Set the variable 'ITERATION_METHOD' to 'Numerical_DeepONet_Hybrid' and run HINTS as: python3 main.py
  6. The outputs are logged into the outputs folder. (Typical run time for the 1D examples is < 10 mins, including training of the DeepONet. The expected output is reported in the manuscript; see, for example, the convergence plots in Figure 3B for the 1D Poisson equation and the HINTS-Jacobi solver.)

To train another DeepONet (i.e., to change the problem/scenario), the results.npz file in the outputs folder needs to be deleted.
Alternatively, the flag 'FORCE_RETRAIN' can be set to True.

  • We recommend changing the names of the models and the datasets at the bottom of the configs file to match the simulation you wish to run.
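For orientation, the two switches mentioned above might look like this inside configs.py. This is a minimal sketch: only ITERATION_METHOD and FORCE_RETRAIN are named in this README, and the surrounding fields and exact values in the real file may differ.

```python
# Hypothetical sketch of the relevant switches in configs.py.
# Only ITERATION_METHOD and FORCE_RETRAIN are named in this README;
# everything else in the real file may differ.

# 'DeepONet' trains the network only; 'Numerical_DeepONet_Hybrid' runs HINTS.
ITERATION_METHOD = "Numerical_DeepONet_Hybrid"

# Set to True to retrain without manually deleting outputs/results.npz.
FORCE_RETRAIN = False
```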

Reproducing the results from the manuscript

We provide examples of config files, datasets, and pre-trained models in order to reproduce the results reported in the manuscript.
In particular:
  • example_configs/configs_1D_Helmholtz_HINTS_Jacobi.py can be used to run the HINTS-Jacobi solver for the Helmholtz problem and reproduce Figure 3 in the manuscript.
  • example_configs/configs_1D_Poisson_HINTS_Jacobi.py can be used to run the HINTS-Jacobi solver for the Poisson problem and reproduce Figure 2 in the manuscript.
  • example_configs/configs_1D_Poisson_MG_HINTS_GS_smoother.py can be used to run the MG solver with the HINTS-GS (Gauss-Seidel) smoother for the Poisson problem and reproduce Figure 5 in the manuscript.

To run the desired experiments, please replace configs.py with any of these files.

PETSc-based code: large-scale hybrid preconditioning for Krylov methods and a demonstration of interfacing with state-of-the-art linear algebra software

This variant of HINTS uses Firedrake for the assembly of finite element systems and PETSc for linear algebra, including standard stationary and Krylov methods.

Dependencies

Firedrake=0.13.0+6118.g149f8fda6
petsc=3.20.5
torch=2.2.2
numpy=1.24.0
matplotlib=3.9.0
pandas=2.2.2

Hardware requirements

GPU is not necessary but it is highly desirable for training the DeepONets efficiently.

Installation guide (building PETSc might take more than 1 hour)

  1. Make sure to deactivate any conda environment you might have active!

  2. Install Firedrake - official guidance can be found at https://www.firedrakeproject.org/download.html.

    We have followed these steps:
    2.1. mkdir my_firedrake
    2.2. cd my_firedrake
    2.3. curl -O https://raw.githubusercontent.com/firedrakeproject/firedrake/master/scripts/firedrake-install
    2.4. Add support for exodus meshes, i.e., add petsc_options.add("--download-exodusii") at line 745 of the firedrake-install script
    2.5. python3 firedrake-install --disable-ssh --no-package-manager

Instructions to run the code

3.1. Source the Firedrake environment, i.e.,
       . /my_path/my_firedrake/firedrake/bin/activate
3.2. cd HINTS_petsc
3.3. Export the path to the HINTS_petsc code, i.e.,
       export PYTHONPATH=$PYTHONPATH:my_path/HINTS_petsc
3.4. cd example

Running the numerical experiments:

We have uploaded to Zenodo an instance of the dataset used to generate the large-scale results reported in the paper. The dataset can be downloaded from the following link: https://zenodo.org/records/10904349/files/NonNestedHelm3D_5000_1_32_1_0.0001.pkl?download=1. Using the provided code, you can generate larger or smaller datasets. The HINTS preconditioner typically performs better if the DeepONet is trained using a larger number of samples.
To execute the different experiments, one can use the following commands:

3.5.1. HYPRE-AMG preconditioner:
	- Using the uploaded samples for k=6, you can test the code as:
		python3 -u hints_test_hypre_sampled_k.py --num_samples_total 10000 --num_samples 100000 --k_sigma 6.0


3.5.2. HINTS-MG preconditioner, e.g., for the example where the DeepONet is trained with only 10,000 samples on an 8x8x8 cube:
	- python3 -u hints_test_HINTSgmg_sampled_k.py --epochs 50000000 --force_retrain false --recreate_data false --only_train false --num_samples_total 10000 --num_samples 10000 --dofs_don 8 --num_basis_functions 128 --k_sigma 6

To obtain alternative configurations of HINTS-Jacobi-MG, please explore different values of the parameters --dofs_don, --num_samples, --num_basis_functions, and --k_sigma. This will require you to create different datasets as well as to train new DeepONets.
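To organize such a sweep, one could generate the command lines programmatically. This is a sketch only: the script name and flag names are from this README, while the helper and the specific parameter values are illustrative.

```python
# Sketch: enumerate HINTS-MG runs over a few parameter combinations.
# Script name and flag names are from this README; the chosen values and
# the helper itself are illustrative only.
import itertools

def make_commands(dofs_don=(8, 16), num_basis_functions=(64, 128), k_sigma=6.0):
    commands = []
    for dofs, nbf in itertools.product(dofs_don, num_basis_functions):
        commands.append([
            "python3", "-u", "hints_test_HINTSgmg_sampled_k.py",
            "--dofs_don", str(dofs),
            "--num_basis_functions", str(nbf),
            "--k_sigma", str(k_sigma),
        ])
    return commands

# Each command can then be launched, e.g., with subprocess.run(cmd, check=True).
```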

Disclaimer

This code was developed for research purposes only. The authors make no warranties, express or implied, regarding its suitability for any particular purpose or its performance.

License

The software is provided with NO WARRANTY and is licensed under the BSD 3-Clause license.

Copyright

Copyright (c) 2024 Brown University

Contact

Alena Kopaničáková ([email protected]), Adar Kahana ([email protected])

Contributors:

  • Alena Kopanicakova (Brown, Providence; USI, Lugano; ANITI, Toulouse)
  • Adar Kahana (Brown, Providence)
  • Enrui Zhang (Brown, Providence)
