Example of a simple AI project integrated with DTU's HPC using a few MLOps tools.
- Create a Weights & Biases (wandb) account, create a project, and copy your wandb API key.
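  Once the requirements are installed (see below), you can authenticate from the HPC shell. A minimal sketch using the standard wandb CLI:

  ```bash
  # Either export the key for the current session...
  export WANDB_API_KEY=my_api_key
  # ...or store it persistently via the CLI (prompts for the key if omitted)
  wandb login my_api_key
  ```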
- Log in to the DTU HPC. Follow the official documentation; a typical connection command is sketched below.
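  The exact hostname is in the official docs; a sketch, assuming a login node like login1.hpc.dtu.dk:

  ```bash
  # Replace s123456 with your DTU user id; the hostname is an assumption,
  # check the official DTU HPC documentation for the current login nodes
  ssh s123456@login1.hpc.dtu.dk
  ```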
- Make sure you are on an interactive node and not a login node. In the terminal, type: linuxsh
- Make sure a Python module is loaded. In the terminal, type: module load python3/3.10.7 (optionally, add it to your .bashrc, as shown below)
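  To load the module automatically on future logins, append it to your .bashrc:

  ```bash
  # Load the Python module in every new shell
  echo "module load python3/3.10.7" >> ~/.bashrc
  ```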
- Clone this repository
- Create a Python environment. In the terminal, type: python3 -m venv .venv
- Activate the environment: source .venv/bin/activate
- Install the packages in the requirements file: pip install -r requirements.txt
- Create a file called secret.txt and paste the API key into it, e.g.: echo my_api_key > secret.txt
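  Since the file holds a credential, it is worth locking it down and keeping it out of git; a minimal sketch:

  ```bash
  echo my_api_key > secret.txt
  chmod 600 secret.txt           # readable by you only
  echo secret.txt >> .gitignore  # never commit the key
  ```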
- Update the config.yaml file with the correct project name and user name (a hypothetical example follows).
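  The exact keys depend on this repository, but based on the options used below, the file plausibly looks something like this (hypothetical layout; edit the real file rather than copying this):

  ```yaml
  # Hypothetical config.yaml layout; the real keys may differ
  wandb:
    project: my-project   # your wandb project name
    user: my-username     # your wandb user name
  hyper:
    epochs: 10
    batch_size: 64
  ```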
- Submit jobs to the HPC by typing: python3 create_job.py
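  You can check that a job was accepted with the standard LSF commands:

  ```bash
  bjobs            # list your pending/running jobs
  bjobs -l 123456  # detailed view of a specific job id
  bkill 123456     # cancel a job if needed
  ```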
- You can override configuration variables on the command line, for example: python3 create_job.py hyper.epochs=10 hyper.batch_size=64
- See the available options in the config.yaml file, and add your own in the create_job.py script (a sketch of a small sweep follows).
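  A small sketch of submitting several jobs with different overrides, assuming create_job.py accepts the key=value syntax shown above:

  ```bash
  # Submit one job per batch size (hypothetical sweep)
  for bs in 32 64 128; do
      python3 create_job.py hyper.epochs=10 hyper.batch_size=$bs
  done
  ```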
- Change the BSUB queue options to suit your job's needs (an illustrative jobscript header follows).
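  For reference, an illustrative #BSUB header as it might appear in the generated jobscript; the queue name and resource values are assumptions, adjust them to your needs:

  ```bash
  #!/bin/sh
  #BSUB -q gpuv100                           # queue (assumed; pick one available to you)
  #BSUB -J my_job                            # job name
  #BSUB -n 4                                 # number of cores
  #BSUB -R "span[hosts=1]"                   # all cores on one host
  #BSUB -R "rusage[mem=8GB]"                 # memory per core
  #BSUB -gpu "num=1:mode=exclusive_process"  # one GPU
  #BSUB -W 24:00                             # wall-clock limit hh:mm
  #BSUB -o lsf_logs/gpu_%J.out               # stdout log (%J = job id)
  #BSUB -e lsf_logs/gpu_%J.err               # stderr log
  ```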
- The LSF HPC logs will be written to the lsf_logs folder. Read them in the terminal, e.g.: cat lsf_logs/gpu_123456.out
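  Some handy ways to inspect the logs while a job runs:

  ```bash
  ls lsf_logs/                     # list all job logs
  tail -f lsf_logs/gpu_123456.out  # follow a running job's output live
  ```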