Qualified number of epochs for LoRa weights
AndriyMulyar authored Mar 29, 2023
1 parent b10890f commit aa4dd0e
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -33,8 +33,8 @@ Note: the full model on GPU (16GB of RAM required) performs much better in our q
# Reproducibility

Trained LoRa Weights:
-- gpt4all-lora: https://huggingface.co/nomic-ai/gpt4all-lora
-- gpt4all-lora-epoch-2 https://huggingface.co/nomic-ai/gpt4all-lora-epoch-2
+- gpt4all-lora (four full epochs of training): https://huggingface.co/nomic-ai/gpt4all-lora
+- gpt4all-lora-epoch-2 (three full epochs of training) https://huggingface.co/nomic-ai/gpt4all-lora-epoch-2
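A minimal sketch of how these adapter weights might be used: attaching one of the LoRA checkpoints above to a base model with the Hugging Face `peft` library. The choice of `peft` and the base-model name in `load_adapter` are assumptions for illustration, not something this commit specifies.

```python
# Hypothetical sketch, not part of the commit: map the two LoRA checkpoints
# named above to their Hugging Face repo ids.
ADAPTER_REPOS = {
    "gpt4all-lora": "nomic-ai/gpt4all-lora",                   # four full epochs of training
    "gpt4all-lora-epoch-2": "nomic-ai/gpt4all-lora-epoch-2",   # three full epochs of training
}

def load_adapter(name: str, base_model_name: str = "decapoda-research/llama-7b-hf"):
    """Attach the chosen LoRA adapter to a base model (downloads weights).

    The base model name is an assumption; substitute whichever LLaMA
    checkpoint you have access to.
    """
    from transformers import AutoModelForCausalLM
    from peft import PeftModel

    base = AutoModelForCausalLM.from_pretrained(base_model_name)
    return PeftModel.from_pretrained(base, ADAPTER_REPOS[name])
```

Calling `load_adapter("gpt4all-lora")` would download both the base weights and the adapter, so expect a large one-time fetch on first use.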

Raw Data:
- [Training Data Without P3](https://s3.amazonaws.com/static.nomic.ai/gpt4all/2022_03_27/gpt4all_curated_data_without_p3_2022_03_27.tar.gz)
