# Download OpenLLaMA weights

OpenLLaMA is a permissively licensed open source reproduction of Meta AI's LLaMA 7B and 13B checkpoints trained on the RedPajama dataset. The weights can serve as a drop-in replacement for LLaMA in existing implementations. We also provide a smaller 3B variant.

To see all the available OpenLLaMA checkpoints, run:

```bash
python scripts/download.py | grep open_llama
```

which will print

```text
openlm-research/open_llama_3b
openlm-research/open_llama_7b
openlm-research/open_llama_13b
```

In order to use a specific OpenLLaMA checkpoint, for instance open_llama_3b, download the weights and convert the checkpoint to the lit-gpt format:

```bash
pip install huggingface_hub

python scripts/download.py --repo_id openlm-research/open_llama_3b

python scripts/convert_hf_checkpoint.py --checkpoint_dir checkpoints/openlm-research/open_llama_3b
```

By default, the `convert_hf_checkpoint.py` step will use the data type of the HF checkpoint's parameters. In cases where RAM or disk space is constrained, it might be useful to pass `--dtype bfloat16` to convert all parameters to this lower precision before continuing.
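For example, the conversion step above could be run with reduced precision as follows (a minimal sketch; it assumes the open_llama_3b weights have already been downloaded to the default `checkpoints/` location as shown above):

```bash
# Convert the HF checkpoint, casting all parameters to bfloat16 to reduce RAM and disk usage.
python scripts/convert_hf_checkpoint.py \
  --checkpoint_dir checkpoints/openlm-research/open_llama_3b \
  --dtype bfloat16
```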

You're done! To execute the model, just run:

```bash
pip install sentencepiece

python generate/base.py --prompt "Hello, my name is" --checkpoint_dir checkpoints/openlm-research/open_llama_3b
```
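The same command works with the larger checkpoints listed earlier. As a sketch, assuming you have also downloaded and converted open_llama_7b with the steps above, you could point `--checkpoint_dir` at that directory instead:

```bash
# Generate with the 7B variant (assumes it was downloaded and converted the same way as the 3B checkpoint).
python generate/base.py --prompt "Hello, my name is" --checkpoint_dir checkpoints/openlm-research/open_llama_7b
```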