Forked from [camenduru/LLaVA-colab](https://github.com/camenduru/LLaVA-colab).
| Colab | Info |
| --- | --- |
| 🌋 LLaVA_7b_8bit_colab | 7B (8bit) |
| 🌋 LLaVA_7b_colab | 7B (16bit) (Pro High-RAM 😐 22GB RAM 14GB VRAM) |
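
The two notebooks differ mainly in how the checkpoint is loaded. Below is a minimal sketch of that difference using the `load_pretrained_model` helper from the haotian-liu/LLaVA repo; the checkpoint name and arguments are assumptions and may not match the notebooks' exact code.

```python
# Illustrative only: the notebooks may pin a different checkpoint or loader.
from llava.model.builder import load_pretrained_model
from llava.mm_utils import get_model_name_from_path

model_path = "liuhaotian/llava-v1.5-7b"  # assumed 7B checkpoint (~14.7 GB in 16-bit)
model_name = get_model_name_from_path(model_path)

# 16-bit (half precision): needs roughly 14 GB of VRAM.
tokenizer, model, image_processor, context_len = load_pretrained_model(
    model_path=model_path, model_base=None, model_name=model_name
)

# 8-bit (bitsandbytes quantization): roughly halves the footprint to ~8 GB.
tokenizer, model, image_processor, context_len = load_pretrained_model(
    model_path=model_path, model_base=None, model_name=model_name, load_8bit=True
)
```

Loading in 8-bit trades a little quality for about half the VRAM, which is why only the 16-bit notebook asks for a Pro High-RAM runtime.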
Tutorial video: https://www.youtube.com/watch?v=o7zQAa0NPds
After clicking the third cell, please wait for the model to load: roughly 14.7 GB for the 16-bit notebook and roughly 8 GB for the 8-bit one, which takes about 5 minutes. The cell shows no output because the model is loaded in another thread.
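
A minimal sketch of why the cell stays silent: the slow loading step runs in a background thread, so the cell returns immediately. `load_model` here is a placeholder standing in for the loading shown above, not the notebook's actual function.

```python
import threading
import time

def load_model():
    # Placeholder for the real work: in the notebook this step downloads the
    # weights (~8-14.7 GB) and moves them to the GPU, taking around 5 minutes.
    time.sleep(2)

loader = threading.Thread(target=load_model)
loader.start()  # the cell finishes right away, so no output appears
# Before sending the first prompt, make sure loading has finished:
loader.join()
```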
Main repo: https://github.com/haotian-liu/LLaVA