This is Fooocus from the world of Stable Diffusion, brought to the world of Text Generation: the same ease of use and the same convenience.
It is a simple and modern Gradio webui for LLaMA-based LLM models in GGML format (.bin).
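Under the hood, this kind of UI pairs Gradio with a GGML LLaMA backend. Here is a rough, minimal sketch of the idea, not the app's actual code; it assumes an older llama-cpp-python release that still loads GGML .bin files, and the model path is a placeholder:

```python
# Minimal sketch: a Gradio text box in front of a GGML LLaMA model.
# Assumes an older llama-cpp-python version with GGML support; the
# model path below is a placeholder.
import gradio as gr
from llama_cpp import Llama

llm = Llama(model_path="models/llama-7b.ggmlv3.q4_0.bin")

def generate(prompt: str) -> str:
    # Run a single completion and return the generated text.
    out = llm(prompt, max_tokens=256)
    return out["choices"][0]["text"]

gr.Interface(fn=generate, inputs="text", outputs="text").launch()
```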
Navigation:
- Installing
- Presets
- Model downloading
- Args
You can also use this webui in the Colab cloud service.

Why use it:
- Simple to use
- Comfortable to work in
- Not demanding on resources
- Beautiful and pleasant interface
>>> Portable one-click package <<<
Step-by-step installation (Windows):
- Install Python 3.10.6 and Git
- Run `git clone https://github.com/ehristoforu/TensorLM-webui.git`
- Run `cd TensorLM-webui`
- Run `update_mode.bat` and enter 1, then 2
- Run `start.bat`
Step-by-step installation (macOS & Linux):
- Install Python 3.10.6 and Git
- Run `git clone https://github.com/ehristoforu/TensorLM-webui.git`
- Run `cd TensorLM-webui`
- Run `python -m pip install -r requirements.txt`
- Run `python webui.py`
This app ships with 23 default presets. Thanks to @mustvlad for the system prompts! You can also create your own custom presets; the instructions are in the presets folder (a .md file).
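The exact format is described by that .md file. Purely as an illustration (an assumption, not the app's actual loader), presets could be plain system-prompt files picked up by name:

```python
# Hypothetical sketch: read every .md file in the presets folder as a
# name -> system-prompt mapping. This assumes presets are plain text;
# check the instructions file in the folder for the real format.
from pathlib import Path

def load_presets(folder: str = "presets") -> dict[str, str]:
    return {
        path.stem: path.read_text(encoding="utf-8").strip()
        for path in sorted(Path(folder).glob("*.md"))
    }

presets = load_presets()
print(sorted(presets))  # e.g. ['assistant', 'poet', 'teacher', ...]
```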
With this interface you don't need to scour the Internet for a compatible model: enable the "Tabs" checkbox and, in the "ModelGet" tab, choose which model to download from our verified repository on HuggingFace.
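Conceptually, the tab fetches one model file from HuggingFace, as in this sketch (the repo and filename below are placeholders, not necessarily the webui's verified repository):

```python
# Sketch of downloading a GGML model file from Hugging Face.
# repo_id and filename are placeholders for illustration only.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-GGML",     # placeholder repo
    filename="llama-2-7b.ggmlv3.q4_0.bin",  # placeholder file
    local_dir="models",
)
print(f"Saved to {path}")
```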
To use args:
- On Windows: edit start.bat with Notepad and change the line `python webui.py` to `python webui.py [your args]`, for example `python webui.py --inbrowser`
- On macOS & Linux: run `python webui.py` with args appended, i.e. `python webui.py [your args]`, for example `python webui.py --inbrowser`

Available args: `--inbrowser --share --lowvram --debug --quiet`
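As a sketch of how flags like these are typically wired into Gradio (an assumption about webui.py, not its actual code):

```python
# Hypothetical sketch: parse CLI flags and forward the ones Gradio
# understands to launch(). --lowvram would be app-specific.
import argparse
import gradio as gr

parser = argparse.ArgumentParser()
for flag in ("--inbrowser", "--share", "--lowvram", "--debug", "--quiet"):
    parser.add_argument(flag, action="store_true")
args = parser.parse_args()

demo = gr.Interface(fn=lambda text: text, inputs="text", outputs="text")
demo.launch(inbrowser=args.inbrowser, share=args.share,
            debug=args.debug, quiet=args.quiet)
```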
There are no forks yet 😔; perhaps you will be the first to significantly improve this application!
@software{ehristoforu_TensorLM-webui_2024,
author = {ehristoforu},
month = apr,
title = {{TensorLM-webui}},
url = {https://github.com/ehristoforu/TensorLM-webui},
year = {2024}
}