Steps to run the code:
- Install Docker
- Create a virtual environment with python 3.10.13
conda create --name myenv python=3.10.13
- Activate your new virtual environment
conda activate myenv
- Install the required packages
pip install -r requirements.txt
- Create a folder called data/ under mixtral_8x7b/ and add the review data from https://www.kaggle.com/datasets/yasserh/amazon-product-reviews-dataset
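Optionally, check that the CSV loads before starting the stack. This is only a sanity-check sketch; the filename below is a placeholder for whichever CSV the Kaggle dataset actually ships with.
import pandas as pd
# Placeholder filename: substitute the actual CSV name from the Kaggle download.
df = pd.read_csv("mixtral_8x7b/data/amazon_reviews.csv")
print(df.shape)
print(df.columns.tolist())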
- Create a folder called env/ under mixtral_8x7b/ and add the following two files:
- postgres.env
  POSTGRES_DB=postgres
  POSTGRES_USER=admin
  POSTGRES_PASSWORD=root
- connection.env
  DRIVER=psycopg2
  HOST=postgres
  PORT=5432
  DATABASE=postgres
  USERNAME=admin
  PASSWORD=root
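For reference, a minimal sketch of how the connection.env values map to a database connection, assuming SQLAlchemy with the psycopg2 driver and that the variables are exported into the environment (for example via docker-compose env_file); the project's retriever may build its connection differently.
import os
from sqlalchemy import create_engine
# Assemble a SQLAlchemy URL from the connection.env variables, e.g.
# postgresql+psycopg2://admin:root@postgres:5432/postgres
url = (
    f"postgresql+{os.environ['DRIVER']}://"
    f"{os.environ['USERNAME']}:{os.environ['PASSWORD']}"
    f"@{os.environ['HOST']}:{os.environ['PORT']}/{os.environ['DATABASE']}"
)
engine = create_engine(url)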
- Download Mistral 7B model
mistral-7b-v0.1.Q4_K_M.gguf
from https://huggingface.co/TheBloke/Mistral-7B-v0.1-GGUF and add it to mixtral_8x7b/model/
- Download Nous Hermes Llama 2 7B model
nous-hermes-llama-2-7b.Q4_K_M.gguf
from https://huggingface.co/TheBloke/Nous-Hermes-Llama-2-7B-GGUF and add it to mixtral_8x7b/model/
- Download Mixtral 8x7B model
mixtral-8x7b-v0.1.Q4_K_M.gguf
from https://huggingface.co/TheBloke/Mixtral-8x7B-v0.1-GGUF and add it to mixtral_8x7b/model/
- Download Llama 2 70B Chat model
llama-2-70b-chat.Q4_K_M.gguf
from https://huggingface.co/TheBloke/Llama-2-70B-Chat-GGUF and add it to mixtral_8x7b/model/
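If you would rather script the four downloads above, a short sketch using the huggingface_hub library (pip install huggingface-hub); the repo IDs and filenames are exactly the ones listed in the previous steps.
from huggingface_hub import hf_hub_download
# Download each quantised GGUF file into the model/ folder expected by the project.
models = [
    ("TheBloke/Mistral-7B-v0.1-GGUF", "mistral-7b-v0.1.Q4_K_M.gguf"),
    ("TheBloke/Nous-Hermes-Llama-2-7B-GGUF", "nous-hermes-llama-2-7b.Q4_K_M.gguf"),
    ("TheBloke/Mixtral-8x7B-v0.1-GGUF", "mixtral-8x7b-v0.1.Q4_K_M.gguf"),
    ("TheBloke/Llama-2-70B-Chat-GGUF", "llama-2-70b-chat.Q4_K_M.gguf"),
]
for repo_id, filename in models:
    hf_hub_download(repo_id=repo_id, filename=filename, local_dir="mixtral_8x7b/model")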
- Run the command
docker-compose up --build
- Run the notebook
mixtral_8x7b.ipynb
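Inside the notebook the GGUF models can be loaded with llama-cpp-python; the sketch below is illustrative only (the context size and prompt are arbitrary, and the project's Generator class may wrap this differently).
from llama_cpp import Llama
# Load one of the quantised models and run a single completion.
llm = Llama(model_path="mixtral_8x7b/model/mistral-7b-v0.1.Q4_K_M.gguf", n_ctx=2048)
out = llm("Summarise this product review: the blender is loud but works well.", max_tokens=64)
print(out["choices"][0]["text"])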
├── mixtral_8x7b
│   ├── base                 <- Configuration class
│   ├── encoder              <- Encoder class
│   ├── generator            <- Generator class
│   ├── retriever            <- Retriever class
│   ├── data                 <- CSV file
│   ├── env                  <- env files
│   ├── model                <- GGUF models
│   ├── config.yaml          <- Config definition
│   ├── mixtral_8x7b.ipynb   <- notebook
│   ├── requirements.txt     <- package versions
│   └── docker-compose.yaml