Commit f32a33e: UI Version 1

LeafmanZ committed Jun 14, 2023 · 1 parent aae4490

Showing 54 changed files with 61,054 additions and 2 deletions.
34 changes: 32 additions & 2 deletions README.md
@@ -61,8 +61,6 @@ If you want to start from an empty database, delete the `index`.

Note: The first run will take time because the embedding model must be downloaded. On subsequent runs, no data leaves your local environment, and the tool can run without an internet connection.



## Ask questions to your documents, locally!
In order to ask a question, run a command like:

@@ -95,6 +93,38 @@ In order to ask a question, run a command like:
python run_localGPT.py --device_type cpu
```

# Run the UI
The UI is split into two parts: an API (`run_localGPT_API.py`) and a UI (`localGPTUI.py`).

Start by opening `run_localGPT_API.py`.

If you are running on a CPU, change `DEVICE_TYPE = 'cuda'` to `DEVICE_TYPE = 'cpu'`.

Comment out the following:
```python
model_id = "TheBloke/WizardLM-7B-uncensored-GPTQ"
model_basename = "WizardLM-7B-uncensored-GPTQ-4bit-128g.compat.no-act-order.safetensors"
LLM = load_model(device_type=DEVICE_TYPE, model_id=model_id, model_basename=model_basename)
```
and uncomment:
```python
model_id = "TheBloke/guanaco-7B-HF"  # or another -HF or .bin model
LLM = load_model(device_type=DEVICE_TYPE, model_id=model_id)
```

If you are running on a GPU, no changes are needed.
Save and close `run_localGPT_API.py`.
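Rather than editing the constant by hand, you could select the device at runtime. The helper below is a hypothetical sketch, not part of the repo; it uses the presence of `nvidia-smi` on the PATH as a rough proxy for a usable CUDA GPU:

```python
import shutil

def detect_device_type():
    # Hypothetical helper: nvidia-smi on PATH is a rough proxy for a CUDA GPU;
    # a torch-based check (torch.cuda.is_available()) would be more precise.
    return 'cuda' if shutil.which('nvidia-smi') else 'cpu'

DEVICE_TYPE = detect_device_type()
```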

Open a terminal and activate the Python environment with the dependencies from `requirements.txt` installed.
Navigate to the `/LOCALGPT` directory.
Run the command `python run_localGPT_API.py`.
Wait until everything has loaded. You should see something like `INFO:werkzeug:Press CTRL+C to quit`.
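To confirm the API is responding before starting the UI, you can query it directly. This sketch uses only the standard library; the endpoint path and the `user_prompt` form field are taken from `localGPTUI.py` in this commit:

```python
import urllib.parse
import urllib.request

API_URL = 'http://localhost:5110/api/prompt_route'  # port and path from this commit

def build_request(prompt, url=API_URL):
    # Form-encode the prompt the same way the Flask UI does
    data = urllib.parse.urlencode({'user_prompt': prompt}).encode()
    return urllib.request.Request(url, data=data, method='POST')

# Against a running API process:
# with urllib.request.urlopen(build_request('What is in my documents?'), timeout=300) as resp:
#     print(resp.read().decode())
```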

Open a second terminal and activate the same Python environment.
Navigate to the `/LOCALGPT/localGPTUI` directory.
Run the command `python localGPTUI.py`.

Open a web browser and go to the address `http://localhost:5111/`.

# How does it work?
By selecting the right local models and leveraging the power of `LangChain`, you can run the entire pipeline locally, without any data leaving your environment, and with reasonable performance.
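The pipeline can be illustrated with a dependency-free toy: embed each document, store the vectors, retrieve the chunk closest to the question, and build a grounded prompt. The bag-of-words "embedding" below is a stand-in for illustration only; the real project uses a learned embedding model and `LangChain`:

```python
import math
from collections import Counter

def embed(text):
    # Bag-of-words counts stand in for a learned embedding vector
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# "Ingest": embed every document and keep (text, vector) pairs -- the vector store
docs = [
    'The constitution establishes three branches of government.',
    'Photosynthesis converts sunlight into chemical energy.',
]
index = [(d, embed(d)) for d in docs]

# "Query": embed the question, retrieve the closest chunk, ground the prompt on it
question = 'How many branches of government are there?'
best_doc, _ = max(index, key=lambda pair: cosine(embed(question), pair[1]))
prompt = f'Use only this context to answer.\nContext: {best_doc}\nQuestion: {question}'
print(best_doc)
```

Because nothing here calls out to a network or a model server, every step of the retrieval happens in-process, which is the same privacy property the real pipeline provides at larger scale.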

51 changes: 51 additions & 0 deletions localGPTUI/localGPTUI.py
@@ -0,0 +1,51 @@
from flask import Flask, render_template, request
from werkzeug.utils import secure_filename

import os
import sys
import requests

sys.path.append(os.path.join(os.path.dirname(__file__), '..'))

app = Flask(__name__)
app.secret_key = "LeafmanZSecretKey"


### PAGES ###
@app.route('/', methods=['GET', 'POST'])
def home_page():
    if request.method == 'POST':
        if 'user_prompt' in request.form:
            user_prompt = request.form['user_prompt']
            print(f'User Prompt: {user_prompt}')

            main_prompt_url = 'http://localhost:5110/api/prompt_route'
            response = requests.post(main_prompt_url, data={'user_prompt': user_prompt})
            print(response.status_code)  # HTTP response status code, for debugging
            if response.status_code == 200:
                return render_template('home.html', show_response_modal=True, response_dict=response.json())
        elif 'documents' in request.files:
            delete_source_url = 'http://localhost:5110/api/delete_source'
            if request.form.get('action') == 'reset':
                response = requests.get(delete_source_url)

            save_document_url = 'http://localhost:5110/api/save_document'
            run_ingest_url = 'http://localhost:5110/api/run_ingest'
            files = request.files.getlist('documents')
            os.makedirs('temp', exist_ok=True)  # ensure the staging directory exists
            for file in files:
                print(file.filename)
                filename = secure_filename(file.filename)
                file_path = os.path.join('temp', filename)  # replace with your preferred path
                file.save(file_path)
                with open(file_path, 'rb') as f:
                    response = requests.post(save_document_url, files={'document': f})
                print(response.status_code)  # HTTP response status code, for debugging
                os.remove(file_path)  # remove the file after sending the request
            # Re-index the uploaded documents via the /api/run_ingest endpoint
            response = requests.get(run_ingest_url)
            print(response.status_code)  # HTTP response status code, for debugging

    # Display the form for GET requests
    return render_template('home.html', show_response_modal=False,
                           response_dict={'Prompt': 'None', 'Answer': 'None', 'Sources': [('ewf', 'wef')]})  # placeholder sources


if __name__ == '__main__':
    app.run(debug=False, port=5111)
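For UI development without loading a model, a stub API can stand in for `run_localGPT_API.py`. The following is a hypothetical helper, not part of the repo; it answers the endpoint paths referenced by `localGPTUI.py` with canned JSON, using only the standard library (the real API is a Flask app):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubAPIHandler(BaseHTTPRequestHandler):
    """Answers the endpoints localGPTUI.py calls, with canned JSON."""

    def _send_json(self, payload):
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        # /api/delete_source and /api/run_ingest are called with GET
        self._send_json({'status': 'ok', 'path': self.path})

    def do_POST(self):
        # /api/prompt_route and /api/save_document are called with POST;
        # the keys mirror the response_dict keys used in localGPTUI.py
        self.rfile.read(int(self.headers.get('Content-Length', 0)))
        self._send_json({'Prompt': 'stub', 'Answer': 'stub answer', 'Sources': []})

    def log_message(self, *args):
        pass  # keep the console quiet during testing

# To serve on the port the UI expects:
# HTTPServer(('localhost', 5110), StubAPIHandler).serve_forever()
```

With this running in place of the real API, the UI's upload, reset, and prompt flows can be exercised end to end without downloading any model.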