
Commit

Merge remote-tracking branch 'origin/main'
# Conflicts:
#	ingest.py
PromtEngineer committed Jun 10, 2023
2 parents d3e7fee + b073013 commit 72919e4
Showing 2 changed files with 7 additions and 1 deletion.
6 changes: 5 additions & 1 deletion README.md
@@ -171,7 +171,11 @@ pip install xformers
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

self.to(device)

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=PromtEngineer/localGPT&type=Date)](https://star-history.com/#PromtEngineer/localGPT&Date)

# Disclaimer
This is a test project to validate the feasibility of a fully local solution for question answering using LLMs and vector embeddings. It is not production-ready and is not meant to be used in production. Vicuna-7B is based on the Llama model, so it retains the original Llama license.
2 changes: 2 additions & 0 deletions constants.py
@@ -19,6 +19,8 @@

PERSIST_DIRECTORY = f"{ROOT_DIRECTORY}/DB"

INGEST_THREADS = 8

# Define the Chroma settings
CHROMA_SETTINGS = Settings(
chroma_db_impl='duckdb+parquet',
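The diff only introduces the `INGEST_THREADS` constant; the commit message notes a conflict in `ingest.py`, which suggests the constant caps a thread pool used for document loading. Below is a minimal, hypothetical sketch of how such a value could drive parallel ingestion. The `load_single_document` helper and the `SOURCE_DOCUMENTS` directory name are assumptions for illustration, not the repository's actual code.

```python
# Hypothetical sketch: using a constant like INGEST_THREADS to parallelize
# document loading. Names below are assumed, not taken from ingest.py.
import os
from concurrent.futures import ThreadPoolExecutor

INGEST_THREADS = 8
SOURCE_DIRECTORY = "SOURCE_DOCUMENTS"  # assumed input directory


def load_single_document(path: str) -> str:
    # Placeholder loader: read the file as plain text.
    with open(path, "r", encoding="utf-8", errors="ignore") as f:
        return f.read()


def load_documents(source_dir: str) -> list[str]:
    paths = [
        os.path.join(source_dir, name)
        for name in os.listdir(source_dir)
        if os.path.isfile(os.path.join(source_dir, name))
    ]
    # Fan the file reads out across a fixed-size thread pool.
    with ThreadPoolExecutor(max_workers=INGEST_THREADS) as executor:
        return list(executor.map(load_single_document, paths))


if __name__ == "__main__":
    docs = load_documents(SOURCE_DIRECTORY)
    print(f"Loaded {len(docs)} documents")
```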

