
Merge pull request Significant-Gravitas#794 from crimson-knight/add-documentation-for-caching-types

Adds information on how to use the other cache methods available
richbeales authored Apr 12, 2023
2 parents f9d8f72 + d24c4af commit 5bb77a8
Showing 2 changed files with 12 additions and 0 deletions.
1 change: 1 addition & 0 deletions .env.template
@@ -17,3 +17,4 @@ OPENAI_AZURE_EMBEDDINGS_DEPLOYMENT_ID=deployment-id-for-azure-embeddigs
IMAGE_PROVIDER=dalle
HUGGINGFACE_API_TOKEN=
USE_MAC_OS_TTS=False
MEMORY_BACKEND=local
11 changes: 11 additions & 0 deletions README.md
@@ -230,6 +230,17 @@ export PINECONE_ENV="Your pinecone region" # something like: us-east4-gcp
```


## Setting Your Cache Type

By default, Auto-GPT uses `LocalCache` rather than Redis or Pinecone.

To switch to one of the other backends, set the `MEMORY_BACKEND` environment variable to the value you want:

* `local` (default) uses a local JSON cache file
* `pinecone` uses the Pinecone.io account you configured in your ENV settings
* `redis` will use the Redis cache that you configured
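As a minimal sketch, switching backends for the current shell session could look like this (assuming the `.env`/environment-variable workflow described above; `redis` and `pinecone` require the corresponding services and credentials to be configured first):

```shell
# Select the memory backend for this shell session.
# Valid values: local (default), pinecone, redis.
export MEMORY_BACKEND=local
echo "Using memory backend: $MEMORY_BACKEND"
```

Alternatively, add the `MEMORY_BACKEND=...` line to your `.env` file so the choice persists across sessions.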

## View Memory Usage

1. View memory usage by using the `--debug` flag :)
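For example, a debug run might look like the following (the `scripts/main.py` entry point is an assumption based on the repository layout at the time; adjust the invocation to match your install):

```shell
# Hypothetical invocation: run Auto-GPT with debug logging enabled;
# memory usage details are included in the debug output.
python scripts/main.py --debug
```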
