Localhost plugin support
isafulf committed Apr 8, 2023
1 parent 96add7c commit 15b1169
Showing 8 changed files with 463 additions and 65 deletions.
1 change: 1 addition & 0 deletions .dockerignore
@@ -4,6 +4,7 @@
scripts/
tests/
examples/
local-server/
*.md
*.pyc
.dockerignore
26 changes: 25 additions & 1 deletion README.md
@@ -42,9 +42,11 @@ This README provides detailed information on how to set up, develop, and deploy
- [Qdrant](#qdrant)
- [Redis](#redis)
- [Running the API Locally](#running-the-api-locally)
- [Testing a Localhost Plugin in ChatGPT](#testing-a-localhost-plugin-in-chatgpt)
- [Personalization](#personalization)
- [Authentication Methods](#authentication-methods)
- [Deployment](#deployment)
- [Installing a Developer Plugin](#installing-a-developer-plugin)
- [Webhooks](#webhooks)
- [Scripts](#scripts)
- [Limitations](#limitations)
@@ -122,6 +124,13 @@ Follow these steps to quickly set up and run the ChatGPT Retrieval Plugin:
9. Run the API locally: `poetry run start`
10. Access the API documentation at `http://0.0.0.0:8000/docs` and test the API endpoints (make sure to add your bearer token).
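
For example, a minimal smoke test of the `/query` endpoint might look like the sketch below. It assumes the server from step 9 is running, that your bearer token is exported as `BEARER_TOKEN`, and that the `requests` package is installed; the request body follows the plugin's `QueryRequest` model.

```python
import os
import requests

# Hypothetical smoke test against the locally running API (port 8000 by default).
response = requests.post(
    "http://0.0.0.0:8000/query",
    headers={"Authorization": f"Bearer {os.environ['BEARER_TOKEN']}"},
    json={"queries": [{"query": "What is our vacation policy?", "top_k": 3}]},
)
response.raise_for_status()
print(response.json())
```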

### Testing in ChatGPT

To test a locally hosted plugin in ChatGPT, follow these steps:

1. Run the API on localhost: `poetry run dev`
2. Follow the instructions in the [Testing a Localhost Plugin in ChatGPT](#testing-a-localhost-plugin-in-chatgpt) section of the README.

For more detailed information on setting up, developing, and deploying the ChatGPT Retrieval Plugin, refer to the full Development section below.

## About
@@ -257,7 +266,6 @@ For more detailed instructions on setting up and using each vector database prov

[Redis](https://redis.com/solutions/use-cases/vector-database/) is a real-time data platform suitable for a variety of use cases, including everyday applications and AI/ML workloads. It can be used as a low-latency vector engine by creating a Redis database with the [Redis Stack docker container](/examples/docker/redis/docker-compose.yml). For a hosted/managed solution, [Redis Cloud](https://app.redislabs.com/#/) is available. For detailed setup instructions, refer to [`/docs/providers/redis/setup.md`](/docs/providers/redis/setup.md).


#### LlamaIndex

[LlamaIndex](https://github.com/jerryjliu/llama_index) is a central interface to connect your LLMs with external data.
@@ -289,6 +297,22 @@ Append `docs` to the URL shown in the terminal and open it in a browser to acces

**Note:** If you add new dependencies to the pyproject.toml file, you need to run `poetry lock` and `poetry install` to update the lock file and install the new dependencies.

### Testing a Localhost Plugin in ChatGPT

To test a localhost plugin in ChatGPT, use the provided [`local-server/main.py`](/local-server/main.py) file, which is configured specifically for localhost testing: it sets the CORS origins, disables authentication, and adds routes for the manifest, OpenAPI schema, and logo.

Follow these steps to test your localhost plugin:

1. Run the localhost server with the `poetry run dev` command. This starts the server at the default address, `localhost:3333`.

2. Visit [ChatGPT](https://chat.openai.com/), select "Plugins" from the model picker, click on the plugins picker, and click on "Plugin store" at the bottom of the list.

3. Choose "Develop your own plugin" and enter your localhost URL (e.g. `localhost:3333`) when prompted.

4. Your localhost plugin is now enabled for your ChatGPT session.

For more information, refer to the [OpenAI documentation](https://platform.openai.com/docs/plugins/getting-started/openapi-definition).
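
Before installing the plugin, a quick sanity check that the server is exposing the files ChatGPT needs might look like the following sketch (it assumes the default port 3333 and the `requests` package):

```python
import requests

base = "http://localhost:3333"
# Each of these routes is defined in local-server/main.py.
for path in ["/.well-known/ai-plugin.json", "/.well-known/openapi.yaml", "/.well-known/logo.png"]:
    r = requests.get(base + path)
    r.raise_for_status()
    print(path, r.status_code, r.headers.get("content-type"))
```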

### Personalization

You can personalize the Retrieval Plugin for your own use case by doing the following:
18 changes: 18 additions & 0 deletions local-server/ai-plugin.json
@@ -0,0 +1,18 @@
{
"schema_version": "v1",
"name_for_model": "retrieval",
"name_for_human": "Retrieval Plugin",
"description_for_model": "Plugin for searching through the user's documents (such as files, emails, and more) to find answers to questions and retrieve relevant information. Use it whenever a user asks something that might be found in their personal information.",
"description_for_human": "Search through your documents.",
"auth": {
"type": "none"
},
"api": {
"type": "openapi",
"url": "http://localhost:3333/.well-known/openapi.yaml"
},
"logo_url": "http://localhost:3333/.well-known/logo.png",
"contact_email": "[email protected]",
"legal_info_url": "[email protected]"
}

Binary file added local-server/logo.png
145 changes: 145 additions & 0 deletions local-server/main.py
@@ -0,0 +1,145 @@
# This is a version of the main.py file found in ../server/main.py for testing the plugin locally.
# Use the command `poetry run dev` to run this.
from typing import Optional
import uvicorn
from fastapi import FastAPI, File, Form, HTTPException, Body, UploadFile

from models.api import (
    DeleteRequest,
    DeleteResponse,
    QueryRequest,
    QueryResponse,
    UpsertRequest,
    UpsertResponse,
)
from datastore.factory import get_datastore
from services.file import get_document_from_file

from starlette.responses import FileResponse

from models.models import DocumentMetadata, Source
from fastapi.middleware.cors import CORSMiddleware


app = FastAPI()

PORT = 3333

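# Allow the local server itself and https://chat.openai.com to call this API cross-origin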
origins = [
    f"http://localhost:{PORT}",
    "https://chat.openai.com",
]

app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


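# Routes that serve the plugin manifest, logo, and OpenAPI schema that ChatGPT fetches when installing the plugin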
@app.route("/.well-known/ai-plugin.json")
async def get_manifest(request):
    file_path = "./local-server/ai-plugin.json"
    return FileResponse(file_path, media_type="application/json")


@app.route("/.well-known/logo.png")
async def get_logo(request):
    file_path = "./local-server/logo.png"
    return FileResponse(file_path, media_type="image/png")


@app.route("/.well-known/openapi.yaml")
async def get_openapi(request):
    file_path = "./local-server/openapi.yaml"
    return FileResponse(file_path, media_type="text/yaml")


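# The endpoints below mirror server/main.py, but without bearer-token authentication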
@app.post(
    "/upsert-file",
    response_model=UpsertResponse,
)
async def upsert_file(
    file: UploadFile = File(...),
    metadata: Optional[str] = Form(None),
):
    try:
        metadata_obj = (
            DocumentMetadata.parse_raw(metadata)
            if metadata
            else DocumentMetadata(source=Source.file)
        )
    except Exception:
        # Fall back to default file metadata if the provided JSON cannot be parsed
        metadata_obj = DocumentMetadata(source=Source.file)

    document = await get_document_from_file(file, metadata_obj)

    try:
        ids = await datastore.upsert([document])
        return UpsertResponse(ids=ids)
    except Exception as e:
        print("Error:", e)
        raise HTTPException(status_code=500, detail=str(e))


@app.post(
    "/upsert",
    response_model=UpsertResponse,
)
async def upsert(
    request: UpsertRequest = Body(...),
):
    try:
        ids = await datastore.upsert(request.documents)
        return UpsertResponse(ids=ids)
    except Exception as e:
        print("Error:", e)
        raise HTTPException(status_code=500, detail="Internal Service Error")


@app.post("/query", response_model=QueryResponse)
async def query_main(request: QueryRequest = Body(...)):
    try:
        results = await datastore.query(
            request.queries,
        )
        return QueryResponse(results=results)
    except Exception as e:
        print("Error:", e)
        raise HTTPException(status_code=500, detail="Internal Service Error")


@app.delete(
    "/delete",
    response_model=DeleteResponse,
)
async def delete(
    request: DeleteRequest = Body(...),
):
    if not (request.ids or request.filter or request.delete_all):
        raise HTTPException(
            status_code=400,
            detail="One of ids, filter, or delete_all is required",
        )
    try:
        success = await datastore.delete(
            ids=request.ids,
            filter=request.filter,
            delete_all=request.delete_all,
        )
        return DeleteResponse(success=success)
    except Exception as e:
        print("Error:", e)
        raise HTTPException(status_code=500, detail="Internal Service Error")


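# Initialize the shared datastore once when the server starts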
@app.on_event("startup")
async def startup():
    global datastore
    datastore = await get_datastore()


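# Entry point for `poetry run dev`; reload=True restarts the server when the code changes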
def start():
    uvicorn.run("local-server.main:app", host="localhost", port=PORT, reload=True)
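
As a rough end-to-end check of the local server above, something like the following sketch can be used. It assumes `poetry run dev` is running on the default port 3333 with a configured datastore, that no Authorization header is needed (auth is disabled in this setup), and that the `requests` package is installed; the request bodies follow the plugin's upsert and query models.

```python
import requests

base = "http://localhost:3333"

# Upsert a document into the local server, then query it back.
upsert = requests.post(
    base + "/upsert",
    json={"documents": [{"id": "doc1", "text": "The cafeteria closes at 3pm on Fridays."}]},
)
upsert.raise_for_status()
print("Upserted ids:", upsert.json()["ids"])

query = requests.post(
    base + "/query",
    json={"queries": [{"query": "When does the cafeteria close?", "top_k": 1}]},
)
query.raise_for_status()
print(query.json())
```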