Forked from akshata29/entaoai.

Chat and ask questions over your own data: an accelerator to quickly upload enterprise data and use Azure OpenAI services to chat with and query the uploaded content.

ChatGPT + Enterprise data with Azure OpenAI

This sample demonstrates a few approaches for creating ChatGPT-like experiences over your own data. It uses Azure OpenAI Service to access the ChatGPT (gpt-35-turbo) and GPT-3 models, and either a vector store (Pinecone, Redis, and others) or Azure Cognitive Search for data indexing and retrieval.

The repo provides a way to upload your own data so it's ready to try end to end.

Features

  • Upload PDF and text documents as well as web pages
  • Chat with your data
  • Q&A (Ask) interface
  • Explores various options to help users evaluate the trustworthiness of responses, with citations, tracking of source content, etc.
  • Shows possible approaches for data preparation, prompt construction, and orchestration of the interaction between the model (ChatGPT) and the retriever
  • Integration with Azure Cognitive Search and vector stores (Redis, Pinecone)
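The retrieve-then-read orchestration described above can be sketched in a few lines. This is a minimal illustration, not the repo's implementation: it uses in-memory cosine similarity in place of a real vector store (Pinecone, Redis, or Cognitive Search), and the chunk data and embeddings are hypothetical placeholders.

```python
import math

# Hypothetical document chunks with made-up embeddings. A real deployment would
# embed chunks with Azure OpenAI and store them in Pinecone, Redis, or
# Cognitive Search.
CHUNKS = [
    {"id": "doc1-p1", "text": "Refunds are processed within 14 days.", "vec": [0.9, 0.1, 0.0]},
    {"id": "doc2-p3", "text": "Support is available on weekdays.", "vec": [0.1, 0.8, 0.1]},
]

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=1):
    # Rank chunks by similarity to the query embedding and keep the top k.
    ranked = sorted(CHUNKS, key=lambda c: cosine(query_vec, c["vec"]), reverse=True)
    return ranked[:k]

def build_prompt(question, chunks):
    # Ground the model's answer in the retrieved sources and request citations,
    # which is what enables the citation/source-tracking UI described above.
    sources = "\n".join(f"[{c['id']}] {c['text']}" for c in chunks)
    return f"Answer using only the sources below and cite them.\n{sources}\nQuestion: {question}"

# A query embedding close to the refunds chunk retrieves that chunk.
top = retrieve([0.85, 0.15, 0.0])
prompt = build_prompt("How long do refunds take?", top)
```

The prompt built this way is then sent to the chat model; swapping the in-memory ranking for a vector-store query leaves the rest of the flow unchanged.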

Architecture

(Architecture diagram)

Getting Started

NOTE: To deploy and run this example, you'll need an Azure subscription with access enabled for the Azure OpenAI service. You can request access here.

Prerequisites

To Run Locally

Installation

  1. Deploy the required Azure services, either with the automated deployment or manually with the minimum required resources:
    1. az deployment sub create --location eastus --template-file main.bicep --parameters prefix=astoai resourceGroupName=astoai location=eastus
    2. Azure OpenAI service. Be aware of the model and region availability documented [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/concepts/models#model-summary-table-and-region-availability)
    3. A Storage Account and a container
    4. One of the document stores:
      1. Pinecone (Starter)
      2. Azure Cognitive Search
      3. Redis
  2. Clone the repo
  3. Open the cloned repo folder in VS Code
  4. Open a new terminal and go to the /app/frontend directory
  5. Run npm install to install all the packages
  6. Go to the /api/Python directory
  7. Run pip install -r requirements.txt to install all the required Python packages
  8. Copy sample.settings.json to local.settings.json
  9. Update the configuration (at a minimum, you need OpenAI, one of the document stores, and the storage account)
  10. Start the Python API by running func host start
  11. Open a new terminal and go to the /app/backend directory
  12. Copy env.example to .env and edit the file to enter the Python localhost API and the storage configuration
  13. Run py (or python) app.py to start the server
  14. Open a new terminal and go to the /app/frontend directory
  15. Run npm run dev to start the local server
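The steps above, condensed into one shell session. This is a sketch under assumptions: the repo URL and directory layout (/app/frontend, /app/backend, /api/Python) are taken from this page, and the backgrounding with & is a convenience rather than how the repo tells you to run the services.

```shell
# 1. Deploy the Azure resources (automated path)
az deployment sub create --location eastus --template-file main.bicep \
  --parameters prefix=astoai resourceGroupName=astoai location=eastus

# 2-5. Clone the repo and install frontend packages
git clone https://github.com/123saga/chatpdf.git && cd chatpdf
cd app/frontend && npm install && cd ../..

# 6-10. Python API: install packages, configure, and start
cd api/Python
pip install -r requirements.txt
cp sample.settings.json local.settings.json   # then edit: OpenAI, document store, storage
func host start &
cd ../..

# 11-13. Backend: configure and start
cd app/backend
cp env.example .env                           # then edit: Python API URL, storage config
python app.py &
cd ../..

# 14-15. Frontend dev server
cd app/frontend && npm run dev
```

This is a command sequence against live Azure and local tooling (az, func, npm), so run it step by step rather than as a script the first time.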

Once in the web app:

  • Try different topics in chat or Q&A context. For chat, try follow-up questions, clarifications, asking to simplify or elaborate on the answer, etc.
  • Explore citations and sources
  • Click on "settings" to try different options, tweak prompts, etc.

Resources

Note

Adapted from the Azure OpenAI Search repo at OpenAI-CogSearch
