# The Llama Cloud API key.
# LLAMA_CLOUD_API_KEY=
# The provider for the AI models to use.
MODEL_PROVIDER=openai
# The name of the LLM model to use.
MODEL=gpt-4o-mini
# Name of the embedding model to use.
EMBEDDING_MODEL=text-embedding-3-large
# Dimension of the embedding model to use.
EMBEDDING_DIM=1024
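# Note (an assumption to verify against your setup): text-embedding-3-large natively
# outputs 3072 dimensions, so a smaller value such as 1024 relies on OpenAI's
# dimension-reduction support and should match the dimension configured in your
# vector store or index.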
# The questions to help users get started (multi-line).
# CONVERSATION_STARTERS=
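# Hypothetical example (illustrative only; assumes newline-separated questions inside
# quotes are accepted for multi-line values, as with the prompts further below):
# CONVERSATION_STARTERS="What is this document about?
# What are the key takeaways?
# Can you summarize the main points?"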
# The OpenAI API key to use.
# OPENAI_API_KEY=
# Temperature for sampling from the model.
# LLM_TEMPERATURE=
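# Illustrative example (not a recommendation; uncomment to override the provider default):
# LLM_TEMPERATURE=0.7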
# Maximum number of tokens to generate.
# LLM_MAX_TOKENS=
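# Illustrative example (choose a limit that fits your model's context window):
# LLM_MAX_TOKENS=1024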
# The number of similar embeddings to return when retrieving documents.
# TOP_K=
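# Illustrative example (a small value such as 3-5 is a common starting point):
# TOP_K=3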
# The time in milliseconds to wait for the stream to return a response.
STREAM_TIMEOUT=60000
# For generating a connection URI, see https://docs.timescale.com/use-timescale/latest/services/create-a-service
# The PostgreSQL connection string.
# PG_CONNECTION_STRING=
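# Example format with hypothetical placeholders (not real credentials):
# PG_CONNECTION_STRING=postgresql://user:password@host:5432/dbname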
# FILESERVER_URL_PREFIX is the URL prefix of the server storing the images generated by the interpreter.
FILESERVER_URL_PREFIX=http://localhost:3000/api/files
# The prompt used to generate next-question suggestions from the conversation history.
# Comment out this prompt to disable the next-question suggestions feature.
NEXT_QUESTION_PROMPT="You're a helpful assistant! Your task is to suggest the next question that the user might ask.
Here is the conversation history:
---------------------
{conversation}
---------------------
Given the conversation history, please give me 3 questions that the user might ask next!
Your answer should be wrapped in triple backticks and follow this format:
```
<question 1>
<question 2>
<question 3>
```"
# The system prompt for the AI model.
SYSTEM_PROMPT=You are a helpful assistant who helps users with their questions.
# An additional system prompt that adds citations when responding to user questions.
SYSTEM_CITATION_PROMPT='You have been provided with information from a knowledge base, passed to you in nodes of information.
Each node has useful metadata such as node ID, file name, page, etc.
Please add the citation to the data node for each sentence or paragraph that you reference in the provided information.
The citation format is: . [citation:<node_id>]()
Where the <node_id> is the unique identifier of the data node.
Example:
We have two nodes:
node_id: xyz
file_name: llama.pdf
node_id: abc
file_name: animal.pdf
User question: Tell me a fun fact about Llama.
Your answer:
A baby llama is called "Cria" [citation:xyz]().
It often lives in the desert [citation:abc]().
It\'s a cute animal.
'