
Commit

Update main.py
sudarshan-koirala authored Aug 6, 2023
1 parent 7d0f948 commit b153c79
Showing 1 changed file with 6 additions and 6 deletions: main.py
@@ -11,7 +11,7 @@
 If you don't know the answer, just say that you don't know, don't try to make up an answer.
 ALWAYS return a "SOURCES" part in your answer.
 The "SOURCES" part should be a reference to the source of the document from which you got your answer.
-xample of your response should be:
+The example of your response should be:
 Context: {context}
 Question: {question}
@@ -60,7 +60,7 @@ def load_model(
     model_path="model/llama-2-7b-chat.ggmlv3.q8_0.bin",
     model_type="llama",
     max_new_tokens=512,
-    temperature=0.5,
+    temperature=0.7,
 ):
     """
     Load a locally downloaded model.
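The commit raises the model's sampling temperature from 0.5 to 0.7. As an illustration of what that knob does (independent of the ctransformers backend used here), below is temperature-scaled softmax over some made-up logits; dividing logits by a larger temperature flattens the resulting distribution, making sampling less greedy:

```python
import math

def softmax_with_temperature(logits, temperature):
    # Scale logits by 1/temperature before exponentiating:
    # small T sharpens toward argmax, large T flattens the distribution.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # made-up token scores, not from the real model
p_low = softmax_with_temperature(logits, 0.5)   # old setting
p_high = softmax_with_temperature(logits, 0.7)  # new setting
```

With the higher temperature, the top token gets less probability mass, so answers vary more between runs.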
@@ -202,11 +202,11 @@ async def process_chat_message(message):
     callback_handler.answer_reached = True
     response = await qa_chain.acall(message, callbacks=[callback_handler])
     bot_answer = response["result"]
-    source_documents = bot_answer["source_documents"]
+    source_documents = response["source_documents"]

     if source_documents:
-        answer += f"\nSources:" + str(source_documents)  # type: ignore
+        bot_answer += f"\nSources:" + str(source_documents)
     else:
-        answer += "\nNo sources found"  # type: ignore
+        bot_answer += "\nNo sources found"

-    await cl.Message(content=answer).send()
+    await cl.Message(content=bot_answer).send()
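The hunk above fixes two bugs: source_documents is now read from the chain's response dict rather than from the result string, and the sources are appended to bot_answer instead of an undefined answer variable. A self-contained sketch of the corrected flow, with a fake response dict standing in for the QA chain's output:

```python
def format_answer(response: dict) -> str:
    # Start from the chain's result text...
    bot_answer = response["result"]
    # ...and read sources from the response dict (not from the result string,
    # which was the bug the commit fixes).
    source_documents = response["source_documents"]
    if source_documents:
        bot_answer += "\nSources:" + str(source_documents)
    else:
        bot_answer += "\nNo sources found"
    return bot_answer

# Fake chain outputs for illustration only
with_src = format_answer({"result": "Llama 2 is open.", "source_documents": ["doc1.pdf"]})
without_src = format_answer({"result": "Unknown.", "source_documents": []})
```

The "result" and "source_documents" keys match the ones the diff indexes into; everything else here is a stand-in for the real chainlit/LangChain objects.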
