need help understanding "Functions" with Chat #234
-
Make sure you revoke that key; it was included in the email notifications.
-
It looks like the model is returning a response saying that it wants to use a function. It's then up to your code to call that function and return the result to GPT.
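For illustration, here is a minimal sketch of that first step with openai-php/client. The model name, the get_current_weather function and its schema are placeholder assumptions, and the $message->functionCall property assumes a client version with function-calling support:

```php
<?php

require 'vendor/autoload.php';

$client = OpenAI::client(getenv('OPENAI_API_KEY'));

$response = $client->chat()->create([
    'model' => 'gpt-3.5-turbo-0613',
    'messages' => [
        ['role' => 'user', 'content' => 'What is the weather like in Boston?'],
    ],
    'functions' => [[
        'name' => 'get_current_weather',
        'description' => 'Get the current weather in a given location',
        'parameters' => [
            'type' => 'object',
            'properties' => [
                'location' => ['type' => 'string', 'description' => 'City name, e.g. Boston, MA'],
            ],
            'required' => ['location'],
        ],
    ]],
]);

$message = $response->choices[0]->message;

// When the model decides a function should be used, the reply carries no text;
// it only names the function and passes JSON-encoded arguments. Nothing is
// executed for you -- your own code has to run the function.
if ($message->functionCall !== null) {
    $name = $message->functionCall->name;                         // "get_current_weather"
    $args = json_decode($message->functionCall->arguments, true); // e.g. ['location' => 'Boston, MA']
}
```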
-
I can get this to work in Python just fine: you ask about the weather, the LLM knows it needs the function, my code pipes the input into the function call, gets the result, hands it back to the LLM, and the LLM then replies to you. The examples provided so far only seem to do the first step. The LLM knows it should call the function but doesn't call it itself, so it's up to you to call it based on the initial response? But then what? How do you pass the result of the function back to the LLM so that it can formulate a reply with the answer? How do you close the loop using openai-php/client? A complete example would be very nice.
-
After you call your function, you append the result to the messages array:
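Here is a minimal end-to-end sketch of closing that loop with openai-php/client. The get_current_weather() implementation, the function schema, and the model name are placeholder assumptions, and $message->functionCall assumes a client version with function-calling support:

```php
<?php

require 'vendor/autoload.php';

$client = OpenAI::client(getenv('OPENAI_API_KEY'));

// A hypothetical local implementation of the function the model may ask for.
function get_current_weather(string $location): array
{
    return ['location' => $location, 'temperature' => '22', 'unit' => 'celsius'];
}

$functions = [[
    'name' => 'get_current_weather',
    'description' => 'Get the current weather in a given location',
    'parameters' => [
        'type' => 'object',
        'properties' => [
            'location' => ['type' => 'string', 'description' => 'City name, e.g. Boston, MA'],
        ],
        'required' => ['location'],
    ],
]];

$messages = [
    ['role' => 'user', 'content' => 'What is the weather like in Boston?'],
];

// First call: the model answers with a function_call instead of text.
$response = $client->chat()->create([
    'model' => 'gpt-3.5-turbo-0613',
    'messages' => $messages,
    'functions' => $functions,
]);

$message = $response->choices[0]->message;

if ($message->functionCall !== null) {
    // Append the assistant's function-call turn to the history.
    $messages[] = [
        'role' => 'assistant',
        'content' => null,
        'function_call' => [
            'name' => $message->functionCall->name,
            'arguments' => $message->functionCall->arguments,
        ],
    ];

    // Run the function yourself and append its result as a 'function' message.
    $args = json_decode($message->functionCall->arguments, true);
    $result = get_current_weather($args['location'] ?? '');

    $messages[] = [
        'role' => 'function',
        'name' => $message->functionCall->name,
        'content' => json_encode($result),
    ];

    // Second call: with the result in the history, the model replies in plain text.
    $followUp = $client->chat()->create([
        'model' => 'gpt-3.5-turbo-0613',
        'messages' => $messages,
        'functions' => $functions,
    ]);

    echo $followUp->choices[0]->message->content;
}
```

The key points are that both the assistant's function_call turn and a 'function' message holding the JSON-encoded result get appended before the second create() call, which is what lets the model turn the data into a normal reply.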
-
Hello, I'm currently trying to experiment with the "Functions" system using openai-php; however, I'm struggling to figure out how it all connects together. As far as I can tell, the function isn't being called. Could someone advise me on how this works? I would be very appreciative.
Here is my code:
I borrowed a bit from the documentation, as I'm simply trying to figure out how it works.