v3.12.0:
- Added a "prompt" attribute to LLM response strings, referencing the prompt used to generate them (can be disabled via the SAVE_MEMORY configuration option; see the sketch after this list).
- Enhanced dicts returned by LLMResponse.parse_json() to include a new "llm_response" attribute, referencing the original LLM response string (dict content remains unaffected).
- Updated Role.<NAME> and ApiType.<NAME> values to be members of their respective Enum types rather than plain strings.
- Improved type definitions and type hints across the codebase.
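A minimal sketch of the changes above in use, assuming the package's `import microcore as mc` convention and its `llm()` entry point; the specific enum member names shown are assumptions for illustration, not verified against the release:

```python
# Sketch of the v3.12.0 changes described above; llm() usage and enum member
# names are assumptions for illustration.
import microcore as mc
from microcore import Role, ApiType

response = mc.llm("Return a JSON object with a 'status' field.")

# The response string now references the prompt that produced it
# (can be disabled via the SAVE_MEMORY configuration option).
print(response.prompt)

# Dicts returned by parse_json() now carry an "llm_response" attribute
# referencing the original LLM response string; dict content is unchanged.
data = response.parse_json()
print(data.llm_response)
print(data["status"])

# Role.<NAME> and ApiType.<NAME> are now Enum members rather than plain strings.
assert isinstance(Role.USER, Role)           # member name is an assumption
assert isinstance(ApiType.OPEN_AI, ApiType)  # member name is an assumption
```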
Merge pull request #26 from Nayjest/tokenize_remote_models
Use tiktoken to estimate the number of tokens in prompts and responses, and to fit semantic search results to a target token count (see the sketch below).
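A rough sketch of the technique named in this PR, using tiktoken directly; `count_tokens` and `fit_to_token_limit` are hypothetical helpers written for illustration, not the library's actual API:

```python
# Illustrative only: counts tokens with tiktoken and trims a list of search
# results to a token budget. Helper names here are hypothetical.
import tiktoken

def count_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    """Estimate the number of tokens in a string using a tiktoken encoding."""
    encoding = tiktoken.get_encoding(encoding_name)
    return len(encoding.encode(text))

def fit_to_token_limit(results: list[str], max_tokens: int) -> list[str]:
    """Keep semantic search results, in order, until the token budget runs out."""
    fitted, used = [], 0
    for text in results:
        n = count_tokens(text)
        if used + n > max_tokens:
            break
        fitted.append(text)
        used += n
    return fitted
```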