Tags: Nayjest/ai-microcore

v3.12.2

Updated package version to v3.12.2.

v3.12.1


- Added the ability to force-disable streaming when callbacks are provided (OpenAI, Anthropic APIs).
- Fixed tests failing when a .env file is present.
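The streaming decision described above can be sketched as follows. Note that the names (`should_stream`, `force_disable_streaming`) are illustrative, not ai-microcore's actual configuration keys:

```python
def should_stream(callbacks, force_disable_streaming=False):
    """Decide whether to request a streaming response from the LLM API.

    Streaming is normally enabled when callbacks are provided (so chunks
    can be delivered as they arrive), but it can now be force-disabled.
    """
    if force_disable_streaming:
        return False
    return bool(callbacks)


# With callbacks, streaming defaults to on...
assert should_stream([print]) is True
# ...but can be forced off even when callbacks are present.
assert should_stream([print], force_disable_streaming=True) is False
```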

v3.12.0


- Added a "prompt" attribute to LLM response strings, referencing the prompt used to generate them (can be disabled with the SAVE_MEMORY configuration option).
- Enhanced dicts returned by LLMResponse.parse_json() to include a new "llm_response" attribute, referencing the original LLM response string (dict content remains unaffected).
- Updated Role.<NAME> and ApiType.<NAME> values to be Enums of their respective types, rather than plain strings.
- Improved type definitions and type hints across the codebase.
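The first two changes can be illustrated with a self-contained sketch (class and attribute names follow the changelog; the actual ai-microcore implementation may differ): a `str` subclass carries the originating prompt, and `parse_json()` returns a dict whose items are untouched but which gains an `llm_response` attribute.

```python
import json


class AttrDict(dict):
    """A plain dict that also accepts arbitrary attributes,
    so metadata can be attached without changing the dict's items."""


class LLMResponse(str):
    """String subclass that remembers the prompt that produced it."""

    def __new__(cls, content, prompt=None):
        obj = super().__new__(cls, content)
        obj.prompt = prompt
        return obj

    def parse_json(self):
        data = json.loads(self)
        if isinstance(data, dict):
            data = AttrDict(data)
            # Back-reference to the original response; dict content unaffected.
            data.llm_response = self
        return data


resp = LLMResponse('{"answer": 42}', prompt="Give me the answer as JSON.")
parsed = resp.parse_json()
assert parsed == {"answer": 42}                    # items are untouched
assert parsed.llm_response.prompt == resp.prompt   # metadata rides along
```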

v3.11.1

Bugfix for Python 3.10.

v3.10.3

Pylint fix.

v3.10.1


Default logging refactoring and improvements:
- Possibility to configure the logging output method and the request/response formatters separately.
- Fixed a color-reset bug.
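The separation described above can be sketched as a logger whose output method and request/response formatters are independent, swappable callables (the names here are illustrative, not ai-microcore's actual API):

```python
class LLMLogger:
    """Minimal sketch: output method and request/response formatters
    are configured independently of one another."""

    def __init__(
        self,
        output=print,
        format_request=lambda prompt: f">>> {prompt}",
        format_response=lambda text: f"<<< {text}",
    ):
        self.output = output
        self.format_request = format_request
        self.format_response = format_response

    def log_request(self, prompt):
        self.output(self.format_request(prompt))

    def log_response(self, text):
        self.output(self.format_response(text))


# Swap only the output method (collect lines instead of printing),
# keeping the default formatters.
lines = []
logger = LLMLogger(output=lines.append)
logger.log_request("2+2?")
logger.log_response("4")
assert lines == [">>> 2+2?", "<<< 4"]
```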

v3.10.0

Merge pull request #26 from Nayjest/tokenize_remote_models

Tiktoken is now used to estimate the number of tokens in prompts and responses, and to fit semantic search results to a target token count.
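The "fit results to a token budget" step can be sketched as a greedy cutoff. The library itself uses tiktoken for counting; to keep this sketch dependency-free, a rough characters-per-token heuristic stands in, and both function names are hypothetical:

```python
def estimate_tokens(text):
    """Crude stand-in for tiktoken: roughly 4 characters per token is a
    common rule of thumb for English text."""
    return max(1, len(text) // 4)


def fit_to_token_limit(results, max_tokens, count=estimate_tokens):
    """Greedily keep search results (assumed ordered most-relevant-first)
    until the combined token estimate would exceed the budget."""
    kept, used = [], 0
    for text in results:
        cost = count(text)
        if used + cost > max_tokens:
            break
        kept.append(text)
        used += cost
    return kept


docs = ["a" * 40, "b" * 40, "c" * 40]  # ~10 tokens each under the heuristic
assert fit_to_token_limit(docs, max_tokens=25) == ["a" * 40, "b" * 40]
```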

v3.9.1

README update.