Tags: melroy89/ollama
fix docker build-args: the env context is not accessible from jobs.*.strategy, but since the value is already in the environment, just tell docker to use the environment variable[1] [1]: https://docs.docker.com/reference/cli/docker/buildx/build/#build-arg
make the modelfile path relative for `ollama create` (ollama#8380)
openai: accept additional headers to fix CORS errors (ollama#8343)
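The CORS fix above comes down to the server advertising, during preflight, any extra request headers that browser-based OpenAI clients send. A minimal net/http sketch of that idea in Go, with an illustrative header list and a hypothetical allowExtraHeaders helper (not ollama's actual wiring):

```go
package main

import (
	"log"
	"net/http"
	"strings"
)

// allowExtraHeaders advertises additional request headers during CORS
// preflight so browser clients that send them are not blocked.
// The defaults and extras here are illustrative, not ollama's configuration.
func allowExtraHeaders(next http.Handler, extra ...string) http.Handler {
	allowed := append([]string{"Content-Type", "Authorization"}, extra...)
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Access-Control-Allow-Origin", "*")
		w.Header().Set("Access-Control-Allow-Headers", strings.Join(allowed, ", "))
		if r.Method == http.MethodOptions {
			// Preflight request: answer with the headers only, skip the real handler.
			w.WriteHeader(http.StatusNoContent)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/v1/chat/completions", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte(`{"ok":true}`))
	})
	// Extra header names are examples of client-supplied headers to allow.
	log.Fatal(http.ListenAndServe("127.0.0.1:11435",
		allowExtraHeaders(mux, "x-stainless-os", "x-stainless-lang")))
}
```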
llm: do not error on "null" format (ollama#8139) This fixes another regression introduced by the previous commit, which itself fixed other known bugs.
llm: do not silently fail for supplied, but invalid formats (ollama#8130) Changes in ollama#8002 introduced fixes for bugs with mangling JSON Schemas. They also fixed a bug where the server would silently fail when clients requested invalid formats, but unfortunately introduced a regression where the server would reject requests with an empty format, which should be allowed. The change in ollama#8127 updated the code to allow the empty format, but reintroduced the regression where the server would silently fail when the format was set but invalid. This commit fixes both regressions: the server no longer rejects the empty format, but it does reject invalid formats. It also adds tests to help catch regressions in the future, and it provides a more detailed error message when a client sends a non-empty but invalid format, echoing the invalid format in the response. This commit also takes the opportunity to remove superfluous linter checks.
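Taken together, the two entries above describe the intended contract for the request's format field: an empty or JSON null value means no constraint, "json" and JSON Schema objects are honored, and anything else is rejected with an error that echoes the bad value. A rough Go sketch of that validation logic, built around a hypothetical checkFormat helper rather than ollama's actual code:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// checkFormat mirrors the contract described above: the format field may be
// omitted, empty, or JSON null (no constraint), the string "json", or a JSON
// Schema object. Anything else is an error that echoes the invalid value.
// This is an illustrative sketch, not ollama's implementation.
func checkFormat(format json.RawMessage) error {
	if len(format) == 0 || string(format) == "null" || string(format) == `""` {
		return nil // no format constraint requested
	}
	var s string
	if err := json.Unmarshal(format, &s); err == nil {
		if s == "json" {
			return nil
		}
		return fmt.Errorf("invalid format: %q; expected \"json\" or a JSON schema", s)
	}
	var schema map[string]any
	if err := json.Unmarshal(format, &schema); err == nil {
		return nil // looks like a JSON Schema object
	}
	return fmt.Errorf("invalid format: %s; expected \"json\" or a JSON schema", format)
}

func main() {
	for _, f := range []string{``, `null`, `"json"`, `{"type":"object"}`, `"xml"`} {
		fmt.Printf("%-20q -> %v\n", f, checkFormat(json.RawMessage(f)))
	}
}
```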
ci: be more aggressive on parallelism in build (ollama#8102)
runner: switch logging back to stderr (ollama#8091) This puts the low-level runner logging back on stderr for consistency with prior releases.
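For context, the distinction matters because anything the runner prints to stdout can get mixed into output that callers consume, while stderr is reserved for diagnostics. A minimal sketch of pointing Go's structured logger at stderr (illustrative only; the runner's real logging setup may differ):

```go
package main

import (
	"log/slog"
	"os"
)

func main() {
	// Send low-level runner logs to stderr so stdout stays clean
	// and behavior matches prior releases.
	logger := slog.New(slog.NewTextHandler(os.Stderr, &slog.HandlerOptions{
		Level: slog.LevelDebug,
	}))
	slog.SetDefault(logger)

	slog.Info("runner started", "port", 0)
}
```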