
Giving --model-metadata-file <file.json> does not seem to load that file #3169

Open
paul-gauthier opened this issue Feb 7, 2025 · 2 comments

@paul-gauthier
Collaborator

I have a model metadata file in ~/.cache/aider-model-metadata.json, which looks something like this:

{
    "foobar-model": {
        "input_cost_per_token": 0,
        "litellm_provider": "openai",
        "max_input_tokens": 90000,
        "max_output_tokens": 4096,
        "max_tokens": 90000,
        "mode": "chat",
        "output_cost_per_token": 0
    }
}

There's more than one model in there, though. I run aider with this command:

aider \
  --model foobar-model \
  --model-metadata-file ~/.cache/aider-model-metadata.json \
  --config ~/.config/aider/aider.conf.yml

v0.73.0:

When running on 0.73.0, everything works fine. Using /models I can see the model:

/models foobar

Models which match "foobar":
- foobar-model
- openai/foobar-model

v0.74.0:

When running on 0.74.0, I see that the model isn't there when doing /models:

/models foobar

No models match "foobar".

I get this error when trying to use the model:

litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call.
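
If it helps, the same error reproduces outside aider whenever litellm is given a bare model name it can't map to a provider. A minimal sketch (the api_base here is a hypothetical placeholder, not my real endpoint):

import litellm

# Without a provider prefix ("openai/...") or registered metadata, litellm
# cannot tell which API adapter to use for "foobar-model" and raises
# BadRequestError: "LLM Provider NOT provided..."
try:
    litellm.completion(
        model="foobar-model",
        api_base="http://localhost:8000/v1",  # hypothetical endpoint
        messages=[{"role": "user", "content": "hello"}],
    )
except litellm.BadRequestError as e:
    print(e)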

Both versions report the same settings, relevant lines from aider -v:

  - model: foobar-model
  - model_metadata_file: /Users/oskar/.cache/aider-model-metadata.json

Let me know what other information I can provide.
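
One more data point: the file itself parses and registers fine when I feed it to litellm directly. A rough sketch (I'm assuming aider hands this data to litellm.register_model under the hood, which may not be exactly what it does):

import json
import litellm

# Load the metadata file and register it with litellm; afterwards
# get_model_info() reports the limits and costs from the file.
with open("/Users/oskar/.cache/aider-model-metadata.json") as f:
    litellm.register_model(json.load(f))

print(litellm.get_model_info("foobar-model"))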

Originally posted by @oskarkook in #2928

@paul-gauthier paul-gauthier added bug Something isn't working priority labels Feb 7, 2025
@paul-gauthier paul-gauthier added question Further information is requested and removed bug Something isn't working priority labels Feb 19, 2025
@paul-gauthier
Collaborator Author

Thanks for trying aider and filing this issue.

You need to include a litellm_provider field in the model file. This tells litellm how to communicate with the model's API. So:

{
    "foobar-model": {
        "input_cost_per_token": 0,
        "litellm_provider": "openai",
        "max_input_tokens": 90000,
        "max_output_tokens": 4096,
        "max_tokens": 90000,
        "mode": "chat",
        "output_cost_per_token": 0,
        "litellm_provider": "openai"
    }
}

@oskarkook

But now you just have litellm_provider twice in there?
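
(For what it's worth, a duplicated key is typically accepted by JSON parsers; Python's json module just keeps the last occurrence, so the doubled field is redundant rather than harmful. A quick illustration:)

import json

# json.loads accepts duplicate keys; the last occurrence wins, so the
# doubled "litellm_provider" collapses to a single entry.
doubled = '{"litellm_provider": "openai", "mode": "chat", "litellm_provider": "openai"}'
print(json.loads(doubled))
# {'litellm_provider': 'openai', 'mode': 'chat'}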
