
feat: add support for Claude & Llama 3.x 70b with Groq #25

Open
borisyankov opened this issue Nov 24, 2024 · 4 comments
Comments

@borisyankov
Contributor

Add support for these two models and experiment with how well they perform:

  • Claude 3.5 Sonnet - often cited as much better than GPT-4 (and even GPT-4o) at coding
  • Llama 3.x 70b + Groq - not as good as Claude or GPT-4o, but inference is much faster, and for many tasks it might be good enough
@grabbou
Collaborator

grabbou commented Nov 24, 2024

Yup, ideally, we should allow the model to be configurable, together with (if required) a model-specific prompt. The goal (and, I think, the required next step) is to allow this, as well as support for some self-hosted models.

I would also love to self-host something for everyone, so no API key is required. But it won't bring much to the table, unless I go bankrupt paying for it!
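A minimal sketch of what such model configurability could look like: a registry mapping user-facing model keys to a provider, a model id, and an optional model-specific prompt. Everything here is an illustrative assumption (the keys, ids, and the `resolveModel` helper are not part of Cali), shown only to make the shape of the idea concrete:

```typescript
// Sketch of a configurable model registry — names and ids are
// illustrative assumptions, not an existing Cali API.
type Provider = 'anthropic' | 'groq' | 'openai';

interface ModelConfig {
  provider: Provider;
  modelId: string;
  // Optional model-specific system prompt, as suggested above.
  systemPrompt?: string;
}

const MODELS: Record<string, ModelConfig> = {
  'claude-3.5-sonnet': {
    provider: 'anthropic',
    modelId: 'claude-3-5-sonnet-20241022',
  },
  'llama-3.x-70b': {
    provider: 'groq',
    modelId: 'llama-3.3-70b-versatile',
  },
  'gpt-4o': {
    provider: 'openai',
    modelId: 'gpt-4o',
  },
};

// Resolve a user-supplied key to its provider configuration,
// failing loudly on unknown models.
function resolveModel(key: string): ModelConfig {
  const cfg = MODELS[key];
  if (!cfg) throw new Error(`Unknown model: ${key}`);
  return cfg;
}
```

With a registry like this, the provider-specific client (e.g. via the AI SDK) would be constructed from `ModelConfig` in one place, keeping the rest of the agent code model-agnostic.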

@adeleke5140

adeleke5140 commented Nov 26, 2024

Hi @grabbou, my team at mastra has been building an AI framework for a while, and we have support for these models out of the box.

I see another issue regarding a logger, and we have that built in too.

We are also built on top of the AI SDK, so I believe it should be plug and play. We can onboard you if you're interested.

@grabbou grabbou changed the title Add support for Claude & Llama 3.x 70b with Groq feat: add support for Claude & Llama 3.x 70b with Groq Nov 26, 2024
@grabbou
Collaborator

grabbou commented Nov 26, 2024

Thanks for reaching out @adeleke5140, but we're not looking to adopt any higher-level framework at this point. For now, we want to stay close to the metal and focus on experimenting with more tools instead.

I am not against this in general, especially as we grow, so let me keep it in the back of my mind. Thanks!

@adeleke5140

@grabbou You are welcome. I understand and I'm excited to see where Cali goes.
