
[Feature]: Add support for Groq (LLM) #516

Open
spikecodes opened this issue Mar 6, 2024 · 3 comments

Comments

@spikecodes

Brief Description

Add support for using Groq as an LLM

Rationale

Groq offers an OpenAI-compatible API that runs Mixtral generation on its specialized hardware, LPUs (Language Processing Units), producing high-quality output at roughly 4x the speed of GPT-3.5.

Suggested Implementation

Create a new LLM integration that targets Groq's OpenAI-compatible endpoint; a minimal sketch follows.
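
Because the API is OpenAI-compatible, one possible sketch reuses the standard `openai` Python client and only swaps the base URL and model name. The base URL `https://api.groq.com/openai/v1`, the model id `mixtral-8x7b-32768`, and the `GROQ_API_KEY` environment variable are assumptions drawn from Groq's public docs at the time, not details confirmed in this issue.

```python
# Minimal sketch: calling Groq through its OpenAI-compatible endpoint.
# Assumes the `openai` Python package (v1+) is installed and GROQ_API_KEY is set;
# the base URL and model id below are assumptions -- check Groq's docs for current values.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GROQ_API_KEY"],
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="mixtral-8x7b-32768",  # Mixtral hosted on Groq LPUs
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)

print(response.choices[0].message.content)
```

A dedicated provider class in the project would likely wrap this same call and expose the Groq API key and model name as configuration options.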

@XIII-IIIX

This would be epic!

@djjrock

djjrock commented Mar 12, 2024

+1

@Arunprakaash

I have added support for Groq in #526.
