Is your feature request related to a problem? Please describe.
Hello!
The issue concerns the use of Together AI models, such as CodeLlama-34b and Llama-3-70b-chat-hf.
Although CodeLlama-34b exists in this LiteLLM documentation, I got the following error:
NotFoundError: Together_aiException - Error code: 404 - {'error': {'message': 'Unable to access model togethercomputer/CodeLlama-34b. Please visit https://api.together.xyz to see the list of supported models or contact the owner to request access.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Here is the issue related to Llama-3-70b-chat-hf:
NotFoundError: Together_aiException - Error code: 404 - {'error': {'message': 'Unable to access model togethercomputer/Llama-3-70b-chat-hf. Please visit https://api.together.xyz to see the list of supported models or contact the owner to request access.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
My question: is Llama-3-70b-chat-hf supported in Open Interpreter?
If yes, can you please provide its function call?
Describe the solution you'd like
Integrate the latest Llama models available on Together AI into Open Interpreter.
Describe alternatives you've considered
No response
Additional context
No response
I think this can be handled by defining a custom API endpoint URL along with an API key.
The Together API endpoint is OpenAI-compatible, so it should not cause any problems.
The same goes for the Groq API.
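As a sketch of that custom-endpoint approach (the base URL is Together's documented OpenAI-compatible endpoint, but the model ID and the `TOGETHER_API_KEY` variable name are assumptions, so check the current model list at https://api.together.xyz before relying on them):

```python
# Hypothetical sketch: calling Together AI through its OpenAI-compatible
# endpoint with the standard openai client. The model ID is an assumption;
# verify it against Together's current catalog.
import os

TOGETHER_BASE_URL = "https://api.together.xyz/v1"
MODEL = "meta-llama/Llama-3-70b-chat-hf"  # assumed model ID

def together_client():
    # The openai package works against any OpenAI-compatible server
    # once base_url is overridden.
    from openai import OpenAI
    return OpenAI(
        base_url=TOGETHER_BASE_URL,
        api_key=os.environ["TOGETHER_API_KEY"],
    )

# Only attempt a real request if an API key is configured.
if os.environ.get("TOGETHER_API_KEY"):
    client = together_client()
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(resp.choices[0].message.content)
```

Open Interpreter should then accept the same endpoint via its `--api_base` / `--api_key` / `--model` settings, though I have not verified the exact flag names against the current release.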
Can you please provide a Python code snippet so I can try what you have proposed?
The problem is that even following the current documentation, I get the following error for CodeLlama-34b, which exists in this LiteLLM documentation:
NotFoundError: Together_aiException - Error code: 404 - {'error': {'message': 'Unable to access model togethercomputer/CodeLlama-34b. Please visit https://api.together.xyz to see the list of supported models or contact the owner to request access.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
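One possible reading of that 404 is that the `togethercomputer/` model IDs have been retired on Together's side. A minimal sketch of calling LiteLLM with its `together_ai/` provider prefix and a newer model path (the `meta-llama/` ID is an assumption, not verified against Together's current catalog):

```python
# Hypothetical sketch using LiteLLM's together_ai provider prefix.
# The meta-llama/ model path is an assumption: the 404 above suggests the
# old togethercomputer/ IDs are no longer accessible.
import os

MODEL = "together_ai/meta-llama/Llama-3-70b-chat-hf"  # assumed model ID

# Only attempt a real request if an API key is configured.
if os.environ.get("TOGETHERAI_API_KEY"):
    from litellm import completion
    response = completion(
        model=MODEL,
        messages=[{"role": "user", "content": "Write hello world in Python."}],
    )
    print(response.choices[0].message.content)
```

If this model path also returns a 404, the authoritative list is the one at https://api.together.xyz mentioned in the error message.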