Is your feature request related to a problem? Please describe.
When we try to use the Gemini LLM, the only option is to authenticate with an API key. What if we have a service-account JSON file through which we want to connect to our LLM project?
Describe the solution you'd like
In gemini_chat.py, inside the else branch, we could support passing the path to the JSON service-account file used to connect to the Gemini LLM project.
The connection code could look something like this:
```python
import os

if "api_key" in config or os.getenv("GOOGLE_API_KEY"):
    # Authenticate with an API key (from config or the environment)
    import google.generativeai as genai

    genai.configure(api_key=config.get("api_key", os.getenv("GOOGLE_API_KEY")))
    self.chat_model = genai.GenerativeModel(model_name)
else:
    # Authenticate using Vertex AI with a service-account JSON file
    import google.auth
    import vertexai
    from vertexai.generative_models import GenerativeModel

    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = PATH  # path to the JSON file
    credentials, project_id = google.auth.default()
    vertexai.init(project=project_id, credentials=credentials)
    self.chat_model = GenerativeModel("gemini-pro")
```
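To make the branching above explicit, here is a minimal sketch of the dispatch logic that decides which authentication path to take. The config key `google_credentials_path` and the helper name are hypothetical, not part of any existing gemini_chat.py API:

```python
import os


def resolve_gemini_auth(config: dict, env: dict) -> str:
    """Pick an auth method: 'api_key' when a key is available in the
    config or environment, 'vertexai' when a service-account JSON path
    is provided (hypothetical 'google_credentials_path' key)."""
    if "api_key" in config or env.get("GOOGLE_API_KEY"):
        return "api_key"
    if "google_credentials_path" in config:
        return "vertexai"
    raise ValueError("No Gemini credentials found in config or environment")


# The env dict is passed in (rather than reading os.environ directly)
# so the decision is easy to test.
print(resolve_gemini_auth({"api_key": "abc"}, {}))                        # api_key
print(resolve_gemini_auth({"google_credentials_path": "sa.json"}, {}))    # vertexai
print(resolve_gemini_auth({}, {"GOOGLE_API_KEY": "xyz"}))                 # api_key
```

Keeping the decision in one small function would let the constructor stay readable and makes the fallback order (API key first, then service-account file) easy to document.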