Issues: llm-tools/embedJs
Labels: bug (something isn't working), enhancement (new feature or request), question (further information is requested), dependencies (pull requests that update a dependency file)

- I'm trying to install the package on Windows and get the following error (bug) #86, opened Jun 28, 2024 by huyphamhaha
- Windows install problem version 0.0.89 (bug) #85, opened Jun 25, 2024 by fdx-matias
- Add OllamaEmbedding (it is available in langchain) (enhancement) #84, opened Jun 23, 2024 by hassanallaham
- Add Amazon Bedrock (included in langchain) (enhancement) #83, opened Jun 17, 2024 by converseKarl
- Add Groq for super fast inference with LLMs (supported by Langchain) (enhancement) #80, opened Jun 17, 2024 by converseKarl
- Allow conversation history to persist (enhancement) #79, opened Jun 17, 2024 by davetaz
- Limit token usage output parameter across all queried LLM models (enhancement) #75, opened Jun 11, 2024 by converseKarl
- Short and long term memory (enhancement) #72, opened Jun 4, 2024 by converseKarl
- Add extra parameter for loader source titles, return it with the result (enhancement) #67, opened May 29, 2024 by dr460nf1r3
- Switching Loaders / Vectors (enhancement) #58, opened May 18, 2024 by converseKarl
- Better way to manage dependencies? (dependencies, question) #57, opened May 15, 2024 by adhityan
- Support for multi model inputs (enhancement) #56, opened May 15, 2024 by adhityan
- Add post Context/Custom Meta to Loaders / RAG information to targeted Rag(id) (enhancement) #42, opened Apr 30, 2024 by converseKarl
- I'm unsure how to run a model that needs inputs (question) #39, opened Apr 26, 2024 by JonahElbaz
- Question: can we get the token numbers input and output from the query transaction (question) #30, opened Apr 23, 2024 by converseKarl
- Add an optional fine-tuned model id when performing a query once it passes to the LLM (enhancement) #29, opened Apr 23, 2024 by converseKarl
- How to use streaming for OpenAI models? (enhancement) #26, opened Apr 23, 2024 by benfiratkaya