llm-ops
Here are 12 public repositories matching this topic...
Firebase for AI Agents: Open-source backend platform that puts powerful generative models at the core of your database. With managed memory and RAG capabilities, developers can easily build AI agents, enhance their apps with generative tables, and create magical UI experiences.
Updated Jun 3, 2024 - Python
AIConfig is a config-based framework for building generative AI applications.
Updated Jun 4, 2024 - Python
Friendli: the fastest serving engine for generative AI
Updated Jun 19, 2024 - Python
An end-to-end LLM reference implementation providing a Q&A interface for Airflow and Astronomer
Updated Jun 21, 2024 - Python
Miscellaneous code and writings for MLOps
Updated Jun 23, 2024 - Jupyter Notebook
Python SDK for running evaluations on LLM-generated responses
Updated Jun 25, 2024 - Python
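For a sense of what such an evaluation SDK computes, here is a minimal sketch in plain Python. The `exact_match` and `run_eval` names are illustrative, not the SDK's actual API; real evaluators typically add model-graded and semantic metrics on top of simple checks like this.

```python
# Illustrative only: a minimal exact-match evaluator over LLM responses.
# Function names are hypothetical, not taken from any specific SDK.

def exact_match(expected: str, generated: str) -> bool:
    """Score one response by whitespace- and case-normalized exact match."""
    return expected.strip().lower() == generated.strip().lower()

def run_eval(cases: list[tuple[str, str]]) -> float:
    """Return accuracy over (expected, generated) pairs."""
    hits = sum(exact_match(expected, generated) for expected, generated in cases)
    return hits / len(cases)

cases = [
    ("Paris", " paris "),      # match after normalization
    ("42", "forty-two"),       # miss
]
print(run_eval(cases))  # 0.5
```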
Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
Updated Jun 27, 2024 - Python
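"OpenAI-compatible" means the server accepts the same request shape as OpenAI's chat completions route, so existing clients work by swapping the base URL. A minimal sketch of the request body such a server expects; the base URL and model name below are placeholders, not values from this project's docs:

```python
import json

# Hypothetical local deployment of an OpenAI-compatible server.
BASE_URL = "http://localhost:3000/v1"

payload = {
    "model": "mistral-7b-instruct",  # whichever model the server has loaded
    "messages": [{"role": "user", "content": "Hello!"}],
}

body = json.dumps(payload)
# An OpenAI-compatible server accepts this body at
# POST {BASE_URL}/chat/completions, so any OpenAI client library can be
# pointed at it by overriding its base URL.
print(body)
```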
Open-source alternative to the Assistants API with a managed backend for memory, RAG, tools, and tasks. ~Supabase for building AI agents.
Updated Jul 1, 2024 - Python
The prompt engineering, prompt management, and prompt evaluation tool for TypeScript, JavaScript, and Node.js.
Updated Jun 30, 2024 - TypeScript
RAG (Retrieval-Augmented Generation) framework by TrueFoundry for building modular, open-source applications for production
Updated Jul 1, 2024 - Python
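The core move in any RAG framework is the retrieval step: rank stored documents against the query, then prepend the top hits to the prompt. A toy sketch of that step using word overlap; real frameworks use embeddings and a vector store, and these function names are illustrative only:

```python
# Toy retrieval step of a RAG pipeline: rank documents by word overlap with
# the query. Production frameworks replace this with embedding similarity
# search over a vector store.

def _words(text: str) -> set[str]:
    """Lowercased words with surrounding punctuation stripped."""
    return {w.strip(".,?!") for w in text.lower().split()}

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    return sorted(docs, key=lambda d: len(_words(query) & _words(d)), reverse=True)[:k]

docs = [
    "Airflow schedules and monitors data pipelines.",
    "RAG augments a prompt with retrieved context.",
]
print(retrieve("what is RAG context", docs)[0])
# The retrieved text would then be inserted into the LLM prompt as context.
```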