Generate & Ship UI with minimal effort - Open Source Generative UI with natural language
Build a RAG preprocessing pipeline
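A core step in any RAG preprocessing pipeline is splitting raw text into overlapping chunks before embedding. A minimal sketch in plain Python (the chunk size and overlap values are illustrative assumptions, not settings from this project):

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows for later embedding."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

doc = "RAG pipelines retrieve relevant passages before generation. " * 10
chunks = chunk_text(doc, chunk_size=120, overlap=30)
```

Overlap preserves context across chunk boundaries, so a sentence cut at the end of one window still appears whole at the start of the next.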
RAG-nificent is a state-of-the-art framework leveraging Retrieval-Augmented Generation (RAG) to provide instant answers and references from a curated directory of PDFs containing information on any given topic. Supports Llama3.1 and OpenAI Models via the Groq API.
Search for a holiday and get destination advice from an LLM. Observability by Dynatrace.
This repo collects advanced RAG systems; each branch represents a separate RAG-based project.
Demo LLM (RAG pipeline) web app running locally using docker-compose. LLM and embedding models are consumed as services from OpenAI.
AI-driven prompt generation and evaluation system, designed to optimize the use of Language Models (LLMs) in various industries. The project consists of both frontend and backend components, facilitating prompt generation, automatic evaluation data generation, and prompt testing.
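An automatic prompt-evaluation loop like the one described can be sketched as scoring each candidate prompt template against labeled test cases. The `fake_model` stub below is a hypothetical stand-in for a real LLM call:

```python
from typing import Callable

def evaluate_prompts(
    prompts: list[str],
    cases: list[tuple[str, str]],
    model: Callable[[str], str],
) -> dict[str, float]:
    """Score each prompt template by exact-match accuracy on (input, expected) cases."""
    scores = {}
    for template in prompts:
        correct = 0
        for text, expected in cases:
            answer = model(template.format(input=text))
            correct += answer.strip().lower() == expected.strip().lower()
        scores[template] = correct / len(cases)
    return scores

# Stub model for illustration; a real system would call an LLM API here.
def fake_model(prompt: str) -> str:
    return "positive" if "great" in prompt else "negative"

scores = evaluate_prompts(
    ["Classify sentiment: {input}", "Is this review positive or negative? {input}"],
    [("This movie was great!", "positive"), ("Terrible plot.", "negative")],
    fake_model,
)
```

The highest-scoring template can then be promoted for production use; swapping in semantic similarity instead of exact match is a common refinement.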
Learn Retrieval-Augmented Generation (RAG) from scratch using LLMs from Hugging Face, with LangChain or plain Python.
A production-ready application built on a RAG-based language model.
Using MLflow to track a RAG pipeline built with LlamaIndex and Ollama/Hugging Face LLMs.
Hybrid Search RAG Pipeline integrating BM25 and vector search techniques using LangChain
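Hybrid search combines a lexical ranker (BM25) with a dense ranker and fuses the two result lists. A self-contained sketch using bag-of-words cosine as a stand-in for real embeddings, and reciprocal rank fusion for the merge (all scoring parameters are textbook defaults, not values from this project):

```python
import math
from collections import Counter

def bm25_scores(query: str, docs: list[str], k1: float = 1.5, b: float = 0.75) -> list[float]:
    """Simplified BM25 over whitespace-tokenized documents."""
    tokenized = [d.lower().split() for d in docs]
    avgdl = sum(len(t) for t in tokenized) / len(tokenized)
    n = len(docs)
    scores = []
    for tokens in tokenized:
        tf = Counter(tokens)
        score = 0.0
        for term in query.lower().split():
            df = sum(term in t for t in tokenized)
            if df == 0:
                continue
            idf = math.log((n - df + 0.5) / (df + 0.5) + 1)
            freq = tf[term]
            score += idf * freq * (k1 + 1) / (freq + k1 * (1 - b + b * len(tokens) / avgdl))
        scores.append(score)
    return scores

def cosine_scores(query: str, docs: list[str]) -> list[float]:
    """Bag-of-words cosine similarity as a toy stand-in for dense vector search."""
    def vec(text: str) -> Counter:
        return Counter(text.lower().split())
    q = vec(query)
    out = []
    for d in docs:
        v = vec(d)
        dot = sum(q[t] * v[t] for t in q)
        norm = math.sqrt(sum(x * x for x in q.values())) * math.sqrt(sum(x * x for x in v.values()))
        out.append(dot / norm if norm else 0.0)
    return out

def hybrid_rank(query: str, docs: list[str], k: int = 60) -> list[str]:
    """Fuse the two rankings with reciprocal rank fusion (RRF)."""
    fused = Counter()
    for scores in (bm25_scores(query, docs), cosine_scores(query, docs)):
        ranked = sorted(range(len(docs)), key=lambda i: scores[i], reverse=True)
        for rank, i in enumerate(ranked):
            fused[i] += 1.0 / (k + rank + 1)
    return [docs[i] for i, _ in fused.most_common()]

docs = [
    "vector search uses dense embeddings",
    "BM25 ranks documents by term frequency",
    "hybrid search combines BM25 and vector retrieval",
]
top = hybrid_rank("hybrid BM25 vector search", docs)
```

RRF fuses rank positions rather than raw scores, which sidesteps the problem that BM25 and cosine scores live on incompatible scales; this is also the fusion strategy used by LangChain's `EnsembleRetriever`.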
A GenAI-based search system that scans numerous fashion product descriptions to recommend suitable options based on user queries.
This project implements document ingestion, embedding generation, and retrieval-augmented generation (RAG). If you are looking for a small project to understand a basic RAG implementation, this is a good place to start.
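The ingest-embed-retrieve flow of a basic RAG system can be sketched end to end in a few lines. The `TinyRAG` class and its bag-of-words "embedding" below are illustrative assumptions; a real pipeline would use an embedding model and a vector store:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real pipeline would call an embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class TinyRAG:
    def __init__(self) -> None:
        self.docs: list[str] = []
        self.vectors: list[Counter] = []

    def ingest(self, docs: list[str]) -> None:
        """Store each document alongside its vector."""
        for d in docs:
            self.docs.append(d)
            self.vectors.append(embed(d))

    def retrieve(self, query: str, k: int = 2) -> list[str]:
        """Return the k documents most similar to the query."""
        q = embed(query)
        ranked = sorted(
            range(len(self.docs)),
            key=lambda i: cosine(q, self.vectors[i]),
            reverse=True,
        )
        return [self.docs[i] for i in ranked[:k]]

    def build_prompt(self, query: str) -> str:
        """Augment the query with retrieved context before generation."""
        context = "\n".join(self.retrieve(query))
        return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

rag = TinyRAG()
rag.ingest([
    "Embeddings map text to vectors.",
    "Retrieval finds the most relevant chunks.",
    "Generation produces the final answer.",
])
prompt = rag.build_prompt("How does retrieval work?")
```

The resulting prompt would then be sent to an LLM, which is the "generation" half of retrieval-augmented generation.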