📄️ Architecture
This section outlines the architecture of the services, their interactions, and planned features.
📄️ Quick Start
How to set up and run the repository in both production and development environments.
📄️ Model Context Protocol
MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.
📄️ LangGraph
Gain control with LangGraph to design agents that reliably handle complex tasks.
📄️ Supabase
A walkthrough of obtaining a POSTGRES_DSN using Supabase's free Postgres database, auth, and APIs; a sample connection string is sketched at the end of this page.
📄️ Langfuse
Traces, evals, prompt management and metrics to debug and improve your LLM application.
📄️ Grafana Stack
Configure the OpenAI integration to monitor token usage, response times, and overall costs, and to make data-driven decisions about how the OpenAI APIs are used.
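
As a rough illustration of what the Supabase page walks through, `POSTGRES_DSN` is a standard Postgres connection string. The sketch below is a placeholder only; the host, password, and project reference are not values from this repository and must be taken from your own Supabase project's database settings.

```python
import os

# Illustrative placeholder: a typical Supabase direct-connection DSN.
# Replace <your-password> and <project-ref> with the values shown in your
# Supabase project's database settings before running the services.
os.environ.setdefault(
    "POSTGRES_DSN",
    "postgresql://postgres:<your-password>@db.<project-ref>.supabase.co:5432/postgres",
)
```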