Gustavo Karsten
πŸš€

# LangAgents — AI Agent Platform

Full-stack AI agent platform with LangGraph execution graphs, RAG retrieval, real-time SSE streaming, and complete LLM observability via Langfuse.


## πŸ”§ Tech Stack

LangGraph · LangChain · FastAPI · Next.js · Anthropic Claude · ChromaDB · Langfuse · PostgreSQL · Docker · SSE

## πŸ“ˆ Workflow

**1. User Query Input**: the user sends a query via the Next.js frontend.

**2. LangGraph Agent**: a StateGraph orchestrates the reasoning steps (ReAct pattern).
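The stateful node-to-node flow can be sketched without the library: a shared state dict passes through a sequence of node functions, each returning an updated state. The node names and state keys below are hypothetical stand-ins, not the project's actual graph.

```python
# Dependency-free sketch of a LangGraph-style StateGraph: a state dict
# flows through nodes in sequence; each node returns an updated copy.
from typing import Callable

AgentState = dict  # in LangGraph this would be a TypedDict schema

def retrieve(state: AgentState) -> AgentState:
    # Stand-in for the RAG step: attach retrieved context to the state.
    return {**state, "context": f"docs matching '{state['query']}'"}

def generate(state: AgentState) -> AgentState:
    # Stand-in for the LLM call: answer from query plus context.
    return {**state, "answer": f"Answer to '{state['query']}' using {state['context']}"}

def run_graph(state: AgentState, nodes: list[Callable]) -> AgentState:
    # Linear edge traversal; a real StateGraph also supports
    # conditional edges and loops for the ReAct cycle.
    for node in nodes:
        state = node(state)
    return state

final = run_graph({"query": "What is RAG?"}, [retrieve, generate])
print(final["answer"])
```

The real graph adds conditional edges (e.g. "call a tool vs. answer"), which is where the ReAct loop lives.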

**3. RAG Retrieval**: semantic search in ChromaDB using nomic-embed-text embeddings.
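At its core, the retrieval step ranks stored document embeddings by cosine similarity to the query embedding. A toy version with hand-made 3-d vectors (the documents and vectors are invented; the real store is ChromaDB with nomic-embed-text vectors):

```python
# Toy semantic search: cosine similarity over hand-made embeddings.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

docs = {
    "agents use tools": [0.9, 0.1, 0.0],
    "pasta recipes":    [0.0, 0.2, 0.9],
}

def retrieve_top_k(query_vec, k=1):
    # Rank documents by similarity to the query embedding.
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve_top_k([0.8, 0.2, 0.1]))  # the agent-related doc ranks first
```

ChromaDB does the same ranking over approximate nearest neighbours at scale, so the application code only supplies the query embedding and `k`.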

**4. Claude LLM Response**: generates a response with the retrieved context via Anthropic Claude.

**5. SSE Streaming**: real-time token streaming to the frontend via Server-Sent Events.

**6. Langfuse Tracing**: full observability across latency, cost per token, and quality metrics.
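The shape of per-call tracing can be illustrated with a plain decorator that records latency and token cost for each LLM call. This is not the Langfuse SDK; the record fields and cost model below are invented for illustration.

```python
# Illustration of per-call LLM tracing (latency + token cost): a
# decorator appends one trace record per call. Not the Langfuse API.
import time
from functools import wraps

TRACES = []

def traced(cost_per_token: float):
    def wrap(fn):
        @wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            out = fn(*args, **kwargs)
            tokens = len(out.split())  # crude token count for the sketch
            TRACES.append({
                "name": fn.__name__,
                "latency_s": time.perf_counter() - start,
                "tokens": tokens,
                "cost": tokens * cost_per_token,
            })
            return out
        return inner
    return wrap

@traced(cost_per_token=0.00001)
def fake_llm(prompt: str) -> str:
    return "stub answer for " + prompt

fake_llm("hello")
print(TRACES[0]["name"], TRACES[0]["tokens"])
```

In the real platform, Langfuse collects these records automatically via its LangChain integration, which is what makes 100% trace coverage cheap to maintain.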

## ✨ Features

  • β€’LangGraph StateGraph for stateful agent execution
  • β€’RAG pipeline with ChromaDB vector store
  • β€’Real-time token streaming via SSE
  • β€’Full LLM observability with Langfuse (traces, cost, latency)
  • β€’Multi-model support (Claude, GPT, local Ollama)
  • β€’FastAPI async backend with Pydantic validation
  • β€’Next.js frontend with Server Components
  • β€’Docker Compose deployment

## 🎯 Results

  • βœ“End-to-end latency under 2s for complex agent queries
  • βœ“100% trace coverage for all LLM calls
  • βœ“Modular architecture β€” swap LLM providers in minutes
  • βœ“Used as foundation for DockPlus AI client projects

## πŸ”— Related Projects