Bob: The Open-Source AI Agent Platform Built for Real-World Applications

Building AI-powered applications today means navigating a maze of fragmented tools, proprietary lock-in, and infrastructure complexity. You need a conversational interface, document retrieval, user management, and flexible model support — but stitching all of that together from scratch is a serious undertaking. That is exactly the problem we set out to solve with Bob.

Bob is an open-source AI agent platform developed by Ten Invent, released under the MIT license. It combines conversational AI, a full RAG (Retrieval-Augmented Generation) pipeline, multi-tenant user management, and a provider abstraction layer that lets you switch between cloud and local LLMs without changing a single line of application code. Whether you are building an internal knowledge assistant, a customer-facing chatbot, or a research tool grounded in your own documents, Bob gives you a production-ready foundation to start from.

The Problem: AI Applications Need More Than Just a Model

Large language models are powerful, but a raw model API is not an application. To build something genuinely useful, you need persistent conversation memory so users can have coherent multi-turn dialogues. You need document ingestion and retrieval so the model can answer questions grounded in your actual data rather than hallucinating. You need authentication, user management, and API token handling so multiple people or teams can use the system securely. And you need the freedom to choose your LLM provider — maybe Amazon Bedrock today, maybe a local model running on Ollama tomorrow — without rewriting your backend.

Most teams end up building all of this infrastructure themselves, often poorly and always slowly. Bob packages these capabilities into a cohesive, well-architected platform that you can deploy, extend, and customize.

Architecture Overview

Bob follows a clean separation between its backend and frontend, connected through a RESTful API with Server-Sent Events (SSE) for real-time streaming responses.

The backend is built with FastAPI and SQLAlchemy 2, using Alembic for database migrations. It handles all business logic: conversation management, agent orchestration, document processing, user authentication, and LLM communication. MariaDB (or MySQL) serves as the relational database, while ChromaDB acts as the vector store for the RAG pipeline.

The frontend is a modern React application built with Vite, TypeScript, and Tailwind CSS. It provides a responsive chat interface with real-time streaming, document upload capabilities, session management, and an admin panel for user approval and system configuration.

Infrastructure is handled through Docker for local development and AWS CloudFormation templates for cloud deployment, making it straightforward to go from a local prototype to a production environment.

Key Features

Conversational AI with Persistent Memory

Bob maintains session-based conversation history, allowing users to have natural multi-turn dialogues. Each session preserves context, so the model understands references to earlier parts of the conversation. Responses are streamed in real time via SSE, providing a fluid and responsive user experience rather than making users wait for a complete response to be generated.
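To make the streaming model concrete, here is a minimal sketch of how token deltas can be framed as Server-Sent Events. This is illustrative only — Bob's actual streaming lives in its FastAPI backend — but the `data:` line followed by a blank line is the standard SSE wire format, and a `[DONE]` sentinel is a common convention for signaling the end of a stream.

```python
import json
from typing import Iterable, Iterator

def sse_events(tokens: Iterable[str]) -> Iterator[str]:
    """Format model token deltas as Server-Sent Events frames.

    Hypothetical helper, not Bob's actual code; the wire format
    shown is standard SSE.
    """
    for token in tokens:
        # Each SSE frame is a "data:" line followed by a blank line.
        yield f"data: {json.dumps({'delta': token})}\n\n"
    # A sentinel frame tells the client the stream is complete.
    yield "data: [DONE]\n\n"
```

On the frontend, the browser's `EventSource` API (or a fetch-based reader) consumes these frames and appends each delta to the chat transcript as it arrives.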

RAG Pipeline with ChromaDB

One of Bob's most powerful capabilities is its Retrieval-Augmented Generation pipeline. Users can upload documents that are processed through LangChain's document processing utilities, chunked, embedded, and stored in ChromaDB as a vector store. When a user asks a question, Bob retrieves the most relevant document chunks and includes them as context for the LLM, producing answers that are grounded in your actual data.
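The chunking step above is worth a closer look. Bob delegates this to LangChain's text splitters, but the core idea is a sliding window with overlap, so context that straddles a chunk boundary is not lost. A simplified, character-based sketch (the parameter names and defaults here are assumptions, not Bob's actual settings):

```python
def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split a document into overlapping chunks for embedding.

    Illustrative only: Bob uses LangChain's splitters, which are
    token- and structure-aware, but the sliding-window idea is the
    same. `size` and `overlap` are counted in characters here.
    """
    if size <= overlap:
        raise ValueError("chunk size must exceed overlap")
    step = size - overlap  # how far the window advances each iteration
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
    return chunks
```

Each chunk is then embedded and written to ChromaDB; at query time, the question is embedded the same way and the nearest chunks are retrieved and prepended to the prompt.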

This is critical for enterprise and professional use cases where accuracy matters. Instead of relying on the model's training data alone, Bob can reference your internal documentation, research papers, policy documents, or any other text corpus you provide.

Agent Tools via Strands Agents

Bob is not just a chat interface — it is an agent platform. Through Strands Agents, Bob orchestrates a set of tools that the AI can invoke during conversations: a calculator for mathematical operations, time lookup for date and time queries, document retrieval for RAG-powered answers, and summarization for condensing longer texts. The agent decides when and how to use these tools based on the user's request, making interactions significantly more capable than simple prompt-response exchanges.
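The tool-use pattern can be sketched in a few lines. Note that Strands Agents provides its own decorator-based tool registration and handles the model-driven selection loop; the registry and dispatcher below are a hand-rolled illustration of the concept, not the Strands API.

```python
import re
from typing import Callable

# Illustrative tool registry — Strands Agents manages this for Bob.
TOOLS: dict[str, Callable[..., str]] = {}

def tool(fn: Callable[..., str]) -> Callable[..., str]:
    """Register a function so the agent can invoke it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def calculator(expression: str) -> str:
    """Evaluate a basic arithmetic expression."""
    # Restrict input to digits and arithmetic symbols before eval;
    # a production tool would use a proper expression parser.
    if not re.fullmatch(r"[\d\s+\-*/().]+", expression):
        raise ValueError("unsupported characters in expression")
    return str(eval(expression))

def dispatch(name: str, **kwargs: str) -> str:
    """Called by the agent loop once the model selects a tool."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)
```

In the real system, the model emits a structured tool call (name plus arguments), the framework executes it, and the result is fed back into the conversation so the model can compose its final answer.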

Multi-Tenant System with JWT Authentication

Bob is designed for multi-user environments from the ground up. It includes a full authentication system based on JWT tokens, with user registration that requires admin approval before access is granted. Users can manage their own API tokens for programmatic access. This makes Bob suitable for team deployments, internal tools, and SaaS-style applications where user isolation and access control are essential.
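To show what JWT-based auth involves under the hood, here is a minimal HS256 issue/verify pair built on the standard library. This is a sketch of the mechanism only — a real deployment, Bob's included, should use a maintained library such as PyJWT rather than hand-rolling token handling.

```python
import base64
import hashlib
import hmac
import json
import time

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, per the JWT spec."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_token(user_id: str, secret: str, ttl: int = 3600) -> str:
    """Mint a signed HS256 JWT with a subject and expiry claim."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(
        {"sub": user_id, "exp": int(time.time()) + ttl}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64url(hmac.new(secret.encode(), signing_input,
                           hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_token(token: str, secret: str) -> dict:
    """Check the signature and expiry, returning the claims."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64url(hmac.new(secret.encode(), signing_input,
                                hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(
        base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims
```

Every authenticated request to the backend carries such a token in its `Authorization` header; the server verifies it and resolves the user before touching any data.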

Flexible LLM Support: Cloud or Local, Your Choice

Perhaps the most distinctive aspect of Bob's architecture is its provider abstraction layer. Bob supports two LLM backends out of the box:

  • Amazon Bedrock for cloud-based inference, giving you access to models like Claude, Llama, and others available through AWS.
  • OpenAI-compatible servers such as LM Studio, Ollama, or vLLM for local or self-hosted inference.

Switching between providers is a configuration change, not a code change. This means you can develop locally with a model running on Ollama, test with a different model on LM Studio, and deploy to production on Amazon Bedrock — all without modifying your application. For organizations with data sovereignty requirements or those who want to experiment with different models, this flexibility is invaluable.
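The "configuration change, not a code change" idea boils down to programming against a provider interface and selecting the concrete backend at startup. The sketch below illustrates the pattern; the class names, config keys, and method signature are assumptions for illustration, not Bob's actual abstraction layer.

```python
from typing import Protocol

class LLMProvider(Protocol):
    """Minimal interface every backend must satisfy (illustrative)."""
    def complete(self, prompt: str) -> str: ...

class BedrockProvider:
    """Cloud inference via AWS — would call boto3's bedrock-runtime
    client in a real implementation."""
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("sketch only")

class OpenAICompatibleProvider:
    """Local/self-hosted inference against an OpenAI-compatible
    server such as Ollama, LM Studio, or vLLM."""
    def __init__(self, base_url: str) -> None:
        self.base_url = base_url  # e.g. "http://localhost:11434"
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("sketch only")

def provider_from_config(cfg: dict) -> LLMProvider:
    """Pick a backend from configuration; the keys here are
    hypothetical, not Bob's real settings names."""
    if cfg["provider"] == "bedrock":
        return BedrockProvider()
    if cfg["provider"] == "openai_compatible":
        return OpenAICompatibleProvider(cfg["base_url"])
    raise ValueError(f"unknown provider: {cfg['provider']}")
```

Because the rest of the application only ever sees the `LLMProvider` interface, swapping Bedrock for a local Ollama instance is a matter of changing the configuration passed to the factory.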

Tech Stack at a Glance

| Layer | Technology |
|---|---|
| Backend Framework | FastAPI |
| ORM & Migrations | SQLAlchemy 2 + Alembic |
| Database | MariaDB / MySQL |
| Vector Store | ChromaDB |
| Document Processing | LangChain |
| Agent Orchestration | Strands Agents |
| AWS Integration | boto3 |
| Frontend Framework | React + Vite |
| Frontend Language | TypeScript |
| Styling | Tailwind CSS |
| Auth | JWT |
| Infrastructure | Docker, AWS CloudFormation |

Getting Started

Bob is designed to be as simple as possible to get up and running. The repository includes Docker Compose configurations for local development, so you can have the entire stack — backend, frontend, database, and vector store — running with a few commands.

  1. Clone the repository from https://github.com/ianghel/bob.
  2. Configure your environment variables to point to your chosen LLM provider.
  3. Run docker compose up to start all services.
  4. Access the frontend, register a user, and start chatting.

The repository README provides detailed instructions for both local development and AWS deployment using the included CloudFormation templates.

Why Open Source Matters

We built Bob as an open-source project because we believe that AI infrastructure should be transparent, auditable, and customizable. Proprietary AI platforms lock you into specific vendors, hide their implementation details, and charge premium prices for features that should be table stakes. With Bob, you get the full source code under the MIT license. You can inspect every line, modify anything to fit your needs, and deploy it wherever you want — on your own servers, in your own cloud account, or on a developer's laptop.

The MIT license means there are no restrictions on commercial use. You can build products on top of Bob, integrate it into your existing systems, or use it as a learning resource to understand how production AI applications are structured.

Contributing and Community

Bob is actively developed by Ten Invent, and we welcome contributions from the community. Whether it is adding new agent tools, improving the RAG pipeline, supporting additional LLM providers, enhancing the frontend, or fixing bugs — we appreciate every pull request and issue report.

Visit the repository at https://github.com/ianghel/bob to explore the code, read the documentation, and get involved. If you find Bob useful, give the project a star on GitHub — it helps others discover it.

We are excited to see what you build with Bob.