AnythingLLM: The Full-Stack RAG Platform
An enterprise-grade, private RAG solution to build LLM apps that can chat with anything.
Categories
Building the App
Code Execution & Agents
Self-Hosted
TypeScript
Our Take
AnythingLLM provides a complete, self-hostable Retrieval-Augmented Generation (RAG) platform out of the box. It's designed for enterprises and developers who need a secure, multi-user environment for turning documents, videos, and other data sources into a queryable knowledge base for their LLM applications.
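To make "queryable knowledge base" concrete, here is a minimal sketch of asking a workspace a question over HTTP once an instance is running and a developer API key has been generated in its settings. The base URL, workspace slug, endpoint path, `mode` values, and response field names below are assumptions about the developer API and may differ between versions; treat your instance's own API documentation as authoritative.

```typescript
// Hypothetical sketch: query an AnythingLLM workspace via the developer API.
// Assumes Node 18+ (global fetch), an instance at localhost:3001, and an API
// key created in the instance settings. Endpoint path, payload shape, and
// response fields are assumptions — verify against your instance's API docs.

const BASE_URL = process.env.ANYTHINGLLM_URL ?? "http://localhost:3001";
const API_KEY = process.env.ANYTHINGLLM_API_KEY ?? "";
const WORKSPACE_SLUG = "handbook"; // hypothetical workspace slug

interface ChatResponse {
  textResponse?: string; // assumed field for the model's answer
  sources?: unknown[];   // assumed field for cited document chunks
}

async function queryWorkspace(message: string): Promise<void> {
  const res = await fetch(`${BASE_URL}/api/v1/workspace/${WORKSPACE_SLUG}/chat`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    // Assumption: "query" mode restricts answers to embedded documents
    // (RAG-only), while "chat" mode also allows general LLM responses.
    body: JSON.stringify({ message, mode: "query" }),
  });

  if (!res.ok) {
    throw new Error(`Request failed: ${res.status} ${res.statusText}`);
  }

  const data = (await res.json()) as ChatResponse;
  console.log(data.textResponse ?? "No answer returned.");
}

queryWorkspace("What does our refund policy say about digital goods?").catch(console.error);
```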
Quick Start
```bash
# Run the official image with Docker; the web UI is served on
# http://localhost:3001 once the container is up.
docker run -d \
  -p 3001:3001 \
  --cap-add SYS_ADMIN \
  -v "$(pwd)/storage:/app/server/storage" \
  -v "$(pwd)/hotdir:/app/server/hotdir" \
  --name anything-llm \
  mintplexlabs/anythingllm
```
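After the container starts, the UI is reachable at http://localhost:3001. For a programmatic smoke test, the sketch below lists workspaces through the developer API. It assumes an API key has already been created in the instance settings, and the endpoint path and response shape are assumptions to check against your instance's API documentation.

```typescript
// Hypothetical smoke test: confirm the Quick Start container is reachable and
// the developer API responds. Assumes Node 18+ (global fetch) and an API key
// from the instance settings; the endpoint path is an assumption.

const BASE_URL = "http://localhost:3001";
const API_KEY = process.env.ANYTHINGLLM_API_KEY ?? "";

async function listWorkspaces(): Promise<void> {
  const res = await fetch(`${BASE_URL}/api/v1/workspaces`, {
    headers: { Authorization: `Bearer ${API_KEY}` },
  });

  if (!res.ok) {
    throw new Error(`API not reachable or key rejected: ${res.status}`);
  }

  // Expected (assumed) shape: an object containing the list of workspaces.
  const data = await res.json();
  console.log(JSON.stringify(data, null, 2));
}

listWorkspaces().catch(console.error);
```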