# Docker Compose Setup
Docker Compose is the easiest way to run the Pensieve backend and frontend together. This guide provides ready-to-use compose files for both pre-built images and source builds.
For running individual containers, see the Docker guide.
## Prerequisites
| Tool | Version | Purpose |
|---|---|---|
| Docker | ≥ 20.10 | Container runtime |
| Docker Compose | ≥ 2.0 | Multi-container orchestration (included with Docker Desktop) |
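You can confirm both tools meet the minimum versions before proceeding (the exact version strings printed will vary with your installation):

```shell
# Both commands should succeed; check the reported versions
# against the table above (Docker ≥ 20.10, Compose ≥ 2.0)
docker --version
docker compose version
```

If `docker compose` is not found but `docker-compose` is, you have the older v1 plugin; upgrade to Compose v2 before continuing.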
## Option A: Using Pre-built Images (Recommended)

Create a `docker-compose.yml` file:
```yaml
services:
  backend:
    image: ghcr.io/5000k/pensieve-backend:latest
    ports:
      - "8000:8000"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - SEARCH_API_KEY=${SEARCH_API_KEY}
    volumes:
      - ./config.yaml:/app/config.yaml:ro
      - ./workspaces:/app/workspaces
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 5s
      retries: 3

  frontend:
    image: ghcr.io/5000k/pensieve-frontend:latest
    ports:
      - "3000:80"
    depends_on:
      backend:
        condition: service_healthy
```
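You can ask Compose to validate the file and print the resolved configuration (with environment-variable substitutions applied) before starting anything:

```shell
# Parses docker-compose.yml and reports syntax or substitution errors
docker compose config
```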
Create a `.env` file in the same directory for your secrets:

```bash
# .env — Docker Compose automatically loads this file
OPENAI_API_KEY=sk-your-openai-api-key
SEARCH_API_KEY=tvly-your-tavily-api-key
```
Start both services:

```bash
docker compose up -d
```
The frontend is available at http://localhost:3000 and the backend API at http://localhost:8000.
## Option B: Build from Source

If you want to build from the repository source code, use this `docker-compose.yml` in the repository root:
```yaml
services:
  backend:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "8000:8000"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - SEARCH_API_KEY=${SEARCH_API_KEY}
    volumes:
      - ./config.yaml:/app/config.yaml:ro
      - ./workspaces:/app/workspaces
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 5s
      retries: 3

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    ports:
      - "3000:80"
    depends_on:
      backend:
        condition: service_healthy
```
Build and start:

```bash
docker compose up -d --build
```

To rebuild after code changes:

```bash
docker compose build
docker compose up -d
```
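If only one side of the stack changed, you can limit the rebuild to a single service; for example, just the backend:

```shell
# Rebuild and restart only the backend service
docker compose build backend
docker compose up -d backend
```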
## Configuration

### Environment Variables

Docker Compose automatically loads a `.env` file from the same directory as `docker-compose.yml`. Use this for secrets:
```bash
# .env
OPENAI_API_KEY=sk-your-openai-api-key
SEARCH_API_KEY=tvly-your-tavily-api-key
```
See Environment Variables for all available variables.
### Global Config

Mount your `config.yaml` into the backend container. See Global Configuration for the full reference.

If you don't mount a `config.yaml`, the backend uses its built-in defaults; you still need to provide `OPENAI_API_KEY` via the environment.
### Workspace Data

The `workspaces` volume stores all workspace data, including per-workspace configuration, identity memory, topics, and project artifacts. This data persists across container restarts.
See Workspace Configuration for how to configure individual workspaces.
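Because all state lives under `./workspaces` on the host (plus `./config.yaml`), a plain file-level copy is sufficient for backups. Stop the services first so nothing is written mid-archive; a minimal sketch:

```shell
# Archive workspace data and config with a date-stamped name
tar czf "pensieve-backup-$(date +%Y%m%d).tar.gz" workspaces config.yaml
```

Restoring is the reverse: stop the stack, extract the archive into the compose directory, and start again.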
## Common Operations

### Start Services

```bash
docker compose up -d
```

### Stop Services

```bash
docker compose down
```

### View Logs

```bash
# All services
docker compose logs -f

# Backend only
docker compose logs -f backend

# Frontend only
docker compose logs -f frontend
```
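On long-running services the full log history can be overwhelming; `--tail` limits how much is replayed before following:

```shell
# Show only the last 100 lines of backend logs, then follow
docker compose logs --tail=100 -f backend
```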
### Rebuild After Code Changes

```bash
docker compose up -d --build
```

### Health Check

```bash
curl http://localhost:8000/health
# → {"status":"ok"}
```
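If you script against the API, you may prefer to block until the backend reports healthy instead of sleeping a fixed interval. A small polling sketch (the 30 × 2 s budget is an arbitrary choice):

```shell
# Poll the health endpoint until it answers, up to ~60 seconds
for _ in $(seq 1 30); do
  if curl -fsS http://localhost:8000/health >/dev/null 2>&1; then
    echo "backend is up"
    exit 0
  fi
  sleep 2
done
echo "backend did not become healthy in time" >&2
exit 1
```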
### Changing Ports

To use different host ports, update the `ports` mapping in `docker-compose.yml`:

```yaml
services:
  backend:
    ports:
      - "9000:8000"  # host:container
  frontend:
    ports:
      - "8080:80"
```
If you change the backend's host port (or hostname), also pass the updated URLs to the frontend service as runtime environment variables; no image rebuild is required:

```yaml
frontend:
  image: ghcr.io/5000k/pensieve-frontend:latest
  ports:
    - "8080:80"
  environment:
    - VITE_API_URL=http://localhost:9000
    - VITE_WS_URL=ws://localhost:9000/ws
```
Or via your `.env` file:

```bash
# .env
VITE_API_URL=http://localhost:9000
VITE_WS_URL=ws://localhost:9000/ws
```

```yaml
frontend:
  image: ghcr.io/5000k/pensieve-frontend:latest
  ports:
    - "8080:80"
  environment:
    - VITE_API_URL=${VITE_API_URL}
    - VITE_WS_URL=${VITE_WS_URL}
```
Also update the `cors_origins` in your Global Configuration to include the new frontend URL.
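As an illustration, the corresponding `config.yaml` entry might look like the following; treat this as a sketch of the shape, not the authoritative schema, and check Global Configuration for the exact key layout:

```yaml
cors_origins:
  - http://localhost:3000
  - http://localhost:8080   # new frontend host port
```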
## Next Steps
- Global Configuration — configure LLM providers and search
- Workspace Configuration — customize the pipeline for your workspace
- Environment Variables — full environment variable reference
- Local Setup — run without containers for development