Self-Hosting
Self-hosting AI for Database gives you complete control over your data. Nothing leaves your network -- database connections, queries, and results all stay within your infrastructure.
Why Self-Host?
- Data sovereignty -- no data leaves your network, ever.
- Compliance -- meet strict regulatory requirements (HIPAA, PCI-DSS, government).
- Network simplicity -- no need for SSH tunnels or IP allowlisting since everything is internal.
- Custom AI models -- use your own LLM deployment or private API keys.
Deployment Options
AI for Database is distributed as a Docker image and can be deployed on any infrastructure.
Docker Compose (Simplest)
```yaml
version: '3.8'
services:
  aifordb:
    image: aifordatabase/server:latest
    ports:
      - "3000:3000"
    environment:
      # Point DATABASE_URL at your metadata PostgreSQL server; inside the
      # container, "localhost" refers to the container itself, not the host
      - DATABASE_URL=postgresql://user:pass@db-host:5432/aifordb
      - OPENAI_API_KEY=sk-your-key
      - ENCRYPTION_KEY=your-32-char-encryption-key
      - JWT_SECRET=your-jwt-secret
    volumes:
      - aifordb_data:/app/data
volumes:
  aifordb_data:
```

Run with `docker compose up -d` and access the UI at http://localhost:3000.
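The `ENCRYPTION_KEY` must be 32 characters and the `JWT_SECRET` should be a strong random string. One way to generate suitable values, assuming standard `openssl` is available:

```shell
# 16 random bytes rendered as 32 hex characters for the encryption key
ENCRYPTION_KEY=$(openssl rand -hex 16)

# 32 random bytes rendered as a 44-character base64 string for the JWT secret
JWT_SECRET=$(openssl rand -base64 32)

echo "ENCRYPTION_KEY=$ENCRYPTION_KEY"
echo "JWT_SECRET=$JWT_SECRET"
```

Paste the printed values into the `environment` section above (or, better, load them from an `.env` file kept out of version control).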
Kubernetes
We provide Helm charts for production Kubernetes deployments:
```shell
helm repo add aifordb https://charts.aifordatabase.com
helm install aifordb aifordb/aifordatabase \
  --set database.url="postgresql://user:pass@db:5432/aifordb" \
  --set ai.apiKey="sk-your-key" \
  --set encryption.key="your-32-char-key"
```

The Helm chart includes configuration for replicas, resource limits, ingress, persistent volumes, and health checks.
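For production installs it is usually cleaner to put overrides in a `values.yaml` file instead of `--set` flags. The key names below are illustrative, not the chart's confirmed schema; inspect the real defaults with `helm show values aifordb/aifordatabase`:

```yaml
# Illustrative values.yaml -- key names are assumptions; verify against
# the chart's documented defaults before use
replicaCount: 2
resources:
  requests:
    cpu: "1"
    memory: 2Gi
  limits:
    cpu: "2"
    memory: 4Gi
ingress:
  enabled: true
  host: aifordb.internal.example.com
persistence:
  enabled: true
  size: 50Gi
```

Apply it with `helm install aifordb aifordb/aifordatabase -f values.yaml`.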
System Requirements
Minimum (small team, < 10 users):
- 2 CPU cores
- 4 GB RAM
- 20 GB disk
Recommended (production, 10-100 users):
- 4 CPU cores
- 8 GB RAM
- 50 GB SSD
AI for Database requires a metadata database (PostgreSQL 14+) to store configuration, user accounts, and query history. This is separate from the databases you connect as data sources.
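Before the first start, create the metadata database and a dedicated role on your PostgreSQL 14+ server. A minimal sketch, where the role, database, and password are placeholders you should replace:

```sql
-- Run as a PostgreSQL superuser; names and password are placeholders
CREATE ROLE aifordb_user WITH LOGIN PASSWORD 'change-me';
CREATE DATABASE aifordb OWNER aifordb_user;
```

Then point `DATABASE_URL` (or `database.url` in Helm) at this database.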
AI Model Configuration
Self-hosted deployments support multiple AI backends:
- OpenAI API -- set the `OPENAI_API_KEY` environment variable
- Azure OpenAI -- set `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_KEY`
- Anthropic Claude -- set `ANTHROPIC_API_KEY`
- Local models -- point to any OpenAI-compatible API endpoint (e.g., Ollama, vLLM)
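As an example of the local-model option: Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1`. The variable names below (`OPENAI_BASE_URL`, `OPENAI_MODEL`) are assumptions for illustration; confirm the exact names your deployment expects:

```shell
# Hypothetical variable names -- check your deployment's documentation
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_API_KEY=ollama      # local servers typically accept any non-empty key
export OPENAI_MODEL=llama3.1      # model name as registered with Ollama
```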
Updates
Self-hosted instances receive updates through new Docker image tags. We publish stable releases monthly and patch releases as needed. Enable automatic update checks in the admin settings to receive notifications.
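To keep upgrades explicit, consider pinning a specific image tag in your compose file rather than tracking `latest` (the version shown is a placeholder, not a real release number):

```yaml
services:
  aifordb:
    image: aifordatabase/server:1.4.2  # placeholder; use a published tag
```

Upgrading is then a matter of bumping the tag and running `docker compose pull && docker compose up -d`.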
Support
Self-hosted customers on Enterprise plans receive dedicated support including deployment assistance, upgrade guidance, and priority issue resolution. Community support is available on our Discord for open-source users.