# Environment

## Environment Variables
Llumen uses environment variables for configuration. No config files needed!
### Required

| Variable | Description | Example |
|---|---|---|
| `API_KEY` | OpenRouter or OpenAI API key | `sk-or-v1-...` |
### Optional

| Variable | Description | Default |
|---|---|---|
| `API_BASE` | Custom API endpoint | `https://openrouter.ai/api` |
| `DATA_PATH` | Storage directory | `.` (current directory) |
| `BIND_ADDR` | Address and port to listen on | `0.0.0.0:80` |
| `TRUSTED_HEADER` | Header for middleware auth | None |
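Taken together, a minimal native setup might look like the following before starting the binary (a sketch; the values are placeholders):

```sh
# Required
export API_KEY="sk-or-v1-..."

# Optional overrides (defaults shown in the table above)
export API_BASE="https://openrouter.ai/api"
export DATA_PATH="/var/lib/llumen"
export BIND_ADDR="0.0.0.0:8080"

./llumen
```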
## Docker Configuration

### Example
```sh
docker run -d \
  -e API_KEY="sk-or-v1-..." \
  -e DATA_PATH="/data" \
  -e BIND_ADDR="0.0.0.0:8080" \
  -p 8080:8080 \
  -v ./llumen-data:/data \
  ghcr.io/pinkfuwa/llumen:latest
```
## API Provider Configuration

### OpenRouter (Recommended)
- Sign up at openrouter.ai
- Generate an API key
- Set the key:
```sh
export API_KEY="sk-or-v1-..."
# API_BASE defaults to OpenRouter, no need to set it
```
Why OpenRouter?
- Access to 100+ models
- Pay-per-use pricing
- No subscriptions
- Automatic fallbacks
### OpenAI

```sh
export API_KEY="sk-..."
export API_BASE="https://api.openai.com"
```
### Local Models (Ollama)

```sh
# Start Ollama
ollama serve

# Configure llumen
export API_KEY="ollama"
export API_BASE="http://localhost:11434/v1"
```
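Before pointing llumen at Ollama, it can help to confirm the OpenAI-compatible endpoint is reachable (assumes `curl` is installed):

```sh
# Should return a JSON listing of locally pulled models
curl -s http://localhost:11434/v1/models
```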
### Custom Endpoints
Any OpenAI-compatible API works:
```sh
export API_KEY="your-key"
export API_BASE="https://your-endpoint.com/v1"
```
## Storage Configuration

### Data Directory
Llumen stores:
- Conversation history
- User preferences
- Session data
- Uploaded files (temporary)
Docker:

```sh
-v ./llumen-data:/data
-e DATA_PATH=/data
```
Native:

```sh
export DATA_PATH=/var/lib/llumen
mkdir -p $DATA_PATH
./llumen
```
### Data Structure

```text
data/
├── conversations/
│   └── *.json
├── users/
│   └── users.db
└── uploads/
    └── (temporary files)
```
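Because everything llumen persists lives under `DATA_PATH`, backing up is just archiving that directory. A minimal sketch (the path is an example; stop llumen first so the database isn't mid-write):

```sh
# Archive the data directory with today's date in the filename
tar czf llumen-backup-$(date +%F).tar.gz -C /var/lib/llumen .

# Restore into an empty data directory
# tar xzf llumen-backup-YYYY-MM-DD.tar.gz -C /var/lib/llumen
```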
## Network Configuration

### Binding Address
Listen on all interfaces (default):

```sh
BIND_ADDR=0.0.0.0:80
```

Localhost only:

```sh
BIND_ADDR=127.0.0.1:80
```

Custom port:

```sh
BIND_ADDR=0.0.0.0:8080
```
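After changing `BIND_ADDR`, a quick sanity check confirms llumen is listening where expected (assumes `curl`; adjust the host and port to match your setting):

```sh
# Prints the HTTP status line if llumen is reachable on this address
curl -sI http://127.0.0.1:8080/ | head -n 1
```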
### Behind Reverse Proxy

If using Nginx, Caddy, or similar:

```sh
# Llumen listens locally
BIND_ADDR=127.0.0.1:3000

# Reverse proxy handles external access
```
Example Nginx config:

```nginx
server {
    listen 80;
    server_name llumen.example.com;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```
## UI Settings
Settings configurable through the web interface:
### Appearance
- Theme selection
- Dark/light mode
- Font size
- Pattern overlays
### Chat Behavior
- Default chat mode
- Token streaming
- Message formatting
- Auto-scroll
### Privacy
- Conversation history
- Anonymous usage
- Local storage only
## Advanced Configuration

### Multiple Instances
Run multiple llumen instances:
```sh
# Instance 1
docker run -d --name llumen-1 -p 8001:80 -e API_KEY="..." -v ./data1:/data ghcr.io/pinkfuwa/llumen:latest

# Instance 2
docker run -d --name llumen-2 -p 8002:80 -e API_KEY="..." -v ./data2:/data ghcr.io/pinkfuwa/llumen:latest
```
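The same pattern generalizes to any number of instances, as long as each one gets a unique container name, host port, and data volume. A sketch:

```sh
# Launch three instances on ports 8001-8003, each with its own data directory
for i in 1 2 3; do
  docker run -d --name "llumen-$i" \
    -p "$((8000 + i)):80" \
    -e API_KEY="..." \
    -v "./data$i:/data" \
    ghcr.io/pinkfuwa/llumen:latest
done
```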
### Custom API Models
Configure specific models in the UI after login.
### Resource Limits
Docker resource constraints:
```sh
docker run -d \
  --memory="256m" \
  --cpus="1.0" \
  -e API_KEY="..." \
  ghcr.io/pinkfuwa/llumen:latest
```
## Security Best Practices

### Change default password

- Log in with `admin` / `P@88w0rd`
- Go to Settings -> Security
- Change the password immediately
- Use a strong, unique password
### Protect API keys

- Never commit `.env` files to git
- Use secrets management in production (see the sketch after this list)
- Rotate keys periodically
- Use separate keys per instance
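One simple way to keep the key out of shell history and `docker run` command lines is Docker's `--env-file` flag (a sketch; the filename is arbitrary):

```sh
# .env (kept out of version control), containing a single line:
#   API_KEY=sk-or-v1-...
echo '.env' >> .gitignore

# Pass the key to the container without exposing it on the command line
docker run -d --env-file .env \
  -p 8080:8080 -e BIND_ADDR="0.0.0.0:8080" \
  -e DATA_PATH="/data" -v ./llumen-data:/data \
  ghcr.io/pinkfuwa/llumen:latest
```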
### Network security

- Use HTTPS in production (reverse proxy)
- Restrict access with firewall rules (see the sketch below)
- Consider VPN for remote access
- Don't expose to the public internet without auth
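As one concrete example of the firewall and HTTPS points above, on a host fronted by the Nginx config shown earlier, something like this could work (assumes `ufw` and `certbot` are installed; adapt to your distribution):

```sh
# Expose only SSH and HTTP/HTTPS; llumen itself stays on 127.0.0.1:3000
sudo ufw default deny incoming
sudo ufw allow OpenSSH
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw enable

# Terminate TLS at the reverse proxy
sudo certbot --nginx -d llumen.example.com
```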