Provider Selection
- OpenRouter: wide range of models; native support (automatic modality detection via the OpenRouter API)
- OpenAI: functional, but requires more setup (modalities must be specified manually)
- Ollama: runs locally; limited by your hardware
OpenRouter (Recommended)
Setup:
- Sign up at openrouter.ai
- Generate API key
- Set the key in llumen:
export API_KEY="sk-or-v1-your-key"
# API_BASE defaults to OpenRouter
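Before starting llumen, a quick format check can catch a mistyped key. A minimal sketch, assuming OpenRouter keys keep their current `sk-or-v1-` prefix:

```shell
# Sanity-check the key format before starting llumen
# (assumes OpenRouter keys use the sk-or-v1- prefix)
API_KEY="sk-or-v1-your-key"
case "$API_KEY" in
  sk-or-v1-*) echo "key format looks right" ;;
  *)          echo "warning: key does not start with sk-or-v1-" ;;
esac
```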
OpenAI
Setup:
export API_KEY="sk-your-openai-key"
export API_BASE="https://api.openai.com"
Local Models (Ollama)
Setup:
- Install Ollama
- Pull models:
ollama pull llama3
ollama pull mistral
- Configure llumen:
export API_KEY="ollama"
export API_BASE="http://localhost:11434/v1"
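Once Ollama is running (`ollama serve`), you can confirm the endpoint responds before pointing llumen at it. A sketch, assuming Ollama's default port 11434:

```shell
# Probe Ollama's OpenAI-compatible endpoint (default port 11434)
API_BASE="http://localhost:11434/v1"
if curl -sf "$API_BASE/models" >/dev/null 2>&1; then
  echo "Ollama is reachable at $API_BASE"
else
  echo "Ollama not reachable at $API_BASE - is 'ollama serve' running?"
fi
```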
Other Providers
Any OpenAI-compatible API works:
export API_KEY="your-key"
export API_BASE="https://your-provider.com/v1"
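Whatever the provider, the same smoke test applies: list the models the endpoint exposes via the standard `/models` route. A sketch using a placeholder `.example` host; substitute your provider's real base URL and key:

```shell
# Smoke-test an OpenAI-compatible endpoint by listing its models
# (placeholder values - substitute your real key and base URL)
API_KEY="your-key"
API_BASE="https://your-provider.example/v1"
curl -sf "$API_BASE/models" -H "Authorization: Bearer $API_KEY" \
  || echo "request failed - check API_BASE and API_KEY"
```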
Tested compatible services:
- Groq
- Ollama Cloud
- Naga.ac
- Copilot API