Provider Selection

  • OpenRouter: wide range of models; native support (modality detection via the OpenRouter API)
  • OpenAI: functional, but requires more setup (the modality must be specified manually)
  • Ollama: local models, limited by your hardware

OpenRouter

Setup:

  1. Sign up at openrouter.ai
  2. Generate API key
  3. Set the key in llumen:
export API_KEY="sk-or-v1-your-key"
# API_BASE defaults to OpenRouter
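As the comment notes, API_BASE falls back to OpenRouter when unset. A minimal sketch of that resolution logic (the function name and default constant are assumptions for illustration, not llumen's actual code):

```python
import os

# Assumed OpenRouter default; llumen's real constant may differ.
OPENROUTER_BASE = "https://openrouter.ai/api/v1"

def resolve_endpoint(env=None):
    """Return (api_key, api_base), defaulting API_BASE to OpenRouter."""
    env = os.environ if env is None else env
    return env.get("API_KEY", ""), env.get("API_BASE", OPENROUTER_BASE)
```

Only API_KEY is mandatory here; every later section in this page just overrides API_BASE.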

OpenAI

Setup:

export API_KEY="sk-your-openai-key"
export API_BASE="https://api.openai.com"

Local Models (Ollama)

Setup:

  1. Install Ollama
  2. Pull models:
    ollama pull llama3
    ollama pull mistral
  3. Configure llumen:
    export API_KEY="ollama"
    export API_BASE="http://localhost:11434/v1"
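With this configuration, requests go to Ollama's OpenAI-compatible endpoint on localhost. A sketch of the request an OpenAI-compatible client builds against it (the helper name is hypothetical; the URL path and payload shape follow the standard /v1/chat/completions schema):

```python
import json

def build_chat_request(base: str, model: str, prompt: str):
    # Hypothetical helper: shows the URL and payload an
    # OpenAI-compatible client sends to Ollama.
    url = f"{base.rstrip('/')}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)
```

The model field must match a model you pulled earlier (e.g. llama3 or mistral).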

Other Providers

Any OpenAI-compatible API works:

export API_KEY="your-key"
export API_BASE="https://your-provider.com/v1"
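Providers differ in whether the /v1 prefix belongs in the base URL (compare the OpenAI and Ollama examples above). If in doubt, a small normalizer like this hypothetical sketch keeps the endpoint consistent:

```python
def normalize_base(url: str) -> str:
    # Hypothetical helper: most OpenAI-compatible APIs serve under /v1;
    # append it when the configured base omits it.
    url = url.rstrip("/")
    return url if url.endswith("/v1") else url + "/v1"
```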

Tested compatible services:

  • Groq
  • Ollama Cloud
  • Naga.ac
  • Copilot API