In self-hosted mode, kombify AI uses your own API keys to access AI models. This guide covers how to set up and manage your keys securely.
## Supported providers
| Provider | Models | Setup |
|---|---|---|
| OpenAI | GPT-4o, GPT-5-nano | API key from platform.openai.com |
| Anthropic | Claude Sonnet, Claude Haiku | API key from console.anthropic.com |
| Google AI | Gemini Pro, Gemini Flash | API key from aistudio.google.com |
| Azure AI Foundry | Hosted models | Endpoint + API key |
| OpenRouter | 200+ models | API key from openrouter.ai |
| Ollama | Self-hosted models | Local endpoint (no key needed) |
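The table above can be summarized in code: every provider except Ollama requires an API key, and each has a default base URL. The sketch below uses the providers' commonly documented endpoints; the map and type names are illustrative, not part of kombify, and Azure AI Foundry is omitted because its endpoint is unique to each deployed resource.

```go
package main

import "fmt"

// ProviderInfo describes how a self-hosted deployment would reach a
// provider. The BaseURL values are the providers' commonly documented
// defaults; verify them against each provider's own docs.
type ProviderInfo struct {
	BaseURL  string
	NeedsKey bool
}

// defaultProviders is an illustrative mapping, not kombify's internal
// configuration.
var defaultProviders = map[string]ProviderInfo{
	"openai":     {"https://api.openai.com/v1", true},
	"anthropic":  {"https://api.anthropic.com", true},
	"google":     {"https://generativelanguage.googleapis.com", true},
	"openrouter": {"https://openrouter.ai/api/v1", true},
	"ollama":     {"http://localhost:11434", false}, // local endpoint, no key
}

func main() {
	for name, p := range defaultProviders {
		fmt.Printf("%-10s key required: %v (%s)\n", name, p.NeedsKey, p.BaseURL)
	}
}
```

Ollama is the only entry with `NeedsKey: false`, which is why its settings section asks for an endpoint rather than a key.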
## Adding API keys
Go to Settings > API Keys.
Each provider has its own section with fields for the required credentials.
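Which fields a section asks for follows from the table above: most providers need only a key, Azure AI Foundry needs an endpoint plus a key, and Ollama needs only a local endpoint. The sketch below encodes that rule; the struct and function names are hypothetical illustrations, not kombify's actual schema.

```go
package main

import "fmt"

// Credentials mirrors the fields an API Keys settings section might
// collect. Field names here are illustrative, not kombify's schema.
type Credentials struct {
	Provider string
	APIKey   string // empty for Ollama, which needs no key
	Endpoint string // used by Azure AI Foundry and Ollama
}

// requiredFields reports which credential fields a provider's section
// expects, based on the table above (an assumption, not an official spec).
func requiredFields(provider string) []string {
	switch provider {
	case "azure":
		return []string{"Endpoint", "APIKey"}
	case "ollama":
		return []string{"Endpoint"}
	default: // openai, anthropic, google, openrouter
		return []string{"APIKey"}
	}
}

func main() {
	fmt.Println(requiredFields("azure"))  // [Endpoint APIKey]
	fmt.Println(requiredFields("ollama")) // [Endpoint]
}
```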
## Security
- All keys are encrypted at rest with AES-256-GCM
- Keys never leave your server
- No key data is sent to kombify servers
- Keys are stored in the local database (SQLite in self-hosted mode)
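The encryption-at-rest scheme named above, AES-256-GCM, is available directly in Go's standard library. The sketch below is a generic illustration of the technique, not kombify's actual code: it generates a throwaway in-memory key (real deployments derive and protect the key separately) and shows a seal/open round trip where the random nonce is prepended to the ciphertext.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
)

// seal encrypts plaintext with AES-256-GCM under a 32-byte key and
// prepends the random nonce, so the result is self-contained.
func seal(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key) // a 32-byte key selects AES-256
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// open reverses seal, authenticating the ciphertext as it decrypts.
func open(key, sealed []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce, ct := sealed[:gcm.NonceSize()], sealed[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ct, nil)
}

func main() {
	key := make([]byte, 32)
	if _, err := rand.Read(key); err != nil {
		panic(err)
	}
	sealed, err := seal(key, []byte("sk-example-not-a-real-key"))
	if err != nil {
		panic(err)
	}
	plain, err := open(key, sealed)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(plain)) // prints the placeholder key unchanged
}
```

Because GCM is authenticated, `open` fails with an error if the stored ciphertext is tampered with, not just if the key is wrong.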
## Cost management
kombify AI tracks token usage and estimated costs per request. View your spending in AI Settings > Usage.

## Further reading
- Supported models: full list of supported models per provider
- Configuration reference: all AI configuration options
