LLM Configuration
Configure your preferred Language Model provider for AI operations.
Supported Providers
- OpenAI
- Anthropic
- DeepSeek
- HuggingFace
- Replicate
- OpenRouter
- Local Models (via Ollama)
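The provider is normally chosen when the client is configured, together with a model identifier and (for hosted providers) an API key read from the environment. The snippet below is a minimal sketch of what that selection and validation could look like; the `LLMConfig` class, the provider identifiers, and the model id are illustrative assumptions, not the actual Contaigents API.

```python
import os
from dataclasses import dataclass

# Hypothetical provider identifiers -- illustrative only, not Contaigents' real keys.
SUPPORTED_PROVIDERS = {
    "openai", "anthropic", "deepseek", "huggingface",
    "replicate", "openrouter", "ollama",
}

@dataclass
class LLMConfig:
    provider: str                 # one of SUPPORTED_PROVIDERS
    model: str                    # provider-specific model identifier
    api_key: str | None = None    # not required for local Ollama models

    def __post_init__(self) -> None:
        if self.provider not in SUPPORTED_PROVIDERS:
            raise ValueError(f"Unsupported provider: {self.provider}")
        # Local models served via Ollama run without an API key.
        if self.provider != "ollama" and not self.api_key:
            raise ValueError(f"{self.provider} requires an API key")

# Read the key from the environment rather than hard-coding it.
config = LLMConfig(
    provider="anthropic",
    model="claude-sonnet-4-20250514",  # example model id
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)
```

Keeping the key in an environment variable (or a secrets manager) rather than in the configuration file is the usual approach to API key management.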
Configuration Options
- API Key management
- Model selection
- Temperature settings
- Context window size
- Response formatting
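The remaining options control how the model generates output. The sketch below shows one plausible shape for those settings with basic validation; the field names (`temperature`, `max_context_tokens`, `response_format`) are assumptions for illustration and may differ from the actual Contaigents configuration keys.

```python
from dataclasses import dataclass

# Hypothetical option set mirroring the list above -- names are illustrative.
@dataclass
class GenerationOptions:
    temperature: float = 0.7        # 0.0 = near-deterministic, higher = more varied
    max_context_tokens: int = 8192  # context window size requested from the provider
    response_format: str = "text"   # e.g. "text" or "json"

    def validate(self) -> None:
        if not 0.0 <= self.temperature <= 2.0:
            raise ValueError("temperature should be between 0.0 and 2.0")
        if self.max_context_tokens <= 0:
            raise ValueError("max_context_tokens must be positive")
        if self.response_format not in ("text", "json"):
            raise ValueError("response_format must be 'text' or 'json'")

# Low temperature plus JSON output is a common choice for structured extraction.
options = GenerationOptions(temperature=0.2, response_format="json")
options.validate()
```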