# 04 — Configuration System

## Overview

OpenHands uses a layered configuration system:

```
TOML defaults (config.template.toml)
  └── User config file (config.toml)
        └── Environment variables (OH_*, VITE_*, etc.)
              └── Python Pydantic models (runtime objects)
```

Later layers override earlier ones.
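As a rough sketch (not the actual loader, which merges into Pydantic models), layered overriding can be modeled as successive dict updates where later layers win:

```python
# Illustrative sketch of layered configuration merging; the real loader
# lives in openhands/core/config/.
def merge_layers(*layers: dict) -> dict:
    """Merge config dicts left to right; later layers override earlier ones."""
    merged: dict = {}
    for layer in layers:
        # Unset (None) values in a layer do not clobber earlier values.
        merged.update({k: v for k, v in layer.items() if v is not None})
    return merged

toml_defaults = {"model": "claude-sonnet-4-20250514", "temperature": 0.0}
user_config = {"temperature": 0.2}
env_overrides = {"model": "gpt-4o"}

print(merge_layers(toml_defaults, user_config, env_overrides))
# {'model': 'gpt-4o', 'temperature': 0.2}
```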
## TOML Configuration

File: `config.template.toml`

The template file documents all available configuration sections:

```toml
[core]
workspace_base = "./workspace"
max_iterations = 100
default_agent = "CodeActAgent"
runtime = "docker"     # docker | remote | local | cli
file_store = "local"   # local | s3 | google_cloud | memory

[llm]
model = "claude-sonnet-4-20250514"
api_key = "sk-..."
temperature = 0.0
max_output_tokens = 4096
num_retries = 8
caching_prompt = true

[agent]
enable_cmd = true
enable_browsing = true
enable_jupyter = true
enable_editor = true
enable_think = true
enable_finish = true

[sandbox]
timeout = 120
base_container_image = "nikolaik/python-nodejs:python3.12-nodejs22"

[security]
security_analyzer = ""   # or "invariant"
confirmation_mode = false
```
## Python Configuration Classes

### OpenHandsConfig (main)

File: `openhands/core/config/openhands_config.py:29-189`

```python
class OpenHandsConfig(BaseModel):
    llms: dict[str, LLMConfig]         # Named LLM configurations
    agents: dict[str, AgentConfig]     # Named agent configurations
    default_agent: str                 # Default: "CodeActAgent"
    sandbox: SandboxConfig
    security: SecurityConfig
    runtime: str                       # "docker" | "remote" | "local" | "cli"
    file_store: str                    # "local" | "s3" | "google_cloud" | "memory"
    file_store_path: str               # Default: "~/.openhands"
    max_iterations: int                # Default: 100
    max_budget_per_task: float | None  # USD budget cap per task
    enable_browser: bool               # Default: True
    mcp: MCPConfig                     # MCP server configuration
    kubernetes: KubernetesConfig       # K8s-specific settings
    git_user_name: str                 # Default: "openhands"
    git_user_email: str                # Default: "openhands@all-hands.dev"
```

Key methods:

- `get_llm_config(name)` (line 140) — retrieve a named LLM config (default: `'llm'`)
- `get_agent_config(name)` (line 155) — retrieve a named agent config (default: `'agent'`)
- `get_agent_to_llm_config_map()` (line 166) — map agent names to their LLM configs
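The named-lookup pattern behind `get_llm_config` can be sketched with plain dataclasses (field set trimmed; the fallback-to-`'llm'` behavior is assumed from the defaults above):

```python
from dataclasses import dataclass, field

@dataclass
class LLMConfig:
    model: str = "claude-sonnet-4-20250514"
    temperature: float = 0.0

@dataclass
class OpenHandsConfig:
    llms: dict[str, LLMConfig] = field(default_factory=dict)

    def get_llm_config(self, name: str = "llm") -> LLMConfig:
        # Fall back to the default 'llm' entry, then to field defaults.
        return self.llms.get(name) or self.llms.get("llm") or LLMConfig()

cfg = OpenHandsConfig(llms={"llm": LLMConfig(),
                            "fast": LLMConfig(model="gpt-4o-mini")})
print(cfg.get_llm_config("fast").model)
# gpt-4o-mini
```

Named sections let a single config carry several LLM profiles (e.g. a cheap model for summarization, a strong one for the main agent).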
### LLMConfig

File: `openhands/core/config/llm_config.py`

```python
class LLMConfig(BaseModel):
    model: str                          # e.g. "claude-sonnet-4-20250514"
    api_key: SecretStr | None
    base_url: str | None
    temperature: float                  # Default: 0.0
    max_output_tokens: int | None
    max_input_tokens: int | None
    num_retries: int                    # Default: 8
    retry_min_wait: int                 # Default: 15
    retry_max_wait: int                 # Default: 120
    caching_prompt: bool                # Default: True
    native_tool_calling: bool | None    # None = auto-detect
    disable_vision: bool                # Default: False
    reasoning_effort: str | None        # For reasoning models (o3, etc.)
    input_cost_per_token: float | None
    output_cost_per_token: float | None
    custom_llm_provider: str | None
    drop_params: bool                   # Default: True
```
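The retry fields suggest an exponential backoff schedule; here is a sketch of how `num_retries`, `retry_min_wait`, and `retry_max_wait` could combine (the actual retry behavior lives in the LLM client layer and may differ):

```python
# Illustrative backoff schedule: double the wait each retry, capped at
# max_wait. Not the real retry implementation.
def backoff_schedule(num_retries: int, min_wait: int, max_wait: int) -> list[int]:
    waits, wait = [], min_wait
    for _ in range(num_retries):
        waits.append(min(wait, max_wait))
        wait *= 2
    return waits

# Defaults from LLMConfig: 8 retries, 15 s minimum, 120 s maximum.
print(backoff_schedule(8, 15, 120))
# [15, 30, 60, 120, 120, 120, 120, 120]
```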
### AgentConfig

File: `openhands/core/config/agent_config.py`

```python
class AgentConfig(BaseModel):
    enable_cmd: bool                 # Default: True
    enable_browsing: bool            # Default: True
    enable_jupyter: bool             # Default: True
    enable_editor: bool              # Default: True
    enable_llm_editor: bool          # Default: False
    enable_think: bool               # Default: True
    enable_finish: bool              # Default: True
    enable_mcp: bool                 # Default: True
    enable_stuck_detection: bool     # Default: True
    enable_history_truncation: bool  # Default: True
    llm_config: str | None           # Override LLM config name
    condenser: dict                  # Condenser configuration
    cli_mode: bool                   # Default: False
```
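The `enable_*` flags gate which capabilities an agent advertises; a sketch of the pattern (tool names here are illustrative, not the actual OpenHands tool identifiers):

```python
from dataclasses import dataclass

@dataclass
class AgentConfig:
    enable_cmd: bool = True
    enable_browsing: bool = True
    enable_jupyter: bool = True
    enable_editor: bool = True

def active_tools(cfg: AgentConfig) -> list[str]:
    # Map each flag to an (illustrative) tool name; keep only enabled ones.
    flags = {
        "cmd": cfg.enable_cmd,
        "browser": cfg.enable_browsing,
        "jupyter": cfg.enable_jupyter,
        "editor": cfg.enable_editor,
    }
    return [name for name, enabled in flags.items() if enabled]

print(active_tools(AgentConfig(enable_browsing=False)))
# ['cmd', 'jupyter', 'editor']
```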
### SandboxConfig

File: `openhands/core/config/sandbox_config.py`

```python
class SandboxConfig(BaseModel):
    base_container_image: str       # Default: "nikolaik/python-nodejs:..."
    timeout: int                    # Command timeout in seconds
    selected_repo: str | None       # Repository to clone
    runtime_startup_env_vars: dict  # Extra env vars for sandbox
    enable_auto_lint: bool          # Default: True
```
### SecurityConfig

File: `openhands/core/config/security_config.py`

```python
class SecurityConfig(BaseModel):
    security_analyzer: str   # "" or "invariant"
    confirmation_mode: bool  # Default: False
```
## Configuration Loading Order

```
1. config.template.toml          # Shipped defaults
        │
2. ~/.openhands/config.toml      # User overrides (or path from args)
        │
3. Environment variables         # OH_* prefix for core settings
        │
4. CLI arguments                 # --model, --agent, etc.
        │
5. Pydantic model defaults       # Field defaults, used when no layer sets a value
```

Loading is handled by `setup_config_from_args()` and related functions in
`openhands/core/config/`.
## Key Environment Variables

### Core (OH_*)

| Variable | Description |
|---|---|
| `LLM_MODEL` | LLM model name |
| `LLM_API_KEY` | LLM API key |
| `LLM_BASE_URL` | Custom LLM endpoint |
| `WORKSPACE_BASE` | Base workspace directory |
| `WORKSPACE_MOUNT_PATH_IN_SANDBOX` | Mount path inside sandbox |
| `DEBUG` | Enable debug logging |
| `LOG_ALL_EVENTS` | Log all events (verbose) |
| `LOG_JSON` | Emit structured JSON logs |
| `SANDBOX_ENV_*` | Passed into sandbox as env vars (prefix stripped) |
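The `SANDBOX_ENV_*` prefix-stripping convention can be sketched as:

```python
# Sketch of SANDBOX_ENV_* handling: collect matching variables and strip
# the prefix before injecting them into the sandbox environment.
def collect_sandbox_env(environ: dict[str, str],
                        prefix: str = "SANDBOX_ENV_") -> dict[str, str]:
    return {k[len(prefix):]: v
            for k, v in environ.items() if k.startswith(prefix)}

env = {"SANDBOX_ENV_API_TOKEN": "abc", "PATH": "/usr/bin"}
print(collect_sandbox_env(env))
# {'API_TOKEN': 'abc'}
```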
### Frontend (VITE_*)

| Variable | Description |
|---|---|
| `VITE_BACKEND_HOST` | Backend API URL |
| `VITE_FRONTEND_PORT` | Frontend dev server port |
### Server / Runtime

| Variable | Description |
|---|---|
| `port` | Server listen port (default: 3000) |
| `DISABLE_VSCODE_PLUGIN` | Disable VSCode plugin in runtime |
| `OPENHANDS_FORCE_VISION` | Force vision capability on |
| `LITE_LLM_API_URL` | Override LiteLLM proxy URL |
| `WEB_HOST` | Web host for environment detection |
## Server Configuration

File: `openhands/app_server/config.py`

The `ServerConfig` provides pluggable backends for server-level storage:

```python
class ServerConfig:
    conversation_store: str  # "file" | "redis"
    settings_store: str      # "file" | "redis"
    secrets_store: str       # "file" | "redis"
```

This enables deployment flexibility: local development can use file-based stores, while production deployments can use Redis for shared state across instances.
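A store-factory sketch of the pluggable-backend idea (class names are hypothetical stand-ins, not the real OpenHands backends):

```python
# Hypothetical backends keyed by the ServerConfig string value.
class FileStore:
    backend = "file"

class RedisStore:
    backend = "redis"

STORES = {"file": FileStore, "redis": RedisStore}

def make_store(kind: str):
    # Fail loudly on an unrecognized backend name.
    if kind not in STORES:
        raise ValueError(f"unknown store backend: {kind!r}")
    return STORES[kind]()

print(make_store("file").backend)
# file
```

Registering backends in a dict keeps the selection logic data-driven: adding a new store means adding one entry, not editing a chain of conditionals.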
## Frontend Configuration

The React frontend uses Vite environment variables (prefixed with `VITE_`). The
proxy configuration in `vite.config.ts` forwards API requests to the backend:

```ts
// frontend/vite.config.ts
server: {
  proxy: {
    '/api': 'http://localhost:3000',
    '/ws': { target: 'ws://localhost:3000', ws: true }
  }
}
```

The web client config is served via the `/api/v1/web-client/config` endpoint,
which provides runtime configuration (GitHub/Replicated app slugs, feature flags)
to the frontend without rebuilding.
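The endpoint's role can be sketched as serializing runtime settings for the browser; field names here are illustrative only, not the actual response schema:

```python
import json

# Hypothetical sketch: the server exposes runtime configuration as JSON so
# the built frontend bundle can adapt without a rebuild.
def web_client_config() -> str:
    payload = {
        "feature_flags": {},  # UI toggles read at runtime
        "app_slug": None,     # e.g. a GitHub App slug, when configured
    }
    return json.dumps(payload)

print(web_client_config())
# {"feature_flags": {}, "app_slug": null}
```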