Configuration
How to connect ClawSide to an LLM or agent backend
OpenClaw
OpenClaw’s Gateway can serve a small OpenAI-compatible Chat Completions endpoint. This endpoint is disabled by default.
- Stop the OpenClaw gateway:
openclaw gateway stop
- To enable it, edit `~/.openclaw/openclaw.json` and set `gateway.http.endpoints.chatCompletions.enabled` to `true`:
{
  "gateway": {
    "port": 18789,
    "mode": "local",
    "bind": "loopback",
    "auth": {
      "mode": "token",
      "token": "${YOUR...TOKEN}"
    },
    "http": {
      "endpoints": {
        "chatCompletions": {
          "enabled": true
        }
      }
    }
  }
}
- Restart the OpenClaw gateway:
openclaw gateway start
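Once the gateway is back up, you can sanity-check the endpoint with a request like the one below. The `/v1/chat/completions` path, the `default` model name, and the `$YOUR_TOKEN` variable are assumptions based on typical OpenAI-compatible servers; substitute the values your gateway actually uses.

```shell
# Hypothetical smoke test against the local gateway;
# path, model name, and token variable are assumptions
curl http://127.0.0.1:18789/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model": "default", "messages": [{"role": "user", "content": "ping"}]}'
```

A JSON response (rather than a connection refusal or 404) confirms the Chat Completions endpoint is enabled.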
Reference: OpenClaw Gateway docs
Ollama
No authentication is required when accessing Ollama’s API locally via http://localhost:11434. Ollama allows cross-origin requests from 127.0.0.1 and 0.0.0.0 by default.
For browser extensions, set OLLAMA_ORIGINS to include chrome-extension://*:
# Allow all Chrome extensions
OLLAMA_ORIGINS=chrome-extension://* ollama serve
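With the server running, a quick way to confirm the API is reachable (no token needed) is Ollama's `/api/tags` endpoint, which lists locally installed models:

```shell
# List local models; an empty "models" array still confirms the API is up
curl http://localhost:11434/api/tags
```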
Hermes-Agent
- Stop the Hermes gateway:
hermes gateway stop
- Edit `~/.hermes/.env` and append:
# Enable Hermes HTTP Gateway
API_SERVER_ENABLED=true
# Set Gateway auth token
API_SERVER_KEY=123456
# Allow requests from the Chrome extension
GATEWAY_ALLOW_ALL_USERS=true
- Restart the Hermes gateway:
hermes gateway start
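The edit step above can also be scripted. This sketch appends the same three settings to `~/.hermes/.env`, creating the directory first in case it does not exist yet:

```shell
# Create the Hermes config dir if missing, then append the gateway settings
mkdir -p ~/.hermes
cat >> ~/.hermes/.env <<'EOF'
# Enable Hermes HTTP Gateway
API_SERVER_ENABLED=true
# Set Gateway auth token
API_SERVER_KEY=123456
# Allow requests from the Chrome extension
GATEWAY_ALLOW_ALL_USERS=true
EOF
```

Note that `>>` appends on every run; if you re-run the script, check the file for duplicate entries.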