Troubleshooting

How to connect ClawSide with an LLM/agent

OpenClaw

OpenClaw’s Gateway can serve a small OpenAI-compatible Chat Completions endpoint. This endpoint is disabled by default.

  1. Stop the openclaw gateway:
openclaw gateway stop
  2. Edit ~/.openclaw/openclaw.json and set gateway.http.endpoints.chatCompletions.enabled to true:
{
  "gateway": {
    "port": 18789,
    "mode": "local",
    "bind": "loopback",
    "auth": {
      "mode": "token",
      "token": "${YOUR...TOKEN}"
    },
    "http": {
      "endpoints": {
        "chatCompletions": {
          "enabled": true
        }
      }
    }
  }
}
  3. Restart the openclaw gateway:
openclaw gateway start
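Once the gateway is back up, you can sanity-check the endpoint from a script. A minimal sketch, assuming the OpenAI-compatible route lives at /v1/chat/completions on the configured port (18789) and that the gateway expects the configured token as a Bearer credential; the model name is a placeholder:

```python
import json
import urllib.request

# Assumptions: port 18789 from openclaw.json, an OpenAI-compatible route at
# /v1/chat/completions, and Bearer auth with the gateway.auth.token value.
GATEWAY_URL = "http://127.0.0.1:18789/v1/chat/completions"
TOKEN = "YOUR_TOKEN"  # replace with the value of gateway.auth.token

payload = json.dumps({
    "model": "default",  # hypothetical model name; use one your gateway serves
    "messages": [{"role": "user", "content": "Hello"}],
}).encode("utf-8")

req = urllib.request.Request(
    GATEWAY_URL,
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {TOKEN}",
    },
)
# urllib.request.urlopen(req) would send the request once the gateway is running.
```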

Reference: OpenClaw Gateway docs

Ollama

No authentication is required when accessing Ollama’s API locally via http://localhost:11434. By default, Ollama accepts cross-origin requests only from local origins such as 127.0.0.1 and 0.0.0.0.

For browser extensions, set OLLAMA_ORIGINS to include chrome-extension://*:

# Allow all Chrome extensions
OLLAMA_ORIGINS=chrome-extension://* ollama serve
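To see why the Origin matters, here is a sketch of the request an extension would make against Ollama's /api/chat endpoint; the Origin header is what OLLAMA_ORIGINS must allow, and the extension ID and model name are placeholders:

```python
import json
import urllib.request

# The request a browser extension would send to Ollama's local chat endpoint.
# OLLAMA_ORIGINS must allow the Origin header below, or the browser's CORS
# preflight fails. /api/chat and its payload shape follow Ollama's API.
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps({
        "model": "llama3",  # any model you have pulled locally
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False,
    }).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Origin": "chrome-extension://<extension-id>",  # placeholder ID
    },
)
# urllib.request.urlopen(req) would send it once `ollama serve` is running.
```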

Hermes-Agent

  1. Stop the hermes gateway:
hermes gateway stop
  2. Edit ~/.hermes/.env and append:
# Enable Hermes HTTP Gateway
API_SERVER_ENABLED=true
# Set Gateway auth token
API_SERVER_KEY=123456
# Allow Chrome extensions to access the gateway
GATEWAY_ALLOW_ALL_USERS=true
  3. Restart the hermes gateway:
hermes gateway start
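If you script this setup, appending blindly can duplicate keys on a second run. A minimal sketch of an idempotent edit, using the key names shown above (the helper name is hypothetical, and 123456 is only an example token — use your own secret):

```python
from pathlib import Path

# The three settings from the steps above. API_SERVER_KEY is an example
# value; replace it with your own secret before enabling the gateway.
SETTINGS = {
    "API_SERVER_ENABLED": "true",
    "API_SERVER_KEY": "123456",
    "GATEWAY_ALLOW_ALL_USERS": "true",
}

def enable_gateway(env_path: Path) -> None:
    """Append any missing gateway settings to the .env file, once."""
    text = env_path.read_text() if env_path.exists() else ""
    # Keys already defined in the file, regardless of their current value.
    existing = {line.split("=", 1)[0] for line in text.splitlines() if "=" in line}
    missing = [f"{k}={v}" for k, v in SETTINGS.items() if k not in existing]
    if missing:
        sep = "" if (not text or text.endswith("\n")) else "\n"
        env_path.write_text(text + sep + "\n".join(missing) + "\n")

# enable_gateway(Path.home() / ".hermes" / ".env")
```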
