# CLI Setup Reference

This page is the full reference for `openclaw onboard`.
For the short guide, see Onboarding (CLI).

## What the wizard does

Local mode (default) walks you through:
- Model and auth setup (OpenAI Code subscription OAuth, Anthropic API key or setup token, plus MiniMax, GLM, Ollama, Moonshot, and AI Gateway options)
- Workspace location and bootstrap files
- Gateway settings (port, bind, auth, tailscale)
- Channels and providers (Telegram, WhatsApp, Discord, Google Chat, Mattermost plugin, Signal)
- Daemon install (LaunchAgent or systemd user unit)
- Health check
- Skills setup
Remote mode configures this machine to connect to a gateway elsewhere. It does not install or modify anything on the remote host.
## Local flow details

### Existing config detection

- If `~/.openclaw/openclaw.json` exists, choose Keep, Modify, or Reset.
- Re-running the wizard does not wipe anything unless you explicitly choose Reset (or pass `--reset`).
- CLI `--reset` defaults to `config+creds+sessions`; use `--reset-scope full` to also remove the workspace.
- If the config is invalid or contains legacy keys, the wizard stops and asks you to run `openclaw doctor` before continuing.
- Reset uses `trash` and offers scopes:
  - Config only
  - Config + credentials + sessions
  - Full reset (also removes workspace)
### Model and auth

- The full option matrix is in Auth and model options.
### Workspace

- Default: `~/.openclaw/workspace` (configurable).
- Seeds the workspace files needed for the first-run bootstrap ritual.
- Workspace layout: Agent workspace.
### Gateway

- Prompts for port, bind, auth mode, and Tailscale exposure.
- Recommended: keep token auth enabled even for loopback so local WS clients must authenticate.
- In token mode, interactive setup offers:
  - Generate/store plaintext token (default)
  - Use SecretRef (opt-in)
- In password mode, interactive setup also supports plaintext or SecretRef storage.
- Non-interactive token SecretRef path: `--gateway-token-ref-env`.
  - Requires a non-empty env var in the onboarding process environment.
  - Cannot be combined with `--gateway-token`.
- Disable auth only if you fully trust every local process.
- Non-loopback binds still require auth.

### Channels
- WhatsApp: optional QR login
- Telegram: bot token
- Discord: bot token
- Google Chat: service account JSON + webhook audience
- Mattermost plugin: bot token + base URL
- Signal: optional `signal-cli` install + account config
- BlueBubbles: recommended for iMessage; server URL + password + webhook
- iMessage: legacy `imsg` CLI path + DB access
- DM security: default is pairing. The first DM sends a code; approve it via `openclaw pairing approve` or use allowlists.
## Remote mode details

Remote mode configures this machine to connect to a gateway elsewhere.
What you set:

- Remote gateway URL (`ws://...`)
- Token, if the remote gateway requires auth (recommended)
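As a rough illustration, a remote-mode result could be captured in config along these lines (the key names under `gateway` in this sketch are assumptions; only the `ws://` URL and token auth come from this page):

```json
{
  "gateway": {
    "mode": "remote",
    "remote": {
      "url": "ws://gateway-host.example:8765",
      "token": "your-gateway-token"
    }
  }
}
```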
## Auth and model options

### Anthropic API key

Uses `ANTHROPIC_API_KEY` if present, or prompts for a key, then saves it for daemon use.
### Anthropic Claude CLI

Reuses a local Claude CLI login on the gateway host and switches model selection to `claude-cli/...`.

- macOS: checks the Keychain item “Claude Code-credentials”
- Linux and Windows: reuses `~/.claude/.credentials.json` if present

On macOS, choose “Always Allow” so launchd starts do not block.
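The Linux/Windows reuse check above amounts to a file-existence test. A minimal shell sketch of the idea (an illustration, not the wizard's actual code; the temporary `HOME` is only so both branches can be shown):

```shell
#!/bin/sh
# Sketch: the wizard reuses ~/.claude/.credentials.json if it exists.
HOME=$(mktemp -d)   # isolated HOME so both branches are demonstrable

check_claude_creds() {
  if [ -f "$HOME/.claude/.credentials.json" ]; then
    echo "reusable"
  else
    echo "absent"
  fi
}

check_claude_creds                        # prints "absent"
mkdir -p "$HOME/.claude"
: > "$HOME/.claude/.credentials.json"
check_claude_creds                        # prints "reusable"
```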
### Anthropic token (setup-token paste)

Run `claude setup-token` on any machine, then paste the token.
You can name it; a blank name uses the default.
### OpenAI Code subscription (Codex CLI reuse)

If `~/.codex/auth.json` exists, the wizard can reuse it.
### OpenAI Code subscription (OAuth)

Browser flow; paste the `code#state` value.
Sets `agents.defaults.model` to `openai-codex/gpt-5.4` when the model is unset or `openai/*`.
### OpenAI API key

Uses `OPENAI_API_KEY` if present, or prompts for a key, then stores the credential in auth profiles.
Sets `agents.defaults.model` to `openai/gpt-5.4` when the model is unset, `openai/*`, or `openai-codex/*`.
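For example, after the OpenAI API key flow completes with the model previously unset, the default-model field described above would read (a minimal sketch using only keys named on this page):

```json
{
  "agents": {
    "defaults": {
      "model": "openai/gpt-5.4"
    }
  }
}
```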
### xAI (Grok) API key

Prompts for `XAI_API_KEY` and configures xAI as a model provider.

### OpenCode

Prompts for `OPENCODE_API_KEY` (or `OPENCODE_ZEN_API_KEY`) and lets you choose the Zen or Go catalog.
Setup URL: opencode.ai/auth.

### API key (generic)

Stores the key for you.
### Vercel AI Gateway

Prompts for `AI_GATEWAY_API_KEY`.
More detail: Vercel AI Gateway.

### Cloudflare AI Gateway

Prompts for the account ID, gateway ID, and `CLOUDFLARE_AI_GATEWAY_API_KEY`.
More detail: Cloudflare AI Gateway.

### MiniMax

Config is auto-written. The hosted default is MiniMax-M2.7.
More detail: MiniMax.

### Synthetic (Anthropic-compatible)

Prompts for `SYNTHETIC_API_KEY`.
More detail: Synthetic.
### Ollama (Cloud and local open models)

Prompts for a base URL (default `http://127.0.0.1:11434`), then offers Cloud + Local or Local mode.
Discovers available models and suggests defaults.
More detail: Ollama.

### Moonshot and Kimi Coding

Moonshot (Kimi K2) and Kimi Coding configs are auto-written. More detail: Moonshot AI (Kimi + Kimi Coding).
### Custom provider

Works with OpenAI-compatible and Anthropic-compatible endpoints.
Interactive onboarding supports the same API key storage choices as other provider API key flows:

- Paste the API key now (plaintext)
- Use a secret reference (env ref or configured provider ref, with preflight validation)

Non-interactive flags:

- `--auth-choice custom-api-key`
- `--custom-base-url`
- `--custom-model-id`
- `--custom-api-key` (optional; falls back to `CUSTOM_API_KEY`)
- `--custom-provider-id` (optional)
- `--custom-compatibility` (optional; default `openai`)
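As a sketch of what non-interactive `ref` mode writes for a custom provider (the `apiKey` env-ref shape is the one documented under credential storage mode below; the provider id `my-llm` and the `baseUrl` key name are assumptions for illustration):

```json
{
  "models": {
    "providers": {
      "my-llm": {
        "baseUrl": "https://llm.example.com/v1",
        "apiKey": { "source": "env", "provider": "default", "id": "CUSTOM_API_KEY" }
      }
    }
  }
}
```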
### Skip

Leaves auth unconfigured.
Model behavior:

- Pick the default model from the detected options, or enter a provider and model manually.
- The wizard runs a model check and warns if the configured model is unknown or missing auth.
Credential and profile paths:

- OAuth credentials: `~/.openclaw/credentials/oauth.json`
- Auth profiles (API keys + OAuth): `~/.openclaw/agents/<agentId>/agent/auth-profiles.json`
Credential storage mode:

- Default onboarding behavior persists API keys as plaintext values in auth profiles.
- `--secret-input-mode ref` enables reference mode instead of plaintext key storage. In interactive setup, you can choose either:
  - an environment variable ref (for example `keyRef: { source: "env", provider: "default", id: "OPENAI_API_KEY" }`)
  - a configured provider ref (`file` or `exec`) with a provider alias + id
- Interactive reference mode runs a fast preflight validation before saving:
  - Env refs: validates the variable name + a non-empty value in the current onboarding environment.
  - Provider refs: validates the provider config and resolves the requested id.
  - If preflight fails, onboarding shows the error and lets you retry.
- In non-interactive mode, `--secret-input-mode ref` is env-backed only:
  - Set the provider env var in the onboarding process environment.
  - Inline key flags (for example `--openai-api-key`) require that env var to be set; otherwise onboarding fails fast.
  - For custom providers, non-interactive `ref` mode stores `models.providers.<id>.apiKey` as `{ source: "env", provider: "default", id: "CUSTOM_API_KEY" }`.
  - In that custom-provider case, `--custom-api-key` requires `CUSTOM_API_KEY` to be set; otherwise onboarding fails fast.
- Gateway auth credentials support plaintext and SecretRef choices in interactive setup:
  - Token mode: generate/store a plaintext token (default), or use a SecretRef.
  - Password mode: plaintext or SecretRef.
- Non-interactive token SecretRef path: `--gateway-token-ref-env <ENV_VAR>`.
- Existing plaintext setups continue to work unchanged.
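The env-ref preflight described above boils down to a simple check: the named variable must be set and non-empty in the current onboarding environment. A minimal shell sketch of the idea (an illustration, not the wizard's implementation):

```shell
#!/bin/sh
# Sketch: validate an env-backed secret reference before saving it.
check_env_ref() {
  name="$1"
  eval "value=\${$name:-}"
  if [ -z "$value" ]; then
    echo "preflight failed: $name is unset or empty"
    return 1
  fi
  echo "preflight ok: $name"
}

OPENAI_API_KEY="sk-demo-not-a-real-key"   # stand-in value for the demo
check_env_ref OPENAI_API_KEY              # prints "preflight ok: OPENAI_API_KEY"
check_env_ref MISSING_VAR || true         # prints "preflight failed: MISSING_VAR is unset or empty"
```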
## Outputs and internals

Typical fields in `~/.openclaw/openclaw.json`:

- `agents.defaults.workspace`
- `agents.defaults.model` / `models.providers` (if MiniMax chosen)
- `tools.profile` (local onboarding defaults to `"coding"` when unset; existing explicit values are preserved)
- `gateway.*` (mode, bind, auth, tailscale)
- `session.dmScope` (local onboarding defaults this to `per-channel-peer` when unset; existing explicit values are preserved)
- `channels.telegram.botToken`, `channels.discord.token`, `channels.matrix.*`, `channels.signal.*`, `channels.imessage.*`
- Channel allowlists (Slack, Discord, Matrix, Microsoft Teams) when you opt in during prompts (names resolve to IDs when possible)
- `skills.install.nodeManager`
- `wizard.lastRunAt`, `wizard.lastRunVersion`, `wizard.lastRunCommit`, `wizard.lastRunCommand`, `wizard.lastRunMode`
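Put together, a freshly onboarded config might look roughly like this (the nesting, sub-keys such as `bind`/`port`, and all values are illustrative assumptions; only the dotted key paths come from the list above):

```json
{
  "agents": {
    "defaults": {
      "workspace": "~/.openclaw/workspace",
      "model": "openai/gpt-5.4"
    }
  },
  "tools": { "profile": "coding" },
  "session": { "dmScope": "per-channel-peer" },
  "gateway": { "mode": "local", "bind": "127.0.0.1", "port": 8765, "auth": { "mode": "token" } },
  "channels": {
    "telegram": { "botToken": "123456:telegram-bot-token" }
  },
  "skills": { "install": { "nodeManager": "npm" } },
  "wizard": { "lastRunAt": "2026-01-01T00:00:00Z", "lastRunMode": "local" }
}
```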
`openclaw agents add` writes `agents.list[]` and optional bindings.
WhatsApp credentials go under `~/.openclaw/credentials/whatsapp/<accountId>/`.
Sessions are stored under `~/.openclaw/agents/<agentId>/sessions/`.
Gateway wizard RPC methods:

- `wizard.start`
- `wizard.next`
- `wizard.cancel`
- `wizard.status`

Clients (the macOS app and Control UI) can render steps without re-implementing onboarding logic.
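A client driving the wizard might exchange messages shaped roughly like the following (the envelope and parameter names are guesses for illustration; only the method names come from the list above):

```json
{ "method": "wizard.next", "params": { "stepId": "gateway", "answer": "token" } }
```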
Signal setup behavior:

- Downloads the appropriate release asset
- Stores it under `~/.openclaw/tools/signal-cli/<version>/`
- Writes `channels.signal.cliPath` in config
- JVM builds require Java 21
- Native builds are used when available
- Windows uses WSL2 and follows the Linux signal-cli flow inside WSL
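The resulting config entry might look like this (the `bin/signal-cli` suffix is an illustrative assumption, and `<version>` is a placeholder; only `channels.signal.cliPath` and the tools directory come from this page):

```json
{
  "channels": {
    "signal": {
      "cliPath": "~/.openclaw/tools/signal-cli/<version>/bin/signal-cli"
    }
  }
}
```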
## Related docs

- Onboarding hub: Onboarding (CLI)
- Automation and scripts: CLI Automation
- Command reference: `openclaw onboard`