Onboarding Reference

This is the full reference for openclaw onboard. For a high-level overview, see Onboarding (CLI).

  1. Existing config detection

    • If ~/.openclaw/openclaw.json exists, choose Keep / Modify / Reset.
    • Re-running onboarding does not wipe anything unless you explicitly choose Reset (or pass --reset).
    • CLI --reset defaults to config+creds+sessions; use --reset-scope full to also remove workspace.
    • If the config is invalid or contains legacy keys, the wizard stops and asks you to run openclaw doctor before continuing.
    • Reset uses trash (never rm) and offers scopes:
      • Config only
      • Config + credentials + sessions
      • Full reset (also removes workspace)
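The reset behavior above can be driven non-interactively; a sketch using the two flags documented in this step:

```shell
# Re-run onboarding; the default --reset scope clears config + credentials + sessions
openclaw onboard --reset

# Full reset: additionally removes the workspace
openclaw onboard --reset --reset-scope full
```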
  2. Model/Auth

    • Anthropic API key: uses ANTHROPIC_API_KEY if present or prompts for a key, then saves it for daemon use.
    • Anthropic Claude CLI: on macOS onboarding checks Keychain item “Claude Code-credentials” (choose “Always Allow” so launchd starts don’t block); on Linux/Windows it reuses ~/.claude/.credentials.json if present and switches model selection to claude-cli/....
    • Anthropic token (paste setup-token): run claude setup-token on any machine, then paste the token (you can name it; blank = default).
    • OpenAI Code (Codex) subscription (Codex CLI): if ~/.codex/auth.json exists, onboarding can reuse it.
    • OpenAI Code (Codex) subscription (OAuth): browser flow; paste the code#state.
      • Sets agents.defaults.model to openai-codex/gpt-5.2 when model is unset or openai/*.
    • OpenAI API key: uses OPENAI_API_KEY if present or prompts for a key, then stores it in auth profiles.
    • xAI (Grok) API key: prompts for XAI_API_KEY and configures xAI as a model provider.
    • OpenCode: prompts for OPENCODE_API_KEY (or OPENCODE_ZEN_API_KEY, get it at https://opencode.ai/auth) and lets you pick the Zen or Go catalog.
    • Ollama: prompts for the Ollama base URL, offers Cloud + Local or Local mode, discovers available models, and auto-pulls the selected local model when needed.
    • More detail: Ollama
    • API key: stores the key for you.
    • Vercel AI Gateway (multi-model proxy): prompts for AI_GATEWAY_API_KEY.
    • More detail: Vercel AI Gateway
    • Cloudflare AI Gateway: prompts for Account ID, Gateway ID, and CLOUDFLARE_AI_GATEWAY_API_KEY.
    • More detail: Cloudflare AI Gateway
    • MiniMax: config is auto-written; hosted default is MiniMax-M2.7.
    • More detail: MiniMax
    • Synthetic (Anthropic-compatible): prompts for SYNTHETIC_API_KEY.
    • More detail: Synthetic
    • Moonshot (Kimi K2): config is auto-written.
    • Kimi Coding: config is auto-written.
    • More detail: Moonshot AI (Kimi + Kimi Coding)
    • Skip: no auth configured yet.
    • Pick a default model from detected options (or enter provider/model manually). For best quality and lower prompt-injection risk, choose the strongest latest-generation model available in your provider stack.
    • Onboarding runs a model check and warns if the configured model is unknown or missing auth.
    • API key storage mode defaults to plaintext auth-profile values. Use --secret-input-mode ref to store env-backed refs instead (for example keyRef: { source: "env", provider: "default", id: "OPENAI_API_KEY" }).
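With --secret-input-mode ref, a stored auth-profile entry could look roughly like the following sketch. Only the keyRef object shape is documented above; the surrounding file structure is an assumption for illustration:

```json
{
  "openai": {
    "keyRef": { "source": "env", "provider": "default", "id": "OPENAI_API_KEY" }
  }
}
```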
    • OAuth credentials live in ~/.openclaw/credentials/oauth.json; auth profiles (API keys + OAuth) live in ~/.openclaw/agents/<agentId>/agent/auth-profiles.json.
    • More detail: /concepts/oauth

  3. Workspace

    • Default ~/.openclaw/workspace (configurable).
    • Seeds the workspace files needed for the agent bootstrap ritual.
    • Full workspace layout + backup guide: Agent workspace
  4. Gateway

    • Port, bind, auth mode, tailscale exposure.
    • Auth recommendation: keep Token even for loopback so local WS clients must authenticate.
    • In token mode, interactive setup offers:
      • Generate/store plaintext token (default)
      • Use SecretRef (opt-in)
      • Quickstart reuses existing gateway.auth.token SecretRefs across env, file, and exec providers for onboarding probe/dashboard bootstrap.
      • If that SecretRef is configured but cannot be resolved, onboarding fails early with a clear fix message instead of silently degrading runtime auth.
    • In password mode, interactive setup also supports plaintext or SecretRef storage.
    • Non-interactive token SecretRef path: --gateway-token-ref-env.
      • Requires a non-empty env var in the onboarding process environment.
      • Cannot be combined with --gateway-token.
    • Disable auth only if you fully trust every local process.
    • Non-loopback binds still require auth.
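As a sketch only: with env-backed token auth, the relevant fragment of ~/.openclaw/openclaw.json might look like the following. The gateway.auth nesting comes from field names used in this document; the exact ref object shape is an assumption modeled on the documented keyRef form:

```json
{
  "gateway": {
    "bind": "loopback",
    "auth": {
      "mode": "token",
      "token": { "source": "env", "id": "OPENCLAW_GATEWAY_TOKEN" }
    }
  }
}
```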

  5. Channels

    • WhatsApp: optional QR login.
    • Telegram: bot token.
    • Discord: bot token.
    • Google Chat: service account JSON + webhook audience.
    • Mattermost (plugin): bot token + base URL.
    • Signal: optional signal-cli install + account config.
    • BlueBubbles: recommended for iMessage; server URL + password + webhook.
    • iMessage: legacy imsg CLI path + DB access.
    • DM security: default is pairing. The first DM sends a code; approve it via `openclaw pairing approve` or use allowlists.
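For the token-based channels above, the resulting config keys (listed later under typical fields) might look like this sketch; the token values are placeholders:

```json
{
  "channels": {
    "telegram": { "botToken": "<bot-token>" },
    "discord": { "token": "<bot-token>" }
  }
}
```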

Use --non-interactive to automate or script onboarding:

Terminal window
openclaw onboard --non-interactive \
--mode local \
--auth-choice apiKey \
--anthropic-api-key "$ANTHROPIC_API_KEY" \
--gateway-port 18789 \
--gateway-bind loopback \
--install-daemon \
--daemon-runtime node \
--skip-skills

Add --json for a machine-readable summary.

Gateway token SecretRef in non-interactive mode:

Terminal window
export OPENCLAW_GATEWAY_TOKEN="your-token"
openclaw onboard --non-interactive \
--mode local \
--auth-choice skip \
--gateway-auth token \
--gateway-token-ref-env OPENCLAW_GATEWAY_TOKEN

--gateway-token and --gateway-token-ref-env are mutually exclusive.

Provider-specific command examples live in CLI Automation. Use this reference page for flag semantics and step ordering.

Terminal window
openclaw agents add work \
--workspace ~/.openclaw/workspace-work \
--model openai/gpt-5.2 \
--bind whatsapp:biz \
--non-interactive \
--json

The Gateway exposes the onboarding flow over RPC (wizard.start, wizard.next, wizard.cancel, wizard.status). Clients (macOS app, Control UI) can render steps without re-implementing onboarding logic.

Onboarding can install signal-cli from GitHub releases:

  • Downloads the appropriate release asset.
  • Stores it under ~/.openclaw/tools/signal-cli/<version>/.
  • Writes channels.signal.cliPath to your config.
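The written config entry might look like this sketch (the exact binary filename inside the version directory is an assumption):

```json
{
  "channels": {
    "signal": {
      "cliPath": "~/.openclaw/tools/signal-cli/<version>/bin/signal-cli"
    }
  }
}
```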

Notes:

  • JVM builds require Java 21.
  • Native builds are used when available.
  • Windows uses WSL2; signal-cli install follows the Linux flow inside WSL.

Typical fields in ~/.openclaw/openclaw.json:

  • agents.defaults.workspace
  • agents.defaults.model / models.providers (if MiniMax is chosen)
  • tools.profile (local onboarding defaults to "coding" when unset; existing explicit values are preserved)
  • gateway.* (mode, bind, auth, tailscale)
  • session.dmScope (behavior details: CLI Setup Reference)
  • channels.telegram.botToken, channels.discord.token, channels.matrix.*, channels.signal.*, channels.imessage.*
  • Channel allowlists (Slack/Discord/Matrix/Microsoft Teams) when you opt in during the prompts (names resolve to IDs when possible).
  • skills.install.nodeManager
  • wizard.lastRunAt
  • wizard.lastRunVersion
  • wizard.lastRunCommit
  • wizard.lastRunCommand
  • wizard.lastRunMode
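Putting those dotted keys together, a minimal sketch of ~/.openclaw/openclaw.json (values illustrative; nesting inferred from the field names above):

```json
{
  "agents": {
    "defaults": {
      "workspace": "~/.openclaw/workspace",
      "model": "openai-codex/gpt-5.2"
    }
  },
  "tools": { "profile": "coding" },
  "gateway": { "bind": "loopback" },
  "session": { "dmScope": "..." },
  "wizard": { "lastRunMode": "local" }
}
```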

openclaw agents add writes agents.list[] and optional bindings.

WhatsApp credentials go under ~/.openclaw/credentials/whatsapp/<accountId>/. Sessions are stored under ~/.openclaw/agents/<agentId>/sessions/.

Some channels are delivered as plugins. When you pick one during setup, onboarding will prompt to install it (npm or a local path) before it can be configured.