# OpenClaw

OpenClaw (formerly Moltbot, originally Clawdbot) is a powerful AI messaging gateway that connects multiple messaging platforms (WhatsApp, Telegram, Discord, Slack, Signal, iMessage, and more) to AI models. By integrating with Infron, you can access a wide range of models, including GPT-5.2, Claude Sonnet 4.5, Gemini 3, and DeepSeek.

### Account & API Key Setup

The first step to start using Infron is to [create an account](https://infron.ai/login) and [get your API key](https://infron.ai/dashboard/apiKeys).

### Setup Guide <a href="#openrouter-setup" id="openrouter-setup"></a>

Infron powers model access via its [OpenAI-compatible API](https://app.gitbook.com/s/oWo5LeOZTLqTSLX7mP7F/openai-compatible-api).
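Because the API is OpenAI-compatible, you can verify your key independently of OpenClaw with any standard HTTP client. The sketch below is an assumption-laden example, not official Infron client code: the base URL and model id come from the provider config in Step 2, and the request/response shape follows the standard OpenAI chat-completions format.

```python
import json
from urllib import request

# Base URL from the Step 2 provider config; replace <API_KEY> with your key.
BASE_URL = "https://llm.onerouter.pro/v1"

def build_chat_request(api_key: str, model: str, prompt: str):
    """Build a standard OpenAI-style chat completion request (url, headers, payload)."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, payload

def send_chat(api_key: str, model: str, prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    url, headers, payload = build_chat_request(api_key, model, prompt)
    req = request.Request(url, data=json.dumps(payload).encode(), headers=headers)
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

If `send_chat("<API_KEY>", "openai/gpt-5.2", "ping")` returns text rather than an HTTP 401, your key is valid and the OpenClaw configuration below should work unchanged.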

#### Step 1: Install OpenClaw

Install OpenClaw globally via npm:

```bash
npm install -g openclaw@latest
```

Or run the onboarding wizard to set up OpenClaw:

```bash
openclaw onboard --install-daemon
```

#### Step 2: Configure Infron Provider

Add the Infron provider configuration to your `~/.openclaw/openclaw.json` file:

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "infron": {
        "baseUrl": "https://llm.onerouter.pro/v1",
        "apiKey": "<API_KEY>",
        "api": "openai-completions",
        "models": [
          {
            "id": "deepseek/deepseek-v3.2",
            "name": "DeepSeek Chat via Infron",
            "reasoning": false,
            "input": ["text"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 64000,
            "maxTokens": 8192
          },
          {
            "id": "openai/gpt-5.2",
            "name": "GPT-5.2 via Infron",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "google/gemini-3-pro-preview",
            "name": "Gemini 3 Pro via Infron",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "anthropic/claude-sonnet-4.5",
            "name": "Claude Sonnet 4.5 via Infron",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "openai/gpt-5.2"
      },
      "models": {
        "deepseek/deepseek-v3.2": {},
        "openai/gpt-5.2": {},
        "google/gemini-3-pro-preview": {},
        "anthropic/claude-sonnet-4.5": {}
      }
    }
  }
}
```
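A single JSON typo in this file is the most common reason a provider silently fails to load. The following hedged Python sketch (not part of OpenClaw; the file path and key names are taken from the example config above) parses the file and flags the obvious mistakes:

```python
import json
from pathlib import Path

# Config path from this guide; adjust if yours lives elsewhere.
CONFIG_PATH = Path.home() / ".openclaw" / "openclaw.json"

def check_config(cfg: dict) -> list[str]:
    """Return a list of problems found in the Infron provider config (empty = OK)."""
    problems = []
    provider = cfg.get("models", {}).get("providers", {}).get("infron")
    if provider is None:
        problems.append("missing models.providers.infron")
        return problems
    if provider.get("apiKey") in (None, "", "<API_KEY>"):
        problems.append("apiKey is unset or still the placeholder")
    for model in provider.get("models", []):
        if "id" not in model:
            problems.append("a model entry has no 'id'")
    return problems

if __name__ == "__main__" and CONFIG_PATH.exists():
    # json.loads raises a descriptive error on malformed JSON.
    for line in check_config(json.loads(CONFIG_PATH.read_text())) or ["config looks OK"]:
        print(line)
```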

#### Step 3: Add More Models (Optional)

You can add more models to the `models` array. Check the [model list](https://infron.ai/models) for available models and their capabilities.
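Each new entry follows the same shape as the ones in Step 2. The fragment below is a template, not a working entry: the `id` is a placeholder you must replace with a real id from the model list, and `contextWindow`/`maxTokens` should match that model's documented limits.

```json
{
  "id": "<provider>/<model-id>",
  "name": "My Model via Infron",
  "reasoning": false,
  "input": ["text"],
  "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
  "contextWindow": 128000,
  "maxTokens": 8192
}
```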

#### Step 4: Verify the Configuration

List the available models:

```bash
openclaw models list
```

You should see your configured models:

```
Model                                      Input      Ctx      Local Auth  Tags
deepseek/deepseek-v3.2                     text       63k      no    yes   configured
openai/gpt-5.2                             text+image 195k     no    yes   default,configured
google/gemini-3-pro-preview                text+image 195k     no    yes   configured
anthropic/claude-sonnet-4.5                text+image 195k     no    yes   configured
```

#### Step 5: Set the Default Model

Set your preferred default model:

```bash
openclaw models set openai/gpt-5.2
```

### Use Cases

Once configured, you can use Infron models in various ways:

#### Via CLI Agent

```bash
# Run a quick agent command
openclaw agent --local --agent main --message "Explain quantum computing in simple terms"
```

#### Via Messaging Channels

Configure your messaging channels (WhatsApp, Telegram, Discord, etc.) and the gateway will automatically use your configured model:

```bash
# Start the gateway
openclaw gateway run

# Check channel status
openclaw channels status
```

#### Switching Models

You can switch models at any time:

```bash
# Set a different default model
openclaw models set anthropic/claude-sonnet-4.5

# Or override the default model for a single run
openclaw agent --local --agent main --model deepseek/deepseek-v3.2 --message "Hello"
```


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://infronai.gitbook.io/docs/frameworks-and-integrations/openclaw.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
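The query above can be built programmatically. This minimal Python sketch assumes only the page URL shown above and uses the standard library's `urlencode` to escape the question:

```python
from urllib.parse import urlencode

# Page URL from the GET example above.
PAGE_URL = "https://infronai.gitbook.io/docs/frameworks-and-integrations/openclaw.md"

def ask_url(question: str) -> str:
    """Build the documentation-query URL with the question URL-encoded."""
    return f"{PAGE_URL}?{urlencode({'ask': question})}"
```

For example, `ask_url("How do I add a model?")` yields a URL you can fetch with any HTTP client; the response contains a direct answer plus relevant excerpts.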
