# OpenWork

[**OpenWork**](https://openworklabs.com/docs/start-here/get-started) is an AI workspace built on OpenCode primitives that helps individuals and teams connect, manage, and customize their AI stack inside a project. It supports flexible model integration through workspace-level configuration, allowing users to add custom or managed LLM providers, work with local or hosted models, and extend capabilities through reusable skills. This makes OpenWork a practical platform for organizing and scaling AI-powered workflows across different projects and teams.

### Account & API Key Setup

To start using Infron, [create an account](https://infron.ai/login) and [get your API key](https://infron.ai/dashboard/apiKeys).
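OpenCode reads the key from your environment, so export it in your shell (or shell profile) before launching. The key value below is a placeholder; use the one from your Infron dashboard:

```shell
# Make the Infron API key available via the INFRON_API_KEY environment
# variable, which the opencode.json provider config references.
# Placeholder value -- substitute your real key from the dashboard.
export INFRON_API_KEY="sk-example-0000"

# Quick sanity check that the variable is set and non-empty:
test -n "$INFRON_API_KEY" && echo "INFRON_API_KEY is set"
```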

### Setup Guide <a href="#openrouter-setup" id="openrouter-setup"></a>

Because `OpenWork` is built on `OpenCode` primitives, it supports everything you can modify in `opencode.json`, such as adding a [model](https://opencode.ai/docs/models/).

You can find the corresponding model names [here](https://infron.ai/models) and configure them in the config file shown below. You can also configure multiple models at once.

Inside `~/.config/opencode/opencode.json`:

{% tabs %}
{% tab title="Example" %}

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "infron/anthropic/claude-sonnet-4.6",
  "provider": {
    "infron": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Infron",
      "options": {
        "baseURL": "https://llm.onerouter.pro/v1",
        "apiKey": "{env:INFRON_API_KEY}"
      },
      "models": {
        "anthropic/claude-opus-4.7": {
          "name": "Claude Opus 4.7"
        },
        "anthropic/claude-sonnet-4.6": {
          "name": "Claude Sonnet 4.6"
        },
        "anthropic/claude-haiku-4.5": {
          "name": "Claude Haiku 4.5"
        },
        "z-ai/glm-5": {
          "name": "GLM 5"
        },
        "z-ai/glm-5.1": {
          "name": "GLM 5.1"
        },
        "z-ai/glm-4.7": {
          "name": "GLM 4.7"
        },
        "moonshotai/kimi-k2.5": {
          "name": "Kimi K2.5"
        },
        "moonshotai/kimi-k2-thinking": {
          "name": "Kimi K2 Thinking"
        },
        "qwen/qwen3.6-plus": {
          "name": "Qwen 3.6 Plus"
        },
        "qwen/qwen3.5-plus": {
          "name": "Qwen 3.5 Plus"
        },
        "qwen/qwen3-max": {
          "name": "Qwen3 Max"
        },
        "qwen/qwen3-coder-plus": {
          "name": "Qwen3 Coder Plus"
        }
      }
    }
  }
}
```

{% endtab %}
{% endtabs %}
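Before pointing OpenCode at a config of this shape, you can sanity-check it by writing a trimmed-down copy to a scratch path and reading back the default model with `jq` (a standalone sketch; `/tmp/opencode-example.json` is a throwaway path, not the real config location):

```shell
# Write a minimal version of the config above to a scratch file so we
# don't touch the real ~/.config/opencode/opencode.json.
cat > /tmp/opencode-example.json <<'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "infron/anthropic/claude-sonnet-4.6",
  "provider": {
    "infron": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Infron",
      "options": {
        "baseURL": "https://llm.onerouter.pro/v1",
        "apiKey": "{env:INFRON_API_KEY}"
      },
      "models": {
        "anthropic/claude-sonnet-4.6": { "name": "Claude Sonnet 4.6" }
      }
    }
  }
}
EOF

# Confirm the JSON is valid and see which default model OpenCode would pick up:
jq -r '.model' /tmp/opencode-example.json
# → infron/anthropic/claude-sonnet-4.6
```

If `jq` reports a parse error here, the same file would also fail to load in OpenCode, so this is a cheap way to catch stray commas before restarting the workspace.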

<figure><img src="https://3822312837-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FZ9C9AjT7j46HAcQrOVWw%2Fuploads%2FvYfI4IfpGUjnjiUL4mHM%2FScreenshot%202026-04-17%20at%2019.33.58.png?alt=media&#x26;token=b12ce9b0-e1cd-42df-929f-7d68a6751554" alt=""><figcaption></figcaption></figure>

<figure><img src="https://3822312837-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FZ9C9AjT7j46HAcQrOVWw%2Fuploads%2Fg8akkg9IJFCy2vUmi55I%2FScreenshot%202026-04-17%20at%2019.33.02.png?alt=media&#x26;token=fabba4df-6d60-4db4-b5d1-fc1c44779792" alt=""><figcaption></figcaption></figure>


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://infronai.gitbook.io/docs/frameworks-and-integrations/openwork.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
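The request above can be sketched with `curl`; the question text is illustrative, and `jq` is used here only to percent-encode it for the query string:

```shell
# Build the ask URL with a percent-encoded question.
QUESTION="Which models does the Infron provider expose?"
ENCODED=$(printf '%s' "$QUESTION" | jq -sRr '@uri')
URL="https://infronai.gitbook.io/docs/frameworks-and-integrations/openwork.md?ask=${ENCODED}"
echo "$URL"

# Fetch the answer (uncomment to query the live documentation):
# curl -s "$URL"
```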
