# OpenAI Agents SDK

Seamlessly integrate Infron AI with the OpenAI Agents SDK to build multi-agent workflows.

The [OpenAI Agents SDK](https://github.com/openai/openai-agents-python) is a lightweight yet powerful framework for building multi-agent workflows. The SDK is compatible with any model provider that supports the OpenAI Chat Completions API format.

This guide walks you through using the Infron AI LLM API with the OpenAI Agents SDK.

## Get Started <a href="#get-started" id="get-started"></a>

1. Set up your Python environment and install the Agents SDK.

```sh
python -m venv env
source env/bin/activate
pip install openai-agents==0.0.7
```

2. Set up your Infron AI API key.

To use Infron AI, [create an account](https://infron.ai/login) and [get your API key](https://infron.ai/dashboard/apiKeys).
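Rather than hard-coding the key into your scripts, you can load it from an environment variable. A minimal sketch (the variable name `INFRON_API_KEY` below is just a suggested convention, not one required by Infron AI):

```python
import os

def load_api_key(env_var: str = "INFRON_API_KEY") -> str:
    # Hypothetical variable name; adjust to your own convention.
    # Falls back to a placeholder so the examples still run unconfigured.
    return os.getenv(env_var, "Your_API_KEY")

print(load_api_key())
```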

## Hello world example <a href="#hello-world-example" id="hello-world-example"></a>

```python
from openai import AsyncOpenAI
from agents import (
    Agent,
    Runner,
    set_default_openai_api,
    set_default_openai_client,
    set_tracing_disabled,
)

BASE_URL = "https://llm.onerouter.pro/v1"
API_KEY = "Your_API_KEY"
MODEL_NAME = "gpt-5.1-chat"

set_default_openai_api("chat_completions")
set_default_openai_client(AsyncOpenAI(base_url=BASE_URL, api_key=API_KEY))
set_tracing_disabled(disabled=True)

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
    model=MODEL_NAME,
)

result = Runner.run_sync(
    agent, "Write a haiku about recursion in programming.")
print(result.final_output)

# Code within the code,
# Functions calling themselves,
# Infinite loop's dance.
```

## Handoffs example <a href="#handoffs-example" id="handoffs-example"></a>

```python
import asyncio
from openai import AsyncOpenAI
from agents import (
    Agent,
    Runner,
    set_default_openai_api,
    set_default_openai_client,
    set_tracing_disabled,
)

BASE_URL = "https://llm.onerouter.pro/v1"
API_KEY = "Your_API_KEY"
MODEL_NAME = "gpt-5.1-chat"

set_default_openai_api("chat_completions")
set_default_openai_client(AsyncOpenAI(base_url=BASE_URL, api_key=API_KEY))
set_tracing_disabled(disabled=True)

spanish_agent = Agent(
    name="Spanish agent",
    instructions="You only speak Spanish.",
    model=MODEL_NAME,
)

english_agent = Agent(
    name="English agent",
    instructions="You only speak English.",
    model=MODEL_NAME,
)

triage_agent = Agent(
    name="Triage agent",
    instructions="Handoff to the appropriate agent based on the language of the request.",
    handoffs=[spanish_agent, english_agent],
    model=MODEL_NAME,
)


async def main():
    result = await Runner.run(triage_agent, input="Write a haiku about recursion in programming.")
    print(result.final_output)


if __name__ == "__main__":
    asyncio.run(main())
```

## Functions example <a href="#functions-example" id="functions-example"></a>

```python
import asyncio
from openai import AsyncOpenAI
from agents import (
    Agent,
    Runner,
    set_default_openai_api,
    set_default_openai_client,
    set_tracing_disabled,
    function_tool,
)

BASE_URL = "https://llm.onerouter.pro/v1"
API_KEY = "Your_API_KEY"
MODEL_NAME = "gpt-5.1-chat"

set_default_openai_api("chat_completions")
set_default_openai_client(AsyncOpenAI(base_url=BASE_URL, api_key=API_KEY))
set_tracing_disabled(disabled=True)

@function_tool
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny."

agent = Agent(
    name="Hello world",
    instructions="You are a helpful agent.",
    tools=[get_weather],
    model=MODEL_NAME,
)

async def main():
    result = await Runner.run(agent, input="What's the weather in Tokyo?")
    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())
```
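When the model decides to use a tool, it returns the tool's arguments as a JSON string, and the runner parses them and invokes the matching Python function. The snippet below is a simplified illustration of that dispatch, not the SDK's actual internals:

```python
import json

def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny."

# The model emits arguments as a JSON string; the runner parses them
# and calls the tool with keyword arguments.
raw_arguments = '{"city": "Tokyo"}'
print(get_weather(**json.loads(raw_arguments)))  # The weather in Tokyo is sunny.
```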
