# Function Calling (/docs/function-calling)


## Overview [#overview]

Function calling lets models generate structured function calls that your application can execute. This is essential for building agents, data retrieval systems, and interactive applications.

## Defining Tools [#defining-tools]

Pass tools in the `tools` array with JSON Schema descriptions:

```json
{
  "model": "model-id",
  "messages": [{"role": "user", "content": "What's the weather in Tokyo?"}],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "City name, e.g., Tokyo"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"],
              "description": "Temperature unit"
            }
          },
          "required": ["location"]
        }
      }
    }
  ]
}
```

## Tool Call Flow [#tool-call-flow]

The function calling workflow follows these steps:

1. **Send request** with tool definitions
2. **Model returns** a `tool_calls` array in the assistant message (with `finish_reason: "tool_calls"`) instead of text content
3. **Your app executes** the function with the provided arguments
4. **Send results back** with `role: "tool"` messages
5. **Model generates** the final response
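When the model decides to call a tool, the assistant message carries the call rather than text. A truncated response for the request above might look like the following (the `id` value will differ; note that `arguments` is a JSON-encoded string, not a nested object):

```json
{
  "choices": [{
    "finish_reason": "tool_calls",
    "message": {
      "role": "assistant",
      "content": null,
      "tool_calls": [{
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "get_weather",
          "arguments": "{\"location\": \"Tokyo\"}"
        }
      }]
    }
  }]
}
```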

## Complete Example [#complete-example]

```python
import json
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.yuhuanstudio.com/v1"
)

# Step 1: Define tools
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
                },
                "required": ["location"]
            }
        }
    }
]

# Step 2: Send initial request
messages = [{"role": "user", "content": "What's the weather in Tokyo and London?"}]
response = client.chat.completions.create(
    model="model-id",
    messages=messages,
    tools=tools
)

# Step 3: Process tool calls
assistant_message = response.choices[0].message
messages.append(assistant_message)

for tool_call in assistant_message.tool_calls:
    args = json.loads(tool_call.function.arguments)

    # Execute your function here; this result is stubbed for brevity.
    # A real implementation would use args["location"] and args.get("unit").
    result = {"temperature": 22, "unit": "celsius", "condition": "sunny"}

    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": json.dumps(result)
    })

# Step 4: Get final response
final_response = client.chat.completions.create(
    model="model-id",
    messages=messages,
    tools=tools
)

print(final_response.choices[0].message.content)
```
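The stubbed `result` above stands in for real execution. One common pattern is a registry mapping tool names to local functions, with defensive parsing of the model-supplied arguments. This is a sketch, not part of the API: `TOOL_REGISTRY`, `execute_tool_call`, and the stubbed `get_weather` are illustrative names.

```python
import json

def get_weather(location, unit="celsius"):
    # Stubbed lookup; a real app would call a weather service here.
    return {"location": location, "temperature": 22, "unit": unit, "condition": "sunny"}

# Map tool names (as declared in `tools`) to local implementations.
TOOL_REGISTRY = {"get_weather": get_weather}

def execute_tool_call(name, arguments_json):
    """Dispatch a model tool call to the matching local function."""
    func = TOOL_REGISTRY.get(name)
    if func is None:
        return {"error": f"unknown tool: {name}"}
    try:
        args = json.loads(arguments_json)
    except json.JSONDecodeError:
        return {"error": "malformed arguments"}
    return func(**args)
```

Inside the loop, you would then build each `role: "tool"` message from `execute_tool_call(tool_call.function.name, tool_call.function.arguments)`, so unknown tools or malformed arguments produce an error payload the model can recover from instead of crashing your app.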

## Tool Choice [#tool-choice]

Control how the model selects tools:

| Value                                               | Behavior                                      |
| --------------------------------------------------- | --------------------------------------------- |
| `"auto"`                                            | Model decides whether to call tools (default) |
| `"none"`                                            | Model will not call any tools                 |
| `"required"`                                        | Model must call at least one tool             |
| `{"type": "function", "function": {"name": "..."}}` | Force a specific tool                         |

```python
response = client.chat.completions.create(
    model="model-id",
    messages=messages,
    tools=tools,
    tool_choice="required"  # Must use a tool
)
```
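To force one specific tool instead, pass an object naming the function. A small helper (hypothetical, shown only to keep the payload shape in one place) can build it:

```python
def force_tool(name):
    """Build a tool_choice payload that forces the named function to be called."""
    return {"type": "function", "function": {"name": name}}

# Then pass it in the request, e.g.:
# client.chat.completions.create(model="model-id", messages=messages,
#                                tools=tools, tool_choice=force_tool("get_weather"))
```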

## Parallel Tool Calls [#parallel-tool-calls]

Models can request multiple tool calls simultaneously:

```json
{
  "choices": [{
    "message": {
      "tool_calls": [
        {"id": "call_1", "function": {"name": "get_weather", "arguments": "{\"location\":\"Tokyo\"}"}},
        {"id": "call_2", "function": {"name": "get_weather", "arguments": "{\"location\":\"London\"}"}}
      ]
    }
  }]
}
```

To disable parallel calls:

```python
response = client.chat.completions.create(
    model="model-id",
    messages=messages,
    tools=tools,
    parallel_tool_calls=False
)
```

## Models with Function Calling [#models-with-function-calling]

<Callout>
  Not all models support function calling. Use the `GET /v1/models` endpoint and check for the `function_calling` capability to find models that support tools.
</Callout>
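As a sketch, you could filter the `GET /v1/models` response client-side. The response shape assumed below (a `capabilities` list on each model entry) is an assumption for illustration; check the models endpoint reference for the actual field names:

```python
def models_with_function_calling(models_payload):
    """Return ids of models advertising the function_calling capability.

    Assumes the /v1/models response looks like
    {"data": [{"id": "...", "capabilities": ["..."]}, ...]};
    adjust the field names to the real schema.
    """
    return [
        model["id"]
        for model in models_payload.get("data", [])
        if "function_calling" in model.get("capabilities", [])
    ]
```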
