# Function Calling

Enable models to call functions and use tools for dynamic interactions.
## Overview
Function calling lets models generate structured function calls that your application can execute. This is essential for building agents, data retrieval systems, and interactive applications.
## Defining Tools

Pass tools in the `tools` array with JSON Schema descriptions:
```json
{
  "model": "model-id",
  "messages": [{"role": "user", "content": "What's the weather in Tokyo?"}],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "City name, e.g., Tokyo"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"],
              "description": "Temperature unit"
            }
          },
          "required": ["location"]
        }
      }
    }
  ]
}
```

## Tool Call Flow
The function calling workflow follows these steps:

1. Send a request with tool definitions
2. The model returns a `tool_calls` array (instead of content)
3. Your app executes the function with the provided arguments
4. Send the results back as `role: "tool"` messages
5. The model generates the final response
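When the model decides to call a tool, the assistant message carries the calls instead of text content. An illustrative response shape (the ids and argument values here are examples):

```json
{
  "choices": [{
    "finish_reason": "tool_calls",
    "message": {
      "role": "assistant",
      "content": null,
      "tool_calls": [{
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "get_weather",
          "arguments": "{\"location\": \"Tokyo\", \"unit\": \"celsius\"}"
        }
      }]
    }
  }]
}
```

Note that `arguments` is a JSON-encoded string, not an object, so it must be parsed before your function is invoked.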
## Complete Example
```python
import json
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.yuhuanstudio.com/v1"
)

# Step 1: Define tools
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
                },
                "required": ["location"]
            }
        }
    }
]

# Step 2: Send initial request
messages = [{"role": "user", "content": "What's the weather in Tokyo and London?"}]
response = client.chat.completions.create(
    model="model-id",
    messages=messages,
    tools=tools
)

# Step 3: Process tool calls
assistant_message = response.choices[0].message
messages.append(assistant_message)

for tool_call in assistant_message.tool_calls:
    args = json.loads(tool_call.function.arguments)
    # Execute your function here; this result is a stand-in
    result = {"temperature": 22, "unit": "celsius", "condition": "sunny"}
    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": json.dumps(result)
    })

# Step 4: Get final response
final_response = client.chat.completions.create(
    model="model-id",
    messages=messages,
    tools=tools
)
print(final_response.choices[0].message.content)
```

## Tool Choice
Control how the model selects tools:

| Value | Behavior |
|---|---|
| `"auto"` | Model decides whether to call tools (default) |
| `"none"` | Model will not call any tools |
| `"required"` | Model must call at least one tool |
| `{"type": "function", "function": {"name": "..."}}` | Force a specific tool |
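For example, to force a call to the `get_weather` tool defined earlier, pin `tool_choice` in the request body (illustrative fragment):

```json
{
  "model": "model-id",
  "messages": [{"role": "user", "content": "What's the weather in Tokyo?"}],
  "tools": [...],
  "tool_choice": {"type": "function", "function": {"name": "get_weather"}}
}
```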
```python
response = client.chat.completions.create(
    model="model-id",
    messages=messages,
    tools=tools,
    tool_choice="required"  # Must use a tool
)
```

## Parallel Tool Calls
Models can request multiple tool calls simultaneously:

```json
{
  "choices": [{
    "message": {
      "tool_calls": [
        {"id": "call_1", "function": {"name": "get_weather", "arguments": "{\"location\":\"Tokyo\"}"}},
        {"id": "call_2", "function": {"name": "get_weather", "arguments": "{\"location\":\"London\"}"}}
      ]
    }
  }]
}
```

To disable parallel calls:
```python
response = client.chat.completions.create(
    model="model-id",
    messages=messages,
    tools=tools,
    parallel_tool_calls=False
)
```

## Models with Function Calling
Not all models support function calling. Use the `GET /v1/models` endpoint and check for the `function_calling` capability to find models that support tools.
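As a sketch, the capability check might look like the following. The exact shape of each model entry is an assumption here (a `capabilities` list per model); consult the actual `GET /v1/models` response for the real field names.

```python
# Hypothetical sketch: filter a /v1/models response for tool-capable models.
# The "capabilities" field below is an assumed shape, not a documented one.

def models_with_function_calling(models_response: dict) -> list[str]:
    """Return ids of models whose entry lists the function_calling capability."""
    return [
        entry["id"]
        for entry in models_response.get("data", [])
        if "function_calling" in entry.get("capabilities", [])
    ]

# Example payload in the assumed shape:
sample = {
    "data": [
        {"id": "model-a", "capabilities": ["function_calling"]},
        {"id": "model-b", "capabilities": []},
    ]
}
print(models_with_function_calling(sample))  # -> ['model-a']
```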