# Responses API (/docs/responses-api)


## Overview [#overview]

The Responses API is an advanced endpoint that follows the OpenAI Responses API specification. Unlike Chat Completions, it supports built-in tools, structured outputs, and agent-like reasoning workflows natively.

<Callout>
  Yunxin is one of the few API gateways that fully implements the Responses API — a critical milestone for agent-oriented applications.
</Callout>

## Endpoint [#endpoint]

```
POST /v1/responses
```

## Request Body [#request-body]

```json
{
  "model": "model-id",
  "input": "What is the weather in San Francisco?",
  "tools": [
    {
      "type": "web_search_preview"
    }
  ]
}
```

## Parameters [#parameters]

| Parameter              | Type         | Required | Description                                            |
| ---------------------- | ------------ | -------- | ------------------------------------------------------ |
| `model`                | string       | Yes      | Model ID to use                                        |
| `input`                | string/array | Yes      | Input text or array of input items                     |
| `instructions`         | string       | No       | System instructions for the model                      |
| `previous_response_id` | string       | No       | ID of a previous response to continue the conversation |
| `tools`                | array        | No       | Array of tool definitions                              |
| `temperature`          | number       | No       | Sampling temperature (0–2)                             |
| `max_output_tokens`    | integer      | No       | Maximum output tokens                                  |
| `stream`               | boolean      | No       | Enable streaming                                       |
| `metadata`             | object       | No       | Arbitrary metadata to attach                           |

## Input Formats [#input-formats]

The `input` field accepts multiple formats:

**Simple text:**

```json
{
  "input": "Hello, how are you?"
}
```

**Array of messages:**

```json
{
  "input": [
    {"role": "user", "content": "What is 2+2?"}
  ]
}
```

**With previous response context:**

```json
{
  "input": [
    {"type": "message", "role": "user", "content": "Follow up question..."}
  ],
  "previous_response_id": "resp_abc123"
}
```
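In a multi-turn flow, each turn reuses the `id` from the previous response. A minimal sketch of building the follow-up request body (the `build_followup` helper is illustrative, not part of the API):

```python
def build_followup(previous_response_id: str, text: str, model: str = "model-id") -> dict:
    """Build a request body that continues an earlier response."""
    return {
        "model": model,
        "input": [
            {"type": "message", "role": "user", "content": text}
        ],
        "previous_response_id": previous_response_id,
    }

# Continue the conversation started by resp_abc123.
body = build_followup("resp_abc123", "Follow up question...")
```

The resulting dict is posted to `/v1/responses` exactly like a single-turn request.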

## Built-in Tools [#built-in-tools]

The Responses API supports built-in tools that are executed server-side, so your application doesn't need to implement or host them:

| Tool                 | Description                              |
| -------------------- | ---------------------------------------- |
| `web_search_preview` | Search the web for real-time information |
| `code_interpreter`   | Execute code for data analysis           |
| `file_search`        | Search through uploaded files            |

## List Available Tools [#list-available-tools]

```
GET /v1/responses/tools
```

Returns the list of built-in tools available for the Responses API.

## Response [#response]

```json
{
  "id": "resp_abc123",
  "object": "response",
  "created_at": 1709251200,
  "model": "model-id",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {
          "type": "output_text",
          "text": "Based on my search, the current weather in San Francisco is..."
        }
      ]
    }
  ],
  "usage": {
    "input_tokens": 15,
    "output_tokens": 42,
    "total_tokens": 57
  }
}
```
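Because `output` is a list of items whose `content` is itself a list of parts, extracting the assistant's text takes a short traversal. A minimal sketch against the response shape shown above:

```python
def extract_text(response: dict) -> str:
    """Concatenate all output_text parts from a Responses API result."""
    parts = []
    for item in response.get("output", []):
        if item.get("type") == "message":
            for part in item.get("content", []):
                if part.get("type") == "output_text":
                    parts.append(part["text"])
    return "".join(parts)
```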

## Examples [#examples]

### Basic Request [#basic-request]

```python
import httpx

response = httpx.post(
    "https://api.yuhuanstudio.com/v1/responses",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "model": "model-id",
        "input": "Explain the theory of relativity."
    }
)

print(response.json())
```

### With Web Search [#with-web-search]

```python
response = httpx.post(
    "https://api.yuhuanstudio.com/v1/responses",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "model": "model-id",
        "input": "What are the latest developments in fusion energy?",
        "tools": [{"type": "web_search_preview"}]
    }
)
```

### With Instructions [#with-instructions]

```python
response = httpx.post(
    "https://api.yuhuanstudio.com/v1/responses",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "model": "model-id",
        "instructions": "You are a senior software engineer. Always provide code examples.",
        "input": "How do I implement a binary search tree?",
        "temperature": 0.3
    }
)
```

### Streaming [#streaming]

```python
import httpx

with httpx.stream(
    "POST",
    "https://api.yuhuanstudio.com/v1/responses",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "model": "model-id",
        "input": "Write a poem about the ocean.",
        "stream": True
    }
) as response:
    # The stream is delivered as server-sent events; blank lines are keep-alives.
    for line in response.iter_lines():
        if line:
            print(line)
```
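Streamed responses arrive as server-sent events, where each `data:` line carries a JSON payload (this mirrors the OpenAI Responses streaming format; the exact event payloads for this gateway are an assumption). A minimal line parser you could feed from `response.iter_lines()`:

```python
import json

def parse_sse_line(line: str):
    """Parse one SSE line; return the decoded JSON payload, or None for
    non-data lines (event:/id: metadata, keep-alives, terminators)."""
    if not line.startswith("data:"):
        return None
    data = line[len("data:"):].strip()
    if data == "[DONE]":  # defensive: Chat Completions-style terminator
        return None
    return json.loads(data)
```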

## Comparison with Chat Completions [#comparison-with-chat-completions]

| Feature           | Chat Completions      | Responses API                      |
| ----------------- | --------------------- | ---------------------------------- |
| Format            | OpenAI Chat format    | OpenAI Responses format            |
| Built-in tools    | No                    | Yes (web search, code interpreter) |
| Structured output | Via `response_format` | Native support                     |
| Agent workflows   | Manual orchestration  | Native support                     |
| Response chaining | Manual                | Via `previous_response_id`         |
| Compatibility     | All OpenAI SDKs       | Newer OpenAI SDK versions          |
