# LM Studio (/docs/providers/lmstudio)

## Overview [#overview]

[LM Studio](https://lmstudio.ai) is a desktop GUI application that enables local inference of GGUF models with an OpenAI-compatible local server API.

**Official Website:** [https://lmstudio.ai](https://lmstudio.ai)

## Key Features [#key-features]

* **Local Inference** — Run models entirely on your machine
* **Desktop GUI** — User-friendly interface for model management
* **Any GGUF Model** — Support for any GGUF format model
* **OpenAI-Compatible API** — Local server with standard endpoints
* **GPU Acceleration** — CUDA, Metal, and ROCm support

## Usage Example [#usage-example]

```python
from openai import OpenAI

client = OpenAI(
    api_key="dummy-key",  # LM Studio's local server does not check the key
    base_url="http://localhost:1234/v1"  # default LM Studio server address
)

response = client.chat.completions.create(
    model="model-id",  # id of a model loaded in LM Studio
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
```
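The same endpoint also supports streaming. Because the server is OpenAI-compatible, streamed responses arrive as server-sent events: `data:` lines carrying JSON chunks, terminated by a `data: [DONE]` sentinel. A minimal sketch of parsing one such line without the `openai` client, assuming the standard chunk shape (the sample payload below is illustrative, not captured from a real server):

```python
import json

def parse_sse_line(line: str):
    """Extract the content delta from one OpenAI-style SSE line, if any."""
    if not line.startswith("data: "):
        return None  # ignore blank keep-alive lines and comments
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        return None  # end-of-stream sentinel
    chunk = json.loads(payload)
    # streaming chunks carry partial text under choices[0].delta.content
    return chunk["choices"][0]["delta"].get("content")

# Illustrative chunk shaped like an OpenAI-compatible streaming response
sample = 'data: {"choices": [{"delta": {"content": "Hel"}}]}'
print(parse_sse_line(sample))  # → Hel
```

With the `openai` client itself, passing `stream=True` to `chat.completions.create` yields these chunks as objects, so manual parsing is only needed when calling the HTTP API directly.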

## Available Models [#available-models]

LM Studio supports any GGUF format model. Download models through the LM Studio application.

<Callout type="info">
  Models and capabilities depend on your hardware. Check the LM Studio documentation for model requirements.
</Callout>
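The `model` field in the usage example must match a model available in LM Studio. One way to discover model ids programmatically is the OpenAI-compatible `GET /v1/models` endpoint; a minimal stdlib sketch, assuming LM Studio's default port (the helper names here are illustrative):

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default local server

def models_url(base_url: str = BASE_URL) -> str:
    """Build the OpenAI-compatible model-listing endpoint URL."""
    return base_url.rstrip("/") + "/models"

def list_model_ids(base_url: str = BASE_URL) -> list[str]:
    """Return the ids of models currently available in LM Studio."""
    with urllib.request.urlopen(models_url(base_url)) as resp:
        data = json.load(resp)
    return [m["id"] for m in data["data"]]

print(models_url())  # → http://localhost:1234/v1/models
```

Calling `list_model_ids()` requires the LM Studio server to be running; the ids it returns can be passed directly as the `model` argument.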

## Official Resources [#official-resources]

* [LM Studio Website](https://lmstudio.ai)
* [Documentation](https://lmstudio.ai/docs)
* [GGUF Models on Hugging Face](https://huggingface.co/models?search=gguf)
