# Function Calling
Function calling lets models generate structured JSON arguments for functions you define, enabling integration with external tools and APIs.
## Prerequisites
- An API key (get one here)
- OpenAI SDK installed (Quickstart)
- A model that supports function calling (OpenAI, Google, or Anthropic models)
What you'll learn:
- How to define tools/functions for the model to call
- How to use function calling with the Responses API
- How to extract structured data using JSON Schema
## Basic Example
```python
from openai import OpenAI

client = OpenAI()

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use.",
                    },
                },
                "required": ["location", "format"],
            },
        },
    },
]

messages = [
    {
        "role": "system",
        "content": "Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.",
    },
    {"role": "user", "content": "What's the weather like today in Hamburg?"},
]

response = client.chat.completions.create(
    model="gpt-4.1",
    messages=messages,
    tools=tools,
)
print(response.choices[0].message)
```
Example output:
```python
ChatCompletionMessage(
    content=None,
    role='assistant',
    tool_calls=[
        ChatCompletionMessageToolCall(
            id='call_adnFRLazqswLI1ky6FU2O40u',
            function=Function(
                arguments='{"location":"Hamburg","format":"celsius"}',
                name='get_current_weather'
            ),
            type='function'
        )
    ]
)
```
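Note that `content` is `None`: instead of answering in text, the model requests a tool call whose `arguments` field is a JSON string. Your code parses those arguments and invokes the matching local function. A minimal dispatch sketch — the `get_current_weather` body here is a hypothetical stand-in (a real implementation would query a weather service):

```python
import json

def get_current_weather(location: str, format: str) -> str:
    # Hypothetical stand-in; a real implementation would call a weather API.
    return f"20 degrees {format} in {location}"

# Map tool names from the model's response to local implementations.
TOOL_REGISTRY = {"get_current_weather": get_current_weather}

def dispatch_tool_call(name: str, arguments: str) -> str:
    """Parse the model's JSON arguments and invoke the matching function."""
    args = json.loads(arguments)
    return TOOL_REGISTRY[name](**args)

result = dispatch_tool_call(
    "get_current_weather", '{"location":"Hamburg","format":"celsius"}'
)
print(result)  # 20 degrees celsius in Hamburg
```

In a full loop you would then append the result to `messages` as a `tool`-role message (carrying the `tool_call_id`) and call the API again so the model can compose its final answer.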
## Function Calling with the Responses API
```python
from openai import OpenAI

client = OpenAI()

tools = [
    {
        "type": "function",
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Bogota, Colombia",
                }
            },
            "required": ["location"],
            "additionalProperties": False,
        },
    }
]

response = client.responses.create(
    model="gpt-4.1",
    input=[{"role": "user", "content": "What is the weather like in Paris today?"}],
    tools=tools,
)
print(response.output)
```
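In the Responses API, function calls arrive as items of type `function_call` in `response.output`; you execute each one and send the result back as a `function_call_output` item in a follow-up request. A sketch of that round-trip step, run here against a hand-written output item rather than a live response (the weather lookup is hypothetical):

```python
import json

def get_weather(location: str) -> str:
    # Hypothetical lookup; replace with a real weather API call.
    return f"18°C in {location}"

# Hand-written stand-in for one item of response.output.
output_item = {
    "type": "function_call",
    "call_id": "call_123",
    "name": "get_weather",
    "arguments": '{"location":"Paris, France"}',
}

def to_function_call_output(item: dict) -> dict:
    """Execute a function_call item and build the item to send back."""
    args = json.loads(item["arguments"])
    result = get_weather(**args)
    return {
        "type": "function_call_output",
        "call_id": item["call_id"],  # ties the result to the original call
        "output": result,
    }

reply = to_function_call_output(output_item)
print(reply["output"])
```

Appending `reply` (together with the original `function_call` item) to `input` and calling `client.responses.create` again lets the model produce its final text answer.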
## Structured Output (JSON Schema)
Extract structured data from free-form text that conforms to a JSON Schema you define:
```python
import json

from openai import OpenAI

client = OpenAI()

description = """
Our Premium Laptop Backpack features padded compartments that fit
laptops up to 15.6 inches. Available in navy blue, black, and gray.
Price: $79.99, on sale for $64.99.
"""

response = client.responses.create(
    model="gpt-4.1",
    input=f"Extract structured product information: {description}",
    text={
        "format": {
            "type": "json_schema",
            "name": "product_details",
            "schema": {
                "type": "object",
                "properties": {
                    "product_name": {"type": "string"},
                    "features": {"type": "array", "items": {"type": "string"}},
                    "colors": {"type": "array", "items": {"type": "string"}},
                    "pricing": {
                        "type": "object",
                        "properties": {
                            "regular_price": {"type": "number"},
                            "sale_price": {"type": "number"},
                            "currency": {"type": "string"},
                        },
                        "required": ["regular_price", "sale_price", "currency"],
                        "additionalProperties": False,
                    },
                },
                "required": ["product_name", "features", "colors", "pricing"],
                "additionalProperties": False,
            },
            "strict": True,
        }
    },
)
print(json.dumps(json.loads(response.output[0].content[0].text), indent=2))
```
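With `"strict": True`, the model's output is guaranteed to conform to the schema, so the parsed dict can be accessed without defensive checks. A small sketch using a sample payload of the shape the schema defines (the values are illustrative, not real model output):

```python
import json

# Sample payload matching the product_details schema above.
sample = json.loads("""{
  "product_name": "Premium Laptop Backpack",
  "features": ["padded compartments", "fits laptops up to 15.6 inches"],
  "colors": ["navy blue", "black", "gray"],
  "pricing": {"regular_price": 79.99, "sale_price": 64.99, "currency": "USD"}
}""")

# Every required key is present, so direct access is safe.
discount = sample["pricing"]["regular_price"] - sample["pricing"]["sale_price"]
print(f"{sample['product_name']}: save {discount:.2f} {sample['pricing']['currency']}")
# Premium Laptop Backpack: save 15.00 USD
```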
## Compatibility
> **Info:** Function calling is supported by OpenAI, Google, and Anthropic models. Not all T-Cloud-hosted open-source models support it yet. For the best experience on T-Cloud, we recommend using models with full function-calling compatibility.
The Responses API is currently fully supported only for OpenAI models.
## Next Steps
- Streaming — Stream function calling responses in real-time
- Chat Completions — Basic chat API usage
- API Endpoints — Full endpoint reference