POST /v1/chat/completions

Tools are declared with a nested function object. The model returns tool_calls inside the assistant message.

Define a tool

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

Full loop

from openai import OpenAI
import json
import os

client = OpenAI(
    base_url="https://api.abliteration.ai/v1",
    api_key=os.environ["ABLIT_KEY"],
)

messages = [{"role": "user", "content": "What's the weather in Lagos?"}]

resp = client.chat.completions.create(
    model="abliterated-model",
    messages=messages,
    tools=tools,
)

msg = resp.choices[0].message
messages.append(msg)  # keep the assistant turn (with its tool_calls) in the history

for call in msg.tool_calls or []:
    args = json.loads(call.function.arguments)
    result = get_weather(args["city"])  # your function
    messages.append({
        "role": "tool",
        "tool_call_id": call.id,
        "content": json.dumps(result),
    })

final = client.chat.completions.create(model="abliterated-model", messages=messages)
print(final.choices[0].message.content)
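The loop above assumes a local get_weather function. A minimal stand-in (static data, purely illustrative; a real version would query a weather API) could be:

```python
def get_weather(city: str) -> dict:
    # Stand-in implementation: return a fixed payload for any city.
    return {"city": city, "temp_c": 31, "conditions": "partly cloudy"}

print(get_weather("Lagos"))
```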

Streaming

Tool-call arguments arrive across multiple chunks. Accumulate the delta.tool_calls[i].function.arguments strings, keyed by index, until you receive a chunk with finish_reason: "tool_calls". See the streaming guide.
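A sketch of the accumulation step, assuming chunks shaped like the standard streaming deltas (an index per call, the id and function name in the first fragment, argument JSON split across fragments); plain dicts stand in for SDK objects:

```python
import json

def accumulate_tool_calls(chunks):
    """Merge streamed tool-call deltas into complete calls, keyed by index."""
    calls = {}
    for chunk in chunks:
        for delta in chunk.get("tool_calls") or []:
            call = calls.setdefault(delta["index"], {"id": None, "name": "", "arguments": ""})
            if delta.get("id"):
                call["id"] = delta["id"]
            fn = delta.get("function", {})
            if fn.get("name"):
                call["name"] += fn["name"]
            if fn.get("arguments"):
                call["arguments"] += fn["arguments"]
    return [calls[i] for i in sorted(calls)]

# Example: the argument JSON arrives split across two chunks.
chunks = [
    {"tool_calls": [{"index": 0, "id": "call_1",
                     "function": {"name": "get_weather", "arguments": '{"ci'}}]},
    {"tool_calls": [{"index": 0,
                     "function": {"arguments": 'ty": "Lagos"}'}}]},
]
merged = accumulate_tool_calls(chunks)
print(json.loads(merged[0]["arguments"]))  # {'city': 'Lagos'}
```

Only parse the accumulated arguments string once the stream signals finish_reason: "tool_calls"; partial fragments are not valid JSON.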

Forcing a tool

tool_choice={"type": "function", "function": {"name": "get_weather"}}

Passing this forces the model to call get_weather. "none" disables tools; "auto" (the default) lets the model decide.

Parallel tool calls

The model can return multiple tool_calls in one response. Execute them (concurrently if you like) and append one tool message per call, each with the matching tool_call_id.
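One way to run several calls concurrently, sketched with a thread pool. TOOLS is a hypothetical dispatch table mapping tool names to local functions, and plain dicts stand in for the SDK's tool_call objects:

```python
import json
from concurrent.futures import ThreadPoolExecutor

# Hypothetical dispatch table: tool name -> local implementation.
TOOLS = {"get_weather": lambda args: {"city": args["city"], "temp_c": 31}}

def run_tool(call):
    # call mirrors the API shape: id, function.name, function.arguments (JSON string).
    args = json.loads(call["function"]["arguments"])
    result = TOOLS[call["function"]["name"]](args)
    return {"role": "tool", "tool_call_id": call["id"], "content": json.dumps(result)}

tool_calls = [
    {"id": "call_1", "function": {"name": "get_weather", "arguments": '{"city": "Lagos"}'}},
    {"id": "call_2", "function": {"name": "get_weather", "arguments": '{"city": "Accra"}'}},
]

with ThreadPoolExecutor() as pool:
    # map preserves input order, so results line up with tool_calls.
    tool_messages = list(pool.map(run_tool, tool_calls))

# Append every tool message to the conversation before the follow-up request.
```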