
Quickstart

This content is for v1.0.0. Switch to the latest version for up-to-date documentation.

Get up and running with AI Foundation Services in minutes. This guide walks you through installing the SDK, setting up authentication, and making your first API call.


AI Foundation Services uses an OpenAI-compatible API, so you can use the official OpenAI SDKs.

Install the official OpenAI Python SDK:

pip install openai

To authenticate, you need an API key:

  1. Go to the API Key Portal and create a free trial key.
  2. Or purchase a key via the T-Cloud Marketplace for production use.

Set your API key and the service base URL as environment variables:
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_BASE_URL="https://llm-server.llmhub.t-systems.net/v2"

Send your first chat completion request:
curl -X POST "$OPENAI_BASE_URL/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "Llama-3.3-70B-Instruct",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is quantum computing in simple terms?"}
    ],
    "temperature": 0.5,
    "max_tokens": 150
  }'
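The same request can also be made with the Python SDK. The sketch below builds the identical request body as a plain dictionary and only sends it when OPENAI_API_KEY is set, so it can be read alongside the curl example; the guard around the network call is an assumption for illustration, not part of the SDK.

```python
import os

# Same request as the curl example above, expressed as keyword
# arguments for the SDK's chat.completions.create call.
request = {
    "model": "Llama-3.3-70B-Instruct",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is quantum computing in simple terms?"},
    ],
    "temperature": 0.5,
    "max_tokens": 150,
}

# Only call the API when credentials are configured in the environment.
if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI

    # The client picks up OPENAI_API_KEY and OPENAI_BASE_URL automatically.
    client = OpenAI()
    response = client.chat.completions.create(**request)
    print(response.choices[0].message.content)
```

Because the client reads both environment variables set earlier, no key or URL needs to be hard-coded in application code.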

The service also exposes embedding models through the same OpenAI-compatible client:

from openai import OpenAI

# The client reads OPENAI_API_KEY and OPENAI_BASE_URL from the environment.
client = OpenAI()

texts = ["The quick brown fox jumps over the lazy dog", "Data science is fun!"]

# Returns one embedding vector per input text.
result = client.embeddings.create(input=texts, model="jina-embeddings-v2-base-de")

print(f"Embedding dimension: {len(result.data[0].embedding)}")
print(f"Token usage: {result.usage}")
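A common next step with embeddings is comparing texts by cosine similarity. The snippet below is a minimal, self-contained sketch: the short placeholder vectors stand in for real embedding outputs (in practice you would pass result.data[0].embedding and result.data[1].embedding from the call above).

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Placeholder vectors standing in for real embedding outputs,
# which would each have hundreds of dimensions.
v1 = [0.1, 0.3, 0.5]
v2 = [0.2, 0.1, 0.4]
print(round(cosine_similarity(v1, v2), 3))
```

Values close to 1.0 indicate semantically similar texts; values near 0 indicate unrelated ones.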