
Authentication

AI Foundation Services uses API keys for authentication. All requests must include your API key in the Authorization header.


Get an API Key

Free Trial Key

Get started immediately with a free trial key:

  1. Visit the API Key Portal
  2. Create an account and generate your API key
  3. Your trial key gives you access to all available models

Production Key

For production workloads, purchase an API key via the T-Cloud Marketplace.


Set Up Environment Variables

Store your API key as an environment variable — never hardcode it in your source code.

export OPENAI_API_KEY="your_api_key_here"
export OPENAI_BASE_URL="https://llm-server.llmhub.t-systems.net/v2"

To persist across sessions, add these lines to your ~/.zshrc or ~/.bashrc file.
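Before making any requests, it can help to fail fast when the environment is not set up. The sketch below (a hypothetical helper, not part of any SDK) checks that both variables are present and raises a clear error otherwise:

```python
import os

def load_llm_config() -> tuple[str, str]:
    """Read the API key and base URL from the environment,
    raising a clear error if either is missing."""
    api_key = os.environ.get("OPENAI_API_KEY")
    base_url = os.environ.get("OPENAI_BASE_URL")
    if not api_key or not base_url:
        raise RuntimeError(
            "Set OPENAI_API_KEY and OPENAI_BASE_URL before starting the app"
        )
    return api_key, base_url
```

Calling this once at startup surfaces a missing key immediately instead of at the first API call.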


Using the API Key

With OpenAI SDKs

When OPENAI_API_KEY and OPENAI_BASE_URL are set, the OpenAI SDKs pick them up automatically:

from openai import OpenAI

client = OpenAI() # No need to pass api_key or base_url

You can also pass them explicitly:

from openai import OpenAI

client = OpenAI(
    api_key="your_api_key_here",
    base_url="https://llm-server.llmhub.t-systems.net/v2",
)

With HTTP Requests

Include the API key in the Authorization header:

curl -X POST "https://llm-server.llmhub.t-systems.net/v2/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "Llama-3.3-70B-Instruct", "messages": [{"role": "user", "content": "Hello"}]}'
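The same request can be assembled in plain Python without the SDK. This is a minimal sketch (`build_chat_request` is a hypothetical helper); it only builds the URL, headers, and JSON body, and does not send the request:

```python
import json
import os

def build_chat_request(model: str, user_message: str) -> tuple[str, dict, bytes]:
    """Assemble the URL, headers, and JSON body for a chat
    completion request against the OpenAI-compatible endpoint."""
    base_url = os.environ.get(
        "OPENAI_BASE_URL", "https://llm-server.llmhub.t-systems.net/v2"
    )
    url = f"{base_url}/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "Content-Type": "application/json",
    }
    body = json.dumps(
        {"model": model, "messages": [{"role": "user", "content": user_message}]}
    ).encode()
    return url, headers, body
```

The resulting values can be passed to any HTTP client (for example `urllib.request` or `requests`).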

Base URL

All API requests go to:

https://llm-server.llmhub.t-systems.net/v2

This is the OpenAI-compatible endpoint. For the full API specification, see the API Reference.


Best Practices

  • Never commit API keys to version control. Use .env files and add them to .gitignore.
  • Use environment variables in production rather than hardcoding keys.
  • Rotate keys regularly via the API Key Portal.
  • Monitor usage through the API Key Portal to track token consumption and costs.
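The `.env` pattern from the first bullet can be sketched with the standard library alone (in practice, libraries such as python-dotenv do this for you). `load_dotenv_file` below is a hypothetical minimal loader, not part of any SDK:

```python
import os
from pathlib import Path

def load_dotenv_file(path: str = ".env") -> None:
    """Minimal .env loader: set KEY=VALUE lines into os.environ,
    skipping blank lines, comments, and keys that are already set."""
    env_file = Path(path)
    if not env_file.exists():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # setdefault keeps values already exported in the real environment
        os.environ.setdefault(key.strip(), value.strip().strip('"'))
```

Remember to add `.env` to `.gitignore` so the key never reaches version control.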
© Deutsche Telekom AG