One API. All models.

Langdock API

Integrate leading AI models and specialized agents directly into your existing applications with a single, enterprise-ready API.

Trusted by Merck, Eppendorf, Der Spiegel, Personio, LW, Axpo, Volksbank, SumUp, Babbel, UNICEF, Frankfurter Allgemeine, GetYourGuide, The Economist, mobile.de, and Süddeutsche Zeitung.

Completion API

Access leading language models from OpenAI, Anthropic, Mistral, and Google for text generation, analysis, and reasoning tasks.

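Below is a minimal sketch of a chat completion call in Python, assuming an OpenAI-compatible chat completions endpoint; the base URL, endpoint path, model name, and response shape are illustrative assumptions, so check the API reference for the exact values.

    import os
    import requests

    # Base URL and endpoint path are illustrative assumptions, not the documented values.
    BASE_URL = "https://api.langdock.com/v1"
    API_KEY = os.environ["LANGDOCK_API_KEY"]

    response = requests.post(
        f"{BASE_URL}/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "gpt-5",
            "messages": [
                {"role": "user", "content": "Summarize our Q3 results in three bullet points."}
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    # Response parsing assumes the common OpenAI-style schema.
    print(response.json()["choices"][0]["message"]["content"])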

Embedding API

Generate high-quality embeddings for semantic search, similarity matching, and RAG applications.

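A sketch of an embedding request for a small document batch, again assuming an OpenAI-style embeddings endpoint; the path, model name, and response shape are assumptions used to illustrate the flow.

    import os
    import requests

    BASE_URL = "https://api.langdock.com/v1"  # illustrative, see the API reference
    API_KEY = os.environ["LANGDOCK_API_KEY"]

    # Embed a batch of texts for semantic search or RAG indexing.
    response = requests.post(
        f"{BASE_URL}/embeddings",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "text-embedding-3-small",  # placeholder model name
            "input": [
                "How do I reset my password?",
                "Where can I download my invoices?",
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    vectors = [item["embedding"] for item in response.json()["data"]]
    print(len(vectors), "embeddings, dimension", len(vectors[0]))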

Agent API

Create and manage custom AI agents programmatically with specialized knowledge and capabilities.

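A sketch of creating an agent programmatically; the endpoint path and payload fields (name, instructions, model) are hypothetical placeholders chosen to illustrate the flow, not the documented schema.

    import os
    import requests

    BASE_URL = "https://api.langdock.com/v1"  # illustrative
    API_KEY = os.environ["LANGDOCK_API_KEY"]

    # Hypothetical payload: field names are placeholders for illustration only.
    agent = {
        "name": "Support Triage Agent",
        "instructions": "Classify incoming tickets and draft a first response.",
        "model": "gpt-5",
    }

    response = requests.post(
        f"{BASE_URL}/agents",  # hypothetical endpoint path
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=agent,
        timeout=30,
    )
    response.raise_for_status()
    print("Created agent:", response.json())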

Knowledge folder API

Manage your organization’s knowledge base programmatically for RAG and document processing.

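A sketch of adding a document to a knowledge folder so it can be retrieved in RAG workflows; the endpoint path, folder ID, and multipart field name are assumptions for illustration.

    import os
    import requests

    BASE_URL = "https://api.langdock.com/v1"  # illustrative
    API_KEY = os.environ["LANGDOCK_API_KEY"]
    FOLDER_ID = "kf_example"  # placeholder knowledge folder ID

    # Upload a PDF into a knowledge folder; path and field names are assumptions.
    with open("employee-handbook.pdf", "rb") as f:
        response = requests.post(
            f"{BASE_URL}/knowledge-folders/{FOLDER_ID}/documents",
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"file": ("employee-handbook.pdf", f, "application/pdf")},
            timeout=60,
        )
    response.raise_for_status()
    print("Uploaded:", response.json())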

Available models

API

One unified API for all models. GDPR-compliant, hosted in the EU.

Pricing applies only to our API product. Chat and Agents, when purchased with AI models included, have no usage-based cost component. Langdock charges 10% on top of the model provider's price. Model prices originate from the model providers in USD. All prices excl. VAT.

Model                  Input tokens         Output tokens        Regions
GPT-5.2                1.65 / 1M tokens     13.17 / 1M tokens
GPT-5                  1.18 / 1M tokens     9.40 / 1M tokens
Gemini 3 Pro Preview   2.35 / 1M tokens     14.11 / 1M tokens
Gemini 2.5 Pro         2.35 / 1M tokens     14.11 / 1M tokens
See all (40+)
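To make the 10% markup concrete, here is a rough cost estimate for a single request based on per-million-token rates; the provider rates below are placeholders, not quotes, and actual billing should be confirmed against the pricing table and your invoice.

    # Langdock bills the provider's per-million-token rate plus 10%.
    PROVIDER_INPUT_RATE = 1.00   # USD per 1M input tokens (placeholder)
    PROVIDER_OUTPUT_RATE = 8.00  # USD per 1M output tokens (placeholder)
    MARKUP = 1.10                # 10% on top of the provider price

    def estimate_cost(input_tokens: int, output_tokens: int) -> float:
        input_cost = input_tokens / 1_000_000 * PROVIDER_INPUT_RATE * MARKUP
        output_cost = output_tokens / 1_000_000 * PROVIDER_OUTPUT_RATE * MARKUP
        return input_cost + output_cost

    # Example: 3,000 input tokens and 800 output tokens.
    print(f"Estimated cost: ${estimate_cost(3_000, 800):.4f} (excl. VAT)")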

Questions & answers

What is the Langdock API?

The Langdock API allows developers to integrate state‑of‑the‑art AI models, agents, and knowledge management features directly into their applications with enterprise‑grade security and GDPR compliance.

Which AI models can I access through the API?

You can access models from OpenAI (including GPT‑5.2), Anthropic (Claude), Mistral, and Google, all through a unified interface.

Does Langdock support embeddings and RAG workflows?

Yes. The API includes an Embedding API and a Knowledge Folder API, enabling semantic search, document retrieval, and full retrieval‑augmented generation pipelines.

Can I create custom agents using the API?

Absolutely. The Agent API lets you create, update, and query AI agents with custom instructions, knowledge, attachments, and model configurations.

How do I authenticate with the Langdock API?

All requests require a bearer token sent in the Authorization header. You can generate an API token in the Langdock workspace settings.
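A minimal sketch of an authenticated request; the endpoint shown is illustrative, but the Authorization header format follows the standard bearer-token convention described above.

    import os
    import requests

    API_KEY = os.environ["LANGDOCK_API_KEY"]  # generated in workspace settings

    response = requests.get(
        "https://api.langdock.com/v1/models",  # illustrative endpoint
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    print(response.status_code)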

What are the default rate limits?

The defaults are 500 requests per minute and 60,000 tokens per minute. Higher limits may be available for enterprise customers upon request.
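If you run close to these limits, a simple client-side backoff helps. The sketch below assumes the API signals throttling with HTTP 429, which is the usual convention but should be verified against the API reference.

    import time
    import requests

    def post_with_backoff(url: str, headers: dict, payload: dict, max_retries: int = 5):
        """Retry a POST with exponential backoff when the server responds with 429."""
        for attempt in range(max_retries):
            response = requests.post(url, headers=headers, json=payload, timeout=30)
            if response.status_code != 429:  # treating 429 as the throttle signal is an assumption
                response.raise_for_status()
                return response
            time.sleep(2 ** attempt)  # wait 1s, 2s, 4s, 8s, 16s between retries
        raise RuntimeError("Still rate limited after retries")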