API Reference

Meridian API

Drop-in replacement for the OpenAI and Anthropic APIs. Point your SDK's base_url at Meridian and your existing code works unchanged.

Authentication

OpenAI style

Pass your API key as a Bearer token in the Authorization header.

Authorization: Bearer sk-...

Anthropic style

Pass your API key directly in the x-api-key header.

x-api-key: sk-...

OpenAI Compatible

POST  /v1/chat/completions    Chat completions with streaming
POST  /v1/completions         Legacy text completions
POST  /v1/responses           Responses API format
GET   /v1/models              List available models
POST  /v1/tokenize            Count tokens for a prompt
POST  /v1/images/generations  Image generation
POST  /v1/images/edits        Image editing with mask

Anthropic Compatible

POST  /anthropic/v1/messages    Messages API with streaming
GET   /anthropic/v1/models      List available models
GET   /anthropic/v1/models/:id  Get a single model

System

GET   /health   Liveness probe
GET   /metrics  Uptime & memory snapshot
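The /health route is intended for liveness checks. A small helper (sketch; assumes a plain HTTP 200 on success, as liveness probes conventionally return):

```python
import urllib.request

def is_alive(base_url: str = "http://your-host:4242", timeout: float = 2.0) -> bool:
    """Return True if the /health liveness probe responds with HTTP 200."""
    try:
        with urllib.request.urlopen(base_url + "/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Covers connection refused, DNS failure, and timeouts.
        return False
```

Suitable as-is for a container orchestrator's exec probe or a cron-driven watchdog.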
Shell
curl http://your-host:4242/v1/chat/completions \
  -H "Authorization: Bearer sk-..." \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
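With "stream": true, the response body arrives as server-sent events rather than a single JSON object. A sketch of parsing those data: lines in Python, assuming OpenAI-style SSE framing with a [DONE] sentinel (the SDKs below do this for you):

```python
import json

def iter_stream_events(lines):
    """Yield parsed JSON chunks from OpenAI-style SSE 'data:' lines.

    Blank keep-alive lines are skipped; iteration stops at '[DONE]'.
    """
    for raw in lines:
        line = raw.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return
        yield json.loads(payload)

# Sample lines in the shape a streamed chat completion emits:
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    '',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    'data: [DONE]',
]
text = "".join(
    chunk["choices"][0]["delta"]["content"] for chunk in iter_stream_events(sample)
)
# text == "Hello!"
```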
Python
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",
    base_url="http://your-host:4242/v1",
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
Node.js
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "sk-...",
  baseURL: "http://your-host:4242/v1",
});

const stream = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});
Python (Anthropic SDK)
import anthropic

client = anthropic.Anthropic(
    api_key="sk-...",
    base_url="http://your-host:4242/anthropic",
)

message = client.messages.create(
    model="claude-opus-4-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
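The Messages API returns content as a list of typed blocks (the SDK exposes the same shape as objects). A small helper to join just the text blocks, sketched here over plain dicts in place of SDK objects:

```python
def message_text(content_blocks) -> str:
    """Concatenate the text of all 'text'-type content blocks, skipping
    non-text blocks such as tool_use."""
    return "".join(b["text"] for b in content_blocks if b.get("type") == "text")

# Shape mirrors a Messages API response body:
blocks = [
    {"type": "text", "text": "Hello"},
    {"type": "tool_use", "id": "t1"},   # ignored by the helper
]
# message_text(blocks) == "Hello"
```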