OpenAI-compatible API

Use PublicaAI from your code

PublicaAI exposes a drop-in OpenAI-compatible endpoint backed by Anthropic Claude on AWS Bedrock. Point any OpenAI SDK or LangChain client at our base URL with your staff key — that's it.

Base URL: `https://bedrock.publicaai.com`
Auth header: `Authorization: Bearer sk-…`

Available models

| Model ID | Description | Cost multiplier | Max tokens |
|---|---|---|---|
| claude-4-5-haiku | Fast, low cost | 1× | 4,000 |
| claude-4-5-sonnet | Balanced performance | 3× | 4,000 |

Credit cost: `(input_tokens + output_tokens × 5) × cost_multiplier × 1.2`. Output tokens are 5× more expensive than input tokens.
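As a sanity check, the formula can be written as a small helper (the function name is illustrative, not part of the API):

```python
def credit_cost(input_tokens: int, output_tokens: int, cost_multiplier: float) -> float:
    """Credits charged: (input + 5 * output) * model multiplier * 1.2."""
    return (input_tokens + output_tokens * 5) * cost_multiplier * 1.2

# 1,000 input + 200 output tokens on claude-4-5-sonnet (3x multiplier):
# (1000 + 200 * 5) * 3 * 1.2 = 7200.0 credits
cost = credit_cost(1000, 200, 3)
```

Note how output-heavy requests dominate the bill: those 200 output tokens cost as much as 1,000 input tokens.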

Quickstart

Install the OpenAI SDK and point it at PublicaAI:

```bash
pip install openai
```

```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-publicaai-key",     # your staff API key
    base_url="https://bedrock.publicaai.com/v1",
)

response = client.chat.completions.create(
    model="claude-4-5-sonnet",
    messages=[
        {"role": "user", "content": "Summarise our company values in 3 bullets."},
    ],
    max_tokens=500,
)

print(response.choices[0].message.content)
print(f"Credits remaining: {response.usage.credits_remaining}")
```
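`credits_remaining` is a PublicaAI extension to the standard OpenAI usage object, so depending on your SDK version it may land in the pydantic model's extra-field bucket rather than as a plain attribute. A defensive accessor, as a sketch:

```python
def get_credits_remaining(usage):
    """Read PublicaAI's custom credits_remaining field from a usage object,
    falling back to pydantic's extra-field bucket (model_extra) when it is
    not exposed as a plain attribute. Returns None if absent."""
    value = getattr(usage, "credits_remaining", None)
    if value is None:
        extra = getattr(usage, "model_extra", None) or {}
        value = extra.get("credits_remaining")
    return value
```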

Other endpoints

GET `/v1/models`
Lists every model available to your key, with cost multipliers and limits.

GET `/v1/usage`
Returns your remaining credits, a 24-hour usage breakdown, and per-model statistics.
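If you want to poll `/v1/usage` without pulling in the OpenAI SDK, a standard-library sketch (the response schema beyond "remaining credits and usage stats" is not specified above, so treat the returned dict's keys as unknown until you inspect one):

```python
import json
import urllib.request

BASE_URL = "https://bedrock.publicaai.com"  # from the docs above

def build_usage_request(api_key: str) -> urllib.request.Request:
    """Construct the authenticated GET /v1/usage request."""
    return urllib.request.Request(
        f"{BASE_URL}/v1/usage",
        headers={"Authorization": f"Bearer {api_key}"},
    )

def get_usage(api_key: str) -> dict:
    """Fetch remaining credits, 24h breakdown, and per-model stats."""
    with urllib.request.urlopen(build_usage_request(api_key)) as resp:
        return json.load(resp)
```

Splitting request construction from sending keeps the auth wiring testable without hitting the network.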

Limits & responses

| Limit | Value | Notes |
|---|---|---|
| Rate limit | 60 | requests / minute / key |
| Max tokens | 4,000 | per request |
| Insufficient credits | 402 | HTTP status returned |
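When a request fails, the OpenAI SDK raises status-coded errors (`openai.RateLimitError` for 429, `openai.APIStatusError` for other non-2xx statuses, both exposing `status_code`). A sketch of a retry policy built on the limits above; the backoff values are illustrative, not prescribed by PublicaAI:

```python
def next_action(status_code: int, attempt: int) -> tuple[str, float]:
    """Map a PublicaAI error status to (action, backoff_seconds).

    429 -> back off and retry (we exceeded 60 requests/minute/key);
    402 -> stop, the key is out of credits, retrying cannot help;
    anything else -> re-raise and surface the error.
    """
    if status_code == 429:
        return ("retry", min(2 ** attempt, 30.0))  # capped exponential backoff
    if status_code == 402:
        return ("stop", 0.0)
    return ("raise", 0.0)
```

The important distinction: 429 is transient and worth retrying, while 402 is terminal until someone tops up the key's credits.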