PublicaAI exposes a drop-in OpenAI-compatible endpoint backed by Anthropic Claude on AWS Bedrock. Point any OpenAI SDK or LangChain client at our base URL with your staff key — that's it.
Base URL: `https://bedrock.publicaai.com`
Authentication: `Authorization: Bearer sk-…`

| Model ID | Description | Cost multiplier | Max tokens |
|---|---|---|---|
| claude-4-5-haiku | Fast, low cost | 1× | 4,000 |
| claude-4-5-sonnet | Balanced performance | 3× | 4,000 |
Credit cost: `(input_tokens + output_tokens × 5) × cost_multiplier × 1.2`. Output tokens are weighted five times as heavily as input tokens.
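The formula above can be sketched as a small helper; `credit_cost` is a hypothetical name for illustration, not part of the API:

```python
def credit_cost(input_tokens: int, output_tokens: int, cost_multiplier: float) -> float:
    """Credits charged for one request: output tokens are weighted 5x,
    then the model's cost multiplier and the flat 1.2 factor are applied."""
    return (input_tokens + output_tokens * 5) * cost_multiplier * 1.2

# A 1,000-token-in / 500-token-out call on claude-4-5-sonnet (3x multiplier):
# (1000 + 500 * 5) * 3 * 1.2 = 12,600 credits
print(credit_cost(1_000, 500, 3))
```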
Install the OpenAI SDK and point it at PublicaAI:
```shell
pip install openai
```
```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-publicaai-key",  # your staff API key
    base_url="https://bedrock.publicaai.com/v1",
)

response = client.chat.completions.create(
    model="claude-4-5-sonnet",
    messages=[
        {"role": "user", "content": "Summarise our company values in 3 bullets."},
    ],
    max_tokens=500,
)

print(response.choices[0].message.content)
print(f"Credits remaining: {response.usage.credits_remaining}")
```
- `/v1/models`: lists every model available to your key, with cost multipliers and limits.
- `/v1/usage`: returns your remaining credits, a 24-hour usage breakdown, and per-model statistics.
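Both endpoints can be queried with plain HTTP. A stdlib-only sketch; the helper names are illustrative, the key is a placeholder, and the response fields beyond those named above are not specified here:

```python
import json
import urllib.request

BASE_URL = "https://bedrock.publicaai.com"

def auth_headers(api_key: str) -> dict:
    """Build the bearer-token header the gateway expects."""
    return {"Authorization": f"Bearer {api_key}"}

def get_json(path: str, api_key: str) -> dict:
    """GET a JSON endpoint such as /v1/models or /v1/usage."""
    req = urllib.request.Request(f"{BASE_URL}{path}", headers=auth_headers(api_key))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Uncomment with a real staff key:
# models = get_json("/v1/models", "sk-your-publicaai-key")
# usage = get_json("/v1/usage", "sk-your-publicaai-key")
```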