Know what your agents are doing. In production. Right now. Canary gives you full observability — traces, costs, success rates, and tool calls — so you catch failures before your users do.
import { Canary } from '@heycanary/sdk';
const canary = new Canary({ apiKey: 'ck_...' });

2 lines to integrate
< 1ms overhead
3+ frameworks supported
Free tier included at launch
From first LLM call to daily cost reports — full observability in minutes.
Every agent run captured with full event timeline, outcome tracking, and metadata.
Real-time spend tracking per agent, model, and session. Catch cost anomalies before they hit your bill.
Success rates, latency percentiles, and failure patterns across all your agent tools.
Token usage, model performance, and cost per call. Compare models side-by-side.
Automatic error flagging with full context. Know when agents fail and why.
Morning health report: cost anomalies, error spikes, model comparison, and actionable issues.
Two lines of code. Works with every major LLM provider and agent framework.
npm install @heycanary/sdk
Initialize Canary and start a session — auto-captures LLM calls.
Traces, costs, errors, and tool calls appear in your dashboard instantly.
import { Canary } from '@heycanary/sdk';
import OpenAI from 'openai';

// Initialize Canary and open a session; LLM calls are captured automatically.
const canary = new Canary({ apiKey: 'ck_...' });
const session = canary.startSession({ name: 'support-agent' });

// Call your provider as usual.
const openai = new OpenAI();
const res = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
});

// Close the session and record the run's outcome.
session.end({ outcome: 'success' });

Drop-in support for the frameworks and providers you already use.
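The quickstart ends a successful run, but outcome tracking matters most when a run fails. A minimal sketch of the failure path: the `outcome: 'failure'` value and the `error` field are our assumptions for illustration, not confirmed SDK API, and a local stand-in replaces the real session so the snippet runs without the SDK installed.

```typescript
// Assumed shape of a run outcome; `error` is hypothetical, not confirmed API.
type RunOutcome = { outcome: 'success' | 'failure'; error?: string };

// Local stand-in for the Canary session, so this sketch is self-contained.
const recorded: RunOutcome[] = [];
const session = {
  end(o: RunOutcome): void {
    recorded.push(o);
  },
};

function runAgent(): void {
  try {
    // ...agent work that may throw (LLM calls, tool calls)...
    throw new Error('tool call timed out');
  } catch (err) {
    // Tag the run as failed so it surfaces in error flagging and reports.
    session.end({ outcome: 'failure', error: (err as Error).message });
  }
}

runAgent();
```

Wrapping the agent's work in a try/catch and ending the session in both branches keeps every run accounted for, whether it succeeds or not.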
Canary is designed by engineers who've run agents at scale and know what breaks.
Your data stays yours. We built Canary with enterprise-grade security from day one.
Async event capture adds < 1ms latency. Your agents won't even notice.
OpenAI, Anthropic, LangChain, custom agents — Canary works with all of them, no lock-in.
Start free. Scale when you're ready.
Join the waitlist and be first to get full observability when we launch.