# Node.js SDK
The ModelTrack Node.js SDK auto-instruments Anthropic and OpenAI calls with a single import. Works with TypeScript and JavaScript.
## Installation
For now, copy the SDK file into your project. An npm package is coming soon.
```bash
# Copy from the ModelTrack repo
cp modeltrack/sdks/node/modeltrack.ts your-project/

# Coming soon:
# npm install modeltrack
```

## Auto-instrumentation
Import modeltrack at the top of your entry file. All Anthropic and OpenAI SDK instances will automatically route through the ModelTrack proxy.
```typescript
import 'modeltrack' // Add this one line

import Anthropic from '@anthropic-ai/sdk'

// This client now routes through ModelTrack automatically
const client = new Anthropic()

const response = await client.messages.create({
  model: 'claude-sonnet-4-6',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }]
})

// OpenAI works the same way
import OpenAI from 'openai'

const openai = new OpenAI()

const completion = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }]
})
```

**How it works:** On import, the SDK patches the Anthropic and OpenAI constructors to set `baseURL` to the proxy and inject `X-ModelTrack-*` default headers. If you explicitly set `baseURL` in the constructor, it won't be overridden.
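The constructor-patching technique can be sketched in a few lines. This is an illustrative sketch, not the SDK's actual implementation: a `Proxy` `construct` trap fills in `baseURL` and merges the tracking headers, while a caller-provided `baseURL` always wins.

```typescript
// Sketch of the patching technique (not the SDK's real code): wrap a client
// class in a Proxy whose construct trap injects proxy routing defaults.
type ClientOptions = {
  baseURL?: string
  defaultHeaders?: Record<string, string>
}

function patchClient<T extends new (opts?: ClientOptions) => object>(
  Original: T,
  proxyUrl: string,
  trackingHeaders: Record<string, string>
): T {
  return new Proxy(Original, {
    construct(target, args: unknown[]) {
      const opts = (args[0] ?? {}) as ClientOptions
      return new target({
        ...opts,
        // An explicit baseURL from the caller wins over the proxy URL
        baseURL: opts.baseURL ?? proxyUrl,
        // Caller-provided headers override the injected tracking headers
        defaultHeaders: { ...trackingHeaders, ...opts.defaultHeaders },
      })
    },
  }) as T
}
```

Because the trap only touches constructor options, clients created before the import, or with an explicit `baseURL`, behave exactly as before.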
## Manual base URL
If you prefer not to use auto-instrumentation, set the base URL manually:
```typescript
import Anthropic from '@anthropic-ai/sdk'

const client = new Anthropic({
  baseURL: 'http://localhost:8080'
})

// For OpenAI:
import OpenAI from 'openai'

const openai = new OpenAI({
  baseURL: 'http://localhost:8080'
})
```

## Configuration
Configure the SDK via environment variables or programmatically with configure().
### Environment variables
| Variable | Default | Description |
|---|---|---|
| `MODELTRACK_PROXY_URL` | `http://localhost:8080` | Proxy URL |
| `MODELTRACK_TEAM` | - | Team name for attribution |
| `MODELTRACK_APP` | - | Application name |
| `MODELTRACK_FEATURE` | - | Feature name |
| `MODELTRACK_CUSTOMER_TIER` | - | Customer tier |
| `MODELTRACK_SESSION_ID` | - | Session ID for multi-step workflows |
| `MODELTRACK_TRACE_ID` | - | Trace ID for distributed tracing |
| `MODELTRACK_PROMPT_TEMPLATE` | - | Prompt template ID for cost analysis |
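Conceptually, each variable in the table maps to an `X-ModelTrack-*` default header. The sketch below illustrates that mapping; the header names beyond `X-ModelTrack-Team`, `X-ModelTrack-App`, and `X-ModelTrack-Feature` (which appear later in this guide) are assumptions, not the SDK's confirmed names.

```typescript
// Illustrative mapping from MODELTRACK_* env vars to tracking headers.
// Header names other than Team/App/Feature are assumed for this sketch.
const ENV_TO_HEADER: Record<string, string> = {
  MODELTRACK_TEAM: 'X-ModelTrack-Team',
  MODELTRACK_APP: 'X-ModelTrack-App',
  MODELTRACK_FEATURE: 'X-ModelTrack-Feature',
  MODELTRACK_CUSTOMER_TIER: 'X-ModelTrack-Customer-Tier',
  MODELTRACK_SESSION_ID: 'X-ModelTrack-Session-Id',
  MODELTRACK_TRACE_ID: 'X-ModelTrack-Trace-Id',
  MODELTRACK_PROMPT_TEMPLATE: 'X-ModelTrack-Prompt-Template',
}

// Build the header set from whatever variables are actually set
function headersFromEnv(env: Record<string, string | undefined>): Record<string, string> {
  const headers: Record<string, string> = {}
  for (const [envVar, header] of Object.entries(ENV_TO_HEADER)) {
    const value = env[envVar]
    if (value) headers[header] = value
  }
  return headers
}
```

Unset variables simply produce no header, so attribution fields are always optional.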
### Programmatic configuration
```typescript
import { configure } from 'modeltrack'

configure({
  proxyUrl: 'http://localhost:8080',
  team: 'ml-research',
  app: 'chatbot',
  feature: 'customer-support',
  customerTier: 'enterprise',
  promptTemplate: 'support-v2',
})
```

## Session tracking
Use `withSession()` to group multiple LLM calls into a single session. This is useful for multi-step agent workflows.
```typescript
import { withSession } from 'modeltrack'
import Anthropic from '@anthropic-ai/sdk'

const client = new Anthropic()

// All calls within this function share the same session ID
await withSession('user-query-123', async () => {
  // Step 1: Classify the query
  const classification = await client.messages.create({
    model: 'claude-haiku-4-5',
    max_tokens: 100,
    messages: [{ role: 'user', content: 'Classify: refund request' }]
  })

  // Step 2: Generate response
  const response = await client.messages.create({
    model: 'claude-sonnet-4-6',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Handle refund for order #123' }]
  })
})

// With optional trace ID
await withSession('session-123', async () => {
  // ...
}, { traceId: 'trace-456' })
```

**Note:** `withSession()` uses module-level state, not async context tracking. In concurrent scenarios (e.g., handling multiple requests in a server), consider passing session IDs via headers instead.
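For the header-based alternative in a concurrent server, a small helper that builds per-request headers avoids shared state entirely. This is a sketch; `X-ModelTrack-Session-Id` and `X-ModelTrack-Trace-Id` are assumed header names, not confirmed SDK constants.

```typescript
// Build per-request tracking headers instead of relying on module-level
// session state; each request gets its own header set, so concurrent
// requests cannot clobber each other's session IDs.
// Header names here are assumptions for illustration.
function sessionHeaders(sessionId: string, traceId?: string): Record<string, string> {
  const headers: Record<string, string> = { 'X-ModelTrack-Session-Id': sessionId }
  if (traceId) headers['X-ModelTrack-Trace-Id'] = traceId
  return headers
}

// Usage inside a request handler (client construction shown for context):
// const client = new Anthropic({ defaultHeaders: sessionHeaders(requestSessionId) })
```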
## Setting team, app, and feature
You can set attribution at three levels:
```typescript
// 1. Environment variables (set once, applies globally)
// MODELTRACK_TEAM=ml-research
// MODELTRACK_APP=chatbot
// MODELTRACK_FEATURE=customer-support

// 2. Programmatic (set once in your app startup)
import { configure } from 'modeltrack'
configure({ team: 'ml-research', app: 'chatbot' })

// 3. Per-client headers (maximum flexibility)
import Anthropic from '@anthropic-ai/sdk'

const client = new Anthropic({
  defaultHeaders: {
    'X-ModelTrack-Team': 'ml-research',
    'X-ModelTrack-App': 'chatbot',
    'X-ModelTrack-Feature': 'customer-support',
  }
})
```

## Compatibility
The Node.js SDK auto-patches these libraries:
- ✓ `@anthropic-ai/sdk` — Anthropic Node.js SDK
- ✓ `openai` — OpenAI Node.js SDK
For frameworks like Vercel AI SDK, see the Frameworks guide.