Security at ModelTrack
How we protect your data and API keys. ModelTrack is designed so that sensitive information never persists — your keys pass through, your prompts stay private, and only cost metadata is stored.
API Key Handling
Your LLM API keys are passed through to providers in real time. The proxy reads the Authorization header, forwards the request, and returns the response. That's it.
- ✓ Keys are NEVER stored, logged, or persisted
- ✓ Keys exist only in memory during the request (typically < 1 second)
- ✓ Keys are never sent to any other ModelTrack service or third party
- ✓ The proxy never modifies your Authorization header
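The pass-through rule above can be sketched in a few lines. This is an illustrative sketch only, not ModelTrack's actual implementation; the helper names (`build_outbound_headers`, `scrub_for_logging`) are hypothetical.

```python
# Illustrative sketch only -- not ModelTrack's actual code.
# Helper names are hypothetical.

SENSITIVE = {"authorization", "x-api-key"}

def build_outbound_headers(incoming: dict) -> dict:
    """Forward every header, including Authorization, unmodified."""
    return dict(incoming)  # no rewriting, no stripping, no injection

def scrub_for_logging(incoming: dict) -> dict:
    """Drop credentials before anything touches a log or database."""
    return {k: v for k, v in incoming.items() if k.lower() not in SENSITIVE}

incoming = {
    "Authorization": "Bearer sk-...",
    "Content-Type": "application/json",
}
outbound = build_outbound_headers(incoming)   # goes to the provider
log_safe = scrub_for_logging(incoming)        # goes to storage

print("Authorization" in outbound)   # True: forwarded unmodified
print("Authorization" in log_safe)   # False: never persisted
```

The key point is that the outbound request and the log record are built from separate code paths, so a credential can only ever flow toward the provider.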
Data flow
Your App
│
│ Request + API key (Authorization header)
▼
ModelTrack Proxy (in-memory only, < 1 second)
│
│ Forwards request + API key to provider
▼
LLM Provider (Anthropic, OpenAI, etc.)
│
│ Response
▼
ModelTrack Proxy
│
├──▶ Cost event (model, tokens, cost, latency) → Database
│ (NO key, NO prompt, NO response content)
│
│ Response (unmodified)
▼
Your App

What We Track
ModelTrack records only the metadata needed for cost tracking and attribution. Here is exactly what is and is not included in a cost event.
What IS tracked
- ✓ Model name
- ✓ Input & output tokens
- ✓ Cost (USD)
- ✓ Latency (ms)
- ✓ Provider
- ✓ Team / App / Feature headers
- ✓ Timestamp
- ✓ Cache hit status
- ✓ Request ID
What is NOT tracked
- ✗ Prompt content
- ✗ Response content
- ✗ API keys
- ✗ Authorization headers
- ✗ Request/response bodies
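The two lists above amount to a whitelist: a cost event is built only from named metadata fields, so prompts, responses, and keys can never leak into storage even by accident. A minimal sketch (the function and field handling are hypothetical, though the field names follow the example event below):

```python
# Hypothetical sketch of whitelist-style event construction --
# not ModelTrack's actual code. Only named metadata fields are
# copied; prompt/response bodies and credentials are never read.

ALLOWED_FIELDS = (
    "timestamp", "request_id", "provider", "model",
    "input_tokens", "output_tokens", "cost_usd", "latency_ms",
    "team", "app", "feature", "session_id", "cache_hit", "routed",
)

def to_cost_event(meta: dict) -> dict:
    """Copy only whitelisted metadata; everything else is dropped."""
    return {k: meta[k] for k in ALLOWED_FIELDS if k in meta}

meta = {
    "provider": "anthropic",
    "model": "claude-sonnet-4-6",
    "input_tokens": 150,
    "output_tokens": 320,
    "cost_usd": 0.00234,
    # Sensitive material present in the request context...
    "prompt": "user question here",
    "api_key": "sk-...",
}
event = to_cost_event(meta)
print(sorted(event))  # sensitive fields are absent by construction
```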
Example cost event
{
  "timestamp": "2026-03-26T10:30:00Z",
  "request_id": "req_abc123",
  "provider": "anthropic",
  "model": "claude-sonnet-4-6",
  "input_tokens": 150,
  "output_tokens": 320,
  "cost_usd": 0.00234,
  "latency_ms": 1200,
  "team": "ml-research",
  "app": "chatbot",
  "feature": "customer-support",
  "session_id": "user-query-123",
  "cache_hit": false,
  "routed": false
}

Infrastructure
Our hosted service runs on Google Cloud with security best practices:
- ✓ Google Cloud Run: serverless, auto-scaling containers. No persistent disk — the proxy is completely stateless.
- ✓ Google Firestore: cost events and account data are stored in Firestore with encryption at rest (AES-256).
- ✓ SSL/TLS everywhere: all connections — from your app to the proxy, and from the proxy to LLM providers — use TLS encryption.
- ✓ Stateless proxy: the proxy holds nothing in persistent storage. Each request is independent. There is no disk, no local database, no file system state.
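The TLS point has a client-side corollary: when pointing your application at the proxy, leave certificate verification enabled. As a quick illustration (standard library only, no ModelTrack-specific code), Python's default SSL context already verifies certificates and hostnames:

```python
# Certificate verification is on by default in Python's ssl module;
# disabling it would undermine the TLS guarantees described above.
import ssl

ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: peer certs are verified
print(ctx.check_hostname)                    # True: hostnames are checked
```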
Self-Hosted Option
ModelTrack is open source. If you need maximum control over your data, run the entire stack in your own infrastructure.
- → Deploy the proxy, API, and dashboard in your own VPC
- → Your data never leaves your infrastructure
- → Full control over networking, access, and encryption
- → Same features as the hosted version
Get started with self-hosting: github.com/ModelTrack/modeltrack
Reporting Issues
We take security seriously. If you discover a vulnerability or security concern, please let us know.
- → Email: security@modeltrack.ai
- → Responsible disclosure is welcome and appreciated
- → We will acknowledge receipt within 48 hours
- → We will not take legal action against good-faith security researchers
For general support questions, reach out to support@modeltrack.ai.