JavaScript SDK
Complete reference for the Brokle JavaScript SDK - TypeScript-first with full async support
The Brokle JavaScript SDK provides comprehensive AI observability for Node.js applications, with full TypeScript support and modern ESM/CommonJS compatibility.
Installation
npm install brokle
pnpm add brokle
yarn add brokle
Requirements:
- Node.js 20+
- TypeScript 4.7+ (optional, for type definitions)
Quick Start
import { Brokle } from 'brokle';
import { wrapOpenAI } from 'brokle/openai';
import OpenAI from 'openai';
// Initialize client
const client = new Brokle({ apiKey: 'bk_...' });
// Wrap OpenAI for automatic tracing
const openai = wrapOpenAI(new OpenAI());
// All LLM calls are now traced
const response = await openai.chat.completions.create({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Hello!' }]
});
console.log(response.choices[0].message.content);
// Ensure traces are sent
await client.shutdown();
Client Initialization
Brokle Client
import { Brokle } from 'brokle';
const client = new Brokle({
apiKey: 'bk_...', // Required: Your API key
baseUrl: undefined, // Optional: Custom API URL
environment: 'production', // Optional: Environment name
sampleRate: 1.0, // Optional: Sampling rate (0.0-1.0)
flushAt: 100, // Optional: Batch size before flush
flushInterval: 10, // Optional: Auto-flush interval (seconds)
debug: false, // Optional: Enable debug logging
compression: 'gzip', // Optional: Payload compression
});
Environment Variables
If options are not passed explicitly, the client reads them from environment variables:
export BROKLE_API_KEY=bk_...
export BROKLE_BASE_URL=https://api.brokle.com
export BROKLE_ENVIRONMENT=production
export BROKLE_SAMPLE_RATE=1.0
export BROKLE_DEBUG=false
// No arguments needed if env vars are set
const client = new Brokle({});
Tracing
Starting Spans
await client.startActiveSpan('operation', async (span) => {
span.setAttribute('userId', user.id);
span.setAttribute('feature', 'search');
const result = await performOperation(userQuery);
client.updateCurrentSpan({ output: result });
return result;
});
Using traceFunction Wrapper
import { traceFunction } from 'brokle';
const processQuery = traceFunction(
'process_query',
async (query: string) => {
const result = await search(query);
return { query, results: result };
},
{ captureInput: true, captureOutput: true }
);
// Each call creates a traced span
const result = await processQuery('find documents');
Callback Pattern
await client.startActiveSpan('operation', async (span) => {
span.setAttribute('userId', user.id);
const result = await doWork();
client.updateCurrentSpan({ output: result });
return result;
});
Span Types
General Spans
await client.startActiveSpan('operation', async (span) => {
span.setAttribute('key', 'value');
const result = await doWork();
client.updateCurrentSpan({ output: result });
return result;
});
Generation Spans (LLM Calls)
await client.startActiveGeneration('chat', 'gpt-4', 'openai', async (span) => {
const response = await callLLM(messages);
span.setAttribute('gen_ai.request.temperature', 0.7);
span.setAttribute('gen_ai.request.max_tokens', 500);
span.setAttribute('gen_ai.usage.input_tokens', response.usage.prompt_tokens);
span.setAttribute('gen_ai.usage.output_tokens', response.usage.completion_tokens);
client.updateCurrentSpan({ output: response.content });
return response;
});
Retrieval Spans
await client.startActiveSpan('vector_search', async (span) => {
const results = await vectorStore.search(query, 5);
client.updateCurrentSpan({
output: {
count: results.length,
scores: results.map(r => r.score)
}
});
return results;
}, {
'brokle.span.type': 'retrieval',
'gen_ai.retrieval.source': 'documents',
'gen_ai.retrieval.top_k': 5,
});
Tool Spans
await client.startActiveSpan('calculator', async (span) => {
const result = evaluate('2 + 2');
client.updateCurrentSpan({ output: { result } });
return result;
}, {
'brokle.span.type': 'tool',
'gen_ai.tool.name': 'calculator',
}, {
input: { expression: '2 + 2' },
});
Nested Spans
Create hierarchical traces using nested callbacks. Parent-child relationships are automatically established via OpenTelemetry context propagation:
await client.startActiveSpan('pipeline', async (parentSpan) => {
parentSpan.setAttribute('userId', 'user_123');
parentSpan.setAttribute('sessionId', 'session_456');
// Child span 1 - automatically a child of 'pipeline'
const result1 = await client.startActiveSpan('step_1', async (span) => {
return await stepOne();
});
// Child span 2 - automatically a child of 'pipeline'
const result2 = await client.startActiveSpan('step_2', async (span) => {
return await stepTwo(result1);
});
client.updateCurrentSpan({ output: result2 });
return result2;
});
Span Methods
setAttribute
Add key-value pairs to the current span:
await client.startActiveSpan('operation', async (span) => {
span.setAttribute('key', 'value');
span.setAttribute('count', 42);
span.setAttribute('isPremium', true);
// ...
});
updateCurrentSpan
Update the active span with output, metadata, or prompt linking:
await client.startActiveSpan('operation', async (span) => {
const result = await doWork();
// Set output
client.updateCurrentSpan({ output: result });
// Set metadata
client.updateCurrentSpan({ metadata: { status: 'done' } });
// Link a prompt
const prompt = await client.prompts.get('my_prompt');
client.updateCurrentSpan({ prompt });
return result;
});
end
OTEL spans are automatically ended when the callback completes. You do not need to call span.end() manually inside startActiveSpan callbacks; the SDK handles this for you, including on errors.
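Conceptually, the callback pattern wraps your function in a try/catch/finally. The sketch below is illustrative only, not the SDK's implementation; MockSpan is a hypothetical stand-in for the OTEL span type:

```typescript
// Minimal sketch of the guarantee startActiveSpan provides:
// the span always ends, and errors are reflected before re-throwing.
interface MockSpan {
  ended: boolean;
  status: 'OK' | 'ERROR';
  end(): void;
}

async function withSpan<T>(
  name: string,
  fn: (span: MockSpan) => Promise<T>
): Promise<T> {
  const span: MockSpan = {
    ended: false,
    status: 'OK',
    end() { this.ended = true; },
  };
  try {
    return await fn(span);   // run the user callback
  } catch (err) {
    span.status = 'ERROR';   // record the failure on the span
    throw err;               // re-throw so callers still see it
  } finally {
    span.end();              // always end, success or failure
  }
}
```

Because the end() call sits in a finally block, early returns and thrown errors both leave the span properly closed.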
LLM Integrations
OpenAI
import { Brokle } from 'brokle';
import { wrapOpenAI } from 'brokle/openai';
import OpenAI from 'openai';
const client = new Brokle({ apiKey: 'bk_...' });
const openai = wrapOpenAI(new OpenAI());
// Chat completions
const response = await openai.chat.completions.create({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Hello!' }]
});
// Streaming
const stream = await openai.chat.completions.create({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Hello!' }],
stream: true
});
for await (const chunk of stream) {
process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
// Embeddings
const embeddings = await openai.embeddings.create({
model: 'text-embedding-3-small',
input: ['Hello world']
});
Anthropic
import { Brokle } from 'brokle';
import { wrapAnthropic } from 'brokle/anthropic';
import Anthropic from '@anthropic-ai/sdk';
const client = new Brokle({ apiKey: 'bk_...' });
const claude = wrapAnthropic(new Anthropic());
const response = await claude.messages.create({
model: 'claude-3-sonnet-20240229',
max_tokens: 1024,
messages: [{ role: 'user', content: 'Hello!' }]
});
Prompts
Fetching Prompts
// Get a prompt by name
const prompt = await client.prompts.get('customer_support');
// Get a specific version
const prompt = await client.prompts.get('customer_support', {
label: 'production'
});
const prompt = await client.prompts.get('customer_support', {
version: 3
});
Using Prompts
// Convert to OpenAI format
const messages = prompt.toOpenAIMessages({
customerName: 'John',
issue: 'billing question'
});
const response = await openai.chat.completions.create({
model: prompt.model || 'gpt-4',
messages
});
Lifecycle Management
Flushing
Force send pending traces:
// Async flush
await client.flush();
Shutdown
Graceful shutdown:
// Flush and close connections
await client.shutdown();
Process Exit Handling
// Handle process exit
process.on('SIGTERM', async () => {
await client.shutdown();
process.exit(0);
});
process.on('beforeExit', async () => {
await client.shutdown();
});
Integration with Web Frameworks
Express:
import express from 'express';
import { Brokle } from 'brokle';
const client = new Brokle({ apiKey: 'bk_...' });
const app = express();
// Middleware for tracing
app.use((req, res, next) => {
client.startActiveSpan(`${req.method} ${req.path}`, async (span) => {
span.setAttribute('method', req.method);
span.setAttribute('path', req.path);
span.setAttribute('userAgent', req.headers['user-agent'] || '');
return new Promise<void>((resolve) => {
let ended = false;
const endSpan = () => {
if (ended) return;
ended = true;
span.setAttribute('statusCode', res.statusCode);
resolve();
};
res.on('finish', endSpan);
res.on('close', endSpan);
next();
});
});
});
// Graceful shutdown
process.on('SIGTERM', async () => {
await client.shutdown();
process.exit(0);
});
Next.js:
// lib/brokle.ts
import { Brokle } from 'brokle';
export const brokle = new Brokle({ apiKey: process.env.BROKLE_API_KEY });
// In API routes
import { brokle } from '@/lib/brokle';
export async function POST(req: Request) {
return await brokle.startActiveSpan('api/chat', async (span) => {
const result = await handleChat(req);
brokle.updateCurrentSpan({ output: result });
return Response.json(result);
});
}
Fastify:
import Fastify from 'fastify';
import { Brokle } from 'brokle';
const client = new Brokle({ apiKey: 'bk_...' });
const app = Fastify();
app.addHook('onRequest', (req, reply, done) => {
client.startActiveSpan(`${req.method} ${req.url}`, async (span) => {
span.setAttribute('method', req.method);
span.setAttribute('url', req.url);
return new Promise<void>((resolve) => {
let ended = false;
const endSpan = () => {
if (ended) return;
ended = true;
span.setAttribute('statusCode', reply.statusCode);
resolve();
};
reply.raw.on('finish', endSpan);
reply.raw.on('close', endSpan);
done();
});
});
});
app.addHook('onClose', async () => {
await client.shutdown();
});
TypeScript Support
The SDK is written in TypeScript with full type definitions:
import { Brokle } from 'brokle';
import type { Span } from '@opentelemetry/api';
const client = new Brokle({ apiKey: 'bk_...' });
interface SearchResult {
results: string[];
count: number;
}
// startActiveSpan is fully typed - the return type matches your callback
const result: SearchResult = await client.startActiveSpan(
'search',
async (span: Span) => {
span.setAttribute('query', 'test');
span.setAttribute('userId', 'user_123');
const data: SearchResult = {
results: ['result1', 'result2'],
count: 2,
};
client.updateCurrentSpan({ output: data });
return data;
}
);
Error Handling
Capturing Errors
Errors thrown inside startActiveSpan callbacks are automatically captured on the span (via span.recordException and span.setStatus), then re-thrown:
await client.startActiveSpan('operation', async (span) => {
// If riskyOperation() throws, the SDK automatically:
// 1. Records the error as an exception on the span
// 2. Sets the span status to ERROR
// 3. Ends the span
// 4. Re-throws the error
const result = await riskyOperation();
client.updateCurrentSpan({ output: result });
return result;
});
If you need to handle specific error types without re-throwing:
try {
await client.startActiveSpan('operation', async (span) => {
const result = await riskyOperation();
client.updateCurrentSpan({ output: result });
return result;
});
} catch (error) {
if (error instanceof ValidationError) {
// Handle validation error - span already has error recorded
console.warn('Validation failed:', error.message);
} else {
throw error;
}
}
SDK Errors
The SDK is designed to never break your application:
// SDK errors are logged but don't propagate
const client = new Brokle({ apiKey: 'invalid_key' });
// This still works - SDK failures are silent
await client.startActiveSpan('operation', async (span) => {
const result = await doWork(); // Your code runs normally
client.updateCurrentSpan({ output: result });
return result;
});
Configuration Reference
| Option | Type | Default | Description |
|---|---|---|---|
| Required | | | |
| apiKey | string | Required | Brokle API key (must start with bk_) |
| Connection | | | |
| baseUrl | string | "https://api.brokle.com" | API endpoint |
| timeout | number | 30000 | Request timeout in milliseconds |
| Control | | | |
| enabled | boolean | true | Master switch to disable SDK completely |
| tracingEnabled | boolean | true | Enable/disable tracing |
| metricsEnabled | boolean | true | Enable/disable metrics collection |
| logsEnabled | boolean | false | Enable/disable OTLP log export |
| Project | | | |
| environment | string | "default" | Environment name (e.g., production, staging) |
| release | string | "" | Release identifier for deployment tracking |
| sampleRate | number | 1.0 | Sampling rate (0.0-1.0) |
| debug | boolean | false | Enable debug logging |
| Batching | | | |
| flushAt | number | 100 | Maximum batch size before flush |
| flushInterval | number | 10 | Auto-flush interval in seconds |
| flushSync | boolean | false | Use SimpleSpanProcessor (for serverless) |
| maxQueueSize | number | 10000 | Maximum spans in queue |
| Transport | | | |
| compression | string | "gzip" | Compression algorithm |
| transport | string | "http" | Transport protocol: "http" or "grpc" (guide) |
| grpcEndpoint | string | undefined | Explicit gRPC endpoint (guide) |
| metricsInterval | number | 60000 | Metrics export interval in milliseconds |
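As an illustration of how these options relate to the environment variables listed earlier, here is a hypothetical helper (not an SDK export; the client does this resolution internally) that maps env vars onto client options and clamps sampleRate into its valid range:

```typescript
// Hypothetical sketch: resolve Brokle client options from env vars.
// Option names follow the configuration table; the helper itself
// is illustrative, not part of the SDK.
interface BrokleOptions {
  apiKey?: string;
  baseUrl?: string;
  environment?: string;
  sampleRate?: number;
  debug?: boolean;
}

function optionsFromEnv(env: Record<string, string | undefined>): BrokleOptions {
  const opts: BrokleOptions = {};
  if (env.BROKLE_API_KEY) opts.apiKey = env.BROKLE_API_KEY;
  if (env.BROKLE_BASE_URL) opts.baseUrl = env.BROKLE_BASE_URL;
  if (env.BROKLE_ENVIRONMENT) opts.environment = env.BROKLE_ENVIRONMENT;
  if (env.BROKLE_SAMPLE_RATE) {
    const rate = Number(env.BROKLE_SAMPLE_RATE);
    // Clamp into the valid 0.0-1.0 range; fall back to the 1.0 default
    // when the value is not a finite number.
    opts.sampleRate = Number.isFinite(rate) ? Math.min(Math.max(rate, 0), 1) : 1.0;
  }
  if (env.BROKLE_DEBUG) opts.debug = env.BROKLE_DEBUG === 'true';
  return opts;
}
```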
Best Practices
1. Initialize Once
// Good - module level
const client = new Brokle({ apiKey: process.env.BROKLE_API_KEY });
export async function process() {
await client.startActiveSpan('op', async (span) => {
// ...
});
}
// Bad - creates new client each call
export async function process() {
const client = new Brokle({ apiKey: '...' }); // Don't do this
}
2. Use Callbacks for Automatic Cleanup
// Good - span is automatically ended and errors are captured
await client.startActiveSpan('op', async (span) => {
const result = await doWork();
client.updateCurrentSpan({ output: result });
return result;
});
// The callback pattern guarantees:
// - span.end() is always called (even on error)
// - Errors are recorded on the span
// - Span status is set correctly
3. Add Meaningful Context
await client.startActiveSpan('process_order', async (span) => {
span.setAttribute('orderId', order.id);
span.setAttribute('customerTier', customer.tier);
span.setAttribute('userId', user.id);
const result = await processOrder(order);
client.updateCurrentSpan({ output: result });
return result;
});
4. Handle Shutdown
process.on('SIGTERM', async () => {
await client.shutdown();
process.exit(0);
});
Always call await client.shutdown() before your process exits to ensure all traces are sent.
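If you register handlers for several exit paths (SIGTERM, SIGINT, beforeExit), guard against shutdown running more than once. A small sketch, where registerShutdown is an illustrative helper (not an SDK export) and the shutdown parameter stands in for client.shutdown():

```typescript
// Illustrative sketch: run a shutdown routine exactly once across
// the exit paths a Node.js process typically sees.
function registerShutdown(shutdown: () => Promise<void>): void {
  let called = false;
  const once = async () => {
    if (called) return;   // guard against double shutdown
    called = true;
    await shutdown();
  };
  process.on('SIGTERM', async () => { await once(); process.exit(0); });
  process.on('SIGINT', async () => { await once(); process.exit(0); });
  process.on('beforeExit', once);
}
```

Usage would look like registerShutdown(() => client.shutdown()), so a SIGTERM followed by beforeExit does not flush twice.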
Troubleshooting
Traces Not Appearing
- Check API key: console.log(process.env.BROKLE_API_KEY)
- Ensure shutdown: add await client.shutdown() before exit
- Enable debug mode: new Brokle({ debug: true })
High Memory Usage
Reduce batch size and flush interval:
const client = new Brokle({
flushAt: 50,
flushInterval: 2, // Flush every 2 seconds
});
Slow Performance
Enable sampling for high-throughput workloads:
const client = new Brokle({ sampleRate: 0.1 }); // 10% sampling