Overview
This guide shows you how to integrate Latitude Telemetry into an existing application that uses the official OpenAI SDK. After completing these steps:

- Every OpenAI call (e.g. `chat.completions.create`) can be captured as a log in Latitude.
- Logs are grouped under a prompt, identified by a path, inside a Latitude project.
- You can inspect inputs/outputs, measure latency, and debug OpenAI-powered features from the Latitude dashboard.
You’ll keep calling OpenAI exactly as you do today — Telemetry simply observes
and enriches those calls.
Requirements
Before you start, make sure you have:

- A Latitude account and API key
- A Latitude project ID
- A Node.js-based project that uses the OpenAI SDK
Steps
1. Install requirements
Add the Latitude Telemetry package to your project:
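The original install command was not preserved here; assuming the package is published as `@latitude-data/telemetry` (verify the name against Latitude's docs for your SDK version), the install step would be:

```shell
npm install @latitude-data/telemetry
```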
2. Initialize Latitude Telemetry
Create a single `LatitudeTelemetry` instance when your app starts, for example in a dedicated `telemetry.ts` module. You must pass the OpenAI SDK so Telemetry can instrument it.
The Telemetry instance should only be created once. Any OpenAI client
instantiated after this will be automatically traced.
3. Wrap your OpenAI-powered feature
Wrap the code that calls OpenAI using `telemetry.capture`. The path:

- Identifies the prompt in Latitude
- Can be new or existing
- Should not contain spaces or special characters (use letters, numbers, `-`, `_`, `/`, `.`)
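Putting the pieces together, wrapping a feature might look like the sketch below. The `capture` options shown (`projectId` and `path`) follow the requirements listed above, but the exact signature, the feature name, and the `features/summarize` path are illustrative assumptions; confirm them against your SDK version:

```typescript
// feature.ts
import OpenAI from 'openai'
import { telemetry } from './telemetry'

const openai = new OpenAI() // reads OPENAI_API_KEY from the environment

// Hypothetical feature: everything inside the capture callback is logged
// under the given prompt path in your Latitude project.
export async function summarize(text: string) {
  return telemetry.capture(
    {
      projectId: Number(process.env.LATITUDE_PROJECT_ID),
      path: 'features/summarize', // letters, numbers, - _ / . only
    },
    async () => {
      const completion = await openai.chat.completions.create({
        model: 'gpt-4o-mini',
        messages: [{ role: 'user', content: `Summarize: ${text}` }],
      })
      return completion.choices[0].message.content
    },
  )
}
```

Each invocation of `summarize` produces one trace under the `features/summarize` prompt.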
Seeing your logs in Latitude
Once your feature is wrapped, logs will appear automatically.

- Open the prompt in your Latitude dashboard (identified by its path)
- Go to the Traces section
- Each execution will show:
  - Input and output messages
  - Model and token usage
  - Latency and errors
- One trace appears per feature invocation