Overview
This guide shows you how to integrate Latitude Telemetry into an existing application that uses the official OpenAI SDK. After completing these steps:
- Every OpenAI call (e.g. chat.completions.create) can be captured as a log in Latitude.
- Logs are attached to a specific prompt and version in Latitude.
- You can annotate, evaluate, and debug your OpenAI-powered features from the Latitude dashboard.
You’ll keep calling OpenAI directly — Telemetry simply observes and enriches those calls.
Requirements
Before you start, make sure you have:
- A Latitude account and API key.
- At least one prompt created in Latitude (so you have a promptUuid and versionUuid to associate logs with).
- A Node.js-based project that uses the OpenAI SDK.
Steps
1. Install requirements
Add the Latitude Telemetry package to your project:
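For example, with npm (the package name below is an assumption based on Latitude's SDK naming; check the Latitude docs for the exact package, and add the OpenAI SDK too if your project doesn't already have it):

```shell
# Assumed package name -- verify against Latitude's installation docs.
npm install @latitude-data/telemetry openai
```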
2. Initialize Latitude Telemetry with OpenAI
Create a LatitudeTelemetry instance and pass the OpenAI SDK as an instrumentation. Then import telemetry (and optionally openai) wherever you need to run prompts.
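A minimal initialization sketch. The package name, the `instrumentations` option shape, and the environment variable names are assumptions based on the description above; check the Latitude Telemetry SDK reference for the exact constructor signature.

```typescript
// telemetry.ts -- create one shared telemetry instance for the app.
// Package name and constructor options are assumptions; verify them
// against the Latitude Telemetry SDK reference.
import { LatitudeTelemetry } from '@latitude-data/telemetry'
import OpenAI from 'openai'

// Passing the OpenAI SDK as an instrumentation lets Telemetry observe
// every call made through clients of that SDK.
export const telemetry = new LatitudeTelemetry(process.env.LATITUDE_API_KEY!, {
  instrumentations: {
    openai: OpenAI,
  },
})

// A regular OpenAI client -- you keep calling OpenAI directly.
export const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })
```

Exporting both from one module keeps a single telemetry instance (and a single instrumented OpenAI client) shared across your features.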
3. Wrap your OpenAI-powered feature
Wrap the code that calls OpenAI with a Telemetry prompt span, and execute your OpenAI call inside that span.
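A sketch of what that wrapping can look like. The `telemetry.prompt` helper name and its argument shape are hypothetical here, as are the placeholder UUIDs and the model name; substitute the span API documented for your SDK version and the promptUuid/versionUuid of your Latitude prompt.

```typescript
// feature.ts -- wrap the OpenAI call in a Latitude prompt span.
// `telemetry.prompt` and its options are hypothetical names; use the
// span helper documented for your version of the Telemetry SDK.
import { telemetry, openai } from './telemetry'

export async function answerQuestion(question: string) {
  // Everything executed inside the span is captured as a log attached
  // to the given prompt and version in Latitude.
  return telemetry.prompt(
    {
      promptUuid: 'your-prompt-uuid',   // from your Latitude prompt
      versionUuid: 'your-version-uuid', // from your Latitude prompt
    },
    async () => {
      const completion = await openai.chat.completions.create({
        model: 'gpt-4o-mini',
        messages: [{ role: 'user', content: question }],
      })
      return completion.choices[0].message.content
    },
  )
}
```

The OpenAI call itself is unchanged; the span only determines which prompt and version the resulting log is attached to.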
Seeing your logs in Latitude
Once you’ve wrapped your OpenAI-powered feature, you can see your logs in Latitude.
- Go to the Traces section of your prompt in Latitude.
- You should see new entries every time your code is executed, including:
- Input/output messages
- Model name
- Latency and error information