Overview
This guide shows you how to integrate Latitude Telemetry into an existing application that uses the Vercel AI SDK. After completing these steps:
- Each `generateText` call can be captured as a log in Latitude.
- Logs are attached to a specific prompt and version in Latitude.
- You can annotate, evaluate, and debug your Vercel AI SDK-powered features from the Latitude dashboard.
You keep using the Vercel AI SDK as usual; Telemetry observes calls when telemetry is enabled in `generateText`.
Requirements
Before you start, make sure you have:
- A Latitude account and API key.
- At least one prompt created in Latitude.
- A Node.js-based project that uses the Vercel AI SDK (e.g. `ai`, `@ai-sdk/openai`, etc.).
Steps
1
Install requirements
Add the Latitude Telemetry package to your project:
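For example, with npm (the package name below, `@latitude-data/telemetry`, is the expected one; double-check it against the Latitude docs for your setup):

```bash
npm install @latitude-data/telemetry
```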
2
Initialize Latitude Telemetry
Create a `LatitudeTelemetry` instance. No specific instrumentation is required for the Vercel AI SDK.
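A minimal sketch of the initialization, assuming the package is `@latitude-data/telemetry` and that the `LatitudeTelemetry` constructor takes your Latitude API key (consult the Telemetry reference for the exact constructor options):

```typescript
import { LatitudeTelemetry } from '@latitude-data/telemetry'

// One telemetry instance for the whole app.
// No instrumentation option is needed for the Vercel AI SDK:
// it emits its own telemetry when enabled on each call.
const telemetry = new LatitudeTelemetry(process.env.LATITUDE_API_KEY!)
```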
3
Wrap your Vercel AI SDK-powered feature
Wrap the code that calls `generateText` with a Telemetry prompt span, and make sure telemetry is enabled in the Vercel AI SDK call.
Important: The `experimental_telemetry.isEnabled` flag must be set to `true` on `generateText` for Latitude Telemetry to capture these calls.
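A sketch of what the wrapped feature can look like. The `experimental_telemetry.isEnabled` option is the Vercel AI SDK's own flag; the `telemetry.span({ prompt: ... }, fn)` wrapper, the `answerQuestion` function, the module path, and the placeholder prompt/version UUIDs are illustrative assumptions, so refer to the Latitude Telemetry reference for the exact prompt-span API.

```typescript
import { generateText } from 'ai'
import { openai } from '@ai-sdk/openai'
import { telemetry } from './telemetry' // the instance from step 2 (hypothetical module path)

// NOTE: the `telemetry.span({ prompt }, fn)` wrapper below is a hypothetical
// prompt-span helper shown for illustration; use the span API documented
// for your version of Latitude Telemetry.
export async function answerQuestion(question: string) {
  return telemetry.span(
    {
      prompt: {
        uuid: 'your-prompt-uuid', // the Latitude prompt these logs attach to
        versionUuid: 'your-version-uuid', // optional: pin logs to a specific version
      },
    },
    () =>
      generateText({
        model: openai('gpt-4o-mini'),
        prompt: question,
        // Required: without this flag the Vercel AI SDK emits no telemetry,
        // so Latitude Telemetry cannot capture the call.
        experimental_telemetry: { isEnabled: true },
      }),
  )
}
```

Each time this function runs, the wrapped `generateText` call should appear as a new log under the referenced prompt in Latitude.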
Seeing your logs in Latitude
Once you’ve wrapped your Vercel AI SDK-powered feature and enabled telemetry in `generateText`, you can see your logs in Latitude.
- Go to the Traces section of your prompt in Latitude.
- You should see new entries every time your code is executed, including:
  - Prompt text and generated output
  - Underlying provider/model used
  - Latency and error information