Overview

This guide shows you how to integrate Latitude Telemetry into an existing application that uses Together AI. After completing these steps:
  • Every Together AI generation can be captured as a log in Latitude.
  • Logs are attached to a specific prompt and version in Latitude.
  • You can annotate, evaluate, and debug your Together-powered features from the Latitude dashboard.
You’ll keep calling Together directly — Telemetry simply observes and enriches those calls.

Requirements

Before you start, make sure you have:
  • A Latitude account and API key.
  • At least one prompt created in Latitude.
  • A Node.js-based project that uses together-ai.

Steps

Step 1: Install requirements

Add the Latitude Telemetry package to your project:
npm add @latitude-data/telemetry @opentelemetry/api
Step 2: Initialize Latitude Telemetry with Together AI

Create a LatitudeTelemetry instance and pass the Together client class as an instrumentation.
import { LatitudeTelemetry } from '@latitude-data/telemetry'
import { Together } from 'together-ai'

export const telemetry = new LatitudeTelemetry('your-latitude-api-key', {
  instrumentations: {
    together: Together, // Enables automatic tracing for Together
  },
})
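Hardcoding the API key works for a quick test, but in practice you will likely read it from the environment. Below is a small helper of our own making (`requireEnv` is not part of the Latitude SDK) that fails fast at startup when the variable is missing, so misconfiguration surfaces immediately instead of as silently missing logs:

```typescript
// Hypothetical helper, not part of @latitude-data/telemetry: reads an
// environment variable and throws immediately if it is unset or empty.
export function requireEnv(name: string): string {
  const value = process.env[name]
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`)
  }
  return value
}

// Usage with the setup above (assumes you export a LATITUDE_API_KEY variable):
// export const telemetry = new LatitudeTelemetry(requireEnv('LATITUDE_API_KEY'), {
//   instrumentations: { together: Together },
// })
```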
Step 3: Wrap your Together AI-powered feature

Wrap the code that calls Together with a Telemetry prompt span, and execute your Together call inside that span.
import { context } from '@opentelemetry/api'
import { BACKGROUND } from '@latitude-data/telemetry'
import { Together } from 'together-ai'
import { telemetry } from './telemetry' // the instance created in Step 2; adjust the path to your project

export async function generateSupportReply(input: string) {
  const $prompt = telemetry.prompt(BACKGROUND(), {
    promptUuid: 'your-prompt-uuid',
    versionUuid: 'your-version-uuid',
  })

  await context
    .with($prompt.context, async () => {
      const together = new Together({
        apiKey: process.env.TOGETHER_API_KEY!,
      })

      const response = await together.chat.completions.create({
        model: 'meta-llama/Meta-Llama-3-70B-Instruct',
        messages: [
          { role: 'system', content: 'You are a helpful support assistant.' },
          { role: 'user', content: input },
        ],
      })

      // Use response here...
    })
    .then(() => $prompt.end())
    .catch((error) => $prompt.fail(error as Error))
    .finally(() => telemetry.flush())
}
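The span lifecycle above (end on success, fail on error) can be factored into a small reusable helper if you wrap several features this way. This is a sketch of our own making, not part of the Latitude SDK; it only assumes the `end` and `fail` methods shown in the example:

```typescript
// Hypothetical helper (not part of @latitude-data/telemetry): runs a function
// and mirrors its outcome onto a span-like object.
type SpanLike = {
  end(): void
  fail(error: Error): void
}

export async function withSpan<T>(span: SpanLike, fn: () => Promise<T>): Promise<T> {
  try {
    const result = await fn()
    span.end() // mark the span as completed successfully
    return result
  } catch (error) {
    span.fail(error as Error) // record the error on the span
    throw error // rethrow so callers still observe the failure
  }
}
```

You would still run the function inside `context.with($prompt.context, …)` and call `telemetry.flush()` afterwards, exactly as in the example above; the helper only replaces the `.then`/`.catch` bookkeeping.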

Seeing your logs in Latitude

Once you’ve wrapped your Together AI-powered feature, every run produces a log you can inspect in Latitude.
  1. Go to the Traces section of your prompt in Latitude.
  2. You should see new entries every time your code is executed, including:
    • Input/output messages
    • Model name
    • Latency and error information