December 2, 2024

New documentation for Google Vertex AI and Gemini tracing

Marc Klingen

Comprehensive guides for tracing Google Vertex AI and Gemini models with Langfuse

We’ve published new documentation on how to trace Google Vertex AI and Gemini models with Langfuse. The guide walks you through adding observability to applications built on Vertex AI, including the Gemini model family.

What’s Included

  1. Step-by-step Integration Guide: Detailed instructions for setting up Langfuse tracing with Google Vertex AI, including code examples and best practices.
  2. Framework Examples: Ready-to-use code snippets demonstrating how to implement tracing for application frameworks such as LangChain.

Key Tracing Features

  • Automatic capture of prompts, completions, and token usage
  • Latency tracking for model calls
  • Cost calculation for Vertex AI usage
  • Support for multi-modal inputs with Gemini
  • Structured logging of model parameters and metadata
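To illustrate the cost-calculation feature conceptually: Langfuse derives per-call cost from captured token usage and per-token model pricing. The sketch below uses made-up prices, not actual Vertex AI rates.

```python
# Illustrative only: per-call cost from token counts and per-1k-token prices.
# The prices below are placeholders, not real Vertex AI pricing.
PRICES_PER_1K = {"input": 0.000125, "output": 0.000375}  # hypothetical USD


def call_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one model call, given token usage and per-1k-token prices."""
    return (
        (input_tokens / 1000) * PRICES_PER_1K["input"]
        + (output_tokens / 1000) * PRICES_PER_1K["output"]
    )


# Example: 1,000 input tokens and 1,000 output tokens at the prices above.
print(call_cost(1000, 1000))  # → 0.0005
```

In practice you don't compute this yourself: Langfuse applies the model's pricing to the token usage it captures on each generation.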

How to Get Started

  1. Check out our new documentation
  2. Follow the setup instructions to configure your Google Cloud credentials
  3. Implement tracing using our SDK examples
  4. Start monitoring your Vertex AI and Gemini model calls in the Langfuse UI

By the way, Google Vertex AI and Gemini models are also supported in the LLM Playground, Prompt Experiments, and LLM-as-a-judge evaluations (changelog).
