Trace Google GenAI
Auto-instrument Google GenAI with traceAI in under 3 minutes. Every LLM call, tool use, retrieval, and chain step becomes an OpenTelemetry span you can search, replay, and debug.
Recipes for Google GenAI
Prerequisites
Before you start
- A working Google GenAI app, local or already in production.
- A free Future AGI account with `FI_API_KEY` and `FI_SECRET_KEY`.
- Python 3.9+ / Node 18+ / Java 17+, depending on which SDK you're installing.
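A quick pre-flight check for the two keys above can be sketched in Python. The helper name `missing_credentials` is ours for illustration, not part of the SDK:

```python
import os

def missing_credentials(env=os.environ):
    """Return the Future AGI credential names that are absent from env."""
    required = ("FI_API_KEY", "FI_SECRET_KEY")
    return [name for name in required if not env.get(name)]

# Run this before register(); warn early instead of failing later.
problems = missing_credentials()
if problems:
    print("Set these env vars first:", ", ".join(problems))
```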
Install
```shell
pip install traceAI-google-genai
```

Trace recipe
```python
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_google_genai import GoogleGenAIInstrumentor

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="GOOGLE_GENAI_APP",
)
GoogleGenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Your existing Google GenAI code runs unchanged from here.
# Every call is now an OpenTelemetry span in Future AGI.
```

What Future AGI captures
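With the instrumentor registered, an ordinary Google GenAI call is traced automatically. A minimal sketch, assuming the `google-genai` SDK is installed and a Gemini API key is set in your environment; the model name is a placeholder:

```python
from google import genai

# The client picks up the Gemini API key from the environment.
client = genai.Client()

# This call is captured as an LLM span: inputs, outputs, tokens, latency.
response = client.models.generate_content(
    model="gemini-2.0-flash",  # placeholder model name for illustration
    contents="Write a one-line haiku about tracing.",
)
print(response.text)
```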
Trace fields you'll see in the dashboard
- Spans for every Google GenAI call: input, output, latency, tokens, cost, model name, errors
- Trace tree across LLM, tool, retrieval, embedding, and chain spans
- Custom attributes via `using_attributes` (session_id, user_id, prompt_template, tags, custom dicts)
- Streaming-safe: partial chunks are aggregated into a single span
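The custom-attribute hook can be sketched like this. The keyword names follow the `using_attributes` parameters listed above; the session, user, and template values are placeholders:

```python
from fi_instrumentation import using_attributes

# Placeholder IDs; every span created inside the block carries them.
with using_attributes(
    session_id="session-123",
    user_id="user-456",
    prompt_template="Answer as {persona}.",
    tags=["beta", "gemini"],
):
    ...  # your Google GenAI calls go here
```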
Common gotchas
Read these before you ship
1. Set `FI_API_KEY` and `FI_SECRET_KEY` in your environment before calling `register()`; otherwise `register()` falls back silently instead of failing loudly.
2. Async frameworks: instantiate the instrumentor *before* you create the client, not after.
3. Streaming responses are aggregated into a single span only when you use the official SDK iterator.
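The streaming gotcha in practice: consume the official SDK iterator so the instrumentor can fold every chunk into one span. A sketch, again assuming the `google-genai` SDK and a placeholder model name:

```python
from google import genai

client = genai.Client()

# Iterating the official stream lets the instrumentor aggregate
# all partial chunks into a single streaming span.
for chunk in client.models.generate_content_stream(
    model="gemini-2.0-flash",  # placeholder model name
    contents="Stream a short haiku about spans.",
):
    print(chunk.text, end="")
```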
Next: chain it with the other recipes
Trace is the first step. Most teams add an evaluator the same week and start optimising or simulating once they have a baseline. Each recipe takes minutes to wire up.
Adjacent integrations
More integrations like Google GenAI
OpenAI
GPT-4o, GPT-5, o-series, and the OpenAI Responses API.
Anthropic
Claude Opus, Sonnet, and Haiku via the Anthropic Messages API.
Cohere
Command, Embed, and Rerank via the Cohere API.
Mistral
Mistral Large, Codestral, and open-weight Mistral / Mixtral.