IBM watsonx on Future AGI
Cloud Platforms
IBM watsonx.ai foundation models for regulated workloads.
What you get
Everything traced, scored, and improvable
One install instruments IBM watsonx via OpenTelemetry; Future AGI then layers evaluators, optimizers, and simulations on top of the same trace tree. No second SDK, no double instrumentation.
Trace
Auto-instrumented spans
Every IBM watsonx call becomes a span — inputs, outputs, latency, tokens, cost, model name, tool args, retrieval results, and chain steps captured automatically.
Evaluate
70+ evaluators on every span
Attach Groundedness, Context Relevance, Prompt Injection, Toxicity, and 70+ more — purpose-built scorers powered by the Turing eval models, not generic LLM-as-judge.
Optimize
Closed-loop improvement
Pipe failed traces into agent-opt: GEPA, PromptWizard, ProTeGi, and Bayesian search rewrite your prompts with proof of measured gains.
Simulate
Adversarial scenarios at scale
Generate hundreds of personas and run them through your IBM watsonx agent before launch — text and voice, scripted or persona-driven.
Quickstart · <3 min
Instrument IBM watsonx in three steps
Step 1
Install the traceAI package
One package per language, shipped from PyPI, npm, and Maven Central.
Step 2
Register the trace provider
Set FI_API_KEY and FI_SECRET_KEY, then call new WatsonxInstrumentor().instrument(provider).
Step 3
Run your existing IBM watsonx app
No code changes. Traces appear in the Future AGI dashboard within seconds.
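Step 2 assumes the two credential variables are already present in the environment. A minimal sketch that fails fast before registration when they are missing; the class and method names here are illustrative helpers, not part of the traceAI API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class FiEnvCheck {
    // Returns the names of required Future AGI variables absent (or blank)
    // in the given environment map.
    static List<String> missingKeys(Map<String, String> env) {
        List<String> missing = new ArrayList<>();
        for (String name : new String[] {"FI_API_KEY", "FI_SECRET_KEY"}) {
            String value = env.get(name);
            if (value == null || value.isBlank()) {
                missing.add(name);
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        List<String> missing = missingKeys(System.getenv());
        if (missing.isEmpty()) {
            System.out.println("Future AGI credentials found");
        } else {
            System.out.println("Set before Step 2: " + String.join(", ", missing));
        }
    }
}
```

Running this before calling instrument() turns a silent "no traces in the dashboard" failure into an immediate, named error.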
Install
<dependency>
<groupId>ai.futureagi</groupId>
<artifactId>traceai-java-watsonx</artifactId>
<version>LATEST</version>
</dependency>
Instrument
import ai.futureagi.fi.instrumentation.TraceProvider;
import ai.futureagi.traceai.watsonx.WatsonxInstrumentor;
TraceProvider provider = TraceProvider.builder()
.projectName("watsonx_app")
.projectType("observe")
.build();
new WatsonxInstrumentor().instrument(provider);
// Your existing IBM watsonx code runs unchanged.
// Every call is now an OpenTelemetry span in Future AGI.
Pick a recipe
What do you want to do with IBM watsonx?
Each recipe is a copy-paste page with the exact code, the gotchas, and a working example you can clone.
Recipes for IBM watsonx
Adjacent integrations
Other cloud platforms
Vertex AI
Google Cloud's hosted Gemini, Anthropic, and Llama endpoints.
AWS Bedrock
Amazon Bedrock invocation across Claude, Llama, Mistral, Nova, and Titan.
Azure OpenAI
Microsoft Azure's regulated OpenAI deployments and assistants.
Replicate
Run open-source AI models on Replicate's serverless GPUs.