Future AGI × Ollama

Ollama on Future AGI


Run Llama, Mistral, and Qwen locally via the Ollama runtime.

Languages: Python, TypeScript, Java · Capabilities: trace, evaluate

What you get

Everything traced, scored, and improvable

A single install instruments Ollama with OpenTelemetry. Future AGI then layers evaluators, optimizers, and simulations on top of the same trace tree: no second SDK, no double instrumentation.

Trace

Auto-instrumented spans

Every Ollama call becomes a span: inputs, outputs, latency, tokens, cost, model name, tool arguments, retrieval results, and chain steps are captured automatically.
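As a rough illustration of what one such span carries, the sketch below uses attribute names in the style of the OpenTelemetry GenAI semantic conventions; traceAI's exact schema and the values shown are assumptions, not the instrumentor's documented output.

```python
# Hypothetical attributes recorded on a span for a single Ollama chat call.
# Names follow the OpenTelemetry GenAI semantic conventions; traceAI's
# actual schema may differ. All values are placeholders.
span_attributes = {
    "gen_ai.system": "ollama",
    "gen_ai.request.model": "llama3.1",
    "gen_ai.usage.input_tokens": 42,
    "gen_ai.usage.output_tokens": 128,
    "gen_ai.response.finish_reasons": ["stop"],
}

# Latency lives in the span's start/end timestamps rather than an attribute;
# token counts can be combined for cost accounting.
total_tokens = (
    span_attributes["gen_ai.usage.input_tokens"]
    + span_attributes["gen_ai.usage.output_tokens"]
)
```

Evaluators and dashboards key off attributes like these, which is why no extra logging code is needed in your app.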

Evaluate

70+ evaluators on every span

Attach Groundedness, Context Relevance, Prompt Injection, Toxicity, and 70+ more — purpose-built scorers powered by the Turing eval models, not generic LLM-as-judge.

Optimize

Closed-loop improvement

Pipe failed traces into agent-opt: GEPA, PromptWizard, ProTeGi, and Bayesian search rewrite your prompts with proof of measured gains.

Simulate

Adversarial scenarios at scale

Generate hundreds of personas and run them through your Ollama agent before launch — text and voice, scripted or persona-driven.

Quickstart · <3 min

Instrument Ollama in three steps

  1. Install the traceAI package

    One package per language, shipped via PyPI, npm, and Maven Central.

  2. Register the trace provider

    Set the FI_API_KEY and FI_SECRET_KEY environment variables, then call OllamaInstrumentor().instrument().

  3. Run your existing Ollama app

    No code changes required. Traces appear in the Future AGI dashboard within seconds.
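The credentials from step 2 can be exported in the shell before launching your app (the values below are placeholders for the keys from your Future AGI account):

```shell
# Credentials from your Future AGI account (placeholder values).
export FI_API_KEY="your-api-key"
export FI_SECRET_KEY="your-secret-key"
```

With these set, the register() call picks them up automatically at startup.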

Install

pip install traceAI-ollama

Instrument

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_ollama import OllamaInstrumentor

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="OLLAMA_APP",
)

OllamaInstrumentor().instrument(tracer_provider=trace_provider)

# Your existing Ollama code runs unchanged from here.
# Every call is now an OpenTelemetry span in Future AGI.
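The instrumentor hooks calls made through the Ollama client; under the hood those calls hit Ollama's local REST API. As a sketch of that request shape, the snippet below builds (but does not send) a chat request against Ollama's documented /api/chat endpoint, assuming the default port 11434 and a model you have pulled locally:

```python
import json
import urllib.request

# Build a chat request for Ollama's local REST API (default port 11434).
payload = {
    "model": "llama3.1",  # assumes this model has been pulled locally
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment with a running Ollama server; once instrumented, the client-side
# call shows up as a span in Future AGI.
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["message"]["content"])
```

Note that tracing attaches to the Ollama client library, so your application code stays exactly as it was.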

Pick a recipe

What do you want to do with Ollama?

Each recipe is a copy-paste-ready page with the exact code, the gotchas, and a working example you can clone.