What is MCP?
Model Context Protocol (MCP) is emerging as the industry standard for connecting LLMs to external tools and data sources. Just as LLM APIs converged around the OpenAI specification, we're seeing a similar consolidation around MCP as the unified protocol for LLM applications. You can read more about MCP here.
At Future AGI, we’re thrilled about the possibilities that MCP brings to our ecosystem. It opens up powerful new ways for our customers to interact with our platform and create more sophisticated AI applications.
Why Future AGI MCP Matters
LLMs are powerful, but to make them truly useful in real-world workflows, they need access to tools, data, and evaluations. That's what the Model Context Protocol (MCP) enables.
With Future AGI’s MCP Server, you can do the following using natural language:
- Run automatic evaluations - Evaluate single inputs or batches against the evaluation metrics available in Future AGI, on individual data points as well as large datasets
- Prototype and observe your agents - Add observability and evaluations while prototyping and while deploying your agents to production, all through natural language
- Manage datasets - Upload, evaluate, and download datasets, and surface insights with natural language
- Add protection rules - Apply toxicity detection, prompt-injection protection, and other guardrails to your applications automatically through chat
- Generate synthetic data - Create synthetic datasets by describing the dataset and your objective

Image 1: Future AGI's MCP Server features
How to Set Up Future AGI MCP?
Create an account at http://app.futureagi.com/ and obtain your API key and Secret key from the dashboard. These credentials are required for the Future AGI MCP Server to authenticate with our API.
To run the server locally:
git clone https://github.com/future-agi/futureagi-mcp-server.git
cd futureagi-mcp-server
# Install dependencies (uv via Homebrew on macOS; see the uv docs for other platforms)
brew install uv
uv sync
# Set environment variables
export FI_API_KEY="your_api_key"
export FI_SECRET_KEY="your_secret_key"
# Run server
python main.py
Configure your MCP clients (Cursor/Claude Desktop) with:
{
  "mcpServers": {
    "FutureAGI-MCP": {
      "command": "uvx",
      "args": ["futureagi-mcp-server"],
      "env": {
        "FI_API_KEY": "your_api_key",
        "FI_SECRET_KEY": "your_secret_key"
      }
    }
  }
}
You can also add the Future AGI docs MCP to your clients by running the command below in your terminal. It will prompt you to choose among the MCP clients installed on your system; choosing all adds the configuration to every client.
npx @mintlify/mcp@latest add futureagi

Image 2: Future AGI MCP with tools like evaluate, protect, upload_dataset, and command configurations in Cursor IDE.
Exploring the Future AGI Features Using MCP
4.1 Evaluating single and batch inputs
You can provide a prompt like "Evaluate the following inputs for tone and toxicity" along with your input. The AI assistant automatically fetches all available evaluators from the Future AGI platform, runs the evaluation on your data, and returns the output in a structured format.

Image 3: MCP tool evaluation detecting toxicity in text

Image 4: MCP deterministic evaluation validating an image of Asian and Indian men
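To make "structured format" concrete: the tool's response typically maps each evaluator to a score. The snippet below is a rough illustration of how you might summarize such results; the field names (`evaluator`, `score`, `passed`) are assumptions for illustration, not Future AGI's actual response schema.

```python
# Hypothetical structured output from an "evaluate" tool call.
# Field names are illustrative assumptions, not the real schema.
results = [
    {"evaluator": "tone", "score": 0.92, "passed": True},
    {"evaluator": "toxicity", "score": 0.18, "passed": False},
]

def summarize(results):
    """Group evaluator scores into passed/failed buckets."""
    summary = {"passed": [], "failed": []}
    for r in results:
        bucket = "passed" if r["passed"] else "failed"
        summary[bucket].append((r["evaluator"], r["score"]))
    return summary

print(summarize(results))
# {'passed': [('tone', 0.92)], 'failed': [('toxicity', 0.18)]}
```

In practice the assistant does this aggregation for you; the point is that structured scores are easy to post-process programmatically as well.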
4.2 A Powerful Use Case: Conversing with Your Data Using Cursor and Future AGI
Let's explore an exciting example of how you can leverage Cursor as a natural interface to interact with your organization's data, powered by Future AGI.
Imagine being able to evaluate your data through a simple conversation, without ever touching a graphical UI. Thanks to our Future AGI MCP server, this is now a reality.
Suppose you ask Cursor:
“Can you find rag_chat.csv, upload it to Future AGI, suggest three evaluations, and add them to the dataset?”
Here’s what happens behind the scenes:
- File Discovery & Upload: Cursor searches for rag_chat.csv locally and uploads it to the Future AGI platform.
- Evaluator Selection: It automatically fetches all available evaluators and selects three appropriate ones.
- Evaluation Assignment: These selected evaluations are then attached to the dataset.
It also tries to correct itself based on errors thrown by the tool, as shown below.

Image 5: Cursor interface using MCP to upload rag_chat.csv

Image 6: Future AGI MCP adding three LLM evaluations: context relevance, factual accuracy, and groundedness
- Insight Delivery: Once evaluations are complete, you can ask Cursor to download the evaluated dataset and present the insights, again entirely through natural language.

Image 7: Evaluation insights from Future AGI MCP showing context relevance, factual accuracy, and groundedness scores
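Under the hood, the evaluator-selection step amounts to matching the dataset's columns against the evaluators the platform exposes. Here is a minimal stdlib-only sketch of that idea; the evaluator catalog and the matching rule are assumptions for illustration, not Future AGI's implementation.

```python
# Hypothetical catalog mapping evaluator names to the dataset
# columns they require. The real list is fetched from the platform.
AVAILABLE_EVALUATORS = {
    "context_relevance": {"question", "context"},
    "factual_accuracy": {"question", "answer"},
    "groundedness": {"answer", "context"},
    "toxicity": {"answer"},
}

def select_evaluators(columns, limit=3):
    """Pick up to `limit` evaluators whose required columns are present."""
    cols = set(columns)
    eligible = [name for name, required in AVAILABLE_EVALUATORS.items()
                if required <= cols]
    return eligible[:limit]

# Columns a RAG chat export like rag_chat.csv might contain.
print(select_evaluators(["question", "context", "answer"]))
# ['context_relevance', 'factual_accuracy', 'groundedness']
```

The assistant performs this reasoning itself from the evaluator descriptions, but the shape of the decision is the same: which evaluations does this data actually support?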
4.3 Synthetic Data Generation
You can now ask your assistant to generate synthetic data for a specific use case. It decides on the dataset columns and their types and sends them to Future AGI. Data generation then runs in the background; after a short wait, ask the assistant to download the data and it will be saved to your local folder. Be specific about the dataset for best results.

Image 8: Cursor interface showing synthetic dataset generation and insights for e-commerce customer support
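The column-and-type plan the assistant drafts can be thought of as a small schema spec. Below is a hedged sketch of what such a spec and a basic validity check might look like; the structure and field names are assumptions for illustration, not Future AGI's actual request format.

```python
# Illustrative schema an assistant might draft for an e-commerce
# customer-support dataset. Not Future AGI's actual request format.
ALLOWED_TYPES = {"string", "integer", "float", "boolean"}

schema = {
    "objective": "E-commerce customer support conversations",
    "columns": [
        {"name": "customer_query", "type": "string"},
        {"name": "agent_response", "type": "string"},
        {"name": "resolved", "type": "boolean"},
    ],
}

def validate_schema(spec):
    """Check that every column has a name and a supported type."""
    return all(
        col.get("name") and col.get("type") in ALLOWED_TYPES
        for col in spec.get("columns", [])
    )

print(validate_schema(schema))  # True
```

Being explicit about columns and objective in your prompt, as recommended above, effectively lets the assistant fill this spec with less guesswork.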
4.4 Seamlessly Add Code Observability and Prototype Using Natural Language
A simple prompt like:
“Can you search for crew_ai instrumentation in the Future AGI docs and suggest the code changes using the custom trace_provider?”
…is all it takes.
Cursor takes care of the rest: searching the documentation, understanding the relevant instrumentation steps, and generating the necessary code changes. No need to manually comb through pages of docs. Just ask, and it delivers.

Image 9: Cursor editor displaying Crew AI instrumentation setup using Future AGI’s trace_provider integration
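Conceptually, a trace provider is just an object that hands out spans recording what each step did and how long it took. The toy sketch below shows the pattern with the standard library only; it is not Future AGI's or OpenTelemetry's actual API, just the shape of the idea the instrumentation builds on.

```python
import time
from contextlib import contextmanager

class TraceProvider:
    """Toy trace provider: records named spans with their durations.
    Illustrates the pattern only; real instrumentation would use an
    OpenTelemetry-compatible provider."""

    def __init__(self):
        self.spans = []

    @contextmanager
    def span(self, name):
        start = time.perf_counter()
        try:
            yield
        finally:
            # Record (name, elapsed seconds) even if the step raised.
            self.spans.append((name, time.perf_counter() - start))

provider = TraceProvider()
with provider.span("agent_step"):
    sum(range(1000))  # stand-in for real agent work

print([name for name, _ in provider.spans])  # ['agent_step']
```

A real `trace_provider` additionally exports these spans to a backend, which is what gives you dashboards and evaluations over your agent's behavior.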
Through natural conversations with tools like Claude, team members can prototype ideas, run evaluations, and analyze performance metrics or edge cases, right from their seats.
For the more technical members of your team, modern AI-powered IDEs like Cursor seamlessly integrate into the workflow, accelerating development, instrumentation, and iteration with minimal friction.
Conclusion
The Future AGI MCP Server isn't just a tool; it's a developer-first platform that redefines how you work with LLMs. By bringing together the power of Model Context Protocol, LLM evaluation, and natural language interfaces, it allows any team to build, test, and monitor AI applications with unmatched simplicity and speed.
Whether you’re evaluating batch data, generating synthetic samples, or hardening your model against security risks, the MCP Server makes it easier than ever to move from idea to deployment.
Ready to experience the future of conversational AI and evaluation? Start here.
FAQs
Q1: What tools can connect to the Future AGI MCP server?
Currently, tools like Cursor, Claude Desktop, and any MCP-compatible client can connect. Simply configure them with the MCP server path and environment variables.
Q2: What do I need to run the MCP server?
You'll need:
- A Future AGI account with API and Secret keys
- uvx installed on your system
- An MCP client application such as Claude Desktop or Cursor
Q3: What's coming next for Future AGI MCP?
Future AGI's MCP capabilities are growing fast. Soon, you'll be able to manage prompts more efficiently: create, update, and refine them with AI suggestions tailored to your use case. Enhanced dataset operations will make data handling smoother and more flexible. We're also bringing advanced synthetic data generation powered by knowledge bases, making your AI workflows more intelligent and seamless than ever.