---
title: Using MCP (Model Context Protocol) in Promptfoo
description: Configure and integrate Model Context Protocol (MCP) with Promptfoo to enable tool use, memory, and agentic capabilities across different LLM providers
sidebar_label: Model Context Protocol (MCP)
sidebar_position: 20
---
Promptfoo supports the Model Context Protocol (MCP) for advanced tool use and agentic workflows. MCP allows you to connect your Promptfoo providers to an external MCP server, such as `@modelcontextprotocol/server-memory`, to enable tool orchestration, memory, and more.
To enable MCP for a provider, add the `mcp` block to your provider's `config` in your `promptfooconfig.yaml`:
```yaml
description: Testing MCP memory server integration with Google AI Studio

providers:
  - id: google:gemini-2.0-flash
    config:
      mcp:
        enabled: true
        server:
          command: npx
          args: ['-y', '@modelcontextprotocol/server-memory']
          name: memory
```
- `enabled`: Set to `true` to enable MCP for this provider.
- `server`: (Optional) Configuration for launching or connecting to an MCP server.
  - `command`: The command to launch the MCP server (e.g., `npx`).
  - `args`: Arguments to pass to the command (e.g., `['-y', '@modelcontextprotocol/server-memory']`).
  - `name`: (Optional) A name for the server instance.
  - `url`: URL for connecting to a remote MCP server.
  - `headers`: (Optional) Custom HTTP headers to send when connecting to a remote MCP server (only applies to `url`-based connections).
  - `auth`: (Optional) Authentication configuration for the server. Can be used to automatically set auth headers for all connection types.
    - `type`: Authentication type, either `'bearer'` or `'api_key'`.
    - `token`: Token for bearer authentication.
    - `api_key`: API key for `api_key` authentication.

To connect to a remote server, specify a `url` instead of `command`/`args`. MCP servers can be run locally or accessed remotely. For development and testing, a local server is often simplest, while production environments may use a centralized remote server.
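The `auth` options listed above are not shown in the examples below, so here is a minimal sketch of bearer authentication against a remote server; the URL and token are placeholders, not values from the original doc:

```yaml
providers:
  - id: openai:chat:gpt-4.1
    config:
      mcp:
        enabled: true
        server:
          url: https://mcp.example.com
          # Automatically sets the auth header for this connection
          auth:
            type: bearer
            token: your-bearer-token
```

With `type: api_key`, you would set `api_key` instead of `token`, per the option list above.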
```yaml
providers:
  - id: openai:chat:gpt-4.1
    config:
      apiKey: <your-api-key>
      mcp:
        enabled: true
        server:
          url: http://localhost:8000
```
To send custom HTTP headers to a remote MCP server, add a `headers` map:

```yaml
providers:
  - id: openai:chat:gpt-4.1
    config:
      apiKey: <your-api-key>
      mcp:
        enabled: true
        server:
          url: http://localhost:8000
          headers:
            X-API-Key: your-custom-api-key
            Authorization: Bearer your-token
            X-Custom-Header: custom-value
```

This can be useful when the remote server requires API keys, authentication tokens, or other custom metadata on each request.
Promptfoo allows a single provider to connect to multiple MCP servers by using the `servers` array in your provider's MCP config. All tools from all connected servers will be available to the provider.
```yaml
providers:
  - id: openai:chat:gpt-4.1
    config:
      mcp:
        enabled: true
        servers:
          - command: npx
            args: ['-y', '@modelcontextprotocol/server-memory']
            name: server_a
          - url: http://localhost:8001
            name: server_b
            headers:
              X-API-Key: your-api-key
```

Use the `servers` array (not just `server`) to specify multiple MCP servers:

```yaml
providers:
  - id: anthropic:claude-3-5-sonnet-20241022
    config:
      mcp:
        enabled: true
        servers:
          - command: npx
            args: ['-y', '@modelcontextprotocol/server-memory']
            name: memory
          - command: npx
            args: ['-y', '@modelcontextprotocol/server-filesystem']
            name: filesystem
          - command: npx
            args: ['-y', '@modelcontextprotocol/server-github']
            name: github
```
This configuration connects a single provider to multiple MCP servers, giving it access to memory storage, filesystem operations, and GitHub integration simultaneously.
You can configure multiple MCP servers by assigning different MCP server configurations to different providers in your `promptfooconfig.yaml`. Each provider can have its own `mcp.server` block, allowing you to run separate memory/tool servers for different models or use cases.
```yaml
description: Using multiple MCP servers

providers:
  - id: google:gemini-2.0-flash
    config:
      mcp:
        enabled: true
        server:
          command: npx
          args: ['-y', '@modelcontextprotocol/server-memory']
          name: gemini-memory

  - id: openai:chat:gpt-4.1
    config:
      apiKey: <your-api-key>
      mcp:
        enabled: true
        server:
          url: http://localhost:8001
          name: openai-memory
          headers:
            X-API-Key: openai-server-api-key

  - id: anthropic:messages:claude-3-5-sonnet-20241022
    config:
      mcp:
        enabled: true
        server:
          url: http://localhost:8002
          name: anthropic-memory
          headers:
            Authorization: Bearer anthropic-server-token
```
In this example, the Gemini provider launches a local memory server via `npx`, while the OpenAI and Anthropic providers connect to separate remote servers on ports 8001 and 8002, each with its own headers. This setup is useful for testing, benchmarking, or running isolated agentic workflows in parallel.
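To compare such isolated setups side by side, you can give every provider the same memory-dependent prompt and assert on the answer. A minimal sketch using Promptfoo's `contains` assertion; the prompt and expected value are illustrative, not from the original doc:

```yaml
prompts:
  - 'Remember that the deploy window is Friday. When is the deploy window?'

tests:
  - assert:
      # Each provider answers from its own MCP memory server
      - type: contains
        value: 'Friday'
```

Running `promptfoo eval` against this config evaluates every configured provider on the same test, so differences come down to each provider's model and its attached MCP server.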
MCP is supported by most major providers in Promptfoo, including the Google, OpenAI, and Anthropic providers shown in the examples above.
In addition to the general MCP integration described above, OpenAI's Responses API has native MCP support that allows direct connection to remote MCP servers without running local MCP servers. This approach is specific to OpenAI's Responses API: the connection to the remote MCP server is handled on OpenAI's side, so no local server process is needed.
For detailed information about using MCP with OpenAI's Responses API, see the OpenAI Provider MCP documentation.
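As a rough sketch of what this looks like, OpenAI's Responses API accepts an MCP tool definition with a `server_url`; assuming the provider passes `tools` through to the API unchanged (the server label and URL below are placeholders):

```yaml
providers:
  - id: openai:responses:gpt-4.1
    config:
      tools:
        # OpenAI's hosted MCP tool: OpenAI connects to the remote
        # server directly, so no local MCP process is launched
        - type: mcp
          server_label: example
          server_url: https://mcp.example.com/mcp
          require_approval: never
```

See the linked OpenAI provider documentation for the authoritative configuration shape.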