---
sidebar_label: FAQ
description: Evaluate LLM outputs, run red team tests, and automate AI security checks with Promptfoo's open-source framework. Configure assertions and metrics for 50+ providers.
---
Promptfoo is a local-first, open-source tool for evaluating (eval) large language models (LLMs). It is built for application developers and business applications, and it features a simple, flexible, and extensible API. With Promptfoo you can evaluate LLM outputs, configure assertions and metrics, run red team tests, and automate AI security checks.
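As a quick orientation, a typical CLI workflow looks like the sketch below; check `promptfoo --help` for the options available in your installed version.

```sh
# Scaffold a config, run the eval it defines, and browse results locally
npx promptfoo@latest init
npx promptfoo@latest eval
npx promptfoo@latest view
```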
LLM red teaming is the process of systematically testing LLMs to identify potential vulnerabilities, weaknesses, and unintended behaviors before deployment. Promptfoo supports this by offering a framework for generating and executing adversarial tests, aligned with industry standards like OWASP LLM Top 10 and NIST AI Risk Management Framework.
Promptfoo's red teaming capabilities let you generate adversarial test cases, run them against your application, and review a report of the findings.
For more details, see our LLM Red Teaming Guide.
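A minimal command-line sketch of that workflow, assuming a recent promptfoo version with the `redteam` subcommands (verify the exact subcommand names with `promptfoo redteam --help`):

```sh
# Set up a red team target, generate and run adversarial tests, then review the report
npx promptfoo@latest redteam init
npx promptfoo@latest redteam run
npx promptfoo@latest redteam report
```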
Promptfoo supports a wide range of LLM providers, including OpenAI, Anthropic, Google, Azure OpenAI, open-source models served through tools like Ollama, and custom providers via HTTP endpoints or scripts.
Promptfoo's flexible architecture allows for easy integration with new or custom LLM providers. For the most up-to-date list and integration instructions, please refer to our Providers documentation.
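Hosted providers are typically configured by exporting the corresponding API key before running an eval. The variable names below are the conventional ones for those vendors and are shown as an illustration, not an exhaustive list:

```sh
# Credentials for hosted providers are read from standard environment variables
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
```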
No, the source code runs on your machine. Calls to LLM APIs are sent directly to the respective provider. The Promptfoo team does not have access to these requests or responses.
No, API keys are stored as local environment variables and are never transmitted anywhere besides directly to the LLM API.
No, Promptfoo operates locally, and all data remains on your machine. The only exception is when you explicitly use the share command, which stores inputs and outputs in Cloudflare KV for two weeks.
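For reference, sharing only happens when you run the command yourself:

```sh
# Sharing is opt-in: this creates a shareable URL for the most recent eval
npx promptfoo@latest share
```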
No, we do not collect any personally identifiable information (PII).
Promptfoo proxy settings are configured through environment variables:
- `HTTP_PROXY`: For HTTP requests
- `HTTPS_PROXY`: For HTTPS requests
- `NO_PROXY`: Comma-separated list of hosts to exclude from proxying

The proxy URL format is: `[protocol://][user:password@]host[:port]`
For example:

```sh
# Basic proxy
export HTTPS_PROXY=http://proxy.company.com:8080

# Proxy with authentication
export HTTPS_PROXY=http://username:password@proxy.company.com:8080

# Exclude specific hosts from proxying
export NO_PROXY=localhost,127.0.0.1,internal.domain.com
```
Note: Environment variables are specific to your terminal/shell instance. If you need them permanently, add them to your shell's startup file (e.g., `~/.bashrc`, `~/.zshrc`).
For environments with custom certificate authorities (like corporate environments), configure SSL/TLS settings using these environment variables:
- `PROMPTFOO_CA_CERT_PATH`: Path to a custom CA certificate bundle. The path can be absolute or relative to your working directory. Invalid paths will log a warning:

  ```sh
  # Absolute path
  export PROMPTFOO_CA_CERT_PATH=/path/to/ca-bundle.crt

  # Relative path
  export PROMPTFOO_CA_CERT_PATH=./certs/ca-bundle.crt
  ```

- `PROMPTFOO_INSECURE_SSL`: Set to `true` to disable SSL certificate verification:

  ```sh
  export PROMPTFOO_INSECURE_SSL=true
  ```
Remember that like all environment variables, these settings are specific to your terminal/shell instance.
Promptfoo can be integrated into CI/CD pipelines via its GitHub Action, used with testing frameworks like Jest and Vitest, and incorporated into various stages of the development process.
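As one possible shape for a CI step, the command below runs the eval from a checked-out repository and writes results to a file for later steps. The config path and output filename are placeholders; confirm flag spellings with `promptfoo eval --help` for your version.

```sh
# Run the eval defined in the repository and write results to a file
npx promptfoo@latest eval --config promptfooconfig.yaml --output results.json
```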
Set the following environment variables before running the CLI to disable all outbound network requests:
```sh
export PROMPTFOO_DISABLE_TELEMETRY=1
export PROMPTFOO_DISABLE_UPDATE=1
export PROMPTFOO_DISABLE_REDTEAM_REMOTE_GENERATION=true
export PROMPTFOO_DISABLE_SHARING=1
export PROMPTFOO_SELF_HOSTED=1
```
Only configure local or self-hosted LLM providers (e.g., Ollama) so the CLI does not attempt to reach external APIs.
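For instance, an offline run might target a locally hosted Ollama model. The provider id and flag below are assumptions based on promptfoo's documented Ollama provider naming, so verify them against your version before relying on them:

```sh
# Provider id and flag are assumptions; verify with `promptfoo eval --help`
npx promptfoo@latest eval --providers ollama:chat:llama3
```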
Yes. The documentation website follows the LLMs.txt specification so automated tools can easily index our content. The `llms.txt` and `llms-full.txt` files are published at the root of the docs site.
Usage with AI assistants: Copy the llms-full.txt content into your AI assistant (ChatGPT, Claude, etc.) for comprehensive promptfoo context when working on LLM evaluations, red-teaming, or configuration questions.
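For example, you could download the full file before pasting it into an assistant. The URL below assumes the standard LLMs.txt location at the site root; adjust the host or path if the docs publish it elsewhere.

```sh
# Assumed URL following the LLMs.txt convention of serving the file from the site root
curl -O https://www.promptfoo.dev/llms-full.txt
```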