Llamafile has an OpenAI-compatible HTTP endpoint, so you can override the OpenAI provider to talk to your llamafile server.
In order to use llamafile in your eval, set the apiBaseUrl variable to http://localhost:8080 (or wherever you're hosting llamafile).
Here's an example config that uses LLaMA_CPP for chat completions:

providers:
  - id: openai:chat:LLaMA_CPP
    config:
      apiBaseUrl: http://localhost:8080/v1
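
For context, here is a minimal sketch of what a full promptfooconfig.yaml might look like with this provider. The prompt, variable, and assertion values are illustrative placeholders and not part of the llamafile setup itself:

prompts:
  - 'Write a one-sentence summary of {{topic}}'

providers:
  - id: openai:chat:LLaMA_CPP
    config:
      apiBaseUrl: http://localhost:8080/v1

tests:
  - vars:
      topic: llamafile
    assert:
      # Illustrative check only: asserts the output mentions the topic (case-insensitive)
      - type: icontains
        value: llamafile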
If desired, you can use the OPENAI_BASE_URL environment variable instead of the apiBaseUrl config.
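
For example, assuming llamafile is serving on the default port, you could set the environment variable in your shell before running the eval (a sketch of one way to do it):

# Point the OpenAI provider at the local llamafile server, then run the eval
export OPENAI_BASE_URL=http://localhost:8080/v1
promptfoo eval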