
Perplexity

The Perplexity API (pplx-api) offers access to Perplexity, Mistral, Llama, and other models.

It is compatible with the OpenAI API. To use the Perplexity API in an eval, set the apiHost config key to api.perplexity.ai.

Here's an example config that compares Perplexity's large and small Llama 3 online models:

```yaml
providers:
  - id: openai:chat:llama-3-sonar-large-32k-online
    config:
      apiHost: api.perplexity.ai
      apiKeyEnvar: PERPLEXITY_API_KEY
  - id: openai:chat:llama-3-sonar-small-32k-online
    config:
      apiHost: api.perplexity.ai
      apiKeyEnvar: PERPLEXITY_API_KEY
```

In this example, you'd have to set the PERPLEXITY_API_KEY environment variable (you can also enter it directly in the config using the apiKey property).
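For instance, a config that sets the key inline might look like the sketch below (the key value is a placeholder; storing real keys in config files is generally discouraged in favor of environment variables):

```yaml
providers:
  - id: openai:chat:llama-3-sonar-small-32k-online
    config:
      apiHost: api.perplexity.ai
      apiKey: pplx-your-key-here # placeholder, not a real key
```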

If desired, you can use the OPENAI_API_HOST environment variable instead of the apiHost config key.

For a complete list of supported models, see Perplexity's chat completion documentation.
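Because the API is OpenAI-compatible, you can also call it directly outside of an eval. The sketch below builds a standard OpenAI-style chat completion request against the Perplexity host using only the Python standard library; the model id is one of the Sonar models from the config above and may change, so check Perplexity's documentation for current names:

```python
# Sketch: an OpenAI-compatible chat completion request aimed at
# Perplexity's API host. Builds the request object without sending it.
import json
import os
import urllib.request


def build_request(model: str, prompt: str) -> urllib.request.Request:
    # Standard OpenAI chat-completions payload shape.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://api.perplexity.ai/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Reads the same env var referenced by apiKeyEnvar above.
            "Authorization": f"Bearer {os.environ.get('PERPLEXITY_API_KEY', '')}",
        },
        method="POST",
    )


req = build_request("llama-3-sonar-small-32k-online", "What's new in AI today?")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` returns a JSON response in the usual OpenAI chat-completion format.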
