
Vercel Perplexity Integration

Learn how to integrate Perplexity with Vercel.

The Perplexity API provides models specialized in deep language understanding and generation. Integrating Perplexity with Vercel allows your applications to use advanced text interpretation, sentiment analysis, and language modeling capabilities.

You can use the Vercel and Perplexity integration to power a variety of AI applications, including:

  • Advanced text generation: Use Perplexity for creating sophisticated text generation tools for content creation and communication
  • Semantic text analysis: Use Perplexity for applications requiring deep semantic analysis of text, such as sentiment analysis or topic detection
  • Language understanding applications: Use Perplexity in tools for language understanding and translation, enhancing communication across languages

Perplexity provides models focused on deep language understanding and generation. They excel in tasks such as contextual text interpretation, predictive typing, and nuanced sentiment analysis.

codellama-70b-instruct

Type: Chat

Meta's CodeLlama 70B model hosted by Perplexity.

sonar-medium-chat

Type: Chat

Perplexity's mixture of experts model based on Mixtral-8x7b-instruct.

sonar-medium-online

Type: Chat

Perplexity's mixture of experts chat model based on mixtral-8x7b-instruct with access to their web search index.

sonar-small-chat

Type: Chat

Perplexity's 7B parameter chat model based on Mistral 7B.

sonar-small-online

Type: Chat

Perplexity's 7B model based on Mistral 7B with access to their web search index.

llama-3-70b-instruct

Type: Chat

Meta's Llama 3 70B model hosted by Perplexity.

llama-3-8b-instruct

Type: Chat

Meta's Llama 3 8B model hosted by Perplexity.

mistral-7b-instruct-v0.2

Type: Chat

Mistral's 7B model hosted by Perplexity.

mixtral-8x22b-instruct

Type: Chat

Mistral's Mixtral 8x22B model hosted by Perplexity.

mixtral-8x7b-instruct

Type: Chat

Mistral's Mixtral 8x7B model hosted by Perplexity.
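As a quick reference, the Sonar portion of the catalog above can be captured in code. The sketch below is illustrative only: the `pickModel` helper and its selection logic are our own convenience for choosing a model ID by size and web-search access, not part of any Perplexity or Vercel SDK.

```typescript
// Illustrative catalog of the Sonar chat models listed above.
// pickModel is a hypothetical helper, not an official API.
type ModelSize = 'small' | 'medium';

const SONAR_MODELS: Record<ModelSize, { offline: string; online: string }> = {
  small: { offline: 'sonar-small-chat', online: 'sonar-small-online' },
  medium: { offline: 'sonar-medium-chat', online: 'sonar-medium-online' },
};

// Choose a model ID based on size and whether the app needs
// Perplexity's web search index.
function pickModel(size: ModelSize, needsWebSearch: boolean): string {
  const entry = SONAR_MODELS[size];
  return needsWebSearch ? entry.online : entry.offline;
}

console.log(pickModel('small', false)); // sonar-small-chat
console.log(pickModel('medium', true)); // sonar-medium-online
```

The `-online` variants trade some latency for access to fresh web results, so the choice typically depends on whether your prompts reference current events.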

The Vercel Perplexity API integration can be accessed through the AI tab on your Vercel dashboard.

To set up the integration, follow these steps:

  1. Navigate to the AI tab in your Vercel dashboard
  2. Select Perplexity API from the list of providers, and press Add
  3. Review the provider information, and press Add Provider
  4. You can now select which projects the provider will have access to. You can choose from All Projects or Specific Projects
    • If you select Specific Projects, you'll be prompted to select the projects you want to connect to the provider. The list will display projects associated with your scoped team
    • Multiple projects can be selected during this step
  5. Select the Connect to Project button
  6. You'll be redirected to the provider's website to complete the connection process
  7. Once the connection is complete, you'll be redirected back to the Vercel dashboard, and the provider integration dashboard page. From here you can manage your provider settings, view usage, and more
  8. Pull the environment variables into your project using Vercel CLI:
    vercel env pull .env.development.local
  9. Install the provider packages (shown here with pnpm; npm and yarn work equally well):
    pnpm i openai ai
  10. Connect your project using the code below. This example uses the Next.js App Router; equivalent setups exist for Next.js /pages, SvelteKit, and other frameworks:
    // app/api/chat/route.ts
    import { OpenAIStream, StreamingTextResponse } from 'ai';
    import OpenAI from 'openai';

    const perplexity = new OpenAI({
      apiKey: process.env.PERPLEXITY_API_KEY || '',
      baseURL: 'https://api.perplexity.ai',
    });

    export async function POST(req: Request) {
      // Extract the `messages` from the body of the request
      const { messages } = await req.json();

      // Request the OpenAI-compatible API for the response based on the prompt
      const response = await perplexity.chat.completions.create({
        model: 'sonar-small-chat',
        stream: true,
        messages: messages,
      });

      // Convert the response into a friendly text-stream
      const stream = OpenAIStream(response);

      // Respond with the stream
      return new StreamingTextResponse(stream);
    }
  11. Add the provider to your page using the code below. This example is for the Next.js App Router; the Next.js /pages and SvelteKit variants follow the same pattern:
    // app/chat/page.tsx
    'use client';

    import { useChat } from 'ai/react';

    export default function Chat() {
      const { messages, input, handleInputChange, handleSubmit } = useChat();
      return (
        <div>
          {messages.map((m) => (
            <div key={m.id}>
              {m.role === 'user' ? 'User: ' : 'AI: '}
              {m.content}
            </div>
          ))}
          <form onSubmit={handleSubmit}>
            <input
              value={input}
              placeholder="Say something..."
              onChange={handleInputChange}
            />
          </form>
        </div>
      );
    }
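Under the hood, the chat hook POSTs a JSON body containing the accumulated messages to your /api/chat route. The sketch below illustrates that payload shape so you can test the route directly; the `ChatMessage` type and `buildChatRequest` helper are our own illustration, not exports of the ai package.

```typescript
// Sketch of the JSON body the chat page sends to /api/chat.
// buildChatRequest is a hypothetical helper; in practice the
// request is assembled internally by the useChat hook.
interface ChatMessage {
  id: string;
  role: 'user' | 'assistant';
  content: string;
}

function buildChatRequest(messages: ChatMessage[]): string {
  return JSON.stringify({ messages });
}

const body = buildChatRequest([
  { id: '1', role: 'user', content: 'Hello!' },
]);
console.log(body); // {"messages":[{"id":"1","role":"user","content":"Hello!"}]}
```

A body of this shape can be sent with curl or fetch against your local dev server to verify the route streams a response before wiring up the UI.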
Last updated on May 3, 2024