
Overview
Portkey natively integrates with the Vercel AI SDK to make your apps production-ready and reliable. Just import Portkey's Vercel package and use it as a provider in your Vercel AI app to enable all of Portkey's features:
- Full-stack observability and tracing for all requests
- Interoperability across 250+ LLMs
- 50+ built-in SOTA guardrails
- Simple & semantic caching to save costs & time
- Route requests conditionally and make them robust with fallbacks, load-balancing, automatic retries, and more
- Continuous improvement based on user feedback
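Features like fallbacks and retries from the list above are enabled declaratively through the Portkey config object rather than in application code. A minimal sketch, assuming Portkey's documented `strategy` / `targets` / `retry` config keys (the provider choices and model names here are placeholders):

```typescript
// Sketch: a Portkey gateway config that tries OpenAI first and falls
// back to Anthropic, retrying each target up to 3 times on failure.
// Field names follow Portkey's config schema; verify against current docs.
const resilientConfig = {
  strategy: { mode: "fallback" },       // try targets in order until one succeeds
  retry: { attempts: 3 },               // automatic retries per target
  targets: [
    {
      provider: "openai",
      api_key: "OPENAI_API_KEY",
      override_params: { model: "gpt-4o" },
    },
    {
      provider: "anthropic",
      api_key: "ANTHROPIC_API_KEY",
      override_params: { model: "YOUR_ANTHROPIC_MODEL" },
    },
  ],
};

console.log(resilientConfig.targets.length); // → 2
```

A config object like this can be passed as `config` to `createPortkey` in place of the single-provider config shown in the Instructions section below.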
Install the Portkey provider package:

```sh
npm install @portkey-ai/vercel-provider
```
Instructions
Here's how you can use Portkey with the Vercel SDK's generateText function:
```typescript
import { createPortkey } from '@portkey-ai/vercel-provider';
import { generateText } from 'ai';

const portkeyConfig = {
  "provider": "openai", // Choose your provider (e.g., 'anthropic')
  "api_key": "OPENAI_API_KEY",
  "override_params": {
    "model": "gpt-4o"
  }
};

const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});

const { text } = await generateText({
  model: portkey.chatModel(''), // Provide an empty string; the model is defined in the config
  prompt: 'What is Portkey?',
});

console.log(text);
```
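Semantic caching is likewise switched on in the config rather than in code. A hedged sketch, assuming Portkey's documented `cache` config key (the `max_age` field name is an assumption; check the current config schema):

```typescript
// Sketch: extending the single-provider config above with semantic caching,
// so similar prompts are served from cache instead of re-hitting the LLM.
const cachedConfig = {
  provider: "openai",
  api_key: "OPENAI_API_KEY",
  override_params: { model: "gpt-4o" },
  cache: {
    mode: "semantic",  // or "simple" for exact-match caching
    max_age: 3600,     // cache TTL in seconds (assumed field name)
  },
};

console.log(cachedConfig.cache.mode); // → semantic
```

Pass this object as the `config` to `createPortkey` exactly as in the example above; no other code changes are needed.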