1 min read
Meta’s latest and most powerful Llama 4 models are now available through the Vercel Marketplace via Groq.
To get started for free, install the Groq integration in the Vercel dashboard or add Groq to your existing projects with the Vercel CLI:
vercel install groq
You can then use the AI SDK Groq provider with Llama 4:
import { groq } from '@ai-sdk/groq';
import { streamText } from 'ai';
import fs from 'fs';

const result = streamText({
  model: groq('meta-llama/llama-4-scout-17b-16e-instruct'),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Describe the image in detail.' },
        { type: 'image', image: fs.readFileSync('./data/llama.png') },
      ],
    },
  ],
});

for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
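If you don't need streaming or image input, a text-only call follows the same pattern. The sketch below is an illustrative example using the AI SDK's generateText function with the same Groq model ID as above; the prompt string is just a placeholder:

// Minimal text-only sketch (same model ID as the streaming example above).
import { groq } from '@ai-sdk/groq';
import { generateText } from 'ai';

const { text } = await generateText({
  model: groq('meta-llama/llama-4-scout-17b-16e-instruct'),
  prompt: 'Summarize the key capabilities of Llama 4 in two sentences.',
});

console.log(text);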
For a full demo, check out the official Groq chatbot template (which now uses Llama 4) or compare Llama 4 against other models side-by-side on our AI SDK Playground. To learn more, visit our AI documentation.