# Video Generation Quickstart
This quickstart walks you through generating your first video with AI Gateway. Supported models include Veo, Kling, Wan, and Grok Imagine Video.
Video generation requires AI SDK v6. Check your `ai` package version with `npm list ai` and upgrade if needed.
Create a new directory and initialize a Node.js project:
```bash
mkdir ai-video-demo
cd ai-video-demo
pnpm init
```

Install AI SDK v6 and development dependencies:

```bash
# npm
npm install ai dotenv @types/node tsx typescript

# yarn
yarn add ai dotenv @types/node tsx typescript

# pnpm
pnpm add ai dotenv @types/node tsx typescript

# bun
bun add ai dotenv @types/node tsx typescript
```

Go to the AI Gateway API Keys page in your Vercel dashboard and click Create key to generate a new API key.
Create a `.env.local` file and save your API key:

```bash
AI_GATEWAY_API_KEY=your_ai_gateway_api_key
```

Create an `index.ts` file:

```ts
import { experimental_generateVideo as generateVideo } from 'ai';
import fs from 'node:fs';
import 'dotenv/config';

async function main() {
  const result = await generateVideo({
    model: 'google/veo-3.1-generate-001',
    prompt: 'A serene mountain landscape at sunset with clouds drifting by',
    aspectRatio: '16:9',
    duration: 8,
  });

  // Save the generated video
  fs.writeFileSync('output.mp4', result.videos[0].uint8Array);
  console.log('Video saved to output.mp4');
}

main().catch(console.error);
```

Run your script:
```bash
pnpm tsx index.ts
```

Video generation can take several minutes. If you hit timeout issues, see extending timeouts for Node.js.
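If you want a hard upper bound on how long a generation call may run, one option is an abort signal. This is a minimal sketch, assuming `experimental_generateVideo` forwards an `abortSignal` option the way other AI SDK calls do; verify against the AI SDK docs before relying on it:

```typescript
// Abort the request if it runs longer than ten minutes.
// AbortSignal.timeout() is built into Node.js 17.3+, so no extra deps are needed.
const tenMinutes = 10 * 60 * 1000;
const signal = AbortSignal.timeout(tenMinutes);

// Assumed usage -- pass the signal alongside the normal options:
// const result = await generateVideo({
//   model: 'google/veo-3.1-generate-001',
//   prompt: 'A serene mountain landscape at sunset',
//   abortSignal: signal,
// });

console.log(signal.aborted); // false until the timeout fires
```

When the timeout fires, the in-flight request rejects with an abort error, which you can catch and retry or surface to the user.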
The generated video will be saved as `output.mp4` in your project directory.

- See supported video generation models
- Learn about image-to-video generation to animate images
Video models vary in their input formats and required parameters. Some accept buffers while others require URLs. Always check the Video Generation docs for model-specific requirements.
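To smooth over the buffer-vs-URL difference when switching between models, you can route inputs through a small helper. This is a sketch, not part of the AI SDK, and `toImageInput` is a hypothetical name:

```typescript
import fs from 'node:fs';

// Hypothetical helper: pass http(s) URLs through unchanged for models that
// want URLs, and read anything else from disk for models that accept raw data.
function toImageInput(source: string): string | Buffer {
  if (/^https?:\/\//.test(source)) {
    return source; // URL-based model input
  }
  return fs.readFileSync(source); // Buffer-based model input
}

// A URL passes through as a string; a local path becomes a Buffer:
console.log(typeof toImageInput('https://example.com/cat.png')); // "string"
```

You would still need to pick a model that matches the input shape you end up with; the helper only keeps your call sites uniform.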
## Image-to-video

Transform a single image into a video by adding motion. The image becomes the video content itself.

```ts
import { experimental_generateVideo as generateVideo } from 'ai';
import fs from 'node:fs';
import 'dotenv/config';

const result = await generateVideo({
  model: 'alibaba/wan-v2.6-i2v',
  prompt: {
    image: 'https://example.com/your-image.png',
    text: 'The scene slowly comes to life with gentle movement',
  },
  duration: 5,
});

fs.writeFileSync('output.mp4', result.videos[0].uint8Array);
```

## First and last frame

Generate a video that transitions between a starting and ending image. The model interpolates the motion between them.
```ts
import { experimental_generateVideo as generateVideo } from 'ai';
import fs from 'node:fs';
import 'dotenv/config';

const firstFrame = fs.readFileSync('start.png');
const lastFrame = fs.readFileSync('end.png');

const result = await generateVideo({
  model: 'klingai/kling-v2.6-i2v',
  prompt: {
    image: firstFrame,
    text: 'Smooth transition between the two scenes',
  },
  providerOptions: {
    klingai: {
      imageTail: lastFrame,
      mode: 'pro',
    },
  },
});

fs.writeFileSync('output.mp4', result.videos[0].uint8Array);
```

## Reference-to-video

Generate a new video scene featuring characters or content from reference media. References can be images or videos that show the model what your characters look like.
```ts
import { experimental_generateVideo as generateVideo } from 'ai';
import fs from 'node:fs';
import 'dotenv/config';

const result = await generateVideo({
  model: 'alibaba/wan-v2.6-r2v',
  prompt: 'character1 and character2 have a friendly conversation in a cozy cafe',
  resolution: '1920x1080',
  duration: 4,
  providerOptions: {
    alibaba: {
      // References can be images or videos
      referenceUrls: [
        'https://example.com/cat.png',
        'https://example.com/dog.png',
      ],
      shotType: 'single',
    },
  },
});

fs.writeFileSync('output.mp4', result.videos[0].uint8Array);
```

## Using URLs with Vercel Blob

Some video models require URLs instead of raw file data for image or video inputs. You can use Vercel Blob to host your media files.
- Go to the Vercel dashboard
- Select your project (or create one)
- Click Storage in the top navigation
- Click Create Database and select Blob
- Follow the prompts to create your blob store
- Copy the `BLOB_READ_WRITE_TOKEN` to your `.env.local` file

```bash
AI_GATEWAY_API_KEY=your_ai_gateway_api_key
BLOB_READ_WRITE_TOKEN=your_blob_token
```

Install the Vercel Blob package:
```bash
pnpm add @vercel/blob
```

```ts
import { experimental_generateVideo as generateVideo } from 'ai';
import { put } from '@vercel/blob';
import fs from 'node:fs';
import 'dotenv/config';

// Upload image to Vercel Blob
const imageBuffer = fs.readFileSync('input.png');
const { url: imageUrl } = await put('input.png', imageBuffer, {
  access: 'public',
});

const result = await generateVideo({
  model: 'klingai/kling-v2.6-i2v',
  prompt: {
    image: imageUrl, // Pass URL instead of buffer
    text: 'The scene slowly comes to life with gentle movement',
  },
  providerOptions: {
    klingai: {
      mode: 'std',
    },
  },
});

fs.writeFileSync('output.mp4', result.videos[0].uint8Array);
```

See the Vercel Blob docs for more details on uploading and managing files.
For more details, see the Video Generation Capabilities docs.