
Llama 4 Scout 17B 16E Instruct

Llama 4 Scout 17B 16E Instruct is a natively multimodal Mixture-of-Experts (MoE) model with a 131K-token context window, suited to processing entire codebases, multi-document corpora, and extended user activity logs in a single inference call.

Capabilities: Tool Use, Vision (Image)
index.ts
import { streamText } from 'ai'

// Stream a completion from Llama 4 Scout
const result = streamText({
  model: 'meta/llama-4-scout',
  prompt: 'Why is the sky blue?',
})

// Print the response incrementally as tokens arrive
for await (const textPart of result.textStream) {
  process.stdout.write(textPart)
}