Grok 3 Beta

Grok 3 Beta is xAI's full-scale Grok 3 reasoning model. Trained on xAI's Colossus supercomputer, it delivers strong results on math, science, and coding benchmarks and supports a 131.1K-token context window.

Tool Use
index.ts
import { streamText } from 'ai'

const result = streamText({
  model: 'xai/grok-3',
  prompt: 'Why is the sky blue?',
})

// Consume the stream as tokens arrive
for await (const chunk of result.textStream) process.stdout.write(chunk)

Frequently Asked Questions

  • What makes Grok 3 Beta different from Grok 2?

    Grok 3 Beta is trained on the Colossus supercomputer. It scores higher than Grok 2 on math, science, and coding benchmarks.

  • What is the context window for Grok 3 Beta?

131.1K tokens. The maximum output length matches the context window, so a single response can run up to 131.1K tokens for long completions.

  • How does Grok 3 Beta compare to Grok 3 Fast?

    Grok 3 Beta prioritizes reasoning depth and quality, while Grok 3 Fast optimizes for lower latency at a slight quality tradeoff. Both share the same context window of 131.1K tokens.

  • What does Grok 3 Beta cost through Vercel AI Gateway?

    Current pricing is shown on this page. AI Gateway routes across providers, and rates may vary by provider.

  • How do I authenticate with Grok 3 Beta through Vercel AI Gateway?

    Use your Vercel AI Gateway API key with the model identifier xai/grok-3. AI Gateway manages provider routing and authentication automatically.

  • Is Grok 3 Beta suitable for agentic applications?

    Grok 3 Beta supports tool calling and multi-step reasoning, making it capable for agentic workflows. For latency-sensitive agent loops, consider Grok 3 Fast as an alternative.

  • Does Vercel AI Gateway support Zero Data Retention for Grok 3 Beta?

Zero Data Retention is not currently available for this model. On AI Gateway, ZDR applies only to direct gateway requests; bring-your-own-key (BYOK) flows are not covered. See https://vercel.com/docs/ai-gateway/capabilities/zdr for configuration details.
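The authentication flow described in the FAQ can be sketched as a plain HTTP request. The endpoint URL and header names below are assumptions drawn from common OpenAI-compatible gateway setups, so verify them against the AI Gateway documentation before relying on them; only the model identifier `xai/grok-3` comes from this page.

```typescript
// Builds a chat-completions request for AI Gateway.
// The endpoint URL is an assumption; check the AI Gateway docs for the exact value.
function buildGatewayRequest(prompt: string, apiKey: string) {
  return {
    url: 'https://ai-gateway.vercel.sh/v1/chat/completions',
    init: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${apiKey}`, // your Vercel AI Gateway API key
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model: 'xai/grok-3', // model identifier from this page
        messages: [{ role: 'user', content: prompt }],
      }),
    },
  }
}

// Usage:
//   const { url, init } = buildGatewayRequest('Why is the sky blue?', process.env.AI_GATEWAY_API_KEY!)
//   const res = await fetch(url, init)
```

Because the gateway handles provider routing, a single key and a single `model` string are enough; no xAI-specific credentials are needed on the client.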