Ministral 14B
Ministral 14B is part of Mistral AI's third-generation edge series, achieving 85% on AIME '25 with its reasoning variant and supporting native image understanding for on-device and edge deployment.
```typescript
import { streamText } from 'ai'

const result = streamText({
  model: 'mistral/ministral-14b',
  prompt: 'Why is the sky blue?',
})
```

What To Consider When Choosing a Provider
Zero Data Retention
AI Gateway supports Zero Data Retention for this model via direct gateway requests (BYOK is not included). To configure this, check the documentation.

Authentication
AI Gateway authenticates requests using an API key or OIDC token. You do not need to manage provider credentials directly.
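In practice this means setting a single environment variable before running your app; a minimal sketch, assuming the standard `AI_GATEWAY_API_KEY` variable name (when deployed on Vercel, an OIDC token can be used instead and no key is needed):

```shell
# Set the gateway API key locally; provider credentials are never needed in code.
export AI_GATEWAY_API_KEY=your-key-here
```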
Ministral 14B ships in base, instruct, and reasoning variants, all with native image understanding. The reasoning variant reaches 85% on AIME '25, per Mistral AI's Ministral 3 announcement.
When to Use Ministral 14B
Best For
Unified text and image applications:
Combine generation and vision in a single model
Extended thinking capability:
Use cases that benefit from the reasoning variant
Multilingual applications:
Serving diverse global regions
Apache 2.0 for commercial products:
Teams that need an Apache 2.0 license
Consider Alternatives When
Smaller parameter footprint:
You need a more compact model (consider Ministral 3B or 8B)
Server-side throughput at scale:
Peak server throughput is the primary concern
Conclusion
Ministral 14B is an edge-optimized model in Mistral AI's portfolio: multimodal from launch, optionally extended with reasoning, and designed for deployment from edge devices to enterprise workflows. Choose it when an application needs both image and text understanding at the 14B size class.
FAQ
What does Ministral 14B score on AIME '25?
85% on AIME '25 for the reasoning variant, per Mistral AI's Ministral 3 announcement.
Does Ministral 14B support image inputs?
Yes. Native image understanding is included in all variants (base, instruct, and reasoning) as part of the Ministral 3 series.
What deployments is Ministral 14B designed for?
Mistral AI designed the Ministral 3 series for deployment from edge devices to enterprise workflows.
What license does Ministral 14B use?
Apache 2.0.
Does Ministral 14B support multiple languages?
Yes. The Ministral 3 series supports 40+ native languages.
How does the reasoning variant differ from the instruct variant?
The reasoning variant activates extended thinking to improve accuracy on complex problems. The instruct variant is optimized for standard instruction-following tasks with lower latency.
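Choosing between the two can be a simple routing decision in application code. A minimal sketch; the reasoning-variant model ID below is an assumption for illustration, so check the gateway's model list for the exact identifiers:

```typescript
// Model IDs: the instruct ID matches this page; the reasoning ID is assumed.
const MODELS = {
  instruct: 'mistral/ministral-14b',
  reasoning: 'mistral/ministral-14b-reasoning', // hypothetical ID
} as const

// Route complex problems to extended thinking, everything else to the
// lower-latency instruct variant.
function pickVariant(task: { complex: boolean }): string {
  return task.complex ? MODELS.reasoning : MODELS.instruct
}
```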
How does Ministral 14B compare to Ministral 8B?
Ministral 14B adds image understanding, reasoning capability, and higher benchmark performance. Ministral 8B is text-only but features a specialized interleaved sliding-window attention pattern for faster and more memory-efficient inference at lower cost.