Interfaze Beta
Interfaze Beta merges specialized DNN/CNN models with an LLM to handle deterministic developer tasks like OCR, scraping, classification, structured outputs, and web extraction. It supports a 1M-token input context and up to 32K output tokens. On AI Gateway, it costs $1.50 per million input tokens and $3.50 per million output tokens.
import { streamText } from 'ai';

const result = streamText({
  model: 'interfaze/interfaze-beta',
  prompt: 'Why is the sky blue?',
});

Frequently Asked Questions
What is Interfaze Beta?
Interfaze Beta is a hybrid AI system from Interfaze that routes each request to a specialized DNN or CNN model when one fits, and falls back to an LLM otherwise. It targets developer tasks like OCR, scraping, classification, structured outputs, and web extraction.
What is the context window and output limit?
The context window is 1M tokens and the maximum output is 32K tokens.
Which input modalities does Interfaze Beta support?
Text, images, audio, files, and video. The API stays OpenAI Chat Completions compatible across all of them.
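As a sketch of what a mixed-modality request might look like in the OpenAI Chat Completions format: the content array below combines a text part with an image part. The image URL is a placeholder, and audio, file, and video parts follow the same content-array pattern.

```typescript
// Sketch: an OpenAI Chat Completions-style request body mixing text and an
// image part. The image URL is a placeholder, not a real resource.
const multimodalRequest = {
  model: 'interfaze/interfaze-beta',
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'What text appears in this receipt?' },
        { type: 'image_url', image_url: { url: 'https://example.com/receipt.png' } },
      ],
    },
  ],
};
```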
How do I call Interfaze Beta through AI Gateway?
Set the model to interfaze/interfaze-beta in the AI SDK, Chat Completions API, Responses API, Messages API, or other API formats, from TypeScript or Python. AI Gateway handles authentication and routing. See https://interfaze.ai/ for the model page.

What is the pricing?
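For a Chat Completions-style call without the AI SDK, a minimal fetch sketch follows. The base URL and the AI_GATEWAY_API_KEY environment variable name are assumptions; check your gateway settings for the exact values.

```typescript
// Sketch of a Chat Completions-style call through AI Gateway.
// Assumptions: the gateway base URL and the AI_GATEWAY_API_KEY env var name.
const request = {
  model: 'interfaze/interfaze-beta',
  messages: [{ role: 'user', content: 'Classify this ticket: "My invoice is wrong."' }],
};

async function callInterfaze(): Promise<unknown> {
  const res = await fetch('https://ai-gateway.vercel.sh/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.AI_GATEWAY_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(request),
  });
  if (!res.ok) throw new Error(`Gateway error: ${res.status}`);
  return res.json();
}
```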
On AI Gateway, Interfaze Beta costs $1.50 per million input tokens and $3.50 per million output tokens. Current rates appear on this page.
How well does Interfaze Beta handle structured output?
Interfaze reports 98 to 99% accuracy on structured output generation, which makes Interfaze Beta a fit for pipelines that parse responses against a schema.
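One way to request schema-constrained output is the OpenAI-style response_format field, sketched below. The invoice schema is an illustrative assumption, not part of the Interfaze API.

```typescript
// Sketch: requesting schema-constrained output via an OpenAI-style
// response_format field. The invoice schema here is an illustrative
// assumption for a downstream parser, not a documented Interfaze schema.
const structuredRequest = {
  model: 'interfaze/interfaze-beta',
  messages: [{ role: 'user', content: 'Extract fields from: Invoice INV-7, total $42.00' }],
  response_format: {
    type: 'json_schema',
    json_schema: {
      name: 'invoice',
      schema: {
        type: 'object',
        properties: {
          invoice_number: { type: 'string' },
          total: { type: 'number' },
        },
        required: ['invoice_number', 'total'],
      },
    },
  },
};
```

A pipeline would then validate the returned JSON against the same schema before use.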
Does Interfaze Beta support zero data retention?
Zero Data Retention is not currently available for this model. Zero Data Retention is offered on a per-provider basis. See https://vercel.com/docs/ai-gateway/capabilities/zdr for details.