# A chatbot that dynamically sets its LLM using the Vercel AI SDK with Feature Flags and Edge Config

This example demonstrates how to use the Vercel AI SDK with Next.js, Feature Flags, and Edge Config to build a flexible AI-powered application that can switch models dynamically.

## How to use

Run `create-next-app` with npm, Yarn, or pnpm to bootstrap the example:
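The core idea can be sketched as a small resolver: a feature flag looks up a model identifier in Edge Config and falls back to a default when the value is missing. The names below (`resolveModel`, the `"model"` key, the `"gpt-4o-mini"` fallback) are illustrative assumptions, not this repo's actual code, and the Edge Config lookup is stubbed so the sketch is self-contained — in the real example the value would come from your Edge Config store via the flags integration.

```typescript
// Hypothetical shape of an Edge Config lookup; in the actual app this
// would be backed by a real Edge Config read behind a feature flag.
type EdgeConfigLookup = (key: string) => Promise<unknown>;

const DEFAULT_MODEL = "gpt-4o-mini"; // assumed fallback, not from the repo

// Resolve the model id from config, falling back when the value is unset
// or not a usable string.
async function resolveModel(lookup: EdgeConfigLookup): Promise<string> {
  const value = await lookup("model"); // the "model" key is an assumption
  return typeof value === "string" && value.length > 0 ? value : DEFAULT_MODEL;
}

// Stubbed store: pretend the Edge Config dashboard holds { model: "gpt-4o" }.
const fakeEdgeConfig: EdgeConfigLookup = async (key) =>
  key === "model" ? "gpt-4o" : undefined;

async function main() {
  console.log(await resolveModel(fakeEdgeConfig));        // "gpt-4o"
  console.log(await resolveModel(async () => undefined)); // "gpt-4o-mini"
}
main();
```

The resolved id would then be handed to the AI SDK (e.g. as the `model` argument of a chat route), so changing the Edge Config value switches the LLM without a redeploy.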
```bash
npx create-next-app --example https://github.com/vercel-labs/ai-sdk-flags-edge-config ai-sdk-flags-edge-config-example
yarn create next-app --example https://github.com/vercel-labs/ai-sdk-flags-edge-config ai-sdk-flags-edge-config-example
pnpm create next-app --example https://github.com/vercel-labs/ai-sdk-flags-edge-config ai-sdk-flags-edge-config-example
```
To run the example locally you need to:

1. Add the environment variables from the `.env.example` file in a new file called `.env`.
2. Run `npm install` to install the required dependencies.
3. Run `npm run dev` to launch the development server.

Note: you can generate the value for `FLAGS_SECRET` by running the following command in your terminal:
```bash
node -e "console.log(crypto.randomBytes(32).toString('base64url'))"
```
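For reference, the resulting `.env` might look like the sketch below. `FLAGS_SECRET` is the value generated by the command above; the other variable names are assumptions typical for this stack (`EDGE_CONFIG` is the conventional connection string for a Vercel Edge Config store, `OPENAI_API_KEY` is what the AI SDK's OpenAI provider reads) — the authoritative list is in `.env.example`.

```shell
# Generated with the node one-liner above
FLAGS_SECRET="paste-generated-base64url-value-here"

# Assumed variable names -- confirm against .env.example:
# connection string from your Vercel Edge Config store
EDGE_CONFIG="https://edge-config.vercel.com/ecfg_xxx?token=xxx"
# read by the AI SDK's OpenAI provider
OPENAI_API_KEY="sk-..."
```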
To run the `node-example.ts` file, run the following command in your terminal:

```bash
pnpm tsx node-example.ts
```
To learn more about the Vercel AI SDK, Next.js, Feature Flags, and Edge Config, take a look at the following resources: