This is an open-source version of apps like Anthropic's Claude Artifacts, Vercel v0, and GPT Engineer.
Powered by the E2B SDK.
Make sure to give us a star!
In your terminal:
git clone https://github.com/e2b-dev/fragments.git
Enter the repository:
cd fragments
Run the following to install the required dependencies:
npm i
Create a .env.local file and set the following:
# Get your API key here - https://e2b.dev/
E2B_API_KEY="your-e2b-api-key"

# OpenAI API Key
OPENAI_API_KEY=

# Other providers
ANTHROPIC_API_KEY=
GROQ_API_KEY=
FIREWORKS_API_KEY=
TOGETHER_API_KEY=
GOOGLE_AI_API_KEY=
GOOGLE_VERTEX_CREDENTIALS=
MISTRAL_API_KEY=
XAI_API_KEY=

### Optional env vars

# Domain of the site
NEXT_PUBLIC_SITE_URL=

# Disabling API key and base URL input in the chat
NEXT_PUBLIC_NO_API_KEY_INPUT=
NEXT_PUBLIC_NO_BASE_URL_INPUT=

# Rate limit
RATE_LIMIT_MAX_REQUESTS=
RATE_LIMIT_WINDOW=

# Vercel/Upstash KV (short URLs, rate limiting)
KV_REST_API_URL=
KV_REST_API_TOKEN=

# Supabase (auth)
SUPABASE_URL=
SUPABASE_ANON_KEY=

# PostHog (analytics)
NEXT_PUBLIC_POSTHOG_KEY=
NEXT_PUBLIC_POSTHOG_HOST=
Start the development server:
npm run dev

Build the web app:
npm run build
Make sure the E2B CLI is installed and you're logged in.
Add a new folder under sandbox-templates/
Initialize a new template using the E2B CLI:
e2b template init
This will create a new file called e2b.Dockerfile.
Adjust the e2b.Dockerfile. Here's an example Streamlit template:
# You can use most Debian-based base images
FROM python:3.10-slim

RUN pip3 install --no-cache-dir streamlit pandas numpy matplotlib requests seaborn plotly

# Copy the code to the container
WORKDIR /home/user
COPY . /home/user
Specify a custom start command in e2b.toml:
start_cmd = "cd /home/user && streamlit run app.py"
Deploy the template with the E2B CLI:
e2b template build --name <template-name>
After the build has finished, you should get the following message:
✅ Building sandbox template <template-id> <template-name> finished.
Open lib/templates.json in your code editor.
Add your new template to the list. Here's an example for Streamlit:
"streamlit-developer": {"name": "Streamlit developer","lib": ["streamlit","pandas","numpy","matplotlib","request","seaborn","plotly"],"file": "app.py","instructions": "A streamlit app that reloads automatically.","port": 8501 // can be null},
Provide a template id (as key), name, list of dependencies, entrypoint and a port (optional). You can also add additional instructions that will be given to the LLM.
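For reference, each entry in lib/templates.json follows the shape sketched below in TypeScript; this is a minimal sketch for documentation purposes, and the type name TemplateConfig is illustrative rather than an actual type in the codebase.

// Illustrative shape of a lib/templates.json entry.
// The type name "TemplateConfig" is an assumption, not a real type in the repo.
type TemplateConfig = {
  name: string          // display name shown in the UI
  lib: string[]         // dependencies preinstalled in the sandbox template
  file: string          // entrypoint file the generated code is written to
  instructions: string  // extra guidance passed to the LLM
  port: number | null   // exposed port, or null if the template serves nothing
}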
Optionally, add a new logo under public/thirdparty/templates
Open lib/models.json in your code editor.
Add a new entry to the models list:
{"id": "mistral-large","name": "Mistral Large","provider": "Ollama","providerId": "ollama"}
Here, id is the model id, name is the model name (visible in the UI), provider is the provider name, and providerId is the provider tag (see adding providers below).
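Sketched as a TypeScript type for reference (a minimal sketch; the name ModelConfig is illustrative, not an actual type in the codebase):

// Illustrative shape of a lib/models.json entry.
// The type name "ModelConfig" is an assumption, not a real type in the repo.
type ModelConfig = {
  id: string          // model id passed to the provider
  name: string        // model name visible in the UI
  provider: string    // provider display name
  providerId: string  // provider tag used to look up the provider config
}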
Open lib/models.ts in your code editor.
Add a new entry to the providerConfigs list:
Example for fireworks:
fireworks: () =>
  createOpenAI({
    apiKey: apiKey || process.env.FIREWORKS_API_KEY,
    baseURL: baseURL || 'https://api.fireworks.ai/inference/v1',
  })(modelNameString),
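If the provider you're adding exposes an OpenAI-compatible API, a similar entry can work. The sketch below is illustrative only; the provider key, environment variable, and base URL are placeholders rather than real values from the codebase.

// Hypothetical OpenAI-compatible provider; "myprovider", the env var name,
// and the base URL are placeholders for your own provider's values.
myprovider: () =>
  createOpenAI({
    apiKey: apiKey || process.env.MYPROVIDER_API_KEY,
    baseURL: baseURL || 'https://api.myprovider.example/v1',
  })(modelNameString),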
Optionally, adjust the default structured output mode in the getDefaultMode function:
if (providerId === 'fireworks') {
  return 'json'
}
Optionally, add a new logo under public/thirdparty/logos
As an open-source project, we welcome contributions from the community. If you encounter any bugs or want to add improvements, please feel free to open an issue or a pull request.