Interactive Demo
Try It Now
Try WebLLM directly in your browser with two types of demos:
- Quick Tests: Streamlined examples showing exactly what’s being sent
- Live Examples: Full interactive demos with request summaries and source code
Quick Tests
Test WebLLM with these streamlined examples. Each shows exactly what’s being sent:
```javascript
await promptInstall();
const result = await generateText({ prompt: 'Write a haiku about coding' });
console.log(result.text);
```

```javascript
await promptInstall();
const result = await generateText({
  prompt: 'Write a short story about a robot learning to paint',
  task: 'creative',
  hints: { quality: 'high' }
});
console.log(result.text);
```

```javascript
await promptInstall();
const result = await generateText({
  prompt: 'What is the capital of France?',
  task: 'qa',
  hints: { speed: 'fastest' }
});
console.log(result.text);
```

```javascript
await promptInstall();
const result = await generateText({
  prompt: 'Write a JavaScript function that checks if a string is a palindrome',
  task: 'coding',
  hints: { quality: 'best', capabilities: { codeGeneration: true } }
});
console.log(result.text);
```

```javascript
await promptInstall();
const result = await generateText({
  messages: [
    // 3 messages in conversation
  ],
  task: 'qa'
});
console.log(result.text);
```

Live Examples
1. Simple Text Generation
Generate a Haiku
The simplest WebLLM example - generate creative text from a prompt
```javascript
// Ensure extension is installed
await promptInstall();

// Generate a haiku
const result = await generateText({ prompt: 'Write a haiku about coding' });

// Display the result
console.log(result.text);
```

2. Check Extension Status
Check WebLLM Availability
Detect if the extension is installed and check browser compatibility
```javascript
// Check if extension is available
const available = isAvailable();
console.log('Extension installed:', available);

// Get browser information
const browserInfo = getBrowserInfo();
console.log('Browser:', browserInfo.browserName);
console.log('Supported:', browserInfo.isSupported);

if (browserInfo.isSupported && browserInfo.installUrl) {
  console.log('Install URL:', browserInfo.installUrl);
}

if (!browserInfo.isSupported && browserInfo.reason) {
  console.log('Not supported:', browserInfo.reason);
}
```

3. Task-Specific Generation
Translation Task
Use task hints to guide provider selection for translation
```javascript
// Ensure extension is ready
await promptInstall();

// Generate translation with task hints
const result = await generateText({
  task: 'translation',
  hints: { quality: 'high', capabilities: { multilingual: true } },
  prompt: 'Translate to Spanish: Hello, how are you today?'
});

console.log('Translation:', result.text);
```

4. Quick Q&A
Quick Question & Answer
Fast factual responses optimized for speed
```javascript
await promptInstall();

// Quick Q&A with speed optimization
const result = await generateText({
  task: 'qa',
  hints: { speed: 'fastest', quality: 'standard' },
  prompt: 'What is the capital of France?'
});

console.log('Answer:', result.text);
console.log('Tokens used: ~' + (result.usage?.totalTokens || 'unknown'));
```

5. Code Generation
Generate Code
Generate code with reasoning capabilities
```javascript
await promptInstall();

// Code generation with quality hints
const result = await generateText({
  task: 'coding',
  hints: { quality: 'best', capabilities: { codeGeneration: true, reasoning: true } },
  prompt: 'Write a JavaScript function that checks if a string is a palindrome'
});

console.log('Generated code:');
console.log(result.text);
```

6. Conversation with Context
Multi-turn Conversation
Maintain context across multiple messages
```javascript
await promptInstall();

// Multi-turn conversation
const result = await generateText({
  messages: [
    { role: 'user', content: 'What is TypeScript?' },
    { role: 'assistant', content: 'TypeScript is a typed superset of JavaScript that compiles to plain JavaScript.' },
    { role: 'user', content: 'What are its main benefits?' }
  ]
});

console.log('AI Response:');
console.log(result.text);
```

Complete Code Examples
Below are full HTML examples you can copy and run in your own projects.
Example 1: Simple Text Generation
The most basic WebLLM example - generate text from a prompt.
WebLLM SDK:

```html
<!DOCTYPE html>
<html>
<head>
  <title>WebLLM Simple Demo</title>
</head>
<body>
  <h1>Simple Text Generation</h1>
  <button id="generate">Generate Haiku</button>
  <pre id="result"></pre>

  <script type="module">
    import { promptInstall, generateText } from 'webllm';

    document.getElementById('generate').onclick = async () => {
      try {
        // Ensure extension is installed
        await promptInstall();

        // Generate text
        const result = await generateText({ prompt: 'Write a haiku about coding' });

        document.getElementById('result').textContent = result.text;
      } catch (error) {
        document.getElementById('result').textContent = 'Error: ' + error.message;
      }
    };
  </script>
</body>
</html>
```

Vercel AI SDK:

```html
<!DOCTYPE html>
<html>
<head>
  <title>WebLLM Simple Demo</title>
</head>
<body>
  <h1>Simple Text Generation</h1>
  <button id="generate">Generate Haiku</button>
  <pre id="result"></pre>

  <script type="module">
    import { promptInstall } from 'webllm';
    import { generateText } from 'ai';
    import { webllm } from 'webllm-ai-provider';

    document.getElementById('generate').onclick = async () => {
      try {
        // Ensure extension is installed
        await promptInstall();

        // Generate text
        const result = await generateText({
          model: webllm({ task: 'creative' }),
          prompt: 'Write a haiku about coding'
        });

        document.getElementById('result').textContent = result.text;
      } catch (error) {
        document.getElementById('result').textContent = 'Error: ' + error.message;
      }
    };
  </script>
</body>
</html>
```

What this does:
- Checks if the WebLLM extension is installed (prompts user to install if needed)
- Generates a creative haiku when the button is clicked
- Displays the result or any errors
Example 2: Chat Conversation
Build a simple chat interface with conversation history.
WebLLM SDK:

```html
<!DOCTYPE html>
<html>
<head>
  <title>WebLLM Chat Demo</title>
  <style>
    #messages {
      max-height: 400px;
      overflow-y: auto;
      border: 1px solid #ccc;
      padding: 10px;
      margin: 10px 0;
    }
    .message { margin: 5px 0; padding: 8px; border-radius: 4px; }
    .user { background: #e3f2fd; }
    .assistant { background: #f5f5f5; }
  </style>
</head>
<body>
  <h1>Chat with AI</h1>
  <div id="messages"></div>
  <input type="text" id="input" placeholder="Type your message..." style="width: 300px;">
  <button id="send">Send</button>

  <script type="module">
    import { promptInstall, generateText } from 'webllm';

    const messages = [];
    let ready = false;

    // Initialize extension
    (async () => {
      try {
        await promptInstall();
        ready = true;
      } catch (error) {
        addMessage('system', 'Failed to initialize: ' + error.message);
      }
    })();

    function addMessage(role, content) {
      const div = document.createElement('div');
      div.className = `message ${role}`;
      div.textContent = `${role}: ${content}`;
      document.getElementById('messages').appendChild(div);
    }

    document.getElementById('send').onclick = async () => {
      if (!ready) {
        alert('WebLLM is not ready yet');
        return;
      }

      const input = document.getElementById('input');
      const userMessage = input.value.trim();
      if (!userMessage) return;

      // Add user message
      addMessage('user', userMessage);
      messages.push({ role: 'user', content: userMessage });
      input.value = '';

      try {
        // Generate response
        const result = await generateText({ messages: messages });

        // Add assistant message
        addMessage('assistant', result.text);
        messages.push({ role: 'assistant', content: result.text });
      } catch (error) {
        addMessage('system', 'Error: ' + error.message);
      }
    };

    // Send on Enter key
    document.getElementById('input').onkeypress = (e) => {
      if (e.key === 'Enter') document.getElementById('send').click();
    };
  </script>
</body>
</html>
```

Vercel AI SDK:

```html
<!DOCTYPE html>
<html>
<head>
  <title>WebLLM Chat Demo</title>
  <style>
    #messages {
      max-height: 400px;
      overflow-y: auto;
      border: 1px solid #ccc;
      padding: 10px;
      margin: 10px 0;
    }
    .message { margin: 5px 0; padding: 8px; border-radius: 4px; }
    .user { background: #e3f2fd; }
    .assistant { background: #f5f5f5; }
  </style>
</head>
<body>
  <h1>Chat with AI</h1>
  <div id="messages"></div>
  <input type="text" id="input" placeholder="Type your message..." style="width: 300px;">
  <button id="send">Send</button>

  <script type="module">
    import { promptInstall } from 'webllm';
    import { generateText } from 'ai';
    import { webllm } from 'webllm-ai-provider';

    const messages = [];
    let ready = false;

    // Initialize extension
    (async () => {
      try {
        await promptInstall();
        ready = true;
      } catch (error) {
        addMessage('system', 'Failed to initialize: ' + error.message);
      }
    })();

    function addMessage(role, content) {
      const div = document.createElement('div');
      div.className = `message ${role}`;
      div.textContent = `${role}: ${content}`;
      document.getElementById('messages').appendChild(div);
    }

    document.getElementById('send').onclick = async () => {
      if (!ready) {
        alert('WebLLM is not ready yet');
        return;
      }

      const input = document.getElementById('input');
      const userMessage = input.value.trim();
      if (!userMessage) return;

      // Add user message
      addMessage('user', userMessage);
      messages.push({ role: 'user', content: userMessage });
      input.value = '';

      try {
        // Generate response
        const result = await generateText({
          model: webllm({ task: 'qa' }),
          messages: messages
        });

        // Add assistant message
        addMessage('assistant', result.text);
        messages.push({ role: 'assistant', content: result.text });
      } catch (error) {
        addMessage('system', 'Error: ' + error.message);
      }
    };

    // Send on Enter key
    document.getElementById('input').onkeypress = (e) => {
      if (e.key === 'Enter') document.getElementById('send').click();
    };
  </script>
</body>
</html>
```

What this does:
- Creates a chat interface that maintains conversation history
- Sends the full conversation context with each request
- Allows pressing Enter to send messages
- Handles errors gracefully
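Because the full conversation is resent on every turn, long chats grow the request with every message. One option is to cap the history before passing it to `generateText`. The helper below is not part of the WebLLM SDK; it is a plain-JavaScript sketch with an illustrative name:

```javascript
// Hypothetical helper (not part of the WebLLM SDK): cap the history sent with
// each request so long chats don't grow the context without bound. Keeps the
// most recent `maxTurns` messages and drops a leading assistant message left
// over from the cut, so the trimmed history still starts on a user turn.
function trimHistory(messages, maxTurns) {
  let recent = messages.slice(-maxTurns);
  while (recent.length > 0 && recent[0].role === 'assistant') {
    recent = recent.slice(1);
  }
  return recent;
}

const history = [
  { role: 'user', content: 'What is TypeScript?' },
  { role: 'assistant', content: 'A typed superset of JavaScript.' },
  { role: 'user', content: 'What are its main benefits?' },
  { role: 'assistant', content: 'Static checking and better tooling.' },
  { role: 'user', content: 'Show an example.' }
];

// Last 4 messages start on an assistant turn, so it is dropped too.
console.log(trimHistory(history, 4).length); // 3
```

In the chat example above you would call `generateText({ messages: trimHistory(messages, 20) })` instead of passing the whole array.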
Example 3: React Integration
WebLLM in a React application with hooks.
```jsx
import { useState } from 'react';
import { promptInstall, generateText, isAvailable } from 'webllm';

function App() {
  const [input, setInput] = useState('');
  const [output, setOutput] = useState('');
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState(null);

  async function handleGenerate() {
    setLoading(true);
    setError(null);

    try {
      // Prompt for installation if needed
      if (!isAvailable()) {
        await promptInstall();
      }

      // Generate text
      const result = await generateText({
        task: 'creative',
        hints: { quality: 'high' },
        prompt: input
      });

      setOutput(result.text);
      console.log('Tokens used:', result.usage.totalTokens);
    } catch (err) {
      setError(err.message);
    } finally {
      setLoading(false);
    }
  }

  return (
    <div className="App">
      <h1>WebLLM React Demo</h1>

      <textarea
        value={input}
        onChange={(e) => setInput(e.target.value)}
        placeholder="Enter your prompt..."
        rows={4}
        style={{ width: '100%', marginBottom: '10px' }}
      />

      <button
        onClick={handleGenerate}
        disabled={loading || !input}
        style={{ padding: '10px 20px' }}
      >
        {loading ? 'Generating...' : 'Generate'}
      </button>

      {error && (
        <div style={{ color: 'red', marginTop: '10px' }}>
          Error: {error}
        </div>
      )}

      {output && (
        <div style={{ marginTop: '20px' }}>
          <h3>Result:</h3>
          <pre style={{ background: '#f5f5f5', padding: '15px', borderRadius: '4px' }}>
            {output}
          </pre>
        </div>
      )}
    </div>
  );
}

export default App;
```

Installation for React:

```sh
npm install webllm
```

Running These Examples
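As the component grows, the separate input/output/loading/error state hooks can be folded into a single reducer for `useReducer`. The reducer below is an illustrative sketch, not part of the WebLLM SDK, and is plain JavaScript with no React dependency:

```javascript
// Illustrative reducer: one state object replacing the four useState calls in
// the component above. Pass it to React's useReducer(reducer, initialState).
const initialState = { input: '', output: '', loading: false, error: null };

function reducer(state, action) {
  switch (action.type) {
    case 'input':   return { ...state, input: action.value };
    case 'start':   return { ...state, loading: true, error: null };
    case 'success': return { ...state, loading: false, output: action.text };
    case 'failure': return { ...state, loading: false, error: action.message };
    default:        return state;
  }
}

// Simulate the generate flow without React:
let state = reducer(initialState, { type: 'input', value: 'Write a haiku' });
state = reducer(state, { type: 'start' });
state = reducer(state, { type: 'success', text: 'code flows like water' });
console.log(state.loading, state.output); // false 'code flows like water'
```

In the component, `handleGenerate` would dispatch `start`, then `success` or `failure`, instead of calling the individual setters.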
Option 1: Using a Module Bundler (Recommended)
Setup:
```sh
# Create new project
npm create vite@latest my-webllm-demo -- --template vanilla

# Install WebLLM
cd my-webllm-demo
npm install webllm

# Or install with Vercel AI SDK
npm install webllm webllm-ai-provider ai
```

Then copy any example code into your project and run:

```sh
npm run dev
```

Option 2: Using Import Maps (No Build Step)
For simple HTML files, use import maps:
```html
<!DOCTYPE html>
<html>
<head>
  <script type="importmap">
    {
      "imports": {
        "webllm": "https://esm.sh/webllm@latest",
        "ai": "https://esm.sh/ai@latest",
        "webllm-ai-provider": "https://esm.sh/webllm-ai-provider@latest"
      }
    }
  </script>
</head>
<body>
  <!-- Your example code here -->
  <script type="module">
    import { generateText } from 'webllm';
    // ... rest of your code
  </script>
</body>
</html>
```

Option 3: Using CodePen or JSFiddle
You can also run these examples on online code editors:
- Open CodePen or JSFiddle
- Add import map in HTML settings
- Copy the example code
- Make sure you have the WebLLM extension installed in your browser
Troubleshooting
Extension Not Installing
If `promptInstall()` shows an error:
- Make sure you’re using Chrome, Edge, or another Chromium browser
- Check that you don’t already have the extension installed
- Try refreshing the page after installation
“No Providers Configured” Error
If you get this error:
- Click the WebLLM extension icon in your browser toolbar
- Go to the “Providers” tab
- Add at least one provider (Anthropic, OpenAI, or local model)
- Add your API key or download a local model
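You can also surface these fix-it steps to users in code. The handler below is an illustrative sketch: the exact error strings are an assumption and may differ between extension versions, so match loosely:

```javascript
// Hypothetical helper: map common WebLLM failure messages to the fix-it steps
// listed above. The matched strings are assumptions, not documented constants.
function providerErrorHint(message) {
  if (/no providers configured/i.test(message)) {
    return 'Open the extension, go to the Providers tab, and add a provider.';
  }
  if (/api key/i.test(message)) {
    return 'Check the API key saved for your provider in the extension.';
  }
  return 'Unexpected error: ' + message;
}

console.log(providerErrorHint('No providers configured'));
// → 'Open the extension, go to the Providers tab, and add a provider.'
```

In practice you would call this from the `catch` blocks in the examples above, e.g. `addMessage('system', providerErrorHint(error.message))`.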
Import Errors
If you see module import errors:
- Make sure you’ve installed webllm: `npm install webllm`
- Check that your bundler supports ES modules
- For plain HTML, use import maps (see Option 2 above)
Interactive Demos Not Working
If the “Run Demo” buttons don’t work:
- Make sure JavaScript is enabled in your browser
- Check the browser console for any errors
- Ensure you have the WebLLM extension installed
- Try refreshing the page
Next Steps
- Learn More: Read the SDK Getting Started guide
- API Reference: Explore the Client Library API
- Advanced Usage: Check out Vercel AI Examples
- Best Practices: Review Best Practices for production apps